Friday 30 May 2008

Calibrating and Aligning the ATLAS Detector


The left plot shows the distribution of the reconstructed mass of simulated Z bosons decaying into two muons, assuming either that the ID geometry is perfectly known (open circles) or after the alignment procedure based on one million tracks and about 20000 cosmic events (full circles). The right figure shows a layout of the optical alignment system for a sector of the muon spectrometer.
The calibration and alignment of the many ATLAS sub-detectors are certainly among the most challenging tasks we will have to face when the first data become available.
One can distinguish several types of calibration and alignment tasks. In a first category, we can consider the following online electronic calibrations:
The electronic calibration tasks performed online at the level of the Read-Out Drivers during periods without beam, either in between two consecutive LHC spills (less than an hour) or during LHC machine downtime. Examples of such tasks are pedestal, ramp or charge-injection runs to determine the response of the calorimeter cells, or threshold scans in the pixel detector to look for dead and noisy channels.
The electronic calibration tasks performed during normal physics data taking with beam, which take advantage of the presence of so-called empty bunch crossings. During normal data taking, with a 25 ns interval between two consecutive bunch crossings, more than 20% of the crossings involve empty bunches, reflecting the complex multi-stage injection scheme of the LHC involving the PS and SPS machines (a rough estimate of this fraction, based on the nominal filling scheme, is sketched after this list). Only a very small fraction of these empty bunch crossings will be used for checking the behaviour of the tile calorimeter during physics runs by injecting known charge or laser shots. These empty bunch crossings will also be used for acquiring cosmic data for the Inner Detector (ID) alignment, as discussed later.
The optical alignment of the SCT and of the muon spectrometer, which will run completely asynchronously with respect to the data-taking flow. Both systems will provide new sets of alignment constants every half hour or so, in order to track with a precision of a few microns possible movements of the tracking elements they monitor: barrel layers and end-cap wheels for the SCT alignment system, individual muon chamber displacements and deformations for the muon alignment.
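As a rough illustration of the "more than 20%" quoted above for empty bunch crossings, the sketch below uses the nominal LHC filling scheme (3564 bunch slots per revolution at 25 ns spacing, of which 2808 are filled in nominal operation); these design values are not given in the article and are quoted here as assumptions.

```python
# Rough estimate of the fraction of empty 25 ns bunch crossings at the LHC.
# Assumed nominal design values (not taken from the article): 3564 bunch
# slots per revolution, 2808 of them filled with protons.
BUNCH_SLOTS = 3564      # 25 ns slots per LHC revolution
FILLED_BUNCHES = 2808   # nominal number of filled bunches

empty_fraction = 1.0 - FILLED_BUNCHES / BUNCH_SLOTS
print(f"Empty bunch crossings: {empty_fraction:.1%}")  # ~21%, i.e. "more than 20%"
```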
In a second category, we find the offline calibration and alignment tasks based on specific calibration events taken during the physics runs. These are processed right after data taking in order to upload the updated calibration constants into the ATLAS conditions database before the reconstruction of the bulk of the physics data starts 24 to 48 hours later. According to the type of calibration events, we can distinguish:
The calibration and alignment tasks relying on a specific calibration stream containing events with a topology well suited to the calibration or alignment in question. A first example is the ID alignment stream, which aims at collecting about one million events containing isolated pions every day. The ID alignment also requires another stream collecting cosmic events at a rate of a few Hz during empty bunch crossings. The impact of the alignment procedure on the width of the reconstructed Z boson mass peak is shown in the figure. Although the mass peak would be totally washed out without alignment, more work is needed to control the remaining geometry distortions still not seen by the alignment algorithm. The calibration of the muon spectrometer MDT drift tubes, which aims at determining the relation between the drift radius and the drift time, will be based on a large-statistics calibration stream selecting and collecting muon tracks at the second level of the trigger chain (a toy illustration of how such an r-t relation can be extracted from a drift-time spectrum is sketched after this list). In contrast to the other streams, the muon calibration stream will be transferred to and processed at three external so-called Tier-2 centres (Michigan, Munich, Rome), the new calibration constants being shipped back to CERN before the bulk reconstruction.
The calibration and alignment tasks based on the express stream. This stream contains about 10% of the physics data, duplicated from the normal physics streams, which are of particular interest for monitoring the quality of the data after the offline reconstruction. In particular, it contains most Z bosons decaying in the di-lepton channel, a fraction of the leptonic W decays, and events with very high transverse or missing energy. These latter events may reveal noisy channels or dead modules. Some sub-systems will take advantage of the fast processing of the express stream (which starts less than an hour after data taking) to use it for calibration or alignment purposes, for instance the relative alignment of the muon spectrometer with respect to the ID.
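The MDT r-t calibration mentioned above determines the drift radius as a function of drift time. A common first approximation for drift tubes, assuming that tracks illuminate the tube uniformly in radius, is to integrate the measured drift-time spectrum; the toy sketch below illustrates only that principle and is not the ATLAS calibration code (the tube radius and drift-time scale are typical MDT values, and the r-to-t mapping used to generate the fake data is invented).

```python
import numpy as np

def rt_relation_from_spectrum(drift_times_ns, r_max_mm=14.6, n_bins=200):
    """First-order r(t) estimate for a drift tube.

    Assumes uniform track illumination in radius, so the fraction of hits
    with drift time below t maps onto radius:
        r(t) ~= r_max * N(t' < t) / N_total
    This is only an illustration of the principle, not the ATLAS MDT
    calibration procedure.
    """
    counts, edges = np.histogram(drift_times_ns, bins=n_bins)
    cdf = np.cumsum(counts) / counts.sum()
    t_centres = 0.5 * (edges[:-1] + edges[1:])
    return t_centres, r_max_mm * cdf

# Toy usage with fake drift times (flat in radius, invented r -> t mapping).
rng = np.random.default_rng(0)
radii = 14.6 * rng.uniform(size=100_000)          # mm, uniform illumination
fake_times = 700.0 * (radii / 14.6) ** 1.5        # ns, invented mapping
t, r = rt_relation_from_spectrum(fake_times)
```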
The various sub-systems are working very hard to implement the trigger and software chains that will allow these rather complex procedures to run as smoothly as possible. A significant fraction of them will hopefully be tested in early June 2008 during a system test of the offline part of the data processing, called the “Full Dress Rehearsal” (FDR). In the FDR, the data produced at the output of the acquisition system are mimicked by simulated events from different physics channels, mixed in such a way that they reproduce as closely as possible the physics data content expected during the first months of LHC data taking.
Soon after the FDR, the first real data will show up, hopefully during summer, with their attendant surprises, excitements and possibly hobbling detector behaviour, and we should make sure that this whole edifice is robust enough to provide within a short time scale sensible data for physics analysis.
Claude Guyot
Centre d'Études de Saclay

Thursday 29 May 2008

WALTER L. WAGNER: The Official Act against CERN



Luis Sancho
PO Box 411
Honomu, HI 96728
808-964-5535
pro se


IN THE UNITED STATES DISTRICT COURT

DISTRICT OF HAWAII

--oo0oo--



LUIS SANCHO, et al., ) Civil No. ____________
)
Plaintiffs ) AFFIDAVIT OF WALTER L. WAGNER
) IN SUPPORT OF TRO AND
vs. ) PRELIMINARY INJUNCTION;
) EXHIBITS “A” & “B”
US DEPARTMENT OF ENERGY, et al., )
)
Defendants )
________________________________)


AFFIDAVIT OF WALTER L. WAGNER IN SUPPORT OF
TRO AND PRELIMINARY INJUNCTION


I, Walter L. Wagner, affirm, state and declare, under penalty of perjury under the laws of the State of Hawaii, as follows:
1. I am a nuclear physicist with extensive training in the field. I obtained my undergraduate degree in 1972 at Berkeley, California in the biological sciences with a physics minor, and graduate degree in 1978 in Sacramento, California in law.
2. Commencing in 1973 I worked extensively in cosmic radiation research at UC Berkeley, Physics Department, Space-Sciences, and am credited with discovery of a novel particle only previously theorized to exist [by Nobelist P.A.M. Dirac], namely a magnetic monopole. That discovery still remains controversial as to the identity of that novel particle, and numerous searches for magnetic monopoles are still currently underway, or proposed, including at the Large Hadron Collider [LHC].
3. Commencing in 1979 I began employment as a federal nuclear safety officer with the US government, from which I am currently retired, though I remain in frequent contact with my former duty station. My federal duty station was with the US Veterans Administration, and I managed an extensive program of nuclear safety involving usages of ionizing radiations from machines [X-ray, CT, etc.], and from a wide variety of radioactive materials produced by particle accelerators, in nuclear reactors, or extracted from nature [principally uranium and its radio-daughter radium]. This work involved enforcement of the regulations of the US Nuclear Regulatory Commission, the US Department of Energy, and the US Department of Transportation. Essentially, my job was to look for and root-out the safety flaws overlooked by scientific researchers as it pertained to nuclear physics, as a protection not only for the researcher’s own health, but for the visitors and population at large.
4. Following retirement from federal employment I embarked on teaching science and mathematics for many years to grade school and college students. I was noted for having obtained the highest test-score on the basic teacher credentialing examination in California [CBEST] where I initially began teaching. I am presently likewise retired from that field, though I still engage myself in formal programs for science education, including the Journey Through the Universe educational outreach program hosted annually by the Big Island astronomy community, where I live. Such educational endeavors included periods of time as an instructor at Punahou, Iolani, and several other schools in the Honolulu district.
5. I have remained active in the field of theoretical nuclear physics, and serve as a science editor for Wikipedia, having numerous articles and revisions in nuclear physics to my credit, and I am very familiar with the editing procedures and processes, and with the nuclear physics editors at Wikipedia. I have been active in the field of theoretical micro-black-holes being created by advanced colliders since publication of my work in Scientific American in July, 1999.
6. I have been in close contact with defendant CERN, or its agents, and I am fully apprised of the Large Hadron Collider [LHC] enterprise, the physics of the project, the expectations of the physics community with the results that might be obtained, as well as the articulated risks that have been discussed in the scientific literature and on the internet.
7. The attached Exhibit “A” is a true copy of a letter I received from CERN management pertaining to the LHC. I had written to CERN expressing the concern that two safety arguments were seriously flawed; namely the “strangelet” argument and the “micro black hole” argument, which I detail infra. In response, they mailed me the Exhibit “A” letter, which was dated October 1, 2007 and which reads in pertinent part:
“Dear Dr. Wagner

Thank you for communicating to CERN your concerns about the ‘Operational Safety’ of the LHC. We can assure you that CERN takes such issues very seriously.

Earlier this year we mandated a group of experts, not themselves members of the LHC experimental collaborations, to assess safety aspects of LHC operation. This group is mandated to provide by the end of this year a written report, which will be made available to the scientific community and to the general public through CERN Web pages.”

The letter was signed by both Director General Robert Aymar and Chief Scientific Officer Jos Engelen.
8. Attached herewith as Exhibit “B” is a true and correct copy of an email I received on January 31, 2008 in response to my query to CERN’s LHC Safety Assessment Group [LSAG], which is apparently the “group of experts” referenced by Mr. Aymar. That email is pasted below [including typos] and reads:
“CC: lsag@cern.ch
From: LHCSafetyAssessment.Group@cern.ch
Subject: Re: 2007-2008 Safety Review
Date: Thu, 31 Jan 2008 19:37:25 +0100
To: wbgi@hotmail.com

hello, we are still finishing the preparation of the report, but expect it to be ready within a month. We shall inform you when this happens, so youcan download it promptly.

best regards,
LSAG,
LHC Safety Assessment Group”

To date, I have received no notification of completion of the report, nor has it been posted on their web site, as promised previously. Rather, I am informed that they are still delayed in preparing their LSAG Safety Review.
9. In order to adequately evaluate and assess the most recent Safety Review being prepared by CERN’s LSAG, I would need a minimum of six months after it is released to me and the scientific community in order to review all facets of the document, as well as the most recent scientific literature, to determine whether or not CERN adequately addressed the safety concerns regarding strangelets and micro black holes [discussed infra], and other concerns also articulated. My conversations with other scientists concerned about this issue indicate that they too would need a minimum of four to six months to fully assess such safety review and the relevant scientific literature.
10. I am informed and believe that CERN is planning to commence operation of the LHC in April or May, 2008, and that that planned start-up date leaves an inadequate amount of time in which to review and assess their most recent Safety Review, not yet released. Accordingly, court intervention is required to preclude operation of the LHC prior to full review of CERN’s LSAG Safety Review by myself and other members of the scientific community.
11. In a nutshell, the safety issues pertaining to the LHC are discussed in the below Wikipedia article about the LHC, which I have reviewed and found correct. I paste below the relevant sections, obtained from Wikipedia on February 14, 2008 at http://en.wikipedia.org/wiki/Large_Hadron_Collider#Safety_concerns_and_assurances:

Large Hadron Collider
The Large Hadron Collider (LHC) is a particle accelerator and hadron collider located at CERN, near Geneva, Switzerland (46°14′N, 6°03′E). Currently under construction, the LHC is scheduled to begin operation in May 2008.[1][2] The LHC is expected to become the world's largest and highest-energy particle accelerator. The LHC is being funded and built in collaboration with over two thousand physicists from thirty-four countries as well as hundreds of universities and laboratories. When activated, it is theorized that the collider will produce the elusive Higgs boson, the observation of which could confirm the predictions and 'missing links' in the Standard Model of physics and could explain how other elementary particles acquire properties such as mass.[3] The verification of the existence of the Higgs boson would be a significant step in the search for a Grand Unified Theory, which seeks to unify three of the four fundamental forces: electromagnetism, the strong force, and the weak force. The Higgs boson may also help to explain why the remaining force, gravitation, is so weak compared to the other three forces. In addition to the Higgs boson, other theorized novel particles that might be produced, and for which searches[4] are planned, include strangelets, micro black holes, magnetic monopoles and supersymmetric particles[5].
Safety Concerns
Concerns have been raised that performing collisions at previously unexplored energies might unleash new and disastrous phenomena. These include the production of micro black holes, and strangelets.

However, the concerns below were inadequately addressed, and another study was commissioned by CERN in 2007 for publication on CERN's web-site by the end of 2007.
Micro Black Holes
Although the Standard Model of particle physics predicts that LHC energies are far too low to create black holes, some extensions of the Standard Model posit the existence of extra spatial dimensions, in which it would be possible to create micro black holes at the LHC [21][22][23] at a rate on the order of one per second. According to the standard calculations these are harmless because they would quickly decay by Hawking radiation. The concern is that Hawking radiation (which is still debated [24]) is not yet an experimentally-tested phenomenon, and so micro black holes might not decay as rapidly as calculated, and accumulate inside the earth and eventually devour it.
Strangelets
Strangelets are a hypothetical form of strange matter that contains roughly equal numbers of up, down, and strange quarks and are more stable than ordinary nuclei. If strangelets can actually exist, and if they were produced at LHC, they could conceivably initiate a runaway fusion process (reminiscent of the fictional ice-nine) in which all the nuclei in the planet were converted to strange matter, similar to a strange star.
[bold underlining added for emphasis]

12. The above sections of Wikipedia pertaining to “Strangelets” and “Micro Black Holes” were written by a Wikipedia science editor [and not by myself] known to me as a well-respected Wikipedia editor fully knowledgeable in nuclear physics, and who is a co-author on numerous scientific papers on “strange matter” with at least one Nobel laureate in physics. Wikipedia editors are usually anonymous, and develop an on-line reputation based on their writings. They typically utilize a ‘nom de plume’ for their writings rather than their own name as a protection for themselves, which they use to establish their reputation. The nom de plume for the above is “Dark Formal”. I concur with his assessment that there is at present a significant risk that has not been proven to be an impossibility, and that operation of the LHC may have unintended consequences which could ultimately result in the destruction of our planet.
13. I detail the rationale for these risks, as well as the demolition of the previous safety argument utilized by CERN in earlier, flawed, safety analysis to minimalise the risk, in a separate addendum to be filed subsequently as “Safety Review Addendum”.
14. I draw an analogy between the “go for launch” attitude surrounding the commissioning of the LHC and the “go for launch” attitude of some NASA administrators and engineers before the Challenger disaster. I quote from Senator John McCain in his new book “Hard Call: Great Decisions and the Extraordinary People Who Made Them”, pages 58-59:
… After presenting their concerns, they recommended that the launch be delayed until temperatures at Canaveral reached fifty-three degrees, the lowest temperature at which they had reliable data about the O-rings’ effectiveness.
NASA managers expressed surprise and annoyance at the recommendation and challenged the conclusion. A senior NASA executive at Marshall was reported to have said he was “appalled” by Thiokol’s recommendation, and the company’s vice president for the Space Booster Program, Joe Kilminster, asked for five minutes to discuss the problem with his engineers off-line, during which the engineers continued to voice objections to the launch.
In a final review, with only senior Thiokol executives involved, Vice President for Engineering Robert Lund was asked by Senior Vice President Jerald Mason to put on his “management hat”. The result of their discussion was an agreement that while cold temperatures threatened the integrity of the primary O-rings in the booster field joints, the data were inconclusive, and secondary O-rings in each of the joints ought to seal effectively. In simultaneous discussions at Kennedy, engineer Arnie Thompson continued to press for postponement. The second teleconference between all parties began at 11:00 P.M. Joe Kilminster explained Thiokol’s conclusions and provided the company’s “engineering assessment” – or reassessment in this case – that the launch could remain on schedule. NASA’s senior manager at Marshall, George Hardy, asked Kilminster to put Thiokol’s recommendation in writing.
Thus was the space shuttle Challenger launched on its fatal flight of January 28, 1986. No one individual involved in the decision deserves exclusive or even primary blame for its consequences. The systematic downgrading of technical problems and aggregating of mistaken assumptions that were reinforced with every successful shuttle mission led the seven astronauts to their doom. Waiving launch constraints had become routine at NASA, and every dodged bullet reinforced NASA’s false confidence. Practice had made imperfect the situational alertness of the decision’s many authors. Important but not critical considerations were accorded a higher priority than the primary objective, that the Challenger return safely to earth. The shuttle had to keep flying. The mission would succeed because it had to succeed.
In a dissenting and harsher view included in the Rogers Commission report, [Nobelist] Richard Feynman made the following observation:

“It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life. The estimates range from roughly 1/100 to 1/100,000. The higher figures come from the working engineers, and the very low figures from management. … Let us make recommendations to ensure that NASA officials deal in a world of reality in understanding technological weaknesses and imperfections well enough to be actively trying to eliminate them. They must live in reality in comparing the costs and utility of the Shuttle to other methods of entering space. And they must be realistic in making contracts in estimating costs and the difficulty of the projects. Only realistic flight schedules should be proposed, schedules that have a reasonable chance of being met. If in this way the government would not support them, then so be it. NASA owes it to the citizens from whom it asks support to be frank, honest and informative, so that those citizens can make the wisest decisions for the use of their limited resources. For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

[bold underlining added for emphasis]
15. I draw as a distinction in the analogy between the above situation, and the situation involving the LHC, that the “go for launch” decision of spaceship Challenger involved placing the lives of only 7 people at risk, whereas the “go for launch” decision for the LHC located on spaceship Earth involves placing the lives of some 7 Billion people at risk, as well as all of our future descendants not yet born.
16. The prior safety reviews by CERN had serious flaws which are purportedly being addressed in the current LSAG Safety Review that is not yet completed. The most serious of those flaws, in my opinion, was their reliance on a “cosmic ray argument” that CERN LHC collisions should be safe. In essence, they compared CERN collisions at the LHC to cosmic ray collisions in nature, and found the anticipated LHC collisions to actually be lower in energy than those in nature here on earth, and therefore they concluded that the LHC operation should be safe. They reasoned that if any disastrous particle could be created, it would already have been created eons ago by nature, and since nature has caused no disaster here on earth, so too the LHC should cause no disaster.
17. The flaw with that argument that was overlooked during their previous safety assessment was that any such novel particle created in nature by cosmic ray impacts would be left with a velocity at nearly the speed of light, relative to earth. At such speeds, a novel particle such as a micro black hole that might be created in nature, is believed by most theorists to simply pass harmlessly through our planet with nary an impact, safely exiting on the other side. These would be nearly impossible to detect in nature. Conversely, any such novel particle that might be created at the LHC would be at slow speed relative to earth, a goodly percentage[1] would then be captured by earth’s gravity, and could possibly grow larger [accrete matter] with disastrous consequences of the earth turning into a large black hole. Essentially, any such safety argument would have to rely on a presumed cross-section for capture of a relativistic micro black hole being sufficiently large to stop such particle, either by a planet, star, or even neutron star, when in fact we have ZERO information on what such cross-section for capture actually is.
18. Numerous other safety risks involving other principles of physics have also been inadequately addressed in prior safety reviews, and these need thorough review by myself and other members of the science community, in addition to review of the CERN reassessment of their “cosmic ray argument”. These additional risks will be more thoroughly detailed in the Safety Review Addendum to be filed later.
19. Alternative scientific methods that pose no risk exist for obtaining some of the information being sought by the LHC. For example, NASA plans to launch the GLAST satellite in May, 2008 to search in part for a determination as to whether miniature black holes evaporate by theoretical Hawking radiation, rendering them harmless. If they are part of the component of Dark Matter, and if they do evaporate, the GLAST satellite might well detect such signal. Delaying of the LHC pending review of the LSAG Safety Review might in fact allow for NASA to answer some of the safety concerns with additional information being sought by alternative methods.
20. Likewise, much of the purported expected increase in knowledge to be engendered by operation of the LHC can instead be acquired by passive observations made from telescopes and satellites, without resorting to attempts to create novel forms of Dark Matter [such as strangelets and micro black holes] here on earth. As shown in the affidavit of plaintiff Luis Sancho, Dark Matter pervades much of our Milky Way galaxy, and has recently been discovered to compose upwards of 90% of our galaxy. All available information, however, shows that Dark Matter indeed feeds upon “ordinary” matter from which stars, planets and people are composed, converting it into more dark matter. Because earth is presently separated by the vast distances of our Milky Way, we are presently protected from such Dark Matter. Creation of such Dark Matter on Earth would then be seen to be foolhardy, at best.
21. Based on the lack of information presently available, and based on the information that there is at least a non-zero risk of creating dangerous material at the LHC, but that we also cannot state with certainty that the risk is 100%, statistically we would have to conclude that the risk is half-way between those two extremes, or a 50% chance of disastrous consequence, until we can obtain better information.
22. Additionally, I am concerned about the issues raised in the affidavit of Mark Leggett regarding the ethical structure of the LSAG Safety Review, though I will save my own comments until after I review that forthcoming document. Likewise, the constantly shifting position of CERN is of great concern, as detailed in the affidavit of James Blodgett. Initially, in 1999, the idea that micro black holes could be created by colliders was ridiculed by collider advocates. Then, when theoretical papers were published in peer-reviewed science journals showing just such possibilities, CERN changed its stance, and welcomed the idea that colliders might create micro black holes, proclaiming they would be safe due to theoretical Hawking Radiation that would cause them to evaporate. However, as shown in the affidavit of Luis Sancho and elsewhere, Hawking Radiation is not only un-proven, it is directly contrary to established theory of Einstein’s Relativity by which black holes never evaporate, and are forever black. Is Earth the proper testing ground to determine whether Hawking or Einstein is correct?
23. Accordingly, it is respectfully requested that this Court issue a Temporary Restraining Order and Preliminary Injunction against defendants herein to preclude them from engaging the LHC in collisions of atoms, or from further preparation of the LHC for that purpose, pending review by myself and other members of the science community of their most recent safety review, whereupon I and others will prepare a report thereon for consideration by this Court as to whether to lift or maintain the Preliminary Injunction.
DATED: March ____, 2008




_________________________________
Walter L. Wagner (Dr.)


NOTARIZATION

Before me, the undersigned Notary, today appeared Walter L. Wagner, known to me to be the person whose name is subscribed to the foregoing instrument, who being by me first duly sworn on his oath, deposes and says the text of this affidavit on this ____ day of March, 2008.




______________________­­______
Notary Public, State of Hawaii



____________________________
(Typed or Printed Name of Notary)

My commission expires: ______________ [seal]

[Note: the Notary will sign and affix his/her notary seal, which should include the state where issued, and the expiration date.]


EXHIBIT “A”
CERN Letter re LSAG Safety Review Promised by January 1, 2008

EXHIBIT “B”
CERN LSAG E-Mail Extending Deadline to Circa March 1, 2008
[1] Estimates as to the number that would be gravitationally captured, i.e. produced at speeds below escape velocity [25,000 mph], range from a high of 14% to a low of 0.01%.

Tuesday 27 May 2008

Electron Traps That Compute


ScienceDaily (May 27, 2008) — ETH Zurich physicists have used a semiconductor material to create superimposed quantum dots that “trap” single electrons. Not only can these dots be studied with lasers, their energy can be influenced as well. Another point: the state of one of the dots governs that of the other above it. This has taken the researchers another step closer to quantum computers.
ETH Zurich quantum physicists have developed a semiconductor system that can be used for quantum computing if need be. They “grew” a gallium arsenide crystal. On top of that they applied two layers of indium-gallium arsenide from which tiny bubbles, the quantum dots, formed. The blobs in the second layer grew directly above those in the first layer. Lucio Robledo, first author of a paper published in Science, says “This kind of dot is like an artificial atom only bigger, and two superimposed dots constitute an artificial molecule.”
The Quantum Photonics Group researchers of ETH Zurich led by Ataç Imamoglu finally succeeded in populating these quantum dots with single electrons and were able to manipulate them with lasers and analyse their properties. The physicists determined exactly how many electrons were present in one of their semiconductor system’s quantum dots. Above all, however, they were able to imprison the charged particles in them individually.

Electrons as bits
Each electron in turn has a particular spin, i.e. it rotates in one direction around its own axis and is thus rather like a quantum magnet with quantum-mechanical properties. Research in theoretical and experimental quantum physics has focused for many years on gaining a better understanding of these properties and control over them.
Using the electron spin to carry encoded information was already suggested several years ago. The information elements in a normal computer are bits with values of zero or one. This is not so with quanta, which can occupy both states simultaneously.
This means an electron has two different spin orientations at the same time. Jeroen Elzerman, a co-author of the study, stresses that “This is one of the fundamental mysteries of the quantum world.” However, he says this enables numerous computing operations to be performed simultaneously and allows a computer’s speed to be increased many times over.
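In standard bra-ket notation (our illustration, not taken from the article), such a spin qubit is written as a superposition of its two basis states:

```latex
\lvert \psi \rangle \;=\; \alpha \,\lvert \uparrow \rangle \;+\; \beta \,\lvert \downarrow \rangle ,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 ,
```

and a register of n such spins can be in a superposition of all 2^n classical bit strings at once, which is the sense in which numerous computing operations can be carried out in parallel.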

Optical control
The Quantum Photonics Group researchers finally used two coupled quantum dots to study their semiconductor system, because these govern one another reciprocally. The state of one dot influences that of the one above it, and vice versa. On top of that, the ETH Zurich physicists were able to control these states optically from the outside, i.e. by excitation with a laser. Robledo says “We found a way to make quantum dots interact with one another and to communicate in a controlled fashion.” The controlled interaction presented in the study could be a suitable way to carry out fundamental quantum operations.
This optical manipulation of quantum dot spins is an important step forward for the Quantum Photonics Group researchers. For example they were able to set an electron’s spin state in a particular direction with high reliability, and also read it out again. The physicists were also able to couple individual quantum dots to optical nano-resonators.
Scale-up capability unsolved
Despite these impressive successes, Ataç Imamoglu hesitates to regard quantum dots as the most promising route to quantum computers, because a large amount of physics at the nano-scale still needs to be deciphered. In addition, the architecture of a quantum computer would have to be expandable in a modular way, as with a conventional computer (where the transistor is the structural element of the chips), so that thousands more quantum bits could be added to these first two. The researchers still need to find a solution to this scale-up challenge for quantum dots.
Fausto Intilla - www.oloscience.com

Next-generation Explosives: More Power And Safety Without The Pollution


ScienceDaily (May 27, 2008) — Scientists in Germany are reporting development of a new generation of explosives that is more powerful than TNT and other existing explosives, less apt to detonate accidentally, and produces fewer toxic byproducts.
Their study of these more environmentally friendly explosives is scheduled for the June 24 issue of ACS’ Chemistry of Materials, a bi-weekly journal.
In the new study, Thomas M. Klapötke and Carles Miró Sabate point out that conventional explosives such as TNT, RDX and HMX, widely-used in military weapons, are rich in carbon and tend to produce toxic gases upon ignition.
In addition to polluting the environment, these materials are also highly sensitive to physical shock, such as hard impacts and electric sparks, making their handling extremely dangerous. Greener, safer explosives are needed, the researchers say.
To meet this need, Klapötke and Sabate turned to a recently explored class of materials called tetrazoles, which derive most of their explosive energy from nitrogen instead of carbon. They identified two promising tetrazoles: HBT and G2ZT. The researchers developed tiny “bombs” out of these materials and detonated them in the laboratory. The materials showed less sensitivity to shock than conventional explosives and produced fewer toxic products when burned, the researchers say.
Fausto Intilla - www.oloscience.com

Bright Sparks Make Gains Towards Plastic Lasers Of The Future


ScienceDaily (May 27, 2008) — Imperial researchers have come one step closer to finding the 'holy grail' in the field of plastic semiconductors by demonstrating a class of material that could make electrically-driven plastic laser diodes a reality.
Conventional electrically-powered laser diodes used in everyday consumer goods like DVD players are currently based on inorganic semiconductor materials such as gallium arsenide, gallium nitride and related alloys. The term 'semiconductor' describes the material’s ability to pass an electric current, which lies somewhere between that of a metallic conductor and that of an insulator.
In the case of a laser diode, the current comprises positive and negative charges that combine inside the material and produce the initial light required to begin the lasing process. If the initial light can be forced to pass back and forth through the semiconducting material many times, in a way that amplifies its strength on each pass, then after a short time a spectrally narrow, intense and directional laser beam emerges.
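The "amplifies its strength on each pass" condition can be made quantitative with the standard round-trip threshold relation from laser physics (not spelled out in the article): for a cavity of length L with mirror reflectivities R1 and R2, modal gain g and internal loss alpha, lasing starts once

```latex
R_{1} R_{2} \, e^{\,2 (g - \alpha) L} \;\geq\; 1 ,
```

so a material must sustain enough gain, at a high enough current density, to overcome the mirror and internal losses; this is the combination that has so far been missing in plastic semiconductors.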
The last two decades have seen tremendous developments in new organic-molecule-based semiconductors, including a special class of plastics. Many important devices based on such plastics have successfully been developed, including light emitting diodes for displays and lighting, field effect transistors for electrical circuits, and photodiodes for solar energy conversion and light detection. However, despite over a decade of worldwide research, plastic laser diodes remain the only major device type not yet demonstrated.
One of the main stumbling blocks is that, until now, it was widely considered that plastic semiconductor laser diodes would be impossible to produce because scientists had not found or developed any plastics that could sustain a large enough current whilst also supporting the efficient light emission needed to produce a laser beam.
Now a team of Imperial physicists, publishing their findings in Nature Materials in April, have done just that. The plastics studied, synthesised by the Sumitomo Chemical Company in Japan, are closely related to PFO, an archetype blue-light emitting material. By making subtle changes in the plastic's chemical structure the researchers produced a material that transports charges 200 times better than before, without compromising its ability to efficiently emit light - indeed the generation of laser light was actually improved.
Professor Donal Bradley, lead author of the new study and head of Imperial's Department of Physics said: "This study is a real breakthrough. In the past designing polymers for electronic and optoelectronic devices often involved maximising one key property in a material at a time. When people tried to develop plastic semiconductors for laser diode use, they found that optimising the material's charge transporting properties had a detrimental effect on its ability to efficiently emit light, and vice versa."
"The modifications made to the PFO structure have allowed us to convincingly overcome this perceived incompatibility and they suggest that plastic laser diodes might now be a realistic possibility", added co-author Dr Paul Stavrinou.
Low cost manufacturing and easy integration possibilities are not the only potential advantages of developing lasers based on plastics. Currently available laser diodes do not readily cover the full visible spectrum, which limits display and many spectroscopic applications, and precludes access to the full range of wavelengths supported by the standard plastics used for waveguides and optical fibres.
Professor Bradley, Dr Stavrinou and their colleagues point out that plastic laser diodes could operate across a much more substantial wavelength range spanning the near ultraviolet to the near infrared.
The Imperial College physics team, in conjunction with polymer synthesis teams at the Sumitomo Chemical Company and in collaborating university groups, now plans to explore the generality of their approach to manipulating chemical structure to target specific device requirements. They will also study electrically driven structures, paying particular attention to understanding and managing the additional optical losses that can arise from the presence of conductive electrode layers in close proximity to the light emission material.
Fausto Intilla - www.oloscience.com

Monday 19 May 2008

Atlas Experiment at CERN: Diagram of the Safety System


The detector safety system (DSS) of the ATLAS detector is getting ready for its important function: putting the detector into a safe state in case a potentially dangerous situation arises during operation. Fernando Pedrosa, the engineer responsible for the safety system, explains to ATLAS e-news its main features and the last steps towards its completion.
The DSS covers level-2 alarms, which protect the detector itself. For this purpose, independent sensors are connected to the DSS racks to detect safety hazards: “If the sensors detect any anomaly, the alarms will go ON, triggering the actions required to bring the detector into a safe state,” Fernando explains.
The majority of the sensors identify changes in the surroundings of the detector and in the services that allow its operation, such as the availability of cooling or the presence of smoke or flammable gas in the air. A typical example of an action triggered by an alarm is to cut the high- and low-voltage systems in case of a cooling failure.
The alarms can also be coupled to the sending of SMS or e-mail messages, but in any case there is no need for an operator to take any action: “After the problem is understood and sorted out, the operator (SLIMOS) can acknowledge the DSS actions, and only then can the equipment be switched ON again,” Fernando says.
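The alarm-and-acknowledge flow described above can be summarised as a small state machine. The sketch below is a hypothetical illustration of that logic only (all names in it are invented); it is not the actual DSS software.

```python
class DetectorSafetySketch:
    """Hypothetical sketch of the level-2 alarm logic described above.

    A raised alarm automatically drives the affected equipment to a safe
    state (e.g. high and low voltage off after a cooling failure); the
    equipment can only be switched back on after the shift operator
    (SLIMOS) has acknowledged the alarm.
    """

    def __init__(self):
        self.active_alarms = set()
        self.equipment_on = True

    def sensor_reading(self, name, value, limit):
        # Independent sensors (cooling, smoke, flammable gas, ...) feed the DSS.
        if value > limit:
            self.active_alarms.add(name)
            self.equipment_on = False   # automatic action, no operator needed

    def operator_acknowledge(self, name):
        # Only after the problem is understood can the action be acknowledged.
        self.active_alarms.discard(name)

    def switch_on(self):
        # Re-powering is refused while any alarm remains unacknowledged.
        if self.active_alarms:
            raise RuntimeError("unacknowledged DSS alarm(s): %s" % self.active_alarms)
        self.equipment_on = True


dss = DetectorSafetySketch()
dss.sensor_reading("cooling_water_temperature", value=35.0, limit=25.0)
assert not dss.equipment_on          # HV/LV cut automatically
dss.operator_acknowledge("cooling_water_temperature")
dss.switch_on()                      # allowed again once acknowledged
```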
The DSS was one of the first pieces of electronics installed in the ATLAS counting rooms. The first part of the alarm system was connected in February 2007, and ever since then, the DSS has gradually become operational in each of the ATLAS sub-detectors.
At the moment, the DSS team is finishing the installation of the safety system in the Liquid Argon and Tile calorimeters and they are now concentrating on the definition and implementation of safe state in the Muon system and the Inner Detector: “Because of the high sensitivity of the Inner Detector, more caution needs to be put in the definition of safe state of the different parts of it, in comparison to that of the more resistant sub-systems like the calorimeters,” Fernando says.
While planning the safety system, the DSS team has to take into account all the conditions that can damage the sub-system, such as high humidity and consequent condensation, power and cooling failures or fire: “Together with experts from the Inner Detector, we are designing a system that will put the detector in safe state when an alarm arises, but also ensuring that no DSS action will damage the system.”
The team expects to finish putting together the DSS in all the sub-detectors by middle of June: “It will depend on each system’s responses,” Fernando says. “But I’m confident we are converging to the right place in the right time.”

Fausto Intilla - www.oloscience.com

Can One 'Pin Down' Electrons?


ScienceDaily (May 19, 2008) — When atoms form molecules, they share their outer electrons and this creates a negatively charged cloud. Here, electrons buzz around between the two positively charged nuclei, making it impossible to tell which nucleus they belong to. They are delocalized.
But is this also true for the electrons located closer to the nucleus? And are those electrons spread out too, or do they belong to just one nucleus, i.e. are they localized? These questions, that scientists have hotly disputed over the last 50 years, have now been answered by an international team of scientists, led by Frankfurt University's atomic physics group. Their discoveries are reconciliatory. As is so often the case in quantum theory, there is no single 'right' answer -- one solution is just as valid as the other.
In order to answer these questions, the scientists first removed the innermost electron located close to the nucleus from nitrogen molecules (N2), using high-energy light from a synchrotron radiation source at the Advanced Light Source at the Lawrence Berkeley National Laboratory, Berkeley, California. It is reasonable to assume that these photo-electrons belong to one nucleus and can thus be located. They leave behind a vacancy in the inner core shell, which is then filled by an outer electron.
Additionally a second electron (an Auger electron) is ejected from the molecule. This Auger electron acts as a probe that can determine exactly where the original hole was created. Both electrons, the photo-electron and the Auger electron, form an entangled state, which means that as soon as one is measured, the properties of the second are determined as well. This prediction of quantum theory - which was rejected by Einstein as a "spooky long-range interaction" - has since been found to be valid for twin photons. It is the basic scheme behind quantum cryptography as well as "Quantum teleportation".
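Schematically, and purely as an illustration in our own notation rather than the specific state reported in the Science paper, such an entangled photoelectron-Auger pair can be written as

```latex
\lvert \Psi \rangle \;=\; \tfrac{1}{\sqrt{2}}
\Bigl( \lvert \mathrm{left} \rangle_{\mathrm{photo}} \, \lvert \mathrm{left} \rangle_{\mathrm{Auger}}
\;+\; \lvert \mathrm{right} \rangle_{\mathrm{photo}} \, \lvert \mathrm{right} \rangle_{\mathrm{Auger}} \Bigr) ,
```

so that measuring the emission site of one electron immediately fixes the corresponding property of the other.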
Professor Reinhard Dörner's group is the first to prove the existence of such entangled states for electrons, using the COLTRIMS technology, which has been developed in Frankfurt over the last decade. With this experimental set-up, they are able to reveal the pathways of the two electrons created. In the current issue of the journal Science, the physicists claim that the question of whether an electron is localized or not can only be answered for the complete system.
If the innermost electron is localized, the second electron can be assigned to either of the two nuclei. But sometimes it proves impossible to determine whether the first electron originates from the left or the right atom. In this case the second electron is also delocalized.
With these experimental details, it is now possible to explain the observations of the last 50 years in a unified model. Both groups - those supporting the localized theory and those endorsing a delocalized picture - are thus reconciled. Dr. Markus Schöffler, who is responsible for the measurement, sees further exciting perspectives opening up and he plans to continue his work on this topic in Berkeley, funded by a scholarship from the Alexander von Humboldt Foundation.
Fausto Intilla - www.oloscience.com

Perfect, Tiny Golden Nano-crown Made



ScienceDaily (May 19, 2008) — Chinese researchers have recently made a “golden crown” with a diameter of only a few nanometers. It is a large ring-shaped molecule containing 36 gold atoms. The lords of the ring, a team of researchers from the Universities of Beijing, Hong Kong, and Nanjing report their unusual compound in the journal Angewandte Chemie: the molecular ring structure is held together exclusively by gold–gold bonds and is thus the largest ring system made of gold atoms produced to date.
Large molecular rings have fascinated chemists for over 40 years—ever since the discovery of crown ethers in 1967. The pioneers in this area, C. J. Pederson, J.-M. Lehn, and D. J. Cram received the Nobel Prize in Chemistry for their discovery in 1987. In the meantime, large molecular ring systems have played an important role in the search for new functional materials and in nanotechnology. The synthesis of ring systems held together exclusively by metal–metal bonds has remained a challenge.
Small rings made of positively charged gold atoms have been known for some time, but only recently could the Chinese team make a ring containing 16 gold atoms. Now, the researchers, led by Shu-Yan Yu, Yi-Zhi Li, and Vivian Wing-Wah Yam, have introduced a new representative of this class of compounds, the biggest gold ring to date that is held together by means of gold–gold bonds: a ring system containing 36 univalent gold atoms.
The researchers started their synthesis with a ring system containing six gold atoms. Three of the gold atoms are linked into a triangle. Each of these gold atoms is attached to another gold atom that sticks out from the corner of the triangle. Three organic ligands are then bound to this flat double triangle to form a molecule that resembles a three-blade propeller.
Six such “propellers” can be linked into a larger ring by means of a self-assembly process. Within this ring system, the gold atoms are arranged into a shape that resembles a crown: six double triangles are each bound to each other by two corners. The free double-corners point outward in a pattern that alternates above and below the plane of the ring.

Fausto Intilla - www.oloscience.com

Wednesday 14 May 2008

Large Hadron Collider (LHC), CERN: The Potential for Danger in Particle Collider Experiments


Summary:

The upcoming Large Hadron Collider (LHC) at CERN could be dangerous. It could produce potentially dangerous particles such as mini black holes, strangelets, and monopoles.
A CERN study indicates no danger for earth, [Ref. 1] but its arguments are incomplete. The reasons why they are incomplete are discussed here.
This paper considers mainly micro black holes (MBHs) with low speeds. The fact that the speed of resultant MBHs would be low is unique to colliders. An important issue is the rate of accretion of matter subsequent to MBH creation.
This study explores processes that could cause accretion to be significant.
Other dangers of the LHC accelerator are also discussed.
I. Arguments for danger in LHC particle accelerator experiments
"In the 27-kilometer-long circular tunnel that held its predecessor, the LHC will be the most powerful particle accelerator in the world. It will smash fundamental particles into one another at energies like those of the first trillionth of a second after the Big Bang, when the temperature of the Universe was about ten thousand trillion degrees Centigrade." [Ref. 5]
1. There is a high probability that micro black holes (MBHs) will be produced in the LHC. A reasonable estimate of the probability that theories with (4+d) dimensions are valid could be more than 60%. The CERN study indicates that in this case there would be copious production of MBHs at the LHC. [Ref. 1] One MBH could be produced every second. [Ref. 4 & Ref. 5]
2. The CERN study indicates that MBHs present no danger because they will evaporate via Hawking evaporation. [Ref. 1] However, Hawking evaporation has never been tested. In several surveys, physicists have estimated a non-trivial probability that Hawking evaporation will not work. [Ref. 9] My estimate of the risk of Hawking evaporation failure is 20%, or perhaps as much as 30%.
The following points assume MBH production, and they assume that Hawking evaporation will fail.
3. The cosmic ray model is not valid for the LHC. It has been said that cosmic rays, which have more energy than the LHC, show that there is no danger. This may be true for accelerators that shoot high-energy particles at a target at rest, which is similar to cosmic rays striking the moon's surface. In these cases the center of mass of the interaction retains a high speed. This is different from the situation at the LHC, where particles with opposing velocities collide. With cosmic rays (mainly protons), we would need a speed of 0.9999995 c to create a micro black hole of 1 TeV, and after the interaction the micro black hole's center of mass would still have a speed of 0.999 c. As MBHs are not very reactive with matter, calculations indicate that this is more than enough velocity to cross planets or stars without being caught and to escape into space.
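The kinematic point being made here can be stated with the standard invariant-mass formulas (textbook relativistic kinematics, not taken from [Ref. 1]): for a cosmic-ray proton of energy E hitting a proton at rest, the available center-of-mass energy is only about the square root of 2 E m_p c^2, and the center of mass keeps moving at nearly the speed of light in the earth frame, whereas for two beams of energy E_b colliding head on the center of mass is essentially at rest in the laboratory:

```latex
\sqrt{s}_{\,\text{fixed target}} \;=\; \sqrt{\,2\,E\,m_{p}c^{2} + 2\,(m_{p}c^{2})^{2}\,}
\;\simeq\; \sqrt{2\,E\,m_{p}c^{2}} ,
\qquad
\sqrt{s}_{\,\text{collider}} \;=\; 2\,E_{b} .
```

Any heavy object produced in the fixed-target case therefore inherits a large boost along the beam direction, while a collider can in principle produce it nearly at rest; this is the asymmetry the argument above relies on.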
4. Lower-speed MBHs created in colliders could be captured by the earth. Using Greg Landsberg's calculation [Ref. 2] that one black hole with velocity less than the earth's escape velocity would be produced every 10^5 seconds at the LHC, we would have about 3,160 MBHs captured by the earth in ten years. More precise calculations show that we could have a distribution of MBHs at every speed from 0 m/s to 4 m/s. The probability of very low-speed MBHs is not zero. We need to evaluate whether low-speed MBHs present more risks.
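The quoted figure of about 3,160 captured MBHs is just the quoted production rate integrated over ten years; a one-line check of that arithmetic, using only the numbers given in the text, is:

```python
# One MBH below Earth's escape velocity per 1e5 s (rate quoted in the text).
seconds_per_year = 365.25 * 24 * 3600
captured_in_ten_years = 10 * seconds_per_year / 1e5
print(round(captured_in_ten_years))   # ~3156, i.e. roughly the 3,160 quoted
```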
5. The speed of a MBH captured by the earth will decrease, and in the end the MBHs will come to rest at the center of the earth. The speed will decrease because of accretion and interaction with matter.
If we consider that:
a. The CERN study's accretion calculation uses the "Schwarzschild radius" for the accretion cross section. [Ref. 1] In the case of low speeds, we must not use the Schwarzschild radius for the accretion calculation. There are several reasons the capture radius extends beyond the Schwarzschild radius. For example, if the MBH speed were zero, gravitational attraction would be active at a distance greater than the Schwarzschild radius.
b. If a MBH accretes an electron, it will acquire a charge and then probably accrete a proton.
c. If a MBH accretes a quark, it will then probably accrete a proton. When a quark is caught, the whole nucleon can be expected to be caught, because otherwise the black hole would have acquired a charge which is not an integer (for example, minus 1/3). In a nucleus a fractional charge is unstable and not allowed. This strongly suggests that the MBH will be required to accrete other fractional charges until it reaches an integer number of charges. The same process can be expected with regard to quark color.
d. Gauge forces at short distances could also help to capture an atomic nucleus.
Our calculation indicates that a slow-speed MBH can be expected to capture 8,400 nucleons every hour, at the beginning of an exponential process.
6. In the center of the earth new processes could occur. As stated above, it has been estimated that in ten years 3,160 MBHs could be captured by the earth. All MBHs will progressively lose speed because of numerous interactions. After a time (calculations still have to be completed to estimate how long), all these MBHs will migrate toward the precise gravitational center of the earth (Kip Thorne, [Ref. 7, p. 111]). After numerous interactions they will stop there at rest and then coalesce into a single MBH. As a first approach, our calculation indicates that the mass of this MBH could be on the order of 0.02 g, with a radius of 4 x 10^-17 m. At the center of the earth, the pressure is 3.6 x 10^11 pascals. [Ref. 8] This pressure results from all the matter of the earth pushing on the electron clouds of the central atoms. The motion of the electrons produces a pressure (the degeneracy pressure) that counterbalances the pressure of all the matter of the earth.
Around a black hole there is no electron cloud, and so there is no degeneracy pressure to counterbalance the pressure of all the earth's matter. To estimate the pressure we must consider the surface: in the relation P = F / S, if we keep the force F constant and reduce the surface S, the pressure P must increase. Here F is the weight of all the matter of the earth, and this does not change. As the surface of the MBH will be very small, the calculation indicates an impressive increase of pressure on this surface, of roughly P ≈ 7 x 10^23 Pa.
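One way to reproduce the order of magnitude quoted above, under the text's own assumptions, is to scale the central pressure by the ratio of areas, taking the "normal" load-bearing surface to be of atomic size (roughly the Bohr radius, our assumption) and the MBH radius to be the 4 x 10^-17 m quoted earlier. Whether this scaling is physically meaningful is precisely what is in dispute; the sketch below only retraces the arithmetic.

```python
# Retracing the text's P = F / S scaling argument (not an endorsement of it).
P_centre_earth = 3.6e11   # Pa, central pressure quoted in the text [Ref. 8]
r_atom = 5.3e-11          # m, Bohr radius taken as the "normal" surface scale (assumption)
r_mbh = 4e-17             # m, MBH radius quoted in the text

# Same force spread over a much smaller sphere: P scales as 1/r^2.
P_on_mbh = P_centre_earth * (r_atom / r_mbh) ** 2
print(f"{P_on_mbh:.1e} Pa")   # ~6e23 Pa, the order of magnitude quoted (~7e23 Pa)
```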
The high pressure in this region pushes all the matter strongly toward the central point where the MBH sits.
Electrons directly in contact with the micro black hole will be caught first, and then the nuclei will be caught.
The atoms will certainly be caught one after the other, and the higher the pressure, the faster the capture. When a neutron star begins to collapse into a black hole (implosion), the black hole is at first only a micro black hole, as we see in [Ref. 7, p. 443]. At that very moment, the high gravitational pressure at the center of the neutron star is what breaks the "strong force" binding the quarks inside the neutrons.
The MBH will grow there only because of the high pressure.
At the center of the earth the pressure is normally far too small for such a process, but if we create a slow-speed MBH that does not evaporate, and if this MBH comes to rest at the center of the earth, the pressure there could be sufficient for the MBH to grow. We must remember that in the surroundings of the MBH the "strong force" is broken, and this could mean that the same kind of pressure-driven process as in a neutron star could operate there (in a slow mode compared with a neutron star, of course). At the center of the earth, the high pressure, the high temperature and the increasing mass, together with electrical and gauge-force processes, could mean a significant increase of capture and the possible beginning of a dangerous exponential accretion process. As a first approximation, our calculation indicates that for a MBH of 0.02 g at rest at the center of the earth, the rate of accretion of matter could be in the range of 1 g/s to 5 g/s.
7. Conclusion about MBHs: we estimate that for the LHC the risk is in the range of 7% to 10%.

II. Other Risk Factors

The CERN study indicates that strangelets and monopoles could be produced and present no danger for earth. [Ref. 1]
We will present arguments of possible danger.
1. Strangelets
Strangelets are only dangerous for the earth if they are not moving rapidly through matter. If even one strangelet were at zero speed, there would be danger. We have seen for MBHs that the cosmic ray model is very different from the LHC, where particles with opposing velocities collide. We have seen that, given the impact of opposite-velocity particles, the distribution of speeds of the resulting particles gives a non-zero probability of very low speeds (0 m/s < speed < 4 m/s), and this could mean dangerous strangelets. We estimate a minimal risk for strangelets on the order of 2%. We might estimate it as high as 10% if we want to be prudent, because the danger is paramount!
2. Monopoles
Monopoles could be produced at the LHC. [Ref. 1] CERN's calculations indicate that one monopole produced at the LHC could destroy 1,018 nucleons, but that it would quickly traverse the earth and escape into space. However, we know that photons produced in the center of the sun need thousands of years to traverse the sun and escape into space because of their numerous interactions. If the velocity given to the monopole after each interaction is in a random direction, we can imagine that monopoles produced at the LHC could stay in the earth for a very long time and be dangerous.
3. Estimate of danger due to our ignorance of the ultimate physical laws
We have not exhausted the processes that might cause danger. There are other particles, dark energy, dark matter, quintessence, vacuum energy, and many theories that are not definitive. We estimate this danger as ranging from a minimal 2% risk up to 5%.

III. CONCLUSION

The CERN study [Ref. 1] is a remake, adapted to the LHC, of a similar study carried out for the earlier Relativistic Heavy Ion Collider (RHIC) at Brookhaven [Ref. 6].
It is important to notice that the study for the RHIC concluded that no black holes would be created, whereas for the LHC the conclusion is very different: "Black holes could be created!"
The main danger could now be just behind our door, with the possible death of 6.500.000.000 (US notation 6,500,000,000) people and the complete destruction of our beautiful planet. Such a danger shows the need for a far larger study before any experiment! The CERN study presents the risk as a choice between 100% and 0%. This is not a proper evaluation of a risk percentage!
If we add all the risks for the LHC, we could estimate an overall risk between 11% and 25%!
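For transparency, the 11%-25% total is simply the sum of the individual ranges stated above (MBHs 7-10%, strangelets 2-10%, ignorance of ultimate laws 2-5%); a minimal sketch of that arithmetic:

```python
# Sum the lower and upper bounds of the risk ranges stated in the text.
# No independent probability estimate is made here.
risk_ranges = {
    "micro black holes": (7, 10),
    "strangelets": (2, 10),
    "unknown physics": (2, 5),
}
low = sum(lo for lo, hi in risk_ranges.values())
high = sum(hi for lo, hi in risk_ranges.values())
print(f"combined range: {low}% to {high}%")  # -> 11% to 25%
```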
We are far from Adrian Kent's admonition that, to have a chance of being acceptable, global risks should not exceed 0.000001% per year. [Ref. 3] Even testing the LHC could be dangerous. Even an increase in the luminosity of the RHIC could be dangerous! It would be wise to consider that the more powerful the accelerator, the more unpredictable and dangerous the events that may occur! We cannot keep building ever more powerful accelerators, with interactions different from natural ones, without risk. This is not a scientific problem; it is a wisdom problem!
Our desire for knowledge is important, but our desire for wisdom is more important and must take precedence. The precautionary principle says not to run the experiment. Politicians must understand this evidence and stop these experiments before it is too late!

Fausto Intilla - www.oloscience.com
-----------------------------------------------------------------
References:
1. Study of potentially dangerous events during heavy-ion collisions at the LHC: Report of the LHC Safety Study Group, CERN 2003-001, February 28, 2003.
2. E-mail exchange between Greg Landsberg and James Blodgett, March 2003, http://www.risk-evaluation-forum.org. (No longer posted; request a copy from the Risk Evaluation Forum, Box 2371, Albany, NY 12220-0371, USA.)
3. A critical look at risk assessment for global catastrophes, Adrian Kent, CERN-TH 2000-029, DAMTP-2000-105, revised April 2003, hep-ph/0009204. Available at: http://arxiv.org/PS_cache/hep-ph/pdf/0009/0009204.pdf.
4. High energy colliders as black hole factories: the end of short distance physics, Steven B. Giddings and Scott Thomas, Phys. Rev. D 65 (2002) 056010.
5. CERN to spew black holes, Nature, October 2, 2001.
6. Review of speculative disaster scenarios at RHIC, W. Busza, R. L. Jaffe, J. Sandweiss and F. Wilczek, September 28, 1999.
7. Trous noirs et distorsions du temps, Kip S. Thorne, Flammarion, 1997, ISBN 2-08-0811463-X. Original title: Black Holes and Time Warps, Norton, New York, 1994.
8. Centre de la Terre, Science & Vie N. 1042, Gallate 2004.
9. Results of several Delphi groups and physicist questionnaires, James Blodgett, Risk Evaluation Forum, forthcoming.

Sunday, May 11, 2008

He seeks a route through time


By Dan Falk
Globe Correspondent / May 12, 2008

Ronald Mallett was 10 when he lost his father to heart disease, an event that left him in utter despair. His depression lasted until he read "The Time Machine" by H.G. Wells and, a few years later, the theories of Albert Einstein - and he became determined to see his father again.
For years, Mallett, a physicist at the University of Connecticut, stayed in the "time-travel closet," as he put it, keeping his desire to build a time machine under wraps for fear of ridicule.
Today, with other established physicists speaking openly about time travel, Mallett is finally able to talk unabashedly about his research. Not only that, he and other like-minded physicists are publishing their findings in peer-reviewed journals - something hardly imaginable just a decade ago.
Time travel, of course, has been a favorite topic for science-fiction writers for more than a century, from Wells's pioneering novel to the campy "Back to the Future" movie trilogy. But the scientific urge to investigate time travel is about more than sci-fi fantasies. Contemplating time travel is forcing scientists to confront some of the most profound issues in physics, from the nature of the universe's ultimate laws to fundamental questions about the nature of space and time.
Mallett's proposal seems innocuous enough: He's currently designing a table-top experiment using a ring of high-powered lasers. The idea is that light carries energy, and, as Einstein showed, energy is equivalent to mass - therefore beams of light can distort spacetime, just as large masses do. (The warping of spacetime by large masses was the prediction that made Einstein famous; the idea was confirmed when starlight was seen to be displaced by our sun during a solar eclipse in 1919.)
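The relations invoked here are the standard ones from relativity; written out compactly (these are general equations, not anything specific to Mallett's design):

```latex
% Mass-energy equivalence (special relativity)
E = mc^{2}
% Einstein field equations (general relativity): spacetime curvature G_{\mu\nu}
% is sourced by the energy-momentum tensor T_{\mu\nu}, which is nonzero for
% light as well as for ordinary matter.
G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```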
Inside Mallett's circle of laser beams, empty space would become "twisted" in much the same way that milk in a coffee cup begins to swirl when the coffee is stirred. If the beams of light are intense enough, the warping of space and time close to the beams could be severe enough to create a "loop" in time, Mallett says.
An object or particle that traveled along such a curve would, in theory, travel into its own past, just as walking around the block brings you back to your house. His goal is to send a stream of neutrons through the light beams - and, he predicts, transport them back in time by a tiny fraction of a second.
Mallett's work has brought mixed reactions. The head of his department at the University of Connecticut, William Stwalley, told New Scientist that although he was intrigued by the challenges of the experimental design, making any sort of time machine "seems like a distant improbability." Two physicists at Tufts University, meanwhile, recently wrote a critique of Mallett's theory, suggesting that any closed timelike curves he creates would need to be bigger "than the radius of the visible universe."
Of course, it's the philosophical problems that cause the most fuss - and which seem to inspire writers and filmmakers. The most famous dilemma is the so-called "grandfather paradox," in which a time traveler kills his grandfather, thus preventing his own birth. And yet, there's nothing in the known laws of physics that specifically prohibits time travel. As Case Western Reserve University physicist Lawrence Krauss has put it, "Einstein's equations of general relativity not only do not directly forbid such possibilities, they encourage them."
Most physicists suspect it would take extremely exotic structures - such as a black hole - to warp spacetime to the degree required to send something back in time. One favored scenario (at least on paper) is a so-called "wormhole," a pair of black holes that are connected via a kind of "tunnel" through spacetime. While a run-of-the-mill black hole would squash any traveler who entered it, a wormhole, in theory, could act as a "bridge" across distant points in space and time, depositing the traveler into another universe, or perhaps a distant part of our own universe, depending on where (and when) the two ends of the wormhole are located. While allowed by theory, however, there is no evidence so far that wormholes actually exist.
But imagine, for a moment, that time travel is possible. How could the paradoxes be resolved? One possibility is that the laws of physics somehow forbid such acts as the killing of one's grandfather, through a mechanism not yet understood (this is what Stephen Hawking has called the "chronology protection conjecture"). A more intriguing possibility is that a time traveler who alters the past would then experience a "new future" - a new universe, perhaps. In other words, if you kill your grandfather, you would find you now inhabit a different universe - with a different history - from the one you left behind.
"Our universe will not be affected by what you do in the past," Mallett said with confidence.
Time travel into the future, incidentally, is not as controversial: Thanks to the time-dilation effects described in the first part of Einstein's theory, known as special relativity, traveling into the future is as simple as moving quickly. The Apollo astronauts, for example, returned to Earth having aged a tiny amount less than their stay-at-home colleagues - because they had effectively traveled into the future (just a few hundredths of a second, mind you, because their speed was still a snail's pace compared with the speed of light).
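The effect follows directly from the time-dilation formula of special relativity; a minimal sketch with illustrative numbers (a speed of about 7.7 km/s sustained for six months, roughly a space-station mission rather than the Apollo flights mentioned above):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def proper_time_deficit(speed_m_s: float, duration_s: float) -> float:
    """Seconds by which a traveller at `speed_m_s` ages less than a stay-at-home
    observer over coordinate time `duration_s` (special relativity only)."""
    gamma_inv = math.sqrt(1.0 - (speed_m_s / C) ** 2)
    return duration_s * (1.0 - gamma_inv)

# Illustrative numbers (assumptions, not from the article):
half_year_s = 0.5 * 365.25 * 86_400              # ~1.58e7 seconds
print(proper_time_deficit(7_700.0, half_year_s))  # ~5e-3 s, a few milliseconds
```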
Even if time travel is eventually shown to be impossible, that, too, would be a tremendously important discovery. As Hawking has put it: "It is important that we understand why it is impossible."
Dan Falk is a science journalist in Toronto. His latest book, "In Search of Time: The Science of a Curious Dimension," will be published by St. Martin's Press this fall.
FACT SHEET
Hometown: born in Roaring Springs, Pa., raised in the Bronx, now lives in East Hartford, Conn.
Education: BS from Penn State in 1969, PhD, also from Penn State, in 1973.
Favorite time-travel movies: "The Time Machine" (1960), "Deja Vu" (2007), "Back to the Future" (1985), "Planet of the Apes" (1968), "Somewhere in Time" (1980), "Frequency" (2000), "The Butterfly Effect" (2004), and "TimeCop" (1994) - "People ask me all the time: 'Wouldn't there be abuses if time travel were possible?' I like this movie because it addresses how time travel would have to be regulated like everything else in our lives."
His 2007 book: "Time Traveler: A Scientist's Personal Mission to Make Time Travel a Reality."
Fausto Intilla - www.oloscience.com

Wednesday, May 7, 2008

'Crispy Noodle' Chemistry Could Reduce Carbon Emissions


Source:
ScienceDaily (May 6, 2008) — A new material developed in Manchester, which has a structure that resembles crispy noodles, could help reduce the amount of carbon dioxide being pumped out and drive the next generation of high-performance hydrogen cars.
Dr Peter Budd, a materials chemist working in the Organic Materials Innovation Centre (OMIC) at The University of Manchester, has won £150,000 worth of new funding to explore the use of a special polymer to effectively remove CO2 as it’s belched from fossil fuel power stations or hydrogen production plants.
The 18-month study, which is funded by the Engineering and Physical Sciences Research Council (EPSRC), will look at the feasibility of using catalytic membrane systems to capture and recover carbon dioxide.
Dr Budd proposes to explore the potential of composite membranes made from a ‘polymer of intrinsic microporosity’, or PIM, and a synthetic catalyst.
He even hopes to make progress towards creating a unique and highly efficient double membrane system that can be used for both CO2 removal and CO2 recovery.
This latest project expands on exciting work by Dr Budd, Professor Neil McKeown at Cardiff University and David Book at the University of Birmingham, which is aiming to use PIMs to store large amounts of hydrogen. This work could bring about the attractive possibility of safe hydrogen storage with energy-efficient release for consumption.
Polymers have not previously been investigated as materials for the storage of hydrogen because most polymers have enough conformational and rotational freedom to pack space efficiently and are therefore not microporous.
But the polymers developed by Dr Budd and colleagues do possess significant microporosity – and preliminary hydrogen sorption results are encouraging, with significant quantities adsorbed. Most importantly, the chemical composition of PIMs can be tailored via synthetic chemistry.
Dr Budd said: “The PIM acts a bit like a sponge when hydrogen is around. It's made up of long molecules that can trap hydrogen between them, providing a way of supplying hydrogen on demand.
“Imagine a plate of spaghetti - when it's all coiled together there's not much space between the strands. Now imagine a plate of crispy noodles - their rigid twisted shape means there are lots of holes.
“The polymer is designed to have a rigid backbone, and it has twists and bends built into it. Because of this, lots of gaps and holes are created between molecules - perfect for tucking the hydrogen into.
“The holes between the molecules give the polymer a very high surface area - each gram has a surface area equivalent to around three tennis courts. The molecules in the polymer act like sieves, catching smaller molecules like hydrogen in the gaps between them.
“The holes created in the polymer between molecules are a good fit for hydrogen. Hydrogen molecules stick in these holes and are kept there by weak forces - this means they can be released when they are needed.
“Hydrogen is most sticky when it is cooled down to low temperatures. When the hydrogen is needed to power the car, the system would just raise the temperature to free up the hydrogen molecules.”
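The "three tennis courts per gram" comparison can be turned into a conventional surface-area figure; a rough sketch, where the court dimensions (a standard doubles court, about 23.8 m by 11.0 m) are our assumption rather than a number from the article:

```python
# Rough conversion of "three tennis courts per gram" into m^2 per gram.
# Court dimensions are an assumption (standard doubles court), not from the article.
COURT_LENGTH_M = 23.77
COURT_WIDTH_M = 10.97
court_area_m2 = COURT_LENGTH_M * COURT_WIDTH_M   # ~261 m^2
surface_area_per_gram = 3 * court_area_m2        # ~780 m^2/g
print(f"~{surface_area_per_gram:.0f} m^2 per gram of polymer")
```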
PIMs were created at The University of Manchester several years ago by Dr Budd and colleagues.
Dr Budd says he is encouraged by the progress being made, but warns that a lot of work still needs to be done.
“In the context of climate change and dwindling oil reserves, hydrogen could be the perfect zero-carbon fuel for a car as it only gives water as a by-product," he adds.
At the moment, the polymer Dr Budd and key collaborators at The University of Birmingham and Cardiff University have developed can store about three per cent of its weight as hydrogen, but they hope to double this in the future.
“If we could get that figure up to six per cent hydrogen, that may be enough for a car to go around 300 miles without a refill,” says Dr Budd.
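To get a sense of scale for those percentages, here is a minimal sketch converting them into the polymer mass needed to carry a given hydrogen payload; the 5 kg payload is an illustrative assumption, not a figure from the article:

```python
# Polymer mass needed to carry a hydrogen payload at the quoted gravimetric
# capacities (3 wt% today, 6 wt% target), where "x per cent of its weight"
# is read as hydrogen mass = x% of the polymer mass.
H2_PAYLOAD_KG = 5.0  # illustrative assumption, not from the article

for capacity in (0.03, 0.06):
    polymer_kg = H2_PAYLOAD_KG / capacity
    print(f"{capacity:.0%} capacity -> ~{polymer_kg:.0f} kg of polymer "
          f"for {H2_PAYLOAD_KG} kg of hydrogen")
```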
Although hydrogen-powered cars are now available commercially, they don’t yet perform as well as vehicles that burn conventional fossil fuels.
The greatest obstacle to the development of high performance hydrogen-powered cars remains the lack of a system for safe, efficient and convenient on-board storage of hydrogen.
Adapted from materials provided by University of Manchester.

Fausto Intilla - www.oloscience.com

Record-Setting Laser May Boost Search For Earthlike Planets 100 Fold


Source:

ScienceDaily (May 6, 2008) — Scientists at the University of Konstanz in Germany and the National Institute of Standards and Technology (NIST) have demonstrated an ultrafast laser that offers a record combination of high speed, short pulses and high average power. The same NIST group also has shown that this type of laser, when used as a frequency comb—an ultraprecise technique for measuring different colors of light—could boost the sensitivity of astronomical tools searching for other Earthlike planets as much as 100 fold.
The dime-sized laser, to be described Thursday, May 8, at the Conference on Lasers and Electro-Optics,* emits 10 billion pulses per second, each lasting about 40 femtoseconds (quadrillionths of a second), with an average power of 650 milliwatts. For comparison, the new laser produces pulses 10 times more often than a standard NIST frequency comb while producing much shorter pulses than other lasers operating at comparable speeds. The new laser is also 100 to 1000 times more powerful than typical high-speed lasers, producing clearer signals in experiments. The laser was built by Albrecht Bartels at the Center for Applied Photonics of the University of Konstanz.
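From the quoted repetition rate, pulse duration and average power one can back out the energy per pulse and a rough peak power; a minimal sketch using only the numbers in the paragraph above:

```python
# Back-of-envelope pulse parameters from the figures quoted above.
rep_rate_hz = 10e9         # 10 billion pulses per second
pulse_duration_s = 40e-15  # ~40 femtoseconds
avg_power_w = 0.650        # 650 milliwatts

energy_per_pulse_j = avg_power_w / rep_rate_hz        # ~65 picojoules
peak_power_w = energy_per_pulse_j / pulse_duration_s  # ~1.6 kW (rough; ignores pulse shape)

print(f"energy per pulse ~ {energy_per_pulse_j*1e12:.0f} pJ")
print(f"approximate peak power ~ {peak_power_w/1e3:.1f} kW")
```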
Among its applications, the new laser can be used in searches for planets orbiting distant stars. Astronomers look for slight variations in the colors of starlight over time as clues to the presence of a planet orbiting the star. The variations are due to the small wobbles induced in the star’s motion as the orbiting planet tugs it back and forth, producing minute shifts in the apparent color (frequency) of the starlight. Currently, astronomers’ instruments are calibrated with frequency standards that are limited in spectral coverage and stability. Frequency combs could be more accurate calibration tools, helping to pinpoint even smaller variations in starlight caused by tiny Earthlike planets. Such small planets would cause color shifts equivalent to a star wobble of just a few centimeters per second. Current instruments can detect, at best, a wobble of about 1 meter per second.
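The sensitivity at stake follows from the non-relativistic Doppler relation, delta-f/f = v/c; a minimal sketch comparing the ~1 m/s limit quoted above with a few centimeters per second (taking 3 cm/s as an illustrative value for an Earthlike planet's pull):

```python
# Fractional Doppler shift delta-f/f = v/c for the wobble speeds discussed above.
C = 299_792_458.0  # speed of light, m/s

for label, v_m_s in [("current limit (~1 m/s)", 1.0),
                     ("Earthlike planet (illustrative 3 cm/s)", 0.03)]:
    print(f"{label}: delta-f/f ~ {v_m_s / C:.1e}")
```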
Standard frequency combs have “teeth” that are too finely spaced for astronomical instruments to read. The faster laser is one approach to solving this problem. In a separate paper,** the NIST group and astronomer Steve Osterman at the University of Colorado at Boulder describe how, by bouncing the light between sets of mirrors a particular distance apart, they can eliminate periodic blocks of teeth to create a gap-toothed comb. This leaves only every 10th or 20th tooth, making an ideal ruler for astronomy.
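If the filter is a simple two-mirror (Fabry-Perot-style) cavity, as "bouncing the light between sets of mirrors a particular distance apart" suggests, the mirror spacing follows from matching the cavity's free spectral range, c/(2L), to a multiple of the comb tooth spacing. A hedged sketch, where both the Fabry-Perot geometry and the 1 GHz comb spacing are our assumptions (the 1 GHz figure is only inferred from the factor-of-ten comparison with the 10 GHz laser above):

```python
# Mirror spacing for a two-mirror filter cavity whose free spectral range (FSR)
# equals m times the comb tooth spacing, so that only every m-th tooth is passed.
# Fabry-Perot geometry and the 1 GHz comb spacing are assumptions, not article figures.
C = 299_792_458.0      # speed of light, m/s
comb_spacing_hz = 1e9  # assumed spacing of the standard comb being filtered

for m in (10, 20):
    fsr_hz = m * comb_spacing_hz
    mirror_spacing_m = C / (2.0 * fsr_hz)  # FSR = c/(2L)  =>  L = c/(2*FSR)
    print(f"keep every {m}th tooth: FSR = {fsr_hz/1e9:.0f} GHz, "
          f"mirror spacing ~ {mirror_spacing_m*1e3:.1f} mm")
```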
Both approaches have advantages for astronomical planet finding and related applications. The dime-sized laser is very simple in construction and produces powerful and extremely well-defined comb teeth. On the other hand, the filtering approach can cover a broader range of wavelengths. Four or five filtering cavities in parallel would provide a high-precision comb of about 25,000 evenly spaced teeth that spans the visible to near-infrared wavelengths (400 to 1100 nanometers), NIST physicist Scott Diddams says.
Osterman says he is pursuing the possibility of testing such a frequency comb at a ground-based telescope or launching a comb on a satellite or other space mission. Other possible applications of the new laser include remote sensing of gases for medical or atmospheric studies, and on-the-fly precision control of high-speed optical communications to provide greater versatility in data and time transmissions. The application of frequency combs to planet searches is of international interest and involves a number of major institutions such as the Max Planck Institute for Quantum Optics and the Harvard-Smithsonian Center for Astrophysics.
Background on frequency combs and NIST’s role in their development can be found at: “Optical Frequency Combs” at http://www.nist.gov/public_affairs/newsfromnist_frequency_combs.htm.
* A. Bartels, D. Heinecke and S.A. Diddams. Passively mode-locked 10 GHz femtosecond Ti:sapphire laser with >1 mW of power per frequency comb mode. Post-deadline paper presented at Conference on Lasers and Electro-Optics (CLEO), San Jose, Calif., May 4-9, 2008.
** D.A. Braje, M.S. Kirchner, S. Osterman, T. Fortier and S.A. Diddams. Astronomical spectrograph calibration with broad-spectrum frequency combs. To appear in European Physical Journal D. (Posted online at arXiv:0803.0565)
Adapted from materials provided by National Institute of Standards and Technology.

Fausto Intilla - www.oloscience.com