Sunday, May 16, 2010

Quantum Dynamics of Matter Waves Reveal Exotic Multibody Collisions.

Source: ScienceDaily
-------------------------
ScienceDaily (May 16, 2010) — At extremely low temperatures atoms can aggregate into so-called Bose-Einstein condensates, forming coherent, laser-like matter waves. Due to interactions between the atoms, fundamental quantum dynamics emerge and give rise to periodic collapses and revivals of the matter wave field.
A group of scientists led by Professor Immanuel Bloch (Chair of Experimental Physics at the Ludwig-Maximilians-Universität München (LMU) and Director of the Quantum Many-Body Systems Division at the Max Planck Institute of Quantum Optics in Garching) has now succeeded in taking a glance 'behind the scenes' of atomic interactions, revealing the complex structure of these quantum dynamics. By generating thousands of miniature BECs ordered in an optical lattice, the researchers were able to observe a large number of collapse and revival cycles over long periods of time.
The research is published in the journal Nature.
The experimental results imply that the atoms interact not only pairwise -- as is typically assumed -- but also undergo exotic collisions involving three, four or more atoms at the same time. On the one hand, these results are of fundamental importance for the understanding of quantum many-body systems; on the other hand, they pave the way for the generation of new exotic states of matter based on such multi-body interactions.
The experiment starts by cooling a dilute cloud of hundreds of thousands of atoms to temperatures close to absolute zero, approximately -273 degrees Celsius. At these temperatures the atoms form a so-called Bose-Einstein condensate (BEC), a quantum phase in which all particles occupy the same quantum state. Now an optical lattice is superimposed on the BEC: this is a kind of artificial crystal made of light with periodically arranged bright and dark areas, generated by the superposition of standing laser light waves from different directions. This lattice can be viewed as an 'egg carton' on which the atoms are distributed. Whereas in a real egg carton each site is occupied by either a single egg or no egg, the number of atoms sitting at each lattice site is determined by the laws of quantum mechanics: depending on the lattice height (i.e. the intensity of the laser beam), a single lattice site can be occupied by zero, one, two, three or more atoms at the same time.
The use of these "atom number superposition states" is the key to the novel measurement principle developed by the researchers. The dynamics of an atom number state can be compared to the dynamics of a swinging pendulum: just as pendulums of different lengths have different oscillation frequencies, so do the states of different atom numbers. "However, these frequencies are modified by inter-atomic collisions. If only pairwise interactions between atoms were present, the pendulums representing the individual atom number states would swing synchronously and their oscillation frequencies would be exact multiples of the pendulum frequency for two interacting atoms," explains Sebastian Will, a graduate student on the experiment.
Using a tricky experimental set-up, the physicists were able to track the evolution of the different superimposed oscillations over time. Interference patterns periodically became visible and disappeared, again and again. From their intensity and periodicity the physicists found unambiguous evidence that the frequencies are in fact not simple multiples of the two-body case. "This really caught us by surprise. We became aware that a more complex mechanism must be at work," Sebastian Will recalls. "Due to their ultralow temperature the atoms occupy the energetically lowest possible quantum state at each lattice site. Nevertheless, Heisenberg's uncertainty principle allows them to make -- so to speak -- a virtual detour via energetically higher-lying quantum states during their collision. In practice, this mechanism gives rise to exotic collisions, which involve three, four or more atoms at the same time."
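The underlying arithmetic can be sketched with a single-site toy model. For a coherent superposition of atom numbers with on-site energies E(n), the matter-wave field evolves as a sum of terms oscillating at the frequencies [E(n+1) - E(n)]/ħ; purely pairwise interactions give E(n) = U n(n-1)/2, so every frequency is an integer multiple of U/ħ and the revivals are perfect, whereas an effective multi-body correction detunes them. The sketch below uses illustrative numbers only (the three-body strength U3 is hypothetical, not the value measured in the experiment):

import numpy as np
from math import factorial

# Toy collapse-and-revival model for one lattice site (illustrative, not the Nature analysis).
hbar = 1.0
U2 = 1.0        # pairwise on-site interaction energy (arbitrary units)
U3 = 0.08       # hypothetical effective three-body correction from virtual excursions
nbar = 2.5      # mean atom number per site
nmax = 15

n = np.arange(nmax)
c = np.exp(-nbar / 2) * np.sqrt(nbar) ** n / np.sqrt([factorial(k) for k in n])

def E(n, u3):
    # Pairwise term U2*n(n-1)/2 plus an optional three-body term u3*n(n-1)(n-2)/6
    return U2 * n * (n - 1) / 2 + u3 * n * (n - 1) * (n - 2) / 6

t = np.linspace(0, 4 * 2 * np.pi * hbar / U2, 2000)   # four pairwise revival periods
for u3 in (0.0, U3):
    # <a>(t) = sum_n conj(c[n+1]) * c[n] * sqrt(n+1) * exp(-i (E(n+1)-E(n)) t / hbar)
    amp = np.zeros_like(t, dtype=complex)
    for k in range(nmax - 1):
        amp += np.conj(c[k + 1]) * c[k] * np.sqrt(k + 1) \
               * np.exp(-1j * (E(k + 1, u3) - E(k, u3)) * t / hbar)
    print("u3 =", u3, "-> |<a>| near first pairwise revival:",
          round(abs(amp[len(t) // 4]), 3))

Running it shows an essentially full revival for u3 = 0 and a noticeably reduced, shifted revival once the hypothetical three-body term is switched on, which is the qualitative signature the experiment exploits.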
The results reported in this work provide an improved understanding of interactions between microscopic particles. This may not only be of fundamental scientific interest, but find a direct application in the context of ultracold atoms in optical lattices. Owing to exceptional experimental controllability, ultracold atoms in optical lattices can form a "quantum simulator" to model condensed matter systems. Such a quantum simulator is expected to help understand the physics behind superconductivity or quantum magnetism. Furthermore, as each lattice site represents a miniature laboratory for the generation of exotic quantum states, experimental set-ups using optical lattices may turn out to be the most sensitive probes for observing atomic collisions.
Story Source:
Adapted from materials provided by Ludwig-Maximilians-Universität München.
Journal Reference:
Sebastian Will, Thorsten Best, Ulrich Schneider, Lucia Hackermüller, Dirk-Sören Lühmann, Immanuel Bloch. Time-resolved observation of coherent multi-body interactions in quantum phase revivals. Nature, 2010; 465 (7295): 197. DOI: 10.1038/nature09036

Sunday, May 9, 2010

Anton Zeilinger vs. Daniel Salart on "spooky action at a distance".


--------------------------
Comment on: Testing the speed of ‘spooky action at a distance’.
(Johannes Kofler, Rupert Ursin, Časlav Brukner, Anton Zeilinger)
-
In a recent experiment, Salart et al. addressed the important issues of the speed of hypothetical communication and of reference frames in Bell-type experiments. The authors report that they "performed a Bell experiment using entangled photons" and conclude from their experimental results that "to maintain an explanation based on spooky action at a distance we would have to assume that the spooky action propagates at speeds even greater than the bounds obtained in our experiment", exceeding the speed of light by orders of magnitude. Here we show that, analyzing the experimental procedure, explanations with subluminal or even no communication at all exist for the experiment.
In order to explain the violation of Bell inequalities within the view where, to use the authors' wording, "correlated events have some common causes in their shared history", one needs to assume hypothetical communication between the observer stations. This communication must be faster than light if the outcome at one station is space-like separated from all relevant events at the other station.
In the experiment pairs of time-bin entangled photons were sent over 17.5 km optical fibers to two receiving stations, located in Jussy and Satigny, both equipped with a Franson-type interferometer and detectors. The outcomes were observed space-like separated from each other. The phase in the interferometer, i.e. the setting, in Jussy was continuously scanned, while the setting at Satigny was kept stable.
However, if the setting at one side remains unchanged, the results at both observer stations can be described by a "common-cause" without having to invoke any communication at all, let alone superluminal spooky action at a distance. This is signified, e.g., by the fact that no formulation of a bipartite Bell-type inequality exists which does not use at least two settings on each side. Therefore, contrary to the claim in the paper, no Bell test was performed.
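For reference, the standard bipartite Bell inequality of the Clauser-Horne-Shimony-Holt (CHSH) form illustrates this point. With two settings a, a' on one side, two settings b, b' on the other, and E(.,.) the correlation of the measurement outcomes, local common-cause models satisfy

S = \left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \le 2 ,

and if one side uses only a single setting, the remaining two correlators can always be reproduced by such a model.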
Furthermore, had the experiment been repeated with a second stable setting at Satigny, a "common-cause" explanation would still be possible. This is because in order to exclude subluminal communication, it is crucial that the outcome event on each side is space-like separated from the setting choice on the other side – which was not done in Ref. [1]. Thus, such experimental data – even if they were taken with two measurement settings at Satigny and even granting the fair-sampling assumption – could be explained by a "common-cause" model. In other words, the experiment tests the superluminal speed of hypothetical influences between outcome events under the assumption of no, not even subluminal, hypothetical influences between setting choices and outcome events.
We also remark that in a Franson-type experiment like the one reported in Ref. [1] the considered Clauser-Horne-Shimony-Holt Bell inequality is not applicable even with perfect detectors because of the inherent postselection [2]. One would (i) have to use a chained Bell inequality [2], (ii) achieve fast switching with a rate depending on the geometry of the interferometer, and (iii) reach a better visibility than the one reported in Ref. [1]. None of these three issues is covered by the experiment.
We would like to stress that this comment should not be seen as a defence of local realism. And neither do we demand that Ref. [1] must present a loophole-free Bell test. However, it is the purpose of our comment to point out "common-cause" explanations of an experiment which aims at putting "stringent experimental bounds on the speed of all such hypothetical influences".
-
Reply to the: "Comment on: Testing the speed of `spooky action at a distance' "
(D. Salart, A. Baas, C. Branciard, N. Gisin, and H. Zbinden)
-
Quantum correlations cannot be described by local common causes. This prediction of quantum theory, surprising as it might appear, has been widely confirmed by numerous experiments. In our Nature Letter [1] we considered this point as established and addressed another issue: the alternative assumption that quantum correlations are due to supraluminal influences of a first event onto a second event. For this purpose we believe that it suffices to observe 2-photon interferences with a visibility high enough to potentially violate Bell's inequality, as we reported (over 2 x 17.5 km). Simultaneously closing other loopholes, like the locality loophole as desired by Kofler and colleagues, would certainly be an interesting addition, as would any Bell tests that simultaneously address several of the loopholes.
Indeed, to rigorously exclude any common cause explanation of the observed quantum correlation one should, ideally, simultaneously close the locality and the detection loophole (and assume the existence of independent randomness and that quantum measurements are finished when detectors fire, or at least when a mesoscopic mass has sufficiently moved, as ensured in our experiment, see our recent article [2]). This is a formidable task and any progress towards achieving it is most welcome. So far, however, all experiments have addressed at most one of these loopholes; ours is no exception.
Concerning the comment on the use of a Franson interferometer for testing quantum nonlocality, we stress that this is not a fundamental issue. In principle it suffices to replace the entrance beam splitters of each interferometer by a fast switch. In this way the non-interfering lateral peaks observed in the 2-photon interferogram would disappear. However, in practice such switches suffer from losses of around 3 dB. Hence, with today's technology it is much more convenient to replace the ideal switch by a passive coupler, as we did in our experiment, in a way very similar to [3].

Monday, January 25, 2010

A new computer algorithm allows scientists to view nuclear fission in much finer detail than ever before.


An elevation plot of the highest energy neutron flux distributions from an axial slice of the reactor is shown superimposed over the same slice of the underlying geometry. This figure shows the rapid spatial variation in the high-energy neutron distribution within each plate, along with the more slowly varying global distribution. The figure is significant since UNIC allows researchers to capture both of these effects simultaneously. (Credit: Image courtesy of Argonne National Laboratory)
Source: ScienceDaily
------------------------
ScienceDaily (Jan. 25, 2010) — Ever wanted to see a nuclear reactor core in action? A new computer algorithm developed by researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory allows scientists to view nuclear fission in much finer detail than ever before.
A team of nuclear engineers and computer scientists at Argonne National Laboratory is developing the neutron transport code UNIC, which enables researchers for the first time to obtain a highly detailed description of a nuclear reactor core.
The code could prove crucial in the development of nuclear reactors that are safe, affordable and environmentally friendly. To model the complex geometry of a reactor core requires billions of spatial elements, hundreds of angles and thousands of energy groups -- all of which lead to problem sizes with quadrillions of possible solutions.
Such calculations exhaust the computer memory of even the largest machines, and therefore reactor modeling codes typically rely on various approximations. But approximations limit the predictive capability of computer simulations and leave considerable uncertainty in crucial reactor design and operational parameters.
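A back-of-the-envelope count, using only the rough orders of magnitude quoted above (illustrative values, not Argonne's actual mesh), shows how quickly the unknowns multiply:

# Rough count of unknowns in a fully explicit reactor-core transport model.
spatial_elements = 1_000_000_000   # "billions of spatial elements"
angles = 100                       # "hundreds of angles"
energy_groups = 1_000              # "thousands of energy groups"
unknowns = spatial_elements * angles * energy_groups
print(f"{unknowns:.1e} unknowns per solve")   # 1.0e+14, approaching a quadrillion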
"The UNIC code is intended to reduce the uncertainties and biases in reactor design calculations by progressively replacing existing multilevel averaging techniques with more direct solution methods based on explicit reactor geometries," said Andrew Siegel, a computational scientist at Argonne and leader of Argonne's reactor simulation group.
UNIC has run successfully at DOE leadership computing facilities, home to some of the world's fastest supercomputers, including the energy-efficient IBM Blue Gene/P at Argonne and the Cray XT5 at Oak Ridge National Laboratory. Although still under development, the code has already produced new scientific results.
In particular, the Argonne team has carried out highly detailed simulations of the Zero Power Reactor experiments on up to 163,840 processor cores of the Blue Gene/P and 222,912 processor cores of the Cray XT5, as well as on 294,912 processor cores of a Blue Gene/P at the Jülich Supercomputing Center in Germany. With UNIC, the researchers have successfully represented the details of the full reactor geometry for the first time and have been able to compare the results directly with the experimental data.
Argonne's UNIC code provides a powerful new tool for designers of safe, environmentally friendly nuclear reactors -- a key component of our nation's current and future energy needs. By integrating innovative design features with state-of-the-art numerical solvers, UNIC allows researchers not only to better understand the behavior of existing reactor systems but also to predict the behavior of many of the newly proposed systems having untested design characteristics.
Development of the UNIC code is funded principally by DOE's Office of Nuclear Energy through the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The Argonne UNIC project is a key part of the NEAMS efforts to replace the traditional "test-based" approach to nuclear systems design with a new "science-based" approach in which advanced modeling and simulation play a dominant role.

Story Source:
Adapted from materials provided by DOE/Argonne National Laboratory.

Monday, January 18, 2010

A new microwire fabrication technique in which microwires self-assemble in a three-dimensional template made of nematic liquid crystals.


(A) Illustration of a conductive particle attracted to a disclination line that joins two electrodes at points P and Q. (B) Photograph of a horizontal necklace of particles. The red bar is 30 micrometers long. Image copyright: Fleury, et al.
Source: Physorg.com
----------------------------
Scientists have demonstrated a new microwire fabrication technique in which microwires self-assemble in a three-dimensional template made of nematic liquid crystals. Amid concerns that Moore's law may eventually approach a limit in two dimensions, the new fabrication method could enable researchers to continue to increase the density of transistors on integrated circuits by making use of the third dimension.
The researchers, Jean-Baptiste Fleury, David Pires, and Yves Galerne of the Institute of Physics and Chemistry of Materials of Strasbourg, in Strasbourg, France, have published their research in a recent issue of Physical Review Letters.

As the researchers explain, many different processes have been proposed in the past few years for fabricating high-quality nanowires. Generally, in order to connect nanowires to electrodes, researchers must confine them on a two-dimensional substrate and use the third dimension for manipulating the connections, often using a computer.

In their new study, the scientists show how to manufacture microwires that self-assemble in a three-dimensional template and then connect themselves to electrodes with an accuracy of a few micrometers. First, the researchers took the two substrates to be connected, and filled the space between them with a nematic liquid crystal, which is the same substance used in many kinds of LCDs. Although the molecules in the liquid are free to move, they align themselves parallel to one another, except along threadlike (defect) lines (“nemato” in Greek means “threadlike”).

Next, the scientists created a defect line in the nematic liquid crystal that runs between electrodes in the two substrates. By rubbing the substrates in three different locations at a specific angle, the researchers produced a programmable disclination (i.e., a topological singularity or defect line). In this area, the molecules cannot orient themselves in any direction, creating a disclination that extends between the two substrates.

In addition to their ability to produce programmable disclinations, another property of nematic liquid crystals is that they attract small objects to the disclinations. This attraction occurs due to interference between the distortion from the disclination and the normal threadlike distortion from particles in the nematic liquid crystal. The interference results in a force on silica particles (which are added to the nematic liquid crystal), dragging them toward the disclination line.
Eventually, enough silica particles become trapped onto the line to form a micronecklace in which the particles are in loose contact with each other. To thoroughly join the particles together, the researchers applied a voltage difference between adjacent particles in order to polymerize monomers in the liquid crystal and eventually to stick the particles to one another. After a few hours, polymerization turned the micronecklace into a cohesive microwire.

“As far as I know, there are no other means, at the moment, able to produce microwires self-connected in 3D on designed electrodes,” Galerne told PhysOrg.com.

The researchers predict that this process can be extended to produce a large number of microwires between substrates simultaneously, which could lead to the development of large-scale three-dimensional integrated circuits. Although the microwires need to be separated from each other by a minimum distance, which presents a physical limitation, the method still has the potential to play a significant role in future electronics applications.

“The escape to the third dimension could clearly open possibilities,” Galerne said. “A simple manner could consist in connecting stacks of 2D integrated circuits. For the moment we are working on a method for producing nanowires of better quality (smoother shape, larger strength, and better conductivity).”

More information: Jean-Baptiste Fleury, David Pires, and Yves Galerne. “Self-Connected 3D Architecture of Microwires.” Physical Review Letters 103, 267801 (2009).

Sunday, January 17, 2010

Tying light in knots.


The coloured circle represents the hologram, out of which the knotted optical vortex emerges.
Source: Physorg.com
---------------------------
The remarkable feat of tying light in knots has been achieved by a team of physicists working at the universities of Bristol, Glasgow and Southampton, UK, reports a paper in Nature Physics this week.
Understanding how to control light in this way has important implications for laser technology used in a wide range of industries.
Dr Mark Dennis from the University of Bristol and lead author on the paper, explained: "In a light beam, the flow of light through space is similar to water flowing in a river. Although it often flows in a straight line - out of a torch, laser pointer, etc - light can also flow in whirls and eddies, forming lines in space called 'optical vortices'.
"Along these lines, or optical vortices, the intensity of the light is zero (black). The light all around us is filled with these dark lines, even though we can't see them".
Optical vortices can be created with holograms which direct the flow of light. In this work, the team designed holograms using knot theory - a branch of abstract mathematics inspired by knots that occur in shoelaces and rope. Using these specially designed holograms they were able to create knots in optical vortices.
This new research demonstrates a physical application for a branch of mathematics previously considered completely abstract.
Professor Miles Padgett from Glasgow University, who led the experiments, said: "The sophisticated hologram design required for the experimental demonstration of the knotted light shows advanced optical control, which undoubtedly can be used in future laser devices".
"The study of knotted vortices was initiated by Lord Kelvin back in 1867 in his quest for an explanation of atoms", adds Dennis, who began to study knotted optical vortices with Professor Sir Michael Berry at Bristol University in 2000. "This work opens a new chapter in that history."
More information: Isolated optical vortex knots by Mark R. Dennis, Robert P. King, Barry Jack, Kevin O'Holleran and Miles J. Padgett. Nature Physics, published online 17 January 2010.

Provided by University of Bristol

'Nanodragster' Races Toward the Future of Molecular Machines.

The new "nanodragster" (left) may lead to molecular machines for manufacturing computer circuits and other electronic components. (Credit: American Chemical Society)
Source: ScienceDaily
--------------------------
ScienceDaily (Jan. 16, 2010) — Scientists in Texas are reporting the development of a "nanodragster" that may speed the course toward development of a new generation of futuristic molecular machines. The vehicle -- only 1/50,000th the width of a human hair -- resembles a hot-rod in shape and can outperform previous nano-sized vehicles. Their report is in ACS' Organic Letters.
James Tour, Kevin Kelly and colleagues note that the ability to control the motion of small molecules is essential for building much-anticipated molecular machines. Some of these machines may find use in manufacturing computer circuits and other electronic components in the future. Scientists have already made strides by designing nano-sized vehicles, including a "nanocar" with wheels made of buckyballs -- spheres of carbon containing 60 atoms apiece. The car can scoot around a gold surface when exposed to heat or an electric field gradient, but control of its movement is limited, and these drawbacks prevent its widespread use. The most limiting factor, however, is the nanoscopic-resolution tools available for studying its range of motion and capabilities.
The new vehicle addresses some of these problems. The front end has a smaller axle and wheels made of special materials that roll more easily. The rear wheels sport a longer axle but are still made of buckyballs, which provide a strong grip on the surface. These changes result in a "nanodragster" that can operate at lower temperatures than a regular nanocar and possibly has better agility, paving the way for better molecular machines, the scientists say.
Story Source:
Adapted from materials provided by the American Chemical Society, via EurekAlert!, a service of AAAS.
Journal Reference:
Vives et al. Molecular Machinery: Synthesis of a 'Nanodragster'. Organic Letters, 2009; 11 (24): 5602. DOI: 10.1021/ol902312m

Friday, January 15, 2010

Jetting into the Quark-Gluon Plasma.

Gold nuclei collide in the STAR experiment at RHIC, creating a fireball in which the quark-gluon plasma briefly appears. Its properties are reconstructed from particle tracks captured in STAR's Time Projection Chamber.
Source: Physorg.com
--------------------------
After the quark-gluon plasma filled the universe for a few millionths of a second after the big bang, it was over 13 billion years until experimenters managed to recreate the extraordinarily hot, dense medium on Earth. The JET Collaboration, a team from six universities and three national laboratories led by Berkeley Lab’s Nuclear Science Division, is now developing a new and highly detailed theoretical picture of this unique state of the early universe.
The Department of Energy’s Office of Nuclear Physics recently named Berkeley Lab’s Nuclear Science Division to lead a nine-institution collaboration investigating the “Quantitative Jet and Electromagnetic Tomography of Extreme Phases of Matter in Heavy-Ion Collisions” - JET, for short.
The JET Collaboration is a five-year theoretical effort to understand the properties of the extraordinarily hot and dense state of matter known as the quark-gluon plasma. The quark-gluon plasma filled the Universe a few millionths of a second after the big bang but instantly vanished, condensing into the protons, neutrons and other particles from which the present Universe descended.
Some 13.7 billion years later, experimenters recreated the quark-gluon plasma on Earth, using the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory. The first heavy-ion collisions occurred at RHIC in 2000, but confirming the occurrence of the quark-gluon plasma in these events took several more years of data collection and analysis.

Freeing the quarks:
Quarks come in three different “colors,” and it takes three quarks to build a proton or a neutron; as carriers of the color charge, an aspect of the strong nuclear interaction, gluons literally glue the quarks together.
Under ordinary conditions neither quarks nor gluons are ever free. The farther apart they get, the stronger the force between them. Because mass and energy are interchangeable, as described by Einstein’s E = mc², the energy that would be needed to separate them eventually goes into creating new bound quarks instead.
RHIC was designed to collide heavy nuclei (as heavy as gold, whose nucleus consists of 79 protons and 118 neutrons) at energies so high that during the near-light-speed collisions, conditions cease to be anything like ordinary. Dense, hot fireballs blossom in the collisions, forming a plasma in which neither quarks nor gluons are bound together; instead they move independently with almost complete freedom.
The RHIC results held some surprises. Unlike more familiar plasmas in which electrically charged particles are separated from one another, the quark-gluon plasma consists of color charges. The quark-gluon plasma produced at RHIC turned out to be more like a liquid than a gas.
“One of the main discoveries at RHIC is that the quark-gluon plasma produced in heavy-ion collisions behaves as a perfect fluid with very small viscosity,” says Xin-Nian Wang, a senior scientist in the Nuclear Theory Group in Berkeley Lab’s Nuclear Science Division (NSD). Wang is the co-spokesperson and project director of the JET Collaboration.
Perfect fluidity arises because the plasma’s constituents are strongly coupled, causing their collective flow. And the quark-gluon plasma flows freely, like low-viscosity motor oil in a hot engine - much more freely, in fact, Wang says, because its specific shear viscosity is “an order of magnitude less than that of water.”
Another RHIC discovery was the predicted but never-before-seen “jet quenching.” When individual particles collide in a vacuum - as when protons collide in CERN’s Large Hadron Collider, for example - the debris often flies out in a pair of jets; particles like pions or kaons detected on one side of the detector are correlated, in terms of total momentum and energy, with particles detected on the opposite side.
“But when heavy ions collide, they produce an incredibly dense medium, 30 to 50 times as dense as an ordinary nucleus,” Wang says. “The farther a jet of particles has to push through this strongly interacting nuclear matter, the more energy it loses. One jet from the back-to-back pair may not escape the fireball at all.”
The energy of the trapped jet has to go somewhere. The energetic particles that are initially produced decay to softer ones which further interact with the medium, producing shock waves in the fluid. As with the sonic boom from a jet plane “breaking the sound barrier” - flying faster than the speed of sound in air - the shock wave from a jet swallowed by the quark-gluon plasma could be used to measure the velocity of sound in the plasma.
The debris from heavy-ion collisions indicates that free quarks and gluons recombine into hadrons (which include pions and kaons made of two quarks and protons and neutrons made of three quarks) while the plasma is cooling; this also affects how the jets propagate.
Probing the plasma:
Jets are called “hard probes.” Although by nature strongly interacting, they are moving so fast and with so much energy that their interaction with the surrounding free quarks and gluons in the plasma is actually relatively weak. A jet’s ability to transfer energy and momentum to the medium as it moves through the fireball is known as the jet transport coefficient (JTC), which is related to the plasma’s viscosity: the smaller the viscosity - and the viscosity of the quark-gluon plasma is very small indeed - the larger the JTC.
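For orientation, one weak-coupling estimate discussed in the theoretical literature (by Majumder, Müller and Wang; it is not quoted in this article) relates the jet transport coefficient, usually written \hat{q}, to the shear viscosity to entropy density ratio as

\frac{\eta}{s} \;\approx\; 1.25\, \frac{T^{3}}{\hat{q}} ,

where T is the plasma temperature; a very small η/s therefore corresponds to a large transport coefficient, as stated above.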
It’s not just the degree of jet quenching, a figure that emerges in the data from millions of collision events, but the orientation, directionality, and composition of the jets that have much to tell about what’s inside the fireball, and thus about the properties of the quark-gluon plasma.
Another kind of probe, an electromagnetic probe, is so weak there is virtually no interaction with the medium at all. Electromagnetic probes appear when a jet of particles in one direction is balanced not by another jet but by a single, very energetic photon.
The task of the JET Collaboration is to use the existing evidence from the RHIC results to calculate in detail what’s really going on inside the strongly interacting quark-gluon plasma - the kind of three-dimensional picture of an otherwise invisible interior that’s called tomography, as in computed axial tomography, the familiar CAT scan.
Three kinds of phenomena are critical to the completion of the task: collectivity, to determine the viscosity of the medium; jets, to determine the jet transport coefficient; and the excitation of the medium, to determine the velocity of sound within it.
More than one kind of calculation will be required. Different assumptions and different codes must be used to model different kinds of interactions and different properties, and the results don’t always agree. The JET Collaboration includes representatives from major institutions that have made significant contributions to the study of the hot, dense matter in heavy-ion collisions, often approaching the question from different points of view. By working together, the collaboration expects a consistent picture of the quark-gluon plasma to emerge.
Once the calculations are complete, having taken into account the entire energy spectrum of particles emerging from millions of evanescent fireballs, the new theoretical picture of this unique state of the early universe will be tested against observations at the newly upgraded RHIC and at the ALICE experiment at the Large Hadron Collider (LHC) at CERN. (The LHC collides protons for most of the year, but for a month each year it will collide heavy ions in the form of lead nuclei.)
The JET Collaboration:
In the JET Collaboration, Berkeley Lab will be represented by theorists Wang, Volker Koch, and Feng Yuan. The Lab’s leadership in both the theory of the quark-gluon plasma and in its experimental exploration through the Relativistic Nuclear Collision (RNC) group uniquely positions the Lab to head the Collaboration.
The idea of jet quenching was first proposed for proton-proton collisions in the early 1980s, by James Daniel Bjorken of the Stanford Linear Accelerator Center. The theory linking jet quenching to the quark-gluon plasma in heavy-ion collisions was later developed by Xin-Nian Wang and Miklos Gyulassy; Gyulassy was with Berkeley Lab at the time and is now at Columbia University, where he is a member of the JET Collaboration.
On the experimental side, the heart of the STAR experiment at RHIC is a time projection chamber built at Berkeley Lab and invented here by David Nygren of the Physics Division; STAR is one of many time projection chambers around the world, including the heart of the ALICE experiment at the LHC. The electromagnetic calorimeter, EMCal, which will trigger the recording of interesting jet events in ALICE, is being constructed by an international team led by U.S. members of ALICE, with project management by Berkeley Lab’s Peter Jacobs of NSD and Joseph Rasson of Engineering.
Other DOE labs participating in the JET Collaboration are Lawrence Livermore, represented by Ramona Vogt, and Los Alamos, represented by Ivan Vitev. In addition to Columbia University, represented by Gyulassy, other universities include Duke, represented by Steffen Bass and Berndt Mueller, the JET Collaboration’s co-spokesperson, plus Charles Gale and Sangyong Jeon of McGill, Ulrich Heinz and Abhijit Majumder of Ohio State, Denes Molnar of Purdue, and Rainer Fries and Che-Ming Ko of Texas A&M.
JET is one of three topical collaborations established by DOE’s Office of Nuclear Physics. Over a period of five years, with a budget of $2.5 million, the JET Collaboration will not only develop theory but work closely with experimentalists, train students and postdoctoral fellows, and form associations with a wide range of researchers in the nuclear science community at institutions in the U.S. and abroad.
More information:
How heavy ions collide at RHIC to create the quark-gluon plasma
Wikipedia’s article on the quark-gluon plasma
The STAR experiment at RHIC
The ALICE experiment at the LHC
Provided by Lawrence Berkeley National Laboratory

Thursday, January 14, 2010

Scientists Quantify Nanoparticle-Protein Interactions.

Insulin, one of the most common proteins in human blood, can accumulate into fibrous masses when it misfolds. Research by a team at NIST indicates that gold nanoparticles apparently increase insulin's tendency to form these fibers. (Color added for clarity.) Credit: NIST
Source: Physorg.com
----------------------------
A research team at the National Institute of Standards and Technology has quantified the interaction of gold nanoparticles with important proteins found in human blood, an approach that should be useful in the development of nanoparticle-based medical therapies and for better understanding the physical origin of the toxicity of certain nanoparticles.
Nanoparticles show promise as vehicles for drug delivery, as medical diagnostic tools, and as therapeutic agents in their own right. Gold nanoparticles, spheres that vary in size between 5 and 100 billionths of a meter in diameter, are especially useful because of the many ways their metal surfaces can be “functionalized” by attaching tailored molecules to perform different tasks in the body. However, treatments require a large number of particles to be injected into the bloodstream, and these could be hazardous if they interact with the body in unforeseen ways.
According to NIST materials scientist Jack Douglas, one of the principal problems confronting nanoparticle-based therapies is the tendency of proteins to stick to the nanoparticles that float freely in the bloodstream. “Nanoparticles coated with proteins will generally alter their interaction with the body and the nanoparticles can be expected to induce a complementary change in protein chemical activity,” says Douglas. “The coating also can cause the nanoparticles to clump together in large aggregates, which can provoke a huge immune response. Of course, that’s something you want to avoid.”
Scientists have a poor understanding of these interactions, so the NIST team decided to explore what happens when nanoparticles of different sizes encounter five common blood proteins. With the aid of a bevy of microscopes and spectroscopy devices, the team found several general patterns of behavior. “Once the proteins stick to the nanoparticles, the properties of both the particles and the proteins change,” Douglas says. “Measuring these changes helps us quantify the stickiness of the nanoparticle for the proteins, the thickness of the adsorbed protein layer and the propensity of the particles to aggregate due to the presence of the protein layers.”
More specifically, the team learned that all five of the proteins stuck to the gold, causing the nanoparticles to aggregate, and that increasing the spheres’ diameter increased their stickiness. They also found that this aggregation usually caused some change in the shape of the proteins—“which generally implies some change in their function as well,” Douglas says.
Aggregation does not always lead to a toxic response, Douglas says, but can affect whether the drugs on the nanoparticles ever reach their intended target. “The main thing is that interactions are largely set by the existence of the protein layer,” he says. “You want to know something about these protein layers if you want to know what nanoparticles are going to do in the body.”
Douglas says that the NIST study addresses metrology needs identified in a National Research Council report** published this past year calling for more quantitative testing for nanoparticle interactions with biological media and that much more work is needed along this and other lines. “For example, we do not yet understand how different-sized particles bind to the surface membranes of cells, which is where many drug interactions take place,” he says.
More information: * S.H.D. Lacerda, J. Park, C. Meuse, D. Pristinski, M.L. Becker, A. Karim and J.F. Douglas. Interaction of gold nanoparticles with common human blood proteins. ACS Nano, December 18, 2009. DOI: 10.1021/nn9011187
** NRC report, “Review of Federal Strategy for Nanotechnology-Related Environmental, Health, and Safety Research,” available online at http://www.nap.edu/catalog.php?record_id=12559#toc .
Provided by National Institute of Standards and Technology

Theorists Close In on Improved Atomic Property Predictions.

Source: Physorg.com
--------------------------
Scientists at the National Institute of Standards and Technology and Indiana University have determined the most accurate values ever for a fundamental property of the element lithium using a novel approach that may permit scientists to do the same for other atoms in the periodic table.
NIST’s James Sims and IU’s Stanley Hagstrom have calculated four excitation energies for the atom approximately 100 times more accurately than any previous calculations or experimental measurements. Precise determination of excitation energy—the amount necessary to raise an atom from a base energy level to the next higher—has intrinsic value for fundamental research into atomic behavior, but the success of the method the team employed has implications that go beyond lithium alone.
The theorists have overcome major computational and conceptual hurdles that for decades have prevented scientists from using quantum mechanics to predict electron excitation energies from first principles. Sims first proposed in the late 1960s that such a quantum approach could be possible, but its application to anything more than two electrons required a fiendishly difficult set of calculations that, until recently, was beyond the capacity of even the world’s fastest computers. In 2006 the team used a novel combination of algorithms, extended-precision computing and the increase in power brought about by parallel computing to calculate the most accurate values ever for a simple, two-electron molecule, hydrogen.
By making improvements to those algorithms, Sims and Hagstrom now have been able to apply their approach to the significantly more difficult problem of lithium, which has three electrons. Much of the original difficulty with their method stems from the fact that in atoms with more than one electron the mutual repulsion among these tiny charged particles introduces complications that make calculations extremely time-consuming, if not practically impossible.
Sims says that while the lithium calculation is valuable in itself, the deeper import of refining their method is that it should enable the calculation of excitation energies for beryllium, which has four electrons. In turn, this next achievement should enable theorists to predict with greater accuracy values for all of the remaining elements in the second row of the periodic table, from beryllium to neon, and potentially the rest of the periodic table as well. “The mathematical troubles we have with multiple electrons can all be reduced to problems with four electrons,” says Sims, a quantum chemist in the mathematics and computational sciences division. “Once we’ve tackled that, the mathematics for other elements is not any more difficult inherently—there’s just more number-crunching involved.”
To obtain their results, the researchers used 32 parallel processors in a NIST computer cluster, where they are currently working on the calculations for beryllium.
High precision determinations of excitation energies are of interest to scientists and engineers who characterize and model all types of gaseous systems, including plasmas and planetary atmospheres. Other application areas include astrophysics and health physics.
More information: J.S. Sims and S.A. Hagstrom. Hylleraas-configuration-interaction study of the 2 2S ground state of neutral lithium and the first five excited 2S states. Physical Review A, Nov. 19, 2009. DOI: 10.1103/PhysRevA.80.052507
Provided by National Institute of Standards and Technology

All smoothed out: Hydroxyl radicals remove nanoscopic irregularities on polished gold surfaces.

Source: Physorg.com
----------------------------
The precious metal gold is the material of choice for many technical applications because it does not corrode - and because it also has interesting electrical, magnetic, and optical properties. Gold is thus one of the most important metals for the electronics industry, miniaturized optical components, and electrochemical processes.

In these applications, it is extremely important that the surface of the gold be completely clean and smooth. However, conventional processes not only “polish” away the undesirable irregularities, but also attack the smooth surfaces. Fritz Scholz and a team from the Universities of Greifswald (Germany) and Warsaw (Poland) have now discovered a technique that can differentiate between the two. As the scientists report in the journal Angewandte Chemie International Edition, hydroxyl radicals (OH radicals) rapidly remove all tiny protrusions on mechanically polished gold surfaces, leaving behind an extremely smooth surface.
The researchers treated gold surfaces with Fenton's reagent, which is a mixture of hydrogen peroxide and iron(II) salts that releases OH radicals. It is also used to degrade organic impurities in the purification of waste water. “Actually, it was not expected that the radicals would attack a polished pure gold surface,” says Scholz, “because gold is notoriously difficult to oxidize.” The experiments demonstrated that the OH radicals oxidize gold very well, though measurable dissolution continues only as long as there are still bumps on the gold surface. Though these results seem contradictory at first glance, the researchers explain that the reaction of the radicals with the highly ordered gold atoms of the completely smooth surface produces a stable layer of gold oxide, which can be reduced back to elemental gold without a significant loss of material. In the protrusions, however, the gold atoms are less ordered and very reactive. During the oxidation, they detach themselves from the atomic structure.
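For reference, the net Fenton reaction supplying the hydroxyl radicals is the textbook one (it is not spelled out in the article):

\mathrm{Fe^{2+} + H_2O_2 \;\longrightarrow\; Fe^{3+} + OH^{-} + {}^{\bullet}OH}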
“Because the protrusions are selectively removed, our method is very interesting for polishing gold surfaces for industrial applications,” says Scholz. The process may also find a use in medical technology: gold is used to replace teeth, in tissues for reconstructive surgery, and in electrode implants, such as those used for implanted hearing aids. These implants release tiny amounts of gold, which enter the surrounding tissue. This apparently occurs because of an immune reaction that results in the formation of OH radicals or similar species. Pre-treatment of gold implants with Fenton's reagent could inhibit this release of gold into the body.
More information: Fritz Scholz, Hydroxyl Radicals Attack Metallic Gold, Angewandte Chemie International Edition. Permalink: http://dx.doi.org/10.1002/anie.200906358
Provided by Wiley

Tuesday, January 12, 2010

New quantum cascade lasers emit more light than heat.

Source: Physorg.com
-------------------------
Northwestern University researchers have developed compact, mid-infrared laser diodes that generate more light than heat - a breakthrough in quantum cascade laser efficiency.
The results are an important step toward the use of quantum cascade lasers in a variety of applications, including remote sensing.
The research, led by Manijeh Razeghi, the Walter P. Murphy Professor of Electrical Engineering and Computer Science at the McCormick School of Engineering and Applied Science, was published online in the journal Nature Photonics on Jan. 10.
After years of research and industrial development, modern laser diodes in the near-infrared (approximately 1 micron) wavelength range are now extremely efficient. However, the mid-infrared (greater than 3 microns) is much more difficult to access and has required the development of new device architectures.
The quantum cascade laser (QCL) is a diode laser that is designed on the quantum mechanical level to produce light at the desired wavelength with high efficiency. Unlike traditional diode lasers, the device is unipolar, requiring only electrons to operate. A significant effort has been spent trying to understand and optimize the laser design, which would allow researchers to improve the laser quality and efficiency.
Despite the special nature of these devices, laser wafer production is done using standard compound semiconductor growth equipment. By optimizing the material quality in these standard tools, researchers at the Center for Quantum Devices (CQD) at Northwestern, led by Razeghi, have made significant breakthroughs in QCL performance.
Previous reports regarding QCLs with high efficiency have been limited to efficiency values of less than 40 percent, even when cooled to cryogenic temperatures.
After removing design elements unnecessary for low-temperature operation, researchers at CQD have now demonstrated individual lasers emitting at wavelengths of 4.85 microns with efficiencies of 53 percent when cooled to 40 Kelvin.
"This breakthrough is significant because, for the very first time, we are able to create diodes that produce more light than heat," says Razeghi. "Passing the 50 percent mark in efficiency is a major milestone, and we continue to work to optimize the efficiency of these unique devices."
Though efficiency is currently the primary goal, the large demonstrated efficiencies also can be exploited to enable power scaling of the QCL emitters. Recent efforts in broad-area QCL development have allowed demonstration of individual pulsed lasers with record output powers of up to 120 watts, up from 34 watts only a year ago.
Provided by Northwestern University

Monday, January 11, 2010

Nanoscience Goes 'Big': Discovery Could Lead to Enhanced Electronics.

Source: ScienceDaily
------------------------
ScienceDaily (Jan. 11, 2010) — Nanoscience has the potential to play an enormous role in enhancing a range of products, including sensors, photovoltaics and consumer electronics. Scientists in this field have created a multitude of nanoscale materials, such as metal nanocrystals, carbon nanotubes and semiconducting nanowires. However, despite their appeal, it has remained an astounding challenge to engineer the orientation and placement of these materials into the desired device architectures that are reproducible in high yields and at low costs -- until now.
Jen Cha, a UC San Diego nanoengineering professor, and her team of researchers have discovered that one way to bridge this gap is to use biomolecules, such as DNA and proteins. Details of this discovery were recently published in Nature Nanotechnology.
"Self-assembled structures are often too small and affordable lithographic patterns are too large," said Albert Hung, lead author of the Nature Nanotechnology paper and a post doc working in Cha's lab. "But rationally designed synthetic DNA nanostructures allow us to access length scales between 5 and 100 nanometers and bridge the two systems.
"People have created a huge variety of unique and functional nanostructures, but for some intended applications they are worthless unless you can place individual structures, billions or trillions of them at the same time, at precise locations," Hung added. "We hope that our research brings us a step closer to solving this very difficult problem."
Hung said the recently discovered method may be useful for fabricating nanoscale electronic or optical circuits and multiplex sensors. "A number of groups have worked on parts of this research problem before, but to our knowledge, we're the first to attempt to address so many parts together as a whole," he said.
One of the main applications of this research that Cha and her group are interested in is for sensing. "There is no foreseeable route to be able to build a complex array of different nanoscale sensing elements currently," said Cha, a former IBM research scientist who joined the UCSD Jacobs School of Engineering faculty in 2008.
"Our work is one of the first clear examples of how you can merge top down lithography with bottom up self assembly to build such an array. That means that you have a substrate that is patterned by conventional lithography, and then you need to take that pattern and merge it with something that can direct the assembly of even smaller objects, such as those having dimensions between 2 and 20 nanometers. You need an intermediate template, which is the DNA origami, which has the ability to bind to something else much smaller and direct their assembly into the desired configuration. This means we can potentially build transistors from carbon nanotubes and also possibly use nanostructures to detect certain proteins in solutions. Scientists have been talking about patterning different sets of proteins on a substrate and now we have the ability to do that."
Cha said the next step would be to actually develop a device based on this research method. "I'm very interested in the applications of this research and we're working our way to get there," she said.
For the last six years, Cha's research has focused on using biology to engineer the assembly of nanoscale materials for applications in medicine, electronics and energy. One of the limitations of nanoscience is that it doesn't yet allow mass production of products, but Cha's work focuses on figuring out how to do that, and to do it cheaply. Much of her recent work has focused on using DNA to build 2D structures.
"Using DNA to assemble materials is an area that many people are excited about," Cha said. "You can fold DNA into anything you want -- for example, you can build a large scaffold and within that you could assemble very small objects such as nano particles, nano wires or proteins.
"Engineers need to understand the physical forces needed to build functional arrays from functional materials," she added. "My job as a nanoengineer is to out what you need to do to put all the different parts together, whether it's a drug delivery vehicle, photovoltaic applications, sensors or transistors. We need to think about ways to take all the nano materials and engineer them it into something people can use and hold."
Story Source:
Adapted from materials provided by the University of California - San Diego.
Journal Reference:
1. Albert M. Hung, Christine M. Micheel, Luisa D. Bozano, Lucas W. Osterbur, Greg M. Wallraff & Jennifer N. Cha. Large-area spatially ordered arrays of gold nanoparticles directed by lithographically confined DNA origami. Nature Nanotechnology, 2009; DOI: 10.1038/nnano.2009.450

Quantum computer calculates exact energy of molecular hydrogen.

Source: Physorg.com
----------------------------
In an important first for a promising new technology, scientists have used a quantum computer to calculate the precise energy of molecular hydrogen. This groundbreaking approach to molecular simulations could have profound implications not just for quantum chemistry, but also for a range of fields from cryptography to materials science.

"One of the most important problems for many theoretical chemists is how to execute exact simulations of chemical systems," says author Alán Aspuru-Guzik, assistant professor of chemistry and chemical biology at Harvard University. "This is the first time that a quantum computer has been built to provide these precise calculations."
The work, described this week in Nature Chemistry, comes from a partnership between Aspuru-Guzik's team of theoretical chemists at Harvard and a group of experimental physicists led by Andrew White at the University of Queensland in Brisbane, Australia. Aspuru-Guzik's team coordinated experimental design and performed key calculations, while his partners in Australia assembled the physical "computer" and ran the experiments.
"We were the software guys," says Aspuru-Guzik, "and they were the hardware guys."
While modern supercomputers can perform approximate simulations of simple molecular systems, increasing the size of the system results in an exponential increase in computation time. Quantum computing has been heralded for its potential to solve certain types of problems that are impossible for conventional computers to crack.
Rather than using binary bits labeled as "zero" and "one" to encode data, as in a conventional computer, a quantum computer stores information in qubits, which can represent both "zero" and "one" simultaneously. When a quantum computer is put to work on a problem, it considers all possible answers by simultaneously arranging its qubits into every combination of "zeroes" and "ones."
Since one sequence of qubits can represent many different numbers, a quantum computer would make far fewer computations than a conventional one in solving some problems. After the computer's work is done, a measurement of its qubits provides the answer.
"Because classical computers don't scale efficiently, if you simulate anything larger than four or five atoms -- for example, a chemical reaction, or even a moderately complex molecule -- it becomes an intractable problem very quickly," says author James Whitfield, research assistant in chemistry and chemical biology at Harvard. "Approximate computations of such systems are usually the best chemists can do."
Aspuru-Guzik and his colleagues confronted this problem with a conceptually elegant idea.
"If it is computationally too complex to simulate a quantum system using a classical computer," he says, "why not simulate quantum systems with another quantum system?"
Such an approach could, in theory, result in highly precise calculations while using a fraction the resources of conventional computing.
While a number of other physical systems could serve as a computer framework, Aspuru-Guzik's colleagues in Australia used the information encoded in two entangled photons to conduct their hydrogen molecule simulations. Each calculated energy level was the result of 20 such quantum measurements, resulting in a highly precise measurement of each geometric state of the hydrogen molecule.
"This approach to computation represents an entirely new way of providing exact solutions to a range of problems for which the conventional wisdom is that approximation is the only possibility," says Aspuru-Guzik.
Ultimately, the same quantum computer that could transform Internet cryptography could also calculate the lowest energy conformations of molecules as complex as cholesterol.
More information: Nature Chemistry paper: http://dx.doi.org/10.1038/NCHEM.483
Provided by Harvard University

A solid case of entanglement.

Source: Physorg.com
-----------------------------
This is an SEM image of a typical Cooper pair splitter. The bar is 1 micrometer. A central superconducting electrode (blue) is connected to two quantum dots engineered in the same single wall carbon nanotube (in purple). Entangled electrons inside the superconductor can be coaxed to move in opposite directions in the nanotube, ending up at separate quantum dots, while remaining entangled. Credit: L.G. Herrmann, F. Portier, P. Roche, A. Levy Yeyati, T. Kontos, and C. Strunk

Physicists have finally managed to demonstrate quantum entanglement of spatially separated electrons in solid state circuitry.

For the first time, physicists have convincingly demonstrated that physically separated particles in solid-state devices can be quantum-mechanically entangled. The achievement is analogous to the entanglement of light, except that it involves particles in circuitry instead of photons in optical systems. Both optical and solid-state entanglement offer potential routes to quantum computing and secure communications, but solid-state versions may ultimately be easier to incorporate into electronic devices.
The experiment is reported in an upcoming issue of Physical Review Letters and highlighted with a Viewpoint in the January 11 issue of Physics.
In optical entanglement experiments, a pair of entangled photons may be separated via a beam splitter. Despite their physical separation, the entangled photons continue to act as a single quantum object. A team of physicists from France, Germany and Spain has now performed a solid-state entanglement experiment that uses electrons in a superconductor in place of photons in an optical system.
As conventional superconductors are cooled, the electrons they conduct entangle to form what are known as Cooper pairs. In the new experiment, Cooper pairs flow through a superconducting bridge until they reach a carbon nanotube that acts as the electronic equivalent of a beam splitter. Occasionally, the electrons part ways and are directed to separate quantum dots -- but remain entangled. Although the quantum dots are only a micron or so apart, the distance is large enough to demonstrate entanglement comparable to that seen in optical systems.
In addition to the possibility of using entangled electrons in solid-state devices for computing and secure communications, the breakthrough opens a whole new vista on the study of quantum mechanically entangled systems in solid materials.
More information: Carbon Nanotubes as Cooper-Pair Beam Splitters, L. G. Herrmann, F. Portier, P. Roche, A. Levy Yeyati, T. Kontos, and C. Strunk, Phys. Rev. Lett. 104, 026801 (2010) - Published January 11, 2010.
Provided by American Physical Society


Sunday, January 10, 2010

Quantum Simulation of a Relativistic Particle.

Source: ScienceDaily
----------------------------
ScienceDaily (Jan. 8, 2010) — Researchers of the Institute for Quantum Optics and Quantum Information (IQOQI) in Innsbruck, Austria, used a calcium ion to simulate a relativistic quantum particle, demonstrating a phenomenon that has not been directly observable so far: the Zitterbewegung. They have published their findings in the current issue of the journal Nature.
In the 1920s quantum mechanics was already established, and in 1928 the British physicist Paul Dirac showed that this theory can be merged with the special relativity postulated by Albert Einstein. Dirac's work made quantum physics applicable to relativistic particles, which move at speeds comparable to the speed of light. The Dirac equation forms the basis for groundbreaking new insights; for example, it provides a natural description of the electron spin and predicts that each particle also has its antiparticle (antimatter).
In 1930, as a result of his analysis of the Dirac equation, the Austrian Nobel laureate Erwin Schrödinger first postulated the existence of a so-called Zitterbewegung (quivering motion), a kind of fluctuation of the motion of a relativistic particle. "According to the Dirac equation such a particle does not move in a linear fashion in a vacuum but 'jitters' in all three dimensions," Christian Roos from the Institute for Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences (ÖAW) explains. "It is not clear whether this Zitterbewegung can be observed in real systems in nature."
Quantum simulation of a particle:
Physical phenomena are often described by equations that may be too complicated to solve. In such cases, researchers use computer simulations to answer open questions. However, even for small quantum systems, classical computers do not have enough power to process the data; scientists such as Richard Feynman therefore proposed simulating these phenomena experimentally in other quantum systems.
The preconditions for doing this -- detailed knowledge about the physics of these systems and an excellent control over the technology and set-up -- have been set by the research group headed by Rainer Blatt by conducting experiments with quantum computers over the last few years; they are now able to carry out quantum simulations experimentally. "The challenges with these experiments are to recreate the equations in the quantum system well, to have a high level of control over the various parameters and to measure the results," Christian Roos says.
The experimental physicists of the IQOQI trapped and cooled a calcium ion and, starting from this well-defined state, used a laser to couple the internal state of the ion to the state of the relativistic particle to be simulated. "Our quantum system was now set to behave like a free relativistic quantum particle that follows the laws of the Dirac equation," explains Rene Gerritsma, a Dutch postdoc working at the IQOQI and first author of the work published in Nature. Measurements revealed the features of the simulated particle. "Thereby, we were able to demonstrate Zitterbewegung in the experimental simulation and we were also able to determine the probability distribution of the particle," Gerritsma says. In this very small quantum system the physicists simulated the Dirac equation in only one spatial dimension. "This simulation was a proof-of-principle experiment," Roos says, "which, in principle, can also be applied to three-dimensional dynamics if the technological set-up is adjusted accordingly."
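For readers who want to see the effect itself, below is a minimal numerical sketch of Zitterbewegung for the free Dirac equation in one spatial dimension, H(p) = c p sigma_x + m c^2 sigma_z, in natural units. It illustrates the simulated physics only, not the trapped-ion apparatus or its parameters.

import numpy as np

# Free 1+1D Dirac wave packet, evolved exactly in momentum space (hbar = c = 1).
hbar = c = 1.0
m = 1.0                                   # setting m = 0 removes the jitter

p = np.linspace(-10.0, 10.0, 2001)        # momentum grid
dp = p[1] - p[0]
phi = np.exp(-(p - 0.5) ** 2)             # Gaussian envelope, mean momentum 0.5
phi /= np.sqrt(np.sum(np.abs(phi) ** 2) * dp)

times = np.linspace(0.0, 20.0, 400)
E = np.sqrt((c * p) ** 2 + (m * c ** 2) ** 2)
v_mean = []
for t in times:
    # Exact propagator: exp(-iHt) = cos(Et) - i sin(Et)/E * H, since H^2 = E^2.
    cos_t, sin_t = np.cos(E * t / hbar), np.sin(E * t / hbar)
    up = cos_t - 1j * (m * c ** 2 / E) * sin_t     # upper spinor component (initial spinor (1, 0))
    dn = -1j * (c * p / E) * sin_t                 # lower spinor component
    vx = 2.0 * c * np.real(np.conj(up) * dn)       # velocity operator is c*sigma_x
    v_mean.append(np.sum(np.abs(phi) ** 2 * vx) * dp)

x_mean = np.cumsum(v_mean) * (times[1] - times[0])  # integrate <v> to get <x(t)>
# <x(t)> drifts linearly but is modulated by a rapid oscillation at about 2E/hbar
# (roughly 2mc^2/hbar for small momenta): the Zitterbewegung.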
Simulation of antiparticles:
Due to the extremely high level of control over the physical regime of the simulated particle, the scientists were able to modify the mass of the object and to simulate antiparticles. "In the end, our approach was very simple but you have to come up with the idea first," says Christian Roos, whose team of scientists was inspired by a theoretical proposal of a Spanish group of researchers. The work was supported by the Austrian Science Funds (FWF) and the European Commission.
Story Source:
Adapted from materials provided by the University of Innsbruck, via EurekAlert!, a service of AAAS.