Tuesday, 20 October 2009

CERN (LHC): ATLAS Live, a new browser showing the 1000 most recent events for each stream, is now online.

SOURCE:
ATLAS Live, a new browser showing the 1000 most recent events for each stream, is now online. It’s perfect for collaborators all over the world who want to check up on what’s going on underground at any time of day or night. Whether you be in your pyjamas rubbing your eyes before your 6 a.m. run in Ferney, or dressed to impress on your way to a concert hall in Paris, your window onto the detector is always open for you to sneak a peek.

In fact, the ATLAS Live in question – developed by former ATLAS member Zdenek Maxa – is one of three ventures going by the same name. Ultimately, the information from this page will feed into another ATLAS Live, being developed by Manuela Cirilli and Kathy Pommes, which will also feature items like message boards and updates on ATLAS run status.

The event display browser includes all the different Trigger streams and displays 20 Atlantis images per page, which users can scroll back and forth through. If they spot something of interest, they can download the image, or download the original JiveXML and corresponding VP1 input file for each selected event. They can also directly launch Atlantis on any given event, and then further interrogate it by zooming in, picking on data, implementing cuts and adjusting the view windows, to best highlight what they are interested in.

“We focused a little bit more on the Atlantis side [rather than VP1] because Atlantis is a Java application, so it can use this web start feature and doesn’t need to know what machine you’re running on,” explains Online Event Display Coordinator, Sebastian Böser. “This is basically the side of it that we have for the physicists. So they browse through these events and they say, ‘Wait a second, this looks really interesting, I want to go and see this in another projection.’”

The display works on a rolling system, so as each new image comes in, the 1000th image drops off the end. Events make it on screen with around a ten-second lag time. Right now, while there are only cosmics to be seen, 1000 events take place over a timescale of a few hours. Once there are collisions, 1000 events will be notched up in around ten minutes.

At the bottom of the ATLAS Live page, there is also a link to a ‘latest event’ page. “We were thinking of all those people who might want to put up monitors in their universities showing the latest from the detector,” explains Sebastian. After selecting a stream, the image refreshes itself every five seconds, and all the user needs to do is point their web browser at that page and let it roll.

The system has been up for six weeks or so already, and Sebastian urges everyone to check it out now and get familiar with it ahead of beam. In practical terms, this will allow the online data preparation group to monitor the load on the server, and iron out any problems that may arise. Combined cosmic running has already begun – go and take a look at what your detector can see!
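To make the rolling-buffer behaviour described above concrete, here is a minimal Python sketch of how such an event browser could be organised: a fixed-size queue that keeps only the 1000 most recent events per stream and serves them in pages of 20. The class and method names are illustrative assumptions, not the actual ATLAS Live code.

```python
from collections import deque

class RollingEventDisplay:
    """Toy model of a per-stream rolling event buffer (not the real ATLAS Live code)."""

    def __init__(self, max_events=1000, page_size=20):
        self.max_events = max_events
        self.page_size = page_size
        self.streams = {}  # stream name -> deque of event records

    def add_event(self, stream, event):
        # As each new event arrives, the oldest one drops off the end
        # once the buffer already holds max_events entries.
        buf = self.streams.setdefault(stream, deque(maxlen=self.max_events))
        buf.append(event)

    def page(self, stream, page_number=0):
        # Return one page of the most recent events (newest first),
        # 20 per page by default, for scrolling back and forth.
        events = list(reversed(self.streams.get(stream, [])))
        start = page_number * self.page_size
        return events[start:start + self.page_size]

    def latest(self, stream):
        # Equivalent of the 'latest event' page that a monitor could poll
        # every few seconds.
        buf = self.streams.get(stream)
        return buf[-1] if buf else None


if __name__ == "__main__":
    display = RollingEventDisplay()
    for i in range(1200):
        display.add_event("cosmics", {"event_id": i})
    print(len(display.page("cosmics")))   # 20 events on the first page
    print(display.latest("cosmics"))      # most recent event only
```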
Ceri Perkins

Wednesday, 14 October 2009

Search for Future Influence From Large Hadron Collider (LHC) at CERN.

SOURCE: NY Times
More than a year after an explosion of sparks, soot and frigid helium shut it down, the world’s biggest and most expensive physics experiment, known as the Large Hadron Collider, is poised to start up again. In December, if all goes well, protons will start smashing together in an underground racetrack outside Geneva in a search for forces and particles that reigned during the first trillionth of a second of the Big Bang.
Then it will be time to test one of the most bizarre and revolutionary theories in science. I’m not talking about extra dimensions of space-time, dark matter or even black holes that eat the Earth. No, I’m talking about the notion that the troubled collider is being sabotaged by its own future. A pair of otherwise distinguished physicists have suggested that the hypothesized Higgs boson, which physicists hope to produce with the collider, might be so abhorrent to nature that its creation would ripple backward through time and stop the collider before it could make one, like a time traveler who goes back in time to kill his grandfather.
Holger Bech Nielsen, of the Niels Bohr Institute in Copenhagen, and Masao Ninomiya of the Yukawa Institute for Theoretical Physics in Kyoto, Japan, put this idea forward in a series of papers with titles like “Test of Effect From Future in Large Hadron Collider: a Proposal” and “Search for Future Influence From LHC,” posted on the physics Web site arXiv.org in the last year and a half.
According to the so-called Standard Model that rules almost all physics, the Higgs is responsible for imbuing other elementary particles with mass.
“It must be our prediction that all Higgs producing machines shall have bad luck,” Dr. Nielsen said in an e-mail message. In an unpublished essay, Dr. Nielsen said of the theory, “Well, one could even almost say that we have a model for God.” It is their guess, he went on, “that He rather hates Higgs particles, and attempts to avoid them.”
This malign influence from the future, they argue, could explain why the United States Superconducting Supercollider, also designed to find the Higgs, was canceled in 1993 after billions of dollars had already been spent, an event so unlikely that Dr. Nielsen calls it an “anti-miracle.”
You might think that the appearance of this theory is further proof that people have had ample time — perhaps too much time — to think about what will come out of the collider, which has been 15 years and $9 billion in the making.
The collider was built by CERN, the European Organization for Nuclear Research, to accelerate protons to energies of seven trillion electron volts around a 17-mile underground racetrack and then crash them together into primordial fireballs.
For the record, as of the middle of September, CERN engineers hope to begin to collide protons at the so-called injection energy of 450 billion electron volts in December and then ramp up the energy until the protons have 3.5 trillion electron volts of energy apiece and then, after a short Christmas break, real physics can begin.
Maybe.
Dr. Nielsen and Dr. Ninomiya started laying out their case for doom in the spring of 2008. It was later that fall, of course, after the CERN collider was turned on, that a connection between two magnets vaporized, shutting down the collider for more than a year.
Dr. Nielsen called that “a funny thing that could make us to believe in the theory of ours.”
He agreed that skepticism would be in order. After all, most big science projects, including the Hubble Space Telescope, have gone through a period of seeming to be jinxed. At CERN, the beat goes on: Last weekend the French police arrested a particle physicist who works on one of the collider experiments, on suspicion of conspiracy with a North African wing of Al Qaeda.
Dr. Nielsen and Dr. Ninomiya have proposed a kind of test: that CERN engage in a game of chance, a “card-drawing” exercise using perhaps a random-number generator, in order to discern bad luck from the future. If the outcome was sufficiently unlikely, say drawing the one spade in a deck with 100 million hearts, the machine would either not run at all, or only at low energies unlikely to find the Higgs.
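As a back-of-the-envelope illustration of the odds involved in the proposed card-drawing test, the short Python sketch below computes the chance of drawing the single “stop” card from a deck of one spade and 100 million hearts, and simulates the draw with a random-number generator. It is purely illustrative; the numbers come from the example in the text, not from Nielsen and Ninomiya’s papers.

```python
import random

DECK_SIZE = 100_000_001   # one spade among 100 million hearts
p_stop = 1 / DECK_SIZE    # probability of drawing the lone "stop the machine" card

print(f"Chance of drawing the spade: {p_stop:.2e}")  # roughly 1e-8

# Simulate the proposed game of chance once.
card = random.randrange(DECK_SIZE)
if card == 0:
    print("Spade drawn: run the collider only at low energy (or not at all).")
else:
    print("Heart drawn: proceed with the physics programme as planned.")
```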
Sure, it’s crazy, and CERN should not and is not about to mortgage its investment to a coin toss. The theory was greeted on some blogs with comparisons to Harry Potter. But craziness has a fine history in a physics that talks routinely about cats being dead and alive at the same time and about anti-gravity puffing out the universe.
As Niels Bohr, Dr. Nielsen’s late countryman and one of the founders of quantum theory, once told a colleague: “We are all agreed that your theory is crazy. The question that divides us is whether it is crazy enough to have a chance of being correct.”
Dr. Nielsen is well-qualified in this tradition. He is known in physics as one of the founders of string theory and a deep and original thinker, “one of those extremely smart people that is willing to chase crazy ideas pretty far,” in the words of Sean Carroll, a Caltech physicist and author of a coming book about time, “From Eternity to Here.”
Another of Dr. Nielsen’s projects is an effort to show how the universe as we know it, with all its apparent regularity, could arise from pure randomness, a subject he calls “random dynamics.”
Dr. Nielsen admits that his and Dr. Ninomiya’s new theory smacks of time travel, a longtime interest, which has become a respectable research subject in recent years. While it is a paradox to go back in time and kill your grandfather, physicists agree there is no paradox if you go back in time and save him from being hit by a bus. In the case of the Higgs and the collider, it is as if something is going back in time to keep the universe from being hit by a bus. Although just why the Higgs would be a catastrophe is not clear. If we knew, presumably, we wouldn’t be trying to make one.
We always assume that the past influences the future. But that is not necessarily true in the physics of Newton or Einstein. According to physicists, all you really need to know, mathematically, to describe what happens to an apple or the 100 billion galaxies of the universe over all time are the laws that describe how things change and a statement of where things start. The latter are the so-called boundary conditions — the apple five feet over your head, or the Big Bang.
The equations work just as well, Dr. Nielsen and others point out, if the boundary conditions specify a condition in the future (the apple on your head) instead of in the past, as long as the fundamental laws of physics are reversible, which most physicists believe they are.
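The point about reversible laws and boundary conditions can be made concrete with a short numerical sketch, given below under simple assumptions (a falling apple, no air resistance, a basic velocity-Verlet integrator): evolving the system forward from an initial condition and then integrating backward from the final state recovers the starting point, so a “future” boundary condition pins down the history just as well as a past one.

```python
# Minimal illustration of time-reversible dynamics: a falling apple under gravity,
# integrated with velocity Verlet, then run backwards from its final state.
G = -9.81          # m/s^2
DT = 0.001         # s

def verlet(x, v, steps, dt):
    for _ in range(steps):
        x += v * dt + 0.5 * G * dt * dt
        v += G * dt
    return x, v

# Forward: apple starts 1.5 m above your head, at rest.
x0, v0 = 1.5, 0.0
x1, v1 = verlet(x0, v0, steps=500, dt=DT)

# Backward: take the *final* state as the boundary condition, reverse the
# velocity, and integrate again; reversibility returns the initial state.
x_back, v_back = verlet(x1, -v1, steps=500, dt=DT)

print(f"final state:      x = {x1:.4f} m")
print(f"recovered start:  x = {x_back:.4f} m (should be ~{x0} m)")
```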
“For those of us who believe in physics,” Einstein once wrote to a friend, “this separation between past, present and future is only an illusion.”
In Kurt Vonnegut’s novel “The Sirens of Titan,” all of human history turns out to be reduced to delivering a piece of metal roughly the size and shape of a beer-can opener to an alien marooned on Saturn’s moon so he can repair his spaceship and go home.
Whether the collider has such a noble or humble fate — or any fate at all — remains to be seen. As a Red Sox fan my entire adult life, I feel I know something about jinxes.

Monday, 5 October 2009

Graphite Mimics Iron's Magnetism: New Nanotech Applications.

ScienceDaily (Oct. 5, 2009) — Researchers at Eindhoven University of Technology and Radboud University Nijmegen in The Netherlands show for the first time why ordinary graphite is a permanent magnet at room temperature. The results are promising for new applications in nanotechnology, such as sensors and detectors. In particular, graphite could be a promising candidate for a biosensor material. The results will appear online on 4 October in Nature Physics.
Graphite is a well-known lubricant and forms the basis for pencils. It is a layered compound with a weak interlayer interaction between the individual carbon (graphene) sheets; this weak coupling between layers is what makes graphite such a good lubricant.

Unexpected:
It is unexpected that graphite is ferromagnetic. The researchers Jiri Cervenka and Kees Flipse (Eindhoven University of Technology) and Mikhail Katsnelson (Radboud University Nijmegen) demonstrate direct evidence for ferromagnetic order and explain the underlying mechanism. In graphite, well-ordered areas of carbon atoms are separated by 2-nanometer-wide boundaries of defects. The electrons in the defect regions (the red/yellow area in picture 1) behave differently from those in the ordered areas (blue in picture 1), showing similarities with the electron behaviour of ferromagnetic materials like iron and cobalt.

Debate settled:
The researchers found that the grain boundary regions in the individual carbon sheets are magnetically coupled, forming 2-dimensional networks (picture 2). This interlayer coupling was found to explain the permanent magnetic behaviour of graphite. The researchers also show experimental evidence excluding magnetic impurities as the origin of the ferromagnetism, ending ten years of debate.

Carbon in spintronics:
Surprisingly, a material containing only carbon atoms can be a weak ferromagnet. This opens new routes for spintronics in carbon-based materials. Spins can travel over relatively long distances without spin-flip scattering, and they can be flipped by small magnetic fields. Both are important for applications in spintronics. Carbon is biocompatible, and the explored magnetic behaviour is therefore particularly promising for the development of biosensors.
The research was funded by Nanoned and FOM.
Journal reference:
Jiri Cervenka, Mikhail Katsnelson and Kees Flipse. Room-temperature ferromagnetism in graphite driven by 2D networks of point defects. Nature Physics, October 4, 2009. DOI: 10.1038/NPHYS1399
Adapted from materials provided by Eindhoven University of Technology, via EurekAlert!, a service of AAAS.

Thursday, 1 October 2009

Step Forward For Nanotechnology: Controlled Movement Of Molecules.

SOURCE

ScienceDaily (Oct. 1, 2009) — Scientists in the United Kingdom are reporting an advance toward overcoming one of the key challenges in nanotechnology: Getting molecules to move quickly in a desired direction without help from outside forces.
Their achievement has broad implications, the scientists say, raising the possibility of coaxing cells to move and grow in specific directions to treat diseases. It also could speed development of some long-awaited nanotech innovations. They include self-healing structures that naturally repair tears in their surface and devices that deliver medication to diseased tissue while sparing healthy tissue.
The study is scheduled for the October issue of ACS Nano, a monthly journal.
Mark Geoghegan and colleagues note long-standing efforts to produce directed, controlled movement of individual molecules in the nano world, where objects are about 1/50,000th the width of a human hair. The main solutions so far have involved the use of expensive, complex machines to move the molecules, and they have been only partially successful, the scientists say.
The scientists used a special surface with hydrophobic (water repelling) and hydrophilic (water-attracting) sections. The region between the two sections produced a so-called "energy gradient" which can move tiny objects much like a conveyor belt. In lab studies, the scientists showed that plastic nanoparticles (polymer molecules) moved quickly and in a specific direction on this surface. "This could have implications in many technologies such as coaxing cells to move and grow in given directions, which could have major implications for the treatment of paralysis," the scientists said.
Journal reference:
Burgos et al. Directed Single Molecule Diffusion Triggered by Surface Energy Gradients. ACS Nano, 2009. DOI: 10.1021/nn900991r
Adapted from materials provided by American Chemical Society, via EurekAlert!, a service of AAAS.

Spallation Neutron Source First Of Its Kind To Reach Megawatt Power.

ScienceDaily (Oct. 1, 2009) — The Department of Energy's Spallation Neutron Source (SNS), already the world's most powerful facility for pulsed neutron scattering science, is now the first pulsed spallation neutron source to break the one-megawatt barrier.
"Advances in the materials sciences are fundamental to the development of clean and sustainable energy technologies. In reaching this milestone of operating power, the Spallation Neutron Source is providing scientists with an unmatched resource for unlocking the secrets of materials at the molecular level," said Dr. William F. Brinkman, Director of DOE's Office of Science.
SNS operators at DOE's Oak Ridge National Laboratory pushed the controls past the megawatt mark on September 18 as the SNS ramped up for its latest operational run.
"The attainment of one megawatt in beam power symbolizes the advancement in analytical resources that are now available to the neutron scattering community through the SNS," said ORNL Director Thom Mason, who led the SNS project during its construction. "This is a great achievement not only for DOE and Oak Ridge National Laboratory, but for the entire community of science."
Before the SNS, the world's spallation neutron sources operated in the hundred-kilowatt range. The SNS actually became a world-record holder in August 2007 when it reached 160 kilowatts, earning it an entry in the Guinness Book of World Records as the world's most powerful pulsed spallation neutron source.
Beam power isn't merely a numbers game. A more powerful beam means more neutrons are spalled from SNS's mercury target. For the researcher, the difference in beam intensity is comparable to the ability to see with a car's headlights versus a flashlight. More neutrons also enhance scientific opportunities, including flexibility for smaller samples and for real-time studies at shorter time scales. For example, experiments will be possible that use just one pulse of neutrons to illuminate the dynamics of scientific processes.
Eventually, the SNS will reach its design power of 1.4 megawatts. The gradual increase of beam power has been an ongoing process since the SNS was completed and activated in late April 2006.
In the meantime, scientists have been performing cutting-edge experiments and materials analysis as its eventual suite of 25 instruments comes on line. As DOE Office of Science user facilities, the SNS and its companion facility, the High Flux Isotope Reactor, host researchers from around the world for neutron scattering experiments.
ORNL is managed by UT-Battelle for the Department of Energy.
Adapted from materials provided by DOE/Oak Ridge National Laboratory.

Physicists Create First Atomic-scale Map Of Quantum Dots.

ScienceDaily (Sep. 30, 2009) — University of Michigan physicists have created the first atomic-scale maps of quantum dots, a major step toward the goal of producing "designer dots" that can be tailored for specific applications.
Quantum dots—often called artificial atoms or nanoparticles—are tiny semiconductor crystals with wide-ranging potential applications in computing, photovoltaic cells, light-emitting devices and other technologies. Each dot is a well-ordered cluster of atoms, 10 to 50 atoms in diameter.
Engineers are gaining the ability to manipulate the atoms in quantum dots to control their properties and behavior, through a process called directed assembly. But progress has been slowed, until now, by the lack of atomic-scale information about the structure and chemical makeup of quantum dots.
The new atomic-scale maps will help fill that knowledge gap, clearing the path to more rapid progress in the field of quantum-dot directed assembly, said Roy Clarke, U-M professor of physics and corresponding author of a paper on the topic published online Sept. 27 in the journal Nature Nanotechnology.
Lead author of the paper is Divine Kumah of the U-M's Applied Physics Program, who conducted the research for his doctoral dissertation.
"I liken it to exploration in the olden days," Clarke said of dot mapping. "You find a new continent and initially all you see is the vague outline of something through the mist. Then you land on it and go into the interior and really map it out, square inch by square inch.
"Researchers have been able to chart the outline of these quantum dots for quite a while. But this is the first time that anybody has been able to map them at the atomic level, to go in and see where the atoms are positioned, as well as their chemical composition. It's a very significant breakthrough."
To create the maps, Clarke's team illuminated the dots with a brilliant X-ray photon beam at Argonne National Laboratory's Advanced Photon Source. The beam acts like an X-ray microscope to reveal details about the quantum dot's structure. Because X-rays have very short wavelengths, they can be used to create super-high-resolution maps.
"We're measuring the position and the chemical makeup of individual pieces of a quantum dot at a resolution of one-hundredth of a nanometer," Clarke said. "So it's incredibly high resolution."
A nanometer is one-billionth of a meter.
The availability of atomic-scale maps will quicken progress in the field of directed assembly. That, in turn, will lead to new technologies based on quantum dots. The dots have already been used to make highly efficient lasers and sensors, and they might help make quantum computers a reality, Clarke said.
"Atomic-scale mapping provides information that is essential if you're going to have controlled fabrication of quantum dots," Clarke said. "To make dots with a specific set of characteristics or a certain behavior, you have to know where everything is, so that you can place the atoms optimally. Knowing what you've got is the most important thing of all."
In addition to Clarke, co-authors of the Nature Nanotechnology paper are Sergey Shusterman, Yossi Paltiel and Yizhak Yacoby.
The research was sponsored by a grant from the National Science Foundation. The U.S. Department of Energy supported work at Argonne National Laboratory's Advanced Photon Source.
Adapted from materials provided by University of Michigan.

Monday, 28 September 2009

Carbon Nanostructure Research May Lead To Revolutionary New Devices.


ScienceDaily (Sep. 28, 2009) — Dr. Jiwoong Park of Cornell University, who receives funding for basic research from the Air Force Office of Scientific Research (AFOSR), is investigating carbon nanostructures that may some day be used in electronic, thermal, mechanical and sensing devices for the Air Force.
"Devices that are required in many of the Air Force missions are somewhat different from commercial ones in the sense that they are often exposed to harsh environments while maintaining their maximum performance," Park said. "Carbon-based nanostructures, including carbon nanotubes and graphenes (thin layers of graphite) present many exciting properties that may lead to new device structures."
Park's team of researchers is examining single molecules, nanocrystals, nanowires, carbon nanotubes and their arrays in an effort to find a "bridging" material that has a stable structure for making molecular-level bonds. In addition, they are seeking an effective tool for resolving functional and structural challenges. If successful, they will be able to apply the research to future technological advances.
Park's research may contribute to the discovery of new electronic and optical devices that will revolutionize electrical engineering and bioengineering as well as physical and materials science.
As a result of Park's highly innovative work, the U.S. government has selected him to be a 2008 PECASE (Presidential Early Career Award for Scientists and Engineers) Award winner. The prestigious and much-sought-after award is the highest honor the government presents to promising scientists and engineers at the beginning of their careers. Each award winner receives a citation, a plaque, and up to $1 million in funding from the nominating agency (AFOSR).
"I fully expect that over the five-year period of the PECASE award, Professor Park will have established himself as a world leader in carbon nanotube and graphene research," said Dr. Harold Weinstock, the AFOSR program manager responsible for nominating Park.
Adapted from materials provided by Air Force Office of Scientific Research.

New Nanochemistry Technique Encases Single Molecules In Microdroplets.

ScienceDaily (Sep. 28, 2009) — Inventing a useful new tool for creating chemical reactions between single molecules, scientists at the National Institute of Standards and Technology (NIST) have employed microfluidics—the manipulation of fluids at the microscopic scale—to make microdroplets that contain single molecules of interest.
By combining this new microfluidic "droplet-on-demand" method with "optical tweezers" that could merge multiple droplets and cause their molecular contents to react, the research may ultimately lead to a compact, integrated setup for obtaining single-molecule information on the structure and function of important organic materials, such as proteins, enzymes, and DNA.
With the aid of NIST's Center for Nanoscale Science and Technology, physicists Carlos López-Mariscal and Kristian Helmerson created a tiny microfluidic device with a channel through which water can flow. Squeezed into a narrow stream by a mixture of oils whose viscosity, or resistance to flow, exerts pressure on it, the water then enters a narrow constriction. The water's abrupt pressure drop—accompanied by a dash of detergent—breaks its surface tension, splitting it into small droplets. (This same effect occurs when a thin stream of water falling from a faucet breaks up into small drops.)
The droplet sizes are highly uniform and can be tuned by adjusting the width of the constriction. With this technique, the researchers made droplets about a micrometer in diameter—or half an attoliter (half a billionth of a billionth of a liter) in volume.
In the microfluidic channel, the water is laced with desired molecules of just the right concentration, so that resulting droplets each pick up on average just one molecule of interest. Inside each droplet, the individual molecules of interest slosh around freely in the relatively roomy sphere, along with the water molecules that make up the bulk of every droplet.
By using laser beams, the researchers can move two or more single-molecule-containing droplets, cause them to coalesce, and observe the reactions through optical methods. For their initial reactions, the researchers are mixing fluorescent molecules that emit different colors, but in the future, they envision more interesting chemical reactions, such as those between an infectious agent and an antibody, or a chromosome and a drug. The researchers can shape a laser beam into any desired pattern and thereby trap not only single drops, but arrays of them, opening up new possibilities for single-molecule spectroscopy.
Journal reference:
C. López-Mariscal and K. Helmerson. Optical trapping of hydrosomes. Proc. SPIE, 2009; 7400, 740026
Adapted from materials provided by National Institute of Standards and Technology (NIST).

Discovery Brings New Type Of Fast Computers Closer To Reality.

SOURCE

ScienceDaily (Sep. 28, 2009) — Physicists at UC San Diego have successfully created speedy integrated circuits with particles called “excitons” that operate at commercially attainable cold temperatures, bringing the possibility of a new type of extremely fast computer based on excitons closer to reality.
Their discovery, detailed this week in the advance online issue of the journal Nature Photonics, follows the team’s demonstration last summer of an integrated circuit—an assembly of transistors that is the building block for all electronic devices—capable of working at 1.5 degrees Kelvin above absolute zero. That temperature, equivalent to minus 457 degrees Fahrenheit, is not only less than the average temperature of deep space, but achievable only in special research laboratories.
Now the scientists report that they have succeeded in building an integrated circuit that operates at 125 degrees Kelvin, a temperature that, while still a chilly minus 234 degrees Fahrenheit, can be easily attained commercially with liquid nitrogen, a substance that costs about as much per liter as gasoline.
“Our goal is to create efficient devices based on excitons that are operational at room temperature and can replace electronic devices where a high interconnection speed is important,” said Leonid Butov, a professor of physics at UCSD, who headed the research team. “We’re still in an early stage of development. Our team has only recently demonstrated the proof of principle for a transistor based on excitons and research is in progress.”
Excitons are pairs of negatively charged electrons and positively charged “holes” that can be created by light in a semiconductor such as gallium arsenide. When the electron and hole recombine, the exciton decays and releases its energy as a flash of light.
The fact that excitons can be converted into light makes excitonic devices faster and more efficient than conventional electronic devices with optical interfaces, which use electrons for computation and must then convert them to light for use in communications devices.
"Our transistors process signals using excitons, which like electrons can be controlled with electrical voltages, but unlike electrons transform into photons at the output of the circuit,” Butov said. “This direct coupling of excitons to photons allows us to link computation and communication."
Other members of the team involved in the discovery were physicists Gabriele Grosso, Joe Graves, Aaron Hammack and Alex High at UC San Diego, and materials scientists Micah Hanson and Arthur Gossard at UC Santa Barbara.
Their research was supported by the Army Research Office, the Department of Energy and the National Science Foundation.
Adapted from materials provided by University of California - San Diego.

Nanotechnology: Artificial Pore Created.

ScienceDaily (Sep. 28, 2009) — Using an RNA-powered nanomotor, University of Cincinnati (UC) biomedical engineering researchers have successfully developed an artificial pore able to transmit nanoscale material through a membrane.
In a study led by UC biomedical engineering professor Peixuan Guo, PhD, members of the UC team inserted the modified core of a nanomotor, a microscopic biological machine, into a lipid membrane. The resulting channel enabled them to move both single- and double-stranded DNA through the membrane.
Their paper, “Translocation of double-stranded DNA through membrane-adapted phi29 motor protein nanopores,” will appear in the journal Nature Nanotechnology, Sept. 27, 2009. "The engineered channel could have applications in nano-sensing, gene delivery, drug loading and DNA sequencing," says Guo.
Guo derived the nanomotor used in the study from the biological motor of bacteriophage phi29, a virus that infects bacteria. Previously, Guo discovered that the bacteriophage phi29 DNA-packaging motor uses six molecules of the genetic material RNA to power its DNA genome through its protein core, much like a screw through a bolt.
"The re-engineered motor core itself has shown to associate with lipid membranes, but we needed to show that it could punch a hole in the lipid membrane," says David Wendell, PhD, co-first author of the paper and a research assistant professor in UC’s biomedical engineering department. "That was one of the first challenges, moving it from its native enclosure into this engineered environment."
In this study, UC researchers embedded the re-engineered nanomotor core into a lipid sheet, creating a channel large enough to allow the passage of double-stranded DNA through the channel.
Guo says past work with biological channels has been focused on channels large enough to move only single-stranded genetic material.
"Since the genomic DNA of human, animals, plants, fungus and bacteria are double stranded, the development of single pore system that can sequence double-stranded DNA is very important," he says.
By being placed into a lipid sheet, the artificial membrane channel can be used to load double-stranded DNA, drugs or other therapeutic material into the liposome, other compartments, or potentially into a cell through the membrane.
Guo also says the process by which the DNA travels through the membrane can have larger applications.
"The idea that a DNA molecule travels through the nanopore, advancing nucleotide by nucleotide, could lead to the development of a single pore DNA sequencing apparatus, an area of strong national interest," he says.
Using stochastic sensing, a new analytical technique used in nanopore work, Wendell says researchers can characterize and identify material, like DNA, moving through the membrane.
Co-first author and UC postdoctoral fellow Peng Jing, PhD, says that, compared with traditional research methods, the successful embedding of the nanomotor into the membrane may also provide researchers with a new way to study the DNA packaging mechanisms of the viral nanomotor.
"Specifically, we are able to investigate the details concerning how double-stranded DNA translocates through the protein channel," he says.
The study is the next step in research on using nanomotors to package and deliver therapeutic agents directly to infected cells. Eventually, the team's work could enable use of nanoscale medical devices to diagnose and treat diseases.
"This motor is one of the strongest bio motors discovered to date," says Wendell, "If you can use that force to move a nanoscale rotor or a nanoscale machine … you're converting the force of the motor into a machine that might do something useful."
Funding for this study comes from the National Institutes of Health's Nanomedicine Development Center. Guo is the director of one of eight NIH Nanomedicine Development Centers and an endowed chair in biomedical engineering at UC.
Coauthors of the study include UC research assistant professor David Wendell, PhD, postdoctoral fellow Peng Jing, PhD, graduate students Jia Geng and Tae Jin Lee and former postdoctoral fellow Varuni Subramaniam from Guo’s previous lab at Purdue University. Carlo Montemagno, dean of the College of Engineering and College of Applied Science, also contributed to the study.
Journal reference:
Translocation of double-stranded DNA through membrane-adapted phi29 motor protein nanopores. Nature Nanotechnology, Sept. 27, 2009
Adapted from materials provided by University of Cincinnati Academic Health Center.

Saturday, 26 September 2009

Prototype Device Developed To Detect Dark Matter

SOURCE

ScienceDaily (Sep. 25, 2009) — A team of researchers from the University of Zaragoza (UNIZAR) and the Institut d'Astrophysique Spatiale (IAS, in France) has developed a "scintillating bolometer" -- a device that the scientists will use in efforts to detect the dark matter of the universe, and which has been tested at the Canfranc Underground Laboratory in Huesca, Spain.
"One of the biggest challenges in physics today is to discover the true nature of dark matter, which cannot be directly observed – even though it seems to make up one-quarter of the matter of the Universe. So we have to attempt to detect it using prototypes such as the one we have developed", Eduardo García Abancéns, a researcher from the UNIZAR's Laboratory of Nuclear Physics and Astroparticles, tells SINC.
García Abancéns is one of the scientists working on the ROSEBUD project (an acronym for Rare Objects SEarch with Bolometers UndergrounD), an international collaborative initiative between the Institut d'Astrophysique Spatiale (CNRS-University of Paris-South, in France) and the University of Zaragoza, which is focusing on hunting for dark matter in the Milky Way.
The scientists have been working for the past decade on this mission at the Canfranc Underground Laboratory, in Huesca, where they have developed various cryogenic detectors (which operate at temperatures close to absolute zero: −273.15 °C). The latest is a "scintillating bolometer", a 46-gram device that, in this case, contains a crystal "scintillator", made up of bismuth, germanium and oxygen (BGO: Bi4Ge3O12), which acts as a dark matter detector.
"This detection technique is based on the simultaneous measurement of the light and heat produced by the interaction between the detector and the hypothetical WIMPs (Weakly Interacting Massive Particles) which, according to various theoretical models, explain the existence of dark matter", explains García Abancéns.
The researcher explains that the difference in the scintillation of the various particles enables this method to differentiate between the signals that the WIMPs would produce and others produced by various elements of background radiation (such as alpha, beta or gamma particles).
In order to measure the minuscule amount of heat produced, the detector must be cooled to temperatures close to absolute zero, and a cryogenic facility, reinforced with lead and polyethylene bricks and protected from cosmic radiation as it is housed under the Tobazo mountain, has been installed at the Canfranc underground laboratory.
"The new scintillating bolometer has performed excellently, proving its viability as a detector in experiments to look for dark matter, and also as a gamma spectrometer (a device that measures this type of radiation) to monitor background radiation in these experiments", says García Abancéns.
The scintillating bolometer is currently at the Orsay University Centre in France, where the team is working to optimise the device's light gathering, and carrying out trials with other BGO crystals.
This study, published recently in the journal Optical Materials, is part of the European EURECA project (European Underground Rare Event Calorimeter Array). This initiative, in which 16 European institutions are taking part (including the University of Zaragoza and the IAS), aims to construct a one-tonne cryogenic detector and use it over the next decade to hunt for the dark matter of the Universe.
Methods of detecting dark matter
Direct and indirect detection methods are used to detect dark matter, which cannot be directly observed since it does not emit radiation. The former include simultaneous light and heat detection (such as the technique used by the scintillating bolometers), simultaneous heat and ionisation detection, and simultaneous light and ionisation detection, as well as searches for distinctive signals (the most famous being the search for an annual modulation in the dark matter signal caused by the Earth's orbital motion).
There are also indirect detection methods, where, instead of directly seeking the dark matter particles, researchers try to identify other particles (neutrinos, photons, etc.) produced when the Universe's dark matter particles are destroyed.
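As a toy illustration of the annual-modulation signature mentioned above, the Python sketch below models an expected WIMP event rate as a constant term plus a small yearly cosine modulation peaking in early June, when the Earth's orbital velocity adds to the Sun's motion through the galaxy. The rate values and modulation amplitude are made-up placeholders, not figures from ROSEBUD or any other experiment.

```python
import math

def expected_rate(day_of_year,
                  mean_rate=1.0,        # placeholder: average events per kg per day
                  modulation=0.05,      # placeholder: ~5% seasonal variation
                  peak_day=152):        # ~June 2, the conventional modulation maximum
    """Toy annual-modulation model: R(t) = R0 * (1 + A*cos(2*pi*(t - t_peak)/365.25))."""
    phase = 2 * math.pi * (day_of_year - peak_day) / 365.25
    return mean_rate * (1 + modulation * math.cos(phase))

if __name__ == "__main__":
    for day in (1, 91, 152, 244, 335):
        print(f"day {day:3d}: expected rate = {expected_rate(day):.3f}")
```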
Journal reference:
N. Coron, E. García, J. Gironnet, J. Leblanc, P. de Marcillac, M. Martínez, Y. Ortigoza, A. Ortiz de Solórzano, C. Pobes, J. Puimedón, T. Redon, M.L. Sarsa, L. Torres and J.A. Villar. A BGO scintillating bolometer as dark matter detector prototype. Optical Materials, 2009; 31 (10): 1393. DOI: 10.1016/j.optmat.2008.09.016
Adapted from materials provided by FECYT - Spanish Foundation for Science and Technology, via EurekAlert!, a service of AAAS.

Wednesday, 23 September 2009

ATLAS e-News: Catching the elusive black hole

Professor Stephen Hawking, a central figure in black hole theory, during his recent visit to CERN, with colloquium organiser Luis Alvarez-Gaume on his left.

This time last year, talk of black holes overwhelmed the global news media. Closer to home, black holes are also making mischief – this time overwhelming the Trigger system. It turns out that if a black hole event occurs in the first few months of data taking, we may actually be none the wiser. Not, as some tabloid newspapers were purporting, because we’ll be swallowed into oblivion, but rather because they’ll be masked as flawed events by the Trigger system.

The problem, according to Ignacio Aracena, who works on jets and missing ET, is not that there is nothing to trigger on. Quite the contrary: plenty of final state particles will be produced, but to such an extent that the system will be inundated.

“We expect that black holes will decay in essentially all the Standard Model particles,” says Ignacio. “But for black holes the number of jets is way higher [than for other events]. I’m not a black hole expert, but it’s something like 10 jets with high transverse momentum.”

Compare this to, for example, a supersymmetry event where perhaps four or so jets, some missing transverse energy and a handful of leptons are expected, and you begin to get a sense of the challenge that black holes pose. They pretty much light up the whole detector.

“For the trigger, the main idea of having a sequential selection was to focus on interesting physics objects and then only do the reconstruction in the trigger in that region,” Ignacio explains. Since there is only limited time available to process events at Levels 1 and 2, reading out the whole detector simply isn’t possible.

The situation right now is that the Trigger system is virtually thrown whenever Monte Carlo black hole events are run. Processing the jets and retrieving all the data for them just takes too long; the time-out feature built into the algorithms kicks in before processing is complete, and data is instead dumped into the debug stream. This is a safety store where potentially interesting, but problematic, data is filed – corrupted or noisy data, or events that crash during execution – for later reprocessing offline.

“This debug stream handling will be done in quasi-real time,” says Anna Sfyrla, who works on it, and adds: “Events with time-outs will usually be recovered during this reprocessing.” Recovered events are saved in datasets and made available for analysis, but so far there are no plans for these to be re-integrated into the physics online datasets.

“In the long term, we’ll have to find a strategy to select these events,” says Ignacio. Allowing the system to be snowed under trying to process black hole data, at the expense of picking out and processing other physics events, is not an option. “From an analysis point of view, of course it would be helpful to know that you have black hole events in a specific data set. But we have a broad physics program and you have to keep the whole system running.”

Eventually, a specific trigger chain, or even a specific data stream, will likely be set up to select events that have large jet multiplicity with a high transverse energy.
However, Ignacio concedes that with the current focus on really understanding the detector, its noise levels and its responses, “It’s probably not something that we’re going to claim to see in the first two years.” Which means that if black hole events occur at all, the debug stream will be where they’re discovered.

In the meantime, cosmic running is continually helping to improve the performance of algorithms – an optimisation process that will continue with the arrival of beam and collisions. “In this context, any improvements we make, even while taking cosmic data, are going to benefit [the eventual online identification of black holes],” says Ignacio. “Having this finally sent to a specific data stream will be the sum of all the efforts that we’re making right now and will do in the future.”
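A toy sketch of the kind of selection such a dedicated black hole trigger chain might apply – keeping events with many high-pT jets and a large summed transverse energy – is given below. The thresholds and the event format are invented for illustration and have nothing to do with the actual ATLAS trigger menus.

```python
def passes_toy_blackhole_trigger(jets,
                                 min_jets=6,          # invented threshold
                                 jet_pt_cut=50.0,     # GeV, invented
                                 sum_et_cut=1000.0):  # GeV, invented
    """Toy selection: many hard jets plus a large total transverse energy.

    `jets` is a list of jet transverse momenta in GeV; this is not ATLAS
    trigger code, just an illustration of the selection logic described above.
    """
    hard_jets = [pt for pt in jets if pt > jet_pt_cut]
    return len(hard_jets) >= min_jets and sum(hard_jets) > sum_et_cut

# Example: a busy, black-hole-like event versus a quieter one.
print(passes_toy_blackhole_trigger([320, 250, 190, 150, 120, 95, 80, 60]))  # True
print(passes_toy_blackhole_trigger([180, 90, 45, 30]))                      # False
```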

Ceri Perkins
ATLAS e-News

Is the Large Hadron Collider worth its massive price tag?

SOURCE

Scientists at Cern near Geneva are close to turning on their particle accelerator a year after it blew up. In their latest video, physicists hunting the Higgs boson ask what price society is willing to pay to understand the universe.

A month or so ago I was sat at a table outside the canteen at Cern, the European nuclear research organisation in Switzerland, nursing an espresso and watching an impromptu volleyball match play out across a giant blue magnet daubed with white paint. The graffiti read: "LH...C'est pas sorcier". It's not rocket science.
Maybe it's not, but what the scientists are trying to do at Cern is no easier. The underground accelerator, the Large Hadron Collider, is vast and vastly complex. It's almost no surprise it didn't spring to life and start churning out data as soon as they flicked the on switch this time last year.
I was at Cern to talk to scientists about the long march that is the hunt for the Higgs boson. The particle was predicted 45 years ago. You can think of it as a tell-tale fingerprint that confirms the existence of an extraordinary field that permeates the entirety of space, from the infinitesimal pinch between the constituents of atomic nuclei to the incomprehensible stretches of nothingness that separate galaxies.
The field is a big deal. According to physicists' best theories, it contains energy that it shares with the smallest building blocks of matter, such as electrons and quarks, the latter being the constituents of protons and neutrons in the atomic nucleus. The field gives the particles mass, and in doing so, brings stability and structure to the universe.
There are only two places in the world that have the capability to hunt for the Higgs boson: Fermilab near Chicago and Cern. Today, Fermilab is home to the world's most powerful particle collider, the Tevatron. Cern will take over that title in November, at least they will if they get the LHC up and running this time.
Cern saw glimpses of what might have been the Higgs boson in 2000, with an older machine that was ripped out of the ground to make room for the Large Hadron Collider. If those glimpses were real, the Higgs is fairly light and could take a long time to find with the LHC.
I've written about the Colliding Particles project here before. A team of Higgs hunters at University College London have teamed up with a film maker to produce a series of video shorts that follow their exploits. I can't praise them enough. They blast many full-length TV science documentaries out of the water. They have a coherent narrative, they have engaging characters, and they let you in on what happens to our £80m-a-year Cern subscription.
In the first video, the team talk about a new way to hunt for the Higgs. In this, their fifth video, it's time to pitch the idea to other Cern physicists. If it is accepted, their "Eurostar" idea becomes part of the formal search for the missing particle. As luck would have it, they've roped in that bloke from The Mummy and Four Weddings to do their presentation. Or maybe it's his younger brother.
There's more to it though. The Large Hadron Collider is an expensive beast and in times of global financial meltdown and looming environmental problems, it's not unfair to wonder whether this kind of basic research is a luxury we can't afford. It's a question the physicists ponder and perhaps never fully answer.
The Large Hadron Collider might well be the last machine of its kind that ever gets built. But the fact that it was built is extremely heartening. This is a machine so large it takes hours to jog around. It cost billions of Euros and took many years to build. That governments were willing to pay for it, with no idea what it might or might not find, speaks volumes about the price society is willing to pay to understand more about our place in the universe.

Sunday, 20 September 2009

New X-ray Technique Illuminates Reactivity Of Environmental Contaminants.

SOURCE

ScienceDaily (Sep. 20, 2009) — A chemical reaction can occur in the blink of an eye.
Thanks to a new analytical method employed by researchers at the University of Delaware, scientists can now pinpoint, at the millisecond level, what happens as harmful environmental contaminants such as arsenic begin to react with soil and water under various conditions.
Quantifying the initial rates of such reactions is essential for modeling how contaminants are transported in the environment and predicting risks.
The research method, which uses an analytical technique known as quick-scanning X-ray absorption spectroscopy (Q-XAS), was developed by a research team led by Donald Sparks, S. Hallock du Pont Chair of Plant and Soil Sciences and director of the Delaware Environmental Institute at UD. The work is reported in the Sept. 10 Early Edition of the Proceedings of the National Academy of Sciences and will be in the Sept. 22 print issue.
Postdoctoral researcher Matthew Ginder-Vogel is the first author of the study, which also involved Ph.D. student Gautier Landrot and Jason Fischel, an undergraduate student at Juniata College who has interned in Sparks's lab during the past three summers.
The research method was developed using beamline X18B at the National Synchrotron Light Source at Brookhaven National Laboratory in Upton, N.Y. The facility is operated by the U.S. Department of Energy.
“This method is a significant advance in elucidating mechanisms of important geochemical processes, and is the first application, at millisecond time scales, to determine in real-time, the molecular scale reactions at the mineral/water interface. It has tremendous applications to many important environmental processes including sorption, redox, and precipitation,” Sparks said.
“My group and I have been conducting kinetics studies on soils and soil minerals for 30 years,” Sparks added. “Since the beginning I have been hopeful that someday we could follow extremely rapid reaction processes and simultaneously collect mechanistic information.”
X-ray spectroscopy was invented years ago to illuminate structures and materials at the atomic level. The technique has been commonly used by physicists, chemists, materials scientists, and engineers, but only recently by environmental scientists.
“In studying soil kinetics, we want to know how fast a contaminant begins to stick to a mineral,” Ginder-Vogel says. “In general, these reactions are very rapid -- 90 percent of the reaction is over in the first 10 seconds. Now we can measure the first few seconds of these reactions that couldn't be measured before. We can now look at things as they happen versus attempting to freeze time after the fact,” he notes.
For their study, the UD researchers made millisecond measurements of the oxidation rate of arsenic by hydrous manganese oxide, which is a mineral that absorbs heavy metals and nutrients.
Contamination of drinking water supplies by arsenic is a serious health concern in the United States and abroad. The poisonous element occurs naturally in rocks and minerals and is also used in a wide range of products, from wood preservatives and insecticides, to poultry feed.
The toxicity and availability of arsenic to living organisms depends on its oxidation state -- in other words, the number of electrons lost or gained by an atom when it reacts with minerals and microbes. For example, arsenite [As(III)] is more mobile and toxic than its oxidized counterpart, arsenate [As(V)].
“Our technique is important for looking at groundwater flowing through minerals,” Ginder-Vogel notes. “We look at it as a very early tool that can be incorporated into predictive modeling for the environment.”
A native of Minnesota, Ginder-Vogel started out as a chemist in college, but says he wanted to do something more applied. As he was completing his doctorate at Stanford University under Prof. Scott Fendorf, a UD alumnus who studied under Sparks, Ginder-Vogel saw the advertisement for a postdoctoral position in Sparks's lab and jumped at the opportunity.
“The University of Delaware has the reputation as one of the best research institutions in the country for soil science, and Don Sparks is a leader in the field,” Ginder-Vogel notes.
Ginder-Vogel says one of the coolest things about the current research is its interdisciplinary nature.
“What's novel about soil chemistry is that we can take bits and pieces from different fields -- civil and environmental engineering, materials science, chemistry, and biochemistry -- and apply them in unique ways,” he says. “It's fun to contribute to a new research method in our field.”
The research was funded by the U.S. Department of Agriculture (USDA) and by two grants from the National Science Foundation, including one from the NSF-Delaware Experimental Program to Stimulate Competitive Research (EPSCoR). The U.S. Department of Energy supported the research team's use of the National Synchrotron Light Source.
Journal reference:
Matthew Ginder-Vogel, Gautier Landrot, Jason S. Fischel, and Donald L. Sparks. Quantification of rapid environmental redox processes with quick-scanning x-ray absorption spectroscopy (Q-XAS). Proceedings of the National Academy of Sciences, 2009. DOI: 10.1073/pnas.0908186106
Adapted from materials provided by University of Delaware.

Strain On Nanocrystals Could Yield Colossal Results


ScienceDaily (Sep. 18, 2009) — In finally answering an elusive scientific question, researchers with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have shown that the selective placement of strain can alter the electronic phase and its spatial arrangement in correlated electron materials. This unique class of materials is commanding much attention now because they can display properties such as colossal magnetoresistance and high-temperature superconductivity, which are highly coveted by the high-tech industry.
Junqiao Wu, a physicist who holds joint appointments with Berkeley Lab’s Materials Sciences Division and the University of California-Berkeley’s Department of Materials Science and Engineering, led the study in which it was demonstrated that irregularities in the micro-domain structure of correlated electron materials - a phenomenon known as “phase inhomogeneity” - can be generated by external stimuli and could be engineered at the sub-micron scale to achieve desired properties.
“By continuously tuning strain over a wide range in single-crystal vanadium oxide micro- and nano-scale wires, we were able to engineer phase inhomogeneity along the wires,” says Wu. “Our results shed light on the origin of phase inhomogeneity in correlated electron materials in general, and open opportunities for designing and controlling phase inhomogeneity of correlated electron materials for future devices.”
Wu is the corresponding author of a paper describing this work, which was published in the journal Nature Nanotechnology and is entitled “Strain engineering and one-dimensional organization of metal-insulator domains in single crystal VO2 beams.” Co-authoring the paper with Wu were Jinbo Cao, Elif Ertekin, Varadharajan Srinivasan, Wen Fan, Simon Huang, Haimei Zheng, Joanne Yim, Devesh Khanal, Frank Ogletree and Jeffrey Grossman.
Whereas in conventional materials, the motion of one electron is relatively independent of any other, in “correlated electron materials” quantum effects enable electrons to act collectively, like dancers in a chorus line. Emerging from this collective electronic behavior are properties such as colossal magnetoresistance, where the presence of a magnetic field increases electrical resistance by orders of magnitude, or high-temperature superconductivity, in which the materials lose all electrical resistance at temperatures much higher than conventional superconductors.
Frequently observed spatial phase inhomogeneities are believed to be critical to the collective electronic behavior of correlated electron materials. However, despite decades of investigation, the question of whether such phase inhomogeneities are intrinsic to correlated electron materials or caused by external stimuli has remained largely unanswered.
“This question is not only important for our understanding of the physics behind correlated electron materials,” says Wu, “it also directly determines the spatial scale of correlated electron material device applications.”
To determine if phase inhomogeneity can be caused by external effects, Wu and his colleagues worked with vanadium oxide, a representative correlated electron material that features a metal-nonmetal transition, where in the nonmetal state its electrons can no longer carry an electrical current. After synthesizing the vanadium oxide into flexible single-crystal micro- and nanowires, the research team subjected the wires to strain by bending them to different curvatures. Different curvatures yielded different strains, and the phase transitions were measured in each of the strained areas.
“The metal-nonmetal domain structure was determined by competition between elastic deformation, thermodynamic and domain wall energies in this coherently strained system,” says Wu. “A uniaxial compressive strain of approximately 1.9-percent was able to drive the metal-nonmetal transition at room temperature.”
The ability to fabricate single-crystal micro- and nanowires of vanadium oxide that were free of structural defects made it possible to apply such high strain without plastic deformation or fracturing of the material, Wu says. Bulk and even thin film samples of vanadium oxide cannot tolerate a strain of even one-percent without suffering dislocations.
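For orientation, the strain reached by bending a beam can be estimated from elementary beam geometry: the outer surface of a beam of thickness t bent to a radius of curvature R is stretched (and the inner surface compressed) by roughly ε ≈ t/(2R). The short Python sketch below evaluates this relation for made-up wire dimensions; the numbers are illustrative and are not taken from the Berkeley Lab study.

```python
def bending_surface_strain(thickness_nm, radius_of_curvature_um):
    """Approximate surface strain of a thin beam bent to radius R: eps ~ t / (2R)."""
    t = thickness_nm * 1e-9            # metres
    r = radius_of_curvature_um * 1e-6  # metres
    return t / (2 * r)

# Example with invented dimensions: a 200 nm thick wire bent to a 5 micrometre radius.
eps = bending_surface_strain(thickness_nm=200, radius_of_curvature_um=5)
print(f"surface strain ~ {eps * 100:.1f}%")   # ~2.0%, comparable to the ~1.9% quoted above
```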
Wu says that in the future strain engineering might be achieved by interfacing a correlated electron material such as vanadium oxide with a piezoelectric - a non-conducting material that creates a stress or strain in response to an electric field.
“By applying an electric field, the piezoelectric material would strain the correlated electron material to achieve a phase transition that would give us the desired functionality,” says Wu. ”To reach this capability, however, we will first need to design and synthesize such integrated structures with good material quality.”
This work was supported in part by Berkeley Lab through its Laboratory Directed Research and Development Program, and in part by a grant from the National Science Foundation.
Adapted from materials provided by DOE/Lawrence Berkeley National Laboratory.

venerdì 11 settembre 2009

PROTEIN EXPRESSIONS Study N.1 (VIDEO)



This 3'30'' video represents the first graphical result of the two major aspects of our research: protein motion and visual representation. The main program used is Blender, an open source 3D animation and special effects package, which we have extended with additional scripts. We also make extensive use of other scientific programs (VMD, Swiss-PDBviewer with Gromos43B1, Chimera, Reduce – MolProbity, PDB2PQR, PyMLP.py, APBS, and several homemade scripts and programs) and graphical programs (Blender, Maya Autodesk, The Gimp, Djv_view, ImageMagick).
It shows a short trip to the interior of a cell, starting inside a small capillary vessel. After navigating the vessel, we meet some white blood cells and take a close look at one of them. We see its surface from a distance, so that we can observe the membrane dynamics without any specific object clearly distinguished. We then 'land' on the surface and first see some glycolipids up close; looking around, we get a view of a membrane raft, with a crowd of proteins all in constant motion. One very erratic protein is a channel (a protein that allows potassium ions to leave the cell), into which we fall to get inside the cell.
Once inside, we see Calmodulin, a very flexible small protein, which we observe for a while before travelling along a microtubule towards a place where calcium waves are pulsing. Here our favourite protein is hit by calcium and undergoes a major conformational change. This is shown from different perspectives, until we quickly move towards the peripheral region of the cell, where the contractile ring is operating to split the cell in two at the time of cell division.
For a more detailed explanation of the scientific and the graphical aspects of the video, scene by scene, download this file.
See the video (Flash Player required)

giovedì 13 agosto 2009

Quantum Computing: From qubits to qudits, with five energy levels

Source: ScienceDaily
ScienceDaily (Aug. 13, 2009) — Scientists at UC Santa Barbara have devised a new type of superconducting circuit that behaves quantum mechanically – but has up to five levels of energy instead of the usual two. The findings are published in the August 7 issue of Science.
These circuits act like artificial atoms in that they can only gain or lose energy in packets, or quanta, by jumping between discrete energy levels. "In our previous work, we focused on systems with just two energy levels, 'qubits,' because they are the quantum analog of 'bits,' which have two states, on and off," said Matthew Neeley, first author and a graduate student at UCSB.
He explained that in this work they operated a quantum circuit as a more complicated artificial atom with up to five energy levels. The generic term for such a system is "qudit," where 'd' refers to the number of energy levels; in this case, 'd' equals five.
"This is the quantum analog of a switch that has several allowed positions, rather than just two," said Neeley. "Because it has more energy levels, the physics of a qudit is richer than for just a single qubit. This allows us to explore certain aspects of quantum mechanics that go beyond what can be observed with a qubit."
Just as bits are used as the fundamental building blocks of computers, qubits could one day be used as building blocks of a quantum computer, a device that exploits the laws of quantum mechanics to perform certain computations faster than can be done with classical bits alone. "Qudits can be used in quantum computers as well, and there are even cases where qudits could be used to speed up certain operations with a quantum computer," said Neeley. "Most research to date has focused on qubit systems, but we hope our experimental demonstration will motivate more effort on qudits, as an addition to the quantum information processing toolbox."
The senior co-author of the paper is John M. Martinis, professor of physics at UCSB. Other co-authors from UCSB are: Markus Ansmann, Radoslaw C. Bialczak, Max Hofheinz, Erik Lucero, Aaron D. O'Connell, Daniel Sank, Haohua Wang, James Wenner, and Andrew N. Cleland. Another co-author, Michael R. Geller, is from the University of Georgia.
Adapted from materials provided by University of California - Santa Barbara.

sabato 25 luglio 2009

Nanotubes Weigh A Single Atom


SOURCE

ScienceDaily (July 23, 2009) — How can you weigh a single atom? European researchers have built an exquisite new device that can do just that. It may ultimately allow scientists to study the progress of chemical reactions, molecule by molecule.
Carbon nanotubes are ultra-thin fibres of carbon and a nanotechnologist’s dream.
They are made from thin sheets of carbon only one atom thick – known as graphene – rolled into a tube only a few nanometres across. Even the thickest is more than a thousand times thinner than a human hair.
Interest in carbon nanotubes blossomed in the 1990s when they were found to possess impressive characteristics that make them very attractive raw materials for nanotechnology of all kinds.
“They have unique properties,” explains Professor Pertti Hakonen of Helsinki University of Technology. “They are about 1000 times stronger than steel and very good thermal conductors and good electrical conductors.”
Hakonen is coordinator of the EU-funded CARDEQ project (http://www.cardeq.eu/), which is exploiting these intriguing materials to build a device sensitive enough to measure the masses of atoms and molecules.
Vibrating strings
A carbon nanotube is essentially an extremely thin, but stiff, piece of string and, like other strings, it can vibrate. As all guitar players know, heavy strings vibrate more slowly than lighter strings, so if a suspended carbon nanotube is allowed to vibrate at its natural frequency, that frequency will fall if atoms or molecules become attached to it.
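For a simple harmonic resonator with f = (1/2π)√(k/m), a small added mass Δm lowers the frequency by roughly Δf ≈ -(f/2)(Δm/m). The sketch below applies this first-order relation with hypothetical nanotube parameters (the 300 MHz frequency and 1-attogram effective mass are assumptions for illustration, not CARDEQ figures) to estimate the shift produced by a single chromium atom.

```python
# First-order estimate of the frequency drop when a small mass lands on a resonator:
# for f = (1/(2*pi)) * sqrt(k/m), an added mass dm << m gives df ~ -(f/2) * (dm/m).
# The resonance frequency and effective nanotube mass below are assumed round numbers
# for illustration only; they are not CARDEQ device parameters.
f0 = 300e6                   # assumed resonance frequency: 300 MHz
m_tube = 1e-21               # assumed effective nanotube mass: ~1 attogram (kg)
m_cr = 52 * 1.66054e-27      # mass of one chromium atom (~52 u) in kg

df = -(f0 / 2.0) * (m_cr / m_tube)
print(f"Estimated shift per chromium atom: {df / 1e3:.1f} kHz")
```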
It sounds simple and the idea is not new. What is new is the delicate sensing system needed to detect the vibration and measure its frequency. Some nanotubes turn out to be semiconductors, depending on how the graphene sheet is wound, and it is these that offer the solution that CARDEQ has developed.
Members of the consortium have taken the approach of building a semiconducting nanotube into a transistor so that the vibration modulates the current passing through it. “The suspended nanotube is, at the same time, the vibrating element and the readout element of the transistor,” Hakonen explains.
“The idea was to run three different detector plans in parallel and then select the best one,” he says. “Now we are down to two. So we have the single electron transfer concept, which is more sensitive, and the field effect transistor concept, which is faster.”
Single atoms
Last November, CARDEQ partners in Barcelona reported that they had sensed the mass of single chromium atoms deposited on a nanotube. But Hakonen says that even smaller atoms, of argon, can now be detected, though the device is not yet stable enough for such sensitivity to be routine. “When the device is operating well, we can see a single argon atom on short time scales. But then if you measure too long the noise becomes large.”
CARDEQ is not alone in employing carbon nanotubes as mass sensors. Similar work is going on at two centres in California – Berkeley and Caltech – though each has adopted a different approach to measuring the mass.
All three groups have announced they can perform mass detection on the atomic level using nanotubes, but CARDEQ researchers provided the most convincing data with a clear shift in the resonance frequency.
But a single atom is nowhere near the limit of what is possible. Hakonen is confident they can push the technology to detect the mass of a single nucleon – a proton or neutron.
“It’s a big difference,” he admits, “but typically the improvements in these devices are jump-like. It’s not like developing some well-known device where we have only small improvements from time to time. This is really front-line work and breakthroughs do occur occasionally.”
Biological molecules
If the resolution can be pared down to a single nucleon, then researchers can look forward to accurately weighing different types of molecules and atoms in real time.
It may then become possible to observe the radioactive decay of a single nucleus and to study other types of quantum mechanical phenomena.
But the real excitement would be in tracking chemical and biological reactions involving individual atoms and molecules reacting right there on the vibrating nanotube. That could have applications in molecular biology, allowing scientists to study the basic processes of life in unprecedented detail. Such practical applications are probably ten years away, Hakonen estimates.
“It will depend very much on how the technology for processing carbon nanotubes develops. I cannot predict what will happen, but I think chemical reactions in various systems, such as proteins and so on, will be the main applications in the future.”
The CARDEQ project received funding from the FET-Open strand of the EU’s Sixth Framework Programme for ICT research.
Adapted from materials provided by ICT Results.

Physicists Create First Nanoscale Mass Spectrometer


ScienceDaily (July 24, 2009) — Using devices millionths of a meter in size, physicists at the California Institute of Technology (Caltech) have developed a technique to determine the mass of a single molecule, in real time.
The mass of molecules is traditionally measured using mass spectrometry, in which samples consisting of tens of thousands of molecules are ionized, to produce charged versions of the molecules, or ions. Those ions are then directed into an electric field, where their motion, which is choreographed by both their mass and their charge, allows the determination of their so-called mass-to-charge ratio. From this, their mass can ultimately be ascertained.
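As a concrete, hedged example of the conventional principle (time-of-flight is just one common analyzer geometry, chosen here only for illustration): an ion of mass m and charge q accelerated through a potential V satisfies qV = ½mv², so its flight time over a drift length L is t = L√(m/(2qV)), and heavier ions of the same charge arrive later.

```python
# Hedged illustration of the conventional approach: in a time-of-flight analyzer
# (one common geometry, used here only as an example), an ion of mass m and charge q
# accelerated through potential V obeys q*V = 0.5*m*v**2, so its flight time over a
# drift length L is t = L * sqrt(m / (2*q*V)); heavier ions of the same charge arrive later.
import math

E_CHARGE = 1.602e-19    # elementary charge, C
DALTON = 1.66054e-27    # atomic mass unit, kg

def flight_time(mass_da, charge_state, accel_volts, drift_m):
    """Time-of-flight for an ion of the given mass (in daltons) and charge state."""
    m = mass_da * DALTON
    q = charge_state * E_CHARGE
    return drift_m * math.sqrt(m / (2.0 * q * accel_volts))

# Hypothetical numbers: a singly charged 66 kDa protein, 20 kV acceleration, 1 m drift tube.
print(f"{flight_time(66_000, 1, 20_000, 1.0) * 1e6:.0f} microseconds")
```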
The new technique, developed over 10 years of effort by Michael L. Roukes, a professor of physics, applied physics, and bioengineering at Caltech and codirector of Caltech's Kavli Nanoscience Institute, and his colleagues, simplifies and miniaturizes the process through the use of very tiny nanoelectromechanical system (NEMS) resonators. The bridge-like resonators, which are 2 micrometers long and 100 nanometers wide, vibrate at a high frequency and effectively serve as the "scale" of the mass spectrometer.
"The frequency at which the resonator vibrates is directly proportional to its mass," explains research physicist Akshay Naik, the first author of a paper about the work that appears in the journal Nature Nanotechnology. Changes in the vibration frequency, then, correspond to changes in mass.
"When a protein lands on the resonator, it causes a decrease in the frequency at which the resonator vibrates and the frequency shift is proportional to the mass of the protein," Naik says.
As described in the paper, the researchers used the instrument to test a sample of the protein bovine serum albumin (BSA), which is known to have a mass of 66 kilodaltons (kDa; a dalton is a unit of mass used to describe atomic and molecular masses, with one dalton approximately equal to the mass of one hydrogen atom).
The BSA protein ions are produced in vapor form using an electrospray ionization (ESI) system. The ions are then sprayed onto the NEMS resonator, which vibrates at a frequency of 450 megahertz. "The flux of proteins reaching the NEMS is such that only one or two proteins land on the resonator in a minute," Naik says.
When the BSA protein molecule is dropped onto the resonator, the resonator's vibration frequency decreases by as much as 1.2 kilohertz—a small, but readily detectable, change. In contrast, the beta-amylase protein molecule, which has a mass of about 200 kDa, or three times that of BSA, causes a maximum frequency shift of about 3.6 kHz.
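A quick consistency check on the figures quoted above, assuming the frequency shift scales linearly with molecular mass for molecules landing near the resonator centre:

```python
# Quick consistency check on the quoted figures, assuming the frequency shift scales
# linearly with molecular mass for molecules landing near the resonator centre.
bsa_mass_kda, bsa_shift_khz = 66.0, 1.2          # BSA: ~66 kDa -> ~1.2 kHz
amylase_mass_kda = 200.0                         # beta-amylase: ~200 kDa

responsivity_khz_per_kda = bsa_shift_khz / bsa_mass_kda        # ~0.018 kHz per kDa
predicted_shift = responsivity_khz_per_kda * amylase_mass_kda  # ~3.6 kHz

print(f"Responsivity: {responsivity_khz_per_kda * 1000:.1f} Hz/kDa")
print(f"Predicted beta-amylase shift: {predicted_shift:.2f} kHz (reported ~3.6 kHz)")
```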
In principle, Naik says, it should be possible to use the system to detect one-dalton differences in mass, the equivalent of a single hydrogen atom, but this will require a next generation of nanowire-based devices that are smaller and have even better noise performance.
Because the location where the protein lands on the resonator also affects the frequency shift—falling onto the center of the resonator causes a larger change than landing on the end or toward the sides, for example—"we can't tell the mass with a single measurement, but needed about 500 frequency jumps in the published work," Naik says. In the future, the researchers will decouple measurements of the mass and the landing position of the molecules being sampled. This technique, which they have already prototyped, will soon enable mass spectra for complicated mixtures to be built up, molecule by molecule.
Eventually, Roukes and colleagues hope to create arrays of perhaps hundreds of thousands of the NEMS mass spectrometers, working in parallel, which could determine the masses of hundreds of thousands of molecules "in an instant," Naik says.
As Roukes points out, "the next generation of instrumentation for the life sciences—especially those for systems biology, which allows us to reverse-engineer biological systems—must enable proteomic analysis with very high throughput. The potential power of our approach is that it is based on semiconductor microelectronics fabrication, which has allowed creation of perhaps mankind's most complex technology."
The other authors of the paper are graduate student Mehmet S. Hanay and staff scientist Philip Feng, from Caltech, and Wayne K. Hiebert of the National Research Council of Canada. The work was supported by the National Institutes of Health and, indirectly, by the Defense Advanced Research Projects Agency and the Space and Naval Warfare Systems Command.
Journal reference:
Naik et al. Towards single-molecule nanomechanical mass spectrometry. Nature Nanotechnology, July 4, 2009.
Adapted from materials provided by California Institute of Technology.

giovedì 23 luglio 2009

Purer Water With Long Shelf Life Made Possible With One Atom Change To Water Purification Product


ScienceDaily (July 23, 2009) — By substituting a single atom in a molecule widely used to purify water, researchers at Sandia National Laboratories have created a far more effective decontaminant with a shelf life superior to products currently on the market.
Sandia has applied for a patent on the material, which removes bacterial, viral and other organic and inorganic contaminants from river water destined for human consumption, and from wastewater treatment plants prior to returning water to the environment.
“Human consumption of ‘challenged’ water is increasing worldwide as preferred supplies become more scarce,” said Sandia principal investigator May Nyman. “Technological advances like this may help solve problems faced by water treatment facilities in both developed and developing countries.”
The study was published in June 2009 in the journal Environmental Science & Technology (a publication of the American Chemical Society) and highlighted in the June 22 edition of Chemical & Engineering News. Sandia is working with a major producer of water treatment chemicals to explore the commercial potential of the compound.
The water-treatment reagent, known as a coagulant, is made by substituting an atom of gallium in the center of an aluminum oxide cluster — itself a commonly used coagulant in water purification, says Nyman.
The substitution isn’t performed atom by atom using nanoscopic tweezers. Instead, it uses a simple chemical process: aluminum salts are dissolved in water, gallium salts are dissolved in a sodium hydroxide solution, and the sodium hydroxide solution is then slowly added to the aluminum solution while heating.
“The substitution of a single gallium atom in that compound makes a big difference,” said Nyman. “It greatly improves the stability and effectiveness of the reagent. We’ve done side-by-side tests with a variety of commercially available products. For almost every case, ours performs best under a wide range of conditions.”
Wide-ranging conditions are inevitable, she said, when dealing with a natural water source such as a river. “You get seasonal and even daily fluctuations in pH, temperature, turbidity and water chemistry. And a river in central New Mexico has very different conditions than say, a river in Ohio.”
The Sandia coagulant attracts and binds contaminants so well because it maintains its electrostatic charge more reliably than conventional coagulants made without gallium, itself a harmless addition.
The new material also resists converting to larger, less-reactive aggregates before it is used. This means it maintains a longer shelf life, avoiding the problem faced by related commercially available products that aggregate over time.
“The chemical substitution [of a gallium atom for an aluminum atom] has been studied by Sandia’s collaborators at the University of California at Davis, but nobody has ever put this knowledge to use in an application such as removing water contaminants like microorganisms,” said Nyman.
The project was conceived and all water treatment studies were performed at Sandia, said Nyman, who worked with Sandia microbiologist Tom Stewart. Transmission electron microscope images of bacteriophages binding to the altered material were achieved at the University of New Mexico. Mass spectroscopy of the alumina clusters in solution was performed at UC Davis.
The work was sponsored by Sandia’s Laboratory Directed Research and Development office.
Adapted from materials provided by DOE/Sandia National Laboratories.

Ytterbium's Broken Symmetry: Largest Parity Violations Ever Observed In An Atom


ScienceDaily (July 22, 2009) — Ytterbium was discovered in 1878, but until it recently became useful in atomic clocks, the soft metal rarely made the news. Now ytterbium has a new claim to scientific fame. Measurements with ytterbium-174, an isotope with 70 protons and 104 neutrons, have shown the largest effects of parity violation in an atom ever observed – a hundred times larger than the most precise measurements made so far, with the element cesium.
“Parity” assumes that, on the atomic scale, nature behaves identically when left and right are reversed: interactions that are otherwise the same but whose spatial configurations are switched, as if seen in a mirror, ought to be indistinguishable. Sounds like common sense but, remarkably, this isn’t always the case.
“It’s the weak force that allows parity violation,” says Dmitry Budker, who led the research team. Budker is a member of the Nuclear Science Division at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory and a professor of physics at the University of California at Berkeley.
Of the four forces of nature – strong, electromagnetic, weak, and gravitational – the extremely short-range weak force was the last to be discovered. Neutrinos, having no electric charge, are immune to electromagnetism and only interact through the weak force. The weak force also has the startling ability to change the flavor of quarks, and to change protons into neutrons and vice versa.
Violating parity – neutrons and the weak force
Protons on their own last forever, apparently, but a free neutron falls apart in about 15 minutes; it turns into a proton by emitting an electron and an antineutrino, a process called beta decay. What makes beta decay possible is the weak force.
Scientists long assumed that nature, on the atomic scale, was symmetrical. It would look the same not only if left and right were reversed but also if the electrical charges of particles involved in an interaction were reversed, or even if the whole process ran backwards in time. Charge conjugation is written C, parity P, and time T; nature was thought to be C invariant, P invariant, and T invariant.
In 1957 researchers realized that the weak force didn’t play by the rules. When certain kinds of nuclei such as cobalt-60 are placed in a magnetic field to polarize them – line them up – and then allowed to undergo beta decay, they are more likely to emit electrons from their south poles than from their north poles.
This was the first demonstration of parity violation. Before the 1957 cobalt-60 experiment, renowned physicist Richard Feynman had said that if P violation were true – which he doubted – something long thought impossible would be possible after all: “There would be a way to distinguish right from left.”
It’s now apparent that many atoms exhibit parity violation, although it is not easy to detect. P violation has been measured with the greatest accuracy in cesium atoms, which have 55 protons and 78 neutrons in the nucleus, by using optical methods to observe the effect when atomic electrons are excited to higher energy levels.
The Berkeley researchers designed their own apparatus to detect the much larger parity violation predicted for ytterbium. In their experiment, ytterbium metal is heated to 500 degrees Celsius to produce a beam of atoms, which is sent through a chamber where magnetic and electric fields are oriented at right angles to each other. Inside the chamber the ytterbium atoms are hit by a laser beam, tuned to excite some of their electrons to higher energy states via a “forbidden” (highly unlikely) transition. The electrons then relax to lower energies along different pathways.
Weak interactions between the electron and the nucleus – plus weak interactions within the nucleus of the atom – act to mix some of the electron energy states together, making a small contribution to the forbidden transition. But other, more ordinary electromagnetic processes, which involve apparatus imperfections, also mix the states and blur the signal. The purpose of the chamber’s magnetic and electric fields is to amplify the parity-violation effect and to remove or identify these spurious electromagnetic effects.
Upon analyzing their data, the researchers found a clear signal for atomic parity violations, 100 times larger than the similar signal for cesium. With refinements to their experiment, the strength and clarity of the ytterbium signal promise significant advances in the study of weak forces in the nucleus.
Watching the weak force at work
The Budker group’s experiments are expected to expose how the weak charge changes in different isotopes of ytterbium, whose nuclei have the same number of protons but different numbers of neutrons, and will reveal how weak currents flow within these nuclei.
The results will also help explain how the neutrons in the nuclei of heavy atoms are distributed, including whether a “skin” of neutrons surrounds the protons in the center, as suggested by many nuclear models.
“The neutron skin is very hard to detect with charged probes, such as by electron scattering,” says Budker, “because the protons with their large electric charge dominate the interaction.”
He adds, “At a small level, the measured atomic parity violation effect depends on how the neutrons are distributed within the nucleus – specifically, their mean square radius. The mean square radius of the protons is well known, but this will be the first evidence of its kind for neutron distribution.”
Measurements of parity violation in ytterbium may also reveal “anapole moments” in the outer shell of neutrons in the nucleus (valence neutrons). As predicted by the Russian physicist Yakov Zel’dovich, these electric currents are induced by the weak interaction and circulate within the nucleus like the currents inside the toroidal winding of a tokamak; they have been observed in the valence protons of cesium but not yet in valence neutrons.
Eventually the experiments will lead to sensitive tests of the Standard Model – the theory that, although known to be incomplete, still best describes the interactions of all the subatomic particles so far observed.
“So far, the most precise data about the Standard Model has come from high-energy colliders,” says Budker. “The carriers of the weak force, the W and Z bosons, were discovered at CERN by colliding protons and antiprotons, a ‘high-momentum-transfer’ regime. Atomic parity violation tests of the Standard Model are very different – they’re in the low-momentum-transfer regime and are complementary to high-energy tests.”
Since 1957, when Zel’dovich first suggested seeking parity violation in atoms by optical means, researchers have come ever closer to learning how the weak force works in atoms. Parity violation has been detected in many atoms, and its predicted effects, such as anapole moments in the valence protons of cesium, have been seen with ever-increasing clarity. With their new experimental techniques and the observation of a large atomic parity violation in ytterbium, Dmitry Budker and his colleagues have achieved a new landmark, moving closer to fundamental revelations about our asymmetric universe on the atomic scale.
Journal reference:
K. Tsigutkin, D. Dounas-Frazer, A. Family, J. E. Stalnaker, V. V. Yashchuk, and D. Budker. Observation of a large atomic parity violation in ytterbium. Physical Review Letters (in press).
Adapted from materials provided by DOE/Lawrence Berkeley National Laboratory.

Quantum Measurements: Common Sense Is Not Enough, Physicists Show


ScienceDaily (July 23, 2009) — In comparison to classical physics, quantum physics predicts that the properties of a quantum mechanical system depend on the measurement context, i.e. whether or not other system measurements are carried out. A team of physicists from Innsbruck, Austria, led by Christian Roos and Rainer Blatt, have for the first time proven in a comprehensive experiment that it is not possible to explain quantum phenomena in non-contextual terms.
The scientists report on their findings in the current issue of Nature.
Quantum mechanics describes the physical state of light and matter and formulates concepts that totally contradict the classical conception we have of nature. Thus, physicists have tried to explain non-causal phenomena in quantum mechanics by classical models of hidden variables, thereby excluding randomness, which is omnipresent in quantum theory. In 1967, however, the physicists Simon Kochen and Ernst Specker proved that measurements have to be contextual when explaining quantum phenomena by hidden variables. This means that the result of one measurement depends on which other measurements are performed simultaneously.
Interestingly, the simultaneous measurements here are compatible and do not disturb each other. The physicists led by Christian Roos and Rainer Blatt from the Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences and the University of Innsbruck have now been able to prove this proposition and rule out non-contextual explanations of quantum theory experimentally. In a series of measurements on a quantum system consisting of two ions they have shown that the measurement of a certain property is dependent on other measurements of the system.
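A standard textbook way to see why measurement values cannot be assigned independently of context is the Peres-Mermin square of two-qubit observables. The sketch below uses it purely as an illustration of the Kochen-Specker argument; it does not claim to reproduce the exact set of observables measured in the Innsbruck experiment.

```python
# The Peres-Mermin square: nine two-qubit observables arranged so that each row and
# each column consists of mutually commuting operators. The three row products and two
# of the column products equal +identity, while one column product equals -identity,
# so no fixed, context-independent assignment of +/-1 values to the nine observables
# can reproduce all six product constraints. (Textbook illustration of the
# Kochen-Specker argument; not necessarily the operator set used in the experiment.)
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kron = np.kron

square = [
    [kron(X, I2), kron(I2, X), kron(X, X)],
    [kron(I2, Y), kron(Y, I2), kron(Y, Y)],
    [kron(X, Y),  kron(Y, X),  kron(Z, Z)],
]

def product_sign(ops):
    """Return +1 or -1 according to whether the operator product is +I or -I."""
    prod = np.linalg.multi_dot(ops)
    return int(round(prod[0, 0].real))

for r in range(3):
    print("row", r, "product:", product_sign(square[r]), "x identity")
for c in range(3):
    col = [square[r][c] for r in range(3)]
    print("col", c, "product:", product_sign(col), "x identity")
```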
Technological headstart
The experiment was carried out by the PhD students Gerhard Kirchmair and Florian Zähringer as well as Rene Gerritsma, a Dutch postdoc at the IQOQI. The scientists trapped a pair of laser-cooled calcium ions in an electromagnetic trap and carried out a series of measurements. "For this experiment we used techniques we had previously designed for building a quantum computer. We had to concatenate up to six quantum gates for this experiment," explains Christian Roos. "We were able to do this because it is only recently that we have been able to perform a quantum gate with high fidelity."
Only last year, a team of scientists led by Rainer Blatt realized an almost error-free quantum gate with a fidelity of 99 %. With this technological headstart, the scientists have now proven comprehensively in an experiment for the first time that the experimentally observed phenomena cannot be described by non-contextual models with hidden variables. The result is independent of the quantum state – it was tested in ten different states. Possible measurement disturbances could be ruled out by the experimental physicists with the help of theoreticians Otfried Gühne and Matthias Kleinmann from the group led by Prof. Hans Briegel at the IQOQI in Innsbruck.
Randomness cannot be excluded
As early as 1935, Albert Einstein, Boris Podolsky and Nathan Rosen questioned whether quantum mechanics is complete in the sense of a realistic physical theory – a criticism now well known in the scientific world as the EPR paradox. In the mid-1960s, John Bell showed that quantum theory cannot be both a realistic and a local theory, which has since been confirmed experimentally. Kochen and Specker's results exclude other theoretical models, but until now it was difficult to provide a convincing experimental proof. Following a proposition by the Spaniard Adán Cabello, the Innsbruck scientists have now successfully proven this point and produced unambiguous results experimentally. The physicists are supported by the Austrian Science Fund (FWF), the European Union, the Federation of Austrian Industry Tyrol, and the Intelligence Advanced Research Projects Activity (IARPA).
Adapted from materials provided by University of Innsbruck, via EurekAlert!, a service of AAAS.