
Saturday, June 6, 2009

Manipulating light on a chip for quantum technologies


An artist's impression of the on-chip quantum metrology experiment (making ultraprecise measurements on chip). Photo by Will Amery, University of Bristol.
(PhysOrg.com) -- A team of physicists and engineers at Bristol University has demonstrated exquisite control of single particles of light — photons — on a silicon chip to make a major advance towards long-sought-after quantum technologies, including super-powerful quantum computers and ultra-precise measurements.
The Bristol Centre for Quantum Photonics has demonstrated precise control of four photons using a microscopic metal electrode lithographically patterned onto a silicon chip.
The photons propagate in silica waveguides — much like in optical fibres — patterned on a silicon chip, and are manipulated with the electrode, resulting in a high-performance miniaturized device.
“We have been able to generate and manipulate entangled states of photons on a silicon chip,” said PhD student Jonathan Matthews, who together with Alberto Politi performed the experiments. “These entangled states are responsible for famously ‘weird’ behaviour arising in quantum mechanics, but are also at the heart of powerful quantum technologies.”
“This precise manipulation is a very exciting development for fundamental science as well as for future quantum technologies,” said Prof Jeremy O’Brien, Director of the Centre for Quantum Photonics, who led the research.
The team reports its results in the latest issue of Nature Photonics [June 2009], a sister journal of the leading science journal Nature, and in a Postdeadline Paper at 'The International Quantum Electronics Conference (IQEC)' on June 4 in Baltimore, USA [IQEC Postdeadline Papers].
Quantum technologies with photons
Quantum technologies aim to exploit the unique properties of quantum mechanics, the physics theory that explains how the world works at microscopic scales.
For example a quantum computer relies on the fact that quantum particles, such as photons, can exist in a “superposition” of two states at the same time — in stark contrast to the transistors in a PC which can only be in the state “0” or “1”.
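In symbols (a standard textbook illustration, not taken from the article): a classical bit is either 0 or 1, while a single qubit can occupy the state

\[ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1, \]

which yields 0 with probability |\alpha|^2 and 1 with probability |\beta|^2 when measured.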
Photons are an excellent choice for quantum technologies because they are relatively noise-free; information can be moved around at the speed of light; and manipulating single photons is easy.
Making two photons “talk” to each other to generate the all-important entangled states is much harder, but Professor O’Brien and his colleagues at the University of Queensland demonstrated this in a quantum logic gate back in 2003 [Nature 426, 264 (2003)].
Last year, the Centre for Quantum Photonics at Bristol showed how such interactions between photons could be realised on a silicon chip, pointing the way to advanced quantum technologies based on photons [Science 320, 646 (2008)].
Photons are also required to “talk” to each other to realise the ultra-precise measurements that harness the laws of quantum mechanics. In 2007 Professor O’Brien and his Japanese collaborators reported such a quantum metrology measurement with four photons [Science 316, 726 (2007)].
Manipulating photons on a silicon chip
“Despite these impressive advances, the ability to manipulate photons on a chip has been missing,” said Mr Politi. “For the last several years the Centre for Quantum Photonics has been working towards building fully functional quantum circuits on a chip to solve these problems,” added Prof O’Brien.
The team coupled photons into and out of the chip, fabricated at CIP Technologies, using optical fibres. Application of a voltage across the metal electrode changed the temperature of the silica waveguide directly beneath it, thereby changing the path that the photons travelled. By measuring the output of the device they confirmed high-performance manipulation of photons in the chip.
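The article does not give the device equations, but the behaviour described matches a textbook Mach-Zehnder interferometer in which a thermo-optic phase shift steers photons between output ports. The sketch below illustrates the idea (the calibration constant phase_per_volt is invented purely for illustration):

    import numpy as np

    def beamsplitter():
        """50/50 beamsplitter (directional coupler) on two waveguide modes."""
        return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

    def mach_zehnder(phi):
        """Beamsplitter, thermo-optic phase phi on one arm, then beamsplitter."""
        return beamsplitter() @ np.diag([np.exp(1j * phi), 1.0]) @ beamsplitter()

    phase_per_volt = 0.8   # hypothetical calibration, radians per volt
    for volts in (0.0, 1.0, 2.0, 3.0):
        U = mach_zehnder(phase_per_volt * volts)
        p = abs(U[0, 0]) ** 2   # probability the photon exits the same port
        print(f"{volts:.1f} V -> P(same port) = {p:.3f}")

The output probability follows sin^2(phi/2), so sweeping the electrode voltage sweeps the photon smoothly from one output port to the other.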
The researchers proved that one of the strangest phenomena of the quantum world, namely “quantum entanglement”, was achieved on-chip with up to four photons. Quantum entanglement of two particles means that the state of either of the particles is not defined, but only their collective state, and results in an instantaneous linking of the particles.
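A standard two-particle example of such a state (textbook background; the four-photon states in the paper are more elaborate) is the Bell state

\[ |\Phi^+\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right), \]

in which neither photon has a definite state of its own, yet the two measurement outcomes always agree.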
This on-chip entanglement has important applications in quantum metrology and the team demonstrated an ultra-precise measurement in this way.
“As well as quantum computing and quantum metrology, on-chip photonic quantum circuits could have important applications in quantum communication, since they can be easily integrated with optical fibres to send photons between remote locations,” said Alberto Politi.
“The really exciting thing about this result is that it will enable the development of reconfigurable and adaptive quantum circuits for photons. This opens up all kinds of possibilities,” said Prof O’Brien.
A commentary on the work that appeared in the same issue [Nature Photonics 3, 317 (2009)] described it as “an important step in the quest for quantum computation” and concluded: “The most exciting thing about this work is its potential for scalability. The small size of the [device] means that far greater complexity is possible than with large-scale optics.”
The other co-author of the paper is Dr André Stefanov, formerly a research fellow in the Centre for Quantum Photonics, and now at the Federal Office of Metrology METAS, Switzerland.
Provided by University of Bristol

Friday, May 29, 2009

Theorists Reveal Path to True Muonium


In this artist's depiction of how experimentalists could create true muonium, an electron (blue) and a positron (red) collide, producing a virtual photon (green) and then a muonium atom, made of a muon (small yellow) and an anti-muon (small purple). The muonium atom then decays back into a virtual photon and then a positron and an electron. Overlaying this process is a figure indicating the structure of the muonium atom: one muon (large yellow) and one anti-muon (large purple). Credit: Graphic: Terry Anderson/SLAC
(PhysOrg.com) -- True muonium, a long-theorized but never-seen atom, might be observed in future experiments, thanks to recent theoretical work by researchers at the Department of Energy's SLAC National Accelerator Laboratory and Arizona State University. True muonium was first theorized more than 50 years ago, but until now no one had uncovered an unambiguous method by which it could be created and observed.
"We don't usually work in this area, but one day we were idly talking about how experimentalists could create exotic states of matter," said SLAC theorist Stanley Brodsky, who worked with Arizona State's Richard Lebed on the result. "As our conversation progressed, we realized 'Gee…we just figured out how to make true muonium.'"
True muonium is made of a muon and an anti-muon, and is distinguished from what's also been called "muonium"—an atom made of an electron and an anti-muon. Both muons and anti-muons are created frequently in nature when energetic particles from space strike the earth's atmosphere. Yet both have a fleeting existence, and their combination, true muonium, decays naturally into other particles in a few trillionths of a second. This makes observation of the exotic atom quite difficult.
In a paper published on Tuesday in Physical Review Letters, Brodsky and Lebed describe two methods by which electron-positron accelerators could detect the signature of true muonium's formation and decay.
In the first method, an accelerator's electron and positron beams are arranged to merge, crossing at a glancing angle. Such a collision would produce a single photon, which would then transform into a single true muonium atom that would be thrown clear of the other particle debris. Because the newly created true muonium atoms would be traveling fast enough for the laws of special relativity to govern their behavior, they would decay much more slowly than they otherwise would, making detection easier.
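Quantitatively, this is ordinary relativistic time dilation: an atom moving at speed v lives longer in the laboratory frame by the Lorentz factor,

\[ \tau_{\text{lab}} = \gamma\,\tau_0, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \]

where \tau_0 is the rest-frame lifetime (a few trillionths of a second for true muonium), so near-light-speed atoms survive many times longer and travel a detectable distance before decaying.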
In the second method, the electron and positron beams collide head-on. This would produce a true muonium atom and a photon, tangled up in a cloud of particle debris. Yet simply by recoiling against each other, the true muonium and the photon would push one another out of the debris cloud, creating a unique signature not previously searched for.
"It's very likely that people have already created true muonium in this second way," Brodsky said. "They just haven't detected it."
In their paper, Lebed and Brodsky also describe a possible, but more difficult, means by which experimentalists could create true tauonium, a bound state of a tau lepton and its antiparticle. The tau was first created at SLAC's SPEAR storage ring, a feat for which SLAC physicist Martin Perl received the 1995 Nobel Prize in physics.
Brodsky attributes the pair's successful work to a confluence of events: various unrelated lectures, conversations and ideas over the years, pieces of which came together suddenly during his conversation with Lebed.
"Once you pull all of the ideas together, you say 'Of course! Why not?' Brodsky said. "That's the process of science—you try to relate everything new to what you already know, creating logical connections."
Now that those logical connections are firmly in place, Brodsky said he hopes that one of the world's colliders will perform the experiments he and Lebed describe, asking, "Who doesn't want to see a new form of matter that no one's ever seen before?"
More information: "Production of the Smallest QED Atom: True Muonium," Physical Review Letters
Source: SLAC National Accelerator Laboratory

Wednesday, May 13, 2009

Ion trap quantum computing


(PhysOrg.com) -- “Right now, classical computers are faster than quantum computers,” René Stock tells PhysOrg.com. “The goal of quantum computing is to eventually speed up the time scale of solving certain important problems, such as factoring and data search, so that quantum computing can not only compete with, but far outperform, classical computing on large scale problems. One of the most promising ways to possibly do this is with ion traps.”
Stock, a post-doc at the University of Toronto, points out that ion-trap quantum computing has made a lot of progress in the last 10 years. “Ions in traps have been one of the most successful physical implementations of quantum computing.” Stock believes that it is possible to use ion-trap quantum computing to create measurement-based quantum computers that could compete with classical computers for very large and complex problems - and even on smaller scale problems. His work on the subject, done with Daniel James, appears in Physical Review Letters: “Scalable, High-Speed Measurement-Based Quantum Computer Using Trapped Ions.”
“One of the most important considerations in quantum computing is the fact that quantum computing scales polynomially, rather than exponentially, as classical computing does.” This polynomial scaling is what makes quantum computing so useful for breaking data encryption. In order to make data encryption more secure, one usually increases the number of bits used. “Because of the exponential scaling, breaking data encryption quickly becomes impossible using standard classical computers or even networks of computers,” Stock explains. “The improved scaling with quantum computers could be one of the biggest threats to data encryption and security.”
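To make the scaling gap concrete, here is a rough back-of-the-envelope comparison (an illustrative sketch, not from Stock's paper: it uses the textbook sub-exponential cost of the general number field sieve for classical factoring and the commonly quoted cubic gate count for Shor's algorithm, ignoring all constant factors):

    import math

    def classical_gnfs_cost(bits):
        """Heuristic GNFS cost: exp((64/9)^(1/3) (ln N)^(1/3) (ln ln N)^(2/3))."""
        ln_n = bits * math.log(2)
        return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                        * math.log(ln_n) ** (2 / 3))

    def shor_cost(bits):
        """Polynomial cost model for Shor's algorithm: O((log N)^3) operations."""
        return bits ** 3

    for bits in (512, 1024, 2048, 4096):
        print(f"{bits:5d}-bit key: classical ~{classical_gnfs_cost(bits):.2e}, "
              f"quantum ~{shor_cost(bits):.2e} operations")

Doubling the key length inflates the classical cost astronomically but only multiplies the cubic quantum cost by eight, which is why simply using longer keys is no defence against a quantum factoring attack.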
While this sounds promising, Stock points out that there are still problems with quantum information processing: “While scaling would be better with quantum computing, current operation of quantum information processing is too slow to even compete with classical computers on large factoring problems that take 5 months to solve.”
The way ion-trap quantum computing works now - or at least is envisioned to work - requires that ions be shuttled back and forth around the trap architecture, which takes time. “As the complexity of problems and the size of the quantum computations to be implemented increase, the time issue becomes even more important. We wanted to figure out how we could change the time scale,” Stock explains. “We found that we could speed up the processing by using an array of trapped ions and by parallelizing entangling operations.”
“Instead of moving ions around,” Stock continues, “you apply a two-ion operation between all neighboring ions at the same time. The created multipartite ‘entangled’ array of ions is a resource for quantum computing.” Actual computing is then based on measurement of ions in the array in a prescribed order and using a slightly different measurement basis for each ion. “In this scheme, it is the time required to read out information from the ions that critically determines the operational time scale of the quantum computer,” Stock says.
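The flavour of such a measurement-based step can be sketched in a few lines of linear algebra (a toy single-qubit example, not a model of the ion-trap hardware; the input state and measurement angle below are arbitrary choices): entangling the logical qubit with a fresh qubit and measuring the first one in a rotated basis enacts a gate on the second, up to a Pauli correction fixed by the random outcome.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    X = np.array([[0, 1], [1, 0]])
    I2 = np.eye(2)

    def rz(theta):
        """Rotation about the z axis."""
        return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

    def mbqc_step(psi, theta, rng):
        """Entangle |psi> with a fresh |+> qubit via controlled-Z, then measure
        qubit 1 in the rotated basis (|0> +/- e^{i*theta}|1>)/sqrt(2)."""
        plus = np.array([1, 1]) / np.sqrt(2)
        state = np.diag([1, 1, 1, -1]) @ np.kron(psi, plus)   # CZ on |psi>|+>
        bases = [np.array([1, np.exp(1j * theta)]) / np.sqrt(2),
                 np.array([1, -np.exp(1j * theta)]) / np.sqrt(2)]
        probs = [np.linalg.norm(np.kron(b.conj(), I2) @ state) ** 2 for b in bases]
        m = rng.choice(2, p=np.array(probs) / sum(probs))     # random outcome
        out = np.kron(bases[m].conj(), I2) @ state            # post-measurement
        return m, out / np.linalg.norm(out)

    rng = np.random.default_rng(1)
    psi = np.array([0.6, 0.8])        # arbitrary normalized input qubit
    m, out = mbqc_step(psi, np.pi / 4, rng)

    # Up to the Pauli correction X^m, the step applied the gate H Rz(-theta):
    expected = np.linalg.matrix_power(X, m) @ H @ rz(-np.pi / 4) @ psi
    print(m, round(abs(np.vdot(expected, out)), 6))   # fidelity 1.0 either way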
Stock describes the measurement component as vital to this model of quantum computing. Instead of exciting the ions and getting them to emit a photon and measuring the photon, Stock and his colleague instead devised a different way in which they were able to measure the quantum bit encoded in a calcium ion. “You can use an ionization process to speed up measurement, since the electron can be extracted faster from the atom than you can get a photon out of an atom. The extracted electron is then guided onto a detector by the ion trap itself.” All of this takes place on a nanosecond time scale. “By speeding up the measurement,” Stock insists, “we can speed up the operation capability of the quantum computer.”
Stock points out that this scheme would be impractical as an everyday replacement for classical computers. “The lattice would have thousands of ions, which would need to be controlled, and carefully stored and protected. It means that the computer would be relatively large and impractical.”
Uses for such a quantum computer are not limited to breaking data encryption. “This process would allow us to take problems of great complexity and still solve them on a humanly possible timescale. This could provide the key to modeling complex systems - especially perhaps in biology - that we can’t solve now. This would be a tremendous advantage over classical computing.”
More information: Stock, René and James, Daniel. “Scalable, High-Speed Measurement-Based Quantum Computer Using Trapped Ions.” Physical Review Letters (2009). Available online: http://link.aps.org/doi/10.1103/PhysRevLett.102.170501 .
Copyright 2009 PhysOrg.com. All rights reserved. This material may not be published, broadcast, rewritten or redistributed in whole or part without the express written permission of PhysOrg.com.

Tuesday, May 12, 2009

Too much entanglement can destroy the power of quantum computers!


Computers that exploit quantum effects appear capable of outperforming their classical brethren. For example, a quantum computer can efficiently factor a whole number, while there is no known algorithm for our modern classical computers to efficiently perform this task [1]. Given this extra computational punch, a natural question to ask is “What gives quantum computers their added computational power?” This question is intrinsically hard—try asking yourself where the power of a traditional classical computer comes from and you will find yourself pondering questions at the heart of the vast and challenging field known as computational complexity. In spite of this, considerable progress has been made in answering the question of when a quantum system is not capable of offering a computational speedup. A particularly compelling story has emerged from the study of entanglement—a peculiar quantum mechanical quality describing the interdependence of measurements made between parts of a quantum system. This work has shown that a quantum system without enough entanglement existing at some point in the process of a computation cannot be used to build a quantum computer that outperforms a classical computer [2]. Since entangled quantum systems cannot be replicated by local classical theories, the idea that entanglement is required for speedup seems very natural. But now two groups [3, 4] have published papers in Physical Review Letters that put forth a surprising result: sometimes too much entanglement can destroy the power of quantum computers!
Both papers focus on a model called the “one-way quantum computer,” which was invented by Hans Briegel and Robert Raussendorf in 2001 [5]. A one-way quantum computation begins with a special quantum state entangled across many quantum subsystems, and the computation proceeds as a measurement is made on each subsystem. The actual form of each of the measurements in the sequence of measurements is determined by the outcome of previous measurements (Fig. 1), and one can think of the measurements as an adaptive program executed on the substrate of the entangled quantum state. A particularly nice property of the one-way quantum computing model is that it separates quantum computing into two processes—the preparation of a special initial quantum state and a series of adaptive measurements. In this way we may view the initial quantum state as a resource that can boost localized measurements and classical computation up into quantum realms. Investigations have revealed numerous quantum states that can be used as the special initial state to build a fully functioning quantum computer. But how special is this initial quantum state? Will any entangled quantum state do?
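In the usual presentation of the model (standard background, not specific to either paper), the special initial state is a cluster or graph state: every qubit starts in |+\rangle = (|0\rangle + |1\rangle)/\sqrt{2} and a controlled-Z gate is applied across every edge of a chosen graph,

\[ |C\rangle \;=\; \prod_{(i,j)\in E} CZ_{ij}\, |+\rangle^{\otimes n}. \]

The computation then consumes this entanglement one measurement at a time.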
The two papers approach this problem from slightly different perspectives, but both arrive at convincing answers to these questions. David Gross at Technische Universität Braunschweig in Germany, Steven Flammia at the Perimeter Institute for Theoretical Physics in Waterloo, Canada, and Jens Eisert at the University of Potsdam, Germany, pursue this question directly in terms of entanglement [3]. They first show that if a certain quantification of entanglement—known as the geometric measure of entanglement—is too large, then any scheme that mimics the one-way quantum computation model cannot outperform classical computers. In fact, they show that the measurements in this case could be replaced by randomly flipping a coin, without significantly changing the effect of the computation. Thus while these states have a large amount of entanglement, they cannot be used to build a one-way quantum computer. Gross, Flammia, and Eisert also show that if one picks a random quantum state, it will, with near certainty, be a state that has a high value of geometric entanglement. The random states they consider are drawn via a probability distribution known as the Haar measure, which is the probability distribution that arises naturally when one insists that the probability of drawing a particular state not depend in any way on the basis of states one uses to describe a quantum system. Gross et al.’s findings show that not only do states that are too entangled to allow one-way quantum computation exist, they are actually generic among all quantum states.
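For concreteness, one common definition of this measure for an n-party state (conventions in the literature differ by normalization) is

\[ E_g(|\psi\rangle) \;=\; -\log_2 \, \sup_{|\varphi\rangle = |\varphi_1\rangle\otimes\cdots\otimes|\varphi_n\rangle} \big|\langle\varphi|\psi\rangle\big|^2 , \]

the logarithm of the overlap with the closest unentangled product state: the smaller the best product-state overlap, the more entangled the state.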
Michael J. Bremner and Andreas Winter of the University of Bristol in the UK and Caterina Mora at the University of Waterloo in Canada take a slightly different route to finding states that are not useful for one-way quantum computation [4]. They begin by showing that a random quantum state (again drawn from the Haar measure) is not useful for one-way quantum computation with high probability, confirming the result of Gross et al. But they also show it is possible to choose a random quantum state from an even smaller class of states than the completely random quantum states and still end up with a state not useful for one-way quantum computation. This more limited class of states has even less entanglement (though still quite a lot) than those considered by Gross et al., but they can still be useless for one-way quantum computation.
The bottom line is that entanglement, like most good things in life, must be consumed in moderation. For the one-way quantum computation model, a randomly chosen initial state followed by adaptive measurements is not going to give you a quantum computer. Part of the reason for this, as revealed by Gross et al., is that a randomly chosen initial state has too much geometric entanglement. But even states with less entanglement may be useless for one-way quantum computation. All is according to the color of the crystal through which you look, however; one may naturally ask: What do all of these statements about the power of initial random quantum states have to do with the real world? It is thought, for example, that perfectly random quantum states (drawn from the Haar measure) cannot be produced efficiently on a quantum computer. So, while it may be that a perfectly random quantum state isn’t useful for one-way quantum computation, maybe the states that exist in nature, which can be constructed efficiently, actually are useful. It is known, for example, that the ground states of certain chains of interacting spins can be used for one-way quantum computation. A recent preprint by Richard Low [6] hints, however, that even states that exist in nature might also be in the class of useless states considered by Gross et al. and Bremner et al. In particular, Low has shown that there is a way to efficiently construct a class of entangled random quantum states that are not useful for one-way quantum computation. Thus the kinds of generic situations that both groups consider should not be ruled out because there is no physical model that efficiently prepares these states: quantum states that are impotent for one-way quantum computation may be the norm and not the exception. The implications of this for the viability of one-way quantum computation are probably not dire, but it does point out how special the states that can be useful for this model need to be—as well as the clever thinking needed to think this model up in the first place.
Finally, one can take a step back and ask “What are the implications of these results for understanding the source of the power of quantum computation?” Entanglement, in quite a real sense, is not the full answer to this question. The results of these two papers drill a deeper hole into the view of those who believe that the largeness of entanglement, and of entanglement alone, should be the useful discriminating factor between quantum and classical computation. From the perspective of theoretical computer science, this is not too surprising. One of the big open questions in this field is whether what is efficiently computable on a classical computer is the same as what is efficiently computable on a computer that operates according to different laws of the universe—a universe where a computer can nondeterministically branch (in computer science, this is known as the P versus NP question). This latter nondeterminism isn’t the kind a physicist normally thinks about. Instead it is a nondeterminism in which one can select out which of the nondeterministic branches of a universe one wishes to live in. This nondeterminism is not the way in which our universe appears to work, but it is one way the world could work (i.e., a possible set of laws of physics).
Trying to understand why our classical computers cannot efficiently compute what could be efficiently computed in these nondeterministic worlds is the holy grail of computer science research. The failure to solve this problem is similar to saying there is no known way to write down a quantity that succinctly quantifies why modern computers are different from computers that exist in the nondeterministic world. We should not be surprised, then, if there is no way to write down a quantity that quantifies why a quantum computer is powerful. After all, quantum physics is just another set of laws that operate differently than classical laws. While it is easy to view this through a negative lens, in actuality it should provide the wind behind research into quantum algorithms: there is still much to be discovered about where quantum computers might offer computational advantages over classical computers. Just be aware that creating too much entanglement followed by a series of measurements may not be the best way to get the answer.

Sunday, May 10, 2009

Post-Quantum Correlations: Exploring the Limits of Quantum Nonlocality

This figure shows levels of nonlocality as measured by the CHSH Bell inequality. Classical correlations (green) are at 2 and below; quantum nonlocal correlations (red) are above 2 but below Tsirelson’s bound (BQ); and post-quantum nonlocal correlations (light blue) are above and, in some cases, below Tsirelson’s bound. BCC marks the “bound of triviality,” above which correlations are unlikely to exist. In the current study, scientists found that post-quantum correlated nonlocal boxes (dark blue line) are also unlikely to exist, despite some boxes being arbitrarily close to being classical. Image credit: Brunner and Skrzypczyk. ©2009 APS.


(PhysOrg.com) -- When it comes to nonlocal correlations, some correlations are more nonlocal than others. Nonlocal correlations (for example, quantum entanglement), which have been a subject of study for several decades, exist between two objects when they can somehow directly influence each other even when separated by a large distance. Because these correlations require “passion-at-a-distance” (a term coined by physicist Abner Shimony), they violate the principle of locality, which states that nothing can travel faster than the speed of light (even though quantum correlations cannot be used to communicate faster than the speed of light). Besides being a fascinating phenomenon, nonlocality can also lead to powerful techniques in computing, cryptography, and information processing.
Quantum Limits
Despite advances in quantum research, physicists still don’t fully understand the fundamental nature of nonlocality. In 1980, mathematician Boris Tsirelson found that quantum correlations are bounded by an upper limit; quantum nonlocality is only so strong. Later, in 1994, physicists Sandu Popescu and Daniel Rohrlich made another surprising discovery: a particular kind of correlation might exist above the “Tsirelson bound,” as well as below the bound, in a certain range (see image). These so-called post-quantum correlations are therefore “more nonlocal” than quantum correlations.
“Tsirelson's bound represents the most nonlocal ‘boxes’ that can be created with quantum mechanics,” Nicolas Brunner, a physicist at the University of Bristol, told PhysOrg.com. “Nonlocality here is measured by the degree of violation of a Bell inequality. So, quantum non-locality appears to be limited. The big question is why. That is, is there a good physical reason why post-quantum correlations don’t seem to exist in nature?”
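For reference, the CHSH quantity referred to throughout is the correlation sum

\[ S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \]

where E(a,b) is the correlation of the two parties' outcomes for measurement settings a and b. Local classical models obey |S| \le 2, quantum mechanics obeys |S| \le 2\sqrt{2} \approx 2.83 (Tsirelson's bound), and general no-signalling boxes can reach |S| = 4.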
In a recent study, Brunner and coauthor Paul Skrzypczyk, also of the University of Bristol, propose an explanation for why post-quantum correlations are unlikely to exist, which may reveal insight into why quantum nonlocality is bounded, as well as into the underlying difference between quantum and post-quantum correlations.
In their study, Brunner and Skrzypczyk have shown that a certain class of post-quantum correlations is unlikely to exist because it makes communication complexity trivial. This triviality arises because the nonlocality of these correlations can be enhanced beyond a critical limit - surprisingly, even though some of these correlations are arbitrarily close to classical correlations (they give an arbitrarily small violation of Bell’s inequality). As previous research has suggested, any theory in which communication complexity is trivial is very unlikely to exist.
Beyond Quantum
“‘Post-quantum’ means beyond quantum,” Brunner explained. “This term applies to correlations, which are conveniently - and probably most simply - described by ‘black boxes.’ The basic idea is the following: imagine a black box shared by two distant parties, Alice and Bob; each party is allowed to ask a question to the box (or make a measurement on the box, if you prefer) and then gets an answer (a measurement outcome). By repeating this procedure many times, and at the end comparing their respective results, Alice and Bob can identify what their box is doing. For instance, it could be that the outcomes are always the same whenever Alice and Bob choose the same questions. This kind of behavior is a correlation; knowing one outcome, it is possible to deduce the other one, since both outcomes are correlated.
“Now, it happens that there exist different types of correlations; basically those that can be understood with classical physics (where correlations originate from a common cause), and those that cannot. This second type of correlation is called nonlocal, in the sense that it cannot be explained by a common cause. A priori it is not obvious to tell whether some correlations are local or not. The way physicists can tell this is by testing a Bell inequality; when a Bell inequality is violated, then the correlations cannot be local; that is, there cannot exist a common cause to these correlations.
“Now, an amazing thing about quantum mechanics is that it allows one to construct boxes that are non-local. This is quantum nonlocality. Now, it happens that not all nonlocal boxes can be constructed in quantum mechanics. Thus there exist correlations which are unobtainable in quantum mechanics. These are called post-quantum correlations. In general, post-quantum correlations can be above Tsirelson’s bound, but in some very specific cases, they can also be below.”
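The strongest such post-quantum box is the Popescu-Rohrlich box discussed below, defined by the rule that the two outputs, each individually random, always satisfy a XOR b = x AND y. No physical system is known to implement it, but its statistics are easy to simulate once both outputs are generated in one place (a toy sketch):

    import random

    def pr_box(x, y):
        """Popescu-Rohrlich box: a is uniformly random, and a XOR b = x AND y."""
        a = random.randint(0, 1)
        return a, a ^ (x & y)

    def correlator(x, y, trials=100_000):
        """Estimate E(x, y) = P(a = b) - P(a != b)."""
        s = sum(1 if a == b else -1
                for a, b in (pr_box(x, y) for _ in range(trials)))
        return s / trials

    S = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
    print(f"CHSH value: {S:.3f}")   # ~4: above 2 (classical) and 2.83 (quantum)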
‘Distilling’ Post-Quantum Nonlocality
To demonstrate that these post-quantum correlations are unlikely to exist in nature, Brunner and Skrzypczyk developed a protocol for deterministically distilling nonlocality in post-quantum states. That is, the technique refines weakly nonlocal states into states with greater nonlocality. In this context, “distillation” can also be thought of as “purifying,” “amplifying,” or “maximizing” the nonlocality of post-quantum correlations. Since nonlocal correlations are more useful if they are stronger, maximizing nonlocality has significant implications for quantum information protocols. The physicists’ protocol works specifically with “correlated nonlocal boxes,” which are a particular class of post-quantum boxes.
Brunner and Skrzypczyk’s distillation protocol builds on a recent breakthrough by another team (Forster et al.), who presented the first nonlocality distillation protocol just a few months ago. However, the Forster protocol can distill correlated nonlocal boxes only up to a certain point, violating a Bell inequality called the Clauser-Horne-Shimony-Holt (CHSH) inequality only up to CHSH = 3. While this value is greater than Tsirelson’s bound of 2.82, it does not reach the bound of 3.26, which marks the point at which communication complexity becomes trivial.
Taking a step forward, Brunner and Skrzypczyk’s protocol can distill nonlocality all the way up to the maximum nonlocality of the Popescu-Rohrlich box, which is 4. In passing the 3.26 bound of triviality, they show that these post-quantum correlated nonlocal boxes do indeed collapse communication complexity.
The distillation protocol is executed by two distant parties that share two weakly correlated nonlocal boxes. Each party can input one bit into a box to receive one output bit, simulating a binary input/binary output system with local operations. As the scientists explain, a distillation protocol can be viewed as a way of classically wiring the two boxes together. The protocol is a choice of four wirings, one for each input of Alice and Bob. The wiring (algorithm) that determines the output bits of the boxes will transform the two nonlocal boxes into a single correlated nonlocal box, which has stronger nonlocality than the two individual boxes.
Importantly, this protocol can distill any correlated nonlocal box whose CHSH value lies below the 3.26 limit to one whose value exceeds 3.26. In other words, any correlated nonlocal box that has not previously made communication complexity trivial can be made to do so. Surprisingly, some of these boxes can even be arbitrarily close to being classical (with a CHSH value below or equal to 2), and yet, since they can be distilled beyond the “bound of triviality,” they still collapse communication complexity. According to previous studies of triviality, such boxes are very unlikely to exist - even those below Tsirelson’s bound.
Trivial Complexity
Theoretically, when communication complexity is trivial, even the most complex problems can be solved with a minimum amount of communication. In the following example, Brunner explains what would happen in real life if a single bit of information could solve any problem.
“Communication complexity is an information-theoretic task,” Brunner said. “Here is an example. Suppose you and I would like to meet during the next year; given our respective agendas, we would like to know whether there is a day on which both of us are free or whether there is not; it doesn’t matter what that day is, we just want to know whether there is such a day or not.

“Since we are in distant locations, we must send each other some information to solve the problem. For instance, if I send you the whole information about my agenda, then you could find out whether a meeting is possible or not (and so solve the problem). But that implies that I should send you a significant quantity of information (many bits). It turns out that in classical physics (or, if you prefer, in everyday life), there is no better strategy; I really have to send you all that information. In quantum physics, even though there exist stronger correlations than in classical physics (quantum nonlocal correlations), I would still have to send you an enormous amount of communication.
“Now, the really astonishing thing is that, if you have access to certain post-quantum correlations (post-quantum boxes), a single bit of communication is enough to solve this problem! In other words, communication complexity becomes trivial in these theories, since one bit of communication is enough to solve any problem like this one. Importantly, in classical or quantum physics, communication complexity is not trivial. More generally, for computer scientists, a world in which communication complexity becomes trivial is highly unlikely to exist. Previously, it was known that post-quantum boxes with a very high degree of violation of a Bell inequality make communication complexity trivial; now, the astonishing thing about our result is that we show that some correlations with a very small degree of violation of a Bell inequality - but indeed not accessible with quantum mechanics - can also make communication complexity trivial.”
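Brunner's agenda problem is known in computer science as set disjointness. A simpler function with the same flavour, the inner product of two n-bit strings, shows concretely how shared Popescu-Rohrlich boxes collapse the communication to a single bit (a sketch of van Dam's well-known argument, not of the specific construction in Brunner and Skrzypczyk's paper):

    import random

    def pr_box(x, y):
        """Popescu-Rohrlich box: a is uniformly random, and a XOR b = x AND y."""
        a = random.randint(0, 1)
        return a, a ^ (x & y)

    def inner_product_one_bit(x_bits, y_bits):
        """Compute IP(x, y) = XOR of all x_i AND y_i with ONE transmitted bit,
        using one shared PR box per bit position."""
        outs = [pr_box(x, y) for x, y in zip(x_bits, y_bits)]
        message = 0
        for a, _ in outs:
            message ^= a        # Alice XORs her outputs: the single bit she sends
        result = message
        for _, b in outs:
            result ^= b         # Bob XORs in his outputs to recover IP(x, y)
        return result

    n = 16
    x = [random.randint(0, 1) for _ in range(n)]
    y = [random.randint(0, 1) for _ in range(n)]
    expected = 0
    for xi, yi in zip(x, y):
        expected ^= xi & yi
    print(inner_product_one_bit(x, y) == expected)   # always True

Classically, computing the inner product exactly requires on the order of n bits of communication, so the single-bit protocol above is a genuine collapse.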
Post-Quantum Future
In the future, Brunner and Skrzypczyk hope to find improved distillation protocols that might work for a wider variety of post-quantum nonlocal boxes, not only correlated nonlocal boxes. More research is also needed to explain why quantum correlations cannot exist in the gap between Tsirelson’s bound and the bound of triviality. Ultimately, this line of research could help make a distinction between quantum and post-quantum theories, with important theoretical implications.
“The greatest implications of our results are the following,” Brunner said. “First, they give new evidence that certain post-quantum theories allow for a dramatic increase in communication power compared to quantum mechanics, and therefore appear very unlikely to exist in nature. The nice thing, in particular, is that some of these theories allow only for little nonlocality (as measured by the degree of violation of a Bell inequality). Thus our result is a striking demonstration that we still have no clue on how to correctly measure nonlocality. Finally, it is one step further towards an information-theoretic axiom for quantum mechanics.”
More information: Nicolas Brunner and Paul Skrzypczyk. “Nonlocality Distillation and Postquantum Theories with Trivial Communication Complexity.” Physical Review Letters 102, 160403 (2009).
Copyright 2009 PhysOrg.com. All rights reserved. This material may not be published, broadcast, rewritten or redistributed in whole or part without the express written permission of PhysOrg.com.

Saturday, October 13, 2007

Not Just Science Fiction: 'Electromagnetic Wormhole' Possible, Say Mathematicians



Science Daily — The team of mathematicians that first created the mathematics behind the "invisibility cloak" announced by physicists last October has now shown that the same technology could be used to generate an "electromagnetic wormhole."
In the study, which is to appear in the Oct. 12 issue of Physical Review Letters, Allan Greenleaf, professor of mathematics at the University of Rochester, and his coauthors lay out a variation on the theme of cloaking. Their results open the possibility of building a sort of invisible tunnel between two points in space.
"Imagine wrapping Harry Potter's invisibility cloak around a tube," says Greenleaf. "If the material is designed according to our specifications, you could pass an object into one end, watch it disappear as it traveled the length of the tunnel, and then see it reappear out the other end."
Current technology can create objects invisible only to microwave radiation, but the mathematical theory allows for the wormhole effect for electromagnetic waves of all frequencies. With this in mind, Greenleaf and his coauthors propose several possible applications. Endoscopic surgeries in which the surgeon is guided by MRI are problematic because the intense magnetic fields generated by the MRI scanner affect the surgeon's tools, and the tools can distort the MRI images. Greenleaf says, however, that passing the tools through an EM wormhole could effectively hide them from the fields, allowing only their tips to be "visible" at work.
To create cloaking technology, Greenleaf and his collaborators use theoretical mathematics to design a device to guide the electromagnetic waves in a useful way. Researchers could then use these blueprints to create layers of specially engineered, light-bending, composite materials called metamaterials.
Last year, David R. Smith, professor of electrical and computer engineering at Duke's Pratt School, and his coauthors engineered an invisibility device as a disk, which allowed microwaves to pass around it. Greenleaf and his coauthors have now employed more elaborate geometry to specify exactly what properties are demanded of a wormhole's metamaterial in order to create the "invisible tunnel" effect. They also calculated what additional optical effects would occur if the inside of the wormhole was coated with a variety of hypothetical metamaterials.
Assuming that your vision was limited to the few frequencies at which the wormhole operates, looking in one end you'd see a distorted view out the other end, according to the simulations by Greenleaf and his coauthors. Depending on the length of the tube and how often the light bounced around inside, you might see just a fisheye view out the other end, or you might see an Escher-like jumble.
Greenleaf and his coauthors speculated on one use of the electromagnetic wormhole that sounds like something out of science fiction. If the metamaterials making up the tube were able to bend all wavelengths of visible light, they could be used to make a 3D television display. Imagine thousands of thin wormholes sticking up out of a box like a tuft of long grass in a vase. The wormholes themselves would be invisible, but their ends could transmit light carried up from below. It would be as if thousands of pixels were simply floating in the air.
But that idea, Greenleaf concedes, is a very long way off. Even though the mathematics now says that it's possible, it's up to engineers to apply these results to create a working prototype.
Greenleaf's coauthors are Matti Lassas, professor of mathematics at the Helsinki University of Technology; Yaroslav Kurylev, professor of mathematics at University College London; and Gunther Uhlmann, Walker Family Endowed Professor of Mathematics at the University of Washington.
Note: This story has been adapted from material provided by University of Rochester.

Fausto Intilla