Thursday, June 26, 2014

Way to Boot up Quantum Computers 72 Times Faster Than Previously Possible



Press the start button, switch on the monitor, grab a cup of coffee and off you go. That is pretty much how most of us experience booting up a computer. But with a quantum computer the situation is very different. So far, researchers have had to spend hours making dozens of adjustments and fine calibrations in order to set up a chip with just five quantum bits so that it can be used for experimental work. (One quantum bit or 'qubit' is the quantum physical equivalent of a single bit in a conventional computer.) Even small errors in the adjustment and calibration procedure would leave the chip unusable.





The problem is that, not unlike musical instruments, quantum computers react to small changes in the local environment. If, for example, it is a little warmer or a little colder, or if the ambient air pressure is a little higher or a little lower than the day before, then the complex network of qubits will no longer function – the computer is detuned and has to be readjusted before it can be used. 'Up until now, experimental quantum physicists have had to sit down each day and see how conditions have changed compared to the day before. They then had to remeasure each parameter and carefully recalibrate the chip,' explains Frank Wilhelm-Mauch, Professor of Theoretical Quantum and Solid-State Physics at Saarland University. Only a very small error rate of less than 0.1 percent is permissible when measuring ambient conditions. Wilhelm-Mauch explains this sensitivity thus: 'That means that an error can occur in only one in a thousand measurements. If just two in a thousand measurements are in error, the software will be unable to correct for the errors and the quantum computer will not operate correctly.' With around 50 different parameters involved in the calibration process, one begins to get an idea of the sheer effort involved in calibrating a quantum computer.


Working together with his doctoral student Daniel Egger, Wilhelm-Mauch began to consider a fundamentally new approach to the problem. 'We asked ourselves the question: Why is it necessary each and every day to understand how conditions differ from those of the day before? The answer we eventually came up with was that it isn't necessary. What's important is that the setup procedure produces the right results. Why it produces the right results is not so relevant.' It was this pragmatic approach that underlay the work carried out by Wilhelm-Mauch and Egger. 'For the calibration procedure we used an algorithm from engineering mathematics, strictly speaking from the field of civil and structural engineering, as that's another area in which experiments are costly,' explains Wilhelm-Mauch.


Using this technique, the two theoreticians were able to reduce the calibration error rate to below the required 0.1 percent threshold, while at the same time speeding up the calibration process from six hours to five minutes. The Saarbrücken methodology, which goes under the name Ad-HOC (Adaptive Hybrid Optimal Control), has now been subjected to rigorous testing by a group of experimental physicists from the University of California, Santa Barbara. Their experimental work is published in the issue of Physical Review Letters that also contains the Saarbrücken paper.
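To make the idea concrete, the closed-loop part of such a calibration can be treated as black-box minimization: the chip is a function mapping parameter settings to a measured error rate, and a derivative-free optimizer drives that rate down without needing to understand why conditions changed. The sketch below is illustrative only — the synthetic error function, the eight-parameter search space and the optimizer settings are assumptions for demonstration, not details taken from the Ad-HOC paper.

```python
# Illustrative sketch: closed-loop calibration as black-box minimization.
# A synthetic quadratic-plus-noise function stands in for the real hardware,
# which would return a measured gate-error rate for each parameter setting.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
true_optimum = rng.uniform(-1, 1, size=8)   # 8 of the ~50 parameters, for brevity

def measured_error(params):
    """Pretend hardware measurement: error rate vs. parameter settings."""
    drift = 0.02 * np.sum((params - true_optimum) ** 2)  # miscalibration penalty
    noise = 1e-4 * rng.standard_normal()                 # measurement shot noise
    return 1e-3 + drift + abs(noise)                     # floor near 0.1 percent

result = minimize(measured_error, x0=np.zeros(8), method="Nelder-Mead",
                  options={"xatol": 1e-3, "fatol": 1e-5, "maxiter": 2000})
print(f"final error rate: {result.fun:.2%} after {result.nfev} measurements")
```

Because the optimizer only ever queries the measured error, the same loop runs unchanged whether or not conditions drifted overnight — which is precisely the pragmatic point the researchers make.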


This development is of major importance for future experimental research into quantum computing. Physicists in quantum computing laboratories no longer have to spend hours every day preparing their system for just a short period of experimental work. 'Many of the parameters, such as temperature, light and air pressure, do not remain stable during a long calibration phase, which further shortens the time window in which the chip runs error-free and can therefore be used for experiments,' says Wilhelm-Mauch, adding that the new method is scalable. Up until now, technical constraints have meant that experiments have been carried out using a single chip housing five qubits that perform the actual computational operations. The new method, in contrast, is not restricted to chips of this magnitude and can be applied to quantum processors of almost any size.


Frank Wilhelm-Mauch jokingly points out another appealing feature of the new methodology: 'Unlike the previous approach of manual calibration, our method is fully automated. The researcher really does just push a button like on a conventional computer. They can then go off to get themselves a coffee while the quantum computer boots up.' A major improvement in the life of experimental research scientists working in the field.



###

Background information on quantum technology:


The fundamental principle of quantum technology is that a particle (e.g. an atom, electron or photon) can be in two quantum-mechanical states at the same time. This is referred to as a superposition of states. In a conventional computer, information is represented by bits with each bit assuming either the value 0 or 1. In a quantum computer, in contrast, information is carried by quantum bits (or 'qubits') with each qubit able to assume the values 0 or 1 or any combination ('superposition') of the two. One way of realizing a quantum computer is with a memory unit composed of atoms whose quantum states can be excited and manipulated in a controlled manner using laser light. Computational operations can then be performed simultaneously (or 'in parallel') on both parts of the superposition state (1 and 0). In the time it takes for a 32-bit conventional computer to process one of its 2 to the power of 32 possible states, a quantum computer can process all of these states in parallel. The quantum computer can therefore carry out computations orders of magnitude faster than a normal computer. However, quantum computing power can only be exploited for special problems for which appropriate quantum algorithms have been developed.
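As a minimal illustration of that scaling, the state of an n-qubit register is a vector of 2^n complex amplitudes, and putting every qubit into superposition spreads the state over all of them at once. The short sketch below (Python with NumPy, purely illustrative) builds the uniform superposition for three qubits and prints how quickly the amplitude count grows.

```python
# Minimal sketch of the state-space scaling described above: an n-qubit
# register is described by 2**n complex amplitudes.
import numpy as np

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                            # start in |000>

hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
op = hadamard
for _ in range(n - 1):                    # H on every qubit: H (x) H (x) H
    op = np.kron(op, hadamard)

state = op @ state                        # equal superposition of all 2**n states
print(np.round(state.real, 3))            # eight amplitudes, each 1/sqrt(8)

for bits in (32, 300):
    print(f"{bits} qubits -> {2.0 ** bits:.3g} amplitudes")
```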


In many of the superposition states, the quantum bits are 'entangled', which means that the superposition can only be described as a whole and not in terms of the independent states of the particles involved. However, both superposed and entangled states are highly sensitive to any interaction with their environment and rapidly lose their quantum character. For quantum computing, this means that a great deal of effort has to be put into screening the system from environmental influences. In another area of quantum technology, this sensitivity to environmental factors is being specifically exploited. In the field of quantum communication, confidential information can be encoded in the form of entangled or superposed states. Anyone endeavouring to access the information would end up destroying the quantum state and the attempted interception would be discovered.
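The statement that an entangled superposition 'can only be described as a whole' can be made quantitative in a few lines: tracing out one particle of a maximally entangled pair leaves a completely mixed state, so no pure state describes either particle on its own. The sketch below is an illustration of that textbook fact, not a calculation from the release.

```python
# Trace out one qubit of a Bell state: the remainder is maximally mixed
# (purity 1/2), showing the pair has no description as independent states.
import numpy as np

bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)        # (|00> + |11>) / sqrt(2)

rho = np.outer(bell, bell.conj())         # density matrix of the pair
rho = rho.reshape(2, 2, 2, 2)
rho_a = np.trace(rho, axis1=1, axis2=3)   # partial trace over the second qubit

print(np.round(rho_a.real, 3))            # identity / 2: maximally mixed
print("purity:", np.trace(rho_a @ rho_a).real)  # 0.5 here; 1.0 for a product state
```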


More information on research into quantum computing at Saarland University is provided in the following press release (German only): http://idw-online.de/de/news570132


The research work "Adaptive Hybrid Optimal Quantum Control for Imprecisely Characterized Systems" was published on 20 June in the journal Physical Review Letters (DOI: 10.1103/PhysRevLett.112.240503).


Press photographs are available at http://www.uni-saarland.de/Pressefotos and can be used at no charge. Please read and comply with the conditions of use.


Questions can be addressed to:


Prof. Dr. Frank Wilhelm-Mauch
Tel.: +49 (0)681 302-3960
E-mail: fwm@physik.uni-saarland.de


Note for radio journalists: Studio-quality telephone interviews can be conducted with researchers at Saarland University using broadcast audio IP codec technology (IP direct dial or via the ARD node 106813020001). Interview requests should be addressed to the university's Press and Public Relations Office (+49 (0)681 302-2601 or -64091).


News Release Source : Physicists find way to boot up quantum computers 72 times faster than previously possible


Image Credit : Erik Lucero/UCSB

Ultra Thin Wires for Quantum Computing



Carefully fabricating nanofibers by heating and pulling may make for highly efficient, optics-based, low-power atom traps

WASHINGTON D.C., June 17, 2014 - Take a fine strand of silica fiber, attach it at each end to a slow-turning motor, gently torture it over an unflickering flame until it just about reaches its melting point and then pull it apart. The middle will thin out like a piece of taffy until it is less than half a micron across -- about 200 times thinner than a human hair.





That, according to researchers at the Joint Quantum Institute at the University of Maryland, is how you fabricate ultrahigh transmission optical nanofibers, a potential component for future quantum information devices, which they describe in AIP Publishing's journal AIP Advances.


Quantum computers promise enormous power, but are notoriously tricky to build. To encode information in qubits, the fundamental units of a quantum computer, the bits must be held in a precarious position called a superposition of states. In this fragile condition the bits exist in all of their possible configurations at the same time, meaning they can perform multiple parallel calculations.


The tendency of qubits to lose their superposition state too quickly, a phenomenon known as decoherence, is a major obstacle to the further development of quantum computers and any device dependent on superpositions. To address this challenge, researchers at the Joint Quantum Institute proposed a hybrid quantum processor that uses trapped atoms as the memory and superconducting qubits as the processor, as atoms demonstrate relatively long superposition survival times and superconducting qubits perform operations quickly.


"The idea is that we can get the best of both worlds," said Jonathan Hoffman, a graduate student in the Joint Quantum Institute who works in the lab of principal investigators Steven Rolston and Luis Orozco. However, a problem is that superconductors don't like high optical power or magnetic fields and most atomic traps use both, Hoffman said.


This is where the optical nanofibers come in: The Joint Quantum Institute team realized that nanofibers could create optics-based, low-power atom traps that would "play nice" with superconductors. Because the diameter of the fibers is so minute -- 530 nanometers, less than the wavelength of light used to trap atoms -- some of the light leaks outside of the fiber as a so-called evanescent wave, which can be used to trap atoms a few hundred nanometers from the fiber surface.
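For a rough sense of the length scales, the field outside a subwavelength waveguide decays as exp(-z/d) with d = λ / (2π·√(n_eff² − 1)) in vacuum. The numbers below — a 1064 nm trapping wavelength and an effective mode index of about 1.2 — are assumptions chosen for illustration, not values from the release, but they land in the few-hundred-nanometer range the researchers describe.

```python
# Back-of-envelope evanescent decay length (illustrative numbers only).
import math

wavelength_nm = 1064   # assumed trapping wavelength
n_eff = 1.2            # assumed effective mode index of the nanofiber

decay_nm = wavelength_nm / (2 * math.pi * math.sqrt(n_eff ** 2 - 1))
print(f"evanescent 1/e decay length: {decay_nm:.0f} nm")   # ~255 nm
```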


Hoffman and his colleagues have worked on optical nanofiber atom traps for the past few years. Their AIP Advances paper describes a new procedure they developed that maximizes the efficiency of the traps through careful and precise fabrication methods.


The group's procedure, which reduces transmission losses by two orders of magnitude compared with previous work, focuses on intensive preparation and cleaning of the environment in which the nanofibers are created before pulling.


In the fabrication process, the fiber is brushed through the flame as it is pulled apart and tapered down, which prevents the formation of air currents that can cause inconsistencies in diameter. The flame source is a mixture of hydrogen and oxygen gas in a precise two-to-one ratio, ensuring that water vapor is the only byproduct. The motors are controlled by an algorithm, based on the existing work of a group in Vienna, that calculates the trajectories of the motors to produce a fiber of the desired length and profile.
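Trajectory algorithms of this kind build on the standard heat-and-pull taper model, in which a constant hot zone of length L shrinks the waist radius exponentially with elongation: dr/dx = −r/(2L), so reaching radius r from r0 takes a pull of x = 2L·ln(r0/r). The sketch below uses assumed numbers (a 4 mm hot zone, standard 125-micrometer fiber) purely for illustration.

```python
# Constant-hot-zone taper model: elongation needed to reach a target waist.
# The hot-zone length is an assumed illustrative value.
import math

L_mm = 4.0             # assumed hot-zone length
r0_nm = 62_500         # standard fiber: 125 um diameter, 62.5 um radius
r_target_nm = 265      # 530 nm waist diameter, as in the release

pull_mm = 2 * L_mm * math.log(r0_nm / r_target_nm)
print(f"required elongation: {pull_mm:.1f} mm")   # ~44 mm of pulling
```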


Previous pulling methods, such as carbon dioxide lasing and chemical etching, were limited by the insufficient diameter of the laser beam and by a lesser degree of control over the tapering length, respectively.


Future work includes interfacing the trapped atoms with the superconducting circuits, which are held at 10 millikelvin in a dilution refrigerator, as well as guiding more complicated optical field patterns (higher-order modes) through the fiber and using these to trap atoms.



###

The article, "Ultrahigh transmission optical nanofibers," is authored by J.E. Hoffman, S. Ravets, J.A. Grover, P. Solano, P.R. Kordell, J.D. Wong-Campos, L.A. Orozco and S.L. Rolston. It will be published in AIP Advances on June 17, 2014 (DOI: . After that date, it may be accessed at: http://scitation.aip.org/content/aip/journal/adva/4/6/10.1063/1.4879799


ABOUT THE JOURNAL


AIP Advances is a fully open access, online-only, community-led journal. It covers all areas of applied physical science. With its advanced web 2.0 functionality, the journal puts relevant content and discussion tools in the hands of the community to shape the direction of the physical sciences. See: http://aipadvances.aip.org


News Release Source :  Ultra-thin wires for quantum computing


Image Credit: J. E. Hoffman and E. Edwards / JQI at UMD


Quantum computation: Fragile yet error-free


Even computers are error-prone. The slightest disturbances may alter stored information and falsify the results of calculations. To overcome these problems, computers use specific routines to continuously detect and correct errors. This also holds true for a future quantum computer, which will require procedures for error correction as well: "Quantum phenomena are extremely fragile and error-prone. Errors can spread rapidly and severely disturb the computer," says Thomas Monz, member of Rainer Blatt's research group at the Institute for Experimental Physics at the University of Innsbruck. Together with Markus Müller and Miguel Angel Martin-Delgado from the Department of Theoretical Physics at the Complutense University in Madrid, the physicists in Innsbruck developed a new quantum error-correcting method and tested it experimentally. "A quantum bit is extremely complex and cannot simply be copied. Moreover, errors in the microscopic quantum world are more varied and harder to correct than in conventional computers," underlines Monz. "To detect and correct general errors in a quantum computer, we need highly sophisticated so-called quantum error-correcting codes." The topological code used for this current experiment was proposed by Martin-Delgado's research group in Madrid. It arranges the qubits on a two-dimensional lattice, where they can interact with neighboring particles.





A quantum bit encoded in seven ions


For the experiment at the University of Innsbruck the physicists confined seven calcium atoms in an ion trap, which allows them to cool these atoms to almost absolute zero temperature and precisely control them by laser beams. The researchers encoded the fragile quantum states of one logical qubit in entangled states of these particles. The quantum error-correcting code provided the program for this process. "Encoding the logical qubit in the seven physical qubits was a real experimental challenge," relates Daniel Nigg, a member of Rainer Blatt's research group. The physicists achieved this in three steps, where in each step complex sequences of laser pulses were used to create entanglement between four neighboring qubits. "For the first time we have been able to encode a single quantum bit by distributing its information over seven atoms in a controlled way," says an excited Markus Müller, who in 2011 moved from Innsbruck to the Complutense University in Madrid. "When we entangle atoms in this specific way, they provide enough information for subsequent error correction and possible computations."


Error-free operations


In another step the physicists tested the code's capability to detect and correct different types of errors. "We have demonstrated that in this type of quantum system we are able to independently detect and correct every possible error for each particle," says Daniel Nigg. "To do this we only need information about the correlations between the particles and don't have to perform measurements of the single particles," explains Daniel Nigg's colleague Esteban Martinez. In addition to reliably detecting single errors, the physicists were for the first time able to apply single or even repeated operations on a logical encoded qubit. Once the obstacle of the complex encoding process is overcome, only simple single-qubit operations are necessary for each logical gate. "With this quantum code we can implement basic quantum operations and simultaneously correct all possible errors," explains Thomas Monz, describing this crucial milestone on the route towards a reliable and fault-tolerant quantum computer.
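The seven-qubit topological code used here is equivalent to the well-known Steane code, whose six four-qubit checks can be read off the parity-check matrix of the classical Hamming code. The sketch below is a simplified classical tabulation, not the experimental protocol: it verifies that each of the 21 possible single-qubit X, Y or Z errors flips a distinct pattern of checks, which is why measuring only correlations suffices to identify and undo the error.

```python
# Seven-qubit code sanity check: every single-qubit Pauli error produces a
# unique six-bit syndrome, so it can be identified from check measurements.
import itertools

checks = [{3, 4, 5, 6}, {1, 2, 5, 6}, {0, 2, 4, 6}]   # qubits 0..6, per plaquette

def syndrome(x_err, z_err):
    """Six parity bits: Z-type checks see X errors, X-type checks see Z errors."""
    sx = [len(c & x_err) % 2 for c in checks]
    sz = [len(c & z_err) % 2 for c in checks]
    return tuple(sx + sz)

table = {}
for q, pauli in itertools.product(range(7), "XYZ"):
    x = {q} if pauli in "XY" else set()     # Y acts as both X and Z
    z = {q} if pauli in "ZY" else set()
    table.setdefault(syndrome(x, z), []).append(f"{pauli}{q}")

print(len(table), "distinct syndromes for 21 single-qubit errors")   # 21
assert all(len(v) == 1 for v in table.values())   # unique -> correctable
```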


Basis for future innovations


This new approach developed by the Spanish and Austrian physicists constitutes a promising basis for future innovations. "This 7-ion system applied for encoding one logical quantum bit can be used as a building block for much larger quantum systems," says theoretical physicist Müller. "The bigger the lattice, the more robust it becomes. The result might be a quantum computer that could perform any number of operations without being impeded by errors." The current experiment not only opens new routes for technological innovations: "Here, completely new questions come up, for example which methods can be used in the first place to characterise such large logical quantum bits," says Rainer Blatt with a view into the future. "Moreover, we would also like to collaboratively develop the used quantum codes further to optimize them for even more extensive operations," adds Martin-Delgado.



###

The researchers are financially supported by the Spanish Ministry of Science, the Austrian Science Fund, the U.S. Government, the European Commission and the Federation of Austrian Industries Tyrol.


Publication: Quantum Computations on a Topologically Encoded Qubit. Daniel Nigg, Markus Müller, Esteban A. Martinez, Philipp Schindler, Markus Hennrich, Thomas Monz, Miguel Angel Martin-Delgado, and Rainer Blatt. Science 2014 DOI: 10.1126/science.1253742 (arXiv:1403.5426)


News Release Source :  Quantum computation: Fragile yet error-free


 

Wednesday, June 18, 2014

Researchers Find Weird Magic Ingredient for Quantum Computing



A form of quantum weirdness is a key ingredient for building quantum computers according to new research from a team at the University of Waterloo's Institute for Quantum Computing (IQC).





In a new study published in the journal Nature, researchers have shown that a weird aspect of quantum theory called contextuality is a necessary resource to achieve the so-called magic required for universal quantum computation.


One major hurdle in harnessing the power of a universal quantum computer is finding practical ways to control fragile quantum states. Working towards this goal, IQC researchers Joseph Emerson, Mark Howard and Joel Wallman have confirmed theoretically that contextuality is a necessary resource required for achieving the advantages of quantum computation.


"Before these results, we didn't necessarily know what resources were needed for a physical device to achieve the advantage of quantum information. Now we know one," said Mark Howard, a postdoctoral fellow at IQC and the lead author of the paper. "As researchers work to build a universal quantum computer, understanding the minimum physical resources required is an important step to finding ways to harness the power of the quantum world."


Quantum devices are extremely difficult to build because they must operate reliably in the presence of noise. The term magic refers to a particular approach to building noise-resistant quantum computers known as magic-state distillation. So-called magic states act as a crucial, but difficult to achieve and maintain, extra ingredient that boosts the power of a quantum device to achieve the improved processing power of a universal quantum computer.


By identifying these magic states as contextual, researchers will be able to clarify the trade-offs involved in different approaches to building quantum devices. The results of the study may also help design new algorithms that exploit the special properties of these magic states more fully.


"These new results give us a deeper understanding of the nature of quantum computation. They also clarify the practical requirements for designing a realistic quantum computer," said Joseph Emerson, professor of Applied Mathematics and Canadian Institute for Advanced Research fellow. "I expect the results will help both theorists and experimentalists find more efficient methods to overcome the limitations imposed by unavoidable sources of noise and other errors."


Contextuality was first recognized as a feature of quantum theory almost 50 years ago. Work from that era showed that it is impossible to explain measurements on quantum systems in the same way as measurements on classical systems.


In the classical world, measurements simply reveal properties that the system had, such as colour, prior to the measurement. In the quantum world, the property that you discover through measurement is not the property that the system actually had prior to the measurement process. What you observe necessarily depends on how you carried out the observation.


Imagine turning over a playing card. It will be either a red suit or a black suit - a two-outcome measurement. Now imagine nine playing cards laid out in a grid with three rows and three columns. Quantum mechanics predicts something that seems contradictory – there must be an even number of red cards in every row and an odd number of red cards in every column. Try to draw a grid that obeys these rules and you will find it impossible. It's because quantum measurements cannot be interpreted as merely revealing a pre-existing property in the same way that flipping a card reveals a red or black suit.
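The impossibility is easy to confirm exhaustively, since there are only 2^9 = 512 ways to colour the grid. A brief brute-force check:

```python
# Check every 3x3 red/black grid against the two parity rules above.
from itertools import product

valid = 0
for grid in product([0, 1], repeat=9):               # 1 = red, 0 = black
    rows_even = all(sum(grid[3 * r:3 * r + 3]) % 2 == 0 for r in range(3))
    cols_odd = all(sum(grid[c::3]) % 2 == 1 for c in range(3))
    if rows_even and cols_odd:
        valid += 1

print(valid, "of 512 grids satisfy both rules")      # prints 0
```

The parity argument behind the result: even rows force an even total number of red cards, while odd columns force an odd total — a contradiction.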


Measurement outcomes depend on all the other measurements that are performed – the full context of the experiment.


Contextuality means that quantum measurements can not be thought of as simply revealing some pre-existing properties of the system under study. That's part of the weirdness of quantum mechanics.



###

The Irish Research Council (IRC) financially supported Mark Howard as part of its Empower Fellowship program. The study's authors acknowledge financial support from CIFAR and the Government of Canada through NSERC.


News Release Source : Researchers find weird magic ingredient for quantum computing

Saturday, June 7, 2014

Quantum Criticality Observed in New Class of Materials



Observation of quantum phenomenon advances new theoretical understandings

HOUSTON — (June 4, 2014) — Quantum criticality, the strange electronic state that may be intimately related to high-temperature superconductivity, is notoriously difficult to study. But a new discovery of “quantum critical points” could allow physicists to develop a classification scheme for quantum criticality — the first step toward a broader explanation.

An artist's depiction of a "quantum critical point," the point at which a material undergoes a transition from one phase to another at absolute zero. The recent discovery of quantum critical points in a class of iron superconductors could allow physicists to develop a classification scheme for quantum criticality, a strange electronic state that may be intimately related to high-temperature superconductivity. Credit: thinkstockphotos.com/Rice University

Quantum criticality occurs in only a few composite crystalline materials and happens at absolute zero — the lowest possible temperature in the universe. The paucity of experimental observations of quantum criticality has left theorists wanting in their quest for evidence of possible causes.

The new finding of “quantum critical points” is in a class of iron superconductors known as “oxypnictides” (pronounced OXEE-nick-tydes). The research by physicists at Rice University, Princeton University, China’s Zhejiang University and Hangzhou Normal University, France’s École Polytechnique and Sweden’s Linköping University appears in this month’s issue of Nature Materials.

“One of the challenges of studying quantum criticality is trying to completely classify the quantum critical points that have been observed so far,” said Rice physicist Qimiao Si, a co-author of the new study. “There are indications that there’s more than one type, but do we stop at two? As theorists, we are not yet at the point where we can enumerate all of the possibilities.

“Another challenge is that there are still very few materials where we can say, with certainty, that a quantum critical point exists,” Si said. “There’s a very strong need, on these general grounds, for extending the materials basis of quantum criticality.”

In 2001, Si and colleagues advanced a theory to explain how quantum critical points could give seemingly conventional metals unconventional properties. High-temperature superconductors are one such material, and another is “heavy fermion” metals, so-called because the electrons inside them can appear to be thousands of times more massive than normal.

Heavy fermion metals are prototype systems for quantum criticality. When these metals reach their quantum critical point, the electrons within them act in unison and the effects of even one electron moving through the system have widespread results throughout. This is very different from the electron interactions in a common wiring material like copper. It is these collective effects that have increasingly convinced physicists of a possible link between superconductivity and quantum criticality.

“The quantum critical point is the point at which a material undergoes a transition from one phase to another at absolute zero,” said Si, Rice’s Harry C. and Olga K. Wiess Professor of Physics and Astronomy. “Unlike the classical phase transition of ice melting into water, which occurs when heat is provided to the system, the quantum phase transition results from quantum-mechanical forces. The effects are so powerful that they can be detected throughout the space inside the system and over a long time.”

To observe quantum critical points in the lab, physicists cool their samples — be they heavy fermion metals or high-temperature superconductors — to extremely cold temperatures. Though it is impossible to chill anything to absolute zero, physicists can drive the phase transition temperatures down to attainable low temperatures by applying pressure or magnetic fields, or by "doping" the samples to slightly alter the spacing between atoms.

Si and colleagues have been at the forefront of studying quantum critical points for more than a decade. In 2003, they developed the first thermodynamic method for systematically measuring and classifying quantum critical points. In 2004 and again in 2007, they used tests on heavy fermion metals to show how the quantum critical phenomena violated the standard theory of metals — Landau’s Fermi-liquid theory.

In 2008, following the groundbreaking discovery of iron-based pnictide superconductors in Japan and China, Si and colleagues advanced the first theory that explained how superconductivity develops out of a bad-metal normal state in terms of magnetic quantum fluctuations. Also that year, Si co-founded the International Collaborative Center on Quantum Matter (ICC-QM), a joint effort by Rice, Zhejiang University, the London Centre for Nanotechnology and the Max Planck Institute for Chemical Physics of Solids in Dresden, Germany.

In 2009, Si and co-authors offered a theoretical framework to predict how the pnictides would behave at or near a quantum critical point. Several of these predictions were borne out in a series of studies the following year.

In the current Nature Materials study, Si and ICC-QM colleagues Zhu’an Xu, an experimentalist at Zhejiang, and Jianhui Dai, a theorist at Hangzhou, worked with Antoine Georges of École Polytechnique, Nai Phuan Ong of Princeton and others to look for evidence of quantum critical points in an iron-based heavy fermion metallic compound made of cerium, nickel, arsenic and oxygen. The material is related to the family of iron-based pnictide superconductors.

“Heavy fermions are the canonical system for the in-depth study of quantum criticality,” Si said. “We have considered heavy fermion physics in the iron pnictides before, but in those compounds the electrons of the iron elements are ordered in such a way that it makes it more difficult to precisely study quantum criticality.

“The compound that we studied here is the first one among the pnictide family that turned out to feature clear-cut heavy fermion physics. That was a pleasant surprise for me,” Si said.

Through measurements of electrical transport properties in the presence of a magnetic field, the study provided evidence that the quantum critical point belongs to an unconventional type proposed in the 2001 work of Si and colleagues.

“Our work in this new heavy fermion pnictide suggests that the type of quantum critical point that has been theoretically advanced is robust,” Si said. “This bodes well with the notion that quantum criticality can eventually be classified.”

He said it is important to note that other homologues — similar iron-based materials — may now be studied to look for quantum critical points.

“Our results imply that the enormous materials basis for the oxypnictides, which has been so crucial to the search for high-temperature superconductivity, will also play a vital role in the effort to establish the universality classes of quantum criticality,” Si said.

Additional co-authors include Yongkang Lou, Yuke Li, Chunmu Feng and Guanghan Cao, all of Zhejiang University; Leonid Pourovskii of both École Polytechnique and Linköping University; and S.E. Rowley of Princeton University.

The research was supported by the National Basic Research Program of China, the National Science Foundation of China, the NSF of Zhejiang Province, the Fundamental Research Funds for the Central Universities of China, the National Science Foundation, the Nano Electronics Research Corporation, the Robert A. Welch Foundation, the China Scholarship Council and the Swedish National Infrastructure for Computing.

-30-


High-resolution IMAGES are available for download at:

http://news.rice.edu/wp-content/uploads/2014/06/0603_HEAVY-crit-lg.jpg
CAPTION: An artist’s depiction of a “quantum critical point,” the point at which a material undergoes a transition from one phase to another at absolute zero. The recent discovery of quantum critical points in a class of iron superconductors could allow physicists to develop a classification scheme for quantum criticality, a strange electronic state that may be intimately related to high-temperature superconductivity.
CREDIT: thinkstockphotos.com/Rice University

http://news.rice.edu/wp-content/uploads/2014/06/0603_HEAVY-Si-lg.jpg
CAPTION: Qimiao Si
CREDIT: Rice University

A copy of the Nature Materials paper is available at:

http://dx.doi.org/10.1038/nmat3991

News Release Source :  Quantum criticality observed in new class of materials

Thursday, June 5, 2014


Don't Blink! NIST Studies Why Quantum Dots Suffer From 'Fluorescence Intermittency'


Researchers at the National Institute of Standards and Technology (NIST), working in collaboration with the Naval Research Laboratory, have found that a particular species of quantum dot that wasn't commonly thought to blink does, in fact, blink.





So what? Well, although the blinks are short—on the order of nanoseconds to milliseconds—even brief fluctuations can result in efficiency losses that could cause trouble for using quantum dots to generate photons that move information around inside a quantum computer or between nodes of a future high-security internet based on quantum telecommunications.


Beyond demonstrating that the dots are blinking, the team also suggests a possible culprit.*


Scientists have regarded indium arsenide and gallium arsenide (InAs/GaAs) quantum dots as promising single-photon sources for use in future computing and communication systems based on quantum technologies. Compared to other systems, researchers have preferred these quantum dots because they appeared not to blink and because they can be fabricated directly into the types of semiconductor optoelectronics that have been developing over the past few decades.


The NIST research team also thought these quantum dots were emitting perfectly steady light, until they came upon one that was obviously blinking (or was "fluorescently intermittent," in technical terms). They decided to see if they could find others that were blinking in a less obvious way.


While most previous experiments surveyed the dots in bulk, the team tested these dots as they would be used in an actual device. Using an extremely sensitive photon autocorrelation technique to uncover subtle signatures of blinking, they found that the dots blink over timescales ranging from tens of nanoseconds to hundreds of milliseconds. Their results suggest that building photonic structures around the quantum dots—something you'd have to do to make many applications viable—may make them significantly less stable as a light source.
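The idea behind an autocorrelation measurement can be illustrated with a toy model (this is not the authors' analysis pipeline): a source that switches randomly between bright and dark periods shows photon 'bunching' — a normalized autocorrelation g²(τ) above 1 — at lag times shorter than the blinking timescale, decaying back toward 1 at long lags.

```python
# Toy blinking source: random telegraph switching between bright and dark,
# diagnosed through the normalized intensity autocorrelation g2(lag).
import numpy as np

rng = np.random.default_rng(0)
n_bins, t_blink = 200_000, 2_000              # time bins; mean blink duration
switch = rng.random(n_bins) < 1.0 / t_blink   # random telegraph switching
bright = np.cumsum(switch) % 2 == 0
counts = rng.poisson(np.where(bright, 0.10, 0.01))   # bright/dark count rates

def g2(counts, lag):
    """Normalized intensity autocorrelation at an integer lag."""
    a, b = counts[:-lag], counts[lag:]
    return (a * b).mean() / (a.mean() * b.mean())

for lag in (10, 100, 1_000, 10_000):
    print(f"g2({lag:>6}) = {g2(counts, lag):.2f}")
# bunching (g2 > 1) at lags below the blink timescale, near 1 at long lags
```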


"Most of the previous experimental studies of blinking inInAs/GaAs quantum dots looked at their behavior after the dots have been grown but before the surrounding devices have been fabricated," says Kartik Srinivasan, one of the authors of the study. "However, there is no guarantee that a quantum dot will remain non-blinking after the nanofabrication of a surrounding structure, which introduces surfaces and potential defects within 100 nanometers of the quantum dot. We estimate the radiative efficiency of the quantum dots to be between about 50 and 80 percent after the photonic structures are fabricated, significantly less than the 100 percent efficiency that future applications will require."


According to Marcelo Davanço, another author of the study, future work will focus on measuring dots both before and after device fabrication to better assess whether the fabrication is indeed a source of the defects thought to cause the blinking. Ultimately, the authors hope to understand what types of device geometries will avoid blinking while still efficiently funneling the emitted photons into a useful transmission channel, such as an optical fiber.



###

The NIST Center for Nanoscale Science and Technology (CNST) is a national nanotechnology user facility that enables innovation by providing rapid access to the tools needed to make and measure nanostructures. Researchers interested in accessing the techniques described here or in collaborating on their future development should contact Kartik Srinivasan.


*M. Davanço, C. Stephen Hellberg, S. Ates, A. Badolato and K. Srinivasan. Multiple time scale blinking in InAs quantum dot single-photon sources. Phys. Rev. B 89, 161303(R) – Published 16 April 2014.


News Release Source : Don't Blink! NIST Studies Why Quantum Dots Suffer From 'Fluorescence Intermittency'


University of Toronto Physicists Take Quantum Leap Toward Ultra-Precise Measurement


TORONTO, ON – For the first time, physicists at the University of Toronto (U of T) have overcome a major challenge in the science of measurement using quantum mechanics. Their work paves the way for great advances in using quantum states to enable the next generation of ultra-precise measurement technologies.





University of Toronto physics students James Bateman (left) and Lee Rozema (right) led a study which successfully measured multiple photons in an entangled NOON state. The work paves the way for great advances in using quantum states to enable the next generation of ultra-precise measurement technologies.


"We've been able to conduct measurements using photons – individual particles of light – at a resolution unattainable according to classical physics," says Lee Rozema, a Ph.D. candidate in Professor Aephraim Steinberg's quantum optics research group in U of T's Department of Physics, and one of the lead authors along with M.Sc. candidate James Bateman of a report on the discovery published online today in Physical Review Letters. "This work opens up a path for using entangled states of light to carry out ultra-precise measurements."


Many of the most sensitive measurement techniques in existence, from ultra-precise atomic clocks to the world's largest telescopes, rely on detecting interference between waves – which occurs, for example, when two or more beams of light collide in the same space. Manipulating interference by producing photons in a special quantum state known as an "entangled" state – the sort of state famously dismissed by a skeptical Albert Einstein as implying "spooky action at a distance" – provided the result Rozema and his colleagues were looking for. The entangled state they used contains N photons which are all guaranteed to take the same path in an interferometer – either all N take the left-hand path or all N take the right-hand path, but no photons leave the pack.


The effects of interference are measured in devices known as "interferometers." It is well known that the resolution of such a device can be improved by sending more photons through it – when classical light beams are used, increasing the number of photons (the intensity of the light) by a factor of 100 can improve the resolution of an interferometer by a factor of 10. However, if the photons are prepared in a quantum-entangled state, an increase by a factor of 100 should improve the resolution by that same full factor of 100.
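A quick worked comparison of the two limits, in their textbook 1/√N (shot-noise) and 1/N (Heisenberg) forms:

```python
# Phase resolution vs. photon number: classical shot-noise limit scales as
# 1/sqrt(N); entangled (NOON-state) photons reach the Heisenberg limit 1/N.
for n in (1, 100, 10_000):
    print(f"N = {n:>6}:  shot-noise {1 / n ** 0.5:.4f}   Heisenberg {1 / n:.6f}")
```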


The scientific community already knew resolution could be improved by using entangled photons. Once scientists figured out how to entangle multiple photons, the theory was proved correct, but only up to a point. As the number of entangled photons rose, the odds of all photons reaching the same detector at the same time became astronomically small, rendering the technique useless in practice.


So Rozema and his colleagues developed a way to employ multiple detectors in order to measure photons in entangled states. They designed an experimental apparatus that uses a "fibre ribbon" to collect photons and send them to an array of 11 single-photon detectors.


"This allowed us to capture nearly all of the multi-photons originally sent," says Rozema. "Sending single photons as well as two, three and four entangled photons at a time into our device produced dramatically improved resolution."


The U of T experiment built on a proposal by National University of Singapore physicist Mankei Tsang. In 2009, Tsang posited the idea of placing detectors at every possible position a photon could reach so that every possible event could be recorded, whether or not multiple photons hit the same detector. This would enable the calculation of the average position of all the detected photons, and could be done without having to discard any of them. The theory was quickly tested with two photons and two detectors by University of Ottawa physicist Robert Boyd.


"While two photons are better than one, we've shown that 11 detectors are far better than two," says Steinberg, summarising their advancement on Boyd's results. "As technology progresses, using high-efficiency detector arrays and on-demand entangled-photons sources, our techniques could be used to measure increasingly higher numbers of photons with higher resolution."


The discovery is reported in a study titled "Scalable spatial superresolution using entangled photons" published in the June 6 issue of Physical Review Letters. It is recommended as an Editor's Suggestion, and is accompanied by a commentary in the journal Physics which describes the work as a viable approach to efficiently observing superresolved spatial interference fringes that could improve the precision of imaging and lithography systems.



###

In addition to Steinberg, Rozema and Bateman's collaborators on the research included Dylan Mahler, Ryo Okamoto of Hokkaido and Osaka Universities, Amir Feizpour, and Alex Hayat, now at the Technion - Israel Institute of Technology. Support for the research was provided by the Natural Sciences and Engineering Research Council of Canada and the Canadian Institute for Advanced Research, as well as the Yamada Science Foundation.


MEDIA CONTACTS:


Lee Rozema
Department of Physics
University of Toronto
lrozema@physics.utoronto.ca
416-946-3162


Aephraim Steinberg
Department of Physics
University of Toronto
steinberg@physics.utoronto.ca
416-978-0713


Sean Bettam
Communications, Faculty of Arts & Science
University of Toronto
s.bettam@utoronto.ca
416-946-7950


 News Release Source : University of Toronto Physicists Take Quantum Leap Toward Ultra-Precise Measurement

Monday, June 2, 2014

New analysis eliminates a potential speed bump in quantum computing



Global symmetry not required for fast quantum search

A quantum particle can search for an item in an unsorted "database" by jumping from one item to another in superposition, and it does so faster than a classical computer ever could.


This assertion assumes, however, that the particle can directly hop from any item to any other. Any restriction on which items the particle can directly hop to could slow down the search.





In a complete graph (left) every node is connected to every other. For other well-studied graphs, the Paley graph in the center and the Latin square graph on the right, that is not true. A quantum particle could hop directly to the target position, in red, only from connected nodes, marked in blue.


"Intuition says that a symmetric database allows the particle to hop freely enough to retain the quantum speedup, but our research has shown this intuition to be false," says Tom Wong, a physicist at the University of California, San Diego.


In a paper accepted for publication by Physical Review Letters, the researchers used a technique familiar to physicists called "degenerate perturbation theory" in a novel way to prove that global symmetry is not required for a sped-up search.


Information scientists represent the database to be searched as a graph. In globally symmetric graphs, the nodes can be swapped with each other such that the connections between them are preserved. "Strongly regular graphs" don't share this property, but this analysis shows they also support a fast search through local symmetries.
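For intuition, the fully symmetric baseline is easy to simulate. On the complete graph, a continuous-time quantum walk with search Hamiltonian H = −γA − |w⟩⟨w| and hopping rate γ = 1/N concentrates on the marked node w at time about (π/2)√N — the quadratic speedup over classical O(N) search. The sketch below covers only this textbook complete-graph case; the paper's contribution concerns the harder, less symmetric strongly regular graphs.

```python
# Quantum-walk search on the complete graph K_N: success probability ~1
# at t = (pi/2) * sqrt(N), illustrating the quadratic quantum speedup.
import numpy as np
from scipy.linalg import expm

N, w = 64, 0                          # number of nodes, marked node
A = np.ones((N, N)) - np.eye(N)       # adjacency matrix of K_N
oracle = np.zeros((N, N))
oracle[w, w] = 1.0
H = -A / N - oracle                   # search Hamiltonian with gamma = 1/N

psi0 = np.ones(N) / np.sqrt(N)        # start in the uniform superposition
t = (np.pi / 2) * np.sqrt(N)
psi_t = expm(-1j * H * t) @ psi0
print(f"P(marked) at t = {t:.1f}: {abs(psi_t[w]) ** 2:.3f}")   # ~1.000
```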


Their finding extends the use of this theory to the field of quantum information science and expands the kinds of data structures on which quantum computing outperforms classical computing.



###

Jonatan Janmark of the KTH Royal Institute of Technology in Stockholm, Sweden, and UC San Diego's Department of Mathematics, and David Meyer, professor of mathematics at UC San Diego, co-authored the work.


The Defense Advanced Research Projects Agency partially supported this work as part of its Quantum Entanglement Science and Technology program. Additional funding came from the Air Force Office of Scientific Research as part of the Transformational Computing in Aerospace Science and Engineering Initiative, and the Achievement Awards for College Scientists Foundation.