Monday, June 23, 2014

Way to Boot up Quantum Computers 72 Times Faster Than Previously Possible

Physicists find way to boot up quantum computers 72 times faster than previously possible


Press the start button, switch on the monitor, grab a cup of coffee and off you go. That is pretty much how most us experience booting up a computer. But with a quantum computer the situation is very different. So far, researchers have had to spend hours making dozens of adjustments and fine calibrations in order to set up a chip with just five quantum bits so that it can be used for experimental work. (One quantum bit or 'qubit' is the quantum physical equivalent of a single bit in a conventional computer). Any small errors in the adjustment and calibration procedure and the chip would not work.

The problem is that, not unlike musical instruments, quantum computers react to small changes in the local environment. If, for example, it is a little warmer or a little colder, or if the ambient air pressure is a little higher or a little lower than the day before, then the complex network of qubits will no longer function – the computer is detuned and has to be readjusted before it can be used. 'Up until now, experimental quantum physicists have had to sit down each day and see how conditions have changed compared to the day before. They then had to remeasure each parameter and carefully recalibrate the chip,' explains Frank Wilhelm-Mauch, Professor of Theoretical Quantum and Solid-State Physics at Saarland University.

Only a very small error rate of less than 0.1 percent is permissible when measuring ambient conditions. Wilhelm-Mauch explains this sensitivity thus: 'That means that an error can occur in only one in a thousand measurements. If just two in a thousand measurements are in error, the software will be unable to correct for the errors and the quantum computer will not operate correctly.' With around 50 different parameters involved in the calibration process, one begins to get an idea of the sheer effort involved in calibrating a quantum computer.

Working together with his doctoral student Daniel Egger, Wilhelm-Mauch began to consider a fundamentally new approach to the problem. 'We asked ourselves the question: Why is it necessary each and every day to understand how conditions differ from those of the day before? The answer we eventually came up with was that it isn't necessary. What's important is that the setup procedure produces the right results. Why it produces the right results is not so relevant.' It was this pragmatic approach that underlay the work carried out by Wilhelm-Mauch and Egger. 'For the calibration procedure we used an algorithm from engineering mathematics, strictly speaking from the field of civil and structural engineering, as that's another area in which experiments are costly,' explains Professor Wilhelm-Mauch.

Using this technique, the two theoreticians were able to reduce the calibration error rate to below the required 0.1 percent threshold, while at the same time speeding up the calibration process from six hours to five minutes. The Saarbrücken methodology, which goes under the name Ad-HOC (Adaptive Hybrid Optimal Control), has now been subjected to rigorous testing by a group of experimental physicists from the University of California in Santa Barbara. Their experimental work is published in the issue of Physical Review Letters that also contains the Saarbrücken paper.
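
The press material does not spell out the algorithm itself, so the following is only a rough sketch of the general idea, assuming a generic derivative-free optimizer (here SciPy's Nelder-Mead simplex) and a hypothetical measure_gate_error routine standing in for whatever benchmark the chip actually reports. The point is that calibration becomes a closed-loop search over the roughly 50 control parameters, with no need to model why the environment drifted.

    import numpy as np
    from scipy.optimize import minimize

    def measure_gate_error(params):
        """Hypothetical stand-in: run the chip with these control settings and
        return the measured error rate. Here it is faked with a smooth bowl
        plus a little noise so the example actually runs."""
        optimum = np.linspace(-1.0, 1.0, params.size)
        return float(np.sum((params - optimum) ** 2) + 1e-4 * np.random.rand())

    n_params = 50                    # roughly the number of knobs mentioned above
    x0 = np.zeros(n_params)          # yesterday's settings as the starting guess

    # Derivative-free simplex search: no model of *why* the chip drifted is
    # needed, only repeated measurements of how well the current settings work.
    result = minimize(measure_gate_error, x0, method="Nelder-Mead",
                      options={"xatol": 1e-3, "fatol": 1e-3, "maxiter": 20000})

    print("best error rate found:", result.fun)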

This development is of major importance for future experimental research into quantum computing. Physicists in quantum computing laboratories no longer have to spend hours every day preparing their system for just a short period of experimental work. 'As many of the parameters, such as temperature, light and air pressure, do not remain stable during the long calibration phase, this can further shorten the time window in which the chip is running error-free and in which it can therefore be used for experiments,' says Wilhelm-Mauch, adding that the new method is scalable. Up until now, technical constraints have meant that experiments have been carried out using a single chip housing five qubits that perform the actual computational operations. The new method, in contrast, is not restricted to chips of this magnitude and can be applied to quantum processors of almost any size.

Frank Wilhelm-Mauch jokingly points out another appealing feature of the new methodology: 'Unlike the previous approach of manual calibration, our method is fully automated. The researcher really does just push a button like on a conventional computer. They can then go off to get themselves a coffee while the quantum computer boots up.' A major improvement in the life of experimental research scientists working in the field.


Image Credit: Erik Lucero/UCSB

Tuesday, June 17, 2014

Ultra Thin Wires for Quantum Computing

Ultra-thin wires for quantum computing

Carefully fabricating nanofibers by heating and pulling may make for highly-efficient, optics-based, low-power atom traps

WASHINGTON D.C., June 17, 2014 - Take a fine strand of silica fiber, attach it at each end to a slow-turning motor, gently torture it over an unflickering flame until it just about reaches its melting point and then pull it apart. The middle will thin out like a piece of taffy until it is less than half a micron across -- about 200 times thinner than a human hair.

That, according to researchers at the Joint Quantum Institute at the University of Maryland, is how you fabricate ultrahigh transmission optical nanofibers, a potential component for future quantum information devices, which they describe in AIP Publishing's journal AIP Advances.

Quantum computers promise enormous power, but are notoriously tricky to build. To encode information in qubits, the fundamental units of a quantum computer, the bits must be held in a precarious position called a superposition of states. In this fragile condition the bits exist in all of their possible configurations at the same time, meaning they can perform multiple parallel calculations.

The tendency of qubits to lose their superposition state too quickly, a phenomenon known as decoherence, is a major obstacle to the further development of quantum computers and any device dependent on superpositions. To address this challenge, researchers at the Joint Quantum Institute proposed a hybrid quantum processor that uses trapped atoms as the memory and superconducting qubits as the processor, as atoms demonstrate relatively long superposition survival times and superconducting qubits perform operations quickly.

"The idea is that we can get the best of both worlds," said Jonathan Hoffman, a graduate student in the Joint Quantum Institute who works in the lab of principal investigators Steven Rolston and Luis Orozco. However, a problem is that superconductors don't like high optical power or magnetic fields and most atomic traps use both, Hoffman said.

This is where the optical nanofibers come in: The Joint Quantum Institute team realized that nanofibers could create optics-based, low-power atom traps that would "play nice" with superconductors. Because the diameter of the fibers is so minute -- 530 nanometers, less than the wavelength of light used to trap atoms -- some of the light leaks outside of the fiber as a so-called evanescent wave, which can be used to trap atoms a few hundred nanometers from the fiber surface.
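
As a rough illustration of why the trap sits a few hundred nanometers from the fiber surface: outside a sub-wavelength fiber the guided light's evanescent field decays roughly exponentially with distance. The numbers below (trapping wavelength, effective mode index) are assumptions chosen for illustration, not values taken from the paper.

    import math

    wavelength_nm = 780.0   # assumed trapping wavelength (rubidium D2 line)
    n_eff = 1.2             # assumed effective index of the guided mode

    # Outside the glass the field falls off roughly as exp(-q * d), with
    # q = (2*pi/lambda) * sqrt(n_eff**2 - 1) for a fiber surrounded by vacuum.
    q = (2 * math.pi / wavelength_nm) * math.sqrt(n_eff ** 2 - 1.0)   # per nm

    for d in (0, 100, 200, 300, 400):            # distance from the surface, nm
        rel_intensity = math.exp(-2 * q * d)     # intensity ~ field squared
        print(f"{d:3d} nm from surface: relative intensity {rel_intensity:.3f}")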

Hoffman and his colleagues have worked on optical nanofiber atom traps for the past few years. Their AIP Advances paper describes a new procedure they developed that maximizes the efficiency of the traps through careful and precise fabrication methods.

The group's procedure, which reduces transmission losses by two orders of magnitude compared with previous work, focuses on intensive preparation and cleaning of the pre-pull environment in which the nanofibers are created.

In the fabrication process, the fiber is brushed through the flame to prevent the formation of air currents, which can cause inconsistencies in the fiber's diameter as it is pulled apart and tapered down. The flame source is a mixture of hydrogen and oxygen gas in a precise two-to-one ratio, which ensures that water vapor is the only byproduct. The motors are controlled by an algorithm based on the existing work of a group in Vienna, which calculates the trajectories of the motors needed to produce a fiber of the desired length and profile.
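
The article does not reproduce the trajectory algorithm itself, but the underlying bookkeeping is volume conservation: glass pulled out of a fixed hot zone thins the waist exponentially. The sketch below uses that standard relation with an assumed hot-zone length, purely as an illustration of the kind of calculation the motor-control algorithm performs.

    import math

    r0_um = 62.5          # standard fiber radius (125 micron diameter)
    hot_zone_mm = 4.0     # assumed effective flame / hot-zone length

    def waist_radius_um(elongation_mm):
        """Volume conservation for a constant hot zone: r = r0 * exp(-x / 2L)."""
        return r0_um * math.exp(-elongation_mm / (2.0 * hot_zone_mm))

    # How far must the motors pull to reach a ~530 nm diameter (0.265 um radius)?
    target_radius_um = 0.265
    needed_mm = -2.0 * hot_zone_mm * math.log(target_radius_um / r0_um)
    print(f"required elongation: {needed_mm:.1f} mm")

    for x in (0, 10, 20, 30, 40):
        print(f"after {x:2d} mm of pulling, waist radius ~ {waist_radius_um(x):7.3f} um")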

Previous pulling methods, such as carbon dioxide lasing and chemical etching, were limited by the laser's insufficient diameter and by a lesser degree of control over tapering length, respectively.

Future work includes interfacing the trapped atoms with the superconducting circuits held at 10 millikelvin in a dilution refrigerator, as well as guiding more complicated optical field patterns (higher-order modes) through the fiber and using these to trap atoms.


###

The article, "Ultrahigh transmission optical nanofibers," is authored by J.E. Hoffman, S. Ravets, J.A. Grover, P. Solano, P.R. Kordell, J.D. Wong-Campos, L.A. Orozco and S.L. Rolston. It will be published in AIP Advances on June 17, 2014 (DOI: . After that date, it may be accessed at: http://scitation.aip.org/content/aip/journal/adva/4/6/10.1063/1.4879799

ABOUT THE JOURNAL

AIP Advances is a fully open access, online-only, community-led journal. It covers all areas of applied physical science. With its advanced web 2.0 functionality, the journal puts relevant content and discussion tools in the hands of the community to shape the direction of the physical sciences. See: http://aipadvances.aip.org


Image Credit: J. E. Hoffman and E. Edwards / JQI at UMD

Thursday, June 12, 2014

Quantum Computation - Fragile Yet Error-Free

Quantum computation: Fragile yet error-free


Even computers are error-prone. The slightest disturbances may alter saved information and falsify the results of calculations. To overcome these problems, computers use specific routines to continuously detect and correct errors. This also holds true for a future quantum computer, which will require procedures for error correction as well: "Quantum phenomena are extremely fragile and error-prone. Errors can spread rapidly and severely disturb the computer," says Thomas Monz, member of Rainer Blatt's research group at the Institute for Experimental Physics at the University of Innsbruck.

Together with Markus Müller and Miguel Angel Martin-Delgado from the Department for Theoretical Physics at the Complutense University in Madrid, the physicists in Innsbruck developed a new quantum error-correcting method and tested it experimentally. "A quantum bit is extremely complex and cannot simply be copied. Moreover, errors in the microscopic quantum world are more varied and harder to correct than in conventional computers," emphasizes Monz. "To detect and correct general errors in a quantum computer, we need highly sophisticated so-called quantum error-correcting codes." The topological code used in the current experiment was proposed by Martin-Delgado's research group in Madrid. It arranges the qubits on a two-dimensional lattice, where they can interact with the neighboring particles.

A quantum bit encoded in seven ions

For the experiment at the University of Innsbruck the physicists confined seven calcium atoms in an ion trap, which allows them to cool the atoms to almost absolute zero and control them precisely with laser beams. The researchers encoded the fragile quantum states of one logical qubit in entangled states of these particles; the quantum error-correcting code provided the program for this process. "Encoding the logical qubit in the seven physical qubits was a real experimental challenge," says Daniel Nigg, a member of Rainer Blatt's research group. The physicists achieved this in three steps, each of which used complex sequences of laser pulses to create entanglement between four neighboring qubits. "For the first time we have been able to encode a single quantum bit by distributing its information over seven atoms in a controlled way," says an excited Markus Müller, who in 2011 moved from Innsbruck to the Complutense University in Madrid. "When we entangle atoms in this specific way, they provide enough information for subsequent error correction and possible computations."
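
The topological code in question is the smallest two-dimensional colour code, equivalent to the well-known seven-qubit Steane code: three four-qubit checks (plaquettes), applied in both the X and the Z basis, are measured on the seven ions. The snippet below is a purely classical illustration of that layout (my own sketch, not the experiment's control software), listing one common choice of check supports and counting the codewords they admit.

    import itertools

    CHECKS = [(4, 5, 6, 7), (2, 3, 6, 7), (1, 3, 5, 7)]   # one common labelling

    def parity(bits, support):
        """Parity of the chosen qubits; bits is a length-7 tuple of 0s and 1s."""
        return sum(bits[q - 1] for q in support) % 2

    # The 16 codewords of the classical [7,4] Hamming code that underlies the
    # scheme: every seven-bit word whose three check parities are all zero.
    codewords = [w for w in itertools.product((0, 1), repeat=7)
                 if all(parity(w, c) == 0 for c in CHECKS)]
    print(len(codewords), "codewords, for example", codewords[:3])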

Error-free operations

In another step the physicists tested the code's capability to detect and correct different types of errors. "We have demonstrated that in this type of quantum system we are able to independently detect and correct every possible error for each particle," says Daniel Nigg. "To do this we only need information about the correlations between the particles and don't have to perform measurements of the single particles," explains Daniel Nigg's colleague Esteban Martinez. In addition to reliably detecting single errors, the physicists were for the first time able to apply single or even repeated operations on a logical encoded qubit. Once the obstacle of the complex encoding process is overcome, only simple single-qubit operations are needed for each logical gate. "With this quantum code we can implement basic quantum operations and simultaneously correct all possible errors," says Thomas Monz of this crucial milestone on the route towards a reliable and fault-tolerant quantum computer.
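
A rough sketch of the lookup that these correlation measurements enable, again offered as a classical illustration rather than anything taken from the experiment: for a single bit-flip, the pattern of violated checks encodes the address of the affected qubit, so the error can be undone without reading out the qubits individually.

    CHECKS = [(4, 5, 6, 7), (2, 3, 6, 7), (1, 3, 5, 7)]   # same supports as above

    def syndrome(bits):
        """The three check parities, obtained jointly rather than qubit by qubit."""
        return [sum(bits[q - 1] for q in c) % 2 for c in CHECKS]

    def correct_single_flip(bits):
        s = syndrome(bits)
        pos = s[0] * 4 + s[1] * 2 + s[2]        # binary address; 0 means no error
        bits = list(bits)
        if pos:
            bits[pos - 1] ^= 1                  # flip the addressed qubit back
        return bits

    word = [0, 0, 0, 0, 0, 0, 0]                # a valid codeword
    word[2] ^= 1                                # bit-flip error on qubit 3
    print("syndrome:", syndrome(word))          # -> [0, 1, 1], pointing at qubit 3
    print("corrected:", correct_single_flip(word))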

Basis for future innovations

This new approach developed by the Spanish and Austrian physicists constitutes a promising basis for future innovations. "This 7-ion system applied for encoding one logical quantum bit can be used as a building block for much larger quantum systems," says theoretical physicist Müller. "The bigger the lattice, the more robust it becomes. The result might be a quantum computer that could perform any number of operations without being impeded by errors." The current experiment not only opens new routes for technological innovations: "Here, completely new questions come up, for example which methods can be used in the first place to characterise such large logical quantum bits," says Rainer Blatt with a view into the future. "Moreover, we would also like to collaboratively develop the used quantum codes further to optimize them for even more extensive operations," adds Martin-Delgado.

The researchers are financially supported by the Spanish Ministry of Science, the Austrian Science Fund, the U.S. Government, the European Commission and the Federation of Austrian Industries Tyrol.

For more information, see the research paper: Experimental Quantum Computations on a Topologically Encoded Qubit


Wednesday, June 11, 2014

Researchers Find Weird Magic Ingredient for Quantum Computing

Researchers find weird magic ingredient for quantum computing

A form of quantum weirdness is a key ingredient for building quantum computers, according to new research from a team at the University of Waterloo's Institute for Quantum Computing (IQC).
In a new study published in the journal Nature, researchers have shown that a weird aspect of quantum theory called contextuality is a necessary resource to achieve the so-called magic required for universal quantum computation.

One major hurdle in harnessing the power of a universal quantum computer is finding practical ways to control fragile quantum states. Working towards this goal, IQC researchers Joseph Emerson, Mark Howard and Joel Wallman have confirmed theoretically that contextuality is a necessary resource required for achieving the advantages of quantum computation.

"Before these results, we didn't necessarily know what resources were needed for a physical device to achieve the advantage of quantum information. Now we know one," said Mark Howard, a postdoctoral fellow at IQC and the lead author of the paper. "As researchers work to build a universal quantum computer, understanding the minimum physical resources required is an important step to finding ways to harness the power of the quantum world."

Quantum devices are extremely difficult to build because they must operate in a way that is resistant to noise from their environment. The term magic refers to a particular approach to building noise-resistant quantum computers known as magic-state distillation. So-called magic states act as a crucial, but difficult to achieve and maintain, extra ingredient that boosts the power of a quantum device to achieve the improved processing power of a universal quantum computer.
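
A standard textbook way to picture single-qubit magic, offered here only as an illustration (it is not taken from the Nature paper): mixtures of the six Pauli eigenstates fill an octahedron inside the Bloch sphere, and a state lying outside that octahedron carries the non-stabilizer "magic" that distillation protocols concentrate.

    import math

    def is_outside_stabilizer_octahedron(bloch):
        """True if |x| + |y| + |z| > 1, i.e. the state cannot be written as a
        mixture of the six Pauli eigenstates."""
        x, y, z = bloch
        return abs(x) + abs(y) + abs(z) > 1.0 + 1e-12

    t_state = (1 / math.sqrt(3),) * 3      # Bloch vector of the T-type magic state
    plus_state = (1.0, 0.0, 0.0)           # |+>, an ordinary stabilizer state

    print("T state carries magic:", is_outside_stabilizer_octahedron(t_state))      # True
    print("|+> state carries magic:", is_outside_stabilizer_octahedron(plus_state)) # False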

By identifying these magic states as contextual, researchers will be able to clarify the trade-offs involved in different approaches to building quantum devices. The results of the study may also help design new algorithms that exploit the special properties of these magic states more fully.

"These new results give us a deeper understanding of the nature of quantum computation. They also clarify the practical requirements for designing a realistic quantum computer," said Joseph Emerson, professor of Applied Mathematics and Canadian Institute for Advanced Research fellow. "I expect the results will help both theorists and experimentalists find more efficient methods to overcome the limitations imposed by unavoidable sources of noise and other errors."

Contextuality was first recognized as a feature of quantum theory almost 50 years ago. Theorists showed then that it is impossible to explain measurements on quantum systems in the same way as measurements on classical systems.

In the classical world, measurements simply reveal properties that the system had, such as colour, prior to the measurement. In the quantum world, the property that you discover through measurement is not the property that the system actually had prior to the measurement process. What you observe necessarily depends on how you carried out the observation.

Imagine turning over a playing card. It will be either a red suit or a black suit - a two-outcome measurement. Now imagine nine playing cards laid out in a grid with three rows and three columns. Quantum mechanics predicts something that seems contradictory – there must be an even number of red cards in every row and an odd number of red cards in every column. Try to draw a grid that obeys these rules and you will find it impossible. It's because quantum measurements cannot be interpreted as merely revealing a pre-existing property in the same way that flipping a card reveals a red or black suit.
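
The impossibility is easy to confirm by brute force. The toy script below (an illustration of the card analogy above, not of the actual quantum measurements) checks every red/black assignment of the nine cards and finds none that satisfies the stated row and column rules.

    import itertools

    def satisfies(grid):                  # grid: nine entries, 1 = red, 0 = black
        rows = [grid[3 * r: 3 * r + 3] for r in range(3)]
        cols = [grid[c::3] for c in range(3)]
        return (all(sum(r) % 2 == 0 for r in rows) and
                all(sum(c) % 2 == 1 for c in cols))

    solutions = [g for g in itertools.product((0, 1), repeat=9) if satisfies(g)]
    print("classical colourings that obey both rules:", len(solutions))   # prints 0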

Measurement outcomes depend on all the other measurements that are performed – the full context of the experiment.

Contextuality means that quantum measurements cannot be thought of as simply revealing some pre-existing properties of the system under study. That's part of the weirdness of quantum mechanics.


The Irish Research Council (IRC) financially supported Mark Howard as part of the Empower Fellowship program. The study's authors also acknowledge financial support from CIFAR and the Government of Canada through NSERC.

News Release Source: Researchers find weird magic ingredient for quantum computing

Wednesday, June 4, 2014

Quantum Criticality Observed in New Class of Materials

Quantum criticality observed in new class of materials

Observation of quantum phenomenon advances new theoretical understandings

HOUSTON — (June 4, 2014) — Quantum criticality, the strange electronic state that may be intimately related to high-temperature superconductivity, is notoriously difficult to study. But a new discovery of “quantum critical points” could allow physicists to develop a classification scheme for quantum criticality — the first step toward a broader explanation.

An artist's depiction of a "quantum critical point," the point at which a material undergoes a transition from one phase to another at absolute zero. The recent discovery of quantum critical points in a class of iron superconductors could allow physicists to develop a classification scheme for quantum criticality, a strange electronic state that may be intimately related to high-temperature superconductivity. Credit: thinkstockphotos.com/Rice University

Quantum criticality occurs in only a few composite crystalline materials and happens at absolute zero — the lowest possible temperature in the universe. The paucity of experimental observations of quantum criticality has left theorists wanting in their quest for evidence of possible causes.

The new finding of “quantum critical points” is in a class of iron superconductors known as “oxypnictides” (pronounced OXEE-nick-tydes). The research by physicists at Rice University, Princeton University, China’s Zhejiang University and Hangzhou Normal University, France’s École Polytechnique and Sweden’s Linköping University appears in this month’s issue of Nature Materials.

“One of the challenges of studying quantum criticality is trying to completely classify the quantum critical points that have been observed so far,” said Rice physicist Qimiao Si, a co-author of the new study. “There are indications that there’s more than one type, but do we stop at two? As theorists, we are not yet at the point where we can enumerate all of the possibilities.

“Another challenge is that there are still very few materials where we can say, with certainty, that a quantum critical point exists,” Si said. “There’s a very strong need, on these general grounds, for extending the materials basis of quantum criticality.”

In 2001, Si and colleagues advanced a theory to explain how quantum critical points could give seemingly conventional metals unconventional properties. High-temperature superconductors are one such material, and another is “heavy fermion” metals, so-called because the electrons inside them can appear to be thousands of times more massive than normal.

Heavy fermion metals are prototype systems for quantum criticality. When these metals reach their quantum critical point, the electrons within them act in unison and the effects of even one electron moving through the system have widespread results throughout. This is very different from the electron interactions in a common wiring material like copper. It is these collective effects that have increasingly convinced physicists of a possible link between superconductivity and quantum criticality.

“The quantum critical point is the point at which a material undergoes a transition from one phase to another at absolute zero,” said Si, Rice’s Harry C. and Olga K. Wiess Professor of Physics and Astronomy. “Unlike the classical phase transition of ice melting into water, which occurs when heat is provided to the system, the quantum phase transition results from quantum-mechanical forces. The effects are so powerful that they can be detected throughout the space inside the system and over a long time.”

To observe quantum critical points in the lab, physicists cool their samples — be they heavy fermion metals or high-temperature superconductors — to extremely cold temperatures. Though it is impossible to chill anything to absolute zero, physicists can drive the phase transitions down to attainably low temperatures by applying pressure or magnetic fields, or by “doping” the samples to slightly alter the spacing between atoms.

Si and colleagues have been at the forefront of studying quantum critical points for more than a decade. In 2003, they developed the first thermodynamic method for systematically measuring and classifying quantum critical points. In 2004 and again in 2007, they used tests on heavy fermion metals to show how the quantum critical phenomena violated the standard theory of metals — Landau’s Fermi-liquid theory.

In 2008, following the groundbreaking discovery of iron-based pnictide superconductors in Japan and China, Si and colleagues advanced the first theory that explained how superconductivity develops out of a bad-metal normal state in terms of magnetic quantum fluctuations. Also that year, Si co-founded the International Collaborative Center on Quantum Matter (ICC-QM), a joint effort by Rice, Zhejiang University, the London Centre for Nanotechnology and the Max Planck Institute for Chemical Physics of Solids in Dresden, Germany.

In 2009, Si and co-authors offered a theoretical framework to predict how the pnictides would behave at or near a quantum critical point. Several of these predictions were borne out in a series of studies the following year.

In the current Nature Materials study, Si and ICC-QM colleagues Zhu’an Xu, an experimentalist at Zhejiang, and Jianhui Dai, a theorist at Hangzhou, worked with Antoine Georges of École Polytechnique, Nai Phuan Ong of Princeton and others to look for evidence of quantum critical points in an iron-based heavy fermion metallic compound made of cerium, nickel, arsenic and oxygen. The material is related to the family of iron-based pnictide superconductors.

“Heavy fermions are the canonical system for the in-depth study of quantum criticality,” Si said. “We have considered heavy fermion physics in the iron pnictides before, but in those compounds the electrons of the iron elements are ordered in such a way that it makes it more difficult to precisely study quantum criticality.

“The compound that we studied here is the first one among the pnictide family that turned out to feature clear-cut heavy fermion physics. That was a pleasant surprise for me,” Si said.

Through measurements of electrical transport properties in the presence of a magnetic field, the study provided evidence that the quantum critical point belongs to an unconventional type proposed in the 2001 work of Si and colleagues.

“Our work in this new heavy fermion pnictide suggests that the type of quantum critical point that has been theoretically advanced is robust,” Si said. “This bodes well for the notion that quantum criticality can eventually be classified.”

He said it is important to note that other homologues — similar iron-based materials — may now be studied to look for quantum critical points.

“Our results imply that the enormous materials basis for the oxypnictides, which has been so crucial to the search for high-temperature superconductivity, will also play a vital role in the effort to establish the universality classes of quantum criticality,” Si said.

Additional co-authors include Yongkang Lou, Yuke Li, Chunmu Feng and Guanghan Cao, all of Zhejiang University; Leonid Pourovskii of both École Polytechnique and Linköping University; and S.E. Rowley of Princeton University.

The research was supported by the National Basic Research Program of China, the National Science Foundation of China, the NSF of Zhejiang Province, the Fundamental Research Funds for the Central Universities of China, the National Science Foundation, the Nano Electronics Research Corporation, the Robert A. Welch Foundation, the China Scholarship Council and the Swedish National Infrastructure for Computing.

For more information, the Nature Materials paper is available at: http://dx.doi.org/10.1038/nmat3991

News Release Source: Quantum criticality observed in new class of materials

Monday, June 2, 2014

Physicists Take Quantum Leap Toward Ultra-Precise Measurement

University of Toronto Physicists Take Quantum Leap Toward Ultra-Precise Measurement


TORONTO, ON – For the first time, physicists at the University of Toronto (U of T) have overcome a major challenge in the science of measurement using quantum mechanics. Their work paves the way for great advances in using quantum states to enable the next generation of ultra-precise measurement technologies.

University of Toronto physics students James Bateman (left) and Lee Rozema (right) led a study which successfully measured multiple photons in an entangled NOON state. The work paves the way for great advances in using quantum states to enable the next generation of ultra-precise measurement technologies.

"We've been able to conduct measurements using photons – individual particles of light – at a resolution unattainable according to classical physics," says Lee Rozema, a Ph.D. candidate in Professor Aephraim Steinberg's quantum optics research group in U of T's Department of Physics, and one of the lead authors along with M.Sc. candidate James Bateman of a report on the discovery published online today in Physical Review Letters. "This work opens up a path for using entangled states of light to carry out ultra-precise measurements."

Many of the most sensitive measurement techniques in existence, from ultra-precise atomic clocks to the world's largest telescopes, rely on detecting interference between waves – which occurs, for example, when two or more beams of light collide in the same space. Manipulating interference by producing photons in a special quantum state known as an "entangled" state – the sort of state famously dismissed by a skeptical Albert Einstein as implying "spooky action at a distance" – provided the result Rozema and his colleagues were looking for. The entangled state they used contains N photons which are all guaranteed to take the same path in an interferometer – either all N take the left-hand path or all N take the right-hand path, but no photons leave the pack.

The effects of interference are measured in devices known as "interferometers." It is well known that the resolution of such a device can be improved by sending more photons through it – when classical light beams are used, increasing the number of photons (the intensity of the light) by a factor of 100 can improve the resolution of an interferometer by a factor of 10. However, if the photons are prepared in a quantum-entangled state, an increase by a factor of 100 should improve the resolution by that same full factor of 100.
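
The scaling quoted above is the textbook comparison between the shot-noise limit and the Heisenberg limit; the short sketch below simply tabulates it and notes the corresponding fringe patterns. It illustrates the general principle, not the paper's own analysis.

    import math

    # Classical light: averaging N independent photons narrows the phase
    # uncertainty only as 1/sqrt(N) (shot-noise limit). An N-photon entangled
    # NOON state instead gives fringes ~ cos^2(N*phi/2) rather than cos^2(phi/2),
    # i.e. N-fold finer features, pushing the uncertainty towards 1/N.
    for n in (1, 4, 100):
        shot_noise = 1.0 / math.sqrt(n)
        heisenberg = 1.0 / n
        print(f"N = {n:3d}: classical ~{shot_noise:.2f} rad, entangled ~{heisenberg:.2f} rad")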

The scientific community already knew resolution could be improved by using entangled photons. Once scientists figured out how to entangle multiple photons, the theory was proved correct, but only up to a point. As the number of entangled photons rose, the odds of all photons reaching the same detector at the same time became astronomically small, rendering the technique useless in practice.

So Rozema and his colleagues developed a way to employ multiple detectors in order to measure photons in entangled states. They designed an experimental apparatus that uses a "fibre ribbon" to collect photons and send them to an array of 11 single-photon detectors.

"This allowed us to capture nearly all of the multi-photons originally sent," says Rozema. "Sending single photons as well as two, three and four entangled photons at a time into our device produced dramatically improved resolution."

The U of T experiment built on a proposal by National University of Singapore physicist Mankei Tsang. In 2009, Tsang posited the idea of placing detectors at every possible position a photon could reach so that every possible event could be recorded, whether or not multiple photons hit the same detector. This would enable the calculation of the average position of all the detected photons, and could be done without having to discard any of them. The theory was quickly tested with two photons and two detectors by University of Ottawa physicist Robert Boyd.
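
The following toy script illustrates the averaging idea in software terms, with a made-up detect_event routine standing in for the real 11-detector fibre-ribbon apparatus: every multi-photon event contributes the average position of its detected photons, so nothing has to be thrown away when the photons spread over several detectors.

    import random

    N_DETECTORS = 11

    def detect_event(n_photons):
        """Made-up stand-in for one multi-photon detection event: returns the
        indices of the detectors that registered a photon."""
        return [random.randrange(N_DETECTORS) for _ in range(n_photons)]

    def centroid(hits):
        return sum(hits) / len(hits)

    events = [detect_event(4) for _ in range(1000)]     # e.g. four-photon states
    centroids = [centroid(e) for e in events]
    print("mean centroid over all events:", sum(centroids) / len(centroids))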

"While two photons are better than one, we've shown that 11 detectors are far better than two," says Steinberg, summarising their advancement on Boyd's results. "As technology progresses, using high-efficiency detector arrays and on-demand entangled-photons sources, our techniques could be used to measure increasingly higher numbers of photons with higher resolution."

The discovery is reported in a study titled "Scalable spatial superresolution using entangled photons" published in the June 6 issue of Physical Review Letters. It is recommended as an Editor's Suggestion, and is accompanied by a commentary in the journal Physics which describes the work as a viable approach to efficiently observing superresolved spatial interference fringes that could improve the precision of imaging and lithography systems.

News Release Source: University of Toronto Physicists Take Quantum Leap Toward Ultra-Precise Measurement