Monday, December 26, 2011

What is a Quantum Computer?

by Sandra Prior

While not a new development, quantum computing is the next target for intrepid developers, most of whom are physicists and scientists rather than programmers and inventors. The hardware needed to run a quantum algorithm is finicky at best and downright terrifying to maintain. But what is quantum computing, and what goes into making a quantum computer?

Quantum What?

A conventional modern PC works with two binary states, a one or a zero. These bits make up all of the information that your stock home computer holds on its hard drive. A quantum computer is somewhat different: rather than operating only in states of one or zero, its basic unit can be in both states at once, in superposition, as well as any weighted blend of the two.

Rather than binary bits, these machines use qubits. A quantum computer operates on the principles of quantum mechanics, notably quantum interference in the most widespread design models, rather than on classical physics. Just to illustrate the difference in processing between your home PC and something quantum, a 30-qubit system would run at the real-world equivalent of a conventional unit running at 10 teraflops. Since home users currently have access to processing in the gigaflop range, this is a huge jump in power. Maybe Crysis will finally run on Ultra.
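To make the difference between bits and qubits concrete, here is a minimal NumPy sketch (an illustration of the standard textbook model, not of any particular machine): a qubit is described by two complex amplitudes, and an n-qubit register by 2^n of them, which is where that enormous jump in effective processing power comes from.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit's state is a pair of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1; reading it gives 0 with
# probability |a|^2 and 1 with probability |b|^2.
zero = np.array([1, 0], dtype=complex)    # the state |0>
one = np.array([0, 1], dtype=complex)     # the state |1>
plus = (zero + one) / np.sqrt(2)          # equal superposition of both

# An n-qubit register needs 2**n amplitudes to describe, so even a
# modest 30-qubit machine holds a state with over a billion entries --
# the source of the teraflop-equivalent comparison above.
for n in (1, 10, 30):
    print(f"{n:2d} qubits -> {2**n:,} complex amplitudes")
```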

Does it Work?

It would appear that quantum computers are in fact a reality. They are very difficult to construct, maintain and understand, unless you are one of the geniuses involved in the project, that is. Though the field is mostly theoretical at this point, there have been major advances. In March 2000, scientists at Los Alamos National Laboratory announced that they had constructed a 7-qubit quantum computer inside a drop of liquid. The liquid in question was either alanine (used to analyze quantum state decay) or the tongue-busting trichloroethylene (used for quantum error correction), and the state of the qubits is read by nuclear magnetic resonance (NMR), a method of indirectly measuring the state of a qubit.

The problem with measuring a qubit directly is that an accurate reading drops the qubit out of superposition (should it be in that state) and turns your quantum computer into a conventional one.
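Under the same textbook model, that collapse can be simulated directly. In the sketch below (the helper name measure is my own), a single accurate reading randomly picks one outcome and leaves the qubit in an ordinary, classical-looking basis state:

```python
import numpy as np

rng = np.random.default_rng()

def measure(state):
    """Simulate one computational-basis measurement of a qubit.

    The outcome is random, weighted by the squared amplitudes, and the
    post-measurement state 'collapses' to the matching basis state --
    the superposition is gone, exactly as described above.
    """
    probs = np.abs(state) ** 2
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# A qubit in equal superposition: a 50/50 chance of reading 0 or 1.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)
outcome, after = measure(state)
print("measured:", outcome, "-> state afterwards:", after)
```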

The methods used in quantum computing to produce results are far too lengthy to explain fully outside the scientific literature, and the systems being built vary widely, making a broad assessment impossible. The field is very much in its infancy, but strides forward are being made. Since Los Alamos's 7-qubit machine in 2000 there have been: another 7-qubit machine demonstrated by IBM and Stanford University; the first qubyte, a string of eight qubits, built using ion traps in 2005; and, most recently, a 16-qubit quantum computer demonstrated by D-Wave Systems in 2007. Rather than just being a theoretical construct, this computer was able to solve Sudoku puzzles and other problems in the demonstration.

Sooner than Expected?

D-Wave Systems in particular is raring to get a working model of a quantum computer onto the market as soon as possible. Its planned system will not be fully general-purpose but will rather be what one quantum algorithm designer calls a 'special purpose noisy piece of hardware', one that could take on physical simulations that are impossible on conventional silicon technology.

Looking to the future of computing, quantum computing has the potential to far outstrip anything a home user could have imagined. Such processing power could even be put to use creating a true-to-life simulation, a form of virtual reality indistinguishable from reality itself.

Thursday, October 13, 2011

Quantum Computing - Yesterday, Today, and Tomorrow

Author: Dele Oluwole

Abstract

This paper digs into the slow but steady breakthrough in embracing quantum computing and how its benefits and risks affect humanity, drawing analysis from its probable practicality while also exploring today's available technology.

The aim is to assess the effectiveness of quantum computing and how it could impact mankind, tracing its history and looking at what awaits us in the future.

The paper approaches the subject from the two major perspectives that form its basis: where we are, and where we are going. The research and its sources are predicated on these two questions.

The result shows, realistically, the importance of quantum computing to all mankind once such machines are eventually built.

1. Introduction

Quantum computing may be coming closer to everyday use because of the discovery of a single electron's spin in an ordinary transistor. The success, by researcher Hong Wen Jiang and colleagues at the University of California, Los Angeles, could lead to major advances in communications, cryptography and supercomputing. Jiang's research reveals that an ordinary transistor, the kind used in a desktop PC or cell phone, can be adapted for practical quantum computing. Quantum computing exploits the properties of subatomic particles and the laws of quantum mechanics. Today's computers have bits in either a 1 or a 0 state. Qubits, however, can be in both states at the same time.


CISC is a CPU design that enables the processor to handle more complex instructions from software at the expense of speed. All Intel processors for PCs are CISC processors. Complex instruction set computing is one of the two main types of processor design in use today, and it is slowly losing ground to RISC designs; currently the fastest processors in the world are RISC. The most popular current CISC processor is the x86, but some 68xx, 65xx, and Z80s are still in use. A CISC processor is designed to execute a relatively large number of different instructions, each taking a different amount of time to execute, depending on the complexity of the instruction. Contrast this with RISC.

A complex instruction-set computer, then, has a CPU designed with a comprehensive set of assembly instructions, which yields smaller binaries but generally slower execution of each individual instruction.

2. CISC/RISC Speed and limitations

One important assumption in circuit design is that all circuit elements are 'lumped'. This means that signal transmission time from one element to another is insignificant: the time it takes for a signal produced at one point in the circuit to reach the rest of the circuit is tiny compared to the times involved in the circuit's operation.

Electrical signals travel at roughly the speed of light. Suppose a processor works at 1GHz, that is, one billion clock cycles per second; one clock cycle then lasts one billionth of a second, a nanosecond. Light travels about 30cm in a nanosecond. As a result, circuitry running at such clock speeds must be much smaller than 30cm, say at most 3cm. An actual CPU core is less than 1cm on a side, so this is still fine, but that is just for 1GHz.

If the clock speed is increased to 100GHz, a cycle lasts 0.01 nanoseconds, and signals travel only 3mm in that time, so the CPU core would need to be about 0.3mm in size. It would be very difficult to cram a CPU core into such a small space, so somewhere between 1GHz and 100GHz there is a physical barrier. And as smaller and smaller transistors are manufactured, there may soon be another physical limit: the number of electrons per transistor will approach one, bringing the reign of the electron to a close.
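The arithmetic behind that barrier is easy to check. The short Python sketch below reproduces the back-of-the-envelope bound; the factor-of-ten 'much smaller than' margin is the same illustrative assumption the text uses (30cm down to 3cm):

```python
# Light-speed limit on 'lumped' circuit size at a given clock speed.
C = 3e8  # speed of light in metres per second (rounded)

for f_ghz in (1, 10, 100):
    cycle_s = 1 / (f_ghz * 1e9)        # duration of one clock cycle
    travel_cm = C * cycle_s * 100      # distance light covers per cycle, in cm
    max_size_cm = travel_cm / 10       # 'much smaller': take a tenth as margin
    print(f"{f_ghz:3d} GHz: light travels {travel_cm:.2f} cm per cycle, "
          f"so the circuit should be under ~{max_size_cm:.3f} cm")
```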

3. The benefits and capabilities of quantum computing in theory are:


  1. Factor large integers in a time that is exponentially faster than any known classical algorithm (a toy sketch of the number theory behind this claim follows the list).

  2. Run simulations of quantum mechanics.

  3. Break encrypted secret messages in seconds that classical computers cannot crack in a million years.

  4. Create unbreakable encryption systems to shield national security systems, financial transactions, secure Internet transactions and other systems based on present day encryption schemes.

  5. Advance cryptography to where messages can be sent and retrieved without encryption and without eavesdropping.

  6. Explore large and unsorted databases that had previously been virtually impenetrable using classical computers.

  7. Improve pharmaceutical research because a quantum computer can sift through many chemical substances and interactions in seconds.

  8. Create fraud-proof digital signatures.

  9. Predict weather patterns and identify causes of global warming.

  10. Improve the precision of atomic clocks and precisely pinpoint the locations of the 7,000-plus satellites orbiting Earth.

  11. Optimize spacecraft design.

  12. Enhance space network communication scheduling.

  13. Develop highly efficient algorithms for several related application domains such as scheduling, planning, pattern recognition and data compression.
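As promised under item 1, here is a toy Python sketch of the number theory that Shor's algorithm exploits. Everything quantum is replaced by a brute-force search for the order r, so this version runs in exponential time; a quantum computer's single, decisive contribution is finding r efficiently (via the quantum Fourier transform). The function names are illustrative only.

```python
from math import gcd
from random import randrange

def order(a, n):
    """Smallest r > 0 with a**r % n == 1. Brute force here; this is
    exactly the step a quantum computer performs exponentially faster."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_skeleton(n):
    """Factor n (an odd composite that is not a prime power, e.g. 15)
    using the classical reduction at the heart of Shor's 1994 algorithm."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d                 # lucky: a already shares a factor
        r = order(a, n)
        if r % 2:
            continue                 # need an even order; try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                 # trivial square root; try again
        return gcd(y - 1, n)         # a non-trivial factor of n

print(shor_classical_skeleton(15))   # prints 3 or 5
```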

4. Risks

The risks are:

  1. Cripple national security, defences, the Internet, email systems and other systems based on encryption schemes.

  2. Decode secret messages sent out by government employees in seconds versus the millions of years it would take a classical computer.

  3. Break many of the cryptographic systems (e.g., RSA, DSS, LUC, Diffie-Hellman) used to protect secure Web pages, encrypted mail and many other types of data.

  4. Access bank accounts, credit card transactions, stock trades and classified information.

  5. Break cryptographic systems such as public key ciphers or other systems used to protect secure Web pages and email on the Internet.

5. History of Quantum Computing

The idea of quantum computing was first explored in the 1970s and early 1980s by physicists and computer scientists such as Charles H. Bennett of the IBM Thomas J. Watson Research Center, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of the University of Oxford, and the late Richard P. Feynman of the California Institute of Technology (Caltech). The idea emerged as scientists were debating the fundamental limits of computation. They realized that if technology continued to follow Moore's Law, the continually shrinking circuitry packed onto silicon chips would reach a point where individual elements were no larger than a few atoms. At the atomic scale, however, the physical laws that govern the behaviour and properties of a circuit are inherently quantum mechanical in nature, not classical. This raised the question of whether a new type of computer could be built on the principles of quantum physics.

Feynman was the first to provide an answer, producing an abstract model in 1982 that demonstrated how a quantum system could be used for computation. He also explained how such a machine could act as a simulator for quantum physics: a physicist could conduct experiments in quantum physics on a quantum-mechanical computer.

In 1985, Deutsch realized that Feynman's claim could lead to a general-purpose quantum computer, and published a crucial theoretical paper showing that any physical process could, in principle, be modelled perfectly by a quantum computer. A quantum computer would therefore have capabilities far beyond those of any traditional classical computer. Immediately after Deutsch's publication, the search for applications began.

Unfortunately, all that could be found were a few rather contrived mathematical problems, until 1994, when Shor circulated a preprint of a paper in which he set out a method for using quantum computers to crack an important problem in number theory: factorization. He showed how an ensemble of mathematical operations, designed specifically for a quantum computer, could be organized to enable such a machine to factor huge numbers extremely rapidly, much faster than is possible on conventional computers. With this breakthrough, quantum computing was transformed from a mere academic curiosity into a matter of national and worldwide interest.

6. Conclusion & Future Outlook

Right now, quantum computers and quantum information technology are still in a pioneering stage, and the obstacles being overcome will provide the knowledge needed to make quantum computers the fastest computational machines in existence. Progress has not been without problems, but the field is nearing a stage where researchers may have the tools required to build a computer robust enough to withstand the effects of decoherence. There is still every reason for hope about quantum hardware: progress so far suggests it is only a matter of time before a physical and practical breakthrough allows Shor's and other quantum algorithms to be tested. Such a breakthrough could permanently displace today's computers. Although quantum computation has its origin in highly specialized fields of theoretical physics, its future undoubtedly lies in the profound effect it will have in shaping and improving the lives of mankind.

References:

1. D. Deutsch, Proc. Roy. Soc. London, Ser. A 400, 97 (1985).

2. R. P. Feynman, Int. J. Theor. Phys. 21, 467 (1982).

3. J. Preskill, 'Battling Decoherence: The Fault-Tolerant Quantum Computer,' Physics Today, June (1999).

4. P. W. Shor, in Proceedings of the 35th Annual Symposium on the Foundations of Computer Science, edited by S. Goldwasser (IEEE Computer Society Press, Los Alamitos, CA), p. 124 (1994).

5. A. Barenco, D. Deutsch, A. Ekert and R. Jozsa, Phys. Rev. Lett. 74, 4083 (1995).

6. Article by Yasar Safkan, Ph.D., Software Engineer, Noktalar A.S., Istanbul, Turkey.


About the Author

Dele Oluwole graduated in England with a master's degree in computing. He began his IT career as a software test engineer (ISEB certified) with Argos Retail Group, UK. Dele has been a consultant in software testing and project management for multinational organisations including LG Electronics, Virgin Mobile, and T-Mobile Telecoms. He has also worked for the NHS, Waitrose, Ardentia, and others. His career in IT has revolved around software integration and project management. Dele is also an avid sportsman and a member of the prestigious British Computer Society (BCS), where he was awarded full membership in 2006.

Monday, August 1, 2011

Quantum Computing Technology Australia : Hello world !!

Welcome to Quantum Computing Technology Australia.




On this website, you can find information and knowledge about quantum computing technology!