Nanocomputers and quantum computing

Nanocomputer is the logical name for a computer smaller than the microcomputer, which in turn is smaller than the minicomputer. (The minicomputer is called “mini” because it was a lot smaller than the original mainframe computers.) More technically, a nanocomputer is a computer whose fundamental parts are no bigger than a few nanometers. For comparison, the smallest part of current state-of-the-art microprocessors measures 28 nm as of March 24, 2012. No commercially available computers are marketed as nanocomputers at this date, but the term is used in science and science fiction.

There are several ways nanocomputers might be built, using mechanical, electronic, biochemical, or quantum technology. It was argued in 2007 that it is unlikely that nanocomputers will be made out of semiconductor transistors (the microelectronic components at the core of all modern electronic devices), as they seem to perform significantly less well when shrunk to sizes under 100 nanometers; however, as of 2011 it was projected that 22 nm lithography devices would ship before 2012.

Nanocomputing is based on nanotechnology and is, in fact, one of its major applications. It is the nanoscale components of such a machine that make it efficient to use.

Nano-Computing

The history of computer technology has involved a sequence of changes from gears to relays to valves to transistors to integrated circuits and so on. Today’s techniques can fit logic gates and wires a fraction of a micron wide onto a silicon chip. Soon the parts will become smaller and smaller until they are made up of only a handful of atoms. At this point the laws of classical physics break down and the rules of quantum mechanics take over, so the new quantum technology must replace and/or supplement what we presently have. It will support an entirely new kind of computation with new algorithms based on quantum principles.

Presently our digital computers rely on bits, which, when charged, represent on, true, or 1. When not charged they become off, false, or 0. A register of 3 bits can represent, at a given moment in time, one of eight numbers (000, 001, 010, …, 111). In the quantum realm, an atom (one bit) can be in two states at once according to the laws of quantum physics, so 3 atoms (quantum bits, or qubits) can represent all eight numbers at the same time. So for x qubits, 2^x numbers can be stored. Parallel processing can take place on the 2^x input numbers, performing the same task that a classical computer would have to repeat 2^x times or use 2^x processors working in parallel. In other words, a quantum computer offers an enormous gain in the use of computational resources such as time and memory. This becomes mind-boggling when you think of what 32 qubits can accomplish.
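
To make that 2^x counting concrete, here is a minimal sketch (plain NumPy on an ordinary computer, not a real quantum device; the gate choices are only for illustration) that simulates a 3-qubit register as a vector of 2^3 = 8 amplitudes and puts it into an equal superposition of all eight numbers:

    import numpy as np

    # A minimal sketch: classically simulating an n-qubit register as a vector
    # of 2**n complex amplitudes (here n = 3, giving 8 basis states |000>..|111>).
    n = 3
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0                      # start in |000>

    # The single-qubit Hadamard gate turns 0 into an equal mix of 0 and 1.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    # Apply H to every qubit: the register becomes an equal superposition
    # of all 2**n basis states at once.
    for qubit in range(n):
        ops = [H if q == qubit else np.eye(2) for q in range(n)]
        full = ops[0]
        for op in ops[1:]:
            full = np.kron(full, op)
        state = full @ state

    # Each of the eight numbers 000..111 now carries amplitude 1/sqrt(8).
    for index, amplitude in enumerate(state):
        print(f"|{index:03b}>  amplitude = {amplitude.real:+.4f}")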

This may all sound like just another incremental technological step. Classical computers can do the same computations as quantum computers; they only need more time and more memory. The catch is that they need exponentially more time and memory to match the power of a quantum computer, and an exponential increase means available time and memory run out very quickly.

Quantum computers can be programmed in a qualitatively new way using new algorithms. For example, we can construct quantum algorithms that turn difficult mathematical problems, such as factorization, into easy ones. The difficulty of factoring large numbers is the basis for the security of many common methods of encryption. RSA, the most popular public-key cryptosystem used to protect electronic bank accounts, gets its security from the difficulty of factoring very large numbers. Breaking it was one of the first potential uses proposed for a quantum computer.
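
As a rough illustration of that link (the tiny numbers below are purely for demonstration and have nothing to do with real key sizes), anyone who can factor the public modulus n into p and q can reconstruct the RSA private key:

    # Toy RSA with insecure, made-up numbers: knowing the factors p and q of the
    # public modulus n is enough to rebuild the private exponent d.
    p, q = 61, 53                # secret primes; real RSA uses primes hundreds of digits long
    n = p * q                    # public modulus (3233)
    e = 17                       # public exponent
    phi = (p - 1) * (q - 1)      # Euler's totient, computable only if you can factor n
    d = pow(e, -1, phi)          # private exponent: modular inverse of e mod phi (Python 3.8+)

    message = 42
    ciphertext = pow(message, e, n)      # encrypt with the public key (n, e)
    recovered = pow(ciphertext, d, n)    # decrypt with the reconstructed private key
    print(f"n = {n}, d = {d}, decrypted message = {recovered}")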

“Experimental and theoretical research in quantum computation is accelerating world-wide. New technologies for realising quantum computers are being proposed, and new types of quantum computation with various advantages over classical computation are continually being discovered and analysed and we believe some of them will bear technological fruit. From a fundamental standpoint, however, it does not matter how useful quantum computation turns out to be, nor does it matter whether we build the first quantum computer tomorrow, next year or centuries from now. The quantum theory of computation must in any case be an integral part of the world view of anyone who seeks a fundamental understanding of the quantum theory and the processing of information.” (Center for Quantum Computation)

In 1995 a $100 bet was made to create the seemingly impossible within 16 years: the world’s first nanometer supercomputer. This resulted in the NanoComputer Dream Team, which used the Internet to gather talent, amateur and professional, from every scientific field and from all over the world. Their deadline: November 1, 2011.

Quantum Computers

The massive amount of processing power generated by computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, American computer engineer Howard Aiken said that just six electronic digital computers would satisfy the computing needs of the United States. Others have made similar errant predictions about the amount of computing power that would support our growing technological needs. Of course, Aiken didn’t count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet, which have only fueled our need for more, more and more computing power.

If, as Moore’s Law states, the number of transistors on a microprocessor continues to double every 18 months, then around 2020 or 2030 the circuits on a microprocessor will be measured on an atomic scale. The logical next step will be to create quantum computers, which will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers have the potential to perform certain calculations significantly faster than any silicon-based computer.
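
The arithmetic behind that extrapolation is easy to sketch. The starting figures below (28 nm features in 2012, roughly half a nanometer between silicon atoms) are illustrative assumptions, but they show how quickly the doubling runs into individual atoms:

    # A back-of-the-envelope sketch of the Moore's Law extrapolation above.
    # Doubling the transistor count every 18 months roughly halves the area of a
    # feature, i.e. shrinks its linear size by a factor of sqrt(2) per step.
    feature_size_nm = 28.0           # smallest feature of a 2012-era microprocessor
    atom_spacing_nm = 0.5            # rough spacing between silicon atoms (assumption)
    year = 2012.0

    while feature_size_nm > atom_spacing_nm:
        feature_size_nm /= 2 ** 0.5  # one doubling of transistor count...
        year += 1.5                  # ...every 18 months

    print(f"features reach atomic scale around {year:.0f} ({feature_size_nm:.2f} nm)")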

Scientists have already built basic quantum computers that can perform certain calculations; but a practical quantum computer is still years away. In this article, you’ll learn what a quantum computer is and just what it’ll be used for in the next era of computing.

You don’t have to go back too far to find the origins of quantum computing. While computers have been around for the majority of the 20th century, quantum computing was first theorized less than 30 years ago, by a physicist at the Argonne National Laboratory. Paul Benioff is credited with first applying quantum theory to computers in 1981, when he theorized about creating a quantum Turing machine. Most digital computers, like the one you are using to read this article, are based on the Turing theory.

The Turing machine, developed by Alan Turing in the 1930s, is a theoretical device that consists of a tape of unlimited length divided into little squares. Each square can either hold a symbol (1 or 0) or be left blank. A read-write head reads these symbols and blanks, which gives the machine its instructions for performing a certain program. Does this sound familiar? Well, in a quantum Turing machine the difference is that the tape exists in a quantum state, as does the read-write head. This means that the symbols on the tape can be either 0 or 1 or a superposition of 0 and 1; in other words, the symbols are both 0 and 1 (and all points in between) at the same time. While a normal Turing machine can only perform one calculation at a time, a quantum Turing machine can perform many calculations at once.
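
For comparison, here is a minimal sketch in Python of the classical machine just described: a tape, a read-write head, and a table of instructions. The bit-inverting program is a made-up example chosen only to show the mechanics:

    # A minimal classical Turing machine: rules map (state, symbol read) to
    # (symbol to write, head movement, next state).
    def run_turing_machine(tape, rules, state="start"):
        cells = dict(enumerate(tape))               # sparse tape: position -> symbol
        head = 0
        while state != "halt":
            symbol = cells.get(head, " ")           # blank squares read as " "
            write, move, state = rules[(state, symbol)]
            cells[head] = write                     # write the new symbol
            head += 1 if move == "R" else -1        # move the head one square
        return "".join(cells[i] for i in sorted(cells) if cells[i] != " ")

    # Example program: walk right along the tape, inverting each bit.
    invert_bits = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", " "): (" ", "R", "halt"),         # first blank square: stop
    }

    print(run_turing_machine("10110", invert_bits))  # prints 01001

A quantum Turing machine would replace each definite symbol on this tape with a superposition of symbols, so every branch of the computation is carried along at once.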

Today’s computers, like a Turing machine, work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren’t limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition. Qubits represent atoms, ions, photons or electrons and their respective control devices that are working together to act as computer memory and a processor. Because a quantum computer can contain these multiple states simultaneously, it has the potential to be millions of times more powerful than today’s most powerful supercomputers.

This superposition of qubits is what gives quantum computers their inherent parallelism. According to physicist David Deutsch, this parallelism allows a quantum computer to work on a million computations at once, while your desktop PC works on one. A 30-qubit quantum computer would equal the processing power of a conventional computer that could run at 10 teraflops (trillions of floating-point operations per second). Today’s typical desktop computers run at speeds measured in gigaflops (billions of floating-point operations per second).
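
The flip side of that parallelism is the cost of imitating it on a conventional machine: a classical simulator has to store one complex amplitude per basis state. A rough sketch of how that storage grows (16 bytes per amplitude is an assumption about the simulator’s number format, not a property of the quantum computer itself):

    # Memory needed to hold the full state vector of an n-qubit register
    # classically, at 16 bytes per complex amplitude (two 64-bit floats).
    for n in (20, 30, 40, 50):
        amplitudes = 2 ** n
        gibibytes = amplitudes * 16 / 2 ** 30
        print(f"{n:>2} qubits: {amplitudes:>20,d} amplitudes ~ {gibibytes:,.3f} GiB")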

Quantum computers also utilize another aspect of quantum mechanics known as entanglement. One problem with the idea of quantum computers is that if you try to look at the subatomic particles, you could bump them, and thereby change their value. If you look at a qubit in superposition to determine its value, the qubit will assume the value of either 0 or 1, but not both (effectively turning your spiffy quantum computer into a mundane digital computer). To make a practical quantum computer, scientists have to devise ways of making measurements indirectly to preserve the system’s integrity. Entanglement provides a potential answer. In quantum physics, if you apply an outside force to two atoms, it can cause them to become entangled, and the second atom can take on the properties of the first atom. So if left alone, an atom will spin in all directions. The instant it is disturbed it chooses one spin, or one value; and at the same time, the second entangled atom will choose an opposite spin, or value. This allows scientists to know the value of the qubits without actually looking at them.
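
A toy simulation of that “opposite value” behaviour (classical NumPy again, just sampling measurement outcomes): the two qubits below are prepared in an entangled state, and every joint measurement comes out with opposite values, even though neither qubit has a definite value beforehand:

    import numpy as np

    # Two qubits in the entangled state (|01> - |10>) / sqrt(2): amplitudes over
    # the four basis states |00>, |01>, |10>, |11>.
    state = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
    probabilities = np.abs(state) ** 2              # Born rule: |amplitude|^2

    rng = np.random.default_rng(seed=0)
    for _ in range(5):
        outcome = rng.choice(4, p=probabilities)    # measure both qubits at once
        first, second = outcome >> 1, outcome & 1   # split into the two qubit values
        print(f"first qubit = {first}, second qubit = {second}  (always opposite)")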

Recent Advancements in Quantum Computing

So-called quantum computers are designed to quickly crunch numbers that would take a person a lifetime or longer—for instance, mapping trillions of amino acids for futuristic drug cures or making sense of the avalanche of public data we create daily. So what can you get by putting one to use for your company, as Lockheed Martin (LMT) has since it bought the world’s first corporate model from D-Wave Systems in 2011? (A few weeks ago, Google (GOOG) bought the second.) The aerospace and security giant has been operating its device at the University of Southern California’s Quantum Computation Center for the past 18 months.

Quantum computing uses the quantum nature of matter, the atoms themselves, as computing devices. Normal computer architecture is based on the bit—represented either as a one or a zero. The quantum computer is programmed so that the input is initially both zero and one.

On the quantum level you’re able to program the atoms to represent all possible input combinations, and to do so simultaneously. That means when you run an algorithm, all possible input combinations are tested at once. With a regular computer you’d have to serially cycle through every possible input combination to arrive at your solution, meaning the most complicated calculations would take longer than the age of the universe to complete.
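
To see what “serially cycle through every possible input combination” means in practice, here is a classical brute-force sketch (the goal function and the 20-bit problem size are invented purely for illustration; real problems have far more inputs):

    from itertools import product

    # A regular computer checks every possible combination of inputs, one at a
    # time, until it finds one that satisfies the goal.
    def goal(bits):
        return bits[0] == 1 and sum(bits) == 15   # a made-up stand-in for a hard problem

    n = 20                                        # 20 binary inputs -> 2**20 combinations
    for attempt, bits in enumerate(product((0, 1), repeat=n), start=1):
        if goal(bits):
            print(f"solution found after {attempt:,} of {2**n:,} possible combinations")
            break

Each extra input doubles the number of combinations, which is why this approach collapses for large problems.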

The appeal is the ability to solve enormously complex problems in a reasonable amount of time. That has intrigued scientists at Lockheed Martin since the inception of quantum computing.

Lockheed Martin started with smaller tasks at first to better understand the capabilities of the machine. But one area of interest is complex systems such as software verification and validation. The development of any large computer system integration initiative involves a lot of software. Validating the performance of that software is vital, but it’s a very time-consuming and often very expensive undertaking. Lockheed Martin has taken the software and cast it as a problem for the quantum computer to address: the machine scans through the switches and combinations in the software code and makes sure that it’s performing in the way its designers expected it to.

Lockheed Martin sees the quantum computer as a machine that frees up time and money. If you can validate and verify software in a single series of tests, then the money and time saved can be used elsewhere; it becomes an innovation enabler. The company considers itself already in the era of quantum computing. The academic question of how quantum the machine is, and how entangled its “qubits” (quantum bits) are, doesn’t really concern it. What it is concerned with is how the machine can help reduce costs, make better systems, and accelerate innovation. Quantum computing is a practical tool for extremely complex predictive analysis and machine learning, where you need to assess many variables and many patterns and test models against them. This is relevant in drug discovery, cybersecurity, business, finance, investment, health care, logistics, and planning. There are a number of business applications—those that involve solving complex optimization problems—that today would be too difficult to address with silicon computing.

Conventional computing is not going to go away. You wouldn’t want to use a quantum computer to balance your checkbook. Quantum computing best addresses those exceedingly complex computational problems—in drug discovery, for example, when you have trillions of combinations of amino acids to cycle through to find that single protein. That’s a job for quantum computing. That’s the power of it, in a nutshell.

Qubit Control

Computer scientists control the microscopic particles that act as qubits in quantum computers by using several kinds of control devices:

  • Ion traps use optical or magnetic fields (or a combination of both) to trap ions.
  • Optical traps use light waves to trap and control particles.
  • Quantum dots are made of semiconductor material and are used to contain and manipulate electrons.
  • Semiconductor impurities contain electrons by using “unwanted” atoms found in semiconductor material.
  • Superconducting circuits allow electrons to flow with almost no resistance at very low temperatures.

Today’s Quantum Computers

Quantum computers could one day replace silicon chips, just like the transistor once replaced the vacuum tube. But for now, the technology required to develop such a quantum computer is beyond our reach. Most research in quantum computing is still very theoretical.

The most advanced quantum computers have not gone beyond manipulating more than 16 qubits, meaning that they are a far cry from practical application. However, the potential remains that quantum computers one day could perform, quickly and easily, calculations that are incredibly time-consuming on conventional computers. Several key advancements have been made in quantum computing in the last few years. Let’s look at a few of the quantum computers that have been developed.

1998

Los Alamos and MIT researchers managed to spread a single qubit across three nuclear spins in each molecule of a liquid solution of alanine (an amino acid used to analyze quantum state decay) or trichloroethylene (a chlorinated hydrocarbon used for quantum error correction) molecules. Spreading out the qubit made it harder to corrupt, allowing researchers to use entanglement to study interactions between states as an indirect method for analyzing the quantum information.

2000

In March, scientists at Los Alamos National Laboratory announced the development of a 7-qubit quantum computer within a single drop of liquid. The quantum computer uses nuclear magnetic resonance (NMR) to manipulate particles in the atomic nuclei of molecules of trans-crotonic acid, a simple fluid consisting of molecules made up of six hydrogen and four carbon atoms. The NMR is used to apply electromagnetic pulses, which force the particles to line up. These particles in positions parallel or counter to the magnetic field allow the quantum computer to mimic the information-encoding of bits in digital computers.

Researchers at IBM-Almaden Research Center developed what they claimed was the most advanced quantum computer to date in August. The 5-qubit quantum computer was designed to allow the nuclei of five fluorine atoms to interact with each other as qubits, be programmed by radio frequency pulses and be detected by NMR instruments similar to those used in hospitals (see How Magnetic Resonance Imaging Works for details). Led by Dr. Isaac Chuang, the IBM team was able to solve in one step a mathematical problem that would take conventional computers repeated cycles. The problem, called order-finding, involves finding the period of a particular function, a typical aspect of many mathematical problems involved in cryptography.

2001

Scientists from IBM and Stanford University successfully demonstrated Shor’s Algorithm on a quantum computer. Shor’s Algorithm is a method for finding the prime factors of numbers (which plays an intrinsic role in cryptography). They used a 7-qubit computer to find the factors of 15. The computer correctly deduced that the prime factors were 3 and 5.
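
The classical half of that calculation is easy to reproduce; what the quantum hardware contributes is the period-finding step in the middle. A sketch in plain Python, using the same example N = 15 (the base a = 7 is chosen only for illustration):

    from math import gcd

    # The number theory behind Shor's Algorithm, with the period found by brute
    # force instead of by a quantum computer.
    N = 15
    a = 7                          # any base coprime to N will do

    # Find the period r: the smallest r > 0 with a**r mod N == 1. This is the
    # step a quantum computer speeds up exponentially.
    r = 1
    while pow(a, r, N) != 1:
        r += 1

    # For even r, gcd(a**(r//2) +/- 1, N) recovers the factors.
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print(f"period r = {r}; factors of {N}: {p} and {q}")   # prints 3 and 5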

2005

The Institute of Quantum Optics and Quantum Information at the University of Innsbruck announced that scientists had created the first qubyte, or series of 8 qubits, using ion traps.

2006

Scientists in Waterloo and Massachusetts devised methods for quantum control on a 12-qubit system. Quantum control becomes more complex as systems employ more qubits.

2007

Canadian startup company D-Wave demonstrated a 16-qubit quantum computer. The computer solved a sudoku puzzle and other pattern-matching problems. The company claimed it would produce practical systems by 2008. Skeptics believe practical quantum computers are still decades away, that the system D-Wave has created isn’t scalable, and that many of the claims on D-Wave’s Web site are simply impossible (or at least impossible to know for certain given our understanding of quantum mechanics).

If functional quantum computers can be built, they will be valuable in factoring large numbers, and therefore extremely useful for decoding and encoding secret information. If one were to be built today, no information on the Internet would be safe. Our current methods of encryption are simple compared to the complicated methods possible in quantum computers. Quantum computers could also be used to search large databases in a fraction of the time that it would take a conventional computer. Other applications could include using quantum computers to study quantum mechanics, or even to design other quantum computers.

But quantum computing is still in its early stages of development, and many computer scientists believe the technology needed to create a practical quantum computer is years away. Quantum computers must have at least several dozen qubits to be able to solve real-world problems, and thus serve as a viable computing method.
