
Quantum Computing

Theme - INFORMATION TECHNOLOGY

Sub Theme - Others

Nikhil C. Lagali
M.C.A. Dept., ASM's IBMR College, Chinchwad
analyse300@gmail.com

Vishal R. Kulkarni
M.C.A. Dept., ASM's IBMR College, Chinchwad
vishal.kulkarni8078@gmail.com

Abstract - The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting, and processing information encoded in systems that show unique quantum properties? Today the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would improve computational power for particular tasks. Quantum computers are different from traditional computers, which are based on transistors.

The basic principle behind quantum computation is that quantum properties can be used to represent data and perform operations on these data. Experiments have been carried out in which quantum computational operations were executed on a very small number of qubits (quantum bits). Both practical and theoretical research continues, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes. We intend to shed light on certain aspects of quantum computing, as this subject is still in the experimental stages.
Intro
Quantum theory applies to the behavior of matter at small - atomic - scales. Thus, quantum computing deals with utilizing subatomic particles such as electrons instead of silicon chips.
As most of us are aware, nothing is certain at the quantum level, which is why it is, not unjustly, called Alice-in-Wonderland-style science.

Quantum computing is primarily based on a concept known as quantum superposition, whose effects are harnessed through quantum interference. A qubit can exist in a superposition of two states: not a 0, not a 1, but a 0 AND a 1 at the same time.
Consider a regular computer register with three bits. A bit is a
binary digit, the fundamental level of computer information. At
any given time, the register can take one of eight different
states, from 000, 001, 010, 011 and so on up to 111. On the
other hand, a quantum computer with three qubits or quantum
bits, can at a given instant store all eight states due to
superposition.
This is not to say that regular computers cannot perform the tasks quantum computers can; in principle they can. They just need a vastly greater amount of time and space.
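The contrast between the two registers can be sketched numerically. Below is a minimal Python/NumPy illustration (simple amplitude bookkeeping, not a real quantum simulation) of how a 3-qubit equal superposition assigns weight to all eight basis states at once:

```python
import numpy as np

# A 3-bit classical register holds exactly one of these values at a time:
classical_states = [format(i, "03b") for i in range(8)]  # '000' ... '111'

# A 3-qubit register is described by 8 complex amplitudes, one per basis
# state; the equal superposition gives every state amplitude 1/sqrt(8):
amplitudes = np.full(8, 1 / np.sqrt(8))

# Measurement collapses the register; outcome i occurs with |amplitude|^2:
probabilities = np.abs(amplitudes) ** 2
for label, p in zip(classical_states, probabilities):
    print(label, p)  # every basis state has probability 1/8 = 0.125
```

Note that describing an n-qubit register classically takes 2^n amplitudes, which is exactly why simulating quantum computers on classical hardware becomes intractable as n grows.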
Advantages

The exponential speedup offered by quantum computing, based on the ability to process quantum information through gate manipulation, has led to several quantum algorithms with substantial advantages over the best known classical algorithms.
The most significant is Shor's algorithm for factoring the product of two large primes in polynomial time. Additional algorithms include Grover's fast database search; adiabatic solution of optimization problems; precise clock synchronization; quantum key distribution; and, more recently, algorithms for Gauss sums and Pell's equation.
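The mechanics of Grover's search can be illustrated with a toy classical simulation of the amplitudes. In the NumPy sketch below (the marked index 5 and the 8-entry "database" are our illustrative choices), each iteration performs the oracle's sign flip followed by the inversion about the mean, amplifying the marked entry in roughly √N iterations:

```python
import numpy as np

n_qubits = 3
N = 2 ** n_qubits          # 8 "database" entries
marked = 5                 # the index the oracle recognizes (arbitrary choice)

# Start in the uniform superposition: every entry has amplitude 1/sqrt(N)
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.round(np.pi / 4 * np.sqrt(N)))  # optimal count, ~2 for N = 8
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip the marked amplitude's sign
    state = 2 * state.mean() - state     # diffusion: inversion about the mean

probabilities = state ** 2
print(int(np.argmax(probabilities)))     # the marked index now dominates
```

A classical exhaustive search needs on the order of N oracle queries; Grover's algorithm needs only about √N, which is the quadratic speedup the text refers to.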
Reduction in size - Despite significant practical difficulties, small quantum devices of 5 to 7 qubits have been built in the laboratory. 100-qubit devices are on the drawing board, and future technologies promise even greater scalability.

Security - The security of many modern cryptosystems relies on the seeming intractability of factoring the product of two large primes. In particular, it is easy to show that an efficient algorithm for factoring implies that the widely used RSA public-key system becomes efficiently breakable. While the best known factoring algorithms for a classical computer run in super-polynomial time, Shor's algorithm can factor an n-bit integer in O(n³) time (with additional polylogarithmic overhead when error correction is included). In 1999, researchers using the Number Field Sieve succeeded in factoring a 512-bit product of two primes (RSA-155) in about 8400 MIPS-years. A 1024-bit product (RSA-309) would take approximately 1.6 billion times more computation!
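The asymmetry this security argument rests on - multiplying is cheap, factoring is not - can be felt even at toy scale. This Python sketch (using two small illustrative primes; real RSA moduli are hundreds of digits) multiplies in a single step but needs on the order of √n trial divisions to recover the factors:

```python
import math

p, q = 1_000_003, 1_000_033   # two small primes (real RSA uses ~300-digit ones)
n = p * q                     # multiplication: one fast operation

def factor(n):
    """Naive trial division: ~sqrt(n) candidate divisors in the worst case."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None               # n is prime

print(factor(n))              # recovers (p, q) after ~a million divisions
```

Trial division is of course not the state of the art, but the qualitative picture holds for all known classical methods: for a 1,000-digit n this kind of search is utterly hopeless, while Shor's algorithm would remain polynomial.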
Disadvantages
1. Reliable and realistic implementation technology: There are multiple approaches from very diverse
fields of science for the realization of a full-scale quantum information processor. Solid state technologies,
trapped ions, and superconducting quantum computation are just a small number of many physical
implementations currently being studied. No matter the choice, any technology used to implement a quantum
information processor must adhere to four main requirements: 1) It must allow the initialization of an arbitrary
N-qubit quantum system to a known state. 2) A universal set of quantum operations must be available to
manipulate the initialized system and bring it to a desired correlated state. 3) The technology must have the
ability to reliably measure the quantum system. 4) It must allow much longer qubit lifetimes than the time of a
quantum logic gate. The second requirement encompasses multi-qubit operations; thus, it implies that a
quantum architecture must also allow for sufficient and reliable communication between physical qubits.

2. Robust error correction and fault tolerant structures: Due to the high volatility of quantum data,
actively stabilizing the system's state through error correction will be one of the most vital operations through
the course of a quantum algorithm. Fault tolerance and quantum error correction constitute a significant field
of research that has produced some very powerful quantum error-correcting codes that are analogous to, but
fundamentally different from, their classical counterparts. The most important result, for our purposes, is the
Threshold Theorem, which says that an arbitrarily reliable quantum gate can be implemented using only
imperfect gates, provided the imperfect gates have failure probability below a certain threshold. This
remarkable result is achieved through four steps: using quantum error-correction codes; performing all
computations on encoded data; using fault tolerant procedures; and recursively encoding until the desired
reliability is obtained. A successful architecture must be carefully designed to minimize the overhead of
recursive error correction and be able to accommodate some of the most efficient error correcting codes.
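The intuition behind the Threshold Theorem - redundancy plus majority recovery beats a noisy channel once the physical error rate is low enough - can be sketched with the classical 3-bit repetition code, the simplest ancestor of the quantum bit-flip code (the quantum versions differ fundamentally, as noted above, since qubits cannot be copied and errors are continuous):

```python
import random

def encode(bit):
    """Triple redundancy: the classical ancestor of the quantum bit-flip code."""
    return [bit, bit, bit]

def noisy(bits, p_flip=0.1):
    """Each physical bit flips independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote: recovers the logical bit when at most one bit flipped."""
    return 1 if sum(bits) >= 2 else 0

random.seed(1)
trials = 20_000
raw_errors = sum(noisy([0])[0] for _ in range(trials))
coded_errors = sum(decode(noisy(encode(0))) for _ in range(trials))
# Encoding wins whenever 3p^2 - 2p^3 < p, i.e. below a "threshold" error rate:
print(raw_errors / trials, coded_errors / trials)   # ~0.10 vs ~0.03
```

Recursive encoding, as described above, applies the same idea to the encoded bits again, driving the logical error rate down exponentially while the resource overhead grows only polynomially.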

3. Efficient quantum resource distribution: The quantum no-cloning theorem (i.e. the inability to copy
quantum data) prevents the ability to place quantum information on a wire, duplicate, and transmit it to
another location. Each qubit must be physically transported from the source to the destination. This makes
each qubit a physical transmitter of quantum information, a restriction which places great constraints on
quantum data distribution. Particularly troublesome is moving the qubits over large distances where it must be
constantly ensured the data is safe from corruption. One method is to repeatedly error correct along the
channel at a cost of additional error-correction resources. Another solution is to use a purely quantum primitive, teleportation, to implement a long-range wire; this has been experimentally demonstrated on a very small scale. Teleportation transmits a quantum state between two points without actually sending any quantum data; instead, it consumes a pre-shared entangled pair and two bits of classical information for each teleported qubit.
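The protocol can be checked with a small state-vector simulation. In the NumPy sketch below (the gate helpers, qubit ordering, and example amplitudes 0.6 and 0.8 are our illustrative choices), qubit 0 holds the state to teleport, qubits 1 and 2 share a Bell pair, and for every one of the four possible measurement outcomes the two classical bits select corrections that restore the original state on qubit 2:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
I2 = np.eye(2)

def apply(gate, qubit, state):
    """Apply a 1-qubit gate to one qubit of a 3-qubit state (qubit 0 = leftmost)."""
    ops = [I2, I2, I2]
    ops[qubit] = gate
    return np.kron(np.kron(ops[0], ops[1]), ops[2]) @ state

def cnot(control, target, state):
    """CNOT as a permutation of the 8 basis amplitudes."""
    out = np.zeros_like(state)
    for i in range(8):
        bits = [(i >> (2 - q)) & 1 for q in range(3)]
        if bits[control]:
            bits[target] ^= 1
        out[bits[0] * 4 + bits[1] * 2 + bits[2]] = state[i]
    return out

# State to teleport on qubit 0: a|0> + b|1>  (example amplitudes)
a, b = 0.6, 0.8
state = np.zeros(8)
state[0b000], state[0b100] = a, b

state = apply(H, 1, state)     # Bell pair on qubits 1 (sender) and 2 (receiver)
state = cnot(1, 2, state)
state = cnot(0, 1, state)      # sender's Bell-measurement circuit...
state = apply(H, 0, state)     # ...before reading out qubits 0 and 1

ok = []
for m0 in (0, 1):              # every possible pair of classical bits
    for m1 in (0, 1):
        bob = np.array([state[m0 * 4 + m1 * 2 + q2] for q2 in (0, 1)])
        bob = bob / np.linalg.norm(bob)
        if m1:
            bob = X @ bob      # corrections selected by the two classical bits
        if m0:
            bob = Z @ bob
        ok.append(np.allclose(bob, [a, b]))
print(all(ok))                 # True: qubit 2 always ends as a|0> + b|1>
```

No quantum data crosses the channel: only the two measurement bits do, and without them the receiver's qubit is in a maximally mixed, useless state, which is also why teleportation cannot transmit information faster than light.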
Use and scope

The main area that quantum computers will directly influence is cryptography, or security. Cryptography is based on the fact that, in mathematics, certain computations are far easier to perform in one direction than in the other. Take factorization. It is very easy to multiply two numbers. Agreed, as the numbers get larger, so does the complexity of the multiplication, but even then it is both possible and computable: there is a definite algorithm.

On the other hand, if you have a large number, the algorithm to factorize it gets increasingly complex. In fact, the largest number factorized as a public mathematical challenge at the time comprised 129 digits - and even that took 8 months and around 1,600 machines. More complex factorizations can be shown to take longer than the age of the universe to complete, primarily because of the vast number of iterations the computer has to run through to obtain the right factors.

Let's consider a simple example. A classical computer would be expected to take about 10 million billion billion years to factorize a 1,000-digit number; a quantum computer would take around 20 minutes. So while quantum computers may prove to be the racehorses of the computer industry, cryptography as we know it today would cease to exist, and the development of new algorithms will be required.
References
[Deutsch85] D. Deutsch, "Quantum Computational Networks", Proc. R. Soc. Lond. A400, pp. 97-117, 1985.

[Shor94] P. W. Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer", 35th Annual Symposium on Foundations of Computer Science, pp. 124-134, 1994.

[Grover96] L. Grover, "A Fast Quantum Mechanical Algorithm for Database Search", Symposium on Theory of Computing - STOC '96, pp. 212-219, 1996.

[Childs2002] A. M. Childs, E. Farhi, and J. Preskill, "Robustness of Adiabatic Quantum Computation", Phys. Rev. A65, 2002, quant-ph/0108048.
