Quantum Computing

Abstract

With scientific breakthroughs on both the algorithmic and physical sides, quantum computing, a new interdisciplinary field combining quantum mechanics and computer science, has become an increasingly sought-after and hotly debated topic in recent years. This essay aims to give a brief overall introduction to quantum computing that is simple but still conveys the essential ideas and principles to general engineers who have some basic knowledge of linear algebra. Five primary parts of quantum computing are covered in order. First of all, three reasons for developing quantum computing will be illustrated in detail: its remarkable performance on certain computational tasks, its efficiency in simulating other quantum systems, and the physical limits to classical computing. After that, three basic concepts, namely the quantum bit, the quantum gate and the quantum computing process, will be explained by comparison with their classical counterparts. The differences between them, such as superposition and probabilistic measurement, will be the focus, because they are the source of the unique properties of quantum computing. In the third part, Shor's factoring algorithm will be presented as an example demonstrating this prominent computational power. The fourth part considers how to put the theory into practice, namely the quantum computer itself. In the final part, the rapid development and tremendous potential of quantum computing, both in physical research and in the market, will be shown to promise a bright future.
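Since the abstract leans on superposition and probabilistic measurement as the key departures from classical bits, a minimal sketch may help fix the picture for readers comfortable with linear algebra. This is an illustrative example, not part of the essay's own material: it assumes NumPy and the standard state-vector convention, where a qubit is a unit vector a|0> + b|1> and measurement yields 0 or 1 with probabilities |a|^2 and |b|^2 (the Born rule).

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit in an equal superposition of |0> and |1>, as a unit 2-vector.
state = np.array([1, 1]) / np.sqrt(2)

# Born rule: measurement outcome probabilities are the squared amplitudes.
probs = np.abs(state) ** 2

# Simulate 1000 measurements; each collapses the state to 0 or 1.
samples = rng.choice([0, 1], size=1000, p=probs)
print("P(0), P(1) =", probs)                       # [0.5, 0.5]
print("observed frequency of 1:", samples.mean())  # close to 0.5
```

The point of the sketch is only the contrast the abstract draws: a classical bit holds one definite value, while a qubit's state is a vector whose amplitudes fix the statistics of what is observed.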

Introduction

Combining quantum mechanics and computer science, quantum computing is an interdisciplinary and cutting-edge science that has forged ahead over the past three decades and is forecast to usher in the next scientific and technological revolution in the computer industry [1]. The concept of quantum computing came from the work of Richard Feynman [2] in 1982, when he attempted to simulate a physical quantum system on a classical computer and ran into the physical limits of classical computing. However, at that point, the lack of any idea of how to use quantum effects to speed up computation brought the field to a standstill, until a quantum algorithm developed by Peter Shor in 1994 proved that quantum computing can be exponentially faster than classical computing on certain problems [3]. It is these facts that have facilitated the development of quantum computing. In this essay, the reasons for pursuing quantum computing will first be discussed in more detail. In the second section, three basics of quantum computing will be clarified. In line with them, the third section will demonstrate the power of quantum computing through an example algorithm. Following that, how to put the theory into practice, namely the quantum computer, will be considered. Finally, some fascinating future perspectives and potentials of quantum computing will be depicted.

1. Why quantum computing
1.1 Moore's Law Has Physical Limits

The physical limit to classical computing is a critical negative incentive for quantum computing. In the 1960s, an astounding prediction was made by the Intel co-founder Gordon Moore [4]: based on his observations, he anticipated that the density of transistors on an integrated circuit would grow exponentially, doubling every year. Though in practice the number of transistors on a chip has doubled only about every 18 months, the prediction has proved broadly correct, and classical computing technology has kept improving at an exponential pace. Hence, it is an inescapable fact that such components will eventually be miniaturized to the atomic scale, which is dominated by bizarre quantum effects [5]. To...
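The argument that exponential shrinking must hit the atomic scale can be made concrete with a back-of-the-envelope extrapolation. The sketch below is illustrative only; the starting feature size, the atomic diameter, and the doubling period are assumed numbers, not figures from the essay. It uses the geometric fact that if transistor counts double on a fixed die area, the area per transistor halves, so its linear size shrinks by a factor of sqrt(2) each period.

```python
import math

feature_nm = 90.0         # assumed starting feature size in nm (roughly a 2003-era process)
atom_nm = 0.2             # rough diameter of a silicon atom in nm (assumption)
months_per_doubling = 18  # doubling period from Moore's observation

periods = 0
while feature_nm > atom_nm:
    feature_nm /= math.sqrt(2)  # linear dimension shrinks by sqrt(2) per doubling
    periods += 1

years = periods * months_per_doubling / 12
print(f"~{periods} doublings (~{years:.0f} years) until features reach atomic scale")
```

Under these assumptions the loop terminates after roughly 18 doublings, i.e. on the order of a few decades, which is the sense in which the collision with quantum effects is "inescapable" rather than merely possible.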
