Over the past half century, computers have revolutionized the world around us. The word originally referred to people: mathematicians and clerks who spent hours, days, months, or even years performing calculations. Since computing numbers was their job, they were called "computers." A modern computer, by contrast, is simply this: a machine that can take a set of instructions, written in any of a variety of coding systems, and carry out the computations those instructions describe.
Computer technology began with huge rooms full of machines, in places like NASA, that had significantly less computational power than a modern PlayStation 3. Nowadays, as most know, we find computers embedded in our daily lives, from the smartphones we all treasure to devices such as GPS units and ATMs. Each takes a set of instructions through a user interface, which handles data entry and display; an operating system, which coordinates the hardware; the hardware itself, which supplies memory and computational power; and software, the programs whose instructions tell the computer what to do. Together they return a solution to the function you wish to perform. At an ATM, that function might be a cash withdrawal, a deposit, or simply a statement showing your account balance. Smartphones and laptops are a little more complex: the user gives the computer a series of instructions, and one by one the computer returns the results the user wishes to receive.

Over the past 50 years, the computational power of computers has exploded. Moore's law, named after Intel co-founder Gordon E. Moore, holds that the number of transistors on a chip, and with it computational power, doubles roughly every two years; some cite it as an 18-month cycle. Moore proposed the idea in a 1965 paper on integrated circuits. The law is commonly extended to the amount of memory, processing power, and speed at which a computer can operate. It even tracks the number of pixels a digital picture can have, a progression we can see in the ever brighter, sharper images of our high-definition televisions. In the end, the race was on to see how many transistors one could fit on a single circuit to increase the overall power of a system, be it a cell phone, laptop, or tablet. I read last year that researchers had built a silicon transistor just a single atom across, measuring about 1.5 nanometers. According to leading experts in computer technology, though, this doubling of computational power and memory will soon slow: from around 2013, classical computing power may only double every three years or so.
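The arithmetic behind that doubling is easy to check for yourself. The sketch below projects transistor counts under Moore's law; the starting point (the Intel 4004 of 1971, with roughly 2,300 transistors) and the clean two-year cycle are simplifying assumptions for illustration only.

```python
def projected_transistors(start_year, start_count, target_year, doubling_years=2):
    """Project a transistor count assuming one doubling every `doubling_years`."""
    doublings = (target_year - start_year) / doubling_years
    return start_count * 2 ** doublings

# Forty years at one doubling every two years is 2**20 doublings:
# about a million-fold growth, taking ~2,300 transistors to ~2.4 billion.
print(projected_transistors(1971, 2300, 2011))
```

Real chips did not follow the curve exactly, but the exponential shape is why the growth felt so dramatic.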

Now we move on to the future of computational technology: quantum processing. Quantum computers are measured in units called qubits. Binary code, the code used to issue commands to hardware, is a system of 1's and 0's, meaning a bit of information is either there or it is not. Computer engineers use complex algorithms to write the code that drives our classical computational devices. The theory of quantum computing states that a qubit can be found in the state |1>, in the state |0>, or in a superposition of |1> and |0> at the same time, much like a particle being in spin up and spin down simultaneously.
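A superposition like this can be sketched in ordinary code by treating a qubit as a two-component vector of amplitudes. This is a classical simulation for illustration only, not how a physical quantum computer stores its state; the NumPy usage here is an assumption of convenience.

```python
import numpy as np

# A qubit as a 2-component state vector of complex amplitudes:
# |0> = (1, 0) and |1> = (0, 1).
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0
# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- an equal chance of reading 0 or 1
```

Until it is measured, the qubit genuinely carries both amplitudes at once; measurement forces one outcome with the probabilities shown.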
Physicists use the term qubits to describe information on a quantum scale. Quantum computing uses superposed qubits sharing one quantum state: a property such as spin, the particle's intrinsic angular momentum, exists as both spin up and spin down within a single wave function. The goal of a quantum information system is to entangle qubits in a closed and bounded system, so that measurements across the qubits are correlated instantaneously. It is quite difficult to maintain such a system due to thermal interference; therefore many experimental quantum computers are kept underground, using superconductors at temperatures near absolute zero, 0 kelvin. Experimenters use super-cooled helium as a medium to help resolve this thermal interference, which is what destroys the fragile quantum state, an effect referred to as decoherence. In quantum electrodynamics, when electrons, or any quantum particles for that matter, come into close contact with one another, the two particles (or collections of particles) exchange a virtual photon, and each particle's state moves from a higher (excited) energy to a lower one, or vice versa, determined by which particle gives off the photon. The outcome is described by a probabilistic wave function; a classical computer, by analogy, works by determining the probability that bits of information will be exchanged from one atom to the next through the transistors within a given circuit. Since one can only look at an object or particle using light, which takes time to travel, quantum processing holds that particles in a closed and bounded system share their state instantly; but because the observer must use light to see the exchange take place, the wave function appears probabilistic to an outside observer looking in on the system rather than instantaneous. Just last week, I read an article in a scientific journal or magazine (the name is not as important as the advances the experimenters made in quantum processing) reporting that a group of scientists had demonstrated quantum entanglement at room temperature, with over a billion particles held in the same state for one second before the states of the particles collapsed into unlike states and the quantum process fell apart. This was a remarkable achievement, and I truly applaud them for such an advance in the field.
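The entangling step described above can also be sketched classically. The standard recipe, a Hadamard gate followed by a controlled-NOT, produces a Bell state in which the two qubits always measure the same: the simulation below (again an illustrative NumPy sketch, not real quantum hardware) shows that only the outcomes 00 and 11 remain possible.

```python
import numpy as np

# Two qubits as a 4-component state vector over the basis |00>, |01>, |10>, |11>.
state = np.zeros(4, dtype=complex)
state[0] = 1.0  # start in |00>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
# CNOT flips the second qubit whenever the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qubit, then CNOT: the Bell state (|00> + |11>)/sqrt(2).
state = CNOT @ np.kron(H, I) @ state
probs = np.abs(state) ** 2
print(probs.round(3))  # |00> and |11> each at 0.5; |01> and |10> impossible
```

Measuring either qubit instantly tells you the other's result, which is the correlation entanglement provides; it is worth noting, though, that this cannot be used to send a chosen message faster than light.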

So what are the implications of such processing power for the future? First and foremost, once companies and governments are able to harness the full potential of such computational power and memory, a single quantum computer will be able to process the entire contents of the Internet almost instantly. This means that a single computer will be able to process and return every bit of information about everyone and everything on the planet almost instantly. The only type of computer that can break the complex encryption produced by a quantum computer is another quantum computer with equal or greater computational power. This leaves the average person completely exposed to anyone with this type of computer: all your personal information could be instantly obtained and embedded within the memory of the quantum processor, for anyone with access to the machine to see and use. At the same time, we will be able to keep our nation's power grids, Internet capabilities, and satellite operations essentially safe from external hackers, as long as we stay one step ahead of their quantum processors. THE RACE IS ON TO A COMPUTER-DOMINATED WORLD, where 500 qubits of information correspond to a state space of 2^500 values, which is absolutely absurd to think about in terms of memory and processing power. That figure comes from the exponential 2^n: 2 raised to the power of the number of qubits in a closed system.
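The 2^500 figure is easy to appreciate concretely, since Python handles arbitrarily large integers natively:

```python
# The state space of n qubits has 2**n amplitudes. For n = 500 that number
# has about 151 decimal digits -- vastly more than the estimated number of
# atoms in the observable universe (on the order of 10**80).
n = 500
amplitudes = 2 ** n
print(len(str(amplitudes)))  # 151
```

No conceivable classical memory could store that many amplitudes explicitly, which is exactly why quantum hardware is interesting.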
