**Section 2: Technology**

**2.4.1 Quantum Entanglement and Teleportation.**

**Quantum Entanglement: Recent Developments in Teleportation/Entanglement**

**A Brief Introduction To Quantum Mechanics**

To understand the principles of teleportation, one must have a rudimentary grasp of quantum mechanics and quantum field theory. The theory of quantum mechanics began with the work of Kirchhoff in 1859 and later Stefan (1879), both of whom tried to determine the nature of black-body radiation. Their work was extended by a long list of others, most notably Boltzmann, Wien and Rayleigh, and led to two laws for radiation that were mutually exclusive: one predicted the energy quite well at high frequency, the other at low frequency, and each failed in the opposite regime. Planck came forward with the answer, saying in essence that the energy of an oscillator at a given frequency is not continuously variable but restricted to discrete multiples of a constant (Planck's constant).

This is most easily summarised by some of Einstein's most important work of 1905. Einstein noted that the emission of electrons from metals was similarly quantised. He showed empirically that intensity had no effect on electron liberation; only frequency affected it, and only above a threshold level. This, he concluded, was precisely the effect one would expect if light were quantised and discrete.

Quantum teleportation is the transmission and reconstruction
over arbitrary distances of the state of a quantum system, an
effect first suggested by Bennett et al in 1993 (*Phys. Rev.
Lett. 70:1895*). The achievement of the effect depends on the
phenomenon of entanglement, an essential feature of quantum
mechanics. Individually, an entangled particle has properties
(such as momentum) that are indeterminate and undefined until the
particle is measured or otherwise disturbed. Measuring one
entangled particle, however, defines its properties and seems to
influence the properties of its partner or partners
instantaneously, even if they are light years apart: because the
particles are entangled, an interaction with one has an immediate
effect on the other.

Entanglement is not a new concept at all, though Autumn
seminars and the December (1998) publications show a considerable
amount of new work of relevance. Entanglement was first discussed
many years ago, most particularly following the publication in
1935 of the often-quoted Einstein-Podolsky-Rosen
paper (*Physical Review 47:777, 1935*). To begin with, the
discussions were limited to the form of "gedanken" (*thought*)
experiments involving two quantum-mechanical entangled entities.
More recently, however, there have been laboratory constructions
of actual quantum mechanical systems exhibiting such entanglement
phenomena. Essential here is that any purely verbal account of
quantum mechanical phenomena is severely limited by the
constraint that the properties of quantum mechanical systems can
be precisely described only by the equations relevant for those
systems, and all other descriptions usually introduce serious
ambiguities.

Entanglement arises from the wave function equation of quantum mechanics, which has an array of possible function solutions rather than a single function solution, with each possible solution describing a set of possible probabilistic quantum states of the physical system under consideration. Upon fixation of the appropriate boundary conditions, the array of possible solutions collapses into a single solution. For many quantum mechanical physical systems, the fixation of boundary conditions is a theoretical and fundamental consequence of some interaction of the physical system with something outside that system, e.g., an interaction with the measuring device of an observer. In this context, two entities that are described by the same array of possible solutions to the wave function equation are said to be "coherent", and when events decouple these entities, the consequence is said to be "decoherence".
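The collapse of an array of possible solutions into a single one can be sketched numerically. The following is an illustrative toy model, not anything from the experiments discussed here: a normalised two-state amplitude vector is sampled with Born-rule probabilities and "collapses" to a single basis state.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# A two-state quantum system: amplitudes for states |0> and |1>.
# Equal superposition (normalised): each outcome has probability |amp|^2 = 0.5.
state = np.array([1.0, 1.0]) / np.sqrt(2)

def measure(state):
    """Simulate a measurement: the superposition 'collapses' to one
    basis state, chosen with the Born-rule probabilities |amp|^2."""
    probs = np.abs(state) ** 2
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0
    return outcome, collapsed

outcome, collapsed = measure(state)
# After measurement, the array of possible solutions has become a single one.
print("outcome:", outcome, "post-measurement state:", collapsed)
```

The point of the sketch is only the bookkeeping: before the measurement the state carries both possibilities at once; afterwards exactly one survives.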

Serge Haroche (Ecole Normale Superieure Paris) reviews quantum mechanical entanglement, decoherence, and the question of the boundary between the physics of quantum phenomena and the physics of classical phenomena. Haroche makes the following points:

1) In quantum mechanics, a particle can be delocalized (simultaneously occupy various probable positions in space), can be simultaneously in several energy states, and can even have several different identities at once. This apparently "weird" behavior is encoded in the wave function of the particle.

2) Recent decades have witnessed a rash of experiments designed to test whether nature exhibits implausible non-locality. In such experiments, the wave function of a pair of particles flying apart from each other is entangled into a non-separable superposition of states. The quantum formalism asserts that detecting one of the particles has an immediate effect on the other, even if they are very far apart, even far enough apart to be out of interaction range. The experiments clearly demonstrate that the state of one particle is always correlated to the result of the measurement performed on the other particle, and in just the strange way predicted by quantum mechanics.

3) An important question is: Why and how does quantum weirdness disappear (decoherence) in large systems? In the last 15 years, entirely solvable models of decoherence have been presented by various authors (e.g., Leggett, Joos, Omnes, Zeh, Zurek), based on the distinction in large objects between a few relevant macroscopic observables (e.g., position or momentum) and an "environment" described by a huge number of variables, such as the positions and velocities of air molecules, the number of black-body radiation photons, etc. The idea of these models, essentially, is that the environment is "watching" the path followed by the system (i.e., interacting with the system), and thus effectively suppressing interference effects and quantum weirdness; the result of this process is that for macroscopic systems only classical physics obtains.

4) In mesoscopic systems, which are systems between macroscopic and microscopic dimensions, decoherence may occur slowly enough to be observed. Until recently, this could only be imagined in a gedanken experiment, but technological advances have now made such experiments real, and these experiments have opened this field to practical investigation.

Entanglement is unique to quantum mechanics, and
involves a relationship (a "superposition of states")
between the possible quantum states of two entities such that
when the possible states of one entity collapse to a single state
as a result of suddenly imposed boundary conditions, a similar
and related collapse occurs in the possible states of the
entangled entity no matter where or how far away the entangled
entity is located. The most common form is the polarization of
photons. Polarization is essentially a condition in which the
properties of photons are direction dependent, a condition that
can be achieved by passing light through appropriate media.
Bouwmeester et al (*Univ. of Innsbruck*) now report an
experimental demonstration of quantum teleportation involving an
initial photon carrying a polarization that is transferred to one
of a pair of entangled photons, with the polarization-acquiring
photon an arbitrary distance from the initial one. The authors
suggest quantum teleportation will be a critical ingredient for
quantum computation networks.

In June 1999 the act of measuring a photon repeatedly without destroying it was achieved for the first time, enabling researchers to study an individual quantum object with a new level of non-invasiveness. Physicists have long realized that it is possible to perform non-destructive observations of a photon with a difficult-to-execute technique known as a "quantum non-demolition" (QND) measurement. After many years of experimental effort, researchers in France (Serge Haroche, Ecole Normale Superieure) have demonstrated the first QND measurement of a single quantum object, namely a photon bouncing back and forth between a pair of mirrors (a "cavity"). A conventional photodetector measures photons in a destructive manner, by absorbing the photons and converting them into electrical signals. "Eating up" or absorbing photons to study them is not required by fundamental quantum mechanics laws and can be avoided with the QND technique demonstrated by the French researchers. In their technique, a photon in a cavity is probed without absorbing any net energy from it. (Of course, Heisenberg's Indeterminacy Principle ensures that counting a photon still disturbs the "phase" associated with its electric and magnetic fields.) In the experiment, a rubidium atom passes through a cavity. If a photon is present, the atom acquires a phase shift which can easily be detected. Sending additional rubidium atoms through the cavity allowed the researchers to measure the photon repeatedly without destroying it. This technique can allow physicists to study the behavior of a photon during its natural lifespan; it can potentially allow researchers to entangle an arbitrary number of atoms and build quantum logic gates (Nogues et al., *Nature*, 15 July 1999).

**The Problems of Indeterminacy As Pertaining to the
Scanning of Macroscopic Entities.**

The reason photons were so readily used was that there were problems in measuring the spin of subatomic particles. However, it must be remembered that for practical macroscopic purposes the scanning of the human body may not be entirely confined to the subatomic. Regarding the inability to track atoms, it should be remembered that the task has become quite easy, from the days when IBM wrote their name in them to the use of monatomic materials such as those used in the astronautics industry. Also, if we want to transport a person we are not interested in protons and the like; we essentially want to scan molecules and chains thereof. There are very few sections of the body that are affected by subatomic particles. The three main subatomic constituents that would concern us are free radicals, quantum effects in the neurons of the brain, and photons.

Taken one at a time: free radicals would not be an important problem, and their possible loss may not affect any part of the anatomy. The inability to track photons means we would be unable to resolve where any photons go. Research has shown that the eye is responsive to single photons; that does not of course mean a single photon will cause visual stimulation, only that there is a probability that a single photon could cause the brain to perceive light. This might mean that a side effect of transportation is unexpected flashes in the eyes of the person being transported. The firing of certain neurons is a quantum effect, but the neuron itself is quite obviously macroscopic, so we cannot damage the brain simply by being unaware of whether certain neurons will fire; as with photons stimulating the eye, the brain might be stimulated, and thus our person may or may not receive certain signals, sounds or smells.

If one wants to transport a particle and preserve the exact spin configuration, wave form, etc., the act of doing so interferes with the original particle. This is the work that won Heisenberg the Nobel Prize in 1932. Though in the construction of matter we are told that the Principle of Indeterminacy forbids knowledge of the exact location and/or momentum of a particle, due to the quantum probability amplitudes for that particle, there are several points to consider, including whether unmeasured spin exists, and also the examination of the work already conducted throughout the world on teleportation. The initial theories of quantum teleportation are based on a process called entanglement, developed as a result of the Einstein-Podolsky-Rosen theories. Originally, it was postulated that if two particles approached to within a certain distance they could become entangled. If the particles are subsequently separated to any distance, the forces between the entangled particles remain the same as they were in close proximity. If either is then disturbed, the entanglement stops. When we consider the forces between the entangled particles, it is as if space did not exist between them. Furthermore, if the theoretical particles known as singlets are the basic building blocks of matter and their interactions are entangled, it would appear that Einstein's 'spooky action at a distance' may be not spooky but simply the basic natural method of communication.

**Empirical Data on Teleportation.**

The principle of quantum teleportation is not just a theory but
has on many occasions been demonstrated experimentally. The first
method used was the teleportation of photons. Photons possess
spin, but in this case the spin is always in the direction of
propagation and thus is called polarisation. To teleport a
quantum system it is necessary to somehow send all the
information needed to reconstruct the system to the remote
location. But, it might be thought, the Heisenberg uncertainty
principle makes such a measurement impossible. However, the
scheme devised by theorists takes advantage of the previously
mentioned entanglement. If two quantum particles are entangled, a
measurement on one automatically determines the state of the
second - even if the particles are widely separated. Entanglement
describes correlations between quantum systems that are much
stronger than any classical correlation could be. The phenomenon
has been demonstrated for photons more than 10 kilometres apart.
A great deal of work has been done by the Innsbruck team. In
their experiment we can consider that Alice wants to teleport a
photon to Bob. The names are the standard notation for thought
experiments in Quantum Computation. The technique works by
sending one half of an "entangled" light beam to Alice
and the other to Bob. Alice measures the interaction of this beam
with the beam she wants to teleport. She sends that information
to Bob, who uses it to make an identical copy of the beam that
Alice wanted to teleport. The original beam is lost in the
process: it is quite possible to transmit the data as long as we
are prepared to destroy the original. Bob was then able to use
this information and his half of the entangled beam to create an
exact copy of Alice's original beam. Although teleportation relies on
what Einstein once called “spooky action-at-a-
distance” and appears to occur instantaneously the special
theory of relativity remains intact because neither Alice nor Bob
obtain information about the state being teleported. This was
something that, I believe, Einstein himself concluded, even
though he never fully appreciated QED. If one is extending this
discussion to the transporters of the Star Trek universe, the
discussion obviously has to move beyond photons and singlets to
include atoms and ions. Recent work in Paris has made progress
in the macroscopic direction by entangling pairs of
atoms for the first time. Previously, physicists obtained
entangled particles as a by-product of some random or
probabilistic process, such as the production of two correlated
photons, a phenomenon that occasionally occurs when a single
photon passes through a special crystal. Though previously only
two-state quantum systems such as the polarisation of a photon
had been teleported, this new research should allow all quantum
states to be teleported. In their "deterministic
entanglement" process, the researchers trap a pair of
beryllium ions in a magnetic field. Using a predetermined
sequence of laser pulses, they entangle one ion's internal spin
to its external motion, and then entangle the motion to the spin
of the other atom. The group believes that it will be able to
entangle multiple ions with this process. Now E. Hagley et al,
using rubidium atoms prepared in circular Rydberg states (which
means the outer electrons of the atom have been excited to very
high energy states and are far from the nucleus in circular
orbits), have shown quantum mechanical entanglement at the level
of atoms [*Phys. Rev. Lett. 79:1*]. There is talk that
before long quantum mechanical entanglement may be demonstrated
for molecules and perhaps even larger entities. There are
problems with quantum teleportation, though. In the 1960s John
Bell showed that a pair of entangled particles can exhibit
individually random behavior that is too strongly correlated to
be explained by classical statistics. Unfortunately, tests of
Bell inequalities, and the further modifications by other
workers, must allow for the fact that real instruments do not by
any means detect every "particle".
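The protocol described above, with Alice's joint measurement, two classical bits, and Bob's correction, can be sketched as a small state-vector simulation. This is an illustrative toy model in Python, not the Innsbruck experiment itself; the gate bookkeeping and the example amplitudes are my own.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
CNOT = np.array([[1, 0, 0, 0],   # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Qubit 0: the unknown state Alice wants to teleport, a|0> + b|1>.
psi = np.array([0.6, 0.8])

# Qubits 1 (Alice) and 2 (Bob): an entangled Bell pair (|00> + |11>)/sqrt(2).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Full 3-qubit state; basis index = 4*q0 + 2*q1 + q2.
state = np.kron(psi, bell)

# Alice's Bell-basis measurement = CNOT(0->1), then H on qubit 0.
state = np.kron(CNOT, I) @ state
state = np.kron(H, np.kron(I, I)) @ state

# Measure qubits 0 and 1: pick a result (m0, m1) with its quantum probability.
probs = [np.sum(np.abs(state[4*m0 + 2*m1 : 4*m0 + 2*m1 + 2]) ** 2)
         for m0 in (0, 1) for m1 in (0, 1)]
k = rng.choice(4, p=probs)
m0, m1 = divmod(k, 2)

# Bob's qubit, conditioned on Alice's result ...
bob = state[4*m0 + 2*m1 : 4*m0 + 2*m1 + 2]
bob = bob / np.linalg.norm(bob)

# ... corrected using the two classical bits Alice sends him.
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print("teleported state:", bob)  # equals psi, whatever Alice measured
```

Note that Bob's copy only becomes correct after the classical bits arrive, which is why the scheme does not violate special relativity, and that the original `psi` register no longer carries the state after Alice's measurement.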

Assumptions are/were believed to dominate the picture, and data adjustment is sometimes seen as responsible for many claimed results. The original idea, published in 1964 (Bell, 1964), involved pairs of particles produced together, sent in different directions, then either detected or not. Though the assumptions normally state that all particles are detected, during the last few years detection rates of around 5% have been more common, with no one achieving the often implied 100% detection. The abstract of Freedman and Clauser's paper, for example, stated that

"Our data, in agreement with quantum mechanics, violate these [Bell] restrictions to high statistical accuracy, thus providing strong evidence against local hidden-variable theories".

Some workers dismiss such optimism given that most often, and certainly in the case of Freedman and Clauser, there is no discussion of the significance of statistical adjustment. What Bell tests are concerned with is the shape of the relationship between coincidence counts and relative detector setting. As noted, the detection rate has always been below 10%; the sometimes assumed 100% was believed to be impossible. One of the problems is the ability of detectors to register single photons: it has been argued that we only have a probabilistic relationship determining detection rates at each intensity, the uncertainty entering in the form of electromagnetic "noise" added to the signal before detection. More recently, though, the ability to interlink two quantum particles with practically 100% certainty has been achieved by a NIST group (Quentin Turchette). M. Zukowski et al. [*Phys. Rev. Lett.* (1993)] describe the use of independent sources to realise an 'event-ready' Bell-Einstein-Podolsky-Rosen experiment in which one can measure directly the probabilities of the various outcomes, including the non-detection of both particles. The most recent work, even in the last few months, is swaying the more rigid and inflexible voices in the scientific community. Jian-Wei Pan et al. [*Phys. Rev. Lett.* 80:3891-3894 (1998)] reported that they had experimentally entangled freely propagating particles that had never physically interacted with one another nor ever been coupled by any other means. Their work demonstrated that quantum entanglement neither requires the entangled particles to come from a common source nor to have interacted in the past.

The uses of quantum teleportation reach beyond the transporters of the Enterprise, and as Robert Faulkner pointed out in his post there are serious considerations over the use of entanglement in computation and communication. In 1994 Peter W. Shor of AT&T worked out how to take advantage of entanglement and superposition to find the prime factors of an integer. He found that a quantum computer could, in principle, accomplish this task much faster than the best classical calculator ever could. This was covered in detail in the previous post, and is beyond the scope of what had confused me, that being the disregard for entanglement in the previous posts.
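For concreteness, the correlations that Bell tests probe can be computed directly from the quantum formalism. The sketch below is illustrative (the detector angles are the standard ones chosen to maximise the effect, not taken from any particular experiment): it evaluates the CHSH combination of correlations for a singlet pair and obtains 2*sqrt(2), above the bound of 2 that any classical local theory must respect.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Singlet state of two spin-1/2 particles: (|01> - |10>)/sqrt(2).
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

def spin(theta):
    """Observable for a spin measurement along angle theta (outcomes +/-1)."""
    return np.cos(theta) * Z + np.sin(theta) * X

def E(a, b):
    """Quantum correlation <A(a) B(b)> in the singlet state."""
    return singlet @ np.kron(spin(a), spin(b)) @ singlet

# Detector settings that maximise the CHSH combination.
a, a2, b, b2 = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2) ~ 2.83, above the classical CHSH bound of 2
```

The detection-rate debate above concerns whether real, inefficient detectors actually sample these correlations fairly; the calculation only shows what quantum mechanics predicts for ideal apparatus.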

**How Local Effects Can Determine the Plane at a Nonlocal Position.**

In the popular science book "The Emperor's New Mind" Penrose discusses aperiodic tilings of the plane: Escher-like coverings of the plane by a few basic common non-repeating shapes. It seems that these tilings are determined by the specification of the space to be tessellated (the plane), the shapes which are allowed, and a few rules constraining the ways in which they may join. The laws of these tilings are "local" in the sense that they dictate only which tiles may be placed next to which. Yet attempting to satisfy such local rules apparently leads to nonlocal constraints relating distant parts of the tiling: Penrose says (p436, in my copy) that "the assembly [of the tilings] is necessarily non-local", in that one must examine the state of the pattern many tiles away from the point of assembly in order to figure out what tile to place next.
**Recent Developments in Teleportation.**

**Multiple Particle Entanglement**

- The first entanglement of three photons has been experimentally demonstrated by researchers at the University of Innsbruck. In the present experiment, sending individual photons through a specified crystal sometimes converted a photon into two pairs of entangled photons. After detecting a "trigger" photon, and interfering two of the three others in a beamsplitter, it became impossible to determine which photon came from which entangled pair. As a result, the respective properties of the three remaining photons were indeterminate.
- The researchers deduced that this entangled state is the long-coveted GHZ state proposed by physicists Daniel Greenberger, Michael Horne, and Anton Zeilinger in the late 1980s. In addition to facilitating more advanced forms of quantum cryptography, the GHZ state will help provide a nonstatistical test of the foundations of quantum mechanics. Albert Einstein, troubled by some implications of quantum science, believed that any rational description of nature is incomplete unless it is both a local and realistic theory: "realism" refers to the idea that a particle has properties that exist even before they are measured, and "locality" means that measuring one particle cannot affect the properties of another, physically separated particle faster than the speed of light. But quantum mechanics states that realism, locality--or both--must be violated. Previous experiments have provided highly convincing evidence against local realism, but these "Bell's inequalities" tests require the measurement of many pairs of entangled photons to build up a body of statistical evidence against the idea. In contrast, studying a single set of properties in the GHZ particles (not yet reported) could verify the predictions of quantum mechanics while contradicting those of local realism. (Bouwmeester et al., Physical Review Letters, 15 February 1999)
(*Physics News Update*, by Phillip F. Schewe and Ben Stein)
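A rough numerical picture of the GHZ state discussed above: the sketch below (illustrative, not the Innsbruck experiment) samples measurements of the three-particle state (|000> + |111>)/sqrt(2) and shows outcomes that are individually random yet perfectly correlated.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Three-particle GHZ state: (|000> + |111>)/sqrt(2); index = 4*q0 + 2*q1 + q2.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

# Sample 1000 joint measurements in the 0/1 basis with Born-rule probabilities.
probs = np.abs(ghz) ** 2
outcomes = [format(rng.choice(8, p=probs), "03b") for _ in range(1000)]

# Each particle's result alone is a fair coin flip, yet the three always
# agree: only '000' and '111' ever occur.
print(set(outcomes))
```

The stronger, nonstatistical GHZ tests mentioned above involve measuring other properties than this simple 0/1 basis, but the perfect three-way correlation is the ingredient they exploit.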

**Where Entanglement Is Likely To Lead**

In the past, evidence of quantum mechanical entanglement has been restricted to elementary particles such as protons, electrons, and photons. Now E. Hagley et al, using rubidium atoms prepared in circular Rydberg states (which means the outer electrons of the atom have been excited to very high energy states and are far from the nucleus in circular orbits), have shown quantum mechanical entanglement at the level of atoms.

What is involved is that the experimental apparatus produces two entangled atoms, one atom in a ground state and the other atom in an excited state, physically separated so that the entanglement is non-local. When a measurement is made on one atom, let us say the atom in the ground state, the other atom instantaneously presents itself in the excited state, the result of the second atom's wave-function collapse thus being determined by the result of the first atom's wave-function collapse. There is talk that before long quantum mechanical entanglement may be demonstrated for molecules and perhaps even larger entities.

*[Phys. Rev. Lett. 79:1 (1997)]*

**Quantum Entanglement In Computing**

Several research groups believe quantum computers based on the
molecules in a liquid might one day overcome many of the limits
facing conventional computers. There is growing concern that the
transistors used in the electronics industry are rapidly
approaching an impasse. The effort to build a quantum computer was
stimulated by the realisation by Rolf Landauer, Richard Feynman,
Paul Benioff, David Deutsch, Charles Bennett and others that
computers must obey the laws of physics, and that the realm of
microelectronics is fast shrinking to the atomic realm ruled by
quantum mechanics. The downscaling of the components reaches
significant problems when they are built at a size of a few
atoms. (Roger Highfield). Also indeterminacy means there are
quantum effects at small scales; problems also exist in that the
facilities for fabricating still more powerful microchips will
eventually become prohibitively expensive. The advantage of
quantum computers arises from the way they encode a bit, the
fundamental unit of information. The state of a bit in a
classical digital computer is specified by one number, 0 or 1. A
word in classical computing is described by a string of bytes,
where a byte represents an alphanumeric character and comprises
eight bits of information. A quantum bit, called a
qubit, might be represented by an atom in one of two different
states, which can also be denoted as 0 or 1. Two qubits, like two
classical bits, can attain four different well-defined states (0
and 0, 0 and 1, 1 and 0, or 1 and 1). However, unlike classical
bits, qubits can exist simultaneously as 0 and 1, with the
probability for each state given by a numerical coefficient.
Describing a two-qubit quantum computer thus requires four
coefficients. In general, *n* qubits demand 2^n
numbers, which rapidly becomes a sizable set for larger values of
n. For example, if *n* equals 50, about 10^15
numbers are required to describe all the probabilities for all
the possible states of the quantum machine--a number that exceeds
the capacity of the largest conventional computer. A quantum
computer promises to be immensely powerful because it can be in
multiple states at once--a phenomenon called superposition--and
because it can act on all its possible states simultaneously.
Thus, a quantum computer could naturally perform myriad
operations in parallel, using only a single processing unit.

*Scientific American June 1998*
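The 2^n bookkeeping above is easy to reproduce. A minimal sketch, with values matching the article's two-qubit and n = 50 examples:

```python
from itertools import product

# The four well-defined joint states of two bits (or two qubits): 00 01 10 11.
print(["".join(bits) for bits in product("01", repeat=2)])

# A register of n qubits needs 2**n complex coefficients to describe fully.
for n in (2, 10, 50):
    print(f"{n} qubits -> {2**n:.3e} coefficients")
```

At n = 50 this gives about 1.1e15 coefficients, the ~10^15 figure quoted above.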

Charles Bennett and his colleagues at IBM found a method back in 1993; it works because you can send the quantum information so long as you do not know the details of what you are sending, and that is the idea that has now been demonstrated by Jeff Kimble of Caltech, along with Samuel Braunstein of the University of Wales at Bangor and others. In 1994 Peter W. Shor of AT&T deduced how to take advantage of entanglement and superposition to find the prime factors of an integer. He found that a quantum computer could, in principle, accomplish this task much faster than the best classical calculator ever could. His discovery had an enormous impact. Suddenly, the security of encryption systems that depend on the difficulty of factoring large numbers became suspect. And because so many financial transactions are currently guarded with such encryption schemes, Shor's result sent tremors through a cornerstone of the world's electronic economy.

*Daily Telegraph Nov 1998*
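The factoring shortcut described above rests on period finding. As a hedged illustration (the function name and example values are my own), the sketch below does the period-finding step by classical brute force, which is the part a quantum computer would accelerate exponentially, and then extracts factors of N = 15 from gcd(a^(r/2) ± 1, N), just as the number theory behind Shor's algorithm prescribes.

```python
from math import gcd

def factor_via_period(N, a):
    """Classical illustration of the arithmetic behind Shor's algorithm:
    find the period r of a^x mod N (the step a quantum computer speeds up),
    then read factors of N from gcd(a^(r/2) +/- 1, N)."""
    # Brute-force the period -- exponentially slow classically.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2:
        return None          # odd period: try a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None          # trivial square root: try a different a
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_via_period(15, 7))  # period of 7^x mod 15 is 4 -> factors (3, 5)
```

Only the period search needs the quantum machine; the corrections at the end are ordinary number theory, which is why Shor's speedup directly threatens factoring-based encryption.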

Certainly no one had imagined that such a breakthrough would come from outside the disciplines of computer science or number theory. So Shor's algorithm prompted computer scientists to begin learning about quantum mechanics, and it sparked physicists to start working in computer science. While at Los Alamos National Laboratory in New Mexico, Isaac Chuang, with Neil Gershenfeld of MIT, took another important step by demonstrating that quantum computing can be carried out with ordinary liquids in a beaker at room temperature. Each molecule contains atoms, and the nuclei of atoms act like tiny bar magnets. These can point in only two directions, "up" and "down", because of a property called "spin". A single nucleus can therefore act as a qubit, its spin pointing perhaps up for "off" and down for "on". A given spin lasts a relatively long time and can be manipulated with nuclear magnetic resonance, a technique used by chemists for years. Thus each molecule can act as a "little computer" and is capable of as many simultaneous calculations as there are ways of arranging its spin, according to Chuang, now with IBM Research, who has tackled some simple problems with chloroform. Does this mean the first quantum computer is about to appear on the market? His colleague, Charles Bennett, has a standard response:

"Definitely in the next millennium."

*Roger Highfield*

The problem facing quantum computing is that almost any interaction a quantum system has with its environment constitutes a measurement. This phenomenon, known as decoherence, makes further quantum calculation impossible. Thus, the inner workings of a quantum computer must somehow be separated from its surroundings to maintain coherence. But they must also be accessible so that calculations can be loaded, executed and read out.