| Alan Shapiro on Tue, 4 Aug 2009 15:19:06 +0200 (CEST) |
| <nettime> The New Computer Science |
Dear nettime,
Choreograph.net just published the essay "A Proposal for Developing Quantum
Computing in Software", by Alan N. Shapiro and Alexis Clancy (see link
below).
This is an essay on the new computer science, which is based on new
mathematics.
I am sending you one-fourth of the text to publish on nettime.
If you decide to publish it (I hope so), please include the link to the
full text at Choreograph.net.
Best regards,
Alan N. Shapiro
"A Proposal for Developing Quantum Computing in Software" published here:
http://www.choreograph.net/
A Proposal for Developing Quantum Computing in Software
by Alan N. Shapiro and Alexis Clancy
[We] believe that the invention of a new computer science, one more powerful
than that which presently exists, is possible: a more powerful computer
science that often goes by the name of Artificial Intelligence. Shapiro
Technologies will go beyond the digital or binary computing paradigm that
has persisted since the seminal work of the Second World War generation of
information theorists such as Alan Turing, John von Neumann, Norbert Wiener,
and Claude Shannon, so as to achieve quantum computing.
A measurement of superpositions yields only one value, and at the same time
destroys all the others. Computer scientists working on quantum computers
therefore rely heavily on the Fourier transform, a mathematical operation
that transforms one function of a real variable into another, called the
frequency domain representation of the first function, as the hypothesized
way to solve the problem. The quantum Fourier transform is primarily thought
of as something to be implemented in hardware. A hypothetical quantum
computing device would have so-called 'reversible logic gates', gates whose
operations are unitary matrices, so that any sequence of them decomposes
into reversible unitary transformations.
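As a concrete illustration of what the transform computes, here is a minimal
sketch in Java (the language we intend to extend) that applies the discrete
Fourier transform underlying the QFT to a classically simulated amplitude
vector. The class and method names are my own, and the sketch simulates only
the arithmetic of the transform, not a gate-level device:

    // Classical simulation of the quantum Fourier transform acting on
    // the amplitude vector of an n-qubit register (N = 2^n amplitudes).
    public class QftSketch {
        // out[k] = (1/sqrt(N)) * sum_j in[j] * e^{2*pi*i*j*k/N}
        static void qft(double[] re, double[] im) {
            int n = re.length;
            double[] outRe = new double[n];
            double[] outIm = new double[n];
            double norm = 1.0 / Math.sqrt(n);
            for (int k = 0; k < n; k++) {
                for (int j = 0; j < n; j++) {
                    double angle = 2 * Math.PI * j * k / n;
                    double c = Math.cos(angle), s = Math.sin(angle);
                    // complex multiply-accumulate
                    outRe[k] += re[j] * c - im[j] * s;
                    outIm[k] += re[j] * s + im[j] * c;
                }
                outRe[k] *= norm;
                outIm[k] *= norm;
            }
            System.arraycopy(outRe, 0, re, 0, n);
            System.arraycopy(outIm, 0, im, 0, n);
        }

        public static void main(String[] args) {
            // One qubit prepared in |0>: amplitudes (1, 0).
            double[] re = {1, 0}, im = {0, 0};
            qft(re, im); // yields the equal superposition (0.707, 0.707)
            System.out.printf("%.3f %.3f%n", re[0], re[1]);
        }
    }

Because the transform is unitary, it is reversible: applying the same sum
with the angle negated recovers the input exactly.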
The goal of quantum computing has been clearly and explicitly defined by
computer scientists, but the mathematics of how to implement qubits and
superposition states does not yet exist. It should be noted right away that
most efforts to realize quantum computing are, in my view, too one-sidedly
hardware-centric.
A crucial characteristic of quantum mechanics known as entanglement occurs
under certain experimental conditions. Subatomic particles become
'inextricably linked' in such a way that a change to one of them is
instantly 'reflected in its counterpart', no matter how physically separated
they are. Quantum theory postulates a superposition of states that
destabilizes the intuitive sensorial notion of spatial separation. Entangled
particles transcend space and remoteness. They belong to a 'shared' system
that acts as a single entity. The distance that divides the particles no
longer plays any influencing role that would lead them to be regarded as
having distinct identities. Once the entanglement state is established, the
subatomic duo stays forever bonded. The two particles will always have
either precisely opposing or 'elegantly complementing' relative values of
key quantum properties such as polarization direction, regardless of how far
apart they travel from one another.
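The measurable consequence can be put very plainly. In the following Java
sketch (the names are my own), repeated computational-basis measurements of
the Bell state (|00> + |11>)/sqrt(2) are simulated: the two 'separated'
outcomes never disagree. A classical sampler of this kind reproduces only
the single-basis statistics, of course, not the full quantum correlations:

    import java.util.Random;

    // Sampling measurements of the entangled Bell state
    // (|00> + |11>)/sqrt(2): the outcomes 00 and 11 each occur with
    // probability 1/2, while the disagreeing outcomes 01 and 10 never
    // occur, however far apart the two qubits are imagined to be.
    public class BellSketch {
        public static void main(String[] args) {
            Random rng = new Random();
            for (int trial = 0; trial < 5; trial++) {
                int shared = rng.nextInt(2);  // 0 or 1, probability 1/2
                int qubitA = shared;          // first measurement
                int qubitB = shared;          // second: always agrees
                System.out.println("A=" + qubitA + "  B=" + qubitB);
            }
        }
    }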
Quantum mechanical phenomena, such as superposition and entanglement, are
made use of to perform operations on what are called quantum bits, or
qubits. Instead of the classical binary or digital bit, which has the
discrete value of 0 or 1, there is a qubit, which may have a third state, an
in-between state, the momentary value of which is determined by the
superposition of the state of many other bits in the system.
Entanglement and superpositioning enable this third state, which can be
cultivated to correspond with the anticipated choice space of the 'user'.
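As a point of reference, here is a minimal Java sketch of the standard
picture that this third state is meant to extend: a qubit as a pair of
amplitudes whose squares give the measurement probabilities, collapsing to a
definite value once measured. The class, and the restriction to real
amplitudes, are simplifications of my own:

    import java.util.Random;

    // A qubit as a pair of amplitudes (alpha, beta) for the basis
    // states |0> and |1>. Measurement returns 0 with probability
    // alpha^2 and 1 with probability beta^2, and the superposition
    // collapses to the observed value.
    public class QubitSketch {
        double alpha, beta;  // real amplitudes, alpha^2 + beta^2 = 1
        static final Random RNG = new Random();

        QubitSketch(double alpha, double beta) {
            this.alpha = alpha;
            this.beta = beta;
        }

        int measure() {
            int outcome = RNG.nextDouble() < alpha * alpha ? 0 : 1;
            // Collapse: the qubit is now definitely in the measured state.
            alpha = (outcome == 0) ? 1 : 0;
            beta = 1 - alpha;
            return outcome;
        }

        public static void main(String[] args) {
            // Equal superposition: measuring many fresh copies gives ~50/50.
            int ones = 0;
            for (int i = 0; i < 10000; i++) {
                ones += new QubitSketch(Math.sqrt(0.5),
                                        Math.sqrt(0.5)).measure();
            }
            System.out.println("fraction of 1s: " + ones / 10000.0);
        }
    }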
Third Space mechanics: I consider a model to be a dynamic series of frames.
In modeling a universe, I consider two sets. First, the set F of everything
that I know. Second, the set D of everything that I do not know. Something
can either be known to me or unknown to me. It cannot be both. ["_F_eicthe"
is the Irish word for "seen." "_D_ofheicthe" is the Irish word for "unseen."]
The set F of everything that I know is characterised by collapsed wave form
Kronecker Delta functions, which are finite, bounded and measured. [A Kronecker
Delta function is a function whose value is one at a unique instance, zero
everywhere else. It best describes the collapse of a waveform on
measurement, the wave collapsing to an absolute negation of probability at a
certain point on this measurement.]
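In standard notation, the Kronecker Delta is

    \delta_{ij} = \begin{cases} 1 & \text{if } i = j \\
                                0 & \text{if } i \neq j \end{cases}

so that measurement replaces a spread-out probability amplitude with
certainty at a single point and zero everywhere else.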
The set D of everything that I do not know is characterised by Schrödinger
type equations, spacewise infinite and unbounded. However, the perimeter of
the set F of everything that I know presents a problem, as a point on this
perimeter exists in both spaces F and D [Imagine someone standing on the
border of Belgium and The Netherlands - essentially, they are standing in
both countries at the same time]. This contradicts the first rule: something
cannot be both known and unknown. I correct
this model by introducing a small cleft about the perimeter, small yet big
enough to exist. Epsilon small. I call this cleft the Epsilon Cleft. This is
the Third Space.
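One plausible way to write the construction down, in notation of my own
choosing: let \partial F be the perimeter of F and d a distance function;
then the cleft is the set of points within \varepsilon of the perimeter,

    T_\varepsilon = \{ x : d(x, \partial F) < \varepsilon \}, \quad
    F' = F \setminus T_\varepsilon, \quad D' = D \setminus T_\varepsilon,

and removing it from both sets restores the first rule: no point of F' is a
point of D', and the troublesome perimeter point now lives in the Third
Space T_\varepsilon alone.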
Locally and superficially, the dimensionality of F strictly does not go
beyond 2D, and it is Euclidean. The dimensionality of D is a function of
time; as time progresses, symmetry breaks [i.e., the character of an absolute
law dictating the character of D is no longer a given] and as many
dimensions as are needed to patch the model are used. Ignoring the first
term, the sequence (as stated previously) is 4, 11, 26, 57. The Epsilon
Cleft is the source of these dimensions. My assertion that symmetry will
always break (as long as there is time) dictates that the Epsilon Cleft will
have an inexhaustible supply of dimensions. [This assertion is taken as a
direct inference of Gödel's Incompleteness Theorems.] It is therefore
countably infinite. Adopting this attitude towards a model renders the
'problem' of innumerable infinities not a problem, but rather an actual
contributor to an overall dynamic and evolving model.
I like to view spaces like the Epsilon Cleft as "novelty" spaces. I find
them to be analogous to the "No Mind" structure referred to in the Samurai
Creed ("I have no sword. I make No Mind my sword.") and the characteristic
consciousness produced by Samadhi practices of Buddhist and Hindu Yogic
meditation; I place my faith in the Epsilon Cleft to provide a space for
novelty to emerge. In this case, we design the solution space such that the
novelty that emerges is Artificial Life.
In a Riemann type geometry, a conic represents a pinch of some sort. An
unmolested bounded space can be taken to be a sphere but some stress on the
system will render it not so - the most basic morphing will be
hyperbolically conical. I state gravity to be a constraint simply due to its
universality with respect to binding a system. With respect to separating
the time and space factors, I feel that, as we are dealing with a spacetime
metric, the mutation function is a coupled bivariate function. It is almost
a rule of thumb that nature will not use a simple linear function to do
anything - a simple non-linear function is generally the case. The geometry
can be taken to be a quantum geometry, but I believe that most of what we
experience has its origin in these kinds of spaces. I feel that the solution
space metric we will design should embody these qualities and also be
breathable (my term) and elastic - a mathematical weave as opposed to a
mathematical covering (1). I am inspired by Goethe's maxim: "Search nothing
beyond the phenomena; they themselves are the theory."
Where the challenge lies is in accessing a Schrödinger waveform to "play"
with. It may be of use to draw on a conjecture that I developed regarding
Schrödinger's Equations and Parametric Normal Distributions. The question I
pose is this: Do statistics imitate life, or does life imitate statistics?
The conjecture is based on the meditation that, because Gauss' rigorous
definition of the Normal Distribution [the ubiquitous "Bell Curve" (because
it looks like a bell) seen in most statistical models, particularly in
models whose elements have the possibility to choose their state] predated
the development of Quantum Theory, the results of experimentation and
thought experiments were mathematically retrofitted into Gauss' model and
taken to be a system of "statistical aggregates." However, it is my view
that Gauss' Normal Distribution is a trans-dimensional fractal, mimicking in
form and behaviour its quantum origins on a macro scale.
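For reference, Gauss' parametric form of the Normal Distribution, with mean
\mu and standard deviation \sigma, is

    f(x) = \frac{1}{\sigma \sqrt{2\pi}}
           \exp\!\left( -\frac{(x - \mu)^2}{2 \sigma^2} \right),

and it is this density, not any quantum formalism, into which the early
experimental results were fitted.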
We want incompleteness. Some methodologies exist in electronic engineering
where a least element is applied to create a mesh for the mathematical space
used for examining given problems. But this has little to do with Gödel's
incompleteness - it is just a method that works. Where the novelty in our
proposed methodology lies is in the assertion that the "gaps" left in a
given frame due to a Möbius inflection are the physical manifestation of
incompleteness. This is a significant breakthrough, and it is the real way
forward for Artificial Intelligence.
What we will practice is the strategy of reversibility - overturn the
negatively connoted perception of limit into a positive opportunity.
Incompleteness will be a positive program for growing embodiment and
vitality. For the first time, computer programming (Java) will be extended
from classical combinatorial logic to the programming of the real conditions
for emergence.
Quantum physics was never philosophically understood by its practitioners,
who opted to just use it, and subsequently developed practical statistical
methods for doing so. No trans-disciplinary knowledge there. So far, all
that the physicists and mathematicians have done are "clever tricks." Even
the quantum teleportation experiment has to use the "clever trick" of the
joint Bell-state analysis or measurement of a third particle that is
independent of the entangled pair.
The way to take measurements on both sides of a created universe, of the
model and its phantom, to access all of the quantum information that is
going on in the system, is to have a safe, protected space in between where
one is allowed to be, prior to 'becoming (measurable).' First, we will have
a portion that conforms to the definition of a universal computing device
made by Turing in "On Computable Numbers," the q-state, the third possible
state of the qubit, as a statistical aggregate of all the other states that
we are interested in (for a particular systems design). That is no problem.
Second, we will have a portion that goes beyond Turing's definition. Along
these lines, we want to perceive quantum states of musical resonance which
are going on in the system in real-time, not just Normal Distribution stuff
that existing computer science and mathematics have been able to handle.
Here is the answer to the riddle of quantum physics: not measure, but
perceive. And an expansion of consciousness supports an expanded perception.
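The essay does not fix an implementation of the q-state, but purely as an
illustration of the description above (a third value that resolves, when
read, from the statistical aggregate of the other bits in the system), one
might sketch it in Java as follows; the class, the three-valued enum and
the aggregation rule are all speculative choices of my own:

    import java.util.List;
    import java.util.Random;

    // A speculative sketch (not the essay's specification): a
    // three-valued bit whose third "q-state" resolves, when read, by
    // sampling from the aggregate statistics of the other bits
    // currently in the system.
    public class QState {
        enum Value { ZERO, ONE, Q }
        Value value = Value.Q;
        static final Random RNG = new Random();

        // Resolve Q against the fraction of ONEs among the given bits.
        int read(List<Integer> otherBits) {
            if (value == Value.ZERO) return 0;
            if (value == Value.ONE) return 1;
            double onesFraction = otherBits.stream()
                    .mapToInt(Integer::intValue).average().orElse(0.5);
            int outcome = RNG.nextDouble() < onesFraction ? 1 : 0;
            value = outcome == 1 ? Value.ONE : Value.ZERO;  // collapse
            return outcome;
        }

        public static void main(String[] args) {
            QState q = new QState();
            // Three quarters of the surrounding bits are 1, so the
            // q-state resolves to 1 with probability ~0.75 when read.
            System.out.println(q.read(List.of(1, 1, 1, 0)));
        }
    }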
Quantum behaviour is a reality. Physicists thought that they could not
observe or measure this reality without destroying the information therein.
But they conceptualized the methodology of observation conventionally. The
space from which one can observe the reality of quantum behaviour without
destroying the information therein is also a reality, a fact of nature. We
do not have to invent this space, we only have to perceive it. This space of
non-destructive observation really exists, just as quantum behaviour really
exists, and we will get it working in software. To perceive this space, we
have to change our consciousness. That's all that we have to do! We have to
recognize as being scientific some ways of perceiving that belong to other
traditions that Western science has so far small-mindedly regarded as
non-scientific. This expanded perceiving includes creative mathematics, the
deconstruction of classical spacetime mechanics, Buddhist and Hinduist
meditation/ontologies, Aboriginal-sacred-mystical-expanded consciousness
thinking, and Continental semiotics/grammatology.
NOTES
1 - In mathematics, a Metric Space is a set where a specific concept of
distance between elements of the set is defined and implemented.
Three-dimensional Euclidean space - a way of thinking about space that
belongs to the Western metaphysical 'construction of reality' as it was
originated by the Ancient Greek thinkers - corresponds to our 'intuitive
understanding' of space. Another example of Western metaphysics is the
Aristotelian classifying logic of "A is true or B is true," the limits of
which as an intelligent system of logic are nowadays showing more and more.
The geometric properties of the Metric Space depend on the Metric chosen. By
conceptualizing a different Metric, interesting Non-Euclidean Geometries can
be constructed, for example, those used in the Einsteinian theory of general
relativity. Metric Spaces are Topological Spaces, and a function between
Metric Spaces is continuous when small changes in its input produce only
small changes in its output.
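A minimal Java sketch of this dependence of geometry on Metric, with names
of my own choosing: under the Euclidean metric the set of points at distance
1 from the origin is a circle, while under the "taxicab" metric the same
definition yields a diamond.

    // Two metrics on the plane; the geometry of the space follows from
    // the metric chosen, not from the underlying set of points.
    public class MetricSketch {
        interface Metric {
            double distance(double[] a, double[] b);
        }

        static final Metric EUCLIDEAN = (a, b) ->
            Math.hypot(a[0] - b[0], a[1] - b[1]);

        static final Metric TAXICAB = (a, b) ->
            Math.abs(a[0] - b[0]) + Math.abs(a[1] - b[1]);

        public static void main(String[] args) {
            double[] origin = {0, 0}, p = {1, 1};
            System.out.println(EUCLIDEAN.distance(origin, p)); // ~1.414
            System.out.println(TAXICAB.distance(origin, p));   // 2.0
        }
    }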
# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: http://mail.kein.org/mailman/listinfo/nettime-l
# archive: http://www.nettime.org contact: nettime@kein.org