nettime's_xor on Wed, 10 Sep 2003 08:11:19 +0200 (CEST)



<nettime> DNA (((and) computers) or $VAR) digest [griffis, thacker]


DNA and computers, or how i learned to love the myth of nanotech
     Ryan Griffis <grifray@yahoo.com>
DNA and computers; or "is the genome a computer?"
     "Eugene Thacker" <eugene.thacker@lcc.gatech.edu>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Date: Tue, 9 Sep 2003 08:54:09 -0700 (PDT)
From: Ryan Griffis <grifray@yahoo.com>
Subject: DNA and computers, or how i learned to love the myth of nanotech

regarding Steven's point about the purity of intention
in academic science research, it may well be the case,
but i think history gives us examples of just how
isolated academic research can stay. academic research
may operate under the ethics of a gift economy (ready
for the MP3 verdicts), but that is hardly
representative of the economic system that
universities are certainly a part of - a fact becoming
more significant every year. and the inherent link
between nanotech and biotech that Eugene points out
can be used to assess the possible trajectory of
nanotech (myth or not). contracts like the one made
between UC and Novartis in 1998, and the number of
commercial patents (originally) coming out of
university research institutions - whether the
research is being conducted for the glory of the
researchers or not - can't be disputed. is there any
doubt that all technology and science has as its
final destination a commodity? maybe it does... but it
seems suspect to me.
and HP seems excited about the myth...
ryan

From: steven schkolne <steven@schkolne.com>

"hi nettimers, some of these comments about researcher's motivation have hit a
little close to home, and fundamentally contradict some of my experience - i am
just wrapping up my PhD on 3d interaction at Caltech and based on my stint here
i'd have to say that this corporate drive is not so strong on campus,
especially in Erik Winfree's dna computation group (http://dna.caltech.edu)...
 <...>


- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Subject: DNA and computers; or "is the genome a computer?"
Date: Tue, 9 Sep 2003 23:00:50 -0400
From: "Eugene Thacker" <eugene.thacker@lcc.gatech.edu>

hi all - I'd be interested in picking up where this discussion
started - DNA computing - maybe this is a splinter-thread

specifically I've been interested in DNA computing on both the 
ontological level (in its practice it re-defines biological materiality 
in terms of computability) and on the political level (why is DNA 
computing generally ignored by biotech and molecular biology? is this a 
new sub-industry for microprocessors?)

I was reminded of Jacob & Monod's papers on gene expression during the 
1960s, in which they use the terms "gene informateur" and "cybernetique
enzymatique", and, to my knowledge, were the first to explicitly 
conceptualize molecular-genetic mechanisms in terms of the computer, a 
la Von Neumann and Wiener (who themselves borrowed from neurobiology)

re: industry and corporatism, I'm not aware of many DNA computing
start-ups (Steve mentions his work w/ Winfree, maybe you can confirm
this?) - generally IT companies seem to be very, very cautious with
regard to the whole next-generation DNA microprocessor idea -
otherwise Bell Labs is using DNA to build tiny tweezers, USC
(Adleman's lab) is attempting to black-box the DNA computer, and
Caltech (Winfree's lab) has been working on DNA self-assembly to
solve tiling problems
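for anyone who hasn't seen what these machines actually compute:
Adleman's original 1994 experiment solved a small Hamiltonian path
problem by generating DNA strands encoding random paths through a
graph and then chemically filtering out the invalid ones. a minimal
Python sketch of that generate-and-filter logic - the wet-lab strand
encoding is abstracted away, and the edge list below is illustrative
rather than Adleman's exact graph:

    import itertools

    # A 7-vertex directed graph (illustrative edge set).
    edges = {(0,1), (0,3), (1,2), (1,3), (2,3), (2,4),
             (3,4), (3,6), (4,5), (5,1), (5,6)}
    n, start, end = 7, 0, 6

    # Step 1: "generate" every candidate path - in the test tube
    # this happens in massively parallel ligation reactions.
    for path in itertools.permutations(range(n)):
        # Steps 2-4: "filter" - keep only paths with the right
        # endpoints whose consecutive vertices are joined by edges.
        if path[0] == start and path[-1] == end and \
           all((a, b) in edges for a, b in zip(path, path[1:])):
            print("Hamiltonian path:", path)
            break

the tube evaluates all candidates at once where the loop plods
through them one by one; the catch is that the number of strands
required grows just as explosively as the loop's running time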

the only actual app (far from being a killer app) I've read about is
the use of DNA computing for cryptography - Richard Lipton et al.'s
proposal to use DNA computers to crack the DES standard
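that proposal (Lipton with Boneh and Dunworth) follows the same
generate-and-filter pattern: synthesize strands encoding all 2^56
DES keys, then filter for the one that maps a known plaintext to its
known ciphertext. a toy sketch of the logic, with a stand-in 16-bit
XOR "cipher" in place of real DES (the key and cipher here are
hypothetical; exhaustively looping over 2^56 real DES keys is
exactly the intractability the molecular approach was meant to
sidestep):

    # Stand-in cipher: 16-bit keyspace, XOR "encryption".
    def toy_encrypt(key: int, plaintext: int) -> int:
        return plaintext ^ key

    known_plain = 0x1234
    known_cipher = toy_encrypt(0xBEEF, known_plain)  # attacker sees both

    # The molecular computer "tries" every key strand at once;
    # in silico we filter the keyspace sequentially.
    matches = [k for k in range(2**16)
               if toy_encrypt(k, known_plain) == known_cipher]
    print("recovered key:", hex(matches[0]))  # 0xbeef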

for those interested, I've pasted below an excerpt from a longer chapter 
on DNA computing to be published this year - I was interested in 
understanding the relation between the computer science of 
Mind/intelligence and DNA computing, as a shift from computer-as-Mind to 
computer-as-biology: a biological functionality that has no biological 
function...

-Eugene
____________________________________

Eugene Thacker, Assistant Professor
School of Literature, Communication & Culture
Georgia Institute of Technology

eugene.thacker@lcc.gatech.edu
http://www.lcc.gatech.edu/~ethacker
____________________________________


*************************************************************************
"DNA Memory & DNA Intelligence"

In the context of biocomputing, the Turing test is noteworthy, not 
because it presages anything in biocomputing, but precisely because it 
rules out the possibility of biocomputers. Note that Turing's question 
is not "is biology computational?" but rather "is intelligence 
computational?" In other words, it might be more accurate to say that 
Turing relates organisms and machines in the specific terms of humans 
and computers. This is done on the level of a certain type of cognitive 
performance; humans and computers are compared via the notion of 
"Mind." By contrast, von Neumann's interest in the functional 
correspondences between the computer and the brain expresses a different
sort of question, which also forecloses the possibility of biocomputers. 
If Turing's question concerns "intelligence," von Neumann's 
question is "is memory-based cognition in the brain a stored-program 
computer?" To simply greatly, we can suggest that for Turing the issue 
in comparing humans and computers revolves around intelligence, while 
for von Neumann it revolves around memory. However, as we've noted, 
these terms need to be understood in very particular ways. For Turing 
"intelligence" is not an a priori quality exclusive to human beings, 
but rooted in communication, performance, and practical assessment of 
behavior. Likewise, for von Neumann, "memory" is not so much a 
mysterious set of impressions specific to human beings or other 
organisms, but, when seen functionally as a pattern arising from 
switches (flip-flops), it can be considered a mechanical property of 
biological or computational systems. At no point in their respective 
texts do Turing or von Neumann question the existence of intelligence or 
memory in human beings as we commonly use the terms. Though Turing, not 
without some irony, asks "why shouldn't I be considered a 
computer?" it is important to note that both Turing and von Neumann 
begin by analyzing cognition in the organism through the lens of 
computer science - for both, the data processing involved in Mind,
brain, and computers proceeds through a series of discrete,
finite-state machines. This is evident in the Turing test (a series
of questions and answers) as well as in the von Neumann architecture
(correlation of control unit and memory unit, of arithmetic unit and 
control unit). In the process, the terms themselves (intelligence, 
memory) become transformed, not only quantitatively, but also
qualitatively. The Turing test and the von Neumann architecture fold 
back on human-centered notions of Mind and brain, questioning the 
tendency to transcendentalise either.
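The stored-program point is concrete enough to sketch. In a von
Neumann machine, program and data inhabit a single memory, and the
control unit is nothing but a finite-state loop that fetches,
decodes, and executes. A toy version in Python (the four-instruction
set is hypothetical, not any historical machine's):

    # A toy stored-program machine: one memory holds both the
    # program and its data; the control loop is the finite-state
    # fetch-decode-execute cycle.
    mem = [
        ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),  # program
        2, 3, 0,                                             # data
    ]

    acc, pc = 0, 0
    while True:
        op, addr = mem[pc]   # fetch and decode
        pc += 1
        if op == "LOAD":     # execute
            acc = mem[addr]
        elif op == "ADD":
            acc += mem[addr]
        elif op == "STORE":
            mem[addr] = acc
        elif op == "HALT":
            break

    print(mem[6])  # 5: the machine has computed 2 + 3

Because instructions sit in the same memory as data, the machine can
in principle read and rewrite its own program - the feature that makes
the architecture "stored-program" rather than fixed-purpose.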

However, while Turing is explicit in his refutation of common arguments 
against the notion of intelligent machines, von Neumann is more evasive
about the epistemological implications of his analyses. Von Neumann's
lectures only briefly touch on the implications of understanding 
computers and brains as stored-program machines, noting that, although 
both can be seen in computational terms, the hybrid analog-digital 
character of the brain and nervous system requires a unique language in 
which to understand its logical structure. Both make use of computer 
science (that is, mathematical foundations for assessing computable
problems), and in this process, they intentionally or unintentionally 
raise issues that are philosophical. Systems configurations beget 
ontological questions. Both discuss Mind or brain in reference to modern 
digital computers, and while Turing emphasizes the performance factor 
(in both the cultural and technical sense of the term), von Neumann 
emphasizes the functional factor (the computing architecture aspects of 
the brain's functioning).

Thus, in relation to biocomputing, both Turing and von Neumann offer two 
paradigms for thinking about the human-computer relationship as a 
relationship between organism and machine. While Turing emphasizes 
intelligence through the performance of the Turing test, von Neumann 
emphasizes memory as data processing through the architecture of the 
stored-program computer. Both make reference to modern digital 
computers, and, while not reducing the human to the computer, they 
nevertheless raise contentious questions concerning how our 
understanding of cognition, Mind, and the biological may be transformed 
by computer logics. We might say that for Turing, intelligence is a 
"state" of mind, while for von Neumann, the brain is just the 
"memory" of the computer .

What should be apparent in these two paradigms of thinking about humans 
and computers is both the anthropocentrism and the emphasis on defining 
the organism in terms of cognitive processes. Both Turing and von 
Neumann are only concerned about the body to the extent that it provides 
a framework or hardware on top of which higher-level processes can 
occur. Even von Neumann's materialist approach, focusing as it does on 
developments in neuroscience, displays a predilection to consider the 
nervous system solely in terms of brain activity, while Turing's test 
appears to raise the question of embodiment, only to abstract it behind 
the logical operators seen to inhere in language. Thus, in the 
comparisons of humans and computers, we see different paradigms that are
united in their emphasis on approaching both the computer and the human
on the level of cognition (Turing's interest in Mind, von Neumann's
interest in the brain). In these examples of computer science, we can 
detect more than a hint of biologism: the Turing test assumes as its 
level of success the intractability of gender from biological sex, while 
the von Neumann architecture employs a constructionist view of cognition 
as proceeding from aggregates of lower-level functions in neurons or 
switches. This biologism provides the foundation for further thinking 
about computers as more than computers; calculation and 
"computability" provide the materialist correlatives for the 
constructionist view of cognition. For both Turing and von Neumann, the 
comparison of computers to humans is to be made in terms of 
"computability," and this computability is set up as the criteria 
for the higher-level processes of "learning" (Turing) and 
"memory" (von Neumann). To compute therefore becomes the functional 
analogue of intelligence (in terms of learning) and memory (in terms of 
data storage), beneath which run the lower-level hardware or biology of 
the system.
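Von Neumann's point that memory is a pattern arising from switches can
likewise be made concrete. A single stored bit falls out of the
feedback between two NAND gates - the classic SR latch - though
neither gate alone has anything memory-like about it. A minimal sketch
(the small iteration count simply lets the feedback settle):

    # An SR (set-reset) latch from two cross-coupled NAND gates:
    # the stored bit lives in the feedback loop, not in either switch.
    def nand(a: int, b: int) -> int:
        return 0 if (a and b) else 1

    def latch(s: int, r: int, q: int) -> int:
        # Active-low inputs; iterate until the feedback settles.
        for _ in range(4):
            q_bar = nand(r, q)
            q = nand(s, q_bar)
        return q

    q = 0
    q = latch(s=0, r=1, q=q)  # pulse "set"   -> q becomes 1
    q = latch(s=1, r=1, q=q)  # inputs idle   -> q holds 1: a stored bit
    q = latch(s=1, r=0, q=q)  # pulse "reset" -> q returns to 0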

The field of biocomputing offers a corrective to the assumptions in
the intelligent machine discourse, but it also raises its own set of
problematics not explored by either Turing or von Neumann. If Turing and 
von Neumann see the human-computer relationship in predominantly 
cognitive terms, gauging "the human" in terms of higher-level 
processes such as learning, memory retrieval, or communication, 
biocomputing sees the human-computer relationship in predominantly 
biomolecular terms, displacing any interest in the human with an 
interest in biomolecular process. Biocomputing inverts the intelligent 
machine discourse's interest in cognition, and places higher priority 
on the seemingly secondary, lower-level processes of the organism at the 
biomolecular level. Biocomputing keeps the relationship of organism and 
machine at the level of organism and machine, and resists the analogous 
comparison of human and computer which both Turing and von Neumann carry 
out. The key to this difference is that for Turing and von Neumann, the 
differentiation between organism and machine takes place at the level of 
human cognition: intelligence/learning and memory/data processing are 
the limits of what computers can do. By contrast, biocomputing suggests 
that the difference between organisms and machines is not anything 
human, but rather a difference between living and non-living systems: 
cell metabolism, gene regulation, and cell membrane signaling are the 
limits of what computers can do. Again, we can detect not only a 
biologism but an anthropocentrism in Turing and von Neumann, in the sense
that the human is the standard against which computer performance is 
judged. Biocomputing does not necessarily assume this; it rather looks 
to the complex, "parallel" processes in the living cell as the 
threshold of computability. For Turing and von Neumann, what is at stake 
is essentially Mind, with the human as its most sophisticated 
manifestation (one that is nevertheless amenable to computation). For 
biocomputing, what is at stake is "life," by this meaning the 
ability of biomolecular systems to carry out exceedingly complex 
calculations "naturally." In a strange way, neither Turing nor von 
Neumann are really interested in computation, but rather the 
computational explanation of human-centered attributes such as 
intelligence, learning, or memory access. Biocomputing researchers, 
however, are centrally concerned with computation, with the 
understanding that computation in the 1990s has come to mean more than
it did in the 1950s. For biocomputing, computation becomes, in part,
synonymous with complexity and parallelism. In this context, "life" 
is both non-human as well as "intelligent."
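What "gene regulation as computation" amounts to can itself be
sketched as a Boolean network, in the tradition of Kauffman's random
Boolean networks (the three-gene wiring below is hypothetical): each
gene is on or off as a logical function of the others, and the cell's
"program" is the trajectory of the whole state vector.

    # A toy Boolean gene-regulation network (hypothetical wiring):
    # gene A activates B, B activates C, C represses A.
    def step(a: int, b: int, c: int):
        return (int(not c),  # A is on unless repressed by C
                a,           # B follows A
                b)           # C follows B

    state = (1, 0, 0)
    for t in range(8):
        print(t, state)
        state = step(*state)
    # The network falls into a repeating cycle - an oscillation
    # computed "naturally," with no processor in sight.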

We should be clear here: the interest of biocomputing, as well as the 
intelligent machine discourse, is in computer science. As we've noted, 
researchers in biocomputing are first and foremost interested in what 
molecular biology can do for computer science. Biocomputing is therefore 
not to be considered a subfield in biotechnology, simply because its 
interest is not in the medical, agricultural, or even economic use of 
biological components and processes. Biocomputing is interested in what 
biology can do for computers, rather than what computers can do for 
biology (the field of computational biology or bioinformatics). In this 
sense, biocomputing has more in common with research in supercomputing, 
parallel processing, and even quantum computing. While biocomputing 
makes use of biological components and processes, and while researchers 
use standard molecular biology techniques, biocomputing is predominantly 
interested in computer science issues. Like the earlier work of Turing 
and von Neumann, biocomputing approaches the organism-machine (or 
human-computer) relation in terms of computer science (computability, 
logical structure, intractability). But it reformulates the central 
concepts in terms of a non-human intelligence defined more in terms of 
complexity than in terms of anthropomorphic cognition.

We can therefore see the changes in the way that the computer is related 
to the human as a shift from an emphasis on "Mind" (or cognition) to 
an emphasis on "life" (or complexity). This is not just a historical 
shift, for much AI research is still very much interested in the 
human-specific properties of cognition, intelligence, and the concept of 
Mind. Likewise, one can already detect in the molecular biology research 
of the 1960s the seeds of a view of the organism as a complex system. 
The key link between the emphasis on Mind and the emphasis on life is the changing
artifact of the computer itself. From a historical perspective, it is 
obvious that the computer shifts from a room-sized, military-funded 
"electronic brain," to a microelectronic, industry-marketed 
"personal" computer. While the computer as an artifact plays many 
roles and takes on many meanings, the point to be made here is that, 
from the perspective of computer science, the modern digital computer of 
Turing and von Neumann conceives of computation as a cognitive function, 
whereas in the PC-era of biocomputing research, computation is seen as 
inherently non-conscious, distributed, and in parallel. The slogan of 
mainframe computing is "never mind that man behind the curtain"; the 
slogan of biocomputing is "even cells do it."

[Note: It might be noted here that biocomputing's emphasis on 
"life" is strikingly similar to the approach of a-life, which 
approaches "life" as a bottom-up phenomenon emerging from simple 
local interactions which take on an aggregate complex effect (thus 
swarms and flocks are favored highly in a-life simulations). While there 
is indeed a comparison to be made here, a-life has not been considered 
because, in its originary formulation, its primary interest was 
biological and not computational. A-life bears more comparison with 
biotechnology, in its interest in the rules and principles of biological 
life, than it does with biocomputing, whose primary interest is in 
computability and computer science. The point of the comparison between 
biocomputing and Turing/von Neumann can be seen as an unlikely
pairing between different interests united by a concern over 
computability (in silico or in vivo). Likewise, the pairing of a-life 
and biotechnology can be seen to be united by a common concern over 
elucidating the principles of biological functioning (in silico or in 
vivo). A further distinction between biocomputing and a-life is that for 
the latter, computation largely serves to simulate patterns of living
phenomena (and thus serves as an aid to the study of life). By
contrast, biocomputing sees computation as manifesting itself in a 
unique way in organisms as opposed to silicon computers. The computer 
does not model or simulate the organism; the organism is a computer in 
its own right. One cannot help but speculate what an a-life
biocomputing field would be like - organisms simulating organisms via
computation, a kind of universal biological Turing machine.]
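[The bottom-up recipe itself is compactly computational. In a
one-dimensional cellular automaton, each cell consults only itself and
its two neighbors, yet the aggregate pattern can be arbitrarily
complex - rule 110 is even Turing-universal, which lends the
"universal biological Turing machine" quip some bite. A minimal
sketch:

    # Rule 110: next state depends only on a cell and its two
    # neighbors, yet the global behavior is Turing-universal.
    RULE = 110

    def step(cells):
        n = len(cells)
        return [(RULE >> (cells[(i-1) % n]*4 + cells[i]*2
                          + cells[(i+1) % n])) & 1
                for i in range(n)]

    cells = [0]*40 + [1]  # a single live cell
    for _ in range(20):
        print("".join(".#"[c] for c in cells))
        cells = step(cells)

Simple local rules, complex global behavior: the a-life credo in a
dozen lines.]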

[from Eugene Thacker, _Biomedia_ (Univ. of Minnesota Press, forthcoming 2004)]
*************************************************************************

#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@bbs.thing.net and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: nettime@bbs.thing.net