Florian Cramer on Tue, 16 Mar 2004 00:56:08 +0100 (CET)



Re: <nettime> The Limits of Networking


Quoting Alex Galloway and Eugene Thacker:

> Protocol abounds in techno-culture. It is a totalizing control apparatus
> that guides both the technical and political formation of computer
> networks, biological systems and other media. 

[...]

The problem with the word "protocol" seems to me that computer science has
given it a meaning quite different from its common English one. Other
examples are the words "transparent" (used in software design in a sense
practically opposite to the common understanding, as a mapping of two or
more different symbolic systems into a simulated one, like the
"transparent" access of FTP servers directly in a desktop PC file
manager), "code" (used not in the common sense of "codifying system", but
as "codified symbols"), "interpretation" (understood in computer science
as the formal execution/translation of an instruction at runtime, whereas
in philosophy, literary studies and musical interpretation it means the
non-formal translation of [instructive or non-instructive] signs), and so
on.

What computer science and network engineering call "protocols" could just
as well, or perhaps better, be called [simple, formal] "languages",
because they simply serve the purpose of letting two connected entities
talk to each other. Yet another word, which you use yourself, is
"standard". It is a virtue of the Internet that its standards are open and
designed to be as agnostic as possible to the information transported; it
seems to me that preserving this design (with DRM schemes, patents etc. on
the horizon) is the issue, rather than, as you propose at the end of the
paper, pushing the protocols.


Of course it is right to say that "protocols", "standards", "languages" or
whatever we call them are systems of control in the sense of what
theoreticians such as Lacan and Foucault have called "symbolic order" or
"discourse"; if this applies to common human language, it no doubt applies
to formal languages as well. But in praxis, it boils down to the question
of how the standard is designed, i.e. how much freedom it allows and who
controls it in which way; see Lawrence Lessig's analysis of the Internet
vs. AOL.

But as with any play, consisting of a rule set and its free execution,
control is never so total that it would not permit freedom, a paradox best
seen in Oulipo writing with its self-imposed formal restraints (like
writing a novel without a single occurrence of the letter "e", as in
Perec's "La Disparition"). Freedom and control are thus not mutually
exclusive, but mutually dependent on each other. To envision communication
systems without control - i.e. languages without rules, networks without
protocols - and find them desirable would be an utterly infantilist vision
of a pre-language paradise. (And to read Freud, Lacan or Foucault in this
way would be no less naive.)

> Put simply, protocols are all the conventional rules and standards
> that govern relationships within networks.

Yes, but the reality is more complex because network protocols can be
layered onto each other and thus used in quite unpredictable ways.

To stick with the example of the Internet, it would be false to assume
that because http is a "hypertext transfer protocol", it would force
everything under its "totalizing control apparatus" (to quote your paper)
into hypertext format. The counter-examples are abundant and well-known,
and are topped by the fact that any imaginable network language can, with
the right software tools, be steganographically tunnelled through http -
just as you can subvert the "totalizing control system" English by using
it merely as a cryptographic container for a text written, for example, in
the cosmic Zaum language of the futurist poet Velemir Chlebnikov - apart
from the fact that you can still use it to write novels like Joyce's
Ulysses, or, in the case of http, web sites like www.jodi.org.
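The tunnelling point above can be sketched in a few lines: http carries
its payload as opaque bytes, so any "language" whatsoever passes through
it unconstrained. A minimal illustration (not actual steganography; the
path /inbox and host example.org are invented placeholders):

```python
# Minimal illustration: HTTP transports arbitrary payloads unchanged.
# The "language" of the payload is invisible to the protocol layer.

def wrap_in_http(payload: bytes, host: str = "example.org") -> bytes:
    """Wrap any byte sequence in a syntactically valid HTTP/1.1 POST."""
    headers = (
        f"POST /inbox HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/octet-stream\r\n"
        f"Content-Length: {len(payload)}\r\n"
        f"\r\n"
    ).encode("ascii")
    return headers + payload

def unwrap_http(message: bytes) -> bytes:
    """Recover the payload: everything after the blank line."""
    _, _, body = message.partition(b"\r\n\r\n")
    return body

# A Zaum poem, a binary file, another protocol's packets: all the
# same to HTTP, which only sees a Content-Length and a byte stream.
zaum = "бобэоби пелись губы".encode("utf-8")
assert unwrap_http(wrap_in_http(zaum)) == zaum
```

The protocol constrains the framing, not the content - which is the sense
in which its "control" is never total.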

 
> We need only remind ourselves of the military
> backdrop of WWII mainframe computing and the Cold War context of ARPAnet,
> to suggest that networks are not ahistorical entities.

Yet the history is more complex than popular media-history reductionism
tells it. The Arpanet/Internet was funded by the military, but designed by
academics - many of them with hippie backgrounds - who used the rhetoric
of "nuclear-strike resistance" to get the money for it. Today, you
probably have to write something about "e-commerce opportunities in a
globalized world" or "terrorist-proof network design" if you run a C.S.
lab and want a grant for your work. (Or, if you do humanities research on
the subject, don't forget to write the phrase "interdisciplinary cultural
research" into your application letter, at least here in Germany.)

> and so forth. What we end up with is a *metaphysics of networks*. The

Agreed, and for this Deleuze/Guattari and their popular reception must be
blamed to no small extent - an aspect of D/G in which their indebtedness
to vitalist philosophy [and hence right-wing philosophy] shines through
most clearly. I wonder whether that critique could also be applied to the
now-fashionable term "multitudes" (which I plainly [mis?]read as a
Deleuze-Guattarian update of the classical Marxist "masses").

 
> Biological or computational, the network is always configured by its
> protocols. We stress this integrative approach because we cannot afford to
> view "information" naively as solely immaterial. Negri notes that "all
> politics is biopolitics," and to this, we would add that all networks are
> not only biopolitical but biotechnical networks. Protocological control in
> networks is as much about networks as *living networks* as it is about the
> materiality of informatics.

I may not quite grasp this argument, but it seems to me that here you fall
into the trap of mistaking the map for the territory, or the signifier for
the signified, by taking the sloppy engineering terminology of "protocol"
too seriously.

> Thus we are quite interested in a understanding of political change within
> networks. What follows might be thought of as a series of challenges for
> "counterprotocological practice," designed for anyone wishing progressive
> change inside of biotechnical networks.

While you later disclaim neo-luddite tendencies, "counterprotocological
practice" is a term which almost screams to be misread as a desire for a
pre-linguistic status quo.

> but to push technology into a hypertrophic state, further than it is meant
> to go. We must scale up, not unplug. Then, during the passage of
> technology into this injured, engorged, and unguarded condition, it will
> be sculpted anew into something better, something in closer agreement with
> the real wants and desires of its users.

This, in my view, echoes a "media archeology" you might not have been
aware of: that of language utopias since at least medieval kabbalism. But,
to stay with the previous metaphor, should a French person who has read
Lacan and Foucault focus all her/his subversive energy on the Académie
française?

I also note that your own push for a "counterprotocological practice"
happens solely on the level of the signified, not the signifier - or, in
other words: the transported data, not the transport protocols. Would you
consider the grammar of the English language, and the Latin alphabet
encoded into ASCII whose bits are then distributed via the SMTP and
POP3/IMAP protocols over TCP/IP to Nettime subscribers, issues as well?
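The layering just described - an English sentence, ASCII-encoded, framed
as a mail message, carried in transport segments - can be sketched as
nested envelopes, each protocol treating the layer above as opaque
payload. A toy sketch with invented placeholder addresses and a grossly
simplified framing (real SMTP and TCP are far richer):

```python
# Sketch of protocol layering: each layer wraps the previous layer's
# output as an opaque payload. Framing is deliberately simplified.
import struct

def ascii_layer(text: str) -> bytes:
    """The Latin alphabet encoded into ASCII."""
    return text.encode("ascii")

def smtp_layer(body: bytes, sender: str, rcpt: str) -> bytes:
    """Simplified mail framing: headers, blank line, body."""
    headers = f"From: {sender}\r\nTo: {rcpt}\r\n\r\n".encode("ascii")
    return headers + body

def tcp_layer(segment: bytes, src_port: int, dst_port: int) -> bytes:
    """Toy TCP-style framing: ports and a length prefix, then payload."""
    return struct.pack("!HHI", src_port, dst_port, len(segment)) + segment

# The addresses below are placeholders, not real list addresses.
frame = tcp_layer(
    smtp_layer(ascii_layer("Re: The Limits of Networking"),
               "sender@example.org", "list@example.net"),
    src_port=25000, dst_port=25)

# The English sentence passes through every layer unchanged:
assert b"Limits of Networking" in frame
```

None of the layers constrains what the layer above says - only how it is
delimited, addressed and delivered.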

-F 

-- 
http://userpage.fu-berlin.de/~cantsin/



#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@bbs.thing.net and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: nettime@bbs.thing.net