Tjebbe van Tijen on Wed, 21 Oct 1998 09:34:36 +0200 (MET DST)


<nettime> ARS OBLIVISCENDI (the nettime zkp5 version)

how things digital will be lost and forgotten
September/October 1998 [1]

Tjebbe van Tijen


The recent phenomenon of 'cyberclasm' started with radical student actions
in North America against university and military administration
facilities. One of the earliest examples was in 1969 at Sir George Williams
University in Montreal where, during a conflict about racism on the
campus, students stormed the computer center of the university, threw
thousands of punchcards out of the windows and smashed the computer
equipment. At that time computers were mostly stand-alone machines with
limited storage capacity, and data was stored either on punchcards, which
needed to be processed mechanically, or on reels of magnetic tape. A year
before a little book with the title "The beast of business: a record of
computer atrocities" was published in London, containing "a guerilla
warfare manual for striking back" at computers that, according to its
author Harvey Matusow, were on their way to "grab power": "from now on it
is them or us". [2] The whole book had a playful Luddite tone and the
proposed guerrilla actions were rather mild, like punching extra holes in,
or taping over holes in, punchcard payment slips and other administrative
documents which, at the time, were starting to be sent out to the general
public. Using a magnet to de-magnetize computer forms with magnetic strips
was another proposed method to stop the advance of the computer in civil
administration. Matusow mentions the military use of computers, but he did
not seem to understand their function very well, as becomes clear in his
slogan: "It is the computers that want war". It is of course human beings
that want and make war. It is the social network of politicians,
industrialists, the military and scientists, in short the
'military-industrial complex', that started to use computers for war
simulations and war logistics. The first IBM supercomputer, built in the
fifties, was used to simulate the effectiveness of atomic bombs. Matusow
published his book in 1968, when the Vietnam War had already been raging
for four years. 1968 is also the year in which the concept for a network of
military and civil computers (ARPAnet) was proposed, a decentralized and
flexible form of communication that would be able to resist a disruptive
nuclear strike by the enemy. The growing importance of computers in
warfare was not yet recognized by the radical movements of that time. The
different manuals for urban guerrillas of the late sixties and the
beginning of the seventies do not mention computer facilities as a target
[3]; the emphasis is still on radio, television, telephone switches and
electrical power facilities. It was in May 1972 that a first serious attack
on a military computer center was undertaken, by the 'Kommando 15.
Juli', a group related to the German 'Rote Armee Fraktion'. The target
was situated in the headquarters of the American forces in Europe in
Heidelberg; the reason for the attack was to protest against the stepping
up of aerial bombardments in Vietnam. The guerrillas used two bombs with a
power of 200 kg of TNT. Buildings and equipment were damaged and two
American soldiers killed. It never became clear whether the army computer
was really hit. In the years after, it is the metamorphosis of the military
ARPAnet into the civil network of networks called 'Internet' that has
created opportunities for new forms of 'cyberclasm' and guerrilla: no
longer direct physical attacks on personnel and equipment but indirect
attacks, using the computer system itself as a basis for disruptive and
destructive actions.


It is an old tactical adage that each advantage carries a disadvantage in
it; this holds for both assailant and defender. Empires: Chinese,
Mongol, Roman, Napoleonic, and their modern heirs, can only grow on the
basis of an efficient transport system for goods, armies and information.
High road systems with facilities for resting, refreshing and changing of
horses and vehicles were created to make such transport movements faster,
but at the same time these high roads, with their valuable traffic,
created new opportunities for robbers, bandits and other highwaymen to
ambush and take what they could not get otherwise. Expanding sea traffic
showed a similar development, with pirates lying in wait to catch some of
the rich cargo moving between colony and imperial motherland. The new air
traffic system continues this tradition of robbery and piracy: the
highwayman evolved, became a train robber, a hijacker. All of these
freebooters, over the centuries, have one activity in common: 'stealing
something while in transit'. The modern highwayman or -woman roams the
'information highway', lurking, waiting for the right moment to grab
what is not meant for her or him. Here too it is the intensified movement
of information that creates opportunities, because what is in transit,
between one safe haven and another, cannot be fully protected. The
metaphor of the 'information highway' relates also to the tradition of
drinking, prostitution and gambling at halting places and ports, from
Roman times till our century, and the constant fight of authorities trying
to ban such debauchery. Sex has been closely related to the Internet from
the very beginning, and similarly authorities have been trying to eliminate
it from the Net. This can only be partly successful: one too lusty site
rolled up, and a new one pops up a bit further down the road. There is also
illegal gambling on the Internet, and selling of pleasure drinks and drugs
that would otherwise not, or not so easily, be available. Closing down the
road itself would be the most effective measure, but as modern society
needs information traffic it has to learn to live with the unwanted side
effects. Patrolling the Net, by human and software agents, has made it
possible to ban some of this unwanted information [4], but there is an
inherent danger in the principle that some authority will decide for
individuals what to read, what to see and what not. The 'Index Librorum
Prohibitorum' (Index of Forbidden Books) of the Roman Catholic Church dates
back to the end of the fifth century and was meant to prevent
contamination of faith and corruption of morals. It was regularly
published from 1559 onwards and only ceased publication in 1966. With the
introduction of filtering software that will either stop what is not
approved or, more radically, only let through what is approved, the old
principle of world-wide censorship as practised by the Catholic Church
has been re-introduced by 'modern' governments and affiliated
organisations at the end of the 20th century, on a scale bigger than ever.
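The two opposite filtering principles mentioned here, stopping what is not approved versus only letting through what is approved, can be sketched in a few lines of Python; the site names are of course invented for illustration, not taken from any real block list:

```python
# Two opposite filtering principles, with hypothetical site names.
BLOCKLIST = {"lusty-site.example", "gambling-den.example"}     # stop what is not approved
ALLOWLIST = {"approved-news.example", "state-portal.example"}  # only let through what is approved

def blocklist_filter(site: str) -> bool:
    """Permissive principle: everything passes unless explicitly banned."""
    return site not in BLOCKLIST

def allowlist_filter(site: str) -> bool:
    """Radical principle: nothing passes unless explicitly approved."""
    return site in ALLOWLIST

# A new, unknown site slips past the blocklist, but never past the allowlist,
# which is why a banned site that "pops up a bit further down the road"
# defeats the first principle but not the second.
print(blocklist_filter("new-site.example"))   # True
print(allowlist_filter("new-site.example"))   # False
```

The sketch makes the asymmetry visible: the blocklist must chase every new site, while the allowlist censors by default.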


Information that is not travelling is also not safe, even when securely
stored behind 'firewalls'. As in fairy tales, however strong a
fortification is made, in the end someone will be able to enter, often not
by brute force but by deception. It is not surprising that, in the age of
digital computers, mythological terms like 'Trojan Horse' are still
used for such cunning tactics, whereby unsuspecting computer users let
hidden malicious information through the gates of their equipment, which at
an unexpected moment starts to wreak havoc and destroys valuable
information. One can go back in time two millennia plus three centuries to
find this principle described in the oldest known text on the tactics of
war, Sun Tzu's 'Ping Fa' or 'The Art of War'. Right at the beginning of
this ancient Chinese text it is stated: "all warfare is based on
deception". Sun Tzu clearly distinguishes between direct and indirect ways
of fighting and he favours the latter: "indirect methods will be
needed". [5] In 1995 the
'National Defense University' at Fort McNair in Washington DC instituted
a yearly award named after this Chinese war theoretician: 'The
Sun Tzu Art of War in Information Warfare Research Competition'. [6]
Recent prize winners are a group of researchers from a company working for
military organisations. They thought up an imaginary scenario that could
have taken place during the Balkan conflict in September 1998: a group of
Serbian political activists interferes with the radio frequencies of a
temporary airfield at the Bosnian/Croatian border where NATO troops are
flown in during a flaring up of the conflict in Bosnia. The result is two
military airplanes crashing. The Serbian cyber activists, immediately
after, inform the whole world press by email and put up a political
statement on a web site on a server in Amsterdam. CNN, Reuters and others
broadcast and publish the statement including the web-page address. Within
twenty-four hours the web page has a million 'hits'; many visitors come
from state intelligence organisations. Any computer used to access this
web site is infected by a 'Trojan Horse' program that the activists have
embedded in the web-page, a program that starts to delete all files and
hard disks after twenty-four hours. This exercise in military fiction is
used as an explanatory introduction to what 'information warfare' could
be. The authors warn that "the US military could find it difficult to
respond against a small and digitally networked enemy", and they propose
the establishment of "Digital Integrated Response Teams (DIRTs)" made up
of "highly trained information warriors" from military and law enforcement
agencies, to counter "information terrorism". [7] These state 'information
warriors' are supposed to work from "remote computers", using "anonymous
response" without open display of force, in order to avoid any public
sympathy for political activists, fighting a possible 'right cause' and
being attacked by the state. Until now some incidents in which strategic
state information has been accessed from the outside have become known;
these cases were highly played up in the press, but none of them seems to
have really posed an enduring security threat to any state in the world.
At many levels of society it has become clear that the criminalization and
persecution of computer hackers often misses the point: in most cases
the sole aim of a hacker is to master computer and encoding systems, to
prove how far, how deep one can go, and even most of the more politically
motivated hackers tend to have some basic loyalty to the nation state.
There might be infringements of ownership of copyrighted and otherwise
protected digital material, but these incidents are merely based on a
different interpretation of what acceptable forms of ownership are, and
differ from activities of organised crime or terrorist attacks on the
functioning of the state. In several academic and military studies this
more differentiated view on the 'hacker scene' can be found and some
authors even see hackers as a positive force in society that can be tapped
as a resource to improve security systems. [8] This too is in essence an
ancient tactic, as one can read in the last chapter of Sun Tzu's 'Art of
War' that describes the use of spies: "The enemy's spies who have come to
spy on us must be sought out, tempted with bribes, led away and
comfortably housed. Thus they will become converted spies and available
for our service." [9]


As the computerised informatisation of all levels of society progresses, a
feeling of vulnerability is growing. Recently the Clinton administration
issued a 'White Paper on Critical Infrastructure Protection' that
describes what to do against "nations, groups or individuals" that "seek
to harm us in non-traditional ways". [10] Others use catch phrases like an
'Electronic Pearl Harbour' and 'Cyberwar, Blitzkrieg of the 21st century'
to fire the imagination of politicians and civil servants who decide about
budgets for new research, new special task forces and new weapons. The
reasoning is constant through human history: what the enemy can do to us,
we should be able to do to the enemy. Apart from the indirect methods by
hackers, computer criminals and their state counterparts the 'information
warriors', a whole new arsenal for more direct forms of 'information war'
is being prepared: guns that can fire 'High Energy Radio Frequencies',
hitting electronic circuits with an overload that will disrupt any
radio and television transmitter, telephone switch, computer network,
aircraft and other transport system dependent on electronics; miniature
robots based on 'nano technology' that can physically alter or destroy
electronic hardware; low energy lasers that can damage optical sensors
used in many modern vehicles and equipment; and, best of all, the use
of an Electromagnetic Pulse (EMP), originally discovered as a side
effect of nuclear bombs, which will disable all copper-wired electronic
circuits, halting all electronic equipment and communication that is not
specially shielded against this form of attack. [11] There are plans for
the usage of the EMP weapon, ranging from what in military terms is called
"Shock & Awe", whereby whole urban areas or battlefields will be blasted
with such an energy that all electricity stops functioning, to more
'precise' targeting of single objects within a range of a few hundred
meters.
Modified cruise missiles for such confined operations already exist. There
is some analogy with the neutron bomb plan that dates from the Cold War
period, a bomb that would not destroy built structures and the
like, but just the humans that happened to live there. There was an outcry,
at the time, against such a perfidious plan, including the voice of the
official Soviet Union propaganda machine that attacked the satanic
scientists of the United States, forgetting for a moment its own nuclear
weapon arsenal, with its potential to destroy both built structures and
humans in one massive blow. Now there is a threat of being bombed out of
our electronic age by a huge electromagnetic pulse. It is difficult to
imagine a world without electricity, but one wonders what it would be like
to live in a more tangible world.


The basis of most electronic documents is the recoding of human-readable
text and graphics and machine-readable sound and video. At all stages of
production and reproduction, different layers of technology reside between
the human organs of perception and digital documents. Recoding as such is
not a new phenomenon. It is the recoding of language into written text that
"permits us to create a record that many other people, far distant from us
and from one another in time and space, can read" [12]. The non-electronic
recoding of language by hand, with its directly readable physical marks on
a physical surface, left us with only a limited number of documents from
early ages. A lot of hand-written documents were lost over the centuries;
many did not even survive their own epoch. The shortage of good writing
surfaces like papyrus and parchment meant that the reusable wax tablet was
often favoured. Parchment was rare and expensive and for that reason often
'recycled', reused as a 'palimpsest' by washing and scraping off the text
it carried. The use of paper and the multiplication of writing by the
printing press fundamentally changed this situation. As noted before, the
dispersal of multiple copies of a (printed) text led to the long term
preservation of that text. Digital documents, however, are of another
order: they are no longer tangible objects but "essentially an invisible
string of stored electrical voltages" [13]. First it was the scarcity of
carriers for storing these electric currents (floppies, hard discs and the
like) that led to the same practices as the recycling of wax tablets and
parchment in antiquity: erase and use again. Later the price of digital
storage dropped dramatically, but by then it was the problem of managing
large quantities of half-labelled and messy information that led to the
same decision. As the fixity and multiplicity of the printed text is more
and more supplanted by the flexibility of the multiplied digital document,
we come to understand that the new media pose problems when it comes
to long-term preservation of content. Standards for computer hard- and
software are in constant flux, and backward compatibility and long-term
support do not generate enough profit for the industry to be taken into
account sufficiently. Bankruptcy of a firm, or defeat of a standard on the
marketing battlefield, can mean a sudden loss of massive amounts of
information. Eternal transcoding of digital information from old to new
standards will need to become a routine operation within bigger
institutions, but such facilities will mostly not be available to smaller
institutions and the private sector. This last sector of society was
already under-represented in archives and other deposits for historical
studies, and now, in the digital era, even fewer traces will remain of
personal administration, letters, e-mail, unpublished manuscripts and the
like. Going through the belongings of someone who died, one might consider
keeping some letters, notebooks or photographs, things we can read
directly, but what to do with an outdated computer, a shoe box with
unreadable floppies, mysterious-looking cartridges and unlabelled CDs?
Their fate is to be burnt at the waste disposal or to rust away in
junkyards till the moment they are recycled and become usable materials
again. In this sense we saw a similar thing happen earlier this century
when old cinematic film was recycled for its silver content.
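The 'eternal transcoding' from old to new standards described above can be pictured as a routine like the following Python sketch, which migrates text from an ageing character encoding to a current one; the choice of Latin-1 and UTF-8 is an illustrative assumption, not a prescription:

```python
# Illustrative sketch: migrate a document from a legacy encoding to a current one.
def transcode(data: bytes, old_encoding: str = "latin-1",
              new_encoding: str = "utf-8") -> bytes:
    """Decode bytes written under an old standard, re-encode under a new one."""
    text = data.decode(old_encoding)   # interpret the old stored bits as characters
    return text.encode(new_encoding)   # fix them again in the current standard

# A phrase saved years ago under the old standard...
legacy = "ça sera oublié".encode("latin-1")
# ...is carried over to the new one without loss:
modern = transcode(legacy)
print(modern.decode("utf-8"))  # ça sera oublié
```

Each such migration must be repeated every time a standard is superseded, which is exactly why the text calls it a routine operation that smaller institutions will struggle to sustain.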


Global and direct availability over the Internet of a wide variety of
electronic documents has led to a speeding up of information circulation on
the one hand, and a constant loss of information on the other. The life
cycle of content that is made available over the Internet is getting
shorter and shorter. Thousands of web pages are thrown away each day for
various reasons: storage costs, lack of space on computers, hard disc
crashes and other digital disasters, information getting outdated, being
unwanted, censored, neglected. Strangely enough the information is often
not directly lost but fades away slowly, like the light of a star that
itself does not exist anymore but can still be seen in the sky.
Information is duplicated on computers elsewhere in the form of mirror
sites and so-called 'proxies', which temporarily store often-requested
information to lessen the amount of traffic over the Internet. In the end
these doubles are not needed anymore and will be erased as well.
Some see this as a positive aspect: why pile up the information debris
of each generation on the already towering heap? Others are worried about
the digital void of historical material we will leave for posterity.
Megalomaniac plans, with an imperialistic and totalitarian undertone, to
periodically store 'all information' available on the Internet and
associated networks in gigantic digital warehouses have been proposed.
[14] It seems more logical that the old principle of 'survival through
dispersal' will have a longer lasting effect on the preservation and
availability of digital documents from the past. Even if only a very small
percentage of the electronic material on the global network of networks
is preserved, this will be of such a magnitude and diversity that
special techniques of 'digital palaeography', 'data mining' and
'information recovery' will be needed to dig up something that
will make any sense to future generations. One can think of methods
like those developed in 'experimental archaeology', whereby theories on
extinct technology are tested in real-life situations, or the playing of
classical music on historical instruments. Another approach is the
simulation of the functioning of old hardware and software on new
machines, be it military analogue computers of the fifties or one of the
popular hobbyist computer types of the seventies and eighties. The real
experience of the functioning and use of this equipment will be lost in
this process, but is not most of what we think we experience from the past
such a kind of simulation anyway?

The traditional containers of information (books, periodicals, gramophone
records, audio CDs, film and video titles produced for the consumer
market) fix information in such a way (cover design, title, colophon,
credits, numbered series, publisher, place of publication, year, etc.)
that we can easily deduce what they are about and have some understanding
of the context in which they were functioning. It took more than four
centuries for this kind of standard to slowly develop and become commonly
used. From this perspective it is not surprising that the use of new
standards for the description of networked electronic documents (a reality
that has existed for hardly two decades) is somewhat lagging behind. There
are standards for storing data about data in each electronic document. Part
of this 'meta-data' is already automatically generated at the moment of
creation of a new document (time, date, hardware used and protocols needed
to display the document again). Without this self-referential information
the documents could not even be distributed and consulted. When it comes
to the description of content (author, title, subject, etc.), new standards
do exist, but they are little known and hardly used in a proper way. This
means that there is an immense amount of potentially valuable and
interesting information on the Internet that remains unnoticed and will be
forgotten because its content is not properly described. However powerful
the 'search engines' used, machine protocols cannot sufficiently
distinguish between meaningful and non-meaningful occurrences of the
search terms used. Most search results give so many 'links' that one
cannot possibly follow all of them. In this way valuable information is
"lost in the deafening babble of global electronic traffic". [15]
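One such content-description standard of the period is the Dublin Core set of meta-data elements, which can be embedded in an ordinary web page. The sketch below, using only Python's standard library, shows how such descriptive fields might be written and read; the page and its field values are invented examples, not an existing document:

```python
from html.parser import HTMLParser

# Invented example page carrying Dublin Core-style descriptive meta-data
# (author, title, subject) alongside its actual content.
PAGE = """<html><head>
<meta name="DC.creator" content="Tjebbe van Tijen">
<meta name="DC.title" content="Ars obliviscendi">
<meta name="DC.subject" content="digital preservation">
</head><body>...</body></html>"""

class MetaExtractor(HTMLParser):
    """Collect all meta tags whose name starts with 'DC.'."""
    def __init__(self):
        super().__init__()
        self.fields = {}
    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").startswith("DC."):
                self.fields[a["name"]] = a.get("content", "")

parser = MetaExtractor()
parser.feed(PAGE)
print(parser.fields["DC.title"])  # Ars obliviscendi
```

A page described this way can be found and understood by machines; the essay's point is that pages lacking such fields, which is most of them, cannot.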


There are people who think that such a comparison of new electronic
information and communication systems with traditional media is not
fruitful. They see a loosening of the bonds that bound text, sound and
image to their respective media, and a fusion of these elements in a new
phenomenon: multimedia, something of a completely different order where
fixity and linearity have been supplanted by fluidity, a dynamic,
changeable recombination of elements, a process that in its ultimate form
will abolish the notion of finite and finished works. This new form of
human communication has one of its theoretical bases in literary and
semiological theories developed three decades ago, which pointed to the
relationships within a given text to a multitude of other texts and the
possibility of a new kind of more personal and active reading. This theory
of the possibility of different 'readings' of a text was also extended to
the visual realm, and with the new technical opportunities of computers to
interact with a corpus of many different linked text fragments, these
theoretical concepts took concrete form: hypertext. [16] The first
experiments were with interlinking, some say weaving, different blocks
of text and images in a virtual library made up of such 'lexias' and
icons, still residing on one computer or a well-controlled internal
network of computers. With the advent of the Internet, the concept of
'hypertext' has been widened from linking materials on a 'wide area
network' to linkages made over the 'World Wide Web'. With the growing
enthusiasm for the seemingly unending possibilities, some supporters were
talking of 'the Net' as a global brain of interconnected and linked human
resources. But it is the linkages that form the weak links in the
chain. Already on the local level, with high frequency, a followed link
will result in an error message: 'cannot be found'. On a global level
this new digitally unified human brain suffers even more from amnesia.
One cannot escape the comparison with printed media here: it is like
reading a book and suddenly missing a few pages, or discovering that some
of the footnotes have been torn out, or trying to read a newspaper after
someone has cut a series of news clippings from it. The fascination with
the Internet is like the fascination with the beauty of a spider web
dancing in the wind; it is based on the knowledge of its fragility, one
unlucky instant will destroy all the work. This ephemeral aspect can of
course also be seen in a positive way: enjoy the moment itself, do not
leave too many traces, leave the others, the generations after you, some
space to discover things for themselves. Ideally a combination of the two
elements might develop, whereby some examples of the constantly broken
threads of the Web will be collected and preserved, while the rest will be
washed away by time.


1 A full version of the original text 'Ars obliviscendi' can be found in
'Memesis, the future of evolution/Ars Electronica 96'; Springer Wien/New
York; 1996; p.254- or at <>

2 'The Beast of Business: a Record of Computer Atrocities' by Harvey
Matusow; Wolfe Press; London; 1968. In the late sixties Harvey Matusow
lived as an American expatriate in London and moved around in the
'cultural underground scene' of that city. Before that he had worked in the
US as an FBI agent and played a role as a paid witness during the
anti-communist McCarthy trials. For his actual activities see

3 For example: Alberto Bayo '150 Questions for a guerrilla', 1959/1965;
Carlos Marighella, 'Minimanual of the urban guerrilla', 1969/1970; Edward
Luttwak 'Coup d'Etat', 1968. 

4 An example of such a patrol facility is Cyber Patrol Corporate that uses
'CyberNOT Block List' a listing of researched Internet sites containing
material which might be found questionable: "Among the categories on the
CyberNOT list are Partial Nudity; Nudity; Sexual Acts/Text; Gross
Depictions; Intolerance; Satanic or Cult; Drugs/Drug Culture;
Militant/Extremist; Violence/Profanity; Questionable/Illegal & Gambling;
Sex Education and Alcohol & Tobacco." For details see

5 A full text copy can be found in the Etext archives of the Gutenberg
project <>

6 <>

7 'Information Terrorism: Can You Trust Your Toaster?' by Matthew G.
Devost, Brian K. Houghton, and Neal A. Pollard of Science Applications
International Corporation; 1996; <>

8 Matthew G. Devost: "The United States should utilize hackers, and give
them recognition in exchange for the service they provide by finding
security holes in computer systems." in "National Security In The
Information Age"; thesis of The University of Vermont, 1995 (electronic
text version). 

9 Sun Tzu ibid. 

10 The full text can be found at <>

11 One of the many overviews from a military point of view can be found in
a paper by the Australian Air Power Studies Centre on
12 Paul Delany and George P. Landow in 'Managing the Digital Word: the
text in an age of electronic reproduction', chapter in 'The Digital Word:
text-based computing in the humanities'; MIT Press; Cambridge/London; 1993;

13 Pamela Samuelson in 'Digital media and the changing face of
intellectual property law'; Rutgers Computer and Technology Law Journal;
16 (1990); p.334. 

14 An example is the initiative of Brewster Kahle, who in 1996 founded
the 'Internet Archive'. In an article in Scientific American he estimated
the data volume at that time as: WWW: 400,000 (1,500 GB; growing 600
GB/month); Gopher: 5,000 (100 GB; declining, from the Veronica index);
FTP: 10,000 (5,000 GB); Netnews: 20,000 discussions (240 GB; 16 GB/month).
More details at

15 Paul Delany and George P. Landow, op.cit., p.15; the full quote reads:
"The problem on networked communications has become not how to acquire
texts but how to sift out the ones we value from the deafening babble of
global electronic traffic." 

16 For a good description of the concept and history of 'hypertext' see
George P. Landow, 'Hypertext: the convergence of contemporary critical
theory and technology'; Johns Hopkins University Press; Baltimore/London;
1992. 

#  distributed via nettime-l : no commercial use without permission
#  <nettime> is a closed moderated mailinglist for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: and "info nettime-l" in the msg body
#  URL:  contact: