t byfield on Wed, 7 Nov 2007 12:24:33 +0100 (CET)



Re: <nettime> Goodbye Classic ?


fc-nettime@plaintext.cc (Tue 11/06/07 at 05:58 PM +0100):

> You can work around it with emulators like SheepShaver and Basilisk II.

True, but it's also worth noting that this kind of solution is part of 
the broad trend of -- I'll use a phrase that floated by somewhere I've
forgotten -- devolving risk to the consumer. I hope I don't need to add 
that 'risk' and 'consumer' are very loaded terms; it might make more 
sense to speak of devolving additional roles and responsibilities to 
individuals. Either way, the point is that requisite levels of technical 
know-how are becoming an increasingly burdensome aspect of many kinds of
cultural activity and exchange. It doesn't take much technical experience
to check out a book from a library, make a photocopy, keep an eye on the 
movie listings of repertory movie houses, record a cassette tape, or bend 
a hanger into a makeshift TV antenna. And indeed the CD-ROM as a storage 
and transport medium has proven to be astoundingly durable; the first one 
I saw was in 1980 or 1981, when the word on the street was expressed in terms
of the faults of vinyl -- so, for example, it was claimed that you could
run over a CD with a car and it'd still play. I never actually saw an 
'industry rep' make that claim, but I assume that the person who owned 
that insanely expensive CD playback device heard it from an interested 
party -- back in that now-inconceivable world when it would take a full-
fledged [and -funded] technical specialist or investigative journalist to 
dredge up more accurate facts. That might seem digressive, but it's not:
the volume and velocity of ambient technical 'knowledge' needed to do 
anything more than merely 'consume' anything other than the 'media' put
in front of us are really breathtaking compared to just a few years ago.
What's even more amazing is the ease with which this knowledge transfer
takes place. My daughter, who's just over two, can: (1) reliably disable 
the keypad lock on my mobile phone using a technique I haven't figured
out; (2) plug a USB cable into a USB plug (as opposed to Firewire, say);
(3) unreliably but persistently manipulate a remote control to maneuver
through chapters on a DVD and songs on a Mac; and so on. No one taught 
her any of this; on the contrary, most of these activities are ultimately
little more than dressed-up versions of the kinds of cognitive puzzles 
that are typical features of toys made for kids that age. There are lots
of ways to interpret this; one important one, I think, is to acknowledge
that the flipside of what's often spoken of in terms of cultural loss is
also a testament to human ingenuity. You needn't look much further than,
say, the Dead Media archives to see how *little* has been lost. Or, if
you'd like, you can look back as far as the mythopoetic burning of the 
Library of Alexandria. Would it be nice to have all the books? Sure. But
let's not delude ourselves: if we *did* have them, the result would be 
parochial bickering among specialists in various fields of classical 
studies. The loss en masse has a much broader value, as a symbol in all
its luminous generality. The same applies to more recent references like
_Fahrenheit 451_: this or that camp may invoke it to promote the idea 
that its world is being destroyed, but, really, does anyone propose any
*specific* work will be lost? Maybe, but not nearly as often as people
invoke it as a metaphor for a more discursive loss.

But why should it be the responsibility of the 'consumer' to conjure
up ways to preserve particular media objects? Why shouldn't it equally
be the responsibility of those who made the works in question to 'update'
their works? Maybe, as you suggest and based on the immanent lessons of
the problem, into what they think will be more neutral, durable forms?

I've bumped into this in a few forms recently: HyperCard stacks for kids;
William Forsythe's _Improvisation Technologies_ CD-ROM; and an unfinished
project of my own, a 'fictional phone book' that consisted of a mass of 
my own amateur programming in high-level languages to extract data from 
the first consumer-level databases of national addressing info. In the 
case of the HyperCard stacks, there's a micro-fandom scene that's given
pointers about how to port the works (albeit in Japanese, which isn't
my forte); I wouldn't think of suggesting to the authors that they bear
a responsibility to port the works to a newer (and equally ephemeral)
format. In the case of Forsythe's work, I think there's a strong case
to be made that ZKM, which played an important role in publishing the
work, should play a similar role in preserving it. And in the case of my
own, unfinished project, the 'context' has changed so dramatically that
it would make sense to finish it not in order to complete it but to *reconstruct*
it as a token from the different world of, oh, 1993. Which brings us to
your next point:

> But, at the risk of sounding like a zealot, this is the typical
> example of how proprietary software platforms, and dependency on them
> [which includes dependency on binary compatibility], will always bite
> you in the end. The problem wouldn't exist if those media works had
> been written as Web CGIs in Perl outputting HTML 2.0, or generating
> JPEGs, for example.

You do sound like a zealot, which is fine -- no judgment of any kind is
needed or appropriate. But it's important to recognize that the kinds 
of standards you mention (Perl, JPEG, etc) are, as you suggest, more 
resilient because they're open -- but only 'more' resilient. Perl is 
not Latin, and JPEGs are not the codex; or maybe they are -- the result
is the same, only the time scale changes. Adhering to these standards,
whose 'openness' is defined and evaluated in large part against the
backdrop of dominant proprietary standards, would grant these works at
most a few more decades of compatibility. You could point at something
like I/O/D's Webstalker and lament the fact that it was written as a 
Director Projector -- but that fact was an explicit aspect of its
creation in its context and, equally important, one reason it was such
a success in that context. If it had been written in an open standard,
it would have been inaccessible to most of the people who appreciated 
it -- who weren't about to invest in some baroque hack like Tenon's 
MachTen (or better, A/UX). 
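
(To make that concrete, by the way -- and this is a sketch of my own,
not anyone's actual code; the 'name' parameter and the greeting are
just illustrations -- the kind of Perl CGI outputting HTML you have in
mind is roughly this much work:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # a minimal CGI of the kind described above: decode the query
    # string by hand, emit plain HTML, and depend on nothing but
    # core Perl and the standard CGI environment variables
    my %param;
    for my $pair (split /[&;]/, ($ENV{QUERY_STRING} || '')) {
        my ($key, $value) = split /=/, $pair, 2;
        next unless defined $key;
        for ($key, $value) {
            next unless defined;
            tr/+/ /;                              # '+' means space
            s/%([0-9A-Fa-f]{2})/chr hex $1/ge;    # %xx escapes
        }
        $param{$key} = $value;
    }

    my $name = defined $param{name} ? $param{name} : 'world';
    $name =~ s/[<>&"]//g;    # crude escaping, enough for a sketch

    print "Content-Type: text/html\r\n\r\n";
    print "<HTML><HEAD><TITLE>Hello</TITLE></HEAD>\n";
    print "<BODY><P>Hello, $name.</P></BODY></HTML>\n";

Nothing proprietary anywhere in it, which is your point -- and also,
as I say, only 'more' resilient: it buys decades, not permanence.)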

Back in the day, as they say, some SoHo artist whose name I've forgotten
made what then seemed like a horribly cynical piece, called "Talent," I
think -- a dozen or so headshots of the art stars of that art season. I
have no idea what the artist who made it thought, but it aged quickly 
and, I think, well -- and though I haven't seen it in close to twenty 
years, I'd venture to say that it's probably one of the finest pieces 
of art that came out of that scene. The particulars about who went on 
to become blue-chip or has-been are less important than the -- to use
the phrase of the day -- 'generative' quality of the distinctions as they
unfold over time. We don't know which standards will be accepted or 
applied in the future; and while it's certainly worth considering which
ones will survive, it's also worth considering how paying too much 
attention to the imagined future historicity of what we make can remove
us from making things here and now. 

Cheers,
T





#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mail.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org