Philip Galanter on Thu, 16 Jan 2003 11:13:18 +0100 (CET)
Re: <nettime> Institutionalization of computer protocols (draft chapter)
NOTE: I originally wrote this as a direct response to Alexander, but then decided that the issues are general enough that others may be interested...so I've sent it to the list...

Hi Alexander.

Just a short note to say that while the historical narrative you provide in this chapter is quite nice, and accurate so far as I can tell, it is almost entirely orthogonal to the argument you are trying to make.

Of course computer protocols are ultimately homogeneous and rigidly standardized. If they were not, the internet simply wouldn't work...just as cars wouldn't work unless there were standards for the sizing of threads on machine screws and the like. But to view this as somehow "anti-diversity" in an implied social sense is a sleight of hand hiding a logical fallacy. The network mechanism is rigid, but that rigidity is precisely what allows the potential for wildly free expression. The two shouldn't be conflated or confused. This isn't a subtle postmodern contradiction...it is as easily understood as, say, the rigid design of a printing press and movable type allowing one to publish the thoughts of Marx or Hitler with equal ease. (A first sketch of this point, in code, appears after my signature.)

The one area where the technology may start to coerce meaning and expression is in the realm of semantic webs or other metadata systems based on XML or similar schemes. At that point a line is crossed: one is trying to standardize how ideas are organized rather than how intrinsically meaningless bits are moved about.

The realm of metadata has its own history, and *that*, I think, is a realm ripe for deconstruction. But it is also an old story. For example, in a vaguely similar way, how might library cataloging systems, and indeed our modern taxonomy of academic disciplines, skew thinking, discussion, and power relations?

Note that the XML standard itself is, so far as I can tell, quite neutral as well. But any particular implementation of a system using XML, any semantic web as it were, is a particular interpretative system of meanings. (A second sketch below illustrates this.)

There are technologies on the horizon that will compete with (against?) the heavy hand of predetermined semantic schemas. These will involve systems grounded in a complexity-based view of information and language, which allow associations to form as dynamic emergent phenomena. Google is a simple, albeit naive, example of such an alternative. (A third sketch below caricatures the idea.)

Anyway, I understand (I think) where you are coming from, but I'd have to predict that your argument will turn out to be the sort that critical studies folks find compelling and others quickly discard. Traditional networking protocols are not a good target for this kind of analysis, but the nascent attempt to standardize semantics in the name of information technology is begging for this kind of critique.

cheers,
Philip

--
=-=-=-=-=-=-=-=-=-=-=- http://philipgalanter.com -=-=-=-=-=-=-=-=-=
Philip Galanter           New York University       phone: 212-998-3041
Associate Director        251 Mercer                fax: 212-995-4120
Arts Technology Group     New York, NY 10012        internet: galanter@nyu.edu
       I T S   A c a d e m i c   C o m p u t i n g   S e r v i c e s
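P.S. Three toy sketches of the points above, in Python, for the technically inclined.

First, "rigid mechanism, free content." The 4-byte length-prefix framing here is invented for illustration (it is the general pattern many TCP-based protocols use, not any specific standard). The envelope is rigidly standardized; the payload is anything at all:

    # A minimal sketch of a rigid framing rule that carries any content.
    # The framing convention is assumed, not taken from a real protocol.
    import struct

    def frame(payload: bytes) -> bytes:
        # The envelope is rigidly standardized:
        # a 4-byte big-endian length, then the payload.
        return struct.pack(">I", len(payload)) + payload

    def unframe(message: bytes) -> bytes:
        (length,) = struct.unpack(">I", message[:4])
        return message[4:4 + length]

    # ...but the content is unconstrained: Marx or anything else
    # ships equally well through the same rigid mechanism.
    for text in (b"Workers of the world, unite!", b"any bytes at all \x00\xff"):
        assert unframe(frame(text)) == text
    print("same rigid framing, arbitrary payloads")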
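Second, the neutrality of XML versus the interpretative commitments of a schema. Both schemas below are invented for illustration; neither is a real standard. The same work parses identically under either, but each taxonomy organizes it differently:

    # A sketch of the claim that XML itself is neutral while any
    # particular schema is an interpretive system. Both schemas are
    # hypothetical, made up for this example.
    import xml.etree.ElementTree as ET

    # Schema A: organizes the work by academic discipline.
    schema_a = """
    <catalog>
      <discipline name="economics">
        <work title="Capital" author="Karl Marx"/>
      </discipline>
    </catalog>
    """

    # Schema B: organizes the identical work by political category.
    schema_b = """
    <catalog>
      <ideology name="socialism">
        <work title="Capital" author="Karl Marx"/>
      </ideology>
    </catalog>
    """

    # Both parse identically as XML -- the standard does not care which
    # taxonomy was chosen. The interpretive commitment lives in the schema.
    for doc in (schema_a, schema_b):
        root = ET.fromstring(doc)
        work = root.find(".//work")
        print(root[0].tag, root[0].get("name"), "->", work.get("title"))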
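Third, associations as an emergent phenomenon. The corpus and the threshold here are invented; this is a cartoon of statistics-driven association, not a description of Google's actual algorithm:

    # A toy sketch of associations emerging from usage statistics rather
    # than from a predetermined semantic schema.
    from collections import Counter
    from itertools import combinations

    corpus = [
        "protocols standardize how bits move",
        "schemas standardize how ideas are organized",
        "bits move freely once protocols agree",
        "ideas resist schemas that organize them",
    ]

    # Count how often each pair of words appears in the same document.
    # No one decides in advance that "protocols" relates to "bits";
    # the association emerges from the texts themselves.
    pairs = Counter()
    for doc in corpus:
        words = set(doc.split())
        for a, b in combinations(sorted(words), 2):
            pairs[(a, b)] += 1

    # Report the pairs seen more than once: the "emergent" associations.
    for (a, b), n in pairs.most_common():
        if n > 1:
            print(f"{a} <-> {b} (seen together {n}x)")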