Michael Wojcik on Mon, 11 Aug 2008 20:05:49 +0200 (CEST)

[Date Prev] [Date Next] [Thread Prev] [Thread Next] [Date Index] [Thread Index]

Re: <nettime> [Augmentology] _A Warcry for Birthing Synthetic Worlds_

chad scov1lle wrote:

> One of the more poignant issues which came to mind when reviewing
> your text is that openness and non-regulation is probably the most
> critical facet to consider when engaging and making decisions concerning
> the architecture of any emerging technology or discourse.

Non-regulation? TCP/IP and its popular applications came to dominate
distributed data processing precisely because they were well-regulated,
through the IETF's RFC process. HTTP and HTML were standardized as they
began to spread beyond CERN.

TCP/IP became common in academia and research organizations because it was
regulated. Prior to that, internetworking was balkanized - first with
proprietary protocols developed by hardware vendors and researchers. Later
UUCP claimed a fair bit of ground for open file transfer, but line
protocols were still often proprietary (Microcom's MNP protocols, Telebit's
protocols, etc), and interactive use was a hodge-podge. ARPAnet and the
other TCP/IP internets were the first really successful open internets -
the IBM internal HONE network was larger than the TCP/IP Internet until the
mid-80s - and they succeeded precisely because they were well-defined and
well-regulated.

Just because the protocols weren't government-regulated doesn't mean they
weren't regulated at all. And key portions *were* regulated by the US
government. The authority for numbering and other key functions held by
entities like IANA and Network Solutions derived from MOUs from the
Commerce Department. NSFnet, a major portion of the Internet in the 1980s,
was government-owned, and its acceptable-use policy banned commercial
traffic outright. The lifting of that ban eventually enabled web commerce -
hugely important for popular Internet use - but the existence of that ban
in the early years (ie, the first several years after the switch from NCP
to IP) kept the backbone's bandwidth available for research use and
promoted the development of the popular TCP/IP applications, including HTTP
and HTML.

On consumer systems (ie, PCs in the generic sense), TCP/IP didn't catch on
until OS vendors (Microsoft and Apple) began bundling it. The earlier
options - "open" stacks and third-party proprietary implementations - were
never more than niche players. In other words, the Internet mass market
didn't exist until OS vendors pushed it to customers - which doesn't seem
entirely compatible with the mythology of "openness".

> One of the reasons why the www scaled so considerably well, is that it is
> incredibly trivial at the outset (< 1995) for an interested child to
> orient themselves with html and craft a page.

While that may be true, HTML was nothing special in that regard. When it
was invented, it was just another member of the large family of markup
languages that began (more or less) with CTSS RUNOFF in 1964, which led to
Unix roff on one branch and on another, through the work of Goldfarb,
Mosher, and Lorie at IBM, to IBM SCRIPT, GML, SGML, HTML, and XML.

HTML is no easier (indeed, probably harder) to use than SCRIPT or roff. And
while SCRIPT is proprietary, roff was available on most Unix systems - the
same machines that were likely to have TCP/IP access.
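To make the comparison concrete, here is the same small fragment marked up
both ways - purely illustrative snippets of my own, not taken from any real
document (the roff version uses the common man-macro requests and font
escapes):

```
<!-- HTML -->
<h1>A Heading</h1>
<p>A paragraph with one <em>emphasized</em> word.</p>
```

```
.\" roff, using common macros and font escapes
.SH "A Heading"
.PP
A paragraph with one \fIemphasized\fP word.
```

Neither is obviously friendlier to a newcomer; both demand memorized tag or
request names and unforgiving punctuation.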

For the first couple years of its existence, HTML wasn't particularly more
interesting than competitors like Archie, Gopher, and WAIS. Early text-mode
browsers didn't offer anything that wasn't available through Telnet-based
hypertext systems.

I suspect that HTML's popularity was mostly a matter of timing: the
convergence of the opening of NSFnet to commercial traffic, the push of
graphics hardware to the low end which made NCSA Mosaic's eye candy
available to a larger audience, and the growing PC market which made AOL -
a "mainstream", black-box, walled-garden BBS for non-technical users - a
viable commercial enterprise.

That's not to say that ease of use wasn't important. Certainly a great many
people found that they could create (or at least hack) HTML pages with
little or no training or reference to documentation, thanks in part to
generous observance of Postel's Interoperability Principle ("be liberal in
what you accept") in HTML user agents. That tolerance produced a great deal
of really awful HTML, but it also meant that a great many HTML hobbyists
helped popularize the web.
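The sort of markup that tolerance let through is easy to sketch - a
hypothetical fragment of my own, not taken from any real page:

```
<TITLE>My First Page
<H1>Welcome to my home page!
<P>Unclosed tags, no <HTML> or <BODY> wrapper, uppercase element
names - early browsers guessed at the author's intent and rendered
pages like this without complaint.
```

A strict SGML parser would reject this; lenient user agents made it work,
and so made amateur authorship cheap.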

Was that a more important factor than, say, search engines (which largely
enabled online research for non-specialists, and in the pre-Amazon days
largely enabled e-commerce as well, by making it possible for consumers to
find online sellers), or the big e-commerce aggregators like Amazon? Or
walled gardens, like AOL? All of those were also big reasons for the
not-technically-inclined to try the web out.

> Simple scripting languages make feasible power laws when it comes to
> content creation. More sophisticated environments are allowed to develop
> along an evolutionary map assuming that the primitive and initial
> conditions suffice for more complex ideas, tools, and implementations.

Composing complex applications from simple tools using scripting languages
long preceded HTML. It's the explicit design philosophy of the Unix shell
and utilities, from 1969. By the time HTML came along, it was a commonplace
of software design.
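The Unix version of that philosophy can be stated in one classic pipeline -
a minimal sketch using standard POSIX tools, not any particular historical
script:

```shell
# Find the most frequent word in a line of text by composing
# single-purpose tools: tr splits words onto lines, sort groups
# them, uniq -c counts each group, sort -rn ranks by count,
# and head keeps the winner.
printf 'the cat sat on the mat\n' |
  tr ' ' '\n' |
  sort |
  uniq -c |
  sort -rn |
  head -n 1
```

Each stage is a small program that knows nothing about the others; the
shell's pipe operator does the composition.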

And, of course, HTML offered nothing new there. It was simply an
application of SGML.

> HTML, with its intuitive design,

What's intuitive about HTML?

> gave birth to much more engaged ideas about media and how it should
> operate. Flash, XML, OWL, etc are all the resultant products which
> further enhance and richly textualize the experience of web activity.

Flash was an inevitable application of the scripting-language philosophy to
widely-available graphics technology. It was nothing new either. Ed Catmull
demonstrated an animation scripting language at SIGGRAPH 1972.

People have certainly done interesting things with semantic markup. RDF
and OWL, the Dublin Core and DITA - these are interesting and useful
applications of information technology. But I don't see that any of them
owe any great debt to HTML, except in its role as a popularizer (and thus
the impetus for founding the W3C). SGML is the core technology, and
semantic markup concepts have been around for a couple of decades. TEI was
founded in 1987. Relevant papers were being produced for the AQUILEX
project in the late '80s and early '90s. And so on.

I think what's really new is the proliferation of cheap commodity
processing power, communications, and machine availability. More cycles and
more bits are available to more people. Openness and ease of use are
important, but they were around long before HTML.

People make extensive use of difficult technologies if the rewards are
sufficient. Look at how many people drive cars - an awkward, fragile,
dangerous, often tiresome technology with (in many cases) high-quality
competition. Difficulty isn't the barrier to entry; access to resources is.

> Prosumer applications which allow for both the production and consumption
> of things have become successful not only because they equate the
> seemingly fundamental narcissistic tendencies of homo sapiens with
> utilization of the application, but also because it is highly trivial to
> do so.


> The deregulation of the content conceives the viral behavior.

That may be true (if I understand your meaning), but I don't think it's
obviously true. Telephone service achieved widespread penetration before it
was deregulated. What proof do we have that a walled garden or some other
single-supplier network couldn't have achieved similar penetration and
wouldn't permit similar "viral" dissemination of user-generated content?

Michael Wojcik
Micro Focus
Rhetoric & Writing, Michigan State University

#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mail.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org