Florian Cramer on Wed, 5 Jan 2005 19:55:46 +0100 (CET)



Re: <nettime> Bill Thompson: Dump the World Wide Web



Does this guy work for Microsoft? His proposals sound like they come right
from MS's Research & Development, including all the braindead,
security-flawed design.

Fortunately the Web is client-server and not like what Thompson proposes.
Otherwise we would have to shut it down, because it would have turned into
an unmanageable, non-open, insecure, giant distributed spyware application.
Tim Berners-Lee's greatest achievement, in my opinion, is that he took
fancy but not very well-engineered concepts from Nelson and proprietary
applications like HyperCard and re-implemented them with the virtues of
Unix, free software and open standards: a client-server model, an open
protocol (http) and an SGML-based file format, HTML (SGML being the
predecessor of XML), allowing operating system-agnostic communication
between servers and clients/browsers. For the mess that resulted
afterwards, only software companies like Netscape, Microsoft, Sun and
Macromedia, with their proprietary file formats and format/protocol
extensions, are to blame. There would be nothing wrong with the web today
if everyone followed the open W3C standards.
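
To make the point concrete: because http is an open, plain-text protocol,
any program on any operating system can speak it with a few lines of
socket code. A minimal sketch in Python (the hostname is just an example):

    import socket

    # open a TCP connection and speak the protocol in plain text
    sock = socket.create_connection(("example.org", 80))
    sock.sendall(b"GET / HTTP/1.0\r\nHost: example.org\r\n\r\n")
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk
    sock.close()
    # print just the response headers
    print(response.decode("iso-8859-1").split("\r\n\r\n", 1)[0])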

> The decision to make HTTP a
> "stateless" protocol has caused immense trouble. It's rather like being
> served by a waiter with short-term memory loss: you can only order one
> course at a time because he will have forgotten your name, never mind
> your dessert order, by the time you've had your first spoonful of
> gazpacho.

Fortunately, HTTP is stateless; otherwise privacy/spying problems would
be even worse than they are today.

> Cookies, small data files that are placed on a client computer by
> the server, provide a partial solution, rather like the tattoos sported by
> Guy Pearce in the film Memento, but they are inelegant, complicated and
> far from reliable.

In other words, he wants something even worse and less user-manageable
than cookies.
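
A minimal sketch of why statelessness keeps the user in control: the
server only "remembers" a client when the client decides to send the
cookie back (hostname and session cookie below are hypothetical):

    import http.client

    HOST = "shop.example.org"   # hypothetical server

    def get(path, headers=None):
        # one connection per request: every HTTP request stands alone
        conn = http.client.HTTPConnection(HOST)
        conn.request("GET", path, headers=headers or {})
        resp = conn.getresponse()
        resp.read()
        cookie = resp.getheader("Set-Cookie")
        conn.close()
        return resp.status, cookie

    status, cookie = get("/")    # server may issue e.g. "session=abc123"
    status, _ = get("/cart")     # no Cookie header: the server has no
                                 # memory of us at all
    status, _ = get("/cart", {"Cookie": cookie or ""})
                                 # state returns only because the client
                                 # chose to send the cookie back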

> It
> isn't as if we need to look far for an alternative -- we've had one
> since 1990 when the web was just starting to emerge from CERN physics lab.
> It's called "distributed processing" and it enables programs to talk
> to each other in a far richer, more complex and more useful way than the
> web's standards could ever support.

In other words: something like CORBA or .NET. Richer, more complex, more
error-prone and definitely less secure.
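
For readers who have not met this before: "distributed processing" means
remote procedure calls between programs, roughly like the sketch below
(the endpoint and method name are made up). Every such interface is
another ad-hoc contract between programs, and another attack surface:

    import xmlrpc.client

    # connect to a hypothetical RPC endpoint and call a remote procedure
    proxy = xmlrpc.client.ServerProxy("http://rpc.example.org/api")
    try:
        headlines = proxy.news.get_headlines(10)  # runs code on the server
        print(headlines)
    except xmlrpc.client.Fault as err:
        # every endpoint can fail in its own application-specific way
        print("remote fault:", err.faultString)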

> Instead we have to invent
> technologies which preserve the web approach while making it slightly more
> usable, like the eXtensible Markup Language, or XML.

He doesn't know what he's talking about. XML is a metalanguage, essentially
a version 2.0 of the SGML metalanguage from which HTML was created. It
merely ensures that HTML and other SGML/XML-based markup languages follow a
standard syntax model and can be processed with standard SGML/XML software
tools, without the need to write new software tools from scratch for every
new markup language.
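
A small sketch of what "metalanguage" means in practice (the recipe
vocabulary is invented for illustration): define any new XML-based format
and an off-the-shelf parser already understands its syntax:

    import xml.etree.ElementTree as ET

    # a made-up markup language -- yet any standard XML parser reads it
    doc = "<recipe><title>Gazpacho</title><step>Chill.</step></recipe>"
    root = ET.fromstring(doc)
    print(root.find("title").text)   # -> Gazpacho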

He writes about Microsoft:

> At the time their programmers were just beginning to explore the
> possibility of direct programme-to-programme communication and
> network-based collaboration between applications. Without the distraction
> of the web we may well have had widespread distributed online services
> five or even more years ago.

Yes, and ones that would of course only be usable from Windows computers,
because Windows is the only platform with a complete .NET runtime.

> These services would not rely on the Web browser as the single way of
> getting information from an online service, but would allow a wide range
> of different programs to work together over the network.

Such as the spyware and viruses that already plague computers. When will
people, especially Microsoft developers, learn that it is not a good idea
to execute arbitrary code from the Internet on your computer? Well,
Microsoft has an answer for that, in their typical approach of fixing one
design mistake with another layer of code bloat and Big Brother
technology: their answer is called "Trusted Computing".

> A news site could deliver text, images, audio and even video through a
> program designed for the purpose, instead of having to use a
> general-purpose browser,

...thus preventing users from choosing their own browsers, image viewers,
and audio and video players, and forcing them to use the custom
application, including all its spyware and advertising banners.

> or a shopping site could build its own shopping
> cart and checkout that did not rely on Web protocols.

So that the user has no control over what the shopping site is doing, and
every braindead Internet shop programmer in the world could implement his
own flawed, insecure shopping protocol. Thanks to sound web standards, we
instead have http, and https for encryption, as proven standard solutions.
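
That is the whole point of a shared standard: a client gets vetted
encryption and certificate checking for free, instead of trusting each
shop's home-grown protocol. A minimal sketch (the URL is an example):

    import urllib.request

    # certificate verification and encryption are handled by the standard
    # library before a single byte of the body arrives
    with urllib.request.urlopen("https://example.org/") as resp:
        print(resp.status)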

> And we would have no
> need for Google, because information services would advertise their
> contents instead of having to be searched by inefficient 'spiders'.

So that every porn, spam and spyware site could spam the Internet and
advertise whatever content it wants, with no mediation by Google's search
algorithms and page ranking.


Geert, what interests me is why you posted this article. It seems to me 
that you weren't necessarily thrilled by Thompson's technological vision, 
but more by the apocalyptic rhetoric of doing away with the web, right?

-F

-- 
http://userpage.fu-berlin.de/~cantsin/


#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@bbs.thing.net and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: nettime@bbs.thing.net