www.nettime.org
Nettime mailing list archives

<nettime> Ippolita Collective: The Dark Side of Google Chapter 7 (part 1)
Patrice Riemens on Sat, 11 Apr 2009 12:08:14 +0200 (CEST)






As expected, Chapter 7 is a relief after Chapter 6. I am happy that one
of the authors wrote to me agreeing about the 'difficulties' of the
previous chapter.

Some interruption is to be expected again, due to the usual cause: a longish
sardine-can transit on a crack long-distance service (Sampark Kranti
Express) of the Indian Railways. Lalu Jai ki Ho!

Enjoy the Full Moon!
patrizio & Diiiinooos!

..................................................................

NB this book and translation are published under Creative Commons
license 2.0 (Attribution, Non Commercial, Share Alike).
Commercial distribution requires the authorisation of the copyright
holders: Ippolita Collective and Feltrinelli Editore, Milano (.it)


Ippolita Collective

The Dark Side of Google (continued)


Chapter 7 Technocracy (part 1)

Analysis of the Google phenomenon reveals a colorful landscape, in which
the economy of search is but one element within a far larger and more
complex mosaic. Eric Schmidt himself states that Mountain View is laying
the foundations of a global information technology enterprise, a "One
Hundred Billion Dollars Business" - obviously something more than a
mere search engine firm.

What it actually is, is an invasive knowledge management system, whose
most significant developments and methods we have sketched {in the previous
chapters}: strategies that pair aggressive marketing with smart image
building; propagation of highly configurable {and personalisable}, yet
always recognizable interfaces; creation of standard contents 'outsourced'
to users and developers; adoption of development methods straight out of the
co-operative Free and Open Source Software handbook; use of state-of-the-art
data capture and archival systems; information retrieval systems associated
with {advanced} profiling techniques, both implicit and explicit; and
{last but not least, sophisticated} personalisation of advertisements.


Technocracy or the experts of science

Experts have found in the control and manipulation of technology the ideal
tool to maintain their power, impose their personal interests {upon
society}, or acquire more privileges. The mechanism is {absurdly} simple:
technology is (re)presented not only as the guarantor of objectivity in
scientific research; it is also used to validate the decisions of
politicians in power, or more generally, those of any 'authority' that has
access to the technological oracle.

The application of scientific research in its technological form is
excessive, yet reality is constantly being interpreted according to that
paradigm. The curiosity and desire for knowledge that inspire scientific
research are being hampered by the ill-informed profitability criteria
which are the hallmark of {contemporary} private and public funding. If
research does not come up with technological artifacts that generate
immediate profits, it is deemed uninteresting. The discourse of power then
becomes technocratic, {completely} at the opposite end of
community-oriented sharing, of self-management, of dialogue and mediation
between individuals. To sell Google's technocracy as if it were a tool for
direct democracy is a charade, meant to make us believe that we participate
in some sort of grand electronic democracy game, when it is completely
devoid of substance. Sure, we may publish what we want on the Internet, and
Google shall index us. But we, who are 'dilettantes' and 'heretics', are
not allowed to mention that Google's accumulation strategy resonates
remarkably well with the market economy /system/, which is based on
endless growth. This makes sense, since we are neither alumni of the
London School of Economics, nor successful entrepreneurs, nor {certified}
experts. Hence we have no 'authority' whatsoever. Yet sound common sense
and Orwellian memories are more than enough to realise that such growth,
without end or aim, is the manifestation of a will to technological power
that considers {human} individuals only as potential consumers {and
nothing else}.

That is why PageRank[TM], which, as we have seen, is not merely an
algorithm, becomes the cultural prism through which Google intends us to
analyse everything. In a certain sense, what we witness is an enforced
extension of the peer review system - which works all right within the
academic system - to the whole gamut of human knowledge.
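The mechanics behind this 'cultural prism' can be illustrated with a toy
version of the ranking idea. The sketch below is a simplified power
iteration in Python; the damping value, the tiny graph and its page names
are illustrative assumptions for this text, not Google's actual
implementation, which operates at an entirely different scale and with
many additional signals.

```python
# Minimal sketch of the PageRank idea: each page's rank is repeatedly
# redistributed along its outgoing links until the values stabilise.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A three-page toy web: 'a' links to 'b' and 'c', 'b' to 'c', 'c' to 'a'.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

In this toy graph, 'c' ends up ranked above 'b' simply because more rank
flows into it - which is precisely the point made above: authority is
computed from the structure of links, not from the content of the pages.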

Traditional authorities, like religious or political institutions, have
hit rock bottom as far as their credibility is concerned. Nonetheless,
their loss of grip on reality, far from having favoured the blossoming
of autonomous spaces, has led to an unreal situation where no assertion can
be held to be true unless validated by some sort of technological authority.
The authority of machines [computers?], in most cases, is only a query
return from a database, dished out by the high priests of technology
and {assorted} experts to the wealthy class of 'prosumers'. An extreme
form of relativism is the hallmark of methods which pretend to extract
'the truth' out of the available and {allegedly} boundlessly numerous
data, as one can surmise from the number of algorithms and filters that
have been used [?]. The true meaning of 'to any search an appropriate
answer' is {actually}: 'a personalised product for every consumer'.

Confronted with this enclosure of the creation, management and
application of knowledge at our expense, there appear to remain only two
options: refusal of scientific culture, considered as the root of all evil;
or, on the contrary, blind and enthusiastic acceptance of every
'innovation' brought forth by technology. However, between these two
extremes, techno-hate and techno-craze, it should be possible to advance
the curiosity associated with the hacker ethic: the sharing of knowledge,
the critical attitude towards 'truths', the rigorous verification of
sources, all while pursuing the path of open knowledge {and the free
circulation of information}.

Education is then a fundamental issue in this context, but the means to
disseminate scientific knowledge on a large scale are {simply} not there.
Educational structures in Europe as well as in North America are only
geared towards the production of specialists. As of today, no pedagogic
model exists that would correspond to a demand for a 'dilettante' kind of
scientific {approach to} knowledge, not even in countries with a
non-western tradition like Brazil or India, which are nonetheless
producing high-level scientific research and state-of-the-art technology
at low cost 'thanks' to unremitting {international} competition. A
scientific activity that would be neither academic nor entrepreneurial,
but decentralised and of a DIY kind, is nowhere on the agenda, despite the
fact that it is indispensable to foster the basic competences and the
ability to evaluate the technological innovations which concern all of us.
More specifically, the {whole} notion of 'scientific culture' would need
to be appraised afresh to cater for the all-round need to have an
elementary command of what is needed to confront the technological tsunami
that engulfs us.

The rise of information technology to the status of main mover of
technological innovation makes new scenarios possible: IT is not merely a
technique to automatise the management of information; it also has a logic
of its own, meaning that it constantly strives to alter its own
underpinnings. IT is at once material, theoretical and experimental.
IT applies itself to the formalisation of language (and hence contributes
to the formalisation of knowledge), puts that to work with the physical
components of electronics, and develops from there languages which in
their turn influence theories of knowledge. IT functions as a loop of
sorts, following a very particular cyclical process.

In the classic sciences one observes stable phenomena: the science of
physics, for instance, constructs natural data and creates relevant
theories. But with IT {and its derivate, computer science}, the phenomena
theory helps identify are {wholly} artificial; they continuously change,
both in nature and conceptually, at the same time and in the same measure
as theoretical and experimental advances make them more refined: software
developed on a computer ten years ago will be structurally different from
software developed last month. What we held for true yesterday, we know
today we will not hold for true tomorrow, when we will have more powerful
machines that will do novel things: this is a living world as it were, and
hence in a constant state of becoming.

(to be continued)


--------------------------
Translated by Patrice Riemens
This translation project is supported and facilitated by:

The Center for Internet and Society, Bangalore
(http://cis-india.org)
The Tactical Technology Collective, Bangalore Office
(http://www.tacticaltech.org)
Visthar, Dodda Gubbi post, Kothanyur-Bangalore (till March 31st, 2009)
(http://www.visthar.org)
The Meyberg-Acosta Household, Pune (from April 2, 2009)






#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mail.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime {AT} kein.org