Patrice Riemens on Mon, 26 May 2014 15:53:16 +0200 (CEST)





Ippolita Collective, In the Facebook Aquarium, Part Two
The Libertarian World Domination Project: Hacking, Social Network(s),
Activism and Institutional Politics

Social networks as seen through the anarcho-capitalist lens - or the
management of sociality through Big Data. (section 4, continued)


Profiling marks the pledge of automatic, instant freedom: contextualised
ads, research into users' sentiments so that everyone gets a personalised,
'bespoke' ad that promptly converts into a click-through sale, followed by
the disposal of the purchase as soon as possible in order to ... order a
new one. In all this we, the users, are 'suspects' of sorts, whose most
intimate details must be known so as to satisfy our compulsive craving for
ever new, yet instantly obsolete objects. The fake issue of
confidentiality is also regularly bandied about, but the issue itself
seems only to arise for real once confidentiality has been breached. This
usually goes together with shocked, shocked rants about the shameful and
pushy immorality of a system that divides people into categories. And
high-octane paranoid (conspiracy) theories are rife in the era of Big
Data. But the real, much more concrete, and also much more worrying issue
concerns the individuals themselves, not the amorphous mass we are all
part of. On the one hand, people want to be profiled; on the other,
whatever we do to avoid profiling, our digital footprint sticks to us like
glue: there is no way to opt out once we are enlisted in the army of
data-feeding data-suppliers, aka /prosumers/ (producers and consumers all
at once).

For some time now, a massive debate has been raging about the gross abuse
of the /data mining/ that takes place to make profiling possible [31]. New
lines of digital discrimination are being created, related to the degree
of access: which researchers, which institutions, which groups actually
have both the means and the opportunity to put these data to use? What are
the rules, what are the limits - and: who decides? This is not the place
to go into all these issues in detail [32]. Let's stick to our main point.
This is not about going against progress (and its promise of a brilliant
future), nor about escaping into Luddism, or into its exact opposite:
crypto(graphy). Hiding serves no purpose, and neither does the refusal to
make any concession to the present (order of things). What should be done
is to get a clear understanding of Big Data and profiling, in so far as
they are part and parcel of concrete strategies to arrive at a society
shaped by anarcho-capitalism, the ideology according to which everyone is
'free' to ransack everybody else. We do not dig this 'utopia', for sure.
We would rather call it a dystopia of control and self-control. Yet we
are, bit by bit, and very swiftly nonetheless, going from a world endowed
with signification, rich with the relationships we develop for our own
benefit, to one that finds significance only through relationships
(pre-)determined by machines.

It looks as if we no longer need either theories or practices grounded in
personal belief and validated by personal experience. The status of
knowledge is thereby transformed, now that data are supposed to speak for
themselves. Knowledge suddenly becomes self-evident and imposes itself as
a certainty. Statistical correlations determine the existing links between
things and direct the relationships between people. We no longer shape a
discourse: the data have taken over the floor. This is the pipe dream of a
society ruled by data, where the role of the human subject has been
blanked out for all practical purposes. All the human remnant needs to do
is obediently accept being 'freed' (of everything), including of the
possibility to choose and to desire. Give us ever more powerful machines,
hand over all your data, be transparent with the machines, and we will be
able to foretell the (radiant) future - the future of the market, of
course.

We fly above the world, we observe it from the outside, we see oceans of
data expanding at a vertiginous rate, only to be swamped by tsunamis of
social crazes, as sudden as they are fleeting, occupying all available
space before making room for the next upstart. Mass sentiment can be
analyzed, and the aggregate opinion is easy to distill by way of
/sentiment analysis/ and /opinion mining/ [33]. All this while we,
enthusiastic and willing victims, love to be 'free' consumers:
generalised, global recording is the price to be paid if we want to be
truly 'free' to choose. The algorithm will tell us what we really want: it
already advises us on which book to buy on Amazon.com, it edits our
searches on Google, it suggests which just-released film we should see,
and it tells us which music best suits our taste. It is an algorithm that
points out our potential friends on Google+ and on LinkedIn, and also the
accounts we might want to follow on Twitter. Algorithms are paying
attention in our stead, and encourage us to socialise the right way.
Before long it will no longer be necessary to desire anything at all,
since the algorithm will see and foresee for us.
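
As a purely illustrative aside, the crudest form of the /sentiment
analysis/ mentioned above can be sketched in a few lines of Python. The
word lists and posts below are invented for the example; the real systems
referred to in note [33] rely on far richer semantic models and machine
learning.

    # Minimal, purely illustrative lexicon-based sentiment scorer.
    # Word lists and example posts are invented for this sketch.
    POSITIVE = {"love", "great", "free", "happy", "best"}
    NEGATIVE = {"hate", "bad", "obsolete", "paranoid", "worst"}

    def sentiment_score(text):
        """Crude score in [-1, 1]: positive minus negative word hits."""
        words = [w.strip(".,!?;:").lower() for w in text.split()]
        hits = [1 if w in POSITIVE else -1
                for w in words if w in POSITIVE or w in NEGATIVE]
        return sum(hits) / len(hits) if hits else 0.0

    posts = ["I love this new phone, best purchase ever!",
             "Yet another obsolete gadget, I hate it."]
    # The 'aggregate opinion' is distilled as the mean score over all posts.
    print(sum(sentiment_score(p) for p in posts) / len(posts))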

This amounts to seeing with 'the eye of God', able to read the future in
the crystal ball in which the information deluge rages. Open your heart,
let your body be cut and formatted into usable chunks, speak your mind,
tell us where you are (now), what you are doing (there) and who you are
with. Don't think, say it all, now, and you will obtain all you desire,
even without yet knowing what you actually desire. Inexpressible vertigo
(in the literal sense of what 'cannot be expressed'), infantile enthusiasm
(in the original sense of 'infans': the one who does not yet talk),
mystical ecstasy in front of the Matrix uncoiling before our very eyes.
The words and pictures referring to Big Data often take on a religious
tone, and a bit too often for it to be mere coincidence.

The fetish dangling behind the knowledge society of Big Data belongs to a
populist, techno-fascist religiosity, since once a sufficient quantity of
data has been gathered, any hypothesis can be proven. Just as with the
Bible, the Qur'an, the Torah, or any sacred book, the scope for
interpretation is endless. And it is precisely because Big Data embrace
incomparably more material that each and every assumption can be upheld
and proven. Statistics vouch for everything but prove nothing: they are
allegedly scientific proofs of highly ideological presuppositions.
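
To make that argument a little more tangible, the following small Python
sketch (not from the book, with arbitrarily chosen numbers) shows how,
with a handful of observations and thousands of candidate variables, pure
random noise will almost always yield at least one apparently 'strong'
correlation - which is one concrete sense in which a big enough dataset
can 'prove' any hypothesis.

    # Illustrative only: with many candidate variables and few observations,
    # noise alone produces seemingly strong correlations (multiple comparisons).
    import random

    random.seed(1)
    n_samples, n_variables = 30, 2000
    target = [random.gauss(0, 1) for _ in range(n_samples)]

    def pearson(x, y):
        mx, my = sum(x) / len(x), sum(y) / len(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    best = max(abs(pearson(target,
                           [random.gauss(0, 1) for _ in range(n_samples)]))
               for _ in range(n_variables))
    print(best)  # typically around 0.6-0.8, although the data are pure noise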

(to be continued)
Next time: real and digital life explained ...




--------------

[31] Dino Pedreschi et al., "Big data mining, fairness and privacy. A
vision statement towards an interdisciplinary roadmap of research",
/Privacy Observatory/, 2011
http://www.kdnuggets.com/2011/10/big-data-mining-fairness-privacy.html
No wonder mainstream publications like /The Economist/ call for more
transparency for the sake of security, thereby confusing security with the
extension of control and surveillance - perhaps deliberately so.
"The data deluge: Businesses, governments and society are only starting to
tap its vast potential"
http://www.economist.com/node/15579717
[32] For a good approach to this issue see danah boyd & Kate Crawford,
"Six Provocations for Big Data", /A Decade in Internet Time: Symposium on
the Dynamics of the Internet and Society/, September 2011
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431
[33] The latest device for the automated analysis of sentiments and
opinions by way of semantic systems (at the time of publication of the
original book in Italian -transl) was Sentic Computing:
http://sentic.net ("Helping machines to learn, leverage, love.") Wikipedia
entry on Sentiment Analysis:
http://en.wikipedia.org/wiki/Sentiment_analysis (for recent developments
and a lot of sources)


-----------------------------
Translated by Patrice Riemens
This translation project is supported and facilitated by:
The Institute of Network Cultures, Amsterdam University of Applied Sciences
(http://networkcultures.org/wpmu/portal/)
The Antenna Foundation, Nijmegen
(http://www.antenna.nl - Dutch site)
(http://www.antenna.nl/indexeng.html - English site under construction)
Casa Nostra, Vogogna-Ossola, Italy


#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org