Dmytri Kleiner on Tue, 12 Jun 2012 22:33:01 +0200 (CEST)


<nettime> Responsibilization & Collective Social Aspirations

In her informative and entertaining talks, Seda Gürses {1} often refers to the process of "responsibilization." Speaking about privacy, she argues that new, privately operated online platforms are transferring the responsibility of providing safety and privacy onto the users of these platforms, where users would previously have expected operators and regulators to bear this responsibility. Responsibilization is the transfer of responsibility from being something social and shared to being held solely by individuals.

Looking just at communications media: privacy on the postal system was legislated, and operators and users were held responsible for complying with socially imposed standards.

Yet on modern online platforms, despite their increasing importance, users are for the most part governed by the site operators' user agreements, and the responsibility of understanding a site's privacy and safety implications, including often complex settings and options, is held by the user alone.

This raises a very important question.

Do we have a collective right to social aspirations? Do we have a collective right to work towards socially determining social outcomes?

Should people simply get the privacy and safety they deserve as determined by their own behaviour? Or do we want a society where people can expect that their privacy and safety is something that can be socially determined?

What's worse is that there is a significant moral hazard at work. Even when legislation, regulation, or simply user outcry seeks to improve the privacy policies of a platform, the site has a significant incentive to resist, foot-drag, or outright ignore such expectations. The business models of most operators are based on monetizing user interaction and user data, and therefore, whatever the regulatory environment or user desire, so long as it's up to the operators to implement privacy, we're leaving the fox in charge of the hen house, as the saying goes.

Can we realistically expect private platforms to enthusiastically place social concerns above their shareholders' profit interests?

And even if strong regulations, vigilant user advocacy groups, or some other incentives can manage to keep the fox from making dinner of the hens, is this watchdog state of affairs, with social watchdogs watching profit watchdogs and fighting over every decision, really the best way to manage communications platforms?

Ours is often called the communications age, and the development of global digital communications networks is often cited as one of the most important developments in human history, comparable to electricity, or even agriculture.

Yet the commercial Internet was born in a neoliberal era, an era in which we are no longer allowed to have collective social aspirations, no longer allowed to want social outcomes, or even to work towards them. We are allowed only to accept outcomes as facts, to believe that outcomes are determined by some sort of exogenous logic, be it the market, the economy, politics, or nature itself. And this often comes hand in hand with blaming the victims for their misfortune: if they were faster, stronger, smarter, or even luckier, they would have done better with the whole privacy and safety thing, or, further, the whole wealth and power thing.

Yet being white, male, and rich is often more significant than being fast, strong, smart, or talented. By accepting a world where we get what we deserve, where outcomes are the result of individual merit, we embrace a delusion, a make-believe land where power, privilege, and wealth do not exist.

This delusion is killing the Internet.

Too often, solutions to the social issues of communications are presented as being individual rather than collective. Too often, privacy concerns about online platforms are met with "Well, just don't use Facebook if you don't like it," or, even more unrealistically, "You should make something better then. If there were a market demand for something better, somebody would have made it, so obviously people don't care about your issues."

The fact is that "making something better" requires investment, requires wealth, which means, in most cases, capital. Yet capitalists must invest in ways that capture profit, which brings up the moral hazard again. Don't trust this fox with your hens? Find a different fox!

More and more, it feels like the biggest challenge of our age is the challenge of convincing people that we have a right to work collectively towards our social aspirations, that we can and must work together to achieve social outcomes. Not only for privacy and safety online, but to create the kind of society we want more broadly.

A society where wealth, power, and responsibility are more broadly shared.


Dmytri Kleiner
Venture Communist

#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info:
#  archive: contact: