Nettime mailing list archives

Re: <nettime> How computers broke science...
dan on Sat, 14 Nov 2015 05:30:19 +0100 (CET)


 | TLDR:  point-and-click and closed-source software makes science hard to
 | reproduce.

Or consider big data and deep learning.  Even if Moore's Law remains
forever valid, there will never be enough computing, so data-driven
algorithms must favor efficiency above all else.  Yet the more
efficient the algorithm, the less interrogatable it is,[MO] that is
to say, the more optimized the algorithm, the harder it is to know
what it is really doing.  That was the exact theme of a workshop
held by Morgan Stanley and the Santa Fe Institute last fall, titled
"Are Optimality and Efficiency the Enemies of Robustness and
Resilience?"[SFI]  (Every speaker's answer was some form of "yes.")

And there is a feedback loop here: The more desirable some particular
automation is judged to be, the more data it is given.  The more
data it is given, the more its data utilization efficiency matters.
The more its data utilization efficiency matters, the more its
algorithms will evolve to opaque operation.  Above some threshold
of dependence on such an algorithm in practice, there can be no
going back.  As such, preserving algorithm interrogatability despite
efficiency-seeking, self-driven evolution is the research-grade
problem that is now on the table.  If science does not pick this
up, then Larry Lessig's characterization of code as law[LL] is
fulfilled.  A couple of law professors have seized on that very
idea and suggested that price-fixing collusion amongst robot traders
will be harder to detect than collusion amongst people[WRC].  An
anti-trust DoJ litigator I know agreed with their suggestion.
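
The feedback loop described above can be caricatured in a few lines
of Python.  This is a toy sketch only: every quantity, name, and
update rule below is invented for illustration, none of it comes
from the post or from the cited workshop.

```python
# Caricature of the loop: judged desirability -> more data ->
# data utilization efficiency matters more -> the algorithm
# evolves toward opaque operation.  All update rules are invented.

def run_loop(steps, desirability=1.0):
    """Return the opacity trajectory of the hypothesized loop."""
    data = 1.0
    opacity = 0.0
    trajectory = []
    for _ in range(steps):
        data *= 1.0 + 0.1 * desirability   # more data is given
        pressure = data ** 0.5             # utilization efficiency matters more
        opacity += 0.01 * pressure         # drift toward opaque operation
        trajectory.append(opacity)
    return trajectory

trajectory = run_loop(50)
```

The only point of the caricature is that every term in the loop is
monotone: nothing in the loop itself ever pushes opacity back down,
which is why, past some threshold of dependence, there is no going
back.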

--dan


[MO] Michael Osborne, Oxford University, personal communication,
2015

[SFI] "Optimality vs. Fragility: Are Optimality and Efficiency the
Enemies of Robustness and Resilience?", 2014
www.santafe.edu/gevent/detail/business-network/1665

[LL] Lawrence Lessig, _Code: And Other Laws of Cyberspace_, Basic
Books, 2000

[WRC] "When Robots Collude" (algorithms can learn to do so), 2015
uk.businessinsider.com/robots-colluding-to-manipulate-markets-2015-4


#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime {AT} kein.org