brian carroll on Thu, 1 May 2014 05:40:37 +0200 (CEST)


Re: <nettime> Shoshana Zuboff: Dark Google


Interesting read; writer Shoshana Zuboff has an impressively coherent
view. Prior to delving into the source article, I was puzzled by the
quotes below enigmatically referencing [logic] over and over, which
was the original purpose of my post: to ask a question about, or
critique, a condition of language that may establish a boundary or
barrier to the communication of ideas likely required for developing
'shared views'. In reading the article further, though, the clarity of
argument and perspective provided more depth and context to evaluate
and assimilate the observations - and what became clearer was the
condition of boundedness in thinking, relating, and communicating,
which the author themselves acknowledges.

For instance, a default situation exists (in a non-empirical framework
for mass communications) in which a majority of the details and
argument structure is missing from any given viewpoint, so that making
observations depends on hidden assumptions that cannot be accounted
for within the local text itself: the specific extended meaning of a
word, or of words combined (into sentences, paragraphs) as structures,
frameworks, that may be sturdy and rest on solid foundations, or be
shaky and prone to collapse given any tweak or critique this way or
that. Communication today in the sense of 'ideas' exists largely in a
realm of the ungrounded, never fully error-checked or corrected,
because such checking is not even possible within the language tools
being used; 'logic' references this capacity, yet functions as a
symbolic entity or mysterious force, seemingly signifying that bedrock
truth exists underneath it all, and that everything is rationalized
and understood to the greatest depth.

And so a condition exists, in writing a viewpoint or sharing ideas,
whereby these can exist in a bounded space that allows local
observations minus total accountability of reasoning structures;
though it is proposed that these variances, inaccuracies, and limited
views or observations are and must be accounted for or grounded via
other observers, who relate or short-circuit in the context of another
viewpoint, finding their place inside, outside, or betwixt-and-between
it - thus the paradox of interpretation. How does the writer write and
the reader read a text; what is being said and what is communicated?
And oftentimes it is presumed that a -gap- exists between the
representation and the reality, which is a condition of language.

This occurs at the level of language, say written language, where a
'word' references something ('word'=word), and strings of such words
are created or knitted or coded/programmed together into sequences
that establish some meaning: a given interpretation based on a
structural framework, resting on certain shared or unshared
assumptions, to some given degree. (And I think the text seems to
indicate that what was once relied upon for this may no longer be in
use, and yet a new approach has not yet been established.)

I think it is evident that [logic] is a particular boundary condition
for interpretation that has perhaps functioned more closely with
~reasoning in the past, though never to the degree needed to surpass
or resolve the problems of language. Maybe the failures of
Enlightenment thinking, and the trap of discourse through more and
more languaging without greater clarity or cohesion, give a clue that
something is amiss along these historic, ideological lines: thinking
and beliefs become dogmatic, rigid, incapable of addressing errors -
perhaps most notably in the ability of science to edit out unwelcome
dimensions for a purified viewpoint.

And the thing is, language allows this: the existing forms of written
and oral communication as they relate to the establishing of rules and
laws and methodologies and practices and pedagogy. The linearization
and serialization of viewpoints never integrates into a common view
that reaches beyond the limit of existing language constructs, the
tools not allowing the depth that is required for logic to be mediated
within its own framework.

This could be considered in the context of the analytic tools of
computing, particularly surveillance, whereby the deconstructing of
these variances, nuances, variables then distills greater meaning by
how things connect at other integrated levels: a hidden empiricism of
the state running in the background while individuals are atomized,
particularized, by unique demographics that align only in minimal, not
maximal, ways. And this can be a result of ungrounded language, yes,
though more so of ungrounded thinking, awareness, observation - a loss
of connection with truth that is not already warped, muddied, vague,
or itself untrue.
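
To make that atomization concrete: a crude sketch in Python, with
invented profiles (nothing here is from the article), of individuals
as attribute sets that overlap only minimally with one another, while
the analytic side integrates the whole:

  # sketch: individuals as demographic attribute sets that align
  # only minimally pairwise, yet integrate fully for the analyst.
  # the profiles are invented placeholders.
  profiles = {
      "a": {"city:berlin", "reads:nettime", "age:30s"},
      "b": {"city:berlin", "reads:wired", "age:40s"},
      "c": {"city:nyc", "reads:nettime", "age:40s"},
  }

  # pairwise overlap between individuals stays minimal...
  for x in profiles:
      for y in profiles:
          if x < y:
              print(x, y, profiles[x] & profiles[y])

  # ...while the integrated view spans every attribute of everyone
  union = set().union(*profiles.values())
  print(len(union), "attributes held at the integrated level")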

This fallen conceptualization is what is taught, shared, and related
to and through, as ideas and beliefs, and can likewise be manipulated
to shape and form behaviors. Thus if a boundary exists between what is
mediated and what is, there can be a separation between the truth of
the condition and how it is understood and communicated about. And the
point here is that written language in its existing approach defaults
to this disconnect, particularly in regard to the absence of logic in
the frameworks, to an extreme degree. Instead logic functions as an
assumption, not much more than at the level of opinion, at a most
general distance from the specificities of words or concepts or
arguments. It is vagueness; and in this gap or remove from maximal
empiric accounting for the truth of ideas, viewpoints form and are
shared that can move around and make observations that may touch on
matters here and there yet be entirely contradictory at the structural
level - which would be exposed if the same concepts were grounded in a
shared framework, with all errors removed or disallowed beyond the
status of hypothesis.

An impossible task in the present, yet absolutely necessary if
communication is to be grounded in a context of reality, and if this
common condition is to be based on truth, not roving opinions.

What computer analytics allow is integration via various likely
logical processes, whether data mining of statistics, sensor network
diagnostics, or linguistic parsing of texts to deduce meaning. And so
this typing of text, whether emails or books or essays, is not only an
issue of interpretation within a human framework, where it can be
evaluated and 'reasoned with' in a fuzzy approximation and accounting
of assumptions and their validity and accuracy, as these relate to the
larger empiric model carried within decentralized minds and
consciousness - accounting for errors, correcting them via
interpretation, hermeneutics, 'translation' from variability into
local models of reasoning, or into others that span beyond the limited
or bounded words and concepts into their connected frameworks,
conceptual scaffolding, the circuitry that then finds grounding in
some truth, to greater or lesser degree, though always finite: bounded
in its modeling as language, by what can be written, read, thought,
exchanged.

For computers deconstructing 'language' and interpreting it via
related frameworks, this depth of consideration can be established;
meaning or presumptions can be extracted and related empirically to
all other texts and contexts, insofar as they can feasibly be
coherently related. Tools exist to break language and communications
down into constituent parts, to test various hypotheses, to falsify or
weave together views, to weigh probabilities, to condense and
transform many different collections into a common model. And
'meaning' in language and communication requires this, to get at its
actual meaning; it is necessary because words and concepts are so
ambiguous. And so delving layers further into analytics relies further
upon 'words' and concepts and, again, logic - for how this is
approached, in what ways, for what aims. And given the approach, this
could tend towards absolute ideology in an ungrounded context,
validating tyranny, and-or towards a grey area of interpretation that
spans human to inhuman agendas behind-the-scenes, as this hidden
empiric processing takes place.
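
As a crude sketch only (in Python, with every function name and the
toy corpus invented here - nothing like actual analytic systems in
scale or depth), this is the kind of deconstruction meant: breaking
texts into constituent words, condensing a collection into a common
model, and weighing a new text against it:

  # crude sketch: deconstruct texts into parts, condense a
  # collection into one common frequency model, and weigh a new
  # text against it. all names and the corpus are invented.
  from collections import Counter
  import math
  import re

  def tokenize(text):
      """break a text down into lowercase word tokens."""
      return re.findall(r"[a-z']+", text.lower())

  def common_model(texts):
      """condense many texts into one shared word-probability model."""
      counts = Counter()
      for t in texts:
          counts.update(tokenize(t))
      total = sum(counts.values())
      return {w: c / total for w, c in counts.items()}

  def surprise(text, model, floor=1e-6):
      """average negative log-probability of a text under the model:
      higher means further from the 'common view'."""
      tokens = tokenize(text)
      if not tokens:
          return 0.0
      return -sum(math.log(model.get(w, floor))
                  for w in tokens) / len(tokens)

  corpus = ["the open web was a public sphere",
            "the firms developed a new commercial logic"]
  model = common_model(corpus)
  print(surprise("hidden surveillance of the public web", model))

Even this toy weighs every word against the whole collection, which is
the asymmetry at issue: the analytic side can do this at scale, while
the writing side cannot.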

And yet out front, on the outside of that public-private state
analytic regime, when people are writing or sharing viewpoints, it is
at the most superficial layer of known language dynamics, and tools do
not exist to go beyond this bounded condition to communicate ideas.
That is, to break the linearity, to ground with truth, beyond or
outside the problems of strings of [word] variables that function as
if prime-number truths - by default of simply writing out a sequence
of words, then judging validity on an overall impression, in that
general vagueness, allowing for ambiguity and inaccuracies;
communicating a single point or observation while using a vast many
words and perhaps errors, to make some minor view known, in this way
always and ever recompiling the inaccurate source to render some new
or unique or unstated viewpoint, which then vanishes likewise until
another view is established, etc(etc).

In this way, at least for some, it may be clear that problems with
'the Internet' at large, though also its various details such as
concepts of [openness], are not actually representative of the
situation, its reality, and are instead a function of language that is
amiss: observations are not taking place within a common robust
foundational structure and instead rely on 'theoretic'
conceptualizations. If the Internet actually is 'free' or 'open', in
what sense is this not the case with 'freedom of ideas' or 'freedom of
debate' or 'freedom of discussion'? Are these not separable from a
technical assumption based on the rights of consumers to have control
over their place in corporate development schemes - and why is this
not accounted for as a baseline (more in line with democracy,
constitutions, individual rights)? This is not an issue of
them-versus-us between people and corporations or governments, it
would seem. It also involves an issue of where people exist, what they
observe, how they think and relate, and what means exist to
communicate what they know.
And prior to Google, prior to the Internet, there has been a
boundedness that has likely caused more grief and misery and hardship
than all plagues and wars combined: inaccurate language, incapable of
allowing what exists to be mediated within its conceptualization. A
most basic issue of literacy is being able to evaluate its own
capability and realize: these words do not function at the level
required to represent the reality that exists. And this is an issue of
logic at its very core, where truth exists and structures circuitry
outward, which can then be tapped into and-or referenced. Thus
consider a condition of language in which truth has been severed from
the very act of communication; where it has become irrelevant to
account for actual truth, and instead one functions at the level of
opinion, amongst peoples, while machinery churns away at a deeper
condition and connection, not necessarily aligned with human values or
principles, and makes decisions on the global behalf.

The analytics are able to tear it apart, break it down, and analyze it
in a way humans may do via interpretative engagement with ideas; yet
people who are not themselves grounded are unable to relate within the
shared empiric framework necessary to counter this condition within
language and ideas themselves - beyond local views that are partial
and rely upon errors as a basis for saying anything, which then
function in a communicative bubble, a suspension of disbelief,
allowing access to whatever truths may exist yet likewise never
allowing this condition to be transcended within the language used.
Language itself is the trap, a confinement, the limiting condition,
bringing about or reinforcing confusion and the distancing or
prevalence of local detached models of events, relying on warped and
skewed observer frameworks. And in this, 'logic' itself in its
historic default is validation: a too-simple binary view based on
authority is taught to people, a form of indoctrination into misplaced
faith (in a quasi-private relativistic methodology that aligns with
science and technology as religion), and thus 'self-belief' can be
equated with universal truth via observational bias - in some way
outsourcing accountability for truth to hierarchic structures of
authority for validation. And as people align with these forces, they
benefit from the shared viewpoint, represented as if truth, and
indicated this way by its most materialized manifestations.

It would seem that this process cannot occur the other way around,
where 'writing' starts in logical frameworks within existing written
and oral language, because that breaks all the rules and protocols for
relations which reinforce the hierarchic 'shared interpretation', ever
extended by people referencing valued concepts as part of discourse
and various exchange.

And this gets at that boundedness issue: logical accounting is not
allowed beyond existing and biased interpretations that may even
default to an antihuman agenda, hidden or unhidden. That is, a
grounded connection with truth within the ideas exchanged, in their
accuracy, removed of the errors or ambiguity that allow pseudo-truth
or even lies to persist and function against stated goals - which, if
never accounted for, are allowed to function as virii, worms,
propaganda, brainwashing, etc. "Internet of Things" is one such term,
given how it is accounted for. The commercial technical utopianism is
of a corporate technocratic perspective that some may uncritically
align with, due to mediating the truth of the situation in a bounded
condition, versus in other more involved and difficult frameworks
involving behaviorism and tyrannical control over populations, where
buy-in occurs as a result of calls for ~innovation.

The goal of writing this was to get at the idea that what is occurring
online and offline, in the ways it is occurring, can be mediated and
interpreted in various contexts and constructs. Yet when pulling back
from the immediacy of a given observation, a more general condition
also exists that involves a bounded interaction and engagement, or
lack thereof, with this entire situation. And in the observing of it,
the reflective consideration and evaluation, the parsing of the data
and connecting of various dots in constellations of mind, here and
there, what spans the distance between peoples as shared truth is not
being represented in [words] to the degree necessary to deal with the
situation in the terms that exist. Basically, society and civilization
are mute to the events taking place, in that the representations are
misaligned and inaccurate and incapable of modeling the events that
exist - in particular, using language in its historic heirloom quality
reinforces and extends what is occurring instead of resolving it and
allowing other options to exist. And this occurs at the level of logic
that goes unaccounted for within texts, ideas, viewpoints, relations,
exchange - though ideological machines and peoples have no need of
this errancy in their purified, secured, hidden estate.

So this condition of language makes it impossible to accurately
address the issues that are voiced and brought up to be 'reasoned
with' by many minds towards some greater empiric clarity; instead they
must remain bounded within and by this very language, the very tools
used, the letters on the keyboards required to type or the sounds used
to convey viewpoints - stringing data together as if fait accompli, as
if truth has been served, when instead it is lost in this process. The
assumption is that the text is a peak ability instead of a regression:
the sinkhole of bureaucracy and black hole of relativism, a faith that
communication accesses truth beyond the event horizon while it is
annihilated in the very process, never accessible. A mental,
conceptual stasis develops such that what is assumed true can never be
tested, beyond belief. This enigmatic conflict.

A recent interview with Noam Chomsky outlines this basic situation...

Noam Chomsky | Talks at Google  (~ 5 min)
https://www.youtube.com/watch?v=Y3PwG4UoJ0Y#

2. What is the most interesting insight the science of Linguistics has
revealed but that the public at large seems not to know about or
appreciate? 13:00

If it is acknowledged that there is a - perhaps existential - crisis
within language, as language, that establishes this bounded condition
and incapacitates observers by means of communication (whereby
normalizing a process or procedure makes what is necessary within its
realm an actual impossibility), it is perhaps at the very crux of the
observation that language is not primarily about communication, and
instead about thinking: its purpose more tied to logic than is
presently acknowledged. And, secondly, its meaning cannot be assumed,
though at present it is believed by default of binary analysis - faith
in preconceived viewpoints that reinforce models of 'shared truth'
based on power, which are only partially grounded or ungrounded and
function at the level of protected opinion, especially within a
behavioral context. In this way, belief and faith in a binary
worldview validates binary language, the question is considered
resolved, and this then serves a machine state as authority over human
civilization.

In other words, in its existing fallen condition, language can go
unremedied, mediated in biased binary frameworks that preference some
views over others by default of what truth is acknowledged and shared,
regardless of whether it is actually true beyond the signage, or the
words and concepts and beliefs said to represent events -
signification validated outside the realm of accountable truth, beyond
what serves a given perspective and its peculiar interests. Language
allows this by default: validated shared perspective correlates with
power, and thus determines truth (perhaps in some way related to
use-value, even). In this way lies the power of rationalizations -
reasoning that detaches itself from external, outside, or even
internal truth that contradicts a viewpoint and is ignored, or
defeated, as acts of purification.

This would relate to false absolutism, a false perspective, whereby a
limited and finite view is believed to be universal though reliant on
errors - warped and skewed observations that then become normalized,
tested against, standardized, used to indoctrinate, made 'reason', as
if the sheer act of communication conveys this truth, because it is
believed true, etc.

So binary belief scales very well to shared worldview, and issues of
thinking become as simple as picking a successful interpretation,
setting up a perimeter, allowing no more questions that threaten that
view, purging dissenting observations, and turning it all into a
religion of sorts, where 'the word' maps to deeds and justifies them -
laws, behaviors, morality, punishment - yet can be detached from
truth, requiring a biased, skewed, warped, distorted framework from
which to mediate events. Twisted, madness even, as worldview, shared
by many. And in a worst case scenario, also by machines: if the people
behave like animals in their reasoning, and the machines treat people
like animals, and that is all that is happening, then this. And
"Reality, Inc." describes this situation, sorting everything into
those analytical data boxes.

The second observation is more involved, though it also tends to
differentiate the human from animal and-or machine-based behavioral
analysis as a mode of grounding ambiguous events - the more zoological
mindset of a science and technology removed of their moral and ethical
foundation and in service to antihuman principles and agendas.
Consider the variability of a [concept] as it may exist in terms of
evaluation. Perhaps it is very simple and there is a simple and
immediate resolution of this quandary, whereby [word] must equate with
[word1] because the other options [word2-wordN] are not even
considered or calculable. Or, once the sign has been interpreted, it
is forever answered as a question, and interpretation becomes molar,
occurring at another scale; the details or intricacies of
consideration are then off-limits, as a right/wrong or true/false
evaluation exists as if by some other process of consideration. In
that, if an entity knows something is already right or wrong, it may
not need to parse it itself, or even be able to, yet can benefit from
the belief by siding with a viewpoint. This would tend towards a
non-conceptual or non-critical thinking or evaluation of language that
may be more aligned with robots or animals, given what cues or models
are in use.

And while it is vague to try to conceive of this as a condition - that
some indeed cannot process at the level of evaluation that others do -
communication is nonetheless held within a shared domain as if 'truth'
can bridge all viewpoints, even when reduced to vast inaccuracies of
shared belief that function either with a greater truth hidden inside
or with that truth lost within the language enigma.

And for the binarist this shallow approach could reinforce the latter
as the basis for deception and exploitation - a winning move and game
over for those of the opposing viewpoint. While for the former, those
who seek to ground their ideas and communications in truth, it may
still exist and persist within and beyond this ruined condition -- yet
remain bounded by the thinking that attempts to access and connect
with it, this essential connection with the shared self.

So if language is about thinking, the logic an observer relies upon
for observations affects what can happen: what is thought, how it is
conceived and considered, modeled and shared. The binarists gain
language by losing greater truth, reliant upon errors and relativism,
in that ideas tend toward ideology the further 'reasoning' is extended
in one-sided evaluations. The other option is that 2-value (binary)
logic is itself the problem of language, this limited condition - and
that the grey area of life must be mediated within language via
paradoxical logic (3-value: true/neutral-or-unknown/false, & N-value:
true/N/false), as it exists in observation and thinking, yet is not
allowed or effectively accessible within endless linear strings of
data, and instead requires new conceptualization: diagrams, modeling,
circuitry, multilinear ecological relations and dynamics.
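
As a minimal sketch of that distinction only (in Python, following the
common Kleene-style convention for 3-value logic; none of this comes
from the article), where binary evaluation forces a verdict while a
3-value evaluation can return the grey area itself:

  # 3-value logic sketch: true / unknown / false as 1 / 0 / -1,
  # in the Kleene style; 2-value logic lacks the middle value.
  T, U, F = 1, 0, -1

  def and3(a, b):
      return min(a, b)    # false dominates; unknown dims certainty

  def or3(a, b):
      return max(a, b)    # true dominates

  def not3(a):
      return -a           # the unknown stays unknown

  claim = U               # e.g. an unverified proposition in a text
  print(and3(T, claim))   # 0: conjunction with unknown stays unknown
  print(or3(claim, F))    # 0: not forced into true or false
  print(not3(claim))      # 0: negation cannot resolve it either

An N-value version would widen that middle band rather than change the
principle.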

Existing language has been founded upon a binary mindset that now is
ideologically saturated and dead as language. It is a rancid
disease-ridden corpse, nearly absent its truth.

The problems of the Internet, of this or that, are firstly and
primarily problems of language that involve how thinking exists in
people and is or is not being communicated within given or necessary
frameworks. The problem of the Internet or computers is also a problem
of computing and technology, though more so of how they are conceived
and programmed; how the code exists is a problem of language - not to
be fixed by writing it differently in a better optimization - the
thinking itself is wrong at the core, as with written language, so too
code, programming. Thinking is off. Conceptualization. More
communication of the same is bounded, destined to repeat the same
patterns over and over, by extending the original assumptions further.

So how to write, if it is impossible within the existing means of
communication? How to interpret is related to this - how do people
interpret texts? It indicates that a condition of variability likely
exists for some, yet is not actually accounted for as part of the
challenges of relating via text as medium (or in face-to-face
exchanges), in that interpretation and parsing of other views may
require grounding observations into internal models and frameworks -
whether or not they are actually true. So the issue of inaccuracy
persists by default, and this relates to communicative inefficiency,
whereby massive noise and trace amounts of truth are exchanged and
rebuilt anew for interactions and evaluations; and how robust or
creaky these structures are, based on the integrity of the thinking,
then either spans or distances viewpoints. In that 'truth' can be
shared via connective or related-to/through circuitry, or shared lies
via pseudo-truth or deceptions.

This is critically important, because in a high-fidelity approximation
of interactions, those savvy to observation may be carrying multiple
models of evaluation, parsing into different structures of truth,
pseudo-truth, lies, and unknowns simultaneously; and thus [logic] may
be inherent in how data and ideas are processed, compartmentalized,
acted upon, or ignored, yet 'truth' may still function in an ambiguous
condition, a realm of dense impenetrable fog at the empirical level,
allowing only local navigation or relations - where this boundedness
of language provides cover for another reality that is unaccounted for
by the machine state, or functions in parallel, in different
processing beyond the horizon of biased observers, even as fictive
overseers with godlike powers in the bubble condition, reliant upon
constraints.

So, moving from a simple binary model of evaluating texts and
communication for meaning - in which a one-sided ideological viewpoint
could be established, voiced, or spoken about, even as a basis for
mass communications - a more complex and involved model could co-exist
that mediates truth to its core, its ground state, whereby the
enigmatic characteristics give way to deep interconnectedness in other
dimensions and at other levels that likewise are unaccounted for by
those opposed in principle, passive words and active deeds.

Thus subtexts and various other dynamics and processes may be
evaluated by observers, and may be shared and transmitted via
communication at some layer in the enigmatic confusion; and whether or
not you notice may involve genes, mindset, and initiation or rites of
passage.

An esoteric model of language could then exist and persist in
parallel, communicating beyond the boundary yet still held back or
confined by it - indicating what needs to happen yet cannot occur
within the existing approach. And thus there is value or truth
communicated, yet not grounded within the world or the governing
language in its overt relation to material development, due to the
politics of language and the need to sustain the ongoing illusion of a
false empiricism based on the shared, biased, relativistic
self-interest of certain demographics.

To cede language, then - communication in its simplistic relation - is
to forfeit territory that is then deemed conquered, owned by the
victorious interpretation and its shared [reality]. And this can be as
simple as agreeing with the viewpoint, which validates its truth, or
being wrong and 'unreal' for not accepting these terms, in the
animalistic mindset and machine-processing.

Thus, to remember that such a concept is _variable, in its truth. It
is a question, or hypothesis - a series of structures and
considerations that must validate the presumption of accuracy - and in
this way, achieving universalism of view by majority representation
can be delusional if it ignores parameters beyond those of belief, of
opinion. To the binarist it does not matter: the viewpoint has been
secured, it has 'won' the contest, and success and administration are
its proof.

So within the string of words, the simplistic model could coexist with
a more advanced analysis and consideration involving more distributed
models that in their localizations have established grounding, yet not
at the larger integrated scale, to compete with the false POV of
ungrounded empiricism allowed by a relativistic political worldview
functioning against humanity. And yet it is through this language that
these distanced connections can be made, and communication thus
retains some vital ability to transmit and relay signals, to help
align the structures in their variety and depth, and to build and
develop the larger integrated framework.

If tools were created that helped establish this communication ability
in grounded truth, it would radically reshape the condition of
language, whereby 'thinking' would be the central goal of
communication, including the organization of the thoughts of the self
and with others, via software, hardware, computer development. Perhaps
modeling and diagramming is what is needed in these varied instances
(localities) to enable their connection at various scales and from
various perspectives - to attain that global perspective often aimed
for within texts, yet only illusory as words strung together, due to
the inherent error rate within existing language itself, which
establishes a boundary between truth and its representation.

One example of this is speaking of relations between people and Google
in terms of influencing those events, and the implicit yet unstated
connection to the establishing of corporations as citizens - where a
global business can have more money and workers than a single
nation-state - and then thinking an individual can 'reason' with that
corporate entity as if individuals in a legal dispute, or even in an
activist/protest mode, when instead this could be a fictive
relationship: ungrounded, unreal as to what it involves, what the
undercurrents of the state involve in its military dimensions, and so
on. And thus assumptions are forced or not evaluated, yet are also
critical to the validity of perspective. And it is incredibly
difficult to sustain a common perspective that does not start skewing,
due to finite language, or to word-concepts that have multiple
meanings or in-depth structural relations that may counter or refute
certain views. And it is a condition faced by writers and thinkers,
within language.

Whereas if communication "tools for thinking" existed, these dynamics
could be tamed and the core hypothesis brought into the common _known
framework for peer evaluation. So it matters who knows what, from
whatever perspectives, to N-degrees of truth, for any length of text
to be validated in a given context in terms of its accuracy; and this
cannot be dealt with using existing tools and technology and language
as conceived for millennia. I do the same - it is impossible to write
ideas, because concepts do not work like linear strings; they are
multilinear, go in every direction, outside the movement of time in
one direction. And yet it can be addressed, truth can be grounded
within language, yet it has to be reconceived beginning at the level
of logic, and patterns, and structures, truth itself, prior to
description.
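
As a crude sketch of that multilinearity (in Python, with concepts and
relations invented here as placeholders, not claims about how thought
is structured): the same ideas held as a graph that can be walked in
many directions, rather than as one fixed string:

  # sketch: concepts as a graph of named relations instead of a
  # linear string; the edges are invented placeholders.
  concepts = {
      "language":     {"bounded-by": ["logic"],
                       "carries": ["truth", "error"]},
      "logic":        {"grounds": ["truth"],
                       "defaults-to": ["binary"]},
      "surveillance": {"parses": ["language"],
                       "serves": ["power"]},
  }

  def paths(graph, start, depth=2, trail=()):
      """walk outward in every direction, unlike a sentence."""
      trail = trail + (start,)
      if depth == 0 or start not in graph:
          yield trail
          return
      for relation, targets in graph[start].items():
          for target in targets:
              yield from paths(graph, target, depth - 1,
                               trail + (relation,))

  for p in paths(concepts, "surveillance"):
      print(" -> ".join(p))

A linear sentence commits to one such path; the diagram keeps them all
available at once.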

And here is a case for interpretation as local modeling, and the
testing of models against one another - that this is what is occurring
in minds and brains when parsing and translating the data of other
viewpoints locally. Depending on the observer and their relation with
truth, their observations and communications could be reinforced and
strengthened in acknowledgement of truth, or could further accumulate
or protect false viewpoints, if not some of both to varying degrees,
unless able to work past this condition in some future scenario.
---

example versions....  Quotes from (Dark Google) article:

version 1:  "During the second half of the twentieth century, more
education and complex social experience produced a new kind of
individual. No longer content to conform to the mass, more people
sought their own unique paths  to self-determination. [...]

The arrival of the Internet provided a new way forward. [...] This was
a new 'networked public sphere,' as legal scholar Yochai Benkler
called it. [...]

The whole topography of cyberspace then began to morph as Google and
Facebook shifted away from the ethos of the public web, while
carefully retaining its rhetoric. They began to develop a new [logic]
of operations in what had until then been a blank area. The new zone
didn't  resemble the bricks and mortar world of commerce, but neither
did it follow the norms of the open web. This confused and distracted
users. In fact, the firms were developing a wholly new business
[logic] that incorporated elements of the conventional [logic] of
corporate capitalism - especially its adversarialism toward end
consumers - along with elements from the new Internet world -
especially its intimacy. The outcome was the elaboration of  a new
commercial [logic] based on hidden surveillance. Most people did not
understand that they and their friends were being tracked, parsed, and
mined without their knowledge or consent."


// in my estimation 'logic' here is used as a symbolic placeholder for
a condition of grounded truth that is believed to exist in the world,
yet is not being mediated within language or signs in this
comprehensive way, such that accounting for truth is detached from the
language used to communicate 'ideas'; and this is closely related to
relativism and issues of perspective.

in other words, not every instance of truth is being evaluated within
the language used, such that this excerpt could be considered
'absolutely true' in its conveyance. it is incredibly easy to falsify
such statements, of most any text, by finding examples to the
contrary. It could involve a proposition - 'the whole topography of
cyberspace began to morph' - which, depending on how its terms are
conceived and evaluated, could range from true to false. It can also
occur at the scale of words, such as 'cyberspace' being more than
online experience, though also 'logic' being more than binary
decision-making and analysis, which is a default assumption and does
not necessarily correlate with grounded truth, functioning just as a
signifier.

And yet it is that very referencing of a [word] that seems to validate
its truth, and this is the predicament. It is variable, for one thing;
and for another, based on observational context and interpretation, it
could vary from true to false to unknown. It could be partly true as a
word, in a particular usage or relation, beyond the page or screen. It
could be mostly false in one reading and mostly true in another, based
on subtext.

What may be readily accessible in this relation is the essence within
the communication, which turns writing into a form of painting, or
some illustrative medium that does not have nor require deductive
processing of the parts, and instead renders the whole as this 'truth'
that is carried by the words and sentences into an illustrated scene
or perspective. And thus whatever truth is there could be accessed,
though in a general condition, at a very high or topmost layer, while
what is underneath cannot be evaluated to depth or degree, due to the
complications of language in this same conveyance and structural
approach. It is, in some sense, like a landscape painting rendering
observations, beyond words or empiric meaning.


version 2:  "During the second half of the twentieth century, more
education and complex social experience produced a new kind of
individual. No longer content to conform to the mass, more people
sought their own unique paths  to self-determination. [...]

The arrival of the Internet provided a new way forward. [...] This was
a new 'networked public sphere,' as legal scholar Yochai Benkler
called it. [...]

The whole topography of cyberspace then began to morph as Google and
Facebook shifted away from the ethos of the public web, while
carefully retaining its rhetoric. They began to develop a [new logic]
of operations in what had until then been a blank area. The new zone
didn't  resemble the bricks and mortar world of commerce, but neither
did it follow the norms of the open web. This confused and distracted
users. In fact, the firms were developing a wholly new [business
logic] that incorporated elements of the [conventional logic] of
corporate capitalism - especially its adversarialism toward end
consumers - along with elements from the new Internet world -
especially its intimacy. The outcome was the elaboration of  a new
[commercial logic] based on hidden surveillance. Most people did not
understand that they and their friends were being tracked, parsed, and
mined without their knowledge or consent."


// this is an area that pinged my brain and broke my parsing, because
'logic' is really quite difficult to address, particularly in existing
language (versus modeling or diagramming). my assumption was that this
condition was evaluated against a particular dictionary definition
that corresponds with truth, yet may not be accounted for as truth,
via logic; and so it may not involve any 'new logic' to the degree
stated, though it finds expression in these terms because that allows
the essence of the situation to be communicated, perhaps accurately,
relativistically. It is not to nitpick, but to seek clarity in
language, its truth: likely 'logic' here is a placeholder for other
ideas and concepts, such as 'strategy' or 'reasoning', which would
more realistically appraise what is actually going on, and how it is
not _that different from what has preceded it, in terms of a leading
edge of innovation at the edge of civilization that develops according
to new or expanded principles. I think a false perspective is allowed
that 'logic' then seeks to validate, as if an empirical reality shared
as a mindset, when it does not add up within the language itself - the
words, the sentences - in terms of their truth.

Taking ideas out of context is one aspect; yet considered in terms of
'thinking itself', if all ideas are 'out of context' by default, and
this context is not allowed to develop, then most any writing that
references a standard library (used for interpretation) could be full
of skew, distortion, and warped relations by default.

That said, while interpretation is _variable, in a certain instance an
observer may not be able to accommodate the assumptions (say, 'logic')
though may replace or substitute (translate) another set of concepts
as interpretative options: [business logic] could then be translated
into [business strategy] or [business philosophy] and retain its
meaning within the structures it may be related to locally, in terms
of modeling and grounded truth. In that a given concept could be a
placeholder for another concept or idea or model nested within it;
language has this capacity to expand and contract, this fractal
ability within words and ideas, as if interrelated molecular
structures, some aligned, some not. And this quality or
characteristic, these parameters, seem correlated with deconstruction
and linguistics as a method of reading, yes, though potentially also
of writing (potential hypertext, wiki, etc).
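
As a small sketch of that substitution (Python again; the glossary
entries are my own placeholders, not claims about the article's
terms), treating each bracketed [concept] as a variable an interpreter
may translate into a local framework:

  # sketch: bracketed concepts as variables, translated through a
  # local glossary; the entries are invented placeholders.
  import re

  glossary = {
      "business logic": "business strategy",
      "commercial logic": "commercial reasoning",
      "logic": "strategy",    # fallback reading of the bare concept
  }

  def translate(text, glossary):
      """replace each [concept] with a local reading, if one exists."""
      def swap(match):
          concept = match.group(1)
          return "[" + glossary.get(concept, concept) + "]"
      return re.sub(r"\[([^\]]+)\]", swap, text)

  print(translate("a wholly new [business logic] of operations",
                  glossary))
  # -> a wholly new [business strategy] of operations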

Then again, with strings and interpretation - the processing of the
data stream into a given or considered framework - this entire
situation is itself variable. And at its core, what establishes and
helps determine 'meaning' - that is, how the truth that exists is
accessed and represented, or else absent - is [logic]. And thus a
binary or 2-value approach would yield, say, the above evaluation and
discard the issue of logic as logic, to gain a more descriptive aspect
of ideas.

Whereas if this viewpoint were revisited - in the endless looping that
language is - over time, it would be possible to have multiple
co-existing interpretations, and perhaps in some instance this use of
logic is correct, such that it is variable, and thus multiple logics
could be a subtext represented by its use: indicating that a 2-value,
3-value, N-value Google may exist as a type of superposition of
activity in this same context, and that a multidimensional,
non-flat-earth approach (referencing Flatland, and more going on than
is perceived in certain conditions) could be a highly accurate
modeling of the situation conceptually, whereby [logic] is variable
and does not default to a particular interpretation except logic as a
concept, in its span of truth.
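
A sketch of that superposition (readings and truth assignments are
invented for illustration only): the same [word] kept live under
several interpretations at once, collapsed only when read through one
framework at a time:

  # sketch: one [word] held as several co-existing readings, each
  # assigned a truth value per framework; all values are invented.
  T, U, F = "true", "unknown", "false"

  superposition = {
      "logic as strategy":    {"2-value": T, "3-value": T},
      "logic as a new logic": {"2-value": F, "3-value": U},
      "logic as placeholder": {"2-value": T, "3-value": U},
  }

  def collapse(readings, framework):
      """read the same word through a single framework at a time."""
      return {r: values[framework]
              for r, values in readings.items()}

  print(collapse(superposition, "2-value"))
  print(collapse(superposition, "3-value"))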

This references the idea of 'recontextualization' of a text, then, by
the variability of a word or concept that is shifted or changed or
held in [superposition]. The circuitry that is grounded in truth could
change from one interpretation to the next, in differing frameworks
and structures, yet from its interrelated details could emerge visions
otherwise unseen or beyond view if only taking one path or another.
And so: how many dimensions can be accommodated by a given observer,
and how does this relate to conceptualization, and to how logic is
identified, established, and parsed as a basis for 'reasoning',
writing and reading, interpretation?

In some sense perhaps there is an issue of packaging, compression, and
unpacking of nested data related to concepts and language; and yet
while this may occur automatically in the evaluating of texts and the
communicating of ideas by placing them into form, tools do not exist
to shape language in the way that thinking actually occurs, especially
in logical terms, of truth.
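
The packing figure can at least be made literal in code (a toy, with
an invented nested 'concept'): a nested structure compressed into a
linear string for transmission, then unpacked back into structure on
the far side - which is what prose is asked to do without such tools:

  # toy: pack a nested concept into linear form and unpack it back
  # into structure; the 'idea' itself is an invented placeholder.
  import json, zlib

  idea = {"logic": {"business": ["strategy", "reasoning"],
                    "value": ["2-value", "3-value", "N-value"]}}

  packed = zlib.compress(json.dumps(idea).encode())  # into a line
  unpacked = json.loads(zlib.decompress(packed))     # into structure
  assert unpacked == idea
  print(len(packed), "bytes in linear form")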


version 3:  "During the second half of the twentieth century, more
education and complex social experience produced a new kind of
individual. No longer content to conform to the mass, more people
sought their own unique paths  to self-determination. [...]

The arrival of the Internet provided a new way forward. [...] This was
a new 'networked public sphere,' as legal scholar Yochai Benkler
called it. [...]

The whole topography of cyberspace then began to morph as Google and
Facebook shifted away from the ethos of the public web, while
carefully retaining its rhetoric. They began to develop a new [
  ] of operations in what had until then been a blank area. The new
zone didn't  resemble the bricks and mortar world of commerce, but
neither did it follow the norms of the open web. This confused and
distracted users. In fact, the firms were developing a wholly new
business [          ] that incorporated elements of the conventional [
         ] of corporate capitalism - especially its adversarialism
toward end consumers - along with elements from the new Internet world
- especially its intimacy. The outcome was the elaboration of  a new
commercial [          ] based on hidden surveillance. Most people did
not understand that they and their friends were being tracked, parsed,
and mined without their knowledge or consent."


