Abeba Birhane @Abebab
136 tweets, 42 min read
One of my favorite things to do (which I don't do as much as I should, thanks to web-browsing) is go to an interesting section of a library, sit on the floor, take my time and go through what's on the shelf. And I can't believe what I just found.
I promise not to over-tweet (I'll try not to) but thought tweeps might be interested in checking if Google categorizes you as "famous"
In the present day of ubiquitous surveillance, who we are is not only what we think we are. Who we are is what our data is made to say about us. #WeAreData
Algorithmically produced categories are replacing the politicized language of race, gender and class... whether we know about it, like it, or not. #WeAreData
Who we are in the face of algorithmic interpretation is who we are computationally calculated to be... when our embodied individualities get ignored, we increasingly lose control not just over life but over how life itself is defined. #WeAreData
The different layers of who we are online, and what who we are means, are decided for us by advertisers, marketers, and governments. And all these categorical identities are functionally unconcerned with what, given your own history and sense of self, makes you you. #WeAreData
Classification systems are often sites of political and social struggles, but these sites are difficult to approach. Politically and socially charged agendas are often first presented as purely technical and are thus difficult even to see. #WeAreData
The process of classification itself is a demarcation of power, an organization of knowledge and life that frames the conditions of possibilities of those who are classified. #WeAreData
When Google analyzes your browsing data and assigns you to one of the two distinct gender categories (only “male” or “female”), your algorithmic gender may well contradict your own identity, needs and values. >

#WeAreData
< Google’s gender is a gender of profitable convenience. It is a category for marketing that cares little whether you really are a certain gender, so long as you surf/purchase/act like that gender. #WeAreData
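Not from the book, but a rough sketch of the kind of behavioural proxy being described here: a hypothetical classifier that assigns a binary "marketing gender" from nothing but browsing signals. The site categories and weights are invented for illustration; the point is that the person's actual gender never enters the calculation.

```python
# Hypothetical sketch, not Google's actual system: a binary "marketing
# gender" inferred purely from what a user surfs, indifferent to identity.

# Toy weights: how strongly each site category "counts" toward the
# advertiser's two labels (all numbers invented for illustration).
CATEGORY_WEIGHTS = {
    "sports":  {"male": 0.8, "female": 0.2},
    "beauty":  {"male": 0.1, "female": 0.9},
    "finance": {"male": 0.6, "female": 0.4},
}

def marketing_gender(visited_categories):
    """Return whichever label the browsing history 'acts like'."""
    scores = {"male": 0.0, "female": 0.0}
    for category in visited_categories:
        weights = CATEGORY_WEIGHTS.get(category, {"male": 0.5, "female": 0.5})
        for label, weight in weights.items():
            scores[label] += weight
    # The user's self-identified gender is never consulted.
    return max(scores, key=scores.get)

print(marketing_gender(["sports", "finance", "sports"]))  # -> "male"
```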
The use of predictive models based on historical data is inherently conservative. Their use tends to reproduce and reinforce assessments and decisions made in the past. >
< This type of categorization delimits possibility. It constructs a programmed vision that extrapolates the future -- or more precisely, a future -- based on the past. #WeAreData
Companies like Google use their algorithms & our data to produce a dynamic world of knowledge that has, & will continue to have, extraordinary power over our present & futures. And as we also continue to be filled w data, this algo logic produces not just the world but us.
Importantly, the “we” of “we are data” is not a uniform totality but is marked by an array of both privileging and marginalizing of differences. #WeAreData
We need to see digital technology in continuity with previous or existing social, political and economic structures, and not only in terms of change, revolution or novelty. #WeAreData
< And as all data is burdened by this structural baggage, any interpretative classification of datafied life necessarily orders and organizes the world in the shadow of those structures’ effects. #WeAreData
Defining ‘emotion’ via algorithm is an interpretation of the world that ushers us into a distinct suite of knowledge, one that orders and understands the ineffable chaos of our world as operational bits of data. #WeAreData
We are not just represented but also regulated by data. #WeAreData
What is “seen” and “not seen” is not merely a technological phenomenon or limitation but an algorithmic consequence shaped by history: who is empowered to look, what is made visible, and what is made invisible? #WeAreData
Facial recognition technologies are burdened with politics. Also, why not drag Descartes with every opportunity we have... :)
"Yes, we are being surveilled, but this grand aggression of data wasn't the surveillance of Orwell's Big Brother or the FBI's COINTELPRO." #WeAreData
“We are data” means that our data, produced in accelerating quantities and with increasing intimacy, is not just data but constitutive material for interpretative, structuring, and ultimately modulatory classifications. #WeAreData
The companies, governments, & researchers that collect, evaluate, & algorithmically interpret our data are agents invested in keeping their interpretations secret. Google’s ‘gender’ & ‘age’ algorithms are proprietary, as are HP’s ‘face’ & Face.com’s ‘emotions’.
In our internetworked world, our datafied selves are tethered together, pattern analyzed, and assigned identities like ‘terrorist’ without regard for our own historical particularities. #WeAreData
The truism of “one man’s terrorist is another man’s freedom fighter” is reinforced by the fact that this identification is always made on terms favourable to the classifier’s geopolitical needs. #WeAreData
Engineers, mathematicians and scientists who turn social and ethical problems into technical problems are part of the problem
#WeAreData
I couldn’t agree more with the above
Much like the social construction of gender, race, sexuality, and terrorist, the datafied world is not lying in wait to be discovered. Rather, it’s epistemologically fabricated. #WeAreData
And because these constructions - of who counts as a terrorist or what it means to be a man - are legitimated through institutions like the state, media, medicine, and culture at large, they are also politicized and thus corrupt. #WeAreData
< They are inventions born in contemporary relations of power and logics of classification and thus not authentic verification of who we think we might be. #WeAreData
Who determines where data [that make up who we are] come from? What data is available and what data isn’t? And, most importantly, how is this data made useful? #WeAreData
< A hammer is useful in breaking things but only presuming those things want and deserve to be broken. The same hammer can hurt people or serve as a prop in a play. #WeAreData
To build a model is to conceive of the world in a certain delimited way. #WeAreData
The indexing of categorical meaning away from the human-centred complexities of narrative, context, & history & toward measurable datafied elements within a closed set casts the measurable type as a discursively contained, and empirically definable, vessel of meaning. #WeAreData
When data defines us, the complexities of our emotional and psychological lives online are flattened out for purposes of mass-scale, approximate data analysis. #WeAreData
For computer scientists, it is the translation of the continuum into the discrete that marks the condition of possibility for computationality. To make something operable for a computer means that something has to be transcoded into discrete, processable elements. #WeAreData
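A minimal sketch of that "translation of the continuum into the discrete" (my example, not the book's): a continuous score only becomes processable once it is collapsed into a small set of bins, and whatever fell between the bins is gone.

```python
# Illustrative only: collapsing a continuous signal into a discrete,
# processable "measurable type". Thresholds are invented for the example.

def to_measurable_type(score: float) -> str:
    """Map a continuous 0-1 score onto a discrete category label."""
    if score < 0.33:
        return "low"
    elif score < 0.66:
        return "moderate"
    return "high"

print(to_measurable_type(0.41))   # "moderate"
print(to_measurable_type(0.655))  # "moderate" -- the in-between nuance is lost
```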
Datafied life undergoes the simplifying moves that are needed to convert the messy realities of people's personal attributes and behaviours into the objective, tractable language of numbers. #WeAreData
What is race, what is gender, and thus who counts as a citizen of the state -- all get rewritten and standardized when histories and contexts are made numerical. #WeAreData
“Raw data is an oxymoron.” The production of data is, at its genesis, encased in a web of preexisting meanings, in which “data are not given; they are made.” >

#WeAreData
< As such, anyone who employs the term “raw data” forcefully forgets where that data came from. The data that make up our measurable-type identities must have a history, and that history is anything but untouched by human interference. #WeAreData
More here on "raw data is an oxymoron"
Like the “translation of the continuum into the discrete”, when measurable types are made from our available data, the lived specificities of our own perspective, history, and context are denied by these objectifying claims of identification. #WeAreData
To make any claim about oneself, or to define one’s own subject position, requires a hypothetical halt to the changing nature of the world. #WeAreData

(or in Bakhtin’s words, we can only be finalized when we are dead).
In technology discourse today, there is a devotion to the measurable type, a belief that data both speaks without impediment and with a lifesaving eloquence that describes the human condition better than any doctor or scientist could have. ->
<- [But] patterns in data aren’t truth. They are constructed, algorithmically produced ‘truths’. #WeAreData
When a company is filled with engineers, it turns to engineering to solve problems. Reduce each decision to a simple logic problem. Remove all subjectivity and just look at the data. #WeAreData
Void of subjective assessment, our [algorithmic] ‘gender’, ‘race’, and ‘class’ have little resemblance to how we encounter gender, race and class. #WeAreData
Those who gather and interpret aggregate data understand that there is a certain fictional and arbitrary quality to their categories and that they hide a wealth of problematic variation. Once set, however, these thin categories operate unavoidably as if ->
<- all similarly classified cases were in fact homogeneous and uniform. #WeAreData
Software that has passed beta and is released into the wild, only to find a bug that remained hidden during its testing phase, is not a stain on modernity's notion of truth and perfection. It’s rather the truth of life, a truth without end. #WeAreData
I'll stop there for today. Plenty to digest. #WeAreData
Okay, a series of tweets coming up... #WeAreData
On the disciplinary mode of power (Foucault): if there’s surveillance/interpretation by the other, we do the work of power by disciplining ourselves in accordance with perceived social norms. #WeAreData
As subjects, we tortuously facilitate mobile but constant contact with a regulatory regime that effectively recalibrates the nuances of our own making at the moment of each and every encounter. #WeAreData
Which websites we visit, which products we purchase, which queries we search for, and even which words we type are all vacuumed up via the technological surveillant assemblage for the near entirety of the internet marketing and profiling industries. #WeAreData
Without tracking, these multibillion-dollar analytics systems would fall apart. Without a mechanism to connect our data, we are forgotten. But with tracking, we are remembered, almost forever. This is what makes companies like Quantcast (or Google or Facebook) so profitable.
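As an aside, a toy sketch (not any company's real pipeline) of why that connecting mechanism matters: events keyed to a persistent tracking identifier congeal into a remembered profile, while an event without one links to nothing.

```python
# Toy sketch, not a real analytics pipeline: a persistent tracking ID is
# what lets scattered events be stitched into a remembered "profile".
from collections import defaultdict

events = [
    {"tracking_id": "abc123", "site": "news.example",   "query": "mortgage rates"},
    {"tracking_id": "abc123", "site": "shop.example",   "query": "running shoes"},
    {"tracking_id": None,     "site": "health.example", "query": "anxiety"},  # untracked visit
]

profiles = defaultdict(list)
for event in events:
    if event["tracking_id"] is None:
        continue  # no connecting identifier: this visit is "forgotten"
    profiles[event["tracking_id"]].append((event["site"], event["query"]))

print(dict(profiles))  # one remembered profile; the untracked event is gone
```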
I thought this was a nice illustration of Bayesian probability (wonder what the queen of Bayesian stats, @djnavarro would make of it?) #WeAreData
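The illustration itself is an image that doesn't survive the unroll, but for the mechanics, here is a toy Bayesian update with my own made-up numbers (not the book's): how strongly should a profiler believe a user is "a parent" after observing one toy-store visit?

```python
# Toy Bayes update with invented numbers (illustration only).
prior = 0.30                 # P(parent): assumed base rate
p_visit_given_parent = 0.60  # P(toy-store visit | parent)
p_visit_given_not = 0.10     # P(toy-store visit | not parent)

# Bayes' theorem: posterior = prior * likelihood / evidence
evidence = prior * p_visit_given_parent + (1 - prior) * p_visit_given_not
posterior = prior * p_visit_given_parent / evidence

print(f"P(parent | toy-store visit) = {posterior:.2f}")  # 0.72
```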
Our network society facilitates the variable ways that sovereign power interfaces with data-centric tools. >

#WeAreData
<- Seemingly anachronistic modes of power surely remain in the most high-tech of our experiences. People are jailed, injured, killed, or intimidated often because of, not despite, their access to digital technology. #WeAreData
By processing our data but never revealing what that data means to us, the discursive topographies that shape our lives are black boxed into a state of procedural, algorithmically inferential unfolding. #WeAreData
What data materially represents is now tied to biopolitics...if an iPhone application, not a mental health professional, can assess & diagnose your ‘anxiety’ and ‘depression’, what other effects would this shift to the measurable type have for our biopolitical futures? #WeAreData
Those who are empowered to construct the categories that define life have authorship over the future of biopolitical life. And those who have the most adept measurable types are most likely to be the conduits used for biopolitical intervention. #WeAreData
What is useful in a model is what’s empowered. Or what Google or Quantcast decides is useful is what becomes true… use isn’t contingent on anything outside the model’s authors’ intentions or haphazard creations. #WeAreData
[Algorithmic] control is not a control that guides you against some presumed, autonomous will. Instead, it’s a control that frames your world. It conditions the possibilities available for you to live your life as a user -- as well as a member of a category. #WeAreData
and that's it for today :)
The endless production of data about our lives means that there will always be some piece of us to be pooled, stored, sorted, indexed and exploited. #WeAreData
In our interaction with 21st c atmospheric media, we can no longer conceive of ourselves as separate & quasi-autonomous subjects, facing off against distinct media objects; rather, we are ourselves composed as subjects through the operation of a host of multi-scalar processes.
Algorithmic power does not confront subjects as moral agents but attunes their future informational and physical environment according to the predictions contained in the statistical body. #WeAreData
While we can certainly resist relations of power outside data through direct action against Google or legal reform of the NSA, that resistance must exist on data’s terms if it wants to defy our algorithmic identifications. #WeAreData
< Of course, this is not because we literally are made of data but rather b/se data is the only thing legible to these algorithmic systems of power. In this way, these identificatory systems reject the organic. They defy the analogue grey b/n the digital polarities of +1 and 0.
Associations are the lingua franca of our datafied life, the source of its fetters as well as the nourishment for its growth. Because a single piece of data means nothing on its own, the fetishized autonomy of the liberal subject would starve without the other. #WeAreData
Patterns are made from a population, not one person. How algorithms interpret us necessarily connects us to the lives of others. This sentiment, in all its humanistic beauty, is also regulating. It controls life on the basis of what we do. #WeAreData
The practice of classification itself, a power move of the highest order, frames our worlds in ways absent our choosing. #WeAreData
An important change in privacy dynamics is the fact that the ‘invading entities’ [of surveillance] have become broader & less easily identifiable. The major impetus for the power imbalance b/n the subjects & objects of surveillance in the network is their difference in identifiability.
As it stands, privacy as a concept isn’t about shutting off from the world or hiding our vulnerabilities from some empowered surveillant eye. ->
<- Rather, the normative nature of privacy lies precisely in the protection of ‘the territories of the self’ - a preserve to which an individual can assert ‘entitlement to possess, control, use, [and] dispose of.’ #WeAreData
Privacy’s blunted practical use appears to be an unavoidable casualty of an Internet that has invaded our bodies, our social interactions, and especially our homes - the once-regal throne room of the “private.” #WeAreData
The Internet’s comprehensive capacity to surveil has led some tech leaders, especially those who stand to make money from cultivating our datafied lives, to join the media chorus and theoretically toss the concept of privacy out the window. #WeAreData
“You have zero privacy anyway. Get over it.” Scott McNealy, Sun Microsystems CEO
“Privacy is no longer a social norm.” Zuckerberg
“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” Eric Schmidt, Google’s former CEO
For these rich white men, privacy is dead because it impinges on their business models. #WeAreData
There has always been a disempowering, asymmetrical relationship between those who have the right to privacy and those who don’t. #WeAreData
For people of colour, “privacy” in the mode of liberal democracy is often a nonstarter. ->
< Excessive state policing, nonstate social vigilance fueled by racist media representations, & the institutionalization of white supremacy all frame nonwhite bodies as less deserving of the “right to be let alone”, making it more an empty slogan than a sound, egalitarian legal foundation.
E Schmidt’s facile rejection of privacy dimly characterizes “breathing space” in a negative light, parroting a trope in contemporary privacy debates that equates the right to privacy w a defense of illegality: “the right to be let alone” has morphed into “I have nothing to hide.”
The procedural conflict between the inclusivity of the right to privacy and the specifics of that right’s action makes a single, universal definition of privacy quite difficult to come by: “privacy, like an elephant, is perhaps more readily recognized than described.” #WeAreData
Just as one person’s happiness might be another’s hell, the breathing space that each of us needs to survive is both contextual and relative. #WeAreData
A privacy that provides for an integrity of self is a privacy that lets us know who we are, how we’re treated, and what that treatment means. Unlike the closed-door privacy of the liberal “private sphere”, a privacy that celebrates integrity assumes that your life is complicated.
Surveillance in the present context applies very little to acts of seeing. Surveillance is a socioalgorithmic process. #WeAreData
Power locates our habits, identities, and characteristics in different categorical boxes. These boxes are often not for us but rather serve as objects for administration about us. #WeAreData
Warning: prepare for a tweetstorm. #WeAreData

Just finished reading the book and this will be the second last tweetstorm (another one will follow sometime in the next couple of days). :)
[In] the Data Wars, in which Google, Microsoft and Yahoo! spent billions of dollars buying up several different marketing and web-analytics companies, data has become one of the prized commodities of our contemporary epoch. #WeAreData
When it is a computer, and not an individual, that surveils and evaluates our data, our ability to claim privacy “against” an other is comparably muted. #WeAreData
Privacy is essential to the development of the self as a concept. Without attending to the structures that “make plans” for our selves, we are unable to ask the lingering existential questions: “is this the kind of being I ought to be, or really want to be?” #WeAreData
[To] know how we are ordered is to know the forces at work that disturb the internal consistency, or integrity, of who we are -- and who we want to be. #WeAreData
Without the ability to determine for ourselves “when, how, and to what extent information about [ourselves] is communicated to others," we blindly speak “as” an unknown subject. #WeAreData
We lose subjective integrity when our worlds and selves are algorithmically made, in hidden and inscrutable ways, for us. #WeAreData
Dataism: "widespread belief in the objective quantification and potential tracking of all kinds of human behaviour and sociality through online media technologies." Jose van Dijck #WeAreData
Algorithms are devices for allocation, and their allocation will always be empowered, functionalist, and incomplete. Sometimes a decision and its consequences can be unintentional. Other times it can be overtly political. #WeAreData
Who we are in one context is different from who we are in another. A robust theory of privacy requires an understanding of the processes by which selfhood comes into being and is negotiated through contexts and over time. #WeAreData
We must learn to think of personal data as an extension of the self and treat it with the same respect we would a living individual. To do otherwise runs the risk of undermining the privacy that makes self-determination possible. #WeAreData
Understanding that our data is part and parcel of who we are and what we are subject to lets us think about privacy in a new way. Privacy has long been heralded as a human right, but what does privacy look like now that we are more than just our bodies but also our information? >
< What is privacy for the posthuman whose ‘citizenship’ and ‘sexuality’ are assigned according to one’s metadata and whose ‘class’ is determined by one’s web browser or operating system? #WeAreData
Some people might fetishize some aseptic perfection of supreme privacy, but its practice is utterly unworkable. No matter how hard we may try to avoid surveillance’s embrace, its ubiquity still manages to bring us back into the fold. #WeAreData
The tussle between the forces of ubiquitous surveillance and the forces of privacy is ultimately a struggle over the organization of digital life itself. #WeAreData
Our lives cannot be datafied in perfect fidelity, as there will always be a material aberration that causes a hiccup -- a lived experience that the ontology of the computer cannot fully reconceptualize. #WeAreData
The tragic story of Mark Hemmings, to whom the book is dedicated.

#WeAreData
If anything merits our accusatory finger-pointing [of Hemmings’s death], it’s the technocratic desire we have to make data speak, to have data about ourselves be the determining factor in not just our individual actions but the construction of what is ‘deserving’. #WeAreData
The first sentence of Facebook’s terms of service agreement is as hilarious as it is insightful: “your privacy is important to us.”
Here goes the last chapter #WeAreData
Microsoft’s “Potential” so transparently celebrates a world where corporate profit goes hand in hand w digital tech, which goes hand in hand w capitalist globalization, which goes hand in hand w the colonization of our features by some super-intense neoliberal metaphor of code.
We are made of data. But we are really only made when that data is made useful. This configuration is best described by the technologist wet dream in the epigraph. For the cyberutopians who believe this stuff, we are code. #WeAreData
Microsoft’s iconic corporate monopoly works as a convenient stand-in, functionally determining “who we might become.” #WeAreData
Power (be it state or capital) classifies us without querying us. And those classifications carry immense weight to determine who possesses the rights of a ‘citizen’, what ‘race’ and ‘gender’ mean, and even who your ‘friends’ are. #WeAreData
On Facebook, our selves are not more free; they are more owned. And they are owned because we are now made of data. #WeAreData

(Zadie Smith's review of The Social Network.)
Imagine a world where a company like Google doesn’t just gatekeep the world’s information but also controls how we describe that world. This is the precise problem we encounter when discourse is privatized, although not the way people usually talk about the crisis of privatized knowledge.
Sure, public libraries are losing funding, museums are closing, and even academic journals are fighting to protect copyright for the articles that scholars write for free. That’s one kind of privatization. ->
<- But much like Google has the ability to make a ‘celebrity’ and Facebook to define ‘inappropriate’, Google, Facebook, and others are on their way to privatizing “everything”. #WeAreData
EVERYTHING

#WeAreData
The machinations of Google search will always be hidden, intellectual property that we can never, ever know. Google has no obligation to tell us how they make ‘us’ or the ‘world’ useful. But in order to use such sites, via its terms of service, we must agree not to have privacy.
In fact, via the terms of service for a site like Facebook, not only do we not have privacy, but our data (our photos, videos, even chats) doesn’t even belong to us. Data, more often than not, belongs to whoever holds it in the cloud. #WeAreData
Whether owned by a corporation or ourselves, our data (and by default our privacy) finds comfortable accommodations in the swell of capitalist exchange. #WeAreData
The digital economy, as part of a capital-producing ecosystem, reconfigures how we conceive of work, compensation, and even production. #WeAreData
Our datafied lives are made by our own actions; they order as well as represent a new kind of labour. #WeAreData
You on OkCupid, diligently answering questions in the hope of romantic partnership, produce the knowledge that feeds the site’s algorithmic operations. OkCupid’s value and profit margins are dependent on these knowledges. #WeAreData
Microsoft’s code is much more than a metaphor. It’s an empirical, marketable commodity that defines the things we find important now – race, gender, class, sexuality, citizenship – as well as the things that we find important in the future. #WeAreData
When privacy is reversed, everything changes. Our worlds are decided for us, our present and future dictated from behind the computational curtain. #WeAreData
(If to exist as “superintelligent” of the singularity is to exist as abstract code, then isn’t the idea of “superintelligence” another version of Descartes’s mind-body dualism?) #WeAreData
The singularity more likely won’t exist. It’s a techno-futurist aspiration that is more marvel than muster. ->
<- But what does exist is that ideas of our selves (not our souls) have already become datafied and been uploaded onto a server and then, so long as the server is kept plugged in, shape us and the discourses that make us. #WeAreData
It’s not our selves that are made into code but the foundry by which our selves will be made known to the world. #WeAreData
Who has power in an algorithmic world? Who can wield it? And how do the technologies of the world, often unintentionally, inscribe that underlying power in ways that make, protect, and modulate the world in order to suit the status quo or privilege some interests over others?
The end! #WeAreData