A couple of things that struck me about the surveillant assemblage reading…and perhaps this was down to the publication date, but re:
the surplus value of our data (ahem excuse me for yelling but): IT’S ALL SURPLUS VALUE when it comes to our data, no? I mean, the data that companies actually need to make an app work versus the amount of data they grab is disproportionate. That’s a guess. I got so appalled by what was happening in my UX class that I started to tune out and plot an uprising.
on page 615, it seemed like the authors were making the case that the “soul training” that happened under the panopticon model is obsolete and that instead we’re subject to consumerist self-monitoring. I’d say it’s not an either/or, but both/and – the Sad By Design article would seem like good evidence that our synoptic surveillance society includes soul training intimately bound to our market consumption (e.g. “you are your brand”).
It really is! And since there’s no accountability for these companies, we never get a real sense of how much they actually DO need vs what they take. That is, unless we have a view from the inside. We’ve been talking in class a little about Wendy Liu’s memoir of working in startups, Abolish Silicon Valley, and she gets into some detail about just how much of the startup world is built on the model of “collect all the data you can, if all else fails with your shitty product, monetize that data, rinse and repeat”. Abolish it!!!
Absolutely. They predicted a lot, but not everything, and I think this is one of the things that humdog got right in her 1994 text when she says “i have seen many people spill their guts on-line, and i did so myself until, at last, i began to see that i had commodified myself.” Our personal brands are our commodity which we also get sold back to us by advertising in a reinforcing loop.
When I read “The Surveillant Assemblage”, I highlighted this quote, “This assemblage operates by abstracting human bodies from their territorial settings and separating them into a series of discrete flows.” I thought about how this describes the dehumanization that happens in data collection. People are no longer seen as people, but rather as data points.
I think this helps explain a lot of the unintentional (and probably intentional) harm that has been done, and is being done, by the technology industry. This harm has little significance when you are only seeing data representations of people or groups of people on your screens.
A recent example of this that is not technology-related but does a good job of illustrating the point is the reporting we are seeing regarding people affected by the pandemic. The media throws around numbers and percentages and seems to forget that they are talking about actual people. Secretary of Education DeVos tries to push the idea that if we open the schools it may mean that only .02% of children might die. That reduces the fact that thousands of children would die to a very basic data point that we can brush off as a minuscule percentage.
Another quote: “…proles have long been the subject of intense scrutiny”. I think we have definitely seen this in response to the WFH period with differing levels of surveillance from our workplace to ensure productivity. This was the quote that triggered me to post the other topic I did.
As for the Declaration of Independence of Cyberspace, I don’t think the writers really anticipated the onslaught of griefers and trolls that has risen online. It’s stuck in the frankly libertarian view that at heart people really are good and will watch out for and protect each other. And while this is true of many people, it ignores the sheer number of people who take delight in harming others, and any society can only tolerate that harmful element to a point (the paradox of tolerance). It also fails to account for the ways that identity and property have moved online so that they are inextricably linked to our physical selves. So while they may claim that meatspace rules and laws should not apply in cyberspace, once cyberspace and meatspace are merged, laws and rules can be applied.
On a side note: I was killing time this weekend watching some TV, and caught an episode of Mysteries of the Abandoned on The Science Channel. One of the places featured was an abandoned 19th century prison right in the heart of Philadelphia - the Eastern State Penitentiary. The prison was one of the first built with the intent to “rehabilitate” its prisoners rather than just incarcerate them. So it was built with lots of solitary confinement cells where the prisoners were meant to meditate and reflect on their wrongdoings and hopefully come out as better people. I bring it up because it was designed so that hundreds of prisoners could be supervised and monitored by the fewest possible guards, with wings spread out from a central hub. Viewers in the hub could look down the length of the wings and keep an eye on everyone.
I had the hardest time reading The Surveillant Assemblage - a half-formed thought I had was how data doubles could easily become triplets, quadruplets, quintuplets… One act of surveillance that results in a data double can easily be spread to other surveyors for other uses. And as those doubles are used and analyzed, the distortion compounds: the double gets corrupted, or oversimplified into a caricature of the user. Does more surveillance - the matching together of data doubles from different surveyors - form a more accurate double with nuance? Or does nuance get stripped away for the convenience of the surveyors to repackage the double for value extraction? I think we know the answer…
I mentioned before that I was reading “The Value of Everything” by Mariana Mazzucato. Mazzucato differentiates between value creation (additive) and value extraction (exploitative). Data doubles have been ripe for value extraction - we see that already with Alphabet and FB collecting and selling user data for advertising. Social media users’ data doubles are being commodified a la “Sad by Design” into a perpetually dissatisfying experience. But the social media platforms themselves are not adding value - it’s rent seeking… There is nothing productive about the work of social media and the accompanying surveillance; it’s artificial value extraction. And it gets replicated with more surveillance.
Sam, the idea of multiple data-duplicates makes a lot of sense to me. I have multiple twitter accounts, one for librarianship and another under an alias, where I don’t have to think about if what I say will be seen by professional acquaintances or employers. I thought it was funny how on my librarianship twitter, I got ads for Elizabeth Warren, and on my aliased twitter, I got ads for Bernie Sanders. Perhaps there’s more than one Shelley in dataland, where one is profiled as a middle-aged single female liberal librarian, and another is profiled as a young talking animal with incendiary opinions about landlords.
It also makes me think about how much you can learn about someone from someone else’s social media. If not everyone in your life is privacy conscious, you will likely have friends tagging you in photos from group events, wishing you happy birthday in public (and maybe saying “I can’t believe you’re turning 30!!”), posting about how much they miss spending time with you at such-and-such habitual location. A whole data double could be constructed from the inferred details Facebook and the like collect based on your social network. I remember Facebook a long while back boasted that they could accurately predict who was gay based on who their friends were. I’m sure where a student goes to college, what someone’s religion or ethnic background is, where they live, etc. could be similarly predicted using algorithms. Of course, our data doubles are often inaccurate. On Twitter, people love to go into their ad analytics and make fun of how wrong the algorithm was about their demographics. “Twitter thinks I’m a 65-year-old woman, suppose that’s because I post about knitting. I knew I was old at heart, but do you think I should consider transitioning?”
Value creation and value extraction are certainly two different acts, I agree. Marx of course famously said that value is only created through labor. A tree might have potential value as a chair, but you can throw as much money as you want at the tree; it won’t become a chair until someone does the labor to transform it. The extraction of value comes when the capitalist/business-owner underpays and overworks the worker to obtain surplus value, and then sells the chair at a profit of which the worker sees only a small share. But this framework starts to get a bit trickier to apply when we look at value extraction from data. I dread to call tweeting labor, but if we are the product, then perhaps we are the tree, and the tech workers who build the website are the laborers transforming us into chairs?
@samlee You are so right! And with the commercialization of data that is happening all around us (remember, if a service seems to be free, you are the product the company is selling!), this is happening and extending the chain even further than you could believe. And now let’s consider feedback loops, where surveillance systems may be acquiring data from other purveyors but were actually the original source of that data. And then we have to deal with competing platforms (because capitalism) that are each collecting redundant data, running it through their algorithms, and then selling it.
@samlee@shelley I see us more like free range milk cows than trees. We get the appearance of having open land to ramble over and grass to chew on. But the content (milk?) we make gets harvested. And then to push this tortured metaphor further, it is sold to ice cream makers, cheese makers, bakeries, etc. who transform it into new products that they can sell (the chain of data duplicates that Sam talked about in her post).
Right it’s proletarianizing a new group of workers, the newly WFH people who are shielded from much of the very worst working conditions (eg slaughterhouses, gig workers, domestic workers) but nevertheless have become a lot more precarious in pandemic times and subject to way more scrutiny, direct surveillance, and insecurity. I think this would be a good thing to ask Jasmine McNealy about when she speaks to us in week 5 (August 20th). I’d also love to hear her discuss the already existing surveillance systems that workers were dealing with prior to covid (look up how bad it is for TRUCK DRIVERS), and how so much of this surveillance was used to prevent unionizing (eg Amazon warehouses and Uber/Lyft drivers).
I definitely understand this thinking, but I think the reason I’m a lot more cynical about this piece is because I’ve been involved in the free software/hacker world for years (via my involvement in The Tor Project) and I have talked to a lot of folks who’ve been around that scene since around the time this came out, or at least while it was popular through the years. And they all said that the “no rules but be excellent to each other” attitude was not enough to handle abusive people in the community. And then of course there were a lot of high profile abusive men that we’ve kicked out in the last few years. What I’m saying is…what you described as “things that can go wrong” with this thinking were already happening!!! The entire time! And maybe not JPB, but definitely a lot of his friends and contemporaries were our biggest barriers to kicking those abusive men out and setting up community accountability systems (stories that I will tell you all in person someday, along with my JPB story of when he and I both keynoted a conference in Huntsville, Texas, home of Texas death row!)
Right before the pandemic hit, I had organized Cohort 3’s in-person weekend here in Philadelphia, and we were all gonna go on the Eastern State tour one of the days. The audio guide is voiced by Steve Buscemi and it’s so good! So, someday we will all do that.
One part of Wendy Liu’s memoir Abolish Silicon Valley that really stuck with me is her description of working in startups, and how everyone just pivots to selling data when their startup (inevitably) fails and before they can figure out another startup scheme. It’s chilling to think about how many hands have touched our data. It’s really such a violation!!!
It’s one of the underlying jokes on the show Silicon Valley, like most of these companies…don’t actually even do anything. The way they describe themselves makes no sense, it’s all buzzwords.
When you mentioned this before I didn’t immediately catch the author’s name…but I just read Mariana Mazzucato’s book The Entrepreneurial State, which is about how so much of the ACTUALLY valuable stuff that has come from SV (eg, the iPhone) was built thanks to enormous state investments in blue-sky technology research and in public research academic institutions (not to mention infrastructure like roads and irrigation in California). The Californian Ideology mentions this as well. And yet the myth persists that all of these guys started something amazing in their garages with no help!!!
shelley can you dm me your twitter handle lol
I literally took a four hour OSINT (open source intelligence) workshop today as part of the HOPE conference, and one thing we talked about at length was how much social engineers will look to your social network, and learn things about you by what your friends and family have said. This week, we will be talking about threat modeling and how our communities and networks are involved in assessing our risks.
This is actually one of Shoshana Zuboff’s points in The Age of Surveillance Capitalism. She says one kind of our data is the raw material, and another kind is “behavioral surplus,” and the way she describes it in the book sounds to me a lot like “surplus value,” to put it back in Marxist terms.