I really enjoyed our conversation today everyone! Let’s use this thread for further thoughts on the week 2 readings.
I appreciate everyone’s thoughtful comments on the previous discussion board and your reflections on the readings in Monday’s webinar. It was helpful to hear how people read and interpreted these sources from their own backgrounds and experiences.
How do you see the surveillant assemblage at work in our lives?
I think at the moment, we are seeing governments (state and federal) considering measures that will help track people in the US, particularly those infected with COVID-19, in order to minimize outbreaks, all for security purposes. It’s a tricky area, and we discussed some of this in the webinar. China and South Korea have deployed similar surveillant assemblage technologies to monitor their populations. I’m conflicted: I’m thinking of the values of our profession and our privacy rights, but at the same time I recognize how vulnerable communities will suffer if we don’t take precautions against this pandemic. Looking forward to reading your thoughts on this.
What stands out to you about the ideologies that helped shape the internet?
The notions of neoliberalism and capitalism help structure the internet as a force that perpetuates economic and social privilege over others. The markets create an opportunity to “equalize” access but can really enable social control, mass consumption, and the commercialization of access, while also building barriers. “With the success of the Internet has come a proliferation of stakeholders – stakeholders now with an economic as well as an intellectual investment in the network” (Leiner et al., 1997). The internet has been amazing in bridging connections and connecting us with information, but at times we are left out of such information or given mis/disinformation, and these dynamics will continue to create all kinds of unimaginable consequences and realities.
I may add more later! Thanks for reading my writing in a stream of consciousness.
Hi All, I really enjoyed our discussion as well. Thanks for getting us started with these thoughtful questions Ray!
How do you see the surveillant assemblage at work in our lives?
This is an interesting question, and now that I am reading Dark Matters by Simone Browne, I am appreciating the theoretical tools she uses in analyzing race and surveillance. At my workplace, I have resisted doing security sweeps or patrolling for people who are eating (hungry students). We have security, and they are slow to respond, and they don’t enforce the food policy. I have said many times that I am a librarian and that my job is not to secure or patrol the library. So unless there are direct threats (where I call 911) I resist acting like security personnel.
I find food and food policy discussions in libraries to be particularly tainted with racial and cultural overtones—especially the notion of stinky food and acceptable snacks. This would be a good research study.
About data and surveillance, there are many systems that produce surveillance data in my library. For example, printing data, where logs are kept and can be analyzed. Wi-Fi login data that records when people with devices come in and out of the library (because their phones automatically log in). We have a computer system tracking log-ins and other authentications that could be used to track users. Finally, we have a remote keyless system (RKS), also called keyless entry, that could also be used to track library staff. Simone Browne in Dark Matters (p. 16) discusses Oscar Gandy’s concept of the “panoptic sort,” a process “whereby the collection of data on individuals and in organizations is used to classify, assess, and sort, privileging some and disadvantaging others.” For example, I wonder what analyzing library fine data by race, gender, age, and other demographic categories would reveal? There are ethical problems with using that data for a study, but I believe you would find interesting results.
How do you see the surveillant assemblage at work in our lives?
I am still catching up on the readings, but one thing has really stuck with me: how the rise of technology, in the form of photography, changed us. A person became named with an image. Prior to that, though, we had portraits and sculptures of people. Did Da Vinci’s models understand how their images would be used? Isn’t it interesting that one of the most famous women in the world (the Mona Lisa) is so unknown in terms of identity? As a photographer, I can’t help but think about that as I view all of the data uses in this age of COVID-19. It’s everywhere - from thermometers tracking outbreaks in real time to the use of facial recognition in watching over quarantined people. It goes beyond tracking people to turning us into data bits without our permission. Our bodies, used against us. I see echoes of these threads in all of the readings, and I found @Fransnyc’s mention of stinky foods in libraries an interesting insight. I worked at a library once that had a sign saying food was acceptable except for “smelly” food.
What stands out to you about the ideologies that helped shape the internet? Considering the birth of the internet, it seems we have come a long way from those beginnings. Also, we needed to consider more carefully how this technology would be commercialized and what that would mean for our data.
Thanks Ray for the questions; I love that you thought of them!
When I read “The Surveillant Assemblage,” the first kind of technology I thought of was Amazon’s Ring. So many people I know have one because they already use Amazon, and it’s cool to have a camera that can show you who’s at your door on your phone. However, once you start realizing all the other things Amazon owns and how they can profit off your data, it isn’t worth it to have the camera. The most threatening way they use your data is by partnering with law enforcement to share it. The police don’t need to create their own all-knowing database like in “The Surveillant Assemblage,” because Amazon is doing it for them. I’ve tried having this conversation with the people I know who own Rings, but I get pushback like ‘you’re overreacting’ or ‘we don’t break the law, so we’ll be fine.’ Anyone else experience this?
On the ideologies that shaped the internet, what stood out was how well-intentioned it all was. People wanted to be able to share information and got together to make it happen. It makes you hopeful for fights happening now, like net neutrality.
- I definitely think our society has some shedding of greed to do, and we can start to see ourselves outside of ‘market’ labels - we’re human beings - this is what technology, I feel, is always trying to separate us from. And as humans we have the capacity to care for and love one another. Look, if everything goes haywire, everyone can come out to Iowa and we can squat a big corporate farm and start a librarians’ co-op where we can grow our own food and not worry about car payments-
Anyways my answers to the questions posed:
How do you see surveillant assemblage at work in my life-
Pre-internet I was disillusioned with the world in general - my parents never looked happy, and they were teachers. When I expressed disinterest in any assemblage but art, it was a constant fight. I was an unpredictable youth in an increasingly brand-driven world. I slept in shelters, moved across the country, lived in the mountains of Idaho at a backcountry trail station, and lived on an income-sharing egalitarian commune, until, finally, I found myself knocked up in Durham, North Carolina - that was my only reason for entering back into the market-surveilled system. Now twenty years into it, I’m desperately holding onto my creativity and sense of natural living. The hardest part has been bringing kids into this system - trying to show them a direction beyond what the assemblage wants them to understand - to really find truth and not be afraid.
What stands out to you about the ideologies that shape the internet?
Yes, the capitalists of our time have shaped it - and we the mass have turned it into what it is - largely a shopping mall with a large porn room in the back. I think there’s a hacker mentality within the internet and the information flowing back and forth, but that hacker is also human - motivated by insecurity and rejection experienced in their ‘real’ lives - we cannot separate our very human experience and think that our ‘digital doubles’ can live in a fantasy world that is completely controlled.
After all we’re creatures of chaos - anyways if anyone needs anything from Iowa holla -
**How do you see the surveillant assemblage at work in our lives?**
In “The Surveillant Assemblage” article, the editing of the famous phrase “Man was born free, and he is everywhere in chains” to “Humans are born free, and are immediately electronically monitored” really struck me. There is barely a moment in our lives when we are not being monitored, or our data is not being collected. The article mentioned new ankle bracelets for newborn babies, and while I don’t have any children, I would be surprised if this kind of technology is not present in some, if not all, hospitals today. Further, you can’t watch a true crime documentary or TV show without realizing the scope of police databases, CCTV cameras, and other traditional surveillance equipment. The amount of data that we leave behind on a daily basis is staggering, and while it is overwhelming and off-putting for most, it is even more dangerous for People of Color and immigrants. “Individuals with different financial practices, education and lifestyle will come into contact with different institutions and hence be subject to unique combinations of surveillance” (pg 618). I can’t remember who so eloquently said this during our lecture/discussion, but one of the participants pointed out that while their own data was being consumed and applied to the market to advertise things they could buy, this same data could be collected from POC and immigrants going about their day-to-day lives online or in the world, or trying to access governmental benefits or assert their social/civil rights (such as protest), and then be used to target them in police investigations or for deportation. I would love to read the full version of “Dark Matters” (and it sounds like others are already doing so!) to learn more about this systemic targeting of specific ethnicities, races, and socio-economic groups of people.
This question echoes not only through “The Surveillant Assemblage” and the review of “Dark Matters” but also through “You Are Cyborg,” Humdog’s “Pandora’s Vox,” and “Sad by Design,” each of which in some way considers the question: “In what relationship do we live with the technologies that watch us?”
**What stands out to you about the ideologies that helped shape the internet?**
It seems as though the Internet was always shaped by the intentions of those using it. The Declaration’s naivety about the free and open way in which this system should and could operate seems unbelievable now, with its assertions that the government “wasn’t invited” (lol, tell that to all the countries where the Internet is actively censored by the government), that “all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth” (as if everyone can afford computers, smartphones, etc., or access reliable Internet), and that they were creating “a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity” (LOL). At the same time, while “Another Network is Possible” seems more palatable to us because it discusses how IndyMedia was created to serve as a gathering place for communities to “communicate, consolidate, and form a ‘network of resistance’… against government and corporate control,” those who created IndyMedia are very open about the fact that their “tactical media” is inherently biased, because it “is made in support of a political project.” The Internet is not a separate, lovely, equalizing space (though it does allow for more voices to be amplified than they might be otherwise, i.e. Black Twitter, trans movements, etc., though of course this amplification comes with the cost of doxxing, etc.). It reflects many of the same biases, issues, annoyances, and systemic inequalities that the “real world” suffers from, because it was made by and is used by the same people in power.
I’m not quite keeping to the questions because I was inspired by a dystopian drama I watched this week to take my mind off of the actual dystopia we’re living in. The other night, I watched the first episode of Westworld Season 3, where they finally go from the park to the outside world (the real world). So much of it made me think of what we read last week.
In this future world, there is an algorithm, run by one man, that controls and determines everything and knows everything. The algorithm decides major things like whether you can get a white-collar job, whether you need therapy, etc. People are given a score based on everything they do, and that score determines your possibilities. The gig economy is huge for people whom the algorithm has decided are not very employable, and you can find someone to hire to do anything from blowing up an ATM, to babysitting a paranoid and violent rich guy on drugs, to killing a person. As you can imagine, inequality and a lack of social mobility are even more ingrained there than in our society.
I was struck by how it’s really not that different from the world we live in today. It doesn’t seem far-fetched that we could be there soon. I think a lot of people imagine that an algorithm could be less biased and better at making decisions than people, and could thus be a better way to make decisions about things like hiring, who to rent to, etc. I think those are the assumptions that underlie so many algorithmic decision-making tools in the justice system, insurance industry, corporate America, etc. And yet, those tools often reinforce existing racism, sexism, classism, and other structural inequities in our society (the racist algorithm that predicts whether someone convicted of a crime is likely to reoffend is a great example). The more we rely on algorithms to make decisions, the more ingrained inequality will be in our society. I found the idea that we are given a number based on all of our actions, one that determines the course of our lives, very believable, and it would only keep the rich rich, the poor poor, and white men in power.
The big difference between the surveillant assemblage we live with and this Westworld world is that in Westworld there is one bad guy pulling the strings who I assume the androids will eventually kill, whereas surveillance and algorithmic inequity in our world is a many-headed hydra. As The Surveillant Assemblage article suggested, our surveillance state can’t be killed by banning a specific technology or going after a single entity. It’s like a game of whack-a-mole. It’s convenient to believe that taking out Facebook or Amazon or Clearview AI will make our world so much better, but they are just the most visible of the companies surveilling us, trying to keep us sad, angry, and scared, and profiting off our data doubles. So many things that seem benign and useful are also part of the surveillant assemblage. Many of us have willingly ceded our personal information to for-profit companies for the promise of more information about ourselves and our habits, the convenience of single sign-on, the ability to see the local weather forecast without searching, the promise of discounts or rewards, and the ability to control things in our home by voice. It’s all so very insidious and we all have fallen into the trap to a greater or lesser extent (no I don’t want to put in a rewards card damn it!).
And all of this, whether in Westworld or our world, is based on techno-libertarianism, neoliberalism, the idea that monetarily free is more valuable than freedom and privacy, the commodification of the self (both by individuals who monetize themselves and by corporations that sell bits and pieces of us), the idea that humans can be optimized like machines, and the idea that human behavior can be quantified and predicted. It’s ironic that the promise of a freer online world was what motivated these early Internet pioneers and “thought leaders,” when it has enabled increased surveillance and corporate control of our information and selves in ways they’d never anticipated (other than humdog who saw how we were being commodified online in the 90s!). In the end, the humans of Westworld are as controlled by machines as the androids are controlled by humans. Sadly, not a far-fetched future at all.
I really appreciate that. My first professional position was at a private military university, and the Library was the only place on campus where the first-year cadets could not be harassed by their superiors (upperclassmen). We were basically this island of calm, peace, freedom, and privacy at a college in which even the amount of time they spent in the shower was policed (2 minutes, omg!!!). Though I’ve since worked in places that in no way resemble that, I still value the idea of the library as a space where students aren’t going to be hassled or surveilled. And while I know that’s not entirely true because of the actions of our online resource vendors and our college IT policies, I feel like it’s a value worth fighting for. And I especially feel that way when I think about libraries surveilling how students use our resources in order to make a value proposition to administration. It’s just not worth it.
I am trying to think a lot about strategies for engaging librarians on how much more intense this “data-driven approach” has become. For example, trying to make librarians see the difference between looking at circulation records and looking at these more complex correlation studies that marry circulation to demographic variables, GPA, student record information, Wi-Fi logins, fingerprint-scanner time clocks, etc. Corporations and data science, in general, have normalized this behavioral data modeling as if it were completely ethical. We have come a long way from denying the FBI library checkout records of banned books.
In part, this is because this corporate norm that shifted privacy to what it is today changed so rapidly, and without any consumer education or consumer protections, because the technology was evolving too quickly. And the discussion of the ethics of statistical machine learning is too abstract. Statistical machine learning is how these data algorithms are trained to make predictions using behavioral data that companies get from our phones, web browsing, and other sensors. Trying to make the connection that intense data collection = surveillance is too abstract for many. But abstract as it sounds, we have already started to give up our student data privacy, as many of our current vendors are analyzing these Big Data sets and trying to figure out how to monetize them. There’s nothing to stop Blackboard or other Course Management Systems from selling targeted advertising to third parties. I am surprised it hasn’t happened yet.
On an upbeat note, the good thing about our profession is that it has a history of protecting privacy, whereas most technology-oriented professions (Management Information Systems, Data Science, etc.) do not have this ethical training as part of their worldview on information. So librarians are probably one of the few professions to really get information ethics as part of a formal and long-standing curriculum. However, we need more than discussions about ethics. We also need action: research, legal barriers, and proactive legislation and policies that protect our user data.
As far as ideologies go, I think that the realization of this techno-utopia called the Internet has a scary trajectory. Mostly because, in the history of Utopias, we see something like an idealized political philosophy (think Plato’s Republic, Marxism, Heaven, etc.), and how these Utopias were ultimately realized to have a streak of authoritarianism (Stalinism, Religious Wars, Fascism, etc.) where only one or a few were on top. Right now, Jeff Bezos and friends are on top of this free-market techno-utopia, and a new AI god is promised. However, as the COVID pandemic spreads and Western economies shrink, I think that China will displace Silicon Valley and copyright will be a memory. Perhaps it’s time for me to go into cryostasis until the Star Trek utopia happens. I’m sorry, that’s really bleak.
I think a lot will depend on the leadership we get going forward to restructure labor, economics, healthcare, education, etc. We really need strong Presidential leadership. American society was radically transformed during the Depression under the leadership of FDR.
Haggerty and Ericson’s innovation in “The Surveillant Assemblage” is that they look beyond Foucault’s panopticon (a formidable entity), and rightly show that surveillance in the post-internet world is a lot friendlier, and a lot more insidious–a la Deleuze and Guattari’s formulation of the rhizome in Capitalism and Schizophrenia. This is obviously dense theoretical work, but it’s so, so interesting! I’m a theory nerd, so I’m going to dig into this a bit before I get into more pragmatic ideas. Deleuze and Guattari’s rhizome is an entity that overcomes binarism, or “arborescent” thinking (ironic if you’re thinkin’ 'bout the 'net!) and hierarchies (good!), and heralds the arrival of a postmodern era featuring a “flatter,” more diverse, more organic, more messy social, political, and artistic landscape–like a gorgeous ginger root of open-mindedness and free enterprise. I mentioned in last week’s meeting that in ecocritical circles, the rhizome is often thought of as the (still incoming) paradigm shift in our language and behavior needed to properly tackle climate change. To Haggerty and Ericson, the rhizome is not so positive–it’s the very thing that has corroded our access to privacy and freedom. Think of the very structure of the internet from the user’s perspective: multiple companies, with different terms of service, with different functions and assets to the user, all with different log-ins, etc., that may be, beneath the soil, connecting or intertwining with various other entities, to varying degrees, for reasons unknown to the user, at times unknown to the user, and the user has no real ability to find out when or how or why. An individual’s data portfolio from Facebook can be accessed by advertising agencies, political campaigns, community groups, or the government, if the price is right. And the price isn’t all that high to begin with! The rhizome, then, as Haggerty and Ericson argue, is a much more fearful thing than the panopticon.
At least in the panopticon you know you’re being watched.
This essay, in conjunction with the readings about the idealistic and libertarian philosophies and attitudes that have imbued the internet since its founding, has made me think a lot about French critical theory and its decades-long reign in the academy. French theory (of which Deleuze and Guattari are both a part and critical) has had an outsize portion of cultural and political influence around the world since the 1970s. In many ways, it has led to a golden era of individualism and liberalism. (There’s a sidebar here about the backlash against French theory on the far left and far right, but I’m gonna stay on topic, lolol.) But those attitudes of individualism and liberalism are exactly the same sentiments idealistically touted by the internet’s founders and early adopters in their wide-eyed manifestoes of the late '80s and early '90s. It’s important to remember that the early seeds of Fascist ideology were also wide-eyed, youthful, and looking to the technology of the future as the hope for improving the human experience. The motto “Move fast and break things” is a direct descendant of “The Manifesto of Futurism.”
So Haggerty and Ericson brought up how surveillance is assimilated and turned into entertainment for mass consumption. They brought up the TV show America’s Funniest Home Videos, which still airs. During my immense downtime this week, I decided to watch a few episodes and was surprised at the number of people who have cameras filming constantly inside their homes. Through our participation in pop culture, we become complicit in consuming the surveillant assemblage of others, which makes it just that much harder to distance ourselves from it or condemn it.
Thank you Ray for including the weekly questions from the curriculum in here, since I neglected to do it.
Absolutely. I am scrambling to get some of our speakers changed to address these emergent topics as soon as possible. For me, the main questions are:
- Is this surveillance actually necessary to stop the virus from spreading, or are there other measures that can be just as effective (if not more) that don’t create new social problems for us? It seems that widespread testing, social distancing, effective quarantines, and so on are actually the best methods of containment. We will definitely be discussing this more.
- If any new surveillance is absolutely necessary in this extraordinary time, how do we prevent it from sticking around after it is no longer necessary? As I said above, I have a hard time with the idea that any of it is necessary, and as someone who was an adult during the post-9/11 years, I know how new surveillance powers can be swept in at a time of emergency and then normalized.
Your points about Simone Browne’s book and enforcement/policing are very relevant to what Ray brought up in his comment. If we approve new surveillance powers to stop the spread of the coronavirus, who will these powers be enforced against? Will it be the celebrities, politicians, and NBA players who get the immediate and best medical care and testing? Or will it be the people who are already marginalized in our society? There is always a racist/racialized dimension to surveillance (I mean this is Browne’s thesis) – post-9/11, it was against Muslims and anyone racialized as Muslim (Sikhs, etc). Coronavirus-related surveillance will be racialized against people of Chinese descent and anyone racialized as Chinese.
I think about this a lot with how the law has not caught up with technology. Our privacy rights in public spaces never accounted for everyone having a camera that can record!
We will have a week later in the course where we just talk about the rise in consumer surveillance devices like Amazon Ring. I am very excited for it, because I think this is a hugely important emergent issue.
The road to hell! It’s a good point, because a lot of the worst things we get from technology are their unintended consequences. People want to be safe during the pandemic, so they’re not thinking about how new surveillance powers might be deployed against them in the future.
Might be taking you up on that in the coming months! Although we might be facing state border shutdowns very soon.
I am very much feeling for parents during this stage of late capitalism.
Imagine being one of the scores of newly unemployed people, and imagine you stop paying your rent. What if you’re also an immigrant? What if you leave your house to join a rent strike protest outside? How do you think the new coronavirus methods of surveillance might be enforced against you if you belong to one or all of these groups? We are going to talk about this sort of thing this week when we talk about threat modeling.
They also were made up of many more people from marginalized identities, including lots of folks from the Global South (particularly Latin America).
Imagine how an algorithm might determine our likelihood of spreading the virus! We will be talking about algorithmic decision making during the week that we talk about facial recognition (and probably at other times too).
This is why I started Library Freedom.
I wanted to start us off with some theory, so I’m glad you connected with it! The rest of the course will be more pragmatic stuff and news and whatnot.
And this is why our response should also be rhizomatic! For all the reasons that it appeals to the ecocritical circles.
YEPPPPPPP. Sidebar, there’s a response to this that maybe you’re familiar with from Franco “Bifo” Berardi called “After the Future”.