Week 17 discussion

#1

This was such a good talk! I was taking furious notes.

One thing I noticed today, after the lecture was over, that I wish I could have asked her about: I just updated my phone to iOS 12 and noticed in the settings that there's now an option to turn on a VPN. I think that's new? I was trying to figure out what this means (how it's hosted, whether it retains logs, etc.), but I'm not finding any news stories about it.

0 Likes

#2

I don’t have that option listed under my settings…do you have a VPN installed on your iPhone?

Also, I agree. Yesterday’s lecture was fantastic! I hope I can read my notes. =)

1 Like

#3

Two things that stuck out to me in the call:

  • When Harlo was talking about some of the work she's been involved with, she mentioned helping Grindr understand how the way they expose personal data affects people in ways they may not have foreseen. I'm really trying to keep this in mind as I get excited about different products that get promoted to me to help students (what am I not anticipating?). For example, the CA community college system has been pilot testing Starfish at a number of campuses (there are 114 in the system to date). Starfish is promoted as a student retention tool. It's supposed to make it easier for an instructor to see a dashboard of the students in their class and spot who might need extra support (like if they haven't posted in a few weeks, or they have several poor-scoring assignments in a row), and then they can flag someone on campus through the system to help them support the student (whether that's financial aid navigation, help with food insecurity, etc.). At my previous institution, I was in there as a potential referral (e.g., student X has difficulty citing sources and therefore might be accused of plagiarism down the road, so they can let me know and I can offer some consultation to the student ahead of the next paper), and while that use did feel right to me, I spent a lot of time wondering if and how it could be used in the wrong way. There's another “Early Alert” product from the same company that we don't have (I think it's geared to universities), but it seemed pretty worrisome… Rather than predicting who needed help (which is how it was billed), couldn't an institution decide not to admit those students at all?

  • At a later point, after Harlo spoke about WhatsApp, she mentioned that while it can be problematic, it has high recognition and might be safer for some folks depending on their threat model. I haven't used WhatsApp myself but will be looking into it. What bothers me is that even if I do understand a little about someone's threat model, the tech changes so fast that even if I can recommend something, how safe is it, and for how long? What if there are yucky things happening on the back end that I can't imagine because of my own ignorance? Definitely not something I try to get too stuck on (gotta keep trying!), but it does give me pause nonetheless.

2 Likes

#4

Without looking into it too much, I have to say that Starfish sounds like something out of Black Mirror. It strikes me as an algorithm doing poorly what a human can do better. I suppose it can enable institutions to churn large quantities of people through online degree mills (or even giant seminar classes) without having to get to know any of them. Then, instead of building a relationship, we can just track them like data points and send them an impersonal message to report to X support person on campus! I don't think any of us survived any educational system without making meaningful relationships with mentors, and none of those were mediated by algorithms.

1 Like

#5

There was so much info in her talk I think I need to re-listen and take better notes! (I’ll admit that I’m always making coffee towards the beginning of these talks as I’m not good at mornings!) I had someone recently recommend I use Telegram instead of Signal. I looked back at my notes, and it just says “Russian company.” Does anyone have any up-to-date info on Telegram?

0 Likes

#6

I think about this a lot, and it makes it really hard for me to feel comfortable recommending anything to anyone. I’ve been thinking lately that maybe the better tactic for me is to help people understand 1) how to create a threat model and 2) what issues/features to look for in any given product or service so that they can evaluate it themselves. But I also recognize that “do your research, and then do it again” is oftentimes not what people want to hear.

1 Like

#7

This is the best of times and the worst of times to be interested in this. Last week I was giving a 10-minute lightning talk about patron data collection, and the day before there were Congressional hearings on consumer privacy, which the tech press then lambasted. The day of the talk, the Facebook hack story broke. Lightning talk >> lightning storm & my brain is exploding.

@Sarah_in_Oregon When Telegram was first catching on a few years back, they took a lot of flak because their encryption protocol was not open source; i.e., no one could see it to test it. How it worked, whether there were bugs, and how fast those bugs got fixed, nobody knew. Telegram is very feature-rich, though, and has none of the usability hiccups Signal has.

1 Like

#8

@mtkinney I had that problem when I was teaching a class on keeping passwords secure. One of the participants just really wanted me to tell him which password manager to use. And we were in the midst of a discussion about risks and benefits, ease of use vs. security, things to look for, etc., and he just got really irritated and impatient.

0 Likes

#9

And there is so much risk for bias! We are also implementing a similar tool in connection with advising. My understanding is that, basically, your advisor can see if you are flagged as at risk of dropping out or not completing a particular major. (So, people who couldn't pass Math class X are unlikely to complete an engineering major.) While I can see the use as a point of conversation, I have to wonder how that goes, especially, again, considering what we know about the barriers in higher ed for people with various marginalized social identities.

1 Like

#10

Isn’t it always the best and worst of times with this stuff? [headdesk]

0 Likes

#11

“Seemed simpler pre-internet, pre-data flood,” he mumbles, his face bloodied.

2 Likes

#12

so @josh and @clobdell, I'm not actually sure why Claire's phone has it listed so prominently, but she might be getting A/B tested with a new placement for the setting. the typical location is Settings -> General -> VPN, and if you go there it gives you the option to add a VPN configuration (you could add, for example, Riseup's VPN config here). you have to be a little tech savvy to get this working, but it's not outrageously hard, imo. just to clarify, though: no, it is not a VPN built into the iPhone. it's a place to add a VPN from scratch. your other option is adding a VPN by using an app, which would set this up for you.

0 Likes

#13

for me, this is why it always comes back to harm reduction. even if something new happens that makes the old recommendations less good, it was STILL BETTER in the meantime than the person using unencrypted comms or something like that. and also it’s very rare that a new exploit for some tool or another is SO SERIOUS that everyone is suddenly screwed. usually it’s something smaller that gets patched quickly, and everything goes back to relative normal.

that said, whatsapp is safer than telegram (to answer @Sarah_in_Oregon’s question). telegram has a shady history of decrypting data on their servers and sharing it with eg the Iranian govt. whatsapp’s content encryption is strong even if they are collecting your metadata. it’s still better than most alternatives.

1 Like

#14

Hey all, one of my student workers came to me and asked if I could help with his new phone, a Motorola G6 that came with a lot of Amazon bloatware installed. He's been able to remove a lot of it, but there's something called Amazon Widget that he can't uninstall, and it pushes a bunch of ads despite his attempts to lock down the settings. I was looking at his phone and couldn't figure out how to disable that app. Any ideas, other than rooting the phone?

0 Likes

#15

Usually you have to root to get that stuff off.

I assume this was a coordinated plant post to demonstrate the value of FOSS. :smile:

2 Likes

#16

I found this, which you may have already found, that doesn’t require rooting the device: https://droidgiga.com/disable-amazon-ads-moto-g6-plus-play-remove-bloatware/

But I have never tried this, so I don't know if it's legit or not.
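If he'd rather not install anything else on the phone, another route I've seen suggested (haven't tried it on a G6 myself, so treat this as a sketch) is disabling the package over adb from a computer, which doesn't require root. The package name below is just a placeholder; the first command shows what's actually installed:

```
# from a computer with adb installed and USB debugging enabled on the phone
adb shell pm list packages amazon
# -> prints any installed package whose name contains "amazon"

# disable one of them for the main user profile; "com.example.amazonwidget"
# is a placeholder -- substitute whatever package name the list command shows
adb shell pm disable-user --user 0 com.example.amazonwidget
```

If disabling something turns out to break another feature, re-enabling it with `adb shell pm enable` and the same package name should bring it back.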

0 Likes

#17

Thank you! Yeah, he’s leery of downloading an app to try to remove an app, but I think that’s probably the way to go.

Actually, @alison, I was also wondering about something else mobile-related. A few months ago I was talking to a programmer friend who had just released a new app for the company that he works for, and he was happy because he had only gotten two crash reports. When I asked him what he saw in crash reports, he said he could configure them to send all sorts of information, but for the ones he was looking at, he was mostly interested in operating system, CPU usage, available RAM, and what other apps they had open. This is fairly detailed information and it made me wonder 1) whether you’ve ever heard of anyone making deliberately glitchy software in order to harvest crash report data and 2) whether you know of places that reuse crash report data for marketing or other purposes.
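For anyone curious what that kind of report can look like, here's a rough, made-up sketch of the sort of payload a configurable crash reporter might assemble. None of these field names come from a real SDK; they're just to show how much context can ride along with a "crash":

```python
# Purely illustrative: a hypothetical crash-report payload. Field names and
# values are invented to show the kind of context a reporter could collect.
crash_report = {
    "app_version": "2.3.1",
    "os": "Android 8.0",
    "device_model": "moto g(6)",
    "cpu_usage_percent": 87,        # how busy the device was at crash time
    "available_ram_mb": 412,        # free memory remaining
    "foreground_app": "com.example.notes",  # hypothetical package name
    "other_running_apps": ["com.example.chat", "com.example.browser"],
    "stack_trace": "(omitted)",
}
```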

0 Likes

#18

hey @clobdell, I haven't heard anything about either of those uses, but I have seen some stuff about how, for example, Google is finally starting to treat crash reports as sensitive data collection: https://www.androidpolice.com/2018/03/22/google-warning-developers-include-prominent-crash-reporting-disclosures-apps-face-removal/

1 Like
