Ugh, I hate when people say this. It’s definitely not true of the students I work with, and I remember reading a piece about the elaborate Instagram steganography that teens engage in, where they have, like, global clusters of people sharing accounts so they can mess with geotagging, and there are tons of other examples of similar things out there.
I think this is absolutely true:
Also, what is this??
The personalized learning experience stuff always seems so… idk, uninformed by actual educators to me. Like, what if there was a way for faculty to easily find open source equivalents to some of the super-pricey resources they’re familiar with and default to assigning because it’s what they know? What if we could use some of this machine learning and AI to try to identify similar resources so we can get out from under the oppressive pricing of academic publishing? It’s too bad to hear this about Canvas because it’s the one LMS that seems to be gaining some traction on our campus (we don’t officially use one, which leads to all kinds of fun inconsistencies and difficult knowledge transfer).
There’s some good news about this stuff from our little corner of the world, which is that I’ve spent some time observing a few classes at Olin where artificial intelligence and algorithms were being taught in a very context-heavy way. I visited the Artificial Intelligence class last semester, and a student gave a presentation on generative adversarial networks (the things used to make deepfakes), and they struggled to explain what good a GAN could actually do when a couple of their classmates asked. Those classmates went on to do a final project that was basically, like, a booklet for gut-checking your personal ethics as an engineer. At their final presentation, instead of printing out a poster with complex terminology and data sets, they had one-on-one conversations with visitors about what AI means to them and why they thought it was important, and they essentially did activism to show people the dangers of facial recognition and using images without consent. It was amazing.
In another class, I let students take my picture to include in a dataset they were using to recreate an algorithm, and as part of the process, they asked how I felt. What was it like to realize that I might not know what would ultimately be done with my photo? I told them it would have been disturbing if I didn’t know them and feel comfortable with and trusting of them. And they noted immediately that this was not a guarantee for people whose faces have been included in other datasets. So, I’m saying this because these kinds of moments have made me feel good not only about my job choice but also about the future of engineering and the people who will have this powerful tech in their hands. At least some of 'em might save us from ourselves.
Last thought here and then I’ll shut up - I think the understanding of AI in libraryland is really lacking, and I’ve seen a lot of presentations about “the future of AI in libraries lololol” that are, well, about the present, and they showcase a lot of things that I don’t feel particularly moved by, or that I think undermine other issues we confront in the field. For instance, there are a few AI-driven reference bots out there that keep coming up in these talks. Does anyone really think that libraries of the future will be staffed by expensive reference robots? This seems like tilting at windmills, and it directs us away from the real threat, which is austerity cutting positions and closing the library for good.
And for folks working on those technologies, what is the goal? The automation that makes sense to me in libraries is in physical components, like circulation - but that is an argument that needs to be carefully made. In my mind, the idea would be to prevent repetitive stress injuries and minimize the panic of having to do five things at once at the circ desk, freeing people up to pay fuller attention and provide better service to patrons. But that might not be how other directors or administrators see it at all - for them, that could be a dollar-signs-in-the-eyes moment, an excuse for position cuts.