
Aristos Education Services
Episode 4 - Cyber Civil Liberties

Michael:
Welcome to the Cyberethics Podcast. I'm joined today by a very special guest. Rory Mir is the Associate Director of Community Organizing at the Electronic Frontier Foundation. They coordinate EFF's support of local advocacy groups, primarily through the grassroots information-sharing network, the Electronic Frontier Alliance. Prior to joining EFF, Rory studied activist pedagogy and adolescent use of social media as a doctoral student in psychology. As a student, instructor, and researcher, they advocated for student and worker privacy, open science, and open education on campus. They were also active in several New York City community projects, like the CyPurr Collective, an EFA member group focused on accessible digital security trainings. Welcome, Rory.
Rory:
Thanks so much for having me.
Michael:
Well, it's our pleasure. So to get started, I'd like you to just give us a brief history of the Electronic Frontier Foundation and some examples of the kinds of projects that they work on.
Rory:
Sure. So yeah, the Electronic Frontier Foundation is a nonprofit civil liberties law firm with about thirty-three years of history behind it, and it's had a lot of different iterations in that time. It started off as a project answering new infringements on folks' digital rights. It most famously kicked off with the Secret Service raiding Steve Jackson Games, a small game maker in Texas, under suspicion of file sharing and whatnot. Our founders basically identified this as, hey, this is a clear infringement, but it's not getting the same legal protections because it's in the digital space, and they rallied support for legal representation. EFF's early days were really meant to be a foundation, though it's no longer really a foundation, for getting legal support to folks facing those sorts of issues. Then it iterated into, well, we'll have some hired lawyers who can build expertise in that area and also be a law firm, and that was the case for a while. It tried to have chapter organizations, so there are a few other electronic frontier locals still around. But then it came to a point, close to the late nineties, of: we need to actually advocate for better legislation, not just fight it in the courts, even though the courts are really important to the lawyers. If you have better legislation, you can make better legal arguments. That's when it really expanded into what it is today: essentially a civil-liberties-defending lobbying group that wants to make sure our civil rights apply even when a computer is involved, which is often treated as this magical thing, as if people's rights somehow can't be infringed when there's a computer. We've been doing that ever since, and we've been growing really rapidly more recently in light of the growing threats. Our mission is to make sure that technology works for everyone in the world, and to defend privacy and innovation.
And yeah, digital technology is now touching every subject, so our scope has definitely expanded quite a bit in that time as well.
Michael:
Yeah, it makes a lot of sense that the organization has grown in light of the, as you say, increasingly online lives that we live. As you mentioned, digital aspects of existence are creeping into ever new areas as the technology advances. Thank you for that. Tell us a little bit about your own background. How did you come to the EFF?
Rory:
Yeah, so I was always into nerdy stuff like hacking and putting together my own server and things like that growing up. But I was really on a different path: how digital technology can improve education equity. I had kind of a roundabout path, but that's where I landed, going to grad school for that. In that time, I really dove into community education as a form of praxis, a way of taking these often political things that require action, and improving our lives and making technology more equitable through grassroots education. I started doing that as a side project while procrastinating during my doctoral studies; so much great organizing work happens with procrastinating grad students. During that time, I co-founded the CyPurr Collective, a digital security education group that was initially really focused on activists, helping improve the OPSEC, the operational security, of activists we were working with. Then we had a good arrangement with the public library to do regular trainings just for the public, and we found that was actually what made our work accelerate. We were doing it more regularly, and we were seeing more regular cases of the concerns folks were having. In that work, that group joined the Electronic Frontier Alliance right when it started in 2016, which, as you mentioned in the intro, is an information-sharing grassroots network that EFF maintains. Through that we got a lot more promotion, we were able to meet a lot more of the other groups, and we kind of upped our trainings and upped our presence. In doing that work we got to know EFF more immediately, not just as a fan from afar, but actually meeting folks working here. Eventually a job opened up on the activism team that worked with the EFA.
I joined that team in twenty twenty and I've been with EFF since.
Michael:
Well, that's great. Sounds like a great fit.
Rory:
Yeah, it really was. I wouldn't have left my grad studies for any other job.
Michael:
Yeah. And I mean, the education component is so important, right? Especially, like we said, with this kind of technology that's ever-changing, ever-developing at a pace that's difficult even for the professionals to keep up with. People who are just trying to live their lives, and who have these necessary digital components in them, really value that kind of education: having someone who can translate what the cutting-edge advances actually mean for those of us trying to live our everyday lives.
Rory:
For sure. Definitely.
Michael:
Perfect. So towards that point, I understand that one of the projects that you're working on, or one of the topics that you're always interested in, is privacy concerns. So just tell us a little bit about some of your work and how it touches on concerns surrounding privacy.
Rory:
Definitely. And that definitely comes from that background of meeting with everyday folks and hearing the concerns they're having. As much as folks can succumb to privacy nihilism, where they're like, the big companies have everything anyway, who cares, you then meet people who are actually being impacted by that surveillance, which anyone can be. When you're being surveilled, that information can always be used against you. At different points, people encounter those sharp points of: oh, this really abusive person in my life is manipulating my phone in ways I don't understand; or simply, this information about me is online and I have no real way of controlling it. Or even just those creepy moments where you search for something online and then get a bunch of advertisements about that thing on a bunch of different apps. Different degrees of things raise that concern. So I definitely see it as really important. Privacy is essential for preserving individual autonomy in society, but also collective autonomy. There's a lot of treating people like islands in digital security, where if you just harden everything, if you install Linux and only use Tor and all these really difficult things, then you'll be this perfectly preserved egg. But at the end of the day, privacy is a team sport. You're only as private and secure as the folks in your community and the folks you're working with. So yeah, there are the individual trainings: risk mitigation, making sure folks understand the technology they're using. There's encouraging folks to work together and form new norms in how they're working together.
And then there's the big societal thing: it does require advocacy at the end of the day to ensure that, you know, your Ring doorbell isn't just giving information to the police if you don't want it to.
Michael:
Right. Yeah, those are a couple of interesting and really important points you make there. I like your phrase "privacy nihilism." I think that's a problem a lot of people suffer from, right? They sort of throw up their hands and feel like it's a battle that's already been lost, and maybe they don't see the significance of it. But it's an important point that you raise. Some people learn too late the significance of not having proper privacy protections. And then there's the knock-on effect. Like you said, we are an interconnected web, so our privacy decisions have an effect outside of our personal lives.
Rory:
Yeah, and for some folks, privacy frankly isn't as important, because they're in a relatively safe or privileged position in society and can kind of just ignore it. But even for those folks, I'd really encourage thinking about the other people in your life. You don't actually know to what extent folks have things they need to protect. So just out of compassion for the other people in your community and in your life, doing some kind of basic privacy work is, I think, really essential.
Michael:
Right. Yeah, absolutely. So you might be from a population that doesn't regularly face difficulty from the authorities, but that doesn't mean that other populations aren't facing that leverage.
Rory:
Not to be too doom and gloom, but you might be OK with the current powers that be. You might trust current tech executives and current leaders in government. But information lasts longer than that. New powers can arise in your lifetime that you don't trust as much, and that information doesn't typically expire, and can still be used against you.
Michael:
Yeah, that's an incredibly important point, I think. So not only in terms of corporations, but also governments, right? You might be fine with your government having that information about you now, but regimes change. And do you want it to be the de facto position that whatever regime has power has access to this information about you and the population?
Rory:
Yeah. And frankly, we saw a lot of that. I wasn't with EFF at the time, but in twenty sixteen there was a big growth in support for EFF. And I experienced it as a trainer: we went from a kind of fringe, nerdy meetup group to doing multiple trainings a month, because everyone was really freaked out by Trump winning the election and what that meant for their personal security and safety. So, you know, it's always around the corner, maybe not when you expect it.
Michael:
Context can change pretty quickly.
Rory:
For sure. Yeah.
Michael:
Perfect. Another topic we mentioned last time we were chatting was concerns around artificial intelligence, right? It's the hot topic these days, and as much excitement as there is, there are also concerns that come with any new, prevalent technology. With artificial intelligence, one of the initial concerns that's been raised by content creators is intellectual property rights and how those are affected. Could you speak a little bit to that topic?
Rory:
Yeah, definitely. So the really core concerns with artificial intelligence, as I think of it: one is the underlying data set, which I think is what this is getting at. We want to make sure it's transparent and, whenever possible, consensual for the folks in that data set. There are even privacy concerns that come out of that. If an AI is trained on personal information, let's say your personal emails or something, there's the risk of it regurgitating snippets of that personal information to someone else, because it forms this mosaic of the underlying data it's trained on. It also hallucinates, so there's a little bit of plausible deniability if it does do that. But it's that kind of thinking: people want to have a say in whether or not they're in a data set. At the same time, somewhat counter to that, free expression online includes being able to scrape information and remix it and put it together in a new way. So I think there are a lot of questionable copyright claims being thrown around as a way to combat the negative uses of AI. Not to say those impacts aren't negative, but copyright likely isn't the best tool for it. A really current example is with workers' rights: basically the fear that AI is going to train on a screenwriter's writing and then just write for the screenwriter, and the studio will pocket all that money because ChatGPT wrote it instead of a writer. That sort of concern is very present, and it really falls under labor. I think labor regulations around how AI is used are probably the most appropriate way to address it. Throwing copyright onto it has a knock-on effect that would affect free expression not involving AI. So yeah, we're wary of those precedents.
Michael:
That's very interesting. I hadn't thought about that side of things. Obviously, I thought about, you know, our remix culture and the right to reuse things. But I hadn't thought about the viability of approaching AI or large language models and trying to sue them for copyright infringement or something like that. And the knock-on effect on what we might consider valid or accepted reuse and remixing.
Rory:
Yeah. And one of the main ones coming up is basically using personas as a new copyrighted class of things, so that you can't use someone's face in CG, or use AI to put it on an actor or something, and then not pay the actor for it. Again, I think that's more of a labor concern. Having that sort of wide-reaching clause in an actor's contract, "by the way, we can use your face forever," that kind of right-of-publicity thing, is concerning. But being able to use a public figure's face is essential for expression. If you want to do a satire of, say, the president or some other powerful person, being able to use their face in an animation is protected expression. So generally, copyright is not the right tool for addressing very legitimate labor concerns in a lot of cases.
Michael:
Right. That's very interesting. And then another topic I'd like to touch on with you: I know, when you were talking about your background and the way you came up, that education is a passion of yours. So did you want to talk a little bit about open education?
Rory:
Yeah. So this is something I learned more about during my graduate studies. I was always into open source as a concept, this idea of everyone collaborating on a transparent project together; again, doing little nerdy things like hosting servers as a teenager. But it wasn't until I was in grad school, learning more about critical pedagogy, and part of a program that talked specifically about critical use of technology in pedagogy, that I learned more about open education and open access. I think it has a lot of intuitive appeal, especially on the open access side: the fruits of publicly funded research, the findings, should be shared with the public, and science is inherently collaborative. It doesn't really make sense to hide results from other scientists if you're working toward the same end. And I think the same really holds for education. Teachers aren't usually particularly protective of their lesson plans. I experienced this myself: I would have had a much harder time teaching if other instructors hadn't just given me their lesson plans and said, here's everything I've done, make it your own, letting me pick and choose from the different things I saw. So that collaborative process is already there. What open education does is let that material be posted online and be more open, so it doesn't just rely on the connections people happen to have to different instructors. And that has the same empowering effect that open access does when we talk about things on a global scale: someone who doesn't have a friend who works at Columbia or something can still access these high-quality materials based on current research. And then, for the students, it's free textbooks, not making students pay hundreds of dollars for often outdated textbooks.
These are easy to keep current, and they're more peer-reviewed because they're written more collaboratively. It's just less well known. Generally, when instructors learn about it, they're very quick to adopt it in their own classrooms.
Michael:
Right. That's interesting. Yeah, I know that there's also a sort of push amongst educators or some pressure being put on colleagues to publish with journals that are open access rather than the ones that are behind paywalls.
Rory:
Definitely. Yeah. And publishers are now finding these sorts of mitigations, like it'll be open access after a few years, these different halfway points, because it's a popular idea and they have to adapt to it. Unfortunately, one of the adaptations, and there's a group called SPARC, the Scholarly Publishing and Academic Resources Coalition, that has recently been writing about this, is that publishers, in adapting to the popularity of open access, are making articles more readily available, but on locked-down platforms that actually surveil academics and collect information about who's reading what and who's being productive with what. They recently released an interesting report on Elsevier and its data collection. So we're seeing publishers meeting the open access movement halfway with more surveillance-heavy offerings.
Michael:
And is that with a model to try to then monetize the data that they're collecting?
Rory:
That seems to be. Yeah.
Michael:
Instead of subscriptions to the journal.
Rory:
Yeah.
Michael:
It's concerning.
Rory:
Right. And it's part of the same kind of fake open access and fake open education alternatives, where you buy a subscription to a suite of things that gives you free access, but you can't then take it and remix it, and it's not free. There's all this embracing of open access because it's popular, but in ways that still let them leverage control.
Michael:
Right. Well, that's interesting. And I'm sure it's going to be an area of growth. As you mentioned, more and more journals are going to try to find alternatives to the revenue they were getting through per-journal charges or subscriptions.
Rory:
Definitely. But I do see it as a mark of success for the open access and open education movements that people are aware of it. They see it as really valuable; when they have good research, they'll often advocate, I'm only going to publish this if it's open access. And as more academics do that, this information will become more readily available. And again, to highlight the open education side: not only does it help universities in parts of the world that might have fewer resources, it's also nice for people who aren't in university and are just trying to learn things on their own. Having those educational materials lets people build expertise and introduce themselves to an area on their own.
Michael:
Absolutely. Yeah. It fits much more nicely with the concept of these things being public institutions, right?
Rory:
Exactly. Yeah.
Michael:
Thank you so much for all that, Rory. That was very informative. A lot of our listeners will be people who are about to enter digital professions in various capacities. Given the wide net that EFF casts and the experience that you have, is there any advice you'd give to people who are about to enter the digital professional sphere?
Rory:
Yeah. I mean, I think thinking about ethics going into these careers is essential; it's something a surprising number of programs don't teach or prioritize. And remember that you, as a worker at these companies, have a lot of say. We've seen a lot of good pushback from developers when a company wants them to implement an unpopular or invasive feature. So think about: is my work empowering users? Is it allowing for users' autonomy? Think about those questions at every step of the development process, not just at the very end when there's a PR push, and advocate for changes in your own workplace. Of course, I have to say, going to EFF and keeping up to date on the current issues and our positions on things can be a helpful guide in that process. But also, in your own community, as a way of giving back, tech professionals really have a unique ability to go into, say, an activist space, or just a community group they're sympathetic toward, offer that expertise to regular people, hear how they're experiencing the tech and what difficulties they're running into, and take that back into their own work. Generally, people are happy to complain about tech and the ways it's annoying to them, and it's often usability; usability and access are really important. But sometimes people will voice: why can't I just do what I want to do? Why can't I read this book I bought in Germany on my American Kindle? Let those frustrations lead your work, and speak up in the workplace when it's not doing right by the user.
Michael:
Right. That's great advice. Yeah, I think sometimes people feel a bit lost, especially if they're part of a larger organization. But I think you're right: the people who are there at the grassroots, the fundamental level, can have an impact.
Rory:
Definitely. Yeah, and I definitely think that inside-outside approach is really essential to moving the needle: advocating in the workplace, and working with folks outside of the workplace to understand the needs, and maybe even doing some advocacy with EFF.
Michael:
Right. Well, that leads perfectly onto my next question. I was going to ask if you want to tell people where they can learn more about EFF or some of these projects and any other resources you'd like to promote.
Rory:
Definitely. So you can become a member of EFF at eff.org/donate. You can follow our writings at eff.org/deeplinks; we have RSS and all that, if you'd like. And if you'd like to get involved in a more hands-on way, I administer that grassroots network I mentioned earlier, the EFA, at eff.org/efa. There you can learn a little more about organizing, and we have a few toolkits to get you started. You can reach out to us at organizing@eff.org. Unfortunately, we are currently just in the US, so Canadian groups wouldn't be able to join the EFA. That said, you're always welcome to reach out to us; we're always excited to help people and guide them in their local advocacy work. And if you ever find yourself in hot water and maybe need legal representation, you can always reach out to info@eff.org. Our legal intake coordinator is really great; you'll get a quick response, and either EFF will take your case or we'll make sure you're referred to a volunteer attorney who will take it on our behalf. So those are the best ways to get connected. And like I said, reach out to organizing@eff.org, or rory@eff.org, if you have any questions or want to get involved with EFF. I'm happy to chat with people.
Michael:
Excellent. Thanks so much, and we'll put links to all of those in our show notes. I want to thank you, Rory, for coming on today. We covered a wide gamut of topics, from privacy concerns to AI and IP to open education. I really appreciate your valuable insights on this wide range of topics. So thanks so much for joining us today.
Rory:
Yeah. Thank you for having me and for raising these issues.
