Thousands of Young People Told Us Why the Kids Online Safety Act Will Be Harmful to Minors

With KOSA passed, the information i can access as a minor will be limited and censored, under the guise of "protecting me", which is the responsibility of my parents, NOT the government. I have learned so much about the world and about myself through social media, and without the diverse world i have seen, i would be a completely different, and much worse, person. For a country that prides itself in the free speech and freedom of its peoples, this bill goes against everything we stand for! - Alan, 15  

___________________

If information is put through a filter, that’s bad. Any and all points of view should be accessible, even if harmful so everyone can get an understanding of all situations. Not to mention, as a young neurodivergent and queer person, I’m sure the information I’d be able to acquire and use to help myself would be severely impacted. I want to be free like anyone else. - Sunny, 15 

 ___________________

How young people feel about the Kids Online Safety Act (KOSA) matters. It will primarily affect them, and many, many teenagers oppose the bill. Some have been calling and emailing legislators to tell them how they feel. Others have been posting their concerns about the bill on social media. These teenagers have been baring their souls to explain how important social media access is to them. But lawmakers and civil liberties advocates, including us, have mostly been the ones talking about the bill and about what’s best for kids; too often, minors themselves are missing from these debates entirely. They shouldn’t be: these young voices should be essential to any conversation about KOSA.

So, a few weeks ago, we asked some of the young advocates fighting to stop the Kids Online Safety Act a few questions:  

- How has access to social media improved your life? What do you gain from it? 

- What would you lose if KOSA passed? How would your life be different if it was already law? 

Within a week we received over 3,000 responses. As of today, we have received over 5,000.

These answers are critical for legislators to hear. Below, you can read some of these comments, sorted into the following themes (though they often overlap):

- KOSA will harm rights that young people know they ought to have
- KOSA could impact young people’s artistic education and opportunities
- KOSA will hurt young people’s ability to find community online
- KOSA could seriously hinder people’s self-discovery
- KOSA could stop young people from getting accurate news and valuable information

These comments show that thoughtful young people are deeply concerned about the proposed law’s fallout, and that many who would be affected think it will harm them, not help them. Over 700 of those who responded reported that they were currently sixteen or under, the ages to which KOSA’s liability provisions would apply. The average age of those who answered the survey was 20 (among those who gave their age; the question was optional, and about 60% of people answered it). In addition to these two questions, we asked those taking the survey if they were comfortable sharing their email address with any journalist who might want to speak with them; unfortunately, coverage usually mentions only one or two of the young people who would be most affected. So, journalists: we have contact info for over 300 young people who would be happy to speak to you about why social media matters to them, and why they oppose KOSA.

Individually, these answers show that social media, despite its current problems, offers an overall positive experience for many, many young people. It helps people living in remote areas find connection; it helps those in abusive situations find solace and escape; it offers education in history, art, health, and world events for those who wouldn’t otherwise have it; and it helps people learn about themselves and the world around them. (Research also suggests that social media is more helpful than harmful for young people.)

And as a whole, these answers tell a story that is the opposite of the one regularly told by politicians and the media. In those stories, it is accepted as fact that most of young people’s experiences on social media platforms are harmful. But these responses make clear that many, many young people also find help, education, friendship, and a sense of belonging there, precisely because social media allows them to explore, and that exploration is exactly what KOSA is likely to hinder. These kids are deeply engaged in the world around them through these platforms, and genuinely concerned that a law like KOSA could take that away from them and from other young people.

Here are just a few of the thousands of reasons they’re worried.  

Note: We are sharing individuals’ opinions, without editing. We do not necessarily endorse them or their interpretation of KOSA.

KOSA Will Harm Rights That Young People Know They Ought to Have 

One of the most important things that would be lost is the freedom of speech - a given right that is crucial to a healthy, functioning environment. Not every speech is morally okay, but regulating what speech is deemed "acceptable" constricts people's rights; a clear violation of the First Amendment. Those who need or want to access certain information are not allowed to - not because the information will harm them or others, but for the reason that a certain portion of people disagree with the information. If the country only ran on what select people believed, we would be a bland, monotonous place. This country thrives on diversity, whether it be race, gender, sex, or any other personal belief. If KOSA was passed, I would lose my safe spaces, places where I can go to for mental health, places that make me feel more like a human than just some girl. No more would I be able to fight for ideas and beliefs I hold, nor enjoy my time on the internet either. - Anonymous, 16 

 ___________________

I, and many of my friends, grew up in an Internet where remaining anonymous was common sense, and where revealing your identity was foolish and dangerous, something only to be done sparingly, with a trusted ally at your side, meeting at a common, crowded public space like a convention or a college cafeteria. This bill spits in the face of these very practical instincts, forces you to dox yourself, and if you don’t want to be outed, you must be forced to withdraw from your communities. From your friends and allies. From the space you have made for yourself, somewhere you can truly be yourself with little judgment, where you can find out who you really are, alongside people who might be wildly different from you in some ways, and exactly like you in others. I am fortunate to have parents who are kind and accepting of who I am. I know many people are nowhere near as lucky as me. - Maeve, 25 

 ___________________ 

I couldn't do activism through social media and I couldn't connect with other queer individuals due to censorship and that would lead to loneliness, depression other mental health issues, and even suicide for some individuals such as myself. For some of us the internet is the only way to the world outside of our hateful environments, our only hope. Representation matters, and by KOSA passing queer children would see less of age appropriate representation and they would feel more alone. Not to mention that KOSA passing would lead to people being uninformed about things and it would start an era of censorship on the internet and by looking at the past censorship is never good, its a gateway to genocide and a way for the government to control. – Sage, 15 

  ___________________

Privacy, censorship, and freedom of speech are not just theoretical concepts to young people. Their rights are often already restricted, and they see the internet as a place where they can begin to learn about, understand, and exercise those freedoms. They know why censorship is dangerous; they understand why forcing people to identify themselves online is dangerous; they know the value of free speech and privacy, and they know what they’ve gained from an internet that doesn’t have guardrails put up by various government censors.  

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

KOSA Could Impact Young People’s Artistic Education and Opportunities 

I found so many friends and new interests from social media. Inspirations for my art I find online, like others who have an art style I admire, or models who do poses I want to draw. I can connect with my friends, send them funny videos and pictures. I use social media to keep up with my favorite YouTubers, content creators, shows, books. When my dad gets drunk and hard to be around or my parents are arguing, I can go on YouTube or Instagram and watch something funny to laugh instead. It gives me a lot of comfort, being able to distract myself from my sometimes upsetting home life. I get to see what life is like for the billions of other people on this planet, in different cities, states, countries. I get to share my life with my friends too, freely speaking my thoughts, sharing pictures, videos, etc.  
I have found my favorite YouTubers from other social media platforms like tiktok, this happened maybe about a year ago, and since then I think this is the happiest I have been in a while. Since joining social media I have become a much more open minded person, it made me interested in what others lives are like. It also brought awareness and educated me about others who are suffering in the world like hunger, poor quality of life, etc. Posting on social media also made me more confident in my art, in the past year my drawing skills have immensely improved and I’m shocked at myself. Because I wanted to make better fan art, inspire others, and make them happy with my art. I have been introduce to many styles of clothing that have helped develop my own fun clothing style. It powers my dreams and makes me want to try hard when I see videos shared by people who have worked hard and made it. - Anonymous, 15 

  ___________________

As a kid I was able to interact in queer and disabled and fandom spaces, so even as a disabled introverted child who wasn’t popular with my peers I still didn’t feel lonely. The internet is arguably a safer way to interact with other fans of media than going to cons with strangers, as long as internet safety is really taught to kids. I also get inspiration for my art and writing from things I’ve only discovered online, and as an artist I can’t make money without the internet and even minors do commissions. The issue isn’t that the internet is unsafe, it’s that internet safety isn’t taught anymore. - Rachel, 19 

  ___________________

i am an artist, and sharing my things online makes me feel happy and good about myself. i love seeing other people online and knowing that they like what i make. when i make art, im always nervous to show other people. but when i post it online i feel like im a part of something, and that im in a community where i feel that i belong. – Anonymous, 15 

 ___________________ 

Social media has saved my life, just like it has for many young people. I have found safe spaces and motivation because of social media, and I have never encountered anything negative or harmful to me. With social media I have been able to share my creativity (writing, art, and music) and thoughts safely without feeling like I'm being held back or oppressed. My creations have been able to inspire and reach so many people, just like how other people's work have reached me. Recently, I have also been able to help the library I volunteer at through the help of social media. 
What I do in life and all my future plans (career, school, volunteer projects, etc.) surrounds social media, and without it I wouldn't be able to share what I do and learn more to improve my works and life. I wouldn't be able to connect with wonderful artists, musicians, and writers like I do now. I would be lost and feel like I don't have a reason to do what I do. If KOSA is passed, I wouldn't be able to get the help I need in order to survive. I've made so many friends who have been saved because of social media, and if this bill gets passed they will also be affected. Guess what? They wouldn't be able to get the help they need either. 
If KOSA was already a law when I was just a bit younger, I wouldn't even be alive. I wouldn't have been able to reach help when I needed it. I wouldn't have been able to share my mind with the world. Social media was the reason I was able to receive help when I was undergoing abuse and almost died. If KOSA was already a law, I would've taken my life, or my abuser would have done it before I could. If KOSA becomes a law now, I'm certain that the likeliness of that happening to kids of any age will increase. – Anonymous, 15 

  ___________________

A huge number of young artists told us they use social media to improve their skills, and that in many cases it was the avenue by which they discovered their interest in a type of art or music. Young people are rightfully worried that the magic moment when you first stumble upon an artist or a style that changes your entire life will become less and less common for future generations if KOSA passes. We agree: KOSA would likely lead platforms to limit young people’s opportunities to experience unexpected things, forcing their online experiences into a much smaller box under the guise of protecting them.

Also, many young people told us they wanted to start, or were already building, an online business, often an art business. Under KOSA, young people could have fewer opportunities in the online communities where artists share their work and build a customer base, and a harder time navigating those spaces at all.

KOSA Will Hurt Young People’s Ability to Find Community Online 

Social media has allowed me to connect with some of my closest friends ever, probably deeper than some people in real life. i get to talk about anything i want unimpeded and people accept me for who i am. in my deepest and darkest moments, knowing that i had somewhere to go was truly more relieving than anything else. i've never had the courage to commit suicide, but still, if it weren't for social media, i probably wouldn't be here, mentally & emotionally at least. 
i'd lose the space that accepts me. i'd lose the only place where i can be me. in life, i put up a mask to appease my parents and in some cases, my friends. with how extreme the u.s. is becoming these days, i could even lose my life. i would live my days in fear. i'm terrified of how fast this country is changing and if this bill passes, saying i would fall into despair would be an understatement. people say to "be yourself", but they don't understand that if i were to be my true self tomorrow, i could be killed. – march, 14 

 ___________________ 

Without the internet, and especially the rhythm gaming community which I found through Discord, I would've most likely killed myself at 13. My time on here has not been perfect, as has anyone's but without the internet I wouldn't have been the person I am today. I wouldn't have gotten help recognizing that what my biological parents were doing to me was abuse, the support I've received for my identity (as queer youth) and the way I view things, with ways to help people all around the world and be a more mindful ally, activist, and thinker, and I wouldn't have met my mom. 
I love my chosen mom. We met at a Dance Dance Revolution tournament in April of last year and have been friends ever since. When I told her that she was the first person I saw as a mother figure in my life back in November, I was bawling my eyes out. I'm her mije, and she's my mom. love her so much that saying that doesn't even begin to express exactly how much I love her.  
I love all my chosen family from the rhythm gaming community, my older sisters and siblings, I love them all. I have a few, some I talk with more regularly than others. Even if they and I may not talk as much as we used to, I still love them. They mean so much to me. – X86, 15 

  ___________________

i spent my time in public school from ages 9-13 getting physically and emotionally abused by special ed aides, i remember a few months after i left public school for good, i saw a post online that made me realize that what i went through wasn’t normal. if it wasn’t for the internet, i wouldn’t have come to terms with my autism, i would have still hated myself due to not knowing that i was genderqueer, my mental health would be significantly worse, and i would probably still be self harming, which is something i stopped doing at 13. besides the trauma and mental health side of things, something important to know is that spaces for teenagers to hang out have been eradicated years ago, minors can’t go to malls unless they’re with their parents, anti loitering laws are everywhere, and schools aren’t exactly the best place for teenagers to hang out, especially considering queer teens who were murdered by bullies (such as brianna ghey or nex benedict), the internet has become the third space that teenagers have flocked to as a result. – Anonymous, 17 

  ___________________

KOSA is anti-community. People online don’t only connect over shared interests in art and music—they also connect over the difficult parts of their lives. Over and over again, young people told us that one of the most valuable parts of social media was learning that they were not alone in their troubles. Finding others in similar circumstances gave them a community, as well as ideas to improve their situations, and even opportunities to escape dangerous situations.  

KOSA will make this harder. As platforms limit the types of recommendations and public content they feel safe sharing with young people, those who would otherwise find communities or potential friends will not be as likely to do so. A number of young people explained that they simply would never have been able to overcome some of the worst parts of their lives alone, and they are concerned that KOSA’s passage would stop others from ever finding the help they did. 

KOSA Could Seriously Hinder People’s Self-Discovery  

I am a transgender person, and when I was a preteen, looking down the barrel of the gun of puberty, I was miserable. I didn't know what was wrong I just knew I'd rather do anything else but go through puberty. The internet taught me what that was. They told me it was okay. There were things like haircuts and binders that I could use now and medical treatment I could use when I grew up to fix things. The internet was there for me too when I was questioning my sexuality and again when my mental health was crashing and even again when I was realizing I'm not neurotypical. The internet is a crucial source of information for preteens and beyond and you cannot take it away. You cannot take away their only realistically reachable source of information for what the close-minded or undereducated adults around them don't know. - Jay, 17 

   ___________________

Social media has improved my life so much and led to how I met my best friend, I’ve known them for 6+ years now and they mean so much to me. Access to social media really helps me connect with people similar to me and that make me feel like less of an outcast among my peers, being able to communicate with other neurodivergent queer kids who like similar interests to me. Social media makes me feel like I’m actually apart of a community that won’t judge me for who I am. I feel like I can actually be myself and find others like me without being harassed or bullied, I can share my art with others and find people like me in a way I can’t in other spaces. The internet & social media raised me when my parents were busy and unavailable and genuinely shaped the way I am today and the person I’ve become. – Anonymous, 14 

   ___________________

The censorship likely to come from this bill would mean I would not see others who have similar struggles to me. The vagueness of KOSA allows for state attorney generals to decide what is and is not appropriate for children to see, a power that should never be placed in the hands of one person. If issues like LGBT rights and mental health were censored by KOSA, I would have never realized that I AM NOT ALONE. There are problems with children and the internet but KOSA is not the solution. I urge the senate to rethink this bill, and come up with solutions that actually protect children, not put them in more danger, and make them feel ever more alone. - Rae, 16 

  ___________________ 

KOSA would effectively censor anything the government deems "harmful," which could be anything from queerness and fandom spaces to anything else that deviates from "the norm." People would lose support systems, education, and in some cases, any way to find out about who they are. I'll stop beating around the bush, if it wasn't for places online, I would never have discovered my own queerness. My parents and the small circle of adults I know would be my only connection to "grown-up" opinions, exposing me to a narrow range of beliefs I would likely be forced to adopt. Any kids in positions like mine would have no place to speak out or ask questions, and anything they bring up would put them at risk. Schools and families can only teach so much, and in this age of information, why can't kids be trusted to learn things on their own? - Anonymous, 15 

   ___________________

Social media helped me escape a very traumatic childhood and helped me connect with others. quite frankly, it saved me from being brainwashed. – Milo, 16 

   ___________________

Social media introduced me to lifelong friends and communities of like-minded people; in an abusive home, online social media in the 2010s provided a haven of privacy, safety, and information. I honed my creativity, nurtured my interests and developed my identity through relating and talking to people to whom I would otherwise have been totally isolated from. Also, unrestricted internet access actually taught me how to spot shady websites and inappropriate content FAR more effectively than if censorship had been at play like it is today. 
A couple of the friends I made online, as young as thirteen, were adults; and being friends with adults who knew I was a child, who practiced safe boundaries with me yet treated me with respect, helped me recognise unhealthy patterns in predatory adults. I have befriended mothers and fathers online through games and forums, and they were instrumental in preventing me being groomed by actual pedophiles. Had it not been for them, I would have wound up terribly abused by an "in real life" adult "friend". Instead, I recognised the differences in how he was treating me (infantilising yet praising) vs how my adult friends had treated me (like a human being), and slowly tapered off the friendship and safely cut contact. 
As I grew older, I found a wealth of resources on safe sex and sexual health education online. Again, if not for these discoveries, I would most certainly have wound up abused and/or pregnant as a teenager. I was never taught about consent, safe sex, menstruation, cervical health, breast health, my own anatomy, puberty, etc. as a child or teenager. What I found online-- typically on Tumblr and written with an alarming degree of normalcy-- helped me understand my body and my boundaries far more effectively than "the talk" or in-school sex ed ever did. I learned that the things that made me panic were actually normal; the ins and outs of puberty and development, and, crucially, that my comfort mattered most. I was comfortable and unashamed of being a virgin my entire teen years because I knew it was okay that I wasn't ready. When I was ready, at twenty-one, I knew how to communicate with my partner and establish safe boundaries, and knew to check in and talk afterwards to make sure we both felt safe and happy. I knew there was no judgement for crying after sex and that it didn't necessarily mean I wasn't okay. I also knew about physical post-sex care; e.g. going to the bathroom and cleaning oneself safely. 
AGAIN, I would NOT have known any of this if not for social media. AT ALL. And seeing these topics did NOT turn me into a dreaded teenage whore; if anything, they prevented it by teaching me safety and self-care. 
I also found help with depression, anxiety, and eating disorders-- learning to define them enabled me to seek help. I would not have had this without online spaces and social media. As aforementioned too, learning, sometimes through trial of fire, to safely navigate the web and differentiate between safe and unsafe sites was far more effective without censored content. Censorship only hurts children; it has never, ever helped them. How else was I to know what I was experiencing at home was wrong? To call it "abuse"? I never would have found that out. I also would never have discovered how to establish safe sexual AND social boundaries, or how to stand up for myself, or how to handle harassment, or how to discover my own interests and identity through media. The list goes on and on and on. – June, 21 

   ___________________

One of the claims that KOSA’s proponents make is that it won’t stop young people from finding the things they already want to search for. But we read dozens and dozens of comments from people who didn’t know something about themselves until they heard others discussing it—a mental health diagnosis, their sexuality, that they were being abused, that they had an eating disorder, and much, much more.  

Censorship that stops you from browsing a library’s shelves is still dangerous even if it doesn’t stop you from checking out the books you already know you want. It’s still a problem to stop young people in particular from finding new things that they didn’t know they were looking for.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

KOSA Could Stop Young People from Getting Accurate News and Valuable Information 

Social media taught me to be curious. It taught me caution and trust and faith and that simply being me is enough. It brought me up where my parents failed, it allowed me to look into stories that assured me I am not alone where I am now. I would be fucking dead right now if it weren't for the stories of my fellow transgender folk out there, assuring me that it gets better.  
I'm young and I'm not smart but I know without social media, myself and plenty of the people I hold dear in person and online would not be alive. We wouldn't have news of the atrocities happening overseas that the news doesn't report on, we wouldn't have mentors to help teach us where our parents failed. - Anonymous, 16 

  ___________________ 

Through social media, I've learned about news and current events that weren't taught at school or home, things like politics or controversial topics that taught me nuance and solidified my concept of ethics. I learned about my identity and found numerous communities filled with people I could socialize with and relate to. I could talk about my interests with people who loved them just as much as I did. I found out about numerous different perspectives and cultures and experienced art and film like I never had before. My empathy and media literacy greatly improved with experience. I was also able to gain skills in gathering information and proper defences against misinformation. More technically, I learned how to organize my computer and work with files, programs, applications, etc; I could find guides on how to pursue my hobbies and improve my skills (I'm a self-taught artist, and I learned almost everything I know from things like YouTube or Tumblr for free). - Anonymous, 15 

  ___________________ 

A huge portion of my political identity has been shaped by news and information I could only find on social media because the mainstream news outlets wouldn’t cover it. (Climate Change, International Crisis, Corrupt Systems, etc.) KOSA seems to be intentionally working to stunt all of this. It’s horrifying. So much of modern life takes place on the internet, and to strip that away from kids is just another way to prevent them from formulating their own thoughts and ideas that the people in power are afraid of. Deeply sinister. I probably would have never learned about KOSA if it were in place! That’s terrifying! - Sarge, 17 

  ___________________

I’ve met many of my friends from [social media] and it has improved my mental health by giving me resources. I used to have an eating disorder and didn’t even realize it until I saw others on social media talking about it in a nuanced way and from personal experience. - Anonymous, 15 

   ___________________

Many young people told us that they’re worried KOSA will result in more biased news online, and a less diverse information ecosystem. This seems inevitable—we’ve written before that almost any content could fit into the categories that politicians believe will cause minors anxiety or depression, and so carrying that content could be legally dangerous for a platform. That could include truthful news about what’s going on in the world, including wars, gun violence, and climate change. 

“Preventing and mitigating” depression and anxiety is not a duty we impose on any other outlet, and it shouldn’t be required of social media platforms. People have a right to access information, both news and opinion, in an open and democratic society, and sometimes that information is depressing or anxiety-inducing. To truly “prevent and mitigate” self-destructive behaviors, we must look beyond the media to systems that allow all humans to have self-respect, a healthy environment, and healthy relationships, rather than hiding truthful information simply because it is upsetting.

Young People’s Voices Matter 

While KOSA’s sponsors intend to help these young people, those who responded to the survey don’t see it that way. You may have noticed that it’s impossible to sort these complex and detailed responses into single categories: many childhood abuse victims found help as well as arts education on social media; many children connected to communities that they otherwise couldn’t have, and learned something essential about themselves in doing so. Many understand that KOSA would endanger their privacy, and also know it could harm marginalized kids the most.

In reading thousands of these comments, it becomes clear that social media was not in itself a solution to the issues these young people experienced. What helped them was other people. Social media was where they were able to find and stay connected with those friends, communities, artists, activists, and educators. When you look at it this way, of course KOSA seems absurd to them: social media has become an essential element of young people’s lives, and they are scared to death that if the law passes, that part of their lives will disappear. Older teens and twenty-somethings, meanwhile, worry that if the law had been passed a decade ago, they never would have become the people they are today. All of these fears are reasonable.

There were thousands more comments like those above. We hope this helps balance the conversation, because if young people’s voices are suppressed now—and if KOSA becomes law—it will be much more difficult for them to elevate their voices in the future.  

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

Jason Kelley

Analyzing KOSA’s Constitutional Problems In Depth 

Why EFF Does Not Think Recent Changes Ameliorate KOSA’s Censorship 

The latest version of the Kids Online Safety Act (KOSA) did not change our critical view of the legislation. The changes have led some organizations to drop their opposition to the bill, but we still believe it is a dangerous and unconstitutional censorship bill that would empower state officials to target services and online content they do not like. We respect that different groups can come to their own conclusions about how KOSA will affect everyone’s ability to access lawful speech online. EFF, however, remains steadfast in our long-held view that imposing a vague duty of care on a broad swath of online services to mitigate specific harms based on the content of online speech will result in those services imposing age verification and content restrictions. At least one group has characterized EFF’s concerns as spreading “disinformation.” We are not. But to ensure that everyone understands why EFF continues to oppose KOSA, we wanted to break down our interpretation of the bill in more detail and compare our views to those of others—both advocates and critics.  

Below, we walk through some of the most common criticisms of our position, as well as criticisms the bill itself has received, to help explain our view of KOSA’s likely impacts.

KOSA’s Effectiveness  

First, and most importantly: We have serious and important disagreements with KOSA’s advocates on whether it will prevent future harm to children online. We are deeply saddened by the stories so many supporters and parents have shared about how their children were harmed online. And we want to keep talking to those parents, supporters, and lawmakers about ways in which EFF can work with them to prevent harm to children online, just as we will continue to talk with people who advocate for the benefits of social media. We believe, and have long advocated, that comprehensive privacy protections are a better way to begin addressing the harms done to young people (and adults) who have been targeted by platforms’ predatory business practices.

EFF does not think KOSA is the right approach to protecting children online, however. As we’ve said before, we think that in practice, KOSA is likely to exacerbate the risks of children being harmed online, because it will place barriers on their ability to access lawful speech about addiction, eating disorders, bullying, and other important topics. We also think those restrictions will stifle minors who are trying to find their own communities online. We do not think the language added to KOSA to address that censorship concern solves the problem, nor does focusing KOSA’s regulation on design elements of online services cure the bill’s First Amendment problems.

Our views of KOSA’s harmful consequences are grounded in EFF’s 34-year history of both making policy for the internet and seeing how legislation plays out once it’s passed. This is also not our first time seeing the vast difference between how a piece of legislation is promoted and what it does in practice. Recently we saw this same dynamic with FOSTA/SESTA, which was promoted by politicians and the parents of  child sex trafficking victims as the way to prevent future harms. Sadly, even the politicians who initially championed it now agree that this law was not only ineffective at reducing sex trafficking online, but also created additional dangers for those same victims as well as others.   

KOSA’s Duty of Care  

KOSA’s core component requires an online platform or service that is likely to be accessed by young people to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” various harms to minors. These enumerated harms include: 

  • mental health disorders (anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors) 
  • patterns of use that indicate or encourage addiction-like behaviors  
  • physical violence, online bullying, and harassment 

Based on our understanding of the First Amendment and how all online platforms and services regulated by KOSA will navigate their legal risk, we believe that KOSA will lead to broad online censorship of lawful speech, including content designed to help children navigate and overcome the very same harms KOSA identifies.  

A line of U.S. Supreme Court cases involving efforts to prevent book sellers from disseminating certain speech, which resulted in broad, unconstitutional censorship, shows why KOSA is unconstitutional. 

In Smith v. California, the Supreme Court struck down an ordinance that made it a crime for a book seller to possess obscene material. The court ruled that even though obscene material is not protected by the First Amendment, the ordinance’s imposition of liability based on the mere presence of that material had a broader censorious effect because a book seller “will tend to restrict the books he sells to those he has inspected; and thus the State will have imposed a restriction upon the distribution of constitutionally protected, as well as obscene literature.” The court recognized that the “ordinance tends to impose a severe limitation on the public’s access to constitutionally protected material” because a distributor of others’ speech will react by limiting access to any borderline content that could get it into legal trouble.  

In Bantam Books, Inc. v. Sullivan, the Supreme Court struck down a government effort to limit the distribution of material that a state commission had deemed objectionable to minors. The commission would send notices to book distributors that identified various books and magazines they believed were objectionable and sent copies of their lists to local and state law enforcement. Book distributors reacted to these notices by stopping the circulation of the materials identified by the commission. The Supreme Court held that the commission’s efforts violated the First Amendment and once more recognized that by targeting a distributor of others’ speech, the commission’s “capacity for suppression of constitutionally protected publications” was vast.  

KOSA’s duty of care creates a more far-reaching censorship threat than those the Supreme Court struck down in Smith and Bantam Books. KOSA makes online services that host our digital speech liable should they fail to exercise reasonable care in removing or restricting minors’ access to lawful content on the topics KOSA identifies. KOSA is worse than the ordinance in Smith because, unlike the obscene material at issue there, the First Amendment generally protects speech about addiction, suicide, eating disorders, and the other topics KOSA singles out.

We think that online services will react to KOSA’s new liability in much the same way as the bookseller in Smith and the book distributor in Bantam Books: they will limit minors’ access to, or simply remove, any speech that might touch on the topics KOSA identifies, even when much of that speech is protected by the First Amendment. Worse, online services have even less ability to read through the millions (or sometimes billions) of pieces of content on their services than a bookseller or distributor who had to review hundreds or thousands of books. To comply, we expect that platforms will deploy blunt tools, either by gating off entire portions of their sites to prevent minors from accessing them (more on this below) or by deploying automated filters that will over-censor speech, including speech that may be beneficial to minors seeking help with addictions or other problems KOSA identifies. (Regardless of their claims, it is not possible for a service to accurately pinpoint the content KOSA describes with automated tools.)

But as the Supreme Court ruled in Smith and Bantam Books, the First Amendment prohibits Congress from enacting a law that results in such broad censorship precisely because it limits the distribution of, and access to, lawful speech.  

Moreover, the fact that KOSA singles out certain legal content—for example, speech concerning bullying—means that the bill creates content-based restrictions that are presumptively unconstitutional. The government bears the burden of showing that KOSA’s content restrictions advance a compelling government interest, are narrowly tailored to that interest, and are the least speech-restrictive means of advancing that interest. KOSA cannot satisfy this exacting standard.  

EFF agrees that the government has a compelling interest in protecting children from being harmed online. But KOSA’s broad requirement that platforms and services face liability for showing speech concerning particular topics to minors is not narrowly tailored to that interest. As noted above, the broad censorship that will result will effectively limit access to a wide range of lawful speech on topics such as addiction, bullying, and eating disorders. And the fact that KOSA will sweep up so much speech shows that it is far from the least speech-restrictive alternative.

Why the Rule of Construction Doesn’t Solve the Censorship Concern 

In response to censorship concerns about the duty of care, KOSA’s authors added a rule of construction stating that nothing in the duty of care “shall be construed to require a covered platform to prevent or preclude:”  

  • minors from deliberately or independently searching for content, or 
  • the platforms or services from providing resources that prevent or mitigate the harms KOSA identifies, “including evidence-based information and clinical resources.”

We understand that some interpret this language as a safeguard for online services, limiting their liability if a minor happens across information on the topics KOSA identifies. On that reading, platforms hosting content aimed at mitigating addiction, bullying, or other identified harms could take comfort that they will not be sued under KOSA.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

But EFF does not believe the rule of construction will limit KOSA’s censorship, in either a practical or a constitutional sense. As a practical matter, it’s not clear how an online service could rely on the rule of construction’s safeguards given the volume and diversity of the content it likely hosts.

Take, for example, an online forum in which users discuss drug and alcohol abuse. It is likely to contain a range of user content and views, some describing addiction, drug use, and treatment in negative terms and some in positive terms. KOSA’s rule of construction might protect the forum from liability for a minor’s initial, deliberate search that leads them there. But once that minor starts interacting with the forum, they are likely to encounter the types of content KOSA identifies as harmful, and the service may face liability if there is a later claim that the minor was harmed. In short, KOSA does not clarify that the initial search precludes liability should the minor interact with the forum and experience harm later. It is also not clear how a service would prove that the minor found the forum via a search.

Further, the rule of construction’s other protection, which covers the forum if it provides only resources for preventing or mitigating drug and alcohol abuse based on evidence-based information and clinical resources, is unlikely to be helpful. That provision assumes the forum has the resources to review all of its existing content and to effectively screen all future content so that only user-generated content concerning mitigation or prevention of substance abuse is permitted. The rule of construction also requires the forum to have the subject-matter expertise necessary to judge what content is or isn’t clinically correct and evidence-based. And even that assumes there is broad scientific consensus about all aspects of substance abuse, including its causes (there is not).

Given that practical uncertainty, and the potential hazard of getting anything wrong when it comes to minors’ access to that content, we think the substance abuse forum will react much as the bookseller and distributor in the Supreme Court cases did: it will simply take steps to limit minors’ ability to access the content, a far easier and safer alternative than making case-by-case expert decisions about every piece of content on the forum.

EFF also does not believe that the Supreme Court’s decisions in Smith and Bantam Books would have been different had similar KOSA-like safeguards been incorporated into the regulations at issue. For example, even if the obscenity ordinance at issue in Smith had made an exception letting bookstores sell scientific books with detailed pictures of human anatomy, the bookstore still would have had to exhaustively review every book it sold and separate the obscene books from the scientific. The Supreme Court rejected such burdens as offensive to the First Amendment: “It would be altogether unreasonable to demand so near an approach to omniscience.”

The near-impossible standard required to review such a large volume of content, coupled with liability for letting any harmful content through, is precisely the scenario that the Supreme Court feared. “The bookseller's self-censorship, compelled by the State, would be a censorship affecting the whole public, hardly less virulent for being privately administered,” the court wrote in Smith. “Through it, the distribution of all books, both obscene and not obscene, would be impeded.” 

Those same First Amendment concerns are exponentially greater for online services hosting everyone’s speech. That is why we do not believe that KOSA’s rule of construction will prevent the broader censorship that results from the bill’s duty of care. 

Finally, we do not believe the rule of construction helps the government carry its burden under strict scrutiny of showing that KOSA is narrowly tailored or restricts less speech than necessary. Instead, the rule of construction actually heightens KOSA’s violation of the First Amendment by preferring certain viewpoints over others. It creates a legal preference for viewpoints that seek to mitigate the various identified harms, and punishes viewpoints that are neutral toward, or even mildly approving of, those harms. While EFF agrees that such speech may be awful, the First Amendment does not permit the government to make these viewpoint-based distinctions without satisfying strict scrutiny. The government cannot meet that heavy burden with KOSA.

KOSA's Focus on Design Features Doesn’t Change Our First Amendment Concerns 

KOSA supporters argue that because the duty of care and other provisions of KOSA concern an online service or platform’s design features, the bill raises no First Amendment issues. We disagree.

It’s true enough that KOSA creates liability for services that fail to “exercise reasonable care in the creation and implementation of any design feature” to prevent the bill’s enumerated harms. But the features themselves are not what KOSA’s duty of care deems harmful. Rather, the provision specifically links the design features to minors’ access to the content KOSA deems harmful. In that way, the design features serve as little more than a distraction: the duty of care provision is not concerned with design choices generally, but only with those that fail to mitigate minors’ access to information about depression, eating disorders, and the other identified topics.

Once again, the Supreme Court’s decision in Smith shows why it’s incorrect to argue that KOSA’s regulation of design features avoids the First Amendment concerns. If the ordinance at issue in Smith had regulated the way bookstores were designed, imposing liability based on where booksellers placed certain offending books in their stores (for example, in the front window), we suspect that the Supreme Court would have recognized, rightly, that the design restriction was little more than an indirect effort to unconstitutionally regulate the content. The same holds true for KOSA.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

KOSA Doesn’t “Mandate” Age-Gating, But It Heavily Pushes Platforms to Do So and Provides Few Other Avenues to Comply 

KOSA was amended in May 2023 to include language meant to ease concerns about age verification; in particular, the “Privacy Protections” section of the bill now explicitly states that age verification is not required. A covered platform, the bill says, need not implement age gating or age verification functionality to comply with KOSA.

EFF acknowledges the text of the bill and has been clear in our messaging that nothing in the proposal explicitly requires services to implement age verification. Yet it's hard to see this change as anything other than a technical dodge that will be contradicted in practice.  

KOSA creates liability for any regulated platform or service that presents certain content to minors that the bill deems harmful to them. To comply with that new liability, a platform’s options are limited: as we see it, it must either filter content for known minors or gate content so that only adults can access it. In either scenario, the linchpin is knowing every user’s age, so the platform can identify its minor users and either filter the content they see or exclude them from any content that could be deemed harmful under the law.

There’s really no way to do that without implementing age verification. Regardless of what this section of the bill says, there’s no way for platforms to block categories of content, or restrict design features, for minors without knowing which users are minors.

We also don’t think KOSA lets platforms claim ignorance if they take steps to never learn the ages of their users. If a 16-year-old user misidentifies herself as an adult and the platform does not use age verification, it could still be held liable because it should have “reasonably known” her age. A platform’s ignorance could thus work against it later, perversely incentivizing services to implement age verification at the outset.

EFF Remains Concerned About State Attorneys General Enforcing KOSA 

Another change that KOSA’s sponsors made this year was to remove the ability of state attorneys general to enforce KOSA’s duty of care standard. We respect that some groups believe this addresses concerns that some states would misuse KOSA to target minors’ access to any information that state officials dislike, including LGBTQIA+ or sex education information. We disagree that this modest change prevents that harm. KOSA still lets state attorneys general enforce other provisions, including a section requiring certain “safeguards for minors.” Among those safeguards is a requirement that platforms “limit design features” that lead to minors spending more time on a service, such as the ability to scroll endlessly through content, notifications about other content or messages, and autoplaying content.

But an attorney general could use KOSA’s design-safeguard requirements as a proxy for targeting services that host content certain officials dislike. The attorney general would simply target the same content or service they disfavored, but instead of claiming that it violated KOSA’s duty of care, the official would argue that the service failed to prevent harmful design features that minors in their state used, such as notifications or endless scrolling. We think the outcome will be the same: states are likely to use KOSA to target speech about sexual health, abortion, LGBTQIA+ topics, and a variety of other information.

KOSA Applies to Broad Swaths of the Internet, Not Just the Big Social Media Platforms 

Many sites, platforms, apps, and games would have to follow KOSA’s requirements. It applies to “an online platform, online video game, messaging application, or video streaming service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.”  

There are some important exceptions: it doesn’t apply to services that provide only direct or group messaging, such as Signal, or to schools, libraries, nonprofits, or ISPs like Comcast. This is good. Some critics of KOSA have been concerned that it would apply to websites like Archive of Our Own (AO3), a fanfiction site that allows users to read and share their work, but AO3 is a nonprofit, so it would not be covered.

But a wide variety of niche, for-profit online services would still be regulated by KOSA. Ravelry, for example, is an online platform focused on knitters, but it is a business.

And it is an open question whether the comment and community portions of major mainstream news and sports websites are subject to KOSA. The bill exempts news and sports websites, with the huge caveat that they are exempt only so long as they are “not otherwise an online platform.” KOSA defines “online platform” as “any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user generated content.” It’s easily arguable that the New York Times’ or ESPN’s comment and forum sections are predominantly designed as places for user-generated content. Would KOSA apply only to those interactive spaces or does the exception to the exception mean the entire sites are subject to the law? The language of the bill is unclear. 

Not All of KOSA’s Critics Are Right, Either 

Just as we don’t agree on KOSA’s likely outcomes with many of its supporters, we also don’t agree with every critic regarding KOSA’s consequences. This isn’t surprising—the law is broad, and a major complaint is that it remains unclear how its vague language would be interpreted. So let’s address some of the more common misconceptions about the bill. 

Large Social Media May Not Entirely Block Young People, But Smaller Services Might 

Some people have concerns that KOSA will result in minors not being able to use social media at all. We believe a more likely scenario is that the major platforms would offer different experiences to different age groups.  

They already do this in some ways: Meta currently places teens into the most restrictive content control setting on Instagram and Facebook. The company specifically updated these settings for many of the categories included in KOSA, including suicide, self-harm, and eating disorder content. Its update describes precisely what we worry KOSA would require by law: “While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find.” TikTok also has blocked some videos for users under 18. To be clear, this kind of content filtering as a result of KOSA would be harmful and would violate the First Amendment.

Though large platforms will likely react this way, many smaller platforms will not be capable of this kind of content filtering. They may well decide that blocking young people entirely is the easiest way to protect themselves from liability. We cannot know how every platform will react if KOSA is enacted, but smaller platforms that do not already use complex automated content moderation tools will likely find it financially burdensome to implement both age verification and content moderation tools.

KOSA Won’t Necessarily Make Your Real Name Public by Default 

One recurring fear that critics of KOSA have shared is that they will no longer be able to use platforms anonymously. We believe this is true, but there is some nuance to it. No one should have to hand over their driver’s license (or, worse, provide biometric information) just to access lawful speech on websites. But there’s nothing in KOSA that would require online platforms to publicly tie your real name to your username.

Still, once someone shares information to verify their age, there’s no way for them to be certain that the data they’re handing over will not be retained and used by the website, or further shared or even sold. As we’ve said, KOSA doesn’t technically require age verification, but we think it’s the most likely outcome. Users will still be forced to trust that the website they visit, or its third-party verification service, won’t misuse their private data, including their name, age, or biometric information. Given the numerous data privacy blunders we’ve seen from companies like Meta in the past, and the general concern with data privacy that Congress seems to share with the general public (and with EFF), we believe this outcome to be extremely dangerous. Simply put: sharing your private info with a company doesn’t necessarily make it public, but it makes it far more likely to become public than if you hadn’t shared it in the first place.

We Agree With Supporters: Government Should Study Social Media’s Effects on Minors 

We know tensions are high; this is an incredibly important topic, and an emotional one. EFF does not have all the right answers regarding how to address the ways in which young people can be harmed online. That is why we agree with KOSA’s supporters that the government should conduct much greater research on these issues. We believe that comprehensive fact-finding is the first step to identifying both the problems and the legislative solutions. A provision of KOSA does require the National Academy of Sciences to research these issues and issue reports to the public. But KOSA gets this process backwards. It creates solutions to general concerns about young people being harmed without first doing the work necessary to show that the bill’s provisions address those problems. As we have said repeatedly, we do not think KOSA will address harms to young people online. We think it will exacerbate them.  

Even if your stance on KOSA is different from ours, we hope we are all working toward the same goal: an internet that supports freedom, justice, and innovation for all people of the world. We don’t believe KOSA will get us there, but neither will ad hominem attacks. To that end, we look forward to more detailed analyses of the bill from its supporters, and to continuing thoughtful engagement from anyone interested in working on this critical issue. 

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

Aaron Mackey

San Diego City Council Breaks TRUST

1 month 3 weeks ago

In a stunning reversal, the San Diego City Council voted earlier this year to cut many of the provisions of the popular Transparent & Responsible Use of Surveillance Technology (TRUST) ordinance that sought to ensure public transparency for law enforcement surveillance technologies. 

Similar to other Community Control Of Police Surveillance (CCOPS) ordinances, the TRUST ordinance was intended to ensure that each police surveillance technology would be subject to basic democratic oversight in the form of public disclosures and city council votes. The TRUST ordinance was fought for by a coalition of community organizations, including several members of the Electronic Frontier Alliance, responding to surprise smart streetlight surveillance that was not put under public or city council review.  

The TRUST ordinance was passed one and a half years ago, but law enforcement advocates immediately set up roadblocks to implementation. Police unions, for example, insisted that some of the provisions around accountability for misuse of surveillance be halted after passage to ensure they didn’t conflict with union contracts. The city kept the ordinance unapplied and untested, and then in the late summer of 2023, a little over a year after passage, the mayor proposed a package of changes that would gut the ordinance, including exempting a long list of technologies such as ARJIS databases and record management system data storage. The city council approved these changes this past January.  

But use of these databases should require, for example, auditing to protect data security for city residents. There should also be limits on how police share data with federal agencies and other law enforcement agencies, which might use that data to criminalize San Diego residents for immigration status, gender-affirming health care, or exercise of reproductive rights, none of which are criminalized in the city or state. The overall TRUST ordinance stands, but it is partly defanged, with carve-outs for many technologies that the San Diego police will not need to bring before democratically elected lawmakers and the public. 

Now, emboldened by their recent victory, opponents of the TRUST ordinance are vowing to introduce even more amendments to further erode its gains, so that San Diegans won’t have a chance to know how their local law enforcement surveils them and no democratic body will be required to consent to the technologies, new or old. The members of the TRUST Coalition are not standing down, however, and will continue to fight to defend the standing portions of the TRUST ordinance and to regain the wins for public oversight that were lost. 

As Lilly Irani, from Electronic Frontier Alliance member and TRUST Coalition member Tech Workers Coalition San Diego, has said:

“City Council members and the mayor still have time to make this right. And we, the people, should hold our elected representatives accountable to make sure they maintain the oversight powers we currently enjoy — powers the mayor’s current proposal erodes.” 

If you live or work in San Diego, it’s important to make it clear to city officials that San Diegans don’t want to give police a blank check to harass and surveil them. Such dangerous technology needs basic transparency and democratic oversight to preserve our privacy, our speech, and our personal safety. 

José Martinez

5 Questions to Ask Before Backing the TikTok Ban

1 month 3 weeks ago

With strong bipartisan support, the U.S. House voted 352 to 65 to pass HR 7521 this week, a bill that would ban TikTok nationwide if its Chinese owner doesn’t sell the popular video app. The TikTok bill’s future in the U.S. Senate isn’t yet clear, but President Joe Biden has said he would sign it into law if it reaches his desk. 

The speed at which lawmakers have moved to advance a bill with such a significant impact on speech is alarming. It has given many of us — including, seemingly, lawmakers themselves — little time to consider the actual justifications for such a law. In isolation, parts of the argument might sound somewhat reasonable, but lawmakers still need to clear up their confused case for banning TikTok. Before throwing their support behind the TikTok bill, Americans should be able to understand it fully, something that they can start doing by considering these five questions. 

1. Is the TikTok bill about privacy or content?

Something that has made HR 7521 hard to talk about is the inconsistent way its supporters have described the bill’s goals. Is this bill supposed to address data privacy and security concerns? Or is it about the content TikTok serves to its American users? 

From what lawmakers have said, it seems clear that this bill is strongly motivated by content on TikTok that they don’t like. When describing the "clear threat" posed by foreign-owned apps, the House report on the bill cites the ability of adversary countries to "collect vast amounts of data on Americans, conduct espionage campaigns, and push misinformation, disinformation, and propaganda on the American public."

This week, the bill’s Republican sponsor Rep. Mike Gallagher told PBS Newshour that the “broader” of the two concerns TikTok raises is “the potential for this platform to be used for the propaganda purposes of the Chinese Communist Party." On that same program, Representative Raja Krishnamoorthi, a Democratic co-sponsor of the bill, similarly voiced content concerns, claiming that TikTok promotes “drug paraphernalia, oversexualization of teenagers” and “constant content about suicidal ideation.”

2. If the TikTok bill is about privacy, why aren’t lawmakers passing comprehensive privacy laws? 

It is indeed alarming how much information TikTok and other social media platforms suck up from their users, information that is then acquired not just by governments but also by private companies and data brokers. This is why EFF strongly supports comprehensive data privacy legislation, a solution that directly addresses privacy concerns. This is also why it is hard to take lawmakers at their word about their privacy concerns with TikTok, given that Congress has consistently failed to enact comprehensive data privacy legislation and that this bill would do little to stop the many other ways adversaries (foreign and domestic) collect, buy, and sell our data. Indeed, the TikTok bill has no specific privacy provisions in it at all.

It has been suggested that what makes TikTok different from other social media companies is how its data can be accessed by a foreign government. Here, too, TikTok is not special. China is not unique in requiring companies in the country to provide information to its government upon request. In the United States, Section 702 of the FISA Amendments Act, which is up for renewal, authorizes the mass collection of communication data. In 2021 alone, the FBI conducted up to 3.4 million warrantless searches through Section 702. The U.S. government can also demand user information from online providers through National Security Letters, which can both require providers to turn over user information and gag them from speaking about it. While the U.S. cannot control what other countries do, if this is a problem lawmakers are sincerely concerned about, they could start by fighting it at home.

3. If the TikTok bill is about content, how will it avoid violating the First Amendment? 

Whether TikTok is banned or sold to new owners, millions of people in the U.S. will no longer be able to get information and communicate with each other as they presently do. Indeed, one of the given reasons to force the sale is so TikTok will serve different content to users, specifically when it comes to Chinese propaganda and misinformation.

The First Amendment to the U.S. Constitution rightly makes it very difficult for the government to force such a change legally. To restrict content, U.S. laws must be the least speech-restrictive way of addressing serious harms. The TikTok bill’s supporters have vaguely suggested that the platform poses national security risks. So far, however, there has been little public justification that the extreme measure of banning TikTok (rather than addressing specific harms) is properly tailored to prevent these risks. And it has been well-established law for almost 60 years that people in the U.S. have a First Amendment right to receive foreign propaganda. People in the U.S. deserve an explicit explanation of the immediate risks posed by TikTok — something the government will have to do in court if this bill becomes law and is challenged.

4. Is the TikTok bill a ban or something else? 

Some have argued that the TikTok bill is not a ban because it would only ban TikTok if owner ByteDance does not sell the company. However, as we noted in the coalition letter we signed with the American Civil Liberties Union, the government generally cannot “accomplish indirectly what it is barred from doing directly, and a forced sale is the kind of speech punishment that receives exacting scrutiny from the courts.” 

Furthermore, a forced sale based on objections to content acts as a backdoor attempt to control speech. Indeed, one of the very reasons Congress wants a new owner is because it doesn’t like China’s editorial control. And any new ownership will likely bring changes to TikTok. In the case of Twitter, it has been very clear how a change of ownership can affect the editorial policies of a social media company. Private businesses are free to decide what information users see and how they communicate on their platforms, but when the U.S. government wants to do so, it must contend with the First Amendment. 

5. Does the U.S. support the free flow of information as a fundamental democratic principle? 

Until now, the United States has championed the free flow of information around the world as a fundamental democratic principle and called out other nations when they have shut down internet access or banned social media apps and other online communications tools. In doing so, the U.S. has deemed restrictions on the free flow of information to be undemocratic.

In 2021, the U.S. State Department formally condemned a ban on Twitter by the government of Nigeria. “Unduly restricting the ability of Nigerians to report, gather, and disseminate opinions and information has no place in a democracy,” a department spokesperson wrote. “Freedom of expression and access to information both online and offline are foundational to prosperous and secure democratic societies.”

Whether it’s in Nigeria, China, or the United States, we couldn’t agree more. Unfortunately, if the TikTok bill becomes law, the U.S. will lose much of its moral authority on this vital principle.

TAKE ACTION

TELL CONGRESS: DON'T BAN TIKTOK

Hudson Hongo

Location Data Tracks Abortion Clinic Visits. Here’s What to Know

1 month 3 weeks ago

Our concerns about the selling and misuse of location data for those seeking reproductive and gender-affirming healthcare are escalating amid a recent wave of cases and incidents demonstrating that the digital trail we leave is being used by anti-abortion activists.

The good news is some states and tech companies are taking steps to better protect location data privacy, including information that endangers people needing or seeking information about reproductive and gender-affirming healthcare. But we know more must be done—by pharmacies, our email providers, and lawmakers—to plug gaping holes in location data protection.

Location data is highly sensitive, as it paints a picture of our daily lives—where we go, whom we visit, when we seek medical care, and which clinics we attend. That’s what makes it so attractive to data brokers and to law enforcement in states outlawing abortion and gender-affirming healthcare, as well as to those seeking to exploit such data for ideological or commercial purposes.

What we’re seeing is deeply troubling. Sen. Ron Wyden recently disclosed that vendor Near Intelligence allegedly gathered location data of people’s visits to nearly 600 Planned Parenthood locations across 48 states, without consent. It sold that data to an anti-abortion group, which used it in a massive anti-abortion ad campaign. The Wisconsin-based group used the geofenced data to send mobile ads to people who visited the clinics.

It’s hardly a leap to imagine that law enforcement and bounty hunters in anti-abortion states would gladly buy the same data to find out who is visiting Planned Parenthood clinics and try to charge and imprison women, their families, doctors, and caregivers. That’s the real danger of an unregulated data broker industry: anyone can buy what’s gathered from warrantless surveillance, for whatever nefarious purpose they choose.

For example, police in Idaho, where abortion is illegal, used cell phone data in an investigation against an Idaho woman and her son charged with kidnapping. The data showed that they had taken the son’s minor girlfriend to Oregon, where abortion is legal, to obtain an abortion.

The exploitation of location data is not the only problem. Information about the prescription medicines we take is not protected against law enforcement requests. The nation’s eight largest pharmacy chains, including CVS, Walgreens, and Rite Aid, have routinely turned over prescription records of thousands of Americans to law enforcement agencies or other government entities, secretly and without a warrant, according to a congressional inquiry.

Many people may not know that their prescription records can be obtained by law enforcement without too much trouble. There’s not much standing between someone’s self-managed abortion medication and a law enforcement records demand. In April the U.S. Health and Human Services Department proposed a rule that would prevent healthcare providers and insurers from giving information to state officials trying to prosecute some seeking or providing a legal abortion. A final rule has not yet been published.

Exploitation of location and healthcare data to target communities could easily expand to other groups working to protect bodily autonomy, especially those most likely to suffer targeted harassment and bigotry. With states passing and proposing bills restricting gender-affirming care and state law enforcement officials pursuing medical records of transgender youth across state lines, it’s not hard to imagine them buying or using location data to find people to prosecute.

To better protect people against police access to sensitive health information, lawmakers in a few states have taken action. In 2022, California enacted two laws protecting abortion data privacy and preventing California companies from sharing abortion data with out-of-state entities.

Then, last September the state enacted a shield law prohibiting California-based companies, including social media and tech companies, from disclosing patients’ private communications regarding healthcare that is legally protected in the state.

Massachusetts lawmakers have proposed the Location Shield Act, which would prohibit the sale of cellphone location information to data brokers. The act would make it harder to trace the path of those traveling to Massachusetts for abortion services.

Of course, tech companies have a huge role to play in location data privacy. EFF was glad when Google said in 2022 it would delete users’ location history for visits to medical facilities, including abortion clinics and counseling and fertility centers. Google pledged that when the location history setting on a device was turned on, it would delete entries for particularly personal places like reproductive health clinics soon after such a visit.

But a study by Accountable Tech testing Google’s pledge found the company wasn’t living up to its promises and continued to collect and retain location data from individuals visiting abortion clinics. Accountable Tech reran the study in late 2023 and the results were again troubling—Google still retained location search query data for some visits to Planned Parenthood clinics. It appears users will have to manually delete location search history to remove information about the routes they take to visit sensitive locations; it doesn’t happen automatically.

Late last year, Google announced plans to move saved Timeline entries in Google Maps to users’ devices. Users who want to keep the entries could choose to back up the data to the cloud, where it would be automatically encrypted and out of reach even to Google.

These changes would appear to make it much more difficult—if not impossible—for Google to provide mass location data in response to a geofence warrant, a change we’ve been asking Google to implement for years. But when these features are coming is uncertain—though Google said in December they’re “coming soon.”

Google should implement the changes sooner rather than later. In the meantime, those seeking reproductive and gender-affirming information and healthcare can find tips on how to protect themselves in our Surveillance Self-Defense guide. 

Karen Gullo

How to Figure Out What Your Car Knows About You (and Opt Out of Sharing When You Can)

1 month 3 weeks ago

Cars collect a lot of our personal data, and car companies disclose a lot of that data to third parties. It’s often unclear what’s being collected and what’s being shared, and with whom. A recent New York Times article highlighted how G.M. shares data with insurance companies, sometimes without clear knowledge from the driver. If you’re curious about what your car knows about you, you might be able to find out. In some cases, you may even be able to opt out of some of that data sharing.

Why Your Car Collects and Shares Data

A car (and its app, if you installed one on your phone) can collect all sorts of data in the background, with or without you realizing it. This in turn may be shared for a wide variety of purposes, including advertising and risk assessment for insurance companies. The list of data collected is long and depends on the car’s make, model, and trim. But if you look through any car maker’s privacy policy, you'll see some trends:

  • Diagnostics data, sometimes referred to as “vehicle health data,” may be used internally for quality assurance, research, recall tracking, service issues, and similar unsurprising car-related purposes. This type of data may also be shared with dealers or repair companies for service.
  • Location information may be collected for emergency services, mapping, and to catalog other environmental information about where a car is operated. Some cars may give you access to the vehicle’s location in the app.
  • Some usage data may be shared or used internally for advertising. Your daily driving or car maintenance habits, alongside location data, are a valuable asset to the targeted advertising ecosystem. 
  • All of this data could be shared with law enforcement.
  • Information about your driving habits, sometimes referred to as “Driving data” or “Driver behavior information,” may be shared with insurance companies and used to alter your premiums. This can range from odometer readings to braking and acceleration statistics and even data about what time of day you drive. 

Surprise insurance sharing is the thrust of The New York Times article, and certainly not the only problem with car data. We've written previously about how insurance companies offer discounts for customers who opt into a usage-based insurance program. Every state except California currently allows the use of telematics data for insurance rating, but privacy protections for this data vary widely across states.

When you sign up directly through an insurer, these opt-in insurance programs have a pretty clear tradeoff and sign-up process, and they'll likely send you a physical device that plugs into your car's OBD port and collects and transmits data back to the insurer.

But some cars have their own internal systems for sharing information with insurance companies, which can piggyback off an app you may have installed or the car’s own internet connection. Many of these programs operate behind dense legalese, and you may have accidentally “agreed” to such sharing without realizing it while buying a new car—likely in a state of exhaustion and excitement after finally completing a gauntlet of finance and legal forms.

This gets more confusing: car makers use different terms for their insurance sharing programs. Some, like Toyota's “Insure Connect,” are pretty obviously named. But others, like Honda, tuck information about sharing with a data broker (which then shares with insurance companies) inside a privacy policy after you enable its “Driver Feedback” feature. Others might include the insurance sharing opt-in alongside broader services you might associate more with safety or theft, like G.M.’s OnStar, Subaru’s Starlink, and Volkswagen’s Car-Net.

The amount of data shared differs by company, too. Some car makers might share only small amounts of data, like an odometer reading, while others might share specific details about driving habits.

That's just the insurance data sharing. There's little doubt that many car makers sell other data for behavioral advertising, and like the rest of that industry, it's nearly impossible to track exactly where your data goes and how it's used.

See What Data Your Car Has (and Stop the Sharing)

This is a general guide to see what your car collects and who it shares it with. It does not include information about specific scenarios—like intimate partner violence—that may raise distinctive driver privacy issues.

See How Your Car Handles (Data)
Start by seeing what your car is equipped to collect using Privacy4Cars’ Vehicle Privacy Report. Once you enter your car’s VIN, the site provides a rough idea of what sorts of data your car collects. It's also worth reading about your car manufacturer’s more general practices on Mozilla's Privacy Not Included site.

Check the Privacy Options In Your Car’s Apps and Infotainment System
If you use an app for your car, head into the app’s settings and look for any data sharing options, under settings like “Data Privacy” or “Data Usage.” When possible, opt out of sharing any data with third parties, or for behavioral advertising. As annoying as it may be, it’s important to read carefully here so you don’t accidentally disable something you want, like the car’s SOS feature. Be mindful that, at least according to Mozilla’s report on Tesla, opting out of certain data sharing might someday make the car undriveable. Now’s also a good time to disable ad tracking on your phone.

When it comes to sharing with insurance companies, you’re looking for an option that may be obvious, like Toyota’s “Insure Connect,” or less obvious, like Kia’s “Driving Score.” If your car’s app has any sort of driver scoring or feedback option—some other names include G.M.’s “Smart Driver,” Honda’s “Driver Feedback,” or Mitsubishi’s “Driving Score”—there’s a chance it’s sharing that data with an insurance company. Check for these options in both the app and the car’s infotainment system.

If you did accidentally sign up for sharing data with insurance companies, you may want to call your insurance company to see how doing so may affect your premiums. Depending on your driving habits, your premiums might go up or down, and in either case you don’t want a surprise bill.

File a Privacy Request with the Car Maker
Next, file a privacy request with the car manufacturer so you can see exactly what data the company has collected about you. Some car makers will provide this to anyone who asks. Others might only respond to requests from residents of states with a consumer data privacy law that requires their response. The International Association of Privacy Professionals has published this list of states with such laws.

In these states, you have a “right to know” or “right to access” your data, which requires the company to send you a copy of what personal information it collected about you. Some of these states also guarantee “data portability,” meaning the right to access your data in a machine-readable format. File one of these requests, and you should receive a copy of your data. In some states, you can also file a request for the car maker to not sell or share your information, or to delete it. While the car maker might not be legally required to respond to your request if you're not from a state with these privacy rights, it doesn’t hurt to ask anyway.

Every company tends to word these requests a little differently, but you’re looking for options to get a copy of your data, and ask them to stop sharing it. This typically requires filling out a separate request form for each type of request.

Here are the privacy request pages for the major car brands:

Sometimes, you will need to confirm the request in an email, so be sure to keep an eye on your inbox.

Check for Data On Popular Data Brokers Known to Share with Insurers
Finally, request your data from data brokers known to hand car data to insurers. For example, do so with the two companies mentioned in The New York Times’ article: 

Now, you wait. In most states, within 45 to 90 days you should receive an email from the car maker, and another from the data brokers, which will often include a link to your data. You will typically get a CSV file, though it may also be a PDF, XLS, or even a folder with a whole webpage and an HTML file. If you don't have any sort of spreadsheet software on your computer, you might struggle to open it up, but most of the files you get can be opened in free programs, like Google Sheets or LibreOffice.
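If you’d rather not install spreadsheet software just to inspect the file, a few lines of Python will do the job. The sketch below is a minimal example for peeking at a CSV export; the file name is hypothetical, and real exports vary in name and column layout from company to company.

import csv

# Hypothetical file name; car makers and data brokers each name their exports differently.
EXPORT_FILE = "car_data_export.csv"

with open(EXPORT_FILE, newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)
    print(f"{len(header)} columns: {header}")
    # Show the first few rows to get a sense of what was collected.
    for i, row in enumerate(reader):
        if i >= 5:
            break
        print(row)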

Without a national law that puts privacy first, there is little that most people can do to stop this sort of data sharing. Moreover, the steps above clearly require far too much effort for most people to take. That’s why we need much more than these consumer rights to know, to delete, and to opt out of disclosure: we also need laws that automatically require corporations to minimize the data they process about us, and to get our opt-in consent before processing our data. As to car insurers, we've outlined exactly what sort of guardrails we'd like to see here. 

As The New York Times' reporting revealed, many people were surprised to learn how their data is collected, disclosed, and used, even if there was an opt-in consent screen. This is a clear indication that car makers need to do better. 

Thorin Klosowski

Making the Law Accessible in Europe and the USA

1 month 3 weeks ago

Special thanks to EFF legal intern Alissa Johnson, who was the lead author of this post.

Earlier this month, the European Union Court of Justice ruled that harmonized standards are a part of EU law, and thus must be accessible to EU citizens and residents free of charge.

While it might seem like common sense that the laws that govern us should be freely accessible, this question has been in dispute in the EU for the past five years, and in the U.S. for over a decade. At the center of this debate are technical standards, developed by private organizations and later incorporated into law. Before they were challenged in court, standards-development organizations were able to limit access to these incorporated standards through assertions of copyright. Regulated parties or concerned citizens checking compliance with technical or safety standards had to do so by purchasing these standards, often at significant expense, from private organizations. While free alternatives, like proprietary online “reading rooms,” were sometimes available, these options had their own significant downsides, including limited functionality and privacy concerns.

In 2018, two nonprofits, Public.Resource.Org and Right to Know, made a request to the European Commission for access to four harmonized standards—that is, standards that apply across the European Union—pertaining to the safety of toys. The Commission refused to grant them access on the grounds that the standards were copyrighted.   

The nonprofits then brought an action before the General Court of the European Union seeking annulment of the Commission’s decision. They made two main arguments. First, that copyright couldn’t be applicable to the harmonized standards, and that open access to the standards would not harm the commercial interests of the European Committee for Standardization or other standard-setting bodies. Second, they argued that the public interest in open access to the law should override whatever copyright interests might exist. The General Court rejected both arguments, finding that the threshold of originality that makes a work eligible for copyright protection had been met, that the sale of standards was a vital part of standards bodies’ business model, and that the public’s interest in ensuring the proper functioning of the European standardization system outweighed its interest in free access to harmonized standards.

Last week, the EU Court of Justice overturned the General Court decision, holding that EU citizens and residents have an overriding interest in free access to the laws that govern them. Article 15(3) of the Treaty on the Functioning of the EU and Article 42 of the Charter of Fundamental Rights of the EU guarantee a right of access to documents of Union institutions, bodies, offices, and agencies. These bodies can refuse access to a document where its disclosure would undermine the protection of commercial interests, including intellectual property, unless there is an overriding public interest in disclosure.

Under the ECJ’s ruling, standards written by private companies, but incorporated into legislation, now form part of EU law. People need access to these standards to determine their own compliance. While compliance with harmonized standards is not generally mandatory, it is in the case of the toy safety standards in question here. Even when compliance is not mandatory, products that meet technical standards benefit from a “presumption of conformity,” and failure to conform can impose significant administrative difficulties and additional costs.

Given that harmonized standards are a part of EU law, citizens and residents of member states have an interest in free access that overrides potential copyright concerns. Free access is necessary for economic actors “to ascertain unequivocally what their rights and obligations are,” and to allow concerned citizens to examine compliance. As the U.S. Supreme Court noted in 2020, “[e]very citizen is presumed to know the law, and it needs no argument to show that all should have free access” to it.

The Court of Justice’s decision has far-reaching effects beyond the four toy safety standards under dispute. Its reasoning classifying these standards as EU law applies more broadly to standards incorporated into law. We’re pleased that under this precedent, EU standards-development organizations will be required to disclose standards on request without locking these important parts of the law behind a paywall.

Mitch Stoltz

Why U.S. House Members Opposed the TikTok Ban Bill

1 month 3 weeks ago

What do House Democrats like Alexandria Ocasio-Cortez and Barbara Lee have in common with House Republicans like Thomas Massie and Andy Biggs? Not a lot. But they do know an unconstitutional bill when they see one.

These and others on both sides of the aisle were among the 65 House Members who voted "no" yesterday on the “Protecting Americans from Foreign Adversary Controlled Applications Act,” H.R. 7521, which would effectively ban TikTok. The bill now goes to the Senate, where we hope cooler heads will prevail in demanding comprehensive data privacy legislation instead of this attack on Americans' First Amendment rights.

We're saying plenty about this misguided, unfounded bill, and we want you to speak out about it too, but we thought you should see what some of the House Members who opposed it said, in their own words.

 

I am voting NO on the TikTok ban.

Rather than target one company in a rushed and secretive process, Congress should pass comprehensive data privacy protections and do a better job of informing the public of the threats these companies may pose to national security.

— Rep. Barbara Lee (@RepBarbaraLee) March 13, 2024

   ___________________ 

Today, I voted against the so-called “TikTok Bill.”

Here’s why: pic.twitter.com/Kbyh6hEhhj


— Rep Andy Biggs (@RepAndyBiggsAZ) March 13, 2024

   ___________________

Today, I voted against H.R. 7521. My full statement: pic.twitter.com/9QCFQ2yj5Q


— Rep. Nadler (@RepJerryNadler) March 13, 2024

   ___________________ 

Today I claimed 20 minutes in opposition to the TikTok ban bill, and yielded time to several likeminded colleagues.

This bill gives the President far too much authority to determine what Americans can see and do on the internet.

This is my closing statement, before I voted No. pic.twitter.com/xMxp9bU18t


— Thomas Massie (@RepThomasMassie) March 13, 2024

   ___________________ 

Why I voted no on the bill to potentially ban tik tok: pic.twitter.com/OGkfdxY8CR


— Jim Himes 🇺🇸🇺🇦 (@jahimes) March 13, 2024

   ___________________ 

I don’t use TikTok. I find it unwise to do so. But after careful review, I’m a no on this legislation.

This bill infringes on the First Amendment and grants undue power to the administrative state. pic.twitter.com/oSpmYhCrV8


— Rep. Dan Bishop (@RepDanBishop) March 13, 2024

   ___________________ 

I’m voting NO on the TikTok forced sale bill.

This bill was incredibly rushed, from committee to vote in 4 days, with little explanation.

There are serious antitrust and privacy questions here, and any national security concerns should be laid out to the public prior to a vote.

— Alexandria Ocasio-Cortez (@AOC) March 13, 2024

   ___________________ 

We should defend the free & open debate that our First Amendment protects. We should not take that power AWAY from the people & give it to the government. The answer to authoritarianism is NOT more authoritarianism. The answer to CCP-style propaganda is NOT CCP-style oppression. pic.twitter.com/z9HWgUSMpw

— Tom McClintock (@RepMcClintock) March 13, 2024

   ___________________ 

I'm voting no on the TikTok bill. Here's why:
1) It was rushed.
2) There's major free speech issues.
3) It would hurt small businesses.
4) America should be doing way more to protect data privacy & combatting misinformation online. Singling out one app isn't the answer.

— Rep. Jim McGovern (@RepMcGovern) March 13, 2024

    ___________________

Solve the correct problem.
Privacy.
Surveillance.
Content moderation.

Who owns #TikTok?
60% investors - including Americans
20% +7,000 employees - including Americans
20% founders
CEO & HQ Singapore
Data in Texas held by Oracle

What changes with ownership? I’ll be voting NO. pic.twitter.com/MrfROe02IS


— Warren Davidson 🇺🇸 (@WarrenDavidson) March 13, 2024

   ___________________ 

I voted no on the bill to force the sale of TikTok. Unlike our adversaries, we believe in freedom of speech and don’t ban social media platforms. Instead of this rushed bill, we need comprehensive data security legislation that protects all Americans.

— Val Hoyle (@RepValHoyle) March 13, 2024

    ___________________

Please tell the Senate to reject this bill and instead give Americans the comprehensive data privacy protections we so desperately need.

TAKE ACTION

TELL CONGRESS: DON'T BAN TIKTOK

Josh Richman

SXSW Tried to Silence Critics with Bogus Trademark and Copyright Claims. EFF Fought Back.

1 month 3 weeks ago

Special thanks to EFF legal intern Jack Beck, who was the lead author of this post.

Amid heavy criticism for its ties to weapons manufacturers supplying Israel, South by Southwest—the organizer of an annual conference and music festival in Austin—has been on the defensive. One tool in their arsenal: bogus trademark and copyright claims against local advocacy group Austin for Palestine Coalition.

The Austin for Palestine Coalition has been a major source of momentum behind recent anti-SXSW protests. Their efforts have included organizing rallies outside festival stages and hosting an alternative music festival in solidarity with Palestine. They have also created social media posts explaining the controversy, criticizing SXSW, and calling on readers to email SXSW with demands for action. The group’s posts include graphics that modify SXSW’s arrow logo to add blood-stained fighter jets. Other images incorporate patterns evoking SXSW marketing materials overlaid with imagery like a bomb or a bleeding dove.

One of Austin for Palestine's graphics

Days after the posts went up, SXSW sent a cease-and-desist letter to Austin for Palestine, accusing them of trademark and copyright infringement and demanding they take down the posts. Austin for Palestine later received an email from Instagram indicating that SXSW had reported the post for violating their trademark rights.

We responded to SXSW on Austin for Palestine’s behalf, explaining that their claims are completely unsupported by the law and demanding they retract them.

The law is clear on this point. The First Amendment protects your right to make a political statement using trademark parodies, whether or not the trademark owner likes it. That’s why trademark law applies a different standard (the “Rogers test”) to infringement claims involving expressive works. The Rogers test is a crucial defense against takedowns like these, and it clearly applies here. Even without Rogers’ extra protections, SXSW’s trademark claim would be bogus: Trademark law is about preventing consumer confusion, and no reasonable consumer would see Austin for Palestine’s posts and infer they were created or endorsed by SXSW.

SXSW’s copyright claims are just as groundless. Basic symbols like their arrow logo are not copyrightable. Moreover, even if SXSW meant to challenge Austin for Palestine’s mimicking of their promotional material—and it’s questionable whether that material is copyrightable, either—the posts are a clear example of non-infringing fair use. Courts weigh four factors in a fair use analysis, and each factor here either favors Austin for Palestine or is at worst neutral. Most importantly, the critical message conveyed by Austin for Palestine’s use is entirely different from the original purpose of these marketing materials, and the only injury to SXSW is reputational—which is not a cognizable copyright injury.

SXSW has yet to respond to our letter. EFF has defended against bogus copyright and trademark claims in the past, and SXSW’s attempted takedown feels especially egregious considering the nature of Austin for Palestine’s advocacy. Austin for Palestine used SXSW’s iconography to make a political point about the festival itself, and neither trademark nor copyright is a free pass to shut down criticism. As an organization that “dedicates itself to helping creative people achieve their goals,” SXSW should know better.

Cara Gagliano

Protect Yourself from Election Misinformation

1 month 3 weeks ago

Welcome to your U.S. presidential election year, when all kinds of bad actors will flood the internet with election-related disinformation and misinformation aimed at swaying or suppressing your vote in November. 

So… what’re you going to do about it? 

As EFF’s Corynne McSherry wrote in 2020, online election disinformation is a problem that has had real consequences in the U.S. and all over the world—it has been correlated to ethnic violence in Myanmar and India and to Kenya’s 2017 elections, among other events. Still, election misinformation and disinformation continue to proliferate online and off. 

That being said, regulation is not typically an effective or human rights-respecting way to address election misinformation. Even well-meaning efforts to control election misinformation through regulation inevitably end up silencing a range of dissenting voices and hindering the ability to challenge ingrained systems of oppression. Indeed, any content regulation must be scrutinized to avoid inadvertently affecting meaningful expression: Is the approach narrowly tailored or a categorical ban? Does it empower users? Is it transparent? Is it consistent with human rights principles? 

While platforms and regulators struggle to get it right, internet users must be vigilant about checking the election information they receive for accuracy. There is help. Nonprofit journalism organization ProPublica published a handy guide about how to tell if what you’re reading is accurate or “fake news.” The International Federation of Library Associations and Institutions’ infographic on How to Spot Fake News is a quick and easy-to-read reference you can share with friends.


To make sure you’re getting good information about how your election is being conducted, check in with trusted sources including your state’s Secretary of State, Common Cause, and other nonpartisan voter protection groups, or call or text 866-OUR-VOTE (866-687-8683) to speak with a trained election protection volunteer. 

And if you see something, say something: You can report election disinformation at https://reportdisinfo.org/, a project of the Common Cause Education Fund. 

EFF also offers some election-year food for thought: 

  • On EFF’s “How to Fix the Internet” podcast, Pamela Smith—president and CEO of Verified Voting—in 2022 talked with EFF’s Cindy Cohn and Jason Kelley about finding reliable information on how your elections are conducted, as part of ensuring ballot accessibility and election transparency.
  • Also on “How to Fix the Internet”, Alice Marwick—cofounder and principal researcher at the University of North Carolina, Chapel Hill’s Center for Information, Technology and Public Life—in 2023 talked about finding ways to identify and leverage people’s commonalities to stem the flood of disinformation while ensuring that the most marginalized and vulnerable internet users are still empowered to speak out. She discussed why seemingly ludicrous conspiracy theories get so many views and followers; how disinformation is tied to personal identity and feelings of marginalization and disenfranchisement; and when fact-checking does and doesn’t work.
  • EFF’s Cory Doctorow wrote in 2020 about how big tech monopolies distort our public discourse: “By gathering a lot of data about us, and by applying self-modifying machine-learning algorithms to that data, Big Tech can target us with messages that slip past our critical faculties, changing our minds not with reason, but with a kind of technological mesmerism.” 

An effective democracy requires an informed public, and participating in a democracy is a responsibility that requires work. Online platforms have a long way to go in providing the tools users need to discern legitimate sources from fake news. In the meantime, it’s on each of us. Don’t let anyone lie, cheat, or scare you away from making the most informed decision for your community at the ballot box. 

Josh Richman

Congress Should Give Up on Unconstitutional TikTok Bans

1 month 3 weeks ago

Congress’ unfounded plan to ban TikTok under the guise of protecting our data is back, this time in the form of a new bill—the “Protecting Americans from Foreign Adversary Controlled Applications Act,” H.R. 7521—which has gained a dangerous amount of momentum in Congress. This bipartisan legislation was introduced in the House just a week ago and is expected to be sent to the Senate after a vote later this week.

A year ago, supporters of digital rights across the country successfully stopped the federal RESTRICT Act, commonly known as the “TikTok Ban” bill (it was that and a whole lot more). And now we must do the same with this bill. 

TAKE ACTION

TELL CONGRESS: DON'T BAN TIKTOK

As a first step, H.R. 7521 would force TikTok to find a new owner that is not based in a foreign adversarial country within the next 180 days or be banned until it does so. It would also give the President the power to designate other applications under the control of a country considered adversarial to the U.S. to be a national security threat. If deemed a national security threat, the application would be banned from app stores and web hosting services unless it cuts all ties with the foreign adversarial country within 180 days. The bill would criminalize the distribution of the application through app stores or other web services, as well as the maintenance of such an app by the company. Ultimately, the result of the bill would be either a nationwide ban on TikTok or a forced sale of the application to a different company.


Make no mistake—though this law starts with TikTok specifically, it could have an impact elsewhere. Tencent’s WeChat app is one of the world’s largest standalone messenger platforms, with over a billion users, and is a key vehicle for the Chinese diaspora generally. It would likely also be a target. 

The bill’s sponsors have argued that the amount of private data available to and collected by the companies behind these applications — and in theory, shared with a foreign government — makes them a national security threat. But like the RESTRICT Act, this bill won’t stop this data sharing, and will instead reduce our rights online. User data will still be collected by numerous platforms—possibly even TikTok after a forced sale—and it will still be sold to data brokers who can then sell it elsewhere, just as they do now. 

The only solution to this pervasive ecosystem is prohibiting the collection of our data in the first place. Ultimately, foreign adversaries will still be able to obtain our data from social media companies unless those companies are forbidden from collecting, retaining, and selling it, full stop. And to be clear, under our current data privacy laws, there are many domestic adversaries engaged in manipulative and invasive data collection as well. That’s why EFF supports such consumer data privacy legislation. 

Congress has also argued that this bill is necessary to tackle the anti-American propaganda that young people are seeing due to TikTok’s algorithm. Both this justification and the national security justification raise serious First Amendment concerns, and last week EFF, the ACLU, CDT, and Fight for the Future wrote to the House Energy and Commerce Committee urging them to oppose this bill due to its First Amendment violations—specifically for those across the country who rely on TikTok for information, advocacy, entertainment, and communication. The U.S. has rightfully condemned other countries when they have banned, or sought to ban, specific social media platforms.


And it’s not just civil society saying this. Late last year, the courts blocked Montana’s TikTok ban, SB 419, from going into effect on January 1, 2024, ruling that the law violated users’ First Amendment rights to speak and to access information online, and the company’s First Amendment rights to select and curate users’ content. EFF and the ACLU had filed a friend-of-the-court brief in support of a challenge to the law brought by TikTok and a group of the app’s users who live in Montana. 

Our brief argued that Montana’s ban was as unprecedented as it was unconstitutional, and we are pleased that the district court upheld our free speech rights and blocked the law from going into effect. As with that state ban, the US government cannot show that a federal ban is narrowly tailored, and thus cannot use the threat of unlawful censorship as a cudgel to coerce a business to sell its property. 

TAKE ACTION

TELL CONGRESS: DON'T BAN TIKTOK

Instead of passing this overreaching and misguided bill, Congress should prevent any company—regardless of where it is based—from collecting massive amounts of our detailed personal data, which is then made available to data brokers, U.S. government agencies, and even foreign adversaries, China included. We shouldn’t waste time arguing over a law that will get thrown out for silencing the speech of millions of Americans. Instead, Congress should solve the real problem of out-of-control privacy invasions by enacting comprehensive consumer data privacy legislation.

Jason Kelley

Congress Must Stop Pushing Bills That Will Benefit Patent Trolls

1 month 4 weeks ago

The U.S. Senate is moving forward with two bills that would enrich patent trolls, patent system insiders, and a few large companies that rely on flimsy patents, at the expense of everyone else. 

One bill, the Patent Eligibility Restoration Act (PERA) would bring back some of the worst software patents we’ve seen, and even re-introduce types of patents on human genes that were banned years ago. Meanwhile, a similar group of senators is trying to push forward the PREVAIL Act (S. 2220), which would shut out most of the public from even petitioning the government to reconsider wrongly granted patents. 

Take Action

Tell Congress: No New Bills For Patent Trolls

Patent trolls are companies that don’t focus on making products or selling services. Instead, they collect patents, then use them to threaten or sue other companies and individuals. They’re not a niche problem; patent trolls filed the majority of patent lawsuits last year and for all the years in which we have good data. In the tech sector, they file more than 80% of the lawsuits. These do-nothing companies continue to be vigorous users of the patent system, and they’ll be the big winners under the two bills the U.S. Senate is considering pushing forward. 

Don’t Bring Back “Do It On A Computer” Patents 

The Patent Eligibility Restoration Act, or PERA, would overturn key legal precedents that we all rely on to kick the worst-of-the-worst patents out of the system. PERA would throw out a landmark Supreme Court ruling called the Alice v. CLS Bank case, which made it clear that patents can’t just claim basic business or cultural processes by adding generic computer language. 

The Alice rules are what—finally—allowed courts to throw out the most ridiculous “do it on a computer” software patents at an early stage. Under the Alice test, courts threw out patents on “matchmaking,” online picture menus, scavenger hunts, and online photo contests. 

The rules under Alice are clear and fair, and they work. They haven’t ended patent trolling, because many patent owners are willing to demand nuisance-value settlements far below the cost of a legal defense. But Alice has done a good job of saving everyday internet users from some of the worst patent claims. 

PERA would allow patents like the outrageous one brought forward in the Alice v. CLS Bank case, which claimed the idea of having a third party clear financial transactions—but on a computer. A patent on ordering restaurant food through a mobile phone, which was used to sue more than 100 restaurants, hotels, and fast-food chains before it was finally thrown out under the Alice rules, could survive if PERA becomes law. 

Don’t Bring Back Patents On Human Genes 

PERA goes further than software. It would also overturn a Supreme Court rule that prevents patents from being granted on naturally occurring human genes. For almost 30 years, some biotech and pharmaceutical companies used a cynical argument to patent genes and monopolize the diagnostic tests that analyzed them. That let patent owners run up the cost of tests like those for the BRCA genes, which are predictive of ovarian and breast cancers. When the Supreme Court disallowed patents on human genes found in nature, the prices of those tests plummeted. 

Patenting naturally occurring human genes is a horrific practice and the Supreme Court was right to ban it. The fact that PERA sponsors want to bring back these patents is unconscionable. 

Allowing extensive patenting of genetic information will also harm future health innovations by blocking competition from those who may offer more affordable tests and treatments. It could affect our response to future pandemics. Imagine if the first lab to sequence the COVID-19 genome had filed for patent protection and then threatened other labs developing tests with infringement claims. As an ACLU attorney who litigated against the BRCA gene patents has pointed out, this scenario is not fantastical if a bill like PERA were to advance. 

Take Action

Tell Congress To Reject PERA and PREVAIL

Don’t Shut Down The Public’s Right To Challenge Patents

The PREVAIL Act would bar most people from petitioning the U.S. Patent and Trademark Office (USPTO) to revoke patents that never should have been granted in the first place. 

The USPTO issues hundreds of thousands of patents every year, with examiners devoting less than 20 hours, on average, to each one. Mistakes happen. 

That’s why Congress created a process for the public to ask the USPTO to double-check certain patents, to make sure they were not wrongly granted. This process, called inter partes review or IPR, is still expensive and difficult, but faster and cheaper than federal courts, where litigating a patent through a jury trial can cost millions of dollars. IPR has allowed the cancellation of thousands of patent claims that never should have been issued in the first place. 

The PREVAIL Act would limit access to the IPR process to only those people and companies that have been directly threatened or sued over a patent. No one else would have standing to even file a petition. That means that EFF, other non-profits, and membership-based patent defense companies would no longer be able to use the IPR process to protect the public. 

EFF used the IPR process back in 2013, when thousands of our supporters chipped in to raise more than $80,000 to fight against a patent that claimed to cover all podcasts. We won’t be able to do that if PREVAIL passes. 

And EFF isn’t the only non-profit to use IPRs to protect users and developers. The Linux Foundation, for instance, funds an “open source zone” that uses IPR to knock out patents that may be used to sue open source projects. Dozens of lawsuits are filed each year against open source projects, the majority of them brought by patent trolls. 

IPR is already too expensive and limited; Congress should be eliminating barriers to challenging bad patents, not raising more.

Congress Should Work For the Public, Not For Patent Trolls

The senators pushing this agenda have chosen willful ignorance of the patent troll problem. The facts remain clear: the majority of patent lawsuits are brought by patent trolls, and in the tech sector it’s more than 80%. And these numbers likely understate the problem, since threat letters from patent trolls don’t become visible in the public record. 

These patent lawsuits don’t have much to do with what most people think of when they think about “inventors” or inventions. They’re brought by companies that have no business beyond making patent threats. 

The Alice rules and the IPR system, along with other important reforms, have weakened the power of these patent trolls. Patent trolls that used to receive regular multi-million dollar paydays have seen their incomes shrink (but not disappear). Some trolls, like Shipping and Transit LLC, finally wound up operations after being hit with sanctions (more than 500 lawsuits later). Others, like IP Edge, are now being investigated by a federal judge after claiming their true “owners” included a Texas food truck owner who turned out to be, essentially, a decoy. 

There’s big money behind bringing back the patent troll business, as well as behind a few huge tech and pharma companies that would rather wield unjustified monopolies than compete fairly. Two former Federal Circuit judges, two former Directors of the U.S. Patent and Trademark Office, and many other well-placed patent insiders are all telling Congress that Alice should be overturned and patent trolls should be allowed to run amok. We can’t let that happen. 

Take Action

Tell Congress: Don't Work For Patent Trolls

Joe Mullin

Reject Nevada’s Attack on Encrypted Messaging, EFF Tells Court

1 month 4 weeks ago
Nevada Makes Backward Argument That Insecure Communication Makes Children Safer

LAS VEGAS — The Electronic Frontier Foundation (EFF) and a coalition of partners urged a court to protect default encrypted messaging and children’s privacy and security in a brief filed today.

The brief by the American Civil Liberties Union (ACLU), the ACLU of Nevada, the EFF, Stanford Internet Observatory Research Scholar Riana Pfefferkorn, and six other organizations asks the court to reject a request by Nevada’s attorney general to stop Meta from offering end-to-end encryption by default to Facebook Messenger users under 18 in the state. The brief was also signed by Access Now, Center for Democracy & Technology (CDT), Fight for the Future, Internet Society, Mozilla, and Signal Messenger LLC.

Communications are safer when third parties can’t listen in on them. That’s why the EFF and others who care about privacy pushed Meta for years to make end-to-end encryption the default option in Messenger. Meta finally made the change, but Nevada wants to turn back the clock. As the brief notes, end-to-end encryption “means that even if someone intercepts the messages—whether they are a criminal, a domestic abuser, a foreign despot, or law enforcement—they will not be able to decipher or access the message.” The state of Nevada, however, bizarrely argues that young people would be better off without this protection.
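To make that property concrete, here is a minimal sketch of public-key end-to-end encryption using the open-source PyNaCl library. This is only an illustration of the general technique, not Messenger’s actual implementation (Messenger uses the Signal protocol), and the names and message text are invented for the example:

```python
# A minimal sketch of end-to-end encryption with PyNaCl (illustrative only;
# not Messenger's actual protocol). Only the endpoints hold private keys,
# so an interceptor who captures the ciphertext learns nothing useful.
from nacl.public import PrivateKey, Box

alice = PrivateKey.generate()  # sender's keypair
bob = PrivateKey.generate()    # recipient's keypair

# Alice encrypts to Bob's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"see you at 6")

# Anyone intercepting `ciphertext` in transit sees only opaque bytes.
# Bob, holding his private key, can decrypt it.
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
assert plaintext == b"see you at 6"
```

The property the brief describes is visible in the sketch: the server relaying `ciphertext` never holds the private keys, so neither it nor anyone intercepting the message can read it.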

“Encryption is the best tool we have for safeguarding our privacy and security online — and privacy and security are especially important for young people,” said EFF Surveillance Litigation Director Andrew Crocker. “Nevada’s argument that children need to be ‘protected’ from securely communicating isn’t just baffling; it’s dangerous.”

As explained in a friend-of-the-court brief filed by the EFF and others today, encryption is one of the best ways to reclaim our privacy and security in a digital world full of cyberattacks and security breaches. It is increasingly being deployed across the internet as a way to protect users and data. For children and their families especially, encrypted communication is one of the strongest safeguards they have against malicious misuse of their private messages — a safeguard Nevada seeks to deny them.

“The European Court of Human Rights recently rejected a Russian law that would have imposed similar requirements on services that offer end-to-end message encryption – finding that it violated human rights law to deny people the security and privacy that encryption provides,” said EFF’s Executive Director Cindy Cohn. “Nevada’s attempt should be similarly rejected.”

In its motion to the court, Nevada argues that it is necessary to block end-to-end encryption on Facebook Messenger because it can impede some criminal investigations involving children. This ignores the fact that law enforcement can and does conduct investigations involving encrypted messages, which can be reported by users and accessed from either the sender’s or the recipient’s device. It also ignores law enforcement’s use of the tremendous amount of additional information about users that Meta routinely collects.

The brief notes that co-amicus Pfefferkorn recently authored a study confirming that Nevada does not, in fact, need to block encryption to conduct its investigations. The study found that “content-oblivious” investigation methods are “considered more useful than monitoring the contents of users’ communications when it comes to detecting nearly every kind of online abuse.” 

“The court should reject Nevada’s motion,” said EFF’s Crocker. “Making children more vulnerable just to make law enforcement investigators’ jobs slightly easier is an unnecessary and dangerous trade-off.”

For the brief: https://www.eff.org/document/nevada-v-meta-amicus-brief

Contact: Andrew Crocker, Surveillance Litigation Director, andrew@eff.org
Hudson Hongo

EFF Urges New York Court to Protect Online Speakers’ Anonymity

1 month 4 weeks ago

The First Amendment requires courts to apply a robust balancing test before unmasking anonymous online speakers, EFF explained in an amicus brief it filed recently in a New York State appeal.

In the case on appeal, GSB Gold Standard v. Google, a German company that sells cryptocurrency investments is seeking to unmask an anonymous blogger who criticized the company. Based upon a German court order, the company sought a subpoena that would identify the blogger. The blogger fought back, without success, and they are now appealing.

Like speech itself, the First Amendment right to anonymity fosters and advances public debate and self-realization. Anonymity allows speakers to communicate their ideas without being defined by their identity. Anonymity protects speakers who express critical or unpopular views from harassment, intimidation, or being silenced. And, because powerful individuals or entities’ efforts to punish one speaker through unmasking may well lead others to remain silent, protecting anonymity for one speaker can promote free expression for many others.

Too often, however, corporations and individuals try to abuse the judicial process to unmask anonymous speakers. Thus, courts should apply robust evidentiary and procedural standards before compelling the disclosure of an anonymous speaker’s identity. 

Under these standards, parties seeking to unmask anonymous speakers must first show they have meritorious legal claims, to help ensure that the litigation isn’t a pretext for harassment. Parties that meet this first step must then also show that their interests in unmasking an anonymous speaker outweigh the speaker’s interests in retaining their anonymity. In this case, the trial court didn’t require the German company to meet this standard, and the company could not have met it in any event.

Courts around the United States have adopted various forms of this test, with EFF often participating as amicus or counsel. We hope that New York follows their lead.

Brendan Gilligan

Access to Internet Infrastructure is Essential, in Wartime and Peacetime

1 month 4 weeks ago

We’ve been saying it for 20 years, and it remains true now more than ever: the internet is an essential service. It enables people to build and create communities, shed light on injustices, and acquire vital knowledge that might not otherwise be available. And access to it becomes even more imperative in circumstances where being able to communicate and share real-time information directly with the people you trust is instrumental to personal safety and survival. During wartime and conflict in particular, internet and phone services enable people in challenging situations to communicate with one another, and enable on-the-ground journalists and ordinary people to report the news. 

Unfortunately, governments across the world are very aware of their power to cut off this crucial lifeline, and frequently undertake targeted initiatives to do so. These internet shutdowns have become a blunt instrument that aids state violence and inhibits free speech, and they are routinely deployed in direct contravention of human rights and civil liberties.

And this is not a one-dimensional situation. Nearly twenty years after the world’s first total internet shutdowns, this draconian measure is no longer the sole domain of authoritarian states but has become a favorite of a diverse set of governments across three continents. For example:

In Iran, the government has been suppressing internet access for many years. In the past two years in particular, the people of Iran have suffered repeated internet and social media blackouts following the activist movement that blossomed after the death of Mahsa Amini, a woman killed in police custody after being arrested for allegedly violating the country’s hijab law. The movement gained global attention, and in response the Iranian government rushed to control both the public narrative and organizing efforts by banning social media and sometimes cutting off internet access altogether. 

In Sudan, authorities have enacted a total telecommunications blackout during a massive conflict and displacement crisis. Shutting down the internet is a deliberate strategy to block the flow of information that brings visibility to the crisis and to prevent humanitarian aid from reaching populations endangered by the conflict. The communications blackout has extended for weeks, and in response the global #KeepItOn campaign has mobilized to pressure the Sudanese government to restore its people’s access to these vital services. More than 300 global humanitarian organizations have signed on to support #KeepItOn.

And in Palestine, where the Israeli government exercises near-total control over both wired internet and mobile phone infrastructure, Palestinians in Gaza have experienced repeated internet blackouts inflicted by the Israeli authorities. The latest blackout in January 2024 occurred amid a widespread crackdown by the Israeli government on digital rights—including censorship, surveillance, and arrests—and amid accusations of bias and unwarranted censorship by social media platforms. On that occasion, the internet was restored after calls from civil society and nations, including the U.S. As we’ve noted, internet shutdowns impede residents' ability to access and share resources and information, as well as the ability of residents and journalists to document and call attention to the situation on the ground—more necessary than ever given that a total of 83 journalists have been killed in the conflict so far. 

Given that all of the internet cables connecting Gaza to the outside world go through Israel, the Israeli Ministry of Communications has the ability to cut off Palestinians’ access with ease. The Ministry also allocates spectrum to cell phone companies; in 2015 we wrote about an agreement that delivered 3G to Palestinians years later than the rest of the world. In 2022, President Biden offered to upgrade the West Bank and Gaza to 4G, but the initiative stalled. While some Palestinians are able to circumvent the blackout by utilizing Israeli SIM cards (which are difficult to obtain) or Egyptian eSIMs, these workarounds are not solutions to the larger problem of blackouts, which, as the National Security Council has said, “[deprive] people from accessing lifesaving information, while also undermining first responders and other humanitarian actors’ ability to operate and to do so safely.”

Access to internet infrastructure is essential, in wartime as in peacetime. In light of these numerous blackouts, we remain concerned about the control that authorities are able to exercise over the ability of millions of people to communicate. It is imperative that people’s access to the internet remains protected, regardless of how user platforms and internet companies transform over time. We continue to say this, again and again, because it bears repeating: unfortunately, today there are ever more examples of these shutdowns happening before our eyes.




Jillian C. York

Podcast Episode: 'I Squared' Governance

1 month 4 weeks ago

Imagine a world in which the internet is first and foremost about empowering people, not big corporations and government. In that world, government does “after-action” analyses to make sure its tech regulations are working as intended, recruits experienced technologists as advisors, and enforces real accountability for intelligence and law enforcement programs.

[Audio player: https://player.simplecast.com/f16bc667-91d4-4190-9d9e-8e7cd7a64df3 | Privacy info: this embed will serve content from simplecast.com]

(You can also find this episode on the Internet Archive and on YouTube.)

Ron Wyden has spent decades working toward that world, first as a congressman and now as Oregon’s senior U.S. Senator. Long among Congress’ most tech-savvy lawmakers, he helped write the law that shaped and protects the internet as we know it, and he has fought tirelessly against warrantless surveillance of Americans’ telecommunications data. Wyden speaks with EFF’s Cindy Cohn and Jason Kelley about his “I squared”—individuals and innovation—legislative approach to foster an internet that benefits everyone. 

In this episode you’ll learn about: 

  • How a lot of the worrisome online content that critics blame on Section 230 is actually protected by the First Amendment 
  • Requiring intelligence and law enforcement agencies to get warrants before obtaining Americans’ private telecommunications data 
  • Why “foreign” is the most important word in “Foreign Intelligence Surveillance Act” 
  • Making government officials understand national security isn’t heightened by reducing privacy 
  • Protecting women from having their personal data weaponized against them 

U.S. Sen. Ron Wyden, D-OR, has served in the Senate since 1996; he was elected to his current six-year term in 2022. He chairs the Senate Finance Committee, and serves on the Energy and Natural Resources Committee, the Budget Committee, and the Select Committee on Intelligence; he also is the lead Senate Democrat on the Joint Committee on Taxation. His relentless defiance of the national security community's abuse of secrecy forced the declassification of the CIA Inspector General's 9/11 report, shut down the controversial Total Information Awareness program, and put a spotlight on both the Bush and Obama administrations’ reliance on "secret law." In 2006 he introduced the first Senate bill on net neutrality, and in 2011 he was the lone Senator to stand against the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), ultimately unsuccessful bills that purportedly were aimed at fighting online piracy but that actually would have caused significant harm to the internet. Earlier, he served from 1981 to 1996 in the House of Representatives, where he co-authored Section 230 of the Communications Decency Act of 1996—the law that protects Americans’ freedom of expression online by protecting the intermediaries we all rely on.

Resources: 

 What do you think of “How to Fix the Internet?” Share your feedback here

Transcript

SENATOR RON WYDEN
It's been all about two things, individuals and innovation. I call it “I squared,” so to speak, because those are my principles. If you kind of follow what I'm trying to do, it's about individuals, it's about innovation. And you know, government has a role to play in guardrails and ensuring that there are competitive markets. But what I really want to do is empower individuals.

CINDY COHN
That's U.S. Senator Ron Wyden of Oregon. He is a political internet pioneer. Since he was first elected to the Senate in 1996, he has fought for personal digital rights, against corporate and government censorship, and for sensible limits on government secrecy.

[THEME MUSIC BEGINS]

CINDY COHN
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I'm Jason Kelley - EFF's Activism Director. This is our podcast series, How to Fix the Internet.

CINDY COHN
The idea behind this show is that we're trying to make our digital lives better. And sometimes when we think about the lawmakers in our country, we often think of the conflict and fighting and people who just don’t get it when it comes to how digital works. But there are also some people in the legislatures who have worked to enact real progress.

JASON KELLEY
Our guest this week is one of the giants in the political fight for internet freedom for several decades now. Senator Wyden played a critical role in the passage of Section 230 — a pillar of online freedom of speech that has recently been coming under attack from many different sides. And he introduced the first Senate net neutrality bill back in 2006. He’s consistently pushed back against mass surveillance and pushed for a strong Fourth Amendment, and over the years, he has consistently fought for many of the things that we are fighting for here at EFF as well.

CINDY COHN
Our conversation takes a look back at some of the major milestones of his career, decisions that have directly impacted all of our online lives. And we talk about the challenges of getting Section 230 passed into law in the first place. But more recently, Senator Wyden also talks about why he was strongly opposed to laws like FOSTA-SESTA, which undermined the space that Section 230 creates for some online speakers, using the cover of trying to stop sex trafficking on the internet.

JASON KELLEY
But like us at EFF, Senator Wyden is focusing on the battles happening right now in Congress that could have a fundamental impact on our online lives. When he was elected in the ‘90s, the focus was on the explosion and rapid expansion of the internet. Now he’s thinking about the rapid expansion of artificial intelligence, and how we can make sure that we put the individual before the profits of corporations when it comes to AI.

CINDY COHN
Our conversation covers a lot of ground but we wanted to start with Senator Wyden’s own view of what a good tech future would look like for all of us.

SENATOR RON WYDEN
Well, it's one that empowers the individual. You know, consistently, the battles around here are between big interest groups. And what I want to do is see the individual have more power and big corporations and big government have less as it relates to communications.

CINDY COHN
Yeah. So what would that look like for an ordinary user? What kinds of things might be different?

SENATOR RON WYDEN
What we'd have, for example, is faster adoption of new products and services, with people showing greater trust in emerging technologies. We'd build on the motivations that have been behind my privacy bills: the Fourth Amendment Is Not For Sale Act, for example, Section 230, the Algorithmic Accountability Act. Cindy, in each one of these, it's been all about two things: individuals and innovation.

JASON KELLEY
I'm wondering if you're surprised by the way that things have turned out in any specific instance, you know, you had a lot of responsibility for some really important legislation for CDA 230, scaling back some NSA spying issues, helping to stop SOPA-PIPA, which are all, you know, really important to EFF and to a lot of our listeners and supporters. But I'm wondering if, you know, despite that, you've seen surprises in where we are that you didn't expect.

SENATOR RON WYDEN
I didn't expect to have so many opponents across the political spectrum for Section 230. I knew we would have some, but nothing has been the subject of more misinformation than 230. You had Donald Trump, the President of the United States, lying about Section 230 over and over again. I don't think Donald Trump would know what Section 230 was if it hit him in the head, but he was always lying about vote by mail and all those kinds of things.
And huge corporate interests like Big Cable and legacy media have bankrolled massive lobbying and PR campaigns against 230. Since they saw user-created content and the ability of regular people to be heard as a threat to their top-down model, all those big guys have been trying to invent reasons to oppose 230 that I could not have dreamed of.
So I'm not saying, I don't think Chris Cox would say it either, that the law is perfect. But when I think about it, it's really a tool for individuals, people without power, without clout, without lobbies, without big checkbooks. And, uh, you know, a lot of people come up to me and say, "Oh, if you're not in public life, 230 will finally disappear" and all this kind of thing. And I said, I think you're underestimating the power of people to really see what this was all about, which was something very new, a very great opportunity, but still based on a fundamental principle that the individual would be responsible for what they posted in this whole new medium and in the United States individual responsibility carries a lot of weight.

CINDY COHN
Oh, I so agree, and I think that one of the things that we've seen, um, with 230 but with a lot of other things now, is a kind of a correct identification of the harm and a wrong identification of what's causing it or what will solve it. So, you know, there are plenty of problems online, but, um, I think we feel, and I think it sounds like you do as well, that we're playing this funny little whack-a-mole game where whatever the problem is, somebody's sliding in to say that 230 is the reason they have that problem, when a lot of times it has to do with something, you know, not related. It could even be, in many cases, the U.S. Constitution, but also kind of misidentifying –

SENATOR RON WYDEN
Cindy, there's a great story that I sometimes tell. The New York Times one day had a big picture of Chris Cox and I, it was practically a full-length page. I'm 6'4", went to college on a basketball scholarship dreaming of playing in the NBA, and they said “these two people are responsible for all the hate information online and 230 empowered people to do it.” And we hardly ever do this, but Keith Chu, our wonderful expert on all things technology, finally touched base with him and said, "you know that if there was no 230, over 95 percent of what we see online that we really dislike — you know, misogyny, hate speech, racism — would still be out there because of the First Amendment, not 230."
And the New York Times, to its credit, printed a long, long apology essentially the next day, making the case that that was really all about the First Amendment, not 230. 230 brought added kind of features to this, particularly the capacity to moderate, which was so important in a new opportunity to communicate.

[MUSIC FADES IN]

CINDY COHN
What drives you towards building a better internet? So many people in Congress in your town don't really take the time to figure out what's going on, much less propose real solutions. They kind of, you know, we've been in this swing where they, they treated the technologies like heroes and now we're in a time when they're treating them like villains. But what drives you to, to kind of figure out what's actually going on and propose real solutions?

SENATOR RON WYDEN
I showed up, Cindy, Oregon's first new United States senator in 34 years, in 1996, the winner, and the only person who knew how to use a computer at that point was, uh, Pat Leahy, who was a great advocate of technology and, and innovation. I said, "I'm going to get into new stuff." In other words, Oregon had always been about wood products. We always will be about wood products and I will continue to champion those kinds of practices, particularly now we're working to prevent these huge fires. I also said we're going to get into new things. And my dad was a journalist and he said, "You're not doing your job if you don't ask hard questions every single day."
So what we tried to do, particularly in those first days, is kind of lay the foundation, just do the foundational principles for the internet. I mean, there's a book, Jeff Kosseff wrote “The Twenty-Six Words That Created the Internet,” but we also had internet tax policy to promote non-discrimination, so you wouldn't be treated differently online than you would be offline.
Our digital signatures law, I think, has been a fabulous, you know, addition. People used to spend hours and hours in offices, you know, kind of signing these documents that look like five phone books stacked on top of each other, and they'd be getting through it in 15, 20 minutes. So, um, to me, what I think we showed is that you could produce more genuine innovation by thinking through what was to come than just lining the pocketbooks of these big entrenched interests. Now, a big part of what we're going to have to do now with AI is go through some of those same kinds of issues. You know, I think for example, we're all in on beating China. That's important. We're all in on innovation, but we've got to make sure that we cement bedrock, you know, privacy and accountability.
And that's really what's behind the Algorithmic Accountability Act because, you know, when people were getting ripped off in terms of housing and education and the like with AI, we wanted to get them basic protection.

JASON KELLEY
It sounds like you're, you know, you're already thinking about this new thing, AI, and in 20 or more years ago, you were thinking about the new thing, which is posting online. How do we get more of your colleagues to sort of have that same impulse to be interested in tackling those hard questions that you mentioned? I think we always wonder what's missing from their views, and we just don't really know how to make them sort of wake up to the things that you get.

SENATOR RON WYDEN
What we do is particularly focus on getting experienced and knowledgeable and effective staff. I tell people I went to school on a basketball scholarship. I remember recruiting, we kind of recruit our technologists like they were all LeBron James, and kind of talking about, you know, why there were going to be opportunities here. And we have just a terrific staff now, really led by Chris Soghoian and Keith Chu.
And it's paid huge dividends, for example, when we look at some of these shady data broker issues and government surveillance. Now, with the passing of my, my friend Dianne Feinstein, one of the most senior members in the intelligence field, uh, these incredibly good staff allow me to get into these issues. Right now I'm with Senator Moran, Jerry Moran of Kansas, trying to upend the declassification system, because it basically doesn't declassify anything, and I'm not sure they could catch bad guys, and they certainly are hanging on to stuff that is irresponsible, uh, information collection about innocent people.

[SHORT MUSIC INTERLUDE]

CINDY COHN
These are all problems that, of course, we're very deep in, and we do appreciate that you've brought in, you know, our friend Chris Soghoian, who EFF's known for a long time, and other really good technologists and people who understand technology to advise you. How do we get more senators to do that too? Are there things that we could help build that would make that easier?

SENATOR RON WYDEN
I think there are, and I think we need to do more, not post-mortems, but sort of more after-action kind of analysis. For example, the vote on SESTA-FOSTA was 98 to 2. And everybody wasn't sure where the other vote was, and Rand Paul came up to me and said, "You're right, so I'm voting with you."
And, uh, the point really was, you know, everybody hated the scourge of sex trafficking and the like. I consider those people monsters. But I pointed out that all you're going to do is drive them from a place where there was transparency to the dark web, where you can't get a search engine. And people go, "Huh? Well, Ron's telling us, you know, that it's going to get worse." And then I offered an amendment to basically do what I think would have really made a difference there, which is get more prosecutors and more investigators going after bad guys. And the ultimate argument for these sort of after-action, after-legislating reviews is that everybody said, "Well, you know, you've got to have SESTA-FOSTA, or you're never going to be able to do anything about Backpage." This was this horrible place where, you know, there were real problems with respect to sex trafficking. And what happened was, Backpage was put out of business under existing law, not under SESTA-FOSTA. And when you guys have this discussion with, you know, people who are following the program, ask them when their senator or congressperson last had a press conference about SESTA-FOSTA.
I know the answer to this. I can't find a single press conference about SESTA-FOSTA, which was ballyhooed at the time as this miraculous cure for dealing with really bad guys, and the technology didn't make sense and the education didn't make sense, and the history with Backpage didn't make any sense and it's because people got all intoxicated with these, you know, ideas that somehow they were going to be doing this wondrous, you know, thing and it really made things worse.

CINDY COHN
So I'm hearing three things in the better world. One, and the one you've just mentioned, is that we actually have real accountability, that when we pass some kind of regulation, we take the time to look back and see whether it worked; that we have informed people who are helping advise or actually are the lawmakers and the regulators who understand how things, uh, really work.
And the third one is that we have a lot more accountability inside government around classification and secrecy, especially around things involving, you know, national security. And, you know, you're in this position, right, where you are read in as a member of the Intelligence Committee. So you kind of see what the rest of us don't. And I'm wondering, obviously I don't want you to reveal anything, but you know, are there, is that gap an important one that we close?

SENATOR RON WYDEN
Yeah, I mean, you know, there have been a lot of 14-to-1 votes in the Intelligence Committee over the, over the years, and, you know, I've been the one, and you know, the reality is people often get swept up in these kinds of arguments, particularly from people in government. Like, we're having a big debate about surveillance now, Section 702, and, you know, everybody's saying, "Ron, what are you talking about? You're opposing this, you know, we face all these, all these kinds of, kinds of threats," and, um, you know, what I've always said is, read the title of the bill, Foreign Intelligence Surveillance Act. That means we're worried about foreign intelligence; we're not, under that law, supposed to be sweeping up the records of vast numbers of Americans who are interconnected to those foreign individuals by virtue of the fact that communication systems have changed.
And I personally believe that smart policies ensure that you can fight terror ferociously while still protecting civil liberties, and not-so-smart policies give you less of both.

JASON KELLEY
How do we get to that balance that you're talking about, where, you know, I know a lot of people feel like we do have to have some level of surveillance to protect national security, but that balance of protecting the individual rights of people is a complicated one. And I'm wondering how you think about what that looks like for people.

SENATOR RON WYDEN
Well, for example, Zoe Lofgren, you know, Zoe has been a partner of mine on many projects. I know she's been sympathetic with all of you all, well, for many years in her service as a member from California. You know, what we said on our 702 reforms, and by the way, we had a whole bunch of Republicans, there needs to be a warrant requirement. If you're going after the personal data of Americans, there should be a warrant requirement.

Now, we were then asked, "Well, what happens if it's some kind of imminent kind of crisis?" And I said, what I've always said is that all my bills, as it relates to surveillance, have a warrant exception, which is if the government believes that there is an imminent threat to the security of our country and our people, the government can go up immediately and come back and settle the warrant matter afterwards. And at one point I was having a pretty vigorous debate with the President and his people, then-President Obama. And I said, "Mr. President, if the warrant requirement exception isn't written right, you all write it and I'm sure we'll work it out."
But I think that giving the government a wide berth to make an assessment about whether there is a real threat to the country and they're prepared to not only go up immediately to get the information, but to trust the process later on to come back and show that it was warranted. I think it's a fair balance. That's the kind of thing I'm working on right now.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
And now back to our conversation with Senator Ron Wyden and his work on privacy laws.

SENATOR RON WYDEN
Really, the first big law that I got passed involved privacy rights of Americans outside the country. So we had won a bunch of battles before that, you know, defeating John Poindexter, Total Information Awareness, and a variety of other battles.
But when I started this, trying to protect the privacy rights of Americans who are outside the United States, you would have thought that Western civilization was going to end. And this was the Bush administration. And the DNI, the head of national intelligence, talked to me. He said, "Ron, this is just going to be disastrous. It's going to be horrible."
And I walked him through who we were talking about. And I said, the biggest group of people we're talking about are men and women who wear the uniform in the United States because they are outside the United States. You can't possibly be telling me, Director McConnell, it was Director McConnell at that time, that they shouldn't have privacy rights. And then things kind of moved and I kept working with them and they still said that this was going to be a tremendous threat and all the rest. They were going to veto it. They actually put out a statement about there would be a veto message. So I worked with them a little bit more and we worked it out. And when we were done, the Bush administration put out something, and we are proud to say that we are protecting the privacy rights of Americans outside the United States.
So, if you can just take enough time and be persistent enough, you can get things done. And now, we actually have elected officials and presidents of both political parties all taking credit for the privacy rights of people outside the United States.

[MUSIC STING COMES IN TO INTRO CLIP]

SENATOR RON WYDEN ON CSPAN
A yes or no answer to the question, does the NSA collect any type of data at all on millions or hundreds of millions of Americans?

JAMES CLAPPER ON CSPAN
No sir.

SENATOR RON WYDEN ON CSPAN
It does not.

JAMES CLAPPER ON CSPAN
Not wittingly. There are cases where they could inadvertently, perhaps, collect, but not, not wittingly.

CINDY COHN
That's a clip from CSPAN, a pretty famous interaction you had with James Clapper in 2013. But I think the thing that really shines through with you is your ability to walk this fine line — you're very respectful of the system, even in an instance like this where someone is lying under oath right in your face, you know you have to work within the system to make change. How do you navigate that in the face of lies and misdirection?

SENATOR RON WYDEN
Well, you have to take the time to really tee it up, and I really credit John Dickus of Oregon, our staffer at the time, did a phenomenal job. He spent about six months teeing that question up for Mr. Clapper and what happened is his deputy — Mr. Clapper's deputy, Keith Alexander — had been telling what my 11-year-old daughter — my wife and I are older parents — we have this 11-year-old. She said, "Dad, that was a big whopper. That guy told a big whopper." Keith Alexander told a bunch of whoppers. And then Mr. Clapper did. And this had all been done in public. And so we asked for answers. He wouldn't give any answers. Then he came to the one, um, you know, open-threat hearing that we have each year. And we prepare for those open threat hearings like there is no tomorrow, because you don't get very many opportunities to have a chance to ask, you know, the important questions. And so John Dickus sent to Mr. Clapper, he sent him the question a day in advance, so that nobody could say that they hadn't gotten it, and it's an informal rule in the Intelligence Committee that if an official feels that they can't answer, they just say, "I can't answer, I have to do it in private." I wouldn't have liked that answer. But I would have respected it and tried to figure out some other way, but James Clapper got the question, looked at the camera, looked at me, and just lied and persisted in coming up — he had like five or six excuses for how he wasn't lying. And I think as the country found out what was going on, it was a big part of our product to produce the next round of laws that provided some scrutiny over the Patriot Act.

CINDY COHN
I think that's a really important kind of insight, right? Which is the thing that led to people being upset about the kind of massive surveillance and understanding it was kind of the lie, right? Like if there was more transparency on the part of the national security people and they didn't just tell themselves that they have to lie to all the rest of us, you know, in order to keep us safe, which I think is a very, very dangerous story in a democracy, we might end up in a much more reasonable place for everyone about privacy and security. And I actually don't think it's a balance. I think that you only get security if you have privacy, rather than the two having to be traded off against each other, and –

SENATOR RON WYDEN
You're a Ben Franklin person, Cindy. Anybody who gives up liberty to have security doesn't deserve either.

CINDY COHN
Well, I think that that's kind of right, but I also think that, you know, history has shown that the intense, overbroad secrecy actually doesn't make us safer. And I think this goes back to your point about accountability, where we really do need to look back and say: these things that have been embraced as allegedly making us safer, are they actually making us safer, or are we better off having a different role for secrecy? Not that there's no role, but the one we have now is kind of an all-purpose excuse: no matter what the government does, it just uses the secrecy argument to make sure that the American people can't find out, so that we don't, you know, evaluate whether things are working or not.
I just don't think that the, you know, my experience watching these things, and I don't know about yours, is that the overblown secrecy isn't actually making us safer.

[SHORT MUSIC INTERLUDE]

JASON KELLEY
Before we wrap up, we wanted to get a sense from you of what issues you see coming in the next three years or so that we're going to need to be thinking about to be ahead of the game. What's at the top of your mind looking forward?

SENATOR RON WYDEN
The Dobbs decision overturning Roe v. Wade is going to have huge ripple effects through our society. I believe, you know, women are already having their personal information weaponized against them. And you're seeing it in states with, you know, MAGA attorneys general, but you're also seeing it – we did a big investigation of pharmacies. And pharmacies are giving out women's personal information hither and yon. And, you know, we're very much committed to getting privacy rights here. And I also want to congratulate EFF on your Who's Got Your Back report, because you really are touching on these same kinds of issues, and I think getting a warrant ought to be really important.
And the other one I mentioned is, uh, fighting government censorship. And I would put that both at home and abroad. It's no secret that China, Russia, and India want to control what people can say and read, but, you know, if you look at some of what we're seeing in this country, like the U.S. trade representative taking a big step backwards in terms of access to information, we're going to have to deal with that here in our country too.

CINDY COHN
Oh, those are wonderful and scary, but wonderful and important things. I really appreciate you taking the time to talk to us. It's always such a pleasure and we are huge fans of the work that you've done, and thank you so much for carrying, you know, the “I squared” banner: individuals and innovation. Those are two values close to our hearts here at EFF, and we really appreciate having you in Congress championing them as well.

SENATOR RON WYDEN
I don't want to make this a bouquet-tossing contest, but we've had a lot of opportunities to work, work together and, you know, EFF is part of the Steppin' Up Caucus and, uh, really appreciate it and, uh, let's put this in "to be continued," okay?

CINDY COHN
Terrific.

SENATOR RON WYDEN
Thanks, guys.

CINDY COHN
I really could talk with Senator Wyden all day and specifically talk with him about national security all day, but what a great conversation. And it's so refreshing to have somebody who's experienced in Congress who really is focusing on two of the most important things that EFF focuses on as well. I love the framing of I squared, right? Individuals and innovation as the kind of centerpiece of a better world.

JASON KELLEY
Yeah. And you know, he's not just saying it, it's clear from his bills and his work over the years that he really does center those things. Innovation and individuals are really the core of things like Section 230 and many other pieces of legislation that he's worked on, which, it's just really nice and refreshing to hear someone who has a really strong ethos in the Senate and has the background to show that he means it.

CINDY COHN
Yeah, and you know, sometimes we disagree with Senator Wyden, but it's always refreshing to feel like, well, we're all trying to point in the same direction. We sometimes have disagreements about how to get there.

JASON KELLEY
Yeah. And one of the great things about working with him is that, you know, he and his staff are tech-savvy, so our disagreements are often pretty nuanced, at least from what I can remember. You know, we aren't having disagreements about what a technology is or something like that very often. I think we're usually having really good conversations with his folks, because he has one of the most tech-savvy staffs in the Senate, and he's helped really make the Senate more tech-savvy overall.

CINDY COHN
Yeah, I think that this is one of these pieces of a better internet that, that feels kind of indirect, but is actually really important, which is making sure that our lawmakers - you know, they don't all have to be technologists. We have a couple technologists in Congress now, but they really have to be informed by people who understand how technology works.
And I think one of the things that's important when we show up a lot of the times is really, you know, having a clear ability to explain to the people, you know, whether it's the congressional people themselves or their staff, like how things really work and having that kind of expertise in house is, I think, something that's going to be really important if we're going to get to a better internet.

JASON KELLEY
Yeah. And it's clear that we still have work to do. You know, he brought up SESTA-FOSTA, and that's an instance where, you know, he understands and his staff understands that that was a bad bill, but it was still, as he said, you know, 98-2 when it came to the vote. And ultimately that was a tech bill. And I think if we had even more sort of tech-savvy folks, we wouldn't have had such a fight with that bill.

CINDY COHN
And I think that he also pointed to something really important, which was this idea of after analysis, after-action thinking, and looking back and saying, "Well, we passed this thing, did it do what we had hoped it would do?" as a way to really have a process where we can do error correction. And I noted that, you know, Ro Khanna and Elizabeth Warren, along with Senator Wyden, have actually floated a bill to have an investigation into FOSTA-SESTA, which, you know, for those who don't know the shorthand, was a way that Section 230 protection was cut back. And the idea was that it could help stop sex trafficking. Well, all the data that we've seen so far is that it did not do that, and in some ways made sex trafficking, you know, in the offline environment more dangerous. But having Congress actually step in and sponsor the research to figure out whether the bill that Congress passed did the thing that they said is, I think, just a critical piece of how we decide what we're going to do in order to protect individuals and innovation online.

JASON KELLEY
Yeah. For me, you know, it's actually tied to something that I know a lot of tech teams do which is like a sort of post-mortem. You know, after something happens, you really do need to investigate how we got there, what worked and what didn't, but in this case we all know, at least at EFF, that this was a bad bill.

CINDY COHN
Yeah, I mean, sometimes it might be just taking what we know anecdotally and turning it into something that Congress can more easily see and digest. Um, I think the other thing, it's just impossible to talk with or about Senator Wyden without talking about national security because he has just been heroic in his efforts to try to make sure that we don't trade privacy off for security. And that we recognize that these two things are linked and that by lifting up privacy, we're lifting up national security.
And by reducing privacy, we're not actually making ourselves safer. And he really has done more for this. And I think what was heartening about this conversation was that, you know, he talked about how he convinced national security hawks to support something that stood with privacy, this story about kind of really talking about how most of the Americans abroad are affiliated in one way or another with the U.S. military, people who are stationed abroad and their families, and how standing up for their privacy and framing it that way, you know, ultimately led to some success for this. Now, we've got a long ways to go, and I think he'd be the first one to agree. But the kind of doggedness and willingness to be in there for the long haul and talk to the national security folks about how, how these two values support each other is something that he has really proven that he's willing to do and it's so important.

JASON KELLEY
Yeah, that's exactly right, I think, as well. And it's also terrific that he's looking to the future, you know, we do know that he's thinking about these things, you know, 702 has been an issue for a long time and he's still focused on it, but what did you think of his thoughts about what our coming challenges are — things like how to deal with data in in a post-Dobbs world, for example?

CINDY COHN
Oh, I think he's right on, right on it. He's recognizing, I think as a lot of people have, that the Dobbs decision, overturning Roe v. Wade has really made it clear to a lot of people how vulnerable we are, based upon the data that we have to leave behind in what we do every day. Now you can do things to try to protect them, but there's only so much we can do right now without changes in the law and changes in the way things go because you know, your phone needs to know where you are in order to ring when somebody calls you or ping when somebody texts you.
So we need legal answers and he's correct that this is really coming into the fore right now. I think he's also thinking about the challenges that artificial intelligence are bringing. So I really appreciate that he's already thinking about how we fix the internet, you know, in the coming years, not just right now.

JASON KELLEY
I'm really glad we had this bouquet-throwing contest, I think was what he called it. Something like that. But yeah, I think it's great to have an ally and have them be in the Senate and I know he feels the same way about us.

CINDY COHN
Oh, absolutely. I mean, you know, part of the way we get to a better internet is to recognize the people who are doing the right thing. And so, you know, we spend a lot of time at EFF throwing rocks at the people who are doing the wrong thing. And that's really important too. But occasionally, you know, we get to throw some bouquets to the people who are fighting the good fight.

[THEME MUSIC FADES IN]

JASON KELLEY

Thanks for joining us for this episode of How To Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch and just see what's happening in digital rights this week and every week.
We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators.
In this episode you heard Kalte Ohren by Alex and Drops of H2O (The Filtered Water Treatment) by J. Lang.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis
How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
We’ll talk to you again soon.
I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Josh Richman

EFF to Ninth Circuit: There’s No Software Exception to Traditional Copyright Limits

1 month 4 weeks ago

Copyright’s reach is already far too broad, and courts have no business expanding it any further, particularly where that expansion would undermine adversarial interoperability. Unfortunately, a federal district court did just that in the latest iteration of Oracle v. Rimini, concluding that software Rimini developed was a “derivative work” because it was intended to interoperate with Oracle's software, even though Rimini’s update didn’t use any of Oracle’s copyrightable code.

That’s a dangerous precedent. If a work is derivative, it may infringe the copyright in the preexisting work from which it, well, derives. For decades, software developers have relied, correctly, on the settled view that a work is not derivative under copyright law unless it is “substantially similar” to a preexisting work in both ideas and expression. Thanks to that rule, software developers can build innovative new tools that interact with preexisting works, including tools that improve privacy and security, without fear that the companies that hold rights in those preexisting works would have an automatic copyright claim to those innovations.

That’s why EFF, along with a diverse group of stakeholders representing consumers, small businesses, software developers, security researchers, and the independent repair community, filed an amicus brief in the Ninth Circuit Court of Appeals explaining that the district court ruling is not just bad policy, it’s also bad law.  Court after court has confronted the challenging problem of applying copyright to functional software, and until now none have found that the copyright monopoly extends to interoperable software absent substantial similarity. In other words, there is no “software exception” to the definition of derivative works, and the Ninth Circuit should reject any effort to create one.

The district court’s holding relied heavily on an erroneous interpretation of a 1998 case, Micro Star v. FormGen. In that case, the plaintiff, FormGen, published a video game following the adventures of action hero Duke Nukem. The game included a software tool that allowed players themselves to build new levels for the game and share them with others. Micro Star downloaded hundreds of those user-created files and sold them as a collection. When FormGen sued for copyright infringement, Micro Star argued that because the user files didn’t contain art or code from the FormGen game, they were not derivative works.

The Ninth Circuit Court of Appeals ruled against Micro Star, explaining that:

[t]he work that Micro Star infringes is the [Duke Nukem] story itself—a beefy commando type named Duke who wanders around post-Apocalypse Los Angeles, shooting Pig Cops with a gun, lobbing hand grenades, searching for medkits and steroids, using a jetpack to leap over obstacles, blowing up gas tanks, avoiding radioactive slime. A copyright owner holds the right to create sequels and the stories told in the [user files] are surely sequels, telling new (though somewhat repetitive) tales of Duke’s fabulous adventures.

Thus, the user files were “substantially similar” because they functioned as sequels to the video game itself—specifically the story and principal character of the game. If the user files had told a different story, with different characters, they would not be derivative works. For example, a company offering a Lord of the Rings game might include tools allowing a user to create their own character from scratch. If the user used the tool to create a hobbit, that character might be considered a derivative work. A unique character that was simply a 21st century human in jeans and a t-shirt, not so much.

Still, even confined to its facts, Micro Star stretched the definition of derivative work. By misapplying Micro Star to purely functional works that do not incorporate any protectable expression, however, the district court rewrote the definition altogether. If the court’s analysis were correct, rightsholders would suddenly have a new default veto right in all kinds of works that are intended to “interact and be useable with” their software. Unfortunately, they are all too likely to use that right to threaten add-on innovation, security, and repair.

Defenders of the district court’s approach might argue that interoperable software will often be protected by fair use. As copyrightable software is found in everything from phones to refrigerators, fair use is an essential safeguard for the development of interoperable tools, where those tools might indeed qualify as derivative works. But many developers cannot afford to litigate the question, and they should not have to just because one federal court misread a decades-old case.

Corynne McSherry

EFF’s Submission to Ofcom’s Consultation on Illegal Harms

1 month 4 weeks ago

More than four years after it was first introduced, the Online Safety Act (OSA) was passed by the U.K. Parliament in September 2023. The Act seeks to make the U.K. “the safest place” in the world to be online and provides Ofcom, the country’s communications regulator, with the power to enforce this.

EFF has opposed the Online Safety Act since it was first introduced. It will lead to a more censored, locked-down internet for British users. The Act empowers the U.K. government to undermine not just the privacy and security of U.K. residents, but internet users worldwide. We joined civil society organizations, security experts, and tech companies to unequivocally ask for the removal of clauses that require online platforms to use government-approved software to scan for illegal content. 

Under the Online Safety Act, websites and apps that host content deemed “harmful” to minors will face heavy penalties. The problem, of course, is that views vary on what type of content is “harmful,” in the U.K. as in all other societies. Soon, U.K. government censors will make that decision. 

The Act also requires mandatory age verification, which undermines the free expression of both adults and minors. 

Ofcom recently published the first of four major consultations seeking information on how internet and search services should approach their new duties on illegal content. While we continue to oppose the concept of the Act, we are continuing to engage with Ofcom to limit the damage to our most fundamental rights online. 

EFF recently submitted information to the consultation, reaffirming our call on policymakers in the U.K. to protect speech and privacy online. 

Encryption 

For years, we opposed a clause in the then-Online Safety Bill allowing Ofcom to serve a notice requiring tech companies to scan their users–all of them–for child abuse content. We are pleased to see that Ofcom’s recent statements note that the Online Safety Act will not apply to end-to-end encrypted messages. Encryption backdoors of any kind are incompatible with privacy and human rights.

However, there are places in Ofcom’s documentation where this commitment can and should be clearer. In our submission, we affirmed the importance of ensuring that people can use and benefit from encryption regardless of the size and type of the online service. The commitment not to scan encrypted data must be firm, no matter the size of the service or which encrypted services it provides. For instance, Ofcom has suggested that “file-storage and file-sharing” services may be subject to a different risk profile when it comes to mandated scanning. But encrypted “communications” are not significantly different from encrypted “file-storage and file-sharing.”

In this context, Ofcom should also take note of the milestone judgment in PODCHASOV v. RUSSIA (Application no. 33696/19), in which the European Court of Human Rights (ECtHR) ruled that weakening encryption can lead to general and indiscriminate surveillance of the communications of all users, and violates the human right to privacy.

Content Moderation

An earlier version of the Online Safety Bill enabled the U.K. government to directly silence user speech and imprison those who publish messages it doesn’t like. It also empowered Ofcom to levy heavy fines or even block access to sites that offend people. We were happy to see this clause removed from the bill in 2022. But a lot of problems with the OSA remain. Our submission on illegal harms affirmed the importance of ensuring that users: have greater control over what content they see and interact with; are equipped with knowledge about how various controls operate and how to use them to their advantage; and have the right to anonymity and pseudonymity online.

Moderation mechanisms must not interfere with users’ freedom of expression, and moderators should receive ample training and materials to ensure cultural and linguistic competence in content moderation. When moderators are placed under time pressure to make determinations, companies often remove more than necessary to avoid potential liability, and are incentivized to use automated content-removal technologies and upload filters. These are notoriously inaccurate and prone to overblocking legitimate material. Moreover, the moderation of terrorism-related content is prone to error, and any new mechanism such as hash matching or URL detection must be subject to expert oversight.

Next Steps

Throughout this consultation period, EFF will continue contributing to and monitoring Ofcom’s drafting of the regulation. And we will continue to hold the U.K. government accountable to the international and European human rights protections to which they are signatories.

Read EFF's full submission to Ofcom

Paige Collings

The Foilies 2024

1 month 4 weeks ago
Recognizing the worst in government transparency.

The Foilies are co-written by EFF and MuckRock and published in alternative newspapers around the country through a partnership with the Association of Alternative Newsmedia.

We're taught in school about checks and balances between the various branches of government, but those lessons tend to leave out the role that civilians play in holding officials accountable. We're not just talking about the ballot box, but the everyday power we all have to demand government agencies make their records and data available to public scrutiny.

At every level of government in the United States (and often in other countries), there are laws that empower the public to file requests for public records. They go by various names—Freedom of Information, Right-to-Know, Open Records, or even Sunshine laws—but all share the general concept that because the government is of the people, its documents belong to the people. You don't need to be a lawyer or journalist to file these; you just have to care.

It's easy to feel powerless in these times, as local newsrooms close and elected officials embrace disinformation as a standard political tool. But here's what you can do, and we promise it'll make you feel better: Pick a local agency—it could be a city council, a sheriff's office or state department of natural resources—and send them an email demanding their public record-request log, or any other record showing what requests they've received, how long it took them to respond, whether they turned over records, and how much they charged the requester for copies. Many agencies even have an online portal that makes it easier, or you can use MuckRock’s records request tool. (You can also explore other people's results that have been published on MuckRock's FOIA Log Explorer.) That will send the message to local leaders that they're on notice. You may even uncover an egregious pattern of ignoring or willfully violating the law.

The Foilies are our attempt to call out these violations each year during Sunshine Week, an annual event (March 10-16 this year) when advocacy groups, news organizations and citizen watchdogs combine efforts to highlight the importance of government transparency laws. The Electronic Frontier Foundation and MuckRock, in partnership with the Association of Alternative Newsmedia, compile the year's worst and most ridiculous responses to public records requests and other attempts to thwart public access to information, including through increasing attempts to gut the laws guaranteeing this access—and we issue these agencies and officials tongue-in-cheek "awards" for their failures.

Sometimes, these awards actually make a difference. Last year, Mendocino County in California repealed its policy of charging illegal public records fees after local journalists and activists used The Foilies’ "The Transparency Tax Award" in their advocacy against the rule.

This year marks our 10th annual accounting of ridiculous redactions, outrageous copying fees, and retaliatory attacks on requesters—and we have some doozies for the ages.

The "Winners" The Not-So-Magic Word Award: Augusta County Sheriff’s Office, Va.

Public records laws exist in no small part because corruption, inefficiency and other malfeasance happen, regardless of the size of the government. The public’s right to hold these entities accountable through transparency can prevent waste and fraud.

Of course, this kind of oversight can be very inconvenient to those who would like a bit of secrecy. Employees in Virginia’s Augusta County thought they’d found a neat trick for foiling Virginia's Freedom of Information Act.

Consider: “NO FOIA”

In an attempt to keep a batch of emails hidden from the public eye, employees in Augusta County began tagging their messages with “NO FOIA,” an apparent incantation staff believed could ward off transparency. Of course, there are no magical words that allow officials to evade transparency laws; the laws presume all government records are public, so agencies can’t simply declare that they don’t want records released.

Fortunately, at least one county employee thought that breaking the law must be a little more complicated than that, and this person went to Breaking Through News to blow the whistle.

Breaking Through News sent a FOIA request for those “NO FOIA” emails. The outlet received just 140 of the 1,212 emails that the county indicated were responsive, and the released records highlighted the county’s highly suspect approach to withholding public records. Among the released records were materials like the wages of Sheriff’s Office employees (clearly a public record), the overtime rates (clearly a public record) and a letter from the sheriff deriding the competitive wages being offered at other county departments (embarrassing, but still clearly a public record).

Other clearly public records, according to a local court, included recordings of executive sessions that the commissioners had entered illegally, which Breaking Through News learned about through the released records. The outlet teamed up with the Augusta Free Press to sue for access to the recordings, a suit they won last month. They still haven’t received the awarded records, and it’s possible that Augusta County will appeal. Still, thanks to the efforts of local journalists, the misguided attempt to conjure a culture of “No FOIA” in Augusta County actually brought the county more scrutiny and accountability.

The Poop and Pasta Award: Richlands, Va.

Government officials retaliated against a public records requester by filling her mailbox with noodles.

In 2020, Laura Mollo of Richlands, Va., discovered that the county 911 center could not dispatch Richlands residents’ emergency calls: While the center dispatched all other county 911 calls, calls from Richlands had to be transferred to the Richlands Police Department to be handled. After the Richlands Town Council dismissed Mollo’s concerns, she began requesting records under the Virginia Freedom of Information Act. The records showed that Richlands residents faced lengthy delays in connecting with local emergency services. On one call, a woman pleaded for help for her husband, only to be told that county dispatch couldn’t do anything—and her husband died during the delay. Other records Mollo obtained showed that Richlands appeared to be misusing its resources.

You would hope that public officials would be grateful that Mollo uncovered the town’s inadequate emergency response system and budget mismanagement. Well, not exactly: Mollo endured a campaign of intimidation and harassment for holding the government accountable. Mollo describes how her mailbox was stuffed with cow manure on one occasion, and spaghetti on another (which Mollo understood to be an insult to her husband’s Italian heritage). A town contractor harassed her at her home; police pulled her over; and Richlands officials even had a special prosecutor investigate her.

But this story has a happy ending: In November 2022, Mollo was elected to the Richlands Town Council. The records she uncovered led Richlands to change over to the county 911 center, which now dispatches Richlands residents’ calls. And in 2023, the Virginia Coalition for Open Government recognized Mollo by awarding her the Laurence E. Richardson Citizen Award for Open Government. Mollo’s recognition is well-deserved. Our communities are indebted to people like her who vindicate our right to public records, especially when they face such inexcusable harassment for their efforts.

The Error 404 Transparency Not Found Award: FOIAonline

In 2012, FOIAonline was launched with much fanfare as a way to bring federal transparency into the late 20th century. No longer would requesters have to mail or fax requests. Instead, FOIAonline was a consolidated starting point, managed by the Environmental Protection Agency (EPA), that let you file Freedom of Information Act requests with numerous federal entities from within a single digital interface.

Even better, the results of requests would be available online, meaning that if someone else asked for interesting information, it would be available to everyone, potentially reducing the number of duplicate requests. It was a good idea—but it was marred from the beginning by uneven uptake, agency infighting, and inscrutable design decisions that created endless headaches. In its latter years, FOIAonline would go down for days or weeks at a time without explanation. The portal saw agency after agency ditch the platform in favor of either homegrown solutions or third-party vendors.

Last year, the EPA announced that the grand experiment was being shuttered, leaving thousands of requesters uncertain about how and where to follow up on their open requests, and unceremoniously deleting millions of documents from public access without any indication of whether they would be made available again.

In a very on-brand twist of the knife, the decision to sunset FOIAonline was actually made two years prior, after an EPA office reported in a presentation that the service was likely to enter a “financial death spiral” of rising costs and reduced agency usage. Meanwhile, civil-society organizations such as MuckRock, the Project on Government Oversight, and the Internet Archive have worked to resuscitate and make available at least some of the documents the site used to host.

The Literary Judicial Thrashing of the Year Award: Pennridge, Penn., School District

Sometimes when you're caught breaking the law, the judge will throw the book at you. In the case of Pennridge School District in Bucks County, Penn., Judge Jordan B. Yeager catapulted an entire shelf of banned books at administrators for violating the state's Right-to-Know Law.

The case began with Darren Laustsen, a local parent alarmed by a new policy restricting access to books that deal with “sexualized content,” seemingly in lockstep with book-censorship efforts around the country. Searching the school library's catalog, he came across a strange trend: Certain controversial books that appeared on other challenged-book lists had been checked out for a year or more. Since students are only allowed to check out books for a week, he (correctly) suspected that library staff were checking them out themselves to block access.

So he filed a public records request for all books checked out by non-students. Now, it's generally important for library patrons to have their privacy protected when it comes to the books they read—but it's a different story if public employees are checking out books as part of their official duties and effectively enabling censorship. The district withheld the records, provided incomplete information, and even went so far as to return books and re-check them out under a student's account in order to obscure the truth. And so Laustsen sued.

The judge issued a scathing and literarily robust ruling: “In short, the district altered the records that were the subject of the request, thwarted public access to public information, and effectuated a cover-up of faculty, administrators, and other non-students’ removal of books from Pennridge High School’s library shelves." The opinion was peppered with witty quotes from historically banned books, including Nineteen Eighty-Four, Alice in Wonderland, The Art of Racing in the Rain and To Kill a Mockingbird. After enumerating the district's claims that later proved to be inaccurate, he cited Kurt Vonnegut's infamous catchphrase from Slaughterhouse-Five: "So it goes."

The Photographic Recall Award: Los Angeles Police Department

Police agencies seem to love nothing more than trumpeting an arrest with an accompanying mugshot—but when the tables are turned, and it’s the cops’ headshots being disclosed, they seem to lose their minds and all sense of the First Amendment.

This unconstitutional escapade began (and is still going) after a reporter and police watchdog published headshots of Los Angeles Police Department officers, which they lawfully obtained via a public records lawsuit. LAPD cops and their union were furious. The city then sued the reporter, Ben Camacho, and the Stop LAPD Spying Coalition, demanding that they remove the headshots from the internet and return the records to LAPD.

You read that right: After a settlement in a public records lawsuit required the city to disclose the headshots, officials turned around and sued the requester for, uh, disclosing those same records, because the city claimed it accidentally released pictures of undercover cops.

But it gets worse: Last fall, a trial court denied a motion to throw out the city’s case seeking to claw back the images; Camacho and the coalition have appealed that decision and have not taken the images offline. And in February, the LAPD sought to hold Camacho and the coalition liable for damages it may face in a separate lawsuit brought against it by hundreds of police officers whose headshots were disclosed.

We’re short on space, but we’ll try to explain the myriad ways in which all of the above is flagrantly unconstitutional: The First Amendment protects Camacho and the coalition’s ability to publish public records they lawfully obtained, prohibits courts from entering prior restraints that stop protected speech, and limits the LAPD’s ability to make them pay for any mistakes the city made in disclosing the headshots. Los Angeles officials should be ashamed of themselves—but their conduct shows that they apparently have no shame.

The Cops Anonymous Award: Chesterfield County Police Department, Va.

The Chesterfield County Police Department in Virginia refused to disclose the names of hundreds of police officers to a public records requester on this theory: Because the cops might at some point go undercover, the public could never learn their identities. It’s not at all dystopian to claim that a public law enforcement agency needs to have secret police!

Other police agencies throughout the state seem to deploy similar secrecy tactics, too.

The Keep Your Opinions to Yourself Award: Indiana Attorney General Todd Rokita

In March 2023, Indiana Attorney General Todd Rokita sent a letter to medical providers across the state demanding information about the types of gender-affirming care they may provide to young Hoosiers. But this was no unbiased probe: Rokita made his position very clear when he publicly blasted these health services as “the sterilization of vulnerable children” that “could legitimately be considered child abuse.” He made claims to the media that the clinics’ main goals weren’t to support vulnerable youth, but to rake in cash.

Yet as loud as he was about his views in the press, Rokita was suddenly tight-lipped once the nonprofit organization American Oversight filed a public records request asking for all the research, analyses and other documentation that he used to support his claims. Although his agency located 85 documents relevant to the request, Rokita refused to release a single page, citing a legal exception that allows him to withhold deliberative documents that are “expressions of opinion or are of a speculative nature.”

Perhaps if Rokita’s opinions on gender-affirming care weren't based on facts, he should've kept those opinions and speculations to himself in the first place.

The Failed Sunshine State Award: Florida Gov. Ron DeSantis

Florida’s Sunshine Law is known as one of the strongest in the nation, but Gov. Ron DeSantis spent much of 2023 working, pretty successfully, to undermine its superlative status with a slew of bills designed to weaken public transparency and journalism.

In March, DeSantis was happy to sign a bill withholding all records related to travel by the governor and a whole cast of characters. The law went into effect just over a week before the governor announced his presidential bid. In addition, DeSantis has asserted his “executive privilege” to block the release of public records, a move that, according to experts like media law professor Catherine Cameron, is unprecedented in Florida’s history of transparency.

DeSantis suspended his presidential campaign in January. That may affect how many trips he’ll be taking out-of-state in the coming months, but it won’t undo the damage of his Sunshine-slashing policies.

Multiple active lawsuits are challenging DeSantis over his handling of Sunshine Law requests. In one, The Washington Post is challenging the constitutionality of withholding the governor’s travel records. In that case, a Florida Department of Law Enforcement official last month claimed the governor had delayed the release of his travel records. Nonprofit watchdog group American Oversight filed a lawsuit in February, challenging “the unjustified and unlawful delay” in responding to requests, citing a dozen records requests to the governor’s office that have been pending for one to three years.

“It’s stunning, the amount of material that has been taken off the table from a state that many have considered to be the most transparent,” Michael Barfield, director of public access for the Florida Center for Government Accountability (FCGA), told NBC News. The FCGA is now suing the governor’s office for records on flights of migrants to Massachusetts. “We’ve quickly become one of the least transparent in the space of four years.”

The Self-Serving Special Session Award: Arkansas Gov. Sarah Huckabee Sanders

By design, FOIA laws exist to help the people who pay taxes hold the people who spend those taxes accountable. In Arkansas, as in many states, taxpayer money funds most government functions: daily office operations, schools, travel, dinners, security, etc. As Arkansas’ governor, Sarah Huckabee Sanders has flown all over the country, accompanied by members of her family and the Arkansas State Police. For the ASP alone, the people of Arkansas paid $1.4 million in the last half of last year.

Last year, Sanders seemed to tire of the scrutiny being paid to her office and her spending. Citing her family’s safety, she tried to shut down any attempt to see her travel records, taking the unusual step of calling a special session of the state Legislature to protect herself from the menace of transparency.

Notably, the governor had also recently been implicated in an Arkansas Freedom of Information Act case for these kinds of records.

The attempt to gut the law included a laundry list of carve-outs unrelated to safety, such as walking back the ability of public-records plaintiffs to recover attorney's fees when they win their case. Other attempts to scale back Arkansas' FOIA earlier in the year had not passed, and the state attorney general’s office was already working to study what improvements could be made to the law.  

Fortunately, the people of Arkansas came out to support the principle of government transparency, even as their governor decided she shouldn’t need to deal with it anymore. Over a tense few days, dozens of Arkansans lined up to testify in defense of the state FOIA and the value of holding elected officials, like Sanders, accountable to the people.

By the time the session wound down, the bill had gone through multiple revisions. The sponsors walked back most of the extreme asks and added a requirement for the Arkansas State Police to provide quarterly reports on some of the governor’s travel costs. However, other details of that travel, like companions and the size of the security team, ultimately became exempt. Sanders managed to twist the whole fiasco into a win, though it would be a great surprise if the Legislature didn’t reconvene this year with some fresh attempts to take a bite out of FOIA.

While such a blatant attempt to bash public transparency is certainly a loser move, it clearly earns Sanders a win in The Foilies—and the distinction of being one of the least transparent government officials this year.

The Doobie-ous Redaction Award: U.S. Department of Health and Human Services and Drug Enforcement Administration

The feds heavily redacted an email about reclassifying cannabis from a Schedule I to a Schedule III substance.

Bloomberg reporters got a major scoop when they wrote about a Health and Human Services memo detailing how health officials were considering major changes to the federal restrictions on marijuana, recommending reclassifying it from a Schedule I substance to Schedule III.

Currently, the Schedule I classification for marijuana puts it in the same league as heroin and LSD, while Schedule III classification would indicate lower potential for harm and addiction along with valid medical applications.

Since Bloomberg viewed but didn’t publish the memo itself, reporters from the Cannabis Business Times filed a FOIA request to get the document into the public record. Their request was met with limited success: HHS provided a copy of the letter, but redacted virtually the entire document besides the salutation and contact information. When pressed further by CBT reporters, the DEA and HHS would only confirm what the redacted documents had already revealed—virtually nothing.

HHS handed over the full, 250-page review several months later, after a lawsuit was filed by an attorney in Texas. The crucial information the agencies had fought so hard to protect: “Based on my review of the evidence and the FDA’s recommendation, it is my recommendation as the Assistant Secretary for Health that marijuana should be placed in Schedule III of the CSA.”

The “Clearly Releasable,” Clearly Nonsense Award: U.S. Air Force

Increasingly, federal and state government agencies require public records requesters to submit their requests through online portals. It’s not uncommon for these portals to be quite lacking. For example, some portals fail to provide space to include information crucial to requests.

But the Air Force deserves special recognition for the changes it made to its submission portal, which asked requesters if they would agree to limit their requests to information that the Air Force deemed "clearly releasable.” You might think, “surely the Air Force defined this vague ‘clearly releasable’ information.” Alas, you’d be wrong: The form stated only that requesters would “agree to accept any information that will be withheld in compliance with the principles of FOIA exemptions as a full release.” In other words, the Air Force asked requesters to give up the fight over information before it even began, and to accept the Air Force's redactions and rejections as non-negotiable.

Following criticism, the Air Force jettisoned the update to its portal, undoing these changes. Moving forward, it's "clear" that it should aim higher when it comes to transparency.

The Scrubbed Scrubs Award: Ontario Ministry of Health, Canada

Upon taking office in 2018, Ontario Premier Doug Ford was determined to shake up the Canadian province’s healthcare system. His administration has been a bit more tight-lipped, however, about the results of that invasive procedure. Under Ford, Ontario’s Ministry of Health is fighting the release of information on how understaffed the province’s medical system is, citing “economic and other interests.” The government’s own report, partially released to Global News, details high attrition as well as “chronic shortages” of nurses.

The reporters’ attempts to find out exactly how understaffed the system is, however, were met with black-bar redactions. The government claims that releasing the information would negatively impact “negotiating contracts with health-care workers.” However, the refusal to release the information hasn’t helped solve the problem; instead, it’s left the public in the dark about the extent of the issue and what it would actually cost to address it.

Global News has appealed the withholdings. That process has dragged on for over a year, but a decision is expected soon.

The Judicial Blindfold Award: Mississippi Justice Courts

Courts are usually transparent by default. People can walk in to watch hearings and trials, and can get access to court records online or at the court clerk’s office. And there are often court rules or state laws that ensure courts are public.

Apparently, the majority of Mississippi Justice Courts don’t feel like following those rules. An investigation by ProPublica and the Northeast Mississippi Daily Journal found that nearly two-thirds of these county-level courts obstructed public access to basic information about law enforcement’s execution of search warrants. This blockade not only appeared to violate state rules on court access; it frustrated the public’s ability to scrutinize when police officers raid someone’s home without knocking and announcing themselves.

The good news is that the Daily Journal is pushing back. It filed suit in the justice court in Union County, Miss., and asked for an end to the practice of never making search-warrant materials public.

Mississippi courts are unfortunately not alone in their efforts to keep search warrant records secret. The San Bernardino Superior Court of California sought to keep secret search warrants used to engage in invasive digital surveillance, only disclosing most of them after EFF sued.

It’s My Party and I Can Hide Records If I Want to Award: Wyoming Department of Education

Does the public really have a right to know if their tax dollars pay for a private political event?

Former Superintendent of Public Instruction Brian Schroeder and Chief Communications Officer Linda Finnerty in the Wyoming Department of Education didn’t seem to think so, according to Laramie County Judge Steven Sharpe.

Sharpe, in his order requiring disclosure of the records, wrote that the two were more concerned with “covering the agency’s tracks” and acted in “bad faith” in complying with Wyoming’s state open records law.

The lawsuit proved that Schroeder originally used public money for a "Stop the Sexualization of Our Children" event and provided misleading statements to the plaintiffs about the source of funding for the private, pro-book-banning event.

The former superintendent had also failed to provide texts and emails sent via personal devices that were related to the planning of the event, ignoring the advice of the state’s attorneys. Instead, Schroeder decided to “shop around” for legal advice and listen to a friend, private attorney Drake Hill, who told him to not provide his cell phone for inspection.

Meanwhile, Finnerty and the Wyoming Department of Education “did not attempt to locate financial documents responsive to plaintiffs’ request, even though Finnerty knew or certainly should have known such records existed.”

Transparency won this round with the disclosure of more than 1,500 text messages and emails—and according to Sharpe, the incident established a legal precedent on Wyoming public records access.

The Fee-l the Burn Award: Baltimore Police Department

In 2020, Open Justice Baltimore sued the Baltimore Police Department over the agency's demand that the nonprofit watchdog group pay more than $1 million to obtain copies of use-of-force investigation files. 

The police department had decreased its assessment to $245,000 by the time of the lawsuit, but it rejected the nonprofit’s fee waiver, questioning the public interest in the records and whether they would change the public's understanding of the issue. The agency also claimed that fulfilling the request would be costly and burdensome for the short-staffed department.

In 2023, Maryland’s Supreme Court issued a sizzling decision criticizing the BPD’s $245,000 fee assessment and its refusal to waive that fee in the name of public interest. The Supreme Court found that the public interest in how the department polices itself was clear and that the department should have considered how a denial of the fee waiver would “exacerbate the public controversy” and further “the perception that BPD has something to hide.”

The Supreme Court called BPD’s fee assessment “arbitrary and capricious” and remanded the case back to the police department, which must now reconsider the fee waiver. The unanimous decision from the state’s highest court did not mince its words on the cost of public records, either: “While an official custodian’s discretion in these matters is broad,” the opinion reads, “it is not boundless.”

The Continuing Failure Award: United States Citizenship and Immigration Services

Alien registration files, also commonly known as “A-Files,” contain crucial information about a non-citizen’s interaction with immigration agencies, and are central to determining eligibility for immigration benefits.

However, U.S. immigration agencies have routinely failed to release alien files within the statutory time limit for responding, according to Nightingale et al v. U.S. Citizenship and Immigration Services et al, a class-action lawsuit by a group of immigration attorneys and individual requesters.

The attorneys filed suit in 2019 against the U.S. Citizenship and Immigration Services, the Department of Homeland Security and U.S. Immigration and Customs Enforcement. In 2020, Judge William H. Orrick ruled that the agencies must respond to FOIA requests within 20 business days, and provide the court and class counsel with quarterly compliance reports. The case remains open.

With U.S. immigration courts containing a backlog of more than 2 million cases as of October of last year, according to the U.S. Government Accountability Office, the path to citizenship is bogged down for many applicants. The failure of immigration agencies to comply with statutory deadlines for requests only makes navigating the immigration system even more challenging. There is reason for hope for applicants, however. In 2022, Attorney General Merrick Garland made it federal policy to not require FOIA requests for copies of immigration proceedings, instead encouraging agencies to make records more readily accessible through other means.

Even the A-File backlog itself is improving. In the most recent status report, filed by the Department of Justice, the agencies reported that “of the approximately 119,140 new A-File requests received in the current reporting period, approximately 82,582 were completed, and approximately 81,980 were timely completed.”

The Creative Invoicing Award: Richmond, Va., Police Department

Some agencies claim outrageous fees for redacting documents to deter public access.

OpenOversightVA requested copies of general procedures—the basic outline of how police departments run—from localities across Virginia. While many departments either publicly posted them or provided them at no charge, Richmond Police responded with a $7,873.14 invoice. That’s $52.14 an hour to spend one hour on “review, and, if necessary, redaction” on each of the department’s 151 procedures.

This Foilies “winner” was chosen because of the wide gap between how available the information should be, and the staggering cost to bring it out of the file cabinet.

As MuckRock’s agency tracking shows, this is hardly an aberration for the agency. But this estimated invoice came not long after the department’s tear-gassing of protesters in 2020 cost the city almost $700,000. At a time when other departments are opening their most basic rulebooks (in California, for example, every law enforcement agency is required to post these policy manuals online), Richmond has been caught attempting to use a simple FOIA request as a cash cow.

The Foilies (Creative Commons Attribution License) were compiled by the Electronic Frontier Foundation (Director of Investigations Dave Maass, Senior Staff Attorney Aaron Mackey, Legal Fellow Brendan Gilligan, Investigative Researcher Beryl Lipton) and MuckRock (Co-Founder Michael Morisy, Data Reporter Dillon Bergin, Engagement Journalist Kelly Kauffman, and Contributor Tom Nash), with further review and editing by Shawn Musgrave. Illustrations are by EFF Designer Hannah Diaz. The Foilies are published in partnership with the Association of Alternative Newsmedia. 

Dave Maass

Four Voices You Should Hear this International Women’s Day

2 months ago

Around the globe, freedom of expression varies wildly in definition, scope, and level of access. The impact of the digital age on perceptions and censorship of speech has been felt across the political spectrum on a worldwide scale. In the debate over what counts as free expression and how it should work in practice, we often lose sight of how different forms of censorship can have a negative impact on different communities, and especially marginalized or vulnerable ones. This International Women’s Day, spend some time with four stories of hope and inspiration that teach us how to reflect on the past to build a better future.

1. Podcast Episode: Safer Sex Work Makes a Safer Internet

An internet that is safe for sex workers is an internet that is safer for everyone. Though the effects of stigmatization and criminalization run deep, the sex worker community exemplifies how technology can help people reduce harm, share support, and offer experienced analysis to protect each other. Public interest technology lawyer Kendra Albert and sex worker, activist, and researcher Danielle Blunt have been fighting for sex workers’ online rights for years and say that holding online platforms legally responsible for user speech can lead to censorship that hurts us all. They join EFF’s Cindy Cohn and Jason Kelley in this podcast to talk about protecting all of our free speech rights.

2. Speaking Freely: Sandra Ordoñez

Sandra (Sandy) Ordoñez is dedicated to protecting women being harassed online. Sandra is an experienced community engagement specialist, a proud NYC Latina resident of Sunset Park in Brooklyn, and a recipient of Fundación Carolina’s Hispanic Leadership Award. She is also a long-time diversity and inclusion advocate, with extensive experience incubating and creating FLOSS and Internet Freedom community tools. In this interview with EFF’s Jillian C. York, Sandra discusses free speech and how communities that are often the most directly affected are the last consulted.

3. Story: Coded Resistance, the Comic!

From the days of chattel slavery until the modern Black Lives Matter movement, Black communities have developed innovative ways to fight back against oppression. EFF's Director of Engineering, Alexis Hancock, documented this important history of codes, ciphers, underground telecommunications and dance in a blog post that became one of our favorite articles of 2021. In collaboration with The Nib and illustrator Chelsea Saunders, "Coded Resistance" was adapted into comic form to further explore these stories, from the coded songs of Harriet Tubman to Darnella Frazier recording the murder of George Floyd.

4. Speaking Freely: Evan Greer

Evan Greer is many things: a musician, an activist for LGBTQ issues, the Deputy Director of Fight for the Future, and a true believer in the free and open internet. In this interview, EFF’s Jillian C. York spoke with Evan about the state of free expression, and what we should be doing to protect the internet for future activism. Among the many topics discussed was how policies that promote censorship—no matter how well-intentioned—have historically benefited the powerful and harmed vulnerable or marginalized communities. Evan talks about what we as free expression activists should do to get at that tension and find solutions that work for everyone in society.

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Reasons to Protect the Internet this International Women’s Day
  2. Four Infosec Tools for Resistance this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day
Paige Collings