Giving the floor to voices driving real-world change for an internet that serves everyone
35 Years for Your Freedom Online
Once upon a time we were promised flying cars and jetpacks. Yet we've arrived at a more complicated timeline where rights advocates can find themselves defending our hard-earned freedoms more often than shooting for the moon. In tough times, it's important to remember that your vision for the future can be just as valuable as the work you do now.
Thirty-five years ago, a small group of folks saw the coming digital future and banded together to ensure that technology would empower people, not oppress them—and EFF was born. While the dangers of corporate and state forces grew alongside the internet, EFF and supporters like you faithfully rose to the occasion. Will you help celebrate EFF’s 35th anniversary and donate in support of digital freedom?
Protect Online Privacy & Free Expression
Together we’ve won many fights for encryption, free speech, innovation, and privacy online. Yet it’s plain to see that we must keep advocating for technology users, whether in the courts, before lawmakers, through public education, or by creating privacy-enhancing tools. EFF members make it possible—you can lend a hand and get some great perks!
Summer Swag Is Here
We love making stuff for EFF’s members each year. It’s our way of saying thanks for supporting the mission for your rights online, and I hope it’s your way of starting a conversation about internet freedom with people in your life.
Celebrate EFF's 35th anniversary in the digital rights movement with this EFF35 Cityscape member t-shirt by Hugh D’Andrade! EFF has a not-so-secret weapon that keeps us in the fight even when the odds are against us: we never lose sight of our vision for a better future. Choose a roomy Classic Fit Crewneck or a soft Slim Fit V-Neck.
And enjoy Lovelace-Klimtian vibes on EFF’s new Motherboard Hooded Sweatshirt by Shirin Mori. Gold details and orange poppies pop on lush forest green. Don't lose the forest for the trees—keep fighting for a world where tech supports people irl.
Join the Sustaining Donor Challenge (it’s easy)
You'll get a numbered EFF35 Challenge Coin when you become a monthly or annual Sustaining Donor by July 10. It’s that simple.
If you're already a Sustaining Donor—THANKS! You too can get an EFF 35th Anniversary Challenge Coin when you upgrade your donation. Just increase your monthly or annual gift and let us know by emailing upgrade@eff.org. Get started at eff.org/recurring. If you used PayPal, just cancel your current recurring donation and then go to eff.org to start a new upgraded recurring donation.
Support internet freedom with a no-fuss automated recurring donation! Over 30% of EFF members have joined as Sustaining Donors to defend digital rights (and get some great swag every year). Challenge coins follow a long tradition of offering a symbol of kinship and respect for great achievements—and EFF owes its strength to technology creators and users like you.
With your help, EFF is here to stay.
Protect Online Privacy & Free Expression
NYC Lets AI Gamble with Child Welfare
The Markup revealed in its reporting last month that New York City’s Administration for Children’s Services (ACS) has been quietly deploying an algorithmic tool to categorize families as “high risk.” Using a grab-bag of factors like neighborhood and mother’s age, this AI tool can put families under intensified scrutiny without proper justification or oversight.
ACS knocking on your door is a nightmare for any parent: any mistake can break up your family and send your children into the foster care system. Putting a family under such scrutiny shouldn’t be taken lightly, and it shouldn’t be a testing ground for automated decision-making by the government.
This “AI” tool, developed internally by ACS’s Office of Research Analytics, scores families for “risk” using 279 variables and subjects those deemed highest-risk to intensified scrutiny. The lack of transparency, accountability, or due process protections demonstrates that ACS has learned nothing from the failures of similar products in the realm of child services.
The algorithm operates in complete secrecy, and the harms from this opaque “AI theater” are not theoretical. The 279 variables are derived solely from cases in 2013 and 2014 where children were seriously harmed. However, it is unclear how many cases were analyzed, what auditing and testing (if any) was conducted, and whether including data from other years would have altered the scoring.
What we do know is disturbing: Black families in NYC face ACS investigations at seven times the rate of white families and ACS staff has admitted that the agency is more punitive towards Black families, with parents and advocates calling its practices “predatory.” It is likely that the algorithm effectively automates and amplifies this discrimination.
Despite the disturbing lack of transparency and accountability, ACS has subjected families the system ranks as “highest risk” to additional scrutiny, including possible home visits, calls to teachers and family, or consultations with outside experts. But those families, their attorneys, and even caseworkers don't know when or why the system flags a case, making it difficult to challenge the circumstances or process that leads to this intensified scrutiny.
This is not the only instance in which the use of AI tools in the child services system has run into systemic bias. Back in 2022, the Associated Press reported that Carnegie Mellon researchers found that from August 2016 to May 2018, Allegheny County in Pennsylvania used an algorithmic tool that flagged 32.5% of Black children for “mandatory” investigation compared to just 20.8% of white children, all while social workers disagreed with the algorithm's risk scores about one-third of the time.
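The Allegheny figures above lend themselves to the kind of simple disparity check an independent audit might run. Here's a minimal sketch in Python using only the flag rates reported by the AP; the threshold of 1.25 used to flag a disparity is a hypothetical choice for illustration, not a legal or regulatory standard.

```python
# Flag rates reported for Allegheny County (Aug 2016 - May 2018):
# share of children flagged for "mandatory" investigation, by race.
flag_rate_black = 0.325
flag_rate_white = 0.208

# A disparity ratio above 1.0 means Black children were flagged
# more often than white children under the same tool.
disparity_ratio = flag_rate_black / flag_rate_white

# Hypothetical audit threshold chosen for illustration only.
AUDIT_THRESHOLD = 1.25

print(f"Disparity ratio: {disparity_ratio:.2f}")  # → 1.56
if disparity_ratio > AUDIT_THRESHOLD:
    print("Flagging-rate disparity exceeds audit threshold")
```

Even this trivial arithmetic is impossible for affected families to perform when, as in both Allegheny County and NYC, the scores and the underlying rates are kept secret.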
The Allegheny system operates with the same toxic combination of secrecy and bias now plaguing NYC. Families and their attorneys can never know their algorithmic scores, making it impossible to challenge decisions that could destroy their lives. When a judge asked to see a family’s score in court, the county resisted, claiming it didn't want to influence legal proceedings with algorithmic numbers, which suggests that the scores are too unreliable for judicial scrutiny yet acceptable for targeting families.
Elsewhere, these biased systems have been successfully challenged. The developers of the Allegheny tool had already had their product rejected in New Zealand, where researchers correctly identified that the tool would likely result in more Māori families being tagged for investigation. Meanwhile, California spent $195,273 developing a similar tool before abandoning it in 2019, due in part to concerns about racial equity.
Governmental deployment of automated and algorithmic decision-making not only perpetuates social inequalities, but also removes mechanisms for accountability when agencies make mistakes. The state should not be using these tools for rights-determining decisions, and any other uses must be subject to vigorous scrutiny and independent auditing to ensure the public’s trust in the government’s actions.
Criminalizing Masks at Protests is Wrong
A growing number of states have attempted to criminalize wearing face coverings while attending protests. Now the President has demanded, in the context of ongoing protests in Los Angeles: “ARREST THE PEOPLE IN FACE MASKS, NOW!”
But the truth is: whether you are afraid of catching an airborne illness from your fellow protestors, or you are concerned about reprisals from police or others for expressing your political opinions in public, you should have the right to wear a mask. Attempts to criminalize masks at protests fly in the face of the right to privacy.
Anonymity is a fundamental human right.
In terms of public health, wearing a mask while in a crowd can be a valuable tool to prevent the spread of communicable illnesses. This can be essential for people with compromised immune systems who still want to exercise their First Amendment-protected right to protest.
Moreover, wearing a mask is a perfectly legitimate surveillance self-defense practice during a protest. There has been a massive proliferation of surveillance camera networks, face recognition technology, and databases of personal information. Law enforcement also has a long history of harassing and surveilling people for publicly criticizing or opposing law enforcement practices and other government policies. What’s more, non-governmental actors may try to identify protesters in order to retaliate against them, for example, by limiting their employment opportunities.
All of this may chill our willingness to speak publicly or attend a protest for a cause we believe in. Many people would be less willing to attend a rally or march if they knew that a drone or helicopter equipped with a camera would make repeated passes over the crowd, and that police would later use face recognition to scan everyone’s faces and create a list of protest attendees. This would make many people rightfully concerned about surveillance and harassment from law enforcement.
Anonymity is a fundamental human right. EFF has long advocated for anonymity online. We’ve also supported low-tech methods to protect our anonymity from high-tech snooping in public places; for example, we’ve supported legislation to allow car owners to use license plate covers when their cars are parked to reduce their exposure to ALPRs.
A word of caution. No surveillance self-defense technique is perfect. Technology companies are trying to develop ways to use face recognition technology to identify people wearing masks. But if somebody wants to hide their face to try to avoid government scrutiny, the government should not punish them.
While members of the public have a right to wear a mask when they protest, law enforcement officials should not wear a mask when they arrest protesters and others. An elementary principle of police accountability is to require uniformed officers to identify themselves to the public; this discourages officer misconduct, and facilitates accountability if an officer violates the law. This is one reason EFF has long supported the First Amendment right to record on-duty police, including ICE officers.
For these reasons, EFF believes it is wrong for state legislatures, and now federal law enforcement, to try to criminalize or punish mask wearing at protests. It is especially wrong in moments like the present, when the government is taking extreme measures to crack down on the civil liberties of protesters.