[Photo Angle] 370 people attend event marking the 80th anniversary of the end of the Second Sino-Japanese War = August 11, Nerima, Tokyo; photo by Ryohei Ito

1 day 8 hours ago
On August 11, the "Gathering to Commemorate the 80th Anniversary of the End of the Second Sino-Japanese War," an event connecting peace between Japan and Asia, was held at Coconeri Hall in Nerima, Tokyo. It was organized by an executive committee built around the Japan-China Friendship Association. In the first part, author Masahiko Shimada gave a lecture titled "Living in the Gap Between the U.S., China, and Russia," warning that "appealing to nationalism leads to fascism." The second part featured an anti-war peace choral suite, Okinawan sanshin, tai chi, and more; the dance performance by students of Tokyo Korean Junior and Senior High School (pictured) drew loud applause from the 370 participants.       JCJ monthly newspaper "Journalist"..
JCJ

EFF Awards Spotlight ✨ Erie Meyer

1 day 8 hours ago

In 1992 EFF presented our very first awards recognizing key leaders and organizations advancing innovation and championing civil liberties and human rights online. Now in 2025 we're continuing to celebrate the accomplishments of people working toward a better future for everyone with the EFF Awards!

All are invited to attend the EFF Awards on Wednesday, September 10 at the San Francisco Design Center. Whether you're an activist, an EFF supporter, a student interested in cyberlaw, or someone who wants to munch on a strolling dinner with other likeminded individuals, anyone can enjoy the ceremony!

REGISTER TODAY!

GENERAL ADMISSION: $55 | CURRENT EFF MEMBERS: $45 | STUDENTS: $35

If you're not able to make it, we'll also be hosting a livestream of the event on Friday, September 12 at 12:00 PM PT. The event will also be recorded, and posted to YouTube and the Internet Archive after the livestream.

We are honored to present the three winners of this year's EFF Awards: Just Futures Law, Erie Meyer, and Software Freedom Law Center, India. But, before we kick off the ceremony next week, let's take a closer look at each of the honorees. This time—Erie Meyer, winner of the EFF Award for Protecting Americans' Data:

Erie Meyer is a Senior Fellow at the Vanderbilt Policy Accelerator where she focuses on the intersection of technology, artificial intelligence, and regulation, and a Senior Fellow at the Georgetown Law Institute for Technology Law & Policy. Since January 20, Meyer has helped organize former government technologists to stand up for the privacy and integrity of governmental systems that hold Americans’ data. In addition to organizing others, she filed a declaration in federal court in February warning that 12 years of critical records could be irretrievably lost in the CFPB’s purge by the Trump Administration’s Department of Government Efficiency. In April, she filed a declaration in another case warning about using private-sector AI on government information. That same month, she testified to the House Oversight Subcommittee on Cybersecurity, Information Technology, and Government Innovation that DOGE is centralizing access to some of the most sensitive data the government holds—Social Security records, disability claims, even data tied to national security—without a clear plan or proper oversight, warning that “DOGE is burning the house down and calling it a renovation.” 

We're excited to celebrate Erie Meyer and the other EFF Award winners in person in San Francisco on September 10! We hope that you'll join us there.

Thank you to Fastly, DuckDuckGo, Corellium, and No Starch Press for their year-round support of EFF's mission.

Want to show your team’s support for EFF? Sponsorships ensure we can continue hosting events like this to build community among digital rights supporters. Please visit eff.org/thanks or contact tierney@eff.org for more information on corporate giving and sponsorships.

EFF is dedicated to a harassment-free experience for everyone, and all participants are encouraged to view our full Event Expectations.

Questions? Email us at events@eff.org.

Christian Romero

From Libraries to Schools: Why Organizations Should Install Privacy Badger

1 day 12 hours ago

In an era of pervasive online surveillance, organizations have an important role to play in protecting their communities’ privacy. Millions of people browse the web on computers provided by their schools, libraries, and employers. By default, popular browsers on these computers leave people exposed to hidden trackers.

Organizations can enhance privacy and security on their devices by installing Privacy Badger, EFF’s free, open source browser extension that automatically blocks trackers. Privacy Badger is already used by millions to fight online surveillance and take back control of their data.

Why Should Organizations Install Privacy Badger on Managed Devices?

Protect People from Online Surveillance

Most websites contain hidden trackers that let advertisers, data brokers, and Big Tech companies monitor people’s browsing activity. This surveillance has serious consequences: it fuels scams, government spying, predatory advertising, and surveillance pricing.

By installing Privacy Badger on managed devices, organizations can protect entire communities from these harms. Most people don’t realize the risks of browsing the web unprotected. Organizations can step in to make online privacy available to everyone, not just the people who know they need it. 

Ad Blocking is a Cybersecurity Best Practice

Privacy Badger helps reduce cybersecurity threats by blocking ads that track you (unfortunately, that’s most ads these days). Targeted ads aren’t just a privacy nightmare. They can also be a vehicle for malware and phishing attacks. Cybercriminals have tricked legitimate ad networks into distributing malware, a tactic known as malvertising.

The risks are serious enough that the U.S. Cybersecurity and Infrastructure Security Agency (CISA) recommends that federal agencies deploy ad-blocking software. The NSA, CIA, and other intelligence agencies already follow this guidance. In other words, the very agencies that use advertising systems to surveil others block ads for their own employees.

All organizations, not just spy agencies, should make ad blocking part of their security strategy.

A Tracker Blocker You Can Trust

Four million users already trust Privacy Badger, which has been recommended by The New York Times' Wirecutter, Consumer Reports, and The Washington Post.

Trust is crucial when choosing an ad-blocking or tracker-blocking extension because they require high levels of browser permissions. Unfortunately, not all extensions deserve that trust. Avast’s “privacy” extension was caught collecting and selling users’ browsing data to third parties—the very practice it claimed to prevent.

Privacy Badger is different. EFF released it over a decade ago, and the extension has been open source, meaning other developers and researchers can inspect its code, for that entire time. Because it is built by a nonprofit with a 35-year history of fighting for user rights, organizations can trust that Privacy Badger works for its users, not for profit.

Which organizations should deploy Privacy Badger?

All of them! Installing Privacy Badger on managed devices improves privacy and security across an organization. That said, Privacy Badger is most beneficial for two types of organizations: libraries and schools. Both can better serve their communities by safeguarding the computers they provide.

Libraries

The American Library Association (ALA) already recommends installing Privacy Badger on public computers to block third-party tracking. Librarians have a long history of defending privacy. The ALA’s guidance is a natural extension of that legacy for the digital age. While librarians protect the privacy of books people check out, Privacy Badger protects the privacy of websites they visit on library computers. 

Millions of Americans depend on libraries for internet access. That makes libraries uniquely positioned to promote equitable access to private browsing. With Privacy Badger, libraries can ensure that safe and private browsing is the default for anyone using their computers. 

Libraries also play a key role in promoting safe internet use through their digital literacy trainings. By including Privacy Badger in these trainings, librarians can teach patrons about a simple, free tool that protects their privacy and security online.

Schools

Schools should protect their students from online surveillance by installing Privacy Badger on the computers they provide. Parents are rightfully worried about their children’s privacy online: a Pew survey found that 85% worry about advertisers using data about what kids do online to target ads at them. Deploying Privacy Badger is a concrete step schools can take to address these concerns.

By blocking online trackers, schools can protect students from manipulative ads and limit the personal data fueling social media algorithms. Privacy Badger can even block tracking in Ed Tech products that schools require students to use. Alarmingly, a Human Rights Watch analysis of Ed Tech products found that 89% shared children’s personal data with advertisers or other companies.

Instead of deploying invasive student monitoring tools, schools should keep students safe by keeping their data safe. Students deserve to learn without being tracked, profiled, and targeted online. Privacy Badger can help make that happen.

How can organizations deploy Privacy Badger on managed devices?

System administrators can deploy and configure Privacy Badger on managed devices by setting up an enterprise policy. Chrome, Firefox, and Edge provide instructions for automatically installing extensions organization-wide. You’ll be able to configure certain Privacy Badger settings for all devices. For example, you can specify websites where Privacy Badger is disabled or prevent Privacy Badger’s welcome page from popping up on computers that get reset after every session. 
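As a concrete illustration, here is a minimal sketch of what an enterprise policy for Chromium-based browsers can look like on Linux, using the ExtensionInstallForcelist policy. The script writes to a local directory purely for illustration; on a real fleet the managed-policy directory is typically /etc/chromium/policies/managed (Chromium) or /etc/opt/chrome/policies/managed (Google Chrome), and the extension ID shown is an assumption you should verify against Privacy Badger's Chrome Web Store listing before deploying.

```shell
# Sketch: force-install Privacy Badger via Chromium's ExtensionInstallForcelist
# policy. We write to a local directory for illustration; substitute the real
# managed-policy path for your browser and OS when deploying.
POLICY_DIR="./policies/managed"
mkdir -p "$POLICY_DIR"

# The extension ID below should be confirmed against Privacy Badger's
# Chrome Web Store listing; treat it as a placeholder until verified.
cat > "$POLICY_DIR/privacy-badger.json" <<'EOF'
{
  "ExtensionInstallForcelist": [
    "pkehgijcmpdhfbdbbnkijodmdjhbjlgp;https://clients2.google.com/service/update2/crx"
  ]
}
EOF

# Show the policy file that the browser would pick up on next launch.
cat "$POLICY_DIR/privacy-badger.json"
```

Firefox supports the same idea through the ExtensionSettings policy in a policies.json file, and Windows deployments typically set the equivalent values through Group Policy registry keys.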

We recommend educating users about the addition of Privacy Badger and what it does. Since some websites deeply embed tracking, privacy protections can occasionally break website functionality. For example, a video might not play or a comments section might not appear. If this happens, users should know that they can easily turn off Privacy Badger on any website. Just open the Privacy Badger popup and click “Disable for this site.” 

Don't hesitate to reach out if you're interested in deploying Privacy Badger at scale. Our team is here to help you protect your community's privacy. And if you're already deploying Privacy Badger across your organization, we'd love to hear how it’s going.

Make Private Browsing the Default at Your Organization

Schools, libraries, and other organizations can make private browsing the norm by deploying Privacy Badger on devices they manage. If you work at an organization with managed devices, talk to your IT team about Privacy Badger. You can help strengthen the security and privacy of your entire organization while joining the fight against online surveillance.

Lena Cohen

Verifying Trust in Digital ID Is Still Incomplete

1 day 19 hours ago

In the past few years, governments across the world have rolled out different digital identification options, and now there are efforts encouraging online companies to implement identity and age verification requirements with digital ID in mind. This blog is the second in a short series that explains digital ID and the pending use case of age verification. Upcoming posts will evaluate what real protections we can implement with current digital ID frameworks and discuss how better privacy and controls can keep people safer online.

Digital identity encompasses various aspects of an individual's identity that are presented and verified either online or in person. This could mean a digital credential issued by a certification body, or a mobile driver’s license provisioned to someone’s mobile wallet. Credentials can be presented in plain text on a device, as a scannable QR code, or by tapping your device against a Near Field Communication (NFC) reader. There are other, somewhat more privacy-preserving ways to present credential information, but in practice those three methods are how digital ID is being used today.

Advocates of digital ID often use a framework they call the "Triangle of Trust." This is usually presented as a triangle of exchange between the holder of an ID—those who use a phone or wallet application to access a service; the issuer of an ID—normally a government entity, like the state Departments of Motor Vehicles in the U.S., or a banking system; and the verifier of an ID—the entity that wants to confirm your identity, such as law enforcement, a university, a government benefits office, a porn site, or an online retailer.

This triangle implies that the issuer and verifier—for example, the government that provides the ID and the website checking your age—never need to talk to one another. In theory, this design prevents your ID from phoning home every time you verify it with another party, avoiding the tracking and surveillance threats that would otherwise arise.

But it also makes a lot of questionable assumptions, such as:

1) the verifier will only ever ask for a limited amount of information. 

2) the verifier won’t store information it collects.

3) the verifier is always trustworthy. 

The third assumption is especially problematic. How do you trust that the verifier will protect your most personal information and not use, store, or sell it beyond what you have consented to? Any of the following could be verifiers:

  • Law enforcement when doing a traffic stop and verifying your ID as valid.
  • A government benefits office that requires ID verification to sign up for social security benefits.
  • A porn site in a state or country which requires age verification or identity verification before allowing access.
  • An online retailer selling products like alcohol or tobacco.

Looking at the triangle again, this isn’t quite an equal exchange. Your personal ID, like a driver’s license or government ID, is among the most centralized and sensitive documents you have: you can’t issue one yourself or control how it is issued, and you must go through your government to obtain one. This relationship will always be imbalanced, but we have to make sure digital ID does not exacerbate these imbalances.

Efforts to answer the question of how to prevent verifier abuse are ongoing. But instead of first addressing the harms these systems can cause, governments around the world are fast-tracking the technology, scrambling to solve what they see as a crisis of online harms by mandating age verification. And current implementations of the Triangle of Trust have already proven disastrous.

One key example of the speed of implementation outpacing proper protections is the Digital Credential API. Initially launched by Google and now supported by Apple, it lets apps and websites request information from your digital ID at mass scale, with no checks on what verifiers can ask for. The introduction of this technology to people’s devices came with no limits, incentivizing verifiers to ask for ID information beyond the question of whether a holder is over a certain age, simply because they can.

The Digital Credential API also gives a wide variety of websites an incentive to ask for ID information they don’t need and did not commonly collect before. Food delivery services, medical services, gaming sites, and literally anyone else interested in being a verifier could become one tomorrow with digital ID and the Digital Credential API. This is both an erosion of personal privacy and a pathway to further surveillance. There must be established limitations and scope, including:

  • verifiers establishing who they are and what they plan to ask from holders. There should also be an established plan for transparency on verifiers and their data retention policies.
  • ways to identify and report abusive verifiers, as well as real consequences, like revoking or blocking a verifier from requesting IDs in the future.
  • unlinkable presentations that prevent verifier and issuer collusion, with no data shared between the verifiers you attest to, so your movements cannot be tracked in person or online every time you attest your age.

A further concern arises in cases of abuse or deception. A malicious verifier can send a request with no limiting mechanisms or checks, and a user who rejects the request can be fully blocked from the website or application. There must be provisions ensuring that people retain access to vital services that require age verification from visitors.

Governments’ efforts to tackle verifiers who abuse digital ID requests haven’t come to fruition yet. For example, the EU Commission recently launched its age verification “mini app” ahead of the EU ID wallet planned for 2026. The mini app will not have a registry of verifiers, which EU regulators had promised and then withdrew. Without verifier accountability, the wallet cannot tell whether a request is legitimate. As a result, verifiers and issuers will demand verification from people who want to use online services, while those same people cannot insist on verification and accountability from the other sides of the triangle.

While digital ID is pushed as the solution to the problem of uploading IDs to every site users access, its security and privacy vary by implementation. Where privacy is at stake, regulators must make room for negotiation: more thoughtful and protective measures for holders as they interact with more and more potential verifiers over time. Otherwise, digital ID solutions will just exacerbate existing harms and inequalities rather than improving internet accessibility and information access for all.

Alexis Hancock

EFF Statement on ICE Use of Paragon Solutions Malware

2 days 6 hours ago

This statement can be attributed to EFF Senior Staff Technologist Cooper Quintin

It was recently reported by Jack Poulson on Substack that ICE has reactivated its $2 million contract with Paragon Solutions, a cyber-mercenary and spyware manufacturer.

The reactivation of the contract between the Department of Homeland Security and Paragon Solutions, a known spyware vendor, is extremely troubling.


Paragon's “Graphite” malware has been implicated in widespread misuse by the Italian government. Researchers at Citizen Lab at the Munk School of Global Affairs at the University of Toronto and with Meta found that it has been used in Italy to spy on journalists and civil society actors, including humanitarian workers. Without strong legal guardrails, there is a risk that the malware will be misused in a similar manner by the U.S. Government.

These reports undermine Paragon Solutions’ public marketing of itself as a more ethical provider of surveillance malware.

Reportedly, the contract is being reactivated because the US arm of Paragon Solutions was acquired by a Miami-based private equity firm, AE Industrial Partners, and then merged into a Virginia-based cybersecurity company, REDLattice, allowing ICE to circumvent Executive Order 14093, which bans the acquisition of spyware controlled by a foreign government or person. Even though this order was always insufficient to prevent the acquisition of dangerous spyware, it was the best protection we had. This end run around the executive order both ignores the spirit of the rule and does nothing to prevent misuse of Paragon malware for human rights abuses. Nor will it prevent insiders at Paragon from using the malware to spy on US government officials, or US government officials from misusing it to spy on their personal enemies, rivals, or spouses.

The contract between Paragon and ICE means all US users should adjust their threat models and take extra precautions. Paragon’s Graphite isn’t magical; it’s still just malware. It still needs a zero-day exploit to compromise a phone running the latest security updates, and those are expensive. The best thing you can do to protect yourself against Graphite is to keep your phone up to date and enable Lockdown Mode if you are using an iPhone, or Advanced Protection Mode on Android. Turning on disappearing messages also helps: if someone in your network is compromised, you don’t also reveal your entire message history. For more tips on protecting yourself from malware, check out our Surveillance Self-Defense guides.

Related Cases: AlHathloul v. DarkMatter Group
Cooper Quintin

EFF Awards Spotlight ✨ Just Futures Law

2 days 8 hours ago

In 1992 EFF presented our very first awards recognizing key leaders and organizations advancing innovation and championing civil liberties and human rights online. Now in 2025 we're continuing to celebrate the accomplishments of people working toward a better future for everyone with the EFF Awards!

All are invited to attend the EFF Awards on Wednesday, September 10 at the San Francisco Design Center. Whether you're an activist, an EFF supporter, a student interested in cyberlaw, or someone who wants to munch on a strolling dinner with other likeminded individuals, anyone can enjoy the ceremony!

REGISTER TODAY!

GENERAL ADMISSION: $55 | CURRENT EFF MEMBERS: $45 | STUDENTS: $35

If you're not able to make it, we'll also be hosting a livestream of the event on Friday, September 12 at 12:00 PM PT. The event will also be recorded, and posted to YouTube and the Internet Archive after the livestream.

We are honored to present the three winners of this year's EFF Awards: Just Futures Law, Erie Meyer, and Software Freedom Law Center, India. But, before we kick off the ceremony next week, let's take a closer look at each of the honorees. First up—Just Futures Law, winner of the EFF Award for Leading Immigration and Surveillance Litigation:

Just Futures Law is a women-of-color-led law project that recognizes how surveillance disproportionately impacts immigrants and people of color in the United States. In the past year, Just Futures sued the Department of Homeland Security and its subagencies seeking a court order to compel the agencies to release records on their use of AI and other algorithms, and sued the Trump Administration for prematurely halting Haiti’s Temporary Protected Status, a humanitarian program that allows hundreds of thousands of Haitians to temporarily remain and work in the United States due to Haiti’s current conditions of extraordinary crises. It has represented activists in their fight against tech giants like Clearview AI, it has worked with Mijente to launch the TakeBackTech fellowship to train new advocates on grassroots-directed research, and it has worked with Grassroots Leadership to fight for the release of detained individuals under Operation Lone Star.

We're excited to celebrate Just Futures Law and the other EFF Award winners in person in San Francisco on September 10! We hope that you'll join us there.

Thank you to Fastly, DuckDuckGo, Corellium, and No Starch Press for their year-round support of EFF's mission.

Want to show your team’s support for EFF? Sponsorships ensure we can continue hosting events like this to build community among digital rights supporters. Please visit eff.org/thanks or contact tierney@eff.org for more information on corporate giving and sponsorships.

EFF is dedicated to a harassment-free experience for everyone, and all participants are encouraged to view our full Event Expectations.

Questions? Email us at events@eff.org.

Christian Romero

[B] [Planned for Sept. 5] Emergency action in front of the Diet: "Together with those affected, toward a society where everyone's life and dignity are protected"

2 days 8 hours ago
As hate speech against foreign residents of Japan becomes a social problem, groups supporting people on provisional release from immigration detention plan to hold an emergency action in front of the Diet's main gate on September 5, calling for the abolition of the Immigration Services Agency's "Zero Illegal Residents Plan." (Kai Fujigaya)
Nikkan Berita

[B] The biggest cause of developmental disorders lies in the delivery room: veteran doctor exposes problems with exclusive breastfeeding and kangaroo care

2 days 9 hours ago
The number of children with developmental disorders keeps rising. In a 2012 nationwide survey by the Ministry of Education, Culture, Sports, Science and Technology, 7.7% of elementary school students and 4.0% of junior high school students were considered likely to have developmental disorders; in the 2022 survey those figures had risen to 10.4% and 5.6%, respectively. Developmental disorders are, in one sense, "a form of individuality." Some children with developmental disorders show rare talent in particular fields, and many adapt well to society as adults. It is also true, however, that many carry a lifelong sense of difficulty in living, and the increase affects their parents, childcare workers, and school teachers as well. Various hypotheses have been proposed for the causes of developmental disorders, including genetics and environmental factors such as diet, but one doctor has caused a stir among obstetricians and psychiatrists by asserting that "the biggest cause lies in the delivery room": Shiro Kubota, 80, who has long run the Kubota Obstetrics and Anesthesiology Clinic in Saga Prefecture. (Eiichiro Ishiyama)
Nikkan Berita