Announcement of the 42nd meeting of the Third Expert Committee on Pesticides (closed to the public) [to be held May 14]
Announcement of the 41st meeting of the Third Expert Committee on Pesticides [to be held May 14]
Announcement of the 48th meeting of the Fifth Expert Committee on Pesticides (closed to the public) [to be held May 15]
Announcement of the 50th meeting of the First Expert Committee on Pesticides (closed to the public) [to be held May 18]
JVN: CISA ICS Advisory / ICS Medical Advisory (May 5, 2026)
Seeding change: Creating new skills and income opportunities through micro-enterprises and digital training for Nigerian women
Milestone 1.0.0 Release of APK Downloader `apkeep` Powers Research on Android Apps
Last week, we released apkeep version 1.0.0, the latest version of our command-line Android package downloader. Rather than indicating major changes to the project, this milestone signifies arriving at a relatively stable and mature place after more than four years of gradual iteration.
What’s New in 1.0.0
We have packed a few fresh features into this latest release, though, all focused on the Google Play Store:
- You can now download a dex metadata file associated with an app containing a Cloud Profile, which provides information on app performance based on real usage.
- You can now provide a token generated by the Aurora Store’s dispenser to log in anonymously for app downloads.
- You can now specify your own device profile when downloading apps from Google Play; the store uses this profile to deliver the app variant that matches your device's specifications.
- We’ve also fixed an authentication bug introduced by a change to the Play Store API.
In addition to the various Linux, Windows, and Android environments we support, we’re also happy to announce that since the last release in October we’ve been included in Homebrew for macOS users!
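For those new to the tool, a typical invocation looks something like the sketch below. This is a hedged example, not canonical usage: the flag names (`-a` for the app ID, `-d` for the download source, `-t` for a Google Play token) follow apkeep's documented interface but may differ across versions, and the app ID, token, and output directory are placeholders. Check `apkeep --help` for your installed version.

```shell
# Download an app by its package ID from the default source
# into the current directory:
apkeep -a com.example.app .

# Download from Google Play instead; per the 1.0.0 release notes,
# a token generated by the Aurora Store's dispenser can be used to
# log in anonymously (flag names are assumptions, so verify with
# `apkeep --help`):
apkeep -a com.example.app -d google-play -t <dispenser-token> .
```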
How Researchers Use apkeep to Understand the Android App Landscape
Researchers and users contributed most of the features in this release, including downloading dex metadata containing Google’s Cloud Profiles. That feature supports research highlighting how these Android compilation profiles can be a vital source of information for evaluating dynamic testing. Numerous other projects have cited apkeep in their own workflows. For example, Exodus Privacy uses it to power the εxodus tool’s downloads when monitoring the privacy properties of apps. Various research teams have noted their use of the tool in whitepapers, including one team that used it to download 21,154 apps in a widespread study of Android evasive malware. We are proud to provide a reliable tool in the toolbox they use to power their work.
What’s in Store for apkeep?
Our goals with apkeep have remained constant: provide a reliable, fast, and safe way to download apps from multiple app providers, not just the Google Play Store. While we’ve focused on it as the major Android app provider of choice across much of the world, we’ve expanded support to other stores as well, such as F-Droid for downloading open source apps. We’d like to continue broadening apkeep’s list of supported providers, to make it easy to do comparative analysis of apps provided in different contexts. For this, we’d love your contributions.
How You Can Help
If you’re using apkeep as part of your own toolbox (whether using it to do malware analysis, auditing apps, or simply using it as an app archiving tool), let us know! And if you like what we do, please consider donating to EFF to support our work.
Announcement of the 195th meeting of the Information and Communications Technology Subcommittee, Information and Communications Council
Signing of a memorandum of cooperation with Vietnam's Ministry of Science and Technology in the fields of information and communications technology and digital transformation
Results of the Japan-EU Digital Partnership ministerial meeting
Signing of a cooperation arrangement with the European Commission's Directorate-General for Communications Networks, Content and Technology, and results of the bilateral meeting
Materials from the 59th meeting of the Effective Use Evaluation Subcommittee, Radio Regulatory Council
👎 California's Terrible, No Good, Very Bad Social Media Ban | EFFector 38.9
We'd all like the internet to be a better place—for kids and adults alike. But in the name of online safety, governments around the world are racing to impose a dangerous new system of control. Are age gates the silver bullet to the internet's problems they're being promoted as? Or are we being sold a bill of goods? We're answering this question and more in our latest EFFector newsletter.
For over 35 years, EFFector has been your guide to understanding the intersection of technology, civil liberties, and the law. This latest issue covers an attack on VPNs in Utah, a livestream on how to disenshittify the internet, and California's proposed social media ban that could set a dangerous new precedent for online censorship.
Prefer to listen in? EFFector is now available on all major podcast platforms. This time, we're having a conversation with EFF Legislative Analyst Molly Buckley on why social media bans can't sidestep the U.S. Constitution. You can find the episode and subscribe on your podcast platform of choice.
Want to help push back on these misguided regulations? Sign up for EFF's EFFector newsletter for updates, ways to take action, and new merch drops. You can also fuel the fight for privacy and free speech online when you support EFF today!
What have we learned? Revisiting the GISWatch edition on artificial intelligence
[National Polity Protection Act] Rally held against the proposed crime of "damaging the national flag": compelling patriotism through criminal punishment, by Eiichi Furukawa
The SECURE Data Act is Not a Serious Piece of Privacy Legislation
The federal SECURE Data Act is not a serious consumer privacy bill, and its provisions—if enacted—would be a retreat from already insufficient state protections.
Republicans on the House Energy and Commerce Committee released a draft of the bill late last month without bipartisan support. The bill is weaker than congressional proposals in prior years, as well as most of the 21 state consumer privacy laws already on the books.
The bill could wipe out hundreds of state privacy protections.
Most troubling for EFF: the bill would preempt dozens, if not hundreds, of state laws that regulate related topics, and it would not allow consumers to sue to protect their own rights (commonly called a private right of action). And it comes nowhere close to banning online behavioral advertising, a practice that fuels technology companies’ ever-increasing hunt for personal data.
The bill also suffers from many other flaws including weak opt-out defaults, inadequate data minimization requirements, and large definitional loopholes for companies.
Key Provisions
The bill would give consumers some rights to take action to control their personal data, like access, correction, deletion, and limited portability. These rights have become standard in all data privacy proposals in recent years.
The bill would also require companies to obtain your consent before processing your sensitive data, or using any of your personal data for a previously undisclosed purpose. Absent your consent, a company couldn’t do these things.
Further, the bill would allow you to opt out of (1) targeted third-party advertising, (2) the sale of your personal data, and (3) profiling of you that has a legal, healthcare, housing, or employment effect. Unfortunately, a company could keep doing these invasive things to you, unless you opted out.
The bill would also require data brokers that make at least 50 percent of their profits from the sale of personal data to register in a public database maintained by the Federal Trade Commission (FTC).
Preemption of Too Many State Laws
Federal privacy laws should allow states to build ever stronger rights on top of the federal floor. Many federal privacy laws allow this, including the Health Insurance Portability and Accountability Act, the Video Privacy Protection Act, and the Electronic Communications Privacy Act.
The SECURE Data Act would not do that. Instead, it would wipe out dozens, if not hundreds, of existing state privacy protections. Section 15 of the bill would preempt any “law, rule, regulation, requirement, standard, or other provision [that] relates to the provisions of this Act.” This would kill the 21 state consumer privacy laws passed in the past few years. These state bills aren’t strong enough, but they are still better than this federal proposal. For example, California maintains a data broker deletion tool and requires companies to comply with automatic opt-out signals—including one that is built into EFF’s Privacy Badger.
Because the SECURE Data Act has provisions that relate to data privacy and security, it could preempt all 50 state data breach laws and many others. It could also preempt state laws related to specific pieces of sensitive data, like bans on the sale of biometric or location information. Some states like California have constitutional provisions that protect an individual’s right to privacy, which can be enforced against companies. That constitutional provision, as well as state privacy torts, could also be in danger if this bill passed.
No Private Enforcement, A New Cure Period, and Vague Security Powers
Strong consumer privacy laws should allow consumers to take companies to court to defend their own rights. This is essential because regulators do not have the resources to catch every violation, and federal consumer enforcement agencies have been gutted during the current administration.
The SECURE Data Act does not have a private right of action. The FTC, along with state attorneys general, has primary enforcement authority. The law also gives companies 45 days after they are caught to “cure” any violation without penalty.
Moreover, Section 8 of the bill creates a vaguely defined self-regulatory scheme in which companies can apply to be audited by an “independent organization” that will apply a “code of conduct.” Following this code of conduct would give companies a presumption that they are complying with the law. This provision is an implicit acknowledgement that the bill does not provide regulators with any new resources to enforce new protections.
Section 9 of the bill would give the Secretary of Commerce broad power to “take any action necessary and appropriate to support the international flow of personal data,” including assessing “security interests of the United States.” The scope of this amorphous provision is unclear, but it likely does not belong in a consumer protection bill.
Weak Privacy Defaults
Your online privacy should not depend on whether you have the time, patience, and knowledge to navigate a website and turn off invasive tracking. Good privacy laws build in data minimization requirements—meaning there should be a default standard that prevents companies from processing your data for purposes that are not needed to provide you with the service you asked for.
The SECURE Data Act puts the burden on you to opt out of invasive company practices, like targeted third-party advertising, the sale of your personal data, and profiling. The bill at least requires companies to obtain your consent before processing your sensitive data (like selling your precise location). These consent requirements, however, are often an invitation for companies to trick you into clicking a button to give away your rights in hard-to-read policies. Indeed, few people would knowingly agree to let a company sell their personal data to a broker who turns around and sells it to the government.
Section 3 of the bill uses the term “data minimization,” but it is done in name only. The provision does not limit a company’s processing of data to only what is necessary to provide the customer with the good or service they asked for. Instead, the provision limits processing of data to only what a company “disclosed to the customer”—meaning if it is in the confusing privacy policy that nobody reads, it is okay.
And the bill would not even allow you to restrict certain uses of your data. As companies seek more data for AI systems, many internet users do not want their private personal data to be used to train those models. However, the bill makes clear that “nothing in this Act may be construed to restrict” a company from collecting, using, or retaining your data to “develop” or “improve” a new technology.
Other Flawed Definitions and Loopholes
The bill has numerous loopholes that technology companies would exploit if it were to become law. Below is just a sampling:
- Government contractors: Under Section 13(b)(2), government contractors are exempt from the bill, which could be wrongly interpreted to exempt certain data brokers from sale restrictions when those sales are made to the government. This type of exemption could benefit surveillance companies like Clearview AI, which previously argued it was exempt from Illinois’ strict biometric law using a similar contractor exception. This is likely not the authors’ intention, since the definition of sale includes those made “to a government entity.”
- Sale definition: The definition in Section 16(28) is too narrow. A sale should mean any exchange for monetary “or other valuable” consideration, as in some other privacy laws.
- Biometric information definition: The definition in Section 16(4) excludes data generated from a photo or video, and excludes face scans not meant to “identify a specific individual.” This could be wrongly interpreted to allow biometric identification from security camera footage, or biometric use for sentiment or demographic analysis.
- Personal data definition: The definition in Section 16(21) exempts “de-identified data” from the definition of personal data, which could allow companies to do anything with such data, since it is not protected by the law. The problem is that data claimed to be de-identified often is not, and can be re-identified.
- Deletion requests: With regard to data that a company obtained from a third-party, Section 2(d)(5) would treat a consumer’s deletion request merely as an opt-out request. And even if a customer requested deletion, a company might be able to retain the data for research purposes under section 11(a)(9)(A).
- Profiling definition: Under the definition in Section 16(25), companies could profile so long as the profiling is not “solely automated.” The flimsiest human review would exempt highly automated profiling.
Congress is long overdue to enact a strong comprehensive consumer data privacy law, and we have sketched what it should look like. But the SECURE Data Act is woefully inadequate. In fact, it would cause even more corporate surveillance of our personal information, by wiping out state laws that are more protective than this federal bill. Even worse, this bill would block state legislatures from protecting their residents from the privacy threats of tomorrow that are unforeseeable today.
[B] Government-made hate and the perception of public safety
[Film Mirror] Another Nadeshiko Japan, 'Eye Contact': a new film also coming for the Tokyo Deaflympics, by Katsuhiko Suzuki
EFF and 18 Organizations Urge UK Policymakers to Prioritize Addressing the Roots of Online Harm
EFF joins 18 organizations in writing a letter to UK policymakers urging them to address the root causes of online harm—rather than undermining the open web through blunt restrictions.
The coalition, which includes Mozilla, Tor Project, and Open Rights Group, warns that proposed measures following the passage of the Children’s Wellbeing and Schools Bill risk fundamentally reshaping the internet in harmful ways. Chief among these proposals are sweeping age-gating requirements and access restrictions that would apply not only to young people, but effectively to all users.
While framed as efforts to protect children online, these policies rely heavily on age assurance technologies that are inaccurate, privacy-invasive, or both. As the letter notes, mandating such systems across a wide range of services—from social media and video games to VPNs and even basic websites—would force users to verify their identity simply to access the web. This creates serious risks, including expanded surveillance, data breaches, and the erosion of anonymity.
Beyond privacy concerns, the signatories argue that these measures threaten the core architecture of the open internet. Age-gating at scale could fragment the web into a patchwork of restricted jurisdictions, limit access to information, and entrench the dominance of powerful gatekeepers like app stores and platform ecosystems. In doing so, policymakers risk weakening the very qualities—interoperability, accessibility, and openness—that have made the internet a global public resource.
The letter also emphasizes what’s missing from the current policy approach: meaningful efforts to address the underlying drivers of online harm. Many digital platforms are designed to maximize engagement and profit through pervasive data collection and targeted advertising, often at the expense of user safety and autonomy. Rather than imposing access bans, the coalition calls on UK policymakers to hold companies accountable for these systemic practices and to prioritize user rights by design.
Importantly, the signatories highlight that the internet remains a vital space for young people: offering access to information, support networks, and opportunities for expression that may not exist offline. Policies that restrict access risk cutting off these lifelines without meaningfully reducing harm.
The message is clear: protecting users online requires more than heavy-handed restrictions. It demands thoughtful, rights-respecting policies that tackle the business models and design choices driving harm, while preserving the open, global nature of the web.