EFF Urges Court to Avoid Fair Use Shortcuts in Kadrey v. Meta Platforms
EFF has filed an amicus brief in Kadrey v. Meta, one of the many ongoing copyright lawsuits against AI developers. Most of the AI copyright cases raise an important new issue: whether the copying necessary to train a generative AI model is a non-infringing fair use.
Kadrey, however, attempts to side-step fair use. The plaintiffs—including Sarah Silverman and other authors—sued Meta for allegedly using BitTorrent to download “pirated” copies of their books to train Llama, a large language model. In other words, their legal claims challenge how Meta obtained the training materials, not what it did with them.
But some of the plaintiffs’ arguments, if successful, could harm AI developers’ defenses in other cases, where fair use is directly at issue.
How courts decide this issue will profoundly shape the future of this transformative technology, including its capabilities, its costs, and whether its evolution will be shaped by the democratizing forces of the open market or the whims of an oligopoly.
A question this important deserves careful consideration on a full record—not the hyperbolic cries of “piracy” and the legal shortcuts that the plaintiffs in this case are seeking. As EFF explained to the court, the question of whether fair use applies to training generative AI is far too important to decide based on Kadrey’s back-door challenge.
And, as EFF explained, whether a developer can legally train an AI on a wide variety of creative works shouldn’t turn on which technology they used to obtain those materials. As we wrote in our brief, the “Court should not allow the tail of Meta’s alleged BitTorrent use to wag the dog of the important legal questions this case presents. Nor should it accept Plaintiffs’ invitation to let hyperbole about BitTorrent and 'unmitigated piracy' derail the thoughtful and fact-specific fair use analysis the law requires.”
We also urged the court to reject the plaintiffs’ attempt to create a carve-out in copyright law for copies obtained using “BitTorrent.”
This dangerous argument seeks to categorically foreclose the possibility that even the most transformative, socially beneficial uses—such as AI training—could be fair use.
As EFF explained in its brief, adopting an exemption from the flexible, fact-specific fair use analysis for “BitTorrent,” “internet piracy,” “P2P downloading,” or something else, would defeat the purpose of the fair use doctrine as a safeguard for the application of copyright law to new technologies.
Privacy on the Map: How States Are Fighting Location Surveillance
Your location data isn't just a pin on a map—it's a powerful tool that reveals far more than most people realize. It can expose where you work, where you pray, who you spend time with, and, sometimes dangerously, where you seek healthcare. In today’s world, your most private movements are harvested, aggregated, and sold to anyone with a credit card. For those seeking reproductive or gender-affirming care, or visiting a protest or an immigration law clinic, this data is a ticking time bomb.
Last year, we sounded the alarm, urging lawmakers to protect individuals from the growing threats of location tracking tools—tools that are increasingly being used to target and criminalize people seeking essential reproductive healthcare.
The good news? Lawmakers in California, Massachusetts, Illinois, and elsewhere are stepping up, leading the way to protect privacy and ensure that healthcare access and other exercises of our rights remain safe from invasive surveillance.
The Dangers of Location Data
Imagine this: you leave your home in Alabama, drop your kids off at daycare, and then drive across state lines to visit an abortion clinic in Florida. You spend two hours there before driving back home. Along the way, you use your phone’s GPS app to navigate, or a free radio app to listen to the news. Unbeknownst to you, this “free” app tracks your entire route and sells it to a data broker. That broker then maps your journey and makes it available to anyone who will pay for it.
This is exactly what happened when privacy advocates used a tool called Locate X, developed by Babel Street, to track a person’s device as they traveled from Alabama—where abortion is completely banned—to Florida, where abortion access is severely restricted but still available.
Despite this tool being marketed as solely for law enforcement use, private investigators were able to access it by falsely claiming they would work with law enforcement, revealing a major flaw in our data privacy system. In a time when government surveillance of private personal decisions is on the rise, the fact that law enforcement (and adversaries pretending to be law enforcement) can access these tools puts our personal privacy in serious danger.
The unregulated market for location data enables anyone, from law enforcement to anti-abortion groups, to access and misuse this sensitive information. For example, a data broker called Near Intelligence sold location data of people visiting Planned Parenthood clinics to an anti-abortion group. Likewise, law enforcement in Idaho used cell phone location data to charge a mother and her son with “aiding and abetting” abortion, a clear example of how this information can be weaponized to enforce abortion restrictions against patients and anyone else in their orbit.
States Taking Action
As we’ve seen time and time again, the collection and sale of location data can be weaponized to target many vulnerable groups—immigrants, the LGBTQ+ community, and anyone seeking reproductive healthcare. In response to these growing threats, states like California, Massachusetts, and Illinois are leading the charge by introducing bills aimed at regulating the collection and use of location data.
These bills are a powerful response to the growing threat. They are grounded in well-established principles of privacy law, including informed consent and data minimization, ensuring that only essential data is collected and that it is kept secure. Importantly, they give people—whether they reside in the state or are traveling from other states—the confidence to exercise their rights (such as seeking health care) without fear of surveillance or retaliation.
This post outlines some of the key features of these location data privacy laws, to show authors and advocates of legislative proposals how best to protect their communities. Specifically, we recommend:
- Strong definitions,
- Clear rules,
- Affirmation that all location data is sensitive,
- Empowerment of consumers through a strong private right of action,
- Prohibition of “pay-for-privacy” schemes, and
- Transparency through clear privacy policies.
Strong Definitions
Effective location privacy legislation starts with clear definitions. Without them, courts may interpret key terms too narrowly—weakening the law's intent. And in the absence of clear judicial guidance, regulated entities may exploit ambiguity to sidestep compliance altogether.
The following are some good definitions from the recent bills:
- In the Massachusetts bill, "consent" must be “freely given, specific, informed, unambiguous, [and] opt-in.” Further, it must be free from dark patterns—ensuring people truly understand what they’re agreeing to.
- In the Illinois bill, a “covered entity” includes all manner of private actors, including individuals, corporations, and associations, exempting only individuals acting in noncommercial contexts.
- "Location information" must clearly refer to data derived from a device that reveals the past or present location of a person or device. The Massachusetts bill sets a common radius in defining protected location data: 1,850 feet (about one-third of a mile). The California bill goes much bigger: five miles. EFF has supported both radiuses.
- A “permissible purpose” (which is key to the minimization rule) should be narrowly defined to include only: (1) delivering a product or service that the data subject asked for, (2) fulfilling an order, (3) complying with federal or state law, or (4) responding to an imminent threat to life.
Clear Rules
“Data minimization” is the privacy principle that corporations and other private actors must not process a person’s data except as necessary to give them what they asked for, with narrow exceptions. A virtue of this rule is that a person does not need to do anything in order to enjoy their statutory privacy rights; the burden is on the data processor to process less data. Together, these definitions and rules create a framework that ensures privacy is the default, not the exception.
One key data minimization rule, as in the Massachusetts bill, is: “It shall be unlawful for a covered entity to collect or process an individual’s location data except for a permissible purpose.” Read along with the definition above, this across-the-board rule means a covered entity can only collect or process someone’s location data to fulfill their request (with exceptions for emergencies and compliance with federal and state law). A minimal sketch of how this default-deny approach might look in practice appears after the list below.
Additional data minimization rules, as in the Illinois bill, back this up by restraining particular data practices:
- Covered entities cannot collect more precise data than strictly necessary, or use location data to make inferences beyond what is needed to provide the service.
- Data must be deleted once it’s no longer necessary for the permissible purpose.
- No selling, renting, trading, or leasing location data – full stop.
- No disclosure of location data to the government, except with a warrant, as required by state or federal law, at the request of the data subject, or in an emergency involving a threat of serious bodily injury or death (defined to exclude abortion).
- No other disclosure of location data, except as required for a permissible purpose or when requested by the individual.
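To make this default-deny structure concrete, here is a minimal sketch, in Python, of how a covered entity might gate its own processing of location data on an enumerated permissible purpose plus opt-in consent. The purpose names track the four categories defined above; the function and everything else here is our illustration, not language from any of the bills.

```python
from enum import Enum, auto

class PermissiblePurpose(Enum):
    """The four narrowly defined purposes described above."""
    REQUESTED_SERVICE = auto()   # delivering what the data subject asked for
    ORDER_FULFILLMENT = auto()   # fulfilling an order
    LEGAL_COMPLIANCE = auto()    # complying with federal or state law
    IMMINENT_THREAT = auto()     # responding to an imminent threat to life

def may_process_location(purpose, opt_in_consent: bool) -> bool:
    """Default-deny check: processing is allowed only for an enumerated
    permissible purpose and (under the Illinois/Massachusetts model)
    with the data subject's opt-in consent."""
    if not isinstance(purpose, PermissiblePurpose):
        return False  # anything outside the enumerated purposes is off-limits
    return opt_in_consent

# Ad targeting or resale can never pass the check, because it cannot be
# expressed as a PermissiblePurpose in the first place.
assert may_process_location(PermissiblePurpose.REQUESTED_SERVICE, True)
assert not may_process_location("ad_targeting", True)
```

The point of the structure is that the burden sits with the processor: unless a request matches one of the enumerated purposes, the answer is no, with no action required from the individual.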
The California bill rests largely on data minimization rules like these. The Illinois and Massachusetts bills place an additional limit: no collection or processing of location data absent opt-in consent from the data subject. Critically, consent in these two bills is not an exception to the minimization rule, but rather an added requirement. EFF has supported both models of data privacy legislation: just a minimization requirement; and paired minimization and consent requirements.
All Location Data is Sensitive
To best safeguard against invasive location tracking, it’s essential to place legal restrictions on the collection and use of all location data—not just data associated with sensitive places like reproductive health clinics. Narrow protections may offer partial help, but they fall short of full privacy.
Consider the example at the beginning of the blog: if someone travels from Alabama to Florida for abortion care, and the law only shields data at sensitive sites, law enforcement in Alabama could still trace their route from home up to near the clinic. Once the person enters a protected “healthcare” zone, their device would vanish from view temporarily, only to reappear shortly after they leave. This gap in the tracking data could make it relatively easy to deduce where they were during that time, essentially revealing their clinic visit.
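To see why the gap itself is revealing, consider a minimal sketch, in Python, of how anyone holding broker data could flag suppressed "healthcare zone" visits. The pings, timestamps, and threshold are all hypothetical; the point is only that an unusually long silence between two nearby pings gives the visit away.

```python
from datetime import datetime, timedelta

# Hypothetical broker data: pings vanish while the device is inside a
# protected "healthcare zone," then resume on the way out.
pings = [
    ("2025-03-01 08:05", "home, AL"),
    ("2025-03-01 09:10", "I-65 South, AL"),
    ("2025-03-01 10:05", "FL state line"),
    ("2025-03-01 10:55", "parking area near clinic, FL"),
    # -- no pings: device inside the protected zone --
    ("2025-03-01 13:02", "same parking area, FL"),
    ("2025-03-01 14:10", "FL state line"),
    ("2025-03-01 15:30", "home, AL"),
]

def find_suspicious_gaps(pings, threshold=timedelta(minutes=90)):
    """Flag coverage gaps longer than `threshold`; the last ping before a
    gap points at whatever sits inside the suppressed area."""
    fmt = "%Y-%m-%d %H:%M"
    gaps = []
    for (t1, loc1), (t2, _) in zip(pings, pings[1:]):
        gap = datetime.strptime(t2, fmt) - datetime.strptime(t1, fmt)
        if gap > threshold:
            gaps.append((loc1, gap))
    return gaps

print(find_suspicious_gaps(pings))
# [('parking area near clinic, FL', datetime.timedelta(seconds=7620))]
```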
To avoid this kind of loophole, the most effective approach is to limit the collection and retention of all location data—no exceptions. This is the approach in all three of the bills highlighted in this post: California, Illinois, and Massachusetts.
Empowering Consumers Through a Strong PRA
To truly protect people’s location privacy, legislation must include a strong private right of action (PRA)—giving individuals the power to sue companies that violate their rights. A private right of action ensures companies can’t ignore the law and empowers people to seek justice directly when their sensitive data is misused. This is a top priority for EFF in any data privacy legislation.
The bills in Illinois and Massachusetts offer strong models. They make clear that any violation of the law is an injury and allow individuals to bring civil suits: “A violation of this [law] … regarding an individual’s location information constitutes an injury to that individual. … Any individual alleging a violation of this [law] … may bring a civil action …” Further, these bills provide a baseline amount of damages (sometimes called “liquidated” or “statutory” damages), because an invasion of statutory privacy rights is a real injury, even if it is hard for the injured party to prove out-of-pocket expenses from theft, bodily harm, or the like. Absent this kind of statutory language, some victims of privacy violations will lose their day in court.
These bills also override mandatory arbitration clauses that limit access to court. Corporations should not be able to avoid being sued by forcing their customers to sign lengthy contracts that nobody reads.
Other remedies include actual damages, punitive damages, injunctive relief, and attorney’s fees. These provisions give the law real teeth and ensure accountability can’t be signed away in fine print.
No Pay-for-Privacy Schemes
Strong location data privacy laws must protect everyone equally—and that means rejecting “pay-for-privacy” schemes that allow companies to charge users for basic privacy protections. Privacy is a fundamental right, not a luxury add-on or subscription perk. Allowing companies to offer privacy only to those who can afford to pay creates a two-tiered system where low-income individuals are forced to trade away their sensitive location data in exchange for access to essential services. These schemes also pressure everyone to surrender their privacy in exchange for discounts or access.
Legislation should make clear that companies cannot condition privacy protections on payment, loyalty programs, or any other exchange of value. This ensures that everyone—regardless of income—has equal protection from surveillance and data exploitation. Privacy rights shouldn’t come with a price tag.
We commend this language from the Illinois and Massachusetts bills:
A covered entity may not take adverse action against an individual because the individual exercised or refused to waive any of such individual’s rights under [this law], unless location data is essential to the provision of the good, service, or service feature that the individual requests, and then only to the extent that this data is essential. This prohibition includes, but is not limited to: (1) refusing to provide a good or service to the individual; (2) charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties; or (3) providing a different level of quality of goods or services to the individual.
Transparency Through Clear Privacy Policies
It is helpful for data privacy laws to require covered entities to be transparent about their data practices. All three bills discussed in this post require covered entities to make available a privacy policy to the data subject—a solid baseline. This ensures that people aren’t left in the dark about how their location data is being collected, used, or shared. Clear, accessible policies are a foundational element of informed consent and give individuals the information they need to protect themselves and assert their rights.
It is also helpful for privacy laws like these to require covered entities to prominently publish their privacy policies on their websites. This allows all members of the public – as well as privacy advocates and government enforcement agencies – to track whether data processors are living up to their promises.
Next Steps: More States Must Join
The bottom line is clear: location data is highly sensitive, and without proper protections, it can be used to harm those who are already vulnerable. The digital trail we leave behind can reveal far more than we think, and without laws in place to protect us, we are all at risk.
While some states are making progress, much more needs to be done. More states need to follow suit by introducing and passing legislation that protects location data privacy. We cannot allow location tracking to be used as a tool for harassment, surveillance, or criminalization.
To help protect your digital privacy while we wait for stronger privacy protection laws, we’ve published a guide specifically on how to minimize intrusion from Locate X, and have additional tips on EFF’s Surveillance Self-Defense site. Many general privacy practices also offer strong protection against location tracking.
If you live in California, Illinois, Massachusetts – or any state that has yet to address location data privacy – now is the time to act. Contact your lawmakers and urge them to introduce or support bills that protect our sensitive data from exploitation. Demand stronger privacy protections for all, and call for more transparency and accountability from companies that collect and sell location data. Together, we can create a future where individuals are free to travel without the threat of surveillance and retaliation.