Tips to Protect Your Posts About Reproductive Health From Being Removed
This is the ninth installment in a blog series documenting EFF’s findings from the Stop Censoring Abortion campaign. You can read additional posts here.
Meta has been getting content moderation wrong for years, like most platforms that host user-generated content. Sometimes that's the result of deliberate design choices—privacy rollbacks, opaque policies, features that prioritize growth over safety—made even when the company knows those choices could harm users. Other times, it's simply the inevitable outcome of trying to govern billions of posts with a mix of algorithms and overstretched human reviewers. Users shouldn't have to worry about their posts being deleted or their accounts being banned when they share factual health information that doesn't violate a platform's policies. But knowing what algorithmic moderation is likely to flag can help you avoid its mistakes.
We analyzed the roughly one-hundred survey submissions we received from social media users in response to our Stop Censoring Abortion campaign. Their stories revealed some clear patterns: certain words, images, and phrases seemed to trigger takedowns, even when posts didn’t come close to violating Meta’s rules.
For example, a post linking to information about how people are accessing abortion pills online is clearly not an offer to buy or sell pills. But an algorithm, or a human reviewer who can't tell the difference at a glance, might wrongly flag it for violating Meta's policies on promoting or selling "restricted goods."
That doesn’t mean you’re powerless. For years, people have used “algospeak”—creative spelling, euphemisms, or indirection—to sidestep platform filters. Abortion rights advocates are now forced into similar strategies, even when their speech is perfectly legal. It’s not fair, but it might help you keep your content online. Here are some things we learned from our survey:
Practical Tips to Reduce the Risk of Takedowns

While traditional social media platforms can help people reach larger audiences, using them generally means handing over control of what you and others can see to the people who run the company. That's the deal large platforms offer—and while most of us want platforms to moderate some content (even if that moderation is imperfect), current moderation systems often reflect existing societal power imbalances and impact marginalized voices the most.
There are ways companies and governments could better balance the power between users and platforms. In the meantime, there are steps you can take right now to break the hold these platforms have:
- Images and keywords matter. Posts with pill images, or accounts with "pill" in their names, were flagged often—even when the posts weren't offering to sell medication. Before posting, consider whether you need to include an image of a pill or the word "pill," or whether there's another way to communicate your message.
- Clarity beats vagueness. Saying “we can help you find what you need” or “contact me for more info” might sound innocuous, but to an algorithm, it can look like an offer to sell drugs. Spell out what kind of support you do and don’t provide—for example: “We can talk through options and point you toward trusted resources. We don’t provide medical services or medication.”
- Be careful with links. Direct links to organizations or services that provide abortion pills were often flagged, even if the organizations operate legally. Instead of linking, try spelling out the name of the site or account.
- Certain word combos are red flags. Posts that included words like “mifepristone,” “abortion,” and “mail” together were frequently removed. You may still want to use them—they’re accurate and important—but know they make your post more likely to be flagged.
- Ads are even stricter. Meta requires pharmaceutical advertisers to prove they’re licensed in the countries they target. If you boost posts, assume the more stringent advertising standards will be applied.
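To see why accurate, legal posts get swept up, it helps to picture the bluntness of keyword-based flagging. The sketch below is purely illustrative: the term combinations and the matching rule are invented for this example, and Meta's real systems are far more complex (and undisclosed). It only shows how co-occurrence rules can't distinguish a news share from a sale offer.

```python
# Toy illustration of keyword co-occurrence flagging.
# NOT Meta's actual system: the term sets and matching rule
# below are invented to show why innocuous posts get caught.

FLAGGED_COMBOS = [
    # A medication name appearing alongside shipping language
    # can look, to a crude rule, like an offer to sell.
    {"mifepristone", "mail"},
    {"abortion", "pills", "order"},
]

def looks_like_restricted_goods(post: str) -> bool:
    """Return True if any flagged word combination appears in the post."""
    words = set(post.lower().split())
    # A combo triggers when every word in it occurs in the post,
    # regardless of context or intent.
    return any(combo <= words for combo in FLAGGED_COMBOS)

# A factual news share trips the same rule as an actual sale offer:
print(looks_like_restricted_goods(
    "new study on mifepristone when sent by mail"))   # True
print(looks_like_restricted_goods(
    "we can talk through options and point you toward resources"))  # False
```

The rule has no notion of intent: "mifepristone" plus "mail" matches whether the post is reporting research or selling medication, which is exactly the failure mode the survey submissions describe.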
Big platforms give you reach, but they also set the rules—and those rules usually favor corporate interests over human rights. You don’t have to accept that as the only way forward:
- Keep a backup. Export your data regularly so you’re not left empty-handed if your account disappears overnight.
- Build your own space. Hosting a website isn’t free, but it puts you in control.
- Explore other platforms. Newsletters, Discord, and other community tools offer more control than Facebook or Instagram. Decentralized platforms like Mastodon and Bluesky aren’t perfect, but they show what’s possible when moderation isn’t dictated from the top down. (Learn more about the differences between Mastodon, Bluesky, and Threads, and how these kinds of platforms help us build a better internet.)
- Push for interoperability. Imagine being able to take your audience with you when you leave a platform. That’s the future we should be fighting for. (For more on interoperability and Meta, check out this video where Cory Doctorow explains what an interoperable Facebook would look like.)
If you’re working in abortion access—whether as a provider, activist, or volunteer—your privacy and security matter. The same is true for patients. Check out EFF’s Surveillance Self-Defense for tailored guides. Look at resources from groups like Digital Defense Fund and learn how location tracking tools can endanger abortion access. If you run an organization, consider some of the ways you can minimize what information you collect about patients, clients, or customers, in our guide to Online Privacy for Nonprofits.
Platforms like Meta insist they want to balance free expression and safety, but their blunt systems consistently end up reinforcing existing inequalities—silencing the very people who most need to be heard. Until they do better, it’s on us to protect ourselves, share our stories, and keep building the kind of internet that respects our rights.
This is the ninth post in our blog series documenting the findings from our Stop Censoring Abortion campaign. Read more in the series: https://www.eff.org/pages/stop-censoring-abortion
Affected by unjust censorship? Share your story using the hashtag #StopCensoringAbortion. Amplify censored posts and accounts, share screenshots of removals and platform messages—together, we can demonstrate how these policies harm real people.