JVN: Inefficient Use of Regular Expressions in GROWI

GROWI, provided by GROWI, Inc., processes input using inefficient regular expressions in some places. A crafted input can exploit this to cause a denial-of-service (DoS) condition.
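The advisory does not disclose the specific expression involved, but this class of bug (often called ReDoS, regular expression denial of service) typically comes from nested quantifiers that force a backtracking engine into exponentially many match attempts. The sketch below uses the classic illustrative pattern `^(a+)+$`, which is an assumption for demonstration purposes and not GROWI's actual code:

```python
import re
import time

# Illustrative only: a classic ReDoS-prone pattern (nested quantifiers).
# This is NOT the expression used in GROWI, just a stand-in for the bug class.
VULNERABLE = re.compile(r"^(a+)+$")

def match_time(pattern: re.Pattern, text: str):
    """Attempt a match and report how long it took."""
    start = time.perf_counter()
    result = pattern.match(text)
    return result, time.perf_counter() - start

# A non-matching input ("a" * n followed by "!") forces the backtracking
# engine to try exponentially many ways to split the run of 'a's between
# the inner and outer '+', so the time roughly doubles with each extra 'a'.
for n in (10, 15, 20):
    _, elapsed = match_time(VULNERABLE, "a" * n + "!")
    print(f"n={n}: {elapsed:.4f}s")

# Removing the redundant outer group yields an equivalent, linear-time pattern.
SAFE = re.compile(r"^a+$")
```

In practice, fixes involve rewriting the pattern to eliminate ambiguous nested repetition, bounding input length before matching, or using a linear-time engine such as RE2.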

[B] Ciao! Italy Report: Elementary School Graduation

Italian graduations are lively and free-spirited! In Italy, the school year ends in June, and this June our twins finished their five years of elementary school. There is no formal graduation ceremony here, but at my children's school the teachers put together a small, handmade ceremony of their own. Here is a report on how it went. (Sato Noriko, reporting from Italy)
Nikkan Berita

Copyright Cases Should Not Threaten Chatbot Users’ Privacy

Like users of all technologies, ChatGPT users deserve the right to delete their personal data. Nineteen U.S. states, the European Union, and a host of other countries already protect users' right to delete. For years, OpenAI gave users the option to delete their conversations with ChatGPT, rather than let their personal queries linger on corporate servers. Now, they can't. A badly misguided court order in a copyright lawsuit requires OpenAI to store all consumer ChatGPT conversations indefinitely—even if a user tries to delete them. This sweeping order far outstrips the needs of the case and sets a dangerous precedent by disregarding millions of users' privacy rights.

The privacy harms here are significant. ChatGPT's 300+ million users submit over 1 billion messages to its chatbots per day, often for personal purposes. Virtually any personal use of a chatbot—anything from planning family vacations and daily habits to creating social media posts and fantasy worlds for Dungeons and Dragons games—reveals personal details that, in aggregate, create a comprehensive portrait of a person's entire life. Other uses risk revealing people's most sensitive information. For example, tens of millions of Americans use ChatGPT to obtain medical and financial information. Notwithstanding the other risks of these uses, people still deserve privacy rights like the right to delete their data. Eliminating protections for user-deleted data risks chilling beneficial uses by individuals who want to protect their privacy.

This isn’t a new concept. Putting users in control of their data is a fundamental piece of privacy protection. Nineteen states, the European Union, and numerous other countries already protect the right to delete under their privacy laws. These rules exist for good reasons: retained data can be sold or given away, breached by hackers, disclosed to law enforcement, or even used to manipulate a user’s choices through online behavioral advertising.

While appropriately tailored orders to preserve evidence are common in litigation, that’s not what happened here. The court disregarded the privacy rights of millions of ChatGPT users without any reasonable basis to believe it would yield evidence. The court granted the order based on unsupported assertions that users who delete their data are probably copyright infringers looking to “cover their tracks.” This is simply false, and it sets a dangerous precedent for cases against generative AI developers and other companies that have vast stores of user information. Unless courts limit orders to information that is actually relevant and useful, they will needlessly violate the privacy rights of millions of users.

OpenAI is challenging this order. EFF urges the court to lift the order and correct its mistakes.

Tori Noble