Citizens' Groups Rally in Tokyo for Fundamental Reform of the Information Disclosure Act and the Public Records Management Act

1 day 14 hours ago
On December 6, the opening day of the extraordinary Diet session, and by coincidence the same date on which the Act on the Protection of Specially Designated Secrets was railroaded into law eight years earlier, citizens' groups held a rally in Tokyo calling for that law's repeal. Speakers discussed the need for fundamental reform of the Information Disclosure Act and the Public Records Management Act as a step toward repealing the secrets law, along with the efforts planned to achieve it. The rally was organized by the "Committee for the Repeal of the Secrets Protection Law!" and the "No to the Conspiracy Law! Committee". (Hiroyuki Iwamoto)
Nikkan Berita

CITAD launches first Nigerian School of Community Networks

1 day 16 hours ago

CITAD has just launched the first Nigerian School of Community Networks, where members of rural communities will learn how to create and manage community networks as a means of bridging the connectivity gaps in the areas where they live.

deborap

Pay a Hacker, Save a Life

1 day 16 hours ago
Episode 104 of EFF’s How to Fix the Internet

How do we make the Internet more secure? Part of the solution is incentives, according to Tarah Wheeler, this week’s guest on EFF’s How to Fix the Internet. As a security researcher with deep experience in the hacker community, Tarah talks about how many companies are shooting themselves in the foot when it comes to responding to security disclosures. Along with EFF co-hosts Cindy Cohn and Danny O’Brien, Tarah also talks about how existing computer crime law can serve to terrify security researchers, rather than uplift them.

Click below to listen to the show now, or choose your podcast player:

Listen on the Simplecast player: https://player.simplecast.com/45e44f5b-6f6c-47bd-9e0f-5284bd5b0d69 (privacy info: this embed serves content from simplecast.com).

Computers are in everything we do — and that means computer security matters to every part of our lives. Whether it’s medical devices or car navigation, better security makes us safer. 

Note: We'll be having a special, live event with Tarah Wheeler to continue this conversation on Thursday December 9th. RSVP or learn more.

On this episode, you’ll learn:

  • About the human impact of security vulnerabilities—and how unpatched flaws can change or even end lives;
  • How to reconsider the popular conception of hackers, and understand their role in helping build a more secure digital world;
  • How the Computer Fraud and Abuse Act (CFAA), a law that is supposed to punish computer intrusion, has been written so broadly that it now stifles security researchers;
  • What we can learn from the culture around airplane safety regulation—including transparency and blameless post-mortems;
  • How we can align incentives, including financial incentives, to improve vulnerability reporting and response;
  • How the Supreme Court case Van Buren helped security researchers by ensuring that the CFAA couldn’t be used to prosecute someone for merely violating the terms of service of a website or application;
  • How a better future would involve more collaboration and transparency among both companies and security researchers.

Tarah Wheeler is an information security executive, social scientist in the area of international conflict, author, and poker player. She serves on the EFF advisory board, as a cyber policy fellow at Harvard, and as an International Security Fellow at New America. She was a Fulbright Scholar in Cybersecurity last year. You can find her on Twitter at @Tarah or at her website: https://tarah.org/.  

If you have any feedback on this episode, please email podcast@eff.org.

Below, you’ll find legal resources - including important cases, books, and briefs discussed in the podcast - and a full transcript of the audio.

Resources

Consumer Data Privacy:

Ransomware:

Computer Fraud and Abuse Act (CFAA):

Electoral Security:


Transcript

Tarah: So in 2010, I was getting married for the first time. And as I was walking down the street one night, I see one of the local bridal shops had its front door just hanging open in the middle of the night. There's no one around; it just looks like someone maybe thought the door was closed and left out the back. So I poked my head in, I look around: hey, is anybody in here? I closed the door, latched it all the way until I could feel it rattle, locked it from the inside, and pulled it shut. And I left a little note on the door saying, hey folks, just want to let you know your door was open, in case there's something wrong with the lock.

And I left. Never heard back from them again. Not a single acknowledgement, not a thank you, not anything. And that is really the place a lot of security researchers find themselves in when they try to make a third-party report of a security vulnerability to a company: they just get ignored. And you know, it's a little annoying.

Danny: That's Tarah Wheeler, and she's our guest this week on How to Fix the Internet. We're going to talk to her about coordinated vulnerability disclosure: what should happen if you find a flaw in software that needs to be fixed, and how people like Tarah can keep you safe.

I'm Danny O'Brien.

Cindy: And I'm Cindy Cohn. Welcome to How to Fix the Internet, a podcast from the Electronic Frontier Foundation helping you understand how we can all make our digital future better.

Danny: Welcome Tarah. Thank you so much for joining us.

Tarah: Thank you so much for having me. Cindy, it's an incredible pleasure. Thanks so much, Danny.

Cindy: Tarah, you are a cyber policy fellow at Harvard and an international cybersecurity fellow at New America, you were a Fulbright Scholar in cybersecurity last year, and to our great delight you are also a member of the EFF advisory board. Suffice it to say that you know a lot about this stuff. Off the top, you told us a story about walking past a bridal shop, doing a good deed by locking the door, and then never hearing back. Can you explain how that story connects to the coordinated vulnerability disclosure world that you live in?

Tarah: Absolutely. So coordinated vulnerability disclosure is a process engaged in by multiple stakeholders, which, translated into normal human terms, means there needs to be a way for a company to get that information from somebody who wants to tell them that something's gone wrong.

Well, the problem is that companies are often either unaware that they should have an open door policy for third-party security researchers to let them know something's gone wrong. Security researchers, on the other hand, need to provide that information to companies without having it start off with sounding like a ransom demand, basically.

Danny: Right

Tarah: So let me give you an example. If you find that there's a vulnerability, something like, I don't know, a cross-site scripting issue in a company's website, you might try to let that company know that something's gone wrong, that they have a security vulnerability that's easily exploitable and public to the internet.
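The cross-site scripting flaw Tarah uses as her example arises when a site echoes attacker-controlled input back into a page without escaping it. A minimal, hypothetical sketch in Python; the function name and markup are purely illustrative, not drawn from any real site:

```python
import html

def render_greeting(name: str) -> str:
    """Return an HTML fragment that echoes user input back to the page.

    The html.escape call is what prevents reflected XSS: without it,
    a name like '<script>...</script>' would execute in the visitor's
    browser instead of rendering as text.
    """
    return "<p>Hello, " + html.escape(name) + "</p>"

# A benign value passes through unchanged.
print(render_greeting("Tarah"))  # <p>Hello, Tarah</p>

# A script payload is neutralized into inert text.
print(render_greeting("<script>alert(1)</script>"))
# <p>Hello, &lt;script&gt;alert(1)&lt;/script&gt;</p>
```

The vulnerable version would concatenate `name` directly into the markup; the fix is the single escaping call.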

Well, the question for a lot of people is what to do if you don't know how to get that information to somebody at a company. The industry-standard first step is for the company to make sure the alias security@company.com is available as an email address that can take reports from third-party researchers. Those of us in the industry and the community just sort of expect that email alias to work.

Hopefully you can look on their site and find a way to contact somebody in technical support or on the security team and let them know something's wrong. However, a lot of companies have trouble taking those reports in and acknowledging them because, honestly, there are two different tracks the world operates on. There's the community of information security researchers, who operate on gratitude and money. And then there are corporations, which operate on liability and public brand management. So when a company gets a third-party report of a security vulnerability, there's a triage process that needs to happen at that company.
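Alongside the security@ alias, many organizations now publish their disclosure contact in a security.txt file (RFC 9116), served at /.well-known/security.txt. A small sketch of reading one; the parser and the example file contents below are illustrative, not any real company's policy:

```python
def parse_security_txt(text: str) -> dict:
    """Collect the 'Field: value' lines of an RFC 9116 security.txt file.

    Comment lines start with '#'. A field such as Contact may appear
    more than once, so values are gathered into lists keyed by the
    lower-cased field name.
    """
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or ":" not in line:
            continue  # skip blanks, comments, and malformed lines
        name, _, value = line.partition(":")
        fields.setdefault(name.strip().lower(), []).append(value.strip())
    return fields

# A made-up example file.
example = """# Vulnerability disclosure policy
Contact: mailto:security@example.com
Contact: https://example.com/report
Expires: 2026-01-01T00:00:00Z
"""

print(parse_security_txt(example)["contact"])
# ['mailto:security@example.com', 'https://example.com/report']
```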

And I'm here to tell you, as a person who's done this both inside and outside companies: when you are a company receiving reports of a vulnerability, unless you can fix that vulnerability pretty quickly, you may not wish to acknowledge it. You sometimes get pressure from inside the company, especially from the lawyers, because an acknowledgement can be read as a commitment that the company has seen the report, triaged it, and will repair the vulnerability in a timely manner. Let me assure you, a timely manner looks really different to a security researcher than to internal counsel.

Danny: If somebody finds a vulnerability and reports it to a company, and the company either blows them off or tries to cover it up, what are the consequences for the average user?

Like how does it affect me?

Tarah: How does it affect a normal person who's a user of a product if somebody who's a security researcher has reported a vulnerability to that company and the company never fixes it?

Danny: Hmmm Hmmm

Tarah: Well, I don't know about you, but I'm one of the 143 million people that lost my personal information and credit history when Equifax decided not to patch a single vulnerability in their servers.

Behind every data breach is a story. And that story is either that people didn't know something was wrong, or people knew something was wrong, but they de-prioritized the fix for it, not understanding how severely it could impact them and consumers and the people whose data they're storing.

Danny: You talked about coordinated vulnerability disclosure, so who's coordinating, and what's being coordinated?

Tarah: When we talk about multiple stakeholders in a vulnerability, one of the things we're talking about is not just the people who found it and the people who need to fix it, but also the people who are advocating for the consumers who may be affected by it.

That's how you'll get situations like the FTC stepping in to have a conversation or two with companies that have repeatedly failed to fix major vulnerabilities in the systems protecting consumer data. The EFF, as a great example, tends to want to protect a larger community of people: not just the researchers, not just the people working at the company, but all the people who are impacted by a vulnerability. So when a security researcher finds something that's wrong and reports it to a company, the company's incentives need to be aligned with fixing the vulnerability, not suing the researcher into silence.

Cindy: EFF's had a pretty significant role in this, and I remember the bad old days when a security researcher pretty much immediately got either a knock on the door from law enforcement or, you know, service of process for being sued, for having the audacity to tell a company that it's got a security problem.

And what I really love about the way the world has evolved is that we do have this conversation now more often than not in the software industry. But you know, computers are in everything now. They're in cars and refrigerators, they're in medical devices that are literally inside our bodies, like insulin pumps and heart monitors.

I'm wondering if you have a sense of how other industries are doing here now that they're in the computer software business too.

Tarah: I was recently on a panel about Internet of Things security at the Organization for Economic Cooperation and Development, and I was talking to somebody who was previously with the Australian consumer product safety commission, or their equivalent of it. And I am here to tell you that having computers in everything is a fascinating question, because that consumer product safety person's entire perspective had very little to do with whether there was a software vulnerability in the computers being used, and everything to do with whether the product that computer had been put in dealt safely with temperatures.

So when we start talking about putting a computer in everything, we start talking about things that can kill people. Altering the temperature inside refrigerators and freezers, or changing whether a sous vide machine gives an accurate readout: that's the kind of vulnerability that can kill people.

Danny: Do you think there's something particularly different about software compared to other disciplines where we've sort of sorted out the safety problem? Like, bridges don't fall down anymore, right? Is there something we're doing with bridges that we're not doing with software, or is it just that software is hard?

Tarah: Number one, software's hard. Number two, I love an industry I'm going to bring up instead of bridges, and that is aviation. And I will tell you what aviation does differently than we do in information security: they have a blameless post-mortem to discuss how and why an accident occurred. They have a multi-stakeholder approach that brings in multiple different people to examine the causes of an incident. We're talking, of course, about the NTSB and the FAA, the National Transportation Safety Board and the Federal Aviation Administration. And in aviation, the knowledge exchange between pilots is perpetual, ongoing, expected, regulated, and the expectation for those of us in the aviation community is that we will perpetually be telling other people what we did wrong. There's a perpetual culture of discussing and revealing our own mistakes and talking about how other people can avoid them, in a way that carries no penalty for doing so. Everyone there will tell you what they've done wrong as a pilot to deeply convince you not to make the same mistakes. That's just not true in information security. We hide all of our mistakes as fast as we can bury them, under piles of liability and that dreaded email subject line, "attorney-client confidential." That's where the perpetual culture of secrecy comes from, and that's why we have this problem.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

Cindy: I think that is really a great place to pivot a little bit to how we fix it. One of the reasons the lawyers are so worried is that the liability risk is sometimes massively overblown, but sometimes real. And the law is set up so that there's real risk to a security researcher, risk just for telling somebody the truth, while on the other side the liability threat doesn't line up the company's incentives with what's best for society. To me that does not mean we protect the company against any liability: when a plane falls out of the sky, we still expect the company to be held accountable for the damage it does. But there, the liability piece doesn't get in the way of learning from the mistakes. In the rest of the software world, we're seeing the liability piece get in the way. Some of that is changes we can make to law and policy, but some of that, I think, is changes we need to make to how these problems are lawyered.

Tarah: A recent ransomware attack on a healthcare facility, I think it was in the South, just resulted in a lawsuit from a patient whose daughter died after not receiving sufficient care while the hospital was experiencing a cyber attack. She's filed suit now, and the concern among people watching this process is that patients suing hospitals for not revealing they were under cyber attack, or for not providing appropriate care during a ransomware attack, is not likely to create a situation where hospitals are more open about the fact that they're experiencing a network outage. It's likely to result in hospitals turning patients away in the middle of ransomware attacks.

Now that's the exact wrong lesson to learn. So when I look at the way we're thinking about liability in critical infrastructure, the incentives are totally wrong for being publicly open and honest about the fact that a hospital is experiencing a cyber attack. The hospital chose not to pay the ransom, and it's important to note that paying might never have gotten their records or their network back anyway, or might not have done so in time to save this woman's daughter. But at the same time, we can't teach hospitals that the lesson to learn from cyber attacks and lawsuits is that they need to shut up more and pay more ransoms. We can't teach institutions that that is the right way to respond to this fear of liability.

Cindy: One of the ways we've helped fix this in California is the data breach notification law. In California, if you have a data breach, and this impacts a lot of people around the world because so many companies are based in California, the company's liability risk goes down if you tell people about the breach in a timely manner. Not to zero, because I think it's tremendously important that people can still sue if their kids die because of a data breach; you can't remove all accountability. But the accountability shifts. So there are things we can do, and a national data breach notification law is one of the things we could consider. We can set these incentives so that we shift a company's risk evaluation towards talking about this stuff and away from not talking about it.

The California law is a good example, but many of the federal cybersecurity laws say, well, you have to tell the government, but you won't tell anybody else.

Tarah: There is a dearth of qualified senior cybersecurity professionals out there, and that is the legacy of the 1986 Computer Fraud and Abuse Act. The government did it to themselves on this one, because the people who care enough to try to follow the law, but who also have the curious minds to start doing information security research, to start doing offensive security research and trying to help people understand what their vulnerabilities are, are terrified of the CFAA. The CFAA stops independent security research to a level that I think most people still don't really understand. So as a result, we end up with a culture of fear among people who are trying to be ethical and law-abiding, and a situation where the people who aren't just have to evade federal law enforcement in the United States long enough to make a quick profit and then get out.

Danny: And this isn't just in the United States, right? I mean, one of the things that's been most disappointing, I think about the CFAA has been that it's been exported around the world and that you have exactly that same challenge for people being turned into criminals instead of good citizens, wherever they live.

Tarah: And we're looking at a law that should have had a sunset provision in it to begin with, a law that was created, put into place, and supported by a judge who thought you could whistle into a payphone to launch nuclear weapons. Look, people, computers are not magic sky-fairy boxes full of pixie dust that can somehow change the world. I mean, unless you're mining for blockchain, because we all know that that's the magical stuff. The same situation applies here: computers are not magic. There are some flaws in our legal system in the United States, and the CFAA is often sprinkled over the top to get indictments. We already have laws that describe what fraud is and what theft is, and saying that it's happening over a computer doesn't make it not fraud, doesn't make it worse than fraud. It's just a different medium of doing it.

We all know what right and wrong are. And so adding a law that says, if you use a magic pixie box to commit a crime, then it's somehow worse, doesn't make any sense to those of us who work with computers on an everyday basis.

Cindy: I think one of the lessons of the CFAA is that maybe we shouldn't write a law based upon a Matthew Broderick movie of the eighties. The CFAA was apparently passed after President Reagan saw WarGames, which is a very fun movie, but not actually realistic. So please go on.

Tarah: So the nature of the CFAA, going back to the real story here, is that it's being used by people who don't understand it to prosecute people who never intended to break laws, or who, if they did, broke laws we already have to cover that conduct. So the CFAA now is mostly being used, from what we're able to see in industry, to stop exiting employees of large corporations from setting up competing businesses.

That's the actual use, quietly behind the scenes, of the CFAA. The other, very public use is to go after people who have no business being prosecuted with that law. They might be bad people. Take the police officer in the recent Van Buren v. United States, who collaborated with criminals to harass women and used his departmental computer to look up information. The problem we have here is that this police officer was charged under the CFAA. Now, he wasn't a good guy, but we already have names for the laws he broke: abuse of the public trust, fraud, theft.

And the problem we're having here is that the crime he committed is exactly the same whether he looked that information up on his laptop or by going through a file cabinet built entirely out of paper and wood back at the department.

So we're inventing a law to prosecute information security researchers and employees who are leaving companies, or to unfairly prosecute bad people who committed crimes we already have names for. We don't need the CFAA. We already know when people have done the right thing and the wrong thing, and we already have laws for those things. But it's very easy for a judge to be convinced that something scary is going to happen on a computer, because they don't understand how computers work.

Cindy: Let's switch a little bit into what it looks like if we get this right. We've already talked about one piece: the Computer Fraud and Abuse Act isn't being used to scare people out of doing good deeds anymore, and the idea that we have a global cooperative network of people all pointed towards making our networks secure is something we embrace instead of something we disincentivize. And we need to embrace that on the individual level, with things like the CFAA; on the corporate level, by aligning corporate incentives; and on the government level, where we encourage governments not to stockpile vulnerabilities and not to be big buyers on the private market, but instead to give that information to the company, so the companies can fix it and make things better.

What else do you think it looks like if we get it right, Tarah?

Tarah: If we get it right, security researchers who report vulnerabilities to companies would be appropriately compensated. That doesn't mean a security researcher who reports a previously unknown but small vulnerability should be getting a Porsche every single time.

It does mean that researchers who try to help should at the very least experience some gratitude. And when you find and report a vulnerability that is a company killer, fully critical, one that could take the entire system, the entire company, down, you should receive appropriate compensation.

Cindy: We need to align the incentives for doing the right thing with the incentives for doing the wrong thing is what I'm hearing you say.

Tarah: That is correct. We need to align those incentives.

Cindy: And that's just chilling, right? Because what if those security researchers instead sold that vulnerability to somebody who wants to undermine our elections? I can't imagine anything where it's more important to be secure, right, and protected than our basic right to vote and to live in a democracy that works. When something is that important to a functioning society, we shouldn't have to depend on the goodwill of security researchers to tell the good guys about a vulnerability and not the bad guys. We need to set the incentives up so they always push researchers in the direction of the good guys, and things like monitors for our health and the protection of our vote are where those incentives should be the strongest.

Tarah: Absolutely. That same mindset that lets you find vulnerabilities after the fact lets you see where they're being created. Unfortunately, first, not enough companies do appropriate product security reviews early in the development process. Why? Because it's expensive. And two, there aren't enough good, qualified product security reviewers and developers, partially because companies don't welcome them a great deal of the time. Right at this moment, the cybersecurity field is exploding with people who want to be in cybersecurity. And yet at the most senior levels, there is simply no doubt that there is a massive lack of diversity in this field. It is very difficult for women, people of color, and queer people to see themselves succeeding in this field when they don't see people like themselves at the top of companies, even though, I am here to tell you, the wages in cybersecurity are comparatively really good.

So please, all of my friends out there, get into a training class, start taking computers apart, because this is a great field to be in if you like puzzles and you want to get paid well. But there's a massive lack of diversity, and to open those doors fully to the number of people we need in this field, we have got to, got to, got to start thinking differently about what we think a cybersecurity expert looks like.

Cindy: There seems to be a lack of imagination about what a hacker looks like or should be like. Tarah, you don't look like that image. We really need to create better, and a wider range of, models of what somebody who cares about computer security looks like and sounds like.

Tarah: We do. What you're actually looking for is a sense of safety, a feeling of security in yourself that you hired a smart, educated person to tell you everything's going to be okay. That's this human frailty that gets introduced into cybersecurity again and again. It's hard to reassure somebody that you're an expert if you don't look like what they think an expert looks like, and that is the barrier for women and people of color in cybersecurity right now: they have to be trusted as experts. And it's just a problem to break through that barrier.

Cindy: This is a community project. To me it's a piece of recognizing that we are on other people's computers all day long, and sometimes other people are on our computers. When somebody comes along and says, "Hey, I think you've got a security problem here," the right thing to do is to thank them. Not to attack them, much less throw the criminal law at them.

Tarah: If you are a person inside a company, I want you to send an email to security@yourcompany.com, whatever that is, and I want you to find out what happens. Does it bounce back? Does it go to some guy who doesn't work for the company anymore? Does it, as I have previously discovered, go to the chief of staff of the CEO? So just take a look at where that goes, because that's how people are trying to talk to you and make your world better. And if nobody's checked that mailbox in a while, maybe open it up and, you know, stare at it with one eye closed behind one of those eclipse viewers, because it's going to explode.
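For the researcher side of that same door, the report itself can start as something as simple as a plain email to the standard alias. A hypothetical sketch of drafting (not sending) one with Python's standard library; the domain and wording here are made up:

```python
from email.message import EmailMessage

def draft_disclosure(company_domain: str, summary: str) -> EmailMessage:
    """Draft, but do not send, a vulnerability report addressed to the
    industry-standard security@ alias for the given domain."""
    msg = EmailMessage()
    msg["To"] = "security@" + company_domain
    msg["Subject"] = "Security vulnerability report: " + summary
    msg.set_content(
        "Hello,\n\n"
        "I believe I've found a security issue in your product and would\n"
        "like to report it responsibly. Please let me know how to share\n"
        "the details securely (for example, a PGP key or a private\n"
        "reporting channel).\n"
    )
    return msg

msg = draft_disclosure("example.com", "reflected XSS on search page")
print(msg["To"])  # security@example.com
```

Actually sending it would be a separate smtplib step, or simply pasting the draft into a mail client.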

Cindy: I could totally do this all day, Tarah. It's so fascinating to talk to you and to see your perspective, because you really have been in multiple different places in this conversation. And I think with people like you we could get to a place where we fixed it if we just had more people listening to Tarah. So thank you so much for coming and talking to us and giving us the kind of straight story about how this is playing out on the ground.

Tarah: It's incredibly kind of you to invite me. Cindy and Danny, I just want to hang out with you and, you know, drink inappropriate morning wine with you and yell about how everything's broken on the internet. I mean, it's a wonderful pastime, and at the same time it's a wonderful opportunity to make the world a little bit better, just recognizing that we are connected to each other, that fixing one thing in one place doesn't just impact that one thing, it impacts everybody. And it's wonderful to be with you and get a chance to make things a little better.

Cindy: Well, that was just terrific. Tarah's enthusiasm, and her love for computer security and security research, just spills out. It's infectious, and it really made me think that we can do a lot here to make things better. What really struck me is that I have been an enemy of the Computer Fraud and Abuse Act for a very long time, but she really grounded it in how it terrifies and chills security research and ultimately hurts our country and the world. And what she said was very specific: it's created a culture of fear among people who are trying to be ethical and law-abiding. That really ought to stop us cold. The good news is that we got a little bit of relief out of the Supreme Court case Van Buren that we talked about, but there's just so much more to go.

Danny: I think she really managed to convey the stakes here and the human impact of these sort of vulnerabilities. It's not just about your credit rating going down because personal data was leaked. It's about how a child in a hospital could die if people don't address security vulnerabilities. 

Cindy: The other thing I really liked was Tarah's focus on aligning financial incentives on both sides: the penalties for the companies that don't fix or talk about security vulnerabilities, and the compensation for the security researchers who are doing us all a favor by finding them. What I like about that is you talk a lot about the four levers of change that Larry Lessig first identified: law, code, norms, and markets. And this one is very focused on markets, and how we can align financial incentives to make things better.

Danny: Yeah. And I think people get very nihilistic about solving the computer security problem, so Tarah's citing an actual, real, pragmatic inspiration for how you might go about improving it was really positive. And that was the airline industry, where you have a community that comes together across businesses and across countries and works internationally, in this very transparent and methodical way, to defend against problems with a very similar model, right? The tiniest error can have huge consequences, and people's lives are on the line, so everybody has to work together. I liked the fact that there's something in the real world we can base our utopian vision on.

Cindy: The other thing I really appreciated is how Tarah makes it so clear that we're just in a networked world now.  We spend a lot of time connected with each other on other people's computers and the way that we fix it is recognizing that and aligning everything towards a networked world. Embracing the fact that we are all connected is the way forward.

Thank you to Tarah Wheeler for joining us and giving us so much insight into her world.

Danny: If you like what you hear, follow us on your favorite podcast player. We’ve got lots more episodes in store with smart people who will tell you how we can fix the internet.

Music for the show is by Nat Keefe and Reed Mathis of BeatMower. 

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. I’m Danny O’Brien.

Cindy: And I’m Cindy Cohn. Thank you so much for joining us today. Until next time.



rainey Reitman

Two US men who collected large music royalties from YouTube through false rights claims indicted for conspiracy, wire fraud, and more

1 day 22 hours ago
headless writes: The US Attorney's Office for the District of Arizona announced on the 1st the indictment of two men who had collected substantial royalties from YouTube by asserting false rights to musical works (news release: PDF, TorrentFreak article, The Register article, indictment: PDF). Through their company MediaMuv, the two claimed rights to more than 50,000 tracks via a YouTube partner company, collecting more than $20 million in royalties over four years. They never paid royalties to the artists, instead spending the money on cars, jewelry, real estate, and the like. In mid-November, a grand jury indicted the two on 30 counts in total, including conspiracy, wire fraud, and money laundering.


nagazou

Behind the "safety" PR after the Fukushima nuclear accident

1 day 23 hours ago
Documents disclosed through a freedom-of-information request by 野池元基; the compensation amounts were blacked out. (Photo by the author) Immediately after the TEPCO Fukushima Daiichi nuclear accident, Fukushima Prefecture paid compensation to three people, including 山下俊一, then professor at Nagasaki University, whom the prefecture had appointed as radiation health risk management advisers […]
admin

LINE Pay mistakenly uploads campaign participation data for 130,000 users in Japan and abroad to GitHub

1 day 23 hours ago
LINE Pay announced on the 6th that information related to participation in one of its campaigns had been left in a viewable state. The leak covered data for 133,484 LINE Pay user accounts, consisting of user identifiers, merchant management numbers, and campaign information (campaign codes, etc.). The campaign information may have included campaign names, payment amounts, and payment dates and times. Names, addresses, phone numbers, email addresses, credit card numbers, bank account numbers, and the like were reportedly not included. The company says the data was uploaded to GitHub by an employee of a group company working as a contractor. The information has already been deleted, and the affected users have been contacted individually (LINE Pay press release).


nagazou

Amazon to offer "SageMaker Studio Lab", a free machine learning environment

2 days 1 hour ago
Amazon Web Services (AWS) announced on the 1st "Amazon SageMaker Studio Lab", which lets users set up a development environment for machine learning study and experimentation free of charge. SageMaker Studio Lab is a new service based on the open-source JupyterLab IDE; it supports Python, R, and other languages, and includes a terminal and Git integration (Amazon release, Publickey). Per session, users get compute resources equivalent to 12 hours of CPU or 4 hours of GPU, along with 16 GB of memory and 15 GB of storage per project. No credit card registration or AWS account is required; registering an email address is enough to start using it.


nagazou

NHK considers replacing broadcasts in poor-reception areas with broadband

2 days 2 hours ago
An Anonymous Coward writes: At a December 6 meeting of a Ministry of Internal Affairs and Communications expert panel discussing the future of the broadcasting system, NHK asked that it be allowed to consider replacing terrestrial broadcasts in areas with poor over-the-air reception with broadband delivery (Asahi Shimbun). The submitter speculates that this would likely mean the discontinuation of competing services such as the optical-fiber TV options offered by the various FLET'S-type carriers; competition with commercial broadcasters and other content providers would be unavoidable, negotiations would be difficult, and the move could lead to NHK reception fees becoming mandatory.


nagazou

Junior high school in Nerima collects students' SNS passwords, sparking controversy; leaflet created by the ward board of education was the trigger

2 days 3 hours ago
It has come to light that a junior high school in Nerima Ward, Tokyo, tried to have students submit forms on which they had written their SNS passwords. There were several causes, but the biggest was that an SNS-awareness leaflet distributed by the Nerima Ward Board of Education included a field for writing in one's SNS password. The board originally intended to collect the passwords, but when the leaflet's contents circulated on Twitter, it drew criticism, including accusations of human rights violations, and the matter blew up. As a result, the board notified schools that the leaflet would remain as-is but should be submitted with the password field left blank (Nerima Ward release [PDF], Bengoshi.com, Mainichi Shimbun). However, one junior high school in Nerima failed to pass this notice on to its students and their families. Leaflets with the password field thus reached homes, leaving the school in a position where it could learn students' passwords. According to the ward's release, 276 students had submitted leaflets as of December 2. On the 3rd, the ward sent letters of apology to the affected households.


nagazou

[B] 12/13 [Online seminar] Myanmar ten months after the coup: why the money from Japan keeps flowing

2 days 10 hours ago
Japanese NGOs working on human rights and environmental issues will hold an online seminar on the 13th, "Myanmar ten months after the coup: why the money from Japan keeps flowing", arguing that even after the military coup in Myanmar on February 1 of this year, the Japanese government has continued economic assistance that may benefit the military. (藤ヶ谷魁)
日刊ベリタ