10 Matching Annotations
  1. May 2020
    1. make it as easy to withdraw consent as to give it. The latter gets particularly interesting when considering that in some contexts, consent may be obtained “through only one mouse-click, swipe or keystroke” and therefore “data subjects must, in practice, be able to withdraw that consent equally as easily” per the WP29.

      It seems, then, that one should be careful not to make it too easy to opt in to something unless one is prepared to accept the liability of making it just as easy to opt out (which may be technically challenging).
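
      To make that symmetry concrete, here is a minimal sketch of a consent store whose grant and withdraw operations are equally lightweight single calls. This is an illustration only; the function names and in-memory store are hypothetical, not drawn from any particular framework or from the regulation itself.

      ```python
      from datetime import datetime, timezone

      # Hypothetical in-memory store; a real system would persist records
      # and keep an audit trail of every grant and withdrawal event.
      consent_store: dict[tuple[str, str], dict] = {}

      def grant_consent(user_id: str, purpose: str) -> None:
          # Opting in is a single call...
          consent_store[(user_id, purpose)] = {
              "granted": True,
              "at": datetime.now(timezone.utc),
          }

      def withdraw_consent(user_id: str, purpose: str) -> None:
          # ...and opting out is an equally simple single call, with no
          # extra hoops, mirroring the WP29's "equally as easily" standard.
          consent_store[(user_id, purpose)] = {
              "granted": False,
              "at": datetime.now(timezone.utc),
          }

      def has_consent(user_id: str, purpose: str) -> bool:
          record = consent_store.get((user_id, purpose))
          return bool(record and record["granted"])
      ```

      The record-keeping is the easy part; the technical challenge alluded to above is usually propagating a withdrawal to every downstream system that already acted on the original consent.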

  2. Apr 2020
    1. The key change here is the removal of “intent to defraud” and its replacement with “willfully”; it will be illegal to share this information as long as you have any reason to know someone else might use it for unauthorized computer access. It is troublesome to consider the unintended consequences resulting from this small change.
    2. Again, it’s stupid that I have to do this, but
    3. Without passing any judgement on any third party developers, we have to advise people to never enter their 1Password Master Passwords into anything other than 1Password. I have no reason to doubt the integrity or competence of these third party developers, and RogueLazer’s project is even open-source. But it would be irresponsible for us to do anything other than advise you never to give your 1Password Master Password to anyone or any other application.
  3. Mar 2020
    1. Most companies are throwing cookie alerts at you because they figure it’s better to be safe than sorry. When the GDPR came into effect, companies all over the globe — not just in Europe — scrambled to comply and started to enact privacy changes for all of their users everywhere. That included the cookie pop-ups. “Everybody just decided to be better safe than sorry and throw up a banner — with everybody acknowledging it doesn’t accomplish a whole lot,” said Joseph Jerome, former policy counsel for the Privacy & Data Project at the Center for Democracy & Technology, a privacy-focused nonprofit.
  4. Jan 2020
    1. The Twenty-Six Words that Created the Internet is Jeff Kosseff’s definitive history and analysis of the current fight over Section 230, the fight over who will be held responsible to forbid speech. In it, Kosseff explains how debate over intermediary liability, as this issue is called, stretches back to a 1950s court fight, Smith v. California, about whether an L.A. bookseller should have been responsible for knowing the content of every volume on his shelves.

      For me, this is probably the key idea. Facebook doesn't need to be responsible for everything that its users post, but when it crosses the line into actively, algorithmically promoting and pushing that content into its users' feeds for active consumption, then it does have a responsibility for that content.

      By analogy, imagine the trusted local bookstore mentioned above. There are millions of books there, and customers who walk in are free to make their selections in some logical manner. But if the bookseller had the secret ability to consistently walk up to children and put porn into their hands, or to actively herd them into the adult section to force that exposure on them (and could do it without anyone else realizing it), then that would be the problem. Society at large would find this even more reprehensible if it realized that local governments or political parties could pay the bookseller to do it.

      In case the reader isn't following the analogy, this is exactly what some social platforms like Facebook are allowing our politicians to do. They're taking payment from politicians to actively lie, tell untruths, and create fear in a highly targeted manner, without the rest of society being able to see or hear those messages. Some of these messages are of the sort that, had they been picked up on an open microphone and broadcast outside the private group for which they were intended, would have been career-ending events.

      Without that responsibility, we're actively stifling conversation in the public sphere and actively empowering the fringes. This sort of targeted fringecasting prevents social cohesion, consensus, and compromise, and instead pulls us apart.

      Perhaps the answer for Facebook is to let it take the political ad money for these niche ads, but then require that they be broadcast to everyone on the platform rather than cast only to the small niche audience. Then we could all see who our politicians really are.

  5. Jul 2018
    1. So basically in an effort to stop 1,000 pieces of infringing content, you'd end up pulling down 50,000 pieces of legitimate content. And that's with an incredible (and unbelievable) 99.5% accuracy rate. Drop the accuracy rate to a still optimistic 90%, and the results are even more stark:
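
      To see where those figures come from, the underlying base-rate arithmetic can be reproduced directly. A sketch follows; the corpus size (~10 million items, 1,000 of them infringing) is an assumption, chosen because it yields the quoted numbers.

      ```python
      # Assumed corpus: ~10 million items, 1,000 of them infringing.
      # These inputs are assumptions chosen to reproduce the figures above.
      total_items = 10_000_000
      infringing = 1_000
      legitimate = total_items - infringing

      for accuracy in (0.995, 0.90):
          false_positive_rate = 1 - accuracy
          caught = infringing * accuracy                      # infringing items correctly pulled
          wrongly_pulled = legitimate * false_positive_rate   # legitimate items wrongly pulled
          print(f"{accuracy:.1%} accuracy: ~{caught:,.0f} infringing blocked, "
                f"~{wrongly_pulled:,.0f} legitimate items taken down")
      ```

      At 99.5% accuracy that works out to roughly 50,000 legitimate takedowns to catch about 1,000 infringing items; drop to 90% and the collateral damage balloons to nearly a million.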
  6. Apr 2016