21 Matching Annotations
  1. Jul 2021
    1. consumer friendly

      Including the "consumer" here is a red herring. We're meant to identify as the consumer and so take from this statement that our rights and best interests have been written into these BigTech-crafted laws.

      But a "consumer" is different from a "citizen," a "person," we the people.

    2. passage in March of a consumer data privacy law in Virginia, which Protocol reported was originally authored by Amazon

      From the article:

      Marsden and Virginia delegate Cliff Hayes held meetings with other large tech companies, including Microsoft; financial institutions, including Capital One; and non-profit groups and small businesses...

      Which all have something at stake here: the ability to monitor people and mine their data in order to sell it.

      Weak privacy laws give the illusion of privacy while maintaining the corporate panopticon.

    3. consumers would have to opt out of rather than into tracking

      Example of a dark pattern.

  2. Nov 2020
    1. The first is that the presence of surveillance means society cannot experiment with new things without fear of reprisal, and that means those experiments—if found to be inoffensive or even essential to society—cannot slowly become commonplace, moral, and then legal. If surveillance nips that process in the bud, change never happens. All social progress—from ending slavery to fighting for women’s rights—began as ideas that were, quite literally, dangerous to assert. Yet without the ability to safely develop, discuss, and eventually act on those assertions, our society would not have been able to further its democratic values in the way that it has. Consider the decades-long fight for gay rights around the world. Within our lifetimes we have made enormous strides to combat homophobia and increase acceptance of queer folks’ right to marry. Queer relationships slowly progressed from being viewed as immoral and illegal, to being viewed as somewhat moral and tolerated, to finally being accepted as moral and legal. In the end it was the public nature of those activities that eventually slayed the bigoted beast, but the ability to act in private was essential in the beginning for the early experimentation, community building, and organizing. Marijuana legalization is going through the same process: it’s currently sitting between somewhat moral, and—depending on the state or country in question—tolerated and legal. But, again, for this to have happened, someone decades ago had to try pot and realize that it wasn’t really harmful, either to themselves or to those around them. Then it had to become a counterculture, and finally a social and political movement. If pervasive surveillance meant that those early pot smokers would have been arrested for doing something illegal, the movement would have been squashed before inception. Of course the story is more complicated than that, but the ability for members of society to privately smoke weed was essential for putting it on the path to legalization. We don’t yet know which subversive ideas and illegal acts of today will become political causes and positive social change tomorrow, but they’re around. And they require privacy to germinate. Take away that privacy, and we’ll have a much harder time breaking down our inherited moral assumptions.

      One reason privacy is important is that society makes moral progress by experimenting with things on the fringe of what is legal.

      This is reminiscent of Signal founder Moxie Marlinspike's argument that we should not want law enforcement to be 100% effective, because how else would we discover that things like gay sex and marijuana use don't actually harm anyone?

  3. Jun 2020
  4. May 2020
  5. Apr 2020
  6. Mar 2020
    1. If a website/app collects personal data, the Data Owner must inform users of this fact by way of a privacy policy. All that is required to trigger this obligation is the presence of a simple contact form, Google Analytics, a cookie or even a social widget; if you’re processing any kind of personal data, you definitely need one.
    1. When you think about data law and privacy legislation, cookies easily come to mind as they’re directly related to both. This often leads to the common misconception that the Cookie Law (ePrivacy Directive) has been repealed by the General Data Protection Regulation (GDPR), which in fact it has not. Instead, you can think of the ePrivacy Directive and the GDPR as working together and complementing each other, where, in the case of cookies, the ePrivacy Directive generally takes precedence.
    1. In accordance with the general principles of privacy law, which do not permit the processing of data prior to consent, the cookie law does not allow the installation of cookies before obtaining the user’s consent, except for exempt categories.
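
      In practice this means a site must block non-exempt cookies (analytics, marketing, etc.) until the visitor opts in, while strictly necessary cookies may be set immediately. A minimal sketch of such a consent gate, assuming hypothetical helper names and cookie categories:

      ```typescript
      // Hypothetical consent gate: non-exempt cookies wait for opt-in.
      type ConsentCategory = "necessary" | "analytics" | "marketing";

      // "Necessary" cookies are exempt and allowed from the start.
      const consented = new Set<ConsentCategory>(["necessary"]);

      function setCookie(name: string, value: string, category: ConsentCategory): void {
        if (!consented.has(category)) {
          return; // no consent yet: do not install the cookie
        }
        document.cookie = `${name}=${encodeURIComponent(value)}; path=/; SameSite=Lax`;
      }

      // Called from the consent banner once the user opts in to a category.
      function grantConsent(categories: ConsentCategory[]): void {
        categories.forEach((c) => consented.add(c));
      }

      // Example: this call is silently dropped until grantConsent(["analytics"]) runs.
      setCookie("visitor_id", crypto.randomUUID(), "analytics");
      ```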
    1. A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing
    2. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing
    3. Where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data.
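
      One reading of "remote access to a secure system" is a self-service endpoint that returns the subject's own data after authentication. A rough sketch of what such a response could contain, with hypothetical field names mirroring the items the recital lists (purposes, retention period, recipients, automated-processing logic):

      ```typescript
      // Hypothetical shape of a data subject access response.
      interface SubjectAccessResponse {
        personalData: Record<string, unknown>; // the data held about the subject
        purposes: string[];                    // purposes of the processing
        retentionPeriod?: string;              // "where possible", e.g. "24 months"
        recipients: string[];                  // recipients of the personal data
        automatedDecisionLogic?: string;       // logic involved in automated processing
      }

      // Illustrative fetch against an assumed /me/data-access endpoint.
      async function fetchMyData(apiBase: string, token: string): Promise<SubjectAccessResponse> {
        const res = await fetch(`${apiBase}/me/data-access`, {
          headers: { Authorization: `Bearer ${token}` },
        });
        if (!res.ok) throw new Error(`Access request failed: ${res.status}`);
        return (await res.json()) as SubjectAccessResponse;
      }
      ```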
    1. Google Analytics created an option to remove the last octet (the last group of digits) from your visitor’s IP address. This is called ‘IP Anonymization’. Although this isn’t complete anonymization, the GDPR demands you use this option if you want to use Analytics without prior consent from your visitors. Some countries (e.g. Germany) demand that this setting be enabled at all times.
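
      Concretely, removing the last octet means 203.0.113.42 is stored as 203.0.113.0, which keeps rough geolocation but drops the most identifying part of the address. A small sketch of that transformation (illustrative only; Google Analytics applies the equivalent masking on its own servers when the setting is enabled):

      ```typescript
      // Zero the last octet of an IPv4 address, as GA-style "IP anonymization" does.
      function anonymizeIpv4(ip: string): string {
        const octets = ip.split(".");
        if (octets.length !== 4) throw new Error(`Not an IPv4 address: ${ip}`);
        octets[3] = "0"; // drop the host-identifying octet
        return octets.join(".");
      }

      console.log(anonymizeIpv4("203.0.113.42")); // "203.0.113.0"
      ```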
    1. The system has been criticised due to its method of scraping the internet to gather images and storing them in a database. Privacy activists say the people in those images never gave consent. “Common law has never recognised a right to privacy for your face,” Clearview AI lawyer Tor Ekeland said in a recent interview with CoinDesk. “It’s kind of a bizarre argument to make because [your face is the] most public thing out there.”
  7. Feb 2017
  8. Dec 2015