124 Matching Annotations
  1. Aug 2023
    1. We lived in a relatively unregulated digital world until now. It was great until the public realized that a few companies wield too much power today in our lives. We will see significant changes in areas like privacy, data protection, algorithm and architecture design guidelines, and platform accountability, etc. which should reduce the pervasiveness of misinformation, hate and visceral content over the internet.
      • for: quote, quote - Prateek Raj, quote - internet regulation, quote - reducing misinformation, fake news, indyweb - support
      • quote
        • We lived in a relatively unregulated digital world until now.
        • It was great until the public realized that a few companies wield too much power today in our lives.
        • We will see significant changes in areas like
          • privacy,
          • data protection,
          • algorithm and
          • architecture design guidelines, and
          • platform accountability, etc.
        • which should reduce the pervasiveness of
          • misinformation,
          • hate and visceral content
        • over the internet.
        • These steps will also reduce the power wielded by digital giants.
        • Beyond these immediate effects, it is difficult to say if these social innovations will create a more participative and healthy society.
        • These broader effects are driven by deeper underlying factors, like
          • history,
          • diversity,
          • cohesiveness and
          • social capital, and also
          • political climate and
          • institutions.
        • In other words,
          • just as digital world is shaping the physical world,
          • physical world shapes our digital world as well.
      • author: Prateek Raj
        • assistant professor in strategy, Indian Institute of Management, Bangalore
  2. Jul 2023
    1. Such efforts to protect data privacy go beyond the abilities of the technology involved to also encompass the design process. Some Indigenous communities have created codes of use that people must follow to get access to community data. And most tech platforms created by or with an Indigenous community follow that group’s specific data principles. Āhau, for example, adheres to the Te Mana Raraunga principles of Māori data sovereignty. These include giving Māori communities authority over their information and acknowledging the relationships they have with it; recognizing the obligations that come with managing data; ensuring information is used for the collective benefit of communities; practicing reciprocity in terms of respect and consent; and exercising guardianship when accessing and using data. Meanwhile Our Data Indigenous is committed to the First Nations principles of ownership, control, access and possession (OCAP). “First Nations communities are setting their own agenda in terms of what kinds of information they want to collect,” especially around health and well-being, economic development, and cultural and language revitalization, among others, Lorenz says. “Even when giving surveys, they’re practicing and honoring local protocols of community interaction.”

      Colonized groups such as these Indigenous peoples feel an urgency to avoid the colonization of their data as well, and they are doing something about it.

  3. Nov 2022
  4. Oct 2022
    1. In the event of non-compliance with the Law, the Commission d'accès à l'information may impose significant penalties of up to $25 million or 4% of worldwide revenue. The penalty will be proportionate to, among other things, the severity of the violation and the company's ability to pay.
  5. Aug 2022
    1. NETGEAR is committed to providing you with a great product and choices regarding our data processing practices. You can opt out of the use of the data described above by contacting us at analyticspolicy@netgear.com

      You may opt out of these data use situations by emailing analyticspolicy@netgear.com.

    2. Marketing. For example, information about your device type and usage data may allow us to understand other products or services that may be of interest to you.

      All of the information above that has been consented to can be used by NETGEAR to make money off consenting individuals and their families.

    3. USB device

      This gives Netgear permission to know what you plug into your computer, be it a FitBit, a printer, scanner, microphone, headphones, webcam — anything not built into your computer.

  6. Jun 2022
    1. Companies need to actually have an ethics panel, and discuss what the issues are and what the needs of the public really are. Any ethics board must include a diverse mix of people and experiences. Where possible, companies should look to publish the results of these ethics boards to help encourage public debate and to shape future policy on data use.

    1. The goal is to gain “digital sovereignty.”

      The age of borderless data is ending; what we're seeing is a move to digital sovereignty.

  7. May 2022
    1. Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us.

      We need to think more about the externalities of our data decisions.

  8. Apr 2022
  9. Sep 2021
  10. Jul 2021
    1. whereas now, they know that user@domain.com was subscribed to xyz.net at some point and is unsubscribing. Information is gold. Replace user@domain with abcd@senate and xyz.net with warezxxx.net and you've got tabloid gold.
  11. May 2021
    1. Draft notes, E-mail, plans, source code, to-do lists, what have you

      The personal nature of this information means that users need control of their information. Tim Berners-Lee's Solid (Social Linked Data) project looks like it could do some of this stuff.

  12. Apr 2021
  13. Mar 2021
    1. a data donation platform that allows users of browsers to donate data on their usage of specific services (eg Youtube, or Facebook) to a platform.

      This seems like a really promising pattern for many data-driven problems. Browsers can support opt-in donation to contribute their data to improve Web search, social media, recommendations, lots of services that implicitly require lots of operational data.

  14. Jan 2021
  15. Dec 2020
    1. I haven't met anyone who makes this argument who then says that a one stop convenient, reliable, private and secure online learning environment can’t be achieved using common every day online systems

      Reliable: As a simple example, I'd trust Google to maintain data reliability over my institutional IT support.

      And you'd also need to make the argument for why learning needs to be "private", etc.

    1. And then there was what Lanier calls “data dignity”; he once wrote a book about it, called Who Owns the Future? The idea is simple: What you create, or what you contribute to the digital ether, you own.

      See Tim Berners-Lee's Solid project.

  16. Nov 2020
  17. Oct 2020
    1. Legislation to stem the tide of Big Tech companies' abuses, and laws—such as a national consumer privacy bill, an interoperability bill, or a bill making firms liable for data-breaches—would go a long way toward improving the lives of the Internet users held hostage inside the companies' walled gardens. But far more important than fixing Big Tech is fixing the Internet: restoring the kind of dynamism that made tech firms responsive to their users for fear of losing them, restoring the dynamic that let tinkerers, co-ops, and nonprofits give every person the power of technological self-determination.
  18. Sep 2020
  19. Jul 2020
  20. Jun 2020
  21. May 2020
    1. Google encouraging site admins to put reCaptcha all over their sites, and then sharing the resulting risk scores with those admins is great for security, Perona thinks, because he says it “gives site owners more control and visibility over what’s going on” with potential scammer and bot attacks, and the system will give admins more accurate scores than if reCaptcha is only using data from a single webpage to analyze user behavior. But there’s the trade-off. “It makes sense and makes it more user-friendly, but it also gives Google more data,”
    2. For instance, Google’s reCaptcha cookie follows the same logic of the Facebook “like” button when it’s embedded in other websites—it gives that site some social media functionality, but it also lets Facebook know that you’re there.
    3. But this new, risk-score based system comes with a serious trade-off: users’ privacy.
    1. they sought to eliminate data controllers and processors acting without appropriate permission, leaving citizens with no control as their personal data was transferred to third parties and beyond
    1. “Until CR 1.0 there was no effective privacy standard or requirement for recording consent in a common format and providing people with a receipt they can reuse for data rights.  Individuals could not track their consents or monitor how their information was processed or know who to hold accountable in the event of a breach of their privacy,” said Colin Wallis, executive director, Kantara Initiative.  “CR 1.0 changes the game.  A consent receipt promises to put the power back into the hands of the individual and, together with its supporting API — the consent receipt generator — is an innovative mechanism for businesses to comply with upcoming GDPR requirements.  For the first time individuals and organizations will be able to maintain and manage permissions for personal data.”
    2. Its purpose is to decrease the reliance on privacy policies and enhance the ability for people to share and control personal information.
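
      To make the idea concrete: a consent receipt is just a small, machine-readable record handed back to the individual at the moment consent is given. Below is a minimal sketch in Python; the field names are loosely inspired by the spirit of CR 1.0 but are illustrative assumptions, not a verbatim copy of the Kantara specification.

      ```python
      # Hypothetical consent-receipt generator, for illustration only.
      import json, time, uuid

      def make_consent_receipt(principal_id, controller, purposes):
          """Return a minimal machine-readable record of what was consented to."""
          return {
              "receiptId": str(uuid.uuid4()),   # lets the individual track this consent
              "timestamp": int(time.time()),    # when consent was given
              "piiPrincipal": principal_id,     # the person the data is about
              "piiController": controller,      # who is accountable for the data
              "purposes": purposes,             # what the data may be used for
              "policyUrl": "https://example.com/privacy",  # placeholder URL
          }

      receipt = make_consent_receipt(
          "alice@example.com",
          {"name": "Example Corp", "contact": "privacy@example.com"},
          [{"purpose": "newsletter", "consented": True},
           {"purpose": "third-party marketing", "consented": False}],
      )
      print(json.dumps(receipt, indent=2))  # the receipt the individual keeps and can reuse
      ```

      The point of the design is auditability: because both parties hold the same receipt, the individual can later show what was (and was not) agreed to.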
  22. Apr 2020
    1. Finally, from a practical point of view, we suggest the adoption of "privacy label," food-like notices, that provide the required information in an easily understandable manner, making the privacy policies easier to read. Through standard symbols, colors and feedbacks — including yes/no statements, where applicable — critical and specific scenarios are identified. For example, whether or not the organization actually shares the information, under what specific circumstances this occurs, and whether individuals can oppose the share of their personal data. This would allow some kind of standardized information. Some of the key points could include the information collected and the purposes of its collection, such as marketing, international transfers or profiling, contact details of the data controller, and distinct differences between organizations’ privacy practices, and to identify privacy-invasive practices.
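
      A machine-readable version of such a label makes the proposal concrete. The sketch below is an assumption about what a standardized label could contain; the field names and categories are illustrative, not a published standard.

      ```python
      # Hypothetical "privacy label", in the food-label spirit described above.
      PRIVACY_LABEL = {
          "data_collected": ["email", "device type", "browsing history"],
          "purposes": ["marketing", "profiling", "international transfers"],
          "shared_with_third_parties": True,            # yes/no statement
          "sharing_circumstances": "advertising partners, with opt-out",
          "can_object_to_sharing": True,
          "data_controller_contact": "privacy@example.com",  # placeholder contact
      }

      def red_flags(label: dict) -> list[str]:
          """Surface privacy-invasive practices at a glance, like a warning color."""
          flags = []
          if label["shared_with_third_parties"] and not label["can_object_to_sharing"]:
              flags.append("data shared with no way to object")
          if "profiling" in label["purposes"]:
              flags.append("profiling")
          return flags

      print(red_flags(PRIVACY_LABEL))  # -> ['profiling']
      ```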
    1. Before we get to passwords, surely you already have in mind that Google knows everything about you. It knows what websites you’ve visited, it knows where you’ve been in the real world thanks to Android and Google Maps, it knows who your friends are thanks to Google Photos. All of that information is readily available if you log in to your Google account. You already have good reason to treat the password for your Google account as if it’s a state secret.
    1. Alas, you'll have to manually visit each site in turn and figure out how to actually delete your account. For help, turn to JustDelete.me, which provides direct links to the cancellation pages of hundreds of services.
    1. When you visit a website, you are allowing that site to access a lot of information about your computer's configuration. Combined, this information can create a kind of fingerprint — a signature that could be used to identify you and your computer. Some companies use this technology to try to identify individual computers.
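
      A toy sketch of the mechanism: each attribute below is common on its own, but the combination is close to unique, so hashing it yields a stable identifier with no cookie involved. The attribute values are made up; real trackers read them from the browser (user agent, screen size, fonts, canvas rendering, and so on).

      ```python
      # Toy browser-fingerprint computation, for illustration only.
      import hashlib

      attributes = {
          "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
          "screen": "1920x1080x24",
          "timezone": "America/New_York",
          "language": "en-US",
          "fonts": "Arial,Helvetica,DejaVu Sans",
      }

      # Concatenate the configuration and hash it into a compact signature.
      fingerprint = hashlib.sha256(
          "|".join(f"{k}={v}" for k, v in sorted(attributes.items())).encode()
      ).hexdigest()
      print(fingerprint[:16])  # stable across visits as long as the config is stable
      ```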
    1. Our approach strikes a balance between privacy, computation overhead, and network latency. While single-party private information retrieval (PIR) and 1-out-of-N oblivious transfer solve some of our requirements, the communication overhead involved for a database of over 4 billion records is presently intractable. Alternatively, k-party PIR and hardware enclaves present efficient alternatives, but they require user trust in schemes that are not widely deployed yet in practice. For k-party PIR, there is a risk of collusion; for enclaves, there is a risk of hardware vulnerabilities and side-channels.
    2. Privacy is at the heart of our design: Your usernames and passwords are incredibly sensitive. We designed Password Checkup with privacy-preserving technologies to never reveal this personal information to Google. We also designed Password Checkup to prevent an attacker from abusing Password Checkup to reveal unsafe usernames and passwords. Finally, all statistics reported by the extension are anonymous. These metrics include the number of lookups that surface an unsafe credential, whether an alert leads to a password change, and the web domain involved for improving site compatibility.
    1. Google says this technique, called "private set intersection," means you don't get to see Google's list of bad credentials, and Google doesn't get to learn your credentials, but the two can be compared for matches.
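
      Here is a toy version of the blinded-exchange idea behind private set intersection, assuming commutative blinding with secret exponents modulo a prime. This is purely a sketch of the principle; Google's actual protocol is a hardened elliptic-curve construction, not this code.

      ```python
      # Toy private set intersection via commutative blinding, illustration only.
      import hashlib

      P = 2**127 - 1  # a Mersenne prime; real protocols use carefully chosen groups

      def h(item: str) -> int:
          return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

      def blind(x: int, key: int) -> int:
          return pow(x, key, P)  # blinding commutes: (x^a)^b == (x^b)^a (mod P)

      client_key, server_key = 0xC0FFEE, 0xB00C  # would be random secrets in practice

      client_creds = {"alice:hunter2"}             # the user's credential
      breached = {"alice:hunter2", "bob:123456"}   # the server's breach list

      client_once = {blind(h(c), client_key) for c in client_creds}   # sent to server
      server_once = {blind(h(b), server_key) for b in breached}       # sent to client
      client_twice = {blind(v, server_key) for v in client_once}      # server blinds again
      server_twice = {blind(v, client_key) for v in server_once}      # client blinds again

      # Matches collide after double blinding, yet neither side saw the other's raw set.
      print("breached!" if client_twice & server_twice else "no match")
      ```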
    1. Someone, somewhere has screwed up to the extent that data got hacked and is now in the hands of people it was never intended to be. No way, no how does this give me license to then treat that data with any less respect than if it had remained securely stored and I reject outright any assertion to the contrary. That's a fundamental value I operate under
    1. Covid-19 is an emergency on such a huge scale that, if anonymity is managed appropriately, internet giants and social media platforms could play a responsible part in helping to build collective crowd intelligence for social good, rather than profit
    2. Google's move to release location data highlights concerns around privacy. According to Mark Skilton, director of the Artificial Intelligence Innovation Network at Warwick Business School in the UK, Google's decision to use public data "raises a key conflict between the need for mass surveillance to effectively combat the spread of coronavirus and the issues of confidentiality, privacy, and consent concerning any data obtained."
  23. Mar 2020
    1. To join the Privacy Shield Framework, a U.S.-based organization is required to self-certify to the Department of Commerce and publicly commit to comply with the Framework’s requirements. While joining the Privacy Shield is voluntary, the GDPR goes far beyond it.
    1. By choosing Matomo, you are joining an ever growing movement. You’re standing up for something that respects user-privacy, you’re fighting for a safer web and you believe your personal data should remain in your own hands, no one else’s.
    1. Data privacy now a global movement: We’re pleased to say we’re not the only ones who share this philosophy, web browsing companies like Brave have made it possible so you can browse the internet more privately; and you can use a search engine like DuckDuckGo to search with the freedom you deserve.
    2. our values remain the same – advocating for 100% data ownership, respecting user-privacy, being reliable and encouraging people to stay secure. Complete analytics, that’s 100% yours.
    3. the privacy of your users is respected
  24. www.graphitedocs.com
    1. Own Your Encryption Keys: You would never trust a company to keep a record of your password for use anytime they want. Why would you do that with your encryption keys? With Graphite, you don't have to. You own and manage your keys so only YOU can decrypt your content.
    1. When you think about data law and privacy legislation, cookies easily come to mind as they’re directly related to both. This often leads to the common misconception that the Cookie Law (ePrivacy directive) has been repealed by the General Data Protection Regulation (GDPR), which in fact, it has not. Instead, you can think of the ePrivacy Directive and GDPR as working together and complementing each other, where, in the case of cookies, the ePrivacy generally takes precedence.
    1. Legitimate Interest may be used for marketing purposes as long as it has a minimal impact on a data subject’s privacy and it is likely the data subject will not object to the processing or be surprised by it.
    1. Earlier this year it began asking Europeans for consent to processing their selfies for facial recognition purposes — a highly controversial technology that regulatory intervention in the region had previously blocked. Yet now, as a consequence of Facebook’s confidence in crafting manipulative consent flows, it’s essentially figured out a way to circumvent EU citizens’ fundamental rights — by socially engineering Europeans to override their own best interests.
    2. The deceitful obfuscation of commercial intention certainly runs all the way through the data brokering and ad tech industries that sit behind much of the ‘free’ consumer Internet. Here consumers have plainly been kept in the dark so they cannot see and object to how their personal information is being handed around, sliced and diced, and used to try to manipulate them.
    3. design choices are being selected to be intentionally deceptive. To nudge the user to give up more than they realize. Or to agree to things they probably wouldn’t if they genuinely understood the decisions they were being pushed to make.
    1. A 1% sample of AddThis Data (“Sample Dataset”) is retained for a maximum of 24 months for business continuity purposes.
    1. Right now, if you want to know what data Facebook has about you, you don’t have the right to ask them to give you all of the data they have on you, and the right to know what they’ve done with it. You should have that right. You should have the right to know and have access to your data.
    1. Good data privacy practices by companies are good for the world. We wake up every day excited to change the world and keep the internet that we all know and love safe and transparent.
  25. Feb 2020
    1. Research ethics concerns issues, such as privacy, anonymity, informed consent and the sensitivity of data. Given that social media is part of society’s tendency to liquefy and blur the boundaries between the private and the public, labour/leisure, production/consumption (Fuchs, 2015a: Chapter 8), research ethics in social media research is particularly complex.
  26. Jan 2020
  27. Nov 2019
    1. Google has confirmed that it partnered with health heavyweight Ascension, a Catholic health care system based in St. Louis that operates across 21 states and the District of Columbia.

      What happened to 'thou shalt not steal'?

    1. Speaking with MIT Technology Review, Rohit Prasad, Alexa’s head scientist, has now revealed further details about where Alexa is headed next. The crux of the plan is for the voice assistant to move from passive to proactive interactions. Rather than wait for and respond to requests, Alexa will anticipate what the user might want. The idea is to turn Alexa into an omnipresent companion that actively shapes and orchestrates your life. This will require Alexa to get to know you better than ever before.

      This is some next-level onslaught.

  28. Sep 2019
  29. Apr 2019
  30. Nov 2018
    1. Does the widespread and routine collection of student data in ever new and potentially more-invasive forms risk normalizing and numbing students to the potential privacy and security risks?

      What happens if we turn this around - given a widespread and routine data collection culture which normalizes and numbs students to risk as early as K-8, what are our responsibilities (and strategies) to educate around this culture? And how do our institutional practices relate to that educational mission?

  31. Aug 2018
    1. A file containing personal information of 14.8 million Texas residents was discovered on an unsecured server. It is not clear who owns the server, but the data was likely compiled by Data Trust, a firm created by the GOP.

    1. Google also says location records stored in My Activity are used to target ads. Ad buyers can target ads to specific locations — say, a mile radius around a particular landmark — and typically have to pay more to reach this narrower audience. While disabling “Web & App Activity” will stop Google from storing location markers, it also prevents Google from storing information generated by searches and other activity. That can limit the effectiveness of the Google Assistant, the company’s digital concierge. Sean O’Brien, a Yale Privacy Lab researcher with whom the AP shared its findings, said it is “disingenuous” for Google to continuously record these locations even when users disable Location History. “To me, it’s something people should know,” he said.
    2. Google says that will prevent the company from remembering where you’ve been. Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.” That isn’t true. Even with Location History paused, some Google apps automatically store time-stamped location data without asking. (It’s possible, although laborious, to delete it .)
  32. Jul 2018
    1. where applicable, any rating in the form of a data trust score that may be assignedto the data fiduciary under section 35;and

      A Data Trust score. Thankfully, it isn't mandatory to have a data trust score, which means that apps and services can exist without there being a trust score.

    2. the period for which the personal data will be retained in terms of section 10 or where such period is not known, the criteria for determining such period;

      This defines the terms for data retention. From a company perspective, they are likely to keep this as broad as possible.

    3. Upon receipt of notification, the Authority shall determine whether such breach should be reported by the data fiduciary to the data principal, taking into account the severity of the harm that may be caused to such data principal or whether some action is required on the part of the data principal to mitigate such harm.

      This means that users aren't always informed about a breach of their data: notifying them is the prerogative of the Data Protection Authority rather than mandatory, which is not in the user's interest.

    4. “Personal data breach”means any unauthorised or accidental disclosure, acquisition, sharing, use, alteration, destruction, loss of access to, of personal data that compromises the confidentiality, integrity or availability of personal data to a data principal;

      Personal data breach here includes "accidental disclosure" as well.

  33. Apr 2018
    1. The alternative, of a regulatory patchwork, would make it harder for the West to amass a shared stock of AI training data to rival China’s.

      Fascinating geopolitical suggestion here: Trans-Atlantic GDPR-like rules as the NATO of data privacy to effectively allow "the West" to compete against the People's Republic of China in the development of artificial intelligence.

    1. Data Re-Use. Contractor agrees that any and all Institutional Data exchanged shall be used expressly and solely for the purposes enumerated in the Agreement. UH Institutional Data shall not be distributed, repurposed or shared across other applications, environments, or business units of the Contractor. The Contractor further agrees that no Institutional Data of any kind shall be revealed, transmitted, exchanged or otherwise passed to other vendors or interested parties except on a case-by-case basis as specifically agreed to in writing by a University officer with designated data, security, or signature authority.

      Like this clause. Wonder if this is the exception or the rule in Uni procurement deals these days?

  34. Dec 2017
    1. Projects by IF is a limited company based in London, England. We run this website (projectsbyif.com) and its subdomains. We also use third party services to publish work, keep in touch with people and understand how we can do those things better. Many of those services collect some data about people who are interested in IF, come to our events or work with us. Here you can find out what those services are, how we use them and how we store the information they collect. If you’ve got any questions, or want to know more about data we might have collected about you, email hello@projectsbyif.com This page was published on 25 August 2017. You can see any revisions by visiting the repository on Github.

      As you'd expect, IF's privacy page is fantastic.

  35. Oct 2017
    1. The learning analytics and education data mining discussed in this handbook hold great promise. At the same time, they raise important concerns about security, privacy, and the broader consequences of big data-driven education. This chapter describes the regulatory framework governing student data, its neglect of learning analytics and educational data mining, and proactive approaches to privacy. It is less about conveying specific rules and more about relevant concerns and solutions. Traditional student privacy law focuses on ensuring that parents or schools approve disclosure of student information. They are designed, however, to apply to paper “education records,” not “student data.” As a result, they no longer provide meaningful oversight. The primary federal student privacy statute does not even impose direct consequences for noncompliance or cover “learner” data collected directly from students. Newer privacy protections are uncoordinated, often prohibiting specific practices to disastrous effect or trying to limit “commercial” use. These also neglect the nuanced ethical issues that exist even when big data serves educational purposes. I propose a proactive approach that goes beyond mere compliance and includes explicitly considering broader consequences and ethics, putting explicit review protocols in place, providing meaningful transparency, and ensuring algorithmic accountability.
  36. Sep 2017
    1. extremely cool, but...

      comparing with tahoe-lafs:

      clearly separates writecap from readcap, but... does it grok readcap as separate from idcap?

      client-side encryption?

      n-of-k erasure encoding?
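
      For context, a toy illustration of the k-of-n idea (here 2-of-3 via XOR parity, RAID-5 style): any two shares recover the data. tahoe-lafs itself uses zfec's Reed-Solomon coding, which generalizes to arbitrary k-of-n; this sketch only shows the principle.

      ```python
      # Toy 2-of-3 erasure code: lose any one share and still recover the data.
      def encode(data: bytes) -> tuple[bytes, bytes, bytes]:
          half = (len(data) + 1) // 2
          a, b = data[:half], data[half:].ljust(half, b"\0")  # pad to equal halves
          parity = bytes(x ^ y for x, y in zip(a, b))         # a XOR b
          return a, b, parity

      def decode(a=None, b=None, parity=None, length=0) -> bytes:
          if a is None:                                    # recover a from b and parity
              a = bytes(x ^ y for x, y in zip(b, parity))
          elif b is None:                                  # recover b from a and parity
              b = bytes(x ^ y for x, y in zip(a, parity))
          return (a + b)[:length]

      msg = b"hello tahoe"
      a, b, p = encode(msg)
      assert decode(a=a, parity=p, length=len(msg)) == msg  # survives losing share b
      assert decode(b=b, parity=p, length=len(msg)) == msg  # survives losing share a
      ```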

  37. Aug 2017
    1. Embracing a culture of sharing that breaks down silos while maintaining ethical and privacy standards will be paramount.

      This is gnarly stuff though and deserves its own deep dive/bullet point.

  38. Apr 2017
    1. The Echo Look suffers from two dovetailing issues: the overwhelming potential for invasive data collection, and Amazon’s lack of a clear policy on how it might prevent that.

      Important to remember. Amazon shares very little about what it collects and what it does with what it collects.

    1. By producing services that are free (or very accessible), high-performing, and of high added value in exchange for the data they generate, these companies capture a gigantic share of users' digital activity. They thereby become the principal service providers that governments must deal with if they want to enforce the law, particularly in the context of population surveillance and security operations.

      This is why the GAFAM are as powerful as (or even more powerful than) states.

  39. Mar 2017
    1. You can delete the data. You can limit its collection. You can restrict who sees it. You can inform students. You can encourage students to resist. Students have always resisted school surveillance.

      The first three of these can be tough for the individual faculty member to accomplish, but informing students and raising awareness around these issues can be done and is essential.

  40. Feb 2017
    1. All along the way, or perhaps somewhere along the way, we have confused surveillance for care. And that’s my takeaway for folks here today: when you work for a company or an institution that collects or trades data, you’re making it easy to surveil people and the stakes are high. They’re always high for the most vulnerable. By collecting so much data, you’re making it easy to discipline people. You’re making it easy to control people. You’re putting people at risk. You’re putting students at risk.
  41. Oct 2016
    1. Outside of the classroom, universities can use connected devices to monitor their students, staff, and resources and equipment at a reduced operating cost, which saves everyone money.
    1. For G Suite users in primary/secondary (K-12) schools, Google does not use any user personal information (or any information associated with a Google Account) to target ads.

      In other words, outside of K-12, Google does use everyone’s information (data as the new oil) and can use such things to target ads in Higher Education.

  42. Sep 2016
    1. all intellectual property rights, shall remain the exclusive property of the [School/District],

      This is definitely not the case. Even in private groups, would it ever make sense to say this?

    2. Access

      This really just extends the issue of "transfer" mentioned in 9.

    3. Data Transfer or Destruction

      This is the first line item for which I don't feel we have a proper contingency, or an exact understanding of how we would handle it.

      It seems important to address not just due to FERPA but to contracts/collaborations like that we have with eLife:

      What if eLife decides to drop h? Would we, could we delete all data/content related to their work with h? Even outside of contract termination, would we/could we transfer all their data back to them?

      The problem for our current relationship with schools is that we don't have institutional accounts whereby we might at least technically be able to collect all related data.

      Students could be signing up for h with personal email addresses.

      They could be using their h account outside of school so that their data isn't fully in the purview of the school.

      Question: if AISD starts using h on a big scale, 1) would we delete all AISD related data if they asked--say everything related to a certain email domain? 2) would we share all that data with them if they asked?

    4. Data cannot be shared with any additional parties without prior written consent of the User except as required by law.”

      Something like this should probably be added to our PP.

    5. Data Collection

      I'm really pleased with how hypothes.is addresses the issues on this page in our Privacy Policy.

    6. There is nothing wrong with a provider using de-identified data for other purposes; privacy statutes, after all, govern PII, not de-identified data.

      Key point.

    1. Responsible Use

      Again, this is probably a more felicitous wording than “privacy protection”. Sure, it takes as a given that some use of data is desirable. And the preceding section makes it sound like Learning Analytics advocates mostly need ammun… arguments to push their agenda. Still, the notion that we want to advocate for responsible use is more likely to find common ground than this notion that there’s a “data faucet” that should be switched on or off depending on certain stakeholders’ needs. After all, there exists a set of data use practices which are either uncontroversial or, at least, accepted as “par for the course” (no pun intended). For instance, we probably all assume that a registrar should receive the grade data needed to grant degrees and we understand that such data would come from other sources (say, a learning management system or a student information system).

    2. captures values such as transparency and student autonomy

      Indeed. “Privacy” makes it sound like a single factor, hiding the complexity of the matter and the importance of learners’ agency.

    1. “We need much more honesty, about what data is being collected and about the inferences that they’re going to make about people. We need to be able to ask the university ‘What do you think you know about me?’”
  43. Jul 2016
  44. Dec 2015
    1. A personal API builds on the domain concept—students store information on their site, whether it’s class assignments, financial aid information or personal blogs, and then decide how they want to share that data with other applications and services. The idea is to give students autonomy in how they develop and manage their digital identities at the university and well into their professional lives.
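
      A minimal sketch of how such a personal API could gate access by token scope follows; Flask, the route shape, and the scope names here are assumptions for illustration, not any particular university's implementation.

      ```python
      # Hypothetical personal API: the student's own server decides, per token,
      # which slices of their data other applications may read.
      from flask import Flask, abort, jsonify, request

      app = Flask(__name__)

      DATA = {
          "assignments": [{"course": "ENG101", "title": "Essay 1"}],
          "blog": [{"title": "My first post"}],
          "financial_aid": [{"award": "Example Grant"}],
      }
      # The student issues each app a token scoped to exactly what it may see.
      TOKENS = {"advisor-app-token": {"assignments"}, "portfolio-token": {"blog"}}

      @app.route("/api/<category>")
      def read_category(category):
          scopes = TOKENS.get(request.headers.get("Authorization", ""), set())
          if category not in scopes:
              abort(403)  # the student never granted this app that slice of data
          return jsonify(DATA[category])

      # e.g.: curl -H "Authorization: portfolio-token" http://localhost:5000/api/blog
      ```

      Revoking an app's access is then just deleting its token, which keeps the autonomy with the student.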