738 Matching Annotations
  1. Sep 2024
  2. Jul 2024
    1. First, the complexity of modern federal criminal law, codified in several thousand sections of the United States Code and the virtually infinite variety of factual circumstances that might trigger an investigation into a possible violation of the law, make it difficult for anyone to know, in advance, just when a particular set of statements might later appear (to a prosecutor) to be relevant to some such investigation.

      If the federal government had access to every email you’ve ever written and every phone call you’ve ever made, it’s almost certain that they could find something you’ve done which violates a provision in the 27,000 pages of federal statutes or 10,000 administrative regulations. You probably do have something to hide; you just don’t know it yet.

  3. Jun 2024
    1. People haven’t really come to grips with the fact that it’s not just personal privacy that matters, it’s also institutional privacy
  4. May 2024
    1. So I'm not surprised. And perhaps one last question, about user privacy.

      Question about the privacy of the user interacting with the RAG

      It doesn't seem like JSTOR has thought about this thoroughly. In the beta there is an expectation that the service is being validated, so what users are doing is being watched more closely.

  5. Apr 2024
    1. Privacy is realized by the group founder creating a special keypair for the group and secretly sharing the private group key with every new group member. When a member is removed from a group or leaves it, the founder has to renew the group’s keypair

      Having locks on data is a nice idea.

      But given that dissemination/disclosure happens among friends only, do we need to encrypt the blocks, given the communication channels are already encrypted?
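
      The scheme described in the excerpt (a founder-held group keypair, re-issued whenever membership changes) can be sketched as below. This is an illustration only: the "keys" are random placeholder bytes, not real asymmetric key material, and all names are made up.

      ```js
      const crypto = require('crypto');

      // Placeholder "keypair": a real system would use an asymmetric keypair.
      function newGroupKeypair() {
        return {
          publicKey: crypto.randomBytes(16).toString('hex'),
          privateKey: crypto.randomBytes(16).toString('hex'),
        };
      }

      class Group {
        constructor(founder) {
          this.founder = founder;
          this.members = new Set([founder]);
          this.keypair = newGroupKeypair();
        }

        // A new member secretly receives the current private group key
        // (out of band, over an already-encrypted channel).
        add(member) {
          this.members.add(member);
          return this.keypair.privateKey;
        }

        // Removing a member forces the founder to renew the keypair, so the
        // departed member's copy of the old key no longer unlocks new content.
        remove(member) {
          this.members.delete(member);
          this.keypair = newGroupKeypair();
        }
      }
      ```

      The cost of the design is visible here: every removal re-keys the whole group, and the founder is a single point of trust.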

    1. What are some top UX design principles when designing for kids? Some important UX design principles when designing for kids are as follows: simplicity and clarity; interactive and engaging elements; age-appropriate content; safety and privacy; consistent feedback and rewards.

      There are 5 in this list and there were 4 in the other; I think Safety and Privacy is the additional one, but it's also in my proposal because I am concerned about it too.

  6. Mar 2024
    1. Millions of Patient Records at Risk: The Perils of Legacy Protocols

      Sina Yazdanmehr | Senior IT Security Consultant, Aplite GmbH Ibrahim Akkulak | Senior IT Security Consultant, Aplite GmbH Date: Wednesday, December 6, 2023

      Abstract

      Currently, a concerning situation is unfolding online: a large amount of personal information and medical records belonging to patients is scattered across the internet. Our internet-wide research on DICOM, the decade-old standard protocol for medical imaging, has revealed a distressing fact – Many medical institutions have unintentionally made the private data and medical histories of millions of patients accessible to the vast realm of the internet.

      Medical imaging encompasses a range of techniques such as X-Rays, CT scans, and MRIs, used to visualize internal body structures, with DICOM serving as the standard protocol for storing and transmitting these images. The security problems with DICOM are connected to using legacy protocols on the internet as industries strive to align with the transition towards Cloud-based solutions.

      This talk will explain the security shortcomings of DICOM when it is exposed online and provide insights from our internet-wide research. We'll show how hackers can easily find, access, and exploit the exposed DICOM endpoints, extract all patients' data, and even alter medical records. Additionally, we'll explain how we were able to bypass DICOM security controls by gathering information from the statements provided by vendors and service providers regarding their adherence to DICOM standards.

      We'll conclude by providing practical recommendations for medical institutions, healthcare providers, and medical engineers to mitigate these security issues and safeguard patients' data.

    1. Critics say this amounts to users having to pay for their privacy.

      "If you aren't paying for it, you're the product."

      "If you pay for it, you're paying for fundamental rights."

  7. Feb 2024
    1. Harold Abelson, Ross Anderson, Steven M Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G Neumann, Ronald L Rivest, Jeffrey I Schiller, Bruce Schneier, Vanessa Teague, Carmela Troncoso, Bugs in our pockets: the risks of client-side scanning, Journal of Cybersecurity, Volume 10, Issue 1, 2024, tyad020, https://doi.org/10.1093/cybsec/tyad020

      Abstract

      Our increasing reliance on digital technology for personal, economic, and government affairs has made it essential to secure the communications and devices of private citizens, businesses, and governments. This has led to pervasive use of cryptography across society. Despite its evident advantages, law enforcement and national security agencies have argued that the spread of cryptography has hindered access to evidence and intelligence. Some in industry and government now advocate a new technology to access targeted data: client-side scanning (CSS). Instead of weakening encryption or providing law enforcement with backdoor keys to decrypt communications, CSS would enable on-device analysis of data in the clear. If targeted information were detected, its existence and, potentially, its source would be revealed to the agencies; otherwise, little or no information would leave the client device. Its proponents claim that CSS is a solution to the encryption versus public safety debate: it offers privacy—in the sense of unimpeded end-to-end encryption—and the ability to successfully investigate serious crime. In this paper, we argue that CSS neither guarantees efficacious crime prevention nor prevents surveillance. Indeed, the effect is the opposite. CSS by its nature creates serious security and privacy risks for all society, while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which CSS can fail, can be evaded, and can be abused.

      Right off the bat, these authors are highly experienced and plugged into what is happening with technology.

  8. Jan 2024
    1. Also just by observing what they’re doing it becomes pretty clear. For example: Facebook recently purchased full-page ads on major newspapers entirely dedicated to “denounce” Apple. Why? Because Apple has built a system-level feature on iPhones that allows users to very easily disable every kind of advertising tracking and profiling. Facebook absolutely relies on being able to track you and profile your interests, so they immediately cooked up some cynical reasons why Apple shouldn’t be allowed to do this.But the truth is: if Facebook is fighting against someone on privacy matters, that someone is probably doing the right thing.
  9. Nov 2023
    1. To improve user privacy, display moment notifications are intentionally delayed a random amount of time when FedCM is enabled.

      How does that improve privacy?
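
      One plausible answer, assuming the concern is timing correlation: if the notification fired at a fixed offset from the triggering event, an identity provider and a relying party could line up events across contexts by timestamp, and a random delay breaks that timing side channel. A minimal sketch (the delay bounds are invented, not taken from the FedCM spec):

      ```js
      // Returns a delay between minMs and maxMs; rand is injectable for testing.
      function jitteredDelay(minMs, maxMs, rand = Math.random) {
        return minMs + rand() * (maxMs - minMs);
      }

      // Fire the display-moment callback after a random delay, so observers
      // cannot correlate the notification time with the triggering event.
      function notifyWithJitter(callback, minMs = 100, maxMs = 1000) {
        return setTimeout(callback, jitteredDelay(minMs, maxMs));
      }
      ```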

  10. Oct 2023
    1. "Without the right to tinker and explore, we risk becoming enslaved by technology; and the more we exercise the right to hack, the harder it will be to take that right away" - Andre "Bunnie" Huang

      hah, we are already "enslaved by technology". ask Ted Kaczynski

      our enemies already have hardware backdoors, compromising emissions (tempest), closed-source firmware/drivers/hardware, ... but sure, "feel free"

    1. ```
      {
        version: 4,

        info: {
          reason: <string>, // what triggered this ping: "saved-session", "environment-change", "shutdown", ...
          revision: <string>, // the Histograms.json revision
          timezoneOffset: <integer>, // time-zone offset from UTC, in minutes, for the current locale
          previousBuildId: <string>, // null if this is the first run, or the previous build ID is unknown

          sessionId: <uuid>,  // random session id, shared by subsessions
          subsessionId: <uuid>,  // random subsession id
          previousSessionId: <uuid>, // session id of the previous session, null on first run.
          previousSubsessionId: <uuid>, // subsession id of the previous subsession (even if it was in a different session),
                                        // null on first run.

          subsessionCounter: <unsigned integer>, // the running no. of this subsession since the start of the browser session
          profileSubsessionCounter: <unsigned integer>, // the running no. of all subsessions for the whole profile life time

          sessionStartDate: <ISO date>, // hourly precision, ISO date in local time
          subsessionStartDate: <ISO date>, // hourly precision, ISO date in local time
          sessionLength: <integer>, // the session length until now in seconds, monotonic
          subsessionLength: <integer>, // the subsession length in seconds, monotonic

          addons: <string>, // obsolete, use ``environment.addons``
        },

        processes: {...},
        simpleMeasurements: {...},

        // The following properties may all be null if we fail to collect them.
        histograms: {...},
        keyedHistograms: {...},
        chromeHangs: {...}, // removed in firefox 62
        threadHangStats: [...], // obsolete in firefox 57, use the 'bhr' ping
        log: [...], // obsolete in firefox 61, use Event Telemetry or Scalars
        gc: {...},
        fileIOReports: {...},
        lateWrites: {...},
        addonDetails: {...},
        UIMeasurements: [...], // Android only
        slowSQL: {...},
        slowSQLstartup: {...},
      }
      ```

    1. ```
      {
        type: <string>, // "main", "activation", "optout", "saved-session", ...
        id: <UUID>, // a UUID that identifies this ping
        creationDate: <ISO date>, // the date the ping was generated
        version: <number>, // the version of the ping format, currently 4

        application: {
          architecture: <string>, // build architecture, e.g. x86
          buildId: <string>, // "20141126041045"
          name: <string>, // "Firefox"
          version: <string>, // "35.0"
          displayVersion: <string>, // "35.0b3"
          vendor: <string>, // "Mozilla"
          platformVersion: <string>, // "35.0"
          xpcomAbi: <string>, // e.g. "x86-msvc"
          channel: <string>, // "beta"
        },

        clientId: <UUID>, // optional
        environment: { ... }, // optional, not all pings contain the environment
        payload: { ... }, // the actual payload data for this ping type
      }
      ```

  11. Sep 2023
    1. Google claims this new API addresses FLoC’s serious privacy issues. Unfortunately, it does anything but. The Topics API only touches the smallest, most minor privacy issues in FLoC, while leaving its core intact. At issue is Google’s insistence on sharing information about people’s interests and behaviors with advertisers, trackers, and others on the Web that are hostile to privacy.
    1. If a site you visit queries the Topics API, it may learn of this interest from Chrome and decide to serve you an advert about bonds or retirement funds. It also means websites can fetch your online interests straight from your browser.

      The Topics API is worse than third-party cookies; anyone can query a user's ad profile:

      ```js
      // document.browsingTopics() returns an array of BrowsingTopic objects.
      const topics = await document.browsingTopics();

      // Get data for an ad creative.
      const response = await fetch('https://ads.example/get-creative', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(topics),
      });

      // Get the JSON from the response.
      const creative = await response.json();

      // Display the ad. (or not)
      ```

  12. Aug 2023
    1. We lived in a relatively unregulated digital world until now. It was great until the public realized that a few companies wield too much power today in our lives. We will see significant changes in areas like privacy, data protection, algorithm and architecture design guidelines, and platform accountability, etc. which should reduce the pervasiveness of misinformation, hate and visceral content over the internet.
      • for: quote, quote - Prateek Raj, quote - internet regulation, quote - reducing misinformation, fake news, indyweb - support
      • quote
        • We lived in a relatively unregulated digital world until now.
        • It was great until the public realized that a few companies wield too much power today in our lives.
        • We will see significant changes in areas like
          • privacy,
          • data protection,
          • algorithm and
          • architecture design guidelines, and
          • platform accountability, etc.
        • which should reduce the pervasiveness of
          • misinformation,
          • hate and visceral content
        • over the internet.
        • These steps will also reduce the power wielded by digital giants.
        • Beyond these immediate effects, it is difficult to say if these social innovations will create a more participative and healthy society.
        • These broader effects are driven by deeper underlying factors, like
          • history,
          • diversity,
          • cohesiveness and
          • social capital, and also
          • political climate and
          • institutions.
        • In other words,
          • just as digital world is shaping the physical world,
          • physical world shapes our digital world as well.
      • author: Prateek Raj
        • assistant professor in strategy, Indian Institute of Management, Bangalore
    1. A spec to optimize ad targeting (respectful of privacy, they say... 😂🤣).

      Fuck you Google with your dystopian API:

      ```js
      // document.browsingTopics() returns an array of BrowsingTopic objects.
      const topics = await document.browsingTopics();

      // Get data for an ad creative.
      const response = await fetch('https://ads.example/get-creative', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(topics),
      });

      // Get the JSON from the response.
      const creative = await response.json();

      // Display the ad. (or not)
      ```

    1. On-device ad auctions to serve remarketing and custom audiences, without cross-site third-party tracking.

      Naming a thing with a meaning opposite to what the thing actually is...

      Google is insatiable when it comes to accessing users' private data. Let's block that bullshit.

  13. Jul 2023
    1. Such efforts to protect data privacy go beyond the abilities of the technology involved to also encompass the design process. Some Indigenous communities have created codes of use that people must follow to get access to community data. And most tech platforms created by or with an Indigenous community follow that group’s specific data principles. Āhau, for example, adheres to the Te Mana Raraunga principles of Māori data sovereignty. These include giving Māori communities authority over their information and acknowledging the relationships they have with it; recognizing the obligations that come with managing data; ensuring information is used for the collective benefit of communities; practicing reciprocity in terms of respect and consent; and exercising guardianship when accessing and using data. Meanwhile Our Data Indigenous is committed to the First Nations principles of ownership, control, access and possession (OCAP). “First Nations communities are setting their own agenda in terms of what kinds of information they want to collect,” especially around health and well-being, economic development, and cultural and language revitalization, among others, Lorenz says. “Even when giving surveys, they’re practicing and honoring local protocols of community interaction.”

      Colonized groups such as these Indigenous peoples feel an urgency to avoid colonization of their data, and are doing something about it.

  14. Apr 2023
    1. Seeing how powerful AI can be for cracking passwords is a good reminder to not only make sure you're using strong passwords but also check:

      • You're using 2FA/MFA (non-SMS-based whenever possible)

      • You're not re-using passwords across accounts

      • Use auto-generated passwords when possible

      • Update passwords regularly, especially for sensitive accounts

      • Refrain from using public WiFi, especially for banking and similar accounts

      Seeing how powerful AI is at cracking passwords is a good reminder that not only should you make sure you're using strong passwords, you should also check:

      • You're using 2FA/MFA (non-SMS-based whenever possible)

      • You're not re-using passwords across accounts

      • Use auto-generated passwords when possible

      • Update passwords regularly, especially for sensitive accounts

      • Avoid public WiFi, especially for banking and similar accounts
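
      The "use auto-generated passwords" advice needs nothing more than a CSPRNG. A minimal sketch using Node's crypto module (the alphabet and length here are arbitrary choices, not a recommendation):

      ```js
      const crypto = require('crypto');

      const ALPHABET =
        'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*';

      // Sample the alphabet with crypto.randomInt, which avoids modulo bias.
      function generatePassword(length = 20) {
        let out = '';
        for (let i = 0; i < length; i++) {
          out += ALPHABET[crypto.randomInt(ALPHABET.length)];
        }
        return out;
      }
      ```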

    2. Now Home Security Heroes has published a study showing how scarily powerful the latest generative AI is at cracking passwords. The company used the new password cracker PassGAN (password generative adversarial network) to process a list of over 15,000,000 credentials from the RockYou dataset, and the results were wild: 51% of all common passwords were cracked in less than one minute, 65% in less than an hour, 71% in less than a day, and 81% in less than a month.
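
      Reading the quoted figures as cumulative, the marginal share cracked in each additional time window falls out directly (the numbers are the ones quoted above):

      ```js
      // Cumulative crack rates per time threshold, as quoted from the study.
      const cumulative = [
        ['under a minute', 51],
        ['under an hour', 65],
        ['under a day', 71],
        ['under a month', 81],
      ];

      // Convert cumulative percentages into per-window marginal gains.
      function marginals(cum) {
        return cum.map(([label, pct], i) =>
          [label, i === 0 ? pct : pct - cum[i - 1][1]]);
      }
      ```

      Most of the damage is done in the first minute (51 points); each further window adds comparatively little (14, 6 and 10 points).
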
  15. Mar 2023
    1. Time to dive a little deeper to see what information the barcodes actually contain. For this I will break down the previously extracted information into smaller pieces.

      Information contained within boarding pass barcodes
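
      For context, the mandatory items of an IATA BCBP ("M"-format) barcode are fixed-width fields and can be sliced apart directly. The widths below follow the commonly documented layout but are an approximation to be checked against the actual spec; the sample data in the usage note is invented:

      ```js
      // Rough fixed-width slicing of the mandatory BCBP items.
      function parseBcbp(data) {
        return {
          formatCode: data.slice(0, 1),              // 'M'
          legs: Number(data.slice(1, 2)),            // number of flight legs
          passengerName: data.slice(2, 22).trim(),   // SURNAME/GIVENNAME
          eTicketIndicator: data.slice(22, 23),
          pnr: data.slice(23, 30).trim(),            // booking reference
          from: data.slice(30, 33),                  // IATA airport codes
          to: data.slice(33, 36),
          carrier: data.slice(36, 39).trim(),
          flightNumber: data.slice(39, 44).trim(),
          julianDate: Number(data.slice(44, 47)),    // day of year of the flight
          compartment: data.slice(47, 48),
          seat: data.slice(48, 52).trim(),
          sequence: data.slice(52, 57).trim(),       // check-in sequence number
        };
      }
      ```

      Note how much is readable in the clear: name, booking reference, route and seat, which is exactly why boarding pass photos are a privacy risk.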

    1. Companies that perform surveillance are attempting the same mental trick. They assert that we freely share our data in return for valuable services. But opting out of surveillance capitalism is like opting out of electricity, or cooked foods—you are free to do it in theory. In practice, it will upend your life.

      Opting-out of surveillance capitalism?

    1. Does the EDL/EID card transmit my personal information? No. The RFID tag embedded in your card doesn't contain any personal identifying information, just a unique reference number.

      Can this unique reference number be used to identify me (assuming they've already identified me another way and associated this number with me)? Yes!!

      So this answer is a bit incomplete/misleading...

  16. Feb 2023
    1. It means that everything AI makes would immediately enter the public domain and be available to every other creator to use, as they wish, in perpetuity and without permission.

      One issue with blanket, automatic entry of AI-generated works to the public domain is privacy: A human using AI could have good reasons not to have the outputs of their use made public.

    1. Exercising Your Rights: California residents can exercise the above privacy rights by emailing us at: support@openai.com.

      Does that mean that any California resident can email to request a record of all the information OpenAI has collected about them?

    2. Affiliates: We may share Personal Information with our affiliates, meaning an entity that controls, is controlled by, or is under common control with OpenAI. Our affiliates may use the Personal Information we share in a manner consistent with this Privacy Policy.

      This would include Microsoft.

    3. improve and/or analyze the Services

      Does that mean that we are agreeing for them to use personal information in any way they choose if they deem it to help them improve their software?

    1. Your access of the website and/or use of our services, after modification, addition or deletion of the privacy policy shall be deemed to constitute acceptance by you of the modification, addition or deletion.

      This sounds bad. Users can't be held to have agreed to arbitrary changes to a privacy policy, if we are not even notified about the changes.

      If you make significant changes to the privacy policy you should give users 30 days' notice, and preferably get their consent again.

      Here's an article about it: https://www.privacypolicies.com/blog/privacy-policy-update-notices/

    1. You may opt-out of the telemetry by setting Storybook's configuration element disableTelemetry to true, using the --disable-telemetry flag, or setting the environment variable STORYBOOK_DISABLE_TELEMETRY to 1.
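
      The three opt-out routes named above look like this in practice (file location per a standard Storybook setup; verify against your version's docs):

      ```js
      // .storybook/main.js -- opt out via the configuration element:
      const config = {
        core: {
          disableTelemetry: true,
        },
      };
      module.exports = config;

      // Alternatively, per run:   npx storybook dev --disable-telemetry
      // Or via the environment:   STORYBOOK_DISABLE_TELEMETRY=1 npx storybook dev
      ```
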
  17. Jan 2023
    1. How did it work? GNUAsk (the aspirational, mostly unreleased search engine UI) relied on hundreds of bots, running as daemons, and listening in on conversations within AOL AIM, IRC, Skype, and Yahoo public chat rooms and recording all the textual conversations.
  18. Dec 2022
    1. “Berla devices position CBP and ICE to perform sweeping searches of passengers’ lives, with easy access to cars' location history and most visited places and to passengers’ family and social contacts, their call logs, and even their social media feeds,” she said.
    2. Cybersecurity researcher Curry told Forbes that, after seeing what could be done with just a VIN, it was “terrifying” that those identifying numbers were public.
    1. Economists explain that markets work best with “perfect information.” And visibility feeds this market by translating and sharing skills. But the price of transparency in the modern age is invaded privacy, as well as bias inherent in automated products and services
    1. The presence of Twitter’s code — known as the Twitter advertising pixel — has grown more troublesome since Elon Musk purchased the platform. That’s because under the terms of Musk’s purchase, large foreign investors were granted special privileges. Anyone who invested $250 million or more is entitled to receive information beyond what lower-level investors can receive. Among the higher-end investors are a Saudi prince’s holding company and a Qatari fund.

      Twitter investors may get access to user data

      I'm surprised but not surprised that Musk's dealings to get investors in his effort to take Twitter private may include sharing of personal data about users. This article makes it sound almost normal that this kind of information-sharing happens with investors (inclusion of the phrase "information beyond what lower-level investors can receive").

    1. Meta's receipt of tax information via tracking pixels on tax preparer websites is the subject of a federal lawsuit. The tax preparing sites are not participants in the lawsuit (yet?).

  19. Nov 2022
    1. From a technical point of view, the IndieWeb people have worked on a number of simple, easy to implement protocols, which provide the ability for web services to interact openly with each other, but in a way that allows for a website owner to define policy over what content they will accept.

      Thought you might like Web Monetization.

    1. Donations

      To add some other intermediary services:

      To add a service for groups:

      To add a service that enables fans to support creators directly and anonymously via microdonations or small donations: fans pre-charge their Coil account to spend on content streaming, or tip the creators' wallets, via a layer containing a JS script following the Interledger Protocol proposed to the W3C:

      If you want to know more, head to Web Monetization or Community or Explainer

      Disclaimer: I am a recipient of a grant from the Interledger Foundation, so there would be a Conflict of Interest if I edited directly. Plus, sharing on Hypothesis allows other users to chime in.

    1. Hidden below all of this is the normalization of surveillance that consistently targets marginalized communities. The difference between a smartwatch and an ankle monitor is, in many ways, a matter of context: Who wears one for purported betterment, and who wears one because they are having state power enacted against them?
    2. The conveniences promised by Amazon’s suite of products may seem divorced from this context; I am here to tell you that they’re not. These “smart” devices all fall under the umbrella of what the digital-studies scholar David Golumbia and I call “luxury surveillance”—that is, surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits.
    1. Some of the sensitive data collection analyzed by The Markup appears linked to default behaviors of the Meta Pixel, while some appears to arise from customizations made by the tax filing services, someone acting on their behalf, or other software installed on the site. For example, Meta Pixel collected health savings account and college expense information from H&R Block’s site because the information appeared in webpage titles and the standard configuration of the Meta Pixel automatically collects the title of a page the user is viewing, along with the web address of the page and other data. It was able to collect income information from Ramsey Solutions because the information appeared in a summary that expanded when clicked. The summary was detected by the pixel as a button, and in its default configuration the pixel collects text from inside a clicked button. The pixels embedded by TaxSlayer and TaxAct used a feature called “automatic advanced matching.” That feature scans forms looking for fields it thinks contain personally identifiable information like a phone number, first name, last name, or email address, then sends detected information to Meta. On TaxSlayer’s site this feature collected phone numbers and the names of filers and their dependents. On TaxAct it collected the names of dependents.

      Meta Pixel default behavior is to parse and send sensitive data

      Wait, wait, wait... the software has a feature that scans for personally identifiable information and sends the detected info to Meta? And in other cases, the users of the Meta Pixel decided to send private information to Meta?
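
      The mechanism being described, scanning form fields for values that look like PII, is straightforward to sketch. This is an illustration of the technique only, not Meta's actual code, and the patterns are deliberately simplified:

      ```js
      // Heuristics for PII-looking values, loosely mirroring what an
      // "automatic advanced matching" scan looks for in form fields.
      const PII_PATTERNS = {
        email: /^[^\s@]+@[^\s@]+\.[^\s@]+$/,
        phone: /^\+?[\d\s().-]{7,15}$/,
      };

      // Given { fieldName: value } pairs from a form, return the subset
      // that matches a PII pattern, labeled by kind.
      function detectPii(fields) {
        const hits = {};
        for (const [name, value] of Object.entries(fields)) {
          for (const [kind, re] of Object.entries(PII_PATTERNS)) {
            if (re.test(String(value).trim())) hits[name] = kind;
          }
        }
        return hits;
      }
      ```

      In a real pixel such a scan would run on form submit or button click, and the detected values would be sent along with the page title and URL.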

    1. Although complicated, Gen Z’s relationship with data privacy should be a consideration for brands when strategizing their data privacy policies and messaging for the future. Expectations around data privacy are shifting from something that sets companies apart in consumers’ minds to something that people expect the same way one might expect a service or product to work as advertised. For Gen Zers, this takes the form of skepticism that companies will keep their data safe, and their reluctance to give companies credit for getting it right means that good data privacy practices will increasingly be more about maintaining trust than building it.

      Gen-Z expectations are complicated

      The Gen-Z generation have notably different expectations about data privacy than previous generations. "Libraries" wasn't among the industry that showed up in their survey results. That Gen-Z expects privacy built in makes that factor a less differentiating characteristic as compared to older generations. It might also be harder to get trust back from members of the Gen-Z population if libraries surprise those users with data handling practices that they didn't expect.

    1. “You have to assume that things can go wrong,” shared Waymo’s head of cybersecurity, Stacy Janes. “You can’t just design for this success case – you have to design for the worst case.”

      Future proofing by asking "what if we're wrong?"

  20. Oct 2022
    1. A Midwestern hospital system is treating its use of Google and Facebook web tracking technologies as a data breach, notifying 3 million individuals that the computing giants may have obtained patient information.

      Substitute “library” for “hospital”

      In an alternate universe: “A Midwestern library system is treating its use of Google and Facebook web tracking technologies as a data breach, notifying 3 million individuals that the computing giants may have obtained search and borrowing histories.”

    1. On the other end, there was The Good Phone Foundation, a not-for-profit organization founded with a mission to create an open, transparent, and secure mobile ecosystem outside of Big Tech’s reach, who just released their own Android-based mobile OS and were looking for apps to rely on. They contacted me, and after a couple of calls, we realized that partnering up on the smartphone makes a lot of sense for both of us. So, here we are, introducing you to our brand new Simple Phone. Only having control over both software and hardware ensures the ultimate privacy and security. The target audience consists of more privacy-oriented people that do not want to be tracked or rely on big corporations, Google Play, etc. It focuses on people who just want to get things done in a simple way without having to keep closing ads and wondering what does the system do in the background. Not to mention consistency again as the core apps are developed by us. Hope you will like it just like we do 🙂

      Simple Phone's effort to release its own mobile OS is promising for ordinary users. Because Simple Mobile Tools represents a full suite of basic Android applications, it can, ideally, provide a privacy-friendly and user-friendly alternative to stock Android by providing a unified suite of apps. /e/ OS (aka Murena) is attempting something similar, but its app collection is not quite as unified as the Simple Mobile suite.

    1. in many ways, Law 25 is the most stringent of the three regimes
    2. Impact assessments: Law 25 is broad and requires a PIR to be carried out whenever conditions are met, regardless of the level of risk. The GDPR is less stringent, only requiring assessments in cases where processing is likely to result in a ‘high risk’ to rights and freedoms. Because the CCPA does not specifically focus on accountability-related obligations, it does not mandate impact assessments.
    3. Privacy by default: Bill 64’s “confidentiality by default” clause is far broader in scope and significantly more stringent than the “privacy by design” concept under the GDPR. The CCPA does not provide for this concept at all, instead taking an “after-the-event” remedial approach. 
    1. requires that a PIA (AIPD) be carried out whenever the situation calls for it, regardless of the level of risk
    2. Privacy by default: The “confidentiality by default” clause of Bill 64 is far broader in scope and much stricter than the “privacy by design” concept under the GDPR. The CCPA instead takes an “after-the-event” remedial approach.
    1. In the event of non-compliance with the Act, the Commission d’accès à l’information may impose significant penalties, which could reach up to $25M or 4% of worldwide turnover. The penalty will be proportionate, in particular, to the seriousness of the breach and the company’s ability to pay.
  21. Sep 2022
    1. Denmark’s data protection regulator found that local schools did not really understand what Google was doing with students’ data and as a result blocked around 8,000 students from using the Chromebooks that had become a central part of their daily education.

      Danish data regulator puts a temporary ban on Google education products

    1. If someone came up to you in the street, said they’re from an online service provider and requested you store all of the above data about you with them, I imagine for many the answer would be a resounding NO!
  22. Aug 2022
    1. University can’t scan students’ rooms during remote tests, judge rules

      Room scans by proctoring tools violate the protection against unreasonable searches, an Ohio court rules (US Const., 4th Amendment). The university's defense was 'it's standard industry practice' and 'others did not complain'. In other words, no actual moral consideration was made by the university. This is so bad, even without the fact that a third party keeps the recording of the video scan of this student's bedroom.

      Where there's a need for remote test taking, why try to copy over the cheating controls from in-person test taking? How about adapting the test content on the assumption that students will have material available to them during the test, reducing the proctoring need to only assuring the actual student is taking the test.

    1. Your personal data will be shared both within Beekeeper associated offices globally and with Greenhouse Software, Inc., a cloud services provider located in the United States of America and engaged by Controller to help manage its recruitment and hiring process on Controller’s behalf.

      Personal data will be accessible to different branches (i.e., national affiliates) of Beekeeper.

    1. NETGEAR is committed to providing you with a great product and choices regarding our data processing practices. You can opt out of the use of the data described above by contacting us at analyticspolicy@netgear.com

      You may opt out of these data use situations by emailing analyticspolicy@netgear.com.

    2. Marketing. For example, information about your device type and usage data may allow us to understand other products or services that may be of interest to you.

      All of the information above that has been consented to can be used by NETGEAR to make money off consenting individuals and their families.

    3. USB device

      This gives NETGEAR permission to know what you plug into your computer, be it a Fitbit, a printer, scanner, microphone, headphones, or webcam: anything you attach to your computer.

  23. Jul 2022
    1. In the wake of Roe v. Wade being overturned, online privacy is on everyone's minds. But according to privacy experts, the entire way we think about and understand what 'privacy' actually means... is wrong. In this new Think Again, NBC News Correspondent Andrew Stern dives deep into digital privacy — what it really means, how we got to this point, how it impacts every facet of our lives, and how little of it we actually have.

  24. www.mojeek.com
    1. Mojeek

      Mojeek is the 4th-largest English-language web search engine after Google, Bing, and Yandex with its own index, crawler, and algorithm. Its index has passed 5.7 billion pages and is growing. Privacy-focused.

      It uses its own index with no backfill from others.

    1. she’s being dragged into the public eye nonetheless.

      Wow... I have heard of instances similar to this one. A stranger narrates the life of another stranger online, it goes viral, and the identities of everyone are revealed. It seemed meaningless to me; I never thought that the people involved might just want their privacy. I find it very scary that this can happen to anyone. It's another reason why I limit my social media usage. I found myself engaging with very negative commentary and it really affected my mental health. I wonder how this woman's mental health is going. Unfortunately, pretty much the whole world knows about her now.

    1. Something has shifted online: We’ve arrived at a new era of anonymity, in which it feels natural to be inscrutable and confusing—forget the burden of crafting a coherent, persistent personal brand. There just isn’t any good reason to use your real name anymore. “In the mid 2010s, ambiguity died online—not of natural causes, it was hunted and killed,” the writer and podcast host Biz Sherbert observed recently. Now young people are trying to bring it back. I find this sort of exciting, but also unnerving. What are they going to do with their newfound freedom?
  25. Jun 2022
    1. Companies need to actually have an ethics panel, and discuss what the issues are and what the needs of the public really are. Any ethics board must include a diverse mix of people and experiences. Where possible, companies should look to publish the results of these ethics boards to help encourage public debate and to shape future policy on data use.

    1. The goal is to gain “digital sovereignty.”

      the age of borderless data is ending. What we're seeing is a move to digital sovereignty

    1. Using the network-provided DNS servers is the best way to blend in with other users. Networks and web sites can fingerprint and track users based on a non-default DNS configuration.
    1. All wireless devices have small manufacturing imperfections in the hardware that are unique to each device. These fingerprints are an accidental byproduct of the manufacturing process. These imperfections in Bluetooth hardware result in unique distortions, which can be used as a fingerprint to track a specific device. For Bluetooth, this would allow an attacker to circumvent anti-tracking techniques such as constantly changing the address a mobile device uses to connect to Internet networks. 

      Tracking that evades address changes

      An operating system can change the hardware address it broadcasts to avoid tracking, but subtle imperfections in the signal itself can still be identified and tracked.
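The linking attack described above can be sketched as a toy simulation. All values here are hypothetical (made-up frequency offsets and noise levels, not real radio measurements); the point is only that a stable hardware imperfection re-identifies a device even when its broadcast address is randomized every session:

```python
import random

random.seed(0)

# Each device has a fixed hardware imperfection (a carrier-frequency
# offset, in kHz) baked in at manufacture; values here are invented.
devices = {name: random.uniform(-10.0, 10.0) for name in ["A", "B", "C"]}

def observe(name):
    """One session: a fresh randomized address, but the hardware offset
    is still measurable (with a little observation noise)."""
    address = random.getrandbits(48)  # randomized MAC-style address
    offset = devices[name] + random.gauss(0, 0.05)
    return address, offset

def link(offset, known):
    """Re-identify a session by nearest known hardware fingerprint."""
    return min(known, key=lambda n: abs(known[n] - offset))

addr1, off1 = observe("B")
addr2, off2 = observe("B")
assert addr1 != addr2  # the address changed between sessions...
print(link(off2, devices))  # ...but the fingerprint still links them to "B"
```

The design point is that anti-tracking address rotation only hides the identifier a device *chooses* to broadcast, not the one its hardware *leaks*.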

    1. Free public projects private projects starting at $9/month per project

      For many tools and apps, payment for privacy is becoming the norm.

      Examples:
      - Kumu.io
      - GitHub for private repos
      - ...

      Pros:
      - helps to encourage putting things into the commons

      Cons:
      - normalizes the idea of payment for privacy, which can be a toxic tool

      discuss...

    1. the one thing that you have to keep conveying to people about the consequences of surveillance is that it's all very well to say that you have nothing to hide, but when you're spied upon, everybody that's connected to you gets spied upon. And if we don't push back, the most vulnerable people in society, the people that actually keep really massive violations of human rights and illegality in check, they're the people who get most affected.

      "I Have Nothing To Hide" counter-argument

      Even if you have nothing to hide, that doesn't mean that those you are connected with aren't also being surveilled and are part of targeted communities.

  26. May 2022
    1. For example, we know one of the ways to make people care about negative externalities is to make them pay for it; that’s why carbon pricing is one of the most efficient ways of reducing emissions. There’s no reason why we couldn’t enact a data tax of some kind. We can also take a cautionary tale from pricing externalities, because you have to have the will to enforce it. Western Canada is littered with tens of thousands of orphan wells that oil production companies said they would clean up and haven’t, and now the Canadian government is chipping in billions of dollars to do it for them. This means we must build in enforcement mechanisms at the same time that we’re designing principles for data governance, otherwise it’s little more than ethics-washing.

      Building in pre-payments or a tax on data leaks to prevent companies neglecting negative externalities could be an important stick in government regulation.

      While it should apply across the board, it should be particularly onerous for for-profit companies.

    2. Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us.

      We need to think more about the externalities of our data decisions.

  27. Apr 2022
    1. This Playbuzz Privacy Policy (“Policy”) outlines what personal information is collected by Playbuzz Ltd. (“Playbuzz”, “we”, “us” or “our”), how we use such personal information, the choices you have with respect to such personal information, and other important information.

      We keep your personal information personal and private. We will not sell, rent, share, or otherwise disclose your personal information to anyone except as necessary to provide our services or as otherwise described in this Policy.

    1. Dorothea Salo (2021) Physical-Equivalent Privacy, The Serials Librarian, DOI: 10.1080/0361526X.2021.1875962

      Permanent Link: http://digital.library.wisc.edu/1793/81297

      Abstract

      This article introduces and applies the concept of “physical-equivalent privacy” to evaluate the appropriateness of data collection about library patrons’ use of library-provided e‑resources. It posits that as a matter of service equity, any data collection practice that causes e‑resource users to enjoy less information privacy than users of an information-equivalent print resource is to be avoided. Analysis is grounded in real-world e‑resource-related phenomena: secure (HTTPS) library websites and catalogs, the Adobe Digital Editions data-leak incident of 2014, and use of web trackers on e‑resource websites. Implications of physical-equivalent privacy for the SeamlessAccess single-sign-on proposal will be discussed.

    1. a child had gone missing in our town and the FBI came to town to investigate immediately and had gone to the library. They had a tip and wanted to seize and search the library’s public computers. And the librarians told the FBI that they needed to get a warrant. The town was grief stricken and was enraged that the library would, at a time like that, demand that the FBI get a warrant. Like everyone in town was like, are you kidding me? A child is missing and you’re– and what? This town meeting afterwards, the library budget, of course, is up for discussion as it is every year, and the people were still really angry with the library, but a patron and I think trustee of the library – again, a volunteer, someone living in town – an elderly woman stood up and gave the most passionate defense of the Fourth Amendment and civil liberties to the people on the floor that I have ever witnessed.

      An example of how a library in Vermont stood up to a warrantless request from the FBI to seize and search public library computers. This could have impacted the library's budget when the issue was brought to a town meeting, but a library patron was a passionate advocate for the 4th amendment.

    1. K-Anonymity, L-Diversity, and T-ClosenessIn this section, I will introduce three techniques that can be used to reduce the probability that certain attacks can be performed. The simplest of these methods is k-anonymity, followed by l-diversity, and then followed by t-closeness. Other methods have been proposed to form a sort of alphabet soup, but these are the three most commonly utilized. With each of these, the analysis that must be performed on the dataset becomes increasingly complex and undeniably has implications on the statistical validity of the dataset.

      privacy metrics
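Of the three metrics, k-anonymity is the simplest to compute: group the rows by their quasi-identifier values and take the smallest group size. A minimal sketch (the function name and the patient records are illustrative, not from the article):

```python
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Return the k-anonymity level of a dataset: the smallest number
    of rows sharing any one combination of quasi-identifier values.
    A dataset is k-anonymous if every combination appears >= k times."""
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(groups.values())

# Hypothetical generalized records; 'zip' and 'age' are quasi-identifiers,
# 'condition' is the sensitive attribute.
records = [
    {"zip": "537**", "age": "30-39", "condition": "flu"},
    {"zip": "537**", "age": "30-39", "condition": "asthma"},
    {"zip": "538**", "age": "40-49", "condition": "flu"},
    {"zip": "538**", "age": "40-49", "condition": "diabetes"},
]
print(k_anonymity(records, ["zip", "age"]))  # prints 2
```

l-diversity and t-closeness then add constraints on the *sensitive* values within each group (at least l distinct values, or a value distribution close to the overall one), which is why the analysis gets progressively more complex.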

    1. Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know.

      Privacy is the power to decide what is kept secret, when, and from whom.

    1. I thought that the point of disappearing messages was to eat your cake and have it too, by allowing you to send a message to your adversary and then somehow deprive them of its contents. This is obviously a stupid idea.But the threat that Snapchat — and its disappearing message successors —was really addressing wasn’t communication between untrusted parties, it was automating data-retention agreements between trusted parties.

      Why use a disappearing message service

      The point of a disappearing message service is to have the parties to the message agree on the data-retention provisions of a message. The service automates that agreement by deleting the message at the specified time. The point isn't to send a message to an adversary and then delete it so they can't prove that it has been sent. There are too many ways of capturing the contents of a message—as simple as taking a picture of the message with another device.

    1. Weinberg’s tweet announcing the change generated thousands of comments, many of them from conservative-leaning users who were furious that the company they turned to in order to get away from perceived Big Tech censorship was now the one doing the censoring. It didn’t help that the content DuckDuckGo was demoting and calling disinformation was Russian state media, whose side some in the right-wing contingent of DuckDuckGo’s users were firmly on.

      There is an odd sort of self-selected information bubble here. DuckDuckGo promoted itself as privacy-aware, not unfiltered. On their Sources page, they talk about where they get content and how they don't sacrifice privacy to gather search results. Demoting disinformation sources in their algorithms would seem to be a good thing. Except if what you expect to see is disinformation, and then suddenly the search results don't match your expectations.

  28. Mar 2022
    1. Thus,information about people and their behaviour is made visible to other people, systems andcompanies.

      "Data trails"—active information and passive telemetry—provide a web of details about a person's daily life, and the analysis of that data is a form of knowledge about a person.

  29. Feb 2022
    1. Others because they want privacy

      AIUI, your account's contribution graph and feed are still public, not private, without a way to opt out—just like on GitHub.

  30. Dec 2021
    1. Efforts to clarify and disseminatethe differences between “privacy as advocacy” (e.g.,privacy is a fundamental right; privacy is an ethicalnorm) and “privacy as compliance” (e.g., ensuringprivacy policies and laws are followed; privacyprograms train, monitor, and measure adherence torules) help frame conversations and set expectations.

      This is an interesting distinction... privacy-because-it-is-the-right-thing-to-do versus privacy-because-you-must. I think the latter is where most institutions are today. It will take a lot more education to get institutions to the former.

    2. As informed and engagedstakeholders, students understand how and why theirinstitutions use academic and personal data.

      Interesting that there is a focus here on advocacy from an active student body. Is it the expectation that change from some of the more stubborn areas of the campus would be driven by informed student push-back? This section on "Students, Faculty, and Staff" doesn't have the same advocacy role from the other portions of the campus community.

    1. Questions, comments and requests, including any complaints, regarding us or this privacy policy are welcomed and should be addressed to privacy@marugroup.net.

      However, if you do this, then your email, IP, browser, etc. will be collected and shared as described above. To be safer, I would write a letter, stick a stamp on the envelope, and send it in.

    2. stored at, a destination outside the European Economic Area ("EEA").

      Why? Is that allowed? I don't think I would be happy about that, as I am not reassured that 'taking reasonable steps' is actually appropriate; an appropriate step would be to host within regions specified by the GDPR.

    3. third party, in which case personal data held by it about its customers will be one of the transferred assets.

      I was going to respond to the survey until I saw this. I am offering to provide feedback for free, and yet my personal information is collected and becomes an asset in any sale of the business. The question is: why is my personal information being held for any length of time after I have completed the survey? Isn't that a violation of the GDPR?

    4. In the event that we sell or buy any business or assets, in which case we will disclose your personal data to the prospective seller or buyer of such business or assets.

      Why? I came across this privacy policy because I had been asked to respond to a survey about the website. Not only am I giving my feedback for free, but they then want to take my personal information and give it away to unknown future buyers and sellers (third parties)?

    1. About 7 in 10 Americans think their phone or other devices are listening in on them in ways they did not agree to.

      I'm enough of a tinfoil-hat wearer to think this might be true, especially since my Google Home talks to me entirely too much when I'm not talking to it.

  31. Nov 2021
    1. There Is No Antimimetics Division (qntm): This is the best new sci fi I've read in recent memory, I think because it feels fresh and modern, tackling some of the hardest social questions that the world is facing today. It's about antimemes, defined as "an idea with self-censoring properties...which, by its intrinsic nature, discourages or prevents people from spreading it."

      I like the idea of antimemes. The tougher question is how to actually implement them on the web.

      Is this just the idea of a digital secret?

      "The only way for two computers to keep a secret on the web is if all the computers are dead."—Chris Aldrich

    1. Pretty much anything that can be remembered can be cracked. There’s still one scheme that works. Back in 2008, I described the “Schneier scheme”: So if you want your password to be hard to guess, you should choose something that this process will miss. My advice is to take a sentence and turn it into a password. Something like “This little piggy went to market” might become “tlpWENT2m”. That nine-character password won’t be in anyone’s dictionary. Of course, don’t use this one, because I’ve written about it. Choose your own sentence — something personal.

      Good advice on creating secure passwords.
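As a toy illustration of the sentence-to-password idea (the substitution rules below are my own assumption, not Schneier's exact method; his point is that the mapping should come from your own personal, ad-hoc choices):

```python
def sentence_to_password(sentence: str) -> str:
    """Condense a memorable sentence into a password: keep each word's
    initial letter (preserving its capitalization) and substitute digits
    for number-like words. The substitution table is illustrative."""
    substitutions = {"to": "2", "too": "2", "two": "2", "for": "4", "four": "4"}
    parts = []
    for word in sentence.split():
        key = word.strip(".,!?").lower()
        if key in substitutions:
            parts.append(substitutions[key])
        else:
            parts.append(word[0])
    return "".join(parts)

print(sentence_to_password("This little piggy went to market"))  # prints Tlpw2m
```

Note the result differs from the article's "tlpWENT2m": Schneier's example also keeps a whole word in caps, a personal choice that a fixed algorithm deliberately should not reproduce, since any published rule ends up in cracking dictionaries.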