481 Matching Annotations
  1. Last 7 days
    1. Someone had taken control of my iPad, blasting through Apple’s security restrictions and acquiring the power to rewrite anything that the operating system could touch. I dropped the tablet on the seat next to me as if it were contagious. I had an impulse to toss it out the window. I must have been mumbling exclamations out loud, because the driver asked me what was wrong. I ignored him and mashed the power button. Watching my iPad turn against me was remarkably unsettling. This sleek little slab of glass and aluminum featured a microphone, cameras on the front and back, and a whole array of internal sensors. An exemplary spy device.
  2. May 2020
    1. Companies that show their customers that they take privacy seriously will earn their trust and loyalty.
    1. We present results from technical experiments which reveal that WeChat communications conducted entirely among non-China-registered accounts are subject to pervasive content surveillance that was previously thought to be exclusively reserved for China-registered accounts.

      WeChat tracks not only Chinese accounts

    1. Google encouraging site admins to put reCaptcha all over their sites, and then sharing the resulting risk scores with those admins is great for security, Perona thinks, because he says it “gives site owners more control and visibility over what’s going on” with potential scammer and bot attacks, and the system will give admins more accurate scores than if reCaptcha is only using data from a single webpage to analyze user behavior. But there’s the trade-off. “It makes sense and makes it more user-friendly, but it also gives Google more data,”
    2. This kind of cookie-based data collection happens elsewhere on the internet. Giant companies use it as a way to assess where their users go as they surf the web, which can then be tied into providing better targeted advertising.
    3. For instance, Google’s reCaptcha cookie follows the same logic of the Facebook “like” button when it’s embedded in other websites—it gives that site some social media functionality, but it also lets Facebook know that you’re there.
    4. one of the ways that Google determines whether you’re a malicious user or not is whether you already have a Google cookie installed on your browser.
    5. But this new, risk-score based system comes with a serious trade-off: users’ privacy.
    1. they sought to eliminate data controllers and processors acting without appropriate permission, leaving citizens with no control as their personal data was transferred to third parties and beyond
    1. “Until CR 1.0 there was no effective privacy standard or requirement for recording consent in a common format and providing people with a receipt they can reuse for data rights.  Individuals could not track their consents or monitor how their information was processed or know who to hold accountable in the event of a breach of their privacy,” said Colin Wallis, executive director, Kantara Initiative.  “CR 1.0 changes the game.  A consent receipt promises to put the power back into the hands of the individual and, together with its supporting API — the consent receipt generator — is an innovative mechanism for businesses to comply with upcoming GDPR requirements.  For the first time individuals and organizations will be able to maintain and manage permissions for personal data.”
    2. Its purpose is to decrease the reliance on privacy policies and enhance the ability for people to share and control personal information.
    1. "You wanted open source privacy-preserving Bluetooth contact tracing code? #DP3T software development kits/calibration apps for iOS and Android, and backend server, now on GitHub. iOS/Android apps with nice interface to follow." Michael Veale on Twitter (see context)

    1. I will need to find a workaround for one of my private extensions that controls devices in my home network, and its source code cannot be uploaded to Mozilla because of my and my family's privacy.
  3. Apr 2020
    1. Don’t share any private, identifiable information on social media. It may be fun to talk about your pets with your friends on Instagram or Twitter, but if Fluffy is the answer to your security question, then you shouldn’t share that with the world. This may seem quite obvious, but sometimes you get wrapped up in an online conversation, and it is quite easy to let things slip out. You may also want to keep quiet about your past or current home locations and avoid sharing anything that is very unique and identifiable. It could help someone fake your identity.
    2. Don’t share vacation plans on social media. Sharing a status about your big trip to the park on Saturday may be a good idea if you are looking to have a big turnout of friends join you, but not when it comes to home and personal safety. For starters, you have just broadcast where you are going to be at a certain time, which can be pretty dangerous if you have a stalker or a crazy ex. Secondly, you are telling people when you won’t be home, which can make you vulnerable to being robbed. This is also true if you are sharing selfies of yourself on the beach with a caption that states “The next 2 weeks are going to be awesome!” You have just basically told anyone who can view your photo, and even their friends, that you are far away from home and for how long.
    1. Finally, from a practical point of view, we suggest the adoption of food-label-like “privacy label” notices that provide the required information in an easily understandable manner, making privacy policies easier to read. Through standard symbols, colors and feedback — including yes/no statements, where applicable — critical and specific scenarios are identified: for example, whether or not the organization actually shares the information, under what specific circumstances this occurs, and whether individuals can object to the sharing of their personal data. This would allow some kind of standardized information. Key points could include the information collected and the purposes of its collection (such as marketing, international transfers or profiling), contact details of the data controller, and distinct differences between organizations’ privacy practices, helping to identify privacy-invasive practices.
    1. people encountering public Hypothesis annotations anywhere don’t have to worry about their privacy.

      In the Privacy Policy document there is an annotation that says:

      I decided against using hypothes.is as the commenting system for my blog, since I don't want my readers to be traceable by a third party I choose on their behalf

      Although this annotation is a bit old (from 2016), I understand that the Hypothes.is server would in fact get information from these readers through HTTP requests, correct? Such as the IP address, the browser's user agent, etc. I wonder whether this is the traceability the annotator was referring to.

      Anyway, I think this wouldn't be much different from how an embedded image hosted elsewhere would be displayed on one such site. And Hypothes.is' Privacy Policy states that

      This information is collected in a log file and retained for a limited time

    1. at any time,

      It would be nice if it said here that Hypothes.is will notify its users when the Privacy Policy is changed.

    1. Before we get to passwords, surely you already have in mind that Google knows everything about you. It knows what websites you’ve visited, it knows where you’ve been in the real world thanks to Android and Google Maps, it knows who your friends are thanks to Google Photos. All of that information is readily available if you log in to your Google account. You already have good reason to treat the password for your Google account as if it’s a state secret.
    1. Alas, you'll have to manually visit each site in turn and figure out how to actually delete your account. For help, turn to JustDelete.me, which provides direct links to the cancellation pages of hundreds of services.
    1. When you visit a website, you are allowing that site to access a lot of information about your computer's configuration. Combined, this information can create a kind of fingerprint — a signature that could be used to identify you and your computer. Some companies use this technology to try to identify individual computers.
    1. Our approach strikes a balance between privacy, computation overhead, and network latency. While single-party private information retrieval (PIR) and 1-out-of-N oblivious transfer solve some of our requirements, the communication overhead involved for a database of over 4 billion records is presently intractable. Alternatively, k-party PIR and hardware enclaves present efficient alternatives, but they require user trust in schemes that are not widely deployed yet in practice. For k-party PIR, there is a risk of collusion; for enclaves, there is a risk of hardware vulnerabilities and side-channels.
    2. At the same time, we need to ensure that no information about other unsafe usernames or passwords leaks in the process, and that brute force guessing is not an option. Password Checkup addresses all of these requirements by using multiple rounds of hashing, k-anonymity, and private set intersection with blinding.
    3. Privacy is at the heart of our design: Your usernames and passwords are incredibly sensitive. We designed Password Checkup with privacy-preserving technologies to never reveal this personal information to Google. We also designed Password Checkup to prevent an attacker from abusing Password Checkup to reveal unsafe usernames and passwords. Finally, all statistics reported by the extension are anonymous. These metrics include the number of lookups that surface an unsafe credential, whether an alert leads to a password change, and the web domain involved for improving site compatibility.
    1. Google says this technique, called "private set intersection," means you don't get to see Google's list of bad credentials, and Google doesn't get to learn your credentials, but the two can be compared for matches.
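      The k-anonymity idea behind these credential checks can be illustrated with a simplified sketch. To be clear, this is not Google's actual protocol (which uses slow, blinded hashing and elliptic-curve private set intersection); it is the hash-prefix bucketing scheme popularized by Have I Been Pwned, with a hypothetical in-memory breach index standing in for the server:

```python
import hashlib

def hash_credential(password: str) -> str:
    """Hash the password. Real deployments use a slow, memory-hard
    hash (e.g. Argon2); SHA-1 is used here only to keep the sketch short."""
    return hashlib.sha1(password.encode("utf-8")).hexdigest().upper()

def build_breach_index(breached_passwords):
    """Hypothetical server-side index: hash suffixes of breached
    passwords, bucketed by the first 5 hex characters of the hash."""
    index = {}
    for pw in breached_passwords:
        digest = hash_credential(pw)
        index.setdefault(digest[:5], set()).add(digest[5:])
    return index

def is_breached(password: str, index) -> bool:
    """The client reveals only the 5-character prefix; the bucket the
    server returns covers many unrelated credentials, and the final
    membership test happens locally on the client."""
    digest = hash_credential(password)
    bucket = index.get(digest[:5], set())  # what the server would send back
    return digest[5:] in bucket

index = build_breach_index(["hunter2", "password123"])
assert is_breached("hunter2", index)
assert not is_breached("correct horse battery staple", index)
```

      Because every prefix bucket covers many unrelated credentials, the server learns only that the client's hash falls somewhere in that bucket, never the credential itself; Google's private-set-intersection design strengthens this further by blinding both sides' inputs.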
    1. "If someone knows your old passwords, they can catch onto your system. If you're in the habit of inventing passwords with the name of a place you've lived and the zip code, for example, they could find out where I have lived in the past by mining my Facebook posts or something." Indeed, browsing through third-party password breaches offers glimpses into the things people hold dear — names of spouses and children, prayers, and favorite places or football teams. The passwords may no longer be valid, but that window into people's secret thoughts remains open.
    1. “A phone number is worth more on the dark web than a Social Security number. Your phone is so much more rich with data,” says J.D. Mumford, who runs Anonyome Labs Inc. in Salt Lake City.

      “Facial recognition technology is now cheap enough where you can put it in every Starbucks and have your coffee ready when you’re in the front of the line,” says Lorrie Cranor, a computer science professor at Carnegie Mellon University who runs its CyLab Usable Privacy and Security Laboratory in Pittsburgh. In March, the New York Times put three cameras on a rooftop in Manhattan, spent $60 on Amazon’s Rekognition system, and identified several people. I took an Air France flight that had passengers board using our faceprints, taken from our passports without our permission.

      Private companies such as Vigilant Solutions Inc., headquartered in the Valley, have cameras that have captured billions of geotagged photos of cars on streets and in parking lots that they sell on the open market, mostly to police and debt collectors.

      Project Kovr runs a similar workshop at schools, in which it assigns some kids to stalk another child from a distance so they can create a data profile and tailor an ad campaign for the stalkee. Baauw has also been planning a project in which he chisels a statue of Facebook Chief Executive Officer Mark Zuckerberg as a Roman god. “He’s the Zeus of our time,” he says.

      Until people demand a law that makes privacy the default, I’m going to try to remember, each time I click on something, that free things aren’t free. That when I send an email or a text outside of Signal or MySudo, I should expect those messages to one day be seen.

    1. Someone, somewhere has screwed up to the extent that data got hacked and is now in the hands of people it was never intended to be. No way, no how does this give me license to then treat that data with any less respect than if it had remained securely stored and I reject outright any assertion to the contrary. That's a fundamental value I operate under
    1. Unlike Zoom, Apple’s FaceTime video conference service is truly end-to-end encrypted. Group FaceTime calls offer a privacy-conscious alternative for up to 32 participants. The main caveat is that this option only works if everyone on the call has an Apple device that currently supports this feature.
    1. Covid-19 is an emergency on such a huge scale that, if anonymity is managed appropriately, internet giants and social media platforms could play a responsible part in helping to build collective crowd intelligence for social good, rather than profit
    2. Google's move to release location data highlights concerns around privacy. According to Mark Skilton, director of the Artificial Intelligence Innovation Network at Warwick Business School in the UK, Google's decision to use public data "raises a key conflict between the need for mass surveillance to effectively combat the spread of coronavirus and the issues of confidentiality, privacy, and consent concerning any data obtained."
    1. Thousands of enterprises around the world have done exhaustive security reviews of our user, network, and data center layers and confidently selected Zoom for complete deployment. 

      This doesn't really account for the fact that Zoom have committed some atrociously heinous acts, such as (and not limited to):

  4. Mar 2020
    1. To join the Privacy Shield Framework, a U.S.-based organization is required to self-certify to the Department of Commerce and publicly commit to comply with the Framework’s requirements. While joining the Privacy Shield is voluntary, the GDPR goes far beyond it.
    1. "users are not able to fully understand the extent of the processing operations carried out by Google and that ‘the information on processing operations for the ads personalization is diluted in several documents and does not enable the user to be aware of their extent."
    2. None Of Your Business
    1. This is known as transport encryption, which is different from end-to-end encryption because the Zoom service itself can access the unencrypted video and audio content of Zoom meetings. So when you have a Zoom meeting, the video and audio content will stay private from anyone spying on your Wi-Fi, but it won’t stay private from the company.
    2. But despite this misleading marketing, the service actually does not support end-to-end encryption for video and audio content, at least as the term is commonly understood. Instead it offers what is usually called transport encryption, explained further below
    1. The cookie policy is a section of the privacy policy dedicated to cookies
    2. If a website/app collects personal data, the Data Owner must inform users of this fact by way of a privacy policy. All that is required to trigger this obligation is the presence of a simple contact form, Google Analytics, a cookie or even a social widget; if you’re processing any kind of personal data, you definitely need one.
    1. By choosing Matomo, you are joining an ever growing movement. You’re standing up for something that respects user-privacy, you’re fighting for a safer web and you believe your personal data should remain in your own hands, no one else’s.
    1. Data privacy now a global movement. We’re pleased to say we’re not the only ones who share this philosophy: web browsing companies like Brave have made it possible to browse the internet more privately, and you can use a search engine like DuckDuckGo to search with the freedom you deserve.
    2. our values remain the same – advocating for 100% data ownership, respecting user-privacy, being reliable and encouraging people to stay secure. Complete analytics, that’s 100% yours.
    3. the privacy of your users is respected
  5. www.graphitedocs.com
    1. Own Your Encryption Keys. You would never trust a company to keep a record of your password for use anytime they want. Why would you do that with your encryption keys? With Graphite, you don't have to. You own and manage your keys so only YOU can decrypt your content.
    1. When you think about data law and privacy legislation, cookies easily come to mind as they’re directly related to both. This often leads to the common misconception that the Cookie Law (ePrivacy Directive) has been repealed by the General Data Protection Regulation (GDPR), which in fact it has not. Instead, you can think of the ePrivacy Directive and GDPR as working together and complementing each other, where, in the case of cookies, the ePrivacy Directive generally takes precedence.
    1. In accordance with the general principles of privacy law, which do not permit the processing of data prior to consent, the cookie law does not allow the installation of cookies before obtaining the user’s consent, except for exempt categories.
    1. the deceptive practices it has been used to shield and enable are on borrowed time. The direction of travel — and the direction of innovation — is pro-privacy, pro-user control and therefore anti-deceptive-design.
    2. Earlier this year it began asking Europeans for consent to processing their selfies for facial recognition purposes — a highly controversial technology that regulatory intervention in the region had previously blocked. Yet now, as a consequence of Facebook’s confidence in crafting manipulative consent flows, it’s essentially figured out a way to circumvent EU citizens’ fundamental rights — by socially engineering Europeans to override their own best interests.
    3. But people clearly do care about privacy. Just look at the lengths to which ad tech entities go to obfuscate and deceive consumers about how their data is being collected and used. If people don’t mind companies spying on them, why not just tell them plainly it’s happening?
    4. The deceitful obfuscation of commercial intention certainly runs all the way through the data brokering and ad tech industries that sit behind much of the ‘free’ consumer Internet. Here consumers have plainly been kept in the dark so they cannot see and object to how their personal information is being handed around, sliced and diced, and used to try to manipulate them.
    5. design choices are being selected to be intentionally deceptive. To nudge the user to give up more than they realize. Or to agree to things they probably wouldn’t if they genuinely understood the decisions they were being pushed to make.
    1. provide users with information regarding how to update their browser settings. Many sites provide detailed information for most browsers. You could either link to one of these sites, or create a similar guide of your own. Your guide can either appear in a pop up after a user declines consent, or it can be part of your Privacy Policy, Cookie Information page, or its own separate page.
    1. What information is being collected? Who is collecting it? How is it collected? Why is it being collected? How will it be used? Who will it be shared with? What will be the effect of this on the individuals concerned? Is the intended use likely to cause individuals to object or complain?
    1. If your agreement with Google incorporates this policy, or you otherwise use a Google product that incorporates this policy, you must ensure that certain disclosures are given to, and consents obtained from, end users in the European Economic Area along with the UK. If you fail to comply with this policy, we may limit or suspend your use of the Google product and/or terminate your agreement.
    1. When joining a Zoom meeting, the "join from your browser" link is intentionally hidden. This browser extension solves this problem by transparently redirecting any meeting links to use Zoom's browser based web client.

      Using this extension means one won't be affected by the tracking that occurs via Zoom's apps for desktop and mobile devices.

    1. The host of a Zoom call has the capacity to monitor the activities of attendees while screen-sharing. This functionality is available in Zoom version 4.0 and higher.

      This is true if one uses the Zoom apps for desktop or mobile devices.

      There is a Chrome extension that redirects Zoom meetings via a web browser.

    1. Legitimate Interest may be used for marketing purposes as long as it has a minimal impact on a data subject’s privacy and it is likely the data subject will not object to the processing or be surprised by it.
    1. Google Analytics created an option to remove the last octet (the final number group) from your visitor’s IP address. This is called ‘IP Anonymization’. Although this isn’t complete anonymization, the GDPR demands you use this option if you want to use Analytics without prior consent from your visitors. Some countries (e.g. Germany) demand this setting to be enabled at all times.
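      The last-octet truncation described above is simple to sketch. A minimal illustration, assuming IPv4 dotted-quad input (Google Analytics applies the equivalent operation server-side before storage, and zeroes a larger portion of IPv6 addresses):

```python
def anonymize_ip(ip: str) -> str:
    """Zero the last octet of an IPv4 address, mirroring the effect of
    Google Analytics' 'IP Anonymization' option. IPv4-only sketch."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected a dotted-quad IPv4 address")
    octets[3] = "0"
    return ".".join(octets)

assert anonymize_ip("203.0.113.97") == "203.0.113.0"
```

      Note that zeroing one octet still leaves the /24 network identifiable, which is why this counts as pseudonymization rather than full anonymization.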
    1. A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing
    2. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing
    3. Where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data.
    1. As a condition of use of this site, all users must give permission for CIVIC to use its access logs to attempt to track users who are reasonably suspected of gaining, or attempting to gain, unauthorised access. All log file information collected by CIVIC is kept secure and no access to raw log files is given to any third party.
    2. CIVIC will make no attempt to identify individual users. You should be aware, however, that access to web pages will generally create log entries in the systems of your ISP or network service provider. These entities may be in a position to identify the client computer equipment used to access a page.
    1. A 1% sample of AddThis Data (“Sample Dataset”) is retained for a maximum of 24 months for business continuity purposes.
    1. The system has been criticised due to its method of scraping the internet to gather images and storing them in a database. Privacy activists say the people in those images never gave consent. “Common law has never recognised a right to privacy for your face,” Clearview AI lawyer Tor Ekeland said in a recent interview with CoinDesk. “It’s kind of a bizarre argument to make because [your face is the] most public thing out there.”
    1. According to the Swedish Police Authority's guidelines, an impact assessment must be carried out before new police tools are introduced, if they involve sensitive processing of personal data. No such assessment has been made for the tool in question.

      Swedish police have used Clearview AI without any impact assessment having been performed.

      In other words, Swedish police have used a facial-recognition system without being allowed to do so.

      This is a clear breach of human rights.

      Swedish police have lied about this, as reported by Dagens Nyheter.

    1. The payment provider told MarketWatch that everyone has a unique walk, and it is investigating innovative behavioral biometrics such as gait, face, heartbeat and veins for cutting edge payment systems of the future.

      This is a true invasion of people's lives.

      Remember: this is a credit-card company. We use them to pay for stuff. They shouldn't know what we look like, how we walk, how our hearts beat, or how our 'vein technology' works.