  1. Sep 2021
  2. Aug 2021
    1. You can request that Zoom delete any and all information they hold on you. Information on your data rights and how to get in contact with Zoom to request they erase your data can be found in their privacy policy. Once you have made the request, follow up to ensure you get confirmation that your data has been removed from their servers.
    1. U.S. Senate Subcommittee on Communications, Technology, Innovation, and the Internet, "Optimizing for Engagement: Understanding the Use of Persuasive Technology on Internet Platforms," 25 June 2019, www.commerce.senate.gov/2019/6/optimizing-for-engagement-understanding-the-use-of-persuasive-technology-on-internet-platforms.

      Perhaps we need plurality in the areas for which social data are aggregated?

      What if we didn't optimize for engagement, but optimized for privacy, security, or other axes in the space?

  3. Jul 2021
    1. whereas now, they know that user@domain.com was subscribed to xyz.net at some point and is unsubscribing. Information is gold. Replace user@domain with abcd@senate and xyz.net with warezxxx.net and you've got tabloid gold.
    1. Roberts noted that the risks of physical danger to donors are heightened “with each passing year” as changes in technology enable “anyone with access to a computer” to “compile a wealth of information about” anyone.

      He's going to be shocked at what's in his Facebook (shadow) profile...

    1. consumer friendly

      Including the "consumer" here is a red herring. We're meant to identify as the consumer and so take from this statement that our rights and best interests have been written into these BigTech-crafted laws.

      But a "consumer" is different from a "citizen," a "person," we the people.

    2. passage in March of a consumer data privacy law in Virginia, which Protocol reported was originally authored by Amazon

      From the article:

      Marsden and Virginia delegate Cliff Hayes held meetings with other large tech companies, including Microsoft; financial institutions, including Capital One; and non-profit groups and small businesses...

      Which all have something at stake here: the ability to monitor people and mine their data in order to sell it.

      Weak privacy laws give the illusion of privacy while maintaining the corporate panopticon.

    3. consumers would have to opt out of rather than into tracking

      Example of a dark pattern.

  4. Jun 2021
    1. But after using it for a few days you quickly realize that there is one major privacy issue that has been installed consciously by Amazon and Ring. The Ring app allows you to delete videos on the system, but it does not allow you to delete motion sensor and window sensor history. So Amazon/Ring knows everything that happens inside your home and there is no way for you to delete that history. They know when you're inside, they know when you open your door, they know when you closed it, etc. So they essentially know everything about you and your motions within your home. This is a major privacy issue. And it is not some mistake that was overlooked. This was a conscious choice on Amazon/Ring's part to track the motions of you and your family inside your own home. I spoke with the customer service rep from Ring and she admitted that many, many people call up and complain that they can't delete sensor history. Of course it would've been much more ethical to explain to potential customers BEFORE they buy Ring products that this breach of privacy has been installed. But Amazon/Ring does not warn their customers about this privacy breach. They don't warn customers because they created the privacy breach and will continue to always have very personal information on the motions of your family inside your own home. If you care about your privacy, don't buy Ring products.
    1. The fact that they are an object of regulation is not some radical position; it is the position that Fb's founder and owner Mark Zuckerberg has expressed before the US Congress: "My position is not that there should be no regulation. I believe the real question, as the internet becomes more and more important in people's lives, is what is the right way to regulate it, not whether regulation is necessary."

      Tsakalotos at his best, arguing against the ideology of retreating into private life on Fb.

    1. Yet books are curious objects: their strength is to be both intensely private and intensely social — and marginalia is a natural bridge between these two states.

      Books represent a dichotomy in being both intensely private and intensely social at the same time.

      Are there other objects that have this property?

      Books also have the quality of providing people with identities.

  5. May 2021
    1. <small><cite class='h-cite via'> <span class='p-author h-card'>jenny (phire) zhang</span> in jenny (phire) zhang on Twitter: "@markpopham the OSS/indieweb world falls into this trap a lot imo!! thank you for reading <3" / Twitter (<time class='dt-published'>05/06/2021 07:20:50</time>)</cite></small>

    2. In 1962, a book called Silent Spring by Rachel Carson documenting the widespread ecological harms caused by synthetic pesticides went off like a metaphorical bomb in the nascent environmental movement.

      Where is the Silent Spring in the data, privacy, and social media space?

    3. For example, we know one of the ways to make people care about negative externalities is to make them pay for it; that’s why carbon pricing is one of the most efficient ways of reducing emissions. There’s no reason why we couldn’t enact a data tax of some kind. We can also take a cautionary tale from pricing externalities, because you have to have the will to enforce it. Western Canada is littered with tens of thousands of orphan wells that oil production companies said they would clean up and haven’t, and now the Canadian government is chipping in billions of dollars to do it for them. This means we must build in enforcement mechanisms at the same time that we’re designing principles for data governance, otherwise it’s little more than ethics-washing.

      Building in pre-payments or a tax on data leaks, to prevent companies from neglecting negative externalities, could be an important stick in government regulation.

      While it should apply across the board, it should be particularly onerous for for-profit companies.

    4. Amidst the global pandemic, this might sound not dissimilar to public health. When I decide whether to wear a mask in public, that’s partially about how much the mask will protect me from airborne droplets. But it’s also—perhaps more significantly—about protecting everyone else from me. People who refuse to wear a mask because they’re willing to risk getting Covid are often only thinking about their bodies as a thing to defend, whose sanctity depends on the strength of their individual immune system. They’re not thinking about their bodies as a thing that can also attack, that can be the conduit that kills someone else. People who are careless about their own data because they think they’ve done nothing wrong are only thinking of the harms that they might experience, not the harms that they can cause.

      What lessons might we draw from public health and epidemiology to improve our privacy lives in an online world? How might we wear social media "masks" to protect our friends and loved ones from our own posts?

    5. In an individual model of privacy, we are only as private as our least private friend.

      So don't have any friends?

      Obviously this isn't a thing, but the implications of this within privacy models can be important.

      Are there ways to create this as a ceiling instead of as a floor? How might we use topology to flip this script?

    6. Even with data that’s less fraught than our genome, our decisions about what we expose to the world have externalities for the people around us.

      We need to think more about the externalities of our data decisions.

    7. 130 years on, privacy is still largely conceived of as an individual thing, wherein we get to make solo decisions about when we want to be left alone and when we’re comfortable being trespassed upon.

      How could one design a mathematical balancing system to help separate individuals, embedded within a variety of societies or publics, enforce a balance of levels of privacy?

      • There's the interpersonal level between the individuals
      • There's the person's individual privacy and the public's reaction/response to the thing captured, for which the public may shun or not
      • There's the taker's right (possibly a journalist or news outlet) to inform the broader public, which may shame or not
      • There's the public's potential right to know; the outcome may affect them or dramatically change society as a whole
      • Other facets?
      • How many facets?
      • How to balance all of these to create an optimum outcome for all parties?
      • What might the right to be forgotten look like, and how might it be enforced?
      • How do economic incentives play out (paparazzi, journalism, social media, etc.)?
    1. Draft notes, E-mail, plans, source code, to-do lists, what have you

      The personal nature of this information means that users need control of their information. Tim Berners-Lee's Solid (Social Linked Data) project looks like it could do some of this stuff.

    1. The seminal 1890 Harvard Law Review article The Right to Privacy—which every essay about data privacy is contractually obligated to cite—argued that the right of an individual to object to the publication of photographs ought to be considered part of a general ‘right to be let alone’.

      <small><cite class='h-cite via'> <span class='p-author h-card'>Jenny</span> in left alone, together | The Roof is on Phire (<time class='dt-published'>05/08/2021 18:32:41</time>)</cite></small>

      See also: https://en.wikipedia.org/wiki/The_Right_to_Privacy_(article)

    1. “For one of the most heavily guarded individuals in the world, a publicly available Venmo account and friend list is a massive security hole. Even a small friend list is still enough to paint a pretty reliable picture of someone's habits, routines, and social circles,” Gebhart said.

      Massive how? He's such a public figure that most of these connections are already widely reported in the media or easily guessable by a private investigator. The bigger issue is the related transaction data, which might open them up to other abuses or potential leverage, as in the other examples.

    1. Although I believe people have a right to secure and private communication, I disagree with those who extrapolate from this that we have a right to anonymous property transfer. It’s totally in the public’s legitimate interest to keep track of who owns what, and to settle which transfers of ownership are legitimate, for instance by disallowing coerced ones.

      I found this thought helpful. I had feelings like this but could not articulate them before.

  6. Apr 2021
    1. People can take the conversations with willing co-workers to Signal, Whatsapp, or even a personal Basecamp account, but it can't happen where the work happens anymore.

      Do note that two of the three systems that Fried uses as examples are private. In other words, only the people you explicitly want to see what you're writing will see it.

      This goes against his previous actions somewhat, e.g. https://twitter.com/jasonfried/status/1168986962704982016

  7. Mar 2021
    1. Not only are these websites breaking my trust—when I visit your website, I entered into a contract with you, not 80 other websites—but they are loading content from websites I neither know nor trust, some of which have been known to spread malware.

      The contract of a healthy community: basic respect for one another.

    1. a data donation platform that allows users of browsers to donate data on their usage of specific services (eg Youtube, or Facebook) to a platform.

      This seems like a really promising pattern for many data-driven problems. Browsers can let users opt in to donating their data to improve Web search, social media, recommendations, and other services that implicitly require large amounts of operational data.

    2. The idea is that many smaller tech companies would allow for more choice between services. This solution is flawed. For one, services like search or social media benefit from network effects. Having large datasets to train on, means search recommendations get better. Having all your friends in one place, means you don’t need five apps to contact them all. I would argue those are all things we like and might lose when Big Tech is broken up. What we want is to be able to leave Facebook and still talk to our friends, instead of having many Facebooks.

      I'd be interested to better understand this concern or critique. I think the goal of smaller, interoperable services is exactly the idea of being able to communicate with our Facebook friends even if we leave Facebook. Perhaps that is an argument for combining deconsolidation with interoperability.

    1. Our new feature, Total Cookie Protection, works by maintaining a separate “cookie jar” for each website you visit. Any time a website, or third-party content embedded in a website, deposits a cookie in your browser, that cookie is confined to the cookie jar assigned to that website, such that it is not allowed to be shared with any other website.
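
      As a rough mental model of what this partitioning does (a conceptual sketch only, not Firefox's actual implementation; all names here are illustrative): instead of one global cookie store keyed by the cookie's own site, every read and write is additionally keyed by the top-level site being visited, so an embedded tracker gets a different, unlinkable jar on each site.

      ```typescript
      // Conceptual sketch of per-site "cookie jars" (illustrative names only).
      type CookieJar = Map<string, string>; // cookie name -> cookie value

      // One jar per top-level site the user visits.
      const jars = new Map<string, CookieJar>();

      function setCookie(topLevelSite: string, name: string, value: string): void {
        if (!jars.has(topLevelSite)) jars.set(topLevelSite, new Map());
        jars.get(topLevelSite)!.set(name, value);
      }

      function getCookie(topLevelSite: string, name: string): string | undefined {
        // A tracker embedded on two different sites reads from two different
        // jars, so its identifier cannot link visits across those sites.
        return jars.get(topLevelSite)?.get(name);
      }
      ```
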
  8. Feb 2021
  9. www.joinhoney.com
    1. Honey does not track your search engine history, emails, or your browsing on any site that is not a retail website (a site where you can shop and make a purchase). When you are on a pre-approved retail site, to help you save money, Honey will collect information about that site that lets us know which coupons and promos to find for you. We may also collect information about pricing and availability of items, which we can share with the rest of the Honey community.
    1. (F)unctional Sifting: A Privacy-Preserving Reputation System Through Multi-Input Functional Encryption (extended version)
  10. Jan 2021
    1. Despite some implementation challenges, patient portals have allowed millions of patients to access their medical records, read physicians’ notes, message providers, and contribute valuable information and corrections.

      I wonder if patients can edit, or at least flag, information in their record?

    1. In our bedrooms, we want to have power over who has access to us; in our bathrooms, we just want others deprived of that access.

      Reiman highlights two types of privacy.

      The privacy we want to have in the bathroom, which is the power to deprive others of access to us.

      And the privacy we want to have in the bedroom, which is the power to control who has access to us.

    2. By privacy, I understand the condition in which other people are deprived of access to either some information about you or some experience of you. For the sake of economy, I will shorten this and say that privacy is the condition in which others are deprived of access to you.

      Reiman defines privacy as the condition in which others are deprived of access to you (information (e.g. location) or experience (e.g. watching you shower))

    3. No doubt privacy is valuable to people who have mischief to hide, but that is not enough to make it generally worth protecting. However, it is enough to remind us that whatever value privacy has, it also has costs. The more privacy we have, the more difficult it is to get the information that…

      Privacy is valuable to people who have mischief to hide. This is not enough to make it worth protecting, but it tells us that there is also a cost.

    1. As you already noticed, the extension does not go in and manipulate the hrefs/urls in the DOM itself. While it may seem scary to you that an extension may manipulate a URL you're navigating to in-flight, I think it's far scarier to imagine an extension reading and manipulating all of the HTML on all of the pages you go to (bank accounts, utilities, crypto, etc) in order to provide a smidgeon of privacy for the small % of times you happen to click a link with some UTM params.
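
      For contrast, here is a minimal sketch of the "in-flight" approach described above, assuming the Firefox WebExtensions webRequest API (Manifest V2); the parameter list and all details are illustrative, not the actual extension's code.

      ```typescript
      // Strip common tracking parameters from top-level navigations
      // "in flight", without ever reading or rewriting the page's DOM.
      const TRACKING_PARAMS = [
        "utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content",
      ];

      browser.webRequest.onBeforeRequest.addListener(
        (details: { url: string }) => {
          const url = new URL(details.url);
          const found = TRACKING_PARAMS.filter((p) => url.searchParams.has(p));
          if (found.length === 0) return {}; // nothing to strip; let it through
          for (const p of found) url.searchParams.delete(p);
          // Redirect the navigation to the cleaned URL before it is sent.
          return { redirectUrl: url.toString() };
        },
        { urls: ["<all_urls>"], types: ["main_frame"] },
        ["blocking"]
      );
      ```
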
  11. Dec 2020
    1. I haven't met anyone who makes this argument who then says that a one stop convenient, reliable, private and secure online learning environment can’t be achieved using common every day online systems

      Reliable: As a simple example, I'd trust Google to maintain data reliability over my institutional IT support.

      And you'd also need to make the argument for why learning needs to be "private", etc.

    1. And then there was what Lanier calls “data dignity”; he once wrote a book about it, called Who Owns the Future? The idea is simple: What you create, or what you contribute to the digital ether, you own.

      See Tim Berners-Lee's SOLID project.

    1. “Being under constant surveillance in the workplace is psychological abuse,” Heinemeier Hansson added. “Having to worry about looking busy for the stats is the last thing we need to inflict on anyone right now.”

      I really like the Basecamp approach (I forget where I heard this...could have been in one of the Rework podcasts):

      Don't try to get the most out of everyone; try to get the best out of them.

      If you're looking for ways to build trust in a team, I can't recommend the following books published by Basecamp highly enough:

      • Rework
      • Remote
      • It doesn't have to be crazy at work
    2. For example, to help maintain privacy and trust, the user data provided in productivity score is aggregated over a 28-day period.

      So the fact that the metrics are aggregated over a 28-day period is meant to maintain privacy and trust. How?

    1. Recent patent filings show that Microsoft has been exploring additional ideas to monitor workers in the interest of organizational productivity. One filing describes a “meeting insight computing system” that would generate a quality score for a meeting using data such as body language, facial expressions, room temperature, time of day, and number of people in the meeting.

      So this will require you to have video turned on. How will they sell this to employees? "You need to turn your video on so that the algorithm can generate an accurate meeting quality score using your body language and facial expressions."

      Sounds perfect. Absolutely no concerns about privacy violations, etc. in this product.

  12. Nov 2020
    1. Online Exams & Proctoring (In Addition to Guidance Listed Above) Requiring students to turn on their camera to be watched or recorded at home during an exam poses significant privacy concerns and should not be undertaken lightly. Several proctoring services use machine learning, AI, eye-tracking, key-logging, and other technologies to detect potential cheating; these should be used only when no feasible alternatives exist. If instructors are using a proctoring service during the COVID-19 measures, they must provide explicit notice to the students before the exam. Instructors are encouraged to work with the Digital Learning Hub in the Commons and the Academic Integrity Office to consider privacy-protective options, including how to use question banks (in Canvas), that will uphold integrity and good assessment design. Proctors and instructors are strongly discouraged from requiring students to show their surroundings on camera. Computers are available in labs for students who do not have a computer to take their final exams. Finals CANNOT be held in a lab, that is, instructors cannot be present nor can students from a specific class be asked to gather there for a final. This is only for those students who need a computer to drop in and complete their exam.
    1. anonymous imageboard

      4chan is reasonably unique in the current online landscape, in that it permits conversation by totally anonymous users. This allows its users to post without much thought about their privacy status, which they often take for granted. This unique level of privacy fostered by anonymity, in a way, partially delivers on the Cyberspace rhetoric of the 1990s in that people can't be judged by their physical identities unless they offer identifying information up themselves. That's not to say that 4chan is a welcoming space for all (or even most) users, though, as it has been acknowledged, even later here in Ellis' article, that 4chan houses plenty of white supremacist tendencies, but, strictly speaking, as far as one's ideas go, they are judged purely based on their merit so long as no additional personal identifiers are offered. As Dillon Ludemann notes in his paper, /pol/emics: Ambiguity, scales, and digital discourse on 4chan, white supremacy, as well as other, "practiced and perceived deviancy is due to the default blanket of anonymity, and the general discourse of the website encourages users to remain unnamed. This is further enforced and embodied as named users, colloquially known as 'namefags,' are often vilified for their separation from the anonymous collective community" (Ludemann, 2018).

      Hypothetically, since all users start out as anonymous, one could also present their identity however they so please on the platform, and in theory what this means is that the technology behind the site promotes identity exploration (and thus cyberspace rhetoric), even though in practice, what most users experience is latent racism that depends on users' purposefully offered identifying information or generalized white supremacist posts that are broadcasted for all on the site to see.

      Work Cited:

      Ludemann, D. (2018). /pol/emics: Ambiguity, scales, and digital discourse on 4chan. Discourse, Context & Media, 24, 92-98. doi: 10.1016/j.dcm.2018.01.010

    1. A long term key is as secure as the minimum common denominator of your security practices over its lifetime. It's the weak link.

      Good phrasing of the idea: "you're only as secure as your weakest link".

      You are only as secure as the minimum common denominator of your security practices.

    1. The first is that the presence of surveillance means society cannot experiment with new things without fear of reprisal, and that means those experiments—if found to be inoffensive or even essential to society—cannot slowly become commonplace, moral, and then legal. If surveillance nips that process in the bud, change never happens. All social progress—from ending slavery to fighting for women’s rights—began as ideas that were, quite literally, dangerous to assert. Yet without the ability to safely develop, discuss, and eventually act on those assertions, our society would not have been able to further its democratic values in the way that it has. Consider the decades-long fight for gay rights around the world. Within our lifetimes we have made enormous strides to combat homophobia and increase acceptance of queer folks’ right to marry. Queer relationships slowly progressed from being viewed as immoral and illegal, to being viewed as somewhat moral and tolerated, to finally being accepted as moral and legal. In the end it was the public nature of those activities that eventually slayed the bigoted beast, but the ability to act in private was essential in the beginning for the early experimentation, community building, and organizing. Marijuana legalization is going through the same process: it’s currently sitting between somewhat moral, and—depending on the state or country in question—tolerated and legal. But, again, for this to have happened, someone decades ago had to try pot and realize that it wasn’t really harmful, either to themselves or to those around them. Then it had to become a counterculture, and finally a social and political movement. If pervasive surveillance meant that those early pot smokers would have been arrested for doing something illegal, the movement would have been squashed before inception. Of course the story is more complicated than that, but the ability for members of society to privately smoke weed was essential for putting it on the path to legalization. We don’t yet know which subversive ideas and illegal acts of today will become political causes and positive social change tomorrow, but they’re around. And they require privacy to germinate. Take away that privacy, and we’ll have a much harder time breaking down our inherited moral assumptions.

      One reason privacy is important is because society makes moral progress by experimenting with things on the fringe of what is legal.

      This is reminiscent of the Signal founder's argument that we should want law enforcement not to be 100% effective, because how else are we going to find out that gay sex and marijuana use don't actually hurt anybody?

    1. For both the tailor-customer and doctor-patient examples, personal data is an input used to improve an output (dress, suit, medical treatment) such that the improvement directly serves the interests of the person whose information is being used.

      This reminds me of "Products are functions", where your personal data is a variable that enters into the function to determine the output.

    1. Preserving user privacy is difficult when detecting more nuanced forms of censorship. Some forms of soft censorship might involve intentional performance degradation or content manipulation. Detecting this type of behavior would require comparing performance or content across groups of users, but such a comparison also implies that each user must reveal their browsing history.

      If you want to investigate whether content for a user was manipulated or performance was degraded, there may be no other way but to access detailed logs of their usage. This might raise privacy concerns.

      Not only is personalization difficult to disambiguate from manipulation and censorship, personalization also makes it more costly to compare the personalized experience to some baseline value to determine if manipulation or performance degradation has taken place.

    1. If the EU is set to mandate encryption backdoors to enable law enforcement to pursue bad actors on social media, and at the same time intends to continue to pursue the platforms for alleged bad practices, then entrusting their diplomatic comms to those platforms, while forcing them to have the tools in place to break encryption as needed would seem a bad idea.

      One explanation for the shift away from WhatsApp (which is based on the Signal protocol) is that the EU itself is seeking legislation to force a backdoor into consumer tech.

    1. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight.

      Schneier is saying here that as long as the incentives are still pointing in the direction of short-term profit, privacy will be neglected.

      Surveillance, which allows for targeted advertising, will win out over user privacy. Outrage will be prioritized over more wholesome feelings. Corporate secrecy will allow Facebook to evade regulators and its users.

    2. Increased pressure on Facebook to manage propaganda and hate speech could easily lead to more surveillance. But there is pressure in the other direction as well, as users equate privacy with increased control over how they present themselves on the platform.

      Two forces acting on the big tech platforms.

      One, towards more surveillance, to stop hate and propaganda.

      The other, towards less surveillance, stemming from people wanting more privacy and more control.

    1. At the same time, working through these principles is only the first step in building out a privacy-focused social platform. Beyond that, significant thought needs to go into all of the services we build on top of that foundation -- from how people do payments and financial transactions, to the role of businesses and advertising, to how we can offer a platform for other private services.

      This is what Facebook is really after. They want to build the trust to be able to offer payment services on top of Facebook.

    2. People want to be able to choose which service they use to communicate with people. However, today if you want to message people on Facebook you have to use Messenger, on Instagram you have to use Direct, and on WhatsApp you have to use WhatsApp. We want to give people a choice so they can reach their friends across these networks from whichever app they prefer.We plan to start by making it possible for you to send messages to your contacts using any of our services, and then to extend that interoperability to SMS too. Of course, this would be opt-in and you will be able to keep your accounts separate if you'd like.

      Facebook plans to make messaging interoperable across Instagram, Facebook and Whatsapp. It will be opt-in.

    3. An important part of the solution is to collect less personal data in the first place, which is the way WhatsApp was built from the outset.

      Zuckerberg claims Whatsapp was built with the goal of not collecting much data from the outset.

    4. As we build up large collections of messages and photos over time, they can become a liability as well as an asset. For example, many people who have been on Facebook for a long time have photos from when they were younger that could be embarrassing. But people also really love keeping a record of their lives.

      Large collections of photos are both a liability and an asset. They might be embarrassing but it might also be fun to look back.

    5. We increasingly believe it's important to keep information around for shorter periods of time. People want to know that what they share won't come back to hurt them later, and reducing the length of time their information is stored and accessible will help.

      In addition to a focus on privacy, Zuckerberg underlines a focus on impermanence — appeasing people's fears that their content will come back to haunt them.

    6. I understand that many people don't think Facebook can or would even want to build this kind of privacy-focused platform -- because frankly we don't currently have a strong reputation for building privacy protective services, and we've historically focused on tools for more open sharing.

      Zuckerberg acknowledges that Facebook is not known for its reputation on privacy and has focused on open sharing in the past.

    7. But now, with all the ways people also want to interact privately, there's also an opportunity to build a simpler platform that's focused on privacy first.

      Zuckerberg says there's an opportunity for a platform that is focused on privacy first.

    8. Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication.

      According to Zuckerberg, in 2019 we're seeing private messaging, ephemeral stories and small groups as the fastest growing areas of online communication.

    9. As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today's open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.

      Mark Zuckerberg claims he believes privacy focused communications will become even more important than today's open platforms (like Facebook).

    1. Now let me get back to your question. The FBI presents its conflict with Apple over locked phones as a case as of privacy versus security. Yes, smartphones carry a lot of personal data—photos, texts, email, and the like. But they also carry business and account information; keeping that secure is really important. The problem is that if you make it easier for law enforcement to access a locked device, you also make it easier for a bad actor—a criminal, a hacker, a determined nation-state—to do so as well. And that's why this is a security vs. security issue.

      The debate should not be framed as privacy-vs-security because when you make it easier for law enforcement to access a locked device, you also make it easier for bad actors to do so as well. Thus it is a security-vs-security issue.

    1. The FBI — along with many other law enforcement and surveillance agents — insists that it is possible to make crypto that will protect our devices, transactions, data, communications and lives, but which will fail catastrophically whenever a cop needs it to. When technologists explain that this isn't a thing, the FBI insists that they just aren't nerding hard enough.

      The infosec community has a label for the government's argument that there must be some solution that keeps consumer tech secure while still granting access to government officials: "nerd harder."

    1. Barr makes the point that this is about “consumer cybersecurity” and not “nuclear launch codes.” This is true, but it ignores the huge amount of national security-related communications between those two poles. The same consumer communications and computing devices are used by our lawmakers, CEOs, legislators, law enforcement officers, nuclear power plant operators, election officials and so on. There’s no longer a difference between consumer tech and government tech—it’s all the same tech.

      The US government's defence for wanting to introduce backdoors into consumer encryption is that in doing so they would not be weakening the encryption for, say, nuclear launch codes.

      Schneier holds that this distinction between government and consumer tech no longer exists. Weakening consumer tech amounts to weakening government tech. Therefore it's not worth doing.

    1. The answers for law enforcement, social networks, and medical data won’t be the same. As we move toward greater surveillance, we need to figure out how to get the best of both: how to design systems that make use of our data collectively to benefit society as a whole, while at the same time protecting people individually.

      Each challenge needs to be treated on its own. The trade off to be made will be different for law enforcement vs. social media vs. health officials. We need to figure out "how to design systems that make use of our data collectively to benefit society as a whole, while at the same time protecting people individually."
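
      Differential privacy is one concrete family of techniques for exactly this trade-off (my example, not the article's): publish aggregate statistics with calibrated noise, so the collective signal survives while any single individual's contribution is masked. A minimal sketch:

      ```typescript
      // Laplace mechanism: noise scale = sensitivity / epsilon. For a counting
      // query the sensitivity is 1, since one person changes the count by 1.
      function laplaceNoise(scale: number): number {
        // Inverse-CDF sampling; re-draw to avoid taking log(0).
        let r = Math.random();
        while (r === 0) r = Math.random();
        const u = r - 0.5;
        return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
      }

      function noisyCount(trueCount: number, epsilon: number): number {
        const sensitivity = 1;
        return trueCount + laplaceNoise(sensitivity / epsilon);
      }

      // Example: report how many people visited a clinic this week.
      // Smaller epsilon means stronger privacy and a noisier answer.
      console.log(noisyCount(1042, 0.5));
      ```
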

    1. A future in which privacy would face constant assault was so alien to the framers of the Constitution that it never occurred to them to call out privacy as an explicit right. Privacy was inherent to the nobility of their being and their cause.

      When we wrote our constitutions, it was a time when privacy was a given. It was inconceivable that there would be a threat of being surveilled everywhere we go.

    2. We do nothing wrong when we make love or go to the bathroom. We are not deliberately hiding anything when we seek out private places for reflection or conversation. We keep private journals, sing in the privacy of the shower, and write letters to secret lovers and then burn them. Privacy is a basic human need.

      Privacy is a basic human, psychological need.

    3. Privacy protects us from abuses by those in power, even if we’re doing nothing wrong at the time of surveillance.

      Privacy is what protects us from abuse at the hands of the powerful, even if we're doing nothing wrong at the time we're being surveilled.

    4. My problem with quips like these — as right as they are — is that they accept the premise that privacy is about hiding a wrong. It’s not. Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.

      Common retorts to "If you aren't doing anything wrong, what do you have to hide?" accept the premise that privacy is about having a wrong to hide.

      Bruce Schneier posits that Privacy is an inherent human right, and "a requirement for maintaining the human condition with dignity and respect".

  13. Oct 2020
    1. Similarly, technology can help us control the climate, make AI safe, and improve privacy.

      regulation needs to surround the technology that will help with these things

    1. Legislation to stem the tide of Big Tech companies' abuses, and laws—such as a national consumer privacy bill, an interoperability bill, or a bill making firms liable for data-breaches—would go a long way toward improving the lives of the Internet users held hostage inside the companies' walled gardens. But far more important than fixing Big Tech is fixing the Internet: restoring the kind of dynamism that made tech firms responsive to their users for fear of losing them, restoring the dynamic that let tinkerers, co-ops, and nonprofits give every person the power of technological self-determination.
    1. In fact, these platforms have become inseparable from their data: we use “Facebook” to refer to both the application and the data that drives that application. The result is that nearly every Web app today tries to ask you for more and more data again and again, leading to dangling data on duplicate and inconsistent profiles we can no longer manage. And of course, this comes with significant privacy concerns.
    1. I find it somewhat interesting to note that with 246 public annotations on this page using Hypothes.is, that from what I can tell as of 4/2/2019 only one of them is a simple highlight. All the rest are highlights with an annotation or response of some sort.

      It makes me curious to know what the percentage distribution of these two types is across the platform. Is it the case that in classroom settings, in which many of these annotations appear to have been made, much of the use of the platform dictates more annotations (versus simple highlights) due to the performative nature of the process?

      Is it possible that there are a significant number of highlights which are simply hidden because the platform automatically defaults these to private? Is the friction of making highlights so high that people don't bother?

      I know that Amazon will indicate heavily highlighted passages in e-books as a feature to draw attention to the interest relating to those passages. Perhaps it would be useful/nice if Hypothes.is would do something similar, but make the author of the highlights anonymous? (From a privacy perspective, this may not work well on articles with a small number of annotators, as the presumption could be that the "private" highlights would most likely be directly attributed to those who also made public annotations.)

      Perhaps the better solution is to default highlights to public and provide friction-free UI to make them private?

      A heavily highlighted section by a broad community can be a valuable thing, but surfacing it can be a difficult thing to do.

    1.  recording it all in a Twitter thread that went viral and garnered the hashtag  #PlaneBae.

      I find it interesting that The Atlantic files this story with a URL that includes "/entertainment/" in its path. Culture, certainly, but how are three seemingly random people's lives meant to be classified by such a journalistic source as "entertainment"?

    1. A friend of mine asked if I’d thought through the contradiction of criticizing Blair publicly like this, when she’s another not-quite public figure too.

      Did this really happen? Or is the author inventing it to defuse potential criticism, as she's writing about the same story herself and only helping to propagate it?

      There's definitely a need to write about this issue, so kudos for that. Ella also deftly leaves out the name of the mystery woman, I'm sure on purpose. But she does include enough breadcrumbs to make the rest of the story discoverable, so that one could jump from here to participate in the piling on. I do appreciate that it doesn't appear that she's given Blair any links in the process, which for a story like this is some subtle internet shade.

    2. Even when the attention is positive, it is overwhelming and frightening. Your mind reels at the possibility of what they could find: your address, if your voting records are logged online; your cellphone number, if you accidentally included it on a form somewhere; your unflattering selfies at the beginning of your Facebook photo archive. There are hundreds of Facebook friend requests, press requests from journalists in your Instagram inbox, even people contacting your employer when they can’t reach you directly. This story you didn’t choose becomes the main story of your life. It replaces who you really are as the narrative someone else has written is tattooed onto your skin.
    3. the woman on the plane has deleted her own Instagram account after receiving violent abuse from the army Blair created.

      Feature request: the ability to make one's social media account "disappear" temporarily while a public "attack" like this is happening.

      We need a great name for this. Publicity ghosting? Fame cloaking?

    4. We actively create our public selves, every day, one social media post at a time.
  14. Sep 2020
    1. To defeat facial recognition software, “you would have to wear a mask or disguises,” Tien says. “That doesn’t really scale up for people.”

      Yeah, that sentence was written in 2017 and was especially pertinent to Americans; 2020 has changed things a fair bit.

    1. reminding your students that you value and respect their privacy and their culture.

      This constant reminder will make students feel included and reduce the chances of unintended harm.

    1. The enslaved man is not only constrained; he consents to his constraint.

      …but does the "enslaved" person really consent in full knowledge of the facts?

      One study shows that the more aware people are of what is at work, the more reluctant they are to use services that exploit the data of their private lives.

    1. and as a result, the requirement to use this tracking permission will go into effect early next year.

      Looking forward to the feature

    2. There are clever ways around trackers

      I also recommend switching to Firefox and getting the Facebook Container extension and the Privacy Badger extension!

    3. These creeping changes help us forget how important our privacy is and miss that it’s being eroded.

      This is important: we are normalizing the fact that our privacy is being taken away slowly, update after update.

    1. Turns out, there’s a dedicated “Individual Account Appeal Form” where they ask you a list of privacy-touching mandatory questions, progressively shifting the Overton window
  15. Aug 2020
  16. Jul 2020
    1. a new kind of power

      This is what Shoshana Zuboff sustains in The Age of Surveillance Capitalism: a new kind of power which can, at first, be apprehended through Marx’s lenses; but as a new form of capitalism, it <mark>“cannot be reduced to known harms—monopoly, privacy—and therefore do not easily yield to known forms of combat.”</mark>

      It is <mark>“a new form of capitalism on its own terms and in its own words”</mark> which therefore requires new conceptual frameworks to be understood, negotiated.