312 Matching Annotations
  1. Last 7 days
    1. Chavarria-Miró, G., Anfruns-Estrada, E., Guix, S., Paraira, M., Galofré, B., Sánchez, G., Pintó, R., & Bosch, A. (2020). Sentinel surveillance of SARS-CoV-2 in wastewater anticipates the occurrence of COVID-19 cases. MedRxiv, 2020.06.13.20129627. https://doi.org/10.1101/2020.06.13.20129627

    1. To change incentives so that personal data is treated with appropriate care, we need criminal penalties for the Facebook executives who left vulnerable half a billion people’s personal data, unleashing a lifetime of phishing attacks, and who now point to an FTC deal indemnifying them from liability because our phone numbers and unchangeable dates of birth are “old” data.

      We definitely need penalties and regulation to fix our problems.

  2. Apr 2021
    1. The open RSS standard has provided immense value to the growth of the podcasting ecosystem over the past few decades.

      Why do I get the sinking feeling that the remainder of this article will be maniacally saying, "and all of that ends today!"?

    2. We also believe that in order to democratize audio and achieve Spotify’s mission of enabling a million creators to live off of their art, we must work to enable greater choice for creators. This choice becomes increasingly important as audio becomes even easier to create and share.

      Dear Anchor/Spotify, please remember that democratize DOES NOT equal surveillance capitalism. In fact, Facebook and others have shown that doing what you're probably currently planning for the podcasting space will most likely work against democracy.

    3. Thus, the creative freedom of creators is limited.

      And thus draconian methods for making distribution unnecessarily complicated, siloed, surveillance-capitalized, and over-monetized beyond all comprehension, the kind that would let one or two for-profit companies own the entire market like monopolistic giants, are similarly limited. (But let's just stick with the creators we're pretending to champion, shall we?)

    1. So on a blindingly sunny day in October 2019, I met with Omar Seyal, who runs Pinterest’s core product. I said, in a polite way, that Pinterest had become the bane of my online existence. “We call this the miscarriage problem,” Seyal said, almost as soon as I sat down and cracked open my laptop. I may have flinched. Seyal’s role at Pinterest doesn’t encompass ads, but he attempted to explain why the internet kept showing me wedding content. “I view this as a version of the bias-of-the-majority problem. Most people who start wedding planning are buying expensive things, so there are a lot of expensive ad bids coming in for them. And most people who start wedding planning finish it,” he said. Similarly, most Pinterest users who use the app to search for nursery decor end up using the nursery. When you have a negative experience, you’re part of the minority, Seyal said.

      What a gruesome name for an all-too-frequent internet problem: the "miscarriage problem".

  3. Mar 2021
    1. The scholars Nick Couldry and Ulises Mejias have called it “data colonialism,” a term that reflects our inability to stop our data from being unwittingly extracted.

      I've not run across data colonialism before.

    1. de Oliveira T, Lutucuta S, Nkengasong J, Morais J, Paixao JP, Neto Z, Afonso P, Miranda J, David K, Ingles L, Amilton P A P R R C, Freitas H R, Mufinda F, Tessema K S, Tegally H, San E J, Wilkinson E, Giandhari J, Pillay S, Giovanetti M, Naidoo Y, Katzourakis A, Ghafari M, Singh L, Tshiabuila D, Martin D, Lessells R. (2021) A Novel Variant of Interest of SARS-CoV-2 with Multiple Spike Mutations Detected through Travel Surveillance in Africa. medRxiv. https://www.krisp.org.za/publications.php?pubid=330. Accessed 26 March 2021.

    1. Larremore, D. B., Wilder, B., Lester, E., Shehata, S., Burke, J. M., Hay, J. A., Tambe, M., Mina, M. J., & Parker, R. (2020). Test sensitivity is secondary to frequency and turnaround time for COVID-19 surveillance. MedRxiv, 2020.06.22.20136309. https://doi.org/10.1101/2020.06.22.20136309

  4. Feb 2021
    1. There is only one way to “play” Twitter, and the only real gain is that “No one is learning anything, except to remain connected to the machine.” 

      I wonder whether that's really true. Twitter seems more and more to be becoming the place to build your own media and play your own game. There are multiple ways to play the game, right?

    2. The tech takeover corresponds with shrinking possibilities. This evolution has also seen the rise of a seeming aesthetic paradox. Minimalist design reigns now that the corporations have taken over the net. Long seen as anti-consumerist, Minimalism has now become a coded signal for luxury and control. The less control we have over our virtual spaces, the less time we spend considering our relationships with them. 

      Interesting last sentence. The less control we have of our own, the less say and agency, the less we concern ourselves with the nature of the relationship. That relationship can take different forms.

    1. identity theft

      Saw this while scrolling through quickly. Since I can't meta-highlight another Hypothes.is annotation, I'm copying it here:

      identity theft

      I hate this term. Banks use it to blame the victims for their failure to authenticate people properly. I wish we had another term. —via > mcr314 Aug 29, 2020 (Public) on "How to Destroy ‘Surveillance C…" (onezero.medium.com)

      This is a fantastic observation and something that isn't often noticed. Victim blaming while simultaneously passing the buck is particularly harmful. Corporations should be held to a much higher standard of care. If corporations are treated as people in the legal system, then they should be held to the same standards.

    2. via Cory Doctorow in Pluralistic: 16 Feb 2021 – Pluralistic: Daily links (02/25/2021 12:20:24)

      It's interesting to note that there are already two other people who have used Hypothes.is and its page-note functionality to tag this article as to-read, one with (to read) and another with (TODO-read).

    1. This is just one study, of course, and these are complicated social realities. I think it is fair to say that our pundits and social critics can no longer make the easy assumption that the web and the blogosphere are echo-chamber amplifiers. But whether or not this study proves to be accurate, one thing is certain. The force that enables these unlikely encounters between people of different persuasions, the force that makes the web a space of serendipity and discovery, is precisely the open, combinatorial, connective nature of the medium. So when we choose to take our text out of that medium, when we keep our words from being copied, linked, indexed, that’s a choice with real civic consequences that are not to be taken lightly.

      These words certainly didn't anticipate the narrowing effect that social media algorithms built on surveillance capitalism and attention-seeking clicks and engagement would inflict in the coming decade.

    1. A broad overview of the original web and where we are today. Includes an outline of three business models that don't rely on advertising:

      • Passion projects
      • Donation-based sites
      • Subscription-based sites
  5. Jan 2021
    1. The ad lists various data that WhatsApp doesn’t collect or share. Allaying data collection concerns by listing data not collected is misleading. WhatsApp doesn’t collect hair samples or retinal scans either; not collecting that information doesn’t mean it respects privacy because it doesn’t change the information WhatsApp does collect.

      An important logical point. Listing what they don't keep isn't as good as saying what they actually do with one's data.

    2. Recently, WhatsApp updated its privacy policy to allow sharing data with its parent, Facebook. Users who agreed to use WhatsApp under its previous privacy policy had two options: agree to the new policy or be unable to use WhatsApp again. The WhatsApp privacy policy update is a classic bait-and-switch: WhatsApp lured users in with a sleek interface and the impression of privacy, domesticated them to remove their autonomy to migrate, and then backtracked on its previous commitment to privacy with minimal consequence. Each step in this process enabled the next; had user domestication not taken place, it would be easy for most users to switch away with minimal friction.

      Definitely a dark pattern that has been replicated many times.

  6. Dec 2020
    1. The company’s early mission was to “give people the power to share and make the world more open and connected.” Instead, it took the concept of “community” and sapped it of all moral meaning. The rise of QAnon, for example, is one of the social web’s logical conclusions. That’s because Facebook—along with Google and YouTube—is perfect for amplifying and spreading disinformation at lightning speed to global audiences. Facebook is an agent of government propaganda, targeted harassment, terrorist recruitment, emotional manipulation, and genocide—a world-historic weapon that lives not underground, but in a Disneyland-inspired campus in Menlo Park, California.

      The original goal with a bit of moderation may have worked. Regression to the mean forces it to a bad place, but when you algorithmically accelerate things toward our basest desires, you make it orders of magnitude worse.

      This should be thought of as pure social capitalism. We need the moderating force of government regulation to dampen our worst instincts, much the way the United States' mixed economy works (or at least used to work, as it seems that raw capitalism is destroying the United States too).

    1. Therefore, it could be argued that belief regarding the usefulness of technologies could lead to change and ultimately the actual use of digital technologies in teaching and learning.

      This goes both ways. A teacher who believes that their job is to control access to specialised information, and to control assessment, may use technology to close down learning opportunities (e.g. by banning the use of Wikipedia, YouTube, etc.) and even insist on the installation of surveillance (proctoring) software on students' personal computers.

      Again, you can argue that technology in itself doesn't make the difference.

    1. Recent patent filings show that Microsoft has been exploring additional ideas to monitor workers in the interest of organizational productivity. One filing describes a “meeting insight computing system” that would generate a quality score for a meeting using data such as body language, facial expressions, room temperature, time of day, and number of people in the meeting.

      So this will require that you have video turned on. How will they sell this to employees? "You need to turn your video on so that the algorithm can generate an accurate meeting quality score using your body language and facial expression."

      Sounds perfect. Absolutely no concerns about privacy violations, etc. in this product.

    2. Microsoft says it will make changes in its new Productivity Score feature, including removing the ability for companies to see data about individual users, to address concerns from privacy experts that the tech giant had effectively rolled out a new tool for snooping on workers.

      It's great that MS has reacted so quickly to the outcry around the privacy of workers.

      I thought it would be super-interesting to see how academics might have responded to the idea of institutional administrators keeping tabs on the number of hours that they'd spent in meetings (via Teams), composing and reading emails (via Outlook), writing articles (via Word), and so on.

      And yet these would be the same academics who do this kind of monitoring of student work.

  7. Nov 2020
    1. Online Exams & Proctoring (In Addition to Guidance Listed Above)

      Requiring students to turn on their camera to be watched or recorded at home during an exam poses significant privacy concerns and should not be undertaken lightly. Several proctoring services use machine learning, AI, eye-tracking, key-logging, and other technologies to detect potential cheating; these should be used only when no feasible alternatives exist. If instructors are using a proctoring service during the COVID-19 measures, they must provide explicit notice to the students before the exam. Instructors are encouraged to work with the Digital Learning Hub in the Commons and the Academic Integrity Office to consider privacy-protective options, including how to use question banks (in Canvas), that will uphold integrity and good assessment design. Proctors and instructors are strongly discouraged from requiring students to show their surroundings on camera.

      Computers are available in labs for students who do not have a computer to take their final exams. Finals CANNOT be held in a lab, that is, instructors cannot be present nor can students from a specific class be asked to gather there for a final. This is only for those students who need a computer to drop in and complete their exam.
    1. surveillance capitalism.

      I recommend linking to the book in which its author, Shoshana Zuboff, coined the term.

      The irony right now is that you're linking to an Amazon version of Zuboff's book; Amazon is currently one of the top-five surveillance-capitalist companies in the tech world.

      I would also consider linking to the Wikipedia page for the term.

    1. unblinking

      I’m struck by the image of the software constantly watching students that’s conjured by “unblinking”. I also can’t help but think of HAL 9000.

    1. In another interview, this time by John Laidler of the Harvard Gazette (March 2019), Zuboff expanded on this: I define surveillance capitalism as the unilateral claiming of private human experience as free raw material for translation into behavioral data. These data are then computed and packaged as prediction products and sold into behavioral futures markets

      Zuboff's definition of Surveillance Capitalism

    1. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight.

      Schneier is saying here that as long as the incentives are still pointing in the direction of short-term profit, privacy will be neglected.

      Surveillance, which allows for targeted advertising, will win out over user privacy. Outrage will be prioritized over more wholesome feelings. Corporate secrecy will allow Facebook to evade regulators and its users.

  8. Oct 2020
    1. It would allow end users to determine their own tolerances for different types of speech but make it much easier for most people to avoid the most problematic speech, without silencing anyone entirely or having the platforms themselves make the decisions about who is allowed to speak.

      But platforms are making huge decisions about who is allowed to speak. While they're generally allowing everyone to have a voice, they're also very subtly privileging many voices over others. While they're providing space for even the least among us to have a voice, they're making far too many of the worst and most powerful among us logarithmically louder.

      It's not immediately obvious, but their algorithms are plainly handing massive megaphones to people whom society broadly thinks shouldn't have a voice at all. These megaphones take the form of algorithmic amplification of fringe ideas, accelerating them into the broader public discourse so that the platforms can get more engagement, and therefore more eyeballs, for their advertising and surveillance-capitalism ends.

      The issue we ought to be looking at is the dynamic range between people and the messages they're able to send through social platforms.

      We could also analogize this to the voting situation in the United States. When we make it harder for the poor, the disabled, the differently abled, or the marginalized to vote while simultaneously giving the uber-rich outsized influence because of what they're able to buy, we're imposing the same sorts of problems. Social media is just able to do this at an even larger scale and magnify the effects to make the harms more obvious.

      If I follow 5,000 people on social media and one of them is a racist-policy-supporting, white nationalist president, those messages will get drowned out because I can only consume so much content. But when the algorithm consistently pushes that content to the top of my feed and attention, it is only going to accelerate it and create more harm. If I get a linear presentation of the content, then I'd have to actively search that content out for it to cause me that sort of harm.
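      Purely as a toy sketch of that difference (all account names and engagement numbers are invented, not any platform's real ranking code), compare a linear feed with an engagement-ranked one:

      ```python
      # Toy comparison (invented numbers): chronological feed vs. engagement-ranked feed.
      import random

      random.seed(1)
      feed = [{"author": f"user_{i}", "engagement": random.randint(0, 50)} for i in range(4_999)]
      feed.append({"author": "loud_account", "engagement": 500_000})  # outrage drives clicks
      random.shuffle(feed)  # chronological order: whenever people happened to post

      chronological_top = feed[:20]  # the 20 posts you actually have time to read
      ranked_top = sorted(feed, key=lambda p: p["engagement"], reverse=True)[:20]

      # In a linear feed the loud account is 1 post in 5,000, so it almost never
      # makes your 20; ranked by engagement it is the very first thing you see.
      print(sum(p["author"] == "loud_account" for p in chronological_top))  # almost certainly 0
      print(ranked_top[0]["author"])                                        # loud_account
      ```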

    1. The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

      This is a great summation of the issue.

    1. Legislation to stem the tide of Big Tech companies' abuses, and laws—such as a national consumer privacy bill, an interoperability bill, or a bill making firms liable for data-breaches—would go a long way toward improving the lives of the Internet users held hostage inside the companies' walled gardens. But far more important than fixing Big Tech is fixing the Internet: restoring the kind of dynamism that made tech firms responsive to their users for fear of losing them, restoring the dynamic that let tinkerers, co-ops, and nonprofits give every person the power of technological self-determination.
  9. Sep 2020
    1. To defeat facial recognition software, “you would have to wear a mask or disguises,” Tien says. “That doesn’t really scale up for people.”

      Yeah, that sentence was written in 2017 and was especially pertinent to Americans. 2020 has changed things a fair bit.

    1. we are all subjects of unknown and unseen processes

      This is currently the case: there is an unprecedented asymmetry (<cite>Les deux textes</cite>) of knowledge and of the means for mass manipulation/redirection of behavior, thanks to the big data flowing from online mass surveillance.

      • for a general-audience introduction, see <cite>The Social Dilemma</cite>, recently released on Netflix.
      • for a more substantial treatment, see Shoshana Zuboff, <cite>The Age of Surveillance Capitalism</cite>.
    1. Facebook ignored or was slow to act on evidence that fake accounts on its platform have been undermining elections and political affairs around the world, according to an explosive memo sent by a recently fired Facebook employee and obtained by BuzzFeed News. The 6,600-word memo, written by former Facebook data scientist Sophie Zhang, is filled with concrete examples of heads of government and political parties in Azerbaijan and Honduras using fake accounts or misrepresenting themselves to sway public opinion. In countries including India, Ukraine, Spain, Brazil, Bolivia, and Ecuador, she found evidence of coordinated campaigns of varying sizes to boost or hinder political candidates or outcomes, though she did not always conclude who was behind them.
  10. Aug 2020
    1. The mass surveillance and factory farming of human beings on a global scale is the business model of people farmers like Facebook and Google. It is the primary driver of the socioeconomic system we call surveillance capitalism.
    1. Facebook has apologized to its users and advertisers for being forced to respect people’s privacy in an upcoming update to Apple’s mobile operating system – and promised it will do its best to invade their privacy on other platforms.

      Sometimes I forget how funny The Register can be. This is terrific.

    1. Facebook is warning developers that privacy changes in an upcoming iOS update will severely curtail its ability to track users' activity across the entire Internet and app ecosystem and prevent the social media platform from serving targeted ads to users inside other, non-Facebook apps on iPhones.

      I fail to see anything bad about this.

    1. Vogels, C. B. F., Brackney, D., Wang, J., Kalinich, C. C., Ott, I., Kudo, E., Lu, P., Venkataraman, A., Tokuyama, M., Moore, A. J., Muenker, M. C., Casanovas-Massana, A., Fournier, J., Bermejo, S., Campbell, M., Datta, R., Nelson, A., Team, Y. I. R., Cruz, C. D., … Grubaugh, N. (2020). SalivaDirect: Simple and sensitive molecular diagnostic test for SARS-CoV-2 surveillance. MedRxiv, 2020.08.03.20167791. https://doi.org/10.1101/2020.08.03.20167791

  11. Jul 2020
    1. But the business model that we now call surveillance capitalism put paid to that, which is why you should never post anything on Facebook without being prepared to face the algorithmic consequences.

      I'm reminded a bit of the season 3 episode of Breaking Bad where Jesse Pinkman invites his drug dealing pals to a Narcotics Anonymous-type meeting so that they can target their meth sales. Fortunately, the two lowlifes had more morality and compassion than Facebook can manage.

      https://www.youtube.com/watch?v=20kpzC3sckQ

    1. a new kind of power

      This is what Shoshana Zuboff argues in The Age of Surveillance Capitalism: a new kind of power which can, at first, be apprehended through Marx’s lens; but as a new form of capitalism, it <mark>“cannot be reduced to known harms—monopoly, privacy—and therefore do not easily yield to known forms of combat.”</mark>

      It is <mark>“a new form of capitalism on its own terms and in its own words”</mark> which therefore requires new conceptual frameworks to be understood, negotiated.

    2. One of these semiotizing processes is the extraction, interpretation and reintegration of web data from and into human subjectivities.

      Machine automation becomes another “subjectivity” or “agentivity”—an influential one, because it is the one filtering and pushing content to humans.

      The means of this automated subjectivity is feeding data capitalism: more content, more interaction, more behavioral data produced by the users—data which is then captured (“dispossessed”), extracted, and transformed into prediction services, which render human behavior predictable, and therefore monetizable (Shoshana Zuboff, The Age of Surveillance Capitalism, 2019).

    1. Varatharaj, A., Thomas, N., Ellul, M. A., Davies, N. W. S., Pollak, T. A., Tenorio, E. L., Sultan, M., Easton, A., Breen, G., Zandi, M., Coles, J. P., Manji, H., Al-Shahi Salman, R., Menon, D. K., Nicholson, T. R., Benjamin, L. A., Carson, A., Smith, C., Turner, M. R., … Plant, G. (2020). Neurological and neuropsychiatric complications of COVID-19 in 153 patients: A UK-wide surveillance study. The Lancet Psychiatry. https://doi.org/10.1016/S2215-0366(20)30287-X

  12. Jun 2020
    1. Starr, T. N., Greaney, A. J., Hilton, S. K., Crawford, K. H., Navarro, M. J., Bowen, J. E., Tortorici, M. A., Walls, A. C., Veesler, D., & Bloom, J. D. (2020). Deep mutational scanning of SARS-CoV-2 receptor binding domain reveals constraints on folding and ACE2 binding [Preprint]. Microbiology. https://doi.org/10.1101/2020.06.17.157982

    1. One of the new tools debuted by Facebook allows administrators to remove and block certain trending topics among employees. The presentation discussed the “benefits” of “content control.” And it offered one example of a topic employers might find it useful to blacklist: the word “unionize.”

      Imagine your employer looking over your shoulder constantly.

      Imagine that you're surveilled not only in regard to what you produce, but to what you—if you're an office worker—tap out in chats to colleagues.

      This is what Facebook does, and it's not very different from what China has created with its Social Credit System.

      This is Orwellian.

    1. There were also underlying security issues. Most of the messaging apps Tor Messenger supported are based on client-server architectures, and those can leak metadata (such as who's involved in a conversation and when) that might reveal who your friends are. There was no real way for the Tor crew to mitigate these issues.
    2. Tor suggests CoyIM, but it's prone to the same metadata issues as Messenger. You may have to accept that a small amount of chat data could find its way into the wrong hands, even if the actual conversations are locked down tight.
    1. Of course, with Facebook being Facebook, there is another, more commercial outlet for this type of metadata analysis. If the platform knows who you are, and knows what you do based on its multi-faceted internet tracking tools, then knowing who you talk to and when could be a commercial goldmine. Person A just purchased Object 1 and then chatted to Person B. Try to sell Object 1 to Person B. All of which can be done without any messaging content being accessed.
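      A minimal sketch of the metadata-only inference described above, using invented purchase and chat-event records (an illustration of the idea, not Facebook's actual pipeline):

      ```python
      # Infer ad targets purely from metadata: who bought what, and who talked to whom.
      # No message content is ever touched.
      from collections import defaultdict

      purchases = [("person_a", "object_1")]                     # (buyer, product)
      chats = [("person_a", "person_b", "2020-06-10T14:05:00")]  # (sender, recipient, timestamp)

      def suggest_targets(purchases, chats):
          """Recommend each buyer's purchases to the people they subsequently chat with."""
          bought_by = defaultdict(set)
          for buyer, product in purchases:
              bought_by[buyer].add(product)

          targets = defaultdict(set)
          for sender, recipient, _when in chats:
              for product in bought_by.get(sender, ()):
                  targets[recipient].add(product)
          return dict(targets)

      print(suggest_targets(purchases, chats))  # {'person_b': {'object_1'}}
      ```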
  13. May 2020
    1. What is more frightening than being merely watched, though, is being controlled. When Facebook can know us better than our parents with only 150 likes, and better than our spouses with 300 likes, the world appears quite predictable, both for governments and for businesses. And predictability means control.

      "Predictability means control"

    1. Chu, H. Y., Englund, J. A., Starita, L. M., Famulare, M., Brandstetter, E., Nickerson, D. A., Rieder, M. J., Adler, A., Lacombe, K., Kim, A. E., Graham, C., Logue, J., Wolf, C. R., Heimonen, J., McCulloch, D. J., Han, P. D., Sibley, T. R., Lee, J., Ilcisin, M., … Bedford, T. (2020). Early Detection of Covid-19 through a Citywide Pandemic Surveillance Platform. New England Journal of Medicine, NEJMc2008646. https://doi.org/10.1056/NEJMc2008646
