289 Matching Annotations
  1. Dec 2020
    1. The company’s early mission was to “give people the power to share and make the world more open and connected.” Instead, it took the concept of “community” and sapped it of all moral meaning. The rise of QAnon, for example, is one of the social web’s logical conclusions. That’s because Facebook—along with Google and YouTube—is perfect for amplifying and spreading disinformation at lightning speed to global audiences. Facebook is an agent of government propaganda, targeted harassment, terrorist recruitment, emotional manipulation, and genocide—a world-historic weapon that lives not underground, but in a Disneyland-inspired campus in Menlo Park, California.

      The original goal, with a bit of moderation, may have worked. Regression to the mean forces it to a bad place, but when you algorithmically accelerate things toward our basest desires, you make it orders of magnitude worse.

      This should be thought of as pure social capitalism. We need the moderating force of government regulation to dampen our worst instincts, much the way the United States' mixed economy works (or at least used to work, as it seems that raw capitalism is destroying the United States too).

    1. Therefore, it could be argued that belief regarding the usefulness of technologies could lead to change and ultimately the actual use of digital technologies in teaching and learning.

      This goes both ways. A teacher who believes that their job is to control access to specialised information, and to control assessment may use technology to close down learning opportunities (e.g. by banning the use of Wikipedia, YouTube, etc.) and even insisting on the installation of surveillance (proctoring) software on students' personal computers.

      Again, you can argue that technology in itself doesn't make the difference.

    1. Recent patent filings show that Microsoft has been exploring additional ideas to monitor workers in the interest of organizational productivity. One filing describes a “meeting insight computing system” that would generate a quality score for a meeting using data such as body language, facial expressions, room temperature, time of day, and number of people in the meeting.

      So this will require you to have video turned on. How will they sell this to employees? "You need to turn your video on so that the algorithm can generate an accurate meeting quality score using your body language and facial expression."

      Sounds perfect. Absolutely no concerns about privacy violations, etc. in this product.

    2. Microsoft says it will make changes in its new Productivity Score feature, including removing the ability for companies to see data about individual users, to address concerns from privacy experts that the tech giant had effectively rolled out a new tool for snooping on workers.

      It's great that MS has reacted so quickly to the outcry around the privacy of workers.

      I thought it would be super-interesting to see how academics might have responded to the idea of institutional administrators keeping tabs on the number of hours that they'd spent in meetings (via Teams), composing and reading emails (via Outlook), writing articles (via Word), and so on.

      And yet these would be the same academics who do this kind of monitoring of student work.

  2. Nov 2020
    1. Online Exams & Proctoring (In Addition to Guidance Listed Above) Requiring students to turn on their camera to be watched or recorded at home during an exam poses significant privacy concerns and should not be undertaken lightly. Several proctoring services use machine learning, AI, eye-tracking, key-logging, and other technologies to detect potential cheating; these should be used only when no feasible alternatives exist. If instructors are using a proctoring service during the COVID-19 measures, they must provide explicit notice to the students before the exam. Instructors are encouraged to work with the Digital Learning Hub in the Commons and the Academic Integrity Office to consider privacy-protective options, including how to use question banks (in Canvas), that will uphold integrity and good assessment design. Proctors and instructors are strongly discouraged from requiring students to show their surroundings on camera. Computers are available in labs for students who do not have a computer to take their final exams. Finals CANNOT be held in a lab, that is, instructors cannot be present nor can students from a specific class be asked to gather there for a final. This is only for those students who need a computer to drop in and complete their exam.
    1. surveillance capitalism.

      I recommend to link to the book where its author, Shoshana Zuboff, has coined the term.

      The irony right now is that you're linking to an Amazon version of Zuboff's book; Amazon is currently one of the top-five surveillance-capitalist companies in the tech world.

      I would also consider linking to the Wikipedia page for the term.

    1. unblinking

      I’m struck by the image of the software constantly watching students that’s conjured by “unblinking”. I also can’t help but think of HAL 9000.

    1. In another interview, this time by John Laidler of the Harvard Gazette (March 2019), Zuboff expanded on this: I define surveillance capitalism as the unilateral claiming of private human experience as free raw material for translation into behavioral data. These data are then computed and packaged as prediction products and sold into behavioral futures markets

      Zuboff's definition of Surveillance Capitalism

    1. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight.

      Schneier is saying here that as long as the incentives are still pointing in the direction of short-term profit, privacy will be neglected.

      Surveillance, which allows for targeted advertising, will win out over user privacy. Outrage will be prioritized over more wholesome feelings. Corporate secrecy will allow Facebook to evade regulators and its users.

  3. Oct 2020
    1. It would allow end users to determine their own tolerances for different types of speech but make it much easier for most people to avoid the most problematic speech, without silencing anyone entirely or having the platforms themselves make the decisions about who is allowed to speak.

      But platforms are making huge decisions about who is allowed to speak. While they're generally allowing everyone to have a voice, they're also very subtly privileging many voices over others. While they're providing space for even the least among us to have a voice, they're making far too many of the worst and most powerful among us logarithmically louder.

      It may not be broadly obvious, but their algorithms are handing massive megaphones to people whom society broadly thinks shouldn't have a voice at all. These megaphones come in the algorithmic amplification of fringe ideas, which accelerates them into the broader public discourse so that these platforms get more engagement and therefore more eyeballs for their advertising and surveillance-capitalism ends.

      The issue we ought to be looking at is the dynamic range between people and the messages they're able to send through social platforms.

      We could also analogize this to the voting situation in the United States. When we disadvantage the poor, disabled, differently abled, or marginalized people from voting while simultaneously giving the uber-rich outsized influence because of what they're able to buy, we're imposing the same sorts of problems. Social media is just able to do this at an even larger scale and magnify the effects to make their harms more obvious.

      If I follow 5,000 people on social media and one of them is a racist-policy-supporting, white nationalist president, those messages will get drowned out because I can only consume so much content. But when the algorithm consistently pushes that content to the top of my feed and attention, it is only going to accelerate it and create more harm. If I get a linear presentation of the content, then I'd have to actively search that content out for it to cause me that sort of harm.
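
      The contrast drawn above between a linear (chronological) feed and an engagement-ranked one can be sketched as a toy ranking function. All post data, field names, and weights here are hypothetical illustrations, not any platform's actual algorithm:

```python
# Toy comparison of a chronological feed vs. an engagement-ranked feed.
# Weights and post data are entirely hypothetical.

def chronological_feed(posts):
    """Show newest posts first, regardless of engagement."""
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

def engagement_ranked_feed(posts):
    """Push high-engagement (often outrage-driven) posts to the top."""
    def score(p):
        # Hypothetical weights: shares and comments count more than likes.
        return 1.0 * p["likes"] + 3.0 * p["comments"] + 5.0 * p["shares"]
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "calm",    "timestamp": 3, "likes": 40, "comments": 2,  "shares": 1},
    {"id": "outrage", "timestamp": 1, "likes": 90, "comments": 60, "shares": 50},
    {"id": "news",    "timestamp": 2, "likes": 10, "comments": 5,  "shares": 2},
]

print([p["id"] for p in chronological_feed(posts)])      # newest post first
print([p["id"] for p in engagement_ranked_feed(posts)])  # highest-engagement post first
```

      In the linear feed, the oldest high-outrage post sinks naturally; in the ranked feed, it jumps to the top of attention no matter when it was posted, which is the acceleration described above.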

    1. The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

      This is a great summation of the issue.

    1. Legislation to stem the tide of Big Tech companies' abuses, and laws—such as a national consumer privacy bill, an interoperability bill, or a bill making firms liable for data-breaches—would go a long way toward improving the lives of the Internet users held hostage inside the companies' walled gardens. But far more important than fixing Big Tech is fixing the Internet: restoring the kind of dynamism that made tech firms responsive to their users for fear of losing them, restoring the dynamic that let tinkerers, co-ops, and nonprofits give every person the power of technological self-determination.
  4. Sep 2020
    1. To defeat facial recognition software, “you would have to wear a mask or disguises,” Tien says. “That doesn’t really scale up for people.”

      Yeah, that sentence was written in 2017 and especially pertinent to Americans. 2020 has changed things a fair bit.

    1. we are all subjects of unknown and unseen processes

      This is currently the case: there is an unprecedented asymmetry (<cite>Les deux textes</cite>) in knowledge and in the means of mass manipulation/redirection of behaviour, thanks to the big data that flows from mass online surveillance.

      • for a general-audience introduction, see <cite>The Social Dilemma</cite>, recently released on Netflix.
      • for a more thorough treatment, see Shoshana Zuboff, <cite>The Age of Surveillance Capitalism</cite>.
    1. Facebook ignored or was slow to act on evidence that fake accounts on its platform have been undermining elections and political affairs around the world, according to an explosive memo sent by a recently fired Facebook employee and obtained by BuzzFeed News. The 6,600-word memo, written by former Facebook data scientist Sophie Zhang, is filled with concrete examples of heads of government and political parties in Azerbaijan and Honduras using fake accounts or misrepresenting themselves to sway public opinion. In countries including India, Ukraine, Spain, Brazil, Bolivia, and Ecuador, she found evidence of coordinated campaigns of varying sizes to boost or hinder political candidates or outcomes, though she did not always conclude who was behind them.
    1. Larremore, D. B., Wilder, B., Lester, E., Shehata, S., Burke, J. M., Hay, J. A., Tambe, M., Mina, M. J., & Parker, R. (2020). Test sensitivity is secondary to frequency and turnaround time for COVID-19 surveillance. MedRxiv, 2020.06.22.20136309. https://doi.org/10.1101/2020.06.22.20136309

  5. Aug 2020
    1. The mass surveillance and factory farming of human beings on a global scale is the business model of people farmers like Facebook and Google. It is the primary driver of the socioeconomic system we call surveillance capitalism.
    1. Facebook has apologized to its users and advertisers for being forced to respect people’s privacy in an upcoming update to Apple’s mobile operating system – and promised it will do its best to invade their privacy on other platforms.

      Sometimes I forget how funny The Register can be. This is terrific.

    1. Facebook is warning developers that privacy changes in an upcoming iOS update will severely curtail its ability to track users' activity across the entire Internet and app ecosystem and prevent the social media platform from serving targeted ads to users inside other, non-Facebook apps on iPhones.

      I fail to see anything bad about this.

    1. Vogels, C. B. F., Brackney, D., Wang, J., Kalinich, C. C., Ott, I., Kudo, E., Lu, P., Venkataraman, A., Tokuyama, M., Moore, A. J., Muenker, M. C., Casanovas-Massana, A., Fournier, J., Bermejo, S., Campbell, M., Datta, R., Nelson, A., Team, Y. I. R., Cruz, C. D., … Grubaugh, N. (2020). SalivaDirect: Simple and sensitive molecular diagnostic test for SARS-CoV-2 surveillance. MedRxiv, 2020.08.03.20167791. https://doi.org/10.1101/2020.08.03.20167791

  6. Jul 2020
    1. But the business model that we now call surveillance capitalism put paid to that, which is why you should never post anything on Facebook without being prepared to face the algorithmic consequences.

      I'm reminded a bit of the season 3 episode of Breaking Bad where Jesse Pinkman invites his drug-dealing pals to a Narcotics Anonymous-type meeting so that they can target their meth sales. Fortunately, the two lowlifes had more morality and compassion than Facebook can manage.

      https://www.youtube.com/watch?v=20kpzC3sckQ

    1. a new kind of power

      This is what Shoshana Zuboff argues in The Age of Surveillance Capitalism: a new kind of power which can, at first, be apprehended through Marx’s lenses; but as a new form of capitalism, it <mark>“cannot be reduced to known harms—monopoly, privacy—and therefore do not easily yield to known forms of combat.”</mark>

      It is <mark>“a new form of capitalism on its own terms and in its own words”</mark> which therefore requires new conceptual frameworks to be understood, negotiated.

    2. One of these semiotizing processes is the extraction, interpretation and reintegration of web data from and into human subjectivities.

      Machine automation becomes another “subjectivity” or “agentivity”—an influential one, because it is the one filtering and pushing content to humans.

      The means of this automated subjectivity feed data capitalism: more content, more interaction, more behavioral data produced by the users—data which is then captured (“dispossessed”), extracted, and transformed into prediction services, which render human behavior predictable, and therefore monetizable (Shoshana Zuboff, The Age of Surveillance Capitalism, 2019).

    1. Varatharaj, A., Thomas, N., Ellul, M. A., Davies, N. W. S., Pollak, T. A., Tenorio, E. L., Sultan, M., Easton, A., Breen, G., Zandi, M., Coles, J. P., Manji, H., Al-Shahi Salman, R., Menon, D. K., Nicholson, T. R., Benjamin, L. A., Carson, A., Smith, C., Turner, M. R., … Plant, G. (2020). Neurological and neuropsychiatric complications of COVID-19 in 153 patients: A UK-wide surveillance study. The Lancet Psychiatry. https://doi.org/10.1016/S2215-0366(20)30287-X

  7. Jun 2020
    1. Starr, T. N., Greaney, A. J., Hilton, S. K., Crawford, K. H., Navarro, M. J., Bowen, J. E., Tortorici, M. A., Walls, A. C., Veesler, D., & Bloom, J. D. (2020). Deep mutational scanning of SARS-CoV-2 receptor binding domain reveals constraints on folding and ACE2 binding [Preprint]. Microbiology. https://doi.org/10.1101/2020.06.17.157982

    1. One of the new tools debuted by Facebook allows administrators to remove and block certain trending topics among employees. The presentation discussed the “benefits” of “content control.” And it offered one example of a topic employers might find it useful to blacklist: the word “unionize.”

      Imagine your employer looking over your shoulder constantly.

      Imagine that you're surveilled not only in regard to what you produce, but also in what you—if you're an office worker—tap out in chats to colleagues.

      This is what Facebook does, and it's not very different from what China has created with its Social Credit System.

      This is Orwellian.

    1. There were also underlying security issues. Most of the messaging apps Tor Messenger supported are based on client-server architectures, and those can leak metadata (such as who's involved in a conversation and when) that might reveal who your friends are. There was no real way for the Tor crew to mitigate these issues.
    2. Tor suggests CoyIM, but it's prone to the same metadata issues as Messenger. You may have to accept that a small amount of chat data could find its way into the wrong hands, even if the actual conversations are locked down tight.
    1. Of course, with Facebook being Facebook, there is another, more commercial outlet for this type of metadata analysis. If the platform knows who you are, and knows what you do based on its multi-faceted internet tracking tools, then knowing who you talk to and when could be a commercial goldmine. Person A just purchased Object 1 and then chatted to Person B. Try to sell Object 1 to Person B. All of which can be done without any messaging content being accessed.
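
      The purchase-to-chat inference described in the passage needs no message content at all; metadata alone is enough. A minimal sketch, with entirely hypothetical names and data:

```python
# Toy metadata-only ad targeting: infer prospects purely from who bought
# what and who chatted with whom afterwards. No message content is read.

purchases = {"alice": ["camera"]}             # person -> objects bought
chats = [("alice", "bob"), ("bob", "carol")]  # (sender, recipient) metadata only

def suggest_targets(purchases, chats):
    """For each purchase, target everyone the buyer chatted with."""
    targets = []
    for buyer, items in purchases.items():
        contacts = [b for a, b in chats if a == buyer]
        for item in items:
            for contact in contacts:
                targets.append((contact, item))
    return targets

print(suggest_targets(purchases, chats))  # [('bob', 'camera')]
```

      The point of the sketch is that the conversation between alice and bob stays "private" in content terms, yet the platform still extracts commercial value from the mere fact that it happened.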
  8. May 2020
    1. What is more frightening than being merely watched, though, is being controlled. When Facebook can know us better than our parents with only 150 likes, and better than our spouses with 300 likes, the world appears quite predictable, both for governments and for businesses. And predictability means control.

      "Predictability means control"

    1. Chu, H. Y., Englund, J. A., Starita, L. M., Famulare, M., Brandstetter, E., Nickerson, D. A., Rieder, M. J., Adler, A., Lacombe, K., Kim, A. E., Graham, C., Logue, J., Wolf, C. R., Heimonen, J., McCulloch, D. J., Han, P. D., Sibley, T. R., Lee, J., Ilcisin, M., … Bedford, T. (2020). Early Detection of Covid-19 through a Citywide Pandemic Surveillance Platform. New England Journal of Medicine, NEJMc2008646. https://doi.org/10.1056/NEJMc2008646

  9. Apr 2020
    1. Edward Snowden disclosed in 2013 that the US government's Upstream program was collecting data on people reading Wikipedia articles. This revelation had a significant impact on the self-censorship of readers, as shown by the fact that there were substantially fewer views for articles related to terrorism and security.[12] The court case Wikimedia Foundation v. NSA has since followed.
    1. Google's move to release location data highlights concerns around privacy. According to Mark Skilton, director of the Artificial Intelligence Innovation Network at Warwick Business School in the UK, Google's decision to use public data "raises a key conflict between the need for mass surveillance to effectively combat the spread of coronavirus and the issues of confidentiality, privacy, and consent concerning any data obtained."
    1. Thousands of enterprises around the world have done exhaustive security reviews of our user, network, and data center layers and confidently selected Zoom for complete deployment. 

      This doesn't really account for the fact that Zoom has committed some atrociously heinous acts, such as (but not limited to):

  10. Mar 2020
    1. This is known as transport encryption, which is different from end-to-end encryption because the Zoom service itself can access the unencrypted video and audio content of Zoom meetings. So when you have a Zoom meeting, the video and audio content will stay private from anyone spying on your Wi-Fi, but it won’t stay private from the company.
    2. But despite this misleading marketing, the service actually does not support end-to-end encryption for video and audio content, at least as the term is commonly understood. Instead it offers what is usually called transport encryption, explained further below
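
      The distinction between transport encryption and end-to-end encryption comes down to who holds the decryption key. A toy model of this, using a deliberately insecure XOR "cipher" purely to illustrate key possession (not real cryptography, and not Zoom's actual protocol):

```python
# Toy illustration of transport encryption vs. end-to-end encryption.
# The "cipher" is a trivial XOR, used only to show who holds the key.

def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"meeting audio"

# Transport encryption: content is encrypted to the *service's* key, so
# eavesdroppers on the network see ciphertext, but the service can decrypt.
server_key = b"srv"
in_transit = xor_crypt(message, server_key)
seen_by_service = xor_crypt(in_transit, server_key)  # service decrypts it

# End-to-end encryption: only the meeting participants hold the key; the
# service relays ciphertext it cannot decrypt.
participants_key = b"e2e"
e2e_blob = xor_crypt(message, participants_key)
server_attempt = xor_crypt(e2e_blob, server_key)     # wrong key -> garbage

print(seen_by_service == message)  # transport encryption: service reads content
print(server_attempt == message)   # e2e: service stays blind
```

      With transport encryption the Wi-Fi snooper is defeated but the company is not, which is exactly the gap the quoted passage describes.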
    1. How can you tell whether a pupil is doing the work on their own? Pedagogical continuity is meant to ensure that pupils carry on with school activities that allow them to progress in their learning. The aim is to draw pupils' attention to the importance and regularity of personal work, whatever the activity, even if it is done with the help of a peer or a third party. Regular work, regularly assessed, contributes to this. However, the teacher can neither monitor diligence in this context nor penalize any lack of it.
    1. And if people were really cool about sharing their personal and private information with anyone, and totally fine about being tracked everywhere they go and having a record kept of all the people they know and have relationships with, why would the ad tech industry need to spy on them in the first place? They could just ask up front for all your passwords.
    2. The deception enabled by dark pattern design not only erodes privacy but has the chilling effect of putting web users under pervasive, clandestine surveillance, it also risks enabling damaging discrimination at scale.
    1. According to the Swedish Police Authority's guidelines, an impact assessment must be carried out before new police tools are introduced, if they involve sensitive processing of personal data. No such assessment was made for the tool in question.

      Swedish police have used Clearview AI without any impact assessment having been performed.

      In other words, Swedish police have used a facial-recognition system without being allowed to do so.

      This is a clear breach of human rights.

      Swedish police have lied about this, as reported by Dagens Nyheter.

    1. Mastercard acquired NuData Security in 2017 and it has been making advances in biometric identification.
    2. The payment provider told MarketWatch that everyone has a unique walk, and it is investigating innovative behavioral biometrics such as gait, face, heartbeat and veins for cutting edge payment systems of the future.

      This is a true invasion into people's lives.

      Remember: this is a credit-card company. We use them to pay for stuff. They shouldn't know what we look like, how we walk, how our hearts beat, nor what our veins look like.

  11. Feb 2020
    1. Last year, Facebook said it would stop listening to voice notes in messenger to improve its speech recognition technology. Now, the company is starting a new program where it will explicitly ask you to submit your recordings, and earn money in return.

      Given Facebook's history of breaking laws that end up with them paying billions of USD in damages (even though that's a joke to them), selling ads to advertisers who explicitly wanted to target people who hate Jews, and spending millions of USD every year solely on lobbying, don't sell your personal experiences and behaviours to them.

      Facebook is nefarious and psychopathic.

      I suspect that Wacom doesn’t really think that it’s acceptable to record the name of every application I open on my personal laptop. I suspect that this is why their privacy policy doesn’t really admit that this is what they do.
  12. Jan 2020
    1. A Microsoft programme to transcribe and vet audio from Skype and Cortana, its voice assistant, ran for years with “no security measures”, according to a former contractor who says he reviewed thousands of potentially sensitive recordings on his personal laptop from his home in Beijing over the two years he worked for the company.

      Wonderful. This, combined with the fact that Skype users can—fairly easily—find out which contacts another person has, is horrifying.

      Then again, most people know that Microsoft have colluded with American authorities to divulge chat/phone history for a long time, right?

  13. Dec 2019
    1. We are barrelling toward a country with 350 million serfs serving 3 million lords. We attempt to pacify the serfs with more powerful phones, bigger TVs, great original scripted television, and Mandalorian action figures delivered to your doorstep within the hour. The delivery guy might be forced to relieve himself in your bushes if not for the cameras his boss installed on every porch.