44 Matching Annotations
  1. Jul 2019
    1. What should lawmakers do? First, interrupt and outlaw surveillance capitalism’s data supplies and revenue flows. This means, at the front end, outlawing the secret theft of private experience. At the back end, we can disrupt revenues by outlawing markets that trade in human futures knowing that their imperatives are fundamentally anti-democratic. We already outlaw markets that traffic in slavery or human organs. Second, research over the past decade suggests that when “users” are informed of surveillance capitalism’s backstage operations, they want protection, and they want alternatives. We need laws and regulation designed to advantage companies that want to break with surveillance capitalism. Competitors that align themselves with the actual needs of people and the norms of a market democracy are likely to attract just about every person on Earth as their customer. Third, lawmakers will need to support new forms of collective action, just as nearly a century ago workers won legal protection for their rights to organise, to bargain collectively and to strike. Lawmakers need citizen support, and citizens need the leadership of their elected officials.

      Shoshana Zuboff's answer to surveillance capitalism

  2. Jun 2019
  3. May 2019
    1. They’ve learned, and that’s more dangerous than caring, because that means they’re rationally pricing these harms. The day that 20% of consumers put a price tag on privacy, freemium is over and privacy is back.

Google wants you to say yes, not because it's suddenly inviting positivity, but because it wants you to buy things and make it richer. This is the essence of capitalism.

  4. Apr 2019
    1. The report also noted a 27 percent increase in the number of foreigners whose communications were targeted by the NSA during the year. In total, an estimated 164,770 foreign individuals or groups were targeted with search terms used by the NSA to monitor their communications, up from 129,080 on the year prior.
    2. The data, published Tuesday by the Office of the Director of National Intelligence (ODNI), revealed a 28 percent rise in the number of targeted search terms used to query massive databases of collected Americans’ communications.
    1. we get some of it by collecting data about your interactions, use and experiences with our products. The data we collect depends on the context of your interactions with Microsoft and the choices that you make, including your privacy settings and the products and features that you use. We also obtain data about you from third parties.
    1. drivers delivering Amazon packages have reported feeling so pressured that they speed through neighborhoods, blow by stop signs, and pee in bottles in the trucks or outside
    2. Amazon's system tracks a metric called "time off task," meaning how much time workers pause or take breaks, The Verge reported. It has been previously reported that some workers feel so pressured that they don't take bathroom breaks.
    3. Amazon employs a system that not only tracks warehouse workers' productivity but also can automatically fire them for failing to meet expectations.

      The bots now fire humans. AI 2.0.

    1. Facebook said on Wednesday that it expected to be fined up to $5 billion by the Federal Trade Commission for privacy violations. The penalty would be a record by the agency against a technology company and a sign that the United States was willing to punish big tech companies.

      This is where surveillance capitalism brings you.

Sure, five billion US dollars won't make much of a difference to Facebook, but it's notable.

    1. So far, according to the Times and other outlets, this technique is being used by the FBI and police departments in Arizona, North Carolina, California, Florida, Minnesota, Maine, and Washington, although there may be other agencies using it across the country.
    2. In a new article, the New York Times details a little-known technique increasingly used by law enforcement to figure out everyone who might have been within certain geographic areas during specific time periods in the past. The technique relies on detailed location data collected by Google from most Android devices as well as iPhones and iPads that have Google Maps and other apps installed. This data resides in a Google-maintained database called “Sensorvault,” and because Google stores this data indefinitely, Sensorvault “includes detailed location records involving at least hundreds of millions of devices worldwide and dating back nearly a decade.”

      Google is passing on location data to law enforcement without letting users know.

1. Google says that will prevent the company from remembering where you’ve been. Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.” That isn’t true. Even with Location History paused, some Google apps automatically store time-stamped location data without asking. (It’s possible, although laborious, to delete it.)
    1. Per a Wednesday report in Business Insider, Facebook has now said that it automatically extracted contact lists from around 1.5 million email accounts it was given access to via this method without ever actually asking for their permission. Again, this is exactly the type of thing one would expect to see in a phishing attack.

Facebook is worse than Nixon when he said "I'm not a crook".

    1. “In contrast to Dr. Wood’s claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications,” they wrote.

This is more important than most people probably realise. Once these systems are deployed at substantial scale, which isn't far away, recognition bias could decide whether a person lives or dies.

1. The conundrum isn’t just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

      This is a great summation of the issue.

    1. The highlight of today’s announcements is the beta launch of the company’s AI Platform. The idea here is to offer developers and data scientists an end-to-end service for building, testing and deploying their own models.
    1. AMP is a set of rules that publishers (typically news and analysis content providers) must abide by in order to appear in the “Top Stories” section of Google’s search results, a lucrative position at the top of the page.

      This is just one of many reasons for not using Google's search engine. Or most of their products.

      Monotheistic and, more importantly, monopolistic thinking like this drags us all down.

    1. Amazon.com Inc. is positioning Alexa, its artificial-intelligence assistant, to track consumers’ prescriptions and relay personal health information, in a bid to insert the technology into everyday health care.

      Surveillance capitalism, anyone?

1. “They are morally bankrupt pathological liars who enable genocide (Myanmar), facilitate foreign undermining of democratic institutions. [They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video, allow advertisers to target ‘Jew haters’ and other hateful market segments, and refuse to accept any responsibility for any content or harm. They #dontgiveazuck,” wrote Edwards.

      Well, I don't think he should have deleted his tweets.

    1. Amazon’s technology struggles more than some peers’ to identify the gender of individuals with darker skin, prompting fears of unjust arrests. Amazon has defended its work and said all users must follow the law.

      Draw any parallel to "The Handmaid's Tale" and you're right.

    2. U.S. securities regulators shot down attempts by Amazon.com Inc to stop its investors from considering two shareholder proposals about the company’s controversial sale of a facial recognition service, a sign of growing scrutiny of the technology.

Surveillance capitalism at its worst: this behemoth tries to keep the very people who own it from making decisions about it.

      Capitalism is like Skynet, an organism that has taken flight on its own, bound to make solipsistic and egoistic judgments and choices.

    1. Digital sociology needs more big theory as well as testable theory.

      I can't help but think here about the application of digital technology to large bodies of literature in the creation of the field of corpus linguistics.

If traditional sociology means anything, then a digital incarnation of it should create physical and trackable means that can potentially be more easily studied as a result. In the same way that Mark Dredze has been able to look at Twitter data to analyze public health data like influenza, we should be able to more easily quantify sociological phenomena in aggregate by looking at larger and richer data sets of online interactions.

      There's also likely some value in studying the quantities of digital exhaust that companies like Google, Amazon, Facebook, etc. are using for surveillance capitalism.

    1. “Prison labor” is usually associated with physical work, but inmates at two prisons in Finland are doing a new type of labor: classifying data to train artificial intelligence algorithms for a startup. Though the startup in question, Vainu, sees the partnership as a kind of prison reform that teaches valuable skills, other experts say it plays into the exploitative economics of prisoners being required to work for very low wages.

      Naturally, this is exploitative; the inmates do not learn a skill that they can take out into the real world.

I'd be surprised if they didn't have to sign an NDA for this.

  5. Mar 2019
    1. If you do not like the price you’re being offered when you shop, do not take it personally: many of the prices we see online are being set by algorithms that respond to demand and may also try to guess your personal willingness to pay. What’s next? A logical next step is that computers will start conspiring against us. That may sound paranoid, but a new study by four economists at the University of Bologna shows how this can happen.
1. Mention McDonald’s to someone today, and they're more likely to think about Big Mac than Big Data. But that could soon change: The fast-food giant has embraced machine learning, in a fittingly super-sized way. McDonald’s is set to announce that it has reached an agreement to acquire Dynamic Yield, a startup based in Tel Aviv that provides retailers with algorithmically driven "decision logic" technology. When you add an item to an online shopping cart, it’s the tech that nudges you about what other customers bought as well. Dynamic Yield reportedly had been recently valued in the hundreds of millions of dollars; people familiar with the details of the McDonald’s offer put it at over $300 million. That would make it the company's largest purchase since it acquired Boston Market in 1999.

      McDonald's are getting into machine learning. Beware.

    1. As one of 13 million officially designated “discredited individuals,” or laolai in Chinese, 47-year-old Kong is banned from spending on “luxuries,” whose definition includes air travel and fast trains.
2. Discredited individuals have been barred from taking a total of 17.5 million flights and 5.5 million high-speed train trips as of the end of 2018, according to the latest annual report by the National Public Credit Information Center. The list of “discredited individuals” was introduced in 2013, months before the State Council unveiled a plan in 2014 to build a social credit system by 2020.

This is what surveillance capitalism brings. It stems from China's Social Credit System, which can, for example, lower your score if you search for terms such as "Tiananmen Square protest" or post "challenging" pictures on social media.

      This is surveillance capitalism at its worst, creating a new lower class for the likes of Google, Facebook, Microsoft, Amazon, and insurance companies. Keep the rabble away, as it were.

    1. Amazon has been beta testing the ads on Apple Inc.’s iOS platform for several months, according to people familiar with the plan. A similar product for Google’s Android platform is planned for later this year, said the people, who asked not to be identified because they’re not authorized to share the information publicly.

      Sounds like one of the best reasons I've ever heard to run Brave Browser both on desktop and mobile. https://brave.com/

    1. Sharing of user data is routine, yet far from transparent. Clinicians should be conscious of privacy risks in their own use of apps and, when recommending apps, explain the potential for loss of privacy as part of informed consent. Privacy regulation should emphasise the accountabilities of those who control and process user data. Developers should disclose all data sharing practices and allow users to choose precisely what data are shared and with whom.

      Horrific conclusion, which clearly states that "sharing of user data is routine" where the medical profession is concerned.

    2. To investigate whether and how user data are shared by top rated medicines related mobile applications (apps) and to characterise privacy risks to app users, both clinicians and consumers.

      "24 of 821 apps identified by an app store crawling program. Included apps pertained to medicines information, dispensing, administration, prescribing, or use, and were interactive."

    1. While employees were up in arms because of Google’s “Dragonfly” censored search engine with China and its Project Maven’s drone surveillance program with DARPA, there exist very few mechanisms to stop these initiatives from taking flight without proper oversight. The tech community argues they are different than Big Pharma or Banking. Regulating them would strangle the internet.

This is an old corporate maxim, shared by Google, Facebook, and Microsoft alike: stop them from simply doing what they want out of, well, greed, and you're hampering "evolution".

      Evolution of their wallets, yes.

2. Amy Webb, author of “The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity”, refers not only to the G-MAFIA but also to BAT (the consortium that has led the charge in the highly controversial social credit system to create a trust value among Chinese citizens). She writes: “We stop assuming that the G-MAFIA (Google, Apple, Facebook, IBM, and Amazon) can serve its DC and Wall Street masters equally and that the free markets and our entrepreneurial spirit will produce the best possible outcomes for AI and humanity.”

      This is discussed by Shoshana Zuboff in her masterfully written "The Age of Surveillance Capitalism".

    1. A speech-detecting accelerometer recognizes when you’re speaking and works with a pair of beamforming microphones to filter out external noise and focus on the sound of your voice.

I'll translate this for you: "This enables Apple to constantly listen to you, record your behaviour, and sell that behavioural data."

    1. we don’t want to fund teachers and manageable class sizes, so we outsource the plagiarism problem to a for-profit company that has a side gig of promoting the importance of the problem it promises to solve.

      Yet another example of a misdirected "solution" to a manufactured problem that ends up being more costly - in terms of monetary expense AND student learning AND faculty engagement - than it would have been to invest in human interaction and learner-centered pedagogies.

  6. Feb 2019
    1. It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.”
    2. Larry Page grasped that human experience could be Google’s virgin wood, that it could be extracted at no extra cost online and at very low cost out in the real world. For today’s owners of surveillance capital the experiential realities of bodies, thoughts and feelings are as virgin and blameless as nature’s once-plentiful meadows, rivers, oceans and forests before they fell to the market dynamic. We have no formal control over these processes because we are not essential to the new market action. Instead we are exiles from our own behaviour, denied access to or control over knowledge derived from its dispossession by others for others. Knowledge, authority and power rest with surveillance capital, for which we are merely “human natural resources”. We are the native peoples now whose claims to self-determination have vanished from the maps of our own experience.
    3. The combination of state surveillance and its capitalist counterpart means that digital technology is separating the citizens in all societies into two groups: the watchers (invisible, unknown and unaccountable) and the watched. This has profound consequences for democracy because asymmetry of knowledge translates into asymmetries of power.
    1. No one is forced on Twitter, naturally, but if you aren’t on Twitter, then your audience is (probably) smaller, while if you are on Twitter, they can steal your privacy, which I deeply resent. This is a big dilemma to me. Beyond that, I simply don’t think anybody should have as much power as the social media giants have over us today. I think it’s increasingly politically important to decentralize social media.

This is an important point! And nothing puts a finer point on it than Shoshana Zuboff's recent book on surveillance capitalism.

  7. Jan 2019
    1. Turnitin’s practices have been ruled as fair use in federal court. But to Morris and Stommel, the ceding of control of students' work -- and their ownership over that work -- to a corporation is a moral issue, even if it's legally sound. Time spent on checking plagiarism reports is time that would be better spent teaching students how to become better writers in the first place, they argue. “This is ethical, activist work. While not exactly the Luddism of the 19th century, we must ask ourselves, when we’re choosing ed-tech tools, who profits and from what?” they wrote in the essay. “The gist: when you upload work to Turnitin, your property is, in no reasonable sense, your property. Every essay students submit -- representing hours, days or even years of work -- becomes part of the Turnitin database, which is then sold to universities.”

This is a key issue for me, and we talked about this last week in GEDI when someone brought up the case of wide-scale cheating on the quiz/test that students took online.

I'd like teachers to focus on teaching and helping students learn. And I think the questions about who profits and who benefits from ed-tech tools like Turnitin need to be asked.

  8. Aug 2018
    1. But the entire business model — what the philosopher and business theorist Shoshana Zuboff calls “surveillance capitalism” — rests on untrammeled access to your personal data.

      Is Shoshana Zuboff the originator of surveillance capitalism?

According to Wikipedia, no: surveillance capitalism is a term first introduced by John Bellamy Foster and Robert W. McChesney in Monthly Review in 2014, and later popularized by academic Shoshana Zuboff, denoting a new genus of capitalism that monetizes data acquired through surveillance.