439 Matching Annotations
  1. Jun 2020
    1. Starr, T. N., Greaney, A. J., Hilton, S. K., Crawford, K. H., Navarro, M. J., Bowen, J. E., Tortorici, M. A., Walls, A. C., Veesler, D., & Bloom, J. D. (2020). Deep mutational scanning of SARS-CoV-2 receptor binding domain reveals constraints on folding and ACE2 binding [Preprint]. Microbiology. https://doi.org/10.1101/2020.06.17.157982

    1. One of the new tools debuted by Facebook allows administrators to remove and block certain trending topics among employees. The presentation discussed the “benefits” of “content control.” And it offered one example of a topic employers might find it useful to blacklist: the word “unionize.”

      Imagine your employer looking over your shoulder constantly.

      Imagine that you're surveilled not only in regard to what you produce, but to what you—if you're an office worker—tap out in chats to colleagues.

      This is what Facebook does, and it's not very different from what China has created with its Social Credit System.

      This is Orwellian.

    1. Of course, with Facebook being Facebook, there is another, more commercial outlet for this type of metadata analysis. If the platform knows who you are, and knows what you do based on its multi-faceted internet tracking tools, then knowing who you talk to and when could be a commercial goldmine. Person A just purchased Object 1 and then chatted to Person B. Try to sell Object 1 to Person B. All of which can be done without any messaging content being accessed.
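
      Metadata-only inference of this kind is easy to sketch. The toy Python snippet below (all names and data invented; this is not any real Facebook system) shows how purchase events plus who-talked-to-whom chat metadata are enough to pick ad targets, with no message content ever read:

```python
# Toy sketch of metadata-only ad targeting: who bought what, plus who
# chatted with whom, is enough to pick targets. All data here is invented.
from collections import defaultdict

# (buyer, product) purchase events and (sender, recipient) chat metadata
purchases = [("person_a", "object_1")]
chats = [("person_a", "person_b"), ("person_a", "person_c")]

def ad_targets(purchases, chats):
    """Suggest (recipient, product) ad pairs from metadata alone."""
    bought = defaultdict(set)
    for buyer, product in purchases:
        bought[buyer].add(product)
    targets = set()
    for sender, recipient in chats:
        # Everyone the buyer talks to becomes a target for what they bought.
        for product in bought.get(sender, ()):
            targets.add((recipient, product))
    return targets

print(sorted(ad_targets(purchases, chats)))
# Both of person_a's chat partners become targets for object_1
```

      Note that no message body appears anywhere in this sketch; the inference runs entirely on who-talked-to-whom records.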
  2. May 2020
    1. What is more frightening than being merely watched, though, is being controlled. When Facebook can know us better than our parents with only 150 likes, and better than our spouses with 300 likes, the world appears quite predictable, both for governments and for businesses. And predictability means control.

      "Predictability means control"

    1. Chu, H. Y., Englund, J. A., Starita, L. M., Famulare, M., Brandstetter, E., Nickerson, D. A., Rieder, M. J., Adler, A., Lacombe, K., Kim, A. E., Graham, C., Logue, J., Wolf, C. R., Heimonen, J., McCulloch, D. J., Han, P. D., Sibley, T. R., Lee, J., Ilcisin, M., … Bedford, T. (2020). Early Detection of Covid-19 through a Citywide Pandemic Surveillance Platform. New England Journal of Medicine, NEJMc2008646. https://doi.org/10.1056/NEJMc2008646

  3. Apr 2020
    1. Edward Snowden disclosed in 2013 that the US government's Upstream program was collecting data on people reading Wikipedia articles. This revelation had a significant impact on readers' self-censorship, as shown by the fact that there were substantially fewer views for articles related to terrorism and security.[12] The court case Wikimedia Foundation v. NSA has since followed.
    1. Google's move to release location data highlights concerns around privacy. According to Mark Skilton, director of the Artificial Intelligence Innovation Network at Warwick Business School in the UK, Google's decision to use public data "raises a key conflict between the need for mass surveillance to effectively combat the spread of coronavirus and the issues of confidentiality, privacy, and consent concerning any data obtained."
    1. Thousands of enterprises around the world have done exhaustive security reviews of our user, network, and data center layers and confidently selected Zoom for complete deployment. 

      This doesn't really account for the fact that Zoom have committed some atrociously heinous acts, such as (and not limited to):

  4. Mar 2020
    1. This is known as transport encryption, which is different from end-to-end encryption because the Zoom service itself can access the unencrypted video and audio content of Zoom meetings. So when you have a Zoom meeting, the video and audio content will stay private from anyone spying on your Wi-Fi, but it won’t stay private from the company.
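
      The distinction can be made concrete with a toy model. The snippet below uses a deliberately fake XOR "cipher" (illustration only, not real cryptography, and not Zoom's actual protocol) to show why transport encryption stops a Wi-Fi snoop but not the service in the middle, while end-to-end encryption stops both:

```python
# Toy model of transport encryption vs end-to-end encryption.
# XOR is NOT a real cipher; it only stands in for "encrypted with a key".

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: applying the same key twice restores the data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

msg = b"meeting audio frame"

# Transport encryption: the client<->server link is encrypted, but the
# server knows the link key, so it can decrypt everything passing through.
link_key = b"server-known-key"
on_the_wire = xor_crypt(msg, link_key)
assert xor_crypt(on_the_wire, link_key) == msg  # the service can read this

# End-to-end encryption: only the two endpoints share the key; the server
# merely relays ciphertext it cannot decrypt.
endpoint_key = b"endpoints-only-key"
relayed = xor_crypt(msg, endpoint_key)
assert xor_crypt(relayed, link_key) != msg      # server's key is useless here
assert xor_crypt(relayed, endpoint_key) == msg  # recipient recovers the audio
```

      The Wi-Fi snoop is locked out in both cases; only in the second case is the operator locked out too.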
    1. How can one tell whether a pupil is doing their work alone? Pedagogical continuity is intended to ensure that pupils pursue school activities that allow them to progress in their learning. The aim is to draw pupils' attention to the importance and regularity of personal work, whatever the activity, even if it is done with the help of a peer or a third party. Regular assignments, assessed regularly, contribute to this. However, the teacher can neither monitor diligence in this setting nor penalise any lapse in it.
    1. And if people were really cool about sharing their personal and private information with anyone, and totally fine about being tracked everywhere they go and having a record kept of all the people they know and have relationships with, why would the ad tech industry need to spy on them in the first place? They could just ask up front for all your passwords.
    1. According to the Police Authority's guidelines, an impact assessment must be carried out before new police tools are introduced if they involve sensitive processing of personal data. No such assessment has been made for the tool in question.

      Swedish police have used Clearview AI without any impact assessment having been performed.

      In other words, Swedish police have used a facial-recognition system without being allowed to do so.

      This is a clear breach of human rights.

      Swedish police have lied about this, as reported by Dagens Nyheter.

    1. The payment provider told MarketWatch that everyone has a unique walk, and it is investigating innovative behavioral biometrics such as gait, face, heartbeat and veins for cutting edge payment systems of the future.

      This is a true invasion of people's lives.

      Remember: this is a credit-card company. We use them to pay for stuff. They shouldn't know what we look like, how we walk, how our hearts beat, nor how our 'vein technology' works.

  5. Feb 2020
    1. Last year, Facebook said it would stop listening to voice notes in messenger to improve its speech recognition technology. Now, the company is starting a new program where it will explicitly ask you to submit your recordings, and earn money in return.

      Given Facebook's history of breaking laws and ending up paying billions of USD in damages (a sum that is a joke to them), of selling ads to people who explicitly want to target people who hate Jews, and of spending millions of USD every year solely on lobbying, don't sell your personal experiences and behaviours to them.

      Facebook is nefarious and psychopathic.

  6. Jan 2020
    1. A Microsoft programme to transcribe and vet audio from Skype and Cortana, its voice assistant, ran for years with “no security measures”, according to a former contractor who says he reviewed thousands of potentially sensitive recordings on his personal laptop from his home in Beijing over the two years he worked for the company.

      Wonderful. This, combined with the fact that Skype users can—fairly easily—find out which contacts another person has, is horrifying.

      Then again, most people know that Microsoft have colluded with American authorities to divulge chat/phone history for a long time, right?

  7. Dec 2019
    1. We are barrelling toward a country with 350 million serfs serving 3 million lords. We attempt to pacify the serfs with more powerful phones, bigger TVs, great original scripted television, and Mandalorian action figures delivered to your doorstep within the hour. The delivery guy might be forced to relieve himself in your bushes if not for the cameras his boss installed on every porch.
  8. Nov 2019
    1. Speaking with MIT Technology Review, Rohit Prasad, Alexa’s head scientist, has now revealed further details about where Alexa is headed next. The crux of the plan is for the voice assistant to move from passive to proactive interactions. Rather than wait for and respond to requests, Alexa will anticipate what the user might want. The idea is to turn Alexa into an omnipresent companion that actively shapes and orchestrates your life. This will require Alexa to get to know you better than ever before.

      This is some next-level onslaught.

    1. Somewhere in a cavernous, evaporative cooled datacenter, one of millions of blinking Facebook servers took our credentials, used them to authenticate to our private email account, and tried to pull information about all of our contacts. After clicking Continue, we were dumped into the Facebook home page, email successfully “confirmed,” and our privacy thoroughly violated.
    1. In 2013, Facebook began offering a “secure” VPN app, Onavo Protect, as a way for users to supposedly protect their web activity from prying eyes. But Facebook simultaneously used Onavo to collect data from its users about their usage of competitors like Twitter. Last year, Apple banned Onavo from its App Store for violating its Terms of Service. Facebook then released a very similar program, now dubbed variously “Project Atlas” and “Facebook Research.” It used Apple’s enterprise app system, intended only for distributing internal corporate apps to employees, to continue offering the app to iOS users. When the news broke this week, Apple shut down the app and threw Facebook into some chaos when it (briefly) booted the company from its Enterprise Developer program altogether.
    1. The FBI is currently collecting data about our faces, irises, walking patterns, and voices, permitting the government to pervasively identify, track, and monitor us. The agency can match or request a match of our faces against at least 640 million images of adults living in the U.S. And it is reportedly piloting Amazon’s flawed face recognition surveillance technology.

      The FBI and Amazon are being sued over their surveillance of people living in the USA.

  9. Oct 2019
    1. Per Bloomberg, which cited a memo from an anonymous Google staffer, employees discovered that the company was creating the new tool as a Chrome browser extension that would be installed on all employees’ systems and used to monitor their activities.

      From the Bloomberg article:

      Earlier this month, employees said they discovered that a team within the company was creating the new tool for the custom Google Chrome browser installed on all workers’ computers and used to search internal systems. The concerns were outlined in a memo written by a Google employee and reviewed by Bloomberg News and by three Google employees who requested anonymity because they aren’t authorized to talk to the press.

    1. there's still the issue of user IP addresses, which Tencent would see for those using devices with mainland China settings. That's a privacy concern, but it's one among many, given that other Chinese internet companies – ISPs, app providers, cloud service providers, and the like – can be assumed to collect that information and provide it to the Chinese surveillance state on demand.
    1. This system will apply to foreign owned companies in China on the same basis as to all Chinese persons, entities or individuals. No information contained on any server located within China will be exempted from this full coverage program. No communication from or to China will be exempted. There will be no secrets. No VPNs. No private or encrypted messages. No anonymous online accounts. No trade secrets. No confidential data. Any and all data will be available and open to the Chinese government. Since the Chinese government is the shareholder in all SOEs and is now exercising de facto control over China’s major private companies as well, all of this information will then be available to those SOEs and Chinese companies. See e.g. China to place government officials inside 100 private companies, including Alibaba. All this information will be available to the Chinese military and military research institutes. The Chinese are being very clear that this is their plan.

      At least the current Chinese government are clear about how intrusive they will be, so that people can avoid them. IF people can avoid them.

    1. We recently discovered that when you provided an email address or phone number for safety or security purposes (for example, two-factor authentication) this data may have inadvertently been used for advertising purposes, specifically in our Tailored Audiences and Partner Audiences advertising system. 

      Twitter may have sold your e-mail address to people.

      Twitter has only done this with people who have added their e-mail address for security purposes.

      Security purposes for Twitter = sell your e-mail address to a third-party company.

      Spam for you = security purposes for Twitter.

    1. The claim in this ad was ruled false by those Facebook-approved third-party fact-checkers, but it is still up and running. Why? Because Facebook changed its policy on what constitutes misinformation in advertising. Prior to last week, Facebook’s rule against “false and misleading content” didn’t leave room for gray areas: “Ads landing pages, and business practices must not contain deceptive, false, or misleading content, including deceptive claims, offers, or methods.”
  10. Sep 2019
  11. Aug 2019
    1. I think Netflix would’ve avoided this controversy if it had plainly told subscribers what it was doing somewhere in the app or with a notification. Instead, people discovered that Netflix was utilizing Android’s physical activity permission, which is strange behavior from a video streaming app. In some instances, it was doing this without asking users to approve the move first, as was the case for The Next Web’s Ivan Mehta. You’ve got to be transparent if you want to monitor anyone’s movements. Netflix was unable to immediately answer whether it will be removing the physical activity recognition permission from its app now that the test is done.

      It's great that sites like The Verge and The Next Web are calling surveillance capitalists out.

  12. Jul 2019
    1. SZ: We are not users. I say we are bound in new psychological, social, political, as well as, economic interests. That we have not yet invented the words to describe the ways that we are bound. We have not yet invented the forms of collective action to express the interests that bind us.  And that that is a big part of the work that must follow in this year and the next year and the year after that, if we are to ultimately interrupt and outlaw what I view as a pernicious rogue capitalism that has no business dominating our society.
    1. According to Shoshana Zuboff, professor emerita at Harvard Business School, the Cambridge Analytica scandal was a landmark moment, because it revealed a micro version “of the larger phenomenon that is surveillance capitalism”. Zuboff is responsible for formulating the concept of surveillance capitalism, and published a magisterial, indispensable book with that title soon after the scandal broke. In the book, Zuboff creates a framework and a language for understanding this new world. She believes The Great Hack is an important landmark in terms of public understanding, and that Noujaim and Amer capture “what living under the conditions of surveillance capitalism means. That every action is being repurposed as raw material for behavioural data. And that these data are being lifted from our lives in ways that are systematically engineered to be invisible. And therefore we can never resist.”

      Shoshana Zuboff's comments on The Great Hack.

    1. Two years ago, when he moved from Boston to London, he had to register with a general practitioner. The doctor’s office gave him a form to sign saying that his medical data would be shared with other hospitals he might go to, and with a system that might distribute his information to universities, private companies and other government departments. The form added that although the data are anonymized, “there are those who believe a person can be identified through this information.” “That was really scary,” Dr. de Montjoye said. “We are at a point where we know a risk exists and count on people saying they don’t care about privacy. It’s insane.”
    2. Scientists at Imperial College London and Université Catholique de Louvain, in Belgium, reported in the journal Nature Communications that they had devised a computer algorithm that can identify 99.98 percent of Americans from almost any available data set with as few as 15 attributes, such as gender, ZIP code or marital status.

      This goes to show that one should not trust companies and organisations which claim to "anonymise" your data.
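
      The mechanism behind such re-identification is simple to illustrate. The minimal sketch below (invented records, not the paper's algorithm or data) counts how many records share each combination of quasi-identifiers; any record whose combination occurs exactly once is unique in the dataset and therefore re-identifiable:

```python
# Re-identification sketch: the more quasi-identifier attributes you combine,
# the larger the fraction of "anonymised" records that become unique.
from collections import Counter

# Invented example records; no real data.
records = [
    {"gender": "F", "zip": "98101", "marital": "single"},
    {"gender": "F", "zip": "98101", "marital": "married"},
    {"gender": "M", "zip": "98101", "marital": "married"},
    {"gender": "F", "zip": "98101", "marital": "single"},
]

def unique_fraction(records, attrs):
    """Fraction of records whose attribute combination appears exactly once."""
    combos = Counter(tuple(r[a] for a in attrs) for r in records)
    return sum(1 for r in records
               if combos[tuple(r[a] for a in attrs)] == 1) / len(records)

print(unique_fraction(records, ["gender"]))                    # 0.25
print(unique_fraction(records, ["gender", "zip", "marital"]))  # 0.5
```

      With only four records, three attributes already single out half of them; the paper's point is that with 15 attributes almost everyone in a country-sized dataset is singled out.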

    1. Within this larger context, Facebook, Google (YouTube, Google+, Blogger), and Twitter have grown from small projects mocked up on sketchbooks and developed in college dorms to global networks of billions, garnering attention from venture capitalists who invested in pursuit of growth in revenues and profits and ultimately public offerings of stock. Facebook, Google, and Twitter are thus articulated into a particular political economy of the Internet, one dependent on surveillance of user activities, the construction of user data profiles, and the sale of user attention to an increasingly sophisticated Internet marketing industry (Langlois, McKelvey, Elmer, & Werbin, 2009).
    1. What should lawmakers do? First, interrupt and outlaw surveillance capitalism’s data supplies and revenue flows. This means, at the front end, outlawing the secret theft of private experience. At the back end, we can disrupt revenues by outlawing markets that trade in human futures knowing that their imperatives are fundamentally anti-democratic. We already outlaw markets that traffic in slavery or human organs. Second, research over the past decade suggests that when “users” are informed of surveillance capitalism’s backstage operations, they want protection, and they want alternatives. We need laws and regulation designed to advantage companies that want to break with surveillance capitalism. Competitors that align themselves with the actual needs of people and the norms of a market democracy are likely to attract just about every person on Earth as their customer. Third, lawmakers will need to support new forms of collective action, just as nearly a century ago workers won legal protection for their rights to organise, to bargain collectively and to strike. Lawmakers need citizen support, and citizens need the leadership of their elected officials.

      Shoshana Zuboff's answer to surveillance capitalism

  13. Jun 2019
  14. May 2019
    1. They’ve learned, and that’s more dangerous than caring, because that means they’re rationally pricing these harms. The day that 20% of consumers put a price tag on privacy, freemium is over and privacy is back.

      Google want you to say yes, not because they're inviting positivity more than ever, but because they want you to purchase things and make them richer. This is the essence of capitalism.

  15. Apr 2019
    1. Facebook said on Wednesday that it expected to be fined up to $5 billion by the Federal Trade Commission for privacy violations. The penalty would be a record by the agency against a technology company and a sign that the United States was willing to punish big tech companies.

      This is where surveillance capitalism brings you.

      Sure, five billion US dollars won't make much of a difference to Facebook, but it's notable.

    1. In a new article, the New York Times details a little-known technique increasingly used by law enforcement to figure out everyone who might have been within certain geographic areas during specific time periods in the past. The technique relies on detailed location data collected by Google from most Android devices as well as iPhones and iPads that have Google Maps and other apps installed. This data resides in a Google-maintained database called “Sensorvault,” and because Google stores this data indefinitely, Sensorvault “includes detailed location records involving at least hundreds of millions of devices worldwide and dating back nearly a decade.”

      Google is passing on location data to law enforcement without letting users know.

    1. Google says that will prevent the company from remembering where you’ve been. Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.” That isn’t true. Even with Location History paused, some Google apps automatically store time-stamped location data without asking. (It’s possible, although laborious, to delete it.)
    1. (iii) Information we collect from other sources: From time to time, we may obtain information about you or your Contacts from third-party sources, such as public databases, social media platforms, third-party data providers and our joint marketing partners. We take steps to ensure that such third parties are legally or contractually permitted to disclose such information to us.

      So while this is a free site, they can mine your data, including your social media accounts. All of this in the name of providing you a better service.

    1. AMP is a set of rules that publishers (typically news and analysis content providers) must abide by in order to appear in the “Top Stories” section of Google’s search results, a lucrative position at the top of the page.

      This is just one of many reasons for not using Google's search engine. Or most of their products.

      Monotheistic and, more importantly, monopolistic thinking like this drags us all down.

    1. “They are morally bankrupt pathological liars who enable genocide (Myanmar), facilitate foreign undermining of democratic institutions. “[They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video, allow advertisers to target ‘Jew haters’ and other hateful market segments, and refuse to accept any responsibility for any content or harm. “They #dontgiveazuck” wrote Edwards.

      Well, I don't think he should have deleted his tweets.

    1. U.S. securities regulators shot down attempts by Amazon.com Inc to stop its investors from considering two shareholder proposals about the company’s controversial sale of a facial recognition service, a sign of growing scrutiny of the technology.

      Surveillance capitalism at its worst: this behemoth tried to prevent the very people who own it from making decisions about it.

      Capitalism is like Skynet, an organism that's taken flight on its own, bound to make solipsistic and egoistic judgments and choices.

    1. Digital sociology needs more big theory as well as testable theory.

      I can't help but think here about the application of digital technology to large bodies of literature in the creation of the field of corpus linguistics.

      If traditional sociology means anything, then a digital incarnation of it should create physical and trackable means that can potentially be more easily studied as a result. Just the same way that Mark Dredze has been able to look at Twitter data to analyze public health data like influenza, we should be able to more easily quantify sociological phenomena in aggregate by looking at larger and richer data sets of online interactions.

      There's also likely some value in studying the quantities of digital exhaust that companies like Google, Amazon, Facebook, etc. are using for surveillance capitalism.

    1. “Prison labor” is usually associated with physical work, but inmates at two prisons in Finland are doing a new type of labor: classifying data to train artificial intelligence algorithms for a startup. Though the startup in question, Vainu, sees the partnership as a kind of prison reform that teaches valuable skills, other experts say it plays into the exploitative economics of prisoners being required to work for very low wages.

      Naturally, this is exploitative; the inmates do not learn a skill that they can take out into the real world.

      I'd be surprised if they didn't have to sign an NDA for this.

  16. Mar 2019
    1. If you do not like the price you’re being offered when you shop, do not take it personally: many of the prices we see online are being set by algorithms that respond to demand and may also try to guess your personal willingness to pay. What’s next? A logical next step is that computers will start conspiring against us. That may sound paranoid, but a new study by four economists at the University of Bologna shows how this can happen.
    1. Mention McDonald’s to someone today, and they're more likely to think about Big Mac than Big Data. But that could soon change: The fast-food giant has embraced machine learning, in a fittingly super-sized way. McDonald’s is set to announce that it has reached an agreement to acquire Dynamic Yield, a startup based in Tel Aviv that provides retailers with algorithmically driven "decision logic" technology. When you add an item to an online shopping cart, it’s the tech that nudges you about what other customers bought as well. Dynamic Yield reportedly had been recently valued in the hundreds of millions of dollars; people familiar with the details of the McDonald’s offer put it at over $300 million. That would make it the company's largest purchase since it acquired Boston Market in 1999.

      McDonald's are getting into machine learning. Beware.

    1. Discredited individuals have been barred from taking a total of 17.5 million flights and 5.5 million high-speed train trips as of the end of 2018, according to the latest annual report by the National Public Credit Information Center. The list of “discredited individuals” was introduced in 2013, months before the State Council unveiled a plan in 2014 to build a social credit system by 2020.

      This is what surveillance capitalism brings. This is due to what is called China's "Golden Shield", a credit-rating system that, for example, brings your credit level down if you search for terms such as "Tiananmen Square protests" or post "challenging" pictures on Facebook.

      This is surveillance capitalism at its worst, creating a new lower class for the likes of Google, Facebook, Microsoft, Amazon, and insurance companies. Keep the rabble away, as it were.

    1. Amazon has been beta testing the ads on Apple Inc.’s iOS platform for several months, according to people familiar with the plan. A similar product for Google’s Android platform is planned for later this year, said the people, who asked not to be identified because they’re not authorized to share the information publicly.

      Sounds like one of the best reasons I've ever heard to run Brave Browser both on desktop and mobile. https://brave.com/

    1. Sharing of user data is routine, yet far from transparent. Clinicians should be conscious of privacy risks in their own use of apps and, when recommending apps, explain the potential for loss of privacy as part of informed consent. Privacy regulation should emphasise the accountabilities of those who control and process user data. Developers should disclose all data sharing practices and allow users to choose precisely what data are shared and with whom.

      Horrific conclusion, which clearly states that "sharing of user data is routine" where the medical profession is concerned.

    1. While employees were up in arms because of Google’s “Dragonfly” censored search engine with China and its Project Maven’s drone surveillance program with DARPA, there exist very few mechanisms to stop these initiatives from taking flight without proper oversight. The tech community argues they are different than Big Pharma or Banking. Regulating them would strangle the internet.

      This is an old maxim with corporations, Google, Facebook, and Microsoft alike: if you stop them from simply doing whatever they want out of, well, greed, then you're hampering "evolution".

      Evolution of their wallets, yes.

    2. Amy Webb, Author of  “The Big Nine: How the Tech Titans and their Thinking Machines could Warp Humanity” refers not only to G-MAFIA but also BAT (the consortium that has led the charge in the highly controversial Social Credit system to create a trust value among its Chinese citizens). She writes: We stop assuming that the G-MAFIA (Google, Apple, Facebook, IBM, and Amazon) can serve its DC and Wall Street masters equally and that the free markets and our entrepreneurial spirit will produce the best possible outcomes for AI and humanity

      This is discussed by Shoshana Zuboff in her masterfully written "The Age of Surveillance Capitalism".

    1. A speech-detecting accelerometer recognizes when you’re speaking and works with a pair of beamforming microphones to filter out external noise and focus on the sound of your voice.

      I'll translate this for you: "This enables Apple to constantly listen to you, record your behaviour, and sell your behaviour data."

    1. we don’t want to fund teachers and manageable class sizes, so we outsource the plagiarism problem to a for-profit company that has a side gig of promoting the importance of the problem it promises to solve.

      Yet another example of a misdirected "solution" to a manufactured problem that ends up being more costly - in terms of monetary expense AND student learning AND faculty engagement - than it would have been to invest in human interaction and learner-centered pedagogies.

  17. Feb 2019
    1. It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.”
    2. Larry Page grasped that human experience could be Google’s virgin wood, that it could be extracted at no extra cost online and at very low cost out in the real world. For today’s owners of surveillance capital the experiential realities of bodies, thoughts and feelings are as virgin and blameless as nature’s once-plentiful meadows, rivers, oceans and forests before they fell to the market dynamic. We have no formal control over these processes because we are not essential to the new market action. Instead we are exiles from our own behaviour, denied access to or control over knowledge derived from its dispossession by others for others. Knowledge, authority and power rest with surveillance capital, for which we are merely “human natural resources”. We are the native peoples now whose claims to self-determination have vanished from the maps of our own experience.
    1. No one is forced on Twitter, naturally, but if you aren’t on Twitter, then your audience is (probably) smaller, while if you are on Twitter, they can steal your privacy, which I deeply resent. This is a big dilemma to me. Beyond that, I simply don’t think anybody should have as much power as the social media giants have over us today. I think it’s increasingly politically important to decentralize social media.

      This is an important point! And nothing puts a finer point on it than Shoshana Zuboff's recent book on surveillance capitalism.

  18. Jan 2019
    1. Turnitin’s practices have been ruled as fair use in federal court. But to Morris and Stommel, the ceding of control of students' work -- and their ownership over that work -- to a corporation is a moral issue, even if it's legally sound. Time spent on checking plagiarism reports is time that would be better spent teaching students how to become better writers in the first place, they argue. “This is ethical, activist work. While not exactly the Luddism of the 19th century, we must ask ourselves, when we’re choosing ed-tech tools, who profits and from what?” they wrote in the essay. “The gist: when you upload work to Turnitin, your property is, in no reasonable sense, your property. Every essay students submit -- representing hours, days or even years of work -- becomes part of the Turnitin database, which is then sold to universities.”

      This is a key issue for me, and we talked about this last week in GEDI when someone brought up the case of wide-scale cheating on the quiz/test that students took online.

      I'd like teachers to focus on teaching and helping students learn. And I think the questions about who profits and who benefits from ed-tech tools like Turnitin need to be asked.

  19. Nov 2018
  20. Oct 2018
    1. The idea that researchers can, and should, quantify something as slippery as “engagement” is a red flag for many of the experts I talked to. As Alper put it, “anyone who has spent time in any kind of classroom will know that attention isn’t something well-measured by the face. The body as a whole provides many more cues.”
    2. The NYCLU found nothing in the documents outlining policies for accessing data collected by the cameras, or what faces would be fed to the system in the first place. And based on emails acquired through the same FOIL request, the NYCLU noted, Lockport administrators appeared to have a poor grasp on how to manage access to internal servers, student files, and passwords for programs and email accounts. “The serious lack of familiarity with cybersecurity displayed in the email correspondence we received and complete absence of common sense redactions of sensitive private information speaks volumes about the district’s lack of preparation to safely store and collect biometric data on the students, parents and teachers who pass through its schools every day,” an editor’s note to the NYCLU’s statement on the Lockport documents reads.
    3. A school using the platform installs a set of high-quality cameras, good enough to detect individual student faces, before determining exactly which biometrics it thinks must set off the system. Crucially, it’s up to each school to input these facial types, which it might source from local police and mug-shot databases, or school images of former students it doesn’t want on its premises. With those faces loaded, the Aegis system goes to work, scanning each face it sees and comparing it with the school’s database. If no match is found, the system throws that face away. If one is, Aegis sends an alert to the control room.
    4. It might sound like dystopian science fiction, but this could be the not-too-distant future for schools across America and beyond. Researchers at the University of California, San Diego, for instance, have already begun publishing models for how to use facial recognition and machine learning to predict student engagement. A Seattle company recently offered up an open-source facial recognition system for use in schools, while startups are already selling “engagement detectors” to online learning courses in France and China. Advocates for these systems believe the technology will make for smarter students, better teachers, and safer schools. But not everyone is convinced this kind of surveillance apparatus belongs in the classroom, that these applications even work, or that they won’t unfairly target minority faces.
    1. The end game of a surveillance society, from the perspective of those being watched, is to be subjected to whims of black-boxed code extended to the navigation of spaces, which are systematically stripped of important social and cultural clues. The personalized surveillance tech, meanwhile, will not make people less racist; it will make them more comfortable and protected in their racism.
    2. What would it look like to be constantly coded as different in a hyper-surveilled society — one where there was large-scale deployment of surveillant technologies with persistent “digital epidermalization” writing identity on to every body within the scope of its gaze?
  21. Aug 2018
    1. But the entire business model — what the philosopher and business theorist Shoshana Zuboff calls “surveillance capitalism” — rests on untrammeled access to your personal data.

      Is Shoshana Zuboff the originator of surveillance capitalism?

      According to Wikipedia, no: surveillance capitalism is a term first introduced by John Bellamy Foster and Robert W. McChesney in Monthly Review in 2014, and later popularized by academic Shoshana Zuboff, that denotes a new genus of capitalism that monetizes data acquired through surveillance.

    1. Since the data is already being collected on a regular basis by ubiquitous private firms, it is thought to contain information that will increase opportunities for intelligence gathering and thereby security. This marks a shift from surveillance to ‘dataveillance’ (van Dijck 2014), where the impetus for data processing is no longer motivated by specific purposes or suspicions, but opportunistic discovery of anomalies that can be investigated. For crisis management this could mean benefits such as richer situation awareness, increased capacity for risk assessment, anticipation and prediction, as well as more agile response

      Dataveillance definition.

      The supposed benefits for crisis management don't square with the earlier criticisms about data quality, loss of contextualization, and the accuracy of predictive analytics.

      The following paragraph clears up some of the overly optimistic promises. Perhaps this section is simply overstated for rhetorical purposes.

    2. Although Snowden’s revelations shocked the world and prompted calls for a public debate on issues of privacy and transparency

      I understand the desire to use a topical hook to explain a complex topic, but framing the argument with the highly contentious Snowden scandal seems risky (alienating) and could undermine an important argument about the surveillance state should new information emerge about his motives or credibility.

    3. While seemingly avoiding the traps of exerting top- down power over people the state does not yet have formal control over, and simultaneously providing support for self- determination and choice to empower individuals for self- sufficiency rather than defining them as vulnerable and passive recipients of top- down protection (Meier 2013), tying individual aid to mobile tracking puts refugees in a situation where their security is dependent upon individual choice and the private sector. Apart from disrupting traditional dynamics of responsibility for aid and protection, public–private sharing of intelligence brings new forms of dataveillance

      If the goal is to improve rapid/efficient response to those in need, is it necessarily only a dichotomy of top-down institutional action vs private sector/market-driven reaction? Surely, we can do better than this.

      Data/predictive analytics abuses by the private sector are legion.

      How does social construction vs technological determinism fit here? In what ways are the real traumas suffered by crisis-affected people not being taken into account during the response/relief/resiliency phases?

    4. However, with these big data collections, the focus becomes not the individual’s behaviour but social and economic insecurities, vulnerabilities and resilience in relation to the movement of such people. The shift acknowledges that what is surveilled is more complex than an individual person’s movements, communications and actions over time.

      The shift from INGO emergency response/logistics to state-sponsored, individualized resilience via the private sector seems profound here.

      There's also a subtle temporal element here of surveilling need and collecting data over time.

      Again, raises serious questions about the use of predictive analytics, data quality/classification, and PII ethics.

    5. Andrejevic and Gates (2014: 190) suggest that ‘the target becomes the hidden patterns in the data, rather than particular individuals or events’. National and local authorities are not seeking to monitor individuals and discipline their behaviour but to see how many people will reach the country and when, so that they can accommodate them, secure borders, and identify long-term social outlooks such as education, civil services, and impacts upon the host community (Pham et al. 2015).

      This seems like a terribly naive conclusion about mass data collection by the state.

      Also:

      "Yet even if capacities to analyse the haystack for needles more adequately were available, there would be questions about the quality of the haystack, and the meaning of analysis. For ‘Big Data is not self-explanatory’ (Bollier 2010: 13, in boyd and Crawford 2012). Neither is big data necessarily good data in terms of quality or relevance (Lesk 2013: 87) or complete data (boyd and Crawford 2012)."

    6. as boyd and Crawford argue, ‘without taking into account the sample of a data set, the size of the data set is meaningless’ (2012: 669). Furthermore, many techniques used by the state and corporations in big data analysis are based on probabilistic prediction which, some experts argue, is alien to, and even incomprehensible for, human reasoning (Heaven 2013). As Mayer-Schönberger stresses, we should be ‘less worried about privacy and more worried about the abuse of probabilistic prediction’ as these processes confront us with ‘profound ethical dilemmas’ (in Heaven 2013: 35).

      Primary problems to resolve regarding the use of "big data" in humanitarian contexts: dataset size and sampling, probabilistic prediction that is alien to human reasoning, and ethical abuses of PII.

    7. Surveillance studies have tracked a shift from discipline to control (Deleuze 1992; Haggerty and Ericson 2000; Lyon 2014) exemplified by the shift from monitoring confined populations (through technologies such as the panopticon) to using new technologies to keep track of mobile populations.

      Design implication for ICT4D and ICT for humanitarian response -- moving beyond controlled environment surveillance to ubiquitous and omnipresent.

  22. Jul 2018
    1. Tega Brain and Sam Lavigne, two Brooklyn-based artists whose work explores the intersections of technology and society, have been hearing a lot of stories like mine. In June, they launched a website called New Organs, which collects first-hand accounts of these seemingly paranoiac moments. The website is comprised of a submission form that asks you to choose from a selection of experiences, like “my phone is eavesdropping on me” to “I see ads for things I dream about.” You’re then invited to write a few sentences outlining your experience and why you think it happened to you.
  23. Mar 2018
  24. Jan 2018
  25. virginia-eubanks.com
  26. Oct 2017
  27. Jul 2017
    1. For example, an observer or eavesdropper that conducts traffic analysis may be able to determine what type of traffic is present (real-time communications or bulk file transfers, for example) or which protocols are in use, even if the observed communications are encrypted or the communicants are unidentifiable. This kind of surveillance can adversely impact the individuals involved by causing them to become targets for further investigation or enforcement activities

      Good example of surveillance via traffic analysis.

  28. Mar 2017
    1. You can delete the data. You can limit its collection. You can restrict who sees it. You can inform students. You can encourage students to resist. Students have always resisted school surveillance.

      The first three of these can be tough for the individual faculty member to accomplish, but informing students and raising awareness around these issues can be done and is essential.

  29. Dec 2016
    1. Selling user data should be illegal. And the customer data a company is allowed to collect and store should be very limited.

      Under the guidance of Jared Kushner, a senior campaign advisor and son-in-law of President-Elect Trump, Parscale quietly began building his own list of Trump supporters. Trump’s revolutionary database, named Project Alamo, contains the identities of 220 million people in the United States, and approximately 4,000 to 5,000 individual data points about the online and offline life of each person. Funded entirely by the Trump campaign, this database is owned by Trump and continues to exist.

      Trump’s Project Alamo database was also fed vast quantities of external data, including voter registration records, gun ownership records, credit card purchase histories, and internet account identities. The Trump campaign purchased this data from certified Facebook marketing partners Experian PLC, Datalogix, Epsilon, and Acxiom Corporation. (Read here for instructions on how to remove your information from the databases of these consumer data brokers.)

    2. Trump's campaign used carefully targeted negative ads to suppress voter turnout.

      With Project Alamo as ammunition, the Trump digital operations team covertly executed a massive digital last-stand strategy using targeted Facebook ads to ‘discourage’ Hillary Clinton supporters from voting. The Trump campaign poured money and resources into political advertisements on Facebook, Instagram, the Facebook Audience Network, and Facebook data-broker partners.

      “We have three major voter suppression operations under way,” a senior Trump official explained to reporters from BusinessWeek. “They’re aimed at three groups Clinton needs to win overwhelmingly: idealistic white liberals, young women, and African Americans.”

  30. Nov 2016
    1. Mike Pompeo is Trump's pick for CIA director. In January 2016, Pompeo advocated "re-establishing collection of all metadata, and combining it with publicly available financial and lifestyle information into a comprehensive, searchable database. Legal and bureaucratic impediments to surveillance should be removed" (At least they acknowledge that backdoors in US hardware and software would do little good.)

      Oh, cute. Pompeo made a name for himself during the Benghazi investigation.<br> http://www.nytimes.com/2016/11/19/us/politics/donald-trump-mike-pompeo-cia.html

  31. Oct 2016
    1. Hemisphere isn’t a “partnership” but rather a product AT&T developed, marketed, and sold at a cost of millions of dollars per year to taxpayers. No warrant is required to make use of the company’s massive trove of data, according to AT&T documents, only a promise from law enforcement to not disclose Hemisphere if an investigation using it becomes public.

      ...

      Once AT&T provides a lead through Hemisphere, then investigators use routine police work, like getting a court order for a wiretap or following a suspect around, to provide the same evidence for the purpose of prosecution. This is known as “parallel construction.”

    1. If you’re white, you don’t usually need to worry about being monitored by the police.

      Interesting. That NYPD surveillance tower on Pitt Street. Will append sanitized photo at some point. Sad because so many children from the Masaryk and Brandeis communities cross Pitt on their way to school.

  32. Sep 2016
    1. In theory the editorial writers speak for the publisher. In practice the publisher does not routinely tell them what to say or even see their copy in advance. In this case I have on good authority that neither the publisher, Fred Ryan, nor the owner, Jeff Bezos, had any idea that this editorial was coming. I would be very surprised to learn that either of them agrees with the proposition that our principal stories on the NSA should not have been published. For sure I can tell you that this is not the position of the newsroom’s leadership or any reporter I know. Marty Baron, the executive editor, has said again and again how proud he is of the paper’s coverage of Ed Snowden and the NSA.

      -- Barton Gellman

  33. Jul 2016