6 Matching Annotations
  1. May 2020
    1. If you profile your users, you have to tell them. Therefore, you must pick the relevant clause from the privacy policy generator.
    2. If you’re selling products and keep a record of users’ choices for marketing purposes, dividing them into meaningful categories, such as by age, gender, geographical origin etc., you’re profiling them.
  2. Oct 2017
    1. ‘the more predictable result would be a gradual desertification of the cultural life of individuals no longer able to encounter what is unusual, unexpected, and surprising.’ [61] Rather than individualized bubbles, sharing segregates social network users into cultural bubbles of preferences, products, and knowledge
    2. platforms such as Google and Facebook that operate like ‘prediction engines’ by ‘constantly creating and refining a theory of who you are and what you’ll do and want next’ based on what you have done and wanted before

  3. Sep 2017
    1. In each case data was framed as repressive of notions of civil society or enforcing an impoverished or constrictive notion of citizenship. The perspectives of Tufekci and Cheney-Lippold provide valuable insight into how algorithms and data are powerful shapers of modern life. Yet, they leave little room for a different form of algorithmic citizenship that might emerge where individuals desire to reform technology and data-driven processes. As Couldry and Powell (2014) note, models of algorithmic power (Beer, 2009; Lash, 2007) tend to downplay questions of individual agency. They suggest a need to “highlight not just the risks of creating and sharing data, but the opportunities as well” (p. 5). We should be attentive to moments where meaningful change can occur, even if those changes are fraught with forces of neoliberalism and tinged with technocracy.