18 Matching Annotations
  1. Mar 2019
    1. Lastly and more foundationally, why would Facebook or Google owe you anything? It’s not like Zuckerberg the Paparazzo snapped a photo of you and then monetized your image. You willfully used a service and generated data that wouldn’t otherwise exist. What you get in return is Facebook itself, for which you’ve not paid a nickel. Ditto Uber, which uses your data to optimize a tricky two-way market in riders and drivers so you have a car nearby when you open your app. Google likewise uses your searches (and resulting clicks) as a training set for its search algorithm. None of these modern marvels is cheap to maintain. You’re not contributing to some limited pool of data on whose resulting revenue you can stake a claim; you’re an infinitesimally small part of a data cooperative whose benefits accrue to the very users that generated it.

      Data are used by the services, but only because you are using those services. An argument to share with the public?

  2. Feb 2019
    1. Unified View programme, aimed at giving it a much more complete picture, and one that it can categorise and dissect as it wishes, of registrations and trends in food service operators around the country

      The complete database: the collection and exploitation/use of data.

    2. “We recognised the challenge because of the number of local authorities and the wide variation in IT,” Pierce says. “Even though there is a small number of MIS providers there may be different versions of their software, they may be on-premise or in the cloud, and they may be using emerging technology.” “It’s a little bit lumpy; they come on in groups. Once you get another MIS provider you get all their local authorities coming on, or if they are in a shared service they will come on together. For a local authority to come onboard you need the MIS provider onboard; you have to get both saying yes.”

      The challenge of moving the public sector!

    3. “To get a good digital service it should meet everybody’s needs,” she says. “It should be that people want to use it, enter the right data first time without having to think too hard about it, ensure it’s validated, and the goal is always to try to get as complete coverage as possible.”

      Simple data stewardship / custodian / guardian.

    1. He stressed that a “critical part” of the organisation’s capability to do this was to make sure that the whole council was aware of its duties and responsibilities, as well as areas of best practice and of improvement.

      Agree this is critical

    2. Connell likened the principle to the idea that a bad lawyer will tell you simply what you can’t do, while a good one will ask what you are trying to achieve and then help you do that within the bounds of the law. “The reality is, you need someone who can think in the art of the possible – 'What can we do with this data to be more effective, more efficient, to reduce costs and improve services?' – but you also need someone to look at it and say, ‘We can't do that, but we can do that, and we need explicit consent for that’,” Connell said. “I would encourage everyone to do that – to always have their data protection people on exploitation as well as locking the data down.”

      Expertise in data protection as the barrier to making the best use of data. The public sector can become a leader by first understanding data protection itself.

    3. one of the greatest successes since he started at the council had been a marked improvement in data protection and information management. Part of this, he said, was down to having the data protection function sitting with the data exploitation area, which brings the two major aspects of data management in local government – using it to greatest effect and yet protecting the people it belongs to – closer together.

      That might do it!

    4. Councils could make more effective use of their data if they ensure the people working on data protection also play a part in exploitation of that information, Geoff Connell, head of information management at Norfolk County Council, has said.

      This is an interesting approach.

    1. These documents are further augmented by a 15,000-word secondary document, called “Known Questions,” which offers additional commentary and guidance on thorny questions of moderation — a kind of Talmud to the community guidelines’ Torah. Known Questions used to occupy a single lengthy document that moderators had to cross-reference daily; last year it was incorporated into the internal community guidelines for easier searching.

      Creating a body of knowledge and trying to spread that is hard.

    2. In some cases, the company has been criticized for not doing enough — as when United Nations investigators found that it had been complicit in spreading hate speech during the genocide of the Rohingya community in Myanmar. In others, it has been criticized for overreach — as when a moderator removed a post that excerpted the Declaration of Independence. (Thomas Jefferson was ultimately granted a posthumous exemption to Facebook’s speech guidelines, which prohibit the use of the phrase “Indian savages.”)

      AI probably plays a part here.

    1. So the only way things will change is if users get turned off so badly that they tune out. We started to see some hints of this with Facebook after the Cambridge Analytica scandal last year. While Facebook had been caught violating users’ privacy dozens of times, the mere hint that a political consultancy might have used Facebook data to help elect Trump (although this is far from proven) set people off. Congress conducted hearings. Further privacy scandals got attention they never used to. People noisily deleted their accounts. Growth has largely stalled in the U.S., and younger users are abandoning the platform, although this might be more because of changing fashions and faddishness than any reaction to the scandals facing it. (Anyway, kids are still flocking to Facebook-owned Instagram.)

      There has to be a large public backlash before things will change.

    2. But tech platforms that rely on user-generated content are protected by the 1996 Communications Decency Act, which says platform providers cannot be held liable for material users post on them. It made sense at the time — the internet was young, and forcing start-ups to monitor their comments sections (remember comments sections?) would have exploded their expenses and stopped growth before it started. Even now, when some of these companies are worth hundreds of billions of dollars, holding them liable for user-generated content would blow up these companies’ business models. They’d disappear, reduce services or have to charge fees for them. Voters might not be happy if Facebook went out of business or they suddenly had to start paying $20 a month to use YouTube.

      Maybe we have to.

    3. But it’s not practical — and may be physically impossible — to hire enough screeners to catch every violation, or to screen every piece of content before it’s posted instead of after. They are investing in computer algorithms and artificial intelligence as well, and these programs do work — there’s almost no porn or nudity on YouTube or Facebook, for instance — but they’re not 100 percent effective, especially for altered videos or political content.

      Will AI help us with this?

    4. friction-free way to upload and share whatever they wanted. As users uploaded masses of words and links and hours of video, these platform companies amassed huge audiences, then sold ads against them. When people can share whatever they want, these platforms turn into a mirror image of the human psyche — including the ugly parts.

      Does it have to be this way? How do other communities moderate one another?

    5. The problem is the entire business model around user-generated content, and the whack-a-mole game of trying to stay one step ahead of people who abuse it.

      User-generated content must be moderated, but by whom?