3 Matching Annotations
  1. May 2019
    1. “On the ground in Syria,” he continued, “Assad is doing everything he can to make sure the physical evidence [of potential human-rights violations] is destroyed, and the digital evidence, too. The combination of all this—the filters, the machine-learning algorithms, and new laws—will make it harder for us to document what’s happening in closed societies.” That, he fears, is what dictators want.
    2. Google and Facebook break out the numbers in their quarterly transparency reports. YouTube pulled 33 million videos off its network in 2018—roughly 90,000 a day. Of the videos removed after automated systems flagged them, 73 percent were removed so fast that no community members ever saw them. Meanwhile, Facebook removed 15 million pieces of content it deemed “terrorist propaganda” from October 2017 to September 2018. In the third quarter of 2018, machines performed 99.5 percent of Facebook’s “terrorist content” takedowns. Just 0.5 percent of the purged material was reported by users first. Those statistics are deeply troubling to open-source investigators, who complain that the machine-learning tools are black boxes.
    3. “We were collecting, archiving, and geolocating evidence, doing all sorts of verification for the case,” Khatib recalled. “Then one day we noticed that all the videos that we had been going through, all of a sudden, all of them were gone.” It wasn’t a sophisticated hack attack by pro-Assad forces that wiped out their work. It was the ruthlessly efficient work of machine-learning algorithms deployed by social networks, particularly YouTube and Facebook.