20 Matching Annotations
  1. Nov 2023
    1. If there is any hope for our ability to understand what really happens on social media next year, it may come from the European Union, where the Digital Services Act demands transparency from platforms operating on the continent. But enforcement actions are slow, and wars and elections are fast by comparison. The surge of disinformation around Israel and Gaza may point to a future in which what happens online is literally unknowable.

      Zuckerman mentions the DSA as his single hope, the only surprise in this piece. Although the DMA matters for the silos too, as does the GDPR, it is the DSA that contains the transparency requirements, and that actually makes mandatory the outside research access Zuckerman sees frustrated. He says enforcement is slow, however. True, yet it isn't just reactive enforcement: it's a condition of EU market access, and pro-active disclosures are mandatory.

  2. May 2023
    1. https://web.archive.org/web/20230507143729/https://ec.europa.eu/commission/presscorner/detail/en/ip_23_2413

      The EC has designated the first batch of VLOPs and VLOSEs under the DSA.

      A consultation on data access for researchers is open until 25 May. t:: need to read Article 40 on this access more closely. There is a lot of conspiracy talk around it re censorship; what does the law actually say?

  3. Nov 2022
    1. The last thing Europe wants is its regulation that restricts future innovation, raising barriers to entry for new businesses and users alike. 

      Which is why the DSA and DMA target larger entities, well beyond start-up scale.

    2. There is no central authority or control that one could point to and hold responsible for content moderation practices; instead, moderation happens in an organic bottom-up manner

      This is, I think, an incorrect way of picturing it. Moderation isn't bottom-up; that framing again implies seeing the fediverse as a whole. Moderation takes place in each 'shop' in a 'city center', and every 'shop' has its own house rules. That is the only level of granularity that counts: the system as a whole isn't a system entity, just as road systems, e-mail, postal systems and internet infrastructure aren't either.

    3. Since moderation in major social media platforms is conducted by a central authority, the DSA can effectively hold a single entity accountable through obligations. This becomes more complex in decentralized networks, where content moderation is predominantly community-driven.

      Does it become more complex in federation? I don't think so, because decentralisation also means that the reach and impact of each of those small instances is by definition limited. Most of the fediverse will never see most of the fediverse. Thus it likely flies under any ceiling that incurs new responsibilities.

    4. what will it mean if an instance ends up generating above EUR 10 million in annual turnover or hires more than 50 staff members? Under the DSA, if these thresholds are met the administrators of that instance would need to proceed to the implementation of additional requirements, including a complaint handling system, cooperation with trusted flaggers and out-of-court dispute bodies, enhanced transparency reporting and the adoption of child protection measures, as well as the banning of dark patterns. Failure to comply with these obligations may result in fines or the geo-blocking of the instance across the EU market. 

      More than 50 staff or EUR 10M turnover for a single instance (mastodon.social runs on some 50k in donations)? I don't see that happening, and if it did, how likely is it that it would happen in the European market? Where would such turnover come from anyway? It isn't adverts, so it could only be member fees, as donations don't count. Currently it's the hosters that make money, for keeping the infra humming.
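
      The threshold logic in the quoted passage can be sketched as a simple check. A minimal sketch only: the EUR 10 million turnover and 50-staff figures come from the quoted text; the function and variable names are my own illustration, not terms from the DSA.

      ```python
      # Sketch of the exemption ceiling described in the quoted passage:
      # an instance exceeding EUR 10M annual turnover OR 50 staff would
      # face the additional 'online platform' obligations (complaint
      # handling, trusted flaggers, transparency reporting, etc.).
      # Figures are from the quote; names here are illustrative.

      TURNOVER_CEILING_EUR = 10_000_000
      STAFF_CEILING = 50

      def extra_obligations_apply(annual_turnover_eur: float, staff: int) -> bool:
          """True if an instance would outgrow the small-enterprise exemption."""
          return annual_turnover_eur > TURNOVER_CEILING_EUR or staff > STAFF_CEILING

      # mastodon.social's roughly EUR 50k in donations, run by a handful
      # of people, stays far below either ceiling:
      print(extra_obligations_apply(50_000, 5))  # False
      ```

      Which illustrates the point above: volunteer-run instances sit orders of magnitude under both ceilings, so the extra obligations are unlikely ever to bite.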

    5. Today– given the non-profit model and limited, volunteer administration of most existing instances– all Mastodon servers would seem to be exempt from obligations for large online platforms

      Almost by definition, federated instances don't qualify as large platforms.

    6. However, based on the categorizations of the DSA, it is most probable that each instance could be seen as an independent ‘online platform’ on which a user hosts and publishes content that can reach a potentially unlimited number of users. Thus, each of these instances will need to comply with a set of minimum obligations for intermediary and hosting services, including having a single point of contact and legal representative, providing clear terms and conditions, publishing bi-annual transparency reports, having a notice and action mechanism and, communicating information about removals or restrictions to both notice and content providers.

      Mastodon instances, other than personal or closed ones, would fall within the DSA, but each instance is its own platform. Because of that I don't think this holds up very well: are closed Discord servers platforms under the DSA too, then? Most of these instances are small, and many don't encourage new users, meaning the potential reach is very limited. For larger ones like mastodon.nl this probably does apply.

  4. Oct 2022
    1. The law in question is the EU’s Digital Services Act (DSA), which was passed by the European Parliament last July 5th amidst almost total indifference

      Total indifference? The moment of the law's passing in July was the end of years of discussion and debate. The proposal was published at the end of 2020 and had been years in the making by then. It is also a continuation of pre-existing regulation. Things only come suddenly if you haven't been paying attention to the process.

    2. Again, there is no need to enter into the tortuous details of the legislative text to show this

      This is BS. There is every need to base yourself on the legislative text itself. There's nothing tortuous about that text.

  5. Jan 2022
  6. Apr 2021
  7. Apr 2019