The Digital Services Act is a transparency machine. Platforms have to submit a report every six months describing their content moderation activities in the EU. The first reports are in, and there's a wealth of information in there. Thread
First up, X/Twitter. In terms of total numbers, account suspensions are by far the most used measure (2 million), followed by restricting reach (90k) and removing content (54k). transparency.twitter.com/dsa-transparen…
Fascinating chart detailing why accounts were suspended. If you take away violations of Twitter's policy on spam and platform manipulation, not a great deal seems to be happening. help.twitter.com/en/rules-and-p…
If we look at why content was removed, it's clear that the DSA isn't exactly a 'censorship machine'.
France and Germany are responsible for the bulk of information requests under art. 10 DSA.
Linguistic expertise of Twitter's EU content moderation team. Will this change in light of the 2024 elections in Finland, Lithuania, Moldova, Romania and Slovakia?
Monthly active users in the EU
Up next: TikTok (definitely a better layout 🤫). TikTok doesn't give the same type of breakdown per Member State, unfortunately. sf16-va.tiktokcdn.com/obj/eden-va2/f…
TikTok definitely does have more human moderators than X (6k). Moderation of content in Irish and Maltese is, just like with X, almost non-existent.
The number of information requests TikTok receives is roughly 25% of the figure for X (452 compared to Twitter's 1,728). TikTok, please provide the aggregated figures!
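A quick sanity check of that ratio, using only the two figures quoted above (452 requests for TikTok versus 1,728 for X/Twitter):

```python
# Information requests per the DSA transparency reports cited in this thread.
tiktok_requests = 452
twitter_requests = 1728

# TikTok's requests as a share of X/Twitter's.
share = tiktok_requests / twitter_requests
print(f"{share:.0%}")  # → 26%
```

So "roughly 25%" checks out (it is about 26%, i.e. just over a quarter).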
TikTok's average number of 'monthly active recipients' in the EU, broken down per Member State, during the period 1 April 2023 to 30 September 2023, rounded to the nearest hundred thousand. A total of 135.9 million.
Snap gives a breakdown of its content moderators globally. Hm. Clearly not good enough, Snap.
Snap also just gives links to its previous global transparency reports. Again, not what's being asked, Snap. Snap in general doesn't seem to be very far along in its DSA implementation; see also the lack of implementation of article 40.13, for instance.
As of 25 August 2023, LinkedIn had approximately 820 content moderators globally and 180 content moderators located in the EU. Why mention you have 0 moderators in Czech or Danish, but not mention you have 0 in Slovak or Lithuanian?
Interesting data for LinkedIn
Continued
So far, LinkedIn is the only company providing this stat. Very interesting.
LinkedIn permanently suspended only 2,047 accounts and received a grand total of 0 requests from governments to remove content. It also receives significantly fewer information requests from governments.
Time for the first search engine: Bing! 119 million average monthly users in the EU. "This information was compiled pursuant to the Digital Services Act and thus may differ from other user metrics published by Bing". Hm. query.prod.cms.rt.microsoft.com/cms/api/am/bin…
This is otherwise a very short report compared to the others. Bing received 0 orders from EU governments to remove content. One interesting nugget: Bing took voluntary action to detect, block, and report 35,633 items of suspected CSAM content provided by recipients of the service.
Pinterest! Pinterest had 124 million monthly active users (MAU) in Europe. For some reason, Pinterest was only able to collect data for one month (as opposed to two for the others). policy.pinterest.com/en/digital-ser…
Very interesting stat on the quality of automated moderation tools.
LinkedIn content moderators - same pattern as the others. More when I'm back!
Today the European Commission proposed how Art. 40 of the Digital Services Act (#DSA) could work in practice. In a worldwide first, this article in the DSA mandated very large platforms to grant researchers access to a wide range of previously undisclosed data. Some key points👇
This Delegated Act (DA) supplements the DSA and lays down - to a certain extent - some of the technical conditions and procedures that need to be followed before access can be granted. This is a draft - comments can be submitted until the 26th of November digital-strategy.ec.europa.eu/en/news/commis…
(1) The DA mandates every platform (“data providers” in the lingo) to establish a “data inventory” (art. 6.4) - a version of a codebook. These will serve as good starting points for developing research questions, yet they don’t capture the full scope of what might be accessible.
The European Parliament just confirmed @vonderleyen as the next president of the European Commission for the next five years. What did she promise to do on tech policy, in particular vis-a-vis large platforms? Seven highlights below 👇🧵
The one-liner: "Tech giants must assume responsibility for their enormous systemic power in our society and economy." Even more, VDL stated that:
On the agenda for this mandate: (1) A "wide-scale inquiry" into the effects of social media on the well-being of young people, as well as an (2) "action plan against cyberbullying". It's unclear what legislative action this would entail, if any.
Some political & legal observations on what I think is the most important European Commission enforcement action to date under the Digital Services Act, against Meta for failing to take effective measures to prevent the spreading of (Russian) disinformation on their platforms 👇
The Commission @DigitalEU suspects an infringement of the DSA and is hence launching proceedings under the DSA against Meta for 4 distinct, yet related, reasons. ec.europa.eu/commission/pre…
The EC claims Meta has not done enough to (1) take action against deceptive advertisements and disinformation. This is very clearly linked to the ongoing #doppelganger saga, which has been running for more than two years now politico.eu/article/russia…
The European Commission just released a groundbreaking study that for the first time provides a methodology to assess a systemic risk (in this case: Russian disinformation campaigns) as foreseen by the Digital Services Act 🧵 (1/8) #dsa #DigitalServicesAct op.europa.eu/en/publication…
Its headline findings speak volumes (2/8)
But its methodological aspects are even more important, as this is the first attempt to operationalize in detail a ‘systemic risk’ as defined by the DSA in article 34 in one very specific context (p9, 3/8).
Four important studies were published in @ScienceMagazine and @nature which are based on unprecedented privileged access to Facebook and Instagram data. This thread collects commentaries about the studies.
1. Asymmetric ideological segregation in exposure to political news on Facebook https://t.co/yvNvHpPt0X science.org/doi/full/10.11…
2. How do social media feed algorithms affect attitudes and behavior in an election campaign? https://t.co/dPdzlnFfVg science.org/doi/full/10.11…
Google Search, Youtube, Facebook, Instagram, Twitter, TikTok, Microsoft Bing and Linkedin make significant new commitments to provide access to data for researchers in EU Code of Practice on Disinformation today ec.europa.eu/commission/pre… 🧵
1. These signatories explicitly commit to provide automated access to non-personal data & anonymised, aggregated, or manifestly made public data. This is potentially huge: it could entail the development of a CrowdTangle-like platform for all these companies @brandonsilverm @persily
2. These companies also commit to fund and cooperate with a future independent third party body that can vet researchers and research proposals - as recommended by the EDMO Working Group on Access to Data @RebekahKTromble