
Mainstream social media platforms could face limits on their ability to take down independent journalism that violates their terms and conditions under a proposal agreed by European Union lawmakers yesterday.
In a vote Tuesday, the European parliament set its negotiating position for upcoming talks with the Council on the bloc’s draft Media Freedom Act — taking aim at what MEPs called “arbitrary decisions by big platforms.”
The text adopted by MEPs expands on the European Commission’s original proposal by setting out a requirement for larger platforms (i.e., very large online platforms, or VLOPs, with more than 45 million regional active monthly users) to give media services providers a heads-up of a planned takedown of their content — giving them 24 hours to reply to the objections before any restriction or suspension is imposed.
The original Commission text merely urges these platforms to consider freedom and pluralism of media, act diligently and be transparent when they exercise editorial responsibility — that is, by taking down journalism they deem incompatible with their terms and conditions — and then, after the fact, provide an explanation of their actions to media service providers “as early as possible.”
“To ensure that content moderation decisions by very large online platforms do not negatively affect media freedom, MEPs call for the creation of a mechanism to manage content takedown orders,” the parliament wrote in a press release. “According to MEPs, platforms should first process declarations to distinguish independent media from non-independent sources. Media should then be notified of the platform's intention to delete or restrict their content alongside a 24-hour window for the media to respond. If after this period the platform still considers the media content fails to comply with its terms and conditions, they can proceed with deleting, restricting or referring the case to national regulators to take the final decision without delay. However, if the media provider considers that the platform's decision does not have sufficient grounds and undermines media freedom, they have the right to bring the case to an out-of-court dispute settlement body.”
In the upcoming trilogue talks, the Commission and the bloc’s co-legislators, the parliament and the Council, will need to negotiate to find a compromise on a final text, so the shape of the law is not yet set in stone. And it remains to be seen whether the parliamentarians’ push for the Act to go further in safeguarding media from arbitrary decisions by larger platforms stands or falls.
The parliament vote was a fairly substantial one in favor of the amended file — with 448 votes in favor versus 102 against (and 75 abstentions).
The Commission proposed the Media Freedom Act back in September 2022. The bloc’s lawmakers argue legislation is needed to protect media pluralism and independence in the modern era in light of a variety of growing pressures on the sector — including in relation to the digital transformation of the media industry.
Since then it’s fair to say we’ve seen a rise in highly visible arbitrary decisions, in the wake of Elon Musk’s takeover of Twitter (now X). Last year, the billionaire owner of the social media platform banned a number of journalists who had written about him — as it turned out because he was unhappy they had reported on an account that tweeted the location of his private jet. That action earned him a swift rebuke from the EU, which dubbed the arbitrary suspensions “worrying” — pointing back to the Media Freedom Act as being intended to reinforce the bloc’s protections for media and fundamental rights in such scenarios.
The public rebuke didn’t stop Musk. He has continued to target traditional media during his erratic turn in charge of X, announcing a plan to stop displaying headlines on news articles this summer, for example (most likely with his eye on trying to evade making copyright payments to news publishers for displaying snippets of their content); and throttling the load time of links on the platform to New York Times and Reuters articles, as well as to competing social networks.
Prior to Musk, legacy Twitter also had some of its own run-ins with the media, of course. Such as its controversial decision three years ago to block the sharing of links or images related to a New York Post article about claimed emails by Hunter Biden found on a laptop — which led to it amending its hacked material policy. Facebook also restricted sharing of the Hunter Biden laptop story at a time when concerns about disinformation targeting the U.S. elections were riding high.
But Musk’s actions at the helm of Twitter/X vis-à-vis journalists and media firms have seemed far more arbitrary and/or driven by a personal dislike of traditional media. That dislike, combined with apparently limitless resources to spend on taking arbitrary actions regardless of whether they harm user trust and advertiser confidence, doesn’t bode well for access to independent journalism on X. So the bloc’s legislative move looks timely. Albeit, whether the planned law will prove effective at reining in Musk is another matter.
X under Musk is charting a reckless collision course with the EU over the Digital Services Act (DSA), the confirmed pan-EU law that designates the aforementioned VLOPs — regulating how these larger platforms (including X) should respond to reports of illegal content and other issues, as well as obligating them to assess and mitigate systemic risks like disinformation.
Musk’s response to this existing pan-EU law — which carries penalties of up to 6% of global annual turnover for breaches, and even the risk of a service being blocked in the region — has so far amounted to him thumbing his nose at regulators. Examples include Musk slashing headcount in key areas, including content moderation, trust and safety and election integrity; ending policy enforcement on COVID-19 disinformation; removing certain mainstream disinformation reporting tools for users; and pulling the platform out of the bloc’s Disinformation Code (which is linked to DSA compliance).
Musk is also fond of posting/amplifying disinformation and conspiracy theories himself. And he has encouraged hateful follower pile-ons of people he takes a dislike to, including the former Twitter head of trust and safety, Yoel Roth. (Or, more recently, a California man who is suing him for defamation — accusing Musk of spreading false claims about him.)
So whether an adjunct to the existing EU content moderation law can convince Musk to bend to the bloc’s rulebook looks questionable. Although reining in Big Tech’s most erratic and deep-pocketed chief is likely to be a regulatory marathon (grit, stamina, strategy, etc.), not a sprint.
Major platforms, meanwhile, generally remain opposed to the parliament’s proposal to give media firms notice of takedowns of content that violates their T&Cs. But of course tech platforms aren’t renowned for backing checks on platform power.
Following yesterday’s vote by MEPs to affirm their negotiating mandate on the Media Freedom Act, Big Tech lobby organization, the Computer & Communications Industry Association (CCIA), hit out at the “media exemption” — framing it as “controversial” and claiming the provision risks enabling rogue actors to spread disinformation. “This is a major setback in the fight against disinformation,” claimed CCIA Europe's senior policy manager, Mathilde Adjutor, in a statement. “The media exemption will empower rogue actors, creating new loopholes to spread fake news rather than fixing anything. We can only hope this disinformation loophole will be closed during the trilogue negotiations between the EU institutions.”