
To Catherine Martin, Minister for Arts, Culture, Media and Sports

CC: Oireachtas Committee Members for Arts, Culture, Media and Sport

Dear Committee members,

As Uplift members, we are writing to you as you consider amendments to the Online Safety and Media Regulation Bill over the coming weeks.

You have an opportunity in front of you to ensure that the Online Safety and Media Regulation (OSMR) Bill makes our online spaces safer for everyone, no matter who we are. You have a real opportunity to protect our democracy, communities and privacy, and we urge you to use your powers to rein in ‘big tech’ corporations whose relentless pursuit of profit has had far-reaching impacts on our lives.

Regrettably, the Bill in its current form, though laudable in its ambitions, is not fit for purpose and will not achieve its stated aims. It poses a risk to what everyday people can say online and how we express ourselves. Though this may not be the intention, the vague and poorly defined concept of “harmful content” could curtail freedom of expression and communication.

Uplift members are concerned that the OSMR Bill does not adequately address the root issue: it is the business model of huge profit-driven tech companies that poses a serious risk of societal harm in the online environment.

Below we have outlined four issue areas where we think improvements could be made. Fundamentally, we, and hopefully you, want a Bill that creates safer online spaces for us all, where we can express ourselves freely; spaces free from manipulation, free from surveillance advertising, and free from extremism and hate speech for profit; and one that reins in big tech platforms, with real people at the heart of decision-making.

Turn off the manipulation machine:

The OSMR Bill focuses largely on how content is delivered in an online environment; however, it does not get to the root of the issue and fails to put serious checks on the systems that drive online harms. The harvesting of personal data used for profiling, targeting and curation of content in the form of recommender systems is a systemic issue. It may drive discrimination, undermine democracy and curtail freedom of expression. [1]

The Bill fails to adequately address how Big Tech’s toxic recommender systems and algorithms are amplifying hate speech and disinformation, weaponising every societal fault line with relentless surveillance to maximise “engagement”. [2]

The Bill provides for a Media Commission, a state-appointed body charged with regulating online spaces and, more crucially, what is said in online spaces. Within the scope of ‘online content’, the Commission will be able to issue a notice to remove, disable or limit access to specific content and/or apply to the High Court to block access to a service. This could enable the removal of speech that is not illegal. [3]

The removal of speech that is not illegal in an offline setting is deeply problematic and can curtail freedom of speech through self-censorship, with consequent chilling effects. The Digital Services Act (DSA), the EU’s landmark overhaul of digital regulation, has taken a more systemic approach, one which does not prescribe what type of content must be restricted or removed but instead places obligations on very large online platforms with regard to processes, transparency, recommender systems and the detection, flagging and removal of illegal material. [4] We strongly urge that the OSMR Bill take the same approach.

Stop surveillance for profit:

Protecting people online means putting an end to surveillance advertising that people never asked for. The use of digital services must not be conditional on acceptance of surveillance and profiling.

There have been amendments to the Bill which have brought greater protections around ‘harmful’ advertising, such as infant formula advertisements and advertising of junk foods high in fat, salt and sugar targeted at children. But these protections can go further. A public opinion poll funded by Uplift members found that more than 70% of respondents believed the targeting of people by social media companies based on gender, sexual orientation, race and political/religious views should be banned. [5]

The DSA puts limits on surveillance advertising by banning online platforms from using minors’ data and sensitive data to serve users with targeted ads. However, it stopped short by watering down its rules. The limits on ad targeting do not apply to ad tech giant Google’s hugely profitable system of display advertising. And though the DSA was hailed as protecting users from malicious or deceptive design, otherwise known as ‘dark patterns’, last-minute amendments drastically weakened these provisions. [6]

Put people back in charge:

Stronger powers are needed for regulators to hold Big Tech to account, including robust audit powers that cannot be gamed by the companies. The Media Commission will need to have adequate oversight and enforcement mechanisms with teeth to hold Big Tech corporations to account.

In its current form, the OSMR Bill does not adequately address the risk of societal harms from the online environment, which is what concerns Uplift members. A mechanism by which big tech corporations must perform risk assessments on various aspects of online safety, ranging from democratic integrity to freedom of expression to child protection, is important for accountability. A requirement to publish such risk assessments in a publicly accessible database should also be considered.

Protect freedom of expression and privacy rights:

The Online Safety and Media Regulation Bill poses a risk to freedom of expression and privacy rights. [7] As has been highlighted by the Irish Council for Civil Liberties and Digital Rights Ireland, the definition of “harmful online content” in Section 139A, Chapter 1 of Part 8A, is ‘hazardously vague’. [8]

In essence, it is dangerous for a state body to ultimately decide what can and cannot be said online by deciding what is and is not harmful. We urge Committee members to consider how the inclusion of this vague definition may harm how people express themselves, online or otherwise.

[1] People vs Big Tech: The People’s Declaration
[2] [3] [5] Irish Council for Civil Liberties submission on the Online Safety and Media Regulation Bill, September 2022
[4] Global Witness: Brussels take on Big Tech Lobby: The Good, the Bad and the Ugly, and https://corporateeurope.org/en/2022/04/big-techs-last-minute-attempt-tame-eu-tech-rules
[6] Ireland Thinks Poll: On privacy and targeted advertising
[7] Digital Rights Ireland submission on the Online Safety and Media Regulation Bill
[8] Irish Council for Civil Liberties, Bill to regulate online harmful content ‘damages’ constitutional rights
