

EU leading fight against misinformation – Little
Pic: RollingNews.ie

20 May 2022 | Human rights

Former journalist Mark Little (pictured) has said that the need for moderation of digital content to protect human rights has never been greater.

He also warned, however, that the problem of disinformation could not be solved if the systems underlying how social-media platforms operate remained unregulated.

In a lecture on digital misinformation (18 May) organised by the Law Society's Human Rights and Equality Committee, Little said that Europe was "providing the lead" in such regulation, with measures such as the Digital Services Act (DSA).

Little founded social-media news agency Storyful in 2010, and is co-founder of Kinzen, which provides services aimed at identifying harmful content.

'Safety by design'

He said that the European approach tried to look at the tech giants' underlying systems and to target harmful practices built into social-media business models.

The EU was also aiming to ensure that all platforms had a commonly agreed way of approaching illegal content, and trying to make platforms accountable for the content decisions made by their algorithms.

There was a lack of oversight of how algorithms worked, he argued.

Little described the EU approach, which he called 'safety by design', as "powerful", since it held that making underlying systems more transparent and accountable was now the most effective form of regulation.

He added, however, that the DSA was not perfect, expressing concerns about the ability to declare "emergency conditions" where limitations could be imposed on platforms.

'Lawful but awful'

Little argued, however, that the EU measures were better than much of the online-harm regulation emerging in places such as Australia, Canada, and the UK.

The former journalist also pointed out that the EU legislation dealt only with illegal content, and not with equally damaging content that was "lawful but awful".

He said that the main problem with such content was trying to define it, while there were also risks of "disproportionate remedies", such as jailing platform executives.

Some online-harm legislation was trying to do away with anonymity, which was essential to some voices of dissent in places such as Russia, he warned.

"Safety by decree has many flaws", said Little, adding that unintended consequences needed to be thought through.

He also told the audience that some countries were using terms such as 'fake news' or 'disinformation' as a reason to pass "very draconian" laws.

Weapon

The Storyful founder described his initial excitement, as a "journalist in the old school", at the emergence of new forms of communication, citing their positive impact in events such as the Arab Spring, and in helping protesters in countries such as Iran.

He believed, however, that the internet had since been turned into a weapon by enemies of democracy through the spread of false information.

Little said that Storyful had not taken account of the role of the tech giants' underlying systems in helping to drive such misinformation.

"The underlying business model of social media relies on advertising; it is designed to accelerate the spread of emotion, outrage, and happiness. This is not just a bug in the system, that's the way it was originally designed," he stated.

Codes of practice

Little referred to the current "sheer over-abundance of information", which was starting to result in the feeling that people did not know what was true and what was not.

"They don't have to convince us that this is true; they just have to convince us that everyone is lying", he said of attempts by governments or other actors to spread disinformation.

The Kinzen co-founder said that platforms had to think about how to fight back without resorting to heavy-handed censorship.

He praised new codes of practice on content moderation being developed by platforms working with civil society, which put human rights at the centre of what he called "technological due process".

Little said that such codes must also ask for transparency on the technology that underlies any moderation, citing fears that platforms would resort to "blunt-force instruments", such as automated algorithmic filters, that could end up suppressing free speech.

Public media

Little told the audience that any machine-learning systems involved in content moderation must have a human being involved at every stage of the process.

He concluded by calling for a rethink on how we wanted to design a 'public square' for information and debate.

"Part of the problem is that we don't have enough public media," Little argued, calling for governments to invest not just in national organisations such as RTÉ or the BBC, but also in helping to fund people to report on their communities.

We should put effort into creating a form of public information that is regarded as a utility, and create within it the ability to offer counterpoints of view, he urged.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland
