Remember, my dear friends, …
… when we could follow content on, say, Facebook and Instagram in the order it was posted, rather than in the way “someone else” decided for us? Well …
Finally, we will be able to do so again! And that is partly my ‘fault’. As of today, all major online platforms in the European Union must adhere to rules that have quite rightly been made stricter, and a chronological feed is one of them.
Show me your ‘feed’ and I’ll tell you who you are. The online world is presented to us exactly as the algorithm dictates. Each of us receives a different presentation, based on our personal data that is sold to whoever is willing to pay the highest price. We are the product, and the algorithms know exactly who should be offered what. Unfortunately, they also know who is more susceptible to conspiracy theories, disinformation and deception.
With such a system it is easy to manipulate people and voters, to systematically spread conspiracy theories to sell illegal, harmful products, to shamelessly use hate speech, and even to incite violence, which, unfortunately, often results in violence in real life. This system is abused by those who exploit people’s fear, frustration and anger for their own particular interests; those who wage a culture war on the media and the ghosts of the past. This system (deliberately) undermines trust in experts and people with authority, reduces the quality of public debate and civility, and, worst of all, makes it impossible to tackle the really pressing problems affecting people.
The Digital Services Act was a necessary response to this system. It introduces obligations with respect to algorithmic transparency, indicating the origin of advertising, especially political advertising, and rectifying misleading information, and it sets clear rules for moderating content online.
I covered the report on this legislative act as rapporteur for my group, Renew Europe, in the Committee on Culture. It was therefore my task to table amendments, set the voting recommendations for the group and negotiate with the rapporteurs from other groups on the final text in the form of joint compromise amendments.
This was followed by inter-institutional negotiations, which finally led to the act’s adoption, making it binding for the entire European Union. What did I aim to achieve (and succeed in achieving) with my amendments to the text on advertising?
When you are shown a particular advertisement, the platform must inform you why that advertisement was chosen for you. Most importantly, you must not be subject to targeted advertising by default, UNLESS you have given your prior consent. So pay attention to the changes that Meta and TikTok (and everyone else) have announced to date, and decline to be shown content based on algorithms if you don’t want that. Now it is possible!
In addition, companies will have to ensure that targeted advertising using sensitive personal data such as sexual orientation, ethnicity or political opinions is not possible. Targeted advertising to minors is prohibited.
I also thought it was important that you have the possibility to make your own decisions about the data you share and to set the algorithm parameters yourself. In other words, that you can actually choose how the ‘world is presented’ to you.
Of course, in this, too, it is important to distinguish between legal, illegal and harmful content. Legal content should not be removed, and platforms should not be held responsible for removing it. Illegal content, however, is currently divided into three types: child sexual abuse material, terrorist content and copyright-infringing content. All of these are covered by specific legislation, and the Digital Services Act now provides horizontal rules.
I was strongly opposed to content being checked exclusively by automated systems, algorithms or artificial intelligence. That is why I added, throughout the text (and succeeded in my endeavour), the requirement that content blocking must be subject to human supervision.
Last but not least, an element that is particularly important in the context of my fight for the Slovenian language: I added that the online giants must respect the language of a Member State and employ moderators who speak its language, in our case Slovenian. I am sure these multinationals can afford to do that.
We need to know who is behind the content we see on social networks, and we must have the power to control our own identity.
Irena