#IJF18 WebMagazine
INTERNATIONAL JOURNALISM FESTIVAL
Perugia, Italy, 11-15 April 2018

ETHICS

April 14, 2018

Algorithms: Confronting the ethical dilemmas

Thursday afternoon at the IJF saw an intense discussion about the present and future of algorithms for the media and beyond. The use of algorithms is widespread and growing, and so are the ethical dilemmas that come with it.

Two years ago the word ‘algorithm’ remained foreign to many, even within the media industry. Today, concern about the power of algorithms to dictate certain aspects of our lives is growing in public discourse, although the broader ethical threats remain largely undebated. From biases found in the algorithms used by credit-scoring companies, to the terrifying development of a ‘social credit system’ in China, the potential for this still fairly new technology to affect more than just our online lives is very real.

The IJF team caught up with panellist Christina Elmer, head of data journalism at Der Spiegel, to find out more about the publication’s investigations and what the future of algorithms could hold.

IJF: You discussed how Der Spiegel has been using algorithms in its work, but have you been able to measure the broader implications of the algorithms used by companies such as Facebook and Google?

Elmer: Yes, we did a few projects together with different partners – we teamed up with ProPublica, Süddeutsche Zeitung and NDR to uncover Facebook ads, including the hidden ads that were shown only to some users during the political campaign. Then we did a project on the Google search algorithm together with Algorithm Watch in Germany, and we are in charge of a project on the SCHUFA scoring system, which is the most important scoring system in Germany for private individuals, giving the probability that they will pay their credit back. And this is really so important for everyone – everyone knows it – and this is a project together with Algorithm Watch again, and with the OKFN – the Open Knowledge Foundation. So these are all projects we took part in, and we joined forces with these partners to reach a broader audience and to get all these experts on board. Because our data team is not that big, and we don’t have [experts] like ProPublica do in our newsroom, we have to collaborate on every project. But this is the best way, I think.

What do you think is the biggest ethical threat posed by algorithms – and do you think Europe could ever follow the lead of China in creating a sort of ‘social credit system’?

Elmer: I am really afraid that we won’t get the public to debate this topic intensively enough, and that we will get into a situation like we see in China, where algorithms really shape what society is like for everyone, and then we lose control. As I see it, this influence in China is really something that could not be rolled back or taken back with any effort, because it shapes how people are living together, how they are communicating, everything.

My biggest fear in this context is that we lose the ability to remain in control. And it’s great to have examples such as the self-driving car, where everyone can follow the questions that are on the table, and everyone has a gut feeling about them. The outcome is really important because it’s about life and death. So that’s the perfect example with which to develop a great debate and discourse about what we really want from these algorithms and how we can remain in control.