Open Access
Know your algorithm: what media organizations need to explain to their users about news personalization
Author(s) - Max van Drunen, Natali Helberger, Mariella Bastian
Publication year - 2019
Publication title - International Data Privacy Law
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.371
H-Index - 20
eISSN - 2044-4001
pISSN - 2044-3994
DOI - 10.1093/idpl/ipz011
Subject(s) - personalization, counterfactual thinking, transparency (behavior), normative, social media, internet privacy, advertising, computer science, political science, sociology, social psychology, law
If the right to an explanation is to effectively safeguard users’ rights, it must be interpreted in a manner that takes into account the contextual risks algorithms pose to those rights. This article provides a framework of transparency instruments in the context of the news personalization algorithms employed by both traditional media organizations and social media companies. In this context, it is especially important to explain the impact of personalization on a user’s news diet and the role editorial values play in the algorithm. Conversely, explanations of individual decisions and counterfactual explanations face specific practical and normative barriers that limit their utility.
