09.03.2020

Dutch DPA warns against algorithms that lead to exclusion and discrimination

On his privacy blog, Aleid Wolfsen, chairman of the Dutch Data Protection Authority, warns against algorithms that lead to exclusion and discrimination.

Mr. Wolfsen explains that more and more public and private organizations use algorithmic systems to profile people and to determine which "treatment" they receive: for example, whether or not they are granted a mortgage, accepted as a customer or employee, or given a discount on an insurance policy. If an organization does not do so according to the principles of the GDPR - lawfully, fairly and transparently - there is a risk that it (unknowingly) contributes to the exclusion and discrimination of people. That is why "algorithms" will be one of the Dutch DPA's focus areas in the coming years.

In his blog, Mr. Wolfsen offers two recommendations for organizations:

  • Do you already use algorithms? Then make sure that this technology is not a black box for you. If in doubt, have the experts in your organization explain thoroughly which personal data your algorithm uses and how these lead to well-founded conclusions about people. The accountability principle of the General Data Protection Regulation (GDPR) helps you identify and reduce these risks.
  • Also make sure that you, as the controller, keep control over the connections the algorithm makes, for example by ensuring that human beings also review the process. Don't settle for "computer says no". Your customers, citizens and patients can expect you to be transparent about how your algorithm works, so that they too can keep a grip on their personal data and can defend themselves against an unfavorable decision made by your system.