Progress Thoughts with Mirja Telzerow


About the format:

"Fortschrittsgedanken" appeared weekly in 2020 on the FAIR.nrw blog and presents the opinions of various experts from research and practice on questions that always remain the same. The primary topic is algorithms in personnel selection. FAIR was a project led by CASE and the university of cologne. opinionbarometer


About the author:
Mirja Telzerow is Director HR & Operations at Kearney - a global consulting firm with 4,000 employees in 40 countries. Every year, over 150 new colleagues start in Germany, Switzerland and Austria. Kearney is working on many fronts to integrate tools and apps into the entire HR process chain and thus improve its services for internal and external clients. Well over 5,000 applications are received each year for the German-speaking units, so recruiting in particular is a topic of high importance.

What opportunities do algorithms offer in personnel selection and where are the risks?
Time and again, personnel managers and, of course, business unit managers are accused of "bias", i.e. a certain prejudice when deciding on new hires. Is that surprising? Of course not: everyone has decision-making patterns that have been shaped by upbringing and experience. These influence how we see people and how we judge them in the first moment. First impressions count, it is often said, and they always pass through a pre-coloured lens. This also applies to new hires, and it starts with the first reading of the CV. There are mental plus points if the candidate attended the same university or plays the same sport. Subconsciously, but clearly: the candidate is like me, so she must be good. Of course, HR departments in all companies put a lot of effort into training to become aware of precisely this bias and to remove it from the decision. Does it work? To a certain extent. And here we come to the great opportunity of algorithms. The goal is rarely that all team members have attended the same university, but that everyone performs very well. Algorithms can therefore help to make a neutral preliminary decision as to which candidate(s) this could be. The risk lies in the algorithm itself and in how it is designed: if an organisation does not pay attention to this, it will simply reproduce itself again and again through the algorithms it has chosen.

What is the impact of discrimination in personnel selection?
Avoiding any discrimination in personnel selection has been one of the main topics in HR for many years. Many approaches have been designed and tested in this regard - if only because of the Anti-Discrimination Act. When discrimination is mentioned, gender or nationality usually come to mind first. These can usually be neutralised relatively easily in the personnel selection process by HR blacking out the relevant fields before applications are passed on. Beyond that it becomes more difficult, and that is where discrimination can creep in: school, hobbies, university are all potential starting points for discrimination.
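
The blacking-out step described above can be illustrated with a small sketch. This is a hypothetical Python example, not Kearney's actual process; the field names and the list of protected attributes are assumptions for illustration.

```python
# Minimal sketch of anonymised pre-screening (hypothetical field names).
# Protected attributes are blacked out before the application is passed on,
# so that reviewers and screening tools never see them.

PROTECTED_FIELDS = {"name", "gender", "nationality", "date_of_birth", "photo"}

def redact(application: dict) -> dict:
    """Return a copy of the application with protected fields blacked out."""
    return {
        key: ("[REDACTED]" if key in PROTECTED_FIELDS else value)
        for key, value in application.items()
    }

if __name__ == "__main__":
    application = {
        "name": "Jane Doe",
        "gender": "female",
        "nationality": "German",
        "university": "University of Cologne",
        "degree": "M.Sc. Economics",
        "years_of_experience": 3,
    }
    print(redact(application))
```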

Can algorithms help to reduce discrimination in personnel selection?
Algorithms can indeed help to reduce discrimination that stems from unconscious bias. This requires an open discussion of the issues and a clear will at management level to reduce them. On the other hand, if algorithms are used incorrectly, discrimination can also be exacerbated. It is up to the HR department to design the algorithms with appropriate care.
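
Whether an algorithm exacerbates discrimination can be monitored with simple checks. The following is a minimal sketch in Python, assuming the HR team keeps a log of which applicants were pre-selected and, separately from the screening itself, which group they belong to; it compares selection rates between groups in the spirit of the common four-fifths rule of thumb for adverse impact. The data format is an assumption for illustration.

```python
# Minimal sketch of an adverse-impact check (hypothetical data format).
# For each group, compute the share of applicants the algorithm pre-selects,
# then compare each group's rate against the best-performing group.

from collections import defaultdict

def selection_rates(records):
    """records: iterable of (group, selected) pairs -> {group: selection rate}."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in records:
        total[group] += 1
        selected[group] += int(was_selected)
    return {group: selected[group] / total[group] for group in total}

def adverse_impact_ratios(records):
    """Ratio of each group's rate to the highest rate (values below ~0.8 warrant review)."""
    rates = selection_rates(records)
    best = max(rates.values())
    if best == 0:
        return {group: 0.0 for group in rates}
    return {group: rate / best for group, rate in rates.items()}

if __name__ == "__main__":
    log = [("group_a", True), ("group_a", False), ("group_a", True),
           ("group_b", False), ("group_b", False), ("group_b", True)]
    print(adverse_impact_ratios(log))
```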

In concrete terms: What is the ideal process for personnel selection?
Ideally, personnel selection combines both approaches. In a first step, algorithms are used to assess general suitability and to reduce the possible bias of individual decision-makers as well as of the organisation as a whole. The next step is to see whether the candidate fits into the team and the organisation, and at this stage I would leave the decision to the future team members and supervisors. An essential part, and also a benefit, of new hires - especially in teams in high-performance organisations - is that they always prompt the team to question and recalibrate itself.
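
The two-stage combination described here can be sketched in a few lines. This is a hypothetical Python example under the assumption that the algorithmic step only produces a shortlist from an arbitrary suitability score, while the final decision stays with the future team; the scoring function and data fields are placeholders.

```python
# Minimal sketch of a two-stage selection process (hypothetical scoring function).
# Stage 1: an algorithm ranks applications and produces a shortlist.
# Stage 2: the shortlist is handed to the future team, which makes the decision.

def shortlist(applications, score, k=10):
    """Rank applications by an algorithmic suitability score and keep the top k."""
    return sorted(applications, key=score, reverse=True)[:k]

def team_decision(candidates, interview_feedback):
    """Final choice stays with team members and supervisors, based on interviews."""
    return [c for c in candidates if interview_feedback.get(c["id"]) == "hire"]

if __name__ == "__main__":
    apps = [{"id": 1, "years_of_experience": 5}, {"id": 2, "years_of_experience": 2}]
    top = shortlist(apps, score=lambda a: a["years_of_experience"], k=2)
    print(team_decision(top, interview_feedback={1: "hire", 2: "no"}))
```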
