Artificial Intelligence: Computers Conduct Job Interviews – Politics

Written by admin

In today’s job interview, the applicant does not speak with the boss or with someone from the HR department. The interview runs with a computer. A machine voice asks about the CV, about a professional success, about a pleasant personal experience – and then wishes the applicant a good day and hangs up.

What looks to some like a scene from a science-fiction film is already reality in many companies, including in Germany. IT firms, carmakers, insurers, sporting-goods manufacturers: they all use automated software to filter applications. Sometimes algorithms comb through CVs; sometimes chatbots conduct pre-selection interviews. The programs scrutinize applicants’ voices and choice of words and draw inferences about their suitability – who is resilient, how team-oriented, how curious – and narrow the field. The final decision on who gets the job still rests with the boss.

Artificial intelligence has long since arrived in working life, and some think that is a good thing. They rave about a future in which machines do the work for people – in production as well as in personnel selection. Some believe automated application processes offer a great opportunity for fairness: unlike humans, the software has no prejudices; it decides only on suitability, not on whether it likes an applicant’s face.

But doubts about this brave new world of work are mounting, fueled by a new report from the union-affiliated Hugo Sinzheimer Institute of the Hans Böckler Foundation, which is available to the Süddeutsche Zeitung. In it, Frankfurt University professor Bernd Waas examines whether labor law is ready for the advance of artificial intelligence – and comes to a clear conclusion, especially when it comes to personnel selection. “The danger of discrimination,” he writes, “is almost palpable.”

Disadvantages can be hidden in the algorithm

Waas is concerned that applicants could be disadvantaged indirectly; he speaks of “indirect discrimination”. A simplified example: the software might learn that a company employs an above-average number of people who say they enjoy watching football in their spare time. It could then apply the seemingly neutral criterion that applicants with a comparable hobby fit particularly well into the team. In effect, that would be gender discrimination, because there are significantly fewer female football fans than male ones.
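The mechanism behind such indirect discrimination can be sketched in a few lines of Python. The data and field names below are entirely hypothetical, invented for illustration: a filter that never consults gender at all can still select men and women at very different rates once it keys on a hobby that happens to correlate with gender.

```python
# Hypothetical applicant pool: (name, gender, hobby).
# Football fandom is deliberately skewed toward men here, mirroring
# the simplified example in the report.
applicants = [
    ("A", "m", "football"), ("B", "m", "football"), ("C", "m", "football"),
    ("D", "m", "chess"),    ("E", "f", "football"), ("F", "f", "reading"),
    ("G", "f", "painting"), ("H", "f", "chess"),
]

def neutral_looking_filter(applicant):
    """Selects on hobby alone -- gender is never consulted."""
    return applicant[2] == "football"

def selection_rate(gender):
    """Share of applicants of the given gender who pass the filter."""
    pool = [a for a in applicants if a[1] == gender]
    return sum(1 for a in pool if neutral_looking_filter(a)) / len(pool)

print(selection_rate("m"))  # 0.75 -- three of four men pass
print(selection_rate("f"))  # 0.25 -- one of four women passes
```

Even though the filter reads only the hobby field, the outcome is a 3:1 gap in selection rates between men and women – exactly the kind of disparity that, hidden in the middle of a learned model rather than four lines of code, would be far harder to spot.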

The General Act on Equal Treatment prohibits such discrimination. However, Waas fears that in the future it will go unnoticed because it is hidden somewhere in the middle of complicated algorithms. It is not enough, he writes, for employees to be able to sue over discrimination after the fact. Instead, there must be better options for prevention. One idea: companies should disclose how their selection software works and have their algorithms reviewed by government agencies. The brave new world of work, it seems, cannot go unsupervised.
