Remote interviews and background bias
Research suggests that as COVID-19 restrictions ease, 34% of Millennials, now the dominant group in the workforce, will be looking to change jobs. This likely means that most parts of the hiring process will still be conducted remotely until everyone has the OK to travel and the risk of contagion is reduced.
The process can be conducted by a person, usually a recruiter and the hiring manager, or by AI, in what we call an automated or one-way interview. In both cases, it is important to factor in your surroundings. Background bias does exist, even with AI.
Not everyone can have a dedicated space with full leather-bound bookshelves, especially when combining work with home-schooling kids or co-working in cramped conditions with a partner.
Background bias influences hiring managers when they see dirty dishes, unmade beds, and kids running amok. You will probably be judged by your surroundings unless you have a highly trained, bias-aware recruiter who can "assess the message, not the mess." That is not guaranteed: research from 3Plus notes that 46% of recruiters have had no unconscious-bias awareness training.
So it's important to control the things you can. Some people have even told me they have rented hotel rooms to ensure the most professional-looking background. That is a significant outlay in the middle of a global recession, when some hiring processes involve more than four interviews. Try to find more economical solutions: one client simply put her laptop on an ironing board in front of a blank wall.
Consider some of the basics:
- Keep your eyes at camera level
- Look at the camera, not at the image of the person
- Take care with hand movements
Sometimes you may have the option of a virtual background, and here your choice also matters. It would seem that bookshelf bias is not just a figment of our imaginations but something quite real. What's more, technology doesn't seem to help deal with it either.
The increasing use of artificial intelligence has prompted a discussion about how well software can reduce bias and improve human decision-making, given that every human interaction is susceptible to it.
Research carried out by reporters from Bayerischer Rundfunk (German public broadcasting) set out to assess the ability of AI to filter out stereotypes and bias. Using a group of actors to simulate interviews, they examined software created by the Munich tech start-up Retorio, which focuses on video-based behavioral assessment. It turns out that AI is swayed not only by the appearance of the candidates themselves but also by their backgrounds.
The software is intended to analyze tone of voice, language, gestures, and facial expressions to create a behavioral personality profile. Uwe Kanning, Professor of Business Psychology at the University of Osnabrück, suggests that "The software should actually be able to filter out this information in order to be better than the gut feeling of any person who is susceptible to such influences." But it didn't.
What they found was that the software was indeed influenced by candidates' appearance, as scored against the OCEAN (Big Five) personality model. They also found that the results shifted based on the background behind the candidate. This shift was particularly evident in four of the five components: openness, conscientiousness, extraversion, and agreeableness.
It would seem that not only do humans judge candidates by their backgrounds but so does AI.
If you can adjust your background, choose a bookshelf. We did this with one senior executive, and she got the job. Did her background make a difference? We will never know for sure, but it certainly didn't do any harm.