The Overlooked Upsides of Algorithms in the Workplace

You describe the potential of using candidate-screening technology that takes the form of an online game, like Wasabi Waiter from a company called Knack, where a person is a server in a busy sushi restaurant. How can that be effective at assessing job candidates?


It’s about thinking more creatively about what we’re screening for, using insights from psychology and other research on what makes a good team player. You don’t want only what we call exploitation algorithms, which look at who became successful employees in the past, like somebody who finished an Ivy League college and was captain of a sports team.
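As a rough illustration of the pattern being cautioned against here, an exploitation-style screen is essentially a model fit to the traits of past successful hires, so it rewards whoever most resembles them. The features, labels, and library in this sketch are assumptions for illustration only, not details from the interview.

```python
# Minimal sketch of an "exploitation"-style screen: a model trained only on who
# succeeded in the past, so it tends to reproduce the old hiring pattern.
# Features, labels, and library choice are illustrative assumptions.
from sklearn.linear_model import LogisticRegression

# Historical hires: [ivy_league_degree, team_captain], 1 = rated a "successful" employee
X_past = [[1, 1], [1, 0], [0, 1], [0, 0], [1, 1], [0, 0]]
y_past = [1, 1, 0, 0, 1, 0]

model = LogisticRegression().fit(X_past, y_past)

# New applicants are scored purely on resemblance to past successes.
applicants = [[1, 1], [0, 0]]
print(model.predict_proba(applicants)[:, 1])  # higher score for the "familiar" profile
```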

There’s a lot of talk about the black box problem, that it’s hard to understand what the algorithm is actually doing. But from my experience as an expert witness in employment discrimination litigation, and from my research into hiring, it’s also very hard to pierce the black box of our human minds and trace what happened. With digital processes, we actually do have that paper trail, and we can check whether a game or some kind of automated emotional screening outperforms the previous way of screening in creating a more diverse pool of people.
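As a sketch of what that kind of paper-trail check could look like in practice, the snippet below compares how an old and a new screen select across demographic groups, using the four-fifths rule of thumb as a flag. The field names, groups, and numbers are illustrative assumptions, not details from the interview.

```python
# Minimal sketch of the audit a digital "paper trail" makes possible:
# compare how the old and new screens select across demographic groups.
from collections import defaultdict

def pass_rates(log):
    """log: iterable of (group, passed) records emitted by a screening tool."""
    passed, total = defaultdict(int), defaultdict(int)
    for group, ok in log:
        total[group] += 1
        passed[group] += int(ok)
    return {g: passed[g] / total[g] for g in total}

def impact_ratio(rates):
    """Lowest group pass rate divided by highest; below 0.8 suggests adverse impact."""
    return min(rates.values()) / max(rates.values())

# Hypothetical logs: (group, passed) for two screening methods.
old_screen = [("A", True)] * 8 + [("A", False)] * 2 + [("B", True)] * 4 + [("B", False)] * 6
new_screen = [("A", True)] * 7 + [("A", False)] * 3 + [("B", True)] * 6 + [("B", False)] * 4

for name, log in [("old screen", old_screen), ("new screen", new_screen)]:
    rates = pass_rates(log)
    print(name, rates, "impact ratio:", round(impact_ratio(rates), 2))
```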

My personal experience of applying for jobs that require aptitude tests and personality screenings is that I find them opaque and frustrating. When you’re speaking to someone face to face, you can get a bit of a sense of how you’re doing. When the whole process is automated, you don’t even really know what you’re being tested on. 

That’s how a lot of people feel. But this is where I get a little more contrarian. It’s not just about how people experience the interview, but about what we know about how good people actually are at making assessments during an interview.

There’s quite a bit of research showing that interviews are a poor predictor of job performance, and that interviewers consistently overestimate what they can actually glean from an interview. There’s even research showing how bias creeps in within a matter of seconds. If we’re serious about expanding the pool of people eligible for a job, the sheer number of applicants will be too much for a human to take on, at least in the initial stages.

A lot of these workplace biases are well documented. We’ve known about the gender pay gap for a long time, but it has been very hard to close. Can automation help there?

It has been frustrating to see how stagnant the gender pay gap has been, even though we have equal pay laws on the books. With the vast datasets now available, I think we can do better. Textio’s software helps companies write job ads that are more inclusive and will result in a more diverse applicant pool. Syndio can detect pay inequities across different parts of the labor force in large workplaces, which can be harder to see.

It’s kind of intuitive: if we use software to look across many different modes of pay and a lot of different job ads, we can pierce that veil of formal job descriptions in a large workforce and see what’s happening in terms of gender and race. We used to think of auditing as a one-shot exercise, once a year, but here you can have continuous auditing over several months, or whenever pay gaps suddenly widen because of things like bonuses.
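To make the idea of continuous auditing concrete, here is a minimal sketch that recomputes median pay gaps by job family on each pay run and flags the ones that widened, for instance after a bonus cycle. The column names, grouping, and threshold are assumptions for illustration; products like Syndio use far richer statistical models.

```python
# Minimal sketch of continuous pay-equity auditing: recompute gaps by job family
# on each pay run and flag widening gaps. Column names, grouping, and the alert
# threshold are illustrative assumptions.
import pandas as pd

def median_gap_by_job(df):
    """Median pay gap (men vs. women) per job family, as a fraction of men's median."""
    med = df.groupby(["job_family", "gender"])["total_pay"].median().unstack()
    return (med["M"] - med["F"]) / med["M"]

def flag_widening(prev_gaps, curr_gaps, threshold=0.02):
    """Return job families whose gap grew by more than `threshold` since the last audit."""
    delta = (curr_gaps - prev_gaps).dropna()
    return delta[delta > threshold]

# Hypothetical example: compare the gap before and after a bonus cycle.
base = pd.DataFrame({
    "job_family": ["eng", "eng", "sales", "sales"],
    "gender":     ["M", "F", "M", "F"],
    "total_pay":  [100_000, 98_000, 80_000, 79_000],
})
with_bonus = base.copy()
with_bonus["total_pay"] = [112_000, 101_000, 90_000, 81_000]

print(flag_widening(median_gap_by_job(base), median_gap_by_job(with_bonus)))
```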


