Algorithms are less biased than humans


There is a lot of controversy around the steady automation of decisions traditionally made by humans. But rather than merely asking whether algorithms are flawed, we should be asking how their flaws compare with those of the humans they replace.

Research on algorithmic decision making dates back several decades, and the studies reach a remarkably consistent conclusion: algorithms are less biased and more accurate than the humans they replace.

For example, a Columbia Business School study examined the performance of a job-screening algorithm at a software company. When the company rolled out the algorithm to decide which applicants should get interviews, it favored "nontraditional" candidates far more often than human screeners did.

So the next time you worry about an algorithm making flawed decisions, look in the mirror and remember that human biases are likely even worse.


Inspired by: Want Less-Biased Decisions? Use Algorithms, by Alex P. Miller