This AI Tool Helps Companies Eliminate Hiring Bias in the Name of Productivity Checkr's co-founder and CEO makes the case for why businesses and society are better off when background checks are streamlined with technology.
Opinions expressed by Entrepreneur contributors are their own.
In this series, The Way We Work, Entrepreneur Associate Editor Lydia Belanger examines how people foster productivity, focus, collaboration, creativity and culture in the workplace.
If you have a clean criminal record, you probably don't think much about background checks. But odds are that someone you work with has one.
What constitutes a criminal record can range from minor to major crimes, from fishing without a license to speeding to driving under the influence to murder. But historically, some background check processes haven't distinguished among offenses of varying degrees.
The Ban the Box movement has pushed to remove the "Have you ever been convicted of a crime?" checkbox on job applications that can automatically land a candidate's application in the trash, regardless of what that crime might have been. A company might look at its large candidate pool and decide to pass on anyone with a criminal record, unwilling to spend the time and human resources to vet them.
The time-consuming nature of background checks was the initial inspiration for background check software company Checkr. Co-founders Daniel Yanisse and Jonathan Perichon were working as engineers at on-demand delivery startup Deliv back in 2013, when the "gig economy" had first taken shape. Suddenly, consumers were trusting strangers employed for contract work by startups to clean their homes, drive them around, deliver their dinner and more. Deliv needed lots of drivers, but existing background check technology made the screening process take a week or more.
Today, using AI, Checkr automates background checks, and the process only takes a day or two. The San Francisco-based company's 10,000-plus customers include gig-economy giants such as Uber, Lyft, Instacart, Postmates and Grubhub, as well as staffing companies, retailers and more.
Background checks vary in their thoroughness. Some include driving records, employment verifications and drug tests. Checkr gathers data from various sources, including criminal records from courthouses at the county, state and federal levels. Many of these databases are digital, but for those that still keep records on paper, Checkr dispatches a contractor to manually collect the information. Employers typically pay $35 per applicant screening. Checkr pricing varies based on the information requested and the volume of checks a customer runs.
Checkr also adds a layer of quality assurance, confirming that, say, two people with the same name, birth date and state of residence aren't confused for each other. Finally, it's programmed to account for differing background check regulations at the state level.
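To make the quality-assurance idea concrete, here is a minimal sketch of that kind of identity check. The `Record` fields and the rule that a merge requires an extra identifier beyond name, birth date and state are illustrative assumptions, not Checkr's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Record:
    full_name: str
    birth_date: str          # "YYYY-MM-DD"
    state: str
    middle_name: Optional[str] = None  # extra discriminator, when available

def likely_same_person(a: Record, b: Record) -> bool:
    """Name, birth date and state alone are not enough to merge two
    records -- require an additional identifier to agree as well."""
    base_match = (
        a.full_name.lower() == b.full_name.lower()
        and a.birth_date == b.birth_date
        and a.state == b.state
    )
    if not base_match:
        return False
    # Without a further discriminator, refuse the merge (in practice such
    # a pair would be flagged for human review rather than auto-merged).
    if a.middle_name is None or b.middle_name is None:
        return False
    return a.middle_name.lower() == b.middle_name.lower()
```

The conservative default (refuse to merge when evidence is incomplete) reflects the failure mode the article describes: wrongly attaching one person's record to another.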
"That would take multiple days for humans to review and do the work," Yanisse says, "so we've automated that with algorithms, which is more accurate and faster."
But the Checkr team didn't set out to help the Ubers of the world reject convicts faster. From day one, Yanisse says they've designed the software to reduce bias in hiring and give more qualified candidates a chance at employment. He points out that the U.S. economy loses roughly $87 billion annually because people with criminal records can't get hired, and that nearly 75 percent of formerly incarcerated individuals are still unemployed a year after release, according to ACLU research.
People who have criminal records but don't have quality jobs that pay a living wage are more likely to commit a crime again. That translates to more people on government assistance or in jail -- and more taxpayer money to pay for it. Meanwhile, employment retention rates are higher among people with criminal histories than the general population, which translates to lower long-term recruitment costs for companies that hire them, according to the ACLU.
"When you look at statistics, it's extremely rare to have workplace violence or crime, especially if you have a good interview process and good HR practices in place," Yanisse says, explaining that the background check industry has long relied on fear-mongering as a marketing tactic. "So we just want to rebalance."
Checkr also corrects for variation and biases among humans who are trying to hire fairly. It's difficult for a large company to maintain a single hiring standard: when individual recruiters have the discretion to reject or advance a candidate, they inevitably apply it inconsistently.
Each Checkr client can customize the software to ignore or never surface data about certain criminal offenses that they don't deem relevant to the job at hand. Someone applying to be a customer service representative, for example, could perform the job perfectly well even with traffic violations on their record. A ride-hailing driver? Perhaps not. Companies can also specify time windows: If the crime happened before a certain year, it's no longer a concern.
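The customization described above amounts to a per-client screening policy: offense categories to ignore and a time window beyond which offenses no longer matter. Here is a small hypothetical sketch of how such rules might be expressed; the role names, categories and seven-year window are invented for illustration, not taken from Checkr's product:

```python
from datetime import date

# Hypothetical per-client policy: offense categories the client deems
# irrelevant to the role, plus a lookback window in years.
POLICY = {
    "customer_service": {"ignore": {"traffic"}, "lookback_years": 7},
    "rideshare_driver": {"ignore": set(), "lookback_years": 7},
}

def surfaced(offenses, role, today=date(2018, 1, 1)):
    """Return only the offenses the client's policy treats as relevant."""
    rules = POLICY[role]
    cutoff = today.replace(year=today.year - rules["lookback_years"])
    return [
        o for o in offenses
        if o["category"] not in rules["ignore"] and o["date"] >= cutoff
    ]
```

Under this sketch, an old traffic violation would never reach the reviewer for a customer-service role but would still appear for a driving role, which is exactly the distinction the article draws.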
"In reality, when you would get a background check report in the past, if there was a long list of things, even if it was just a long list of traffic violations or things like that, you probably were going to be biased," Yanisse says. "Like, 'Ooh, there are a bunch of flags here. I'm not sure I want to take the risk.'"
When it comes to AI, many people worry about the potential for AI to reflect the biases of the human software engineers who built it. But Yanisse insists that doesn't come into play with Checkr, because the AI is not determining whether a candidate is qualified for the job or making hiring decisions for its user companies.
"We are not predicting if the person is going to be a good fit for the job," Yanisse says. "We're using AI for classifying data. Like, is this a driving violation or is this a physical crime? Those things are more fact-based."
Yanisse likens Checkr to the advent of the credit score. Before FICO and the credit bureaus automated credit scoring, a loan officer at a bank could pass judgment on, or discriminate against, a credit applicant.
"It removes the ability for employers to use a background check as a proxy to screen people out," says David Patterson, Checkr's head of communications. "Historically, employers would often tell people who were minorities that they didn't get the job and use the background check as an excuse, like, 'Oh, we couldn't hire them because they had this minor thing on their record.' But that may have not been the actual reason."
At the moment, Checkr is focused on measuring its impact by calculating how many candidates would have been rejected without its software. The company reports that last year it helped 8,000 candidates get accepted who would otherwise have been declined. For 2018, its goal is 80,000. It aims to achieve that not only by signing on more clients, but by getting its existing clients to fine-tune their use of the tool.
"Initially, I think it was quite binary how decisions were made. Either you're clean and you haven't done anything and you're a good person, or you have some flags in your background, so we're not going to hire you," Yanisse says. "But when you get into the details, you realize that that's not the case. There's no good or bad people, there's a whole spectrum. People make mistakes, and there are different severities of mistakes."