by Barbara Kiviat
Credit reports are on people’s minds these days, thanks to the massive data breach at Equifax that exposed the sensitive personal information of some 145.5 million Americans. With Social Security numbers and birth dates in hand, hackers can fraudulently open credit cards and take out loans in victims’ names. When these accounts fall into delinquency, it will look like the victims have failed to pay off their debts.
That will be a problem for borrowing money in the future—and also perhaps for landing a new job. About half of U.S. employers at least sometimes check job candidates’ credit history, largely based on the idea that a person who doesn’t pay off their debts is untrustworthy and irresponsible, and therefore dangerous to have in the workplace, especially around money.
In recent years, sociologists have pointed out the risk of cumulative disadvantage that comes from using a person’s experiences in the credit market to determine their chances in the labor market. Consumer advocates have also underscored how the practice can hurt racial minorities, whose credit could be bad through personal fault—or generations of discrimination. Since the Equifax data hack, concerns about credit report inaccuracies have taken center stage, as well. Even before the breach, a full quarter of Americans had a mistake on at least one of their reports.
In September, Representative Steve Cohen and Senator Elizabeth Warren introduced bills in Congress to curtail credit checks in hiring, and in a confirmation hearing, Warren pressed the potential head of the Equal Employment Opportunity Commission to say whether she would bring cases against employers who use credit checks to discriminatory ends. Local legislatures have already made greater strides. Over the past decade, more than a dozen states and cities have passed laws to restrict how companies use credit reports in hiring.
But how, exactly, do companies use credit reports in hiring? Despite the flurry of legislative activity, we actually know precious little about how employers take credit reports—long and complicated financial documents—and translate them into hiring decisions. Unlike lenders, employers don’t see credit scores, so there’s no summary measure to compare across candidates.
In a recent paper, I use interviews with hiring professionals to show that the practice is more complicated than policy debates tend to reflect and deeply prone to bias. Some employers develop sets of rules to guide the reading of credit reports. Reject any candidate with more than three accounts in debt collection, say. Yet most hiring professionals don’t have such rules to guide them, and many of those who do still make exceptions. A full 80% of employers using credit reports sometimes hire people with bad credit, according to the Society for Human Resource Management, a trade group.
What sways hiring professionals? A lot of the time, it’s a good story. While the people I interviewed generally thought credit history sheds light on a person’s moral character, most also thought there were times when people have bad credit through no fault of their own. Most hiring professionals, for example, are open to employing people whose bad credit stems from job loss, divorce, or medical problems. But the job candidate has to know to bring up such personal details (in a hiring setting, mind you), and to offer an account that sounds sufficiently repentant. Hiring professionals talked about times when job candidates didn’t seem genuine enough in their storytelling, which meant they didn’t get the job.
While I didn’t observe job candidates, interviews with hiring professionals suggest that this system benefits candidates whose backgrounds most resemble the hiring professionals’ own. Hiring professionals often draw from their own experiences to decide whether bad credit indicates personal moral failing—like being a frivolous spender or a deadbeat—or unfortunate surrounding circumstance. The result is that what registers as forgivable to one hiring professional can seem unforgivable to another. The people I interviewed assigned starkly different meanings to delinquent credit card debt, bankruptcy, foreclosure, and other negative items.
Such variability is enabled by the fact that there is no good evidence about how the contents of credit reports relate to workplace behavior. Lenders and the credit scoring outfits that serve them have worked out how to combine bits of context-free credit history into algorithms that mathematically predict risk of loan default. Employers have nothing similar to use, which opens the door to their more interpretive construction of meaning from credit data. Credit reports may seem like neutral carriers of objective financial information, but in employment, they are anything but.
This subjectivity and lack of empirical basis are particularly worrisome given that some employers run credit checks for an additional reason: to prove to regulators, investors, business partners, and other stakeholders that they have conducted due diligence and minimized the danger posed by employees. Hiring professionals use credit reports not only for what they (ostensibly) say about potential workers, but also for the symbolic value they bring as official documents produced by credit bureaus, society’s trusted arbiters of financial credibility.
Given the recent hit to Equifax’s own credibility, that might be one more reason to rethink the use of credit reports in hiring.
Barbara Kiviat is a PhD candidate in sociology and social policy at Harvard University. This article summarizes findings from “The Art of Deciding with Data: Evidence from How Employers Translate Credit Reports into Hiring Decisions” in Socio-Economic Review.
Image: by Nick Youngson via Google Images (CC BY-SA 3.0)