CFPB cracks down on ‘black-box’ algorithms used for credit underwriting

The CFPB on Thursday highlighted that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models that use complex algorithms. Along with the announcement, the bureau published a Consumer Financial Protection Circular reminding the public of creditors’ adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).

In general, the circular communicates a standard of explainability for these models in the context of providing adverse action notices to consumers. The circular states that “creditors must be able to provide applicants against whom adverse action is taken with an accurate statement of reasons.” In addition, the statement of reasons “must be specific and indicate the principal reason(s) for the adverse action.”

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said CFPB Director Rohit Chopra. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”
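To make the requirement concrete, the following is a minimal, hypothetical sketch of how a lender might translate a scoring model’s output into the specific, principal reasons an adverse action notice must cite. It is not the CFPB’s or any creditor’s actual method; the model, feature names, weights, and reason wording are all illustrative assumptions.

```python
# Hypothetical sketch only: a simple linear scoring model whose largest
# negative contributions are mapped to plain-language adverse action reasons.
# All features, weights, thresholds, and reason text are illustrative.

WEIGHTS = {
    "payment_history": 0.8,
    "credit_utilization": -0.6,
    "recent_inquiries": -0.3,
    "income_to_debt": 0.5,
}
BIAS = -0.2
APPROVAL_THRESHOLD = 0.0

# Plain-language reasons a notice could cite for each feature (hypothetical).
REASONS = {
    "payment_history": "Delinquent past or present credit obligations",
    "credit_utilization": "Proportion of balances to credit limits is too high",
    "recent_inquiries": "Too many recent inquiries on credit report",
    "income_to_debt": "Income insufficient for amount of credit requested",
}


def principal_reasons(applicant: dict, top_n: int = 2) -> list[str]:
    """Return the reasons tied to the features that lowered the score most."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = BIAS + sum(contributions.values())
    if score >= APPROVAL_THRESHOLD:
        return []  # no adverse action, so no notice is required
    # Rank features from most negative contribution to least.
    ranked = sorted(contributions.items(), key=lambda kv: kv[1])
    return [REASONS[f] for f, c in ranked[:top_n] if c < 0]


if __name__ == "__main__":
    applicant = {
        "payment_history": -0.5,   # standardized feature values (hypothetical)
        "credit_utilization": 0.9,
        "recent_inquiries": 0.4,
        "income_to_debt": 0.1,
    }
    for reason in principal_reasons(applicant):
        print(reason)
```

The point of the sketch is only that, whatever model a creditor uses, it must be able to trace a denial back to specific, principal reasons; a model too opaque to support that tracing does not relieve the creditor of the notice obligation.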

Of note, in response to a request for information regarding credit unions’ use of artificial intelligence (AI) issued by federal financial regulators – including the NCUA – NAFCU submitted comments detailing how credit unions leverage AI and noted that “credit unions are committed to pursuing responsible innovation, but to meaningfully pursue AI and ML technologies requires a supervisory approach that does not add to already high examination burden.”
