Federal anti-discrimination banking laws are meant to protect consumers by barring creditors from denying an application for credit, or taking other adverse action, without cause or a clear explanation of why the action was taken.
Businesses have long used advanced computational methods as part of their credit decision-making process, but as technology evolves, so do these models. Some creditors now base decisions on so-called “black box” models, which produce outputs that may not be explainable to users of the model, including its own creators.
As a result, adverse action notices that satisfy the Equal Credit Opportunity Act (ECOA) may not be possible for creditors using these models.
To remind the public of these requirements, the CFPB has issued a Consumer Financial Protection Circular, addressed to federal consumer financial protection enforcement officials among others, on creditors' adverse action notice requirements under ECOA.
“Companies are not absolved of their legal responsibilities when letting a black box model make lending decisions,” said Rohit Chopra, director of the CFPB. “The law gives every applicant the right to an accurate explanation if their credit application has been denied, and that right is not diminished simply because a company uses a complex algorithm that it does not understand.”
According to the CFPB, the collection of data about Americans has become large and ubiquitous, giving companies the ability to learn very detailed information about their customers before even interacting with them. Many companies across the economy rely on these detailed datasets to power their algorithmic decision-making, which is sometimes marketed as “artificial intelligence”. The information gathered from data analysis has a wide range of commercial uses by financial companies, including targeted advertising and credit decision-making.
The circular highlights two major points:
- Federal consumer financial protection laws and adverse action requirements apply regardless of the technology used by creditors. For example, ECOA does not permit creditors to use technology that prevents them from providing specific and accurate reasons for adverse actions. Creditors' use of complex algorithms does not limit the application of ECOA or other federal consumer financial protection laws.
- Creditors cannot justify non-compliance with ECOA on the grounds that the technology they use to assess credit applications is too complicated, too opaque in its decision-making, or too new. Creditors who use complex algorithms, including artificial intelligence or machine learning technologies, to make credit decisions must still provide a notice that discloses the specific principal reasons for taking adverse action. There is no exception permitting violations of the law simply because a creditor uses technology that has not been adequately designed, tested, or understood.
“Whistleblowers play a pivotal role in uncovering information about companies using technologies, like black box models, in ways that violate ECOA and other federal consumer financial protection laws,” concluded the CFPB. “Having clear and actionable information is essential for the CFPB and other consumer protection officials. The CFPB encourages tech workers to provide information to the agency; they can visit the CFPB Whistleblower Program web page to learn more.”
Click here to read the circular in its entirety.