Government takes on AI in housing
The government and companies are figuring out how to mitigate AI bias in housing — and use it to promote equity.
Why it matters: There's a lot of talk about the far-off, existential threats some believe AI poses, but the technology is already having real-world impacts on everyday activities like renting.
What we're watching: President Biden's AI executive order directs government agencies to fight discrimination enabled by automated systems and algorithms.
- On Capitol Hill, committees are building on last year's AI insight forums to zoom in on specific issue areas.
- Sen. Tina Smith noted at a hearing this week on AI and housing that she has found "endless applications" of the technology, such as when a renter submits a maintenance request and when a family is trying to qualify for a home.
Details: The executive order gives the Department of Housing and Urban Development (HUD) and the Consumer Financial Protection Bureau (CFPB) until about the end of April to:
- Study whether tenant screening systems are using criminal and eviction records, credit and other data points in illegal ways that result in discrimination.
- Address how existing laws apply to housing, credit and other real estate ads on digital platforms that use algorithms to push those ads.
Last month, CFPB advised that background check reports, including those used for tenant screening, must not contain false, incomplete or outdated information.
The intrigue: How well agencies leverage existing laws (including the Fair Housing Act and the Fair Credit Reporting Act) will help determine which gaps Congress needs to fill.
- Some in the private sector argue that the answer is not new AI-specific laws but better enforcement of what's already on the books.
- Before the EO, CFPB was already taking action on AI, including launching a rulemaking to protect people against algorithmic bias in home valuations.
Meanwhile, HUD's chief AI officer Vinay Singh recently said his agency is just catching up on AI and working on hiring experts on the technology.
In the private sector, companies are harnessing AI in an attempt to counteract inequity in housing.
- Lenders across the country are using technology from the company Zest AI to detect bias in underwriting, which has led to an increase in loan approvals for protected classes.
- That includes a 49% boost for Latinos, 41% for Black applicants, 40% for women, 36% for elderly applicants, and 31% for AAPI applicants, according to the company.
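Zest AI has not published its methodology, but one common, simple way regulators and lenders surface this kind of bias is an adverse impact ratio under the "four-fifths rule." The sketch below is illustrative only, with made-up data, and is not how any particular vendor's system works:

```python
# Illustrative only: an adverse-impact check, one common way to flag
# possible bias in underwriting outcomes. The decisions below are
# made up; this is not Zest AI's (non-public) methodology.

def approval_rate(decisions):
    """Fraction of applicants approved (decisions are True/False)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference
    group's. Values below 0.8 fail the common 'four-fifths rule'."""
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical underwriting decisions (True = approved)
reference_group = [True] * 80 + [False] * 20   # 80% approved
protected_group = [True] * 56 + [False] * 44   # 56% approved

ratio = adverse_impact_ratio(protected_group, reference_group)
print(f"Adverse impact ratio: {ratio:.2f}")    # prints 0.70
if ratio < 0.8:
    print("Potential disparate impact -- model warrants review")
```

A check like this only flags outcome gaps; it says nothing about why the gap exists, which is where model-level analysis comes in.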
Experts said the ability to use data on marginalized groups that historically has been unavailable can help promote equity.
- For example, consumers in rural areas are less likely to have a traditional credit history and are more likely to rely on higher-cost loan providers — an issue that applies to many Black and Latino households, George Washington University professor Vanessa Perry said in an email.
- "Because AI facilitates the inclusion of other kinds of payments and transactions in credit scoring models, this reduces one barrier to homeownership access," she added.
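Perry's point can be sketched with a toy example: a scoring model that also weighs on-time rent and utility payments can produce a usable score for "thin-file" applicants with little traditional credit history. All weights, scales and numbers below are invented for illustration and bear no relation to any real scoring model:

```python
# Illustrative sketch: adding alternative payment data (rent, utilities)
# to a toy credit score helps "thin-file" applicants who lack
# traditional credit history. All weights and scales are made up.

def toy_score(traditional_history_months, on_time_rent_rate, on_time_utility_rate):
    """Toy score on a 300-850-style scale; not any real scoring model."""
    score = 300
    score += min(traditional_history_months, 120) * 2   # up to 240 pts
    score += on_time_rent_rate * 160                    # up to 160 pts
    score += on_time_utility_rate * 150                 # up to 150 pts
    return round(score)

# A rural "thin-file" renter: no traditional history, but pays on time.
# Without the alternative data, this applicant scores the 300 floor.
print(toy_score(0, 0.98, 1.0))   # prints 607
```

The point of the sketch is the gap between the two outcomes: on traditional data alone the applicant is unscoreable, while the payment history yields a meaningful score.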
And company officials said they could use some help from the government in setting an industry-wide standard for what counts as fair.
- "I could make the fairest model that says yes to everything, but then that's not good for business," said Zest AI CEO Mike de Vere, who has been meeting with CFPB director Rohit Chopra and other government officials.
- "So understanding and having guidance from the government as far as what target are we shooting towards and what progress do we want to make" would be helpful, de Vere said.