Sep 4, 2019

A proposed HUD rule on AI could allow for housing discrimination

Illustration: Rebecca Zisser/Axios

HUD recently proposed a rule that would protect financial institutions from liability for using algorithms to make lending decisions, as long as the technology used was produced or distributed by a recognized company.

Why it matters: AI can inadvertently rely on characteristics that include or are correlated with race, gender and socio-economic class, so under the proposed rule, financial institutions could make illegal determinations and hide behind an AI product.

The big picture: Financial institutions are increasingly using AI to detect suspicious activity, optimize portfolios, recommend strategic investments and assess creditworthiness.

  • The impact: Lenders may rely on factors that are not explicit stand-ins for race or gender but that correlate with those characteristics and can produce discriminatory outcomes.

How it works: Institutions may decide, for example, that unbanked individuals are less creditworthy, and using this factor in loan decisions could disadvantage people of color and women.

  • Nearly 17% of African Americans and 14% of Hispanic Americans are unbanked, compared to just 3% of white Americans.
  • 15% of unmarried female-headed family households are also unbanked.
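
To see the mechanism concretely, here is a minimal, purely illustrative sketch of how a scoring rule that never looks at race can still approve groups at different rates once it penalizes unbanked applicants. The unbanked rates come from the figures above; the score distribution, the penalty and the approval cutoff are invented assumptions for illustration, not any lender's actual model.

```python
import random

# Unbanked rates cited above, by group.
UNBANKED_RATE = {
    "African American": 0.17,
    "Hispanic American": 0.14,
    "White American": 0.03,
}

def approve(is_unbanked, rng):
    """Hypothetical lending rule: a base score minus a penalty for being unbanked.
    The distribution, penalty and cutoff are invented for illustration."""
    base_score = rng.gauss(650, 50)       # assumed credit-score distribution
    penalty = 60 if is_unbanked else 0    # assumed penalty for having no bank account
    return (base_score - penalty) >= 640  # assumed approval cutoff

def approval_rate(group, n=100_000, seed=0):
    rng = random.Random(seed)
    approved = 0
    for _ in range(n):
        is_unbanked = rng.random() < UNBANKED_RATE[group]
        approved += approve(is_unbanked, rng)
    return approved / n

# The rule never sees race, yet approval rates differ because "unbanked"
# is correlated with race.
for group in UNBANKED_RATE:
    print(f"{group}: {approval_rate(group):.1%} approved")
```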

What's happening: HUD released a proposed rule that would eliminate the disparate impact standard, which prohibits policies or procedures that result in a disproportionate adverse impact on protected groups. It would also shield financial institutions from liability when they use AI-based tools from third parties, such as tech companies, whether or not they knew the algorithm was problematic.
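
For context on how disparate impact is typically measured, the sketch below compares each group's approval rate to the most-favored group's rate and flags ratios under 80%, borrowing the EEOC's four-fifths guideline from employment law as an illustrative benchmark. The approval counts are hypothetical, and the threshold is not part of HUD's rule.

```python
# Illustrative disparate impact check: compare each group's approval rate to the
# most-favored group's rate. All counts are hypothetical.
approvals = {"White American": 8_200, "African American": 5_900, "Hispanic American": 6_300}
applicants = {"White American": 10_000, "African American": 10_000, "Hispanic American": 10_000}

rates = {group: approvals[group] / applicants[group] for group in approvals}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best  # adverse impact ratio relative to the most-favored group
    flag = "potential disparate impact" if ratio < 0.8 else "within the 4/5 benchmark"
    print(f"{group}: approval rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```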

  • What to watch: If the HUD rule is enacted, algorithms could obscure the reason for a credit denial.
  • But if denial notices must clearly explain the reasons to borrowers, in compliance with the Fair Credit Reporting Act, potential homeowners may still have some means of identifying illegal or inappropriate grounds for a determination, even if the financial institution is shielded from liability.

The bottom line: There are already challenges in applying anti-discrimination laws to AI-based determinations. The newly proposed HUD rule would make this considerably more difficult.

Miriam Vogel is the executive director of Equal AI, an adjunct professor at Georgetown Law and a former associate deputy attorney general at the Department of Justice.


Immigration policy could handicap the U.S. in the AI talent race

Illustration: Aïda Amer/Axios

AI experts are pushing the U.S. to ease immigration policies, arguing that the country is hobbling itself in a critical geopolitical race in which American dominance is slipping.

The big picture: Two of the Trump administration's major policy goals seem at cross purposes. Clamping down on immigrants and visitors could hamstring AI development in the U.S., which the White House says is a top priority.

Sep 14, 2019

Cities aren't ready for the AI revolution

Globally, no city is even close to being prepared for the challenges brought by AI and automation. Of those ranking highest in terms of readiness, nearly 70% are outside the U.S., according to a report by Oliver Wyman.

Why it matters: Cities are ground zero for the fourth industrial revolution. Some 68% of the world's population will live in cities by 2050, per UN estimates. During the same period, AI is expected to upend most aspects of how those people live and work.

Sep 30, 2019

The world through AI's eye

How ImageNet sees me. On the left: "beard" / On the right: "Bedouin, Beduin"

Maybe you've seen images like these floating around social media this week: photos of people with lime-green boxes around their heads and funny, odd or in some cases super-offensive labels applied.

What's happening: They're from an interactive art project about AI image recognition that doubles as a commentary about the social and political baggage built into AI systems.

Sep 22, 2019