New book "The White Wall" details racial bias in the finance industry
Back in 2017 when finance reporters were scrambling to ferret out MeToo stories on Wall Street, New York Times reporter Emily Flitter went another way.
- Instead of sexual harassment, she dug into racial bias in the industry for her new book "The White Wall," a topic she was told is perennially unaddressed and undercovered.
- "The racial discrimination is so bad," she recounts one prominent lawyer telling her.
Why it matters: Flitter's book, out this week, lays out just how bad. Through a series of devastating anecdotes and solid reporting, she shows how the financial industry works at both the systemic and individual levels to perpetuate the racial wealth gap in the U.S.
Quick take: One thing the book makes clear is how much the finance industry relies on trust and feelings to make assessments about customers: In the U.S., that's a huge problem for Black Americans trying to access capital, get their home appraised, or even access basic banking services.
- There are bank tellers making judgments based on appearance about whether customers' checks get cashed.
- Flitter tells the stories of Black bank customers who try to cash checks and instead wind up getting security called in.
- She describes how insurance adjusters rely on feelings to decide if homeowners' claims get paid out.
Zoom in: Her examination of the insurance industry, typically undercovered in the business press, is eye-opening.
- We meet Darryl Williams, an apartment building owner in Chicago, who filed suit against State Farm in 2019 after it refused to pay for damages after a frozen pipe burst and flooded his building.
- " 'We have a lot of fraud in your area,' Williams said the adjuster told him," Flitter recounts. "When asked what she meant by that, she said; 'South Side of Chicago and you-all's neighborhoods.' "
- This wasn't a one-off issue. Flitter cites data showing that insurers consistently paid more claims in white Chicago neighborhoods than in Black ones. But nationwide data on the industry is hard to come by, she explains, the result of concerted efforts to avoid that kind of tracking.
Meanwhile, even when financial institutions try to take human sentiment out of the picture, they run aground. Algorithms, Flitter writes, are built by humans, after all, so the bias gets baked right in.