Digital Green Book fights misinformation targeting Black communities

Onyx Impact founder and CEO Esosa Osa. Photo: Tyler Boozer
As the erasure of Black history intensified — through book bans, assaults on diversity, and digital disinformation — Atlanta-based technologist and strategist Esosa Osa decided to take action.
Why it matters: Instead of just calling it out, Osa built the Digital Green Book — an AI-powered platform designed to help Black communities spot misinformation, protect their data, and access trusted, culturally informed content.
The big picture: Osa was concerned that Black voices were being suppressed, diversity initiatives were under attack, "woke" was being weaponized, and book bans were raising barriers to historical knowledge.
- The Digital Green Book is a roadmap for digital empowerment.
Flashback: The name is a deliberate nod to the "Negro Motorist Green Book," the Jim Crow-era guide that helped Black Americans navigate safely through a hostile country.
- "We needed a name that instantly communicated to Black folks: This is for you; this has your best interests at heart," Osa, founder and CEO of Onyx Impact, told Axios.
Zoom in: Launched this month, Onyx Impact's platform provides tools to combat misinformation and navigate a challenging digital landscape.
- Osa, a former political strategist and financial analyst, founded the organization after working at Fair Fight Action, BlackRock and Morgan Stanley.
Friction point: Researchers and critics have documented how AI systems often reinforce the very stereotypes Black Americans have spent generations trying to dismantle.
- Last year, the Congressional Black Caucus warned that without intentional design, AI will deepen racial inequities in hiring, housing, education and finance — a phenomenon experts call "algorithmic redlining."
- A 2023 study by linguist Sharese King found that AI consistently assigned Black language speakers to low-prestige jobs and imposed harsher sentences in hypothetical cases — further entrenching bias.
Yes, and: Osa notes that social platforms don't just reflect public opinion — they shape it. Right now, they're:
- Suppressing Black voices.
- Reinforcing harmful stereotypes, biases and discriminatory practices.
- Amplifying fear and confusion through misinformation.
But the problem went deeper.
- "We are not built to see the same headline 20 times, then the responses, then the reactions to those responses," she said.
- She realized the problem wasn't just misinformation — the sheer volume of manipulated content created a false sense of consensus and urgency.
- "We've got to tell people — this isn't real life," Osa said. "Forty-two percent of your online content might be bad bots."
Zoom out: The Digital Green Book is designed to cut through the noise and give Black communities control over their digital lives.
What it offers
- Misinformation detection: Assists users in recognizing digital manipulation and disinformation in real time.
- Data control guidance: Offers clear steps to curate social media feeds and regain ownership of online spaces.
- Child safety tools: Empowers parents with strategies to shield children from the dangers of social media while fostering digital literacy.
- AI-powered knowledge base: A groundbreaking tool trained on Black media and historical sources to provide unbiased answers to questions.
State of play: Osa's team fed the model vetted Black media, historical context, and trusted sources like the NAACP and the Legal Defense Fund.
- "You don't want AI to 'figure out' what it means to be Black from the internet — you'll get a horrible product."
- The goal: lower the barriers to discernment, making it easier to identify misinformation.
Unlike traditional AI tools that pull from the open web, this system prioritizes Black-led sources and historically accurate information to prevent distortion.
What's next: Osa hopes the Digital Green Book and Onyx Impact's initiatives will boost access to Black-led news, enhance digital literacy, and protect users from manipulation.
- "We are fundamentally in an information war — and we're losing. We need to understand how to navigate mass propaganda and misinformation now more than ever."
The bottom line: Even with careful curation and bias testing, AI tools are imperfect. But Osa's mission is clear:
- "We all we got. This is about making sure we've got each other covered."
