Updated Feb 21, 2024 - Technology
Human Intelligence

"Extremely concerned": UN official warns Silicon Valley execs of AI dangers

Photo illustration of Volker Türk with abstract shapes.

Photo illustration: Axios Visuals. Photo: Picture Alliance via Getty Images

Volker Türk, the UN's high commissioner for human rights, was in Silicon Valley last week to deliver a simple message to tech companies: Your products can do real harm and it's your job to make sure that they don't.

Why it matters: Technologies like artificial intelligence hold enormous potential for addressing a range of societal ills, but without effort and intent, these same technologies can act as powerful weapons of oppression, Türk tells Axios in an interview.

New regulations are often where the tech debate lands, but Türk tells Axios that the firms should already be ensuring their products comply with the existing UN Guiding Principles on Business and Human Rights.

  • "You have already existing obligations and you need to apply them," Türk says, addressing the companies.

Details: That UN document, unanimously approved in 2011, states that "business enterprises have the responsibility to respect human rights wherever they operate and whatever their size or industry."

  • It further notes that responsibility "means companies must know their actual or potential impacts" and "prevent and mitigate abuses."

Yes, but: The guiding principles are non-binding.

  • A years-long process to turn them into enforceable international law has yet to come to fruition.
  • And even enforceable international law has proven difficult to make stick in a world where nation-states retain sovereign freedom of action.

What's happening: Türk met with OpenAI, Meta and Google and also spoke at events, including at Stanford and Berkeley, where representatives from companies including Microsoft, Apple, Cisco, Snap and Anthropic were present.

Between the lines: Türk emphasized that Silicon Valley's lack of diversity and global perspectives hampers the industry. Something that seems positive to a small group of people in San Francisco, he says, may feel far different in other parts of the world.

  • "Very often it's done by people who may not know the consequences of what they're developing," Türk says.
  • "It's done by people with their own biases, with their own prejudices. It's done by people who have no global view of the world," he says.

The big picture: Türk likens the current AI moment to a different sort of artificial intelligence envisioned by Goethe in "Faust."

  • In that classic, Homunculus, an artificial being in a vial intended by its creator to represent the best of Enlightenment-era knowledge, instead ends up encompassing the full range of humanity's traits, including its faults.
  • "This is where we are at the moment," Türk says of today's AI systems. "It's not separate from who we are — our worst fears, our desires, our best aspirations."
  • What's more, AI systems won't necessarily reflect all of humanity, given they will likely be shaped by the values and world views of their creators.

Zoom in: Elections represent a specific threat, with 4 billion people around the world set to go to the polls this year.

  • The power of AI, combined with the vast amount of data available about individuals, opens the door to politicians or outside influencers targeting different groups with messages that play on their particular fears. "You're almost using it for brainwashing," he says, "and that's where the danger lies."
  • While such concerns predate generative AI, Türk says the new technology allows for misinformation to spread at a "mind-boggling" pace and scale.
  • "I'm extremely concerned about the combination of social media platforms and generative AI, about the way they could whip up emotions," Türk says.

He warns that AI could be used to scapegoat already marginalized groups, including immigrants and members of the LGBTQ+ community.

  • "If you whip up fears...if you create images that actually make people very afraid, and where you essentially manipulate the realities, then yes, we have a toxic mix," he says.
  • As to whether the tech companies are taking things seriously, Türk says, "The issue of elections is on their mind. I don't have the data to know whether it's enough."

The other side: Türk says what gives him hope is what he calls the "silent majority" — people who, he says, "deeply care" about human rights, values and dignity.

  • But that group, he says, needs to speak up. "I wish that the silent majority became a bit louder and were not silent, but actually, you know, overcome their fears, overcome the divisions and, and stand up for human rights."