Axios AI+ SF Summit: There's no universal definition of "responsible AI"

Attendees enjoyed lunch during the discussion. Credit: Chris Constantine on behalf of Axios.
SAN FRANCISCO – At an Axios roundtable on Dec. 17, experts from tech, investing and education discussed how they are addressing ethical concerns about AI as society considers how to ensure responsible applications of the technology.
Why it matters: Leaders are increasingly looking at ways to develop AI while also navigating potential risks — and that looks different depending on what industry you're in.
Axios' Megan Morrone and Sam Sabin moderated conversations with those leading responsible AI development or investment at their respective organizations at the event, which was sponsored by Siegel Family Endowment and Kapor Foundation.
What they're saying: "Really what it means is it's starting out with an AI impact assessment that we have teams fill out before when they start to build their features," Adobe senior director of ethical innovation Grace Yee said.
- From there, Adobe uses the assessment to evaluate how AI is being used, how it's being incorporated into the desired feature, who the audience is and what the potential harms are.
Sara Deshpande, a general partner at Maven Ventures, called AI the "megatrend of the decade."
- She is looking ahead to see how consumers adopt AI to solve problems in their everyday lives. "I think we've seen a lot of AI that's been adopted for AI technology use within businesses, it makes them more efficient and more productive," Deshpande said.
- "But I think now we're really on the brink of figuring out what happens for consumers, and we've seen historically that they really push the envelope of what's needed," Deshpande added.
Defining "responsibility" in AI adoption can also mean capturing the technology's maximum value, HiddenLayer chief security and trust officer Malcolm Harkins said.
- "A lot of times I think we think of the harm side of it, [but] we also have the responsibility to capture the value," Harkins said.
There are opportunities for AI in the education sector, but students should not be viewed the same way as regular consumers, said Camille Crittenden, executive director of CITRIS and the Banatao Institute at the University of California, Berkeley.
- "I think there are definitely great opportunities with respect to artificial intelligence in education," Crittenden said, such as customized tutoring and brainstorming for teachers.
- However, "I think we have to think about what is the actual goal of student learning and is AI going to replace that in ways that ultimately are not going to be to the benefit of the individual or to us as a society," Crittenden added.
The bottom line: There is no single definition of "responsible AI," but the issue is top of mind at most organizations.
