Exclusive: Meta to study race of Instagram users
Facebook parent Meta plans to partner with an outside group to survey the race of Instagram users in hopes of better understanding who is using its services, officials tell Axios.
Why it matters: The company says it wants to make sure that its products and AI systems operate fairly across racial lines, but says it can't do that without knowing its users better. By working with a third party, it aims both to protect privacy and to make users more comfortable sharing their information.
How it works: Over the next few months, people on Instagram in the United States may see a single-question prompt asking about their race or ethnicity.
- Meta has partnered with a firm called Oasis Labs to ensure that it can access this information only at the aggregate level. The company is working with YouGov for data collection, and the collected information is split among several partners.
- Neither Meta nor its partners — Oasis, Northeastern University, University of Central Florida, and Texas Southern University — will have access to individual responses. The company has posted more detailed information on its methodology here.
- Meta said it will continue its policy of not directly collecting race information.
What they're saying: "For a long time, we’ve been working to better understand and improve the experiences that people from marginalized communities are having on our apps," Roy Austin, VP of civil rights at Meta, said in a statement to Axios. "But since it’s always difficult to address something without measuring it first, we’ve partnered with leading researchers, civil rights and academic experts and universities that serve these communities to do exactly that."
- Austin leads a small team focused on civil rights. That unit, in operation for about 18 months, advises the company on everything from voting rights issues to how new products should be designed.
The big picture: Facebook has been criticized for a wide range of issues related to race, including a lack of representation of people of color within its ranks as well as various forms of bias within its products.
- The company committed in 2018 to conduct a civil rights audit. The study, released in 2020, found that the company repeatedly failed to address hatred, bigotry and manipulation on its platform, and said its efforts in the area were too reactive and piecemeal.
- It also announced a settlement last month with the U.S. Department of Housing and Urban Development that will see Facebook change the way it delivers ads related to housing, as well as credit and employment. Facebook has also eliminated the ability to target those kinds of ads by age or gender and requires that location targeting have a minimum 15-mile radius.
- Understanding who is using the products is key to understanding whether its services — and the systems that serve up content and advertising — are doing so fairly and equitably. That's all the more important as Meta increasingly relies on machine learning and AI algorithms to decide who sees what on its services.
Between the lines: While Facebook could simply ask users about their race directly, the company has a problematic track record on race, including allegations of digital redlining and other racial inequities, which may make users reluctant to share that information with it.
- Starting this past January, Facebook further limited ad targeting based on characteristics that were seen as sensitive, such as religious practices, political beliefs and health concerns.
Editor's note: This story has been corrected to note that the project's research partners will not have access to individual responses, and to include the correct name of Oasis Labs.