Meta wants the metaverse to sound more like the real world
Meta CEO Mark Zuckerberg on Friday detailed advances in audio that he says will let a concert in the metaverse sound like it's being heard at a real music venue. The same goes for making sound resemble a living room or another, more intimate space.
Why it matters: While much is made of the need to improve the visual quality of VR headsets, how the metaverse sounds will be just as important to its acceptance, especially among mainstream consumers.
Driving the news: Meta, in conjunction with researchers at the University of Texas at Austin, is releasing as open source software three models that should allow others to better represent real-world sound in virtual spaces.
- One model allows an audio clip recorded in one environment to be made to sound like it came from a different one.
- Another model does the opposite, extracting a clean core audio sample, such as a music recording or dialogue, from the acoustics of its surroundings.
- A third model uses visual and audio cues to separate speech from background noise.
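The first of those tasks has a classical analog worth sketching: in traditional audio engineering, making a clip "sound like" a room is done by convolving the dry recording with that room's impulse response. Meta's models learn this mapping from data (including visual cues) rather than requiring a measured impulse response, but a minimal sketch of the underlying signal-processing idea, using NumPy and a made-up exponential-decay "room," looks like this:

```python
import numpy as np

def apply_room_acoustics(dry_audio: np.ndarray, impulse_response: np.ndarray) -> np.ndarray:
    """Convolve a 'dry' clip with a room impulse response (RIR)
    so it sounds as if it were recorded in that room."""
    wet = np.convolve(dry_audio, impulse_response)
    # Normalize so the reverberant signal doesn't clip
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet

# Toy example: a single click passed through a decaying-echo "room"
click = np.zeros(100)
click[0] = 1.0
rir = np.exp(-np.arange(200) / 50.0)  # hypothetical exponential-decay RIR
wet = apply_room_acoustics(click, rir)
print(wet.shape)  # full convolution length: 100 + 200 - 1 = 299
```

The hard part, and what the released models tackle, is obtaining the target room's acoustic character without ever measuring it directly.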
The big picture: Digital sound is already moving beyond stereo and into so-called spatial audio, which is possible not just on VR headsets but also on devices such as Apple's AirPods Pro.
- That's important because it allows sound to change based on where a person moves and positions their head in virtual space.
- But also key is ensuring the sounds of virtual spaces themselves better match those in the real world, and it's that latter area Meta's announcement addresses.
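To make the head-tracking point concrete: the simplest form of this effect is re-panning a source between the two ears as the listener's head turns. Production spatial audio uses head-related transfer functions (HRTFs), not just level differences, so the sketch below, with a constant-power pan law and invented parameter names, is only a rough illustration of the idea:

```python
import numpy as np

def pan_for_head(mono: np.ndarray, source_azimuth_deg: float, head_yaw_deg: float):
    """Rough spatial-audio sketch: pan a mono source between left and right
    channels based on where it sits relative to the listener's head yaw.
    (Real spatial audio uses HRTFs; this is level panning only.)"""
    rel = np.deg2rad(source_azimuth_deg - head_yaw_deg)
    pan = np.clip(np.sin(rel), -1.0, 1.0)   # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * np.pi / 4.0       # constant-power pan law
    return mono * np.cos(theta), mono * np.sin(theta)

tone = np.sin(2 * np.pi * 440 * np.arange(1000) / 44100)
# Source straight ahead, head turned 90 degrees left: sound lands on the right ear
left, right = pan_for_head(tone, source_azimuth_deg=0.0, head_yaw_deg=-90.0)
```

Updating this panning every frame from the headset's orientation sensor is what makes virtual sound stay anchored in space as the head moves; Meta's new work layers room acoustics on top of that positioning.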