Mark Zuckerberg wants to build a voice assistant that blows Alexa and Siri away
This key part of his plan for the metaverse could analyze your voice, eye movements, and body language.
Meta, the company formerly known as Facebook, has shifted its long-term strategy away from its social media apps to focus on the metaverse, a virtual world where people wearing augmented or virtual reality headsets can talk to each other's avatars, play games, hold meetings, and otherwise engage in social activities.
That shift has raised a lot of questions: what it means for a company that has focused on social media for nearly two decades, whether Meta can actually build its metaverse future, and what that future will look like for the billions of people who use Meta's products every day. On Wednesday, Meta CEO Mark Zuckerberg offered some answers during a keynote speech about the company's latest developments in AI.
One of Meta’s main goals is to develop advanced voice assistant AI technology — think Alexa or Siri, but smarter — that the company plans to use in its AR/VR products, like its Quest headset (formerly Oculus), Portal smart display, and Ray-Ban smart glasses.
“The kinds of experiences you’ll have in the metaverse are beyond what’s possible today,” said Zuckerberg. “That’s going to require advances across a whole range of areas, from new hardware devices to software for building and exploring worlds. And the key to unlocking a lot of these advances is AI.”
The presentation comes during one of the most challenging moments in the company’s history. Meta’s share prices have taken a historic dip, its advertising model has been shaken up by Apple’s mobile privacy changes, and it faces the looming threat of political regulation.
So it makes sense that the company is looking ahead to a future in which it hopes to roll out sophisticated language-processing AI.