Meta’s Connect keynote felt different this year, and not just because it marked the return of an in-person event. It’s been nearly two years since Mark Zuckerberg used Connect to announce that Facebook was changing its name to Meta and reorienting the entire company around the metaverse.
But at this year’s event, it felt almost as if Zuckerberg was trying to avoid saying the word “metaverse.” While he did utter the word a couple of times, he spent much more time talking up Meta’s new AI features, many of which will be available on Instagram, Facebook and other non-metaverse apps. Horizon Worlds, the company’s signature metaverse experience that was highlighted at last year’s Connect, was barely mentioned.
That may not be particularly surprising if you’ve been following the company’s metaverse journey lately. Meta has lost so much money on the metaverse that its own investors have questioned the strategy. And Zuckerberg has been mercilessly mocked for trying to hype seemingly minor metaverse features, like low-res graphics and avatars with legs.
AI, on the other hand, is much more exciting. The rise of large language models has fueled a huge amount of interest from investors and consumers alike. Services like OpenAI’s ChatGPT, Snap’s My AI and Midjourney have made the technology accessible — and understandable — to millions.
Given all that, it’s not surprising that Zuckerberg and Meta used much of Connect — once known solely as a virtual reality conference — to talk about the company’s new generative AI tools. And there was a lot to talk about: the company introduced Meta AI, a generative AI assistant, which can answer questions and take on the personality of dozens of characters; AI-powered image editing for Instagram; and tools that will enable developers, creators and businesses to make their own AI-powered bots. AI will even play a prominent role in the company’s new hardware, the Meta Quest 3 and the Ray-Ban Meta smart glasses, both of which will ship with the Meta AI assistant.
But that doesn’t mean the company is giving up on the metaverse. Zuckerberg has said the two are very much linked, and has previously tried to dispel the notion that Meta’s current focus on AI has somehow supplanted its metaverse investments. “A narrative has developed that we’re moving away from focusing on the metaverse vision,” Zuckerberg said in April. “We’ve been focusing on both AI and the metaverse for years now, and we will continue to focus on both.”
But at Connect he offered a somewhat different pitch for the metaverse than he has in the past. Over the last two years, Zuckerberg spent a lot of time emphasizing socializing and working in VR environments, and the importance of avatars. This year, he pitched an AI-centric metaverse.
“Pretty soon, I think we’re going to be at a point where you’re going to be there physically with some of your friends, and others will be there digitally as avatars or holograms, and they’ll feel just as present as everyone else. Or, you know, you’ll walk into a meeting and you’ll sit down at a table, and there will be people who are there physically and people who are there digitally as holograms. But also sitting around the table with you are gonna be a bunch of AIs who are embodied as holograms, who are helping you get different stuff done too. So, I mean, this is just a quick glimpse of the future and how these ideas of the physical and digital come together into this idea that we call the metaverse.”
Notably, the addition of AI assistants could also make “the metaverse” a lot more useful. One of the more intriguing features previewed during Connect was Meta AI-powered search in the Ray-Ban Meta smart glasses. The Google Lens-like feature would enable wearers to “show” the AI things they are seeing through the glasses and ask questions about them, like asking Meta AI to identify a monument or translate text.
It’s not hard to imagine users coming up with their own use cases for AI assistants in Meta’s virtual worlds, either. Angela Fan, a research scientist with Meta AI, says generative AI will change the type of experiences people have in the metaverse. “It’s almost like a new angle on it,” Fan tells Engadget. “When you’re hanging out with friends, for example, you might also have an AI looped in to help you with tasks. It’s the same kind of foundation, but brought to life with the AIs that will do things in addition to some of the friends that you hang out with in the metaverse.”
For now, it’s not entirely clear just how long it will be before these new AI experiences reach the metaverse. The company said the new “multi-modal” search capabilities would be arriving on its smart glasses sometime next year. And it didn’t give a timeframe for when the new “embodied” AI assistants could be available for metaverse hangouts.
It’s also not yet clear if the new wave of AI assistants will be popular enough to fuel a renewed interest in the metaverse to begin with. Meta previously tried to make (non-AI) chatbots a thing in 2016, and the effort fell flat. And even though generative AI makes the latest generation of bots much more powerful, the company has plenty of competition in the space. But by putting its AI into its other apps now, Meta has a much better chance of reaching its billions of users. And that could lay important groundwork for its vision of an AI-centric metaverse.
Source: www.engadget.com