Meta just announced an intriguing tool that uses AI to automatically dub Reels into other languages, complete with lip-sync. CEO Mark Zuckerberg introduced the feature during the keynote at the annual Meta Connect livestream event, and everything seemed to work flawlessly. According to Meta, the technology not only translates the content but also simulates the speaker's voice in the other language and syncs their lips to match. It's worth noting, however, that this didn't appear to be a live demo, though it was still pretty impressive.

As for a rollout, the company says the feature will arrive first on some creators' videos in English and Spanish in the US and Latin America. Meta didn't give a timetable; it just said the US and Latin America will get it first, which indicates that it'll be tied to English and Spanish at launch. The company did mention that more languages are coming soon.

That wasn't the only AI tool spotlighted during Meta Connect. The company's AI platform will now allow voice chats, with a selection of celebrity voices to choose from. Meta AI is also getting new image capabilities: it will be able to change and edit photos based on instructions from text chats within Instagram, Messenger and WhatsApp.

This article originally appeared on Engadget at https://www.engadget.com/ai/meta-will-use-ai-to-create-lip-synced-translations-of-creators-reels-175949373.html?src=rss
Category:
Marketing and Advertising
Alongside the Quest 3S and AI updates, we got a glimpse of Meta's future at Meta Connect. After teasing the device several times in recent months, Meta finally gave the world a proper look at its "full holographic" augmented reality glasses, codenamed Orion. The company is packing a lot of tech into those chunky frames, which are still in prototype form.

This article originally appeared on Engadget at https://www.engadget.com/ar-vr/meta-reveals-its-orion-smart-glasses-175353381.html?src=rss
Meta's AI assistant has always been the most intriguing feature of its second-generation Ray-Ban smart glasses. While the generative AI assistant had fairly limited capabilities when the glasses launched last fall, the addition of real-time information and multimodal capabilities opened up a range of new possibilities for the accessory. Now, Meta is significantly upgrading the Ray-Ban Meta smart glasses' AI powers. The company showed off a number of new abilities for the year-old frames onstage at its Connect event, including reminders and live translations.

With reminders, you'll be able to look at items in your surroundings and ask Meta to send a reminder about them. For example: "Hey Meta, remind me to buy that book next Monday." The glasses will also be able to scan QR codes and call a phone number written in front of you.

In addition, Meta is adding video support to Meta AI so that the glasses will be better able to scan your surroundings and respond to queries about what's around you. There are other, more subtle improvements. Previously, you had to start a command with "Hey Meta, look and tell me" in order to get the glasses to respond based on what you were looking at. With the update, though, Meta AI will respond to more natural requests about what's in front of you. In a demo with Meta, I was able to ask several questions and follow-ups with phrasings like "Hey Meta, what am I looking at" or "Hey Meta, tell me about what I'm looking at."

When I tried out Meta AI's multimodal capabilities on the glasses last year, I found that it was able to translate some snippets of text but struggled with anything more than a few words. Now, Meta AI should be able to translate longer chunks of text. And later this year, the company is adding live translation abilities for English, French, Italian and Spanish, which could make the glasses even more useful as a travel accessory.
And while I still haven't fully tested Meta AI's new capabilities on the smart glasses, it already seems to have a better grasp of real-time information than what I found last year. During a demo with Meta, I asked Meta AI to tell me who is the Speaker of the House of Representatives (a question it repeatedly got wrong last year) and it answered correctly the first time.

This article originally appeared on Engadget at https://www.engadget.com/wearables/metas-ray-ban-branded-smart-glasses-are-getting-ai-powered-reminders-and-translation-features-173921120.html?src=rss