Meta’s AI glasses can now help you hear conversations better
Overview

Meta announced on Tuesday an update to its AI glasses that will allow you to better hear people talking when you’re in a noisy environment. The feature will initially become available on Ray-Ban Meta and Oakley Meta HSTN smart glasses in the U.S. In addition, the glasses are getting another update that lets you use Spotify to play a song that matches what’s in your current view.

Why This Matters

For instance, if you’re looking at an album cover, the glasses could play a song by that artist. Or if you’re looking at your Christmas tree with a pile of gifts, you could play holiday music. This addition is more of a gimmick, of course, but it demonstrates how Meta is thinking about connecting what people see with actions they can take in their apps.

Key Insights

The conversation-focus feature, meanwhile, seems more practical.

First announced at Meta’s Connect conference earlier this year, the feature uses the AI glasses’ open-ear speakers to amplify the voice of the person you’re talking to.

Industry Impact

Updates like these show how Meta is using software to add practical value to its smart glasses after launch, a pattern likely to shape how competitors approach the wearables category.

Final Thoughts

As AI glasses gain new capabilities through software updates like this one, following these incremental releases is the clearest way to see where the category is heading.
