Meta's Smart Glasses Miss the Mark on AI, Focusing on Features Instead
Ray-Ban Meta Gen 1 and Gen 2 smart glasses have been touted as game-changers, with features like hands-free calling, navigation, and notifications. However, one thing is missing: effective AI-powered functionality.
While voice assistants are built into many smart devices, their usefulness has barely improved over the years. Meta's own AI assistant, integrated into its Ray-Ban smart glasses, struggles to rise above this mediocrity. Google Assistant, Alexa, and Siri have all seen advancements, but not without setbacks, and Meta AI suffers from the same problem.
The primary issue is that the AI-powered camera features, meant to be a highlight of these smart glasses, are underwhelming. Asking Meta AI about your surroundings can be useful for translation or navigation, but the execution often falls short. In one recent instance, the assistant incorrectly identified every shell collected at the beach as a shark's tooth.
The emphasis on AI is not unique to Meta; other companies, including Google, have showcased prototypes with similar computer vision-centric designs. This raises concerns about privacy, as the camera would be constantly watching users' actions. Magic Leap's recent presentation further highlights this issue.
Smart glasses can be useful and enjoyable devices, but they require attention to detail and a focus on meaningful features rather than AI-driven gimmicks. While AI may have a place in enhancing these devices, it shouldn't be their defining characteristic, at least not yet.