This post is part of a series that highlights equipment from the Northeast Arc’s Assistive Technology Lending Library.
Ray-Ban Meta Glasses are a sophisticated wearable technology. Using onboard AI and cloud-based processing, the glasses can interpret their surroundings, translate languages, and offer real-time information. Despite this rich feature set, concerns about privacy and response accuracy limit the potential of this wearable as an assistive device.
There are several features that can support users with disabilities. Meta’s AI can be asked for “Detailed Responses” that create scene descriptions to help users understand objects, text, and the environment around them. Another feature, Conversation Focus, uses AI to isolate the voice in front of the user in noisy spaces. Lastly, hands-free controls, real-time translation, and voice guidance can support users with various disabilities. These features, when thoughtfully implemented, can empower greater independence in community navigation, communication, and daily living.
On the other hand, using these features in real-world situations is not always intuitive, and the AI features come with privacy risks. Although the AI is more conversational than older voice assistants, it does not always understand what the user is asking, and it can be difficult to know whether the information it provides is accurate. Similarly, users may want the AI to help them understand complex documents; however, all photos and videos are uploaded to Meta and can be used in ways governed by its privacy policy.
The post Meta Glasses Can Empower Independence, but Raise Privacy Concerns appeared first on Northeast Arc.