Empowered AR (2024)
AR Glasses, MIT Hackathon, Disability Design

EmpoweredAR (2024) leverages the ability of Xreal eyewear to generate meshes of the physical environment; combined with accurate and reliable distance tracking, this provides wearers with an auditory map.

Concept
EmpoweredAR was inspired by the challenge of making augmented reality (AR) technology useful beyond entertainment and education, specifically as an aid for visually impaired individuals. The project uses Xreal lenses to create a digital mesh of the environment which, coupled with auditory signals, gives users spatial awareness of their surroundings.

How we do it
EmpoweredAR leverages the ability of Xreal eyewear to generate meshes of the physical environment; combined with accurate and reliable distance tracking, this provides wearers with an auditory map. As the wearer approaches an object they might collide with, a tone plays, and the interval between tones shortens in proportion to the remaining distance, giving very explicit information about the space. Although we focused on collision avoidance, we quickly realized that the same signal also illuminates clear spaces; distinguishing the two helps users build a mental image of their surroundings.
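
To make the mapping concrete, here is a minimal sketch of that feedback loop in Python. This is an illustration, not the team's actual Unity code: the distance reader, tick player, range, and interval values are all hypothetical placeholders; in the project itself the equivalent logic ran in Unity against depth data from the XREAL glasses.

import time

# Hypothetical sketch (not the project's actual code) of the core feedback
# loop: the pause between audio ticks shrinks linearly as the wearer
# approaches the nearest obstacle.

MAX_RANGE = 5.0      # metres; beyond this, stay silent (assumed value)
MIN_INTERVAL = 0.1   # seconds between ticks right at the obstacle
MAX_INTERVAL = 1.0   # seconds between ticks at MAX_RANGE

def tick_interval(distance_m):
    """Map obstacle distance to the pause between ticks (None = silent)."""
    if distance_m >= MAX_RANGE:
        return None
    t = distance_m / MAX_RANGE          # 0.0 at the obstacle, 1.0 at range
    return MIN_INTERVAL + t * (MAX_INTERVAL - MIN_INTERVAL)

def feedback_loop(read_distance, play_tick):
    """read_distance() and play_tick() would be supplied by the AR runtime."""
    while True:
        interval = tick_interval(read_distance())
        if interval is None:
            time.sleep(MAX_INTERVAL)    # nothing in range; poll slowly
        else:
            play_tick()
            time.sleep(interval)

Shortening the interval, rather than changing pitch or volume, is what lets the wearer read distance directly from the rhythm of the ticks.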

This enables wearers to navigate around obstacles safely. The project was built with Unity for audio playback, TensorFlow and YOLOv3 for object detection, and XREAL glasses for depth data. It faced challenges such as the absence of an RGB camera in the lenses, which complicated differentiating between obstacles; despite this, the team successfully implemented dynamic object detection.
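
For the detection side, the write-up names TensorFlow and YOLOv3; the hedged sketch below runs YOLOv3 through OpenCV's DNN module instead, which is one common way to execute the same Darknet model. The file names, input size, and confidence threshold are assumptions rather than the project's actual setup, and the camera frame is presumed to come from whatever feed is available.

import cv2
import numpy as np

# Assumed file names: the standard Darknet config and pretrained weights,
# downloaded separately. Not the project's actual configuration.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
output_layers = net.getUnconnectedOutLayersNames()

def detect(frame, conf_threshold=0.5):
    """Return (class_id, confidence, (x, y, w, h)) tuples for one BGR frame."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    detections = []
    for output in net.forward(output_layers):
        for row in output:
            scores = row[5:]
            class_id = int(np.argmax(scores))
            confidence = float(scores[class_id])
            if confidence >= conf_threshold:
                # YOLO reports a normalized centre point plus width/height;
                # convert to a pixel-space top-left box.
                cx, cy, bw, bh = row[:4] * np.array([w, h, w, h])
                detections.append((class_id, confidence,
                                   (int(cx - bw / 2), int(cy - bh / 2),
                                    int(bw), int(bh))))
    return detections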

Key learnings
Future plans involve integrating AI to enhance object identification and navigation in more complex and dynamic environments. This initiative represents a significant step toward transforming AR into a tool that empowers visually impaired individuals to confidently explore their surroundings, combining technical innovation with social impact. The code and further details are available in our repository: https://codeberg.org/reality-hack-2024/TABLE_72.