New AI-Powered Glasses Aid the Visually Impaired
Ask Envision, an artificial intelligence (AI) assistant powered by OpenAI’s GPT-4, helps blind and visually impaired people perceive the world around them. By combining a language model with image-to-text capabilities, Envision Glasses give wearers detailed descriptions of their surroundings and conversational answers to their questions. Users interact with the product via its onboard speaker and receiver.
Envision initially launched as a smartphone app in 2018 that focused on reading text in photos. With the integration of OpenAI’s GPT-4, Ask Envision’s capabilities expanded significantly: the assistant can now process images and text together, delivering a more comprehensive understanding of the visual environment.
When a visually impaired user interacts with the glasses, they can provide an image or ask questions about their surroundings. The AI assistant analyzes the image and generates a description, offering detailed information about the scene. This lets users gain insight into their environment, ranging from simple descriptions like “a cloudy sky” to more complex details such as menu items, prices, or dietary restrictions.
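The general pattern behind this image-and-question flow can be illustrated with a short sketch. The example below is a hypothetical, simplified version using OpenAI’s public Chat Completions API; the model name, prompt, and image handling are assumptions for illustration and do not reflect Envision’s actual implementation.

```python
# Hypothetical sketch of an image-plus-question request to a GPT-4-class
# vision model. Model name, prompt, and file handling are illustrative
# assumptions, not Envision's production code.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def describe_scene(image_path: str, question: str) -> str:
    # Encode the captured frame as base64 so it can be embedded in the request.
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed vision-capable model
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": question},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                    },
                ],
            }
        ],
    )
    # The text reply is what a device like this would read aloud to the user.
    return response.choices[0].message.content


if __name__ == "__main__":
    print(describe_scene("frame.jpg", "What does this menu say, and are there vegetarian options?"))
```

In practice, a wearable would stream audio of the reply back through its speaker and let the user ask follow-up questions in the same conversation, but the request/response shape above captures the core idea of pairing an image with a natural-language question.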
OpenAI’s language model improves the accuracy and depth of these responses, empowering people with impaired vision to navigate their surroundings and access information independently. It offers a level of visual detail, and with it independence, that was previously unavailable, narrowing the gap between visual impairment and an understanding of one’s surroundings.