Shopping for groceries is a common task for many people, but identifying grocery items can be challenging for visually impaired individuals. In a groundbreaking development for assistive technology, a team of experts introduced AiSee, an innovative wearable assistive device designed to help people with visual impairments.

Wearable Assistive Device Uses AI To Describe Surroundings, Helps Visually Impaired “See” Objects
(Photo: Pexels/Eren Li)


What is AiSee?

Visually impaired people face daily hurdles, such as identifying objects, a task vital to both simple and complex decision-making. Breakthroughs in artificial intelligence have dramatically improved visual recognition capabilities, but the real-world application of these technologies is still challenging and prone to errors.

AiSee aims to overcome these limitations by utilizing state-of-the-art AI technologies. First developed in 2018, this tool has been progressively upgraded over five years. Its main goal is to empower users with more natural interaction.

Following a human-centered design process, the researchers questioned the typical approach of using glasses augmented with a camera, since visually impaired people may be reluctant to wear them for fear of stigmatization. Because of this, the research team, led by National University of Singapore (NUS) School of Computing associate professor Suranga Nanayakkara, proposed alternative hardware that incorporates a discreet bone conduction headphone.

To use AiSee, the user only needs to hold an object and activate the built-in camera to capture an image of it. The device then identifies the object using AI and provides more information when queried by the user.


READ ALSO: New Experimental Wearable Device Could Generate Power From User's Bending Finger; Create, Store Memories


How Does AiSee Work?

AiSee comprises three major components: the eye, the brain, and the speaker. The eye is a vision engine, software paired with a micro-camera that captures the user's field of view. The software extracts features such as text and logos from the captured image and labels them for processing.
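The article does not disclose how AiSee implements this capture-and-extract step, but a minimal sketch of the idea might look like the following, assuming an off-the-shelf camera interface (OpenCV) and OCR engine (Tesseract via pytesseract). The function names here are illustrative, not AiSee's actual software.

```python
# Hypothetical sketch of the "eye": capture a frame and pull out text features.
# OpenCV and pytesseract are stand-ins for AiSee's unnamed vision stack.
import cv2
import pytesseract


def capture_frame(camera_index: int = 0):
    """Grab a single frame from the camera (here, any webcam)."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Could not read from camera")
    return frame


def extract_text_features(frame) -> str:
    """Run OCR on the captured frame to recover printed text (e.g. product labels)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # OCR generally works better on grayscale
    return pytesseract.image_to_string(gray)


if __name__ == "__main__":
    frame = capture_frame()
    print(extract_text_features(frame))
```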

The brain serves as an AI-powered image processing unit and interactive Q&A system. After the user takes a photo of the object of interest, the tool uses sophisticated cloud-based AI algorithms to process and analyze the image and identify the object. The user can then ask questions to learn more about it.
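The article only says that AiSee relies on cloud-based AI algorithms, without naming a provider, so the sketch below simply posts the captured image to a placeholder recognition endpoint; the URL and response fields are hypothetical stand-ins for whatever service the device actually uses.

```python
# Hypothetical sketch of the "brain" hand-off: send the image to a cloud
# recognition service and read back a label. The endpoint and JSON fields
# are placeholders, not AiSee's real service.
import requests

RECOGNITION_URL = "https://example.com/recognize"  # placeholder endpoint


def identify_object(image_path: str) -> str:
    """Upload the captured image and return the top predicted label."""
    with open(image_path, "rb") as f:
        response = requests.post(RECOGNITION_URL, files={"image": f}, timeout=30)
    response.raise_for_status()
    result = response.json()
    # Assume the service returns e.g. {"label": "canned tomato soup", "confidence": 0.93}
    return result["label"]
```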

AiSee uses speech-to-text and text-to-speech processing to understand the user's spoken queries and answer them aloud, and a large language model enables it to excel in question-and-answer interactions, allowing the system to respond promptly and conversationally. Unlike other wearable assistive devices that need smartphone pairing, AiSee works as a self-contained system that functions independently, without additional devices.
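None of the speech or language components are named in the article, so the loop below stands in commonly used open-source pieces for this kind of interaction: the speech_recognition package for speech-to-text, pyttsx3 for text-to-speech, and a placeholder ask_llm function where AiSee's large language model would sit. It is a sketch of the interaction pattern, not the device's real software.

```python
# Hypothetical sketch of the hands-free Q&A loop: listen, ask an LLM, speak the answer.
# speech_recognition and pyttsx3 are stand-ins for AiSee's unnamed speech stack.
import speech_recognition as sr
import pyttsx3


def ask_llm(question: str, object_label: str) -> str:
    """Placeholder for the large language model call that grounds answers
    in the identified object; a real system would query an LLM API here."""
    return f"You are holding {object_label}. You asked: {question}"


def qa_loop(object_label: str) -> None:
    recognizer = sr.Recognizer()
    tts = pyttsx3.init()
    with sr.Microphone() as mic:
        print("Listening for a question...")
        audio = recognizer.listen(mic)
    question = recognizer.recognize_google(audio)   # speech-to-text
    answer = ask_llm(question, object_label)        # LLM-backed Q&A (placeholder)
    tts.say(answer)                                 # text-to-speech output
    tts.runAndWait()


if __name__ == "__main__":
    qa_loop("a can of tomato soup")
```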

Meanwhile, the speaker serves as a bone conduction sound system. This technology transmits sound through the bones of the skull, ensuring that auditory information is delivered effectively while the user retains access to external sounds such as conversations or traffic noise. This is especially important for visually impaired individuals, since ambient sounds provide vital information for decision-making, particularly in situations involving safety.

The researchers believe that AiSee can be a handy tool for blind and visually impaired people. Most assistive devices tend to target either totally blind individuals or those with low vision, and AiSee could strike a good balance between the two.

RELATED ARTICLE: Calico Robot Assistant: The Tiny, On-Cloth Wearable Device That Can Zip Around Your Clothing

Check out more news and information on Wearable Devices in Science Times.