Our goal is to extend the perception capabilities of blind and severely visually impaired people. A person with healthy vision enjoys an amazingly fast, precise, and wide channel of communication with the environment. She can judge the size, shape, and proximity of objects without approaching or touching them, and the information about a perceived object arrives within fractions of a second. The human eye is truly a masterpiece of nature.
With our product we want to give visually impaired people a simplified form of vision using cutting-edge computational and hardware technologies. We are developing a wearable device that transforms the video stream from a camera into tangible tactile images that can be sensed with the palm or forearm.
The device has the shape of a glove or a sleeve. It sits on the wrist and part of the forearm, so the user can still use her palm and fingers freely.
The device has a camera on the back of the hand. The data from the camera is processed by a built-in microcontroller: a dedicated algorithm detects the most important objects and obstacles and prepares a simplified image for the tactile display. The tactile display sits on the user's forearm and renders the image as a perceptible tactile shape. The closer an object is, the stronger the touch generated at the corresponding location on the forearm.
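The proximity-to-intensity mapping described above can be sketched in a few lines. This is a minimal illustration, not the actual firmware: the actuator grid size, the depth range, and the function names are all assumptions made for the example.

```python
# Minimal sketch of the camera-to-tactile mapping.
# Grid size, depth range, and names are illustrative assumptions.
import numpy as np

TACTILE_ROWS, TACTILE_COLS = 8, 12   # assumed actuator grid on the forearm
NEAR_MM, FAR_MM = 300, 3000          # assumed working depth range

def depth_to_tactile(depth_mm: np.ndarray) -> np.ndarray:
    """Downsample a depth map to the tactile grid and map proximity to
    actuator intensity in [0, 1]: the closer the object, the stronger
    the touch at the corresponding location on the forearm."""
    h, w = depth_mm.shape
    bh, bw = h // TACTILE_ROWS, w // TACTILE_COLS
    # Crop so the image divides evenly into tactile cells.
    cropped = depth_mm[:bh * TACTILE_ROWS, :bw * TACTILE_COLS]
    # Each cell reacts to the nearest object inside it (min depth).
    cells = cropped.reshape(TACTILE_ROWS, bh, TACTILE_COLS, bw).min(axis=(1, 3))
    # Invert and normalize: NEAR_MM -> 1.0 (strong), FAR_MM -> 0.0 (off).
    intensity = (FAR_MM - cells) / (FAR_MM - NEAR_MM)
    return np.clip(intensity, 0.0, 1.0)

# Example: a synthetic depth map with one close obstacle.
depth = np.full((240, 320), FAR_MM, dtype=float)
depth[80:160, 100:200] = 500          # an object half a meter away
tactile = depth_to_tactile(depth)
print(tactile.shape)                  # (8, 12)
```

Taking the minimum depth per cell means each actuator responds to the nearest obstacle in its field of view, which is the conservative choice for obstacle warning.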
Left: image from the camera; right: what the tactile display shows
Diagram of the processing pipeline