Detection, analysis and modeling of human speech for deaf-blind impaired people

Eliana Frigerio
DEI PhD Student

DEI - Building 33 Lambrate, Seminar Room
November 18th, 2011
11.15 am

Abstract

Deaf-blind impaired people can use only the sense of touch to interact with the world and with other people. One common way to “listen” to other people is the Tadoma method, in which the deaf-blind person places their thumb on the speaker’s lips and their fingers along the jawline; the middle three fingers often rest on the speaker’s cheeks, with the little finger picking up the vibrations of the speaker’s throat. It is sometimes referred to as “tactile lip-reading”, since the deaf-blind person feels both the movement of the lips and the vibrations of the vocal cords. Unfortunately, this method requires the speaker to be close to the deaf-blind person and in physical contact with them. Deaf-blind people can also read books using Braille, in which characters are encoded as patterns of raised dots. Since both techniques are already well known to deaf-blind people, the aim of the minor has been to investigate the possibility of recreating, starting from an audio signal (recorded, e.g., by a microphone), the same sensations these persons perceive through the Braille system. In particular, dedicated processing of the input signal will generate a suitable digital output, which will then be translated into Braille code and transmitted to the deaf-blind person.
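
As a rough illustration of the final stage of such a pipeline, the sketch below maps already-transcribed text into Grade 1 (uncontracted) Braille cells rendered as Unicode Braille patterns. It is only a minimal example under stated assumptions: the speech-to-text front end is assumed to exist and is not shown, and the function names (to_braille, _letter_dots) and the choice of Unicode Braille output are introduced here for illustration and are not part of the presented work.

# Illustrative sketch: transcribed text -> Grade 1 Braille cells (Unicode U+2800 block).
# The speech-recognition front end is assumed and not shown here.

# Dot patterns (dots 1-6) for letters a-j; the remaining letters follow
# from these by the standard Braille decade rules.
_BASE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4}, "j": {2, 4, 5},
}

def _letter_dots(ch: str) -> set:
    """Return the set of raised dots (1-6) for a lowercase letter."""
    if ch in _BASE:                      # a-j: first decade
        return set(_BASE[ch])
    if "k" <= ch <= "t":                 # k-t: first decade + dot 3
        return _BASE[chr(ord(ch) - 10)] | {3}
    if ch == "w":                        # w is an exception: j + dot 6
        return _BASE["j"] | {6}
    if ch in "uvxyz":                    # u,v,x,y,z: a,b,c,d,e + dots 3 and 6
        return _BASE["abcde"["uvxyz".index(ch)]] | {3, 6}
    raise ValueError("unsupported character: %r" % ch)

def to_braille(text: str) -> str:
    """Render text (letters and spaces only) as Unicode Braille cells."""
    cells = []
    for ch in text.lower():
        if ch == " ":
            cells.append("\u2800")       # blank cell
            continue
        dots = _letter_dots(ch)
        # In the Unicode Braille block, dot n corresponds to bit (n - 1).
        cells.append(chr(0x2800 + sum(1 << (d - 1) for d in dots)))
    return "".join(cells)

if __name__ == "__main__":
    print(to_braille("hello world"))     # -> the eleven-cell Braille string for "hello world"

In a real system, the output cells would of course drive a refreshable Braille display rather than being printed, but the mapping from characters to six-dot cells would remain the same.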

Research area:
Signals