Not long ago Google announced new glasses for augmented reality: you look at the world and information is overlaid on the objects you are seeing.
Indeed, at this point it is just a (small) step to imagine that, through the glasses, part of this information could be used to clarify somebody's speech, which would be quite useful if that somebody is speaking a foreign language.
This is what Will Powell thought, and set out to do.
He has assembled a system consisting of a computer, speech recognition software, and a real-time translator, coupled with a pair of glasses that can display the translated sentences as subtitles…
The idea is that you just wear the glasses as you are talking to someone. A microphone embedded in the glasses captures the voice of the person talking to you, a computer translates the speech, and the translation appears as floating text in front of your eyes, like a prompt.
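The flow described above is essentially a four-stage pipeline: capture, recognize, translate, display. The sketch below is my own illustration of that structure, with stubbed-in stages (the function names and the canned phrases are hypothetical, not Powell's actual code; a real system would plug an ASR engine and a translation service into the middle two stages):

```python
def recognize(audio: bytes) -> str:
    """Speech-to-text stage (stub; a real system would call an ASR engine)."""
    return "Bonjour, comment allez-vous ?"

def translate(text: str, target: str = "en") -> str:
    """Translation stage (stub; a real system would call a translation API)."""
    phrasebook = {"Bonjour, comment allez-vous ?": "Hello, how are you?"}
    return phrasebook.get(text, text)

def render_subtitle(text: str) -> str:
    """Format the translated text for the heads-up display."""
    return f"[subtitle] {text}"

def pipeline(audio: bytes) -> str:
    # microphone -> recognizer -> translator -> glasses display
    return render_subtitle(translate(recognize(audio)))

print(pipeline(b"..."))  # prints: [subtitle] Hello, how are you?
```

The point of the structure is that each stage is independent: shrinking the hardware, as discussed below, is a matter of where each stage runs, not of changing the flow itself.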
As you can see in the clip below, the system is quite bulky and not practical to use. But, as happens with all electronic components, it just takes a little time (and a good market) for them to shrink to the point where they basically disappear and become embedded.
Hence, I can imagine a real pair of normal-looking glasses providing this type of service in just two to three years.