This year I did not have the opportunity to attend the CHI 2011 conference (Computer-Human Interaction), which I consider one of the most instructive conferences on interfaces, with strong participation from both academia and industry research centres such as Microsoft, IBM, and Xerox.
Anyway, I had a look at the award-winning papers published online, and one caught my attention: Microsoft Research, with the formidable Desney Tan, one of that company's gurus of innovative interfaces, has come up with an idea so easy to understand that I wonder why no one had it before (at least to my knowledge).
Our houses are wired for electricity and full of appliances, and these radiate electromagnetic noise into the surrounding environment (the same noise that, at least in the past, used to disturb our AM radios…).
The MS researchers had the idea of using our own body as an antenna and measuring how we perturb this noise, applying machine learning to recognize our position within the house and the gestures we make. In this way, we could move a hand along a wall to control an appliance in the same room. Check all the details of this concept in the awarded paper.
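To give a feel for the idea, here is a toy sketch of the general pipeline: capture a window of body-coupled signal, extract the energy at the power-line hum and its harmonics as features, and classify with a nearest-centroid rule. This is not the researchers' actual implementation; the sampling rate, the harmonic frequencies, the gesture names, and their "signatures" are all invented for illustration (real gestures would be learned from recorded data, not synthesized).

```python
import math
import random

FS = 1000                    # assumed sampling rate in Hz
N = 200                      # samples per analysis window
HARMONICS = [60, 120, 180]   # mains hum and harmonics (US 60 Hz grid)

def synth_window(amps, noise=0.05):
    """Simulate body-coupled noise: each harmonic scaled by a
    gesture-dependent amplitude, plus Gaussian sensor noise."""
    return [sum(a * math.sin(2 * math.pi * f * n / FS)
                for a, f in zip(amps, HARMONICS))
            + random.gauss(0, noise)
            for n in range(N)]

def band_energy(x, f):
    """Single-bin DFT magnitude at frequency f (normalized)."""
    re = sum(v * math.cos(2 * math.pi * f * n / FS) for n, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * f * n / FS) for n, v in enumerate(x))
    return math.hypot(re, im) / N

def features(x):
    """Feature vector: energy at each monitored harmonic."""
    return [band_energy(x, f) for f in HARMONICS]

# Hypothetical per-gesture harmonic signatures (invented values).
GESTURES = {
    "touch_wall": [1.0, 0.2, 0.1],
    "hand_raise": [0.3, 0.8, 0.4],
}

def train(k=20):
    """Average the features of k simulated windows per gesture
    to obtain one centroid per gesture class."""
    centroids = {}
    for name, amps in GESTURES.items():
        feats = [features(synth_window(amps)) for _ in range(k)]
        centroids[name] = [sum(col) / k for col in zip(*feats)]
    return centroids

def classify(x, centroids):
    """Assign the window to the nearest centroid (squared distance)."""
    f = features(x)
    return min(centroids,
               key=lambda name: sum((a - b) ** 2
                                    for a, b in zip(f, centroids[name])))
```

The real system has to cope with far messier signals, which is exactly where the machine learning earns its keep, but the shape of the problem is the same: the body changes how ambient electrical noise is picked up, and a classifier maps those changes back to gestures.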