Just yesterday I posted the news of a fabric screen that can be deformed by your hands and fingers, enabling richer interactions with images. Today's news goes along the same lines, with a device created by Anatomage that lets medical students practice anatomy on what looks like a surgery table with all the bells and whistles of augmented reality and digital images.
The table is the same size as a surgery table, and its screen can reproduce a patient's body. With the assistance of a computer, students can interact with the image and move inside the body in the same way they would using a scalpel and retractors.
You can watch the video made by the BBC.
The image can be drawn from a database, so that students can study many cases, or it can be generated from scans taken of an actual patient. In the latter case it becomes useful to a surgeon for looking inside the patient and planning the approach to the surgery. You can explore (if you are not faint of heart!) the various applications and images that can be visualized.
This kind of interface, applied here to medicine (which makes sense given the system's current high cost), will slowly spread to a variety of application fields as its cost goes down. My expectation is that by the end of this decade we will start to see desktop screens of this kind in offices and in some schools. By the end of the next decade, most surfaces will double as screens.
What I find really interesting is that through these interfaces we move complexity from atoms to bits, and we can perform all sorts of manipulations at the bit level, thus dramatically reducing the cost and expanding the possibilities.