In the future, you’ll be able to touch things through your computer. MIT Media Lab has created a tangible interface that reproduces a real-time physical version of whatever you put under its sensors.
Called inFORM, it uses cameras to capture objects in 3-D, processes the information on a computer, and then renders the shape with solid rods that rise and fall on a tabletop surface, where they can interact with physical objects.
The MIT team then added a screen that shows a person manipulating the objects remotely, giving the activity a sense of human presence.
The project is a step towards the researchers’ stated ultimate goal of a “material user interface in which all digital information has a physical manifestation that people can interact with directly.”
Check it out: