Imagine being able to interact with a real photo or video without distorting the image: actually pushing and pulling an object on screen and watching it move the way it would in real life.

Abe Davis and his colleagues at MIT developed what they call a motion microscope: software that finds the subtle motions in a video, too small for the naked eye to notice, and amplifies them until they become large enough for us to see.
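
To make the idea concrete, here is a minimal Eulerian-style sketch in Python: temporally bandpass-filter each pixel's intensity over time and add an amplified copy of that band back into the video. This is a toy illustration, not the team's code; the function name and parameters here are illustrative, and the published motion-magnification systems use more robust variants of this idea (for instance, amplifying local phase rather than raw intensity).

```python
import numpy as np
from scipy.signal import butter, filtfilt

def magnify_motion(frames, fps, f_lo=0.5, f_hi=3.0, alpha=20.0):
    """Exaggerate subtle temporal changes in a video (toy sketch).

    frames: float array of shape (T, H, W), grayscale, values in [0, 1]
    fps:    frame rate of the video in Hz
    f_lo, f_hi: frequency band (Hz) of the motion to amplify
    alpha:  amplification factor
    """
    # Bandpass-filter each pixel's intensity signal over time, keeping
    # only variation in the chosen frequency band (the subtle motion).
    b, a = butter(2, [f_lo, f_hi], btype="band", fs=fps)
    bandpassed = filtfilt(b, a, frames, axis=0)
    # Add the amplified band back to the original frames.
    return np.clip(frames + alpha * bandpassed, 0.0, 1.0)
```

For small motions, changes in a pixel's brightness track the motion itself, which is why simply amplifying this temporal band makes tiny movements visible.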

Davis describes the potential: “this type of technology lets us predict how objects will respond to new situations, and you could imagine, for instance, looking at an old bridge and wondering what would happen, how would that bridge hold up if I were to drive my car across it.” Now you can actually interact with the bridge, moving it up and down and back and forth, to see how well it would hold up!
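
As a rough illustration of how that interaction can work, assuming the common approach of treating each vibration mode observed in the video as a damped harmonic oscillator: a user's push becomes a force driving that oscillator, and the resulting amplitude deforms the image along the mode's shape. Everything below (names, parameters, the simple integrator) is a hypothetical sketch, not Davis's actual code.

```python
import numpy as np

def mode_response(freq_hz, damping, force, fps):
    """Simulate one vibration mode as a damped harmonic oscillator.

    freq_hz: the mode's natural frequency, as measured from the video
    damping: damping ratio of the mode
    force:   array of user-applied force samples over time
    fps:     simulation rate in steps per second
    Returns the mode's amplitude over time; displacing the pixels
    along the mode's shape by this amplitude renders the response.
    """
    omega = 2.0 * np.pi * freq_hz
    dt = 1.0 / fps
    x, v = 0.0, 0.0
    out = np.empty(len(force))
    for i, f in enumerate(force):
        # Semi-implicit Euler step for a unit-mass damped oscillator:
        # acceleration = force - damping term - restoring term
        accel = f - 2.0 * damping * omega * v - omega**2 * x
        v += accel * dt
        x += v * dt
        out[i] = x
    return out
```

The key point is that the video alone supplies the frequencies and shapes of these modes, so the simulated response looks like the real object reacting, without ever building a 3D model of it.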

Davis adds, “ironically, we’re kind of used to having this kind of interactivity when it comes to virtual objects, when it comes to video games and 3D models, but to be able to capture this data from real objects in the real world using just simple, regular video is something new that has a lot of potential.”