An MIT student working at Google is bringing us friendlier PC-smartphone coexistence. Say you're rushing to a meeting but need to find where it is on Google Maps. You bring it up on your PC, find it and rush out. You didn't print it, and now you're forced to retype the whole address on your smartphone.
Not with Deep Shot, the new system designed by Tsung-Hsiang Chang, a graduate student in MIT’s Computer Science and Artificial Intelligence Laboratory, and Google’s Yang Li.
All you have to do now is take a snapshot of your PC's screen with your smartphone's camera. Deep Shot uses existing computer vision algorithms to recognize which application is open on the screen and exactly what you're looking at, then extracts and transmits the corresponding Uniform Resource Identifier (URI) to your smartphone, which opens up the matching app and displays the exact same screen. Sweet!
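The article doesn't publish Deep Shot's actual code, but the URI handoff idea can be sketched in a few lines. Here's a hypothetical illustration (the `maps` scheme and parameter names are made up for the example): the PC side serializes the app's current state into a URI, and the phone side parses that URI to restore the same view.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def capture_state(app, **params):
    """PC side (hypothetical): serialize the current app state as a URI."""
    return f"{app}://open?{urlencode(params)}"

def restore_state(uri):
    """Phone side (hypothetical): parse the URI back into app + state."""
    parsed = urlparse(uri)
    params = {k: v[0] for k, v in parse_qs(parsed.query).items()}
    return parsed.scheme, params

# The map location you were viewing on the PC...
uri = capture_state("maps", q="77 Massachusetts Ave, Cambridge MA", zoom="15")

# ...is reconstructed on the phone from the URI alone.
app, params = restore_state(uri)
print(app, params["q"], params["zoom"])
```

The hard part Deep Shot actually solves, of course, is the camera step: matching the photographed screen to the running application so the right URI can be extracted in the first place.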
Of course, you need to install some code on both the PC and the smartphone for this handshaking to work seamlessly. And the process is reversible, too, pushing info from your smartphone back to your computer — say, a paper you took notes on using your smartphone, which you now want to work on with your computer's larger screen and far more comfortable keyboard.