Taking photos through windows almost always captures reflections of the photographer and of people and objects in front of the glass. Now, MIT researchers have created an algorithm that removes reflections from many digital photos by exploiting the fact that photos taken through windows often contain two nearly identical reflections, slightly offset from each other. This is the case with double-paned windows and thick single panes, where light reflects off both glass surfaces.
Though there have been other attempts at removing reflections from photos, they have had only partial success, especially at automatically deciding whether the recovered scene is the one reflected by the glass or the one behind it. MIT's new algorithm successfully distinguished reflection from transmission in real-world tests on 197 images (gathered through searches on Google and Flickr), 96 of which exhibited double reflections offset far enough for the algorithm to work.
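The key cue the algorithm relies on is that the reflection appears twice, shifted by a fixed offset. A minimal sketch of how that offset can be detected is shown below, using a synthetic image built from random stand-in scenes (all names and the autocorrelation approach here are illustrative assumptions, not MIT's actual method, which is considerably more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(0)

def shift(img, dy, dx):
    # Circular shift, standing in for the offset between the two
    # reflections off the front and back glass surfaces.
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

# Hypothetical transmitted scene T and reflected scene R.
T = rng.random((64, 64))
R = rng.random((64, 64))
d = (3, 5)  # true offset between the two reflection copies

# Observed image: transmission plus two attenuated, offset copies
# of the reflection.
I = T + 0.4 * R + 0.4 * shift(R, *d)

# The repeated reflection produces a secondary peak in the image's
# autocorrelation at lag d (computed here via FFT; real photos would
# need gradient filtering to suppress scene self-correlation, but
# plain autocorrelation suffices for this random synthetic example).
F = np.fft.fft2(I - I.mean())
ac = np.real(np.fft.ifft2(F * np.conj(F)))
ac[0, 0] = 0  # suppress the trivial zero-lag peak
peak = np.unravel_index(np.argmax(ac), ac.shape)
print(peak)  # the offset (3, 5), or its circular mirror (61, 59)
```

Once the offset is known, the two copies of the reflection can be jointly modeled and separated from the transmitted scene, which is the harder optimization problem the MIT work addresses.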
I imagine the algorithm could eventually be incorporated into imaging chips, allowing removal of unwanted glass reflections during shooting or when reviewing photos in-camera during playback.
Read more at MIT News.