Last week we shared a sneak peek at some jaw-dropping image deblurring technology currently in development at Adobe. The video wasn’t the best quality and was captured from the audience, so we didn’t get to see the example images very clearly. Adobe has now released an official video of the demo, giving us a better glimpse at what the feature can do.
Here’s what the image looks like after the Image Deblur feature does its magic (hover your mouse over it for a comparison):
They also did a demo with an image that includes text, demonstrating that useful data can be recovered using the technology:
The feature makes the blurry phone number crystal clear (hover to compare):
Here’s the official video by Adobe (watch it here if it doesn’t load for you):
Update: Adobe has responded to questions regarding the sample photo of Kevin Lynch:
For those who are curious – some additional background on the images used during the recent MAX demo of our “deblur” technology. The first two images we showed – the crowd scene and the image of the poster – were examples of motion blur from camera shake. The image of Kevin Lynch was synthetically blurred from a sharp image taken from the web. What do we mean by synthetic blur? A synthetic blur was created by extracting the camera shake information from another real blurry image and applying it to the Kevin Lynch image to create a realistic simulation. This kind of blur is created with our research tool. Because the camera shake data is real, it is much more complicated than anything we can simulate using Photoshop’s blur capabilities. When this new image was loaded as a JPEG into the deblur plug-in, the software had no idea it was synthetically generated. This is common practice in research, and we used the Kevin example because we wanted it to be entertaining and relevant to the audience – Kevin being the star of the Adobe MAX conference!
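For readers curious how a “synthetic blur” like the one Adobe describes works in principle: applying camera-shake data to a sharp image amounts to convolving the image with a blur kernel (a point spread function) that traces the camera’s motion. Here’s a minimal NumPy sketch of that idea – the kernel and image here are toy placeholders, not Adobe’s actual research tool or data:

```python
import numpy as np

def apply_blur_kernel(image, kernel):
    """Blur a grayscale image by convolving it with a point spread function.

    In a real camera-shake simulation, `kernel` would be the motion path
    estimated from an actual blurry photo; here it is a toy example.
    Uses edge padding and a straightforward loop; production code would
    use an FFT-based convolution instead.
    """
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            # Weighted sum of the neighborhood under the kernel
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "camera shake" kernel: a short diagonal streak, normalized to sum to 1
kernel = np.eye(3) / 3.0

# A sharp test image: a single bright pixel on a black background
sharp = np.zeros((8, 8))
sharp[4, 4] = 1.0

blurred = apply_blur_kernel(sharp, kernel)
```

After convolution, the single bright point is smeared along the kernel’s diagonal streak, which is exactly why blurred points of light in real photos reveal the camera’s shake path – the information a deblurring algorithm then tries to invert.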