The world is still experimenting with and in awe of Deep Nostalgia, an AI-powered technique from online family-tree service MyHeritage that adds facial animations to still portrait photographs. Almost daily, we come across short clips on the Internet in which the technology has been used to animate still pictures, making a subject smile, nod, blink, and tilt their head. The experience, no doubt, is wonderful. While Deep Nostalgia brings life to your photographs, a new AI-powered technique, developed under the supervision of Dr. A.N. Rajagopalan, the Sterlite Technologies Chair Professor in the Electrical Engineering department at IIT Madras, helps restore blurred or degraded images.
His Image Processing and Computer Vision Lab at the institute uses the power of artificial neural networks to restore images affected by rain streaks, raindrops, haze, or motion blur. The team found that it wasn’t easy for a single neural network to both identify the degraded portions of a picture and clean them, so it split the task into two stages. First, one network localises the degraded or blurred regions; then a second network uses that information to restore the image.
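The two-stage idea can be illustrated with a toy sketch. The functions below are not the team's neural networks; they are deliberately simple stand-ins (a threshold detector for stage one, neighbourhood averaging for stage two) that show how a localisation mask lets the restoration step touch only the degraded pixels:

```python
import numpy as np

def predict_mask(img, thresh=0.8):
    """Stage 1 stand-in for the localisation network: flag pixels
    far brighter than the rest of the image as degraded
    (e.g. rain streaks)."""
    return img > thresh

def restore(img, mask):
    """Stage 2 stand-in for the restoration network: replace each
    masked pixel with the mean of its unmasked 3x3 neighbours,
    leaving clean pixels untouched."""
    out = img.copy()
    h, w = img.shape
    for y, x in zip(*np.nonzero(mask)):
        ys = slice(max(0, y - 1), min(h, y + 2))
        xs = slice(max(0, x - 1), min(w, x + 2))
        neigh = img[ys, xs][~mask[ys, xs]]
        if neigh.size:
            out[y, x] = neigh.mean()
    return out

# A smooth "clean" image with bright horizontal streaks added.
clean = np.tile(np.linspace(0.2, 0.4, 16), (16, 1))
degraded = clean.copy()
degraded[::4, :] = 1.0                 # simulated rain streaks

mask = predict_mask(degraded)          # stage 1: localise degradation
restored = restore(degraded, mask)     # stage 2: refine masked regions
```

After the two stages, the mean error against the clean image drops, because only the streaked rows are rewritten while the rest of the picture passes through unchanged.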
“Our premise is to use the auxiliary task of degradation mask prediction to guide the restoration process. We demonstrate that solving this auxiliary task injects crucial localising ability in network layers. We transfer this ability to the main restoration network using attentive knowledge-distillation and focus on the refinement of degraded regions by exploiting this additional knowledge,” he explains.
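The "attentive knowledge-distillation" the professor mentions can be pictured with a small numerical sketch. The code below is an illustration of attention-transfer-style distillation in general, not the paper's exact loss: the auxiliary (teacher) network's features are collapsed into a spatial attention map, and the restoration (student) network is penalised for deviating from it, which is how the localising ability is transferred:

```python
import numpy as np

def attention_map(feats):
    """Collapse a C x H x W feature tensor into a spatial attention
    map by summing absolute activations over channels, then
    normalising so maps from different networks are comparable."""
    a = np.abs(feats).sum(axis=0)
    return a / (np.linalg.norm(a) + 1e-8)

def distillation_loss(student_feats, teacher_feats):
    """Squared L2 distance between the normalised attention maps of
    student and teacher features; minimising it pushes the student
    to attend to the same spatial regions as the teacher."""
    diff = attention_map(student_feats) - attention_map(teacher_feats)
    return float(np.square(diff).sum())
```

In training, a loss like this would be added to the usual restoration loss, so the main network learns both to reconstruct the image and to focus where the mask-prediction network says the degradation is.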
The method used by Dr. Rajagopalan’s team, detailed in a paper titled ‘Degradation Aware Approach to Image Restoration Using Knowledge Distillation’ published by IEEE, appeared to outperform previous strategies for restoring degraded images. The team said it used “publicly available datasets” of rain-streak, haze, raindrop, and motion-blur images to test its model.