Image Color Transfer is a free, browser-based application that allows a photo to be color graded to match the color of another photo, similar to the “Match Color” feature found in standalone photo editing applications.
How It Works
The app was developed collaboratively on GitHub by two developers who took on the project during the COVID-19 lockdowns. Terry Johnson, a 71-year-old from the United Kingdom, specified much of the processing methodology, while Michele Renzullo, a 22-year-old Italian, provided the expertise for the web implementation.
The duo set out to improve on the “Match Color” feature typically found in standalone photo editing applications, and their approach is described in detail in an article on Medium.
“When image color transfer was first developed, it was mostly seen as a technique for matching images prior to photo merging. If two images are captured in sequence under slightly different lighting conditions and then merged to form a continuous scene, then the image join might be visible when color and shading are not exactly matched,” the two explain.
“To overcome this, one photo can be matched to the other using a fairly rudimentary image color transfer method. Because the initial implementation of the method in early software packages was fairly crude, color matching was seen as a technique with fairly limited application. Consequently, little effort was made to improve the processing method. The implementation here offers a more refined and flexible approach to color match processing and so offers the chance to address a wider range of potential applications.”
The process requires a “target image” that an editor wishes to recolor and a “palette image” that contains the color profile that an editor would want to copy. Once both are processed through the app, an output image is generated that takes the color profiles found in the palette image and applies them to the target image.
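The app’s actual processing is more refined, but the underlying idea of transferring a color profile from a palette image to a target image can be illustrated with the classic per-channel statistics transfer (shifting and scaling each channel of the target so its mean and spread match the palette). This is a minimal sketch of that general technique, not the authors’ implementation:

```python
from statistics import mean, pstdev

def match_channel(target, palette):
    """Shift and scale one channel of the target image so its mean and
    standard deviation match the corresponding palette channel.
    Inputs are flat lists of 0-255 channel values; an RGB image would
    run this once per channel."""
    mt, mp = mean(target), mean(palette)
    st, sp = pstdev(target), pstdev(palette)
    scale = sp / st if st else 1.0
    # Re-center around the palette mean, then clamp to the valid range.
    return [min(255, max(0, round((v - mt) * scale + mp))) for v in target]

# Example: a dark, low-contrast channel remapped onto a brighter one.
print(match_channel([10, 20, 30], [100, 150, 200]))  # → [100, 150, 200]
```

Running this once per color channel gives the crude matching described above; the Web App goes further, shaping full channel distributions rather than just their first two moments.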
Image Color Transfer additionally offers a set of sliders that allow a user to fine-tune the output.
“Some slider controls allow the user to select values outside the range 0% to 100% to achieve additional interesting effects,” the two explain. “Other sliders provided by the Web App allow control of output image saturation, limit setting for cross-correlation processing and iteration control for shape matching of output image channel distributions to palette image channel distributions.”
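The exact semantics of the app’s sliders are the authors’ own, but a common way to implement a strength control that accepts values outside 0% to 100% is linear extrapolation between the original and fully processed pixel values. This hypothetical sketch shows how out-of-range percentages can exaggerate (above 100%) or invert (below 0%) an effect:

```python
def apply_strength(original, transferred, percent):
    """Blend original and fully color-transferred channel values.
    percent=0 returns the original, percent=100 the full transfer;
    values outside that range extrapolate along the same line,
    clamped to the valid 0-255 range."""
    t = percent / 100.0
    return [min(255, max(0, round(o + t * (f - o))))
            for o, f in zip(original, transferred)]

print(apply_strength([100], [200], 50))   # → [150] (half-strength)
print(apply_strength([100], [200], 150))  # → [250] (exaggerated)
print(apply_strength([100], [200], -50))  # → [50]  (inverted)
```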
The two have added explanations of additional options for advanced users on GitHub.
For those curious about how such a platform was created, the two document in extreme detail how the application works, the methods it uses to generate its output, and the implementation decisions that allow it to shift color in a way that looks natural.
“The Web App has been designed to primarily operate on a desktop computer, the speed and practicality of processing will be dependent upon image size and the computing power of a particular device,” the two continue. “The output image size is determined by the size of the target image. The palette image need not be the same size as the target image. Indeed, a smaller palette image can facilitate processing.”
The two demonstrate that the palette image doesn’t even need to be a different photo from the target: the target image with a rudimentary color scribble drawn over it can serve as the palette, producing a photo with a well-implemented color grade.
Image Color Transfer in Practice
PetaPixel tested Image Color Transfer with two photos with extremely different primary colors to see how the application would treat a color grade with no changes to the app’s default settings.
This was chosen as the target image:
This was chosen as the palette image:
And Image Color Transfer produced the following output image:
The term “color grading” is more often associated with video editing than with photography, but it’s an apt description of what is happening here. While the idea behind what the two developers created is not new, the results feel more stylized and less literal than previous attempts at the same concept. For example, two researchers from Adobe and Cornell tried something similar in 2017, but their results were a lot more extreme. NVIDIA has also been experimenting with similar algorithms.