This month, together with Romain Vergne, Thomas Hurtut and Joëlle Thollot, we published a research report on our work on texture transfer based on textural variations. This report forms the third chapter of my thesis manuscript and can be found here.
The idea of this work is to transfer a reference texture onto an input texture, reproducing the reference texture's patterns while preserving the input texture's global variations, such as illumination changes or perspective deformations.
Only the input texture and a single reference texture are needed. We use a texture descriptor to characterize the input texture's global variations, and a texture synthesis algorithm to reproduce the reference texture's patterns while preserving those variations.
Examples of results are given below. From left to right, the images are the input, the reference and the result. Note that for each result we manually chose which global properties of the input to preserve in the result, among luminance, scale and orientation.
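To make the idea of preserving a global variation concrete, here is a minimal sketch for one such property, luminance: the large-scale luminance of the input can be isolated with a low-pass filter and reimposed on a synthesized result. This is only an illustration assuming NumPy/SciPy; the function name, the Gaussian low-pass and the parameters are hypothetical, not the descriptor used in the report.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def transfer_luminance_variations(result, input_img, sigma=15.0):
    """Reimpose the input's large-scale luminance variations on a
    synthesized result (illustrative sketch, not the report's method).
    Both images are 2-D float arrays in [0, 1]."""
    # Low-pass each image to isolate the global (large-scale) luminance.
    input_low = gaussian_filter(input_img, sigma)
    result_low = gaussian_filter(result, sigma)
    # Swap the result's global component for the input's, keep its detail.
    return np.clip(result - result_low + input_low, 0.0, 1.0)
```

The choice of `sigma` controls what counts as "global": larger values preserve only very slow variations such as a lighting gradient, while small values would also copy the input's fine structure.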
An extension of our Expressive 2016 paper was published this month in a special issue of Computers & Graphics. The extended paper can be found here.
The main contribution of this extension is the ability to use simple strokes to define image regions and to apply color transfer or colorization only between these regions. The image regions are automatically computed from the user-provided strokes using our edge-aware texture descriptor.
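As an illustration of the region-assignment step, the sketch below labels each pixel with the stroke whose local texture statistics it most resembles, using a plain mean/variance descriptor as a stand-in for our edge-aware texture descriptor. All names and parameters here are hypothetical, not the paper's implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def stroke_region_mask(img, stroke_masks, size=9):
    """Assign every pixel of a grayscale image to the most similar
    stroke. stroke_masks is a list of boolean arrays, one per stroke.
    Returns an integer label map of the same shape as img."""
    # Local mean and variance as a crude texture descriptor.
    mean = uniform_filter(img, size)
    var = uniform_filter(img ** 2, size) - mean ** 2
    feats = np.stack([mean, var], axis=-1)             # (H, W, 2)
    # Average descriptor under each stroke acts as that region's prototype.
    protos = [feats[m].mean(axis=0) for m in stroke_masks]
    # Distance of every pixel's descriptor to every stroke prototype.
    dist = np.stack([np.linalg.norm(feats - p, axis=-1) for p in protos])
    return dist.argmin(axis=0)                         # (H, W) labels
```

A real descriptor would be edge-aware so that region boundaries snap to image edges; the nearest-prototype assignment above ignores that and only matches local statistics.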
Here is an example of using these strokes to quickly refine the result of the automatic color transfer approach. First, we compute the automatic color transfer result based on textural properties, using the house image as input and the sunset image as reference.
Automatic color transfer
We are happy with the purplish color of the house and hedge; however, the sky remains blue because of the blue sky in the reference. To get a better sunset color in the sky, we use another reference image and two strokes to match the skies of the two images.
Automatic result with stroke
New reference with stroke
Locally refined result
As we can see, the masks automatically computed from the strokes (bottom right) accurately separate the sky in the two images. In the final result, only the sky is changed to the purple color of the new reference's sky, producing a more convincing sunset.
This is an image gallery for my work on color transfer and colorization, described here. Additional results can be found here. This work was done at INRIA Grenoble during my PhD thesis and was published at Expressive 2016 where it received an honorable mention. You can download my presentation slides here.
The algorithm uses a reference or example image (bottom left) to recolorize the input image (top left) and create the result image (right).
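For context, a classic global baseline for this kind of recolorization is Reinhard-style per-channel mean and standard-deviation matching. The sketch below shows that baseline (not the texture-based method of the paper), assuming images as floating-point NumPy arrays in [0, 1].

```python
import numpy as np

def match_color_stats(input_img, reference_img):
    """Per-channel mean/std matching (Reinhard-style global color
    transfer); a classic baseline, not the paper's texture-aware method."""
    out = np.empty_like(input_img, dtype=np.float64)
    for c in range(input_img.shape[-1]):
        src = input_img[..., c].astype(np.float64)
        ref = reference_img[..., c].astype(np.float64)
        # Standardize the input channel, then give it the reference's
        # spread and mean.
        scale = ref.std() / (src.std() + 1e-8)
        out[..., c] = (src - src.mean()) * scale + ref.mean()
    return np.clip(out, 0.0, 1.0)
```

Because this baseline matches only global statistics, it cannot recolor regions differently, which is precisely the limitation that the texture-descriptor-based matching addresses.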
Color Transfer Results
My work on hedge detection, in collaboration with Mathieu Fauvel, was published in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. You can find the article here. This work was done at the University of Iceland in Reykjavik during my master's thesis.
Two examples of results are shown below for satellite images of rural areas.
The left column shows the Normalized Difference Vegetation Index (NDVI), which is used to detect vegetation in satellite images. In these images, vegetation appears in white, whereas non-vegetation areas (roads, buildings, water, etc.) are black. The right column shows in red the pixels our method detects as hedges.
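The NDVI itself has a simple closed form, (NIR - Red) / (NIR + Red): vegetation reflects strongly in the near-infrared band and absorbs red light, so values approach +1 over dense vegetation and drop to zero or below over water and man-made surfaces. A minimal NumPy version (the small epsilon guarding against division by zero is our addition):

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalized Difference Vegetation Index, (NIR - Red) / (NIR + Red).
    nir and red are reflectance arrays for the near-infrared and red
    bands; eps avoids division by zero on dark pixels."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)
```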