Texture Transfer Based on Texture Descriptor Variations

This month, together with Romain Vergne, Thomas Hurtut and Joëlle Thollot, we published a research report on our work on texture transfer based on textural variations. This report is the third chapter of my thesis manuscript and can be found here.

The idea of this work is to transfer a reference texture onto an input texture, reproducing the reference texture's patterns while preserving the input texture's global variations, such as illumination changes or perspective deformations.

This is done using only the input texture and a single reference texture. To do so, we use a texture descriptor to characterize the input texture's global variations, and a texture synthesis algorithm to reproduce the reference texture's patterns while preserving those variations.
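To make the overall idea concrete, here is a heavily simplified Python sketch. It is not our method: a Gaussian-blurred luminance stands in for the texture descriptor, and simple tiling plus detail recombination stands in for the synthesis step; the function name and sigma value are illustrative only.

import numpy as np
from scipy.ndimage import gaussian_filter

def toy_texture_transfer(input_tex, reference_tex, sigma=15.0):
    """Toy transfer for grayscale images in [0, 1]: tile the reference
    over the input while reinstating the input's large-scale luminance
    (a stand-in for the descriptor-guided synthesis in the report)."""
    h, w = input_tex.shape
    # Large-scale luminance of the input: the "global variation" we keep.
    input_base = gaussian_filter(input_tex, sigma)
    # Tile the reference to cover the input, then keep only its detail.
    reps = (h // reference_tex.shape[0] + 1, w // reference_tex.shape[1] + 1)
    ref_tiled = np.tile(reference_tex, reps)[:h, :w]
    ref_detail = ref_tiled - gaussian_filter(ref_tiled, sigma)
    # Reference patterns modulated by the input's global luminance.
    return np.clip(input_base + ref_detail, 0.0, 1.0)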

Examples of results are given below. From left to right, the images are the input, the reference and the result. Note that for each result we manually chose which global properties of the input (luminance, scale and/or orientation) to preserve in the result.

Local texture-based color transfer and colorization

An extension of our Expressive 2016 paper was published this month in a special issue of Computers & Graphics. The extended paper can be found here.

The main contribution of this extension is the ability to use simple strokes to define image regions and to apply color transfer or colorization between these regions only. The image regions are computed automatically from the user-provided strokes using our edge-aware texture descriptor.
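As a rough illustration of how a stroke can be turned into a region mask, the sketch below selects pixels whose local texture statistics resemble those under the stroke. Plain local mean/variance features stand in for our edge-aware texture descriptor, and the window size and threshold are illustrative, not the published settings.

import numpy as np
from scipy.ndimage import uniform_filter

def mask_from_stroke(image, stroke_mask, win=9, threshold=0.1):
    """Toy region mask from a user stroke on a grayscale image in [0, 1].
    stroke_mask is a boolean array marking the stroked pixels."""
    mean = uniform_filter(image, win)
    var = uniform_filter(image ** 2, win) - mean ** 2
    feats = np.stack([mean, np.sqrt(np.maximum(var, 0.0))], axis=-1)
    # Average local descriptor under the stroke.
    target = feats[stroke_mask].mean(axis=0)
    # Keep pixels whose descriptor is close to the stroked region's.
    dist = np.linalg.norm(feats - target, axis=-1)
    return dist < threshold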

Here is an example of using these strokes to quickly refine the result of the automatic color transfer approach. First, we compute the automatic color transfer result based on textural properties, using the house image as input and the sunset image as reference.

We are happy with the purplish color of the house and hedge; however, the sky remains blue because the sky in the reference is also blue. To get a better sunset color in the sky, we use another reference image and two strokes to match the skies of the two images.

As we can see, the masks automatically computed from the strokes (bottom right) accurately separate the sky in the two images. In the final result, only the sky is changed to the purple color of the new reference's sky, producing a better sunset feeling.
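The sketch below only illustrates the idea of restricting the transfer to the matched regions: a simple Reinhard-style matching of per-channel color statistics stands in for our texture-guided transfer, and pixels outside the input mask are left untouched.

import numpy as np

def region_color_transfer(input_img, ref_img, input_mask, ref_mask):
    """Match the color statistics of the masked input region to those of
    the masked reference region; RGB float images in [0, 1] assumed."""
    out = input_img.copy()
    for c in range(3):
        src = input_img[..., c][input_mask]
        dst = ref_img[..., c][ref_mask]
        # Shift and scale the source statistics towards the reference's.
        scaled = (src - src.mean()) / (src.std() + 1e-6) * dst.std() + dst.mean()
        out[..., c][input_mask] = np.clip(scaled, 0.0, 1.0)
    return out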


Automatic Texture Guided Color Transfer and Colorization

This is an image gallery for my work on color transfer and colorization, described here. Additional results can be found here. This work was done at INRIA Grenoble during my PhD thesis and was published at Expressive 2016 where it received an honorable mention. You can download my presentation slides here.

The algorithm uses a reference or example image (bottom left) to recolorize the input image (top left) and create the result image (right).

Color Transfer Results


Colorization Results

Detection of Hedges in a Rural Landscape Using a Local Orientation Feature: From Linear Opening to Path Opening

My work on hedge detection, in collaboration with Mathieu Fauvel, was published in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. You can find the article here. This work was done at the University of Iceland in Reykjavik during my master's thesis.

Two examples of results are shown below for satellite images of rural areas.

The left column shows the Normalized Difference Vegetation Index (NDVI) used to detect vegetation in satellite images. In those images, vegetation appears in white, whereas non-vegetation areas (roads, buildings, water, etc.) appear black. The right column shows in red the pixels detected as hedges by our method.
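For reference, the NDVI is computed per pixel as (NIR − Red) / (NIR + Red). The sketch below computes it and applies grey-level openings with line segments at several orientations, i.e. the linear-opening baseline mentioned in the title; the published method goes further with path openings, which follow slightly curved paths. The footprint construction, segment length and angles are illustrative, not the published implementation.

import numpy as np
from scipy.ndimage import grey_opening

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and Red bands."""
    return (nir - red) / (nir + red + 1e-6)

def directional_line(length, angle_deg):
    """Boolean footprint containing a straight line segment at a given angle."""
    fp = np.zeros((length, length), dtype=bool)
    c = length // 2
    t = np.linspace(-c, c, 2 * length)
    rows = np.round(c + t * np.sin(np.deg2rad(angle_deg))).astype(int)
    cols = np.round(c + t * np.cos(np.deg2rad(angle_deg))).astype(int)
    fp[rows.clip(0, length - 1), cols.clip(0, length - 1)] = True
    return fp

def linear_opening_max(img, length=25, angles=(0, 45, 90, 135)):
    """Supremum of linear openings over several orientations: keeps thin,
    elongated bright structures (such as hedges in an NDVI image)."""
    return np.max([grey_opening(img, footprint=directional_line(length, a))
                   for a in angles], axis=0)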
