Example-guided image editing - Inria - Institut national de recherche en sciences et technologies du numérique
Thesis, Year: 2017

Example-guided image editing

Edition d'images par l'exemple

Abstract

This thesis addresses three main topics from the domain of image processing: color transfer, high-dynamic-range (HDR) imaging, and guidance-based image filtering.

The first part of the thesis is dedicated to color transfer between input and target images. Color transfer is often viewed as a distribution transfer problem in which the image color distributions are modelled using the multivariate Gaussian distribution (MGD). Existing color transformations rely on the accuracy of the MGD model and may fail to produce plausible results when the MGD does not fit the distribution of the image colors well enough. To overcome this limitation, we adopt cluster-based techniques: we apply Gaussian mixture models to partition the input and target images into Gaussian clusters (each cluster follows an MGD), and we propose four new mapping policies to efficiently map the target clusters to the input clusters. Our results and evaluation show a significant improvement over existing color transfer methods.

Color transfer is limited to transferring color between images. To address this limitation, we exploit the properties of the multivariate generalized Gaussian distribution (MGGD), which can fit a wide class of image feature distributions, including distributions of color, gradient, wavelet coefficients, etc. We propose a novel transformation of the MGGD, which we apply to transfer color and gradient simultaneously. The proposed MGGD transformation also proves beneficial to other image processing tasks, such as color correction.

Even though the MGGD and the MGD are both continuous distributions, they are often adopted to model the discrete distributions of color and light in images. Our experiments show that the bounded Beta distribution provides a much more precise model for the color and light distributions of images. To exploit this property, we propose a new color transfer method in which the color and light distributions are modelled by the Beta distribution, and we introduce a novel transformation of the Beta distribution. The results obtained with our Beta transformation appear more natural and less saturated than those of recent state-of-the-art methods, while accurately representing the target color palette and truthfully portraying the target contrast.

Different color transfer methods often produce different output images, and determining the most plausible output may be subjective, as it depends on a person's preferences. To lessen the level of subjectivity in quality assessment for color transfer, we propose a model for objective evaluation of color transfer that explains the relationship between users' perception and a number of perceptual image features.

The second part of the thesis focuses on HDR imaging. First, we present a color transfer method between HDR images by extending existing color transfer methods to the HDR domain. Second, we introduce a method for the automatic creation of HDR images from only two images: a flash image and a non-flash image. We mimic the camera response function with a brightness function to obtain a number of differently exposed images from the non-flash image alone, and we then recover details from the flash image using our new chromatic adaptation transform (CAT), called the bi-local CAT. In this way, we efficiently recover the dynamic range of the real-world scene without compromising the quality of the HDR image, as our method is robust to misalignment. In the context of HDR image creation, the bi-local CAT recovers details from the flash image and removes flash shadows and reflections.

In the last part of the thesis, we exploit the potential of the bi-local CAT for various image editing applications such as image denoising, image deblurring, and texture transfer. We propose a novel guidance-based filter in which we embed the bi-local CAT. The proposed filter performs as well as, and for certain applications even better than, state-of-the-art methods.
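For background, the global MGD-based color transfer mentioned above admits a well-known closed-form solution: the linear Monge-Kantorovich mapping between two Gaussian distributions. The minimal Python sketch below illustrates this classical baseline only, not the cluster-based, MGGD or Beta methods proposed in the thesis; the function name and the assumption of float RGB images with values in [0, 1] are purely illustrative.

import numpy as np
from scipy.linalg import sqrtm

def mgd_color_transfer(input_img, target_img):
    # Global color transfer under a single multivariate Gaussian (MGD) model.
    # Both arguments are float arrays of shape (H, W, 3) with values in [0, 1];
    # the input image is remapped so that its color mean and covariance match
    # those of the target image.
    src = input_img.reshape(-1, 3).astype(np.float64)
    tgt = target_img.reshape(-1, 3).astype(np.float64)

    mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
    cov_s = np.cov(src, rowvar=False)
    cov_t = np.cov(tgt, rowvar=False)

    # Closed-form Monge-Kantorovich map between N(mu_s, cov_s) and N(mu_t, cov_t):
    # A = cov_s^{-1/2} (cov_s^{1/2} cov_t cov_s^{1/2})^{1/2} cov_s^{-1/2}
    cov_s_half = np.real(sqrtm(cov_s))
    cov_s_half_inv = np.linalg.inv(cov_s_half)
    A = cov_s_half_inv @ np.real(sqrtm(cov_s_half @ cov_t @ cov_s_half)) @ cov_s_half_inv

    out = (src - mu_s) @ A.T + mu_t
    return np.clip(out, 0.0, 1.0).reshape(input_img.shape)

Roughly speaking, the cluster-based approach summarized above applies this kind of per-Gaussian mapping cluster by cluster, once a mapping policy has paired each target cluster with an input cluster.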
Millions of images are uploaded every day to social media platforms such as Instagram, Google and Facebook. These platforms let users stylize their images by applying various filters or by adjusting brightness, contrast, color, etc. Most uploaded images are taken with a mobile-phone camera, so they are prone to noise and often contain blurred regions; tools for removing noise and blur have therefore become necessary to improve the appearance of users' photos. The popularity of social media has redirected research towards innovative image-editing algorithms aimed at enhancing and, in particular, stylizing images. The work described in this thesis contributes to the field of example-guided image editing, namely color (and style) transfer, HDR (high-dynamic-range) imaging, and guided image filtering.
Main file
manuscript.pdf (33.14 MB)
Origin: Files produced by the author(s)

Dates and versions

tel-01737486, version 1 (19-03-2018)

Identifiers

  • HAL Id: tel-01737486, version 1

Cite

Hristova Hristina. Example-guided image editing. Engineering Sciences [physics]. Université de Rennes 1 [UR1], 2017. English. ⟨NNT : ⟩. ⟨tel-01737486⟩
191 Views
202 Downloads
