We propose ZeST, a method for zero-shot material transfer to an object in an input image given a material exemplar image. ZeST leverages existing diffusion adapters to extract an implicit material representation from the exemplar image. This representation is used to transfer the material onto the object in the input image via a pre-trained inpainting diffusion model, using depth estimates as a geometry cue and grayscale object shading as an illumination cue. The method works on real images without any training, resulting in a zero-shot approach. Both qualitative and quantitative results on real and synthetic datasets demonstrate that ZeST outputs photorealistic images with transferred materials. We also show the application of ZeST to perform multiple edits and robust material assignment under different illuminations.
Our method comprises three branches. Given a material exemplar M and an input image I, we first encode the material exemplar with an image encoder (e.g., IP-Adapter). Concurrently, we convert the input image into a depth map and a foreground-grayscaled image to feed into the geometry and latent illumination guidance branches, respectively. By combining the two sources of guidance with the latent features from the material encoding, ZeST transfers the material properties onto the object in the input image while preserving all other attributes.
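The illumination-guidance input above can be sketched with a small preprocessing step: grayscale only the object's foreground so its shading survives while its original color (which would leak into the transferred material) is discarded. This is a minimal sketch, assuming an RGB image in [0, 1] and a boolean object mask; the function name and the Rec. 601 luminance weights are illustrative choices, not necessarily those used in ZeST.

```python
import numpy as np

def foreground_grayscale(image, mask):
    """Grayscale the masked object region, keeping the background intact.

    image: (H, W, 3) float RGB array in [0, 1]
    mask:  (H, W) boolean array, True where the object is

    Returns a copy of `image` whose foreground pixels carry only
    luminance (shading), serving as an illumination cue.
    """
    # Luminance via the Rec. 601 weights (an illustrative choice)
    gray = image @ np.array([0.299, 0.587, 0.114])
    out = image.copy()
    # Broadcast the single luminance value across all three channels
    out[mask] = gray[mask, None]
    return out
```

The depth map for the geometry branch would be produced analogously by an off-the-shelf monocular depth estimator; the two maps then condition the inpainting diffusion model alongside the exemplar's material encoding.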
We show 16 examples of material transfers from various exemplars. These material exemplars range from PBR materials to real-world objects.
We also present 2 examples of multiple material transfers within a single image.
@article{cheng2024zest,
title={ZeST: Zero-Shot Material Transfer from a Single Image},
author={Cheng, Ta-Ying and Sharma, Prafull and Markham, Andrew and Trigoni, Niki and Jampani, Varun},
journal={arXiv preprint arXiv:2404.06425},
year={2024}
}