Research Article | Open Access

Seamless and non-repetitive 4D texture variation synthesis and real-time rendering for measured optical material behavior

Fraunhofer IGD, Darmstadt, 64283, Germany.
TU Darmstadt, Darmstadt, 64289, Germany.
TU Graz, Graz, 8010, Austria.

Abstract

We show how to overcome the single weakness of an existing, fully automatic system for acquiring spatially varying optical material behavior of real object surfaces. Expressing spatially varying material behavior with spherical dependence on incoming light as a 4D texture (an ABTF material model) allows flexible mapping onto arbitrary 3D geometry, with photo-realistic rendering and interaction in real time. However, this texture-like representation exposes the model to common texturing problems, which manifest as two disadvantages. Firstly, non-seamless textures create visible artifacts at tile boundaries. Secondly, even a perfectly seamless texture causes repetition artifacts due to its regular placement in large numbers over a 3D surface. We solve both problems with a novel texture synthesis method that generates a set of seamless texture variations and distributes them randomly over the surface at shading time. Compared to regular 2D textures, the inter-dimensional coherence of the 4D ABTF material model poses entirely new challenges to texture synthesis, including maintaining consistent material behavior throughout the 4D space spanned by the spatial image domain and the angular illumination hemisphere. In addition, we tackle the increased memory consumption caused by the numerous variations through a fitting scheme specifically designed to reconstruct the most prominent effects captured in the material model.
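The abstract summarizes the shading-time idea: pre-synthesise several seamless variations of the 4D ABTF texture and pick one of them per surface tile at render time, so that adjacent tiles join without seams and never repeat in a regular pattern. The sketch below illustrates that lookup in Python with nearest-neighbour sampling. It is an illustration only: the array layout, the hash-based variation selection, and all names are assumptions made for exposition, not the authors' implementation, which runs on the GPU and is not reproduced in this article page.

# Illustrative sketch only; storage layout and hashing scheme are assumptions.
import numpy as np

def pick_variation(tile_x, tile_y, num_variations):
    # Hash the integer tile coordinate to one of the pre-synthesised seamless
    # variations so the choice is deterministic but visually irregular.
    h = ((tile_x * 73856093) ^ (tile_y * 19349663)) & 0xFFFFFFFF
    return h % num_variations

def shade_texel(abtf_variations, u, v, light_dir, tiles_per_unit=8):
    # Shading-time lookup into a stack of 4D ABTF texture variations.
    # abtf_variations: array of shape (N, L, H, W, 3) -- N seamless variations,
    #   each storing one RGB texture per captured light direction.
    # u, v: surface texture coordinates in [0, 1); light_dir: unit vector to the light.
    N, L, H, W, _ = abtf_variations.shape

    # 1. Which tile does this texel fall into, and which variation is used there?
    tile_x, tile_y = int(u * tiles_per_unit), int(v * tiles_per_unit)
    var = pick_variation(tile_x, tile_y, N)

    # 2. Local coordinates inside the tile; because every variation is seamless,
    #    adjacent tiles with different variations still meet without visible borders.
    lu, lv = (u * tiles_per_unit) % 1.0, (v * tiles_per_unit) % 1.0

    # 3. Nearest captured light direction on an assumed regular
    #    elevation x azimuth grid of L = n_theta * n_phi directions.
    theta = np.arccos(np.clip(light_dir[2], 0.0, 1.0))
    phi = np.arctan2(light_dir[1], light_dir[0]) % (2.0 * np.pi)
    n_theta = 4
    n_phi = L // n_theta
    li = (min(int(theta / (np.pi / 2.0) * n_theta), n_theta - 1) * n_phi
          + min(int(phi / (2.0 * np.pi) * n_phi), n_phi - 1))

    # 4. Nearest spatial texel of the chosen variation under that light.
    x = min(int(lu * W), W - 1)
    y = min(int(lv * H), H - 1)
    return abtf_variations[var, li, y, x]

# Tiny usage example with random data standing in for measured ABTF textures.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    abtf = rng.random((4, 16, 32, 32, 3))   # 4 variations, 16 light directions, 32x32 texels
    rgb = shade_texel(abtf, u=0.37, v=0.81, light_dir=np.array([0.3, 0.2, 0.93]))
    print(rgb)

In a real renderer this logic would live in a fragment shader with hardware-filtered texture fetches and interpolation over the illumination hemisphere; the Python version only shows the control flow of the per-tile variation choice and the 4D lookup.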

Computational Visual Media
Pages 161-170
Cite this article:
Ritz M, Breitfelder S, Santos P, et al. Seamless and non-repetitive 4D texture variation synthesis and real-time rendering for measured optical material behavior. Computational Visual Media, 2019, 5(2): 161-170. https://doi.org/10.1007/s41095-019-0141-4

Revised: 20 March 2019
Accepted: 29 March 2019
Published: 09 May 2019
© The author(s) 2019

This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.

The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Other papers from this open access journal are available free of charge from http://www.springer.com/journal/41095. To submit a manuscript, please go to https://www.editorialmanager.com/cvmj.
