Texture Analysis and Synthesis

Texture basis functions play a very important role in procedural texturing. In this project, a new generalized cellular texture basis function is proposed, which generalizes Worley's cellular texture basis function. Experimental results show that the proposed basis function can be used in procedural texturing algorithms to generate a wide range of interesting texture patterns, e.g., crumpled wrinkles, wood, marble, clouds, flame, organic crust, ice, and rock. With an efficient implementation, the new basis function is a viable alternative to the popular Perlin noise.
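
For illustration, the sketch below implements a basic Worley-style cellular basis function in 2D, returning the distance to the nearest feature point (F1). It is a minimal sketch of the underlying idea only; the generalized basis function developed in this project is not reproduced here, and the per-cell hashing scheme is an arbitrary choice.

import math
import random

def feature_point(ix, iy):
    # Deterministic pseudo-random feature point inside integer cell (ix, iy).
    rng = random.Random((ix * 73856093) ^ (iy * 19349663))
    return ix + rng.random(), iy + rng.random()

def cellular_basis(x, y):
    # F1 cellular basis: distance from (x, y) to the nearest feature point,
    # searching the 3x3 block of lattice cells around the query point.
    ix, iy = math.floor(x), math.floor(y)
    best = float("inf")
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            fx, fy = feature_point(ix + dx, iy + dy)
            best = min(best, math.hypot(x - fx, y - fy))
    return best

# Example: sample the basis function on a small grid of points.
values = [[cellular_basis(i * 0.25, j * 0.25) for i in range(8)] for j in range(8)]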

Texture analysis and synthesis have been active research topics in computer graphics and image processing. In this project, a novel genetic-based approach for texture analysis and synthesis has been developed. Three major techniques are used in the genetic-based approach: maximum likelihood estimation, a globally convergent method for solving systems of non-linear equations, and a genetic-based search technique. Compared with traditional texture analysis and synthesis approaches, the new genetic-based approach has two major advantages. First, it can handle a wide range of textures, including procedural textures as well as real textures. Second, the new approach can easily apply the synthesized textures onto the surfaces of 3D synthetic objects using a procedural texturing approach. Experimental results are presented to demonstrate these advantages.
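
As a rough illustration of the genetic search component only, the sketch below evolves a population of parameter vectors for a procedural texture toward a target image. The functions render_texture and texture_distance are hypothetical placeholders for a procedural texture model and a texture similarity measure supplied by the caller; they are not the maximum likelihood estimator or the non-linear solver used in this project.

import random

def estimate_parameters(target, render_texture, texture_distance,
                        n_params=4, pop_size=30, generations=50):
    # Each individual is a vector of normalized procedural texture parameters.
    pop = [[random.uniform(0.0, 1.0) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness: smaller distance between synthesized and target texture is better.
        scored = sorted(pop, key=lambda p: texture_distance(render_texture(p), target))
        parents = scored[:pop_size // 2]                 # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)          # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_params)               # mutation
            child[i] = min(1.0, max(0.0, child[i] + random.gauss(0.0, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda p: texture_distance(render_texture(p), target))

# Usage, with user-supplied render_texture and texture_distance functions:
# best_params = estimate_parameters(target_image, render_texture, texture_distance)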



Examples of synthesized results. Top row: input textures; bottom row: output textures.

Examples of procedurally synthesizing input textures onto surfaces of 3D synthetic objects using the genetic-based multi-resolution parameter estimation technique. Images in the first column are input textures, and the corresponding synthesized results are to the right of the input textures. The input textures in the first two rows are procedural textures, and the last two input textures are real images taken by a digital camera.

Aura Matrices in Texture Synthesis

In this project, we present a new mathematical framework for modeling texture images using independent Basic Gray Level Aura Matrices (BGLAMs). We prove that the independent BGLAMs form a basis of the Gray Level Aura Matrices (GLAMs) and that an image can be uniquely represented by its independent BGLAMs. We also propose a new BGLAM distance measure for automatically evaluating synthesis results with respect to the input texture, i.e., for determining whether the output is a successful synthesis of the input. For the application to texture synthesis, we present a new algorithm that synthesizes textures by sampling only the independent BGLAMs of an input texture. The performance of our approach, both in synthesizing textures and in evaluating the results, is extensively evaluated and compared with the symmetric GLAMs used in existing techniques and with Gray Level Co-occurrence Matrices (GLCMs). Experimental results show that (1) our approach significantly outperforms both symmetric GLAMs and GLCMs; (2) the new BGLAM distance measure can evaluate synthesis results automatically, replacing the conventional visual inspection used to decide whether the output texture is a successful synthesis of the input; and (3) a broad range of textures can be faithfully synthesized using independent BGLAMs, with results comparable to existing techniques.
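
For readers unfamiliar with aura matrices, the sketch below computes a basic gray level aura matrix for a single displacement vector of a quantized gray-scale image: entry A[i][j] counts how often a pixel of gray level i has a neighbor of gray level j at that displacement. The image representation and the chosen displacements are illustrative assumptions; the example also shows that opposite displacements yield transposed matrices, which is why only a subset of the BGLAMs carries independent information.

def bglam(image, displacement, n_levels):
    # Basic gray level aura matrix of a 2D image (list of lists of gray levels)
    # for one displacement vector (dy, dx).
    dy, dx = displacement
    h, w = len(image), len(image[0])
    A = [[0] * n_levels for _ in range(n_levels)]
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                A[image[y][x]][image[ny][nx]] += 1
    return A

# Example with 4 gray levels.
img = [[0, 1, 2, 3],
       [1, 2, 3, 0],
       [2, 3, 0, 1]]
A_right = bglam(img, (0, 1), 4)
A_left = bglam(img, (0, -1), 4)   # equals the transpose of A_right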

Examples of synthesized results

The aura matrix approach described above has been extended to generate solid textures from input examples. Our method is fully automatic and requires no user interaction. Given an input texture sample, the method first creates its aura matrix representation and then generates a solid texture by sampling the aura matrices of the input sample, constrained in multiple view directions. Once the solid texture is generated, any given object can be textured with it. We evaluate the results of our method through extensive user studies; based on the evaluations by human subjects, we conclude that our algorithm can generate faithful results for both stochastic and structural textures, with an average success rate of 76.4%. Our experimental results also show that the new method outperforms Wei and Levoy's method and is comparable to the method proposed by Jagnow, Dorsey, and Rushmeier.
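
The sketch below is a crude caricature of the idea of constraining a solid texture by the aura matrices of a 2D exemplar: voxels of a small volume are greedily relabeled so that the BGLAMs of axis-aligned slices move closer to the BGLAMs of the input sample. The tiny volume, the two displacements, the normalized matrix distance, and the naive greedy sweeps are all simplifying assumptions made for illustration; this is not the published Aura 3D algorithm. The bglam helper from the earlier sketch is repeated so the block is self-contained.

import random

def bglam(img, disp, levels):
    # Basic gray level aura matrix of a 2D image for one displacement vector.
    dy, dx = disp
    h, w = len(img), len(img[0])
    A = [[0] * levels for _ in range(levels)]
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                A[img[y][x]][img[ny][nx]] += 1
    return A

def bglam_distance(A, B):
    # Compare normalized matrices so the slices and the sample may differ in size.
    ta = sum(map(sum, A)) or 1
    tb = sum(map(sum, B)) or 1
    return sum(abs(a / ta - b / tb) for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def synthesize_solid(sample, levels, size=8, disps=((0, 1), (1, 0)), sweeps=3):
    target = {d: bglam(sample, d, levels) for d in disps}
    # Start from a random volume of gray levels, indexed as vol[z][y][x].
    vol = [[[random.randrange(levels) for _ in range(size)]
            for _ in range(size)] for _ in range(size)]

    def cost(z, y, x):
        # BGLAM mismatch of the three axis-aligned slices through voxel (z, y, x).
        slices = (vol[z],
                  [plane[y] for plane in vol],
                  [[vol[k][j][x] for j in range(size)] for k in range(size)])
        return sum(bglam_distance(bglam(s, d, levels), target[d])
                   for s in slices for d in disps)

    for _ in range(sweeps):                  # greedy voxel-wise relabeling sweeps
        for z in range(size):
            for y in range(size):
                for x in range(size):
                    scored = []
                    for g in range(levels):
                        vol[z][y][x] = g
                        scored.append((cost(z, y, x), g))
                    vol[z][y][x] = min(scored)[1]
    return vol

# Tiny 2-level example input; real inputs would be quantized gray-scale images.
sample = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [1, 1, 0, 0],
          [1, 1, 0, 0]]
solid = synthesize_solid(sample, levels=2, size=6)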


References

X. Qin and Y.H. Yang, "A generalized cellular texture basis function," Proc. of the 12th Annual Graduate Symposium on Computer Science, University of Saskatchewan, 2000, pp. 153-162.

X. Qin and Y.H. Yang, "Aura 3D Textures," IEEE Trans. on Visualization and Computer Graphics, to appear. Click here to see more results.

X. Qin and Y.H. Yang, "Basic gray level aura matrices: theory and its application to texture synthesis," Proceedings of the International Conference on Computer Vision, Beijing, China, Oct. 15-21, 2005.

X. Qin and Y.H. Yang, "Estimating parameters for procedural texturing by genetic algorithms," Graphical Models, Vol. 64, 2002, pp. 19-39.

X. Qin and Y.H. Yang, "User Constrained Multiscale MRF Model for Texture Mixture Synthesis and its Application to Texture Replacement," TR05-25.

X. Qin and Y.H. Yang, "Theoretical Analysis of Graphcut Textures," TR05-26.

X. Qin and Y.H. Yang, "Representing Texture Images using Asymmetric Gray Level Aura Matrices," TR05-27.

X. Qin and Y.H. Yang, "Aura Texture," TR05-30.

X. Qin and Y.H. Yang, "Aura 3D Texture," TR06-09.