Material Reconstruction and High Quality, Angle-Dependent Real-Time Visualization on Mobile Devices
Research project in the area of Virtual and Augmented Reality.
About this Project
The quality of synthetic, photorealistic images depends crucially on the quality of the materials used in them. A customer expects visualized materials (e.g. when buying a car or furniture) to match the real material properties of the purchased product.
These material properties have to be measured precisely. Large companies such as automobile manufacturers own expensive equipment to measure all the materials they require. Smaller design studios, 3D designers, and 3D modelers, however, cannot afford such equipment or complex hardware setups and therefore have no way to measure materials at all.
When creating realistic 3D models, the correct capture and representation of the surfaces of real objects is essential. The material properties of a surface are traditionally measured with a gonioreflectometer, which records the reflection behavior of a material, usually to construct its Bidirectional Reflectance Distribution Function (BRDF). The BRDF describes the reflection behavior of a surface for any angle of incidence: for each light beam hitting the material at a given entry angle, it gives the ratio of reflected radiance to incident irradiance for each outgoing direction. The BRDF is used in the rendering equation to simulate material appearance as realistically as possible with respect to its radiometric properties.
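To illustrate how a BRDF enters the rendering equation, the following sketch evaluates the constant BRDF of an ideal diffuse (Lambertian) surface and the reflected radiance it yields for a single incident direction. This is a minimal textbook example for illustration only, not the reconstruction method developed in this project:

```python
import math

def lambertian_brdf(albedo):
    """Constant BRDF of an ideal diffuse surface: f_r = albedo / pi.

    The division by pi ensures energy conservation: integrating
    f_r * cos(theta) over the hemisphere gives exactly the albedo.
    """
    return albedo / math.pi

def reflected_radiance(brdf_value, incident_radiance, cos_theta_i):
    """One summand of the rendering equation for a single light direction:
    L_o = f_r * L_i * cos(theta_i), clamped so light from below the
    surface contributes nothing."""
    return brdf_value * incident_radiance * max(cos_theta_i, 0.0)

# Example: diffuse surface with 80% albedo, unit-radiance light
# arriving at 60 degrees from the surface normal (cos = 0.5).
f_r = lambertian_brdf(0.8)
L_o = reflected_radiance(f_r, incident_radiance=1.0,
                         cos_theta_i=math.cos(math.radians(60.0)))
```

A measured BRDF replaces the constant `f_r` with a lookup or fitted model that depends on both the incoming and outgoing directions, which is exactly what a gonioreflectometer samples.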
Simple and rapid acquisition of the BRDF is a problem that has not yet been solved satisfactorily. The aim of this project is to develop a simple, fast, and inexpensive method for reconstructing the BRDF. Our approach uses the single camera of a mobile device (tablet or smartphone). After recording, the scene geometry as well as the radiometric properties of the scene and its materials are reconstructed. With the presented method, the user can reconstruct the materials, geometry, and lighting of a real scene using nothing but a mobile device.
Viewing the reconstructed materials requires a visualization tool. We aim to develop a mobile visualization platform that allows angle-dependent, interactive viewing of the material.
To track the user, or rather the position of the mobile device in the room, our approach combines an inertial sensor with visual tracking to provide a precise and stable position and orientation (six degrees of freedom, 6DOF). All required sensors (gyroscope, accelerometer, magnetometer, and RGB camera) are already integrated in the mobile device and will be used for 6DOF determination. This interaction approach is easy to learn and, in contrast to multi-touch finger gestures, requires no specific prior knowledge. Multi-touch interaction with 3D objects is complex and requires manipulating multiple fingers simultaneously; such gesture control has to be learned and demands fine motor skills. Our approach is based on interaction metaphors from reality and is therefore intuitive to use.
For the user, both the BRDF acquisition and the interactive visualization tool will be intuitive, quick, and easy to use.