Global illumination is essential for realistic images and 3D configurators. In this paper, we present a method for generating photon maps that can later be used to texture surfaces. These texture maps are created by tracing the light emitted from light sources and recording where that light is absorbed. This allows the textures to encode the shadowing and global-illumination properties of the scene.
Background.
Global illumination is computed by generating rays from the light sources and tracing their paths through the scene. When light hits a surface, part of its energy is absorbed. If the surface is subdivided into smaller areas, we can record how much light hits each region. These areas correspond to the texels of the photon map.
As a post-process, we normalize the amount of light in each texel by the texel's area. We also normalize by the maximum light value in the scene, so that the brightest texel receives the value 1, the brightest color representation. The texels are then written to image files that are used as texture maps in a rendering system.
Implementation.
The implementation can be considered in three main parts:
- Tracking the light emitted by the light source.
- Tracking how much light hits each surface.
- Generating texture maps for use in a rendering system.
Tracking the light emitted by the light source.
From the light source, we generate a ray of light that has both a direction and a color. The color can be tracked as an RGB triple, or each color channel can be traced individually. The ray is used to calculate the intersection point in the scene; once found, light is added to the corresponding texel on the surface. Since objects in the scene absorb light, we need to determine when a ray should no longer contribute. We compute the termination probability from the components of the light color, probability = (red + green + blue) / 3. We then draw a random number r in [0..1], and if the probability is less than r, we terminate the ray. Alternatively, the probability can be computed as probability = max(red, green, blue).
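This Russian-roulette termination test can be sketched as follows; `should_terminate` is a hypothetical helper name, and the injectable `rng` parameter is an assumption made for testability, not part of the paper's implementation:

```python
import random

def should_terminate(color, rng=random.random):
    """Russian-roulette termination for a light ray.

    `color` is an (r, g, b) tuple with components in [0, 1].
    Returns True if the ray should stop contributing.
    """
    r, g, b = color
    survival = (r + g + b) / 3.0    # average-component variant from the text
    # survival = max(r, g, b)       # alternative variant from the text
    return survival < rng()         # terminate when probability < random draw
```

Darker surfaces yield lower survival probabilities, so rays die sooner on absorbing materials while bright surfaces keep them alive longer.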
If the ray is not terminated, it continues, and two things must be calculated: the new direction and the new color. The direction is chosen as a random direction in the hemisphere around the surface normal. The color is the component-wise product of the surface color with the light color, scaled by the dot product of the negated incident direction with the surface normal (new_color = RGB(light.r * surface.r, light.g * surface.g, light.b * surface.b) * (-light.direction · normal)).
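A minimal sketch of this bounce step, assuming vectors are plain (x, y, z) tuples; the helper names and the rejection-sampling strategy are illustrative choices, not the paper's code:

```python
import math
import random

def sample_hemisphere(normal, rng=random.random):
    """Uniform random unit direction in the hemisphere around `normal`,
    via rejection sampling of the unit sphere."""
    while True:
        v = tuple(2.0 * rng() - 1.0 for _ in range(3))
        n2 = sum(c * c for c in v)
        if 0.0 < n2 <= 1.0:
            inv = 1.0 / math.sqrt(n2)
            d = tuple(c * inv for c in v)
            # flip into the hemisphere of the surface normal
            if sum(a * b for a, b in zip(d, normal)) < 0.0:
                d = tuple(-c for c in d)
            return d

def bounce_color(light_color, surface_color, light_dir, normal):
    """Component-wise product of light and surface colors,
    scaled by -light_dir . normal (the cosine term from the text)."""
    cos_theta = -sum(a * b for a, b in zip(light_dir, normal))
    return tuple(l * s * cos_theta
                 for l, s in zip(light_color, surface_color))
```

For example, white light arriving head-on at a 50% gray surface bounces with color (0.5, 0.5, 0.5).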
This process is repeated until the ray is terminated. When a ray terminates, the light source generates another one.
Tracking how much light hits each surface.
The light is tracked with a two-dimensional array that corresponds to the UV parameterization of the surface. Any surface can be used as long as it has a UV parameterization and the area per texel is known.
When an intersection point is found, the value of the corresponding texel is incremented by the component-wise product of the light color and the surface color.
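The accumulation step above might look like the following sketch, assuming u and v lie in [0, 1) and the photon map is a row-major grid of mutable RGB triples; `accumulate` is a hypothetical helper, not the paper's code:

```python
def accumulate(photon_map, u, v, res_u, res_v, light_color, surface_color):
    """Deposit light at the texel covering UV coordinates (u, v).

    `photon_map` is a res_v x res_u grid of [r, g, b] lists.
    The deposited value is the component-wise product of the
    light color and the surface color.
    """
    i = min(int(u * res_u), res_u - 1)   # texel column
    j = min(int(v * res_v), res_v - 1)   # texel row
    for c in range(3):
        photon_map[j][i][c] += light_color[c] * surface_color[c]
```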
Generating texture maps for use in a rendering system.
Once a specified number of rays have been traced from the light sources, the texture-creation process can begin. Since the maximum value of a texel depends on the number of rays traced in a scene, the texel values must be normalized by the maximum of each color channel. In a first pass over the set of texels in the scene, the maximum value of each color component is determined. A second pass normalizes each color value by the corresponding maximum. Each texel value is also divided by its area.
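The two-pass normalization can be sketched as below. The function name is hypothetical, and dividing by the texel area before finding the maxima (so the brightest texel still ends at exactly 1) is an assumed ordering; the paper does not specify it, and it also assumes a uniform texel area, whereas the paper allows the area to vary per surface:

```python
def normalize_photon_map(photon_map, texel_area=1.0):
    """Normalize so the brightest texel of each channel becomes 1.

    Pass 0: divide each texel by its area.
    Pass 1: find the per-channel maxima.
    Pass 2: divide each channel by its maximum.
    """
    for row in photon_map:
        for texel in row:
            for c in range(3):
                texel[c] /= texel_area
    maxima = [max(texel[c] for row in photon_map for texel in row)
              for c in range(3)]
    for row in photon_map:
        for texel in row:
            for c in range(3):
                if maxima[c] > 0.0:   # skip channels that received no light
                    texel[c] /= maxima[c]
    return photon_map
```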
The texture maps can then be saved and loaded by the rendering system. We chose the real-time ray tracer developed by Steve Parker and Pete Shirley because it lets us use large amounts of texture memory. The texture files were stored per polygon and read in when the visualization started.
Results.
Only rectangular primitives were used to create our scenes. As already mentioned, other primitives with adequate UV parameterizations and known texel areas, such as triangle meshes or NURBS, could also be used.
Since the method depends on an even distribution of random rays, the quality of the textures increases with the number of rays per texel. This reduces speckle noise. For good results, 100 to 1,000 rays per texel are recommended.
Although this is an "offline" preprocessing step, generating these texture maps can be quite time-consuming. With a naive implementation on an Origin 3000, it took 240 hours to trace 100 million rays in a scene with 2.2 million texels.
Conclusions.
This is a straightforward implementation of global illumination. In contrast to form-factor computation, which scales as O(n^2) in the number of discrete surfaces, this approach scales with the cost of computing ray-primitive intersections. Using ray-tracing optimizations such as a bounding volume hierarchy, O(n log n) scaling can be achieved.
One of the disadvantages of this implementation is the cost of the texture memory. For relatively complex scenes, using OpenGL texture memory becomes impossible because the amount of texture required exceeds what fits in that memory. For this reason, we decided to use the real-time ray tracer, whose texture memory is limited only by main memory.