A general rule of thumb for real-time 3D rendering is that drawing multiple overlapping transparent objects is always harder than drawing opaque objects. This is because opaque objects are compatible with the use of a depth buffer, so no specific rendering order is required.
A depth buffer is a surface with the same dimensions as the screen that stores, for each pixel, how far from the camera the last pixel drawn there was. With this information we can draw as many objects as we want and always be sure that we will never draw something that should be hidden by another object. BabylonJS provides access to this information with a special DepthRenderer object.
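The per-pixel test can be sketched in plain JavaScript. This is a toy model of the idea, not BabylonJS or GPU code: each pixel remembers the depth of the nearest fragment drawn so far, and a new fragment is drawn only if it is closer.

```javascript
// Toy depth buffer: one depth value per pixel, initialized to "infinitely far".
function createDepthBuffer(size) {
  return new Array(size).fill(Infinity);
}

// Draw a fragment only if it is closer than what is already stored there.
function drawFragment(depthBuffer, pixel, depth) {
  if (depth < depthBuffer[pixel]) {
    depthBuffer[pixel] = depth; // remember the new nearest depth
    return true;                // fragment is visible
  }
  return false;                 // fragment is hidden, discard it
}

const buffer = createDepthBuffer(4);
console.log(drawFragment(buffer, 0, 10)); // true: nothing drawn here yet
console.log(drawFragment(buffer, 0, 20)); // false: a closer fragment already exists
console.log(drawFragment(buffer, 0, 5));  // true: closer than the stored 10
```

With this test in place, the order in which the three fragments arrive no longer matters for the final image.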
Rendering objects without a depth buffer would require an old-school technique called the painter's algorithm, which is extremely simple: draw the more distant objects first. The sky, then the background, and so on up to the foreground objects. This is basically ordering by distance from the camera (a.k.a. depth), and in most cases it is obviously not sufficient.
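The painter's algorithm amounts to a single sort. A sketch in plain JavaScript, using a hypothetical object list with precomputed camera distances:

```javascript
// Hypothetical scene objects with a precomputed distance to the camera.
const objects = [
  { name: "tree",     distance: 5 },
  { name: "sky",      distance: 1000 },
  { name: "mountain", distance: 200 },
];

// Painter's algorithm: draw the most distant objects first, back to front.
const drawOrder = [...objects]
  .sort((a, b) => b.distance - a.distance)
  .map((o) => o.name);

console.log(drawOrder); // ["sky", "mountain", "tree"]
```

This breaks down as soon as objects interpenetrate or a single object folds over itself, which is why a depth buffer is the norm.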
Testing against a depth buffer during rendering is a very common technique, easy to implement and cheap in terms of performance. For non-opaque objects, however, things become more complicated, because a depth buffer can no longer be used (since these objects do not completely hide what is behind them).
Before BabylonJS draws its meshes on the screen, it sorts them into the following categories, which are rendered in this order:
- Depth Pre-Pass Meshes
- Opaque Meshes
- Alpha-tested meshes
- Alpha blended meshes sorted by depth (=distance to camera)
- Sprites (managed by the SpriteManager)
- Particles (processed by the ParticleSystem)
The last two categories are self-explanatory. Note that they are always drawn after all other meshes and ignore any depth buffer created previously: they therefore cannot be occluded by regular meshes.
Renderable objects can be organized into rendering groups that act as layers. Layers are rendered in ascending order by ID, starting with the default layer (which has ID 0). Within each rendering group, the above “General Order” is also used.
To use rendering groups, all you need to do is set the .renderingGroupId property on the objects you want to place in layers other than the default one.
This property exists on meshes, particle systems, and sprite managers.
By default, there are a total of 4 rendering groups, which means that the only valid ids are 0, 1, 2, and 3. This can be increased by setting the static property BABYLON.RenderingManager.MAX_RENDERINGGROUPS to the desired number of groups (e.g. set it to 8 to support ids 0 through 7).
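In BabylonJS itself this is just a property assignment (`mesh.renderingGroupId = 1;`, plus raising `BABYLON.RenderingManager.MAX_RENDERINGGROUPS` if needed). The effect on ordering can be sketched in plain JavaScript with toy renderables:

```javascript
// Toy renderables tagged with a renderingGroupId (default 0), as in BabylonJS.
const renderables = [
  { name: "hudSprite", renderingGroupId: 2 },
  { name: "ground",    renderingGroupId: 0 },
  { name: "outline",   renderingGroupId: 1 },
  { name: "skybox",    renderingGroupId: 0 },
];

// Groups act as layers: lower ids are rendered first. Array.prototype.sort
// is stable, so objects within one group keep their relative order here.
const order = [...renderables]
  .sort((a, b) => a.renderingGroupId - b.renderingGroupId)
  .map((o) => o.name);

console.log(order); // ["ground", "skybox", "outline", "hudSprite"]
```

Within each group the engine still applies the full general order (opaque, alpha-tested, alpha-blended, etc.); this sketch only shows the layer ordering.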
Meshes have another property that affects the rendering order: .alphaIndex.
By default, this property is set to Number.MAX_VALUE, which is the highest value a numeric variable can hold (about 1.79E+308).
Unlike opaque and alpha-tested meshes, the BabylonJS rendering engine sorts alpha-blended meshes by depth before drawing them on the screen. The .alphaIndex property lets you override this sorting: a mesh with a lower alpha index than another is always rendered before it, regardless of its depth.
To put it more simply: alpha-blended meshes are first sorted by alpha index and then by depth (distance to camera).
Note that this feature only works for alpha-blended meshes and has absolutely no effect on opaque or alpha-tested meshes.
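The combined ordering can be sketched with a plain JavaScript comparator (toy data, not the engine's internal code): alpha index first, then back to front by distance.

```javascript
// Toy alpha-blended meshes. alphaIndex defaults to Number.MAX_VALUE.
const meshes = [
  { name: "glassA",  alphaIndex: Number.MAX_VALUE, distance: 10 },
  { name: "glassB",  alphaIndex: Number.MAX_VALUE, distance: 30 },
  { name: "overlay", alphaIndex: 0,                distance: 5  },
];

// Sort by alphaIndex (ascending), then back to front (farther drawn first).
const drawOrder = [...meshes]
  .sort((a, b) =>
    a.alphaIndex !== b.alphaIndex
      ? a.alphaIndex - b.alphaIndex
      : b.distance - a.distance)
  .map((m) => m.name);

console.log(drawOrder); // ["overlay", "glassB", "glassA"]
```

Note that "overlay" is drawn first despite being closest to the camera, purely because of its lower alpha index.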
Note: This property can be set manually on meshes in 3ds Max using the BabylonJS Exporter plugin.
Opaque or transparent?
How your meshes are categorized can be very important for the final look of your scene. Let’s take a closer look at how the first four categories are defined.
Depth Pre-Pass Meshes.
Depth pre-pass meshes get an additional rendering pass, during which they are rendered into the depth buffer only. The depth pre-pass is not exclusive: a mesh can have both a depth pre-pass and an opaque or alpha-blend pass. The depth pre-pass is enabled for a mesh by setting mesh.material.needDepthPrePass = true. The goal is either to optimize the scene by filling the depth buffer first to reduce overdraw, or to help reduce alpha-blending sorting problems.
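The overdraw benefit can be sketched in plain JavaScript (a toy model of two opaque fragments covering the same pixel, not engine code):

```javascript
// Two opaque fragments cover the same pixel, drawn far-to-near
// (the worst case for overdraw).
const fragments = [{ depth: 20 }, { depth: 5 }];

// Without a pre-pass: every fragment that passes the depth test is shaded.
function shadedWithoutPrePass(frags) {
  let nearest = Infinity, shaded = 0;
  for (const f of frags) {
    if (f.depth < nearest) { nearest = f.depth; shaded++; }
  }
  return shaded;
}

// With a pre-pass: pass 1 fills the depth buffer only; pass 2 shades just
// the fragments that match the final (nearest) depth.
function shadedWithPrePass(frags) {
  const nearest = Math.min(...frags.map((f) => f.depth)); // depth-only pass
  return frags.filter((f) => f.depth === nearest).length; // color pass
}

console.log(shadedWithoutPrePass(fragments)); // 2: the far fragment was shaded in vain
console.log(shadedWithPrePass(fragments));    // 1: only the visible fragment is shaded
```

The saving grows with the cost of the fragment shading and the amount of overlap in the scene.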
Opaque Meshes.
These are the easiest to render: their polygons are fully drawn on the screen with their colors and textures. A depth buffer is used to ensure that nothing is drawn over something closer to the camera.
Alpha-tested Meshes.
These work like opaque meshes, except that some parts of them can be fully transparent. Alpha testing means that each pixel of the mesh is either opaque (and then drawn on the screen and into the depth buffer) or transparent, in which case the pixel is completely discarded. Although very efficient, this type of rendering usually produces aliased borders and does not allow smooth transparency effects.
A pixel is considered transparent (and discarded) if its alpha value is below 0.4; otherwise it is opaque. This threshold is currently hard-coded.
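As a sketch of that rule in plain JavaScript (mirroring the hard-coded 0.4 threshold; the function name is illustrative, not a BabylonJS API):

```javascript
// Alpha test: a pixel is either discarded entirely or treated as fully opaque;
// there is no in-between blending.
function alphaTest(alpha, cutoff = 0.4) {
  return alpha < cutoff ? "discard" : "opaque";
}

console.log(alphaTest(0.1)); // "discard"
console.log(alphaTest(0.4)); // "opaque"
console.log(alphaTest(0.9)); // "opaque"
```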
Alpha blended Meshes.
These meshes have transparent parts that can have any alpha value from 0.0 (fully transparent) to 1.0 (fully opaque). Their color is blended with what lies behind them to reflect that. These meshes are sorted by depth, based on the center of their bounding sphere. This does not prevent some problems when multiple alpha-blended meshes overlap.
Also note that backface culling is pretty much mandatory for alpha-blended meshes, as otherwise polygons from the front and the back of the objects would be drawn in a jumbled order (unless you use a depth pre-pass).
Alpha blended meshes:
- Any mesh where
  - the hasVertexAlpha property is set to true (automatically set for exported meshes whose vertices have individual alpha (transparency) values),
  - or the visibility value is less than 1.
- In the case of a mesh with a standard material, if it has:
  - an opacity texture defined,
  - or an alpha value less than 1.
- In the case of another material type, if the material's .needAlphaBlending() function returns true.
Alpha tested meshes:
- In the case of a mesh with a standard material, if it has:
- a diffuse texture with the .hasAlpha property set to true.
- In the case of another material type, if the material's .needAlphaTesting() function returns true.
Opaque meshes:
- Any mesh that does not fit into one of the above categories.
Occasionally, some of your meshes may fall into the wrong category, e.g. an alpha-tested mesh needlessly flagged as alpha-blended, or a mesh that stays opaque when it shouldn’t. This causes rendering glitches that can sometimes be very annoying.
What should and shouldn’t be done.
- Make sure that your alpha-blended meshes do not intersect, as this will inevitably lead to rendering errors.
- Avoid large, stretched alpha-blended meshes (i.e. ones covering big areas): since the center of the bounding sphere is used for depth sorting, a mesh may extend close to the camera and still be sorted as farther away than many other meshes.
- Use alpha testing as much as possible. It can look perfect for a pixel-art style, or when the boundaries of the transparent parts are straight horizontal or vertical lines.
- To eliminate jagged edges on your alpha-tested meshes, use anti-aliasing in your scene (FxaaPostProcess). When using it, you can even disable WebGL’s built-in anti-aliasing when creating the Engine object:
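A minimal sketch of both steps, assuming the standard BABYLON.Engine and FxaaPostProcess constructor signatures; `canvas` and `camera` are assumed to exist from your regular scene setup:

```javascript
// Create the engine with WebGL's built-in antialiasing disabled
// (second constructor argument).
var engine = new BABYLON.Engine(canvas, false);

// ... scene and camera setup ...

// Apply FXAA as a post-process on the camera to smooth jagged
// alpha-tested edges.
var fxaa = new BABYLON.FxaaPostProcess("fxaa", 1.0, camera);
```

This trades WebGL's multisampling for a cheaper screen-space pass that also smooths the hard cutout edges produced by alpha testing.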