To do: Verify all this. Original author says there may be inaccuracies (and I'm just copy-pasting).
The Source 2 engine appears to be designed with multiple different renderers in mind. SteamVR Home appears to use a forward renderer tailored for VR applications, while Dota 2 uses a deferred renderer with real-time lighting.
The renderer in SteamVR Home supports a combination of two kinds of lighting, real-time direct illumination and static baked global illumination. The real-time illumination uses point light sources and cascaded shadow maps. The global illumination uses photon mapping to generate lighting which is baked to light probe volumes and mesh vertices. Image-based lighting is also supported. Baking lightmaps seems unsupported currently, so meshes are limited to vertex lighting.
The lighting can be split into four groups: direct and indirect (ambient) lighting, each with a diffuse and a specular component. Physically based rendering (PBR) is assumed, which among other things means that surfaces are energy conserving and cannot reflect more light than they receive. The light received by a surface is split into diffuse and specular reflections according to its material properties.
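The energy-conservation constraint can be illustrated with a minimal sketch. The function and parameter names below (`albedo`, `specular_strength`) are invented for illustration and are not Source 2 material parameters; the point is only that the diffuse and specular parts together never exceed the incoming light.

```python
# Hypothetical illustration of an energy-conserving diffuse/specular split.
# Names are illustrative, not actual Source 2 shader parameters.

def split_lighting(incoming_light, albedo, specular_strength):
    """Split incoming light into diffuse and specular parts such that
    the total reflected energy never exceeds the incoming energy."""
    specular = incoming_light * specular_strength
    # Energy that goes into the specular reflection is no longer
    # available for the diffuse reflection (energy conservation).
    diffuse = incoming_light * (1.0 - specular_strength) * albedo
    return diffuse, specular

diffuse, specular = split_lighting(1.0, albedo=0.8, specular_strength=0.25)
assert diffuse + specular <= 1.0  # never brighter than the incoming light
```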
- Direct diffuse light can either be rendered with real-time lighting or baked into the static lighting.
- Indirect diffuse light is baked into the static lighting.
- Direct specular light is emitted by direct light sources and is rendered on surfaces with specular materials.
- Indirect specular light is baked to environment maps using cubemap entities, and the nearest cubemap in range is used to illuminate materials.
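The "nearest cubemap in range" rule can be sketched as follows. The data layout (origin plus influence radius per cubemap) is an assumption for illustration, not the engine's actual representation.

```python
import math

# Hypothetical sketch of nearest-cubemap selection; the (origin, radius)
# tuples are assumptions for illustration only.

def nearest_cubemap(surface_pos, cubemaps):
    """cubemaps: list of (origin, radius) tuples. Returns the index of
    the closest cubemap whose radius covers the surface, or None."""
    best, best_dist = None, float("inf")
    for i, (origin, radius) in enumerate(cubemaps):
        dist = math.dist(surface_pos, origin)
        if dist <= radius and dist < best_dist:
            best, best_dist = i, dist
    return best

cubemaps = [((0, 0, 64), 200.0), ((500, 0, 64), 150.0)]
print(nearest_cubemap((450, 0, 64), cubemaps))  # cubemap 1 is in range
```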
This guide uses a Cornell box-like scene with extra subdivisions on the faces to illustrate the effect of vertex lighting. The green and blue materials have specular reflections enabled.
All light sources can be set to provide direct or indirect light, or both. When per-pixel direct lighting is used, the light entities can be manipulated at runtime to change the lighting.
The light_environment entity provides outdoor environmental lighting, as well as options to add ambient occlusion and a base ambient term.
The base Color, Brightness and entity rotation settings control the directional sunlight, providing both direct and indirect lighting. The direct lighting can be switched between per-pixel real-time and baked, while the indirect lighting can only be baked. When real-time lighting is used, the sun casts shadows using cascaded shadow maps, and the direct light and shadows are excluded from the baked lighting, leaving only the reflected indirect light to be baked.
The ambient sky is controlled with the settings in the Sky group. The Sky Color setting applies a uniform color, and the intensity controls its brightness. The Sky IBL Source option allows using the skybox texture set in an env_sky entity for image-based lighting, sampling the colors of the skybox for the ambient lighting instead. To use it, give the env_sky a name and supply it in the Sky IBL property. Floating point HDR images are recommended for this.
The ambient occlusion controls allow extra shadowing to be baked into highly occluded areas (indents, crevices, etc.); this is of limited use with vertex lighting unless the meshes are highly detailed. The Ambient Light option adds a constant light term to the entire level. This is physically unrealistic and can produce poor results unless used with caution.
The light_omni entity is an omnidirectional point light source. It does not provide cascaded shadow maps. Many properties are carried over from the Source engine.
The light_spot entity is a spot light source. It provides cascaded shadow maps as long as the light cone angle is set under 90 degrees. Many properties are carried over from the Source engine.
The VRAD2 tool used to bake lighting in SteamVR Home uses the photon mapping algorithm. This is a two-step process: the first step generates the photon map itself, a representation of the lighting made by ray tracing light packets from the light sources and storing the occurrences of packets hitting surfaces. This first part is cached to a file and can be reused if the geometry or lighting hasn't changed significantly between runs.
The second step bakes the lighting onto meshes. Although some of the tools suggest that lightmapping support is in the works, the current version only supports vertex lighting, meaning that lighting is baked only to each vertex of the mesh. This can cause triangular artifacting and odd gradients in areas with high-contrast lighting. Because of these artifacts, baking the direct lighting of directional light sources is not recommended.
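The two steps can be sketched with a toy photon mapper. This is not VRAD2's implementation: the geometry (a single floor plane), the photon distribution, and the gather radius are all invented for illustration, and only the emit-then-gather structure matches the description above.

```python
import math, random

# Toy two-step photon mapping sketch (not VRAD2's actual implementation).
# Step 1 traces light packets from a point light onto the floor plane
# z = 0 and records the hit points; step 2 estimates the lighting at a
# vertex by counting nearby photons.

random.seed(0)

def trace_photons(light_pos, count):
    """Step 1: shoot photons downward with a random horizontal spread
    and store where they hit the floor plane z = 0."""
    hits = []
    for _ in range(count):
        theta = random.uniform(0, 2 * math.pi)
        spread = random.uniform(0.0, 1.0)          # horizontal spread
        dx, dy = spread * math.cos(theta), spread * math.sin(theta)
        t = light_pos[2]                           # distance down to z = 0
        hits.append((light_pos[0] + dx * t, light_pos[1] + dy * t))
    return hits

def gather(vertex, photon_map, radius):
    """Step 2: density estimate at a vertex = photons within `radius`,
    normalised by the gather disc area."""
    n = sum(1 for (x, y) in photon_map
            if math.hypot(x - vertex[0], y - vertex[1]) <= radius)
    return n / (math.pi * radius * radius)

photon_map = trace_photons((0.0, 0.0, 10.0), 5000)  # the cacheable part
near = gather((0.0, 0.0), photon_map, 5.0)
far = gather((40.0, 0.0), photon_map, 5.0)
print(near > far)  # vertices under the light receive more photons
```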
The baking only directly affects static meshes. To enable dynamic objects such as physics props to be lit by baked lighting, two techniques are used.
For diffuse lighting, light probe volumes sample the ambient light inside them and store it as voxel maps of the different ambient values, saved as textures. This is similar to how ambient lighting is handled in the legacy Source engine, except that there the volumes are defined by the automatically generated visleaf volumes, whereas in Source 2 the lack of sealed maps and a BSP portal visibility system requires the volumes to be mapped out manually. Light probe volumes are defined with the env_light_probe_volume and env_combined_light_probe_volume entities, the latter also having an integrated cubemap.
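How a dynamic object might read the baked voxel map can be sketched as a simple grid lookup. The grid layout, units, and single-channel "color" values below are invented for illustration and do not reflect Source 2's actual texture format.

```python
# Hypothetical sketch of a dynamic object sampling baked ambient light
# from a probe volume's voxel grid; the layout is invented, not Source 2's
# actual format.

def sample_probe_volume(pos, mins, voxel_size, grid):
    """grid: 3D nested list of baked ambient values. The object looks up
    the voxel containing its position."""
    ix = int((pos[0] - mins[0]) / voxel_size)
    iy = int((pos[1] - mins[1]) / voxel_size)
    iz = int((pos[2] - mins[2]) / voxel_size)
    return grid[ix][iy][iz]

# A 2x2x2 volume: brighter voxels on one side (x index 1).
grid = [[[0.2, 0.2], [0.2, 0.2]],
        [[0.9, 0.9], [0.9, 0.9]]]
print(sample_probe_volume((96, 32, 32), (0, 0, 0), 64, grid))  # 0.9
```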
There are two different ways light probe volumes can sample lighting, the default option samples the light directly from geometry around it by tracing (To do: Verify). The other option is to indirectly sample the lighting using cubemaps (see below), enabled using the Calculate Diffuse Lighting Using Cubemap option in the light probe volume entity.
Ambient specular lighting is handled by environment maps (cubemaps). Cubemaps bake the environment around them into panoramic textures, which are then used for detailed reflections. A cubemap only provides a fully accurate reflection at its point of origin, so it is recommended to place them at head height.
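The core of a cubemap lookup can be sketched as mapping a reflection direction to one of six faces by its dominant axis. Actual engine sampling (per-face texture coordinates, filtering, mip selection for roughness) is far more involved; this only shows the principle.

```python
# Minimal sketch of a cubemap lookup: a reflection direction is mapped to
# one of the six cube faces by its largest-magnitude component.

def cubemap_face(direction):
    """Return which of the six cube faces a direction vector hits."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= ax and ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"

print(cubemap_face((0.2, -0.9, 0.3)))  # "-y": a mostly downward reflection
```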
Direct lighting is handled per light source, and can be either per-pixel real time lit, or baked into the ambient lighting. A mixed option exists to enable real-time lighting on dynamic objects only, while retaining baked direct lighting on static objects.
Baking time increases rapidly with increased map sizes, but there are several ways of combating this.
Photon mapping from environmental lights is performed from the edges of the map, defined by where the map geometry starts. If the map has far-away background objects, photon mapping can take a very long time. To combat this, a light_importance_volume entity can be placed. This entity defines a volume in which detailed photon mapping is performed; outside it, only simplified lighting is calculated. Note that the importance volume is a single axis-aligned bounding box, so it cannot be rotated, and multiple volumes are merged into a single one, making it less useful for maps that are not rectangular in shape.
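The merging behaviour explains why multiple volumes help little on non-rectangular maps: the merged axis-aligned box also covers all the empty space between the volumes. A small sketch with arbitrary example coordinates:

```python
import math

# Sketch of why merging importance volumes hurts non-rectangular maps:
# the merged AABB of two distant boxes also covers the space between them.
# All coordinates are arbitrary example values.

def merge_aabbs(boxes):
    """boxes: list of (mins, maxs) axis-aligned bounding boxes.
    Returns the single AABB enclosing all of them."""
    mins = tuple(min(b[0][i] for b in boxes) for i in range(3))
    maxs = tuple(max(b[1][i] for b in boxes) for i in range(3))
    return mins, maxs

def volume(box):
    return math.prod(box[1][i] - box[0][i] for i in range(3))

a = ((0, 0, 0), (100, 100, 100))        # one wing of an L-shaped map
b = ((400, 400, 0), (500, 500, 100))    # the other wing
merged = merge_aabbs([a, b])
print(volume(merged) / (volume(a) + volume(b)))  # 12.5x the combined volume
```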
|Phase|Description|
|---|---|
|Photons 1/1|The first step of the photon map generation. This can be optimized by using a light_importance_volume entity.|
|LPV-Indirect/Direct|The light probe volume voxel generation. Can be optimized by decreasing the voxel resolution in light probe volume entities.|
|Tris 1/1|The second step of the photon map generation. The time here depends on the number of lighting samples, in this case the number of vertices in the static meshes of the map. The only way to optimize this is to simplify the models and meshes used in the map, or to decrease the number of complex static objects.|