- 1 Introduction
- 2 Starting the SteamVR Home Workshop Tools
- 3 Addons, and the Content and Game Folders
- 4 Compilation, and why it’s necessary
- 5 Textures, Materials, Models, Maps – oh my!
- 6 Entities, Geometry, Entity Logic and More
- 7 No More Leaks or Brushes (why am I even telling you about these anyway?)
- 8 Lighting
- 9 Scripts
- 10 Console
- 11 Multiplayer
- 12 Performance
- 13 Documentation
Introduction

Maybe you're a complete beginner who's just started looking into this world of VR development. Or perhaps you're a seasoned Source 1 mod developer, or an expert in another engine, be it Unity, Unreal or one you've put together yourself – there might still be something useful here. This article should help all of you by explaining the basic concepts behind the SteamVR Home engine and development tools.
I expect you’re at least vaguely familiar with game rendering on computers – namely, using textures on 3D models that have simulated lighting and effects applied through some magical rendering process, resulting in final 2D images. SteamVR Home is no different – underneath, it’s a high-powered game engine, and the methods it uses for rendering 3D scenes are pretty much industry standard – it’s all just triangles, textures and shaders in the end. It’s the presentation and performance that are a little different.
Starting the SteamVR Home Workshop Tools
By default, SteamVR Home will be running whenever you start SteamVR itself. However, this will be in a special, streamlined ‘game’ mode – all the development-related sections are part of ‘tools’ mode. To get to this, click on the SteamVR menu to the top left of the small SteamVR status window, then Workshop : Create/Modify an Environment.
This will bring you to the addon selection window. If you want to leap straight into creating something, there’s the Environment tutorial series just for you.
Addons, and the Content and Game Folders
An addon is a collection of files which get placed over the underlying game – similar in concept to Source 1’s mods, but far more suited for switching between at runtime. Your files will be split into two types – source content, which lives somewhere like this:
.. and compiled game content, which lives somewhere remarkably similar, like this:
The content folder is local to your computer, and won’t be redistributed – while things inside the game folder will be packed up into special archives called .VPKs when you upload your addon to the Workshop. But why does content need to be ‘compiled’, anyway?
Compilation, and why it’s necessary
Source content is generally in formats that aren’t all that suited for redistribution or fast loading and rendering. Take, for example, a texture – the source image might be a gigantic Photoshop document with dozens of layers – which then gets compiled into an engine-specific format optimised for rendering as quickly as possible. It throws away all the data it doesn’t need (all those layers, in this case), compresses it into the form requested by the rendering system’s shaders (small programs running on the graphics card, responsible for all the pixels seen on-screen) and generally makes it suitable for loading and displaying as quickly and efficiently as possible. So, all the files in content are the user-friendly formats that you have either placed there or made in the SteamVR Home tools, while files in game are generally produced from those in content, and are all used by the engine directly.
Generally things will be recompiled automatically when the content on disk changes (so you can save a file in Photoshop and have everything update as if by magic) – there’s also the ability to force a recompile, for when things have got a little confused.
Textures, Materials, Models, Maps – oh my!
Here’s a quick breakdown of what all these files are.
- Textures – pixels, basically, and generally from image files. An image of some bricks on a wall, or a normal map (a description of precise surface direction, for imitating extra detail), or one of numerous other possibilities. To render them, however, the engine needs a bit more information about what the pixels actually mean. Textures are usually made in software like Adobe Photoshop, The GIMP or similar. Automatically recompiled whenever content updates.
- Materials – a description of how all those textures go together, so the shaders know what everything is for. So for the brick wall, the material might point at the colour map, a normal map, a gloss map (describing surface smoothness / roughness) and so on – and you might have an alternative version of the brick wall which has a different colour map but the same normal and gloss maps. The material untangles all of these, and allows simple configuration of all the shader’s inputs.
- Materials are set up in the aptly named Material Editor.
- The Material Editor makes a temporary new version whenever you change material settings, and does a proper recompilation when you save.
- Models – generally produced in a 3D modelling program (such as Maya, Modo, Blender etc.), these are 3D shapes made out of triangles. Examples would include furniture, trees, sections of buildings, even animated characters and creatures – a good deal of what you'll see in VR is based around these 3D models. The surfaces these models are made out of are assigned to particular materials, which in turn point at the appropriate textures to use. While the underlying model files will usually be built using external software, their use in the engine will be set up in the imaginatively titled Model Editor.
- Automatically recompiled when you change source content – aspects of the resulting compiled model can be defined by the materials it uses. For instance, a model with a simple, unlit material can be automatically compiled to be without vertex normals to save space – but you'll need to force a model recompile if you subsequently change the material to use proper lighting. Similarly, if you change your material to start using secondary UVs for something, you may need to recompile the model for it to render properly.
- Maps – more of a three-dimensional description of the world you are creating and everything in it – from lighting to prop placement via which sounds are placed where, and even the basic scripting linking object behaviours together. Everything you see in VR in SteamVR Home is ultimately hosted in a map – it’s the film set or theatrical stage in which everything happens.
- The SteamVR Home tools include Hammer as the map editor – which, a little confusingly, is also a capable 3D modelling program by itself. Maps can contain shapes created in Hammer – an example would be the default map for SteamVR Home, the Summit Pavilion, the architecture for which was mainly built out of geometry made inside Hammer.
- Maps are not automatically recompiled – you’ll always have to manually compile them before running them. There are various stages (such as building lighting) which you may need to do before compiling the map – these are all done separately since they may take a reasonable amount of time, and the results can be reused in many cases.
Other files include sounds, particle systems and more – these all have source versions in the content folder for your addon, and (automatically or not) get compiled into streamlined, efficient versions in the game folder for your addon.
Exceptions to the Rule
Of course, this perfect separation of content and game does break down in places – mainly with various script files which get consumed by the engine directly. These text files get placed somewhere inside the game folder – these include Lua script files and Soundscape definitions. But, beyond these exceptions, pretty much everything in game can be rebuilt from files in content.
Entities, Geometry, Entity Logic and More
An empty map in Hammer is the blank canvas for VR – this is where just about everything gets placed. And, until you add things to your map, the world will be entirely empty. But what can be put there?
- Entities – these are ‘things’ in the engine. Essentially a particular instance of a computer program, an entity often has a visible (and/or audible) representation in the world, and will react to things in particular ways. An example of an entity could be a physics prop – it tells the engine to render it with a particular model, and knows how it should react to the player throwing it around. Another example would be a pigeon, an information panel, a light source, or an invisible trigger volume which tells another entity to do something when a particular entity enters it. Even the player is an entity.
- Geometry – map geometry built in Hammer can be entirely static and unresponsive – while technically it’s still an entity somewhere behind the scenes, it is more like fixed architecture upon which everything happens. You can also tie particular sections of map geometry to specific entities – this tells the entity (such as a sliding door, or a rotating platform) to use that map geometry as its visible representation. In the case of that invisible trigger volume, a block of map geometry will define the space it encloses.
- Entity Logic – this is the basic glue sticking the behaviours of entities in a map together. A simple map might have little, if any, entity logic – a complex game might have all kinds of things connected together. Made from Inputs and Outputs (and, for modding aficionados, pretty much identical to the Source 1 system), an entity might experience a particular thing happening (such as a player putting a hand into a specified volume), which causes it to fire an OnTrigger output, which is connected to an Input on another entity telling it to do something else. Examples would include a door being told to open via its Open input, a light switching on through its TurnOn input, a sound playing via its PlaySound input, or a pigeon flying away when called by its FlyPath input. Each entity can have a list of outputs, each connected to inputs on other entities. There's an in-depth article on Inputs and Outputs to read through for more information – but do note that this is a Source 1 tutorial, so links and specifics will often wander off into topics not applicable to SteamVR Home.
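The same kind of input firing that gets authored as output connections in Hammer can also be done from script. As a rough, hedged sketch only – the function name `DoEntFire` and its signature follow the Source 2 Lua scripting used in other Valve titles, and the entity names here (`front_door`, `hall_light`) are invented for illustration:

```lua
-- Sketch: fire inputs on named entities from a trigger's script.
-- DoEntFire(targetName, inputName, parameter, delay, activator, caller)
function OnTriggered()
    -- tell the door entity to open immediately
    DoEntFire("front_door", "Open", "", 0, nil, nil)
    -- turn the hall light on half a second later
    DoEntFire("hall_light", "TurnOn", "", 0.5, nil, nil)
end
```

In practice you would wire most of this up as output connections in Hammer itself; script-side firing is useful when the logic is too complex for simple connections.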
No More Leaks or Brushes (why am I even telling you about these anyway?)
If you’re a Source 1 modder, rejoice. Maps are now arbitrary polygon soup, and have no need to be sealed against the void – and brushes are no longer a thing, having been entirely replaced by the much friendlier map geometry constructed in Hammer.
If you have no idea what wonders you are missing out on here, just be thankful and move on.
Lighting

There are many mentions of this mysterious ‘lighting’ system. Put simply, the engine uses a simulation of how light appears in real life to render objects – allowing surfaces to respond realistically to light sources, ambient lighting and so on. Objects can cast shadows on to themselves and other objects – when set up correctly, lighting can be a huge visual aspect of a map. Properly lighting worlds is a bit of an art in itself. More information on the lighting system to come!
Scripts

For simple behaviours and map logic, you’ll likely use just basic entity logic – but for more complex things (such as behaviours approaching the creation of new entities in themselves) you’ll want to use the Lua script system, a full-fledged programming language which can interact with entities and the world. There's a tutorial on building a player-interactable item if you want to know more, and there's also a wider overview of the script system.
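To give a flavour of what an entity script looks like, here is a minimal sketch. Treat the specifics as assumptions rather than gospel – `thisEntity`, `Activate`, `SetThink` and the angle accessors follow the conventions of Valve's Source 2 Lua scripting, but check the linked tutorials for the exact API:

```lua
-- Sketch of a minimal entity script: slowly spin the entity it is attached to.
-- 'thisEntity' is the handle of the entity this script is attached to.

function Activate()
    -- Called when the entity spawns; schedule a repeating think function.
    thisEntity:SetThink(SpinThink, "spin_think", 0)
end

function SpinThink()
    local angles = thisEntity:GetAngles()
    -- add a couple of degrees of yaw each tick
    thisEntity:SetAngles(angles.x, angles.y + 2, angles.z)
    return 0.02 -- run again in 0.02 seconds
end
```

A script like this would be attached to an entity in Hammer via its script properties.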
Console

Not a games console, but instead a conceptual descendant of the terminals used to control giant old mainframe computers. The console is a command line interface that relays logs and messages back to the user, and accepts many text-based commands for controlling esoteric features deep inside the engine. It takes the form of a separate program – with the tools running, you can bring it up by pressing the tilde key (~) just below the escape key to the top left of your keyboard.
By default, it may attempt to connect to the wrong port (another conceptual legacy of those mainframes) – so click the ‘+’ button to the left of the ‘Localhost’ tab and, in Port, enter 29009 and press Accept. The window should magically fill with multicoloured text. You can pretty much ignore it all for now.
To launch the console when running in game mode, you can run the .EXE from here:
Occasionally tutorials will ask you to enter console commands. Well, this is how to get to it.
Multiplayer

SteamVR Home gives you multiplayer for free – there’s no additional setup needed to let multiple users explore the same environment together. In technical parlance, the engine uses a client-server architecture, with the server running on the computer hosting that particular multiplayer room – other clients communicate with that server to find out what is happening, and to update the server with their own changes. This all happens behind the scenes, and should be mostly seamless in action.
Be aware, however, that real-world communications links are in the way – with latency and bandwidth limitations and all that those entail. Don’t build a hugely complex physically simulated masterpiece and expect it to run efficiently in multiplayer – and understand that lag can be an issue for more distant players...
Performance

Gaming computers these days are ridiculously fast. But VR is extremely heavy on system requirements – so, like an irresistible force versus an immovable object, something has to give. One of the largest computational costs is in rendering the scene – two binocular views in high definition at 90 frames per second. And, much more so than with games on a flat monitor, missed frames in VR are pretty obvious and unpleasant – causing juddering and shuddering at best. The rendering system and shaders in SteamVR Home are specifically engineered to be as efficient as possible, but it’s still possible to overload them.
The adaptive fidelity system used in SteamVR Home will dynamically adjust render size and enable or disable features to get the best possible quality from your hardware, though to prevent excessive blurriness it won’t go below a certain level. It is disabled when running in tools mode, and only enabled in game mode. (To best appreciate performance, test things in game mode – tools such as Hammer can impose their own cost on things currently running in VR.)
Here are a few potential rendering costs:
- Too much geometry! The world can be too detailed. For a photogrammetry-based scene, which is pretty much a huge unlit mesh with some giant textures on it, around 2-3 million triangles is a practical limit. Going much beyond that can start causing the framerate to drop on lesser hardware.
- Too much texture memory – again mainly a potential issue with photogrammetry-based scenes, having too many large textures can also cause precipitous performance issues on lesser hardware. Around 15 8k textures can be a sensible maximum – at roughly 40MB per compressed, mipmapped 8k texture, that’s something like 600MB of textures. Remember that things like normal maps, transparency maps and suchlike add up as well – and increase the download size of whatever you make.
- Expensive lighting – having too many dynamic light sources and shadows being cast all add to the rendering cost. While the lighting system in SteamVR Home is remarkably efficient (a graphics programmer kept urging me to add more shadowed spotlights to the Summit Pavilion environment), misunderstandings and over-expectations can cause significant issues. Some general rules are: restrict each light source to light only what it is needed for – bring the range in as far as you can – and don’t have too many of them. Dynamic lights are fairly expensive, so bake as many as you can – more information on the lighting system is coming in the near future!
- Expensive shaders – the standard shader in SteamVR Home has many options, bells and whistles. Don’t turn all of them on and expect to render everything at framerate on basic hardware – a particularly expensive thing can be having many layers of transparent geometry (alpha-tested geometry is definitely not cheap, thanks to some super-fancy anti-aliasing features).
- If you can, find someone with an entry-level GPU to test things with – an Nvidia GeForce GTX 970 or AMD Radeon RX 480 are a good baseline.
For an up-to-date view of current rendering cost, bring up the frame timing window via SteamVR : Settings : Performance : Display Frame Timing. At a very simplistic level, the GPU graph at the bottom should stay below 11ms, without any non-zero red line popping up at the bottom – if running in game mode, the adaptive fidelity system should try to maximise GPU usage, but it should rarely if ever go above 11ms.
If your CPU graph is busy leaping above 11ms, then you’ve got different problems. Here are some potential CPU costs:
- Too many entities – it’s much more likely to be some aspect of the specific entities you have in the map rather than an overall count of all entities, but if you have many more than usual of a certain thing, it may be worth investigating further.
- Expensive scripts – scripts have to finish running for that frame before rendering can continue, so if you have complex loops and lots of logic, the scripts may be too expensive.
- Complex rendering – counter-intuitively, you can have a high rendering cost even if the GPU isn’t maxed out, thanks to draw calls. If you have many separate models and/or materials, the CPU has to tell the GPU to render each one, which incurs a cost which goes up the more things you have.
- Complex physics – having too many physically simulated objects in the scene at the same time can be a significant computational cost – and in particular having a complex collision hull colliding with another can be very expensive, causing severe frame drops. More information on simplifying physics systems to come!
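The script cost mentioned above usually comes down to how much work runs every frame. A common pattern is to move expensive queries out of the per-frame path and cache their results – sketched below with heavy hedging: `Entities:FindAllByClassname` and the think-function return-value convention follow Valve's Source 2 Lua scripting and may differ here, and `prop_physics` is just an example classname:

```lua
-- Sketch: keep per-frame script work small by caching expensive queries.

local cachedProps = {}

function SlowScanThink()
    -- Expensive: scan the whole map for matching entities. Do it rarely.
    cachedProps = Entities:FindAllByClassname("prop_physics")
    return 1.0 -- re-scan only once per second
end

function FastThink()
    -- Cheap: per-frame work uses the cached list instead of re-scanning.
    for _, prop in pairs(cachedProps) do
        -- ...lightweight checks on each prop here...
    end
    return 0.0 -- run again next frame
end
```

The same principle applies to any heavy loop: if the result changes slowly, compute it slowly.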
Documentation

Finally, welcome to the Valve Developer Community wiki! This documentation is a never-ending work in progress, and is built in part by people like you – each page has a discussion section, and each page can be edited and extended as you see fit. If you have any questions, please do post them – and if you have any answers, post those too!