One of the most significant initiatives I had the pleasure of being involved in during the development of Horizon Forbidden West was automating the rock asset baking pipeline. This involved setting up a semi-automated rock texturing process, writing a Substance Python library (built on the Substance Automation Toolkit), and more.
Why was this approach significant?
- The game would feature a multitude of unique rocks, and all of them needed to be treated identically to ensure visual consistency.
- Any texturing (Substance) changes could be easily reapplied by batch-reprocessing only the Substance files.
- A single shared bake setting was used for all the rocks, ensuring visual and data consistency.
- My ‘database’ (read: a JSON file) tracked all texture resolutions, both the bake resolution and the in-game target resolution.
Database
The automated process aims to streamline the environment artist’s workflow. They only need to define a handful of parameters in a small database (a JSON file): the texture naming, which components should be packed onto a single sheet, whether a custom cage mesh is used, and the target resolution for in-game use. The tool automatically fills in the remaining parameters during the first run.
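To give a rough idea of what such an entry could look like, here is a minimal, hypothetical sketch; the field names and schema are illustrative only, not the actual production setup:

```python
import json

# Hypothetical example of one component-set entry in the JSON 'database'.
# The field names are illustrative; the production schema was different.
rock_set = {
    "set_name": "cliff_granite_a",
    "components": ["cliff_granite_a_01", "cliff_granite_a_02", "cliff_granite_a_03"],
    "use_custom_cage": True,
    "target_resolution": 2048,  # in-game texture resolution
    "bake_resolution": 4096,    # filled in automatically on the first run
}

with open("rock_sets.json", "w") as handle:
    json.dump({"sets": [rock_set]}, handle, indent=4)
```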
Context
In this context, a “set” or “component set” refers to a collection of assets that need to be combined onto a single texture sheet. Before the automated system was in place, an artist would, for instance, bake three rocks as individual assets and then manually merge all the mesh data to reduce the number of draw calls and hit the desired target texel density. Through the automated pipeline, I was able to merge all their bakes together and apply the Substance graph to the final result. The only requirement for the environment artist was to do proper UV mapping beforehand and ensure a well-organized layout.
For each rock asset, we had to bake out various data, including Normals, Ambient Occlusion, Curvature, World Space Normals, UV mask, World Space Direction, and two others. Baking all of these at 4K per asset would significantly impact the performance of Substance. To address this, I combined the bakes of all the rocks in a set into a single sheet per map type. For instance, four different rocks would each have their own AO map, and these were automatically merged into a single sheet. The combined results were exported as PNGs and fed into the Substance graph for texturing.
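Conceptually, that merge is just a composite of the per-asset bakes, which only contain data inside each asset’s UV islands. Below is a minimal sketch of the idea using Pillow, assuming the per-asset AO bakes share the same UV space and are empty (black) outside their own islands; the actual pipeline handled this through the Substance tooling rather than this exact code:

```python
from PIL import Image, ImageChops

def merge_bakes(bake_paths, output_path):
    """Composite per-asset bakes into one sheet via a per-pixel maximum.
    Assumes all bakes share the same UV layout and are black outside
    their own UV islands."""
    merged = None
    for path in bake_paths:
        bake = Image.open(path).convert("RGB")
        merged = bake if merged is None else ImageChops.lighter(merged, bake)
    merged.save(output_path)

# Example: four rocks' AO bakes combined into the sheet fed to the Substance graph.
merge_bakes(
    ["rock_01_ao.png", "rock_02_ao.png", "rock_03_ao.png", "rock_04_ao.png"],
    "rock_set_ao.png",
)
```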
With this automated setup and tracking of assets, we could easily rebake, retexture, and reexport all the rocks if needed. This wasn’t often required, as the environment team did an exceptional job of providing high-quality bakes. Still, having the option to verify that all in-game content is always up to date is a great feature to have.
Texturing
For the rocks, we utilized a shared Substance graph that takes all the baked mesh data as input. This enabled us to generate breakups and variations on a per-asset basis. For generic rock details, we employed tri-planar projections scaled to real-world size; cliffs, for instance, had a larger repeat scale than pebbles. We also incorporated other types of variation and breakups, such as occlusion and curvature in conjunction with cloud or Perlin noise. However, the process is more intricate and detailed than I can fully explain here.
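As a toy illustration of that kind of breakup logic (this is a NumPy stand-in, not the actual Substance graph), a wear-and-dirt mask could be built from the baked data and a noise map roughly like this:

```python
import numpy as np

def breakup_mask(ao, curvature, noise, contrast=2.0):
    """Toy breakup mask from baked AO, curvature and a cloud/Perlin noise map.
    All inputs are float arrays in [0, 1] with the same shape."""
    cavities = 1.0 - ao                             # occluded areas collect dirt
    edges = np.clip((curvature - 0.5) * 2.0, 0, 1)  # convex edges receive wear
    mask = (0.6 * cavities + 0.4 * edges) * noise   # noise breaks up the result
    return np.clip(mask * contrast, 0.0, 1.0)
```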
Most of the assets’ uniqueness stems from the World-Data that is hand-painted by the artists and designers or generated offline, combined with shader features such as Medium and High Frequency detail maps, which are in turn driven by parameters like the asset’s location in the world.
CLI
I developed a command-line interface (CLI) in Python that lets artists simply drag and drop their low-poly mesh onto a batch file. The file then offers a couple of options (a minimal sketch of the entry point follows the list):
- Transferring high-poly to low-poly data for single assets
- Transferring high-poly to low-poly data for all assets within the set
- Only reprocessing the texturing pipeline
- Exporting the assets into the game
- Performing all of the above options
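Here is what such an entry point could look like; the step functions are placeholders, and the actual production tool was more involved:

```python
import argparse

# Placeholder step functions; the real ones call into the Substance library.
def bake_single(mesh):
    print(f"Baking {mesh} as a single asset...")

def bake_set(mesh):
    print(f"Baking the whole set containing {mesh}...")

def retexture(mesh):
    print(f"Reprocessing the texturing pipeline for {mesh}...")

def export_to_game(mesh):
    print(f"Exporting {mesh} into the game...")

def run_all(mesh):
    bake_set(mesh)
    retexture(mesh)
    export_to_game(mesh)

def main():
    parser = argparse.ArgumentParser(description="Rock asset pipeline")
    parser.add_argument("mesh", help="low-poly mesh dropped onto the batch file")
    args = parser.parse_args()

    options = {
        "1": ("Transfer high-poly to low-poly data (single asset)", bake_single),
        "2": ("Transfer high-poly to low-poly data (whole set)", bake_set),
        "3": ("Reprocess the texturing pipeline only", retexture),
        "4": ("Export the assets into the game", export_to_game),
        "5": ("All of the above", run_all),
    }
    for key, (label, _) in options.items():
        print(f"[{key}] {label}")
    choice = input("Choose an option: ").strip()
    options.get(choice, ("", lambda _mesh: print("Unknown option")))[1](args.mesh)

if __name__ == "__main__":
    main()
```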
Problems and solutions
- Transferring detailed information from high-poly to low-poly models using the Substance API was a bit of a challenge. Certain assets were too complex and caused memory issues because of the way I had set up the queue. After testing and iterating, I re-examined my threading setup. Upon further exploration of the Substance API, I realized I could load the meshes once and bake out all maps in a single call, rather than baking each map individually. This eliminated the need to load the high- and low-poly meshes into memory for every thread(!); see the sketch after this list. Looking back, it seems obvious, but it was all part of the learning process, adaptation, and continuous improvement as a technical artist.
- Writing the internal Substance Python functions was also a challenge. Learning a new API while writing a tool that needs to be somewhat production-ready is not easy, but it’s a valuable experience! You get direct input from your end users, and being able to troubleshoot with your team at someone’s desk is a great help. Beyond the rock pipeline, the library was also used to batch-process and texture other assets, such as the face textures in Horizon Forbidden West.
- None of the pipelines were finalized yet. The Substance graph for the rocks was still in its early stages of development, but that’s part of the iterative process and trusting the workflow we envisioned.
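In before-and-after terms, the threading fix from the first point looked roughly like this; `bake_one_map` and `bake_all_maps` are hypothetical stand-ins for the actual Substance Automation Toolkit baker wrappers, which I'm not reproducing here:

```python
from concurrent.futures import ThreadPoolExecutor

MAPS = ["normal", "ambient_occlusion", "curvature",
        "world_space_normal", "uv_mask", "world_space_direction"]

def bake_per_map(high_poly, low_poly, bake_one_map):
    """Before: one baker call per map, so every worker reloaded the
    high- and low-poly meshes just to produce a single output."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        for map_name in MAPS:
            pool.submit(bake_one_map, high_poly, low_poly, map_name)

def bake_in_one_pass(high_poly, low_poly, bake_all_maps):
    """After: load the meshes once and request every map in a single call."""
    bake_all_maps(high_poly, low_poly, MAPS)
```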
Next Steps
I began developing a graphical user interface (GUI) around the tool to manage rock bakes and provide artists with the flexibility to set custom parameters. However, this endeavor required a solid foundation of code and a well-defined roadmap outlining the necessary features, based on the feedback I had collected from the team.
Scaling up the process so that all assets are baked with the same settings would streamline our workflow and ensure visual consistency. Currently, we manually send out bake settings to our partners, which can lead to errors; this tool would eliminate those potential errors and make the process more efficient.
A special thanks to Chris Thompson for proofreading and to Guerrilla for allowing me to publish and share this information.