This postmortem covers my approach to quickly converting a large number of assets. Picture the scenario: when transitioning from the first game, Horizon Zero Dawn, to its sequel, Horizon Forbidden West, we had to update the game’s asset art (props and environment-art models) to reflect the latest technological advancements and to comply with the new technical specifications. We also had to replace the assigned shaders with new ones written specifically for Horizon Forbidden West. Because the sequel would reuse existing assets alongside new ones, the scripts and pipelines built for this work remained reusable. Given the sheer number of assets we wanted on screen, we prioritized optimizing their setup to stay within the memory budgets while maintaining the visual fidelity of the first game, all while supporting an approximately 7-year-old platform (the PS4) and the (at the time) newly released PS5.
Step-by-step
- Asset Management: Collected, tagged, and tracked over 5000 assets in a local database, highlighting the importance of testing and validating changes.
- Texture Optimization: Optimized textures by discarding unnecessary maps, converting PSDs to PNGs, and ensuring consistency in materials like rock and stone.
- Conversion Process: Utilized the Substance Automation Toolkit API and a shared Python library for texture conversion, ensuring PBR compliance and compatibility with the engine.
- Maya: Ran Maya in a batch process, feeding each scene the required information, updating shaders, and relinking textures.
- Export: Exported assets to the engine and linked to a test level for evaluation purposes.
- Evaluation: Evaluated GPU performance, export/conversion issues, and internal tools/settings in-game.
I initiated the process by collecting all assets that required updating through a script. This step allowed me to assess the scope of the undertaking. I tagged and tracked each asset in my local database (a JSON file) with the appropriate process, the Maya file associated with that asset, all the textures linked to the asset, and some other data that I can’t quite remember right now.
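To give a rough idea of what that tracking looked like, here is a minimal sketch of such a JSON-backed database. The field and function names are illustrative, not the actual internal structure.

```python
# Minimal sketch of a local JSON tracking database. Field and function
# names are illustrative; the real entries carried more data than this.
import json
from pathlib import Path

DB_PATH = Path("asset_tracking.json")

def load_db():
    # Start from an empty database the first time around.
    return json.loads(DB_PATH.read_text()) if DB_PATH.exists() else {}

def tag_asset(db, asset_id, maya_file, textures, process="convert_shaders"):
    db[asset_id] = {
        "process": process,      # which conversion path the asset goes through
        "maya_file": maya_file,  # the Maya scene associated with the asset
        "textures": textures,    # every texture linked to the asset
    }

def save_db(db):
    DB_PATH.write_text(json.dumps(db, indent=2, sort_keys=True))
```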
Side note:
However, it’s important to mention that this approach wasn’t entirely foolproof, nor a one-size-fits-all solution. With over 5,000 files touched, all of which were linked to levels, sets, cinematics, or prefabs, the process became even more fragile and complicated, particularly during production when everyone was striving to finish the game. Coordinating the initiative was a challenge in itself, which underscored how important it is to allocate sufficient time for testing and validating the changes. To keep this manageable, it’s crucial to split the changes into smaller check-ins.
Let’s continue!
While updating the files, I took the liberty of optimizing content wherever possible. I discarded any Specular maps that weren’t necessary for dielectrics, along with any other redundant maps. Determining whether a dedicated Specular channel was needed could be challenging, especially since not everything had been authored with PBR techniques, so relying solely on Python libraries and image processing wasn’t always sufficient. Additionally, I converted all PSDs to PNGs, which significantly reduced Perforce sync times, image-processing times, and DDS export times.
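As an illustration of the kind of image checks involved, here is a minimal sketch using Pillow: a crude “is this map effectively empty?” heuristic and a PSD-to-PNG conversion of the flattened composite. The threshold is an assumption, and as noted above, checks like these were not always enough on their own.

```python
# Rough sketch with Pillow: detect near-constant maps that can be discarded
# and convert a PSD's flattened composite to PNG. The threshold is illustrative.
from pathlib import Path
from PIL import Image

def is_flat_map(path, tolerance=2):
    # A grayscale map whose min and max barely differ carries no useful data.
    lo, hi = Image.open(path).convert("L").getextrema()
    return (hi - lo) <= tolerance

def psd_to_png(psd_path):
    # Pillow reads the merged composite of a PSD; save it next to the source.
    out = Path(psd_path).with_suffix(".png")
    Image.open(psd_path).convert("RGBA").save(out)
    return out
```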
In addition to texture optimization, we also had to identify assets that required a recoloring treatment. For instance, if an asset contained rock or stone materials, we wanted to make sure it matched the other rocks visually and color-wise by applying the same coloring treatment. In most cases, I could quickly generate a mask for such assets through a manual process: extract it from the PSD layers, bake out the UV layout and use that, or simply paint a mask in Photoshop and have it linked during the Substance file generation process.
Conversion
To begin the texture conversion process, I utilized the Substance Automation Toolkit API. Certain aspects of this process were defined in a shared Python library that I had written for it; this library is explained in more detail in another blog post titled “Texturing for Rocks.” The shared Substance graph it referenced would clamp the Color values to be more PBR compliant, clamp the Roughness range to match the engine’s shading model, and run the AO through a Curve node to compensate for manually authored AO that was too dark.
Why the Substance Automation Toolkit? Creating a new file using the API is a straightforward process. It’s easy to replicate, and we can open the Substance file in Designer and export it if necessary. Batch exporting is also an option if any changes are made to the shared Substance graph.
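As a rough illustration of what generating such a file with the pysbs API can look like, here is a minimal sketch. The shared graph path, identifiers, and keyword arguments are assumptions for illustration, not the internal setup.

```python
# Minimal sketch using the Substance Automation Toolkit Python API (pysbs).
# Paths, identifiers, and parameters are illustrative, not the internal setup.
from pysbs import context, sbsgenerator

SHARED_GRAPH = "shared/texture_cleanup.sbs/texture_cleanup"  # hypothetical

def build_conversion_sbs(source_texture, out_sbs_path):
    ctx = context.Context()
    doc = sbsgenerator.createSBSDocument(ctx,
                                         aFileAbsPath=out_sbs_path,
                                         aGraphIdentifier="asset_conversion")
    graph = doc.getSBSGraph("asset_conversion")

    # Source bitmap -> instance of the shared clean-up graph -> output.
    bitmap = graph.createBitmapNode(doc, aResourcePath=source_texture)
    shared = graph.createCompInstanceNodeFromPath(doc, aPath=SHARED_GRAPH)
    output = graph.createOutputNode(aIdentifier="basecolor")

    graph.connectNodes(aLeftNode=bitmap, aRightNode=shared)
    graph.connectNodes(aLeftNode=shared, aRightNode=output)

    doc.writeDoc()  # the file can now be opened in Designer or batch-exported
    return out_sbs_path
```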
Maya
The next step was to update the textures linked in the engine and re-export the DDS files. This was accomplished using the internally created library, which kept this step relatively fast.
Initially, I attempted to use a headless Maya version, but this did not work because the viewport renderer never initialized the shaders and the shader-defined information required for evaluation and updating. As a slower alternative, I ran regular Maya in a batch process, feeding it a list of information per scene. On load, I kept the old shader variables in memory, swapped in the new shader, and then updated its variables, compensating for any differences such as renamed variables or changed ratios (tiling for detail maps, for example). Finally, I relinked the textures to the shader, ensuring they were correctly associated with the mesh.
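For a sense of what the shader swap and texture relinking look like through Maya’s Python API, here is a simplified sketch. The shader type, attribute names, and texture mapping are placeholders; the real shaders and attributes were engine-specific.

```python
# Simplified sketch of swapping a shader and relinking textures in Maya.
# Shader types, attribute names, and the texture mapping are placeholders.
import maya.cmds as cmds

def swap_shader(old_shader, new_shader_type="lambert"):
    # Keep the old values in memory before replacing the shader.
    old_color = cmds.getAttr(old_shader + ".color")[0]

    # Build the replacement shader and its shading group.
    new_shader = cmds.shadingNode(new_shader_type, asShader=True)
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name=new_shader + "SG")
    cmds.connectAttr(new_shader + ".outColor", sg + ".surfaceShader", force=True)

    # Carry old values across; compensate for renamed attributes or ratios here.
    cmds.setAttr(new_shader + ".color", *old_color, type="double3")

    # Move every member of the old shading group over to the new one.
    old_sg = (cmds.listConnections(old_shader, type="shadingEngine") or [None])[0]
    members = cmds.sets(old_sg, query=True) if old_sg else []
    if members:
        cmds.sets(members, edit=True, forceElement=sg)
    return new_shader

def relink_textures(shader, texture_map):
    # texture_map: {shader attribute: new texture path} (illustrative).
    for attr, path in texture_map.items():
        file_node = cmds.shadingNode("file", asTexture=True)
        cmds.setAttr(file_node + ".fileTextureName", path, type="string")
        cmds.connectAttr(file_node + ".outColor", shader + "." + attr, force=True)
```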
From there, the assets were exported to the engine. Unfortunately, I cannot provide further details due to the proprietary nature of the engine.
Once the assets were exported, I was able to link them to my test level, enabling me to evaluate GPU performance in-game and assess any issues that may have arisen during export or conversion. Before checking in, I ran a quick test pass to ensure everything worked as expected, using our internal tools to double-check that none of the previously set settings or draw distances had broken. The further into development you are, the more vital this becomes.
Conclusion
The order in which I presented the steps is also the order in which the process ran in Python. Evaluating images using PIL was the quickest step, while Maya posed the biggest challenge: re-exporting assets from Maya was the slowest part and the most prone to crashes, and multiprocessing consumed so much memory that I experienced several blue screens. It was a valuable learning experience, as I learned how to automate certain steps and how to refine the Python code so it works across different scenarios rather than as a one-time solution. However, it became apparent that fully automating this process was too risky, since many odd cases and asset setups had to be validated by eye.
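For completeness, here is a small sketch of one way to keep an image-evaluation pool from ballooning in memory: cap the worker count and recycle workers with maxtasksperchild. The numbers and the evaluation itself are illustrative.

```python
# Sketch of memory-bounded multiprocessing for a PIL evaluation step.
# Worker counts, chunk size, and the check itself are illustrative.
import glob
from multiprocessing import Pool
from PIL import Image, ImageStat

def evaluate(path):
    # Example check: mean luminance of the texture.
    mean = ImageStat.Stat(Image.open(path).convert("L")).mean[0]
    return path, mean

if __name__ == "__main__":
    paths = sorted(glob.glob("textures/*.png"))
    # Few workers, recycled regularly, so image buffers never pile up.
    with Pool(processes=4, maxtasksperchild=16) as pool:
        for path, mean in pool.imap_unordered(evaluate, paths, chunksize=8):
            print(path, round(mean, 2))
```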
The ideas and iterations that emerged from these processes served, and continue to serve, as the foundation for more automated and/or procedurally driven processes.
A special thanks to Chris Thompson for proofreading and to Guerrilla for allowing me to publish and share this information.