
Outsource Asset Validator for UE

  • Writer: Jesse Olchawa
  • Jan 4
  • 13 min read

Updated: Mar 15


Introduction:

In this project I will explore creating a tool to automate the importing of assets, texture creation and material setup pipelines for 3D assets in Unreal Engine. The tool is intended for outsource asset management, as creating bespoke textures for repetitive asset surfaces such as wood, metal or plastic costs time that could be spent working on bespoke assets. By cutting down on unique repetitive textures, this tool can leverage Unreal Engine's powerful material shaders to make variations inside the engine, ready for level designers to work with directly.

 

Week 1:

Project Outline:

 

This project is heavily inspired by my freelancing work, as there were often many assets that simply needed smart masks and a light coat of paint applied after baking to make them match the scene. Modular background assets tend to eat up production time, so while searching for solutions I stumbled onto Marcus White's talk from 2019 about how to “Streamline Asset Workflows” in UE using Python (Epic Games, 2019). He outlines using the Substance baker API to bake out mask maps that are then fed into shaders in engine after a successful import and setup.


I knew this was the next challenge I wanted to work on to finesse my Python skills, so I got to work outlining my goals:

  • Creating a python tool that integrates multiple APIs for baking, GUI and engine in one script.

  • Can take an asset, validate it and import it into UE.

  • Can bake and channel pack maps for each mesh.

  • Follows industry naming conventions and fixes issues.

  • Has written, easy to follow documentation.
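To give a flavour of the naming-convention goal, here is a minimal sketch of a name fixer. The "SM_" prefix and PascalCase rule are my assumptions of a typical studio convention, not the exact rules the tool enforces:

```python
import re

# Hypothetical convention: static meshes use an "SM_" prefix and a
# PascalCase name with no spaces or symbols, e.g. "SM_WoodenCrate".
VALID_NAME = re.compile(r"^SM_[A-Za-z0-9]+$")

def fix_mesh_name(raw_name: str) -> str:
    """Return a convention-compliant asset name from an arbitrary file name."""
    # Normalise separators, then drop any remaining illegal characters.
    cleaned = re.sub(r"[\s\-]+", "_", raw_name.strip())
    cleaned = re.sub(r"[^A-Za-z0-9_]", "", cleaned)
    # Strip an existing prefix so it isn't doubled, then PascalCase each word.
    cleaned = re.sub(r"^(sm|SM)_", "", cleaned)
    parts = [p[:1].upper() + p[1:] for p in cleaned.split("_") if p]
    return "SM_" + "".join(parts)
```

A validator like this can report what it changed, so the artist still sees which incoming files broke the convention.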


Stretch goals:

  • Create thumbnails of assets and send them to Trello boards for outsource management.


Planning:


I’m still teaching at the moment, so knowing my busy schedule and the oncoming marking season I have blocked this project out into a tight 4-week plan. Furthermore, the Trello integration during the final week can be abandoned should the project run overtime. I do anticipate some issues during the baking week, as I have never used Substance's baking API before, and I hope it is still supported given the video is 7 years old. I may need to look for another baking tool to create curvature maps should that occur. Now let’s dig into the assets I will be testing this on!


Assets Chosen:


I have chosen a variety of free-to-use assets from FAB to experiment with. For full links to them please head to the bottom of the page, as they are all cited. I wanted this project to differ from Marcus White's usage on his stylised game by instead tackling a tool and shaders for fully realistic PBR props. Unfortunately, some of the assets are made a bit oddly, such as the barrel with broken scale and overlapping UVs, and the fantasy weapons being the size of a nail (I got rid of these).



So, I have narrowed it down to the wooden crate and the fixed barrel for testing and debugging, as they provide a good variety of wood and metal IDs to test for import. I also did some research into how I wanted my shader values to look later, as they would need gradients and noises to work.



I did encounter some odd issues on the barrel due to the hatch having its own map, meaning UVs were overlapped, so I deleted that part of the mesh. I needed to conserve the original normal map so that I could bake out curvature and the rest of my thickness maps from it in Painter.


Channel Packing and Noises:


To build the shaders I needed tiling noises and the baked mask texture the script would create. So I used the existing normals to bake out manual exports to work with. For my seamless noises for masking and basecolor value variation I used Designer to grab some good textures that would work. As I will be making only a wood and a metal shader, I chose to pack maps for both into one texture to save on cost. Here’s what I chose:



For the baked mesh maps, however, I went into Painter and added some generators after rebaking all the maps from the normal included from Fab. I created a smart material stack so I could bake the same type of maps out for packing on both the barrel and the crate. For this texture the focus is to grab interesting masks to use later to create basecolor and roughness. I used ambient occlusion as standard for the red channel and then curvature for green, as curvature gives great outlines and edge damage. For a bigger impact of damage and dirt accumulation I used the blue channel to store a dirt edge generator. That left the alpha channel, so there I added a ground dirt-based generator instead. I knew from my previous project I could create masks from the ground using world position, but figured a texture would be a way to offload those calculations to a greyscale map. After packing I moved into engine.
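The channel layout above (AO in red, curvature in green, dirt edges in blue, ground dirt in alpha) can be packed with Pillow, which the tool later uses for its automated packing. A minimal sketch, with hypothetical file paths and the channel order from this paragraph assumed:

```python
from PIL import Image

def pack_mask_maps(ao_path, curv_path, edge_path, ground_path, out_path):
    """Pack four greyscale masks into one RGBA texture:
    R = ambient occlusion, G = curvature, B = dirt edges, A = ground dirt."""
    channels = []
    for path in (ao_path, curv_path, edge_path, ground_path):
        channels.append(Image.open(path).convert("L"))  # force single channel
    packed = Image.merge("RGBA", channels)
    packed.save(out_path)
    return packed
```

Packing this way means one texture sample in the shader serves four masks, which is where the cost saving comes from.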

 

Creating Smart Mats in UE:

After getting all the maps I needed, I imported them into UE and set them up in two materials. I did consider having a single master material with tons of static switch toggles to create woods or metals, but I did not like the idea of so many samples and instructions packed into one graph. It would be a nightmare to troubleshoot if something became disconnected, so I instead opted for separate base materials with similar parameters for a consistent user experience.



As assets need to be created with the same noise and bake layout in RGBA, there are similarities I can paste across for these variable names as well. For the wood graph I used the normal's green channel to create initial grain that is then mixed in with the green channel of the streak noise. I had to create my own variation of the World Aligned Texture function to be able to rotate the UVs of the tileables, as not all meshes have normals that are straightened. This noise creates a better gradient that blends with the original dents and grain from the normals.

Whilst this makes one tone of wood, for a more interesting basecolor I used the dirt edges to create a darker buildup of value, and curvature for lighter wooden flecks on chipped edges. I had to flatten the normals to make the basecolor look less noisy. For roughness I desaturated the basecolor, which regularly makes a dark grey map, so after flipping it with a one-minus the roughness is light enough for a dry wood appearance. As I had baked a dirt gradient from the ground, I toggled this and my other masks to make it possible to create variations such as painted, stained or dirtied wood effects.



For my metal shader the graph began in the roughness area, as I used flat scratchy noise to create interesting tiling roughness values. For colour I used a flat value with a toggleable rust mask that uses many of the baked channels and the packed noise. Rust is first built up from the dirt edges, which appear around the rings and corners of the barrel. To fill in more rust I used the packed noise to tile the effect, with adjustable intensity values making it possible to create variations of either scuffed or heavily damaged rusty metals. Compared to my wood material I also added edge detection to build up rust on surfaces that make contact with the ground. This is toggleable, however, as it uses distance fields and I did not want to push the cost onto all metal meshes if unnecessary.

 

Weeks 2-3:

Creating UI with Tkinter:

I decided to move designing the UI a bit earlier, as I wanted working buttons and boxes to interact with while I was creating my functions. When designing my UI I focused on making it easy for the user to visualise what files are at what stage. I iterated on this concept in Photoshop, building around a classic file-lister-looking interface with large expansive columns.



Similar to my previous projects, I used the module Tkinter to create my UI. However, with the more complicated needs of this tool, such as displaying meshes to run through in a list, I used a lot of frames. These helped section labels, listboxes and buttons into more orderly groups to match my UI design. Another stretch goal I came up with was the option to toggle between light and dark modes. After some research I uncovered that Tkinter supports themes, similar to how websites use stylesheets.


This globally changes the look of buttons, text and icons, so I downloaded a dark mode theme called awdark by bll123 (tcl-awthemes). To implement the switch, the button triggers a short snippet of code that forces the UI to swap over. As for my icons, I had to toggle a switch to use the dark/light mode versions of the Unreal logo, as otherwise the white logo would disappear in light mode.
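A theme switch of this kind might look like the sketch below. The pairing of theme names with icon suffixes is my assumption (tcl-awthemes ships both awdark and awlight), and the `Style` object is expected to come from a Tkinter app that has already sourced the awthemes package into its Tcl interpreter:

```python
# Hypothetical pairing of ttk theme names with matching icon suffixes;
# awdark and awlight both ship with bll123's tcl-awthemes package.
THEMES = {"dark": ("awdark", "_dark"), "light": ("awlight", "_light")}

def other_mode(mode: str) -> str:
    """Return the mode to switch to when the toggle button is pressed."""
    return "light" if mode == "dark" else "dark"

def apply_theme(style, mode: str) -> str:
    """Swap the ttk theme globally and return the icon suffix to load.

    `style` is a tkinter.ttk.Style whose Tcl interpreter has already
    sourced the awthemes package (e.g. via root.tk.call("source", ...))."""
    theme_name, icon_suffix = THEMES[mode]
    style.theme_use(theme_name)  # restyles every ttk button, label and frame
    return icon_suffix
```

The returned suffix is what lets the icon set follow the theme, so the white Unreal logo swaps out in light mode.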


Storing Data and File paths:


Initially, the way this script worked was that you had to select the Unreal Engine project location and manually route in the location of Unreal's .exe to run the portions of the script relating to importing. However, this got very troublesome to do on every refresh, and I realised that a lot of the data I needed to adjust as my tool grew to expose more behaviours and sliders would be impossible to transfer across. Either my script would have to run in different modes and duplicate itself to contain all this information, or it would be lost on closure or when running something else.


So, I decided to create a structure of CSVs and config files to help maintain information across my tool. My most used CSV is called mesh_data.csv and contains the names of the meshes being processed and their original folder location. Now when I go to engine or bake in Blender, I can always call for the correct files. Later on, I expanded to another CSV for materials called mat_data. This one is not generated automatically and requires the user to paste in the file locations of their Unreal Engine references: mesh folder, texture folder, material and so on. I did consider making a complicated algorithm to seek out folders based on their names, but realised it would cause more issues with projects that have specific naming conventions or folder layouts. This file would ideally be created at the start of a project and set up as noted in my documentation, ensuring the tool works as intended with any project for all artists.
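The mesh_data.csv round trip can be sketched in a few lines with Python's csv module. The column names and file location here are my assumptions, not the tool's exact schema:

```python
import csv
from pathlib import Path

# Hypothetical location next to the tool; the real path would come from config.
MESH_DATA = Path("mesh_data.csv")

def save_mesh_data(rows):
    """Write (mesh_name, source_folder) pairs so later stages can find files."""
    with MESH_DATA.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["mesh_name", "source_folder"])  # header row
        writer.writerows(rows)

def load_mesh_data():
    """Read the pairs back as dicts, e.g. for the Blender baker to iterate."""
    with MESH_DATA.open(newline="") as f:
        return list(csv.DictReader(f))
```

Because every stage reads the same file, the baker and the importer never have to be told the mesh list twice.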


Finding a Good Baker:


Originally, I had planned to use the Substance Automation Toolkit to bake out maps from a normal map setup. I even created a smart material that uses generators and packs them into RGB channels, meaning the textures would be ready without extra packing needed. However, I soon discovered I do not have a license for that package at home and that it's quite pricey, so I opted to look for free software that could bake. I then tried xNormal, which has a limited Python API but could still make all the maps I needed, provided I was baking from a high poly rather than a normal texture. So, I sculpted some high polys and went back to testing, but the program was insanely slow due to its age and inability to utilise multiple cores to break up tasks. At higher resolutions it took a minute to get each map. It also didn't have generators, only the raw curvature, position etc. maps, so I would need to do more wizardry to overlay them.



So, I ditched it and moved onto Blender instead. Finally, I had a good Python API that supported baking and allowed me to create shader graphs via scripting, so the script could procedurally load in what I needed to bake out. I did need to tweak the results quite a bit to resemble my original Painter graphs; however, after some overlaying with noises from Designer I had all my textures ready to pack. I used PIL in Python to pack the maps and correctly rename everything as it should be.


Adjusting Baking Settings with Config Files:


I knew I needed to transfer data back and forth between the main master tool script and the baking script, so I created a text file to store a config of all the parameters set from the UI sliders. The core information I wanted to save was the file format, the texture size and the settings for baking from the high poly. The most troublesome settings were the cage extrusion and the position of the dirt gradient, as some meshes were quite small so there was little gradient created.


By writing to the config file whenever values were changed via traced callbacks, the file always had the most recent data available. It can also regenerate itself with default parameters if deleted. I realised I could also use this file to store more paths that change per system, such as where the executable for Unreal Engine or Blender is. This meant that on loading up, the tool would now be ready to go without having to manually select file paths.
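A self-healing config of this kind can be sketched with Python's configparser. The section names, keys and default values below are hypothetical stand-ins for the sliders described above:

```python
import configparser
from pathlib import Path

# Hypothetical defaults mirroring the UI sliders and per-system paths.
DEFAULTS = {
    "bake": {
        "file_format": "PNG",
        "texture_size": "2048",
        "cage_extrusion": "0.05",
        "dirt_gradient_height": "0.25",
    },
    "paths": {
        "blender_exe": "",
        "unreal_exe": "",
    },
}

def load_config(path="bake_config.ini"):
    """Load the shared config, regenerating it with defaults if missing."""
    parser = configparser.ConfigParser()
    cfg_path = Path(path)
    if not cfg_path.exists():
        parser.read_dict(DEFAULTS)
        with cfg_path.open("w") as f:
            parser.write(f)  # self-heal: rewrite a deleted config from defaults
    else:
        parser.read(cfg_path)
    return parser
```

Both the UI script and the headless Blender script can call `load_config` on the same file, which is what keeps the two sides in sync.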


Importing Assets into Unreal Engine:


For importing the meshes into engine I utilised the same subprocess module as I did to call Blender in a headless state without UI. Unfortunately, a dialog window still spawns, but it disappears after the assets import. Using my CSV with the names of assets to iterate through, the script looks for the mesh file, imports it and checks what materials are assigned to it. If a material matches one in the material data CSV file, it looks for it in the specified folder to create a material instance. This is critical, as I cannot manipulate the textures in the shader unless they are on a material instance. After the meshes are done importing and assigning materials, the textures are brought in and have sRGB disabled. By forcing the channel-packed map to render as linear colour I remove the sRGB gamma compression that would otherwise create weird effects in the shader. The final bit of the script deletes the empty materials created on import with default names, to clean up before getting the next mesh. The script also checks if assets already exist, for example a material instance with the same name, so it can preserve it and, instead of creating a new duplicate, reuse it and set the textures on that.
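Driving both DCCs headlessly comes down to building the right command lines for subprocess. A minimal sketch, assuming Blender's documented `--background`/`--python` flags and Unreal's `-ExecutePythonScript` commandlet argument (executable names and script paths are placeholders):

```python
import subprocess

def blender_bake_cmd(blender_exe, bake_script, mesh_csv):
    """Build the argv list to run the bake script in headless Blender.
    Everything after "--" is passed through to the script's own sys.argv."""
    return [blender_exe, "--background", "--python", bake_script,
            "--", mesh_csv]

def unreal_import_cmd(unreal_exe, uproject, import_script):
    """Build the argv list to run the importer inside the Unreal editor."""
    return [unreal_exe, uproject,
            f"-ExecutePythonScript={import_script}"]

# subprocess.run(blender_bake_cmd(...), check=True) would then block until
# Blender finishes baking before the tool moves on to the import stage.
```

Using `check=True` means a failed bake raises immediately rather than letting the importer run on missing textures.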


Generating Thumbnails:


One of my greatest challenges by far, apart from resolving my baking fiasco, was creating thumbnails for my assets in batch. I wanted to generate thumbnails so that I could send them to a Trello board, similar to how outsourced assets need to be tracked from approved states to completion. I originally wrote, at the end of the import script, a step to load a specified render level and grab a camera actor and mesh base tagged with “thumb_mesh” and “thumb_camera”. Then I would trigger the high-res screenshot function and take a picture from that camera. This worked well when testing inside Unreal Engine; however, things began to break when running it without UI. The script would only take a screenshot of the last imported mesh. After some investigation I uncovered it was to do with how high-res screenshot works: it essentially queues up the screenshot action for the frame and waits to do it. With the script simply continuing on without it, there is only time to screenshot the final asset. I tried to force it to tick and process quicker; however, that had unpredictable results. Despite my initial plans to create a self-contained tool, I decided to leave the screenshotting functionality to a widget in the engine.



Fortunately, this did open up more avenues for artists to take better screenshots. My initial math to set up the camera was to get the height of the object and move the camera back by 1.3x that height to capture it fully in frame, or, if too small, bunch up close directly in front of it. However, for some assets this caused issues, such as the wrench that was orientated to face upwards being detected as a bit too tall and cropped out of the frame entirely. By having the widget able to possess a custom camera, the artist can frame each asset, adjust anything they need to on the shader and then submit a screenshot in a few easy clicks. Having the engine open also forced ticks to occur, which bypassed the weird Python UI screenshot bug. Here's what my graph for the widget looks like; it's quite simple and calls the script mentioned. It also imports the CSV file to correctly assign the replacer mesh a static mesh from the menu.
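The original framing math can be sketched as a tiny helper. The 1.3 factor comes from the paragraph above; the minimum-distance clamp for tiny props is my assumption, and as the wrench showed, height alone is not a robust framing metric:

```python
def camera_distance(bounds_height: float, factor: float = 1.3,
                    min_distance: float = 50.0) -> float:
    """Distance to pull the thumbnail camera back from the mesh.

    Scales with the mesh's bounding-box height so tall props stay in
    frame; clamps to a minimum so tiny props still fill the shot."""
    return max(bounds_height * factor, min_distance)
```

A sturdier version would frame against the full bounding box (or its diagonal) rather than height, which is exactly the kind of per-asset judgment the artist-driven widget sidesteps.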


Integrating with Trello:

Now that screenshots were being created, all I had left was to send them to Trello. Fortunately, there is a good API for communicating with Trello directly; however, it requires specific keys to work. This forces the script to log in as you with your token key and post/manipulate boards with that account. For my debugging I just used my personal account; however, for security's sake I recommend making a separate account that could be disabled should these keys/tokens leak. They are bespoke and tied to one account, after all. Back to the script: to enable different users posting their screenshots up with their name to the same board, I used a Trello_config.txt file to store these keys. I can't show you my tokens; however, here's how the script looks and works in action. It checks the folder for all processed meshes mentioned in the CSV and tries to find their PNG thumbnails. If it succeeds, a new card is created on Trello with the date of upload, the name of the mesh and the thumbnail image.
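The card creation can be sketched against Trello's REST API (`POST /1/cards`, with attachments posted to the card afterwards). The parameter names below are Trello's documented query parameters; the card title format and the commented requests calls are my assumptions of how the upload step could look:

```python
from datetime import date

TRELLO_CARDS_URL = "https://api.trello.com/1/cards"

def build_card_params(api_key, token, list_id, mesh_name):
    """Query parameters for Trello's POST /1/cards endpoint.
    Keys and token would be read from Trello_config.txt per user."""
    return {
        "key": api_key,
        "token": token,
        "idList": list_id,
        "name": f"{mesh_name} - {date.today().isoformat()}",
    }

# Posting the card and attaching the thumbnail might then look like:
# import requests
# card = requests.post(TRELLO_CARDS_URL,
#                      params=build_card_params(key, token, list_id, name)).json()
# with open(thumb_path, "rb") as f:
#     requests.post(f"https://api.trello.com/1/cards/{card['id']}/attachments",
#                   params={"key": key, "token": token},
#                   files={"file": f})
```

Keeping the key/token out of the script and in a per-user config file is what lets each artist post to the shared board under their own account.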


You can download all my Python scripts here:

Screenshot Script – https://pastecode.io/s/nz0piduo

Blender Baker Script – https://pastecode.io/s/b6ohc25e

UE Importer Script – https://pastecode.io/s/wfebak03


Read my documentation for artists here:

 

Project Reflection:

To conclude, this was my largest and toughest tool to make yet; however, it proved extremely rewarding to create. It brought all of my knowledge of the asset creation pipeline into a procedural and seamless flow, moving from baking to materials, shaders and engine setup in one swoop. My greatest gripe is that the screenshot functionality unfortunately could not be put into the same program, however with engine limitations it is what it is. I also think I could have better organised my CSV files into one larger master file with delegated rows to store all the data. However, by splitting it up, if some files are missing the script can regenerate the lost content to try and get itself going again. It's also very adaptable to any project or file path thanks to its config files, which I will definitely be making more of in the future.


Writing to and reading from text files is very easy to do, small in file size and easy to open up and check. My exploration into Blender's Python API has also opened up new avenues for me to create more complex tools that take advantage of its plethora of features, like modelling or procedural textures, in upcoming projects. I definitely want to explore this more in depth, and thanks to its free price tag it's highly accessible for anyone on the team to get. Writing this blog took forever; however, I hope you enjoyed it, and be sure to check out my scripts above if you want to use any of the logic in your own work.


Assets Used:

---. “Industrial Pallet.” Fab.com, 2025, www.fab.com/listings/6a8546f0-6e83-4a3a-9035-c9306cd22441 . Accessed 3 Jan. 2026.

---. “METAL BARREL | Post-Apocalyptic Videogame Props.” Fab.com, 2025, www.fab.com/listings/2b6a823a-ffff-4c2b-99b6-8b2bafc7c917 . Accessed 3 Jan. 2026.

---. “Misc Props - UB Free Pack.” Fab.com, 2025, www.fab.com/listings/e5bb46ab-edf1-4530-8797-703f4c98b4a6 . Accessed 3 Jan. 2026.

---. “Simple Sword and Axe.” Fab.com, 2025, www.fab.com/listings/e3a09acc-5704-4cb3-9485-45ad39414fce . Accessed 3 Jan. 2026.

---. “Survival Props.” Fab.com, 2024, www.fab.com/listings/d5d45ca2-d957-447a-8744-ea1f8aabece3. Accessed 3 Jan. 2026.

Research Bibliography:

Epic Games. “Using Python to Streamline Asset Workflows | Unreal Fest Europe 2019 | Unreal Engine.” YouTube, 20 May 2019, www.youtube.com/watch?v=FOSwlDQY6N0 . Accessed 11 Dec. 2025.

 

 



© 2025-2026 Jesse Olchawa

The content provided on this website cannot be utilised for any AI training or data gathering purposes!
