Author Topic: Iray Renders using the Automation Toolkit

I just watched a video on YouTube entitled "Substance Designer 5.3: Using the Nvidia iRay Renderer" that walks you through how to use Substance Designer to render a textured asset. I've heard it said that the Automation Toolkit can reproduce almost all processes that could be done in Designer, so is it possible to automate renders of an object with textures applied using the Substance Automation Toolkit? For instance, say I had an fbx containing a model and a camera, an HDR image to light it, and all the texture files needed for the model. Could I combine these elements and generate a render of the model against a white background? I'm mostly curious whether it can be done at all, but I'd also really appreciate any videos, articles or other resources that show someone doing this. Thanks in advance for any insight you can provide!

Quote
I've heard it said that the Automation Toolkit can reproduce almost all processes that could be done using Designer
It's important to note that the Substance Automation Toolkit (SAT) is not an API for Substance Designer.
Rather, it's an API for the processes required to automate material creation.

SAT encompasses the following:
  • CLI sbsbaker: Bake mesh maps from your input geometry
  • CLI sbscooker: Compile an sbs material to sbsar, so it can be consumed by sbsrender or Painter
  • CLI sbsrender: Render textures using the mesh maps and sbsar
  • CLI sbsupdater/sbsmutator: see the docs at https://docs.substance3d.com/sat/command-line-tools
  • Pysbs: A Python API that allows you to create and edit sbs files (which are essentially XML files). It also includes the batchtools module, an API that runs the CLIs listed above from Python by spawning subprocesses for you (see the Pysbs sketch below).
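
To give a feel for the Pysbs side, here's a minimal sketch of creating an empty sbs document. The class and function names are from memory of the Pysbs docs, so treat them as assumptions and verify against your installed version.

Code:
    from pysbs import context, sbsgenerator

    ctx = context.Context()

    # Create a document containing a single graph called 'my_material'
    doc = sbsgenerator.createSBSDocument(ctx,
                                         aFileAbsPath='/tmp/new_material.sbs',
                                         aGraphIdentifier='my_material')

    graph = doc.getSBSGraph('my_material')
    # ... add nodes, outputs and parameters to the graph here ...

    # Serialize the document to disk (it's XML under the hood)
    doc.writeDoc()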

Substance Designer also has its own Python API, which aids you in building and editing materials, but it doesn't offer much beyond that at this time, although it is slowly improving.
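
For completeness, here's a rough sketch of what that Designer API looks like. It only runs inside Designer's embedded Python interpreter, and the names below are from memory, so double-check them against the Designer scripting docs.

Code:
    # Runs inside Designer's embedded Python interpreter only.
    import sd

    ctx = sd.getContext()
    app = ctx.getSDApplication()
    pkg_mgr = app.getPackageMgr()

    # Load an existing package and list the resources (graphs etc.) in it
    pkg = pkg_mgr.loadUserPackage('/path/to/material.sbs')
    for resource in pkg.getChildrenResources(True):
        print(resource.getIdentifier())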

As far as I know (as of 2020.1.0), it isn't possible to automate Iray rendering through the Designer API.
So even though that API likely won't be of any use to you in this case, I thought it was worth mentioning that it exists.

___

It sounds like you're after something similar to turntable/thumbnail rendering with your material of choice; to achieve this you'd have to combine several different processes.

The general idea is to generate the textures with SAT, then run a Python script that applies those textures to a shader in your DCC and renders the final image.

  • Bake out your required mesh maps by providing the fbx to sbsbaker.
  • Compile the material you need to sbsar using sbscooker.
  • Render your textures by providing both the sbsar and mesh maps to sbsrender (a batchtools sketch of all three steps follows below).
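
Here's roughly what that chain looks like through Pysbs' batchtools module. The function names mirror the CLI tools, but the specific keyword arguments (output_path, set_entry, etc.) are my best recollection of the CLI options with hyphens swapped for underscores, so verify them against the SAT docs.

Code:
    from pysbs import batchtools

    # 1. Bake a mesh map (ambient occlusion here) from the fbx.
    #    Each batchtools call spawns the CLI as a subprocess and
    #    returns a Popen-like object, hence the .wait() calls.
    batchtools.sbsbaker_ambient_occlusion(
        'asset.fbx', output_path='bakes', output_name='asset_ao'
    ).wait()

    # 2. Compile the .sbs material to a .sbsar
    #    (you may also need includes= pointing at your Substance packages dir)
    batchtools.sbscooker(
        'material.sbs', output_path='.'
    ).wait()

    # 3. Render the textures, feeding the baked map into the sbsar input
    #    ('ambient_occlusion' is a placeholder for your graph's input id)
    batchtools.sbsrender_render(
        'material.sbsar',
        set_entry='ambient_occlusion@bakes/asset_ao.png',
        output_path='textures'
    ).wait()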

At this point you can use the DCC of your choice (I'll use Maya in my example) to generate the thumbnail/render.
You could have a Python script that does the following:
  • Launch a headless Maya session and open an appropriate render environment (a template shot file is ideal, i.e. one that already has the lighting set up and a shader, with a texture node per channel, applied to the geometry)
  • Apply the textures to the appropriate texture node for each channel.
  • Render the frame/turntable (a mayapy sketch follows below).
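
A bare-bones version of that script might look like the following. The scene, camera and node names are all placeholders, and the final render call depends on which engine your template uses, so treat this purely as a sketch.

Code:
    # Run with mayapy, not a regular Python interpreter.
    import maya.standalone
    maya.standalone.initialize(name='python')

    import maya.cmds as cmds

    # Open the template shot (lighting + shader already set up)
    cmds.file('/path/to/template_shot.ma', open=True, force=True)

    # Point each channel's file node at the textures sbsrender produced.
    # 'file_<channel>' is a placeholder for your template's node names.
    textures = {
        'baseColor': 'textures/asset_basecolor.png',
        'roughness': 'textures/asset_roughness.png',
        'normal': 'textures/asset_normal.png',
    }
    for channel, path in textures.items():
        cmds.setAttr('file_%s.fileTextureName' % channel,
                     path, type='string')

    # Render the current frame. cmds.render() uses the current renderer;
    # engines like Arnold or V-Ray expose their own render commands.
    cmds.render('renderCamShape')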

Thanks for your response NevTD! To provide a little more context (in the event that this is seen by Adobe staff), here is why I asked. I work at a company that manages a large number of assets created in several different DCC applications and supported by several different render engines. The common thread is that the texturing is done exclusively in the Substance Suite. What we need is ONE rendering environment for proofing these assets. Different combinations of DCC application and render engine produce differences in appearance, and there is no easy way to calibrate them so that the same asset can be rendered through any of those combinations interchangeably and come out looking the same. What better place to find that continuity than the Iray render engine that ships with the Substance Suite? For larger scale operations, it just seemed likely that if Iray exists in Painter and Designer, the SAT would provide some way to generate these renders, so that what the asset's creator sees can be reliably reproduced by those on the receiving end. For now, it appears the solution you mentioned is the only option. I appreciate the detailed response; thanks again!