Author Topic: Trouble rendering sbsar files containing 8k texture sizes  (Read 2736 times)

Today I was trying to debug quality issues we were seeing with 8k textures processed through Substance. One of the textures contains fine print that is readable in the input but becomes a blurry mess in the output, despite being 8k. I've got the GPU engine set correctly, and our artist found a setting in Substance Designer that was capping inputs at 4k; if I raise that to 8k, the quality issues are resolved in Substance Designer's output previews, though I'm not sure how much this reflects the CLI tools.

Going through the docs I came across the sbscooker option `--size-limit` (it would be good to know SAT's default when rendering). I tried it with a value of 13 (for 8192px, since the value appears to be a power-of-two exponent), which baked to an sbsar fine, but calling sbsrender gives the following error:
"content of input file "filename.sbsar" does not form a valid package."
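For reference, the pipeline is roughly this (filenames and paths are placeholders, flags as I understand them from the SAT docs):

```shell
# Cook the .sbs to an .sbsar, allowing up to 2^13 = 8192px inputs
sbscooker --inputs mygraph.sbs --size-limit 13 --compression-mode 2 --output-path ./cooked

# Render outputs from the cooked archive (this is the step that errors)
sbsrender render --input ./cooked/mygraph.sbsar --output-path ./out
```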

If I load the sbs in Substance Designer, it's fine at 4k, but after changing the setting to support 8k and loading the graph for rendering, memory climbs to about 16GB (out of 128GB available on this system) and SD crashes. Is it likely that sbsrender is hitting the same problem?

The graph itself is set to an absolute 8k for the input bitmaps (they just show as 4k unless the limit is adjusted in settings).

Could it be that Substance is hitting a memory limit? The sbsrender docs mention a default of 1000MB, but the parameter is listed as `--engine`, exactly like the parameter above it for selecting an engine. It's confusing if the same flag is meant to be specified twice; maybe it should be `--engine-mem`? I've seen `--memory-budget` elsewhere for this, but it no longer seems to be supported.


I tried `--engine 24000` with no luck. sbscooker used up to 10GB of RAM, and the sbsar it produced is about 5GB (with `--compression-mode 2`). The graph being rendered contains 20 textures: 5 at 2k, 5 at 8k, and 10 at 4k. As far as I know they're linked externally, but I guess the sbsar is packing them?
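Back-of-the-envelope for what packing those bitmaps uncompressed would cost, assuming 8-bit RGBA (my assumption; the real formats may differ):

```python
# Rough uncompressed footprint of the 20 input textures, assuming 4 bytes/pixel (8-bit RGBA)
BYTES_PER_PIXEL = 4

def texture_mb(side_px: int) -> float:
    return side_px * side_px * BYTES_PER_PIXEL / (1024 ** 2)

sizes = [2048] * 5 + [8192] * 5 + [4096] * 10  # the 20 textures in the graph
total_mb = sum(texture_mb(s) for s in sizes)
print(total_mb)  # 2000.0
```

So raw pixel data alone is around 2GB; presumably intermediate buffers during the cook multiply that several times over, which would fit the memory usage I'm seeing.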
Last Edit: September 19, 2017, 07:50:33 am

Loaded the sbs into Substance Designer, raised the limit in settings from 4k to 8k, but didn't load the target graph. Exported the package to sbsar: sbscooker.exe peaked at 122GB of RAM (99% of RAM in use), then dipped by various amounts, down to 30GB and back up to as much as it could get. It's still running 25 minutes later, currently at 94GB. This was with the export option to avoid compression, which was meant to speed up the write to disk.

I'll give it a little while longer, but for some reason sbscooker is really struggling to create the sbsar file. It stopped (presumably crashed) while I was typing this, and the sbsar file is still 0kb... so, not enough RAM?

Just switched the setting back to 4k and tried the GUI export to sbsar again; it's showing the same symptoms, already exceeding 40GB of RAM. (I take it the limit setting has no effect on this, then? Confirmed by ending the sbscooker.exe task and checking the command in the resulting error: --size-limit is always 12.) Tried 128px in the limit setting with best compression: same problem (though the compression mode did change to 1).

Just to confirm: when SAT is given --size-limit 12 it exports the sbsar fine, and definitely not with the same duration or memory usage. At --size-limit 13 it seems to finish fine, but sbsrender fails (although the error comes from sbsbaker for some reason).
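For anyone who hits this later: as far as I can tell `--size-limit` takes a power-of-two exponent rather than a pixel count, so 12 caps at 4096px and you need 13 for 8192px. Quick sanity check:

```python
# --size-limit appears to be a log2 exponent: pixels = 2 ** size_limit
def size_limit_to_pixels(size_limit: int) -> int:
    return 2 ** size_limit

print(size_limit_to_pixels(12))  # 4096
print(size_limit_to_pixels(13))  # 8192
```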