Substance Automation Toolkit: 2020.1.3
Substance Designer
Editor Version: 2020.1.3 (10.1.3) build 3687 commit 0f34f9e9 Release (2020-06-11)
Cooker Format Version: 3.0.0 commit 0x00000000
Engine Version [OpenGL 3.x]: 7.2.9 commit 0x51db286b (2020-06-11)
Bakers Version: 2.4.0 commit 0xb61270a1
Substance Cooker 10.1.0 commit 0xd6d2c783
Given an input bitmap and an output, when an .sbs is rendered through the SAT, any Levels or Histogram Scan adjustments behave wildly differently from what is observed in the 3D/2D views and in local exports. The diffs only occur through SAT. A finely tuned mask will most often get blown out to pure white.
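For context on why the blow-out is so drastic: a Histogram Scan is essentially a very steep levels remap, so a small per-engine precision difference in the input can flip a pixel from mid-grey to pure white. The sketch below is illustrative only (not SAT's actual implementation); the `position`/`contrast` values and input values are hypothetical assumptions.

```python
# Illustrative sketch, NOT SAT code: a histogram scan stretches a narrow
# input band [position - contrast/2, position + contrast/2] to [0, 1],
# clamping everything outside it. The narrower the band, the more a tiny
# engine-precision difference in the input is amplified in the output.

def histogram_scan(value, position=0.5, contrast=0.02):
    """Remap 'value' so the narrow band around 'position' stretches to
    [0, 1]; values outside the band clamp to pure black or pure white."""
    if contrast <= 0.0:
        # Degenerate case: hard threshold at 'position'.
        return 1.0 if value >= position else 0.0
    low = position - contrast / 2.0
    out = (value - low) / contrast
    return max(0.0, min(1.0, out))

# A mask value near the band edge: a ~1% difference in the incoming
# value flips the result from mid-grey to fully clamped white.
a = histogram_scan(0.505)  # pixel value from one engine; a ≈ 0.75
b = histogram_scan(0.515)  # same pixel, slightly different engine; b == 1.0
```

This is why a finely tuned mask is the worst case: the tighter the contrast band, the more any CPU-vs-GPU numerical divergence upstream gets magnified into pure white.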
I have checked with pipeline that enough memory is allocated and that position maps are baked correctly. The project will not have enough dedicated access to farm GPU machines, so a fix for what looks like a major CPU-engine discrepancy is needed. I should also note that this occurs when I run SAT through automator on my local machine, not just on the farm. Some help debugging would be appreciated.
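Since files can't be shared, one way to characterise the discrepancy numerically is to export the same graph output from both paths (Designer's local export and the SAT run) as 8-bit grayscale and compare the pixel buffers. The stdlib-only sketch below uses synthesized stand-in data; in practice you would load the two real exports (e.g. with Pillow) instead, and the file contents shown are purely hypothetical.

```python
# Hedged debugging sketch: compare two 8-bit grayscale pixel buffers
# (one from a Designer local export, one from the SAT render of the
# same graph output) and report how far apart they are, plus how much
# of the mask got blown out to pure white in the SAT result.

def diff_stats(local_px, sat_px):
    """Return (max absolute per-pixel difference,
               fraction of pixels that are 255 in SAT but not locally)."""
    assert len(local_px) == len(sat_px), "exports must be the same size"
    max_diff = max(abs(a - b) for a, b in zip(local_px, sat_px))
    blown = sum(1 for a, b in zip(local_px, sat_px) if b == 255 and a != 255)
    return max_diff, blown / len(sat_px)

# Stand-in data: a soft gradient mask locally, mostly clamped to white
# through SAT (mimicking the reported behaviour).
local = [120, 130, 140, 150, 255]
sat = [255, 255, 255, 255, 255]
print(diff_stats(local, sat))  # (135, 0.8)
```

Attaching numbers like these (max diff and blown-out fraction per output) to the report may help narrow down whether the divergence is a uniform remap shift or a hard clamp.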
I am unable to provide any files.
Thanks