They are 16-bit, but I didn't think the CPU engine did 16-bit, so I think I know what the issue is: I'm using gradients to remap the X/Y in a pixel processor, but any gradient wider than 256 pixels will start to double up, since 8-bit only covers 0-255.
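To illustrate the doubling-up (this is just a standalone sketch, not Substance code): if you quantize per-pixel positions to 8-bit, a gradient wider than 256 pixels can only produce 256 distinct values, so neighbouring pixels end up sharing the same remap value; at 16-bit every pixel stays distinct.

```python
# Sketch: why an 8-bit gradient can't uniquely address >256 pixel positions.
width = 512  # gradient wider than 256 pixels

# Normalized 0..1 position for each pixel column
positions = [x / (width - 1) for x in range(width)]

# Stored at 8-bit precision: only 256 levels available (0-255)
quantized_8 = [round(p * 255) for p in positions]
print(len(set(quantized_8)))   # 256 distinct values for 512 pixels -> pairs double up

# Stored at 16-bit precision: every pixel gets its own value
quantized_16 = [round(p * 65535) for p in positions]
print(len(set(quantized_16)))  # 512 distinct values -> no doubling
```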
How come the GPU engine doesn't work in UE? Is it a limitation of the Substance API? I would have thought it was just a matter of passing Substance a handle to the DirectX device or something.