Author Topic: Is there a way to bake position maps without normalisation?

I have multiple objects I want to make maps for. They all match up at the edges, so I need the resulting substance maps to flow from one to the others. For the most part they do. However I do want some large scale variation. The sort of thing that is visible across multiple objects, but doesn't tile.

For this I thought using a position map as an input might work well. Unfortunately the position map needs to be normalised somehow: either by the mesh's bounding box or by its bounding sphere. These meshes aren't all going to be the same height, so the range of each position map varies and they don't match up across boundaries.

I want to be able to set fixed values for the max and min range of the map so I can have the range the same for all meshes, regardless of their dimensions.

Is there any way to do this?

At the moment I'm rendering world space gradients in Max, which is an additional step I'd prefer to do away with.


The position maps need to be normalized because the pixels in the resulting texture have a limited range (at best 0-65535 when saving to a 16-bit texture). The only way to bake non-normalized positions would be to save the result to an HDR texture (16- or 32-bit floats), but even then you would not be able to use it in Designer, as the Substance engine does not handle floating-point textures.
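To make the range limit concrete: a baked channel can only store integers in a fixed range, so a world-space coordinate has to be mapped through some min/max before quantization. A minimal sketch of that mapping (the helper name and parameters are hypothetical, not anything Designer exposes):

```python
def encode_position(p, bbox_min, bbox_max, bits=16):
    """Quantize one world-space coordinate into an integer texture channel.

    The coordinate is first normalized into 0..1 using the bounding-box
    extent, then scaled to the channel's integer range (65535 for 16 bits).
    """
    max_val = (1 << bits) - 1
    t = (p - bbox_min) / (bbox_max - bbox_min)  # 0..1 inside the bbox
    return round(t * max_val)

# A 10-unit-tall mesh: the top lands on 65535, a quarter of the way up on 16384.
print(encode_position(10.0, 0.0, 10.0))  # → 65535
print(encode_position(2.5, 0.0, 10.0))   # → 16384
```

This is exactly why the choice of `bbox_min`/`bbox_max` matters: two meshes with different bounding boxes quantize the same world height to different pixel values.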

I understand the range of values would have to be limited somehow. I was hoping there would be a way to enter minimum and maximum as fixed values so they can be the same for all meshes.

I guess there isn't. I'll look for another way to do this.

Thanks for the input Cyrille.

One thing that would be possible would be to add a third renormalization option where you can specify the extent of the bounding box instead of letting Designer compute it automatically. That wouldn't be terribly convenient (*), but it's better than nothing.

(*) because you would have to find out the extent of the bounding box of all your objects put together in world space, translate those coordinates into the object space of each of your objects, note them down, and copy the values into Designer.
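The manual steps above could be scripted outside Designer. A minimal sketch, assuming the objects differ only by translation (no rotation or scale) and using hypothetical helper names:

```python
def union_bbox(bboxes):
    """Union of axis-aligned bounding boxes, each given as (min_xyz, max_xyz)."""
    mins = tuple(min(b[0][i] for b in bboxes) for i in range(3))
    maxs = tuple(max(b[1][i] for b in bboxes) for i in range(3))
    return mins, maxs

def to_object_space(world_bbox, object_origin):
    """Express the shared world-space bbox in one object's local space.

    With translation-only transforms this is just an offset by the
    object's world origin; these are the numbers you would note down.
    """
    wmin, wmax = world_bbox
    lmin = tuple(wmin[i] - object_origin[i] for i in range(3))
    lmax = tuple(wmax[i] - object_origin[i] for i in range(3))
    return lmin, lmax

# Two meshes' world bboxes, combined into one shared extent:
shared = union_bbox([((0, 0, 0), (1, 2, 3)), ((-1, 1, 0), (2, 2, 2))])
print(shared)  # → ((-1, 0, 0), (2, 2, 3))
```

Every object then gets the same world extent entered in its own local coordinates, so the baked ranges line up.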

Thanks Cyrille. Yes that's the sort of option Xnormal gives for heightmaps. I can find out the max range of the models.

I've been looking into this again and I really haven't found a good solution anywhere. It sounds like a really simple thing: get the vertex position in world space and use it to drive a gradient. And in a realtime shader it is; that's how tri-planar shaders work. But rendering that to a map seems more difficult. I guess no-one has wanted to do it before. Xnormal doesn't really offer this type of map. A height map rendered from a plane is close, but our meshes won't all be planar mapped.

I've found scripts to do similar things to the position map baker in Substance, but again, they're normalised by the object so won't match across multiple objects.

Can I put an official request in for manual range options in the position baker? Do I need to put a post in the requests page?


Can I put an official request in for manual range options in the position baker? Do I need to put a post in the requests page?
Yes, please.