Show Posts


Messages - ivalalearn

Hi guys,

I am upgrading my system (GTX 1070 laptop), and as I travel a lot I am switching to an ultrabook (Dell XPS 13) and looking at eGPUs. Does anyone have experience using eGPUs with Substance Painter for viewport painting etc.? I am looking at the Aorus 2070 eGPU gaming box to add to my Dell. All the benchmarks I can find are for games rather than 3D work, and I'm unsure how they compare...

Be great to hear!

Yes, it seems this is an ongoing issue.

The problem with assigning different materials prior to importing into Substance Painter (in order to be able to hide different pieces of geometry) is that you then end up with a different texture set for each material on export, even when your mesh shares the same UV set, which is wasteful.

It seems another workaround is to combine the textures after export (using a plugin or manually), but I have tried two different plugins and neither worked for me, and the manual route in Photoshop is quite a chore.
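For reference, the manual combine I mean is conceptually simple because the exports share one UV layout: each material's export is only opaque inside its own islands, so merging is just an overlay. A pure-Python sketch of the idea (the pixel lists and values here are stand-ins for real image files; with actual exports you'd do the same thing through an image library):

```python
# Rough sketch of combining per-material exports by hand, assuming each
# export only has opaque pixels inside its own UV islands, and all
# exports share one UV layout and resolution. Pixels are (R, G, B, A)
# tuples in row-major nested lists.

def merge_texture_sets(layers):
    """Overlay exports: each layer fills in wherever it is opaque."""
    height, width = len(layers[0]), len(layers[0][0])
    merged = [[(0, 0, 0, 0)] * width for _ in range(height)]
    for layer in layers:
        for y in range(height):
            for x in range(width):
                if layer[y][x][3] > 0:   # opaque: this material owns the texel
                    merged[y][x] = layer[y][x]
    return merged

# Demo: two tiny 2x2 "exports", each covering a different column.
red   = [[(255, 0, 0, 255), (0, 0, 0, 0)]] * 2
green = [[(0, 0, 0, 0), (0, 255, 0, 255)]] * 2
combined = merge_texture_sets([red, green])
```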

I am just surprised because 3DCoat has no issue letting you hide sub-meshes whilst painting and maintaining a single texture set, so surely it is technically possible? Is it something you all plan to work on? The painting experience is just so good that I really want to make the full switch to SP.


I'm thinking for a hack, perhaps you could try to manually gamma correct the textures.


Hi Wes, I'm interested in how you might gamma correct colour textures. I'm sadly still forced to use gamma colour space in Unity, and this would be a great temporary solution rather than trying to paint them to look good in gamma and modifying them later on. Any resources or tips appreciated!
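In case it helps anyone else reading: my understanding of the hack (an assumption on my part, not an official Allegorithmic workflow) is to pre-encode each colour channel with the inverse gamma, so that Unity's gamma pipeline roughly cancels it out. The maths per channel would be:

```python
# Sketch of manual gamma correction, assuming a simple power-law
# approximation of sRGB (gamma 2.2) rather than the exact piecewise
# sRGB curve. Channel values are normalised to [0, 1].
GAMMA = 2.2

def linear_to_gamma(v):
    """Encode a linear-space channel value for a gamma-space pipeline."""
    return v ** (1.0 / GAMMA)

def gamma_to_linear(v):
    """Invert the encoding (what the display/engine effectively does)."""
    return v ** GAMMA

# A linear-space value of ~0.218 encodes to roughly mid-grey (0.5).
encoded = linear_to_gamma(0.218)
```

You would apply `linear_to_gamma` to every texel of the colour maps (e.g. via a Photoshop levels/gamma adjustment of 1/2.2) before handing them to a gamma-space Unity project.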

Hi guys!

I have a high density .obj with associated .mtl generated via photogrammetric techniques, that look like the attached.

I think it was exported from Zbrush.

- I have the high vertex count mesh
- I have a retopologised low-poly mesh

I would like to bake the colour data (and a normal map) from the high-res mesh to the low-res mesh using Substance's baking tools.
Are you able to offer a workflow for how I would go about doing this?

Hope to hear!

Substance Designer - Technical Support - Normals baking issue
on: September 15, 2016, 10:32:03 pm
Hi guys

I've brought a low & high poly mesh into SD from 3DCoat. In 3DC I get a nice bake:

In SD the overall bake is good, but despite playing around with tonnes of settings I also get these weird lines all over the mesh. Any ideas?

Here's the retopo:

It'd be really great to hear back. Whenever I try to get into Substance I find these small, annoying issues that make me leave it again and just get the project done in 3DC. I'd finally like to get SD up and running.


Hi Allegorithmic,

I created & exported some basic maps in Substance Designer. I brought these into Substance Painter and began painting on these.

On trying to export with the Unity 5 Standard (metallic roughness) workflow I'm getting the following errors. Can you please help me understand what's going on?

[MapExporter] The input map 'Opacity' needed by the map 'hoof_wall_lo_Hoof_wall_AlbedoTransparency' of the texture set 'Hoof_wall' isn't available because: 'Opacity' channel is missing in your texture set.
[MapExporter] The input map 'Emissive' needed by the map 'hoof_wall_lo_Hoof_wall_Emission' of the texture set 'Hoof_wall' isn't available because: 'Emissive' channel is missing in your texture set.
[MapExporter] The map 'hoof_wall_lo_Hoof_wall_Emission' can't be generated for the texture set 'Hoof_wall'.
[MapExporter] Export duration: 2s
[MapExporterDialog] Map export ended

Here's a screenshot:

Thanks in advance

Does this look like a Unity bug?

Everything looks good in Substance. Publish to Unity and I get pure color maps.

Hi guys,

Finally getting into using Substance with Unity (Unity 5.1.3p1)

I'm getting some odd behaviour trying to get my Substance file / all the bitmaps from my Substance project to import correctly though:

Here's my tree:

The model in Substance

The substance in Unity

I'm Publishing the substance to Unity as an .sbsar file (via right-click & 'Publish')

I'm not getting my AO map coming in automatically. I've tried a few times, and the albedo colour now comes in as a weird colour each time, either green or red.

What am I missing?
Thanks in advance,

Hi Wes,

So for instance in this picture this is an object that has been unwrapped and imported with a 4096 x 4096 normal map from 3Dcoat. 

Here is the relevant UV set (relevant shells marked with yellow)

And this is okay, but it still doesn't look anywhere near as natural as the effect I got on the primitive in my OP. Here is my graph.

To try and get high enough detail noise I set the distance on the perlin noise zoom to 64 (max), and tiled the safe transform grayscale 16 times (max).
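As an aside on what maxing out the tiling buys: as I understand it (my own mental model, not how SD is implemented internally), tiling is wrap-around sampling, so each extra repeat multiplies the noise's spatial frequency:

```python
# My mental model of what Safe Transform tiling does (an assumption,
# not SD's actual implementation): wrap-around UV sampling, so tiling
# N times multiplies the spatial frequency of the noise by N.

def tiled_sample(texture, u, v, tiles):
    """Sample a square texture (nested lists) at UV (u, v) with tiling."""
    size = len(texture)
    x = int(((u * tiles) % 1.0) * size)
    y = int(((v * tiles) % 1.0) * size)
    return texture[y][x]

texture = [[1, 2], [3, 4]]                      # stand-in 2x2 noise tile
once  = tiled_sample(texture, 0.25, 0.25, 1)    # samples texel (0, 0)
twice = tiled_sample(texture, 0.25, 0.25, 2)    # same UV now lands on (1, 1)
```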

Wes, I believe in your sentence
Often times, this is done by using a detail normal map in the shader. As the object is viewed further away, the mipped texture is blurred and you don't see the high frequency details as noise. The closer you get to the object, the more of the detail normal map you see.
you are referring to this channel in the Unity shader for instance?

I hadn't considered this as a method to increase normal map noise. Unfortunately all my projects are WebGL based, and mipmapping doesn't currently work :'(
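Lacking runtime mips, the only fallback I can think of (purely my own sketch, not a WebGL fix) would be to precompute blurred levels of the detail map offline and swap textures by camera distance. Each level is just a 2x2 box-filtered halving of the previous one:

```python
# Sketch: build one mip-style level by hand. The image is a square
# grayscale map as nested lists of floats; real textures would go
# through an image library, but the filter is the same.

def box_downsample(img):
    """Halve a square grayscale image via a 2x2 box filter average."""
    n = len(img) // 2
    return [[(img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
              img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(n)] for y in range(n)]

level1 = box_downsample([[0, 4], [8, 4]])   # averages the four texels
```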

However, this is a method to improve normal map noise. I also feel that the albedo map detail is very 'large' in relation to the size of the object. I believe you are addressing this with this comment:
Object size and UV scale is also an important factor when texturing. If the UV scale (ratio of UV shell size to texture resolution) is small, then the applied noise will look low res. In the second image, I used Painter to demo a noise applied to the mesh. The UVs have a small scale so the texture looks blurry and low res on the mesh. In the third image, I tiled the noise 3 times and it now looks correct to the UV scale of the mesh.

To achieve this effect in SD, you suggest I use the Safe Transform node? I feel I've sort of done this already in the graph above.

I wonder, however: if I am expecting a higher-res texture on an object such as the one I am demoing, does it need to occupy more space on the UV map? Is this the crux of the issue, that the objects, once brought into Substance Designer, occupy relatively less UV space than the primitives I'm testing my initial designs on?
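To put a number on that suspicion (a back-of-envelope calculation of my own, with a hypothetical helper function): texel density scales with the square root of the UV area fraction, so a shell covering only 4% of the 0-1 square gets a fifth of the density a primitive using the whole UV space enjoys:

```python
import math

def texel_density(texture_res, uv_area_fraction, surface_area):
    """Texels per unit length for a UV shell.

    uv_area_fraction: fraction of the 0-1 UV square the shell covers.
    surface_area: the mesh area that shell maps to (squared length units).
    """
    texels = (texture_res ** 2) * uv_area_fraction
    return math.sqrt(texels / surface_area)

full_uv  = texel_density(4096, 1.0, 1.0)    # primitive using the whole UV square
small_uv = texel_density(4096, 0.04, 1.0)   # shell covering only 4% of UV space
```

With the numbers above, the same noise frequency that looks crisp on the test primitive would land at one fifth of the resolution on the real object's shell.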

Thanks if you read this far

Thanks guys. And by 'tiling texture' do you mean one of the Cells textures and the Perlin Noise Zoom texture that lets you specify a 'distance'? So far I've got these roughly where I want them using a Safe Transform node, but I'm a bit surprised: my objects are a normal size, yet it's still difficult to get that high-end detail in at the moment. I'm thinking there must be a step I've missed, so I'm going over the Getting Started tutorial series again.


This seems like an odd workflow, Victor. I would have thought the most obvious way to do things is to do all of your base texturing in Designer and then move to Painter to add smaller custom features? Although yes, the videos I've seen also describe an SP-to-SD workflow. I'm interested to know if there's something I'm missing with my current approach.

Hi Fabian, thanks. What does it mean to use proper UV scaling?

I've started getting into Substance Designer and I'm loving it. What an awesome piece of kit. An early problem, though:
- I seem to be able to create good-looking textures in an empty package, using a new graph working on a primitive
- when I reference these graphs into a new package with the actual meshes I want to apply them to, the quality either drops or they just don't seem to work at all. I'm using a Multi-Material Blend node to reference in the individual graphs and connect them to my final outputs.

Two examples:
On primitive, just how I want it

On actual object

On primitive

On actual object

I have set all my graphs to 4096x4096 in both packages to try to rectify this.
It would perhaps seem to be an issue with the relative scale between the primitive and the actual object?

Any workflow tips or things I should be looking out for?

Wes, that's terrifically helpful. As always, a huge thank you. I am up and running, and it's clear Substance is going to become a solid part of my workflow, even after just a few hours with it!

Thanks Robin. I don't seem to be able to get the effect I was going for using that. However, the Safe Transform Grayscale node seems to get me going in the right direction!
