Show Posts


Messages - Alex Jenyon

As far as I can tell, that's not an error message - it's just information about what output Substance is expecting.

If you've managed to set an output node (which it looks like you have, since it's orange) everything should be working fine.


The amount of displacement in your shader will have a huge influence on this, too.  Even if you have a perfectly balanced height map, a low displacement setting will make it almost disappear.

In Substance itself, Iray calls this the 'height scale' - and a setting of '1' is usually not enough.
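The point above can be shown with a toy calculation.  This is a minimal sketch, not Substance's actual API: 'height_scale' is an illustrative stand-in for the renderer's displacement setting.

```python
# Toy sketch: a normalised 0-1 height map only produces visible relief
# once it is multiplied by the renderer's scale setting.  Names here are
# illustrative assumptions, not real Substance/Iray identifiers.
height_map = [0.0, 0.25, 0.5, 1.0]   # normalised height samples

def displaced_offsets(heights, height_scale):
    # offset of each sample along the surface normal, in scene units
    return [h * height_scale for h in heights]

subtle = displaced_offsets(height_map, 1.0)   # relief spans at most 1 unit
strong = displaced_offsets(height_map, 8.0)   # same map, 8x the relief
```

Even a "perfectly balanced" map stays flat-looking if that one unit is tiny relative to the asset, which is why a scale of 1 is often not enough.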

There are at least 3 totally different workflows you could use.  All would involve taking photos, and using 'Substance', but the requirements would be quite different.

Approach 1:  bitmap2material
Requirement:  1 high-res photo taken perpendicular to the surface in diffuse light.

Method:  Use 'bitmap2material' (or the bitmap2material light node in Designer) to remove lighting information from the photo, and Substance to make it tile.  The 'make it tile' node probably won't work very well (ironically) on actual tiles, unless they are perfectly aligned with the edges of the image - so you may want to use Photoshop to crop/straighten the photo first.
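A quick, tool-agnostic way to check how well the tiling step worked is the classic half-offset trick: wrap the image around by half its size so the seam edges land in the middle of the frame.  A sketch of the idea (this is a generic check, not a Substance node):

```python
import numpy as np

# Offset the image by half its size with wraparound, so any tiling seams
# move from the borders to the centre where they are easy to inspect.
img = np.arange(16, dtype=float).reshape(4, 4)   # stand-in for a texture
offset = np.roll(img, shift=(2, 2), axis=(0, 1))
```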

Approach 2:  Material scanning using photos
Requirement:  8 high-res photos taken under very specific lighting conditions.

Method:  See THIS TUTORIAL.  Note that this will be tricky to pull off if the tiles are already on the wall!

Approach 3:  Photogrammetry
Requirement:  At least 12, but more likely around 36, photos from different angles.  They don't need to be as high-res, though.

Method:  There are lots of photogrammetry tutorials online - and you'll need a photogrammetry tool to solve the photos (Substance can't do this).  The output will be a mesh (from which you can extract a height map), as well as a texture (which you can de-light in a similar way to approach 1).

There are lots of 'hybrids' between these approaches, too.  For example, you could use photogrammetry to create a mesh / height map, but then generate the colour information procedurally in 'substance'.  The nice thing about this workflow is that it has a real-world base, but some procedural variation too.

You could also scan several tiles individually (using any approach), and use 'Substance' to, well, tile them.

Hope that helps.  I do a lot of approaches 1 and 3 for my job.  I haven't yet found anywhere I have enough lighting control over to do approach 2, but I'm hoping to try it soon.

Substance Designer - Discussions - Re: Critique
 on: May 01, 2018, 04:34:09 pm 
Then I owe you an apology.

Comments still apply, though - I've not seen ceramic get damaged like this, so it would be great to see the reference (for my own education, if nothing else).  If the stock reference is online, can you post the page instead of a hotlink?


Out of the box, no.

This might help, though:

Substance Designer - Discussions - Re: Critique
 on: May 01, 2018, 04:33:07 am 
I strongly suspect you have done this from memory, rather than from reference.  I've not seen ceramic tiles (of any type) behave like this when damaged - it looks like the idea of damage, rather than how damaged tiles actually look.

Using reference is not cheating.  The image already in your head came from somewhere, so rather than refer to an unreliable version of it, why not go to the source...

My first suggestion would be to find, or shoot, a piece of reference - and break it down into its important features:

-The base materials, including grout, dirt, mould, etc.
-The difference between dirt, wear and actual damage
-Where each occurs, and why
-What happens when a piece of tile is missing
...and so on.

Hope that helps


'Non-destructive' is more of a concept - part workflow, part toolset, and part thought process.

That makes it sound really complicated, but it's quite a simple idea at its heart:

At no point in your workflow do you make a permanent, non-reversible change to any information upstream

Take the example of an image - if you clone a section of it, and save the result back over the original, you WILL NEVER get back the information you originally had under your cloned area.  You have made a 'destructive' edit.

If, however, you applied the clone paint as a layer, or as a node in something like Nuke (or Substance), the original still exists.  If you want to, you can re-do the clone paint at any point, delete it, or apply it to something else.  This is 'non-destructive'.

For a modern texturing workflow, this idea is very useful.  If your entire workflow is non-destructive, you can make edits way upstream in your process, and have the changes automatically propagate downstream. 

Another example:

If you have used a basic granite stone substance as the basis for a complex stone wall texture, and you haven't followed a non-destructive workflow, then a note to make the stone sandstone instead of granite would mean you had to basically start from scratch.

If you've followed a non-destructive workflow, on the other hand, you can simply swap your 'granite' substance for a 'sandstone' substance, and generate a new sandstone wall texture almost immediately.

Layers, nodes, flow graphs and procedural edits are the signs of a non-destructive workflow, but don't necessarily guarantee it.  Baking out the results of one substance, and using the bitmaps in another is a destructive step, for instance.
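The idea above can be sketched in a few lines of code.  This is a toy model of a node graph, not anything from Substance itself: edits are stored as a chain of operations over the source, so the source is never altered and upstream swaps propagate automatically.

```python
# Minimal sketch of non-destructive editing: the "graph" is an ordered
# list of operations, and the source data is never overwritten.
source = [10, 20, 30]                     # stand-in for the original image

graph = [
    lambda img: [v * 2 for v in img],     # e.g. a levels adjustment node
    lambda img: [v + 1 for v in img],     # e.g. a clone/paint layer
]

def evaluate(src, nodes):
    result = list(src)                    # work on a copy; src is untouched
    for node in nodes:
        result = node(result)
    return result

wall = evaluate(source, graph)
# Swap the upstream input (granite -> sandstone) and re-evaluate: every
# downstream edit re-applies automatically, nothing has to be redone.
new_wall = evaluate([1, 2, 3], graph)
```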

Hope that helps


The simplest method would be to blur the height field of your pavement before you apply the mask - this will smooth out the values, and give something very close to an average.

For a more sophisticated method, you could take a look at the 'maximum masked' node on Substance Share.  This will give you a constant value for each individual mask - but the maximum value, not the average.  To get a constant, true average you'd have some editing to do, but it should be a good start.

Another thing you could do, since the blobs are not concave shapes, is shrink down each one to a small dot (using find edges, blurs and levels nodes), and use a 'distance' node to expand them out again.  This would give you a constant height value for the centre of each individual piece.

Depending on how efficient you need to be with your substance, a combination of all 3 might give you some natural looking results.
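For what the "constant, true average" result would look like in plain numbers, here's a sketch.  The per-blob label map is a hypothetical stand-in for the individual masks you'd get from one of the approaches above:

```python
import numpy as np

# Toy height field: two "pavement blobs" separated by grout (column 2).
height = np.array([
    [0.2, 0.4, 0.0, 0.8, 0.6],
    [0.3, 0.5, 0.0, 0.7, 0.9],
])

# Hypothetical per-blob label map (0 = background), standing in for the
# per-piece masks a 'maximum masked'-style node would give you.
labels = np.array([
    [1, 1, 0, 2, 2],
    [1, 1, 0, 2, 2],
])

# Replace each blob's pixels with that blob's true average height,
# giving each individual piece a single constant value.
flat = height.copy()
for lbl in (1, 2):
    mask = labels == lbl
    flat[mask] = height[mask].mean()
```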


So there is no way to just save all your hard work as a single 2D image?

What's your intended use for this image?  I can't think of any solid reason for needing to do this in a modern texture workflow.  Screenshots for approval - of course - but then they don't need to have perfect resolution / dimensions.

Substance is specifically designed to produce PBR-ready texture packages.  Other things, too - but that's its core goal.  Outputting the 'combined' texture like you are requesting means baking lighting information into your textures, which is what the CG industry has been trying to move away from for the past 5 years or so...

Sounds like you have a particular niche use-case in mind.  Since the substance viewport doesn't support imported cameras (yet), you'll need to use another DCC to 'render' your textures - almost any modern application can do this.


It would be good to confirm with someone on staff, but I'd be willing to bet that's a scan-based material.

The sbs really wouldn't help that much if that were the case - there wouldn't be a wood pattern to expose.

You could use the scan editing (patch / clone) tools to try to remove the obviously repeating element.  You could also separate out the planks into individual tiles, and create your own hybrid (scan + procedural) material.

Both are a bit of work - I'd probably do the latter, personally.


Substance Designer - Discussions - Re: Wet Fabric
 on: March 25, 2018, 06:04:07 pm 
There are three separate parts to a task like this:

1.  The mesh - wet material is heavier, and tends to cling to the surface underneath it.  Your asset will need to reflect this.

2.  The textures.  The actual fabric pattern itself is relatively straightforward (and something close is likely available on Substance Source).  Fabric tends to darken slightly when it gets wet, so you'll likely need to lower the luminosity of the albedo (base colour) map to reflect this.

3.  The shader.  Wet fabric, particularly thin T-shirt fabric, becomes more translucent than it is when dry.  You can fake this in a texture by reducing the opacity, but really you need a translucent shader.  The shader tells a renderer (realtime or not) how the textures should be used.

Substance can do part 2 in a substance material, and part 3 in a material description (mdl) if you want to go that far.  Part 1 will be down to your modelling / sculpting skills.
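The texture-side tweak (part 2) is simple enough to sketch numerically.  The darkening factor below is an assumption to be tuned by eye, not a standard value:

```python
# Hedged sketch of part 2: darken the albedo so the fabric reads as wet.
dry_albedo = (0.80, 0.60, 0.50)   # linear RGB base colour (illustrative)
WET_DARKENING = 0.6               # hypothetical factor - tune by eye

wet_albedo = tuple(c * WET_DARKENING for c in dry_albedo)
```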

Hope that helps


Yes, absolutely.

A lot of the base materials (especially the ones from …) are just that.

Load the images in, connect them to the correct outputs, and publish!

All of those would seem possible (if tricky). 

It would still require a way of visualizing the result inside the Substance Designer viewport, though.  Looks like mdl should be able to do this - it might be worth digging into the code posted on THIS thread.


Use a 'blend material' node.

Choose the channels you want to blend, plug in your two materials, and feed a greyscale image into the 'mask' pin to control the blend.

The bottom material would be the black wall, the top would be the gold, and the mask would be a black and white version of your logo/lettering.
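Under the hood this is a per-channel linear interpolation.  A sketch of the maths, with single-channel arrays standing in for one channel of each material:

```python
import numpy as np

# One channel of each material (illustrative values, not real presets).
black_wall = np.full((2, 2), 0.05)   # bottom material
gold       = np.full((2, 2), 0.85)   # top material

# Greyscale logo mask: white (1.0) where the gold lettering should show.
mask = np.array([[0.0, 1.0],
                 [1.0, 0.0]])

# A 'blend material' style mix is a linear interpolation per channel:
blended = black_wall * (1.0 - mask) + gold * mask
```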


On the surface, this seems very straightforward - so straightforward it feels like I'm missing something.

This would be my initial answer:

If you take your texture, invert it and convert it to greyscale, you'll have a mask for your cuffs.  You might need to grade it a bit, but it should be possible to get a pretty clean mask.

You can use this mask to blend between the material for your main garment and the material for the cuffs, using a 'blend material' node.

If you want to rotate the material, you could add a 'transform material' node before the material blend.
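The mask-extraction step above can be sketched with arrays.  The toy texture, the luminance average, and the grading thresholds are all illustrative assumptions:

```python
import numpy as np

# Toy RGB texture: a light garment pixel and a dark cuff pixel.
texture = np.array([
    [[0.90, 0.90, 0.90], [0.05, 0.05, 0.05]],
])

# Greyscale via a simple channel average, then invert: cuffs become white.
grey = texture.mean(axis=-1)
mask = 1.0 - grey

# A simple 'levels'-style grade to clean the mask up to hard black/white.
LOW, HIGH = 0.3, 0.7          # hypothetical grading points - tune as needed
clean = np.clip((mask - LOW) / (HIGH - LOW), 0.0, 1.0)
```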

What did I miss in your question?

