Show Posts


Topics - Alex Jenyon

Hello all;

I've created a substance with 3 exposed parameters.

Editing these parameters has a visible effect on the outputs in Substance Designer, in Substance Player, and in Unreal if I edit the substance graph instance directly.

However, what I'm trying to do is control the parameters using a blueprint.   

A substance with exactly the same named parameters DOES work in blueprint, and I've copy/pasted the nodes to ensure I didn't make a typo - several times, since I was sure that was going to be the problem.

I still can't get it to work, though.  The sbsar is here:

Any chance someone could try it in blueprint, and see what I did wrong?

Tested in Unreal 4.18.2, Substance Designer 2017.2.3, Latest substance plugin.

Any assistance appreciated!

I thought I remembered the 'ground' category in particular having a lot more substances in it a few months ago - were all the 'Real Displacement Textures' removed from 'Substance Source'?

As an example, 'Forest ground with roots and needles' is still available under my personal 'My assets' tab, since I previously downloaded it - but no longer appears anywhere else on the site.

It'd be a shame to have lost them, though I assume there were some licensing costs.

I've been working on a substance for Unreal that makes heavy use of the Triplanar projection nodes (both grayscale and color).

Looks great in the default GPU engine, but pixellated garbage in the CPU equivalent.  Resolution, all inputs, etc. are identical.

From some online research, this is apparently because the GPU engine can utilise all of the information stored in a 16-bit position pass, while the CPU can only sample 8 bits.  Most of the threads I could find were from 2015 / early 2016, but this problem still seems to be present in the very latest version of SD / SD for Unreal, more than a year later.
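To give a sense of the scale of the difference, here's a quick Python check of the error you get from quantizing the same position value to 8 bits versus 16 bits (a simplified model of the precision gap, not the engine's actual code):

```python
def quantize(x, bits):
    """Snap a value in [0, 1] to the nearest representable level at `bits` precision."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

x = 0.123456
err8 = abs(quantize(x, 8) - x)    # worst case ~ 1/510, i.e. ~0.002
err16 = abs(quantize(x, 16) - x)  # worst case ~ 1/131070, i.e. ~0.0000076
```

Over a position pass, a step of ~1/255 per axis is easily big enough to show up as visible stair-stepping in a triplanar projection, while the 16-bit steps are far below anything you'd see.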

Are there any plans to fix it?  Or is this an intrinsic limitation of the CPU engine, and something we're stuck with?


Has anyone implemented 3D rotation matrices inside a pixel processor node?

All the pieces are there, and I've attempted it before (in another application's node-based visual flow graph).  I also remember how much of a pain it was to get it working.

(Why would I want to do this?)

I'd like to be able to rotate an existing 'position' bake by a user-defined angle.  This would enable me, for example, to have a substance in Unreal that 'knew' how an object was orientated, and updated the direction of drips / staining interactively as the object was transformed - 'Intelligent texturing'.
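To sketch what I mean in Python (just the maths - in the pixel processor this would have to be assembled from float nodes, and a position bake stored in [0, 1] would first need remapping to a signed range before rotating):

```python
import math

def rot_y(angle):
    """3x3 rotation matrix about the Y (up) axis; angle in radians."""
    c, s = math.cos(angle), math.sin(angle)
    return [[  c, 0.0,   s],
            [0.0, 1.0, 0.0],
            [ -s, 0.0,   c]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector - i.e. rotate one position-pass texel."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

# Rotating the position sample (1, 0, 0) by 90 degrees about Y gives (0, 0, -1):
p = apply(rot_y(math.pi / 2), [1.0, 0.0, 0.0])
```

Chaining a second matrix for another axis would give the full user-defined orientation.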

I've been working on an implementation of the Unreal Engine 'Height Lerp' node within Substance Designer.  (Not at all unique to Unreal Engine, but that's where I've used it most.)

For people who haven't seen / heard of this technique before - here's the explanation.  For anyone who just wants to try the node - the download is at the bottom.

Let's say I have two ground textures:

Forest floor:

Rocky ground:

And I want to transition from one to the other using a gradient mask:

It looks pretty awful:

What would be better is if we took the height maps of the two textures into account when creating the blend, so the transition was more natural, and we didn't get unrealistic semi-transparent blend areas.
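One common formulation of this kind of height-aware blend, sketched per-texel in Python (the parameter names are mine; `depth` controls how wide the soft transition band is):

```python
def height_blend(c1, h1, c2, h2, mask, depth=0.2):
    """Blend two texels so the locally higher surface 'wins' near the transition.

    c1, c2 : texel values from the two textures (any single channel)
    h1, h2 : height-map samples in [0, 1]
    mask   : gradient mask in [0, 1] (0 = texture 1, 1 = texture 2)
    depth  : width of the transition band (smaller = harder edge)
    """
    a1 = h1 + (1.0 - mask)          # texture 1's height, biased by the mask
    a2 = h2 + mask                  # texture 2's height, biased the other way
    ma = max(a1, a2) - depth        # only the top 'depth' of the surfaces blends
    b1 = max(a1 - ma, 0.0)
    b2 = max(a2 - ma, 0.0)
    return (c1 * b1 + c2 * b2) / (b1 + b2)
```

Run per pixel over the two inputs and the mask, this produces the hard, height-following transition instead of a uniform cross-fade.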

I found the algorithm, and the above image, in a great article here:

The algorithm implemented in a pixel processor node gives a blend map that looks like this:

For a much more natural looking blended texture:

This 'semi-transparent' texture blending has always been something that bothered me about the 'make it tile' node:

So I've altered it to do a height-based blend instead:

You can download a .zip file containing both nodes (a basic 'height blend' node, and my altered 'make it tile' node) here:

Maybe it will be of use to someone - let me know if you have a use for it, or have a suggestion to improve it.  I'd like to get some other hands on it before uploading to substance share.


Hello all;

Substance Designer crashes instantly on startup (with no error / crash dump at all) on my brand-new Windows build.  I've been reading some threads about this - some specific to Alienware machines, and some connected to the Dec 31st - Jan 1st 2016 date changeover - but nothing that seems to apply to my case:

Windows 10 (clean install)
Nvidia GTX 1080 (with latest drivers)
Substance Designer 5.4.0

Any ideas what to check?

Hello all;

I've found a few posts (here, and on the unreal forums) that suggest it's possible to construct a blueprint that does the following:

-Adds a static mesh to your scene
-Assigns a substance with a random seed to each new mesh (so each mesh has a unique texture)

I referenced THIS thread from Wes, as well as THIS thread on the unreal answers forum.  I also found this useful image from

I've created a construction script that calls the 'DuplicateAndAssign' function from Wes - the whole thing looks like this (VERY large image so you can read the text, best viewed full size):

...but it doesn't work as intended.  It adds a static mesh to the scene, plus a substance graph instance and a dynamic material instance for each one, as planned - but all the textures are 50% grey.

I was trying to replicate the suggested function EXACTLY, but there was some conflict between the input variable types and the variables some of the graph nodes were expecting (static mesh actor vs. static mesh component, for example).

Can anyone spot what I've done wrong?

Hello all;

I've set up an animated substance (raindrops on a water surface), based on some FX maps that were posted to this forum (I think by Vincent)

It's working perfectly in Substance Player.  It's a little too broken up at the moment, but it's looking promising and animates nicely.

I'd like the same behaviour in UE4 - but there is very little information on how to animate the $time variable using blueprints.

I have been referring to the only two tutorials I can find:

Thread on this site

Youtube tutorial

The substance imports just fine into Unreal:

I can change the exposed variables as expected, with the substance updating in the viewport:

I then tried setting up the blueprint, as described in the referenced tutorials:

Attempt 1:

Attempt 2:

Attempt 3:

None of these do ANYTHING when the level is played.
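For reference, the general pattern I'm trying to reproduce, sketched in Python - note the method names here are placeholders for whatever the plugin actually exposes, not its real API:

```python
class SubstanceTimeDriver:
    """Drives a substance's $time input from the game tick.

    `graph` stands in for whatever object exposes the substance inputs;
    set_input_float() and render_async() are placeholder names for
    illustration only, not the actual Substance plugin API.
    """

    def __init__(self, graph, loop_period=None):
        self.graph = graph
        self.elapsed = 0.0
        self.loop_period = loop_period  # wrap $time for looping animations

    def tick(self, delta_seconds):
        """Call once per frame with the frame's delta time."""
        self.elapsed += delta_seconds
        t = self.elapsed
        if self.loop_period:
            t %= self.loop_period
        self.graph.set_input_float("$time", t)
        self.graph.render_async()       # queue a re-render each frame
```

i.e. accumulate delta time on Event Tick, write it into $time, and force the graph to re-render - that's what all three blueprint attempts are trying to do.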

Can anyone spot what I did wrong, or suggest some other things to try?

Many thanks

Substance - Discussions - Fabric Engine + Substance
 on: June 03, 2015, 06:32:13 am 
I've just convinced my work (one of the world's biggest VFX studios) to get a license of substance designer.

Substance already allows for some great workflows, but none of them are within the standard pipeline.

Is there any plan to integrate either Substance Designer or the Substance engine with Fabric?  It would allow for some pretty impressive workflows, and let Substance be considered much more seriously as a pipelined VFX tool.

I created an 'ocean foam' substance yesterday in SD5.  Used some of the awesome new features - edge bevel, distance, etc.

I tried to load it using the recently released Substance plugin for Modo 801, but got an error:

'This substance contains features that are not supported'

Is the plugin due to be updated soon?  It would be frustrating not to be able to use the substance because the plugin doesn't load it.

I also have access to Maya - is the Maya substance plugin more up to date in terms of feature support?

Many thanks - I'd prefer to use Modo, but I've got a job to finish.

Hello all - wonder if anyone could give me a hand.

I'm trying to construct a pretty simple blended material substance for some shoreline rocks.

I've made 4 base substances - sand, rock, shells and algae, each with the size set as 'relative to parent'. 

I'm then trying to use some material masks to blend the four together.  Nothing advanced / complicated... except the material blend node simply won't work.  It just shows up as a 256x256 blank checkerboard.

Any idea what's going on?

Hello all;

One of the aspects of Modo that I find most powerful is its ability to quickly and easily create texture variations across replicated objects.

Here's the most basic example - a 'texture variation' node being used to randomly alter the luminosity of a series of replicated spheres:

This isn't really using the full power of a parametric texture, though - what I would like to be able to do is to sample the particle ID of the replicator, and feed this into one of the attributes of the substance:

(This doesn't actually do anything, but is how I imagine the process would work)

Each replica could then have, for instance, a random amount of dirt or random colour variation generated by the substance at render time.  Even a random amount of chipped plaster added on top of a brick wall, with a unique noise seed applied to each one - thus making a population of identical objects look very different to each other.

The only downside I could potentially see is a performance hit caused by generating a whole series of textures at render time - but I assume Modo is already doing something similar with the 'texture variation' layer in any case, and a procedural texture doesn't necessarily have to be generated all at once.

Is this something that could be possible?  It would be INCREDIBLY powerful if it could be done - tiny file sizes with rich texture variation per replica.

Any thoughts?

