Author Topic: Create a Pixel Processor that leaves a trail behind moving input

I've been trying to build a Pixel Processor that has a flat black input image, and a second input of a Shape node (thorn) scaled down that passes through a Transform 2d node, so that when you adjust the offset the thorn moves and leaves a trail. I tried comparing the black input to the thorn to get the greater value and then pass it through an if/else node and add to the black where the value was greater, but then realized there was nothing to keep the values white after the thorn moved. Is there a way to have a color value only increase and never decrease?
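[Editor's note] The "only increase, never decrease" behavior being asked about is a running max with feedback: each update, the accumulated trail becomes the max of its previous value and the current brush stamp. A minimal plain-Python sketch of that idea (NumPy-free; the `stamp_brush` helper, resolution, and brush positions are all assumptions, not anything from the actual graph):

```python
import math

SIZE = 64  # texture resolution (assumption)

def stamp_brush(center, radius):
    """Hypothetical brush: 1.0 inside a disc at `center`, 0.0 elsewhere."""
    cx, cy = center
    return [[1.0 if math.hypot(x - cx, y - cy) <= radius else 0.0
             for x in range((SIZE))] for y in range(SIZE)]

# Flat black starting image
trail = [[0.0] * SIZE for _ in range(SIZE)]

# Each "frame" the brush moves; the trail only ever brightens,
# because max(old, new) can never be lower than old.
for bx in (10, 20, 30, 40):
    brush = stamp_brush((bx, 32), radius=5)
    trail = [[max(t, b) for t, b in zip(trow, brow)]
             for trow, brow in zip(trail, brush)]

print(max(map(max, trail)))  # 1.0 once the brush has stamped at least once
```

The catch, as the replies below discuss, is that a Pixel Processor has no feedback loop: it can't read its own previous output, which is exactly what `trail` relies on here.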

Basically, I'm trying to make it so that I can expose the Transform offset and let a user draw by moving the shape around. I know there's a paint function in the Bitmap and SVG nodes, but they can't be accessed outside of SD. I feel like I'm close to figuring it out but I'm also pretty unfamiliar with Pixel Processors, so I thought I'd check to see if anyone here had done something like that, or felt like taking on the challenge.

The incredible things that could be done with a runtime-adjustable mask in the middle of a Substance graph are exciting enough to make me want to figure this out, so any help is greatly appreciated!

For this to work you would need iterations, i.e. one Pixel Processor for each step you want to keep. If you want a perfectly smooth trail, that means one Pixel Processor for every pixel the shape moves. But since you can't dynamically add nodes (I think?), you'd have to limit how far the "brush" can move by setting up the appropriate number of Pixel Processors beforehand.

Alternatively, if there were a way to dynamically write to a texture and then read it back before writing to it again at every step, that would also work, but I don't think that's possible either.
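[Editor's note] The two ideas above are equivalent to iterating one step function over a pair of buffers: each step reads the previous result and writes a new one, so the step count has to be fixed in advance, exactly as described. A small sketch (the 1-pixel brush, step count, and positions are illustrative assumptions):

```python
# Each "step" stands in for one chained Pixel Processor: it reads the
# previous buffer and writes a new one, so STEPS must be fixed up front.
SIZE = 32
STEPS = 8  # hard limit on how far the brush can travel (assumption)

def step(prev, brush_x):
    """One 'pixel processor': copy the previous buffer, stamp one pixel."""
    out = [row[:] for row in prev]
    out[SIZE // 2][brush_x] = 1.0
    return out

buffers = [[[0.0] * SIZE for _ in range(SIZE)]]  # flat black start
for i in range(STEPS):
    buffers.append(step(buffers[-1], brush_x=2 * i))

final = buffers[-1]
# Every stamped pixel along the path is still white in the final buffer.
print(sum(final[SIZE // 2]))  # 8.0
```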

hopefully someone else proves me wrong on this though :)

Hey, just to make sure, can you add a pic to illustrate what you are aiming to do?

Vincent- Sure thing! Here's what I'm trying to recreate:

The paint tool in the Bitmap node is great, but can't be used at runtime in a game engine. I'm looking for a way to make even a simplified version of this that can be used at runtime to draw white or black onto heightmaps, like this:

If you can think of a way to take input in the form of offset information from the Transform node and color all the pixels in the position the offset defines, I'd love to hear it! It feels like it would need to involve movement over time. I know there's the $time float, but would that be possible to work into the logic?
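[Editor's note] If sampling the offset at intervals ever became possible (via `$time` or otherwise), the usual way to avoid gaps between samples is to stamp the brush at positions interpolated between the previous and current offset. A hedged sketch of just that interpolation step; the offsets, substep count, and `interpolate` helper are hypothetical:

```python
def interpolate(prev_offset, cur_offset, n):
    """Positions spaced evenly from prev_offset to cur_offset (inclusive)."""
    (x0, y0), (x1, y1) = prev_offset, cur_offset
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n)
            for i in range(n + 1)]

# Offsets in normalized 0..1 UV space (assumption); the brush would be
# stamped once at each of these positions to leave a gap-free trail.
stamps = interpolate((0.2, 0.5), (0.8, 0.5), n=6)
print(len(stamps))  # 7 stamp positions
```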
Last Edit: March 28, 2019, 04:34:50 pm

I think it's something to handle in the game engine: if you find a way to paint a texture out there, you should then be able to use it.

You can't set an input image to be dynamic. It has to be the Substance Input Image file type, so there's no way to input, for example, a render target from Unreal Engine into the middle of the graph. You can draw on the output textures after the fact, but that wouldn't affect the heightmap at all, which is what I'm trying to achieve. That's why I'm trying to build something inside of Substance Designer.

What would your method be for making even a rudimentary drawing tool? One that can be operated in-engine at runtime, but also can be placed in the middle of the graph and still be processed by other nodes?

You'd have to find some way for Substance to record the path you draw. As far as I know, it's not currently possible to record something over time. When you move a Transform node around (or any other similar input), Designer knows its final location and rotation, but not the path it took to get there.

The closest thing to drawing something without the bitmap or vector widgets is splines, I think. They're mighty difficult to get working yourself, but there are some nodes floating around already, like this one on Share:
Or this paid one:
Esger van der Post.
Game design student and texturing addict.

Ah. Yeah I am familiar with those spline nodes. They're definitely awesome, and I may end up just going in that direction, but they wouldn't be ideal for what I'm going for.

This really feels like something that should be possible: force the Offset to update every [amount of time] (not necessarily every frame; even every half second or so would do), then use something like the Sequence node to capture a Before value of the image, move the "brush" object, capture an After value, and finally add the Before and After together. The final image would then become the starting point for the next round of updates.
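[Editor's note] The before/move/after/add loop described above can be sketched as follows. Since a plain add lets repeated passes push a pixel past white, this version clamps the sum at 1.0; the half-intensity single-pixel brush and the positions are assumptions for illustration:

```python
SIZE = 16

def brush_at(x, value=0.5):
    """Hypothetical soft brush: one pixel at half intensity."""
    img = [[0.0] * SIZE for _ in range(SIZE)]
    img[SIZE // 2][x] = value
    return img

before = [[0.0] * SIZE for _ in range(SIZE)]  # starting point
for x in (3, 5, 5, 7):  # pixel 5 gets two passes
    after = brush_at(x)
    # "add the before and after together", clamped so repeated passes
    # can't push a pixel past white
    before = [[min(1.0, a + b) for a, b in zip(arow, brow)]
              for arow, brow in zip(after, before)]

print(before[SIZE // 2][5], before[SIZE // 2][3])  # 1.0 0.5
```

Using max instead of a clamped add would give the same white-stays-white behavior without intensity building up where strokes overlap.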

It's definitely humbling; before trying to figure this out, I thought I was fairly proficient with Designer. Turns out I have a lot of learning to do!