Show Posts


Topics - mrwyatt

Pages: [1] 2
Hi there,

I am trying something with Substance Designer and I don't even know if it is possible yet.
I work in film and I need to do one of those Harry Potter style moving paintings. I haven't found anything off the shelf that I liked other than the "matfx_oil_paint" filter in Substance Painter, which gives me pretty much the look I am after. The problem is that I cannot load an image sequence into Painter or export an image sequence. I know Substance Player can do this, so I tried loading the Painter filter in Substance Designer and using it in my own graph, and it works so far. So technically I could build a substance, load it into Player, and export a sequence.
But how do I tell Substance to load a sequence of frames as the input?

I'd also like to request a Nuke version of Substance, if that's a possibility, or just add frame sequences to the Substance feature list. That would do the trick.
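In the meantime, one workaround is to drive the Automation Toolkit's sbsrender CLI once per frame from a script. A minimal sketch: the .sbsar name, the input identifier "input", and the frame pattern below are assumptions — check them against your own graph and the sbsrender documentation.

```python
# Sketch: build one sbsrender command line per frame of a plate sequence.
# Assumptions: the Painter filter is compiled to oil_paint.sbsar and its
# image input is identified as "input" -- adjust both for your graph.

def sequence_commands(sbsar, frame_count, in_pattern, out_dir):
    """Return a list of sbsrender invocations, one per frame."""
    cmds = []
    for frame in range(1, frame_count + 1):
        src = in_pattern % frame              # e.g. plates/shot.0001.png
        cmds.append([
            "sbsrender", "render",
            "--inputs", sbsar,
            "--set-entry", "input@%s" % src,
            "--output-path", out_dir,
            "--output-name", "painted.%04d" % frame,
        ])
    return cmds

cmds = sequence_commands("oil_paint.sbsar", 3, "plates/shot.%04d.png", "out")
for c in cmds:
    print(" ".join(c))
    # subprocess.run(c, check=True)  # uncomment to actually render
```

From there the rendered frames can be reassembled into a sequence in any compositor.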

According to the documentation here, version 2017.2.2 of the Automation Toolkit can bake UDIMs. Unfortunately, the "how" isn't documented at all.
Any hint about that would be helpful.



Just upgraded to SP2 and found out that each time I start Iray, the application crashes instantly. I am on a Windows 7 PC with an NVIDIA GeForce GTX 960 with 4 GB of RAM. Substance Designer doesn't start at all now either.

This is rather annoying.


I noticed that it is impossible to dial in numbers more precise than 0.01, as everything smaller than that gets truncated. That might make sense in a lot of cases, but I found that sometimes finer adjustment is needed. A colleague of mine, for instance, pointed out that in a substance he made he could not position a shape precisely where he wanted it: offsetting it 0.17 in U wasn't far enough, while 0.18 was too far. He wanted to move it 0.175 but couldn't, because Substance didn't give him enough precision.

I would also like to see this for color inputs. I get that Substance is geared towards the gaming community, but it is also getting some traction in film and VFX, and there folks love floating point values with, again, lots of precision. Having basically only integer values from 0 to 255 might be fine for games (pun intended), but being able to get a bit more precise when you need it would be highly appreciated.
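To put numbers on both complaints, here is the quantization error in each case (the 0.175 offset is the example from above):

```python
# Two-decimal truncation vs. the value the artist actually wants.
target = 0.175
truncated = int(target * 100) / 100.0      # what a 0.01-step field stores
print(truncated)                           # 0.17 -- off by 0.005 in U

# 8-bit color: 256 levels, so the smallest step is ~0.0039 in float terms.
step_8bit = 1 / 255
nearest_8bit = round(target * 255) / 255   # closest representable 8-bit value
print(step_8bit, nearest_8bit)
```

Neither representation can hit the intended value; a float field with three or more decimals would.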

just food for thought.


Substance Painter - Feature Requests - Double Sided Shading
 on: October 17, 2015, 08:05:26 pm 

I'd like to see double-sided shading of single-sided geometry in SP. Sometimes you have single-sided polys and would like to see them from both sides, even when they show the same texture. For instance, when our cloth artists tailor clothes in Marvelous Designer, the geometry comes in single-sided. That is especially annoying with collars: they look like they are not there at all, because the geometry folds over to form the collar and, due to single-sided shading, is invisible.

Sounds to me like a trivial feature to implement. Please do it.


P.S.: ZBrush has this as a display option to switch double-sided shading on or off.

I think I found a bug in the TriPlanar Greyscale node. Taking a look at the new nodes' graphs, I noticed that the world-space normal input of the new TriPlanar nodes gets split into three greyscale channels using Greyscale Conversion nodes: the first splits out the red channel, the second the green channel, and the third the blue channel. At least that's the case for the color version of the TriPlanar node. The greyscale one splits out the red channel and then the green channel twice. The third Greyscale Conversion node should set R and G to 0 and B to 1, right? But it doesn't: both the second and third Greyscale Conversion nodes have R and B set to 0 and G set to 1.
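Here is the issue spelled out in code. This is just my reading of what the Greyscale Conversion node does (a weighted sum of the RGB channels); the weight sets below are the ones described above:

```python
# Greyscale Conversion modeled as a weighted sum of the RGB channels.
# What the three nodes should use to split a world-space normal into X/Y/Z:
expected = [(1, 0, 0),   # node 1 -> X from red
            (0, 1, 0),   # node 2 -> Y from green
            (0, 0, 1)]   # node 3 -> Z from blue
# What the greyscale TriPlanar actually seems to ship with:
observed = [(1, 0, 0),
            (0, 1, 0),
            (0, 1, 0)]   # suspected bug: green again instead of blue

def grey(rgb, weights):
    return sum(c * w for c, w in zip(rgb, weights))

normal = (0.2, 0.5, 0.8)                    # sample world-space normal
print([grey(normal, w) for w in observed])  # Z comes out as 0.5, not 0.8
```

So the projection along Z would silently reuse the Y component of the normal.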

Can anybody confirm this?

Hi there.
I am using the indie pack and love it so far, but one thing that bugs me is that SD, SP, and B2M use different naming conventions on export, and it is even getting worse. I use the PBR metal/rough shader, and when I export maps from Designer the map is called "basecolor"; from Painter it used to be "base_color" and has now changed to "Base_Color"; and B2M calls it "baseColor". As long as you build your shaders by hand, that is all fine and dandy. But at work, where my boss bought a couple of pro licenses, we made a script that builds the shading network automatically: all we need to do is point it at the basecolor map and, boom, we're done. We had to build some ugly exceptions into it, though, to make sure we cover every naming convention, depending on the app used. Now it just got worse, as the new version of Painter has introduced upper-case naming: _Base_Color, _Normal, _Roughness, _Metallic, etc.
It gets worse with each iteration and we don't like it. We love consistency, and we hope you see it the same way.

I don't want to dictate what your naming convention should be (I really don't care), but please make it consistent across all apps.
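For anyone hitting the same thing, the "ugly exceptions" can at least be collapsed into one normalizer. A sketch covering the variants mentioned above; extend it as new ones appear:

```python
# Workaround: normalize the per-app export names to one canonical key.
import re

def canonical(name):
    """Map e.g. 'Base_Color', 'baseColor', 'base_color' -> 'basecolor'."""
    # split camelCase boundaries, then drop underscores/spaces and lowercase
    name = re.sub(r'(?<=[a-z])(?=[A-Z])', '_', name)
    return re.sub(r'[_\s]+', '', name).lower()

for variant in ("basecolor", "base_color", "Base_Color", "baseColor"):
    assert canonical(variant) == "basecolor"
print(canonical("_Normal"), canonical("Roughness"))  # normal roughness
```

The build script then keys every map off the canonical name, regardless of which app exported it.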


Guillaume Wyatt

It was announced on Twitter, but it isn't in my downloads.

Just asking...
I mean, it's been a while, and some bugs are really starting to get on my nerves.

What are the chances of Substance products getting color management? OCIO would be really good; that way we could better judge textures against a target LUT. Assuming there are others like me who actually try to use Substance for film work.

Hi there,

I don't know if this is expected behaviour, but it looks like a bug to me. I thought that the ability, on export from SP, to specify a resolution bigger than the one I worked at would recalculate the texture set to be true 4K when I chose it. I made a test where I put the frog skin material on Hans' body and exported a 4K map although my work resolution was 1K. The exported image was indeed 4K in size, but when I compare it to a texture exported at 4K from a project whose size was 4K, I get a more detailed result. If the export to a bigger size really just interpolates a 1K image up to 4K, then why even bother? I can do that in Photoshop. The way I understood it, on export it re-evaluates the layer stack to truly generate textures at the specified resolution.

Is this a bug, or simply my misunderstanding?

Here are two images. Both are just 1K details cropped out of 4K images. The first was generated at a project resolution of 1K, and the second was exported with a project resolution of 4K.

Hi guys,
I often find myself wanting to erode/dilate a mask to shrink or grow it. At the moment I have to build a setup using a Blur node and a Levels node to get something similar, but the blur nodes are notoriously slow, and it would be nicer to have a built-in solution anyway.
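For clarity, this is what the requested node would do: each pixel takes the minimum of its neighborhood (dilate takes the maximum). A 1-D sketch of the idea:

```python
# Morphological erode/dilate on a greyscale mask, 1-D for brevity.
# Erode = moving minimum, dilate = moving maximum over a window.

def erode(mask, radius=1):
    n = len(mask)
    return [min(mask[max(0, i - radius):min(n, i + radius + 1)])
            for i in range(n)]

def dilate(mask, radius=1):
    n = len(mask)
    return [max(mask[max(0, i - radius):min(n, i + radius + 1)])
            for i in range(n)]

mask = [0, 0, 1, 1, 1, 0, 0]
print(erode(mask))   # [0, 0, 0, 1, 0, 0, 0] -- mask shrinks by one pixel
print(dilate(mask))  # [0, 1, 1, 1, 1, 1, 0] -- mask grows by one pixel
```

The blur + levels trick approximates this, but a true min/max filter gives hard, predictable edges and avoids the blur cost.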

Is there an erode node planned for the future? If not, consider this a feature request.

This quite honestly sucks. There is no need for 32-bit in a normal map that holds values between 0 and 1. 16-bit, OK, but 32-bit is overkill. Plus, you cannot view it in IrfanView or similar viewers, and Photoshop shows it in wrong colors. It would be cool to be able to set the bit depth per exported image instead of letting Painter figure it out.
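Some quick numbers to back that up (treating each depth as integer levels over the [0, 1] channel range; 32-bit float is a different encoding, but the point stands):

```python
# Smallest representable step for a [0, 1] channel at various bit depths.
for bits in (8, 16, 32):
    levels = 2 ** bits
    print("%2d-bit: %d levels, smallest step %.2e" % (bits, levels, 1 / (levels - 1)))
# 16-bit already gives 65536 levels -- a step of ~1.5e-5, far below anything
# visible in a normal map; going to 32-bit mostly just doubles the file size.
```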

Got a bug for the devs: in the "Old Painted Planks" substance, the "Planks Color" attribute doesn't do anything. Seems to be broken.

Hi guys,
Has anybody successfully made a custom MatFX substance effect for Painter? I cannot figure out how to go about making one. When I open one in Designer, it doesn't let me view the graph, as there isn't an .sbs file, just the compiled version. And some have extra alpha inputs and outputs; I don't know if and when I need those, as some of the MatFX come without them. A tutorial on how to make a MatFX, covering when you need the extra alpha inputs and outputs, what they do, and how you hook them up, would help a lot. As it is right now, I'm lost.
