Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - hevonen

Pages: [1] 2
To me it seems you don't get the functionality: 'A' only sets the next stroke's maximum blend to the same value as the previous stroke.

And 'Flow' only defines how fast the 'Opacity' threshold is reached, so pressure sensitivity on 'Flow' is quite useless for hand painting.

The only way to blend gradually is to set 'Flow' to maximum, set 'Opacity' to a small number, and make many passes over the same area.

There is now a control for brush opacity, but it is not related to stylus pressure. Brush opacity only controls the maximum blend amount of a stroke.
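To make the complaint concrete, here is a toy model of 'Flow' vs 'Opacity' as I understand them — my own reading of the behaviour, not Substance Painter's actual code:

```python
# Toy model of 'Flow' vs 'Opacity' -- an assumption about the behaviour
# described above, not Substance Painter's actual code.

def stroke_coverage(num_dabs, flow, opacity):
    """Accumulate dabs with 'flow', then cap the whole stroke at 'opacity'."""
    coverage = 0.0
    for _ in range(num_dabs):
        # Each dab adds 'flow' worth of paint over what is not yet covered.
        coverage += flow * (1.0 - coverage)
    # 'Opacity' only caps the maximum blend of the finished stroke,
    # which is why it cannot respond per-dab to stylus pressure.
    return min(coverage, opacity)

# With a high flow the cap is hit almost immediately, so pressure
# sensitivity on flow barely matters:
print(stroke_coverage(10, 0.5, 0.3))  # -> 0.3 (capped after the first dab)
# Only tiny flow values build up gradually, hence the many-passes workaround:
print(stroke_coverage(3, 0.05, 0.3))  # -> ~0.14, still below the cap
```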

From the feature request:
'Paint "dab" is added to previous "dab" regardless of stylus pressure, making it impossible to easily blend colors with pressure sensitivity.

Solution would be to add a new variable "opacity" which could be linked to stylus pressure and a way to turn off paint accumulation. This brush mode combines all of the paint dabs inside single continuous stroke so that hardest pressure pixels replace the lower pressure pixels instead of adding to them (like in accumulation) while lower pressure pixels have no effect. This makes strokes easily controllable with tablet pressure, allowing for large swathes to be colored and blended quickly.'

This pretty much describes how a traditional digital brush works; reference attached. Could you do this simply by changing the brush dab math to a minimum instead of a subtract?
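As I read it, the request amounts to a per-pixel maximum inside the stroke buffer. A toy sketch of the two behaviours at a single pixel (my own model; the function names are hypothetical, not Substance code):

```python
# Current vs requested dab compositing at one pixel -- a sketch of my
# understanding of the feature request, not actual Substance Painter code.

def accumulate(stroke_alpha, dab_alpha):
    """Current behaviour: every dab adds paint on top of the previous ones."""
    return stroke_alpha + dab_alpha * (1.0 - stroke_alpha)

def replace_max(stroke_alpha, dab_alpha):
    """Requested behaviour: the hardest-pressure dab wins, softer dabs do nothing."""
    return max(stroke_alpha, dab_alpha)

dabs = [0.2, 0.6, 0.4]  # pressure-driven dab opacities hitting one pixel
acc = rep = 0.0
for d in dabs:
    acc = accumulate(acc, d)
    rep = replace_max(rep, d)

print(acc)  # -> ~0.81: piles up past the strongest dab
print(rep)  # -> 0.6: controlled directly by peak pressure
```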

Masking with polygon fill in UV space (2D view, triangle fill or quad fill) bleeds into far-away UV parts if Symmetry is activated. I guess this is a bug caused by the Symmetry projection being done in 3D space instead of 2D space?

In this case the far-away UV parts are meshes that sit near each other in 3D space (e.g. eyelashes near eyelids). If this can't be fixed in the near future, it might be useful to disable Symmetry editing with polygon fill in the 2D view, or to mention this as a caveat.

There seem to be several kinks in reusing textures.

Smart materials lose references to the baked maps that are used in fills. It seems these maps are linked by file name rather than by baker/usage (AO, Thickness). Since maps are baked with the material set's name in the file name, they won't be found when the Smart material is used on another mesh with different materials. (At first I expected that a Smart material would automatically bake the missing maps and the whole material would just work.)

Changing a mesh to another mesh with matching topology (for asset variants) will not work if the materials have different names, and a material cannot be retargeted because Texture Sets cannot be renamed inside Painter. Unity, however, links materials and textures, so to reuse texturing in Painter one has to export meshes twice: once for SP with matching material names and once for Unity with differing material names. This creates extra work only because SP doesn't allow Texture Set renaming.

Wouldn't both of these problems go away if Painter allowed Texture Set renaming? Or is there already a way to do that, or to force the maps in Smart materials to link to the baker rather than the image file?

Yes, I can't get it to change the luma of the result (CS6).

To my knowledge, the definition of the Color blending mode is to change the chroma (hue and saturation) of the underlying color, not its luminance.

This behaviour (and others like it) makes it impossible to:
a) adjust chroma locally for color zoning etc.
b) fake color correction adjustment layers with solid fill layers
c) match and use cross-application results (like PS <> SP/SD)

To me this behaviour is simply wrong, whether in Linear or sRGB space; please look into it.

Results from the Color blend mode seem to be wrong.

As seen in the image, SP2 doesn't just replace the hue; it also affects the luminance of the bottom layer. Compared to the Photoshop result this is at least unexpected. Converting such an image to grayscale (via a black-and-white or grayscale conversion) reveals the increase in lightness.

Since the point of the Color blend mode is to adjust the hue and saturation, not the luminance, this is pretty problematic.
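For comparison, here is the commonly documented definition — the non-separable 'Color' blend from the PDF/CSS compositing specs, which Photoshop matches closely. A sketch of that reference math, not SP's code:

```python
# Reference 'Color' blend as specified for PDF/CSS compositing: hue and
# saturation come from the top layer, luminosity stays with the bottom
# layer. A sketch of the spec math, not Substance Painter's code.

def lum(c):
    r, g, b = c
    return 0.3 * r + 0.59 * g + 0.11 * b

def clip_color(c):
    """Pull out-of-range channels back into [0, 1] while preserving luminosity."""
    l = lum(c)
    n, x = min(c), max(c)
    r, g, b = c
    if n < 0.0:
        r = l + (r - l) * l / (l - n)
        g = l + (g - l) * l / (l - n)
        b = l + (b - l) * l / (l - n)
    if x > 1.0:
        r = l + (r - l) * (1.0 - l) / (x - l)
        g = l + (g - l) * (1.0 - l) / (x - l)
        b = l + (b - l) * (1.0 - l) / (x - l)
    return (r, g, b)

def set_lum(c, l):
    d = l - lum(c)
    return clip_color((c[0] + d, c[1] + d, c[2] + d))

def color_blend(base, top):
    # The result carries the top layer's chroma but the base layer's
    # luminosity, so a grayscale conversion of the result matches the base.
    return set_lum(top, lum(base))

base = (0.5, 0.5, 0.5)  # mid gray
top = (1.0, 0.0, 0.0)   # pure red
result = color_blend(base, top)
print(lum(result))  # -> 0.5, luminosity unchanged
```

Per this definition, grayscaling the blended result should give back the bottom layer exactly — which is the property the screenshots show SP2 violating.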

I think the same thing might be going on with the other HSL / LAB blend modes as well:

I can't get opacity to work with stacked objects.

1. First I had an object with both transparent and opaque parts using the same material. Only the uppermost transparent part was visible. Not that big a surprise.

2. Then I split the object in two, separating the transparent and opaque parts but still using the same material. Only the uppermost transparent object was visible unless it was disabled in the scene setup; only then could I see the object underneath it.

3. Then I duplicated the material (thinking that SD also culls by material), using different materials for the transparent and opaque objects. Still, only the uppermost transparent object is visible unless it is disabled in the scene setup; only then can I see the object underneath it.

IRay seems to render this fine.

Why is this? What format and setup of transparent objects does SD actually support, and how does it render them?

I have multiple objects unwrapped to the same texture, and objects with the same target texture share a material. How can I easily bake objects with the same material so that they end up in the proper textures? Is there a variable name for the material, like there is for $(mesh) and $(bakername)?

It feels like bakers are currently set up backwards in SD, as one cannot set them up as reusable sets; instead, for every bake, every object that goes into that bake has to be activated/deactivated separately. It's a bit clunky. It would be an improvement if objects were sorted by object material (entity > material > object instead of entity > object); then the objects for a texture set would be faster to find.

This got fixed. Apparently SD has no problem baking floating meshes as long as they are separate objects. On my first try the floating meshes were inside the same object and were not exploded. Either that, or the baking has changed.

Yep, some errors disappear when I set it to 0.0001, but some errors are still there.

Baking vertex colors for masking from a low-poly mesh to itself creates errors (probably due to surfaces that sit close together within the model), and I didn't see any option for exploding non-connected mesh parts inside an object.

It would be preferable not to actually bake the vertex colors but simply to copy the colors from the vertices to the matching UV coordinates. Is there a workaround for this inside SD, or do I need to do it elsewhere?
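Outside SD, the direct copy I have in mind can be sketched as a plain UV-space rasterization with barycentric interpolation — no ray casting, so nothing can hit the wrong nearby surface. The mesh data layout here is hypothetical:

```python
# Minimal sketch of writing vertex colors straight into UV space -- done
# outside SD; the mesh arrays (uvs, colors, triangles) are hypothetical.

def bake_vertex_colors(uvs, colors, triangles, size):
    """Rasterize triangles in UV space, interpolating vertex colors
    barycentrically into a size x size RGB image."""
    image = [[(0.0, 0.0, 0.0) for _ in range(size)] for _ in range(size)]
    for i0, i1, i2 in triangles:
        (ax, ay), (bx, by), (cx, cy) = uvs[i0], uvs[i1], uvs[i2]
        # Signed double area of the UV triangle; skip degenerate UVs.
        area = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
        if area == 0.0:
            continue
        for py in range(size):
            for px in range(size):
                u, v = (px + 0.5) / size, (py + 0.5) / size  # pixel centre
                # Barycentric weights of the pixel centre.
                w0 = ((bx - u) * (cy - v) - (by - v) * (cx - u)) / area
                w1 = ((cx - u) * (ay - v) - (cy - v) * (ax - u)) / area
                w2 = 1.0 - w0 - w1
                if w0 >= 0.0 and w1 >= 0.0 and w2 >= 0.0:
                    c0, c1, c2 = colors[i0], colors[i1], colors[i2]
                    image[py][px] = tuple(
                        w0 * c0[k] + w1 * c1[k] + w2 * c2[k] for k in range(3)
                    )
    return image

# One triangle covering the lower-left half of the UV square:
img = bake_vertex_colors(
    uvs=[(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
    colors=[(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)],
    triangles=[(0, 1, 2)],
    size=4,
)
```

The result would still need a few pixels of edge dilation past the UV islands to avoid seams when the texture is mipped.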

To put it shortly: "pick gradient" doesn't work on OS X, as all picked gradients (even ones picked from the Substance UI) come out black. Naturally this is quite problematic.

Can this be fixed?

Good to hear, this is a pretty big thing for Wacoms and other drawing tablets.

Bumping this: as of SP 1.7.1 there is still no brush opacity nor a way to disable accumulation. The current implementation makes handpainted textures nearly impossible.

TL;DR: Photoshop-style brush opacity is a must.

Uservoice here:

Ah, there is 'hardness' instead of 'smoothness', so that request is not needed after all :)
