Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - dgoyette

Pages: 1 ... 3 4 [5] 6
61
In Unity, click on "graph.taslikalan" under your "99" substance. Scroll to the bottom. Under Target Settings, did you increase the Target Height and Target Width? Regardless of what you set the resolution to in SD, the SBSAR always seems to come into Unity at a default 256x256. You probably just need to increase that.

62
I was just informed of the correct way to resolve this. The layer masks behave similarly to paint layers, in that stroke positions are stored in 3D space rather than relative to the UV map. If you want to scale a model, and scaling is all you've done since the last import, you can simply uncheck "Preserve strokes position on mesh". Upon doing so, the mask will correctly update and retain its relative position on the model.

The caveat is that if you want to scale the model, you should make sure that a scale change is the only change since the last import. That is, since you need to disable "Preserve strokes position on mesh" to get the scale change to work, you wouldn't want any other changes in the model that require "Preserve strokes position on mesh" to import properly.

63
Playing with this a bit more, it seems that the masks are stored in the same way as paint stroke information, somehow stored as 3D coordinates rather than being tied to the UV map itself. The effect is the same as if I'd used a Paint layer to paint the mesh, then scaled the FBX and reimported.

In that case, I don't suppose there's a general solution to mesh scaling? Some way to tell Substance "I scaled the mesh by X, please adjust your coordinates"?
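To make the idea concrete, here's a toy sketch of what I mean (illustrative Python only, not Painter's actual data model; the function name is made up): a UV-space anchor survives a uniform mesh rescale untouched, while a 3D-space anchor ends up at the old world position unless its coordinates are rescaled to match.

```python
# Toy illustration (not Substance Painter's real data model): a mask stroke
# anchored in UV space survives a mesh rescale, while one anchored at a 3D
# point no longer lines up with the scaled mesh.

def rescale_3d_anchors(anchors, scale):
    """Hypothetical fix: multiply stored 3D stroke anchors by the mesh scale."""
    return [(x * scale, y * scale, z * scale) for (x, y, z) in anchors]

# A stroke anchored at a corner vertex of a unit cube.
stroke_3d = [(0.5, 0.5, 0.5)]
uv_anchor = (0.25, 0.75)          # the same stroke, expressed in UV space

mesh_scale = 3.0                  # the FBX was scaled 3x and reimported

# The UV anchor is untouched by the scale, so a UV-based mask would line up.
assert uv_anchor == (0.25, 0.75)

# The 3D anchor no longer sits on the corresponding vertex (now at 1.5, 1.5, 1.5)
# unless the stored coordinates are rescaled to match:
adjusted = rescale_3d_anchors(stroke_3d, mesh_scale)
print(adjusted)  # [(1.5, 1.5, 1.5)]
```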

64
I guess I don't really understand. In Blender, I can add a UV Grid material, as I've done in the first two screenshots. In this test case, I'm starting with an object in Blender, unwrapping it, and assigning the material. The second screenshot shows the same object after doubling its scale. The texture has "grown" to cover the object so it looks the same as it originally did, just larger. This is exactly what I would expect.

The next two screenshots show the before and after for this object in SP. I have two layers: one is a plain Fill layer with the diamond pattern; the other is a Fill layer with a different material, using a mask to "paint" the gear on the object. The "before" picture is what I actually painted. The "after" picture occurs when I double the scale of the object in Blender and reimport the FBX. Note that the plain fill has "grown", the same way the UV Grid did in Blender. However, the "gear" symbol has not. Rebaking in SP doesn't have any effect on the outcome.

65
The quick version: If I start working on painting an FBX, then later scale the FBX and reimport it into Substance Painter, all Masks will also be scaled, despite UV maps not changing. This seems really surprising to me, as the masks are based on the UV map, which hasn't changed.

Here's a simple way to demonstrate this:

1. In Blender (or whatever program), start a new project. A simple cube is fine. Unwrap the cube, and export the FBX.
2. Bring the FBX into Substance Painter.
3. Add a Fill layer, then add a "black mask" to that layer.
4. Paint somewhere on the mask, allowing some of the fill to show. Save the project.
5. Return to Blender, and scale the object down (say 0.75 on each axis). Apply the scale change. Note that we haven't adjusted the UV map. Feel free to unwrap the object again, but uniform scaling shouldn't change the UV map.
6. In Substance Painter, reimport the FBX into that same project.

You may or may not need to toggle the fill channel on/off before you see the result, but you should now observe that the mask contents appear to have been scaled, despite the UV maps not having changed. This bug means that resizing an object has a very destructive effect on the Substance Painter project, even if the UVs are identical before and after.

The two attached screenshots show a Before and After. In the Before, I've put a triangle in the mask. I then scaled the FBX up to 3 on each axis, applied the scaling, and reimported it. The After shows that the triangle is now tiny, even though the UV map hasn't changed.

66
I must be missing something simple. How do I copy/paste a color in the UI? There doesn't seem to be any way to do it, other than to copy/paste the individual RGB values one at a time.

67
SP has a bunch of Hard Surfaces, which are mainly used to imprint height/normal detail into a surface. But I often find I want to affect the color (or other channels) when using these kinds of features. For example, when adding screws to an object, the screws' color/roughness/etc. would differ from the surrounding surface.

But as far as I can tell, there's no way for the Hard Surface's shape to control anything but the normal. This results in something like the attached image, where the normal looks like a screw, but all the other channels take the shape of the brush. In my example, I want the brush to follow the shape of the Hard Surface (in this case a hex, though each Hard Surface has a different shape).

So how do you go about doing this? Or how do you all use Hard Surfaces to add details that affect more than just the normal?

68
Hi Wes,

I realize this might be a tough question, but do you have any sense of where you guys expect to be, with respect to supporting Unity releases, in 6 months? In a year?

My main concern with Substance right now, despite the Unity plugin generally working for me in 2018.2, is that Substance will become the reason I can't upgrade Unity. Because the plugin is in beta, you're still recommending that people stay on Unity 2017. I'm curious if you know when that guidance will change. It's obviously unreasonable in the long run for Substance use in a project to require me to forgo many major releases of Unity. To put this into more manageable questions:
  • The beta for the plugin has been going on for 8-9 months now. Do you have a sense of when you'll drop the "beta" tag and consider the plugin production-ready? And at that point, do you plan to be production-ready with the then-current release of Unity? Or will there always be some lag where using Substance requires holding off on Unity updates for long periods?
  • Do you eventually plan to be caught up enough with plugin development that it's possible to track Unity betas? It's unfortunate that Substance is the only reason I can't try out the Unity 2018.3 beta on my existing project right now.

69
Thanks for clearing that up. Is there any documentation on the site for making my own 3D view shader, so that I can align the 3D view with my workflow?

70
I've always created Substances that have a single output node for each type of output (base color, ao, metallic, roughness, etc). But what if I create a combined output, where metallic goes in the "r" channel, ao goes in the "g" channel, etc. The 3D view doesn't seem to show a correct preview when I do this.

Is there a way to get the 3D View to pull individual channels of a texture? I've already tried adding multiple "usages" to my combined output, but the 3D view doesn't seem to pick them up. For example, in the attached screenshot, why doesn't the 3D view use the "r" channel of this output for metallic?


71
I understand that the Tab key toggles fullscreen mode. However, there are times when it should not. For example, when editing a color, and trying to enter the individual R, G, and B field values, it's intuitive to press Tab to move to the next field. But doing so here will toggle full screen rather than move to the next field.

72
In the color picker, I don't see a way to simply copy/paste a single hex color, in order to quickly set multiple colors to the same color.
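The workaround (copying the individual R, G, and B values one at a time) carries exactly the same information as a single hex string, which is why a copy/paste-able hex field would help. As a sketch of the equivalence (plain Python, nothing to do with Substance's UI):

```python
# The three 0-255 RGB fields and one "#rrggbb" hex string are interchangeable,
# so a single copy/paste-able hex value would replace three copy/paste steps.

def rgb_to_hex(r, g, b):
    """Pack 0-255 RGB components into a '#rrggbb' string."""
    return "#{:02x}{:02x}{:02x}".format(r, g, b)

def hex_to_rgb(s):
    """Unpack a '#rrggbb' string back into an (r, g, b) tuple."""
    s = s.lstrip("#")
    return tuple(int(s[i:i + 2], 16) for i in (0, 2, 4))

print(rgb_to_hex(255, 128, 0))   # #ff8000
print(hex_to_rgb("#ff8000"))     # (255, 128, 0)
```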

73
Sorry for the confusion, Wes, but this wasn't a Unity integration question. (FWIW, that's been working fine for me, having used the Mask Map approach you linked in your reply.)

The issue I'm having is that within Substance Designer itself, I'm unable to get a good 3D View preview of my substance, because the Mask Map output doesn't map into Metallic, AO, Detail and Smoothness in the 3D View.

I've attached two screenshots to hopefully explain this better. The "Current" screenshot shows what I'm currently doing, where I've made a custom node that generates the Mask Map, but also passes through the Metallic, Smoothness, and AO channels. Right now I'm connecting those Metallic, Smoothness, and AO channels to outputs, and the 3D view looks correct. (Proper glossiness on the right parts.)

But it seems wasteful and unnecessarily complex to manually wire up the Metallic, Smoothness, and AO outputs when I won't actually be using them in Unity. So the "Ideal" screenshot shows me having only three outputs: Base Color, Normal, and MaskMap. And while this is fine for Unity, you can see that the 3D view in this screenshot looks bad. None of the metallic or smoothness of the MaskMap texture is applied in the 3D View.

So the question is: how can I get the 3D view in SD to read the R channel of my MaskMap for metallic, and the A channel of my MaskMap for smoothness? I tried adding additional usages to my MaskMap output (as shown in the "Usages" screenshot), but that doesn't cause the 3D view to use it.

Thanks.

74
I tried the following: For my MaskMap output node, I set its usage to an RGBA type called MaskMap, but I also added three more Usage entries, one each for metallic, ambientOcclusion, and roughness. However, despite those usages existing on the output node, the 3D View doesn't use them. See attached for an example. The only way I seem to be able to get metallic and roughness to display in the 3D view is to have distinct output nodes for those, which isn't ideal since I don't plan to use those outputs for any other purpose.

What, then, is the purpose of multiple usages on an output node if the 3D view doesn't pick them up? Is what I'm doing in the screenshot reasonable?

75
I've started switching over to using HDRP in Unity. (However, note that this is not a question about integrating with Unity.)

The big difference with HDRP, in terms of Substance Designer, is that the HDRP workflow expects a new Mask Map texture, which combines Metallic, Smoothness, AO, and Detail into a single texture. This page explains how to set that up:

https://support.allegorithmic.com/documentation/integrations/working-with-hdrp-lwrp-172818842.html

However, upon doing this, I no longer get an accurate 3D View preview of my material, as the Metallic and Smoothness outputs no longer exist. How can I adjust the 3D View settings to have it pull the R channel from my MaskMap, and treat it as the Metallic channel, etc?
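Per-pixel, the Mask Map packing is just channel shuffling (R = Metallic, G = Ambient Occlusion, B = Detail mask, A = Smoothness in HDRP's layout), so what I'm after is for the 3D View to apply the equivalent of an unpack step when previewing. An illustrative Python sketch, not SD's or HDRP's actual shader code:

```python
# HDRP's Mask Map packs four grayscale maps into one RGBA texture:
#   R = Metallic, G = Ambient Occlusion, B = Detail mask, A = Smoothness.
# Per-pixel, packing and unpacking are just channel shuffles.

def pack_mask_map(metallic, ao, detail, smoothness):
    """Combine four per-pixel values into one RGBA tuple (HDRP layout)."""
    return (metallic, ao, detail, smoothness)

def unpack_mask_map(rgba):
    """Recover the individual channels a preview shader would need."""
    r, g, b, a = rgba
    return {"metallic": r, "ao": g, "detail": b, "smoothness": a}

pixel = pack_mask_map(1.0, 0.8, 0.0, 0.6)
print(unpack_mask_map(pixel))
# {'metallic': 1.0, 'ao': 0.8, 'detail': 0.0, 'smoothness': 0.6}
```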

I tried clicking the "Add" button in the Material dropdown of the 3D View. However, it simply asks me to name the new material, and then nothing happens. Is there a way to map the MaskMap output into the typical single-output channels for preview purposes?

Thanks.
