
Messages - nicvcer

Pages: 1 [2] 3
Try scaling your UVs down so they fit inside the 0-1 box better. Based on your errors, it looks like your UVs are poking outside the UV limits. You'll always want to keep them inside, since padding between your UV islands is necessary for proper texturing.
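The "fit inside the box" step can be sketched numerically. A minimal NumPy example, assuming the UVs are an Nx2 array and using an illustrative `margin` value for padding (the function name is hypothetical, not part of any Substance tool):

```python
import numpy as np

# Hypothetical sketch: uniformly scale and translate a set of UV coordinates
# so every point lies inside [margin, 1 - margin] of the 0-1 square.
def fit_uvs(uvs, margin=0.01):
    uvs = np.asarray(uvs, dtype=float)
    lo = uvs.min(axis=0)
    hi = uvs.max(axis=0)
    extent = (hi - lo).max()              # uniform scale keeps island proportions
    scale = (1.0 - 2.0 * margin) / extent
    return (uvs - lo) * scale + margin

uvs = np.array([[-0.2, 0.5], [1.3, 0.9], [0.4, -0.1]])  # some UVs outside 0-1
fitted = fit_uvs(uvs)
# all fitted values now fall within [0.01, 0.99]
```

A uniform scale is used on purpose: scaling each axis independently would stretch the islands and distort the texture.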

Not really an answer, but I personally haven't had much success with material layering in Unity, so I just cut it out of my pipeline. If you can afford the extra expense, try adding a material to the model and layering an emissive material on top with an alpha channel.

I think the problem might be in your first image, though this is a guess. Give this a try...
Before exporting your substance, make sure your output size is Relative to Parent without any multipliers. Excluding uniform color nodes (Absolute 16x16) and loaded bitmaps (Absolute 2048x2048), all of your grid items and your top-level graph should be at 256x256 (Relative to Parent x1) for the substance to export correctly. Then in Unity/Player change the size to 2048x2048.

Based on the Coconut graph

Coconut Husk, Unity

Baseball footprints in dirt

I've decided to share the default Unity 5 Metallic Roughness substance template that I use for every new Unity substance I create. It's not perfect, but it's getting better! It should be useful to anyone looking to organize their workflow for Unity. It is designed as a clean output system that also gives access to a packed Metal/Smooth map output.

Something to note: the Ambient Occlusion from the normal map is set to 1px x 1px Absolute by default and is not connected to the output. Once the height map is fully set up, these nodes should be changed to Relative to Parent and connected to the output. I made it small and disconnected because it can add a lot of processing time to any substance graph, so in this particular pipeline it should be saved for a final touch.

I've uploaded the template to Substance Share and will post a link to it here once it is approved.

I've solved this issue. It was not a problem with Substance Player; rather, an additional output was needed, set up in Substance Designer.

A while ago I made a feature request for an option to export the proper metallic/smoothness file from Substance Player, so my clients could modify the material in player and export the correctly packed maps they needed to go straight into Unity. This allows them to choose whether they want smaller file sizes and longer loads (substance) or larger file sizes and faster loads (baked maps).

Now that I'm a bit wiser I see that a change to Substance Player itself isn't necessary. Only another output is needed for this operation. This output will be included with all future Unity substances I produce. I specifically used the "Any" output and got a proper result.

In the image below the roughness is inverted and then merged with the metallic map. I used the metallic output for all three RGB slots even though only the Red channel is used for the metallic map. The Green and Blue channels are wasted; Unity does not use them. I passed the metallic values to Green and Blue anyway just to visualize a black-and-white result.
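A minimal sketch of that packing in NumPy, assuming Unity's Standard shader convention (Metallic read from R, Smoothness = 1 - roughness stored in A); the function and array names are illustrative:

```python
import numpy as np

# Sketch of the metal/smoothness packing described above, done outside
# Designer. Assumes HxW grayscale maps with values in [0, 1].
def pack_metal_smooth(metallic, roughness):
    smoothness = 1.0 - roughness           # invert roughness -> smoothness
    h, w = metallic.shape
    packed = np.zeros((h, w, 4))
    packed[..., 0] = metallic              # R: the channel Unity actually reads
    packed[..., 1] = metallic              # G: unused by Unity, filled for a
    packed[..., 2] = metallic              # B: black-and-white preview
    packed[..., 3] = smoothness            # A: smoothness
    return packed

metal = np.full((4, 4), 1.0)    # fully metallic surface
rough = np.full((4, 4), 0.25)   # fairly smooth surface
tex = pack_metal_smooth(metal, rough)
# tex[..., 3] is 0.75 everywhere (smoothness); RGB are all 1.0
```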

This "Any" output is not visible in Unity. It is visible in Substance Player, where it is needed. The "Any" output I labeled as "MetalSmoothPack" can be exported from Player and the textures can then be placed directly into Unity.

Just posting this solution here in case anyone was wondering how to do this.

Hi devs! Thanks for everything you do! Question...

Will there be support for texture packing with Player in the future? Will there ever be Export Presets?

The option to export the textures generated by the substance, after tweaking exposed parameters in Player, is important to what I'm trying to achieve. This is not at the request of a client, but a concern on my end about the future saleability of my models and substances to future clients.

When trying to warm a client up to the idea of using substances in their workflow, it would help to give them the breathing room of being able to bake substance archives down into static bitmaps and fall back to the backwards-compatible workflow they're used to.

If I'm going to sell the pliability of a "substance creature" versus a "static baked textures creature", it would help if clients could fall back on static baked textures that they can generate themselves in Player if desired. If I had to generate the static baked textures, then every time there was a change I would have to run it through my system and send it back. That creates a dependency I'm not really all about. I also can't expect clients to have image processing software capable of channel-packing the images correctly, not to mention they'd have to learn how to do it. Substance Player would be an excellent fit for the client, as it performs only what the client needs and works directly with the substance file (not just any image editing software).

An example client uses Unity but may not want the fluff that comes with using a substance on a 3D mesh, such as computing maps on startup or generating several output textures in the background. The client is excited about the idea of being able to alter textures, essentially, but maybe they prefer their simpler three-maps-in format. Having a way for the client to generate Unity-compatible maps from the sbsar file I've given them would be super useful.

Howdy Wes! Thanks for responding!

When editing the subgraph's exposed parameters, they do indeed get updated; however, the effects of the update would not reach up to the combined graph in Unity. There is no driver to push the changes made in the subgraph upward, since on its own the subgraph functions independently and does not communicate with other substances. The method I came up with allows the main graph to give instructions to the subgraphs and receive feedback, working from the top level downward and then back upward to force out a result.

Ah I see, so the super-graph with everything thrown on it is potentially the most optimized way to do this, despite being difficult to look at. I bet I could shave half the computation time off doing it that way... It's far too destructive, though, so maybe I'll consider that approach in the future if optimization becomes dire.

Final Update : Input Works!

So I've converted the substance from being output-based to being input-based, and there is a dramatic improvement in the readability of the graph. The Unity warning is also gone! The input method works in Unity just as well as the output method did.

The blending of the skin colors is done internally within substances instead of externally in this upper layer. The Dino-Skin is much more compact and simpler to configure.

As for the interiors of the underlying substances, they have been restored via switches. That way, if I'd like access to the 100% original colored material, all I have to do is flip the switches.

A word of warning, though: if these switches are set to show the original color instead of the custom input, they disable the ability to customize the color in Unity. So the switches are a tad dangerous, but well worth it to keep the substance completely non-destructive.

Update :

One feature I liked about the brute-force setup from the last post is that the end user only sees one substance. This can reduce confusion when applying the substance to a model, but IMO the overall complexity of the graph negates that gain.

This time I removed the color information from the underlying substances, and brought the colors up to the top-level material where all the substance graphs are being blended together.

Substances that used masks for color blends got custom outputs of those masks. I was then able to access those outputs during the material blend, and the alphas do carry over to Unity despite the warning saying they failed. I also went through and optimized things a bit: I set the Uniform Color nodes to 16px and set all the other nodes to Relative to Parent so the material can be scaled in Unity. Pictured below is where I blend the Rough Skin and Smooth Skin into the material. To simplify things in Unity, I use the same colors for both the Rough Skin and the Smooth Skin.

Organizationally this is an ideal setup, as it brings all my custom controls to an area where I can see them all at once. The workflow is odd, though: it goes against the non-destructive nature of Substance Designer, since I had to visually modify the underlying substances to make the approach work.

One last approach I would like to try is setting inputs in the substances and pumping the color directly in, instead of having outputs and in-between blends. That could simplify the process further by having the blends happen behind the scenes. It would also be less destructive to the underlying substance graphs, as they could retain their blend information. If this works as predicted, it would also get rid of the Unity warning, since I'd be using inputs rather than outputs.

A work-around for this would be to center the oddly-sized image on a 4096x4096 canvas, run it through B2M, then crop the image back to its original size after it has been processed.
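The pad-then-crop part of that work-around can be sketched with NumPy; the B2M processing itself happens between the two steps and is outside this snippet (function names are illustrative):

```python
import numpy as np

# Sketch of the pad-then-crop work-around: center the image on a large
# square canvas, process the canvas elsewhere, then crop back.
def pad_center(img, size=4096):
    """Center `img` on a size x size black canvas; return canvas and offset."""
    h, w = img.shape[:2]
    canvas = np.zeros((size, size) + img.shape[2:], dtype=img.dtype)
    y, x = (size - h) // 2, (size - w) // 2
    canvas[y:y + h, x:x + w] = img
    return canvas, (y, x)

def crop_back(processed, offset, original_shape):
    """Crop the processed canvas back down to the original image region."""
    y, x = offset
    h, w = original_shape[:2]
    return processed[y:y + h, x:x + w]

src = (np.random.rand(700, 1000, 3) * 255).astype(np.uint8)  # oddly-sized image
padded, offset = pad_center(src)
# ... run `padded` through B2M here ...
restored = crop_back(padded, offset, src.shape)
# restored has the original 700x1000 shape
```

Note that B2M's edge-aware filters may bleed slightly across the padding boundary, so a little extra margin around the image before cropping can help.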

Also, this question would fit better in the BitmapToMaterial section of the forum than in the Unity 3D section.

I wanted to give an update and share the result of brute-forcing it and sticking every material onto a single graph.

As expected, once this substance archive is brought into Unity it works as I'd like, minus some metalness/roughness issues that I'm not going to fix for now. The graph may be a headache to look at and set up, but it's a headache that works!

I'm able to tweak color values and generate new textures on-the-fly in Unity, exactly what I was aiming for. There must be a better way, though! Thanks in advance for any help you can give!
