Show Posts


Messages - t.leszczynski

Pages: 1 2 [3]
One note on the horrid JPG output - you can use PNG, BMP, TGA, TIFF, etc. I usually use PNG or TGA.

Hard to say what the problem is without seeing some pics - normal map, UVs, model with wireframe, etc. - like Raouf said.

The only crashes I've experienced are when I try to:
- render a texture at 8k or higher in Substance Player - that alone eats 1.5 GB of RAM during generation
- render a texture in Substance Player while Designer is open

My tips:
- make sure you use the GLSL renderer (Mac) or DirectX (Windows) in the Player
- try not to use Designer and the Player at the same time - conserve your resources. Also check how much memory generating specific texture sizes eats.
- simplify your blend nodes - do not use alpha blending if your textures don't use it
- use operations on greyscale images - they work faster and eat fewer resources
- if all else fails, use the Substance Batch Tools - with them I can cook 8k textures with no problem (I didn't try bigger); just write yourself a custom script in, e.g., Python
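A custom Batch Tools script like the one mentioned in the last tip can be as simple as building the `sbsrender` command line from Python. A minimal sketch, with the caveat that the exact flag names and the `$outputsize` convention here are assumptions from my setup - check `sbsrender --help` on your install before relying on them:

```python
import subprocess

def build_render_command(sbsar_path, output_dir, size_log2=13):
    """Build an sbsrender invocation for one .sbsar file.

    size_log2=13 means 2^13 = 8192, i.e. an 8k texture.
    Flag names are assumptions - verify against `sbsrender --help`.
    """
    return [
        "sbsrender", "render",
        "--input", sbsar_path,
        "--output-path", output_dir,
        "--set-value", "$outputsize@%d,%d" % (size_log2, size_log2),
    ]

def render(sbsar_path, output_dir):
    # Runs outside Designer/Player, so it avoids their memory overhead.
    subprocess.run(build_render_command(sbsar_path, output_dir), check=True)
```

Because the command runs in its own process, you sidestep the RAM the Designer/Player UI eats, which is why 8k cooks fine this way.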

Hope you solve your problem.

I use an i5, 8 GB RAM, Win 7 x64.

You have to calculate the "weight" of each possible workflow. So, if you:
A. generate the bitmaps and apply them to the mesh in Unity:
  - you give up the possibility of dynamically changing the texture in-game
  - your final game size will be bigger
  - no time is taken at runtime to generate the textures (more on that further down)
B. use b2m in Unity and bake in-game:
  - you have the option of changing the texture in the editor or in-game
  - your game size will be smaller (although the amount of VRAM it eats is the same)
  - you have to account for some time overhead for texture generation
C. use b2m in Unity and keep it editor-only:
  - you have the option of changing the texture only in the editor
  - your game size will be the same as in option A
  - no overhead time is taken for texture generation at runtime
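To put rough numbers on that "weight": a back-of-envelope sketch comparing option A (baked bitmaps on disk) with option B (one archive, generated at load). The map count and sbsar size below are illustrative assumptions, not measurements:

```python
# Back-of-envelope "weight" of the workflows above.
# All concrete sizes are illustrative assumptions, not measurements.

def texture_bytes(resolution, channels=4, bytes_per_channel=1):
    """Uncompressed size of one square RGBA texture."""
    return resolution * resolution * channels * bytes_per_channel

# Option A: ship baked bitmaps (e.g. diffuse, normal, specular, AO) at 2k.
maps = 4
baked = maps * texture_bytes(2048)   # bytes on disk, uncompressed

# Option B: ship one .sbsar and generate at load time.
sbsar = 2 * 1024 * 1024              # assumed ~2 MB archive

print("baked : %.1f MB" % (baked / 1024**2))   # 64.0 MB
print("sbsar : %.1f MB" % (sbsar / 1024**2))   # 2.0 MB
```

The VRAM footprint after generation is the same either way; option B only trades disk/download size for generation time at load or runtime.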

The only inconvenience is that you can't load the presets you tweaked in the Substance Player into your substances in Unity. You can either write your own editor plugin to parse the presets and apply them in Unity, or you can pester the Unity devs to add that option. I think I saw that UDK already has an integrated option to read the presets; I don't know why there isn't one in Unity.

 :P Don't know either. But I wasn't really using it anyway.

Shift + LMB - move the light
Shift + RMB + move mouse up/down - scale the light
LMB - rotate the model
MMB - move the model
RMB - scale the model

In the 3D view go to Channels -> Reset to default colors.

Substance Designer - Technical Support - Re: baking failed
 on: February 28, 2013, 07:02:07 pm 
The model provided in the tutorial has overlapping UVs for the arms, gauntlets and legs. Try unchecking those additional names (they have "1" appended to the name) during scene information baking.
Maybe the error comes from that.

I upgraded the whole 3.2.x branch over the previous 3.1.0 version and the samples were not touched. When I open them, Designer asks me for specific subgraphs/generators from the library - all I have to do is point it to where each one of them is on disk. So only the automatic library update doesn't work with them; everything else looks OK.

Substance - Discussions - Re: Scratching my head
 on: February 18, 2013, 03:08:06 pm 

I perfectly understand you, man :)
My experience with substances began with playtesting v2.5 about a year ago. At first look, substances seemed like magician's hats from which you can pull out not only a rabbit but an elephant.
But there wasn't much practical info on them except the VTC Substance tutorials and the Substance POD tutorial. I tried to make a few substances myself but gave up in the end - well, only a magician can do tricks with a magician's hat, I thought.

I'm a hobbyist who occasionally has a small job texturing some assets. I went back to SD when v3 came out. I watched the Cymourai tutorial, and this time I went into SD trying to follow along, not just watch. That changed my perspective on SD (yeah, I was lazy before).
Substance Designer is improving my texturing workflow - imagine that your client, or you, want to change a specific part of the texture just before your deadline.
If I work in Photoshop, for example, I have to do more steps to achieve the desired result. I have to juggle between the AO, diffuse and specular maps and their layers, remake the normal map, redo the levels correction, etc.
In Substance Designer I can use the baked SVG info, paint a mask on it, change only that specific isolated element and then just "plug it in" to a previously made, perfectly working chain of nodes containing level corrections, gradient maps, AO bakes, etc. All my final outputs are automatically regenerated with the new information. Not only that, but I can expose some of the nodes' parameters for the client to fine-tune to his/her needs (e.g. if he/she wants to change the color of that element again in the future :) ).

Follow the first-steps tutorials and the Cymourai one using SD, and I think after a while you will understand what SD can be used for and what substances are.
I've only scratched the surface of this - there are still a lot of things I haven't put my hands on yet (FX-Maps, advanced function graphs), but I'm not giving up.

Be careful though: after some time spent in SD you will see patterns everywhere ;D When I look at my wooden desk I think about a stretched noise pattern, mixed with a transformed b/w gradient warped by noise, then gradient-colored...

Substance Designer - Discussions - Performance questions
 on: February 18, 2013, 02:11:07 pm 
Hello wonderful people at Allegorithmic - I have a couple of node-computation questions regarding the performance of the final sbsar substance.

Example problem:
I have a metal door to texture, and the client wants (optimistic version) 2 different metal types in the substance to switch between. I can separate the default scratches and other weathering and put them in the graph as an instance. But what about the metal node chains? Can they be instanced as well? Or should I make a different sbsar for each metal type - which will increase the total substance size, as each sbsar needs to carry the normal map baked from the high-poly mesh.

1. How are the switch blend nodes evaluated? The question is: are the different node chains feeding the blend switch all computed, or not? It looks like Substance Designer computes them all (eager evaluation). I wonder if the same substance, published and used in Unity/UDK, uses a different method (lazy evaluation), checking the default switch values first and computing only the needed node chains. When I'm creating simple node chains that give me only a diffuse with AO mixed in, it matters little, but what if I wanted to instance more complicated, lengthy graphs, each with its own diffuse, normal, AO, etc. outputs?

2. Suppose I didn't put the metal node chains in as instances but just pasted them into the graph. The first noise generator used in the upper chain is the same as the noise generator used in the lower chain. Does it matter for the published substance that they are separate instances? Or should they be collapsed into just one? They both use the same exposed value/function.

3. If the sbsar in Unity/UDK computes all nodes, I was thinking about putting blend switch nodes (with a UFC as the 2nd input) just after each noise generator. They would be driven by the same exposed value as the main switch (the rightmost one). I don't know if the noises are computed or not in this case (see question 1), but at least changing them to a uniform color should speed up the computation of the rest of the chain. Or am I wrong?
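The difference questions 1 and 3 hinge on can be sketched in a few lines of Python. This is emphatically not how the Substance engine works internally - it is just an illustration of why lazy evaluation (or short-circuiting a noise to a uniform color) saves work:

```python
# Illustrative only: eager vs lazy evaluation of a switch between two
# "node chains". Node names are made up; this is NOT the Substance engine.

computed = []  # record which expensive "nodes" actually ran

def expensive_noise(name):
    computed.append(name)          # pretend this costs real time/memory
    return name + "_noise"

def eager_switch(use_first):
    # Eager: both chains are computed, then one result is picked.
    a = expensive_noise("metal_a")
    b = expensive_noise("metal_b")
    return a if use_first else b

def lazy_switch(use_first):
    # Lazy: only the selected chain is ever computed.
    if use_first:
        return expensive_noise("metal_a")
    return expensive_noise("metal_b")
```

With the eager version both generators run regardless of the switch value; with the lazy one only the selected chain runs - which is exactly the saving an engine-side short-circuit (or a uniform-color stand-in) would buy.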

Thank you.
I've been using Substance Designer for about 4 months and didn't know about that - it really is a time saver.
I must have missed it in the documentation. :o

I would like to see (1) a graph navigation option based on exposed parameters and (2) reverse exposed-parameter information. What I mean:

1. After making a bunch of substances and exposing parameters of their nodes, I save them for later use. Everything is tip-top shortly afterwards, or while I'm working on them - I can easily remember which exposed parameter corresponds to which node. However, after some time I'm often lost. The only way I know of to figure it out is to jiggle an exposed parameter's default value while watching the graph to see which node changes - after opening that graph and changing the input to some static vector / diffuse / color map.
Proposal: jump from an exposed parameter to its node(s), or highlight them.

2. Selecting an exposed parameter in a node should give information about its exposed name and/or label. Right now, after clicking the exposed parameter in a node, I get the list of all the labels.
Proposal: put the exposed parameter's label/name at the top of the list? Use a different color? Show the label/name on hover? Etc.

In my opinion the second option is more important, as there can be multiple nodes sharing one exposed parameter.
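Under the hood, the navigation asked for in (1) is just a reverse index from exposed parameter to the nodes that use it. A tiny sketch with entirely made-up node and parameter names, to show the shape of the lookup:

```python
from collections import defaultdict

# node -> exposed parameters it uses (hypothetical graph data)
node_params = {
    "noise_1":    ["metal_type", "grain"],
    "noise_2":    ["metal_type"],
    "blend_main": ["metal_type"],
    "levels_ao":  ["ao_strength"],
}

def reverse_index(node_params):
    """Invert the node->params mapping into param->nodes."""
    index = defaultdict(list)
    for node, params in node_params.items():
        for p in params:
            index[p].append(node)
    return dict(index)
```

Jumping from `metal_type` to the three nodes that share it is then a single dictionary lookup - which is also why a one-param-to-many-nodes UI matters.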
Thanks :)
