Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Topics - NevTD

Pages: [1]
Looking to test out the MaterialX integration.
Are you able to provide an ETA?

I'm running into multiple renders hanging when attempting to launch them in parallel threads with 4k exports (d3d10pc).
If I try to export 2k textures, it works just fine.

SAT: 2018.3.0
CPU: Xeon E5-2650 v3 x2
GPU: Quadro M4000
Memory: 64 GB
OS: Windows 7 64

It begins normally and starts exporting the maps to disk, but eventually it just stops (about 20 of 189 maps in).
I can't capture the stdout because the sbsrender process hangs and won't write anything out to file until it finishes.

- Tried setting memory-budget and cpu-count limits, to no avail.
- Running the sbsrender processes one by one works but takes a long time.
- Once these processes hang, I can restart them using Windows Process Explorer and they continue rendering just fine until something causes them to hang again; if I restart them at that point, they continue working.
- I've locked each task_pool to make sure that the next pool cannot start until the first one is completely finished; this gets me a few more exports, but it still ends up hanging.

Any idea what could be causing the sbsrender processes to hang indefinitely?
Could it be a GPU memory issue?

I've included an attachment with a simplified diagram of how the threading is set up and also of Process Explorer when the sbsrender jobs hang.
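For reference, the launcher side of my setup looks roughly like the sketch below (Python 3 names for brevity; the sbsrender argv shown in the comment is a placeholder, not my exact command line). Writing each child's output straight to a log file, rather than a pipe, at least rules out a full pipe buffer as the cause of the stall:

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

def run_render(cmd, log_path):
    # Redirect stdout/stderr straight to a log file so a full pipe
    # buffer can never be what stalls the child process.
    with open(log_path, "w") as log:
        return subprocess.call(cmd, stdout=log, stderr=subprocess.STDOUT)

def run_parallel(jobs, max_workers=4):
    # jobs: list of (argv, log_path) tuples; returns the return codes.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(run_render, cmd, log) for cmd, log in jobs]
        return [f.result() for f in futures]

# Placeholder commands; in my setup each argv is an sbsrender invocation,
# roughly ["sbsrender", "render", "--input", "material.sbsar", ...].
jobs = [([sys.executable, "-c", "print('map {} done')".format(i)],
         "render_{}.log".format(i)) for i in range(3)]
print(run_parallel(jobs))
```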


I can't seem to log in to substance share using my account. I keep getting the same "Sign-in failed! Please make sure your email and password are correct." error many users have posted about back in January.

Resetting the password didn't help and I can log in just fine everywhere else.

Calling batchtools processes from within Maya returns a handle error because of the way Maya deals with handles.
Code:
WindowsError: [Error 6] The handle is invalid
The normal fix for this is to define the handles for the subprocess explicitly:
Code:
import os, subprocess

process = subprocess.Popen(
    [tool_exe, sub_command, '--opt-descr'],
    stdin=open(os.devnull, 'wb'),
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

I know it's possible to provide args to subprocess.Popen in __sbscall; however, those same arguments never get used in the subprocess.check_output called in __load_command_info. This results in the handle error posted above and prevents __sbscaller from continuing.

Can this be patched to allow the subprocess kwargs from __sbscaller to also pass into the subprocess for __load_command_info?

I'm setting up a thread pool per UDIM which processes tasks in the following order:
[cook thread]
[bake threads]
      | <----- wait for bake completion
[render thread]

In order to process the correct baked maps in sbsrender (via setentry), I have to manually resolve the bake paths and return the value via Queue.Queue.put(resolved_filepath), which then gets added to a paths dictionary (in my queue manager) for the associated baker.
The same happens to the sbsar once it's cooked.

The paths dictionary is then grabbed from the queue manager, into the sbsrender task, in order to associate the baked maps with the correct input and the cooked sbsar.

Although this works for the baker and cooker, it's far more difficult for the render, as I have no way of predicting which outputs will be available and therefore can't resolve the output usage/identifier easily.
Resolving the filepath manually from a pile of function arguments, plus production-related "default" arguments stored in configs (and now possibly pysbs as well), is tedious and bloats up the task function unnecessarily.

Is there a way to retrieve the resolved filepath from the batchtools processes or sbsbaker, sbscooker, sbsrender tools?
I can't seem to find a way.
If not, it would be extremely useful if both batchtools and SAT tools returned the resolved filepath at the end.
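For context, a stripped-down version of what I'm doing for the baker paths today (names like bake_task and run_udim are from my own setup, not SAT; Python 3 module naming):

```python
import threading
from queue import Queue

def bake_task(udim, results):
    # Stand-in for an sbsbaker call: the real task resolves the output
    # path from its own arguments and reports it back through the queue.
    resolved_filepath = "bakes/{0}_normal.png".format(udim)
    results.put(("normal", resolved_filepath))

def run_udim(udim):
    results = Queue()
    t = threading.Thread(target=bake_task, args=(udim, results))
    t.start()
    t.join()  # wait for bake completion before the render task starts
    # The queue manager drains the queue into a paths dict keyed by
    # usage, which the sbsrender task then consumes via setentry.
    paths = {}
    while not results.empty():
        usage, filepath = results.get()
        paths[usage] = filepath
    return paths

print(run_udim("1001"))
```

This works because the baker's output path is predictable from its arguments; the render outputs are not, which is exactly the gap.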

I'm running into an issue while creating complex materials.

I have a node which can output diffuse or basecolor based on a switch. (not relevant to the issue)

If I make an explicit connection from an output to an input, it ignores the specified values and connects to the first acceptable match on the left node.

If I run the code below, it will connect the 'diffuse' output on the source_node, to the 'input' on the target_node.
Code:

This is the output order on the source_node:

Now if I reverse the order so that basecolor is first, it connects properly:

The temporary fix is to retain proper order via Designer, but it's extremely unreliable when managing a complex library of materials or generating them via pysbs.

This has become a huge problem for us, so any suggestions or confirmation would be appreciated.

Would it be possible to set up a new sub-forum dedicated to Substance Designer API related posts?

Painter has its "Scripting" sub-forum, which makes it easy to reference or search through posts, whereas Designer's Python API related posts are currently scattered all over.

I'm not currently part of the beta but was hoping to do some preemptive testing with Alchemist exported materials.

Can anyone please provide the following?
1) A simple material with filters, exported from Alchemist as an SBS
    (i.e. a base material with a dirt filter and dust splatter filter)
2) A screenshot of the materials layer stack in Alchemist, for reference.

Much appreciated.

The node graph I'm creating has other graph instances which represent custom built materials.
I need to expose the parameters from each of these instances, to the parent graph.

To do this, I'm iterating through all the parameters of the instanced graph and creating dynamic parameters:
Code:
dyn_val = aNode.setDynamicParameter(aParam)
Now on the dynamic value, I need to create a function:
Code:
So far so good, except aFunction requires a FunctionEnum value.
To build this value, I'm attempting to extract the existing type name on the parameter, which will then be concatenated for use with FunctionEnum to retrieve the value:
Code:
param_type = aParam.getType()
# 16

The problem is that it returns the ParamTypeEnum value rather than the name (INTEGER1, in this case), and there doesn't appear to be a way to get the name.

This process would work fine if you were creating parameters while building graph instances and explicitly defining the types, but it falls apart when attempting to expose existing parameters.

I feel like I'm missing something here.
Is this the correct way of exposing parameters from instances?
Is there reverse mapping for enums?
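On the reverse-mapping question: since the pysbs enums appear to be plain classes of integer attributes rather than Python enum.Enum instances, a reverse lookup can be built by inverting the class dict. A sketch using a stand-in class shaped like ParamTypeEnum (the only value I've verified is 16 for INTEGER1, from the getType() call above; the others are illustrative):

```python
class ParamTypeEnum:
    # Stand-in with the same shape as pysbs.sbsenum.ParamTypeEnum.
    BOOLEAN  = 4
    INTEGER1 = 16
    FLOAT1   = 256

def enum_name(enum_cls, value):
    # Reverse lookup: enum value -> attribute name.
    names = {v: k for k, v in vars(enum_cls).items()
             if not k.startswith("_")}
    return names[value]

print(enum_name(ParamTypeEnum, 16))  # -> INTEGER1
```

The same helper would work against the real pysbs enum classes, since it only relies on them being classes with named integer attributes.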

I'm developing a material via Pysbs with multiple sets of channel outputs (grouped) which would allow us to have two groups of texture sets as output:
  • The first set is created from the outputs of each graph instance.
  • The second set is the final "comped" material outputs.

Is it possible to do the following via Pysbs...

A: Enable/disable an output node?
I understand that since instanced materials are graph instance representations, there's no direct way to disable them, since there's currently no bypass for data to flow through.

I assume, however, that's not a limitation for output nodes and that there should be a way to disable them.
I've tried SBSCompNode.mDisabled, but I can't figure out what that does outside of adding the disabled XML attribute to the sbs file. It seems to make no difference in Designer or Painter.

B: Reordering an output node?
Much like re-ordering the widgets in the Output Images tab of the graph in Designer, is it possible to re-order your CompOutputs via pysbs?
I'd like painter to pick up my material outputs group by default, rather than the order in which the outputs were created (since the "comped" material outputs will be created last).
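To illustrate B: if the graph's outputs end up exposed as an ordinary Python list on the graph object (I haven't confirmed where pysbs keeps it, possibly something like mGraphOutputs), a stable sort keyed on the group would be enough to float one group to the front. Sketch with hypothetical output records:

```python
# Hypothetical output records; in pysbs these would be the graph's
# output objects, each carrying a group and an identifier.
outputs = [
    {"identifier": "inst_basecolor", "group": "instances"},
    {"identifier": "inst_normal",    "group": "instances"},
    {"identifier": "basecolor",      "group": "material"},
    {"identifier": "normal",         "group": "material"},
]

# Python's sort is stable: 'material' outputs move to the front and
# everything else keeps its original relative order.
outputs.sort(key=lambda o: 0 if o["group"] == "material" else 1)
print([o["identifier"] for o in outputs])
```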

Substance Painter - Scripting - Editing layer stack
 on: December 12, 2018, 04:03:14 pm 
Does the API currently let you edit the layer stack of a texture set?
I'd like to be able to add a material (layer) to the layer stack of a specific texture set.

I've looked through the docs but wasn't able to find anything related to this.

I've finished setting up a custom network in Maya with the aiStandardSurface, which renders correctly, and all the parameters appear to work, with the exception of the mesh-map-driven parameters.

I can't seem to find any way to connect the baked mesh maps into the inputs I've created in the substance.
This significantly reduces the functionality of the substance.

Are the inputs not available in Maya?

Does the current indie promo with promo code (BF18-SUB) contain the Substance Automation Toolkit?
I'm interested in buying the license for personal use, but I absolutely need access to SAT.

