Show Posts



Messages - NevTD

1
Does your version of SAT match the version of the docs?
Can you provide the full command you're executing?

The correct use is for --gpu to come after the render subcommand:
Code:
sbsrender render --gpu
not:
Code:
sbsrender --gpu
as the gpu flag is an option of the render subcommand.

2
All 3D noise nodes require the mesh's position map to be plugged in to work correctly.

To make this work with one of the default meshes (like the plane in the video), you'll need to convert it to a resource first, then right-click the resource and choose Bake model information > Position.

3
Quote
I've heard it said that the Automation Toolkit can reproduce almost all processes that could be done using Designer
It's important to note that the Substance Automation Toolkit (SAT) is not an API for Substance Designer.
It's an API for the processes required to automate material creation.

SAT encompasses the following:
  • CLI sbsbaker: Bake mesh maps from your input geometry
  • CLI sbscooker: Compile an sbs material to sbsar so it can be consumed by sbsrender/Painter
  • CLI sbsrender: Render textures using the mesh maps and sbsar
  • CLI (sbsupdater/sbsmutator): See docs https://docs.substance3d.com/sat/command-line-tools
  • Pysbs: A python API that lets you create and edit sbs files (which are essentially xml files). It also includes the batchtools module, a wrapper that runs the CLIs listed above from python by spawning the subprocesses for you (a minimal example follows below).
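
To give a flavor of Pysbs, here's a minimal sketch based on the hello-world demo that ships with SAT (untested; the file path and graph identifier are placeholders):
Code:
from pysbs import context, sbsgenerator

aContext = context.Context()

# Create a new package containing a single, empty graph
sbsDoc = sbsgenerator.createSBSDocument(aContext,
                                        aFileAbsPath='sample/myMaterial.sbs',
                                        aGraphIdentifier='myMaterial')

# Serialize the package to disk (.sbs files are essentially xml)
sbsDoc.writeDoc()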

Substance Designer also has its own API, which aids you in building and editing materials, but it doesn't offer much beyond that at this time, although it is slowly improving.

As far as I know (as of 2020.1.0), it isn't possible to automate rendering with Iray via the Designer API.
Even though the Designer API likely won't be of any use to you in this case, I just thought it was worth mentioning that it does exist.

___

It sounds like you're looking for something similar to turntable/thumbnail rendering with your material of choice; to achieve this you'd have to combine several different processes.

Start by generating the textures in SAT and then run a python script that applies those textures to the shader in your DCC, which renders the final image.

  • Bake out your required mesh maps by providing the fbx to sbsbaker (see the batchtools sketch after this list).
  • Compile the material you need to sbsar using sbscooker.
  • Render your textures by providing both the sbsar and mesh maps to sbsrender.
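
These steps can also be driven from python: pysbs.batchtools wraps each CLI as a function that spawns the subprocess for you. As a sketch of the bake step (untested; the function and argument names are taken from the batchtools docs, so verify them against your SAT version):
Code:
from pysbs import batchtools

# Bake a world-space normal map from the mesh; the wrapper launches the
# sbsbaker subprocess and returns a handle you can wait on
proc = batchtools.sbsbaker_normal_world_space('asset.fbx',
                                              output_path='maps')
proc.wait()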

At this point you can use the DCC of your choice (I'll use Maya in my example) to generate the thumbnail/render.
You could have a python script that does the following (a skeleton is sketched after this list):
  • Launch a Maya headless session and provide the appropriate render environment (a template shot file would be ideal: i.e. it already has the appropriate lighting, a shader with texture nodes for each channel applied to the geometry)
  • Apply the textures to the appropriate texture node for each channel.
  • Render the frame/turntable.
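
A skeletal version of that script, assuming the template shot described above (the scene path and file-node names are entirely hypothetical):
Code:
import maya.standalone
maya.standalone.initialize(name='python')  # start the headless session

import maya.cmds as cmds

# Open the template shot (lighting, shader and geometry already set up)
cmds.file('template_shot.ma', open=True, force=True)

# Point each channel's file node at the textures sbsrender produced
cmds.setAttr('baseColor_file.fileTextureName',
             'textures/material_basecolor.png', type='string')
cmds.setAttr('normal_file.fileTextureName',
             'textures/material_normal.png', type='string')

# Render the current frame with whatever renderer the template is set to
cmds.render()

maya.standalone.uninitialize()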

    4
    Need a bit of clarification on what you're trying to achieve here.

    From reading your post, it seems like you want to add a bitmap, connect it to the transform node and then render the output. Is that a fair assumption?

    If that's the case, then there are a couple of things that need to be accounted for:
    • The transform node needs to be connected to one of the output channels (nodes), most likely base_color is what you want.
    • Pysbs is intended to create or modify sbs files so a different process is required to render/export the channels from your graph.

    SAT is a combination of command line scripts (sbsbaker, sbsrender, sbscooker, among others) and pysbs.
    However, pysbs includes a wrapper for the command line scripts called batchtools, which can be used to run the CLIs as subprocesses via python.

    You already have batchtools imported in your example; you can utilize it by calling batchtools.sbsrender_render:
    https://docs.substance3d.com/sat/pysbs-python-api/api-content/helpers/batchtools
    The supported arguments can be found under the sbsrender CLI docs:
    https://docs.substance3d.com/sat/command-line-tools/sbsrender


    Edit: To expand a little bit more, your python script should eventually be compartmentalized like this (see the sketch after this list):
    • Create the sbs package (pysbs; most of this is done in the script you posted)
    • Save the sbs package (pysbs)
    • Run the sbs package through sbscooker (batchtools/CLI); this will generate a flattened sbsar
    • Run the sbsar through sbsrender (batchtools/CLI); this will generate your textures/renders
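
    Roughly, the last two steps could look like this with batchtools (untested; the argument names mirror the CLI flags per the docs, and all paths are placeholders):
    Code:
    from pysbs import batchtools

    # Cook the saved .sbs package into a flattened .sbsar
    # (you may also need includes=... pointing at SAT's resources/packages,
    # per the CLI's --includes flag, if your graph uses library dependencies)
    batchtools.sbscooker_cook('my_graph.sbs',
                              output_path='cooked').wait()

    # Render the graph's outputs from the cooked .sbsar
    batchtools.sbsrender_render('cooked/my_graph.sbsar',
                                output_path='renders').wait()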

    5
    For time-sensitive issues you might have better luck contacting them directly rather than posting on the forums.

    Also, see if this applies to you:
    https://forum.substance3d.com/index.php?topic=21392.0

    6
    Have you checked C:/Program Files/Allegorithmic/Substance Automation Toolkit/Python API/Pysbs-2019.3.0/pysbs_demos/sample/resultDemoHelloWorld.sbs to see if the modified timestamp has been updated after running the script?

    demohelloworld.py
    Code:
    if __name__ == "__main__":
        destFileAbsPath = python_helpers.getAbsPathFromModule(sys.modules[__name__], 'sample/resultDemoHelloWorld.sbs')
        demoHelloWorld(aDestFileAbsPath = destFileAbsPath)

    7
    It depends on where exactly you're trying to drag/drop.

    I provided this slightly hacky solution a while back for the node graph; see if it points you in the right direction.
    https://forum.substance3d.com/index.php/topic,30012.msg115891.html#msg115891

    I didn't bother looking deeply enough into capturing the QMimeData, but it should be possible for you to install an eventFilter on the Designer widget* containing the dropEvent and capture the mimeData details there... in theory.

    Alternatively, you could even install an eventFilter on the widget that accepts and processes your custom mimeData from your application's dropEvent, before/after processing the built-in drop event.
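
    In PySide2 terms the filter itself is small; something like this (untested, and designer_widget stands in for whichever widget you captured as in the linked post):
    Code:
    from PySide2 import QtCore

    class DropSpy(QtCore.QObject):
        """Logs the mime data of any drop event passing through the widget."""
        def eventFilter(self, obj, event):
            if event.type() == QtCore.QEvent.Drop:
                mime = event.mimeData()
                print(mime.formats())  # payload types that were dropped
                if mime.hasUrls():
                    print(mime.urls())
            return False  # let Designer process the drop as usual

    # Keep a reference to the filter alive or it will be garbage collected
    spy = DropSpy()
    designer_widget.installEventFilter(spy)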

    I no longer have access to the Qt version of Designer, so there's no way for me to test it.

    * See the "Testing..." section of the code for how to capture the widgets and their class names (objectName isn't being used).

    8
    Hi Markus,

    Although I don't have the answers to all of your questions, I'll try to help out with my experiences since it's better than nothing.

    Quote
    I have noticed "Cooking optimization options" in sbscooker.
    My goal is to make sbscooker as quick as possible (there will be a dedicated computer). As far as I understand, I have to set "crc" to "1" and "full" to "0". But this actually increased cooking time for me. After that, I've applied "merge-data 1" and it made the process faster for me. What settings should I apply for the fastest performance?
    I haven't strayed from the default settings so I can't provide any input here.


    Quote
    In the future we would like to use sbsmutator to edit existing graphs...
    Edit what, exactly?
    Although sbsmutator has editing capabilities, they're largely limited to graph-level I/O operations.
    If you need more advanced capabilities like adding to the graph or building one, then pysbs would be needed.

    Quote
    In the future we would like to use sbsmutator to edit existing graphs, sbscooker to make these "ready-for-render" and sbsrender to render them. Is this the "official" way to this?
    Yup, sounds about right.

    Quote
    And if it is, what hardware would you recommend? What is more important for this use-case, CPU or GPU?
    Tough to answer as it's highly relative to your production needs and how you optimize your subprocesses.
    Bakes (from-mesh): heavily GPU-reliant.
    Renders: both GPU- and CPU-reliant, although I think the bias is more towards CPU performance (I don't know this for sure, and I can no longer find the blog entry that discussed it).

    An RTX-capable card (RTX 2000 series) would give you the best performance when baking.
    A mid/high-end multicore processor will give you great performance when rendering.

    I'm not sure if RTX has been optimized for, or is even supported on, Linux, so you may need to look into that if you're using Linux.

    Quote
    How much RAM would you recommend for a medium complex graph?
    Again, very relative; how would you define a medium-complexity graph?
    Baking:
    The resolution of your asset will likely have the most significant impact on memory, as it needs to be cached during baking.
    I've seen a single subprocess take anywhere from 2 GB to 15 GB, depending on a combination of factors.
    Ideally, aim for a minimum of 64 GB.

    Rendering:
    Highly dependent on the optimization of your graph.
    If you're allowing artists to work heavily with procedural scaling (not tiling), expect to use a lot more memory.
    In fact, monitoring memory and performance during rendering is how we've caught multiple graphs that weren't optimized properly.

    9
    (Correction: There is an installer download for Substance Automation Toolkit but no license key file. I assume that it does not need one.)

    That assumption is correct.
    At the moment there's no license authentication procedure for SAT.

    10
    ...since there is only a $5000/yr option, and my inquiry to ask about other options yielded no response.

    Did you ever find an option that was less than $5000/yr?

    If you qualify for the indie license, then SAT should be included with that (~$250/year, I think).
    Although I didn't renew my subscription in 2019, I did purchase an indie subscription back in Nov 2018 specifically so I could have access to SAT.

    Hope that helps.

    11
    Use tile_sampler_color instead.

    12
    I think your post might be too ambiguous for people to answer; it might need to be fleshed out more to explain what it is you're trying to achieve.

    Is your end goal to texture it quickly or to texture it in a specific way?
    Why exactly are you fixated on Designer if you're new to it and not familiar with its workflow or limitations?

    13
    Try contact@substance3d.com.

    Quote
    When I hit submit it says the captcha wasn't entered correctly, but I don't see one on the page.
    Are you using an ad-blocker? It might be disabling some required elements on the page, which may need to be whitelisted.

    14
    Figured as much; it's probably something in the Qt5 library triggering it on initialization via SAT.

    I would say it's safe to ignore it unless Adobe/Allegorithmic says otherwise.

    15
    I only have access to Designer at work, so the code snippets I'm providing are from referencing the docs and are completely untested; however, they should be enough to point you in the right direction.

    Quote
    1. get the name of the current graph in the package ?
    Code:
    import sd

    # Grab the UI manager from the application context
    app = sd.getContext().getSDApplication()
    ui_mgr = app.getQtForPythonUIMgr()

    graph = ui_mgr.getCurrentGraph()
    graph_name = graph.getIdentifier()

    Quote
    2. get the function/info in the 3D View ? (I'm trying to reset the position then save the image into specified location)
    Unfortunately there's no way to interact with the 3D view via the SD API.

    If you're trying to generate a swatch to go with your publish, you can try the following:
    A:
    • Create a template scene with a sphere (or another obj) and a shader in the DCC/renderer of your choice.
    • During publish, export graph outputs via sd.tools.export.exportSDGraphOutputs (sketched at the end of this post)
    • Trigger a subprocess to plug the textures into the shader of your template scene, via headless/python session and render the outputs.

    B:
    • If you're familiar with Qt, you can try to capture the 3D viewport object and force the camera changes.
    • Try to export the current view via public methods.

    The latter option would require a decent amount of Qt knowledge, not to mention it's a very hacky solution and not at all future-proof. Not recommended, but I'm listing it anyway in case you know what you're doing.
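
    For what it's worth, the export step in option A boils down to a couple of calls (same caveat as above: referenced from the docs and untested; the output directory is a placeholder):
    Code:
    import sd
    from sd.tools import export

    app = sd.getContext().getSDApplication()
    ui_mgr = app.getQtForPythonUIMgr()
    graph = ui_mgr.getCurrentGraph()

    # Write every output node of the current graph to disk as pngs
    export.exportSDGraphOutputs(graph, 'publish/swatch_textures', 'png')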
