Author Topic: Designer maxing out memory and CPU with simple nodes  (Read 117 times)

Seemingly at random, I will create or attach any sort of node (noise, levels, swirl, etc.) in Designer and my memory usage will jump to 100%. CPU will jump to 100% and then fluctuate between 50 and 100%. The computer slows down to an almost unresponsive level.

Before I over-explain I'll post all the info that might be relevant:

Computer spec:
Windows 10 fully updated
Processor   AMD Ryzen 7 3700X 8-Core Processor 3.60 GHz
Installed RAM   32.0 GB
Graphics Card NVIDIA GeForce GTX 1070 (drivers up to date)

Designer graph size is relative to parent and set to 1k.
The same issue occurs whether I have one graph/package open or a dozen.
I have "automatically compute all nodes" off.

More explanation:

I noticed it first when creating noise nodes. I used Clouds 3 as a test several times, and it froze my computer more often than not. However, the slowdowns are not 100% consistent. Designer might work fine for 10 minutes and then suddenly this behavior will begin again. It can also happen when creating a Levels node, or a Transformation, or most recently a Swirl.
I can usually use Task Manager to force-close Designer, and after 30 seconds or so my memory usage returns to normal. During the slowdowns, if I sort Task Manager by memory usage I can see several "sbsrender" processes and a "crashpad_handler" process stacking up.
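In case it helps anyone reproduce this, here is a minimal sketch for confirming the process stacking without keeping Task Manager open. It is just an illustration, not an official tool: it counts running processes whose name contains a given substring (e.g. "sbsrender") by shelling out to `tasklist` on Windows or `ps` elsewhere:

```python
import subprocess
import sys

def count_processes(name: str) -> int:
    """Count running processes whose name contains `name` (case-insensitive)."""
    if sys.platform == "win32":
        # tasklist prints one process per line on Windows
        cmd = ["tasklist"]
    else:
        # ps -e -o comm= prints bare command names, one per line
        cmd = ["ps", "-e", "-o", "comm="]
    output = subprocess.run(cmd, capture_output=True, text=True).stdout
    return sum(name.lower() in line.lower() for line in output.splitlines())

if __name__ == "__main__":
    print("sbsrender processes:", count_processes("sbsrender"))
```

Running this in a loop while reproducing the slowdown should show the "sbsrender" count climbing.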

A similar issue that may or may not be related: if I click on a wire between two nodes and press Tab to insert a new node, I can trigger this same slowdown behavior 100% of the time.

I'm wondering if there are some Designer settings I'm using that are incompatible with my computer setup? Would appreciate any help I can get!

I believe the majority of the random memory spikes are a problem unique to the package I was working in. I'm working in tutorial packages today and not having any problems creating/editing nodes.

However, the memory spike when adding a node on an existing wire still happens 100% of the time, no matter the file I'm in.

Hello @NAyres9,

Thank you for reaching out to us!

Based on the behaviour you describe, the sustained activity may be caused by the computation of Library thumbnails.

The content of the Library is displayed in the Library panel and the Node menu when searching for a node in the graph.
Thumbnails are computed when content is displayed on screen for the first time. This process involves computing and rendering the graph for this content if a custom icon was not set, as the graph's first output is used as an icon in this case. If the content is a bitmap, the entire bitmap needs to be processed.

Rendering these graphs can be an intensive process, yet this is generally not noticeable in the Library panel since only a few items are displayed at any given time.
However, when searching for a node using the Node menu, the list of matching results is refreshed each time the search text changes, and usually begins with quite a long list, which is then narrowed down to what the user was searching for. A long list means a lot of Library content being displayed at once, often for the first time, and that means an extensive series of thumbnail renders starts, putting a strain on the system.

All that being said, please note that the computation of thumbnails is a one-time process: thumbnails are only recomputed if the source content was modified, or if the user triggers a recomputation manually.

You may mitigate this impact by doing the following:
  • If you have added custom content to the Library, make sure this content is curated so it does not include undesired items or bloat
  • Before loading any graph, browse the various categories of the Library panel and scroll down the content progressively, waiting for all thumbnails to be computed
  • Combining the first two points above, make sure custom filters are set up for custom content so this content can be displayed in the Library panel – and its thumbnails computed – without the need to use the Search field or the Node menu
I hope this is helpful and informative; feel free to let me know the results!

Best regards,
Luca
QA Analyst
Substance Designer Team