Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Sergey Danchenko

Pages: 1 2 [3] 4 5 ... 10
31
I'm pretty sure that filling the tiles with a gradient based on their luminance values is possible on a conceptual level. It's just that the node may become pretty expensive, and some crafty algorithm would be necessary to make it fast.

I'm thinking about isolating and processing each individual luminance value from 1 to 255 (and thus the tiles sharing that value) and applying a gradient to them in separate steps. This way it should be pretty controllable and reasonably fast, as performing 255 relatively simple passes with a pixel processor shouldn't be too costly. I still need to test this out, though.  :D
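A rough numpy sketch of that per-value isolation idea (the function and LUT names are mine, not from an actual node; this assumes an 8-bit luminance mask where tiles sharing a value form one group):

```python
import numpy as np

def fill_tiles_with_gradient(mask, gradient_lut):
    """For each distinct luminance value (tiles sharing that value),
    assign a new value from a 256-entry gradient lookup table."""
    out = np.zeros(mask.shape, dtype=np.float32)
    for v in range(1, 256):          # one simple pass per luminance value
        selected = (mask == v)       # isolate all tiles sharing this value
        out[selected] = gradient_lut[v]
    return out

# Example: a linear 0..1 gradient as the lookup table
lut = np.linspace(0.0, 1.0, 256)
mask = np.array([[10, 10, 200], [200, 10, 50]], dtype=np.uint8)
result = fill_tiles_with_gradient(mask, lut)
```

Each of the 255 passes is just a comparison and a masked assignment, which is why the per-value approach stays cheap even though it loops over every possible luminance level.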

By the way, @Vincent Gault, any tips on the future of my randomize mask node? Would it make its way into the SD library? I'm asking because I've been withholding it from the Share for some time now, and if it won't, I'll go ahead and upload it there.

32
It seems that the node can't fill your tiles entirely with the parameter values you're using. Try bumping up the Fill Steps parameter a bit. Something like 16 should be enough (or, if it isn't, as much as needed to fill entire cells). You can also try increasing the Inner Resolution Divider a bit to help the node fill your cells faster. For a 2K graph, a value of 3 or 4 would be good. Generally, higher Inner Resolution Divider values let you get away with lower Fill Steps values.

33
Uploaded the node to the Share, it's pending review now.  :)

34
You have to flush the Engine\Plugins\Runtime\Substance\Binaries folder before the build. Otherwise, the plugin will crash the Editor.

35
It's available  :) The controls were simplified a bit, and it's an .SBS instead of an .SBSAR. https://forum.allegorithmic.com/index.php/topic,15281.0.html

36
Hello everyone,

As Substance Designer 6 introduced new bit depths for nodes (L16F and L32F), some new opportunities and new possible workflows come to life. To facilitate them, I've accidentally created a new node that I thought would be great to share.  ::)

The Auto Levels Plus node allows automatic remapping of an input image to a custom range specified by input parameters. Basically, it works just like you would expect from an Auto Levels node, but with two added benefits:

1) You can remap to a custom range, such as 0.5 - 1 or any other.

2) The node supports HDR data, so an input image can be of any available bit depth (L16F or L32F too).

I have tested the node for some time, but there could be bugs or corner cases where it may not work as expected. Please report such occurrences in this thread so I can take a look at them and possibly come up with a fix.

Additional considerations before using the Auto Levels plus node:

The node uses this formula to remap values: NewValue = (OldValue - OldMin) * (NewMax - NewMin) / (OldMax - OldMin) + NewMin. So, at its core, it should be mathematically accurate. However, finding OldMin and OldMax is quite an expensive operation, so at higher resolutions (2K+) performance starts to drop quite a bit.
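The formula above is a plain linear remap; a minimal sketch in Python (function name is mine):

```python
def remap(old_value, old_min, old_max, new_min=0.0, new_max=1.0):
    # NewValue = (OldValue - OldMin) * (NewMax - NewMin) / (OldMax - OldMin) + NewMin
    return (old_value - old_min) * (new_max - new_min) / (old_max - old_min) + new_min

# An input that spans 0.2..0.8, remapped to the full 0..1 range:
low = remap(0.2, 0.2, 0.8)   # old minimum maps to new_min (0.0)
mid = remap(0.5, 0.2, 0.8)   # midpoint maps to the middle of the new range
high = remap(0.8, 0.2, 0.8)  # old maximum maps to new_max (1.0)
```

The expensive part in a real graph isn't this arithmetic; it's scanning the whole image for OldMin and OldMax in the first place.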

To alleviate this, the node samples the OldMin and OldMax values at a lower resolution. There are some countermeasures, but in cases where this resolution optimization is too extreme for a given image, downsampling can cause OldMin and OldMax to be sampled inaccurately (slightly higher or lower than they actually are). Most of the time, however, the countermeasures mentioned above work well, so the node produces a mathematically accurate result at good speed for most cases.
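A conceptual sketch of why a lower-resolution scan is cheaper but can miss the true extremes (the stride-based sampling here is my illustration, not the node's actual countermeasure logic):

```python
import numpy as np

def estimate_min_max(img, divider=4):
    # Scan only a strided, lower-resolution view of the image. Much cheaper
    # than visiting every pixel, but an extreme that falls only on a skipped
    # pixel is missed, so the estimate can land slightly inside the true range.
    small = img[::divider, ::divider]
    return float(small.min()), float(small.max())

rng = np.random.default_rng(0)
img = rng.random((512, 512))
approx_min, approx_max = estimate_min_max(img, divider=4)
exact_min, exact_max = float(img.min()), float(img.max())
```

By construction the strided estimate can only err inward (approx_min >= exact_min, approx_max <= exact_max), which is exactly the kind of small bias the post describes.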

One more note: in preliminary tests I've spotted some strange occurrences when the image is remapped to a range other than 0...1. Some pixels could get a value slightly outside the new range. For example, if you remap to a range of 0.41 - 1, a pixel can get a value like 0.409998. I believe this is a precision issue from operating on Float values, but I'm not sure about it. As a workaround, I've decided to clamp such "stray" values; in practice, this shouldn't cause any problems, as the margin of error is minuscule.
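The clamping workaround can be sketched like this (again, my own illustration of the idea, not the node's internals):

```python
def remap_clamped(x, old_min, old_max, new_min, new_max):
    # Linear remap followed by a clamp: float rounding can leave a result a
    # hair outside the target range (e.g. 0.409998 instead of 0.41), so any
    # "stray" value is pinned back to the range bounds.
    y = (x - old_min) * (new_max - new_min) / (old_max - old_min) + new_min
    return min(max(y, new_min), new_max)

value = remap_clamped(0.0, 0.0, 1.0, 0.41, 1.0)  # guaranteed >= 0.41
```

Since the error is only a few ULPs, the clamp changes nothing visible in the output; it just restores the guarantee that results stay inside the requested range.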

Some images to illustrate the node are below and here is a download link:
https://forum.allegorithmic.com/index.php?action=dlattach;topic=15281.0;attach=23839

Cheers!

37
At its core, it should be mathematically accurate. This formula is used to remap values: NewValue = (OldValue - OldMin) * (NewMax - NewMin) / (OldMax - OldMin) + NewMin.

However, finding OldMin and OldMax is quite an expensive operation in this implementation, so at higher resolutions (2K+) performance starts to drop quite a bit. To alleviate this, the node samples the OldMin and OldMax values at a lower resolution. There are some countermeasures, but in cases where this resolution optimization is too extreme for a given image, downsampling can cause OldMin and OldMax to be sampled inaccurately (slightly higher or lower than they actually are). Most of the time, however, the countermeasures mentioned above work well, so the node produces a mathematically accurate result at good speed for most cases.

One more note: I've spotted some strange occurrences when the image is remapped to a range other than 0...1. Some pixels could get a value slightly outside the new range. For example, if you remap to a range of 0.41 - 1, a pixel can get a value like 0.409998. I believe this is a precision issue from operating on Float values, but I'm not sure about it. As a workaround, I've decided to clamp such "stray" values; in practice, this shouldn't cause any problems, as the margin of error is minuscule.

In the end, I've decided to release this node to the public as Auto Levels Plus. I think that auto levels functionality with the ability to specify a custom remap range is pretty cool in itself, with HDR data support coming as an extra boon :) I'll post here when it's live.

38
Hey,

I've tried to put together a node like that.  :D Basically, it works exactly as you've described: it finds the min/max values in an input image and remaps them to a new range. By default it remaps to 0...1, but I've exposed the variables so you can choose another range if you'd like.

It's a preliminary result, so some bugs are to be expected. I tested it on HDR panoramas, and it seems to be working. Check whether it works for you; the node is attached below (for now, it will work best at resolutions of 1K or 2K).

39
Understood. Thank you for your hard work!

40
Splendid! Reimport seems to be fully functional now ;D Unfortunately, I'm still getting a crash every time I try to undo a parameter change. I wonder if it's just me or a known bug?  :P

41
Got another issue: the Unreal Editor crashes every time on Undo for a substance instance parameter change. Posting it here to keep track of it. :P

42
Huh, indeed, it's good timing  ;D Thank you, Dan, I'll be looking forward to the update.

Regards,
Sergey

43
Hi,

I'm looking for tips on how substance reimport works in UE4. I would like to update an SBSAR already imported and used in a UE4 project by making changes to the published substance (SBSAR) in Substance Designer. Ideally, after using the Reimport menu option, the substance in UE4 would be updated, the substance material would remain assigned to objects, and the tweaked substance parameters would stay intact. Unfortunately, that's not what happens for me at the moment.

So, my questions:

1) If I'm not mistaken, to reimport a substance in UE4 I should use the Reimport menu option on the Substance Factory asset. Is this correct?

2) When I reimport the substance factory asset, a dialogue window pops up asking to overwrite files. I suppose I should answer YES. Is this correct?

3) For what purpose should I reimport the substance instance asset, and why does it crash the Editor when I try to do so (UE 4.14.3 and the new Substance plugin)? A bug, perhaps?

4) When I reimport the substance via the substance factory asset, it creates a new material instance, so I have to reconstruct the material network and reapply the material to the meshes where it was used before. Is it not possible to keep the original material during the reimport process and just update the substance instance and the textures generated from it?

5) Is it possible to keep the tweaked parameters in the substance instance on reimport? As of now, they reset to defaults.

Thanks in advance!

44
Hello everyone,

I've made a small update to the node and reworked the controls for the random opacity mask output (the bottom one). It's now possible to control the luminance range of the cells included in the opacity mask, offset it by a specified value, and control the binarity of the mask. Basically, it's like taking a "slice" of the luminance mask in a given range, so it should now be much easier to produce a random opacity mask for various parts of the input mask.
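The "slice" idea can be sketched in numpy roughly like this (function and parameter names are my own illustration, not the node's actual controls):

```python
import numpy as np

def luminance_slice(mask, range_min, range_max, offset=0.0, binary=True):
    # Take a "slice" of a 0..1 luminance mask: only cells whose (offset)
    # luminance falls inside [range_min, range_max] enter the opacity mask.
    shifted = np.clip(mask + offset, 0.0, 1.0)
    inside = (shifted >= range_min) & (shifted <= range_max)
    if binary:
        return inside.astype(np.float32)       # hard 0/1 opacity mask
    return np.where(inside, shifted, 0.0)      # keep luminance inside the slice

cells = np.array([0.1, 0.45, 0.55, 0.9])
opacity = luminance_slice(cells, 0.4, 0.6)
```

With `binary=False` the slice keeps the original per-cell luminance instead of a hard cutoff, which is one way to read the "binarity" control described above.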

Download Randomize_Mask_v2.zip from here: https://forum.allegorithmic.com/index.php?action=dlattach;topic=5158.0;attach=23677


45
Nice find, Pawel! With some added controls for gradients and a way to prevent tiling issues, it could be an almost complete solution.
