Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Tok Tok

Nobody knows a solution to this?


Before I built my material scanner I used PixPlant to create my textures and, more importantly, to get them to tile well. This worked pretty well, but it doesn't work with my scanned textures because you cannot generate the diffuse, normal and height maps at the same time. So I'm wondering: is there a similar feature in Substance Designer?

The problem I'm having now is that when I scan a piece of fabric it tiles very well, but when I zoom out you still see a repeating pattern. The obvious solution is of course to scan bigger pieces of fabric, but then I lose detail, and my samples are only 10x20 cm. Another solution would be to scan/process different parts of the piece, say the front and the back, and use SD to stitch the parts together, creating different seeds that can generate a big texture randomly. Is this possible in SD?
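To illustrate the kind of randomization I mean, outside of SD: pick one of several scanned variants per grid cell and give it a random rotation, so no fixed pattern shows on zoom-out. This is just a NumPy sketch of the idea; the tile arrays are placeholders for the scanned crops, not a real SD workflow.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder stand-ins for scans of different parts of the fabric
# (front, back, ...); in practice these would be loaded from files.
tile_a = rng.random((64, 64, 3))
tile_b = rng.random((64, 64, 3))
variants = [tile_a, tile_b]

def assemble(variants, grid=(4, 4), rng=rng):
    """Build a large texture by randomly choosing a variant per cell
    and randomly rotating it, breaking the visible repetition."""
    rows = []
    for _ in range(grid[0]):
        row = []
        for _ in range(grid[1]):
            t = variants[rng.integers(len(variants))]
            t = np.rot90(t, k=rng.integers(4))  # random 90-degree rotation
            row.append(t)
        rows.append(np.concatenate(row, axis=1))
    return np.concatenate(rows, axis=0)

big = assemble(variants)
print(big.shape)  # (256, 256, 3)
```

Rotations only work like this if the tiles are square and the material has no strong direction (weave direction would give it away); for directional fabrics you'd stick to swapping variants.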



Thanks, that's what I wanted to know. Using a 16-bit format results in better-quality images.

Also, good tip about not using compression on the TIFF files.  8)

No, actually in one of the last steps before going into SD they save all the images as TIFFs.

I'm currently scanning material with my new material scanner and process the photos I take through Lightroom for white balancing and color correction. After that, I save the images as 16-bit TIFFs to preserve as much light information as possible. But these files are big and take a lot of time to open and process in Substance Designer. So my question is: does Substance Designer need 16-bit images to process the multi-angle images correctly? Or can I also save PNGs or even JPEGs from Lightroom and get the same quality? In the end, I just save JPEGs from SD.
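For reference, the difference in tonal resolution per channel is easy to work out. JPEG only stores 8 bits per channel, while TIFF and PNG can store 16, so going to JPEG before processing throws away precision permanently:

```python
# Quantization levels per channel at each bit depth
levels_8 = 2 ** 8    # 256 steps  (JPEG, standard PNG)
levels_16 = 2 ** 16  # 65536 steps (16-bit TIFF/PNG)

# Smallest representable step as a fraction of the full range
step_8 = 1 / (levels_8 - 1)
step_16 = 1 / (levels_16 - 1)

print(step_8 / step_16)  # 257.0 -- 16-bit steps are 257x finer
```

So 16-bit mainly matters for the intermediate processing (where shadows get lifted and maps get derived); exporting the final maps as 8-bit is usually fine.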

I finished the material scanner and posted the first results in this topic:

Substance Designer - Showcase - Material scanner finished
 on: March 02, 2019, 01:53:33 pm 
So I just finished the material scanner I built to scan fabrics (and other stuff, but mostly fabrics), and the results are amazing. I'm very happy with the detail that can be achieved with this method! This is rendered in a very simple scene with only an HDRI, just to see the results quickly. Take a look:

The next step would be to add hairs to make it even more realistic.

At the moment I'm still building (very busy with viz work) but when it's finished I'll share the results.

What is Substance Alchemist, do you have more information on that? The beta page looks really interesting!

Yes, I guess that's true, but at the same time Substance Designer should be able to recover the color information that sits in shadow in one picture from the other pictures.
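The idea, crudely put: a pixel that is shadowed in one shot is usually lit in another shot taken with the light at a different angle. A toy per-pixel-maximum sketch of that principle (this is not what SD actually does internally, and the arrays are made-up stand-ins for real shots):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up stand-ins for the same surface lit from different angles;
# each shot has a different region pushed into shadow.
true_color = rng.random((8, 8, 3))
shot_left = true_color.copy()
shot_left[:, :4] *= 0.2   # left half in shadow
shot_right = true_color.copy()
shot_right[:, 4:] *= 0.2  # right half in shadow

# A pixel shadowed in one shot is lit in the other, so the per-pixel
# maximum over all shots recovers the unshadowed color everywhere.
recovered = np.maximum(shot_left, shot_right)
print(np.allclose(recovered, true_color))  # True
```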

Anyway, I'm still building and went with the soft-light approach. We'll see what the results will be. :)

Thanks, that's what I wanted to know. I have to change my design a little bit; a soft light is harder to build than just a spot. :)

In a YouTube video you can see someone putting a piece of leather in a boxed 3D scanner with soft lights. Are there any specs/tips on how you made the gradient soft light in there?

I don't have a license for Substance Designer yet, so I don't know much about it (best algorithm, etc.), but I will buy it for this project. I'm very eager to test out the scanner once it's done!


I'm not sure if I put this in the right section but there wasn't a section for scanning materials.

I'm currently building my own material scanner, following the blog of Dave Riganelli and the Allegorithmic blog, but I see one big difference between the two. Dave Riganelli uses hard lights to light the material, while the Allegorithmic blog says you need a big soft light for the best result. So before I start building my scanner, I'd like to know the difference between the two options and which one is best.

Does anybody have experience with this?

 8) J
