Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - dessert_monkeyjk

Pages: [1] 2
1
Well, this is not what I expected. I upgraded to a new PC and was reinstalling software, Substance Designer 2017.1.4 included, and when I opened a package I was recently working on with my older PC, I see this error (see attached).

Now, I normally see this when opening a file edited with a newer version, but this seems to be the opposite scenario, which is rather bizarre. I know for a fact that I used SD 2017 for this file, so I'm rather confused as to how this is occurring.

Anyone know why this could happen?

Edit: So I decided to try changing the version number, both the formatVersion and updaterVersion values, to "1.1.20170" using Notepad++ to see how Designer reacted. It picked up on it and offered to upgrade it to "1.1.20174", but upon doing so I get this error (see attached).

I have a few 2017 files and they all have roughly the same issue: one complains about a type variable (output size in one instance) and others about comptype (the value is either an integer set to 1 or 2, nothing unusual). These values also exist in 5.6 files, so I'm puzzled as to what's preventing 2017.1.4 from reading the files past these variables.
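For anyone else poking at this, the version values can be read without hand-editing in Notepad++. A minimal Python sketch, assuming the .sbs is plain XML and stores the versions as `v` attributes on `formatVersion` and `updaterVersion` elements (which is what my files look like; treat the element and attribute names as assumptions):

```python
import xml.etree.ElementTree as ET

def read_sbs_versions(xml_text):
    """Return (formatVersion, updaterVersion) from .sbs XML text.

    Assumes the package stores them as <formatVersion v="..."/> and
    <updaterVersion v="..."/> somewhere under the root element.
    """
    root = ET.fromstring(xml_text)
    fmt = root.find(".//formatVersion")
    upd = root.find(".//updaterVersion")
    return (
        fmt.get("v") if fmt is not None else None,
        upd.get("v") if upd is not None else None,
    )

# Tiny example document mimicking the layout described above.
sample = """<package>
  <formatVersion v="1.1.0.201701"/>
  <updaterVersion v="1.1.0.201701"/>
</package>"""
print(read_sbs_versions(sample))
```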


2
Since the model you're texturing is already a high poly mesh, did you check the option under Common called "Use Low Poly Mesh as High Poly Mesh" when baking the AO map? This makes Painter use the model you're painting on as the model to bake textures from. The only reason to leave this unchecked is when baking the ID map from an FBX model with the ID colors set up.

Another thing to check is your high poly and low poly suffixes. They are case sensitive and rely on the object naming in the FBX file. See if that is mentioned anywhere in the tutorial you're following.

Hopefully that helps a little.

3
I just found something cool on the internet today that might interest a few people. While I'm not sharing this on their behalf, I figured it was too cool not to share here.



An artist by the name of Phillip on ArtStation is working on a plugin called Stack N Pack. What it essentially does is act as an alternative to the built-in exporter: it can take multiple texture sets and pack them into a single set. For those of you who use multiple texture sets just to avoid painting on certain parts and want to combine them afterwards, this might be of use to you.

There are some additional features as well, which are listed in Phillip's blog post here. As of writing, it's not available yet, but they're looking for beta testers, so if you want to give it a try, feel free to contact them.

Personally, I wish I had this before, since in the past I had to do round trips from Painter to Designer to combine texture sets any time I changed something. Maybe that could be improved in the future, but this plugin does the trick for now.

4
With more and more files being uploaded to Substance Share and newer versions of Substance Designer and Substance Painter coming out, compatibility of these files is becoming a concern for me.

I know when you upload a file to the site, you can set what software and version it was created with (Substance Designer 5.6, Substance Painter 2017.1, etc.). However, this information isn't displayed on the file page where others can download and comment on it. It would become very annoying to download something awesome only to have it not open because you didn't know it wasn't compatible with the software version you have. Upgrading isn't always an option for everyone either, especially when working in a studio where files need to be compatible between machines.

Simply put, can this information actually be displayed so others can see it? An additional idea is a filter option so only files made with a certain version or lower are shown, e.g. filtering to show only files made with Substance Designer 2017 or lower.

Just thought I would bring it up for consideration.

5
I hate to bump the topic, but I just wanted to let you know that this has been happening frequently for me recently with Substance Painter 2017.4. About 4 files so far have met this fate, and it's becoming a growing concern for me. Oddly enough, I can still see the texture sets when opening the files, just no model.

Luckily, the files from the autosave plugin work without issue every single time (though they're strangely larger than the main file), so not all is lost. Still, I had to crank the autosave interval as low as it can go (10 minutes) to deal with this issue. Annoyingly, the autosave timer resets any time I save manually, which I do frequently.

Anyway, just wanted to give a recent update on the issue. Hopefully others aren't as unlucky as me.

6
I've been curious about making use of the new widgets selectable in the 2D View, introduced in Substance Designer 6, for custom graphs of my own, but I'm struggling to get them to work. Nodes like Cube 3D and Multi Clone Patch have such widgets, with the latter having more than one.

There's no mention anywhere in the docs or changelog of how to use this, and I've tried exposing parameters for both the transform matrix and offset in the hope that a transform widget would kick in, but no such luck. Do I need to name the parameters in a certain way, tag them, etc. for this feature to work, or are widgets restricted to the nodes that ship with Substance Designer as is?

7
Yeah, the original image size remains intact when the Substance package is published, and the Bitmap nodes that are generated are set to absolute by default from what I'm seeing. I just wish I could get the image size more easily as a variable and use it to set the output size dynamically per graph.

Right now I'm setting the size manually by checking what the texture size is per texture set and setting the output size that way. While that works, imagine doing that manually for a lot of these and you can see why I want to automate that step as well.  :-\

Unless there's a way to do that using the batch tools, I'll just have to use ImageMagick to get the size and use an if statement of some variety to set the power-of-2 size based on the image size and work with that. The batch script won't be as portable with an additional dependency, but it'll do for now. Thanks for the assistance, and feel free to post a solution if you have one in case others are having the same problem.
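For reference, here's roughly what I mean, sketched in Python rather than batch. The `identify` call is ImageMagick's real tool, but the file path and the snap-to-nearest-power-of-two behaviour are my own assumptions about what's wanted:

```python
import subprocess

def nearest_pow2(n):
    """Snap a pixel dimension to the nearest power of two (ties round up)."""
    if n < 1:
        return 1
    p = 1
    while p < n:
        p *= 2  # first power of two >= n
    # Check whether the power of two below is actually closer.
    return p if (p - n) <= (n - p // 2) else p // 2

def image_size(path):
    """Fetch width/height via ImageMagick's identify (external dependency)."""
    out = subprocess.check_output(
        ["identify", "-format", "%w %h", path], text=True
    )
    w, h = out.split()
    return int(w), int(h)

# Usage sketch (requires ImageMagick on PATH and a real image file):
#   w, h = image_size("basecolor.png")
#   print(nearest_pow2(w), nearest_pow2(h))
```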

8
I think this happens because the Transform2D node resolution is set relative to the output size of your graph, and the output size of the graph is set to 256x256 by default. When you use sbsrender to render the outputs of the graph, the graph size is set to the resolution you pass on the sbsrender command line, and the Transform2D nodes inherit that. So try just ignoring this issue and calling sbsrender.exe with the resolution you want; I think it should do what you want.

Thanks for the reply. While that should work, the only issue is rendering an image larger or smaller than it actually is. I'm curious if there's a way to get the size of an image and then use that info to set the render size for sbsrender. There's probably a way to get metadata about an image file, either directly or from the image resource in the .sbs package, but I'm not sure how.

Edit: I have found that a command-line tool from ImageMagick can fetch this info, which might be a solution, but it requires installing it just to get something the Substance package already knows about. Seems silly, but that may be an alternative if the Substance batch tools can't fetch it easily.

9
Hello, I've been working on making use of the Substance Automation Toolkit (AKA Substance Batch Tools ???) using batch scripts (because I have barely used Python), and aside from figuring out what some of the tools such as sbsmutator and sbscooker want in order to work, I've managed to generate .sbs and .sbsar files with little issue.

I'm having a small issue with sbsmutator when using the specialization and --connect-image options, though, and I'm not sure how to handle it since there's nothing pointing this out in the documentation. What's currently happening is that for every image added using the --connect-image option, a Transform 2D node is also added that downsizes the image to 256x256, which is not desirable.  :-\ The resulting bitmap nodes use the correct size by default, but everything else in the graph does not.

Is there any way to correct this, such as removing said nodes, or getting the size of the input images and setting the size of the graph accordingly? Ideally, I want input size == output size rather than having everything downsized to 256x256. Once I can do that, we're in business.
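In case it helps anyone attempting the same thing, here's the direction I'm leaning, sketched in Python. The `--set-value "$outputsize@w,h"` syntax (log2 exponents) is how the Automation Toolkit docs describe overriding a graph's output size, as far as I can tell; the package and output paths are placeholders:

```python
def outputsize_exponents(width, height):
    """Convert pixel dimensions to the log2 exponents $outputsize uses
    (assumes power-of-two dimensions)."""
    return width.bit_length() - 1, height.bit_length() - 1

def sbsrender_command(sbsar, width, height):
    """Build an sbsrender invocation that forces the graph output size to
    match the input image size.  Paths here are placeholders."""
    wx, hx = outputsize_exponents(width, height)
    return [
        "sbsrender", "render",
        "--input", sbsar,
        "--set-value", "$outputsize@{},{}".format(wx, hx),
        "--output-path", "out",
    ]

print(sbsrender_command("material.sbsar", 2048, 2048))
```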

10
...I don't mean to be rude, but that makes no sense. If I erase the unwanted part of the normal map after applying a stamp like this, it blends just fine, rough but fine, whether the normal channel on the layer is set to combine the normal detail (Normal map Detail, Normal map Combine) or copy over it (Normal, Replace). It's when the blend mode for the Normal channel is set to anything other than combining normal maps that it becomes problematic.

I'm guessing this is just a quirk specific to painting in the normal map channel at the moment, with the texture alpha not being utilized? From the looks of things, the texture alpha works when painting in other channels, just not in the normals channel... which makes me even more confused.

Sorry if I'm being persistent, just wanted to be thorough is all.

11
Hey fellas, I've got a question about painting with textures that come with an alpha channel. Is it possible to use this channel in some way as a brush alpha in Painter, or do I have to have a second texture just for the brush alpha?

For example, let's say I'm using one of the normal map textures that come with Substance Painter 2 that includes an alpha channel. In the attached images you can see what it looks like before and after the texture is applied via the brush tool, and how I wind up painting the entire texture instead of just the part within the texture alpha itself.

If there's a way to make use of the texture alpha as the brush alpha when painting, that would be very useful. Otherwise I would have to make a separate image for every texture I want to paint with that has an alpha channel... and that would suck.

Edit: Just noticed I posted this in the wrong board (Substance Designer)... whoops.

12
Ah okay, good to hear.
Also, you're on 5.6 already? I guess I updated too soon ;D Eager to see what's new and all that.

13
I recently updated Substance Designer from 5.4 to 5.5.3, and I have some custom filters set up which came over just fine. However, when checking things in the Library I noticed some odd behavior when selecting a Folder in the Library. You can see for yourself in the attached image.

As you can see, when selecting a folder it filters through ALL the content in the Library instead of just the content determined by the Filters in the selected Folder. Not only that, but every time I select a folder, every Filter in the folder gets a condition assigned to it set to All > Base Name > Contains > "". In other words, it's set to filter anything with a blank name.

This bug doesn't actually break anything; it's more of an inconvenience than anything. Still, it's unwanted behavior, so I thought I would let you know.

14
Hello there, I'm taking a stab at creating a vector map of sorts using the Pixel Processor node, and I'm wondering if anyone can shed some light on this.

What I'm essentially trying to do is create a vector map that defines the "up" direction of the UVs after a UV image has been passed through a Tri-Planar node. The idea is to use said map to solve the issue that arises when passing a normal map through said Tri-Planar node, where the normals no longer face the correct way. I attached a screenshot of where I'm at in terms of the UV map projection.

I fiddled around with trying to get this vector from the resulting projection (screenshot also attached for that), and so far I'm scratching my head over how to derive a vector from it... perhaps I just need to get a unit vector for the pixels themselves or something? I'm not sure.
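To help frame the problem, here's the kind of per-pixel math I imagine the Pixel Processor graph would implement, sketched with numpy. The assumption (mine, not verified) is that the vector map stores, per pixel, a unit XY vector for the projected "up"/V direction of the UVs, and that rotating each normal's XY components by that vector is enough to re-orient them:

```python
import numpy as np

def reorient_normal_xy(normal, up_xy):
    """Rotate the XY part of a tangent-space normal map so local +Y
    follows the per-pixel "up" vector from the tri-planar projection.

    normal: HxWx3 array, already decoded to [-1, 1]
    up_xy:  HxWx2 array of unit vectors (ux, uy); (0, 1) means no rotation
    """
    nx, ny = normal[..., 0], normal[..., 1]
    ux, uy = up_xy[..., 0], up_xy[..., 1]
    # 2D rotation matrix [[uy, ux], [-ux, uy]] maps (0, 1) onto (ux, uy).
    rx = uy * nx + ux * ny
    ry = -ux * nx + uy * ny
    return np.stack([rx, ry, normal[..., 2]], axis=-1)
```

With up = (0, 1) the normals pass through unchanged; with up = (1, 0) they're rotated a quarter turn, which is the behaviour I'd expect per projection axis.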

Any assistance on this would be greatly appreciated, and hopefully we can use this to resolve these normal map Tri-Planar shenanigans for the most part. It would make it more viable to project with the normal map as opposed to a heightmap, which doesn't look as good.

15
From the looks of things, it appears your roughness map is exported as its own map. Unity 5 expects this map to be in the Metallic map's alpha channel when exported, not separate, which, from the looks of things, is why everything is uniformly smooth looking in Unity. In other words, the material isn't really metallic, just super smooth looking.
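If you ever need to do the packing by hand rather than through an export preset, the operation is just putting smoothness (1 - roughness, since Unity's Standard Metallic setup stores smoothness rather than roughness) into the metallic map's alpha. A minimal numpy sketch of that idea, with array shapes as my own assumption:

```python
import numpy as np

def pack_unity_metallic(metallic_rgb, roughness):
    """Pack Unity 5 Standard (Metallic) textures: RGB = metallic,
    A = smoothness.  Unity stores smoothness, so roughness is inverted.

    metallic_rgb: HxWx3 floats in [0, 1]
    roughness:    HxW   floats in [0, 1]
    """
    smoothness = 1.0 - roughness
    return np.dstack([metallic_rgb, smoothness])
```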

There should be an option in the Export dialog, in the Config drop-down, called "Unity 5 (Standard Metallic)". You can see more on how to use it in the documentation here: https://support.allegorithmic.com/documentation/display/SPDOC/Exporting+textures

Hopefully this helps solve your issue :)
