Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - paradox00

1
Hello!

I have been learning a lot about the Pixel Processor lately, especially thanks to this guy https://www.artstation.com/andreizelenco and the Facebook group he created to share knowledge.

Creating Pixel Processor functions is great, but there is what I think is a major drawback right now: there is no way to make loops. You end up doing what Andrei shows in his header image, repeating the nodes that are supposed to be in the loop as many times as you need, which, as you might imagine, is cumbersome and far from convenient.
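To make the drawback concrete, here is a rough Python-style sketch (purely illustrative, not the Pixel Processor's actual function system) of what the copy-paste workaround amounts to versus what a loop node could express:

```python
# Purely illustrative sketch -- not the Pixel Processor's actual API.
def body(value):
    """Stand-in for the chain of nodes that makes up the loop body."""
    return value * 0.5 + 0.25

# Today: the body is duplicated in the graph once per iteration.
def unrolled(value):
    value = body(value)
    value = body(value)
    value = body(value)  # ...copy-pasted as many times as needed
    return value

# With a loop node: one body, and the iteration count is a parameter.
def looped(value, iterations=3):
    for _ in range(iterations):
        value = body(value)
    return value

assert unrolled(1.0) == looped(1.0)
```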

Having a loop node in the Pixel Processor seems possible, as we already have something like it in the FX-Map node.

The idea could be something like Houdini, where you have a node that defines and contains a loop, or a 'solver' node that calls a function.

Whatever is best from your perspective.

In any case, having this would be awesome, and I would like to know how to make a feature request (and hope it gets implemented, hahaha).

Thanks.

2
Substance Designer - Discussions - VR considerations
 on: August 21, 2017, 07:03:51 pm 
Hi! I hope everyone is doing great and creating awesome art.

I was wondering if anyone has some knowledge to share regarding substances in VR. Not textures from Designer, of course, but actual substances imported into Unreal for use in VR. Maybe someone has something to say about performance, good practices, etc.?

I would like to gather any info before running my own tests.

Thanks!

3
Substance Designer - Discussions - Designer for VR
 on: November 08, 2016, 11:51:24 pm 
Hi folks! Looking for some guidance.

I'm about to start a VR project for PC using UE4, and I would like to know if you have some references about the optimization side of things when using Designer. I do know the page where the general optimization guidelines for Designer are, but I would like any comments or help regarding Designer for VR, especially comparing using the substance versus the actual textures, and things like that.

Of course, I already found some information on Google, but maybe you can comment and point to more specific references, or talk about your experience with the medium.

Thanks!

4
Copying this from the thread on optimization.

On the topic of optimization, I read the newly published optimization guidelines and I still have some doubts.

I started a substance our team is aiming to use in-game. We need to texture hard-surface props, which are the main objects in the game and have different materials on them. The idea is to nail down our texture/material pipeline so it is optimized for VR and mobile from the very start, even though for now the game is aimed at PCs. The art direction allows us to aim for general optimization from the beginning.

I tried two approaches, one which to me seems minimal (even though it uses more nodes) and one more streamlined:

The minimal approach: create masks from the ID map using the Color to Mask node, then create a bunch of Blend nodes to combine each mask accordingly. That is, imagine there are 3 materials on the prop; I then use 3 Color to Mask nodes, a section with 3 Blend nodes for color, another section with 3 Blend nodes for roughness, and similarly for metallic (there is no height, and the normal and AO are baked). Each Blend node combines the required color with its mask, then with the next color, and so on.

The streamlined approach is to use the Multimaterial Blend. In this case I have a custom node with exposed parameters to define the materials I need, then combine them using the Multimaterial Blend.
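Conceptually, both approaches compute the same per-pixel blend chain; here is a rough numpy sketch of that math (names are illustrative, this is not Designer's API), which may help when reasoning about the cost of each extra material:

```python
import numpy as np

# Illustrative sketch of the per-pixel math behind the blend chain.
# Each mask_i comes from a Color to Mask node, each color_i is one
# material's value; all arrays are floats in [0, 1].
def blend_chain(base, colors, masks):
    result = base
    for color, mask in zip(colors, masks):
        # one Blend node: lerp between the running result and the new color
        result = result * (1.0 - mask) + color * mask
    return result

h, w = 256, 256
base = np.zeros((h, w, 3))
colors = [np.full((h, w, 3), c) for c in (0.8, 0.3, 0.5)]    # 3 materials
masks = [np.random.rand(h, w, 1).round() for _ in range(3)]  # stand-in ID masks
albedo = blend_chain(base, colors, masks)
```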

Judging by the time of each node on the graph, it seems that the minimal approach is better, but I want to be sure of how to judge these kinds of constructions.

On the other hand, I have a similar question regarding the Mask Builder (and nodes derived from it). I created a simple two-Blend-node setup using the curvature and AO (plugged into the mask input of each Blend) to create wear on our props, which mostly works. Time-wise it is faster than the Mask Builder, but I need a slightly more complex mask creator, probably adding two more Blend nodes. Is it worth creating a custom one, or am I wasting time and should just use the Mask Builder?

In general:

How much does the time displayed on the nodes matter when considering optimization?

Is there any way of prioritizing which nodes I should primarily use and which are meant to be used only if needed (or not at all)? Something along the lines of: "Blend, thisNoise, thatNoise, thatNode... are usually OK to use, but forget about these other ones." For example, the Grunge Maps are a bad idea if you want to use the substance in an engine (even more so on mobile), right? Which nodes are as bad, or in the next tier of "badness", and so on?

I hope I explained myself.

Thanks.

5
Hi! I haven't found a proper conversation about this subject. If you can redirect me to the proper thread, if there is one, that would also be great.

Here is the thing:

- Are there any sorts of guidelines to decide when it is more optimal, in terms of computation time (vs. file size or customizability, maybe?), to export the textures rather than publishing the substance directly to the engine?

- What are the proper things to take into account when making the call that publishing the substance is better than simply exporting the textures?

For example, this graph is rather complex https://twitter.com/lobachevscki/status/746784097066954752 and I sincerely don't know if it is optimal (in whatever terms should be considered) to publish it to Unreal rather than simply export the textures the graph produces and tweak them in Designer for further iterations: https://twitter.com/lobachevscki/status/746780042638790656

I'm in some kind of conundrum because the game I'm working on made me ask this question, due to the kind of assets we are delivering.

I saw that people at Naughty Dog chose to export textures from Designer (to my knowledge), but since there is the possibility of publishing, I wonder what considerations should be taken into account to make the call.

I hope my point gets across.

Thanks in advance.

6
Hi, I'm trying to recreate a pattern that looks like the one on the left in the image below:



I pretty much nailed the pattern shape, but now I need to crop it, because the reference tile is actually pretty tight and packed together.

When I input the pattern above into the Tile Generator, it (obviously) creates huge spaces between patterns. It would be awesome if there were a way or workflow to reduce the amount of space indicated in red in the left image (it actually applies to the borders on the sides as well).

Cropping the Blend nodes I used to create the shape doesn't seem to work at all. Trying to tile it by hand using a Transformation 2D after creating the pattern doesn't work either (at least I didn't make it work), because the shape is not a square and it has no offset in the reference tile. Any changes (size, interstices, etc.) in the Tile Generator destroy the pattern, turning it into a squarish shape, and that's not the idea.

Do I have to export the pattern, crop it in Photoshop, and import it back? That seems like the only solution to me.
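If exporting really is the only way, the crop itself could at least be scripted instead of done by hand in Photoshop. A minimal sketch with Pillow, assuming the pattern sits on a black background (the file names are made up):

```python
from PIL import Image

# Crop an exported pattern to the bounding box of its non-black pixels,
# so the Tile Generator has less empty space around the shape.
img = Image.open("pattern.png")
bbox = img.convert("L").getbbox()  # bounding box of non-zero pixels
if bbox:
    img.crop(bbox).save("pattern_cropped.png")
```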

Any suggestions?

Thanks.

7
Hi,

A few days back I was wondering how to export masks from Substance Painter. I found a plugin that does exactly that, but I needed to upgrade to Substance Painter 2.1. I did so, but now it is not recognizing the plugin. I copied it into the plugins folder as the instructions say, but it doesn't work; I'm not seeing the button that is supposed to appear once the plugin is recognized.

What am I doing wrong? It's just copying the plugin into the folder, right?

Thanks

8
Hi everybody.

I made a mask in Substance Painter; it looks like this:



But when I export the mask by clicking on it and exporting, the result is this one:



I was expecting the exact same mask I highlighted in red in the first image, with the black background and the two shades of grey on the UV shells. That is the actual mask, not the result I'm getting. The result isn't useful at all.

What am I doing wrong? Or is that just the way it works?

Thanks!

9
Hi, I know there is a thread for this here https://forum.allegorithmic.com/index.php?topic=79.0 but when I was about to reply, the page warned me about a 120-day timeframe, so I took its word and started a new thread.

I got the GPU-based plugin mentioned in the old thread and followed the instructions. I think I followed them correctly, but I still can't manage to bake 4K from Maya.

Any ideas of what I should check? I know I must be doing something wrong, but I don't know which step I'm missing.

Thanks

10
Hi!

My imported substance in Maya is working just fine, but I do have a concern: the diffuse output of my graph looks a lot more desaturated in Viewport 2.0 compared with the original image I'm feeding it and with the baked output.

Some images to explain the issue:

Viewport 2.0 look. The left one is the original map plugged into a Lambert, the center one is the substance attached to a Lambert and fed with the same diffuse as the first, and the right one is the baked substance.



The networks in Maya



The graph in Substance Designer. The original one is more complex, but for the sake of the example I just plugged one of the inputs into the only output; it can't get simpler than that.


What's happening, and how can I fix it? For this particular graph, which is part of a tool, it is particularly important to get the colors right in the viewport.

Thanks

11
I'm puzzled by something that I think should be obvious, but it isn't for me.

I have two graphs, let's say graph Main and graph Utility. Main uses Utility 3 times, and it is published to be used in Maya.

Utility does something, and the only parameter that needs to be exposed is the Color of a Color to Mask node. I will call that parameter Utility_color.

The idea is to use Utility more than once inside Main because there are at least three masks to be edited in Maya. The goal is to give the user the ability to change the color of each mask; that is to say, if I have 3 Utility instances inside Main, you could assign 3 different colors to 3 different parameters inside Maya.

The problem is: I created three variables (let's say Color_1, Color_2, Color_3) inside Main and passed one to each Utility instance, but the only parameter exposed inside Maya is Utility_color. I was expecting Maya to show me Color_1, Color_2, and Color_3 as parameters, but I haven't yet found a way to make that happen.
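For what it's worth, the behavior I was expecting works like function arguments. A purely illustrative Python analogy (not Designer's API): only the published graph's own parameters should be visible to the host, with each instance receiving its forwarded value:

```python
# Purely illustrative analogy -- not Designer's API.
def utility(utility_color):
    """Inner graph: Utility_color is local to each instance."""
    return f"mask({utility_color})"  # stand-in for the Color to Mask node

# Main is the published graph, so only its own parameters (Color_1..3)
# should be what Maya exposes; each one is forwarded to one instance.
def main(color_1, color_2, color_3):
    return (
        utility(utility_color=color_1),
        utility(utility_color=color_2),
        utility(utility_color=color_3),
    )

print(main("red", "green", "blue"))
```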

Could you help me with this problem? I hope I've explained myself clearly.

Thanks.

12
Maya + Substance Painter + Unity 5
