Using Blender to texture G8.
So one of the big things I was looking forward to when moving all my figures into Blender was the idea that I could make all of their skins procedurally from scratch using Blender nodes, as well as making them very modular by using node groups. Essentially I can just drop a custom-made "Skin BSDF" into the shader editor and pick the base skin tone of the figure. And so far that has been successful! But I have run into a bit of a roadblock in this process.
When you import a daz figure into Blender, the UVs are already unwrapped decently enough, but the issue is the placement of the UV islands.
They are all placed in such a way that only the face UVs are actually where they need to be. The rest are just scattered off to the side. Now, of course, the simple solution is to select all the verts and hit "Pack Islands", but that creates a new problem: everything gets squashed into one island and nothing lines up anymore.
The way the UVs are originally placed seems to suggest that I should be able to have more than one UV square visible at a time. That is, the face verts all line up perfectly, so if I could bring the torso image to the right of the face image, the leg image to the right of that, and so on, everything would be perfectly aligned. But in all my googling I have not been able to figure out how to do that, or whether such a thing is even possible. Failing that, it seems the solution is to pack the UVs and bake the entire thing into one single texture. But that would mean redoing it for every individual figure I imported, because as far as I can tell, you can't import/export the UV layout. You can import/export an image of the UV layout, but not the actual vert placement data.
So essentially the problem is that there seems to be no way within Blender to accurately texture the figure without packing one huge UV island, manually painting it in an image editor, and importing that image back in, which breaks the existing format of the textures as they come in. I feel like I am missing something extremely obvious, but it's a really difficult thing to explain, let alone google for.
Comments
You can use the paint options to fit the paint to the UV map, such as tiling and repeating the image. Or you can paint directly onto the mesh and ignore the UV map. Or you can convert to UDIMs with Diffeomorphic.
Personally, I prefer to paint the textures in GIMP anyway. But the Blender tools work fine.
Can Blender not handle UDIMs natively? That would be a pretty major shortcoming these days.
UDIMs, my friend. Look up some info on UDIMs. The textures are separated to provide more detail. If they were all packed onto one UV island, you would only have one texture map. If that map were a 4K map, the texture quality would not be great. To increase quality, you would have to make a giant 16K or 32K map that would not load well, if at all. That is why they are separated into multiple maps, so we can load multiple 4K maps to maintain detail in our renders.
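For reference, the UDIM convention is just a numbering scheme over the UV grid: each unit square of UV space is a tile, and the tile number encodes its column and row. A minimal sketch of that numbering (the comment about which Daz surface lands on which tile is an assumption; exact assignments depend on the exporter):

```python
import math

def udim_tile(u, v):
    """Return the UDIM tile number for a UV coordinate.

    Tiles are numbered 1001 + column + 10 * row, ten columns
    per row, so UVs inside the ordinary 0-1 square land on 1001.
    """
    return 1001 + math.floor(u) + 10 * math.floor(v)

# On a typical G8 import the face sits on the first tile and the
# other surfaces occupy their own tiles further along the U axis.
print(udim_tile(0.5, 0.5))  # 1001
print(udim_tile(1.5, 0.5))  # 1002
print(udim_tile(5.5, 0.5))  # 1006
```

This is why the islands "scattered off to the side" are not misplaced at all: each one sits exactly one unit square over, on its own tile.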
To copy materials from one character to another in Blender, you can use the 'Material Utilities' add-on. Once you make one set of materials, you can just import that character and copy those materials to a newly imported character.
https://docs.blender.org/manual/en/latest/addons/materials/material_utils.html
Using node groups to make global adjustments is also a good idea. You can place your color correction, spec/roughness adjustment, etc. all in a node group and make global changes to your character's face, head, torso, arms, and legs all at once.
That was the ticket. I didn't quite understand what UDIMs are. So now that I know that, maybe you can give me some tips on how to do this efficiently.
I have this BSDF I made which just plugs right into the final result and gives you perfect skin of any color. The only issue is that since the Daz figure is split into many different materials, I don't know the most efficient way to sync them up. One way would be to put one on the face, adjust it how I want, then copy-paste that node group into all the other appropriate materials. But then any time I wanted to change something, I'd have to go through and change it five times. The other option would be to take all of the relevant skin materials (face, torso, arms, etc.) and link them all into one material. Then every change would apply to the whole body at once. However, I don't know if this will cause any "damage" to the functionality of the figure; more specifically, would it affect the UVs at all?
My ultimate goal is to have this skin BSDF control the skin color, and then have a modular set of alphas on hand for makeup, tattoos, wounds, etc. And if it's possible, maybe even make that into a node group of its own, a "Makeup BSDF" essentially, so that I can quickly apply it to a new character without manually setting it up bit by bit every time.
Is your shader based on UVs or is it based on 3D coordinates? A UV-based shader is going to need different scaling for different surface groups, and I don't know how you will avoid seams. If you want to do the job procedurally I would suspect a remap might be called for.
The shader doesn't really need UVs to work by itself, since it's not a texture; it's just a series of settings for the Principled BSDF. So whatever verts are included in this material group are affected by it. And so far, in testing, there don't seem to be any seams.
The issue of makeup and such creates the need for actual texture mapping, since those details will be specifically located. In the majority of cases, seams shouldn't be an issue, because the details in question will usually respect the boundaries of the existing UV maps. The part I don't yet understand is how I could use alpha maps to create layers on top of this shader. For example, say I shaded all of the skin, including the lips, with this shader, but I wanted to give the character lipstick. I would then have to set up an alpha map that fits just over the lip UVs and then either apply a different shader to that area or texture it traditionally. The idea is that none of the things I add would be destructive to each other. So let's say in one scene the character has no makeup, in the next she has pink lips, and in the next she has blue. Instead of three different sets of 4K textures for the face, it could just be a slider, or a node.
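The non-destructive layering described above is, at bottom, plain alpha compositing: each layer is mixed over the base by its mask value, which is the per-pixel math a Mix Shader (or Mix Color node) driven by an alpha map performs. A sketch of that blend, with made-up color values:

```python
def mix(base, layer, alpha):
    """Linear blend of two RGB colors by a mask value - the same
    per-pixel math a Mix Shader driven by an alpha map performs."""
    return tuple(b * (1.0 - alpha) + l * alpha for b, l in zip(base, layer))

skin = (0.8, 0.6, 0.5)      # hypothetical base skin tone
lipstick = (0.9, 0.2, 0.4)  # hypothetical lipstick color

no_makeup = mix(skin, lipstick, 0.0)   # mask black everywhere: bare skin
full_lips = mix(skin, lipstick, 1.0)   # mask white inside the lip UVs
```

Because the base and the layer stay separate inputs, swapping pink lips for blue is just a different `lipstick` color (or a slider driving the mask), with no new 4K texture set needed.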
@AliPop The UV maps are stored on the mesh data, so you don't lose them by linking materials.
I have a new problem that I have no idea how to correct. I am following this tutorial to make procedural eyes. In fact I've followed a few of them now, but this is the one I am on currently.
What I've done is taken the G8F model, imported it into Blender, and hidden everything so that I am only working on the eyes, with the goal of trying to match things up, since the Daz eye model is very different from a simple UV sphere.
Now you might notice I have this node plugged into the base color instead of the transmission. Ignore that; I've only done it that way to try to see the color ramp accurately. If you have a deep understanding of the Daz models, you might also think, "Hey, this tutorial won't work because the structure of the eyes is fundamentally different from the example used", but ignore that too, because I am basically reverse-engineering this to work with Daz imports. So while the entire result may not be possible with this order of operations, there are pieces I can use to figure out an alternative. The issue I am having specifically is at about 5:06 in the video, where they use the mapping node to position the white part where it needs to be for the next step. The problem is that no matter what I do, I can't actually find the white part on the Daz eye. I've moved every option in the mapping node in every direction, I've tried moving the verts around, but I don't even know how to diagnose the problem.
The key to this is that I am using the UV mapping that comes with the Daz export, which means the UVs for the eyes are actually on UDIM tile 1006. I don't know if this affects how shaders operate, or if there's something I need to do to point to the correct coordinates. However, it's very important that I not modify the UVs at all. I can't re-unwrap or repack the islands, because that would break the uniformity I am aiming for. Essentially, if I can make it work with the existing UV situation, then I can make it plug-and-play for all Daz imports. If I can just make that little white sphere appear on the front of the eye, it will give me the knowledge I need to translate other UV-related things from Daz.
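For any texture that *is* driven by the UV output, tile 1006 simply means the eye UVs live five unit squares along U (1006 - 1001 = 5), so a procedural texture expecting 0-1 coordinates would need the tile offset folded out first. A sketch of that fold, with a made-up eye UV:

```python
import math

def local_uv(u, v):
    """Fold a UDIM-space UV back into the 0-1 square of its own tile."""
    return u - math.floor(u), v - math.floor(v)

# Tile 1006 is five tiles along U, so an eye vert stored at
# (5.3, 0.7) corresponds to (0.3, 0.7) within its own tile.
print(local_uv(5.3, 0.7))
```

(As the next comment points out, this only matters for UV-driven inputs; object coordinates never see the UVs at all.)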
@AliPop In the video they use a spherical gradient with object coordinates. This has nothing to do with the Daz UV maps, or UV maps in general; you could have no UV maps at all and the example in the video would work the same.
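For reference, the spherical gradient the video relies on is just a radial falloff from the coordinate origin. A sketch of that falloff, as I understand the Gradient Texture's Spherical mode:

```python
import math

def spherical_gradient(x, y, z):
    """Radial falloff of a spherical gradient: 1 at the coordinate
    origin, decreasing linearly to 0 at a distance of one unit,
    and clamped to 0 beyond that."""
    return max(0.0, 1.0 - math.sqrt(x * x + y * y + z * z))
```

With object coordinates plugged in, the origin of the falloff is the object's origin, which is why the object's size and origin placement (not its UVs) decide where the white sphere shows up.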
Well, that's what I thought, but for whatever reason it's not working with my setup. You can see my nodes there as well, so I don't really know how to find out what is going wrong.
There's nothing wrong in your setup, apart from the mapping location, which you may need to change to fit the eye size. The eye in the video example is two meters across.
Would you have any idea how I can figure out the adjustment needed? Just moving the sliders around never let me see it, so I think I would need a more specific idea to work from. The current dimensions are roughly X: 0.1 m, Y: 0.03 m, Z: 0.035 m.
Attached: the setup for a human-sized eye.
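In case it helps to derive the numbers rather than copy them: since the tutorial eye is about 2 m and the Daz eye roughly 0.03 m, any mapping location or scale value from the video shrinks by that ratio. A back-of-envelope sketch (the `tutorial_offset` value is made up for illustration):

```python
# Rescale a tutorial's mapping values from a 2 m demo eye
# down to a ~0.03 m Daz eye.
tutorial_eye_size = 2.0   # metres, the eye used in the video
daz_eye_size = 0.03       # metres, rough size of the G8 eye
factor = daz_eye_size / tutorial_eye_size   # about 0.015

tutorial_offset = 0.9     # hypothetical mapping location value from the video
print(tutorial_offset * factor)
```

That factor of roughly 1/67 is why dragging the sliders by hand never revealed the white sphere: the useful range is tiny compared to the slider's default sensitivity.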
I'm very close now with the eyes. It's just difficult, because the shape of the eyeball model alone makes the shaders behave dramatically differently from the tutorial in every way.
This is where I am at now. Instead of spending another week playing with the mapping node, I simply selected the center vert of the eye, put an empty on it, and pointed the shader's texture coordinates at the empty. So now everything is perfectly centered. In the future I just have to parent the empty to the correct eye bone to keep things working. However, this has seemingly created a brand new problem.
I cannot make my noise/Voronoi texture setup stretch like everyone else does. And this is an issue that is nearly impossible to google on its own.
It can.
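For what it's worth, the stretching in those tutorials usually comes from scaling the texture coordinates non-uniformly (a Mapping node with unequal Scale values) before they reach the Noise/Voronoi texture. A minimal sketch of the idea:

```python
def stretch_coords(x, y, z, stretch=5.0):
    """Non-uniform scale on texture coordinates.

    Dividing one axis before it feeds a Noise/Voronoi texture
    elongates the pattern along that axis, because points far
    apart in Z now look up nearly the same spot in the noise.
    """
    return x, y, z / stretch

print(stretch_coords(1.0, 2.0, 10.0))  # (1.0, 2.0, 2.0)
```

The direction of the stretch is in the coordinate space you feed in, so with the empty-driven coordinates the axis that needs scaling is one of the empty's local axes, not the world's.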
I did a lot of searching on handling UDIM UVs in Blender when trying to bake them into AO maps. Here is what I found out:
* Blender cannot handle UDIM tasks very well; it only offers the most basic ability: loading them.
* For any other task, you need an add-on or your own Python script. That's what I did: I wrote a Python script to read each UDIM tile and bake them one by one. It's a pain.
* The only two 3D tools that can handle all UDIM tasks well without writing a script are 3ds Max and Maya. If you are always dealing with UDIM UVs and do not want to write your own script, you need one of those.
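The bake loop itself depends on the Blender version and on bpy, but the tile bookkeeping such a script needs is simple. A sketch of the filename side, using Blender's `<UDIM>` token convention for tiled images (the base name here is made up):

```python
def udim_filenames(base, tiles):
    """Expand Blender's <UDIM> filename token into one path per
    tile - the naming a per-tile bake script has to produce so
    Blender can reload the results as a single tiled image."""
    return [base.replace("<UDIM>", str(t)) for t in tiles]

# Hypothetical G8 bake covering tiles 1001 (face) through 1006 (eyes):
print(udim_filenames("ao_bake.<UDIM>.png", range(1001, 1007)))
```

Saving each baked tile under its numbered name lets the whole set be reopened as one tiled image by pointing Blender at the `<UDIM>` path.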
@butaixianran Blender 3.2 added support for baking textures to UDIMs.
https://wiki.blender.org/wiki/Reference/Release_Notes/3.2/Cycles
https://developer.blender.org/rB6787cc13d4ef
Good to know the Blender team still cares about UDIM UVs.
Also, I think what AliPop needs is Character Creator 4, which can adjust skin color with one slider. It also comes with masks for all the makeup you want.