Subd is completely up to you. I don't think I've ever gone above 2.
As for hair, yeah, a few of us are struggling to get a handle on it. I've seen what competent people can do with it, and very quickly, though. If you're using 2.83, you can put the character in a collection and then set that collection as the hair's collision collection. That should vastly reduce penetration and just look more natural.
As for the irises, you could fix it in Blender; there's probably an edge loop you could slide. But I think you should really find a morph in Daz for that sort of thing... stand on the shoulders of giants, ya know?
Thank you again! Is it possible to move a model from Blender back into Daz to edit, or do I have to re-import a new model and restart the shading process all over again?
Also should I make sure to import her as high res?
I actually wouldn't know... there might be a way, but I'm sure it is not easy.
But if I were you, I'd copy the blend file before the export so you can just File/Append all your materials that you worked so hard to perfect.
And there are two ways to go about choosing what resolution to export at. If you want it EXACTLY how it is in Daz, export at High Resolution, and the file will be large. If you can deal with very slight differences, convert to base resolution and apply a subdivision surface modifier once in Blender. The file will be lighter. But for one character, and one frame, I'd just export at High.
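To put rough numbers on that file-size difference: Catmull-Clark subdivision turns each quad into four per level, so a pre-subdivided high-resolution export carries far more geometry on disk than a base-resolution export with a Subdivision Surface modifier added in Blender. A back-of-envelope sketch (the base face count here is a made-up placeholder, not the exact Genesis figure):

```python
def subdivided_faces(base_faces: int, levels: int) -> int:
    """Catmull-Clark subdivision quadruples the quad count per level."""
    return base_faces * 4 ** levels

# Hypothetical base mesh of 16,000 quads:
base = 16_000
for level in range(3):
    print(level, subdivided_faces(base, level))
```

So at SubD 2 the exporter has to write 16x the face data of the base mesh, which is why converting to base resolution and subdividing in Blender keeps the file so much lighter.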
Again, thank you so so much for your mega helpful comments! I redid everything again with my first model using the second model's skin shader and spent the whole day working on hair. I'm glad to say... I'm really proud of her hair! Hah!
Woah. I like the hair, too! Congrats!
Hehe thank you! Now moving onto the next stage, do you recommend I use rigify on her for posing?
Should I save this file as a base, such that if I ever want to use this character, I'll duplicate from this file, since it already has her shaders and hair in it, and then change the scene around her accordingly?
For Marvelous Designer, between it and Daz3d, I would first get the A-pose of my model, bring her to Marvelous Designer, get the outfit on her, go back to Daz and change the pose to what I want, and then bring that model into Marvelous Designer as a morph target. Once the outfit is fitted to her pose, I save the outfit as an obj and bring that obj back into Daz.
Are the steps basically the same between Blender and Marvelous Designer?
I import both posed and unposed. I tend to import some poses for use, which you can't do if you rigify.
Rigify really is brilliant, especially for animation, but great for single poses too.
Take advantage of the pose library feature in Blender.
I was trying to follow @benniewoodell 's YouTube video, but it seems like for him, once he clicks rigify, everything moves along with the model, including her clothes. However, when I click rigify, her clothes, eyebrows, and eyelashes/peach fuzz from Daz all get left behind. I've attached how my scene collection looks.
EDIT: Okay nvm, I just didn't merge rigs for all of them. Turns out all it takes is for me to post a question on a public forum first before I can suddenly figure it out by myself lol.
That's called talking to the rubber duck: when there isn't someone handy, you talk to an item on your desk, in the bath, or wherever; posting on forums works too. Glad you solved it.
Ohh damn, I did her hair and shaders in her A-pose. I now know that I can move shaders between files, but can I do that with her hair too? I assume it's possible, since redoing her hair for every pose sounds horrifyingly tedious lol.
As for going back to Daz, well, before Blender it was the only choice I had for rendering haha. Now, if possible, I have no interest in going back to Daz3d after exporting my initial model.
I've never used Alembic before, though, and I just did a quick google... and have no idea what they're saying hahaha. Why would an Alembic file be better than an obj?
Btw just to make sure I understood you clearly, your workflow is basically..
- Get the model adjusted to your liking in Daz
- Pose her in Daz, but with an A-pose somewhere in the animation (?)
- Bring her to Blender and add shaders and hair (?)
- Scrub the timeline to where the model is in the A-pose
- Import the A-pose model from Blender to MD
- Fit the clothes on her in MD
- Move the timeline until she is in her desired pose, and the clothes move along
- Export the clothes to Blender as an Alembic, where the model is also in the desired pose (?)
Haha, did I understand the above correctly, or am I way off base here?
Her hair is a particle system attached to the body. It will go wherever her body object goes.
And complexity is actually Alembic's flaw; it is designed to do much more than other interchange formats. Alembic won't afford you any advantage that obj won't, if you're not animating. I don't know if there are other ways, but it does work very well to simulate things in MD. I have actually simulated entire shots with it by importing the unsubdivided model's animation via Alembic. I have no knowledge about the morph technique you mentioned.
Unfortunately, and fortunately, I work with mocap data and so my workflow is not really relevant for you. But I would do:
Create the character, and pose it in Daz
Don't sim anything from a memorized position; instead, start with the A-pose at the first frame and your desired pose at frame 30. Export that, without SubD, to MD.
Fit the clothing on the A-pose in MD, and simulate. Export the garment at frame 30 to Blender via obj.
Go back to Daz and export the character, with SubD, to Blender via obj. It should fit perfectly on the character.
Groom the hair in Blender.
Texture and light everything, and render.
I don't *think* there's anything ridiculous in there, that was just the first workflow I tried that worked; I stopped thinking about it after that :)
It also just dawned on me what you were asking. I guess that's an advantage of the Diffeomorphic Daz Importer over Alembic: with Alembic, you can't pose without re-importing, which means re-doing the hair if you groom it. With the Daz Importer, you can just pose the character, or import a saved pose from Daz. The Daz Importer supports the JCMs that make Daz characters so beautiful, but I don't know if it supports the JCJs that add subtle realism you'd otherwise have to add yourself. JCJs are what people are calling the fact that certain joints affect other joints; raising the eyebrow widens the eyelids, and such.
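For what it's worth, a JCM is just a corrective blend shape whose weight is driven by a joint's rotation; in Blender terms, a shape key with a driver on a bone angle. A minimal sketch of that driver logic (the angle range is made up for illustration, not taken from any actual Genesis rig):

```python
def jcm_weight(joint_angle_deg: float, start: float = 0.0, full: float = 90.0) -> float:
    """Linearly ramp a corrective morph from 0 at `start` degrees
    to 1 at `full` degrees, clamped to the [0, 1] range."""
    t = (joint_angle_deg - start) / (full - start)
    return max(0.0, min(1.0, t))

# e.g. a knee-bend corrective at half strength when the knee is bent 45 degrees:
print(jcm_weight(45.0))  # 0.5
```

Real JCMs often use several such ramps per joint, but the principle is the same: the morph follows the bone, so you never key it by hand.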
Alembic (Ogawa) has a major advantage over any unified .obj format for animation rendering in Blender or even Lightwave3D. This is because every surface comes over as a separate mesh, giving you the option to hide/unhide things like hats, sunglasses, weapons, etc. WITHOUT changing the overall vertex count.
For example, I can send this character (in the video) from iClone wearing his hat for this shot. For the next shot, I can animate him sitting at his computer back in iClone and export a new ABC file overwriting the previous one, with the same name. Back in Blender, he would now be sitting at his desk, driven by the new ABC data.
Now I could just hide his hat/glasses/sidearm etc. in a collection with visibility turned off and render another shot, without having to export the untextured mesh from iClone (sans the hat etc.) and redo the textures again.
If only Blender had a way to completely overwrite/replace ALL of the motion data in an animated FBX character... life would be perfect.
Well, as you'd expect, converting Iray materials to Eevee is not always good, since Eevee is a real-time engine and of course has limitations compared to Cycles. Plus Daz characters use multiple layers of refraction for the eyes, which doesn't help with Eevee.
That said, if you have to use Eevee, then I'd avoid the default import options and go with Principled and SSS, which are more Eevee-friendly and easier to eventually fix by hand. Below is how the G8F eyes turn out with those settings, plus some basic settings for Eevee refraction. Also a link if you want to learn more.
EDIT: @shavonnew, to fix the rounded teeth, please download the development version.
@Padone Hmm, I wonder then what makes the eyes work without any problem in Eevee in the stable version versus the development version? All the shots at the end of the video I posted were done in Eevee with just the default settings from the Diffeomorphic tool; I didn't touch the materials on anything. But I'll look into using the video you posted here, because I would like to start using the development version for my next project. I did see the brighten-eyes thing on the Diffeomorphic page but couldn't for the life of me find it in Blender when I did some research after finding it out; I found the operator presets and unit scale, but I didn't see any of the options listed below it in your photo.
@nicstt I've been able to make hair move with a cloth sim, but it takes FOREVER to bake. Like, my computer was baking overnight for around 125 frames, which I laughed at, as I used to set shots to render in Daz overnight for animation, which is exactly what I was trying to avoid with Eevee. But I just found a tutorial from a guy I follow for Blender tips that shows how to rig hair to move naturally in under 2 minutes. I don't need to rig any more hair for the few shots I have left in my film, but I am going to give this a try afterwards. If it works, I'll let you know!
Thanks for this, I'll check it out. I've been experimenting and am now setting up a scene and will be appending various experiments in to the one character.
For you to (re)write an Alembic exporter, what did you learn about the official DAZ product that makes it not useful for your interests?
I own it, and I'm guessing you can adjust my expectations as I consider using it, although it sounds like your little beast will be a godsend.
tnx,
--ms
What @wolf359 said, and just its general flakiness. It uses the node names, which users don't even see, to identify nodes, instead of the labels, which users do see and which Daz ensures are unique; it fails when the names are not unique, with a completely unhelpful error message. It messes up the vertex order and screws up normals. It mangles UV maps. It has an arbitrary frame limit. It is not a viable product.
And even Blender's Alembic support itself is kind of a hack, and so has severe limitations once it's imported. I also wanted to add some functionality to overcome those. For example, nothing you do in edit mode will stick once you go back to object mode.
One of the most important things I worked on was converting certain hair assets to Blender's particle system. Daz doesn't think we should be able to use dForce strand based hair, but that's standard for Blender, and as @wolf359 also pointed out, Blender is even getting a new hair system soon. When I converted dForce Classic Long Curly Hair, the model was 3.5 gigs lighter in Blender.
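That size difference makes sense if you compare what each representation stores: mesh-strip hair keeps explicit vertex data for every strand, while a particle system stores only a set of guide hairs and interpolates child hairs at render time. A back-of-envelope comparison (all counts are made-up illustrative numbers, not measurements from that hair asset):

```python
BYTES_PER_POINT = 12  # 3 floats (x, y, z) at 4 bytes each

def mesh_hair_bytes(strands: int, points_per_strand: int) -> int:
    # every strand's points are stored explicitly in the mesh
    return strands * points_per_strand * BYTES_PER_POINT

def particle_hair_bytes(guides: int, keys_per_guide: int) -> int:
    # only guide hairs are stored; children are interpolated at render time
    return guides * keys_per_guide * BYTES_PER_POINT

dense = mesh_hair_bytes(100_000, 32)     # hypothetical dense strand mesh
sparse = particle_hair_bytes(2_000, 32)  # hypothetical guide-hair setup
print(dense // sparse)  # 50
```

Even in this toy example the guide-hair representation is 50x smaller, and real assets also carry per-vertex UVs, normals, and weights that a particle system doesn't need.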
I'm going to try to finish the conversion code that just spits out a Python script that does everything, a one-click solution. There are some other small annoyances that I can fix over time, but I think I'll just make it available sooner rather than later because I certainly feel people's pain. I would literally be dead in the water without this tool; the Daz Importer is very, very good and I still use it for materials, but the slight differences in SubD drive me crazy. They shouldn't, but they do :)
@shavonnew I'm glad you got rigify to work with the clothes and such! Lol, I too have had the same situation where you post and then a minute or two later you figure it out. Just know that once in a while, if you just shift-click the whole group, the hair might not merge. If that happens, just undo it, then open up the hair hierarchy and shift-click everything to make sure anything in there is for sure highlighted, and it'll work. I don't understand why it sometimes does that, but it can be picky I guess. I also discovered a Blender plugin called Auto-Rig Pro. I haven't picked it up yet, it's 40 bucks or so, but it looks like it does a similar thing to rigify, only with more control options. I might pick it up and see how it works bringing in a character with the Diffeomorphic tool and just adding that rig on. It's just that 40 bucks right now is a lot; if this was three months ago, I wouldn't have even thought twice. I'll see what I do and post here if I happen to try it!
@nicstt let me know how it works! I still haven't been able to give this tutorial a try but I am excited for the possibilities with it.
tnx,
that you have to write this is a bummer.
thanks in advance for this info and for taking on that project.
I'm now certain I'll be using these kinds of export tools in the very near future as I migrate my core workflow and purchasing sources to alternative environments.
tnx wolf359,
this is interesting and I've saved it, as I think the value and logic of it will make more sense once I start exporting my assets.
Good morning everyone, I am new to the Daz-to-Blender workflow, and I am happy to find a small community over here. I have a problem that I can't solve at this time; maybe one of you can help.
I am trying to import an aniblock into Blender from Daz via the Diffeomorphic plugin, using the Import Action button, with the baking to keyframes done in Daz. I get a message that some morphs were not imported, so I guess it didn't work. It's an aniblock that gives some nice jiggle to the breasts of a female character, but it doesn't import into Blender. I also tried both Blender plugins for bones, Spring Bones and Wiggle Bones, but it doesn't work at all. Maybe I am doing something wrong; I would love it if someone could give me a trick for how to do that.
Another thing: I want to get the face mocapped with the Face Cap plugin, but I am not able to find any step-by-step tutorial on how to get the data to Blender and transfer it to a mesh, in this case a Daz Genesis 8 imported into Blender by Diffeomorphic.
If someone knows the solution, please help; I will really appreciate it.
Hi all! After using the Diffeomorphic import and adding the face units, I realised there isn't any control for the eyeballs. How do I control where her eyeballs point now?
Alembic (Ogawa) has a major advantage over any unified .obj format for animation rendering in Blender or even Lightwave3D. This is because every surface comes over as a separate mesh, giving you the option to hide/unhide things like hats, sunglasses, weapons, etc. WITHOUT changing the overall vertex count.
This is actually not a property of the Alembic format itself, but of the exporting application. It could export the entire scene as a single object just as easily as it could export each material zone as a separate object.
I also make sure to load the face-related controls.
If you don't merge the face rig and apply an animation to the rest of the rigged armature, I guess when the character starts to move, the face rig will stay behind somewhere else in the scene, no? Can you guys please share your workflow for animating a Daz Genesis 8 face in Blender? How do you use Face Cap, or is there a better solution? I am not interested in Reallusion software, since it doesn't accept some types of characters with geoshells, like cyborgs, or even some HD characters, plus you need iClone and other plugins.
Hi, I am not aware of any easy way to create facial animation for Genesis 8 after import to Blender.
Assuming the Diffeo plugin imports all of your visemes as blend shapes, you can try manually animating them for your lipsync/facial performance.
Blender has an excellent Graph Editor and Dope Sheet.
Animating manually will take forever and will never be organic. I saw some examples of Face Cap; it's a very cheap app for facial mocap on iPhones, and very powerful. However, I can't find any tutorials on how to use it, not even on their website: https://www.bannaflak.com/face-cap/ (check it out). Once I import the Daz character into Blender with Diffeomorphic, the only thing I know how to do is apply a BVH file to the armature. The Face Cap app records all the data to an FBX file, I guess, but then how do I apply that data to the character?
This is the character I am trying to face-mocap. I am getting inspired by the movie Alita; if I can face-mocap this, I'll be very happy to track it into real footage and see the effect, as I have a Blackmagic camera that records 12-bit raw video.
Here is a frame of the character with a BVH animation applied, plus a 3-light studio setup. As you can see, if I can get this babe to talk it will be something else, and I'll be happy to get an RTX 2080 Ti for rendering. I am using Cycles to render, as I get more realistic results.
Hi @shavonnew! I hope I covered that in my video. You have to merge all the rigs, then the face bones, and then make all bones posable; just go right down the line in the Corrections tab. Then go to the Morphs section and update morphs and facial controls. Then scroll down and you'll find the facial stuff (as well as expressions, if you convert those too in the Morphs section, and visemes if you want to use Papagayo with the stable version; apparently Anilip works with the new version, which I still have yet to really play around with). In the Facial tab, there is a bar that you can use to move the eyes sideways, up and down, squint, close them, etcetera, as well as brows, cheeks; you can do everything and keyframe it!
But please make sure to do all the updating of the morphs first; if you go back and do it after you decide to use rigify, your character's face is going to get all wonky.
You will have to alter your G8 to conform to the Face Cap export scripts.
Yes, but I don't want to import the avatar; I want to apply the data from the imported FBX to the Daz character. I don't know how to apply the shape keys from the data to my Daz character. Maybe @benniewoodell knows how to do that and can explain it, please.
AFAIK Blender does not have the ability to retarget blend shape animation data between FBX rigs.
They say Diffeo can apply .duf files with facial viseme animation, but that won't help you with the Face Cap data unless you can get that data into Daz Studio before exporting to Blender via Diffeo.
I truly wish Blender could retarget new data to FBX, as it would save me much time in my iClone FBX to Blender pipeline.
How do I get the data into Daz then? It's an FBX file. Also, my character doesn't have bones that open and close the eyes when imported to Blender; I can play with all the bones of the face except the eyes, which are not imported. Any solution?
Suddenly, after I baked 3 aniblocks, the breast jiggle started to work; it didn't work before, weird. But I'm still trying to figure out the facial mocap, lol.
I also make sure to load the face realted controls
If you dont merge the face and pply an animation to the rest of the armature rigged I guess when the character start to move the face rig will stay a part somewhere esle in the scene, no ? Can you guys please share your workflow on how to animate a daz genesis 8 face in Blender ? how to use facecap or any better solution ? I am not interested by realussion software as it doesnt accept some type of characters woth geoshells as cyborgs or even some HD characters plus you need iclone and other plugins.
I don't tend to animate.
I will at times, but it isn't part of my normal workflow. I use it to get related scenes done, which I rarely do.
Comments
I import both posed and unposed. I tend to import some poses for use, which if you rigify, you can't do.
Rigify really is brilliant, expecially for animation, but great for just for single poses too.
Take advantage of the pose library feature in blender.
I was trying to follow @benniewoodell 's youtube video but it seems like for him, once he clicks rigify, everything moves along with the model including her clothes. However when I click rigify, her clothes and eyebrows, eyelashes/peach fuzz from daz all gets left behind. I've attached how my scene collection looks.
EDIT: Okay nvm, I just didn't merge rig for all of them. Turns out all it takes is for me to post a question on a public forum first before I can suddenly figure it out by myself lol.
That's called talking to the rubber duck. When there isn't someone handy, you talk to an item on your desk, bath, or wherever; posting on forums works too.
Glad you solved it.
Her hair is a particle system attached to the body. It will go wherever her body object goes.
And complexity is actually Alembic's flaw; it is designed to do much more than other interchange formats. Alembic won't afford you any advantage that obj won't, if you're not animating. I don't know if there are other ways, but it does work very well to simulate things in MD. I have actually simulated entire shots with it by importing the unsubdivided model's animation via Alembic. I have no knowledge about the morph technique you mentioned.
Unfortunately, and fortunately, I work with mocap data and so my workflow is not really relevant for you. But I would do:
Create the character, and pose it in Daz
Don't sim anything from a memorized position, but start with the A=pose at the first frame, and you desired pose at 30. Export that, without SubD, to MD.
Fit the clothing on the A-Pose in MD,and simulate. Export the garment at frame 30 to Blender via obj.
Go back to Daz and export the character, with SubD, to Blender via obj. It should fit mperfectly on the character.
Groom the hair in Blender.
Texture and light everything, and render.
I don't *think* there's anything ridiculous in there, that was just the first workflow I tried that worked; I stopped thinking about it after that :)
It also just dawned on mr what you were asking. I guess that's an advantage of the Diffeomorphic Daz Imorter over Alembic: With Alembic, you can't pose without re-importing,which means re-doing the hair if you groom it. With Daz Importer, you can just pose the character, or import a saved pose from Daz. Daz Importer supports the JCMs that make Daz characters so beautiful, but I don't know if it supports the JCJs that add subtle realism that you'd otherwise have to add yourself. JCJs are what people are calling the fact that certain joints affect other joints; raising the eyebrow widens the eyelids, and such.
This is because every surface comes over as separate mesh giving you the option to hide/unhide things Like hats sunglasses weapons etc. WITHOUT changing the overall vertex count
For example I can send this character (in the Video) from iclone wearing his hat for this shot
for the next shot I can animate him sitting at his computer back in Iclone and export a new ABC file overwriting the previous one, with same name.
back in Blender he would now be sitting at his desk driven by the new ABC Data.
Now I could just hide his hat/glasses /sidearm etc. in a collection with visibilty turned off and render another shot without having to export the untextured mesh from Iclone (sans the hat etc ), and redo the textures again
If only blender had away to completely over write/replace ALL of the motion Data in an animated FBX character... Life would be perfect.
Hey TheMysteryIsThePoint,
For you to (re)write an alembic exporter, what did you learn about the formal DAZ product that makes it not useful for your interests?
I own it, and I'm guessing you can adjust my expectations as I consider using it, although it sounds like your little beast will be a godsend.
tnx,
--ms
Blender will politely remind you to get with the times and refuse your import.
Thanks for this, I'll check it out. I've been experimenting and am now setting up a scene and will be appending various experiments in to the one character.
What @wolf359 said, and just its general flakiness. It uses the node names, which users don't even see, to identify nodes instead of the label which users do see and Daz ensures are unique, and fails when they are not unique, with a completely unhelpful error message. It messes up the vertex order and screws up normals. It mangles UV maps. it has an arbitrary frame limit. It is not a viable product.
And even Blender's Alembic support itself is kind of a hack, and so has severe limitations once it's imported. I also wanted to add some functionality to overcome those. For example, nothing you do in edit mode will stick once you go back to object mode.
One of the most important things I worked on was converting certain hair assets to Blender's particle system. Daz doesn't think we should be able to use dForce strand based hair, but that's standard for Blender, and as @wolf359 also pointed out, Blender is even getting a new hair system soon. When I converted dForce Classic Long Curly Hair, the model was 3.5 gigs lighter in Blender.
I'm going to try to finish the conversion code that just spits out a Python script that does everything, a one-click solution. There are some other small annoyances that I can fix over time, but I think I'll just make it available sooner rather than later because I certainly feel people's pain. I would literally be dead in the water without this tool; the Daz Importer is very, very good and I still use it for materials, but the slight differences in SubD drive me crazy. They shouldn't, but they do :)
@shavonnew I'm glad you got Rigify to work with the clothes and such! Lol, I too have had the same situation where you post and then a minute or two later you figure it out. Just know that once in a while, if you just shift-click the whole group, the hair might not merge. If that happens, just undo it, then open up the hair hierarchy and shift-click everything to make sure anything in there is for sure highlighted, and it'll work. I don't understand why it sometimes does that, but it can be picky, I guess.

I also discovered a Blender add-on called Auto-Rig Pro. I haven't picked it up yet, it's 40 bucks or so, but it looks like it does a similar thing to Rigify, only with more control options. I might pick it up and see how it works: bring in a character with the Diffeomorphic tool and just add that rig on. It's just that 40 bucks right now is a lot; if this were three months ago, I wouldn't have even thought twice. I'll see what I do and post here if I happen to try it!
@nicstt let me know how it works! I still haven't been able to give this tutorial a try but I am excited for the possibilities with it.
tnx,
That you have to write this is a bummer.
Thanks in advance for this info and for taking on that project.
I'm now certain I'll be using these kinds of export tools in the very near future as I migrate my core workflow and purchasing sources to alternative environments.
cheers,
--ms
tnx wolf359,
this is interesting and I've saved it, as I think the value and logic of it will make more sense once I start exporting my assets.
--ms
Good morning everyone, I am new to the Daz-to-Blender workflow and I am happy to find a small community over here. I have a problem that I can't solve at the moment; maybe one of you can help.
I am trying to bring an aniblock into Blender from Daz via the Diffeomorphic plugin's Import Action button, with the baking to keyframes done in Daz. I get the message that some morphs were not imported, so I guess it didn't work. It's an aniblock that gives some nice jiggle to the breasts of a female character, but it doesn't import into Blender. I also tried both Blender add-ons for bone physics, Spring Bones and Wiggle Bones, but neither works at all. Maybe I am doing something wrong; I would love it if someone could give me a trick for that.
Another thing: I want to get the face mocapped with the Face Cap app, but I can't find any step-by-step tutorial on how to transfer the data to a mesh in Blender, in this case a Daz Genesis 8 imported via Diffeomorphic.
If someone knows the solution, please help; I will really appreciate it.
Hi all! After using the diffeomorphic import and adding the face units, I realised there isn't any control for the eyeballs. How do I control where her eyeballs point towards now?
I don't merge in the face rig.
I also make sure to load the face-related controls.
This is actually not a property of the Alembic format itself, but of the exporting application. It could just as easily export the entire scene as a single object as it could each material zone as a separate object.
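To illustrate from the Blender side: the stock exporter lets the user pick the granularity, e.g. writing only the selected objects into a single archive. A minimal sketch, assuming the Blender 2.8x operator; the output path is just an example:

```python
# Sketch: export only the selected objects into one Alembic archive
# (Blender 2.8x operator; the path below is just an example).
try:
    import bpy
except ImportError:
    bpy = None

def alembic_export_args(filepath, only_selected=True):
    """Keyword arguments for bpy.ops.wm.alembic_export."""
    return {"filepath": filepath, "selected": only_selected}

if bpy is not None:
    bpy.ops.wm.alembic_export(**alembic_export_args("//character.abc"))
```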
If you don't merge the face rig and apply an animation to the rest of the armature, I guess when the character starts to move the face rig will stay behind somewhere else in the scene, no? Can you guys please share your workflow for animating a Daz Genesis 8 face in Blender? How do you use Face Cap, or is there a better solution? I am not interested in Reallusion software, as it doesn't accept some types of characters with geoshells, such as cyborgs, or even some HD characters, plus you need iClone and other plugins.
Assuming the Diffeo plugin imports all of your visemes as blend shapes, you can try manually animating them for your lip-sync/facial performance.
Blender has an excellent graph editor and dope sheet.
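If the sliders get tedious, the same blend shapes can also be keyframed from a script. A rough sketch, assuming the Blender 2.8x API; the object name and the viseme key names are made up and depend on what Diffeo actually named your morphs:

```python
# Sketch: keyframe viseme shape keys from a script (Blender 2.8x).
# The object name and the viseme key names below are guesses; check
# what Diffeo actually called them in your scene.
try:
    import bpy
except ImportError:
    bpy = None

def phoneme_frames(phonemes, spacing=6, start=10):
    """Map each phoneme to the frame where its shape key should peak."""
    return {p: start + i * spacing for i, p in enumerate(phonemes)}

if bpy is not None:
    obj = bpy.data.objects["Genesis 8 Female Mesh"]  # name is a guess
    keys = obj.data.shape_keys.key_blocks
    for name, frame in phoneme_frames(["vAA", "vEE", "vOW"]).items():
        key = keys[name]
        key.value = 0.0
        key.keyframe_insert("value", frame=frame - 3)  # ramp in
        key.value = 1.0
        key.keyframe_insert("value", frame=frame)      # peak
        key.value = 0.0
        key.keyframe_insert("value", frame=frame + 3)  # ramp out
```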
Animating manually will take forever and will never be organic. I saw some examples of Face Cap; it's a very cheap facial mocap plugin for iPhones and very powerful. However, I can't find any tutorials on how to use it, not even on their website: https://www.bannaflak.com/face-cap/ Check it out. Once I import a Daz character into Blender with Diffeomorphic, the only thing I know how to do is apply a BVH file to the armature; I can't find any tutorials or manual for facial animation. The Face Cap app records all the data to an FBX file, I guess, but then how do you apply that data to the character?
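Not a full answer, but the Blender-side idea would be: import the Face Cap FBX, whose head mesh carries the recording as shape-key animation, then copy those animation curves onto your G8 mesh's shape keys. The catch is that the names won't match (Face Cap uses the ARKit blendshape names; Daz/Diffeo uses its own), so you need a name map. A rough sketch, assuming the Blender 2.8x API; the object names and both mapping entries are hypothetical and would have to be filled in for every shape you care about:

```python
# Sketch: retarget shape-key animation from a Face Cap FBX mesh onto a
# Daz figure by rewriting the fcurve data paths. Names are guesses.
try:
    import bpy
except ImportError:
    bpy = None

# ARKit blendshape -> Daz morph name. Both entries below are only
# illustrative; you'd fill this in for every shape you care about.
NAME_MAP = {
    "jawOpen": "facs_jnt_JawOpen",      # hypothetical Daz-side name
    "eyeBlinkLeft": "facs_EyeBlinkL",   # hypothetical Daz-side name
}

def retargeted_path(data_path, name_map):
    """Rewrite 'key_blocks["src"].value' via name_map; None if unmapped."""
    prefix, suffix = 'key_blocks["', '"].value'
    if not (data_path.startswith(prefix) and data_path.endswith(suffix)):
        return None
    src = data_path[len(prefix):-len(suffix)]
    dst = name_map.get(src)
    return prefix + dst + suffix if dst else None

if bpy is not None:
    src_obj = bpy.data.objects["FaceCap_Head"]           # name is a guess
    dst_obj = bpy.data.objects["Genesis 8 Female Mesh"]  # name is a guess
    src_action = src_obj.data.shape_keys.animation_data.action
    dst_keys = dst_obj.data.shape_keys
    if dst_keys.animation_data is None:
        dst_keys.animation_data_create()
    dst_action = bpy.data.actions.new("FaceCapRetarget")
    dst_keys.animation_data.action = dst_action
    for fc in src_action.fcurves:
        path = retargeted_path(fc.data_path, NAME_MAP)
        if path is None:
            continue  # no mapping for this shape; skip it
        new_fc = dst_action.fcurves.new(path)
        for kp in fc.keyframe_points:
            new_fc.keyframe_points.insert(kp.co.x, kp.co.y)
```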
This is the character I am trying to face-mocap. I am getting inspired by the movie Alita; if I can face-mocap this I'll be very happy to track it inside real footage and see the effects, as I have a Blackmagic camera that records 12-bit raw video.
Here's a frame of the character with a BVH animation applied plus a three-light studio setup. As you can see, if I can get this babe to talk it will be something else, and I'll be happy to get an RTX 2080 Ti for rendering. I am using Cycles to render as I get more realistic results.
https://www.bannaflak.com/face-cap/importavatar.html
You will have to alter your G8 to conform to the facecap export scripts.
Hi @shavonnew! I hope I covered that in my video. You have to merge all the rigs, then the face bones, then make all bones posable; just go right down the line in the Corrections tab. Then go to the Morphs section and update morphs and facial controls. Then scroll down and you'll find the facial stuff (as well as expressions, if you convert those too in the Morphs section, and visemes if you want to use Papagayo with the stable version; apparently Anilip works with the new version, which I still have yet to really play around with). In the Facial tab, there is a bar that you can use to move the eyes sideways, up and down, squint, close them, etcetera, as well as brows and cheeks. You can do everything and keyframe it!
But please make sure to do all the updating of the morphs first; if you go back and do it after you've converted to Rigify, your character's face is going to get all wonky.
Yes, but I don't want to import the avatar; I want to apply the data from the imported FBX to the Daz character. I don't know how to apply the shape keys from the data to my Daz character. Maybe @benniewoodell knows how to do that and can explain it, please.
They say Diffeo can apply .duf files with facial viseme animation, but that won't help you with the Face Cap data unless you can get that data into Daz Studio before exporting to Blender via Diffeo.
I truly wish Blender could retarget new data to FBX, as it would save me much time in my iClone-FBX-to-Blender pipeline.
How do I get the data into Daz, then? It's an FBX file. Also, my character doesn't have bones that open and close the eyes once imported into Blender; I can play with all the face bones except the eyes, which are not imported. Any solution?
Suddenly I baked three aniblocks and the breast jiggle started to work; it didn't work before, weird. But I'm still trying to figure out the facial mocap, lol.
I don't tend to animate.
I will at times, but it isn't part of my normal workflow; I use it to get related scenes done, which I rarely do.