Comments
I import both posed and unposed. I tend to import some poses for use, which you can't do if you rigify.
Rigify really is brilliant, especially for animation, but great for single poses too.
Take advantage of the pose library feature in Blender.
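If you like scripting, the (legacy) pose library can also be driven from the Python console. This is a rough, untested sketch; the rig name is just whatever your Daz import happens to be called.

```python
# Rough, untested sketch of driving Blender's (legacy) pose library from
# Python. "Genesis 8 Female" is just an example rig name from a Daz import.
import bpy

rig = bpy.data.objects["Genesis 8 Female"]
bpy.context.view_layer.objects.active = rig
bpy.ops.object.mode_set(mode='POSE')
bpy.ops.pose.select_all(action='SELECT')

# Create a pose library on the rig if it doesn't have one yet,
# then store the current pose as an entry.
if rig.pose_library is None:
    bpy.ops.poselib.new()
bpy.ops.poselib.pose_add(frame=1, name="Hero_Pose_01")

# Recall a stored pose later by its index.
bpy.ops.poselib.apply_pose(pose_index=0)
```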
I was trying to follow @benniewoodell's YouTube video, but it seems like for him, once he clicks Rigify, everything moves along with the model, including her clothes. However, when I click Rigify, her clothes, eyebrows, and eyelashes/peach fuzz from Daz all get left behind. I've attached how my scene collection looks.
EDIT: Okay, never mind, I just didn't merge rigs for all of them. Turns out all it takes is for me to post a question on a public forum first before I can suddenly figure it out by myself lol.
That's called talking to the rubber duck. When there isn't someone handy, you talk to an item on your desk, in the bath, or wherever; posting on forums works too.
Glad you solved it.
Her hair is a particle system attached to the body. It will go wherever her body object goes.
And complexity is actually Alembic's flaw; it is designed to do much more than other interchange formats. If you're not animating, Alembic won't afford you any advantage that obj won't. I don't know if there are other ways, but it does work very well for simulating things in MD; I have actually simulated entire shots by importing the unsubdivided model's animation via Alembic. I have no knowledge of the morph technique you mentioned.
Unfortunately, and fortunately, I work with mocap data and so my workflow is not really relevant for you. But I would do:
Create the character and pose it in Daz.
Don't sim anything from a memorized position; start with the A-pose at the first frame and your desired pose at frame 30. Export that, without SubD, to MD.
Fit the clothing on the A-pose in MD, and simulate. Export the garment at frame 30 to Blender via obj.
Go back to Daz and export the character, with SubD, to Blender via obj. It should fit perfectly on the character.
Groom the hair in Blender.
Texture and light everything, and render.
I don't *think* there's anything ridiculous in there; that was just the first workflow I tried that worked, and I stopped thinking about it after that :)
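If anyone wants to script the Blender end of that obj round trip, something like this untested sketch would do it; the file paths are placeholders, and the scale fix is only needed if the unit settings differed between the exports and the Blender scene.

```python
# Untested sketch of the Blender end of the obj round trip described above.
import bpy

def import_obj(path):
    """Import an OBJ and return the newly created objects."""
    before = set(bpy.data.objects)
    bpy.ops.import_scene.obj(filepath=path)
    return [o for o in bpy.data.objects if o not in before]

garment = import_obj("/tmp/garment_frame30.obj")    # from MD, simulated, frame 30
character = import_obj("/tmp/character_subd.obj")   # from Daz, with SubD

# Example unit fix: Daz centimeters down to Blender meters.
for obj in garment + character:
    obj.scale = (0.01, 0.01, 0.01)
```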
It also just dawned on me what you were asking. I guess that's an advantage of the Diffeomorphic Daz Importer over Alembic: with Alembic, you can't pose without re-importing, which means re-doing the hair if you groom it. With the Daz Importer, you can just pose the character, or import a saved pose from Daz. The Daz Importer supports the JCMs that make Daz characters so beautiful, but I don't know if it supports the JCJs that add subtle realism you'd otherwise have to add yourself. JCJs are what people are calling the fact that certain joints affect other joints; raising the eyebrow widens the eyelids, and such.
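For anyone wondering what a JCM amounts to on the Blender side, it's essentially a shape key driven by a bone rotation. Here's a hedged, untested illustration; the mesh, morph, and bone names are placeholders, not the real Daz ones.

```python
# Untested illustration of what a JCM boils down to in Blender: a shape key
# whose value is driven by a bone rotation. Names below are placeholders.
import bpy
from math import radians

body = bpy.data.objects["Genesis 8 Female Mesh"]
rig = bpy.data.objects["Genesis 8 Female"]

shape_key = body.data.shape_keys.key_blocks["pJCMShldrUp_90_L"]
driver = shape_key.driver_add("value").driver
driver.type = 'SCRIPTED'

var = driver.variables.new()
var.name = "rot"
var.type = 'TRANSFORMS'
target = var.targets[0]
target.id = rig
target.bone_target = "lShldrBend"
target.transform_type = 'ROT_Z'
target.transform_space = 'LOCAL_SPACE'

# Ramp the corrective morph in as the shoulder rotates toward 90 degrees.
driver.expression = f"rot / {radians(90):.4f}"
```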
This is because every surface comes over as a separate mesh, giving you the option to hide/unhide things like hats, sunglasses, weapons, etc. WITHOUT changing the overall vertex count.
For example, I can send this character (in the video) from iClone wearing his hat for this shot.
For the next shot, I can animate him sitting at his computer back in iClone and export a new ABC file overwriting the previous one, with the same name.
Back in Blender, he would now be sitting at his desk, driven by the new ABC data.
Now I could just hide his hat/glasses/sidearm etc. in a collection with visibility turned off and render another shot, without having to export the untextured mesh from iClone (sans the hat, etc.) and redo the textures.
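Both of those tricks can also be done from a little Python if you ever want to batch it. This is an untested sketch; the cache datablock and collection names are placeholders.

```python
# Untested sketch: point an existing Alembic cache at a freshly exported
# .abc, and switch a props collection off for the next shot.
import bpy

# Same topology, new animation -> just repoint (or reload) the cache.
cache = bpy.data.cache_files["shot01.abc"]
cache.filepath = "//caches/shot02.abc"

# Hide the hat/glasses/sidearm collection for this render.
props = bpy.data.collections["Props_Hat_Glasses"]
props.hide_viewport = True
props.hide_render = True
```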
If only Blender had a way to completely overwrite/replace ALL of the motion data in an animated FBX character... life would be perfect.
Hey TheMysteryIsThePoint,
For you to (re)write an Alembic exporter, what did you learn about the official DAZ product that makes it not useful for your interests?
I own it, and I'm guessing you can adjust my expectations as I consider using it, although it sounds like your little beast will be a godsend.
tnx,
--ms
Blender will politely remind you to get with the times and refuse your import.
Thanks for this, I'll check it out. I've been experimenting and am now setting up a scene, and will be appending various experiments into the one character.
What @wolf359 said, and just its general flakiness. It uses the node names, which users don't even see, to identify nodes instead of the labels, which users do see and which Daz ensures are unique, and it fails when they are not unique, with a completely unhelpful error message. It messes up the vertex order and screws up normals. It mangles UV maps. It has an arbitrary frame limit. It is not a viable product.
And even Blender's Alembic support itself is kind of a hack, and so has severe limitations once it's imported. I also wanted to add some functionality to overcome those. For example, nothing you do in edit mode will stick once you go back to object mode.
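Not a real fix, but if you only need a still frame, one blunt workaround is to apply the cache modifier so the mesh stops being overwritten every frame and edit-mode changes stick. An untested sketch:

```python
# Untested sketch: freeze the Alembic stream at the current frame by
# applying the cache modifier; after that, edit-mode changes stick
# (at the cost of losing the imported animation on this object).
import bpy

obj = bpy.context.active_object
for mod in list(obj.modifiers):
    if mod.type == 'MESH_SEQUENCE_CACHE':
        bpy.ops.object.modifier_apply(modifier=mod.name)
```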
One of the most important things I worked on was converting certain hair assets to Blender's particle system. Daz doesn't think we should be able to use dForce strand based hair, but that's standard for Blender, and as @wolf359 also pointed out, Blender is even getting a new hair system soon. When I converted dForce Classic Long Curly Hair, the model was 3.5 gigs lighter in Blender.
I'm going to try to finish the conversion code that just spits out a Python script that does everything, a one-click solution. There are some other small annoyances that I can fix over time, but I think I'll just make it available sooner rather than later because I certainly feel people's pain. I would literally be dead in the water without this tool; the Daz Importer is very, very good and I still use it for materials, but the slight differences in SubD drive me crazy. They shouldn't, but they do :)
@shavonnew I'm glad you got Rigify to work with the clothes and such! Lol, I too have had the same situation where you post and then a minute or two later you figure it out. Just know that once in a while, if you just shift-click the whole group, the hair might not merge. If that happens, just undo it, then open up the hair hierarchy and shift-click everything to make sure anything in there is for sure highlighted, and it'll work. I don't understand why it sometimes does that, but it can be picky, I guess.

I also discovered a Blender plugin called Auto-Rig Pro. I haven't picked it up yet, it's 40 bucks or so, but it looks like it does a similar thing to Rigify, only with more control options. I might pick it up and see how it works bringing in a character with the Diffeomorphic tool and just adding that rig on. It's just that 40 bucks is a lot right now; if this were three months ago, I wouldn't have even thought twice. I'll see what I do and post here if I happen to try it!
@nicstt let me know how it works! I still haven't been able to give this tutorial a try but I am excited for the possibilities with it.
tnx,
That you have to write this is a bummer.
Thanks in advance for this info and for taking on that project.
I'm now certain I'll be using these kinds of export tools in the very near future as I migrate my core workflow and purchasing sources to alternative environments.
cheers,
--ms
tnx wolf359,
This is interesting and I've saved it, as I think the value and logic of it will make more sense once I start exporting my assets.
--ms
Good morning everyone, I am new to the Daz to Blender workflow and I am happy to find a small community over here. I have a problem that I can't solve at this time; maybe one of you can help.
I am trying to import an aniblock into Blender from Daz via the Diffeomorphic plugin's Import Action button, with the baking to keyframes done in Daz. I get the message that some morphs were not imported, so I guess it didn't work. It's an aniblock that gives some nice jiggle to the breasts of a female character, but it doesn't import into Blender. I tried both Blender plugins for the bones, Spring Bones and Wiggle Bones, but it doesn't work at all; maybe I'm being stupid and doing something wrong. I would love it if someone could give me a trick for how to do that.
Another thing: I want to get the face mocapped with the Face Cap app, but I am not able to find any step-by-step tutorial on how to transfer the data to a mesh in Blender, in this case a Daz Genesis 8 imported into Blender via Diffeomorphic.
If someone knows the solution, please help; I will really appreciate it.
Hi all! After using the Diffeomorphic import and adding the face units, I realised there isn't any control for the eyeballs. How do I control where her eyeballs point now?
I don't merge in the face rig.
I also make sure to load the face-related controls.
This is actually not a property of the Alembic format itself, but rather of the exporting application. It could export the entire scene as a single object just as easily as it could export each material zone as a separate object.
If you don't merge the face rig and apply an animation to the rest of the rigged armature, I guess the face rig will stay apart somewhere else in the scene when the character starts to move, no? Can you guys please share your workflow for animating a Daz Genesis 8 face in Blender? How do you use Face Cap, or is there a better solution? I am not interested in Reallusion software, as it doesn't accept some types of characters with geoshells, such as cyborgs, or even some HD characters, plus you need iClone and other plugins.
Assuming the Diffeo plugin imports all of your visemes as blend shapes, you can try manually animating them for your lip sync/facial performance.
Blender has an excellent Graph Editor and Dope Sheet.
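For example, hand-keying visemes from a script (or just to see where the keyframes would go) looks roughly like this. Untested, and the mesh and viseme names are placeholders for whatever Diffeo actually created on your import.

```python
# Untested sketch of hand-keying viseme shape keys; names are placeholders.
import bpy

body = bpy.data.objects["Genesis 8 Female Mesh"]
keys = body.data.shape_keys.key_blocks

def hit(name, frame, value=1.0, fade=3):
    """Keyframe one viseme: off, peak at 'frame', off again."""
    kb = keys[name]
    kb.value = 0.0
    kb.keyframe_insert("value", frame=frame - fade)
    kb.value = value
    kb.keyframe_insert("value", frame=frame)
    kb.value = 0.0
    kb.keyframe_insert("value", frame=frame + fade)

hit("vAA", 10)
hit("vEE", 18)
hit("vMM", 26)
```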
Animating manually will take forever and will never look organic. I saw some examples of Face Cap; it's a very cheap facial mocap app for iPhone and very powerful, but I can't find any tutorials on how to do it, not even on their website https://www.bannaflak.com/face-cap/ (check it out). Once I import Daz into Blender with Diffeomorphic, the only thing I know how to do is apply a BVH file to the armature, but I can't find any tutorials or manual for facial animation. The Face Cap app records all the data to an FBX file, I guess, but then how do you apply that data to the character?
This is the character I am trying to face mocap; I am getting inspired by the movie Alita. If I can face mocap this, I'll be very happy to track it into real footage and see the effect, as I have a Blackmagic camera that records 12-bit raw video.
Here is a frame of the character with a BVH animation applied, plus a three-light studio setup. As you can see, if I can get this babe to talk it will be something else, and I'll be happy to get an RTX 2080 Ti to render with. I am using Cycles to render, as I get more realistic results.
https://www.bannaflak.com/face-cap/importavatar.html
You will have to alter your G8 to conform to the facecap export scripts.
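Purely speculative, but once the Face Cap FBX head and the G8 figure are both in Blender and the morph names line up, driving the G8 shape keys from the recorded ones could look like this untested sketch; the object names are placeholders.

```python
# Speculative, untested sketch: drive the G8 shape keys from the shape keys
# on the imported Face Cap FBX head wherever the names match.
import bpy

src = bpy.data.objects["FaceCapHead"].data.shape_keys          # recorded FBX keys
dst = bpy.data.objects["Genesis 8 Female Mesh"].data.shape_keys

for kb in dst.key_blocks:
    if kb.name == "Basis" or kb.name not in src.key_blocks:
        continue
    driver = kb.driver_add("value").driver
    driver.type = 'SCRIPTED'
    var = driver.variables.new()
    var.name = "v"
    var.type = 'SINGLE_PROP'
    var.targets[0].id_type = 'KEY'
    var.targets[0].id = src
    var.targets[0].data_path = f'key_blocks["{kb.name}"].value'
    driver.expression = "v"
```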
Hi @shavonnew! I hope I covered that in my video. You have to merge all the rigs, then the face bones, and then make all bones posable; just go right down the line in the corrections tab. Then go to the morphs section and update morphs and facial controls. Then scroll down and you'll find the facial stuff (as well as expressions if you convert those too in the morphs section, and visemes if you want to use Papagayo with the stable version, or apparently Anilip works with the new version, which I still have yet to really play around with). In the facial tab, there is a bar that you can use to move the eyes sideways, up and down, squint, close them, etcetera, as well as brows, cheeks; you can do everything and keyframe it!
But please make sure to do all the updating of the morphs first; if you go back after you decide to use Rigify, your character's face is going to get all wonky.
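And if you ever want to keyframe those sliders from a script instead of the panel: I believe Diffeo exposes them as custom properties on the rig, so something like the untested sketch below should work. The property name is a placeholder; check the rig's custom properties for the real spelling.

```python
# Untested sketch; the property name is a placeholder, so check the rig's
# custom properties panel for the real Diffeo spelling before trying it.
import bpy

rig = bpy.data.objects["Genesis 8 Female"]

rig["Eyes Side-Side"] = 1.0
rig.keyframe_insert('["Eyes Side-Side"]', frame=1)
rig["Eyes Side-Side"] = -1.0
rig.keyframe_insert('["Eyes Side-Side"]', frame=24)
```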
Yes, but I don't want to import the avatar; I want to apply the data from the FBX imported into Blender to the Daz character. I don't know how to apply the shape keys from the data to my Daz character. Maybe @benniewoodell knows how to do that and can explain it, please.
They say Diffeo can apply .duf files with facial viseme animation, but that won't help you with the Face Cap data unless you can get that data into Daz Studio before exporting to Blender via Diffeo.
I truly wish Blender could retarget new data to FBX, as it would save me much time in my iClone FBX to Blender pipeline.
How do I get the data into Daz then? It's an FBX file. Also, my character doesn't have bones that close and open the eyes when it's imported into Blender; I can play with all the bones of the face, but the eye bones are not imported. Any solution?
Suddenly, after I baked 3 aniblocks, the breast jiggle started working; it didn't work before, weird. But I'm still trying to figure out the facial mocap, lol.
I don't tend to animate.
I will at times, but it isn't part of my normal workflow. I use it to get related scenes done, which I rarely do.