Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
@aelghazi The Diffeomorphic plugin doesn't import aniblocks; you have to bake them to the timeline first. As for facecap, I have no experience with it, but from what I see, as @wolf359 said, it needs the 51 FACS shape keys, which are not available for the Genesis figures unless you make them yourself. Then you can simply copy and paste the animation over the FACS.
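To make "copying the animation over the FACS" concrete, here is a rough pure-Python sketch of remapping per-frame capture weights onto a figure's shape keys. The channel names and shape-key names below are illustrative assumptions, not the actual facecap export format or Daz morph names.

```python
# Sketch: remap per-frame facial-capture blendshape weights onto a
# figure's shape-key names, dropping channels the figure lacks.
# The names in this mapping are hypothetical placeholders.
CHANNEL_TO_SHAPEKEY = {
    "jawOpen": "facs_jaw_open",
    "mouthSmile_L": "facs_mouth_smile_l",
    "mouthSmile_R": "facs_mouth_smile_r",
}

def remap_frames(frames):
    """frames: list of {channel: weight} dicts, one per frame.
    Returns a list of {shapekey: weight} dicts; channels without a
    matching shape key are silently skipped."""
    out = []
    for frame in frames:
        out.append({
            CHANNEL_TO_SHAPEKEY[ch]: w
            for ch, w in frame.items()
            if ch in CHANNEL_TO_SHAPEKEY
        })
    return out

captured = [
    {"jawOpen": 0.8, "mouthSmile_L": 0.1, "browInnerUp": 0.3},
    {"jawOpen": 0.2, "mouthSmile_L": 0.5, "mouthSmile_R": 0.5},
]
print(remap_frames(captured))
```

The real work, of course, is authoring the 51 shape keys themselves; this only shows the bookkeeping once they exist.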
A ready-to-go solution exists for the Maya plugin.
https://www.laylo3d.com/facecap-x-now-available-affordable-facial-motion-capture-for-genesis-3-8-characters/
Yes, I don't even know how to make shape keys, and I'm sure it's not going to be an easy task. The plugin you're talking about is by the same creator; if you check his website, there's a new plugin called Face Mojo that works with facecap but inside Daz. So maybe once the animation is done inside Daz, it can be baked to the timeline and then imported as an action inside Blender. I'm not sure if it's going to work or not; some aniblocks work well once baked and some don't. I tried a few of them: the Genesis 8 walk in heels, where there is a nice breast jiggle, works perfectly once baked inside Blender, while some facial animations look bad once baked, maybe because Eevee doesn't look as good for Genesis compared to Cycles. This new Face Mojo plugin looks good, but it's $200, and nobody has tested yet whether it's going to work in Blender the way I explained above.
I hope someone will make something that does shape keys in Blender like the plugin you mentioned for Maya. There is a version of Blender with the E-Cycles render engine; it's faster than Cycles but still slower than Eevee. I find myself limited sometimes with my GTX 1060, as its 6 GB sometimes runs out of memory with large scenes and I have to use the CPU instead, which takes forever.
For clarification, I do not use the Diffeo plugin for sending animations from Daz Studio to Blender, because I don't animate in Daz Studio; I use iClone for both face and body and send the FBX to Blender.
Reallusion provides us with a special .duf file that contains all of the face morphs for lip-sync & expressions, which we apply to our Genesis figures before exporting to iClone.
Perhaps a Diffeo user (who does animations) could explain how to send a figure, with face animation, to Blender.
$200 for the upcoming Face Mojo plugin sounds like a lot, but it looks to be a really good solution (for iPhone users). A Daz Studio plugin that requires an iPhone and only works with the G8 figures has a limited target demographic and has to be priced accordingly.
I will tell you how I do my facial animations using Diffeomorphic.
Step 1: in Daz, I make the character I want and save the .duf and .json to import it with Diffeo in Blender. A-pose is fine; you don't need the T-pose.
Step 2: I apply any type of aniblocks to the same character in Daz. I have a few with morphs that give nice breast jiggle and another for walking; check out the effects. I have another aniblock for facial animations, https://www.daz3d.com/real-facial-animation-for-genesis-8-maless-and-females — it does the job, but some facial animations don't export properly to Blender, maybe because they don't look good in Eevee.
Very important: in Daz, make sure to create multiple layers in the timeline and put every aniblock you want to use in a different layer, then bake to keyframes.
Step 3: you export everything as a pose in Daz, and you will get a window showing the number of frames before hitting the button.
Step 4: once you have imported the character in Blender, you merge all armatures and then import the action pose onto the character, et voilà!
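The layer-then-bake idea above can be sketched in plain Python: each timeline layer carries its own keyframes, and baking walks every frame so that each channel ends up with one value per frame. This is only a conceptual sketch of what "bake to keyframes" produces, not Daz's actual implementation (real layer blending is more sophisticated than the simple last-layer-wins rule used here).

```python
# Conceptual sketch of baking several animation layers down to one
# keyframe-per-frame track. Later layers override earlier ones on the
# channels they animate (a simplification of real layer blending).

def bake_layers(layers, n_frames):
    """layers: list of dicts {channel: {frame: value}}.
    Returns {channel: [value per frame]}, holding the last keyed
    value between keyframes (step interpolation, for simplicity)."""
    channels = {ch for layer in layers for ch in layer}
    baked = {ch: [0.0] * n_frames for ch in channels}
    for layer in layers:               # later layers win
        for ch, keys in layer.items():
            current = 0.0
            for f in range(n_frames):
                if f in keys:
                    current = keys[f]
                baked[ch][f] = current
    return baked

walk = {"hip_rot": {0: 0.0, 2: 1.0}}   # body aniblock layer
face = {"jaw_open": {1: 0.5}}          # facial aniblock layer
print(bake_layers([walk, face], n_frames=4))
```

Keeping each aniblock on its own layer, as in step 2's note, is what lets body and face channels bake independently without stomping on each other.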
Try it and tell me; it doesn't always work the way I want, but it gets the job done. I haven't yet tried applying an external BVH file on top of the baked facial aniblocks to see if it works, but I will.
Please try it and tell me how you find this technique, or if you discover another way. I don't use iClone because apparently my sci-fi characters don't import, because of geoshells or something; the shell is imported badly, not attached to the body. Even if I apply the T-pose before importing to CC3 it doesn't work, so I had to find another workflow.
That Face Mojo plugin apparently works fine; if everything can be baked as keyframes and imported to Blender the same way, it will be top notch. I hope the price will go down or a demo version will come out to try. I don't mind buying it if it works, but I need to try it first, plus facecap, which you also have to buy, though that's not as expensive.
@aelghazi For the face animation to work in the Diffeomorphic plugin you have to first import the face morphs. For ease of use they are divided into face units + visemes + expressions. If you are unsure what you need, just import all of them. Then you can bake the aniblocks and import the animation in Blender.
If the aniblocks use custom morphs, then you have to import them with the import morphs button.
http://diffeomorphic.blogspot.com/p/morphs.html
I am baking all the animations to keyframes in Daz and importing them as a single action pose in Blender. It works; however, I get a message that some morphs are missing, but when I play it, it looks OK. Since there are no shape keys to play with in Blender, nothing can be tweaked there for the face animations. I really want to try Face Mojo, but at $200 it's a bit expensive, plus $50 for facecap. If I can make the same animation and import it the same way into Blender, then the workflow will be very good.
A small test I did with Eevee; let me know what you think.
Like it ... is all the animation done in DAZ Studio or have you tried animating in Blender?
All in Blender; Daz was just used to create the character and to bake the facial and body animations. All the correction, lighting, and composition in general were done in Blender thanks to the Diffeomorphic plugin. I am sure that with facecap and Face Mojo it's going to be a game changer for mocap. Imagine the possibilities with 3D tracking into a real scene; you really can make something very nice. I played a lot with the nodes to achieve this result with Eevee.
Yeah, impressive. You are talking about things way over my head at the moment but I have acknowledged many times that if I want to use Blender, I will need to learn those things. Animation, IK, rigging, the node system ... my capacity for learning complex new techniques is not what it once was so I keep my head in the DAZ Studio sand and occasionally peek above the surface to see what others have done in Blender, etc.
But a thread on sharing the DAZ Studio to Blender workflow is a great idea and the more who post their techniques, the better for the 'fraidy-cats like me.
Believe me, I started learning Blender and Daz just when the quarantine started. I did basically everything in Daz, and just used Blender to make the character better looking, render faster, and be usable in a composition, a more complex scene. The facial and body animations are aniblocks that are baked into keyframes in Daz and then all exported to Blender using a different button from Diffeomorphic. Of course, to make the character look a bit better you need a basic knowledge of lighting techniques and composition, and to play around with the shading nodes to correct what is wrong, like skin or eyes. Basically I use Daz at the moment just to apply a facial animation, as I want to try facial mocap for my needs; I have 5 GB of body mocap BVH files that I can also apply in Blender to the character in A-pose. I will be happy to share my workflow and the techniques I used, and I hope we can build a small Daz-to-Blender community. We are still waiting for Bennie to share with us the cloth simulation technique he uses.
Indeed, I learned from Bennie's first video and eagerly await the next.
What you say about baking aniblocks to the DAZ Studio timeline is why I asked about whether you animate in Blender. Same goes for BVH and Mixamo ... these are imported animations rather than keyframed in the Blender timeline (I hope I'm using the correct terminology). So my question was really trying to get at how difficult (or not) it may be to use the Diffeomorphic plugin (or FBX, etc.) to import a fully clothed DAZ Figure and animate (with cloth sim) in Blender.
Blender has an oddity when it comes to cloth simulation. If the clothing is too dense (i.e. high vertex count) it starts to behave weirdly when simulated. My solution when that occurs is to scale the clothing and the character by 10, and then the simulation behaves nicely. I have tried to see if I could tweak the parameters to obtain something similar, but without real luck.
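One plausible reason the scale-by-10 trick helps (this is my reading, not an official explanation): Blender's cloth collision distances are absolute world-space values, so on a very dense mesh the collision margin can become comparable to, or larger than, the edge length itself, and the solver fights itself. A quick back-of-the-envelope check, assuming a collision distance around the 0.015 m default:

```python
# Rough check: compare a cloth mesh's typical edge length to an
# assumed absolute collision distance of 0.015 m. When the margin is
# a large fraction of the edge length, self-collisions get unstable;
# scaling the scene by 10 makes edges 10x longer while the margin,
# if left untouched, stays the same absolute size.

COLLISION_DISTANCE = 0.015  # metres; assumed default-ish margin

def margin_ratio(edge_length_m):
    """Fraction of an edge 'eaten' by the collision margin."""
    return COLLISION_DISTANCE / edge_length_m

dense_edge = 0.005            # 5 mm edges on a dense garment
print(margin_ratio(dense_edge))       # margin ~3x the edge length: bad
print(margin_ratio(dense_edge * 10))  # after scaling by 10: ~0.3
```

If this reading is right, shrinking the collision/self-collision distances should in principle do the same job as scaling up, though as noted above tweaking the parameters directly didn't work as well in practice.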
I hope I can develop my skills so I have an excuse to upgrade from my laptop to a nice RTX 2080 Ti tower, or maybe next-gen RTX. At the moment I still want to see the results of the facecap and Face Mojo plugins and whether a small short film can be achieved with nice quality and detail. There are more techniques I have to learn, such as simulations (fluid and cloth) and general composition in Blender. Realism is important for me; I am sad I have to use Eevee instead of Cycles, but it's not possible with my laptop. Yes, I chose the easy way to animate my character and take it back to Blender, but it was not easy to find that workflow, and at least it worked, thanks to Diffeomorphic.
I recommend you guys see what facecap is capable of; it's impressive, and the detail on this character is striking. I am sure some Daz characters can look just as realistic if they are tweaked in Blender, but that facecap thing really is impressive.
To be honest, it just looks like the stereotypical Daz/Poser breast-bouncing sexy strut that we have seen on YouTube for years.
No hair, or even Blender's excellent realtime SSS that is available in Eevee.
Not sure I see the point of going through the whole Diffeo export/import process when one can get a similar animation (with lip-sync) from a straight FBX export out of Daz Studio 4.12, without third-party plugins.
I believe the novelty/value of this clip is that we're seeing the same result from a completely different workflow: active bones and blends from DS successfully exported and rendered in a completely non-DS environment. If I understand that correctly, it is "a good thing" that we're seeing, assuming the workflow isn't any more complex than described.
appreciated,
--ms
It's getting better each day. Brekel and many others have had facecap for a while, but it always seemed to be 'not quite there' to the viewer - kind of an uncanny-valley with respect to the motion and emotion of the faces. I think most of these early version tools miss the cheeks and under-eye information, which is where a huge amount of the emotional bandwidth in humans occurs. This is probably a mechanical issue, because the point-clouds and edge-detectors don't have a lot of information to grab from those flat and texture-less areas.
The video in that post is inspiring.
For a good while longer I see myself using the facecap baseline in concert with aniblock-like or puppeteered 'enhancement' layers, and that being the most common path in the hobbyist (read: affordable) domain — DS, iClone, Carrara, and Poser all have their variants of these controls. I'd like to be proven wrong by better and affordable (app-native) technology, but I also think the baseline + enhancement model will *always* be needed — for control — especially with toon-oriented animations, where exaggeration is a key component of the medium.
cheers,
--ms
Maybe a similar animation, but not similar textures/shaders. I think Diffeo has a better shader import than a plain FBX file; that's the reason we are using Diffeo and not other kinds of Daz-to-Blender imports. At the very least it gives a good base to work with in Blender's shading node system, at least for people like me looking for photorealism.
Exactly; a different workflow is always good, as it opens possibilities that previous or other workflows can't offer, or only in a very limited way. The only bad thing is that Diffeo doesn't import blendshapes, and the only way to match the facecap data is to have those blendshapes; I have no idea how to create them, and it doesn't look easy at all. The good news is that Face Mojo, using facecap inside Daz, means we can face-mocap in Daz, bake that to keyframes, and import it into Blender the same way I did for aniblocks in my video; once that works, there are infinite possibilities in Blender to build on. I hope a new version of Diffeo will be more complete and able to import those damn blendshapes.
This is a single-frame Cycles render that took me 8 minutes. You can see the results are amazing, way better than an FBX import. Of course I played a bit with nodes and lighting, but I find it very difficult to get the same result from an FBX file.
The mapping of controls to sensors, as natively as possible, is kind of the big goal here, isn't it?
Then, being able to edit (correct, refine, exaggerate, etc.) after-the-fact is the other requirement for the control we seem to want.
I see it coming - with almost exponential velocity, but it may require that I get out of my rut to take advantage of it.
@aelghazi - your work proves how close we are, and where the additional work still needs to be done.
Good thread and insights going on here.
--ms
JCMs and proper materials, I would guess (not to mention the countless other things the Diffeomorphic addon does for you automatically if you plan to do anything more complex down the road).
You can check the Cycles frame and just tell me if FBX does the same. Maybe you'd have to delete all the shading nodes and start from zero; if you enjoy the process, fine, but I don't think it's a good idea, as Diffeo gives you a good base to start tweaking nodes from.
@aelghazi that image looks fantastic! The video also looks amazing, I'm so glad you got it all to work!
@Padone, or anyone that might have come across this, I hope someone can help me out here. This week has been insanely busy and I'm sitting down to relax a bit, and I wanted to see what some of the shots for my animated short would look like in Cycles now that the animation is done and I'm just waiting for the actors to send me their audio. If I see that it can help my film, I'm going to re-render it all in Cycles while I'm working on the audio and use what I have as the temp track. I went ahead and downloaded the newest version of the Diffeomorphic tool as well, so it's the development version 1.4.2.

Now, the girl looks stunning in Cycles for the reverse angle, but I'm having some difficulty with the guy and I don't know what the problem is. As you'll see, the shot that has the background and her in it was in Eevee, and the hair is exactly how I exported it in Daz. But when I kick it over to Cycles, the hair isn't shaved on the side anymore and his eyelashes look like squares over and under his eyes.

I've sat here for the past three hours or so putzing around in the Shading and UV Editing tabs trying to find the culprit, but nothing came of it; I can't get the shaved sides. And there's even a material set up in the material tab called SamuraiBun_Shaved-1. I checked every node and nothing worked. So I'm just wondering if this is something you've run across or not. It's not the end of the world, it's not going to break the film if I can't have the sides shaved, but aesthetically I like it better.
Thanks so much for any help in this matter!
I've come across the same issue while trying to reduce Cycles render time: under Light Paths > Max Bounces, make sure you don't have any value under 8, especially transmission and transparency. Try the total value as well; I hope it works for you.
I've gotten some improvement for sure in my animations and shading node setups; better, but still not where I want it to be.
Ah ha, max bounces! I will give this a try after the shot I'm rendering finishes, and see if the max bounces are the culprit. I do know I have it set to 4, as I had read earlier today from Blender Guru that he just sets it to four and generally it's fine, to speed up Cycles renders. But if it's going to affect hair like that, I shall boost it back up! Fingers crossed it works.
And keep at it, you'll get it to where you want it to be for sure. That's the joy of all this, it's a constant learning experience. Every day, every project brings a new challenge to figure out; keeps the brain on its toes :)
Let me know if it works.
It worked! I had to jack the max, transmission, and transparency bounces up to 200, but that finally fixed the hair! Great call, thank you so much!
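For reference, the bounce settings discussed above can also be set from Blender's Python console instead of the Light Paths panel. A minimal sketch (run inside Blender; 200 is the value that worked in this thread, though lower values may well suffice):

```python
# Run inside Blender's Python console. Raises the light-path bounce
# limits that were clipping the transparent hair cards.
import bpy

cycles = bpy.context.scene.cycles
cycles.max_bounces = 200              # total bounces
cycles.transmission_bounces = 200     # glass / refraction
cycles.transparent_max_bounces = 200  # alpha-mapped hair cards
```

Transparent bounces are usually the one that matters for card-based hair: each alpha-mapped card a ray passes through consumes one transparent bounce, so stacked cards run out of bounces fast and render as opaque squares.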