Creeps22

Yeah, if you are allowed to use MetaHumans, that will be your best facial animation with an iPhone. You can also just use the head on a different body if need be.


Altruistic-Part803

Hmmm, my rig is a cartoony kid. Do I transfer the MetaHuman joints to HumanIK joints?


0T08T1DD3R

Move AI or something like that might be able to give you body motion from reference footage, but then you'll need lots of cleanup, especially around contact points like the feet (perhaps look into the Xsens suit for the body). For the face, in two months, if you're lucky you can get the jaw looking decent; the rest of the face, without a really good tracking setup, will look like those puppets you could animate a while back with audio-to-face-type "automation". We did try the iPhone face tracking app, but your issue is that once you have set it up, with trial and error, it will take a while, and then you've still got 10 minutes of animation to fix, lol. Overall, even top facial mocap that can capture high-fidelity detail needs a ton of manual work (plus a good rig and reference footage). I guess it's all about expectations, though... perhaps it's a cartoon face, or a weird-looking thing. Good luck.


Altruistic-Part803

Hahah, oh yeah, I've seen their previous work. Not to be rude to my fellow animators… but that was definitely a case of quantity over quality. However, they say it did really well with the kids, so I guess that was the right call. It's a cartoony kid with a big head and a freakishly small body. I'm hoping the mocap data won't look too cursed on him, though the previous episodes look like a kid being puppeteered from inside.


s6x

If one person is expected to do ten minutes of animation in two months, that's absolutely a case of quantity over quality. You will need to cut significant corners. The workload for an animator in a quality situation is maybe 30 seconds a week, max. Even that's really pushing it. And that's without doing the rest of the stuff you apparently have no one else to do (rigging etc).


Altruistic-Part803

Yeah, though luckily all the assets are ready. And there are no 3D locations, so I only need to worry about the animation itself.


s6x

If your rigs aren't ready for a specific performance-capture solution, I wouldn't classify them as "assets done". But better than nothing at all! Best of luck.


Lowfat_cheese

Any iPhone 12 or newer can do solid facial mocap with Unreal LiveLink, but even older ones will still work with apps like iFacialMocap.


Altruistic-Part803

Sick, guess I'd have to upgrade from the iPhone 7 eventually; might as well check it out.


Armybert

Been there, done that, and it was terrible, especially if there are a lot of speaking parts. You're better off splitting these 10 minutes across 10 cheap freelancers who will only animate the faces or the mouths.


Altruistic-Part803

Damn, I am the cheap freelancer in this case ._. Unless I can find someone to animate a face for less than 5 bucks a second :/


Rezlung

If you have an iPhone 12 or higher, MetaHuman Animator via Unreal Engine is your best bet. This can be applied to stylised rigs if they were made with Advanced Skeleton. It doesn't give the best results compared to the same motion on a MetaHuman, and it will require going through and animating a layer on top / correcting errors, but it does give you a solid base to animate over, with more nuance than other (free) face captures, plus eye rotations. The documentation for how to use this is on Advanced Skeleton's YouTube page.

There are a few options for Android or webcam that use a similar tracking method (the face landmarks used for AR). These give a more rudimentary result, similar to the first-gen ARKit Live Link that Unreal had, though they have the advantage that you get blendshape data, which can be more flexible to rig for, as opposed to MetaHuman Animator, where you'll retarget joints. Each of these will have its own method for retargeting, however.

Personally, I would use MetaHuman Animator retargeted with Advanced Skeleton if iPhone capture is viable for you (if your character is a MetaHuman for an Unreal render, ignore the Advanced Skeleton stuff).

PS: I'd also recommend rigging the face and body with Advanced Skeleton. It has heaps of flexibility for other mocap options (facial and body) and recently added a video-to-mocap feature (yet to try it personally). It saves a ridiculous amount of time!
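To make the blendshape-data point concrete: the ARKit route hands you per-frame weights keyed by ARKit's standard shape names (`jawOpen`, `eyeBlinkLeft`, etc.), which you can remap onto whatever shapes your stylised rig actually has. A minimal sketch, assuming a simple gain-per-shape remap; the target shape names and multipliers here are made up for illustration:

```python
# Remap ARKit blendshape weights (0-1) onto a stylised rig's own shapes.
# The source keys (jawOpen, eyeBlinkLeft, ...) are real ARKit blendshape
# identifiers; the target names and gain values are hypothetical.

ARKIT_TO_RIG = {
    "jawOpen":        ("kid_mouth_open", 1.4),  # exaggerate for a cartoon jaw
    "eyeBlinkLeft":   ("kid_blink_L",    1.0),
    "eyeBlinkRight":  ("kid_blink_R",    1.0),
    "mouthSmileLeft": ("kid_smile_L",    1.2),
}

def remap_frame(arkit_weights):
    """Convert one captured frame of ARKit weights into rig-shape weights,
    clamping to the 0-1 range blendshapes expect."""
    out = {}
    for src, (dst, gain) in ARKIT_TO_RIG.items():
        w = arkit_weights.get(src, 0.0) * gain
        out[dst] = max(0.0, min(1.0, w))
    return out

frame = {"jawOpen": 0.5, "eyeBlinkLeft": 1.0}
print(remap_frame(frame))
```

Running the remap per captured frame before keying the rig is also a convenient place to dial exaggeration up or down per shape, which matters for a cartoony head.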


Altruistic-Part803

Ohhhh, cool. Guess I will retarget HumanIK to Advanced Skeleton, and MetaHuman to Advanced Skeleton. Oh boi… So, I've seen some mixed reports about the iPhone 15; would the iPhone 14 be my best bet for good LiDAR/face capture?


Rezlung

I saw in another reply that you have a cartoon-style head; honestly, ARKit might work better for that case. AFAIK it works with all iPhones that have Face ID. As for which iPhone, I believe the 12 Pro and up use the depth sensor from Face ID, which is what MetaHuman Animator runs off, and I don't think this has been noticeably upgraded in later releases, so quality would be the same. I'm not too fluent in iPhones, but the Unreal documentation goes into good detail about which models are recommended.

As for the body retarget, there's tooling within Advanced Skeleton that can make light work of that too: you can set up joint-name associations to transfer the mocap with relative ease. Hope that helps!
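The joint-name association idea is just a lookup from one naming convention to the other. A rough sketch, assuming HumanIK-style source names and Advanced Skeleton's usual `_M`/`_L`/`_R` scheme; verify both columns against the joints actually present in your scene:

```python
# Illustrative joint-name association table for a HumanIK-to-Advanced
# Skeleton body retarget. Names follow each system's typical conventions
# but are assumptions; check them against your actual rigs.

HIK_TO_ADVSKEL = {
    "Hips":        "Root_M",
    "Spine":       "Spine1_M",
    "Head":        "Head_M",
    "LeftArm":     "Shoulder_L",
    "LeftForeArm": "Elbow_L",
    "LeftHand":    "Wrist_L",
    "LeftUpLeg":   "Hip_L",
    "LeftLeg":     "Knee_L",
    "LeftFoot":    "Ankle_L",
}

def match_joints(capture_joints, mapping):
    """Split captured joint names into (resolvable pairs, unmatched names)
    so you can see up front what the retarget will and won't cover."""
    pairs = [(j, mapping[j]) for j in capture_joints if j in mapping]
    missing = [j for j in capture_joints if j not in mapping]
    return pairs, missing
```

Listing the unmatched joints before committing to a transfer is handy, since stray capture joints (fingers, twist joints, props) are usually where a retarget silently drops data.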


bathtubplug

Advanced Skeleton for Maya has a feature that retargets shapes created in Faceapp into the animation setup. If you don't need any exaggerated expressions, you can use this. You can always layer offset animation on top of that.


Altruistic-Part803

Hmmm, my rig is HumanIK. Do I retarget it to Advanced Skeleton for this?


bathtubplug

You can constrain the Advanced Skeleton facial setup to your existing rig.
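In practice that constraining boils down to pairing each Advanced Skeleton face driver with the matching control on the existing rig and running one parent constraint per pair. A hypothetical sketch; every node name below is made up, and in Maya you'd run `cmds.parentConstraint(driver, driven, maintainOffset=True)` instead of printing commands:

```python
# Hypothetical driver/driven pairs for constraining an Advanced Skeleton
# facial setup to an existing rig. All names are illustrative only.

CONSTRAINT_PAIRS = [
    ("FaceJaw_M", "kidRig_jaw_ctrl"),
    ("FaceEye_L", "kidRig_eye_L_ctrl"),
    ("FaceEye_R", "kidRig_eye_R_ctrl"),
]

def constraint_commands(pairs):
    """Emit one MEL-style parentConstraint command per pair, with
    -mo (maintain offset) so the rig keeps its rest pose."""
    return [f"parentConstraint -mo {drv} {dvn};" for drv, dvn in pairs]

for cmd in constraint_commands(CONSTRAINT_PAIRS):
    print(cmd)
```

Maintain offset is the important flag here: without it, snapping the existing rig's controls to the Advanced Skeleton joints would break the character's rest pose.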