Body and Facial MOCAP | Rokoko + iClone

published on July 17, 2020

Hello and welcome to this virtual production vlog. Today we're going to be focusing on facial motion capture animation using an iPhone X (in this case an iPhone 11, actually). We're going to be using a program called iClone for the face animation, and we're going to be mixing that with our Rokoko mocap at the same time, so let's jump right into this video. Yeah, this is how I ended it. It's awkward, okay.

[Retake] Hello and welcome to this virtual production vlog. Today we're going to be focusing on facial motion capture animation... [Retake] Hello and welcome to this virtual production vlog. Today we're going to be capturing facial animation using an iPhone. Can you use an Android or something else? The answer is no. Anyway, we're going to take the facial mocap data from the phone and combine it with the body mocap from our Rokoko suit, ahem, the Rokoko Smartsuit Pro.

Let's go over, at a high level, what we're going to get done today and how this works. First, we're going to stream body mocap data from the Smartsuit to Rokoko Studio. That's what we did last time, but instead of recording the mocap data in Rokoko Studio, we're going to pass it to a program called iClone, which will take the body mocap data. In iClone, we're then going to use Reallusion's face app, LIVE FACE, on my iPhone; that app records my face data and also sends it to iClone. So iClone is kind of the hub: it takes the face data and the body data and puts them together, and then in iClone I record the performance, my face and my body synced together.
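To make the "hub" idea concrete, here's a toy Python sketch of pairing each incoming body frame with the face frame closest to it in time. The frame format and field names here are made up for illustration; this is not the actual Rokoko or LIVE FACE data format:

```python
from bisect import bisect_left

def nearest(frames, t):
    """Return the frame whose timestamp is closest to t (frames sorted by t)."""
    times = [f["t"] for f in frames]
    i = bisect_left(times, t)
    candidates = frames[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda f: abs(f["t"] - t))

def merge_streams(body_frames, face_frames):
    """Attach the nearest-in-time face blendshapes to each body frame."""
    return [{**b, "face": nearest(face_frames, b["t"])["blendshapes"]}
            for b in body_frames]

body = [{"t": 0.000, "pose": "A"}, {"t": 0.033, "pose": "B"}]
face = [{"t": 0.010, "blendshapes": {"jawOpen": 0.2}},
        {"t": 0.040, "blendshapes": {"jawOpen": 0.5}}]
merged = merge_streams(body, face)
print(merged[0]["face"], merged[1]["face"])  # {'jawOpen': 0.2} {'jawOpen': 0.5}
```

A real hub also has to deal with dropped packets and clock drift between the phone and the suit; this sketch ignores both.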

Step one is to start Rokoko Studio, turn on your Smartsuit, and hit the Calibrate button. Now go to the settings up here, under Studio Live, and you can see that from here you can send the data to all sorts of programs: Unity, Unreal Engine, MotionBuilder. We're going to make sure we're sending it to iClone.

This is Character Creator, made by Reallusion, and it's the program I use to make digital humans for Cine Tracer. You can get clothes, hair, and all sorts of other customizations for the humans, and it has a really good workflow back to Unreal Engine; it's basically one click, and it's really great. So what we're actually going to do is send this character to iClone. Where is that... Send Character to iClone.

So now we're in iClone. This is actually a different character, but I did the same thing, I just sent it right here. What you can do with this human now is live-puppet them, if you have the Motion LIVE plugin, which you need. So I'm going to load Motion LIVE here, and you'll see that I have two profiles, which you have to go get separately: LIVE FACE for the face, and Rokoko for, you guessed it, the body. I'm going to click on this... okay, you can see now that I have Rokoko Studio going, and it's passing the data into iClone. It's really easy; there's not much to do. Just turn on the live link, take it into Motion LIVE, and you've connected the two together.
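Under the hood, Studio Live is just mocap frames arriving over the network. Rokoko Studio can also stream to a custom address as JSON over UDP, which is handy if you ever want to consume the data yourself. Treat this as a sketch only: the port number and the payload schema differ between Studio versions, so both are placeholders here:

```python
import json
import socket

def receive_one_frame(port=14043, timeout=5.0):
    """Receive a single JSON mocap packet over UDP.

    The port and the shape of the JSON payload depend on your Rokoko
    Studio version and streaming settings; check the Studio docs.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    sock.settimeout(timeout)
    try:
        data, _addr = sock.recvfrom(65535)  # one datagram = one frame
        return json.loads(data.decode("utf-8"))
    finally:
        sock.close()
```

In a real consumer you'd loop on `recvfrom` and hand each decoded frame to whatever is doing the retargeting.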

Alright, okay. Now we're going to add the facial mocap animation into iClone. I'm running an app called LIVE FACE here; it's free, it's from Reallusion, and it's going to handle all the iPhone face stuff. All you need to do is connect it to your Motion LIVE setup here, so I'm going to click on Connect, and there it goes: "client app connection successful." The phone and the computer just have to be on the same Wi-Fi; the app tells you the IP, you type it in, pretty easy. And there it is. So now you can see that I'm moving my body here and I'm also animating the character's face as well. What's happened now, though, is that Rokoko Studio has kind of lost where I am in the world.
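For context on what that face data actually is: LIVE FACE rides on the iPhone's ARKit face tracking, which reports named blendshape coefficients in the 0 to 1 range (names like `jawOpen` and `eyeBlinkLeft`). Driving a character's morph targets from those values is essentially a mapping table plus some gain and clamping. A minimal sketch, with a made-up mapping and made-up morph-target names:

```python
# Map a few incoming ARKit-style blendshape coefficients (0..1) onto a
# character's morph targets, with a per-target gain. The mapping table,
# gains, and target names are invented for illustration.
ARKIT_TO_MORPH = {
    "jawOpen": ("Mouth_Open", 1.0),
    "eyeBlinkLeft": ("Eye_Blink_L", 1.2),
    "eyeBlinkRight": ("Eye_Blink_R", 1.2),
}

def apply_face_frame(coeffs):
    """Return morph-target weights, clamped to [0, 1]."""
    out = {}
    for src, (target, gain) in ARKIT_TO_MORPH.items():
        value = coeffs.get(src, 0.0) * gain
        out[target] = max(0.0, min(1.0, value))
    return out

print(apply_face_frame({"jawOpen": 0.4, "eyeBlinkLeft": 0.9}))
# {'Mouth_Open': 0.4, 'Eye_Blink_L': 1.0, 'Eye_Blink_R': 0.0}
```

The gain-and-clamp step is why a character can blink fully even when the raw tracking never quite reaches 1.0.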

The way you fix this is basically by doing a recalibration, so I'm going to recalibrate standing right here. Okay, cool. Now I'm back on my origin and I'm not turned around or anything, so that's good. I'm going to put Rokoko Studio back over there, and let's go for another recording and see how it goes. We're doing the face and the body. And... action: "Hello and welcome to this virtual production vlog. Today we're going to be focusing on facial motion capture animation using an iPhone X, in this case an iPhone 11, actually. We're going to be using a program called iClone for the face animation, and we're going to be mixing that with our Rokoko mocap at the same time."

So that's basically what the motion capture process looks like. Ideally you'd have the phone mounted to your body or your head so that it follows you (people use helmets), and then you can walk around while it captures your face at the same time. I don't have that rigged yet; maybe I'll get it in the future. For now I'm just standing in front of my desk, but normally you would mount the phone to your body. Long story short: we have body animation, we have face animation, and I recorded audio into this as well. I hope I did. I think I did.

Now we can take this data and export it to Unreal Engine. To get it into Unreal Engine, all we have to do is hit File > Export > FBX. I'm going to set it to 30 frames per second, and you'll see that we have a preset for Unreal, awesome. I'll use the range here, that's fine. It's also going to send the mesh, but we only need the animation, that's it. So I'll hit Export.
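One thing worth understanding in that export dialog is the frame rate option: if the capture ran at a different rate than the 30 fps you export at, the keys get resampled. Here's a hedged linear-interpolation sketch of that idea (real exporters do smarter curve fitting than this):

```python
def resample(samples, src_fps, dst_fps):
    """Linearly resample a list of per-frame values from src_fps to dst_fps.

    Assumes integer frame rates and at least two samples.
    """
    n_out = (len(samples) - 1) * dst_fps // src_fps + 1
    out = []
    for i in range(n_out):
        t = i * src_fps / dst_fps          # position in source frames
        lo = int(t)
        hi = min(lo + 1, len(samples) - 1)
        frac = t - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# Going 60 fps -> 30 fps keeps every other sample:
print(resample([0, 1, 2, 3, 4], 60, 30))  # [0.0, 2.0, 4.0]
```

The same function upsamples too: `resample([0, 2], 30, 60)` interpolates a midpoint between the two keys.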

Okay, so we're in Unreal Engine. I've exported this Character Creator character into Unreal Engine and got her all set up, and these are my other tests down here. Now I'm going to import that animation data, so let's get the iClone vlog-intro take, the one I just recorded. We're going to import it, and these settings are important, so give it a second. Great. We do not want the mesh; we just want the animation. In my case I've retargeted everything to the UE4 mannequin skeleton, and all these default settings will work pretty well. We'll hit Import, and this takes a hot second: it's importing animation curves for every single bone and every single morph target, so it takes a while. We'll be back in a little bit.

While that's importing into Unreal Engine, we actually have to export the recorded audio as well. You could of course record the audio separately and sync it later, but iClone will actually record it for you, which I think is nice. It's a WAV file; all these settings are fine, and we'll name it to match the take.

Okay, so that animation finally got imported. It does take a while, and it is a dense thing: look at all these keys and curves. It has every bone in the body and every morph target on the face. It's a whole lot, but you don't really have to deal with that unless you feel like editing it; otherwise it's pretty easy. All I did in this case was import that default Character Creator person and put the animation onto her. That's it. Really straightforward, really cool. Let's take a look.
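I mentioned that you could record the audio separately and sync it later. If you go that route, the standard trick for finding the offset automatically is cross-correlation against a reference track. Here's a brute-force toy version on plain sample lists; real tools use normalized cross-correlation on the actual waveforms:

```python
def best_offset(ref, clip):
    """Return the sample offset of `clip` within `ref` that maximizes
    the raw cross-correlation (brute force; fine for short clips)."""
    best, best_score = 0, float("-inf")
    for off in range(len(ref) - len(clip) + 1):
        score = sum(ref[off + i] * clip[i] for i in range(len(clip)))
        if score > best_score:
            best, best_score = off, score
    return best

ref = [0, 0, 0, 1, 2, 1, 0, 0]   # "camera" audio
clip = [1, 2, 1]                  # separately recorded audio
print(best_offset(ref, clip))     # 3
```

Once you have the offset in samples, dividing by the sample rate gives you the slide in seconds to apply in your editor.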

Okay, so there's some weird bouncing happening; I think that's my fault. There's definitely a lot of nuance and practice involved in actually performing with the iPhone and performing in the Rokoko suit, but technically we have this all working: we can see the face moving (okay, it froze there) and we can see the body going at the same time. With this playing and this FBX kind of proofed, I'm going to bring it into Cine Tracer (oop, she started again) and film it with my new virtual camera. Should be fun. Then we'll sync up the audio in post. I'm not going to sync it in Unreal Engine; you can, but I'm just not going to do that.

[Playing the take] "Hello and welcome to this virtual production vlog. Today we're going to be focusing on facial motion capture animation using an iPhone X, in this case an iPhone 11, actually. We're going to be using a program called iClone for the face animation, and we're going to be mixing that with our Rokoko mocap at the same time, so let's jump right into this video." Yeah, this is how I ended it. It's awkward, okay.

So I just brought the mocap data for the body and face into Cine Tracer, and I have my v-cam hooked up and I'm filming some takes with it. It's pretty cool. [Playing the take] "Hello and welcome to this virtual production vlog. Today we're going to be focusing on facial motion capture animation using an iPhone X, in this case an iPhone 11, actually. We're going to be using a program called iClone for the face animation, and we're going to be mixing that with our Rokoko mocap at the same time, so let's jump right into this video." Yeah, this is how I ended it. It's awkward.

Okay, so the crazy part is that the light is there. Like, I'm filming in the real world; that's where my key light is. It's freaky, this is super freaky, holding the phone like this. And I really love using this v-cam; it's like my thing now.

So that pretty much wraps it up for the vlog. We've combined our Rokoko mocap with our face mocap, and we joined it all together in iClone, because iClone makes it really easy to get all that data, and a human character, into Unreal Engine. It's really, really good, really simple.

Speaking of which: if you're going to GDC and you're interested in meeting up on Tuesday, I'll leave a link below. I have a Facebook event; we're doing an unofficial Unreal Engine meetup, and I think a lot of the Unreal Engine people are coming too. So look out for that if you're going to GDC and you want to hang out and talk Unreal Engine, virtual production, whatever. I'll be there. I'm also speaking for SideFX about my Houdini workflow, so I'll link to that talk. And I'm going to keep doing this virtual production stuff, vlogging, and making content and new stuff for Cine Tracer until GDC, so I'll keep reminding people, and if there are any new talks or events I'm going to at GDC, I'll let you know.

Right after that is NAB, which I'm going to as well, but nothing concrete yet on the dates for speaking and meetups and whatnot there. There's going to be a whole lot of virtual production and Reallusion stuff at NAB this year, so those are both going to be really cool events.

So that wraps it up for this vlog. I'll see you guys on the next one.
