![faceshift with kinect](https://i1.rgstatic.net/publication/278524063_Neural_Correlates_of_Facial_Motion_Perception/links/5581475508ae607ddc324444/largepreview.png)
It seems there is no reason to move from Unity to UE4 just to get better graphics. I recommend trying the Microsoft Kinect SDK; direct input looks very good with the plugins available for Unity3d.
![faceshift with kinect](https://new.cgvisual.com/wp-content/uploads/2020/05/JUL12_KINPRO.jpg)
I just want to transmit tracking data from my Kinect to UE and apply it to a 3d model in real-time. As I said, I don't want to record animation.

If you want to achieve good quality, use depth cameras.

Simply put, you might want to record an animation and then import it into UE4, because direct input (without any proper tweaking/smoothing and so on) will look like ****. Also, facial movements are way too subtle, and a 2d image won't capture all the expressions… when 4D scans can be processed in realtime, then we can talk about something incredible. Realtime facial tracking currently lacks proper cameras (no 60fps to be seen around, so goodbye proper lipsync, unless you use a GoPro with a proper usb/hdmi converter) and the technology, even if Faceware is about to release the tech for everyone to use. We're at least 2 years from that being a reality.
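A first pass at the smoothing mentioned above can be as simple as an exponential moving average over the tracked values before they are applied to the model. This is a minimal sketch, not part of the Kinect SDK or any UE4 API; the `smooth_weights` helper and the `jaw_open` blendshape name are hypothetical:

```python
def smooth_weights(raw_frames, alpha=0.3):
    """Exponentially smooth a sequence of {blendshape: weight} dicts.

    alpha in (0, 1]: higher = more responsive, lower = smoother.
    Raw per-frame tracking data jitters; an EMA trades a little
    latency for visual stability on the driven 3d model.
    """
    smoothed = {}   # last smoothed value per blendshape
    out = []
    for frame in raw_frames:
        for name, value in frame.items():
            prev = smoothed.get(name, value)  # seed with first observation
            smoothed[name] = prev + alpha * (value - prev)
        out.append(dict(smoothed))
    return out

# A noisy "jaw_open" track settles toward the signal instead of flickering.
frames = [{"jaw_open": 0.0}, {"jaw_open": 1.0},
          {"jaw_open": 0.0}, {"jaw_open": 1.0}]
print(smooth_weights(frames))
```

Tuning `alpha` per blendshape (or switching to a velocity-adaptive filter such as the One Euro filter) is the usual next step, since lip-sync shapes need more responsiveness than brow shapes.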