A team from EPFL's Computer Graphics and Geometry Laboratory (École Polytechnique Fédérale de Lausanne, School of Computer and Communication Sciences) has released Faceshift, new facial animation software for Kinect that you can see in action in the video above. (This is primarily software for professional animators and developers, not a Kinect app for the Xbox 360.) One thing that separates Faceshift from similar Kinect software is that it can actually track your eyes!
Here’s more info from the Faceshift website:
MARKERLESS MOTION CAPTURE AT EVERY DESK
We believe that technology should enable creativity, that it should be at the same time simple and powerful, and that it should be accessible to everyone.
Now for the first time you can capture facial animations that are realistic and emotional, with a system that is affordable and easy to use.
- Gaze Tracking
- Head Pose Tracking
- Realtime Tracking
- Offline Refinement
Excellent tracking is the core of faceshift. We can track with high accuracy because we learn your personalized avatar. The avatar is created from a few training expressions, and the resulting tracking is very stable, accurate, and expressive. We track 48 blendshape parameters, which allows us to capture even subtle emotions.
We also track the eye gaze and head pose, bringing the characters alive. And we do all of this in realtime, giving you the possibility to adapt your acting on the fly. For the final result, we offer an offline refinement and editing stage which further increases the tracking accuracy.
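The "48 blendshape parameters" mentioned above refer to a standard facial-animation technique: each parameter is a weight that blends the neutral face toward a target expression, vertex by vertex. Here is a minimal sketch of that blending math in Python; the mesh, shape names, and function are illustrative, not Faceshift's actual implementation.

```python
# Illustrative sketch of blendshape evaluation. Each tracked parameter
# (Faceshift reports 48 of them) is a weight in [0, 1] that scales a
# per-vertex offset from the neutral face toward a target expression.
# All names and data here are hypothetical.

def apply_blendshapes(neutral, targets, weights):
    """Linearly blend a neutral mesh with weighted blendshape targets.

    neutral: list of (x, y, z) vertex positions for the neutral face
    targets: dict mapping shape name -> list of (x, y, z) positions
    weights: dict mapping shape name -> weight in [0, 1]
    """
    result = [list(v) for v in neutral]
    for name, weight in weights.items():
        if weight == 0.0:
            continue
        for i, (tv, nv) in enumerate(zip(targets[name], neutral)):
            for axis in range(3):
                # Add the weighted offset of this target from neutral.
                result[i][axis] += weight * (tv[axis] - nv[axis])
    return [tuple(v) for v in result]

# Two-vertex toy mesh: a half-weighted "jawOpen" shape moves the jaw
# vertex halfway toward its target position.
neutral = [(0.0, 0.0, 0.0), (0.0, -1.0, 0.0)]
targets = {"jawOpen": [(0.0, 0.0, 0.0), (0.0, -1.5, 0.25)]}
weights = {"jawOpen": 0.5}
print(apply_blendshapes(neutral, targets, weights))
# -> [(0.0, 0.0, 0.0), (0.0, -1.25, 0.125)]
```

Because the blend is linear, a tracker only needs to solve for the 48 weights each frame; the mesh deformation itself is cheap, which is what makes realtime playback on a laptop plausible.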
- File Based
- Animation Export (fbx)
- Blendshape Curves Export
- Virtual Marker Export
- Plugin Based
- MotionBuilder
- TCP/IP streaming via open format
Excellent tracking is useless if it is not easy to get the results onto your own rigs, in your own workflow. There are four ways of using faceshift:
– export the animation on one of our predefined rigs or personalized avatars as an FBX file, which can be imported into every major animation software package.
– work directly within your animation package with our Maya or MotionBuilder plugins.
– integrate with existing marker-based pipelines. No more need to place the markers accurately on the actor; you define them once and export as BVH or C3D for every take.
– and for the real geeks we offer a TCP/IP streaming protocol which gives you access to all the tracking data in real time, allowing you to come up with your very own workflows and applications.
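To make the last option concrete, here is a hedged sketch of what a client for such a TCP stream might look like in Python. The port number, the length-prefixed framing, and the frame layout (a double timestamp followed by 32-bit float blendshape weights) are all assumptions invented for illustration; the real faceshift wire format is defined by its own protocol documentation and will differ.

```python
# Hypothetical client for a realtime tracking stream over TCP.
# Framing and layout are illustrative assumptions, not the documented
# faceshift protocol.

import socket
import struct

def parse_frame(payload):
    """Decode one illustrative frame: <timestamp: f64><weights: f32 ...>."""
    (timestamp,) = struct.unpack_from("<d", payload, 0)
    count = (len(payload) - 8) // 4
    weights = struct.unpack_from("<%df" % count, payload, 8)
    return timestamp, list(weights)

def stream_frames(host="localhost", port=33433):
    """Yield (timestamp, weights) tuples from a running tracker.

    Assumes each message is a 4-byte little-endian length prefix
    followed by that many bytes of frame data. The port is a
    placeholder; check the application's streaming settings.
    """
    with socket.create_connection((host, port)) as sock:
        while True:
            header = sock.recv(4)
            if len(header) < 4:
                return  # stream closed
            (length,) = struct.unpack("<I", header)
            payload = b""
            while len(payload) < length:
                chunk = sock.recv(length - len(payload))
                if not chunk:
                    return
                payload += chunk
            yield parse_frame(payload)

# Offline check of the parser on a hand-built frame
# (timestamp 0.125 s, three blendshape weights):
frame = struct.pack("<d3f", 0.125, 0.5, 0.25, 1.0)
print(parse_frame(frame))
# -> (0.125, [0.5, 0.25, 1.0])
```

A consumer would simply iterate `for timestamp, weights in stream_frames(...):` and feed the weights to its own rig each frame, which is what makes the streaming route attractive for custom applications.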
- Custom Avatar Creation
- Fully Rigged
- Gaze Tracking
We did not set out to make avatar creation software, but the result is pretty cool. As a byproduct of using faceshift, you get a fully rigged, expressive avatar which you can export for use in your own applications, and which you can drive with faceshift.
And people are coming up with very creative uses for this, from sliced wooden head sculptures to surreal fashion photography.
- Runs with
- Microsoft Kinect
- Asus Xtion PRO LIVE
- Runs on
- Linux [not yet released]
We support all major operating systems and affordable RGBD cameras. And we made sure that our software is efficient enough that you can run faceshift and your animation software simultaneously, even on current laptops.
- Video Tutorials
- Responsive support
- Affordable support agreements
We know that not everything goes smoothly all the time. That is why we are proud to offer responsive support and extensive self-help material. If something is unclear, drop us a mail or visit our support page.
Source: EPFL News' YouTube channel, Faceshift.com, ryandmonk's YouTube channel, Tibi Ti.'s YouTube channel. Via: http://gizmodo.com/5960875/how-your-video-game-character-could-soon-share-your-rage-face