Today I joined the @ClipeCurie training workshop to present the technical work behind the avatar and avatar-animation libraries we produced at @MSFTResearch. Some notes. 🧵
1/ I ❤️ the @HorizonEU projects that put a special focus on training students through the @MSCActions. CLIPE is creating very consequential tools and future leaders in the space, with folks truly collaborating across the continent: @Inria, @MPICybernetics, @ucl, @UCYOfficial, @la_UPC ++
2/ My work has kept growing & growing, from avatars to locomotion to haptics to cognition to interaction to real-time systems. It feels great to give a very focused talk on just 1 TOPIC. For the longest time I have wanted to focus on our open-sourcing effort on avatars at @MSFTResearch.
3/ I always felt: “but we are in the middle of building more tools! I should wait.” A big shout-out to my partner in crime, @Eyal_Ofek, for this one. So today was the day, and at CLIPE I presented: “Using Existing libraries to animate avatars”. And now a summary of the talk:
4/ This release was accompanied by a “largish” seminal paper on avatars: why they are useful & how they can be made. We lined up key players across industry & academia (@melslater, @StanfordVR). With so many authors, the review happened from the inside 😅 frontiersin.org/articles/10.33…
5/ There was still a gap for easy mocap & real-time animation, so I led the next release, the MoveBox: 3 projects in 1. (1) A capture studio with real-time remapping from the Kinect v2 and Azure Kinect, where you can record and replay movements in @unitygames.
(2) A first-person project where you can put on an HMD and embody an avatar, with basic IK for the arms and hand/finger tracking compatible with the @oculus Quest, but also @htcvive and more.
(3) A #SMPL model parser. This parser was a contribution of @Cornell_VR & @ericwang0701 and can be used to import body-tracking information for N avatars recovered from videos using CV tools like the ones developed by the team of @Michael_J_Black at @MPICybernetics.
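The heart of such a parser is converting SMPL's per-joint axis-angle rotations into the quaternions a game-engine rig expects. A minimal Python sketch of that step (function names are mine for illustration, not the actual MoveBox API):

```python
import math

def axis_angle_to_quaternion(rx, ry, rz):
    """Convert one axis-angle rotation (radians) to a quaternion (w, x, y, z)."""
    angle = math.sqrt(rx * rx + ry * ry + rz * rz)
    if angle < 1e-8:
        return (1.0, 0.0, 0.0, 0.0)  # near-zero rotation: identity quaternion
    s = math.sin(angle / 2.0) / angle  # scale factor for the normalized axis
    return (math.cos(angle / 2.0), rx * s, ry * s, rz * s)

def pose_to_quaternions(pose):
    """SMPL encodes a body pose as 24 joints x 3 axis-angle values (72 floats).
    Return one quaternion per joint, ready for retargeting onto an avatar rig."""
    assert len(pose) == 72, "expected a 72-dim SMPL pose vector"
    return [axis_angle_to_quaternion(*pose[i:i + 3]) for i in range(0, 72, 3)]
```

A zero pose vector yields 24 identity quaternions (the rest pose); the retargeting step then maps each joint rotation onto the corresponding bone of the target avatar.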
6/ We then focused on facial animation. @volontematias helped create a tool that lets anyone author facial expressions inside Maya & easily export FBX files with blendshapes for all the avatars in the library. We released v2 of the #MicrosoftRocketbox, now with over 100 blendshapes.
Compatible with FACS (for OpenFace), visemes (for SALSA and Oculus Lipsync), and Vibe facial expressions. Plus 17 additional emotions for easy code-driven triggering of happy/angry etc. faces. Demos of these setups are released with the new HeadBox: github.com/openVRlab/Head…
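Under the hood, code-driven emotion triggering amounts to applying a preset of blendshape weights. A toy Python sketch (the shape names, weight values, and helper are hypothetical, not HeadBox's API; weights follow Unity's 0–100 convention):

```python
# Hypothetical presets: each named emotion maps blendshape names to target weights.
EMOTION_PRESETS = {
    "happy": {"mouthSmile": 80.0, "cheekRaise": 40.0},
    "angry": {"browLower": 70.0, "jawClench": 50.0},
}

def apply_emotion(weights, emotion, intensity=1.0):
    """Return a new weight dict with the emotion preset blended in at the
    given intensity, clamped to Unity's 0-100 blendshape-weight range."""
    out = dict(weights)
    for shape, value in EMOTION_PRESETS[emotion].items():
        out[shape] = min(100.0, out.get(shape, 0.0) + value * intensity)
    return out
```

In Unity the resulting weights would then be pushed to the avatar's SkinnedMeshRenderer with SetBlendShapeWeight, one call per shape.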
The team led by @panxueni rapidly took on this new tool and built compatibility with ARKit: @fangma82 created 52 new blendshapes and contributed them back! (We are now validating the new expressions, so stay tuned.)
7/ There have been so many cool contributors to these libraries that they don’t fit in a tweet. And more is coming soon: @zyrcant and @RyanMcMahan.
Some cool things still to be done: 1) Real-time morphing of faces and body vertices, to change body shapes, adapt faces into lookalikes, and add more diversity among the avatars by swapping heads around. 2) More mocap remappings. 📢 People: just go do it!
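On item 1: if two avatars share mesh topology (same vertex count and order, an assumption worth verifying per pair), morphing reduces to per-vertex linear interpolation. A minimal Python sketch:

```python
def morph_vertices(src, dst, t):
    """Linearly interpolate between two vertex lists with identical topology.
    t=0 gives the source shape, t=1 the target; values in between morph."""
    assert len(src) == len(dst), "meshes must share topology to morph"
    return [
        tuple(a + t * (b - a) for a, b in zip(vs, vd))
        for vs, vd in zip(src, dst)
    ]
```

Animating t from 0 to 1 each frame would give the real-time body-shape transition; face lookalikes would need the interpolation restricted to the head vertices.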
I will probably not be able to contribute as much going forward, but I feel like I am leaving a sailing boat ⛵️ on a good route. @Eyal_Ofek will be your point of contact at @MSFTResearch. It has been a pleasure.