Dr. Mar Gonzalez-Franco
Oct 1, 2022 · 16 tweets · 13 min read
Today I joined the @ClipeCurie training workshop to present the technical work behind the avatar and avatar-animation libraries we have produced at @MSFTResearch. Some notes. 🧵
1/ I❤️ the @HorizonEU projects that have a special focus on training students through the @MSCActions. CLIPE is creating very consequential tools and future leaders in the space, with folks truly collaborating across the continent: @Inria, @MPICybernetics, @ucl, @UCYOfficial, @la_UPC ++
2/ My work has been growing & growing, from avatars to locomotion to haptics to cognition to interaction to real-time systems. It feels great to give a very focused talk on just 1 TOPIC. For the longest time I have wanted to focus on our open-sourcing effort on avatars at @MSFTResearch.
I always felt: “but we are in the middle of building more tools! I should wait.” A big shout-out to my partner in crime @Eyal_Ofek for this one. So today was the day: at CLIPE I presented “Using Existing libraries to animate avatars”. And now, a summary of the talk:
3/ In 2019 I became a CS archaeologist 🕵️ and found my way to recover, update & revive the old #Rocketbox avatars (thanks to @HavokDave @MarkusWojcik). That work ended in the release of v1 of the #MicrosoftRocketbox avatar library for research & academic use: microsoft.com/en-us/research…
The release was accompanied by a “largish” and seminal paper on avatars: why they are useful & how they can be done. It lined up key players across industry & academia (@melslater, @StanfordVR). With so many authors, the review happened from the inside 😅 frontiersin.org/articles/10.33…
4/ We released 470 animations and added compatibility with @Mixamo animations and the @CarnegieMellon mocap database. We also released a set of animals. Re-released with an open #MITLicense. External contributions started coming, like a batch importer to @UnrealEngine: github.com/microsoft/Micr…
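To make that compatibility concrete, here is a minimal sketch of the retargeting step: rename each bone track of a Mixamo/CMU clip onto the corresponding bone of the Rocketbox rig. The bone names below are illustrative assumptions, not the library’s exact naming.

```python
# Illustrative retargeting step: rename per-bone animation tracks from a
# Mixamo-style rig to a Biped-style rig. Bone names here are assumptions
# for illustration; check the actual skeleton of the avatar you import.
BONE_MAP = {
    "mixamorig:Hips": "Bip01 Pelvis",
    "mixamorig:Spine": "Bip01 Spine",
    "mixamorig:Head": "Bip01 Head",
    "mixamorig:LeftArm": "Bip01 L UpperArm",
    "mixamorig:LeftForeArm": "Bip01 L Forearm",
    # ... one entry per bone of the rig
}

def retarget_clip(clip: dict) -> dict:
    """Rename the bone tracks of `clip` ({bone_name: keyframes}) so they drive
    the target skeleton; bones without a mapping are dropped."""
    return {BONE_MAP[bone]: keys for bone, keys in clip.items() if bone in BONE_MAP}
```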
5/ There was still a gap for easy mocap & real-time animation, so I led the next release: MoveBox. Three projects in one. (1) A capture studio with real-time remapping from Kinect v2 and Azure Kinect, where you can record and replay movements in @unitygames.
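A minimal sketch of the record-and-replay idea behind the capture studio (MoveBox itself is a Unity project; this is a generic Python illustration, and `read_body_frame()` is a hypothetical stand-in for the sensor’s body-tracking call):

```python
import json
import time

def record(read_body_frame, seconds=10.0, path="take01.json"):
    """Record timestamped skeleton frames from the sensor into a JSON take.
    `read_body_frame` is a hypothetical callable returning
    {joint_name: (x, y, z, qx, qy, qz, qw)} for the tracked body."""
    frames, t0 = [], time.time()
    while time.time() - t0 < seconds:
        frames.append({"t": time.time() - t0, "joints": read_body_frame()})
    with open(path, "w") as f:
        json.dump(frames, f)

def replay(apply_to_avatar, path="take01.json"):
    """Play a recorded take back at its original timing, driving the avatar
    through `apply_to_avatar(joints)` (which remaps joint names to bones)."""
    with open(path) as f:
        frames = json.load(f)
    t0 = time.time()
    for frame in frames:
        while time.time() - t0 < frame["t"]:  # wait for this frame's timestamp
            time.sleep(0.001)
        apply_to_avatar(frame["joints"])
```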
(2) A first-person project where you can wear an HMD and embody an avatar, with basic IK for the arms and hand/finger tracking compatible with the @oculus Quest, but also @htcvive and more.
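A minimal sketch of the core step in “basic IK” for the arms: given the upper-arm and forearm lengths and the distance from the shoulder to the tracked hand, the elbow bend follows from the law of cosines. A full solver would also pick the shoulder rotation and an elbow swivel direction; that part is omitted here.

```python
import math

def elbow_angle(upper_len, fore_len, shoulder_to_hand):
    """Interior elbow angle (radians) for a two-bone arm whose hand must reach
    a target `shoulder_to_hand` away from the shoulder; pi = fully straight."""
    # Clamp so out-of-reach targets give a straight arm instead of failing.
    d = max(1e-6, min(shoulder_to_hand, upper_len + fore_len - 1e-6))
    cos_elbow = (upper_len**2 + fore_len**2 - d**2) / (2 * upper_len * fore_len)
    return math.acos(max(-1.0, min(1.0, cos_elbow)))
```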
(3) An #SMPL model parser. This parser was a contribution from @Cornell_VR & @ericwang0701 and can be used to import body-tracking information for N avatars recovered from videos using CV tools like the ones developed by @Michael_J_Black's team at @MPICybernetics.
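For context, a minimal sketch of what such a parser deals with: SMPL describes each body with 72 pose values (24 joints × 3, in axis-angle form) plus 10 shape betas, and driving a skinned avatar means turning those axis-angle vectors into per-joint rotations before remapping joint order to the target skeleton. This is a generic illustration, not MoveBox’s actual code.

```python
import numpy as np
from scipy.spatial.transform import Rotation

SMPL_JOINTS = 24  # standard SMPL layout: 24 joints x 3 axis-angle values

def pose_to_rotations(pose_72: np.ndarray) -> np.ndarray:
    """Convert a (72,) SMPL axis-angle pose vector into (24, 3, 3) rotation
    matrices, ready to be remapped onto the target avatar's joints."""
    axis_angles = pose_72.reshape(SMPL_JOINTS, 3)
    return Rotation.from_rotvec(axis_angles).as_matrix()

# A zero pose is the identity rotation for every joint (the rest pose).
assert pose_to_rotations(np.zeros(72)).shape == (SMPL_JOINTS, 3, 3)
```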
6/ We then focused on facial animation. @volontematias helped create a tool that allows anyone to author facial expressions inside Maya & easily export FBX with blendshapes for all the avatars in the library. We released v2 of #MicrosoftRocketbox, now with over 100 blendshapes.
Compatible with FACS (for OpenFace), visemes (for SALSA and Oculus Lipsync), and Vive facial expressions. Plus 17 additional emotions for easy code-driven triggering of happy/angry etc. faces. Demos of these setups are released with the new HeadBox: github.com/openVRlab/Head…
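A minimal sketch of what code-driven triggering looks like on a blendshape rig: an emotion is just a preset of blendshape weights that gets ramped in over a short time. The blendshape names and preset values below are illustrative assumptions, not the library’s exact naming.

```python
# Illustrative blendshape names and presets; the v2 library's exact naming
# and weights may differ.
EMOTION_PRESETS = {
    "happy": {"mouth_smile_L": 0.8, "mouth_smile_R": 0.8, "cheek_raise": 0.5},
    "angry": {"brow_lower_L": 0.9, "brow_lower_R": 0.9, "jaw_clench": 0.4},
}

def expression_at(emotion: str, t: float, ramp: float = 0.25) -> dict:
    """Blendshape weights `t` seconds after triggering `emotion`,
    linearly ramped in over `ramp` seconds (weights in 0..1)."""
    k = min(1.0, t / ramp)
    return {shape: weight * k for shape, weight in EMOTION_PRESETS[emotion].items()}
```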
The team led by @panxueni quickly took on this new tool and built compatibility with ARKit. @fangma82 created 52 new blendshapes. And they contributed back! (We are now in the process of validating the new expressions, so stay tuned.)
7/ There have been so many cool contributors to these libraries that they don't fit in a tweet. And more is coming soon: @zyrcant and @RyanMcMahan.
Some cool things still to be done: 1) real-time morphing of face and body vertices, to change body shapes, adapt faces into lookalikes, and add more diversity among the avatars by swapping heads around (a minimal sketch below); 2) more mocap remappings. 📢 People: just go do it!
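For item 1), a minimal sketch of the morphing idea, assuming the avatars share topology (same vertex count and order): a new body shape is a per-vertex blend of two rest meshes. A full solution would also blend skinning weights and the skeleton.

```python
import numpy as np

def morph(verts_a: np.ndarray, verts_b: np.ndarray, alpha: float) -> np.ndarray:
    """Linearly blend two (N, 3) rest-pose vertex arrays of avatars that share
    topology; alpha=0 returns avatar A, alpha=1 returns avatar B."""
    assert verts_a.shape == verts_b.shape, "morphing requires identical topology"
    return (1.0 - alpha) * verts_a + alpha * verts_b
```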
I will probably not be able to continue contributing as much going forward. But I feel like I am leaving a sailing boat ⛵️ on a good route. @Eyal_Ofek will be your point of contact at @MSFTResearch. It has been a pleasure.
