The perfect virtual YouTuber set-up
Take a bow, MVN Animate. Unmatched ease of use, robust and reliable hardware, and production-quality data make MVN the ideal tool for professional VTubers.
Learn more about MVN Animate
First, you need a VTuber avatar. Simple, right? Not exactly. A full-body avatar needs to act and move naturally, and creating a unique one is not easy: the model must be fully “rigged” before it can move in a natural way. The easiest way to start is to download a model from sites like TurboSquid, Sketchfab or CGTrader. Epic Games has also released MetaHuman Creator for building high-fidelity digital humans.
To pull it all together you need 3D animation software such as Unreal Engine, Unity or iClone. Xsens motion capture data streams easily into all major 3D animation packages, either natively or via a plugin. An overview of the Xsens integrations can be found here.
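Under the hood, MVN Animate’s network streamer broadcasts pose data as UDP datagrams that the receiving engine or plugin parses each frame. As a rough illustration of the receiving side, here is a minimal Python sketch. The port number (9763 is MVN’s documented default) and the simplified header layout (a 6-byte ASCII ID string such as `MXTP02` followed by a 32-bit big-endian sample counter) are assumptions for illustration only; consult the MVN real-time streaming protocol reference for the authoritative field layout.

```python
import socket
import struct

# Assumed default port for MVN Animate's network streamer; match
# whatever host/port you configure in the MVN streaming settings.
MVN_PORT = 9763

def parse_header(datagram: bytes):
    """Parse the start of an MVN-style network-stream datagram.

    Simplified, assumed layout: a 6-byte ASCII ID string
    (e.g. b"MXTP02") followed by a 32-bit big-endian sample
    counter. See the MVN streaming protocol docs for the real,
    full header definition.
    """
    if len(datagram) < 10 or not datagram.startswith(b"MXTP"):
        return None
    id_string = datagram[:6].decode("ascii")
    (sample_counter,) = struct.unpack_from(">I", datagram, 6)
    return {"id": id_string, "sample": sample_counter}

def listen(port: int = MVN_PORT):
    # Bind a UDP socket and hand each incoming datagram to the parser.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        while True:
            data, _addr = sock.recvfrom(4096)
            header = parse_header(data)
            if header:
                print(f"{header['id']}: sample {header['sample']}")
```

In practice you would never write this yourself for Unity or Unreal: the official Xsens plugins handle the receiving and retargeting for you. The sketch only shows why streaming into “all major packages” is straightforward — it is plain UDP on a configurable port.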
In this set-up, you see the full-body motion capture system from Xsens, including the “MVN Link” hardware (suit). Using MVN Animate, the live motion capture data can be streamed into Unity with the best live data quality available.
Many VTubers use the MVN Awinda Starter in combination with the MVN Animate Plus software for live streaming.
Xsens motion capture has been proven over many years and has a long track record in live and streamed animation.
Examples of VTuber-style videos
Cory Strassburger with an iPhone 12, Unreal Engine, and Xsens mocap
Cory’s new character Blu uses Xsens mocap in tandem with an iPhone 12, streamed via Unreal Engine. You can see his set-up at the end of the video.
One Piece VTuber
One Piece voice actors Mayumi Tanaka and Kappei Yamaguchi wear the MVN motion capture system to perform a live VTuber show as their characters Luffy and Usopp.