
The virtual YouTuber is everywhere and here to stay

VTubers are among the most successful online content producers in the world. The ability to reach millions of subscribers through videos or streams is a potent force, and one that relies on motion capture technology to bring a VTuber's personality to life and create the perfect platform to share views and showcase products.

 


But what is the technology driving all this?

The set-up of a VTuber mostly involves facial recognition, gesture recognition, and animation software, and combining these technologies can be tricky. Well-known companies in the virtual YouTuber space, like Cygames and CAPTUREROID, use this typical VTuber set-up.

Read on to discover the ideal set-up for VTubing.


The perfect virtual YouTuber set-up

 

Take a bow, MVN Animate. This technology is unmatched for ease of use, has robust and reliable hardware, and produces production-quality data. MVN is the ideal tool for professional VTubers.

 

Learn more about MVN Animate
3D avatar

First, you need a VTuber avatar. Simple, right? Not exactly. A full-body avatar needs to act and move naturally, and creating a unique avatar is not easy. It needs to be fully “rigged” before it can move in a natural way. The easiest way to start is to download a model from sites like TurboSquid, Sketchfab, or CGTrader. Epic Games has also released MetaHuman Creator for building high-fidelity digital humans for Unreal Engine.

 

Additional options include apps like Ready Player Me or Wolf3D.

3D animation software

To pull it all together you need 3D animation software such as Unreal, Unity 3D or iClone. It’s really easy to stream Xsens’ motion capture data into all major 3D animation software packages. You can do it either natively or via a plugin. An overview of the Xsens integrations can be found here.
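If you want to experiment outside the official plugins, motion capture streamers like MVN's typically send pose data as UDP datagrams that your own tooling can receive. The sketch below shows the general shape of such a listener in Python; the packet layout (a short ASCII message-type tag followed by a binary payload of segment records) and the port number are simplified assumptions for illustration, not the exact MVN wire format.

```python
import socket
import struct


def parse_header(datagram: bytes):
    """Split a datagram into a 6-byte ASCII message-type tag and its payload.
    NOTE: this layout is a simplified assumption, not the official MVN format."""
    if len(datagram) < 6:
        raise ValueError("datagram too short")
    tag = datagram[:6].decode("ascii")
    return tag, datagram[6:]


def parse_segment(payload: bytes, offset: int = 0):
    """Read one hypothetical segment record: id (uint32) plus x, y, z floats,
    big-endian. Again, an illustrative layout, not the official protocol."""
    seg_id, x, y, z = struct.unpack_from(">Ifff", payload, offset)
    return seg_id, (x, y, z)


def listen(port: int = 9763):
    """Receive datagrams from the mocap streamer (the port is a placeholder)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(4096)
        tag, payload = parse_header(data)
        print(tag, len(payload))
```

In practice the official Unreal, Unity, and Autodesk plugins handle all of this for you; a custom receiver like this only makes sense for bespoke pipelines or debugging.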

The full-body Xsens motion capture system

In this set-up, you see the full-body motion capture system from Xsens, including the “MVN Link” hardware (suit). The live motion capture data can be streamed into Unity using MVN Animate software to give you the best live quality data available.

 

Many VTubers use the MVN Awinda Starter in combination with the MVN Animate Plus software for live streaming.

An iPhone with face recognition software attached to a head mount

There are several resources online for how to get facial data into Unity, iClone or Maya. We have used the Live Link Face app ourselves and had great results.

 

There are also high-end solutions such as Faceware.
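Facial capture apps like Live Link Face deliver a set of named blendshape weights in the 0–1 range each frame, which the animation software maps onto the avatar's morph targets. A minimal sketch of that mapping step, where the `Avatar` class and the rig-side morph names are hypothetical stand-ins (the source names follow ARKit's blendshape naming):

```python
from typing import Dict


class Avatar:
    """Toy stand-in for a rigged 3D avatar with morph targets (hypothetical)."""

    def __init__(self):
        self.morphs: Dict[str, float] = {}

    def set_morph(self, name: str, weight: float) -> None:
        # Clamp to the valid 0..1 range before driving the mesh.
        self.morphs[name] = max(0.0, min(1.0, weight))


def apply_blendshapes(avatar: Avatar, weights: Dict[str, float]) -> None:
    """Map one frame of incoming blendshape weights onto avatar morph targets.
    The name mapping below is illustrative; real rigs define their own."""
    name_map = {
        "jawOpen": "mouth_open",    # ARKit-style name -> rig morph name
        "eyeBlinkLeft": "blink_L",
        "eyeBlinkRight": "blink_R",
    }
    for src, dst in name_map.items():
        if src in weights:
            avatar.set_morph(dst, weights[src])
```

Clamping matters in practice: tracking apps occasionally report weights slightly outside 0–1, which can distort the mesh if passed through unfiltered.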

Close up of woman wearing Xsens gloves
Gloves

Finger-tracking data from the Xsens Metagloves by Manus is integrated into MVN Animate and can be streamed into Unity or Unreal. The same applies to the StretchSense gloves.

 

Xsens motion capture has been a proven technology for many years and has a long track record in live and streaming animations.

CodeMiko set-up with Live Link in Unreal Engine

CodeMiko is a VTuber mostly active on Twitch; go check her out here. She has shared many videos showing her full tech set-up.

 

Also, check out the “How to become a VTuber” event we did with her, at the bottom of this page.

VTuber

Examples of VTuber-style videos

 

Cory Strassburger with an iPhone 12, Unreal Engine, and Xsens set-up

Cory’s new character Blu uses a set-up with Xsens mocap in tandem with an iPhone 12, streamed via Unreal Engine. You can see his set-up at the end of the video.

MVN Animate →

 

One Piece VTuber

One Piece voice actors Mayumi Tanaka and Kappei Yamaguchi wear the MVN motion capture system while performing a live VTuber show as their characters Luffy and Usopp.

MVN Animate →
Creator's corner

Mention @MovellaEnt and #Xsens for a chance to be featured!

 

Live stream: How to become a VTuber

We have a perfect webinar to get you started! During our Mocap Content Creator Conference (MCCC) we talked to CodeMiko about how she became a VTuber. This webinar will give you insights into her journey to becoming a VTuber, including how to utilize Xsens motion capture.

 



Easy integration

Whether it’s live connections or motion capture data files, integration has never been this easy. Xsens full-body motion capture systems integrate directly into your pipeline. And we have all the integrations you need for your applications.

Product bundles

Best bundles for VTubing & Streaming


MVN Link + MVN Animate Pro

Request pricing →
Bundle features
Ideal for high-dynamic movements
High update rate for extra accuracy
On-body recording and sturdy sensor placement for ultimate freedom
Clean data and full magnetic immunity
Up to 4 actors supported (unlimited with on-body recording)
Exporting available: FBX, BVH, C3D, MP4 (movie)
Live streaming into 3rd party software
3rd party plugins available (Unreal Engine, Unity3D, Autodesk + various others)
High fidelity finger tracking
Low latency (20ms)
Local HD reprocessing
Timecode support
3D positional aiding with HTC Vive

MVN Awinda Starter + MVN Animate Plus

Buy now →
Bundle features
Suitable for all types of motions
More affordable - with Xsens quality
Quick setup with easy and fast sensor placement
Clean data and full magnetic immunity
Up to 2 actors supported
Exporting available: MVNX (position/orientation), FBX, BVH
Live streaming into 3rd party software
3rd party plugins available (Unreal Engine, Unity3D, Autodesk + various others)
High fidelity finger tracking