
16th September 2022


Voltaku was founded in 2020 by actor and entrepreneur Charles Borland and visual effects technical director Sally Slade. The company exists at the confluence of movies, games, anime and the metaverse, and is aimed at hardcore anime fans known as ‘otaku’.

This transmedia company is busy with a series of projects, including a real-time puppeteering iPhone app called Vodcasto, a universal descriptor for avatar creation and sharing, and a TV show based on the Killtopia graphic novel by Dave Cook and Craig Paton. Voltaku has already shown off proof-of-concept footage, captured with an Xsens MVN Link suit and fed into Unreal Engine, paired with Chaos Vantage for a real-time ray tracing pipeline.

What was the inspiration behind the formation of Voltaku?

Charles: When I was an actor in a video game called Grand Theft Auto IV, I would spend a lot of time in a mocap suit, and I'd been on a lot of TV and film shoots and saw just how inefficient the Hollywood production process is. I remember thinking, holy cow, when this technology and the economics get to a certain point, all of this gaming technology and real-time technology is going to revolutionize filmmaking and how you make content.

I really wanted to push the envelope in terms of Hollywood, and I thought there’s a way to create a new type of studio. If you build one from the ground up around these real-time technologies and virtual production, you can open up an audience that has traditionally been underserved in Hollywood, which is the otaku market – there are no otaku content creation companies in Hollywood.

And you don’t have to stop at just creating a movie or TV series: we’re merging virtual production and interactive experience pipelines into one. So you can create these digital 3D assets once and then deploy them across platforms, across ecosystems.

Sally: I really got into real-time around 2019, 2020 when I realized all the things I'd been doing for Magnopus, for The Lion King, for real-time virtual production could be applied to anime. So I went down this rabbit hole of writing my own animation software to help with anime-style stuff. And then Charles scooped me up, and now I'm making tools for Voltaku as we try to get this pipeline going that will help us with the creation of this otaku-centric TV series.

[Image: Stiletto, ray-traced render]

Tell us about your system for creating and sharing avatars across platforms.

Sally: LayerCake is our avatar creation software. It was written in PyQt (Python) so that its UI can easily plug into existing DCC (Digital Content Creation) pipelines. Currently, it supports the creation of static 2D avatars through layering of PNGs. It has tools for composition, such as forbidding or enforcing arbitrary combinations of assets, as well as tools for collection review. I can take these assets and tint them and shuffle them, and this kind of self-assembly can be done with metadata files, so that you can have attributes that are shared and mapped across projects. These metadata files can make the avatar builder auto-populate its fields and then spit out an approximation, and we're calling that metadata a universal avatar descriptor (UAD) file. It’s meant to bridge the gap between avatar systems across arbitrary platforms and projects, and is built to scale from the littlest indie avatar system to something AAA. So rather than having thousands of mappings [between platforms], we’re going to define one central UAD master file, and then you can map between that and your project. We think this idea of style-transfer, high-level interoperability is more fun, with users seeing themselves through the lens of each different project.
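
To make this concrete, here's a minimal sketch, in Python since LayerCake is PyQt-based, of what a UAD-style descriptor and a per-project mapping could look like. The field names, the `uad` dict and the `map_to_project` helper are hypothetical illustrations, not Voltaku's actual format:

```python
import json

# Hypothetical UAD-style descriptor: high-level, project-agnostic traits
# that each platform interprets through its own asset library.
uad = {
    "uad_version": "0.1",  # assumed versioning field
    "attributes": {
        "hair": {"style": "short"},
        "eyes": {"shape": "round"},
        "outfit": {"category": "streetwear"},
    },
}

# One mapping from the central UAD master to this project's PNG layers,
# instead of pairwise mappings between every platform.
PROJECT_ASSETS = {
    ("hair", "short"): "hair_short_v03.png",
    ("eyes", "round"): "eyes_round_v01.png",
    ("outfit", "streetwear"): "jacket_street_v02.png",
}

def map_to_project(descriptor: dict) -> list[str]:
    """Resolve UAD attributes to this project's layered PNG assets."""
    layers = []
    for slot, props in descriptor["attributes"].items():
        for value in props.values():
            asset = PROJECT_ASSETS.get((slot, value))
            if asset:
                # A real implementation would also apply tints and
                # check forbidden/enforced combinations here.
                layers.append(asset)
                break
    return layers

print(json.dumps(uad, indent=2))
print(map_to_project(uad))  # project-specific layers, ready to composite
```

With a mapping like this, each project maintains a single translation to the UAD master rather than one mapping per platform pair, and compositing the resolved layers is then a straightforward alpha-over stack of the PNGs.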

[Image: Voltaku avatars]

Why is it important to you that metaverse platforms are decentralized?

Sally: There are so many angles, but from a technical perspective, I feel that in order to invest in interoperability, stakeholders and project owners need to know that the assets they're building around aren't going anywhere. Of course, that’s if by decentralized you mean you’re applying blockchain. What's great about that is that it’s immutable. And it's public. So I know that if I build around a project, even if it tanks, my pipeline will stay, because the things that I've been referencing and looking at are going to stay online in this decentralized file hosting system, which is great.

What prompted you to start using motion capture?

Charles: One of the first tasks that we set for ourselves as a studio was, if we're going to build this in a game engine, like Unreal Engine, then we obviously have to do things like set up a camera inside of Unreal. We knew that we were going to have an actress, and we were going to try and do this in real-time. But one of the things that we were looking at was real-time ray tracing, and to push the envelope on that. And so, okay, if we're building this new type of studio from the ground up, let's actually do it from the ground up: how far can you get if all you have is a backyard, a Lenovo ThinkPad, an Xsens suit and an iPhone? Literally, that's what we did.
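
For a flavor of what setting up a camera inside of Unreal involves, here is a minimal Editor-scripting sketch. It assumes the Python Editor Script Plugin is enabled, and the actor label and lens value are placeholders rather than Voltaku's actual setup:

```python
import unreal

# Spawn a CineCameraActor in the open level (Editor scripting).
location = unreal.Vector(0.0, 0.0, 150.0)  # placeholder position, in cm
rotation = unreal.Rotator(0.0, 0.0, 0.0)
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, location, rotation
)
camera.set_actor_label("VirtualCamera_01")  # hypothetical name

# The CineCameraComponent carries the filmback and lens settings.
cine = camera.get_editor_property("camera_component")
cine.set_editor_property("current_focal_length", 35.0)  # placeholder lens
```

From there, the camera can be animated or driven live, which is where the virtual production side of the pipeline takes over.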

[Image: still from Voltaku’s proof-of-concept footage]

We couldn’t go into the studio and do full camera tracking, so we thought ‘let's find something that’s inertia-based, and obviously Xsens is a leader in that… Let’s do it in the backyard and let’s just see how far we can push the envelope and what problems we run up against’. So that’s where we started from, and we were able to identify a bunch of problems, issues with ray tracing and some other things. But using the Xsens suit, capturing the raw mocap data, was great.
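
For context, capturing raw mocap data from an Xsens suit typically means reading MVN's network stream. The sketch below is only an illustration: the UDP port and header fields shown are assumptions, so check the MVN real-time network streaming documentation for the authoritative layout before building on this:

```python
import socket
import struct

PORT = 9763  # assumed default MVN network-streamer port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

while True:
    data, addr = sock.recvfrom(4096)
    # Datagrams are assumed to begin with a 6-byte ASCII ID, e.g. b"MXTP02"
    # for quaternion pose data, followed by a big-endian sample counter.
    msg_id = data[:6].decode("ascii", errors="replace")
    if msg_id.startswith("MXTP"):
        (sample_counter,) = struct.unpack(">I", data[6:10])
        print(f"{msg_id} from {addr[0]}: sample {sample_counter}")
```

A setup like this is enough to confirm that pose data is arriving before wiring it into a retargeting or rendering step.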

Do you need an Xsens motion capture setup for your next project?

Not every mocap project has the same requirements, so not every mocap solution is the same. To make sure you choose the best Xsens mocap setup for your needs, head over to our blog post.

Choosing the best Xsens mocap setup
