Web3 technology is a path to huge creative potential in music production, a fact not lost on the enigmatically anonymous duo ROHKI. The idea of a decentralized internet appealed to the group, offering the chance to create and distribute music on their own terms, connect with new audiences, and engage with fans on a deeper level.
ROHKI was formed in 2021 to explore the exciting capabilities of this new digital frontier. The independent duo uses Unreal Engine to create animated music and lore videos: content that explains the background, history, and details of their fictional musical worlds. Their aim is to take viewers on long-form narrative journeys that go beyond the traditional music video.
Starting out in Houdini, ROHKI soon found that creating engaging, realistic characters was complex, and rendering times were limiting. This is where Unreal Engine came in, offering real-time rendering, powerful graphics capabilities, and ease of use.
The duo learnt their craft along the way, improving with every new video. But during the development of their first two projects, one challenge was clear: it’s difficult to learn animation as you work.
Behind the scenes: Watch ROHKI transform musical storytelling with Xsens motion capture
Elevating Character Creation
Creating realistic 3D characters proved to be the hardest technical aspect. “We were forced to develop shots that would obscure our lack of experience,” says the enigmatic figure on the other end of Zoom. “So we started downloading pre-made animations and hand-fixing the finer details. It was very limiting.”
As fundamental drivers of their musical narratives, ROHKI’s central characters are key to fan engagement. Finding new ways to animate them would be an important step towards enriching their stories and captivating new audiences.
To create truly immersive experiences, ROHKI needed a tool that would let them worry less about technical hurdles and focus on the music. For their latest project, Stars Won’t Align, the pair used Movella’s Xsens motion capture technology to enhance their workflow and do away with the difficulties of character customization. ROHKI hoped to spend less time editing premade assets and more time creating enriched environments.
Xsens is a motion capture technology that records movement data, used to create more lifelike character animation. Using their phones in conjunction with Xsens Live Link, the duo could track corresponding facial movements in real time. This technology allowed them to fully embody their characters, leading to more authentic and immersive performances. Rather than relying on pre-made assets, they were able to start performing as their characters right away. “We could totally become our characters, which felt really amazing,” ROHKI explains.
“Thanks to the intuitive nature of the Xsens suits, we quickly grasped how we could create more ambitious videos,” they continue. Having previously relied heavily on friends with more animation experience, ROHKI could now create videos with newfound confidence. “We simply put the suit on, open the software, calibrate, record, and transfer the data into Unreal Engine from the get-go.”
Motion Capture Meets AI
With this self-assurance came a deeper level of autonomy over every aspect of their work. Instead of relying on the knowledge of others or obscuring shots that were difficult to animate, ROHKI can now be musicians, art directors, mocap actors, videographers, VFX artists, and more, all with complete creative freedom.
A major part of their exploration into musical storytelling is through the use of AI art programmes such as Warp Fusion, a tool that helps explore different art styles. “Initially we used Midjourney to embellish our content with fresh flair,” comments ROHKI’s figurehead. “But we knew there was a way of incorporating AI on a deeper level.”
Watch ROHKI’s Xsens-powered music video – Stars Won’t Align
ROHKI moved to Warp Fusion to stylize their videos however they wanted. Having a motion capture tool that could support this transition proved transformational to their content. Once the Movella Xsens motion capture data was transferred into Unreal Engine, it could be run through Warp Fusion.
Passionate about using the most cutting-edge technology, ROHKI plans to use Movella’s Xsens technology to start creating live shows that allow viewers to experience their music and animations in real time. The combination of Xsens and Unreal Engine has the potential to create performances that go beyond traditional music videos, and ROHKI will continue to push those boundaries.