
Group Documentation
Arcane Arena is a real-time, motion-capture-driven combat experience developed in Unreal Engine 5.3.2. The project’s primary objective was to synchronize high-fidelity physical performances with deterministic gameplay mechanics. This document focuses on the group’s technical contributions to the systems architecture, the digital content creation (DCC) pipeline, and the integration of virtual production hardware.
To meet the complex requirements of real-time motion capture and modular gameplay, we used a suite of industry-standard tools, including Unreal Engine 5.3.2, Blender (with the Rigify and Expy Kit add-ons), the Captury and Qualisys Track Manager (QTM) tracking systems, and Procreate for UI artwork.
Synopsis: The game stages a magic-infused, real-time wand fight between a Mage and a Mech inside an arena. Each character is driven by a motion-tracked actor/player in the LIM Lab, and each player holds a wand prop rigged with reflective tracking markers. The design evolved throughout the semester as we searched for the best balance of fluidity and functionality while reducing the load on the motion-tracking system. We settled on a bounding box surrounding each player character, with a projectile (fireball) system attached to the hand socket of the character model: when the motion-tracked wand is “cast” outside the bounding box, the projectile fires in the direction of the wand and hand socket.
<aside> 💡
Team members & Roles:
Terrence O’Leary - Lead Programmer
Alexander Le - Lead Technician
Anoushka Chigullapalli - Damage & Health Systems
Jing ‘Mark’ Lin - Rigging & Animations
</aside>
Terrence
Alexander
I worked on a specialized pipeline to bridge the gap between Blender’s Rigify system and the Unreal Engine 5 bone hierarchy. By utilizing the Expy Kit plugin, I translated custom control rigs into a format compatible with the engine's native animation library. This ensured that our character models, the Mage and the Mech, could utilize standardized UE5 animations without requiring manual re-weighting for every unique asset.
To achieve high-fidelity motion, I utilized Unreal Engine’s native IK Rig Retargeting system to bridge incoming motion capture data with our custom skeletal meshes. I configured custom IK Rigs to map non-standard bone chains—specifically those utilizing "DEF-" prefixes from external rigging tools—to the standard engine bone chains. This configuration enabled the seamless retargeting of animations and motion-matching logic, ensuring that real-time performances were accurately reflected on our custom skeletal meshes.
During the import phase, I diagnosed and resolved critical vertex deformation and "mesh explosion" issues. These were identified as artifacts of inconsistent coordinate systems and transform data. I went back to Blender and normalized the root bone positioning to the world origin (0,0,0) and standardized transform scales across all armatures to ensure 1:1 parity upon engine import.
I integrated gameplay triggers directly into the animation pipeline using frame-accurate notifies (AnimNotifies). This allowed me to synchronize the physical spawning of projectiles and Niagara visual effects with specific hand gestures. By anchoring these events to the animation data, I ensured that the mechanical execution of a "spell" always matched the visual action of the character.
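Our notifies are authored as Blueprint AnimNotifies; as a rough C++ illustration of the same hook, a custom notify subclass overrides UAnimNotify::Notify, which the animation system calls on the exact frame the marker sits on. The class name and the log placeholder below are hypothetical, not the project's actual assets.

```cpp
// AnimNotify_CastSpell.h -- hypothetical C++ equivalent of our Blueprint notify.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimNotifies/AnimNotify.h"
#include "AnimNotify_CastSpell.generated.h"

UCLASS()
class UAnimNotify_CastSpell : public UAnimNotify
{
    GENERATED_BODY()

public:
    // The animation system calls this on the exact frame the notify sits on,
    // so projectile spawns and Niagara effects stay locked to the gesture.
    virtual void Notify(USkeletalMeshComponent* MeshComp,
                        UAnimSequenceBase* Animation,
                        const FAnimNotifyEventReference& EventReference) override
    {
        Super::Notify(MeshComp, Animation, EventReference);

        if (AActor* Owner = MeshComp ? MeshComp->GetOwner() : nullptr)
        {
            // In the project, this is where the owning character is told to
            // spawn the projectile and fire the Niagara effect.
            UE_LOG(LogTemp, Verbose, TEXT("Cast notify fired on %s"), *Owner->GetName());
        }
    }
};
```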
I migrated the damage and health system prototype initially created by Anoushka. The migration required restructuring the health-system variables and components that connect to the damage components (e.g., a collision hit deducts x points of health).
To prevent rigid class dependencies and minimize technical debt, I expanded on a blueprint interface from Anoushka that served as a communication bridge between gameplay actors (the characters) and the User Interface (the HUD). By using interfaces, the HUD remained actor-agnostic, allowing it to poll data from any actor marked as "damageable" without the need for expensive or brittle casting operations.
I developed a projectile casting system designed to resolve the discrepancy between a character’s physical hand position and the player’s intended target. The system utilizes skeletal mesh sockets as physical muzzles while calculating trajectories via camera-space vectors. This ensures that projectiles travel accurately toward the player's world-space reticle, regardless of the character’s current animation pose.
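The shipped system is built in Blueprints; the sketch below is a hypothetical C++ rendering of the same muzzle-versus-aim split. ABP_PlayerBase stands in for our Blueprint character class, and ProjectileClass and the socket name are assumptions rather than the project's actual identifiers.

```cpp
// Hypothetical sketch: spawn at the hand socket, but aim along the camera ray.
void ABP_PlayerBase::FireProjectile()
{
    // Physical muzzle: the skeletal mesh socket on the casting hand.
    const FVector MuzzleLocation =
        GetMesh()->GetSocketLocation(TEXT("hand_r_socket"));  // assumed name

    // Intended target: trace from the camera through the center reticle.
    FVector CamLoc = FVector::ZeroVector;
    FRotator CamRot = FRotator::ZeroRotator;
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        PC->GetPlayerViewPoint(CamLoc, CamRot);
    }
    const FVector TraceEnd = CamLoc + CamRot.Vector() * 10000.f;

    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);
    const bool bHit = GetWorld()->LineTraceSingleByChannel(
        Hit, CamLoc, TraceEnd, ECC_Visibility, Params);

    // Launch from the hand, but toward the reticle's world-space point, so the
    // shot lands where the player is aiming regardless of animation pose.
    const FVector AimPoint = bHit ? Hit.ImpactPoint : TraceEnd;
    const FRotator LaunchRot = (AimPoint - MuzzleLocation).Rotation();

    FActorSpawnParameters SpawnParams;
    SpawnParams.Instigator = this;  // used later for self-collision ignoring
    GetWorld()->SpawnActor<AActor>(ProjectileClass, MuzzleLocation, LaunchRot, SpawnParams);
}
```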
I conducted a project-wide collision audit to resolve issues where projectiles would immediately collide with the player who fired them. By reconfiguring collision channels and instigator-ignore logic, I enabled projectiles to spawn within an actor's bounds without triggering false impacts, while still maintaining high-fidelity physical responses when hitting environmental obstacles or opponents.
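Most of the audit was editor-side channel configuration, but the instigator-ignore half reduces to a single call at spawn time. A hedged sketch, with AFireballProjectile and CollisionComp as illustrative names for our Blueprint projectile and its sphere collision root:

```cpp
// Hypothetical sketch: let the projectile spawn inside its shooter's bounds
// without registering a false impact.
void AFireballProjectile::BeginPlay()
{
    Super::BeginPlay();

    // The shooter was recorded as Instigator in the spawn parameters, so the
    // moving collision component can skip that one actor while still reacting
    // to walls, floors, and the opposing character.
    if (APawn* Shooter = GetInstigator())
    {
        CollisionComp->IgnoreActorWhenMoving(Shooter, true);
    }
}
```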
Working alongside Terrence, I assisted with the technical troubleshooting of the Captury and QTM Live Link tracking systems. We encountered significant spatial drift where the digital characters would desynchronize from the physical actor's position. I resolved this by implementing dynamic transform offsets within the character logic, effectively "re-zeroing" the actors within the digital environment to match the physical capture volume.
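Conceptually, the fix is a one-shot corrective offset; a minimal sketch, assuming a Recenter call triggered whenever drift is detected (names are illustrative):

```cpp
// Hypothetical sketch: snap a drifted mocap character back over the capture
// volume's origin by applying a corrective world-space offset.
void AMocapCharacter::Recenter(const FVector& CaptureVolumeOrigin)
{
    const FVector Drift = GetActorLocation() - CaptureVolumeOrigin;
    AddActorWorldOffset(-Drift);  // digital actor re-zeroed to the physical one
}
```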
I developed the “Versus” HUD system, expanding on Anoushka’s HUD; it uses dynamic variable references to monitor the health status of the Mage and Mech simultaneously. To ensure technical resilience, I engineered robust error handling within the UI bindings, including “Is Valid” checks and safe-math logic to prevent engine crashes in the event of rapid actor initialization or destruction.
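As an illustration of those guards, one of the health-bar bindings might read as follows in C++. The widget class and the TrackedMage weak reference are hypothetical; the Damageable interface functions are the ones described in Anoushka’s section below.

```cpp
// Hypothetical sketch of a defensive HUD binding.
// TrackedMage is assumed to be a TWeakObjectPtr<AActor> member of the widget.
float UVersusHUDWidget::GetMageHealthPercent() const
{
    // "Is Valid" guard: the bound actor may not have spawned yet, or may have
    // been destroyed mid-round.
    if (!TrackedMage.IsValid())
    {
        return 0.f;
    }

    const float MaxHealth = IDamageable::Execute_GetMaxHealth(TrackedMage.Get());
    const float Current   = IDamageable::Execute_GetCurrentHealth(TrackedMage.Get());

    // Safe-math guard: never divide by zero during rapid init or teardown.
    return (MaxHealth > 0.f) ? (Current / MaxHealth) : 0.f;
}
```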
I leveraged the Level Blueprint to serve as a centralized session manager. This system identifies specific character instances upon level start and "hands off" those references to the HUD. I diagnosed and fixed critical race conditions where UI elements would attempt to access character data before it had been initialized. By structuring a sequenced initialization flow with managed delays, I ensured the HUD never reads uninitialized character data during a session.
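Rendered as C++, the session-manager flow is essentially a guarded retry loop (AArenaLevelScript and the 0.1-second retry interval are assumptions, and ABP_PlayerBase again stands in for our Blueprint character class):

```cpp
// Hypothetical sketch of the sequenced initialization flow.
void AArenaLevelScript::BeginPlay()
{
    Super::BeginPlay();
    TryInitializeHUD();
}

void AArenaLevelScript::TryInitializeHUD()
{
    TArray<AActor*> Characters;
    UGameplayStatics::GetAllActorsOfClass(
        GetWorld(), ABP_PlayerBase::StaticClass(), Characters);

    // Race-condition guard: if the characters have not finished spawning,
    // wait briefly and try again instead of handing the HUD null references.
    if (Characters.Num() < 2)
    {
        FTimerHandle RetryHandle;
        GetWorldTimerManager().SetTimer(
            RetryHandle, this, &AArenaLevelScript::TryInitializeHUD, 0.1f, false);
        return;
    }

    // Both characters exist: hand the validated references off to the HUD.
    // (Widget creation and binding omitted in this sketch.)
}
```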
Anoushka
I built the damage system from scratch using the Blueprint class types available in Unreal Engine 5.
Extensible Damage Blueprint: I used two enumerations, DamageResponse and DamageType, to represent the different kinds of damage and the responses to them. The intent was to make it easy to extend to more damage types (e.g., spells like “Fireball” would be of type “Fire”) and to appropriate responses to different kinds of damage (e.g., big hits cause a greater knockback and smaller hits a lesser one).
Standardized Damage Data Structure: I created a structure, S_DamageInfo, to store the damage type, damage response, and health reduction for each instance of damage.
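For reference, a hedged C++ mirror of the enumerations and structure described above; the asset names and fields come from the project, while the specific enum entries shown are illustrative.

```cpp
// Hypothetical C++ equivalents of the Blueprint enums and struct.
UENUM(BlueprintType)
enum class EDamageType : uint8
{
    Fire,      // e.g. the "Fireball" spell
    Physical
};

UENUM(BlueprintType)
enum class EDamageResponse : uint8
{
    SmallHit,  // lesser knockback
    BigHit     // greater knockback
};

USTRUCT(BlueprintType)
struct FS_DamageInfo   // S_DamageInfo in the Blueprint asset
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    EDamageType DamageType = EDamageType::Physical;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    EDamageResponse DamageResponse = EDamageResponse::SmallHit;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float HealthReduction = 0.f;  // amount deducted per instance of damage
};
```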
Standardized Damage API: I made an interface, Damageable, to be applied to each actor/player that can take damage, with simple functions such as GetCurrentHealth, GetMaxHealth, and TakeDamage.
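The project’s version is a Blueprint Interface; in C++ form it would look roughly like this:

```cpp
// Hypothetical C++ rendering of the Damageable Blueprint Interface.
UINTERFACE(BlueprintType)
class UDamageable : public UInterface
{
    GENERATED_BODY()
};

class IDamageable
{
    GENERATED_BODY()

public:
    UFUNCTION(BlueprintCallable, BlueprintNativeEvent)
    float GetCurrentHealth();

    UFUNCTION(BlueprintCallable, BlueprintNativeEvent)
    float GetMaxHealth();

    UFUNCTION(BlueprintCallable, BlueprintNativeEvent)
    void TakeDamage(const FS_DamageInfo& DamageInfo);
};
```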
Modular Damage Actor Component: I used a Blueprint Actor Component class, BPC_DamageSystem, to hold the health values and the invincibility and death status flags as variables. It also contains the TakeDamage function, which holds the logic for the owning actor to take damage (if not invincible) and die (if health falls to 0). The component exposes event dispatchers for damage and death events that the owning player can bind to.
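A condensed sketch of the component’s shape in C++; the variables and dispatchers mirror the description above, while the defaults are assumptions.

```cpp
// Hypothetical C++ mirror of BPC_DamageSystem.
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnDeath);
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnDamaged, FS_DamageInfo, DamageInfo);

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UBPC_DamageSystem : public UActorComponent
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite) float MaxHealth = 100.f;
    UPROPERTY(BlueprintReadOnly) float CurrentHealth = 100.f;
    UPROPERTY(BlueprintReadWrite) bool bInvincible = false;
    UPROPERTY(BlueprintReadOnly) bool bIsDead = false;

    // Event dispatchers the owning player binds to.
    UPROPERTY(BlueprintAssignable) FOnDamaged OnDamaged;
    UPROPERTY(BlueprintAssignable) FOnDeath OnDeath;

    UFUNCTION(BlueprintCallable)
    void TakeDamage(const FS_DamageInfo& DamageInfo)
    {
        if (bInvincible || bIsDead) return;  // no damage while invincible/dead

        CurrentHealth -= DamageInfo.HealthReduction;
        OnDamaged.Broadcast(DamageInfo);

        if (CurrentHealth <= 0.f)  // death when health falls to 0
        {
            bIsDead = true;
            OnDeath.Broadcast();
        }
    }
};
```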

Base Character Template and Inheritance: I used a Blueprint Character class, BP_Player, to create the base template for all characters in the game. This is intended to make adding new characters to the game as easy as possible with simple overrides and variables (more on this later). The class implements the Damageable interface and builds the logic for what to do when the player takes damage or dies.

The class is designed to be used as a parent: new characters are added as Child Blueprint Classes of BP_Player, and the mesh and animation class assets can then be swapped out to incorporate new models. I also included a variable, CastAnimMontage, so the animation montage played upon casting a spell can be swapped out just as easily.
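In C++ terms, the parent/child pattern might be sketched as follows (ABP_PlayerBase is an illustrative stand-in for the BP_Player Blueprint):

```cpp
// Hypothetical C++ sketch of the BP_Player parent pattern.
UCLASS()
class ABP_PlayerBase : public ACharacter, public IDamageable
{
    GENERATED_BODY()

public:
    ABP_PlayerBase()
    {
        DamageSystem = CreateDefaultSubobject<UBPC_DamageSystem>(TEXT("DamageSystem"));
    }

    // Child classes (Mage, Mech) swap this to change the casting montage.
    UPROPERTY(EditDefaultsOnly, BlueprintReadWrite)
    UAnimMontage* CastAnimMontage = nullptr;

    // Damageable is forwarded to the component, so every child that swaps in
    // a new mesh and animation class inherits identical damage behavior.
    virtual float GetCurrentHealth_Implementation() override { return DamageSystem->CurrentHealth; }
    virtual float GetMaxHealth_Implementation() override { return DamageSystem->MaxHealth; }
    virtual void TakeDamage_Implementation(const FS_DamageInfo& DamageInfo) override
    {
        DamageSystem->TakeDamage(DamageInfo);
    }

protected:
    UPROPERTY(VisibleAnywhere)
    UBPC_DamageSystem* DamageSystem = nullptr;
};
```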
I implemented the cooldown system as a part of the blueprint character class, BP_Player.
Accessible Variables: The character class holds variables for the current cooldown timer, the maximum cooldown (the number of seconds between casts), and a flag tracking whether the character can cast at a given moment.

Engine-Time Based Cooldown Increment: The blueprint uses Unreal Engine’s built-in Event Tick to increment the current cooldown variable while the player’s cast is exhausted, and allows them to cast again once the current cooldown reaches the set maximum cooldown.
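Translated to C++, the tick logic reads roughly as follows, assuming the cooldown variables above are float/bool members of the character class:

```cpp
// Hypothetical sketch of the Event Tick cooldown increment.
void ABP_PlayerBase::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Only accumulate while the player's cast is exhausted.
    if (!bCanCast)
    {
        CurrentCooldown += DeltaSeconds;  // engine-time based increment
        if (CurrentCooldown >= MaxCooldown)
        {
            CurrentCooldown = 0.f;
            bCanCast = true;  // the player may cast again
        }
    }
}
```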

I designed the title screen and UI elements such as the health and cooldown bars in Procreate and applied them to our game as Widget Blueprints in Unreal Engine.
The title screen was then placed in a Widget Blueprint and set to spawn immediately from the Level Blueprint’s event graph, before the camera attaches to the static camera actor and sound and lighting are initialized.

I added these as progress bar components in separate Widget Blueprints. I created the designs in white so the color can be easily altered in-engine, making future design changes and updates easy if required. The widgets are spawned by overridable functions within the character blueprints, which allows quick per-player changes to screen location and avoids race conditions (if a common blueprint were used instead, widget creation would be decoupled from player spawning, and the widget could be spawned before the players exist, causing errors).
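A hedged sketch of that overridable spawn hook (the function and property names are assumptions; HealthBarWidgetClass would be a TSubclassOf<UUserWidget> property on the character):

```cpp
// Hypothetical sketch: each character spawns its own status widgets, so
// widget creation is tied to player spawning rather than a shared blueprint.
void ABP_PlayerBase::SpawnStatusWidgets()
{
    if (UUserWidget* Bar =
            CreateWidget<UUserWidget>(GetWorld(), HealthBarWidgetClass))
    {
        Bar->AddToViewport();
        // Child classes override this function to position their bars
        // per-player (e.g. Mage on the left, Mech on the right).
    }
}
```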
Mark
Mivry53test - Unreal Editor 2026-04-30 00-38-22 (1).mp4
