titleScreen.PNG

Group Documentation

Arcane Arena is a real-time, motion-capture-driven combat experience developed in Unreal Engine 5.3.2. The project’s primary objective was to synchronize high-fidelity physical performances with deterministic gameplay mechanics. This document focuses on the group’s technical contributions to the systems architecture, the digital content creation (DCC) pipeline, and the integration of virtual production hardware.

To facilitate the complex requirements of real-time motion capture and modular gameplay, we utilized a suite of industry-standard tools, described in the sections that follow.

Synopsis: The idea of the game was to stage an arena battle in which a Mage and a Mech fight a magic-infused, real-time wand duel. The characters are motion-tracked by the actors/players in the LIM Lab, and each player is provided with a wand prop rigged with reflective tracking points. The game's functionality fluctuated throughout the semester as we sought the best approach for fluidity and responsiveness while reducing stress on the motion-tracking system to optimize performance. We surrounded each player character with a bounding box and attached a projectile (fireball) system to the hand socket of the character model. The idea was that "casting" the motion-tracked wand outside of the bounding box would trigger the projectile to fire in the direction of the wand and hand socket.
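The bounding-box trigger described above can be illustrated with a minimal sketch in plain C++ (not engine code). `Vec3` stands in for UE's `FVector`, and both function names are hypothetical; the real check runs against the tracked wand marker each tick.

```cpp
#include <cassert>
#include <cmath>

// Minimal 3-D vector for the sketch (stands in for UE's FVector).
struct Vec3 { float x, y, z; };

// Axis-aligned bounding box surrounding a player character.
struct Box { Vec3 min, max; };

// True once the tracked wand tip leaves the character's bounding box --
// the moment described above as the "cast" trigger.
bool wandOutsideBox(const Vec3& wand, const Box& box) {
    return wand.x < box.min.x || wand.x > box.max.x ||
           wand.y < box.min.y || wand.y > box.max.y ||
           wand.z < box.min.z || wand.z > box.max.z;
}

// Fire direction: normalized vector from the hand socket toward the wand tip.
Vec3 fireDirection(const Vec3& hand, const Vec3& wand) {
    Vec3 d{wand.x - hand.x, wand.y - hand.y, wand.z - hand.z};
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    return {d.x / len, d.y / len, d.z / len};
}
```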

<aside> 💡

Team members & Roles:

Terrence O’Leary - Lead Programmer

Alexander Le - Lead Technician

Anoushka Chigullapalli - Damage & Health Systems

Jing ‘Mark’ Lin - Rigging & Animations

</aside>


Terrence O’Leary

Motion Capture Integration

Gesture Recognition & Custom Input Systems

C++ Systems Architecture

Character Engineering & Animation Pipeline

Level Design, Blueprints & Systems Integration

Concepting, Ideation & Project Leadership


Alexander Le

Engine-Standardized Rigging Workflow

I worked on a specialized pipeline to bridge the gap between Blender’s Rigify system and the Unreal Engine 5 bone hierarchy. By utilizing the Expy Kit plugin, I translated custom control rigs into a format compatible with the engine's native animation library. This ensured that our character models, the Mage and the Mech, could utilize standardized UE5 animations without requiring manual re-weighting for every unique asset.

IK Rig Retargeting Configuration

To achieve high-fidelity motion, I utilized Unreal Engine’s native IK Rig Retargeting system to bridge incoming motion capture data with our custom skeletal meshes. I configured custom IK Rigs to map non-standard bone chains—specifically those utilizing "DEF-" prefixes from external rigging tools—to the standard engine bone chains. This configuration enabled the seamless retargeting of animations and motion-matching logic, ensuring that real-time performances were accurately reflected on our custom skeletal meshes.
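The bone-chain mapping was authored inside UE5's IK Rig editor, but the core renaming rule is simple enough to sketch in plain C++. The bone names here are illustrative examples of the "DEF-" convention, not an exhaustive list from the project.

```cpp
#include <cassert>
#include <string>

// Maps a Rigify deform-bone name (e.g. "DEF-upper_arm.L") onto the plain
// chain name the IK Rig expects ("upper_arm.L"). Bones without the prefix
// are assumed to already be engine-style and pass through unchanged.
std::string toEngineChainName(const std::string& sourceBone) {
    const std::string prefix = "DEF-";
    if (sourceBone.rfind(prefix, 0) == 0)   // "starts with" check
        return sourceBone.substr(prefix.size());
    return sourceBone;
}
```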

Skeletal Mesh Technical Optimization

During the import phase, I diagnosed and resolved critical vertex deformation and "mesh explosion" issues. These were identified as artifacts of inconsistent coordinate systems and transform data. I went back to Blender and normalized the root bone positioning to the world origin (0,0,0) and standardized transform scales across all armatures to ensure 1:1 parity upon engine import.

Animation-Driven Gameplay Logic

I integrated gameplay triggers directly into the animation pipeline using frame-accurate notifies (AnimNotifies). This allowed me to synchronize the physical spawning of projectiles and Niagara visual effects with specific hand gestures. By anchoring these events to the animation data, I ensured that the mechanical execution of a "spell" always matched the visual action of the character.
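The frame-accurate behaviour of an AnimNotify comes down to detecting when playback crosses the notify's timestamp. A minimal sketch of that crossing test, in plain C++ with hypothetical names:

```cpp
#include <cassert>

// True when a notify placed at `notifyTime` (seconds) fires during the
// frame that advanced playback from `prevTime` to `currTime`. Anchoring
// spawns and Niagara effects to this test keeps them in sync with the
// animation, as described above.
bool notifyFired(float notifyTime, float prevTime, float currTime) {
    return prevTime < notifyTime && notifyTime <= currTime;
}
```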

System Migration & Logical Refactoring

I migrated the damage and health system prototype initially created by Anoushka. The migration required restructuring the health-system variables and components that connected to the damage components (e.g., a collision hit deducts a set number of health points).
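The core deduction rule (a hit removes a fixed number of points, with health floored at zero) can be sketched in plain C++; the function name is illustrative, not the Blueprint node's.

```cpp
#include <cassert>

// On a collision hit, deduct `damage` points from `health`, clamping
// the result at zero so a character can never have negative health.
float applyDamage(float health, float damage) {
    float result = health - damage;
    return result < 0.0f ? 0.0f : result;
}
```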

Decoupled Communication via Interfaces

To prevent rigid class dependencies and minimize technical debt, I expanded on a blueprint interface from Anoushka that served as a communication bridge between gameplay actors (the characters) and the User Interface (the HUD). By using interfaces, the HUD remained actor-agnostic, allowing it to poll data from any actor marked as "damageable" without the need for expensive or brittle casting operations.
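The actor-agnostic polling described above maps naturally onto an abstract interface. This is a plain C++ sketch of the idea, not the project's Blueprint interface; `IDamageable`, `Mage`, and the member names are illustrative.

```cpp
#include <cassert>

// Stand-in for the Blueprint interface: anything "damageable" reports its
// health without exposing its concrete class to the HUD.
struct IDamageable {
    virtual float getHealth() const = 0;
    virtual float getMaxHealth() const = 0;
    virtual ~IDamageable() = default;
};

// The HUD polls through the interface only -- no casting to a concrete
// character class, so new damageable actors need no HUD changes.
float hudHealthFraction(const IDamageable& actor) {
    return actor.getHealth() / actor.getMaxHealth();
}

// One concrete actor; the HUD never needs to know this type exists.
struct Mage : IDamageable {
    float health = 75.0f, maxHealth = 100.0f;
    float getHealth() const override { return health; }
    float getMaxHealth() const override { return maxHealth; }
};
```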

Precision Projectile Firing Pipeline

I developed a projectile casting system designed to resolve the discrepancy between a character’s physical hand position and the player’s intended target. The system utilizes skeletal mesh sockets as physical muzzles while calculating trajectories via camera-space vectors. This ensures that projectiles travel accurately toward the player's world-space reticle, regardless of the character’s current animation pose.
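The socket-muzzle-plus-camera-vector approach can be sketched as follows in plain C++ (names and the fixed `range` are illustrative): project the camera ray out to a target point, then aim the muzzle at that point rather than straight along the camera axis.

```cpp
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };

static V3 sub(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3 norm(V3 v) {
    float l = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / l, v.y / l, v.z / l};
}

// Projectiles spawn at the hand-socket muzzle but fly toward the point the
// camera reticle designates: project the camera ray out to `range`, then
// aim the muzzle at that world-space target.
V3 projectileDirection(V3 muzzle, V3 camPos, V3 camDir, float range) {
    V3 target{camPos.x + camDir.x * range,
              camPos.y + camDir.y * range,
              camPos.z + camDir.z * range};
    return norm(sub(target, muzzle));
}
```

Because the direction is recomputed from the target point, the hand's current animation pose shifts only the spawn location, not the destination.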

Advanced Physics & Collision Filtering

I conducted a project-wide collision audit to resolve issues where projectiles would immediately collide with the player who fired them. By reconfiguring collision channels and instigator-ignore logic, I enabled projectiles to spawn within an actor's bounds without triggering false impacts, while still maintaining high-fidelity physical responses when hitting environmental obstacles or opponents.
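The instigator-ignore rule reduces to a single comparison at hit time. A plain C++ sketch with hypothetical actor IDs (the real logic lives in UE's collision channels and `IgnoreActorWhenMoving`-style settings):

```cpp
#include <cassert>

// A projectile ignores the actor that fired it (its instigator) but still
// registers impacts on opponents and environmental obstacles.
bool shouldRegisterHit(int hitActorId, int instigatorId) {
    return hitActorId != instigatorId;
}
```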

Real-Time Data Stream Calibration

Working alongside Terrence, I assisted with the technical troubleshooting of the Captury and QTM Live Link tracking systems. We encountered significant spatial drift where the digital characters would desynchronize from the physical actor's position. I resolved this by implementing dynamic transform offsets within the character logic, effectively "re-zeroing" the actors within the digital environment to match the physical capture volume.
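The "re-zeroing" amounts to capturing the drift once as a standing offset and adding it to every subsequent tracked position. A minimal sketch in plain C++, with illustrative names:

```cpp
#include <cassert>

struct P3 { float x, y, z; };

// Calibration: measure where the tracked character is versus where it
// should be, and store the difference as a standing offset.
P3 computeOffset(P3 trackedPos, P3 expectedPos) {
    return {expectedPos.x - trackedPos.x,
            expectedPos.y - trackedPos.y,
            expectedPos.z - trackedPos.z};
}

// Applied every frame, the offset keeps the digital character aligned
// with the actor's position in the physical capture volume.
P3 applyOffset(P3 trackedPos, P3 offset) {
    return {trackedPos.x + offset.x,
            trackedPos.y + offset.y,
            trackedPos.z + offset.z};
}
```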

Dynamic HUD Data Binding & Technical Resilience

I developed the "Versus" HUD system, expanding on Anoushka’s HUD, which utilizes dynamic variable references to monitor the health status of the Mage and Mech simultaneously. To ensure technical resilience, I engineered robust error-handling protocols within the UI bindings. This included "Is Valid" checks and safe-math logic to prevent engine crashes in the event of rapid actor initialization or destruction.
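The "Is Valid" and safe-math guards can be sketched in plain C++ as a single binding function; `Fighter` and the function name are illustrative stand-ins for the Blueprint types.

```cpp
#include <cassert>

struct Fighter { float health, maxHealth; };

// Safe binding for the health bar: guards against a missing actor and a
// zero max-health value -- the two states that appear during rapid actor
// initialization or destruction -- then clamps to the bar's [0, 1] range.
float safeHealthPercent(const Fighter* actor) {
    if (actor == nullptr) return 0.0f;           // "Is Valid" check
    if (actor->maxHealth <= 0.0f) return 0.0f;   // avoid divide-by-zero
    float pct = actor->health / actor->maxHealth;
    return pct < 0.0f ? 0.0f : (pct > 1.0f ? 1.0f : pct);
}
```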

Session Management & Race Condition Resolution

I leveraged the Level Blueprint to serve as a centralized session manager. This system identifies specific character instances upon level start and "hands off" those references to the HUD. I diagnosed and fixed critical race conditions where UI elements would attempt to access character data before it had been initialized. By structuring a sequenced initialization flow with managed delays, I ensured total data integrity across the session.
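The sequenced hand-off can be sketched as a tiny state machine in plain C++ (all names hypothetical): the HUD binds only after both character references exist, so it can never read uninitialized data.

```cpp
#include <cassert>

// Toy session manager mirroring the Level Blueprint's sequenced flow:
// binding the HUD is deferred until both characters have registered.
struct Session {
    bool mageReady = false, mechReady = false, hudBound = false;

    // Called on each initialization tick; a premature call is a no-op
    // rather than a crash, which is what resolves the race condition.
    void tryBindHud() {
        if (mageReady && mechReady) hudBound = true;
    }
};
```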


Anoushka Chigullapalli

Health and Damage System

I built the damage system from scratch using Blueprint visual scripting in Unreal Engine 5.

The class is designed to be used as a parent: new characters are added to the game as Child Blueprint Classes of BP_Player. The mesh and animation class assets can then be easily swapped out to incorporate new models. I also included a variable, CastAnimMontage, so the animation montage played upon casting a spell can be swapped out just as easily.
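The parent/child pattern behind BP_Player can be sketched in plain C++: children override only the asset references. The class and asset names below are illustrative, not the project's real identifiers.

```cpp
#include <cassert>
#include <string>

// Parent class in the spirit of BP_Player: children override only the
// assets, so adding a character means subclassing and swapping names.
struct Player {
    virtual std::string meshAsset() const { return "SK_Default"; }
    virtual std::string castMontage() const { return "AM_DefaultCast"; }
    virtual ~Player() = default;
};

// A child "character": different mesh and cast montage, same behaviour.
struct MechPlayer : Player {
    std::string meshAsset() const override { return "SK_Mech"; }
    std::string castMontage() const override { return "AM_MechCast"; }
};
```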

Cooldown System

I implemented the cooldown system as a part of the blueprint character class, BP_Player.
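A cooldown gate of this kind reduces to comparing the current time against the last successful cast. A plain C++ sketch with an illustrative two-second window:

```cpp
#include <cassert>

// Cooldown gate: a cast succeeds only if `duration` seconds have elapsed
// since the last successful cast; otherwise the attempt is rejected.
struct Cooldown {
    float duration;           // seconds between allowed casts
    float lastCast = -1e9f;   // "long ago", so the first cast succeeds

    bool tryCast(float now) {
        if (now - lastCast < duration) return false;  // still cooling down
        lastCast = now;
        return true;
    }
};
```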

Title Screen & UI Elements

I designed the title screen and UI elements such as the health and cooldown bars in Procreate and applied them to our game as Widget Blueprints in Unreal Engine.

This title screen was then put into a Widget Blueprint and made to spawn immediately using the Level Blueprint’s event graph before attaching the camera to the static camera actor and initializing sound and lighting.

I added these as progress bar components in separate Widget Blueprints. I created the design in white so the color could be easily altered in the engine, making future design changes and updates easy if required. The widgets were spawned by overridable functions within the character blueprints, allowing quick per-player changes to screen location and avoiding race conditions (a single shared blueprint would be decoupled from the player spawning events, so a widget could spawn before the players and cause errors).

Jing ‘Mark’ Lin


Gameplay

Vid1_1.mp4

Vid2_1.mp4

Vid3_1.mp4

Mivry53test - Unreal Editor 2026-04-30 00-38-22 (1).mp4

Copy of PXL_20260428_162824894.jpg

Gesture Recognition Demo(3_32-3_40) - Trim_1.mp4