A camera-based motion tracking game built in Unity using C# and AR Foundation. Players control a paddle through real-time body movement, combining physical motion with on-screen gameplay for an immersive AR experience.
Built with Unity, AR Foundation, and C#, the game applies simple real-time computer vision to the device camera feed to detect player movement, and the paddle responds to the user's body position. This experiment explored how camera-based motion input can create engaging, accessible gameplay for all ages.
Used Unity’s AR Foundation to access the camera feed and real-time frame data for basic motion analysis without external libraries.
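A minimal sketch of what that frame access can look like, assuming AR Foundation 4+ (where `ARCameraManager.TryAcquireLatestCpuImage` exposes the camera image on the CPU); the class name and serialized field are illustrative, not the project's exact code:

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Reads the latest camera frame on the CPU each time AR Foundation delivers one.
public class CameraFrameReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // assigned in the Inspector

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Synchronous CPU access to the most recent camera image.
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image) // XRCpuImage wraps a native resource and must be disposed
        {
            // For the common YUV camera formats, plane 0 is the luminance (Y)
            // channel: one byte per pixel, which is all frame differencing needs.
            XRCpuImage.Plane plane = image.GetPlane(0);
            NativeArray<byte> luminance = plane.data;

            // Hand the grayscale pixels to the motion-analysis step (see the
            // frame-differencing sketch below). Note that rows are
            // plane.rowStride bytes apart, which may exceed image.width.
        }
    }
}
```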
Implemented frame differencing in C# to detect movement intensity on left/right zones and translate it into paddle movement.
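The differencing step itself is plain C#. A hypothetical sketch of the left/right zone comparison (buffer names, the noise threshold, and the half-width split are assumptions):

```csharp
// Compares two grayscale frames and reports where the motion is.
// Returns a value in [-1, 1]: -1 = all motion on the left, +1 = all on the right.
public static class MotionZones
{
    public static float Compare(byte[] previous, byte[] current,
                                int width, int height, int threshold = 25)
    {
        long left = 0, right = 0;
        for (int row = 0; row < height; row++)
        {
            int rowStart = row * width;
            for (int x = 0; x < width; x++)
            {
                int i = rowStart + x;
                int diff = current[i] - previous[i];
                if (diff < 0) diff = -diff;          // absolute per-pixel change
                if (diff < threshold) continue;      // ignore sensor noise
                if (x < width / 2) left += diff;
                else right += diff;
            }
        }
        long total = left + right;
        return total == 0 ? 0f : (right - left) / (float)total;
    }
}
```

The signed result can then drive the paddle directly, e.g. `transform.position += Vector3.right * signal * paddleSpeed * Time.deltaTime;` inside `Update()`; depending on camera orientation the sign may need flipping, since front cameras are typically mirrored.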
Spawned cubes at random positions; colliding with them increases the score and triggers particle-effect feedback. Includes difficulty progression and score-tracking logic.
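A rough sketch of how the spawning, scoring, and difficulty pieces could fit together (the prefab, tag, and tuning values are all illustrative):

```csharp
using UnityEngine;

// Spawns falling cubes at random x positions, speeding up over time.
public class CubeSpawner : MonoBehaviour
{
    [SerializeField] GameObject cubePrefab;
    [SerializeField] Vector2 xRange = new Vector2(-2f, 2f);
    [SerializeField] float spawnInterval = 2f;   // shrinks as difficulty ramps
    float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < spawnInterval) return;
        timer = 0f;

        var pos = new Vector3(Random.Range(xRange.x, xRange.y), 5f, 0f);
        Instantiate(cubePrefab, pos, Quaternion.identity);

        // Difficulty progression: each spawn comes slightly sooner, down to a floor.
        spawnInterval = Mathf.Max(0.5f, spawnInterval * 0.97f);
    }
}

// On the paddle: catching a cube scores a point and fires a particle burst.
// Assumes the paddle has a trigger collider and the cubes carry Rigidbodies.
public class PaddleScoring : MonoBehaviour
{
    [SerializeField] ParticleSystem hitEffect;
    public int Score { get; private set; }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Cube")) return;
        Score++;
        hitEffect.transform.position = other.transform.position;
        hitEffect.Play();
        Destroy(other.gameObject);
    }
}
```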
This project expanded my knowledge of integrating real-world camera input into game mechanics. It strengthened my skills in Unity scripting, motion analysis, and AR interaction design. I learned to balance performance with real-time tracking accuracy, and to design gameplay that rewards physical engagement. The combination of AR and computer vision gave me a deeper sense of how interactive technologies are shaping the foundation of the Metaverse — blending the physical and digital worlds into a single experience.
Demonstration of camera-tracked paddle control and real-time gameplay interaction.