Meteo Vis
Meteo Vis is an interactive 3D globe visualisation showcasing NASA's meteorite landing data, allowing users to explore over 15,000 meteorite impacts across Earth's surface.
Built with Three.js and Vite, the application creates a dynamic 3D Earth globe with meteorite points positioned at their exact geographic coordinates. The visualisation efficiently renders thousands of meteorites, with size representing mass and colour indicating landing year.
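Placing points at geographic coordinates on a sphere comes down to a latitude/longitude-to-Cartesian conversion. A minimal sketch of that conversion, using the common Three.js convention (y axis up) rather than the project's actual code:

```javascript
// Convert latitude/longitude (in degrees) to a 3D point on a sphere
// of the given radius, with the y axis pointing through the north pole.
function latLonToVector3(lat, lon, radius) {
  const phi = (90 - lat) * Math.PI / 180;    // polar angle from the north pole
  const theta = (lon + 180) * Math.PI / 180; // azimuth around the y axis
  return {
    x: -radius * Math.sin(phi) * Math.cos(theta),
    y: radius * Math.cos(phi),
    z: radius * Math.sin(phi) * Math.sin(theta),
  };
}
```

With this mapping, latitude 90 lands exactly on top of the globe and the equator traces the sphere's widest circle, so each meteorite point sits at its real-world landing site.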
The interface features an interactive timeline slider that filters meteorites by year (1490-2013), as well as search functionality to find specific meteorites by name. The data comes from NASA's Data Portal via The Meteoritical Society's Meteoritical Bulletin Database.
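The timeline and search features are both simple filters over the dataset. A sketch of the idea, assuming a hypothetical record shape (`name`, `year`) rather than the exact fields of the NASA dataset:

```javascript
// Keep only meteorites whose landing year falls inside the slider range.
function filterByYear(meteorites, minYear, maxYear) {
  return meteorites.filter(m => m.year >= minYear && m.year <= maxYear);
}

// Case-insensitive substring match on the meteorite name.
function searchByName(meteorites, query) {
  const q = query.toLowerCase();
  return meteorites.filter(m => m.name.toLowerCase().includes(q));
}
```

Running both filters against the full dataset on each slider or keystroke event is cheap enough at this scale that no indexing is needed.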
HeadRunner
HeadRunner is an innovative take on the infinite runner genre, inspired by Subway Surfers and powered by an AI-driven, hands-free control system.
Built with C++ and OpenGL, the game uses a Python and MediaPipe backend to perform real-time facial landmark detection, tracking the player's head movements. This data is sent via a socket to the game, where my custom logic translates it into smooth lane changes.
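The lane-change logic itself lives in the C++ game; the sketch below restates the idea in JavaScript with illustrative thresholds, not the game's actual values. A normalised head x-position from the tracker is smoothed and then mapped to one of three lanes, with a dead zone around the centre so small head movements don't trigger accidental lane changes:

```javascript
// Exponential smoothing to damp jitter in the raw tracking signal.
function smooth(prev, next, alpha = 0.3) {
  return prev + alpha * (next - prev);
}

// Map a normalised head x-position (0..1) to a lane index,
// keeping a dead zone around the centre (thresholds are illustrative).
function laneFromHeadX(x, deadZone = 0.1) {
  if (x < 0.5 - deadZone) return 0; // left lane
  if (x > 0.5 + deadZone) return 2; // right lane
  return 1;                         // centre lane
}
```

Smoothing before thresholding is what makes the transitions feel deliberate rather than twitchy.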
The game features a procedurally generated track, 3D models, lighting, sound effects via OpenAL, and a HUD overlay built with ImGui.
Below is a demo showing how the game can be controlled by head movement, hand movement, or the keyboard.
A fairly hungry caterpillar
A fairly hungry caterpillar is my take on the retro Snake game, powered by AI.
Built with p5.js, the game uses ml5.js's facial landmark detection to track head movement in real time. I programmed custom logic to interpret these movements and steer the caterpillar - no keyboard necessary.
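One simple way to turn facial landmarks into a steering command is to compare the nose tip's x-position with the face's horizontal midpoint. This is a hedged sketch of that approach; the function name and pixel threshold are illustrative, not taken from the game:

```javascript
// Decide a turn direction from how far the nose tip sits from
// the face's horizontal centre (all positions in pixels).
// The threshold acts as a dead zone so a level head means "straight".
function steerFromNose(noseX, faceCenterX, threshold = 15) {
  const dx = noseX - faceCenterX;
  if (dx < -threshold) return 'left';
  if (dx > threshold) return 'right';
  return 'straight';
}
```

Because the comparison is relative to the face centre rather than the frame, the control still works if the player shifts in front of the camera.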
As the caterpillar eats apples, it grows longer and the player’s score increases. The game was tested at two speeds, though fair warning: the faster mode may result in mild neck strain.
IoT Twin House
A smart home system built in Unity, designed to mirror and control a real IoT-enabled house using MQTT communication. I was responsible for integrating the Unity-MQTT connection, enabling real-time two-way communication between the digital and physical environments.
Users can walk through the virtual house and control elements like lights and the bathroom fan. Sound design was spatially mapped - for example, the fan fades out as you leave the room, and audio doesn’t bleed through walls. The front porch light uses simulated motion detection with a 5-second timeout, just like the real-world setup.
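The porch light's behaviour is essentially a restartable countdown: each motion event pushes the switch-off time 5 seconds into the future. The sketch below illustrates that logic in JavaScript; it is not the project's Unity (C#) code, only a model of the described behaviour:

```javascript
// Porch-light timer: each motion event (re)starts a 5-second countdown;
// the light is on until the countdown expires.
class PorchLight {
  constructor(timeoutMs = 5000) {
    this.timeoutMs = timeoutMs;
    this.offAt = -Infinity; // light starts off
  }
  motionDetected(nowMs) {
    this.offAt = nowMs + this.timeoutMs; // restart the countdown
  }
  isOn(nowMs) {
    return nowMs < this.offAt;
  }
}
```

Storing an absolute switch-off time instead of a running timer keeps the logic stateless between frames, which maps cleanly onto a per-frame update loop.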
The project was intended for VR and tested using the XR Toolkit. However, when deploying to the Meta Quest headset, the application consistently crashed at launch. Due to time constraints, we weren't able to fully debug and resolve these issues, so the final build remained desktop-based.