Self Help AI - iOS Mobile App

React Native · LLMs · iOS · API Design · UI/UX · Mobile Dev · TypeScript · JavaScript

I developed the full software stack for Self Help AI, an iOS mobile app that integrates user data, function calling, and personalization settings to create a chatbot experience for self-help. The end product comprises ~30k lines of code across 24 files and has amassed over 200 App Store downloads since launch.
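
The core loop behind this kind of chatbot is: fold the user's personalization settings into the system prompt, let the model optionally call app functions, then return either the tool result or the plain reply. The sketch below shows that general pattern in Python for brevity (the app itself is written in TypeScript), assuming an OpenAI-style chat completions API; the tool name, personalization fields, and persistence helper are all hypothetical, not the app's actual code.

```python
# Illustrative function-calling loop for a personalized self-help chatbot.
# Assumes an OpenAI-style chat completions API; all names are hypothetical.
import json
from openai import OpenAI

client = OpenAI()

# Hypothetical tool the model may call to save a journal entry for the user.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "log_journal_entry",
        "description": "Save a short self-reflection entry for the current user.",
        "parameters": {
            "type": "object",
            "properties": {"text": {"type": "string"}},
            "required": ["text"],
        },
    },
}]

def save_entry(text: str) -> None:
    # Stand-in for the app's persistence layer.
    print(f"[saved journal entry] {text}")

def chat_turn(user_message: str, profile: dict, history: list[dict]) -> str:
    # Personalization settings are folded into the system prompt.
    system = (
        "You are a supportive self-help assistant. "
        f"Tone: {profile.get('tone', 'warm')}. Goals: {profile.get('goals', [])}."
    )
    messages = [{"role": "system", "content": system}, *history,
                {"role": "user", "content": user_message}]

    response = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=TOOLS, tool_choice="auto"
    )
    message = response.choices[0].message

    # If the model requested a tool call, execute it and confirm to the user.
    if message.tool_calls:
        call = message.tool_calls[0]
        if call.function.name == "log_journal_entry":
            entry = json.loads(call.function.arguments)
            save_entry(entry["text"])
            return "Saved that reflection to your journal."
    return message.content
```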

Integrating LLMs into an Original 3D Unity Game

Unity · C# · LLMs · API Design · 3D Graphics · Game Dev · Python · RAG

I led a group project to create an original Unity-based video game that incorporates Large Language Models for dynamic NPC interactions and combat narratives. We managed game state and character behaviors through C# scripts and Unity's component system, while a custom API handled LLM prompting, context management, and response processing (a sketch of that middleware pattern follows the feature list below).

Technical Features:
  • Unity game engine with C# scripting for core mechanics
  • Custom API middleware for LLM integration and prompt management
  • 7 NPCs with context-aware dialogue systems
  • 21 unique combat abilities with particle system integration
  • 37 total animations, 3 combat encounters, and 3 original world maps
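
The middleware idea is simple: the Unity client sends the player's line over HTTP, and a small service keeps per-NPC context, builds the prompt, and returns the processed reply. The sketch below shows that pattern in Python under assumed details: the Flask routes, the rolling context store, and the NPC persona format are illustrative, and the actual LLM call is stubbed out.

```python
# Illustrative LLM middleware a Unity client could call over HTTP.
# Route names, context store, and personas are assumptions, not the real API.
from collections import defaultdict, deque
from flask import Flask, jsonify, request

app = Flask(__name__)

# Keep a short rolling dialogue history per NPC so replies stay context-aware.
HISTORY: dict[str, deque] = defaultdict(lambda: deque(maxlen=10))

NPC_PERSONAS = {
    "blacksmith": "Gruff but helpful; knows about weapon upgrades.",
    "healer": "Calm and cryptic; hints at the next combat encounter.",
}

def generate_reply(persona: str, history: list[str], player_line: str) -> str:
    # Stand-in for the actual LLM call: build a prompt from the persona,
    # recent history, and the player's line, then send it to your provider.
    prompt = f"{persona}\n" + "\n".join(history) + f"\nPlayer: {player_line}\nNPC:"
    return f"(reply generated from a {len(prompt)}-character prompt)"

@app.post("/npc/<npc_id>/dialogue")
def npc_dialogue(npc_id: str):
    player_line = request.get_json()["text"]
    persona = NPC_PERSONAS.get(npc_id, "A neutral villager.")
    history = HISTORY[npc_id]

    reply = generate_reply(persona, list(history), player_line)

    # Record both sides of the exchange for future turns.
    history.append(f"Player: {player_line}")
    history.append(f"NPC: {reply}")
    return jsonify({"npc": npc_id, "reply": reply})

if __name__ == "__main__":
    app.run(port=5000)
```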

Multimodal Voice-Controlled Robot

Python · Raspberry Pi · Computer Vision · OpenCV · Robotics · GPIO · PID Control · Linux

I developed a multimodal robot built on a Raspberry Pi that can interpret voice commands, control its movements, analyze its surroundings, and respond with emotionally expressive speech. This involved integrating three separate AI models (Speech-to-Text, a multimodal LLM, and Text-to-Speech) with the Raspberry Pi, a motor controller board, motors, a camera, a microphone, and a speaker.
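
The command loop ties those pieces together: transcribe the spoken command, pass it (plus a camera frame) to the multimodal LLM for a structured action and a spoken reply, drive the motors, and speak the response. The sketch below shows that loop with the three model calls stubbed out; the gpiozero pin numbers and helper names are assumptions for illustration, not the project's actual wiring or code.

```python
# Illustrative voice-command loop for a Raspberry Pi robot. The three AI model
# calls are stubs; the real project wires them to Speech-to-Text, a multimodal
# LLM, and Text-to-Speech. Motor pin numbers are assumed, not the real wiring.
from gpiozero import Robot

robot = Robot(left=(4, 14), right=(17, 18))  # hypothetical motor driver pins

def transcribe_speech() -> str:
    # Stand-in for the Speech-to-Text model: capture mic audio, return text.
    return "drive forward and tell me what you see"

def ask_multimodal_llm(command: str, camera_frame: bytes | None) -> dict:
    # Stand-in for the multimodal LLM: combine the command with a camera frame
    # and return a structured action plus an emotionally expressive reply.
    return {"action": "forward", "speech": "Moving ahead! I can see a doorway."}

def speak(text: str) -> None:
    # Stand-in for the Text-to-Speech model driving the speaker.
    print(f"[robot says] {text}")

ACTIONS = {
    "forward": robot.forward,
    "backward": robot.backward,
    "left": robot.left,
    "right": robot.right,
    "stop": robot.stop,
}

def handle_turn() -> None:
    command = transcribe_speech()
    plan = ask_multimodal_llm(command, camera_frame=None)
    ACTIONS.get(plan["action"], robot.stop)()  # default to stopping if unclear
    speak(plan["speech"])

if __name__ == "__main__":
    handle_turn()
```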