👋 I'm Victor, an engineer focused on XR, AI, and front-end systems. I build full-stack, real-time computer vision applications for VR (Meta Quest) and wearable AR (Snap Spectacles, Mentra Live), as well as interactive web apps. My work spans deploying ML models on-device, designing spatial interfaces and responsive UIs, and integrating end-to-end pipelines.
- 🏆 Wins: OpenAI Winner @ TreeHacks 2025 (2/500) · Roboflow + ElevenLabs Winner @ Mentra Live Hackathon (1/100)
- 🌱 Currently exploring: PyTorch, TensorFlow, and modern React (Next.js, Tailwind)
- 🤝 What I care about: Bridging AI/ML and interaction design to make emerging technologies usable and intuitive.
- 💼 Experience:
  - Co-developed Virtual Becomes Reality, a Stanford narrative VR experience exploring perception and presence.
  - Led front-end development at rézme, building fast, interactive web apps that automate Fair Chance Hiring across jurisdictions.
  - Integrated real-time systems across APIs and hardware: built multimodal prototypes using OpenAI, Roboflow, ElevenLabs, and Perplexity, connected to devices such as Mentra Live (via the MentraOS SDK) and Adafruit Feather boards, using Flask, Bun, and LoRa.
  - Developed a CH₄ emissions visualizer at Lawrence Berkeley National Lab, rendering LSTM-powered global methane forecasts on an interactive, rotatable 3D globe with custom web dashboards for NetCDF datasets.
  - Engineered on-device CV apps with Stony Brook University: built YOLOv5-based Android apps for real-time waste classification and deployed them to the Google Play Store.
Exploring human-centered approaches to AI, spatial computing, and web applications. Always learning from others building in these spaces. 📬 [email protected] · LinkedIn · Website
