I'm a big fan of robotics, and to that end I've worked on lots of personal projects, both big and small. Most of my recent work has been in microrobots and swarm robotics, but I've also worked on projects across the field, from robotic grasping to autonomous optical testing stations. I also enjoy branching out into other parts of the tech stack, and I've done projects in areas like computer vision, machine learning, and server management.

I've listed some of my favorite personal projects below:

The Creator

Four friends and I programmed a robot to inspect a structure built out of MEGA BLOKS and autonomously replicate it. We used computer vision to generate a digital model of the structure, which we then sent to a Baxter or Sawyer robotic arm to pick and place blocks in their respective positions.
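As a rough illustration of the vision-to-model step (the block size, grid origin, and quantization logic here are illustrative assumptions, not the actual project code), block centroids detected in the camera image can be quantized into a discrete grid model for the pick-and-place planner:

```python
# Hypothetical sketch: map detected block centroids (in pixels) into a
# discrete grid model that a pick-and-place planner could consume.
BLOCK_PX = 40          # assumed apparent block width in pixels
ORIGIN_PX = (20, 20)   # assumed pixel location of the structure's corner

def centroids_to_grid(centroids):
    """Quantize pixel centroids to (row, col) grid cells."""
    cells = set()
    for (x, y) in centroids:
        col = int(round((x - ORIGIN_PX[0]) / BLOCK_PX))
        row = int(round((y - ORIGIN_PX[1]) / BLOCK_PX))
        cells.add((row, col))
    return sorted(cells)

# Example: three detected blocks sitting in a single row
print(centroids_to_grid([(20, 20), (60, 21), (101, 19)]))
# → [(0, 0), (0, 1), (0, 2)]
```

Snapping to a grid this way makes the model robust to small centroid errors, since any detection within half a block width lands in the correct cell.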

I designed this system as my final project for EECS C106A (Introduction to Robotics) at UC Berkeley, which generously supplied the Baxter and Sawyer robotic arms for the project.

See the full project page here!

The Creator setup

Demonstration of the Creator building a pyramid

A composite graphic of the computer vision setup and processing


Predator Avoidance with TurtleBots

Interested in applications for bio-mimetic swarm robotics, I worked in a group of three to simulate predator-avoidance strategies using TurtleBots. We devised a realistic model of the prey's line of sight and roaming strategies, and extended the ROS Rapidly-exploring Random Tree (RRT) library to incorporate success metrics for both the predator and the prey. We then demonstrated the success of our algorithms in both simulation and controlled real-world trials.
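The core idea of treating sight regions as obstacles can be sketched as follows (the function name, field of view, and sight range are illustrative assumptions, not our actual parameters): a sampled point is rejected by the planner whenever it falls inside a predator's line-of-sight cone.

```python
import math

# Hedged sketch: a point is "unsafe" if it lies within a predator's
# line-of-sight cone, the same idea the extended RRT uses to treat
# sight regions as obstacles. FOV and range values are placeholders.
def in_line_of_sight(point, predator_pos, predator_heading,
                     fov_deg=90.0, sight_range=3.0):
    dx = point[0] - predator_pos[0]
    dy = point[1] - predator_pos[1]
    if math.hypot(dx, dy) > sight_range:
        return False  # too far away to be seen
    angle_to_point = math.atan2(dy, dx)
    # Smallest signed angular difference, wrapped into [-pi, pi]
    diff = abs((angle_to_point - predator_heading + math.pi)
               % (2 * math.pi) - math.pi)
    return diff <= math.radians(fov_deg) / 2

# A point directly ahead of the predator is unsafe; one behind is safe.
print(in_line_of_sight((1.0, 0.0), (0.0, 0.0), 0.0))   # True
print(in_line_of_sight((-1.0, 0.0), (0.0, 0.0), 0.0))  # False
```

In a costmap formulation, the same check runs per cell to mark the visible cone as lethal, so any planner built on the map inherits the avoidance behavior.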

I designed this system as my final project for EECS C106B (Robotic Manipulation and Interaction) at UC Berkeley, which generously supplied the TurtleBots for the project.

See the full project page here, or read the research paper on the project here!

This is the 2D costmap generated during RRT exploration and used for path planning. Predators and the areas within their line of sight are always treated as hard obstacles.
This is a simulation of the prey's roaming strategy given a fixed predator location. As shown in the prey's mental "map" of its environment, it avoided the predator's region as soon as it noticed the predator through the open door.

Demo using a real robot, where AR tags are used to simulate predators. The AR tag's orientation indicates the predator's line of sight.

Demo in simulation. As shown, the prey avoids the area with the predator (indicated by the red dot as shown above) and just roams around the lower half of the map.

Adding a Dandelion-Inspired Airfoil to a MEMS Ionocraft

Continuing my aforementioned passion for bio-mimetic robotic structures, for my first MEMS project I designed a dandelion-inspired airfoil as an add-on to the ionocraft designed by Drew et al. (featured here). We modified the airfoil design by Cummins et al. (featured here) to match the ionocraft's dimensions and demonstrated, through COMSOL Multiphysics simulations, the viability of such an airfoil for improving flight time and stability.

I designed this system as my final project for EE 147 (Introduction to MEMS) at UC Berkeley.

See the research paper on the project here!

This is the mask layout we designed for fabricating the airfoil, together with the ionocraft. This layout was designed for a standard two-mask SOI process, with the red representing the SOI layer and the blue representing the TRENCH layer.

The above figure shows COMSOL Multiphysics simulations of the airflow pattern at various Reynolds numbers. The images are inverted from their normal orientation, with air flowing from top to bottom. The structure consists of a porous disk representing the airfoil and a generic box-like structure representing the thruster. From left to right, the air flows correspond to Re = 100, 200, 300, and 400, respectively. An air pocket can be observed at the lower Re values but not at the higher ones, indicating that the separated vortex ring (SVR) loses its stability above Re = 300. Above Re = 500, the airflow becomes unstable, producing convergence errors in COMSOL.
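For reference, the Reynolds number that parameterizes these flow regimes is Re = ρvL/μ. A minimal numeric sketch (the speed and length scale below are placeholder values for illustration, not the figures from the paper; the air properties are standard textbook values):

```python
# Re = rho * v * L / mu, the dimensionless ratio of inertial to
# viscous forces that sets which flow regime the simulation is in.
def reynolds(rho, v, length, mu):
    return rho * v * length / mu

# Air at ~20 C: rho ≈ 1.204 kg/m^3, mu ≈ 1.81e-5 Pa·s (standard values).
# Illustrative scales: v = 0.5 m/s, L = 1 cm.
Re = reynolds(rho=1.204, v=0.5, length=0.01, mu=1.81e-5)
print(round(Re))  # → 333
```

At these centimeter scales and sub-meter-per-second speeds, Re lands in the low hundreds, which is exactly the band the figure sweeps across.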


Voice-Activated Trashcan Organizer

To promote responsible disposal and improve the quality of dorm life, I built an interactive trashcan organizer in line with UC Berkeley's goal of Zero Waste by 2020. It featured separate bins for trash, compost, and recycling that could be selected via voice activation. Its enclosed design also kept away flies and minimized odors from old trash and compost, improving the hygiene of college life.
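The bin-selection logic can be sketched as a simple keyword match over the transcribed speech (the categories are from the project, but the keyword sets and function names here are illustrative assumptions, not the actual firmware):

```python
# Hedged sketch: once speech is transcribed to text, a keyword match
# decides which bin to open. Keyword sets are illustrative placeholders.
BIN_KEYWORDS = {
    "compost": {"compost", "food", "organic"},
    "recycling": {"recycle", "recycling", "bottle", "can", "paper"},
    "trash": {"trash", "garbage", "landfill"},
}

def choose_bin(transcript):
    words = set(transcript.lower().split())
    for bin_name, keywords in BIN_KEYWORDS.items():
        if words & keywords:  # any keyword present in the utterance
            return bin_name
    return "trash"  # default bin when nothing matches

print(choose_bin("please open the recycling bin"))  # → recycling
```

A fixed keyword list keeps the system responsive offline; a fallback default bin means an unrecognized phrase still opens something rather than leaving the user stuck.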

Front view, featuring our laser-cut logo

Top view, demonstrating our microphone and trash segmentation

Learning Machines

This is an optical character recognition platform built specifically for medical equipment. Given an input image of an equipment package, it can parse out the lot number and expiration date, two of the key identification components necessary to log and use medical equipment in standard practice. For use in industrial settings, it's designed with high noise tolerance and built to recognize a wide variety of notations used to mark the lot number and expiration date on medical labels.
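The extraction step after OCR can be illustrated with patterns like these (the regexes below cover a couple of common label notations for the sake of example; they are assumptions, not the variety the real system handles):

```python
import re

# Hedged sketch: after OCR produces raw text, regexes pull out the lot
# number and expiration date. These two notations are illustrative only.
LOT_RE = re.compile(r'(?:LOT|Lot\s*No\.?)[:\s#]*([A-Z0-9\-]+)')
EXP_RE = re.compile(r'(?:EXP|Exp\.?|Use\s*By)[:\s]*(\d{4}-\d{2}(?:-\d{2})?)')

def parse_label(text):
    """Return (lot_number, expiration_date), with None for a missed field."""
    lot = LOT_RE.search(text)
    exp = EXP_RE.search(text)
    return (lot.group(1) if lot else None,
            exp.group(1) if exp else None)

print(parse_label("STERILE  LOT: A12-34B  EXP 2019-06-30"))
# → ('A12-34B', '2019-06-30')
```

In practice the hard part is upstream: noisy OCR output means each pattern needs tolerant separators and alternates, which is why supporting many label notations dominates the design.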

I designed this system alongside the team at Medinas Health at the MASH Startup Hackathon hosted by UC Launch in March 2018.

You can find the source code for this project on GitHub. It is licensed under the Apache License, Version 2.0, with copyrights owned by Medinas Health.

A demonstration of the computer vision process and output