Trained a neural-network-driven RC car, using OpenCV for continuous image processing, to drive a track in autopilot and automatically disengage when a truck is labeled with high confidence.
Used an NVIDIA Jetson Nano for ML processing and an OAK-D stereo camera with LiDAR sensors for image capture.
Wrote ROS2 publisher and subscriber nodes for inter-process messaging; used the Donkey Car simulator to train reinforcement-learning models.
Built OpenCV from source; used the YOLO algorithm with the COCO dataset to train our deep neural network model.
Trained the model on a GPU supercluster with thousands of images taken during track runs; in autopilot mode, mapped real-time environment data to steering angle and velocity while running real-time object detection with DNN models.
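The autopilot-disengage decision described above can be sketched as a simple per-frame check. This is an illustrative Python sketch, not the project's actual code: the `(label, confidence)` detection format and the threshold value are assumptions.

```python
# Hypothetical sketch of the autopilot-disengage check: given one frame's
# detections as (label, confidence) pairs, disengage when a truck is
# labeled above a confidence threshold. Names and threshold are illustrative.

TRUCK_CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff, not the project's real value

def should_disengage(detections, threshold=TRUCK_CONFIDENCE_THRESHOLD):
    """Return True when any detection is a sufficiently confident truck."""
    return any(label == "truck" and conf >= threshold
               for label, conf in detections)
```

In the real pipeline this check would run on each frame's DNN output before the steering command is issued.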
Prototyped a travel-logging app that recommends safe travel destinations based on COVID-19 data; built by a team of four during a 36-hour hackathon, using Figma and data queries to demonstrate a proof of concept.
Developed the back end with the Spring Boot web framework and a PostgreSQL database for storing and processing GPS data.
Implemented a JSON parser for more efficient data collection; initiated a GraphQL prototype for UI map coloring.
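The GPS-data parsing step above might look like the following. This is a minimal Python sketch under assumed field names (`lat`, `lon`); the actual back end was written in Java with Spring Boot.

```python
import json

def parse_gps_records(raw):
    """Parse a JSON array of GPS fixes into (lat, lon) tuples,
    skipping malformed entries. Field names are assumptions."""
    points = []
    for rec in json.loads(raw):
        try:
            points.append((float(rec["lat"]), float(rec["lon"])))
        except (KeyError, TypeError, ValueError):
            continue  # drop records missing or mistyping a coordinate
    return points
```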
Built a microcontroller car, based on reinforcement learning, that follows a black line on a white surface; 3D-printed a chassis designed in SolidWorks CAD, and designed an integrated circuit board with four potentiometers for speed and PID control.
Used C++ and Arduino libraries to feed photoresistor sensor readings (brightness values) into a proportional-integral-derivative (PID) error system that independently adjusted the two rear-wheel velocities.
Implemented software testing, debugging, and test-driven development around the Arduino Mega 2560 microcontroller.
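The PID loop described in the bullets above can be sketched as follows. The project ran in C++ on the Arduino; this Python version only illustrates the control math, and the gains and differential-drive mapping are assumptions.

```python
def pid_step(error, prev_error, integral, kp, ki, kd, dt):
    """One PID update on the line-position error; returns
    (correction, new_integral). Gains kp/ki/kd are illustrative."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    correction = kp * error + ki * integral + kd * derivative
    return correction, integral

def wheel_speeds(base_speed, correction):
    """Steer by slowing one rear wheel and speeding up the other."""
    return base_speed - correction, base_speed + correction
```

Each loop iteration would read the photoresistors, compute the brightness error, run `pid_step`, and write the two wheel speeds to the motor driver.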
Developed a recycling image-capture system: prototyped a circuit board and Arduino logic to drive ultrasonic, photoresistor, and passive-infrared sensors; captured images of targets and assigned recyclability values.
Utilized a static JSON buffer to store sensor values and an HTTP client object to send motion data to a target IP.
Developed a Node.js server to display sensor values in real time, received from an ESP8266 Wi-Fi transceiver.
Built by a two-person team during SD Hacks 2019, a 48-hour hackathon; the project won the Northrop Grumman category award, "Best IoT Device to Incorporate Multiple Nodes."
Developed a navigation system that included pathfinding, virtual arrows, a user interface, and a 3D mini-map in C# and Unity, aiming to assist astronauts aboard the International Space Station in completing technical objectives.
Programmed with the Microsoft HoloLens framework; used a Leap Motion controller for optical hand tracking.
Our ten-person team was the only community-college team in the nation to have its proposal accepted by NASA; we were selected to demo a functioning prototype at the Johnson Space Center in Houston, TX.
Built for the design challenge program, NASA Spacesuit User Interface Technologies for Students (SUITS).
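The pathfinding component of the navigation system can be illustrated with a breadth-first search over a grid. The actual project was written in C# for Unity; this Python sketch only shows the idea, with an assumed 2D occupancy grid (0 = free, 1 = blocked).

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS shortest path over a 2D grid of 0 (free) / 1 (blocked) cells.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set + backpointers in one dict
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct path by walking backpointers
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

In the HoloLens app, the resulting path would drive the placement of the virtual arrows and the route drawn on the 3D mini-map.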
An interactive, character-based game that can be played entirely in the console. Based on the 1934 Parker Brothers board game.
Developed a bot algorithm for single-player mode, based on a series of if-else conditions that determine advantageous moves.
Implements user-input validation and error checking to prevent illegal player moves; updates and prints the visual state of the board after each move.
Used as a showcase project for an introductory logic-design course; includes a thorough report with flowcharts, pseudocode, and tables outlining the core C++ functions and concepts used.
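A single bot rule of the if-else kind described above might look like the following. This is a hypothetical Python sketch (the game is C++); the reserve amount and the monopoly-completion heuristic are assumptions, not the project's actual rules.

```python
def should_buy(property_price, cash, owned_in_group, group_size):
    """Illustrative if-else bot rule: buy when affordable while keeping
    a cash reserve, prioritizing completion of a color group."""
    RESERVE = 200  # assumed minimum cash to keep on hand
    if cash - property_price < RESERVE:
        return False               # too expensive to buy safely
    if owned_in_group == group_size - 1:
        return True                # completes a monopoly: buy if affordable
    return property_price <= cash // 2  # otherwise buy only cheap properties
```

The full bot would chain many such conditions (trading, building, mortgaging) in priority order.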
Blog
My online journal, where I share interesting stories and tech tips about things
that interest me. I plan to write an entry once a month. Stay tuned!