I work on the Infrastructure and Integration team, helping to build an end-to-end integration ecosystem that allows manual and automated data ingestion and export at scale. I also worked in a small team on the infrastructure for a greenfield project that lets customers stream events to third-party destinations, contributing to the Amplitude CDP product launch.
Hatchways is an early-stage YC startup that qualifies and matches software engineers with jobs. I work across the frontend, backend, and automation. I also review candidates' submitted projects and conduct mock interviews during their interview preparation.
Worked in a team of 6 on a cross-platform mobile app for the Smart Activity Sensor, developed in C# with Xamarin. I personally worked on the custom UI, BLE and Wi-Fi connectivity, and a caching module.
Worked on two projects. The first was an AI-education kit built with Python and Keras: in a team of 2, I designed and implemented its deep learning projects. For the second, I individually designed and built an AI car that chases subjects. See it in action below.
As a personal project, I developed this web app, which lets a user select courses from their school that they need help with and courses they can tutor. The user is then matched with other users who have complementary needs. Features include authentication, email verification, password reset, user courses, user matching, and result filtering.
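A minimal sketch of the matching idea, assuming each user stores a set of courses they need help with and a set they can tutor (the app's actual schema and ranking may differ):

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    needs: set = field(default_factory=set)   # courses they want help with
    offers: set = field(default_factory=set)  # courses they can tutor

def match_users(me, others):
    """Return users whose offers overlap my needs, or whose needs overlap my offers."""
    matches = []
    for other in others:
        they_can_teach_me = me.needs & other.offers
        i_can_teach_them = other.needs & me.offers
        if they_can_teach_me or i_can_teach_them:
            matches.append((other, they_can_teach_me, i_can_teach_them))
    # Rank mutual matches (both directions satisfied) first
    return sorted(matches, key=lambda m: -(bool(m[1]) and bool(m[2])))

# Example
alice = User("Alice", needs={"CS 136"}, offers={"MATH 137"})
bob = User("Bob", needs={"MATH 137"}, offers={"CS 136"})
print(match_users(alice, [bob]))
```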
In a team of 4 and in 24 hours, we created Vibecheck, which improves Spotify's recommendations by tailoring them to the user's current mood. I trained an ML model to classify the mood of a song from its musical characteristics, and designed an algorithm that scores the semantic and syntactic similarity between a user's journal entry and song lyrics.
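A simplified sketch of the two pieces, assuming Spotify-style audio features (valence, energy, tempo) as model inputs and using TF-IDF cosine similarity as a stand-in for the actual semantic/syntactic scoring:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Mood classifier over audio features (valence, energy, tempo) -- toy training data
X_train = np.array([[0.9, 0.8, 140], [0.2, 0.3, 70], [0.7, 0.4, 100]])
y_train = ["happy", "sad", "calm"]
mood_model = RandomForestClassifier().fit(X_train, y_train)

def song_mood(features):
    return mood_model.predict([features])[0]

def journal_lyrics_similarity(journal, lyrics):
    """Crude lexical similarity between a journal entry and song lyrics."""
    tfidf = TfidfVectorizer().fit_transform([journal, lyrics])
    return float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])

print(song_mood([0.85, 0.75, 128]))
print(journal_lyrics_similarity("feeling upbeat and energized today",
                                "dancing all night, feeling alive"))
```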
I was individually tasked with creating a hands-on, AI-based project. I designed and implemented a multi-threaded algorithm that drives a robot around until it detects a person, then chases them and celebrates when it catches up. It also performs obstacle avoidance with an ultrasonic sensor, and the code is modular so the behaviour is easy to customize.
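A rough sketch of the control loop, with hypothetical stand-ins for the robot's sensors, motors, and person detector (the real project's hardware APIs differ):

```python
import threading
import time

# Hypothetical placeholders for the robot's hardware and vision
def read_ultrasonic_cm():
    return 100.0  # placeholder distance reading

def detect_person():
    return False, 0.0  # (person found?, horizontal offset in [-1, 1])

def drive(left, right):
    pass  # set wheel speeds

def celebrate():
    print("Caught you!")

distance_cm = 999.0

def obstacle_thread():
    """Continuously sample the ultrasonic sensor on a background thread."""
    global distance_cm
    while True:
        distance_cm = read_ultrasonic_cm()
        time.sleep(0.05)

def chase_loop():
    threading.Thread(target=obstacle_thread, daemon=True).start()
    while True:
        found, offset = detect_person()
        if found and distance_cm < 20:       # caught up to the person
            drive(0, 0)
            celebrate()
            break
        if distance_cm < 20:                 # obstacle ahead: back up and turn
            drive(-0.4, -0.4); time.sleep(0.5)
            drive(0.4, -0.4); time.sleep(0.5)
        elif not found:                      # wander until a person is detected
            drive(0.3, -0.3)
        else:                                # steer toward the person, proportional to offset
            drive(0.6 + 0.4 * offset, 0.6 - 0.4 * offset)
        time.sleep(0.1)
```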
In a team of 4 and in 24 hours, we created a polished, fully functioning mood tracker powered by Microsoft Azure; we were finalists in the Best Use of Azure category. I personally wrote the Flask endpoints, the HTTP requests that call Azure for Sentiment Analysis and Entity Extraction, and the Python ETL scripts that turn the raw results into meaningful insights.
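A minimal sketch of one such endpoint, assuming the Azure Text Analytics v3.0 REST API for sentiment and environment variables for the resource endpoint and key (the hackathon code is organized differently):

```python
import os
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# Assumed configuration, e.g. https://<resource>.cognitiveservices.azure.com
AZURE_ENDPOINT = os.environ["AZURE_TEXT_ANALYTICS_ENDPOINT"]
AZURE_KEY = os.environ["AZURE_TEXT_ANALYTICS_KEY"]

@app.route("/sentiment", methods=["POST"])
def sentiment():
    text = request.get_json()["text"]
    payload = {"documents": [{"id": "1", "language": "en", "text": text}]}
    resp = requests.post(
        f"{AZURE_ENDPOINT}/text/analytics/v3.0/sentiment",
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json=payload,
    )
    doc = resp.json()["documents"][0]
    # Return just the overall label and confidence scores
    return jsonify({"sentiment": doc["sentiment"], "scores": doc["confidenceScores"]})

if __name__ == "__main__":
    app.run(debug=True)
```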
Background Filters detects humans in a webcam photo, then applies a Fast Style Transfer filter to the person, their background, or both. I use TensorFlow.js models to segment the person, producing a mask, and perform style transfer on the webcam photo. A custom function then iterates over the pixels, painting either the original or the styled pixel depending on the mask.
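The project itself runs in the browser with TensorFlow.js; the NumPy sketch below only illustrates the mask-based compositing step, assuming the segmentation mask and the styled image have already been computed:

```python
import numpy as np

def composite(original, styled, mask, target="background"):
    """Paint styled pixels onto the person, the background, or both.

    original, styled: HxWx3 uint8 images; mask: HxW bool array, True where a person was detected.
    """
    if target == "both":
        return styled
    use_styled = ~mask if target == "background" else mask  # region that gets the filter
    return np.where(use_styled[..., None], styled, original)

# Example with dummy data
h, w = 4, 4
original = np.zeros((h, w, 3), dtype=np.uint8)
styled = np.full((h, w, 3), 255, dtype=np.uint8)
mask = np.zeros((h, w), dtype=bool)
mask[1:3, 1:3] = True  # pretend a person occupies the centre
print(composite(original, styled, mask, target="person")[..., 0])
```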