Reports and Blogs

Report on Tweet Analyzer

Tweet Analyzer was my first foray into deep learning. In this project I wanted to see what all the hype around neural network models was about, so I created a classifier that performs sentiment analysis on tweets posted by people online. I trained traditional machine learning methods and compared their best result with state-of-the-art deep learning algorithms. Among the traditional machine learning methods, logistic regression performed best with 92% accuracy, and among the deep learning methods, LSTM performed best with 95% accuracy. When I looked at where the LSTM beat logistic regression, I was surprised to find it could correctly classify sarcasm!
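For reference, here is a minimal sketch of what the traditional baseline could look like: a TF-IDF plus logistic regression pipeline in scikit-learn. The tweets and labels below are placeholders, not the project's actual dataset or preprocessing.

```python
# Minimal sketch: TF-IDF features fed into logistic regression (scikit-learn).
# The tweets/labels here are placeholders, not the project's dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

tweets = [
    "I love this!", "Worst day ever", "Absolutely fantastic", "Not great, not terrible",
    "So happy with the result", "This is awful", "What a wonderful surprise", "I hate waiting",
]
labels = [1, 0, 1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

X_train, X_test, y_train, y_test = train_test_split(tweets, labels, test_size=0.25, random_state=42)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # unigram + bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```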

First Datathon at Caltech

The Data Open was my first data analysis hackathon. Participants are given a dataset and have 7 hours to explore it, analyze it, and draft their findings in a report. In 2017, when the Data Open took place at Caltech, I was paired with three postgraduate students from Caltech coming from different fields: one from computer science and two from electrical engineering. It was daunting at first because I was the only undergraduate, but once we came together as a team and the clock started ticking, nothing else mattered; all we cared about was strategizing, exploring the data, sharing our approaches, and going after the analysis. For our particular challenge, we were given NYC's taxi data. The beauty of the Data Open is that there are no rules: you can explore and analyze whatever you want, take whatever approach you desire, and communicate it in the report. Your approach and creativity are all that count.

Review of L1 and L2 Regularization (includes ElasticNet)

A lot of the time I forget the difference between L1 and L2 regularization, and then I have to go back and search for what each one means, what it looks like in an example, and when one is used over the other, which is a hassle. So here is a quick reference, with examples from Aurelien Geron's Hands-on-ML book, that links to a Colab notebook.
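As a quick reminder of the difference (a minimal sketch, not the notebook itself), the scikit-learn versions of the three penalties can be compared on toy data like this; L1 (Lasso) tends to drive some coefficients to exactly zero, while L2 (Ridge) shrinks them all without zeroing them out, and ElasticNet mixes the two.

```python
# Sketch: fit L2 (Ridge), L1 (Lasso), and ElasticNet on the same toy data
# and compare the learned coefficients. Only two features are informative,
# so Lasso should zero out most of the rest.
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

rng = np.random.RandomState(42)
X = rng.rand(100, 5)
y = 3 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.randn(100)  # only features 0 and 1 matter

for model in (Ridge(alpha=1.0),                      # L2 penalty
              Lasso(alpha=0.1),                      # L1 penalty
              ElasticNet(alpha=0.1, l1_ratio=0.5)):  # mix of L1 and L2
    model.fit(X, y)
    print(type(model).__name__, np.round(model.coef_, 3))
```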

MNIST GAN

MNIST GAN is an introductory tutorial on architecting GANs. It explains how to create and train a discriminator, how to create a generator, and finally how to combine the two to architect and train a GAN. Here, the GAN creates handwritten digits of its own! I took a lot of help from How to Develop a GAN for Generating MNIST Handwritten Digits and Generative Adversarial Networks for beginners.
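A condensed sketch of those three pieces in tf.keras is below. The layer sizes and latent dimension only loosely follow the tutorials referenced above; treat it as an outline of the architecture, not the full notebook with its training loop.

```python
# Sketch of the three GAN components for 28x28x1 MNIST images (tf.keras).
from tensorflow.keras import Sequential
from tensorflow.keras.layers import (Dense, Reshape, Flatten, Conv2D,
                                     Conv2DTranspose, LeakyReLU, Dropout)

latent_dim = 100

# 1. Discriminator: classifies images as real or fake.
discriminator = Sequential([
    Conv2D(64, 3, strides=2, padding="same", input_shape=(28, 28, 1)),
    LeakyReLU(0.2), Dropout(0.4),
    Conv2D(64, 3, strides=2, padding="same"),
    LeakyReLU(0.2), Dropout(0.4),
    Flatten(),
    Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# 2. Generator: maps a latent vector to a 28x28x1 image (not compiled on its own).
generator = Sequential([
    Dense(128 * 7 * 7, input_dim=latent_dim),
    LeakyReLU(0.2),
    Reshape((7, 7, 128)),
    Conv2DTranspose(128, 4, strides=2, padding="same"),  # upsample to 14x14
    LeakyReLU(0.2),
    Conv2DTranspose(128, 4, strides=2, padding="same"),  # upsample to 28x28
    LeakyReLU(0.2),
    Conv2D(1, 7, activation="sigmoid", padding="same"),
])

# 3. Combined GAN: freeze the discriminator and train the generator through it.
discriminator.trainable = False
gan = Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")
```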

My Portfolio

Space Race

This is a recreation of a game I played in my childhood called SkyRoads. I built it on the Unity engine using C#.

Super-Sixagon

With Super-Sixagon I set out to create a basic version of SuperHexagon, and to learn how to create 2D games in Unity along the way.

Meme me

Meme me is an iOS application that lets users pick a photo from their photo album or capture one in real time, create memes with it, and send them to friends or post them on their social media platforms.

Link-ibrary

This is a link shortener website that I implemented as a full-stack link-saving application with login/logout. It allows users to sign up and maintain a personal list of web links in an orderly fashion. I developed it using React.js, ES6, and JSX.

Image Caption Generator

I trained my model on a dataset from Flickr. To analyze the photos, I used Oxford's Visual Geometry Group (VGG) model from the 2014 ImageNet competition, and to tokenize the captions I used Keras' Tokenizer class. Finally, I used the inject-merge architecture to train on the photos and generate captions.
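A rough sketch of what that model's shape can look like in Keras is below, assuming pre-extracted 4096-dimensional VGG16 features and a merge-style decoder; the vocabulary size and maximum caption length are placeholders, not the project's actual values.

```python
# Sketch of a merge-style caption model: a photo-feature branch and a
# caption-sequence branch are combined to predict the next word.
# vocab_size and max_length below are placeholders.
from tensorflow.keras.layers import Input, Dense, Dropout, Embedding, LSTM, add
from tensorflow.keras.models import Model

vocab_size = 7579   # placeholder: size of the tokenizer's vocabulary
max_length = 34     # placeholder: longest caption length after padding

# Photo branch: compress the pre-extracted VGG16 feature vector.
photo_in = Input(shape=(4096,))
photo_feat = Dense(256, activation="relu")(Dropout(0.5)(photo_in))

# Caption branch: embed the partial caption and run it through an LSTM.
caption_in = Input(shape=(max_length,))
caption_feat = LSTM(256)(Embedding(vocab_size, 256, mask_zero=True)(caption_in))

# Merge the two branches and predict the next word of the caption.
merged = Dense(256, activation="relu")(add([photo_feat, caption_feat]))
next_word = Dense(vocab_size, activation="softmax")(merged)

model = Model(inputs=[photo_in, caption_in], outputs=next_word)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```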

Tweet Analyzer

A tweet analyzer that uses traditional machine learning algorithms to perform sentiment analysis on tweets and compares their results with deep learning algorithms.