This is an interactive website built with Three.js. Click to explore.


Course: CSE 598 – Mobile Computing
Institution: Arizona State University
Date: February 2024
Project Overview:
This graduate-level project involved developing a mobile app and backend to control SmartHome devices using hand gestures. The Android app, built in Android Studio, let users select from 17 predefined gestures, watch expert demonstration videos, and record their own gestures with the device camera. The recorded video was uploaded to a local or fog/cloud server for gesture classification.
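The upload step described above can be sketched as a minimal Flask endpoint. This is an illustrative sketch only: the route name, form field, save directory, and JSON response shape are assumptions, not the project's actual API.

```python
import os
from flask import Flask, request

app = Flask(__name__)
UPLOAD_DIR = "/tmp/gesture_uploads"  # hypothetical save location
os.makedirs(UPLOAD_DIR, exist_ok=True)

@app.route("/upload", methods=["POST"])
def upload():
    # Expect the recorded gesture video in a multipart form field named "video"
    video = request.files["video"]
    video.save(os.path.join(UPLOAD_DIR, video.filename))
    # Acknowledge receipt so the Android client can confirm the upload
    return {"status": "ok", "filename": video.filename}
```

On the Android side, the client would POST the recorded file to this endpoint over the local network (or to the fog/cloud server's address) as multipart form data.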
In the second part of the project, I built a Python-based classification pipeline using a Convolutional Neural Network (CNN) to recognize gestures from the uploaded videos. The system extracted each video's middle frames, computed feature vectors with the CNN, and compared them against reference features using cosine similarity to identify the gesture, storing the final predictions in a CSV file.
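The matching-and-output step of that pipeline can be sketched with NumPy. This is a minimal sketch under stated assumptions: the gesture names, 3-element feature vectors, and CSV columns are hypothetical placeholders, and in the real pipeline the features would come from the CNN applied to extracted middle frames rather than being hand-written.

```python
import csv
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors, in [-1, 1]
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(feature, gesture_library):
    # Pick the reference gesture whose feature vector is most similar
    return max(gesture_library, key=lambda g: cosine_similarity(feature, gesture_library[g]))

# Hypothetical reference features (would come from expert demonstration videos)
library = {
    "swipe_left": np.array([1.0, 0.0, 0.0]),
    "swipe_right": np.array([0.0, 1.0, 0.0]),
}

# Hypothetical feature extracted from an uploaded video's middle frame
sample = np.array([0.9, 0.1, 0.0])
prediction = classify(sample, library)

# Store the final prediction in a CSV file, mirroring the project's last step
with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["video", "predicted_gesture"])
    writer.writerow(["sample_video.mp4", prediction])
```

Cosine similarity is a natural choice here because it compares the direction of feature vectors rather than their magnitude, so it is insensitive to overall brightness or activation scale differences between recordings.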
Technologies Used:
Android Studio (Java/Kotlin)
Python (TensorFlow, Keras, OpenCV, NumPy, Pandas)
Flask (Local Server)
CNN Gesture Recognition Model
Gradescope Submission
Skills Gained:
Mobile UI design and multi-screen navigation
REST API integration
Cloud/fog video upload logic
CNN-based gesture recognition
End-to-end data processing and classification
CSE 470: Cloud Development: AWS Presentation (April 2023)
I completed this presentation as part of my undergraduate computer science degree at SNHU.
 
UC Davis: Spring 2021
This is a project I completed in my Web Development class at the University of California, Davis.
Website built following FreeCodeCamp.