Siyuan (Shawn) Chen

CS Undergraduate Student at Cornell University.

Majoring in Computer Science at Cornell has broadened my understanding of artificial intelligence and machine learning, and I am interested in exploring their practical applications in depth. My other interests include building side projects (such as this website), fitness, and reading.

More About Me

I am part of Combat Robotics @ Cornell (CRC), a project team that builds RC and autonomous combat robots and competes in nationwide battlebot competitions such as the National Havoc Robot League. I am also an undergraduate research assistant at the Cornell SciFi Lab, supervised by Prof. Cheng Zhang. Throughout the academic year, I work as a teaching assistant at the College of Computing and Information Science for CS 3110: Functional Programming and Data Structures, and CS 3410: Computer System Organization and Programming.

Projects

Below is a list of projects I have worked on, applying both imperative and functional programming techniques learned in school to fields such as games, mobile applications, machine learning, and computer vision.

Lance: Autonomous Robot

A combat robot capable of detecting roads and corners, following direction signs, and recognizing text and pictures on targets using machine learning algorithms.

Tools and Techniques: Machine Learning, Embedded Systems

Developer Team: Combat Robotics @ Cornell Firmware Subteam

My Role: Assembled the embedded system on a Raspberry Pi; implemented PyTorch convolutional neural networks for road tracking and facial recognition.

Farmfield Monitor

A device that monitors field conditions and enhances crop protection: it traps insects, identifies their type and quantity, exterminates those recognized as pests, and updates the data on a server.

Tools and Techniques: Computer Vision, Embedded Systems, AWS

Developer Team: Barn Owl Technologies

My Role: Created the API data-update framework, trained the insect recognition model, and assembled prototypes, sensors, and controllers.

EchoBlink Research Project

A wearable eyeglass system with micro speakers and microphones capable of detecting eye blinks.

Tools and Techniques: Embedded Systems, Deep Learning, Acoustic Signal Processing

Research Team: Cornell SciFi Lab

My Role: Led the project; conducted pilot and user studies; assembled and experimented with the form factor; collected and processed blink and non-blink acoustic signals from 15 participants to train deep learning models.
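The real system classifies blinks with deep learning on acoustic data; as an illustrative sketch of the signal-processing step only, here is a simple short-time energy detector in Python. The window size, threshold, and all names are hypothetical, not taken from the actual project.

```python
# Illustrative sketch: flag candidate "blink" windows in an acoustic
# signal by short-time energy. Window size and threshold are
# hypothetical; the real EchoBlink system uses deep learning.

def short_time_energy(signal, window=8):
    """Mean squared amplitude over non-overlapping windows."""
    return [
        sum(x * x for x in signal[i:i + window]) / window
        for i in range(0, len(signal) - window + 1, window)
    ]

def candidate_blink_windows(signal, window=8, threshold=0.25):
    """Indices of windows whose energy exceeds the threshold."""
    return [
        i for i, e in enumerate(short_time_energy(signal, window))
        if e > threshold
    ]

# Synthetic example: a quiet signal with one loud burst in the middle.
quiet = [0.01] * 16
burst = [0.9, -0.8] * 4
signal = quiet + burst + quiet

print(candidate_blink_windows(signal))  # -> [2]
```

A detector like this could pre-select segments to feed to a learned classifier, which is the part that actually distinguishes blinks from other facial movements.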

Battleship Game

A two-player, command-line battleship game in which each player takes turns deploying various bombs on a generated grid to sink the enemy's ships.

Tools and Techniques: OCaml, Functional Programming

Developer Team: Shawn Chen, Clare Daggett, Richard Jin, Alexandra Kushnirsky

My Role: Implemented the user command parser engine and the interactive command-line interface with a generated grid.
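The project itself was written in OCaml; as a minimal Python sketch of the same command-parsing idea, here is a parser for a hypothetical grammar (the command names and coordinate format are illustrative, not the game's actual syntax).

```python
# Illustrative sketch of a command parser for a turn-based game.
# The actual project implemented this in OCaml; the grammar here
# ("fire ROW COL" and "quit") is hypothetical.

def parse(line):
    """Parse a player command such as "fire 3 4" or "quit".

    Returns ("fire", row, col) or ("quit",); raises ValueError
    on malformed input.
    """
    tokens = line.strip().lower().split()
    if not tokens:
        raise ValueError("empty command")
    if tokens[0] == "quit" and len(tokens) == 1:
        return ("quit",)
    if tokens[0] == "fire" and len(tokens) == 3:
        try:
            row, col = int(tokens[1]), int(tokens[2])
        except ValueError:
            raise ValueError("coordinates must be integers") from None
        return ("fire", row, col)
    raise ValueError(f"unrecognized command: {line!r}")

print(parse("  Fire 3 4 "))  # -> ('fire', 3, 4)
```

Normalizing whitespace and case before matching keeps the game loop simple: it only ever sees well-formed tuples or a single error path.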

ShowTracker iOS App

An iOS app developed in Swift that tracks and identifies the TV show preferences of different users and maintains a watchlist distinct to each user ID.

Tools and Techniques: Swift, Flask, Heroku

Developer Team: Prithwish Dan, Joyce Wu, Gabe Pasternack, Shawn Chen, Richard Jin

My Role: Architected the front-end multi-layer GUI to display an interactive show watchlist, with networking to the backend API.
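In the app, per-user watchlists live behind a Flask backend API; as a language-neutral sketch of the underlying data model, here is a small Python store keyed by user ID (the class and method names are hypothetical, not the app's actual API).

```python
# Illustrative sketch of per-user watchlist storage. In the real app
# this state sits behind a Flask backend; names here are hypothetical.

class WatchlistStore:
    def __init__(self):
        self._lists = {}  # user_id -> set of show titles

    def add(self, user_id, show):
        self._lists.setdefault(user_id, set()).add(show)

    def remove(self, user_id, show):
        # discard() is a no-op if the show is absent
        self._lists.get(user_id, set()).discard(show)

    def watchlist(self, user_id):
        """Sorted watchlist; empty list for unknown users."""
        return sorted(self._lists.get(user_id, set()))

store = WatchlistStore()
store.add("alice", "Severance")
store.add("alice", "Dark")
store.add("bob", "Dark")
store.remove("alice", "Dark")
print(store.watchlist("alice"))  # -> ['Severance']
```

Keying every operation on `user_id` is what keeps each watchlist distinct per user, mirroring the app's behavior described above.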

Froggit Game

A classic GUI game in which the player moves a frog across the road and the river to reach all five destinations without losing all three lives.

Tools and Techniques: Python, Object-oriented Programming

Developer Team: Solo

My Role: Built a fully playable GUI game with a keyboard-controlled frog and automatic state transitions when the frog loses a life, reaches a destination, or the game ends.
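The automatic state transitions described above can be sketched as a small state machine. This is a simplified illustration, not the game's actual code: the state and event names are hypothetical, and the real game tracks far more (frog position, vehicles, logs, animation).

```python
# Illustrative sketch of Froggit-style state transitions.
# States, event names, and rules are hypothetical simplifications.

LIVES = 3
DESTINATIONS = 5

def transition(state, lives, reached, event):
    """Return (new_state, lives, reached) after a game event.

    Events: "hit" (frog loses a life), "arrive" (frog reaches a
    destination), "continue" (normal frame, nothing special happens).
    """
    if event == "hit":
        lives -= 1
        return ("game_over" if lives == 0 else "respawn", lives, reached)
    if event == "arrive":
        reached += 1
        return ("win" if reached == DESTINATIONS else "respawn",
                lives, reached)
    return ("playing", lives, reached)

print(transition("playing", 1, 2, "hit"))     # -> ('game_over', 0, 2)
print(transition("playing", 3, 4, "arrive"))  # -> ('win', 3, 5)
```

Centralizing the win/lose/respawn logic in one transition function keeps the per-frame game loop simple: it reports an event and redraws whatever state comes back.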