Multimodal Machine Learning & Human-Computer Interaction
At the GLAMOR Lab and the Interaction Lab, I work on multimodal machine learning to detect early signs of Alzheimer's Disease, in collaboration with the USC School of Gerontology.
Using PyTorch, I build models that analyze audio, visual, and behavioral data collected from elderly participants and look for connections across these modalities.
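As a rough illustration of how multiple modalities can be combined in PyTorch, the sketch below shows a simple late-fusion model: each modality gets its own small encoder, and the concatenated features feed a shared classification head. The architecture, feature dimensions, and class count here are all hypothetical, not the actual models used in the lab.

```python
import torch
import torch.nn as nn


class LateFusionClassifier(nn.Module):
    """Hypothetical late-fusion model over audio, visual, and behavioral features."""

    def __init__(self, audio_dim=40, visual_dim=128, behavior_dim=16, hidden=64):
        super().__init__()
        # One lightweight encoder per modality (dimensions are illustrative).
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, hidden), nn.ReLU())
        self.visual_enc = nn.Sequential(nn.Linear(visual_dim, hidden), nn.ReLU())
        self.behavior_enc = nn.Sequential(nn.Linear(behavior_dim, hidden), nn.ReLU())
        # Shared head over the concatenated modality embeddings (2 classes, illustrative).
        self.head = nn.Linear(3 * hidden, 2)

    def forward(self, audio, visual, behavior):
        # Encode each modality independently, then fuse by concatenation.
        fused = torch.cat(
            [self.audio_enc(audio), self.visual_enc(visual), self.behavior_enc(behavior)],
            dim=-1,
        )
        return self.head(fused)


model = LateFusionClassifier()
# A batch of 4 synthetic examples, one feature vector per modality.
logits = model(torch.randn(4, 40), torch.randn(4, 128), torch.randn(4, 16))
print(logits.shape)  # torch.Size([4, 2])
```

Late fusion keeps each modality's pipeline independent, which is convenient when streams are recorded at different rates or are sometimes missing; attention-based or early-fusion designs are common alternatives.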
I also develop iOS and Windows UWP applications integrated with the Tobii Pro Lab Eye-Tracking SDK to collect visual, audio, and ink-stroke data from human subjects.