In the SaPHaRI Lab, we have several projects underway—and many opportunities to start new projects as well.
AstroPsych is a multimodal framework designed to provide dynamic mental health support in challenging environments where traditional support is limited or impossible. Leveraging asynchronous therapist guidance and real-time physiological signals to administer on-demand support sessions, AstroPsych adapts continuously to the evolving needs of individuals in crisis. It offers a range of interaction methods, including text, voice, avatar, and robotic interfaces, to meet diverse user preferences.

As NASA prepares for human missions to Mars and beyond, astronauts will face psychological challenges beyond those experienced during previous missions such as Apollo. The one-way communication delay to Mars, which ranges from roughly 3 to 22 minutes depending on the planets' positions, presents a significant barrier to real-time mental health support, and planetary alignment (solar conjunction) can cause complete communication blackouts. AstroPsych addresses these unique challenges by providing immediate, empathetic support when needed, helping astronauts maintain their mental well-being despite this temporal isolation.

Beyond deep-space exploration, AstroPsych offers substantial benefits for mental health care on Earth. With a growing shortage of mental health professionals, the framework can supplement existing services by providing on-demand, adaptive support during times of acute need. It can also facilitate better patient-therapist matching, connecting individuals with the professionals best suited to support their needs.
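The 3-to-22-minute figure follows directly from light-travel time over the Earth–Mars distance, which varies from roughly 55 million km at closest approach to about 400 million km at maximum separation. A quick sanity check (the distances are approximate published values):

```python
# One-way light-travel delay between Earth and Mars at the distance extremes.
# Delay = distance / speed of light; distances are approximate.

C_KM_PER_S = 299_792.458          # speed of light in vacuum, km/s

def one_way_delay_minutes(distance_km: float) -> float:
    """One-way signal delay in minutes for a given Earth-Mars distance."""
    return distance_km / C_KM_PER_S / 60.0

CLOSEST_KM = 54.6e6               # ~closest approach (opposition)
FARTHEST_KM = 401e6               # ~maximum separation (near conjunction)

print(f"closest:  {one_way_delay_minutes(CLOSEST_KM):.1f} min")   # ≈ 3.0 min
print(f"farthest: {one_way_delay_minutes(FARTHEST_KM):.1f} min")  # ≈ 22.3 min
```

A round-trip exchange at the far end of that range takes about 45 minutes, which is why any conversational support loop must run on board rather than with a therapist on Earth.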
This project focuses on developing affective computing technologies that address challenges in emotional regulation and promote social interaction for individuals with Autism Spectrum Disorder (ASD) and Post-Traumatic Stress Disorder (PTSD). People with these conditions often experience difficulty interpreting and expressing emotions (alexithymia), limiting their ability to engage in social interactions. The system aims to continuously monitor and quantify the user’s emotional state using a wearable device with non-invasive physiological sensors. Through a multi-modal actuation framework, the wearable will intuitively signal the wearer’s emotional state and/or desire for social interaction, encouraging nearby humans to engage with the wearer. This project explores real-time emotion sensing, modular actuation methods, and the broader impact of technology-mediated social touch in improving emotional well-being.
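To make the sensing-to-signaling pipeline concrete, here is a minimal sketch of how a coarse arousal estimate might be derived from two non-invasive signals. The signal names, baselines, weights, and thresholds are all hypothetical placeholders for illustration, not the lab's actual model; a deployed system would learn a per-user baseline and use a trained classifier.

```python
# Illustrative sketch only: a coarse arousal estimate from heart rate and
# electrodermal activity (EDA). All weights and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class PhysioSample:
    heart_rate_bpm: float      # e.g., from a wrist PPG sensor
    eda_microsiemens: float    # electrodermal activity (skin conductance)

def arousal_level(sample: PhysioSample,
                  hr_baseline: float = 70.0,
                  eda_baseline: float = 2.0) -> str:
    """Classify arousal as 'low', 'neutral', or 'high' relative to a
    per-user baseline, using an arbitrary linear score."""
    hr_delta = sample.heart_rate_bpm - hr_baseline
    eda_delta = sample.eda_microsiemens - eda_baseline
    score = 0.05 * hr_delta + 0.5 * eda_delta   # illustrative weights
    if score > 1.0:
        return "high"
    if score < -0.5:
        return "low"
    return "neutral"

# A 'high' reading could trigger one of the wearable's actuation channels
# (for example, a color cue or haptic pattern) to signal the wearer's state.
print(arousal_level(PhysioSample(heart_rate_bpm=95, eda_microsiemens=4.5)))
```

The interesting research questions sit around this stub: which signals and models estimate state reliably in the wild, and which actuation modality communicates that state most intuitively to nearby people.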
This project explores the interaction between humans and a group of mobile robots in a guided navigation context. Using TurtleBots upgraded with Raspberry Pis and running ROS Noetic, we investigate how various aspects of multi-robot systems influence human reception and behavior. Specifically, we consider factors such as the number of robots involved, their spacing, proxemic distance, and the influence of auditory cues on the human experience. Our goal is to better understand how different configurations of robot behavior and environmental factors can optimize group interactions, enhance human comfort, and improve the effectiveness of robot-guided navigation in shared spaces.
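The factors under study (robot count, inter-robot spacing, proxemic standoff) can be treated as parameters of a formation geometry. The sketch below, with hypothetical names and not the lab's actual controller, computes goal positions for a small group escorting a person; each goal could then be dispatched to a TurtleBot as a navigation target under ROS Noetic.

```python
# Illustrative sketch: goal positions for n robots escorting a person,
# parameterized by spacing and proxemic standoff. Geometry only.

import math

def escort_positions(human_xy, heading_rad, n_robots,
                     spacing=0.5, proxemic_dist=1.2):
    """Place n_robots in a line abreast behind the human, centered on the
    human's heading, `proxemic_dist` metres back and `spacing` metres
    apart. Returns a list of (x, y) goals."""
    hx, hy = human_xy
    # Unit vector pointing behind the human, and its perpendicular.
    bx, by = -math.cos(heading_rad), -math.sin(heading_rad)
    px, py = -by, bx
    goals = []
    for i in range(n_robots):
        offset = (i - (n_robots - 1) / 2) * spacing   # centered lateral offset
        goals.append((hx + proxemic_dist * bx + offset * px,
                      hy + proxemic_dist * by + offset * py))
    return goals

# Three robots trailing a person at the origin facing +x:
for goal in escort_positions((0.0, 0.0), 0.0, 3):
    print(goal)
```

Sweeping `n_robots`, `spacing`, and `proxemic_dist` across experimental conditions is one way to generate the configurations whose effect on human comfort the study measures.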