
Hi there! I am Nhan Tran (sounds like “Nyun”). I’m currently a research engineer working on robotics perception and human-robot interaction at Robust AI. Previously, I interned and collaborated with amazing teams at Robust.AI, Facebook, Google Nest, and the NASA/Caltech Jet Propulsion Laboratory.
I recently graduated with a master’s degree in Computer Science (Robotics and Intelligent Systems) from the Colorado School of Mines, where my thesis was advised by Dr. Tom Williams. Last year, I was selected for the HRI Pioneers 2020 cohort, a highly competitive doctoral workshop at the top venue for Human-Robot Interaction research. While working in industry, I maintain ties with academia and the robotics research community by serving as Web Chair for both the 2021 HRI Pioneers workshop and the 2021 ACM/IEEE International Conference on Human-Robot Interaction.
My research interests are Extended Reality (XR) and Human-Robot Interaction/Teaming, including ubiquitous intelligent systems such as decision support systems, autonomous vehicles, UAVs, and multimodal sensor technologies. For my master’s thesis, I researched computational models for adaptive augmentation, in which the way information is communicated to human teammates is adapted on the fly based on their cognitive load and mental states. My lab mates and I conducted experiments analyzing the effectiveness of robot-generated mixed reality gestures using real robotic and mixed reality hardware (two papers are now in submission). More broadly, I aspire to design naturalistic human-agent teaming experiences and adaptive technologies that augment human capabilities.
Academically, I am honored and grateful to have received several scholarships (Daniels Scholar, Greenhouse Scholar) and graduate fellowships (research and teaching fellow). These not only provided full financial support, allowing me to complete both my bachelor’s and master’s degrees debt-free, but also instilled in me strong leadership skills and a commitment to giving back to the community.
Robots built
Note: Some of these GIF images can take up to 20-30 seconds to load.
I have written about these and other side projects in more detail here: https://www.trannhan.com/projects/
Wall-Z 1.0
The Wall-Z robot (inspired by Disney’s Wall-E) features built-in neural-network-based ASL letter recognition, VR-based remote visualization of the target environment, and controllable motor and servo movements for precise interaction with the target environment.
Video: https://trannhan.com/walle
Mixed-reality assistant for finding medications
An embodied mixed reality assistant for older adults which, compared to its voice-only relatives (e.g., Alexa), can use non-verbal communication to help users keep track of which medications to take and draw the best possible route in augmented reality to where they last placed their medications.
Project: https://github.com/megatran/HoloLens_Pill_Tracker
Equipping robots with mixed-reality communication modality
A Microsoft HoloLens application that allows robots to communicate their intent to human teammates through the augmented reality modality. Paper submitted to HRI 2021 (pending).
Blasterbotica
Built with the 2016 Colorado School of Mines’ Blasterbotica senior design team to compete in the NASA Robotic Mining Competition. This robot could traverse the arena, avoid obstacles, excavate regolith, and dump collected regolith into the final collection bin.
Video: http://youtu.be/hPARYsAKzIY
Biped robot v1.5
This one-foot-tall biped robot is designed to imitate human walking, detect obstacles, and be operated using hand gestures. Inspired by the recently developed Atlas robot from Boston Dynamics.
Project: https://www.trannhan.com/project/hand-gesture-control-biped-robot/
3D-printed Mars Rover
I worked on this tiny Mars rover with the 2017 Mines Robotics club to compete in the Colorado Space Grant Robotics Challenge. The robot used several proximity sensors to avoid obstacles, drive toward a beacon, and withstand the Mars-like environment of the Great Sand Dunes National Park.
Video: http://youtu.be/4_nzMrpQ53k
Sir Mixer
An IoT drink mixer that interprets the facial expressions of its human users, infers their emotions, and then mixes drinks accordingly.
Project: https://www.trannhan.com/project/thoughtful-drink-mixer-v2-0/
PubNub Robot
This IoT project uses the PubNub API to synchronize and display data from the robot’s distance sensor on a web interface, while allowing human operators to control the robot from anywhere in the world.
Project: https://github.com/megatran/pubnub_iot_tinkering
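At its core, the project relies on the publish/subscribe pattern: the robot publishes sensor readings on a channel, and any number of web clients subscribe to that channel to receive them. The real PubNub SDK does this over the network and requires account keys, so as a rough, self-contained illustration of the pattern (the channel and field names below are made up, not from the actual project), here is an in-process sketch:

```python
# Minimal in-process sketch of the publish/subscribe pattern that PubNub
# provides as a hosted service. The real PubNub SDK delivers messages over
# the network; "robot-telemetry" and "distance_cm" are illustrative names.
from collections import defaultdict


class PubSub:
    def __init__(self):
        # channel name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        self._subscribers[channel].append(callback)

    def publish(self, channel, message):
        # deliver the message to every subscriber of this channel
        for callback in self._subscribers[channel]:
            callback(message)


# Web-interface side: subscribe and collect readings (a real client
# would render them in the browser instead).
bus = PubSub()
readings = []
bus.subscribe("robot-telemetry", readings.append)

# Robot side: publish distance-sensor readings as they arrive.
for distance_cm in (120, 87, 43):
    bus.publish("robot-telemetry", {"distance_cm": distance_cm})
```

Commands from the operator back to the robot work the same way in reverse: the robot subscribes to a control channel that the web interface publishes to.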
Hailfire, a hand gesture-controlled robot
A prototype showcasing how a robot can be operated using JavaScript via Cylon.js. I gave a lightning talk about this project at the 2016 O’Reilly Fluent Conference.
Publications
For a full list of publications and citations, please visit my Google Scholar page.
- Adapting Mixed Reality Robot Communication to Mental Workload (Accepted, camera-ready version in progress)
Nhan Tran
Proceedings of the HRI Pioneers Workshop* at the 15th ACM/IEEE International Conference on Human-Robot Interaction | HRI 2020
*HRI Pioneers is a premier forum for graduate students in HRI. Each year, the workshop brings together a cohort of the world’s top student researchers and provides the opportunity for students to present and discuss their work with distinguished student peers and senior scholars in the field.
- Mixed Reality Deictic Gesture for Multi-Modal Robot Communication
Tom Williams, Matthew Bussing, Sebastian Cabroll, Elizabeth Boyle, Nhan Tran
ACM/IEEE International Conference on Human-Robot Interaction | HRI, 2019
- A Hands-Free Virtual-Reality Teleoperation Interface for Wizard-of-Oz Control
Nhan Tran, Josh Rands, and Tom Williams
International Workshop on Virtual, Augmented, and Mixed Reality for HRI | VAM-HRI, 2018
- Augmented, Mixed, and Virtual Reality Enabling of Robot Deixis
Tom Williams, Nhan Tran, Josh Rands, and Neil T. Dantam
VAMR/HCI International Conference on Virtual, Augmented and Mixed Reality | VAMR, 2018
When not coding, I enjoy making videos with friends and playing around with movie special effects. As a former captain of the Mines Robotics Club, I also captured various moments of my team building robots and competing in collegiate robotics tournaments.














