Hi there!
I am Nhan Tran (sounds like “Nyun”), currently a Ph.D. student in Computer Science at Cornell University, co-advised by Professors Abe Davis and Angelique Taylor.
Before graduate school, I spent two wonderful years in industry working on robotics perception and human-robot interaction at Robust AI (check out our robot here). Prior to that, I interned and collaborated with amazing teams at Robust AI, Facebook, Google Nest, and the NASA/Caltech Jet Propulsion Laboratory.
My current research interests include robotics and extended reality (XR, which includes AR, VR, and MR) in healthcare.
Research and Industry Experience
Robust AI
Robotics Software Engineer II
September 2020 – August 2022
Research Intern
Summer 2020
Mines Interactive Robotics Research Lab
Graduate Research Assistant / Master’s Thesis
Summer 2020
Undergraduate Research Assistant
2017 – 2018
Facebook
Production Engineer Intern
Summer 2019
NASA/Caltech Jet Propulsion Laboratory
Software Engineer Intern
Summer 2018
Google Nest Labs
Software Engineer Intern
Summer 2017
Selected Publications
For a full list of publications, please visit my Google Scholar page.
What’s The Point? Tradeoffs Between Effectiveness and Social Perception When Using Mixed Reality to Enhance Gesturally Limited Robots
Jared Hamilton*, Thao Phung*, Nhan Tran, and Tom Williams
ACM/IEEE International Conference on Human-Robot Interaction | HRI 2021

Get This! ⇓ Mixed Reality Improves Robot Communication Regardless of Mental Workload
Nhan Tran, Trevor Grant, Thao Phung, Leanne Hirshfield, Christopher Wickens, and Tom Williams
ACM/IEEE International Conference on Human-Robot Interaction | Late-Breaking Reports | HRI LBRs 2021

Adapting Mixed Reality Robot Communication to Mental Workload
Nhan Tran
Proceedings of the HRI Pioneers Workshop* at the 15th ACM/IEEE International Conference on Human-Robot Interaction | HRI 2020
*HRI Pioneers is a premier forum for graduate students in HRI. Each year, the workshop brings together a cohort of the world’s top student researchers and provides the opportunity for students to present and discuss their work with distinguished student peers and senior scholars in the field.

Mixed Reality Deictic Gesture for Multi-Modal Robot Communication
Tom Williams, Matthew Bussing, Sebastian Cabrol, Elizabeth Boyle, and Nhan Tran
ACM/IEEE International Conference on Human-Robot Interaction | HRI 2019

Augmented, Mixed, and Virtual Reality Enabling of Robot Deixis
Tom Williams, Nhan Tran, Josh Rands, and Neil T. Dantam
International Conference on Virtual, Augmented and Mixed Reality (held as part of HCI International) | VAMR 2018

A Hands-Free Virtual-Reality Teleoperation Interface for Wizard-of-Oz Control
Nhan Tran, Josh Rands, and Tom Williams
1st International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction | VAM-HRI 2018

Robots Built
I have written about these and other side projects in more detail at https://www.trannhan.com/projects/.
Wall-Z 1.0
The Wall-Z robot (inspired by Disney’s Wall-E) features built-in neural-network-based ASL letter recognition, VR-based remote visualization of the target environment, and controllable motors and servos for precise interaction with that environment.
Video: https://trannhan.com/walle
Mixed-reality assistant for finding medications
An embodied mixed-reality assistant for older adults. Unlike its voice-only relatives (e.g., Alexa), it can use non-verbal communication to help users keep track of which medication to take and render a route in augmented reality to where they last placed their medications.
Project: https://github.com/megatran/HoloLens_Pill_Tracker
Equipping robots with a mixed-reality communication modality
A Microsoft HoloLens application that allows a robot to communicate its intent to a human teammate through augmented reality. A paper on this work was submitted to HRI 2021 (pending).
Blasterbotica
Built with the 2016 Colorado School of Mines’ Blasterbotica senior design team to compete in the NASA Robotic Mining Competition. This robot could traverse the arena, avoid obstacles, excavate regolith, and dump collected regolith into the final collection bin.
Video: http://youtu.be/hPARYsAKzIY
Biped robot v1.5
This one-foot-tall biped robot is designed to imitate human walking, detect obstacles, and be operated using hand gestures. It was inspired by Boston Dynamics’ Atlas robot.
Project: https://www.trannhan.com/project/hand-gesture-control-biped-robot/
3D-printed Mars Rover
I worked on this tiny Mars rover with the 2017 Mines Robotics club to compete in the Colorado Space Grant Robotics Challenge. The robot used several proximity sensors to avoid obstacles, drive toward a beacon, and withstand the Mars-like environment of the Great Sand Dunes National Park.
Video: http://youtu.be/4_nzMrpQ53k
Sir Mixer
An IoT drink mixer that interprets users’ facial expressions, infers their emotions, and mixes drinks accordingly.
Project: https://www.trannhan.com/project/thoughtful-drink-mixer-v2-0/
PubNub Robot
This IoT project uses the PubNub API to synchronize and display the robot’s distance-sensor readings on a web interface, while allowing human operators to control the robot from anywhere in the world. A minimal sketch of the publish/subscribe wiring appears after the project link below.
Project: https://github.com/megatran/pubnub_iot_tinkering
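As illustration only: the TypeScript sketch below shows how telemetry publishing and remote commands could be wired up with the PubNub JavaScript SDK. The keys, userId, channel names, and message shape are hypothetical stand-ins, not the project’s actual configuration.

```typescript
import PubNub from "pubnub";

// Placeholder keys, userId, and channel names for illustration only.
const pubnub = new PubNub({
  publishKey: "demo",
  subscribeKey: "demo",
  userId: "robot-1",
});

// Stand-in for the real hardware read; the actual robot reads a distance sensor.
function readDistanceCm(): number {
  return Math.random() * 200;
}

// Publish a distance reading so a web dashboard can display it.
async function publishDistance(cm: number): Promise<void> {
  await pubnub.publish({
    channel: "robot.telemetry",
    message: { sensor: "distance", cm, ts: Date.now() },
  });
}

// Stream readings once per second.
setInterval(() => {
  void publishDistance(readDistanceCm());
}, 1000);

// Subscribe to a command channel so an operator can drive the robot remotely.
pubnub.addListener({
  message: (event) => {
    if (event.channel === "robot.commands") {
      console.log("received drive command:", event.message);
    }
  },
});
pubnub.subscribe({ channels: ["robot.commands"] });
```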
Hailfire, a hand gesture-controlled robot
A prototype showing how a robot can be operated from JavaScript via Cylon.js; a rough sketch of the pattern is shown below. I gave a lightning talk about this project at the 2016 O’Reilly Fluent Conference.
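For flavor, here is a rough TypeScript sketch of the Cylon.js pattern the prototype was built around: declare connections and devices, then run a work routine. The Arduino/Firmata connection, pin numbers, and the gesture-to-command mapping are assumptions for illustration, not the actual Hailfire setup.

```typescript
// Cylon.js ships without TypeScript typings, so require() is used here.
// Assumes the cylon, cylon-firmata, and cylon-gpio packages are installed.
const Cylon = require("cylon");

Cylon.robot({
  // Assumed setup: an Arduino running Firmata on /dev/ttyACM0.
  connections: {
    arduino: { adaptor: "firmata", port: "/dev/ttyACM0" },
  },
  // Hypothetical wiring: two drive motors on PWM pins 3 and 5.
  devices: {
    leftMotor: { driver: "motor", pin: 3 },
    rightMotor: { driver: "motor", pin: 5 },
  },
  work: function (my: any) {
    // Stand-in for a recognized "forward" hand gesture:
    // drive both motors for two seconds, then stop.
    my.leftMotor.speed(120);
    my.rightMotor.speed(120);
    setTimeout(() => {
      my.leftMotor.turnOff();
      my.rightMotor.turnOff();
    }, 2000);
  },
}).start();
```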