Nhan Tran

Hi there! I am Nhan Tran (sounds like “Nyun”).

I'm currently a Ph.D. student in Computer Science at Cornell University, advised by Professor Abe Davis.

I'm also pursuing a minor in Media Studies (focusing on Cinematography and Visual Storytelling) in Cornell's Department of Performing and Media Arts.

Before graduate school, I spent two wonderful years in industry working on robotics perception and human-robot interaction at Robust.AI (check out our robot here). Prior to that, I interned, learned, and collaborated with the amazing teams at Robust.AI, Facebook, Google Nest, and the NASA/Caltech Jet Propulsion Laboratory.

My current research interests center on extended reality (XR), an umbrella term covering AR, VR, and MR.

CV/Resume     Github     G. Scholar     LinkedIn     Twitter/X     YouTube    

nhan at cs dot cornell dot edu

Robots and AR/VR are two of my favorite things. I have been fortunate to work on a variety of projects over the years, from research to industry to personal hobbies, that either utilize or combine these two areas. For example, I have integrated AR capabilities into robot systems to enable more intuitive human-robot interaction. I am continually excited to tinker and see how innovations in robotics and immersive technologies can be applied to improve people's lives in areas like healthcare, education, filmmaking, and more.


Now Look Here! ⇓ Mixed Reality Improves Robot Communication Without Cognitive Overload

Nhan Tran, Trevor Grant, Thao Phung, Leanne Hirshfield, Christopher Wickens, Tom Williams
HCI International Conference on Virtual, Augmented, and Mixed Reality (HCII 2023)

We explored whether the success of Mixed Reality Deictic Gestures for human-robot communication depends on a user's cognitive load, through an experiment grounded in theories of cognitive resources. We found these gestures provide benefits regardless of cognitive load, but only when paired with complex language. Our results suggest designers can use rich referring expressions with these gestures without overloading users.

What's The Point? Tradeoffs Between Effectiveness and Social Perception When Using Mixed Reality to Enhance Gesturally Limited Robots

Jared Hamilton, Thao Phung, Nhan Tran, Tom Williams
ACM/IEEE International Conference on Human-Robot Interaction (HRI 2021)

We present the first experiment analyzing the effectiveness of robot-generated mixed reality gestures using real robotic and mixed reality hardware. Our findings demonstrate how these gestures increase user effectiveness by decreasing user response time during visual search tasks, and show that robots can safely pair longer, more natural referring expressions with mixed reality gestures without worrying about cognitively overloading their interlocutors.

Adapting Mixed Reality Robot Communication to Mental Workload

Nhan Tran
HRI Pioneers Workshop at the International Conference on Human-Robot Interaction (HRI 2020)
★ HRI Pioneers ★ PDF    

Mixed reality deictic gesture for multi-modal robot communication

Tom Williams, Matthew Bussing, Sebastian Cabrol, Elizabeth Boyle, Nhan Tran
ACM/IEEE International Conference on Human-Robot Interaction (HRI 2019)

We investigate human perception of videos simulating the display of allocentric gestures, in which robots circle their targets in users' fields of view. Our results suggest that this is an effective communication strategy, both in terms of objective accuracy and subjective perception, especially when paired with complex natural language references.

Augmented, mixed, and virtual reality enabling of robot deixis

Tom Williams, Nhan Tran, Josh Rands, Neil T Dantam
HCI International Conference on Virtual, Augmented, and Mixed Reality (HCII 2018)
PDF

Humans use deictic gestures such as pointing to help identify targets of interest when interacting with one another, and research shows that similar robot gestures enable effective human-robot interaction. We present a conceptual framework for mixed-reality deictic gestures and summarize our work using these techniques to advance the state of the art in robot-generated deixis.

Films & Videos

I have created some videos over the years as a personal hobby and creative outlet. Now that I am pursuing a Ph.D. in Computer Science with a minor in Media Studies (focusing on Cinematography), I am looking forward to having more opportunities to tell visual stories during breaks from research and teaching responsibilities.

Stay tuned for more or subscribe to my YouTube channel!

Solar Eclipse | Chimney Bluffs State Park
The Phantom of Gates Hall | An Otamatone Performance
The Tiny Explorer | A short film by Waki Kamino, Peter Wu, Nhan Tran
Short Films Teaser 2023 | "The Tiny Explorer", "My Robot", "Facade"
MY ROBOT | A 2-minute short film
16mm Film Experiment
Spaced Out | A Short Movie
Inclusive User Testing in VR | MIT Reality Hack 2022
XR-Controlled Hospital Robot Prototype
MusicBlox: Tangible Programming in Mixed Reality | AR/VR Grand Prize @ Stanford TreeHacks 2020
Melody Mesh | 3D Audio Visualizer
Pandemic Simulator (Cornell CS5620 Creative Project 1)
Physics Things - Short Horror Movie
Blasterbotica 2016 NASA Robotic Mining Competition
Mines Robotics Recognized for Best Robot at 2017 CO Space Grant Robotics Challenge
Mines Robotics WON FIRST PLACE at 2017 ASME Robot Pentathlon - Student Design Competition

Misc Projects

Over the years, I have enjoyed tinkering with electronics and AR/VR projects as a personal hobby in my free time. I have built several small projects over weekends, at hackathons with friends, or at regional and national outreach events (e.g., AAAI 2018 and 2019) teaching kids robotics. These hands-on experiences have been not only fun but also great learning opportunities.

Robotic Medical Crash Cart

Video 1 (Hardware) Video 2 (Pilot Study)

I led this project with a team of undergraduates to transform a medical crash cart used in hospitals into a smart robotic system, as part of the Mobile Human-Robot Interaction class taught by Prof. Wendy Ju at Cornell Tech. The base is built on a modified hoverboard. On the perception side, we used a RealSense depth sensor to prototype a "follow me" interaction, in which the robot carries medical supplies and follows a designated user.

Wall Z 1.0

My friend Ryan and I built the Wall-Z robot, inspired by Disney's Wall-E. It uses an Nvidia Jetson for on-device ASL recognition, streams its view to VR for remote environment visualization, and synchronizes its head movement with the wearer's VR headset.

Mixed-Reality Assistant for Medication Navigation and Tracking


I built an embodied mixed reality assistant on the Microsoft HoloLens 1. Through virtual interfaces, users can anchor the locations of their pill bottles; the assistant saves these anchors in a map and, on request, projects an overlay of the shortest path from the user's current position to a saved anchor point.

3D-printed Mars Rover


Team project with the Mines Robotics Club. We built a tiny Mars rover to compete in the Colorado Space Grant Robotics Challenge. The robot used several proximity sensors to avoid obstacles, drive toward a beacon, and withstand the Mars-like environment of Great Sand Dunes National Park.

Blasterbotica: The Mining Bot at the NASA Robotic Mining Competition


Built with the 2016 Colorado School of Mines Blasterbotica senior design team to compete in the NASA Robotic Mining Competition, this robot could traverse the arena, avoid obstacles, excavate regolith, and dump the collected regolith into the final collection bin. I was a member of the perception team, learning to use ROS and OpenCV to detect obstacles and the collection bin.

Biped Robot v1.5 - A DIY Humanoid Walking Robot


My friend Arthur and I built this biped robot over a weekend, after watching the debut of Boston Dynamics' Atlas robot. It was designed to imitate human walking, detect obstacles, and be operated using hand gestures. Through DIY, we learned that bipedal locomotion is hard!

Hailfire, a hand gesture-controlled robot

I was learning how to interface with an Arduino from the web using Cylon.js. This prototype showcases how a robot can be operated using JavaScript and an accelerometer. I gave a lightning talk about this project at the 2016 O'Reilly Fluent Conference.

Carpal tunnel arm coach

A robotic hand that coaches users through exercises to help prevent carpal tunnel syndrome.

Sir Mixer: An emotionally aware bartender robot


My roommate Patrick and I built an IoT drink mixer that interprets the facial expressions of human users, infers their emotions, and then mixes drinks accordingly.

Web-Based Teleoperation and Monitoring of a Mobile Robot over PubNub

This IoT project uses the PubNub API to synchronize and display data from the robot's distance sensor on a web interface, while allowing human operators to control the robot from anywhere in the world.