When I was a college sophomore, I joined the Blasterbotica senior design team, which competed in the NASA Robotic Mining Competition. I wanted to get hands-on robotics experience and learn from the upperclassman engineers. My main contribution was on the software and vision sub-team, where I worked with seniors Ross Bunker (CS), Jonathan Melton (EE), and Nathan Young (CS+EE) on the rover’s autonomous localization and obstacle avoidance. Our goal was to guide the robot to traverse the arena, avoid obstacles, excavate regolith, and dump the collected regolith into the final collection bin. In the first half of the project timeline, we focused on approximating the location of the robot in the field and its proximity to obstacles. The contest rules stated that the walls could not be used for tracking, which narrowed down the possible choices. After reading various publications, previous teams’ work, and OpenCV tutorials, we decided to develop a computer vision system that used a fiducial target attached to the collection bin to determine the position of the robot.
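To give a sense of the geometry behind fiducial-based localization: with a pinhole camera model, once the target's apparent width in pixels is known, its range follows from similar triangles, and its horizontal bearing follows from its offset from the image center. The sketch below shows that math; the focal length, field of view, and target size are illustrative values, not our actual calibration.

```python
def distance_to_target(focal_px, target_width_m, target_width_px):
    """Estimate range to a fiducial target of known physical size.

    Similar triangles: target_width_px / focal_px = target_width_m / distance.
    """
    return focal_px * target_width_m / target_width_px

def bearing_to_target(cx_px, image_width_px, hfov_deg):
    """Approximate horizontal bearing to the target center, in degrees.

    0 deg means the target is dead ahead; positive is to the right.
    """
    offset = cx_px - image_width_px / 2.0
    return offset / (image_width_px / 2.0) * (hfov_deg / 2.0)

# Illustrative values: 700 px focal length, 0.30 m wide target seen at 105 px.
d = distance_to_target(700.0, 0.30, 105.0)  # -> 2.0 (meters)
b = bearing_to_target(480.0, 640.0, 60.0)   # -> 15.0 (degrees right of center)
```

With estimates like these from each frame, the rover can steer to keep the bin's target centered while closing the measured distance.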
As an underclassman, I implemented and tested the computer vision code with a USB webcam on my Linux computer to help the robot locate its target. I used OpenCV 3.0 to capture image frames and applied cross-correlation template matching to find the circle patterns on the target. The senior developers then refactored my work and integrated it with the autonomy system. Later on, the team added an ArUco square pattern to improve close-range localization (within 2 m of the bin) and partial target detection.
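For illustration, here is what that cross-correlation matching boils down to, re-implemented in plain NumPy rather than the OpenCV routine we actually used: a zero-mean normalized cross-correlation score, slid across the image to find the best match. This is a naive sketch for clarity, not the optimized implementation.

```python
import numpy as np

def ncc_score(patch, template):
    """Zero-mean normalized cross-correlation between two equal-size arrays.

    Returns a value in [-1, 1]; 1.0 is a perfect match up to
    brightness/contrast changes.
    """
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    if denom == 0:
        return 0.0
    return float((p * t).sum() / denom)

def match_template(image, template):
    """Slide the template over the image; return the best top-left corner."""
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_pos = -2.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            s = ncc_score(image[y:y+th, x:x+tw], template)
            if s > best_score:
                best_score, best_pos = s, (y, x)
    return best_pos, best_score
```

In practice OpenCV's `cv2.matchTemplate` computes the same kind of score map far faster, but the loop above is the idea: the peak of the score map is where the circle pattern most likely sits.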
We also built in a safety check: if the rover, running in autonomous mode, encountered unexpected issues during competition, it would switch to manual control. After completing the localization work, the team moved on to obstacle avoidance. I coded a ROS publisher in Python to obtain image data from the Microsoft Kinect, and the senior members worked on the ROS subscribers that applied computer vision and machine learning methods to the Kinect data from my publisher node. Working with the seniors on this task, I learned more about point cloud processing as well as obstacle isolation using segmentation and cluster extraction algorithms.
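As a simplified illustration of the cluster extraction idea (the actual pipeline used PCL-style algorithms on Kinect point clouds): group points so that any two points connected by a chain of neighbors closer than a distance tolerance land in the same cluster, and each resulting cluster becomes an obstacle candidate. The naive O(n²) neighbor search below is just for clarity; real implementations use a k-d tree. The sample points and tolerance are made up.

```python
import math

def euclidean_clusters(points, tolerance):
    """Group 3-D points into clusters: two points share a cluster if they are
    connected by a chain of neighbors closer than `tolerance`.

    Naive O(n^2) flood fill; PCL's EuclideanClusterExtraction does the same
    with a k-d tree for the neighbor queries.
    """
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], [seed]
        while queue:
            i = queue.pop()
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) < tolerance]
            for j in near:
                unvisited.remove(j)
            queue.extend(near)
            cluster.extend(near)
        clusters.append(sorted(cluster))
    return clusters

# Two well-separated blobs -> two clusters (obstacle candidates).
pts = [(0, 0, 0), (0.1, 0, 0), (0.2, 0.1, 0),   # blob A
       (5, 5, 0), (5.1, 5, 0)]                  # blob B
print(sorted(euclidean_clusters(pts, tolerance=0.5)))  # [[0, 1, 2], [3, 4]]
```

In the real pipeline the ground plane is segmented out first (e.g. with RANSAC plane fitting), so that the remaining clusters correspond to rocks and craters rather than the arena floor.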
In addition to assisting the Autonomy sub-team, I helped document testing data and capture footage of the whole senior design team. When the team prepared for the presentation to the NASA engineers, I created a short video that demoed several functionalities of our robot.
Since this was a big project with a multidisciplinary team of 15 engineering students coming from backgrounds in Computer Science, Electrical Engineering, and Mechanical Engineering, I’d like to give more details about the project, the team, and other subsystems of the robot built by other senior members.
The Senior Team Blasterbotica:
Team Blasterbotica was established at the Colorado School of Mines (CSM) in 2008 to participate in the NASA Robotic Mining Competition (NASA RMC), formerly the NASA Lunabotics Competition. Over the years, Team Blasterbotica has worked diligently to design, test, and construct rovers capable of excavating and delivering a minimum of 10 kg of simulated regolith within a 10-minute period. This year, we developed a new rover with state-of-the-art components and improvements on previous designs. In addition, our team developed a completely autonomous control system – a first for a CSM team in this competition. Work was distributed amongst team members by dividing the project into three main subsystems: drivetrain, excavation, and autonomy/controls. These interfacing subsystems combine to form CSM’s 2016 rover.
The 2016 Blasterbotica team.
From left to right: Dr. Angel Abbud-Madrid (our faculty advisor), Matthew Bailey, Camden Nohorniak, Jonathan Melton, Holden Steppan, Alexis Humann, Ross Bunker, Nhan Tran (myself :D ), Cory Varney, Nathan Young, Christine Pumford, Jeffery Nichols, David Schack, Katy Schneider, Joshua Nelson
The NASA RMC set forth the requirements for competing rovers. Rovers must be capable of traversing an arena filled with a sandy Martian regolith simulant, called Black Point-1 (BP-1), and an icy Martian regolith simulant while avoiding obstacles and craters. The rover must be able to excavate the regolith in a designated mining area and then transport and deliver it into a collection bin, while satisfying a number of additional requirements. Points are awarded for successful delivery of regolith and ice simulant into the collection bin, low rover mass, high dust tolerance, low bandwidth and energy usage, and autonomous operation.
System Design Requirements and Constraints
The rover must adhere to several requirements and constraints as detailed by the NASA RMC.
These are summarized as follows:
- A minimum of 10 kg of BP-1 and/or icy regolith must be excavated and delivered to a collection bin to qualify for competition awards.
- The rover mass may not exceed 80 kg.
- The rover’s dimensions may not exceed 1.5 m × 0.75 m × 0.75 m. Once the competition round has started, the rover may expand in size.
- BP-1 and icy regolith simulant may only be excavated in the designated mining area.
- The rover’s average bandwidth may not exceed 5,000 kb/s.
- The rover may not utilize substances that would not function in a Martian environment.
- The rover must only utilize onboard power.
- The rover must be equipped with a red emergency stop button.
Team Blasterbotica’s goals included creating a rover that:
- Meets all requirements and constraints set forth by the NASA RMC.
- Optimizes collection of BP-1.
- Minimizes mass and size.
- Operates completely autonomously.
- Functions reliably in a variety of soil environments.
Drive Train:
The drivetrain subsystem featured “salad-bowl wheels” made of two stainless steel bowls and polycarbonate fins, which were lightweight, inexpensive, and provided great traction. The drivetrain used straight wheels in a wide orientation, was capable of zero-point turns, and reduced power draw.
The excavation subsystem featured a storage bin to hold collected regolith, a conveyor belt for effective regolith delivery, and a bucket-ladder system mounted 0.75 m above the ground that could move up and down to excavate. Raising the bucket ladder took about 21 seconds.
The Control/Autonomy subsystem allowed for manual, partially autonomous, and fully autonomous operation. The onboard laptop accepted ROS commands and processed the Kinect and camera data. Below it in the hierarchy, Teensy and Arduino microcontrollers controlled separate serial lines, motors, and other hardware subsystems. The Autonomy sub-team also ran simulations with a TurtleBot in a ROS virtual environment. My contributions to the localization and object detection components were described at the beginning of this post.
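The post doesn't describe the serial protocol itself, so purely as a hypothetical sketch of how such a hierarchy talks: the host packs a command into a small checksummed frame that a Teensy or Arduino can validate before acting on it. The frame layout, start byte, and field sizes below are invented for illustration, not the team's actual protocol.

```python
import struct

START = 0xA5  # hypothetical start-of-frame marker

def frame_command(device_id, speed):
    """Pack a signed 16-bit motor speed into a checksummed frame.

    Layout (invented for illustration): [START][id][speed_hi][speed_lo][xor].
    """
    payload = struct.pack(">Bh", device_id, speed)
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([START]) + payload + bytes([checksum])

def parse_command(frame):
    """Validate and unpack a frame built by frame_command."""
    if len(frame) != 5 or frame[0] != START:
        raise ValueError("bad frame")
    checksum = 0
    for b in frame[1:4]:
        checksum ^= b
    if checksum != frame[4]:
        raise ValueError("checksum mismatch")
    device_id, speed = struct.unpack(">Bh", frame[1:4])
    return device_id, speed
```

A framing scheme like this lets the microcontroller discard bytes corrupted on a noisy serial line instead of driving a motor with a garbled command.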
At the competition, the team placed 14th out of 45 participating teams.