Posts by Collection

portfolio

USU Mars Rover – Aug 2020 - May 2021

As a member of the USU Mars Rover team, I helped develop a robot that could navigate autonomously in a simulated Mars environment. I worked on the computer vision system for detecting and avoiding obstacles, as well as the communication system that linked the robot to its base station.

LLAMA – Jun 2021 - Aug 2021

As part of my work with the Army Research Laboratory (ARL), I worked on the Legged Locomotion and Movement Adaptation (LLAMA) robot, using joint data to predict the type of terrain the robot was walking on.

Spot – Jun 2021 - Present

I am working with the Boston Dynamics Spot robot, training it to detect complex terrain and estimate the cost of traversing different surfaces.

DOFbot – Jan 2022 - Mar 2022

We created a teleoperated control system in Unity in just 24 hours. The system used an Oculus headset to let an operator control the robot remotely.

Go1 – Jun 2022 - Present

I am developing new reinforcement learning (RL) methods for legged robots. The Go1 is my main platform, and I am researching improvements to jumping and to the robot's ability to walk in deep snow.

publications

Improving Methods for Multi-Terrain Classification Beyond Visual Perception

Published in 2021 Fifth IEEE International Conference on Robotic Computing (IRC), 2022

Terrain classification in mixed-surface unstructured environments is key for safe navigation, energy efficiency, and anticipating motion volatility. This is particularly true for dynamically moving legged platforms, which are highly impacted by foot-ground interactions. This research demonstrates terrain classification using a long short-term memory (LSTM) model trained on actuator time series data, particularly the difference in center-of-pressure (COP) and leg forces. The LSTM COPForce model achieved 97.5% classification accuracy on three outdoor surfaces with small amounts of data and no additional sensors.

Recommended citation: C. Allred, M. Russell, M. Harper and J. Pusey, "Improving Methods for Multi-Terrain Classification Beyond Visual Perception," 2021 Fifth IEEE International Conference on Robotic Computing (IRC), Taichung, Taiwan, 2021, pp. 96-99, doi: 10.1109/IRC52146.2021.00022. https://ieeexplore.ieee.org/abstract/document/9699886/
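A minimal sketch of the kind of model the paper describes: an LSTM classifier over short windows of actuator time series. The feature count, window length, and number of surface classes below are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class TerrainLSTM(nn.Module):
    """LSTM classifier over windows of actuator features (e.g. COP deltas, leg forces)."""
    def __init__(self, n_features=16, hidden=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, time, n_features)
        _, (h, _) = self.lstm(x)     # h: (1, batch, hidden), final hidden state
        return self.head(h[-1])      # logits: (batch, n_classes)

# Illustrative shapes only: 32 windows, 200 timesteps, 16 actuator features,
# and three outdoor surface classes.
model = TerrainLSTM()
logits = model(torch.randn(32, 200, 16))
pred_surface = logits.argmax(dim=1)
```

Classifying from the final hidden state keeps the model small, in keeping with the paper's point that little data and no additional sensors are needed.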

Divide and Survey: Observability Through Multi-Drone City Roadway Coverage

Published in 2022 IEEE International Smart Cities Conference (ISC2), 2022

Deploying autonomous drone systems in smart cities to identify unexpected events and adapt rapidly to crises has great potential for optimizing city operations and increasing city-wide situational awareness. This work presents an algorithmic technique, Postman Moving Voronoi Coverage (PMVC), which effectively distributes and plans coverage routes for each drone agent. PMVC divides city roadways into similarly sized subregions based on system limitations for many types of unmanned aerial vehicles (UAVs). The findings describe trade-offs a city must make between drone types, number of systems, and the desired speed of city-wide road network traversal. Often, employing more low-capacity drones is more cost- and time-effective for city coverage.

Recommended citation: H. Kocabas, C. Allred and M. Harper, "Divide and Survey: Observability Through Multi-Drone City Roadway Coverage," 2022 IEEE International Smart Cities Conference (ISC2), Pafos, Cyprus, 2022, pp. 1-7, doi: 10.1109/ISC255366.2022.9922207. https://ieeexplore.ieee.org/abstract/document/9922207
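To illustrate the underlying idea of carving roadways into per-drone subregions, here is a minimal Lloyd-style Voronoi partition over sampled road coordinates. This is a generic sketch, not the PMVC algorithm itself; unlike PMVC, it does not enforce balanced subregion sizes, and the point counts and seed placement are arbitrary.

```python
import numpy as np

def voronoi_partition(road_points, drone_seeds, n_iters=20):
    """Assign road sample points to their nearest drone seed (a Voronoi
    partition), then move each seed to its region's centroid, and repeat."""
    seeds = drone_seeds.copy()
    for _ in range(n_iters):
        # Distance from every road point to every seed: (n_points, n_drones).
        d = np.linalg.norm(road_points[:, None, :] - seeds[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        for k in range(len(seeds)):
            members = road_points[assign == k]
            if len(members):
                seeds[k] = members.mean(axis=0)
    return assign, seeds

# Illustrative data: 500 sampled roadway coordinates and 4 drones.
rng = np.random.default_rng(0)
roads = rng.uniform(0, 10, size=(500, 2))
assign, depots = voronoi_partition(roads, rng.uniform(0, 10, size=(4, 2)))
```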

Terrain Dependent Power Estimation for Legged Robots in Unstructured Environments

Published in 2022 Sixth IEEE International Conference on Robotic Computing (IRC), 2023

Gait-based legged robots offer substantial advantages for traversing complicated, unstructured, or discontinuous terrain, which is increasing their use in many real-world applications. However, they are also challenging to deploy because their complex locomotion and power needs limit operation time, range, and payload capability. Anticipating the impact of terrain transitions on range and average power consumption is crucial for understanding operational limits in autonomous and teleoperated missions. This study examines strategies for forecasting terrain-dependent energy costs on five unique surfaces (asphalt, concrete, grass, brush, and snow). The field experiments demonstrate the effectiveness of our combined proprioception and vision approach, called MEP-VP. This hybrid framework requires only two seconds of motion data before returning actionable power estimates. Validation is conducted on physical hardware in field demonstrations.

Recommended citation: C. Allred, H. Kocabas, M. Harper and J. Pusey, "Terrain Dependent Power Estimation for Legged Robots in Unstructured Environments," 2022 Sixth IEEE International Conference on Robotic Computing (IRC), Italy, 2022, pp. 329-333, doi: 10.1109/IRC55401.2022.00064. https://ieeexplore.ieee.org/document/10023912
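A rough sketch of the hybrid idea: summarize a short window of proprioceptive signals into statistics, append a terrain class predicted by a vision model, and regress average power. The feature choices, regressor, and data shapes are illustrative assumptions, not MEP-VP's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def build_features(joint_torques, joint_velocities, terrain_class, n_terrains=5):
    """Summarize ~2 s of proprioception and append a one-hot terrain class."""
    proprio = np.concatenate([
        joint_torques.mean(axis=0), joint_torques.std(axis=0),
        joint_velocities.mean(axis=0), joint_velocities.std(axis=0),
    ])
    onehot = np.eye(n_terrains)[terrain_class]   # vision-predicted surface
    return np.concatenate([proprio, onehot])

# Hypothetical training data: one row per 2-second window, target = mean power (W).
X = np.stack([build_features(np.random.randn(400, 12),
                             np.random.randn(400, 12),
                             np.random.randint(5)) for _ in range(200)])
y = np.random.uniform(100, 400, size=200)

power_model = GradientBoostingRegressor().fit(X, y)
estimate_watts = power_model.predict(X[:1])
```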

Unknown Building Exploration Simulator (UBES)

Published in Elsevier Journal Software Impacts, 2023

The Unknown Building Exploration Simulator (UBES) software is built to investigate effective strategies for multi-robot exploration of unknown indoor environments. The software allows the assessment of different exploration techniques, offering the flexibility to modify initial and target conditions and to account for potential agent loss. A total of 12 distinct algorithms commonly used in modern multi-robot exploration are implemented for baseline analysis. The simulation software includes complex building environments that can be user-defined or randomly generated according to several adjustable parameters. Agents also include user-modifiable features such as agent count, sensor range, LiDAR characteristics, and inter-agent communication capability.

Recommended citation: Allred, Christopher, Huzeyfe Kocabas, and Mario Harper. "Unknown building exploration simulator (ubes)." Software Impacts 18 (2023): 100576. https://www.sciencedirect.com/science/article/pii/S2665963823001136
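The adjustable parameters described above could be grouped into a configuration object like the hypothetical one below; the names and defaults are illustrative only and do not reflect UBES's actual interface.

```python
from dataclasses import dataclass

@dataclass
class ExplorationConfig:
    """Hypothetical parameter set mirroring the knobs described for UBES."""
    n_agents: int = 4              # agent count
    sensor_range_m: float = 5.0    # generic sensor range
    lidar_rays: int = 180          # LiDAR angular resolution
    lidar_range_m: float = 10.0    # LiDAR maximum range
    inter_agent_comm: bool = True  # inter-agent communication capability
    random_building: bool = True   # randomly generated vs. user-defined map
    allow_agent_loss: bool = True  # account for potential agent loss

config = ExplorationConfig(n_agents=6, inter_agent_comm=False)
```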

Detecting Ballistic Motions in Quadruped Robots: A Boosted Tree Motif Classifier for Understanding Reinforcement Learning

Published in 2023 Seventh IEEE International Conference on Robotic Computing (IRC), 2023

Quadrupedal robots require sophisticated algorithms to learn dynamic and ballistic motions, such as jumping. Traditional methods, which often employ Reinforcement Learning (RL), face inherent limitations in identifying the formation of desired actions during training. This study introduces the Boosted Tree Motif Classifier (BTMC), a novel approach designed to accurately detect complex motion patterns, thus facilitating learning dynamic actions in quadrupedal robots. Unlike classical motif search techniques and simple neural networks, which achieved precision rates of merely 2% and 12%, respectively, in identifying a "jump" motif, BTMC demonstrated remarkable efficiency with a 96% precision rate. The overall accuracy of BTMC was comparable to that of the other learning approaches developed but superior in capturing the essential motifs crucial for dynamic motions. The results highlight BTMC's potential as an innovative solution in RL-based robotic systems, offering an advancement in robotic locomotion. Our findings open up new avenues for RL's theoretical and practical applications in robotics.

Recommended citation: C. Allred, J. Pusey and M. Harper, "Detecting Ballistic Motions in Quadruped Robots: A Boosted Tree Motif Classifier for Understanding Reinforcement Learning," 2023 Seventh IEEE International Conference on Robotic Computing (IRC), Laguna Hills, CA, USA, 2023, pp. 143-151, doi: 10.1109/IRC59093.2023.00032. https://ieeexplore.ieee.org/abstract/document/10473572
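A minimal sketch of a boosted-tree motif detector in the spirit of BTMC, assuming each training example is a flattened window of motion signals labeled jump / no-jump. The featurization, signals, and labels below are placeholders, not the paper's method.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def window_features(base_height, joint_positions):
    """Flatten one short motion window into a single feature vector."""
    return np.concatenate([base_height, joint_positions.ravel()])

# Stand-in data: 300 windows of 50 timesteps, 12 joints, random jump labels.
rng = np.random.default_rng(1)
X = np.stack([window_features(rng.standard_normal(50),
                              rng.standard_normal((50, 12))) for _ in range(300)])
y = rng.integers(0, 2, size=300)   # 1 = window contains a jump motif

motif_clf = GradientBoostingClassifier().fit(X, y)
jump_detected = motif_clf.predict(X[:1])
```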

talks

teaching