
Roboticists Travel Off-Road to Gather Data for Self-Driving ATV Training

Researchers from Carnegie Mellon University took an all-terrain vehicle on wild rides through tall grass, loose gravel, and mud to learn how the ATV interacted with a challenging off-road environment. They drove the heavily instrumented ATV aggressively at speeds of up to 30 miles per hour, sliding it through turns, taking it up and down hills, and even getting it stuck in the mud, all while collecting data such as video, the speed of each wheel, and the amount of suspension shock travel from seven types of sensors.

“Unlike autonomous street driving, off-road driving is more difficult because you have to understand the dynamics of the terrain in order to drive safely and faster. We were forcing the human to go through the same control interface as the robot would. In that way, the actions the human takes can be used directly as input for how the robot should act.”

Wenshan Wang

TartanDrive, the resulting dataset, contains about 200,000 of these real-world interactions. According to the researchers, it is the largest real-world, multimodal, off-road driving dataset in terms of both the number of interactions and the types of sensors. The five hours of data collected could be used to train a self-driving vehicle to navigate off-road.
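As a rough illustration of what a single timestep in such a multimodal dataset might look like, the sketch below defines a hypothetical record containing the sensor channels the article mentions (video, per-wheel speed, suspension shock travel) along with the driver's control commands. The field names and shapes are illustrative assumptions, not the actual TartanDrive schema.

from dataclasses import dataclass
import numpy as np

@dataclass
class Interaction:
    """One hypothetical timestep of a multimodal off-road driving log."""
    image: np.ndarray          # forward-facing camera frame, H x W x 3
    wheel_speeds: np.ndarray   # speed of each of the four wheels, m/s
    shock_travel: np.ndarray   # suspension shock travel at each wheel, m
    steering_cmd: float        # driver's steering command, normalized to [-1, 1]
    throttle_cmd: float        # driver's throttle command, normalized to [0, 1]

# A dataset of roughly 200,000 such records, ordered in time, can be paired
# into (current state, action, next state) examples for learning dynamics.
example = Interaction(
    image=np.zeros((480, 640, 3), dtype=np.uint8),
    wheel_speeds=np.array([3.1, 3.0, 3.2, 3.1]),
    shock_travel=np.array([0.02, 0.03, 0.01, 0.02]),
    steering_cmd=-0.25,
    throttle_cmd=0.6,
)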

“Unlike autonomous street driving, off-road driving is more difficult because you have to understand the dynamics of the terrain in order to drive safely and faster,” explained Wenshan Wang, a project scientist at the Robotics Institute (RI).

Previous work on off-road driving has frequently relied on annotated maps, which provide labels such as mud, grass, vegetation, or water to help the robot understand the terrain. However, such information is not always available and, even when it is, it may not be useful. A map area labeled “mud,” for example, may or may not be drivable. A robot that understands terrain dynamics, by contrast, can reason about the physical world directly.


The research team discovered that the multimodal sensor data collected for TartanDrive enabled them to develop prediction models that outperformed those developed with simpler, nondynamic data. Driving aggressively also pushed the ATV into a performance realm where an understanding of dynamics became critical, according to Samuel Triest, a second-year master’s student in robotics.

“The dynamics of these systems tend to get more difficult as you add more speed,” said Triest, who was the team’s lead author on the resulting paper. “You drive faster, and you hit more things. We were particularly interested in gathering data on more aggressive driving, more difficult slopes, and thicker vegetation because this is where some of the simpler rules begin to fail.”
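To make the idea of a learned dynamics model concrete, the following sketch shows the general recipe such prediction models tend to follow: a network takes the vehicle's current state and the driver's action and predicts the next state, and it is trained on logged (state, action, next state) triples. This is a generic PyTorch illustration, not the architecture from the paper; the state and action dimensions and the training batches here are placeholders.

import torch
import torch.nn as nn

class DynamicsModel(nn.Module):
    """Generic learned dynamics model: (state, action) -> predicted next state."""
    def __init__(self, state_dim: int, action_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, state_dim),  # predicts the change in state
        )

    def forward(self, state, action):
        # Predicting a state delta and adding it back keeps outputs well scaled.
        return state + self.net(torch.cat([state, action], dim=-1))

model = DynamicsModel(state_dim=8, action_dim=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder batches; real training data would come from the logged runs.
states = torch.randn(256, 8)
actions = torch.randn(256, 2)
next_states = torch.randn(256, 8)

for _ in range(100):
    loss = loss_fn(model(states, actions), next_states)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()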

Though most research on self-driving vehicles has focused on street driving, the first applications are likely to be off-road in controlled-access areas where the risk of collisions with people or other vehicles is low. The tests were carried out at a location near Pittsburgh where CMU’s National Robotics Engineering Center tests autonomous off-road vehicles. Humans drove the ATV, though they used a drive-by-wire system to control steering and speed.

“We were forcing the human to go through the same control interface as the robot would,” Wang said. “In that way, the actions the human takes can be used directly as input for how the robot should act.”
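One way to read Wang’s point is that the human driver and the eventual autonomy stack share a single command interface, so the logged human commands already live in the robot’s action space. The sketch below illustrates that design with a hypothetical Command type that either a human’s joystick input or a learned policy can produce; the names are illustrative, not NREC’s actual drive-by-wire API.

from dataclasses import dataclass

@dataclass
class Command:
    """A drive-by-wire command, whether issued by a human or by a policy."""
    steering: float   # normalized steering, [-1, 1]
    throttle: float   # normalized throttle, [0, 1]

def send_to_vehicle(cmd: Command) -> None:
    # Hypothetical stand-in for the real drive-by-wire interface.
    print(f"steer={cmd.steering:+.2f} throttle={cmd.throttle:.2f}")

def human_driver() -> Command:
    # During data collection, the human's joystick inputs are converted into
    # the same Command type the robot would use ...
    return Command(steering=-0.3, throttle=0.5)

def learned_policy(state) -> Command:
    # ... so a trained policy can later be dropped in with no interface change,
    # and the logged human commands double as supervised action labels.
    return Command(steering=0.0, throttle=0.2)

send_to_vehicle(human_driver())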


The research team behind the paper also included Sebastian Scherer, an associate research professor at the RI, and Aaron Johnson, an assistant professor of mechanical engineering, in addition to Wang and Triest.
