ARCHES project
Autonomous robots rehearse for Mars and the deep sea
Autonomous research robots play a crucial role in deep-sea research as well as in the exploration of distant planets. Helmholtz researchers from both fields have joined forces and together are developing new techniques.
Etna in Sicily: With its bizarre landscapes and unique nature, Europe's highest active volcano captivates travelers from all over the world. But what visitors saw there in June 2022 was a spectacle of a very different kind. On a rugged plain a good 2,500 meters above sea level, robots carved their tracks through the volcanic rock as if guided by an invisible hand, collected samples, erected antennas and rappelled each other down into one of the smaller craters. Robots exploring cratered landscapes? Aren't these more like scenes from a lunar mission? Had someone mixed up the celestial bodies here?
"No," says Armin Wedler with a grin. "This was a demo mission for the space part of the Helmholtz ARCHES future project." The mechanical engineer works at the Institute of Robotics and Mechatronics at the German Aerospace Center (DLR) and leads the project, which also involves the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI), GEOMAR Helmholtz Centre for Ocean Research Kiel and the Karlsruhe Institute of Technology (KIT). "ARCHES stands for Autonomous Robotic Networks to Help Modern Societies. So it's about networked robotic systems that operate widely and continuously in extreme environments," he explains. "DLR represents one of these extreme environments with space. GEOMAR and AWI represent another one with the deep sea."
Freezing cold or hellish heat; extreme pressure or vacuum; toxic atmosphere; deadly radiation; eternal darkness: The deep sea and outer space are not exactly inviting. And since humans aren't necessarily made for such extremes, the idea of the project is to send robot teams instead. The teams consist of individual specialists that are networked together and support each other. The demo mission was intended to show what this might look like on a space mission, so 55 scientists and engineers, among them 12 from the European Space Agency ESA, met in Sicily on Mount Etna.
Explore, control, build a base
"Etna is an active volcano. That means the rock in this environment has not yet been heavily altered by environmental influences. It resembles the rock on the moon and Mars," says Armin Wedler, explaining the choice of mission location. "Technologically, this creates some challenges, for example, for the locomotion of the robots. And scientifically, it offers the opportunity to practice here on Earth with analysis procedures for later missions to other celestial bodies.
The demo mission consisted of three parts. The first was a precursor mission, the term used in spaceflight for an initial survey of a celestial body; in this instance, Etna stood in for the moon in an automated exploration. "The scenario did not include a lunar orbit station, radio relays or operation from a spacecraft in lunar orbit," says the project manager. "So the two wheel-driven robots, LRU 1 and LRU 2, and the Ardea drone were on their own, exploring the area completely autonomously." In the second part, a remote control mission, the researchers took a closer look at one area, just as they would on an actual space mission. "To do this, we converted one of the conference rooms in our hotel on Mount Etna into a control center," Armin Wedler says. The data from the volcano converged there, and from there the robot "Interact-Rover" was controlled remotely. A team from ESA supervised this step from a control room at the European Space Operations Center ESOC in Darmstadt. One of those who sat at the control desks during this part of the mission, and who also took the controls himself, was astronaut Thomas Reiter.
"In the third part, we simulated a permanent base on the moon," says Armin Wedler. "The robots were tasked with setting up a radio telescope on the dark side of the moon." It's a task that makes radio astronomers in particular sit up and take note. After all, on Earth, a perpetual tangle of radiation from a wide variety of modern civilization's devices makes life difficult for them. The far side of the moon, on the other hand, offers an undisturbed "view" of the universe, so the exercise could well turn into a real mission at some point in the future.
Even robot teams need a plan
But how do you get a wide variety of robots to work together and, if necessary, to accommodate the interventions of a human operator without being thrown off course? Sören Hohmann at the KIT Institute for Control Systems has been researching this successfully for many years. Esther Bischoff from his team explains: "If you have different robots that are to perform different tasks within a mission, that has to be planned." That sounds simple at first, but in most cases certain constraints have to be met. For example, there are tasks that may only be started once another one has been completed. "And if the preferences of the human project participants are also to be taken into account, this planning is better handled by a computer algorithm." The young scientist designed one such algorithm for the ARCHES project and demonstrated its capabilities on Etna.
"We used a huge touch display that showed a terrain map along with the current position of the different robots," she says. "For this, we designed a planning and coordination algorithm and developed the interactive user interface used to interact with humans." In the simulation, it was then possible to define all the tasks that the robots would later perform. The operator could also specify whether a very specific robot was intended for a task or whether the system should decide freely. And during the mission, the operator was free to intervene at any time. For example, new tasks could be added or certain robots could be assigned a different task. "We brought everything with us to Etna, set up the simulation there and presented it to colleagues from the other centers," explains the electrical and information engineer. "The different tasks could be defined in the user interface and were then processed in the simulation."
Extreme worlds challenge science
Among the colleagues listening intently to the presentation was Sascha Flögel. The geologist and paleontologist conducts research at GEOMAR. "I was there as a spectator this time," he recounts. "Together with AWI, we from GEOMAR already had our baptism of fire two years ago," he adds, referring to the deep-sea demo mission that took place in October 2020 in the Eckernförde Bay. Due to the pandemic, only a small team of marine researchers was involved in setting up an autonomous underwater measuring network and successfully demonstrating the cooperation of seven mobile and stationary measuring platforms.
"It's quite different when the mission control center is a fixed building with a connection to the supply and not a swaying ship all alone in the ocean," says Sascha Flögel, looking at the control room in the hotel on Mount Etna. "And then also to see the systems and their interaction up close was a very exciting experience."
After all, even though both are extreme environments, space and the deep sea sometimes present researchers with very different challenges, communication being one example. "Even though there are signal delays in transmission, in spaceflight mission control is usually in constant contact with the robotic system," explains Sascha Flögel. "In ocean research, it's different. Once the systems are set down in the ocean and not connected to the ship via a cable or acoustically, then you don't have access to them for the duration of the mission." That means researchers usually don't find out until 12 or 14 months later whether everything went well, whether all the measurement systems have worked and whether the data are all there.
"Underwater, communication works only acoustically," the Flögel continues. "That limits the data rate to a few kilobytes per second." Sending images or videos to the surface takes a long time; live images are not even a thought. Even the scientific data, which itself occupies very little storage space, is very often stored in the system and only recovered with it from the bottom of the sea after the mission is over. "The kind of cooperation between robotic systems that we have developed at ARCHES has never been seen before in marine research," he explains. "It opens up a great many new and, above all, exciting possibilities for us." And not just on Earth. Because oceans also exist on other celestial bodies, for example, the icy moons in the outer solar system. In recent years, these have increasingly become the focus of scientific attention. "With a robotic mission to the icy moons, space and deep-sea research would then really grow together," Sascha Flögel is pleased to say.
Developed, networked and practically deployed
Like the "Deep Sea" demo mission in the Bay of Kiel, the "Space" demo mission on Etna has demonstrated the function of the robotic systems and their ability to work cooperatively. Wouldn't that be the best starting position for a follow-up project? "With ARCHES and Co, we don't talk about follow-up projects," says Armin Wedler. "Rather, they are impulses or beacons that deal intensively with a topic and then provide impetus for other research teams to pick up where we leave off." But that certainly doesn’t mean the end of successful collaboration. "In the new iFOODis project, we are researching the Baltic Sea and the Schlei region," he reveals. "This time, the aim is to find out what effects intensive land use has on nature and the environment." In addition to DLR, AWI and GEOMAR are also involved. "In the Eckernförde Bay, various robotic systems will again be used on land and water. The set-up is complemented by satellite data and measurement systems on land."
The applications have now been approved and the project will start in January 2023. "For me, it is both a continuation and a logical consequence," says Sascha Flögel. "In ROBEX, we developed the systems [ed. see box]. In ARCHES, we connected the systems. And in iFOODis, we'll put the systems to practical use together, and for the very first time there will also be a joint demo mission for this."
The evolution of autonomous robots
Back in 2012, researchers from DLR, AWI and GEOMAR joined forces to form the Helmholtz Alliance ROBEX. The goal of "Robotic Exploration under Extreme Conditions" was to develop autonomous robots with different capabilities that could work together under the extreme conditions of space and the deep sea, but also in rescue missions in disaster areas or in the dismantling of nuclear reactors. At that time, robotic systems such as the seafloor crawler VIATOR, including its docking station MANSIO, and DLR's Krabbler were developed. To explore how such robotic systems can be networked, how they work autonomously as a team, and how collaboration with a human operator works, a new alliance was forged in 2018: ARCHES. In addition to DLR, AWI and GEOMAR, KIT also joined this time as an expert in networked systems. After successful demonstrations in Eckernförde Bay and on Etna, it is now time for the next step: With iFOODis, the project partners are focusing on a joint application that will see space and deep-sea research in the Helmholtz Association grow closer together again.