METERON – human-robotic planetary exploration

METERON: SUPVIS-M and SUPVIS-E

Controlling rovers on planetary surfaces from an orbiting spacecraft

Future Mars missions are likely to involve astronauts orbiting the planet and controlling or supervising rovers on its surface. This will cut out the time delay experienced when controlling rovers from Earth, and allow more direct intervention from humans when needed, for example to navigate around hazards or identify targets.

To simulate this, and test how such missions might be done in future, Tim Peake will take part in an experiment where he has supervisory control, from the International Space Station, of a rover on the ground, in Airbus Defence and Space’s Mars Yard in Stevenage.

METERON

METERON is a European project to prepare for human-robotic missions to the Moon, Mars and other celestial bodies. The project is organised around a series of experiments, testing technologies and adapting traditional ways of working.

METERON will implement an infrastructure to test communications, operations and robotic control strategies. Operational considerations such as which tasks are robotic and which human, and what data is needed to support the monitoring and control of assets, will feed directly into plans for how to explore, and the design of communication systems.

SUPVIS-M

SUPVIS-M is one in a series of METERON experiments. Building on previous tests, the European Space Agency, the UK Space Agency and Airbus Defence and Space UK are working together on an experiment to investigate distributed supervisory control of robotic assets in a simulated planetary environment. Airbus’s Mars Yard, in Stevenage, Hertfordshire, was built to develop the locomotion and navigation systems for ESA’s ExoMars rover and provides a realistic Mars-like environment in which to test systems. The experiment will provide valuable data to assess the benefits of human involvement in a rover’s path planning.

A representative mission scenario will be set up in which a rover is commanded to go from a lit environment into a challenging dark location (to emulate a cave or a shaded crater) and identify a number of science targets. The Mars Yard will be split into two areas, one lit and one in the dark.

From one end of the yard, Airbus’s rover, named ‘Bridget’, will be commanded from the European Space Operations Centre (ESOC) in Darmstadt, Germany, to the edge of the shaded area. Then, at the edge of the ‘cave’, control of the platform will be passed to Tim, on board the ISS, who will drive Bridget across the yard, avoiding obstacles and identifying potential science targets, which will be marked with a distinctive UV fluorescent marker. Once the targets have been identified and mapped, Tim will drive the rover out of the shaded area and hand control back to ESOC to return the rover to its starting point.
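In software terms, this scenario is essentially a handover of command authority between control nodes. The minimal sketch below illustrates that idea in Python; the class and method names are hypothetical and are not the actual METERON ground or flight software.

```python
# Illustrative sketch only: command authority over the rover passes from ESOC
# to the ISS crew at the cave edge and back again once the targets are mapped.
from enum import Enum, auto

class Controller(Enum):
    ESOC = auto()   # ground control in Darmstadt
    ISS = auto()    # crew member on the Space Station

class RoverSession:
    def __init__(self):
        self.controller = Controller.ESOC

    def hand_over(self, new_controller: Controller, reason: str) -> None:
        print(f"handover: {self.controller.name} -> {new_controller.name} ({reason})")
        self.controller = new_controller

    def drive(self, waypoint: str) -> None:
        # Only the node currently holding authority issues drive commands.
        print(f"{self.controller.name} drives rover to {waypoint}")

session = RoverSession()
session.drive("edge of shaded area")
session.hand_over(Controller.ISS, "entering the 'cave', live human teleoperation")
session.drive("suspected UV-marked science target")
session.hand_over(Controller.ESOC, "targets mapped, leaving shaded area")
session.drive("starting point")
```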

To make the scenario and operation more realistic, the time in the shaded area will be limited, to emulate the rover’s energy depleting. In addition, Tim will not be provided with an explicit path or routing instructions: he will decide on and execute the rover’s movements live, based on the visual cues fed back to him and his own perception of the terrain and the environment.

This project builds upon previous tests related to the set-up of the teleoperation architecture and will demonstrate control of the same rover from three separate and distant locations (Stevenage, Darmstadt and the ISS). It will also add another level of complexity by demonstrating how a rover can traverse difficult terrain under human control with limited visual feedback, and will provide valuable data to shape a wide range of future exploration missions.

SUPVIS-E

This experiment will operate on similar principles to SUPVIS-M, but will see Tim operating ESA’s Eurobot rover in ESA’s technology development centre (ESTEC, in the Netherlands) while colleagues in ESOC, Darmstadt, control a smaller ‘scouting’ Surveyor Rover. The scouting rover moves in close proximity to the Eurobot rover, in order to give another perspective – meaning that Tim can see the landscape from two different angles, and therefore plan the rover’s path more effectively.

The additional rover adds complexity to the set-up – controlling two rovers, from two different locations, in a coordinated way is quite a challenge. The system was first successfully tested by Danish astronaut Andreas Mogensen in September 2015, and Tim will be building on his excellent work:

https://www.youtube.com/watch?v=sFnhZj9Wlxs&feature=youtu.be

Background: exploring other planets with robotic rovers

Robotic rovers provide a great way to explore the surface of other planets. But they have to be able to traverse vast and varied expanses of terrain and often venture into unknown and challenging regions, the accessibility of which can only be estimated based on sparse images from orbiting satellites. There is no GPS on Mars!

Sometimes the estimates are wrong and a rover gets stuck – for example, in a stretch of soft sand or against another obstacle that could not be foreseen. Soft sand left the otherwise very successful NASA Mars rover Spirit well and truly stuck. To avoid such accidents, today’s planetary exploration rovers move in a very carefully calculated way, covering no more than a few metres per day and relying on thorough path planning performed by rover controllers back on Earth.
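To give a flavour of what that kind of path planning involves, here is a minimal grid-based A* search over a boolean obstacle map. It is purely illustrative, with a toy hand-made grid; it is not the planning software used for Spirit, ExoMars or Bridget.

```python
# Illustrative only: a tiny A* planner over an obstacle grid (True = blocked),
# of the kind ground controllers' tools might run over a mapped terrain patch.
from heapq import heappush, heappop

def plan(grid, start, goal):
    """Find a 4-connected path from start to goal, avoiding obstacles."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, pos, path = heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                    (nr, nc), path + [(nr, nc)]))
    return None  # no safe route found

# A wall of 'soft sand' blocks the direct route, so the planner detours around it.
sand = [[False, True,  False, False],
        [False, True,  False, False],
        [False, True,  False, False],
        [False, False, False, False]]
print(plan(sand, (0, 0), (3, 3)))
```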

Good path planning relies primarily on the quality of the local terrain mapping. Rovers map out their surrounding area using stereo cameras, but this can be problematic in harsh lighting conditions:

[Image: SUPVIS-E at ESTEC – example of harsh lighting (left) vs. good lighting (right)]

[Image: SUPVIS-E terrain map – what the rover sees, in good and bad lighting conditions (white = perceived obstacle)]
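As a rough illustration of why lighting matters, the sketch below builds a toy disparity-based obstacle map with OpenCV’s block-matching stereo. The synthetic images and the thresholds are assumptions for demonstration only – the real SUPVIS-M perception pipeline is not described here – but the effect is the same: an over-exposed, texture-free region yields no valid depth at all.

```python
# Illustrative sketch only: a toy stereo-disparity obstacle map, not the
# actual rover perception pipeline used in METERON / SUPVIS-M.
import numpy as np
import cv2

# Synthetic textured scene: the right image is the left image shifted by a few
# pixels, emulating the disparity a nearby surface would produce.
rng = np.random.default_rng(0)
left = (rng.random((240, 320)) * 255).astype(np.uint8)
left = cv2.GaussianBlur(left, (5, 5), 0)      # give the noise some structure
right = np.roll(left, -8, axis=1)             # ~8 px of disparity

# Emulate harsh lighting: wash out one region so it loses all texture.
left[:, 200:] = 250
right[:, 200:] = 250

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disp = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> px

valid = disp > 0                    # block matching marks failures as negative
obstacle = valid & (disp > 6.0)     # large disparity = close surface (arbitrary cut-off)

print(f"pixels with valid depth: {valid.mean():.0%}")
print(f"pixels flagged obstacle: {obstacle.mean():.0%}")
# In the washed-out region the matcher finds no texture, so no valid disparity
# is recovered there: exactly the 'harsh lighting' failure described above.
```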

For Mars and Moon exploration, such harsh lighting occurs naturally – especially at sites of scientific interest (i.e. where you really want to take the rover) such as craters or cave-like features. With limited energy supplies, efficient use of time and speed is vital when exploring areas with strong shadows.

One way to minimise these difficulties is with human intervention: from very little data, a human can make sense of the environment and unambiguously distinguish between shadows and obstacles, whereas a rover cannot.

It is thought that for future missions to Mars and other planets, human crew will be in close proximity, for example orbiting the planet in a spacecraft while controlling a rover on the surface. This will cut out the communications delay when sending signals to and from Earth (a signal takes anywhere from around three to 22 minutes to reach Mars from Earth, depending on the planets’ relative positions – and the same again to know it has been properly received), enabling much more efficient exploration.
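Those delay figures follow from a quick back-of-the-envelope calculation, assuming the commonly quoted closest and farthest Earth–Mars distances (approximate values):

```python
# One-way light-time to Mars at the extremes of the Earth-Mars distance.
C_KM_PER_S = 299_792                 # speed of light, km/s (rounded)
DISTANCES_KM = {"closest approach": 54.6e6, "farthest": 401e6}

for label, d in DISTANCES_KM.items():
    minutes = d / C_KM_PER_S / 60
    print(f"{label}: ~{minutes:.0f} min one way, ~{2 * minutes:.0f} min round trip")
# closest approach: ~3 min one way, ~6 min round trip
# farthest: ~22 min one way, ~45 min round trip
```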