EELS robot may help JPL look for life on watery worlds

Engineers at NASA’s Jet Propulsion Laboratory are taking artificial intelligence to the next level – by sending it into space disguised as a robotic snake.

As the sun beats down on JPL’s Mars Yard, the robot lifts its “head” from a gleaming surface of man-made ice to scan the world around it. It maps its surroundings, analyzes potential obstacles, and chooses the safest route through a valley of fake boulders to its goal.

EELS raises its head unit to scan its surroundings. (Brian van der Brug / Los Angeles Times)

Once it has a plan, the 14-foot robot lowers its head, activates its 48 motors and slowly glides forward. Its deliberate movements are powered by the clockwise or counterclockwise rotation of the spiral treads that link its 10 body segments and send the robot in a specific direction. All the while, sensors across its body keep reassessing the environment, allowing the robot to adjust course if needed.
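
For the technically curious, here is a toy Python sketch of that kind of sense-plan-act cycle. Every function and number in it is invented for illustration; none of it is EELS's actual flight software.

import random

def sense(position, goal):
    """Pretend to scan the terrain: report distance to the goal and whether a sensor flags a hazard."""
    distance = goal - position
    hazard = random.random() < 0.2          # 20% chance the body sensors flag loose ground
    return distance, hazard

def plan(distance, hazard):
    """Choose a tread command: +1 (rotate the treads to advance) or 0 (pause and reassess)."""
    if hazard:
        return 0                            # something looks risky: hold position
    return 1 if distance > 0 else 0

def act(position, command, step=1):
    """Move the simulated robot one step whenever the treads are commanded to turn."""
    return position + step * command

position, goal = 0, 10
while position < goal:
    distance, hazard = sense(position, goal)
    command = plan(distance, hazard)
    position = act(position, command)
    print(f"position={position} hazard_flag={hazard}")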

JPL engineers have designed spacecraft that orbit distant planets and built rovers that trundle around Mars as routinely as a commuter heading to the office. But EELS – short for Exobiology Extant Life Surveyor – was designed to reach places that until now have been inaccessible to humans and robots alike.

Lava tubes on the moon? EELS could explore the underground tunnels that could provide shelter for future astronauts.

The polar ice caps on Mars? EELS could study them and deploy instruments to gather chemical and structural information about the frozen carbon dioxide.

The liquid ocean beneath the frozen surface of Enceladus? EELS could make its way down to it, looking for evidence that Saturn’s moon might be habitable.

“We’re talking about a snake robot that can cross the surface of ice, go through holes and swim underwater – a robot that can conquer all three worlds,” said Rohan Thakker, a robotics technologist at JPL. “No one has done that before.”

And if all goes according to plan, the gliding space explorer, developed with grants from Caltech, will do all of these things autonomously, without waiting for detailed orders from staff at NASA’s lab in La Cañada Flintridge. Although EELS is still years away from its first official deployment, it is already honing its decision-making skills so that it can navigate on its own, even in dangerous terrain.

Hiro Ono, leader of JPL’s Robotic Surface Mobility Group, started out seven years ago with a different vision for studying Enceladus and Europa, another watery moon, which orbits Jupiter. He envisioned a three-part system: a surface module that would generate power and communicate with Earth; a descent module that would work its way through a moon’s icy crust; and an autonomous underwater vehicle that would explore the subsurface ocean.

EELS replaces all of that.

EELS features spiral treads for traction and multiple body segments for flexibility. Its design allows it to wriggle through difficult terrain. (Brian van der Brug / Los Angeles Times)

Thanks to its snake-like anatomy, this new space explorer can move forward and backward in a straight line, slither like a snake, sweep its entire body like a windshield wiper, curl into a circle, and raise its head and tail. The result is a robot that won’t be stopped by deep craters, icy terrain or tight spaces.

“Sometimes the most interesting science is in hard-to-reach places,” said Matt Robinson, the project manager for EELS. Rovers struggle with steep slopes and irregular surfaces. But a snake-like robot would be able to reach places like an underground lunar cavity or the near-vertical wall of a crater, he said.

The farther away a spacecraft is, the longer it takes for human commands to reach it. The rovers on Mars are remotely controlled by people at JPL, and depending on the relative positions of Earth and Mars, it can take five to 20 minutes for messages to travel between them.

Enceladus, on the other hand, can be anywhere from 746 million to more than 1 billion miles from Earth. A radio transmission from out there would take at least an hour, maybe even an hour and a half. If EELS were in danger and needed human help to get out of it, its fate could be sealed by the time a response to its SOS arrived.
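
Those delay figures follow directly from the distances and the speed of light. Here is a quick back-of-the-envelope check in Python, using the article’s approximate numbers:

SPEED_OF_LIGHT_MILES_PER_SEC = 186_282      # speed of light in miles per second

for label, miles in [("Enceladus at its closest", 746_000_000),
                     ("Enceladus at its farthest", 1_000_000_000)]:
    one_way_minutes = miles / SPEED_OF_LIGHT_MILES_PER_SEC / 60
    print(f"{label}: about {one_way_minutes:.0f} minutes one way")

# Prints roughly 67 and 89 minutes -- at least an hour, and close to an hour and
# a half at the far end, before any reply even starts the trip back.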

“Most people get frustrated when their video game has a few seconds of lag,” said Robinson. “Imagine piloting a spacecraft that is in a dangerous area and has a 50-minute delay.”

That’s why EELS is learning to make its own decisions about how to get from Point A to Point B.

EELS autonomy leader Rohan Thakker, second from left, consults with engineers as they put the robot through its paces. (Brian van der Brug / Los Angeles Times)

A computer screen shows the actual position of EELS compared to the programmed position. (Brian van der Brug / Los Angeles Times)

Teaching the robot to assess its surroundings and make decisions quickly is a multi-step process.

First, EELS is taught to be safe. With the help of software that calculates the probability of failures – such as hitting something or getting stuck – EELS learns to recognize potentially dangerous situations. For example, it has to figure out that when something like fog interferes with its ability to map the world around it, it should respond by moving more cautiously, said Thakker, the project’s autonomy leader.
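
As a rough illustration of that idea, the sketch below scores each candidate move by an estimated probability of failure and rejects anything over a risk budget; when degraded perception pushes every option over the budget, the only “safe” choice left is to stop and rescan. The model, numbers and names are invented for illustration and are not JPL’s software.

def failure_probability(slope_deg, perception_noise):
    """Toy risk model: steeper slopes and noisier sensing both raise the chance of a slip."""
    slip_risk = min(slope_deg / 45.0, 1.0)            # treat a 45-degree slope as a near-certain slip
    return min(1.0, slip_risk + perception_noise)     # degraded perception adds uncertainty

def choose_action(candidates, perception_noise, risk_budget=0.1):
    """Pick the fastest candidate whose estimated failure probability fits the risk budget."""
    safe = [c for c in candidates
            if failure_probability(c["slope_deg"], perception_noise) <= risk_budget]
    if not safe:
        return {"name": "stop and rescan"}            # nothing is safe enough: be cautious
    return max(safe, key=lambda c: c["speed"])

routes = [{"name": "direct but steep", "slope_deg": 30, "speed": 0.5},
          {"name": "gentle detour", "slope_deg": 3, "speed": 0.3}]

print(choose_action(routes, perception_noise=0.0))    # clear sensing: take the gentle detour
print(choose_action(routes, perception_noise=0.2))    # "fog": every option is too risky, stop and rescan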

It also relies on its multitude of built-in sensors. Some can sense a change in its orientation with respect to gravity – the robotic equivalent of feeling like you’re about to fall. Others gauge the stability of the ground and can tell when hard ice gives way to loose snow, allowing EELS to maneuver onto a more navigable surface, Thakker said.
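
The “about to fall” check can be pictured as comparing the sensed gravity vector against straight down. Below is a minimal Python sketch of that computation; the readings and the 25-degree threshold are made up for illustration.

import math

def tilt_from_gravity(ax, ay, az):
    """Angle in degrees between the accelerometer's gravity vector and straight down (the z-axis)."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(min(1.0, abs(az) / magnitude)))

def about_to_fall(ax, ay, az, limit_deg=25.0):
    """Flag any tilt beyond the safety threshold."""
    return tilt_from_gravity(ax, ay, az) > limit_deg

print(about_to_fall(0.0, 0.0, 9.81))   # resting level: False
print(about_to_fall(5.0, 0.0, 8.0))    # leaning hard to one side (about 32 degrees): True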

In addition, EELS is able to incorporate past experience into its decision-making process – in other words, it learns. However, it does so in a slightly different way than a typical artificial intelligence robot.

For example, if an AI robot spots a puddle of water, it might examine it a bit before jumping in. The next time it encounters a puddle, it recognizes it, remembers it was safe, and jumps right in.

But in a dynamic environment, that could be deadly. Thanks to its additional programming, EELS would evaluate the puddle anew every time – just because it was safe once doesn’t mean it will be safe again.
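
The distinction reads roughly like this in code: a naive learner caches its first verdict about “a puddle” and reuses it, while EELS-style programming re-scores the hazard on every encounter. The scoring function and numbers below are invented for illustration.

def assess(depth_m, current_m_per_s):
    """Return True if this particular body of water looks safe to enter."""
    return depth_m < 0.5 and current_m_per_s < 0.2

class CachingLearner:
    def __init__(self):
        self.remembered_verdict = None
    def decide(self, depth_m, current_m_per_s):
        if self.remembered_verdict is None:                     # only checks the first puddle it sees
            self.remembered_verdict = assess(depth_m, current_m_per_s)
        return self.remembered_verdict

class ReassessingPlanner:
    def decide(self, depth_m, current_m_per_s):
        return assess(depth_m, current_m_per_s)                 # checks every time

learner, planner = CachingLearner(), ReassessingPlanner()
print(learner.decide(0.1, 0.0), planner.decide(0.1, 0.0))   # shallow and calm: True True
print(learner.decide(2.0, 1.5), planner.decide(2.0, 1.5))   # deep and fast: True (stale verdict) vs. False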

In the Mars Yard, a half-acre rocky sandbox used for testing rovers, Thakker and the EELS team assign the robot a specific target. It’s then the robot’s job to use its sensors to scan the world around it and find the best path forward, whether that runs straight through the dirt or across white mats that mimic ice.

JPL engineers tested EELS on glossy mats that served as a substitute for ice. (Brian van der Brug / Los Angeles Times)

It’s similar to navigating a self-driving car, except there are no stop signs or speed limits to help EELS plot its strategy, Thakker said.

EELS has also been tested on an ice rink, on a glacier and in snow. With its spiraling treads for traction and multiple body segments for flexibility, it can wriggle its way through all kinds of difficult terrain.

Mechanical engineer Sarah Yaericks holds the emergency stop as EELS bypasses obstacles in the Mars Yard at JPL. (Brian van der Brug / Los Angeles Times)

The robot is not the only one learning. As its human handlers monitor EELS’s progress, they adjust its software to help it better assess its surroundings, Robinson said.

“It’s not an equation that you can easily solve,” Ono said. “It can often be more of an art than a science. … A lot comes from experience.”

The goal is for EELS to gain enough experience to operate on its own in any type of environment.

“We’re not there yet,” Ono said. But EELS’ recent advances mean “one small step for the robot and one giant leap for mankind.”

