NASA’s EELS robot is revolutionizing the exploration of other worlds

Illustration of the Exobiology Extant Life Surveyor (EELS) concept. The versatile snake-like robot is designed to autonomously explore uncharted terrain in space and on Earth without real-time human intervention. Credit: NASA/JPL-Caltech
A versatile robot that can map, traverse, and explore previously inaccessible destinations is being tested at NASA’s Jet Propulsion Laboratory.
How do you build a robot that can go places no one has ever seen, on its own, without real-time human intervention? A team at NASA’s Jet Propulsion Laboratory building a snake-like robot to traverse extreme terrain is taking on the challenge with the mindset of a startup: build quickly, test often, learn, adjust, and iterate.
Called EELS (short for Exobiology Extant Life Surveyor), the self-propelled, autonomous robot was inspired by a desire to look for signs of life in the ocean hiding below the icy crust of Saturn’s moon Enceladus, by descending the narrow vents in its surface that spew geysers into space. Although testing and development continue, designing for such a challenging destination has resulted in a highly adaptable robot. EELS could pick a safe course through a wide variety of terrain on Earth, the Moon, and beyond, including undulating sand and ice, cliff walls, craters too steep for rovers, underground lava tubes, and labyrinthine spaces within glaciers.

Team members from NASA’s Jet Propulsion Laboratory test a snake robot called EELS at a ski resort in the mountains of Southern California in February. Designed to sense its environment, calculate risk, travel, and gather data without real-time human intervention, EELS could eventually explore destinations throughout the solar system. Credit: NASA/JPL-Caltech
“It has the capability to go to locations where other robots can’t go. Though some robots are better at one particular type of terrain or another, the idea for EELS is the ability to do it all,” said JPL’s Matthew Robinson, EELS project manager. “When you’re going places where you don’t know what you’ll find, you want to send a versatile, risk-aware robot that’s prepared for uncertainty and can make decisions on its own.”
The project team began building the first prototype in 2019 and has been making continual revisions. Since last year, they’ve conducted monthly field tests and refined both the hardware and the software that allows EELS to operate autonomously. In its current form, called EELS 1.0, the robot weighs about 220 pounds (100 kilograms) and is 13 feet (4 meters) long. It’s composed of 10 identical segments that rotate, using screw threads for propulsion, traction, and grip. The team has been experimenting with a variety of screws: 8-inch (20-centimeter) 3D-printed plastic screws for testing on looser ground, and narrower, sharper metal screws for ice.

EELS was tested in the sandy terrain of JPL’s Mars Yard in April. Engineers frequently test the snake robot across a variety of terrain, including sand, snow, and ice. Credit: NASA/JPL-Caltech
The robot has been tested in sandy, snowy, and icy environments, from the Mars Yard at JPL to a “robot playground” created at a ski resort in the snowy mountains of Southern California, even at a local indoor ice rink.
“We have a different philosophy of robot development than traditional spacecraft, with many rapid cycles of testing and debugging,” said Hiro Ono, EELS principal investigator at JPL. “There are dozens of textbooks about how to design a four-wheeled vehicle, but there is no textbook about how to design an autonomous snake robot to boldly go where no robot has gone before. We have to write our own. That’s what we’re doing now.”
EELS (Exobiology Extant Life Surveyor), developed at NASA’s Jet Propulsion Laboratory, is conceived as an autonomous snake robot that could descend through narrow openings in the icy crust of Saturn’s moon Enceladus to explore the ocean hiding below. But prototypes of the robot have been tested in a variety of environments. Credit: NASA/JPL-Caltech
How EELS thinks and moves
Because of the communications lag between Earth and deep space, EELS is designed to autonomously sense its environment, calculate risk, travel, and gather data with yet-to-be-determined science instruments. When something goes wrong, the goal is for the robot to recover on its own, without human assistance.
“Imagine a car driving autonomously, but there are no stop signs, no traffic signals, not even any roads. The robot has to figure out what the road is and try to follow it,” said Rohan Thakker, the project’s autonomy lead. “And then it needs to go down a 100-foot drop and not fall.”
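To make that sense-assess-act-recover idea more concrete, here is a rough sketch in Python of what a single autonomy cycle might look like. This is not JPL’s flight software; every class and function name below (TerrainMap, sense_environment, estimate_path_risk, recover) is hypothetical and only illustrates the loop the article describes.

```python
# Illustrative sketch only -- not EELS flight software. It mirrors the
# autonomy goals described above: sense the surroundings, weigh the risk
# of candidate paths, move, and recover from faults without waiting for
# instructions from Earth. All names here are hypothetical.

from dataclasses import dataclass, field

class RobotFault(Exception):
    """Raised when motion or sensing fails mid-traverse."""

@dataclass
class TerrainMap:
    """Stand-in for a 3D model of the robot's surroundings."""
    candidate_paths: list = field(default_factory=list)

def autonomy_cycle(robot):
    """One sense-plan-act cycle with onboard fault recovery."""
    terrain = robot.sense_environment()            # build map from cameras and lidar
    path = min(terrain.candidate_paths,
               key=robot.estimate_path_risk)       # choose the safest route forward
    try:
        robot.execute(path)                        # travel and collect data
    except RobotFault as fault:
        robot.recover(fault)                       # no human in the loop
```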

Members of JPL’s EELS team lowered the robot’s sensor head, which uses lidar and stereo cameras to map its environment, into a vertical shaft called a moulin on the Athabasca Glacier in the Canadian Rockies in September 2022. The team will return to the site in 2023 and 2024 for additional testing with more complete versions of the robot. Credit: NASA/JPL-Caltech
EELS creates a 3D map of its surroundings using four pairs of stereo cameras and lidar, which is similar to radar but uses short laser pulses instead of radio waves. With the data from those sensors, navigation algorithms work out the safest path forward. The goal has been to build a library of “gaits,” or ways the robot can move in response to terrain challenges, from sidewinding to curling in on itself, a move the team calls “banana.”
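As a purely illustrative example of what looking up a move from such a gait library could involve, the Python sketch below maps terrain labels to gait names. Only the sidewinding and “banana” moves come from the article; the other labels and names are invented for the example.

```python
# Illustrative sketch, not JPL code: choosing a gait from a library based
# on a terrain classification built from stereo-camera and lidar data.
# Only "sidewind" and "banana" come from the article; the other labels
# and gait names are invented for this example.

GAIT_LIBRARY = {
    "open_sand":      "screw_forward",   # straight, screw-driven travel
    "steep_slope":    "sidewind",        # lateral motion to keep traction
    "tight_corner":   "banana",          # curl in on itself to turn
    "vertical_shaft": "shaft_brace",     # press against opposing walls
}

def choose_gait(terrain_label: str) -> str:
    """Return the gait matched to the classified terrain, or a cautious default."""
    return GAIT_LIBRARY.get(terrain_label, "stop_and_reassess")

print(choose_gait("steep_slope"))   # -> sidewind
print(choose_gait("crevasse"))      # -> stop_and_reassess
```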
In its final form, the robot will contain 48 actuators, essentially little motors, that give it the flexibility to assume multiple configurations but add complexity for both the hardware and software teams. Thakker compares the actuators to “48 steering wheels.” Many of them have built-in torque sensing, which works like a kind of skin so that EELS can feel how much force it’s exerting on the terrain. That helps it move vertically in narrow chutes with uneven surfaces, configuring itself to press against opposing walls at the same time, like a rock climber.
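The sketch below, again not the actual EELS control code, shows the kind of simple force-feedback rule that bracing against opposing shaft walls might involve: compare a contact force estimated from a joint’s torque sensor against a target and nudge the joint accordingly. The target force and gain are arbitrary example values.

```python
# Illustrative sketch, not the EELS control software: a proportional
# force-feedback rule for bracing against opposing shaft walls. A joint's
# torque sensor gives an estimate of contact force; the robot presses
# harder when contact is too light and relaxes when it is too firm.
# The target force and gain below are arbitrary example values.

TARGET_FORCE_N = 40.0   # desired bracing force against each wall (example value)
GAIN = 0.05             # proportional gain from force error to joint command

def brace_adjustment(measured_force_n: float) -> float:
    """Return an incremental joint command nudging contact force toward the target."""
    error = TARGET_FORCE_N - measured_force_n
    return GAIN * error

# Example: a segment feels only 25 N of wall contact, so it presses a bit harder.
print(brace_adjustment(25.0))   # -> 0.75 (positive means press outward more)
```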

The screws that propel EELS while providing traction and grip are lined up in a lab at JPL. At left is a black aluminum screw used for testing on ice. The remaining 3D-printed plastic screws, of varying lengths, lead angles, thread heights, and edge sharpness, were tested on more pliable snow and sand. Credit: NASA/JPL-Caltech
Last year, the EELS team got a feel for those kinds of tricky spaces when they lowered the robot’s perception head, the part containing the cameras and lidar, into a vertical shaft called a moulin in the Athabasca Glacier in the Canadian Rockies. This September, they’ll return to the site, which in many ways is an analog for the icy moons of our solar system, with a version of the robot designed to test mobility below the surface. The team will also drop in a small suite of sensors (to monitor the glacier’s chemical and physical properties) that EELS eventually could deploy to hard-to-reach sites.
“Our focus so far has been on autonomous capability and mobility, but eventually we’ll look at what science instruments we can integrate with EELS,” Robinson said. “Scientists tell us where they want to go, what they’re most excited about, and we’ll provide a robot that will get them there. How? Like a startup, we just have to build it.”
The EELS system is a mobile instrument platform designed to explore internal terrain structures, assess habitability, and ultimately search for evidence of life. It is designed to be adaptable enough to traverse ocean-world-inspired terrain, fluidized media, enclosed labyrinthine environments, and liquids. Credit: NASA/JPL-Caltech
More about the project
EELS is funded by the Office of Technology Infusion and Strategy at NASA’s Jet Propulsion Laboratory in Southern California through a technology accelerator program called JPL Next. NASA’s Jet Propulsion Laboratory is managed by the California Institute of Technology in Pasadena, California. The EELS team has worked with a number of university partners on the project, including Arizona State University, Carnegie Mellon University, and the University of California, San Diego. The robot is not currently part of any NASA mission.