The recent eruption of Erta Ale volcano in northeastern Ethiopia highlights the speed and impact of artificial intelligence (A.I.) in space.

By Andrew Good (Phys.org)

One of our planet’s few exposed lava lakes is changing, and artificial intelligence is helping NASA understand how.

On January 21, a fissure opened at the top of Ethiopia’s Erta Ale volcano, one of the few in the world with an active lava lake in its caldera. Volcanologists sent out requests for NASA’s Earth Observing 1 (EO-1) spacecraft to image the eruption, which was large enough to begin reshaping the volcano’s summit.

As it turned out, that spacecraft was already busy collecting data on the eruption. Alerted by a detection from another satellite, an A.I. system had ordered it to look at the volcano. By the time scientists needed these images, they had already been processed and were on the ground.

It’s a fitting capstone to the A.I.’s mission. That software, called the Autonomous Sciencecraft Experiment (ASE), has guided the actions of EO-1 for more than 12 years, helping researchers study natural disasters around the globe. ASE will conclude its operations this month, when EO-1’s mission comes to an end. ASE leaves behind a legacy that suggests great potential for A.I. in future space exploration.

Besides the recent eruption, ASE helped scientists study an Icelandic volcano as ash plumes grounded flights across Europe in 2010. It also tracked catastrophic flooding in Thailand. The software cut the turnaround time for data from weeks to just days, since users could submit requests in real time.

ASE was developed by NASA’s Jet Propulsion Laboratory in Pasadena, California, and uploaded in 2003 to EO-1, an Earth science satellite managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. The software directed EO-1 to alert researchers whenever it detected events of scientific interest, and autonomously tasked the spacecraft to take photos during subsequent orbital passes.

Additionally, it managed a “sensor web,” a network of other satellites and ground sensors that all “talk” to one another, helping to prioritize which events to focus on.
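The article describes this behavior only at a high level. As a rough illustration of the idea, not NASA’s actual software, the Python sketch below shows how a priority queue of detections reported by other sensors might drive autonomous follow-up imaging on the next orbital pass. All class names, the priority scheme, and the example coordinates are hypothetical.

```python
from dataclasses import dataclass, field
import heapq


@dataclass(order=True)
class Detection:
    # Lower number = higher priority; an eruption outranks routine observations.
    priority: int
    event_type: str = field(compare=False)
    location: tuple = field(compare=False)   # (latitude, longitude)
    source: str = field(compare=False)       # which sensor reported it


class SensorWebScheduler:
    """Toy scheduler: collect detections from other sensors, task follow-up imaging."""

    def __init__(self):
        self._queue = []

    def report(self, detection: Detection):
        # A ground sensor or another satellite reports an event of interest.
        heapq.heappush(self._queue, detection)

    def plan_next_pass(self):
        # On each orbital pass, image the highest-priority pending event.
        if not self._queue:
            return None
        det = heapq.heappop(self._queue)
        print(f"Tasking imager at {det.location} for {det.event_type} "
              f"(reported by {det.source})")
        return det


# Example: a thermal anomaly reported by another satellite triggers follow-up imaging
# near Erta Ale (coordinates approximate).
scheduler = SensorWebScheduler()
scheduler.report(Detection(priority=1, event_type="volcanic eruption",
                           location=(13.6, 40.7), source="thermal-alert satellite"))
scheduler.plan_next_pass()
```

The design point this sketch tries to capture is the one the article credits for the fast turnaround: detections queue up and task the spacecraft automatically, so imagery can be on the ground before anyone files a request.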

“It’s a milestone in A.I. application,” said Steve Chien, principal investigator of ASE and head of the Artificial Intelligence Group at JPL. “We were supposed to do this for six months, and we were so successful that we did it for more than 12 years.”

Continue reading this story on Phys.org