This article was originally published in The Conversation. The publication contributed the article to Space.com’s Expert Voices: Op-Ed & Insights.
The famous first image of a black hole just got twice as sharp. A research team used artificial intelligence to dramatically improve on its 2019 original, and the new image shows the black hole at the center of the galaxy M87 as darker and larger than the first image depicted.
I’m an astronomer who studies and has written about cosmology, black holes and exoplanets. Astronomers have been using AI for decades. In fact, in 1990, astronomers from the University of Arizona, where I am a professor, were among the first to use a type of AI called a neural network to study the shapes of galaxies.
Better telescopes, more data
As long as astronomy has been a science, it has involved trying to make sense of the multitude of objects in the night sky. That was relatively simple when the only tools were the naked eye or a simple telescope, and all that could be seen were a few thousand stars and a handful of planets.
A hundred years ago, Edwin Hubble used newly built telescopes to show that the universe is filled with not just stars and clouds of gas, but countless galaxies. As telescopes have continued to improve, the sheer number of celestial objects humans can see and the amount of data astronomers need to sort through have both grown exponentially.
For example, the soon-to-be-completed Vera Rubin Observatory in Chile will make images so large that it would take 1,500 high-definition TV screens to view each one in its entirety. Over 10 years it is expected to generate 0.5 exabytes of data — about 50,000 times the amount of information held in all of the books in the Library of Congress.
There are now 20 telescopes with mirrors larger than 20 feet (6 meters) in diameter. AI algorithms are the only way astronomers could ever hope to work through all of the data available to them today, and AI is proving useful in processing that data in a number of ways.