At our observatory we have two small radio telescopes that are completely automated.
Each day at sunrise they point at the Sun and then track it until sunset, recording the strength of the solar radio emission. In addition, three times a day the telescopes make precisely calibrated measurements, a complex choreography of pointing the antennas at and away from the Sun and switching on calibration devices. An automatic data distribution system then emails those measurements to users all over the world. The only things requiring staff action are clearing snow off the antennas and fixing things when they break. Today most telescopes are automated to some degree, but it was not always so.
H.G. Wells's book The War of the Worlds starts with an astronomer observing Mars. He looks at the Red Planet through the eyepiece of the telescope, and the only automation is a small motor turning the telescope at a constant rate to compensate for the Earth's rotation, keeping it pointed at Mars. These days things are different.
Over the last few decades, astronomical telescopes have undergone what can be described as creeping automation, gradually reducing the need for human intervention. Given the long hours observations take, often at inconvenient times, it is easy to see why automation would be a priority. Beyond the convenience, it minimizes the chance of expensive telescope time being lost to mistakes by half-asleep astronomers. We can also make mistakes in analyzing the data. If we record the data from the telescope in a raw, unprocessed form, we have an opportunity to repeat our work, and to find and correct our mistakes. This usually means recording a lot of information, leading to the problem of data overload.
Back in the H.G. Wells days, a night's observing would produce a few numbers written in a notebook and maybe a hand-drawn sketch. A couple of decades later, the data was often in the form of photographs. These required long exposures, with the astronomer carefully making sure the telescope remained precisely pointed.
Then we put computers on telescopes, making a night's work more productive and yielding maybe a few thousand numbers to deal with later. As computers became more powerful, we could log more data, making it easier to detect and correct problems. Later, the emphasis in astronomy moved from making detailed observations of single objects to making surveys – observations of thousands or millions of objects. This turned the flow of data into a tsunami.
Surveys are useful because they give us data from many objects, collected over a short time and all processed in exactly the same way. We can categorize the objects, identify any of special interest and compare them with one another. However, sifting through the mountains of data that surveys produce is an enormous task. Enter the robot data analyst. Today, advances in neural networks, artificial intelligence and machine learning mean we can often let computers take the first look at the data, classifying it and sorting out the good bits.
We can tell computers what we are looking for, such as the slight dimming of any of the thousands of stars in a survey as a planet passes in front of it, or we can simply tell the computer to look at the data and report anything that stands out.
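As a concrete illustration, here is a minimal sketch of that first kind of search, flagging transit-like dimmings in a star's brightness record. The data, window size and threshold are illustrative assumptions on my part, not any survey's actual software:

```python
import numpy as np

def find_dips(flux, window=101, threshold=4.0):
    """Flag samples that dip well below the local median brightness.

    flux: 1-D array of brightness measurements, evenly spaced in time.
    window: width (odd number of samples) of the sliding median baseline.
    threshold: how many robust standard deviations below the baseline
        a sample must fall to be flagged as a candidate transit.
    """
    pad = window // 2
    padded = np.pad(flux, pad, mode="edge")
    # A sliding median gives a smooth baseline that ignores brief dips.
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(flux.size)])
    residual = flux - baseline
    # Robust noise estimate from the median absolute deviation.
    sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))
    # Candidate transits: samples far below the local baseline.
    return np.where(residual < -threshold * sigma)[0]

# Illustrative use: a steady star with one shallow dimming injected.
rng = np.random.default_rng(0)
flux = 1.0 + 0.001 * rng.standard_normal(2000)
flux[900:910] -= 0.01    # a 1% dip, like a planet passing in front
print(find_dips(flux))   # prints the indices of the flagged dip
```

A real pipeline would go further, for example checking that the dips repeat at regular intervals, as those caused by an orbiting planet would.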
A few years ago I got a demonstration of this when visiting a colleague in Ottawa. He gave a catalogue of observations of more than 20,000 galaxies to a computer and simply told it to sort them out and categorize them, which it did, saving countless hours of work.
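I cannot reproduce his software here, but the idea can be sketched with a standard unsupervised method such as k-means clustering; the galaxy properties below are made-up stand-ins for whatever the real catalogue contained:

```python
import numpy as np
from sklearn.cluster import KMeans

# A made-up catalogue: one row per galaxy, one column per measured
# property (brightness, colour and size are illustrative stand-ins).
rng = np.random.default_rng(1)
catalogue = np.column_stack([
    rng.normal(10.0, 2.0, 20000),   # brightness
    rng.normal(0.5, 0.2, 20000),    # colour index
    rng.normal(1.0, 0.3, 20000),    # apparent size
])

# Ask for five categories; the computer decides what goes in each one.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(catalogue)

for group in range(5):
    print(f"category {group}: {(labels == group).sum()} galaxies")
```

The computer is never told what the categories mean; it simply groups galaxies whose measured properties resemble one another, leaving the astronomer to interpret the groups. Next, robot astronomers?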
Mars is low in the southwest, above the red star Antares. Saturn is fainter and close above Mars. The Moon will reach Last Quarter on Aug. 24.
Ken Tapping is an astronomer with the National Research Council’s Dominion Radio Astrophysical Observatory, Penticton.