Now you really are ready to run the reduction program. You have all the necessary information in the right file formats. Usually, it's convenient to keep each night's observations in a separate file.
The command REDUCE/PHOT will start up the reduction program. It starts out very much like the planning program, as it also needs the telescope, standard-star, and instrument information. But instead of asking about output formats and the date of the observing run, it requests the names of the data files. Normally, each data file is one night; you can reduce up to 30 nights together.
To save typing, it is convenient to make a catalog that refers to all the data files; use the MIDAS command CREATE/TCAT to do this. For example, if your data files have names like night1.tbl, night2.tbl, and so on, the MIDAS command line
CREATE/TCAT data night*.tbl
will make a catalog named data.cat that refers to the whole group.
As with star files, if you mistakenly indicate that there is another data file, just enter ``NONE'' at the prompt for the file name; the program will end the list and go on.
If the data are pulse counts, they are corrected for the nominal dead time (using a formula appropriate to the type of counter used) as they are read in. If they are raw magnitudes, they are converted to intensities at this stage, so the reduction can proceed in the same way for all types of data. At this point, observations are in intensity units on some arbitrary instrumental scale. The intensities are then corrected for the nominal values of any neutral attenuators that may have been used.
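The two conversions described above can be sketched as follows. This is an illustrative sketch, not the program's actual code: the function names are hypothetical, the non-paralyzable dead-time formula shown is only one of the forms the program may apply (the correct formula depends on the counter type), and the intensity scale is arbitrary.

```python
def correct_dead_time(rate, tau):
    """Correct an observed count rate (counts/s) for dead time tau (s).

    Assumes a non-paralyzable counter, for which
        n_true = n_obs / (1 - n_obs * tau).
    A paralyzable counter instead satisfies n_obs = n_true * exp(-n_true * tau),
    which must be solved iteratively.
    """
    return rate / (1.0 - rate * tau)

def magnitude_to_intensity(mag):
    """Convert a raw instrumental magnitude to an intensity.

    Uses Pogson's relation, I = 10**(-0.4 * m); the zero point
    (and hence the intensity scale) is arbitrary.
    """
    return 10.0 ** (-0.4 * mag)
```

For example, an observed rate of 10^5 counts/s with a 100 ns dead time is corrected upward by about one percent, and a star 2.5 mag fainter than another has one-tenth its intensity.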
Once the data are read, the reduction proceeds in two main stages: first, filling in missing meteorological data, subtracting dark readings, and subtracting sky; and second, fitting the remaining stellar data to appropriate models.
If temperature and relative-humidity data are present, they are displayed graphically for each night. After looking at each graph, you can choose one of three treatments: polygon interpolation (i.e., just straight line segments between adjacent data points); linear smoothing (a single straight line fitted to the whole set); or adopting a constant value for the whole night. Usually, polygon interpolation is adequate. However, it is sensitive to aberrant or bad points. If you believe there are outliers in the data, and the rest are reasonably linear, use the linear fit, which resists the effects of bad points. If you think a bad point can be fixed, or should be removed before proceeding, just enter ``q'' to quit, and deal with the bad datum.
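The two main treatments can be sketched as below. This is an illustrative sketch of the underlying arithmetic, not the program's own code; the function names are hypothetical, and the least-squares line is computed directly from the usual closed-form expressions.

```python
def polygon_interpolate(times, values, t):
    """Polygon interpolation: straight line segments between adjacent points.

    Returns the interpolated value at time t, which must lie within
    the observed range. Sensitive to any bad point it passes through.
    """
    for t0, v0, t1, v1 in zip(times, values, times[1:], values[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    raise ValueError("t outside the observed range")

def linear_fit(times, values):
    """Least-squares straight line through the whole night's data.

    Returns (slope, intercept). Averaging over all points makes the
    fit resistant to a single aberrant datum.
    """
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(times, values))
             / sum((t - mt) ** 2 for t in times))
    return slope, mv - slope * mt
```

Adopting a constant value for the whole night corresponds to the degenerate case of a fit with zero slope, i.e., simply using the mean.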