This page provides information about pipeline processing and data types.
Raw data are selected, associated and inserted into a reduction mechanism which produces calibration products, science products and quality control information. This mechanism is the data processing pipeline. There is one such pipeline for each VLT and VLTI instrument.
Find general information about ESO reduction pipelines here.
The main functionalities of the pipelines are:
QC Garching creates master calibration data from all raw calibration data. The raw data are stored in the ESO Archive and are public. They are quality-checked and used for data reduction and for trending.
Before October 2011, QC Garching also processed science data, using the best available, quality-checked master calibration data. As of October 2011, this service is no longer offered.
There are two instances of the data reduction pipelines:
The automatic mode is used for quick-look purposes and for on-site quality control. It processes all raw data sequentially, as they arrive from the instrument. If calibration products ("master calibrations") are required for processing science data, they are taken from a database of standard, pre-manufactured calibration products. The automatic mode is not tuned to obtain the best possible results.
The optimized mode uses all data of a night, including the daytime calibrations. The calibration data are sorted and grouped according to their dependencies, master calibration data are created, and their quality is checked.
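The sorting and grouping step in the optimized mode can be sketched as follows. This is a minimal illustration, not the actual pipeline code; the frame records and grouping keys are made-up assumptions standing in for values that would come from FITS headers.

```python
from itertools import groupby

# Hypothetical raw calibration records, as they might be classified from
# FITS header keywords (types, DITs and filter names are illustrative).
raw_cals = [
    {"type": "DARK", "dit": 10.0},
    {"type": "DARK", "dit": 10.0},
    {"type": "FLAT", "filter": "Ks"},
    {"type": "FLAT", "filter": "Ks"},
    {"type": "FLAT", "filter": "J"},
]

def setup_key(frame):
    # Frames sharing the same instrument setup are combined into one
    # master calibration; stringify so missing keys sort cleanly.
    return (frame["type"], str(frame.get("dit")), str(frame.get("filter")))

# Group frames by setup; each group would yield one master calibration.
groups = {}
for key, frames in groupby(sorted(raw_cals, key=setup_key), key=setup_key):
    groups[key] = list(frames)

for key, frames in groups.items():
    print(key, "->", len(frames), "raw frame(s)")
```

Each group would then be stacked into a master calibration product whose quality is checked before it is used downstream.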
The HAWK-I data processing pipeline has been operational since April 2008. All operational HAWK-I observing modes are supported, although with some caveats:
The HAWK-I pipeline is publicly available (check it out here). Under this link you will also find the pipeline User's Manual.
The following is a summary of some changes in the pipeline. Please note that the list may not be complete:
Find the description of HAWK-I data processing and pipeline recipes here:
Raw data. HAWK-I has a 2x2 mosaic of four 2048x2048-pixel photon-sensitive detectors. There are NO pre- or post-overscan columns or rows.
Further details regarding the chips can be found in Appendix B of the User Manual.
In total, HAWK-I raw frames have 4096x4096 pixels. There is only one read mode available, which is unbinned and uncorrelated. Raw frames have a size of 65 MB each. Find example (reference) frames here.
Extensions. Raw data come as FITS files with FIVE HDUs (header-data units), so-called Multi-Extension FITS files, or MEFs. The primary HDU contains the header with all primary keywords (telescope, positioner, detector, observation, instrument etc.). The pixel data, plus chip-specific header information for each of the FOUR chips, are then stored in four further HDUs.
Products. Pipeline products are either FITS images or FITS tables. They have either a single HDU (i.e. traditional FITS files) or five HDUs as for the raw data: a primary HDU containing the header with all primary keywords (telescope, positioner, detector, observation, instrument etc.), and four further HDUs with the pixel or table data plus chip-specific header information for each of the FOUR chips. Further info can be found here.
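The five-HDU layout described above can be illustrated with a toy MEF built with astropy (an assumption; the pipeline itself is not written this way). The extension names and the reduced 64x64 chip size are purely illustrative; real chips are 2048x2048.

```python
import numpy as np
from astropy.io import fits

# Toy MEF mimicking the HAWK-I raw structure: one primary HDU
# (header only, no pixel data) plus four image extensions, one per chip.
primary = fits.PrimaryHDU()
primary.header["INSTRUME"] = "HAWKI"

chips = [
    fits.ImageHDU(np.zeros((64, 64), dtype=np.int32), name=f"CHIP{i + 1}")
    for i in range(4)
]
hdul = fits.HDUList([primary] + chips)

# A raw HAWK-I file therefore has five HDUs in total.
print(len(hdul))                  # 5
for hdu in hdul[1:]:
    print(hdu.name, hdu.data.shape)
```

Pipeline products with a single HDU would correspond to an `HDUList` containing only the primary HDU with its pixel or table data.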
Since the beginning of HAWK-I operations (April 2008), runs performed in Service Mode have received a set of DVDs containing:
32-bit Linux systems, as used by the ESO Quality Control Group, have a fundamental limit on the amount of memory that any single process can address at one time. For HAWK-I data processing this means that, with pipeline version 1.3.4 and earlier, stacks of more than about 70 frames cannot be handled. Therefore, for the purpose of QC, only the first 70 frames of stacks with more than 70 frames are used to create pipeline products.
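The ~70-frame limit follows from simple arithmetic. Assuming four 2048x2048-pixel chips held as 4-byte values per frame (an assumption about the working data type, consistent with the 65 MB raw frame size quoted above):

```python
# Rough estimate of why >70-frame stacks exceed a 32-bit address space.
pixels_per_frame = 4 * 2048 * 2048      # four 2048x2048 chips
bytes_per_frame = pixels_per_frame * 4  # assuming 4 bytes per pixel
stack_bytes = 70 * bytes_per_frame

print(f"one frame    : {bytes_per_frame / 2**20:.0f} MiB")
print(f"70-frame stack: {stack_bytes / 2**30:.1f} GiB")

# A 32-bit process can address at most 4 GiB, of which typically only
# about 3 GiB is usable on Linux, so such a stack cannot be held in memory.
```

A 70-frame stack already needs roughly 4.4 GiB, which is beyond what a single 32-bit process can address.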
Imperfect telescope offsets:
For the purposes of QC in P81, SCIENCE frames are reduced by the pipeline without using the --refine command line option.
Without the --refine command line option, the pipeline uses
the offsets recorded in the headers of each RAW frame in the stack for
shifting each frame onto a common reference. With the --refine
command line option, the pipeline uses the offsets recorded in the
headers of each RAW frame in the stack as a first guess, but refines
these guesses by making a fit to a single object in the central region
of the chip for shifting each frame onto
a common reference.
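The difference between the two strategies can be sketched as follows. This is an illustrative toy, not the recipe's actual algorithm: the "refinement" is approximated here by an intensity-weighted centroid of a single synthetic object near the chip centre, and all numbers are made up.

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid (y, x) of an image."""
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

ny = nx = 101
ref_y = ref_x = 50  # reference position at the chip centre

def make_frame(true_dy, true_dx):
    # One Gaussian "star", shifted by the true telescope offset.
    ys, xs = np.indices((ny, nx))
    y0, x0 = ref_y + true_dy, ref_x + true_dx
    return np.exp(-((ys - y0) ** 2 + (xs - x0) ** 2) / (2 * 2.0 ** 2))

true_offset = (3.4, -2.1)       # what the telescope actually did
header_offset = (3.0, -2.0)     # slightly imperfect value from the header
frame = make_frame(*true_offset)

# Without --refine: the header offset is used as-is.
# With --refine: the header value is only a first guess; the measured
# object position refines it before frames are shifted onto the reference.
cy, cx = centroid(frame)
refined_offset = (cy - ref_y, cx - ref_x)
print("header :", header_offset)
print("refined:", refined_offset)
```

The refined offsets recover the true shift and therefore give sharper stacked images when the header offsets are imperfect.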
Bugs in the ZPOINT recipe prior to version 1.4.2:
HAWK-I ZPOINTs are measured by observing single standard stars. The
star is observed four times in each filter, placing the star at the
center of each chip in turn. The OB should be created as follows:
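For context, a zeropoint measurement on one chip amounts to comparing the star's catalogue magnitude with its instrumental flux. A minimal sketch, with made-up numbers (the actual recipe also handles extinction, colour terms and aperture corrections):

```python
import math

# Hedged sketch of a per-chip zeropoint: ZP = m_std + 2.5 * log10(counts / exptime).
m_std = 11.5      # catalogue magnitude of the standard star (illustrative)
counts = 2.5e6    # background-subtracted counts in the aperture (illustrative)
exptime = 10.0    # exposure time in seconds (illustrative)

zp = m_std + 2.5 * math.log10(counts / exptime)
print(f"ZP = {zp:.2f} mag")
```

Repeating this with the star centred on each of the four chips in turn gives one zeropoint per chip and filter.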