Data Pipelines and Quality Control

The Data Flow System Operations Model for VLT/VLTI Instrumentation describes how the pipelines, quality control, and instrument models are produced and operated, in order to guarantee the quantitative predictability and performance control of the instrumentation during operations of the VLT Observatory. Pipeline processing will be available for a subset of the VLT instrument modes.

Pipeline Systems

The Pipeline System performs the standard reduction tasks needed to produce suitable calibration data, to calibrate science frames, and to support data quality control and trend analysis. It is intended to run as automatically as possible and to deliver instrument-signature-corrected data.

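As a minimal sketch of what "instrument-signature-corrected" means in practice, the following applies the two classic detector corrections, bias subtraction and flat-field division, to a raw frame. The function and frame names are illustrative assumptions, not the actual VLT pipeline recipes.

```python
import numpy as np

def remove_instrument_signature(raw, master_bias, master_flat):
    """Illustrative instrument-signature removal for a raw detector frame:
    subtract the master bias, then divide by the master flat normalised
    to unit median so the flux scale of the frame is preserved."""
    flat = master_flat / np.median(master_flat)
    return (raw - master_bias) / flat

# Toy frames: a uniformly illuminated detector with a bias pedestal
# and a pixel-to-pixel sensitivity pattern.
rng = np.random.default_rng(0)
sensitivity = 1.0 + 0.05 * rng.standard_normal((64, 64))
master_bias = np.full((64, 64), 200.0)
master_flat = 1000.0 * sensitivity       # flat exposure sees the same pattern
raw = master_bias + 500.0 * sensitivity  # science exposure of a uniform source

reduced = remove_instrument_signature(raw, master_bias, master_flat)
# The corrected frame is flat: bias pedestal and sensitivity pattern removed.
```

Real pipeline recipes add further steps per instrument mode (dark correction, cosmic-ray rejection, wavelength calibration), but they compose in the same way.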

Quality Control and Simulation

The Quality Control system comprises the tools used inside and outside the pipeline environment to control the conditions under which the data have been acquired and processed, in particular the instrumental and observational conditions. The Quality Control Library provides the main classes from which these tools are built.
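The kind of check such tools perform can be sketched as comparing measured QC parameters against configured limits. The class and parameter names below are hypothetical illustrations, not the ESO Quality Control Library API.

```python
from dataclasses import dataclass

@dataclass
class QCCheck:
    """One quality-control parameter with its allowed range.
    (Names and limits are illustrative, not the ESO QC Library.)"""
    name: str
    lower: float
    upper: float

    def evaluate(self, value: float) -> bool:
        # A parameter passes if it lies within the configured range.
        return self.lower <= value <= self.upper

def run_qc(frame_stats: dict, checks: list) -> dict:
    """Evaluate each check against the measured frame statistics and
    return a report mapping parameter name -> (value, passed)."""
    return {c.name: (frame_stats[c.name], c.evaluate(frame_stats[c.name]))
            for c in checks}

# Illustrative instrumental/observational conditions for one frame.
stats = {"median_background": 512.0, "seeing_fwhm": 0.9}
checks = [QCCheck("median_background", 0.0, 1000.0),
          QCCheck("seeing_fwhm", 0.0, 0.8)]
report = run_qc(stats, checks)
# Here the background passes while the seeing check fails,
# flagging the frame for review.
```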

A number of simulators and calibration-database management tools have been developed to support the preparation of proposals and observing runs. Documentation is available.
