Common DFOS tools:

dfos = Data Flow Operations System, the common tool set for DFO

version 1.0


- the tool is distributed through the utilities package (utilPack)
- hideFrame: the tool to request header updates

[ used databases ]  SAFIQ
[ used dfos tools ]
[ output used by ]  ngasClient
[ upload/download ] download: headers from SAFIQ



hotfly = headers on the fly: a tool to download up-to-date headers from the keyword repository, and to update the headers of existing fits files.

The source of information is the keyword repository (also called "SAFIQ"* or the "warehouse"). This repository stores all header keywords, including the ones from extensions. Technically, it is an "inverted" database table where all keywords are stored as "name - value" pairs (very much like in the original headers), in contrast to a traditional database table where a complete header would be stored as one record (one line) with as many columns as keywords. This architecture makes it possible to store, address, and update each single name-value pair. This is impossible in a traditional table like dp_products (where only standard keys like arcfile and dp_catg are stored) or dp_headers (where only the header name and its complete, "frozen" content are stored and cannot be updated).
* nobody at ESO can tell what the "IQ" stands for; the underlying product is Sybase IQ.
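The difference between the two layouts can be sketched in a few lines. This is a purely illustrative example (SAFIQ runs on Sybase IQ, not SQLite, and the table and column names below are hypothetical, not SAFIQ's actual schema); it shows how an inverted name-value table lets you update one single keyword pair in place:

```python
import sqlite3

# Hypothetical "inverted" keyword table: one row per name-value pair,
# including keywords from extensions (ext > 0).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE keywords (
    arcfile TEXT, ext INTEGER, name TEXT, value TEXT)""")
conn.executemany(
    "INSERT INTO keywords VALUES (?, ?, ?, ?)",
    [("AMBER.2007-11-21T20:34:21.626", 0, "DPR.CATG", "CALIB"),
     ("AMBER.2007-11-21T20:34:21.626", 0, "DPR.TYPE", "DARK"),
     ("AMBER.2007-11-21T20:34:21.626", 1, "EXTNAME", "IMAGING_DATA")])

# Update one single name-value pair -- impossible in a table that
# stores a complete, "frozen" header as one record:
conn.execute(
    "UPDATE keywords SET value = ? WHERE arcfile = ? AND name = ?",
    ("DARK,DETCHECK", "AMBER.2007-11-21T20:34:21.626", "DPR.TYPE"))

row = conn.execute(
    "SELECT value FROM keywords WHERE name = 'DPR.TYPE'").fetchone()
print(row[0])  # DARK,DETCHECK
```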

Any change to a header keyword made in SAFIQ is reflected in a header downloaded with hotfly, or in a fits file header updated with hotfly. In particular, any change to a raw file header key that you have requested from dbcm using hideFrame will be reflected.

Currently, new headers are inserted into SAFIQ roughly every 24 hours. This means that a header younger than about a day cannot yet be downloaded or updated.
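Since the observation timestamp is encoded in the arcfile name, the ingestion lag can be checked before calling the tool. The following helper is a hypothetical sketch (not part of hotfly) under the "roughly 24 hours" assumption stated above:

```python
from datetime import datetime, timedelta

def in_safiq(arcfile, now, ingestion_lag=timedelta(hours=24)):
    """Rough check (hypothetical helper, not part of hotfly): is the
    observation timestamp in the arcfile name old enough for the
    header to have been ingested into SAFIQ?"""
    stem = arcfile.removesuffix(".fits").removesuffix(".hdr")
    # arcfile names look like INSTRUMENT.YYYY-MM-DDThh:mm:ss.sss
    ts = datetime.strptime(stem.split(".", 1)[1], "%Y-%m-%dT%H:%M:%S.%f")
    return now - ts >= ingestion_lag

print(in_safiq("AMBER.2007-11-21T20:34:21.626.fits",
               datetime(2007, 11, 23, 12, 0)))   # True: almost 2 days old
print(in_safiq("AMBER.2007-11-21T20:34:21.626.fits",
               datetime(2007, 11, 21, 22, 0)))   # False: too young
```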

Extensions. By default the tool downloads and updates the complete header, including all extensions if any exist. To restrict the operation to the primary header only, use the option -p. After all, all information required for OCA should be in the primary header.
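In a FITS header stream, the primary header is everything up to and including the first END card; extensions (XTENSION blocks) follow after it. A minimal sketch of what -p effectively restricts to (this is not hotfly's actual implementation; cards are simplified strings here):

```python
def primary_cards(cards):
    """Keep only the cards of the primary header, i.e. everything up
    to and including the first END card. Illustration only."""
    out = []
    for card in cards:
        out.append(card)
        if card.strip() == "END":
            break
    return out

header = ["SIMPLE  =                    T",
          "NAXIS   =                    0",
          "DPR.CATG= 'CALIB'",
          "END",
          "XTENSION= 'IMAGE'",   # first extension starts here
          "EXTNAME = 'IMAGING_DATA'",
          "END"]
print(len(primary_cards(header)))  # 4
```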

Comparison of all current header download mechanisms

method                  | 1: dataclient -t h                     | 2: hotfly     | 3: ngasClient -W | 4: dataclient -t f, then extract header | combination 1+2
delay*                  | almost none (replication from Paranal) | about one day | several days     | several days                            | almost none
hdr update              | no                                     | yes           | partial          | partial                                 | yes
broken tpl support      | yes                                    | no            | no               | yes                                     | yes
extensions              | yes                                    | yes           | no               | yes                                     | yes
full b/w compatibility  | no                                     | yes           | yes              | yes                                     | yes
performance**           | 0.13 sec                               | 0.3 sec       | 2.8 sec          | very slow in comparison                 | fast

* difference between 'now' and the last available header
** download time per header, on the old system
1+2: first method 1 (for preview, corrections etc.), then method 2 (upon processing): combines all advantages

Combining methods 1 and 2 currently offers all advantages. This means that for all dfos applications requiring up-to-date headers (qc1Parser and the future calChecker) we continue to use dataclient -t h. For processing with autoDaily (creating ABs and downloading on demand) the preferred method is ngasClient with hotfly (ngasClient -f).
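The policy in the previous paragraph can be written down as a small dispatcher. This is a hypothetical sketch (the function name is invented; the application names and commands are the ones listed above):

```python
def header_command(application):
    """Hypothetical dispatcher encoding the policy above: applications
    needing up-to-date headers use method 1; processing uses method 2
    through the updated ngasClient."""
    needs_fresh = {"qc1Parser", "calChecker"}
    if application in needs_fresh:
        return "dataclient -t h"   # almost no delay, but no hdr update
    if application == "autoDaily":
        return "ngasClient -f"     # hotfly-updated headers on demand
    raise ValueError(f"no policy defined for {application!r}")

print(header_command("qc1Parser"))   # dataclient -t h
print(header_command("autoDaily"))   # ngasClient -f
```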


Access to the SAFIQ server is configured in your .dbrc file. Add this line to the .dbrc file:


where <passwd> is the password for the QC user, and HOTFLY_OP is an alias for the operational server.

Make sure that your .qcrc also includes the line


and that you source the modified .qcrc before using hotfly or the updated ngasClient.

How to use

Usually you will not run the tool on the command line, but within ngasClient. Command-line usage:

Type hotfly -h for on-line help, hotfly -v for the version number.

To download a header, use

hotfly -H AMBER.2007-11-21T20:34:21.626.hdr

to update the header of an existing fits file (you need to be in the directory where the fits file is located):

hotfly -f AMBER.2007-11-21T20:34:21.626.fits

to have the primary header updated only, type

hotfly -f AMBER.2007-11-21T20:34:21.626.fits -p
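For batch use (e.g. from a script rather than the command line), the three invocations above can be assembled programmatically. A convenience-wrapper sketch, using only the -H, -f and -p options documented above (the function name is hypothetical; pass the result to e.g. subprocess.run()):

```python
def hotfly_cmd(name, primary_only=False):
    """Build a hotfly command line: -f to update a fits file in place,
    -H to download a header, -p to restrict to the primary header."""
    mode = "-f" if name.endswith(".fits") else "-H"
    cmd = ["hotfly", mode, name]
    if primary_only:
        cmd.append("-p")
    return cmd

print(hotfly_cmd("AMBER.2007-11-21T20:34:21.626.hdr"))
print(hotfly_cmd("AMBER.2007-11-21T20:34:21.626.fits", primary_only=True))
```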


The tool comes as part of the utilPack delivery.

Configuration file


Operational aspects