EUROPEAN SOUTHERN OBSERVATORY

Organisation Européenne pour des Recherches Astronomiques dans l'Hémisphère Austral

Europäische Organisation für astronomische Forschung in der südlichen Hemisphäre

 

 

 

 

VLT PROGRAMME

 

 

VERY LARGE TELESCOPE

 

 

 

 

 

VLT Instrumentation Software

---

Acceptance Test Plan

Template Document

 

 

 

 

 

 

Doc. No.: VLT-PLA-ESO-17240-2266

 

Issue: 6

 

Date: 16/02/2007

 

 

 

 

 

 

Name                                                       Date                         Signature

                Prepared:  A.Longinotti                                      16/02/2007

 

 

Name                                                       Date                         Signature

          Approved:  K.Wirenstrand                                                                  

 

Name                                                       Date                         Signature

                Released:  M.Peron                                                

 

 

VLT PROGRAMME * TELEPHONE: (089) 3 20 06-0 * FAX: (089) 3 20 06 514


 

CHANGE RECORD

 

 

ISSUE   DATE         SECTION/PAGE AFFECTED                        REASON/INITIATION DOCUMENTS/REMARKS

1       17/08/2000   All                                          First issue

2       28/03/2002   All                                          MAR2002

3       31/03/2003   All                                          APR2003

4       31/03/2004   1.2 3.9 4.2                                  Added Control Model tests. APR2004

5       13/01/2005   1.2 3.9 3.10.6                               Added tat tests
                     3.4.1 3.4.3 3.4.4 3.6.2 3.6.3 3.7.1          Examples changed
                     3.10.4 3.10.5                                Added reference to document on DFS in the VCM
                     Chapter 4                                    Added TAT001 and VCM006
                                                                  Updated according to new test scheme (VLTSW20040158)
                     Chapter 5                                    Removed unnecessary manual pages

6       16/02/2007   3.4.4                                        ic0SelfTest replaced by inscSelfTestICS
                     3.10.4 3.10.5                                Updated links to TWiki pages
                     Chapter 6                                    New (VLTSW20060060)

 


 

TABLE OF CONTENTS

 

 

 

1 INTRODUCTION
1.1     PURPOSE
1.2     SCOPE
1.3     APPLICABLE DOCUMENTS
1.4     REFERENCE DOCUMENTS
1.5     ABBREVIATIONS AND ACRONYMS
1.6     GLOSSARY
1.7     STYLISTIC CONVENTIONS
1.7.1      Data Flow and Processor Model Diagrams
1.8     NAMING CONVENTIONS
1.9     PROBLEM REPORTING/CHANGE REQUEST
2 OVERVIEW
2.1     HARDWARE REQUIREMENTS
2.2     SOFTWARE REQUIREMENTS
3 TEST DESCRIPTION
3.1     DOCUMENTATION
3.1.1      Instrument Software Acceptance Test Plan
3.1.2      Instrument Software User and Maintenance Manual (DOC001)
3.1.3      Instrument Software Acceptance Test Report
3.2     STANDARDS
3.2.1      Programming Standards (STD001)
3.2.2      Standard Architecture (STD002)
3.2.3      DCS packages (STD003)
3.2.4      ICS package (STD004)
3.2.5      OS package (STD005)
3.2.6      Startup procedures (STD006)
3.2.7      Rules and package for templates (STD007)
3.2.8      Instrument Configuration files (STD008)
3.2.9      Users name (STD009)
3.3     INSTALLATION
3.3.1      Make sure that the Instrument Software is built from scratch (INS001)
3.3.2      Usage of pkgin to build the Instrument Software (INS002)
3.3.3      Access to cmm Archive (INS003)
3.3.4      Installation failures check (INS004)
3.3.5      Instrument package for P2PP (INS005)
3.4     SUB-SYSTEMS TEST
3.4.1      DCS test (DCS001)
3.4.2      ICS special device LCU test (ICS001)
3.4.3      ICS special device test (ICS002)
3.4.4      ICS test (ICS003)
3.5     GRAPHICAL USER INTERFACE
3.5.1      DCS stand-alone GUI (GUI001)
3.5.2      ICS stand-alone GUI (GUI002)
3.5.3      OS Control GUI (GUI003)
3.5.4      OS Status GUI (GUI004)
3.5.5      GUIs layout (GUI005)
3.6     OS
3.6.1      Startup/Shutdown (OS001)
3.6.2      Single exposure (OS002)
3.6.3      Templates (OS003)
3.6.4      Interface P2PP-BOB (OS004)
3.7     MS
3.7.1      Technical templates (MS001)
3.7.2      Results format (MS002)
3.8     ALARMS
3.8.1      Emergency cases (ALM001)
3.8.2      Simulate alarms (ALM002)
3.8.3      Configure alarm conditions (ALM003)
3.9     AUTOMATIC REGRESSION TESTS
3.9.1      Full cycle (TAT001)
3.10    VLT CONTROL MODEL
3.10.1     Make sure that the Instrument Software is built from scratch (VCM001)
3.10.2     Build the Instrument Software for the VCM (VCM002)
3.10.3     Templates (VCM003)
3.10.4     Interface P2PP-BOB (VCM004)
3.10.5     Interface OS-Archive (VCM005)
3.10.6     Automatic Regression Tests (VCM006)
4 TEST EXECUTION
4.1     AT THE AIV PREMISES
4.2     IN THE VLT CONTROL MODEL
5 REFERENCE
6 VERIFICATION MATRIX
6.1     Instrument specific requirements
6.2     General requirements for Instrumentation Software

 

1          INTRODUCTION

1.1              PURPOSE

The purpose of Preliminary Acceptance Europe (PAE) is to verify the readiness of an instrument, in terms of fulfilment of its requirements, before it is shipped to Chile for commissioning.

 

According to the VLT Software Management Plan [AD 10], an Acceptance Test Plan (ATP) document has to be issued by the consortium in charge of the instrument and reviewed by ESO well before the foreseen PAE. Such a document must contain the list of tests which have to be passed successfully in order to certify that the instrument has completed the implementation phase and is ready for commissioning. As a result of PAE, an Acceptance Test Report (ATR) document has to be produced.

 

The ATR document normally consists of the ATP complemented with the results of the PAE, including any relevant comments/remarks. It has to be prepared by the consortium and agreed with ESO before being issued.

 

The present document provides the structure and contents of an ATP document and indicates which characteristics the software of an instrument, to be operated and maintained at Paranal, is expected to have in terms of the packages and standards used. In particular, it emphasizes the importance of using common software to implement common functionality, since this increases the maintainability of the final product.

 

This document is intended to be applicable to all contracts with consortia for the development of VLT instruments. It should therefore be added to the list of applicable documents in the related Statement of Work.

1.2              SCOPE

The present document describes all tests foreseen for PAE, to verify the completeness of the instrument software before shipment to Chile. It covers the whole set of functionality as described in the User Requirements document.

The Software PAE normally takes place at the location where the instrument has been assembled and integrated.

The execution of a subset of the tests in the VLT Control Model in Garching, e.g. to verify the interface with the TCS or with the Data Flow Software (Archive, Observation Handling Tool), is considered an integral part of the PAE and is mandatory for all new instruments.

The availability of automatic regression test procedures is also considered mandatory for all new instruments, and their successful execution is part of the Software PAE run.

 

This document aims to provide those responsible for instrumentation software, at ESO and in the consortia, with a template Acceptance Test Plan (ATP) document. Instrument specific ATP documents should be based on this template. They must contain at least the tests described herein (whenever applicable), and possibly add instrument specific tests.

Paragraphs in italics should be removed.

1.3              APPLICABLE DOCUMENTS

The following documents, of the exact issue shown, form a part of this document to the extent specified herein. In the event of conflict between the documents referenced herein and the contents of this document, the contents of this document shall be considered as a superseding requirement.

 

Reference   Document Number            Issue   Date             Title

[AD 01]     GEN-SPE-ESO-19400-0794     3.0     In preparation   DICB – Data Interface Control Document
[AD 02]     VLT-SPE-ESO-10000-0011     3       In preparation   VLT Software Requirements Specification
[AD 03]     VLT-PRO-ESO-10000-0228     2       In preparation   VLT Software Programming Standards
[AD 04]     VLT-PLA-ESO-10000-0441     1.0     01/05/1995       VLT Science Operation Plan
[AD 05]     VLT-MAN-ESO-17210-0667     1.2     08/10/2001       Guidelines for VLT applications
[AD 06]     VLT-SPE-ESO-17212-0001     4       13/01/2005       INS Software Specification
[AD 07]     VLT-SPE-ESO-17240-0385     4       13/01/2005       INS Common Software Specification
[AD 08]     VLT-ICD-ESO-17240-19400    2.6     17/11/1997       ICD between VCS and Archive
[AD 09]     VLT-ICD-ESO-17240-19200    1.3     07/06/2000       ICD between VCS and OH
[AD 10]     VLT-PLA-ESO-00000-0006     3       In preparation   VLT Software Management Plan
[AD 11]     VLT-SPE-ESO-xxxx-xxxx      1       xx/xx/xxxx       XXXX Control Software User Requirements

 

1.4              REFERENCE DOCUMENTS

The following documents are referenced in this document.

 

Reference   Document Number            Issue   Date             Title

[RD 01]     VLT-MAN-ESO-17200-0888     1.0     17/08/1995       VLT Common Software Overview
[RD 02]     VLT-MAN-ESO-17200-0642     4       29/04/2004       VLT Common Software Installation Manual
[RD 03]     VLT-SPE-ESO-17100-3439     1       In preparation   Paranal Network/Computers Design Description
[RD 04]     VLT-MAN-SBI-17210-0001     3.7     05/10/2001       LCU Common Software User Manual
[RD 05]     VLT-MAN-ESO-17210-0600     1.7     02/10/1998       Motor Control sw User Manual API/ACI
[RD 06]     VLT-MAN-ESO-17210-0669     1.6     02/10/1998       Motor Engineering Interface User Manual
[RD 07]     VLT-MAN-ESO-17210-0619     2.4     31/03/2004       Central Control Software User Manual
[RD 08]     VLT-MAN-ESO-17210-0707     1.6     30/09/1999       On Line Database Loader User Manual
[RD 09]     VLT-MAN-ESO-17210-0771     1.8     06/10/2001       EVH User Manual
[RD 10]     VLT-MAN-ESO-17210-0770     1.8     30/09/2001       Extended CCS User Manual
[RD 11]     VLT-MAN-ESO-17210-0690     5       31/03/2002       Panel Editor User Manual
[RD 12]     VLT-MAN-ESO-17240-0853     3       26/03/2004       INS Common sw – oslx User Manual
[RD 13]     VLT-MAN-ESO-17240-0672     1.6     25/09/1998       CCD Detectors Control Software User Manual
[RD 14]     VLT-MAN-ESO-14100-1878     1.4     01/12/2003       IRACE-DCS User Manual
[RD 15]     VLT-MAN-ESO-17240-0934     5       31/03/2004       Base ICS User Manual
[RD 16]     VLT-MAN-ESO-17240-2265     4       05/04/2004       Base OS Stub User Manual
[RD 17]     VLT-MAN-ESO-17240-1913     4       31/03/2004       Installation Tool for VLT Sw packages
[RD 18]     VLT-MAN-ESO-17240-2153     4       31/03/2004       Startup Tool Stub User Manual
[RD 19]     VLT-MAN-ESO-17220-0737     3       28/03/2002       HOS – Sequencer User Manual
[RD 20]     VLT-MAN-ESO-17220-1999     4       19/04/2004       Broker for Observation Blocks User Manual
[RD 21]     VLT-MAN-ESO-13640-1388     3       31/03/2004       FIERA CCD Controller Software User Manual
[RD 22]     VLT-MAN-ESO-17240-2240     4       31/03/2004       Common Software for Templates User Manual
[RD 23]     VLT-MAN-ESO-17240-1973     5       13/01/2005       Template Instrument User Manual
[RD 24]     VLT-MAN-ESO-17240-2606     3       31/03/2004       Base ICS GUI User Manual
[RD 25]     VLT-MAN-ESO-17200-0908     1.4     15/02/2001       Tool for Automated Testing User Manual

1.5              ABBREVIATIONS AND ACRONYMS

This document employs several abbreviations and acronyms to refer concisely to an item after it has been introduced. The following list helps the reader recall the extended meaning of each abbreviation:

AIV     Assembly Integration and Verification
ATP     Acceptance Test Plan
ATR     Acceptance Test Report
CCS     Central Control Software
CPU     Central Processing Unit
DCS     Detector Control Software
DFS     Data Flow System
ESO     European Southern Observatory
FITS    Flexible Image Transport System
GUI     Graphical User Interface
HW      Hardware
ICS     Instrument Control Software
INS     Instrumentation Software Package
I/O     Input/Output
ISF     Instrument Summary File
IWS     Instrument Workstation
LAN     Local Area Network
LCC     LCU Common Software
LCU     Local Control Unit
MS      Maintenance Software
N/A     Not Applicable
OMT     Object Modeling Technique
OO      Object Oriented
OOD     Object Oriented Design
OS      Observation Software
PAE     Preliminary Acceptance Europe
P2PP    Phase 2 Proposal Preparation
RAM     Random Access Memory
SW      Software
TAT     Tool for Automated Testing
TBC     To Be Clarified
TBD     To Be Defined
TCS     Telescope Control Software
TIM     Time Interface Module
TRS     Time Reference System
TSF     Template Signature File
UIF     (Portable) User Interface (Toolkit)
VCM     VLT Control Model
VLT     Very Large Telescope
VLTI    VLT Interferometer
VME     Versa Module Eurocard
WS      Workstation


1.6              GLOSSARY

No special definitions are introduced in this document.

1.7              STYLISTIC CONVENTIONS

The following styles are used:

bold

in the text, for commands, filenames, pre/suffixes as they have to be typed.

italic

in the text, for parts that have to be substituted with the real content before typing.

teletype

for examples.

<name>

in the examples, for parts that have to be substituted with the real content before typing.

 

bold and italic are also used to highlight words.

1.7.1                       Data Flow and Processor Model Diagrams

Data Flow and processor Model Diagrams are based on De Marco/Yourdon notation for real-time systems [RD 20].

1.8              NAMING CONVENTIONS

This implementation follows the naming conventions as outlined in [AD 03].

1.9              PROBLEM REPORTING/CHANGE REQUEST

The form described in [RD 02] shall be used.

2         OVERVIEW

The present document is structured as follows:

·         Chapter 3 gives a detailed description of the tests to be performed.

·         Chapter 4 describes the exact sequence of actions to be executed during PAE.

·         Chapter 5 contains the manual pages of the test scripts used to run the tests.

·         Chapter 6 contains the verification matrix, linking the applicable requirements to the tests.

2.1              HARDWARE REQUIREMENTS

The list below refers to the Template Instrument XXXX. It must be modified to reflect the actual requirements of each specific instrument.

 

In order to perform the whole set of tests described in this document, the following computers and hardware components must be available:

·         One Instrument Workstation

·         Two LCUs for ICS

·         One LCU for the TCCD

·         One Sparc LCU for IRACE

·         One Sparc LCU for FIERA

 

2.2              SOFTWARE REQUIREMENTS

In order to perform the whole set of tests described in this document, the following software components must be available:

·         UNIX Operating System (see [RD 02] for the types and versions supported).

·         VLT Common Software – MAR2001 or higher, installed according to [RD 02].

·         Access to the cmm Archive.

3         TEST DESCRIPTION

3.1              DOCUMENTATION

This section describes the documents produced for PAE.

3.1.1                       Instrument Software Acceptance Test Plan

It is prepared and reviewed before PAE.

It consists of the present document.

3.1.2                       Instrument Software User and Maintenance Manual (DOC001)

It is based on [RD 23] and includes:

1.        One chapter dedicated to an overview of the architecture of the whole Instrumentation sw (LAN, computers, processes, environments, and database).

2.        One chapter dedicated to the installation of the whole Instrumentation Software.

3.        One chapter dedicated to observation scenarios, including a layout of the GUIs.

4.      One chapter dedicated to Templates.

3.1.3                       Instrument Software Acceptance Test Report

It is produced after PAE.

It is derived from the present document, in particular chapter 4, by adding the results and comments from PAE.

3.2              STANDARDS

The following aspects of the Instrumentation Software will be verified through code inspection.

3.2.1                       Programming Standards (STD001)

Compliance with the Software Programming Standards ([AD 03]) is verified through code inspection on a random sample of files (around 10% of the total source code) covering all main categories (C++, C, tcl).

Since this verification takes time, it is recommended to do it separately before the actual PAE takes place.

3.2.2                       Standard Architecture (STD002)

The LAN and hardware platforms (WS, LCUs), including their names, conform to what is specified in [RD 03].

For VLTI, VST and La Silla instruments, an equivalent reference document should exist.

3.2.3                       DCS packages (STD003)

DCS uses the standard DCS package FIERA ([RD 21]), CCD ([RD 13]) or IRACE ([RD 14]).

Exceptions must be justified and agreed upon at FDR at the latest.

3.2.4                       ICS package (STD004)

ICS uses the base ICS package icb [RD 15] and icbpan [RD 24].

The specific code developed for the instrument ICS must be justified and documented.

3.2.5                       OS package (STD005)

OS uses the common OS package BOSS, [RD 16].

The specific code developed for the instrument OS must be justified and documented.

3.2.6                       Startup procedures (STD006)

Startup/Shutdown procedures are based on the common tool stoo, [RD 18].

If not based on stoo, at least a short description of the startup procedure (processes started, initialized attributes, commands sent) must be included in the documentation (see 3.1.2).

3.2.7                       Rules and package for templates (STD007)

Templates use the common library tpl and follow the rules defined in [RD 22].

3.2.8                       Instrument Configuration files (STD008)

All files dealing with the instrument configuration for Paranal belong to one single dedicated module (xxmcfg).

The User Manual describes the procedures to be followed to keep any change to the Instrument configuration parameters under software configuration control.

3.2.9                       Users name (STD009)

The target Instrument WS defines two users:

1.        xxxxmgr, responsible for the installation

2.        xxxx, who runs the instrument sw.

For both users, INTROOT and INS_ROOT must be defined according to the standard adopted at Paranal:

·         INTROOT  set to /vlt/XXXX/INTROOT

·         INS_ROOT set to /data/XXXX/INS_ROOT
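As an illustration only (the exact mechanism, e.g. login profile or PECS settings, is instrument and site specific), the two variables could be defined for both users as in the following sketch:

    # hypothetical excerpt of the login environment of users xxxxmgr and xxxx
    export INTROOT=/vlt/XXXX/INTROOT
    export INS_ROOT=/data/XXXX/INS_ROOT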

3.3              INSTALLATION

All tests described in this section must be executed at the AIV premises as user xxxxmgr

3.3.1                       Make sure that the Instrument Software is built from scratch (INS001)

It is possible to rebuild from scratch the complete instrument software and related environments.

Before running the installation procedure, the old contents of

$INTROOT, $INS_ROOT, $VLTDATA/ENVIRONMENTS, $VLTDATA/config

are (re)moved, to verify that installation can be done from scratch.
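A manual equivalent of the dedicated clean-up script (xxmmpeTestClean in the Template Instrument) could look like the sketch below; the actual script and the exact list of environments to be removed are instrument specific:

    # hypothetical manual clean-up before a from-scratch installation
    mv $INTROOT $INTROOT.old
    mv $INS_ROOT $INS_ROOT.old
    rm -rf $VLTDATA/ENVIRONMENTS/*
    rm -rf $VLTDATA/config/lxx*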

3.3.2                       Usage of pkgin to build the Instrument Software (INS002)

The Instrument Software installation is based on pkgin ([RD 17]).

 

In any case, there must be an automatic installation procedure. To minimize the downtime of the target host during software upgrades at Paranal, verify that the installation procedure is or can be split into two main phases (as pkgin does):

1.       Creation of the INTROOT, placing there all files needed by the instrument software, creation of CCS and LCU environments.  It should be possible to execute this phase off-line, not necessarily on the target WS.

        It should be possible to copy the result (INTROOT) to the target host.

2.       The rest of the installation (environment initialization and startup, scan links creation and scan system startup) is always executed at the target host. If possible, this phase should not need access to the sources, only to the INTROOT produced by the first phase.

It must be possible to execute each of these steps with one single UNIX shell command.
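For reference, the single-command invocation of the first phase used in this plan (see INS002 in chapter 4) is sketched below; how the result is then transferred to the target host (tar, scp, etc.) is instrument specific and only illustrated here:

    # phase 1: off-line build, not necessarily on the target WS
    cd $HOME/XXXXSource
    export TARGET=INTEGRATION
    pkginBuild xxmmpe

    # illustration only: copy the resulting INTROOT to the target WS for phase 2
    tar cf - -C /vlt/XXXX INTROOT | ssh <target-ws> 'tar xf - -C /vlt/XXXX'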

3.3.3                       Access to cmm Archive (INS003)

The complete code is accessible and can be retrieved from the cmm Archive. This can be verified by checking the contents of the file xxins/config/xxinsINSTALL.cfg.

In order to be able to repeat the tests at any time with exactly the same configuration, all module versions are explicitly registered in this file.

3.3.4                       Installation failures check (INS004)

The installation procedure, being based on pkgin, allows easy tracing of failures and possible reasons.

3.3.5                       Instrument package for P2PP (INS005)

As a result of the build and installation procedure, the Instrument Packages XXXX.zip (observations) and XXXX_tec.zip (maintenance), as defined by P2PP, are produced and placed in $INTROOT/config.
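A quick check of this requirement (also listed as INS005 in chapter 4) can be done with a standard directory listing:

    # both Instrument Packages must be present after the build
    ls -l $INTROOT/config/XXXX.zip $INTROOT/config/XXXX_tec.zip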

3.4              SUB-SYSTEMS TEST

All tests described in this section must be executed at the AIV premises as user xxxxmgr

3.4.1                       DCS test (DCS001)

Run dedicated test procedure(s) which exercise, for every individual detector system (DCS):

·         the proper startup/shutdown

·         state change

·         execution of the main operations when online:

-          one single exposure, for all implemented read-out modes, or a selection of them if there are too many;

-          verification that FITS files are properly saved in $INS_ROOT/SYSTEM/DETDATA.

An example is provided in xxmmpe/test/xxmmpeTestDCS.

It must be possible to run the same test under tat (see [RD 25]).
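As used later in chapter 4 (DCS001), the stand-alone invocation of the example procedure is simply:

    # run the DCS test procedure of the Template Instrument
    cd $HOME/XXXXSource/xxmmpe/test
    ../bin/xxmmpeTestDCS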

3.4.2                       ICS special device LCU test  (ICS001)

For each ICS special device, run from the vxWorks shell a low-level test which exercises the device functionality by directly accessing the associated driver.

Examples are available in ic0sen/test.
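The corresponding steps from chapter 4 (ICS001) are sketched below; the LCU name lxxics2, the module xxidev and the device /iser0 are Template Instrument names used purely as illustration:

    # from the IWS, log into the ICS LCU
    rlogin lxxics2
    # at the vxWorks shell prompt, load the device module and run its low-level test
    -> lcubootAutoLoadNoAbort 1,"xxidev",0
    -> xxidevTestVx "/iser0"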

3.4.3                       ICS special device test  (ICS002)

Run for each ICS special device a self-test procedure, which exercises:

·         state change

·         SETUP all functions in all possible named positions (or samples over a continuous range),

·         STATUS –header

An example is available in xxmmpe/test/xxmmpeTestICS.

It must be possible to run the same test under tat (see [RD 25]).

3.4.4                       ICS test  (ICS003)

Run the ICS self test procedure, based on inscSelfTestICS. It exercises:

·         the proper startup/shutdown

·         state change

·         SETUP all functions in all possible named positions (or samples over a continuous range),

·         STATUS -header -dumpFits.

An example is available in xxmmpe/test/xxmmpeTestICS.

It must be possible to run the same test under tat (see [RD 25]).

3.5              GRAPHICAL USER INTERFACE

All tests described in this section must be executed at the AIV premises as user xxxx

3.5.1                       DCS stand-alone GUI (GUI001)

The DCS stand-alone GUI allows performing all main operations foreseen:

·         startup/shutdown

·         go online

·         set simulation level

·         define a setup

·         execute an exposure.

3.5.2                       ICS stand-alone GUI (GUI002)

The ICS stand-alone GUI is based on icbpan and allows performing all main operations foreseen:

·         startup/shutdown

·         go online

·         set global simulation level

·         set single device simulation level

·         define a setup

·         execute a setup

3.5.3                       OS Control GUI (GUI003)

The OS Control GUI has the following characteristics:

·         It is complementary (not alternative) to BOB, in particular

·         there is no START button

·         there are PAUSE, CONTINUE, CHANGE exp. time, ABORT one single exposure, whenever applicable.

·         It shows a summary of the current instrument status

·         It shows the current instrument mode

·         It shows the main ongoing activities (e.g. status of running exposures).

3.5.4                       OS Status GUI (GUI004)

The OS Status GUI shows the detailed status of the whole instrument and its devices.

3.5.5                       GUIs layout (GUI005)

GUIs used during observations fit into the scheme and space adopted by Paranal.

In particular, they fit into two screens:

1.        Main screen for BOB (left) and OS control (right).

2.        Second screen for image display with RTD.

3.6              OS

All tests described in this section must be executed at the AIV premises as user xxxxmgr

3.6.1                       Startup/Shutdown (OS001)

Run the startup/shutdown procedure, based on the stoo package, for the whole instrument.

Exercise also the state change commands (STANDBY, ONLINE, OFF).

3.6.2                       Single exposure (OS002)

Execute, through a dedicated test script, one single exposure for each observing mode, involving all sub-systems (DCSs, ICS), and verify the result (FITS file) and its contents. Verify also that the generated FITS file is placed by volac in the right directory for archiving: $INS_ROOT/SYSTEM/ARCDATA.

An example is available in xxmmpe/test/xxmmpeTestOS.

It must be possible to run the same test under tat (see [RD 25]).

3.6.3                       Templates (OS003)

Execute, through a dedicated test OB (.obd file), the complete set of implemented templates in sequence.

An example is available in xxmmpe/test/xxmmpeTestTPL.

It must be possible to run the same test under tat (see [RD 25]).

 

The purpose is not to verify the scientific result, but only the technical result.

In particular, the run time of such an OB should not exceed one hour and should preferably be below 15 minutes.

Templates which require the availability of sub-systems (typically acquisition templates, which require the telescope) should preferably implement a simulation of the missing sub-systems. Alternatively, they should not be part of the complete test OB and should instead be included in a separate dedicated test OB, to be run only when the sub-systems are available.

3.6.4                       Interface P2PP-BOB (OS004)

Verify that the P2PP and the Instrument Package are properly installed on the Observation Handling Workstation.

Define an OB with the P2PP tool and fetch it from BOB. Execute it from BOB.

For test purposes P2PP can be installed and started on the Instrument Workstation (see manual page of inscP2PPInstall).

3.7              MS

All tests described in this section must be executed at the AIV premises as user xxxxmgr

3.7.1                       Technical templates (MS001)

All MS procedures are implemented in the form of technical templates.

Exceptions should be justified and agreed upon.

An example is available in xxmmpe/test/xxmmpeTestMS.

It must be possible to run the same test under tat (see [RD 25]).

3.7.2                       Results format (MS002)

The results produced by MS procedures are archived either in form of an ASCII file, with the same format supported by the CCS sampling tool (for those results obtained through this tool or equivalent), or as part of the operational logs file (short-FITS format).

3.8              ALARMS

All tests described in this section must be executed at the AIV premises as user xxxx

3.8.1                       Emergency cases (ALM001)

The main emergency conditions that may affect the instrument are identified and documented.

3.8.2                       Simulate alarms (ALM002)

Alarms corresponding to emergency conditions are implemented in the software.

 

If possible, check that these alarms work. If it is impossible to test the real cases, the hardware shall provide means to simulate the alarm conditions. Software simulation shall be used only if there is really no other alternative. Special care shall be taken for Emergency Stops, if any.

3.8.3                       Configure alarm conditions (ALM003)

Alarm thresholds (if applicable, e.g. LN2 tank level, temperature threshold) can be set through a GUI.

3.9              AUTOMATIC REGRESSION TESTS

All tests described in this section must be executed at the AIV premises as user xxxxmgr

3.9.1                       Full cycle (TAT001)

It must be possible to verify with an automatic procedure, i.e. with no user interaction, that the complete Instrument Software can be rebuilt from scratch, that the environments can be created and started, and that all sub-system tests are performed successfully. This procedure must be based on the VLT standard Tool for Automated Testing (tat, see [RD 25]).

An example is available in xxmmpe/test/TestList.lite
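The full regression cycle is launched with the tat command from the test directory of the installation module, as in the TAT001 step of chapter 4:

    # run all automatic regression tests at the AIV premises
    cd $HOME/XXXXSource/xxmmpe/test
    export TARGET=INTEGRATION
    tat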

3.10          VLT CONTROL MODEL

All tests described in this section must be executed on the VLT Control Model (VCM) in Garching as user xxxxmgr.

3.10.1                   Make sure that the Instrument Software is built from scratch (VCM001)

See INS001.

3.10.2                   Build the Instrument Software for the VCM (VCM002)

Because of the different hardware available in the VCM, the installation module to be used is xxmgar. Files in this module contain all the definitions characterizing the Garching configuration.
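The corresponding build command, as executed in the VCM002 step of chapter 4, is:

    # build the Instrument Software for the VLT Control Model configuration
    cd $HOME/XXXXSource
    export TARGET=CM_FULL
    pkginBuild xxmgar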

3.10.3                   Templates (VCM003)

Execute, through a dedicated test OB (.obd file), the complete set of implemented templates in sequence.

3.10.4                   Interface P2PP-BOB (VCM004)

Verify that P2PP is running on the VCM OH Workstation and OBs can be transferred to BOB (see instructions under http://websqa.hq.eso.org/sdd/bin/view/VLTSW/IWSDfsSetup).

3.10.5                   Interface OS-Archive (VCM005)

Verify that all FITS files generated when running an OB are transferred to the online Archive WS (see instructions under http://websqa.hq.eso.org/sdd/bin/view/VLTSW/IWSDfsSetup).

 

3.10.6                   Automatic Regression Tests (VCM006)

Execute the automatic regression test procedure for the VCM configuration.

An example is available in xxmgar/test/TestList.lite

4         TEST EXECUTION

This chapter describes, in tabular form, the sequence of actions/commands performed during the PAE to run the complete set of tests/verifications.

 

The last column in the table is reserved for notes and remarks to be added during PAE and included in the ATR document.

 

The names of commands and scripts refer to the Template Instrument XXXX. They have to be adapted to each specific instrument.

It is assumed that the installation module for the location where AIV takes place is named xxmmpe. It must be changed according to the actual AIV location.

4.1              AT THE AIV PREMISES

 

Test ID

Action/Command

Expected results

Notes/comments

DOC001

Check contents of Software User and Maintenance Manual

Document structure and contents similar to [RD 23]

 

 

 

 

STD001

Inspect around 10% of the code

Compliance with [AD 03]

 

 

 

 

 

STD002

Check contents of Software User and Maintenance Manual

Compliance with [RD 03] or equivalent

 

 

 

 

STD003

Code and documentation inspection

Standard DCS packages are used. Exceptions are explained, justified and agreed by ESO.

 

 

STD004

Code and documentation inspection

Standard ICS package is used. Exceptions are explained, justified and agreed by ESO.

 

 

STD005

Code and documentation inspection

Standard OS package is used. Exceptions are explained, justified and agreed by ESO.

 

 

 

STD006

Code and documentation inspection

Standard startup package is used. Exceptions are explained, justified and agreed by ESO.

 

 

STD007

Code and documentation inspection

Standard templates package is used. Exceptions are explained, justified and agreed by ESO.

Compliant with rules described in [RD 22]

 

STD008

Code and documentation inspection

All configuration files are in module xxmcfg. Manual describes clearly procedures to update the instrument configuration.

 

STD009

Login on the Instrument WS as user xxxxmgr and xxxx

It is possible to login as xxxxmgr and xxxx.

INTROOT and INS_ROOT set as in section 3.2.9

 

 

INS001

Run as user xxxxmgr:

mv $HOME/XXXXSource    $HOME/XXXXSource.old

mkdir $HOME/XXXXSource

cd $HOME/XXXXSource

cmmCopy xxmmpe

cd xxmmpe/test; make

export TARGET=INTEGRATION

../bin/xxmmpeTestClean

$INTROOT, $INS_ROOT, $VLTDATA/ENVIRONMENTS are empty.

$VLTDATA/config/lxx* files do not exist.

Same check on DCS SLCUs, if any.

 

INS002

Run as user xxxxmgr:

cd $HOME/XXXXSource

export TARGET=INTEGRATION

pkginBuild xxmmpe

No errors from pkginBuild.

INTROOT and INS_ROOT contain all files needed to run the instrument software.

 

 

INS003

Check contents of xxmmpe/config/xxmmpeINSTALL.cfg

Only cmm modules are used to build the software from scratch.

For each module, the version is specified.

 

INS004

Check contents of INSTALL/pkginBuild.err

File does not contain errors.

 

 

 

 

 

INS005

Check contents of $INTROOT/config

 

The following files exist:

XXXX.zip

XXXX_tec.zip

 

 

 

DCS001

Run as user xxxxmgr:

cd $HOME/XXXXSource/xxmmpe/test

../bin/xxmmpeTestDCS

The script terminates without errors.

 

 

 

 

ICS001

Login on the LCU lxxics2:

rlogin lxxics2

From the vxWorks shell run:

-> lcubootAutoLoadNoAbort 1,"xxidev",0

-> xxidevTestVx "/iser0"

The program executes without errors everything specified in 3.4.2

 

 

 

ICS002

ICS003

Run as user xxxxmgr:

cd $HOME/XXXXSource/xxmmpe/test

../bin/xxmmpeTestICS

 

The program executes without errors everything specified in 3.4.3 and 3.4.4

 

 

 

GUI001

Run as user xxxx:

xxinsStart -panel TCCD

xxinsStart -panel FIERA

xxinsStart -panel IRACE

xxinsStart -panel TCCD_RTD

xxinsStart -panel FIERA_RTD

xxinsStart -panel IRACE_RTD

It is possible to execute all operations described in 3.5.1 on each of the DCS panels

 

 

 

GUI002

Run as user xxxx:

xxinsStart -panel ICS

It is possible to execute all operations described in 3.5.2

 

 

 

 

GUI003

Run as user xxxx:

xxinsStart -panel OS_CONTROL

It is possible to execute all operations described in 3.5.3

 

 

 

 

GUI004

Run as user xxxx:

xxinsStart -panel OS_STATUS

It is possible to execute all operations described in 3.5.4

 

 

 

 

GUI005

Run as user xxxx:

xxinsStartup

Wait until the startup configuration panel pops up.

Push the START button.

The default panels fit into two screens and the layout is as described in 3.5.5

 

 

 

OS001

OS002

Run as user xxxxmgr:

cd $HOME/XXXXSource/xxmmpe/test

../bin/xxmmpeTestOS

 

The script executes without errors everything specified in 3.6.2

Verify that the results are stored in FITS file(s) and check contents.

 

OS003

Load from BOB panel the file

XXXX_gen_tec_SelfTest.obd

Run from BOB panel that OB.

Verify that this OB contains all observation templates.

The OB terminates successfully

 

 

 

 

OS004

Start p2pp as user xxxxmgr (see 3.6.4)

Build an OB, which produces at least one FITS file

Fetch the OB from the BOB panel and start it.

The OB can be defined with the P2PP GUI and executed without errors from BOB.

 

MS001

Load from BOB panel the file

XXXX_gen_tec_MSTest.obd

Run from BOB panel that OB.

Verify that this OB contains all maintenance templates.

The OB terminates successfully

 

 

 

 

MS002

Check results of the execution of XXXX_gen_tec_MSTest.obd

The format is the same as specified in 3.7.2

 

 

 

 

ALM001

Check contents of Software User and Maintenance Manual

Emergency cases are identified and documented

 

 

 

 

ALM002

Run as user xxxxmgr:

cd $HOME/XXXXSource/xxmmpe/test

../bin/xxmmpeTestAlarms

 

All foreseen software alarms are one by one triggered.

Verify that alarms simulated by HW trigger software alarms

 

ALM003

Run as user xxxx:

xxinsStart -panel ALARM

It is possible to configure alarm conditions through a GUI

 

 

 

TAT001

Run as user xxxxmgr:

cd $HOME/XXXXSource/xxmmpe/test

export TARGET=INTEGRATION

tat

PASSED

 

 

 

 

 

 

4.2              IN THE VLT CONTROL MODEL

VCM001

Run as user xxxxmgr:

mkdir -p $HOME/XXXXSource

cd $HOME/XXXXSource

rm -rf xxmgar

cmmCopy xxmgar

cd xxmgar/test; make

export TARGET=CM_FULL

../bin/xxmgarTestClean

$INTROOT, $INS_ROOT, $VLTDATA/ENVIRONMENTS are empty.

$VLTDATA/config/lxx* files do not exist.

Same check on DCS SLCUs, if any.

 

VCM002

Run as user xxxxmgr:

cd $HOME/XXXXSource

export TARGET=CM_FULL

pkginBuild xxmgar

 

No errors from pkginBuild.

INTROOT and INS_ROOT contain all files needed to run the instrument software.

 

 

VCM003

Make sure that TCS is online

Start the Instrument Software. Run:

xxinsStart

Load from BOB panel the file

XXXX_gen_tec_SelfTest.obd

Run from BOB panel that OB.

Verify that this OB contains all observation templates.

The OB terminates successfully

 

 

 

 

 

VCM004

Start p2pp on the OH WS (see 3.10.4).

Build an OB, which produces at least one FITS file

Fetch the OB from the BOB panel and start it.

The OB can be defined with the P2PP GUI and executed without errors from BOB.

 

VCM005

Make sure that the on-line archive is active (see 3.10.5)

On the on-line archive WS verify that the FITS files produced with the last OB executed have been transferred.

The FITS files are on the on-line archive WS disk.

 

 

 

 

VCM006

Run:

cd $HOME/XXXXSource/xxmgar/test

export TARGET=CM_FULL; tat

export TARGET=CM_WS; tat

 

 

PASSED

PASSED

 

 

 

5         REFERENCE

This section contains the manual pages of the test scripts/procedures implemented.

Only manual pages providing additional information needed to properly execute the tests have to be presented here.

 

 

6         VERIFICATION MATRIX

6.1              Instrument specific requirements

The following table contains the links between the instrument specific requirements, defined in [AD 11], and the corresponding test.

Req.     TEST                    DESCRIPTION

REQ01    ICS003                  List of devices and assemblies
REQ02    ICS003                  Lamps in stand-by state
REQ03    ICS003                  Derotator modes
REQ04    ICS003                  Measures to overcome mechanical backlash
REQ05    ICS003                  Gratings setup parameters
REQ06    DOC001                  Sensors sampling period
REQ07    DCS001                  UV detector size
REQ08    DCS001                  IR detector size
REQ09    DOC001                  Cryogenic devices kept to the necessary minimum
REQ10    DOC001                  List of observing modes
REQ11    OS002                   Automatic settings in UV spectroscopy
REQ12    OS002                   Automatic settings in IR spectroscopy
REQ13    OS002                   Automatic settings in dichroic spectroscopy
REQ14    OS002                   Automatic settings in IR imaging
REQ15    OS001                   Description of state OFF
REQ16    OS001                   Description of state LOADED
REQ17    OS001                   Description of state STANDBY
REQ18    OS001                   Description of state ONLINE
REQ19    STD008                  Save and retrieve Instrument Configuration
REQ20    STD008                  User acknowledgement before changing Instrument Configuration
REQ21    STD008                  Protection of Instrument Configuration files
REQ22    VCM006                  Device hardware simulation
REQ23    VCM006                  Support full hardware simulation
REQ24    DCS001                  Data acquisition maximum speed
REQ25    DCS001                  Maximum Software overhead for data acquisition
REQ26    GUI001                  Display all images
REQ27    OS002                   Maximum delay between acquisition and display
REQ28    GUI001                  Mouse driven operations on image display
REQ29    OS002                   Image files in FITS format
REQ30    OS002                   FITS header conform to ESO standards
REQ31    OS002                   Sensors information in the FITS header
REQ32    DOC001                  Typical disk storage requirement for one night
REQ33    DOC001                  Maximum disk storage requirement for one night
REQ34    OS002 OS003 MS001       Archive all image FITS files
REQ35    OS003                   Archive in background
REQ36    OS003 MS001             On-line data processing on the IWS
REQ37    ICS003 DCS001 OS002     Information to be logged
REQ38    GUI003                  Information displayed in the OS control GUI
REQ39    GUI004                  Information displayed in the OS status GUI
REQ40    GUI005                  User Station screen 1 contents
REQ41    GUI005                  User Station screen 2 contents
REQ42    OS004                   P2PP on dedicated screen
REQ43    N/A                     Off-line data reduction on dedicated WS and screen
REQ44    OS003                   Functionality required from TCS
REQ45    ALM001 ALM002           Hardware interlocks
REQ46    OS003                   Science operations according to the Science Operations Plan
REQ47    OS003                   Parameters during science operations in high level units
REQ48    OS002                   Check for parameters value validity
REQ49    OS003                   Parallel setup of devices
REQ50    OS003                   Lamps with warm-up time switched on at the first setup
REQ51    OS003                   Continuous derotator motion during integrations
REQ52    MS001                   Parameters during maintenance operations in high level or engineering units
REQ53    MS001                   Maintenance operations supported by Templates
REQ54    OS003 MS001             List of Templates
REQ55    OS002                   Maximum time for bias exposure
REQ56    all                     List of scripts/procedures for the test Software
REQ57    ALM002 ALM003           Software alarms warn for approaching hardware interlock conditions
REQ58    ALM002                  Warnings shall be logged
REQ59    ALM002                  Warnings treated as low priority alarms
REQ60    ALM002                  Alarms displayed with standard tool
REQ61    GUI005                  Alarms GUI permanently displayed in the User Station
REQ62    ALM001                  List of Alarms
REQ63    ALM002                  Alarms shall be logged
REQ64    ALM002 ALM003           Sounds associated to alarms
REQ65    ALM002                  Alarms monitoring also in STANDBY
REQ66    ICS003                  Initialization maximum time
REQ67    ICS003                  Setup maximum time

6.2              General requirements for Instrumentation Software

The following table contains the links between the general requirements for Instrumentation Software, defined in [AD 06], and the corresponding test.

 

Req.     TEST                                     DESCRIPTION

INS01    DOC001                                   Define Instrument ID and prefix in agreement with ESO
INS02    DOC001 ICS001                            Time critical synchronization via Time Reference System
INS03    STD002                                   Naming conventions for Instrument LAN nodes
INS04    INS003                                   Instrument Software divided into the standard INS Modules
INS05    INS002 STD006                            Facilities to build, install, startup and shutdown must be available
INS06    OS003                                    On-line data processing done within templates, if no real-time requirements
INS07    OS003                                    ESO approval required for on-line data processing
INS08    OS003                                    ESO approval required for the choice of on-line data processing tool
INS09    GUI001 GUI002 GUI003 GUI004 GUI005       All GUIs based on the VLT panel editor
INS10    all                                      Test Software part of the mandatory deliverables. Standard minimum set applicable
INS11    INS001 INS002                            Use Template Instrument to build a new instrument from scratch
INS12    INS003                                   Use cmm for Software configuration control management (Archive)
INS13    INS003                                   Follow cmm modules naming conventions
INS14    STD001                                   VLT programming standards applicable to Instrumentation Software
INS15    STD006                                   Instrument configuration under Software configuration control
INS16    STD006                                   Instrument configuration files in one single cmm module belonging to MS
INS17    STD002                                   One CCS environment for each LAN node
INS18    STD002                                   Use CCS-lite
INS19    STD002                                   CCS environment name same as LAN node name
INS20    STD009                                   Two users for each instrument
INS21    STD003                                   Use CCD Software for Technical CCDs
INS22    STD003                                   Use IRACE Software for Infra-red scientific cameras
INS23    STD003                                   Use FIERA Software for optical scientific cameras
INS24    STD003                                   Use dxf for data transfer between nodes
INS25    STD003                                   Use rtd for Real-Time display
INS26    STD004                                   Use icb for ICS processes and icbpan for ICS GUIs
INS27    STD005                                   Use boss for OS processes
INS28    STD007                                   Use tpl for templates
INS29    STD003 STD004 STD005                     Use oslx for FITS keywords handling
INS30    INS002 INS004                            Use pkgin for build and installation
INS31    STD004 STD005 STD008                     Use ctoo for Instrument configuration files handling
INS32    STD006 OS001                             Use stoo for startup and shutdown
INS33    ICS003                                   ICS controls all devices, except detectors
INS34    STD003 STD004 STD005                     ICS, DCS and OS implement standard states
INS35    STD003 STD004 STD005                     ICS, DCS and OS implement standard commands
INS36    STD008                                   ICS, DCS and OS configuration parameters values shall not be hard-coded
INS37    STD003 STD004 ICS002                     ICS and DCS LCU status stored in the database
INS38    DCS001 ICS003 OS002                      ICS, DCS and OS parameters values shall not be changed until a new command requests for it
INS39    DCS001 ICS003 OS002                      ICS, DCS and OS set and actual values stored in separate database attributes
INS40    ICS003                                   Status of ICS on-going and completed actions shall be accessible
INS41    DCS001 ICS003 OS002                      ICS, DCS and OS Set values shall be checked for validity
INS42    DCS001 ICS003 OS002                      ICS, DCS and OS keywords shall be syntactically checked against dictionary
INS43    STD003 STD004                            Use CCS scan system to transfer ICS and DCS parameters values from LCU to IWS database
INS44    DCS001 ICS003                            ICS and DCS part of FITS header shall contain full status information and some statistics
INS45    VCM003                                   ICS and DCS part of FITS header shall be produced also in simulation
INS46    OS003                                    ICS, DCS and OS keywords in the FITS header should be syntactically checked against dictionary and comply with the rules defined in the Data Interface Control Document
INS47    GUI001 GUI002                            ICS and DCS stand-alone GUI must be available
INS48    DCS001 ICS003                            ICS and DCS complete logging: commands, errors, LCU boot, sensors values, movements
INS49    VCM006                                   ICS and DCS simulation at WS level
INS50    VCM003                                   ICS devices simulation at LCU level
INS51    GUI001 GUI002                            ICS and DCS simulation shall not be hidden to the user
INS52    VCM003                                   ICS and DCS simulation shall be indicated in the FITS header
INS53    DOC001                                   Implementation of ICS special devices must be approved by ESO
INS54    INS003                                   ICS cmm modules follow the naming conventions
INS55    DCS001                                   One DCS responsible for each camera (one camera may control a mosaic)
INS56    OS002                                    Handling of FITS header size between DCS and OS
INS57    VCM003                                   DCS DFE simulation at LCU level
INS58    N/A                                      DCS hw simulation at DFE level
INS59    VCM006                                   DCS readout frames simulation supported
INS60    STD003                                   DCS must support highest possible duty cycle
INS61    STD003                                   DCS DUMP command for image re-transmission
INS62    STD003                                   Save readout data also in case of failure
INS63    DCS001                                   DCS data saved in FITS format uncompressed
INS64    DCS001                                   DCS data saved in binary format
INS65    OS002                                    DCS data saved on dedicated disk not concurrently accessed by other applications
INS66    STD003                                   DCS must check for disk space availability before starting an exposure
INS67    DCS001                                   Windowed and binned readout supported
INS68    GUI001                                   DCS data optionally displayed with different orientation
INS69    STD003                                   DCS responsible for shutter time. If shutter controlled by ICS, use TRS for synchronization
INS70    STD003                                   Actual exposure time should take into account shutter opening and closing time
INS71    STD003                                   DCS cmm modules follow the naming conventions
INS72    OS002                                    OS Server responsible for coordination of single exposures
INS73    OS003                                    OS Server shall handle overlapping exposures
INS74    OS003                                    OS Server shall handle parallel exposures
INS75    OS002                                    Results of exposures shall always be archived (FITS format)
INS76    OS002 OS003                              OS Archiver shall not affect the observing cycle. Archiving errors shall be reported to BOB
INS77    OS003                                    FITS files containing results of exposures shall follow naming conventions
INS78    OS003                                    OS includes templates
INS79    N/A                                      SOS responsible for coordination of exposures involving more than one instrument
INS80    GUI003                                   Mandatory OS parameters are available
INS81    OS002                                    Use standard exposure types
INS82    OS003                                    Follow rules for FITS files and keywords contained in the Data Interface Control Document
INS83    OS003                                    Implement complex operations in Templates
INS84    DOC001 OS003                             Implement special functionality (e.g. auto-guiding, active optics) in separate OS process
INS85    MS001                                    All AIV and Commissioning activities supported by technical templates
INS86    GUI003                                   Implement OS Control panel
INS87    GUI004                                   Implement OS Status panel
INS88    OS004 VCM004                             Follow ICD between OS and OH
INS89    VCM005                                   Follow ICD between OS and Archive
INS90    INS003                                   OS cmm modules follow the naming conventions
INS91    STD008                                   All Instrument configuration files in one cmm module belonging to MS
INS92    INS003                                   All dictionary files in one cmm module
INS93    INS003                                   Instrument configuration parameters protected from not authorized users
INS94    STD008                                   Use standard mechanism to control Instrument configuration changes
INS95    STD008                                   Instrument configuration changes shall be logged in FITS format
INS96    MS001 INS005                             MS procedures implemented as technical templates. A Technical Instrument Package must exist
INS97    MS002                                    Results of technical templates logged in FITS format or in CCS sampling tool format
INS98    INS003                                   MS cmm modules follow the naming conventions
INS99    N/A                                      ESO authorization needed if p2pp complemented by a dedicated OSS tool for OB preparation
INS100   N/A                                      Special tool for target selection, if needed, part of OSS
INS101   N/A                                      OSS cmm modules follow the naming conventions
INS102   DOC001                                   Alarms must be listed in ISFS document and detailed in ISDD document
INS103   ALM002                                   Alarms implementation compatible with the CCS Alarm System
INS104   ALM002                                   Alarms triggered only if the value of the related database attribute is up-to-date
INS105   ALM002                                   Alarm database attributes associated to sensors must follow a standard naming scheme
INS106   GUI004                                   Alarm conditions displayed in the OS status panel
INS107   GUI005                                   Panels shall not pop-up and disappear automatically
INS108   GUI005                                   Static placement of panels
INS109   GUI005                                   A GUI shall not automatically close another panel
INS110   GUI005                                   User Station must follow standard configuration (2 screens). Extensions must be agreed with ESO
INS111   OS003                                    Follow standard interface to TCS/VLTI
INS112   INS002                                   Installation module shall follow the standard naming convention
INS113   OS001                                    Instrument specific adds-on to stoo functionality must be in the installation module
INS114   DCS001 ICS003                            Restart one INS module without restarting the whole INS Software
INS115   DCS001 ICS003                            ICS and DCS must provide own startup/shutdown scripts for the stand-alone mode
INS116   DOC001                                   Documentation in same electronic format used at ESO
INS117   DOC001                                   Instrument Software architecture must follow the scheme described in the INS Software Specs
INS118   STD001                                   Use VLT common software wherever possible
INS119   DOC001                                   Software activities included in the Instrument Software Management Plan
INS120   N/A                                      Instrument Software User Requirements document reviewed before PDR
INS121   N/A                                      Freeze Software User Requirements at PDR
INS122   N/A                                      Review Software Functional Specification at PDR. Recommended a few iterations before
INS123   N/A                                      Before PDR run Template Instrument, build Instrument Software skeleton, check performances
INS124   N/A                                      Review Software Design document(s) at FDR. Recommended a few iterations before
INS125   N/A                                      Review Acceptance Test Plan document at FDR
INS126   N/A                                      Before FDR Instrument skeleton according to actual configuration, no code except for prototypes
INS127   TAT001 VCM006                            Software test procedures automatic and reproducible, based on tat
INS128   DOC001                                   Accept. Test Plan, User and Maintenance manual ready for PAE. Recommended a few iterations before
INS129   TBD                                      Acceptance Test Report produced as result of PAE
INS130   DOC001                                   Agree with ESO intermediate check points between FDR and PAE
INS131   all                                      PAE at integration premises and in the VLT Control Model
INS132   DOC001 INS003                            Software and documentation under cmm
INS133   OS003                                    OS shall be able to handle secondary guiding TCCDs in parallel to science exposures
___oOo___