Single Field Interferometry
The monthly integration for SFI has been performed based on ACS-6.0.2.
The software has been tagged with TRUNK-SFI-2006-12-AfterMerge-ITER-3.
These are the patches we received and integrated, subsystem by subsystem:
| SUBSYSTEM NAME | TAG (corresponding to the last bug fixing) |
| / | / |
To validate the software delivered for R4, the tests from the validation test suite on the ITS twiki page have been run.
A) Short summary:
| Number | Test | Type | Result |
| 1 | MDB 0th order test | simple test on sending monitor points to the archive | PASS |
| 2 | APDM test | check on the consistency of the APDM | FAIL |
| 3 | Control-DataCapturer test | mini-integration test | FAIL |
| 4 | Optical Pointing e2e test | e2e test processing an OP SB | PASS |
| 5 | Shared Simulator e2e test | e2e test processing an SS SB | PASS |
| 6 | Holography e2e test | e2e test processing an HG SB | PASS |
| 7 | ACACORR integration test | test suite on the ACACORR software | FAIL |
Percentage of success on the total number of integration tests: 57.14% (4 of the 7 tests passed).
The validation (done by Lindsey) of the ASDM produced for SFI showed that two problems still exist:
1. There is a problem with the ScanTable: the scanIntent column is not being correctly filled for the calibrator sources. All of these are assigned the intent value Common instead of Phase, PhaseCurve, Seeing, etc., as appropriate (see the sketch below).
2. Only one of the scans contains 2 subscans. This is scan 12, which is a target scan. Although scan/subscan divisions are under the control of the script, it is not clear why this happens only once and only in this place. The scan/subscan structure does seem to be correctly recorded in the Main, Scan, Subscan, etc. tables.
There has been no time/commitment to solve the two residual problems.
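As a quick way to reproduce problem 1, one could list the intents recorded per scan. This is a minimal sketch only, assuming the ASDM is a plain directory containing the Scan table serialized as Scan.xml with one row element per scan and child elements named scanNumber and scanIntent; those file and element names are assumptions, not confirmed against the ASDM schema.

    # Hedged sketch: dump the scanIntent values recorded in an ASDM Scan table.
    # Assumes the ASDM is a directory with Scan.xml, one <row> per scan,
    # containing <scanNumber> and <scanIntent> children (assumed names).
    import sys
    import xml.etree.ElementTree as ET

    def dump_scan_intents(asdm_dir):
        tree = ET.parse(f"{asdm_dir}/Scan.xml")
        for row in tree.getroot().iter("row"):
            number = row.findtext("scanNumber", default="?")
            intent = row.findtext("scanIntent", default="?")
            # Calibrator scans reported only as "Common" are the symptom described above.
            print(f"scan {number}: {intent}")

    if __name__ == "__main__":
        dump_scan_intents(sys.argv[1])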
SLOC detailed figures:
Note: starting from Release R1.1, ITS and SE have used a common approach to calculating SLOC.
Note 2: the ACS SLOC count in R4.0 is about 200,000 lines higher because tools were counted by mistake.
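For reference, a minimal sketch of the kind of physical SLOC count meant here, assuming "SLOC" means non-blank, non-comment lines and that tool directories should be excluded; the directory names, file extensions and comment prefixes below are illustrative assumptions, not the actual ITS/SE tooling.

    # Hedged sketch: count non-blank, non-comment lines, skipping tool directories.
    # This approximates a physical SLOC count; it is not the actual ITS/SE tool.
    import os

    COMMENT_PREFIXES = ("#", "//", "*")              # assumed comment styles
    SOURCE_EXTENSIONS = (".py", ".java", ".cpp", ".h", ".idl")
    EXCLUDED_DIRS = {"tools"}                        # e.g. to avoid counting tools by mistake

    def count_sloc(root):
        total = 0
        for dirpath, dirnames, filenames in os.walk(root):
            dirnames[:] = [d for d in dirnames if d not in EXCLUDED_DIRS]
            for name in filenames:
                if not name.endswith(SOURCE_EXTENSIONS):
                    continue
                with open(os.path.join(dirpath, name), errors="ignore") as f:
                    for line in f:
                        stripped = line.strip()
                        if stripped and not stripped.startswith(COMMENT_PREFIXES):
                            total += 1
        return total

    print(count_sloc("ACS"))   # hypothetical subsystem directory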
Assessment of the sufficiency of Doxygen-like in-line documentation: [graph of per-module in-line documentation vs. global in-line documentation]
See the log at this link.
There is a page with the results for each subsystem (see the list below). On each results page there is a table with 3 columns:
- the first lists the modules belonging to the subsystem,
- the second gives the result of the tests (PASSED, FAILED, or UNDETERMINED when it is neither PASSED nor FAILED); sometimes the test directory is missing, in which case you will just see the message: "No test directory, nothing to do here",
- the third gives a summary produced by Purify/Coverage reporting the analysis results for:
Functions
Functions and exits
Statement blocks
Implicit blocks
Decisions
Loops
The values reported for every item in the list above give the number of hits for that item.
In the same cell as the summary there is a link to the "Complete Report" produced by Purify. The Complete Report gives information about the lines where the hits happened. For a loop, it also reports the values: 0 (the loop statement is reached but its body is never entered), 1 (the loop body is entered once), 2+ (the loop body is entered and repeated one or more times).
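To make the loop categories concrete, here is a small illustrative sketch (ordinary Python, not tied to Purify or to any ALMA module) showing loops that would fall into the 0, 1 and 2+ buckets:

    # Illustrative sketch of the loop-coverage buckets described above.
    def zero_iterations():
        for item in []:          # loop statement reached, body never entered -> "0"
            print(item)

    def one_iteration():
        for item in ["only"]:    # body entered exactly once -> "1"
            print(item)

    def many_iterations():
        for item in [1, 2, 3]:   # body entered and repeated -> "2+"
            print(item)

    zero_iterations()
    one_iteration()
    many_iterations()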
Sometimes, instead of the summary, you will see a message like:
ERROR: No coverage info available
No atlout.spt
This happens when the modular test of that module is a fake one (for example, the test is actually just an "echo something"), so there is no code (Java, Python or C++) that can be instrumented.
ARCHIVE (MISSING)
Note: the Test Coverage Analysis is not yet stable. Work is in progress to improve the results. You will see errors like:
This means that the test exists and is executed, but Purify is not able to dump the collected information into the result file. This error is under investigation. We had 5 such cases in ACS alone; they are still under investigation and we do not yet know the reasons.
We hope to clarify the problems under investigation as soon as possible!
See the list at this link.
See details at this link (MISSING)