October 2021 Paranal Service Mode User Satisfaction Survey
Once per year, now in the northern autumn, the User Support Department of ESO launches a Paranal Service Mode User Satisfaction Survey campaign. While this survey is by default anonymous, respondents are afforded the opportunity to identify themselves and to provide their e-mail address (see below).
This report details the findings of the September/October 2021 survey campaign (hereafter referred to simply as October 2021), while previous such reports are found here.
We view these reports as an important way to
- close the loop with the ESO Community,
- gather information on issues that need to be addressed or reinforced,
- thank all respondents, and
- demonstrate clearly that such feedback is important to us!
To this end, here we provide a summary of the responses received and trends in these responses over the last years, predominantly in the form of graphs. It should also be stressed that for those few cases where respondents did identify themselves and did make specific free-text comments we have contacted them by e-mail to address their particular comments.
Methodology and General Results
The ESO Service Mode Questionnaire is always available on-line for users to fill in but the usual rate of return is less than 2 per month. However, experience shows that a targeted campaign focused on a single (in this case Phase 2 related) aspect results in many more survey completions.
In September 2021, we asked Principal Investigators (PIs) of Service Mode runs scheduled for Paranal in Periods 107 and/or 108 (plus their then-active Phase 2 delegates) to complete the survey by a fixed deadline. We thus solicited a response from 492 PIs and their then-active Phase 2 delegates (370 individuals). After accounting for overlap between the two groups, this amounts to 762 distinct individuals, who were contacted via e-mail. A deadline was set for two weeks from the date of contact, and two reminder e-mails were sent before the deadline.
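The 762 figure follows from inclusion-exclusion over the two contact groups; a minimal sketch using the numbers quoted above (the derived overlap count is our own arithmetic, not stated in the report):

```python
# Inclusion-exclusion over the two contacted groups (numbers from the text).
pis = 492          # Principal Investigators of scheduled Service Mode runs
delegates = 370    # then-active Phase 2 delegates
contacted = 762    # distinct individuals actually e-mailed

# Individuals counted in both groups (e.g. a PI who was also a delegate
# on another run) are double-counted in pis + delegates:
overlap = pis + delegates - contacted
print(overlap)  # → 100
```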
A total of 248 responses were received by the deadline (some 76 of which were not fully complete), representing a 32.7% return rate. This is 50% higher than in 2020 and is the third highest value since September 2016 (see the figure below).
Interactive Figure Features
The figures below are all interactive. By this we mean:
- Hovering the cursor over the plot will display the data values on the screen.
- Clicking on the menu icon in the upper right (the three short parallel horizontal lines) will open a menu of print/download options.
- For those figures with legends to the right of the plot, clicking on any entry in the legend will toggle display of the corresponding data within the graph.
As a start in detailing the results from the survey, in the figure below we show the number of responses we received per instrument. Although this year's survey covered 16 instruments (versus 15 last year), the higher overall response rate resulted in an increase in the average number of responses per instrument (19.8 versus last year's figure of 12.3).
In the following two stacked histogram plots we present a general overview of user satisfaction (in percentage of responses) with two general items:
- the Phase 2 web documentation and
- the overall support provided by the User Support Department.
The plots, designed to show the trend in user satisfaction expressed in survey results since March 2015, clearly show the consistently high satisfaction with these services offered by the User Support Department.
p2 and Other Observation Preparation Tools
As in the past, we also asked about both the p2 tool and other, instrument-specific, observation preparation tools. For comparative purposes we still include the last P2PP results (September 2018) for a direct side-by-side comparison of P2PP and p2. In general the satisfaction levels with these tools are somewhat poorer than with the above-mentioned services provided by the User Support Department. p2 is still overall well accepted, though we note that both ease of use and documentation satisfaction are the lowest values shown in these figures, even lower than for its predecessor, P2PP. Thus there remains room for improvement on both items. The user satisfaction with functions provided by the p2 application remains consistent with the previous year's value.
ESO has released a Phase 2 Application Programming Interface (API) that can be used to create, modify, or delete the observation blocks (OBs), containers, and accompanying ReadMe file that define an observing run. We asked respondents whether or not they had made use of this powerful facility. Their replies are shown below. In that plot we see that fractional use of the p2 API is unchanged from last year.
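To give a flavour of what API-driven preparation looks like, here is a minimal offline sketch that assembles an OB payload and the endpoint it would be posted to. The endpoint path, field names, and container ID below are illustrative assumptions, not the official schema; real use should follow ESO's Phase 2 API documentation (and its `p2api` Python bindings), which also handle authentication:

```python
# Illustrative sketch only: endpoint path, field names, and IDs are
# assumptions, not the official p2 API schema. No network call is made.
import json

API_BASE = "https://www.eso.org/copdemo/api/v2"  # demo-environment URL: assumption


def new_ob_payload(name, target_name, ra, dec):
    """Assemble a minimal observation-block payload (hypothetical fields)."""
    return {
        "itemType": "OB",
        "name": name,
        "target": {"name": target_name, "ra": ra, "dec": dec},
    }


def create_ob_url(container_id):
    """Endpoint for creating an OB inside a container (hypothetical path)."""
    return f"{API_BASE}/containers/{container_id}/items"


payload = new_ob_payload("SciObs_001", "NGC 1068", "02:42:40.7", "-00:00:47.8")
print(create_ob_url(1234))
print(json.dumps(payload, indent=2))
```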
Since, with the exception of ObsPrep, the number of responses per observation preparation tool other than p2 is rather limited (see the table below), any presentation of individual-tool responses on documentation, ease of use, or functionality would suffer from small-number statistics. Thus, in the three figures below answers for all tools, except ObsPrep, are combined. The ObsPrep figures can be found further down the page. As with p2, when one considers the other tools as an ensemble we continue to see room for improvement.
| Observing Preparation Tool | Number of Responses |
| --- | --- |
| ObsPrep (p2 built-in, instrument-specific plug-in) | 58 |
As the usage of the ObsPrep tool outweighs all the other tools combined (see above), we have separated out responses to our questions about it for display below. Starting with usage statistics in the upper left plot, we see a suggestion that usage has plateaued. This is surprising, given that between the previous survey and the current one ObsPrep support has been extended to SPHERE (all modes), CRIRES (all modes but polarimetry), and FORS (all modes but FIMS). We interpret this as indicating a need to better advertise the benefits of ObsPrep.
The ObsPrep documentation satisfaction level (top right plot below) shows a continuation of the decline reported last year. We will continue to monitor this (through future surveys and incoming tickets) and, naturally, to work to improve the documentation.
ObsPrep ease of use and functionality provided (bottom two plots) both show very strong user satisfaction levels. Of the three cases where respondents expressed dissatisfaction with the functionality of ObsPrep, one was related to the preparation of numerous identical OBs (for which p2 offers the solution of a copy-paste capability), one made useful suggestions for better handling of blind offsetting, and one, unfortunately, provided no feedback of any sort. Similarly, the one respondent who expressed dissatisfaction with the ease of use of ObsPrep did not provide further details that would help us understand how to improve the tool.
Related to the above tools is, of course, the suite of Exposure Time Calculators. Thus, we asked survey participants the question, “How satisfied are you with the ETCs you have used?” The responses are shown below. There are consistently few respondents who express dissatisfaction with the ETCs (and we do try to follow up on comments when given). The steady ~9% (average) of "No opinion" answers over the years could be interpreted as the fraction of respondents who did not use any ETC.
We again asked users how satisfied they are with the p2fc (finding chart generator) app within p2, in terms of both its documentation and its usefulness (as compared to any other alternatives for producing finding charts), with the plots below displaying the results. User satisfaction with the documentation appears to have slightly decreased again this year, though whether this is significant remains to be seen.
The few anonymous respondents who expressed dissatisfaction with aspects of p2fc provided no further comments. We can only stress that improvements to the software depend on users actually providing feedback!
And lastly we asked survey participants, "Which operating system(s) do you use for ESO tools (e.g. for proposal/observation preparation, data reduction), excluding any browser-based tools?" This information is obviously not used as an indicator of satisfaction with the tools or support, but rather is gathered to ensure that we have an understanding about the technical requirements that users may have for the tools. We strive to develop tools that are OS independent, but a few of the legacy observation preparation tools still require local installation. Furthermore, data reduction pipelines also need to be packaged and tested on specific OSs.
The breakdown of responses is shown in the figures below. Here we see a continued dominance of Mac OS X over Linux, with a roughly steady small percentage of Windows usage.
Within Linux usage, Ubuntu appears to be staging a comeback, while SuSE and Scientific Linux have both faded away.
1 The total time allocated in Service Mode for Periods 107 and 108 was 11585.3 hours, while the corresponding number for Visitor Mode was 2079.8 hours. Thus, the October 2021 survey targeted PIs (and their then-active delegates) representing 84.8% of the total Paranal time allocation, including all public surveys.