Step by step guidelines for the review

News in P110

In P110, Distributed Peer Review has been deployed for proposals requesting less than 16 hours (and not including Target of Opportunity runs). About 50% of the proposals will be reviewed via this new channel, decreasing the load on the panels by the same amount. In short: panel members will review half the number of programmes they reviewed in the past.

Given the drastic reduction in the number of proposals, as of this period all proposals assigned to a given panel will be reviewed and graded by all panel members during the pre-meeting phase. Each proposal will be assigned to one primary reviewer. All the remaining, non-conflicted panel members will be assigned as secondary reviewers. As a consequence, the role of grader has been removed.


General information

The review process is supervised by the Observing Programmes Committee (OPC) Chair and it is organized in panels, grouped per scientific category (A,B,C,D). In the current review scheme, there are 2 panels in A, 3 panels in B, and 4 panels each in C and D categories. Each Panel consists of 6 members, including 1 Panel Chair and 5 Panelists. One of the 5 Panelists is usually asked to take on the extra role of vice-Chair, to replace the Panel Chair in case of conflicts.

The OPC is composed of all 13 Panel Chairs, plus 3 extra members (panel co-Chairs) to compensate for the smaller number of Expert Panels in science categories A (2) and B (3). This is because each science category must be represented in the OPC by the same number of reviewers.

Each proposal is assigned one Primary and up to five Secondary reviewers. They will pre-grade these proposals and, if a proposal is not triaged after the pre-grading, the Primary reviewer will present its scientific case at the panel meetings. All panel members will participate in the scientific discussion and grade the proposals during the panel meetings.

All reviewers have to report conflicts on all proposals in their Panel, irrespective of their Primary or Secondary reviewer role.


Access to OPC and Panel documents

Documents are made available to the OPC and Panel members at various stages of the process. These can be accessed via the following steps:

  1.  log in to your User Portal account at
  2. select "Access the OPC and panel documents" under the “Actions” button in the left frame menu
  3. select "Documents for panel X0" (where X0 is your panel ID) at the bottom of this page.


Practical aspects

The new web tool called PEI (Proposal Evaluation Interface) will be used both during the pre-grading/pre-meeting phase and at the panel meetings to grade the proposals. PEI is straightforward to use; nevertheless, for the users' convenience, the tool includes a detailed help guide, featuring several video tutorials.

Important note: since the development of the new PEI is still ongoing, one initial step of the review process, the conflicts declaration, must still be carried out via the “old” ELECTRA/WOT system. The ELECTRA package must be manually installed by the reviewers (Appendix B).

Also, the proposal ID format has recently changed, but since the review and scheduling processes still rely on both the old and the new review tools, both proposal IDs are provided as often as possible. For convenience, the PEI will display both versions.

Large Programmes are now submitted on a yearly basis, only in even periods. On-going Large Programmes are also reviewed only once per year, in odd Periods.

As of P108 applicants must formulate the scientific rationale of their proposals following the anonymisation guidelines. Examples and a detailed description of the dual anonymous peer review paradigm are described here (a video tutorial and a PDF version are also available). Specific instructions for the reviewers are provided here.

The following non-anonymized fields of the proposals are not included in the PDFs distributed to the panels:

  • Investigators
  • Background and Expertise
  • Report on Previous Usage of ESO Facilities
  • Publications
  • Data Product Delivery Plan (for Large Programmes only)

These will only be accessible to the panels after the ranking phase is completed, and only upon request. All the sections of the proposals distributed to the panels before that must comply with the anonymisation guidelines. Failure to abide by the anonymisation rules may lead to the disqualification of the proposal. Panel members must log possible violations via the dedicated field in the Proposal Evaluation Interface.

In case of questions, please contact the Observing Programmes Office.


Conflicts declaration and category changes

A few days after the submission deadline, proposals will be distributed based on their scientific category to the Expert Panels, automatically taking into account possible institutional conflicts. At this stage, all proposals within a given panel are assigned to all members of that panel, so that conflicts can be declared before the assignments are finalised. The final assignment takes place once this phase is completed and before the pre-grading starts.

All reviewers will need to perform the initial checks on their panel assignments via the ELECTRA package, available on the panel specific OPC Web Page.

ELECTRA is an interface to access/print the proposals, to identify those to be reviewed, and to perform a number of operations such as sorting and searching. Its deployment is straightforward on most operating systems, but it requires a suitable version of Python (2.X only) to be present on the host computer. Alternatively, the PDF files of the proposals can be found in a subdirectory of the ELECTRA package and can be accessed independently. The ELECTRA Manual contains further details and a video tutorial shows how to use this interface.

Here follows a summary of the main actions to be performed within ELECTRA:


1. Identify the proposals assigned to a Panel: This can be done in several ways using ELECTRA, e.g. by following the link labelled “Referees” in the navigation frame on the left of the page. If ELECTRA has not been fully installed one can still extract the list of the assigned proposals from the package (see the ELECTRA Manual).

Each proposal has one Primary reviewer. Once ELECTRA has been configured (by specifying the reviewer acronym), the proposal assignment is explicitly stated at the bottom of the main page (e.g. “you are NOT a referee of this program”). For reviewers not using ELECTRA, the updated assignment lists are available in the ELECTRA zip file (R4S reports under the reports directory).


2. Check the OPC category: All proposals are submitted within one of the four OPC categories:  

A -  Cosmology and the intergalactic medium

B -  Galaxies

C -  Interstellar medium, star formation and planetary systems

D -  Stellar evolution

In general, category changes are strongly discouraged. The choice of the scientific category for which a given proposal is submitted is the full responsibility of the PIs. Therefore, as a rule, proposals should be reviewed within the specified category. Deviations from this rule will be considered only under very exceptional circumstances. If you identify such an exceptional case, first discuss it with (and have it approved by) your Panel Chair. Only then should you issue the category change request. The Web OPC Tool (WOT) must be used for this (Appendix B). This video tutorial explains how to do that. Please note that any category changes must be reported by the given deadline. OPO will assess a transfer to another category and the reviewers will be notified about any re-assignment. The identification code of a proposal that has been reassigned to another OPC category remains unchanged, and so do the scientific keywords which characterise it.

Important note: ELECTRA is meant for proposal navigation only. It does not have any reporting capability for recording conflicts, category changes and/or proposal evaluation and grading. For this, other tools have to be used (see below).


3. Check for conflicts of interest: ESO's policies concerning conflicts are described in the document Rules for Dealing with Conflicts of Interest. Although the anonymisation rules have significantly reduced the number of detectable conflicts, reviewers still need to report conflicts to OPO whenever they feel unable to make an impartial assessment of a proposal.

The WOT interface must be used by all reviewers to report conflicts of interest (and only for this!). Once a conflict of interest is reported, OPO will assess its severity and its impact on the evaluation of the proposal and notify the reviewer about the next steps.

The corrections of the category and reviewer assignments must be completed before the next phase (pre-grading) can start. Thus, it is essential that all category mismatches and conflicts be reported within a few days after the release of the proposals. The OPC time-line (Appendix A) lists the deadline for reporting conflicts.

Primary and Secondary Reviewers may still report conflicts via e-mail to OPO during the pre-grading phase.


For even semesters only: Reviewers cannot be PIs of Large Programmes proposed in the Period in which they serve. In cases where they are a Co-I on a Large Programme, they will be conflicted in the discussion of that specific Large Programme, but they are expected to evaluate all other Large Programmes assigned to them.


NOTE for P110: ELECTRA and WOT will most likely be decommissioned at the end of P110. Their functionalities will be fully transferred to the PEI.



As soon as the conflict declaration phase is completed, the final proposal distribution is performed by OPO, taking into account the accepted category changes and conflict reports.  For the pre-grading phase, all reviewers will use the Proposal Evaluation Interface (PEI), which is accessible via their User Portal account. From this point on, all remaining steps of the review process will be performed via the PEI. Instructions for the usage of PEI in the pre-grading phase can be found here.

All proposals for Normal Programmes, Monitoring, GTO Programmes and ToO Programmes that have been assigned to the reviewers must be read and given a grade and comment before the panel meetings. Each proposal has one Primary reviewer and up to five Secondary reviewers.

In even Periods, all assigned Large Programme proposals need to be evaluated as well by all non-conflicted reviewers in the category of the proposal.

OPC members have specific duties in addition to the regular proposal assignments given to other reviewers. These include evaluating all Large Programme proposals across all categories (only applicable in even periods).

The criteria to be applied in the evaluation of the proposals are (in order of importance):

  • scientific merit and the contribution to the advancement of scientific knowledge;
  • evidence of sufficient resources and an adequate strategy for complete and timely data analysis;
  • the scientific outcome of previous observations with ESO telescopes.

Proposals requesting time for the continuation or completion of programmes already accepted during previous periods should be given special consideration.

Guaranteed Time Observations (GTO) are protected against duplication. Proposals simply duplicating Guaranteed Time Observations should be rejected. Potential conflicts should have been signalled in the specific field of the proposals. The GTO Protected Observations Web Page  for the given period is also useful.

A similar protection applies to Public Survey observations, which should not be duplicated without specific justification in open time. These protections are listed on the Public Survey Protected Observations Web Page.

Reviewers must assign grades and comments to each run of the assigned proposals. Before starting their work, reviewers should consult the guidelines for grading and the grade scale to be used (1.0=best … 5.0=worst), which can be found in Appendix C.

The full grade scale should be used so as to ensure that the resulting ranking of the proposals is as meaningful as possible. Grades assigned by individual reviewers must be specified with one decimal digit (e.g. 2.7).

Proposals should be evaluated based solely on their contents. The Observatory will evaluate the technical feasibility of programmes. Should clarification of some technical aspects of a proposal be necessary, please contact OPO.

To enter the proposal grades and comments the web-based PEI interface must be used. Instructions for the usage of the PEI can be found here.

Once the reported conflicts have been resolved by OPO, the PEI will allow the reviewers to enter grades and comments for all proposals assigned to them. Note the following:

  • Each run of every proposal must receive a grade.
  • The Primary and Secondary reviewers must provide comments for each proposal, emphasizing strengths and weaknesses.

While different grades may be assigned to different runs, the comments apply to the proposal as a whole. The pre-meeting comments entered in PEI serve different purposes. They are the basis for the panel discussion and, in the case of triaged proposals, they are used for the feedback to the applicants. Grades and comments for Normal, GTO, Monitoring and ToO Programmes will be accessible only to the members of the applicable panel (and to the OPC Chair). Members of other panels will not be able to access them. Grades and comments for Large Programmes will be readable by all panel members of the category in which the Large Programme was submitted, as well as by all OPC members (only applicable in even periods).

It is absolutely necessary that reviewers respect the deadline for the submission of the pre-grades (Appendix A). Triage can be carried out only once all members of all panels have completed the pre-grading.


Large Programmes (even semesters only)

During this phase, all non-conflicted panel members review and grade all Large Programmes belonging to their scientific category. The grades will be used to prepare a pre-selection of the Large Programmes to be discussed at the OPC meeting.



Triage and pre-meeting activities  

Once all reviewers have submitted their pre-grades, the outcome of the triage (see below) and other relevant reports will be made available via PEI or via the OPC Web Page.

The lowest-ranked runs will be triaged. The reviewers will get the chance to rescue some proposals if considered necessary. Triaged runs will be listed in the PEI and it will be possible to rescue proposals directly from there during the meetings, preferably at the start of the discussion. As a rule, triaged runs are not scheduled. A description of the triage process is given in Appendix D.

For OPC members only: review OPC-specific documents.

The following documents are made available to OPC members:

  • Pre-selection list of Large Programmes (only in even Periods)
  • Reports on ongoing Large Programmes (only in odd periods).

For Panel Chairs only: prepare for the meeting of your panel

It is good practice to have a clear idea of the panel’s tasks. Deciding beforehand which proposals will be discussed on a given day is very useful. The list of selected proposals (e.g., for Day#1) should be shared with the rest of the panel before the start of the meeting.


The Panel Meetings

The individual online panel meetings are spread over several days. The Proposal Evaluation Interface (PEI) will be used during the panel meetings to associate grades and comments to each proposal discussed.


Instructions for the usage of PEI during the Panel meetings are available here for the Panel members and here for the Panel chairs.


Individual panel meetings – General procedure

The corresponding Chair leads the panel meeting. The mission of each panel is to grade all Normal, GTO, Monitoring and ToO programme runs. Any proposal that has been triaged can be rescued by the Panel Chair via the PEI, upon the request of one or more reviewers (including the Chair). Rescued proposals are automatically added to the list of the proposals to be reviewed by the panel during the meeting. None of the remaining triaged proposals is discussed further.

Typically, the Primary Reviewer presents the proposal and their assessment. Then the Secondary Reviewers give their comments. This is followed by a general discussion and concluded by all non-conflicted reviewers assigning a grade (1.0=best...5.0=worst) in the PEI.

The panel may decide to grade a proposal as a whole (i.e., to assign the same grade to all its runs) or assign different grades to individual runs. This decision has to be taken before opening the grading phase of a given proposal by the non-conflicted reviewers and must be specified in the PEI by the Panel Chair. While it is legitimate to assign different grades to different runs of a given proposal, reviewers should be aware that this implies that some of them may end up being scheduled and others not.

The mean and the standard deviation of the grades of the individual reviewers are computed and stored in the database. The mean is adopted as the panel “final grade” (as reported in PEI) and used as reference to establish the rankings.
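As an illustration only (not ESO's actual implementation; the reviewer grades below are invented), the panel "final grade" for one run amounts to:

```python
from statistics import mean, stdev

def panel_final_grade(grades):
    """Combine the grades (1.0 = best ... 5.0 = worst) entered by the
    non-conflicted reviewers for one run: the mean becomes the panel
    "final grade", and the standard deviation is stored alongside it."""
    avg = mean(grades)
    spread = stdev(grades) if len(grades) > 1 else 0.0
    return avg, spread

# five hypothetical reviewer grades for one run
final, spread = panel_final_grade([1.8, 2.3, 2.0, 2.6, 1.9])
```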

Conflicted reviewers must leave the [virtual] room during the discussion. The conflicts of interest identified prior to the OPC meeting are reported by PEI. Should an additional conflict be identified during the meeting, the conflicted person should signal it to the Panel Chair and leave the [virtual] room for the discussion. The Panel Chair will then record the new conflict in the PEI.

Proposals must be evaluated as submitted. In particular, the panel should refrain from recommending a different time allocation as compared to what was requested in the proposals. However, the panel may exceptionally request time allocation changes if there is a compelling scientific justification. In such a case, a request should be sent to with the justification. Such a change should also be described in the feedback to the PI.

Requests for clarification of technical aspects of a proposal should be sent to Answers to technical feasibility questions are provided within 24 hours. Only questions submitted before the end of the panel sessions can be processed.

The runs are ranked according to their grade on a telescope-by-telescope basis. Different runs of a given proposal with the same grade are assigned the same rank. A merged ranking of all proposals is established across all panels for each telescope. The assigned grades are re-normalised so as to ensure consistency across panels.
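The "same grade, same rank" rule corresponds to standard competition ranking, sketched below with invented run IDs and grades (the cross-panel re-normalisation itself is performed by ESO):

```python
def rank_runs(runs):
    """Rank (run_id, grade) pairs for one telescope.
    Lower grade is better; runs with equal grades share the same rank."""
    ordered = sorted(runs, key=lambda r: r[1])
    ranks, last_grade, current_rank = {}, None, 0
    for pos, (run_id, grade) in enumerate(ordered, start=1):
        if grade != last_grade:        # a new grade value opens a new rank
            current_rank, last_grade = pos, grade
        ranks[run_id] = current_rank
    return ranks

# two runs with the same grade receive the same rank
ranks = rank_runs([("110.A-0001.B", 2.0), ("110.B-0002.A", 2.0), ("110.C-0003.A", 1.5)])
```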

A scientific assistant supports the panel in its work. All questions addressed to (e.g. on procedures, policies or the technical feasibility) should be communicated through the scientific assistant.

Target of Opportunity (ToO) runs

ToO runs are defined for targets that cannot be known more than one week before the observation needs to be executed (see the Call for Proposals for more details). Programmes may include a mixture of ToO runs and non-ToO runs. The regular runs are evaluated and scheduled according to the general procedure. ToO runs should be graded consistently with all other runs with which they compete for allocation of time.

After the end of the panel meeting, OPO prepares a merged list of ToO runs to be discussed during the OPC meeting and finally recommended for implementation. This merged ToO list includes a tentative cut-off line to be discussed and approved by the OPC. Runs below the cut-off line will not be considered for scheduling. Only A-ranked ToO runs will be considered for scheduling.

Large Programmes (even periods only)

  • Approved Large Programmes (LPs) are given top priority in the Service Mode queues. 
  • LPs of a given scientific category (ABCD) are evaluated and pre-graded by all non conflicted panel members of the given category and by all OPC members (of all categories).
  • Based on the pre-grades, OPO compiles a ranked list and proposes a cut-off for the discussion during the panels and at the OPC meeting.
  • Each LP is assigned to a Primary reviewer (chosen amongst the OPC members of the given scientific category).
  • The pre-selection list is distributed to the OPC members in due course before the start of the panel meetings. 
  • The OPC members are entitled to request that one or more LPs below the pre-selection line be promoted for discussion.
  • The requests must be sent to the OPC Chair. For the programme to be considered, at least two OPC members must request the promotion.
  • The final list of pre-selected LPs is approved before the start of the panel meeting and circulated by OPO.
  • During the panel meetings, the LPs of a given category are discussed by all panels separately. During this phase no grading or voting takes place. The purpose of the discussion in the panels is to provide the corresponding Chairs (and co-Chairs where applicable) with evaluations and opinions to be discussed at the OPC meeting.


The OPC Meeting

The OPC meeting is scheduled after the Panel meetings are over (see Appendix A). Conflicted OPC members must refrain from participating in the discussion of the relevant proposal/s.

Final review of LPs and GTO LPs (even periods only)

A significant fraction of the OPC meeting is dedicated to the final evaluation of the new Large Programme proposals. A more detailed description of the voting procedure is given in Appendix E.

  • All LPs pre-selected for discussion are presented by their Primary reviewers and the discussions which took place in the panels of the given category are summarised by the corresponding Chairs. 
  • All non-conflicted OPC members participate in the general discussion, which is led by the OPC Chair. 
  • A vote (Yes/No/Abstention) follows.
  • Only LPs which receive a qualified majority are recommended for approval to the Director General.
  • GTO LPs (LPs with GTO contract keywords) are treated in the same way as open-time LPs.


Final review of ToO runs

The decision on the final recommendation of ToO runs is made by the whole OPC, based on the merged ranking provided by OPO.

Review of merged rankings of panels

ESO provides a merged ranked list of all runs per telescope based on re-normalised grades. The Panel Chairs should cross-check the merged ranking and make sure it reflects the recommendations of their panel and no undue conflicts or duplications are present. This is also the time when critical cases related to the proposal anonymisation should be flagged and discussed, if required.

Review of ongoing LP reports (odd periods only)

PIs of LPs approved in previous semesters are required to submit periodic reports on the progress of their projects. The reports are collected by OPO and distributed to the OPC members in due course. Each report is assigned to a Primary reviewer, who will present the case at the OPC meeting. The OPC examines the reports, approves/rejects possible time-compensation requests, may require actions on the PI side, and approves/rejects/pauses the continuation of the LPs. Normally, only critical cases are discussed at the OPC meeting.


Post-meeting activities

The primary reviewer provides a text with OPC or panel comments for each proposal via the PEI, highlighting strengths and weaknesses. The feedback should help the proposers to understand the outcome and possibly to improve the proposal for re-submission. Reviewers can take notes and/or write comments in the PEI during the whole review process. Draft feedback notes or comments from other reviewers will be accessible to the Primary reviewer for compiling the final feedback to the applicants.

Feedback comments must be submitted using the PEI by the deadline specified in the OPC time-line (Appendix A). The OPC Chair is responsible for the final version of the comments to be sent to the PIs.



Appendix A - Timeline

ESO Period 110 - Proposal submission: deadline March 25th, 2022.

  1. Distribution of the observing proposals to the reviewers: March 30 (Wednesday).
  2. Reviewers report conflicts of interest and possible category changes: April 5 (Tuesday).
  3. Opening of PEI for pre-grading: April 8 (Friday).
  4. Submission of pre-grades by reviewers: May 6 (Friday).
  5. Release of the triaged proposals in PEI: May 11 (Wednesday).
  6. Panel and OPC online meetings:
    • Panels: week of May 16-20.
    • OPC: May 24-25 (Tuesday-Wednesday).
  7. Submission of final feedback by reviewers for Normal/Monitoring/GTO programmes: May 31 (Tuesday).
  8. Submission of final feedback by OPC members for Large Programmes: June 15 (Wednesday).


Appendix B - ELECTRA and the Web OPC Tool (WOT)


In P110 ELECTRA and WOT are only used during the initial conflict declaration phase. They will be decommissioned as soon as this phase is integrated in the Proposal Evaluation Interface.



The proposal distribution for each OPC category (A, B, C and D) is packaged into a Python application called ELECTRA, available from the panel-specific OPC Web Page. The installation of ELECTRA is not automatic and must be done manually by the reviewers. The ELECTRA package is released only once, for the conflict declaration phase. In this phase all proposals of a given Panel will be assigned to all non-institutionally conflicted members. A video tutorial for the usage of ELECTRA can be accessed here. The ELECTRA manual can be downloaded here.

Reviewers not using ELECTRA

In case you do not wish to use ELECTRA, the list of proposals assigned to your Panel can also be found in the file:

  • mydir/electraP110/reports/R4S_110A-XYZ.pdf

where "mydir" stands for the directory in which you have untarred the ELECTRA package and "XYZ" is your 3-letter acronym (for this see the OPC composition). The PDFs of the proposals are located in the directory: mydir/electraP110/proposals/pdf/. You may access the proposals directly from there.



The Web OPC Tool (WOT) is accessible via the ESO User Portal following the link labelled “Report/Comment Cards” under the “Actions” button in the left frame menu. First select the proper period from the pull-down menu, then press the “Change cycle” button.


B.1 Reporting errors in the OPC category 

Follow the link to the “Report Cards” to open a page with the list of proposals of the selected cycle. A red cross next to a proposal indicates that its report card has not yet been completed.

If you find that one of these proposals belongs to another OPC category, please first discuss the case with the Chair of your panel, who may have received assessments from other reviewers as well. If you jointly decide that the category is indeed wrong, click on the proposal number in the “Report Cards” page to open the wrong-category and conflict page. To report an error in the category assignment, check the “Wrong Category” button. Use the text box to indicate to which category the proposal should be transferred. Submit the report by clicking on “save conflict/wrong category”.

The explanatory text box must be filled; providing a description of the recommended action is mandatory.

A yellow disk replaces the red cross once the report has been submitted. 

The notification is reviewed by OPO within the next working day. If it is accepted, the proposal will be transferred to a panel of the correct category. If the change is rejected by OPO, the proposal will remain in the original category.

B.2 Reporting conflicts of interest 

B.2.1 Before proposal assignment to reviewers

Under “Report Cards”, click on the proposal number for which you want to report a conflict. Check the “Conflict of interest” radio button. Use the text box to explain the nature of the conflict. Submit the report by clicking on “Save Conflict/Wrong Category”.

The justification text box must be filled; it is mandatory to provide an explanation of the conflict.

A yellow disk replaces the red cross. 

The notification is reviewed by OPO within the next working day. If it is accepted the proposal is transferred to the other reviewers.

OPO informs the reviewers whether the conflict is considered substantial enough to warrant re-assignment. Once all the conflicts reported by a reviewer have been assessed and resolved by OPO, the reviewer must confirm that they have reported all conflicts associated with proposals assigned to them by clicking the “No (more) conflicts to report” button at the bottom of the list of proposals. Please note that this button must be pressed even if no conflicts are identified by the reviewer.

B.2.2 After proposal assignment to reviewers

Primary and Secondary reviewers can still report any conflicts in their report cards (see above). 

B.3 Grades and comments

Grades and comments are entered by the reviewers using the PEI, both during the pre-meeting phase and at the panel meetings. Instructions for the usage of PEI will be provided in due course. 

B.4 Final comments

The final comments to be passed to the PIs will be compiled by the Primary reviewer via the PEI and overseen by the corresponding Panel Chair. Instructions for the usage of PEI will be provided in due course. 


Appendix C - Evaluation and grading guidelines

For each proposal you will be providing a grade (between 1=outstanding and 5=unsuitable). For proposals containing multiple runs (e.g. for different instruments), different grades can be attributed to different runs.

The grade scale to be used is defined as follows:

1.0 – outstanding: breakthrough science

1.5 – excellent: definitely above average

2.0 – very good: no significant weaknesses

2.5 – good: minor deficiencies do not detract from a strong scientific case

3.0 – fair: good scientific case, but with definite weaknesses

3.5 – rather weak: limited science return prospects

4.0 – weak: little scientific value and/or questionable scientific strategy

4.5 – very weak: deficiencies outweigh strengths

5.0 – unsuitable

  • Proposals with grades larger than 3.0 will not be considered for scheduling. 
  • The full grade scale (1 to 5) should be used so as to ensure that the resulting ranking of the proposals is as meaningful as possible. 
  • Grades can and should be specified with one decimal digit (e.g. 2.7). 
  • While evaluating a proposal, do not try to second-guess the grade you should give it in order to have it scheduled. Keep in mind that the final grade will be computed as the aggregated value resulting from the evaluations of all reviewers. Also, the same grade has very different implications at different telescopes, depending on their demand. Finally, other constraints (like RA distribution, moon restrictions and atmospheric conditions) have a critical role in the final outcome of the scheduling process. 
  • While reviewing the proposals you should keep in mind these aspects:
    • Does the proposal clearly indicate which important, outstanding question/s will be addressed?
    • Is there sufficient background/context for the non-expert (i.e., someone not specialized in this particular sub-field)?
    • Are previous results (either by proposers themselves or in the published literature) clearly presented?
    • Are the proposed observations and the Immediate Objectives pertinent to the background description?
    • Is the sample selection clearly described, or, if a single target, is its choice justified?
    • Are the instrument modes, and target location(s) specified clearly?
    • Is the signal-to-noise ratio specified in the proposal sufficient to reach the scientific goals?
    • Will the proposed observations add significantly to the knowledge of this particular field?


In general, the scientific merit should be assessed solely on the content of the proposal, according to the above criteria. Proposals may contain references to published papers. Consultation of those references should not, however, be required for a general understanding of the proposal.

Please note that ESO encourages reviewers to give full consideration to well-designed, high-risk/high-impact proposals even if there is no guarantee of a positive outcome or definite detection.

Appendix D - The Triage process

Triage is applied to Normal and ToO programmes before the panel meetings. Runs ranked in the lowest third, according to the pre-grades, should in general not be discussed at the panel meeting. This is to maximize the amount of time for the evaluation of the most interesting proposals. The triaged runs will not be considered for allocation of observing time.

This appendix gives a brief description of the triage procedure as applied at ESO.

Pre-grades are renormalised so that the grades of each reviewer have the same average and standard deviation. For each run, a single grade and standard deviation are then computed by averaging the renormalised grades of all the reviewers. 
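The renormalisation step above can be sketched as follows. This is a minimal illustration, not the actual ESO implementation: the target mean and standard deviation, the function names, and the simple linear rescaling are all assumptions made for the example.

```python
import statistics

def renormalise(grades_by_reviewer, target_mean=2.0, target_std=0.5):
    """Rescale each reviewer's grades so that all reviewers share the same
    mean and standard deviation (illustrative target values)."""
    out = {}
    for reviewer, grades in grades_by_reviewer.items():
        mean = statistics.mean(grades)
        std = statistics.pstdev(grades) or 1.0  # guard against zero spread
        out[reviewer] = [target_mean + (g - mean) / std * target_std
                         for g in grades]
    return out

def combine(run_grades):
    """Average the renormalised grades of all reviewers for one run,
    returning the run's single grade and its standard deviation."""
    return statistics.mean(run_grades), statistics.pstdev(run_grades)
```

After this rescaling, a "harsh" and a "generous" reviewer contribute on an equal footing to each run's combined grade.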

For each telescope, the runs of Normal and ToO Programmes are ranked according to their grade. Normally the lowest 30% (in time and per telescope) are triaged. To ensure significant oversubscription per telescope, in some cases triaged runs – including runs for non-optimal conditions – are brought back from triage until at least 1.5 times the available time at the corresponding telescope is covered. These resurrected runs should be discussed by the panel like usual non-triaged runs. Proposals containing both triaged and non-triaged runs are flagged.
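The triage-and-resurrection logic can be sketched as below. This is an illustrative approximation under assumed conventions (runs are `(id, grade, hours)` tuples, lower grade is better); the exact cut and bookkeeping used at ESO may differ.

```python
def apply_triage(runs, available_time, triage_frac=0.30, oversub=1.5):
    """Keep the best ~70% of requested time, triage the rest, then
    resurrect the best triaged runs until kept time reaches
    oversub x available_time. runs: list of (run_id, grade, hours)."""
    ranked = sorted(runs, key=lambda r: r[1])  # ascending grade = best first
    total = sum(r[2] for r in ranked)
    kept, triaged, kept_time = [], [], 0.0
    for run in ranked:
        if kept_time + run[2] <= total * (1 - triage_frac):
            kept.append(run)
            kept_time += run[2]
        else:
            triaged.append(run)
    # resurrect best triaged runs until the oversubscription target is met
    while triaged and kept_time < oversub * available_time:
        run = triaged.pop(0)  # triaged is still sorted best-first
        kept.append(run)
        kept_time += run[2]
    return kept, triaged
```

With low demand, the resurrection loop pulls runs back above the line until the surviving requests cover at least 1.5 times the telescope's available time.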

A ranked list of all the runs is provided for each panel. The ranked list of runs is sorted by the re-normalised grade. The triage line appears in pink in the document. The runs above the final triage line must be discussed at the panel meetings. The runs below this triage line should not be discussed unless any panel member requests the resurrection of a particular proposal.

Note: depending on the specific demand in the given cycle, different triage fractions may be applied to different telescopes.

Appendix E - Large Programmes pre-selection (promotion)

In the past the pre-selection of Large Programmes (LPs) to be discussed at the OPC meeting was carried out in Joint Panel meetings. In these meetings, all non-conflicted panel members of a given scientific category discussed and voted on the promotion of LPs in their category to the next step, that is the OPC meeting. The purpose of the pre-selection is to reduce the number of proposals to be discussed and voted on at the OPC meeting, hence allowing the OPC members to focus on the most promising cases.

With the advent of the pandemic and the difficulty in organising online joint Panel Meetings it was decided to follow a different approach, which is described in this Appendix.

During the pre-meeting phase, all panel members are asked to review and grade the LPs in their scientific category. To provide them with a more comprehensive view for the final discussion, OPC members are asked to review and grade all LPs submitted in the given semester.

In the following we will call "in-category" the grades given by reviewers belonging to the category in which the LP was submitted. The process is as follows:

  1. The grades of each referee are calibrated using the grade distribution of normal programmes (and only in the range 1.0-3.0), to reduce the systematic differences between their grading scales. The procedure is identical to that adopted for all other programmes in the review.
  2. For each LP, the calibrated in-category grades are combined to yield the in-category calibrated average.
  3. A further category calibration is applied to reduce residual systematic differences between categories. After this calibration, the distributions of average grades in all categories have the same average and standard deviation.
  4. Finally, calibrated in-category averages (CatNormAvg) and standard deviations (CatNormStd) are computed for each LP.
  5. The programmes are sorted by ascending CatNormAvg.
  6. A nominal cutoff line is presented to the OPC members ahead of the start of the Panel meetings.
  7. The OPC members can propose to promote one or more LPs initially placed below the cutoff line.
  8. The requests for promotion are sent to and approved by the OPC Chair.
  9. The list of promoted LPs is finalised by the end of the Panel meetings and circulated to the OPC members before the start of the OPC meeting.
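Steps 3-5 above (category calibration and ranking) can be sketched as follows. The target mean and standard deviation and the data layout are illustrative assumptions; only the structure of the calculation follows the text.

```python
import statistics

def category_calibrate(avgs_by_category, target_mean=2.0, target_std=0.5):
    """Rescale each category's in-category average grades so that all
    categories share the same mean and standard deviation, then rank all
    LPs by ascending calibrated average (CatNormAvg).
    avgs_by_category: {category: {lp_id: in_category_average}}."""
    calibrated = {}
    for cat, avgs in avgs_by_category.items():
        m = statistics.mean(avgs.values())
        s = statistics.pstdev(avgs.values()) or 1.0  # guard zero spread
        calibrated.update({lp: target_mean + (a - m) / s * target_std
                           for lp, a in avgs.items()})
    # lower grade = better, so ascending order puts the best LPs first
    return sorted(calibrated.items(), key=lambda kv: kv[1])
```

The nominal cutoff line is then drawn on this single, cross-category ranked list.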

Appendix F - Large Programmes voting procedure

This appendix summarizes the content of the Internal Memorandum of November 14, 2018 (Voting procedure for Large Programmes at the OPC Meeting).

  • The list of LPs to be discussed at the OPC meeting is consolidated before the OPC meeting starts. The list is based on the grades provided during the pre-meeting phase. Each OPC member has the right to propose the promotion of proposals excluded from the pre-selection. 
  • The promotions have to be submitted to and approved by the OPC Chair. See Appendix E for more details.
  • Proposals not included in the pre-selection are formally rejected and cannot be reconsidered.
  • After the presentation of the given LP by the Primary referee and the discussion open to all non-conflicted OPC members, the OPC Chair calls for the vote.
  • The voting ballots are secret.
  • All non-conflicted OPC members have to cast their vote using one of the three options: Y[es]/N[o]/A[bstain].
  • The votes are digitally collected and recorded by OPO.
  • The results of the voting are presented to the OPC only once all the cases have been discussed and voted, and the OPC Chair declares the discussion completed.
  • A proposal is considered as recommended for approval to the Director General if the number of Y exceeds the number of N.
  • All proposals receiving a number of N votes equal to or larger than the number of Y votes are rejected.
  • The vote is considered valid only if the number of Y+N votes is equal to or larger than 2/3 of the number of non-conflicted OPC members.
  • If 2/3 of the number of non-conflicted OPC members is not an integer, the threshold is rounded down to the closest integer.
  • If the 2/3 threshold is not reached, the vote is not valid. In this case the OPC chair calls for a further discussion and a new voting session takes place.
  • If the 2/3 threshold is not reached after the second voting round, the OPC members who abstained in the previous two sessions must cast a Y or N vote, and a third (and final) voting session takes place.
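The quorum and outcome rules above can be summarised in a short function. This is an illustrative sketch of the stated rules, not ESO's voting software; the function name and return values are assumptions.

```python
import math

def tally(votes, n_non_conflicted):
    """Apply the LP voting rules: votes is a list of 'Y'/'N'/'A' ballots.
    The vote is valid only if Y+N reaches 2/3 of the non-conflicted OPC
    members (rounded down); a valid vote recommends the proposal only
    when Y strictly exceeds N."""
    y = votes.count("Y")
    n = votes.count("N")
    quorum = math.floor(2 * n_non_conflicted / 3)  # rounded down per the rules
    if y + n < quorum:
        return "invalid"  # triggers further discussion and a new vote
    return "recommended" if y > n else "rejected"
```

Note that a tie (Y equal to N) rejects the proposal, since recommendation requires Y to strictly exceed N.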

Appendix G - Confidentiality agreement

This appendix contains the Confidentiality Agreement which is signed by all OPC and Panel members when they agree to serve.


Confidentiality Agreement

for the Members of the ESO Observing Programmes Committee (OPC) and its Experts Panels


In accordance with the Terms of Reference and Rules of Procedure of the ESO Observing Programmes Committee (approved by ESO Council on 04.12.2013), I acknowledge that during the term of my membership I may be given or have access to confidential information, which may be supplied in tangible form or verbally ('Confidential Information'). Observing proposals and the respective discussions at the OPC are considered Confidential Information, as they may contain unpublished research and/or proprietary information. I further acknowledge that the Confidential Information will be made available to me only for the purpose to

  • fulfil my duties as a member of the ESO OPC and its Experts Panels in accordance with the Terms of Reference and Rules of Procedure of the ESO Observing Programmes Committee,

('the Permitted Purpose')


I therefore undertake:

  1. to use the Confidential Information only for the Permitted Purpose;
  2. to keep the Confidential Information secret and confidential and not to disclose it in any form to any third party (including to any persons or party at my employer, including research colleagues, graduate students, post-doctoral or research associates) without the prior written consent of ESO;
  3. on the written request of ESO to deliver to ESO any tangible items of Confidential Information in my possession;

I understand that this Confidentiality Agreement does not apply to any item of Confidential Information, which is disclosed to me by any third party without the breach of the present Confidentiality Agreement, or which comes into the public domain through no fault of mine, but that otherwise this Agreement will continue in force without limit of time. ESO may release me in writing from the Agreement.

I hereby confirm that I have understood and will comply with the above confidentiality requirements.




Last update: Tue Mar 22 18:05:45 CET 2022