Science Operations 2015 - Programme
|TUESDAY, 24 November 2015|
|Chair: Bruno Leibundgut, Venue: ESO Auditorium (Eridanus)|
|The Context 14:00 - 17:25|
|14:00 - 14:15||Andreas Kaufer|
|14:15 - 14:30||Martin Kessler|
|14:30 - 15:05||Christophe Arviset||
ESAC, Spain - Track: Context
The ESAC Science Data Centre (ESDC) provides services and tools to access and retrieve science data from all ESA space science missions (astronomy, planetary and solar-heliospheric). The ESDC consists of a team of scientists and engineers working in very close collaboration with the Science Ground Segment teams. The large set of science archives located at ESAC represents a major research asset for the community, as well as a unique opportunity to provide multi-mission and multi-wavelength science exploitation services. The ESAC Science Archives' long-term strategy is set along three main axes:
(1) enable maximum scientific exploitation of data sets;
(2) enable efficient long-term preservation of data, software and knowledge, using modern technology; and
(3) enable cost-effective archive production by integration in, and across, projects.
The author wishes to thank all the people from the ESAC Science Data Centre and the mission archive scientists who have participated in the development of the archives and services presented in this paper.
|15:05 - 15:40||Martino Romaniello||
ESO, Garching - Track: Context
Providing the best science data is at the core of ESO’s mission to enable major science discoveries from our science community. I will briefly describe the steps that ESO undertakes to fulfill this, namely ensuring that instruments are working properly, that the science content can be extracted from the data and, finally, delivering the science data to our users, PIs and archive researchers alike.
Metrics and statistics that gauge the results and impact of these efforts will be discussed.
|16:10 - 16:45||Alberto Accomazzi||
CfA, Harvard - Track: Datalinks
The NASA Astrophysics Data System (ADS) has long been used as a discovery platform for the scientific literature in Astronomy and Physics. With the addition of records describing datasets linked to publications, observing proposals and software used in refereed astronomy papers, the ADS is now increasingly used to find, access and cite a wider range of scientific resources. In this talk, I will discuss the recent efforts involving the indexing of software metadata, and our ongoing discussions with publishers in support of software and data citation. I will demonstrate the use of ADS's new services in support of discovery and evaluation of individual researchers as well as archival data products.
|16:45 - 17:05||Pascal Ballester||
ESO, Garching - Track: Strategy
Scientific software development at ESO involves defined processes for the main phases of project inception, monitoring of development performed by instrument consortia, application maintenance, and application support. We discuss the lessons learnt and evolution of the process for the next generation of tools and observing facilities.
|17:05 - 17:25||Bruno Merin||
ESAC, Spain - Track: Context
The ESAC Science Data Centre, ESDC, is working on a science-driven discovery portal for all its astronomy missions with the provisional name Multi-Mission Interface. The first public release of this service will be demonstrated, featuring an interface for sky exploration and for single and multiple target searches. It requires no prior knowledge of any of the missions involved. From a technical point of view, the system offers all-sky projections of full mission datasets using a new-generation HEALPix projection called HiPS; detailed geometrical footprints to access individual observations at the mission archives using VO-TAP queries; and direct access to the underlying mission-specific science archives.
A first public release is scheduled before the end of 2015 and will give users worldwide simplified access to high-level science-ready data products from all ESA Astronomy missions plus a number of ESA-produced source catalogues. A demo will accompany the presentation.
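The footprint and sky-exploration services described above rest on VO-TAP queries against the mission archives. As a purely illustrative sketch, the snippet below builds a synchronous TAP cone-search request with Python's standard library; the endpoint URL and the table/column names are invented placeholders, not the actual Multi-Mission Interface internals.

```python
from urllib.parse import urlencode

def tap_sync_url(endpoint, ra, dec, radius_deg):
    """Build a synchronous TAP request selecting observations whose
    position falls inside a cone around (ra, dec), in degrees.
    Table and column names below are illustrative placeholders."""
    adql = (
        "SELECT obs_id, s_ra, s_dec FROM ivoa.obscore "
        "WHERE 1=CONTAINS(POINT('ICRS', s_ra, s_dec), "
        f"CIRCLE('ICRS', {ra}, {dec}, {radius_deg}))"
    )
    query = urlencode({"REQUEST": "doQuery", "LANG": "ADQL", "QUERY": adql})
    return f"{endpoint}/sync?{query}"

# Hypothetical archive endpoint; coordinates roughly those of M51.
url = tap_sync_url("https://archive.example.org/tap", 202.47, 47.19, 0.1)
print(url)
```

The same URL pattern, with an asynchronous `/async` endpoint instead of `/sync`, is what the UWS standard adds for long-running queries.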
|WEDNESDAY, 25 November 2015|
|Chairs: Danny Lennon, Martino Romaniello, Venue: ESO Auditorium (Eridanus)|
|THURSDAY, 26 November 2015|
|Chairs: Christophe Arviset, Magda Arnaboldi, Venue: ESO Auditorium (Eridanus)|
|Archives 9:00 - 12:50|
|9:00 - 9:20||Deborah Baines||
ESAC, Spain - Track: Archive
ESA's European Space Astronomy Centre (ESAC) has recently launched a new version of the European Hubble Space Telescope science archive. The new and enhanced archive offers several new features, some of which are not available anywhere else.
The new web-based archive has been completely re-engineered and is now faster, more accurate and more robust than ever. Several of its unique features will be presented: the possibility of seeing the exact footprint of each observation on top of an optical all-sky image, the online visualization and inspection of FITS headers, imaging and spectral observation previews without downloading files, and the possibility to search for data that have not yet been published in refereed journals.
This state-of-the-art science data archive will be the new main access point to HST data for the European astronomical community and will be enhanced in the near future to include the Hubble Source Catalogue and other high-level data products as required.
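The online FITS-header inspection mentioned above ultimately comes down to parsing the standard 80-character header cards. A minimal, illustrative parser (not the archive's actual code, and deliberately ignoring CONTINUE cards and string-quoting subtleties) might look like:

```python
def parse_cards(header_bytes):
    """Split a FITS header block into 80-character cards and parse the
    simple 'KEYWORD = value / comment' form. Illustrative only."""
    cards = {}
    for i in range(0, len(header_bytes), 80):
        card = header_bytes[i:i + 80].decode("ascii")
        key = card[:8].strip()
        if key == "END":
            break
        if key in ("", "COMMENT", "HISTORY"):
            continue
        if card[8:10] == "= ":                      # value indicator
            value = card[10:].split("/", 1)[0].strip()
            cards[key] = value
    return cards

# A tiny invented header: three cards padded to 80 bytes each.
header = (b"SIMPLE  =                    T / conforms to FITS standard".ljust(80)
          + b"NAXIS   =                    2".ljust(80)
          + b"END".ljust(80))
hdr = parse_cards(header)
print(hdr)
```

In practice a library such as astropy.io.fits handles the full standard, but the card layout above is why headers can be displayed without downloading the data units.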
|9:20 - 9:40||Jose Manuel Alacid||
Centro de Astrobiologia, Spain - Track: Archive
The Gran Telescopio Canarias (GTC) archive has been operational since November 2011. The archive, maintained by the Data Archive Unit at CAB in the framework of the Spanish Virtual Observatory project, provides access to both raw and science-ready data and has been designed in compliance with the standards defined by the International Virtual Observatory Alliance (IVOA) to guarantee a high level of data accessibility and handling.
In this presentation I will describe the main capabilities the GTC archive offers to the community, in terms of functionalities and data collections, to carry out an efficient scientific exploitation of GTC data.
|9:40 - 10:00||Eva Verdugo||
ESAC, Spain - Track: Archive
The Herschel mission required a Science Archive able to serve data to very different users: its own Data Analysis Software (both Pipeline and Interactive Analysis), the consortia of the different instruments, and the scientific community. At the same time, the Key Programme (KP) consortia were committed to delivering to the Herschel Science Centre the processed products corresponding to the data obtained as part of their Science Demonstration Phase, and the Herschel Archive had to include the capability to store and deliver them. I will explain how the current Herschel Science Archive is designed to cover all these requirements.
|10:00 - 10:20||Ivan Zolotukhin||
IRAP, Toulouse - Track: Archive
As is the case for many large projects, XMM-Newton data have been used by the community to produce many valuable higher-level data products. However, even after 15 years of successful mission operation, the potential of these data is not yet fully uncovered, mostly due to logistical and data management issues. We present a web application, http://xmm-catalog.irap.omp.eu, to highlight the idea that existing public high-level data collections generate significant added research value when organized and exposed properly. Several application features, such as access to the all-time XMM-Newton photon database and online fitting of extracted source spectra, were never available before. In this talk we share best practices we worked out during the development of this website and discuss their potential use for other large projects generating astrophysical data.
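Photon-level access of the kind described here makes operations such as on-the-fly spectral extraction conceptually simple. The following schematic example (with an invented event list, not the actual xmm-catalog backend) bins photon events from a source region into a coarse count spectrum:

```python
# Schematic: bin a list of photon events (energies in eV, invented
# values) into broad bands, as an on-the-fly extraction might do.
events = [520, 760, 1300, 2100, 2150, 4800, 7900, 11800]

bands = {
    "soft (0.2-2 keV)":   (200, 2000),
    "medium (2-4.5 keV)": (2000, 4500),
    "hard (4.5-12 keV)":  (4500, 12000),
}

# Count the events falling in each half-open energy interval.
spectrum = {name: sum(lo <= e < hi for e in events)
            for name, (lo, hi) in bands.items()}
print(spectrum)
```

A real service would additionally apply instrument responses before fitting, but the per-photon storage is what makes such user-defined binning possible at all.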
|10:20 - 10:40||Xavier Dupac||
ESAC, Spain - Track: Archive/Project
In 2015 the Planck Collaboration released its second major dataset through the Planck Legacy Archive (PLA).
It includes cosmological, extragalactic and Galactic science data in temperature (intensity) and polarization. Full-sky maps are provided with unprecedented angular resolution and sensitivity, together with a large number of ancillary maps, catalogues (generic, SZ clusters and Galactic cold clumps), time-ordered data and other information. The extensive cosmological likelihood package allows cosmologists to fully explore the plausible parameters of the Universe.
A new web-based PLA user interface has been public since December 2014, allowing easier and faster access to all Planck data and replacing the previous Java-based software. Numerous additional improvements to the PLA are also being developed through the so-called PLA Added-Value Interface, making use of an external contract with the Planetek Hellas and Expert Analytics software companies. This will allow users to process time-ordered data into sky maps, separate astrophysical components in existing maps, simulate the microwave and infrared sky through the Planck Sky Model, and use a number of other functionalities.
|11:15 - 11:50||Marc Sauvage||
CEA, Saclay - Track: Mission/Project
|11:50 - 12:10||Gijs Verdoes Kleijn||
University of Groningen - Track: Mission/Project
The E-ELT first-light instrument MICADO will explore new parameter space in terms of precision astrometry, photometry and spectroscopy. This poses challenges for the data handling and reduction to ensure that MICADO takes the observational capabilities of the AO-assisted E-ELT to their limits. Our plan is to achieve this via iterative improvement of data quality. We present the implications for the science data management system and pipelines. The iterative feedback loop between improving science data quality and improving the instrument, telescope and atmospheric calibration leads to an ever better observational model. For this reason the MICADO instrument data simulator plays a vital role in the design of the pipelines and the science data management system. It should provide realistic data for all science cases to guide the trade-off between calibration via hardware and software. It can then also guide how to embed the observational model in the data management system. We discuss our current models of the instrument, the E-ELT and the sky background, and the type of science data management system that supports their continuous monitoring and iterative improvement.
|12:10 - 12:30||Tanya Lim||
ESAC, Spain - Track: Archive
ExoMars 2016 will be the first operational ESA mission to use PDS4, the new version of NASA's Planetary Data System (PDS) standards. The data produced will be housed in the new Planetary Science Archive (PSA), which is currently under development at ESAC. This talk will introduce the ExoMars 2016 mission and its payload. The adaptation of the PDS4 standard for ExoMars 2016 and other future missions in the PSA will be discussed, along with a progress report on the new PSA development.
|12:30 - 12:50||Juan Gonzalez-Nunez||
ESAC, Spain - Track: Archive/VO
The ESDC (ESAC Science Data Centre) is an active member of the IVOA (International Virtual Observatory Alliance), which has defined a set of standards, libraries and concepts that allow flexible, scalable and interoperable architectures to be built for data archive development.
For astronomy projects that involve large catalogues, such as Gaia or Euclid, the TAP, UWS and VOSpace standards can be used to create an architecture that allows the community to exploit these valuable data. New challenges also arise, such as implementing the paradigm of "moving the code close to the data", which can be partially achieved by extending the protocols (TAP+, UWS+, etc.) or the languages (ADQL).
We explain how we have used VO standards and libraries for the Gaia Archive, which has not only produced an open and interoperable archive but also minimized development effort in certain areas. We will also explain how we have extended these protocols and present our future plans.
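The UWS standard referred to above manages long-running archive queries as asynchronous jobs that move through defined phases (PENDING, QUEUED, EXECUTING, COMPLETED, ERROR, ABORTED, among others). A toy sketch of that lifecycle, covering only a simplified subset of the standard and not the ESDC implementation:

```python
# Toy UWS-style job lifecycle: a job is created PENDING, is queued and
# executed, and ends in a terminal phase. Transitions outside this
# table are rejected. Simplified subset of the real UWS phase diagram.
ALLOWED = {
    "PENDING":   {"QUEUED", "ABORTED"},
    "QUEUED":    {"EXECUTING", "ABORTED"},
    "EXECUTING": {"COMPLETED", "ERROR", "ABORTED"},
    "COMPLETED": set(), "ERROR": set(), "ABORTED": set(),
}

class UWSJob:
    def __init__(self, job_id):
        self.job_id = job_id
        self.phase = "PENDING"

    def advance(self, new_phase):
        if new_phase not in ALLOWED[self.phase]:
            raise ValueError(f"illegal transition {self.phase} -> {new_phase}")
        self.phase = new_phase

job = UWSJob("job-001")                     # invented job identifier
for phase in ("QUEUED", "EXECUTING", "COMPLETED"):
    job.advance(phase)
print(job.phase)
```

In a real archive the client polls the job's phase over HTTP and fetches the results resource once the job reaches COMPLETED.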
|Data Centres 14:00 - 17:55|
|14:00 - 14:35||Mike Irwin||
University of Cambridge - Track: Datacenter
In this talk I will review the data management facilities at CASU for handling large scale ground-based imaging and spectroscopic surveys. The overarching principle for all science data processing at CASU is to provide an end-to-end system that attempts to deliver fully calibrated optimally extracted data products ready for science use. The talk will outline our progress in achieving this and how end users visualize the state-of-play of the data processing and interact with the final products via our internal data repository.
|14:35 - 14:55||Enrique Solano||
CAB/INTA, Villanueva, Spain - Track: Datacenter
The Centro de Astrobiología (CAB) Data Centre is the most important astronomical data centre managed by a Spanish institution. Among others, it contains the Gran Telescopio Canarias (GTC) and the Calar Alto (CAHA) scientific archives. Nevertheless, our activities go well beyond data curation. Generation of high level data products (reduced datasets, catalogues,...), knowledge transfer to other Spanish data centres, development of tools to publish astronomical data in VO-compliant archives and services, development of data mining and analysis tools for an optimum scientific exploitation of our data collections and collaboration with scientific groups with research lines using CAB archive data are some of the topics that will be described in this presentation.
|14:55 - 15:15||Wolfram Freudling||
ESO, Garching - Track: Processing
Producing science data products that can be used to extract science is the ultimate objective of astronomical observation. The complexity of modern instruments requires highly specialized algorithms for data organization and data reduction. Data visualization and user interaction, both to fine-tune individual algorithms and to modify the data flow itself, are essential for the production of science-grade products that fully exploit the potential of the raw data.
ESO has a long history of providing specialized algorithms, called recipes, for each of its instruments. ESOREFLEX is an environment to deliver complete data reduction workflows that include these recipes to the users. These workflows encapsulate the best-practice data reduction for the data from a particular instrument, and at the same time can easily be modified by the user. ESOREFLEX includes systems for automatic data organization and visualization, interaction with recipes, and the exploration of the provenance tree of intermediate and final data products. ESOREFLEX allows ESO to deliver recipes that are used in its unsupervised operational pipelines to [...]
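Provenance exploration of the kind ESOREFLEX offers can be pictured as walking a tree from a final product back to its raw inputs. A minimal sketch, with invented file names and no relation to the actual ESOREFLEX internals:

```python
# Minimal provenance sketch: each product maps to the inputs it was
# reduced from; walking the tree recovers every ancestor of a product.
# All file names are invented for illustration.
provenance = {
    "science_spectrum.fits": ["extracted.fits", "response.fits"],
    "extracted.fits": ["flatfielded.fits", "wave_solution.fits"],
    "flatfielded.fits": ["raw_science.fits", "master_flat.fits"],
}

def ancestors(product, tree):
    """Depth-first list of every input that contributed to `product`."""
    found = []
    for parent in tree.get(product, []):
        found.append(parent)
        found.extend(ancestors(parent, tree))
    return found

lineage = ancestors("science_spectrum.fits", provenance)
print(lineage)
```

Keeping such a tree for every product is what lets a user trace a suspect feature in a final spectrum back to the specific raw or calibration frame that introduced it.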
|15:15 - 15:50||Giovanni Lamanna||
CNRS/LAPP, Annecy - Track: Mission/Project
Astronomy and Astroparticle Physics are experiencing a deluge of data with the next generation of facilities prioritised in the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA and KM3NeT, and with other world-class projects, namely LSST, EUCLID, EGO, etc. The new ASTERICS-H2020 project brings together the scientific communities concerned in Europe to work on common solutions to their Big Data challenges, their interoperability and their data access. The presentation will highlight these new challenges and the work being undertaken, also in cooperation with e-infrastructures in Europe.
|16:20 - 16:55||Mark Allen||
CDS, Strasbourg - Track: Datacenter
The Centre de Données de Strasbourg (CDS) is a reference data centre for Astronomy. The CDS services (SIMBAD, VizieR, Aladin and X-Match) provide added value to scientific content in order to support the astronomy research community. Data and information are curated from refereed journals, major surveys, observatories and missions, with a strong emphasis on maintaining a high level of quality. The current status and plans of the CDS will be presented, highlighting how the recent innovations of the HiPS (Hierarchical Progressive Surveys) and MOC (Multi-Order Coverage map) systems enable the visualisation of hundreds of surveys and data sets, and bring new levels of interoperability between catalogues, survey images and data cubes.
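The MOC system mentioned above represents a sky region as a set of HEALPix cells, which reduces questions like "do these two surveys overlap?" to plain set operations. A simplified fixed-order sketch with invented cell indices (real MOCs mix orders for compactness, and libraries such as mocpy implement the full standard):

```python
# Simplified MOC sketch: sky coverage as a set of HEALPix NESTED cell
# indices at a single order. Cell indices below are invented.
survey_a = {1024, 1025, 1026, 2048}
survey_b = {1025, 2048, 2049}

overlap = survey_a & survey_b   # cells covered by both surveys
union = survey_a | survey_b     # combined footprint of the two surveys

def degrade(cells, levels=1):
    """Coarsen coverage by `levels` HEALPix orders: in the NESTED
    scheme each parent cell holds four children, i.e. two index bits
    per order, so the parent index is the child index shifted right."""
    return {c >> (2 * levels) for c in cells}

print(sorted(overlap), sorted(degrade(survey_a)))
```

This is what makes MOC-based interoperability fast: intersecting the footprints of two all-sky surveys never touches the pixel data, only these index sets.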
|16:55 - 17:15||Nicholas Cross||
Institute of Astronomy, Edinburgh - Track: Datacenter
The Wide-Field Astronomy Unit in Edinburgh (WFAU) specializes in building and operating survey science archives. Imaging surveys carried out on UKIRT/WFCAM, VISTA/VIRCAM and the VST/OmegaCAM account for most of our current work, although we also operate the archive for the Gaia-ESO Spectroscopic survey.
|17:15 - 17:35||Joerg Retzlaff||
ESO, Garching - Track: Archive
Phase 3 denotes the process of preparation, submission, validation and ingestion of science data products for storage in the ESO Science Archive Facility and subsequent publication to the scientific community. We will review more than four years of Phase 3 operations at ESO and we will discuss the future evolution of the Phase 3 system.
|17:35 - 17:55||Nausicaa Delmotte||
ESO, Garching - Track: Archive
Data validation is an essential step of the Phase 3 process at ESO. It ensures a homogeneous and consistent archive with well-traceable data products, to the benefit of archive users. The many aspects of Phase 3 validation will be described in the presentation.
|FRIDAY, 27 November 2015|
|Chairs: Michael Sterzik, Venue: ESO Auditorium (Eridanus)|
|Linking Data 9:00 - 12:15|
|9:00 - 9:35||David Schade||
CADC, Canada - Track: Datalinks/center
|9:35 - 10:10||Edwin Valentijn||
University of Groningen - Track: Datacenter
The Astro-WISE information system is operational for the production of the results of a number of astronomical survey programmes with OmegaCAM@VST and MUSE@VLT. In different forms it has also been applied to the LOFAR radio telescope, life science projects and business applications. I will discuss the common "data federation" aspects of these projects, and the data federation aspects of the Euclid Archive System.
|10:10 - 10:30||Severin Gaudet||
CADC, Canada - Track: Datalinks/VO
Over the past six years, the CADC has evolved from an astronomy archive data centre into a multi-service system for the community. This evolution is based on two major initiatives. The first is the adoption of International Virtual Observatory Alliance (IVOA) standards in both the system and data architecture of the CADC, including a common characterization data model. The second is the Canadian Advanced Network for Astronomical Research (CANFAR), a digital infrastructure combining the Canadian national research network (CANARIE), cloud processing and storage resources (Compute Canada) and a data centre (Canadian Astronomy Data Centre) into a unified ecosystem for storage and processing for the astronomy community. This talk will describe the architecture and integration of IVOA and CANFAR services into CADC operations, the operational experiences, the lessons learned and future directions.
|11:00 - 11:35||Francoise Genova||
CDS, Strasbourg - Track: Datalinks
European Virtual Observatory (VO) activities have been coordinated by a series of projects funded by the European Commission. Three pillars were identified: support to data providers for the implementation of their data in the VO framework; support to the astronomical community in their usage of VO-enabled data and tools; and technological work for updating the VO framework of interoperability standards and tools. A new phase is beginning with the ASTERICS cluster project. The ASTERICS Work Package "Data Access, Discovery and Interoperability" aims at making the data from the ESFRI projects and their pathfinders available for discovery and usage, interoperable in the VO framework and accessible with VO-enabled common tools. VO teams and representatives of the ESFRI and pathfinder projects and of EGO/VIRGO are engaged together in the Work Package. ESO is associated with the project, which is also working closely with ESA. All three pillars identified for coordinating European VO activities are tackled.
|11:35 - 11:55||Johannes Reetz|
|11:55 - 12:15||Uta Grothkopf||
ESO, Garching - Track: Datalinks
The ESO Telescope Bibliography (telbib) is a database of refereed papers published by the ESO users community. It links data in the ESO Science Archive with the published literature, and vice versa. Developed and maintained by the ESO library, telbib also provides insights into the organization's research output and impact as measured through bibliometric studies.
Curating telbib is a multi-step process that involves extensive tagging of the database records. Based on selected use cases, this talk will explain how the rich metadata provide parameters for reports and statistics in order to investigate the performance of ESO's facilities and to understand trends and developments in the publishing behaviour of the user community.
|12:15||Final Discussion and Concluding Remarks|