Mainstreaming High-Energy Solar Data

Andre Csillaghy (University of Applied Sciences Northwestern Switzerland), Laszlo I. Etesi (University of Applied Sciences Northwestern Switzerland), Nicky Hochmuth (University of Applied Sciences Northwestern Switzerland)


Abstract

Analyzing and interpreting high-energy data is fundamental to understanding solar dynamics. The current solar high-energy mission RHESSI (Reuven Ramaty High Energy Solar Spectroscopic Imager, in orbit since 2002) and the proposed instrument STIX (Spectrometer/Telescope for Imaging X-rays, selected to fly on Solar Orbiter) both rely on indirect imaging to reach the arcsecond spatial resolution needed to resolve solar X-ray sources.
High-energy solar images and spectra are reconstructed from calibrated photon counts in a way that is mathematically equivalent to the reconstruction of radio interferometry data. The generation of science-ready data products is currently a manual process that requires an understanding of the instrument and its imaging technique, as well as mastery of a complex data analysis software suite. As a result, the exploitation of such data is limited to a small group of experts who work on a limited set of events, while much of the data remains unexploited. Combining these data with observations from other instruments is also complicated, which further limits the science output of the instruments.
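
The equivalence mentioned above can be sketched as follows (the notation is ours and not part of the original abstract): each visibility is a spatial Fourier component of the X-ray image, sampled at (u, v) points determined by the instrument's collimators, so image reconstruction amounts to inverting a sparsely sampled Fourier transform, just as for radio interferometers.

% Sketch of the visibility relation (notation assumed, not taken from the abstract):
% V(u_k, v_k) is the spatial Fourier component of the X-ray image I(x, y)
% sampled at the k-th (u, v) point; w_k is a reconstruction weight.
\[
  V(u_k, v_k) \;=\; \iint I(x, y)\, e^{\,2\pi i\,(u_k x + v_k y)}\; dx\, dy ,
\]
\[
  \hat{I}(x, y) \;\approx\; \sum_k w_k\, V(u_k, v_k)\, e^{\,-2\pi i\,(u_k x + v_k y)} .
\]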
HESPE (High-Energy Solar Physics in Europe) is a small Space Collaboration Project, funded by the European FP7 program, aiming to provide scientists with easy-to-understand high-energy data, with an emphasis on new data processing and analysis methodologies and algorithms. It will automatically produce a comprehensive database of computed products that enables statistical data analysis over many events and ensures integration with Virtual Observatories. This will enable the exploration and mining of the 20 TB RHESSI data archive and will prepare the ground for the STIX instrument.
The automated generation of the RHESSI data products is controlled by a workflow engine written in Java. The process is based on the extraction of instrument-independent data components called visibilities, which are equivalent to data elements from radio interferometers. An interval selection module within the processing pipeline analyzes time-energy spectrograms for each flare to find optimal time-energy interval settings, which are used to reconstruct meaningful science products from the resulting visibility database. A default set of reconstructed images and spectra will be provided; other research-dependent science products can be specified and generated from the visibility database on demand.
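
To illustrate the interval-selection idea described above, here is a minimal, hypothetical sketch (not the actual HESPE pipeline code, and written against an assumed in-memory spectrogram rather than the real data model): it sums a time-energy spectrogram over energy into a light curve and keeps the contiguous time intervals whose counts exceed a fixed fraction of the flare peak. All class names and thresholds are illustrative.

import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical interval-selection sketch: given a time-energy spectrogram
 * (counts per time bin and energy band), pick the contiguous time intervals
 * around the flare peak whose total counts exceed a fraction of the peak.
 * Requires Java 16+ for the record declaration.
 */
public class IntervalSelectionSketch {

    /** A contiguous range of time-bin indices selected for reconstruction. */
    record Interval(int startBin, int endBin) { }

    /**
     * Sum counts over all energy bands for each time bin, then return the
     * contiguous intervals where the summed counts reach
     * (threshold * peak counts).
     */
    static List<Interval> selectIntervals(double[][] spectrogram, double threshold) {
        int nTime = spectrogram.length;
        double[] lightCurve = new double[nTime];
        double peak = 0.0;
        for (int t = 0; t < nTime; t++) {
            double sum = 0.0;
            for (double counts : spectrogram[t]) {
                sum += counts;
            }
            lightCurve[t] = sum;
            peak = Math.max(peak, sum);
        }

        List<Interval> intervals = new ArrayList<>();
        int start = -1;
        for (int t = 0; t < nTime; t++) {
            boolean above = lightCurve[t] >= threshold * peak;
            if (above && start < 0) {
                start = t;                          // open a new interval
            } else if (!above && start >= 0) {
                intervals.add(new Interval(start, t - 1));  // close it
                start = -1;
            }
        }
        if (start >= 0) {
            intervals.add(new Interval(start, nTime - 1));
        }
        return intervals;
    }

    public static void main(String[] args) {
        // Toy spectrogram: 8 time bins x 2 energy bands of counts.
        double[][] spectrogram = {
            {  5,  2 }, { 40, 10 }, { 90, 30 }, { 120, 45 },
            { 80, 25 }, { 30,  8 }, {  6,  2 }, {   4,  1 }
        };
        for (Interval iv : selectIntervals(spectrogram, 0.3)) {
            System.out.println("Selected time bins " + iv.startBin() + ".." + iv.endBin());
        }
    }
}

In a real pipeline the selection would also depend on per-band counting statistics and detector-specific rates; the sketch only shows the overall structure of such a module.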

Slides in PDF format

Paper ID: O12


