Y3 Advanced Gamma-Ray Science Methods and Tools
The goal of this project is to study modern signal-processing techniques from the wider scientific and mathematical community and apply them to the analysis of very-high-energy gamma-ray data.
This applies both to the reconstruction of raw data from Cherenkov telescopes, where we strive to lower the energy threshold and improve angular resolution and cosmic-ray rejection, and to the real-time detection of transient sources in high-level data.
This study benefits current and future gamma-ray instruments, in particular the Cherenkov Telescope Array (CTA) observatory, while simultaneously giving visibility to local groups within several large projects.
POSITION   | NAME SURNAME        | LABORATORY | GRADE, EMPLOYER
WP leader  | Karl KOSACK         | AIM        | CDI, CEA
WP member  | Jérémie Decock      | AIM        | Postdoc, Univ. Paris VII
WP member  | Sandrine Pires      | CosmoStat  | CDI, CEA
WP member  | Bruno Khelifi       | APC        | CDI, CNRS
WP member  | Fabio Acéro         | AIM        | CDI, CNRS
WP member  | Tino Michael        | AIM        | Postdoc, ASTERICS
WP member  | Thierry Stolarczyk  | AIM        | CDI, CEA
Overview of Current Results:
- A wavelet-based denoising library, built on techniques from the CosmoStat lab, was developed, tested, and applied to realistic IACT data in order to improve reconstruction performance.
- A full end-to-end data processing chain was produced to analyze IACT data from the raw shower image to the high-level science results (reconstructed event lists and instrumental response functions). This chain will be incorporated into the official CTA data processing pipeline, and may soon also be used to optionally process HESS data. The development of this chain made it possible to make high-level comparisons of the scientific performance of the image processing techniques used in this project against standard techniques.
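The stages of such an end-to-end chain can be sketched as a simple function composition. The stage names and implementations below are placeholders chosen for illustration (they are not the actual CTA/HESS pipeline API); only the overall structure, raw image to calibrated image to cleaned image to shower parameters, follows the chain described above.

```python
import numpy as np

def calibrate(raw):
    # Hypothetical calibration step: pedestal subtraction
    # (a real chain would also apply per-pixel gain corrections)
    return raw - raw.mean()

def clean(image, nsigma=3.0):
    # Keep only pixels above a simple significance threshold
    # (stand-in for tail-cuts or wavelet-based image cleaning)
    return np.where(image > nsigma * image.std(), image, 0.0)

def parametrize(image):
    # Summarize the cleaned image (stand-in for Hillas parametrization)
    return {"size": image.sum()}

def process_event(raw):
    # End-to-end chain: raw camera image -> calibrated -> cleaned -> parameters
    return parametrize(clean(calibrate(raw)))

rng = np.random.default_rng(1)
raw_image = rng.normal(100.0, 5.0, (40, 40))  # toy noise-only camera image
params = process_event(raw_image)
```

In the real chain each stage would additionally produce reconstructed directions and energies per event, which feed the event lists and response functions mentioned above.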
- The wavelet-based denoising achieves a better point-spread function than the standard analysis at all energies. This can be attributed to better signal extraction and to the increased number of images per shower that are usable for event reconstruction (faint images that still retain enough geometrical information to be useful, but that would be thrown out by standard cleaning techniques).
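To make the technique concrete, the following is a minimal sketch of isotropic à trous (starlet) denoising of a single camera image, the family of transforms used in the CosmoStat tools. The scale count, the k-sigma hard threshold, and the MAD noise estimator are illustrative assumptions, not the tuned values used in the actual library.

```python
import numpy as np
from scipy.ndimage import convolve

def starlet_denoise(image, n_scales=3, k=3.0):
    """Sketch of starlet (a trous) denoising: decompose the image into
    wavelet (detail) planes, hard-threshold each plane at k * sigma_noise,
    then sum the surviving coefficients with the smooth residual."""
    # B3-spline scaling kernel used by the starlet transform
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    kernel = np.outer(h, h)
    c = image.astype(float)
    planes = []
    for j in range(n_scales):
        # "a trous": dilate the kernel by inserting 2**j - 1 zeros
        step = 2 ** j
        dilated = np.zeros((4 * step + 1, 4 * step + 1))
        dilated[::step, ::step] = kernel
        smooth = convolve(c, dilated, mode="reflect")
        planes.append(c - smooth)  # wavelet plane at scale j
        c = smooth
    out = c.copy()  # start from the coarse residual
    for w in planes:
        sigma = np.median(np.abs(w)) / 0.6745  # robust (MAD) noise estimate
        out += np.where(np.abs(w) > k * sigma, w, 0.0)
    return out

rng = np.random.default_rng(0)
noisy = rng.normal(0.0, 1.0, (64, 64))  # noise-only toy image
denoised = starlet_denoise(noisy)
```

Because faint but spatially coherent structure survives multi-scale thresholding, such a scheme can retain shower images that a fixed two-level tail cut would discard, which is the effect described above.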
- Initial sensitivity curves show that the wavelet analysis outperforms a standard analysis by factors of 20%–200%, performing particularly well in the low-energy range close to the detection threshold; improvements are also seen at high energies thanks to the better PSF. These preliminary results come from a recently completed analysis of the full CTA–South telescope array, and are in the process of being cross-checked.
- All software was produced as open-source libraries, mostly in the Python language with some C++. These libraries may be released publicly, with appropriate acknowledgement of UnivEarths support. They will likely be used in the CTA data processing pipeline, as well as for deeper studies of data from the HESS telescopes.
- The improved sensitivity achieved using the wavelet denoising technique is currently demonstrated only with Monte-Carlo data. A significant extension to this project would be to apply it to real data from the HESS telescope array, particularly to science cases where a lower threshold or better PSF would provide an improvement, for example the Galactic Center region, a supernova remnant, or a weak variable point source.
- A preliminary study showed great promise in applying the same wavelet denoising techniques to the full time-dependent shower datacubes rather than the single time-integrated images used now. This adds a third dimension to the wavelet transform, which is easily realized (in fact already done), but more study is needed to determine the optimal cleaning thresholds. If successful, it would mean the denoising could start at an earlier level in the data processing chain, and the additional information could further improve sensitivity. This may be particularly interesting for real-time analysis, where the steps of calibration and time-integration (which include time-series peak detection and other filtering techniques) could be combined into a single wavelet transform.
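The extension to datacubes is mostly mechanical because the à trous smoothing kernel is separable: the same 1-D B3-spline filter can simply be applied along the time axis as well as the two spatial axes. The sketch below illustrates this; as noted above, the per-scale thresholds (k) are placeholders whose optimal values remain to be studied.

```python
import numpy as np
from scipy.ndimage import convolve1d

def denoise_cube(cube, n_scales=2, k=3.0):
    """Sketch of a trous denoising extended to a (time, x, y) datacube.
    The separable B3-spline smoothing is applied along all three axes,
    treating time as a third wavelet dimension."""
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    c = cube.astype(float)
    planes = []
    for j in range(n_scales):
        # dilate the 1-D kernel for the a trous scheme at scale j
        step = 2 ** j
        hk = np.zeros(4 * step + 1)
        hk[::step] = h
        smooth = c
        for axis in range(3):  # smooth along time, x, and y in turn
            smooth = convolve1d(smooth, hk, axis=axis, mode="reflect")
        planes.append(c - smooth)  # 3-D wavelet plane at scale j
        c = smooth
    out = c.copy()
    for w in planes:
        sigma = np.median(np.abs(w)) / 0.6745  # robust (MAD) noise estimate
        out += np.where(np.abs(w) > k * sigma, w, 0.0)
    return out

rng = np.random.default_rng(2)
noisy_cube = rng.normal(0.0, 1.0, (16, 16, 16))  # toy (time, x, y) cube
cleaned_cube = denoise_cube(noisy_cube)
```

Thresholding in three dimensions exploits the temporal coherence of the Cherenkov flash, which is exactly the information discarded by time-integrating before cleaning.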
- It should be explored whether the wavelet parameters can be used in the gamma-hadron discrimination step of the pipeline in lieu of (or in addition to) the moment-based "Hillas parameters" that are currently used, which are computed after denoising. This additional information may also improve sensitivity.
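For reference, the moment-based Hillas parametrization mentioned above amounts to an intensity-weighted second-moment analysis of the camera image: the centroid comes from the first moments, and the shower length and width from the eigenvalues of the covariance matrix. The sketch below is a minimal illustration on a regular pixel grid (real cameras have hexagonal layouts and further parameters such as orientation and skewness).

```python
import numpy as np

def hillas_parameters(image, x, y):
    """Minimal moment-based Hillas parametrization: intensity-weighted
    centroid, then length/width from the eigenvalues of the
    second-moment (covariance) matrix of the light distribution."""
    a = image.ravel().astype(float)
    size = a.sum()
    xg = (a * x.ravel()).sum() / size   # centroid (first moments)
    yg = (a * y.ravel()).sum() / size
    dx = x.ravel() - xg
    dy = y.ravel() - yg
    cov = np.array([[(a * dx * dx).sum(), (a * dx * dy).sum()],
                    [(a * dx * dy).sum(), (a * dy * dy).sum()]]) / size
    evals = np.linalg.eigvalsh(cov)     # ascending: width^2, length^2
    return {"size": size, "x": xg, "y": yg,
            "width": np.sqrt(evals[0]), "length": np.sqrt(evals[1])}

# Elongated Gaussian blob as a toy shower image
xs, ys = np.meshgrid(np.linspace(-5, 5, 41), np.linspace(-5, 5, 41))
blob = np.exp(-(xs / 2.0) ** 2 - (ys / 0.8) ** 2)
params = hillas_parameters(blob, xs, ys)
```

Wavelet coefficients per scale would provide a complementary, multi-resolution description of the same image, which is what the proposed discrimination study would evaluate.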
- The code as currently developed still has some speed bottlenecks, in particular the need to transfer data to the wavelet algorithms in FITS format. This could be significantly improved by rewriting and simplifying some of the CosmoStat code.
- The final products should be fully documented, verified, and packaged so that they can be maintained and included in, for example, the CTA or HESS standard analyses. This requires some cleanup and maintenance work.
The work has so far been presented only at a CTA consortium meeting and a CTA pipelines workshop, mainly because concrete results were lacking until recently. We hope to present the final results more widely once the cross-checks are complete.