Tag Archives: Data Analysis, Statistics and Probability

A simple ‘knee-like’ approximation of the lateral distribution function (LDF) of Cherenkov light emitted by extensive air showers (EAS) in the atmosphere is proposed for solving various data analysis tasks in HiSCORE and other wide-angle ground-based experiments designed to detect gamma rays and cosmic rays with energies above tens of TeV. A simulation-based parametric analysis of individual LDF curves revealed that, at radial distances of 20-500 m, the 5-parameter ‘knee-like’ approximation fits both individual LDFs and the mean LDF with very good accuracy. In this paper we demonstrate the efficiency and flexibility of the ‘knee-like’ LDF approximation for various primary particles and shower parameters, and the advantages of applying it to suppressing the proton background and selecting primary gamma rays.
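As an illustration of what such a fit might look like in practice, here is a minimal Python sketch that fits a 5-parameter smoothly broken (‘knee-like’) power law to a sampled LDF. The paper's exact functional form is not reproduced here, so this particular parametrization and all numerical values are assumptions for demonstration only.

```python
# Minimal sketch: fitting a 5-parameter "knee-like" (smoothly broken
# power-law) curve to a sampled LDF. The parametrization below is an
# illustrative assumption, not the paper's actual formula.
import numpy as np
from scipy.optimize import curve_fit

def knee_ldf(r, c, r_knee, a, b, s):
    """Slope -a below the knee, -b above it; s sets the transition sharpness."""
    return c * (r / r_knee) ** -a * (1.0 + (r / r_knee) ** s) ** ((a - b) / s)

# Toy data: photon densities q at radial distances r in the 20-500 m range
r = np.linspace(20.0, 500.0, 50)
q = knee_ldf(r, 1e4, 120.0, 0.7, 2.2, 4.0) * np.random.lognormal(0.0, 0.05, r.size)

# Fit the five parameters (c, r_knee, a, b, s) to the sampled densities
popt, pcov = curve_fit(knee_ldf, r, q, p0=[q[0], 100.0, 1.0, 2.0, 3.0])
```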

In this paper we propose a ‘knee-like’ approximation of the lateral distribution of Cherenkov light from extensive air showers in the energy range 30-3000 TeV and study the possibility of its practical application in high-energy ground-based gamma-ray astronomy experiments (in particular, TAIGA-HiSCORE). The approximation achieves very good accuracy for individual showers and can easily be simplified for practical use in the HiSCORE wide-angle timing array when only a limited number of stations is triggered.
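One plausible way to simplify the fit when only a few stations trigger (an assumption on our part, not necessarily the paper's recipe) is to freeze the three shape parameters at typical values and fit only the normalization and knee position, continuing the sketch above:

```python
# Continuing the sketch above: with few triggered stations the 5-parameter
# fit is under-constrained, so freeze the shape parameters (illustrative
# values) and fit only the normalization c and the knee position r_knee.
A_FIX, B_FIX, S_FIX = 0.7, 2.2, 4.0

def knee_ldf_2par(r, c, r_knee):
    return knee_ldf(r, c, r_knee, A_FIX, B_FIX, S_FIX)

# Pretend only six stations triggered: fit the first six sampled points
popt2, _ = curve_fit(knee_ldf_2par, r[:6], q[:6], p0=[q[0], 100.0])
```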

A ‘knee-like’ approximation of Cherenkov light lateral distribution functions, which we developed earlier, is now applied to the practical task of background rejection in high-energy (tens to hundreds of TeV) gamma-ray astronomy. In this work we apply the technique to the HiSCORE wide-angle timing array, which consists of Cherenkov light detectors with a spacing of 100 m and covers 0.2 km$^2$ at present, with up to 5 km$^2$ planned for the future; however, it can also be applied to other similar arrays. We also show that a multivariate approach using 3 parameters of the knee-like approximation allows us to reach a high level of background rejection, although the rejection power depends strongly on the number of hit detectors.
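The abstract does not specify which classifier or which 3 fit parameters enter the multivariate selection; the sketch below, a random forest trained on hypothetical (r_knee, a, b) values, merely illustrates the idea.

```python
# Illustrative multivariate gamma/hadron separation on three knee-like fit
# parameters. The choice of parameters (r_knee, a, b), the classifier, and
# all toy distributions below are assumptions, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Toy stand-ins: rows = showers, columns = (r_knee, a, b) from the LDF fit
X_gamma  = rng.normal([120.0, 0.7, 2.2], [15.0, 0.1, 0.2], size=(1000, 3))
X_proton = rng.normal([150.0, 0.9, 2.6], [25.0, 0.2, 0.3], size=(1000, 3))
X = np.vstack([X_gamma, X_proton])
y = np.array([1] * 1000 + [0] * 1000)   # 1 = gamma, 0 = proton

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# A cut on clf.predict_proba(...)[:, 1] then trades gamma efficiency
# against proton suppression.
```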

This work is a methodological study of hybrid reconstruction techniques for combined imaging/timing Cherenkov observations. Such a hybrid array is to be realized at the TAIGA gamma-ray observatory, intended for very-high-energy gamma-ray astronomy (>30 TeV), and aims at combining the cost-effective timing-array technique with imaging telescopes. Hybrid operation of the two techniques offers a relatively inexpensive way to develop a large-area array. The joint gamma-event selection was investigated on both types of simulated data: image parameters from the telescopes and shower parameters reconstructed from the timing array. The optimal set of imaging and shower parameters to combine is identified, and the cosmic-ray background suppression factor is calculated as a function of distance and energy. The optimal selection technique suppresses the cosmic-ray background by about 2 orders of magnitude at distances up to 450 m for energies above 50 TeV.
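A suppression factor of this kind can be quantified, for instance, as the inverse of the proton passing fraction at a fixed gamma efficiency. The self-contained sketch below uses placeholder score distributions; in the paper the scores come from the combined image and shower parameters.

```python
# Self-contained sketch of quantifying background suppression from
# classifier scores at fixed gamma efficiency. The beta-distributed
# scores are placeholders for real selection scores.
import numpy as np

rng = np.random.default_rng(1)
score_gamma  = rng.beta(5, 2, 10000)   # scores for simulated gammas
score_proton = rng.beta(2, 5, 10000)   # scores for simulated protons

cut = np.quantile(score_gamma, 0.4)    # keep the top ~60% of gammas
efficiency  = (score_gamma  > cut).mean()
passed_bg   = (score_proton > cut).mean()
suppression = 1.0 / max(passed_bg, 1e-6)   # ~100 means two orders of magnitude
```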

This work is a methodological study of another application of the hybrid method, originally aimed at gamma/hadron separation in the TAIGA experiment. In the present paper the technique is used to distinguish between different mass groups of cosmic rays in the energy range 200-500 TeV. The study is based on simulated data for the TAIGA prototype and includes analysis of the geometrical form of the images produced by different nuclei in the IACT simulation, as well as of the shower core parameters reconstructed from the timing-array simulation. We show that the hybrid method can be effective enough to distinguish reliably between mass groups of cosmic rays.

Many observational records critically rely on our ability to merge different (and not necessarily overlapping) observations into a single composite. We provide a novel and fully traceable approach for doing so, which relies on a multi-scale maximum likelihood estimator. This approach overcomes the problem of data gaps in a natural way and uses data-driven estimates of the uncertainties. We apply it to the total solar irradiance (TSI) composite, which is currently being revised and is critical to our understanding of solar radiative forcing. While the final composite is pending decisions on what corrections to apply to the original observations, we find that the new composite is in closest agreement with the PMOD composite and the NRLTSI2 model. In addition, we evaluate long-term uncertainties in the TSI, which reveal a 1/f scaling.
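The estimator itself is multi-scale, but its core weighting idea can be shown in a deliberately simplified, single-scale form: each record contributes with inverse-variance weight, and gaps contribute no weight at all. Everything below (function name, shapes, values) is illustrative only.

```python
# Simplified, single-scale illustration of likelihood-based merging:
# inverse-variance weighting with NaN gaps carrying zero weight.
# The paper's actual estimator is multi-scale; this shows only the idea.
import numpy as np

def merge(observations, variances):
    """observations: (n_records, n_times), NaN marks a gap;
    variances: per-record noise variances, broadcastable to that shape."""
    w = 1.0 / np.broadcast_to(
        np.asarray(variances, float)[:, None], observations.shape).copy()
    w[np.isnan(observations)] = 0.0          # gaps carry zero weight
    x = np.nan_to_num(observations)
    return (w * x).sum(axis=0) / w.sum(axis=0)

# Two overlapping records of the same signal, one with a gap:
obs = np.array([[1360.5, 1360.7, np.nan],
                [1360.6, 1360.8, 1360.9]])
composite = merge(obs, variances=[0.01, 0.04])
```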

Data processing pipelines are among the most common kinds of astronomical software: chains of processes that transform raw data into valuable information. In this work a Python framework for astronomical pipeline generation is presented. It features a design pattern (Model-View-Controller) on top of a SQL relational database and is capable of handling custom data models, processing stages, and result-communication alerts, as well as producing automatic quality and structural measurements. This pattern separates the user's logic and data models from the processing flow inside the pipeline, delivering multiprocessing and distributed-computing capabilities for free. For the astronomical community this is an improvement over previous data processing pipelines: the programmer no longer has to deal with the processing flow and parallelization issues and can focus solely on the algorithms involved in the successive data transformations. The software, together with working examples of pipelines, is available to the community at https://github.com/toros-astro.
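To make the Model-View-Controller split concrete, here is a hypothetical miniature in plain Python. The class names and the tiny base classes are illustrative stubs only and do not reflect the framework's actual API (see the repository above for that).

```python
# Hypothetical MVC miniature: a data model (Model), a processing stage
# (Controller), and a result-communication rule (View). All names here
# are illustrative stubs, not the framework's real API.

class Model:            # stand-in for an ORM-backed data model (SQL row)
    pass

class Step:             # stand-in for a processing stage in the chain
    def process(self, data):
        raise NotImplementedError

class Alert:            # stand-in for a result-communication rule
    def check(self, record):
        raise NotImplementedError

class Detection(Model):                 # rows persisted to the database
    def __init__(self, source_id, magnitude):
        self.source_id, self.magnitude = source_id, magnitude

class Calibrate(Step):                  # one stage: raw -> calibrated counts
    def process(self, raw_counts):
        return [c - 100.0 for c in raw_counts]   # e.g. bias subtraction

class BrightSourceAlert(Alert):         # fires for bright detections
    def check(self, det):
        return det.magnitude < 15.0

# Usage: the framework, not the user, would wire stages into a flow
calibrated = Calibrate().process([250.0, 3100.0])
assert BrightSourceAlert().check(Detection(source_id=42, magnitude=14.2))
```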