Abstract. Consider a scenario in which one aims to learn models from dynamic, evolving data characterized by very large fluctuations that are attributable neither to noise nor to outliers. This may be the case, for instance, when predicting the potential future damage of earthquakes or oil spills, or when conducting financial data analysis. It follows that, in such a situation, the standard central limit theorem does not apply, since the associated Gaussian distribution exponentially suppresses large fluctuations. In this paper, we present an analysis of data aggregation and correlation in such scenarios. To this end, we introduce the Lévy, or stable, distribution, a generalization of the Gaussian distribution. Our theoretical conclusions are illustrated with various simulations, as well as on a benchmark financial database. We show which specific aggregation strategies should be adopted depending on the stability exponent of the Lévy distribution. Our results first show scenarios in which it may be impossible to determine the mean and the standard deviation of an aggregate. Second, we discuss the case in which an aggregate may have to be characterized by its largest fluctuations. Third, we illustrate that the correlation between two attributes may be underestimated if a Gaussian distribution is erroneously assumed.
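To make the heavy-tail behavior described above concrete, the following sketch draws symmetric α-stable (Lévy) samples with the standard Chambers-Mallows-Stuck method and compares their extremes to the Gaussian case α = 2. The function name and parameters are ours, chosen for illustration; the paper's own experiments are not reproduced here.

```python
# Sketch: sampling symmetric alpha-stable (Levy) variables via the
# Chambers-Mallows-Stuck method (beta = 0, unit scale).
import numpy as np

def sample_symmetric_stable(alpha, size, rng):
    """Draw `size` samples from a symmetric alpha-stable law."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # unit exponential
    if alpha == 1.0:
        return np.tan(u)                          # Cauchy special case
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos((1 - alpha) * u) / w) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
gauss_like = sample_symmetric_stable(2.0, 10_000, rng)  # alpha = 2: Gaussian
heavy = sample_symmetric_stable(1.5, 10_000, rng)       # alpha < 2: heavy tails
# The largest fluctuation of the heavy-tailed sample dwarfs the Gaussian one,
# which is why sample means and standard deviations become unreliable.
print(np.abs(gauss_like).max(), np.abs(heavy).max())
```

For α < 2 the variance of the stable law is infinite, so the empirical standard deviation of `heavy` grows with the sample size instead of converging, illustrating why the aggregation strategy must depend on the stability exponent.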