1. Introduction
Noise is an inherent property of any physical system. The perception of noise, whether constructive or destructive, differs from one scientific field to another. In information theory, noise stands for whatever masks the information content of a signal; noise is a nuisance that should be eliminated or minimised as much as possible in order to detect a signal. On the contrary, in biology, physics, chemistry, and neuroscience, noise refers to background fluctuations that can interfere with the system itself. In the last few decades, the constructive role of noise has been pointed out, for example, in stochastic resonance or Brownian ratchets, where additive or multiplicative noises drive the system behaviour, even increasing its efficiency [1].
White noise is probably the most commonly used descriptor for fluctuations in physical, chemical, and biological systems. It typically represents thermal effects at equilibrium. A random process is "white noise" when the distribution of its power spectral density (PSD) in the frequency domain is flat across all available frequencies. In addition, the auto-correlation function (ACF) of white noise is delta-correlated. In discrete space, a white noise sequence has the form of time-ordered, uncorrelated random variables. A sequence formed by independent and identically distributed (i.i.d.) random variables whose values are drawn from a probability distribution function (PDF) is a white noise; the converse statement, that white noise is necessarily a collection of i.i.d. random variables, is not true in general. Numerically, white noise is created as a sequence of uncorrelated pseudorandom numbers that repeats only after a significantly large number of steps. The i.i.d. values of white noise can be uniformly or normally distributed around a zero mean value; they can also satisfy various other PDFs.
As opposed to systems at equilibrium, where white noise can be used to encode all fast-decaying interactions, systems close to or far from equilibrium are better described by coloured noises. The colour of a noise is defined by the slope of a linear regression of the PSD on the log-log scale. If the PSD of a noise scales as a power law of the frequency $f$, that is, $S(f)\propto 1/f^{\beta}$, then the value of $\beta$ classifies the colour. It is purple for $\beta=-2$, blue for $\beta=-1$, white for $\beta=0$, pink for $\beta=1$, red for $\beta=2$, and black for $\beta>2$. Coloured noises can be associated with the presence of a drift term or a gradient, with the presence of a restoring mechanism, or with a synergic action or even brain functioning. For example, red noise has long been linked to the motion of molecules and particles (Brownian motion) [2,3,4,5]. A pink or flickering noise can drive animal populations and result in synchronization and cooperativity [6]. Black noises are associated with the frequency of natural disasters [7]. In recent years, coloured noises have therefore acquired increasing importance. They can either deliver information related to the environment where a random process takes place, or they can drive a system. The latter requires the formation of a memory, that is, the process should follow, up to a point, a trend, which can be either persistent or anti-persistent. Persistence is accompanied by a slowly varying ACF, thus reflecting the trend of a process that will likely follow its latest values. On the other hand, anti-persistence points to a process that reverses itself more often than white noise, and the ACF takes negative values.
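Under these conventions, the colour of a numerically given sequence can be read off by fitting the log-log slope of its periodogram. The following is a minimal sketch (NumPy; function name, length, and seed are illustrative, not the estimator used later in the paper):

```python
import numpy as np

def spectral_exponent(x, dt=1.0):
    """Estimate beta in S(f) ~ 1/f**beta from the periodogram of x."""
    n = len(x)
    freqs = np.fft.rfftfreq(n, d=dt)[1:]        # positive frequencies only
    psd = np.abs(np.fft.rfft(x)[1:]) ** 2 / n   # raw periodogram
    slope, _ = np.polyfit(np.log10(freqs), np.log10(psd), 1)
    return -slope                               # S(f) ~ f**(-beta)

rng = np.random.default_rng(0)
print(spectral_exponent(rng.standard_normal(2**16)))  # ~0 for white noise
```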
Coloured noises can be numerically produced starting from a white noise sequence through the action of a proper operator (filter). There are various algorithms available to create coloured noises, and it is important to distinguish between those working in the time domain and those working in the frequency domain. Auto-regressive approaches [8,9], physics-inspired algorithms based on Langevin-type equations [10,11], and fractional calculus [12] work in the time domain. In the frequency domain, fast Fourier transform (FFT)-based algorithms are commonly used [13,14]. All algorithms convert the white noise input into coloured noise output. This means that under the same transformation, white noises (mother processes) drawn from various probability distributions "give birth" to coloured noises (daughter processes), whose colour is the same regardless of the initial distribution of the white noise. The question is whether and to what extent the daughter process retains some of the properties of the mother process.
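As an illustration of the frequency-domain route, the sketch below reshapes the flat spectrum of a white noise by the factor $f^{-\beta/2}$, so that the resulting PSD scales as $1/f^{\beta}$. This is a generic FFT-filter sketch under the conventions of this paper, not necessarily the exact algorithm of [13,14]:

```python
import numpy as np

def colour_by_spectrum(white, beta):
    """FFT filtering: turn white noise into 1/f**beta coloured noise."""
    n = len(white)
    spec = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                  # avoid dividing by zero at f = 0
    spec *= freqs ** (-beta / 2.0)       # PSD = |spec|**2 gains 1/f**beta
    coloured = np.fft.irfft(spec, n)
    return (coloured - coloured.mean()) / coloured.std()

rng = np.random.default_rng(1)
pink = colour_by_spectrum(rng.standard_normal(2**16), beta=1.0)  # pink noise
```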
The colour (that is, the spectral exponent $\beta$) as an index for describing a random process is a second-order statistic [15,16], and accordingly, it roughly describes a random process around its mean and variance, while the mean does not necessarily exist for a plethora of random processes. A classification based only on the colour code cannot distinguish daughter processes coming from different mother processes, and thus a study limited to the PSD is not able to shed light on differences in coloured noises produced by different white noise distributions. The auto-correlation function (ACF) is a widely used metric for analyzing the characteristics of stochastic processes. The ACF is a second-order statistic, just like the PSD; it tells us whether a noise is persistent or anti-persistent (i.e., the kind of memory it retains). Apart from ACF and PSD, the fractal dimension (FD) [17], the Hurst exponent (H), and the generalised moments method (GMM) have been used to characterise noise properties [18]. In general, FD and H values describe a random process around its mean and are precise for monofractal processes. On the other hand, GMM can accurately describe the properties of non-stationary processes, while for stationary processes GMM returns a zero Hurst exponent, and the evaluation of the latter goes through rescaled range (R/S) analysis; see details in [19].
In order to detect mother process imprints on the daughter ones, we estimate the kurtosis of each produced coloured noise and compare it to the value of the kurtosis of the corresponding white noise. The value of the kurtosis of data series formed by the time-averaged mean square displacement of single trajectories has been used to classify the input trajectories as either ergodic or not [20,21]. This parameter is called the ergodicity breaking parameter (EB), and it has become a widely used measure to quantify fluctuations from trajectory to trajectory [22]. Kurtosis, however, does not always exist, since there are distributions with non-existent moments; for example, the first moment is not defined for the Cauchy distribution, and the higher moments diverge. To overcome this obstacle, we use the codifference function (CD) [23]. CD makes use of the characteristic function, and thus it always exists.
In this work, we create symmetric alpha-stable (SaS) white noises, which draw values from either a Gaussian (G), a Laplace (L), a Cauchy (C), or a uniform (U) distribution. For each one of them, we create coloured noises with seven different values of the spectral exponent $\beta$. Two techniques are used, namely fractional calculus (FC), operating in the time domain, and spectral processing (SP), operating in the frequency domain. Each produced noise is characterized in terms of ACF, PSD, CD, and kurtosis. We show that ACF and PSD cannot discriminate among coloured noises produced by different white noise distributions. On the other hand, kurtosis and CD can be used as indicators of the mother process traits that have persisted in the daughter process.
2. White Noises and Probability Distributions
Let $x_i$ be i.i.d. random variables drawing values from a probability distribution $P(x)$. A sequence of time-ordered events $\{x_i\}$, with $i=1,\dots,N$, then defines a random process. Such a process is called white noise, and it is strictly stationary, as all i.i.d. random processes are. Strictly stationary means that the statistical properties of a process do not change over time or, in other words, that the joint distribution of any number of random variables is the same as we shift them along the time-indexed axis. Strict stationarity does not imply the existence of finite moments. Random processes with a constant mean, a finite second moment, and an autocovariance that depends only on the time lag are called weakly stationary. Gaussian distributed i.i.d. random variables fulfill the criteria of both strict and weak stationarity. It is important to notice that neither strict nor weak stationarity implies the other.
Let two i.i.d. random variables have a common distribution; if any linear combination of the two has the same distribution up to location and scale parameters, then the distribution is called stable (unchanged shape). Stable distributions are an important class of probability distributions with interesting properties [24]. They are also known as $\alpha$-stable Lévy distributions. An $\alpha$-stable distribution requires four parameters for its complete description and is defined through its characteristic function, $\varphi(k)$. $\varphi(k)$ is the Fourier transform of the PDF, and it always exists for any real-valued random variable, as opposed to the moment generating function. The $\varphi(k)$ of an $\alpha$-stable distribution reads [25]

$$\varphi(k)=\exp\left\{i\mu k-\sigma^{\alpha}|k|^{\alpha}\left[1-i\,b\,\mathrm{sign}(k)\tan\left(\tfrac{\pi\alpha}{2}\right)\right]\right\},\qquad \alpha\neq 1,\tag{1}$$

(for $\alpha=1$ the factor $\tan(\pi\alpha/2)$ is replaced by $-(2/\pi)\ln|k|$), where $0<\alpha\leq 2$ is the index of stability or characteristic exponent, $-1\leq b\leq 1$ is a skewness parameter, $\sigma>0$ is a scale parameter, and $\mu$ is a location parameter or mean. For $b=0$, Equation (1) returns the characteristic function of the stretched exponential centered around its mean, $\varphi(k)=\exp(i\mu k-\sigma^{\alpha}|k|^{\alpha})$. For $\alpha<2$ the distribution has undefined variance, while for $\alpha\leq 1$ it also has an undefined mean. For $\alpha=2$ and for $\alpha=1$ (with $b=0$), Equation (1) returns the Gaussian and the Cauchy distributions, respectively, which read

$$P_{G}(x)=\frac{1}{2\sigma\sqrt{\pi}}\exp\left(-\frac{(x-\mu)^{2}}{4\sigma^{2}}\right)\tag{2}$$

and

$$P_{C}(x)=\frac{\sigma}{\pi\left[(x-\mu)^{2}+\sigma^{2}\right]}.\tag{3}$$
Mean, variance, skewness, and kurtosis are expressed by the moments of the probability distribution up to 4th order. The $q$th order integer moment with $q\in\mathbb{N}$ is derived directly from Equation (1) as

$$\langle x^{q}\rangle=(-i)^{q}\left.\frac{d^{q}\varphi(k)}{dk^{q}}\right|_{k=0}.\tag{4}$$

(Alternatively, the moments can be obtained directly from the integral $\langle x^{q}\rangle=\int_{-\infty}^{\infty}x^{q}P(x)\,dx$ when the PDF $P(x)$ is known.) Equation (4) can be generalized to include also fractional order moments and reads [26]

$$\langle|x|^{q}\rangle=\frac{2\,\Gamma(q+1)\sin(q\pi/2)}{\pi}\int_{0}^{\infty}\frac{1-\mathrm{Re}\,\varphi(k)}{k^{q+1}}\,dk,\tag{5}$$

where $0<q<2$, $\Gamma(\cdot)$ is the gamma function, and $\mathrm{Re}$ stands for the real part of the complex number. Having as a starting point Equation (5), one can obtain absolute moments centered around the mean; see for example [27].
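As a sanity check of Equation (5) as reconstructed above, the fractional-moment integral can be evaluated numerically for the standard Gaussian, $\varphi(k)=e^{-k^{2}/2}$, for which $\langle|x|\rangle=\sqrt{2/\pi}\approx 0.79788$ is known in closed form. A minimal sketch (function names are illustrative):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def abs_moment_from_cf(q, re_phi):
    """<|x|**q>, 0 < q < 2, from the real part of the characteristic function."""
    c_q = 2.0 * gamma(q + 1.0) * np.sin(q * np.pi / 2.0) / np.pi
    integral, _ = quad(lambda k: (1.0 - re_phi(k)) / k ** (q + 1.0), 0.0, np.inf)
    return c_q * integral

# standard Gaussian: phi(k) = exp(-k**2/2); <|x|> should be sqrt(2/pi)
print(abs_moment_from_cf(1.0, lambda k: np.exp(-k ** 2 / 2.0)))
print(np.sqrt(2.0 / np.pi))
```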
Geometric stable laws constitute a class of limiting distributions of appropriately normalized sums of i.i.d. random variables [28]. $\Psi(k)$ is the characteristic function of a geometric stable distribution if and only if it is connected to the characteristic function of an $\alpha$-stable distribution, $\varphi(k)$, through the relation

$$\Psi(k)=\left[1-\ln\varphi(k)\right]^{-1},\tag{6}$$

with the stability index, $\alpha$, the mean or location parameter, $\mu$, the scale parameter, $\sigma$, and the asymmetry parameter, $b$, used to describe a geometric stable distribution. For $\alpha=2$, $\mu=0$, $b=0$, and scale parameter $\sigma=1$, Equation (6) gives the characteristic function of the standard Laplace distribution, whose analytic form reads

$$P_{L}(x)=\frac{1}{2}e^{-|x|}.\tag{7}$$

In addition, Equation (7) with general location and scale parameters returns the classical Laplace distribution, whose PDF can be expressed as an addition or multiplication of i.i.d. random processes; see Table 2.3 in [29].
Finally, we also use the uniform distribution (U) with pdf

$$P_{U}(x)=\begin{cases}\dfrac{1}{b-a}, & a\leq x\leq b,\\ 0, & \text{otherwise},\end{cases}\tag{8}$$

with $a$ and $b$ being the value range of the distribution. For $a=-\sqrt{3}$ and $b=\sqrt{3}$, Equation (8) returns the uniform (U) distribution with zero mean and variance 1. The U-distribution is used as an auxiliary one for constructing the L and C distributions in discrete space; see below.
The discrete white noise time series shown in Figure 1 were generated using Matlab [30]. Technically speaking, the function rand of [30] has been used to create uniformly distributed random numbers, $U$. The function randn of [30], which is based on the Box-Muller algorithm [31], $G=\sqrt{-2\ln U_{1}}\cos(2\pi U_{2})$, where $U_{1}$ and $U_{2}$ are two uniformly distributed random numbers in the range (0,1) [32], has been used to produce Gaussian distributed random numbers. A Laplace (L) distributed white noise sequence is generated from the ratio of two uniformly distributed random numbers $U_{1}$ and $U_{2}$ taking values in the range (0,1), $L=\ln(U_{1}/U_{2})$ [33]. A Cauchy (C) white noise distribution is generated as $C=\tan[\pi(U-1/2)]$, where $U$ is a uniformly distributed random number in (0,1) [34]. Notice also that a Cauchy distribution can be constructed as the ratio of two i.i.d. discrete random variables with values drawn from the Gaussian distribution. All white noise sequences used in this work were shifted properly to have a zero mean and unit variance. The length of each of them was set to a fixed number of steps, and for the sake of simplicity, we convert steps/realizations to time steps (arbitrary units). Codifference is estimated as a function of the time lag from Equations (A6) and (A7); see Appendix B.
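The recipes above translate directly into a few lines of code. The following NumPy sketch mirrors the Matlab route described in the text (sequence length and seed are illustrative; the uniforms are shifted into (0, 1] to keep the logarithms finite):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2**16                                 # illustrative length

u1 = 1.0 - rng.random(n)                  # uniform in (0, 1]
u2 = 1.0 - rng.random(n)

gauss = np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)  # Box-Muller
laplace = np.log(u1 / u2)                 # log-ratio of two uniforms
cauchy = np.tan(np.pi * (rng.random(n) - 0.5))
uniform = np.sqrt(12.0) * (rng.random(n) - 0.5)  # zero mean, unit variance

# shift/scale to zero mean and unit variance (not meaningful for Cauchy)
gauss = (gauss - gauss.mean()) / gauss.std()
laplace = (laplace - laplace.mean()) / laplace.std()
```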
Typical statistical measures for classifying time series are the autocorrelation function (ACF), the power spectral density (PSD), the mean square displacement (MSD), and its generalization (the generalized moments method, GMM), which includes moments smaller and higher than two, including also fractional moments [18]. For all white noises used in this study, the ACF is by definition delta-correlated, and thus no distinction can be made based on it. Similarly, the PSD is flat for all noises, and the GMM scaling depends only on the order of the moment, $q$, with only the prefactor changing from one white noise to another; see the analytical forms of the absolute central moments in Table 1. In addition, by construction, the produced white noises are symmetric $\alpha$-stable (SaS) sequences, and accordingly, skewness is zero. Kurtosis provides the first indication regarding the differences between the white noise sequences [35]. The values of the kurtosis for the different white noises shown in Figure 1 are exactly in line with theory; see Table 1. A standard Gaussian white noise is characterized by a kurtosis of 3 and has a mesokurtic distribution. Distributions with kurtosis values greater than 3 are called leptokurtic, while PDFs with kurtosis values smaller than 3 are called platykurtic. In this frame, Laplace and Cauchy white noises are leptokurtic, while uniform white noise is platykurtic. The codifference function for each white noise is also shown in Figure 1. The definitions of the codifference function are given in Appendix B, see Equations (A5)-(A7). For Gaussian white noise, the codifference function is equal to the variance at zero lag and zero otherwise, Equation (A8), and the result depicted in Figure 1 is in perfect agreement with theory. Equation (A8) also provides the codifference function for Cauchy white noise, which is zero, in agreement with our findings presented in Figure 1. At zero lag, the codifference function takes a distinct value for each of the Gaussian, Laplace, and uniform white noises; see also Table 2 and the discussion in Appendix B. Given that the codifference quantifies the similarity of the bulk of the distribution, with the Gaussian value as reference, we expect this value to be smaller in absolute value for Laplace white noise (less similarity) and higher in absolute value for uniform white noise (higher similarity).
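Both discriminating statistics are straightforward to estimate numerically. Kurtosis follows from the standardized fourth moment; for the codifference, the sketch below uses one common estimator built from the empirical characteristic function. Since Equations (A5)-(A7) live in the appendix and are not reproduced here, the sign and normalization conventions below are an assumption, not necessarily the paper's exact definition:

```python
import numpy as np
from scipy.stats import kurtosis

def codifference(x, lag, theta=1.0):
    """Empirical codifference at a given lag via the empirical characteristic
    function (one plausible convention; see the caveat in the text)."""
    a = x[lag:] if lag else x
    b = x[:-lag] if lag else x
    cf = lambda z: np.mean(np.exp(1j * z))
    return np.real(np.log(cf(theta * (a - b)))
                   - np.log(cf(theta * a)) - np.log(cf(-theta * b)))

rng = np.random.default_rng(3)
g = rng.standard_normal(10**6)
print(kurtosis(g, fisher=False))               # Pearson kurtosis, ~3
print(codifference(g, 0), codifference(g, 5))  # ~variance at lag 0, ~0 beyond
```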
Table 1 shows the absolute moments, the mean, the variance, the skewness, and the kurtosis for the Gaussian, Laplace, Cauchy, and uniform white noise distributions. Notice that the Cauchy distribution, Equation (3), has an undefined first moment and a non-converging second moment. However, we can form discrete white noise distributions with a constant mean and a time-dependent variance, which draw values from a Cauchy distribution, by imposing a kind of truncation.
4. Results and Discussion
Each white noise sequence depicted in Figure 1 is subject to the action of both fractional calculus (FC) and spectral processing (SP), and seven coloured sequences for each operator are produced. Therefore, each white noise (mother process) yields 14 daughter processes (7 with FC and 7 with SP), each of the same length as the mother sequence. In total, we generate 4 × 14 = 56 new sequences, which we analyze.
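The FC operator itself is specified in the methods section; as a self-contained illustration of the time-domain route, the sketch below applies fractional integration of order $d=\beta/2$ through the recursive binomial weights of $(1-B)^{-d}$ (a Grünwald-Letnikov/ARFIMA-type filter), which yields an approximately $1/f^{\beta}$ spectrum at low frequencies. Parameters are illustrative, and this is not claimed to be the paper's exact implementation:

```python
import numpy as np

def fractional_integrate(white, beta):
    """Colour white noise in the time domain with the weights of (1-B)**(-d)."""
    d = beta / 2.0
    n = len(white)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 + d) / k   # recursive binomial weights
    coloured = np.convolve(white, w)[:n]    # truncated convolution
    return (coloured - coloured.mean()) / coloured.std()

rng = np.random.default_rng(2)
daughter = fractional_integrate(rng.standard_normal(2**12), beta=0.5)
```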
ACFs for daughter processes derived from the various white noises (G, L, C, and U) are essentially identical when noises of the same colour are compared (see Figure 2). Likewise, the operator involved in the creation of each daughter process has no significant impact on this property of the produced sequence. In addition, the ACFs display the fundamental characteristics of the generated noises, i.e., correlated (smoother trajectories) and anti-correlated (rougher trajectories). Contrary to the non-vanishing ACFs for $\beta>0$, which indicate correlation, the small negative part of the ACFs for $\beta<0$ highlights processes that are anti-correlated. Notice also that the effects of correlation (long tails) and anti-correlation (deeper minima) are stronger the higher the absolute value of $\beta$.
The PSDs do not distinguish between noise sequences of the same colour generated from various white noise distributions (G, L, C, and U). Actually, as it should be, the slope of the PSDs in log-log linear regression remains constant for at least 4 orders of magnitude, and its value corresponds to a particular colour. PSDs produced by SP present some differences with respect to those produced by FC. The SP method exerts a "brute force" on the flat spectrum of white noise and tilts it by a given slope (colour), and thus all data points fit well into a linear regression on a log-log scale (see Figure 2). Instead, FC creates a memory that does not last forever; thus, in the long-time limit, this memory is washed out, and the spectrum in this region looks flat, retaining properties of the mother process from which it came. In addition, it has been noted that spectral methods accurately predict the scaling of synthetic time series produced in the frequency domain, while for those produced in the time domain, half of the spectral estimates deviate significantly from the nominal value of the scaling exponent [44].
The first proof of discrimination between noises of the same colour produced from white noises sampled from different PDFs is provided by kurtosis. The value of the kurtosis of the mother processes is equal to 3 for Gaussian, 6 for Laplace, and 1.8 for uniform, and it is very large for Cauchy white noise (see Figure 1). Remarkably, daughter processes of the same colour but generated from different mother processes (i.e., different PDFs) differ (see Figure 3). Gaussian coloured noises (whatever the colour) retain the mesokurtic behaviour, as expected for a linear transformation of a Gaussian distribution, while all the other daughter processes deviate from the value of 3. Laplace and Cauchy coloured noises are leptokurtic (kurtosis > 3) and uniform ones are platykurtic (kurtosis < 3), but the value of the kurtosis depends on the colour of the noise. Interestingly, in non-mesokurtic daughter processes, the kurtosis tends to 3 as the absolute value of $\beta$ increases.
Codifference is the second index that distinguishes coloured noises with the same spectral exponent but produced by different mother processes. The codifference function, $\mathrm{CD}(s)$, where $s$ is the time lag (see Appendix B), takes at $s=0$ essentially the value of the Gaussian mother process for all coloured noises coming from Gaussian white noise. On the contrary, all coloured noises derived from a uniform mother process are characterized by a zero-lag CD that is larger in absolute value, and those derived from Laplace white noise by one that is smaller in absolute value (see Figure 3 and Table 2). A completely different behaviour is displayed by Cauchy coloured noises, whose CD function keeps a constant value around zero.
Table 2 reports the values of CD at lag time $s=0$ obtained from numerical simulations. The value of CD for Laplace white noise at $s=0$ is in agreement with theoretical expectations; see Appendix B. The corresponding values of CD(0) for the coloured daughter processes of a Laplace white noise present only small changes with respect to the value of the mother process. The value of CD(0) obtained for uniform white noise is in perfect agreement with theory, see Appendix B, and all the daughter processes derived from uniform white noise are characterized by a CD(0) value close to that of the mother, confirming that all of them retain imprints of the mother process. The same holds for Gaussian white noise: all noises produced from it, independent of the operator used (FC or SP), have a CD(0) value equal or very similar to that of the mother, in perfect agreement with theory; see Table 2 and Appendix B. For Cauchy white noise, the codifference function is zero at lag zero, see Equation (A8) in Appendix B, and it is zero independent of the time lag, Figure 3. The same is true for all daughter processes that originated from Cauchy white noise; see Figure 3, where the yellow lines stand for Cauchy noises. A zero value of the codifference for $\alpha$-stable distributions states that the processes involved in its definition are independent of each other if $0<\alpha<1$ or $1<\alpha\leq 2$. For $\alpha=1$, the Cauchy distribution, it is not clear whether the involved processes are independent of each other or not; see also the discussion in Appendix B.
In order to validate the results obtained from the analysis of the codifference function, we compared the analytical form of $\mathrm{CD}(s)$ given by Equation (A10) for Gaussian coloured noises with the results of the numerical simulations for the same coloured noises and for all spectral exponents considered; in our simulations, we produced coloured noises by both fractional calculus and spectral processing (see Figure 4). The values obtained from numerical simulations match theoretical predictions nicely. The characterization of the daughter processes demonstrates that second-order statistics, such as PSD and ACF, can detect the colour of a noise but cannot discriminate the PDF of the original mother noise. On the contrary, kurtosis and CD are effective tools to separate coloured noises generated from various white noise distributions. For all coloured noises produced from the original mother process, kurtosis can provide information regarding the PDF of the mother process: it will be greater than 3 for a leptokurtic mother white noise and smaller than 3 for a platykurtic one. Furthermore, CD at lag time zero effectively maintains the value of the original mother process, within a small fluctuating range, for all coloured noises that are derived from it, and consequently it can be used as a fingerprint for the detection of the PDF family (lepto-, platy-, or mesokurtic) it belongs to. The lag-time dependence of the codifference function is also interesting. No matter from which white noise PDF a daughter process was derived, CD exhibits nearly the same time dependence when the colour value is within the range [0.25, 0.75] (see Figure 3). On the other hand, there is a definite distinction between noises of the same colour derived from different white noise PDFs when the colour value is negative. According to Figure 3, the coloured noises generated from the uniform distribution have the largest absolute maximum in comparison to the other noises, followed by the coloured noises generated from the Gaussian distribution. Conversely, noise generated from the Laplace distribution always has the lowest maxima. The properties under consideration do not appear to be impacted by whether a coloured noise is produced using FC or SP. One can note differences in the quality of the power spectra, but the latter has to do with how a coloured noise is produced, as already discussed in the text.
Four distinct white noise distributions were chosen, and the coloured noises produced from them have been analysed in the present work. These distributions find application in many different branches of the sciences, to name a few: (i) virtually any area of science for Gaussian white noise; (ii) resonance, spectroscopy, prices of speculative assets in economics, and the distribution of hypocenters on focal spheres of earthquakes in geophysics [25,45] for Cauchy; (iii) the formation of images by the eye [46] and decision-making systems that maximise information for uniform; and (iv) communications, economics, engineering, and finance for Laplace [47]. The present work can be extended in many different ways. First, the same initial white noise distributions can be chosen without the SaS condition, and thus, apart from kurtosis and CD, the skewness can also be examined in order to see whether this property of coloured noises also retains characteristics of the initial white noises. In addition, white noises different from those used in this work can be selected as seeds, and it will be of interest whether kurtosis and CD maintain some of the properties of the initial white noises in the way we found in this work. A detailed investigation of various white noise distributions can lead to a general conclusion about the behaviour of kurtosis and CD. A positive answer (common behaviour) will support studies where a coloured noise finding (non-equilibrium process) can directly indicate the form of the equilibrium from which it came, thus pointing to possible mechanisms acting on the equilibrium state.
Figure 1.
White noises: trajectories (first column), probability distribution (second column), kurtosis (third column), and codifference function (fourth column) are displayed for Gaussian (blue), uniform (green), Laplace (orange), and Cauchy (yellow) PDFs.
Figure 2.
Autocorrelation function (ACF) and power spectral density (PSD) for all produced coloured noises. The inset indicates the operator used for the creation of the noise sequence and the spectral exponent of the noise. Blue for daughter noises coming from Gaussian white noise, green for daughter noises coming from uniform white noise, orange for those coming from Laplace white noise, and yellow for those coming from Cauchy white noise.
Figure 3.
Kurtosis and codifference function for all produced coloured noises. The inset defines the PDF of the mother process and the operator used for the creation of a noise sequence.
Figure 4.
Codifference function as a function of the lag time, as predicted by simulations and by Equation (A9).
Table 1.
Characteristic function (CF), absolute central moments $\langle|x-\mu|^{q}\rangle$, mean, variance (var), skewness (Sk), and kurtosis (Ku) for uniform (U), Gaussian (G), Laplacian (L), and Cauchy (C) i.i.d. random variables. Variance, skewness, and kurtosis can be obtained from the absolute central moments. In addition, for $q\geq 1$ all moments of the Cauchy distribution are either undefined or go to infinity; however, after a suitable renormalization, for example, dividing each component of the probability distribution by its highest value, all the moments of a Cauchy distribution may exist.
|  | CF | $\langle|x-\mu|^{q}\rangle$ | mean | var | Sk | Ku |
| U | $e^{i\mu k}\,\frac{\sin[k(b-a)/2]}{k(b-a)/2}$ | $\frac{[(b-a)/2]^{q}}{q+1}$ | $(a+b)/2$ | $(b-a)^{2}/12$ | 0 | 1.8 |
| G | $e^{i\mu k-\sigma^{2}k^{2}/2}$ | $\frac{2^{q/2}\sigma^{q}\,\Gamma[(q+1)/2]}{\sqrt{\pi}}$ | $\mu$ | $\sigma^{2}$ | 0 | 3 |
| L | $\frac{e^{i\mu k}}{1+s^{2}k^{2}}$ | $s^{q}\,\Gamma(q+1)$ | $\mu$ | $2s^{2}$ | 0 | 6 |
| C | $e^{i\mu k-\sigma|k|}$ | $\frac{\sigma^{q}}{\cos(q\pi/2)}$ ($0<q<1$) | undefined | ∞ | ∞ | ∞ |
Table 2.
Codifference values at lag time $s=0$, obtained from numerical simulations of the mother white noises, namely Gaussian, Laplace, and Uniform, as well as of all their daughter coloured noises produced either with FC or with SP. For Cauchy white and coloured noises, CD $=0$.
|  |  |  |  |  | 0 | 0.25 | 0.50 | 0.75 |
| Gaussian |  |  |  |  |  |  |  |  |
| FC |  |  |  |  |  |  |  |  |
| SP |  |  |  |  |  |  |  |  |
| Laplace |  |  |  |  |  |  |  |  |
| FC |  |  |  |  |  |  |  |  |
| SP |  |  |  |  |  |  |  |  |
| Uniform |  |  |  |  |  |  |  |  |
| FC |  |  |  |  |  |  |  |  |
| SP |  |  |  |  |  |  |  |  |