1. Introduction
The generalized inverse Gaussian (hereafter GIG) distribution with parameters $p \in \mathbb{R}$, $a > 0$, $b > 0$ has density
$$g_{p,a,b}(x) = \frac{(a/b)^{p/2}}{2K_p(\sqrt{ab})}\, x^{p-1} e^{-\frac{1}{2}\left(ax + \frac{b}{x}\right)}, \qquad x > 0,$$
where $K_p$ is the modified Bessel function of the third kind with index $p$.
In [1], the authors established the rate of convergence of the GIG distribution to the gamma distribution by Stein's method. In order to compare the rate obtained via Stein's method with the rate obtained using another distance,
they also derived an explicit upper bound on the total variation distance between a GIG random variable and a gamma random variable, which is of order for the case . We generalize this result by providing the order of the rate of convergence in total variation of the GIG distribution to the gamma distribution for all , . In particular, we obtain a rate of convergence of order for , which is
better than the one in [1].
For $a > 0$, $b \in \mathbb{R}$, $c > 0$, the Kummer distribution has density function
$$k_{a,b,c}(x) = \frac{x^{a-1}(1+x)^{-(a+b)}\, e^{-cx}}{\Gamma(a)\,\psi(a, 1-b; c)}, \qquad x > 0,$$
where $\psi$ is the confluent hypergeometric function of the second kind and $\Gamma$ is the gamma function.
Details on the GIG and the Kummer distributions can be found in [1,2,3,4,5] and references therein.
For $\alpha > 0$, $\lambda > 0$, the gamma distribution has density function
$$f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \qquad x > 0.$$
For $\alpha > 0$, $\beta > 0$, the inverse gamma distribution has density function
$$f(x) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{-\alpha-1} e^{-\beta/x}, \qquad x > 0.$$
For $\alpha > 0$, $\beta > 0$, the beta distribution of type 2 has density
$$f(x) = \frac{x^{\alpha-1}(1+x)^{-(\alpha+\beta)}}{B(\alpha, \beta)}, \qquad x > 0,$$
where $B$ is the beta function.
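For readers who wish to evaluate these densities numerically, the following Python sketch implements them with SciPy, under the standard parameterizations written above; the function names and the parameter values in the sanity checks are illustrative only and are not taken from the paper.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv, hyperu, gamma as gamma_fn, beta as beta_fn

def gig_pdf(x, p, a, b):
    # GIG(p, a, b): (a/b)^{p/2} / (2 K_p(sqrt(ab))) * x^{p-1} * exp(-(a x + b/x)/2)
    return (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b))) \
        * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

def kummer_pdf(x, a, b, c):
    # Kummer(a, b, c): x^{a-1} (1+x)^{-(a+b)} exp(-c x) / (Gamma(a) * psi(a, 1-b; c))
    return x ** (a - 1) * (1.0 + x) ** (-(a + b)) * np.exp(-c * x) \
        / (gamma_fn(a) * hyperu(a, 1.0 - b, c))

def gamma_pdf(x, alpha, lam):
    return lam ** alpha / gamma_fn(alpha) * x ** (alpha - 1) * np.exp(-lam * x)

def invgamma_pdf(x, alpha, beta):
    return beta ** alpha / gamma_fn(alpha) * x ** (-alpha - 1) * np.exp(-beta / x)

def beta2_pdf(x, alpha, beta):
    return x ** (alpha - 1) * (1.0 + x) ** (-(alpha + beta)) / beta_fn(alpha, beta)

# Sanity check: every density should integrate to 1 over (0, infinity).
print(quad(gig_pdf, 0, np.inf, args=(-1.5, 2.0, 3.0))[0])    # ~ 1.0
print(quad(kummer_pdf, 0, np.inf, args=(2.0, 1.5, 0.5))[0])  # ~ 1.0
print(quad(beta2_pdf, 0, np.inf, args=(2.0, 3.0))[0])        # ~ 1.0
```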
We recall the following definition and property of the total variation distance.
Definition 1.
Let $X$ and $Y$ be two continuous real random variables, with densities $f_X$ and $f_Y$ respectively. Then the total variation distance between $X$ and $Y$ is given by
$$d_{TV}(X, Y) = \frac{1}{2}\int_{\mathbb{R}} \left| f_X(x) - f_Y(x) \right| \, dx.$$
Property 1.
Let $X$ and $Y$ be two continuous random variables. Let $f_X$ (resp. $f_Y$) be the density of $X$ (resp. $Y$) on $(0, +\infty)$. Assume that the function $f_X - f_Y$ has a unique zero $x_0$ on $(0, +\infty)$.
- If $f_X - f_Y$ is positive for $x < x_0$, then $d_{TV}(X, Y) = F_X(x_0) - F_Y(x_0)$,
- If $f_X - f_Y$ is negative for $x < x_0$, then $d_{TV}(X, Y) = F_Y(x_0) - F_X(x_0)$,
where $F_X$ (resp. $F_Y$) denotes the distribution function of $X$ (resp. $Y$).
Proof.
Let $F_X$ (resp. $F_Y$) be the distribution function of $X$ (resp. $Y$). If $f_X - f_Y$ is positive for $x < x_0$, then
$$d_{TV}(X,Y) = \frac{1}{2}\left[\int_0^{x_0}\big(f_X - f_Y\big)(x)\,dx + \int_{x_0}^{+\infty}\big(f_Y - f_X\big)(x)\,dx\right] = \frac{1}{2}\Big[\big(F_X(x_0) - F_Y(x_0)\big) + \big(F_X(x_0) - F_Y(x_0)\big)\Big] = F_X(x_0) - F_Y(x_0),$$
which proves item 1. For item 2, similar arguments lead to the result.
Remark 1.
The support of the densities may be any interval, but here we take it to be $(0, +\infty)$ in view of the application to the GIG and Kummer distributions.
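As an illustration of Definition 1 and Property 1, the following sketch computes the total variation distance between two densities on $(0,+\infty)$ in two ways: directly from the definition, and through the distribution functions evaluated at the unique crossing point. The two gamma densities used below are an arbitrary example chosen for this sketch, not taken from the paper.

```python
# Assuming d_TV = (1/2) * integral of |f - g|: when the two densities cross exactly
# once on (0, inf), the distance reduces to the difference of the two distribution
# functions at the crossing point.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq
from scipy.stats import gamma

X = gamma(a=3.0, scale=1.0)        # density f_X
Y = gamma(a=3.0, scale=1.2)        # density f_Y

# Direct integration of the definition.
tv_integral = 0.5 * quad(lambda x: abs(X.pdf(x) - Y.pdf(x)), 0, np.inf)[0]

# Property 1: locate the unique zero x0 of f_X - f_Y, then compare the CDFs there.
x0 = brentq(lambda x: X.pdf(x) - Y.pdf(x), 1e-6, 50.0)
tv_crossing = abs(X.cdf(x0) - Y.cdf(x0))

print(tv_integral, tv_crossing)    # the two values should agree
```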
The aim of this paper is to provide a bound for the distance between a GIG (resp. a Kummer) random variable and its limiting inverse gamma or gamma variables (resp. gamma or beta variables), and thereby to contribute to the study of the rate of convergence in the corresponding limit theorems. Section 2 presents the main results; their proofs are given in Section 3.
2. Main results
2.1. On the rate of convergence of the generalized inverse Gaussian distribution to the inverse gamma distribution
We first recall, as Proposition 1, the convergence of the GIG distribution to the inverse gamma distribution; the first main result is then stated in Theorem 1.
Proposition 1.
For , , let be a sequence of random variables such that for each . Then, as , the sequence converges in law to a random variable following the distribution.
Theorem 1.
Under the assumptions and notations of Proposition 1, we have:
Remark 2.
The upper bound provided by Theorem 1 is of order .
Table 1 and Table 2 present some numerical results for . This case is particularly interesting since it corresponds to the inverse Gaussian distribution, which is used in data analysis when the observations are highly right-skewed [6,7]. The inverse Gaussian law arises as the distribution of the first hitting time of a fixed level by a Brownian motion with drift [8].
Table 1. Numerical values for and .

| Total variation distance | Upper bound |
| 0.008963786 | 0.01 |
| 0.002983103 | 0.003162278 |
| 0.0004934534 | 0.001 |
| 0.0001549545 | 0.0003162278 |
| $4.948836 \times 10^{-5}$ | 0.0001 |
| $1.570466 \times 10^{-5}$ | $3.162278 \times 10^{-5}$ |
Table 2. Numerical values for and .

| Total variation distance | Upper bound |
| 0.02614564 | 0.03162278 |
| 0.008963782 | 0.01 |
| 0.002971153 | 0.003162278 |
| 0.0004843202 | 0.001 |
| 0.0001553049 | 0.0003162278 |
| $4.927859 \times 10^{-5}$ | 0.0001 |
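The qualitative behaviour reported in Tables 1 and 2 can be reproduced numerically. The sketch below computes, by direct integration, the total variation distance between a GIG density and the inverse gamma density obtained by letting the parameter $a$ tend to 0 in the parameterization used above; the parameter values, and the reference scale $\sqrt{a}$ printed alongside, are illustrative and are not the exact values behind the tables.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv, gamma as gamma_fn

def gig_pdf(x, p, a, b):
    return (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b))) \
        * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

def invgamma_pdf(x, alpha, beta):
    return beta ** alpha / gamma_fn(alpha) * x ** (-alpha - 1) * np.exp(-beta / x)

p, b = -0.5, 1.0   # p = -1/2 corresponds to the inverse-Gaussian-type case
for a in [1e-1, 1e-2, 1e-3, 1e-4]:
    # inverse gamma limit (shape -p, scale b/2) obtained as a -> 0
    tv = 0.5 * quad(lambda x: abs(gig_pdf(x, p, a, b) - invgamma_pdf(x, -p, b / 2.0)),
                    0, np.inf, limit=200)[0]
    # sqrt(a) is printed only as a reference scale, not as the paper's stated bound
    print(f"a = {a:g}   d_TV = {tv:.3e}   sqrt(a) = {np.sqrt(a):.3e}")
```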
2.2. On the rate of convergence of the generalized inverse Gaussian distribution to the gamma distribution
Theorem 2.
For , , let be a sequence of random variables such that for each . As , the sequence converges in distribution to a random variable following the distribution.
where
and
Corollary 1.
The upper bound provided by Theorem 2 is of order for and of order for all of the form , , integer.
Remark 3.
In [1], by Stein's method, the authors established an explicit upper bound of for a regular function in , the class of bounded functions for which , , exist
and are bounded. For , , integer, the upper bound provided in [1] by Stein's method is of order (Proposition 3.3). This is the same order as in our result. In addition, our upper bound is simpler than the one in [1] obtained by Stein's method (Theorem 3.1), and sharper than the one obtained in Proposition 3.4 of [1].
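A similar numerical check can be made for the gamma limit discussed in this subsection. In the parameterization used above, letting $b$ tend to 0 with $p > 0$ removes the factor $e^{-b/(2x)}$ and leaves a gamma density with shape $p$ and rate $a/2$; the sketch below computes the corresponding total variation distances for illustrative parameter values, which are not those of the paper.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv, gamma as gamma_fn

def gig_pdf(x, p, a, b):
    return (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b))) \
        * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

def gamma_pdf(x, alpha, lam):
    return lam ** alpha / gamma_fn(alpha) * x ** (alpha - 1) * np.exp(-lam * x)

p, a = 2.0, 2.0    # illustrative values only
for b in [1e-1, 1e-2, 1e-3, 1e-4]:
    # gamma limit (shape p, rate a/2) obtained as b -> 0
    tv = 0.5 * quad(lambda x: abs(gig_pdf(x, p, a, b) - gamma_pdf(x, p, a / 2.0)),
                    0, np.inf)[0]
    print(f"b = {b:g}   d_TV = {tv:.3e}")
```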
2.3. On the rate of convergence of the Kummer distribution to the gamma distribution
As in the previous subsection, the following theorem gives the rate of convergence in total variation of the Kummer distribution to the gamma distribution.
Theorem 3.
Let be a sequence of random variables such that with , .
Then,
- 1. As , the sequence converges in distribution to a random variable following the distribution.
- 2.
where and
Tables 3 and 4 present the numerical results for fixed values , and . The upper bound is .
Table 3. Numerical results for .

| Total variation distance | Upper bound |
| 0.0001721703 | 0.001817133 |
| 1.721839 × 10^( ) | 1.815646 × 10^( ) |
| 1.721869 × 10^( ) | 1.815546 × 10^( ) |
| 1.722037 × 10^( ) | 1.816018 × 10^( ) |
| 1.723704 × 10^( ) | 1.820897 × 10^( ) |
| 1.740368 × 10^( ) | 1.870423 × 10^( ) |
Table 4. Numerical results for and .

| Total variation distance | Upper bound |
| 0.0001045401 | 0.005830092 |
| 1.045445 × 10^( ) | 5.828016 × 10^( ) |
| 1.045512 × 10^( ) | 5.82978 × 10^( ) |
| 1.046143 × 10^( ) | 5.849711 × 10^( ) |
| 1.052453 × 10^( ) | 6.053044 × 10^( ) |
| 1.360213 × 10^( ) | 8.518632 × 10^( ) |
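The behaviour in Tables 3 and 4 can also be explored numerically. Since the exact scaling in Theorem 3 is not reproduced above, the sketch below uses one standard scaling under which the Kummer family has a gamma limit, namely $cX$ tending to a gamma variable with shape $a$ and rate 1 as $c \to \infty$; this scaling and the parameter values are assumptions of the sketch rather than the statement of the theorem.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import hyperu, gamma as gamma_fn

def kummer_pdf(x, a, b, c):
    return x ** (a - 1) * (1.0 + x) ** (-(a + b)) * np.exp(-c * x) \
        / (gamma_fn(a) * hyperu(a, 1.0 - b, c))

def gamma_pdf(x, alpha, lam):
    return lam ** alpha / gamma_fn(alpha) * x ** (alpha - 1) * np.exp(-lam * x)

a, b = 2.0, 1.5    # illustrative values only
for c in [10.0, 100.0, 1000.0]:
    # density of c * X when X follows the Kummer(a, b, c) distribution
    scaled_pdf = lambda y: kummer_pdf(y / c, a, b, c) / c
    tv = 0.5 * quad(lambda y: abs(scaled_pdf(y) - gamma_pdf(y, a, 1.0)), 0, np.inf)[0]
    print(f"c = {c:g}   d_TV = {tv:.3e}")
```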
2.4. On the rate of convergence of the Kummer distribution to the beta distribution
We have the following result.
Theorem 4.
Let be a sequence of random variables such that with , .
Then,
- 1. As , converges in law to a random variable following the distribution.
- 2.
where and
Remark 4.
As , . Therefore, the upper bound provided in (7) is of order .
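For the beta limit, letting $c$ tend to 0 in the Kummer density removes the exponential factor and leaves the beta density of type 2 with parameters $(a, b)$, provided $b > 0$. The sketch below computes the corresponding total variation distances for illustrative parameter values, which are not those of the paper.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import hyperu, gamma as gamma_fn, beta as beta_fn

def kummer_pdf(x, a, b, c):
    return x ** (a - 1) * (1.0 + x) ** (-(a + b)) * np.exp(-c * x) \
        / (gamma_fn(a) * hyperu(a, 1.0 - b, c))

def beta2_pdf(x, alpha, beta):
    return x ** (alpha - 1) * (1.0 + x) ** (-(alpha + beta)) / beta_fn(alpha, beta)

a, b = 2.0, 3.0    # illustrative values only; b > 0 is required for the beta limit
for c in [1e-1, 1e-2, 1e-3]:
    # beta type 2 limit with parameters (a, b) obtained as c -> 0
    tv = 0.5 * quad(lambda x: abs(kummer_pdf(x, a, b, c) - beta2_pdf(x, a, b)),
                    0, np.inf)[0]
    print(f"c = {c:g}   d_TV = {tv:.3e}")
```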
3. Proofs of main results
Proof of Proposition 1.
For all ,
We now use the well-known fact (see for instance [9,10]) that, as $x \to 0$,
$$K_p(x) \sim \frac{\Gamma(|p|)}{2}\left(\frac{2}{x}\right)^{|p|}, \qquad p \neq 0,$$
to see that
For all integer , . The function is integrable on . By Lebesgue's Dominated Convergence Theorem:
Hence
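The small-argument behaviour of $K_p$ invoked in this proof can be checked numerically; the sketch below compares $K_p(x)$ with $\frac{\Gamma(|p|)}{2}(2/x)^{|p|}$ for a few orders $p$ and small arguments $x$ (the chosen values are arbitrary).

```python
import numpy as np
from scipy.special import kv, gamma as gamma_fn

def kv_small_x(p, x):
    # Small-argument asymptotic: K_p(x) ~ (Gamma(|p|) / 2) * (2 / x)^{|p|}, p != 0
    return 0.5 * gamma_fn(abs(p)) * (2.0 / x) ** abs(p)

for p in [-0.5, 1.0, 2.3]:
    for x in [1e-1, 1e-2, 1e-3]:
        ratio = kv(p, x) / kv_small_x(p, x)   # tends to 1 as x -> 0
        print(f"p = {p:4.1f}  x = {x:g}   K_p(x) / asymptotic = {ratio:.6f}")
```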
Proof of Theorem 1.
Let and be the densities of the and distributions, respectively. Let
and
We have
and which gives
Now, let , then is decreasing on with and
Also,
Then has a unique zero on . Hence if and if . Using Property 1,
we have:
Then, integrating by parts, we get:
Since is decreasing and positive on , for all and such that , , we have:
So
and
For , since
and
so, we have
Therefore, for , we have
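The mechanism of this proof, namely that the density difference changes sign exactly once so that Property 1 applies, can also be checked numerically. The sketch below locates the crossing point of a GIG density and its inverse gamma limit, in the parameterization used above, and verifies that the distribution-function difference at that point agrees with the direct integral; the parameter values are illustrative only.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq
from scipy.special import kv
from scipy.stats import invgamma

p, a, b = -0.5, 0.01, 1.0   # illustrative values only

def gig_pdf(x):
    return (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b))) \
        * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

limit = invgamma(a=-p, scale=b / 2.0)    # inverse gamma limit: shape -p, scale b/2

# Unique zero of the density difference, then Property 1.
x0 = brentq(lambda x: gig_pdf(x) - limit.pdf(x), 1e-3, 1e4)
tv_crossing = abs(quad(gig_pdf, 0, x0)[0] - limit.cdf(x0))

# Cross-check against the definition of the total variation distance.
tv_integral = 0.5 * quad(lambda x: abs(gig_pdf(x) - limit.pdf(x)), 0, np.inf, limit=200)[0]
print(x0, tv_crossing, tv_integral)
```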
Proof of Theorem 2.
Let
and
Denote by (resp. ) the density of (resp. ).
We have
and which gives
is negative if
Hence
Integration by parts of leads to
where
and
Proof of Corollary 1.
By equivalence (8), as , we have
Since ,
we have
For , where as . We have
Hence
For all , , integer, we have
Let and
For , we have . By induction on ,
can be written in the form
Since as , we have
and, by doing the Euclidean division as in the case (), there exist constants such that,
Hence
Proof of Theorem 3.
Let , with and . As in the GIG case, we have
Proof of Theorem 4.
Let with . Then
where
and
Acknowledgments
The author is grateful to the editor and the anonymous reviewers for their constructive comments.
He would also like to thank Kokou Essiomle, Tchilabalo E. Patchali and Essodina Takouda for their help during the preparation of the manuscript.
Conflicts of Interest
The author declares no conflict of interest.