Nonparametric density estimation based on kernel-type estimators is a very popular method in statistical research, especially when the goal is to model the probabilistic or stochastic structure of a data set. In this paper, we investigate asymptotic confidence bands for kernel-type estimators of several divergence measures (the Rényi-α and Tsallis-α divergences). Our aim is to use methods based on empirical-process techniques to derive asymptotic results. Under different assumptions, we establish a variety of fundamental theoretical properties, such as the uniform-in-bandwidth strong consistency of the divergence estimators. We then apply these results in simulated examples, including kernel-type estimators of the Hellinger, Bhattacharyya and Kullback-Leibler divergences, to illustrate the approach, and we show that the method performs competitively.
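The abstract describes a plug-in construction: the unknown densities are replaced by kernel density estimates inside the divergence functional D_α(f, g) = (1/(α − 1)) log ∫ f^α g^{1−α} dx (Rényi-α) or T_α(f, g) = (1/(α − 1)) (∫ f^α g^{1−α} dx − 1) (Tsallis-α). The Python code below is only a minimal illustrative sketch of that idea, not the authors' exact estimator: the Gaussian kernel, the default bandwidth rule, the sample sizes, the integration grid and the choice α = 1/2 are all assumptions made for illustration.

```python
# Minimal plug-in sketch (illustrative assumptions throughout): estimate two
# densities with Gaussian kernel density estimators, then plug them into the
# Renyi-alpha and Tsallis-alpha divergence functionals via grid integration.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=500)   # simulated sample from f
y = rng.normal(loc=0.5, scale=1.2, size=500)   # simulated sample from g

f_hat = gaussian_kde(x)        # kernel estimate of f (default Scott's-rule bandwidth)
g_hat = gaussian_kde(y)        # kernel estimate of g

alpha = 0.5                    # alpha = 1/2 relates to Hellinger/Bhattacharyya
grid = np.linspace(-6.0, 6.0, 2000)
fx, gx = f_hat(grid), g_hat(grid)

# Riemann-sum approximation of the integral of f^alpha * g^(1 - alpha).
integral = np.sum(fx**alpha * gx**(1.0 - alpha)) * (grid[1] - grid[0])

renyi = np.log(integral) / (alpha - 1.0)       # Renyi-alpha divergence estimate
tsallis = (integral - 1.0) / (alpha - 1.0)     # Tsallis-alpha divergence estimate
print(renyi, tsallis)
```

For α = 1/2 the integral is the Bhattacharyya coefficient, so minus its logarithm gives the Bhattacharyya distance and one minus it gives the squared Hellinger distance, while the Kullback-Leibler divergence is recovered as the α → 1 limit of the Rényi-α divergence. The uniform-in-bandwidth results of the paper concern letting the bandwidth range over an interval rather than fixing it with a single rule as in this sketch.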
Published in | American Journal of Theoretical and Applied Statistics (Volume 5, Issue 1)
DOI | 10.11648/j.ajtas.20160501.13
Page(s) | 13-22
Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.
Copyright | Copyright © The Author(s), 2016. Published by Science Publishing Group
Keywords | Divergence Measures, Kernel Estimation, Strong Uniform Consistency
APA Style
Hamza Dhaker, Papa Ngom, El Hadji Deme, Pierre Mendy. (2016). Kernel-Type Estimators of Divergence Measures and Its Strong Uniform Consistency. American Journal of Theoretical and Applied Statistics, 5(1), 13-22. https://doi.org/10.11648/j.ajtas.20160501.13