Full metadata
Title
Improved Finite Sample Estimate of A Nonparametric Divergence Measure
Description
This work details the bootstrap estimation of a nonparametric information divergence measure, the Dp divergence, using a power law model. To address the challenge of computing accurate divergence estimates from finite-size data, the bootstrap approach is used in conjunction with a power law curve to calculate an asymptotic value of the divergence estimator. Monte Carlo estimates of Dp are computed for increasing sample sizes, and a power law fit relates the divergence estimates to sample size. The fit is also used to generate a confidence interval that characterizes the quality of the estimate. We compare the performance of this method with other estimation methods. The calculated divergence is then applied to the binary classification problem: using the inherent relation between divergence measures and classification error rate, the Bayes error rate of several data sets is analyzed using the asymptotic divergence estimate.
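As a rough illustration of the procedure the abstract describes, the sketch below computes MST-based Dp estimates at several sample sizes, fits an assumed power-law model D(n) = D_inf + c * n^(-alpha), and reports the extrapolated asymptotic value with an approximate confidence interval taken from the fit covariance. The synthetic Gaussian data, the function names, and the use of subsampling in place of bootstrap resampling are illustrative assumptions, not the thesis implementation.

import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.optimize import curve_fit

def dp_divergence(X, Y):
    # MST-based estimate: 1 - R*(m+n)/(2*m*n), where R counts MST edges
    # joining a point of X to a point of Y (Friedman-Rafsky construction).
    m, n = len(X), len(Y)
    Z = np.vstack([X, Y])
    labels = np.concatenate([np.zeros(m), np.ones(n)])
    mst = minimum_spanning_tree(distance_matrix(Z, Z)).tocoo()
    R = np.sum(labels[mst.row] != labels[mst.col])
    return 1.0 - R * (m + n) / (2.0 * m * n)

def power_law(n, d_inf, c, alpha):
    # Assumed model: the finite-sample estimate approaches d_inf as n grows.
    return d_inf + c * n ** (-alpha)

rng = np.random.default_rng(0)
sizes = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
n_reps = 20  # Monte Carlo repetitions per sample size

# Illustrative two-class data: shifted 2-D Gaussians.
pool_X = rng.normal(0.0, 1.0, size=(2000, 2))
pool_Y = rng.normal(0.5, 1.0, size=(2000, 2))

mean_dp = []
for n in sizes.astype(int):
    reps = []
    for _ in range(n_reps):
        # Subsampling without replacement stands in for the thesis's bootstrap
        # resampling; it avoids duplicate points, whose zero distances the
        # dense-graph MST routine would treat as missing edges.
        ix = rng.choice(len(pool_X), n, replace=False)
        iy = rng.choice(len(pool_Y), n, replace=False)
        reps.append(dp_divergence(pool_X[ix], pool_Y[iy]))
    mean_dp.append(np.mean(reps))

# Fit the power law and read off the asymptotic value with an approximate
# 95% confidence interval from the parameter covariance.
popt, pcov = curve_fit(power_law, sizes, mean_dp, p0=[mean_dp[-1], 1.0, 0.5], maxfev=10000)
d_inf, d_inf_se = popt[0], np.sqrt(pcov[0, 0])
print(f"asymptotic Dp estimate: {d_inf:.3f} +/- {1.96 * d_inf_se:.3f}")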
Date Created
2016-05
Contributors
- Kadambi, Pradyumna Sanjay (Author)
- Berisha, Visar (Thesis director)
- Bliss, Daniel (Committee member)
- Electrical Engineering Program (Contributor)
- Barrett, The Honors College (Contributor)
Topical Subject
Resource Type
Extent
30 pages
Language
eng
Copyright Statement
In Copyright
Primary Member of
Series
Academic Year 2015-2016
Handle
https://hdl.handle.net/2286/R.I.37612
Level of coding
minimal
Cataloging Standards
System Created
- 2017-10-30 02:50:58
System Modified
- 2021-08-11 04:09:57
Additional Formats