
Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Title
Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Author
노영균 (Yung-Kyun Noh)
Keywords
FEATURE-SELECTION; GENE-EXPRESSION; INFORMATION; RELEVANCE
Issue Date
2018-01
Publisher
MIT PRESS
Citation
NEURAL COMPUTATION, v. 30, no. 7, pp. 1930-1960
Abstract
Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
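
Note: The abstract refers to nearest-neighbor estimators of the KL divergence. As background only, the following is a minimal sketch of the standard k-nearest-neighbor KL estimator for samples X ~ p and Y ~ q in R^d; it is not the letter's metric-learning method, and the function name knn_kl_divergence and the default k=1 are illustrative assumptions.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    # x: (n, d) samples from p; y: (m, d) samples from q.
    # Standard k-NN construction:
    #   D_hat = (d / n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    # where rho_k(i) is the k-th NN distance of x_i within x (excluding x_i itself)
    # and nu_k(i) is the k-th NN distance of x_i to the sample y.
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]   # k-th NN within x (skip the self-match)
    nu = cKDTree(y).query(x, k=k)[0]               # k-th NN from each x_i into y
    if k > 1:
        nu = nu[:, -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

With a small sample size this plug-in estimator exhibits the bias discussed in the abstract; the letter's contribution is to reduce that bias by replacing the Euclidean distance used above with a locally learned Mahalanobis metric derived from approximate generative models.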
URI
https://www.mitpressjournals.org/doi/abs/10.1162/neco_a_01092
https://repository.hanyang.ac.kr/handle/20.500.11754/117286
ISSN
0899-7667; 1530-888X
DOI
10.1162/neco_a_01092
Appears in Collections:
COLLEGE OF ENGINEERING[S](공과대학) > COMPUTER SCIENCE(컴퓨터소프트웨어학부) > Articles
Files in This Item:
There are no files associated with this item.
