Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 노영균 | - |
dc.date.accessioned | 2019-12-04T06:24:57Z | - |
dc.date.available | 2019-12-04T06:24:57Z | - |
dc.date.issued | 2018-01 | - |
dc.identifier.citation | NEURAL COMPUTATION, v. 30, no. 7, page. 1930-1960 | en_US |
dc.identifier.issn | 0899-7667 | - |
dc.identifier.issn | 1530-888X | - |
dc.identifier.uri | https://www.mitpressjournals.org/doi/abs/10.1162/neco_a_01092 | - |
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/117286 | - |
dc.description.abstract | Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence. | en_US |
dc.description.sponsorship | Y.K.N. is supported by grants from NRF/MSIT-2017R1E1A1A03070945, M.S. and M.C.dP. from the JST CREST JPMJCR1403, S.L. from KAKENHI grant-in-Aid (RAS 15H06823), and Y.K.N. and F.C.P. from BK21Plus and MITIP-10048320. D.D.L. acknowledges support from the U.S. NSF, NIH, ONR, ARL, AFOSR, DOT, and DARPA. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | MIT PRESS | en_US |
dc.subject | FEATURE-SELECTION | en_US |
dc.subject | GENE-EXPRESSION | en_US |
dc.subject | INFORMATION | en_US |
dc.subject | RELEVANCE | en_US |
dc.title | Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence | en_US |
dc.type | Article | en_US |
dc.relation.no | 7 | - |
dc.relation.volume | 30 | - |
dc.identifier.doi | 10.1162/neco_a_01092 | - |
dc.relation.page | 1930-1960 | - |
dc.relation.journal | NEURAL COMPUTATION | - |
dc.contributor.googleauthor | Noh, Yung-Kyun | - |
dc.contributor.googleauthor | Sugiyama, Masashi | - |
dc.contributor.googleauthor | Liu, Song | - |
dc.contributor.googleauthor | du Plessis, Marthinus C. | - |
dc.contributor.googleauthor | Park, Frank Chongwoo | - |
dc.contributor.googleauthor | Lee, Daniel D. | - |
dc.relation.code | 2018000127 | - |
dc.sector.campus | S | - |
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | - |
dc.sector.department | DEPARTMENT OF COMPUTER SCIENCE | - |
dc.identifier.pid | nohyung | - |
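The abstract above concerns bias reduction for nearest-neighbor KL-divergence estimation. As background, a minimal sketch of the *classical* k-NN estimator that the paper builds on (not the metric-learned variant proposed in the article) is shown below; the function name and brute-force distance computation are illustrative choices, assuming samples from P and Q as NumPy arrays:

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Classical k-NN estimate of KL(P || Q).

    x : (n, d) samples from P;  y : (m, d) samples from Q.
    For each x_i, compare the distance to its k-th nearest
    neighbor among the other x's (rho) with the distance to
    its k-th nearest neighbor among the y's (nu).
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # Pairwise Euclidean distances (brute force, for clarity).
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)

    # The self-distance 0 occupies column 0 after sorting, so the
    # k-th nearest neighbor within x sits at column k.
    rho = np.sort(dxx, axis=1)[:, k]
    nu = np.sort(dxy, axis=1)[:, k - 1]  # k-th NN among the y's

    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```

With few samples the log-ratio of neighbor distances is where the bias discussed in the abstract enters; the paper's contribution is to replace the Euclidean metric in these distance computations with a locally learned Mahalanobis metric.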
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.