Goki YASUDA
Waseda University
Tota SUKO
Waseda University
Manabu KOBAYASHI
Waseda University
Toshiyasu MATSUSHIMA
Waseda University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Goki YASUDA, Tota SUKO, Manabu KOBAYASHI, Toshiyasu MATSUSHIMA, "Asymptotic Evaluation of Classification in the Presence of Label Noise" in IEICE TRANSACTIONS on Fundamentals,
vol. E106-A, no. 3, pp. 422-430, March 2023, doi: 10.1587/transfun.2022TAP0013.
Abstract: In a practical classification problem, there are cases where incorrect labels are included in training data due to label noise. We introduce a classification method in the presence of label noise that idealizes a classification method based on the expectation-maximization (EM) algorithm, and evaluate its performance theoretically. Its performance is asymptotically evaluated by assessing the risk function defined as the Kullback-Leibler divergence between predictive distribution and true distribution. The result of this performance evaluation enables a theoretical evaluation of the most successful performance that the EM-based classification method may achieve.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2022TAP0013/_p
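The abstract evaluates performance via a risk function defined as the Kullback-Leibler divergence between the predictive distribution and the true distribution. As a minimal illustration of that quantity (not code from the paper — the function name and example distributions here are hypothetical), the divergence for discrete label distributions can be computed as:

```python
import numpy as np

def kl_divergence(p_true, p_pred):
    """D(p_true || p_pred) for discrete distributions, in nats.

    Zero iff the two distributions coincide; used as a risk function
    by comparing the true and predictive label distributions.
    """
    p_true = np.asarray(p_true, dtype=float)
    p_pred = np.asarray(p_pred, dtype=float)
    mask = p_true > 0  # terms with p_true == 0 contribute nothing
    return float(np.sum(p_true[mask] * np.log(p_true[mask] / p_pred[mask])))

# Hypothetical 3-class example: true vs. predictive label distribution.
risk = kl_divergence([0.7, 0.2, 0.1], [0.6, 0.3, 0.1])
```

A smaller risk indicates a predictive distribution closer to the truth; the asymptotic analysis in the paper characterizes how this risk behaves as the training-data size grows.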
@ARTICLE{e106-a_3_422,
author={Goki YASUDA and Tota SUKO and Manabu KOBAYASHI and Toshiyasu MATSUSHIMA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Asymptotic Evaluation of Classification in the Presence of Label Noise},
year={2023},
volume={E106-A},
number={3},
pages={422-430},
abstract={In a practical classification problem, there are cases where incorrect labels are included in training data due to label noise. We introduce a classification method in the presence of label noise that idealizes a classification method based on the expectation-maximization (EM) algorithm, and evaluate its performance theoretically. Its performance is asymptotically evaluated by assessing the risk function defined as the Kullback-Leibler divergence between predictive distribution and true distribution. The result of this performance evaluation enables a theoretical evaluation of the most successful performance that the EM-based classification method may achieve.},
doi={10.1587/transfun.2022TAP0013},
ISSN={1745-1337},
month={March},}
TY - JOUR
TI - Asymptotic Evaluation of Classification in the Presence of Label Noise
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 422
EP - 430
AU - Goki YASUDA
AU - Tota SUKO
AU - Manabu KOBAYASHI
AU - Toshiyasu MATSUSHIMA
PY - 2023
DO - 10.1587/transfun.2022TAP0013
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E106-A
IS - 3
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - March 2023
AB - In a practical classification problem, there are cases where incorrect labels are included in training data due to label noise. We introduce a classification method in the presence of label noise that idealizes a classification method based on the expectation-maximization (EM) algorithm, and evaluate its performance theoretically. Its performance is asymptotically evaluated by assessing the risk function defined as the Kullback-Leibler divergence between predictive distribution and true distribution. The result of this performance evaluation enables a theoretical evaluation of the most successful performance that the EM-based classification method may achieve.
ER -