It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone, and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.
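For readers unfamiliar with the setting, the following minimal Python/PyTorch sketch illustrates learning from (possibly noisy) complementary labels, i.e., labels indicating a class an instance does not belong to. The complementary_loss function and its bounded "mae" variant are illustrative assumptions only; they are not the loss conditions derived in this paper.

import torch
import torch.nn.functional as F

def complementary_loss(logits, comp_labels, kind="log"):
    # logits: (batch, num_classes); comp_labels: (batch,) class indices the samples do NOT belong to
    probs = F.softmax(logits, dim=1)
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)  # predicted P(complementary class | x)
    if kind == "log":
        # unbounded log-type loss: -log(1 - p_comp)
        return -torch.log1p(-p_comp.clamp(max=1.0 - 1e-7)).mean()
    if kind == "mae":
        # bounded loss: p_comp itself; its gradient is bounded, which tends to be gentler under label noise
        return p_comp.mean()
    raise ValueError(f"unknown kind: {kind}")

# Toy usage on random data with complementary labels (some of which may be noisy).
torch.manual_seed(0)
x = torch.randn(64, 10)
comp_y = torch.randint(0, 3, (64,))
model = torch.nn.Linear(10, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(100):
    optimizer.zero_grad()
    loss = complementary_loss(model(x), comp_y, kind="mae")
    loss.backward()
    optimizer.step()

The bounded "mae"-style variant is shown only because bounded losses are known to be more tolerant to label noise in ordinary-label learning; whether a given loss satisfies the robustness conditions for noisy complementary labels is exactly what the paper characterizes.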
Hiroki ISHIGURO
University of Tokyo
Takashi ISHIDA
University of Tokyo, RIKEN
Masashi SUGIYAMA
University of Tokyo, RIKEN
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Hiroki ISHIGURO, Takashi ISHIDA, Masashi SUGIYAMA, "Learning from Noisy Complementary Labels with Robust Loss Functions" in IEICE TRANSACTIONS on Information,
vol. E105-D, no. 2, pp. 364-376, February 2022, doi: 10.1587/transinf.2021EDP7035.
Abstract: It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021EDP7035/_p
@ARTICLE{e105-d_2_364,
author={Hiroki ISHIGURO and Takashi ISHIDA and Masashi SUGIYAMA},
journal={IEICE TRANSACTIONS on Information},
title={Learning from Noisy Complementary Labels with Robust Loss Functions},
year={2022},
volume={E105-D},
number={2},
pages={364-376},
abstract={It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.},
keywords={},
doi={10.1587/transinf.2021EDP7035},
ISSN={1745-1361},
month={February},
}
TY - JOUR
TI - Learning from Noisy Complementary Labels with Robust Loss Functions
T2 - IEICE TRANSACTIONS on Information
SP - 364
EP - 376
AU - Hiroki ISHIGURO
AU - Takashi ISHIDA
AU - Masashi SUGIYAMA
PY - 2022
DO - 10.1587/transinf.2021EDP7035
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E105-D
IS - 2
JA - IEICE TRANSACTIONS on Information
Y1 - February 2022
AB - It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels could also be error-prone and thus mitigating the influence of label noise is an important challenge to make complementary-label learning more useful in practice. In this paper, we derive conditions for the loss function such that the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that the loss functions that satisfy our conditions significantly improve the classification performance.
ER -