Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Sumio WATANABE, "Equations of States in Statistical Learning for an Unrealizable and Regular Case" in IEICE TRANSACTIONS on Fundamentals,
vol. E93-A, no. 3, pp. 617-626, March 2010, doi: 10.1587/transfun.E93.A.617.
Abstract: Many learning machines that have hierarchical structure or hidden variables are now being used in information science, artificial intelligence, and bioinformatics. However, several learning machines used in such fields are not regular but singular statistical models, hence their generalization performance remains unknown. To overcome these problems, in previous papers we proved new equations of statistical learning by which the Bayes generalization loss can be estimated from the Bayes training loss and the functional variance, under the condition that the true distribution is a singularity contained in the learning machine. In this paper, we prove that the same equations hold even if the true distribution is not contained in the parametric model. We also prove that, in the regular case, the proposed equations are asymptotically equivalent to the Takeuchi information criterion. Therefore, the proposed equations are always applicable without any condition on the unknown true distribution.
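The "equations of states" referred to in the abstract relate the Bayes generalization loss to the Bayes training loss plus a functional-variance correction. The LaTeX sketch below uses Watanabe's customary notation (Bayes generalization loss B_g, Bayes training loss B_t, functional variance V, sample size n, posterior expectation E_w), none of which is defined on this page, and shows only the form the relation takes for the standard posterior (inverse temperature β = 1); the precise statement and conditions are given in the paper itself.

% Sketch of the equation-of-states / WAIC-type relation (assumed notation, β = 1).
\[
  B_g = -\,\mathbb{E}_{X}\!\bigl[\log \mathbb{E}_{w}[\,p(X \mid w)\,]\bigr],
  \qquad
  B_t = -\,\frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_{w}[\,p(X_i \mid w)\,],
\]
\[
  V = \sum_{i=1}^{n}\Bigl\{\, \mathbb{E}_{w}\!\bigl[(\log p(X_i \mid w))^{2}\bigr]
      - \bigl(\mathbb{E}_{w}[\log p(X_i \mid w)]\bigr)^{2} \,\Bigr\},
\]
\[
  \mathbb{E}[B_g] \;=\; \mathbb{E}[B_t] + \frac{1}{n}\,\mathbb{E}[V] + o\!\left(\frac{1}{n}\right).
\]

Here \(\mathbb{E}_w\) denotes expectation over the Bayes posterior and \(\mathbb{E}_X\) expectation over a new sample from the true distribution. The paper's contribution is that this relation holds even when the true distribution is not contained in the model (the unrealizable case), and that in the regular case it is asymptotically equivalent to the Takeuchi information criterion.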
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.E93.A.617/_p
@ARTICLE{e93-a_3_617,
author={Sumio WATANABE},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Equations of States in Statistical Learning for an Unrealizable and Regular Case},
year={2010},
volume={E93-A},
number={3},
pages={617-626},
abstract={Many learning machines that have hierarchical structure or hidden variables are now being used in information science, artificial intelligence, and bioinformatics. However, several learning machines used in such fields are not regular but singular statistical models, hence their generalization performance remains unknown. To overcome these problems, in previous papers we proved new equations of statistical learning by which the Bayes generalization loss can be estimated from the Bayes training loss and the functional variance, under the condition that the true distribution is a singularity contained in the learning machine. In this paper, we prove that the same equations hold even if the true distribution is not contained in the parametric model. We also prove that, in the regular case, the proposed equations are asymptotically equivalent to the Takeuchi information criterion. Therefore, the proposed equations are always applicable without any condition on the unknown true distribution.},
keywords={},
doi={10.1587/transfun.E93.A.617},
ISSN={1745-1337},
month={March},}
TY - JOUR
TI - Equations of States in Statistical Learning for an Unrealizable and Regular Case
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 617
EP - 626
AU - Sumio WATANABE
PY - 2010
DO - 10.1587/transfun.E93.A.617
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E93-A
IS - 3
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - March 2010
AB - Many learning machines that have hierarchical structure or hidden variables are now being used in information science, artificial intelligence, and bioinformatics. However, several learning machines used in such fields are not regular but singular statistical models, hence their generalization performance remains unknown. To overcome these problems, in previous papers we proved new equations of statistical learning by which the Bayes generalization loss can be estimated from the Bayes training loss and the functional variance, under the condition that the true distribution is a singularity contained in the learning machine. In this paper, we prove that the same equations hold even if the true distribution is not contained in the parametric model. We also prove that, in the regular case, the proposed equations are asymptotically equivalent to the Takeuchi information criterion. Therefore, the proposed equations are always applicable without any condition on the unknown true distribution.
ER -