The original paper is in English. Non-English content on this page has been machine-translated and may contain typographical errors or mistranslations; for example, some numerals may be rendered as "XNUMX".
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Shun-ichi AMARI, Tomoko OZEKI, "Differential and Algebraic Geometry of Multilayer Perceptrons," IEICE TRANSACTIONS on Fundamentals, vol. E84-A, no. 1, pp. 31-38, January 2001.
Abstract: Information geometry is applied to the manifold of neural networks called multilayer perceptrons. It is important to study a total family of networks as a geometrical manifold, because learning is represented by a trajectory in such a space. The manifold of perceptrons has a rich differential-geometrical structure represented by a Riemannian metric and singularities. An efficient learning method is proposed by using it. The parameter space of perceptrons includes a lot of algebraic singularities, which affect trajectories of learning. Such singularities are studied by using simple models. This poses an interesting problem of statistical inference and learning in hierarchical models including singularities.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e84-a_1_31/_p
@ARTICLE{e84-a_1_31,
author={Shun-ichi AMARI and Tomoko OZEKI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Differential and Algebraic Geometry of Multilayer Perceptrons},
year={2001},
volume={E84-A},
number={1},
pages={31-38},
abstract={Information geometry is applied to the manifold of neural networks called multilayer perceptrons. It is important to study a total family of networks as a geometrical manifold, because learning is represented by a trajectory in such a space. The manifold of perceptrons has a rich differential-geometrical structure represented by a Riemannian metric and singularities. An efficient learning method is proposed by using it. The parameter space of perceptrons includes a lot of algebraic singularities, which affect trajectories of learning. Such singularities are studied by using simple models. This poses an interesting problem of statistical inference and learning in hierarchical models including singularities.},
keywords={},
doi={},
ISSN={},
month={January},
}
TY - JOUR
TI - Differential and Algebraic Geometry of Multilayer Perceptrons
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 31
EP - 38
AU - Shun-ichi AMARI
AU - Tomoko OZEKI
PY - 2001
DO -
JO - IEICE TRANSACTIONS on Fundamentals
SN -
VL - E84-A
IS - 1
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - January 2001
AB - Information geometry is applied to the manifold of neural networks called multilayer perceptrons. It is important to study a total family of networks as a geometrical manifold, because learning is represented by a trajectory in such a space. The manifold of perceptrons has a rich differential-geometrical structure represented by a Riemannian metric and singularities. An efficient learning method is proposed by using it. The parameter space of perceptrons includes a lot of algebraic singularities, which affect trajectories of learning. Such singularities are studied by using simple models. This poses an interesting problem of statistical inference and learning in hierarchical models including singularities.
ER -