Domain knowledge is useful for improving the generalization performance of learning machines. Sign constraints are a handy representation for combining domain knowledge with a learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method, which converges sublinearly and possesses a clear termination criterion. We show that each Frank-Wolfe iteration requires O(nd+d²) computational cost. Furthermore, we derive an explicit expression for the minimal number of iterations needed to guarantee an ε-accurate solution by analyzing the curvature of the objective function. Finally, we demonstrate empirically that sign constraints are a promising technique when similarities to the training examples compose the feature vector.
Kenya TAJIMA
Gunma University
Takahiko HENMI
Gunma University
Tsuyoshi KATO
Gunma University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Kenya TAJIMA, Takahiko HENMI, Tsuyoshi KATO, "Frank-Wolfe for Sign-Constrained Support Vector Machines" in IEICE TRANSACTIONS on Information,
vol. E105-D, no. 10, pp. 1734-1742, October 2022, doi: 10.1587/transinf.2022EDP7069.
Abstract: Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation to combine domain knowledge with learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method that also converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe also requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number to ensure an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2022EDP7069/_p
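The abstract only outlines the optimization scheme. As a rough illustration of how a Frank-Wolfe iteration with sign-constrained weights and a gap-based termination criterion can look in practice (this is not the paper's actual algorithm: the bounded box of side R, the squared hinge loss, and all function and parameter names below are assumptions introduced for this sketch), a minimal version might be:

import numpy as np

def sign_constrained_svm_fw(X, y, nonneg_mask, C=1.0, R=10.0,
                            max_iter=1000, tol=1e-4):
    """Illustrative Frank-Wolfe sketch for a linear SVM whose weights are
    sign-constrained on the coordinates marked by the boolean array
    nonneg_mask.  The feasible set is taken to be the compact box
    {w : 0 <= w_j <= R if constrained, |w_j| <= R otherwise}, since
    Frank-Wolfe needs a bounded domain; the paper's own formulation may
    differ.  The loss is the squared hinge loss with L2 regularization."""
    n, d = X.shape
    w = np.zeros(d)

    def grad(w):
        margins = 1.0 - y * (X @ w)                 # hinge margins 1 - y_i x_i^T w
        active = margins > 0                        # examples violating the margin
        # gradient of  (1/2)||w||^2 + C * sum_i max(0, 1 - y_i x_i^T w)^2
        return w - 2.0 * C * X[active].T @ (y[active] * margins[active])

    for k in range(max_iter):
        g = grad(w)
        # Linear minimization oracle over the box: choose the vertex that
        # minimizes <g, s>, respecting the sign constraints coordinate-wise.
        s = np.where(g < 0, R, -R)
        s[nonneg_mask & (g >= 0)] = 0.0             # constrained coordinates stay >= 0
        # The Frank-Wolfe gap doubles as a clear termination criterion.
        fw_gap = g @ (w - s)
        if fw_gap <= tol:
            break
        step = 2.0 / (k + 2.0)                      # standard step size, sublinear rate
        w = (1.0 - step) * w + step * s
    return w

Because every iterate is a convex combination of the start point and box vertices, the coordinates flagged in nonneg_mask remain nonnegative throughout, and the dominant per-iteration cost is the gradient's matrix-vector products, in line with the linear-in-n, polynomial-in-d cost the abstract mentions.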
@ARTICLE{e105-d_10_1734,
author={Kenya TAJIMA and Takahiko HENMI and Tsuyoshi KATO},
journal={IEICE TRANSACTIONS on Information},
title={Frank-Wolfe for Sign-Constrained Support Vector Machines},
year={2022},
volume={E105-D},
number={10},
pages={1734-1742},
abstract={Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation to combine domain knowledge with learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method that also converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe also requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number to ensure an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.},
keywords={},
doi={10.1587/transinf.2022EDP7069},
ISSN={1745-1361},
month={October},
}
TY - JOUR
TI - Frank-Wolfe for Sign-Constrained Support Vector Machines
T2 - IEICE TRANSACTIONS on Information
SP - 1734
EP - 1742
AU - Kenya TAJIMA
AU - Takahiko HENMI
AU - Tsuyoshi KATO
PY - 2022
DO - 10.1587/transinf.2022EDP7069
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E105-D
IS - 10
JA - IEICE TRANSACTIONS on Information
Y1 - October 2022
AB - Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation to combine domain knowledge with learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method that also converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe also requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number to ensure an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.
ER -