Ruicong ZHI
University of Science and Technology Beijing, Beijing Key Laboratory of Knowledge Engineering for Materials Science
Caixia ZHOU
University of Science and Technology Beijing, Beijing Key Laboratory of Knowledge Engineering for Materials Science
Junwei YU
University of Science and Technology Beijing, Beijing Key Laboratory of Knowledge Engineering for Materials Science
Tingting LI
University of Science and Technology Beijing, Beijing Key Laboratory of Knowledge Engineering for Materials Science
Ghada ZAMZMI
University of South Florida
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Ruicong ZHI, Caixia ZHOU, Junwei YU, Tingting LI, Ghada ZAMZMI, "Multimodal-Based Stream Integrated Neural Networks for Pain Assessment" in IEICE TRANSACTIONS on Information, vol. E104-D, no. 12, pp. 2184-2194, December 2021, doi: 10.1587/transinf.2021EDP7065.
Abstract: Pain is an essential physiological phenomenon of human beings. Accurate assessment of pain is important for developing proper treatment. Although the self-report method is the gold standard in pain assessment, it is not applicable to individuals with communicative impairments. Non-verbal pain indicators such as pain-related facial expressions and changes in physiological parameters can provide valuable insights for pain assessment. In this paper, we propose a multimodal-based Stream Integrated Neural Network with Different Frame Rates (SINN) that combines facial expression and biomedical signals for automatic pain assessment. The main contributions of this research are threefold. (1) The SINN takes four stream inputs for facial expression feature extraction. The variant facial features are integrated with biomedical features, and the joint features are utilized for pain assessment. (2) The dynamic facial features are learned in both implicit and explicit manners to better represent the facial changes that occur during the pain experience. (3) Multiple modalities, including facial expression and biomedical signals, are utilized to identify various pain states. The experiments are conducted on publicly available pain datasets, and the performance is compared with several deep learning models. The experimental results illustrate the superiority of the proposed model: it achieves the highest accuracy of 68.2%, which is up to 5% higher than the basic deep learning models on pain assessment with binary classification.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021EDP7065/_p
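The abstract describes the SINN fusion scheme only at a high level: several facial-expression streams sampled at different frame rates, whose features are joined with biomedical features before classification. As a rough sketch of that multi-stream fusion idea (not the authors' implementation; all layer sizes, stream channel counts, and the biomedical feature dimension are assumptions made for illustration), a minimal PyTorch version could look like this:

# Hypothetical sketch of a four-stream fusion network, loosely following the
# abstract's description of SINN. Layer sizes, frame-stack depths, and the
# biomedical feature dimension are illustrative assumptions, not the paper's
# actual architecture.
import torch
import torch.nn as nn

class StreamEncoder(nn.Module):
    """Encodes one stack of frames (e.g. frames sampled at one rate)."""
    def __init__(self, in_channels, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, feat_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (B, feat_dim, 1, 1)
        )

    def forward(self, x):
        return self.net(x).flatten(1)  # (B, feat_dim)

class FourStreamFusionNet(nn.Module):
    """Fuses four facial streams with a biomedical feature vector."""
    def __init__(self, stream_channels=(3, 3, 9, 9), bio_dim=8,
                 feat_dim=64, num_classes=2):
        super().__init__()
        self.streams = nn.ModuleList(
            StreamEncoder(c, feat_dim) for c in stream_channels
        )
        self.classifier = nn.Sequential(
            nn.Linear(len(stream_channels) * feat_dim + bio_dim, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),  # binary: pain vs. no pain
        )

    def forward(self, frames, bio):
        # One pooled feature vector per stream, then joint fusion with
        # the biomedical features.
        feats = [enc(x) for enc, x in zip(self.streams, frames)]
        joint = torch.cat(feats + [bio], dim=1)
        return self.classifier(joint)

if __name__ == "__main__":
    model = FourStreamFusionNet()
    # Two single-frame streams (3 channels) and two frame stacks (9 channels),
    # all random stand-ins for real face crops.
    frames = [torch.randn(2, c, 64, 64) for c in (3, 3, 9, 9)]
    bio = torch.randn(2, 8)  # stand-in for pooled physiological features
    logits = model(frames, bio)
    print(logits.shape)  # torch.Size([2, 2])

The "different frame rates" in the model's name suggest the streams differ in temporal sampling; in this sketch that is mimicked only by the differing channel counts of the dummy inputs.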
@ARTICLE{e104-d_12_2184,
author={Ruicong ZHI and Caixia ZHOU and Junwei YU and Tingting LI and Ghada ZAMZMI},
journal={IEICE TRANSACTIONS on Information},
title={Multimodal-Based Stream Integrated Neural Networks for Pain Assessment},
year={2021},
volume={E104-D},
number={12},
pages={2184-2194},
abstract={Pain is an essential physiological phenomenon of human beings. Accurate assessment of pain is important for developing proper treatment. Although the self-report method is the gold standard in pain assessment, it is not applicable to individuals with communicative impairments. Non-verbal pain indicators such as pain-related facial expressions and changes in physiological parameters can provide valuable insights for pain assessment. In this paper, we propose a multimodal-based Stream Integrated Neural Network with Different Frame Rates (SINN) that combines facial expression and biomedical signals for automatic pain assessment. The main contributions of this research are threefold. (1) The SINN takes four stream inputs for facial expression feature extraction. The variant facial features are integrated with biomedical features, and the joint features are utilized for pain assessment. (2) The dynamic facial features are learned in both implicit and explicit manners to better represent the facial changes that occur during the pain experience. (3) Multiple modalities, including facial expression and biomedical signals, are utilized to identify various pain states. The experiments are conducted on publicly available pain datasets, and the performance is compared with several deep learning models. The experimental results illustrate the superiority of the proposed model: it achieves the highest accuracy of 68.2%, which is up to 5% higher than the basic deep learning models on pain assessment with binary classification.},
doi={10.1587/transinf.2021EDP7065},
ISSN={1745-1361},
month={December}
}
TY - JOUR
TI - Multimodal-Based Stream Integrated Neural Networks for Pain Assessment
T2 - IEICE TRANSACTIONS on Information
SP - 2184
EP - 2194
AU - Ruicong ZHI
AU - Caixia ZHOU
AU - Junwei YU
AU - Tingting LI
AU - Ghada ZAMZMI
PY - 2021
DO - 10.1587/transinf.2021EDP7065
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E104-D
IS - 12
JA - IEICE TRANSACTIONS on Information
Y1 - December 2021
AB - Pain is an essential physiological phenomenon of human beings. Accurate assessment of pain is important for developing proper treatment. Although the self-report method is the gold standard in pain assessment, it is not applicable to individuals with communicative impairments. Non-verbal pain indicators such as pain-related facial expressions and changes in physiological parameters can provide valuable insights for pain assessment. In this paper, we propose a multimodal-based Stream Integrated Neural Network with Different Frame Rates (SINN) that combines facial expression and biomedical signals for automatic pain assessment. The main contributions of this research are threefold. (1) The SINN takes four stream inputs for facial expression feature extraction. The variant facial features are integrated with biomedical features, and the joint features are utilized for pain assessment. (2) The dynamic facial features are learned in both implicit and explicit manners to better represent the facial changes that occur during the pain experience. (3) Multiple modalities, including facial expression and biomedical signals, are utilized to identify various pain states. The experiments are conducted on publicly available pain datasets, and the performance is compared with several deep learning models. The experimental results illustrate the superiority of the proposed model: it achieves the highest accuracy of 68.2%, which is up to 5% higher than the basic deep learning models on pain assessment with binary classification.
ER -