Visible-Infrared Person Re-identification (VI-ReID) is a challenging pedestrian retrieval task due to the huge modality discrepancy and appearance discrepancy. To address this tough task, this letter proposes a novel gray augmentation exploration (GAE) method to increase the diversity of training data and seek the best ratio of gray augmentation for learning a more focused model. Additionally, we also propose a strong all-modality center-triplet (AMCT) loss to push the features extracted from the same pedestrian more compact but those from different persons more separate. Experiments conducted on the public dataset SYSU-MM01 demonstrate the superiority of the proposed method in the VI-ReID task.
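The abstract names the two components only at a high level, so the following is a minimal illustrative sketch rather than the authors' implementation: it assumes a PyTorch training pipeline and shows gray augmentation as randomly replacing a fraction of visible RGB images with their three-channel grayscale version (that fraction being the ratio GAE would explore), plus a center-triplet style loss computed over per-identity feature centers pooled from all modalities in a batch. The names random_gray, gray_ratio, amct_loss, and margin are hypothetical.

# Illustrative sketch only (assumed PyTorch); details are not taken from the paper.
import torch
import torch.nn.functional as F


def random_gray(images: torch.Tensor, gray_ratio: float = 0.5) -> torch.Tensor:
    """Replace a random fraction `gray_ratio` of RGB images (B, 3, H, W) with
    their three-channel grayscale version; GAE would search for the best ratio."""
    out = images.clone()
    mask = torch.rand(images.size(0), device=images.device) < gray_ratio
    if mask.any():
        # ITU-R BT.601 luma weights, broadcast over the selected images.
        w = images.new_tensor([0.299, 0.587, 0.114]).view(1, 3, 1, 1)
        gray = (images[mask] * w).sum(dim=1, keepdim=True)
        out[mask] = gray.expand(-1, 3, -1, -1)
    return out


def amct_loss(features: torch.Tensor, labels: torch.Tensor, margin: float = 0.3) -> torch.Tensor:
    """Center-triplet style loss: average each identity's features over all
    modalities in the batch into one center, pull every sample toward its own
    center, and push it at least `margin` farther from the closest other center."""
    ids = labels.unique()
    centers = torch.stack([features[labels == i].mean(dim=0) for i in ids])  # (C, D)
    dist = torch.cdist(features, centers)                                    # (B, C)
    pos_idx = (labels.unsqueeze(1) == ids.unsqueeze(0)).float().argmax(dim=1)
    d_pos = dist.gather(1, pos_idx.unsqueeze(1)).squeeze(1)
    dist_wo_pos = dist.clone()
    dist_wo_pos.scatter_(1, pos_idx.unsqueeze(1), float('inf'))
    d_neg = dist_wo_pos.min(dim=1).values
    return F.relu(d_pos - d_neg + margin).mean()

Under this reading, gray_ratio would be swept over a small grid and the value yielding the best Rank-1/mAP on a validation split retained.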
Xiaozhou CHENG
China University of Mining and Technology, Sinosteel Maanshan General Institute of Mining Research Co., Ltd.
Rui LI
China University of Mining and Technology
Yanjing SUN
China University of Mining and Technology
Yu ZHOU
China University of Mining and Technology
Kaiwen DONG
China University of Mining and Technology
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Xiaozhou CHENG, Rui LI, Yanjing SUN, Yu ZHOU, Kaiwen DONG, "Gray Augmentation Exploration with All-Modality Center-Triplet Loss for Visible-Infrared Person Re-Identification" in IEICE TRANSACTIONS on Information and Systems,
vol. E105-D, no. 7, pp. 1356-1360, July 2022, doi: 10.1587/transinf.2021EDL8101.
Abstract: Visible-Infrared Person Re-identification (VI-ReID) is a challenging pedestrian retrieval task due to the huge modality discrepancy and appearance discrepancy. To address this tough task, this letter proposes a novel gray augmentation exploration (GAE) method to increase the diversity of training data and seek the best ratio of gray augmentation for learning a more focused model. Additionally, we also propose a strong all-modality center-triplet (AMCT) loss to push the features extracted from the same pedestrian more compact but those from different persons more separate. Experiments conducted on the public dataset SYSU-MM01 demonstrate the superiority of the proposed method in the VI-ReID task.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021EDL8101/_p
@ARTICLE{e105-d_7_1356,
author={Xiaozhou CHENG and Rui LI and Yanjing SUN and Yu ZHOU and Kaiwen DONG},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Gray Augmentation Exploration with All-Modality Center-Triplet Loss for Visible-Infrared Person Re-Identification},
year={2022},
volume={E105-D},
number={7},
pages={1356-1360},
abstract={Visible-Infrared Person Re-identification (VI-ReID) is a challenging pedestrian retrieval task due to the huge modality discrepancy and appearance discrepancy. To address this tough task, this letter proposes a novel gray augmentation exploration (GAE) method to increase the diversity of training data and seek the best ratio of gray augmentation for learning a more focused model. Additionally, we also propose a strong all-modality center-triplet (AMCT) loss to push the features extracted from the same pedestrian more compact but those from different persons more separate. Experiments conducted on the public dataset SYSU-MM01 demonstrate the superiority of the proposed method in the VI-ReID task.},
keywords={},
doi={10.1587/transinf.2021EDL8101},
ISSN={1745-1361},
month={July}
}
TY - JOUR
TI - Gray Augmentation Exploration with All-Modality Center-Triplet Loss for Visible-Infrared Person Re-Identification
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 1356
EP - 1360
AU - Xiaozhou CHENG
AU - Rui LI
AU - Yanjing SUN
AU - Yu ZHOU
AU - Kaiwen DONG
PY - 2022
DO - 10.1587/transinf.2021EDL8101
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E105-D
IS - 7
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - July 2022
AB - Visible-Infrared Person Re-identification (VI-ReID) is a challenging pedestrian retrieval task due to the huge modality discrepancy and appearance discrepancy. To address this tough task, this letter proposes a novel gray augmentation exploration (GAE) method to increase the diversity of training data and seek the best ratio of gray augmentation for learning a more focused model. Additionally, we also propose a strong all-modality center-triplet (AMCT) loss to push the features extracted from the same pedestrian more compact but those from different persons more separate. Experiments conducted on the public dataset SYSU-MM01 demonstrate the superiority of the proposed method in the VI-ReID task.
ER -