Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Jianfeng XU, Haruhisa KATO, Akio YONEYAMA, "Content-Based Retrieval of Motion Capture Data Using Short-Term Feature Extraction" in IEICE TRANSACTIONS on Information and Systems,
vol. E92-D, no. 9, pp. 1657-1667, September 2009, doi: 10.1587/transinf.E92.D.1657.
Abstract: This paper presents a content-based retrieval algorithm for motion capture data, which is required to re-use a large-scale database that has many variations in the same category of motions. The most challenging problem is that logically similar motions may not be numerically similar due to the motion variations in a category. Our algorithm can effectively retrieve logically similar motions to a query, where a distance metric between our novel short-term features is defined properly as a fundamental component in our system. We extract the features based on short-term analysis of joint velocities after dividing an entire motion capture sequence into many small overlapped clips. In each clip, we select not only the magnitude but also the dynamic pattern of the joint velocities as our features, which can discard the motion variations while keeping the significant motion information in a category. Simultaneously, the amount of data is reduced, alleviating the computational cost. Using the extracted features, we define a novel distance metric between two motion clips. By dynamic time warping, a motion dissimilarity measure is calculated between two motion capture sequences. Then, given a query, we rank all the motions in our dataset according to their motion dissimilarity measures. Our experiments, which are performed on a test dataset consisting of more than 190 motions, demonstrate that our algorithm greatly improves the performance compared to two conventional methods according to a popular evaluation measure P(NR).
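The pipeline the abstract describes (short-term clip features, a per-clip distance, dynamic time warping, and ranking by dissimilarity) can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the per-clip distance here is plain Euclidean between feature vectors, whereas the paper defines its own metric over short-term joint-velocity features, and all function names are hypothetical.

```python
import math

def clip_distance(f_a, f_b):
    # Euclidean distance between two per-clip feature vectors
    # (a stand-in for the paper's short-term feature metric).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(f_a, f_b)))

def dtw_dissimilarity(seq_a, seq_b):
    # Dynamic time warping: minimal accumulated clip distance over
    # all monotonic alignments of the two clip-feature sequences.
    n, m = len(seq_a), len(seq_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = clip_distance(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a clip in seq_a
                                 cost[i][j - 1],      # skip a clip in seq_b
                                 cost[i - 1][j - 1])  # align the two clips
    return cost[n][m]

def rank_motions(query_seq, dataset):
    # Rank all motions in the dataset by dissimilarity to the
    # query sequence, most similar first.
    scores = [(name, dtw_dissimilarity(query_seq, feats))
              for name, feats in dataset.items()]
    return sorted(scores, key=lambda item: item[1])
```

For example, with one-dimensional features, a query identical to one motion in the dataset ranks that motion first with dissimilarity 0, since DTW of a sequence against itself follows the diagonal alignment at zero cost.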
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E92.D.1657/_p
@ARTICLE{e92-d_9_1657,
  author={Jianfeng XU and Haruhisa KATO and Akio YONEYAMA},
  journal={IEICE TRANSACTIONS on Information and Systems},
title={Content-Based Retrieval of Motion Capture Data Using Short-Term Feature Extraction},
year={2009},
volume={E92-D},
number={9},
pages={1657-1667},
abstract={This paper presents a content-based retrieval algorithm for motion capture data, which is required to re-use a large-scale database that has many variations in the same category of motions. The most challenging problem is that logically similar motions may not be numerically similar due to the motion variations in a category. Our algorithm can effectively retrieve logically similar motions to a query, where a distance metric between our novel short-term features is defined properly as a fundamental component in our system. We extract the features based on short-term analysis of joint velocities after dividing an entire motion capture sequence into many small overlapped clips. In each clip, we select not only the magnitude but also the dynamic pattern of the joint velocities as our features, which can discard the motion variations while keeping the significant motion information in a category. Simultaneously, the amount of data is reduced, alleviating the computational cost. Using the extracted features, we define a novel distance metric between two motion clips. By dynamic time warping, a motion dissimilarity measure is calculated between two motion capture sequences. Then, given a query, we rank all the motions in our dataset according to their motion dissimilarity measures. Our experiments, which are performed on a test dataset consisting of more than 190 motions, demonstrate that our algorithm greatly improves the performance compared to two conventional methods according to a popular evaluation measure P(NR).},
doi={10.1587/transinf.E92.D.1657},
ISSN={1745-1361},
  month={September},
}
TY - JOUR
TI - Content-Based Retrieval of Motion Capture Data Using Short-Term Feature Extraction
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 1657
EP - 1667
AU - XU, Jianfeng
AU - KATO, Haruhisa
AU - YONEYAMA, Akio
PY - 2009
DO - 10.1587/transinf.E92.D.1657
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E92-D
IS - 9
JA - IEICE Trans. Inf. & Syst.
Y1 - 2009/09//
AB - This paper presents a content-based retrieval algorithm for motion capture data, which is required to re-use a large-scale database that has many variations in the same category of motions. The most challenging problem is that logically similar motions may not be numerically similar due to the motion variations in a category. Our algorithm can effectively retrieve logically similar motions to a query, where a distance metric between our novel short-term features is defined properly as a fundamental component in our system. We extract the features based on short-term analysis of joint velocities after dividing an entire motion capture sequence into many small overlapped clips. In each clip, we select not only the magnitude but also the dynamic pattern of the joint velocities as our features, which can discard the motion variations while keeping the significant motion information in a category. Simultaneously, the amount of data is reduced, alleviating the computational cost. Using the extracted features, we define a novel distance metric between two motion clips. By dynamic time warping, a motion dissimilarity measure is calculated between two motion capture sequences. Then, given a query, we rank all the motions in our dataset according to their motion dissimilarity measures. Our experiments, which are performed on a test dataset consisting of more than 190 motions, demonstrate that our algorithm greatly improves the performance compared to two conventional methods according to a popular evaluation measure P(NR).
ER -