• KSII Transactions on Internet and Information Systems
    Monthly Online Journal (eISSN: 1976-7277)

A Framework for Human Motion Segmentation Based on Multiple Information of Motion Data

Vol. 13, No. 9, September 29, 2019
DOI: 10.3837/tiis.2019.09.017

Abstract

With the development of the film, game, and animation industries, the analysis and reuse of human motion capture data have become increasingly important. Human motion segmentation, which divides a long motion sequence into fragments of different types, is a key part of mocap-based techniques. However, most segmentation methods consider only the low-level physical information (motion characteristics) or the high-level data information (statistical characteristics) of motion data, and thus fail to exploit the data fully. In this paper, we propose an unsupervised framework that uses both the low-level physical information and the high-level data information of human motion data to solve the human motion segmentation problem. First, we introduce the CFSFDP algorithm (clustering by fast search and find of density peaks) and optimize it to quickly produce a good initial segmentation. Second, we apply the ACA (aligned cluster analysis) method to refine the segmentation result. Experiments demonstrate that our framework achieves excellent performance.
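For readers unfamiliar with the first stage, the following is a minimal sketch of the standard CFSFDP algorithm as commonly described (not the paper's optimized variant, whose details are not given in this abstract). It assumes numpy; the parameter names `d_c` (density cutoff distance) and `n_centers` are illustrative choices, not the paper's notation.

```python
import numpy as np

def cfsfdp(X, d_c, n_centers=2):
    """Sketch of Clustering by Fast Search and Find of Density Peaks.

    X: (n, d) array of points; d_c: cutoff distance; n_centers: clusters kept.
    Returns one cluster label per point.
    """
    n = len(X)
    # Pairwise Euclidean distances.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Local density rho_i: number of neighbours within the cutoff distance.
    rho = (dist < d_c).sum(axis=1) - 1  # exclude the point itself
    # delta_i: distance to the nearest point of higher density.
    order = np.argsort(-rho)            # indices from densest to sparsest
    delta = np.zeros(n)
    nearest_denser = np.full(n, -1)
    delta[order[0]] = dist[order[0]].max()  # densest point: use max distance
    for k in range(1, n):
        i = order[k]
        denser = order[:k]
        j = denser[np.argmin(dist[i, denser])]
        delta[i] = dist[i, j]
        nearest_denser[i] = j
    # Cluster centers: points with both high density and high delta.
    centers = np.argsort(-(rho * delta))[:n_centers]
    labels = np.full(n, -1)
    labels[centers] = np.arange(n_centers)
    # Assign remaining points, in decreasing density order, to the cluster
    # of their nearest denser neighbour (already labelled by construction).
    for i in order:
        if labels[i] == -1:
            labels[i] = labels[nearest_denser[i]]
    return labels
```

In the paper's setting the "points" would be feature vectors derived from motion frames or segments rather than raw 2-D samples, and the density/delta criterion yields the initial segmentation that ACA then refines.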




Cite this article

[IEEE Style]
X. Zan, W. Liu and W. Xing, "A Framework for Human Motion Segmentation Based on Multiple Information of Motion Data," KSII Transactions on Internet and Information Systems, vol. 13, no. 9, pp. 4624-4644, 2019. DOI: 10.3837/tiis.2019.09.017.

[ACM Style]
Xiaofei Zan, Weibin Liu, and Weiwei Xing. 2019. A Framework for Human Motion Segmentation Based on Multiple Information of Motion Data. KSII Transactions on Internet and Information Systems, 13, 9, (2019), 4624-4644. DOI: 10.3837/tiis.2019.09.017.