KSII Transactions on Internet and Information Systems
Monthly Online Journal (eISSN: 1976-7277)

Creating Deep Learning-based Acrobatic Videos Using Imitation Videos

Vol. 15, No. 2, February 28, 2021
DOI: 10.3837/tiis.2021.02.018

Abstract

This paper proposes an augmented reality technique for generating acrobatic scenes from videos of hitting motions. After a user records a motion that mimics hitting an object with the hands or feet, the user's pose is analyzed with deep learning-based motion tracking to follow the hand or foot movement at each hit. The hitting positions and times are then extracted to generate the object's trajectory through physics-based optimization, and the trajectory is synchronized with the video. The proposed method can create videos of objects being hit with the feet (e.g., soccer ball lifting) or the fists (e.g., a tap ball), and is suitable for augmented reality applications that insert virtual objects.
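The abstract does not spell out the physics optimization used in the paper. As a rough illustration of the trajectory step only, the sketch below (Python with NumPy; all function and variable names are hypothetical) assumes the object follows simple projectile motion between two consecutive hit events, solves the boundary conditions in closed form, and samples the arc at the video frame rate so positions could be overlaid frame by frame.

import numpy as np

# Gravity in a y-up world frame (assumed; the paper's coordinate setup is not given here).
GRAVITY = np.array([0.0, -9.81, 0.0])

def segment_trajectory(p_hit0, t_hit0, p_hit1, t_hit1, fps=30.0):
    """Ballistic arc between two consecutive hit events.

    Solves p1 = p0 + v0*dt + 0.5*g*dt^2 for the launch velocity v0,
    then samples the parabola at the video frame rate so the object's
    positions can be composited onto the frames between the two hits.
    """
    p0 = np.asarray(p_hit0, dtype=float)
    p1 = np.asarray(p_hit1, dtype=float)
    dt = t_hit1 - t_hit0
    v0 = (p1 - p0 - 0.5 * GRAVITY * dt ** 2) / dt   # closed-form fit to the boundary conditions
    t = np.arange(0.0, dt, 1.0 / fps)               # one sample per video frame
    return p0 + np.outer(t, v0) + 0.5 * np.outer(t ** 2, GRAVITY)

# Example: a kick at the origin at t = 0 s, next contact 0.8 s later and slightly to the side.
positions = segment_trajectory([0.0, 0.0, 0.0], 0.0, [0.3, 0.0, 0.1], 0.8)
print(positions.shape)   # (24, 3): one 3D object position per frame between the two hits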


Cite this article

[IEEE Style]
J. I. Choi and S. H. Nam, "Creating Deep Learning-based Acrobatic Videos Using Imitation Videos," KSII Transactions on Internet and Information Systems, vol. 15, no. 2, pp. 713-728, 2021. DOI: 10.3837/tiis.2021.02.018.

[ACM Style]
Jong In Choi and Sang Hun Nam. 2021. Creating Deep Learning-based Acrobatic Videos Using Imitation Videos. KSII Transactions on Internet and Information Systems, 15, 2, (2021), 713-728. DOI: 10.3837/tiis.2021.02.018.

[BibTeX Style]
@article{tiis:24284, title={Creating Deep Learning-based Acrobatic Videos Using Imitation Videos}, author={Jong In Choi and Sang Hun Nam}, journal={KSII Transactions on Internet and Information Systems}, DOI={10.3837/tiis.2021.02.018}, volume={15}, number={2}, year={2021}, month={February}, pages={713-728}}