
<論文>動作プリミティブ抽出と舞踊符割当ての自動化

http://hdl.handle.net/10295/171
7c6102d7-d1d6-4f28-b5d0-e40bd0d3ce89
Name / File: KJ00000048348.pdf (748.0 kB)
Item type: 紀要論文 / Departmental Bulletin Paper
Publication date: 2007-12-14
Title: <論文>動作プリミティブ抽出と舞踊符割当ての自動化
Other title: <Original Papers>Extraction of Motion Primitives and Automated Buyo-fu Assignment
Language: jpn
Resource type: departmental bulletin paper (http://purl.org/coar/resource_type/c_6501)
Creators: 湯川, 崇 (YUKAWA, Takashi); 小原, 直子 (OBARA, Naoko); 玉本, 英夫 (TAMAMOTO, Hideo)
Description type: Other
Description: We have proposed a human motion description method using Buyo-fu, which aims at establishing a new way of recording human motions and re-using time series of three-dimensional human motion data (mocap data) obtained with a motion capture system. The mocap data can be partitioned into basic motions. We assign a code to each basic motion and call it Buyo-fu, which lets us describe human motions efficiently. At present, making Buyo-fu takes much time and effort, because only a person with professional knowledge of the motions concerned can extract Buyo-fu. In this paper, we propose a method that automatically extracts Buyo-fu from mocap data. In our proposed method, 1) a series of human motions is partitioned into basic motions based on the speed of the motions; 2) since similar basic motions appear several times in a series of motions, we cluster the obtained basic motions; 3) we assign a label to each clustered basic motion and call it a primitive motion; 4) a series of motions can then be expressed as a concatenation of primitive motions; 5) when the same pattern of primitive motions is found within the human motions, we define that pattern as a new primitive. Each primitive motion obtained with our proposed method can become Buyo-fu. Through an experiment with the typical Karate-do forms "tensho" and "sanchin", we show that the proposed method is a good candidate for automatically extracting Buyo-fu from mocap data. (An illustrative sketch of this pipeline appears after the bibliographic record below.)
Version type: VoR (http://purl.org/coar/version/c_970fb48d4fbd8a85)
Bibliographic information: 秋田大学工学資源学部研究報告, Vol. 23, pp. 33-40, issued 2002-10-31
ISSN: 13457241
NCID: AA11410906
Publisher: 秋田大学工学資源学部
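
The description above outlines a five-step extraction pipeline: segment by speed, cluster the segments, label each cluster as a primitive motion, express the sequence as a concatenation of primitives, and promote repeated label patterns to new primitives. The following Python sketch illustrates that pipeline under stated assumptions; it is not the authors' implementation. It assumes the mocap data is a NumPy array of shape (frames, coordinates), uses k-means as a stand-in for the unspecified clustering step, and resamples every segment to a fixed length before comparison. The function names (segment_by_speed, extract_primitives, merge_repeated_patterns) are hypothetical.

# Minimal illustrative sketch of the pipeline described in the abstract above.
import numpy as np
from sklearn.cluster import KMeans


def segment_by_speed(frames: np.ndarray, smooth: int = 5) -> list[np.ndarray]:
    """Step 1: cut the sequence at local minima of overall motion speed."""
    speed = np.linalg.norm(np.diff(frames, axis=0), axis=1)
    # moving-average smoothing so small jitters do not create spurious cuts
    speed = np.convolve(speed, np.ones(smooth) / smooth, mode="same")
    minima = [i for i in range(1, len(speed) - 1)
              if speed[i] <= speed[i - 1] and speed[i] <= speed[i + 1]]
    cuts = [0] + minima + [len(frames)]
    return [frames[a:b] for a, b in zip(cuts[:-1], cuts[1:]) if b - a > 1]


def resample(segment: np.ndarray, length: int = 20) -> np.ndarray:
    """Put every basic motion on a common time base so they can be compared."""
    t_old = np.linspace(0.0, 1.0, len(segment))
    t_new = np.linspace(0.0, 1.0, length)
    return np.stack([np.interp(t_new, t_old, segment[:, d])
                     for d in range(segment.shape[1])], axis=1)


def extract_primitives(frames: np.ndarray, n_clusters: int = 8) -> list[int]:
    """Steps 2-4: cluster the basic motions and return the primitive-label sequence."""
    segments = segment_by_speed(frames)
    features = np.stack([resample(s).ravel() for s in segments])
    labels = KMeans(n_clusters=min(n_clusters, len(segments)),
                    n_init=10, random_state=0).fit_predict(features)
    return labels.tolist()


def merge_repeated_patterns(labels: list[int], length: int = 2) -> list[tuple]:
    """Step 5: label patterns that occur more than once are candidates for
    new, higher-level primitives (Buyo-fu)."""
    seen: dict[tuple, int] = {}
    for i in range(len(labels) - length + 1):
        pattern = tuple(labels[i:i + length])
        seen[pattern] = seen.get(pattern, 0) + 1
    return [pattern for pattern, count in seen.items() if count > 1]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.standard_normal((600, 3)).cumsum(axis=0)  # stand-in for mocap data
    sequence = extract_primitives(demo)
    print("primitive label sequence:", sequence)
    print("repeated patterns:", merge_repeated_patterns(sequence))

Cutting at local speed minima mirrors step 1 of the description; any sequence-clustering or motif-discovery technique could replace the k-means and fixed-length pattern search used in this sketch.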
Powered by WEKO3