博碩士論文 etd-0912112-163448 詳細資訊
Title page for etd-0912112-163448
論文名稱
Title
人形機器人行為模仿與重現
Humanoid Robot Behavior Emulation and Representation
系所名稱
Department
畢業學年期
Year, semester
語文別
Language
學位類別
Degree
頁數
Number of pages
83
研究生
Author
指導教授
Advisor
召集委員
Convenor
口試委員
Advisory Committee
口試日期
Date of Exam
2012-08-28
繳交日期
Date of Submission
2012-09-12
關鍵字
Keywords
Kinect、行為模仿、關鍵姿態、人形機器人
Humanoid robot, Key-posture, Behavior emulation, Kinect
統計
Statistics
本論文已被瀏覽 5755 次,被下載 1505 次
The thesis/dissertation has been browsed 5755 times, has been downloaded 1505 times.
中文摘要
本論文的目的是利用體感控制機器人的概念,開發一套便於操控人形機器人的系統,讓使用者可以透過簡便的方式來操控機器人,因此建立一個以體感裝置Kinect為操控器的系統架構。以Kinect為操控方式可以將使用者的範圍擴大到各個年齡層;有別於以往透過使用者介面的方式,使用者只需要透過肢體語言表達就可以達到目的,而不需要繁複的步驟。本系統可以準確地計算使用者在動作中姿態變化的角度,並且辨認出可以組成相似於目前模仿行為的關鍵姿態;換句話說,透過這些關鍵姿態可以重組原本使用者的行為。因此系統主要由運動學計算模組以及重現演算法所組成,不僅提供行為模仿的功能,還包含了行為重現。透過重現演算法,系統可以擷取組合行為的特徵。為了讓已產生的行為能夠被善加利用,系統也加入了行為結合的功能:透過模組化的概念,可以將不同的由關鍵姿態所構成的行為結合,產生新的行為。本系統的Kinect應用程式是在OpenNI以及NITE的環境下實作;OpenNI用來讀取Kinect所捕捉到的資料,而NITE則是建立骨架追蹤的中介軟體。本論文將以影片「Tai-Ji-Advancer」呈現,影片位於http://www.youtube.com/watch?v=cSYS49JKVAA。
Abstract
The objective of this thesis is to use body-sensing technology to develop a more intuitive and convenient way to control robots. The idea is to build a body-sensing control system based on the Kinect framework. Through the Kinect, users across all age groups can achieve their desired purposes by motion demonstration, without complicated programming. The system accurately calculates the angle changes in the user's posture during a motion and identifies key postures that can compose an emulated motion similar to the demonstrated one. In other words, by analyzing these key postures, the demonstrated behaviors can be represented internally. The system therefore consists of a kinematics computation module and a representation algorithm, and provides not only behavior emulation but also behavior representation. Through the representation algorithm, the system extracts the features of combined behaviors. In addition, with a modular design, different behaviors can be recombined to generate new behaviors from the sets of key postures described by the extracted features. The Kinect application in this system is implemented in the OpenNI and NITE environment: OpenNI retrieves the data captured by the Kinect, while NITE is the middleware used to track the user's skeleton. The system is demonstrated in the performance "Tai-Ji-Advancer," available at http://www.youtube.com/watch?v=cSYS49JKVAA.
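The abstract describes two core pieces: a kinematics computation module that turns tracked skeleton points into joint angles, and a representation algorithm that keeps only the key postures needed to reproduce a motion. The Python sketch below is only an illustration of that idea, not the thesis's implementation; the dictionary data layout, the single "elbow" joint, and the 15-degree change threshold are all assumptions.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by tracked 3-D points a-b-c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    na = math.sqrt(sum(x * x for x in ba))
    nc = math.sqrt(sum(x * x for x in bc))
    # Clamp to [-1, 1] to guard acos against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nc)))))

def extract_key_postures(frames, threshold=15.0):
    """Keep a frame as a key posture whenever any joint angle has
    changed by more than `threshold` degrees since the last key posture.
    Each frame is a dict mapping joint name -> angle in degrees."""
    key = [frames[0]]
    for frame in frames[1:]:
        last = key[-1]
        if any(abs(frame[j] - last[j]) > threshold for j in frame):
            key.append(frame)
    return key

# A toy elbow-only motion: only the frames at 0, 30, and 60 degrees survive.
frames = [{"elbow": a} for a in (0.0, 5.0, 30.0, 32.0, 60.0)]
print(extract_key_postures(frames))
```

Under this sketch, the behavior-combination function the abstract mentions reduces to concatenating key-posture lists (e.g. `new_behavior = key_a + key_b`), which matches the modular recombination idea.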
目次 Table of Contents
摘要 ii
Abstract iv
Table of Contents v
List of Figures vii
List of Tables ix
I. Introduction 1
1.1 Motivation 1
1.2 Objective 2
1.3 Organization of thesis 3
II. Related Research 4
2.1. Motion Capture 4
2.2. Humanoid 8
2.3. Communication 9
III. System Analysis and Design 10
3.1. System Architecture 10
3.2. System Analysis 11
3.2.1. 5W1H 11
3.2.2. Functional and Non-functional Requirements 13
3.2.3. Use Cases 15
3.2.4. UML 19
3.3. Skeleton Algorithm 21
3.4. Data Retrieve and Compute Module 22
3.4.1. Inverse-Kinematics 22
3.4.2. Representation Algorithm 30
3.4.3. Communication 35
3.5. Behavior Functional Module 36
3.5.1. Behavior Emulation 37
3.5.2. Behavior Representation 39
3.5.3. Behavior Combination 42
3.6. System Integration 43
IV. Implementation 46
4.1. Development Environment 46
4.2. Implementation of Skeleton Algorithm 49
4.3. Implementation of DRCM 52
4.3.1. Inverse-Kinematic Computation 52
4.3.2. Experiment of Algorithm 54
4.4. Implementation of BFM 61
4.4.1. Behavior Emulation 61
4.4.2. Behavior Representation 63
4.4.3. Behavior Combination 65
V. Conclusion and Future Work 67
5.1. Conclusion 67
5.2. Future Work 67
References 69
參考文獻 References
[1]. Y. Sakagami, R. Watanabe, C. Aoyama, S. Matsunaga, N. Higaki, and K. Fujimura, “The intelligent ASIMO: system overview and integration,” IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, pp. 2478-2483, 2002.
[2]. 林永祥, “SWOT analysis of development trends in optical motion capture systems,” Proceedings of the 3rd Conference on Sports Science and Leisure Recreation Management, pp. 467-473, 2010. (in Chinese)
[3]. K.-D. Nguyen, I.-M. Chen, S.-H. Yeo, and B.-L. Duh, “Motion control of a robotic puppet through a hybrid motion capture device,” IEEE International Conference on Automation Science and Engineering (CASE), pp. 753-758, Sept. 22-25, 2007.
[4]. H.-I. Lin, Y.-C. Liu, and C.-L. Chen, “Imitation evaluation of human-robot arm movement,” 8th Asian Control Conference (ASCC), pp. 287-292, May 15-18, 2011.
[5]. P. Bednavidez and M. Jamshidi, “Mobile robot navigation and target tracking system,” 6th International Conference on System of Systems Engineering (SoSE), pp. 299-304, June 27-30, 2011.
[6]. K. Lai, J. Konrad, and P. Ishwar, “A gesture-driven computer interface using Kinect,” IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI), pp. 185-188, April 22-24, 2012.
[7]. S.-F. Huang and W.-J. Wang, “Human’s postures recognition by using Kinect,” National Symposium on System Science and Engineering, pp. 187-192, June 17-18, 2011.
[8]. M.-V.-D. Bergh, D. Carton, R.-D. Nijs, N. Mitsou, C. Landsiedel, K. Kuehnlenz, D. Wollherr, L.-V. Gool, and M. Buss, “Real-time 3D hand gesture interaction with a robot for understanding directions from humans,” 20th IEEE International Symposium on Robot and Human Interactive Communication, pp. 357-362, July 31-Aug. 3, 2011.
[9]. F. Hoshino and K. Morioka, “Human following robot based on control of particle distribution,” IEEE/SICE International Symposium on System Integration (SII), pp. 212-217, Dec. 20-22, 2011.
[10]. R.-R. Igorevich, E.-P. Ismoilovich, and D. Min, “Behavioral synchronization of human and humanoid robot,” 8th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), pp. 655-660, Nov. 23-26, 2011.
[11]. I.-I. Itauma, H. Kivrak, and H. Kose, “Gesture imitation using machine learning techniques,” 20th Signal Processing and Communications Applications Conference (SIU), pp. 1-4, April 18-20, 2012.
[12]. L. Cheng, Q. Sun, H. Su, Y. Cong, and S. Zhao, “Design and implementation of human-robot interactive demonstration system based on Kinect,” 24th Chinese Control and Decision Conference (CCDC), pp. 971-975, May 23-25, 2012.
[13]. K. Pan, “Implementation of robot arm controller on embedded system,” National Chung Cheng University, 2010.
[14]. J.-T. Zou and D.-H. Tu, “The development of six D.O.F. robot arm for intelligent robot,” 8th Asian Control Conference (ASCC), pp. 976-981, May 15-18, 2011.
[15]. G.-S. Huang, C.-K. Tung, H.-C. Lin, and S.-H. Hsiao, “Inverse kinematics analysis trajectory planning for a robot arm,” 8th Asian Control Conference (ASCC), pp. 965-970, May 15-18, 2011.
[16]. S. Li, J. Liang, G. Liu, and Y. Zhang, “Generate virtual human whole body reaching postures combining analytical inverse kinematics with example postures,” 7th International Conference on System Simulation and Scientific Computing (ICSC 2008), Asia Simulation Conference, pp. 45-52, Oct. 10-12, 2008.
[17]. D.-Y. Chen, H.-Y. M. Liao, H.-R. Tyan, and C.-W. Lin, “Automatic key posture selection for human behavior analysis,” IEEE 7th Workshop on Multimedia Signal Processing, pp. 1-4, Oct. 30-Nov. 5, 2005.
[18]. Y. Chen, Q. Wu, X. He, C. Du, and J. Yang, “Extracting key postures in a human action video,” IEEE 10th Workshop on Multimedia Signal Processing, pp. 569-573, Oct. 8-10, 2008.
[19]. J. Zhou, C. Wen, and Y. Zhang, “Adaptive output control of nonlinear systems with uncertain dead-zone nonlinearity,” IEEE Transactions on Automatic Control, vol. 51, no. 3, pp. 504-511, March 2006.
[20]. X.-S. Wang, C.-Y. Su, and H. Hong, “Robust adaptive control of a class of nonlinear systems with unknown dead-zone,” 40th IEEE Conference on Decision and Control, vol. 2, pp. 1627-1632, 2001.
[21]. C.-A. Chen, “The authoring tool and performance platform for robotic puppet show,” National Chung Cheng University, 2011.
電子全文 Fulltext
This electronic full text is licensed only for personal, non-profit retrieval, reading, and printing for the purpose of academic research. Please comply with the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization, so as to avoid violating the law.
論文使用權限 Thesis access permission: 校內校外完全公開 unrestricted
開放時間 Available:
校內 Campus: 已公開 available
校外 Off-campus: 已公開 available


紙本論文 Printed copies
Public-access information for printed copies is relatively complete from academic year 102 (2013-2014) onward. To inquire about the public-access status of printed copies from academic year 101 or earlier, please contact the printed-thesis service counter of the Office of Library and Information Services. We apologize for any inconvenience.
開放時間 Available: 已公開 available
