Thesis/Dissertation etd-0829111-231710: Details
Title page for etd-0829111-231710
論文名稱
Title
應用全方位攝影機於空間移動物體偵測
Application of an Omnidirectional Camera to Detection of Moving Objects in 3D Space
系所名稱
Department
畢業學年期
Year, semester
語文別
Language
學位類別
Degree
頁數
Number of pages
151
研究生
Author
指導教授
Advisor
召集委員
Convenor
口試委員
Advisory Committee
口試日期
Date of Exam
2011-07-21
繳交日期
Date of Submission
2011-08-29
關鍵字
Keywords
全方位影像、物體追蹤、CAMShift、光流
Omnidirectional image, Object tracking, CAMShift, Optical flow
統計
Statistics
The thesis/dissertation has been browsed 5667 times, has been downloaded 3806 times.
中文摘要 Chinese Abstract
Conventional cameras have a narrow field of view, so image tracking can observe only a local region and the tracking range is limited. If such a vision system is mounted on a robot, the platform's motion is constrained and tracking ability is reduced. An omnidirectional camera enlarges the field of view, reduces blind spots, and improves tracking ability. In this study, we assume an omnidirectional camera is mounted on a moving platform restricted to planar motion, and we use optical flow and CAMShift to track a flying object that is non-propelled and subject only to gravity. Parabolic fitting, the least-squares method, and the Levenberg-Marquardt method are used to estimate the object's 3D position at the current and next instants. Finally, the drop point is predicted so that the moving platform can move to it and meet the target. Tracking and drop point prediction remain possible even while the camera translates and rotates.
Abstract
Conventional cameras usually have a small field of view (FOV), which limits the observable region. A vision system built on such cameras also limits a robot's motion capabilities when it comes to object tracking. An omnidirectional camera has a wide FOV and can obtain environmental data from all directions; compared with conventional cameras, this wide FOV reduces blind regions and improves tracking ability. In this thesis, we assume an omnidirectional camera is mounted on a moving platform that travels with planar motion. We apply optical flow and the CAMShift algorithm to track an object that is non-propelled and subject only to gravity. Then, using parabolic fitting, the least-squares method, and the Levenberg-Marquardt method to predict the 3D coordinates of the object at the current and next instants, we finally predict the position of the drop point and drive the moving platform to meet the object there. The tracking operation and drop point prediction can be achieved even when the camera undergoes planar motion and rotation.
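The abstract's late-stage step — recover the projectile's initial state by least squares under the constant-gravity ballistic model, then solve for the landing point — can be illustrated in isolation. The sketch below is not the thesis's implementation (which estimates the state from projections in the panoramic image and refines it with Levenberg-Marquardt); it is a minimal example with hypothetical function names, assuming noiseless 3D observations of the object are already available:

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def fit_projectile(t, xs, ys, zs):
    """Least-squares fit of initial position/velocity for a non-propelled
    object under gravity only (hypothetical helper; the thesis works from
    image projections, not direct 3D measurements):
        x(t) = x0 + vx*t,  y(t) = y0 + vy*t,  z(t) = z0 + vz*t - 0.5*G*t^2
    """
    A = np.column_stack([np.ones_like(t), t])   # design matrix [1, t]
    x0, vx = np.linalg.lstsq(A, xs, rcond=None)[0]
    y0, vy = np.linalg.lstsq(A, ys, rcond=None)[0]
    # Move the known gravity term to the left side so z is linear in (z0, vz).
    z0, vz = np.linalg.lstsq(A, zs + 0.5 * G * t**2, rcond=None)[0]
    return (x0, y0, z0), (vx, vy, vz)

def drop_point(p0, v0, ground=0.0):
    """Solve z(t) = ground for the positive root, then evaluate x and y."""
    x0, y0, z0 = p0
    vx, vy, vz = v0
    # 0.5*G*t^2 - vz*t - (z0 - ground) = 0, positive quadratic root:
    t_land = (vz + np.sqrt(vz**2 + 2.0 * G * (z0 - ground))) / G
    return x0 + vx * t_land, y0 + vy * t_land, t_land
```

With exact observations the fit recovers the initial state exactly; in the thesis's setting the observations are noisy image-derived estimates, which is why a Levenberg-Marquardt refinement of this initial least-squares state is applied afterwards.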
目次 Table of Contents
Table of Contents i
List of Figures iv
List of Tables ix
Abstract (Chinese) x
Abstract (English) xi
Chapter 1 Introduction 1
1.1 Motivation and Objectives 1
1.2 Literature Review 3
1.3 Thesis Organization 5
Chapter 2 Omnidirectional Cameras 7
2.1 Characteristics of Omnidirectional Images 7
2.2 System Model of the Omnidirectional Camera 9
2.3 Geometric Projection Relations 11
2.4 Paraboloidal-Mirror Omnidirectional Cameras 15
2.5 Other Omnidirectional Cameras 16
Chapter 3 Basic Image Processing Techniques 18
3.1 Color Models and Conversions 18
3.1.1 RGB Color Space 18
3.1.2 Grayscale Color Model 19
3.1.3 HSV Color Model 21
3.2 Overview of Conventional Image Interpolation 23
3.3 Integral Images 28
3.4 Kernel Functions 30
3.5 Probability Density Functions (PDF) 31
3.6 Image Moments 33
3.6.1 Raw Moments 34
3.6.2 Central Moments 35
3.7 Image Orientation 35
Chapter 4 Camera Calibration and Image Transformation 38
4.1 Omnidirectional Camera Calibration 38
4.2 Converting Omnidirectional Images to Panoramic Images 41
Chapter 5 Pyramidal Lucas-Kanade Optical Flow 47
5.1 Optical Flow 47
5.1.1 The Concept of Optical Flow 47
5.1.2 Standard Iterative Lucas-Kanade Optical Flow 48
5.2 Image Pyramids with Lucas-Kanade Optical Flow 54
5.2.1 Image Pyramids 54
5.2.2 Coarse-to-Fine Optical Flow Computation 57
Chapter 6 CAMShift Object Tracking Algorithm 60
6.1 Mean Shift 60
6.2 Adaptive Window Size 65
6.3 Color-Based CAMShift 65
6.3.1 Template Input 65
6.3.2 Object Tracking with Mean Shift 66
6.4 Optical-Flow-Based CAMShift 67
Chapter 7 Drop Point Prediction of the Target 72
7.1 Drop Point Prediction Strategy 72
7.2 Early-Stage Prediction 72
7.2.1 Parabolic Fitting 72
7.2.2 Predicting the Target's Next Projection in the Panoramic Image 73
7.3 Late-Stage Drop Point Prediction 78
7.3.1 Estimating the Target's Initial State by the Least-Squares Method 78
7.3.2 Optimizing the Initial State with the Levenberg-Marquardt Method 88
Chapter 8 Simulations and Experiments 94
8.1 Hardware Architecture of the Omnidirectional Camera 94
8.2 Calibration with the Omnidirectional Camera Calibration Toolbox 96
8.3 Omnidirectional-to-Panoramic Image Conversion 101
8.4 Simulation Tests 103
8.4.1 Environment Construction 103
8.4.2 Position Prediction Tests 104
8.5 Real-Image Tests 119
8.5.1 Environment and Procedure of the Real Tests 119
8.5.2 Real-Image Tests of Trajectory Estimation 122
Chapter 9 Conclusions and Future Work 128
References 131
Appendix A: Derivation of Object Position Estimation by the Least-Squares Method 135
Appendix B: Spherical Omnidirectional Image to Panoramic Image Conversion 136
參考文獻 References
[1] J. Hill and W. T. Park, "Real Time Control of a Robot with a Mobile Camera," in Proc. 9th ISIR, Washington, D.C., pp. 233-246, Mar. 1979.
[2] T. Collett, "Insect Vision: Controlling Actions through Optical Flow," Current Biology, Vol. 12, No. 18, pp. R615-R617, 2002.
[3] G. Bradski, "Computer Vision Face Tracking For Use in a Perceptual User Interface," Intel Technology Journal, 1998.
[4] C. Harris and M. Stephens, "A Combined Corner and Edge Detector," Proceedings of the Fourth Alvey Vision Conference, pp. 147-151, 1988.
[5] D. H. Ballard, "Generalizing the Hough Transform to Detect Arbitrary Shapes," pp. 714–725, 1987.
[6] D.A. Martins, A.J.R. Neves, A.J. Pinho, "Real-time generic ball recognition in RoboCup domain," Proceedings 3rd International Workshop on Intelligent Robotics, IROBOT 2008, Lisbon, Portugal, pp. 37-48, 2008.
[7] J. Canny, "A Computational Approach to Edge Detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-8, no. 6, pp. 679-698, 1986.
[8] S. J. K. Pedersen, "Circular Hough Transform," Aalborg University, Vision, Graphics, and Interactive Systems, 2007.
[9] H. Lu, H. Zhang, J. Xiao, F. Liu, and Z. Zheng, "Arbitrary Ball Recognition Based on Omni-Directional Vision for Soccer Robots," RoboCup 2008, LNAI 5399, pp. 133–144, 2009.
[10] N. Oshima, T. Saitoh and R. Konishi, "Real Time Mean Shift Tracking using Optical Flow Distribution," SICE-ICASE International Joint Conference, pp. 4316-4320, 2006.
[11] 周家至, "應用全方位影像與光流技術於運動估測" (Application of Omnidirectional Images and Optical Flow Techniques to Motion Estimation), Master's thesis, Department of Mechanical and Electro-Mechanical Engineering, National Sun Yat-sen University, July 2007.
[12] S.-W. Jeng and W.-H. Tsai, "Precise Image Unwarping of Omnidirectional Cameras with Hyperbolic-Shaped Mirrors," 16th IPPR Conference on Computer Vision, Graphics and Image Processing, pp. 414-422, 2003.
[13] "Point Grey - 360 Spherical - Ladybug2 - 360 Video Camera - CCD FireWire Camera," Point Grey Research, Inc., http://www.ptgrey.com/products/ladybug2/ladybug2_360_video_camera.asp.
[14] R. Jain, R. Kasturi, B. G. Schunck, "Machine Vision, International Edition 1995", McGraw-Hill series in computer science, pp. 33-35, 1995.
[15] D. Scaramuzza, A. Martinelli and R. Siegwart, "A Flexible Technique for Accurate Omnidirectional Camera Calibration and Structure from Motion," Proceedings of IEEE International Conference on Computer Vision Systems, pp. 45, 2006.
[16] D. Scaramuzza, A. Martinelli and R. Siegwart, "A Toolbox for Easily Calibrating Omnidirectional Cameras," Proceedings of IEEE International Conference on Intelligent Robots and Systems, pp. 5695-5701, 2006.
[17] B.K.P. Horn and B.G. Schunck, "Determining optical flow," Artificial Intelligence, vol 17, pp. 185-203, 1981.
[18] B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," Proceedings of Imaging Understanding Workshop, pp. 121-130, 1981.
[19] P. Anandan, "A computational framework and an algorithm for the measurement of visual motion," International Journal of Computer Vision 2(3), pp. 283-310, 1989.
[20] J. L. Barron, D. J. Fleet and S. Beauchemin, "Performance of Optical Flow Techniques," International Journal of Computer Vision, Vol. 12, No. 1, pp. 43-77, 1994.
[21] J.-Y. Bouguet, "Pyramidal Implementation of the Lucas Kanade Feature Tracker Description of the algorithm," Intel Corporation Microprocessor Research Labs, 2000.
[22] D. Comaniciu and P. Meer, "Mean Shift: A Robust Approach Toward Feature Space Analysis". IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 603-619, 2002.
[23] 林家瑋, "應用平均位移法於變形物體之即時視覺追循" (Application of the Mean Shift Method to Real-Time Visual Tracking of Deformable Objects), Master's thesis, Department of Mechanical and Electro-Mechanical Engineering, National Sun Yat-sen University, July 2009.
[24] K. Fukunaga and L.D. Hostetler, "The Estimation of the Gradient of a Density function, with Applications in Pattern Recognition," IEEE Trans. Info. Theory, vol. IT-21, pp. 32-40, 1975.
[25] Y. Cheng, "Mean Shift, Mode Seeking, and Clustering," IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 790–799, 1995.
[26] T. Cacoullos, "Estimation of a Multivariate Density," Annals of the Institute of Statistical Mathematics, vol. 18, pp. 179-189, 1966.
[27] D. Comaniciu, V. Ramesh, and P. Meer, "Real-time Tracking of Non-rigid Objects Using Mean Shift," 2000 IEEE Conference on Computer Vision and Pattern Recognition, Hilton Head, SC, vol. II, pp. 142-149, 2000.
[28] Y. Ukrainitz and B. Sarel, "Mean Shift Theory and Applications", Weizmann Institute of Science, http://www.wisdom.weizmann.ac.il/~vision/courses/2004_2/files/mean_shift/mean_shift.ppt
[29] R.B. Fernando, "Catching balls: What visual cues are used to judge the 3-D trajectory of a moving object?", Neuroscience Honors Thesis, Wellesley College, 2008.
[30] R. L. Burden and J. D. Faires, "Numerical Analysis, 8th Edition," Thomson Brooks/Cole, pp. 482-494, 2005.
[31] R. Herrejon, S. Kagami, and K. Hashimoto, "Online 3-D trajectory estimation of a flying object from a monocular image sequence," Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2496-2501, 2009.
[32] C. Viriyasuthee, "A real-time 3D parabolic trajectory reconstruction from projected 2D data using regression model," School of Computer Science McGill University, Montreal, Quebec, Canada, pp. 7-9, 2009.
[33] R. E. Kalman, "A New Approach to Linear Filtering and Prediction Problems," Transaction of the ASME—Journal of Basic Engineering, pp. 35-45, 1960.
[34] G. Welch and G. Bishop, "An introduction to the Kalman filter", SIGGRAPH 2001 course 8 in computer graphics, Annual Conf. on Computer Graphics and Interactive Techniques, ACM Press, Addison-Wesley, Los Angeles, CA (August 12–17, 2001), SIGGRAPH 2001 course pack edition, 2001.
電子全文 Fulltext
This electronic full text is licensed only for personal, non-commercial retrieval, reading, and printing for academic research purposes. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
論文使用權限 Thesis access permission: user-defined availability date (自定論文開放時間)
開放時間 Available:
校內 Campus: 已公開 available
校外 Off-campus: 已公開 available


紙本論文 Printed copies
Availability information for printed copies is relatively complete for academic year 102 (2013) and later. To inquire about the availability of printed copies from academic year 101 (2012) or earlier, please contact the printed thesis service counter of the Office of Library and Information Services. We apologize for any inconvenience.
開放時間 Available: 已公開 available
