Title page for etd-0809113-173558
Title
多感測器融合應用於智慧型路口安全監控系統
Multi-Sensor Fusion for Safety Monitoring Systems in Intelligent Intersections
Department
Year, semester
Language
Degree
Number of pages
79
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2013-07-22
Date of Submission
2013-09-09
Keywords
laser range finder, artificial neural network, image processing, intelligent intersections, safety monitoring systems
Statistics
The thesis/dissertation has been viewed 5696 times and downloaded 191 times.
中文摘要 Chinese Abstract
With today's trend of urbanization, urban traffic has become more congested and dangerous. A safety monitoring system for intelligent intersections that transmits real-time intersection information to remote drivers for reference, or that controls traffic signals and issues hazard warnings after real-time computation, would make intersections more intelligent and effectively relieve traffic congestion and safety problems.
This study therefore builds such an intelligent monitoring system from a laser range finder and a camera. The laser range finder measures the intersection accurately in real time, and from these measurements the system computes information such as object class, velocity, and time-to-collision, which can be transmitted over a wireless network and a 3G mobile network to the driver's smartphone for remote monitoring.
Because the laser range finder suffers from occlusion and data loss, the camera is also used to assist it in classifying objects and to mark occluded regions. The camera images can further be compressed with vector quantization and transmitted over the wireless network and 3G mobile network to the remote monitoring system, so that drivers and the traffic control center can monitor the intersection in real time.
For the laser range finder, an artificial neural network (ANN) classifies objects into categories such as cars, motorcycles, and background objects, and the system computes their velocity, time-to-collision, and the intersection traffic flow; the results are transmitted over the wireless network to smartphones and the remote site to give drivers real-time information and danger warnings.
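As a rough illustration of this classification stage (a sketch only, not the thesis's actual implementation), the snippet below trains a small feed-forward network on hand-crafted features of a laser point cluster; the feature choice (cluster width, depth, point count), the toy labels, and the use of scikit-learn are assumptions made for the example.

```python
# Hypothetical sketch: classify laser range finder clusters with a small ANN.
# Feature choice (width, depth, point count) and labels are illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy training set: each row = [cluster width (m), cluster depth (m), number of points]
X_train = np.array([
    [1.8, 4.5, 60],   # car-like cluster
    [1.7, 4.2, 55],
    [0.7, 1.9, 18],   # motorcycle-like cluster
    [0.8, 2.0, 20],
    [0.3, 0.3, 5],    # background clutter
    [0.4, 0.2, 4],
])
y_train = ["car", "car", "motorcycle", "motorcycle", "background", "background"]

# One hidden layer; in practice the architecture and features would be tuned.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

new_cluster = np.array([[1.9, 4.4, 58]])
print(clf.predict(new_cluster))          # e.g., ['car']
print(clf.predict_proba(new_cluster))    # class probabilities
```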
To overcome the occlusion and data loss of the laser range finder, an intersection camera is added to improve recognition and to mark occluded objects. Object classes are recognized with background-subtraction preprocessing and template matching, and object positions are estimated through inverse perspective mapping with a surface-fitting correction to support the laser range finder's recognition.
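A minimal sketch of this camera-side pipeline, assuming OpenCV, an intersection video file, and a pre-cropped vehicle template image; it only illustrates the two named operations, not the thesis's exact preprocessing or thresholds.

```python
# Hypothetical sketch: background subtraction followed by template matching (OpenCV).
import cv2

cap = cv2.VideoCapture("intersection.avi")                        # assumed video file
template = cv2.imread("car_template.png", cv2.IMREAD_GRAYSCALE)   # assumed template crop
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fg_mask = subtractor.apply(gray)                  # foreground = moving objects

    # Match the template only where motion was detected.
    masked = cv2.bitwise_and(gray, gray, mask=fg_mask)
    scores = cv2.matchTemplate(masked, template, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    if max_score > 0.6:                               # illustrative threshold
        h, w = template.shape
        cv2.rectangle(frame, max_loc, (max_loc[0] + w, max_loc[1] + h), (0, 255, 0), 2)

    cv2.imshow("detections", frame)
    if cv2.waitKey(30) & 0xFF == ord("q"):
        break
cap.release()
```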
Finally, the laser range finder and the camera are fused with Bayes' theorem and combined with smartphones to build an intelligent intersection monitoring and safety warning system, so that future intersections can suffer less congestion and fewer traffic accidents, giving drivers a safer driving environment.
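For the fusion idea, here is a tiny numeric sketch of Bayes' rule combining the two sensors' class likelihoods under an assumed conditional-independence model; the prior and likelihood values are made up for illustration and are not taken from the thesis.

```python
# Hypothetical sketch: fuse laser and camera classifiers with Bayes' rule,
# assuming the two observations are conditionally independent given the class.
classes = ["car", "motorcycle", "background"]
prior = {"car": 0.5, "motorcycle": 0.3, "background": 0.2}          # assumed prior
p_laser_obs = {"car": 0.7, "motorcycle": 0.2, "background": 0.1}    # P(laser obs | class)
p_camera_obs = {"car": 0.6, "motorcycle": 0.3, "background": 0.1}   # P(camera obs | class)

unnormalized = {c: prior[c] * p_laser_obs[c] * p_camera_obs[c] for c in classes}
evidence = sum(unnormalized.values())
posterior = {c: unnormalized[c] / evidence for c in classes}

print(posterior)                          # fused class probabilities
print(max(posterior, key=posterior.get))  # fused decision, here 'car'
```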
Abstract
With the trend toward urbanization, urban traffic has become more congested and dangerous. To cope with this situation, the main purpose of this research is to develop a safety monitoring system for intelligent intersections that sends real-time traffic information to drivers, controls traffic signals, and issues warning messages, thereby addressing heavy traffic congestion and safety problems.
Motivated by these congestion problems, this research develops a safety monitoring system for intelligent intersections based on a laser range finder and a camera. The laser range finder collects real-time intersection data accurately; from these data the proposed system classifies objects into different classes (e.g., cars and motorcycles) and derives physical information such as vehicle velocity, time-to-collision, and traffic flow. The information from the sensors is then transmitted to smartphones and remote control sites over a wireless network to keep drivers updated on the real-time traffic status.
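As one common way to obtain such quantities (shown here only as a sketch, not necessarily the thesis's exact formulation), time-to-collision can be taken as the current gap divided by the closing speed estimated from two consecutive laser scans of the same tracked object; the numbers below are illustrative.

```python
# Hypothetical sketch: closing speed and time-to-collision from two consecutive
# laser range readings of the same tracked object. Values are illustrative.
def time_to_collision(range_prev_m, range_curr_m, scan_period_s):
    """Return (closing_speed_mps, ttc_s); ttc is None if the object is not approaching."""
    closing_speed = (range_prev_m - range_curr_m) / scan_period_s
    if closing_speed <= 0.0:
        return closing_speed, None        # object is receding or stationary
    return closing_speed, range_curr_m / closing_speed

# Example: the object was 25.0 m away, is now 24.2 m away, scans arrive every 0.1 s.
speed, ttc = time_to_collision(25.0, 24.2, 0.1)
print(f"closing speed = {speed:.1f} m/s, time-to-collision = {ttc:.1f} s")
# closing speed = 8.0 m/s, time-to-collision = 3.0 s (24.2 / 8.0)
```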
Because the laser range finder suffers from data loss and occlusion, the camera is used to assist it in classifying objects. The system also compresses the camera images with vector quantization and transmits them, together with the real-time status, to remote control sites over the wireless network and 3G mobile network for drivers and the control center.
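A minimal sketch of block-based vector quantization, assuming a grayscale frame and a codebook learned with k-means; the codebook size, block size, and use of scikit-learn are assumptions for the example rather than the thesis's codec.

```python
# Hypothetical sketch: vector quantization of a grayscale image by 4x4 blocks.
import numpy as np
from sklearn.cluster import KMeans

def vq_compress(image, block=4, codebook_size=64):
    """Return (codebook, indices, cropped_shape); each block becomes a codeword index."""
    h, w = image.shape
    ph, pw = (h // block) * block, (w // block) * block      # crop to whole blocks
    blocks = (image[:ph, :pw]
              .reshape(ph // block, block, pw // block, block)
              .swapaxes(1, 2)
              .reshape(-1, block * block)
              .astype(np.float32))
    km = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(blocks)
    return km.cluster_centers_, km.labels_, (ph, pw)

def vq_decompress(codebook, indices, shape, block=4):
    ph, pw = shape
    blocks = codebook[indices].reshape(ph // block, pw // block, block, block)
    return blocks.swapaxes(1, 2).reshape(ph, pw).astype(np.uint8)

# Example on a random "frame": only the index map and codebook need to be transmitted.
frame = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)
codebook, indices, shape = vq_compress(frame)
reconstructed = vq_decompress(codebook, indices, shape)
print(codebook.shape, indices.shape, reconstructed.shape)  # (64, 16) (4800,) (240, 320)
```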
For the algorithms, an artificial neural network (ANN) classifies the data from the laser range finder. To compensate for the laser range finder's data loss and occlusion, background-subtraction preprocessing and template matching classify objects in the camera images. Through inverse perspective mapping and a surface-fitting calibration, object positions are estimated from the images to assist the laser range finder.
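To illustrate the inverse-perspective step alone (a sketch under the usual flat-road assumption; the calibration points below are made up and the surface-fitting refinement is omitted), four known ground points can define a homography that maps a pixel to road-plane coordinates.

```python
# Hypothetical sketch: map an image pixel to road-plane coordinates with a
# homography (inverse perspective mapping). Calibration points are illustrative.
import numpy as np
import cv2

# Four pixel positions in the camera image and the matching points on the road
# plane, in metres from the camera's ground projection (assumed values).
image_pts = np.float32([[420, 600], [860, 600], [760, 360], [520, 360]])
ground_pts = np.float32([[-2.0, 5.0], [2.0, 5.0], [2.0, 20.0], [-2.0, 20.0]])

H = cv2.getPerspectiveTransform(image_pts, ground_pts)

# Project the foot point of a detected object from pixels to road coordinates.
pixel = np.float32([[[640, 480]]])                   # shape (1, 1, 2) as OpenCV expects
road_xy = cv2.perspectiveTransform(pixel, H)[0, 0]
print(f"object at x = {road_xy[0]:.2f} m, y = {road_xy[1]:.2f} m on the road plane")
```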
By integrating the laser range finder and the camera through Bayes' theorem and combining them with smartphones, the safety monitoring system for intelligent intersections can reduce traffic congestion and accidents and provide drivers with a safer driving environment.
目次 Table of Contents
Thesis Approval Form i
Acknowledgements ii
Chinese Abstract iii
Abstract v
Table of Contents vii
List of Figures ix
List of Tables xii
Chapter 1 Introduction 1
1-1 Research Motivation 1
1-2 Literature Review 3
1-3 Main Contributions 8
1-4 Thesis Organization 8
Chapter 2 System Overview 9
2-1 Safety Monitoring System for Intelligent Intersections 9
2-2 Three Main System Functions 10
2-3 System Architecture and Flow 11
Chapter 3 System Implementation 13
3-1 Laser Range Finder System 13
3-1-1 Laser Range Finder 13
3-1-2 Preprocessing 14
3-1-3 Artificial Neural Network Classification 19
3-2 Image Recognition System 23
3-2-1 Camera 23
3-2-2 Preprocessing 23
3-2-3 Template Matching Classification 25
3-3 Integration of the Laser Range Finder and Images 30
3-3-1 Error Surface-Fitting Correction 30
3-3-2 Clustering of Laser Range Finder and Image Data 30
3-3-3 Bayesian Classification 31
3-4 Collision Warning and Remote Monitoring System 33
3-4-1 Intersection Collision Warning System 33
3-4-2 Vector Quantization 34
3-4-3 Remote Monitoring System 36
Chapter 4 Experimental Results 40
4-1 Experimental Scenarios 40
4-2 Laser Range Finder System 41
4-3 Image Recognition System 45
4-4 Integration of the Laser Range Finder and Images 49
4-5 Collision Warning and Remote Monitoring System 55
Chapter 5 Conclusions and Future Work 59
5-1 Conclusions 59
5-2 Future Work 59
References 61
電子全文 Fulltext
This electronic full text is licensed to users only for personal, non-profit searching, reading, and printing for academic research. Please comply with the Copyright Act of the Republic of China and do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
Thesis access permission: user-defined release date
Available:
Campus: available
Off-campus: available


紙本論文 Printed copies
Release information for printed copies is relatively complete from ROC academic year 102 (2013-14) onward. To inquire about printed copies from ROC academic year 101 (2012-13) or earlier, please contact the printed-thesis service desk of the Library and Information Services Division. We apologize for any inconvenience.
Available: available
