Master's/Doctoral Thesis etd-0727101-134305 Details
Title page for etd-0727101-134305
Title
以線性迴歸的技巧加強RBF類神經網路的引申能力
Improving the Generalization Capability of the RBF Neural Networks via the Use of Linear Regression Techniques
Department
Year, semester
Language
Degree
Number of pages
63
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2001-06-29
Date of Submission
2001-07-27
Keywords
Function Approximation, Generalization Capability, Radial Basis Function, Neural Networks
Statistics
The thesis/dissertation has been browsed 5700 times and downloaded 4399 times.
Chinese Abstract
Neural networks can be viewed as tools that learn. This thesis uses linear regression techniques to strengthen the generalization capability of RBF neural networks, so that what the network learns has real application value, avoiding the situation in which training performance is good but performance in application is poor.
The thesis studies training methods for RBF neural networks. It retains the framework of the orthogonal least squares (OLS) learning rule proposed by Chen and Billings (1992), proposes improved learning rules for the first and second stages based on the characteristics of the RBF network, and adopts early stopping as the criterion for ending training.
Furthermore, the thesis applies several linear regression techniques from statistics to strengthen the generalization capability of the RBF network, and runs computer simulations of each method under different noise conditions.

Abstract
Neural networks can be regarded as instruments that are able to learn. To make the results of neural network learning practically useful, this thesis uses linear regression techniques to strengthen the generalization capability of RBF neural networks.

The thesis studies the training methods of RBF neural networks and retains the framework of the orthogonal least squares (OLS) learning rule published by Chen and Billings in 1992. In addition, based on the characteristics of the RBF network, the thesis proposes improved learning rules for the first and second phases and uses early stopping as the condition for terminating training.

In summary, the thesis mainly applies several linear regression techniques from statistics to strengthen the generalization capability of the RBF network, and carries out computer simulations of the different methods under different noise conditions.
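
To make the approach described above concrete, the following is a minimal sketch, not the thesis's actual implementation: an RBF network with Gaussian hidden units, output weights fitted by ordinary linear least squares, and the number of hidden neurons chosen by early stopping on a held-out validation set. The helper names (gaussian_design, fit_rbf), the evenly spaced candidate centers, the fixed Gaussian width, and the toy target function are all illustrative assumptions.

# Minimal sketch (assumed details, not the thesis's code): an RBF network with
# Gaussian hidden units, output weights fitted by linear least squares, and the
# number of hidden neurons chosen by early stopping on a validation set.
import numpy as np

def gaussian_design(x, centers, width):
    # Design matrix whose columns are Gaussian basis functions of the 1-D input.
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

def fit_rbf(x_train, y_train, x_val, y_val, max_neurons=30, width=0.5):
    # Grow the hidden layer one candidate center at a time; stop as soon as the
    # validation error stops improving (a simple stand-in for the early-stop rule).
    candidates = np.linspace(x_train.min(), x_train.max(), max_neurons)
    best_err, best_centers, best_weights = np.inf, None, None
    for m in range(1, max_neurons + 1):
        c = candidates[:m]
        Phi = gaussian_design(x_train, c, width)
        w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)  # linear-regression step
        val_err = np.mean((gaussian_design(x_val, c, width) @ w - y_val) ** 2)
        if val_err < best_err:
            best_err, best_centers, best_weights = val_err, c, w
        else:
            break  # early stop: validation error no longer decreases
    return best_centers, best_weights

# Toy usage: noisy samples of a smooth target function (illustrative only).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3.0, 3.0, 200))
y = np.sin(x) + 0.1 * rng.normal(size=x.size)
centers, weights = fit_rbf(x[::2], y[::2], x[1::2], y[1::2])
y_hat = gaussian_design(x, centers, 0.5) @ weights  # same width as fit_rbf default
print("MSE against the noiseless target:", np.mean((y_hat - np.sin(x)) ** 2))

The thesis's actual procedure selects centers with the orthogonal least squares criterion of Chen and Billings (1992) and, in Chapter 4, replaces or augments the plain least-squares fit with techniques such as backward elimination, influential-observation diagnostics, and robust regression; the evenly spaced centers and the plain lstsq call above merely stand in for those steps.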
Table of Contents

Chapter 1 Introduction 1
1.1 Preface 1
1.2 Research Motivation 2
1.3 Thesis Organization 3
Chapter 2 RBF Network Architecture and the Orthogonal Least Squares Rule 4
2.1 Network Architecture 4
2.2 Orthogonal Least Squares Rule 5
Chapter 3 Improved Learning Rules for RBF Neural Networks 11
3.1 QR Algorithm 12
3.2 Parameters of the Hidden-Layer Neurons 18
3.3 Early Stop 21
Chapter 4 Strengthening the Generalization Capability of RBF Neural Networks 25
4.1 Adjusting the Widths of the Neurons' Gaussian Functions 25
4.2 Applications of Statistics 26
4.3 Linear Regression 27
4.3.1 Backward Elimination 28
4.3.2 Influential Observation 30
4.3.3 Robust Regression 31
4.4 Methods for Strengthening the Generalization Capability of the RBF Network 32
4.4.1 Gaussian Noise 38
4.4.2 Uniform Noise 43
4.4.3 Impulse Noise 47
Chapter 5 Conclusions 54
References 56
Appendix A QR Algorithm 58
Appendix B Experimental Data 59

References
[1] Chen, S., and Billings, S. A., 1992, "Neural Networks for Nonlinear Dynamic System Modeling and Identification," International Journal of Control, Vol. 56, No. 2, pp. 319-346.

[2] Broomhead, D. S., and Lowe, D., 1988, "Multivariable Functional Interpolation and Adaptive Networks," Complex Systems, Vol. 2, pp. 321-355.

[3] Moody, J., and Darken, C., 1989, "Fast Learning in Networks of Locally-Tuned Processing Units," Neural Computation, Vol. 1, pp. 281-294.

[4] Parlett, B. N., 2000, "The QR Algorithm," Computing in Science & Engineering, Vol. 2, pp. 38-42.

[5] Hager, W. W., 1988, "Applied Numerical Linear Algebra," Prentice Hall International, pp. 192-225.

[6] Hansen, L. K., Larsen, J., and Fog, T., 1997, "Early Stop Criterion from the Bootstrap Ensemble," Proceedings of the 1997 IEEE International Conference on Acoustics, Speech, and Signal Processing, Vol. 4, pp. 3205-3208.

[7] Haykin, S., 1999, "Neural Networks: A Comprehensive Foundation," Prentice Hall International, pp. 215-216.

[8] Zill, D. G., and Cullen, M. R., 1992, "Advanced Engineering Mathematics," PWS Publishing Company, pp. 415-417.

[9] Easwaran, S., and Gowdy, J. N., 1992, "An Improved Initialization Algorithm for Use with the K-means Algorithm for Code Book Generation," Proceedings of IEEE Southeastcon, Vol. 2, pp. 471-474.

[10] Cover, T. M., and Hart, P. E., 1967, "Nearest Neighbor Pattern Classification," IEEE Transactions on Information Theory, Vol. IT-13, pp. 21-27.

[11] Haykin, S., 1999, "Neural Networks: A Comprehensive Foundation," Prentice Hall International, pp. 213-214.

[12] Principe, J. C., Euliano, N. R., and Lefebvre, W. C., 2000, "Neural and Adaptive Systems," John Wiley & Sons, Inc., pp. 8-9.

[13] Berenson, M. L., and Levine, D. M., 1999, "Basic Business Statistics: Concepts and Applications," 7th Edition, Prentice Hall, pp. 811-843.

[14] Montgomery, D. C., and Runger, G. C., 1994, "Applied Statistics and Probability for Engineers," John Wiley & Sons, Inc., pp. 192-225.

[15] Cherkassky, V., 1996, "Comparison of Adaptive Methods for Function Estimation from Samples," IEEE Transactions on Neural Networks, Vol. 7, pp. 969-984.

[16] Gomm, J. B., and Yu, D. L., 2000, "Selecting Radial Basis Function Network Centers with Recursive Orthogonal Least Squares Training," IEEE Transactions on Neural Networks, Vol. 11, pp. 306-314.

[17] Rousseeuw, P. J., and Leroy, A. M., 1987, "Robust Regression and Outlier Detection," John Wiley & Sons, pp. 75-84.

[18] Efron, B., and Tibshirani, R. J., 1993, "An Introduction to the Bootstrap," Chapman & Hall, Inc., pp. 105-123.

[19] 劉聰仁, 1996, "Improvement of Recurrent and Classification Neural Networks and Their Applications to Nonlinear Model Control Problems" (in Chinese), Ph.D. dissertation, Institute of Mechanical Engineering, National Sun Yat-sen University.

[20] Hagan, M. T., and Demuth, H. B., 1999, "Neural Networks for Control," Proceedings of the 1999 American Control Conference, Vol. 3, pp. 1642-1656.

Fulltext
The electronic full text is licensed to users solely for personal, non-profit retrieval, reading, and printing for the purpose of academic research. Please comply with the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the text without authorization.
Thesis access permission: unrestricted (fully open on and off campus)
Available:
Campus: available
Off-campus: available


Printed copies
Public access information for printed theses is relatively complete for academic year 102 (ROC calendar) and later. To inquire about the availability of printed theses from academic year 101 or earlier, please contact the printed thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
Available: available
