Master's/Doctoral Thesis etd-0623108-160018: Detailed Record
Title page for etd-0623108-160018
Title
最小截尾平方類神經網路之研究
Study on Least Trimmed Squares Artificial Neural Networks
Department
Year, semester
Language
Degree
Number of pages
54
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2008-06-13
Date of Submission
2008-06-23
Keywords
Artificial Neural Networks, Least Trimmed Squares
Statistics
This thesis/dissertation has been viewed 5747 times and downloaded 644 times.
Abstract
In this thesis, we study least trimmed squares artificial neural networks (LTS-ANNs), which generalize the least trimmed squares (LTS) estimators, frequently used in robust linear parametric regression problems, to nonparametric artificial neural networks (ANNs) for nonlinear regression problems.
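For concreteness, the LTS criterion underlying the LTS-ANN objective can be written as follows; the notation here is illustrative and not necessarily the thesis's own:

% LTS criterion: keep only the h smallest squared residuals.
% f(x; w) is the learning machine, n the sample size, and the
% trimming constant fixes h <= n (symbols chosen for illustration).
\begin{align*}
  r_i(w) &= y_i - f(x_i; w), \qquad i = 1, \dots, n,\\
  r_{(1)}^2(w) &\le r_{(2)}^2(w) \le \dots \le r_{(n)}^2(w),\\
  \hat{w}_{\mathrm{LTS}} &= \arg\min_w \sum_{i=1}^{h} r_{(i)}^2(w).
\end{align*}

Because only the h best-fitting samples enter the sum, gross outliers with large residuals are simply excluded from the objective rather than dominating it, which is the source of the robustness studied here.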
Two training algorithms are proposed in this thesis. The first is an incremental gradient descent algorithm. To speed up convergence, the second is based on recursive least squares (RLS).
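As a rough illustration of the first algorithm only, the sketch below is not the thesis code: train_lts_ann, the single-hidden-layer sigmoid architecture, and all hyperparameters are assumptions made here. Each epoch it ranks the samples by their current squared residuals, keeps the h smallest, and performs per-sample (incremental) gradient updates on that trimmed subset. The RLS-based variant would replace these gradient steps with recursive least squares updates and is not shown.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_lts_ann(x, y, n_hidden=10, h=None, lr=0.05, epochs=500, seed=0):
    # x, y: 1-D sample arrays; h: number of retained samples (set by the
    # trimming constant). Returns a prediction function.
    rng = np.random.default_rng(seed)
    n = x.size
    h = n if h is None else h                  # h = n means no trimming (classical fit)
    W = rng.normal(scale=0.5, size=n_hidden)   # input-to-hidden weights
    b = np.zeros(n_hidden)                     # hidden biases
    v = rng.normal(scale=0.5, size=n_hidden)   # hidden-to-output weights
    c = 0.0                                    # output bias

    def predict(xs):
        return sigmoid(np.outer(xs, W) + b) @ v + c

    for _ in range(epochs):
        # Trim: rank all samples by current squared residual, keep the h smallest.
        keep = np.argsort((y - predict(x)) ** 2)[:h]
        for i in rng.permutation(keep):        # incremental (per-sample) updates
            a = sigmoid(W * x[i] + b)          # hidden activations
            e = a @ v + c - y[i]               # signed output error
            g = e * v * a * (1.0 - a)          # error propagated through sigmoids
            v -= lr * e * a
            c -= lr * e
            W -= lr * g * x[i]
            b -= lr * g
    return predict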
Three illustrative examples are provided to test the robustness against outliers of the classical ANNs and the LTS-ANNs. Simulation results show that, with proper selection of the trimming constant of the learning machines, LTS-ANNs are markedly more robust against outliers than the classical ANNs.
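A toy experiment in the spirit of these robustness tests (hypothetical data, not one of the thesis's three examples) contaminates a noisy sine wave with gross outliers and compares a trimmed fit against an untrimmed one, reusing the train_lts_ann sketch above:

rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 100)
y = np.sin(x) + 0.05 * rng.normal(size=x.size)
y[rng.choice(x.size, 10, replace=False)] += 5.0   # inject 10% gross outliers

f_lts = train_lts_ann(x, y, h=85)       # trim 15% of the samples
f_cls = train_lts_ann(x, y, h=x.size)   # no trimming: classical least-squares fit
print(np.mean((f_lts(x) - np.sin(x)) ** 2))  # typically small: outliers trimmed away
print(np.mean((f_cls(x) - np.sin(x)) ** 2))  # typically larger: fit pulled by outliers

Choosing h (equivalently, the trimming constant) trades efficiency on clean data against robustness: trimming too little lets outliers back into the objective, while trimming too much discards good samples.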
Table of Contents
Acknowledgements i
Abstract (in Chinese) ii
Abstract iii
List of Figures and Tables iv
Glossary of Symbols vi
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Brief Sketch of the Contents 5
Chapter 2 Artificial Neural Networks 6
2.1 Artificial Neural Network Modeling 6
2.2 Training by Incremental Gradient Descent Algorithm 10
2.3 Training by Recursive Least Squares Algorithm 12
Chapter 3 Least Trimmed Squares Artificial Neural Networks 15
3.1 Least Trimmed Squares 15
3.2 Training by Incremental Gradient Descent Algorithm 19
3.3 Training by Recursive Least Squares Algorithm 22
Chapter 4 Illustrative Examples 24
Chapter 5 Conclusion 38
Appendix Recursive Least Squares Algorithm for Least Squares Problems 39
References 43
Fulltext
This electronic fulltext is licensed for personal, non-profit searching, reading, and printing for academic research purposes only. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
Thesis access permission: user-defined availability period
Available:
Campus: not available (permanently restricted)
Off-campus: not available (permanently restricted)


Printed copies
Availability information for printed copies is relatively complete from academic year 102 (ROC calendar, 2013-14) onward. To inquire about the availability of printed copies from academic year 101 or earlier, please contact the printed-thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
Available: already public
