Title page for etd-1119110-215738
Title
Research on Robust Fuzzy Neural Networks
Department
Year, semester
Language
Degree
Number of pages
104
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2010-11-12
Date of Submission
2010-11-19
Keywords
backfitting procedure, semiparametric Wilcoxon fuzzy neural networks, semiparametric fuzzy neural networks, least trimmed squares fuzzy neural networks, iteratively reweighted least squares, maximum likelihood fuzzy neural networks, gradient descent
Statistics
The thesis/dissertation has been browsed 5738 times and downloaded 1406 times.
Abstract
In many practical applications, it is well known that collected data inevitably contain one or more anomalous outliers, that is, observations that are well separated from the majority or bulk of the data, or that in some fashion deviate from its general pattern. Outliers may be due to misplaced decimal points, recording errors, transmission errors, or equipment failure. They can lead to erroneous parameter estimates and consequently affect the correctness and accuracy of model inference. To address these problems, three robust fuzzy neural networks (FNNs) will be proposed in this dissertation. They provide alternative learning machines for general nonlinear learning problems, with particular emphasis on robustness against outliers. Although only FNNs are considered in this study, the extension of our approach to other neural networks, such as artificial neural networks and radial basis function networks, is straightforward.
In the first part of the dissertation, M-estimators (where M stands for maximum likelihood), frequently used in robust regression for linear parametric problems, will be generalized to nonparametric maximum likelihood fuzzy neural networks (MFNNs) for nonlinear regression problems. Simple weight updating rules based on gradient descent and iteratively reweighted least squares (IRLS) will be derived.
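The IRLS idea behind this part can be illustrated in the linear setting the dissertation generalizes from. The sketch below is not the dissertation's MFNN algorithm; it is a minimal linear M-estimation loop with Huber weights, written in NumPy, where the tuning constant c = 1.345 and the MAD-based residual scale are standard textbook choices, not quantities taken from this work.

```python
import numpy as np

def huber_weights(r, c=1.345):
    # Huber weight function: w = 1 for |r| <= c, w = c/|r| beyond c.
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def irls_m_estimate(X, y, c=1.345, n_iter=50, tol=1e-8):
    # Iteratively reweighted least squares for a linear Huber M-estimator.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        if s == 0:
            s = 1.0
        w = huber_weights(r / s, c)
        sw = np.sqrt(w)  # weighted LS via sqrt-weight rescaling
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Because the weights shrink like c/|r| for large residuals, gross outliers contribute little to each weighted least squares refit, which is the robustness property the dissertation carries over to the FNN setting.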
In the second part of the dissertation, least trimmed squares estimators (LTS-estimators), frequently used in robust (or resistant) regression for linear parametric problems, will be generalized to nonparametric least trimmed squares fuzzy neural networks (LTS-FNNs) for nonlinear regression problems. Again, simple weight updating rules based on gradient descent and IRLS algorithms will be provided.
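The linear estimator being generalized here minimizes the sum of the h smallest squared residuals rather than all of them. A minimal sketch (not the dissertation's LTS-FNN learning rule) is the random-start concentration-step scheme in the spirit of FAST-LTS; the number of starts and concentration steps below are illustrative assumptions.

```python
import numpy as np

def lts_fit(X, y, h=None, n_starts=20, n_csteps=10, rng=None):
    # Least trimmed squares via random elemental starts plus concentration
    # steps: repeatedly refit least squares on the h observations with the
    # smallest squared residuals (each C-step cannot increase the objective).
    rng = np.random.default_rng(rng)
    n, p = X.shape
    if h is None:
        h = (n + p + 1) // 2  # default trimming for maximal breakdown
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)  # p-point elemental start
        beta = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        for _ in range(n_csteps):
            keep = np.argsort((y - X @ beta) ** 2)[:h]
            beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta
```

Since only the h best-fitting observations enter the objective, up to n - h arbitrary outliers are simply trimmed away, which is what gives LTS its high breakdown point.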
In the last part of the dissertation, by combining the easy interpretability of parametric models with the flexibility of nonparametric models, semiparametric fuzzy neural networks (semiparametric FNNs) and semiparametric Wilcoxon fuzzy neural networks (semiparametric WFNNs) will be proposed. The corresponding learning rules are based on the backfitting procedure, which is frequently used in semiparametric regression.
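Backfitting for a partially linear model y = Xβ + g(t) + noise alternates between a parametric fit on y − ĝ and a nonparametric smooth of y − Xβ̂. In the sketch below, a Nadaraya–Watson kernel smoother stands in for the dissertation's fuzzy-neural-network component, and the bandwidth and iteration count are illustrative assumptions rather than values from this work.

```python
import numpy as np

def nw_smooth(t, r, bandwidth=0.1):
    # Nadaraya-Watson smoother with a Gaussian kernel, evaluated at the
    # sample points themselves (a stand-in for the FNN component).
    d = (t[:, None] - t[None, :]) / bandwidth
    K = np.exp(-0.5 * d ** 2)
    return (K @ r) / K.sum(axis=1)

def backfit_partially_linear(X, t, y, n_iter=30):
    # Backfitting for y = X beta + g(t) + noise: alternate a linear least
    # squares fit on (y - g_hat) and a smooth of (y - X beta_hat).
    beta = np.zeros(X.shape[1])
    g = np.zeros_like(y)
    for _ in range(n_iter):
        beta = np.linalg.lstsq(X, y - g, rcond=None)[0]
        g = nw_smooth(t, y - X @ beta)
    return beta, g
```

Each half-step holds one component fixed and refits the other on the partial residuals, so the parametric part keeps its interpretable coefficients while the smooth absorbs the nonlinear effect of t.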
Table of Contents
Acknowledgments i
Chinese Abstract ii
ABSTRACT iii
LIST OF FIGURES v
LIST OF TABLES vii
GLOSSARY OF SYMBOLS ix
CHAPTER 1 Background and Research Motive 1
1.1 Motivation 1
1.2 Fuzzy Systems 3
1.3 Fuzzy Neural Networks 15
CHAPTER 2 Maximum Likelihood Fuzzy Neural Networks 19
2.1 Introduction 19
2.2 Linear M-Estimators 21
2.3 M-Fuzzy Neural Networks 27
2.4 Illustrative Examples 31
CHAPTER 3 Least Trimmed Squares Fuzzy Neural Networks 41
3.1 Introduction 41
3.2 Linear Least Trimmed Squares Estimators 42
3.3 LTS-Fuzzy Neural Networks 45
3.4 Illustrative Examples 48
CHAPTER 4 Semiparametric Wilcoxon Fuzzy Neural Networks 54
4.1 Introduction 54
4.2 Linear Wilcoxon Regressors 54
4.3 Wilcoxon Fuzzy Neural Networks 57
4.4 Semiparametric Wilcoxon Fuzzy Neural Networks 61
4.5 Illustrative Examples 66
CHAPTER 5 Conclusion and Discussion 79
APPENDIX Linear Weighted Least Squares Regression 83
REFERENCES 89
References
[Bos.1] B. E. Boser, I. M. Guyon, and V. N. Vapnik, “A training algorithm for optimal margin classifiers,” Proceeding of the Fifth Annual Workshop on Computational Learning Theory, Pittsburgh, Pennsylvania, pp. 144-152, 1992.
[Bre.1] L. Breiman, J. H. Friedman, “Estimating optimal transformations for multiple regression and correlations,” Journal of the American Statistical Association, vol. 80, no. 391, pp. 580-598, 1985.
[Buja.1] A. Buja, T. J. Hastie, R. J. Tibshirani, “Linear smoothers and additive models,” Annals of Statistics, vol. 17, no. 2, pp. 453-510, 1989.
[Chen.1] D. S. Chen and R. C. Jain, “A robust backpropagation learning algorithm for function approximation,” IEEE Transactions on Neural Networks, vol. 5, no. 3, pp. 467-479, 1994.
[Chu.1] C. C. Chuang, S. F. Su, and C. C. Hsiao, “The annealing robust backpropagation (ARBP) learning algorithm,” IEEE Transactions on Neural Networks, vol. 11, no. 5, pp. 1067-1078, 2000.
[Chu.2] C. C. Chuang, S. F. Su, and S. S. Chen, “Robust TSK fuzzy modeling for function approximation with outliers,” IEEE Transactions on Fuzzy Systems, vol. 9, no. 6, pp. 810-821, 2001.
[Chu.3] C. C. Chuang, J. T. Jeng, and P. T. Lin, “Annealing robust radial basis function networks for function approximation with outliers,” Neurocomputing, vol. 56, pp. 123-139, 2004.
[Cor.1] C. Cortes and V. N. Vapnik, “Support vector networks,” Machine Learning, vol. 20, pp. 273-297, 1995.
[Far.1] J. J. Faraway, Linear Models with R, Chapman & Hall/CRC, Florida, 2005.
[Ham.1] F. Hampel, Contributions to the Theory of Robust Estimation, Ph.D. Thesis, University of California, Berkeley, 1968.
[Ham.2] F. R. Hampel, E. M. Ronchetti, P. J. Rousseeuw, and W. A. Stahel, Robust Statistics: The Approach Based on Influence Functions, Wiley, New York, 2005.
[Har.1] E. J. Hartman, J. D. Keeler, and J. M. Kowalski, “Layered neural networks with Gaussian hidden units as universal approximations,” Neural Computation, vol. 2, no. 2, pp. 210-215, 1990.
[Har.2] W. Härdle, M. Müller, S. Sperlich, and A. Werwatz, Nonparametric and Semiparametric Models, Springer, Berlin, Germany, 2004.
[Hogg.1] R. V. Hogg, J. W. McKean, and A. T. Craig, Introduction to Mathematical Statistics, 6th ed., Prentice-Hall, Englewood Cliffs, New Jersey, 2005.
[Hor.1] K. Hornik, M. Stinchcombe, and H. White, “Multilayer feedforward networks are universal approximators,” Neural Networks, vol. 2, no. 5, pp. 359-366, 1989.
[Hsi.1] J. G. Hsieh, Y. L. Lin, and J. H. Jeng, “Preliminary study on Wilcoxon learning machines,” IEEE Transactions on Neural Networks, vol. 19, no. 2, pp. 201-211, 2008.
[Hua.1] L. Huang, B. L. Zhang, and Q. Huang, “Robust interval regression analysis using neural networks,” Fuzzy Sets and Systems, vol. 97, no. 3, pp. 337-347, 1998.
[Hub.1] P. J. Huber, “Robust estimation of a location parameter,” Annals of Mathematical Statistics, vol. 35, no. 1, pp. 73-101, 1964.
[Hub.2] P. J. Huber and E. M. Ronchetti, Robust Statistics, 2nd ed., Wiley, Hoboken, New Jersey, 2009.
[Kec.1] V. Kecman, Learning and Soft Computing, MIT Press, Cambridge, Massachusetts, 2001.
[Kos.1] B. Kosko, “Fuzzy systems as universal approximators,” IEEE Transactions on Computers, vol. 43, no. 11, pp. 1329-1333, 1994.
[Kut.1] M. H. Kutner, C. J. Nachtsheim, J. Neter, and W. Li, Applied Linear Statistical Models, 5th ed., McGraw-Hill, New York, 2005.
[Lee.1] C. C. Lee, P. C. Chung, J. R. Tsai, and C. I. Chang, “Robust radial basis function neural networks,” IEEE Transactions on Systems, Man, and Cybernetics-Part B, vol. 29, no. 6, pp. 674-685, 1999.
[Les.1] J. M. Leski, “TSK-fuzzy modeling based on ε-insensitive learning,” IEEE Transactions on Fuzzy Systems, vol. 13, no. 2, pp. 181-193, 2005.
[Mai.1] J. Maindonald and J. Braun, Data Analysis and Graphics Using R, 2nd ed., Cambridge University Press, New York, 2007.
[Mar.1] R. A. Maronna, R. D. Martin, and V. J. Yohai, Robust Statistics: Theory and Methods, Wiley, Chichester, England, 2006.
[Mon.1] D. C. Montgomery, E. A. Peck, and G. G. Vining, Introduction to Linear Regression Analysis, 4th ed., Wiley, Hoboken, New Jersey, 2006.
[Park.1] J. Park and I. W. Sandberg, “Universal approximation using radial basis function networks,” Neural Computation, vol. 3, no. 2, pp. 246-257, 1991.
[Rou.1] P. J. Rousseeuw, “Multivariate estimation with high breakdown point,” Mathematical Statistics and Applications, W. Grossmann, G. Pflug, I. Vincze, and W. Wertz, Eds., Dordrecht, Reidel, vol. B, pp. 283-297, 1985.
[Rou.2] P. J. Rousseeuw and A. M. Leroy, Robust Regression and Outlier Detection, Wiley, New York, 2003.
[Rum.1] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning internal representations by error propagation,” Parallel Distributed Processing: Explorations in the Microstructure of Cognition, D. E. Rumelhart, J. L. McClelland, and the PDP Research Group, Eds., MIT Press, Cambridge, Massachusetts, vol. 1, Foundations, pp. 318-362, 1986.
[Rus.1] A. Rusiecki, “Robust LTS backpropagation learning algorithm,” Lecture Notes in Computer Science, vol. 4507, pp. 102-109, 2007.
[Rup.1] D. Ruppert, M. P. Wand, and R. J. Carroll, Semiparametric Regression, Cambridge University Press, New York, 2003.
[Ryan.1] T. P. Ryan, Modern Regression Methods, 2nd ed., Wiley, New York, 2009.
[Shi.1] H. L. Shieh, Y. K. Yang, P. L. Chang, and J. T. Jeng, “Robust neural-fuzzy method for function approximation,” Expert Systems with Applications, vol. 36, pp. 6903-6913, 2009.
[Tab.1] M. Tableman, “The influence functions for the least trimmed squares and the least trimmed absolute deviations estimators,” Statistics & Probability Letters, vol. 19, pp. 329-337, 1994.
[Tsai.1] H. H. Tsai and P. T. Yu, “On the optimal design of fuzzy neural networks with robust learning for function approximation,” IEEE Transactions on Systems, Man, and Cybernetics-Part B, vol. 30, no. 1, pp. 217-223, 2000.
[Tuk.1] J. W. Tukey, “A survey of sampling from contaminated distributions,” Contributions to Probability and Statistics, I. Olkin, S. G. Ghurye, W. Hoeffding, W. G. Madow, and H. B. Mann, Eds., Stanford University Press, Stanford, California, pp. 448-485, 1960.
[Wang.1] L. X. Wang and J. M. Mendel, “Fuzzy basis functions, universal approximation, and orthogonal least squares learning,” IEEE Transactions on Neural Networks, vol. 3, no. 5, pp. 807-814, 1992.
[Wang.2] L. X. Wang, A Course in Fuzzy Systems and Control, Prentice-Hall, Englewood Cliffs, New Jersey, 1997.
[Wang.3] W. Y. Wang, T. T. Lee, C. L. Liu, and C. H. Wang, “Function approximation using fuzzy neural networks with robust learning algorithm,” IEEE Transactions on Systems, Man, and Cybernetics-Part B, vol. 27, no. 4, pp. 740-747, 1997.
[Wu.1] H. K. Wu, J. G. Hsieh, Y. L. Lin, and J. H. Jeng, “On maximum likelihood fuzzy neural networks,” Fuzzy Sets and Systems, vol. 161, no. 21, pp. 2795-2807, 2010.
[Wu.2] H. K. Wu, J. G. Hsieh, and K. W. Yu, “Study on least trimmed squares fuzzy neural networks,” International Conference on Intelligent Systems and Knowledge Engineering, 2010. (to appear)
[Wu.3] H. K. Wu, J. G. Hsieh, Y. L. Lin, and J. H. Jeng, “Study on semiparametric Wilcoxon fuzzy neural networks,” Soft Computing. (submitted)
[Yu.1] W. Yu and X. Li, “Fuzzy identification using fuzzy neural networks with stable learning algorithms,” IEEE Transactions on Fuzzy Systems, vol. 12, no. 3, pp. 411-420, 2004.
Fulltext
This electronic fulltext is licensed only for personal, non-profit searching, reading, and printing for the purpose of academic research. Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
Thesis access permission: fully available on campus and off campus (unrestricted)
Available:
On campus: available
Off-campus: available


Printed copies
Public-access information for printed copies is relatively complete from academic year 102 (ROC calendar) onward. For printed copies from academic year 101 or earlier, please contact the printed thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
Available: available
