Thesis etd-0721106-213119: Detailed Information
Title
支援向量迴歸方法的參數預測
Estimation of Parameters in Support Vector Regression
Department
Year, semester
Language
Degree
Number of pages
70
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2006-07-04
Date of Submission
2006-07-21
Keywords
Sequential Minimal Optimization, Fuzzy C-means, Resource Allocating Network (RAN), Radial Basis Function, Support Vector Machine
Statistics
This thesis/dissertation has been browsed 5838 times and downloaded 0 times.
Abstract (in Chinese, translated)
In the field of support vector machines, the selection and modification of kernel functions is an important research topic, since the kernel function has a significant influence on an SVM's performance. A kernel function maps the dataset from the original data space into a feature space, so that problems which cannot be handled in low dimensions can be handled in a higher dimension through the kernel transformation; the most commonly used kernel is the Radial Basis Function (RBF). In this thesis, we use the Fuzzy C-means clustering algorithm to partition the data patterns into clusters, and then use a statistical method to compute a standard deviation for each data pattern. In this way we can capture the distribution of the data and assign each pattern an appropriate standard deviation, which serves as the width (variance) parameter of the Radial Basis Function. Finally, the data patterns and their corresponding variances are passed to the support vector machine for training. Experimental results show that our method yields better kernel functions, along with better learning and generalization abilities.
Abstract
The selection and modification of kernel functions is a very important problem in the field of support vector learning, since the kernel function of a support vector machine has a great influence on its performance. The kernel function projects the dataset from the original data space into the feature space, so that problems which cannot be solved in low dimensions can be solved in a higher dimension through the transformation of the kernel function. In this thesis, we adopt the FCM clustering algorithm to group data patterns into clusters, and then use a statistical approach to calculate the standard deviation of each pattern with respect to the other patterns in the same cluster. In this way we can properly estimate the distribution of the data patterns and assign a proper standard deviation to each pattern; this standard deviation plays the role of the width (variance) parameter of a radial basis function. The original data patterns, together with the variance of each pattern, are then used for support vector learning. Experimental results show that our approach derives better kernel functions than other methods, and also achieves better learning and generalization abilities.
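The abstract describes the width-assignment pipeline only in prose. As a rough sketch of the idea (not the thesis's actual code: the particular FCM update shown, the within-cluster spread statistic, and all function names are assumptions for illustration), the per-pattern RBF widths could be derived like this:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-means: returns cluster centres and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # rows of U sum to 1
    for _ in range(iters):
        W = U ** m                             # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))     # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

def per_pattern_sigma(X, U):
    """Assign each pattern the spread (std. dev. of distances to the centre)
    of its hard-assigned cluster -- one possible 'statistical approach'."""
    labels = U.argmax(axis=1)
    sigma = np.empty(len(X))
    for k in np.unique(labels):
        members = X[labels == k]
        center = members.mean(axis=0)
        spread = np.linalg.norm(members - center, axis=1).std() + 1e-6
        sigma[labels == k] = spread
    return sigma

def variable_width_rbf(X, sigma):
    """Gram matrix of a Gaussian kernel whose width varies per pattern."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    s2 = sigma[:, None] * sigma[None, :]       # symmetrised pairwise widths
    return np.exp(-sq / (2.0 * s2))
```

A Gram matrix built this way can be handed to any kernel method that accepts a precomputed kernel; the thesis itself goes further and incorporates the per-pattern variances into its SMO-based SVR training and working set selection.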
Table of Contents
Abstract (in Chinese) i
Abstract ii
Chapter 1 Introduction - 1 -
Chapter 2 Support Vector Machines - 5 -
2.1 Linearly Separable Classification Problems - 5 -
2.2 Linear Support Vector Machines for Classification - 9 -
2.3 Soft Margins - 12 -
2.4 Kernel Functions - 15 -
2.5 Support Vector Regression - 20 -
Chapter 3 Methodology - 27 -
3.1 Motivation - 27 -
3.2 Proposed Method - 30 -
3.3 Sequential Minimal Optimization - 35 -
3.4 Working Set Selection - 38 -
3.5 Maximal Violating Pair - 39 -
3.6 Working Set Selection Using Second-Order Information - 40 -
3.7 Working Set Selection Considering Variances - 43 -
Chapter 4 Experimental Results and Analysis - 46 -
Chapter 5 Conclusion - 58 -
References - 59 -
References
[1]. V. Vapnik, "The Nature of Statistical Learning Theory," New York: Springer-Verlag, 1995.
[2]. C. J. C. Burges, "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 1-43, 1998.
[3]. V. Vapnik, "Estimation of Dependences Based on Empirical Data," New York: Springer-Verlag, 1982.
[4]. J. C. Bezdek, R. Ehrlich and W. Full, "FCM: The Fuzzy C-Means Clustering Algorithm," Computers and Geosciences, vol. 10, no. 2-3, pp. 191-203, 1984.
[5]. R. O. Duda and P. E. Hart, "Pattern Classification and Scene Analysis," New York: John Wiley and Sons, 1996.
[6]. N. Cristianini and J. Shawe-Taylor, "An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods," Cambridge University Press, 2002.
[7]. J. Shawe-Taylor and N. Cristianini, "Kernel Methods for Pattern Analysis," Cambridge University Press, 2004.
[8]. A. J. Smola and B. Scholkopf, "A Tutorial on Support Vector Regression," Statistics and Computing, vol. 14, no. 3, pp. 199-222, 2004.
[9]. B. Scholkopf, K. K. Sung, C. J. C. Burges, F. Girosi, P. Niyogi, T. Poggio and V. Vapnik, "Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers," Signal Processing, vol. 45, no. 11, pp. 2758-2765, November 1997.
[10]. R. E. Fan, P. H. Chen and C. J. Lin, "Working Set Selection Using Second Order Information for Training SVM," Journal of Machine Learning Research, vol. 6, pp. 1889-1918, 2005.
[11]. C. C. Chang and C. J. Lin, LIBSVM: a Library for Support Vector Machines, April 17, 2005.
[12]. P. H. Chen, R. E. Fan and C. J. Lin, "A Study on SMO-Type Decomposition Method for Support Vector Machines," Technical Report, Department of Computer Science, National Taiwan University, 2005.
[13]. J. C. Platt, "Fast Training of Support Vector Machines Using Sequential Minimal Optimization," Advances in Kernel Methods - Support Vector Learning, pp. 185-208, 1999.
[14]. J. C. Platt, "Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines," Technical Report MSR-TR-98-14, Microsoft Research, 1998.
[15]. S. S. Keerthi, S. K. Shevade, C. Bhattacharyya and K. R. K. Murthy, "Improvements to Platt's SMO Algorithm for SVM Classifier Design," Neural Computation, vol. 13, no. 3, pp. 637-649, March 2001.
[16]. V. Kadirkamanathan and M. Niranjan, "A Function Estimation Approach to Sequential Learning with Neural Networks," Neural Computation, vol. 5, no. 6, pp. 954-975, November 1993.
[17]. J. Platt, "A Resource-Allocating Network for Function Interpolation," Neural Computation, vol. 3, no. 2, pp. 213-225, 1991.
[18]. L. Yingwei, N. Sundararajan and P. Saratchandran, "Performance Evaluation of a Sequential Minimal Radial Basis Function (RBF) Neural Network Learning Algorithm," Neural Networks, vol. 9, no. 2, pp. 308-318, March 1998.
[19]. G. B. Huang, P. Saratchandran and N. Sundararajan, "A Generalized Growing and Pruning RBF (GGAP-RBF) Neural Network for Function Approximation," Neural Networks, vol. 16, no. 1, pp. 57-67, January 2005.
[20]. M. R. Azimi-Sadjadi and S. Sheedvash, "Recursive Node Creation in Back-Propagation Neural Networks Using Orthogonal Projection Method," Neural Networks, vol. 4, pp. 242-256, March 1993.
[21]. V. Kadirkamanathan and M. Niranjan, "Nonlinear Adaptive Filtering in Nonstationary Environments," Proceedings of the International Conference on Acoustics, Speech and Signal Processing, pp. 2177-2180, Toronto, Canada, 1991.
[22]. M. S. Bazaraa, H. D. Sherali and C. M. Shetty, "Nonlinear Programming: Theory and Algorithms," Wiley, 1992.
[23]. M. J. L. Orr, "Introduction to Radial Basis Function Networks," Technical Report, Institute for Adaptive and Neural Computation, Division of Informatics, Edinburgh University, Scotland, UK, 1996.
[24]. H. Frigui and R. Krishnapuram, "Clustering by Competitive Agglomeration," Pattern Recognition, vol. 30, no. 7, pp. 1109-1119, 1997.
[25]. J. T. Jeng and C. C. Chung, "A Novel Approach for the Hyper-Parameters of Support Vector Regression," Proceedings of the International Joint Conference on Neural Networks, pp. 642-647, Hawaii, USA, 2002.
[26]. Y. Tan and J. Wang, "A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension," Knowledge and Data Engineering, vol. 16, no. 4, pp. 385-395, April 2004.
[27]. K. Ito and R. Nakano, "Optimizing Support Vector Regression Hyperparameters Based on Cross-Validation," Proceedings of the International Joint Conference on Neural Networks, pp. 2077-2082, Oregon, USA, 2003.
[28]. D. J. Newman, S. Hettich, C. L. Blake and C. J. Merz, UCI Repository of Machine Learning Databases, Irvine, CA: University of California, Department of Information and Computer Science, 1998.
Fulltext
The electronic full text is licensed only for personal, non-profit searching, reading, and printing for the purpose of academic research. Please observe the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
Thesis access permission: not available on campus or off campus
Available:
On-campus: never available
Off-campus: never available


Printed copies
Availability information for printed copies is relatively complete for academic year 102 (2013) and later. To inquire about printed copies from academic year 101 or earlier, please contact the printed thesis service counter of the Office of Library and Information Services. We apologize for any inconvenience.
Availability: publicly available
