Thesis/Dissertation Record etd-0721106-215207



Name: Chih-cheng Yang (楊智程)    E-mail: not disclosed
Department: Electrical Engineering (電機工程學系研究所)
Degree: Master    Graduation: 2nd semester, academic year 94 (2005–2006)
Title (Chinese): 支援向量迴歸方法中的參數學習與支援向量點的縮減
Title (English): Parameter Learning and Support Vector Reduction in Support Vector Regression
Files
  • etd-0721106-215207.pdf
  • This electronic full text is licensed only for personal, non-profit retrieval, reading, and printing for purposes of academic research.
    Please comply with the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization.

    Access Rights

    Electronic thesis: not available on campus or off campus

    Language/Pages: Chinese / 72
    Statistics: This thesis has been viewed 5064 times and downloaded 0 times.
    Abstract (Chinese): The selection and learning of kernel functions in support vector learning is a very important but rarely studied problem. Yet the kernel function is a key factor in the performance of support vector regression. A kernel function maps the original input space into a high-dimensional feature space; through this transformation, problems that cannot be solved in the low-dimensional space become solvable in the high-dimensional one.
    This thesis makes two main contributions. First, we use the gradient descent method to learn kernel functions. By training the parameters under the risk minimization principle with gradient descent, we construct learning rules for the kernel parameters, which specify the shape and distribution of the kernel function; in this way we can obtain better kernel functions. Second, to reduce the number of support vectors, we adopt the orthogonal least squares method. Using orthogonal least squares to identify the most representative support vectors, we can remove the less important support vectors from the support vector regression model.
    The experimental results show that our method derives better kernel functions, exhibits better generalization ability than other methods, and effectively reduces the number of support vectors.
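The parameter-learning idea described above can be sketched in code. The following is a minimal illustration, not the thesis's actual learning rules: it tunes a single Gaussian kernel width by gradient descent (with a numerical central-difference gradient) on a hold-out estimate of the risk, and uses ridge-regularized kernel regression in place of a full SVR solver for brevity. All function names, constants, and the hold-out scheme are assumptions made for this sketch.

```python
import numpy as np

def rbf_kernel(X, Z, width):
    # Gaussian kernel: K[i, j] = exp(-||X[i] - Z[j]||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_weights(K, y, ridge=1e-3):
    # Kernel regression weights; the small ridge term keeps the solve stable.
    return np.linalg.solve(K + ridge * np.eye(len(y)), y)

def hold_out_risk(width, Xtr, ytr, Xva, yva):
    # Empirical risk (mean squared error) of the fitted model on held-out data.
    alpha = fit_weights(rbf_kernel(Xtr, Xtr, width), ytr)
    pred = rbf_kernel(Xva, Xtr, width) @ alpha
    return float(np.mean((pred - yva) ** 2))

def learn_width(Xtr, ytr, Xva, yva, width=1.0, lr=0.1, steps=100, eps=1e-4):
    # Gradient descent on the kernel width, central-difference gradient.
    for _ in range(steps):
        g = (hold_out_risk(width + eps, Xtr, ytr, Xva, yva)
             - hold_out_risk(width - eps, Xtr, ytr, Xva, yva)) / (2 * eps)
        width = max(width - lr * g, 1e-2)   # keep the width positive
    return width
```

In the thesis the gradient is derived analytically under the risk minimization principle; the numerical gradient here is only a stand-in that behaves the same way for a single scalar parameter.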
    Abstract (English): The selection and learning of kernel functions is a very important but rarely studied problem in the field of support vector learning. Yet the kernel function of a support vector regression model has a great influence on its performance. The kernel function projects the data from the original input space into a feature space, so that problems which cannot be solved in low dimensions become solvable in a higher dimension through this transformation.
    This thesis makes two main contributions. First, we introduce the gradient descent method to the learning of kernel functions. Using gradient descent, we construct learning rules for the parameters that specify the shape and distribution of the kernel functions; by training these parameters with respect to the risk minimization principle, we obtain better kernel functions. Second, in order to reduce the number of support vectors, we use the orthogonal least squares method. By choosing the representative support vectors, we can remove the less important support vectors from the support vector regression model.
    The experimental results show that our approach derives better kernel functions than other methods, achieves better generalization ability, and effectively reduces the number of support vectors.
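The support vector reduction step can be illustrated with a minimal forward orthogonal least squares selection, in the spirit of the classical OLS algorithm for radial basis function networks [21]; the matrix layout, names, and thresholds here are assumptions for this sketch, not the thesis's implementation.

```python
import numpy as np

def ols_select(Phi, y, n_keep):
    # Forward orthogonal least squares: greedily keep the n_keep columns of
    # Phi (one column per candidate support vector) that reduce the residual
    # error on y the most, orthogonalizing each candidate against the columns
    # already chosen (a classical Gram-Schmidt step).
    selected, basis = [], []
    residual = y.astype(float).copy()
    for _ in range(n_keep):
        best_j, best_err, best_w = None, -1.0, None
        for j in range(Phi.shape[1]):
            if j in selected:
                continue
            w = Phi[:, j].astype(float).copy()
            for q in basis:                      # orthogonalize vs. chosen columns
                w -= (q @ Phi[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:                    # numerically dependent column
                continue
            err = (w @ residual) ** 2 / denom    # error reduction if j is kept
            if err > best_err:
                best_j, best_err, best_w = j, err, w
        if best_j is None:
            break
        selected.append(best_j)
        basis.append(best_w)
        residual -= (best_w @ residual) / (best_w @ best_w) * best_w
    return selected
```

In the support vector setting, column j of Phi would hold the kernel evaluations k(x_i, sv_j) of every training point against candidate support vector j, and the returned indices are the representative support vectors worth keeping.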
    Keywords (Chinese)
  • orthogonal least squares (垂直最小平方法)
  • gradient descent method (最陡坡降法)
  • support vectors (支援向量點)
  • kernel function (核心函式)
  • support vector regression (支援向量迴歸)
    Keywords (English)
  • gradient descent method
  • support vector regression
  • support vectors
  • orthogonal least squares
  • kernel function
    Table of Contents
    Abstract (Chinese)                             i
    Abstract (English)                             ii
    List of Figures                                v
    List of Tables                                 vii
    Chapter 1  Introduction                        1
    Chapter 2  Related Work                        5
      2.1  Support Vector Machines                 5
        2.1.1  Linear Support Vector Machines      5
        2.1.2  Optimal Separating Hyperplane       6
        2.1.3  Linearly Non-separable Problems     10
          2.1.3.1  Soft Margin                     10
          2.1.3.2  Nonlinear Kernel Functions      13
      2.2  Support Vector Regression               18
    Chapter 3  Our Approach                        23
      3.1  Parameter Learning                      23
      3.2  Support Vector Reduction                32
    Chapter 4  Experimental Results and Analysis   40
      4.1  Parameter Learning                      40
        Experiment 4.1.1                           40
        Experiment 4.1.2                           42
        Experiment 4.1.3                           44
      4.2  Support Vector Reduction                49
        Experiment 4.2.1                           49
        Experiment 4.2.2                           52
        Experiment 4.2.3                           54
    Chapter 5  Conclusion                          60
    References                                     61
    References
    [1] V. Vapnik, The Nature of Statistical Learning Theory, New York: Springer-Verlag, 1995.
    [2] C.J.C. Burges, “A Tutorial on Support Vector Machines for Pattern Recognition,” Knowledge Discovery and Data Mining, vol. 2, no. 2, pp. 1-43, 1998.
    [3] V. Vapnik, Estimation of Dependences Based on Empirical Data, New York: Springer Verlag, 1982.
    [4] K. Kobayashi, D. Kitakoshi and R. Nakano, “Yet faster method to optimize SVR hyperparameters based on minimizing cross-validation error,” Proc. International Joint Conference on Neural Networks, pp. 871-876, Montréal, Québec, Canada, 2005.
    [5] N. Cristianini and J. Shawe-Taylor, An introduction to support vector machines: and other kernel-based learning methods, London: Cambridge University Press, 2000.
    [6] A.J. Smola and B. Scholkopf, “A tutorial on support vector regression,” Statistics and Computing, vol. 14, no. 3, pp. 199-222, 2004.
    [7] C.C. Chang and C.J. Lin, “Training ν-Support Vector Regression: Theory and Algorithms,” Neural Computation, vol. 14, pp. 1959-1977, 2002.
    [8] K. Ito and R. Nakano, “Optimizing Support Vector regression hyperparameters based on cross-validation,” Proc. International Joint Conference on Neural Networks, pp. 2077-2082, Portland, OR, USA, July 2003.
    [9] J.T. Jeng and C.C. Chuang, “A novel approach for the hyperparameters of support vector regression,” Proc. International Joint Conference on Neural Networks, pp. 642-647, Honolulu, Hawaii, USA, 2002.
    [10] Y. Tan and J. Wang, “A support vector machine with a hybrid kernel and minimal Vapnik-Chervonenkis dimension,” IEEE Trans. Knowledge and Data Engineering, vol. 16, no. 4, pp. 385-395, 2004.
    [11] D.J. Newman, S. Hettich, C.L. Blake and C.J. Merz, UCI Repository of machine learning databases [http://www.ics.uci.edu/~mlearn/MLRepository.html], Irvine, CA: University of California, Department of Information and Computer Science, 1998.
    [12] J.H. Chiang and P.Y. Hao, “Support vector learning mechanism for fuzzy rule-based modeling: a new approach,” IEEE Trans. Fuzzy Systems, vol. 12, no. 1, pp. 1-12, Feb. 2004.
    [13] C.W. Hsu, C.C. Chang and C.J. Lin, “A Practical Guide to Support Vector Classification,” Available at [http://www.csie.ntu.edu.tw/~cjlin/paper/guide/guide.pdf].
    [14] J. Shawe-Taylor, P.L. Bartlett, R.C. Williamson and M. Anthony, “Structural risk minimization over data-dependent hierarchies,” IEEE Trans. Information Theory, vol. 44, Issue 5, pp. 1926-1940, Sept. 1998.
    [15] B. Scholkopf, C.J.C. Burges and A.J. Smola, “Introduction to Support Vector Learning,” in Advances in Kernel Methods: Support Vector Learning, pp. 1-15, Cambridge, MA: MIT Press, 1999.
    [16] A.J. Smola, Learning with Kernels, PhD thesis, Technische Universität Berlin, 1998.
    [17] M. Setnes and R. Babuska, “Rule base reduction: some comments on the use of orthogonal transforms,” Systems, Man and Cybernetics, Part C, IEEE Transactions, vol. 31, Issue 2, pp. 199-206, May 2001.
    [18] J. Yen and L. Wang, “Simplifying fuzzy rule-based models using orthogonal transformation methods,” IEEE Trans. Syst., Man, Cybern, Part B, vol. 29, Issue 1, pp. 13-24, Feb. 1999.
    [19] T. Takagi and M. Sugeno, “Fuzzy identification of systems and its applications to modeling and control,” IEEE Trans. Syst., Man, Cybern, vol. 15, pp. 116–132, Jan./Feb. 1985.
    [20] L.X. Wang and J.M. Mendel, “Fuzzy basis functions, universal 
      approximation,and orthogonal least-squares learning,” IEEE Trans. Neural Network, vol. 3, Issue 5, pp. 807-813, Oct. 1992.
    [21] S. Chen, C.F.N. Cowan and P.M. Grant, “Orthogonal least squares learning algorithm for radial basis function networks,” IEEE Trans. Neural Networks, vol. 2, Issue 2, pp. 302-309, March 1991.
    [22] C.H. Lee, H.C. Yang, F.C Hsu, T.C. Chen and C.C. Hung, “A Multiple Classifier Approach for Measuring Text Relatedness Based on Support Vector Machines Techniques,” 9th World Multiconference on Systemics, Cybernetics and Informatics (WMSCI 2005), Orlando, USA, July 10-13 2005.
    [23] J. Friedman, "Multivariate Adaptive Regression Splines," Technical Report no. 102, Laboratory for Computational Statistics, Department of Statistics, Stanford University, November 1988.
    [24] G.E.P. Box and G.M. Jenkins, Time Series Analysis: Forecasting and Control, San Francisco: Holden-Day, pp. 532-533, 1970.
    [25] S.R. Gunn, “Support Vector Machines for Classification and Regression,” Technical Report, Image Speech and Intelligent Systems Research Group, University of Southampton, 1997.
    Advisory Committee
  • 郭忠民 - Committee Chair
  • 吳志宏 - Committee Member
  • 歐陽振森 - Committee Member
  • 黃宗傳 - Committee Member
  • 李錫智 - Advisor
    Date of Oral Defense: 2006-07-04    Date of Submission: 2006-07-21


