博碩士論文 etd-0727106-123654 詳細資訊 Detailed record for etd-0727106-123654
Title page for etd-0727106-123654
論文名稱
Title
改良式模糊決策樹及其於向量編碼之應用
An Improved C-Fuzzy Decision Tree and its Application to Vector Quantization
系所名稱
Department
畢業學年期
Year, semester
語文別
Language
學位類別
Degree
頁數
Number of pages
81
研究生
Author
指導教授
Advisor
召集委員
Convenor
口試委員
Advisory Committee
口試日期
Date of Exam
2006-07-04
繳交日期
Date of Submission
2006-07-27
關鍵字
Keywords
影像壓縮、向量量化法、模糊決策樹、決策樹、遞迴式奇異值分解的最小平方估算法、模糊分群
vector quantization, recursive SVD-based least squares estimator, fuzzy decision tree, decision trees, fuzzy clustering, image compression
統計
Statistics
本論文已被瀏覽 5841 次,被下載 2636 次。
The thesis/dissertation has been browsed 5841 times and downloaded 2636 times.
中文摘要 Chinese Abstract
在近百年裡,人類發明了許許多多便利的工具,以追求更美好、舒適的生活環境,其中電腦為最重要的發明之一,其運算能力是人類無法比擬的;由於電腦能快速且準確地處理大量的資料,人們進而希望利用這項優勢來模擬人類的思維,而人工智慧就此堀起,舉凡類神經網路、資料探勘、模糊邏輯…等方法源源不絕的提出,應用到指紋辨識、影像壓縮、天線設計…等各方領域,在此我們將根據決策樹及模糊分群法對於資料的預測技術方面進行探討。模糊決策樹使用模糊分群的方法達到將資料分類的效果,進而建構決策樹以對資料做預測的動作;然而,在距離函數方面,由於設計輸出值的影響力將隨著輸入向量維度大小成反比,將導致在某些資料集的分類上產生問題,除此之外,每個葉節點的輸出模式僅以一個常數的代表輸出值,忽略了以此節點中資料的分佈狀況來描繪輸出結果的方法;我們將提出對距離函數更合理的定義來同時考慮具有權重因子的輸入及輸出空間的測量,且利用局部線性函數更廣泛的定義每個葉節點的輸出模式,並根據遞迴式奇異值分解的最小平方估算法求得此函數之各項係數;實驗結果顯示我們改良的方法在分類及迴歸問題上,都能具有較高的辨視效果及較小的均方差值。
Abstract
Over the last hundred years, mankind has invented many convenient tools in pursuit of a better and more comfortable living environment. The computer is one of the most important of these inventions, and its computational power is beyond human comparison. Because computers can process large amounts of data quickly and accurately, people have sought to use this advantage to imitate human thinking, giving rise to artificial intelligence. Methods such as neural networks, data mining, and fuzzy logic have been proposed continually and applied to many fields, including fingerprint recognition, image compression, and antenna design. In this thesis, we investigate data prediction techniques based on decision trees and fuzzy clustering. The C-fuzzy decision tree classifies data by fuzzy clustering and then constructs a decision tree to make predictions. However, in its distance function, the influence of the target output is inversely proportional to the dimensionality of the input vector, which causes problems on some data sets. In addition, the output model of each leaf node is a single constant, which restricts its ability to describe the distribution of the data within the node. We propose a more reasonable definition of the distance function that considers both input and target differences with a weighting factor, and we extend the output model of each leaf node to a local linear model whose parameters are estimated with a recursive SVD-based least squares estimator. Experimental results show that our improved version produces higher recognition rates for classification problems and smaller mean square errors for regression problems.
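To make the two ingredients of the improvement concrete, the sketch below shows (1) fuzzy c-means clustering over the joint input/target space, with a weighting factor balancing input and target differences, and (2) a local linear output model for one cluster fitted by an SVD-based least squares solve (`numpy.linalg.lstsq` computes its solution via an SVD). This is only an illustrative sketch under assumed details: the function names, the weighting scheme `alpha`, and the one-shot batch solve are hypothetical simplifications, whereas the thesis itself uses a recursive SVD-based estimator inside a tree-growing procedure.

```python
import numpy as np

def weighted_fcm(X, y, c=2, m=2.0, alpha=0.5, iters=50, seed=0):
    """Fuzzy c-means over the joint (input, target) space.

    The squared distance weights input and target differences so the
    target's influence does not shrink as input dimensionality grows
    (alpha is a hypothetical weighting factor, not the thesis's exact one).
    """
    rng = np.random.default_rng(seed)
    Z = np.hstack([X, y.reshape(-1, 1)])             # joint vectors
    n, d = Z.shape
    w = np.empty(d)
    w[:-1] = (1.0 - alpha) / (d - 1)                 # weight shared by input dims
    w[-1] = alpha                                    # fixed weight for the target
    U = rng.random((c, n))
    U /= U.sum(axis=0)                               # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        V = (Um @ Z) / Um.sum(axis=1, keepdims=True)     # cluster centers
        # weighted squared distances from every sample to every center
        D = np.array([((Z - v) ** 2 * w).sum(axis=1) for v in V])
        U = np.fmax(D, 1e-12) ** (-1.0 / (m - 1))        # standard FCM update
        U /= U.sum(axis=0)
    return U, V

def local_linear_fit(X, y, u, m=2.0):
    """Fit y ~ [1, x] @ beta for one cluster, each row weighted by its
    membership; np.linalg.lstsq solves the system through an SVD."""
    A = np.hstack([np.ones((len(X), 1)), X])
    sw = np.sqrt(u ** m)                             # square roots of LS weights
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return beta

if __name__ == "__main__":
    X = np.linspace(0.0, 1.0, 40).reshape(-1, 1)
    y = 2.0 * X.ravel() + 1.0                        # exact line y = 2x + 1
    U, V = weighted_fcm(X, y, c=2)
    beta = local_linear_fit(X, y, U[0])
    print(beta)                                      # close to [1, 2]
```

Because the distance weight vector `w` sums to one regardless of how many input dimensions there are, the target coordinate keeps a fixed share `alpha` of the distance, which is the intuition behind the thesis's criticism of the original distance function.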
目次 Table of Contents
Chapter 1 Introduction 1
1.1 Fuzzy C-Means Algorithm 2
1.2 Decision Trees 6
1.3 Vector Quantization 9
Chapter 2 The C-Fuzzy Decision Tree 13
2.1 Classification by Fuzzy Clustering 14
2.2 Stopping Conditions for Splitting the C-Fuzzy Decision Tree 17
2.3 Construction of the C-Fuzzy Decision Tree 20
Chapter 3 Our Method 23
3.1 Improved Fuzzy C-Means 24
3.2 Local Linear Output Functions 28
3.3 Construction of the Improved Fuzzy Decision Tree 34
Chapter 4 Application to Vector Quantization 39
4.1 Tree-Structured Codebooks 39
4.2 Our Method 45
Chapter 5 Experimental Results 54
Chapter 6 Conclusion 69
References 71
參考文獻 References
[1] J. Han and M. Kamber, Data Mining: Concepts and Techniques, San Francisco, CA: Morgan Kaufmann, 2001.

[2] S.-J. Lee and C.-S. Ouyang, “A neuro-fuzzy system modeling with self-constructing rule generation and hybrid SVD-based learning,” IEEE Transactions on Fuzzy Systems, vol. 11, no. 3, pp. 341–353, 2003.

[3] Y. Lin, G. A. Cunningham III, and S. V. Coggeshall, “Using fuzzy partitions to create fuzzy systems from input-output data and set the initial weights in a fuzzy neural network,” IEEE Transactions on Fuzzy Systems, vol. 5, pp. 614–621, 1997.

[4] C. J. Merz and P. M. Murphy, UCI Repository of Machine Learning Databases, Department of Information and Computer Science, University of California, Irvine, CA. (http://www.ics.uci.edu/~mlearn/MLRepository.html)

[5] W. Pedrycz and Z. A. Sosnowski, “C-fuzzy decision trees,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 35, no. 4, pp. 498–511, 2005.

[6] J. R. Quinlan, “Induction of decision trees,” Machine Learning, vol. 1, pp. 81–106, 1986.

[7] J. R. Quinlan, C4.5: Programs for Machine Learning, San Francisco, CA: Morgan Kaufmann, 1993.

[8] J. C. Bezdek, Fuzzy Mathematics in Pattern Classification, Ph.D. thesis, Applied Mathematics Center, Cornell University, Ithaca, NY, 1973.

[9] Y. Linde, A. Buzo, and R. M. Gray, “An algorithm for vector quantizer design,” IEEE Transactions on Communications, vol. COM-28, no. 1, pp. 84–95, 1980.

[10] J.-S. Chung, C.-S. Ouyang, W.-J. Lee, and S.-J. Lee, “Distortion-based hierarchical clustering for vector quantization,” Proceedings of International Conference on Informatics, Cybernetics, and Systems, pp. 1006–1011, Dec. 2003.

[11] N. Vlajic and C. Card, “Vector quantization of images using modified adaptive resonance algorithm for hierarchical clustering,” IEEE Transactions on Neural Networks, vol. 12, pp. 1147–1162, Sept. 2001.

[12] C. Chang and C. Shiue, “Tree structured vector quantization with dynamic path search,” Proceedings of International Workshops on Parallel Processing, pp. 536–541, Sept. 1999.

[13] W. Pedrycz and Z. A. Sosnowski, “Designing decision trees with the use of fuzzy granulation,” IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 30, no. 2, pp. 151–159, Mar. 2000.

[14] N. B. Karayiannis and M. M. Randolph-Gips, “Soft learning vector quantization and clustering algorithms based on non-Euclidean norms: single-norm algorithms,” IEEE Transactions on Neural Networks, vol. 16, no. 2, pp. 423–435, Mar. 2005.

[15] N. N. Karnik, J. M. Mendel, and Q. Liang, “Type-2 fuzzy logic systems,” IEEE Transactions on Fuzzy Systems, vol. 7, pp. 643–658, Dec. 1999.

[16] C.-H. Wang, C.-S. Cheng, and T.-T. Lee, “Dynamical optimal training for interval type-2 fuzzy neural network (T2FNN),” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 34, no. 3, pp. 1462–1477, Jun. 2004.

[17] N. B. Karayiannis, “An axiomatic approach to soft learning vector quantization and clustering,” IEEE Transactions on Neural Networks, vol. 10, no. 5, pp. 1153–1165, Sep. 1999.

[18] F. Hoppner and F. Klawonn, “A contribution to convergence theory of fuzzy c-means and derivatives,” IEEE Transactions on Fuzzy Systems, vol. 11, no. 5, pp. 682–694, Oct. 2003.

[19] J. F. Kolen and T. Hutcheson, “Reducing the time complexity of the fuzzy c-means algorithm,” IEEE Transactions on Fuzzy Systems, vol. 10, no. 2, pp. 263–267, Apr. 2002.

[20] C.-R. Lin and M.-S. Chen, “Combining partitional and hierarchical algorithms for robust and efficient data clustering with cohesion self-merging,” IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 2, pp. 145–159, Feb. 2005.

[21] J. Basak and R. Krishnapuram, “Interpretable hierarchical clustering by constructing an unsupervised decision tree,” IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 1, pp. 121–132, Jan. 2005.

[22] M.-T. Jone and S.-J. Lee, “Construction of multi-class neural networks with entropy measure approach,” Proceedings of International Symposium on Artificial Neural Networks, Tainan, Taiwan, ROC, pp. 296–301, Dec. 1994.

[23] L. Groll and J. Jakel, “A new convergence proof of fuzzy c-means,” IEEE Transactions on Fuzzy Systems, vol. 13, no. 5, pp. 717–720, Oct. 2005.
電子全文 Fulltext
The electronic full text is licensed for personal, non-profit retrieval, reading, and printing for academic research purposes only. Please observe the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast it without authorization, so as to avoid infringement.
論文使用權限 Thesis access permission: withheld for one year, then released both on and off campus
開放時間 Available:
校內 Campus: 已公開 available
校外 Off-campus: 已公開 available


紙本論文 Printed copies
Availability information for printed copies is relatively complete from academic year 102 onward. To inquire about the availability of printed copies from academic year 101 or earlier, please contact the printed thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
開放時間 Available: 已公開 available
