§ Thesis Bibliographic Record
  
System ID U0002-0607200911040000
DOI 10.6846/TKU.2009.00149
Thesis Title (Chinese) 使用混合最佳化技術化簡支撑向量機之解
Thesis Title (English) A Hybrid Optimization Strategy for Simplifying the Solutions of Support Vector Machines
Thesis Title (Third Language)
Institution Tamkang University (淡江大學)
Department (Chinese) 資訊工程學系博士班
Department (English) Department of Computer Science and Information Engineering
Foreign Degree University
Foreign Degree College
Foreign Degree Graduate Institute
Academic Year 97
Semester 2
Publication Year 98 (2009)
Student Name (Chinese) 葉日斌
Student Name (English) Jih-Pin Yeh
Student ID 891190125
Degree Doctoral
Language English
Second Language
Defense Date 2009-06-23
Number of Pages 38
Defense Committee Advisor - 林慧珍 (hjlin@cs.tku.edu.tw)
Member - 徐道義 (taoi@gc.shu.edu.tw)
Member - 陳朝欽 (cchen@cs.nthu.edu.tw)
Member - 林慧珍 (hjlin@cs.tku.edu.tw)
Member - 洪文斌 (horng@cs.tku.edu.tw)
Member - 蔡憶佳 (tsai@cs.tku.edu.tw)
Keywords (Chinese) 支撐向量機
粒子尋優演算法
遺傳演算法
最佳化
判斷函數
Keywords (English) support vector machine
particle swarm optimization
genetic algorithm
optimization
discriminant function
Keywords (Third Language)
Subject Classification
Abstract (Chinese)
This thesis studies the use of optimization techniques, namely particle swarm optimization (PSO) and genetic algorithms (GA), to simplify the solutions of support vector machines (SVMs). The main issue is to find an optimal subset of the SVM solution set such that the discriminant function formed by this subset best approximates the discriminant function of the original, unsimplified solution. The subset is selected from the original SVM solution set according to a fitness function that measures the quality of the discriminant function it forms, and the optimization techniques (PSO and GA) search for the optimal subset using this same fitness function. The results show that both the quality of the defined fitness function and the choice of search technique affect the performance of the resulting approximate discriminant function. The proposed method can be applied to the discriminant function formed with any SVM kernel, and the reduction rate can be adjusted adaptively according to the needs of the task. The method is evaluated on standard benchmark datasets. The experimental results indicate that the hybrid optimization strategy can indeed find the optimal subset of the SVM solution set effectively, and that the search algorithms rank, from best to worst, as PSO-GA, GA-PSO, PSO, and GA.
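For readers unfamiliar with the problem, the selection task described above can be summarized in standard SVM notation. The symbols below (support set S, reduced subset R, subset size m) are generic conventions assumed here for illustration, not notation taken from the thesis, and the reduced function is written with the original coefficients kept for simplicity.

% Decision (discriminant) function of the trained SVM over its full support set S
\[ f(\mathbf{x}) = \sum_{i \in S} \alpha_i\, y_i\, K(\mathbf{x}_i, \mathbf{x}) + b \]
% Discriminant function rebuilt from a selected subset R \subseteq S
\[ \tilde{f}_R(\mathbf{x}) = \sum_{i \in R} \alpha_i\, y_i\, K(\mathbf{x}_i, \mathbf{x}) + b \]
% Selection problem: among subsets of a prescribed size m, choose the one whose
% discriminant function stays closest to the original
\[ R^{*} = \operatorname*{arg\,min}_{R \subseteq S,\; |R| = m} \bigl\| f - \tilde{f}_R \bigr\| \]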
Abstract (English)
This thesis investigates and compares the reduction of solutions for SVMs using two optimization techniques, namely particle swarm optimization (PSO) and the genetic algorithm (GA). The main issue is to search for a subset of the support vector solutions produced by an SVM that forms a discriminant function best approximating the original one. The work is accomplished by defining a fitness function that fairly indicates how well the discriminant function formed by a set of selected vectors approximates the original one, and by searching for the set of vectors with the best fitness using PSO, GA, or a hybrid approach combining the two. Both the defined fitness function and the adopted search technique affect the performance. Our method can be applied to SVMs associated with any general kernel, and the reduction rate can be adaptively adjusted to the requirements of the task. The proposed approach is tested on several benchmark datasets. The test results show that combining particle swarm optimization with the genetic algorithm improves the search: both PSO-GA and GA-PSO outperform plain PSO and plain GA.
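As a concrete illustration of the search described above, the sketch below selects a reduced support vector set with a binary PSO whose fitness compares the reduced and full decision values on a probe set. This is a minimal sketch under stated assumptions, not the thesis' algorithm: the data, kernel parameters, size penalty, and PSO constants are invented for illustration, and the selected subset keeps the original dual coefficients rather than re-deriving them.

# Minimal sketch (not the thesis' exact method): pick a reduced set of support
# vectors with a binary PSO, scoring each candidate subset by how closely its
# decision values match those of the full SVM on a probe set.
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for quantities taken from an already-trained RBF-kernel SVM:
# support vectors, signed dual coefficients (alpha_i * y_i), and the bias b.
n_sv, dim, gamma, target_size = 40, 2, 1.0, 10
sv = rng.normal(size=(n_sv, dim))
coef = rng.normal(size=n_sv)
bias = 0.1
probe = rng.normal(size=(200, dim))   # points on which decision values are compared

def rbf(a, b):
    """RBF kernel matrix between the rows of a and the rows of b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(probe, sv)                    # (n_probe, n_sv), computed once
full_decision = K @ coef + bias       # decision values of the unreduced SVM

def fitness(mask):
    """Higher is better: negative decision-value error of the reduced set,
    minus a soft penalty for exceeding the size budget (illustrative choice)."""
    if not mask.any():
        return -np.inf
    reduced = K[:, mask] @ coef[mask] + bias
    err = np.mean((reduced - full_decision) ** 2)
    return -(err + 0.1 * max(0, int(mask.sum()) - target_size))

# Binary PSO in the spirit of Kennedy & Eberhart (1997), reference [22]:
# velocities are mapped through a sigmoid to per-bit selection probabilities.
n_particles, n_iter = 20, 100
pos = rng.random((n_particles, n_sv)) < target_size / n_sv
vel = rng.normal(scale=0.1, size=(n_particles, n_sv))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmax(pbest_fit)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_sv))
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest.astype(float) - pos)
           + 1.5 * r2 * (gbest.astype(float) - pos))
    pos = rng.random((n_particles, n_sv)) < 1.0 / (1.0 + np.exp(-vel))
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmax(pbest_fit)].copy()

print("kept", int(gbest.sum()), "of", n_sv, "support vectors")
print("fitness of reduced set:", fitness(gbest))

A GA, or a PSO-GA / GA-PSO hybrid as in the abstract, would plug into the same fitness function; sharing the fitness across search techniques is what makes the comparison meaningful.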
第三語言摘要
Table of Contents
Chapter 1. Introduction
Chapter 2. Support Vector Machines (SVM)
Chapter 3. Reduction of the Solutions for SVMs
  3.1 Approximation Error of Reduced Set
  3.2 Evaluation of Approximation Error
  3.3 Analysis of Approximation Error
  3.4 Mathematical Model for Support Vector Selection Problem
Chapter 4. Searching the Optimal Reduced Set of Support Vectors
  4.1 Particle Swarm Optimization
    4.1.1 Our Discrete PSO
    4.1.2 Particle Representation and Fitness Function
    4.1.3 Optimization Using PSO
  4.2 Genetic Algorithm
Chapter 5. Experimental Results and Comparisons
Chapter 6. Conclusions and Future Works
Bibliography
 
List of Figures
Figure 2.1. The feature map ψ from X to H
Figure 2.2. A sketch for SVM process
Figure 3.1. The span of ψ(XF) and the span of ψ(S)
 
List of Tables
Table 1. Recognition rates for two spirals
Table 2. Recognition rates for two waveform graphs
Table 3. Recognition rates for two concentric ellipses
Table 4. Recognition rates for two cosine
References
[1]  P. Angeline, “Evolutionary Optimization Versus Particle Swarm Optimization: Philosophy and Performance Difference,” in Proceedings of the Seventh Annual Conference on Evolutionary Programming, pp. 601-610, 1998.
[2]  P. Angeline, “Using Selection to Improve Particle Swarm Optimization,” in Proceedings of the International Conference on Evolutionary Computation, Piscataway, New Jersey, USA, pp. 84-89, 1998.
[3]  B. E. Boser, I. M. Guyon, and V. N. Vapnik, “A Training Algorithm for Optimal Margin Classifiers,” in Proceedings of the 5th Annual Workshop on Computational Learning Theory, pp. 144-152, 1992.
[4]  C. J. C. Burges, “Simplified Support Vector Decision Rules,” in Proceedings of 13th International Conference on Machine Learning, Bari, Italy, pp. 71-77, 1996.
[5]  C. J. C. Burges, “Geometry and Invariance in Kernel Based Method,” Advance in Kernel Method-Support Vector Learning, MIT Press, Cambridge, MA, pp. 86-116, 1999.
[6]  C. Cortes and V. Vapnik, “Support Vector Networks,” Machine Learning, Vol. 20, pp. 273-297, 1995.
[7]  B. Chakraborty and P. Chaudhuri, “On the Use of Genetic Algorithm with Elitism in Robust and Nonparametric Multivariate Analysis,” Austrian Journal of Statistics, Vol. 32, pp. 13-27, 2003.
[8]  M. Clerc, “The Swarm and the Queen: Towards a Deterministic and Adaptive Particle Swarm Optimization,” in Proceedings of the Congress on Evolutionary Computation, Washington DC, USA, 3, pp. 1951-1957, 1999.
[9]  R. Courant and D. Hilbert, Methods of Mathematical Physics, Interscience, New York, 1953.
[10]  T. Downs, K. Gates, and A. Masters, “Exact Simplification of Support Vector Solutions,” Journal of Machine Learning Research, Vol. 2, pp. 293-297, 2001.
[11]  S. Esquivel and C. Coello, “On the Use of Particle Swarm Optimization with Multimodal Functions,” in Proceedings of IEEE Congress on Evolutionary Computation, pp. 1130-1136, 2003.
[12]  R. Eberhart and Y. Shi, “Comparison between Genetic Algorithms and Particle Swarm Optimization,” in Proceedings of the Seventh Annual Conference on Evolutionary Programming, pp. 611-619, 1998.
[13]  J. Ecker and M. Kupferschmid, Introduction to Operations Research, John Wiley & Sons, New York, 1988.
[14]  J. Guo, N. Takahashi, and T. Nishi, “A Learning Algorithm for Improving the Classification Speed of Support Vector Machines,” in Proceedings of 2005 European Conference on Circuit Theory and Design, 2005.
[15]  T. Graepel, R. Herbrich, and J. Shawe-Taylor, “Generalisation Error Bounds for Sparse Linear Classifiers,” in Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, pp. 298-303, 2000.
[16]  D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison Wesley, New York, 1989.
[17]  N. Higashi and H. Iba, “Particle Swarm Optimization with Gaussian Mutation,” in Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, Indiana, USA, pp. 72-79, 2003.
[18]  T. Joachims, “Estimating the Generalization Performance of An SVM Efficiently,” in Proceedings of the 17th International Conference on Machine Learning (ICML-00), pp. 431-438, 2000.
[19]  T. Krink, J. Vesterstrøm, and J. Riget, “Particle Swarm Optimization with Spatial Particle Extension,” in Proceedings of the Fourth Congress on Evolutionary Computation, 2002.
[20]  J. Kennedy and R. Eberhart, “Particle Swarm Optimization,” in Proceedings of IEEE International Conference on Neural Networks, Perth, Australia, Vol. 4, pp. 1942-1948, 1995.
[21]  J. Kennedy and R. Eberhart, Swarm Intelligence, Morgan Kaufmann, 2001.
[22]  J. Kennedy and R. Eberhart, “A Discrete Binary Version of The Particle Swarm Algorithm,” in Proceedings of the 1997 Conference on Systems, Man, and Cybernetics, pp. 4104-4109, 1997.
[23]  Y. LeCun, L. Botou, L. Jackel, H. Drucker, C. Cortes, J. Denker, I. Guyon, U. Muller, E. Sackinger, P. Simard, and V. Vapnik, “Learning Algorithms for Classification: A Comparison on Handwritten Digit Recognition,” Neural Network, pp. 261-276, 1995.
[24]  C. Liu, K. Nakashima, H. Sako, and H. Fujisawa, “Handwritten Digit Recognition: Benchmarking of State-of-the-Art Techniques,” Pattern Recognition, Vol. 36, pp. 2271-2285, 2003.
[25]  Q. Li, L. Jiao, and Y. Hao,  “Adaptive Simplification of Solution for Support Vector Machine,” Pattern Recognition, Vol. 40, No. 3, pp. 972-980, 2007.
[26]  H. J. Lin and J. P. Yeh, “Optimal Reduction of Solutions for Support Vector Machines,” Applied Mathematics and Computation, Vol. 214, Issue 2, pp. 329-335,  2009.
[27] M. Løvberg and T. Krink, “Extending Particle Swarm Optimizers with Self-Organized Criticality,” in Proceedings of the Fourth Congress on Evolutionary Computation, Vol. 2, pp. 1588-1593, 2002.
[28]  M. Løvberg, “Improving Particle Swarm Optimization by Hybridization of Stochastic Search Heuristics and Self-Organized Criticality,” Master's Thesis, Department of Computer Science, University of Aarhus, Denmark, 2002.
[29]  K. J. Lang and M. J. Witbrock, “Learning to tell two spirals apart,” in Proceedings of 1989 Connectionist Models Summer School, pp. 52-61, 1989.
[30]  G. Mak, G. Ratzer, and H. Vangheluwe, “The Implementation of Support Vector Machines Using the Sequential Minimal Optimization Algorithm,” Master's Thesis, School of Computer Science, McGill University, Montreal, Canada, 2000.
[31]  E. Osuna, R. Freund, and F. Girosi, “Support Vector Machines: Training and Applications,” Technical Report: AIM-1602, MIT A. I. Lab., 1996.
[32]  J. Platt, “Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines,” Technical report, Microsoft Research, Redmond, 1998.
[33]  M. Pontil and A. Verri, “Properties of Support Vector Machines,” Neural Computation, Vol. 10, pp. 955-974, 1998.
[34]  A. P. Ruszczynski, Nonlinear Optimization, Princeton University Press, p. 160, 2006.
[35]  R. Reynolds, B. Peng, and J. Brewster, “Cultural Swarms II: Virtual Algorithm Emergence,” in Proceedings of IEEE Congress on Evolutionary Computation, Canberra, Australia, pp. 1972-1979, 2003.
[36]  Y. Shi and R. Eberhart, “Fuzzy Adaptive Particle Swarm Optimization,” in Proceedings of Congress on Evolutionary Computation, Seoul, S. Korea, 2001.
[37]  T. Thies and F. Weber, “Optimal Reduced-Set Vectors for Support Vector Machines With a Quadratic Kernel,” Neural Computation, Vol. 16, pp. 1769-1777, 2004.
[38]  V. N. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, New York, 1995.
[39]  V. N. Vapnik, Statistical Learning Theory, Wiley, New York, 1998.
[40]  F. Van den Bergh, “An Analysis of Particle Swarm Optimizers,” PhD Thesis, Department of Computer Science, University of Pretoria, South Africa, 2002.
[41]  K. Veeramachaneni, T. Peram, C. Mohan, and L. Osadciw, “Optimization Using Particle Swarm with Near Neighbor Interactions,” Lecture Notes in Computer Science, Vol. 2723, pp. 110-121, 2003.
Full-Text Availability
On campus
The print copy will be made publicly available 4 years after submission of the authorization form.
The author agrees to make the full electronic text publicly available on campus.
The on-campus electronic copy will be made publicly available 4 years after submission of the authorization form.
Off campus
Authorization granted.
The off-campus electronic copy will be made publicly available 4 years after submission of the authorization form.
