Tamkang University Chueh Sheng Memorial Library (TKU Library)
System ID: U0002-2107200909183400
Thesis Title (Chinese): Optimization Algorithm Design for the Support Vector Machine Classifier and the Flexible Job-shop Scheduling Problem
Thesis Title (English): Optimization Algorithms Design for Support Vector Machine Classifier and Flexible Job-shop Scheduling Problem
Institution: Tamkang University
Department (Chinese): Doctoral Program, Department of Electrical Engineering
Department (English): Department of Electrical Engineering
Academic Year: 97 (ROC calendar; 2008-2009)
Semester: 2
Year of Publication: 98 (ROC calendar; 2009)
Author (Chinese Name): 呂俊良
Author (English Name): Chun-Liang Leu
Student ID: 889350020
Degree: Doctoral
Language: English
Date of Oral Defense: 2009-07-16
Number of Pages: 150
Oral Defense Committee:
Advisor - 翁慶昌
Member - 蘇順豐
Member - 周永山
Member - 余繁
Member - 陶金旺
Member - 許陳鑑
Member - 龔宗鈞
Member - 翁慶昌
Chinese Keywords: Support Vector Machine classifier; Flexible Job-shop Scheduling Problem; Dynamic Condensed Nearest Neighbor; Multi-Objective Particle Swarm Optimization; Monte-Carlo Tree Search
English Keywords: Support Vector Machine (SVM) Classifier; Flexible Job-shop Scheduling Problems (FJSP); Dynamic Condensed Nearest Neighbor (DCNN); Multi-Objective Particle Swarm Optimization (MOPSO); Monte-Carlo Tree Search (MCTS)
Subject Classification:
Chinese Abstract: This dissertation proposes two single-objective hybrid optimization algorithms to improve the classification accuracy of the Support Vector Machine (SVM) classifier, and two multi-objective optimization algorithms to solve the Flexible Job-shop Scheduling Problem (FJSP). For the design of the SVM classifier, a Dynamic Condensed Nearest Neighbor (DCNN) algorithm that is independent of the input order of the training data is proposed; it reduces the influence that redundant or noisy training samples exert on the classification process. A genetic algorithm and a particle swarm optimization algorithm are then used, respectively, to simultaneously optimize three items: the construction of representative prototypes, the selection of features, and the setting of the SVM kernel parameters. The resulting GA-DCNN-SVM and PSO-DCNN-SVM hybrid algorithms are compared with previously published algorithms; experiments on several UCI benchmark datasets show that the proposed hybrids improve the classification accuracy of the SVM and outperform existing methods. For the FJSP, a multi-objective particle swarm optimization algorithm is first proposed to search for the multi-objective optimal solutions of the problem. In this method, an effective encoding scheme splits each particle's real value into an integer part and a decimal part, representing machine selection and operation order, respectively, and a sequential-assignment decoding mechanism is proposed to obtain feasible candidate schedules, so that the standard real-valued PSO framework combined with a mutation operator can be applied directly to the FJSP. Finally, a new multi-objective Monte-Carlo tree search algorithm is proposed: a sequential machine-and-operation assignment hybrid encoding transforms the originally evolution-based formulation into a tree-search problem, and the original UCT formula is modified with the NSGA-II non-dominated sorting computation so that it applies to multi-objective problems; the proposed multi-objective Monte-Carlo tree search algorithm is then applied to solve the FJSP. Comparisons with other algorithms verify that both proposed methods can solve the FJSP effectively.
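The data-reduction step described in the abstract builds on the classic Condensed Nearest Neighbor idea. As a rough sketch only, here is Hart's original order-dependent CNN rule, not the order-independent DCNN variant the dissertation proposes; all names are illustrative:

```python
# Minimal sketch of Hart's classic Condensed Nearest Neighbor (CNN) rule,
# the data-reduction idea that the dissertation's DCNN extends. Unlike
# DCNN, this version depends on the presentation order of the training
# data; it is shown only to illustrate prototype condensation.

def nearest_label(prototypes, x):
    """Label of the prototype nearest to x (1-NN, squared Euclidean)."""
    best = min(prototypes,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return best[1]

def cnn_reduce(data):
    """data: list of (feature_tuple, label) pairs.
    Returns a condensed prototype subset that still classifies
    every training sample correctly with 1-NN."""
    store = [data[0]]                       # seed with the first sample
    changed = True
    while changed:                          # repeat until one clean pass
        changed = False
        for x, y in data:
            if nearest_label(store, x) != y:
                store.append((x, y))        # absorb misclassified samples
                changed = True
    return store
```

On a toy one-dimensional dataset with two well-separated classes, the condensed set keeps only a few boundary prototypes while still classifying all training points correctly, which is the effect the abstract describes as reducing redundant instances before SVM training.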
English Abstract: In this dissertation, two types of single-objective hybrid model are proposed to improve the classification rate of the Support Vector Machine (SVM) classifier, and two effective multi-objective optimization algorithms are developed to solve the Flexible Job-shop Scheduling Problem (FJSP). In the optimization design for the SVM classifier, an order-independent data-reduction algorithm, called the Dynamic Condensed Nearest Neighbor (DCNN) rule, is proposed to adaptively construct prototypes from the training dataset and to reduce the redundant or noisy instances in the classification process for the SVM classifier. Furthermore, two hybrid models based on the Genetic Algorithm (GA) and the Particle Swarm Optimization (PSO) algorithm are adopted to simultaneously optimize the prototype construction, the feature selection, and the SVM kernel parameter settings. Several UCI benchmark datasets are used to compare the proposed GA-DCNN-SVM and PSO-DCNN-SVM approaches with previously published methods. The experimental results show that the developed hybrid models outperform the existing methods and improve the classification accuracy of the support vector machine. In the optimization design for the FJSP, an evolutionary Multi-Objective Particle Swarm Optimization (MOPSO) approach is adopted and successfully applied to find the Pareto-optimal solutions of the FJSP. An effective encoding method is proposed to divide each continuous real value into an integer part and a decimal part, which indicate the machine selection and the operation order, respectively. By using the sequential-assignment decoding mechanism, a feasible schedule is always obtained, so the standard continuous PSO algorithm, incorporated with a mutation operator, can be adopted. Finally, a Multi-Objective Monte-Carlo Tree Search (MOMCTS) algorithm is proposed to solve the FJSP.
The Sequential Operation-Machine Assignment (SOMA) encoding scheme is first introduced to always produce feasible solutions, and the mapping of the evolution-based FJSP to a general tree-search structure via SOMA is then completed. The formal UCT (Upper Confidence Bound applied to Trees) algorithm is modified with a non-dominated sorting strategy to suit the characteristics of multi-objective problems, and the Pareto-optimal solutions of the FJSP can thereby be obtained. Compared with several existing algorithms, the experimental results illustrate the effectiveness of both proposed methods for the FJSP.
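The particle encoding described above can be sketched as follows. This is a minimal illustration of the stated idea only: the integer part of each real value selects a machine and the decimal part orders the operations. The function and variable names are assumptions, not the dissertation's exact SDAI scheme:

```python
# Illustrative decoding of a real-valued FJSP particle: for each
# operation, the integer part of its value picks a machine and the
# fractional part serves as a priority key for the operation order.
# Simplified sketch; not the dissertation's exact SDAI method.

def decode(particle, n_machines):
    """particle: one non-negative real per operation.
    Returns (machine index per operation, operation visiting order)."""
    machines = [int(v) % n_machines for v in particle]   # integer part -> machine
    fractional = [v - int(v) for v in particle]          # decimal part -> priority
    order = sorted(range(len(particle)), key=lambda i: fractional[i])
    return machines, order
```

For example, a particle [1.25, 0.75, 2.10] with three machines maps operation 0 to machine 1, operation 1 to machine 0, and operation 2 to machine 2, and visits the operations in the order 2, 0, 1 (ascending fractional parts). A sequential-assignment decoder then places each operation on its machine in that order, which is what makes every decoded candidate a feasible schedule.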
Table of Contents
Abstract (in Chinese) i
Abstract (in English) ii
Contents iii
List of Figures vi
List of Tables xii

Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Review of Research Works 2
1.2.1 Support Vector Machine (SVM) Classifier 2
1.2.2 Flexible Job-shop Scheduling Problem (FJSP) 4
1.3 Summary and Outline of This Dissertation 5
1.3.1 GA-based Single-Objective Hybrid Model for SVM 6
1.3.2 PSO-based Single-Objective Hybrid Model for SVM 6
1.3.3 Multi-Objective PSO Algorithm for FJSP 7
1.3.4 Multi-Objective MCTS Algorithm for FJSP 7

Chapter 2 GA-based Single-Objective Hybrid Model for SVM 8
2.1 Basic SVM Classifier 9
2.2 Condensed Nearest Neighbor (CNN) Algorithm 11
2.3 Genetic Algorithm 12
2.4 Methods 14
2.4.1 Prototype Voting Scheme (PVS) 14
2.4.2 Dynamic CNN (DCNN) Algorithm 16
2.4.3 The Proposed GA-DCNN-SVM Hybrid Model 20
2.5 Experimental Results and Concluding Remarks 26

Chapter 3 PSO-based Single-Objective Hybrid Model for SVM 30
3.1 Basic Single-Objective Particle Swarm Optimization (PSO) 31
3.2 Binary Particle Swarm Optimization (BPSO) 32
3.3 The Proposed Hybrid Model for SVM 33
3.3.1 Particle Representation 34
3.3.2 Fitness Definition 34
3.3.3 Disturbance Strategy for PSO Operation 35
3.3.4 The Proposed PSO-DCNN-SVM Hybrid Model 36
3.4 Experimental Results and Concluding Remarks 39

Chapter 4 Multi-Objective PSO Algorithm for FJSP 44
4.1 Formulation of the FJSP 44
4.2 Multi-Objective Particle Swarm Optimization (MOPSO) 46
4.3 Particle Representation and SDAI Decoding Method 50
4.3.1 Particle Representation 50
4.3.2 Sorting Decimal Assignment Integer (SDAI) Decoding Method 52
4.4 The Proposed MOPSO Algorithm for FJSP 57
4.5 Experimental Results and Concluding Remarks 63
4.5.1 Problem 8x8 63
4.5.2 Problem 10x10 64
4.5.3 Problem 15x10 66
4.5.4 Problem 4x5 with Release Date 66
4.5.5 Problem 10x7 with Release Date 67
4.5.6 Problem 15x10 with Release Date 69
4.5.7 Concluding Remarks 70

Chapter 5 Multi-Objective MCTS Algorithm for FJSP 71
5.1 MCTS and UCT 72
5.1.1 Structure of Monte-Carlo Tree Search (MCTS) 72
5.1.2 The UCT Algorithm 73
5.2 Representation and SOMA Method 75
5.3 Problem Transformation 80
5.4 Modification of UCT with Non-dominated Sorting Strategy 82
5.5 The Path Random Search (PRS) Algorithm 85
5.6 The Proposed MOMCTS Algorithm for FJSP 88
5.7 Experimental Results and Concluding Remarks 91
5.7.1 Problem 8x8 91
5.7.2 Problem 10x10 93
5.7.3 Problem 15x10 94
5.7.4 Problem 4x5 with Release Date 95
5.7.5 Problem 10x7 with Release Date 96
5.7.6 Problem 15x10 with Release Date 97
5.7.7 Concluding Remarks 98

Chapter 6 Conclusions and Future Work 99

Appendix A Some Results of Gantt Charts for Solving FJSP 101
A.1 Problem 8×8 and Some Results of Gantt Charts 103
A.2 Problem 10×10 and Some Results of Gantt Charts 110
A.3 Problem 15×10 and Some Results of Gantt Charts 118
A.4 Problem 4×5 with Release Date and Some Results of Gantt Charts 123
A.5 Problem 10×7 with Release Date and Some Results of Gantt Charts 127
A.6 Problem 15×10 with Release Date and Some Results of Gantt Charts 134

Appendix B Construction of Tree Search Topology 139

References 144

List of Figures

Figure 2.1 The flowchart of genetic algorithm. 13
Figure 2.2 Pseudo code of the PVS algorithm. 16
Figure 2.3(a) The result of the original dataset. 19
Figure 2.3(b) The result of merged rate 0.8. 19
Figure 2.3(c) The result of merged rate 0.85. 19
Figure 2.3(d) The result of merged rate 0.9. 19
Figure 2.3(e) The result of merged rate 0.95. 19
Figure 2.3(f) The result of merged rate 1.0. 19
Figure 2.4 The flowchart of our proposed hybrid GA-DCNN-SVM model. 23
Figure 3.1 The flowchart of our proposed hybrid PSO-DCNN-SVM model. 38
Figure 4.1 Case A in the external repository. 47
Figure 4.2 Case B in the external repository. 47
Figure 4.3 Case C in the external repository. 47
Figure 4.4 Case D in the external repository. 47
Figure 4.5 Structure of proposed particle representation of each dimension. 51
Figure 4.6 One possible encoding representation in Table 4.1. 52
Figure 4.7(a) The SDAI scheme in Step 1. 53
Figure 4.7(b) The Gantt chart in Step 1. 53
Figure 4.8(a) The SDAI scheme in Step 2. 53
Figure 4.8(b) The Gantt chart in Step 2. 53
Figure 4.9(a) The SDAI scheme in Step 3. 54
Figure 4.9(b) The Gantt chart in Step 3. 54
Figure 4.10(a) The SDAI scheme in Step 4. 54
Figure 4.10(b) The Gantt chart in Step 4. 55
Figure 4.11(a) The SDAI scheme in Step 5. 55
Figure 4.11(b) The Gantt chart in Step 5. 55
Figure 4.12(a) The SDAI scheme in Step 6. 56
Figure 4.12(b) The Gantt chart in Step 6. 56
Figure 4.13(a) The SDAI scheme in Step 7. 56
Figure 4.13(b) The Gantt chart in Step 7. 57
Figure 4.14 The system flowchart of the proposed MOPSO algorithm. 62
Figure 5.1 The scheme of Monte-Carlo tree search. 73
Figure 5.2 The Gantt chart in Step 7. 79
Figure 5.3 The system flowchart of the proposed PRS algorithm. 87
Figure 5.4 The system architecture of the proposed MOMCTS algorithm. 90

Figure A1 Optimization solution 1*(a) of problem 8×8 (F1=75, F2=12,F3=15).
█ Obtained by MOPSO □ Obtained by MOMCTS 104
Figure A2 Optimization solution 1*(b) of problem 8×8 (F1=75, F2=12,F3=15).
█ Obtained by MOPSO □ Obtained by MOMCTS 104
Figure A3 Optimization solution 1*(c) of problem 8×8 (F1=75, F2=12,F3=15).
□ Obtained by MOPSO █ Obtained by MOMCTS 105
Figure A4 Optimization solution 1*(d) of problem 8×8 (F1=75, F2=12,F3=15).
□ Obtained by MOPSO █ Obtained by MOMCTS 105
Figure A5 Optimization solution 2*(a) of problem 8×8 (F1=73, F2=13,F3=16).
█ Obtained by MOPSO □ Obtained by MOMCTS 106
Figure A6 Optimization solution 2*(b) of problem 8×8 (F1=73, F2=13,F3=16).
█ Obtained by MOPSO █ Obtained by MOMCTS 106
Figure A7 Optimization solution 2*(c) of problem 8×8 (F1=73, F2=13,F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 107
Figure A8 Optimization solution 2*(d) of problem 8×8 (F1=73, F2=13,F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 107
Figure A9 Optimization solution 3*(a) of problem 8×8 (F1=77, F2=12,F3=14).
█ Obtained by MOPSO █ Obtained by MOMCTS 108
Figure A10 Optimization solution 3*(b) of problem 8×8 (F1=77, F2=12,F3=14).
□ Obtained by MOPSO █ Obtained by MOMCTS 108
Figure A11 Optimization solution 4*(a) of problem 8×8 (F1=77, F2=11,F3=16).
█ Obtained by MOPSO █ Obtained by MOMCTS 109
Figure A12 Optimization solution 4*(b) of problem 8×8 (F1=77, F2=11,F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 109
Figure A13 Optimization solution 1*(a) of problem 10×10 (F1=41, F2=7,F3=8).
█ Obtained by MOPSO □ Obtained by MOMCTS 111
Figure A14 Optimization solution 1*(b) of problem 10×10 (F1=41, F2=7,F3=8).
█ Obtained by MOPSO □ Obtained by MOMCTS 111
Figure A15 Optimization solution 1*(c) of problem 10×10 (F1=41, F2=7,F3=8).
□ Obtained by MOPSO █ Obtained by MOMCTS 112
Figure A16 Optimization solution 1*(d) of problem 10×10 (F1=41, F2=7,F3=8).
█ Obtained by MOPSO █ Obtained by MOMCTS 112
Figure A17 Optimization solution 2*(a) of problem 10×10 (F1=42, F2=5,F3=8).
█ Obtained by MOPSO □ Obtained by MOMCTS 113
Figure A18 Optimization solution 2*(b) of problem 10×10 (F1=42, F2=5,F3=8).
█ Obtained by MOPSO █ Obtained by MOMCTS 113
Figure A19 Optimization solution 2*(c) of problem 10×10 (F1=42, F2=5,F3=8).
□ Obtained by MOPSO █ Obtained by MOMCTS 114
Figure A20 Optimization solution 3*(a) of problem 10×10 (F1=43, F2=5,F3=7).
█ Obtained by MOPSO □ Obtained by MOMCTS 114
Figure A21 Optimization solution 3*(b) of problem 10×10 (F1=43, F2=5,F3=7).
□ Obtained by MOPSO █ Obtained by MOMCTS 115
Figure A22 Optimization solution 3*(c) of problem 10×10 (F1=43, F2=5,F3=7).
□ Obtained by MOPSO █ Obtained by MOMCTS 115
Figure A23 Optimization solution 4*(a) of problem 10×10 (F1=42, F2=6,F3=7).
█ Obtained by MOPSO □ Obtained by MOMCTS 116
Figure A24 Optimization solution 4*(b) of problem 10×10 (F1=42, F2=6,F3=7).
█ Obtained by MOPSO □ Obtained by MOMCTS 116
Figure A25 Optimization solution 4*(c) of problem 10×10 (F1=42, F2=6,F3=7).
□ Obtained by MOPSO █ Obtained by MOMCTS 117
Figure A26 Optimization solution 4*(d) of problem 10×10 (F1=42, F2=6,F3=7).
█ Obtained by MOPSO █ Obtained by MOMCTS 117
Figure A27 Optimization solution 1*(a) of problem 15×10 (F1=91, F2=11,F3=11).
█ Obtained by MOPSO □ Obtained by MOMCTS 119
Figure A28 Optimization solution 1*(b) of problem 15×10 (F1=91, F2=11,F3=11).
█ Obtained by MOPSO █ Obtained by MOMCTS 119
Figure A29 Optimization solution 1*(c) of problem 15×10 (F1=91, F2=11,F3=11).
□ Obtained by MOPSO █ Obtained by MOMCTS 120
Figure A30 Optimization solution 1*(d) of problem 15×10(F1=91, F2=11,F3=11).
□ Obtained by MOPSO █ Obtained by MOMCTS 120
Figure A31 Optimization solution 2*(a) of problem 15×10(F1=93, F2=10,F3=11).
█ Obtained by MOPSO □ Obtained by MOMCTS 121
Figure A32 Optimization solution 2*(b) of problem 15×10(F1=93, F2=10,F3=11).
█ Obtained by MOPSO □ Obtained by MOMCTS 121
Figure A33 Optimization solution 2*(c) of problem 15×10(F1=93, F2=10,F3=11).
□ Obtained by MOPSO █ Obtained by MOMCTS 122
Figure A34 Optimization solution 2*(d) of problem 15×10(F1=93, F2=10,F3=11).
█ Obtained by MOPSO █ Obtained by MOMCTS 122
Figure A35 Optimization solution 1*(a) of problem 4×5(F1=31, F2=9,F3=16).
█ Obtained by MOPSO □ Obtained by MOMCTS 124
Figure A36 Optimization solution 1*(b) of problem 4×5(F1=31, F2=9,F3=16).
█ Obtained by MOPSO █ Obtained by MOMCTS 124
Figure A37 Optimization solution 1*(c) of problem 4×5(F1=31, F2=9,F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 124
Figure A38 Optimization solution 2*(a) of problem 4×5(F1=32, F2=8,F3=16).
█ Obtained by MOPSO █ Obtained by MOMCTS 125
Figure A39 Optimization solution 2*(b) of problem 4×5(F1=32, F2=8,F3=16).
█ Obtained by MOPSO █ Obtained by MOMCTS 125
Figure A40 Optimization solution 2*(c) of problem 4×5(F1=32, F2=8,F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 125
Figure A41 Optimization solution 2*(d) of problem 4×5(F1=32, F2=8, F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 126
Figure A42 Optimization solution 3*(a) of problem 4×5(F1=33, F2=7, F3=16).
█ Obtained by MOPSO █ Obtained by MOMCTS 126
Figure A43 Optimization solution 3*(b) of problem 4×5(F1=33, F2=7, F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 126
Figure A44 Optimization solution 1*(a) of problem 10×7(F1=60, F2=12, F3=16).
█ Obtained by MOPSO □ Obtained by MOMCTS 128
Figure A45 Optimization solution 1*(b) of problem 10×7(F1=60, F2=12, F3=16).
█ Obtained by MOPSO □ Obtained by MOMCTS 128
Figure A46 Optimization solution 1*(c) of problem 10×7(F1=60, F2=12, F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 129
Figure A47 Optimization solution 1*(d) of problem 10×7(F1=60, F2=12, F3=16).
□ Obtained by MOPSO █ Obtained by MOMCTS 129
Figure A48 Optimization solution 2*(a) of problem 10×7(F1=61, F2=11, F3=15).
█ Obtained by MOPSO █ Obtained by MOMCTS 130
Figure A49 Optimization solution 2*(b) of problem 10×7(F1=61, F2=11, F3=15).
█ Obtained by MOPSO █ Obtained by MOMCTS 130
Figure A50 Optimization solution 2*(c)of problem 10×7(F1=61, F2=11, F3=15).
□ Obtained by MOPSO █ Obtained by MOMCTS 131
Figure A51 Optimization solution 2*(d) of problem 10×7(F1=61, F2=11, F3=15).
□ Obtained by MOPSO █ Obtained by MOMCTS 131
Figure A52 Optimization solution 3*(a)of problem 10×7(F1=62, F2=10, F3=15).
█ Obtained by MOPSO □ Obtained by MOMCTS 132
Figure A53 Optimization solution 3*(b) of problem 10×7(F1=62, F2=10, F3=15).
█ Obtained by MOPSO □ Obtained by MOMCTS 132
Figure A54 Optimization solution 3*(c)of problem 10×7(F1=62, F2=10, F3=15).
□ Obtained by MOPSO █ Obtained by MOMCTS 133
Figure A55 Optimization solution 3*(d) of problem 10×7(F1=62, F2=10, F3=15).
□ Obtained by MOPSO █ Obtained by MOMCTS 133
Figure A56 Optimization solution 1*(a) of problem 15×10(F1=91, F2=11, F3=23).
█ Obtained by MOPSO □ Obtained by MOMCTS 135
Figure A57 Optimization solution 1*(b) of problem 15×10(F1=91, F2=11, F3=23).
█ Obtained by MOPSO □ Obtained by MOMCTS 135
Figure A58 Optimization solution 1*(c) of problem 15×10(F1=91, F2=11, F3=23).
□ Obtained by MOPSO █ Obtained by MOMCTS 136
Figure A59 Optimization solution 1*(d) of problem 15×10(F1=91, F2=11, F3=23).
□ Obtained by MOPSO █ Obtained by MOMCTS 136
Figure A60 Optimization solution 2*(a) of problem 15×10(F1=93, F2=10, F3=23).
█ Obtained by MOPSO □ Obtained by MOMCTS 137
Figure A61 Optimization solution 2*(b) of problem 15×10(F1=93, F2=10, F3=23).
█ Obtained by MOPSO □ Obtained by MOMCTS 137
Figure A62 Optimization solution 2*(c) of problem 15×10(F1=93, F2=10, F3=23).
□ Obtained by MOPSO █ Obtained by MOMCTS 138
Figure A63 Optimization solution 2*(d) of problem 15×10(F1=93, F2=10, F3=23).
□ Obtained by MOPSO █ Obtained by MOMCTS 138
Figure B1 Transformation to tree search structure via SOMA in Step 1. 139
Figure B2 Transformation to tree search structure via SOMA in Step 2. 139
Figure B3 Transformation to tree search structure via SOMA in Step 3. 140
Figure B4 Transformation to tree search structure via SOMA in Step 4. 140
Figure B5 Transformation to tree search structure via SOMA in Step 5. 141
Figure B6 Transformation to tree search structure via SOMA in Step 6. 141
Figure B7 Transformation to tree search structure via SOMA in Step 7. 142
Figure B8 Transformation to tree search structure via SOMA in Step 8. 143

List of Tables

Table 2.1 The numerical values of the prototype data points with five values of merged rate (0.8, 0.85, 0.9, 0.95, and 1.0). 20
Table 2.2 Datasets from the UCI Machine Learning Repository. 26
Table 2.3 Experimental results for Heart disease dataset using GA-DCNN-SVM algorithm and GA-SVM method. 28
Table 2.4 Experimental results summary of GA-DCNN-SVM algorithm and GA-SVM method on six UCI datasets. 28
Table 3.1 The particle i is comprised of four parts: …, …, …, and the feature mask. 34
Table 3.2 Datasets from the UCI Machine Learning Repository. 39
Table 3.3 Parameters used for particle swarm optimization and genetic algorithm. 41
Table 3.4 Experimental results for the Heart disease dataset using the PSO-DCNN-SVM approach and the GA-SVM method proposed by Huang et al. 42
Table 3.5 Experimental results summary of the PSO-DCNN-SVM model and the PSO-SVM and GA-SVM methods without DCNN on six UCI datasets. 42
Table 3.6 Experimental results summary comparing the PSO and GA algorithms under the DCNN framework on six UCI datasets. 43
Table 4.1 An example of the FJSP with 3 jobs, 2 machines, and 7 operations 52
Table 4.2 Symbols used in the MOPSO algorithm procedure statement. 58
Table 4.3 Comparison of results on problem 8×8 with 27 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 64
Table 4.4 Comparison of results on problem 10×10 with 30 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 65
Table 4.5 Comparison of results on problem 15×10 with 56 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 66
Table 4.6 Comparison of results on problem 4×5 with 12 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 67
Table 4.7 Comparison of results on problem 10×7 with 29 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 68
Table 4.8 Comparison of results on problem 15×10 with 56 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 69
Table 5.1 Initial candidate set for SOMA. Candidate: "○", Selection: "△". 75
Table 5.2 The selection set and candidate set. 76
Table 5.3 The selection set and candidate set. 76
Table 5.4 The selection set and candidate set. 77
Table 5.5 The selection set and candidate set. 77
Table 5.6 The selection set and candidate set. 78
Table 5.7 The selection set and candidate set. 78
Table 5.8 The selection set. 79
Table 5.9 An example of computing the value for the initialization state. 83
Table 5.10 An example of computing the value for the simulation state. 84
Table 5.11 Symbols used in the algorithm procedure statement. 85
Table 5.12 Comparison of results on problem 8×8 with 27 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 92
Table 5.13 Comparison of results on problem 10×10 with 30 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 93
Table 5.14 Comparison of results on problem 15×10 with 56 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 94
Table 5.15 Comparison of results on problem 4×5 with 12 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 95
Table 5.16 Comparison of results on problem 10×7 with 29 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 96
Table 5.17 Comparison of results on problem 15×10 with 56 operations. The symbol ‘x’ indicates that the authors did not provide the Gantt chart in this article. 97
Table A1 Summary of Gantt chart diversity by MOPSO and MOMCTS. 102
Table A2 Problem 8 jobs × 8 machines. 103
Table A3 Problem 10 jobs × 10 machines. 110
Table A4 Problem 15 jobs × 10 machines. 118
Table A5 Problem 4 jobs × 5 machines with release date. 123
Table A6 Problem 10 jobs × 7 machines with release date. 127
Table A7 Problem 15 jobs × 10 machines with release date. 134
References
[1] K. Deb, Optimization for Engineering Design: Algorithms and Examples, New Delhi: Prentice-Hall, 1995.
[2] K. Deb, Multi-Objective Optimization Using Evolutionary Algorithms, John Wiley, 2001.
[3] V.N. Vapnik, Statistical Learning Theory, Wiley, New York, 1998.
[4] D. L. Luo, S. X. Wu, M.Q. Li, and Z. Yang, “Ant colony optimization with local search applied to the flexible job shop scheduling problems,” ICCCAS conference in Communications, Circuits and Systems, 2008, pp. 1015-1020.
[5] W. Xia and Z. Wu, “An effective hybrid optimization approach for multi-objective flexible job-shop scheduling problems,” Computers and Industrial Engineering, vol. 48, 2005, pp. 409-425.
[6] M. Bressan and J. Vitri, “Nonparametric discriminant analysis and nearest neighbor classification,” Pattern Recogn. Lett., vol. 24, 2003, pp. 2743–2749.
[7] F. Pernkopf, “Bayesian network classifiers versus selective k-NN classifier,” Pattern Recogn. Lett., vol. 38, 2005, pp. 1–10.
[8] J. K. Sing, D. K. Basu, M. Nasipuri, and M. Kundu, “Improved k-means algorithm in the design of RBF neural networks,” TENCON 2003. Conference on Convergent Technologies for Asia-Pacific Region, Vol. 2, Oct. 2003, pp. 15-17.
[9] V.N. Vapnik, The Nature of Statistical Learning Theory, New York: Springer-Verlag, 1995.
[10] J. Kamruzzaman and R.K. Begg, “Support vector machines and other pattern recognition approaches to the diagnosis of cerebral palsy gait,” IEEE Trans. Biomedical Engineering, vol. 53, no. 12, 2006, pp. 2479-2490.
[11] M. Jianmin, M.N. Nguyen, and J.C. Rajapakse, “Gene classification using codon usage and support vector machines,” IEEE Trans. Computational Biology and Bioinformatics, vol. 6, no. 1, 2009, pp. 134-143.
[12] W.M. Campbell, D.E. Sturim, and D.A. Reynolds, “Support vector machines using GMM supervectors for speaker verification,” IEEE Signal Processing Letter, vol. 13, no. 5, 2006, pp. 308-311.
[13] V. Wan and S. Renals, “Speaker verification using sequence discriminant support vector machines,” IEEE Trans. Speech and Audio Processing, vol. 13, no. 2, 2005, pp. 203-210.
[14] I. Dino, L.H. Lee, V.P. Kallimani and R. RajKumar, “Text document preprocessing with the bayes formula for classification using the support vector machine,” IEEE Trans. Knowledge and Data Engineering, vol. 20, no. 8, 2008, pp. 1264-1272.
[15] Y. Dong, M. Tu, Z. Xia, and G. Xing, “An optimization method for selecting parameters in support vector machines,” ICMLA 2007 Sixth International Conference on Machine Learning and Applications, Dec. 2007, pp. 1-6.
[16] V. Solo, “Selection of tuning parameters for support vector machines,” IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '05), vol. 5, March 2005, pp. 237-240.
[17] K.Z. Mao, “Feature subset selection for support vector machines through discriminative function pruning analysis,” IEEE Trans. on Syst., Man, Cybern., Vol. 34, no. 1, 2004, pp. 60-67.
[18] Z. Zhu, Y.S. Ong and M. Dash, “Wrapper-Filter feature selection algorithm using a memetic framework,” IEEE Trans. on Syst., Man, Cybern., Vol. 37, no. 1, 2007, pp. 70-75.
[19] A.V. Anghelescu and I.B. Muchnik, “Combinatorial PCA and SVM methods for feature selection in learning classifications,” IEEE International Conference on Integration of Knowledge Intensive Multi-Agent Systems, Sept. 2003, pp.491-496.
[20] H. Frohlich and A. Zell, “Feature subset selection for support vector machines by incremental regularized risk minimization,” IEEE International Joint Conference on Neural Networks, vol. 3, July 2004, pp.2041-2045.
[21] N. Zhou and L. Wang, “A novel support vector machine with class-dependent features for biomedical data,” IEEE International Conference on Systems, Man and Cybernetics, vol. 2, Oct. 2006, pp.1666-1670.
[22] C.L. Huang and C.J. Wang, “A GA-based feature selection and parameters optimization for support vector machines,” Expert Systems with Applications, vol. 31, Aug. 2006, pp. 231-240.
[23] C.L. Huang and J.F. Dun, “A distributed PSO-SVM hybrid system with feature selection and parameter optimization,” Applied Soft Computing, vol. 8, Sep. 2008, pp. 1381-1391.
[24] J.R. Cano, F. Herrera and M. Lozano, “Using evolutionary algorithms as instance selection for data reduction in KDD: An experimental study,” IEEE Trans. on Evolutionary Computation, Vol. 7, Dec. 2003, pp.561-575.
[25] X. Zhang, “Using class-center vectors to build support vector machines,” IEEE Signal Processing Society Workshop Neural Networks for Signal Processing, Aug. 1999, pp.3-11.
[26] H. Ke and X. Zhang, “Editing support vector machines,” In Proc. Neural Networks (IJCNN’01), vol. 2, July 2001, pp. 1464–1467.
[27] M. Li, F. Chen and J. Kou, “Candidate vectors selection for training support vector machines,” Third International Conference on Natural Computation, Vol. 1, Aug. 2007, pp.538-542.
[28] M.R. Garey, D.S. Johnson, and R. Sethi, “The complexity of flowshop and jobshop scheduling,” Mathematics of Operations Research, vol. 1, no. 2, 1976, pp. 117-129.
[29] H. Tamaki, T. Ono, H. Murao, and S. Kitamura, “Modeling and genetic solution of a class of flexible job shop scheduling problems,” IEEE International Conference on Emerging Technologies and Factory Automation, Vol. 2, Oct. 2001, pp.343-350.
[30] I. Kacem, “Genetic algorithm for the flexible job-shop scheduling problem,” IEEE International Conference on Systems, Man and Cybernetics, Vol. 4, Oct. 2003, pp.3464-3469.
[31] C. Zhang, Q. Liu, F. He, D. Chao, and X. Shao, “Applying genetic local search to solve the flexible job-shop scheduling problem,” WCICA 7th World Congress on Intelligent Control and Automation, June 2008, pp.3929-3935.
[32] C. Low, J. Y. Yeh, and K. I. Huang, “A robust simulated annealing heuristic for flow shop scheduling problems,” Int. J. Adv. Manuf. Technol., Vol. 23, 2004, pp. 762-767.
[33] R. K. Suresh and K. M. Mohanasundaram, “Pareto archived simulated annealing for job shop scheduling with multiple objectives,” Int. J. Adv. Manuf. Technol., Vol. 29, 2006, pp. 184-196.
[34] J. Zhang, X. M. Hu, X. Tan, J. H. Zhong, and Q. Huang, “Implementation of an ant colony optimization technique for job shop scheduling problem,” Trans. Inst. Meas. Control, Vol. 28, pp. 93-108, 2006.
[35] D.L. Luo, S.X. Wu, M.Q. Li, and Z. Yang, “Ant colony optimization with local search applied to the flexible job shop scheduling problems,” International Conference on Communications, Circuits and Systems, May 2008, pp.1015-1020.
[36] M. Mastrolilli and L.M. Gambardella, “Effective neighborhood functions for the flexible job shop problem,” J. Sched., Vol. 3, 2000, pp. 3-20.
[37] Z. Jia, H. Chen, and J. Tang, “An improved particle swarm optimization for multi-objective flexible job-shop scheduling problem,” IEEE International Conference on Grey Systems and Intelligent Services, Nov. 2007, pp.1587-1592.
[38] H. Liu, A. Abraham, and C. Grosan, “A novel variable neighborhood particle swarm optimization for multi-objective flexible job-shop scheduling problems,” International Conference on Digital Information Management, Vol. 1, Oct. 2007, pp.138-145.
[39] H. B. Yu and W. Liang, “Neural network and genetic algorithm-based hybrid approach to expanded job-shop scheduling,” Comput. Ind. Eng., Vol. 39, 2001, pp. 337-356.
[40] S. Yang and D. Wang, “A new adaptive neural network and heuristics hybrid approach for job-shop scheduling,” Comput. Oper. Res., Vol. 28, 2001, pp. 955-971.
[41] N. B. Ho and J. C. Tay, “Solving multiple-objective flexible job shop problems by evolution and local search,” IEEE Trans. on Systems, Man, and Cybernetics, Part C, Vol. 38, No. 5, 2008, pp. 674-685.
[42] S.W. Lin, K.C. Ying, S.C. Chen, and Z.J. Lee, “Particle swarm optimization for parameter determination and feature selection of support vector machines,” Expert Systems with Applications, vol. 35, 2008, pp. 1817-1824.
[43] C.J.C. Burges, “A tutorial on support vector machines for pattern recognition,” Data Mining and Knowledge Discovery, vol. 2, no. 2, 1998, pp. 121-167.
[44] T.M. Cover and P.E. Hart, “Nearest neighbor pattern classification,” IEEE Trans. Information Theory, vol. 13, 1967, pp. 21-27.
[45] T.M. Cover, “Estimation by the nearest neighbor rule,” IEEE Trans. Information Theory, vol. 14, no. 3, 1968, pp. 50-55.
[46] P.E. Hart, An Asymptotic Analysis of the Nearest-Neighbor Decision Rule, Stanford Electronics Labs., Stanford, Calif., Tech. Rept., 1966, 1828-2 (SEL-66-016).
[47] P.E. Hart, “The condensed nearest neighbor rule,” IEEE Trans. Information Theory, vol. 14, no. 3, 1968, pp. 515-516.
[48] F. Angiulli, “Fast nearest neighbor condensation for large data sets classification,” IEEE Trans. Knowledge and Data Engineering, vol. 19, no. 11, 2007, pp. 1450-1464.
[49] J.H. Holland, Adaption in Natural and Artificial Systems, Cambridge, MA: MIT Press, 1975.
[50] H.T. Lin and C.J. Lin, A study on sigmoid kernels for SVM and the training of non-PSD kernels by SMO-type methods. Technical report, Department of Computer Science and Information Engineering, National Taiwan University.
Available: http://www.csie.ntu.edu.tw/~cjlin/papers/tanh.pdf.
[51] UCI Machine Learning Repository [Online], Available: http://www.ics.uci.edu/mlearn/MLRepository.html.
[52] C.C. Chang, and C.J. Lin, LIBSVM: a library for support vector machines, Software available at: http://www.csie.ntu.edu.tw/~cjlin/libsvm, 2001.
[53] R. Eberhart and J. Kennedy, “A new optimizer using particle swarm theory,” in Proc. 6th Int. Symp. Micro Machine and Human Science (MHS), Oct. 1995, pp. 39–43.
[54] B. Liu, L. Wang, and Y. H. Jin, “An effective PSO-based memetic algorithm for flow shop scheduling,” IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 37, no. 1, Feb. 2007, pp. 18–27.
[55] N. Franken and A. P. Engelbrecht, “Particle swarm optimization approaches to coevolve strategies for the iterated prisoner’s dilemma,” IEEE Trans. Evol. Comput., vol. 9, no. 6, Dec. 2005, pp. 562–579.
[56] R. A. Krohling and L. S. Coelho, “Coevolutionary particle swarm optimization using Gaussian distribution for solving constrained optimization problems,” IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 36, no. 6, Dec. 2006, pp. 1407–1416.
[57] J. Kennedy and R. C. Eberhart, “A discrete binary version of the particle swarm algorithm,” In Proceedings of the World Multiconference on Systemics, Cybernetics and Informatics, Piscataway, NJ, 1997, pp. 4104-4109.
[58] W. Gates, “The reduced nearest neighbor rule,” IEEE Trans. Information Theory, vol. 18, no. 3, 1972, pp. 431-433.
[59] F. S. Devi and M. N. Murty, “An incremental prototype set building technique,” Pattern Recognition, vol. 35, no. 2, 2002, pp. 505-513.
[60] M. Pinedo and X. Chao, Operations Scheduling with Applications in Manufacturing and Services, McGraw-Hill, New York, 1999.
[61] C. A. Coello, G. T. Pulido, and M. S. Lechuga, “Handling multiple objectives with particle swarm optimization,” IEEE Trans. on Evolutionary Computation, vol. 8, no. 4, June 2004, pp. 256-279.
[62] C. A. Coello and M. S. Lechuga, “MOPSO: A proposal for multiple objective particle swarm optimization,” In Proc. Congress Evolutionary Computation (CEC’2002), vol. 1, Honolulu, HI, May 2002, pp. 1051–1056.
[63] D. L. Luo, S. X. Wu, M.Q. Li, and Z. Yang, “Ant colony optimization with local search applied to the flexible job shop scheduling problems,” ICCCAS conference in Communications, Circuits and Systems, 2008, pp. 1015-1020.
[64] S. Mostaghim and J. Teich, “Covering Pareto-optimal fronts by subswarms in multi-objective particle swarm optimization,” In Proc. Congress Evolutionary Computation (CEC’2004), vol. 2, June 2004, pp. 1404–1411.
[65] S. Gelly, Y. Wang, R. Munos, and O. Teytaud, Modification of UCT with Patterns in Monte-Carlo Go, University of South Paris and Paris institute of technology, 2006.
[66] Y. Wang and S. Gelly, “Modifications of UCT and sequence-like simulations for Monte-Carlo Go,” IEEE Symposium on Computational Intelligence and Games, 2007, pp. 175-182.
[67] P. Auer, N. Cesa-Bianchi, and P. Fischer, “Finite-time analysis of the multi-armed bandit problem,” Machine Learning, Vol. 47, 2002, pp. 235–256.
[68] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, “A fast and elitist multi-objective genetic algorithm: NSGA-II,” IEEE Trans. on Evolutionary Computation, vol. 6, no. 2, 2002, pp. 182-197.
[69] G. Tesauro and G.R. Galperin, “On-line policy improvement using Monte-Carlo search,” In M.C. Mozer, M.I. Jordan, and T. Petsche, Editors, NIPS 9, 1997, pp.1068-1074.
[70] D. Billings, A. Davidson, J. Schaeffer, and D. Szafron, “The challenge of poker,” Artificial Intelligence, Vol. 134, 2002, pp. 201-240.
[71] B. Sheppard, “World-championship-caliber scrabble,” Artificial Intelligence, Vol. 134 (1-2), 2002, pp. 241-271.
[72] G.M.J.B. Chaslot, M.H.M. Winands, J.W.H.M. Uiterwijk, and H.J. Van Den Herik, “Progressive strategies for Monte-Carlo tree search,” In Information Sciences 2007 Proceedings of the 10th Joint Conference, 2007, pp.655-661.
[73] L. Kocsis and C. Szepesvari, “Bandit based Monte-Carlo planning,” In 15th European Conference on Machine Learning, 2006, pp.282-293.
[74] R. Agrawal, “Sample mean based index policies with regret for the multi-armed bandit problem,” Advances in Applied Probability, 1995, pp. 1045-1078.
[75] H. Chen, J. Ihlow, and C. Lehmann, “A genetic algorithm for flexible job-shop scheduling,” Proceedings of the IEEE International Conference on Robotics and Automation, vol.2, 1999, pp. 1120–1125.
[76] K. Shibahara and Y. Kotani, “Combining final score with winning percentage by sigmoid function in Monte-Carlo simulations,” IEEE Symposium on Computational Intelligence and Games, 2008, pp. 183-190.
Thesis Use Authorization
  • Agrees to grant the library, without charge, the right to reproduce the print copy for academic use by in-library readers, to be made public on 2011-07-23.
  • Agrees to authorize the browse/print electronic full-text service, available to the public from 2011-07-23.

