Abstract (in Chinese)
Abstract (in English)
Symbols
Abbreviations
Chapter 1 Introduction
1.1 Background of related topics
1.1.1 Constrained optimization problems (COPs)
1.1.2 Large scale optimization
1.1.3 Sparse reconstruction
1.1.4 Classifier ensemble
1.2 Motivations
1.3 Contributions
1.4 Layout of this thesis
Chapter 2 A novel selection evolutionary strategy for constrained optimization
2.1 Related work
2.2 Proposed approach
2.2.1 The distribution characteristics of individuals in a population
2.2.2 Design of the proposed algorithm
2.2.3 Analysis of time complexity
2.3 Experimental results
2.4 Concluding remarks
Chapter 3 Decomposition method for LSO based on mixed second order partial derivatives
3.1 Mixed second order partial derivatives decomposition method
3.1.1 Problem definitions
3.1.2 Theoretical foundation for interaction and independence of variables
3.1.3 Derived interaction criterion
3.1.4 Relationship between the proposed method, DG and CCVIL
3.1.5 Proposed algorithms based on IS criterion
3.2 Experimental results and discussion
3.3 Concluding remarks
Chapter 4 Evolutionary multi-objective optimization for compressed sensing problems
4.1 Multi-objective approach to sparse reconstruction
4.1.1 Soft-thresholding local search
4.1.2 Selection operator
4.1.3 Finding knee areas
4.2 Experiments and discussions
4.2.1 Existence of knee areas and best compromise between two conflicting objectives
4.2.2 Comparison of StEMO against other methods
4.3 Conclusions
Chapter 5 A compressed sensing approach for efficient ensemble learning
5.1 Compressed sensing ensemble
5.1.1 Problem formulation and solution technique
5.2 Roulette-wheel kappa-error diagram
5.3 Experimental results and discussion
5.3.1 Results with base learner CART and random subspace
5.3.2 Results with base learner C4.5 and Bagging
5.3.3 Roulette-wheel kappa-error diagrams for the four different CS methods
5.3.4 Comparison of proposed method against five other pruning algorithms
5.4 Conclusion and future work
Chapter 6 Joint sparse representation for ensemble learning
6.1 Joint sparse reconstruction for ensemble learning
6.1.1 Posing ensemble problems as joint sparse representation problems
6.1.2 Two alternative methods for obtaining the sub-underdetermined systems for a joint sparse ensemble
6.2 Experimental results and discussion
6.2.1 Performance of joint sparse ensemble
6.2.2 Comparison of joint sparse ensemble against other algorithms
6.3 Concluding remarks
Chapter 7 Concluding remarks and future work
References
Appendix A Comparison of two cases and figure results in Chapter 4
A.1 Comparison of no-better-choice with random-choice
A.1.1 Effect of no-better-choice and random-choice on the same population
A.1.2 Incorporating no-better-choice and random-choice into the local search algorithm
A.2 Figures produced by each algorithm for the benchmark problems
Acknowledgements
Acknowledgements (in Chinese)
About the Author
About the Author (in Chinese)