Learn to bridge the gap between machine learning and metaheuristic methods to solve optimization problems
Few areas of technology have greater potential to revolutionize the world than artificial intelligence. Two key areas of artificial intelligence, machine learning and metaheuristic computation, have an enormous range of individual and combined applications in computer science and technology. To date, these two complementary paradigms have not always been treated together, despite the potential of a combined approach that maximizes the utility and minimizes the drawbacks of both.
Machine Learning and Metaheuristic Computation offers an introduction to both of these approaches and their joint applications. Serving as both a reference text and a course text, it is built around the MATLAB programming language, in which its implementations are provided, to maximize utility. It guides the reader gradually from an initial understanding of these crucial methods to an advanced understanding of cutting-edge artificial intelligence tools.
The text also provides:
- Treatment suitable for readers with only basic mathematical training
- Detailed discussion of topics including dimensionality reduction, clustering methods, differential evolution, and more
- A rigorous but accessible vision of machine learning algorithms and the most popular approaches of metaheuristic optimization
Machine Learning and Metaheuristic Computation is ideal for students, researchers, and professionals looking to combine these vital methods to solve optimization problems.
Table of Contents
About the Authors xi
Preface xiii
Acknowledgments xvii
Introduction xix
1 Fundamentals of Machine Learning 1
1.1 Introduction 1
1.2 Different Types of Machine Learning Approaches 4
1.3 Supervised Learning 6
1.4 Unsupervised Learning 8
1.5 Reinforcement Learning 10
1.6 Which Algorithm to Apply? 13
1.7 Recommendations for Building a Machine Learning Model 15
References 19
2 Introduction to Metaheuristic Methods 21
2.1 Introduction 21
2.2 Classic Optimization Methods 23
2.3 Gradient Descent Method 24
2.4 Metaheuristic Methods 29
2.5 Exploitation and Exploration 35
2.6 Acceptance and Probabilistic Selection 37
2.7 Random Search 41
2.8 Simulated Annealing 47
References 57
3 Fundamental Machine Learning Methods 59
3.1 Introduction 59
3.2 Regression 60
3.2.1 Explanatory Purpose 62
3.2.2 Predictive Purpose 62
3.3 Classification 71
3.3.1 Relationship Between Regression and Classification 72
3.3.2 Differences Between Regression and Classification 72
3.4 Decision Trees 73
3.4.1 Procedure of Classification 74
3.4.2 Determination of the Splitting Point 77
3.4.2.1 Gini Index 77
3.4.2.2 Entropy 78
3.4.3 Example of Classification 79
3.5 Bayesian Classification 86
3.5.1 Conditional Probability 87
3.5.2 Classification of Fraudulent Financial Reports 87
3.5.3 Practical Constraints of Using the Exact Bayes Method 90
3.5.4 Naive Bayes Method 90
3.5.5 Computational Experiment 92
3.6 k-Nearest Neighbors (k-NN) 99
3.6.1 k-NN for Classification 99
3.6.2 k-NN for Regression 101
3.7 Clustering 105
3.7.1 Similarity Indexes 107
3.7.2 Methods for Clustering 108
3.8 Hierarchical Clustering 112
3.8.1 Implementation in MATLAB 114
3.9 K-Means Algorithm 122
3.9.1 Implementation of K-Means Method in MATLAB 127
3.10 Expectation-Maximization Method 130
3.10.1 Gaussian Mixture Models 131
3.10.2 Maximum Likelihood Estimation 131
3.10.3 EM in One Dimension 132
3.10.3.1 Initialization 132
3.10.3.2 Expectation 132
3.10.3.3 Maximization 133
3.10.4 Numerical Example 133
3.10.5 EM in Several Dimensions 135
References 141
4 Main Metaheuristic Techniques 145
4.1 Introduction 145
4.1.1 Use of Metaphors 145
4.1.2 Problems of the Use of Metaphors 146
4.1.3 Metaheuristic Algorithms 147
4.2 Genetic Algorithms 148
4.2.1 Canonical Genetic Algorithm 149
4.2.2 Selection Process 152
4.2.3 Binary Crossover Process 155
4.2.4 Binary Mutation Process 156
4.2.5 Implementation of the Binary GA 157
4.2.6 Genetic Algorithm Utilizing Real-Valued Parameters 164
4.2.7 Crossover Operator for Real-Valued Parameters 165
4.2.8 Mutation Operator for Real-Valued Parameters 176
4.2.9 Computational Implementation of the GA with Real Parameters 181
4.3 Particle Swarm Optimization (PSO) 189
4.3.1 Strategy for Searching in Particle Swarm Optimization 189
4.3.2 Analysis of the PSO Algorithm 192
4.3.3 Inertia Weighting 192
4.3.4 Particle Swarm Optimization Algorithm Using MATLAB 193
4.4 Differential Evolution (DE) Algorithm 196
4.4.1 The Search Strategy of DE 197
4.4.2 The Mutation Operation in DE 200
4.4.2.1 Mutation Rand/1 201
4.4.2.2 Mutation Best/1 201
4.4.2.3 Mutation Rand/2 202
4.4.2.4 Mutation Best/2 202
4.4.2.5 Mutation Current-to-Best/1 203
4.4.3 The Crossover Operation in DE 203
4.4.4 The Selection Operation in DE 205
4.4.5 Implementation of DE in MATLAB 205
References 209
5 Metaheuristic Techniques for Fine-Tuning Parameters of Complex Systems 211
5.1 Introduction 211
5.2 Differential Evolution (DE) 211
5.2.1 Mutation 212
5.2.1.1 Mutation Best/1 213
5.2.1.2 Mutation Rand/2 213
5.2.1.3 Mutation Best/2 213
5.2.1.4 Mutation Current-to-Best/1 213
5.2.2 Crossover 213
5.2.3 Selection 214
5.3 Adaptive Network-Based Fuzzy Inference System (ANFIS) 219
5.4 Differential Evolution for Fine-Tuning ANFIS Parameter Settings 220
References 236
6 Techniques of Machine Learning for Producing Metaheuristic Operators 237
6.1 Introduction 237
6.2 Hierarchical Clustering 238
6.2.1 Agglomerative Hierarchical Clustering Algorithm 239
6.3 Chaotic Sequences 243
6.4 Cluster-Chaotic-Optimization (CCO) 245
6.4.1 Initialization 246
6.4.2 Clustering 246
6.4.3 Intra-Cluster Procedure 247
6.4.3.1 Local Attraction Movement 247
6.4.3.2 Local Perturbation Strategy 247
6.4.3.3 Extra-Cluster Procedure 248
6.4.3.4 Global Attraction Movement 249
6.4.3.5 Global Perturbation Strategy 249
6.5 Computational Procedure 250
6.6 Implementation of the CCO Algorithm in MATLAB 250
6.7 Spring Design Optimization Problem Using the CCO Algorithm in MATLAB 258
References 267
7 Techniques of Machine Learning for Modifying the Search Strategy 269
7.1 Introduction 269
7.2 Self-Organizing Map (SOM) 270
7.2.1 Network Architecture 272
7.2.2 Competitive Learning Model 273
7.2.2.1 Competition Procedure 273
7.2.2.2 Cooperation Procedure 274
7.2.2.3 Synaptic Adaptation Procedure 275
7.2.3 Self-Organizing Map (SOM) Algorithm 275
7.2.4 Application of the Self-Organizing Map (SOM) 276
7.3 Evolutionary-SOM (EA-SOM) 277
7.3.1 Initialization 280
7.3.2 Training 281
7.3.3 Knowledge Extraction 281
7.3.4 Solution Production 282
7.3.5 New Training Set Construction 283
7.4 Computational Procedure 283
7.5 Implementation of the EA-SOM Algorithm in MATLAB 284
7.6 Gear Design Optimization Problem Using the EA-SOM Algorithm in MATLAB 289
References 294
8 Techniques of Machine Learning Mixed with Metaheuristic Methods 297
8.1 Introduction 297
8.2 Flower Pollination Algorithm (FPA) 298
8.2.1 Global Rule and Lévy Flight 298
8.2.2 Local Rule 299
8.2.3 Elitist Selection Procedure 299
8.3 Feedforward Neural Networks (FNNs) 303
8.3.1 Perceptron 305
8.3.2 Feedforward Neural Networks (FNNs) 305
8.4 Training an FNN Using FPA 306
References 308
9 Metaheuristic Methods for Classification 311
9.1 Introduction 311
9.2 Crow Search Algorithm (CSA) 311
9.3 CSA for Nearest-Neighbor Method (k-NN) 315
9.4 CSA for Logistic Regression 319
9.5 CSA for Fisher Linear Discriminant 323
9.6 CSA for Naïve Bayes Classification 326
9.7 CSA for Support Vector Machine 330
References 336
10 Metaheuristic Methods for Clustering 339
10.1 Introduction 339
10.2 Cuckoo Search Method (CSM) 340
10.3 Search Strategy for CSM 340
10.3.1 Initialization 342
10.3.2 Lévy Flight 342
10.3.3 Solution Replacement 344
10.3.4 Elitist Selection 344
10.4 Computational Procedure 345
10.4.1 Metaheuristic Operators for CSM 345
10.5 Implementation of the CSM in MATLAB 347
10.6 Cuckoo Search Method for K-Means 352
10.6.1 Implementation of the KM Algorithm in MATLAB 354
10.6.2 Cuckoo Search Method for K-Means 356
10.6.2.1 Implementation of CSM to KM Clustering in MATLAB 358
References 363
11 Metaheuristic Methods for Dimensionality Reduction 365
11.1 Introduction 365
11.2 Ant Colony Optimization (ACO) 365
11.2.1 Pheromone Representation 366
11.2.2 Ant-Based Solution Construction 367
11.2.3 Pheromone Update 367
11.3 Dimensionality Reduction 372
11.4 ACO for Feature Selection 373
References 375
12 Metaheuristic Methods for Regression 377
12.1 Introduction 377
12.2 Genetic Algorithm (GA) 377
12.2.1 Computational Structure 378
12.2.2 Initialization 378
12.2.3 Selection Method 378
12.2.3.1 Roulette Wheel Selection 378
12.2.3.2 Stochastic Remainder Selection 379
12.2.3.3 Rank-Based Selection 379
12.2.3.4 Tournament Selection 380
12.2.4 Crossover 380
12.2.5 Mutation 381
12.3 Neural Network Regression with a Genetic Algorithm 386
12.4 Linear Regression Employing a Genetic Algorithm 391
References 396
Index 397
About the Authors
Erik Cuevas, PhD, is a Full Professor in the Department of Electronics at the University of Guadalajara. He is a Member of the Mexican Academy of Sciences and the National System of Researchers. He has provided editorial services for several specialized journals.
Jorge Galvez, PhD, is a Full Professor in the Department of Innovation Based on Information and Knowledge at the University of Guadalajara. He is a Member of the Mexican Academy of Sciences and the National System of Researchers.
Omar Avalos, PhD, is a Professor in the Electronics and Computing Division of the University Center for Exact Sciences and Engineering at the University of Guadalajara. He is a Member of the Mexican Academy of Sciences and the National System of Researchers.
Fernando Wario, PhD, is a Professor at the University of Guadalajara and an Associate Researcher at the Institute of Cognitive Sciences and Technologies (ISTC) in Rome, Italy. He is a Member of the Mexican Academy of Sciences and the National System of Researchers.