Hands-on ensemble learning with Python : build highly optimized ensemble machine learning models using scikit-learn and Keras /

Ensemble learning can provide the necessary methods to improve the accuracy and performance of existing models. In this book, you'll understand how to combine different machine learning algorithms to produce more accurate results from your models.

Saved in:
Bibliographic Details
Main Authors: Kyriakides, George (Author), Margaritis, Konstantinos G. (Author)
Format: eBook
Language: English
Published: Birmingham, UK : Packt Publishing, [2019]
Subjects:
Online Access: CONNECT
LEADER 05719cam a2200601 i 4500
001 in00005989214
006 m o d
007 cr |||||||||||
008 190810t20192019enk o 000 0 eng d
005 20211007165902.5
035 |a 1WRLDSHRon1110483236 
040 |a EBLCP  |b eng  |e rda  |e pn  |c EBLCP  |d OCLCQ  |d UKMGB  |d OCLCO  |d EBLCP  |d OCLCF  |d TEFOD  |d YDX  |d TEFOD  |d UKAHL  |d OCLCQ  |d N$T  |d OCLCQ 
015 |a GBB9D1835  |2 bnb 
016 7 |a 019485029  |2 Uk 
019 |a 1110483044 
020 |a 9781789617887  |q (electronic bk.) 
020 |a 178961788X  |q (electronic bk.) 
020 |z 9781789612851  |q (pbk.) 
035 |a (OCoLC)1110483236  |z (OCoLC)1110483044 
037 |a 9781789617887  |b Packt Publishing 
037 |a 85C90D7D-571A-41E9-9A36-69181EC00298  |b OverDrive, Inc.  |n http://www.overdrive.com 
050 4 |a Q325.5 
082 0 4 |a 006.3/1  |2 23 
049 |a TXMM 
100 1 |a Kyriakides, George,  |e author. 
245 1 0 |a Hands-on ensemble learning with Python :  |b build highly optimized ensemble machine learning models using scikit-learn and Keras /  |c George Kyriakides. 
264 1 |a Birmingham, UK :  |b Packt Publishing,  |c [2019] 
264 4 |c ©2019 
300 |a 1 online resource (284 pages) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
588 0 |a Print version record. 
505 0 |a Cover; Title Page; Copyright and Credits; About Packt; Contributors; Table of Contents; Preface; Section 1: Introduction and Required Software Tools; Chapter 1: A Machine Learning Refresher; Technical requirements; Learning from data; Popular machine learning datasets; Diabetes; Breast cancer; Handwritten digits; Supervised and unsupervised learning; Supervised learning; Unsupervised learning; Dimensionality reduction; Performance measures; Cost functions; Mean absolute error; Mean squared error; Cross entropy loss; Metrics; Classification accuracy; Confusion matrix 
505 8 |a Sensitivity, specificity, and area under the curve; Precision, recall, and the F1 score; Evaluating models; Machine learning algorithms; Python packages; Supervised learning algorithms; Regression; Support vector machines; Neural networks; Decision trees; K-Nearest Neighbors; K-means; Summary; Chapter 2: Getting Started with Ensemble Learning; Technical requirements; Bias, variance, and the trade-off; What is bias?; What is variance?; Trade-off; Ensemble learning; Motivation; Identifying bias and variance; Validation curves; Learning curves; Ensemble methods; Difficulties in ensemble learning 
505 8 |a Weak or noisy data; Understanding interpretability; Computational cost; Choosing the right models; Summary; Section 2: Non-Generative Methods; Chapter 3: Voting; Technical requirements; Hard and soft voting; Hard voting; Soft voting; Python implementation; Custom hard voting implementation; Analyzing our results using Python; Using scikit-learn; Hard voting implementation; Soft voting implementation; Analyzing our results; Summary; Chapter 4: Stacking; Technical requirements; Meta-learning; Stacking; Creating metadata; Deciding on an ensemble's composition; Selecting base learners 
505 8 |a Selecting the meta-learner; Python implementation; Stacking for regression; Stacking for classification; Creating a stacking regressor class for scikit-learn; Summary; Section 3: Generative Methods; Chapter 5: Bagging; Technical requirements; Bootstrapping; Creating bootstrap samples; Bagging; Creating base learners; Strengths and weaknesses; Python implementation; Implementation; Parallelizing the implementation; Using scikit-learn; Bagging for classification; Bagging for regression; Summary; Chapter 6: Boosting; Technical requirements; AdaBoost; Weighted sampling; Creating the ensemble 
505 8 |a Implementing AdaBoost in Python; Strengths and weaknesses; Gradient boosting; Creating the ensemble; Further reading; Implementing gradient boosting in Python; Using scikit-learn; Using AdaBoost; Using gradient boosting; XGBoost; Using XGBoost for regression; Using XGBoost for classification; Other boosting libraries; Summary; Chapter 7: Random Forests; Technical requirements; Understanding random forest trees; Building trees; Illustrative example; Extra trees; Creating forests; Analyzing forests; Strengths and weaknesses; Using scikit-learn; Random forests for classification 
505 8 |a Random forests for regression 
520 |a Ensemble learning can provide the necessary methods to improve the accuracy and performance of existing models. In this book, you'll understand how to combine different machine learning algorithms to produce more accurate results from your models. 
590 |a EBSCO eBook Academic Comprehensive Collection North America 
650 0 |a Machine learning. 
650 0 |a Python (Computer program language) 
700 1 |a Margaritis, Konstantinos G.,  |e author. 
730 0 |a WORLDSHARE SUB RECORDS 
776 0 8 |i Print version:  |a Kyriakides, George.  |t Hands-On Ensemble Learning with Python : Build Highly Optimized Ensemble Machine Learning Models Using Scikit-Learn and Keras.  |d Birmingham : Packt Publishing, Limited, ©2019  |z 9781789612851 
856 4 0 |u https://ezproxy.mtsu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2204655  |z CONNECT  |3 eBooks on EBSCOhost  |t 0 
949 |a ho0 
994 |a 92  |b TXM 
998 |a wi  |d z 
999 f f |s 5906426f-7b7f-4968-831f-160b45e32329  |i 5906426f-7b7f-4968-831f-160b45e32329  |t 0 
952 f f |t 1  |e Q325.5   |h Library of Congress classification 