Writing an all-encompassing book on Python machine learning is difficult, given how expansive the field is. But reviewing one is no easy feat either, especially when it’s a highly acclaimed title such as Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 2nd Edition.
The book is a best-seller on Amazon, and the author, Aurélien Géron, is arguably one of the most talented writers on Python machine learning.
After reading Hands-on Machine Learning, I can say that Géron does not disappoint, and the second edition is an excellent resource for Python machine learning. Géron manages to cover more topics than you’ll find in most other general books on Python machine learning, including a comprehensive section on deep learning.
But there are some caveats, and unless you come prepared, you won’t be able to appreciate everything Hands-on Machine Learning has to offer.
A top-down approach to machine learning
Hands-on Machine Learning has a unique approach. Each topic usually starts with a high-level description of the relevant machine learning concepts to give you the general idea; then you go through hands-on coding with Python libraries without going into the details; finally, once you’re comfortable with the coding and the concepts, you lift the hood and get into the nitty-gritty of how the math and code work.
To understand the more advanced topics discussed in the book, you’ll need a firm grasp of Python, including useful tricks such as list comprehensions and lambda functions, as well as basic knowledge of the key data science libraries NumPy, pandas, and Matplotlib.
You also need a solid command of linear algebra, calculus, and the basics of data science. Hands-on Machine Learning assumes you know your math pretty well and won’t hold your hand on partial derivatives and gradients when you reach the deep learning section.
The book is split into two sections, the first one covering general machine learning and the second focused on deep learning.
The first chapter of the book is one of the most intuitive, example-oriented introductions to machine learning I’ve seen in any book. Even experienced Python machine learning developers will find it very useful, solidifying what they already know and refreshing subtle concepts they might have forgotten.
The book also walks you through an end-to-end machine learning project in Python, covering data collection, preparation, and visualization, followed by model creation, training, and fine-tuning. You perform all the steps without going too deep into the details, which gives you an overall picture of the machine learning pipeline and readies your mind for what is to come.
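If you haven’t seen what such a pipeline looks like in code, here is a minimal scikit-learn sketch of the same stages. It is illustrative only, not the book’s code; the dataset, model, and hyperparameter grid are stand-ins I picked for brevity.

```python
# A rough sketch of the pipeline stages described above (illustrative, not the book's code):
# load data, split it, preprocess it, train a model, and fine-tune it.
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = fetch_california_housing(return_X_y=True)  # downloads the data on first use
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

pipeline = Pipeline([
    ("scaler", StandardScaler()),                       # preparation
    ("model", RandomForestRegressor(random_state=42)),  # model creation
])

# Fine-tuning: search over a small, purely illustrative hyperparameter grid.
search = GridSearchCV(pipeline, {"model__n_estimators": [50, 100]}, cv=3)
search.fit(X_train, y_train)
print(search.best_params_, search.score(X_test, y_test))
```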
The rest of the first part goes into some key supervised and unsupervised learning algorithms. You’ll find the general roster of algorithms and libraries that most Python machine learning books cover (regression algorithms, decision trees, support vector machines, clustering algorithms, etc.). There are, however, some unique touches that set this book apart from others, such as the discussion of multilabel and multioutput classification, which is absent from most other books.
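To make that last point concrete, here is a small sketch of multioutput classification with scikit-learn, where each sample gets several target labels predicted at once. The data is synthetic and purely illustrative, not an example from the book.

```python
# Multioutput classification sketch: predict two labels per sample at once.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
X = rng.random((200, 4))                       # synthetic features
# Two target columns per sample, e.g. two independent class labels.
Y = np.c_[(X[:, 0] > 0.5).astype(int), (X[:, 1] + X[:, 2] > 1).astype(int)]

knn = KNeighborsClassifier()                   # natively supports multioutput targets
knn.fit(X, Y)
print(knn.predict(X[:3]))                      # one row per sample, two labels each
```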
One of the things I really like about Hands-on Machine Learning is the step-by-step explanation and coding of gradient descent and stochastic gradient descent. Géron manages to make two of the most fundamental (and complicated) optimization algorithms accessible to readers who don’t have a technical background. This groundwork helps you navigate the much more complicated topics that come later in the book, especially when you reach the section on artificial neural networks and deep learning.
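To give a flavor of what that step-by-step treatment builds toward, here is a minimal NumPy sketch of batch gradient descent for linear regression. It is written in the spirit of the book’s treatment rather than copied from it, and the learning rate, iteration count, and toy data are arbitrary.

```python
# Batch gradient descent for linear regression on toy data (illustrative sketch).
import numpy as np

rng = np.random.default_rng(42)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X + rng.standard_normal((100, 1))   # y = 4 + 3x + noise

X_b = np.c_[np.ones((100, 1)), X]    # add a bias column of ones
theta = rng.standard_normal((2, 1))  # random initialization of the parameters
eta, m = 0.1, len(X_b)               # learning rate and number of samples

for _ in range(1000):
    gradients = 2 / m * X_b.T @ (X_b @ theta - y)  # gradient of the MSE cost
    theta -= eta * gradients                       # one gradient descent step

print(theta)  # should end up close to [[4.], [3.]]
```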
As you go through your exploration of machine learning algorithms, Géron throws in other goodies that are less discussed elsewhere, including in-depth discussions of different SVM kernels (with a lot of complicated math), a variety of ensemble methods (other books usually discuss only random forests), and a technical overview of boosting methods.
Hands-on Machine Learning also introduces you to semi-supervised learning, a machine learning technique used when only a small portion of your training data is labeled. Again, this is something other introductory books on Python machine learning don’t mention.
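If the idea is new to you, the following scikit-learn sketch shows the flavor of it: most labels are hidden, then propagated from the few that remain. This uses just one semi-supervised technique (LabelSpreading) and is not necessarily the approach the book takes.

```python
# Semi-supervised sketch: hide ~90% of the labels and let LabelSpreading
# propagate them from the remaining labeled samples (illustrative only).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.semi_supervised import LabelSpreading

X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(42)
unlabeled = rng.random(len(y)) < 0.9   # mask for samples we pretend are unlabeled
y_partial = y.copy()
y_partial[unlabeled] = -1              # -1 marks "unlabeled" for scikit-learn

model = LabelSpreading(kernel="knn", n_neighbors=7)
model.fit(X, y_partial)
accuracy = (model.transduction_[unlabeled] == y[unlabeled]).mean()
print(f"recovered-label accuracy: {accuracy:.3f}")
```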
But the machine learning section isn’t without fault. The classification chapter gets a bit frustrating because several of its sections require running cross-validation on the entire MNIST dataset, which is slow even on an eight-core CPU server. The dimensionality reduction chapter has some good visualizations but reads like a reference manual with short code snippets, and it lacks end-to-end examples that would better explain each concept and the problem it solves (full examples are included in the accompanying code sample files). Toward the end of the first part, the book becomes very complicated, and you’ll struggle if you don’t have a prior background in machine learning and a solid foundation in calculus.
Comprehensive coverage of deep learning
I don’t expect a book on machine learning to extensively cover deep learning, but in Hands-on Machine Learning, Géron manages to pack a lot into 400 pages. You start with a great history of artificial neural networks, which I think is important for anyone studying deep learning (many people jump into coding without taking note of the decades of research behind neural networks). As with the machine learning section, you first get an overview of key deep learning concepts such as multi-layer perceptrons (MLP), backpropagation, and hyperparameter tuning.
There is also a great overview of activation functions and some good warnings about the pitfalls of deep learning (don’t torture the data!).
Géron gives you a structural overview of TensorFlow, Google’s popular deep learning framework, along with its key classes and its customization capabilities for activation functions, models, and more. This is important because you’ll be doing a lot of custom component creation in the book. There’s also a lot of coding in Keras, the higher-level library that makes it easier to work with TensorFlow components.
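As a taste of the kind of customization involved, here is a small sketch that defines a custom activation function and a custom Keras layer and drops them into a model. The names and architecture are made up for illustration; they are not components from the book.

```python
# Sketch of Keras customization: a custom activation and a custom layer.
import tensorflow as tf
from tensorflow import keras

def leaky_softplus(z):
    """Hypothetical custom activation: softplus plus a small linear term."""
    return tf.math.softplus(z) + 0.01 * z

class DenseBlock(keras.layers.Layer):
    """Hypothetical custom layer wrapping a Dense layer with the custom activation."""
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.dense = keras.layers.Dense(units, activation=leaky_softplus)

    def call(self, inputs):
        return self.dense(inputs)

model = keras.Sequential([
    keras.Input(shape=(28 * 28,)),
    DenseBlock(64),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.summary()
```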
The rest of the book consists of specialized chapters on different disciplines, including computer vision, sequence processing, and natural language processing. There are also introductions to advanced concepts such as generative models and reinforcement learning.
You will get to look under the hood of key deep learning constructs, including convolutional neural networks (CNN), recurrent neural networks (RNN), long short-term memory networks (LSTM), and gated recurrent units (GRU).
The computer vision chapter is especially interesting and contains plenty of material not found in other books, including useful examples of image classification and object detection with popular deep learning architectures such as ResNet and YOLO. You also get an overview of the structure of other acclaimed deep learning models such as AlexNet, GoogLeNet, and VGGNet. There’s also a hands-on example of transfer learning with the Xception neural network model.
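For readers who haven’t seen transfer learning in code, the general pattern looks roughly like the sketch below: load a pretrained Xception base, freeze it, and train a new classification head on top. The input size, number of classes, and datasets are placeholders, not the book’s exact example.

```python
# Transfer learning sketch with a pretrained Xception base (illustrative only).
from tensorflow import keras

base = keras.applications.Xception(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # freeze the pretrained convolutional layers

model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(5, activation="softmax"),  # e.g. a hypothetical 5-class problem
])
model.compile(
    loss="sparse_categorical_crossentropy",
    optimizer=keras.optimizers.Adam(1e-3),
    metrics=["accuracy"],
)
# model.fit(train_set, validation_data=valid_set, epochs=5)  # with your own datasets
```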
The NLP section is also very example-oriented with in-depth coverage of sequence prediction, sentiment analysis, and neural machine translation. You also get introduced to advanced topics that deserve a book of their own, including bi-directional RNNs, beam search, and attention mechanisms.
The generative models section starts out with a smooth and intuitive introduction to autoencoders but gets complicated when you reach generative adversarial networks (GANs), which, again, merit a book of their own.
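To make the autoencoder idea concrete before the difficulty ramps up, here is a minimal Keras sketch that compresses 28x28 images down to a small code and reconstructs them. The layer sizes and dataset are arbitrary choices for illustration, not the book’s example.

```python
# Minimal autoencoder sketch: encode 28x28 images to a 16-dimensional code
# and decode them back (illustrative only).
from tensorflow import keras

encoder = keras.Sequential([
    keras.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(16, activation="relu"),         # the compressed "code"
])
decoder = keras.Sequential([
    keras.Input(shape=(16,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(28 * 28, activation="sigmoid"),
    keras.layers.Reshape((28, 28)),
])
autoencoder = keras.Sequential([encoder, decoder])
autoencoder.compile(loss="binary_crossentropy", optimizer="adam")

# Train it to reconstruct its own inputs, e.g. on Fashion-MNIST:
(X_train, _), _ = keras.datasets.fashion_mnist.load_data()
X_train = X_train / 255.0
autoencoder.fit(X_train, X_train, epochs=1, batch_size=256)
```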
One thing you’ll need as you go through this section is a lot of computing power (preferably GPUs), because the examples are very compute-intensive. Hands-on Machine Learning’s accompanying Jupyter notebooks also contain plenty of valuable code and functions that the book does not cover in depth, so make sure to check them out as well.
The deep learning section finishes off with a preview of professional deep learning production environments.
One of the things I like best about Hands-on Machine Learning is how Géron ends it: “Going forward, my best advice to you is to practice and practice: try going through all the exercises (if you have not done so already), play with the Jupyter notebooks, join Kaggle.com or some other ML community, watch ML courses, read papers, attend conferences, and meet experts. It also helps tremendously to have a concrete project to work on, whether it is for work or for fun (ideally for both), so if there’s anything you have always dreamt of building, give it a shot! Work incrementally; don’t shoot for the moon right away, but stay focused on your project and build it piece by piece. It will require patience and perseverance, but when you have a walking robot, or a working chatbot, or whatever else you fancy to build, it will be immensely rewarding.”
And it’s true. If there’s one thing Hands-on Machine Learning teaches you, it’s that learning artificial intelligence never ends. The more you dig into it, the more you have to learn.
Final verdict
Hands-on Machine Learning is a must-read for anyone embarking on the Python machine learning and deep learning journey. However, I do not recommend it as a first step, and it certainly won’t be the last book you need on the way to a career in machine learning.
If you’re new to the field, I would suggest reading an introductory book or two on Python data science before picking up Hands-on Machine Learning. If you already have some Python and data science skills, such books are quick reads that will help you better prepare for the depth of the material in Hands-on Machine Learning, giving you a solid background in Python’s math and data manipulation libraries.
Also, I would recommend reading another book on machine learning such as Python Machine Learning or taking an online course like Udemy’s Machine Learning A-Z before or after Hands-on Machine Learning. You’ll find a lot of overlap, but each offers new perspectives and topics that the others don’t cover.
This article was originally published by Ben Dickson on TechTalks, a publication that examines trends in technology, how they affect the way we live and do business, and the problems they solve. But we also discuss the evil side of technology, the darker implications of new tech and what we need to look out for. You can read the original article here.