May 31st, 2020 · 6 min read
5th issue! If you missed them, you can read the previous issues of the Machine Learning Monthly newsletter here.
Hey everyone! Daniel here, I'm 50% of the instructors behind the Complete Machine Learning and Data Science: Zero to Mastery course. I also write regularly about machine learning on my blog and make videos on the topic on YouTube.
Welcome to the 5th edition of Machine Learning Monthly. A 500ish word post detailing some of the most interesting things on machine learning I've found in the last month. If there is enough interest, I will keep doing these every month so please share it with your friends!
Since there's a lot going on, the utmost care has been taken to keep things to the point.
Through an eight-part series of articles, David Page from myrtle.ai takes you through how to train the popular ResNet (residual neural network) deep learning architecture on the CIFAR10 dataset (60000 images, 10 classes).
Why read eight whole articles about one dataset? Because David shows you how to take the out-of-the-box training time from 341s to 26s on a single GPU, an over 13x speedup!
Ever wanted all of the most helpful machine learning libraries curated in a nice place? Amit has you covered.
From the most popular datasets to different ways to do data augmentation to tools for optimising your models, all collected in a simple README.
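If you haven't met "data augmentation" before, the idea is to create extra training examples by transforming the ones you already have. Here's a hypothetical, stdlib-only sketch of one classic augmentation, a horizontal flip, on an image represented as a nested list of pixel values (real projects would reach for one of the libraries Amit collects):

```python
def horizontal_flip(image):
    """Flip an image (a list of pixel rows) left-to-right.

    A flipped photo of a cat is still a cat, so the label stays the
    same while the model sees a 'new' example.
    """
    return [row[::-1] for row in image]


image = [
    [0, 1, 2],
    [3, 4, 5],
]
print(horizontal_flip(image))  # [[2, 1, 0], [5, 4, 3]]
```

Flipping twice gets you back to the original image, which is a handy sanity check when writing augmentations of your own.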
As a bonus, Amit also collects his favourite learning resources for Data Science, business and life in a README called learning.
I've been coding for three years now. And there are a few places in my skillset where I keep hitting walls, things I'd like to improve, namely: full-stack development, databases and the command line.
While not strictly machine learning related, Teach Yourself Computer Science got an overhaul in May 2020 and is a great place for the 2-3 year self-taught coder to check out in order to cement their skills. Just read their 'Why learn computer science?' section.
I'm looking at this seriously and wondering if it fits into my next 2 years of learning.
Side note: Andrei also has a great course that teaches computer science fundamentals to those who don't have a CS degree.
I'm on the waiting list for a new book by O'Reilly called Building ML Pipelines: Automating Model Life Cycles with TensorFlow.
It's cool for a model to be trained in a Jupyter Notebook but it's a potential life-changer when a machine learning model is deployed to the world.
As much as I'm interested in doing the former, I'm far more interested in doing the latter.
This upcoming book goes through all the steps required to automate a machine learning pipeline (a software-based system which takes in data, preprocesses it, models it, lets users access the model and tracks performance) with TensorFlow.
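To make the stages in that definition concrete, here's a hypothetical, deliberately tiny sketch of the flow in plain Python (the book itself uses TensorFlow; the function names and toy "model" here are my own invention, just to show how ingest, preprocess, train, serve and track hand off to each other):

```python
def ingest():
    # Stand-in for reading raw data from storage: (weather, temperature) rows.
    return [("sunny", 25.0), ("rainy", 12.0), ("sunny", 27.0), ("rainy", 10.0)]


def preprocess(rows):
    # Encode the categorical feature as 0/1 so the model can use it.
    return [(1 if weather == "sunny" else 0, temp) for weather, temp in rows]


def train(examples):
    # A deliberately trivial 'model': the mean temperature per class.
    sums, counts = {}, {}
    for key, temp in examples:
        sums[key] = sums.get(key, 0.0) + temp
        counts[key] = counts.get(key, 0) + 1
    return {key: sums[key] / counts[key] for key in sums}


def serve(model, weather):
    # Serving step: users query the trained model for a prediction.
    return model[1 if weather == "sunny" else 0]


def track(model, examples):
    # Performance tracking: mean absolute error over labelled examples.
    return sum(abs(model[key] - temp) for key, temp in examples) / len(examples)


model = train(preprocess(ingest()))
print(serve(model, "sunny"))                # 26.0 (mean of 25.0 and 27.0)
print(track(model, preprocess(ingest())))   # 1.0 mean absolute error
```

The point of a pipeline is that every one of these hand-offs happens automatically whenever new data arrives, instead of someone re-running cells in a notebook.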
Check the website to get on the update list or read the early release notes.
Google released BiT this month, short for 'Big Transfer'. In short, they asked, "What would happen if we trained some of the biggest models we possibly can on as many images as we can?".
Turns out, a few computer vision records got broken.
The best thing, though, is that you can now access the model weights (the patterns the models learned on 14M to 300M different images) and use them in a few lines of TensorFlow code. Check the blog post to see how.
Ever wanted a single great resource to learn the ins and outs of PyTorch?
freeCodeCamp has you covered.
In this video, you'll get primers on PyTorch basics, image classification, image classification improvement using data augmentation and regularization and generative adversarial networks.
I started learning machine learning in April 2017. This article by Jason Been does a great job of putting together a list of things I would've liked to have known when I started. From becoming a programmer first to perfecting your work environment for studying.
Write about what you learn.
In 2013, DeepMind published a paper which showed a deep reinforcement learning agent which could play 7 out of 57 Atari 2600 games, surpassing human levels in 3 out of 7.
Well, it's been a few years since then and their agent has grown up. Agent57 (what they're calling it) is not only able to play all 57 Atari 2600 games but can surpass human performance on all of them.
The most interesting part will be how much of Agent57's ability transfers outside of the game environment.
Ever wanted to get a world-class introduction to one of the century's most influential technologies?
Now you can.
MIT have released their full Introduction to Deep Learning (6.S191) curriculum, from sequence modelling to using machine learning for scent (yes, digitising the sense of smell) in the form of YouTube videos and slides.
There's nothing I love more than good explanations. After going through the first few videos, I can say this series is one of the best out there, for both theory and code.
And that's it for May.
In the meantime, keep learning, keep creating.
See you next month,
How did you like this post? Please share the post on Twitter if you enjoyed it and want me to keep writing them! Also, if you haven't already, subscribe below to receive Machine Learning Monthly next month and other exclusive ZTM posts.
By the way, I'm a full-time instructor with Zero To Mastery Academy teaching people Machine Learning in the most efficient way possible. You can see a couple of our courses below or see all Zero To Mastery courses by visiting the courses page.