Neural Networks are built from many interconnected layers, each with its own Activation Function. But why is that necessary? What are Activation Functions for?
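A quick taste of the answer: without a non-linear activation, stacking linear layers collapses into a single linear map, so the network can't learn anything a straight line couldn't. A minimal sketch in plain Python (the numbers and helper names are just illustrative):

```python
import math

def relu(x):
    # ReLU: the most common activation; zeroes out negative inputs
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any input into the (0, 1) range
    return 1.0 / (1.0 + math.exp(-x))

def linear(x, w, b):
    # A single linear "layer" on a scalar input
    return w * x + b

# Two stacked linear layers without an activation in between...
x = 3.0
stacked = linear(linear(x, 2.0, 1.0), 0.5, -1.0)

# ...are equivalent to one linear layer with merged weights:
# w2*(w1*x + b1) + b2 == (w2*w1)*x + (w2*b1 + b2)
collapsed = linear(x, 1.0, -0.5)
print(stacked == collapsed)  # the depth bought us nothing
```

Inserting `relu` or `sigmoid` between the two `linear` calls breaks that equivalence, which is exactly what lets deep networks model non-linear patterns.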
LSTM Neural Networks have seen plenty of use in recent years, both for text and music generation and for Time Series Forecasting.
Today, I’ll teach you how to train an LSTM Neural Network for text generation, so that it can write in H. P. Lovecraft’s style.
Convolutional Neural Networks are part of what made Deep Learning reach the headlines so often in the last decade. Today we’ll train a CNN to tell us whether an image contains a dog or a cat, using TensorFlow’s eager API.
Applying filters to images is nothing new. We take a picture, make a few changes to it, and now it looks cooler. But where does Artificial Intelligence come in? Let’s try out a fun use for Unsupervised Machine Learning with K-Means Clustering.
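The idea behind the filter, sketched in miniature: K-Means groups a picture’s pixels into k representative colors, and repainting every pixel with its cluster’s centroid produces a posterized look. Here’s a tiny pure-Python version run on a handful of RGB values (the data and function names are my own, for illustration):

```python
import random

def dist2(a, b):
    # Squared Euclidean distance between two colors
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    # Component-wise mean of a list of colors
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))

def kmeans(points, k, iters=20, seed=0):
    # Lloyd's algorithm: assign each point to its nearest
    # centroid, then move each centroid to its cluster's mean.
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[i].append(p)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# "Pixels": two dark colors and two bright ones
pixels = [(10, 10, 10), (20, 20, 20), (240, 240, 240), (250, 250, 250)]
palette = kmeans(pixels, k=2)  # two representative colors
```

Mapping every pixel of a real image to its nearest entry in `palette` is the whole filter; in practice you’d run this with scikit-learn’s `KMeans` over the image’s pixel array rather than hand-rolled loops.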
Deep Learning has revolutionized the Machine Learning scene in recent years. Can we apply it to image compression? How well can a Deep Learning algorithm reconstruct pictures of kittens? And what’s an autoencoder, anyway?
Today we’ll use XGBoost Boosted Trees for regression on an official HDI dataset. Who said Supervised Learning was all about classification?
This project combines two of my passions: Magic: The Gathering and Machine Learning. Let’s see how they mix with this recommender system!