Markov chains have been around for a while now, and they are here to stay. From predictive keyboards to applications in trading and biology, they’ve proven to be versatile tools.
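The predictive-keyboard use case can be sketched in a few lines: an order-1 word-level Markov chain just maps each word to the words that have been seen following it, then walks that table. This is an illustrative toy, not any particular product's implementation; the function names are mine.

```python
import random

def build_chain(text):
    """Map each word to the list of words observed following it."""
    words = text.split()
    chain = {}
    for current, following in zip(words, words[1:]):
        chain.setdefault(current, []).append(following)
    return chain

def generate(chain, start, length, seed=None):
    """Walk the chain: at each step, pick a random observed successor."""
    rng = random.Random(seed)
    word, output = start, [start]
    for _ in range(length - 1):
        successors = chain.get(word)
        if not successors:  # dead end: no word ever followed this one
            break
        word = rng.choice(successors)
        output.append(word)
    return " ".join(output)
```

Because successors are stored with repetition, frequent continuations are proportionally more likely to be picked, which is the whole trick behind next-word suggestions.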
Neural Networks are built from many interconnected layers, each with its own Activation Function. But why is this necessary? What are Activation Functions for?
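The short answer is nonlinearity: without an activation, stacking linear layers collapses into a single linear layer, no matter how deep the stack. A minimal sketch with toy 1-D "layers" (the layer weights here are arbitrary numbers chosen for the example):

```python
def linear(w, b):
    """A 1-D 'layer': x -> w*x + b."""
    return lambda x: w * x + b

def relu(x):
    """The ReLU activation: clips negatives to zero, breaking linearity."""
    return max(0.0, x)

# Two stacked linear layers...
layer1 = linear(2.0, 1.0)   # x -> 2x + 1
layer2 = linear(3.0, -2.0)  # x -> 3x - 2
stacked = lambda x: layer2(layer1(x))

# ...collapse into one equivalent linear layer: 3*(2x + 1) - 2 = 6x + 1
collapsed = linear(6.0, 1.0)

# With ReLU in between, the composition is no longer a single line,
# so the network can actually model nonlinear functions.
nonlinear = lambda x: layer2(relu(layer1(x)))
```

`stacked` and `collapsed` agree on every input, which is exactly why a deep network with no activations is no more expressive than one layer; `nonlinear` diverges from them as soon as `layer1`'s output goes negative.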
LSTM Neural Networks have seen a lot of use in recent years, both for text and music generation and for Time Series Forecasting.
Today, I’ll teach you how to train an LSTM Neural Network for text generation, so that it can write in H. P. Lovecraft’s style.
Convolutional Neural Networks are a part of what made Deep Learning reach the headlines so often in the last decade. Today we’ll train a CNN to tell us whether an image contains a dog or a cat, using TensorFlow’s eager API.
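Before reaching for TensorFlow, it helps to see what a single convolutional filter actually computes. Below is a plain-Python sketch of valid 2-D cross-correlation, the operation a CNN layer applies at every position (in a real network the kernel weights are learned, not hand-written); this is an illustration, not the post's TensorFlow code.

```python
def convolve2d(image, kernel):
    """Slide the kernel over the image and take a weighted sum at each
    position ('valid' mode: no padding, so the output shrinks)."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# A tiny hand-made vertical-edge detector on a dark-to-bright image:
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[1, -1],
          [1, -1]]
edges = convolve2d(image, kernel)  # responds only where columns change
```

The response is zero over flat regions and nonzero exactly at the vertical boundary, which is why stacks of such learned filters are so good at picking out whiskers, ears, and other cat/dog cues.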
Applying filters to images is nothing new. We take a picture, make a few changes to it, and now it looks cooler. But where does Artificial Intelligence come in? Let’s try out a fun use for Unsupervised Machine Learning with K-Means Clustering.
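The core idea is color quantization: cluster the image's pixel values with K-Means, then repaint every pixel with its cluster's centroid. Here is a deliberately simplified sketch using Lloyd's algorithm on scalar intensities (a real image filter would cluster RGB triples; the function names are mine):

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on scalar values (e.g. pixel intensities):
    assign each value to its nearest centroid, then move each centroid
    to the mean of its cluster, and repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def quantize(values, centroids):
    """The 'filter': replace each pixel with its nearest centroid,
    reducing the image to k distinct tones."""
    return [min(centroids, key=lambda c: abs(v - c)) for v in values]
```

Run on a strip of pixels with two obvious tones, the centroids settle near the two group means, and `quantize` flattens the image to exactly those two values, which is what gives K-Means-filtered photos their posterized look.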