
The Only Neural Network Layer You Will EVER Need

Neural networks can seem daunting, complicated, and impossible to explain. But in reality they are remarkably simple. In fact, they only ever require a single layer of neurons.

In my previous post about the basics of neural networks, I talked about how neurons compute values. They take a set of inputs, multiply each input value by a weight, and sum the terms. An activation function is then applied to the sum of products to yield the output value. That output value could be zero (i.e., the neuron did not activate), negative, or positive.

In this post, we will go deeper down the rabbit hole. We will look at neuron layers, ask which layers are actually necessary for a network to function, and come to the stunning realization that all neural networks have only a single output.

Organizing Neurons into Layers

In most neural networks, we tend to organize neurons into layers. The reason for this comes from graph theory (as neural networks are little more than computational graphs). Each layer con
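
To make the per-neuron computation described earlier concrete, here is a minimal sketch in Python. The choice of tanh as the activation is an illustrative assumption (any activation that can return negative, zero, or positive values fits the description above), and the example inputs and weights are made up:

import numpy as np

def neuron(inputs, weights, bias=0.0):
    """One neuron: multiply each input by its weight, sum the terms,
    then apply an activation function to the sum of products."""
    z = np.dot(inputs, weights) + bias  # weighted sum of the inputs
    return np.tanh(z)                   # activation: can be negative, zero, or positive

# Example: three inputs and three weights (values chosen arbitrarily)
print(neuron([0.5, -1.0, 2.0], [0.2, 0.4, -0.1]))  # ~ -0.46
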