
Posts

The Only Neural Network Layer You Will EVER Need

Neural networks can seem daunting, complicated, and impossible to explain. But in reality they are remarkably simple. In fact, they only ever require a single layer of neurons. In my previous post about the basics of neural networks, I talked about how neurons compute values. They take a set of inputs, multiply each input value by a weight, and sum the terms. An activation function is then applied to the sum of products to yield the output value. That output value could be zero (i.e., the neuron did not activate), negative, or positive. In this post, we will go deeper down the rabbit hole. We will look at neuron layers, which layers are actually necessary for a network to function, and come to the stunning realization that all neural networks have only a single output.

Organizing Neurons into Layers

In most neural networks, we tend to organize neurons into layers. The reason for this comes from graph theory (as neural networks are little more than computational graphs). Each layer con...
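
To make that computation concrete, here is a minimal sketch of a single neuron in Python. The function name and the example inputs and weights are my own illustrations, not code from the post; tanh is assumed as the activation because, as described above, the output can be negative, zero, or positive.

```python
import math

def neuron(inputs, weights, activation=math.tanh):
    """Multiply each input by its weight, sum the products, then activate."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return activation(total)

# Two inputs, two weights: 0.5*0.8 + (-1.0)*0.3 = 0.1, then tanh(0.1).
print(neuron([0.5, -1.0], [0.8, 0.3]))  # ~0.0997
```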

Neural Network Pooling Layers

Neural networks need to map inputs to outputs. It seems simple enough, but in most useful cases this means building a network with millions of parameters, which capture millions or billions of relationships hidden in the input data. Do we need all of these relationships? Is all of this information necessary? Probably not. That's where neural network pooling layers can help. In this post, we're going to dive into the deep end and learn how pooling layers can reduce the size of your network while still producing highly accurate models.
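
As a quick taste of what the post covers, here is a sketch of max pooling; the 2x2 pool size, stride of 2, and input values are assumptions for illustration. Each non-overlapping 2x2 window is replaced by its largest value, so a 4x4 input shrinks to 2x2.

```python
# Sketch of 2x2 max pooling with stride 2 on a small 2-D input.
def max_pool_2x2(grid):
    """Replace each non-overlapping 2x2 window with its maximum value."""
    pooled = []
    for r in range(0, len(grid) - 1, 2):
        row = []
        for c in range(0, len(grid[0]) - 1, 2):
            window = [grid[r][c], grid[r][c + 1],
                      grid[r + 1][c], grid[r + 1][c + 1]]
            row.append(max(window))
        pooled.append(row)
    return pooled

image = [[1, 3, 2, 0],
         [4, 2, 1, 1],
         [0, 1, 5, 2],
         [2, 3, 1, 4]]
print(max_pool_2x2(image))  # [[4, 2], [3, 5]] -- a 4x4 input becomes 2x2
```

Halving each spatial dimension this way quarters the number of values the next layer has to process, which is how pooling shrinks a network.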

Neural Network Dense Layers

Neural network dense layers (or fully connected layers) are the foundation of nearly all neural networks. If you look closely at almost any topology, somewhere there is a dense layer lurking. This post will cover the history behind dense layers, what they are used for, and how to use them by walking through the "Hello, World!" of neural networks: digit classification.
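
For a sense of what that walkthrough looks like, here is a minimal Keras sketch of a dense-layer digit classifier. The layer sizes and hyperparameters are my assumptions, not necessarily the ones the post uses.

```python
# Minimal sketch of a dense-layer MNIST digit classifier in Keras.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # flatten 28x28 images
    tf.keras.layers.Dense(128, activation="relu"),    # fully connected layer
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# To train on MNIST (downloads the dataset on first use):
# (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
# model.fit(x_train / 255.0, y_train, epochs=5)
```

The two Dense calls are the fully connected layers the post is about; everything else is plumbing to get images in and predictions out.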

What are Neural Networks?

I used to work in HPC (as evidenced in my posts on AoS vs SoA and AoSoAs), but my interest has always been nature-inspired algorithms, like genetic algorithms. But there's a nature-inspired algorithm that is getting lots of attention these days: neural networks. People think of them as black boxes, but in this post I'll try to peel back the top and explain what's going on inside.

Taking Care of Yourself

We've all done it before. As a recovering academic, I can say that working until complete burnout was practically part of my job description. But if you want to have a long, productive career, then you need to be sure you take care of yourself. Here are a few things I make sure to do now that I didn't necessarily do in the past.