Fully-connected layers (also known as dense layers) are a fundamental building block of artificial neural networks: every neuron in the layer is connected to every neuron in the previous layer, and each connection has an associated weight. This gives the layer access to all features from the previous layer, which enables it to learn complex representations.
They are mostly used to combine features extracted by earlier layers so the model can make decisions or classifications. In many models, the output layer is a fully-connected layer that produces predictions through an activation function.
For a neuron $j$ in a dense layer:

$$y_j = f\left(\sum_i w_{ij} x_i + b_j\right)$$

where:
- $x_i$ is the input from the $i$-th neuron in the previous layer
- $w_{ij}$ is the weight connecting neuron $i$ to neuron $j$
- $b_j$ is the bias for neuron $j$
- $f$ is the activation function
- $y_j$ is the output of neuron $j$
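The formula above can be sketched as a vectorized forward pass in NumPy, where the sum over inputs becomes a matrix-vector product. The function name `dense_forward` and the specific shapes are illustrative choices, not from the original text.

```python
import numpy as np

def dense_forward(x, W, b, activation=np.tanh):
    # Computes y_j = f(sum_i w_ij * x_i + b_j) for all neurons j at once:
    # x @ W performs the weighted sum, b adds the per-neuron bias,
    # and the activation function f is applied elementwise.
    return activation(x @ W + b)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)       # 4 input features from the previous layer
W = rng.standard_normal((4, 3))  # weights: 4 inputs -> 3 neurons
b = np.zeros(3)                  # one bias per neuron

y = dense_forward(x, W, b)
print(y.shape)  # (3,) -- one output per neuron in the layer
```

With an identity activation, the layer reduces to a plain affine map `x @ W + b`, which makes it easy to check the arithmetic by hand.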
Ultimately, dense layers are typically combined with other layers (such as convolutional or recurrent layers) to improve the model's performance.