The dropout layer reduces overfitting in a neural network by randomly disabling some neurones during the forward pass: a disabled neurone has its output set to 0. Which neurones are disabled is chosen at random, each with a fixed probability, on every forward pass. Each dropout neurone is connected to exactly one neurone in the previous layer.
Dropout layers are only active during training; during testing / application they simply pass their inputs on unchanged. (In the common "inverted dropout" formulation, the surviving outputs are scaled by 1/(1−p) during training, where p is the dropout probability, so that the expected activations match between training and testing.)
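The behaviour described above can be sketched in a few lines of NumPy. This is a minimal illustration of inverted dropout (the variant in which scaling happens at training time, consistent with the unchanged test-time pass-through described above); the function name `dropout_forward` and its signature are my own, not from the original text.

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Apply inverted dropout to the activations x.

    During training, each element is zeroed with probability p and the
    survivors are scaled by 1/(1 - p). At test time the layer is the
    identity: inputs are passed on unchanged.
    """
    if not training or p == 0.0:
        return x  # testing / application: pass inputs through unchanged
    rng = np.random.default_rng() if rng is None else rng
    # Binary mask: each entry is 1 (keep) with probability 1 - p, else 0.
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    # Element-wise multiply by the mask, then rescale the survivors.
    return x * mask / (1.0 - p)
```

Because the mask is resampled on every call, repeated forward passes over the same input disable a different random subset of neurones each time.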
A single dropout layer can be defined as
This can be rewritten using a Hadamard (element-wise) product of two matrices to describe a full layer of dropout neurones as follows