Notation of a Fully Connected Neural Network

Chinmayi Sahu
Dec 29, 2020


This example has one input layer, three hidden layers, and an output layer.

Elements of a Deep Neural Network

Xi is a four-dimensional feature vector, i.e., a single data point.

Xij represents the j-th feature of data point Xi.

Y_Pred is a real number, which means we are solving a regression problem in this example.

F is the activation function.

O is the output.

W is a weight.

We have a weight associated with each edge. The inputs to a layer are multiplied by these weights and summed, and the result passes through a function F. If the output is the same as the input, we call F the identity function, as in linear regression.
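To make this concrete, here is a minimal NumPy sketch of one layer's forward pass, using the 4-unit to 3-unit sizes from this example (the variable names and random values are mine, purely for illustration):

```python
import numpy as np

# One fully connected layer: 4 inputs -> 3 outputs (sizes from the example).
rng = np.random.default_rng(0)
x = rng.normal(size=4)        # input vector Xi with 4 features
W = rng.normal(size=(4, 3))   # one weight per edge between the two layers

def f(z):
    # Identity activation: the output equals the input (as in linear regression).
    return z

o = f(x @ W)                  # multiply inputs by weights, sum, then apply F
print(o.shape)                # (3,) -- one output O per unit in the next layer
```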

Layer 2's output is passed to Layer 3

The weights in each layer can be represented as a matrix of size "number of inputs" x "number of outputs".

Let's consider the weights between hidden layer 1 and hidden layer 2. Hidden layer 1 has 4 units and hidden layer 2 has 3 units, so the total number of weights is 4 * 3 = 12. Similarly, between hidden layer 2 and hidden layer 3, the number of weights is 3 * 2 = 6.
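As a quick check, the sketch below builds the two weight matrices with the "inputs x outputs" shapes described above and counts their entries (matrix names are mine; sizes come from this example):

```python
import numpy as np

# Weight matrices shaped "number of inputs" x "number of outputs".
W12 = np.zeros((4, 3))   # hidden layer 1 (4 units) -> hidden layer 2 (3 units)
W23 = np.zeros((3, 2))   # hidden layer 2 (3 units) -> hidden layer 3 (2 units)

print(W12.size)  # 12 weights
print(W23.size)  # 6 weights
```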

For simplicity, I excluded the bias, but we can include a bias term as well, as sketched below.
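If we do include it, each unit in the next layer gets its own bias, added after the weighted sum. A minimal sketch, reusing the illustrative 4-unit to 3-unit layer from above:

```python
import numpy as np

# The same 4 -> 3 layer, now with one bias per output unit.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W = rng.normal(size=(4, 3))
b = np.zeros(3)        # bias vector, one entry per unit in the next layer

o = x @ W + b          # weighted sum plus bias, before the activation F
print(o.shape)         # (3,)
```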
