Concepts

When we count layers, we do not take the input layer into account.

Logistic Regression is therefore a one-layer neural network.
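
A minimal sketch of this counting convention, assuming the network is described by a hypothetical `layer_dims` list of unit counts with the (uncounted) input layer at index 0:

```python
# Hypothetical example: 2 input features, one hidden layer of 4 units,
# and a single output unit. Index 0 is the input layer, which is not counted.
layer_dims = [2, 4, 1]

L = len(layer_dims) - 1  # layers are counted without the input layer
print(L)                 # 2 -> a two-layer neural network

# Logistic regression would correspond to layer_dims = [n_x, 1], i.e. L = 1.
```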

Notation

L = Number of layers.

x = Input features (a[0] = x).

n[l] = Number of units in layer l.

a[l] = Activation in layer l.

a[l] = g[l](z[l]).

W[l] = Weights used to compute z[l].

b[l] = Bias used to compute z[l].

$\hat{y}$ = Predicted output ($\hat{y} = a[L]$).

In this example, n[0] = 2 (the input layer has two features).
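
A minimal sketch of how this notation maps onto array shapes, assuming a hypothetical network with `layer_dims = [2, 4, 1]` (so n[0] = 2 as above): under the usual convention, W[l] has shape (n[l], n[l-1]) and b[l] has shape (n[l], 1).

```python
import numpy as np

# Hypothetical layer sizes: n[0] = 2 input features, n[1] = 4 hidden units,
# n[2] = 1 output unit.
layer_dims = [2, 4, 1]

parameters = {}
for l in range(1, len(layer_dims)):
    # W[l] has shape (n[l], n[l-1]); b[l] has shape (n[l], 1).
    parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))

for name, value in parameters.items():
    print(name, value.shape)  # W1 (4, 2), b1 (4, 1), W2 (1, 4), b2 (1, 1)
```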

Forward Propagation

Given the input a[l-1] (with a[0] = x):

z[l] = W[l] a[l-1] + b[l]

a[l] = g[l](z[l])
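
A minimal numpy sketch of one forward pass under these equations; the sigmoid activation, the layer sizes, and the input vector x below are assumptions made only for illustration:

```python
import numpy as np

def sigmoid(z):
    # Example choice for g[l]; ReLU, tanh, etc. would also work.
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagation(x, parameters, L):
    # a[0] = x; each layer computes z[l] = W[l] a[l-1] + b[l], then a[l] = g[l](z[l]).
    a = x
    for l in range(1, L + 1):
        W = parameters["W" + str(l)]
        b = parameters["b" + str(l)]
        z = W @ a + b        # z[l] = W[l] a[l-1] + b[l]
        a = sigmoid(z)       # a[l] = g[l](z[l])
    return a                 # a[L] = y_hat

# Tiny example with n[0] = 2, n[1] = 4, n[2] = 1 (L = 2):
rng = np.random.default_rng(0)
parameters = {
    "W1": rng.standard_normal((4, 2)) * 0.01, "b1": np.zeros((4, 1)),
    "W2": rng.standard_normal((1, 4)) * 0.01, "b2": np.zeros((1, 1)),
}
x = np.array([[0.5], [-1.2]])          # column vector of input features
y_hat = forward_propagation(x, parameters, L=2)
print(y_hat.shape)                     # (1, 1)
```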