
As mentioned in the question, I have trouble understanding the differences between these terms.

From what I have understood:

  1. Forward pass: compute the output of the network given the input data

  2. Backward pass: compute the error of the output with respect to the expected output, then go backward through the network and update the weights using gradient descent, etc.

What is backpropagation then? Is it the combination of the previous 2 steps? Or is it the particular method we use to compute dE/dw (chain rule, etc.)?
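To make my understanding concrete, here is how I would sketch the two steps for a toy one-weight model (the model, values, and names here are purely illustrative):

```python
# Toy single-weight model: y_hat = w * x, with loss E = 0.5 * (y_hat - y)^2.
x, y = 2.0, 10.0   # one training example
w = 1.0            # initial weight
lr = 0.1           # learning rate

# 1. Forward pass: compute the output of the network given the input
y_hat = w * x

# 2. Backward pass: compute dE/dw via the chain rule, then update the weight
#    dE/dw = dE/dy_hat * dy_hat/dw = (y_hat - y) * x
grad = (y_hat - y) * x
w -= lr * grad
```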

Mattia Surricchio

3 Answers


In brief, backpropagation refers to the idea of using the difference between the prediction and the actual values to fit the parameters (weights) of the model. But to apply it, a forward propagation is always required first. So we could say that the backpropagation method applies forward and backward passes, sequentially and repeatedly.

Your machine learning model starts with random parameter values and makes a prediction with them (forward propagation). Then it compares those predictions with the real values and adjusts the parameters accordingly (backpropagation), trying to minimize the error (depending on your objective function and the optimization method applied). It then starts over again, until the stopping criterion is reached.
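As a rough sketch of that loop (the linear model, learning rate, and epoch count below are illustrative assumptions, not a recipe):

```python
import numpy as np

# Hypothetical minimal training loop: linear model y_hat = w*x + b,
# fitted to data generated by y = 3x + 1 with squared-error loss.
rng = np.random.default_rng(0)
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = 3.0 * X + 1.0                  # the "real values" (train data)
w, b = rng.normal(), rng.normal()  # random initial parameter values
lr = 0.05

for epoch in range(2000):          # stopping criterion: fixed epoch count
    y_hat = w * X + b              # forward propagation: make predictions
    err = y_hat - Y                # compare with the real values
    grad_w = np.mean(err * X)      # backpropagation: gradients of the
    grad_b = np.mean(err)          #   mean squared error
    w -= lr * grad_w               # adjust the parameters
    b -= lr * grad_b
```

After enough iterations, `w` and `b` approach the generating values 3 and 1.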

You may find a better explanation in this question.

Furthermore, you will find a more in-depth explanation of the topic here.

Zephyr
Dave

In a narrow sense, backpropagation refers only to the calculation of the gradients; it does not, for example, include the update of any weights. Usually, though, the term is used to refer to the whole backward pass.
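A minimal sketch of that distinction, using the same toy one-weight model as in the question (the helper names are illustrative, not standard):

```python
def backprop(w, x, y):
    """Backpropagation in the narrow sense: compute dE/dw only.

    A forward pass is needed first; for E = 0.5 * (y_hat - y)^2
    the chain rule gives dE/dw = (y_hat - y) * x.
    """
    y_hat = w * x
    return (y_hat - y) * x

def sgd_update(w, grad, lr=0.1):
    """The weight update is a separate step, done by the optimizer."""
    return w - lr * grad

w = 1.0
grad = backprop(w, x=2.0, y=10.0)   # gradients only
w = sgd_update(w, grad)             # not part of backpropagation per se
```

Frameworks reflect this split: in PyTorch, for instance, `loss.backward()` computes the gradients while `optimizer.step()` applies the weight update.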

Also see Wikipedia.

Zephyr
Jonathan

Forward pass is the same as forward propagation, and backward pass is the same as backpropagation.

The forward pass (forward propagation) calculates the model's predictions from the input data, working from the input layer to the output layer.

The backward pass (backpropagation) calculates the gradients of the mean (average) of the losses (differences) between the model's predictions and the true values (train data), working from the output layer to the input layer.
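Both directions can be sketched for a hypothetical two-layer network (the shapes, random data, and ReLU choice are illustrative assumptions):

```python
import numpy as np

# Two-layer net: x -> h = relu(x @ W1) -> y_hat = h @ W2,
# loss = mean of the squared differences over the batch.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # batch of 8 inputs (train data)
Y = rng.normal(size=(8, 1))          # true values
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))

# Forward pass: input layer -> output layer
H = np.maximum(X @ W1, 0.0)          # hidden activations (ReLU)
Y_hat = H @ W2                       # model's predictions
loss = np.mean((Y_hat - Y) ** 2)     # mean of the losses over the batch

# Backward pass: output layer -> input layer, via the chain rule
dY_hat = 2.0 * (Y_hat - Y) / len(X)  # d(loss)/d(predictions)
dW2 = H.T @ dY_hat                   # gradient for the last layer first...
dH = dY_hat @ W2.T
dH[H <= 0.0] = 0.0                   # ReLU derivative
dW1 = X.T @ dH                       # ...then the earlier layer
```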