
Is there a way to train a neural network to compute $f(x) = \frac{1}{x}$ exactly?

Ethan

2 Answers


Since the activation function itself can be the inverse function $1/x$, the answer is yes: a network using that activation computes the function exactly rather than approximating it.
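
For illustration, a minimal sketch of this idea, assuming a single linear unit whose activation is the reciprocal itself (the names `reciprocal_activation` and `network` are placeholders, not from the answer):

```python
import numpy as np

def reciprocal_activation(z):
    # Illustrative activation: the inverse function itself.
    return 1.0 / z

def network(x, w=1.0, b=0.0):
    # One linear unit followed by the reciprocal activation.
    # With w = 1 and b = 0 the unit computes 1/x exactly (for x != 0).
    return reciprocal_activation(w * x + b)

x = np.array([0.5, 1.0, 2.0, 4.0])
print(network(x))  # [2.   1.   0.5  0.25] -- exactly 1/x
```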

Stephen Rauch

Division by zero is not defined, so at $x = 0$ it is not possible to find the gradient, and hence the function cannot be exactly approximated by a neural network at that point.
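
For instance, in floating-point arithmetic the training target itself has no finite value at $x = 0$ (a small NumPy check, just for illustration):

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 5)        # includes x = 0
with np.errstate(divide="ignore"):   # suppress the divide-by-zero warning
    y = 1.0 / x                      # target values for training
print(y)  # [-1. -2. inf  2.  1.] -- no finite target (or gradient) at 0
```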

user80942