
Let's say that I have a standard feedforward neural network with $M$ inputs, some number of hidden layers $N$, and a single output neuron.

Is it possible to construct a network such that the output neuron's output value is the maximum weighted input value?

I know that I could easily hard-code something to make this happen, but I'd like to limit the construction to "standard" operations, where a neuron's activation is the dot product of the input vector and the weight vector, plus a bias, passed through a differentiable activation function.
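The "standard" operation described above can be sketched as follows; tanh is only an illustrative choice of activation, not one assumed by the question:

```python
import numpy as np

def neuron(x, w, b, activation=np.tanh):
    """One 'standard' neuron: dot product of inputs and weights,
    plus a bias, passed through a differentiable activation
    (tanh here, chosen purely as an example)."""
    return activation(np.dot(w, x) + b)

x = np.array([1.0, 2.0])
w = np.array([0.5, -0.25])
print(neuron(x, w, b=0.0))  # tanh(0.5*1 - 0.25*2 + 0) = tanh(0) = 0.0
```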

Alan Wolfe

1 Answer


No, not with standard activation functions, but you might be able to approximate it. For instance, if the activation function is differentiable, then the output of such a network will always be differentiable, but the "max" function is not differentiable everywhere: it has kinks wherever two of its arguments tie.
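One common differentiable surrogate for the max, assuming the weighted inputs are available as a vector $z$, is the log-sum-exp approximation $\frac{1}{\beta}\log\sum_i e^{\beta z_i}$, which approaches $\max_i z_i$ as $\beta \to \infty$. A minimal sketch (the vectors here are made-up example data):

```python
import numpy as np

def smooth_max(z, beta=50.0):
    """Differentiable approximation of max(z) via log-sum-exp.
    Overestimates max(z) by at most log(len(z)) / beta."""
    # Shift by the max for numerical stability (standard log-sum-exp trick).
    m = np.max(z)
    return m + np.log(np.sum(np.exp(beta * (z - m)))) / beta

x = np.array([0.2, -1.0, 0.7])   # example inputs
w = np.array([1.5, 2.0, 0.5])    # example weights
z = w * x                        # elementwise weighted inputs

print(smooth_max(z))  # close to np.max(z) = 0.35
```

Larger `beta` tightens the approximation but makes the surface sharper and the gradients less smooth, so there is a trade-off if you intend to train through it.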

You're more likely to get a useful answer if you provide some context about what you're trying to accomplish or how you'll use the answer. I suspect an XY problem.

D.W.