
Given a matrix $\mathbf{W}\in\mathbb{R}^{n\times n}$ and a vector $\mathbf{b}\in\mathbb{R}^{n}$, find the equilibria of the following single-layer neural network: $$ f(\mathbf{x})=\text{ReLU}(\mathbf{Wx+b}), $$ where $\text{ReLU}(\cdot)=\max(\cdot,\mathbf{0})$ is the element-wise activation function; that is, find the set of $\mathbf{x}\in\mathbb{R}^n$ such that: $$ f(\mathbf{x})-\mathbf{x}=\mathbf{0}. $$

How can one determine whether equilibrium points exist, and compute them efficiently?
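For concreteness, here is a minimal NumPy sketch of $f$ and the fixed-point residual (the values of `W`, `b`, and the test point `x` below are arbitrary placeholders, not data from the problem):

```python
import numpy as np

def f(x, W, b):
    """Single-layer ReLU network: f(x) = max(Wx + b, 0), element-wise."""
    return np.maximum(W @ x + b, 0.0)

def residual(x, W, b):
    """Fixed-point residual f(x) - x; x is an equilibrium iff this is zero."""
    return f(x, W, b) - x

# Arbitrary example data.
rng = np.random.default_rng(0)
n = 4
W = 0.5 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x = rng.standard_normal(n)
print(residual(x, W, b))
```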


My Efforts:

I constructed a cost function: $$ J(\mathbf{x})=\lVert f(\mathbf{x})-\mathbf{x}\rVert_2^2 =\lVert \max(\mathbf{Wx+b},\mathbf{0})-\mathbf{x}\rVert_2^2, $$

and tried to find its minimizers, but I have not been able to make progress this way.
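One way to attack this numerically (a sketch only, with no global guarantee, since $J$ is piecewise quadratic and generally non-convex) is to hand the residual $f(\mathbf{x})-\mathbf{x}$ to a nonlinear least-squares solver such as `scipy.optimize.least_squares`; a returned point with $J(\mathbf{x}^*)\approx 0$ certifies an equilibrium, while a nonzero minimum is inconclusive:

```python
import numpy as np
from scipy.optimize import least_squares

def solve_equilibrium(W, b, x0):
    """Minimize J(x) = ||max(Wx + b, 0) - x||_2^2 via least squares.
    Returns a candidate equilibrium; J(x*) ~ 0 certifies success."""
    res = least_squares(lambda x: np.maximum(W @ x + b, 0.0) - x, x0)
    return res.x, 2.0 * res.cost  # res.cost is 0.5 * J(x*)

# Arbitrary placeholder data.
rng = np.random.default_rng(0)
n = 4
W = 0.5 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x_star, J_star = solve_equilibrium(W, b, rng.standard_normal(n))
print(x_star, J_star)
```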

The problem can also be recast as a limit:

$$ \mathbf{x}^*=\underset{k\rightarrow \infty}{\lim}\mathbf{x}_k, $$

where $\mathbf{x}_k=f\left( \mathbf{x}_{k-1} \right)$ for some given starting point $\mathbf{x}_{0}$, but I do not know when this iteration converges or how to analyze it.
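One sufficient condition (a standard fact, not specific to this question): since $\max(\cdot,0)$ is 1-Lipschitz element-wise, $\lVert f(\mathbf{x})-f(\mathbf{y})\rVert_2 \le \lVert \mathbf{W}\rVert_2\,\lVert \mathbf{x}-\mathbf{y}\rVert_2$, so if the spectral norm satisfies $\lVert \mathbf{W}\rVert_2 < 1$ then $f$ is a contraction, and by the Banach fixed-point theorem a unique equilibrium exists and the iteration converges from any $\mathbf{x}_0$. A sketch of that iteration under this assumption (placeholder data again):

```python
import numpy as np

def picard_iterate(W, b, x0, tol=1e-10, max_iter=10_000):
    """Iterate x_k = f(x_{k-1}) = max(W x_{k-1} + b, 0).
    Guaranteed to converge from any x0 when ||W||_2 < 1
    (f is then a contraction; Banach fixed-point theorem)."""
    x = x0
    for _ in range(max_iter):
        x_next = np.maximum(W @ x + b, 0.0)
        if np.linalg.norm(x_next - x) <= tol:
            return x_next
        x = x_next
    raise RuntimeError("no convergence; ||W||_2 >= 1 may preclude a contraction")

# Arbitrary data, rescaled so that ||W||_2 < 1 holds.
rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
b = rng.standard_normal(n)
x_star = picard_iterate(W, b, np.zeros(n))
print(np.max(np.abs(np.maximum(W @ x_star + b, 0.0) - x_star)))  # ~ 0
```

When $\lVert \mathbf{W}\rVert_2 \ge 1$ the iteration may diverge or cycle, and equilibria need not exist or be unique.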

