In this case, $f$ is the function defined as $$f:\Bbb R^n \rightarrow \Bbb R, \qquad f(\mathbf x) = \sum_{i=1}^{n}x_i,$$
where $\mathbf x = (x_1,x_2,\dots,x_{n-1},x_n)$.
Now we need the notion of the gradient, which generalizes the ordinary derivative from single-variable calculus. We first define the partial derivative of $f$ in each $x_i$ direction as
$$\frac{\partial f(\mathbf x)}{\partial x_i} = \lim_{h \to 0}\frac{f(\mathbf x + h\mathbf u_i) - f(\mathbf x)}{h}$$
Here, $\mathbf u_i$ denotes the unit vector in the $x_i$ direction, i.e. $\mathbf u_i = (0,\dots,0,1,0,\dots,0)$ with the $1$ in the $i$-th position.
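For the specific $f$ above, each partial derivative follows directly from this limit: shifting $\mathbf x$ by $h\mathbf u_i$ adds $h$ to the $i$-th coordinate and leaves the others unchanged, so
$$\frac{\partial f(\mathbf x)}{\partial x_i} = \lim_{h \to 0}\frac{\left(\sum_{j=1}^{n}x_j + h\right) - \sum_{j=1}^{n}x_j}{h} = \lim_{h \to 0}\frac{h}{h} = 1 \quad \text{for every } i.$$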
Now we define the gradient, a vector-valued function $\nabla f:\Bbb R^n \rightarrow \Bbb R^n$, as
$$\nabla f(\mathbf x) = \sum_i\frac{\partial f(\mathbf x)}{\partial x_i}\mathbf u_i$$
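Plugging the partial derivatives computed above into this formula gives the gradient of our $f$ explicitly:
$$\nabla f(\mathbf x) = \sum_{i=1}^{n} 1\cdot\mathbf u_i = (1,1,\dots,1).$$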
One can easily check that in the single-variable case ($n=1$) this matches our usual notion of the derivative.
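Indeed, for $n=1$ the sum has a single term and $\mathbf u_1 = 1$, so the definition reduces to
$$\nabla f(x) = \frac{\partial f(x)}{\partial x}\,\mathbf u_1 = \lim_{h \to 0}\frac{f(x+h)-f(x)}{h},$$
which is exactly the ordinary derivative $f'(x)$.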