I’ll summarize the comments here and add some further details:
As stated, the argument makes no sense. Whatever space we work in and whatever $\int$ means precisely (recall that an antiderivative is only defined up to a constant, so the operator is not truly specified), as long as $\int$ has an eigenvector with eigenvalue $1$ (say, $e^x$), the operator $1 - \int$ is not injective, so $(1 - \int)^{-1}$ does not exist. And even if some inverse could be assigned, any $ce^x$ with $c$ constant is an eigenvector with eigenvalue $1$, so there is no way this procedure could single out the power series of $e^x$ rather than, say, that of $-e^x$.
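To make the eigenvector point concrete, here is a quick symbolic sanity check (my own addition, using SymPy; the normalization $\int e^x\,dx = e^x$ is the one SymPy happens to return):

```python
import sympy as sp

x, c = sp.symbols('x c')

# With the antiderivative normalized so that int e^x dx = e^x,
# every c*e^x is a fixed point of integration, hence lies in the
# kernel of 1 - ∫, so 1 - ∫ cannot be injective, let alone invertible.
f = c * sp.exp(x)
residual = f - sp.integrate(f, x)   # (1 - ∫) applied to c e^x
print(sp.simplify(residual))        # 0 for every constant c
```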
But as the question linked in Gonçalo’s comment illustrates, this can be fixed by specifying $\int$ as $\int_0^x$, i.e., $[\int f](x) = \int_0^x f$. Then $e^x$ is the unique function $f$ satisfying $(1 - \int)f = f - \int f = 1$. Using this, we can obtain the power series expansion of $e^x$ around $0$ (which, as we know, has radius of convergence $\infty$, though the method only guarantees a radius of convergence of at least $1$). Indeed, for any $0 < \epsilon < 1$, consider the space $C[-\epsilon, \epsilon]$. Then $\int$ may be understood as an operator from $C[-\epsilon, \epsilon]$ to itself, and one easily checks that its operator norm (with respect to the sup norm) is $\epsilon < 1$. Hence $1 - \int$ is invertible, with inverse $(1 - \int)^{-1} = \sum_{n = 0}^\infty \int^n$ given by the Neumann series. Thus,
$$e^x = (1 - \int)^{-1} 1 = \sum_{n = 0}^\infty \int^n 1 = \sum_{n = 0}^\infty \frac{1}{n!} x^n$$
where the convergence is uniform on $[-\epsilon, \epsilon]$ for any $0 < \epsilon < 1$. (Again, we actually know that the convergence is uniform on any bounded interval, but this method only proves it for $\epsilon < 1$ rather than for arbitrarily large $\epsilon$.)
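Since the argument is really a Neumann-series computation, it can also be checked numerically. Below is a small sketch (my own addition, using NumPy; the grid size, the trapezoid rule, and the choice $\epsilon = 0.9$ are all arbitrary): repeatedly apply the discretized $\int_0^x$ to the constant function $1$ and sum, then compare the partial sums against $e^x$ in sup norm.

```python
import numpy as np

eps = 0.9                       # any 0 < eps < 1 works
xs = np.linspace(-eps, eps, 2001)
h = xs[1] - xs[0]
i0 = xs.size // 2               # the grid is symmetric, so xs[i0] == 0

def int_from_zero(f_vals):
    """Trapezoid approximation of x -> int_0^x f on the grid xs."""
    F = np.concatenate(([0.0], np.cumsum((f_vals[1:] + f_vals[:-1]) / 2) * h))
    return F - F[i0]            # shift so the antiderivative vanishes at 0

term = np.ones_like(xs)         # the n = 0 term: ∫^0 1 = 1
partial = term.copy()
for _ in range(30):             # partial sums of sum_n ∫^n 1 = sum_n x^n/n!
    term = int_from_zero(term)
    partial += term

sup_err = np.max(np.abs(partial - np.exp(xs)))
print(f"sup-norm error on [-{eps}, {eps}]: {sup_err:.2e}")
```

The error decays geometrically in the number of terms (roughly like $\epsilon^n$), matching the operator-norm bound above; the remaining discrepancy is just trapezoid-rule discretization error.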