To show that the convergence to $0$ is not almost sure, note that
$$\{X_n = n+1\} \subset \{|S_n| \geq n/2\} \cup \{|S_{n-1}| \geq n/2\}.$$
The probabilities of the events on the left-hand side have infinite sum, and these events are independent, so by the second Borel–Cantelli lemma they occur infinitely often with probability $1$. By the inclusion above, it therefore happens with probability $1$ that $|S_n|/n \geq 1/2$ infinitely often.
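(The inclusion is just the triangle inequality: on the event $\{X_n = n+1\}$, if $|S_{n-1}| < n/2$, then
$$|S_n| \geq |X_n| - |S_{n-1}| > (n+1) - \tfrac{n}{2} > \tfrac{n}{2},$$
so at least one of the two events on the right-hand side must occur.)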
For the convergence in probability, I quote the following result from Chung's *A Course in Probability Theory*:
Suppose $(X_n)$ is a sequence of independent random variables with distribution functions $F_n$. Let $(b_n)$ be a sequence of real numbers increasing to $\infty$. Suppose that
$$\sum_{j=1}^n \int_{|x|>b_n} dF_j(x) = o(1) $$
and that
$$\frac{1}{b_n^2} \sum_{j=1}^n \int_{|x| \leq b_n} x^2 dF_j(x) = o(1) .$$
Define some new constants
$$a_n := \sum_{j=1}^n \int_{|x|\leq b_n} x dF_j(x) .$$
Then the following convergence holds in probability:
$$\frac{1}{b_n} (S_n - a_n) \rightarrow 0. $$
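For completeness, here is a sketch of the truncation argument behind this result (nothing beyond Chebyshev's inequality is needed). Put $X_{j,n} := X_j \mathbf{1}_{\{|X_j| \leq b_n\}}$ and $T_n := \sum_{j=1}^n X_{j,n}$, so that $E[T_n] = a_n$. Then
$$P(S_n \neq T_n) \leq \sum_{j=1}^n P(|X_j| > b_n) = \sum_{j=1}^n \int_{|x|>b_n} dF_j(x) = o(1)$$
by the first condition, while Chebyshev's inequality and the second condition give, for every $\varepsilon > 0$,
$$P(|T_n - a_n| > \varepsilon b_n) \leq \frac{1}{\varepsilon^2 b_n^2} \sum_{j=1}^n \int_{|x| \leq b_n} x^2 \, dF_j(x) = o(1).$$
Combining the two bounds yields $P(|S_n - a_n| > \varepsilon b_n) \to 0$.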
(That is exactly the truncation argument sketched above.) In your case, take $b_n = n+1$. Then the first sum above vanishes identically, since each $|X_j|$ is at most $j+1 \leq n+1$ for $j \leq n$; so it is certainly $o(1)$. The quantity on the left-hand side of the second condition is of order
$$ \frac{\mathrm{li}(n)}{n+1} $$
where $\mathrm{li}$ is the logarithmic integral function. According to Wikipedia,
$$\mathrm{li}(n) = O(n/\log n).$$
So
$$\frac{\mathrm{li}(n)}{n+1} = O(1/\log n) = o(1). $$
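To spell out where $\mathrm{li}(n)$ comes from, here is the estimate under the distribution I am reading off from the question, namely $P(X_j = j+1) = P(X_j = -(j+1)) = \frac{1}{2(j+1)\log(j+1)}$ and $X_j = 0$ otherwise (if your distribution is slightly different, the same computation goes through with minor changes). For $j \leq n$,
$$\int_{|x| \leq n+1} x^2 \, dF_j(x) = (j+1)^2 \cdot \frac{1}{(j+1)\log(j+1)} = \frac{j+1}{\log(j+1)} \leq \frac{n+1}{\log(j+1)},$$
so
$$\frac{1}{(n+1)^2} \sum_{j=1}^n \int_{|x| \leq n+1} x^2 \, dF_j(x) \leq \frac{1}{n+1} \sum_{j=1}^n \frac{1}{\log(j+1)},$$
and the last sum is comparable to $\int_2^{n+2} \frac{dt}{\log t} = \mathrm{li}(n+2) - \mathrm{li}(2)$, which is $O(n/\log n)$.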
By the symmetry of your variables, the $a_n$ defined above equal zero. This gives
$$\frac{S_n}{n+1} \rightarrow 0 $$
in probability, and since $(n+1)/n \rightarrow 1$, it follows that $S_n/n \rightarrow 0$ in probability as well.
Note: you could also just take $b_n = n$ to avoid this last step in the argument, but then the first sum picks up one non-zero term (the $j = n$ term), which still tends to $0$.
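If you want a quick numerical sanity check of both claims, here is a small simulation sketch in Python/NumPy. It assumes the distribution $P(X_j = \pm(j+1)) = \frac{1}{2(j+1)\log(j+1)}$, $X_j = 0$ otherwise (my reading of the question), and the function names are just placeholders of mine. The first part estimates $P(|S_n|/n > 0.1)$ for a few $n$ (expect a slow, roughly $1/\log n$, decrease); the second part looks at a single long path and checks that $|S_n|/n \geq 1/2$ keeps happening.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_tail(n, eps, n_paths=2000):
    """Monte Carlo estimate of P(|S_n|/n > eps) under the assumed distribution
    P(X_j = +-(j+1)) = 1/(2(j+1)log(j+1)), X_j = 0 otherwise."""
    S = np.zeros(n_paths)
    for j in range(1, n + 1):
        p = 1.0 / ((j + 1) * np.log(j + 1))        # P(|X_j| = j+1)
        hit = rng.random(n_paths) < p              # did the rare big value occur?
        sign = rng.choice([-1.0, 1.0], size=n_paths)
        S += np.where(hit, sign * (j + 1), 0.0)
    return np.mean(np.abs(S) / n > eps)

def one_path_excursions(N):
    """Simulate a single path up to time N and report how many indices n
    satisfy |S_n|/n >= 1/2, plus the last such index (None if there is none)."""
    js = np.arange(1, N + 1)
    p = 1.0 / ((js + 1) * np.log(js + 1))
    hit = rng.random(N) < p
    sign = rng.choice([-1.0, 1.0], size=N)
    X = np.where(hit, sign * (js + 1), 0.0)
    S = np.cumsum(X)
    idx = np.nonzero(np.abs(S) / js >= 0.5)[0]
    return len(idx), (int(js[idx[-1]]) if len(idx) else None)

if __name__ == "__main__":
    # Convergence in probability: the tail probability decreases, but only
    # at rate roughly 1/log(n), so the decrease is slow.
    for n in (100, 1000, 10000):
        print(f"n = {n:6d}   P(|S_n|/n > 0.1) ~ {estimate_tail(n, 0.1):.3f}")
    # No a.s. convergence: along one long path, |S_n|/n >= 1/2 keeps recurring.
    # (Exact counts vary from run to run; rerun or increase N if needed.)
    count, last = one_path_excursions(10**6)
    print(f"indices n <= 10^6 with |S_n|/n >= 1/2: {count} (last one at n = {last})")
```

Since the in-probability convergence is only logarithmically fast, don't expect the estimated tail probabilities to be small even at $n = 10^4$; what matters is that they decrease, while the single-path statistic does not die out.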