A function $f$ defined on an interval $I$ is said to be uniformly continuous on $I$ if for each $\epsilon > 0$ there exists a $\delta > 0$ such that
$|f(x_1)-f(x_2)| < \epsilon$ for all points $x_1, x_2$ of $I$ with $|x_1-x_2|<\delta$.
For $f(x)=x^2$ on $[-1,1]$, $f$ is uniformly continuous, but when $f$ is defined on all of $R$ it is not. The proof for the interval $[-1,1]$ goes like this:
$|f(x_1)-f(x_2)| = |x_1^{2}-x_2^{2}| = |x_1-x_2|\,|x_1+x_2| < \epsilon$ when $|x_1-x_2|<\epsilon/2=\delta$.
I don't understand this step: how does $|x_1-x_2|<\epsilon/2$ give $|f(x_1)-f(x_2)|<\epsilon$? Also, what would the proof look like that the same function, defined on $R$, is not uniformly continuous?
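To make my confusion concrete, here is a quick numerical experiment I tried (my own sketch, not from the book): with a fixed $\epsilon$ and $\delta = \epsilon/2$, nearby points in $[-1,1]$ always seem to satisfy the bound, while on $R$ the same $\delta$ fails once $x$ is large.

```python
def f(x):
    return x * x

eps = 0.1
delta = eps / 2  # the delta = eps/2 from the quoted proof

# On [-1, 1]: check pairs (x, x + delta/2), which are within delta of each other.
ok_on_interval = all(
    abs(f(x) - f(x + delta / 2)) < eps
    for x in (i / 100 for i in range(-100, 100))
)
print(ok_on_interval)  # True: the bound holds everywhere on [-1, 1]

# On R: take x1 = N and x2 = N + delta/2 for large N.
# |f(x1) - f(x2)| = (delta/2) * |2N + delta/2|, which grows with N.
N = 1000.0
print(abs(f(N) - f(N + delta / 2)) < eps)  # False: the same delta fails
```

So the same $\delta$ that works on the bounded interval breaks down on $R$, which I assume is the heart of the non-uniform-continuity argument, but I'd like to see it written as a proper proof.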