There are a few things worth talking about here, mainly concerning the use of norms and inner products in general, as your biggest problem seems to be that you don't really see why we care about these things. For that reason, I will not answer your specific problem (just plug it into the formula), but rather discuss these things in general.
So firstly, the formula
$$\langle u,v\rangle=\frac{1}{4}\left(\lVert u+v\rVert^2-\lVert u-v\rVert^2\right)$$
is called the polarization identity, and is simply a useful identity for recovering the inner product $\langle \cdot,\cdot\rangle$ which induces your norm $\lVert\cdot\rVert$ when you know the norm but not necessarily the inner product. In fact, an important fact from linear algebra is that not all norms arise this way. You've probably seen a few examples where you have some inner product $\langle\cdot,\cdot\rangle$, and from that you define a norm by setting
$$\lVert u\rVert=\sqrt{\langle u,u\rangle}.$$
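To see the polarization identity in action, here is a quick numeric sketch for the Euclidean dot product on $\mathbb{R}^2$ (the function names are my own illustration, not part of any standard library):

```python
# Sanity check: the polarization identity recovers the Euclidean
# dot product from the norm it induces.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return dot(u, u) ** 0.5

def polarize(u, v):
    # <u,v> = (1/4)(||u+v||^2 - ||u-v||^2)
    plus = [a + b for a, b in zip(u, v)]
    minus = [a - b for a, b in zip(u, v)]
    return 0.25 * (norm(plus) ** 2 - norm(minus) ** 2)

u, v = (1.0, 2.0), (3.0, -1.0)
print(dot(u, v))       # 1.0
print(polarize(u, v))  # 1.0, the same value recovered from norms alone
```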
This is all well and good, and a very natural construction (I'll give you a few examples further down); however, there are norms which cannot be written this way. That's why identities such as the polarization identity are important: you can always apply the formula, but if your norm is not induced by an inner product, then the resulting expression will fail to be an inner product. Another way of checking this is the so-called parallelogram law, which can be written
$$2\lVert x\rVert^2+2\lVert y\rVert^2=\lVert x+y\rVert^2+\lVert x-y\rVert^2,$$
which holds for all vectors $x$ and $y$ if and only if the norm is associated with an inner product.
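For a concrete illustration, here is a small numeric sketch (names my own) showing that the Euclidean norm satisfies the parallelogram law, while the max norm $\lVert(x,y)\rVert_\infty=\max(\lvert x\rvert,\lvert y\rvert)$ does not, so the max norm cannot come from any inner product:

```python
# Check the parallelogram law for two different norms on R^2.

def euclid(v):
    return (v[0] ** 2 + v[1] ** 2) ** 0.5

def maxnorm(v):
    return max(abs(v[0]), abs(v[1]))

def parallelogram_gap(norm, x, y):
    # LHS - RHS of 2||x||^2 + 2||y||^2 = ||x+y||^2 + ||x-y||^2.
    s = (x[0] + y[0], x[1] + y[1])
    d = (x[0] - y[0], x[1] - y[1])
    return 2 * norm(x) ** 2 + 2 * norm(y) ** 2 - norm(s) ** 2 - norm(d) ** 2

x, y = (1.0, 0.0), (0.0, 1.0)
print(parallelogram_gap(euclid, x, y))   # 0.0: the law holds
print(parallelogram_gap(maxnorm, x, y))  # 2.0: the law fails
```

A single pair of vectors violating the law is enough to rule out an inner product, since the law must hold for all pairs.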
Before we look at some examples, let's make clear what the motivation for introducing norms even is. Notice how the symbol $\lVert\cdot\rVert$ for a norm bears a very striking resemblance to the absolute value $\lvert\cdot\rvert$. This is no accident, as norms are a way to generalize the notions which the absolute value captures. In particular, the absolute value of a number $a$ on the real line is just a measurement of how "long" this number is, if we think of it as a line segment going from $0$ to $a$; by extension, calculating something like $\lvert b-a\rvert$ gives us the length of the segment starting at $a$ and ending at $b$. In this sense, a norm is just an attempt to capture the notion of length and apply it to more general structures, i.e. vector spaces.
So let's look at some examples of norms which arise naturally in mathematics. The first one we'll consider is the standard Euclidean norm on, say, $\mathbb{R}^2$. This norm can be written
$$\lVert(x,y)\rVert=\sqrt{x^2+y^2}.$$
Why is this natural? Well, first of all, if we think of the points of the form $(x,0)$ as numbers on the real line, then we have that
$$\lVert (x,0)\rVert=\sqrt{x^2}=\lvert x\rvert,$$
and so it extends the absolute value. But how can we be sure that it actually gives us a meaningful notion of length? After all, we could just as well have introduced, say, a factor of $2$ in front of the $y^2$-term and still gotten the same result on the real line. The reason this definition makes sense is precisely that it encodes the Pythagorean theorem. When you have a vector $(x,y)$ in $\mathbb{R}^2$, it can be thought of as a line segment going from the origin to the point $(x,y)$ in the plane. In particular, notice how this segment will be the hypotenuse of a right triangle with legs of lengths $\lvert x\rvert$ and $\lvert y\rvert$. This means that its length is precisely $\sqrt{x^2+y^2}$ by the Pythagorean theorem, which is what makes this a useful notion of length. Now, associated with this norm is the inner product
$$\langle (x_1,y_1),(x_2,y_2)\rangle=x_1x_2+y_1y_2,$$
which you might recognize as the dot product on $\mathbb{R}^2$. This too has a geometric interpretation, but I will not go into that right now; there are plenty of good resources online for it.
Now these might be among the most obvious examples of norms and dot products, but what we need to realize is that a lot of the structures we work with are vector spaces, and so we might ask ourselves whether there is a natural notion of length on these spaces too. So let's look at a much less trivial example. Take the set $\mathcal{C}([a,b])$, which consists of all continuous functions $f:[a,b]\to\mathbb{R}$. This is a vector space under the usual operations of addition and scalar multiplication of functions (check this for yourself). Now fix a number $p\in[1,\infty)$, and define
$$\lVert f\rVert_p=\left(\int_a^b \lvert f(x)\rvert^p~\mathrm{d}x\right)^{\frac{1}{p}},$$
where $f$ is a function in the set $\mathcal{C}([a,b])$. This might seem like a very arbitrary construction, but it arises very naturally in a lot of mathematical analysis, for example in Fourier analysis, and a lot of research relates to these norms $\lVert f\rVert_p$. In particular, it is a way to assign a notion of length to a function (note: this is not the same as the arc length of the graph of the function). One can show that $\lVert\cdot\rVert_p$ is a norm for any $p\in[1,\infty)$ we choose; the interesting question is which of these norms are given by an inner product. In our first example, this was pretty easy to check, but now we have a much more complicated norm, and it is not necessarily an easy task. I will not go into the details, but one can prove that the only $p$ for which there actually is an associated inner product is $p=2$, i.e.
$$\lVert f\rVert_2=\sqrt{\int_a^b f(x)^2~\mathrm{d}x},$$
which is associated with the inner product
$$\langle f,g\rangle =\int_a^b f(x)g(x)~\mathrm{d}x.$$
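If you want to experiment with this inner product numerically, here is a small sketch approximating the integrals by Riemann sums (the helper names are my own, and this is an illustration rather than serious numerical code):

```python
# Numeric illustration on C([0,1]): the 2-norm and the integral
# inner product satisfy ||f||_2 = sqrt(<f,f>).

def inner(f, g, a=0.0, b=1.0, n=10000):
    # Midpoint Riemann sum approximating the integral of f*g over [a, b].
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

def norm2(f, a=0.0, b=1.0):
    return inner(f, f, a, b) ** 0.5

f = lambda x: x      # ||f||_2^2 = integral of x^2 over [0,1] = 1/3
print(inner(f, f))   # approximately 1/3
print(norm2(f))      # approximately 1/sqrt(3)
```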
But now let's tie this example back to our first example. In particular, the norm we considered on $\mathbb{R}^2$ can in a very simple way be extended to any $\mathbb{R}^n$ as follows: let $v=(v_1,\dots,v_n)\in\mathbb{R}^n$, and define
$$\lVert v\rVert=\sqrt{v_1^2+\dots+v_n^2}=\sqrt{\sum_{j=1}^n v_j^2}.$$
This is associated with the inner product
$$\langle u,v\rangle=u_1v_1+\dots+u_nv_n=\sum_{j=1}^nu_jv_j.$$
But notice the striking resemblance between
$$\lVert f\rVert_2=\sqrt{\int_a^b f(x)^2~\mathrm{d}x}\qquad\text{and}\qquad\lVert v\rVert=\sqrt{\sum_{j=1}^n v_j^2},$$
as well as between
$$\langle f,g\rangle=\int_a^b f(x)g(x)~\mathrm{d}x\qquad\text{and}\qquad\langle u,v\rangle=\sum_{j=1}^nu_jv_j.$$
This is not by accident, and there is a much deeper connection between them (however it takes developing measure theory to understand that they are just "two sides of the same coin", which is definitely too advanced for this answer). The point is that norms and inner products can come from seemingly unrelated places, but still have a lot of use, and for that reason it is important to develop a unified theory of them, as that allows us to deal with all of them using the same box of tools.
Finally, the reason why we ideally want our norm to come with an inner product (which we have seen is not always the case) is that inner products often play very well with bases, and allow us to "project". So say we are given two ordered bases $(e_1,\dots,e_n)$ and $(e_1',\dots,e_n')$ for some $n$-dimensional vector space $V$ (you can think of this as $\mathbb{R}^n$ if that's easier). Now suppose we have some vector $v\in V$, and we know the coordinates $(v_1,\dots,v_n)$ of $v$ with respect to the first basis $(e_1,\dots,e_n)$. In a lot of situations, this might not be ideal: the basis $(e_1',\dots,e_n')$ might be a lot more convenient to work with and have nicer properties, such as consisting of eigenvectors of some linear operator (don't worry if you don't know what this means; the key here is that the other basis is more useful for some reason). So what we have is that
$$v=v_1e_1+\dots+v_ne_n,$$
and we wish to find some coordinates $(v_1',\dots,v_n')$ such that
$$v=v_1'e_1'+\dots+v_n'e_n'.$$
Now say we are also given an inner product $\langle\cdot,\cdot\rangle$ on this space, and let's say, for simplicity, that $(e_1',\dots,e_n')$ is an orthonormal basis. What this means is that the basis vectors are orthogonal to each other, i.e.
$$\langle e_i',e_j'\rangle=0$$
if $i\neq j$, and that they are normal, i.e. $\lVert e_j'\rVert =1$ for all $j$. In particular then
$$\langle e_j',e_j'\rangle=\lVert e_j'\rVert^2=1$$
for all $j$. This is a property which is often desirable, and there is a process (in finite-dimensional spaces) to turn any basis into an orthonormal basis, called the Gram-Schmidt process, which you will encounter some time during your linear algebra studies. So now that we have this orthonormal basis, we can make the computation
$$\langle v,e_j'\rangle=\langle v_1'e_1'+\dots+v_n'e_n',e_j'\rangle=v_1'\langle e_1',e_j'\rangle+\dots+v_n'\langle e_n',e_j'\rangle=v_j'$$
(notice that orthogonality got rid of all the terms except the $v_j'\langle e_j',e_j'\rangle$ term, and this equals $v_j'$ by normality). But this means that we can compute the coordinates $(v_1',\dots,v_n')$ simply by computing the inner products $\langle v,e_1'\rangle,\dots,\langle v,e_n'\rangle$, and so we have a method we can use to change basis (at least when our new basis is orthonormal)!
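As an aside, the Gram-Schmidt process mentioned above is easy to sketch in code. Here is a minimal pure-Python version using the dot product on $\mathbb{R}^n$ (the function names are my own, and this is a sketch rather than a robust implementation):

```python
# Gram-Schmidt: turn a list of linearly independent vectors into an
# orthonormal basis by subtracting projections, then normalizing.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    ortho = []
    for v in basis:
        w = list(v)
        for e in ortho:
            c = dot(v, e)  # coefficient of the projection of v onto e
            w = [wi - c * ei for wi, ei in zip(w, e)]
        length = dot(w, w) ** 0.5
        ortho.append([wi / length for wi in w])
    return ortho

# Orthonormalize the (non-orthogonal) basis from the example below.
e1, e2, e3 = gram_schmidt([(1, 1, 0), (1, 0, 1), (0, 1, 1)])
print(dot(e1, e2))  # 0 up to rounding (orthogonal)
print(dot(e1, e1))  # 1 up to rounding (unit length)
```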
Let's end this answer by using this process on a concrete example. So suppose we are given $v\in\mathbb{R}^3$ with coordinates $(1,2,3)$ with respect to the ordered basis $((1,1,0),(1,0,1),(0,1,1))$, and we wish to find the coordinates with respect to the new basis $(e_1',e_2',e_3')$ given by
$$e_1'=\frac{1}{\sqrt{3}}(1,1,1), \quad e_2'=\frac{1}{\sqrt{6}}(-1,2,-1), \quad e_3'=\frac{1}{3\sqrt{2}}(-3,0,3).$$
We give $\mathbb{R}^3$ the dot product as its inner product. As an exercise, check that this is indeed an orthonormal basis. This means that we can apply the method above to find the coordinates of $v$. So first, notice that
$$v=1\cdot(1,1,0)+2\cdot(1,0,1)+3\cdot(0,1,1)=(3,4,5),$$
where we used the coordinates we were given with respect to the first basis. We then have that
$$\langle v,e_1'\rangle=\frac{1}{\sqrt{3}}\langle(3,4,5),(1,1,1)\rangle=\frac{1}{\sqrt{3}}(3+4+5)=4\sqrt{3},$$
$$\langle v,e_2'\rangle=\frac{1}{\sqrt{6}}\langle(3,4,5),(-1,2,-1)\rangle=\frac{1}{\sqrt{6}}(-3+8-5)=0,$$
$$\langle v,e_3'\rangle=\frac{1}{3\sqrt{2}}\langle(3,4,5),(-3,0,3)\rangle=\frac{1}{3\sqrt{2}}(-9+0+15)=\sqrt{2}.$$
Using what we showed above, this means that $v$ has coordinates $(4\sqrt{3},0,\sqrt{2})$ with respect to the ordered basis $(e_1',e_2',e_3')$.
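If you want to double-check this computation numerically, here is a short pure-Python sketch (names my own) that computes the new coordinates as inner products and then reconstructs $v$ from them:

```python
# Verify the worked example: coordinates of v = (3,4,5) with respect
# to the orthonormal basis (e1', e2', e3').

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v = (3.0, 4.0, 5.0)  # = 1*(1,1,0) + 2*(1,0,1) + 3*(0,1,1)

s3, s6, s2 = 3 ** 0.5, 6 ** 0.5, 2 ** 0.5
e1 = tuple(c / s3 for c in (1, 1, 1))
e2 = tuple(c / s6 for c in (-1, 2, -1))
e3 = tuple(c / (3 * s2) for c in (-3, 0, 3))

# New coordinates: v_j' = <v, e_j'>.
coords = [dot(v, e) for e in (e1, e2, e3)]
print(coords)  # approximately [4*sqrt(3), 0, sqrt(2)]

# Consistency check: rebuild v from the new coordinates.
recon = [sum(c * e[i] for c, e in zip(coords, (e1, e2, e3)))
         for i in range(3)]
print(recon)   # approximately [3.0, 4.0, 5.0]
```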
Hopefully this sheds some light on why norms and inner products are useful. If you want to, I could also add a bit on subspaces, and how inner products and subspaces make for a really useful combination (as you can perform projection), however this answer has already gotten quite long, so I'll leave it as is for now.