I would say the main recurrent goal in analysis is to prove that two things are either sufficiently close together or arbitrarily close together (and thus equal).
These rather vague goals, in my opinion, feed directly into the appeal and the infamy of analysis. For there are certainly uncountably many ways to show that two values of interest are close together, and the recurring tactic is to do so by proxy: this is close to that, that is close to yonder, yonder is close to over there... therefore this is close to over there. Knowing which intermediary points to compare things to is partly craft but also partly art.
Any concrete recommendations (including my past ones in my other answer) I would call part of the "craft" of analysis. You learn tricks of astonishing ingenuity from other mathematicians, and you mimic their approach when you think you're in a similar situation. However, the "art" of analysis is putting your own spin on these tricks: first by applying some sort of method to the madness---find some way of grouping these tricks that makes sense to you---and then waiting for the method to inspire new madness.
Some of the methods that I have collected under analysis include:
- Collect inequalities.
- Collect algebraic identities.
- Collect limits (even slowly converging ones) in their various forms.
- Exploit density, compactness, and connectedness.
- Attempt to seesaw the quantities you're working with.
- Give yourself some buffer room.
- Introduce a variable, even if it makes your problem seemingly more difficult.
- Take the Herglotz point-of-view.
- Be optimistic.
each of which I should probably describe by some example. Of course, the underlying form of just about every analytical argument involves inequalities. Thus, it is helpful to have several different ones at your fingertips. Particularly useful are inequalities that connect algebraic functions with transcendental functions, such as
$$1+x\leq e^x\,,\qquad \frac{2}{\pi}|x|\leq |\sin x|\leq |x|\ \ \text{for }|x|\leq\frac{\pi}{2}\,,$$
or inequalities that relate several different "levels of arithmetic" to each other in different orders, such as Cauchy's Inequality, which trades sums of products for products of powers of sums of powers. I would say that almost all of these inequalities can be derived from convexity and Jensen's inequality.
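For concreteness, the form of Cauchy's Inequality I have in mind is
$$\sum_{k=1}^n a_k b_k\leq\left(\sum_{k=1}^n a_k^2\right)^{1/2}\left(\sum_{k=1}^n b_k^2\right)^{1/2}\,,$$
a sum of products on the left traded for a product of square roots of sums of squares on the right.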
However, it is also frequently useful to know certain algebraic identities. Many interesting assertions have clever proofs that involve rewriting the quantities in question by some algebraic manipulation where suddenly everything is trivial with hindsight. These proofs were concocted with algebraic insight. Knowing identities with sums of squares, Vieta's Formulae, the Binomial Theorem, special factorizations, factoring polynomials of low degree, exploiting telescoping, and trigonometric identities (to name a few) can all feed into making a very sleek and cheeky proof. If you happen upon a novel identity, keep it somewhere.
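As a tiny illustration of the telescoping entry on that list: the partial fraction identity $\frac{1}{k(k+1)}=\frac{1}{k}-\frac{1}{k+1}$ collapses an otherwise opaque sum,
$$\sum_{k=1}^n\frac{1}{k(k+1)}=1-\frac{1}{n+1}\,,$$
and with hindsight the whole computation looks trivial.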
Sometimes, you only care about what happens in the limit rather than at a particular point. In this case, having several limits on hand is extremely useful. The more routine limits arise from differentiation; however, you can also find nifty limits by recognizing things as Riemann sums. More arcane limits, especially slow ones for some reason, are useful too. You can demonstrate that $\Gamma(1/2)=\sqrt{\pi}$ (and thus calculate the Gaussian integral) with the Wallis product and $\Gamma$'s Euler product form. The various sequences that converge to $e$ also notoriously spring up like weeds.
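To give one concrete instance of the Riemann sum trick:
$$\lim_{n\rightarrow\infty}\sum_{k=1}^n\frac{1}{n+k}=\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{k=1}^n\frac{1}{1+k/n}=\int_0^1\frac{dx}{1+x}=\ln 2\,.$$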
The next point addresses topological arguments. There are several nice theorems that hold only on compact spaces or connected ones. Keep them. Density, however, crops up quite a bit as well. If you want to show that two functions or functionals are equal, you need only do so on a dense subset, which can dramatically simplify the argument you have to make.
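A standard instance of the density trick: two continuous functions on $\mathbb{R}$ that agree on $\mathbb{Q}$ agree everywhere, so an identity between continuous functions only ever needs to be verified on the rationals (or on whatever dense set is convenient).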
Although a lot of analytic approximations go through a chain of proxies, some arguments are done by "seesawing". That is, you reallocate some weight from one term over to another term that wouldn't mind having the extra weight. The "adding-and-subtracting" trick can be seen as falling in here, but I would also place the Peter-Paul inequality from my last answer and Young's Inequality here.
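For reference, the Peter-Paul inequality I mean is
$$ab\leq\frac{\varepsilon}{2}a^2+\frac{1}{2\varepsilon}b^2\qquad(a,b\geq 0,\ \varepsilon>0)\,,$$
where tuning $\varepsilon$ decides how much of the weight Peter carries and how much gets shoved onto Paul.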
"Giving yourself some buffer room" roughly translates to attempting to prove a stronger result which will give you your original result upon taking a limit. For example, if you want to show
$$\sum_{k=1}^\infty\frac{1}{k^2}\leq 2$$
you should instead show
$$\sum_{k=1}^n\frac{1}{k^2}\leq 2-\frac{C}{n}$$
for some constant $C$. Rephrasing the question like this suddenly opens up the door for induction. And the argument will take you there.
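To sketch how that induction goes (with the convenient choice $C=1$): the base case is $1\leq 2-1$, and the inductive step reduces to
$$\frac{1}{(n+1)^2}\leq\frac{1}{n}-\frac{1}{n+1}=\frac{1}{n(n+1)}\,,$$
which is clear since $(n+1)^2>n(n+1)$. The original statement with a bare $2$ on the right gives induction nothing to push against; the buffer term $-C/n$ is what makes the step close.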
Introducing a variable, in my opinion, is perhaps the dirtiest trick I can think of in elementary analysis. For example, to calculate the improper integral
$$\int_0^\infty \frac{\sin x}{x}\,dx$$
you should instead attempt to calculate
$$\int_0^\infty\frac{\sin x}{x}e^{-tx}\,dx$$
for arbitrary $t>0$. Or, to calculate the sum below,
$$\sum_{n=1}^\infty\frac{(-1)^{n+1}}{n}$$
you should instead calculate the sum
$$\sum_{n=1}^\infty\frac{(-1)^{n+1}x^n}{n}$$
for arbitrary $x$ with $|x|<1$. The user Jack D'Aurizio on this site is a master of this technique and also a master of rewriting things as integrals. View his answers to get a feel for this.
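In case the payoff of the extra variable isn't obvious, here is roughly how these two examples close out. Writing $I(t)=\int_0^\infty\frac{\sin x}{x}e^{-tx}\,dx$ and differentiating under the integral sign gives
$$I'(t)=-\int_0^\infty\sin x\,e^{-tx}\,dx=-\frac{1}{1+t^2}\,,$$
so $I(t)=\frac{\pi}{2}-\arctan t$ (the constant is pinned down by $I(t)\rightarrow 0$ as $t\rightarrow\infty$), and sending $t\rightarrow 0^+$ recovers the value $\frac{\pi}{2}$. For the series, the function of $x$ is $\ln(1+x)$ on $|x|<1$, and Abel's theorem lets you pass to $x\rightarrow 1^-$ to conclude that the alternating harmonic series sums to $\ln 2$.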
Herglotz is mildly famous for giving a fairly elementary and easy-to-follow proof of the following identity:
$$\pi\cot\pi x=\lim_{N\rightarrow\infty}\sum_{k=-N}^N\frac{1}{x+k}\,.$$
Everyone needs to read that proof and digest it for themselves. But the essential idea behind the proof is to demonstrate that the two functions on the left and right have enough properties in common to force their difference to be zero. This idea also subsumes the familiar fact that two differentiable functions are equal if their derivatives are equal and they agree at a point. A very similar proof to Herglotz's also establishes the reflection formula relating $\Gamma$ and sine.
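A toy instance of that last fact: for $x>0$ the function $\arctan x+\arctan\frac{1}{x}$ has derivative
$$\frac{1}{1+x^2}-\frac{1/x^2}{1+1/x^2}=0\,,$$
and it equals $\frac{\pi}{2}$ at $x=1$, so $\arctan x+\arctan\frac{1}{x}=\frac{\pi}{2}$ for all $x>0$.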
"Be optimistic" is meant to encapsulate the idea that in order to construct something, it is sometimes helpful to just assume that it exists and deduce necessary properties that hint at its construction. Bohr and Mollerup gave a uniqueness proof of the $\Gamma$ function that follow more along this motif rather than Herglotz's motif. A similar idea can be used to give a rather strange-looking definition of sine and cosine if one can only talk in the language of elementary analysis.
Of course, there are several other ideas that I could write about. But this answer is already more like a blog post than a StackExchange answer.