
For example, determine $\int \left(\frac{1}{2x+1}\right)dx$.

Given that $f(x) = \ln(2x+1)$ and $f'(x) = \frac{2}{2x+1}$.

Would this be $\frac{1}{2} \int\left (\frac{2}{2x+1}\right)dx = \frac{1}{2} (\ln(2x+1) + C)$ or $\frac{1}{2} \int\left (\frac{2}{2x+1}\right)dx = \frac{1}{2} (\ln(2x+1)) + C$?
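
As a sanity check on the derivative above, here is a minimal sympy sketch (the use of sympy is purely illustrative):

```python
import sympy as sp

x = sp.symbols('x')

# d/dx ln(2x+1) = 2/(2x+1), the derivative quoted above
print(sp.diff(sp.log(2*x + 1), x))                                    # 2/(2*x + 1)

# so (1/2)ln(2x+1) differentiates back to the integrand 1/(2x+1)
print(sp.simplify(sp.diff(sp.Rational(1, 2) * sp.log(2*x + 1), x)))   # 1/(2*x + 1)
```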

CountDOOKU
    they are the same – Henry Lee Jan 03 '20 at 11:53
  • For what it's worth, I think it's most common to see the constant $C$ added all the way on the outside (the second way). The first answer is perfectly fine, but distributing the $\frac{1}{2}$ gives you $+\frac{C}{2}$ which is unnecessarily complicated. – kccu Jan 03 '20 at 12:27
    @kccu Although I'd say it's important not to internalise this too hard, since sometimes the constant of integration ends up in a form that isn't simply added, and you can get the wrong answer if you simply mechanically shove a $+C$ on the end. For example, when solving $5y'-3y = 0$, you end up with $y = A e^{3x/5}$. At the point where you actually do an integral, it's added, but I'm wary of instilling an unhelpful habit of thought that can lead you to just sprinkle additive constants everywhere even when they're wrong. – Patrick Stevens Jan 03 '20 at 12:31
  • @PatrickStevens Fair point! – kccu Jan 03 '20 at 14:30

3 Answers


It doesn't matter. Since $C$ is arbitrary, the two answers are the same.
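
A minimal symbolic sketch (using sympy, just to illustrate the point) shows that the two expressions differ only by a constant, i.e. only by how the arbitrary constant is named:

```python
import sympy as sp

x, C, D = sp.symbols('x C D')

form1 = sp.Rational(1, 2) * (sp.log(2*x + 1) + C)   # constant inside the bracket
form2 = sp.Rational(1, 2) * sp.log(2*x + 1) + D     # constant added at the end

# The difference is independent of x ...
difference = sp.simplify(form1 - form2)
print(difference, difference.has(x))                # C/2 - D, False

# ... and relabelling D = C/2 makes the two forms identical
print(sp.simplify(form1 - form2.subs(D, C / 2)))    # 0
```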


A bit more waffle:

Integration alone cannot tell you the $y$-intercept of the resulting function, since differentiation threw that information away. The constant of integration effectively tells you where the $y$-intercept of the antiderivative sits.

In your case, the constant of integration is $\frac{1}{2}C$ in the first answer and $C$ in the second; that is to say, the $y$-value at $x=0$ is $\frac{1}{2}C$ in the first answer and $C$ in the second.

But if you want to choose the constant of integration so that the $y$-intercept is $N$, say, you can do so equally well in either case: just pick $C = 2N$ in the first answer, or $C = N$ in the second.

Since you can get exactly the same set of functions from varying $C$ in the first case as you can in the second, the two expressions are effectively the same; both answers yield the same set of antiderivatives.
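
A concrete sketch of that last point (sympy again, with an unspecified target intercept $N$): choosing $C = 2N$ in the first answer and $C = N$ in the second produces the very same function, with $y$-intercept $N$.

```python
import sympy as sp

x, N = sp.symbols('x N')

# Target y-intercept N: take C = 2N in the first answer, C = N in the second
first  = sp.Rational(1, 2) * (sp.log(2*x + 1) + 2*N)
second = sp.Rational(1, 2) * sp.log(2*x + 1) + N

print(sp.simplify(first - second))   # 0 -> identical functions
print(first.subs(x, 0))              # N -> the y-intercept really is N
```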


I don't like the other answers, because the root of the problem is that you did not understand the meaning of the so-called 'integration constant'.


$\int f(x)\ dx$ denotes an anti-derivative of $f(x)$ with respect to $x$, namely an expression $g(x)$ that, when differentiated with respect to $x$, gives $f(x)$. Symbolically, you want to find $g(x)$ given that $\frac{d(g(x))}{dx} = f(x)$.

If you are given $f(x)$ and an anti-derivative $F(x)$ of $f(x)$, then you do have $\frac{d(F(x))}{dx} = f(x)$, but for any constant $c$ you also have $\frac{d(F(x)+c)}{dx} = \frac{d(F(x))}{dx} + \frac{dc}{dx} = f(x) + 0 = f(x)$ by the basic properties of differentiation, and so you cannot determine $\int f(x)\ dx$ without having more information.
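
A quick symbolic illustration of this (a sympy sketch; the particular $f$ is just the one from the question):

```python
import sympy as sp

x, c = sp.symbols('x c')

f = 1 / (2*x + 1)
F = sp.Rational(1, 2) * sp.log(2*x + 1)     # one particular anti-derivative of f

# F(x) + c is again an anti-derivative of f, for every constant c
print(sp.simplify(sp.diff(F + c, x) - f))   # 0
```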

However, if $\frac{d(F(x))}{dx} = f(x)$ at every $x∈I$, where $I$ is some interval of the real line, then we can in fact conclude that $\int f(x)\ dx = F(x) + C$ for some constant $C$ (a consequence of the mean value theorem). Take note: you do not know what the constant $C$ is! Also, for any constant $k$, we have $k · \int f(x)\ dx = k · ( F(x) + C ) = k·F(x)+k·C$ for some constant $C$. Since we are always working with constants from a field $\Bbb F$, and $\{\, k·C : C∈\Bbb F \,\} = \Bbb F$ whenever $k ≠ 0$, we can also deduce that $k · \int f(x)\ dx = k·F(x)+D$ for some constant $D$.

As mentioned in a comment, you must actually know how the unknown constant 'propagates' to the final expression. For example, $\exp(\int f(x)\ dx) = \exp(F(x)+C)$ $= \exp(F(x))·\exp(C)$ for some constant $C$. If you are working in the reals, then $\exp(C) > 0$, so you can conclude that $\exp(\int f(x)\ dx) = \exp(F(x))·D$ for some constant $D > 0$. But if you are working in the complex numbers, then $\exp(C)$ can be any complex number except $0$, so you would instead conclude $\exp(\int f(x)\ dx) = \exp(F(x))·D$ for some constant $D ≠ 0$.
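
A minimal sympy sketch of this propagation (the concrete $F$ is the one from the question, and the ODE is the one mentioned in the comments; the code is only illustrative):

```python
import sympy as sp

x, C = sp.symbols('x C')
y = sp.Function('y')

# exp(F(x) + C) factors as exp(C) * exp(F(x)): the constant becomes multiplicative
F = sp.Rational(1, 2) * sp.log(2*x + 1)
print(sp.expand(sp.exp(F + C) - sp.exp(C) * sp.exp(F)))     # 0

# The same propagation appears when solving 5y' - 3y = 0: the constant is a factor
print(sp.dsolve(sp.Eq(5*y(x).diff(x) - 3*y(x), 0), y(x)))   # Eq(y(x), C1*exp(3*x/5))
```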

user21820
  • I'm also not quite comfortable with the answer in the linked thread, because I do not agree with using $∫$ to denote the class/set of anti-derivatives. It brings... quite a lot of trouble... when you want to have completely rigorous symbolic manipulation of anti-derivatives. It's okay if you just want to do a quick computation, but it's not going to be a good approach if you actually want to rigorously justify everything. – user21820 Jan 05 '20 at 14:55