
I've just been reading the Wikipedia entry on the Jacobian conjecture, which says that the conjecture is either true for all fields of characteristic zero or false for all such fields.

Hence I wonder: shouldn't this be an easy problem that yields to methods from real or complex analysis? After all, it involves only simple notions such as determinant, inverse, constant, and polynomial.

Specifically, the determinant condition gives a relation between the partial derivatives, which one might then integrate in the hope of obtaining a polynomial inverse.

To make this more specific, say that we have a polynomial function $f: \mathbb K^n \to \mathbb K^n$, where $\mathbb K = \mathbb R$ or $\mathbb C$. Then $\det J_f$ is a polynomial in the partial derivatives of the components and hence itself a polynomial. By the inverse function theorem and Cramer's rule, the derivative of the (local) inverse has the form $$ \frac{1}{\det(J_f)} \operatorname{Cof}(J_f)^{\mathsf T}, $$ where by assumption $\det(J_f)$ is a nonzero constant. Also, the cofactor matrix is a polynomial matrix. Thus we can integrate its entries, component by component, to obtain a local polynomial inverse, which is also global due to the identity theorem (at least in the complex case).
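For concreteness, here is a toy example of my own that satisfies all the hypotheses (it only illustrates the objects involved, not the difficulty of the conjecture): take $n = 2$ and $f(x_1, x_2) = (x_1 + x_2^2,\ x_2)$. Then

$$ J_f = \begin{pmatrix} 1 & 2x_2 \\ 0 & 1 \end{pmatrix}, \qquad \det J_f = 1, \qquad \frac{1}{\det J_f}\operatorname{Cof}(J_f)^{\mathsf T} = \begin{pmatrix} 1 & -2x_2 \\ 0 & 1 \end{pmatrix}, $$

so the determinant is a nonzero constant and the matrix I would like to integrate entrywise has polynomial entries.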

What makes this approach fail?

(This main part of my question distinguishes it from the other questions about the Jacobian conjecture, of which this one has been wrongly suggested to be a duplicate.)

Cloudscape
  • As far as I can tell, the given question does not suggest a specific approach, and neither does its answer address one (except for attempting to find a counter-example). So could you please un-duplicate this question? – Cloudscape Mar 10 '19 at 09:40
  • Run through the hypothetical argument outlined in one of those answers. While not a perfect duplicate, the fact that you propose solving this problem as if it were a simple bit of ODE/linear algebra implies to me that you don't actually understand why the problem is nuanced and complicated. I agree with you that it's not a perfect duplicate, but it's better than my immediate answer to your question, which is that the Jacobian Conjecture involves multiple variables. You can't just integrate to cancel out the differentiation immediately and get polynomials. – Brevan Ellefsen Mar 10 '19 at 09:43
  • "you don't actually understand why the problem is nuanced and complicated" - Yes, that's why I'm asking. Since I've only just met the conjecture, this doesn't imply that I'm stupid though. – Cloudscape Mar 10 '19 at 09:51
  • I never stated anything about your intelligence, so please don't accuse me of doing so. It's perfectly understandable that you wouldn't get the nuance looking at the conjecture for the first time - I surely didn't when I first saw it. That is why I: 1) connected a question I think answers most aspects of this question decently well enough to be considered a duplicate, and 2) answered the one part of your question that isn't a duplicate via my above comment. – Brevan Ellefsen Mar 10 '19 at 09:54
  • Well, you said that it wasn't possible to pursue the approach, but you didn't say why. (E.g., the parity obstruction is an explanation for why twin primes wouldn't yield to classical sieve theory methods, but you didn't give me anything like that.) – Cloudscape Mar 10 '19 at 09:56
  • To quote myself: "the Jacobian Conjecture involves multiple variables. You can't just integrate to cancel out the differentiation immediately and get polynomials." If you have a more elaborate idea in mind involving integration, please edit your question to include additional details. – Brevan Ellefsen Mar 10 '19 at 10:03
  • @BrevanEllefsen Done. Respond. – Cloudscape Mar 10 '19 at 10:16
  • Note also that the local polynomial inverse extends uniquely, so that it is in fact a global inverse by the multi-dimensional identity theorem. – Cloudscape Mar 10 '19 at 10:21

1 Answer


I now see what my mistake was. Instead of being polynomials in $y$, the variable of the target space, the entries of the derivative of the inverse are polynomials in the components of $f^{-1}(y)$. Specifically:

$$ J_{f^{-1}}(y) = \bigl[J_f\bigl(f^{-1}(y)\bigr)\bigr]^{-1}. $$

Thus, we only obtain a polynomial in the components of $f^{-1}(y)$, which is probably worthless.
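To spell out where the integration breaks down: by the formula above, the matrix one would have to integrate with respect to $y$ is

$$ J_{f^{-1}}(y) = \frac{1}{\det J_f}\,\operatorname{Cof}(J_f)^{\mathsf T}\,\Big|_{x = f^{-1}(y)}, $$

and while the entries of $\operatorname{Cof}(J_f)^{\mathsf T}$ are polynomials in $x$, after the substitution $x = f^{-1}(y)$ they become unknown functions of $y$. So the integration with respect to $y$ cannot be carried out without already knowing $f^{-1}$.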

The only property of the inverse we have thus shown is the following: if we differentiate it in any direction, we obtain a polynomial in its components. Yet this is true even for $\exp$.
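Indeed, in one variable,

$$ \frac{d}{dx}\, e^{x} = e^{x} = p\bigl(e^{x}\bigr), \qquad p(t) = t, $$

so the derivative of $\exp$ is a (degree-one) polynomial in $\exp$ itself, yet $\exp$ is not a polynomial. The property derived above is therefore far too weak to force the inverse to be a polynomial.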

Cloudscape