
Asked on MathOverflow here.


What's the bijection between (equivalence classes of) scalar products (I guess 'scalar product' is the same as 'inner product') and a.c.s. (almost complex structures) on $\mathbb R^2$?

From Example 1.2.12 of Daniel Huybrechts, Complex Geometry: An Introduction.

[Image: the statement of Example 1.2.12 from Huybrechts.]


Assumptions and notation:

  1. I will pretend $V = \mathbb R^2$ literally, rather than merely up to isomorphism.

  2. Let $\Phi(V)$ be the set of real symmetric positive definite $2 \times 2$ matrices. This set is in bijection with inner products on $V$, I believe. We have according to this,

$$\Phi(V) = \left\{\begin{bmatrix} h & f\\ f & g \end{bmatrix} : h, f, g \in \mathbb R,\ h+g > 0,\ hg-f^2 > 0 \right\}$$

  3. Let $\Gamma(V)$ be the set of (matrix representations of) a.c.s. on $V$. We have, according to this,

$$\Gamma(V) := \left\{\begin{bmatrix} a & b\\ \frac{-1-a^2}{b} & -a \end{bmatrix} : a, b \in \mathbb R,\ b \ne 0\right\} \subseteq \operatorname{Aut}_{\mathbb R}(V) \subseteq \operatorname{End}_{\mathbb R}(V)$$

  4. I understand that the 'rotation' matrices in $V$ are $SO(2) := \{R(\theta) := \begin{bmatrix} \cos(\theta) & -\sin(\theta)\\ \sin(\theta) & \cos(\theta) \end{bmatrix}\}_{\theta \in \mathbb R}$, though I'm not sure that Huybrechts has the same usage of the term 'rotation'. (I ask about this later.) A quick numerical check of the parametrisations in 2 and 3 follows this list.
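Here is that check as a Python sketch (my own addition, not from the linked posts; the trace/determinant test for positive definiteness and the sampling approach are assumptions of the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(1000):
    # Phi(V): for a symmetric 2x2 matrix, trace > 0 and det > 0
    # is equivalent to positive definiteness.
    h, f, g = rng.uniform(-5, 5, size=3)
    M = np.array([[h, f], [f, g]])
    if h + g > 0 and h * g - f**2 > 0:
        assert np.all(np.linalg.eigvalsh(M) > 0)

    # Gamma(V): every matrix of the given form squares to -Id.
    a, b = rng.uniform(-5, 5, size=2)
    if abs(b) > 1e-6:
        I = np.array([[a, b], [(-1 - a**2) / b, -a]])
        assert np.allclose(I @ I, -np.eye(2))

print("both parametrisations check out on 1000 random samples")
```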

Questions:


A. For injectivity (except for the equivalence class):

Given a scalar product $M$ (or its equivalence class $[M]$), choose the unique $I$ sending each $v$ to the vector described in the excerpt. I'll call this map $\gamma: \Phi(V) \to \Gamma(V)$, $\gamma(M)=I$. (Later, $\tilde \gamma: \Phi(V)/\sim \ \to \Gamma(V)$, $\tilde \gamma([M])=I$.)

  1. It's 'rotation by $\pi/2$' or something. In what way? For $M=I_2$ (the $2 \times 2$ identity), $I$ is indeed 'rotation by $\pi/2$', in the sense that it's $\begin{bmatrix} 0 & -1\\ 1 & 0 \end{bmatrix} \in SO(2) \cap \Gamma(V)$, which is the '$R(\theta)$' for $\theta = \pi/2$.

  2. What exactly is the formula for $I=\begin{bmatrix} a & b\\ \frac{-1-a^2}{b} & -a \end{bmatrix} \in \Gamma(V)$ given $M = \begin{bmatrix} h & f\\ f & g \end{bmatrix} \in \Phi(V)$?

I'm asking because

  • 2a - I would exceed the Wolfram Alpha computation time limit

  • 2b - I notice that for a different $M$ I tried, $I$ isn't a 'rotation matrix' in the sense of $SO(2)$. In fact, I believe the only 'rotation' matrices that are also a.c.s. are $\pm \begin{bmatrix} 0 & -1\\ 1 & 0 \end{bmatrix}$, i.e. $SO(2) \cap \Gamma(V) = \{\pm \begin{bmatrix} 0 & -1\\ 1 & 0 \end{bmatrix}\}$ (a symbolic check of this claim appears after this list). However, I think $I$ kind of 'rotates by $\pi/2$' in some other sense.

  • 2c - I think $SO(2) \cap \Gamma(V)$ isn't meant to be the image of $\gamma$
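Here is the symbolic check of 2b (my addition, assuming SymPy): solve $R(\theta)^2 = -I_2$ for $\theta$.

```python
import sympy as sp

theta = sp.symbols('theta', real=True)
R = sp.Matrix([[sp.cos(theta), -sp.sin(theta)],
               [sp.sin(theta),  sp.cos(theta)]])

# R(theta)^2 = R(2*theta), so R^2 = -Id forces 2*theta = pi (mod 2*pi).
eqs = sp.simplify(R * R + sp.eye(2))
print(sp.solve([eqs[0, 0], eqs[1, 0]], theta))  # expect theta = ±pi/2 (mod 2*pi)
```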


B. For surjectivity:

I'll call whatever map we would have $\phi: \Gamma(V) \to \Phi(V)$, $\phi(I)=M$.

  1. Given an a.c.s. $I$, what are some possible scalar products $M$?

  2. There's a comment suggesting we choose the unique $M_v$ such that, for some $v \in V \setminus \{0\}$, the set $\{v,I(v)\}$ is an orthonormal basis. I tried this out (long to type!), and the only thing missing was the positive orientation. I guess either $\{v,I(v)\}$ or $\{v,-I(v)\}$ is positively oriented, though. So let $M_v$ and $N_v \in \Phi(V)$ correspond to $\{v,I(v)\}$ and $\{v,-I(v)\}$ respectively. Then, by fixing $v$ (I ask about not fixing $v$ later), we have $\phi(I)=M_v$ or $N_v$, whichever corresponds to the positively oriented basis; I'll just call this $\phi(I)=L_v$. Is this right?

  3. Is $\phi$ supposedly an inverse (or right inverse or left inverse or whatever) to $\gamma$ (or $\tilde \gamma$ or whatever), in the sense that $\gamma(\phi(I)) = I$ for all (a.c.s.) $I \in \Gamma(V)$?

  4. This whole thing about the $v$ makes me think there's another equivalence relation going on here. Is there?

This suggests maps parametrised by the nonzero $v$, namely $\phi_v: \Gamma(V) \to \Phi(V)$. In this case, we might investigate whether $\phi_v(I)=L_v=L_w=\phi_w(I)$, or at least whether $[L_v]=[L_w]$ under the old equivalence relation of a positive scalar $\lambda$, i.e. $L_v = \lambda L_w$. If this investigation turns out negative, then there is a problem: if we instead declare two inner products equivalent whenever they come from the same a.c.s. $I$ under $\phi_{\cdot}$, for possibly different $v$ and $w$, then the equivalence class of $L_v$ under this new relation, which is $\{L_w\}_{w \ne 0}$, might not be the same as its class under the old relation, which is $\{\lambda L_v\}_{\lambda > 0}$. (A numerical experiment on this is sketched below.)
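Here is that experiment as a Python sketch (the helper name `inner_product_from` is my own; it uses the fact that $\{v, I(v)\}$ is $M$-orthonormal iff $P^T M P = I_2$ for $P = [v \mid I(v)]$, i.e. $M = (PP^T)^{-1}$):

```python
import numpy as np

def acs(a, b):
    """An almost complex structure in the parametrisation of Gamma(V)."""
    return np.array([[a, b], [(-1 - a**2) / b, -a]])

def inner_product_from(I, v):
    """The unique M making (v, Iv) an M-orthonormal basis:
    P^T M P = Id for P = [v | Iv], i.e. M = (P P^T)^{-1}."""
    P = np.column_stack([v, I @ v])
    return np.linalg.inv(P @ P.T)

I = acs(1.0, 2.0)
M_v = inner_product_from(I, np.array([1.0, 0.0]))
M_w = inner_product_from(I, np.array([3.0, -1.0]))

# The two inner products agree up to a positive scalar, i.e. [L_v] = [L_w]:
lam = M_v[0, 0] / M_w[0, 0]
print(lam > 0, np.allclose(M_v, lam * M_w))  # True True
```

On this sample the investigation turns out positive: changing $v$ rescales $L_v$ by a positive scalar, so the new relation agrees with the old one.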


Ideas:

  1. Perhaps there's some matrix fact here about how scalar products are in bijection with positive definite symmetric matrices, and almost complex structures are rotation-like matrices that are square roots of $-I_2$. Like: given a positive definite symmetric $B$, there exists a unique a.c.s. $J$ such that (something something).

  2. Perhaps this is related, but I'd rather not further analyse the question or read through the answer given that I've spent over a month on almost complex structures BEFORE we even put inner products on vector spaces. Please consider spoon-feeding me here.

BCLC
  • Given $I$, pick any non-zero vector $v$ and declare $\{v, I(v)\}$ to be an orthonormal basis. This defines an inner product on $V$. – Jason DeVito - on hiatus Nov 11 '20 at 18:14
  • @JasonDeVito Ok that gives an inner product, I'll just say it's represented by matrix $M_v$ (with subscript $v$, since well $M$ depends on $v$), from an $I$, but does $M_v$ give back the $I$? The book says the $I$ is unique so if $M_v$ gives out $J$, then all I have to do is show $J$ has the same properties as $I$. After rewriting bases and stuff, I was able to show everything except positively oriented. – BCLC Nov 13 '20 at 14:25
  • @JasonDeVito Is $\det[v | I(v)]$ somehow positive (I just pretend WOLOG $V = \mathbb R^2$ and that the orientation is given by standard basis) or something because $\{v,I(v)\}$ is an orthonormal basis? If positive but for a different reason, then what? If not necessarily positive, then could we perhaps declare from the start that $\{v, I(v)\}$ is positively oriented orthonormal? – BCLC Nov 13 '20 at 14:27
  • @JasonDeVito Thanks for commenting! – BCLC Nov 13 '20 at 14:28
  • Hm, I didn't think about that issue. But it can definitely be the case that $\{v, I(v)\}$ is negatively oriented: if $I$ is an almost complex structure, then so is $-I$, and $\{v,I(v)\}$ and $\{v,-I(v)\}$ have opposite orientations. In the text you copied, the author mentions "almost complex structures that induce the given orientation", but I'm not sure I agree it's the same as the set $\{I\in GL_+(V): I^2 = -Id\}$, unless $GL_+$ has a non-standard meaning. – Jason DeVito - on hiatus Nov 13 '20 at 14:36
  • @JasonDeVito Re the orientation: Ah ok so given $I$, then pick either the $M_v$ for $\{v,I(v)\}$ or the $N_v$ for $\{v,-I(v)\}$? I mean, there are only 2 orientations anyway soooo.... – BCLC Nov 13 '20 at 14:38
  • @JasonDeVito Re the $GL_{+}$, thanks but I kind of have other questions about this 1st. I'm actually about to edit post for more info. – BCLC Nov 13 '20 at 14:39
  • Well, I've actually never seen the notation $GL_+$, only $GL^+$. Perhaps the former notation means exactly what the author wrote: the $I$ in $GL(V)$ for which $\{v,I(v)\}$ is positively oriented. (The latter, to me, means positive determinant / orientation preserving.) – Jason DeVito - on hiatus Nov 13 '20 at 14:39
  • @JasonDeVito Edited question. Thanks – BCLC Nov 13 '20 at 15:23
  • Let $V$ be a two-dimensional real vector space equipped with an inner product $\langle -, - \rangle$. If $v \in V$ is nonzero, then there are exactly two vectors which are at right angles to $v$ and of the same length as $v$. These are the two candidates for what $Iv$ could be. There is a priori no way to choose one or the other, but the fact that $V$ has the extra data of an orientation means we can select one so that $(v, Iv)$ is a positively oriented basis of $V$. Hence we get a unique operator $I \colon V \to V$ from the data of $(V, \langle -, - \rangle, \text{orientation})$. – Joppy Nov 14 '20 at 10:25
  • Conversely, suppose that $V$ is a two-dimensional real vector space equipped with a linear operator $I \colon V \to V$ satisfying $I^2 = - \operatorname{id}_V$, and an orientation, and suppose furthermore that $(v, Iv)$ is a positively oriented basis for any nonzero $v \in V$. Then there is a unique inner product defined by picking any $v \in V$ and declaring $(v, Iv)$ to be orthonormal. (Any $v \neq 0$ can be used: the scalar product will be the same up to a positive real scalar multiple). So we get a unique inner product from the data of $(V, I, \text{orientation})$. – Joppy Nov 14 '20 at 10:29
  • This creates two maps, and you should check that they compose to give the identity (you need to mod out the scalar products by a positive real scalar) in both directions. Is there more that is less clear? – Joppy Nov 14 '20 at 10:30
  • @Joppy Thanks. You've kinda given the same answer as Jason DeVito except that you address that any $v$ can be used (you didn't prove, but I guess I have to figure it out. well at least i know now it's true. i mean i didn't even know if it were true, so it could've been a waste of time if i attempted to prove this) but.... – BCLC Nov 14 '20 at 10:38
  • @Joppy oh wait wait i was about to say you don't address positively oriented but you do...hmmm.... – BCLC Nov 14 '20 at 10:39
  • @Joppy how can we just assume $\{v,I(v)\}$ is positively oriented? Do you mean that given an $I$, does there exist a $v$ such that $\{v,I(v)\}$ is positively oriented? I think this is the only missing thing for the surjectivity part. In answering this, you can post as answer for the surjectivity part. – BCLC Nov 14 '20 at 10:40
  • @JasonDeVito Joppy seems to have an answer for the surjectivity part. What do you think? – BCLC Nov 14 '20 at 10:41
  • @Joppy Just to double check, you're sure that for any $v,w \in V \setminus 0$, there exists $\lambda$ such that $L_v = \lambda L_w$? I mean I'm happy to prove it, but I wanna be sure that I'm right. (I think there's no need to assume positively oriented or whatever at this point, but based on my earlier comment perhaps we'll just later further restrict $v,w$ to the ones that give positive orientations or whatever – BCLC Nov 14 '20 at 10:43
  • @Joppy Wait is my $L_v := \{M_v, N_v\}$ thing right? Like any $v$ can be used and we're not sure $\{v,I(v)\}$ is positively oriented, but we're sure either $\{v,-I(v)\}$ or $\{v,I(v)\}$ is positively oriented – BCLC Nov 14 '20 at 10:48
  • The operator $I$ defines an orientation: the one where $(v, Iv)$ is positively oriented for any nonzero $v$. (By the way, you need to use ordered bases to define an orientation: a basis as a set will not do). – Joppy Nov 14 '20 at 11:16
  • @JasonDeVito are (A1),(A2),(B) below correct? – BCLC Nov 22 '20 at 08:51
  • (A1) - a.c.s. intersected with $GL^{+}$ refers to the a.c.s. that necessarily give $(v,I(v))$ as a positively oriented ordered basis (not necessarily orthonormal for any particular inner product $M$), so the bijection is not between all a.c.s. and equivalence classes of $M$, but between those particular a.c.s. (the '$C$' in @Joppy 's answer) and equivalence classes of $M$. – BCLC Nov 22 '20 at 08:51
  • (A2) Therefore, in declaring that, given $I$, we have $\{v,I(v)\}$ to be $M_I$-orthonormal for some inner product $M_I$, we're assured, because of $I$, that the ordered (orthonormal) basis $(v,I(v))$ is positively oriented. (B) To say that '$(v,I(v))$ is a positively oriented ordered basis' for any $v$ (even zero $v$, in which case the statement is incorrect or vacuously correct) doesn't require any kind of inner product $M$ on $V \cong \mathbb R^2$? – BCLC Nov 22 '20 at 08:51
  • @Joppy are (A1),(A2),(B) above correct? – BCLC Nov 22 '20 at 08:52
  • @JohnSmith: I believe so. – Jason DeVito - on hiatus Nov 22 '20 at 12:32
  • @JasonDeVito Thanks! – BCLC Nov 23 '20 at 12:49
  • @JasonDeVito Ok now may you please say whether or not the $$\frac{MI}{\sqrt{\det(M)}} = J$$ formula is correct? – BCLC Nov 23 '20 at 12:50

3 Answers


Fix a two-dimensional real vector space $V$. There are three kinds of extra data we can impose upon $V$:

  1. An orientation, a function $\omega$ which measures a basis $(v_1, v_2)$ and outputs $\omega(v_1, v_2) \in \{\pm 1\}$.
  2. A complex structure, an $\mathbb{R}$-linear operator $I \colon V \to V$ satisfying $I^2 = -\operatorname{id}_V$.
  3. A scalar product $B \colon V \times V \to \mathbb{R}$, which is bilinear, symmetric, and positive-definite.

For example, when $V = \mathbb{R}^2$ and $(e_1, e_2)$ is the standard basis, then we have the standard structures:

  1. The orientation of a basis $(v_1, v_2)$ is the sign of the determinant of the change-of-basis matrix from $(e_1, e_2)$ to $(v_1, v_2)$.
  2. The complex structure is a rotation counter-clockwise by $\pi/2$, the linear operator defined by $I e_1 = e_2$ and $I e_2 = -e_1$.
  3. The dot product $B(e_1, e_1) = B(e_2, e_2) = 1$ and $B(e_1, e_2) = 0$.

When I say "the" rotation by $\pi/2$, I am really using both the orientation and the scalar product implicitly. An algebraic rotation by $\pi/2$ is simply an operator $I$ squaring to $I^2 = - \operatorname{id}_V$, and there are many operators of this form. For example, I could define $J e_1 = 2 e_1 + 3e_2$ and $J(2e_1 + 3e_2) = -e_1$ and $J$ would be an algebraic rotation by $\pi/2$.

Keep in mind that if $V$ is just a two-dimensional real vector space with no more data, we cannot possibly say whether something preserves lengths or angles. Think, for example, of the two-dimensional vector space of functions $f \colon \mathbb{R} \to \mathbb{R}$ spanned by $e^x$ and $\sin x$: is the operator $I(e^x) = \sin x$, $I(\sin x) = -e^x$ a true "rotation"? We cannot possibly say before we define an inner product on the space, but it is certainly an algebraic rotation, since it squares to minus the identity.

This brings us to the standard notions of "compatibility" of a complex structure with the above:

  • A complex structure $I$ is compatible with the scalar product $B$ if it is an isometry: $B(Iv_1, Iv_2) = B(v_1, v_2)$ for all $v_1, v_2 \in V$.
  • A complex structure $I$ is compatible with the orientation if $(v, Iv)$ is positively oriented for any nonzero $v \in V$.

Lemma: If $(V, \omega, B)$ is a two-dimensional real vector space equipped with an orientation $\omega$ and scalar product $B$, then there is a unique compatible complex structure $I \colon V \to V$.

Proof: Since $I$ is an isometry it preserves lengths: $B(v, v) = B(Iv, Iv)$ for all $v \in V$. Furthermore, we have $B(v, Iv) = B(Iv, I^2 v) = -B(v, Iv)$ and hence $v$ and $Iv$ are perpendicular for all $v \in V$. Therefore $Iv$ lies in the one-dimensional subspace perpendicular to $v$, and must be one of the two vectors on this line which have the same length as $v$. Out of these two possibilities for $Iv$ we take the one where $\omega(v, Iv) = 1$.


Now, fix an oriented two-dimensional vector space $(V, \omega)$. Define $$S = \{B \colon V \times V \to \mathbb{R} \mid B \text{ a scalar product}\},$$ $$C = \{I \colon V \to V \mid I^2 = -\operatorname{id}_V \text{ and } \omega(v, Iv) = 1 \text{ for all nonzero } v \in V \}$$ $$ \Phi \colon S \to C, \quad \Phi(B) = I_B $$ where $I_B$ is the unique complex structure compatible with the data $(V, \omega, B)$. We want to show that $\Phi$ is surjective, and that whenever $\Phi(B) = \Phi(D)$ then $B = \lambda D$ for some $\lambda \in \mathbb{R}_{>0}$.

Surjectivity: Let $I$ be a complex structure on $V$ compatible with $\omega$. Pick any nonzero vector $v \in V$; then $(v, Iv)$ is a positively oriented basis. Define a scalar product $B$ by setting $B(v, v) = B(Iv, Iv) = 1$ and $B(v, Iv) = 0$; in other words, $B$ is defined so that $(v, Iv)$ is an orthonormal basis. Since $I$ is compatible with both $\omega$ and $B$, we have that $I = I_B = \Phi(B)$.

"Injectivity": Suppose that $I_B = I_D$ for two scalar products $B, D$. Then $(v, I_B v)$ is a positively oriented orthogonal basis for both $B$ and $D$. Hence there are positive scalars $\lambda, \mu$ such that $(\lambda v, \lambda I_B v)$ and $(\mu v, \mu I_B v)$ are positively oriented orthonormal bases for $B$ and $D$ respectively, and therefore $\frac{1}{\lambda} B = \frac{1}{\mu} D$. (If this does not convince you, do the simple exercise: a scalar product is entirely determined by an orthonormal basis).


Hopefully that is enough: it is a very drawn out explanation. Intuitively, complex structures are algebraic rotations. A scalar product defines a circle in the space (the vectors of unit length) and angles in the space, and an orientation tells you which way around the circle is the "positive" way; hence you get a unique compatible complex structure in the presence of a scalar product and an orientation. Scaling the scalar product up or down (making the "unit" circle larger or smaller) doesn't change angles or rotations.

Joppy
  • I've analysed some of this. Will type up follow up questions later. awarding bounty for now. thanks joppy! – BCLC Nov 20 '20 at 16:25
  • $$\frac{MI}{\sqrt{\det(M)}} = J$$ Joppy and Jason DeVito, I've done some analysis of your answers. Instead of making some comments, I have an answer of my own, summed up by above formula with with $M$ as inner product, $I$ as almost complex structure and $J$ as rotation matrix by $\frac{\pi}{2}$. Please let me know what you think. – BCLC Nov 22 '20 at 08:42
  • Joppy do your $\lambda$ or $\mu$ relate to determinants? – BCLC Nov 23 '20 at 12:52
  • btw, so what is $SO(V)$ please? i guess related https://encyclopediaofmath.org/wiki/Orthogonal_transformation ... my initial thought is that $SO(V)$ means $SO((V,\langle \rangle))$ and like any endomorphism $H$ is s.t. $H \in SO(V)$ iff $H$ is compatible with $\langle \rangle$, but here it seems to be a consequence rather than a definition of $\in SO(V)$. – BCLC Dec 03 '20 at 23:54
  • $SO$ stands for the "special orthogonal" group. When $(V, \langle -, - \rangle)$ is an inner product space, the orthogonal group $O(V, \langle -, - \rangle)$ consists of those linear transformations $f$ such that $\langle f(u), f(v) \rangle = \langle u, v \rangle$ for all vectors $u, v \in V$ (in other words, linear isometries). If $V$ is over the real numbers, then every element of the orthogonal group has determinant $\pm 1$. The special orthogonal group $SO(V, \langle -, - \rangle) \subseteq O(V, \langle -, - \rangle)$ is the subgroup of determinant $1$ transformations. – Joppy Dec 04 '20 at 03:53

This answer addresses the "injectivity" questions you ask.

  1. I interpret "rotation" as "orientation preserving, and also preserving the inner product". If your inner product is the standard one on $\mathbb{R}^2$, then these correspond to rotation matrices as you've defined them. However, in a different inner product, the rotation matrices look different. In this interpretation, saying $I$ is rotation by $\pi/2$ just means that $I$ preserves lengths (as computed in the weird inner product), and the angle between $v$ and $I(v)$ (as computed in the weird inner product) is $\pi/2$.

  2. The matrix of $I$ is $\begin{bmatrix} -\frac{f}{\sqrt{gh-f^2}} & -\frac{g}{\sqrt{gh-f^2}} \\ \frac{h}{\sqrt{gh-f^2}} & \frac{f}{\sqrt{gh-f^2}}\end{bmatrix}.$ I found this by setting $I\begin{bmatrix} 1\\0\end{bmatrix}= \alpha \begin{bmatrix} 1\\0\end{bmatrix} + \beta \begin{bmatrix} 0\\1\end{bmatrix}$ and then using the two equations $\left\| \begin{bmatrix} 1\\0\end{bmatrix}\right\| = \left\| I\begin{bmatrix} 1\\0\end{bmatrix}\right\|$ and $\left\langle \begin{bmatrix} 1\\0\end{bmatrix}, I\begin{bmatrix} 1\\0\end{bmatrix} \right\rangle = 0$ to solve for $\alpha$ and $\beta$. It turns out there is a sign ambiguity which is resolved using the orientation. I believe this answers 2a; I think 1. answers 2b and 2c.
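A short symbolic check of the matrix in 2. (my addition, assuming SymPy): it squares to $-\operatorname{id}$, preserves $\langle x, y \rangle = x^T M y$, and sends $e_1$ to an $M$-perpendicular vector.

```python
import sympy as sp

h, f, g = sp.symbols('h f g', real=True)
d = sp.sqrt(g * h - f**2)  # positive, since M is positive definite
M = sp.Matrix([[h, f], [f, g]])
I = sp.Matrix([[-f / d, -g / d],
               [ h / d,  f / d]])

print(sp.simplify(I * I + sp.eye(2)))       # zero matrix: I^2 = -Id
print(sp.simplify(I.T * M * I - M))         # zero matrix: I preserves <.,.>_M
e1 = sp.Matrix([1, 0])
print(sp.simplify((e1.T * M * I * e1)[0]))  # 0: e1 and I(e1) are M-perpendicular
```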

  • Holy crap. Thanks a lot Jason DeVito. God bless you. Upvoted. Will read more later. – BCLC Nov 13 '20 at 16:10
  • $$\frac{MI}{\sqrt{\det(M)}} = J$$ Joppy and Jason DeVito, I've done some analysis of your answers. Instead of making some comments, I have an answer of my own, summed up by above formula with with $M$ as inner product, $I$ as almost complex structure and $J$ as rotation matrix by $\frac{\pi}{2}$. Please let me know what you think. – BCLC Nov 22 '20 at 08:42
  • btw, so what is $SO(V)$ please? i guess related https://encyclopediaofmath.org/wiki/Orthogonal_transformation ... my initial thought is that $SO(V)$ means $SO((V,\langle \rangle))$ and like any endomorphism $H$ is s.t. $H \in SO(V)$ iff $H$ is compatible with $\langle \rangle$, but here it seems to be a consequence rather than a definition of $\in SO(V)$. – BCLC Dec 03 '20 at 23:56
  • in re your other comment about basis and such, is it not the case that my answer is the basis-free one while the ones that aren't basis-free are your answer, Joppy's answer and Huybrechts' proof? I mean, you all rely on some nonzero $v$ while I do not. I think mine is a lot simpler and therefore better if correct. No offense. And of course I got the idea for my answer based on your computation of $I$. Again, thanks for that computation – BCLC Dec 04 '20 at 00:09
  • @JohnSmithKyon: My answer and your own answer depend on the basis. As Joppy says, any time you write down a matrix, there is an implied basis. Joppy's answer after his/her "Lemma" is basis independent. Choosing a non-zero $v$ doesn't imply you've picked a basis - choosing such a $v$ is something you can do in any non-zero vector space, even in one that doesn't have a basis (this is getting far afield, but see https://math.stackexchange.com/questions/207990/vector-spaces-and-ac) – Jason DeVito - on hiatus Dec 04 '20 at 02:14

DISCLAIMER: posting as answer instead of comment since too long for comment. You could think of this as a Cunningham's law thing or whatever, but I'm really doing this just because it's too long for a comment. I'm also going to do community wiki if this makes any difference.

TL;DR I think the bijection can be summed up in this formula (based on Jason DeVito's explicit computation of $I$)

$$\frac{MI}{\sqrt{\det(M)}} = J$$


Edit: Adding my intuition:

  1. Given $M$, we want unique $I$ such that $I$ is something like '(anti-clockwise) rotation by $\frac{\pi}{2}$' but 'with respect to $M$'. In precise terms, this is the unique $I$ s.t. $\frac{MI}{\sqrt{\det(M)}} = J$, where $J$ is literally (anti-clockwise) rotation by $\frac{\pi}{2}$ ('with respect to $I_2$' or something)

  2. Similarly, given $I$, we want an $M$ such that $I$ is '(anti-clockwise) rotation by $\frac{\pi}{2}$' but 'with respect to $M$'. Turns out there are several $M$'s that satisfy this condition, where this condition is stated in precise terms as $\frac{MI}{\sqrt{\det(M)}} = J$.


The bijection is:

  • From $M$ to $I_M$: Given $M$, choose unique $I_M = M^{-1}J\sqrt{\det(M)}$

  • From $I$ to $[M_I]$: Given $I$, choose unique equivalence class $[M_I]$ given by all $M_I$ such that $\frac{M_I}{\sqrt{\det(M_I)}} = JI^{-1} = -JI$

  • Injectivity (of the map $M \mapsto I_M$): Given $M$ and $I_M=I_N$, I believe Joppy's $\lambda$ and $\mu$ divide to give the ratio (or the square root of the ratio) of the determinants

  • Surjectivity (of the map $M \mapsto I_M$): Given $I$, there exist many $M$, namely the class $[M_I]$. (A symbolic check of these formulas follows this list.)
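The following SymPy sketch (my addition) checks the TL;DR formula against Jason DeVito's explicit $I$, together with the two directions of the claimed bijection:

```python
import sympy as sp

h, f, g = sp.symbols('h f g', real=True)
d = sp.sqrt(g * h - f**2)
M = sp.Matrix([[h, f], [f, g]])
I = sp.Matrix([[-f / d, -g / d], [h / d, f / d]])  # Jason DeVito's matrix
J = sp.Matrix([[0, -1], [1, 0]])                   # rotation by pi/2

print(sp.simplify(M * I / sp.sqrt(M.det()) - J))        # zero: M I / sqrt(det M) = J
print(sp.simplify(M.inv() * J * sp.sqrt(M.det()) - I))  # zero: I_M = M^{-1} J sqrt(det M)
print(sp.simplify(M / sp.sqrt(M.det()) + J * I))        # zero: M / sqrt(det M) = -J I
```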

Joppy and Jason DeVito, I've done some analysis of your answers. Follow-up questions:

FUP1 - What do you think of the above formula, with $M$ as the inner product, $I$ as the almost complex structure and $J$ as the rotation matrix by $\frac{\pi}{2}$? I think this encapsulates all 3 properties of $I$. In particular...

FUP2 - I think the positive orientation is captured by choosing $J$ instead of $-J$ (the rotation matrix by $-\frac{\pi}{2}$)?

  • FUP2a - In this case, I think the first 2 properties of the unique $I$ given $M$ (respectively, $v^TMIv=0$ and $v^T(I^TMI-M)v=0$, for all nonzero $v$) are equivalent to $\frac{MI}{\sqrt{\det(M)}} = \pm J$?

FUP3 - We can see that the $\lambda$ between $M$'s is given by the square root of the ratio of determinants (a symbolic check appears at the end of this answer):

  • FUP3a - We have $\frac{M}{\sqrt{\det(M)}} = \frac{N}{\sqrt{\det(N)}} \implies M=\lambda N$ with $\lambda=\frac{\sqrt{\det(M)}}{\sqrt{\det(N)}}$?

    • FUP3ai - I think in this part we do not assume that each matrix $\frac{M}{\sqrt{\det(M)}}$, $\frac{N}{\sqrt{\det(N)}}$ is equal to $JI^{-1} = -JI$?
  • FUP3b - Conversely for $M=\lambda N$ and $\lambda > 0$, we can take $\det$ of both sides to get $\frac{M}{\sqrt{\det(M)}} = \frac{N}{\sqrt{\det(N)}}$?

    • FUP3bi - In particular, we can see the relevance of $2$ dimensions here: taking the determinant of both sides of $M=\lambda N$ gives us a $\lambda^2$?
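Here is the symbolic check of FUP3 (my addition, assuming SymPy), using $\det(\lambda N) = \lambda^2 \det(N)$ in dimension $2$:

```python
import sympy as sp

lam = sp.symbols('lam', positive=True)
n11, n12, n22 = sp.symbols('n11 n12 n22', real=True)
N = sp.Matrix([[n11, n12], [n12, n22]])
M = lam * N

print(sp.simplify(M.det() - lam**2 * N.det()))                   # 0: the lambda^2 from FUP3bi
print(sp.simplify(M / sp.sqrt(M.det()) - N / sp.sqrt(N.det())))  # zero matrix: FUP3b
```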
BCLC
  • I usually prefer writing things down in the abstract first, then choosing a basis and writing down equations of matrices and so on. This is because it is relatively easy to go that way, and very difficult to go the other. I think at this point you will have to answer questions yourself, unless you want to ask more, like "what is the rotation matrix in an arbitrary basis" and so on. – Joppy Nov 23 '20 at 01:34
  • Thanks @Joppy 1 - Re your last sentence: $I$ or $M$ depends on a choice of basis? 2 - but wait does anything seem wrong with the $$\frac{MI}{\sqrt{\det(M)}} = J$$ formula? – BCLC Nov 23 '20 at 12:51
  • I don't know if the formula is true, I haven't checked - it looks believable though. Yes, every matrix depends on a choice of basis in a vector space. The matrix of a bilinear form especially so - for the matrix of a linear map, $\det(A)$ is the same no matter what basis $A$ is relative to. For the matrix of a bilinear form, even $\det(M)$ depends on the choice of basis. It is good to be able to choose bases and write things down in coordinates, but it is also important to be able to use an abstract bilinear form without needing to turn it into a matrix first. – Joppy Nov 23 '20 at 13:10
  • thanks @Joppy re 2 (the formula) - well it's like this: let's say given $M$ the $I$ really is an ACS that satisfies the properties including the positive oriented. then let's say given $I$ any such $M$ is positive definite. then this formula is the indeed the bijection? – BCLC Nov 23 '20 at 13:20
  • @Joppy re 1 (the matrix depends on basis thing) - 2 things - 1a (part 1 of 2) not sure if this makes any sense but even though $\det M$ somehow depends on choice of basis (let's just pretend I understand how $\det M$ can depend on choice of basis. hopefully you will not need to elaborate), does $\frac{M}{\sqrt{\det M}}$ depend on choice of basis? I was thinking this ratio will somehow cancel all the basis stuff. – BCLC Nov 23 '20 at 13:22
  • @Joppy 1a (part 2 of 2) Kinda like how Gauss-Bonnet cancels out all the geometry stuff in the integral to get the topological item $\chi(M)$. or simply like how $\frac{2x}{3x}$ doesn't depend on $x$ even though $2x$ and $3x$ depend on $x$ – BCLC Nov 23 '20 at 13:25
  • @Joppy 1b (part 1 of 2) - i think i kinda understand what you mean like rotation matrix in arbitrary basis like I think somehow $J$ is relative to standard basis and then relative to some other basis it becomes $I_M$ or something. I was indeed thinking (well just for a bit. it's kind of a headache to think of it. i guess this is why we have all these 'coordinate-free' descriptions) that somehow $I_M$ is like $J$ under a change of basis or something. but in the sense that like...i mean....idk i don't really see where exactly bases are here. Given $I$/$M$, I get $[M_I]$/$I_M$ with the formula. – BCLC Nov 23 '20 at 13:26
  • @Joppy 1b (part 2 of 2) - where exactly are the bases here? I mean the way I understand a given $I$ or $[M]$ is that it's (an equivalence class of) 4 numbers arranged rectangularly. I don't believe these numbers depend on a basis or anything. Perhaps you're thinking of something like this (from Loring W. Tu's Differential Geometry: Connections, Curvature, and Characteristic Classes), but I believe I've steered clear of bases here, eg when I get $M_I$, I don't even use (I THINK) the extra info of another nonzero vector $v$ like what you & Jason DeVito did – BCLC Nov 23 '20 at 13:34
  • @JohnSmithKyon: Curiously, $\det M$ does not depend on the choice of basis (so long as you use the same basis on the domain and codomain), so in your expression $\frac{M}{\sqrt{\det M}}$, the basis dependence doesn't "cancel out". I agree with Joppy - generally I introduce a basis and do computations as a last resort, because I generally don't feel the computations impart understanding. To be clear, I think these kinds of computations count as a "proof", but I also think the end goal of a mathematician is not "proof" but "understanding". – Jason DeVito - on hiatus Nov 23 '20 at 15:08
  • @JasonDeVito when did i ever introduce basis though? you and joppy were the ones who started it actually i believe. haha. actually even huybrechts. like when you say given a $v$, then ${v,I(v)}$ is orthonormal or whatever. imho, i'm the basis-free one among the 4 of us because I just say simply $$\frac{MI}{\sqrt{\det(M)}} = J$$ – BCLC Dec 03 '20 at 23:42
  • @Joppy Edited answer to include intuition. – BCLC Dec 03 '20 at 23:51
  • @JasonDeVito Edited answer to include intuition. – BCLC Dec 03 '20 at 23:51