
I've been following Gil Strang's lectures, and he shows that $Null(A)$ is orthogonal to $Row(A)$ for an $m\times n$ matrix $A$ from the fact that the matrix-vector product $Av$ amounts to taking the dot product of $v$ with each row of $A$: $Av=\begin{bmatrix}r_1\cdot v\\\vdots\\r_m\cdot v\end{bmatrix}$. If $v$ is in the null space, every $r_i\cdot v=0$, so $v$ must be orthogonal to the span of the rows of $A$. He uses similar logic to show the orthogonality of $Col(A)$ and $Null(A^T)$.
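For instance, here is a quick numerical check of this fact (with a made-up $2\times 3$ matrix, using numpy):

```python
import numpy as np

# Hypothetical 2x3 matrix A; its null space is spanned by v below.
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])

# v solves Av = 0 (found by row reduction by hand): v = (1, -1, 1).
v = np.array([1.0, -1.0, 1.0])

# Av stacks the dot products of each row r_i with v ...
assert np.allclose(A @ v, 0)

# ... so v has zero dot product with every row, hence with every
# linear combination of rows, i.e. with all of Row(A).
for r in A:
    assert np.isclose(r @ v, 0)
```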

To me this only makes sense if $v$ and the $r_i$ are seen as vectors in the standard basis. If $v$ has coordinates relative to some arbitrary basis of $R^n$ (I don't even know what basis the $r_i$ would be expressed in, in that case), the dot product of the coordinates of two orthogonal vectors need not equal 0. However, the null space condition $Av = 0$ still says exactly that every $r_i\cdot v=0$. Is there a more general, basis-independent way to see the orthogonality of the fundamental subspaces?

Yandle
  • The definition of orthogonal is dot product zero. Basis doesn't have anything to do with it. – Gerry Myerson Aug 14 '19 at 07:37
  • @GerryMyerson From this question, I got the impression that two orthogonal vectors have an inner product of zero regardless of basis and computed by the dot product formula in the standard basis. If the basis is changed, the inner product is computed some other way (and usually not the same formula as the dot product) to get the same value (0 for two orthogonal vectors). Is this incorrect? – Yandle Aug 15 '19 at 03:25
  • First, you define an inner product on the vector space; then, you define orthogonality to mean, inner product is zero. The definition of the inner product may or may not involve any particular basis. If it does involve some particular basis (for example, if it is given by the dot product on ${\bf R}^n$), and you choose to represent elements of the vector space with respect to some other basis, then, yes, you have to change the way of computing the inner product. – Gerry Myerson Aug 15 '19 at 04:53
  • @GerryMyerson My question is linked to when the way the inner product is computed is basis dependent. The $v$ such that $Av=0$ is when the dot product of the $v$ with each row of $A$ is zero regardless what basis is used to represent the coordinates of $v$, but $v$ may be represented with respect to some basis where the dot product of two orthogonal vectors is not zero (but inner product is). This is why I am confused about how null and row spaces are orthogonal if $v$ is represented wrt some arbitrary basis. – Yandle Aug 17 '19 at 02:49
  • The null space and row space of a matrix are properties of the matrix and are completely independent of any choice of basis or of inner product. The dot product of a row in the matrix and an element of the null space is zero. Beyond that, it's all up for grabs. – Gerry Myerson Aug 17 '19 at 22:19
  • @GerryMyerson If so, then for $T:R^n\to R^m$, $[T(x)]_\Omega = A[x]_\beta$ where $\Omega, \beta$ are arbitrary bases, is it appropriate to view the tuples $[T(x)]_\Omega \in Col(A)$ and $[x]_\beta \in Null(A)$ as actual vectors with respect to the standard basis, as opposed to coordinate vectors relative to $\Omega$ and $\beta$? Then, if $Row(A)$ is also seen with respect to the standard basis, $Ax=0$ implies the dot product is zero between every vector in $Null(A)$ and $Row(A)$, which under the standard basis means the two spaces are orthogonal? And similarly to show $Null(A^T)^\perp = Col(A)$. – Yandle Aug 19 '19 at 00:38
  • I think you ought to make up some small example and work it out in full detail instead of trying to understand it at the most general level with nothing to ground it in. Once you've worked through an example or two I'm sure you'll be able to write up and post a convincing answer to your questions. – Gerry Myerson Aug 19 '19 at 12:40
  • So, how is that small example coming along? – Gerry Myerson Aug 20 '19 at 13:03
  • If $A = \begin{bmatrix}2&-1\\0&0\end{bmatrix}$ describes some $T:R^2\to R^2$, then the coordinate vector $[x]=c\begin{bmatrix}1\\2\end{bmatrix}$ solves $A[x]=0$ and is the same regardless of the basis. If the domain basis of $T$ is the standard basis (so that $[x]=x$), and the same goes for $Row(A)$, then the dot product of $x\in Null(A)$ with the rows of $A$ gives $0$, so the two spaces are orthogonal. – Yandle Aug 21 '19 at 05:03
  • If the domain basis is $\beta = \left\{\begin{bmatrix}1\\0\end{bmatrix}, \begin{bmatrix}1\\1\end{bmatrix}\right\}$, $Null(A)$ consists of all $x$ where $x=c\begin{bmatrix}1\\0\end{bmatrix}+2c\begin{bmatrix}1\\1\end{bmatrix}=c\begin{bmatrix}3\\2\end{bmatrix}$, and then its dot product with vectors in $Row(A)$ is not $0$. However, the product of $A$ with a change of basis matrix ($\beta \to$ std) produces $\begin{bmatrix}2&-3\\0&0\end{bmatrix}$, and the dot product of its rows with $x$ is $0$. It seems like orthogonality is only obvious when everything is viewed under the standard basis. – Yandle Aug 21 '19 at 05:03
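The worked example in the last two comments can be verified numerically; here is a small numpy sketch (same $A$ and $\beta$ as above, with $P$ the change-of-basis matrix whose columns are the $\beta$ vectors in standard coordinates):

```python
import numpy as np

# The 2x2 example from the comments: A = [[2, -1], [0, 0]].
A = np.array([[2.0, -1.0],
              [0.0,  0.0]])

# Coordinate vector [x]_beta = (1, 2) satisfies A [x]_beta = 0.
x_beta = np.array([1.0, 2.0])
assert np.allclose(A @ x_beta, 0)

# Change of basis beta -> std: columns of P are the beta vectors.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
x_std = P @ x_beta                      # = (3, 2) in the standard basis
assert np.allclose(x_std, [3.0, 2.0])

# Dot product of x_std with the nonzero row (2, -1) is NOT zero: 6 - 2 = 4.
assert not np.isclose(A[0] @ x_std, 0)

# But A P^{-1} = [[2, -3], [0, 0]] represents T with standard-basis
# domain coordinates, and its rows ARE orthogonal to x_std: 6 - 6 = 0.
B = A @ np.linalg.inv(P)
assert np.allclose(B, [[2.0, -3.0], [0.0, 0.0]])
assert np.isclose(B[0] @ x_std, 0)
```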

0 Answers