39

Suppose I have an $n\times n$ matrix $A$. Can I, using only pre- and post-multiplication by permutation matrices, realise every possible permutation of the elements of $A$? That is, there should be no binding conditions, like $a_{11}$ always remaining in the same column as $a_{n1}$, etc.

This seems to be intuitively obvious. What I think is that I can write the matrix as an $n^2$-dimensional vector, then I can permute all entries by multiplying by a suitable permutation matrix, and then re-form a matrix with the permuted vector.
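
A rough numpy sketch of the construction I have in mind (the function name is mine, purely for illustration):

```python
import numpy as np

def permute_entries(A, sigma):
    """Permute the entries of A by flattening, applying an n^2 x n^2
    permutation matrix, and re-forming an n x n matrix."""
    n = A.shape[0]
    v = A.reshape(-1)                # write A as an n^2-dimensional vector
    P = np.eye(n * n)[list(sigma)]   # n^2 x n^2 permutation matrix
    return (P @ v).reshape(n, n)     # re-form a matrix from the permuted vector

A = np.array([[1, 2], [3, 4]])
print(permute_entries(A, [1, 0, 2, 3]))  # swaps the first two entries
```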

Landon Carter
  • You can pre- and post-multiply by matrices formed by permuting the rows of the identity matrix. Post-multiplying permutes the columns of the original matrix and pre-multiplying permutes the rows. – Jack May 10 '18 at 14:28
  • I know, those are the permutation matrices. I am just saying that given $A$ and $B$, where $B$ is just a permutation of $A$, can I reach from $A$ to $B$ in finitely many steps? I believe the answer is yes, which is what I wrote. – Landon Carter May 10 '18 at 14:30
  • 6
    Can you get from $\pmatrix{1&2\\3&4}$ to $\pmatrix{2&1\\3&4}$? – Angina Seng May 10 '18 at 14:32
  • Yes, by the method I mentioned. – Landon Carter May 10 '18 at 14:34
  • 1
    @LandonCarter So what are the permutation matrices $P$ and $Q$ with $P\pmatrix{1&2\\3&4}Q=\pmatrix{2&1\\3&4}$? – Angina Seng May 10 '18 at 14:37
  • In the example of @LordSharktheUnknown, permuting rows and/or columns always leaves $1$ and $3$ in the same column. How do you manage to achieve the second matrix via this process? – qualcuno May 10 '18 at 14:38
  • 1
    @LordSharktheUnknown Only I don't know if the method I mentioned is equivalent to the row and column interchanges. – Landon Carter May 10 '18 at 14:38
  • For one thing, you are seeking to achieve $(n^2)!$ permutations using a group with only $n!^2$ elements (direct product of $S_n$ with itself, or more precisely with its opposite group). This is clearly asking too much. (Oh, I now see that there is an answer saying basically the same.) – Marc van Leeuwen May 11 '18 at 13:00
  • I am amazed by the popularity this question received! – Landon Carter May 12 '18 at 13:48
  • Closely related, focusing on transpositions of matrix entries: Permute any two entries in a n×n matrix. – hardmath May 30 '18 at 21:40

4 Answers

74

It is not generally possible to do so.

For a concrete example, note that $\pmatrix{2&1\\2&1}$ is a rearrangement of the entries of $\pmatrix{1&2\\2&1}$, yet there can exist no permutation matrices $P,Q$ such that $$ P\pmatrix{1&2\\2&1}Q = \pmatrix{2&1\\2&1}. $$ If such a $P$ and $Q$ existed, then both matrices would necessarily have the same rank; but the matrix on the left has rank $2$, while the matrix on the right has rank $1$.
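
If a quick sanity check helps, here is a small numpy sketch (not part of the argument) that confirms the rank gap and, by brute force, that no pair $P,Q$ works:

```python
import numpy as np
from itertools import permutations

A = np.array([[1, 2], [2, 1]])
B = np.array([[2, 1], [2, 1]])   # same multiset of entries as A
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))    # 2 1

# all 2x2 permutation matrices
perm_mats = [np.eye(2, dtype=int)[list(p)] for p in permutations(range(2))]
print(any(np.array_equal(P @ A @ Q, B)
          for P in perm_mats for Q in perm_mats))             # False
```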

Ben Grossmann
44

Let me add one more argument:

For $n \ge 2$:

Suppose the entries in the $n \times n$ matrix $A$ are all distinct. Then there are $(n^2)!$ distinct permutations of $A$.

There are $n!$ row-permutations of $A$ (generated by premultiplication by various permutation matrices), and $n!$ col-permutations of $A$ (generated by post-multiplication by permutation matrices). If we consider all expressions of the form $$ RAC $$ where $R$ and $C$ each range independently over all $n!$ permutation matrices, we get at most $(n!)^2$ possible results. But for $n > 1$, we have \begin{align} (n!)^2 &= [ n \cdot (n-1) \cdots 2 \cdot 1 ] [ n \cdot (n-1) \cdots 2 \cdot 1 ] \\ &< [ 2n \cdot (2n-1) \cdots (n+2) \cdot (n+1) ] [ n \cdot (n-1) \cdots 2 \cdot 1 ] \\ &= (2n)! \\ &\le (n^2)! \end{align} because $2n \le n^2$ for $n \ge 2$, and factorial is an increasing function on the positive integers. So the number of possible results of applying row- and col-permutations to $A$ is smaller than the number of possible permutations of the elements of $A$. Hence there's some permutation of $A$ that does not appear in our list of all $RAC$ matrices.
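
To see the gap concretely, here is a small numpy sketch (purely illustrative) that enumerates every $RAC$ for a $3\times 3$ matrix with distinct entries and compares the counts:

```python
import numpy as np
from itertools import permutations
from math import factorial

n = 3
A = np.arange(n * n).reshape(n, n)                  # distinct entries 0..8
perm_mats = [np.eye(n, dtype=int)[list(p)] for p in permutations(range(n))]

reachable = {tuple((R @ A @ C).ravel())
             for R in perm_mats for C in perm_mats}
print(len(reachable), factorial(n) ** 2, factorial(n * n))
# 36 36 362880 -- far fewer reachable matrices than permutations of the entries
```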

BTW, just to close this out: for $1 \times 1$ matrices, the answer is "yes, all permutations can in fact be realized by row and column permutations." I suspect you knew that. :)

PS: Following the comment by @Jack M, I want to make clear why it's OK to consider only things of the form $RAC$. Why do we do the column permutations first, and then the rows? (Or vice-versa, if you read things the other way.) What if interleaving row and column permutations does something funky? The answer is that if you do a bunch of row ops interleaved with a bunch of column ops, you get the same thing as if you do all the row ops first, and then all the column ops afterwards (although the row ops have to come in the same order in this rearrangement, and similarly for the column ops). That requires a little proof, but nothing hairy: row ops multiply on the left and column ops on the right, so associativity of matrix multiplication lets every row op slide past every column op.

What if we do more than one row-permutation, say, two of them? Don't we have to look at $R_1 R_2 A C$ instead? Answer: $R_1 R_2$ will again be a permutation matrix, so we can really consider this as being $(R_1 R_2) A C$, i.e. the matrix I've called $R$ is the product of any number of permutation matrices. And it will always be a matrix with one $1$ in each row and each column. So my counting of possible row-permutations is still valid.
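
Here's a quick numerical illustration of both points (permutation matrices chosen arbitrarily, just a sanity check): interleaving row and column ops agrees with collecting them, and the collected row factor is itself a permutation matrix.

```python
import numpy as np

n = 3
A = np.arange(n * n).reshape(n, n)
R1, R2 = np.eye(n, dtype=int)[[1, 0, 2]], np.eye(n, dtype=int)[[2, 1, 0]]
C1, C2 = np.eye(n, dtype=int)[:, [0, 2, 1]], np.eye(n, dtype=int)[:, [1, 0, 2]]

interleaved = R2 @ (R1 @ A @ C1) @ C2         # row op, column op, row op, column op
collected = (R2 @ R1) @ A @ (C1 @ C2)         # all row ops first, then all column ops
print(np.array_equal(interleaved, collected))                # True
print((R2 @ R1).sum(axis=0), (R2 @ R1).sum(axis=1))          # one 1 per row and column
```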

John Hughes
  • I like this argument better than the counterexample, because it feels like a better general explanation to me. Talking about the rank of a matrix isn't too weird, but this sort of reasoning is much more direct. Also, I like how this argument clearly fails for scalars, although I suppose that's not so remarkable. – theREALyumdub May 10 '18 at 23:19
  • 2
    I like it too, but I'm glad the rank argument is there as well, because it's a good skill to learn: use rank arguments when possible. :) – John Hughes May 11 '18 at 01:21
  • Thank you for this wonderful answer. I was so blown away by the rank argument that I accepted it, should have waited before other answers appeared. My sincere apologies. – Landon Carter May 11 '18 at 11:05
  • 1
    A pleasure. As I noted, I actually prefer the rank argument, because it's more generally useful than this one. I just wrote mine down because it occurred to me out of nowhere, and I'm generally so unlikely to think of combinatorial arguments that I found it kind of fun. – John Hughes May 11 '18 at 11:09
  • 3
    From a group theory perspective, this works because the subgroup of row permutations and the subgroup of column permutations, seen as subgroups of $S_{n^2}$, are in direct product. – Jack M May 11 '18 at 17:55
31

Given two elements $a_1$ and $a_2$, the properties "$a_1$ and $a_2$ are on different rows" and "$a_1$ and $a_2$ are on different columns" are preserved by any row or column permutation. Proof:

A column permutation won't affect what row anything is on. A row permutation has to send an entire row to the same row, so if they start on the same row, they end on the same row. Permutations are invertible, so if they can't take two elements on the same row to different rows, they can't take elements on different rows to the same row.

An analogous argument holds for being on the same or different columns.

Thus, a row and column permutation is completely characterized by what it does to a diagonal; to find out where it sends an arbitrary element, just take the row that its row was sent to, and the column its column was sent to.
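
A small numpy illustration of this (permutations chosen arbitrarily): every entry of $RAC$ is $A$ reindexed by one permutation of the rows and one permutation of the columns, independently.

```python
import numpy as np

n = 3
A = np.arange(n * n).reshape(n, n)
sigma, tau = [2, 0, 1], [1, 2, 0]          # a row permutation and a column permutation
R = np.eye(n, dtype=int)[sigma]            # permutes the rows of whatever it multiplies
C = np.eye(n, dtype=int)[:, tau]           # permutes the columns

# (R @ A @ C)[i, j] == A[sigma[i], tau[j]]: the new row index depends only on the
# old row index, and the new column index only on the old column index.
assert np.array_equal(R @ A @ C, A[np.ix_(sigma, tau)])
print(R @ A @ C)
```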

Acccumulation
  • 1
    +1 this is by far the most intuitive response to me, I suppose it only has the least votes because the OP holds this to be an equivalent problem. But this could be intuitive for others. – theREALyumdub May 10 '18 at 23:18
  • @theREALyumdub: It had the least votes probably for the simple fact that it was the last answer posted, with enough time in between for the other answers to gather votes. Note that at the time I'm writing this comment, it already has almost twice as many votes as the currently least-voted answer (which was posted almost three hours earlier). – celtschk May 12 '18 at 08:46
  • @celtschk An astute observation about how internet "traffic" works, but I was very much less concerned about the votes as I was the intuition here. The pigeonhole principle and rank arguments are great and much more natural to linear algebra, but my education of determinants in early undergraduate years forced me to consider the hypothesis of this proof: that multiplying matrices by permutation matrices keeps their rows and columns distinct. I don't know how to get the rank theorem without that first (although there are probably plenty of ways). – theREALyumdub May 14 '18 at 00:41
  • 1
    @theREALyumdub: Had I had issues with the first part of the first sentence, or with the last sentence of your original comment, I would have expressed it. Actually I agree, and I upvoted just this answer (already before I read your comment). I just took issue of you apparently reading too much into the number of votes at that point in time. – celtschk May 14 '18 at 04:07
15

Some users of MSE are very sensitive to the word "obvious", but I believe it is blatantly obvious that the answer to your question is "no" in general. The reason is simple: by left (right) multiplying $A$ by a permutation matrix, you are permuting each row (column) of $A$ as a whole. Therefore, entries on the same row (column) of $A$ will still align in a row (column). You cannot break row or column alignment by applying left and/or right permutations to $A$.

From another perspective, if you vectorise $PAQ$, it becomes $\operatorname{vec}(PAQ)=(Q^T\otimes P)\operatorname{vec}(A)$. While the Kronecker product $Q^T\otimes P$ is an $n^2\times n^2$ permutation matrix, it is clear that not all $n^2\times n^2$ permutation matrices are decomposable tensors.
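
A quick numerical check of that identity (column-stacking vectorisation, arbitrary $P$ and $Q$), just as a sanity check:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
A = rng.integers(0, 10, (n, n))
P = np.eye(n, dtype=int)[[1, 2, 0]]           # arbitrary permutation matrices
Q = np.eye(n, dtype=int)[[2, 0, 1]]

vec = lambda M: M.reshape(-1, order='F')      # column-stacking vectorisation
print(np.array_equal(vec(P @ A @ Q), np.kron(Q.T, P) @ vec(A)))   # True
```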

user1551
  • 7
    I very much hope that the second paragraph was not meant to also fall under the category of "obvious". :) – John Hughes May 10 '18 at 17:20
  • @JohnHughes It isn't as obvious as the first paragraph, but it's still quite straightforward: if you partition $Q^T\otimes P$ into sub-blocks, each of size $n\times n$, then each sub-block is either a permutation matrix or zero. It shouldn't be hard to exhibit a permutation matrix that is not of this form. – user1551 May 10 '18 at 17:55
  • 13
    It is conceivable that someone just learning about linear algebra and matrices might ask this question without ever having seen tensor products, and without knowing about decomposable tensors. And in that case, you're saying "It's obvious if you know something that you don't know," which is often the case, but seldom helpful. – John Hughes May 10 '18 at 19:04
  • 5
    Here's what I think is an improved version of your first paragraph: "No. By left multiplying $A$ by a permutation matrix, you are permuting the rows: any two entries that share a row before the permutation will share a (possibly different) row after. The analogous statement applies to right multiplication and columns. Thus you cannot effect the permutation $\pmatrix{1 & 2 \\ 3 & 4} \to \pmatrix{3 & 2 \\ 1 & 4}$ with just row and column permutations." I believe the lack of adjectives/adverbs describing anything other than mathematical entities represents an improvement. Others may disagree. – John Hughes May 10 '18 at 19:12
  • @JohnHughes To your point, I have a BS in pure mathematics including a semester of basic linear algebra and another in advanced linear algebra, and we were not taught tensors at all. So it is possible to have questions like this one without having seen tensor products. – Todd Wilcox May 13 '18 at 04:38
  • @JohnHughes, You are putting words into my mouth. I gave an elementary argument first, which I called "obvious", followed by an alternative perspective, which might be useful when one gets enough mathematical maturity. But you framed my words as saying "It's obvious if you know something that you don't know". I like your answer, by the way, and I upvoted it before you even commented to my answer. I just don't understand why you wanted to distort my words to such extent. – user1551 May 13 '18 at 08:40
  • I was unclear: I found your second part not obvious; it used ideas almost certainly unfamiliar to OP. The scope of "obvious" in your answer wasn't clear. I (separately) found the first part unpleasant because of its emphasis on "obvious" and "simple" rather than mathematical facts, which I felt reduced the chances that OP would get value from it, and suggested a rewrite. I apologize for not separating my comments more clearly. I found the ideas of both your answers valuable, and appreciated that the second attempted to address the second part of OP's question, which others did not do. – John Hughes May 13 '18 at 09:56