
Question

Let $X$ be a finite set and let $\mathbf{F}$ be a field. Let $\mathcal{A}$ be a subset of the power set of $X$, i.e. $\mathcal{A}\subseteq\mathcal{P}(X)$, and let $1_A:X\to \mathbf{F}$ denote the indicator function of $A\subseteq X$. When is $\{1_A:A\in\mathcal{A}\}$ linearly independent in the vector space $\mathbf{F}^X$?

Potential difficulty

Unfortunately, this seems to depend on the characteristic of $\mathbf{F}$. For example, if $X=\{a,b,c\}$ and $\mathcal{A}$ consists of all two-element subsets of $X$, then $\{1_A:A\in\mathcal{A}\}$ is dependent when $\mathrm{char}(\mathbf{F})=2$, but independent otherwise.

Partial answer

All I can say so far is that $\mathcal{A}$ cannot contain an inclusion-exclusion family, by which I mean a collection of the form $$\{A,B,A\cap B, A\cup B\}\text{ or}$$ $$\{A,B,C,A\cap B, A\cap C, B\cap C, A\cap B\cap C, A\cup B \cup C\}\text{ and so on},$$ since inclusion-exclusion (for instance $1_{A\cup B}=1_A+1_B-1_{A\cap B}$) then gives a linear dependence over any field.
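As a quick illustrative check (the ground set and the sets below are made-up examples, not from the question), one can verify the two-set dependence pointwise in Python:

```python
# Sanity check of the inclusion-exclusion dependence
#   1_{A ∪ B} = 1_A + 1_B - 1_{A ∩ B},
# which makes {1_A, 1_B, 1_{A∩B}, 1_{A∪B}} linearly dependent over any field.
# X, A, B are arbitrary made-up sets for illustration.

X = {1, 2, 3, 4, 5}
A = {1, 2, 3}
B = {3, 4}

def indicator(S, X):
    """Indicator function of S, represented as a dict x -> 0/1 on the ground set X."""
    return {x: 1 if x in S else 0 for x in X}

lhs = indicator(A | B, X)
rhs = {x: indicator(A, X)[x] + indicator(B, X)[x] - indicator(A & B, X)[x] for x in X}
assert lhs == rhs  # the dependence holds pointwise, hence as functions X -> F
```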

Eran
  • 2,681

2 Answers


First, notice that the indicator functions of the singleton sets always form a basis for the space (e.g. for $X=\{a,b,c\}$, the functions $1_{\{a\}},1_{\{b\}},1_{\{c\}}$ form a basis of $F^X$, regardless of $F$). Now write $X = \{x_1,\dots,x_n\}$.

We can then think of a function $f \in F^X$ as the column vector $\begin{bmatrix} f(x_1)\\ \vdots \\ f(x_n) \end{bmatrix}$. Under this identification, if $e_i$ is the $i^{\text{th}}$ standard basis vector of $F^n$, then $e_i$ corresponds to $1_{\{x_i\}}$.

Using this basis, a set of indicator functions is linearly independent exactly when the corresponding column vectors are linearly independent.

So, we can put these columns into a matrix and reduce it to test for independence. Taking your example, the corresponding column vectors are $ \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$ and $ \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}$. We can reduce the resulting matrix working only over the integers (taking care never to multiply a row by a non-unit constant). In this case we get $\begin{bmatrix} 1 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix}$, which integer row and column operations reduce to $\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{bmatrix}$. In characteristic $2$ the last diagonal entry becomes $0$, so the vectors are not linearly independent there, but in every other characteristic they are.
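For what it's worth, here is a small Python sketch of this test on the example from the question (the helper names `indicator_matrix` and `rank_mod_p` are my own): build the $0,1$ matrix whose columns are the indicator vectors and Gaussian-eliminate it modulo $p$.

```python
# Sketch: test linear independence of indicator functions over F_p by
# Gaussian elimination modulo p on the 0,1 matrix of indicator columns.

def indicator_matrix(X, family):
    """Rows indexed by the elements of X, one 0/1 column per set in the family."""
    X = sorted(X)
    return [[1 if x in A else 0 for A in family] for x in X]

def rank_mod_p(M, p):
    """Rank of an integer matrix over F_p (naive Gaussian elimination, p prime)."""
    M = [[a % p for a in row] for row in M]
    rows, cols = len(M), len(M[0])
    rank, pivot_row = 0, 0
    for c in range(cols):
        pivot = next((r for r in range(pivot_row, rows) if M[r][c] != 0), None)
        if pivot is None:
            continue
        M[pivot_row], M[pivot] = M[pivot], M[pivot_row]
        inv = pow(M[pivot_row][c], -1, p)          # pivot inverse mod p (Python 3.8+)
        M[pivot_row] = [(a * inv) % p for a in M[pivot_row]]
        for r in range(rows):
            if r != pivot_row and M[r][c] != 0:
                f = M[r][c]
                M[r] = [(a - f * b) % p for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        rank += 1
    return rank

X = {'a', 'b', 'c'}
family = [{'a', 'b'}, {'a', 'c'}, {'b', 'c'}]
M = indicator_matrix(X, family)
print(rank_mod_p(M, 2))  # 2: dependent in characteristic 2
print(rank_mod_p(M, 3))  # 3: independent in characteristic 3
```

The family is independent over $F$ exactly when the rank over $F$ equals the number of sets.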

  • I realize my vector space is isomorphic to $\mathbf{F}^n$ where $\vert X\vert =n$. I'm more looking for conditions on $\mathcal{A}$ that ensure independence. – Eran Apr 21 '18 at 21:33
  • I see. The idea is that you can form a matrix for $\mathcal{A}$ which will tell you everything you are looking for, but based on what you said, this is also something you would have come up with. I don't know of any clever conditions purely on the relations of the sets in $\mathcal{A}$, except that they correspond to the relations given by working with $F^n$ and with matrices of all $0$s and $1$s. – Jonathan Dunay Apr 21 '18 at 23:13

The answer to this question is surprisingly interesting, and does in fact depend delicately on the characteristic! As in Jonathan's answer, consider the matrix whose columns are the values of the indicator functions $1_A$. This gives a $0,1$-matrix $M$, and we want to know when it has linearly independent columns, or equivalently full rank; for an $m \times n$ matrix this means rank $\text{min}(m, n)$. Note that any $0, 1$-matrix can be interpreted as a matrix whose columns (or rows) are indicator functions of some subsets of some set, so the question is just equivalent to asking when a $0, 1$-matrix has full rank.

Say that a maximal minor of an $n \times m$ matrix is a $\text{min}(n, m) \times \text{min}(n, m)$ minor.

Proposition: A matrix $M$ has full rank iff at least one of its maximal minors does not vanish.

For a proof see here. Because $M$ is a $0, 1$-matrix, its maximal minors are all integers, so we conclude:

Corollary: A $0, 1$-matrix $M$ (or more generally an integer matrix) has full rank over a field $F$ iff at least one of its maximal minors is not divisible by the characteristic $p$ of $F$ (in characteristic $0$, read this as: at least one maximal minor is nonzero). In particular, if it has full rank over some field of positive characteristic, it has full rank over every field of characteristic zero.

Since we're restricting to a square submatrix anyway, we might as well assume WLOG that $M$ is square. Then the criterion simplifies to: $\det M \neq 0$ in $F$, meaning $\det M \neq 0$ in characteristic $0$ and $p \nmid \det M$ in characteristic $p$.
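Here is a hedged sketch of the square case in Python (the helpers `det_int` and `bad_primes` are my own; the matrix is the triangle example from the question): compute the integer determinant exactly and factor it to read off which characteristics cause a dependence.

```python
# Sketch: for a square 0,1-matrix, the "bad" characteristics are exactly the
# primes dividing the integer determinant; det = 0 means dependent over every field.

from fractions import Fraction

def det_int(M):
    """Exact determinant of a square integer matrix via fraction-based elimination."""
    M = [[Fraction(a) for a in row] for row in M]
    n, det = len(M), Fraction(1)
    for c in range(n):
        pivot = next((r for r in range(c, n) if M[r][c] != 0), None)
        if pivot is None:
            return 0
        if pivot != c:
            M[c], M[pivot] = M[pivot], M[c]
            det = -det
        det *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return int(det)

def bad_primes(d):
    """Primes dividing a nonzero integer d (trial division)."""
    d, primes, p = abs(d), [], 2
    while p * p <= d:
        if d % p == 0:
            primes.append(p)
            while d % p == 0:
                d //= p
        p += 1
    if d > 1:
        primes.append(d)
    return primes

M = [[1, 1, 0], [1, 0, 1], [0, 1, 1]]   # indicator columns of {a,b}, {a,c}, {b,c}
d = det_int(M)
print(d, bad_primes(d))  # -2 [2]: dependent exactly in characteristic 2
```

Note that `bad_primes` is only meaningful when the determinant is nonzero; $\det M = 0$ means the columns are dependent over every field.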

It's a fun exercise to check that the determinant of a $(0, 1)$-matrix can be any integer; this means that, for an arbitrary finite set $S$ of primes, we can construct examples of subsets whose indicator functions are linearly independent except in the characteristics $p \in S$. So the answer can be complicated!

And even if we fix the field $F$ the answer can still be complicated, e.g. when $p = 2$ we are just asking when a matrix in $M_n(\mathbb{F}_2)$ is invertible, and there isn't an answer much easier than saying its determinant is equal to $1$; there can't be a really easy criterion since the order of $GL_n(\mathbb{F}_2)$ is known and somewhat complicated.
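For a sense of scale, here is a tiny sketch computing that order from the standard count $|GL_n(\mathbb{F}_q)| = \prod_{i=0}^{n-1}(q^n - q^i)$ (each successive column must avoid the span of the previous ones); the helper name is mine.

```python
# Order of GL_n(F_q): the (i+1)-st column must avoid the span of the previous i
# columns, giving q^n - q^i choices for it.

def gl_order(n, q):
    order = 1
    for i in range(n):
        order *= q**n - q**i
    return order

for n in range(1, 5):
    total = 2 ** (n * n)                              # all n x n matrices over F_2
    print(n, gl_order(n, 2), gl_order(n, 2) / total)  # order and invertible fraction
```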

Qiaochu Yuan
  • 468,795