I have a matrix $\mathbf{X} = \mathbf{A} \odot \mathbf{B}$, where $\mathbf{X}, \mathbf{A}, \mathbf{B} \in \mathbb{R}^{M \times N}$ and $\odot$ denotes elementwise multiplication (the Hadamard product).
I know that $\mathbf{A}$ and $\mathbf{B}$ can each be written as a function of some other matrix $\mathbf{Y} \in \mathbb{R}^{I \times J}$, and I am interested in the Jacobian of $\mathbf{X}$ w.r.t. $\mathbf{Y}$, i.e., I want to solve for $\frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{Y})} \in \mathbb{R}^{MN \times IJ}$ using the Jacobians w.r.t. $\mathbf{A}$ and $\mathbf{B}$.
According to this stackexchange post discussing the Hadamard product, user91684's answer implies that the Hadamard product $\mathbf{A} \odot \mathbf{B}$ is bilinear and that its differential satisfies
\begin{align*} \mathrm{d}(\mathbf{A} \odot \mathbf{B})= \mathrm{d}\mathbf{A} \odot \mathbf{B} + \mathbf{A} \odot \mathrm{d}\mathbf{B} \end{align*}
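Entrywise this is just the product rule: since $x_{mn} = a_{mn} b_{mn}$, each entry satisfies
\begin{align*} \mathrm{d}x_{mn} = b_{mn} \, \mathrm{d}a_{mn} + a_{mn} \, \mathrm{d}b_{mn} \end{align*}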
Combining this with the chain rule through $\mathbf{A}(\mathbf{Y})$ and $\mathbf{B}(\mathbf{Y})$, I assume the Jacobian decomposes as
\begin{align*} \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{Y})} = \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{A})} \frac{\partial \text{vec}(\mathbf{A})}{\partial \text{vec}(\mathbf{Y})} + \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{B})} \frac{\partial \text{vec}(\mathbf{B})}{\partial \text{vec}(\mathbf{Y})} \end{align*}
and I believe that $\frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{A})}, \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{B})} \in \mathbb{R}^{MN \times MN}$ are given as:
\begin{align*} \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{A})} = \text{diag} \bigg( \text{vec} \big( \mathbf{B}^\top \big) \bigg) \hspace{2.0cm} \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{B})} = \text{diag} \bigg( \text{vec} \big( \mathbf{A} \big) \bigg) \end{align*}
where $\text{diag} \big( \text{vec} ( \mathbf{A} ) \big) \in \mathbb{R}^{MN \times MN}$ denotes the diagonal matrix formed from the elements of $\text{vec}(\mathbf{A})$, the vectorization of $\mathbf{A} \in \mathbb{R}^{M \times N}$.
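Note that $\text{vec}(\mathbf{B})$ and $\text{vec}(\mathbf{B}^\top)$ order the entries of $\mathbf{B}$ differently. With the standard column-stacking convention and $M = N = 2$, for example,
\begin{align*} \text{vec}(\mathbf{B}) = \begin{pmatrix} b_{11} \\ b_{21} \\ b_{12} \\ b_{22} \end{pmatrix} \hspace{1.0cm} \text{vec}(\mathbf{B}^\top) = \begin{pmatrix} b_{11} \\ b_{12} \\ b_{21} \\ b_{22} \end{pmatrix} \end{align*}
so whether or not the transpose belongs changes which entry of $\mathbf{B}$ scales which entry of $\mathrm{d} \, \text{vec}(\mathbf{A})$.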
I don't know how to verify this (my best idea is a finite-difference check; see the sketch at the end of this post). I am basing this on similar derivations of Jacobians for ordinary matrix multiplication, but I don't know whether the same rules carry over to elementwise multiplication.
I suspect this could be wrong. Since the Hadamard product is commutative, i.e., $\mathbf{A} \odot \mathbf{B} = \mathbf{B} \odot \mathbf{A}$, we can equivalently write
\begin{align*} \mathrm{d}(\mathbf{A} \odot \mathbf{B}) = \mathbf{B} \odot \mathrm{d}\mathbf{A} + \mathbf{A} \odot \mathrm{d}\mathbf{B} \end{align*}
in which case $\frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{A})}, \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{B})} \in \mathbb{R}^{MN \times MN}$ could instead be given as:
\begin{align*} \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{A})} = \text{diag} \bigg( \text{vec} \big( \mathbf{B} \big) \bigg) \hspace{2.0cm} \frac{\partial \text{vec}(\mathbf{X})}{\partial \text{vec}(\mathbf{B})} = \text{diag} \bigg( \text{vec} \big( \mathbf{A} \big) \bigg) \end{align*}
I don't know whether the arguments of $\text{vec}$ need to be transposed or not. Something here feels off to me, and I was unable to find any resources on taking the Jacobian of an elementwise multiplication.
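To try to check which candidate is right, here is a minimal finite-difference sketch in Python/NumPy that I put together (my own attempt, not from any reference; the helpers `vec` and `numerical_jacobian` are names I made up, and I assume the standard column-stacking, i.e. Fortran-order, convention for $\text{vec}$). It compares the finite-difference Jacobian of $\text{vec}(\mathbf{A} \odot \mathbf{B})$ w.r.t. $\text{vec}(\mathbf{A})$, holding $\mathbf{B}$ fixed, against both candidates; the case for $\mathbf{B}$ is symmetric.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 3, 4  # M != N so that vec(B) and vec(B^T) order entries differently
A = rng.standard_normal((M, N))
B = rng.standard_normal((M, N))

def vec(Z):
    # Column-stacking vectorization (Fortran order), the usual vec operator.
    return Z.reshape(-1, order="F")

def numerical_jacobian(A, B, eps=1e-6):
    # Finite-difference Jacobian of vec(A * B) w.r.t. vec(A), with B held fixed.
    # In NumPy, A * B is the Hadamard (elementwise) product.
    MN = A.size
    J = np.zeros((MN, MN))
    base = vec(A * B)
    for k in range(MN):
        dA = np.zeros(MN)
        dA[k] = eps  # perturb the k-th entry of vec(A)
        A_pert = A + dA.reshape(A.shape, order="F")
        J[:, k] = (vec(A_pert * B) - base) / eps
    return J

J_num = numerical_jacobian(A, B)
J_transposed = np.diag(vec(B.T))  # candidate 1: diag(vec(B^T))
J_plain = np.diag(vec(B))         # candidate 2: diag(vec(B))

print("max |J_num - diag(vec(B^T))| =", np.abs(J_num - J_transposed).max())
print("max |J_num - diag(vec(B))|  =", np.abs(J_num - J_plain).max())
```

Whichever candidate's residual is on the order of the step size (or smaller) should be the correct one; with random entries and $M \neq N$, the two candidates cannot both match.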