n$. Clearly the l-rank is less than or equal to the d-rank, which by theorem 1 implies that the d-rank is at least $r=\mathbf{rank}(A_\bullet)$. In order to proceed we need another definition.

**Definition 3: [pextension]** Any symmetric matrix of order $p$ that has the $r\times r$ matrix $A$ as its leading principal submatrix is a *p-extension* of $A$.

**Theorem 2: [drank]** The d-rank of $(A_1,\cdots,A_m)$ is less than or equal to $p$ if and only if there exist pairwise commuting positive semi-definite p-extensions $C_j$ of $B_j=\Lambda^{-1}K'A_jK\Lambda^{-1}$, $j=1,\cdots,m$, that add up to the identity.

**Proof:** Equation $\eqref{E:wform}$ in the proof of theorem 1 shows there must exist $U_j$ and $V_j$ such that $$ \begin{bmatrix}\Lambda^{-1}K'A_jK\Lambda^{-1}&U_j\\U_j'&V_j\end{bmatrix}=\begin{bmatrix}L'\\L_\perp'\end{bmatrix}W_j\begin{bmatrix}L&L_\perp\end{bmatrix}. $$ But this says precisely that there are pairwise commuting positive semi-definite p-extensions of the $B_j$ that add up to the identity. **QED**

It seems that in general for $p>n$ and $m>2$ commuting p-extensions are difficult to work with. But in some cases theorem 2 simplifies.

**Corollary 1: [small_d_rank]** The d-rank of $(A_1,\cdots,A_m)$ is $r=\mathbf{rank}(A_\bullet)$ if and only if $A_jA_\bullet^+A_\ell=A_\ell A_\bullet^+A_j$ for all $j,\ell$.

**Proof:** If $p=r$ the matrices $B_j=\Lambda^{-1}K'A_jK\Lambda^{-1}$ must commute in pairs without any p-extension. Because $A_\bullet^+=K\Lambda^{-2}K'$ we have $B_jB_\ell=\Lambda^{-1}K'A_jA_\bullet^+A_\ell K\Lambda^{-1}$, and thus the $B_j$ commute in pairs if and only if the condition in the corollary holds. **QED**

Finally, we also have as a corollary a version of the basic result of @deleeuw_A_82b.

**Corollary 2: [m=2]** The d-rank of $(A_1,A_2)$ is $\mathbf{rank}(A_1+A_2)$.

**Proof:** Because $\Lambda^{-1}K'A_1K\Lambda^{-1}+\Lambda^{-1}K'A_2K\Lambda^{-1}=I$ we see from lemma 4 that $\Lambda^{-1}K'A_1K\Lambda^{-1}$ and $\Lambda^{-1}K'A_2K\Lambda^{-1}$ commute. **QED**

#Appendix: Some Lemmas

**Lemma 1: [diagonal]** If $A$ is positive semi-definite and $a_{ii}=0$ then $a_{ij}=a_{ji}=0$ for all $j$.
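As a numerical aside (not part of the development), corollary 2 predicts that for any two positive semi-definite matrices the commuting condition of corollary 1 holds automatically with $A_\bullet=A_1+A_2$. A minimal sketch, assuming `numpy` is available:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# two random positive semi-definite matrices of rank 2
G1 = rng.standard_normal((n, 2))
G2 = rng.standard_normal((n, 2))
A1, A2 = G1 @ G1.T, G2 @ G2.T

# A_bullet and its Moore-Penrose inverse
Ab = A1 + A2
Abp = np.linalg.pinv(Ab)

# the condition of corollary 1 holds automatically for m = 2
print(np.allclose(A1 @ Abp @ A2, A2 @ Abp @ A1))  # True
```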
**Proof:** Suppose $$ A=\begin{bmatrix}0&r'\\r&S\end{bmatrix} $$ is positive semi-definite. Define $z'=\begin{bmatrix}1&-\epsilon r'\end{bmatrix}$ with $\epsilon>0$. Then $z'Az=-2\epsilon r'r+\epsilon^2r'Sr$. If $r'Sr=0$ and $r'r>0$ then $z'Az<0$ for all $\epsilon>0$, which contradicts that $A$ is positive semi-definite. If $r'Sr>0$ and $r'r>0$ then $$ \min_{\epsilon>0}z'Az=-\frac{(r'r)^2}{r'Sr}<0, $$ which again contradicts that $A$ is positive semi-definite. Thus $r'r=0$, i.e. $r=0$. **QED**

**Lemma 2: [crossprod]** Suppose the positive semi-definite matrix $A$ of order $n$ has eigenvalue decomposition $$ A=\begin{bmatrix}K&K_\perp\end{bmatrix}\begin{bmatrix}\Lambda^2&0\\0&0\end{bmatrix}\begin{bmatrix}K'\\K_\perp'\end{bmatrix}, $$ with $\Lambda^2$ a positive definite diagonal matrix of order $r=\mathbf{rank}(A)$. The equation $A=XX'$, with $X$ an $n\times p$ matrix, has a solution if and only if $p\geq r$. All solutions are of the form $X=K\Lambda L'$, with $L$ a $p\times r$ orthonormal matrix.

**Proof:** Write $X$ as $$ X=\begin{bmatrix}K&K_\perp\end{bmatrix}\begin{bmatrix}U\\V\end{bmatrix}, $$ which gives $$ XX'=\begin{bmatrix}K&K_\perp\end{bmatrix}\begin{bmatrix}UU'&UV'\\VU'&VV'\end{bmatrix}\begin{bmatrix}K'\\K_\perp'\end{bmatrix}. $$ Thus $XX'=A$ if and only if $V=0$ and $UU'=\Lambda^2$. It follows that $X=K\Lambda L'$, with $L$ a $p\times r$ orthonormal matrix. Also $\mathbf{rank}(X)=\mathbf{rank}(A)=r$, and thus $p\geq r$. **QED**

**Lemma 3: [simultaneous]** Suppose $(A_1,\cdots,A_m)$ is a sequence of real symmetric matrices of order $n$. Then there exist a square orthonormal $X$ and diagonal $W_j$ such that $A_j=XW_jX'$ if and only if the $A_j$ commute in pairs, i.e. if and only if $A_jA_\ell=A_\ell A_j$ for all $j\not=\ell$.

**Proof:** It is obvious that simultaneous diagonalizability implies that the $A_j$ commute in pairs. The interesting part of the proof is to show that pairwise commuting implies simultaneous diagonalizability.
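Lemma 2 is easy to check numerically. The sketch below (assuming `numpy` is available; the variable names are ad hoc) factors a rank-$r$ positive semi-definite $A$ as $XX'$ with $X=K\Lambda L'$ for an arbitrary column-orthonormal $p\times r$ matrix $L$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 5, 4, 3

# a positive semi-definite matrix of rank r
G = rng.standard_normal((n, r))
A = G @ G.T

# eigenvalue decomposition; keep the r positive eigenvalues
w, V = np.linalg.eigh(A)
keep = w > 1e-10
K, lam = V[:, keep], np.sqrt(w[keep])  # K is n x r, lam holds Lambda

# any p x r matrix L with orthonormal columns gives a solution
L, _ = np.linalg.qr(rng.standard_normal((p, r)))
X = K @ np.diag(lam) @ L.T

print(np.allclose(X @ X.T, A))  # True
```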
The standard proof, repeated most recently in @jiang_li_16, uses induction, starting from the fact that the lemma is trivially true for $m=1$. We give the proof in our notation and make it perhaps a bit more explicit and computational. So suppose the real symmetric matrices $(A_1,\cdots,A_m)$ commute in pairs, and suppose $A_m$ has eigenvalue decomposition $$ A_m=\begin{bmatrix}K_1&K_2&\cdots&K_r\end{bmatrix}\begin{bmatrix}\lambda_1I&0&\cdots&0\\0&\lambda_2I&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\lambda_rI\end{bmatrix} \begin{bmatrix}K_1'\\K_2'\\\vdots\\K_r'\end{bmatrix}, $$ with all $\lambda_s$ different. Set $K:=\begin{bmatrix}K_1&K_2&\cdots&K_r\end{bmatrix}$. Then for all $j=1,\cdots,m-1$ $$ A_mA_jK_s=A_jA_mK_s=\lambda_sA_jK_s, $$ and thus the columns of $A_jK_s$ are eigenvectors of $A_m$ with eigenvalue $\lambda_s$, i.e. $A_jK_s=K_s(K_s'A_jK_s)$. Write this as $$ K'A_jK=\begin{bmatrix}K_1'A_jK_1&0&\cdots&0\\0&K_2'A_jK_2&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&K_r'A_jK_r\end{bmatrix}. $$ Now obviously the matrices $K'A_jK$ commute in pairs, which implies that the $m-1$ matrices $(K_s'A_1K_s,\cdots,K_s'A_{m-1}K_s)$ commute in pairs for each $s$. By the induction hypothesis there are square orthonormal $L_s$ and diagonal $\Phi_{js}$ such that $K_s'A_jK_s=L_s\Phi_{js} L_s'$. Define the block-diagonal matrix $$ L:=\begin{bmatrix}L_1&0&\cdots&0\\0&L_2&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&L_r\end{bmatrix}. $$ Then $$ L'K'A_jKL=\begin{bmatrix}\Phi_{j1}&0&\cdots&0\\0&\Phi_{j2}&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\Phi_{jr}\end{bmatrix}, $$ for $j=1,\cdots,m-1$, while of course $$ L'K'A_mKL=\begin{bmatrix}\lambda_1I&0&\cdots&0\\0&\lambda_2I&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\lambda_rI\end{bmatrix}. $$ Thus $X:=KL$ is square orthonormal and diagonalizes all $m$ matrices simultaneously. **QED**

**Lemma 4: [commute]** If $A$ and $B$ are symmetric matrices with $A+B=I$ then $A$ and $B$ commute.

**Proof:** $AB=A(I-A)=A-A^2$, which is symmetric, and thus $AB=(AB)'=B'A'=BA$. Another way to see this is that $A$ and $B=I-A$ have the same eigenvectors $L$, and $L$ diagonalizes both matrices.
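As a numerical aside, lemmas 3 and 4 can be illustrated with a short sketch (assuming `numpy` is available; the construction is ours, not part of the proofs): matrices built on a common orthonormal eigenbasis commute in pairs, and the eigenvectors of a generic linear combination diagonalize all of them simultaneously.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# lemma 3, "only if": a common orthonormal eigenbasis X makes
# A1 and A2 commute in pairs
X, _ = np.linalg.qr(rng.standard_normal((n, n)))
A1 = X @ np.diag(rng.standard_normal(n)) @ X.T
A2 = X @ np.diag(rng.standard_normal(n)) @ X.T
print(np.allclose(A1 @ A2, A2 @ A1))  # True

# lemma 3, "if": eigenvectors of a generic linear combination
# of commuting matrices diagonalize both simultaneously
_, Q = np.linalg.eigh(A1 + np.pi * A2)
D1 = Q.T @ A1 @ Q
D2 = Q.T @ A2 @ Q
print(np.allclose(D1, np.diag(np.diag(D1))))  # True
print(np.allclose(D2, np.diag(np.diag(D2))))  # True

# lemma 4: A + B = I forces A and B to commute
B = np.eye(n) - A1
print(np.allclose(A1 @ B, B @ A1))  # True
```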
**QED**

#References