n\). Clearly l-rank is less than or equal to d-rank, which implies by theorem 1 that d-rank is larger than or equal to \(r=\mathbf{rank}(A_\bullet)\). In order to proceed we need another definition. \textbf{Definition 3: {[}pextension{]}} Any symmetric matrix of order \(p\) which has the \(r\times r\) matrix \(A\) as its leading principal submatrix is a \emph{p-extension} of \(A\). \textbf{Theorem 2: {[}drank{]}} The d-rank of \((A_1,\cdots,A_m)\) is less than or equal to \(p\) if and only if there exist for each \(j=1,\cdots,m\) pairwise commuting positive semi-definite p-extensions \(C_j\) of \(B_j=\Lambda^{-1}K'A_jK\Lambda^{-1}\) that add up to the identity. \textbf{Proof:} Equation \eqref{E:wform} in the proof of theorem 1 shows there must exist \(U_j\) and \(V_j\) such that \[ \begin{bmatrix}\Lambda^{-1}K'A_jK\Lambda^{-1}&U_j\\U_j'&V_j\end{bmatrix}=\begin{bmatrix}L'\\L_\perp'\end{bmatrix}W_j\begin{bmatrix}L&L_\perp\end{bmatrix}. \] But this is the same as saying there must be p-extensions of the \(B_j\) that commute pairwise, are positive semi-definite, and add up to the identity. \textbf{QED} It seems that in general for \(p>n\) and \(m>2\) commuting p-extensions are difficult to work with. But in some cases theorem 2 simplifies. \textbf{Corollary 1: {[}small\_d\_rank{]}} The d-rank of \((A_1,\cdots,A_m)\) is \(r=\mathbf{rank}(A_\bullet)\) if and only if \(A_jA_\bullet^+A_\ell=A_\ell A_\bullet^+A_j\) for all \(j,\ell\). \textbf{Proof:} In this case the \(\Lambda^{-1}K'A_jK\Lambda^{-1}\) must commute without any p-extension, which translates to the condition in the corollary. \textbf{QED} Finally, we also have as a corollary a version of the basic result of De Leeuw (1982). \textbf{Corollary 2: {[}m=2{]}} The d-rank of \((A_1,A_2)\) is \(\mathbf{rank}(A_1+A_2)\). \textbf{Proof:} Because \(\Lambda^{-1}K'A_1K\Lambda^{-1}+\Lambda^{-1}K'A_2K\Lambda^{-1}=I\) we see from lemma 4 that \(\Lambda^{-1}K'A_1K\Lambda^{-1}\) and \(\Lambda^{-1}K'A_2K\Lambda^{-1}\) commute. 
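As a numerical aside, the commuting condition of corollary 1 is easy to check. Consistent with corollary 2, for \(m=2\) the condition holds for any pair of positive semi-definite matrices: \(A_1A_\bullet^+A_2\) is always symmetric. The following is a minimal sketch using numpy; the matrices are random and purely illustrative.

```python
import numpy as np

# Random positive semi-definite matrices of order 5 (illustrative only)
rng = np.random.default_rng(0)
G1 = rng.standard_normal((5, 3))
G2 = rng.standard_normal((5, 2))
A1 = G1 @ G1.T
A2 = G2 @ G2.T

# Moore-Penrose inverse of A_bullet = A1 + A2
P = np.linalg.pinv(A1 + A2)

# Corollary 1 condition for the pair (1, 2): A1 A.+ A2 = A2 A.+ A1,
# i.e. A1 @ P @ A2 is symmetric; by corollary 2 this holds automatically
C = A1 @ P @ A2
print(np.allclose(C, C.T))  # True
```

For \(m>2\) the same check of \(A_jA_\bullet^+A_\ell=A_\ell A_\bullet^+A_j\) over all pairs decides, by corollary 1, whether the d-rank equals \(r=\mathbf{rank}(A_\bullet)\).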
\textbf{QED} \section{Appendix: Some Lemmas}\label{appendix-some-lemmas} \textbf{Lemma 1: {[}diagonal{]}} If \(A\) is positive semi-definite and \(a_{ii}=0\) then \(a_{ij}=a_{ji}=0\) for all \(j\). \textbf{Proof:} Suppose, without loss of generality taking \(i=1\), that \[ A=\begin{bmatrix}0&r'\\r&S\end{bmatrix} \] is positive semi-definite. Define \(z'=\begin{bmatrix}1&-\epsilon r'\end{bmatrix}\) with \(\epsilon>0\). Then \(z'Az=-2\epsilon r'r+\epsilon^2r'Sr\). If \(r'Sr=0\) and \(r'r>0\) then \(z'Az<0\) for all \(\epsilon>0\), which contradicts that \(A\) is positive semi-definite. If \(r'Sr>0\) and \(r'r>0\) then, taking \(\epsilon=r'r/r'Sr\), \[ \min_{\epsilon>0}z'Az=-\frac{(r'r)^2}{r'Sr}<0, \] which again contradicts that \(A\) is positive semi-definite. Thus \(r'r=0\), i.e. \(r=0\). \textbf{QED} \textbf{Lemma 2: {[}crossprod{]}} Suppose the positive semi-definite matrix \(A\) of order \(n\) has eigenvalue decomposition \[ A=\begin{bmatrix}K&K_\perp\end{bmatrix}\begin{bmatrix}\Lambda^2&0\\0&0\end{bmatrix}\begin{bmatrix}K'\\K_\perp'\end{bmatrix}, \] with \(\Lambda^2\) a positive definite diagonal matrix of order \(r=\mathbf{rank}(A)\). The equation \(A=XX'\), with \(X\) an \(n\times p\) matrix, has a solution if and only if \(p\geq r\). All solutions are of the form \(X=K\Lambda L'\), with \(L\) a \(p\times r\) orthonormal matrix. \textbf{Proof:} Write \(X\) as \[ X=\begin{bmatrix}K&K_\perp\end{bmatrix}\begin{bmatrix}U\\V\end{bmatrix}, \] which gives \[ XX'=\begin{bmatrix}K&K_\perp\end{bmatrix}\begin{bmatrix}UU'&UV'\\VU'&VV'\end{bmatrix}\begin{bmatrix}K'\\K_\perp'\end{bmatrix}. \] Thus \(XX'=A\) if and only if \(V=0\) and \(UU'=\Lambda^2\), i.e. if and only if \(U=\Lambda L'\) with \(L\) a \(p\times r\) orthonormal matrix. It follows that \(X=K\Lambda L'\). Also \(\mathbf{rank}(X)=\mathbf{rank}(A)=r\), which requires \(p\geq r\). \textbf{QED} \textbf{Lemma 3: {[}simultaneous{]}} Suppose \((A_1,\cdots,A_m)\) is a sequence of real symmetric matrices of order \(n\). 
Then there exist a square orthonormal \(X\) and diagonal \(W_j\) such that \(A_j=XW_jX'\) if and only if the \(A_j\) commute in pairs, i.e.~if and only if \(A_jA_\ell=A_\ell A_j\) for all \(j\not=\ell\). \textbf{Proof:} It is obvious that simultaneous diagonalizability implies that the \(A_j\) commute in pairs. The interesting part of the proof is to show that pairwise commuting implies simultaneous diagonalizability. The standard proof, repeated most recently in Jiang and Li (2016), uses induction, starting from the fact that the lemma is trivially true for \(m=1\). We give the proof in our notation and make it perhaps a bit more explicit and computational. So let us suppose the real symmetric matrices \((A_1,\cdots,A_m)\) commute in pairs. And suppose \(A_m\) has eigenvalue decomposition \[ A_m=\begin{bmatrix}K_1&K_2&\cdots&K_r\end{bmatrix}\begin{bmatrix}\lambda_1I&0&\cdots&0\\0&\lambda_2I&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\lambda_rI\end{bmatrix} \begin{bmatrix}K_1'\\K_2'\\\vdots\\K_r'\end{bmatrix}, \] with all \(\lambda_s\) different. Set \(K:=\begin{bmatrix}K_1&K_2&\cdots&K_r\end{bmatrix}\). Then for all \(j=1,\cdots,m-1\) \[ A_mA_jK_s=A_jA_mK_s=\lambda_sA_jK_s, \] and thus the columns of \(A_jK_s\) lie in the eigenspace of \(A_m\) corresponding to \(\lambda_s\), i.e. \(A_jK_s=K_s(K_s'A_jK_s)\). Write this as \[ K'A_jK=\begin{bmatrix}K_1'A_jK_1&0&\cdots&0\\0&K_2'A_jK_2&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&K_r'A_jK_r\end{bmatrix}. \] Now obviously the matrices \(K'A_jK\) commute in pairs, which implies that the \(m-1\) matrices \((K_s'A_1K_s,\cdots,K_s'A_{m-1}K_s)\) commute in pairs for each \(s\). By the induction hypothesis there are square orthonormal \(L_s\) and diagonal \(\Phi_{js}\) such that \(K_s'A_jK_s=L_s\Phi_{js} L_s'\). Define \(L:=L_1\oplus L_2\oplus\cdots\oplus L_r\), the block diagonal matrix with the \(L_s\) as its diagonal blocks. 
Then \[ L'K'A_jKL=\begin{bmatrix}\Phi_{j1}&0&\cdots&0\\0&\Phi_{j2}&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\Phi_{jr}\end{bmatrix}, \] for \(j=1,\cdots,m-1\), while of course \[ L'K'A_mKL=\begin{bmatrix}\lambda_1I&0&\cdots&0\\0&\lambda_2I&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&\lambda_rI\end{bmatrix}. \] Thus \(X:=KL\) is square orthonormal and \(X'A_jX\) is diagonal for all \(j=1,\cdots,m\), which completes the induction. \textbf{QED} \textbf{Lemma 4: {[}commute{]}} If \(A\) and \(B\) are symmetric matrices with \(A+B=I\) then \(A\) and \(B\) commute. \textbf{Proof:} \(AB=A(I-A)=A-A^2\), which is symmetric, and therefore \(AB=(AB)'=B'A'=BA\). Another way to see this is that \(A\) and \(B=I-A\) have the same eigenvectors \(L\), and \(L\) diagonalizes both matrices. \textbf{QED} \section*{References}\label{references} \addcontentsline{toc}{section}{References} \hypertarget{refs}{} \hypertarget{ref-deleeuw_A_82b}{} De Leeuw, J. 1982. ``Generalized Eigenvalue Problems with Positive Semidefinite Matrices.'' \emph{Psychometrika} 47: 87--94. \url{http://www.stat.ucla.edu/~deleeuw/janspubs/1982/articles/deleeuw_A_82b.pdf}. \hypertarget{ref-deleeuw_pruzansky_78}{} De Leeuw, J., and S. Pruzansky. 1978. ``A New Computational Method to Fit the Weighted Euclidean Distance Model.'' \emph{Psychometrika} 43: 479--90. \hypertarget{ref-jiang_li_16}{} Jiang, R., and D. Li. 2016. ``Simultaneous Diagonalization of Matrices and Its Applications in Quadratically Constrained Quadratic Programming.'' \emph{SIAM Journal on Optimization} 26 (3): 1649--68. \hypertarget{ref-schoenemann_72}{} Schönemann, P.H. 1972. ``An Algebraic Solution for a Class of Subjective Metrics Models.'' \emph{Psychometrika} 37: 441--51. \end{document}