\def\<#1>{\left<#1\right>} \newcommand{\OR}[1]{\mathrm{OR}(#1)} \newcommand{\XOR}[1]{\mathrm{XOR}(#1)} \newcommand{\SUM}[1]{\mathrm{SUM}(#1)} \newcommand{\GEN}{\mathrm{GEN}} \newcommand{\Cov}{c} \newcommand{\Dec}{d} \newcommand{\tr}{\mathrm{tr}} \newcommand{\pr}{\mathrm{rk}_+} \newcommand{\br}{\mathrm{rk}_{\lor}} \newcommand{\Geq}{\succcurlyeq} \newcommand{\Leq}{\preccurlyeq}

On the XOR complexity of the direct sum of matrices

Recall that the direct sum of matrices A and B is the matrix A\boxplus B=\left[\begin{smallmatrix} A & 0 \\ 0 & B\end{smallmatrix}\right]. For the SUM and OR complexities we have (cf., e.g., Thm. 3.18): \SUM{A\boxplus B}=\SUM{A}+\SUM{B}\ \ \mbox{ and }\ \ \OR{A\boxplus B}=\OR{A}+\OR{B}\,. But what about the XOR complexity? Does the inequality \XOR{A\boxplus B}\geq \XOR{A}+\XOR{B}\qquad (*) also hold?
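To make the definition concrete, here is a small numpy sketch (my own illustration, not part of the original note) that builds the direct sum A\boxplus B of two boolean matrices as the block-diagonal matrix above.

```python
# Illustrative sketch (not from the note): the direct sum of two boolean matrices.
import numpy as np

def direct_sum(A, B):
    """Return A ⊞ B = [[A, 0], [0, B]] as a 0/1 integer matrix."""
    m1, n1 = A.shape
    m2, n2 = B.shape
    C = np.zeros((m1 + m2, n1 + n2), dtype=int)
    C[:m1, :n1] = A
    C[m1:, n1:] = B
    return C

A = np.array([[1, 0], [1, 1]])
B = np.array([[0, 1], [1, 1]])
print(direct_sum(A, B))
# [[1 0 0 0]
#  [1 1 0 0]
#  [0 0 0 1]
#  [0 0 1 1]]
```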

Igor recently observed that (*) does not hold: the XOR complexity of a direct sum of matrices may be smaller than the sum of their individual complexities.

Take a boolean n\times n matrix A such that \XOR{A}\sim n^2/(2\log n); by Thm. 2.2, we know that such matrices exist.

Let M_n be the Kronecker product of the identity matrix I_n with A, that is, M_1=A\ \ \mbox{ and } \ \ M_{n+1}=\left[\begin{matrix} A & 0 \\ 0 & M_n\end{matrix}\right]\,; note that M_n is an n^2\times n^2 matrix. If (*) held, then we would have \XOR{M_n}\geq n\cdot \XOR{A}=\Omega(n^3/\log n).
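As a sanity check, here is a short sketch of my own (not from the note) verifying that this recursive construction indeed coincides with the Kronecker product I_n\otimes A:

```python
# Sketch: the recursion M_1 = A, M_{k+1} = [[A, 0], [0, M_k]] yields I_k ⊗ A.
import numpy as np

def M(k, A):
    """Block-diagonal matrix with k copies of A on the diagonal."""
    if k == 1:
        return A.copy()
    Mk = M(k - 1, A)
    n = A.shape[0]
    out = np.zeros((n + Mk.shape[0], n + Mk.shape[1]), dtype=int)
    out[:n, :n] = A
    out[n:, n:] = Mk
    return out

A = np.array([[1, 0], [1, 1]])
assert np.array_equal(M(3, A), np.kron(np.eye(3, dtype=int), A))
```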

But the outputs of an XOR circuit for y=M_n\cdot x are exactly the entries of the product matrix Y=A\cdot X, where X is the vector x of length n^2 written column-wise as an n\times n matrix. By Strassen's algorithm, all these entries can be computed over GF(2) using only O(n^{\omega}) operations, where \omega < 2.81 (actually, \omega < 2.4). Since A is a fixed matrix, the multiplications in such an algorithm are multiplications by constants, so the resulting circuit uses only XOR gates; hence \XOR{M_n}=O(n^{\omega}), which is much smaller than \Omega(n^3/\log n), a contradiction.
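Both claims in this step can be checked numerically. The sketch below is my own illustration (it assumes n is a power of two and uses 0/1 numpy arrays): it verifies that computing y=M_n\cdot x over GF(2) amounts to computing the product A\cdot X, where X is x written column-wise as an n\times n matrix, and that this product is obtained by Strassen's recursion in which all additions and subtractions of the classical scheme become XORs.

```python
import numpy as np

def strassen_gf2(A, B):
    """Strassen's 7-multiplication recursion over GF(2); n must be a power of two.
    Over GF(2) the additions/subtractions of the classical scheme are XORs."""
    n = A.shape[0]
    if n == 1:
        return A & B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen_gf2(A11 ^ A22, B11 ^ B22)
    M2 = strassen_gf2(A21 ^ A22, B11)
    M3 = strassen_gf2(A11, B12 ^ B22)
    M4 = strassen_gf2(A22, B21 ^ B11)
    M5 = strassen_gf2(A11 ^ A12, B22)
    M6 = strassen_gf2(A21 ^ A11, B11 ^ B12)
    M7 = strassen_gf2(A12 ^ A22, B21 ^ B22)
    return np.block([[M1 ^ M4 ^ M5 ^ M7, M3 ^ M5],
                     [M2 ^ M4,           M1 ^ M2 ^ M3 ^ M6]])

n = 8
rng = np.random.default_rng(1)
A = rng.integers(0, 2, size=(n, n))      # the fixed boolean matrix A
x = rng.integers(0, 2, size=n * n)       # input vector of length n^2

M = np.kron(np.eye(n, dtype=int), A)     # M_n = I_n ⊗ A (block diagonal)
y = (M @ x) % 2                          # what an XOR circuit for M_n·x outputs
X = x.reshape(n, n, order='F')           # x written column-wise as an n×n matrix
assert np.array_equal(y, strassen_gf2(A, X).flatten(order='F'))
```

Each level of the recursion makes 7 half-size multiplications plus O(n^2) XORs, which is where the O(n^{\log_2 7})=O(n^{2.81}) operation count used above comes from.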


May 22, 2015