For a 2D matrix X (shape (m,n)), I'm trying to calculate X.T * X, where * is matrix multiplication. Following the explanation on this post, I expected to be able to do this using np.einsum('ji,ik->jk', X, X), where, on the LHS, writing ji first takes the transpose of the first X argument and then multiplies it by the second X argument.
This doesn't work; it fails with the following error (for (m,n) = (3,4)):
ValueError: operands could not be broadcast together with remapped shapes [original->remapped]: (4,3)->(4,newaxis,3) (4,3)->(3,4)
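For completeness, here is a minimal sketch that reproduces the failure (X is just a placeholder random array of the stated shape):

    import numpy as np

    m, n = 3, 4
    X = np.random.rand(m, n)  # any (m, n) array reproduces this

    try:
        np.einsum('ji,ik->jk', X, X)  # the call described above
    except ValueError as e:
        print(e)  # prints the "could not be broadcast together ... remapped shapes" message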
This works, however: np.einsum('ij,jk->ik', X.T, X). What am I missing here? And why is it even adding a newaxis in the middle of the remapped shape anyway?
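For reference, this is the self-contained check I used to confirm that the X.T version matches the plain matrix product (again with a placeholder random X):

    import numpy as np

    X = np.random.rand(3, 4)                 # placeholder (m, n) = (3, 4) array
    result = np.einsum('ij,jk->ik', X.T, X)  # the version that works
    assert np.allclose(result, X.T @ X)      # agrees with ordinary matrix multiplication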