Topic Description
Compute the product of two matrices. A matrix $A$ of order $n \times m$ multiplied by a matrix $B$ of order $m \times k$ yields a matrix $C$ of order $n \times k$, where $C[i][j] = A[i][0] \times B[0][j] + A[i][1] \times B[1][j] + \cdots + A[i][m-1] \times B[m-1][j]$, i.e. $C[i][j] = \sum_{t=0}^{m-1} A[i][t] \times B[t][j]$. Here $C[i][j]$ denotes the element in row $i$, column $j$ of matrix $C$.
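The defining formula translates directly into a triple loop. A minimal sketch in C++ (the function name `multiply` and the use of `vector<vector<int>>` are illustrative choices, not prescribed by the problem):

```cpp
#include <vector>
using std::vector;

// Multiply an n-by-m matrix A by an m-by-k matrix B, returning the
// n-by-k product C with C[i][j] = sum over t of A[i][t] * B[t][j].
vector<vector<int>> multiply(const vector<vector<int>>& A,
                             const vector<vector<int>>& B) {
    int n = A.size(), m = B.size(), k = B[0].size();
    vector<vector<int>> C(n, vector<int>(k, 0));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < k; ++j)
            for (int t = 0; t < m; ++t)
                C[i][j] += A[i][t] * B[t][j];
    return C;
}
```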
Input format
The first line contains $n$, $m$, $k$, meaning that matrix $A$ has $n$ rows and $m$ columns and matrix $B$ has $m$ rows and $k$ columns; $n$, $m$, and $k$ are each less than $100$.
Matrices $A$ and $B$ follow in order: first the $n \times m$ matrix $A$, then the $m \times k$ matrix $B$. The absolute value of each matrix element is at most $1000$.
Output format
Output the matrix $C$: $n$ lines with $k$ integers each, separated by single spaces.
Sample input
3 2 3
1 1
1 1
1 1
1 1 1
1 1 1
Sample output
2 2 2
2 2 2
2 2 2
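Putting the pieces together, one possible complete solution follows; a sketch under the stated limits, not an official reference solution. Since $m < 100$ and $|A[i][t]|, |B[t][j]| \le 1000$, each $|C[i][j]| \le 99 \times 1000 \times 1000 < 10^8$, so a 32-bit `int` suffices:

```cpp
#include <iostream>
#include <vector>
using namespace std;

int main() {
    int n, m, k;
    cin >> n >> m >> k;

    // Read A (n x m) and then B (m x k), as specified by the input format.
    vector<vector<int>> A(n, vector<int>(m)), B(m, vector<int>(k));
    for (auto& row : A) for (int& x : row) cin >> x;
    for (auto& row : B) for (int& x : row) cin >> x;

    // C[i][j] = sum over t of A[i][t] * B[t][j].
    // With m < 100 and |elements| <= 1000, |C[i][j]| <= 99'000'000,
    // which fits comfortably in a 32-bit int.
    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < k; ++j) {
            int sum = 0;
            for (int t = 0; t < m; ++t) sum += A[i][t] * B[t][j];
            cout << sum << (j + 1 < k ? ' ' : '\n');
        }
    }
    return 0;
}
```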