[SOLVED] Multiply a 3D array with a 2D array

Issue

I have a 3D numpy array that I want to multiply with a 2D array. The 3D array looks as follows:

``````
import numpy as np

C = np.zeros((3, 2, 2))
C[0][0] = [0, 0]
C[0][1] = [0, 1]
C[1][0] = [1, 0]
C[1][1] = [1, 1]
C[2][0] = [1, 2]
C[2][1] = [2, 1]
``````

The 2D array looks like:

``````
V = np.zeros((3, 2))
V[0][0] = 1
V[0][1] = 2
V[1][0] = 1
V[1][1] = 3
V[2][0] = 4
V[2][1] = 5
``````

The result `R` should be a 2x2 2D array (4 elements in total), `R = [[5, 8], [13, 10]]`, where:

``````
R[0] = V[0][0]*C[0][0] + V[1][0]*C[1][0] + V[2][0]*C[2][0] = [5, 8]   (first row of R)
R[1] = V[0][1]*C[0][1] + V[1][1]*C[1][1] + V[2][1]*C[2][1] = [13, 10] (second row of R)
``````
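For reference (not part of the original question), the rule above can be written as an explicit loop over the summed index — the very loop the question wants to eliminate:

```python
import numpy as np

C = np.zeros((3, 2, 2))
C[0, 0] = [0, 0]; C[0, 1] = [0, 1]
C[1, 0] = [1, 0]; C[1, 1] = [1, 1]
C[2, 0] = [1, 2]; C[2, 1] = [2, 1]

V = np.zeros((3, 2))
V[0] = [1, 2]
V[1] = [1, 3]
V[2] = [4, 5]

# R[j] = sum over k of V[k, j] * C[k, j]
R = np.zeros((2, 2))
for j in range(2):        # row of R
    for k in range(3):    # index being summed away
        R[j] += V[k, j] * C[k, j]

print(R)  # [[ 5.  8.]
          #  [13. 10.]]
```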

This is just an example. How can I get `R` using a numpy matrix multiplication operation on `V` and `C` (with no for loop)? Please help!

Sorry, I made an edit later; the comment showed an old example. It should be good now.

Solution

Your example is confusing. Why do you say your expected result is `[[1, 0], [5, 10]]`, when your example also says `R` should be `[[5, 8], [13, 10]]`?

I hope that was just a typo on your part, because it's not clear from your example how you'd get from one to the other.

In any case:

``````
(V.T * C.T).sum(axis=2).T
``````

Output:

``````
array([[ 5.,  8.],
       [13., 10.]])
``````
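An equivalent formulation (my addition, not from the original answer) uses `np.einsum`, which states the index pattern `R[j, i] = Σ_k C[k, j, i] * V[k, j]` directly and avoids the transpose gymnastics, assuming the same `C` and `V` as in the question:

```python
import numpy as np

C = np.zeros((3, 2, 2))
C[0, 0] = [0, 0]; C[0, 1] = [0, 1]
C[1, 0] = [1, 0]; C[1, 1] = [1, 1]
C[2, 0] = [1, 2]; C[2, 1] = [2, 1]

V = np.zeros((3, 2))
V[0] = [1, 2]
V[1] = [1, 3]
V[2] = [4, 5]

# 'kji,kj->ji': multiply C[k, j, i] by V[k, j] and sum over k,
# leaving a (2, 2) result indexed by (j, i).
R = np.einsum('kji,kj->ji', C, V)
print(R)  # [[ 5.  8.]
          #  [13. 10.]]
```

Both expressions compute the same sum; einsum just makes explicit which axis is contracted.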