I want to build the following symmetric matrix:
s = [[s11 s12 s13]
     [s21 s22 s23]
     [s31 s32 s33]]
where each element of the matrix s is obtained by:
sii = a(i) ; for s11, s22, and s33
sij = a(i)**2 + 10 ; for s12=s21, s23=s32, and s13=s31
here, a is a list of data:
a = [0.1, 0.25, 0.12]
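
To make the off-diagonal formula concrete, here is what a(i)**2 + 10 evaluates to for this a (using a NumPy array, since elementwise squaring does not work on a plain Python list):

import numpy as np

a = np.array([0.1, 0.25, 0.12])  # NumPy array, not a plain list
print(a**2 + 10)                 # elementwise -> [10.01  10.0625  10.0144]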
So when I use the following:
import numpy as np

s = np.ones([3, 3])

def matrix(s):
    a = [0.1, 0.25, 0.12]
    s[np.diag_indices_from(s)] = ai                  # NameError: name 'ai' is not defined
    s[~np.eye(s.shape[0], dtype=bool)] = ai**2 + 10
It gives me an error (NameError: name 'ai' is not defined, and even with a in its place, a**2 would fail on a plain list). How can I solve this problem? Thanks.
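
For reference, here is a minimal sketch of one possible fix, assuming a(i) in the off-diagonal formula refers to the row index i (note that under this reading s12 and s21 actually differ, since they use a(1) and a(2); the broadcasting would need adjusting if a different index is intended):

import numpy as np

def matrix(a):
    a = np.asarray(a, dtype=float)   # convert the list so a**2 works elementwise
    n = a.size
    s = np.empty((n, n))
    s[:] = (a**2 + 10)[:, None]      # fill each row i with a[i]**2 + 10
    s[np.diag_indices_from(s)] = a   # then overwrite the diagonal with a[i]
    return s

a = [0.1, 0.25, 0.12]
print(matrix(a))

The sketch also returns the result, since the original function modifies s in place and is never actually called.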