Problem:
Calculate the mean and standard deviation of a tightly clustered set of 1000 initial conditions as a function of iteration number. The bunch of initial conditions should be Gaussian distributed about x = 0.3 with a standard deviation of 10^-3.
The code I wrote:
from numpy import *
def IterateMap(x,r,n):
    for i in xrange(n):
        x = r * x * (1.0 - x)
    return x
output = "data"
nIterations = 1000
r = 4.0
x0 = 0.3
delta = 0.00005
L = []
for i in xrange(nIterations):
    x = x0
    x = IterateMap(x,r,1)
    L[i] = x
    x0 = x0 + delta
A = array(L)
print 'mean: ', mean(A)
What my code is supposed to do is take an initial value for x (x0), pass it to the IterateMap function to get a new value of x, and store that value in a list (L); then x0 changes to a new value, and this process repeats 1000 times. I get the error "list assignment index out of range". Also, do you think I'm following the problem correctly?
You initialize L = [] and then try to do L[i] = x, but there are no elements in the list yet, so the assignment is out of range. Use L.append(x) instead (or preallocate the list). ;)
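On whether you're following the problem: the problem asks for 1000 initial conditions drawn from a Gaussian about x = 0.3 with standard deviation 10^-3, and for the mean and standard deviation of the whole bunch at each iteration, whereas your loop steps x0 linearly by delta and records each point only after a single iteration. Here is a minimal sketch of one way to structure it, kept in Python 2 style to match your code; the names nPoints, means, and stds and the choice of 100 iterations are mine, not part of the problem.

from numpy import *

def IterateMap(x, r, n):
    for i in xrange(n):
        x = r * x * (1.0 - x)
    return x

nPoints = 1000      # size of the bunch (from the problem statement)
nIterations = 100   # how many iterations to track -- my choice, not from the problem
r = 4.0

# Gaussian bunch of initial conditions about x = 0.3 with sigma = 1e-3
x = random.normal(0.3, 1e-3, nPoints)

means = []
stds = []
for i in xrange(nIterations):
    x = IterateMap(x, r, 1)   # advance every point in the bunch by one step
    means.append(mean(x))     # mean of the bunch at this iteration
    stds.append(std(x))       # spread of the bunch at this iteration

A = array(means)   # mean as a function of iteration number
B = array(stds)    # standard deviation as a function of iteration number
print 'mean vs. iteration: ', A
print 'std vs. iteration:  ', B

Since IterateMap only does arithmetic, it works elementwise on the whole NumPy array, so there is no need for an inner loop over the 1000 points.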