I am generating a NumPy array of 1,000,000 random integers between 1 and 6 (simulated die throws), and I would like to calculate the mean of the first 10, 100, 1000, ... throws. I also want to plot these means on a logarithmic scale. I am only allowed to use Python with NumPy and Matplotlib. Why do I get this error? What have I done wrong?
This is my code:
import numpy
import numpy as np
import matplotlib.pyplot as plt

throws = numpy.random.randint(1, 7, 1000000)
print(throws[1:10])
x = np.logspace(1, 6, 6)
plt.plot(x, int(mean(throws[1:x])))
plt.semilogx()
Sorry for my bad English and the German variable names...
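For what it's worth, the immediate problems are: `np.logspace` returns an *array of floats*, so `throws[1:x]` tries to use a whole array as a slice bound (slice bounds must be single integers), and the bare `mean` is never imported. One way to write the same idea so it runs (a sketch, not the only possible fix; `sizes` and `means` are names I chose here):

```python
import numpy as np
import matplotlib.pyplot as plt

# one million simulated die throws, values 1..6
throws = np.random.randint(1, 7, 1_000_000)

# sample sizes 10, 100, ..., 1_000_000 as exact integers
# (np.logspace would give floats, which cannot be used as slice bounds)
sizes = 10 ** np.arange(1, 7)

# mean of the first n throws, for each sample size n
means = [throws[:n].mean() for n in sizes]

plt.semilogx(sizes, means)        # logarithmic x axis
plt.axhline(3.5, linestyle="--")  # expected value of a fair die
plt.xlabel("number of throws")
plt.ylabel("mean")
plt.show()
```

The loop computes each mean separately because a slice bound must be a plain integer; as the sample size grows, the means should settle near 3.5.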