
I would like to solve an n-dimensional optimisation problem using iminuit.

So my approach is the following. I am trying to figure out how to extend this:

def f(x, y, z):
    return (x - 1.)**2 + (y - 2.*x)**2 + (z - 3.*x)**2 - 1.

to a variable "x" that is a numpy.array.
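For concreteness, the array form of the function above might look like this (a minimal sketch; the packing order `x = [x, y, z]` is my assumption):

```python
import numpy as np

def f_array(x):
    # same function as above, with x a length-3 numpy array packed as [x, y, z]
    return (x[0] - 1.)**2 + (x[1] - 2.*x[0])**2 + (x[2] - 3.*x[0])**2 - 1.

print(f_array(np.array([1.0, 2.0, 3.0])))  # -1.0, the minimum
```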

I would like to do something like this:

x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]  # y = 2x
class StraightLineChi2:
    def __init__(self, x, y):
        self.x = x
        self.y = y
    def __call__(self, m, c):  # let's try to find slope and intercept
        chi2 = sum((y - (m*x + c))**2 for x, y in zip(self.x, self.y))
        return chi2

but in my case x is my unknown, and it is an array. As in many optimization/minimization problems, the function is f = f(x1, ..., xn), where n can be large; x1, ..., xn are the unknowns of the problem.
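One common workaround is an adapter that unpacks n scalar arguments back into an array, so a parameter-based minimizer like Minuit sees n named parameters. A minimal sketch (the names `ArrayFunction` and the quadratic test function are my own, not an iminuit API):

```python
import numpy as np

def f(x):
    # hypothetical n-dimensional cost function with minimum at x = (1, 2, 3)
    return float(np.sum((x - np.array([1.0, 2.0, 3.0])) ** 2))

class ArrayFunction:
    """Adapter: expose f(x_array) as f(x0, x1, ..., x{n-1})."""
    def __init__(self, f):
        self.f = f
    def __call__(self, *args):
        # repack the scalar parameters into the array the wrapped function expects
        return self.f(np.array(args))

fa = ArrayFunction(f)
print(fa(1.0, 2.0, 3.0))  # 0.0 at the minimum
```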

(These examples are taken from here)

Something similar is achieved by "hacking" pyminuit2, as described here

davrandom
  • Am I right that you want to return a function of one variable (parameterized with `m` and `c`)? – twil Jul 19 '13 at 14:47
  • No, maybe the example is itself not very explanatory. f=f(x1,...,xn) where x1,...,xn is a numpy array. x1,...,xn are my unknowns, and I have some initial values. So I would like to use minuit as you can use scipy.optimize or ipopt or nlopt. – davrandom Jul 19 '13 at 15:53
  • What does it mean to say "array minus 1."? What does it mean to say to square that? – 2rs2ts Jul 19 '13 at 16:06
  • @2rs2ts From the numpy tag, those are element-wise operations. – jorgeca Jul 21 '13 at 14:06
  • @davrandom I couldn't understand your question, but after having a look at the tutorial you link, I think you may want to know how to solve an n-dimensional optimisation problem using iminuit, right? If that's the case please improve your question. – jorgeca Jul 21 '13 at 14:24
  • @jorgeca Yes, exactly. Thanks! – davrandom Jul 22 '13 at 08:00
  • I'm the iminuit author. The question is: is n fixed? If not, it's not really a parametric problem and iminuit can't help you with that. But if it is fixed, you can look at the hardcore tutorial, especially at the generic reusable cost function. You can put all the arguments, flattened into an array, and call your function in __call__(self, *arg). http://nbviewer.ipython.org/urls/raw.github.com/iminuit/iminuit/master/tutorial/hard-core-tutorial.ipynb – Piti Ongmongkolkul Jul 31 '13 at 18:33
  • I could also write a quick example if you need one (assuming that n is fixed). – Piti Ongmongkolkul Jul 31 '13 at 18:40
  • @PitiOngmongkolkul Yes, n is fixed, and if you have some time I would love a quick example! (Obviously I will first read your linked tutorial.) – davrandom Aug 28 '13 at 07:49
  • This question is about iminuit and by now there's a stackoverflow tag for this. Could someone with the required karma please add that tag? – Christoph Aug 02 '15 at 09:35

1 Answer


For your example I recommend using iminuit and probfit. Having a single argument that is a list of parameters is not exactly what you want, since you will very soon get confused about which parameter is which.

Here is an example taken straight from the probfit tutorial. Also see the documentation.


import numpy as np
import iminuit
import probfit
x = np.linspace(0, 10, 20) 
y = 3 * x + 15 + np.random.randn(len(x))
err = np.ones(len(x))
def line(x, m, c): # define it to be parabolic or whatever you like
    return m * x + c
chi2 = probfit.Chi2Regression(line, x, y, err)
minuit = iminuit.Minuit(chi2)
minuit.migrad();
print(minuit.values) #{'c': 16.137947520534624, 'm': 2.8862774144823855}
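For reference, the quantity that `probfit.Chi2Regression` minimizes here can also be written directly with vectorized numpy operations. A sketch (with noiseless data so the true parameters give exactly zero):

```python
import numpy as np

x = np.linspace(0, 10, 20)
y = 3 * x + 15            # noiseless data for illustration
err = np.ones_like(x)

def chi2(m, c):
    # sum of squared, error-weighted residuals of the line m*x + c
    return float(np.sum(((y - (m * x + c)) / err) ** 2))

print(chi2(3.0, 15.0))  # 0.0 at the true parameters
```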

Piti Ongmongkolkul