
The model that I tried to specify was:

M4 <- glmer(CORT_pgmm ~ AO_Ause + (1 | St), data = belcher,
            family = Gamma(link = "log"),
            control = glmerControl(optimizer = "bobyqa"))

When fitting this model I got the following warning message from glmer, with both the "bobyqa" and "Nelder_Mead" optimizers:

Warning message:
In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv,  :
  Model failed to converge with max|grad| = 0.0146734 (tol = 0.001, component 1)

I followed the instructions given by Ben Bolker in a previous answer.

Once I changed the optimizer to one from the optimx package, the warning message stopped appearing:

M4 <- glmer(CORT_pgmm ~ AO_Ause + (1 | St), data = belcher,
            family = Gamma(link = "log"),
            control = glmerControl(optimizer = "optimx",
                                   optCtrl = list(method = "nlminb")))

Still, I am not sure how to know if this is the correct way to proceed in my case. How can I be sure about it?

My apologies if I did not write this correctly as it is my first post. Thank you in advance for your answers.

S. Llanos
  • I'll let BenBolker respond here, but it is generally helpful to provide a [reproducible example](http://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example) – Alex W Nov 18 '15 at 21:51

1 Answer


You can never be sure (this is numerical optimization of a case about which we can't prove a whole lot in the general case), but as a general matter I would say that if you have succeeded in reaching approximately the same putatively "optimal" parameter estimates with more than one different optimizer you can stop worrying about convergence failure. How close "approximately" is depends on personal taste and scientific goals. For my typical use cases, estimating parameters within (say) 1%, or conservatively 0.1%, of each other with different optimizers would count as "approximately equal".
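A convenient way to run this comparison systematically is `allFit()`, which is part of recent versions of lme4 (it previously lived in the afex package). The sketch below assumes the asker's `belcher` data frame and model formula; the idea is to refit the same model with every available optimizer and then compare the fixed-effect estimates side by side:

```r
library(lme4)

## Fit the model once (the asker's formula and data, assumed here)
M4 <- glmer(CORT_pgmm ~ AO_Ause + (1 | St), data = belcher,
            family = Gamma(link = "log"))

## Refit with all optimizers lme4 knows about (bobyqa, Nelder_Mead,
## nlminb and others via optimx, nloptr variants, ...)
fits <- allFit(M4)

ss <- summary(fits)
ss$fixef   # fixed-effect estimates, one row per optimizer
ss$msgs    # convergence warnings (if any) from each optimizer
```

If the rows of `ss$fixef` agree to within the tolerance described above (say 0.1–1%), the original warning can reasonably be treated as a false positive.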

Ben Bolker