Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by evaluating various combinations of parameter values. Site: https://optuna.org


53 questions
0 votes • 0 answers

rounding a trial float param

One of the parameters I am optimizing over is the learning rate for an optimizer in a DNN. I would like to limit the number of digits which can be set for that. Currently I am using lr = trial.suggest_loguniform("lr", 1E-6, 1E-3) which can set…
ironv • 817 • 6 • 19
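A hedged sketch of one workaround, assuming post-hoc rounding is acceptable: suggest the learning rate on a log scale, then round it inside the objective before use. Optuna still records the unrounded suggestion, and suggest_float does not accept step together with log=True, hence the rounding afterwards.

    import optuna

    def objective(trial):
        # suggest on a log scale, then keep two digits of mantissa, e.g. 3.4e-05
        lr = trial.suggest_float("lr", 1e-6, 1e-3, log=True)
        lr = float(f"{lr:.2e}")
        # ... train the DNN with this learning rate and return the validation loss
        return (lr - 1e-4) ** 2  # placeholder objective so the sketch runs

    study = optuna.create_study()
    study.optimize(objective, n_trials=20)
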
0 votes • 1 answer

How can I choose the right number of epochs per trial in Optuna?

Is there a rule of thumb for how to choose the number of epochs per trial in Optuna?
mhdadk • 145 • 1 • 5
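There may be no universal rule, but one common pattern, sketched here under the assumption that pruning is acceptable: fix a generous epoch budget and let a pruner such as MedianPruner stop unpromising trials early, so the exact per-trial epoch count matters less.

    import optuna

    def objective(trial):
        max_epochs = 50  # generous fixed budget; the pruner cuts bad trials short
        val_loss = 1.0
        for epoch in range(max_epochs):
            val_loss *= 0.95  # placeholder for one epoch of training + validation
            trial.report(val_loss, epoch)
            if trial.should_prune():
                raise optuna.TrialPruned()
        return val_loss

    study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
    study.optimize(objective, n_trials=30)
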
0 votes • 1 answer

As far as I can tell, there is no way to parameterize character strings in an AllenNLP config file --- only ints or floats

The issue is that, to use autotuning (like Optuna) with AllenNLP, the suggested practice is to reference environment variables in the jsonnet scripts and then set up a study that modifies those variables. That works fine when the…
Jim Cox • 13 • 3
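A sketch of the workaround the question alludes to, assuming the jsonnet config reads values with std.extVar (which returns a string): suggest a categorical string in Python and export it as an environment variable before launching training. The variable name ENCODER_TYPE and the choices are hypothetical.

    import os
    import optuna

    def objective(trial):
        # categorical suggestions may be strings, so pass one through the environment
        encoder = trial.suggest_categorical("encoder", ["lstm", "gru", "transformer"])
        os.environ["ENCODER_TYPE"] = encoder  # jsonnet side: std.extVar("ENCODER_TYPE")
        # ... launch `allennlp train` here and return the resulting validation metric
        return 0.0  # placeholder so the sketch runs
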
0 votes • 1 answer

Best parameters of an Optuna multi-objective optimization

When performing a single-objective optimization with Optuna, the best parameters of the study are accessible using: import optuna def objective(trial): x = trial.suggest_uniform('x', -10, 10) return (x - 2) ** 2 study =…
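For a multi-objective study there is no single study.best_params; the Pareto-optimal trials are exposed as study.best_trials. A minimal sketch:

    import optuna

    def objective(trial):
        x = trial.suggest_float("x", -10, 10)
        # two competing objectives, both minimized
        return (x - 2) ** 2, (x + 2) ** 2

    study = optuna.create_study(directions=["minimize", "minimize"])
    study.optimize(objective, n_trials=50)

    for t in study.best_trials:  # the Pareto front, not a single best trial
        print(t.values, t.params)
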
0 votes • 0 answers

How to make Optuna suggest (different) values a few times in a row and use only the last suggested value?

Here is the basic code example: import optuna def objective(trial): i=0 while i < 5: x = trial.suggest_uniform('x', -10, 10) c = trial.suggest_categorical('c',['dave','masha']) print('x =',x,'; c =',c) i +=1 …
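Within one trial, repeated calls with the same parameter name return the same value, so the loop above sees one x five times. A sketch of one workaround under that constraint: give each draw its own name, at the cost of enlarging the search space the sampler models.

    import optuna

    def objective(trial):
        x = c = None
        for i in range(5):
            # distinct names force distinct suggestions within the same trial
            x = trial.suggest_float(f"x_{i}", -10, 10)
            c = trial.suggest_categorical(f"c_{i}", ["dave", "masha"])
        return x ** 2  # only the last suggested x contributes to the value

    study = optuna.create_study()
    study.optimize(objective, n_trials=10)
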
0 votes • 1 answer

Is there any equivalent of hyperopt's lognormal in Optuna?

I am trying to use Optuna for hyperparameter tuning of my model. I am stuck in a place where I want to define a search space having lognormal/normal distribution. It is possible in hyperopt using hp.lognormal. Is it possible to define such a space…
MSS • 1,448 • 14 • 30
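As far as I know Optuna has no built-in lognormal distribution; a sketch of an inverse-CDF workaround, assuming scipy is available: suggest a uniform quantile and map it through scipy.stats.lognorm.ppf. Note the sampler then models the quantile q, not the transformed value itself.

    import numpy as np
    import optuna
    from scipy.stats import lognorm

    MU, SIGMA = 0.0, 1.0  # parameters of the underlying normal, chosen for illustration

    def objective(trial):
        q = trial.suggest_float("q", 1e-6, 1 - 1e-6)   # uniform quantile
        x = lognorm.ppf(q, s=SIGMA, scale=np.exp(MU))  # lognormal sample via inverse CDF
        return (x - 1.0) ** 2  # placeholder objective using the transformed value

    study = optuna.create_study()
    study.optimize(objective, n_trials=50)
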
0 votes • 1 answer

Optuna hyper-parameter optimization: define hyper-parameter space outside the objective function

Does anybody know how to define the hyper-parameter space outside the objective function using Optuna API? class Objective (object): def __init__(self, metric, #training_param_search_space, …
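One pattern that seems to fit: keep the space in a plain dict and hand it to a callable objective. The dict layout (name -> (low, high), log-scaled floats) is an assumption for illustration.

    import optuna

    SEARCH_SPACE = {
        "lr": (1e-5, 1e-1),
        "weight_decay": (1e-8, 1e-2),
    }

    class Objective:
        def __init__(self, space):
            self.space = space

        def __call__(self, trial):
            params = {
                name: trial.suggest_float(name, low, high, log=True)
                for name, (low, high) in self.space.items()
            }
            # ... train with params and return a validation metric
            return sum(params.values())  # placeholder so the sketch runs

    study = optuna.create_study()
    study.optimize(Objective(SEARCH_SPACE), n_trials=20)
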
0 votes • 1 answer

CNN forward function, autotuning the number of layers

class ConvolutionalNetwork(nn.Module): def __init__(self, in_features, trial): # we optimize the number of layers, hidden units and dropout ratio in each layer. n_layers = self.trial.suggest_int("n_layers", 1, 5) p =…
Tonz • 95 • 8
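The excerpt reads self.trial before anything assigns it; a sketch of one way to wire it up, assuming the trial is passed into __init__ and stored first (fully connected layers are used here for brevity, as in Optuna's PyTorch example):

    import torch.nn as nn

    class ConvolutionalNetwork(nn.Module):
        def __init__(self, in_features, trial):
            super().__init__()
            self.trial = trial  # store the trial before using it
            n_layers = trial.suggest_int("n_layers", 1, 5)
            p = trial.suggest_float("dropout", 0.1, 0.5)
            layers, prev = [], in_features
            for i in range(n_layers):
                hidden = trial.suggest_int(f"n_units_l{i}", 16, 256)
                layers += [nn.Linear(prev, hidden), nn.ReLU(), nn.Dropout(p)]
                prev = hidden
            self.body = nn.Sequential(*layers)
            self.head = nn.Linear(prev, 3)  # 3 output classes, assumed

        def forward(self, x):
            return self.head(self.body(x))
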
0 votes • 0 answers

Averaging an optuna objective function over the nearby parameter volume

In the optuna framework a Trial returns the objective function for a particular parameter choice. There are cases where the global minimum is not a stable set of parameters and the user may want to average / apply a function (e.g. penalize for high…
Alexander McFarlane • 8,800 • 7 • 47 • 91
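A sketch of doing the averaging manually inside the objective, assuming the underlying function f is cheap to evaluate and Gaussian jitter is an acceptable notion of "nearby"; the jitter scale 0.1 is arbitrary.

    import numpy as np
    import optuna

    def f(x):
        return (x - 2) ** 2  # stand-in for the true objective

    def objective(trial):
        x = trial.suggest_float("x", -10, 10)
        rng = np.random.default_rng(trial.number)  # reproducible per trial
        # average f over a small neighbourhood to penalize unstable optima
        samples = [f(x + rng.normal(scale=0.1)) for _ in range(16)]
        return float(np.mean(samples))

    study = optuna.create_study()
    study.optimize(objective, n_trials=50)
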
0 votes • 1 answer

Optuna on Pytorch CNN

class ConvolutionalNetwork(nn.Module): def __init__(self, in_features, trial): super().__init__() self.in_features = in_features self.trial = trial # this computes no of features outputted by 2 conv layers …
Tonz • 95 • 8
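The excerpt's comment about computing the feature count after two conv layers suggests the usual output-size arithmetic; a sketch assuming kernel size 3, stride 1, no padding, and 2 x 2 max pooling after each conv (all assumed, since the excerpt is truncated):

    def conv_out_size(size, kernel=3, stride=1, pool=2):
        # spatial size after an unpadded conv followed by max pooling
        return ((size - kernel) // stride + 1) // pool

    in_size = 28                                # hypothetical square input
    s = conv_out_size(conv_out_size(in_size))   # after two conv + pool blocks
    n_features = s * s * 16                     # 16 = out channels of conv 2, assumed
    print(n_features)
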
0 votes • 0 answers

Optuna Autotuning

Has anyone tried Optuna autotuning before? # Set up Optuna for autotuning, with TPESampler to minimize loss study = optuna.create_study(sampler=optuna.samplers.TPESampler(), direction="minimize") study.optimize(autotune, n_trials=300) …
Tonz • 95 • 8
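The snippet in the excerpt runs only if an autotune objective exists; a minimal runnable sketch with a stub objective, keeping the study setup from the question:

    import optuna

    def autotune(trial):
        x = trial.suggest_float("x", -5, 5)
        return x ** 2  # stand-in for the real training loss

    study = optuna.create_study(sampler=optuna.samplers.TPESampler(),
                                direction="minimize")
    study.optimize(autotune, n_trials=300)
    print(study.best_params, study.best_value)
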
0 votes • 0 answers

Optimizing filter sizes of CNN with Optuna

I have created a CNN for classification of three classes based on input images of size 39 x 39. I'm optimizing the parameters of the network using Optuna. For Optuna I'm defining the following parameters to optimize: num_blocks =…
machinery • 4,802 • 8 • 49 • 88
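A sketch of suggesting per-block filter counts, assuming num_blocks controls how many conv blocks are built and that the 39 x 39 inputs are single-channel; the parameter names and candidate sizes are illustrative.

    import optuna
    import torch.nn as nn

    def build_cnn(trial):
        num_blocks = trial.suggest_int("num_blocks", 1, 3)
        layers, in_ch = [], 1  # 1 input channel assumed for the 39 x 39 images
        for i in range(num_blocks):
            out_ch = trial.suggest_categorical(f"filters_{i}", [16, 32, 64])
            layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                       nn.ReLU(), nn.MaxPool2d(2)]
            in_ch = out_ch
        return nn.Sequential(*layers)
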
0 votes • 1 answer

Additional parameters in def __init__ for an Optuna Trial

class ConvolutionalNetwork(nn.Module): def __init__(self, in_features): super().__init__() self.in_features = in_features # this computes num features outputted from the two conv layers c1 = int(((self.in_features…
Tonz • 95 • 8
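A sketch of one way to accept both the trial and further arguments in __init__ and construct the model inside the objective; the n_classes and in_features values are illustrative.

    import optuna
    import torch.nn as nn

    class ConvolutionalNetwork(nn.Module):
        def __init__(self, in_features, trial, n_classes=3):
            super().__init__()
            self.in_features = in_features
            hidden = trial.suggest_int("hidden", 16, 128)
            self.net = nn.Sequential(nn.Linear(in_features, hidden),
                                     nn.ReLU(), nn.Linear(hidden, n_classes))

        def forward(self, x):
            return self.net(x)

    def objective(trial):
        model = ConvolutionalNetwork(in_features=64, trial=trial, n_classes=3)
        # ... train model and return a validation metric
        return 0.0  # placeholder so the sketch runs

    study = optuna.create_study()
    study.optimize(objective, n_trials=5)
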
0 votes • 1 answer

Specify fixed parameters and parameters to be searched in Optuna (LightGBM)

I just found Optuna, and it seems it is integrated with LightGBM, but I struggle to see where I can fix parameters, e.g. scoring="auc", and where I can define a grid space to search, e.g. num_leaves=[1,2,5,10]. Using…
CutePoison • 2,118 • 1 • 13 • 22
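A sketch of the plain-API pattern (not the LightGBMTuner integration): keep fixed settings in one dict and merge in the suggested ones per trial. The toy dataset is a stand-in, and num_leaves=1 from the question is dropped because LightGBM requires num_leaves > 1.

    import lightgbm as lgb
    import optuna
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

    FIXED = {"objective": "binary", "metric": "auc", "verbosity": -1}  # never searched

    def objective(trial):
        params = dict(FIXED)  # merge fixed settings with the searched ones
        params["num_leaves"] = trial.suggest_categorical("num_leaves", [2, 5, 10])
        booster = lgb.train(params, lgb.Dataset(X_tr, label=y_tr),
                            valid_sets=[lgb.Dataset(X_va, label=y_va)])
        return booster.best_score["valid_0"]["auc"]

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=20)
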
0 votes • 1 answer

Optuna - Memory Issues

I am trying to free memory in between Optuna optimization runs. I am using Python 3.8 and the latest version of Optuna. What happens is I run optuna.create_study(), then I call study.optimize(...) in a loop, with a new objective…
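Two knobs that seem relevant here, sketched under the assumption that the leaked objects are collectible: pass gc_after_trial=True to study.optimize, and drop the study and call gc.collect() between the looped runs.

    import gc
    import optuna

    def objective(trial):
        x = trial.suggest_float("x", -10, 10)
        return x ** 2  # placeholder for a memory-hungry training run

    for run in range(3):
        study = optuna.create_study()
        # gc_after_trial runs the garbage collector after every finished trial
        study.optimize(objective, n_trials=10, gc_after_trial=True)
        del study
        gc.collect()  # reclaim anything the finished study still references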