Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by checking various combinations of parameter values. Site: https://optuna.org



53 questions
1
vote
1 answer

Is it a flaw that Optuna examples return the evaluation metric of the test set?

I am using Optuna for parameter optimization for some models. In almost all the examples, the objective function returns an evaluation metric on the TEST set and tries to minimize/maximize this. I feel like this is a flaw in the examples since…
brian
  • 11
  • 1
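One common remedy, sketched below under the assumption of a scikit-learn workflow (the RandomForestClassifier and iris data are illustrative, not from the question), is to score each trial with cross-validation on the training data only and touch the test set exactly once at the end.

    # Minimal sketch: the study optimizes a validation score, never the test score.
    import optuna
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score, train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    def objective(trial):
        n_estimators = trial.suggest_int("n_estimators", 10, 200)
        max_depth = trial.suggest_int("max_depth", 2, 16)
        model = RandomForestClassifier(n_estimators=n_estimators, max_depth=max_depth)
        # Cross-validated score on the training data only.
        return cross_val_score(model, X_train, y_train, cv=5).mean()

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=50)

    # Final, one-off evaluation on the held-out test set.
    best_model = RandomForestClassifier(**study.best_params).fit(X_train, y_train)
    print(best_model.score(X_test, y_test))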
1
vote
1 answer

How to tune a conditional objective function using Optuna or Hyperopt

I tried to use Optuna to tune hyperparameters, but my objective function is conditional, which creates issues in getting optimal parameters. I want to get cwc only if the condition is met; otherwise, continue the trial with the next hyperparameters. But I guess…
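One way to express this, sketched below, is to raise optuna.TrialPruned when the condition fails so the study simply moves on to the next set of hyperparameters. The helpers evaluate_coverage and compute_cwc are hypothetical stand-ins for the question's condition and metric.

    import optuna

    def evaluate_coverage(alpha, beta):
        # Placeholder for the question's condition check.
        return alpha + beta

    def compute_cwc(alpha, beta):
        # Placeholder for the question's cwc metric.
        return (alpha - 0.3) ** 2 + (beta - 0.7) ** 2

    def objective(trial):
        alpha = trial.suggest_float("alpha", 0.0, 1.0)
        beta = trial.suggest_float("beta", 0.0, 1.0)
        if evaluate_coverage(alpha, beta) < 0.95:
            # Condition not met: prune so Optuna proceeds to the next trial.
            raise optuna.TrialPruned()
        return compute_cwc(alpha, beta)

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=100)

Returning a large penalty value instead of pruning also works, but pruned trials are excluded from the sampler's model, which is usually the intended behaviour here.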
0
votes
0 answers

optuna.integration.lightGBM custom optimization metric

I am trying to optimize a LightGBM model using Optuna. Reading the docs, I noticed that there are two approaches that can be used, as mentioned here: LightGBM Tuner: New Optuna Integration for Hyperparameter Optimization. The first approach uses the…
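If the LightGBMTuner integration does not expose the metric you need, a fallback (sketched here under that assumption, with illustrative data and mean absolute error as the custom metric) is a plain Optuna study whose objective trains LightGBM itself and returns whatever metric you compute.

    import lightgbm as lgb
    import optuna
    from sklearn.datasets import load_diabetes
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    X, y = load_diabetes(return_X_y=True)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    def objective(trial):
        params = {
            "objective": "regression",
            "verbosity": -1,
            "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
            "num_leaves": trial.suggest_int("num_leaves", 8, 256),
        }
        booster = lgb.train(params, lgb.Dataset(X_tr, label=y_tr), num_boost_round=200)
        preds = booster.predict(X_val)
        # The custom metric the study actually optimizes.
        return mean_absolute_error(y_val, preds)

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=30)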
0
votes
1 answer

Optuna ConvergenceWarning on Lasso hyperparameter tuning study

When fine-tuning my Lasso model using Optuna, I get the following ConvergenceWarning. Is it possible to increase the number of iterations? I increased n_trials, but it didn't help. My code: def objective(trial): _alpha = trial.suggest_float("alpha",…
TkrA
  • 21
  • 3
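The warning comes from scikit-learn's coordinate-descent solver rather than from Optuna, so raising n_trials cannot help; the iteration budget is raised on the estimator via max_iter (and optionally a looser tol). A minimal sketch with illustrative data and bounds:

    import optuna
    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import Lasso
    from sklearn.model_selection import cross_val_score

    X, y = load_diabetes(return_X_y=True)

    def objective(trial):
        alpha = trial.suggest_float("alpha", 1e-4, 10.0, log=True)
        # Larger max_iter addresses the ConvergenceWarning; tol can be loosened too.
        model = Lasso(alpha=alpha, max_iter=100_000, tol=1e-3)
        return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=100)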
0
votes
0 answers

Optuna pass dictionary of parameters from "outside"

I am using Optuna to optimize some objective functions. I would like to create my custom class that "wraps" the standard Optuna code. As an example, this is my class (it is still a work in progress!): class Optimizer(object): def…
Mattia Surricchio
  • 717
  • 2
  • 9
  • 29
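One common pattern for passing external configuration into the objective is to make the wrapper itself callable (functools.partial works too). The sketch below is an assumption about the shape of such a class, not the question's actual Optimizer code.

    import optuna

    class Optimizer:
        def __init__(self, search_space, direction="minimize"):
            # search_space maps each parameter name to its (low, high) bounds.
            self.search_space = search_space
            self.study = optuna.create_study(direction=direction)

        def __call__(self, trial):
            # Because the instance is the objective, any attribute set from
            # "outside" is available here.
            params = {
                name: trial.suggest_float(name, low, high)
                for name, (low, high) in self.search_space.items()
            }
            return sum((v - 1.0) ** 2 for v in params.values())  # toy objective

        def run(self, n_trials=50):
            self.study.optimize(self, n_trials=n_trials)
            return self.study.best_params

    opt = Optimizer({"x": (-10.0, 10.0), "y": (-10.0, 10.0)})
    print(opt.run())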
0
votes
0 answers

optuna cmaes with initial value x0?

My code: import optuna def objective(trial): x = trial.suggest_float('x', -100.0, 100.0) y = trial.suggest_float('y', -100.0, 100.0) return x**2 + y**2 study =…
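A sketch, assuming the installed Optuna version's CmaEsSampler accepts the x0 (initial mean) and sigma0 arguments; study.enqueue_trial is a portable alternative for forcing the first trial to start at a chosen point.

    import optuna

    def objective(trial):
        x = trial.suggest_float("x", -100.0, 100.0)
        y = trial.suggest_float("y", -100.0, 100.0)
        return x**2 + y**2

    # Initial mean and step size for CMA-ES (assumed to be supported).
    sampler = optuna.samplers.CmaEsSampler(x0={"x": 10.0, "y": -10.0}, sigma0=5.0)
    study = optuna.create_study(sampler=sampler)

    # Alternative: explicitly queue the starting point as the first trial.
    study.enqueue_trial({"x": 10.0, "y": -10.0})

    study.optimize(objective, n_trials=100)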
0
votes
1 answer

suggest_int() missing 1 required positional argument: 'high' error on Optuna

I have the following Optuna code to do hyperparameter tuning for an XGBoost classifier. import optuna from optuna import Trial, visualization from optuna.samplers import TPESampler from xgboost import XGBClassifier def objective(trial:…
Yifan Lyu
  • 23
  • 4
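This error usually means suggest_int was called with a name and only one bound; both low and high are required positional arguments. A minimal sketch with illustrative data and parameter ranges:

    import optuna
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from xgboost import XGBClassifier

    X, y = load_breast_cancer(return_X_y=True)

    def objective(trial):
        # trial.suggest_int("max_depth", 10)            # WRONG: 'high' is missing
        max_depth = trial.suggest_int("max_depth", 2, 10)        # low and high
        n_estimators = trial.suggest_int("n_estimators", 50, 500)
        model = XGBClassifier(max_depth=max_depth, n_estimators=n_estimators)
        return cross_val_score(model, X, y, cv=3).mean()

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=20)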
0
votes
1 answer

Improving the performance of an autoencoder network

For a couple of days, I have been working to improve the performance of my autoencoder network, from changing the network architecture to manually tuning some parameters and, lately, using Optuna to optimize hyperparameters. All to no significant improvement…
arilwan
  • 1,952
  • 2
  • 16
  • 29
0
votes
1 answer

Create a list from dictionary items of optimal hyperparameters

I'm using the Optuna framework to select the best parameters for my intended CNN network, including the number of layers, filters per layer, optimizer, etc. I can confirm my best parameters are returned as a dictionary containing: study =…
super_ask
  • 782
  • 1
  • 14
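Since study.best_params is an ordinary Python dict, the usual dict-to-list conversions apply. The sketch below uses a toy study and assumes per-layer keys named "n_filters_i", which is an illustration rather than the question's actual naming.

    import optuna

    def objective(trial):
        n_layers = trial.suggest_int("n_layers", 1, 3)
        total = 0
        for i in range(n_layers):
            total += trial.suggest_int(f"n_filters_{i}", 16, 64)
        return total

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=10)

    best = study.best_params                  # plain dict
    pairs = list(best.items())                # [('n_layers', 2), ('n_filters_0', 16), ...]
    filters = [best[f"n_filters_{i}"] for i in range(best["n_layers"])]
    print(pairs, filters)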
0
votes
1 answer

Limit max number of parallel processes in Optuna

How to limit the max number of parallel processes when running hyper-parameter search in Optuna?
Gal Hyams
  • 167
  • 1
  • 3
  • 12
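Within a single process, the cap on parallelism is the n_jobs argument of study.optimize; in multi-process setups that share a storage backend, the cap is simply the number of worker processes you launch. A minimal sketch:

    import optuna

    def objective(trial):
        x = trial.suggest_float("x", -10.0, 10.0)
        return x**2

    study = optuna.create_study()
    # At most 4 trials run concurrently in this process.
    study.optimize(objective, n_trials=100, n_jobs=4)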
0
votes
1 answer

Optuna catboost pruning

Is there a way to have pruning with CatBoost and Optuna (in LightGBM it's easy, but for CatBoost I can't find any hint)? My code is like this: def objective(trial): param = { 'iterations': trial.suggest_int('iterations', 100, 1500,…
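A sketch, assuming an Optuna release that ships optuna.integration.CatBoostPruningCallback (it appeared in later versions); older versions would have to prune manually. The data, metric, and ranges are illustrative.

    import optuna
    from catboost import CatBoostClassifier
    from optuna.integration import CatBoostPruningCallback
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    def objective(trial):
        model = CatBoostClassifier(
            iterations=trial.suggest_int("iterations", 100, 1500),
            depth=trial.suggest_int("depth", 4, 10),
            eval_metric="Accuracy",
            verbose=0,
        )
        pruning_cb = CatBoostPruningCallback(trial, "Accuracy")
        model.fit(X_tr, y_tr, eval_set=(X_val, y_val), callbacks=[pruning_cb])
        # Raise TrialPruned if CatBoost was stopped by the callback.
        pruning_cb.check_pruned()
        return model.score(X_val, y_val)

    study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
    study.optimize(objective, n_trials=20)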
0
votes
0 answers

Turning off warnings in Optuna training

I fully realize that I will likely be embarrassed for missing something obvious, but this has me stumped. I am tuning an LGBM model using Optuna, and my notebook gets flooded with warning messages. How can I suppress them while leaving errors (and ideally…
Ncosgove
  • 13
  • 4
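Which knob matters depends on where the messages originate: Optuna's own logger, Python's warnings machinery, or LightGBM's console output. A sketch combining the usual three:

    import warnings

    import optuna

    # Keep only Optuna's error messages (silences its per-trial INFO/WARNING logs).
    optuna.logging.set_verbosity(optuna.logging.ERROR)

    # Silence Python-level warnings emitted by the libraries in use.
    warnings.filterwarnings("ignore", category=UserWarning)

    params = {
        "verbosity": -1,  # silence LightGBM's own console output
        # ... the rest of the LightGBM parameters
    }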
0
votes
1 answer

Optuna lightgbm integration giving categorical features error

I'm creating a model using the Optuna LightGBM integration. My training set has some categorical features, and I pass those features to the model using the lgb.Dataset class. Here is the code I'm using (NOTE: X_train, X_val, y_train, y_val are all pandas…
Occhima
  • 125
  • 1
  • 8
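The error typically goes away once the categorical columns are cast to pandas' "category" dtype before the lgb.Dataset is built, with the column names passed via categorical_feature. The sketch below uses toy stand-ins for the question's DataFrames; the column name "city" is an assumption.

    import lightgbm as lgb
    import pandas as pd

    # Toy stand-ins for the question's X_train / y_train.
    X_train = pd.DataFrame({"city": ["a", "b", "a", "c"], "price": [1.0, 2.0, 3.0, 4.0]})
    y_train = pd.Series([0, 1, 0, 1])

    cat_cols = ["city"]
    for col in cat_cols:
        # LightGBM expects pandas categorical dtype, not raw object/str columns.
        X_train[col] = X_train[col].astype("category")

    dtrain = lgb.Dataset(X_train, label=y_train, categorical_feature=cat_cols)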
0
votes
1 answer

How to set a minimum number of epoch in Optuna SuccessiveHalvingPruner()?

I'm using Optuna 2.5 to optimize a couple of hyperparameters on a tf.keras CNN model. I want to use pruning so that the optimization skips the less promising corners of the hyperparameter space. I'm using something like this: study0 =…
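SuccessiveHalvingPruner's min_resource argument controls how many reported steps (epochs, in this setting) a trial completes before the first halving can discard it. A sketch with a toy training loop standing in for the tf.keras model:

    import optuna

    pruner = optuna.pruners.SuccessiveHalvingPruner(
        min_resource=5,      # no trial is halved before 5 reported epochs
        reduction_factor=4,
    )
    study = optuna.create_study(direction="minimize", pruner=pruner)

    def objective(trial):
        lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
        loss = 1.0
        for epoch in range(30):
            loss *= 0.9 + lr                  # toy stand-in for one training epoch
            trial.report(loss, step=epoch)    # intermediate value the pruner sees
            if trial.should_prune():
                raise optuna.TrialPruned()
        return loss

    study.optimize(objective, n_trials=30)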
0
votes
0 answers

Why aren't search variables and domains defined before?

This is more a conceptual question: Consider the minimal example below that optimizes for the minimal value of an absolute-value function. Here I used trial.suggest_float within the objective function just as it is taught in the tutorials. But here…
flawr
  • 6,011
  • 2
  • 27
  • 46
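The short answer is Optuna's define-by-run design: declaring the domains inside the objective lets later suggestions depend on earlier ones within the same trial, which a statically declared search space cannot express as directly. A minimal sketch of such a conditional space:

    import optuna

    def objective(trial):
        kind = trial.suggest_categorical("kind", ["linear", "quadratic"])
        x = trial.suggest_float("x", -10.0, 10.0)
        if kind == "quadratic":
            # This parameter only exists when the branch above is taken.
            a = trial.suggest_float("a", 0.1, 5.0)
            return a * x**2
        return abs(x)

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=50)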