Questions tagged [optuna]

Optuna is a hyperparameter optimization framework for Python (versions 2.7 and 3.*) that helps find the best parameters for machine learning models by testing various combinations of parameter values. Site: https://optuna.org


53 questions
3 votes · 1 answer

Optuna Pytorch: returned value from the objective function cannot be cast to float

def autotune(trial): cfg= { 'device' : "cuda" if torch.cuda.is_available() else "cpu", # 'train_batch_size' : 64, # 'test_batch_size' : 1000, # 'n_epochs' : 1, # 'seed' : 0, # …
Tonz
3 votes · 1 answer

How can I cross-validate with PyTorch and Optuna?

I want to use cross-validation with the official Optuna PyTorch-based sample code (https://github.com/optuna/optuna/blob/master/examples/pytorch_simple.py). I thought about splitting the data for cross-validation and trying parameter tuning…
sta
3 votes · 1 answer

OptKeras (Keras Optuna Wrapper) - use optkeras inside my own class, AttributeError: type object 'FrozenTrial' has no attribute '_field_types'

I wrote a simple Keras code, in which I use a CNN for the Fashion-MNIST dataset. Everything works great. I implemented my own class and classification is OK. However, I wanted to use Optuna via OptKeras (an Optuna wrapper for Keras); you can see an example…
yak
3 votes · 3 answers

Python: How to retrieve the best model from an Optuna LightGBM study?

I would like to get the best model to use later in the notebook to predict using a different test batch. reproducible example (taken from Optuna Github) : import lightgbm as lgb import numpy as np import sklearn.datasets import sklearn.metrics from…
HarriS
3 votes · 2 answers

How to sample parameters without duplicates in optuna?

I am using Optuna for parameter optimisation of my custom models. Is there any way to keep sampling parameters until a set is found that was not tested before? I mean, resample the parameters if some past trial already used the same set of…
roseaysina
3 votes · 1 answer

how to fit learning rate with pruning?

The background for the question is optimizing hyper params of neural network training by running study.optimize() with default pruning enabled and learning rate as parameter to optimize (this question can be generalized to other hyperparams). high…
2 votes · 1 answer

How to suggest multivariate of ratio (with bound) in Optuna?

I want to suggest ratios in Optuna. The ratios are X_1, X_2, ..., X_k, bounded by ∑X_i = 1 and 0 <= X_i <= 1 for all i. Optuna doesn't offer a Dirichlet distribution. I tried this but it doesn't work. def objective(trial): k = 10 ratios =…
2 votes · 2 answers

Optuna Suggests the Same Parameter Values in a lot of Trials (Duplicate Trials that Waste Time and Budget)

Optuna's TPESampler and RandomSampler try the same suggested integer values (possibly floats and loguniforms as well) for a parameter more than once for some reason. I couldn't find a way to stop it from suggesting the same values over and over again. Out…
2 votes · 0 answers

Using optuna LightGBMTunerCV as starting point for further search with optuna

I'm trying to use LightGBM for a regression problem (mean absolute error/L1 - or similar like Huber or pseudo-Huber - loss) and I primarily want to tune my hyperparameters. LightGBMTunerCV in Optuna offers a nice starting point, but after that I'd…
Björn
2 votes · 1 answer

How to fix error "'_BaseUniformDistribution' object has no attribute 'to_internal_repr'" - strange behaviour in optuna

I am trying to use optuna lib in Python to optimise parameters for recommender systems' models. Those models are custom and look like standard fit-predict sklearn models (with methods get/set params). What I do: simple objective function that…
roseaysina
1 vote · 1 answer

What happens when I add/remove parameters dynamically during an Optuna study?

Optuna's FAQ has a clear answer when it comes to dynamically adjusting the range of parameter during a study: it poses no problem since each sampler is defined individually. But what about adding and/or removing parameters? Is Optuna able to handle…
jorijnsmit
1 vote · 0 answers

Jointly optimizing autoencoder and fully connected network for classification

I have a large set of unlabeled data and a smaller set of labeled data. Thus, I would like to first train a variational autoencoder on the unlabeled data and then use the encoder for classification of three classes (with a fully connected layer…
machinery
1 vote · 0 answers

CoNLL files in hyperparameter tuning using Optuna

I've been trying to work out how to optimize the hyperparameters in a Bi-LSTM model for PoS and dependency parsing (https://github.com/datquocnguyen/jPTDP). The model takes CoNLL-U files as input and I am clueless as to how I can go about using them…
1 vote · 1 answer

Does TPE (from Optuna) take the number of trials into account?

I am using the TPE sampler from Optuna to optimize hyperparameters for deep learning vision models. I was wondering if Optuna adapts its search depending on the number of trials. If I train for 1000 trials and stop at 500, I can see that many parameters were…
Phi
1 vote · 2 answers

Is there a way to pass arguments to multiple jobs in optuna?

I am trying to use optuna for searching hyper parameter spaces. In one particular scenario I train a model on a machine with a few GPUs. The model and batch size allows me to run 1 training per 1 GPU. So, ideally I would like to let optuna spread…
mRcSchwering