Questions tagged [adaboost]

AdaBoost is a machine-learning meta-algorithm. It performs several rounds of training, selecting the best weak classifier in each round. At the end of a round, the training samples that are still misclassified are given higher weights, so that the next round's selection of a weak classifier focuses more on them.

238 questions
29 votes · 2 answers

Using GridSearchCV with AdaBoost and DecisionTreeClassifier

I am attempting to tune an AdaBoost Classifier ("ABT") using a DecisionTreeClassifier ("DTC") as the base_estimator. I would like to tune both ABT and DTC parameters simultaneously, but am not sure how to accomplish this - pipeline shouldn't work,…
GPB
18 votes · 3 answers

AdaBoostClassifier with different base learners

I am trying to use AdaBoostClassifier with a base learner other than DecisionTree. I have tried SVM and KNeighborsClassifier but I get errors. Can someone point out the classifiers that can be used with AdaBoostClassifier?
vdesai
15 votes · 2 answers

Weak Classifier

I am trying to implement an application that uses the AdaBoost algorithm. I know that AdaBoost uses a set of weak classifiers, but I don't know what these weak classifiers are. Can you explain it to me with an example and tell me if I have to create my…
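A weak classifier is any model that does slightly better than random guessing; the classic choice is a decision stump, i.e. a threshold on a single feature. A minimal illustrative sketch (not from the question):

```python
import numpy as np

# A classic weak classifier: a decision stump (threshold on one feature).
def fit_stump(X, y, w):
    """Pick (feature, threshold, polarity) minimising the weighted error."""
    best = (0, 0.0, 1, np.inf)          # feature, threshold, polarity, error
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - t) >= 0, 1, -1)
                err = w[pred != y].sum()    # weighted misclassification rate
                if err < best[3]:
                    best = (f, t, pol, err)
    return best

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([-1, -1, 1, 1])
w = np.full(4, 0.25)                     # uniform AdaBoost weights
f, t, pol, err = fit_stump(X, y, w)
print(f, t, pol, err)
```

AdaBoost then combines many such stumps, each trained on reweighted data, into a strong classifier.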
14 votes · 3 answers

How to boost a Keras based neural network using AdaBoost?

Assuming I fit the following neural network for a binary classification problem:
model = Sequential()
model.add(Dense(21, input_dim=19, init='uniform', activation='relu'))
model.add(Dense(80, init='uniform', activation='relu'))
model.add(Dense(80,…
ishido
13 votes · 2 answers

How to use weights when training a weak learner for AdaBoost

The following is the AdaBoost algorithm: It mentions "using weights wi on the training data" in part 3.1. I am not very clear about how to use the weights. Should I resample the training data?
tidy
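Both options are common in practice: a learner that accepts per-sample weights can use them directly in its loss, and otherwise you can resample the training set in proportion to the weights. An illustrative NumPy sketch of the two:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))
y = np.array([1, -1, 1, 1, -1, -1])
w = np.array([0.1, 0.1, 0.4, 0.1, 0.2, 0.1])   # AdaBoost weights, sum to 1

# Option 1: pass the weights straight into a learner that supports them,
# e.g. tree.fit(X, y, sample_weight=w) in scikit-learn; the weights scale
# each sample's contribution to the training loss.

# Option 2: resample the training set with probability proportional to w,
# then train the weak learner on the resampled (unweighted) data.
idx = rng.choice(len(y), size=len(y), replace=True, p=w)
X_res, y_res = X[idx], y[idx]
print(idx)
```

Option 2 is what the original "boosting by resampling" formulation does; option 1 ("boosting by reweighting") is what most modern libraries implement.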
12 votes · 3 answers

Explaining the AdaBoost algorithm to non-technical people

I've been trying to understand the AdaBoost algorithm without much success. I'm struggling with understanding the Viola-Jones paper on face detection as an example. Can you explain AdaBoost in layman's terms and present good examples of when it's…
dole doug
11 votes · 4 answers

Using adaboost within R's caret package

I've been using the ada R package for a while, and more recently, caret. According to the documentation, caret's train() function should have an option that uses ada. But, caret is puking at me when I use the same syntax that sits within my ada()…
Bryan
10 votes · 1 answer

Custom learner function for Adaboost

I am using Adaboost to fit a classification problem. We can do the following: ens = fitensemble(X, Y, 'AdaBoostM1', 100, 'Tree') Now 'Tree' is the learner and we can change this to 'Discriminant' or 'KNN'. Each learner uses a certain Template…
JohnAndrews
9 votes · 1 answer

Parameter selection in Adaboost

After using OpenCV for boosting I'm trying to implement my own version of the AdaBoost algorithm (check here, here and the original paper for some references). By reading all the material I've come up with some questions regarding the implementation…
Matteo
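As a reference point when checking such an implementation, here is a minimal discrete AdaBoost in Python, using depth-1 trees as weak learners; labels are assumed to be in {-1, +1} (an illustrative sketch, not the OpenCV variant):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=20):
    """Train discrete AdaBoost; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # uniform initial weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()                # weighted training error
        if err >= 0.5:                          # no better than chance: stop
            break
        err = max(err, 1e-10)                   # avoid log(0) on perfect fits
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)          # up-weight the mistakes
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(learners, alphas, X):
    score = sum(a * l.predict(X) for l, a in zip(learners, alphas))
    return np.sign(score)

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
learners, alphas = adaboost_fit(X, y)
print(adaboost_predict(learners, alphas, X))
```

The main free parameters are the number of rounds and the capacity of the weak learner; everything else (alpha, the weight update) follows from the weighted error.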
9 votes · 1 answer

Advantages of SVM over decision trees and the AdaBoost algorithm

I am working on binary classification of data and I want to know the advantages and disadvantages of using support vector machines over decision trees and adaptive boosting algorithms.
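The trade-offs are largely data-dependent, so a quick cross-validated comparison on your own data is often more informative than general rules. A sketch with scikit-learn defaults (synthetic data stands in for the asker's):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Same CV protocol for all three models, so differences reflect the models.
models = {
    "SVM": SVC(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "AdaBoost": AdaBoostClassifier(random_state=0),
}
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
for name, s in scores.items():
    print(f"{name}: {s:.3f}")
```

Roughly: SVMs handle high-dimensional data and margins well but need kernel/C tuning; single trees are interpretable but high-variance; AdaBoost reduces that variance at the cost of sensitivity to label noise.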
8 votes · 3 answers

Multilabel AdaBoost for MATLAB

I am currently looking for a multilabel AdaBoost implementation for MATLAB or a technique for efficiently using a two-label implementation for the multilabel case. Any help in that matter would be appreciated.
smichak
8 votes · 4 answers

How to calculate alpha if error rate is zero (Adaboost)

I have been wondering what the value of alpha (the weight of a weak classifier) should be when it has an error rate of zero (perfect classification), since the formula for alpha is (0.5) * Math.log((1 - errorRate) / errorRate). Thank you.
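A common fix (an illustrative sketch, not from any particular library) is to clip the error rate away from 0 and 1 before taking the log, which gives a perfect weak classifier a large but finite weight; some implementations instead simply stop boosting when a round is perfect.

```python
import math

def alpha_from_error(error_rate, eps=1e-10):
    # Clip so that a zero (or one) error rate yields a finite alpha
    # instead of log of infinity / log of zero.
    error_rate = min(max(error_rate, eps), 1 - eps)
    return 0.5 * math.log((1 - error_rate) / error_rate)

print(alpha_from_error(0.0))   # large positive weight for a perfect learner
print(alpha_from_error(0.5))   # a coin-flip learner gets zero weight
```

With eps = 1e-10 a perfect classifier gets alpha of about 11.5, so it dominates the vote without producing NaN or infinity.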
8 votes · 1 answer

How to normalize an image color?

In their paper describing the Viola-Jones object detection framework (Robust Real-Time Face Detection by Viola and Jones), it is said: All example sub-windows used for training were variance normalized to minimize the effect of different lighting…
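"Variance normalized" here just means each training sub-window is shifted and scaled to zero mean and unit variance. A NumPy sketch, assuming grayscale windows (illustrative, not from the paper's code):

```python
import numpy as np

def variance_normalize(window):
    # Shift and scale the sub-window to zero mean and unit variance,
    # as described for the Viola-Jones training sub-windows.
    window = window.astype(float)
    std = window.std()
    if std == 0:                      # flat patch: only remove the mean
        return window - window.mean()
    return (window - window.mean()) / std

img = np.array([[10, 20], [30, 40]], dtype=float)
out = variance_normalize(img)
print(out.std())   # variance-normalized: std is 1 up to rounding
```

This makes the Haar-feature responses comparable across patches with different brightness and contrast.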
6 votes · 5 answers

state-of-the-art of classification algorithms

We know there are something like a thousand classifiers; recently I was told that some people say AdaBoost is like the off-the-shelf one. Are there better algorithms (with that voting idea)? What is the state of the art in classifiers? Do you…
edgarmtze
6 votes · 0 answers

Ratio of positive to negative data to use when training a cascade classifier (opencv)

So I'm using OpenCV's LBP detector. The shapes I'm detecting are all roughly circular (differing mostly in aspect ratio), with some wide changes in brightness/contrast, and a little bit of occlusion. OpenCV's guide on how to train the detector is…
user3765410