Questions tagged [perceptron]

A perceptron is a basic linear classifier that outputs binary labels. If the training data are not linearly separable, the perceptron learning algorithm cannot converge.

The classic XOR problem is a dataset that is not linearly separable, so a single perceptron cannot solve it. By adding nonlinear hidden layers between the input and the output, the data can be separated. With enough hidden units and training data, the resulting network can model any well-defined function to arbitrary precision. This generalization is known as a multilayer perceptron (MLP).

For more details, see the Wikipedia article on the perceptron.
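
As a quick companion to the description above, here is a minimal sketch of the perceptron learning rule in plain NumPy. The toy dataset and the choice of labels in {-1, +1} are my own; on linearly separable data like this, the loop is guaranteed to terminate.

    import numpy as np

    # Toy, linearly separable data: label +1 when x0 + x1 >= 2, else -1 (hypothetical example).
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [2.0, 2.0], [-1.0, -1.0]])
    y = np.array([-1, -1, -1, 1, 1, -1])

    # Prepend a constant 1 to each sample so the bias is just another weight.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])

    # Classic perceptron rule: whenever a sample is misclassified, add y_i * x_i to w.
    converged = False
    while not converged:
        converged = True
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or exactly on the boundary)
                w += yi * xi
                converged = False

    print("learned weights [bias, w1, w2]:", w)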

440 questions
106 votes • 4 answers

multi-layer perceptron (MLP) architecture: criteria for choosing number of hidden layers and size of the hidden layer?

If we have 10 eigenvectors, we can have 10 nodes in the input layer. If we have 5 output classes, we can have 5 nodes in the output layer. But what is the criterion for choosing the number of hidden layers in an MLP, and how many nodes in 1…
66 votes • 4 answers

Perceptron learning algorithm not converging to 0

Here is my perceptron implementation in ANSI C: #include #include #include float randomFloat() { srand(time(NULL)); float r = (float)rand() / (float)RAND_MAX; return r; } int calculateOutput(float…
Richard Knop • 73,317
38 votes • 5 answers

Why is weight vector orthogonal to decision plane in neural networks

I am a beginner in neural networks and am learning about perceptrons. My question is: why is the weight vector perpendicular to the decision boundary (hyperplane)? I have referred to many books, but all of them mention that the weight vector is perpendicular to the decision…
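
The standard short derivation (not quoted from this question or its answers) takes any two points on the decision boundary and subtracts the two boundary equations:

    % Let x_1 and x_2 be any two points on the boundary w . x + b = 0.
    \[
      w^\top x_1 + b = 0, \qquad w^\top x_2 + b = 0
      \;\Longrightarrow\;
      w^\top (x_1 - x_2) = 0 .
    \]
    % The difference x_1 - x_2 is an arbitrary direction lying inside the
    % hyperplane, so w is orthogonal to all such directions, i.e. the weight
    % vector is perpendicular to the decision boundary.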
29 votes • 4 answers

Training on imbalanced data using TensorFlow

The Situation: I am wondering how to use TensorFlow optimally when my training data is imbalanced in label distribution between 2 labels. For instance, suppose the MNIST tutorial is simplified to only distinguish between 1's and 0's, where all…
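
One common remedy is to weight the loss per class. The sketch below uses Keras' class_weight argument; the model, the data shapes, and the assumed 1:10 weighting are all illustrative stand-ins, not anything taken from the question itself.

    import numpy as np
    import tensorflow as tf

    # Hypothetical imbalanced binary data: flattened 28x28 images, label 1 is rare (~10%).
    x_train = np.random.rand(1000, 784).astype("float32")
    y_train = (np.random.rand(1000) < 0.1).astype("int32")

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(784,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Weight the minority class ~10x so each of its examples contributes more to the loss.
    model.fit(x_train, y_train, epochs=5, batch_size=32,
              class_weight={0: 1.0, 1: 10.0})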
13 votes • 2 answers

Intuition for perceptron weight update rule

I am having trouble understanding the weight update rule for perceptrons: w(t + 1) = w(t) + y(t)x(t). Assume we have a linearly separable data set. w is a set of weights [w0, w1, w2, ...] where w0 is a bias. x is a set of input parameters [x0, x1,…
joshreesjones • 1,824
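
A small numeric check of the rule quoted in this question (toy numbers of my own, assuming labels in {-1, +1}): adding y(t)x(t) raises the score y*(w.x) by exactly ||x||^2, so a misclassified point is always pushed toward the correct side.

    import numpy as np

    # Hypothetical misclassified sample: y * (w . x) <= 0 before the update.
    w = np.array([0.0, 1.0, -1.0])   # [bias w0, w1, w2]
    x = np.array([1.0, 0.5, 2.0])    # [1, x1, x2], with the artificial bias input 1
    y = 1.0

    print("before:", y * np.dot(w, x))   # -1.5 -> misclassified

    w = w + y * x                        # perceptron update: w(t + 1) = w(t) + y(t)x(t)

    # The score grows by exactly ||x||^2 = 1 + 0.25 + 4 = 5.25, regardless of the sign of y.
    print("after: ", y * np.dot(w, x))   # 3.75 -> now on the correct side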
12 votes • 3 answers

How do you draw a line using the weight vector in a Linear Perceptron?

I understand the following: In 2D space, each data point has 2 features: x and y. The weight vector in 2D space contains 3 values [bias, w0, w1] which can be rewritten as [w0,w1,w2]. Each datapoint needs an artificial coordinate [1, x, y] for the…
user1337603 • 235
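
A sketch of the plotting step the question describes (the weight values and variable names are mine): with weights [w0, w1, w2] applied to the augmented point [1, x, y], the boundary w0 + w1*x + w2*y = 0 rearranges to y = -(w0 + w1*x)/w2, which can be drawn directly.

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical learned weights for the augmented input [1, x, y].
    w0, w1, w2 = -1.0, 2.0, 1.0

    # Decision boundary: w0 + w1*x + w2*y = 0  =>  y = -(w0 + w1*x) / w2  (assumes w2 != 0)
    xs = np.linspace(-2, 2, 100)
    ys = -(w0 + w1 * xs) / w2

    plt.plot(xs, ys, label="decision boundary")
    # Draw (w1, w2) starting from (0, 1), a point that lies on the line; it is perpendicular to it.
    plt.quiver(0, 1, w1, w2, angles="xy", scale_units="xy", scale=1, label="weight vector (w1, w2)")
    plt.legend()
    plt.show()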
11 votes • 3 answers

Parameter Tuning for Perceptron Learning Algorithm

I'm having sort of an issue trying to figure out how to tune the parameters for my perceptron algorithm so that it performs relatively well on unseen data. I've implemented a verified working perceptron algorithm and I'd like to figure out a method…
11 votes • 7 answers

Geometric representation of Perceptrons (Artificial neural networks)

I am taking this Coursera course on Neural Networks by Geoffrey Hinton (not current). I have a very basic question about weight spaces. https://d396qusza40orc.cloudfront.net/neuralnets/lecture_slides%2Flec2.pdf Page 18. If I have a weight vector (bias…
kosmos • 307
9 votes • 3 answers

Can a perceptron be used to detect hand-written digits?

Let's say I have a small bitmap which contains a single digit (0..9) in hand writing. Is it possible to detect the digit using a (two-layered) perceptron? Are there other possibilities to detect single digits from bitmaps besides using neural nets?
9 votes • 5 answers

implementing a perceptron classifier

Hi, I'm pretty new to Python and to NLP. I need to implement a perceptron classifier. I searched through some websites but didn't find enough information. For now I have a number of documents which I grouped according to category (sports,…
9 votes • 2 answers

Why won't Perceptron Learning Algorithm converge?

I have implemented the Perceptron Learning Algorithm in Python as below. Even with 500,000 iterations, it still won't converge. I have a training data matrix X with target vector Y, and a weight vector w to be optimized. My update rule is:…
manbearpig • 143
8 votes • 3 answers

What's the point of the threshold in a perceptron?

I'm having trouble seeing what the threshold actually does in a single-layer perceptron. The data is usually separated no matter what the value of the threshold is. It seems a lower threshold divides the data more equally; is this what it is used…
Hypercube • 1,143
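
The threshold is equivalent to a bias term: firing when w.x > theta is the same as firing when w.x - theta > 0, so changing theta shifts the decision boundary away from the origin rather than rotating it. A tiny illustration with made-up numbers:

    import numpy as np

    w = np.array([1.0, 1.0])   # hypothetical weights
    theta = 1.5                # threshold

    def fires(x):
        # step(w . x - theta): identical to using a bias term b = -theta
        return np.dot(w, x) - theta > 0

    print(fires(np.array([1.0, 1.0])))   # 2.0 - 1.5 > 0 -> True
    print(fires(np.array([0.5, 0.5])))   # 1.0 - 1.5 > 0 -> False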
7 votes • 1 answer

Correct backpropagation in simple perceptron

Given the simple OR gate problem: or_input = np.array([[0,0], [0,1], [1,0], [1,1]]) or_output = np.array([[0,1,1,1]]).T If we train a simple single-layered perceptron (without backpropagation), we could do something like this: import numpy as…
alvas • 94,813
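
For reference, here is a complete single-layer loop of the kind the excerpt sketches, written as my own minimal NumPy version (not the asker's code) on the same or_input / or_output arrays:

    import numpy as np

    or_input = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    or_output = np.array([[0, 1, 1, 1]]).T

    W = np.zeros((2, 1))   # one weight per input
    b = 0.0                # bias
    lr = 0.1

    def step(z):
        return (z > 0).astype(float)

    # Plain perceptron rule applied batch-wise: nudge W and b by each sample's error.
    for _ in range(100):
        pred = step(or_input @ W + b)   # shape (4, 1)
        err = or_output - pred          # each entry is 0, +1 or -1
        W += lr * or_input.T @ err
        b += lr * err.sum()

    print(step(or_input @ W + b).ravel())   # converges to [0. 1. 1. 1.]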
7 votes • 3 answers

Neural Network: Solving XOR

Could someone please give me a mathematically correct explanation of why a multilayer perceptron can solve the XOR problem? My interpretation of the perceptron is as follows: a perceptron with two inputs has the following linear function and is hence…
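
One concrete construction (the standard textbook decomposition, not taken from this question): XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)), so two hidden threshold units feeding a single output unit suffice, and each unit on its own is an ordinary perceptron. A sketch with hand-picked weights:

    import numpy as np

    def step(z):
        return (z > 0).astype(float)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

    # Hidden layer: first unit computes OR, second unit computes NAND (hand-picked weights).
    W_h = np.array([[ 1.0, -1.0],
                    [ 1.0, -1.0]])    # columns: OR unit, NAND unit
    b_h = np.array([-0.5,  1.5])

    # Output layer: AND of the two hidden units.
    w_o = np.array([1.0, 1.0])
    b_o = -1.5

    h = step(X @ W_h + b_h)   # hidden activations
    y = step(h @ w_o + b_o)   # XOR output

    print(y)                  # [0. 1. 1. 0.]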
6 votes • 1 answer

OpenCV::ML - is it possible to tell openCV which parts of data we want to send to which neuron?

So here is a simple example - 2 floats as data + 1 float as output: Layer 1: 2 neurons (2 inputs), Layer 2: 3 neurons (hidden layer), Layer 3: 3 neurons (hidden layer), Layer 4: 1 neuron (1…
Rella • 59,216