Questions tagged [perceptron]

Perceptron is a basic linear classifier that outputs binary labels. If the training data set is not linearly separable, the learning algorithm cannot converge.

The classic XOR dataset is not linearly separable, so a single perceptron cannot classify it. By adding nonlinear hidden layers between the input and output, one can separate such data. With enough hidden units and training data, the resulting network can approximate any well-defined function to arbitrary precision. This generalization is known as the multilayer perceptron.

For more details, see the wiki.
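As an illustration of the description above, here is a minimal pure-Python perceptron sketch (all names and data are invented for the example): the learning rule converges on AND, which is linearly separable, but keeps cycling on XOR, so the loop needs an epoch cap.

```python
def train_perceptron(samples, labels, epochs=100, lr=1.0):
    """Train weights [w1, w2] and bias b; labels are 0/1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            if pred != y:
                errors += 1
                update = lr * (y - pred)
                w[0] += update * x1
                w[1] += update * x2
                b += update
        if errors == 0:          # converged: every sample classified correctly
            return w, b, True
    return w, b, False           # epoch budget exhausted without converging

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
_, _, and_converged = train_perceptron(inputs, [0, 0, 0, 1])  # AND: separable
_, _, xor_converged = train_perceptron(inputs, [0, 1, 1, 0])  # XOR: not separable
print(and_converged, xor_converged)  # True False
```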

440 questions
6
votes
1 answer

Keras correct input shape for multilayer perceptron

I'm trying to make a basic MLP example in keras. My input data has the shape train_data.shape = (2000,75,75) and my testing data has the shape test_data.shape = (500,75,75). 2000 and 500 are the numbers of samples of training and test data (in other…
20XX
  • 147
  • 5
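A common resolution for the question above: a plain MLP built from Dense layers expects 2-D input of shape (n_samples, n_features), so each 75x75 sample must be flattened first. The arrays below are zero-filled stand-ins with the shapes from the question; a `keras.layers.Flatten()` first layer would achieve the same thing inside the model.

```python
import numpy as np

# Hypothetical stand-ins for the asker's arrays (shapes from the question).
train_data = np.zeros((2000, 75, 75))
test_data = np.zeros((500, 75, 75))

# Flatten each 75x75 sample into a 5625-long feature vector.
train_flat = train_data.reshape(len(train_data), -1)
test_flat = test_data.reshape(len(test_data), -1)
print(train_flat.shape, test_flat.shape)  # (2000, 5625) (500, 5625)
```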
6
votes
2 answers

Setting the number of output nodes in scikit-learn's MLPClassifier

I'm currently experimenting with scikit-learn's neural net capabilities. Is there a way to set the number of output nodes in its MLPClassifier? I know you can set the number of hidden layers by passing it as parameters like: clf =…
Oliver Atienza
  • 518
  • 3
  • 8
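For reference on the question above: `MLPClassifier` has no explicit output-size parameter; the output layer is sized automatically from the distinct labels seen in `y` during `fit()`. A small sketch with made-up data (assuming scikit-learn is installed):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy data invented for the sketch: three classes -> three output nodes.
X = np.array([[0.0], [1.0], [2.0], [0.1], [1.1], [2.1]])
y = np.array([0, 1, 2, 0, 1, 2])

# hidden_layer_sizes controls the hidden layers; the output layer is inferred.
clf = MLPClassifier(hidden_layer_sizes=(5,), max_iter=200, random_state=0)
clf.fit(X, y)
print(len(clf.classes_))  # 3
```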
6
votes
3 answers

Advice for algorithm choice

I have to do a project that tries to scan the shape of vehicles and detect what type of vehicle it is. The scanning will be performed with sensors called "vehicle scanners": they are just 50 beams of light, each beam with a receptor and emitter as…
5
votes
1 answer

Optimal Feature-to-Instance Ratio in Back Propagation Neural Network

I'm trying to perform leave-one-out cross validation for modelling a particular problem using Back Propagation Neural Network. I have 8 features in my training data and 20 instances. I'm trying to make the NN learn a function in building a…
psteelk
  • 1,105
  • 2
  • 13
  • 23
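The question above hinges on leave-one-out cross-validation with few instances. As a generic sketch (not the asker's network): with 20 instances, LOOCV means 20 fits, each on 19 samples. The trivial majority-class "model" below stands in for the back-propagation network purely to keep the example runnable.

```python
def loocv(samples, labels, fit, predict):
    """Return LOOCV accuracy; fit/predict are user-supplied callables."""
    correct = 0
    for i in range(len(samples)):
        train_x = samples[:i] + samples[i + 1:]   # leave sample i out
        train_y = labels[:i] + labels[i + 1:]
        model = fit(train_x, train_y)
        if predict(model, samples[i]) == labels[i]:
            correct += 1
    return correct / len(samples)

# Demo "model": always predict the majority class of the training fold.
fit = lambda xs, ys: max(set(ys), key=ys.count)
predict = lambda model, x: model
acc = loocv(list(range(20)), [0] * 12 + [1] * 8, fit, predict)
print(acc)  # 0.6
```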
5
votes
2 answers

Why does single-layer perceptron converge so slow without normalization, even when the margin is large?

This question is totally re-written after I confirmed my results (the Python Notebook can be found here) with a piece of code written by someone else (can be found here). Here is that code instrumented by me to work with my data and to count epochs…
AlwaysLearning
  • 5,895
  • 2
  • 16
  • 50
5
votes
2 answers

How are neurons in deeper layers capable of making more complex decisions than neurons in shallower/earlier layers?

I'm brand new to ML and am reading the online book at http://neuralnetworksanddeeplearning.com. In the first chapter the author describes a single perceptron using a Cheese Festival example. Basically he illustrates an example of a perceptron trying…
smeeb
  • 22,487
  • 41
  • 197
  • 389
5
votes
1 answer

Multilayer-perceptron, visualizing decision boundaries (2D) in Python

I have programmed a multilayer perceptron for binary classification. As I understand it, one hidden layer can be represented using just lines as decision boundaries (one line per hidden neuron). This works well and can easily be plotted just using…
johnblund
  • 332
  • 3
  • 19
5
votes
1 answer

plot decision boundary matplotlib

I am very new to matplotlib and am working on simple projects to get acquainted with it. I was wondering how I might plot the decision boundary which is the weight vector of the form [w1,w2], which basically separates the two classes, let's say C1 and…
anonuser0428
  • 8,987
  • 18
  • 55
  • 81
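For the plotting question above, a minimal sketch: for a linear classifier, every point on the boundary satisfies w1*x + w2*y + b = 0, so one can solve for y and hand the resulting line to matplotlib. The weights below are hypothetical, not from the question.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")            # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical trained weights [w1, w2] and bias b.
w1, w2, b = 2.0, -1.0, 0.5

# Points on the boundary satisfy w1*x + w2*y + b = 0, i.e. y = -(w1*x + b)/w2.
xs = np.linspace(-3, 3, 100)
ys = -(w1 * xs + b) / w2

plt.plot(xs, ys, "k--", label="decision boundary")
plt.legend()

# Sanity check: every plotted point lies on the boundary.
print(np.allclose(w1 * xs + w2 * ys + b, 0))  # True
```

Note this only works when w2 is nonzero; a vertical boundary (w2 = 0) needs `plt.axvline(-b / w1)` instead.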
5
votes
1 answer

Multilayer perceptron implementation: weights go crazy

I am writing a simple implementation of the MLP with a single output unit (binary classification). I need it for teaching purposes, so I can't use an existing implementation :( I managed to create a working dummy model and implemented training…
Dmytro Prylipko
  • 3,979
  • 1
  • 18
  • 39
5
votes
1 answer

multi layer perceptron - finding the "separating" curve

With a single-layer perceptron it's easy to find the equation of the "separating line" (I don't know the professional term), the line that separates between 2 types of points, based on the perceptron's weights, after it was trained. How can I find in a…
user1767774
  • 1,585
  • 1
  • 21
  • 29
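Regarding the question above: for a multi-layer perceptron there is generally no closed-form curve. The usual trick is to evaluate the network on a dense grid and trace where the output crosses the 0.5 threshold. A sketch with made-up weights standing in for a trained network:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_output(points, W1, b1, W2, b2):
    """Tiny 2-input, one-hidden-layer, 1-output network; points is (n, 2)."""
    hidden = sigmoid(points @ W1 + b1)
    return sigmoid(hidden @ W2 + b2).ravel()

# Made-up weights standing in for a trained network.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), rng.normal(size=4)
W2, b2 = rng.normal(size=(4, 1)), rng.normal(size=1)

# Evaluate on a dense grid; the "separating curve" is the 0.5 level set,
# which matplotlib can trace with plt.contour(xs, ys, out, levels=[0.5]).
xs, ys = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-2, 2, 200))
grid = np.column_stack([xs.ravel(), ys.ravel()])
out = mlp_output(grid, W1, b1, W2, b2).reshape(xs.shape)
print(out.shape)  # (200, 200)
```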
4
votes
1 answer

Can I use Hadoop to train a neural network?

I want to train a neural network with the help of Hadoop. We know when training a neural network, weights to each neuron are altered every iteration, and each iteration depends on the previous. I'm new to Hadoop and am not quite familiar with…
4
votes
1 answer

Passing an array to numpy.dot() in Python implementation of Perceptron Learning Model

I'm trying to put together a Python implementation of a single-layer Perceptron classifier. I've found the example in Sebastian Raschka's book 'Python Machine Learning' very useful, but I have a question about one small part of his implementation.…
jonrossi
  • 71
  • 7
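For the numpy.dot question above: with a 2-D matrix and a 1-D vector, `np.dot` computes one inner product per row. Raschka-style perceptrons typically compute the net input as `np.dot(X, w[1:]) + w[0]`, treating `w[0]` as the bias. The shapes and values below are illustrative, not taken from the book.

```python
import numpy as np

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])      # (2 samples, 2 features)
w = np.array([0.5, 0.1, -0.2])  # w[0] is the bias, w[1:] the feature weights

net_input = np.dot(X, w[1:]) + w[0]
print(net_input)  # approximately [0.2, 0.0]
```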
4
votes
2 answers

Replacing a network of perceptrons by sigmoid neurons

This site gives a bit of mathematical elaboration before we introduce sigmoid neurons (neurons with a sigmoid activation function), namely about perceptrons. http://neuralnetworksanddeeplearning.com/chap1.html It starts off with perceptrons and goes on…
4
votes
1 answer

Perceptron Learning Algorithm taking a lot of iterations to converge?

I am solving homework 1 of the Caltech Machine Learning Course (http://work.caltech.edu/homework/hw1.pdf). To solve questions 7-10 we need to implement a PLA. This is my implementation in python: import sys,math,random w=[] # stores the weights data=[]…
user2179293
  • 417
  • 8
  • 19
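On the convergence question above: a compact PLA sketch (not the asker's code) that picks a misclassified point, updates w += y * x, and repeats until every point is classified correctly. On linearly separable data this is guaranteed to terminate, but the iteration count grows as the margin between the classes shrinks, which is one reason a correct PLA can still take many iterations. The data below is invented, with a minimum margin enforced.

```python
import numpy as np

def pla(X, y, max_iters=10000):
    """X is (n, d) with a leading 1 for the bias; y is +/-1. Returns (w, iters)."""
    w = np.zeros(X.shape[1])
    for iters in range(max_iters):
        preds = np.sign(X @ w)
        wrong = np.where(preds != y)[0]
        if wrong.size == 0:
            return w, iters              # every point classified correctly
        i = wrong[0]                     # first misclassified point
        w += y[i] * X[i]
    return w, max_iters

# Toy separable data labelled by the sign of x1 - x2, with points too close
# to the true boundary filtered out so a comfortable margin exists.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, size=(50, 2))
pts = pts[np.abs(pts[:, 0] - pts[:, 1]) > 0.2]
y = np.where(pts[:, 0] - pts[:, 1] > 0, 1, -1)
X = np.column_stack([np.ones(len(pts)), pts])  # prepend bias column

w, iters = pla(X, y)
print(np.all(np.sign(X @ w) == y))  # True
```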
4
votes
1 answer

Classify data with a perceptron in MATLAB

I am generating random data that can be separated linearly. I want to write my own version of a perceptron to separate them. I know there are some posts that have similar problems, but I can't find my mistake. I am really stuck. The algorithm…
Chris
  • 43
  • 1
  • 3