This repository has been archived by the owner on Jul 10, 2021. It is now read-only.

Backpropagation and Structure of a Neural Network in scikit-neuralnetwork #226

albayraktaroglu opened this issue Feb 7, 2017 · 0 comments

Comments


I am trying to learn neural networks using the scikit-neuralnetwork framework. I know the basics of neural networks and am now trying to implement one with scikit-learn, but I am confused on two points.

1- What is the structure of the NN given below? In some examples I have seen, people don't count the input layer as a layer; otherwise, I read this as a two-layer NN with an input layer of 100 nodes and 1 node at the output layer. (I have put a rough forward-pass sketch after the snippet to show how I picture it.)

2- Does scikit-neuralnetwork do backpropagation within the code that I put below, or how else does it train the system and change the weights of the network? (My guess at what a single training update looks like is sketched at the end.)

Thank you!

from sknn.mlp import Classifier, Layer

nn = Classifier(
    layers=[
        Layer("Maxout", units=100, pieces=2),  # maxout layer: 100 units, 2 linear pieces each
        Layer("Softmax")],                     # softmax output layer
    learning_rate=0.001,
    n_iter=25)

nn.fit(X_train, y_train)  # X_train, y_train: training features and labels
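
For reference, here is how I currently picture the forward pass of this architecture, as a minimal numpy sketch. The sizes n_features and n_classes, the weight shapes, and the forward helper are all my own assumptions for illustration, not anything taken from sknn:

import numpy as np

rng = np.random.default_rng(0)
n_features, n_units, n_pieces, n_classes = 20, 100, 2, 3  # assumed sizes

# One weight matrix and bias per linear piece of the maxout layer.
W = rng.standard_normal((n_pieces, n_features, n_units)) * 0.01
b = np.zeros((n_pieces, n_units))

# Softmax output layer parameters.
V = rng.standard_normal((n_units, n_classes)) * 0.01
c = np.zeros(n_classes)

def forward(x):
    # Maxout: elementwise max over the linear pieces.
    pieces = np.stack([x @ W[p] + b[p] for p in range(n_pieces)])  # (pieces, units)
    h = pieces.max(axis=0)                                         # (units,)
    # Softmax: exponentiate the class scores and normalize.
    z = h @ V + c
    e = np.exp(z - z.max())
    return e / e.sum()

probs = forward(rng.standard_normal(n_features))  # class probabilities, sums to 1

Under this reading, the 100 maxout units would be a hidden layer and Softmax the output layer, which is exactly why the layer counting confuses me.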


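And here is my guess at how fit() would train such a network: backpropagate the cross-entropy gradient and take plain gradient-descent steps on every weight. Again a self-contained sketch under my own assumptions (the shapes, the train_step helper, the loss), not sknn's actual internals:

import numpy as np

rng = np.random.default_rng(0)
n_features, n_units, n_pieces, n_classes = 20, 100, 2, 3  # assumed sizes

W = rng.standard_normal((n_pieces, n_features, n_units)) * 0.01  # maxout weights
b = np.zeros((n_pieces, n_units))                                # maxout biases
V = rng.standard_normal((n_units, n_classes)) * 0.01             # softmax weights
c = np.zeros(n_classes)                                          # softmax biases

def train_step(x, y, lr=0.001):
    # Forward pass.
    pieces = np.stack([x @ W[p] + b[p] for p in range(n_pieces)])
    winner = pieces.argmax(axis=0)      # which linear piece won, per unit
    h = pieces.max(axis=0)
    z = h @ V + c
    e = np.exp(z - z.max())
    probs = e / e.sum()

    # Backward pass for cross-entropy loss with integer class target y.
    dz = probs.copy()
    dz[y] -= 1.0                        # dL/dz of softmax + cross-entropy
    dV = np.outer(h, dz)
    dc = dz
    dh = V @ dz
    # Maxout routes each unit's gradient to its winning piece only.
    dW = np.zeros_like(W)
    db = np.zeros_like(b)
    for j in range(n_units):
        dW[winner[j], :, j] = x * dh[j]
        db[winner[j], j] = dh[j]

    # Gradient-descent update, in place.
    for param, grad in ((W, dW), (b, db), (V, dV), (c, dc)):
        param -= lr * grad

train_step(rng.standard_normal(n_features), y=1)

If fit() does something fundamentally different from this loop repeated n_iter times over the training data, that is exactly what I would like to understand.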