PyBrain: How can I put specific weights in a neural network?
I am trying to recreate a neural network from given specifications. It has 3 inputs, one hidden layer, and one output. My problem is that the weights are also given, so I don't need to train.

I was thinking maybe I could save the training of a neural network with the same structure and then change the stored values accordingly. Do you think that will work? Any other ideas? Thanks.

Neural Network Code:

    # imports needed for the snippet below
    from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection
    from pybrain.supervised.trainers import BackpropTrainer

    net = FeedForwardNetwork()
    inp = LinearLayer(3)
    h1 = SigmoidLayer(1)
    outp = LinearLayer(1)

    # add modules
    net.addOutputModule(outp)
    net.addInputModule(inp)
    net.addModule(h1)

    # create connections
    net.addConnection(FullConnection(inp, h1))
    net.addConnection(FullConnection(h1, outp))

    # finish up
    net.sortModules()


    # ds is a SupervisedDataSet with 3 inputs and 1 target, defined elsewhere
    trainer = BackpropTrainer(net, ds)
    trainer.trainUntilConvergence()

Code to save and load the training, from How to save and recover PyBrain training?:

    # Using NetworkWriter
    from pybrain.tools.shortcuts import buildNetwork
    from pybrain.tools.xml.networkwriter import NetworkWriter
    from pybrain.tools.xml.networkreader import NetworkReader

    net = buildNetwork(2, 4, 1)

    NetworkWriter.writeToFile(net, 'filename.xml')
    net = NetworkReader.readFrom('filename.xml')
Bullbat answered 3/3, 2012 at 21:6 Comment(1)
If you find the answer helpful, please mark it as accepted ;) – Incinerator

I was curious how reading an already trained network (with the XML tool) is done, because it means the network weights can somehow be set. In the NetworkReader documentation I found that you can set parameters with _setParameters().

However, the leading underscore marks a private method, so calling it could potentially have side effects. Also keep in mind that the weight vector must have the same length as that of the originally constructed network.

Example

>>> import numpy
>>> from pybrain.tools.shortcuts import buildNetwork
>>> net = buildNetwork(2,3,1)
>>> net.params
array([...some random values...])
>>> len(net.params)
13
>>> new_params = numpy.array([1.0]*13)
>>> net._setParameters(new_params)
>>> net.params
array([1.0, ..., 1.0])

Another important thing is to put the values in the right order. For the example above, the order is:

[  1., 1., 1., 1., 1., 1.,      1., 1., 1.,        1.,       1., 1., 1.    ] 
     input->hidden0            hidden0->out     bias->out   bias->hidden0   
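As a sanity check on this layout, here is a minimal pure-Python sketch (no PyBrain required) that slices a 13-element parameter vector into those four blocks and runs one activation of the 2-3-1 network from the example above. All weights are set to 1.0, so the ordering inside each block does not affect the result:

```python
import math

def forward(params, x):
    """Forward pass of a 2-3-1 net (sigmoid hidden, linear output),
    slicing a flat parameter vector in the order shown above."""
    in_hid   = params[0:6]    # input -> hidden0 (6 weights)
    hid_out  = params[6:9]    # hidden0 -> out   (3 weights)
    bias_out = params[9]      # bias -> out      (1 weight)
    bias_hid = params[10:13]  # bias -> hidden0  (3 weights)

    # hidden activations: sigmoid(w . x + bias), one row of weights per unit
    hidden = []
    for j in range(3):
        s = sum(in_hid[j * 2 + i] * x[i] for i in range(2)) + bias_hid[j]
        hidden.append(1.0 / (1.0 + math.exp(-s)))

    # linear output unit
    return sum(hid_out[j] * hidden[j] for j in range(3)) + bias_out

params = [1.0] * 13
print(forward(params, [0.5, -0.5]))  # sigmoid(1.0) * 3 + 1.0 ≈ 3.19318
```

With all-ones weights and inputs [0.5, -0.5], each hidden unit sees 0.5 - 0.5 + 1 = 1.0, so the output is 3·sigmoid(1) + 1.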

To determine which weights belong to which connections between layers, try this:

# net is our neural network from previous example
for c in [connection for connections in net.connections.values() for connection in connections]:
    print("{} -> {} => {}".format(c.inmod.name, c.outmod.name, c.params))

Anyway, I still don't know the exact order of the weights between layers...
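Regarding the order inside a single FullConnection block: PyBrain's FullConnection appears to apply its flat params reshaped row-major to shape (outdim, indim), so params[j*indim + i] would connect input unit i to output unit j. A tiny index helper under that assumption (verify against your own network before relying on it):

```python
def weight_index(i, j, indim):
    """Index into a FullConnection's flat params for input unit i -> output
    unit j, assuming row-major (outdim, indim) storage (an assumption here)."""
    return j * indim + i

# input->hidden0 block of the 2-3-1 net above: indim=2, outdim=3.
# All weights feeding hidden unit 0 come first, then hidden unit 1, ...
print(weight_index(0, 0, indim=2))  # -> 0
print(weight_index(1, 0, indim=2))  # -> 1
print(weight_index(0, 2, indim=2))  # -> 4
```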

Ihs answered 8/1, 2013 at 0:1 Comment(1)
The ordering of weights in the parameters vector is uniquely defined, see here: https://mcmap.net/q/959291/-pybrain-how-to-print-a-network-nodes-and-weights – Medial
