This class implements a generic (acyclic) recurrent neural network whose input is a labeled graph. It works much like a regular neural network, but uses BPTS (back-propagation through structure) instead of plain BP to compute gradients. For this to work, the weights must be instances of RWeight. Also (these conditions are not yet checked), the size of VI_R should be at least the size of VO_R times the maximum indegree of the input graphs; there is no point in making it larger.
std::vector<InputTerminal*> VI_L
std::vector<Neuron*> VI_R
These are created as Neuron(id,one).
std::vector<Neuron*> VO_L
std::vector<Neuron*> VO_R
void setInput(const LabelGraph &str) throw (length_error)
void setInput(const LabelNode &node) throw (length_error)
void setInput(const LabelNode &node, const std::vector<double> &inV) throw (length_error)
Like the setInput() method, but changes the perceived label on the given node to the given vector. No check is performed.
void setDelta(const LabelNode &node, const std::vector<double> &deltaV) throw (length_error)
void setRDelta(const LabelNode &node, const std::vector<double> &deltaV) throw (length_error)
void value(const LabelNode &node, std::vector<double> *outV) throw (length_error)
void valueR(const LabelNode &node, std::vector<double> *outV) throw (length_error)
void clean()
Undoes the effect of a previous setInput(const LabelGraph&).
ostream& operator<<(ostream &o,const RecNeuralNet &n)
Written using writeNodes<>() and writeLinks<>(). The network is written in its folded state.
istream& operator>>(istream &i,RecNeuralNet &n)
Read using readNodes<>() and readLinks<>(), with NeuronInserter, ITInserter and LinkInserter.