Mathias Franzius
2009-02-04 15:03:07 UTC
Dear group,
Does FANN support sigma-pi style activation functions, i.e., where
the activation of a neuron is determined by a weighted sum of products of its inputs?
If not, could I extend FANN myself to do so? Is there any documentation on how to get started?
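For concreteness, by "sigma-pi" I mean a unit that computes something like the
following (a plain C sketch with pure second-order terms only, not FANN code):

    /* Sketch of a sigma-pi unit: the activation is a weighted sum of
       pairwise products of the inputs; w holds n*(n+1)/2 weights. */
    double sigma_pi(const double *x, int n, const double *w)
    {
        double sum = 0.0;
        int k = 0;
        for (int i = 0; i < n; i++)
            for (int j = i; j < n; j++)
                sum += w[k++] * x[i] * x[j];
        return sum;
    }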
My use case: I currently have a hierarchical feed-forward network that is trained unsupervised,
layer by layer, without labels and thus without backpropagation.
The nonlinearity in this net is a quadratic expansion of the inputs in each layer.
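(For two inputs x1 and x2, for example, the expanded input would be
(x1, x2, x1^2, x1*x2, x2^2).)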
Now I plan to export this pretrained structure to FANN and add some supervised, backprop-based learning on top.
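Roughly, I imagine the FANN side looking something like this (just a sketch:
the layer sizes and the training file name are made up, and I have not yet
checked how to write the pretrained weights in):

    #include <fann.h>

    int main(void)
    {
        /* Layer sizes are placeholders for the pretrained hierarchy:
           15 inputs, one hidden layer of 8 neurons, 2 outputs. */
        struct fann *ann = fann_create_standard(3, 15, 8, 2);

        fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
        fann_set_activation_function_output(ann, FANN_LINEAR);
        fann_set_training_algorithm(ann, FANN_TRAIN_RPROP);

        /* The pretrained weights would have to be written into ann here,
           e.g. via fann_set_weight() if the installed version provides it. */

        /* "labels.data" is a made-up file name in FANN's training format. */
        struct fann_train_data *data = fann_read_train_from_file("labels.data");
        fann_train_on_data(ann, data, 500, 10, 0.001f);

        fann_destroy_train(data);
        fann_destroy(ann);
        return 0;
    }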
Thanks!
Mathias