I'm not a specialist in ANNs (yet.. :-) ), but I would really love to
try writing this code.
I'll study the FANN internal architecture and begin to work.
I need a CVS account, if possible.
From what you wrote, MIMO nets are not so hard to implement. I think they
give an easier way to write more complex nets.
Post by Vincenzo Di Massa
Hi,
I took a look at the recurrent code for the first time...
You are right, the sizes of the arrays are wrong! neuron->sums has just one
element... blahhh!!!
It needs a lot more love. The current recurrent code is broken in so many
ways. It is also limited: it assumes one output per neuron, which is not true
for MIMO nets.
I think rewriting the whole thing will be easier than fixing it.
Does anyone want to give it a try? I can assist and explain the (not yet
documented) MIMO implementation.
Basically with MIMO we now have
** net->layers->mimo_neuron->outputs **
whereas a non-MIMO implementation would just have
** net->layers->outputs **
In each layer of the MIMO implementation we have a new layer-like
neuron-container structure (the MIMO neuron) that contains "standard
neurons".
All the "standard neurons" inside the same MIMO neuron have the constraint
that they must share parameters: e.g. the activation function is
defined per MIMO neuron, and all the standard neurons use the activation
function provided by their containing MIMO neuron. The same is true for the
algorithm used (MIMO neurons inside a layer can use different algorithms or
even different implementations - SSE, scalar, threaded, ... - ) and many
other parameters.
Note: layers (in the MIMO fann) have
- num_neurons (the number of MIMO neurons)
- num_outputs (the number of standard neurons contained in all the MIMO
neurons of the layer)
The layer owns the output array: the layer assigns pieces of it to the MIMO
neurons for them to leave their computed results in.
The inputs also belong to the layer: they are passed to the MIMO neurons as a
fann_type ** to read inputs from.
That might seem confusing, but it's just another neuron container, useful to
make things faster and to make writing new algorithms easier (once the docs
get well written :-) ).
Ciao
Vincenzo
Post by Victor Mateus Oliveira
Hi,
I ran it with valgrind. I think it's more thorough than gdb.
Did you receive the attachments? If the server blocked them, I can send
them again.
valgrind.log line 17, "==20829== Invalid write of size 4" -
fann_base_fully_recurrent.c:217
The loop at this line tries to write neuron_it->num_weights times into
neuron_it->sums, but neuron_it->sums was allocated with
neuron->num_outputs elements (line 78), and num_outputs is different from
num_weights. This is in the function
fann_neuron_constructor_fully_recurrent. It shows up in the valgrind log as
"==20829== Address 0x41C43C4 is 0 bytes after a block of size 4 alloc'd".
valgrind.log line 66, "==20829== Invalid write of size 4"
Now the problem is in the outer loop. It tries to write num_neurons times
into outputs, but outputs was allocated with ann->num_output elements
(line 201). As fann_recurrent.c line 195 shows, num_neurons must be
greater than num_outputs.
I hope that helps.
Regards,
Victor
Post by Steffen Nissen
Try running the program in the debugger and get a stack trace; it
definitely looks like there is a problem with the FANN code.
Post by Victor Mateus Oliveira
Hi,
I was trying to use the recurrent networks from the gsoc branch and I got
some errors.
Post by Victor Mateus Oliveira
The program crashes with "*** glibc detected *** ./recurrent: double
free or corruption (out): 0x0804bf90 ***".
Maybe I'm doing something wrong.. :-)
I attached the program, a backtrace and a valgrind log. The valgrind log
shows some errors in the rnn code.
I know that this branch isn't stable, but I was trying to use
recurrent networks and I hope I can help.
Regards,
Victor
--
GNU/Linux user #446397 - http://counter.li.org
-----------------------------------------------------------------------
-- This SF.net email is sponsored by: Splunk Inc.
Still grepping through log files to find problems? Stop.
Now Search log events and configuration files using AJAX and a browser.
Download your FREE copy of Splunk now >> http://get.splunk.com/
_______________________________________________
Fann-general mailing list
https://lists.sourceforge.net/lists/listinfo/fann-general
--
Steffen Nissen - http://facebook.com/profile.php?id=595485027
Project Administrator - Fast Artificial Neural Network Library (FANN)
http://leenissen.dk/fann/