Discussion: RNN problem in GSoC branch
Victor Mateus Oliveira
2007-11-09 01:13:49 UTC
Hi,

I was trying to use the recurrent networks of the GSoC branch and I got some errors.
The program crashes with "*** glibc detected *** ./recurrent: double
free or corruption (out): 0x0804bf90 ***".

Maybe I'm doing something wrong.. :-)

I attached the program, a backtrace and a valgrind log. The valgrind
log shows some errors in the RNN code.

I know this branch isn't stable, but I was trying to use the
recurrent networks and I hope I can help.


Regards,
Victor
--
GNU/Linux user #446397 - http://counter.li.org
Steffen Nissen
2007-11-09 05:16:20 UTC
Try running the program in the debugger and get a stack trace; it definitely
looks like there is a problem with the FANN code.
--
Steffen Nissen - http://facebook.com/profile.php?id=595485027
Project Administrator - Fast Artificial Neural Network Library (FANN)
http://leenissen.dk/fann/
Victor Mateus Oliveira
2007-11-09 15:12:24 UTC
Hi,

I ran it with valgrind; I think its output is more complete than gdb's.

Did you receive the attachments? If the server blocked them, I can send them again.

Analysing the attached files, you will see things like these:

valgrind.log line 17, "==20829== Invalid write of size 4" -
fann_base_fully_recurrent.c:217
The loop on this line writes neuron_it->num_weights times into
neuron_it->sums, but neuron_it->sums was allocated with
neuron->num_outputs elements (line 78), and num_outputs is different
from num_weights. This is in the function
fann_neuron_constructor_fully_recurrent. It shows up in the valgrind log
just below as:
"==20829== Address 0x41C43C4 is 0 bytes after a block of size 4 alloc'd"

valgrind.log line 66, "==20829== Invalid write of size 4"
Here the problem is in the outer loop. It writes num_neurons times
into outputs, but outputs was allocated with ann->num_output elements (line 201).
As fann_recurrent.c line 195 shows, num_neurons must be greater than
num_outputs.
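
In C terms the pattern boils down to something like the sketch below (the
struct and field names are just illustrative, not the actual FANN code);
running it under valgrind reproduces the same kind of "Invalid write":

#include <stdio.h>
#include <stdlib.h>

struct toy_neuron {
    unsigned int num_outputs;   /* used to size the allocation */
    unsigned int num_weights;   /* used as the loop bound      */
    float *sums;
};

int main(void)
{
    struct toy_neuron n = { 1, 3, NULL };
    unsigned int i;

    /* allocated with num_outputs elements (one float, a block of size 4)... */
    n.sums = (float *) calloc(n.num_outputs, sizeof(float));
    if (n.sums == NULL)
        return 1;

    /* ...but written num_weights times: as soon as num_weights exceeds
     * num_outputs this writes past the end of the buffer, which is the
     * "Invalid write of size 4" that valgrind reports.  Sizing the
     * allocation and the loop with the same count would fix it. */
    for (i = 0; i < n.num_weights; i++)
        n.sums[i] = 0.0f;

    printf("sums[0] = %f\n", n.sums[0]);
    free(n.sums);
    return 0;
}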

I hope this helps.


Regards,
Victor
Vincenzo Di Massa
2007-11-10 09:43:10 UTC
Hi,
I took a look at the recurrent code for the first time...
You are right, the sizes of the arrays are wrong! neuron->sums has just one
element... blahhh!!!
It needs a lot more love. The current recurrent code is broken in so many
ways. It is also limited: it assumes one output per neuron, which is not true
for MIMO nets.

I think rewriting the whole thing will be easier than fixing it.

Does anyone want to give it a try? I can assist and explain the (not yet
documented) MIMO implementation.

Basically, with MIMO we now have
** net->layers->mimo_neuron->outputs **
whereas a non-MIMO implementation would just have
** net->layers->outputs **

In each layer of the MIMO implementation we have a new layer-like
neuron-container structure (the MIMO neuron) that contains "standard
neurons".
All the "standard neurons" inside the same MIMO neuron have the constraint
that they must share certain parameters: e.g. the activation function is
defined for each MIMO neuron, and all the standard neurons use the activation
function provided by their containing MIMO neuron. The same is true for the
algorithm used (MIMO neurons inside a layer can use different algorithms or
even different implementations - SSE, scalar, threaded, ... -) and many
other parameters.

Note: layers (in the MIMO fann) have
- num_neurons (the number of MIMO neurons)
- num_outputs (the number of standard neurons contained in all the MIMO neurons
of the layer)

The layer owns the output array: the layer assigns pieces of it to the MIMO
neurons for them to leave their computed results in.
The inputs also belong to the layer: they are passed to the MIMO neurons as a
fann_type** to read inputs from.
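
To make the ownership idea concrete, here is a very rough C sketch of that
layout (the names are made up for illustration, they are not the real FANN
structs):

struct sk_mimo_neuron {
    unsigned int num_outputs;   /* standard neurons inside this MIMO neuron */
    float *outputs;             /* points into the layer's output array     */
    float **inputs;             /* handed to the MIMO neuron by the layer   */
};

struct sk_layer {
    unsigned int num_neurons;   /* number of MIMO neurons                   */
    unsigned int num_outputs;   /* standard neurons over all MIMO neurons   */
    float *outputs;             /* owned by the layer                       */
    struct sk_mimo_neuron *neurons;
};

/* The layer hands each MIMO neuron a slice of its own output array. */
static void sk_layer_assign_outputs(struct sk_layer *layer)
{
    unsigned int i, offset = 0;
    for (i = 0; i < layer->num_neurons; i++) {
        layer->neurons[i].outputs = layer->outputs + offset;
        offset += layer->neurons[i].num_outputs;
    }
}

So each MIMO neuron only ever writes into the slice it was given, and the
layer never has to copy results around.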

That might seem confusing, but it's just another neuron container, useful for
making things faster and for making it easier to write new algorithms (once the
docs get well written :-) ).

Ciao
Vincenzo
Victor Mateus Oliveira
2007-11-11 23:40:09 UTC
Hi,

I'm not a specialist in ANNs (yet.. :-) ), but I would really love to
try writing this code.
I'll study the fann internal architecture and begin working.
Which base fann code should I use? HEAD? The GSoC branch?
I would need a CVS account, if possible.
From what you wrote, MIMO nets are not so hard to implement. I think they
give an easier way to write more complex nets.

Regards,
Victor
Steffen Nissen
2007-11-12 06:35:47 UTC
Hi Victor,

It is really good that you want to give it a go.

All changes should be made on the GSoC branch until that branch is merged
into the main branch. To start with, you will not need a CVS account;
you can just develop against the branch and send CVS patches to me or
Vincenzo, and we will make sure that they are integrated into the CVS branch.
If this goes well, I will give your sf.net user write access to the CVS
repository.

Best Regards,
Steffen
Vincenzo Di Massa
2007-11-12 11:51:59 UTC
Welcome to the new (want-to-be) fann developer :-)

Fann is so much fun :-)

The sad thing is that we seldom have as much time to have fun as we would like :-(

Welcome again.

Vincenzo
Victor Mateus Oliveira
2007-11-13 00:10:51 UTC
Hi!

I'm studying the fann base code to start my work. I'll use git as a
local repository to track my changes.
As soon as possible I'll send you a patch with, at least, the base code for the RNN.

Any doubts I have, I'll ask here.

Thanks for the welcome. :-)

Cheers,
Victor
Victor Mateus Oliveira
2007-11-15 02:45:40 UTC
Hi,

The first few basic questions... :-)
I have found that fann_generic and the files in the optimized folder use
a naming convention to define the algorithm, implementation and
activation function, built with the MAKE_NAME macro. But fann_som,
fann_gng and the current fann_recurrent declare their functions manually. I
also noted that some base fann functions use the names formed by
MAKE_NAME.
Is there a standard to use? Any of them? Or am I confusing something?

And about MIMO: to get my code right, can I take fann_base.c as an
example? It seems to be like what you explained.


Cheers,
Victor
Vincenzo Di Massa
2007-11-15 10:33:31 UTC
Post by Victor Mateus Oliveira
I have found that fann_generic and the files in the optimized folder use
a naming convention to define the algorithm, implementation and
activation function, built with the MAKE_NAME macro. But fann_som,
fann_gng and the current fann_recurrent declare their functions manually. I
also noted that some base fann functions use the names formed by
MAKE_NAME.
Is there a standard to use? Any of them? Or am I confusing something?
You are right...
The fann_som, fann_gng and fann_recurrent code was written before the
fann MIMO implementation was finished... the idea was there, but the code was
still to come.
That is why the code is not "uniform". It needs a clean-up. Feel free to fix
and move things around in your patchset if you like.
Using the MAKE_NAME stuff allows us to use many activation functions without
needing "switch" or long "if else" sequences in the code, so it is
desirable to use it.
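
If it helps, the general idea behind a macro like MAKE_NAME is plain
preprocessor token pasting; a simplified stand-alone example (not the actual
FANN macro, just the pattern) looks like this:

#include <stdio.h>

/* Simplified illustration of the token-pasting idea; the real MAKE_NAME
 * macro in fann may take different arguments. */
#define MAKE_NAME(impl, algo, act)  fann_ ## impl ## _ ## algo ## _ ## act

/* Expands to a function called fann_scalar_batch_sigmoid. */
static void MAKE_NAME(scalar, batch, sigmoid)(void)
{
    puts("running the scalar/batch/sigmoid variant");
}

int main(void)
{
    /* The same macro builds the call site, so the generic code never
     * needs a switch or if/else chain on the activation function. */
    MAKE_NAME(scalar, batch, sigmoid)();
    return 0;
}
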
Post by Victor Mateus Oliveira
And about MIMO: to get my code right, can I take fann_base.c as an
example? It seems to be like what you explained.
Yes, fann_base.c has most of the infrastructure you need. I'm not sure you
really need to change that code...
Take a look at src/optimized/scalar/fann_scalar_mimo_batch.c and
src/optimized/scalar/fann_scalar_mimo_rprop.c to understand how
I "subclassed" the fann_base.c code. I think you can write a
src/optimized/scalar/fann_scalar_mimo_XXXXX.c file where XXXXX is recurrent,
and then include it in fann_scalar_mimo.c.

Note that src/optimized/scalar/fann_scalar_mimo_batch.c does
not "subclass" the layers (it uses the "connected_any_any" type), but the
code to do so is commented at the beginning of the file (I left it as a
tutorial).
If you change the layer type, you have to create a network of that type
appropriately in the src/optimized/scalar/fann.c constructors.
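
For what it's worth, the "subclassing" is just the usual C trick of swapping
function pointers; very roughly (with invented names, not the real layer
struct), it looks like this:

/* Generic sketch of "subclassing" a layer by swapping function pointers;
 * the names are invented and do not match the real FANN layer type. */
struct vt_layer;

struct vt_layer_ops {
    void (*run)(struct vt_layer *layer);
    void (*destroy)(struct vt_layer *layer);
};

struct vt_layer {
    const struct vt_layer_ops *ops;   /* decides which "run" is used */
    /* ...layer data... */
};

static void run_connected_any_any(struct vt_layer *layer) { (void) layer; /* base behaviour      */ }
static void run_fully_recurrent(struct vt_layer *layer)   { (void) layer; /* recurrent behaviour */ }
static void destroy_generic(struct vt_layer *layer)       { (void) layer; }

static const struct vt_layer_ops connected_any_any_ops = { run_connected_any_any, destroy_generic };
static const struct vt_layer_ops fully_recurrent_ops   = { run_fully_recurrent,   destroy_generic };

/* A constructor picks the "subclass" simply by installing its table. */
static void vt_layer_make_recurrent(struct vt_layer *layer)
{
    layer->ops = &fully_recurrent_ops;
}

The forward pass then just calls layer->ops->run(layer) and never needs to
know which variant it is running.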

The steps are quite numerous, but the code you actually have to write should be
short. Writing the code this way lets it automatically use whatever activation
functions get written :-)

Ciao
Vincenzo
Post by Victor Mateus Oliveira
[]s
Victor
Post by Victor Mateus Oliveira
Hi!
I'm studying the fann base code to start out my work. I'll use git as
local repository to control my changes.
As soon as possible I send to you a patch with, at least, the base code for rnn.
Any doubt I had, I ask here.
Thanks for welcome. :-)
[]s
Victor
Post by Vincenzo Di Massa
Welcome to the new (want to be) fann developer :-)
Fann is so fun :-)
Tha sad thing is that we seldom have time to have fun as we would like ::-(
Welcome again.
Vincenzo
Post by Steffen Nissen
Hi Victor,
It is really good that you want to give it a go.
All changes should be made on the GSoC branch until this branch is
merged onto the main branch. To start out with you will not be
needing a CVS user, you can just develop up against the branch and
send cvs patches to me or Vincenzo and we will make sure that they
are integrated into the CVS branch. If this goes well I will give
your sf.net user write rights to the CVS repository.
Best Regards,
Steffen
Post by Victor Mateus Oliveira
Hi,
I'm not a especialist in ann (yet.. :-) ), but I really would love
to try write this code.
I'll study the fann internal architecture and begin to work.
What base fann code might I use? Head? GSoc branch?
I need a cvs user, if it's possible.
By what you wrote, MIMO nets is not so hard to implement. I think
its give a easier way to write more complex nets.
att,
Victor
Post by Vincenzo Di Massa
Hi,
I gave a look at the recurrent code for the first time...
You are right the sizes of the arrays is wrong! neuron->sums just have
on
Post by Vincenzo Di Massa
element... blahhh!!!
It needs a lot of love more. The current recurrent code is broken in so
many
Post by Vincenzo Di Massa
ways. It is also limited: it assumes 1 output per neuron, which is not
true
Post by Vincenzo Di Massa
for MIMO nets.
I think rewriting the whole think will be easier than fix it.
Anyone wants to give it a try? I can assist and explain the (not
yet documented) MIMO implementation.
Basically with MIMO we now have
** net->layers->mimo_neuron->outputs **
a non mimo implementation would just have
** net->layers->outputs **
In each layer of the MIMO implemntation we have a new layer like
neuron-container-structure (the MIMO neuron) that contains
"standard neurons".
All the "standard neurons" inside the same MIMO nerons have the
constraint
Post by Vincenzo Di Massa
that they must have similar parameters: e.g. the activation
function is defined for each MIMO neuron; all the standard
neurons use the
activation
Post by Vincenzo Di Massa
function provided by theyr containing MIMO neuron. The same is true for
the
Post by Vincenzo Di Massa
algorithm used (MIMO neurons inside a layer can use different
aglorithms
or
Post by Vincenzo Di Massa
even different implementations - SSE, scalar, theraded, ... - )
and many other parameters.
Note: layers (in the MIMO fann) have
- num neurons (number of MIMO neurons)
- num outputs (number of standard neurons contained in all the MIMO
neurons of
Post by Vincenzo Di Massa
the layer)
The layer owns the output array: layers assigns pieces of it to
the MIMO neurons for them to leave the computed results in.
Also the inputs belong to the layer: they are passed to the MIMO
neurons
as a
Post by Vincenzo Di Massa
**fann_type to read inputs from.
That might seem confusing but it's just another neuron container
usefull
to
Post by Vincenzo Di Massa
make things faster and to make writing new algorithms easier (once the
docs
Post by Vincenzo Di Massa
get well written :-) ).
Ciao
Vincenzo
Victor Mateus Oliveira
2007-11-17 23:25:30 UTC
Permalink
Hi,

I'm writing my code with MAKE_NAME. We could update the som and gng code
to use this standard as well.
Another thing I noticed is the per-net-type parameters in the fann
structure: there is a member for each net type. It could be a C union
or a void pointer. Or we could adopt an inheritance-like way to declare
the new nets, where fann is the parent and the other net structures have
a struct fann as their first member so they can be cast to it (see the
sketch below). And so on...
Well, it's a project decision. I'm just throwing out some ideas. :-)
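Something like this, just as a sketch - struct fann_rnn and its members
here are made-up names, not code from the branch:

struct fann
{
    unsigned int num_input;      /* stands in for the real struct fann */
    unsigned int num_output;
    /* ... common members shared by every net type ... */
};

struct fann_rnn                  /* hypothetical recurrent-net struct  */
{
    struct fann  base;           /* must be the FIRST member           */
    unsigned int context_size;   /* example of a net-specific member   */
};

/* Because 'base' is the first member, a pointer to a struct fann_rnn
 * may be converted to a pointer to struct fann and used by the
 * generic code:
 *
 *     struct fann_rnn *rnn = ...;
 *     struct fann *ann = (struct fann *) rnn;
 */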
Soon I'll send my patch.

One of these days I'll write a more detailed email; right now I have to
run. :-)

best regards,
Victor
Post by Vincenzo Di Massa
Post by Victor Mateus Oliveira
Hi,
The first few basic questions... :-)
I found that fann_generic and the files in the optimized folder use a
naming scheme to define the algorithm, implementation and
activation_function via the MAKE_NAME macro. But fann_som, fann_gng and
the current fann_recurrent declare their functions manually. I also
noted that some base fann functions use the names formed by MAKE_NAME.
Is there a standard to use? Or am I just confused?
You are right...
The fann_som, fann_gng and fann_recurrent code was written before the
fann MIMO implementation was finished... the idea was there but the
code was still to come.
That is why the code is not "uniform". It needs a clean-up. Feel free
to fix and move things around in your patchset if you like.
Using the MAKE_NAME machinery allows us to support many activation
functions without needing "switch" or long "if else" sequences in the
code, so it is desirable to use it.
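Roughly, the idea is token pasting, along these lines (illustration
only - the real MAKE_NAME in the branch may take different arguments):

/* MAKE_NAME(run, scalar, sigmoid) expands to the single identifier
 * fann_run_scalar_sigmoid, so every (algorithm, implementation,
 * activation) combination gets its own compiled function and no
 * run-time switch on the activation function is needed.            */
#define MAKE_NAME(algo, impl, act)  fann_ ## algo ## _ ## impl ## _ ## act

void MAKE_NAME(run, scalar, sigmoid)(void);  /* fann_run_scalar_sigmoid */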
Post by Victor Mateus Oliveira
And about MIMO: to get my code right, can I take fann_base.c as an
example? It seems to be what you explained.
Yes, fann_base.c has most of the infrastructure you need. I'm not sure
you really need to change that code...
Take a look at src/optimized/scalar/fann_scalar_mimo_batch.c and
src/optimized/scalar/fann_scalar_mimo_rprop.c to understand how
I "subclassed" the fann_base.c code. I think you can write a
src/optimized/scalar/fann_scalar_mimo_XXXXX.c file where XXXXX is
recurrent, then import it in fann_scalar_mimo.c.
Note that src/optimized/scalar/fann_scalar_mimo_batch.c does not
"subclass" the layers (it uses the "connected_any_any" type), but the
code to do so is commented at the beginning of the file (I left it as
a tutorial).
If you change the layer type you have to create a network of that type
appropriately in the src/optimized/scalar/fann.c constructors.
There are quite a few steps, but the code you actually have to write
should be short. Writing the code this way lets it automatically use
whatever activation function gets written :-)
Ciao
Vincenzo
Post by Victor Mateus Oliveira
[]s
Victor
Post by Victor Mateus Oliveira
Hi!
I'm studying the fann base code to start out my work. I'll use git as
a local repository to track my changes.
As soon as possible I'll send you a patch with, at least, the base
code for the RNN.
Any doubts I have, I'll ask here.
Thanks for the welcome. :-)
[]s
Victor
Post by Vincenzo Di Massa
Welcome to the new (wannabe) fann developer :-)
Fann is so much fun :-)
The sad thing is that we seldom have as much time to have fun as we
would like :-(
Welcome again.
Vincenzo
Post by Steffen Nissen
Hi Victor,
It is really good that you want to give it a go.
All changes should be made on the GSoC branch until this branch is
merged into the main branch. To start out with you will not need a
CVS user; you can just develop against the branch and send CVS
patches to me or Vincenzo, and we will make sure that they are
integrated into the CVS branch. If this goes well I will give your
sf.net user write rights to the CVS repository.
Best Regards,
Steffen
Post by Victor Mateus Oliveira
Hi,
I'm not a specialist in ANNs (yet... :-) ), but I would really love
to try writing this code.
I'll study the fann internal architecture and start working.
Which base fann code should I use? HEAD? The GSoC branch?
I'll need a CVS account, if possible.
From what you wrote, MIMO nets are not so hard to implement. I think
they give an easier way to write more complex nets.
att,
Victor
--
GNU/Linux user #446397 - http://counter.li.org

Vincenzo Di Massa
2007-11-22 11:01:11 UTC
Permalink
Hi,
any news?
If you get stuck somewhere, just ask more questions :-)
I'm all for having a new developer and a new "owner of the MIMO knowledge" :-)

Ciao
Vincenzo




