[ODE] Virtual Creature Sample Code here

Rob Leclerc leclerc at cpsc.ucalgary.ca
Tue Apr 15 18:58:01 2003


That is what you should be getting. This is the evolution stage, and
early in the evolution stage the neural networks won't be very evolved,
so there won't be much excitement. If you click on the drawstuff window
you will see fitness values being generated for the population over
each generation. This goes faster if you close the graphics window. The
run saves individuals to the ./data directory, and you can then reload
the best evolved individuals. It will probably take a number of
generations before you get good walking behavior.
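
If it helps, the outer loop is just a standard generational GA. The
sketch below shows the general shape only; the types, stubs, and
operator details are placeholders of mine, not the actual main.cpp
code, and the real population size, mutation scheme, and save format
are whatever the sample uses.

     #include <algorithm>
     #include <cstdlib>
     #include <vector>
     using namespace std;

     // Minimal stand-in for the sample's NeuralNetwork; only what the
     // loop shape needs.
     struct NeuralNetwork { vector<double> weights; double fitness; };

     // In the sample this simulates the creature in ODE and measures how
     // far it walked; here it is a stub so the sketch compiles.
     double evaluateFitness(NeuralNetwork &nn) { return 0.0; }

     // Placeholder variation operators.
     void crossOver(NeuralNetwork &p1, NeuralNetwork &p2, NeuralNetwork &child)
     {
         for (size_t i = 0; i < child.weights.size(); ++i)
             child.weights[i] = (rand() % 2) ? p1.weights[i] : p2.weights[i];
     }
     void mutate(NeuralNetwork &nn)
     {
         if (!nn.weights.empty())
             nn.weights[rand() % nn.weights.size()] +=
                 (rand() / (double)RAND_MAX - 0.5) * 0.1;
     }

     bool fitterFirst(const NeuralNetwork &a, const NeuralNetwork &b)
     {
         return a.fitness > b.fitness;
     }

     void runEvolution(vector<NeuralNetwork> &population, int numGenerations)
     {
         if (population.size() < 2) return;
         for (int gen = 0; gen < numGenerations; ++gen)
         {
             // Evaluate everyone, then sort fittest first.
             for (size_t i = 0; i < population.size(); ++i)
                 population[i].fitness = evaluateFitness(population[i]);
             sort(population.begin(), population.end(), fitterFirst);

             // Keep the top half; rebuild the bottom half by crossover
             // plus mutation.
             size_t half = population.size() / 2;
             for (size_t i = half; i < population.size(); ++i)
             {
                 crossOver(population[rand() % half],
                           population[rand() % half], population[i]);
                 mutate(population[i]);
             }
             // The sample also writes the generation's best individual
             // to ./data at this point.
         }
     }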

RDL


-----Original Message-----
From: ode-admin@q12.org [mailto:ode-admin@q12.org] On Behalf Of Ted
Milker
Sent: Tuesday, April 15, 2003 7:52 PM
To: ode@q12.org
Subject: Re: [ODE] Virtual Creature Sample Code here

Could you describe what the proper output of this program is?  All I
get is a stick figure that hops and hops :)  I'm not really familiar
with GAs or evolving virtual creatures.

On Tue, Apr 15, 2003 at 04:59:15PM -0600, Rob Leclerc wrote:
> I think I fixed all the bugs in this one. (Memory leak, and problems
> reloading saved individuals). It has some decent documentation and

I don't think the mem leak is fixed:

97  of  0    fitness:1.1
98  of  0    fitness:1.1
99  of  0    fitness:1.1
Best Found: Gen[0] Fit[-0.6]


Program received signal SIGSEGV, Segmentation fault.
0x0804a9cd in crossOver(NeuralNetwork*, NeuralNetwork*, NeuralNetwork*)
(p1=0x8096178, p2=0x80964d8,
    child=0x80967b8) at main.cpp:191
191          {    child->weights[i] = p1->weights[i];
(gdb) print p1->weightSize
$1 = -2061584302

I didn't have to change anything around the NeuralNetwork classes to get
this to compile under Linux, so I don't think it's anything I
introduced.
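
A cheap guard at the top of crossOver() would at least make that
corruption fail loudly instead of letting the copy loop run off the end
of the arrays. Something like this (only a sketch, and I'm guessing at
the struct fields):

     #include <cassert>

     // Guessed shape of the network; only the fields the check needs.
     struct NeuralNetwork { int weightSize; double *weights; };

     void crossOver(NeuralNetwork *p1, NeuralNetwork *p2, NeuralNetwork *child)
     {
         // A weightSize like -2061584302 trips this assert immediately,
         // pointing back at whatever corrupted or never initialized it.
         assert(p1->weightSize > 0 &&
                p1->weightSize == p2->weightSize &&
                p1->weightSize == child->weightSize);

         for (int i = 0; i < child->weightSize; ++i)
             child->weights[i] = p1->weights[i];   // the copy at main.cpp:191;
                                                   // the real function presumably
                                                   // switches to p2 at some point
     }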

> *This was programmed in VC6 and has not been tested on other
> platforms.

I had to fix a bunch of out-of-scope warnings from for() loops that use
variables which hadn't been declared (or were declared in a previous
for() loop).
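
For anyone else porting it: VC6 still defaults to the pre-standard rule
where a variable declared in a for() initializer stays visible after
the loop, so the sample reuses counters without redeclaring them. A
trivial illustration (not code from the sample) of the pattern and the
conforming fix:

     void example(int n)
     {
         // VC6's old scoping keeps 'i' alive after the first loop, so
         //     for (int i = 0; i < n; i++) { ... }
         //     for (i = 0; i < n; i++)     { ... }
         // builds there but not under g++ 3.2.  The conforming fix is to
         // declare the counter in each loop (or hoist it above both):
         for (int i = 0; i < n; i++) { /* ... */ }
         for (int i = 0; i < n; i++) { /* ... */ }
     }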

I also had to fix the first sort() call in getNextGeneration to compile
under g++ 3.2:

     sort(annVector.begin() + (annVector.size() / 2), annVector.end(),
          NNCompare<NeuralNetwork>());

g++ 3.2 doesn't have an itoa() call, so I used:

     ostringstream out;

     out << currentGeneration;
     filename += out.str();
     filename += ".dat";
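
The same thing wrapped up as a drop-in helper, in case it's useful;
toString() is just my name for it and it needs <sstream> and <string>:

     #include <sstream>
     #include <string>

     // Portable stand-in for itoa(): format any streamable value as a string.
     template <class T>
     std::string toString(const T &value)
     {
         std::ostringstream out;
         out << value;
         return out.str();
     }

     // e.g.  filename += toString(currentGeneration) + ".dat";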

I think those were all the major issues I had getting it to compile.
Of course, now I'd have to locate the memory leak (I think that's what
it is) to get past one generation.  I'll post if I find anything.

Ted
_______________________________________________
ODE mailing list
ODE@q12.org
http://q12.org/mailman/listinfo/ode