[ODE] Creature Evolution input
whitt013
whitt013 at bama.ua.edu
Wed Mar 12 23:03:01 2003
I think I misinterpreted the original question, so I'll add a few thoughts to
this discussion. First, check out the Framsticks genotype (google for it) for
some inspiration as far as input values. A few notables that I don't think
have been mentioned yet for sensory input:
Touch: a boolean value (0.0 or 1.0) that states whether this body part is
touching anything. Pretty easy to implement with ODE's collision detection.
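A minimal sketch of how that could look (untested; NUM_PARTS and partIndex()
are my own placeholders, with partIndex() mapping a geom back to its
body-part index, e.g. via dGeomSetData/dGeomGetData):

  static int touch[NUM_PARTS];   /* cleared to 0 at the start of each step */

  static void nearCallback (void *data, dGeomID o1, dGeomID o2)
  {
    dContact contact[4];
    int n = dCollide (o1, o2, 4, &contact[0].geom, sizeof(dContact));
    if (n > 0) {
      touch[partIndex(o1)] = 1;   /* both parts register the touch */
      touch[partIndex(o2)] = 1;
      /* ...then create contact joints from contact[0..n-1] as usual... */
    }
  }

  /* each step: clear touch[], call dSpaceCollide (space, 0, &nearCallback),
     then feed touch[i] ? 1.0 : 0.0 into sensor neuron i */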
Equilibrium: This value starts out at 1.0 when the body is in its initial
state (i.e. its rotation matrix is the identity), and goes to -1.0 when it's
upside-down. Another easy one to implement (= R[5]).
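For reference, ODE stores the rotation as a 3x4 row-major dReal array, so
R[5] is the (1,1) element - the world-y component of the body's y axis.
That's assuming "up" is +y; if your world's up is +z, R[10] is the element
you want instead:

  const dReal *R = dBodyGetRotation (body);
  dReal equilibrium = R[5];   /* 1.0 upright, -1.0 upside-down (y-up) */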
Smell: for target-following or energy-acquisition simulations and the like,
1.0 when the target is right on top of the creature, falling off to 0.0 when
it is at or beyond the creature's maximum smell range.
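E.g. with a straight linear falloff (untested; MAX_SMELL_RANGE is whatever
range you pick, and 'target' stands in for your beacon/food body):

  const dReal *p = dBodyGetPosition (creature);
  const dReal *t = dBodyGetPosition (target);
  dReal dx = t[0]-p[0], dy = t[1]-p[1], dz = t[2]-p[2];
  dReal d = dSqrt (dx*dx + dy*dy + dz*dz);
  dReal smell = (d >= MAX_SMELL_RANGE) ? 0.0 : 1.0 - d/MAX_SMELL_RANGE;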
Just a couple ideas...
David
>===== Original Message From <skjold@cistron.nl> =====
>Hi Ian,
>
>> Actually, I'd be interested to find out what other people have done re:
>> evolved morphologies/controllers with ODE...?
>
>This is my primary use for ODE at the moment, although I've only just
>started getting familiar with ODE. That means I don't have anything that
>works yet (is time a scarce resource or what). But it sounds like you're
>doing stuff I intend to do as well. In fact, aside from the fun of playing
>with ODE, I am hoping that such experiments produce some insight for me into
>what I could develop for an actual robot (the real thing, i.e. sensors,
>servos and ICs). So, what you said about simulating cameras and other
>sensors is something that I find interesting as well. I would certainly
>welcome any discussions related to this, although this mailing list might
>not be the right place for it.
>
>It's always nice to swap ideas though :)
>
>Greets,
>Mark
>
>
>> Hi Henri,
>>
>> Like (I think) quite a few people on this list, I evolve agent physical
>> and controller morphologies using ODE for movement and beacon following.
>> I've managed to evolve agents using a wide variety of movements made
>> from blocks connected by powered hinge joints, and the only inputs to
>> the network I use are the joint angles at each timestep, which adds a
>> helpful negative feedback for the pattern generation - that's if I'm
>> evolving for movement only.
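>> (In ODE that's a one-liner per joint - untested sketch, with 'joint'
>> being one of the powered hinges:
>>
>>   dReal angle = dJointGetHingeAngle (joint);  /* radians, -pi..pi */
>>   dReal input = angle / M_PI;                 /* roughly [-1,1]   */
>> )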
>>
>> If you're just trying to get movement, you can get away with no real
>> inputs at all, and just use a sine wave input and allow the neural
>> network to modulate that!
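>> (i.e. something as simple as
>>
>>   dReal cpg = sin (2.0 * M_PI * freq * simTime);  /* freq, simTime: yours */
>>
>> fed to the input neurons, with the evolved weights shaping it into a
>> gait.)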
>>
>> For beacon following, I use two types of sensor input. I use directional
>> beacons. For these, the input to the neuron is the angle between a
>> vector normal to some genetically determined position on a block and a
>> vector pointing to the beacon; this works fine.
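>> (A sketch of that computation - untested, and it assumes the genome
>> stores a unit-length sensor normal (nx,ny,nz) in body coordinates, with
>> 'beacon' being a body at the beacon's position:
>>
>>   dVector3 n, v;
>>   dBodyVectorToWorld (body, nx, ny, nz, n);  /* normal -> world frame */
>>   const dReal *p = dBodyGetPosition (body);
>>   const dReal *b = dBodyGetPosition (beacon);
>>   v[0] = b[0]-p[0]; v[1] = b[1]-p[1]; v[2] = b[2]-p[2];
>>   dReal len = dSqrt (v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
>>   dReal angle = acos ((n[0]*v[0] + n[1]*v[1] + n[2]*v[2]) / len);
>> )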
>>
>> The other type is a more realistic camera sensor, for which I use OpenGL
>> and off-screen rendering, but this is very slow. I've evolved an agent
>> that can visually locate and approach a block of a certain colour. This
>> is a sensor with a directional field of view, and the neuron input is
>> the percentage of that field filled by a certain colour. I'm trying to
>> work out a faster way of doing this though!
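>> (The colour-fraction part is just pixel counting on the off-screen
>> buffer - rough sketch, where W/H are the sensor resolution and the
>> threshold values are made up, here matching a mostly-red block:
>>
>>   unsigned char buf[W*H*3];
>>   glReadPixels (0, 0, W, H, GL_RGB, GL_UNSIGNED_BYTE, buf);
>>   int i, hits = 0;
>>   for (i = 0; i < W*H; i++)
>>     if (buf[3*i] > 200 && buf[3*i+1] < 50 && buf[3*i+2] < 50)
>>       hits++;
>>   dReal input = (dReal) hits / (W*H);
>> )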
>>
>> So you can get interesting behaviour with very simple sensors!
>>
>> Actually, I'd be interested to find out what other people have done re:
>> evolved morphologies/controllers with ODE...?
>>
>>
>> Ian
>>
>
>_______________________________________________
>ODE mailing list
>ODE@q12.org
>http://q12.org/mailman/listinfo/ode