[ODE] Creature Evolution input
skjold@cistron.nl
Wed Mar 12 18:33:01 2003
Hi Ian,
> Actually, I'd be interested to find out what other people have done re:
> evolved morphologies/controllers with ODE...?
This is my primary use for ODE at the moment, although I've only just started getting familiar with it. That means I don't have anything that works yet (is time a scarce resource or what). But it sounds like you're doing the kind of thing I intend to do as well. In fact, aside from the fun of playing with ODE, I'm hoping that such experiments produce some insight into what I could develop for an actual robot (the real thing, i.e. sensors, servos and ICs). So what you said about simulating cameras and other sensors is something I find interesting as well. I'd certainly welcome any discussion related to this, although this mailing list might not be the right place for it.
It's always nice to swap ideas though :)
Greets,
Mark
> Hi Henri,
>
> Like (I think) quite a few people on this list, I evolve agent physical
> and controller morphologies using ODE for movement and beacon following.
> I've managed to evolve agents using a wide variety of movements made
> from blocks connected by powered hinge joints, and the only inputs to
> the network are the angles of the joints at each timestep, which
> adds a helpful negative feedback for the pattern generation - that's if
> I'm evolving for movement only.
>
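As a rough illustration of that joint-angle feedback loop, here is a minimal Python sketch of one control step: the joint angle goes in, a motor velocity command comes out, and a negative weight gives the negative feedback Ian mentions. The function name, weights and the toy integration are all made up for illustration; in real ODE code the angle would come from dJointGetHingeAngle() and the command would be applied with dJointSetHingeParam() using dParamVel/dParamFMax.

```python
import math

def step_joint(angle, weight=-3.0, gain=2.0, dt=0.02):
    """One control step of a single-hinge agent (toy model).
    In ODE: angle <- dJointGetHingeAngle(joint);
            v     -> dJointSetHingeParam(joint, dParamVel, v)."""
    # A one-neuron "network": negative weight => negative feedback.
    v = gain * math.tanh(weight * angle)
    # Toy integration standing in for the physics step (dWorldStep).
    return angle + v * dt, v

# Run the loop: the feedback drives the joint angle back toward zero.
angle = 0.8
trace = []
for _ in range(200):
    angle, v = step_joint(angle)
    trace.append(angle)
```

With more neurons (and evolved weights) the same loop can sustain oscillation instead of just damping the joint out.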
> If you're just trying to get movement, you can get away with no real
> inputs at all, and just use a sine wave input and allow the neural
> network to modulate that!
>
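The sine-drive idea can be sketched the same way: a fixed sine wave is the only "input", and the network (here collapsed to a single tanh unit with made-up weights; in the evolved setup these would come from the genome) shapes its amplitude and offset.

```python
import math

def cpg_output(t, amp_weight=0.7, bias=0.4, freq=1.5):
    """Motor output at time t: a sine drive modulated by one tanh unit.
    amp_weight, bias and freq are placeholders for evolved parameters."""
    drive = math.sin(2 * math.pi * freq * t)
    return math.tanh(amp_weight * drive + bias)
```

The output stays bounded in (-1, 1) and inherits the drive's period, which is all a velocity-controlled hinge motor needs.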
> For beacon following, I use two types of sensor input. I use directional
> beacons. For these, the input to the neuron is the angle between a
> vector normal to some genetically determined position on a block, and a
> vector pointing to the beacon; this works fine.
>
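That sensor reduces to one acos of a dot product. A small Python sketch, assuming plain world-space vectors (in ODE the sensor position and normal would be derived from dBodyGetPosition() and dBodyGetRotation(); the function name is made up):

```python
import math

def beacon_angle(sensor_pos, sensor_normal, beacon_pos):
    """Angle (radians) between a sensor's outward normal and the
    direction from the sensor to the beacon."""
    to_beacon = [b - s for b, s in zip(beacon_pos, sensor_pos)]
    dot = sum(a * b for a, b in zip(sensor_normal, to_beacon))
    n1 = math.sqrt(sum(a * a for a in sensor_normal))
    n2 = math.sqrt(sum(a * a for a in to_beacon))
    # Clamp to guard against rounding just outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
```

A beacon dead ahead gives 0, one directly behind gives pi, so the neuron input is already nicely monotonic in "how far off am I pointing".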
> The other type is a more realistic camera sensor, for which I use OpenGL
> and off-screen rendering, but this is very slow. I've evolved an agent
> that can visually locate and approach a block of a certain colour. This
> is a sensor with a directional field of view, and the neuron input is
> the percentage that a certain colour fills it. I'm trying to work out a
> faster way of doing this though!
>
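Once the off-screen frame has been read back (e.g. with glReadPixels; the OpenGL side is omitted here), the neuron input is just a pixel count. A sketch, with a made-up function name and a flat list of RGB tuples standing in for the framebuffer:

```python
def colour_fraction(pixels, target, tol=16):
    """Fraction of pixels within `tol` per channel of `target`.
    `pixels` stands in for an off-screen framebuffer readback;
    `tol` absorbs lighting/shading variation (value is a guess)."""
    hits = sum(1 for p in pixels
               if all(abs(c - t) <= tol for c, t in zip(p, target)))
    return hits / len(pixels)
```

This per-pixel loop is also where the cost lives, which is presumably why the readback-based camera is so slow compared to the analytic beacon sensor.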
> So you can get interesting behaviour with very simple sensors!
>
> Actually, I'd be interested to find out what other people have done re:
> evolved morphologies/controllers with ODE...?
>
>
> Ian
>