08-14-2005, 06:33 AM
Rob Ryland
Re: Where can i find GeneticFPGA toolkit

"Eric" <[email protected]> wrote in
news:[email protected] oups.com:

> Genetic Algorithm pseudo code:
>
> Problem: Solve 2x+1 = 7 (we know the answer is 3, but work with me)
>
> Step #1 Generate a Random population of numbers:
>
> Population = 1, 6, 8, 4
>
> Step #2 Choose a fitness function. Ours will be Error = ABS(1-(2x +
> 1)/7)
>
> Error(1) = ABS(1-(2*1+1)/7) = 0.57
> Error(6) = ABS(1-(2*6+1)/7) = 0.86
> Error(8) = ABS(1-(2*8+1)/7) = 1.43
> Error(4) = ABS(1-(2*4+1)/7) = 0.29
>
> The number with the smallest Error is closest to the answer.
> (Still
> with me?)
>
> Step #3 Repopulate your population based on your fitness function
> results. This is the tough part to grasp. We need to normalize all of
> the errors so we can get a repopulation percentage. This will help us
> get the new population of numbers.
>
> Take the total of the errors: 0.57 + 0.86 + 1.43 + 0.29 = 3.15
>
> 3.15/0.57 = 5.53
> 3.15/0.86 = 3.66
> 3.15/1.43 = 2.20
> 3.15/0.29 = 10.86

************** Bunch of quoted text trimmed *********************
>
>
> Kind of make sense???
>
>
> Eric
>
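(Side note: your steps 1 through 3 boil down to roughly the Python
sketch below. The names are mine, it isn't from GeneticFPGA or any
other toolkit -- it's just the quoted arithmetic made concrete.)

import random

def selection_weights(population):
    # Step 2: score each guess against 2x + 1 = 7.
    errors = [abs(1 - (2 * x + 1) / 7) for x in population]
    # Step 3: turn errors into repopulation weights
    # (smaller error -> bigger weight; an exact hit of zero error
    # would need a guard against division by zero).
    total = sum(errors)
    return [total / e for e in errors]

population = [1, 6, 8, 4]
# Roughly 5.5, 3.67, 2.2, 11.0 -- the quoted 5.53/3.66/2.20/10.86 came
# from hand-rounded errors.
weights = selection_weights(population)
# Resample the next generation in proportion to the weights.
next_gen = random.choices(population, weights=weights, k=len(population))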


Good lord, that walkthrough will just confuse them!...
Of course you realize that your example algorithm converges a
little slower than just guessing until you've guessed the right
answer. I say a little slower because what you've described
essentially IS just guessing until you happen upon the right answer,
with a few extra operations to slow everything down! The magic in a
genetic algorithm comes from the sex, which happens in the crossover
that you omitted in step 4 above. You HAVE to come up with a way to
combine two 'pretty good' solutions to get another 'pretty good,
maybe a bit better' solution. Without that combination step
(analogous to sex in the world of biology), you don't have a genetic
algorithm at all! Finding a way to combine two 'pretty good'
solutions into a new 'pretty good' solution is sometimes difficult.
In your example it would be easy: simply average the parents (or
take a randomly weighted mean of them) to get the offspring. That
would actually converge to a solution in your example.
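Something like this, say (a rough sketch; the random weighting is just
one reasonable choice):

import random

def crossover(parent_a, parent_b):
    # Randomly weighted mean: the child always lands between the two
    # parents, so two 'pretty good' guesses get combined instead of
    # being thrown away.
    w = random.random()
    return w * parent_a + (1 - w) * parent_b

child = crossover(4, 1)   # somewhere between 1 and 4, near the root x = 3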
The mutation part is less important than the crossover part, but it
is still needed. I don't think I've EVER seen mutation implemented as
simply making up a wholly new, randomly generated member of the
population. That would be no better than (and roughly equivalent to)
having a larger initial population. The mutation must be a
relatively small change to an existing member of the population. The
idea is that your population will, after a few generations, be much
better adapted than a randomly generated individual, so a small
change has a decent chance of being 'pretty good' too.
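In code, the difference is something like this (the range and step
size are arbitrary, purely for illustration):

import random

def mutate_badly(x):
    # A brand-new random guess: this ignores x entirely and is really
    # just enlarging the initial population.
    return random.uniform(0, 10)

def mutate(x, step=0.5):
    # A small nudge to an existing, already-adapted member: an
    # individual near the solution stays near the solution.
    return x + random.uniform(-step, step)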

I don't mean to be insulting or to belittle your contribution to the
newsgroup, but genetic algorithms are important, and your example
could easily confuse people.

-Rob Ryland