How to find the best parameters for a Genetic Algorithm?

Some Genetic Algorithm frameworks, such as http://www.aforgenet.com/, require many parameters, such as mutation rate, population size, etc.

Are there universal best values for such parameters? I believe it depends on the problem (fitness function delay, mutation delay, recombination delay, evolution rate, etc.). My first thought was to use a GA to configure another GA.

Any better ideas?

Patterson answered 2/7, 2009 at 17:18 Comment(0)

The one time I programmed a genetic algorithm, I included those values among the values to mutate, basically, as you said, using a GA to configure itself. It worked surprisingly well, especially since I've found it beneficial for those values to change over the course of the computation.
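
In code, the idea looks roughly like this. This is only a minimal sketch, not the original implementation; the fitness function, ranges, and rates are assumptions chosen for illustration:

```
# Sketch: the mutation rate is stored as an extra gene in each individual,
# so the GA tunes it while it tunes the solution (all constants are assumed).
import random

GENOME_LEN = 10   # number of problem genes per individual (assumed)
POP_SIZE = 50

def fitness(ind):
    # Minimise the sphere function over the problem genes (assumed objective).
    return -sum(x * x for x in ind["genes"])

def make_individual():
    return {"genes": [random.uniform(-5, 5) for _ in range(GENOME_LEN)],
            "mut_rate": random.uniform(0.01, 0.5)}   # self-adapted parameter

def mutate(ind):
    child = {"genes": list(ind["genes"]), "mut_rate": ind["mut_rate"]}
    # Mutate the mutation rate itself first, then use it on the problem genes.
    child["mut_rate"] = min(0.9, max(0.001, child["mut_rate"] * random.lognormvariate(0, 0.2)))
    for i in range(GENOME_LEN):
        if random.random() < child["mut_rate"]:
            child["genes"][i] += random.gauss(0, 0.5)
    return child

population = [make_individual() for _ in range(POP_SIZE)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]    # simple truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(POP_SIZE - len(parents))]

best = max(population, key=fitness)
print(fitness(best), best["mut_rate"])   # the evolved mutation rate comes along for free
```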

Octant answered 2/7, 2009 at 17:22 Comment(2)
To be more elaborate, what you have discovered is a Parameter Control method; for people interested in different ways of finding the 'best' GA parameters, have a look at my answer. – Kiss
I'm dumbfounded I couldn't come up with this solution myself. It's so elegantly simple and obvious in hindsight :D – Dodwell

I find it helps to think of these problems as a landscape, where you're trying to find the lowest point.

Methods like genetic algorithms are used when the landscape is too large to just test all the points, and the "shape" of the landscape is such that methods like gradient-descent will get you stuck in local minima.

One good example is Rastrigin's function (image source: scientific-computing.com).

Your choices are:

Generation size:

  • Too large: you're going to have a long epoch time, restricting how many chances each individual has to explore its neighbourhood.
  • Too small: you don't get good coverage of the search space.

Mutation rate:

  • Too high: you risk the individuals "jumping" over a solution they were close to.
  • Too low: they're all going to get stuck in local minima.

So it really does depend on your own particular search space. Experiment with the parameters and try to find the optimal combination. I would agree that using another GA to optimise the parameters is not going to solve the problem.
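
A rough sketch of that kind of experimentation follows; it is not from this answer, and the Rastrigin setup and all constants are assumptions chosen purely for illustration:

```
# Sketch: a tiny GA on the 2-D Rastrigin function with population size and
# mutation rate exposed as knobs, so the trade-offs above can be observed.
import math
import random

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def run_ga(pop_size, mutation_rate, generations=200, dims=2):
    pop = [[random.uniform(-5.12, 5.12) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rastrigin)                       # lower is better
        survivors = pop[: max(2, pop_size // 2)]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
            if random.random() < mutation_rate:
                i = random.randrange(dims)
                child[i] += random.gauss(0, 0.3)                  # small "jump"
            children.append(child)
        pop = survivors + children
    return min(rastrigin(ind) for ind in pop)

# Experiment with the knobs: too small a population misses regions of the
# landscape, too high a mutation rate keeps jumping over the basin it found.
for pop_size in (10, 50, 200):
    for mutation_rate in (0.01, 0.2, 0.8):
        print(pop_size, mutation_rate, run_ga(pop_size, mutation_rate))
```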

Disable answered 2/7, 2009 at 21:50 Comment(1)
I do a lot of work with genetic algorithms. This is an excellent description and visualization. Good job. – Borborygmus

I find it rather disappointing that so many answers assume you can't find the best parameters for a Genetic Algorithm automatically. I agree that the parameters do depend on the problem; however, there are methods for finding them.

Additionally, the No Free Lunch theorem by no means stops you from finding the best parameters, as there has already been discussion disputing that interpretation.

There are two types of parameter setting:

  • Parameter Tuning (offline parameter search - before the GA is run)
  • Parameter Control (online parameter tweaking - during the GA run)
    • Adaptive
    • Self Adaptive
    • Deterministic

There is a lot of literature out there describing how one can find these optimal parameters; it depends on whether you want to do the parameter search offline or online. Popular belief is that offline tuning is better suited for most cases, because online parameter control methods would add too much complexity compared to offline approaches.

Here are some examples of how to find the "best" parameters:

Parameter Tuning:

Parameter Control:

And many more; just search the literature using the keywords above. There are scientific methods for finding suitable parameters for any given problem!
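
As a concrete illustration of the simplest form of Parameter Tuning, here is a hedged sketch of an offline grid search over GA parameters; `run_ga` is a hypothetical function standing in for whichever GA you are tuning, and the smarter methods from the literature (meta-GAs, racing, sequential tuning) follow the same outer-loop idea:

```
# Sketch: offline Parameter Tuning by grid search over (population size,
# mutation rate), averaging a few runs because the GA is stochastic.
import itertools
import statistics

def tune(run_ga, pop_sizes, mutation_rates, repeats=5):
    results = {}
    for pop_size, mut_rate in itertools.product(pop_sizes, mutation_rates):
        scores = [run_ga(pop_size, mut_rate) for _ in range(repeats)]
        results[(pop_size, mut_rate)] = statistics.mean(scores)
    # Assumes run_ga returns a cost to minimise; flip the comparison for fitness.
    return min(results, key=results.get), results

# Example usage (run_ga is whatever GA you are tuning):
# best_params, table = tune(run_ga, pop_sizes=[20, 50, 100], mutation_rates=[0.05, 0.1, 0.3])
```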

Kiss answered 14/1, 2014 at 17:6 Comment(1)
Thanks for the answer and for the great references! – Adit

It ain't easy.

Why? Because of the No Free Lunch theorem. This basically states that there is no general search algorithm that works well for all problems.

The best you can do is tailor the search for a specific problem space. You'll have to manually tweak your parameters to fit your solution. Sorry.

Using a GA to find GA parameters gets complicated. How do you find the optimal parameters for your GAGA search? Another GA...?

Laforge answered 2/7, 2009 at 22:39 Comment(7)
Not true with regard to manually tuning a GA; there is something called Adaptive GAs (see here). – Kiss
@Kiss Yes there are many adaptive GAs. I use them a lot. But they only work for specific (usually smooth) problem spaces. They are not general. – Laforge
May I kindly request a source? It would be useful for me, thanks :) – Kiss
You have one: the "No Free Lunch" theorem. – Laforge
Actually, having had the time to review the No Free Lunch theorem, there are cases where it can be mitigated, namely by using a Coevolutionary Meta-GA, for example. Additionally, there have been papers disputing that the No Free Lunch theorem realistically holds (see here)! Therefore it is incorrect to prematurely assume that a simple heuristic such as the one I gave is inadequate for parameter searching. – Kiss
@Kiss nice references! But they don't really help your case. Give the Coevolution Meta-GA a try. I'd like to know how well it works! – Laforge
Actually, the original authors of the No Free Lunch theorem (Wolpert and Macready) have already shown that coevolutionary GAs which produce an undisputed "champion" fall outside the No Free Lunch theorem: ieeexplore.ieee.org/xpl/… – Kiss

There really isn't an automatic way to do it for a given dataset. If there were, they wouldn't expose those parameters. Using a second GA to tune the parameters of the first GA is perilous -- do you use a third GA to tune the parameters of the second? Even if you did that, it's a recipe for overfitting anyway.

My advice would be to play with the parameters and see how they affect your population distribution at each generation, how many generations it takes to get to an acceptable answer, and so on. If you have too much mutation, your population will never stabilize. Too little and you'll end up with homogeneity.
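
One way to do that kind of bookkeeping is sketched below, under the assumption that individuals are numeric vectors of equal length; `population` and `fitness` are placeholders for whatever your GA uses:

```
# Sketch: log best/mean fitness and a crude diversity measure each generation,
# so you can see whether the population is stabilising or becoming homogeneous.
import statistics

def generation_stats(population, fitness):
    scores = [fitness(ind) for ind in population]
    # Diversity here is the average per-gene standard deviation (assumes numeric genes).
    diversity = statistics.mean(statistics.pstdev(gene) for gene in zip(*population))
    # "best" assumes higher fitness is better; flip to min() if you minimise a cost.
    return {"best": max(scores), "mean": statistics.mean(scores), "diversity": diversity}

# Inside your GA loop:
# stats = generation_stats(population, fitness)
# print(generation, stats)  # too much mutation -> "best" never settles; too little -> diversity ~ 0
```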

It's a dirty secret of GAs that tuning them is an art, not a science.

Carbonaceous answered 2/7, 2009 at 17:32 Comment(1)
I don't agree. Tuning GAs is not an art; it can be done scientifically. There is something called Adaptive GAs (see this scientific paper here), which are not problem-specific and are general enough to be used for any problem you can put into a GA. – Kiss

As everyone else said, there is no single answer. Although there is some tendency to use a crossover rate of around 0.7-0.9 and a mutation rate of around 0.1-0.3, it really depends. It depends on the problem, may depend on the fitness function, and definitely depends on the Genetic Algorithm itself. There are many GA variations, and the optimal parameters for the same problem may vary between them.

As for using a GA to tune the parameters of the target GA, there are approaches like that, but, as was pointed out, how do you tune the parameters of the first GA? Keep in mind that the mutation rate should perhaps be higher at the beginning and then decrease, while the crossover rate should increase; this is the issue of exploration versus exploitation. There are approaches that let the GA be more adaptive and change its parameters as it searches for a solution. Fuzzy controllers are sometimes used to manipulate the parameters of a GA. There are also other approaches.
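
The simplest version of that idea is a deterministic schedule. The sketch below is only an illustration; the endpoints and the linear shape are assumptions, not recommendations:

```
# Sketch of deterministic parameter control: mutation starts high and decays,
# crossover starts lower and grows, shifting from exploration to exploitation.
def mutation_rate(generation, max_generations, start=0.3, end=0.05):
    t = generation / max_generations
    return start + (end - start) * t      # linear decay from 0.3 to 0.05

def crossover_rate(generation, max_generations, start=0.6, end=0.9):
    t = generation / max_generations
    return start + (end - start) * t      # linear growth from 0.6 to 0.9

# In the GA loop: p_m = mutation_rate(g, G); p_c = crossover_rate(g, G)
```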

If you want to know more about it, buy some books or look through academic research papers.
If you need to set up your own GA without extensive research, try some values from others' work and experiment with them.

Tebet answered 4/11, 2009 at 19:55 Comment(0)
