# [ACCEPTED] How to find the best parameters for a Genetic Algorithm? (aforge)

I find it helps to think of these problems as a landscape, where you're trying to find the lowest point.

Methods like genetic algorithms are used when the landscape is too large to just test all the points, and the "shape" of the landscape is such that methods like gradient descent will get you stuck in local minima.

One good example is Rastrigin's function, which has many regularly spaced local minima surrounding a single global minimum.

_(figure: plot of Rastrigin's function; source: scientific-computing.com)_
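Since the figure may not render here, the function itself is easy to state in code (a minimal Python sketch; Rastrigin's function is standard, so only the parameter name `a` is a stylistic choice):

```python
import math

def rastrigin(xs, a=10.0):
    """Rastrigin's function: f(x) = A*n + sum(x_i^2 - A*cos(2*pi*x_i)).
    Many regularly spaced local minima; global minimum of 0 at the origin."""
    return a * len(xs) + sum(x * x - a * math.cos(2 * math.pi * x) for x in xs)

print(rastrigin([0.0, 0.0]))  # 0.0 at the global minimum
```

Every integer lattice point is close to a local minimum, which is exactly what traps gradient-based methods.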

Your choices are:

Generation size:

- Too large: you're going to have a long epoch time, restricting how many chances each individual has to explore its neighbourhood.
- Too small: you don't get good coverage of the search space.

Mutation rate:

- Too high: you risk the individuals "jumping" over a solution they were close to.
- Too low: they're all going to get stuck in local minima.

So it really does depend on your own particular search space. Experiment with the parameters and try to find the optimal combination. I would agree that using another GA to optimise the parameters is not going to solve the problem.
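To make the two trade-offs above concrete, here is a minimal GA sketch on Rastrigin's function where both knobs are explicit parameters. Everything in it (truncation selection, arithmetic crossover, Gaussian mutation, the σ=0.5 step size) is one illustrative choice among many, not a recommended recipe:

```python
import math
import random

def rastrigin(xs, a=10.0):
    return a * len(xs) + sum(x * x - a * math.cos(2 * math.pi * x) for x in xs)

def run_ga(pop_size, mutation_rate, generations=200, dim=2, seed=0):
    rng = random.Random(seed)
    # Random initial population inside Rastrigin's usual domain [-5.12, 5.12].
    pop = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rastrigin)
        survivors = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]   # arithmetic crossover
            if rng.random() < mutation_rate:
                i = rng.randrange(dim)
                child[i] += rng.gauss(0, 0.5)     # Gaussian perturbation
            children.append(child)
        pop = survivors + children
    return min(rastrigin(ind) for ind in pop)

best = run_ga(pop_size=40, mutation_rate=0.2)
```

Try rerunning with `mutation_rate=0.9` or `pop_size=4` to see the "too high" and "too small" failure modes for yourself.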

I find it rather disappointing that there are so many answers assuming you can't find the best parameters for the Genetic Algorithm automatically. **I agree that the parameters do depend on the problem, but there are methods for finding them.**

Additionally, the No Free Lunch Theorem by no means stops you from finding the best parameters, as there have already been discussions disputing it:

- Practically it does not hold
- Description length may dispute the theorem
- Coevolution can make the theorem irrelevant

There are two types of parameter setting:

- Parameter Tuning (offline parameter search - before the GA is run)
- Parameter Control (online parameter tweaking - during the GA run)
  - Adaptive
  - Self-adaptive
  - Deterministic

There is a lot of literature out there describing how one can find these optimal parameters; it depends whether you want to do the parameter search offline or online. Popular belief is that offline is better suited for most cases, because online parameter control methods would add too much complexity compared to offline.

Here are some examples of ways to find the "best" parameters:

**Parameter Tuning**:

- Simple parameter sweep (try everything)
- Meta-GA on top of a GA
- Racing strategy to find best parameters
- Meta-GA and Racing together
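The simplest of these, the parameter sweep, is easy to sketch. The harness below tunes a toy OneMax GA over a small grid, averaging across seeds to reduce noise; the grid values, the OneMax task, and the helper names are all illustrative assumptions, not prescriptions:

```python
import itertools
import random
import statistics

def onemax_ga(pop_size, mutation_rate, length=20, generations=30, seed=0):
    """Tiny elitist GA on OneMax (maximise the number of 1-bits).
    Returns the best fitness found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)
        parents = pop[: max(2, pop_size // 2)]
        pop = list(parents)                       # keep parents (elitism)
        while len(pop) < pop_size:
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]           # one-point crossover
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            pop.append(child)
    return max(sum(ind) for ind in pop)

def sweep():
    """Offline parameter tuning: try every (pop_size, mutation_rate) combo."""
    grid = itertools.product([10, 30], [0.01, 0.1, 0.5])
    results = {}
    for pop_size, rate in grid:
        scores = [onemax_ga(pop_size, rate, seed=s) for s in range(5)]
        results[(pop_size, rate)] = statistics.mean(scores)
    return max(results, key=results.get)          # best parameter pair found
```

A racing strategy differs only in that it drops clearly losing configurations early instead of giving each one the full budget.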

**Parameter Control**:

- Adaptive probabilities for mutation and crossover
- Self-adaptive population size
- Deterministic Genetic Operators
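As a flavour of the deterministic variant: the simplest form of parameter control is a fixed schedule that anneals a parameter over the run. The linear schedule and the start/end values below are purely illustrative:

```python
def mutation_rate(gen, total_gens, start=0.3, end=0.01):
    """Deterministic parameter control: linearly anneal the mutation rate
    from `start` (explore early) down to `end` (exploit late)."""
    frac = gen / max(1, total_gens - 1)
    return start + (end - start) * frac
```

Adaptive control would instead adjust the rate from feedback (e.g. recent fitness improvement), and self-adaptive control encodes the rate in the genome itself.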

And many more; just search the literature using the keywords above. There are scientific methods for finding suitable parameters for any given problem!

It ain't easy.

Why? Because of the No Free Lunch theorem. This basically states that there is no **general** search algorithm that works well for **all** problems.

The best you can do is tailor the search to a **specific** problem space. You'll have to manually tweak your parameters to fit your solution. Sorry.

Using a GA to find GA parameters gets complicated. How do you find the optimal parameters for your GA-GA search? Another GA...?

The one time I programmed a genetic algorithm, I included those values in the values to mutate, basically like you said, using a GA to configure itself. It worked surprisingly well, especially since I've found it to be beneficial for those values to change over the course of its computation.
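The idea this answer describes, letting an individual carry and evolve its own mutation step size, can be sketched as follows (an evolution-strategy-style log-normal update; the layout of the genome and the `tau` constant are assumptions for illustration):

```python
import math
import random

def mutate(individual, rng, tau=0.5):
    """Self-adaptive mutation: the genome's last entry is its own mutation
    step size, which is itself perturbed (log-normally) before being used."""
    *xs, sigma = individual
    sigma = max(1e-6, sigma * math.exp(tau * rng.gauss(0, 1)))  # evolve sigma
    xs = [x + sigma * rng.gauss(0, 1) for x in xs]              # then use it
    return xs + [sigma]

rng = random.Random(0)
child = mutate([1.0, 2.0, 0.1], rng)   # two object variables + one step size
```

Selection then implicitly favours individuals whose step sizes suit the current stage of the search, which is why the rates end up changing over the run.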

There really isn't an automatic way to do it for a given dataset. If there were, they wouldn't expose those parameters. Using a second GA to tune the parameters of the first GA is perilous -- do you use a third GA to tune the parameters of the second? Even if you did that, it's a recipe for overfitting anyway.

My advice would be to play with the parameters and see how they affect your population distribution at each generation, how many generations it takes to get to an acceptable answer, etc. If you have too much mutation, your population will never stabilize. Too little, and you'll end up with homogeneity.

It's a dirty secret of GAs that tuning them is an art, not a science.
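One cheap way to watch for the homogeneity failure mode mentioned above is to track the spread of fitness values each generation; a value collapsing toward zero means the population has gone uniform. A minimal sketch (the function name and the choice of fitness spread as the diversity measure are mine):

```python
import statistics

def diversity(population, fitness):
    """Population-wide standard deviation of fitness; near zero suggests
    the population has gone homogeneous (mutation likely too low)."""
    scores = [fitness(ind) for ind in population]
    return statistics.pstdev(scores)
```

Genotypic measures (e.g. average pairwise distance) catch the case where distinct genomes share one fitness value, at a higher cost per generation.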

As everyone else said, there is no one answer. Although there is some tendency to use a crossover rate of 0.7-0.9 and a mutation rate of 0.1-0.3, it really depends: on the problem, possibly on the fitness function, and definitely on the Genetic Algorithm itself. There are many GA variations, and the optimal parameters for the same problem may vary.

As for using a GA to tune the parameters of the target GA, there are approaches like that, but, as has been pointed out, how do you tune the parameters of the first GA? Keep in mind that the mutation rate should perhaps be higher at the beginning and then decrease, while the crossover rate should increase. It is an issue of exploration versus exploitation. There are approaches that let the GA be more adaptive and change its parameters as it looks for a solution. Fuzzy controllers are sometimes used to manipulate the parameters of a GA. There are also other approaches.

If you want to know more about it, buy some books, or look through academic research papers.

If you need to set up your own GA without extensive research, try some values from others' work, and experiment with them.
