Compare and Contrast Monte-Carlo Method and Evolutionary Algorithms

What's the relationship between the Monte-Carlo Method and Evolutionary Algorithms? On the face of it they seem to be unrelated simulation methods used to solve complex problems. Which kinds of problems is each best suited for? Can they solve the same set of problems? What is the relationship between the two (if there is one)?

Rush answered 19/9, 2011 at 2:26 Comment(0)

"Monte Carlo" is, in my experience, a heavily overloaded term. People seem to use it for any technique that uses a random number generator: global optimization, scenario analysis (Google "Excel Monte Carlo simulation"), stochastic integration (the Pi calculation that everybody uses to demonstrate MC), and so on. Because you mentioned evolutionary algorithms in your question, I believe you are talking about Monte Carlo techniques for mathematical optimization: you have some sort of fitness function with several input parameters and you want to minimize (or maximize) that function.
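(As an aside, the stochastic-integration sense of the term is easy to demonstrate. Here is a minimal Python sketch of the Pi calculation mentioned above; the function name and sample count are my own choices:)

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate Pi by sampling points uniformly in the unit square
    and counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # (area of quarter circle) / (area of square) = pi / 4
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.14159
```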

If your function is well behaved (there is a single, global minimum that you will arrive at no matter which inputs you start with) then you are best off using a deterministic minimization technique such as the conjugate gradient method. Many machine learning classification techniques involve finding parameters that minimize the least squares error for a hyperplane with respect to a training set. The function being minimized in this case is a smooth, well-behaved paraboloid in n-dimensional space. Calculate the gradient and roll downhill. Easy peasy.
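A toy Python illustration of the "roll downhill" idea, using plain gradient descent on a made-up paraboloid (a real conjugate gradient routine would converge in fewer steps, but the principle is the same):

```python
def grad_descent(grad, x0, lr=0.1, tol=1e-9, max_iter=10_000):
    """Plain gradient descent: step against the gradient until it vanishes."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
        if sum(gi * gi for gi in g) < tol:
            break
    return x

# Smooth, well-behaved paraboloid: f(x, y) = (x - 1)^2 + 2*(y + 3)^2
grad_f = lambda p: [2 * (p[0] - 1), 4 * (p[1] + 3)]
print(grad_descent(grad_f, [10.0, 10.0]))  # converges near [1, -3]
```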

If, however, your input parameters are discrete (or if your fitness function has discontinuities) then it is no longer possible to calculate gradients accurately. This can happen if your fitness function is calculated using tabular data for one or more variables (if variable X is less than 0.5, use this table; otherwise use that table). Alternatively, you may have a program that you got from NASA that is made up of 20 modules written by different teams and that you run as a batch job: you supply it with input and it spits out a number (think black box). Depending on the input parameters you start with, you may end up in a false minimum. Global optimization techniques attempt to address these types of problems.
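A contrived Python sketch of such a tabular fitness function (the jump at 0.5 is invented for illustration) shows how a naive downhill search gets trapped in the false minimum:

```python
def tabular_fitness(x):
    """Black-box fitness with a table switch at x = 0.5 (hypothetical).
    The discontinuity hides the global minimum from a local search."""
    if x < 0.5:
        return (x + 1.0) ** 2 + 2.0   # false minimum at x = -1, f = 2
    return (x - 2.0) ** 2             # global minimum at x = 2, f = 0

# A naive downhill walk started at x = -3 settles into the false minimum:
x = -3.0
step = 0.01
while tabular_fitness(x + step) < tabular_fitness(x):
    x += step
print(x)  # stuck near -1; it never crosses the jump at 0.5
```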

Evolutionary Algorithms form one class of global optimization techniques. Global optimization techniques typically involve some sort of "hill climbing" (accepting a configuration with a higher, i.e. worse, fitness value). This hill climbing typically involves some randomness/stochastic-ness/monte-carlo-ness. In general, these techniques are more likely to accept less optimal configurations early on and, as the optimization progresses, become less likely to accept inferior configurations.
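A simulated-annealing-style sketch of that acceptance rule in Python (the cooling schedule, step size, and parameter values are arbitrary choices, not canonical ones):

```python
import math
import random

def anneal(fitness, x0, n_steps=20_000, t0=2.0, seed=1):
    """Stochastic hill climbing with a cooling schedule: worse moves are
    accepted with probability exp(-delta / T), and T shrinks over time."""
    rng = random.Random(seed)
    x, fx = x0, fitness(x0)
    best, fbest = x, fx
    for i in range(n_steps):
        t = t0 * (1.0 - i / n_steps) + 1e-6       # linear cooling
        cand = x + rng.gauss(0.0, 0.5)            # random neighbor
        fc = fitness(cand)
        # Always accept improvements; accept worse moves with prob e^{-d/T}
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
    return best

# A discontinuous landscape: false minimum at -1 (f = 2), global at 2 (f = 0)
def f(x):
    return (x + 1.0) ** 2 + 2.0 if x < 0.5 else (x - 2.0) ** 2

print(anneal(f, -3.0))  # the uphill moves let it escape the false minimum
```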

Evolutionary algorithms are loosely based on evolutionary analogies. Simulated annealing is based upon analogies to annealing in metals. Particle swarm techniques are also inspired by biological systems. In all cases you should compare results to a simple random (a.k.a. "monte carlo") sampling of configurations; this will often yield equivalent results.
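That random-sampling baseline is only a few lines of Python (the search bounds, sample count, and test landscape here are arbitrary):

```python
import random

def random_search(fitness, lo, hi, n_samples=10_000, seed=0):
    """Pure Monte Carlo baseline: sample configurations uniformly at
    random and keep the best one seen."""
    rng = random.Random(seed)
    best, fbest = None, float("inf")
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        fx = fitness(x)
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest

# Discontinuous landscape: false minimum at -1 (f = 2), global at 2 (f = 0)
f = lambda x: (x + 1.0) ** 2 + 2.0 if x < 0.5 else (x - 2.0) ** 2
print(random_search(f, -5.0, 5.0))  # finds a point near the global minimum
```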

My advice is to start off using a deterministic gradient-based technique, since these generally require far fewer function evaluations than stochastic/monte-carlo techniques. When you hear hoofbeats, think horses, not zebras. Run the optimization from several different starting points and, unless you are dealing with a particularly nasty problem, you should end up with roughly the same minimum. If not, then you might have zebras and should consider using a global optimization method.
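That multi-start sanity check might look like this in Python (a hypothetical one-dimensional fitness with a single basin; the learning rate and iteration count are arbitrary):

```python
def descend(grad, x0, lr=0.05, n_iter=2000):
    """One-dimensional gradient descent, used here as the local optimizer."""
    x = x0
    for _ in range(n_iter):
        x = x - lr * grad(x)
    return x

# Restart the local optimizer from several starting points; if every run
# lands on the same minimum, you probably have "horses" (a single basin).
grad_f = lambda x: 2.0 * (x - 4.0)   # gradient of f(x) = (x - 4)^2
minima = [round(descend(grad_f, s), 4) for s in (-10.0, 0.0, 15.0)]
print(minima)  # all three starts agree: [4.0, 4.0, 4.0]
```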

Graiae answered 19/9, 2011 at 2:57 Comment(3)
I like your answer but it seems incomplete. You touched upon how Evolutionary Algorithms work, but didn't explicitly discuss what kinds of problems they are best suited for. Please also discuss the Monte-Carlo Method in greater detail. – Rush
"People seem to use it (Monte-Carlo) for any technique that uses a random number generator". Is this a valid definition? Or are you implying Monte-Carlo means something else? – Rush
@Rush To quote from the Wikipedia article that you linked, "Monte Carlo methods (or Monte Carlo experiments) are a class of computational algorithms that rely on repeated random sampling to compute their results." My point is simply that MC describes a CLASS of algorithms. In the context of global optimization, Evolutionary Algorithms are one of many Monte Carlo (a.k.a. stochastic) optimization approaches. – Graiae

Well, I think "Monte Carlo methods" is the general name for methods that use random numbers in order to solve optimization problems. In this sense, even evolutionary algorithms are a type of Monte Carlo method, since they do in fact use random numbers.

Other Monte Carlo methods include Metropolis sampling, Wang-Landau sampling, parallel tempering, etc.

OTOH, evolutionary methods use "techniques" borrowed from nature, such as mutation, crossover, etc.
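Those ingredients can be sketched in a few lines of Python (tournament selection, blend crossover, and Gaussian mutation on a made-up one-dimensional fitness; all parameter values are arbitrary choices):

```python
import random

def evolve(fitness, pop_size=40, n_gen=80, seed=3):
    """Minimal evolutionary algorithm sketch."""
    rng = random.Random(seed)
    pop = [rng.uniform(-5.0, 5.0) for _ in range(pop_size)]
    for _ in range(n_gen):
        nxt = []
        for _ in range(pop_size):
            # Tournament selection: the fitter of two random individuals
            p1 = min(rng.sample(pop, 2), key=fitness)
            p2 = min(rng.sample(pop, 2), key=fitness)
            child = 0.5 * (p1 + p2)          # crossover (blend of parents)
            child += rng.gauss(0.0, 0.3)     # mutation (random perturbation)
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

f = lambda x: (x - 2.0) ** 2   # hypothetical fitness, minimum at x = 2
best = evolve(f)
print(best)  # close to 2
```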

Foot answered 19/9, 2011 at 3:23 Comment(0)
