ade.de.DifferentialEvolution(object)
class documentation
Part of ade.de
Asynchronous Differential Evolution: The foundational object.
I perform differential evolution asynchronously, employing your multiple CPU cores or CPUs in a very cool, efficient way that does not change anything about the actual operations done in the DE algorithm. The very same target selection, mutation, scaling, and recombination (crossover) will be done for each target in the population, in sequence, just as it is with a DE algorithm that blocks during fitness evaluation.
Lots of Lock
My magic lies in the use of Twisted's DeferredLock. There's one for each index of the population list. Because the number of population members is usually far greater than the number of CPU cores available, almost all of the time the asynchronous processing will find a target it can work on without disturbing the operation sequence.
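Below is a minimal, illustrative sketch of that per-index DeferredLock idea, not ade's actual implementation; the LockedList helper and its names are hypothetical.

    from twisted.internet import defer

    class LockedList:
        # Hypothetical helper: one DeferredLock per population index.
        def __init__(self, Np):
            self.locks = [defer.DeferredLock() for _ in range(Np)]

        def runLocked(self, k, f, *args):
            # Acquire the lock for index k, run f(*args), then release it.
            # Locks for other indexes stay free, so evaluations of other
            # targets can proceed concurrently.
            return self.locks[k].run(f, *args)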
Population & Individuals
Construct me with a Population instance and any keywords that set my runtime configuration different than my default attributes. Before running my instance with __call__, you must call the Population.setup method. That initializes it with a population of Individual objects that can be evaluated according to the population object's evaluation function.
The evaluation function must return a fitness metric where lower values indicate better fitness, None or inf represents an invalid or failing individual and thus worst-possible fitness, and a negative number represents a fatal error that will terminate operations. It must accept either a 1-D Numpy array of parameter values or, if I am shutting down, a None object.
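Here is a hypothetical evaluation function that follows that contract (the two-parameter SSE-style metric is made up for illustration):

    import numpy as np

    def evaluate(values):
        # Hypothetical evaluation function: lower is fitter, None/inf marks a
        # failed individual, and a negative return would abort the whole run.
        if values is None:
            # Called with None while shutting down; nothing to evaluate.
            return None
        if not np.all(np.isfinite(values)):
            return np.inf               # invalid individual: worst-possible fitness
        x, y = values                   # a 1-D Numpy array of parameter values
        return (x - 1.0)**2 + 10.0*(y - 2.0)**2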
Darwin, Interrupted
When running in a console Python application, my shutdown
method gets called when the Enter key is pressed. It took quite a bit of
work to implement that user-abort capability in a clean, Twisted-friendly
manner, but it was worth it. Serious evolution of stuff with DE involves
a lot of observing the distributions of parameter values vs SSE, stopping
evolution, tweaking the parameter ranges, and resuming evolution
again.
Crossover Rate
My CR attribute determines the crossover rate, the probability of the mutant's value being used for a given parameter instead of just copying the parent's value. A low CR means less mutation and thus innovation. But the less drastic, lower-dimensional movement in the search space that a low CR results in ultimately may be more productive.
Montgomery & Chen, "An Analysis of the Operation of Differential Evolution at High and Low Crossover Rates" (2010): "Small values of CR result in exploratory moves parallel to a small number of axes of the search space, while large values of CR produce moves at angles to the search space’s axes. Consequently, the general consensus, supported by some empirical studies, is that small CR is useful when solving separable problems while large CR is more useful when solving non-separable problems."
Despite the simplicity of CR being proportional to exploration dimensionality, selecting a value for CR is not terribly intuitive. Montgomery & Chen show that, for some well-known competition problems, performance is best when CR is near but not at either extreme of 0.0 or 1.0. The ranges 0.1-0.2 and 0.8-0.9 look promising. They note that "characteristics and convergence rates are all highly different" at each end of the overall CR range: "While DE with CR = 0.9 relies on the population converging so that its moves may be scaled for finer-grained search, DE with CR ≤ 0.1 maintains a highly diverse population throughout its course, especially in complex landscapes, as individual solutions conduct largely independent searches of the solution space."
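Because one randomly chosen parameter always comes from the mutant (see the crossover description below), a quick back-of-the-envelope calculation shows how CR sets the expected number of mutant-derived parameters; the helper below is purely illustrative:

    def expectedMutantParams(CR, D):
        # One parameter is always taken from the mutant; each of the other
        # D-1 is taken with probability CR.
        return 1 + CR*(D - 1)

    for CR in (0.1, 0.5, 0.9):
        print(CR, expectedMutantParams(CR, D=10))
    # CR=0.1 -> ~1.9 of 10 parameters from the mutant (axis-parallel moves)
    # CR=0.9 -> ~9.1 of 10 parameters from the mutant (moves at angles to the axes)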
Parameters | logHandle | An open handle for a log file, or True to log output to STDOUT, or None to suppress logging. Default is STDOUT. |
CR | The crossover rate between parent (i.e., basis) and mutant (i.e., candidate, offspring). CR is the probability of the mutant's value being used for a given parameter. Only if a U[0,1] random variate is bigger than CR (as it seldom will be with typical CR around 0.8), and only if the parameter is not a reserved random one that must be mutated, is the mutant's value discarded and the parent's used. (type: float) | |
F | A scalar or two-element sequence defining the differential weight, or a range of possible weights from which one is obtained as a uniform random variate. | |
maxiter | The maximum number of iterations (i.e., generations) to run. It can be useful to set this to something realistic (e.g., 500 for big populations and lengthy evaluations) so that you have a nice summary of the best candidates when you come back and check results in an hour or so. | |
randomBase | Set True to use DE/rand/1/bin, where a random individual is chosen from the Population instead of the current best individual as the basis for mutants. Or set it to a float between 0.0 and 1.0 to use ADE's modified version DE/prob/1/bin, where the probability of an individual being chosen increases with how close it is to being the best one in the population; the higher the number, the closer to uniformly random that probability will be. | |
uniform | Set True to initialize the population with uniform random variates as the parameters instead of a Latin hypercube. Not usually recommended, because you don't want to start off with clustered parameters. | |
adaptive | Set True to adapt the value (or values) of F in a way that tries to maintain the number of successful challenges at a reasonable level. The adaptive algorithm accounts not just for whether a challenge succeeded but also how much better the challenger is than the individual it replaced. | |
bitterEnd | Normally, ade quits trying once there are too few successful challenges and it appears that further iterations won't accomplish much. Set this True if you have all the time in the world and want to keep going until maxiter. | |
dwellByGrave | The number of iterations that ade hangs around after it's decided that nothing more is being accomplished. If you think there really is some progress being made, with occasional marginally better replacements, but don't want to go until the bitterEnd, feel free to increase this from the default. Be aware that increasing it too much, e.g., to 40, can effectively force the iterations to continue until the bitterEnd, because a single adaptive increase in F will reset the count and you'll need all those consecutive no-progress iterations to happen all over again for it to quit. | |
goalSSE | You can set a goal for SSE to indicate that any further iterations are pointless once that goal is reached. If defined and the best individual has a better SSE than that goal at the end of an iteration, there will be no further iterations. | |
xSSE | Set True if your evaluation function can accept and make use of an xSSE keyword defining an SSE value above which continuing the evaluation is pointless. If this is set, each call to the eval function for a challenger will include the xSSE keyword set to its target's SSE. If the challenger's SSE exceeds xSSE, the evaluation can terminate early because the challenge will fail no matter what. |
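Putting the keywords above together, a construction sketch might look like the following; population is assumed to be an already-constructed Population instance, and the particular values are only examples:

    from ade.de import DifferentialEvolution

    de = DifferentialEvolution(
        population,
        CR=0.8,            # crossover rate
        F=(0.5, 1.0),      # differential weight drawn uniformly from this range
        maxiter=500,       # generation limit
        randomBase=0.5,    # DE/prob/1/bin base selection
        adaptive=True,     # adapt F to keep challenges succeeding
        goalSSE=0.01,      # stop once the best SSE beats this
    )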
Class Variable | attributes | Default values for attributes CR, F, maxiter, randomBase, uniform, adaptive, bitterEnd, and dwellByGrave that define how a Differential Evolution run should be conducted. The attributes are set by my constructor, and the defaults can be overridden with constructor keywords. (That's why they're listed here as keywords.) (type: dict) |
Instance Variable | p | An instance of Population supplied as my first constructor argument. |
Instance Variable | fm | An instance of FManager. |
Instance Variable | running | True unless my shutdown method has been called. |
Method | __init__ | DifferentialEvolution(population, **kw) |
Method | shutdown | Call this to shut me down gracefully. |
Method | crossover | Crossover of parent and mutant individuals. |
Method | challenge | Challenges the target ("parent") individual at index kt with a challenger at index kb. |
Method | __call__ | Here is what you call to run differential evolution on my Population p of individuals. |
Method | _run | Called by __call__ to do most of the work. func is a callback function that gets called after each generation with the generation number as the sole argument. |
Call this to shut me down gracefully.
Repeated calls are ignored. Gets called when the Enter key is pressed. Sets my running flag False, which lets all my various loops know that it's time to quit early. Calls Population.abort on my Population object p to shut it down ASAP.
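In addition to the built-in Enter-key abort, you could schedule a graceful stop yourself; this is a hypothetical usage where de is a running DifferentialEvolution instance:

    from twisted.internet import reactor

    # Abort the run gracefully after an hour, whether or not Enter is pressed.
    reactor.callLater(3600, de.shutdown)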
Crossover of parent and mutant individuals.
The probability of the mutant keeping any given value is CR, except for a randomly chosen one that it always gets to keep.
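An illustrative stand-in for that crossover, not ade's actual code, might look like this:

    import numpy as np

    def binomialCrossover(parent, mutant, CR):
        # Each parameter keeps the mutant's value with probability CR, and one
        # randomly chosen parameter always keeps it.
        D = len(parent)
        keepMutant = np.random.uniform(size=D) < CR
        keepMutant[np.random.randint(D)] = True
        return np.where(keepMutant, mutant, parent)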
def challenge(self, kt, kb):
Challenges the target ("parent") individual at index kt with a challenger at index kb.
The challenger, often referred to as a "trial" or "child," is an Individual that was produced from DE mutation and crossover.
The trial individual is formed from crossover between the target and a donor individual, which is formed from the vector sum of a base individual at index kb and a scaled vector difference between two randomly chosen other individuals that are distinct from each other and both the target and base individuals:
    id = ib + F*(i0 - i1)      [1]
    ic = crossover(it, id)     [2]
First, I calculate the vector difference between the two other individuals. Then I scale that difference by F, the current, possibly random, possibly population-dependent value of which is obtained with a call to the get method of my FManager. Then I add the scaled difference to the base individual at kb to form the donor, and perform crossover between the target and that donor to obtain the challenger.
The crossover of [2], the "bin" in DE/[rand|best]/1/bin, consists of giving each parameter of the donor individual a chance (usually a very good chance) to appear in the challenger, as opposed to using the target's parameter. For each parameter, if a uniform random number in the range of 0-1 is less than my attribute CR, I use the donor's version of that parameter and thus preserve the mutation performed in [1]. Otherwise, I use the target's version and discard the mutation for that parameter.
Finally, I conduct a tournament between the target and the challenger. Whoever has the lowest result of a call to Individual.evaluate is the winner and is assigned to index kt. The "returned" Deferred fires with None.
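For intuition only, here is a conceptual, synchronous sketch of one such challenge; the real method is asynchronous and lock-protected, and this reuses the hypothetical binomialCrossover and evaluate sketches from earlier:

    import numpy as np

    def challengeSketch(pop, sse, kt, kb, F, CR):
        # pop is a 2-D array of parameter vectors, sse the matching SSE values.
        Np = len(pop)
        # Two other individuals, distinct from each other and from target/base
        k0, k1 = np.random.choice(
            [k for k in range(Np) if k not in (kt, kb)], size=2, replace=False)
        donor = pop[kb] + F*(pop[k0] - pop[k1])             # [1] mutation
        challenger = binomialCrossover(pop[kt], donor, CR)  # [2] crossover
        challengerSSE = evaluate(challenger)                # lower is better
        if challengerSSE is not None and challengerSSE < sse[kt]:
            pop[kt], sse[kt] = challenger, challengerSSE    # challenger wins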
def _run(self, func):
Called by __call__ to do most of the work. func is a callback function that gets called after each generation with the generation number as the sole argument.
Returns a Deferred that fires with my Population object p when the DE run is completed.
Here is what you call to run differential evolution on my Population p of individuals.
You have to construct me with the population object, and you have to run Population.setup on it yourself. Make sure that's been done and the resulting Deferred has fired before trying to call this.
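An end-to-end sketch of that sequence under Twisted might look like the following; population is assumed to be an already-constructed Population instance, and setup is assumed to be callable with no arguments here:

    from ade.de import DifferentialEvolution
    from twisted.internet import defer, task

    @defer.inlineCallbacks
    def main(reactor):
        de = DifferentialEvolution(population, maxiter=300)
        yield population.setup()   # the setup Deferred must fire first
        p = yield de()             # runs DE; fires with the Population when done
        defer.returnValue(p)

    task.react(main)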
At the conclusion of each generation's evaluations, I consider the amount of overall improvement if I am running in adaptive mode. If the overall improvement (sum of rounded ratios between SSE of replaced individuals and their replacements) exceeded that required to maintain the status quo, I bump up F a bit. If the overall improvement did not meet that threshold, I reduce F a bit, but only if there was no new best individual in the population.
So long as the best individual in the population keeps getting better with each generation, I will continue to run, even with tiny overall improvements.
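A conceptual sketch of that adaptive policy, treating F as a scalar for simplicity; the actual logic lives in FManager and differs in its details:

    def adaptF(F, improvement, threshold, newBest):
        if improvement > threshold:
            return min(1.2*F, 1.0)   # enough improvement: bump F up a bit
        if not newBest:
            return max(0.8*F, 0.05)  # too little, and no new best: reduce F a bit
        return F                     # new best individual: leave F alone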
Returns a Deferred that fires with my Population object p when the DE run is completed.
Parameters | func | Supply a callback function to have it called after each generation, with the generation number as the sole argument. |