Asynchronous Differential Evolution: The foundational object.
I perform differential evolution asynchronously, employing your multiple CPU cores or CPUs in an efficient way that does not change anything about the actual operations of the DE algorithm. The very same target selection, mutation, scaling, and recombination (crossover) are done for each target in the population, in sequence, just as with a DE algorithm that blocks during fitness evaluation.
Lots of Locks
My magic lies in the use of Twisted's DeferredLock. There's one for each index of the population list. Because the number of population members is usually far greater than the number of CPU cores available, the asynchronous processing can almost always find a target it can work on without disturbing the operation sequence.
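The per-index locking idea can be sketched with stdlib asyncio primitives. This is an illustrative analogy only: ade itself uses Twisted, and the names here are hypothetical, not ade's API.

```python
import asyncio
import random

async def evaluate(k, locks, log):
    # One lock per population index (asyncio stand-in for a Twisted
    # DeferredLock): work on a target only when its slot is free.
    async with locks[k]:
        await asyncio.sleep(random.uniform(0.001, 0.003))  # fake fitness work
        log.append(k)

async def main():
    N = 8                                        # population size >> core count
    locks = [asyncio.Lock() for _ in range(N)]   # one lock per index
    log = []
    # Two waves of challenges contend for the same indices; the per-index
    # locks serialize work on any single target while leaving the rest free.
    await asyncio.gather(*(evaluate(k % N, locks, log) for k in range(2 * N)))
    return log

log = asyncio.run(main())
print(sorted(log))  # each of the 8 indices appears exactly twice
```

Because there are far more locks than concurrent tasks, contention for any single index is rare, which is the point of the design.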
Population & Individuals
Construct me with a Population instance and any keywords that set my runtime configuration differently from my default attributes. Before running my instance with its __call__ method, you must call the population object's setup method. That initializes it with a population of Individual objects that can be evaluated according to the population object's evaluation function.
The evaluation function must return a fitness metric where lower values indicate better fitness, an infinite value represents an invalid or failing individual and thus worst-possible fitness, and a negative number represents a fatal error that will terminate operations. It must accept either a 1-D Numpy array of parameter values or, if I am shutting down, a None object.
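A toy evaluation function obeying that contract might look like the following sketch. The SSE formula and the return value for the shutdown case are assumptions for illustration, not ade's own code.

```python
import math

def evaluate(values):
    # Hypothetical evaluation function following the contract above.
    # 'values' is a 1-D sequence of parameter values, or None at shutdown.
    if values is None:
        return 0.0            # shutdown sentinel; return value here is an assumption
    try:
        # Toy SSE: squared distance from the target point (2, 3)
        sse = (values[0] - 2.0)**2 + (values[1] - 3.0)**2
    except (TypeError, IndexError):
        return float('inf')   # invalid individual: worst-possible fitness
    if not math.isfinite(sse):
        return float('inf')
    return sse

print(evaluate([2.0, 3.0]))   # 0.0: a perfect fit
print(evaluate([4.0, 3.0]))   # 4.0
```

Lower is better, so the all-zeros residual at (2, 3) is the best possible individual for this toy problem.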
When running in a console Python application, my shutdown method gets called when the Enter key is pressed. It took quite a bit of work to implement that user-abort capability in a clean, Twisted-friendly manner, but it was worth it. Serious evolution of stuff with DE involves a lot of observing the distributions of parameter values vs. SSE, stopping evolution, tweaking the parameter ranges, and resuming evolution.
My CR attribute determines the crossover rate, the probability of the mutant's value being used for a given parameter instead of just copying the parent's value. A low CR means less mutation and thus less innovation. But the less drastic, lower-dimensional movement in the search space that a low CR produces may ultimately be more productive.
Montgomery & Chen, "An Analysis of the Operation of Differential Evolution at High and Low Crossover Rates" (2010): "Small values of CR result in exploratory moves parallel to a small number of axes of the search space, while large values of CR produce moves at angles to the search space’s axes. Consequently, the general consensus, supported by some empirical studies, is that small CR is useful when solving separable problems while large CR is more useful when solving non-separable problems."
Despite the simplicity of CR being proportional to exploration dimensionality, selecting a value for CR is not terribly intuitive. Montgomery & Chen show that, for some well-known competition problems, performance is best when CR is near but not at either extreme of 0.0 or 1.0. The ranges 0.1-0.2 and 0.8-0.9 look promising. They note that "characteristics and convergence rates are all highly different" at each end of the overall CR range: "While DE with CR = 0.9 relies on the population converging so that its moves may be scaled for finer-grained search, DE with CR ≤ 0.1 maintains a highly diverse population throughout its course, especially in complex landscapes, as individual solutions conduct largely independent searches of the solution space."
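The relationship between CR and exploration dimensionality can be checked empirically: with one parameter always taking the mutant's value and each of the others doing so with probability CR, a move touches about 1 + CR*(n-1) dimensions on average. This is a standalone simulation, not ade's code.

```python
import random

def mutated_dims(n, CR, rng):
    # Count how many of n parameters take the mutant's value: each does so
    # with probability CR, plus one randomly chosen parameter that always does.
    forced = rng.randrange(n)
    return sum(1 for k in range(n) if k == forced or rng.random() < CR)

rng = random.Random(42)
n, trials = 10, 20000
results = {}
for CR in (0.1, 0.9):
    avg = sum(mutated_dims(n, CR, rng) for _ in range(trials)) / trials
    results[CR] = avg
    # Expected: 1 + CR*(n-1), i.e. ~1.9 dims for CR=0.1, ~9.1 for CR=0.9
    print(f"CR={CR}: ~{avg:.1f} of {n} dims mutated per move")
```

Low CR thus produces moves parallel to a few axes; high CR produces nearly full-dimensional moves, matching Montgomery & Chen's characterization.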
|Parameters||logHandle||An open handle for a log file, or None.|
|CR||The crossover rate between parent (i.e., basis) and mutant (i.e., candidate, offspring) individuals. CR is the probability of the mutant's value being used for a given parameter. Only if a U[0,1] random variate is bigger than CR (as it seldom will be with a typical CR around 0.8), and only if the parameter is not the randomly chosen one that always takes the mutant's value, is the mutant's value discarded and the parent's used. (type: float)|
|F||A scalar defining a fixed differential weight, or a two-element sequence defining a range of possible weights from which one is obtained as a uniform random variate.|
|maxiter||The maximum number of iterations (i.e., generations) to run. It can be useful to set this to something realistic (e.g., 500 for big populations and lengthy evaluations) so that you have a nice summary of the best candidates when you come back and check results in an hour or so.|
|bitterEnd||Normally, ade quits trying once there are too few successful challenges and it appears that further iterations won't accomplish much. Set this True to keep iterating all the way to maxiter regardless.|
|dwellByGrave||The number of iterations that ade hangs around after it's decided that nothing more is being accomplished. If you think there really is some progress being made, with occasional marginally better replacements, but don't want to go all the way to the bitterEnd, feel free to increase this from the default. Be aware that increasing it too much (e.g., to 40) can effectively force the iterations to continue until the bitterEnd, because a single adaptive increase in F will reset the count, and all those consecutive no-progress iterations will need to happen all over again before quitting.|
|goalSSE||A goal for SSE indicating that further iterations are pointless once it is reached. If defined, and the best individual has a better SSE than the goal at the end of an iteration, there will be no further iterations.|
|Class Variable||attributes||Default values for attributes CR, F, maxiter, randomBase, uniform, adaptive, bitterEnd, and dwellByGrave that define how a Differential Evolution run should be conducted. The attributes are set by my constructor, and the defaults can be overridden with constructor keywords. (That's why they're listed here as keywords.) (type: dict)|
|Instance Variable||p||An instance of Population, supplied as my first constructor argument.|
|Instance Variable||fm||An instance of FManager, which supplies the current value of F.|
|Method||shutdown||Call this to shut me down gracefully.|
|Method||crossover||Crossover of parent and mutant individuals.|
|Method||challenge||Challenges the target ("parent") individual at index kt with a challenger at index kb.|
|Method||__call__||Here is what you call to run differential evolution on my Population, supplied as my first constructor argument. The run continues until termination criteria are met or my shutdown method has been called.|
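The scalar-or-range behavior of F described in the table can be sketched with a small helper. This is a hypothetical illustration, not ade's FManager.

```python
import random

def get_F(F, rng=random):
    # F may be a scalar (a fixed differential weight) or a two-element
    # sequence giving a range from which a uniform random variate is drawn.
    if hasattr(F, '__len__'):
        lo, hi = F
        return rng.uniform(lo, hi)
    return float(F)

print(get_F(0.8))           # always 0.8
f = get_F((0.5, 1.0))       # somewhere in [0.5, 1.0]
print(0.5 <= f <= 1.0)      # True
```

Drawing F afresh for each mutation injects extra diversity into the scaled difference vectors; a fixed scalar keeps the step-size policy constant.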
Crossover of parent and mutant individuals.
The probability of the mutant keeping any given value is CR, except for a randomly chosen one that it always gets to keep.
def challenge(self, kt, kb):
Challenges the target ("parent") individual at index kt with a challenger at index kb.
The trial individual is formed from crossover between the target and a donor individual, which is formed from the vector sum of a base individual at index kb and a scaled vector difference between two randomly chosen other individuals that are distinct from each other and both the target and base individuals:
id = ib + F*(i0 - i1)
ic = crossover(it, id)
First, I calculate the vector difference between the two other individuals. Then I scale that difference by F, the current, possibly random, possibly population-dependent value of which is obtained with a call to the get method of my FManager. Then I add the scaled difference to the base individual at kb to form the donor, and perform crossover between target and donor to obtain the challenger.
The crossover of target and donor, the "bin" in DE/[rand|best]/1/bin, consists of giving each parameter of the donor individual a chance (usually a very good chance) to appear in the challenger, as opposed to using the target's parameter. For each parameter, if a uniform random number in the range of 0-1 is less than my attribute CR, I use the donor's version of that parameter and thus preserve the mutation. Otherwise, I use the target's version and discard the mutation for that parameter.
Finally, I conduct a tournament between the target and the challenger. Whoever has the lowest result of a call to the evaluation function is the winner and is assigned to index kt. Returns a Deferred that fires when the challenge is concluded.
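The whole challenge sequence (donor formation, binomial crossover, tournament) can be sketched synchronously as follows. This is a self-contained toy, not ade's implementation; the function names, the toy SSE, and the population setup are all illustrative assumptions.

```python
import random

def crossover(target, donor, CR, rng):
    # Binomial ("bin") crossover: each parameter comes from the donor with
    # probability CR; one randomly chosen parameter always does.
    forced = rng.randrange(len(target))
    return [d if (k == forced or rng.random() < CR) else t
            for k, (t, d) in enumerate(zip(target, donor))]

def challenge(pop, sse, kt, kb, F, CR, rng):
    # Pick two other individuals distinct from each other and from both
    # the target (kt) and the base (kb), form the donor id = ib + F*(i0-i1),
    # cross it over with the target, then hold the tournament at index kt.
    others = [k for k in range(len(pop)) if k not in (kt, kb)]
    k0, k1 = rng.sample(others, 2)
    donor = [ib + F * (i0 - i1)
             for ib, i0, i1 in zip(pop[kb], pop[k0], pop[k1])]
    challenger = crossover(pop[kt], donor, CR, rng)
    if sse(challenger) < sse(pop[kt]):
        pop[kt] = challenger          # challenger wins the tournament

# Toy run: minimize sum of squares with a 5-member, 2-parameter population
rng = random.Random(1)
sse = lambda x: sum(v * v for v in x)
pop = [[rng.uniform(-5, 5) for _ in range(2)] for _ in range(5)]
before = sse(pop[0])
for _ in range(200):
    challenge(pop, sse, kt=0, kb=rng.randrange(1, 5), F=0.8, CR=0.9, rng=rng)
print(sse(pop[0]) <= before)  # the target never gets worse: True
```

Because replacement only happens on a strictly better SSE, the target's fitness is monotonically non-increasing, which is the greedy selection at the heart of DE.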
def _run(self, func):
Here is what you call to run differential evolution on my population. You have to construct me with the population object, and you have to run its setup method yourself. Make sure that's been done and the resulting Deferred has fired before trying to call this.
At the conclusion of each generation's evaluations, I consider the amount of overall improvement if I am running in adaptive mode. If the overall improvement (sum of rounded ratios between SSE of replaced individuals and their replacements) exceeded that required to maintain the status quo, I bump up F a bit. If the overall improvement did not meet that threshold, I reduce F a bit, but only if there was no new best individual in the population.
So long as the best individual in the population keeps getting better with each generation, I will continue to run, even with tiny overall improvements.
Returns a Deferred that fires with my population p when the DE run is completed.
|Parameters||func||Supply a callback function to have it called after each generation, with the generation number as the sole argument.|
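The adaptive-F logic described above (bump F on strong overall improvement, shrink it on stagnation unless a new best individual appeared) can be sketched like this. The threshold, scale factors, and bounds are assumptions for illustration; ade's FManager has its own values.

```python
def adapt_F(F, improvements, new_best, up=1.25, down=0.9,
            threshold=2.0, F_max=1.0, F_min=0.05):
    # improvements: rounded SSE-improvement ratios for this generation's
    # successful replacements. Values here are hypothetical, not ade's.
    score = sum(improvements)
    if score > threshold:
        return min(F * up, F_max)      # good progress: bump F up a bit
    if not new_best:
        return max(F * down, F_min)    # stagnation and no new best: shrink F
    return F                           # a new best appeared: hold F steady

print(adapt_F(0.5, [1, 2], new_best=False))  # 0.625: progress exceeded threshold
print(adapt_F(0.5, [1], new_best=False))     # 0.45: below threshold, no new best
print(adapt_F(0.5, [1], new_best=True))      # 0.5: new best keeps F unchanged
```

The "hold F steady" branch is what lets the run continue indefinitely as long as the best individual keeps improving, even when overall improvement is tiny.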