Asynchronous Differential Evolution: the foundational object.
I perform differential evolution asynchronously, employing your multiple CPU cores or CPUs efficiently without changing anything about the actual operations done in the DE algorithm. The very same target selection, mutation, scaling, and recombination (crossover) are done for each target in the population, in sequence, just as with a DE algorithm that blocks during fitness evaluation.
My magic lies in the use of Twisted's asynchronous processing, with one deferred evaluation for each index of the population list. Because the number of population members is usually far greater than the number of CPU cores available, almost all of the time the asynchronous processing will find a target it can work on without disturbing the operation sequence.
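The idea can be sketched with a plain thread pool. This is a hypothetical illustration, not ade's Twisted-based machinery: fitness evaluations overlap across workers, yet every target index is still challenged exactly once per generation, so the DE operation sequence is preserved.

```python
from concurrent.futures import ThreadPoolExecutor

def run_generation(population, make_challenger, evaluate, workers=4):
    """One DE generation with concurrent fitness evaluations.

    Illustrative sketch only: 'make_challenger' stands in for the
    mutation/crossover steps, and 'evaluate' is the fitness function.
    """
    challengers = [make_challenger(kt) for kt in range(len(population))]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Evaluations run concurrently; results come back in index order
        f_targets = list(pool.map(evaluate, population))
        f_challengers = list(pool.map(evaluate, challengers))
    # Each target's tournament happens exactly once, in sequence
    return [c if fc < ft else t
            for t, c, ft, fc in zip(population, challengers,
                                    f_targets, f_challengers)]
```

With a population of dozens of members and only a handful of workers, the pool is always busy, which is the efficiency the prose above describes.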
Construct me with a Population instance and any keywords that set my runtime configuration different than my default attributes. Before running my instance with __call__, you must call the population object's setup method. That initializes it with a population of individuals that can be evaluated according to the population object's evaluation function.
The evaluation function must return a fitness metric where lower values indicate better fitness; an invalid or failing individual gets worst-possible fitness, and a negative number represents a fatal error that will terminate operations. The function must accept either a 1-D Numpy array of parameter values or, if I am shutting down, a None object.
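A hypothetical evaluation function honoring that contract might look like this. The target array, the benign shutdown return value, and all names here are illustrative assumptions, not part of ade:

```python
import numpy as np

# Made-up target vector for this illustration
TARGET = np.array([1.0, 2.0, 3.0])

def evaluate(values):
    """Return SSE (lower is better), or a negative number on fatal error."""
    if values is None:
        # Called during shutdown; return a benign non-negative value
        return 0.0
    try:
        residuals = np.asarray(values, dtype=float) - TARGET
        return float(np.sum(residuals**2))  # SSE: lower is better
    except (TypeError, ValueError):
        return -1.0  # negative: fatal error, terminate operations
```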
When running in a console Python application, my shutdown method gets called when the Enter key is pressed.
|Parameters||logHandle||An open handle for a log file, or |
|CR||The crossover probability between parent (i.e., basis) and mutant (i.e., candidate, offspring). (type: float)|
|F||A scalar or two-element sequence defining the differential weight: either a single weight, or a range of possible weights from which one is obtained as a uniform random variate.|
|maxiter||The maximum number of iterations (i.e., generations) to run. It can be useful to set this to something realistic (e.g., 500 for big populations and lengthy evaluations) so that you have a nice summary of the best candidates when you come back and check results in an hour or so.|
|bitterEnd||Normally, ade quits trying once there are too few successful challenges and it appears that further iterations won't accomplish much. Set this to keep iterating all the way to maxiter regardless.|
|dwellByGrave||The number of iterations that ade hangs around after it's decided that nothing more is being accomplished. If you think there really is some progress being made with occasional marginally better replacements but don't want to go until the bitterEnd, feel free to increase this from the default.|
|goalSSE||You can set a goal SSE to indicate that further iterations are pointless once that goal is reached. If defined and the best individual has a better SSE than the goal at the end of an iteration, there will be no further iterations.|
|Class Variable||attributes||Default values for attributes CR, F, maxiter, randomBase, uniform, adaptive, bitterEnd, and dwellByGrave that define how a Differential Evolution run should be conducted. The attributes are set by my constructor, and the defaults can be overridden with constructor keywords. (That's why they're listed here as keywords.) (type: dict)|
|Instance Variable||p||An instance of Population, supplied as my first constructor argument.|
|Instance Variable||fm||An instance of FManager, which provides the current value of F via its get method.|
|Method||shutdown||Call this to shut me down gracefully.|
|Method||crossover||Crossover of parent and mutant individuals.|
|Method||challenge||Challenges the target ("parent") individual at index kt with a challenger at index kb.|
|Method||__call__||Here is what you call to run differential evolution on my Population, supplied as my first constructor argument. Runs until completion or until my shutdown method has been called.|
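The defaults-plus-keywords pattern that the attributes class variable describes can be sketched as follows. HypotheticalDE and the default values shown are illustrative stand-ins, not ade's actual class or numbers:

```python
class HypotheticalDE:
    # Class-level defaults for the run configuration (values made up)
    attributes = {'CR': 0.8, 'F': (0.5, 1.0), 'maxiter': 500,
                  'bitterEnd': False, 'dwellByGrave': 20, 'goalSSE': None}

    def __init__(self, population, **kw):
        self.p = population
        for name, default in self.attributes.items():
            # Constructor keywords override the class-level defaults
            setattr(self, name, kw.pop(name, default))
        if kw:
            raise TypeError("Unknown keywords: {}".format(sorted(kw)))
```

So `HypotheticalDE(pop, CR=0.9)` would run with CR at 0.9 while every other attribute keeps its class-level default.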
Crossover of parent and mutant individuals.
The probability of the mutant keeping any given value is CR, except for a randomly chosen one that it always gets to keep.
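That rule can be sketched as a standalone function. This is an illustrative binomial-crossover sketch, not ade's actual crossover method:

```python
import random

def crossover(target, donor, CR, rng=random):
    """Binomial ("bin") crossover: each challenger parameter comes from
    the donor with probability CR, except one randomly chosen index that
    always comes from the donor."""
    n = len(target)
    k_always = rng.randrange(n)
    return [donor[k] if k == k_always or rng.random() < CR else target[k]
            for k in range(n)]
```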
def challenge(self, kt, kb):
Challenges the target ("parent") individual at index kt with a challenger at index kb.
The trial individual is formed from crossover between the target and a donor individual, which is formed from the vector sum of a base individual at index kb and a scaled vector difference between two randomly chosen other individuals that are distinct from each other and both the target and base individuals:
id = ib + F*(i0 - i1)
ic = crossover(it, id)
First, I calculate the vector difference between the two other individuals. Then I scale that difference by F, the current (possibly random, possibly population-dependent) value of which is obtained with a call to the get method of my FManager. Then I add the scaled difference to the base individual at kb to form the donor, and perform crossover between the target and the donor to obtain the challenger.
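Donor formation can be sketched on its own. This is an illustrative stand-in for the step described above, not ade's implementation; the index-selection helper names are assumptions:

```python
import random
import numpy as np

def make_donor(population, kt, kb, F, rng=random):
    """Form the donor per id = ib + F*(i0 - i1): pick two individuals
    distinct from each other and from both the target kt and base kb,
    then add their F-scaled difference to the base."""
    candidates = [k for k in range(len(population)) if k not in (kt, kb)]
    k0, k1 = rng.sample(candidates, 2)
    return population[kb] + F * (population[k0] - population[k1])
```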
The crossover, the "bin" in DE/[rand|best]/1/bin, consists of giving each parameter of the donor individual a chance (usually a very good chance) to appear in the challenger, as opposed to using the target's parameter. For each parameter, if a uniform random number in the range of 0-1 is less than my attribute CR, I use the donor's version of that parameter and thus preserve the mutation; otherwise, I use the target's version and discard the mutation for that parameter.
Finally, I conduct a tournament between the target and the challenger. Whoever has the lowest result of a call to the evaluation function is the winner and is assigned to index kt. The Deferred I return fires when the challenge is done; I rely on my population object to do most of the work.
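The whole challenge sequence can be sketched synchronously. This is an illustration of the steps described above, not ade's asynchronous Deferred-based code; the sse cache and all parameter names are assumptions:

```python
import random
import numpy as np

def challenge(population, sse, kt, kb, F, CR, evaluate, rng=random):
    """Synchronous sketch of one challenge: donor, crossover, tournament.
    'sse' is a hypothetical cache of each individual's evaluation result."""
    # Two other individuals, distinct from each other and from kt and kb
    others = [k for k in range(len(population)) if k not in (kt, kb)]
    k0, k1 = rng.sample(others, 2)
    # Donor: base plus F-scaled difference, id = ib + F*(i0 - i1)
    donor = population[kb] + F * (population[k0] - population[k1])
    # Binomial crossover: keep the donor's parameter with probability CR,
    # plus one randomly chosen parameter that always comes from the donor
    n = len(donor)
    k_always = rng.randrange(n)
    mask = [k == k_always or rng.random() < CR for k in range(n)]
    challenger = np.where(mask, donor, population[kt])
    # Tournament: the lower evaluation result wins index kt
    c_sse = evaluate(challenger)
    if c_sse < sse[kt]:
        population[kt] = challenger
        sse[kt] = c_sse
```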
Here is what you call to run differential evolution on my Population. You have to construct me with the population object, and you have to run its setup method on it yourself. Make sure that's been done and the resulting Deferred has fired before trying to call this.
At the conclusion of each generation's evaluations, I consider the amount of overall improvement if I am running in adaptive mode. If the overall improvement (sum of rounded ratios between SSE of replaced individuals and their replacements) exceeded that required to maintain the status quo, I bump up F a bit. If the overall improvement did not meet that threshold, I reduce F a bit, but only if there was no new best individual in the population.
So long as the best individual in the population keeps getting better with each generation, I will continue to run, even with tiny overall improvements.
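The adaptive-F policy described above can be sketched as a small function. The scaling constants and function name are illustrative assumptions, not ade's actual values:

```python
def adjust_F(F, improvement, threshold, new_best, up=1.25, down=0.8):
    """Adaptive-F sketch: bump F up when the overall improvement beat the
    status-quo threshold; shrink it only when the threshold was missed
    AND no new best individual appeared this generation."""
    if improvement > threshold:
        return F * up
    if not new_best:
        return F * down
    return F
```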