a.d.DifferentialEvolution(object) : class documentation

Part of ade.de

I perform asynchronous differential evolution, employing your multiple CPU cores efficiently without changing anything about the actual operations done in the DE algorithm. The very same target selection, mutation, scaling, and recombination (crossover) are done for each target in the population, in sequence, just as they would be with a DE algorithm that blocks during fitness evaluation. The magic lies in the use of a DeferredLock instance for each index of the population list. Because the number of population members is usually far greater than the number of available CPU cores, the asynchronous processing can almost always find a target it can work on without disturbing the operation sequence.
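The per-index locking scheme can be sketched in a few lines. This is a hypothetical illustration using asyncio.Lock as a stand-in for Twisted's DeferredLock; challenge_stub, run_generation, and the worker layout are inventions for the sketch, not part of ade's API:

```python
import asyncio

async def challenge_stub(k):
    # Placeholder for the real DE challenge at population index k
    await asyncio.sleep(0)
    return k

async def run_generation(n_population, n_workers=4):
    # One lock per population index, analogous to the per-index
    # DeferredLock instances described above
    locks = [asyncio.Lock() for _ in range(n_population)]
    done = []

    async def worker(indices):
        for k in indices:
            async with locks[k]:  # waits only if index k is already busy
                done.append(await challenge_stub(k))

    # Split target indices among concurrent workers; each index is
    # challenged exactly once per generation
    chunks = [list(range(n_population))[i::n_workers] for i in range(n_workers)]
    await asyncio.gather(*(worker(c) for c in chunks))
    return sorted(done)
```

Because each index has its own lock, a worker only blocks when some other worker is currently challenging the same index; with many more population members than workers, that is rare.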

Construct me with a population.Population instance and any keywords that set my runtime configuration differently from my default attributes. The Population object will need to be initialized with a population of individual.Individual objects that can be evaluated according to the population object's evaluation function, which must return a fitness metric where lower values indicate better fitness.

Method __init__ Undocumented
Method shutdown Undocumented
Method crossover Undocumented
Method challenge No summary
Method __call__ Call this to run differential evolution on a population of individuals.
def __init__(self, population, **kw):
Undocumented
def shutdown(self):
Undocumented
def crossover(self, parent, mutant):
Undocumented
@defer.inlineCallbacks
def challenge(self, kt, kb):

Challenges the target ("parent") individual at index kt with a challenger (often referred to as a "trial" or "child") individual produced from DE mutation and crossover. The trial individual is formed from crossover between the target and a donor individual, which is formed from the vector sum of a base individual at index kb and a scaled vector difference between two randomly chosen other individuals that are distinct from each other and both the target and base individuals:


 id = ib + F*(i0 - i1)         [1]
 ic = crossover(it, id)        [2]

First, I calculate the vector difference between the two randomly chosen other individuals. Then I scale that difference by F, the current (possibly random, possibly population-dependent) value of which is obtained with a call to the get method of my FManager. Then I add the scaled difference to the base individual at kb to obtain the donor, and finally perform crossover between the target and the donor to obtain the challenger.
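The donor construction in [1] is a per-parameter vector operation. A minimal sketch, using a hypothetical make_donor helper over bare lists (ade operates on Individual objects, not lists):

```python
def make_donor(i_base, i0, i1, F):
    # Donor vector per [1]: the base individual plus the scaled
    # difference of two other, distinct individuals
    return [b + F * (a - c) for b, a, c in zip(i_base, i0, i1)]
```

For example, with base [1.0, 2.0], others [3.0, 4.0] and [2.0, 1.0], and F = 0.5, the donor is [1.5, 3.5].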

The crossover of [2], the "bin" in DE/[rand|best]/1/bin, consists of giving each parameter of the donor individual a chance (usually a very good chance) to appear in the challenger, as opposed to using the target's parameter. For each parameter, if a uniform random number in the range 0-1 is less than my attribute CR, I use the donor's version of that parameter and thus preserve the mutation performed in [1]. Otherwise, I use the target's version and thus discard the mutation for that parameter.
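A minimal sketch of this binomial crossover, with a hypothetical bin_crossover helper; the rng argument stands in for a uniform 0-1 draw. Note that classic DE implementations also force at least one donor parameter through regardless of CR, which this sketch omits:

```python
def bin_crossover(target, donor, CR, rng):
    # For each parameter, take the donor's value (preserving the
    # mutation of [1]) when a uniform draw falls below CR; otherwise
    # keep the target's value.
    return [d if rng() < CR else t for t, d in zip(target, donor)]
```

Usage with the standard library: `bin_crossover(target, donor, 0.9, random.random)`.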

Finally, I conduct a tournament between the target and the challenger. Whoever has the lowest result of a call to Individual.evaluate is the winner and is assigned to index kt.
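The tournament is a simple lower-is-better comparison. A hedged sketch with a hypothetical tournament helper (ade calls Individual.evaluate; here an evaluate callable is passed in instead):

```python
def tournament(target, challenger, evaluate):
    # Lower evaluation result (e.g. SSE) wins; the winner is the
    # individual that will occupy index kt
    return challenger if evaluate(challenger) < evaluate(target) else target
```

With sum-of-values as a toy fitness, `tournament([3.0], [1.0], sum)` keeps the challenger.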

@defer.inlineCallbacks
def __call__(self):

Call this to run differential evolution on a population of individuals.

At the conclusion of each generation's evaluations, I consider the amount of overall improvement if I am running in adaptive mode. If the overall improvement (sum of rounded ratios between SSE of replaced individuals and their replacements) exceeded that required to maintain the status quo, I bump up F a bit. If the overall improvement did not meet that threshold, I reduce F a bit, but only if there was no new best individual in the population.
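The adaptive adjustment described above might be sketched as follows; the adjust_F name, scale factors, and bounds are all assumptions for illustration, not ade's actual FManager logic:

```python
def adjust_F(F, improvement, threshold, new_best,
             up=1.25, down=0.8, F_max=1.0, F_min=0.05):
    # Hypothetical scale factors and bounds, for illustration only
    if improvement > threshold:
        return min(F * up, F_max)    # enough improvement: bump F up a bit
    if not new_best:
        return max(F * down, F_min)  # stagnating, no new best: reduce F
    return F                         # new best despite low improvement: hold
```

The third branch reflects the behavior described above: F is only reduced when the generation produced no new best individual.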

So long as the best individual in the population keeps getting better with each generation, I will continue to run, even with tiny overall improvements.

API Documentation for ade, generated by pydoctor at 2018-08-29 10:51:10.