Asynchronous Differential Evolution: The foundational object.

I perform differential evolution asynchronously, employing your multiple CPU cores or CPUs in a very cool and efficient way that does not change anything about the actual operations done in the DE algorithm. The very same target selection, mutation, scaling, and recombination (crossover) will be done for each target in the population, in sequence, just as it is with a DE algorithm that blocks during fitness evaluation.

My magic lies in the use of Twisted's DeferredLock. There's one for each index of the population list. Because the number of population members is usually far greater than the number of CPU cores available, almost all of the time the asynchronous processing will find a target it can work on without disturbing the operation sequence.
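
To illustrate the idea (a minimal sketch of the locking pattern, not my actual implementation), suppose there is one DeferredLock per population index and a hypothetical do_challenge function that evaluates a challenger for a target index:

    from twisted.internet import defer

    Np = 20  # hypothetical population size
    # One DeferredLock per population index. A challenge for target index kt
    # runs only while that index's lock is held, so challenges for other
    # indices can proceed concurrently without disturbing this one.
    locks = [defer.DeferredLock() for _ in range(Np)]

    def locked_challenge(kt, do_challenge):
        # DeferredLock.run acquires the lock, calls the function, and
        # releases the lock once the Deferred it returns has fired.
        return locks[kt].run(do_challenge, kt)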

Construct me with a Population instance and any keywords that set my runtime configuration differently from my default attributes. Before running my instance with __call__, you must call the Population.setup method, which initializes it with a population of Individual objects that can be evaluated according to the population object's evaluation function. (A usage sketch follows the parameter list below.)

The evaluation function must return a fitness metric where lower values indicate better fitness, None or inf represents an invalid or failing individual (and thus worst-possible fitness), and a negative number represents a fatal error that will terminate operations. It must accept either a 1-D Numpy array of parameter values or, if I am shutting down, a None object.
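
Here is a hedged sketch of an evaluation function that honors this contract, fitting a line to some made-up data (the data arrays and parameter layout are assumptions, not part of ade):

    import numpy as np

    # Hypothetical data the two-parameter model [slope, intercept] is fit to.
    x = np.linspace(0.0, 1.0, 50)
    y = 3.0*x + 1.0 + 0.1*np.random.randn(50)

    def evaluate(values):
        # Accepts a 1-D array of parameter values, or None during shutdown.
        if values is None:
            return None
        if not np.all(np.isfinite(values)):
            # Invalid individual: report worst-possible fitness.
            return np.inf
        slope, intercept = values
        residuals = y - (slope*x + intercept)
        # Lower SSE means better fitness. A negative return would signal a
        # fatal error and terminate operations.
        return float(np.sum(residuals**2))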

When running in a console Python application, my shutdown method gets called when the Enter key is pressed.

Parameters

logHandle: An open handle for a log file, True to log output to STDOUT, or None to suppress logging. Default is STDOUT.

CR: The crossover probability between parent (i.e., basis) and mutant (i.e., candidate, offspring). (type: float)

F: A scalar defining the differential weight, or a two-element sequence defining a range of possible weights from which one is obtained as a uniform random variate.

maxiter: The maximum number of iterations (i.e., generations) to run. It can be useful to set this to something realistic (e.g., 500 for big populations and lengthy evaluations) so that you have a nice summary of the best candidates when you come back and check results in an hour or so.

randomBase: Set True to use DE/rand/1/bin, where a random individual is chosen from the Population instead of the current best individual as the basis for mutants. Or set it to a float between 0.0 and 1.0 to use ADE's modified version DE/prob/1/bin, where the probability of an individual being chosen increases with how close it is to being the best one in the population; the higher the number, the closer to uniformly random that probability will be.

uniform: Set True to initialize the population with uniform random variates as the parameters instead of a Latin hypercube. Not usually recommended, because you don't want to start off with clustered parameters.

adaptive: Set True to adapt the value (or values) of F in a way that tries to maintain the number of successful challenges at a reasonable level. The adaptive algorithm accounts not just for whether a challenge succeeded but also for how much better the challenger is than the individual it replaced.

bitterEnd: Normally, ade quits trying once there are too few successful challenges and it appears that further iterations won't accomplish much. Set this True if you have all the time in the world and want to keep going until maxiter.

dwellByGrave: The number of iterations that ade hangs around after it's decided that nothing more is being accomplished. If you think there really is some progress being made, with occasional marginally better replacements, but don't want to go until the bitterEnd, feel free to increase this from the default.

goalSSE: You can set a goal for SSE to indicate that any further iterations are pointless once that goal is reached. If this is defined and the best individual has a better SSE than it at the end of an iteration, there will be no further iterations.

xSSE: Set True if your evaluation function can accept and make use of an xSSE keyword defining an SSE value above which continuing the evaluation is pointless. If this is set, each call to the eval function for a challenger will include the xSSE keyword set to its target's SSE. If the challenger's SSE exceeds xSSE, the evaluation can terminate early, because the challenge will fail no matter what.
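
Putting the constructor and several of these keywords together, here is a hedged usage sketch. The Population constructor signature (evaluation function, parameter names, bounds, popsize) and the import paths are assumptions based on the rest of the ade documentation; adjust them to the actual API:

    import numpy as np
    from twisted.internet import defer, reactor
    from ade.population import Population
    from ade.de import DifferentialEvolution

    def evaluate(values):
        # Placeholder evaluation function; see the contract described above.
        if values is None:
            return None
        return float(np.sum(values**2))

    names = ['x0', 'x1']
    bounds = [(-5.0, 5.0), (-5.0, 5.0)]

    @defer.inlineCallbacks
    def run():
        p = Population(evaluate, names, bounds, popsize=20)
        # Population.setup returns a Deferred that must have fired before
        # DifferentialEvolution's __call__ is used.
        yield p.setup()
        de = DifferentialEvolution(
            p, CR=0.8, F=(0.5, 1.0), maxiter=500, randomBase=0.5,
            adaptive=True, goalSSE=1e-6)
        yield de()
        reactor.stop()

    reactor.callWhenRunning(run)
    reactor.run()
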
Class Variable attributes: Default values for attributes CR, F, maxiter, randomBase, uniform, adaptive, bitterEnd, and dwellByGrave that define how a Differential Evolution run should be conducted. The attributes are set by my constructor, and the defaults can be overridden with constructor keywords. (That's why they're listed here as keywords.) (type: dict)
Instance Variable p: An instance of Population supplied as my first constructor argument.
Instance Variable fm: An instance of FManager.
Instance Variable running: True unless my shutdown method has been called.
Method __init__: DifferentialEvolution(population, **kw)
Method shutdown: Call this to shut me down gracefully.
Method crossover: Crossover of parent and mutant individuals.
Method challenge: Challenges the target ("parent") individual at index kt with a challenger at index kb.
Method __call__: Here is what you call to run differential evolution on my Population p of individuals.
Method _run: Called by __call__ to do most of the work.
attributes =
Default values for attributes CR, F, maxiter, randomBase, uniform, adaptive, bitterEnd, and dwellByGrave that define how a Differential Evolution run should be conducted. The attributes are set by my constructor, and the defaults can be overridden with constructor keywords. (That's why they're listed here as keywords.) (type: dict)
p =
An instance of Population supplied as my first constructor argument.
fm =
An instance of FManager.
running =
True unless my shutdown method has been called.
def __init__(self, population, **kw):

DifferentialEvolution(population, **kw)

def shutdown(self):

Call this to shut me down gracefully.

Repeated calls are ignored. Gets called when the Enter key is pressed.

Sets my running flag False, which lets all my various loops know that it's time to quit early. Calls Population.abort on my Population object p to shut it down ASAP.

def crossover(self, parent, mutant):

Crossover of parent and mutant individuals.

The probability of the mutant keeping any given value is CR, except for a randomly chosen one that it always gets to keep.
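
A minimal NumPy sketch of that rule, treating the individuals as plain arrays (illustrative only, not my actual implementation):

    import numpy as np

    def binomial_crossover(parent, mutant, CR=0.8):
        parent = np.asarray(parent, dtype=float)
        mutant = np.asarray(mutant, dtype=float)
        # Each mutant value is kept with probability CR...
        keep = np.random.uniform(size=mutant.size) < CR
        # ...except for one randomly chosen index that is always kept.
        keep[np.random.randint(mutant.size)] = True
        return np.where(keep, mutant, parent)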

@defer.inlineCallbacks
def challenge(self, kt, kb):

Challenges the target ("parent") individual at index kt with a challenger at index kb.

The challenger, often referred to as a "trial" or "child," is an Individual that was produced from DE mutation and crossover.

The trial individual is formed from crossover between the target and a donor individual, which is formed from the vector sum of a base individual at index kb and a scaled vector difference between two randomly chosen other individuals that are distinct from each other and both the target and base individuals:

 id = ib + F*(i0 - i1)         [1]
 ic = crossover(it, id)        [2]

First, I calculate the vector difference between the two other individuals. Then I scale that difference by F, the current, possibly random, possibly population-dependent value of which is obtained with a call to the get method of my FManager. Then I add the scaled difference to the base individual at kb to obtain the donor, and perform crossover between the target and the donor to obtain the challenger.

The crossover of [2], the "bin" in DE/[rand|best]/1/bin, consists of giving each parameter of the donor individual a chance (usually a very good chance) to appear in the challenger, as opposed to using the target's parameter. For each parameter, if a uniform random number in the range of 0-1 is less than my attribute CR, I use the donor's version of that parameter and thus preserve the mutation performed in [1]. Otherwise, I use the target's version and discard the mutation for that parameter.

Finally, I conduct a tournament between the target and the challenger. Whoever has the lowest result of a call to Individual.evaluate is the winner and is assigned to index kt.

The "returned" Deferred fires with None.

@defer.inlineCallbacks
def _run(self):

Called by __call__ to do most of the work.

def __call__(self):

Here is what you call to run differential evolution on my Population p of individuals.

You have to construct me with the population object, and you have to run Population.setup on it yourself. Make sure that's been done and the resulting Deferred has fired before trying to call this.

At the conclusion of each generation's evaluations, I consider the amount of overall improvement if I am running in adaptive mode. If the overall improvement (sum of rounded ratios between SSE of replaced individuals and their replacements) exceeded that required to maintain the status quo, I bump up F a bit. If the overall improvement did not meet that threshold, I reduce F a bit, but only if there was no new best individual in the population.

So long as the best individual in the population keeps getting better with each generation, I will continue to run, even with tiny overall improvements.
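
As a rough illustration of that adaptive adjustment (not the actual FManager logic; the threshold and scaling factors are made-up):

    def adjust_F(F, improvement, new_best, threshold=2.0):
        # 'improvement' stands in for the sum of rounded ratios between the
        # SSEs of replaced individuals and their replacements this generation.
        if improvement > threshold:
            # Enough successful challenges: bump F up a bit.
            return min(1.0, F*1.05)
        if not new_best:
            # Below threshold and no new best individual: reduce F a bit.
            return max(0.1, F*0.95)
        return F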
