Example demonstrating the use of the caching decorator with functional equivalence checking

Caches the results of fitness evaluations in a pickle file (example_fec_caching_cache.pkl). To illustrate its practical use, compare the runtime of this script on its first call with the runtime on subsequent calls, and with the runtime when the decorator on inner_objective is commented out.
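
A simple way to make this comparison (a sketch; the script file name used below is an assumption) is to time two consecutive runs starting from a cold cache: the second run retrieves all fitness values from the pickle file and should be noticeably faster.

import os
import subprocess
import time

if os.path.exists("example_fec_caching_cache.pkl"):
    os.remove("example_fec_caching_cache.pkl")  # start from a cold cache

for label in ("first run (cold cache)", "second run (warm cache)"):
    t0 = time.perf_counter()
    subprocess.run(["python", "example_fec_caching.py"], check=True)  # assumed script name
    print(f"{label}: {time.perf_counter() - t0:.1f} s")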

import functools
import multiprocessing as mp
import time

import numpy as np

import cgp

We define the target function for this example.

def f_target(x):
    return x ** 2 + x + 1.0
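
As a quick sanity check (not part of the original script), the target can be evaluated at a few points to see the values the evolved function should reproduce:

for x in (0.0, 1.0, 2.0):
    print(x, f_target(x))  # -> 1.0, 3.0, 7.0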

We then define the objective function for the evolutionary algorithm: it consists of an inner objective which we wrap with the caching decorator. The decorator specifies a pickle file that is used for caching the results of fitness evaluations. In addition, it accepts a key-computation function whose keyword arguments specify the seed, range, and batch size of the random samples used to identify functionally equivalent individuals. The inner objective is then used by the objective function to compute (or retrieve from the cache) the fitness of the individual.

@cgp.utils.disk_cache(
    "example_fec_caching_cache.pkl",
    compute_key=functools.partial(
        cgp.utils.compute_key_from_numpy_evaluation_and_args,
        _seed=12345,
        _min_value=-10.0,
        _max_value=10.0,
        _batch_size=5,
    ),
    file_lock=mp.Lock(),
)
def inner_objective(ind):
    """The caching decorator uses the return values generated from
    providing random inputs to ind.to_numpy() to identify functionally
    identical individuals and avoid reevaluating them. Note that
    caching only makes sense for deterministic objective functions, as
    it assumes that identical phenotypes will always return the same
    fitness values.

    """
    f = ind.to_func()

    loss = []
    for x_0 in np.linspace(-2.0, 2.0, 100):
        y = f(x_0)
        loss.append((f_target(x_0) - y) ** 2)

    time.sleep(0.25)  # emulate long fitness evaluation

    return np.mean(loss)


def objective(individual):
    if not individual.fitness_is_None():
        return individual

    individual.fitness = -inner_objective(individual)

    return individual
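
For intuition, the key computation can be pictured as follows: the phenotype is evaluated on a reproducible batch of random inputs drawn according to the seed, range, and batch size passed above, and the outputs are hashed into a cache key, so functionally identical individuals map to the same cache entry. The sketch below illustrates this idea for the single-input case using to_func; it is a simplified illustration, not the implementation of cgp.utils.compute_key_from_numpy_evaluation_and_args.

import hashlib
import pickle


def sketch_compute_key(ind, seed=12345, min_value=-10.0, max_value=10.0, batch_size=5):
    """Illustrative only: hash the outputs of `ind` on a fixed random batch."""
    rng = np.random.RandomState(seed)
    f = ind.to_func()
    # Functionally identical individuals produce identical outputs on this batch ...
    y = [f(x) for x in rng.uniform(min_value, max_value, batch_size)]
    # ... and therefore identical cache keys.
    return hashlib.sha256(pickle.dumps(y)).hexdigest()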

Next, we define the parameters for the population, the genome of individuals, and the evolutionary algorithm.

params = {
    "population_params": {"n_parents": 10, "seed": 8188211},
    "ea_params": {
        "n_offsprings": 10,
        "tournament_size": 1,
        "mutation_rate": 0.05,
        "n_processes": 1,
    },
    "genome_params": {
        "n_inputs": 1,
        "n_outputs": 1,
        "n_columns": 10,
        "n_rows": 2,
        "levels_back": 2,
        "primitives": (cgp.Add, cgp.Sub, cgp.Mul, cgp.ConstantFloat),
    },
    "evolve_params": {"max_generations": 200, "termination_fitness": -1e-12},
}

We then create a Population instance and instantiate the evolutionary algorithm.

pop = cgp.Population(**params["population_params"], genome_params=params["genome_params"])
ea = cgp.ea.MuPlusLambda(**params["ea_params"])

Finally, we call the evolve method to perform the evolutionary search.

cgp.evolve(pop, objective, ea, **params["evolve_params"], print_progress=True)


print(f"evolved function: {pop.champion.to_sympy()}")

Out:

[2/200] max fitness: -2.360269360269361
[3/200] max fitness: -2.360269360269361
[4/200] max fitness: -2.360269360269361
[5/200] max fitness: -1.0
...
[121/200] max fitness: -1.0
[122/200] max fitness: -4.930380657631324e-34
evolved function: x_0**2 + x_0 + 1.0

Total running time of the script: ( 0 minutes 13.252 seconds)
