
🚧 Page under construction

TL;DR: Differential Evolution is like a genetic algorithm, but it drinks more coffee and doesn’t care about gradients.

Intro to the Differential Evolution Algorithm

The key idea of DE is evolution by difference vectors: new candidate solutions are generated by adding the weighted difference between two individuals to a third. DE maintains a population of candidate solutions and refines it over generations.

πŸ—οΈ Algorithm Structure

At its core, DE consists of 4 steps:

  1. Initialization
  2. Mutation
  3. Crossover
  4. Selection

1. Initialization

We start by initializing a population of $ N_p $ individuals uniformly at random within the search bounds:

$$\mathbf{x}_{i,0} = \mathbf{l} + \operatorname{rand}_i(0, 1) \cdot (\mathbf{u} - \mathbf{l}), \quad i = 1, \dots, N_p$$

where $\mathbf{l}$ and $\mathbf{u}$ are the lower and upper bounds of the search space, and the uniform random numbers are drawn independently for each component.
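
A minimal sketch of this step in Rust, assuming the rand crate (0.8-style API); the function name and the `(lower, upper)` tuple layout for the bounds are choices made for this example only, not part of any library:

use rand::Rng;

/// Draw `np` candidates uniformly at random inside per-dimension bounds,
/// i.e. x_j = l_j + rand(0, 1) * (u_j - l_j).
fn initialize(np: usize, bounds: &[(f64, f64)]) -> Vec<Vec<f64>> {
    let mut rng = rand::thread_rng();
    let mut pop = Vec::with_capacity(np);
    for _ in 0..np {
        let candidate: Vec<f64> = bounds
            .iter()
            .map(|&(l, u)| l + rng.gen::<f64>() * (u - l))
            .collect();
        pop.push(candidate);
    }
    pop
}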

2. Mutation

For each individual $\mathbf{x}_{i,G}$ in generation $ G $, we generate a donor vector $ \mathbf{v}_{i,G+1} $ using the DE/rand/1 strategy:

$$\mathbf{v}_{i,G+1} = \mathbf{x}_{r_1,G} + F \cdot (\mathbf{x}_{r_2,G} - \mathbf{x}_{r_3,G})$$

Where:

  • $ r_1, r_2, r_3 \in \{1, \dots, N_p\} $ are distinct indices, all different from $ i $,
  • $ F \in [0, 2] $ is the differential weight (commonly $ F = 0.5 $).

Other strategies include DE/best/1, DE/current-to-best/1, etc.
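
Sticking with DE/rand/1, the donor-vector construction might look like the following sketch (again assuming the rand crate; the function name and the plain `Vec<Vec<f64>>` population layout are assumptions of this example, and the population size must be at least 4 so that three distinct partners exist):

use rand::Rng;

/// DE/rand/1: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct and different from i.
fn mutate_rand1(pop: &[Vec<f64>], i: usize, f: f64) -> Vec<f64> {
    let mut rng = rand::thread_rng();
    // Draw three distinct indices, none equal to the target index i.
    let mut idx: Vec<usize> = Vec::with_capacity(3);
    while idx.len() < 3 {
        let r = rng.gen_range(0..pop.len());
        if r != i && !idx.contains(&r) {
            idx.push(r);
        }
    }
    let (r1, r2, r3) = (idx[0], idx[1], idx[2]);
    pop[r1]
        .iter()
        .zip(&pop[r2])
        .zip(&pop[r3])
        .map(|((a, b), c)| a + f * (b - c))
        .collect()
}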

3. Crossover

To increase diversity, DE applies binomial crossover between the target vector $ \mathbf{x}_{i,G} $ and the donor vector $ \mathbf{v}_{i,G+1} $ to create a trial vector $ \mathbf{u}_{i,G+1} $:

$$u_{j,i,G+1} = \begin{cases} v_{j,i,G+1} & \text{if } \operatorname{rand}_j(0,1) \le C_r \ \text{or}\ j = j_{\text{rand}} \\ x_{j,i,G} & \text{otherwise} \end{cases}$$

Where:

  • $ C_r \in [0, 1] $ is the crossover rate,
  • $ j_{\text{rand}} $ is a randomly chosen dimension index that ensures at least one component of $ \mathbf{v}_{i,G+1} $ is inherited by the trial vector.
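
A corresponding sketch of binomial crossover; the component-wise draw and the forced index `j_rand` mirror the formula above, and the function name is again just for illustration:

use rand::Rng;

/// Binomial crossover: take the donor component when rand(0,1) <= Cr or j == j_rand,
/// otherwise keep the target component, so at least one donor component always survives.
fn crossover_binomial(target: &[f64], donor: &[f64], cr: f64) -> Vec<f64> {
    let mut rng = rand::thread_rng();
    let j_rand = rng.gen_range(0..target.len());
    (0..target.len())
        .map(|j| {
            if rng.gen::<f64>() <= cr || j == j_rand {
                donor[j]
            } else {
                target[j]
            }
        })
        .collect()
}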

4. Selection

Finally, apply a greedy selection between the trial vector and its target:

$$\mathbf{x}_{i,G+1} = \begin{cases} \mathbf{u}_{i,G+1} & \text{if } f(\mathbf{u}_{i,G+1}) \le f(\mathbf{x}_{i,G}) \\ \mathbf{x}_{i,G} & \text{otherwise} \end{cases}$$

Where $ f(\cdot) $ is the objective function being minimized.
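
The selection step is almost a one-liner; a sketch (a hypothetical helper, not the Solver API shown later):

/// Greedy selection: the trial vector replaces the target only if it is no worse.
fn select(target: Vec<f64>, trial: Vec<f64>, cost: impl Fn(&[f64]) -> f64) -> Vec<f64> {
    if cost(trial.as_slice()) <= cost(target.as_slice()) {
        trial
    } else {
        target
    }
}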

🔁 Repeat

Repeat mutation, crossover, and selection for each generation until a stopping criterion is met (e.g., max iterations or convergence threshold).
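
Putting the pieces together, a generation loop could look like the sketch below. It assumes the helper functions sketched above (`initialize`, `mutate_rand1`, `crossover_binomial`, `select`) are in scope and, for brevity, does no bound handling (donor components may drift outside $[\mathbf{l}, \mathbf{u}]$) and stops only after a fixed number of generations:

/// Minimal DE driver: max_gen generations of mutation, crossover, and selection.
fn run_de(
    cost: impl Fn(&[f64]) -> f64,
    bounds: &[(f64, f64)],
    np: usize,
    f: f64,
    cr: f64,
    max_gen: usize,
) -> Vec<f64> {
    let mut pop = initialize(np, bounds);
    for _ in 0..max_gen {
        let next: Vec<Vec<f64>> = (0..np)
            .map(|i| {
                let donor = mutate_rand1(&pop, i, f);
                let trial = crossover_binomial(&pop[i], &donor, cr);
                select(pop[i].clone(), trial, &cost)
            })
            .collect();
        pop = next;
    }
    // Return the best individual of the final population.
    pop.into_iter()
        .min_by(|a, b| cost(a.as_slice()).partial_cmp(&cost(b.as_slice())).unwrap())
        .unwrap()
}

With this in place, something like `run_de(|x| x.iter().map(|v| v * v).sum::<f64>(), &[(-5.0, 5.0); 2], 50, 0.5, 0.9, 200)` would minimize a 2-D sphere function; the parameter values here are only illustrative.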

Rust Implementation (will be completed soon)

use crate::{
    algorithms::{Algorithm, differential_evolution::solver::Solver},
    objective_functions::{
        bukin::Bukin, griewank::Griewank, holder_table::HolderTable, michalewicz::Michalewicz,
        rosenbrock::Rosenbrock, schwefel::Schwefel,
    },
    utils::file_io,
};
use std::{f64::consts::PI, io};

pub mod algorithms;
pub mod experiment_models;
pub mod objective_functions;
pub mod utils;

fn main() -> io::Result<()> {
    let costfn = Bukin;
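    // Bukin N.6 is typically evaluated on x1 ∈ [-15, -5], x2 ∈ [-3, 3]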
    let boundaries = vec![(-15.0, -5.0), (-3.0, 3.0)];

    let mut dea = Solver::builder()
        .iterate(200)
        .with_pop_size(50)
        .with_mut_scale_factor(0.2, 0.8)
        .with_crs_probability(0.9)
        .with_boundaries(boundaries)
        .with_obj_fn(costfn)
        .with_mut_strategy(
            algorithms::differential_evolution::mutation_strategies::MutationType::Rand1,
        )
        // .with_rng_seed(250) // custom deterministic seed
        .display_process(true)
        .build();

    dea.optimize();

    Ok(())
}

Objective Functions

Michalewicz

use std::f64::consts::PI;

use crate::objective_functions::ObjectiveFunction;

pub struct Michalewicz {
    /// Steepness parameter; higher values make the landscape harder to search.
    pub m: i32,
}

impl Default for Michalewicz {
    /// The literature commonly uses m = 10; deriving `Default` would give m = 0,
    /// which collapses the function to -Σ sin(xi).
    fn default() -> Self {
        Michalewicz { m: 10 }
    }
}

impl ObjectiveFunction for Michalewicz {
    /// Michalewicz function
    /// Global minimum depends on the dimension and the steepness parameter `m`
    /// (commonly m = 10).
    /// Common domain: xi ∈ [0, π]
    ///
    /// # Arguments
    /// - `vars`: slice of input values (e.g., `[x1, x2, ..., xn]`)
    ///
    /// # Returns
    /// - `f64`: the negative-valued function output
    fn cost(&self, vars: &[f64]) -> f64 {
        let mut sum = 0.0;
        for (i, &xi) in vars.iter().enumerate() {
            let ii = i + 1;
            let term = (xi.sin()) * ((ii as f64 * xi * xi / PI).sin()).powi(2 * self.m);
            sum += term;
        }
        -sum
    }
}

Griewank

use crate::objective_functions::ObjectiveFunction;

#[derive(Default)]
pub struct Griewank;

impl ObjectiveFunction for Griewank {
    /// Griewank function
    /// Global minimum: f(x*) = 0 at x* = [0, ..., 0]
    /// Domain: xi ∈ [-600, 600]
    ///
    /// # Arguments
    /// - `vars`: input vector `[x1, x2, ..., xd]`
    ///
    /// # Returns
    /// - `f64`: function output
    fn cost(&self, vars: &[f64]) -> f64 {
        let sum: f64 = vars.iter().map(|&xi| xi.powi(2) / 4000.0).sum();
        let prod: f64 = vars
            .iter()
            .enumerate()
            .map(|(i, &xi)| (xi / ((i + 1) as f64).sqrt()).cos())
            .product();

        sum - prod + 1.0
    }
}

Sphere
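
A minimal sketch, assuming the same `ObjectiveFunction` trait as the implementations above; the struct name, domain, and doc text are assumptions for illustration rather than the repository's actual code:

use crate::objective_functions::ObjectiveFunction;

#[derive(Default)]
pub struct Sphere;

impl ObjectiveFunction for Sphere {
    /// Sphere function
    /// Global minimum: f(x*) = 0 at x* = [0, ..., 0]
    /// Common domain: xi ∈ [-5.12, 5.12]
    ///
    /// # Arguments
    /// - `vars`: input vector `[x1, x2, ..., xd]`
    ///
    /// # Returns
    /// - `f64`: function output
    fn cost(&self, vars: &[f64]) -> f64 {
        vars.iter().map(|&xi| xi.powi(2)).sum()
    }
}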

Reference

  • Price, K. V., Storn, R. M., and Lampinen, J. A. (2005). Differential Evolution: A Practical Approach to Global Optimization. Natural Computing Series, Springer. https://doi.org/10.1007/3-540-31306-0