NEAT was invented to solve the competing conventions problem.

… by figuring out how to take two different parental neural networks and identify the parts that are likely performing similar functions. It does so by tracking the ancestry of genes (nature does something akin to this, too).

Terminology

Genotype = “blueprint”
Phenotype = actual structure
NeuroEvolution of Augmenting Topologies = evolving the structure (topology) of the network

Intro example

[Figure: an example genotype (node genes and connection genes) and the phenotype decoded from it]
Here, the genotype is encoded as two vectors: one for the nodes, one for the connections.
The network currently has 5 neurons, each with a flag indicating the neuron type.
There are six synapses; however, one of them is disabled, so the phenotype has only 5 synapses.
The disabled flag mimics dominant / recessive genes. Intuitively, the connection might become useful in a different context at a later point in time. In nature, this is sometimes called “junk DNA”: genes that are present but not active. It is like a scratchpad / like commented-out code that might be uncommented later.
We also have an innovation number, the main innovation of NEAT.
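
A minimal sketch of how such an encoding could look in Python (the class and field names here are my own, not from the NEAT paper); decoding skips disabled genes, so they remain in the genotype but are not expressed in the phenotype:

```python
from dataclasses import dataclass

@dataclass
class NodeGene:
    innovation: int   # historical marking (see next section)
    kind: str         # neuron-type flag: "input" / "hidden" / "output"

@dataclass
class ConnGene:
    innovation: int
    src: int          # innovation number of the source node
    dst: int          # innovation number of the target node
    weight: float
    enabled: bool     # disabled genes stay in the genotype only

def decode(nodes, conns):
    """Genotype -> phenotype: only enabled synapses become edges."""
    neurons = [(n.innovation, n.kind) for n in nodes]
    edges = [(c.src, c.dst, c.weight) for c in conns if c.enabled]
    return neurons, edges
```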

Innovation number

The innovation number is a globally unique counter (“historical marking”), assigned separately to each new node / synapse gene when it first appears. It identifies the genes.
Genes with the same innovation number likely perform similar functions, kind of like alleles (different versions of the same gene).
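
A sketch of how the counter might be implemented (hypothetical names; the original paper additionally reuses a number when the same structural mutation shows up again, which the cache below imitates):

```python
class InnovationTracker:
    """Hands out historical markings for new genes."""

    def __init__(self):
        self.counter = 0
        self.cache = {}  # structural mutation -> already-assigned number

    def get(self, key):
        # `key` describes the mutation, e.g. ("conn", src, dst) for a new
        # synapse, or ("node", conn_innovation) for a node splitting a synapse
        if key not in self.cache:
            self.counter += 1
            self.cache[key] = self.counter
        return self.cache[key]
```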

Mutation

One parent → One child.
Copy the parent's genes and weights (innovation numbers are retained).
Add / remove nodes / synapses.
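
A sketch of the add-node mutation, building on the NodeGene / ConnGene / InnovationTracker sketches above. The weight choices follow the recipe from the NEAT paper: the old synapse is disabled, the incoming new synapse gets weight 1.0, and the outgoing one keeps the old weight, so the child initially behaves almost like its parent:

```python
import copy
import random

def mutate_add_node(parent_nodes, parent_conns, tracker):
    """One parent -> one child: copy the genes (innovation numbers
    are retained), then split a random enabled connection."""
    nodes = copy.deepcopy(parent_nodes)
    conns = copy.deepcopy(parent_conns)
    old = random.choice([c for c in conns if c.enabled])
    old.enabled = False  # the old synapse becomes "junk DNA"
    new = NodeGene(tracker.get(("node", old.innovation)), "hidden")
    nodes.append(new)
    conns.append(ConnGene(tracker.get(("conn", old.src, new.innovation)),
                          old.src, new.innovation, 1.0, True))
    conns.append(ConnGene(tracker.get(("conn", new.innovation, old.dst)),
                          new.innovation, old.dst, old.weight, True))
    return nodes, conns
```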

Crossover

Randomly pick genes from two parents and hope that the child will be good.
→ Algorithmically simple & produces children that are similar to their parents.

Procedure:
Align genes by innovation number.
For each shared gene, flip a coin to decide which parent to inherit it from.
For disjoint and excess genes, flip a coin to decide whether to inherit them.
A disjoint gene is one that is present in one parent but not the other. Excess genes are disjoint genes that lie beyond the highest innovation number of the other parent, i.e., where the other parent’s genes ran out.
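
A sketch of this procedure for connection genes, reusing ConnGene from above (node genes are handled analogously). Note it follows the coin-flip rule from these notes; the original paper instead inherits disjoint and excess genes from the fitter parent:

```python
import random

def crossover(conns_a, conns_b):
    """Align genes by innovation number, then inherit per the rules above."""
    a = {c.innovation: c for c in conns_a}
    b = {c.innovation: c for c in conns_b}
    child = []
    for innov in sorted(a.keys() | b.keys()):
        if innov in a and innov in b:
            # shared gene: inherit from a random parent
            child.append(random.choice([a[innov], b[innov]]))
        elif random.random() < 0.5:
            # disjoint or excess gene: keep it half the time
            child.append(a[innov] if innov in a else b[innov])
    return child
```

In practice the inherited genes would be copied (as in the mutation sketch) so that parents and child do not share mutable gene objects.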

References

(ES-(Hyper-))NEAT resources

PyTorch (Hyper)NEAT code
HyperNEAT paper (high quality)
Good BP series:

Could not find ES-HyperNEAT paper… There is “Enhancing ES-HyperNEAT” (same authors).
→ nvm, found it. See ES-HyperNEAT.
Bad ES-HyperNEAT code
Evolutionary Robotics, Lecture 16: NEAT & HyperNEAT (Josh Bongard)

ES-HyperNEAT
evolutionary optimization
neuroevolution