Backend

Neural network topology

The Leela Chess Zero neural network is largely based on DeepMind’s AlphaGo Zero1 and AlphaZero2 architectures. There are, however, some changes.

Network topology

The core of the network is a residual tower with Squeeze and Excitation3 (SE) layers.
The number of residual BLOCKS and the number of FILTERS (channels) per block differ between networks. Typical values for BLOCKS×FILTERS are 10×128, 20×256 and 24×320.
SE layers have SE_CHANNELS channels (typically 32 or so).
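The sketch below illustrates the Squeeze-and-Excitation step that distinguishes this tower from plain AlphaZero-style residual blocks. It is not Lc0 source code: the weight layouts and the split of the output into a sigmoid gate plus an additive bias are shown for illustration and should be checked against the actual backend implementations.

```cpp
// Illustrative sketch of the SE ("excitation") part of one residual block.
// Each of the FILTERS channels of the 8x8 board is first average-pooled to a
// single value ("squeeze"); those values then pass through two small fully
// connected layers with SE_CHANNELS hidden units, producing a per-channel
// scale and bias that are applied before the residual addition.
#include <cmath>
#include <vector>

std::vector<float> SeExcitation(
    const std::vector<float>& pooled,  // FILTERS values, one per channel
    const std::vector<float>& w1,      // SE_CHANNELS x FILTERS
    const std::vector<float>& b1,      // SE_CHANNELS
    const std::vector<float>& w2,      // (2*FILTERS) x SE_CHANNELS
    const std::vector<float>& b2,      // 2*FILTERS
    int filters, int se_channels) {
  // First FC layer + ReLU: FILTERS -> SE_CHANNELS.
  std::vector<float> hidden(se_channels);
  for (int i = 0; i < se_channels; ++i) {
    float sum = b1[i];
    for (int j = 0; j < filters; ++j) sum += w1[i * filters + j] * pooled[j];
    hidden[i] = std::max(0.0f, sum);
  }
  // Second FC layer: SE_CHANNELS -> 2*FILTERS.
  std::vector<float> out(2 * filters);
  for (int i = 0; i < 2 * filters; ++i) {
    float sum = b2[i];
    for (int j = 0; j < se_channels; ++j) sum += w2[i * se_channels + j] * hidden[j];
    out[i] = sum;
  }
  // First half becomes a per-channel scale in (0,1); second half a per-channel
  // bias.  Applied as: x[c] = x[c] * out[c] + out[filters + c], then the
  // residual connection adds the block input back.
  for (int i = 0; i < filters; ++i) out[i] = 1.0f / (1.0f + std::exp(-out[i]));
  return out;
}
```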

C++ interface

Implement your own Lc0 backend in four simple steps (a skeletal example follows the list):

  1. Implement the Network and NetworkComputation interfaces.
  2. Write a factory function to create your backend.
  3. Register your factory function using the REGISTER_NETWORK macro.
  4. Link your implementation with Lc0.
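The skeleton below gives the flavor of these steps. The method names and signatures mirror src/neural/network.h and src/neural/factory.h in the Lc0 tree as the author understands them; treat them as assumptions and check the current headers before building against them.

```cpp
#include <memory>
#include <optional>

#include "neural/factory.h"  // REGISTER_NETWORK, OptionsDict (assumed includes)
#include "neural/network.h"  // Network, NetworkComputation, InputPlanes

namespace lczero {
namespace {

// Step 1a: a computation collects a batch of positions and produces outputs.
class MyComputation : public NetworkComputation {
 public:
  void AddInput(InputPlanes&& /*input*/) override { /* queue one position */ }
  void ComputeBlocking() override { /* run the whole batch through the NN */ }
  int GetBatchSize() const override { return 0; }
  float GetQVal(int /*sample*/) const override { return 0.0f; }  // value head
  float GetDVal(int /*sample*/) const override { return 0.0f; }  // draw probability
  float GetMVal(int /*sample*/) const override { return 0.0f; }  // moves-left head
  float GetPVal(int /*sample*/, int /*move_id*/) const override { return 0.0f; }  // policy
};

// Step 1b: a network owns the loaded weights and hands out computations.
class MyNetwork : public Network {
 public:
  MyNetwork(const std::optional<WeightsFile>& /*weights*/,
            const OptionsDict& /*options*/) {}
  std::unique_ptr<NetworkComputation> NewComputation() override {
    return std::make_unique<MyComputation>();
  }
  const NetworkCapabilities& GetCapabilities() const override {
    return capabilities_;
  }

 private:
  NetworkCapabilities capabilities_;
};

// Step 2: the factory function.
std::unique_ptr<Network> MakeMyNetwork(const std::optional<WeightsFile>& weights,
                                       const OptionsDict& options) {
  return std::make_unique<MyNetwork>(weights, options);
}

// Step 3: register it (the last argument is the backend's priority).
REGISTER_NETWORK("mybackend", MakeMyNetwork, 0)

}  // namespace
}  // namespace lczero
```

Step 4 is then a build-system change: add the new source file to Lc0's build so the registration runs at startup and the backend becomes selectable by name.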

Some details:

Weights file format

The NN weights file is in Google Protocol Buffers1 format.

The schema definition is located in the lczero-common repository.
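A hedged sketch of reading such a file from inside Lc0 follows. It assumes the LoadWeightsFromFile helper from src/neural/loader.h; outside the Lc0 tree you would instead compile net.proto from lczero-common with the regular protobuf compiler and parse the (usually gzip-compressed) file yourself.

```cpp
#include <string>

#include "neural/loader.h"  // LoadWeightsFromFile, WeightsFile (assumed header)

lczero::WeightsFile ReadNet(const std::string& path) {
  // Weights files are serialized protobufs, typically gzip-compressed; the
  // loader decompresses the file and parses it into the generated message,
  // from which backends read the format description and the layer tensors.
  return lczero::LoadWeightsFromFile(path);
}
```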

NN format description

The network format description is contained in the weights.format().network_format() submessage.
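The sketch below shows how a backend might inspect this submessage before deciding whether it supports a given network. The field and enum names are assumptions based on net.proto in lczero-common as the author recalls it; check the schema itself for the authoritative list of formats.

```cpp
#include <stdexcept>

#include "proto/net.pb.h"  // pblczero::Net, pblczero::NetworkFormat (assumed path)

void CheckSupported(const pblczero::Net& weights) {
  const auto& format = weights.format().network_format();
  // Reject body and head variants this backend does not implement.
  if (format.network() != pblczero::NetworkFormat::NETWORK_SE_WITH_HEADFORMAT) {
    throw std::runtime_error("Unsupported network body format.");
  }
  if (format.value() != pblczero::NetworkFormat::VALUE_WDL &&
      format.value() != pblczero::NetworkFormat::VALUE_CLASSICAL) {
    throw std::runtime_error("Unsupported value head format.");
  }
}
```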