The Leela Chess Zero neural network is largely based on DeepMind's AlphaGo Zero1 and AlphaZero2 architectures. There are, however, some changes.
The core of the network is a residual tower with Squeeze-and-Excitation3 (SE) layers. The number of residual BLOCKS and FILTERS (channels) per block differs between networks. Typical values of BLOCKS×FILTERS are 10×128, 20×256, and 24×320. SE layers have SE_CHANNELS channels (typically around 32).
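To make the SE layer concrete, here is a minimal sketch of the canonical Squeeze-and-Excitation operation (global average pooling, a bottleneck of SE_CHANNELS units, then per-channel sigmoid gating) over a single CHW tensor. The function name and the plain-loop layout are illustrative, not Lc0's actual implementation, and the weight matrices are placeholders for values that a real backend loads from the weights file.

```cpp
#include <cmath>
#include <vector>

// Canonical Squeeze-and-Excitation over one CHW tensor.
// input: channels * spatial floats; w1 is (se_channels x channels),
// w2 is (channels x se_channels), both row-major. Biases omitted for brevity.
std::vector<float> SqueezeExcite(const std::vector<float>& input,
                                 int channels, int spatial, int se_channels,
                                 const std::vector<float>& w1,
                                 const std::vector<float>& w2) {
  // Squeeze: global average pooling, one scalar per channel.
  std::vector<float> pooled(channels, 0.0f);
  for (int c = 0; c < channels; ++c) {
    for (int i = 0; i < spatial; ++i) pooled[c] += input[c * spatial + i];
    pooled[c] /= spatial;
  }
  // Excite: FC down to se_channels -> ReLU -> FC back up -> sigmoid.
  std::vector<float> hidden(se_channels, 0.0f);
  for (int s = 0; s < se_channels; ++s) {
    for (int c = 0; c < channels; ++c)
      hidden[s] += w1[s * channels + c] * pooled[c];
    hidden[s] = std::max(0.0f, hidden[s]);
  }
  std::vector<float> scale(channels, 0.0f);
  for (int c = 0; c < channels; ++c) {
    for (int s = 0; s < se_channels; ++s)
      scale[c] += w2[c * se_channels + s] * hidden[s];
    scale[c] = 1.0f / (1.0f + std::exp(-scale[c]));  // sigmoid gate
  }
  // Recalibrate: rescale every channel of the input by its gate.
  std::vector<float> out(input.size());
  for (int c = 0; c < channels; ++c)
    for (int i = 0; i < spatial; ++i)
      out[c * spatial + i] = input[c * spatial + i] * scale[c];
  return out;
}
```

With zero weights in the second FC layer, every gate is sigmoid(0) = 0.5, so the output is the input halved, which makes the gating behaviour easy to check by hand.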
Implement your own Lc0 backend in a few simple steps:

- Implement the Network and NetworkComputation interfaces.
- Register your backend with the REGISTER_NETWORK macro.

Some details:
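The steps above can be sketched as follows. Note that these class and macro definitions are simplified stand-ins written for this example: the real `Network` and `NetworkComputation` interfaces and the `REGISTER_NETWORK` macro live in the lc0 source tree and have more members (policy, draw, and moves-left heads, capabilities, priorities), so treat this purely as an illustration of the subclass-and-register pattern.

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>
#include <vector>

// Simplified stand-ins for Lc0's interfaces (illustrative only).
class NetworkComputation {
 public:
  virtual ~NetworkComputation() = default;
  virtual void AddInput(std::vector<float> planes) = 0;  // queue one position
  virtual void ComputeBlocking() = 0;                    // evaluate the batch
  virtual float GetQVal(int sample) const = 0;           // value-head output
};

class Network {
 public:
  virtual ~Network() = default;
  virtual std::unique_ptr<NetworkComputation> NewComputation() = 0;
};

// Minimal name->factory registry playing the role of REGISTER_NETWORK.
using Factory = std::function<std::unique_ptr<Network>()>;
std::map<std::string, Factory>& Registry() {
  static std::map<std::string, Factory> registry;
  return registry;
}
#define REGISTER_NETWORK_SKETCH(name, klass)                    \
  static const bool registered_##klass =                        \
      (Registry()[name] = [] {                                  \
         return std::unique_ptr<Network>(new klass());          \
       },                                                       \
       true);

// A trivial backend that scores every position as a draw (Q = 0).
class DummyComputation : public NetworkComputation {
 public:
  void AddInput(std::vector<float>) override { ++batch_size_; }
  void ComputeBlocking() override {}  // nothing to evaluate
  float GetQVal(int) const override { return 0.0f; }
 private:
  int batch_size_ = 0;
};

class DummyNetwork : public Network {
 public:
  std::unique_ptr<NetworkComputation> NewComputation() override {
    return std::make_unique<DummyComputation>();
  }
};

REGISTER_NETWORK_SKETCH("dummy", DummyNetwork)
```

The registration macro runs at static-initialization time, so the engine can later look up a backend by name without knowing its concrete class, which is the same dependency-inversion idea the real macro implements.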
The NN weights file is in Google Protocol Buffers1 format.
The schema definition is located in the lczero-common repository.
The network format description is contained in the weights.format().network_format() submessage.