
Conversation

borg323 (Member) commented Feb 21, 2023

The encoding is moved into the Weights message, where it makes sense: it can be used wherever the weights are passed around, without having to track the encoding separately.
It turns out to be more convenient to have the encoding in each Layer.
Also adds a dims field per Layer to store the tensor dimensions.
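
For illustration, a minimal proto2 sketch of such a per-Layer definition. The field numbers and the params field are assumptions for this sketch, not the actual net.proto schema; only the dims field and the Encoding values come from this PR:

message Layer {
  enum Encoding {
    UNKNOWN = 0;
    LINEAR16 = 1;
  }
  // Encoding of this layer's data; overrides any Format-level default.
  optional Encoding encoding = 1;  // hypothetical field number
  // Tensor dimensions of this layer's data.
  repeated uint32 dims = 2;        // field from this PR; number hypothetical
  optional bytes params = 3;       // raw weight data; hypothetical field
}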

@@ -362,7 +370,7 @@ message Format {
UNKNOWN = 0;
LINEAR16 = 1;
}

// Any encoding specified in a Layer overrides this.
optional Encoding weights_encoding = 1;

Member

We can actually remove this field (replace it with reserved 1): it will just be ignored when parsed by code that doesn't know it.
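
A sketch of what that retirement could look like in proto2 (illustrative only, not an actual diff from this repository):

message Format {
  enum Encoding {
    UNKNOWN = 0;
    LINEAR16 = 1;
  }
  // Field number 1 previously held weights_encoding; reserving the number
  // and name prevents them from being reused for a different field later.
  reserved 1;
  reserved "weights_encoding";
}

Old parsers that still declare the field would simply see it as unset.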


Member Author

I'm not sure this won't break older builds. But we can certainly do that after we move it to lc0.


Member

It likely would be fine, but let's keep it for now.

@borg323 borg323 merged commit c47d368 into LeelaChessZero:master Sep 11, 2025
@borg323 borg323 deleted the fp16 branch September 11, 2025 18:20