
Conversation

xingmingyyj
Contributor

Before submitting

  • Lint the code. If there are lint issues, format the code first.
# Install and register `pre-commit` in the project folder
pip install pre-commit && pre-commit install

# Then run the checks on the changed files individually
pre-commit run --files XXXX.py
  • Add test cases to the tests folder. If there are codecov issues, please add test cases first.

PR types

New features

PR changes

Others

Description

Adapt flex_checkpoint.


paddle-bot bot commented Aug 20, 2025

Thanks for your contribution!

@xingmingyyj xingmingyyj force-pushed the add_sharded_state_dict branch from 92bb80e to 9805b27 Compare August 20, 2025 13:24
@@ -1427,6 +1427,7 @@ def get_tensor_parallel_split_mappings(num_layers):

@classmethod
def _get_fuse_or_split_param_mappings(cls, config: LlamaConfig, is_fuse=False):
raise NotImplementedError
Collaborator


Why does this need to be intercepted here?
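For context on the question above: the hunk overrides a classmethod to raise `NotImplementedError`, a common guard pattern that turns an unsupported code path into a loud failure instead of a silent no-op. A minimal sketch of that pattern, using hypothetical class names (not the actual PaddleNLP classes) since only the method signature appears in the diff:

```python
class BasePretrainedModel:
    """Hypothetical base class; names are illustrative, not from the PR."""

    @classmethod
    def _get_fuse_or_split_param_mappings(cls, config, is_fuse=False):
        # Default behavior: no fuse/split mappings are defined.
        return {}


class LlamaStyleModel(BasePretrainedModel):
    @classmethod
    def _get_fuse_or_split_param_mappings(cls, config, is_fuse=False):
        # Guard pattern from the hunk: intercept the call so that any
        # caller relying on fuse/split mappings fails loudly here
        # instead of proceeding with an empty (and wrong) mapping.
        raise NotImplementedError


try:
    LlamaStyleModel._get_fuse_or_split_param_mappings(config=None)
except NotImplementedError:
    print("fuse/split mappings are not implemented for this model")
```

The trade-off the reviewer is probing: raising here breaks callers that previously tolerated the base-class default, so the interception is only safe if no supported path reaches this method.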

@xingmingyyj xingmingyyj force-pushed the add_sharded_state_dict branch from 13eb7f7 to 419b349 Compare August 22, 2025 06:08
@xingmingyyj
Contributor Author

/re-run all-failed
