This error seems to appear with certain models when training with gradient accumulation. Full error:
```
<model name> does not accept `num_items_in_batch`.
Using gradient accumulation will be very slightly less accurate.
Read more on gradient accumulation issues here: https://unsloth.ai/blog/gradient
```
There was a (now) well-known bug with gradient accumulation whereby losses were incorrectly aggregated across accumulation steps.
At the time, the fix required wrapping the trainer, but transformers has since fixed its Trainer. If that's the case, what does this error mean?
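For context on where a message like this can come from: the Trainer only passes `num_items_in_batch` into the loss computation if the model's `forward` (or loss function) actually declares it, which it can check via signature inspection. A minimal sketch of that idea, with hypothetical stand-in `forward` functions (not the real unsloth/transformers code):

```python
import inspect

# Hypothetical stand-ins for two model forward() signatures: one that
# declares num_items_in_batch, and one that does not (the case that
# would trigger a warning like the one above).
def forward_with(input_ids=None, labels=None, num_items_in_batch=None):
    pass

def forward_without(input_ids=None, labels=None):
    pass

def accepts_num_items_in_batch(forward_fn) -> bool:
    """Return True if forward_fn declares a num_items_in_batch parameter."""
    params = inspect.signature(forward_fn).parameters
    return "num_items_in_batch" in params

print(accepts_num_items_in_batch(forward_with))     # True
print(accepts_num_items_in_batch(forward_without))  # False
```

If the check fails, the trainer falls back to a per-step mean loss instead of normalizing by the true token count across accumulated steps, which is why the warning says results will be "very slightly less accurate" rather than wrong.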
I have version:

```
Name: unsloth
Version: 2025.8.10
```
I see this for Llama 3.2 1B and Qwen3-4B; probably other models too.