Commit 91d6fb8

Fix miscalculated token count (#9776)
### What problem does this PR solve?

The total token count was incorrectly accumulated when using the OpenAI-API-Compatible API.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
Parent: 45f52e8

File tree

1 file changed: 2 additions, 2 deletions


rag/llm/chat_model.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -374,7 +374,7 @@ def chat_streamly_with_tools(self, system: str, history: list, gen_conf: dict =
         if not tol:
             total_tokens += num_tokens_from_string(resp.choices[0].delta.content)
         else:
-            total_tokens += tol
+            total_tokens = tol

         finish_reason = resp.choices[0].finish_reason if hasattr(resp.choices[0], "finish_reason") else ""
         if finish_reason == "length":
@@ -410,7 +410,7 @@ def chat_streamly_with_tools(self, system: str, history: list, gen_conf: dict =
         if not tol:
             total_tokens += num_tokens_from_string(resp.choices[0].delta.content)
         else:
-            total_tokens += tol
+            total_tokens = tol
         answer += resp.choices[0].delta.content
         yield resp.choices[0].delta.content
```
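To illustrate why the fix uses assignment rather than `+=`: when an OpenAI-compatible provider reports token usage on a streaming chunk, that figure is typically the cumulative total for the whole request, so adding it to a running per-chunk estimate double-counts. The sketch below is a simplified, self-contained illustration of that logic; the chunk representation and the word-count tokenizer stub are assumptions for demonstration, not the actual RAGFlow code.

```python
def num_tokens_from_string(text: str) -> int:
    # Stand-in tokenizer for illustration; real code would use a
    # model-specific tokenizer.
    return max(1, len(text.split()))

def count_stream_tokens(chunks):
    """Accumulate usage from a stream of (content, reported_total) pairs.

    When a chunk carries a reported total, it is already cumulative for
    the whole request, so it must replace the running estimate. The bug
    fixed by this commit was doing `total_tokens += tol` here instead.
    """
    total_tokens = 0
    for content, tol in chunks:
        if not tol:
            # No usage reported on this chunk: estimate from the delta text.
            total_tokens += num_tokens_from_string(content)
        else:
            # Usage reported: assign, do not add.
            total_tokens = tol
    return total_tokens

# A stream where only the final chunk carries usage info:
stream = [("hello world", 0), ("more text here", 0), ("", 42)]
print(count_stream_tokens(stream))  # 42, not 42 plus the per-chunk estimates
```

With the buggy `+=` version, the per-chunk estimates (2 + 3 here) would be added to the provider-reported 42, inflating the count to 47.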
