Replies: 1 comment
You can get the log probabilities by passing `include_raw=True` when calling `llm.with_structured_output`; see https://python.langchain.com/docs/how_to/structured_output/#advanced-raw-outputs
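A minimal sketch of that approach (the schema and model name are illustrative; the model also has to be asked for logprobs, here via `ChatOpenAI`'s `logprobs` flag, for anything to show up on the raw message):

```python
from pydantic import BaseModel
from langchain_openai import ChatOpenAI


class Answer(BaseModel):
    answer: str


# include_raw=True makes the chain return {"raw": AIMessage, "parsed": ..., "parsing_error": ...},
# so any logprobs the API returned can be read off the raw message's response_metadata.
llm = ChatOpenAI(model="gpt-4o-mini", logprobs=True)
structured_llm = llm.with_structured_output(Answer, include_raw=True)

result = structured_llm.invoke("What is the capital of France?")
print(result["parsed"])                                  # parsed Answer (or None on a parse failure)
print(result["raw"].response_metadata.get("logprobs"))   # token-wise logprobs, if the API returned them
```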
Feature request
Currently, LangChain does not provide a way to retrieve log probabilities (logprobs) when using OpenAI models with structured outputs. However, this feature is valuable for applications that require confidence estimates, uncertainty quantification, or probabilistic reasoning over generated tokens.
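For comparison, token logprobs are already retrievable for plain (unstructured) chat calls; a minimal sketch of that path, with an illustrative model name:

```python
from langchain_openai import ChatOpenAI

# Plain chat call: bind logprobs=True and read the per-token logprobs back
# from the message's response_metadata.
llm = ChatOpenAI(model="gpt-4o-mini").bind(logprobs=True)

msg = llm.invoke("Say hello")
for entry in msg.response_metadata["logprobs"]["content"]:
    print(entry["token"], entry["logprob"])
```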
Expected Behavior:
• When requesting structured outputs, allow the user to enable logprobs in OpenAI calls.
• Expose token-wise log probabilities in the response for further analysis.
Proposed Solution:
• Modify the OpenAI wrapper to allow logprobs when using structured outputs.
• Ensure compatibility with existing response parsing mechanisms (a rough usage sketch follows below).
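A rough sketch of the envisioned usage, not current behavior (the schema, model name, and confidence heuristic are all illustrative): with logprobs enabled on the model and the raw message exposed alongside the parsed output, callers could derive a confidence estimate for a structured generation.

```python
import math

from pydantic import BaseModel
from langchain_openai import ChatOpenAI


class Sentiment(BaseModel):
    label: str


# Envisioned behavior: logprobs requested on the model flow through
# with_structured_output and come back on the raw message.
llm = ChatOpenAI(model="gpt-4o-mini", logprobs=True)
structured_llm = llm.with_structured_output(Sentiment, include_raw=True)

result = structured_llm.invoke("Classify the sentiment: 'I love this library.'")
token_logprobs = (result["raw"].response_metadata.get("logprobs") or {}).get("content") or []

# Geometric-mean token probability as a rough sequence-level confidence estimate.
if token_logprobs:
    avg_logprob = sum(t["logprob"] for t in token_logprobs) / len(token_logprobs)
    print(result["parsed"], "confidence ~", math.exp(avg_logprob))
```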
Motivation
Use Case:
• Evaluating model confidence in structured generations.
• Debugging and improving prompt engineering with probabilistic insights.