Fix: Correctly handle chat history in StreamContentAsync for function… #58
Conversation
Caution: Review failed. The pull request is closed.

Estimated code review effort: 2 (10–30 minutes)
Actionable comments posted: 0
🧹 Nitpick comments (1)
src/GenerativeAI/AiModels/ChatSession.cs (1)
185-185: Consider potential performance optimization for response text building.

The `sb.ToString()` call creates the final response text, but `RequestExtensions.FormatGenerateContentInput` might be doing additional processing that could be optimized. Consider caching the result if `FormatGenerateContentInput` is expensive, or verify whether the StringBuilder content can be used more directly:

```diff
-var finalModelResponseContent = RequestExtensions.FormatGenerateContentInput(sb.ToString(), Roles.Model);
+var responseText = sb.ToString();
+var finalModelResponseContent = RequestExtensions.FormatGenerateContentInput(responseText, Roles.Model);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/GenerativeAI/AiModels/ChatSession.cs (1 hunks)
🔇 Additional comments (3)
src/GenerativeAI/AiModels/ChatSession.cs (3)
176-176: Good approach to track history state before processing.

Capturing the history count before processing the request is a clean way to identify which content items are new in this request. This helps avoid confusion between existing history and new content.
190-197: Conditional history update logic correctly addresses the core issue.

The conditional logic properly handles both scenarios:

- When a valid user prompt is found: updates history with the user prompt and the model response
- When no valid user prompt is found: only updates `LastResponseContent` without modifying history

This approach ensures that intermediate function call steps don't corrupt the chat history while still maintaining the final response for reference. A minimal sketch of this conditional update appears below.
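As a hedged illustration of that pattern, assuming the identifiers discussed elsewhere in this review (`userPromptContent`, `finalModelResponseContent`, `History`, `LastResponseContent`), not the PR's verbatim code:

```csharp
// Hedged sketch: identifiers are taken from this review's discussion
// and may not match the PR's exact implementation.
if (userPromptContent != null)
{
    // A real user prompt started this turn: store it together with the
    // model's final text answer; intermediate function-call steps are
    // deliberately never appended to the history.
    History.Add(userPromptContent);
    History.Add(finalModelResponseContent);
}

// In both branches the final response remains available to callers.
LastResponseContent = finalModelResponseContent;
```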
186-189: Validate user prompt isolation in both streaming and non-streaming flows.

Verify that history is prepended to `request.Contents` before identifying the user prompt:

- Non-stream path: you're already calling `request.Contents.InsertRange(0, History); // line 132`, which confirms the ordering assumption.
- Streaming path: ensure that `base.StreamContentAsync(request)` also receives the same history-first ordering, so that

```csharp
var historyCountBeforeRequest = this.History.Count;
var userPromptContent = request.Contents
    .Skip(historyCountBeforeRequest)
    .FirstOrDefault(…);
```

always skips the history messages, not just the new ones.

Consider how to handle multiple user prompts after skipping history:

- With `FirstOrDefault`, only the first user message is used. If your scenario allows more than one new prompt in a single request, decide whether to switch to `SingleOrDefault`, `LastOrDefault`, or explicitly guard against multiple user entries (see the sketch below).

Please confirm both the history insertion in the streaming override and your intended behavior when multiple prompts are present.
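If an explicit guard is preferred, a minimal sketch could look like the following; note that `Roles.User` and the `Role` property are assumptions inferred from the `Roles.Model` identifier used elsewhere in this PR:

```csharp
// Hedged sketch (requires System.Linq): assumes request.Contents holds the
// history prefix first, followed by the new content items for this turn.
var newUserPrompts = request.Contents
    .Skip(historyCountBeforeRequest)
    .Where(c => c.Role == Roles.User)   // Roles.User is an assumed identifier
    .ToList();

if (newUserPrompts.Count > 1)
    throw new InvalidOperationException(
        "Expected at most one new user prompt per streaming turn.");

var userPromptContent = newUserPrompts.FirstOrDefault();
```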
Resolves #57
This PR resolves the critical history corruption issue within `StreamContentAsync` that occurred during streaming conversations with function calls.

The root cause was logic that used `request.Contents.Last()` to determine the user's input.

This fix implements more robust logic that isolates the turn's actual initial prompt and saves it paired with the model's final text answer, while ignoring all intermediate tool call steps, ensuring the chat history remains clean. A rough before/after sketch of this change follows.