[AIT-206] Add message per token guide for LangGraph #3157
base: main
Conversation
Walkthrough
This PR adds documentation and navigation for a new guide on token-by-token streaming with LangGraph. It includes navigation entries, an index tile linking to the guide, and a comprehensive tutorial document with publisher and subscriber code examples using Ably.
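The tutorial's publisher and subscriber examples both sit on top of an Ably Realtime channel. As a point of reference, a minimal setup might look like the following sketch; the API key placeholder and channel name are illustrative assumptions, not values taken from the guide.

```js
// Minimal Ably Realtime setup, used as shared context for the sketches further down.
// The key placeholder and channel name are assumptions for illustration only.
import * as Ably from 'ably';

const realtime = new Ably.Realtime({ key: 'YOUR_ABLY_API_KEY' });
const channel = realtime.channels.get('langgraph-response'); // hypothetical channel name
```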
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~22 minutes
🚥 Pre-merge checks: ✅ Passed checks (5 passed)
@coderabbitai review
✅ Actions performed: Review triggered.
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@src/data/nav/aitransport.ts`:
- Around lines 107-114: The nav entry named 'Vercel AI SDK token streaming - message per response' points to a non-existent page at '/docs/guides/ai-transport/vercel-message-per-response'. Either create the missing guide file, or update the nav link to the existing Vercel guide path '/docs/guides/ai-transport/vercel-message-per-token' (or rename the entry to match the existing file) by editing the object in src/data/nav/aitransport.ts so that the name and link correspond to an actual MDX file.
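One possible resolution (pointing the entry at the existing message-per-token guide and renaming it to match) could look like the sketch below. The `name` and `link` field names are assumptions about the nav schema in src/data/nav/aitransport.ts, not confirmed against the repo.

```js
// Hypothetical excerpt from src/data/nav/aitransport.ts; the field names are
// assumptions about the nav entry shape, shown only to illustrate the fix.
const vercelTokenStreamingNavEntry = {
  name: 'Vercel AI SDK token streaming - message per token',
  link: '/docs/guides/ai-transport/vercel-message-per-token',
};
```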
In `@src/pages/docs/guides/ai-transport/lang-graph-message-per-token.mdx`:
- Around lines 268-296: The responses Map is never cleaned up, causing unbounded growth. In the stop handler (the channel.subscribe('stop', ...) callback where responseId and finalText are obtained), delete the entry from responses after you finish using finalText (e.g., call responses.delete(responseId)) so that per-response state is released when the response completes; see the subscriber sketch after this list.
- Around lines 180-226: The module-scoped responseId causes cross-request reuse. Make responseId a local variable inside streamLangGraphResponse (declare let responseId = null at the top of that function) so each call gets its own ID. Update where you check and set it (inside the for-await loop) and include it in the start and token publishes as before, and guard the final stop publish so you only call channel.publish({ name: 'stop', ... }) when responseId was captured (i.e., if (responseId) publish stop) to avoid emitting stop for unrelated streams; see the publisher sketch after this list.
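For the first MDX comment (the responses Map cleanup), a minimal sketch of how the subscriber's stop handler might release per-response state. The Map keyed by responseId and the 'token'/'stop' event names follow the review comment; field names such as `delta` are illustrative assumptions, not the guide's exact code.

```js
// Sketch of the subscriber side with the suggested cleanup in the stop handler.
// Assumes tokens are accumulated in a Map keyed by responseId, per the review comment.
function subscribeToResponses(channel) {
  const responses = new Map(); // responseId -> accumulated text

  channel.subscribe('token', (message) => {
    const { responseId, delta } = message.data;
    responses.set(responseId, (responses.get(responseId) ?? '') + delta);
  });

  channel.subscribe('stop', (message) => {
    const { responseId } = message.data;
    const finalText = responses.get(responseId);
    console.log(finalText);
    responses.delete(responseId); // release per-response state once the response completes
  });
}
```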
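For the second MDX comment (the module-scoped responseId), a sketch of streamLangGraphResponse with responseId scoped to the call. The guarded stop publish and event names come from the review comment; the chunk shape (`chunk.id`, `chunk.text`) and the way the LangGraph stream is passed in are assumptions for illustration.

```js
// Sketch of the publisher with responseId local to each call, so concurrent or
// successive requests never reuse another stream's ID.
async function streamLangGraphResponse(channel, graphStream) {
  let responseId = null; // local to this call

  for await (const chunk of graphStream) {
    if (!responseId) {
      responseId = chunk.id; // capture the ID from the first streamed chunk (assumed field)
      await channel.publish({ name: 'start', data: { responseId } });
    }
    await channel.publish({ name: 'token', data: { responseId, delta: chunk.text } });
  }

  // Only emit stop when this call actually captured a response ID,
  // so completion is never signalled for an unrelated stream.
  if (responseId) {
    await channel.publish({ name: 'stop', data: { responseId } });
  }
}
```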
86bc5dc to 181c7d5 (Compare)
Description
Adds a message-per-token guide for LangGraph in JS, following the structure of the existing message-per-token guides.
Review App
Checklist
Summary by CodeRabbit