💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: c84fa59791
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```js
for await (const textPart of textStream) {
  yield { textPart };
}
```
Handle non-text stream events before yielding textPart
`agentMode` defaults to true and line 59 passes tools into `streamLLM()`, but this loop treats every streamed chunk as plain text; when the model emits a tool event, it gets forwarded as `textPart: { ... }` instead of a structured tool event. In the create-document flow this is concatenated into the suggestion string, producing invalid output like `[object Object]` whenever a tool is invoked. Please branch on event type here (as done in ChatThread streaming) or disable tools for this endpoint.
Useful? React with 👍 / 👎.
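The branching fix can be sketched as follows. The event shapes (plain strings for text chunks, objects carrying `toolCall`/`toolResult`) are assumptions about the `streamLLM()` contract for illustration, not verified against the repo:

```js
// Sketch of event-type branching, assuming streamLLM() yields plain strings
// for text and objects for tool activity. Field names are illustrative.
async function* routeStreamEvents(stream) {
  for await (const event of stream) {
    if (typeof event === 'string') {
      yield { textPart: event };          // plain text chunk
    } else if (event.toolCall) {
      yield { toolCall: event.toolCall }; // structured tool-call event
    } else if (event.toolResult) {
      yield { toolResult: event.toolResult };
    }
    // Anything else is dropped instead of being stringified into the text.
  }
}
```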
```js
  return client.post('', { action: 'ChatThread.toggleAgentMode', ...params }).then(res => res.data);
},
streamChatMessage: async function* streamChatMessage(params) {
  // Don't stream on Next.js or Netlify for now.
  const data = await client.post('', { action: 'ChatThread.createChatMessage', ...params }).then(res => res.data);
```
Avoid exposing agent toggle in Lambda path without wiring
In the Lambda API branch, this commit adds `toggleAgentMode`, but chat requests still go through `ChatThread.createChatMessage` (line 70), which does not use `chatThread.agentMode` or tool-enabled prompting. So the new toggle appears to work but has no effect in Next.js/Netlify deployments. Either route Lambda chat through the agent-aware path or hide/disable this toggle for Lambda mode.
Pull request overview
Adds an “Agent Mode” to chat threads that enables the LLM to call server-side tools (MongoDB exploration + script type-checking), streams tool-call status to the UI, and persists agent-mode + tool-call metadata.
Changes:
- Add `agentMode` to chat threads, plus a new API/action to toggle it from the frontend.
- Extend LLM streaming to emit tool-call/tool-result events and display tool activity in the chat UI.
- Introduce server-side agent tools (`estimatedDocumentCount`, `find`, `findOne`, `typeCheck`) and persist tool-call history on assistant messages.
Reviewed changes
Copilot reviewed 16 out of 16 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| package.json | Adds runtime deps needed for agent tooling (notably TypeScript). |
| frontend/src/chat/chat.js | Adds tool-call handling during streaming + agent-mode toggle method + scroll helper. |
| frontend/src/chat/chat.html | Adds Agent Mode toggle button and minor layout/icon tweaks. |
| frontend/src/chat/chat-message/chat-message.js | Adds tool input formatting for tool-call display. |
| frontend/src/chat/chat-message/chat-message.html | Renders tool-call status list above assistant messages; adjusts message width. |
| frontend/src/chat/chat-message-script/chat-message-script.html | Minor styling adjustments to script editor UI controls. |
| frontend/src/api.js | Adds ChatThread.toggleAgentMode() client API method. |
| backend/integrations/streamLLM.js | Switches to fullStream and emits text/tool-call/tool-result events. |
| backend/helpers/getAgentTools.js | New agent tool definitions (MongoDB exploration + script type-check). |
| backend/db/chatThreadSchema.js | Persists agentMode on chat threads. |
| backend/db/chatMessageSchema.js | Persists toolCalls metadata on chat messages. |
| backend/authorize.js | Adds authorization mapping for ChatThread.toggleAgentMode. |
| backend/actions/Model/streamChatMessage.js | Enables agent tools for model chat streaming (currently impacts streaming contract). |
| backend/actions/ChatThread/toggleAgentMode.js | New action to toggle agentMode for a thread. |
| backend/actions/ChatThread/streamChatMessage.js | Uses agent system prompt + emits tool-call/tool-result events + stores toolCalls. |
| backend/actions/ChatThread/index.js | Exports the new toggleAgentMode action. |
Comments suppressed due to low confidence (1)
backend/actions/Model/streamChatMessage.js:64
`Model.streamChatMessage` now passes through the output of `streamLLM()`, which can emit tool-call/tool-result events (objects). The current loop wraps every event as `{ textPart: event }`, so tool-call objects will be sent as `textPart` and break consumers that expect `textPart` to be a string (e.g. create-document AI streaming). Update the loop to detect `typeof event === 'string'` and yield `{ textPart: event }`; otherwise yield the tool event as-is or ignore it for this endpoint.
```js
const llmMessages = [{ role: 'user', content: [{ type: 'text', text: content }] }];
const llmOptions = agentMode ? { ...options, tools: getAgentTools(db) } : options;
const textStream = streamLLM(llmMessages, system, llmOptions);
for await (const textPart of textStream) {
  yield { textPart };
}
```
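A minimal version of the string-check fix described in the suppressed comment, assuming `streamLLM()` yields strings for text chunks and objects for tool events (an assumption about its contract, not confirmed by the diff):

```js
// Sketch: only string chunks become textPart; tool events are skipped so
// consumers that expect textPart to be a string never receive objects.
async function* streamTextOnly(textStream) {
  for await (const event of textStream) {
    if (typeof event === 'string') {
      yield { textPart: event };
    }
  }
}
```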
```js
} else {
  const assistantChatMessageIndex = this.chatMessages.indexOf(assistantChatMessage);
  assistantChatMessage = event.chatMessage;
  assistantChatMessage.toolCalls = toolCalls;
```
When the final assistant `chatMessage` arrives, the code overwrites `event.chatMessage.toolCalls` with the local `toolCalls` array (`assistantChatMessage.toolCalls = toolCalls`). Since the backend persists and returns `toolCalls` on the assistant message, this overwrite is redundant and can produce stale UI if the client missed a tool-result event. Prefer the server-provided `toolCalls` (or only fall back to local state when the field is absent).
Suggested change:

```diff
- assistantChatMessage.toolCalls = toolCalls;
+ // Prefer server-provided toolCalls; fall back to local state if absent.
+ if (assistantChatMessage.toolCalls && assistantChatMessage.toolCalls.length) {
+   toolCalls = assistantChatMessage.toolCalls;
+ } else if (toolCalls.length) {
+   assistantChatMessage.toolCalls = toolCalls;
+ }
```
```js
if (tc) {
  tc.status = 'done';
}
yield { toolResult: event.toolResult };
```
`toolResult` events currently get yielded to the client with the full output payload. Since the UI only uses the tool name/status, sending tool outputs increases bandwidth and lets users access sampled document data via the network inspector even though it is not rendered. Consider stripping the output before yielding to the client (or gating it behind a debug flag) and only persist/display a status update.
Suggested change:

```diff
- yield { toolResult: event.toolResult };
+ const sanitizedToolResult = {
+   toolName: event.toolResult.toolName,
+   status: tc ? tc.status : 'done'
+ };
+ yield { toolResult: sanitizedToolResult };
```
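As a standalone sketch of the sanitization idea (the `toolName`/`status` fields mirror the suggestion; the exact `toolResult` event shape is an assumption):

```js
// Keep only the fields the chat UI renders; drop the raw tool output so
// sampled document data never crosses the wire.
function sanitizeToolResult(toolResult, status = 'done') {
  return { toolName: toolResult.toolName, status };
}
```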
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>