
🤖 fix: prevent new chat streaming flash #3152

Open
ammar-agent wants to merge 2 commits into main from fix/new-chat-streaming-flash

Conversation


ammar-agent commented Apr 9, 2026

Summary

A newly created chat could briefly flash through generic empty/loading placeholders before its first send reached onChat. This change keeps that handoff in an explicit starting state so the startup barrier remains visible until the initial turn is replayed or a terminal pre-stream event settles it.

Background

Workspace creation navigates into the new workspace as soon as the workspace exists, but the first sendMessage and chat subscription replay can land a moment later. During that gap, the workspace shell treated the chat as empty or hydrating and could show "Catching up with the agent..." or "No Messages Yet" on brand-new chats.
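The gap can be illustrated with a minimal sketch (the interface and function names here are hypothetical, not the actual WorkspaceShell/ChatPane code):

```typescript
// Hypothetical sketch of the pre-fix placeholder choice. In the gap
// between workspace creation and the first send reaching onChat,
// hydration has already settled, no stream is attached, and history
// is empty, so the generic empty state wins.
interface ChatViewState {
  isHydrating: boolean; // chat subscription replay still in flight
  isStreaming: boolean; // a live stream is attached
  messageCount: number; // replayed history plus live deltas
}

function placeholderFor(state: ChatViewState): string | null {
  if (state.isHydrating) return "Catching up with the agent...";
  if (!state.isStreaming && state.messageCount === 0) {
    return "No Messages Yet";
  }
  return null; // render the real message list
}
```

A brand-new chat in the pre-stream gap maps to `{ isHydrating: false, isStreaming: false, messageCount: 0 }`, which is exactly the flashing empty state this PR suppresses.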

Implementation

  • move the optimistic new-chat startup state into StreamingMessageAggregator, alongside the existing pending-stream model/start-time state
  • mark that optimistic state from useCreationWorkspace only for auto-navigated creations so background-created workspaces do not later open in a stale starting state
  • let replay and terminal event handling clear that optimistic state in the same place as normal pending-stream cleanup:
    • empty catch-up cycles preserve the brand-new chat handoff
    • replayed first-user or assistant history, stream errors/aborts, reconnect idle confirmation, and background stream completion clear it
  • keep WorkspaceShell and ChatPane suppressing generic loading and empty placeholders while isStreamStarting is true
  • add focused aggregator/store regressions plus the delayed-send UI regression, and keep the Lottie stub in the related interrupt test so happy-dom reliably reaches the app shell
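The bullets above can be sketched as a simplified model (the method and field names are assumptions for illustration, not the actual StreamingMessageAggregator API):

```typescript
// Simplified model of the aggregator-owned optimistic start state.
// The flag is set only for auto-navigated creations and is cleared
// in the same places as normal pending-stream cleanup.
class AggregatorStartStateSketch {
  private optimisticStart = false;

  // Called from the creation path for auto-navigated workspaces only,
  // so background-created workspaces never open in a stale state.
  markOptimisticStart(): void {
    this.optimisticStart = true;
  }

  // Replay handling: an empty catch-up cycle preserves the brand-new
  // chat handoff; replayed first-turn history clears it.
  onCaughtUp(replayedMessageCount: number): void {
    if (replayedMessageCount > 0) this.optimisticStart = false;
  }

  // Terminal pre-stream events settle the startup barrier.
  onStreamError(): void { this.optimisticStart = false; }
  onStreamAbort(): void { this.optimisticStart = false; }
  onReconnectIdleConfirmed(): void { this.optimisticStart = false; }

  // WorkspaceShell/ChatPane suppress generic loading and empty
  // placeholders while this is true.
  get isStreamStarting(): boolean {
    return this.optimisticStart;
  }
}
```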

Validation

  • bun test ./src/browser/utils/messages/StreamingMessageAggregator.test.ts
  • bun test ./src/browser/stores/WorkspaceStore.test.ts
  • bun test ./src/browser/features/ChatInput/useCreationWorkspace.test.tsx
  • bun test ./tests/ui/chat/newChatStreamingFlash.test.ts
  • make static-check

Risks

This still changes the active-workspace startup/hydration handoff logic. The main regression risk is reconnect/startup cleanup for workspaces in the pre-stream gap, but the aggregator/store regressions now cover the empty replay, replay reset, reconnect, abort, and delayed-send creation cases more directly.

Pains

The original fix accumulated special-case store state as review comments exposed more replay edges. This rewrite folds that behavior back into the aggregator's existing pending-stream machinery so the starting-state rules live in one place.


Generated with mux • Model: openai:gpt-5.4 • Thinking: xhigh • Cost: $46.60

@ammar-agent

@codex review


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 82248db82c

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

ammar-agent force-pushed the fix/new-chat-streaming-flash branch from 82248db to b567850 on April 9, 2026 at 16:54
@ammar-agent

@codex review

Addressed the integration failure by making the new regression test compatible with the Jest-based integration runner as well as Bun.

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. Swish!


ammar-agent force-pushed the fix/new-chat-streaming-flash branch from b567850 to 6234138 on April 9, 2026 at 17:00
@ammar-agent

@codex review

Restricted the optimistic pending-start flag to auto-navigated creations and added hook coverage for the background-creation path.


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6234138ce2


ammar-agent force-pushed the fix/new-chat-streaming-flash branch from 6234138 to d21a216 on April 9, 2026 at 17:15
@ammar-agent

@codex review

Kept the optimistic pending-start flag through buffered first-turn replay until caught-up applies the history, and added a WorkspaceStore regression test for that timing.


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: d21a21607b


ammar-agent force-pushed the fix/new-chat-streaming-flash branch from d21a216 to f064605 on April 9, 2026 at 17:27
@ammar-agent

@codex review

Cleared stale optimistic start state when replay/activity confirms no active stream, and added WorkspaceStore coverage for both buffered replay and no-stream catch-up paths.


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f06460566e


ammar-agent force-pushed the fix/new-chat-streaming-flash branch from f064605 to d9dcdef on April 9, 2026 at 17:35
@ammar-agent

@codex review

Stopped clearing the optimistic-start flag on recency-only non-streaming activity updates, and added WorkspaceStore coverage for that exact startup gap.


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: d9dcdef0c9


ammar-agent force-pushed the fix/new-chat-streaming-flash branch from d9dcdef to 6e026b5 on April 9, 2026 at 17:49
@ammar-agent

@codex review

Kept the optimistic-start flag across caught-up-without-first-turn, while still clearing it on buffered/live first-turn observation and on definitive background stop.


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6e026b5709


ammar-agent force-pushed the fix/new-chat-streaming-flash branch from 6e026b5 to c7101d6 on April 9, 2026 at 18:01
@ammar-agent

@codex review

Preserved optimistic initial-send startup across full replay resets and added a direct WorkspaceStore regression test for that reset path.


chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: c7101d697a


Keep a newly created workspace in an optimistic starting state until the first
real send reaches onChat, suppressing the transient catch-up and empty-state
placeholders that could flash during the handoff from project creation to the
workspace chat.

Add a focused regression test that delays the initial send and verifies the
starting barrier stays visible throughout the transition.

---

_Generated with `mux` • Model: `openai:gpt-5.4` • Thinking: `xhigh` • Cost: `$17.56`_

ammar-agent force-pushed the fix/new-chat-streaming-flash branch from c7101d6 to 63063c3 on April 9, 2026 at 18:11
@ammar-agent

@codex review

Cleared optimistic pending-start on pre-stream abort as well, with a targeted WorkspaceStore regression test for the interrupted-first-send path.

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. Delightful!


Move the optimistic new-chat startup state into StreamingMessageAggregator so
WorkspaceStore can derive starting directly from aggregator-owned pending stream
state. This removes the extra transient pendingInitialSend bookkeeping while
keeping the startup barrier alive through empty catch-up cycles and replay resets.

---

_Generated with `mux` • Model: `openai:gpt-5.4` • Thinking: `xhigh` • Cost: `$46.60`_

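Under that design, the store-level derivation reduces to reading aggregator-owned state; a rough sketch (hypothetical names, not the actual WorkspaceStore code):

```typescript
// Sketch: WorkspaceStore derives "starting" straight from the
// aggregator that owns pending-stream state, with no separate
// transient pendingInitialSend flag to keep in sync.
interface AggregatorView {
  isStreamStarting: boolean;
}

function deriveStarting(
  aggregators: Map<string, AggregatorView>,
  workspaceId: string
): boolean {
  return aggregators.get(workspaceId)?.isStreamStarting ?? false;
}
```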
@ammar-agent

@codex review

Rewrote the new-chat startup fix so the optimistic pending-start state lives in StreamingMessageAggregator instead of a separate WorkspaceStore transient. Local validation is green (make static-check plus the focused startup regressions).

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. Can't wait for the next one!

