Member
@arvinxx arvinxx commented Sep 2, 2025

💻 Change Type

  • ✨ feat
  • 🐛 fix
  • ♻️ refactor
  • 💄 style
  • 👷 build
  • ⚡️ perf
  • ✅ test
  • 📝 docs
  • 🔨 chore

🔀 Description of Change

📝 Additional Information

Summary by Sourcery

Offload AI chat messaging to the server via new TRPC endpoints and client service, integrate a V2 chat action in the store, and improve performance with batched state updates

New Features:

  • Add a new TRPC aiChat router and server-side AiChatService to handle sendMessageInServer mutations
  • Implement generateAIChatV2 action and integrate it into the chat store for server-mode message dispatch

Enhancements:

  • Refactor sendMessage to route messages through sendMessageInServer in server mode for performance improvements
  • Introduce updateTopics and updateMessages reducer actions to apply full list updates from server responses
  • Make topic title optional in CreateTopicParams
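The optional-title change above can be sketched as follows; the `CreateTopicParams` name comes from the PR, while the other fields and the fallback helper are hypothetical illustrations:

```typescript
// Sketch of the CreateTopicParams change: title becomes optional so a topic
// can be created before a title exists (e.g. when the server generates one
// later). Fields other than title, and the helper below, are assumptions.
interface CreateTopicParams {
  title?: string; // previously required
  sessionId?: string;
  messages?: string[];
}

// Hypothetical helper: fall back to a placeholder when no title is given.
const resolveTopicTitle = (params: CreateTopicParams): string =>
  params.title ?? 'Untitled topic';
```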

Build:

  • Update Next.js configuration to ignore TypeScript build errors
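For reference, this setting in a standard next.config.ts looks like the sketch below. Note the trade-off: `next build` will no longer fail on type errors, so type safety has to be enforced elsewhere (editor, CI type-check step):

```typescript
// next.config.ts (sketch of the referenced setting, assuming the standard
// Next.js `typescript.ignoreBuildErrors` option)
const nextConfig = {
  typescript: {
    // Skips type checking during `next build`; real type errors will not
    // fail the build while this is enabled.
    ignoreBuildErrors: true,
  },
};

export default nextConfig;
```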

Chores:

  • Consolidate and reorder imports across database models, utility modules, and type packages


codecov bot commented Sep 2, 2025

Codecov Report

❌ Patch coverage is 9.52381% with 304 lines in your changes missing coverage. Please review.
✅ Project coverage is 76.60%. Comparing base (0a1dcf9) to head (a612916).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #9046      +/-   ##
==========================================
- Coverage   77.06%   76.60%   -0.47%     
==========================================
  Files         814      816       +2     
  Lines       48902    49237     +335     
  Branches     6343     5219    -1124     
==========================================
+ Hits        37688    37719      +31     
- Misses      11208    11512     +304     
  Partials        6        6              
Flag Coverage Δ
app 76.35% <9.56%> (-0.66%) ⬇️
database 95.93% <9.09%> (-0.33%) ⬇️
packages/electron-server-ipc 74.61% <ø> (ø)
packages/file-loaders 83.59% <ø> (ø)
packages/model-bank 100.00% <ø> (ø)
packages/model-runtime 74.62% <ø> (ø)
packages/prompts 100.00% <ø> (ø)
packages/utils 60.52% <0.00%> (-0.03%) ⬇️
packages/web-crawler 59.57% <ø> (ø)

Flags with carried forward coverage won't be shown.

Components Coverage Δ
Store 66.42% <7.49%> (-2.16%) ⬇️
Services 61.65% <47.05%> (-0.08%) ⬇️
Server 66.39% <ø> (ø)
Libs 42.11% <ø> (ø)
Utils 73.57% <ø> (ø)

Contributor

sourcery-ai bot commented Sep 2, 2025

Reviewer's Guide

This PR refactors the chat message flow to use a dedicated server-side API for faster processing: sendMessage calls are routed through a new server-mode pathway, topics and messages are updated in bulk via new reducer actions, and a matching backend router and service are provided. It also includes miscellaneous adjustments to imports, optional type parameters, and build settings.

Sequence diagram for the new server-side sendMessage flow

sequenceDiagram
    participant User
    participant Frontend
    participant FEChat as AiChatService (frontend)
    participant LambdaClient
    participant Router as LambdaRouter (backend)
    participant BEChat as AiChatService (backend)
    participant MessageModel
    participant TopicModel

    User->>Frontend: Triggers sendMessage
    Frontend->>FEChat: sendMessageInServer(params)
    FEChat->>LambdaClient: aiChat.sendMessageInServer.mutate(params)
    LambdaClient->>Router: sendMessageInServer
    Router->>BEChat: getMessagesAndTopics
    BEChat->>MessageModel: query messages
    BEChat->>TopicModel: query topics
    MessageModel-->>BEChat: messages
    TopicModel-->>BEChat: topics
    BEChat-->>Router: {messages, topics}
    Router-->>LambdaClient: response
    LambdaClient-->>FEChat: response
    FEChat-->>Frontend: response
    Frontend->>Frontend: internal_dispatchTopic/updateTopics
    Frontend->>Frontend: internal_dispatchMessage/updateMessages
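The round trip in the diagram can be sketched as follows. The service and method names (AiChatService, getMessagesAndTopics, sendMessageInServer) follow the diagram; the parameter and response shapes, and the in-process stubs standing in for the TRPC client and the database models, are assumptions for illustration:

```typescript
// Sketch of the server-side sendMessage round trip. Payload shapes are
// assumptions; the queries are stubbed where the real service would hit
// MessageModel / TopicModel.
interface SendMessageParams {
  topicId?: string;
  content: string;
}

interface ChatMessage { id: string; content: string }
interface ChatTopic { id: string; title?: string }

interface SendMessageResult {
  messages: ChatMessage[];
  topics: ChatTopic[];
}

// Backend: AiChatService aggregates the two model queries into one response,
// so the client gets the full updated lists in a single reply.
class AiChatService {
  async getMessagesAndTopics(params: SendMessageParams): Promise<SendMessageResult> {
    const messages = await this.queryMessages(params.topicId);
    const topics = await this.queryTopics();
    return { messages, topics };
  }

  private async queryMessages(_topicId?: string): Promise<ChatMessage[]> {
    return [{ id: 'msg_1', content: 'hello' }]; // stub for MessageModel
  }

  private async queryTopics(): Promise<ChatTopic[]> {
    return [{ id: 'tpc_1', title: 'demo' }]; // stub for TopicModel
  }
}

// Frontend: a thin wrapper standing in for
// lambdaClient.aiChat.sendMessageInServer.mutate(params). The store then
// applies the returned lists with one updateTopics and one updateMessages
// dispatch each, instead of many incremental updates.
async function sendMessageInServer(params: SendMessageParams): Promise<SendMessageResult> {
  const service = new AiChatService();
  return service.getMessagesAndTopics(params);
}
```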

Class diagram for updated reducers with bulk update actions

classDiagram
    class topicReducer {
        +updateTopics(value: ChatTopic[]): ChatTopic[]
    }
    class messagesReducer {
        +updateMessages(value: ChatMessage[]): ChatMessage[]
    }
    class ChatTopicDispatch {
        +AddChatTopicAction
        +UpdateChatTopicAction
        +DeleteChatTopicAction
        +UpdateTopicsAction
    }
    class MessageDispatch {
        +CreateMessage
        +UpdateMessage
        +UpdateMessages
        +UpdatePluginState
        +UpdateMessageExtra
        +DeleteMessage
    }
    topicReducer --> ChatTopicDispatch
    messagesReducer --> MessageDispatch
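The bulk-update actions in the class diagram can be sketched as a reducer case. The action and payload names (`updateMessages`, `value`) follow the diagram; the message shape and the sibling `deleteMessage` action are simplified assumptions:

```typescript
// Sketch of the new bulk-update action in the messages reducer.
interface ChatMessage {
  id: string;
  content: string;
}

type MessageDispatch =
  | { type: 'updateMessages'; value: ChatMessage[] } // new bulk action
  | { type: 'deleteMessage'; id: string };

const messagesReducer = (
  state: ChatMessage[],
  action: MessageDispatch,
): ChatMessage[] => {
  switch (action.type) {
    case 'updateMessages':
      // Replace the whole list with the server response in one dispatch,
      // instead of issuing one update per message.
      return action.value;
    case 'deleteMessage':
      return state.filter((message) => message.id !== action.id);
    default:
      return state;
  }
};
```

The topic reducer's `updateTopics` case works the same way: one dispatch swaps in the full list returned by the server.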

File-Level Changes

Change: Implement server-mode message sending with a new API pathway
Details:
  • Introduce AIGenerateV2Action and integrate generateAIChatV2
  • Route sendMessage to sendMessageInServer when in server mode
  • Add TRPC router endpoint sendMessageInServer in aiChatRouter
  • Implement AiChatService backend class and client-side aiChatService wrapper
Files:
  src/store/chat/slices/aiChat/actions/generateAIChat.ts
  src/store/chat/slices/aiChat/actions/index.ts
  src/store/chat/slices/aiChat/actions/generateAIChatV2.ts
  src/server/routers/lambda/aiChat.ts
  src/server/services/aiChat/index.ts
  src/services/aiChat.tsx
  packages/types/src/aiChat.ts

Change: Add bulk state update actions for topics and messages
Details:
  • Define UpdateTopicsAction and handle it in topicReducer
  • Define UpdateMessagesAction and handle it in messagesReducer
  • Extend the dispatch types to include the new updateTopics and updateMessages actions
Files:
  src/store/chat/slices/topic/reducer.ts
  src/store/chat/slices/message/reducer.ts

Change: Miscellaneous type, import, and configuration adjustments
Details:
  • Reorder imports in the topic database model and make title optional in CreateTopicParams
  • Enable ignoring TypeScript build errors in next.config.ts
  • Add the aiChat export to the types index and the object utility export
  • Define new aiChat types in packages/types/src/aiChat.ts
Files:
  packages/database/src/models/topic.ts
  next.config.ts
  packages/types/src/index.ts
  packages/types/src/aiChat.ts
  packages/utils/src/index.ts


@lobehubbot
Member

👍 @arvinxx

Thank you for raising your pull request and contributing to our community. Please make sure you have followed our contributing guidelines; we will review it as soon as possible.
If you encounter any problems, please feel free to connect with us.


vercel bot commented Sep 2, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project             Status    Preview  Comments  Updated (UTC)
lobe-chat-database  Ready     Preview  Comment   Sep 4, 2025 3:15am
lobe-chat-preview   Canceled                     Sep 4, 2025 3:15am

@arvinxx arvinxx force-pushed the refactor/send-message branch from 8858218 to d6ef723 Compare September 3, 2025 07:58
@tjx666
Collaborator

tjx666 commented Sep 3, 2025

Compared against https://lobe-chat-database-git-fork-sxjeru-93-lobehub-community.vercel.app/chat?topic=tpc_qGGsnIGwf6B2, this still feels noticeably faster, by about 2-3 seconds.

@arvinxx arvinxx force-pushed the refactor/send-message branch from fea6423 to 5903b58 Compare September 4, 2025 02:22

3 participants