diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 04416e1..6b4ce5b 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -11,33 +11,63 @@ Perplexica's codebase is organized as follows: - **UI Components and Pages**: - **Components (`src/components`)**: Reusable UI components. - **Pages and Routes (`src/app`)**: Next.js app directory structure with page components. - - Main app routes include: home (`/`), chat (`/c`), discover (`/discover`), library (`/library`), and settings (`/settings`). - - **API Routes (`src/app/api`)**: API endpoints implemented with Next.js API routes. - - `/api/chat`: Handles chat interactions. - - `/api/search`: Provides direct access to Perplexica's search capabilities. - - Other endpoints for models, files, and suggestions. + - Main app routes include: home (`/`), chat (`/c`), discover (`/discover`), and library (`/library`). + - **API Routes (`src/app/api`)**: Server endpoints implemented with Next.js route handlers. - **Backend Logic (`src/lib`)**: Contains all the backend functionality including search, database, and API logic. - - The search functionality is present inside `src/lib/search` directory. - - All of the focus modes are implemented using the Meta Search Agent class in `src/lib/search/metaSearchAgent.ts`. + - The search system lives in `src/lib/agents/search`. + - The search pipeline is split into classification, research, widgets, and writing. - Database functionality is in `src/lib/db`. - - Chat model and embedding model providers are managed in `src/lib/providers`. - - Prompt templates and LLM chain definitions are in `src/lib/prompts` and `src/lib/chains` respectively. + - Chat model and embedding model providers are in `src/lib/models/providers`, and models are loaded via `src/lib/models/registry.ts`. + - Prompt templates are in `src/lib/prompts`. + - SearXNG integration is in `src/lib/searxng.ts`. + - Upload search lives in `src/lib/uploads`. 
+ +### Where to make changes + +If you are not sure where to start, use this section as a map. + +- **Search behavior and reasoning** + + - `src/lib/agents/search` contains the core chat and search pipeline. + - `classifier.ts` decides whether research is needed and what should run. + - `researcher/` gathers information in the background. + +- **Add or change a search capability** + + - Research tools (web, academic, discussions, uploads, scraping) live in `src/lib/agents/search/researcher/actions`. + - Tools are registered in `src/lib/agents/search/researcher/actions/index.ts`. + +- **Add or change widgets** + + - Widgets live in `src/lib/agents/search/widgets`. + - Widgets run in parallel with research and show structured results in the UI. + +- **Model integrations** + + - Providers live in `src/lib/models/providers`. + - Add new providers there and wire them into the model registry so they show up in the app. + +- **Architecture docs** + - High level overview: `docs/architecture/README.md` + - High level flow: `docs/architecture/WORKING.md` ## API Documentation -Perplexica exposes several API endpoints for programmatic access, including: +Perplexica includes API documentation for programmatic access. -- **Search API**: Access Perplexica's advanced search capabilities directly via the `/api/search` endpoint. For detailed documentation, see `docs/api/search.md`. +- **Search API**: For detailed documentation, see `docs/API/SEARCH.md`. ## Setting Up Your Environment Before diving into coding, setting up your local environment is key. Here's what you need to do: -1. In the root directory, locate the `sample.config.toml` file. -2. Rename it to `config.toml` and fill in the necessary configuration fields. -3. Run `npm install` to install all dependencies. -4. Run `npm run db:migrate` to set up the local sqlite database. -5. Use `npm run dev` to start the application in development mode. +1. Run `npm install` to install all dependencies. +2. 
Use `npm run dev` to start the application in development mode. +3. Open http://localhost:3000 and complete the setup in the UI (API keys, models, search backend URL, etc.). + +Database migrations are applied automatically on startup. + +For full installation options (Docker and non Docker), see the installation guide in the repository README. **Please note**: Docker configurations are present for setting up production environments, whereas `npm run dev` is used for development purposes. diff --git a/README.md b/README.md index 9ef1f1d..49b1fc5 100644 --- a/README.md +++ b/README.md @@ -18,9 +18,11 @@ Want to know more about its architecture and how it works? You can read it [here 🤖 **Support for all major AI providers** - Use local LLMs through Ollama or connect to OpenAI, Anthropic Claude, Google Gemini, Groq, and more. Mix and match models based on your needs. -⚡ **Smart search modes** - Choose Balanced Mode for everyday searches, Fast Mode when you need quick answers, or wait for Quality Mode (coming soon) for deep research. +⚡ **Smart search modes** - Choose Speed Mode when you need quick answers, Balanced Mode for everyday searches, or Quality Mode for deep research. -🎯 **Six specialized focus modes** - Get better results with modes designed for specific tasks: Academic papers, YouTube videos, Reddit discussions, Wolfram Alpha calculations, writing assistance, or general web search. +🧭 **Pick your sources** - Search the web, discussions, or academic papers. More sources and integrations are in progress. + +🧩 **Widgets** - Helpful UI cards that show up when relevant, like weather, calculations, stock prices, and other quick lookups. 🔍 **Web search powered by SearxNG** - Access multiple search engines while keeping your identity private. Support for Tavily and Exa coming soon for even better results. @@ -61,9 +63,9 @@ We'd also like to thank the following partners for their generous support: -
+ - Exa + Exa @@ -81,7 +83,7 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker. Perplexica can be easily run using Docker. Simply run the following command: ```bash -docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:latest +docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:latest ``` This will pull and start the Perplexica container with the bundled SearxNG search engine. Once running, open your browser and navigate to http://localhost:3000. You can then configure your settings (API keys, models, etc.) directly in the setup screen. @@ -93,7 +95,7 @@ This will pull and start the Perplexica container with the bundled SearxNG searc If you already have SearxNG running, you can use the slim version of Perplexica: ```bash -docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:slim-latest +docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:slim-latest ``` **Important**: Make sure your SearxNG instance has: @@ -120,7 +122,7 @@ If you prefer to build from source or need more control: ```bash docker build -t perplexica . - docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica perplexica + docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica perplexica ``` 5. Access Perplexica at http://localhost:3000 and configure your settings in the setup screen. @@ -237,13 +239,8 @@ Perplexica runs on Next.js and handles all API requests. 
It works right away on ## Upcoming Features -- [x] Add settings page -- [x] Adding support for local LLMs -- [x] History Saving features -- [x] Introducing various Focus Modes -- [x] Adding API support -- [x] Adding Discover -- [ ] Finalizing Copilot Mode +- [ ] Adding more widgets, integrations, search sources +- [ ] Adding authentication ## Support Us diff --git a/docker-compose.yaml b/docker-compose.yaml index 50b6785..e2c245d 100644 --- a/docker-compose.yaml +++ b/docker-compose.yaml @@ -1,6 +1,8 @@ services: perplexica: image: itzcrazykns1337/perplexica:latest + build: + context: . ports: - '3000:3000' volumes: diff --git a/docs/API/SEARCH.md b/docs/API/SEARCH.md index 04f11ef..0c35a81 100644 --- a/docs/API/SEARCH.md +++ b/docs/API/SEARCH.md @@ -57,7 +57,7 @@ Use the `id` field as the `providerId` and the `key` field from the models array ### Request -The API accepts a JSON object in the request body, where you define the focus mode, chat models, embedding models, and your query. +The API accepts a JSON object in the request body, where you define the enabled search `sources`, chat models, embedding models, and your query. #### Request Body Structure @@ -72,7 +72,7 @@ "key": "text-embedding-3-large" }, "optimizationMode": "speed", - "focusMode": "webSearch", + "sources": ["web"], "query": "What is Perplexica", "history": [ ["human", "Hi, how are you?"], @@ -87,24 +87,25 @@ ### Request Parameters -- **`chatModel`** (object, optional): Defines the chat model to be used for the query. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`. +- **`chatModel`** (object, required): Defines the chat model to be used for the query. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`. 
- `providerId` (string): The UUID of the provider. You can get this from the `/api/providers` endpoint response. - `key` (string): The model key/identifier (e.g., `gpt-4o-mini`, `llama3.1:latest`). Use the `key` value from the provider's `chatModels` array, not the display name. -- **`embeddingModel`** (object, optional): Defines the embedding model for similarity-based searching. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`. +- **`embeddingModel`** (object, required): Defines the embedding model for similarity-based searching. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`. - `providerId` (string): The UUID of the embedding provider. You can get this from the `/api/providers` endpoint response. - `key` (string): The embedding model key (e.g., `text-embedding-3-large`, `nomic-embed-text`). Use the `key` value from the provider's `embeddingModels` array, not the display name. -- **`focusMode`** (string, required): Specifies which focus mode to use. Available modes: +- **`sources`** (array, required): Which search sources to enable. Available values: - - `webSearch`, `academicSearch`, `writingAssistant`, `wolframAlphaSearch`, `youtubeSearch`, `redditSearch`. + - `web`, `academic`, `discussions`. - **`optimizationMode`** (string, optional): Specifies the optimization mode to control the balance between performance and quality. Available modes: - `speed`: Prioritize speed and return the fastest answer. - `balanced`: Provide a balanced answer with good speed and reasonable quality. + - `quality`: Prioritize answer quality (may be slower). - **`query`** (string, required): The search query or question. @@ -132,14 +133,14 @@ The response from the API includes both the final message and the sources used t "message": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online. 
Here are some key features and characteristics of Perplexica:\n\n- **AI-Powered Technology**: It utilizes advanced machine learning algorithms to not only retrieve information but also to understand the context and intent behind user queries, providing more relevant results [1][5].\n\n- **Open-Source**: Being open-source, Perplexica offers flexibility and transparency, allowing users to explore its functionalities without the constraints of proprietary software [3][10].", "sources": [ { - "pageContent": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online.", + "content": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online.", "metadata": { "title": "What is Perplexica, and how does it function as an AI-powered search ...", "url": "https://askai.glarity.app/search/What-is-Perplexica--and-how-does-it-function-as-an-AI-powered-search-engine" } }, { - "pageContent": "Perplexica is an open-source AI-powered search tool that dives deep into the internet to find precise answers.", + "content": "Perplexica is an open-source AI-powered search tool that dives deep into the internet to find precise answers.", "metadata": { "title": "Sahar Mor's Post", "url": "https://www.linkedin.com/posts/sahar-mor_a-new-open-source-project-called-perplexica-activity-7204489745668694016-ncja" @@ -158,7 +159,7 @@ Example of streamed response objects: ``` {"type":"init","data":"Stream connected"} -{"type":"sources","data":[{"pageContent":"...","metadata":{"title":"...","url":"..."}},...]} +{"type":"sources","data":[{"content":"...","metadata":{"title":"...","url":"..."}},...]} {"type":"response","data":"Perplexica is an "} {"type":"response","data":"innovative, open-source "} {"type":"response","data":"AI-powered search engine..."} @@ -174,9 +175,9 @@ Clients should process each line as a separate JSON object. 
The different messag ### Fields in the Response -- **`message`** (string): The search result, generated based on the query and focus mode. +- **`message`** (string): The search result, generated based on the query and enabled `sources`. - **`sources`** (array): A list of sources that were used to generate the search result. Each source includes: - - `pageContent`: A snippet of the relevant content from the source. + - `content`: A snippet of the relevant content from the source. - `metadata`: Metadata about the source, including: - `title`: The title of the webpage. - `url`: The URL of the webpage. @@ -185,5 +186,5 @@ Clients should process each line as a separate JSON object. The different messag If an error occurs during the search process, the API will return an appropriate error message with an HTTP status code. -- **400**: If the request is malformed or missing required fields (e.g., no focus mode or query). +- **400**: If the request is malformed or missing required fields (e.g., no `sources` or `query`). - **500**: If an internal server error occurs during the search. diff --git a/docs/architecture/README.md b/docs/architecture/README.md index 5732471..5593b37 100644 --- a/docs/architecture/README.md +++ b/docs/architecture/README.md @@ -1,11 +1,38 @@ -# Perplexica's Architecture +# Perplexica Architecture -Perplexica's architecture consists of the following key components: +Perplexica is a Next.js application that combines an AI chat experience with search. -1. **User Interface**: A web-based interface that allows users to interact with Perplexica for searching images, videos, and much more. -2. **Agent/Chains**: These components predict Perplexica's next actions, understand user queries, and decide whether a web search is necessary. -3. **SearXNG**: A metadata search engine used by Perplexica to search the web for sources. -4. 
**LLMs (Large Language Models)**: Utilized by agents and chains for tasks like understanding content, writing responses, and citing sources. Examples include Claude, GPTs, etc. -5. **Embedding Models**: To improve the accuracy of search results, embedding models re-rank the results using similarity search algorithms such as cosine similarity and dot product distance. +For a high level flow, see [WORKING.md](WORKING.md). For deeper implementation details, see [CONTRIBUTING.md](../../CONTRIBUTING.md). -For a more detailed explanation of how these components work together, see [WORKING.md](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/WORKING.md). +## Key components + +1. **User Interface** + + - A web based UI that lets users chat, search, and view citations. + +2. **API Routes** + + - `POST /api/chat` powers the chat UI. + - `POST /api/search` provides a programmatic search endpoint. + - `GET /api/providers` lists available providers and model keys. + +3. **Agents and Orchestration** + + - The system classifies the question first. + - It can run research and widgets in parallel. + - It generates the final answer and includes citations. + +4. **Search Backend** + + - A meta search backend is used to fetch relevant web results when research is enabled. + +5. **LLMs (Large Language Models)** + + - Used for classification, writing answers, and producing citations. + +6. **Embedding Models** + + - Used for semantic search over user uploaded files. + +7. **Storage** + - Chats and messages are stored so conversations can be reloaded. diff --git a/docs/architecture/WORKING.md b/docs/architecture/WORKING.md index 6bad4f9..af29b90 100644 --- a/docs/architecture/WORKING.md +++ b/docs/architecture/WORKING.md @@ -1,19 +1,72 @@ -# How does Perplexica work? +# How Perplexica Works -Curious about how Perplexica works? Don't worry, we'll cover it here. 
Before we begin, make sure you've read about the architecture of Perplexica to ensure you understand what it's made up of. Haven't read it? You can read it [here](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/README.md). +This is a high level overview of how Perplexica answers a question. -We'll understand how Perplexica works by taking an example of a scenario where a user asks: "How does an A.C. work?". We'll break down the process into steps to make it easier to understand. The steps are as follows: +If you want a component level overview, see [README.md](README.md). -1. The message is sent to the `/api/chat` route where it invokes the chain. The chain will depend on your focus mode. For this example, let's assume we use the "webSearch" focus mode. -2. The chain is now invoked; first, the message is passed to another chain where it first predicts (using the chat history and the question) whether there is a need for sources and searching the web. If there is, it will generate a query (in accordance with the chat history) for searching the web that we'll take up later. If not, the chain will end there, and then the answer generator chain, also known as the response generator, will be started. -3. The query returned by the first chain is passed to SearXNG to search the web for information. -4. After the information is retrieved, it is based on keyword-based search. We then convert the information into embeddings and the query as well, then we perform a similarity search to find the most relevant sources to answer the query. -5. After all this is done, the sources are passed to the response generator. This chain takes all the chat history, the query, and the sources. It generates a response that is streamed to the UI. +If you want implementation details, see [CONTRIBUTING.md](../../CONTRIBUTING.md). -## How are the answers cited? +## What happens when you ask a question -The LLMs are prompted to do so. 
We've prompted them so well that they cite the answers themselves, and using some UI magic, we display it to the user. +When you send a message in the UI, the app calls `POST /api/chat`. -## Image and Video Search +At a high level, we do three things: -Image and video searches are conducted in a similar manner. A query is always generated first, then we search the web for images and videos that match the query. These results are then returned to the user. +1. Classify the question and decide what to do next. +2. Run research and widgets in parallel. +3. Write the final answer and include citations. + +## Classification + +Before searching or answering, we run a classification step. + +This step decides things like: + +- Whether we should do research for this question +- Whether we should show any widgets +- How to rewrite the question into a clearer standalone form + +## Widgets + +Widgets are small, structured helpers that can run alongside research. + +Examples include weather, stocks, and simple calculations. + +If a widget is relevant, we show it in the UI while the answer is still being generated. + +Widgets are helpful context for the answer, but they are not part of what the model should cite. + +## Research + +If research is needed, we gather information in the background while widgets can run. + +Depending on configuration, research may include web lookup and searching user uploaded files. + +## Answer generation + +Once we have enough context, the chat model generates the final response. + +You can control the tradeoff between speed and quality using `optimizationMode`: + +- `speed` +- `balanced` +- `quality` + +## How citations work + +We prompt the model to cite the references it used. The UI then renders those citations alongside the supporting links. + +## Search API + +If you are integrating Perplexica into another product, you can call `POST /api/search`. 
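+A minimal TypeScript sketch of such an integration (the field names follow the Search API doc; the provider UUIDs, model keys, and base URL are placeholders — fetch real values from `GET /api/providers`):

```typescript
// Illustrative only: providerId values are placeholders.
// Get real UUIDs and model keys from GET /api/providers.
const body = {
  chatModel: {
    providerId: '00000000-0000-0000-0000-000000000000',
    key: 'gpt-4o-mini',
  },
  embeddingModel: {
    providerId: '00000000-0000-0000-0000-000000000000',
    key: 'text-embedding-3-large',
  },
  optimizationMode: 'balanced', // 'speed' | 'balanced' | 'quality'
  sources: ['web'], // any of 'web', 'academic', 'discussions'
  query: 'What is Perplexica',
  history: [] as [string, string][],
};

async function search(baseUrl: string) {
  const res = await fetch(`${baseUrl}/api/search`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Search request failed: ${res.status}`);
  // Non-streaming responses resolve to a { message, sources } object.
  return res.json() as Promise<{ message: string; sources: unknown[] }>;
}
```

Against a default local install you would call `search('http://localhost:3000')`.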
+ +It returns: + +- `message`: the generated answer +- `sources`: supporting references used for the answer + +You can also enable streaming by setting `stream: true`. + +## Image and video search + +Image and video search use separate endpoints (`POST /api/images` and `POST /api/videos`). We generate a focused query using the chat model, then fetch matching results from a search backend. diff --git a/docs/installation/UPDATING.md b/docs/installation/UPDATING.md index 0603671..4f2be75 100644 --- a/docs/installation/UPDATING.md +++ b/docs/installation/UPDATING.md @@ -10,7 +10,7 @@ Simply pull the latest image and restart your container: docker pull itzcrazykns1337/perplexica:latest docker stop perplexica docker rm perplexica -docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:latest +docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:latest ``` For slim version: @@ -19,7 +19,7 @@ For slim version: docker pull itzcrazykns1337/perplexica:slim-latest docker stop perplexica docker rm perplexica -docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:slim-latest +docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:slim-latest ``` Once updated, go to http://localhost:3000 and verify the latest changes. Your settings are preserved automatically. 
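The `stream: true` mode mentioned in the Search API notes above emits newline-delimited JSON events. A sketch of accumulating them on a client (event shapes follow the documented streaming example; treating `done` as the terminator is an assumption):

```typescript
type StreamEvent =
  | { type: 'init'; data: string }
  | { type: 'sources'; data: { content: string; metadata: { title: string; url: string } }[] }
  | { type: 'response'; data: string }
  | { type: 'done' };

// Fold newline-delimited events into the final message plus its sources.
// A real client would split the HTTP response body on '\n' first.
function accumulate(lines: string[]) {
  let message = '';
  let sources: unknown[] = [];
  for (const line of lines) {
    if (!line.trim()) continue; // skip blank lines between events
    const event = JSON.parse(line) as StreamEvent;
    if (event.type === 'response') message += event.data;
    else if (event.type === 'sources') sources = event.data;
  }
  return { message, sources };
}
```

Each `response` chunk is appended in order, so the concatenation reproduces the same `message` a non-streaming call would return.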
diff --git a/drizzle/0002_daffy_wrecker.sql b/drizzle/0002_daffy_wrecker.sql new file mode 100644 index 0000000..1520a65 --- /dev/null +++ b/drizzle/0002_daffy_wrecker.sql @@ -0,0 +1 @@ +/* do nothing */ \ No newline at end of file diff --git a/drizzle/meta/0002_snapshot.json b/drizzle/meta/0002_snapshot.json new file mode 100644 index 0000000..feb820c --- /dev/null +++ b/drizzle/meta/0002_snapshot.json @@ -0,0 +1,132 @@ +{ + "version": "6", + "dialect": "sqlite", + "id": "1c5eb804-d6b4-48ec-9a8f-75fb729c8e52", + "prevId": "6dedf55f-0e44-478f-82cf-14a21ac686f8", + "tables": { + "chats": { + "name": "chats", + "columns": { + "id": { + "name": "id", + "type": "text", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "title": { + "name": "title", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "createdAt": { + "name": "createdAt", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "sources": { + "name": "sources", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "files": { + "name": "files", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false, + "default": "'[]'" + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + }, + "messages": { + "name": "messages", + "columns": { + "id": { + "name": "id", + "type": "integer", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "messageId": { + "name": "messageId", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "chatId": { + "name": "chatId", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "backendId": { + "name": "backendId", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "query": { + "name": "query", + "type": 
"text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "createdAt": { + "name": "createdAt", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + }, + "responseBlocks": { + "name": "responseBlocks", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false, + "default": "'[]'" + }, + "status": { + "name": "status", + "type": "text", + "primaryKey": false, + "notNull": false, + "autoincrement": false, + "default": "'answering'" + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {}, + "checkConstraints": {} + } + }, + "views": {}, + "enums": {}, + "_meta": { + "schemas": {}, + "tables": {}, + "columns": {} + }, + "internal": { + "indexes": {} + } +} diff --git a/drizzle/meta/_journal.json b/drizzle/meta/_journal.json index cf1610b..c271ddc 100644 --- a/drizzle/meta/_journal.json +++ b/drizzle/meta/_journal.json @@ -15,6 +15,13 @@ "when": 1758863991284, "tag": "0001_wise_rockslide", "breakpoints": true + }, + { + "idx": 2, + "version": "6", + "when": 1763732708332, + "tag": "0002_daffy_wrecker", + "breakpoints": true } ] } diff --git a/next-env.d.ts b/next-env.d.ts index 1b3be08..c4b7818 100644 --- a/next-env.d.ts +++ b/next-env.d.ts @@ -1,5 +1,6 @@ /// /// +import "./.next/dev/types/routes.d.ts"; // NOTE: This file should not be edited // see https://nextjs.org/docs/app/api-reference/config/typescript for more information. 
diff --git a/next.config.mjs b/next.config.mjs index 2300ff4..5770f76 100644 --- a/next.config.mjs +++ b/next.config.mjs @@ -1,3 +1,5 @@ +import pkg from './package.json' with { type: 'json' }; + /** @type {import('next').NextConfig} */ const nextConfig = { output: 'standalone', @@ -9,6 +11,9 @@ const nextConfig = { ], }, serverExternalPackages: ['pdf-parse'], + env: { + NEXT_PUBLIC_VERSION: pkg.version, + }, }; export default nextConfig; diff --git a/package.json b/package.json index 7083b66..1040261 100644 --- a/package.json +++ b/package.json @@ -11,53 +11,55 @@ "format:write": "prettier . --write" }, "dependencies": { + "@google/genai": "^1.34.0", "@headlessui/react": "^2.2.0", "@headlessui/tailwindcss": "^0.2.2", - "@huggingface/transformers": "^3.7.5", - "@iarna/toml": "^2.2.5", + "@huggingface/transformers": "^3.8.1", "@icons-pack/react-simple-icons": "^12.3.0", - "@langchain/anthropic": "^1.0.0", - "@langchain/community": "^1.0.0", - "@langchain/core": "^1.0.1", - "@langchain/google-genai": "^1.0.0", - "@langchain/groq": "^1.0.0", - "@langchain/ollama": "^1.0.0", - "@langchain/openai": "^1.0.0", - "@langchain/textsplitters": "^1.0.0", + "@phosphor-icons/react": "^2.1.10", + "@radix-ui/react-tooltip": "^1.2.8", "@tailwindcss/typography": "^0.5.12", + "@types/jspdf": "^2.0.0", "axios": "^1.8.3", "better-sqlite3": "^11.9.1", "clsx": "^2.1.0", - "compute-cosine-similarity": "^1.1.0", "drizzle-orm": "^0.40.1", - "framer-motion": "^12.23.24", - "html-to-text": "^9.0.5", - "jspdf": "^3.0.1", - "langchain": "^1.0.1", - "lucide-react": "^0.363.0", + "js-tiktoken": "^1.0.21", + "jspdf": "^3.0.4", + "lightweight-charts": "^5.0.9", + "lucide-react": "^0.556.0", "mammoth": "^1.9.1", "markdown-to-jsx": "^7.7.2", - "next": "^15.2.2", + "mathjs": "^15.1.0", + "motion": "^12.23.26", + "next": "^16.0.7", "next-themes": "^0.3.0", - "pdf-parse": "^1.1.1", + "officeparser": "^5.2.2", + "ollama": "^0.6.3", + "openai": "^6.9.0", + "partial-json": "^0.1.7", + "pdf-parse": 
"^2.4.5", "react": "^18", "react-dom": "^18", + "react-syntax-highlighter": "^16.1.0", "react-text-to-speech": "^0.14.5", "react-textarea-autosize": "^8.5.3", + "rfc6902": "^5.1.2", "sonner": "^1.4.41", "tailwind-merge": "^2.2.2", - "winston": "^3.17.0", + "turndown": "^7.2.2", + "yahoo-finance2": "^3.10.2", "yet-another-react-lightbox": "^3.17.2", - "zod": "^3.22.4" + "zod": "^4.1.12" }, "devDependencies": { "@types/better-sqlite3": "^7.6.12", - "@types/html-to-text": "^9.0.4", - "@types/jspdf": "^2.0.0", "@types/node": "^24.8.1", "@types/pdf-parse": "^1.1.4", "@types/react": "^18", "@types/react-dom": "^18", + "@types/react-syntax-highlighter": "^15.5.13", + "@types/turndown": "^5.0.6", "autoprefixer": "^10.0.1", "drizzle-kit": "^0.30.5", "eslint": "^8", diff --git a/src/app/api/chat/route.ts b/src/app/api/chat/route.ts index 25b8104..6362ebc 100644 --- a/src/app/api/chat/route.ts +++ b/src/app/api/chat/route.ts @@ -1,14 +1,14 @@ -import crypto from 'crypto'; -import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages'; -import { EventEmitter } from 'stream'; -import db from '@/lib/db'; -import { chats, messages as messagesSchema } from '@/lib/db/schema'; -import { and, eq, gt } from 'drizzle-orm'; -import { getFileDetails } from '@/lib/utils/files'; -import { searchHandlers } from '@/lib/search'; import { z } from 'zod'; import ModelRegistry from '@/lib/models/registry'; import { ModelWithProvider } from '@/lib/models/types'; +import SearchAgent from '@/lib/agents/search'; +import SessionManager from '@/lib/session'; +import { ChatTurnMessage } from '@/lib/types'; +import { SearchSources } from '@/lib/agents/search/types'; +import db from '@/lib/db'; +import { eq } from 'drizzle-orm'; +import { chats } from '@/lib/db/schema'; +import UploadManager from '@/lib/uploads/manager'; export const runtime = 'nodejs'; export const dynamic = 'force-dynamic'; @@ -20,47 +20,25 @@ const messageSchema = z.object({ }); const chatModelSchema: z.ZodType = 
  z.object({
-    providerId: z.string({
-      errorMap: () => ({
-        message: 'Chat model provider id must be provided',
-      }),
-    }),
-    key: z.string({
-      errorMap: () => ({
-        message: 'Chat model key must be provided',
-      }),
-    }),
+    providerId: z.string({ message: 'Chat model provider id must be provided' }),
+    key: z.string({ message: 'Chat model key must be provided' }),
  });
 
 const embeddingModelSchema: z.ZodType<ModelWithProvider> = z.object({
   providerId: z.string({
-    errorMap: () => ({
-      message: 'Embedding model provider id must be provided',
-    }),
-  }),
-  key: z.string({
-    errorMap: () => ({
-      message: 'Embedding model key must be provided',
-    }),
+    message: 'Embedding model provider id must be provided',
   }),
+  key: z.string({ message: 'Embedding model key must be provided' }),
 });
 
 const bodySchema = z.object({
   message: messageSchema,
   optimizationMode: z.enum(['speed', 'balanced', 'quality'], {
-    errorMap: () => ({
-      message: 'Optimization mode must be one of: speed, balanced, quality',
-    }),
+    message: 'Optimization mode must be one of: speed, balanced, quality',
   }),
-  focusMode: z.string().min(1, 'Focus mode is required'),
+  sources: z.array(z.string()).optional().default([]),
   history: z
-    .array(
-      z.tuple([z.string(), z.string()], {
-        errorMap: () => ({
-          message: 'History items must be tuples of two strings',
-        }),
-      }),
-    )
+    .array(z.tuple([z.string(), z.string()]))
     .optional()
     .default([]),
   files: z.array(z.string()).optional().default([]),
@@ -69,7 +47,6 @@ const bodySchema = z.object({
   systemInstructions: z.string().nullable().optional().default(''),
 });
 
-type Message = z.infer<typeof messageSchema>;
 type Body = z.infer<typeof bodySchema>;
 
 const safeValidateBody = (data: unknown) => {
@@ -78,7 +55,7 @@ const safeValidateBody = (data: unknown) => {
   if (!result.success) {
     return {
       success: false,
-      error: result.error.errors.map((e) => ({
+      error: result.error.issues.map((e: any) => ({
         path: e.path.join('.'),
         message: e.message,
       })),
@@ -91,143 +68,35 @@
   };
 };
 
-const handleEmitterEvents = async (
-  stream: EventEmitter,
-  writer: WritableStreamDefaultWriter,
-  encoder: TextEncoder,
-  chatId: string,
-) => {
-  let receivedMessage = '';
-  const aiMessageId = crypto.randomBytes(7).toString('hex');
-
-  stream.on('data', (data) => {
-    const parsedData = JSON.parse(data);
-    if (parsedData.type === 'response') {
-      writer.write(
-        encoder.encode(
-          JSON.stringify({
-            type: 'message',
-            data: parsedData.data,
-            messageId: aiMessageId,
-          }) + '\n',
-        ),
-      );
-
-      receivedMessage += parsedData.data;
-    } else if (parsedData.type === 'sources') {
-      writer.write(
-        encoder.encode(
-          JSON.stringify({
-            type: 'sources',
-            data: parsedData.data,
-            messageId: aiMessageId,
-          }) + '\n',
-        ),
-      );
-
-      const sourceMessageId = crypto.randomBytes(7).toString('hex');
-
-      db.insert(messagesSchema)
-        .values({
-          chatId: chatId,
-          messageId: sourceMessageId,
-          role: 'source',
-          sources: parsedData.data,
-          createdAt: new Date().toString(),
-        })
-        .execute();
-    }
-  });
-  stream.on('end', () => {
-    writer.write(
-      encoder.encode(
-        JSON.stringify({
-          type: 'messageEnd',
-        }) + '\n',
-      ),
-    );
-    writer.close();
-
-    db.insert(messagesSchema)
-      .values({
-        content: receivedMessage,
-        chatId: chatId,
-        messageId: aiMessageId,
-        role: 'assistant',
-        createdAt: new Date().toString(),
+const ensureChatExists = async (input: {
+  id: string;
+  sources: SearchSources[];
+  query: string;
+  fileIds: string[];
+}) => {
+  try {
+    const exists = await db.query.chats
+      .findFirst({
+        where: eq(chats.id, input.id),
       })
       .execute();
-  });
-  stream.on('error', (data) => {
-    const parsedData = JSON.parse(data);
-    writer.write(
-      encoder.encode(
-        JSON.stringify({
-          type: 'error',
-          data: parsedData.data,
+
+    if (!exists) {
+      await db.insert(chats).values({
+        id: input.id,
+        createdAt: new Date().toISOString(),
+        sources: input.sources,
+        title: input.query,
+        files: input.fileIds.map((id) => {
+          return {
+            fileId: id,
+            name: UploadManager.getFile(id)?.name || 'Uploaded File',
+          };
         }),
-      ),
-    );
-    writer.close();
-  });
-};
-
-const handleHistorySave = async (
-  message: Message,
-  humanMessageId: string,
-  focusMode: string,
-  files: string[],
-) => {
-  const chat = await db.query.chats.findFirst({
-    where: eq(chats.id, message.chatId),
-  });
-
-  const fileData = files.map(getFileDetails);
-
-  if (!chat) {
-    await db
-      .insert(chats)
-      .values({
-        id: message.chatId,
-        title: message.content,
-        createdAt: new Date().toString(),
-        focusMode: focusMode,
-        files: fileData,
-      })
-      .execute();
-  } else if (JSON.stringify(chat.files ?? []) != JSON.stringify(fileData)) {
-    db.update(chats)
-      .set({
-        files: files.map(getFileDetails),
-      })
-      .where(eq(chats.id, message.chatId));
-  }
-
-  const messageExists = await db.query.messages.findFirst({
-    where: eq(messagesSchema.messageId, humanMessageId),
-  });
-
-  if (!messageExists) {
-    await db
-      .insert(messagesSchema)
-      .values({
-        content: message.content,
-        chatId: message.chatId,
-        messageId: humanMessageId,
-        role: 'user',
-        createdAt: new Date().toString(),
-      })
-      .execute();
-  } else {
-    await db
-      .delete(messagesSchema)
-      .where(
-        and(
-          gt(messagesSchema.id, messageExists.id),
-          eq(messagesSchema.chatId, message.chatId),
-        ),
-      )
-      .execute();
+      });
+    }
+  } catch (err) {
+    console.error('Failed to check/save chat:', err);
   }
 };
 
@@ -236,6 +105,7 @@ export const POST = async (req: Request) => {
     const reqBody = (await req.json()) as Body;
 
     const parseBody = safeValidateBody(reqBody);
+
     if (!parseBody.success) {
       return Response.json(
         { message: 'Invalid request body', error: parseBody.error },
@@ -265,48 +135,107 @@
       ),
     ]);
 
-    const humanMessageId =
-      message.messageId ?? crypto.randomBytes(7).toString('hex');
-
-    const history: BaseMessage[] = body.history.map((msg) => {
+    const history: ChatTurnMessage[] = body.history.map((msg) => {
       if (msg[0] === 'human') {
-        return new HumanMessage({
+        return {
+          role: 'user',
           content: msg[1],
-        });
+        };
       } else {
-        return new AIMessage({
+        return {
+          role: 'assistant',
           content: msg[1],
-        });
+        };
       }
     });
 
-    const handler = searchHandlers[body.focusMode];
-
-    if (!handler) {
-      return Response.json(
-        {
-          message: 'Invalid focus mode',
-        },
-        { status: 400 },
-      );
-    }
-
-    const stream = await handler.searchAndAnswer(
-      message.content,
-      history,
-      llm,
-      embedding,
-      body.optimizationMode,
-      body.files,
-      body.systemInstructions as string,
-    );
+    const agent = new SearchAgent();
+    const session = SessionManager.createSession();
 
     const responseStream = new TransformStream();
     const writer = responseStream.writable.getWriter();
     const encoder = new TextEncoder();
 
-    handleEmitterEvents(stream, writer, encoder, message.chatId);
-    handleHistorySave(message, humanMessageId, body.focusMode, body.files);
+    const disconnect = session.subscribe((event: string, data: any) => {
+      if (event === 'data') {
+        if (data.type === 'block') {
+          writer.write(
+            encoder.encode(
+              JSON.stringify({
+                type: 'block',
+                block: data.block,
+              }) + '\n',
+            ),
+          );
+        } else if (data.type === 'updateBlock') {
+          writer.write(
+            encoder.encode(
+              JSON.stringify({
+                type: 'updateBlock',
+                blockId: data.blockId,
+                patch: data.patch,
+              }) + '\n',
+            ),
+          );
+        } else if (data.type === 'researchComplete') {
+          writer.write(
+            encoder.encode(
+              JSON.stringify({
+                type: 'researchComplete',
+              }) + '\n',
+            ),
+          );
+        }
+      } else if (event === 'end') {
+        writer.write(
+          encoder.encode(
+            JSON.stringify({
+              type: 'messageEnd',
+            }) + '\n',
+          ),
+        );
+        writer.close();
+        session.removeAllListeners();
+      } else if (event === 'error') {
+        writer.write(
+          encoder.encode(
+            JSON.stringify({
+              type: 'error',
+              data: data.data,
+            }) + '\n',
+          ),
+        );
+        writer.close();
+        session.removeAllListeners();
+      }
+    });
+
+    agent.searchAsync(session, {
+      chatHistory: history,
+      followUp: message.content,
+      chatId: body.message.chatId,
+      messageId: body.message.messageId,
+      config: {
+        llm,
+        embedding: embedding,
+        sources: body.sources as SearchSources[],
+        mode: body.optimizationMode,
+        fileIds: body.files,
+        systemInstructions: body.systemInstructions || 'None',
+      },
+    });
+
+    ensureChatExists({
+      id: body.message.chatId,
+      sources: body.sources as SearchSources[],
+      fileIds: body.files,
+      query: body.message.content,
+    });
+
+    req.signal.addEventListener('abort', () => {
+      disconnect();
+      writer.close();
+    });
 
     return new Response(responseStream.readable, {
       headers: {
diff --git a/src/app/api/images/route.ts b/src/app/api/images/route.ts
index d3416ca..9cfabb2 100644
--- a/src/app/api/images/route.ts
+++ b/src/app/api/images/route.ts
@@ -1,7 +1,6 @@
-import handleImageSearch from '@/lib/chains/imageSearchAgent';
+import searchImages from '@/lib/agents/media/image';
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
 
 interface ImageSearchBody {
   query: string;
@@ -13,16 +12,6 @@ export const POST = async (req: Request) => {
   try {
     const body: ImageSearchBody = await req.json();
 
-    const chatHistory = body.chatHistory
-      .map((msg: any) => {
-        if (msg.role === 'user') {
-          return new HumanMessage(msg.content);
-        } else if (msg.role === 'assistant') {
-          return new AIMessage(msg.content);
-        }
-      })
-      .filter((msg) => msg !== undefined) as BaseMessage[];
-
     const registry = new ModelRegistry();
 
     const llm = await registry.loadChatModel(
@@ -30,9 +19,9 @@ export const POST = async (req: Request) => {
       body.chatModel.key,
     );
 
-    const images = await handleImageSearch(
+    const images = await searchImages(
       {
-        chat_history: chatHistory,
+        chatHistory: body.chatHistory,
         query: body.query,
       },
       llm,
diff --git
a/src/app/api/reconnect/[id]/route.ts b/src/app/api/reconnect/[id]/route.ts
new file mode 100644
index 0000000..08be11b
--- /dev/null
+++ b/src/app/api/reconnect/[id]/route.ts
@@ -0,0 +1,93 @@
+import SessionManager from '@/lib/session';
+
+export const POST = async (
+  req: Request,
+  { params }: { params: Promise<{ id: string }> },
+) => {
+  try {
+    const { id } = await params;
+
+    const session = SessionManager.getSession(id);
+
+    if (!session) {
+      return Response.json({ message: 'Session not found' }, { status: 404 });
+    }
+
+    const responseStream = new TransformStream();
+    const writer = responseStream.writable.getWriter();
+    const encoder = new TextEncoder();
+
+    const disconnect = session.subscribe((event, data) => {
+      if (event === 'data') {
+        if (data.type === 'block') {
+          writer.write(
+            encoder.encode(
+              JSON.stringify({
+                type: 'block',
+                block: data.block,
+              }) + '\n',
+            ),
+          );
+        } else if (data.type === 'updateBlock') {
+          writer.write(
+            encoder.encode(
+              JSON.stringify({
+                type: 'updateBlock',
+                blockId: data.blockId,
+                patch: data.patch,
+              }) + '\n',
+            ),
+          );
+        } else if (data.type === 'researchComplete') {
+          writer.write(
+            encoder.encode(
+              JSON.stringify({
+                type: 'researchComplete',
+              }) + '\n',
+            ),
+          );
+        }
+      } else if (event === 'end') {
+        writer.write(
+          encoder.encode(
+            JSON.stringify({
+              type: 'messageEnd',
+            }) + '\n',
+          ),
+        );
+        writer.close();
+        disconnect();
+      } else if (event === 'error') {
+        writer.write(
+          encoder.encode(
+            JSON.stringify({
+              type: 'error',
+              data: data.data,
+            }) + '\n',
+          ),
+        );
+        writer.close();
+        disconnect();
+      }
+    });
+
+    req.signal.addEventListener('abort', () => {
+      disconnect();
+      writer.close();
+    });
+
+    return new Response(responseStream.readable, {
+      headers: {
+        'Content-Type': 'text/event-stream',
+        Connection: 'keep-alive',
+        'Cache-Control': 'no-cache, no-transform',
+      },
+    });
+  } catch (err) {
+    console.error('Error in reconnecting to session stream: ', err);
+    return Response.json(
+      { message: 'An error has occurred.' },
+      { status: 500 },
+    );
+  }
+};
diff --git a/src/app/api/search/route.ts b/src/app/api/search/route.ts
index bc7255f..0991268 100644
--- a/src/app/api/search/route.ts
+++ b/src/app/api/search/route.ts
@@ -1,12 +1,13 @@
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
-import { MetaSearchAgentType } from '@/lib/search/metaSearchAgent';
-import { searchHandlers } from '@/lib/search';
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
+import SessionManager from '@/lib/session';
+import { ChatTurnMessage } from '@/lib/types';
+import { SearchSources } from '@/lib/agents/search/types';
+import APISearchAgent from '@/lib/agents/search/api';
 
 interface ChatRequestBody {
-  optimizationMode: 'speed' | 'balanced';
-  focusMode: string;
+  optimizationMode: 'speed' | 'balanced' | 'quality';
+  sources: SearchSources[];
   chatModel: ModelWithProvider;
   embeddingModel: ModelWithProvider;
   query: string;
@@ -19,23 +20,17 @@ export const POST = async (req: Request) => {
   try {
     const body: ChatRequestBody = await req.json();
 
-    if (!body.focusMode || !body.query) {
+    if (!body.sources || !body.query) {
       return Response.json(
-        { message: 'Missing focus mode or query' },
+        { message: 'Missing sources or query' },
        { status: 400 },
      );
    }
 
     body.history = body.history || [];
-    body.optimizationMode = body.optimizationMode || 'balanced';
+    body.optimizationMode = body.optimizationMode || 'speed';
     body.stream = body.stream || false;
 
-    const history: BaseMessage[] = body.history.map((msg) => {
-      return msg[0] === 'human'
-        ? new HumanMessage({ content: msg[1] })
-        : new AIMessage({ content: msg[1] });
-    });
-
     const registry = new ModelRegistry();
 
     const [llm, embeddings] = await Promise.all([
@@ -46,21 +41,30 @@ export const POST = async (req: Request) => {
       ),
     ]);
 
-    const searchHandler: MetaSearchAgentType = searchHandlers[body.focusMode];
+    const history: ChatTurnMessage[] = body.history.map((msg) => {
+      return msg[0] === 'human'
+        ? { role: 'user', content: msg[1] }
+        : { role: 'assistant', content: msg[1] };
+    });
 
-    if (!searchHandler) {
-      return Response.json({ message: 'Invalid focus mode' }, { status: 400 });
-    }
+    const session = SessionManager.createSession();
 
-    const emitter = await searchHandler.searchAndAnswer(
-      body.query,
-      history,
-      llm,
-      embeddings,
-      body.optimizationMode,
-      [],
-      body.systemInstructions || '',
-    );
+    const agent = new APISearchAgent();
+
+    agent.searchAsync(session, {
+      chatHistory: history,
+      config: {
+        embedding: embeddings,
+        llm: llm,
+        sources: body.sources,
+        mode: body.optimizationMode,
+        fileIds: [],
+        systemInstructions: body.systemInstructions || '',
+      },
+      followUp: body.query,
+      chatId: crypto.randomUUID(),
+      messageId: crypto.randomUUID(),
+    });
 
     if (!body.stream) {
       return new Promise(
@@ -71,36 +75,37 @@
           let message = '';
           let sources: any[] = [];
 
-          emitter.on('data', (data: string) => {
-            try {
-              const parsedData = JSON.parse(data);
-              if (parsedData.type === 'response') {
-                message += parsedData.data;
-              } else if (parsedData.type === 'sources') {
-                sources = parsedData.data;
+          session.subscribe((event: string, data: Record<string, any>) => {
+            if (event === 'data') {
+              try {
+                if (data.type === 'response') {
+                  message += data.data;
+                } else if (data.type === 'searchResults') {
+                  sources = data.data;
+                }
+              } catch (error) {
+                reject(
+                  Response.json(
+                    { message: 'Error parsing data' },
+                    { status: 500 },
+                  ),
+                );
               }
-            } catch (error) {
+            }
+
+            if (event === 'end') {
+              resolve(Response.json({ message, sources }, { status: 200 }));
+            }
+
+            if (event === 'error') {
               reject(
                 Response.json(
-                  { message: 'Error parsing data' },
+                  { message: 'Search error', error: data },
                   { status: 500 },
                 ),
               );
             }
           });
-
-          emitter.on('end', () => {
-            resolve(Response.json({ message, sources }, { status: 200 }));
-          });
-
-          emitter.on('error', (error: any) => {
-            reject(
-              Response.json(
-                { message: 'Search error', error },
-                { status: 500 },
-              ),
-            );
-          });
         },
       );
     }
@@ -124,61 +129,61 @@
       );
 
       signal.addEventListener('abort', () => {
-        emitter.removeAllListeners();
+        session.removeAllListeners();
 
         try {
           controller.close();
         } catch (error) {}
       });
 
-      emitter.on('data', (data: string) => {
-        if (signal.aborted) return;
+      session.subscribe((event: string, data: Record<string, any>) => {
+        if (event === 'data') {
+          if (signal.aborted) return;
 
-        try {
-          const parsedData = JSON.parse(data);
-
-          if (parsedData.type === 'response') {
-            controller.enqueue(
-              encoder.encode(
-                JSON.stringify({
-                  type: 'response',
-                  data: parsedData.data,
-                }) + '\n',
-              ),
-            );
-          } else if (parsedData.type === 'sources') {
-            sources = parsedData.data;
-            controller.enqueue(
-              encoder.encode(
-                JSON.stringify({
-                  type: 'sources',
-                  data: sources,
-                }) + '\n',
-              ),
-            );
+          try {
+            if (data.type === 'response') {
+              controller.enqueue(
+                encoder.encode(
+                  JSON.stringify({
+                    type: 'response',
+                    data: data.data,
+                  }) + '\n',
+                ),
+              );
+            } else if (data.type === 'searchResults') {
+              sources = data.data;
+              controller.enqueue(
+                encoder.encode(
+                  JSON.stringify({
+                    type: 'sources',
+                    data: sources,
+                  }) + '\n',
+                ),
+              );
+            }
+          } catch (error) {
+            controller.error(error);
           }
-        } catch (error) {
-          controller.error(error);
         }
-      });
 
-      emitter.on('end', () => {
-        if (signal.aborted) return;
+        if (event === 'end') {
+          if (signal.aborted) return;
 
-        controller.enqueue(
-          encoder.encode(
-            JSON.stringify({
-              type: 'done',
-            }) + '\n',
-          ),
-        );
-        controller.close();
-      });
+          controller.enqueue(
+            encoder.encode(
+              JSON.stringify({
+                type: 'done',
+              }) + '\n',
+            ),
+          );
+          controller.close();
+        }
 
-      emitter.on('error', (error: any) => {
-        if (signal.aborted) return;
+        if (event === 'error') {
+          if (signal.aborted) return;
 
-        controller.error(error);
+          controller.error(data);
+        }
       });
     },
     cancel() {
diff --git a/src/app/api/suggestions/route.ts b/src/app/api/suggestions/route.ts
index d8312cf..07432d6 100644
--- a/src/app/api/suggestions/route.ts
+++ b/src/app/api/suggestions/route.ts
@@ -1,8 +1,6 @@
-import generateSuggestions from '@/lib/chains/suggestionGeneratorAgent';
+import generateSuggestions from '@/lib/agents/suggestions';
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
-import { BaseChatModel } from '@langchain/core/language_models/chat_models';
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
 
 interface SuggestionsGenerationBody {
   chatHistory: any[];
@@ -13,16 +11,6 @@ export const POST = async (req: Request) => {
   try {
     const body: SuggestionsGenerationBody = await req.json();
 
-    const chatHistory = body.chatHistory
-      .map((msg: any) => {
-        if (msg.role === 'user') {
-          return new HumanMessage(msg.content);
-        } else if (msg.role === 'assistant') {
-          return new AIMessage(msg.content);
-        }
-      })
-      .filter((msg) => msg !== undefined) as BaseMessage[];
-
     const registry = new ModelRegistry();
 
     const llm = await registry.loadChatModel(
@@ -32,7 +20,7 @@ export const POST = async (req: Request) => {
 
     const suggestions = await generateSuggestions(
       {
-        chat_history: chatHistory,
+        chatHistory: body.chatHistory,
       },
       llm,
     );
diff --git a/src/app/api/uploads/route.ts b/src/app/api/uploads/route.ts
index 2a275f4..9cac0f7 100644
--- a/src/app/api/uploads/route.ts
+++ b/src/app/api/uploads/route.ts
@@ -1,39 +1,16 @@
 import { NextResponse } from 'next/server';
-import fs from 'fs';
-import path from 'path';
-import crypto from 'crypto';
-import { PDFLoader } from
'@langchain/community/document_loaders/fs/pdf';
-import { DocxLoader } from '@langchain/community/document_loaders/fs/docx';
-import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters';
-import { Document } from '@langchain/core/documents';
 import ModelRegistry from '@/lib/models/registry';
-
-interface FileRes {
-  fileName: string;
-  fileExtension: string;
-  fileId: string;
-}
-
-const uploadDir = path.join(process.cwd(), 'uploads');
-
-if (!fs.existsSync(uploadDir)) {
-  fs.mkdirSync(uploadDir, { recursive: true });
-}
-
-const splitter = new RecursiveCharacterTextSplitter({
-  chunkSize: 500,
-  chunkOverlap: 100,
-});
+import UploadManager from '@/lib/uploads/manager';
 
 export async function POST(req: Request) {
   try {
     const formData = await req.formData();
     const files = formData.getAll('files') as File[];
-    const embedding_model = formData.get('embedding_model_key') as string;
-    const embedding_model_provider = formData.get('embedding_model_provider_id') as string;
+    const embeddingModel = formData.get('embedding_model_key') as string;
+    const embeddingModelProvider = formData.get('embedding_model_provider_id') as string;
 
-    if (!embedding_model || !embedding_model_provider) {
+    if (!embeddingModel || !embeddingModelProvider) {
       return NextResponse.json(
         { message: 'Missing embedding model or provider' },
         { status: 400 },
@@ -42,73 +19,13 @@ export async function POST(req: Request) {
 
     const registry = new ModelRegistry();
 
-    const model = await registry.loadEmbeddingModel(embedding_model_provider, embedding_model);
+    const model = await registry.loadEmbeddingModel(embeddingModelProvider, embeddingModel);
+
+    const uploadManager = new UploadManager({
+      embeddingModel: model,
+    });
 
-    const processedFiles: FileRes[] = [];
-
-    await Promise.all(
-      files.map(async (file: any) => {
-        const fileExtension = file.name.split('.').pop();
-        if (!['pdf', 'docx', 'txt'].includes(fileExtension!)) {
-          return NextResponse.json(
-            { message: 'File type not supported' },
-            { status: 400 },
-          );
-        }
-
-        const uniqueFileName = `${crypto.randomBytes(16).toString('hex')}.${fileExtension}`;
-        const filePath = path.join(uploadDir, uniqueFileName);
-
-        const buffer = Buffer.from(await file.arrayBuffer());
-        fs.writeFileSync(filePath, new Uint8Array(buffer));
-
-        let docs: any[] = [];
-        if (fileExtension === 'pdf') {
-          const loader = new PDFLoader(filePath);
-          docs = await loader.load();
-        } else if (fileExtension === 'docx') {
-          const loader = new DocxLoader(filePath);
-          docs = await loader.load();
-        } else if (fileExtension === 'txt') {
-          const text = fs.readFileSync(filePath, 'utf-8');
-          docs = [
-            new Document({ pageContent: text, metadata: { title: file.name } }),
-          ];
-        }
-
-        const splitted = await splitter.splitDocuments(docs);
-
-        const extractedDataPath = filePath.replace(/\.\w+$/, '-extracted.json');
-        fs.writeFileSync(
-          extractedDataPath,
-          JSON.stringify({
-            title: file.name,
-            contents: splitted.map((doc) => doc.pageContent),
-          }),
-        );
-
-        const embeddings = await model.embedDocuments(
-          splitted.map((doc) => doc.pageContent),
-        );
-        const embeddingsDataPath = filePath.replace(
-          /\.\w+$/,
-          '-embeddings.json',
-        );
-        fs.writeFileSync(
-          embeddingsDataPath,
-          JSON.stringify({
-            title: file.name,
-            embeddings,
-          }),
-        );
-
-        processedFiles.push({
-          fileName: file.name,
-          fileExtension: fileExtension,
-          fileId: uniqueFileName.replace(/\.\w+$/, ''),
-        });
-      }),
-    );
+    const processedFiles = await uploadManager.processFiles(files);
 
     return NextResponse.json({
       files: processedFiles,
diff --git a/src/app/api/videos/route.ts b/src/app/api/videos/route.ts
index 02e5909..0d5e03c 100644
--- a/src/app/api/videos/route.ts
+++ b/src/app/api/videos/route.ts
@@ -1,7 +1,6 @@
-import handleVideoSearch from '@/lib/chains/videoSearchAgent';
+import handleVideoSearch from '@/lib/agents/media/video';
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
 
 interface VideoSearchBody {
   query: string;
@@ -13,16 +12,6 @@ export const POST = async (req: Request) => {
   try {
     const body: VideoSearchBody = await req.json();
 
-    const chatHistory = body.chatHistory
-      .map((msg: any) => {
-        if (msg.role === 'user') {
-          return new HumanMessage(msg.content);
-        } else if (msg.role === 'assistant') {
-          return new AIMessage(msg.content);
-        }
-      })
-      .filter((msg) => msg !== undefined) as BaseMessage[];
-
     const registry = new ModelRegistry();
 
     const llm = await registry.loadChatModel(
@@ -32,7 +21,7 @@ export const POST = async (req: Request) => {
       body.chatModel.key,
     );
 
     const videos = await handleVideoSearch(
       {
-        chat_history: chatHistory,
+        chatHistory: body.chatHistory,
         query: body.query,
       },
       llm,
diff --git a/src/app/c/[chatId]/page.tsx b/src/app/c/[chatId]/page.tsx
index 39b93f0..06cd823 100644
--- a/src/app/c/[chatId]/page.tsx
+++ b/src/app/c/[chatId]/page.tsx
@@ -1,10 +1,5 @@
 'use client';
 
 import ChatWindow from '@/components/ChatWindow';
-import React from 'react';
 
-const Page = () => {
-  return <ChatWindow />;
-};
-
-export default Page;
+export default ChatWindow;
diff --git a/src/app/layout.tsx b/src/app/layout.tsx
index e9fd8c7..535a0e0 100644
--- a/src/app/layout.tsx
+++ b/src/app/layout.tsx
@@ -34,7 +34,7 @@ export default function RootLayout({
   return (
-
+
         {setupComplete ?
(
diff --git a/src/app/library/page.tsx b/src/app/library/page.tsx
index 9c40b2b..3eb923e 100644
--- a/src/app/library/page.tsx
+++ b/src/app/library/page.tsx
@@ -1,8 +1,8 @@
 'use client';
 
 import DeleteChat from '@/components/DeleteChat';
-import { cn, formatTimeDifference } from '@/lib/utils';
-import { BookOpenText, ClockIcon, Delete, ScanEye } from 'lucide-react';
+import { formatTimeDifference } from '@/lib/utils';
+import { BookOpenText, ClockIcon, FileText, Globe2Icon } from 'lucide-react';
 import Link from 'next/link';
 import { useEffect, useState } from 'react';
 
@@ -10,7 +10,8 @@ export interface Chat {
   id: string;
   title: string;
   createdAt: string;
-  focusMode: string;
+  sources: string[];
+  files: { fileId: string; name: string }[];
 }
 
 const Page = () => {
@@ -37,74 +38,137 @@ const Page = () => {
     fetchChats();
   }, []);
 
-  return loading ? (
- -
- ) : ( + return (
-
-
- -

Library

-
-
-
- {chats.length === 0 && ( -
-

- No chats found. -

-
- )} - {chats.length > 0 && ( -
- {chats.map((chat, i) => ( -
- +
+
+ +
+

- {chat.title} - -
-
- -

- {formatTimeDifference(new Date(), chat.createdAt)} Ago -

-
- + Library +

+
+ Past chats, sources, and uploads.
- ))} +
+ +
+ + + {loading + ? 'Loading…' + : `${chats.length} ${chats.length === 1 ? 'chat' : 'chats'}`} + +
+
+
+ + {loading ? ( +
+ +
+ ) : chats.length === 0 ? ( +
+
+ +
+

+ No chats found. +

+

+ + Start a new chat + {' '} + to see it listed here. +

+
+ ) : ( +
+
+ {chats.map((chat, index) => { + const sourcesLabel = + chat.sources.length === 0 + ? null + : chat.sources.length <= 2 + ? chat.sources + .map((s) => s.charAt(0).toUpperCase() + s.slice(1)) + .join(', ') + : `${chat.sources + .slice(0, 2) + .map((s) => s.charAt(0).toUpperCase() + s.slice(1)) + .join(', ')} + ${chat.sources.length - 2}`; + + return ( +
+
+ + {chat.title} + +
+ +
+
+ +
+ + + {formatTimeDifference(new Date(), chat.createdAt)} Ago + + + {sourcesLabel && ( + + + {sourcesLabel} + + )} + {chat.files.length > 0 && ( + + + {chat.files.length}{' '} + {chat.files.length === 1 ? 'file' : 'files'} + + )} +
+
+ ); + })} +
)}
diff --git a/src/components/AssistantSteps.tsx b/src/components/AssistantSteps.tsx new file mode 100644 index 0000000..c715a92 --- /dev/null +++ b/src/components/AssistantSteps.tsx @@ -0,0 +1,266 @@ +'use client'; + +import { + Brain, + Search, + FileText, + ChevronDown, + ChevronUp, + BookSearch, +} from 'lucide-react'; +import { motion, AnimatePresence } from 'framer-motion'; +import { useEffect, useState } from 'react'; +import { ResearchBlock, ResearchBlockSubStep } from '@/lib/types'; +import { useChat } from '@/lib/hooks/useChat'; + +const getStepIcon = (step: ResearchBlockSubStep) => { + if (step.type === 'reasoning') { + return ; + } else if (step.type === 'searching' || step.type === 'upload_searching') { + return ; + } else if ( + step.type === 'search_results' || + step.type === 'upload_search_results' + ) { + return ; + } else if (step.type === 'reading') { + return ; + } + + return null; +}; + +const getStepTitle = ( + step: ResearchBlockSubStep, + isStreaming: boolean, +): string => { + if (step.type === 'reasoning') { + return isStreaming && !step.reasoning ? 'Thinking...' : 'Thinking'; + } else if (step.type === 'searching') { + return `Searching ${step.searching.length} ${step.searching.length === 1 ? 'query' : 'queries'}`; + } else if (step.type === 'search_results') { + return `Found ${step.reading.length} ${step.reading.length === 1 ? 'result' : 'results'}`; + } else if (step.type === 'reading') { + return `Reading ${step.reading.length} ${step.reading.length === 1 ? 'source' : 'sources'}`; + } else if (step.type === 'upload_searching') { + return 'Scanning your uploaded documents'; + } else if (step.type === 'upload_search_results') { + return `Reading ${step.results.length} ${step.results.length === 1 ? 
'document' : 'documents'}`; + } + + return 'Processing'; +}; + +const AssistantSteps = ({ + block, + status, + isLast, +}: { + block: ResearchBlock; + status: 'answering' | 'completed' | 'error'; + isLast: boolean; +}) => { + const [isExpanded, setIsExpanded] = useState( + isLast && status === 'answering' ? true : false, + ); + const { researchEnded, loading } = useChat(); + + useEffect(() => { + if (researchEnded && isLast) { + setIsExpanded(false); + } else if (status === 'answering' && isLast) { + setIsExpanded(true); + } + }, [researchEnded, status]); + + if (!block || block.data.subSteps.length === 0) return null; + + return ( +
+ + + + {isExpanded && ( + +
+ {block.data.subSteps.map((step, index) => { + const isLastStep = index === block.data.subSteps.length - 1; + const isStreaming = loading && isLastStep && !researchEnded; + + return ( + +
+
+ {getStepIcon(step)} +
+ {index < block.data.subSteps.length - 1 && ( +
+ )} +
+ +
+ + {getStepTitle(step, isStreaming)} + + + {step.type === 'reasoning' && ( + <> + {step.reasoning && ( +

+ {step.reasoning} +

+ )} + {isStreaming && !step.reasoning && ( +
+
+
+
+
+ )} + + )} + + {step.type === 'searching' && + step.searching.length > 0 && ( +
+ {step.searching.map((query, idx) => ( + + {query} + + ))} +
+ )} + + {(step.type === 'search_results' || + step.type === 'reading') && + step.reading.length > 0 && ( +
+ {step.reading.slice(0, 4).map((result, idx) => { + const url = result.metadata.url || ''; + const title = result.metadata.title || 'Untitled'; + const domain = url ? new URL(url).hostname : ''; + const faviconUrl = domain + ? `https://s2.googleusercontent.com/s2/favicons?domain=${domain}&sz=128` + : ''; + + return ( + + {faviconUrl && ( + { + e.currentTarget.style.display = 'none'; + }} + /> + )} + {title} + + ); + })} +
+ )} + + {step.type === 'upload_searching' && + step.queries.length > 0 && ( +
+ {step.queries.map((query, idx) => ( + + {query} + + ))} +
+ )} + + {step.type === 'upload_search_results' && + step.results.length > 0 && ( +
+ {step.results.slice(0, 4).map((result, idx) => { + const title = + (result.metadata && + (result.metadata.title || + result.metadata.fileName)) || + 'Untitled document'; + + return ( +
+
+ +
+
+

+ {title} +

+
+
+ ); + })} +
+ )} +
+ + ); + })} +
+ + )} + +
+ ); +}; + +export default AssistantSteps; diff --git a/src/components/Chat.tsx b/src/components/Chat.tsx index 22e0a48..1c95d26 100644 --- a/src/components/Chat.tsx +++ b/src/components/Chat.tsx @@ -7,11 +7,12 @@ import MessageBoxLoading from './MessageBoxLoading'; import { useChat } from '@/lib/hooks/useChat'; const Chat = () => { - const { sections, chatTurns, loading, messageAppeared } = useChat(); + const { sections, loading, messageAppeared, messages } = useChat(); const [dividerWidth, setDividerWidth] = useState(0); const dividerRef = useRef(null); const messageEnd = useRef(null); + const lastScrolledRef = useRef(0); useEffect(() => { const updateDividerWidth = () => { @@ -22,43 +23,48 @@ const Chat = () => { updateDividerWidth(); + const resizeObserver = new ResizeObserver(() => { + updateDividerWidth(); + }); + + const currentRef = dividerRef.current; + if (currentRef) { + resizeObserver.observe(currentRef); + } + window.addEventListener('resize', updateDividerWidth); return () => { + if (currentRef) { + resizeObserver.unobserve(currentRef); + } + resizeObserver.disconnect(); window.removeEventListener('resize', updateDividerWidth); }; - }, []); + }, [sections.length]); useEffect(() => { const scroll = () => { messageEnd.current?.scrollIntoView({ behavior: 'auto' }); }; - if (chatTurns.length === 1) { - document.title = `${chatTurns[0].content.substring(0, 30)} - Perplexica`; + if (messages.length === 1) { + document.title = `${messages[0].query.substring(0, 30)} - Perplexica`; } - const messageEndBottom = - messageEnd.current?.getBoundingClientRect().bottom ?? 0; - - const distanceFromMessageEnd = window.innerHeight - messageEndBottom; - - if (distanceFromMessageEnd >= -100) { + if (sections.length > lastScrolledRef.current) { scroll(); + lastScrolledRef.current = sections.length; } - - if (chatTurns[chatTurns.length - 1]?.role === 'user') { - scroll(); - } - }, [chatTurns]); + }, [messages]); return ( -
+
{sections.map((section, i) => { const isLast = i === sections.length - 1; return ( - + { {loading && !messageAppeared && }
{dividerWidth > 0 && ( -
+
+
+
)} diff --git a/src/components/ChatWindow.tsx b/src/components/ChatWindow.tsx index c04b4ea..a2a9f67 100644 --- a/src/components/ChatWindow.tsx +++ b/src/components/ChatWindow.tsx @@ -1,15 +1,13 @@ 'use client'; -import { Document } from '@langchain/core/documents'; import Navbar from './Navbar'; import Chat from './Chat'; import EmptyChat from './EmptyChat'; -import { Settings } from 'lucide-react'; -import Link from 'next/link'; import NextError from 'next/error'; import { useChat } from '@/lib/hooks/useChat'; -import Loader from './ui/Loader'; import SettingsButtonMobile from './Settings/SettingsButtonMobile'; +import { Block } from '@/lib/types'; +import Loader from './ui/Loader'; export interface BaseMessage { chatId: string; @@ -17,42 +15,27 @@ export interface BaseMessage { createdAt: Date; } -export interface AssistantMessage extends BaseMessage { - role: 'assistant'; - content: string; - suggestions?: string[]; +export interface Message extends BaseMessage { + backendId: string; + query: string; + responseBlocks: Block[]; + status: 'answering' | 'completed' | 'error'; } -export interface UserMessage extends BaseMessage { - role: 'user'; - content: string; -} - -export interface SourceMessage extends BaseMessage { - role: 'source'; - sources: Document[]; -} - -export interface SuggestionMessage extends BaseMessage { - role: 'suggestion'; - suggestions: string[]; -} - -export type Message = - | AssistantMessage - | UserMessage - | SourceMessage - | SuggestionMessage; -export type ChatTurn = UserMessage | AssistantMessage; - export interface File { fileName: string; fileExtension: string; fileId: string; } +export interface Widget { + widgetType: string; + params: Record; +} + const ChatWindow = () => { - const { hasError, isReady, notFound, messages } = useChat(); + const { hasError, notFound, messages, isReady } = useChat(); + if (hasError) { return (
@@ -84,7 +67,7 @@ const ChatWindow = () => {
) ) : ( -
+
); diff --git a/src/components/EmptyChat.tsx b/src/components/EmptyChat.tsx index d9b6686..775fc9d 100644 --- a/src/components/EmptyChat.tsx +++ b/src/components/EmptyChat.tsx @@ -1,3 +1,6 @@ +'use client'; + +import { useEffect, useState } from 'react'; import { Settings } from 'lucide-react'; import EmptyChatMessageInput from './EmptyChatMessageInput'; import { File } from './ChatWindow'; @@ -5,8 +8,39 @@ import Link from 'next/link'; import WeatherWidget from './WeatherWidget'; import NewsArticleWidget from './NewsArticleWidget'; import SettingsButtonMobile from '@/components/Settings/SettingsButtonMobile'; +import { + getShowNewsWidget, + getShowWeatherWidget, +} from '@/lib/config/clientRegistry'; const EmptyChat = () => { + const [showWeather, setShowWeather] = useState(() => + typeof window !== 'undefined' ? getShowWeatherWidget() : true, + ); + const [showNews, setShowNews] = useState(() => + typeof window !== 'undefined' ? getShowNewsWidget() : true, + ); + + useEffect(() => { + const updateWidgetVisibility = () => { + setShowWeather(getShowWeatherWidget()); + setShowNews(getShowNewsWidget()); + }; + + updateWidgetVisibility(); + + window.addEventListener('client-config-changed', updateWidgetVisibility); + window.addEventListener('storage', updateWidgetVisibility); + + return () => { + window.removeEventListener( + 'client-config-changed', + updateWidgetVisibility, + ); + window.removeEventListener('storage', updateWidgetVisibility); + }; + }, []); + return (
@@ -19,14 +53,20 @@ const EmptyChat = () => {
-
-
- + {(showWeather || showNews) && ( +
+ {showWeather && ( +
+ +
+ )} + {showNews && ( +
+ +
+ )}
-
- -
-
+ )}
); diff --git a/src/components/EmptyChatMessageInput.tsx b/src/components/EmptyChatMessageInput.tsx index 770c647..6d159f9 100644 --- a/src/components/EmptyChatMessageInput.tsx +++ b/src/components/EmptyChatMessageInput.tsx @@ -1,7 +1,7 @@ import { ArrowRight } from 'lucide-react'; import { useEffect, useRef, useState } from 'react'; import TextareaAutosize from 'react-textarea-autosize'; -import Focus from './MessageInputActions/Focus'; +import Sources from './MessageInputActions/Sources'; import Optimization from './MessageInputActions/Optimization'; import Attach from './MessageInputActions/Attach'; import { useChat } from '@/lib/hooks/useChat'; @@ -68,8 +68,8 @@ const EmptyChatMessageInput = () => {
+ -
); }; diff --git a/src/components/MessageActions/Rewrite.tsx b/src/components/MessageActions/Rewrite.tsx index 80fadb3..3902e1e 100644 --- a/src/components/MessageActions/Rewrite.tsx +++ b/src/components/MessageActions/Rewrite.tsx @@ -1,4 +1,4 @@ -import { ArrowLeftRight } from 'lucide-react'; +import { ArrowLeftRight, Repeat } from 'lucide-react'; const Rewrite = ({ rewrite, @@ -10,12 +10,11 @@ const Rewrite = ({ return ( ); }; export default Rewrite; diff --git a/src/components/MessageBox.tsx b/src/components/MessageBox.tsx index 062bb90..19e3546 100644 --- a/src/components/MessageBox.tsx +++ b/src/components/MessageBox.tsx @@ -10,8 +10,9 @@ import { StopCircle, Layers3, Plus, + CornerDownRight, } from 'lucide-react'; -import Markdown, { MarkdownToJSX } from 'markdown-to-jsx'; +import Markdown, { MarkdownToJSX, RuleType } from 'markdown-to-jsx'; import Copy from './MessageActions/Copy'; import Rewrite from './MessageActions/Rewrite'; import MessageSources from './MessageSources'; @@ -20,7 +21,11 @@ import SearchVideos from './SearchVideos'; import { useSpeech } from 'react-text-to-speech'; import ThinkBox from './ThinkBox'; import { useChat, Section } from '@/lib/hooks/useChat'; -import Citation from './Citation'; +import Citation from './MessageRenderer/Citation'; +import AssistantSteps from './AssistantSteps'; +import { ResearchBlock } from '@/lib/types'; +import Renderer from './Widgets/Renderer'; +import CodeBlock from './MessageRenderer/CodeBlock'; const ThinkTagProcessor = ({ children, @@ -45,15 +50,39 @@ const MessageBox = ({ dividerRef?: MutableRefObject; isLast: boolean; }) => { - const { loading, chatTurns, sendMessage, rewrite } = useChat(); + const { loading, sendMessage, rewrite, messages, researchEnded } = useChat(); - const parsedMessage = section.parsedAssistantMessage || ''; + const parsedMessage = section.parsedTextBlocks.join('\n\n'); const speechMessage = section.speechMessage || ''; const thinkingEnded = section.thinkingEnded; + const 
sourceBlocks = section.message.responseBlocks.filter( + (block): block is typeof block & { type: 'source' } => + block.type === 'source', + ); + + const sources = sourceBlocks.flatMap((block) => block.data); + + const hasContent = section.parsedTextBlocks.length > 0; + const { speechStatus, start, stop } = useSpeech({ text: speechMessage }); const markdownOverrides: MarkdownToJSX.Options = { + renderRule(next, node, renderChildren, state) { + if (node.type === RuleType.codeInline) { + return `\`${node.text}\``; + } + + if (node.type === RuleType.codeBlock) { + return ( + + {node.text} + + ); + } + + return next(); + }, overrides: { think: { component: ThinkTagProcessor, @@ -71,7 +100,7 @@ const MessageBox = ({

- {section.userMessage.content} + {section.message.query}

@@ -80,21 +109,51 @@ const MessageBox = ({ ref={dividerRef} className="flex flex-col space-y-6 w-full lg:w-9/12" > - {section.sourceMessage && - section.sourceMessage.sources.length > 0 && ( -
-
- -

- Sources -

-
- + {sources.length > 0 && ( +
+
+ +

+ Sources +

+
+ +
+ )} + + {section.message.responseBlocks + .filter( + (block): block is ResearchBlock => + block.type === 'research' && block.data.subSteps.length > 0, + ) + .map((researchBlock) => ( +
+ +
+ ))} + + {isLast && + loading && + !researchEnded && + !section.message.responseBlocks.some( + (b) => b.type === 'research' && b.data.subSteps.length > 0, + ) && ( +
+ + + Brainstorming... +
)} + {section.widgets.length > 0 && } +
- {section.sourceMessage && ( + {sources.length > 0 && (
)} - {section.assistantMessage && ( + {hasContent && ( <> {loading && isLast ? null : ( -
-
+
+
-
- +
+
@@ -157,9 +213,9 @@ const MessageBox = ({ {isLast && section.suggestions && section.suggestions.length > 0 && - section.assistantMessage && + hasContent && !loading && ( -
+
(
- {i > 0 && ( -
- )} +
@@ -201,17 +261,17 @@ const MessageBox = ({
- {section.assistantMessage && ( + {hasContent && (
)} diff --git a/src/components/MessageInput.tsx b/src/components/MessageInput.tsx index d1fc989..56054eb 100644 --- a/src/components/MessageInput.tsx +++ b/src/components/MessageInput.tsx @@ -2,9 +2,6 @@ import { cn } from '@/lib/utils'; import { ArrowUp } from 'lucide-react'; import { useEffect, useRef, useState } from 'react'; import TextareaAutosize from 'react-textarea-autosize'; -import Attach from './MessageInputActions/Attach'; -import CopilotToggle from './MessageInputActions/Copilot'; -import { File } from './ChatWindow'; import AttachSmall from './MessageInputActions/AttachSmall'; import { useChat } from '@/lib/hooks/useChat'; @@ -64,7 +61,7 @@ const MessageInput = () => { } }} className={cn( - 'bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-hidden border border-light-200 dark:border-dark-200 shadow-sm shadow-light-200/10 dark:shadow-black/20 transition-all duration-200 focus-within:border-light-300 dark:focus-within:border-dark-300', + 'relative bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-visible border border-light-200 dark:border-dark-200 shadow-sm shadow-light-200/10 dark:shadow-black/20 transition-all duration-200 focus-within:border-light-300 dark:focus-within:border-dark-300', mode === 'multi' ? 'flex-col rounded-2xl' : 'flex-row rounded-full', )} > @@ -80,11 +77,16 @@ const MessageInput = () => { placeholder="Ask a follow-up" /> {mode === 'single' && ( -
- + + )} + {mode === 'multi' && ( +
+
)} - {mode === 'multi' && ( -
- -
- - -
-
- )} ); }; diff --git a/src/components/MessageInputActions/Attach.tsx b/src/components/MessageInputActions/Attach.tsx index fbc2e7e..84d7152 100644 --- a/src/components/MessageInputActions/Attach.tsx +++ b/src/components/MessageInputActions/Attach.tsx @@ -16,6 +16,8 @@ import { } from 'lucide-react'; import { Fragment, useRef, useState } from 'react'; import { useChat } from '@/lib/hooks/useChat'; +import { AnimatePresence } from 'motion/react'; +import { motion } from 'framer-motion'; const Attach = () => { const { files, setFiles, setFileIds, fileIds } = useChat(); @@ -53,86 +55,95 @@ const Attach = () => { return loading ? (
- +
) : files.length > 0 ? ( - - - - - -
-
-

- Attached files -

-
- - -
-
-
-
- {files.map((file, i) => ( -
-
- +
+

+ Attached files +

+
+ + +
-

- {file.fileName.length > 25 - ? file.fileName.replace(/\.\w+$/, '').substring(0, 25) + - '...' + - file.fileExtension - : file.fileName} -

-
- ))} -
-
- - +
+
+ {files.map((file, i) => ( +
+
+ +
+

+ {file.fileName.length > 25 + ? file.fileName + .replace(/\.\w+$/, '') + .substring(0, 25) + + '...' + + file.fileExtension + : file.fileName} +

+
+ ))} +
+ + + )} + + + )} ) : ( - -
-
-
-
- {files.map((file, i) => ( -
-
- +
+

+ Attached files +

+
+ + +
-

- {file.fileName.length > 25 - ? file.fileName.replace(/\.\w+$/, '').substring(0, 25) + - '...' + - file.fileExtension - : file.fileName} -

-
- ))} -
-
- - +
+
+ {files.map((file, i) => ( +
+
+ +
+

+ {file.fileName.length > 25 + ? file.fileName + .replace(/\.\w+$/, '') + .substring(0, 25) + + '...' + + file.fileExtension + : file.fileName} +

+
+ ))} +
+ + + )} + + + )} ) : ( + +
+ {provider.chatModels.map((model) => ( + + ))} +
+ + {providerIndex < filteredProviders.length - 1 && ( +
+ )} +
))}
- - {providerIndex < filteredProviders.length - 1 && ( -
- )} -
- ))} -
- )} -
-
- - + )} +
+ + + )} + + + )} ); }; diff --git a/src/components/MessageInputActions/Copilot.tsx b/src/components/MessageInputActions/Copilot.tsx deleted file mode 100644 index 5a3e476..0000000 --- a/src/components/MessageInputActions/Copilot.tsx +++ /dev/null @@ -1,43 +0,0 @@ -import { cn } from '@/lib/utils'; -import { Switch } from '@headlessui/react'; - -const CopilotToggle = ({ - copilotEnabled, - setCopilotEnabled, -}: { - copilotEnabled: boolean; - setCopilotEnabled: (enabled: boolean) => void; -}) => { - return ( -
- - Copilot - - -

setCopilotEnabled(!copilotEnabled)} - className={cn( - 'text-xs font-medium transition-colors duration-150 ease-in-out', - copilotEnabled - ? 'text-[#24A0ED]' - : 'text-black/50 dark:text-white/50 group-hover:text-black dark:group-hover:text-white', - )} - > - Copilot -

-
- ); -}; - -export default CopilotToggle; diff --git a/src/components/MessageInputActions/Focus.tsx b/src/components/MessageInputActions/Focus.tsx deleted file mode 100644 index 58b1a39..0000000 --- a/src/components/MessageInputActions/Focus.tsx +++ /dev/null @@ -1,123 +0,0 @@ -import { - BadgePercent, - ChevronDown, - Globe, - Pencil, - ScanEye, - SwatchBook, -} from 'lucide-react'; -import { cn } from '@/lib/utils'; -import { - Popover, - PopoverButton, - PopoverPanel, - Transition, -} from '@headlessui/react'; -import { SiReddit, SiYoutube } from '@icons-pack/react-simple-icons'; -import { Fragment } from 'react'; -import { useChat } from '@/lib/hooks/useChat'; - -const focusModes = [ - { - key: 'webSearch', - title: 'All', - description: 'Searches across all of the internet', - icon: , - }, - { - key: 'academicSearch', - title: 'Academic', - description: 'Search in published academic papers', - icon: , - }, - { - key: 'writingAssistant', - title: 'Writing', - description: 'Chat without searching the web', - icon: , - }, - { - key: 'wolframAlphaSearch', - title: 'Wolfram Alpha', - description: 'Computational knowledge engine', - icon: , - }, - { - key: 'youtubeSearch', - title: 'Youtube', - description: 'Search and watch videos', - icon: , - }, - { - key: 'redditSearch', - title: 'Reddit', - description: 'Search for discussions and opinions', - icon: , - }, -]; - -const Focus = () => { - const { focusMode, setFocusMode } = useChat(); - - return ( - - - {focusMode !== 'webSearch' ? ( -
- {focusModes.find((mode) => mode.key === focusMode)?.icon} -
- ) : ( -
- -
- )} -
- - -
- {focusModes.map((mode, i) => ( - setFocusMode(mode.key)} - key={i} - className={cn( - 'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-2 duration-200 cursor-pointer transition focus:outline-none', - focusMode === mode.key - ? 'bg-light-secondary dark:bg-dark-secondary' - : 'hover:bg-light-secondary dark:hover:bg-dark-secondary', - )} - > -
- {mode.icon} -

{mode.title}

-
-

- {mode.description} -

-
- ))} -
-
-
-
- ); -}; - -export default Focus; diff --git a/src/components/MessageInputActions/Optimization.tsx b/src/components/MessageInputActions/Optimization.tsx index fe04190..2f0cd82 100644 --- a/src/components/MessageInputActions/Optimization.tsx +++ b/src/components/MessageInputActions/Optimization.tsx @@ -8,6 +8,7 @@ import { } from '@headlessui/react'; import { Fragment } from 'react'; import { useChat } from '@/lib/hooks/useChat'; +import { AnimatePresence, motion } from 'motion/react'; const OptimizationModes = [ { @@ -24,7 +25,7 @@ const OptimizationModes = [ }, { key: 'quality', - title: 'Quality (Soon)', + title: 'Quality', description: 'Get the most thorough and accurate answer', icon: ( { />
- - -
- {OptimizationModes.map((mode, i) => ( - setOptimizationMode(mode.key)} - key={i} - disabled={mode.key === 'quality'} - className={cn( - 'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-1 duration-200 cursor-pointer transition focus:outline-none', - optimizationMode === mode.key - ? 'bg-light-secondary dark:bg-dark-secondary' - : 'hover:bg-light-secondary dark:hover:bg-dark-secondary', - mode.key === 'quality' && 'opacity-50 cursor-not-allowed', - )} - > -
- {mode.icon} -

{mode.title}

-
-

- {mode.description} -

-
- ))} -
-
-
+ + {open && ( + + + {OptimizationModes.map((mode, i) => ( + setOptimizationMode(mode.key)} + key={i} + className={cn( + 'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-1 duration-200 cursor-pointer transition focus:outline-none', + optimizationMode === mode.key + ? 'bg-light-secondary dark:bg-dark-secondary' + : 'hover:bg-light-secondary dark:hover:bg-dark-secondary', + )} + > +
+
+ {mode.icon} +

{mode.title}

+
+ {mode.key === 'quality' && ( + + Beta + + )} +
+

+ {mode.description} +

+
+ ))} +
+
+ )} +
)} diff --git a/src/components/MessageInputActions/Sources.tsx b/src/components/MessageInputActions/Sources.tsx new file mode 100644 index 0000000..2652d58 --- /dev/null +++ b/src/components/MessageInputActions/Sources.tsx @@ -0,0 +1,93 @@ +import { useChat } from '@/lib/hooks/useChat'; +import { + Popover, + PopoverButton, + PopoverPanel, + Switch, +} from '@headlessui/react'; +import { + GlobeIcon, + GraduationCapIcon, + NetworkIcon, +} from '@phosphor-icons/react'; +import { AnimatePresence, motion } from 'motion/react'; + +const sourcesList = [ + { + name: 'Web', + key: 'web', + icon: , + }, + { + name: 'Academic', + key: 'academic', + icon: , + }, + { + name: 'Social', + key: 'discussions', + icon: , + }, +]; + +const Sources = () => { + const { sources, setSources } = useChat(); + + return ( + + {({ open }) => ( + <> + + + + + {open && ( + + + {sourcesList.map((source, i) => ( +
{ + if (!sources.includes(source.key)) { + setSources([...sources, source.key]); + } else { + setSources(sources.filter((s) => s !== source.key)); + } + }} + > +
+ {source.icon} +

{source.name}

+
+ + +
+ ))} +
+
+ )} +
+ + )} +
+ ); +}; + +export default Sources; diff --git a/src/components/Citation.tsx b/src/components/MessageRenderer/Citation.tsx similarity index 100% rename from src/components/Citation.tsx rename to src/components/MessageRenderer/Citation.tsx diff --git a/src/components/MessageRenderer/CodeBlock/CodeBlockDarkTheme.ts b/src/components/MessageRenderer/CodeBlock/CodeBlockDarkTheme.ts new file mode 100644 index 0000000..0a9d6a4 --- /dev/null +++ b/src/components/MessageRenderer/CodeBlock/CodeBlockDarkTheme.ts @@ -0,0 +1,102 @@ +import type { CSSProperties } from 'react'; + +const darkTheme = { + 'hljs-comment': { + color: '#8b949e', + }, + 'hljs-quote': { + color: '#8b949e', + }, + 'hljs-variable': { + color: '#ff7b72', + }, + 'hljs-template-variable': { + color: '#ff7b72', + }, + 'hljs-tag': { + color: '#ff7b72', + }, + 'hljs-name': { + color: '#ff7b72', + }, + 'hljs-selector-id': { + color: '#ff7b72', + }, + 'hljs-selector-class': { + color: '#ff7b72', + }, + 'hljs-regexp': { + color: '#ff7b72', + }, + 'hljs-deletion': { + color: '#ff7b72', + }, + 'hljs-number': { + color: '#f2cc60', + }, + 'hljs-built_in': { + color: '#f2cc60', + }, + 'hljs-builtin-name': { + color: '#f2cc60', + }, + 'hljs-literal': { + color: '#f2cc60', + }, + 'hljs-type': { + color: '#f2cc60', + }, + 'hljs-params': { + color: '#f2cc60', + }, + 'hljs-meta': { + color: '#f2cc60', + }, + 'hljs-link': { + color: '#f2cc60', + }, + 'hljs-attribute': { + color: '#58a6ff', + }, + 'hljs-string': { + color: '#7ee787', + }, + 'hljs-symbol': { + color: '#7ee787', + }, + 'hljs-bullet': { + color: '#7ee787', + }, + 'hljs-addition': { + color: '#7ee787', + }, + 'hljs-title': { + color: '#79c0ff', + }, + 'hljs-section': { + color: '#79c0ff', + }, + 'hljs-keyword': { + color: '#c297ff', + }, + 'hljs-selector-tag': { + color: '#c297ff', + }, + hljs: { + display: 'block', + overflowX: 'auto', + background: '#0d1117', + color: '#c9d1d9', + padding: '0.75em', + border: '1px solid #21262d', + borderRadius: '10px', + }, + 
'hljs-emphasis': { + fontStyle: 'italic', + }, + 'hljs-strong': { + fontWeight: 'bold', + }, +} satisfies Record; + +export default darkTheme; diff --git a/src/components/MessageRenderer/CodeBlock/CodeBlockLightTheme.ts b/src/components/MessageRenderer/CodeBlock/CodeBlockLightTheme.ts new file mode 100644 index 0000000..758dbac --- /dev/null +++ b/src/components/MessageRenderer/CodeBlock/CodeBlockLightTheme.ts @@ -0,0 +1,102 @@ +import type { CSSProperties } from 'react'; + +const lightTheme = { + 'hljs-comment': { + color: '#6e7781', + }, + 'hljs-quote': { + color: '#6e7781', + }, + 'hljs-variable': { + color: '#d73a49', + }, + 'hljs-template-variable': { + color: '#d73a49', + }, + 'hljs-tag': { + color: '#d73a49', + }, + 'hljs-name': { + color: '#d73a49', + }, + 'hljs-selector-id': { + color: '#d73a49', + }, + 'hljs-selector-class': { + color: '#d73a49', + }, + 'hljs-regexp': { + color: '#d73a49', + }, + 'hljs-deletion': { + color: '#d73a49', + }, + 'hljs-number': { + color: '#b08800', + }, + 'hljs-built_in': { + color: '#b08800', + }, + 'hljs-builtin-name': { + color: '#b08800', + }, + 'hljs-literal': { + color: '#b08800', + }, + 'hljs-type': { + color: '#b08800', + }, + 'hljs-params': { + color: '#b08800', + }, + 'hljs-meta': { + color: '#b08800', + }, + 'hljs-link': { + color: '#b08800', + }, + 'hljs-attribute': { + color: '#0a64ae', + }, + 'hljs-string': { + color: '#22863a', + }, + 'hljs-symbol': { + color: '#22863a', + }, + 'hljs-bullet': { + color: '#22863a', + }, + 'hljs-addition': { + color: '#22863a', + }, + 'hljs-title': { + color: '#005cc5', + }, + 'hljs-section': { + color: '#005cc5', + }, + 'hljs-keyword': { + color: '#6f42c1', + }, + 'hljs-selector-tag': { + color: '#6f42c1', + }, + hljs: { + display: 'block', + overflowX: 'auto', + background: '#ffffff', + color: '#24292f', + padding: '0.75em', + border: '1px solid #e8edf1', + borderRadius: '10px', + }, + 'hljs-emphasis': { + fontStyle: 'italic', + }, + 'hljs-strong': { + fontWeight: 'bold', + }, 
+} satisfies Record; + +export default lightTheme; diff --git a/src/components/MessageRenderer/CodeBlock/index.tsx b/src/components/MessageRenderer/CodeBlock/index.tsx new file mode 100644 index 0000000..493a0d0 --- /dev/null +++ b/src/components/MessageRenderer/CodeBlock/index.tsx @@ -0,0 +1,64 @@ +'use client'; + +import { CheckIcon, CopyIcon } from '@phosphor-icons/react'; +import React, { useEffect, useMemo, useState } from 'react'; +import { useTheme } from 'next-themes'; +import SyntaxHighlighter from 'react-syntax-highlighter'; +import darkTheme from './CodeBlockDarkTheme'; +import lightTheme from './CodeBlockLightTheme'; + +const CodeBlock = ({ + language, + children, +}: { + language: string; + children: React.ReactNode; +}) => { + const { resolvedTheme } = useTheme(); + const [mounted, setMounted] = useState(false); + + const [copied, setCopied] = useState(false); + + useEffect(() => { + setMounted(true); + }, []); + + const syntaxTheme = useMemo(() => { + if (!mounted) return lightTheme; + return resolvedTheme === 'dark' ? darkTheme : lightTheme; + }, [mounted, resolvedTheme]); + + return ( +
+ + + {children as string} + +
+ ); +}; + +export default CodeBlock; diff --git a/src/components/MessageSources.tsx b/src/components/MessageSources.tsx index fb2b5bb..a1db27a 100644 --- a/src/components/MessageSources.tsx +++ b/src/components/MessageSources.tsx @@ -6,11 +6,11 @@ import { Transition, TransitionChild, } from '@headlessui/react'; -import { Document } from '@langchain/core/documents'; import { File } from 'lucide-react'; import { Fragment, useState } from 'react'; +import { Chunk } from '@/lib/types'; -const MessageSources = ({ sources }: { sources: Document[] }) => { +const MessageSources = ({ sources }: { sources: Chunk[] }) => { const [isDialogOpen, setIsDialogOpen] = useState(false); const closeModal = () => { @@ -37,7 +37,7 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {

- {source.metadata.url === 'File' ? ( + {source.metadata.url.includes('file_id://') ? (
@@ -51,7 +51,9 @@ const MessageSources = ({ sources }: { sources: Document[] }) => { /> )}

- {source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')} + {source.metadata.url.includes('file_id://') ? 'Uploaded File' : source.metadata.url.replace(/.+\/\/|www\.|\..+/g, '')}

diff --git a/src/components/Navbar.tsx b/src/components/Navbar.tsx index bbcd470..6d3e77c 100644 --- a/src/components/Navbar.tsx +++ b/src/components/Navbar.tsx @@ -11,6 +11,7 @@ import { } from '@headlessui/react'; import jsPDF from 'jspdf'; import { useChat, Section } from '@/lib/hooks/useChat'; +import { SourceBlock } from '@/lib/types'; const downloadFile = (filename: string, content: string, type: string) => { const blob = new Blob([content], { type }); @@ -28,35 +29,41 @@ const downloadFile = (filename: string, content: string, type: string) => { const exportAsMarkdown = (sections: Section[], title: string) => { const date = new Date( - sections[0]?.userMessage?.createdAt || Date.now(), + sections[0].message.createdAt || Date.now(), ).toLocaleString(); let md = `# 💬 Chat Export: ${title}\n\n`; md += `*Exported on: ${date}*\n\n---\n`; sections.forEach((section, idx) => { - if (section.userMessage) { - md += `\n---\n`; - md += `**🧑 User** + md += `\n---\n`; + md += `**🧑 User** `; - md += `*${new Date(section.userMessage.createdAt).toLocaleString()}*\n\n`; - md += `> ${section.userMessage.content.replace(/\n/g, '\n> ')}\n`; - } + md += `*${new Date(section.message.createdAt).toLocaleString()}*\n\n`; + md += `> ${section.message.query.replace(/\n/g, '\n> ')}\n`; - if (section.assistantMessage) { + if (section.message.responseBlocks.length > 0) { md += `\n---\n`; md += `**🤖 Assistant** `; - md += `*${new Date(section.assistantMessage.createdAt).toLocaleString()}*\n\n`; - md += `> ${section.assistantMessage.content.replace(/\n/g, '\n> ')}\n`; + md += `*${new Date(section.message.createdAt).toLocaleString()}*\n\n`; + md += `> ${section.message.responseBlocks + .filter((b) => b.type === 'text') + .map((block) => block.data) + .join('\n') + .replace(/\n/g, '\n> ')}\n`; } + const sourceResponseBlock = section.message.responseBlocks.find( + (block) => block.type === 'source', + ) as SourceBlock | undefined; + if ( - section.sourceMessage && - 
section.sourceMessage.sources && - section.sourceMessage.sources.length > 0 + sourceResponseBlock && + sourceResponseBlock.data && + sourceResponseBlock.data.length > 0 ) { md += `\n**Citations:**\n`; - section.sourceMessage.sources.forEach((src: any, i: number) => { + sourceResponseBlock.data.forEach((src: any, i: number) => { const url = src.metadata?.url || ''; md += `- [${i + 1}] [${url}](${url})\n`; }); @@ -69,7 +76,7 @@ const exportAsMarkdown = (sections: Section[], title: string) => { const exportAsPDF = (sections: Section[], title: string) => { const doc = new jsPDF(); const date = new Date( - sections[0]?.userMessage?.createdAt || Date.now(), + sections[0]?.message?.createdAt || Date.now(), ).toLocaleString(); let y = 15; const pageHeight = doc.internal.pageSize.height; @@ -86,44 +93,38 @@ const exportAsPDF = (sections: Section[], title: string) => { doc.setTextColor(30); sections.forEach((section, idx) => { - if (section.userMessage) { - if (y > pageHeight - 30) { - doc.addPage(); - y = 15; - } - doc.setFont('helvetica', 'bold'); - doc.text('User', 10, y); - doc.setFont('helvetica', 'normal'); - doc.setFontSize(10); - doc.setTextColor(120); - doc.text( - `${new Date(section.userMessage.createdAt).toLocaleString()}`, - 40, - y, - ); - y += 6; - doc.setTextColor(30); - doc.setFontSize(12); - const userLines = doc.splitTextToSize(section.userMessage.content, 180); - for (let i = 0; i < userLines.length; i++) { - if (y > pageHeight - 20) { - doc.addPage(); - y = 15; - } - doc.text(userLines[i], 12, y); - y += 6; - } - y += 6; - doc.setDrawColor(230); - if (y > pageHeight - 10) { - doc.addPage(); - y = 15; - } - doc.line(10, y, 200, y); - y += 4; + if (y > pageHeight - 30) { + doc.addPage(); + y = 15; } + doc.setFont('helvetica', 'bold'); + doc.text('User', 10, y); + doc.setFont('helvetica', 'normal'); + doc.setFontSize(10); + doc.setTextColor(120); + doc.text(`${new Date(section.message.createdAt).toLocaleString()}`, 40, y); + y += 6; + doc.setTextColor(30); 
+ doc.setFontSize(12); + const userLines = doc.splitTextToSize(section.message.query, 180); + for (let i = 0; i < userLines.length; i++) { + if (y > pageHeight - 20) { + doc.addPage(); + y = 15; + } + doc.text(userLines[i], 12, y); + y += 6; + } + y += 6; + doc.setDrawColor(230); + if (y > pageHeight - 10) { + doc.addPage(); + y = 15; + } + doc.line(10, y, 200, y); + y += 4; - if (section.assistantMessage) { + if (section.message.responseBlocks.length > 0) { if (y > pageHeight - 30) { doc.addPage(); y = 15; @@ -134,7 +135,7 @@ const exportAsPDF = (sections: Section[], title: string) => { doc.setFontSize(10); doc.setTextColor(120); doc.text( - `${new Date(section.assistantMessage.createdAt).toLocaleString()}`, + `${new Date(section.message.createdAt).toLocaleString()}`, 40, y, ); @@ -142,7 +143,7 @@ const exportAsPDF = (sections: Section[], title: string) => { doc.setTextColor(30); doc.setFontSize(12); const assistantLines = doc.splitTextToSize( - section.assistantMessage.content, + section.parsedTextBlocks.join('\n'), 180, ); for (let i = 0; i < assistantLines.length; i++) { @@ -154,10 +155,14 @@ const exportAsPDF = (sections: Section[], title: string) => { y += 6; } + const sourceResponseBlock = section.message.responseBlocks.find( + (block) => block.type === 'source', + ) as SourceBlock | undefined; + if ( - section.sourceMessage && - section.sourceMessage.sources && - section.sourceMessage.sources.length > 0 + sourceResponseBlock && + sourceResponseBlock.data && + sourceResponseBlock.data.length > 0 ) { doc.setFontSize(11); doc.setTextColor(80); @@ -167,7 +172,7 @@ const exportAsPDF = (sections: Section[], title: string) => { } doc.text('Citations:', 12, y); y += 5; - section.sourceMessage.sources.forEach((src: any, i: number) => { + sourceResponseBlock.data.forEach((src: any, i: number) => { const url = src.metadata?.url || ''; if (y > pageHeight - 15) { doc.addPage(); @@ -198,15 +203,16 @@ const Navbar = () => { const { sections, chatId } = useChat(); 
useEffect(() => { - if (sections.length > 0 && sections[0].userMessage) { + if (sections.length > 0 && sections[0].message) { const newTitle = - sections[0].userMessage.content.length > 20 - ? `${sections[0].userMessage.content.substring(0, 20).trim()}...` - : sections[0].userMessage.content; + sections[0].message.query.length > 30 + ? `${sections[0].message.query.substring(0, 30).trim()}...` + : sections[0].message.query || 'New Conversation'; + setTitle(newTitle); const newTimeAgo = formatTimeDifference( new Date(), - sections[0].userMessage.createdAt, + sections[0].message.createdAt, ); setTimeAgo(newTimeAgo); } @@ -214,10 +220,10 @@ const Navbar = () => { useEffect(() => { const intervalId = setInterval(() => { - if (sections.length > 0 && sections[0].userMessage) { + if (sections.length > 0 && sections[0].message) { const newTimeAgo = formatTimeDifference( new Date(), - sections[0].userMessage.createdAt, + sections[0].message.createdAt, ); setTimeAgo(newTimeAgo); } diff --git a/src/components/Settings/SettingsDialogue.tsx b/src/components/Settings/SettingsDialogue.tsx index ba097a9..f42ce9c 100644 --- a/src/components/Settings/SettingsDialogue.tsx +++ b/src/components/Settings/SettingsDialogue.tsx @@ -3,6 +3,7 @@ import { ArrowLeft, BrainCog, ChevronLeft, + ExternalLink, Search, Sliders, ToggleRight, @@ -115,35 +116,52 @@ const SettingsDialogue = ({
) : (
-
- + +
+ {sections.map((section) => ( + + ))} +
+
+
+

+ Version: {process.env.NEXT_PUBLIC_VERSION}

- -
- {sections.map((section) => ( - - ))} + + GitHub + +
diff --git a/src/components/Settings/SettingsField.tsx b/src/components/Settings/SettingsField.tsx index 55aa640..447ce1c 100644 --- a/src/components/Settings/SettingsField.tsx +++ b/src/components/Settings/SettingsField.tsx @@ -12,6 +12,12 @@ import { useTheme } from 'next-themes'; import { Loader2 } from 'lucide-react'; import { Switch } from '@headlessui/react'; +const emitClientConfigChanged = () => { + if (typeof window !== 'undefined') { + window.dispatchEvent(new Event('client-config-changed')); + } +}; + const SettingsSelect = ({ field, value, @@ -35,6 +41,7 @@ const SettingsSelect = ({ if (field.key === 'theme') { setTheme(newValue); } + emitClientConfigChanged(); } else { const res = await fetch('/api/config', { method: 'POST', @@ -106,6 +113,7 @@ const SettingsInput = ({ try { if (field.scope === 'client') { localStorage.setItem(field.key, newValue); + emitClientConfigChanged(); } else { const res = await fetch('/api/config', { method: 'POST', @@ -182,6 +190,7 @@ const SettingsTextarea = ({ try { if (field.scope === 'client') { localStorage.setItem(field.key, newValue); + emitClientConfigChanged(); } else { const res = await fetch('/api/config', { method: 'POST', @@ -258,6 +267,7 @@ const SettingsSwitch = ({ try { if (field.scope === 'client') { localStorage.setItem(field.key, String(newValue)); + emitClientConfigChanged(); } else { const res = await fetch('/api/config', { method: 'POST', @@ -300,7 +310,7 @@ const SettingsSwitch = ({ checked={isChecked} onChange={handleSave} disabled={loading} - className="group relative flex h-6 w-12 shrink-0 cursor-pointer rounded-full bg-white/10 p-1 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500" + className="group relative flex h-6 w-12 shrink-0 cursor-pointer rounded-full bg-light-200 dark:bg-white/10 p-1 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed 
data-[checked]:bg-sky-500 dark:data-[checked]:bg-sky-500" >