Mirror of https://github.com/ItzCrazyKns/Perplexica.git (synced 2026-01-03 01:56:56 +00:00)

Merge branch 'canary'
@@ -11,33 +11,63 @@ Perplexica's codebase is organized as follows:
- **UI Components and Pages**:
  - **Components (`src/components`)**: Reusable UI components.
  - **Pages and Routes (`src/app`)**: Next.js app directory structure with page components.
    - Main app routes include: home (`/`), chat (`/c`), discover (`/discover`), library (`/library`), and settings (`/settings`).
  - **API Routes (`src/app/api`)**: API endpoints implemented with Next.js API routes.
    - `/api/chat`: Handles chat interactions.
    - `/api/search`: Provides direct access to Perplexica's search capabilities.
    - Other endpoints for models, files, and suggestions.
  - Main app routes include: home (`/`), chat (`/c`), discover (`/discover`), and library (`/library`).
  - **API Routes (`src/app/api`)**: Server endpoints implemented with Next.js route handlers.
- **Backend Logic (`src/lib`)**: Contains all the backend functionality, including search, database, and API logic.
  - The search functionality is present inside the `src/lib/search` directory.
  - All focus modes are implemented using the Meta Search Agent class in `src/lib/search/metaSearchAgent.ts`.
  - The search system lives in `src/lib/agents/search`.
  - The search pipeline is split into classification, research, widgets, and writing.
  - Database functionality is in `src/lib/db`.
  - Chat model and embedding model providers are managed in `src/lib/providers`.
  - Prompt templates and LLM chain definitions are in `src/lib/prompts` and `src/lib/chains`, respectively.
  - Chat model and embedding model providers are in `src/lib/models/providers`, and models are loaded via `src/lib/models/registry.ts`.
  - Prompt templates are in `src/lib/prompts`.
  - SearXNG integration is in `src/lib/searxng.ts`.
  - Upload search lives in `src/lib/uploads`.

### Where to make changes

If you are not sure where to start, use this section as a map.

- **Search behavior and reasoning**
  - `src/lib/agents/search` contains the core chat and search pipeline.
  - `classifier.ts` decides whether research is needed and what should run.
  - `researcher/` gathers information in the background.

- **Add or change a search capability**
  - Research tools (web, academic, discussions, uploads, scraping) live in `src/lib/agents/search/researcher/actions`.
  - Tools are registered in `src/lib/agents/search/researcher/actions/index.ts` (an illustrative sketch follows this list).

- **Add or change widgets**
  - Widgets live in `src/lib/agents/search/widgets`.
  - Widgets run in parallel with research and show structured results in the UI.

- **Model integrations**
  - Providers live in `src/lib/models/providers`.
  - Add new providers there and wire them into the model registry so they show up in the app.

- **Architecture docs**
  - High-level overview: `docs/architecture/README.md`
  - High-level flow: `docs/architecture/WORKING.md`
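
If you are adding a research tool, registration amounts to exporting it from the actions index. The sketch below only illustrates the general idea; the names (`ResearchAction`, `actions`, and the result shape) are hypothetical, so check `src/lib/agents/search/researcher/actions` for the real interface before wiring anything up.

```ts
// Hypothetical sketch only: the actual action interface and exports live in
// src/lib/agents/search/researcher/actions. Names and shapes here are illustrative.
interface ResearchAction {
  name: string;        // identifier the researcher can select
  description: string; // tells the model when this tool is useful
  run: (query: string) => Promise<{ content: string; url?: string }[]>;
}

// A new tool would follow the same pattern as the existing web/academic/
// discussions/uploads actions: fetch from a source and map results to a shared shape.
const exampleAction: ResearchAction = {
  name: 'example',
  description: 'Fetches results from an example data source.',
  run: async (query) => {
    // Call your data source here and normalize its results.
    return [{ content: `Results for ${query}`, url: 'https://example.com' }];
  },
};

// Registration (hypothetically) means adding the action to the list exported
// from actions/index.ts so the researcher can pick it up.
export const actions: ResearchAction[] = [exampleAction /*, ...existing actions */];
```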

## API Documentation

Perplexica exposes several API endpoints for programmatic access, including:
Perplexica includes API documentation for programmatic access.

- **Search API**: Access Perplexica's advanced search capabilities directly via the `/api/search` endpoint. For detailed documentation, see `docs/api/search.md`.
- **Search API**: For detailed documentation, see `docs/API/SEARCH.md`.
## Setting Up Your Environment

Before diving into coding, setting up your local environment is key. Here's what you need to do:

1. In the root directory, locate the `sample.config.toml` file.
2. Rename it to `config.toml` and fill in the necessary configuration fields.
3. Run `npm install` to install all dependencies.
4. Run `npm run db:migrate` to set up the local SQLite database.
5. Use `npm run dev` to start the application in development mode.

1. Run `npm install` to install all dependencies.
2. Use `npm run dev` to start the application in development mode.
3. Open http://localhost:3000 and complete the setup in the UI (API keys, models, search backend URL, etc.).

Database migrations are applied automatically on startup.

For full installation options (Docker and non-Docker), see the installation guide in the repository README.

**Please note**: Docker configurations are present for setting up production environments, whereas `npm run dev` is used for development purposes.
README.md (21 lines changed)
@@ -18,9 +18,11 @@ Want to know more about its architecture and how it works? You can read it [here
🤖 **Support for all major AI providers** - Use local LLMs through Ollama or connect to OpenAI, Anthropic Claude, Google Gemini, Groq, and more. Mix and match models based on your needs.

⚡ **Smart search modes** - Choose Balanced Mode for everyday searches, Fast Mode when you need quick answers, or wait for Quality Mode (coming soon) for deep research.
⚡ **Smart search modes** - Choose Speed Mode when you need quick answers, Balanced Mode for everyday searches, or Quality Mode for deep research.

🎯 **Six specialized focus modes** - Get better results with modes designed for specific tasks: Academic papers, YouTube videos, Reddit discussions, Wolfram Alpha calculations, writing assistance, or general web search.
🧭 **Pick your sources** - Search the web, discussions, or academic papers. More sources and integrations are in progress.

🧩 **Widgets** - Helpful UI cards that show up when relevant, like weather, calculations, stock prices, and other quick lookups.

🔍 **Web search powered by SearxNG** - Access multiple search engines while keeping your identity private. Support for Tavily and Exa coming soon for even better results.
@@ -81,7 +83,7 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.
Perplexica can be easily run using Docker. Simply run the following command:

```bash
docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:latest
docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:latest
```

This will pull and start the Perplexica container with the bundled SearxNG search engine. Once running, open your browser and navigate to http://localhost:3000. You can then configure your settings (API keys, models, etc.) directly in the setup screen.
@@ -93,7 +95,7 @@ This will pull and start the Perplexica container with the bundled SearxNG searc
If you already have SearxNG running, you can use the slim version of Perplexica:

```bash
docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:slim-latest
docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:slim-latest
```

**Important**: Make sure your SearxNG instance has:
@@ -120,7 +122,7 @@ If you prefer to build from source or need more control:
```bash
docker build -t perplexica .
docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica perplexica
docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica perplexica
```

5. Access Perplexica at http://localhost:3000 and configure your settings in the setup screen.
@@ -237,13 +239,8 @@ Perplexica runs on Next.js and handles all API requests. It works right away on
## Upcoming Features

- [x] Add settings page
- [x] Adding support for local LLMs
- [x] History Saving features
- [x] Introducing various Focus Modes
- [x] Adding API support
- [x] Adding Discover
- [ ] Finalizing Copilot Mode
- [ ] Adding more widgets, integrations, and search sources
- [ ] Adding authentication

## Support Us
@@ -1,6 +1,8 @@
services:
  perplexica:
    image: itzcrazykns1337/perplexica:latest
    build:
      context: .
    ports:
      - '3000:3000'
    volumes:
@@ -57,7 +57,7 @@ Use the `id` field as the `providerId` and the `key` field from the models array
### Request

The API accepts a JSON object in the request body, where you define the focus mode, chat models, embedding models, and your query.
The API accepts a JSON object in the request body, where you define the enabled search `sources`, chat models, embedding models, and your query.

#### Request Body Structure
@@ -72,7 +72,7 @@ The API accepts a JSON object in the request body, where you define the focus mo
    "key": "text-embedding-3-large"
  },
  "optimizationMode": "speed",
  "focusMode": "webSearch",
  "sources": ["web"],
  "query": "What is Perplexica",
  "history": [
    ["human", "Hi, how are you?"],
@@ -87,24 +87,25 @@ The API accepts a JSON object in the request body, where you define the focus mo
### Request Parameters

- **`chatModel`** (object, optional): Defines the chat model to be used for the query. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
- **`chatModel`** (object, required): Defines the chat model to be used for the query. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
  - `providerId` (string): The UUID of the provider. You can get this from the `/api/providers` endpoint response.
  - `key` (string): The model key/identifier (e.g., `gpt-4o-mini`, `llama3.1:latest`). Use the `key` value from the provider's `chatModels` array, not the display name.

- **`embeddingModel`** (object, optional): Defines the embedding model for similarity-based searching. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
- **`embeddingModel`** (object, required): Defines the embedding model for similarity-based searching. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
  - `providerId` (string): The UUID of the embedding provider. You can get this from the `/api/providers` endpoint response.
  - `key` (string): The embedding model key (e.g., `text-embedding-3-large`, `nomic-embed-text`). Use the `key` value from the provider's `embeddingModels` array, not the display name.

- **`focusMode`** (string, required): Specifies which focus mode to use. Available modes:
- **`sources`** (array, required): Which search sources to enable. Available values:
  - `webSearch`, `academicSearch`, `writingAssistant`, `wolframAlphaSearch`, `youtubeSearch`, `redditSearch`.
  - `web`, `academic`, `discussions`.

- **`optimizationMode`** (string, optional): Specifies the optimization mode to control the balance between performance and quality. Available modes:
  - `speed`: Prioritize speed and return the fastest answer.
  - `balanced`: Provide a balanced answer with good speed and reasonable quality.
  - `quality`: Prioritize answer quality (may be slower).

- **`query`** (string, required): The search query or question.
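
For reference, here is a minimal TypeScript sketch of a non-streaming request built from the parameters documented above. The `providerId` values are placeholders; look up real IDs via `GET /api/providers` first.

```ts
// Minimal sketch of calling POST /api/search (non-streaming).
// The providerId values below are placeholders: fetch real ones from GET /api/providers.
async function searchPerplexica(query: string) {
  const res = await fetch('http://localhost:3000/api/search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      chatModel: { providerId: '<provider-uuid>', key: 'gpt-4o-mini' },
      embeddingModel: { providerId: '<provider-uuid>', key: 'text-embedding-3-large' },
      sources: ['web'],
      optimizationMode: 'speed',
      query,
      history: [],
      stream: false,
    }),
  });

  if (!res.ok) throw new Error(`Search failed with status ${res.status}`);

  // Expected shape (per the response documentation below): { message, sources }
  return (await res.json()) as { message: string; sources: unknown[] };
}

// Usage:
// const { message, sources } = await searchPerplexica('What is Perplexica?');
```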
@@ -132,14 +133,14 @@ The response from the API includes both the final message and the sources used t
  "message": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online. Here are some key features and characteristics of Perplexica:\n\n- **AI-Powered Technology**: It utilizes advanced machine learning algorithms to not only retrieve information but also to understand the context and intent behind user queries, providing more relevant results [1][5].\n\n- **Open-Source**: Being open-source, Perplexica offers flexibility and transparency, allowing users to explore its functionalities without the constraints of proprietary software [3][10].",
  "sources": [
    {
      "pageContent": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online.",
      "content": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online.",
      "metadata": {
        "title": "What is Perplexica, and how does it function as an AI-powered search ...",
        "url": "https://askai.glarity.app/search/What-is-Perplexica--and-how-does-it-function-as-an-AI-powered-search-engine"
      }
    },
    {
      "pageContent": "Perplexica is an open-source AI-powered search tool that dives deep into the internet to find precise answers.",
      "content": "Perplexica is an open-source AI-powered search tool that dives deep into the internet to find precise answers.",
      "metadata": {
        "title": "Sahar Mor's Post",
        "url": "https://www.linkedin.com/posts/sahar-mor_a-new-open-source-project-called-perplexica-activity-7204489745668694016-ncja"
@@ -158,7 +159,7 @@ Example of streamed response objects:
```
{"type":"init","data":"Stream connected"}
{"type":"sources","data":[{"pageContent":"...","metadata":{"title":"...","url":"..."}},...]}
{"type":"sources","data":[{"content":"...","metadata":{"title":"...","url":"..."}},...]}
{"type":"response","data":"Perplexica is an "}
{"type":"response","data":"innovative, open-source "}
{"type":"response","data":"AI-powered search engine..."}
@@ -174,9 +175,9 @@ Clients should process each line as a separate JSON object. The different messag
### Fields in the Response

- **`message`** (string): The search result, generated based on the query and focus mode.
- **`message`** (string): The search result, generated based on the query and enabled `sources`.
- **`sources`** (array): A list of sources that were used to generate the search result. Each source includes:
  - `pageContent`: A snippet of the relevant content from the source.
  - `content`: A snippet of the relevant content from the source.
  - `metadata`: Metadata about the source, including:
    - `title`: The title of the webpage.
    - `url`: The URL of the webpage.
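
If you set `stream: true`, the endpoint emits newline-delimited JSON objects like the ones shown above. A minimal reader, assuming only the documented `response` and `sources` message types matter and ignoring anything else, might look like this sketch:

```ts
// Minimal sketch of consuming the streaming response from POST /api/search.
// Assumes the same request body as the non-streaming example, plus stream: true.
async function streamSearch(body: object): Promise<string> {
  const res = await fetch('http://localhost:3000/api/search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ...body, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`Search failed with status ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let answer = '';

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each line is one JSON object; keep any trailing partial line in the buffer.
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';

    for (const line of lines) {
      if (!line.trim()) continue;
      const event = JSON.parse(line);
      if (event.type === 'response') answer += event.data;        // answer text chunk
      else if (event.type === 'sources') console.log('sources:', event.data);
    }
  }
  return answer;
}
```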

@@ -185,5 +186,5 @@ Clients should process each line as a separate JSON object. The different messag

If an error occurs during the search process, the API will return an appropriate error message with an HTTP status code.

- **400**: If the request is malformed or missing required fields (e.g., no focus mode or query).
- **400**: If the request is malformed or missing required fields (e.g., no `sources` or `query`).
- **500**: If an internal server error occurs during the search.
@@ -1,11 +1,38 @@
# Perplexica's Architecture
# Perplexica Architecture

Perplexica's architecture consists of the following key components:
Perplexica is a Next.js application that combines an AI chat experience with search.

1. **User Interface**: A web-based interface that allows users to interact with Perplexica for searching images, videos, and much more.
2. **Agent/Chains**: These components predict Perplexica's next actions, understand user queries, and decide whether a web search is necessary.
3. **SearXNG**: A metasearch engine used by Perplexica to search the web for sources.
4. **LLMs (Large Language Models)**: Utilized by agents and chains for tasks like understanding content, writing responses, and citing sources. Examples include Claude, GPTs, etc.
5. **Embedding Models**: To improve the accuracy of search results, embedding models re-rank the results using similarity search algorithms such as cosine similarity and dot product distance.

For a high-level flow, see [WORKING.md](WORKING.md). For deeper implementation details, see [CONTRIBUTING.md](../../CONTRIBUTING.md).

For a more detailed explanation of how these components work together, see [WORKING.md](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/WORKING.md).

## Key components

1. **User Interface**
   - A web-based UI that lets users chat, search, and view citations.

2. **API Routes**
   - `POST /api/chat` powers the chat UI.
   - `POST /api/search` provides a programmatic search endpoint.
   - `GET /api/providers` lists available providers and model keys.

3. **Agents and Orchestration**
   - The system classifies the question first.
   - It can run research and widgets in parallel.
   - It generates the final answer and includes citations.

4. **Search Backend**
   - A metasearch backend is used to fetch relevant web results when research is enabled.

5. **LLMs (Large Language Models)**
   - Used for classification, writing answers, and producing citations.

6. **Embedding Models**
   - Used for semantic search over user-uploaded files.

7. **Storage**
   - Chats and messages are stored so conversations can be reloaded.

@@ -1,19 +1,72 @@
# How does Perplexica work?
# How Perplexica Works

Curious about how Perplexica works? Don't worry, we'll cover it here. Before we begin, make sure you've read about the architecture of Perplexica to ensure you understand what it's made up of. Haven't read it? You can read it [here](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/README.md).
This is a high-level overview of how Perplexica answers a question.

We'll understand how Perplexica works by taking an example of a scenario where a user asks: "How does an A.C. work?". We'll break down the process into steps to make it easier to understand. The steps are as follows:
If you want a component-level overview, see [README.md](README.md).

1. The message is sent to the `/api/chat` route, where it invokes the chain. The chain depends on your focus mode. For this example, let's assume we use the "webSearch" focus mode.
2. The chain is now invoked; first, the message is passed to another chain, which predicts (using the chat history and the question) whether sources and a web search are needed. If they are, it generates a query (in accordance with the chat history) for searching the web, which we'll take up later. If not, the chain ends there, and the answer generator chain, also known as the response generator, is started.
3. The query returned by the first chain is passed to SearXNG to search the web for information.
4. The information retrieved at this stage comes from a keyword-based search. We then convert both the retrieved information and the query into embeddings and perform a similarity search to find the most relevant sources for answering the query.
5. After all this is done, the sources are passed to the response generator. This chain takes all the chat history, the query, and the sources, and generates a response that is streamed to the UI.

If you want implementation details, see [CONTRIBUTING.md](../../CONTRIBUTING.md).

## How are the answers cited?
## What happens when you ask a question

The LLMs are prompted to do so. We've prompted them so well that they cite the answers themselves, and using some UI magic, we display it to the user.
When you send a message in the UI, the app calls `POST /api/chat`.

## Image and Video Search
At a high level, we do three things:

Image and video searches are conducted in a similar manner. A query is always generated first, then we search the web for images and videos that match the query. These results are then returned to the user.

1. Classify the question and decide what to do next.
2. Run research and widgets in parallel.
3. Write the final answer and include citations.
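
As a rough illustration of these three steps (not the actual implementation; every name below, such as `classify`, `runResearch`, and `runWidgets`, is hypothetical), the orchestration could be sketched like this:

```ts
// Hypothetical sketch of the three-step flow described above. The real
// orchestration lives in src/lib/agents/search; every name here is illustrative.

type ChatTurn = { role: 'user' | 'assistant'; content: string };
type Source = { content: string; url: string };

interface Plan {
  needsResearch: boolean;
  widgets: string[];       // e.g. ['weather', 'stocks']
  standaloneQuery: string; // the question rewritten without chat-history context
}

declare function classify(question: string, history: ChatTurn[]): Promise<Plan>;
declare function runResearch(query: string): Promise<Source[]>;
declare function runWidgets(widgets: string[], query: string): Promise<unknown[]>;
declare function showWidgets(widgets: unknown[]): void;
declare function streamAnswer(input: { question: string; history: ChatTurn[]; sources: Source[] }): Promise<void>;

async function answerQuestion(question: string, history: ChatTurn[]): Promise<void> {
  // 1. Classify: decide whether research is needed, which widgets apply,
  //    and how to rewrite the question into a standalone form.
  const plan = await classify(question, history);

  // 2. Run research and widgets in parallel.
  const [sources, widgets] = await Promise.all([
    plan.needsResearch ? runResearch(plan.standaloneQuery) : Promise.resolve<Source[]>([]),
    runWidgets(plan.widgets, plan.standaloneQuery),
  ]);

  // Widgets are shown in the UI while the answer is still being generated;
  // they are helpful context, not citable sources.
  showWidgets(widgets);

  // 3. Write the final answer, citing the gathered sources.
  await streamAnswer({ question, history, sources });
}
```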

## Classification

Before searching or answering, we run a classification step.

This step decides things like:

- Whether we should do research for this question
- Whether we should show any widgets
- How to rewrite the question into a clearer standalone form

## Widgets

Widgets are small, structured helpers that can run alongside research.

Examples include weather, stocks, and simple calculations.

If a widget is relevant, we show it in the UI while the answer is still being generated.

Widgets are helpful context for the answer, but they are not part of what the model should cite.

## Research

If research is needed, we gather information in the background while any widgets run.

Depending on configuration, research may include web lookup and searching user-uploaded files.

## Answer generation

Once we have enough context, the chat model generates the final response.

You can control the tradeoff between speed and quality using `optimizationMode`:

- `speed`
- `balanced`
- `quality`

## How citations work

We prompt the model to cite the references it used. The UI then renders those citations alongside the supporting links.

## Search API

If you are integrating Perplexica into another product, you can call `POST /api/search`.

It returns:

- `message`: the generated answer
- `sources`: supporting references used for the answer

You can also enable streaming by setting `stream: true`.

## Image and video search

Image and video search use separate endpoints (`POST /api/images` and `POST /api/videos`). We generate a focused query using the chat model, then fetch matching results from a search backend.
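
For reference, a call to the images endpoint looks roughly like the sketch below. The body shape (`query`, a tuple-style `chatHistory`, and a `chatModel` with `providerId` and `key`) is inferred from the route handler in this commit, so treat it as an approximation rather than a spec; the `providerId` is a placeholder.

```ts
// Approximate sketch of calling POST /api/images; verify the exact body shape
// against src/app/api/images/route.ts before relying on it.
async function searchImages(query: string) {
  const res = await fetch('http://localhost:3000/api/images', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query,
      chatHistory: [] as [string, string][], // e.g. [['human', 'Hi'], ['assistant', 'Hello']]
      chatModel: { providerId: '<provider-uuid>', key: 'gpt-4o-mini' },
    }),
  });
  if (!res.ok) throw new Error(`Image search failed with status ${res.status}`);
  return res.json();
}
```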

@@ -10,7 +10,7 @@ Simply pull the latest image and restart your container:

docker pull itzcrazykns1337/perplexica:latest
docker stop perplexica
docker rm perplexica
docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:latest
docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:latest
```

For slim version:

@@ -19,7 +19,7 @@ For slim version:

docker pull itzcrazykns1337/perplexica:slim-latest
docker stop perplexica
docker rm perplexica
docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:slim-latest
docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:slim-latest
```

Once updated, go to http://localhost:3000 and verify the latest changes. Your settings are preserved automatically.
drizzle/0002_daffy_wrecker.sql (new file, 1 line)

@@ -0,0 +1 @@
/* do nothing */

drizzle/meta/0002_snapshot.json (new file, 132 lines)
@@ -0,0 +1,132 @@
|
||||
{
|
||||
"version": "6",
|
||||
"dialect": "sqlite",
|
||||
"id": "1c5eb804-d6b4-48ec-9a8f-75fb729c8e52",
|
||||
"prevId": "6dedf55f-0e44-478f-82cf-14a21ac686f8",
|
||||
"tables": {
|
||||
"chats": {
|
||||
"name": "chats",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"title": {
|
||||
"name": "title",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"createdAt": {
|
||||
"name": "createdAt",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"sources": {
|
||||
"name": "sources",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"files": {
|
||||
"name": "files",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false,
|
||||
"default": "'[]'"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"messages": {
|
||||
"name": "messages",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"messageId": {
|
||||
"name": "messageId",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"chatId": {
|
||||
"name": "chatId",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"backendId": {
|
||||
"name": "backendId",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"query": {
|
||||
"name": "query",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"createdAt": {
|
||||
"name": "createdAt",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"responseBlocks": {
|
||||
"name": "responseBlocks",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false,
|
||||
"default": "'[]'"
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false,
|
||||
"default": "'answering'"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
}
|
||||
},
|
||||
"views": {},
|
||||
"enums": {},
|
||||
"_meta": {
|
||||
"schemas": {},
|
||||
"tables": {},
|
||||
"columns": {}
|
||||
},
|
||||
"internal": {
|
||||
"indexes": {}
|
||||
}
|
||||
}
|
||||
@@ -15,6 +15,13 @@
|
||||
"when": 1758863991284,
|
||||
"tag": "0001_wise_rockslide",
|
||||
"breakpoints": true
|
||||
},
|
||||
{
|
||||
"idx": 2,
|
||||
"version": "6",
|
||||
"when": 1763732708332,
|
||||
"tag": "0002_daffy_wrecker",
|
||||
"breakpoints": true
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
1
next-env.d.ts
vendored
1
next-env.d.ts
vendored
@@ -1,5 +1,6 @@
|
||||
/// <reference types="next" />
|
||||
/// <reference types="next/image-types/global" />
|
||||
import "./.next/dev/types/routes.d.ts";
|
||||
|
||||
// NOTE: This file should not be edited
|
||||
// see https://nextjs.org/docs/app/api-reference/config/typescript for more information.
|
||||
|
||||
@@ -1,3 +1,5 @@
|
||||
import pkg from './package.json' with { type: 'json' };
|
||||
|
||||
/** @type {import('next').NextConfig} */
|
||||
const nextConfig = {
|
||||
output: 'standalone',
|
||||
@@ -9,6 +11,16 @@ const nextConfig = {
|
||||
],
|
||||
},
|
||||
serverExternalPackages: ['pdf-parse'],
|
||||
outputFileTracingIncludes: {
|
||||
'/api/**': [
|
||||
'./node_modules/@napi-rs/canvas/**',
|
||||
'./node_modules/@napi-rs/canvas-linux-x64-gnu/**',
|
||||
'./node_modules/@napi-rs/canvas-linux-x64-musl/**',
|
||||
],
|
||||
},
|
||||
env: {
|
||||
NEXT_PUBLIC_VERSION: pkg.version,
|
||||
},
|
||||
};
|
||||
|
||||
export default nextConfig;
|
||||
|
||||
55
package.json
55
package.json
@@ -1,63 +1,65 @@
|
||||
{
|
||||
"name": "perplexica-frontend",
|
||||
"version": "1.11.2",
|
||||
"name": "perplexica",
|
||||
"version": "1.12.0",
|
||||
"license": "MIT",
|
||||
"author": "ItzCrazyKns",
|
||||
"scripts": {
|
||||
"dev": "next dev",
|
||||
"build": "next build",
|
||||
"dev": "next dev --webpack",
|
||||
"build": "next build --webpack",
|
||||
"start": "next start",
|
||||
"lint": "next lint",
|
||||
"format:write": "prettier . --write"
|
||||
},
|
||||
"dependencies": {
|
||||
"@google/genai": "^1.34.0",
|
||||
"@headlessui/react": "^2.2.0",
|
||||
"@headlessui/tailwindcss": "^0.2.2",
|
||||
"@huggingface/transformers": "^3.7.5",
|
||||
"@iarna/toml": "^2.2.5",
|
||||
"@huggingface/transformers": "^3.8.1",
|
||||
"@icons-pack/react-simple-icons": "^12.3.0",
|
||||
"@langchain/anthropic": "^1.0.0",
|
||||
"@langchain/community": "^1.0.0",
|
||||
"@langchain/core": "^1.0.1",
|
||||
"@langchain/google-genai": "^1.0.0",
|
||||
"@langchain/groq": "^1.0.0",
|
||||
"@langchain/ollama": "^1.0.0",
|
||||
"@langchain/openai": "^1.0.0",
|
||||
"@langchain/textsplitters": "^1.0.0",
|
||||
"@phosphor-icons/react": "^2.1.10",
|
||||
"@radix-ui/react-tooltip": "^1.2.8",
|
||||
"@tailwindcss/typography": "^0.5.12",
|
||||
"axios": "^1.8.3",
|
||||
"better-sqlite3": "^11.9.1",
|
||||
"clsx": "^2.1.0",
|
||||
"compute-cosine-similarity": "^1.1.0",
|
||||
"drizzle-orm": "^0.40.1",
|
||||
"framer-motion": "^12.23.24",
|
||||
"html-to-text": "^9.0.5",
|
||||
"jspdf": "^3.0.1",
|
||||
"langchain": "^1.0.1",
|
||||
"lucide-react": "^0.363.0",
|
||||
"js-tiktoken": "^1.0.21",
|
||||
"jspdf": "^3.0.4",
|
||||
"lightweight-charts": "^5.0.9",
|
||||
"lucide-react": "^0.556.0",
|
||||
"mammoth": "^1.9.1",
|
||||
"markdown-to-jsx": "^7.7.2",
|
||||
"next": "^15.2.2",
|
||||
"mathjs": "^15.1.0",
|
||||
"motion": "^12.23.26",
|
||||
"next": "^16.0.7",
|
||||
"next-themes": "^0.3.0",
|
||||
"pdf-parse": "^1.1.1",
|
||||
"officeparser": "^5.2.2",
|
||||
"ollama": "^0.6.3",
|
||||
"openai": "^6.9.0",
|
||||
"partial-json": "^0.1.7",
|
||||
"pdf-parse": "^2.4.5",
|
||||
"react": "^18",
|
||||
"react-dom": "^18",
|
||||
"react-syntax-highlighter": "^16.1.0",
|
||||
"react-text-to-speech": "^0.14.5",
|
||||
"react-textarea-autosize": "^8.5.3",
|
||||
"rfc6902": "^5.1.2",
|
||||
"sonner": "^1.4.41",
|
||||
"tailwind-merge": "^2.2.2",
|
||||
"winston": "^3.17.0",
|
||||
"turndown": "^7.2.2",
|
||||
"yahoo-finance2": "^3.10.2",
|
||||
"yet-another-react-lightbox": "^3.17.2",
|
||||
"zod": "^3.22.4"
|
||||
"zod": "^4.1.12"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@types/better-sqlite3": "^7.6.12",
|
||||
"@types/html-to-text": "^9.0.4",
|
||||
"@types/jspdf": "^2.0.0",
|
||||
"@types/node": "^24.8.1",
|
||||
"@types/pdf-parse": "^1.1.4",
|
||||
"@types/react": "^18",
|
||||
"@types/react-dom": "^18",
|
||||
"@types/react-syntax-highlighter": "^15.5.13",
|
||||
"@types/turndown": "^5.0.6",
|
||||
"autoprefixer": "^10.0.1",
|
||||
"drizzle-kit": "^0.30.5",
|
||||
"eslint": "^8",
|
||||
@@ -66,5 +68,8 @@
|
||||
"prettier": "^3.2.5",
|
||||
"tailwindcss": "^3.3.0",
|
||||
"typescript": "^5.9.3"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@napi-rs/canvas": "^0.1.87"
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,14 +1,14 @@
|
||||
import crypto from 'crypto';
|
||||
import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
|
||||
import { EventEmitter } from 'stream';
|
||||
import db from '@/lib/db';
|
||||
import { chats, messages as messagesSchema } from '@/lib/db/schema';
|
||||
import { and, eq, gt } from 'drizzle-orm';
|
||||
import { getFileDetails } from '@/lib/utils/files';
|
||||
import { searchHandlers } from '@/lib/search';
|
||||
import { z } from 'zod';
|
||||
import ModelRegistry from '@/lib/models/registry';
|
||||
import { ModelWithProvider } from '@/lib/models/types';
|
||||
import SearchAgent from '@/lib/agents/search';
|
||||
import SessionManager from '@/lib/session';
|
||||
import { ChatTurnMessage } from '@/lib/types';
|
||||
import { SearchSources } from '@/lib/agents/search/types';
|
||||
import db from '@/lib/db';
|
||||
import { eq } from 'drizzle-orm';
|
||||
import { chats } from '@/lib/db/schema';
|
||||
import UploadManager from '@/lib/uploads/manager';
|
||||
|
||||
export const runtime = 'nodejs';
|
||||
export const dynamic = 'force-dynamic';
|
||||
@@ -20,47 +20,25 @@ const messageSchema = z.object({
|
||||
});
|
||||
|
||||
const chatModelSchema: z.ZodType<ModelWithProvider> = z.object({
|
||||
providerId: z.string({
|
||||
errorMap: () => ({
|
||||
message: 'Chat model provider id must be provided',
|
||||
}),
|
||||
}),
|
||||
key: z.string({
|
||||
errorMap: () => ({
|
||||
message: 'Chat model key must be provided',
|
||||
}),
|
||||
}),
|
||||
providerId: z.string({ message: 'Chat model provider id must be provided' }),
|
||||
key: z.string({ message: 'Chat model key must be provided' }),
|
||||
});
|
||||
|
||||
const embeddingModelSchema: z.ZodType<ModelWithProvider> = z.object({
|
||||
providerId: z.string({
|
||||
errorMap: () => ({
|
||||
message: 'Embedding model provider id must be provided',
|
||||
}),
|
||||
}),
|
||||
key: z.string({
|
||||
errorMap: () => ({
|
||||
message: 'Embedding model key must be provided',
|
||||
}),
|
||||
}),
|
||||
key: z.string({ message: 'Embedding model key must be provided' }),
|
||||
});
|
||||
|
||||
const bodySchema = z.object({
|
||||
message: messageSchema,
|
||||
optimizationMode: z.enum(['speed', 'balanced', 'quality'], {
|
||||
errorMap: () => ({
|
||||
message: 'Optimization mode must be one of: speed, balanced, quality',
|
||||
}),
|
||||
}),
|
||||
focusMode: z.string().min(1, 'Focus mode is required'),
|
||||
sources: z.array(z.string()).optional().default([]),
|
||||
history: z
|
||||
.array(
|
||||
z.tuple([z.string(), z.string()], {
|
||||
errorMap: () => ({
|
||||
message: 'History items must be tuples of two strings',
|
||||
}),
|
||||
}),
|
||||
)
|
||||
.array(z.tuple([z.string(), z.string()]))
|
||||
.optional()
|
||||
.default([]),
|
||||
files: z.array(z.string()).optional().default([]),
|
||||
@@ -69,7 +47,6 @@ const bodySchema = z.object({
|
||||
systemInstructions: z.string().nullable().optional().default(''),
|
||||
});
|
||||
|
||||
type Message = z.infer<typeof messageSchema>;
|
||||
type Body = z.infer<typeof bodySchema>;
|
||||
|
||||
const safeValidateBody = (data: unknown) => {
|
||||
@@ -78,7 +55,7 @@ const safeValidateBody = (data: unknown) => {
|
||||
if (!result.success) {
|
||||
return {
|
||||
success: false,
|
||||
error: result.error.errors.map((e) => ({
|
||||
error: result.error.issues.map((e: any) => ({
|
||||
path: e.path.join('.'),
|
||||
message: e.message,
|
||||
})),
|
||||
@@ -91,143 +68,35 @@ const safeValidateBody = (data: unknown) => {
|
||||
};
|
||||
};
|
||||
|
||||
const handleEmitterEvents = async (
|
||||
stream: EventEmitter,
|
||||
writer: WritableStreamDefaultWriter,
|
||||
encoder: TextEncoder,
|
||||
chatId: string,
|
||||
) => {
|
||||
let receivedMessage = '';
|
||||
const aiMessageId = crypto.randomBytes(7).toString('hex');
|
||||
|
||||
stream.on('data', (data) => {
|
||||
const parsedData = JSON.parse(data);
|
||||
if (parsedData.type === 'response') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'message',
|
||||
data: parsedData.data,
|
||||
messageId: aiMessageId,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
|
||||
receivedMessage += parsedData.data;
|
||||
} else if (parsedData.type === 'sources') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'sources',
|
||||
data: parsedData.data,
|
||||
messageId: aiMessageId,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
|
||||
const sourceMessageId = crypto.randomBytes(7).toString('hex');
|
||||
|
||||
db.insert(messagesSchema)
|
||||
.values({
|
||||
chatId: chatId,
|
||||
messageId: sourceMessageId,
|
||||
role: 'source',
|
||||
sources: parsedData.data,
|
||||
createdAt: new Date().toString(),
|
||||
const ensureChatExists = async (input: {
|
||||
id: string;
|
||||
sources: SearchSources[];
|
||||
query: string;
|
||||
fileIds: string[];
|
||||
}) => {
|
||||
try {
|
||||
const exists = await db.query.chats
|
||||
.findFirst({
|
||||
where: eq(chats.id, input.id),
|
||||
})
|
||||
.execute();
|
||||
}
|
||||
});
|
||||
stream.on('end', () => {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'messageEnd',
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
writer.close();
|
||||
|
||||
db.insert(messagesSchema)
|
||||
.values({
|
||||
content: receivedMessage,
|
||||
chatId: chatId,
|
||||
messageId: aiMessageId,
|
||||
role: 'assistant',
|
||||
createdAt: new Date().toString(),
|
||||
})
|
||||
.execute();
|
||||
});
|
||||
stream.on('error', (data) => {
|
||||
const parsedData = JSON.parse(data);
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'error',
|
||||
data: parsedData.data,
|
||||
}),
|
||||
),
|
||||
);
|
||||
writer.close();
|
||||
});
|
||||
if (!exists) {
|
||||
await db.insert(chats).values({
|
||||
id: input.id,
|
||||
createdAt: new Date().toISOString(),
|
||||
sources: input.sources,
|
||||
title: input.query,
|
||||
files: input.fileIds.map((id) => {
|
||||
return {
|
||||
fileId: id,
|
||||
name: UploadManager.getFile(id)?.name || 'Uploaded File',
|
||||
};
|
||||
|
||||
const handleHistorySave = async (
|
||||
message: Message,
|
||||
humanMessageId: string,
|
||||
focusMode: string,
|
||||
files: string[],
|
||||
) => {
|
||||
const chat = await db.query.chats.findFirst({
|
||||
where: eq(chats.id, message.chatId),
|
||||
}),
|
||||
});
|
||||
|
||||
const fileData = files.map(getFileDetails);
|
||||
|
||||
if (!chat) {
|
||||
await db
|
||||
.insert(chats)
|
||||
.values({
|
||||
id: message.chatId,
|
||||
title: message.content,
|
||||
createdAt: new Date().toString(),
|
||||
focusMode: focusMode,
|
||||
files: fileData,
|
||||
})
|
||||
.execute();
|
||||
} else if (JSON.stringify(chat.files ?? []) != JSON.stringify(fileData)) {
|
||||
db.update(chats)
|
||||
.set({
|
||||
files: files.map(getFileDetails),
|
||||
})
|
||||
.where(eq(chats.id, message.chatId));
|
||||
}
|
||||
|
||||
const messageExists = await db.query.messages.findFirst({
|
||||
where: eq(messagesSchema.messageId, humanMessageId),
|
||||
});
|
||||
|
||||
if (!messageExists) {
|
||||
await db
|
||||
.insert(messagesSchema)
|
||||
.values({
|
||||
content: message.content,
|
||||
chatId: message.chatId,
|
||||
messageId: humanMessageId,
|
||||
role: 'user',
|
||||
createdAt: new Date().toString(),
|
||||
})
|
||||
.execute();
|
||||
} else {
|
||||
await db
|
||||
.delete(messagesSchema)
|
||||
.where(
|
||||
and(
|
||||
gt(messagesSchema.id, messageExists.id),
|
||||
eq(messagesSchema.chatId, message.chatId),
|
||||
),
|
||||
)
|
||||
.execute();
|
||||
} catch (err) {
|
||||
console.error('Failed to check/save chat:', err);
|
||||
}
|
||||
};
|
||||
|
||||
@@ -236,6 +105,7 @@ export const POST = async (req: Request) => {
|
||||
const reqBody = (await req.json()) as Body;
|
||||
|
||||
const parseBody = safeValidateBody(reqBody);
|
||||
|
||||
if (!parseBody.success) {
|
||||
return Response.json(
|
||||
{ message: 'Invalid request body', error: parseBody.error },
|
||||
@@ -265,48 +135,107 @@ export const POST = async (req: Request) => {
|
||||
),
|
||||
]);
|
||||
|
||||
const humanMessageId =
|
||||
message.messageId ?? crypto.randomBytes(7).toString('hex');
|
||||
|
||||
const history: BaseMessage[] = body.history.map((msg) => {
|
||||
const history: ChatTurnMessage[] = body.history.map((msg) => {
|
||||
if (msg[0] === 'human') {
|
||||
return new HumanMessage({
|
||||
return {
|
||||
role: 'user',
|
||||
content: msg[1],
|
||||
});
|
||||
};
|
||||
} else {
|
||||
return new AIMessage({
|
||||
return {
|
||||
role: 'assistant',
|
||||
content: msg[1],
|
||||
});
|
||||
};
|
||||
}
|
||||
});
|
||||
|
||||
const handler = searchHandlers[body.focusMode];
|
||||
|
||||
if (!handler) {
|
||||
return Response.json(
|
||||
{
|
||||
message: 'Invalid focus mode',
|
||||
},
|
||||
{ status: 400 },
|
||||
);
|
||||
}
|
||||
|
||||
const stream = await handler.searchAndAnswer(
|
||||
message.content,
|
||||
history,
|
||||
llm,
|
||||
embedding,
|
||||
body.optimizationMode,
|
||||
body.files,
|
||||
body.systemInstructions as string,
|
||||
);
|
||||
const agent = new SearchAgent();
|
||||
const session = SessionManager.createSession();
|
||||
|
||||
const responseStream = new TransformStream();
|
||||
const writer = responseStream.writable.getWriter();
|
||||
const encoder = new TextEncoder();
|
||||
|
||||
handleEmitterEvents(stream, writer, encoder, message.chatId);
|
||||
handleHistorySave(message, humanMessageId, body.focusMode, body.files);
|
||||
const disconnect = session.subscribe((event: string, data: any) => {
|
||||
if (event === 'data') {
|
||||
if (data.type === 'block') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'block',
|
||||
block: data.block,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
} else if (data.type === 'updateBlock') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'updateBlock',
|
||||
blockId: data.blockId,
|
||||
patch: data.patch,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
} else if (data.type === 'researchComplete') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'researchComplete',
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
}
|
||||
} else if (event === 'end') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'messageEnd',
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
writer.close();
|
||||
session.removeAllListeners();
|
||||
} else if (event === 'error') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'error',
|
||||
data: data.data,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
writer.close();
|
||||
session.removeAllListeners();
|
||||
}
|
||||
});
|
||||
|
||||
agent.searchAsync(session, {
|
||||
chatHistory: history,
|
||||
followUp: message.content,
|
||||
chatId: body.message.chatId,
|
||||
messageId: body.message.messageId,
|
||||
config: {
|
||||
llm,
|
||||
embedding: embedding,
|
||||
sources: body.sources as SearchSources[],
|
||||
mode: body.optimizationMode,
|
||||
fileIds: body.files,
|
||||
systemInstructions: body.systemInstructions || 'None',
|
||||
},
|
||||
});
|
||||
|
||||
ensureChatExists({
|
||||
id: body.message.chatId,
|
||||
sources: body.sources as SearchSources[],
|
||||
fileIds: body.files,
|
||||
query: body.message.content,
|
||||
});
|
||||
|
||||
req.signal.addEventListener('abort', () => {
|
||||
disconnect();
|
||||
writer.close();
|
||||
});
|
||||
|
||||
return new Response(responseStream.readable, {
|
||||
headers: {
|
||||
|
||||
@@ -1,7 +1,6 @@
|
||||
import handleImageSearch from '@/lib/chains/imageSearchAgent';
|
||||
import searchImages from '@/lib/agents/media/image';
|
||||
import ModelRegistry from '@/lib/models/registry';
|
||||
import { ModelWithProvider } from '@/lib/models/types';
|
||||
import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
|
||||
|
||||
interface ImageSearchBody {
|
||||
query: string;
|
||||
@@ -13,16 +12,6 @@ export const POST = async (req: Request) => {
|
||||
try {
|
||||
const body: ImageSearchBody = await req.json();
|
||||
|
||||
const chatHistory = body.chatHistory
|
||||
.map((msg: any) => {
|
||||
if (msg.role === 'user') {
|
||||
return new HumanMessage(msg.content);
|
||||
} else if (msg.role === 'assistant') {
|
||||
return new AIMessage(msg.content);
|
||||
}
|
||||
})
|
||||
.filter((msg) => msg !== undefined) as BaseMessage[];
|
||||
|
||||
const registry = new ModelRegistry();
|
||||
|
||||
const llm = await registry.loadChatModel(
|
||||
@@ -30,9 +19,12 @@ export const POST = async (req: Request) => {
|
||||
body.chatModel.key,
|
||||
);
|
||||
|
||||
const images = await handleImageSearch(
|
||||
const images = await searchImages(
|
||||
{
|
||||
chat_history: chatHistory,
|
||||
chatHistory: body.chatHistory.map(([role, content]) => ({
|
||||
role: role === 'human' ? 'user' : 'assistant',
|
||||
content,
|
||||
})),
|
||||
query: body.query,
|
||||
},
|
||||
llm,
|
||||
|
||||
93
src/app/api/reconnect/[id]/route.ts
Normal file
93
src/app/api/reconnect/[id]/route.ts
Normal file
@@ -0,0 +1,93 @@
|
||||
import SessionManager from '@/lib/session';
|
||||
|
||||
export const POST = async (
|
||||
req: Request,
|
||||
{ params }: { params: Promise<{ id: string }> },
|
||||
) => {
|
||||
try {
|
||||
const { id } = await params;
|
||||
|
||||
const session = SessionManager.getSession(id);
|
||||
|
||||
if (!session) {
|
||||
return Response.json({ message: 'Session not found' }, { status: 404 });
|
||||
}
|
||||
|
||||
const responseStream = new TransformStream();
|
||||
const writer = responseStream.writable.getWriter();
|
||||
const encoder = new TextEncoder();
|
||||
|
||||
const disconnect = session.subscribe((event, data) => {
|
||||
if (event === 'data') {
|
||||
if (data.type === 'block') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'block',
|
||||
block: data.block,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
} else if (data.type === 'updateBlock') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'updateBlock',
|
||||
blockId: data.blockId,
|
||||
patch: data.patch,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
} else if (data.type === 'researchComplete') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'researchComplete',
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
}
|
||||
} else if (event === 'end') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'messageEnd',
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
writer.close();
|
||||
disconnect();
|
||||
} else if (event === 'error') {
|
||||
writer.write(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'error',
|
||||
data: data.data,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
writer.close();
|
||||
disconnect();
|
||||
}
|
||||
});
|
||||
|
||||
req.signal.addEventListener('abort', () => {
|
||||
disconnect();
|
||||
writer.close();
|
||||
});
|
||||
|
||||
return new Response(responseStream.readable, {
|
||||
headers: {
|
||||
'Content-Type': 'text/event-stream',
|
||||
Connection: 'keep-alive',
|
||||
'Cache-Control': 'no-cache, no-transform',
|
||||
},
|
||||
});
|
||||
} catch (err) {
|
||||
console.error('Error in reconnecting to session stream: ', err);
|
||||
return Response.json(
|
||||
{ message: 'An error has occurred.' },
|
||||
{ status: 500 },
|
||||
);
|
||||
}
|
||||
};
|
||||
@@ -1,12 +1,13 @@
|
||||
import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
|
||||
import { MetaSearchAgentType } from '@/lib/search/metaSearchAgent';
|
||||
import { searchHandlers } from '@/lib/search';
|
||||
import ModelRegistry from '@/lib/models/registry';
|
||||
import { ModelWithProvider } from '@/lib/models/types';
|
||||
import SessionManager from '@/lib/session';
|
||||
import { ChatTurnMessage } from '@/lib/types';
|
||||
import { SearchSources } from '@/lib/agents/search/types';
|
||||
import APISearchAgent from '@/lib/agents/search/api';
|
||||
|
||||
interface ChatRequestBody {
|
||||
optimizationMode: 'speed' | 'balanced';
|
||||
focusMode: string;
|
||||
optimizationMode: 'speed' | 'balanced' | 'quality';
|
||||
sources: SearchSources[];
|
||||
chatModel: ModelWithProvider;
|
||||
embeddingModel: ModelWithProvider;
|
||||
query: string;
|
||||
@@ -19,23 +20,17 @@ export const POST = async (req: Request) => {
|
||||
try {
|
||||
const body: ChatRequestBody = await req.json();
|
||||
|
||||
if (!body.focusMode || !body.query) {
|
||||
if (!body.sources || !body.query) {
|
||||
return Response.json(
|
||||
{ message: 'Missing focus mode or query' },
|
||||
{ message: 'Missing sources or query' },
|
||||
{ status: 400 },
|
||||
);
|
||||
}
|
||||
|
||||
body.history = body.history || [];
|
||||
body.optimizationMode = body.optimizationMode || 'balanced';
|
||||
body.optimizationMode = body.optimizationMode || 'speed';
|
||||
body.stream = body.stream || false;
|
||||
|
||||
const history: BaseMessage[] = body.history.map((msg) => {
|
||||
return msg[0] === 'human'
|
||||
? new HumanMessage({ content: msg[1] })
|
||||
: new AIMessage({ content: msg[1] });
|
||||
});
|
||||
|
||||
const registry = new ModelRegistry();
|
||||
|
||||
const [llm, embeddings] = await Promise.all([
|
||||
@@ -46,21 +41,30 @@ export const POST = async (req: Request) => {
|
||||
),
|
||||
]);
|
||||
|
||||
const searchHandler: MetaSearchAgentType = searchHandlers[body.focusMode];
|
||||
const history: ChatTurnMessage[] = body.history.map((msg) => {
|
||||
return msg[0] === 'human'
|
||||
? { role: 'user', content: msg[1] }
|
||||
: { role: 'assistant', content: msg[1] };
|
||||
});
|
||||
|
||||
if (!searchHandler) {
|
||||
return Response.json({ message: 'Invalid focus mode' }, { status: 400 });
|
||||
}
|
||||
const session = SessionManager.createSession();
|
||||
|
||||
const emitter = await searchHandler.searchAndAnswer(
|
||||
body.query,
|
||||
history,
|
||||
llm,
|
||||
embeddings,
|
||||
body.optimizationMode,
|
||||
[],
|
||||
body.systemInstructions || '',
|
||||
);
|
||||
const agent = new APISearchAgent();
|
||||
|
||||
agent.searchAsync(session, {
|
||||
chatHistory: history,
|
||||
config: {
|
||||
embedding: embeddings,
|
||||
llm: llm,
|
||||
sources: body.sources,
|
||||
mode: body.optimizationMode,
|
||||
fileIds: [],
|
||||
systemInstructions: body.systemInstructions || '',
|
||||
},
|
||||
followUp: body.query,
|
||||
chatId: crypto.randomUUID(),
|
||||
messageId: crypto.randomUUID(),
|
||||
});
|
||||
|
||||
if (!body.stream) {
|
||||
return new Promise(
|
||||
@@ -71,13 +75,13 @@ export const POST = async (req: Request) => {
|
||||
let message = '';
|
||||
let sources: any[] = [];
|
||||
|
||||
emitter.on('data', (data: string) => {
|
||||
session.subscribe((event: string, data: Record<string, any>) => {
|
||||
if (event === 'data') {
|
||||
try {
|
||||
const parsedData = JSON.parse(data);
|
||||
if (parsedData.type === 'response') {
|
||||
message += parsedData.data;
|
||||
} else if (parsedData.type === 'sources') {
|
||||
sources = parsedData.data;
|
||||
if (data.type === 'response') {
|
||||
message += data.data;
|
||||
} else if (data.type === 'searchResults') {
|
||||
sources = data.data;
|
||||
}
|
||||
} catch (error) {
|
||||
reject(
|
||||
@@ -87,19 +91,20 @@ export const POST = async (req: Request) => {
|
||||
),
|
||||
);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
emitter.on('end', () => {
|
||||
if (event === 'end') {
|
||||
resolve(Response.json({ message, sources }, { status: 200 }));
|
||||
});
|
||||
}
|
||||
|
||||
emitter.on('error', (error: any) => {
|
||||
if (event === 'error') {
|
||||
reject(
|
||||
Response.json(
|
||||
{ message: 'Search error', error },
|
||||
{ message: 'Search error', error: data },
|
||||
{ status: 500 },
|
||||
),
|
||||
);
|
||||
}
|
||||
});
|
||||
},
|
||||
);
|
||||
@@ -124,30 +129,29 @@ export const POST = async (req: Request) => {
|
||||
);
|
||||
|
||||
signal.addEventListener('abort', () => {
|
||||
emitter.removeAllListeners();
|
||||
session.removeAllListeners();
|
||||
|
||||
try {
|
||||
controller.close();
|
||||
} catch (error) {}
|
||||
});
|
||||
|
||||
emitter.on('data', (data: string) => {
|
||||
session.subscribe((event: string, data: Record<string, any>) => {
|
||||
if (event === 'data') {
|
||||
if (signal.aborted) return;
|
||||
|
||||
try {
|
||||
const parsedData = JSON.parse(data);
|
||||
|
||||
if (parsedData.type === 'response') {
|
||||
if (data.type === 'response') {
|
||||
controller.enqueue(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
type: 'response',
|
||||
data: parsedData.data,
|
||||
data: data.data,
|
||||
}) + '\n',
|
||||
),
|
||||
);
|
||||
} else if (parsedData.type === 'sources') {
|
||||
sources = parsedData.data;
|
||||
} else if (data.type === 'searchResults') {
|
||||
sources = data.data;
|
||||
controller.enqueue(
|
||||
encoder.encode(
|
||||
JSON.stringify({
|
||||
@@ -160,9 +164,9 @@ export const POST = async (req: Request) => {
|
||||
} catch (error) {
|
||||
controller.error(error);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
emitter.on('end', () => {
|
||||
if (event === 'end') {
|
||||
if (signal.aborted) return;
|
||||
|
||||
controller.enqueue(
|
||||
@@ -173,12 +177,13 @@ export const POST = async (req: Request) => {
|
||||
),
|
||||
);
|
||||
controller.close();
|
||||
});
|
||||
}
|
||||
|
||||
emitter.on('error', (error: any) => {
|
||||
if (event === 'error') {
|
||||
if (signal.aborted) return;
|
||||
|
||||
controller.error(error);
|
||||
controller.error(data);
|
||||
}
|
||||
});
|
||||
},
|
||||
cancel() {
|
||||
|
||||
@@ -1,8 +1,6 @@
|
||||
import generateSuggestions from '@/lib/chains/suggestionGeneratorAgent';
|
||||
import generateSuggestions from '@/lib/agents/suggestions';
|
||||
import ModelRegistry from '@/lib/models/registry';
|
||||
import { ModelWithProvider } from '@/lib/models/types';
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
|
||||
|
||||
interface SuggestionsGenerationBody {
|
||||
chatHistory: any[];
|
||||
@@ -13,16 +11,6 @@ export const POST = async (req: Request) => {
|
||||
try {
|
||||
const body: SuggestionsGenerationBody = await req.json();
|
||||
|
||||
const chatHistory = body.chatHistory
|
||||
.map((msg: any) => {
|
||||
if (msg.role === 'user') {
|
||||
return new HumanMessage(msg.content);
|
||||
} else if (msg.role === 'assistant') {
|
||||
return new AIMessage(msg.content);
|
||||
}
|
||||
})
|
||||
.filter((msg) => msg !== undefined) as BaseMessage[];
|
||||
|
||||
const registry = new ModelRegistry();
|
||||
|
||||
const llm = await registry.loadChatModel(
|
||||
@@ -32,7 +20,10 @@ export const POST = async (req: Request) => {
|
||||
|
||||
const suggestions = await generateSuggestions(
|
||||
{
|
||||
chat_history: chatHistory,
|
||||
chatHistory: body.chatHistory.map(([role, content]) => ({
|
||||
role: role === 'human' ? 'user' : 'assistant',
|
||||
content,
|
||||
})),
|
||||
},
|
||||
llm,
|
||||
);
|
||||
|
||||
@@ -1,39 +1,16 @@
import { NextResponse } from 'next/server';
import fs from 'fs';
import path from 'path';
import crypto from 'crypto';
import { PDFLoader } from '@langchain/community/document_loaders/fs/pdf';
import { DocxLoader } from '@langchain/community/document_loaders/fs/docx';
import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters';
import { Document } from '@langchain/core/documents';
import ModelRegistry from '@/lib/models/registry';

interface FileRes {
fileName: string;
fileExtension: string;
fileId: string;
}

const uploadDir = path.join(process.cwd(), 'uploads');

if (!fs.existsSync(uploadDir)) {
fs.mkdirSync(uploadDir, { recursive: true });
}

const splitter = new RecursiveCharacterTextSplitter({
chunkSize: 500,
chunkOverlap: 100,
});
import UploadManager from '@/lib/uploads/manager';

export async function POST(req: Request) {
try {
const formData = await req.formData();

const files = formData.getAll('files') as File[];
const embedding_model = formData.get('embedding_model_key') as string;
const embedding_model_provider = formData.get('embedding_model_provider_id') as string;
const embeddingModel = formData.get('embedding_model_key') as string;
const embeddingModelProvider = formData.get('embedding_model_provider_id') as string;

if (!embedding_model || !embedding_model_provider) {
if (!embeddingModel || !embeddingModelProvider) {
return NextResponse.json(
{ message: 'Missing embedding model or provider' },
{ status: 400 },
@@ -42,73 +19,13 @@ export async function POST(req: Request) {

const registry = new ModelRegistry();

const model = await registry.loadEmbeddingModel(embedding_model_provider, embedding_model);
const model = await registry.loadEmbeddingModel(embeddingModelProvider, embeddingModel);

const processedFiles: FileRes[] = [];
const uploadManager = new UploadManager({
embeddingModel: model,
})

await Promise.all(
files.map(async (file: any) => {
const fileExtension = file.name.split('.').pop();
if (!['pdf', 'docx', 'txt'].includes(fileExtension!)) {
return NextResponse.json(
{ message: 'File type not supported' },
{ status: 400 },
);
}

const uniqueFileName = `${crypto.randomBytes(16).toString('hex')}.${fileExtension}`;
const filePath = path.join(uploadDir, uniqueFileName);

const buffer = Buffer.from(await file.arrayBuffer());
fs.writeFileSync(filePath, new Uint8Array(buffer));

let docs: any[] = [];
if (fileExtension === 'pdf') {
const loader = new PDFLoader(filePath);
docs = await loader.load();
} else if (fileExtension === 'docx') {
const loader = new DocxLoader(filePath);
docs = await loader.load();
} else if (fileExtension === 'txt') {
const text = fs.readFileSync(filePath, 'utf-8');
docs = [
new Document({ pageContent: text, metadata: { title: file.name } }),
];
}

const splitted = await splitter.splitDocuments(docs);

const extractedDataPath = filePath.replace(/\.\w+$/, '-extracted.json');
fs.writeFileSync(
extractedDataPath,
JSON.stringify({
title: file.name,
contents: splitted.map((doc) => doc.pageContent),
}),
);

const embeddings = await model.embedDocuments(
splitted.map((doc) => doc.pageContent),
);
const embeddingsDataPath = filePath.replace(
/\.\w+$/,
'-embeddings.json',
);
fs.writeFileSync(
embeddingsDataPath,
JSON.stringify({
title: file.name,
embeddings,
}),
);

processedFiles.push({
fileName: file.name,
fileExtension: fileExtension,
fileId: uniqueFileName.replace(/\.\w+$/, ''),
});
}),
);
const processedFiles = await uploadManager.processFiles(files);

return NextResponse.json({
files: processedFiles,
@@ -1,7 +1,6 @@
import handleVideoSearch from '@/lib/chains/videoSearchAgent';
import handleVideoSearch from '@/lib/agents/media/video';
import ModelRegistry from '@/lib/models/registry';
import { ModelWithProvider } from '@/lib/models/types';
import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';

interface VideoSearchBody {
query: string;
@@ -13,16 +12,6 @@ export const POST = async (req: Request) => {
try {
const body: VideoSearchBody = await req.json();

const chatHistory = body.chatHistory
.map((msg: any) => {
if (msg.role === 'user') {
return new HumanMessage(msg.content);
} else if (msg.role === 'assistant') {
return new AIMessage(msg.content);
}
})
.filter((msg) => msg !== undefined) as BaseMessage[];

const registry = new ModelRegistry();

const llm = await registry.loadChatModel(
@@ -32,7 +21,10 @@ export const POST = async (req: Request) => {

const videos = await handleVideoSearch(
{
chat_history: chatHistory,
chatHistory: body.chatHistory.map(([role, content]) => ({
role: role === 'human' ? 'user' : 'assistant',
content,
})),
query: body.query,
},
llm,
@@ -1,10 +1,5 @@
'use client';

import ChatWindow from '@/components/ChatWindow';
import React from 'react';

const Page = () => {
return <ChatWindow />;
};

export default Page;
export default ChatWindow;
@@ -34,7 +34,7 @@ export default function RootLayout({

return (
<html className="h-full" lang="en" suppressHydrationWarning>
<body className={cn('h-full', montserrat.className)}>
<body className={cn('h-full antialiased', montserrat.className)}>
<ThemeProvider>
{setupComplete ? (
<ChatProvider>
@@ -1,8 +1,8 @@
'use client';

import DeleteChat from '@/components/DeleteChat';
import { cn, formatTimeDifference } from '@/lib/utils';
import { BookOpenText, ClockIcon, Delete, ScanEye } from 'lucide-react';
import { formatTimeDifference } from '@/lib/utils';
import { BookOpenText, ClockIcon, FileText, Globe2Icon } from 'lucide-react';
import Link from 'next/link';
import { useEffect, useState } from 'react';

@@ -10,7 +10,8 @@ export interface Chat {
id: string;
title: string;
createdAt: string;
focusMode: string;
sources: string[];
files: { fileId: string; name: string }[];
}

const Page = () => {
@@ -37,8 +38,38 @@ const Page = () => {
fetchChats();
}, []);

return loading ? (
<div className="flex flex-row items-center justify-center min-h-screen">
return (
<div>
<div className="flex flex-col pt-10 border-b border-light-200/20 dark:border-dark-200/20 pb-6 px-2">
<div className="flex flex-col lg:flex-row lg:items-end lg:justify-between gap-3">
<div className="flex items-center justify-center">
<BookOpenText size={45} className="mb-2.5" />
<div className="flex flex-col">
<h1
className="text-5xl font-normal p-2 pb-0"
style={{ fontFamily: 'PP Editorial, serif' }}
>
Library
</h1>
<div className="px-2 text-sm text-black/60 dark:text-white/60 text-center lg:text-left">
Past chats, sources, and uploads.
</div>
</div>
</div>

<div className="flex items-center justify-center lg:justify-end gap-2 text-xs text-black/60 dark:text-white/60">
<span className="inline-flex items-center gap-1 rounded-full border border-black/20 dark:border-white/20 px-2 py-0.5">
<BookOpenText size={14} />
{loading
? 'Loading…'
: `${chats.length} ${chats.length === 1 ? 'chat' : 'chats'}`}
</span>
</div>
</div>
</div>

{loading ? (
<div className="flex flex-row items-center justify-center min-h-[60vh]">
<svg
aria-hidden="true"
className="w-8 h-8 text-light-200 fill-light-secondary dark:text-[#202020] animate-spin dark:fill-[#ffffff3b]"
@@ -56,47 +87,56 @@ const Page = () => {
/>
</svg>
</div>
) : (
<div>
<div className="flex flex-col pt-4">
<div className="flex items-center">
<BookOpenText />
<h1 className="text-3xl font-medium p-2">Library</h1>
) : chats.length === 0 ? (
<div className="flex flex-col items-center justify-center min-h-[70vh] px-2 text-center">
<div className="flex items-center justify-center w-12 h-12 rounded-2xl border border-light-200 dark:border-dark-200 bg-light-secondary dark:bg-dark-secondary">
<BookOpenText className="text-black/70 dark:text-white/70" />
</div>
<hr className="border-t border-[#2B2C2C] my-4 w-full" />
</div>
{chats.length === 0 && (
<div className="flex flex-row items-center justify-center min-h-screen">
<p className="text-black/70 dark:text-white/70 text-sm">
<p className="mt-2 text-black/70 dark:text-white/70 text-sm">
No chats found.
</p>
<p className="mt-1 text-black/70 dark:text-white/70 text-sm">
<Link href="/" className="text-sky-400">
Start a new chat
</Link>{' '}
to see it listed here.
</p>
</div>
)}
{chats.length > 0 && (
<div className="flex flex-col pb-20 lg:pb-2">
{chats.map((chat, i) => (
) : (
<div className="pt-6 pb-28 px-2">
<div className="rounded-2xl border border-light-200 dark:border-dark-200 overflow-hidden bg-light-primary dark:bg-dark-primary">
{chats.map((chat, index) => {
const sourcesLabel =
chat.sources.length === 0
? null
: chat.sources.length <= 2
? chat.sources
.map((s) => s.charAt(0).toUpperCase() + s.slice(1))
.join(', ')
: `${chat.sources
.slice(0, 2)
.map((s) => s.charAt(0).toUpperCase() + s.slice(1))
.join(', ')} + ${chat.sources.length - 2}`;

return (
<div
className={cn(
'flex flex-col space-y-4 py-6',
i !== chats.length - 1
? 'border-b border-white-200 dark:border-dark-200'
: '',
)}
key={i}
key={chat.id}
className={
'group flex flex-col gap-2 p-4 hover:bg-light-secondary dark:hover:bg-dark-secondary transition-colors duration-200 ' +
(index !== chats.length - 1
? 'border-b border-light-200 dark:border-dark-200'
: '')
}
>
<div className="flex items-start justify-between gap-3">
<Link
href={`/c/${chat.id}`}
className="text-black dark:text-white lg:text-xl font-medium truncate transition duration-200 hover:text-[#24A0ED] dark:hover:text-[#24A0ED] cursor-pointer"
className="flex-1 text-black dark:text-white text-base lg:text-lg font-medium leading-snug line-clamp-2 group-hover:text-[#24A0ED] transition duration-200"
title={chat.title}
>
{chat.title}
</Link>
<div className="flex flex-row items-center justify-between w-full">
<div className="flex flex-row items-center space-x-1 lg:space-x-1.5 text-black/70 dark:text-white/70">
<ClockIcon size={15} />
<p className="text-xs">
{formatTimeDifference(new Date(), chat.createdAt)} Ago
</p>
</div>
<div className="pt-0.5 shrink-0">
<DeleteChat
chatId={chat.id}
chats={chats}
@@ -104,7 +144,31 @@ const Page = () => {
/>
</div>
</div>
))}

<div className="flex flex-wrap items-center gap-2 text-black/70 dark:text-white/70">
<span className="inline-flex items-center gap-1 text-xs">
<ClockIcon size={14} />
{formatTimeDifference(new Date(), chat.createdAt)} Ago
</span>

{sourcesLabel && (
<span className="inline-flex items-center gap-1 text-xs border border-black/20 dark:border-white/20 rounded-full px-2 py-0.5">
<Globe2Icon size={14} />
{sourcesLabel}
</span>
)}
{chat.files.length > 0 && (
<span className="inline-flex items-center gap-1 text-xs border border-black/20 dark:border-white/20 rounded-full px-2 py-0.5">
<FileText size={14} />
{chat.files.length}{' '}
{chat.files.length === 1 ? 'file' : 'files'}
</span>
)}
</div>
</div>
);
})}
</div>
</div>
)}
</div>
src/components/AssistantSteps.tsx (new file, 266 lines)
@@ -0,0 +1,266 @@
|
||||
'use client';
|
||||
|
||||
import {
|
||||
Brain,
|
||||
Search,
|
||||
FileText,
|
||||
ChevronDown,
|
||||
ChevronUp,
|
||||
BookSearch,
|
||||
} from 'lucide-react';
|
||||
import { motion, AnimatePresence } from 'framer-motion';
|
||||
import { useEffect, useState } from 'react';
|
||||
import { ResearchBlock, ResearchBlockSubStep } from '@/lib/types';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
|
||||
const getStepIcon = (step: ResearchBlockSubStep) => {
|
||||
if (step.type === 'reasoning') {
|
||||
return <Brain className="w-4 h-4" />;
|
||||
} else if (step.type === 'searching' || step.type === 'upload_searching') {
|
||||
return <Search className="w-4 h-4" />;
|
||||
} else if (
|
||||
step.type === 'search_results' ||
|
||||
step.type === 'upload_search_results'
|
||||
) {
|
||||
return <FileText className="w-4 h-4" />;
|
||||
} else if (step.type === 'reading') {
|
||||
return <BookSearch className="w-4 h-4" />;
|
||||
}
|
||||
|
||||
return null;
|
||||
};
|
||||
|
||||
const getStepTitle = (
|
||||
step: ResearchBlockSubStep,
|
||||
isStreaming: boolean,
|
||||
): string => {
|
||||
if (step.type === 'reasoning') {
|
||||
return isStreaming && !step.reasoning ? 'Thinking...' : 'Thinking';
|
||||
} else if (step.type === 'searching') {
|
||||
return `Searching ${step.searching.length} ${step.searching.length === 1 ? 'query' : 'queries'}`;
|
||||
} else if (step.type === 'search_results') {
|
||||
return `Found ${step.reading.length} ${step.reading.length === 1 ? 'result' : 'results'}`;
|
||||
} else if (step.type === 'reading') {
|
||||
return `Reading ${step.reading.length} ${step.reading.length === 1 ? 'source' : 'sources'}`;
|
||||
} else if (step.type === 'upload_searching') {
|
||||
return 'Scanning your uploaded documents';
|
||||
} else if (step.type === 'upload_search_results') {
|
||||
return `Reading ${step.results.length} ${step.results.length === 1 ? 'document' : 'documents'}`;
|
||||
}
|
||||
|
||||
return 'Processing';
|
||||
};
|
||||
|
||||
const AssistantSteps = ({
|
||||
block,
|
||||
status,
|
||||
isLast,
|
||||
}: {
|
||||
block: ResearchBlock;
|
||||
status: 'answering' | 'completed' | 'error';
|
||||
isLast: boolean;
|
||||
}) => {
|
||||
const [isExpanded, setIsExpanded] = useState(
|
||||
isLast && status === 'answering' ? true : false,
|
||||
);
|
||||
const { researchEnded, loading } = useChat();
|
||||
|
||||
useEffect(() => {
|
||||
if (researchEnded && isLast) {
|
||||
setIsExpanded(false);
|
||||
} else if (status === 'answering' && isLast) {
|
||||
setIsExpanded(true);
|
||||
}
|
||||
}, [researchEnded, status]);
|
||||
|
||||
if (!block || block.data.subSteps.length === 0) return null;
|
||||
|
||||
return (
|
||||
<div className="rounded-lg bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 overflow-hidden">
|
||||
<button
|
||||
onClick={() => setIsExpanded(!isExpanded)}
|
||||
className="w-full flex items-center justify-between p-3 hover:bg-light-200 dark:hover:bg-dark-200 transition duration-200"
|
||||
>
|
||||
<div className="flex items-center gap-2">
|
||||
<Brain className="w-4 h-4 text-black dark:text-white" />
|
||||
<span className="text-sm font-medium text-black dark:text-white">
|
||||
Research Progress ({block.data.subSteps.length}{' '}
|
||||
{block.data.subSteps.length === 1 ? 'step' : 'steps'})
|
||||
</span>
|
||||
</div>
|
||||
{isExpanded ? (
|
||||
<ChevronUp className="w-4 h-4 text-black/70 dark:text-white/70" />
|
||||
) : (
|
||||
<ChevronDown className="w-4 h-4 text-black/70 dark:text-white/70" />
|
||||
)}
|
||||
</button>
|
||||
|
||||
<AnimatePresence>
|
||||
{isExpanded && (
|
||||
<motion.div
|
||||
initial={{ height: 0, opacity: 0 }}
|
||||
animate={{ height: 'auto', opacity: 1 }}
|
||||
exit={{ height: 0, opacity: 0 }}
|
||||
transition={{ duration: 0.2 }}
|
||||
className="border-t border-light-200 dark:border-dark-200"
|
||||
>
|
||||
<div className="p-3 space-y-2">
|
||||
{block.data.subSteps.map((step, index) => {
|
||||
const isLastStep = index === block.data.subSteps.length - 1;
|
||||
const isStreaming = loading && isLastStep && !researchEnded;
|
||||
|
||||
return (
|
||||
<motion.div
|
||||
key={step.id}
|
||||
initial={{ opacity: 0, x: -10 }}
|
||||
animate={{ opacity: 1, x: 0 }}
|
||||
transition={{ duration: 0.2, delay: 0 }}
|
||||
className="flex gap-2"
|
||||
>
|
||||
<div className="flex flex-col items-center -mt-0.5">
|
||||
<div
|
||||
className={`rounded-full p-1.5 bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 ${isStreaming ? 'animate-pulse' : ''}`}
|
||||
>
|
||||
{getStepIcon(step)}
|
||||
</div>
|
||||
{index < block.data.subSteps.length - 1 && (
|
||||
<div className="w-0.5 flex-1 min-h-[20px] bg-light-200 dark:bg-dark-200 mt-1.5" />
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="flex-1 pb-1">
|
||||
<span className="text-sm font-medium text-black dark:text-white">
|
||||
{getStepTitle(step, isStreaming)}
|
||||
</span>
|
||||
|
||||
{step.type === 'reasoning' && (
|
||||
<>
|
||||
{step.reasoning && (
|
||||
<p className="text-xs text-black/70 dark:text-white/70 mt-0.5">
|
||||
{step.reasoning}
|
||||
</p>
|
||||
)}
|
||||
{isStreaming && !step.reasoning && (
|
||||
<div className="flex items-center gap-1.5 mt-0.5">
|
||||
<div
|
||||
className="w-1.5 h-1.5 bg-black/40 dark:bg-white/40 rounded-full animate-bounce"
|
||||
style={{ animationDelay: '0ms' }}
|
||||
/>
|
||||
<div
|
||||
className="w-1.5 h-1.5 bg-black/40 dark:bg-white/40 rounded-full animate-bounce"
|
||||
style={{ animationDelay: '150ms' }}
|
||||
/>
|
||||
<div
|
||||
className="w-1.5 h-1.5 bg-black/40 dark:bg-white/40 rounded-full animate-bounce"
|
||||
style={{ animationDelay: '300ms' }}
|
||||
/>
|
||||
</div>
|
||||
)}
|
||||
</>
|
||||
)}
|
||||
|
||||
{step.type === 'searching' &&
|
||||
step.searching.length > 0 && (
|
||||
<div className="flex flex-wrap gap-1.5 mt-1.5">
|
||||
{step.searching.map((query, idx) => (
|
||||
<span
|
||||
key={idx}
|
||||
className="inline-flex items-center px-2 py-0.5 rounded-md text-xs font-medium bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 border border-light-200 dark:border-dark-200"
|
||||
>
|
||||
{query}
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{(step.type === 'search_results' ||
|
||||
step.type === 'reading') &&
|
||||
step.reading.length > 0 && (
|
||||
<div className="flex flex-wrap gap-1.5 mt-1.5">
|
||||
{step.reading.slice(0, 4).map((result, idx) => {
|
||||
const url = result.metadata.url || '';
|
||||
const title = result.metadata.title || 'Untitled';
|
||||
const domain = url ? new URL(url).hostname : '';
|
||||
const faviconUrl = domain
|
||||
? `https://s2.googleusercontent.com/s2/favicons?domain=${domain}&sz=128`
|
||||
: '';
|
||||
|
||||
return (
|
||||
<a
|
||||
key={idx}
|
||||
href={url}
|
||||
target="_blank"
|
||||
className="inline-flex items-center gap-1.5 px-2 py-0.5 rounded-md text-xs font-medium bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 border border-light-200 dark:border-dark-200"
|
||||
>
|
||||
{faviconUrl && (
|
||||
<img
|
||||
src={faviconUrl}
|
||||
alt=""
|
||||
className="w-3 h-3 rounded-sm flex-shrink-0"
|
||||
onError={(e) => {
|
||||
e.currentTarget.style.display = 'none';
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
<span className="line-clamp-1">{title}</span>
|
||||
</a>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{step.type === 'upload_searching' &&
|
||||
step.queries.length > 0 && (
|
||||
<div className="flex flex-wrap gap-1.5 mt-1.5">
|
||||
{step.queries.map((query, idx) => (
|
||||
<span
|
||||
key={idx}
|
||||
className="inline-flex items-center px-2 py-0.5 rounded-md text-xs font-medium bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 border border-light-200 dark:border-dark-200"
|
||||
>
|
||||
{query}
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{step.type === 'upload_search_results' &&
|
||||
step.results.length > 0 && (
|
||||
<div className="mt-1.5 grid gap-3 lg:grid-cols-3">
|
||||
{step.results.slice(0, 4).map((result, idx) => {
|
||||
const title =
|
||||
(result.metadata &&
|
||||
(result.metadata.title ||
|
||||
result.metadata.fileName)) ||
|
||||
'Untitled document';
|
||||
|
||||
return (
|
||||
<div
|
||||
key={idx}
|
||||
className="flex flex-row space-x-3 rounded-lg border border-light-200 dark:border-dark-200 bg-light-100 dark:bg-dark-100 p-2 cursor-pointer"
|
||||
>
|
||||
<div className="mt-0.5 h-10 w-10 rounded-md bg-cyan-100 text-cyan-800 dark:bg-sky-500 dark:text-cyan-50 flex items-center justify-center">
|
||||
<FileText className="w-5 h-5" />
|
||||
</div>
|
||||
<div className="flex flex-col justify-center">
|
||||
<p className="text-[13px] text-black dark:text-white line-clamp-1">
|
||||
{title}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</motion.div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
</motion.div>
|
||||
)}
|
||||
</AnimatePresence>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default AssistantSteps;
|
||||
@@ -7,11 +7,12 @@ import MessageBoxLoading from './MessageBoxLoading';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
|
||||
const Chat = () => {
|
||||
const { sections, chatTurns, loading, messageAppeared } = useChat();
|
||||
const { sections, loading, messageAppeared, messages } = useChat();
|
||||
|
||||
const [dividerWidth, setDividerWidth] = useState(0);
|
||||
const dividerRef = useRef<HTMLDivElement | null>(null);
|
||||
const messageEnd = useRef<HTMLDivElement | null>(null);
|
||||
const lastScrolledRef = useRef<number>(0);
|
||||
|
||||
useEffect(() => {
|
||||
const updateDividerWidth = () => {
|
||||
@@ -22,43 +23,48 @@ const Chat = () => {
|
||||
|
||||
updateDividerWidth();
|
||||
|
||||
const resizeObserver = new ResizeObserver(() => {
|
||||
updateDividerWidth();
|
||||
});
|
||||
|
||||
const currentRef = dividerRef.current;
|
||||
if (currentRef) {
|
||||
resizeObserver.observe(currentRef);
|
||||
}
|
||||
|
||||
window.addEventListener('resize', updateDividerWidth);
|
||||
|
||||
return () => {
|
||||
if (currentRef) {
|
||||
resizeObserver.unobserve(currentRef);
|
||||
}
|
||||
resizeObserver.disconnect();
|
||||
window.removeEventListener('resize', updateDividerWidth);
|
||||
};
|
||||
}, []);
|
||||
}, [sections.length]);
|
||||
|
||||
useEffect(() => {
|
||||
const scroll = () => {
|
||||
messageEnd.current?.scrollIntoView({ behavior: 'auto' });
|
||||
};
|
||||
|
||||
if (chatTurns.length === 1) {
|
||||
document.title = `${chatTurns[0].content.substring(0, 30)} - Perplexica`;
|
||||
if (messages.length === 1) {
|
||||
document.title = `${messages[0].query.substring(0, 30)} - Perplexica`;
|
||||
}
|
||||
|
||||
const messageEndBottom =
|
||||
messageEnd.current?.getBoundingClientRect().bottom ?? 0;
|
||||
|
||||
const distanceFromMessageEnd = window.innerHeight - messageEndBottom;
|
||||
|
||||
if (distanceFromMessageEnd >= -100) {
|
||||
if (sections.length > lastScrolledRef.current) {
|
||||
scroll();
|
||||
lastScrolledRef.current = sections.length;
|
||||
}
|
||||
|
||||
if (chatTurns[chatTurns.length - 1]?.role === 'user') {
|
||||
scroll();
|
||||
}
|
||||
}, [chatTurns]);
|
||||
}, [messages]);
|
||||
|
||||
return (
|
||||
<div className="flex flex-col space-y-6 pt-8 pb-44 lg:pb-32 sm:mx-4 md:mx-8">
|
||||
<div className="flex flex-col space-y-6 pt-8 pb-44 lg:pb-28 sm:mx-4 md:mx-8">
|
||||
{sections.map((section, i) => {
|
||||
const isLast = i === sections.length - 1;
|
||||
|
||||
return (
|
||||
<Fragment key={section.userMessage.messageId}>
|
||||
<Fragment key={section.message.messageId}>
|
||||
<MessageBox
|
||||
section={section}
|
||||
sectionIndex={i}
|
||||
@@ -74,10 +80,21 @@ const Chat = () => {
|
||||
{loading && !messageAppeared && <MessageBoxLoading />}
|
||||
<div ref={messageEnd} className="h-0" />
|
||||
{dividerWidth > 0 && (
|
||||
<div className="fixed z-40 bottom-24 lg:bottom-6" style={{ width: dividerWidth }}>
|
||||
<div
|
||||
className="bottom-24 lg:bottom-10 fixed z-40"
|
||||
style={{ width: dividerWidth }}
|
||||
>
|
||||
className="pointer-events-none absolute -bottom-6 left-0 right-0 h-[calc(100%+24px+24px)] dark:hidden"
|
||||
style={{
|
||||
background:
|
||||
'linear-gradient(to top, #ffffff 0%, #ffffff 35%, rgba(255,255,255,0.95) 45%, rgba(255,255,255,0.85) 55%, rgba(255,255,255,0.7) 65%, rgba(255,255,255,0.5) 75%, rgba(255,255,255,0.3) 85%, rgba(255,255,255,0.1) 92%, transparent 100%)',
|
||||
}}
|
||||
/>
|
||||
<div
|
||||
className="pointer-events-none absolute -bottom-6 left-0 right-0 h-[calc(100%+24px+24px)] hidden dark:block"
|
||||
style={{
|
||||
background:
|
||||
'linear-gradient(to top, #0d1117 0%, #0d1117 35%, rgba(13,17,23,0.95) 45%, rgba(13,17,23,0.85) 55%, rgba(13,17,23,0.7) 65%, rgba(13,17,23,0.5) 75%, rgba(13,17,23,0.3) 85%, rgba(13,17,23,0.1) 92%, transparent 100%)',
|
||||
}}
|
||||
/>
|
||||
<MessageInput />
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -1,15 +1,13 @@
|
||||
'use client';
|
||||
|
||||
import { Document } from '@langchain/core/documents';
|
||||
import Navbar from './Navbar';
|
||||
import Chat from './Chat';
|
||||
import EmptyChat from './EmptyChat';
|
||||
import { Settings } from 'lucide-react';
|
||||
import Link from 'next/link';
|
||||
import NextError from 'next/error';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
import Loader from './ui/Loader';
|
||||
import SettingsButtonMobile from './Settings/SettingsButtonMobile';
|
||||
import { Block } from '@/lib/types';
|
||||
import Loader from './ui/Loader';
|
||||
|
||||
export interface BaseMessage {
|
||||
chatId: string;
|
||||
@@ -17,42 +15,27 @@ export interface BaseMessage {
|
||||
createdAt: Date;
|
||||
}
|
||||
|
||||
export interface AssistantMessage extends BaseMessage {
|
||||
role: 'assistant';
|
||||
content: string;
|
||||
suggestions?: string[];
|
||||
export interface Message extends BaseMessage {
|
||||
backendId: string;
|
||||
query: string;
|
||||
responseBlocks: Block[];
|
||||
status: 'answering' | 'completed' | 'error';
|
||||
}
|
||||
|
||||
export interface UserMessage extends BaseMessage {
|
||||
role: 'user';
|
||||
content: string;
|
||||
}
|
||||
|
||||
export interface SourceMessage extends BaseMessage {
|
||||
role: 'source';
|
||||
sources: Document[];
|
||||
}
|
||||
|
||||
export interface SuggestionMessage extends BaseMessage {
|
||||
role: 'suggestion';
|
||||
suggestions: string[];
|
||||
}
|
||||
|
||||
export type Message =
|
||||
| AssistantMessage
|
||||
| UserMessage
|
||||
| SourceMessage
|
||||
| SuggestionMessage;
|
||||
export type ChatTurn = UserMessage | AssistantMessage;
|
||||
|
||||
export interface File {
|
||||
fileName: string;
|
||||
fileExtension: string;
|
||||
fileId: string;
|
||||
}
|
||||
|
||||
export interface Widget {
|
||||
widgetType: string;
|
||||
params: Record<string, any>;
|
||||
}
|
||||
|
||||
const ChatWindow = () => {
|
||||
const { hasError, isReady, notFound, messages } = useChat();
|
||||
const { hasError, notFound, messages, isReady } = useChat();
|
||||
|
||||
if (hasError) {
|
||||
return (
|
||||
<div className="relative">
|
||||
@@ -84,7 +67,7 @@ const ChatWindow = () => {
|
||||
</div>
|
||||
)
|
||||
) : (
|
||||
<div className="flex flex-row items-center justify-center min-h-screen">
|
||||
<div className="flex items-center justify-center min-h-screen w-full">
|
||||
<Loader />
|
||||
</div>
|
||||
);
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
import { ArrowRight } from 'lucide-react';
|
||||
import { useEffect, useRef, useState } from 'react';
|
||||
import TextareaAutosize from 'react-textarea-autosize';
|
||||
import Focus from './MessageInputActions/Focus';
|
||||
import Sources from './MessageInputActions/Sources';
|
||||
import Optimization from './MessageInputActions/Optimization';
|
||||
import Attach from './MessageInputActions/Attach';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
@@ -68,8 +68,8 @@ const EmptyChatMessageInput = () => {
|
||||
<Optimization />
|
||||
<div className="flex flex-row items-center space-x-2">
|
||||
<div className="flex flex-row items-center space-x-1">
|
||||
<Sources />
|
||||
<ModelSelector />
|
||||
<Focus />
|
||||
<Attach />
|
||||
</div>
|
||||
<button
|
||||
|
||||
@@ -2,6 +2,7 @@ import { Check, ClipboardList } from 'lucide-react';
|
||||
import { Message } from '../ChatWindow';
|
||||
import { useState } from 'react';
|
||||
import { Section } from '@/lib/hooks/useChat';
|
||||
import { SourceBlock } from '@/lib/types';
|
||||
|
||||
const Copy = ({
|
||||
section,
|
||||
@@ -15,14 +16,31 @@ const Copy = ({
|
||||
return (
|
||||
<button
|
||||
onClick={() => {
|
||||
const contentToCopy = `${initialMessage}${section?.sourceMessage?.sources && section.sourceMessage.sources.length > 0 && `\n\nCitations:\n${section.sourceMessage.sources?.map((source: any, i: any) => `[${i + 1}] ${source.metadata.url}`).join(`\n`)}`}`;
|
||||
const sources = section.message.responseBlocks.filter(
|
||||
(b) => b.type === 'source' && b.data.length > 0,
|
||||
) as SourceBlock[];
|
||||
|
||||
const contentToCopy = `${initialMessage}${
|
||||
sources.length > 0
|
||||
? `\n\nCitations:\n${sources
|
||||
.map((source) => source.data)
|
||||
.flat()
|
||||
.map(
|
||||
(s, i) =>
|
||||
`[${i + 1}] ${s.metadata.url.startsWith('file_id://') ? s.metadata.fileName || 'Uploaded File' : s.metadata.url}`,
|
||||
)
|
||||
.join(`\n`)}`
|
||||
: ''
|
||||
}`;
|
||||
|
||||
navigator.clipboard.writeText(contentToCopy);
|
||||
|
||||
setCopied(true);
|
||||
setTimeout(() => setCopied(false), 1000);
|
||||
}}
|
||||
className="p-2 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
|
||||
className="p-2 text-black/70 dark:text-white/70 rounded-full hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
|
||||
>
|
||||
{copied ? <Check size={18} /> : <ClipboardList size={18} />}
|
||||
{copied ? <Check size={16} /> : <ClipboardList size={16} />}
|
||||
</button>
|
||||
);
|
||||
};
|
||||
|
||||
@@ -1,4 +1,4 @@
import { ArrowLeftRight } from 'lucide-react';
import { ArrowLeftRight, Repeat } from 'lucide-react';

const Rewrite = ({
rewrite,
@@ -10,12 +10,11 @@ const Rewrite = ({
return (
<button
onClick={() => rewrite(messageId)}
className="py-2 px-3 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white flex flex-row items-center space-x-1"
className="p-2 text-black/70 dark:text-white/70 rounded-full hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white flex flex-row items-center space-x-1"
>
<ArrowLeftRight size={18} />
<p className="text-xs font-medium">Rewrite</p>
<Repeat size={16} />
</button>
);
};
export default Rewrite;
@@ -10,8 +10,9 @@ import {
|
||||
StopCircle,
|
||||
Layers3,
|
||||
Plus,
|
||||
CornerDownRight,
|
||||
} from 'lucide-react';
|
||||
import Markdown, { MarkdownToJSX } from 'markdown-to-jsx';
|
||||
import Markdown, { MarkdownToJSX, RuleType } from 'markdown-to-jsx';
|
||||
import Copy from './MessageActions/Copy';
|
||||
import Rewrite from './MessageActions/Rewrite';
|
||||
import MessageSources from './MessageSources';
|
||||
@@ -20,7 +21,11 @@ import SearchVideos from './SearchVideos';
|
||||
import { useSpeech } from 'react-text-to-speech';
|
||||
import ThinkBox from './ThinkBox';
|
||||
import { useChat, Section } from '@/lib/hooks/useChat';
|
||||
import Citation from './Citation';
|
||||
import Citation from './MessageRenderer/Citation';
|
||||
import AssistantSteps from './AssistantSteps';
|
||||
import { ResearchBlock } from '@/lib/types';
|
||||
import Renderer from './Widgets/Renderer';
|
||||
import CodeBlock from './MessageRenderer/CodeBlock';
|
||||
|
||||
const ThinkTagProcessor = ({
|
||||
children,
|
||||
@@ -45,15 +50,46 @@ const MessageBox = ({
|
||||
dividerRef?: MutableRefObject<HTMLDivElement | null>;
|
||||
isLast: boolean;
|
||||
}) => {
|
||||
const { loading, chatTurns, sendMessage, rewrite } = useChat();
|
||||
const {
|
||||
loading,
|
||||
sendMessage,
|
||||
rewrite,
|
||||
messages,
|
||||
researchEnded,
|
||||
chatHistory,
|
||||
} = useChat();
|
||||
|
||||
const parsedMessage = section.parsedAssistantMessage || '';
|
||||
const parsedMessage = section.parsedTextBlocks.join('\n\n');
|
||||
const speechMessage = section.speechMessage || '';
|
||||
const thinkingEnded = section.thinkingEnded;
|
||||
|
||||
const sourceBlocks = section.message.responseBlocks.filter(
|
||||
(block): block is typeof block & { type: 'source' } =>
|
||||
block.type === 'source',
|
||||
);
|
||||
|
||||
const sources = sourceBlocks.flatMap((block) => block.data);
|
||||
|
||||
const hasContent = section.parsedTextBlocks.length > 0;
|
||||
|
||||
const { speechStatus, start, stop } = useSpeech({ text: speechMessage });
|
||||
|
||||
const markdownOverrides: MarkdownToJSX.Options = {
|
||||
renderRule(next, node, renderChildren, state) {
|
||||
if (node.type === RuleType.codeInline) {
|
||||
return `\`${node.text}\``;
|
||||
}
|
||||
|
||||
if (node.type === RuleType.codeBlock) {
|
||||
return (
|
||||
<CodeBlock key={state.key} language={node.lang || ''}>
|
||||
{node.text}
|
||||
</CodeBlock>
|
||||
);
|
||||
}
|
||||
|
||||
return next();
|
||||
},
|
||||
overrides: {
|
||||
think: {
|
||||
component: ThinkTagProcessor,
|
||||
@@ -71,7 +107,7 @@ const MessageBox = ({
|
||||
<div className="space-y-6">
|
||||
<div className={'w-full pt-8 break-words'}>
|
||||
<h2 className="text-black dark:text-white font-medium text-3xl lg:w-9/12">
|
||||
{section.userMessage.content}
|
||||
{section.message.query}
|
||||
</h2>
|
||||
</div>
|
||||
|
||||
@@ -80,8 +116,7 @@ const MessageBox = ({
|
||||
ref={dividerRef}
|
||||
className="flex flex-col space-y-6 w-full lg:w-9/12"
|
||||
>
|
||||
{section.sourceMessage &&
|
||||
section.sourceMessage.sources.length > 0 && (
|
||||
{sources.length > 0 && (
|
||||
<div className="flex flex-col space-y-2">
|
||||
<div className="flex flex-row items-center space-x-2">
|
||||
<BookCopy className="text-black dark:text-white" size={20} />
|
||||
@@ -89,12 +124,43 @@ const MessageBox = ({
|
||||
Sources
|
||||
</h3>
|
||||
</div>
|
||||
<MessageSources sources={section.sourceMessage.sources} />
|
||||
<MessageSources sources={sources} />
|
||||
</div>
|
||||
)}
|
||||
|
||||
{section.message.responseBlocks
|
||||
.filter(
|
||||
(block): block is ResearchBlock =>
|
||||
block.type === 'research' && block.data.subSteps.length > 0,
|
||||
)
|
||||
.map((researchBlock) => (
|
||||
<div key={researchBlock.id} className="flex flex-col space-y-2">
|
||||
<AssistantSteps
|
||||
block={researchBlock}
|
||||
status={section.message.status}
|
||||
isLast={isLast}
|
||||
/>
|
||||
</div>
|
||||
))}
|
||||
|
||||
{isLast &&
|
||||
loading &&
|
||||
!researchEnded &&
|
||||
!section.message.responseBlocks.some(
|
||||
(b) => b.type === 'research' && b.data.subSteps.length > 0,
|
||||
) && (
|
||||
<div className="flex items-center gap-2 p-3 rounded-lg bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200">
|
||||
<Disc3 className="w-4 h-4 text-black dark:text-white animate-spin" />
|
||||
<span className="text-sm text-black/70 dark:text-white/70">
|
||||
Brainstorming...
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{section.widgets.length > 0 && <Renderer widgets={section.widgets} />}
|
||||
|
||||
<div className="flex flex-col space-y-2">
|
||||
{section.sourceMessage && (
|
||||
{sources.length > 0 && (
|
||||
<div className="flex flex-row items-center space-x-2">
|
||||
<Disc3
|
||||
className={cn(
|
||||
@@ -109,7 +175,7 @@ const MessageBox = ({
|
||||
</div>
|
||||
)}
|
||||
|
||||
{section.assistantMessage && (
|
||||
{hasContent && (
|
||||
<>
|
||||
<Markdown
|
||||
className={cn(
|
||||
@@ -122,18 +188,15 @@ const MessageBox = ({
|
||||
</Markdown>
|
||||
|
||||
{loading && isLast ? null : (
|
||||
<div className="flex flex-row items-center justify-between w-full text-black dark:text-white py-4 -mx-2">
|
||||
<div className="flex flex-row items-center space-x-1">
|
||||
<div className="flex flex-row items-center justify-between w-full text-black dark:text-white py-4">
|
||||
<div className="flex flex-row items-center -ml-2">
|
||||
<Rewrite
|
||||
rewrite={rewrite}
|
||||
messageId={section.assistantMessage.messageId}
|
||||
messageId={section.message.messageId}
|
||||
/>
|
||||
</div>
|
||||
<div className="flex flex-row items-center space-x-1">
|
||||
<Copy
|
||||
initialMessage={section.assistantMessage.content}
|
||||
section={section}
|
||||
/>
|
||||
<div className="flex flex-row items-center -mr-2">
|
||||
<Copy initialMessage={parsedMessage} section={section} />
|
||||
<button
|
||||
onClick={() => {
|
||||
if (speechStatus === 'started') {
|
||||
@@ -142,12 +205,12 @@ const MessageBox = ({
|
||||
start();
|
||||
}
|
||||
}}
|
||||
className="p-2 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
|
||||
className="p-2 text-black/70 dark:text-white/70 rounded-full hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
|
||||
>
|
||||
{speechStatus === 'started' ? (
|
||||
<StopCircle size={18} />
|
||||
<StopCircle size={16} />
|
||||
) : (
|
||||
<Volume2 size={18} />
|
||||
<Volume2 size={16} />
|
||||
)}
|
||||
</button>
|
||||
</div>
|
||||
@@ -157,9 +220,9 @@ const MessageBox = ({
|
||||
{isLast &&
|
||||
section.suggestions &&
|
||||
section.suggestions.length > 0 &&
|
||||
section.assistantMessage &&
|
||||
hasContent &&
|
||||
!loading && (
|
||||
<div className="mt-8 pt-6 border-t border-light-200/50 dark:border-dark-200/50">
|
||||
<div className="mt-6">
|
||||
<div className="flex flex-row items-center space-x-2 mb-4">
|
||||
<Layers3
|
||||
className="text-black dark:text-white"
|
||||
@@ -173,20 +236,24 @@ const MessageBox = ({
|
||||
{section.suggestions.map(
|
||||
(suggestion: string, i: number) => (
|
||||
<div key={i}>
|
||||
{i > 0 && (
|
||||
<div className="h-px bg-light-200/40 dark:bg-dark-200/40 mx-3" />
|
||||
)}
|
||||
<div className="h-px bg-light-200/40 dark:bg-dark-200/40" />
|
||||
<button
|
||||
onClick={() => sendMessage(suggestion)}
|
||||
className="group w-full px-3 py-4 text-left transition-colors duration-200"
|
||||
className="group w-full py-4 text-left transition-colors duration-200"
|
||||
>
|
||||
<div className="flex items-center justify-between gap-3">
|
||||
<p className="text-sm text-black/70 dark:text-white/70 group-hover:text-[#24A0ED] transition-colors duration-200 leading-relaxed">
|
||||
<div className="flex flex-row space-x-3 items-center">
|
||||
<CornerDownRight
|
||||
size={15}
|
||||
className="group-hover:text-sky-400 transition-colors duration-200 flex-shrink-0"
|
||||
/>
|
||||
<p className="text-sm text-black/70 dark:text-white/70 group-hover:text-sky-400 transition-colors duration-200 leading-relaxed">
|
||||
{suggestion}
|
||||
</p>
|
||||
</div>
|
||||
<Plus
|
||||
size={16}
|
||||
className="text-black/40 dark:text-white/40 group-hover:text-[#24A0ED] transition-colors duration-200 flex-shrink-0"
|
||||
className="text-black/40 dark:text-white/40 group-hover:text-sky-400 transition-colors duration-200 flex-shrink-0"
|
||||
/>
|
||||
</div>
|
||||
</button>
|
||||
@@ -201,17 +268,17 @@ const MessageBox = ({
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{section.assistantMessage && (
|
||||
{hasContent && (
|
||||
<div className="lg:sticky lg:top-20 flex flex-col items-center space-y-3 w-full lg:w-3/12 z-30 h-full pb-4">
|
||||
<SearchImages
|
||||
query={section.userMessage.content}
|
||||
chatHistory={chatTurns.slice(0, sectionIndex * 2)}
|
||||
messageId={section.assistantMessage.messageId}
|
||||
query={section.message.query}
|
||||
chatHistory={chatHistory}
|
||||
messageId={section.message.messageId}
|
||||
/>
|
||||
<SearchVideos
|
||||
chatHistory={chatTurns.slice(0, sectionIndex * 2)}
|
||||
query={section.userMessage.content}
|
||||
messageId={section.assistantMessage.messageId}
|
||||
chatHistory={chatHistory}
|
||||
query={section.message.query}
|
||||
messageId={section.message.messageId}
|
||||
/>
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -2,9 +2,6 @@ import { cn } from '@/lib/utils';
|
||||
import { ArrowUp } from 'lucide-react';
|
||||
import { useEffect, useRef, useState } from 'react';
|
||||
import TextareaAutosize from 'react-textarea-autosize';
|
||||
import Attach from './MessageInputActions/Attach';
|
||||
import CopilotToggle from './MessageInputActions/Copilot';
|
||||
import { File } from './ChatWindow';
|
||||
import AttachSmall from './MessageInputActions/AttachSmall';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
|
||||
@@ -64,7 +61,7 @@ const MessageInput = () => {
|
||||
}
|
||||
}}
|
||||
className={cn(
|
||||
'bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-hidden border border-light-200 dark:border-dark-200 shadow-sm shadow-light-200/10 dark:shadow-black/20 transition-all duration-200 focus-within:border-light-300 dark:focus-within:border-dark-300',
|
||||
'relative bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-visible border border-light-200 dark:border-dark-200 shadow-sm shadow-light-200/10 dark:shadow-black/20 transition-all duration-200 focus-within:border-light-300 dark:focus-within:border-dark-300',
|
||||
mode === 'multi' ? 'flex-col rounded-2xl' : 'flex-row rounded-full',
|
||||
)}
|
||||
>
|
||||
@@ -80,11 +77,16 @@ const MessageInput = () => {
|
||||
placeholder="Ask a follow-up"
|
||||
/>
|
||||
{mode === 'single' && (
|
||||
<div className="flex flex-row items-center space-x-4">
|
||||
<CopilotToggle
|
||||
copilotEnabled={copilotEnabled}
|
||||
setCopilotEnabled={setCopilotEnabled}
|
||||
/>
|
||||
<button
|
||||
disabled={message.trim().length === 0 || loading}
|
||||
className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
|
||||
>
|
||||
<ArrowUp className="bg-background" size={17} />
|
||||
</button>
|
||||
)}
|
||||
{mode === 'multi' && (
|
||||
<div className="flex flex-row items-center justify-between w-full pt-2">
|
||||
<AttachSmall />
|
||||
<button
|
||||
disabled={message.trim().length === 0 || loading}
|
||||
className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
|
||||
@@ -93,23 +95,6 @@ const MessageInput = () => {
|
||||
</button>
|
||||
</div>
|
||||
)}
|
||||
{mode === 'multi' && (
|
||||
<div className="flex flex-row items-center justify-between w-full pt-2">
|
||||
<AttachSmall />
|
||||
<div className="flex flex-row items-center space-x-4">
|
||||
<CopilotToggle
|
||||
copilotEnabled={copilotEnabled}
|
||||
setCopilotEnabled={setCopilotEnabled}
|
||||
/>
|
||||
<button
|
||||
disabled={message.trim().length === 0 || loading}
|
||||
className="bg-[#24A0ED] text-white text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
|
||||
>
|
||||
<ArrowUp className="bg-background" size={17} />
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</form>
|
||||
);
|
||||
};
|
||||
|
||||
@@ -16,6 +16,8 @@ import {
|
||||
} from 'lucide-react';
|
||||
import { Fragment, useRef, useState } from 'react';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
import { AnimatePresence } from 'motion/react';
|
||||
import { motion } from 'framer-motion';
|
||||
|
||||
const Attach = () => {
|
||||
const { files, setFiles, setFileIds, fileIds } = useChat();
|
||||
@@ -53,29 +55,33 @@ const Attach = () => {
|
||||
|
||||
return loading ? (
|
||||
<div className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none text-black/50 dark:text-white/50 transition duration-200">
|
||||
<LoaderCircle size={16} className="text-sky-400 animate-spin" />
|
||||
<LoaderCircle size={16} className="text-sky-500 animate-spin" />
|
||||
</div>
|
||||
) : files.length > 0 ? (
|
||||
<Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
|
||||
{({ open }) => (
|
||||
<>
|
||||
<PopoverButton
|
||||
type="button"
|
||||
className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none headless-open:text-black dark:headless-open:text-white text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
|
||||
>
|
||||
<File size={16} className="text-sky-400" />
|
||||
<File size={16} className="text-sky-500" />
|
||||
</PopoverButton>
|
||||
<Transition
|
||||
as={Fragment}
|
||||
enter="transition ease-out duration-150"
|
||||
enterFrom="opacity-0 translate-y-1"
|
||||
enterTo="opacity-100 translate-y-0"
|
||||
leave="transition ease-in duration-150"
|
||||
leaveFrom="opacity-100 translate-y-0"
|
||||
leaveTo="opacity-0 translate-y-1"
|
||||
<AnimatePresence>
|
||||
{open && (
|
||||
<PopoverPanel
|
||||
className="absolute z-10 w-64 md:w-[350px] right-0"
|
||||
static
|
||||
>
|
||||
<motion.div
|
||||
initial={{ opacity: 0, scale: 0.9 }}
|
||||
animate={{ opacity: 1, scale: 1 }}
|
||||
exit={{ opacity: 0, scale: 0.9 }}
|
||||
transition={{ duration: 0.1, ease: 'easeOut' }}
|
||||
className="origin-top-right bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col"
|
||||
>
|
||||
<PopoverPanel className="absolute z-10 w-64 md:w-[350px] right-0">
|
||||
<div className="bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col">
|
||||
<div className="flex flex-row items-center justify-between px-3 py-2">
|
||||
<h4 className="text-black dark:text-white font-medium text-sm">
|
||||
<h4 className="text-black/70 dark:text-white/70 text-sm">
|
||||
Attached files
|
||||
</h4>
|
||||
<div className="flex flex-row items-center space-x-4">
|
||||
@@ -102,7 +108,7 @@ const Attach = () => {
|
||||
}}
|
||||
className="flex flex-row items-center space-x-1 text-black/70 dark:text-white/70 hover:text-black hover:dark:text-white transition duration-200 focus:outline-none"
|
||||
>
|
||||
<Trash size={14} />
|
||||
<Trash size={13} />
|
||||
<p className="text-xs">Clear</p>
|
||||
</button>
|
||||
</div>
|
||||
@@ -114,15 +120,17 @@ const Attach = () => {
|
||||
key={i}
|
||||
className="flex flex-row items-center justify-start w-full space-x-3 p-3"
|
||||
>
|
||||
<div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-10 h-10 rounded-md">
|
||||
<div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-9 h-9 rounded-md">
|
||||
<File
|
||||
size={16}
|
||||
className="text-black/70 dark:text-white/70"
|
||||
/>
|
||||
</div>
|
||||
<p className="text-black/70 dark:text-white/70 text-sm">
|
||||
<p className="text-black/70 dark:text-white/70 text-xs">
|
||||
{file.fileName.length > 25
|
||||
? file.fileName.replace(/\.\w+$/, '').substring(0, 25) +
|
||||
? file.fileName
|
||||
.replace(/\.\w+$/, '')
|
||||
.substring(0, 25) +
|
||||
'...' +
|
||||
file.fileExtension
|
||||
: file.fileName}
|
||||
@@ -130,9 +138,12 @@ const Attach = () => {
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
</motion.div>
|
||||
</PopoverPanel>
|
||||
</Transition>
|
||||
)}
|
||||
</AnimatePresence>
|
||||
</>
|
||||
)}
|
||||
</Popover>
|
||||
) : (
|
||||
<button
|
||||
|
||||
@@ -1,21 +1,14 @@
|
||||
import { cn } from '@/lib/utils';
|
||||
import {
|
||||
Popover,
|
||||
PopoverButton,
|
||||
PopoverPanel,
|
||||
Transition,
|
||||
} from '@headlessui/react';
|
||||
import {
|
||||
CopyPlus,
|
||||
File,
|
||||
LoaderCircle,
|
||||
Paperclip,
|
||||
Plus,
|
||||
Trash,
|
||||
} from 'lucide-react';
|
||||
import { File, LoaderCircle, Paperclip, Plus, Trash } from 'lucide-react';
|
||||
import { Fragment, useRef, useState } from 'react';
|
||||
import { File as FileType } from '../ChatWindow';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
import { AnimatePresence } from 'motion/react';
|
||||
import { motion } from 'framer-motion';
|
||||
|
||||
const AttachSmall = () => {
|
||||
const { files, setFiles, setFileIds, fileIds } = useChat();
|
||||
@@ -53,29 +46,33 @@ const AttachSmall = () => {
|
||||
|
||||
return loading ? (
|
||||
<div className="flex flex-row items-center justify-between space-x-1 p-1 ">
|
||||
<LoaderCircle size={20} className="text-sky-400 animate-spin" />
|
||||
<LoaderCircle size={20} className="text-sky-500 animate-spin" />
|
||||
</div>
|
||||
) : files.length > 0 ? (
|
||||
<Popover className="max-w-[15rem] md:max-w-md lg:max-w-lg">
|
||||
{({ open }) => (
|
||||
<>
|
||||
<PopoverButton
|
||||
type="button"
|
||||
className="flex flex-row items-center justify-between space-x-1 p-1 text-black/50 dark:text-white/50 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
|
||||
>
|
||||
<File size={20} className="text-sky-400" />
|
||||
<File size={20} className="text-sky-500" />
|
||||
</PopoverButton>
|
||||
<Transition
|
||||
as={Fragment}
|
||||
enter="transition ease-out duration-150"
|
||||
enterFrom="opacity-0 translate-y-1"
|
||||
enterTo="opacity-100 translate-y-0"
|
||||
leave="transition ease-in duration-150"
|
||||
leaveFrom="opacity-100 translate-y-0"
|
||||
leaveTo="opacity-0 translate-y-1"
|
||||
<AnimatePresence>
|
||||
{open && (
|
||||
<PopoverPanel
|
||||
className="absolute z-10 w-64 md:w-[350px] bottom-14"
|
||||
static
|
||||
>
|
||||
<motion.div
|
||||
initial={{ opacity: 0, scale: 0.9 }}
|
||||
animate={{ opacity: 1, scale: 1 }}
|
||||
exit={{ opacity: 0, scale: 0.9 }}
|
||||
transition={{ duration: 0.1, ease: 'easeOut' }}
|
||||
className="origin-bottom-left bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col"
|
||||
>
|
||||
<PopoverPanel className="absolute z-10 w-64 md:w-[350px] bottom-14 -ml-3">
|
||||
<div className="bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col">
|
||||
<div className="flex flex-row items-center justify-between px-3 py-2">
|
||||
<h4 className="text-black dark:text-white font-medium text-sm">
|
||||
<h4 className="text-black/70 dark:text-white/70 font-medium text-sm">
|
||||
Attached files
|
||||
</h4>
|
||||
<div className="flex flex-row items-center space-x-4">
|
||||
@@ -92,7 +89,7 @@ const AttachSmall = () => {
|
||||
multiple
|
||||
hidden
|
||||
/>
|
||||
<Plus size={18} />
|
||||
<Plus size={16} />
|
||||
<p className="text-xs">Add</p>
|
||||
</button>
|
||||
<button
|
||||
@@ -102,7 +99,7 @@ const AttachSmall = () => {
|
||||
}}
|
||||
className="flex flex-row items-center space-x-1 text-black/70 dark:text-white/70 hover:text-black hover:dark:text-white transition duration-200"
|
||||
>
|
||||
<Trash size={14} />
|
||||
<Trash size={13} />
|
||||
<p className="text-xs">Clear</p>
|
||||
</button>
|
||||
</div>
|
||||
@@ -114,15 +111,17 @@ const AttachSmall = () => {
|
||||
key={i}
|
||||
className="flex flex-row items-center justify-start w-full space-x-3 p-3"
|
||||
>
|
||||
<div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-10 h-10 rounded-md">
|
||||
<div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-9 h-9 rounded-md">
|
||||
<File
|
||||
size={16}
|
||||
className="text-black/70 dark:text-white/70"
|
||||
/>
|
||||
</div>
|
||||
<p className="text-black/70 dark:text-white/70 text-sm">
|
||||
<p className="text-black/70 dark:text-white/70 text-xs">
|
||||
{file.fileName.length > 25
|
||||
? file.fileName.replace(/\.\w+$/, '').substring(0, 25) +
|
||||
? file.fileName
|
||||
.replace(/\.\w+$/, '')
|
||||
.substring(0, 25) +
|
||||
'...' +
|
||||
file.fileExtension
|
||||
: file.fileName}
|
||||
@@ -130,9 +129,12 @@ const AttachSmall = () => {
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
</motion.div>
|
||||
</PopoverPanel>
|
||||
</Transition>
|
||||
)}
|
||||
</AnimatePresence>
|
||||
</>
|
||||
)}
|
||||
</Popover>
|
||||
) : (
|
||||
<button
|
||||
|
||||
@@ -2,15 +2,11 @@
|
||||
|
||||
import { Cpu, Loader2, Search } from 'lucide-react';
|
||||
import { cn } from '@/lib/utils';
|
||||
import {
|
||||
Popover,
|
||||
PopoverButton,
|
||||
PopoverPanel,
|
||||
Transition,
|
||||
} from '@headlessui/react';
|
||||
import { Fragment, useEffect, useMemo, useState } from 'react';
|
||||
import { Popover, PopoverButton, PopoverPanel } from '@headlessui/react';
|
||||
import { useEffect, useMemo, useState } from 'react';
|
||||
import { MinimalProvider } from '@/lib/models/types';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
import { AnimatePresence, motion } from 'motion/react';
|
||||
|
||||
const ModelSelector = () => {
|
||||
const [providers, setProviders] = useState<MinimalProvider[]>([]);
|
||||
@@ -79,24 +75,28 @@ const ModelSelector = () => {
|
||||
|
||||
return (
|
||||
<Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
|
||||
{({ open }) => (
|
||||
<>
|
||||
<PopoverButton
|
||||
type="button"
|
||||
className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none headless-open:text-black dark:headless-open:text-white text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
|
||||
>
|
||||
<Cpu size={16} className="text-sky-500" />
|
||||
</PopoverButton>
|
||||
<Transition
|
||||
as={Fragment}
|
||||
enter="transition ease-out duration-100"
|
||||
enterFrom="opacity-0 translate-y-1"
|
||||
enterTo="opacity-100 translate-y-0"
|
||||
leave="transition ease-in duration-100"
|
||||
leaveFrom="opacity-100 translate-y-0"
|
||||
leaveTo="opacity-0 translate-y-1"
|
||||
<AnimatePresence>
|
||||
{open && (
|
||||
<PopoverPanel
|
||||
className="absolute z-10 w-[230px] sm:w-[270px] md:w-[300px] right-0"
|
||||
static
|
||||
>
|
||||
<PopoverPanel className="absolute z-10 w-[230px] sm:w-[270px] md:w-[300px] -right-4">
|
||||
<div className="bg-light-primary dark:bg-dark-primary max-h-[300px] sm:max-w-none border rounded-lg border-light-200 dark:border-dark-200 w-full flex flex-col shadow-lg overflow-hidden">
|
||||
<div className="p-4 border-b border-light-200 dark:border-dark-200">
|
||||
<motion.div
|
||||
initial={{ opacity: 0, scale: 0.9 }}
|
||||
animate={{ opacity: 1, scale: 1 }}
|
||||
exit={{ opacity: 0, scale: 0.9 }}
|
||||
transition={{ duration: 0.1, ease: 'easeOut' }}
|
||||
className="origin-top-right bg-light-primary dark:bg-dark-primary max-h-[300px] sm:max-w-none border rounded-lg border-light-200 dark:border-dark-200 w-full flex flex-col shadow-lg overflow-hidden"
|
||||
>
|
||||
<div className="p-2 border-b border-light-200 dark:border-dark-200">
|
||||
<div className="relative">
|
||||
<Search
|
||||
size={16}
|
||||
@@ -107,7 +107,7 @@ const ModelSelector = () => {
|
||||
placeholder="Search models..."
|
||||
value={searchQuery}
|
||||
onChange={(e) => setSearchQuery(e.target.value)}
|
||||
className="w-full pl-9 pr-3 py-2 bg-light-secondary dark:bg-dark-secondary rounded-lg placeholder:text-sm text-sm text-black dark:text-white placeholder:text-black/40 dark:placeholder:text-white/40 focus:outline-none focus:ring-2 focus:ring-sky-500/20 border border-transparent focus:border-sky-500/30 transition duration-200"
|
||||
className="w-full pl-8 pr-3 py-2 bg-light-secondary dark:bg-dark-secondary rounded-lg placeholder:text-xs placeholder:-translate-y-[1.5px] text-xs text-black dark:text-white placeholder:text-black/40 dark:placeholder:text-white/40 focus:outline-none border border-transparent transition duration-200"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
@@ -146,7 +146,8 @@ const ModelSelector = () => {
|
||||
type="button"
|
||||
className={cn(
|
||||
'px-3 py-2 flex items-center justify-between text-start duration-200 cursor-pointer transition rounded-lg group',
|
||||
chatModelProvider?.providerId === provider.id &&
|
||||
chatModelProvider?.providerId ===
|
||||
provider.id &&
|
||||
chatModelProvider?.key === model.key
|
||||
? 'bg-light-secondary dark:bg-dark-secondary'
|
||||
: 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
|
||||
@@ -166,7 +167,7 @@ const ModelSelector = () => {
|
||||
/>
|
||||
<p
|
||||
className={cn(
|
||||
'text-sm truncate',
|
||||
'text-xs truncate',
|
||||
chatModelProvider?.providerId ===
|
||||
provider.id &&
|
||||
chatModelProvider?.key === model.key
|
||||
@@ -189,9 +190,12 @@ const ModelSelector = () => {
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</motion.div>
|
||||
</PopoverPanel>
|
||||
</Transition>
|
||||
)}
|
||||
</AnimatePresence>
|
||||
</>
|
||||
)}
|
||||
</Popover>
|
||||
);
|
||||
};
|
||||
|
||||
@@ -1,43 +0,0 @@
import { cn } from '@/lib/utils';
import { Switch } from '@headlessui/react';

const CopilotToggle = ({
  copilotEnabled,
  setCopilotEnabled,
}: {
  copilotEnabled: boolean;
  setCopilotEnabled: (enabled: boolean) => void;
}) => {
  return (
    <div className="group flex flex-row items-center space-x-1 active:scale-95 duration-200 transition cursor-pointer">
      <Switch
        checked={copilotEnabled}
        onChange={setCopilotEnabled}
        className="bg-light-secondary dark:bg-dark-secondary border border-light-200/70 dark:border-dark-200 relative inline-flex h-5 w-10 sm:h-6 sm:w-11 items-center rounded-full"
      >
        <span className="sr-only">Copilot</span>
        <span
          className={cn(
            copilotEnabled
              ? 'translate-x-6 bg-[#24A0ED]'
              : 'translate-x-1 bg-black/50 dark:bg-white/50',
            'inline-block h-3 w-3 sm:h-4 sm:w-4 transform rounded-full transition-all duration-200',
          )}
        />
      </Switch>
      <p
        onClick={() => setCopilotEnabled(!copilotEnabled)}
        className={cn(
          'text-xs font-medium transition-colors duration-150 ease-in-out',
          copilotEnabled
            ? 'text-[#24A0ED]'
            : 'text-black/50 dark:text-white/50 group-hover:text-black dark:group-hover:text-white',
        )}
      >
        Copilot
      </p>
    </div>
  );
};

export default CopilotToggle;
@@ -1,123 +0,0 @@
import {
  BadgePercent,
  ChevronDown,
  Globe,
  Pencil,
  ScanEye,
  SwatchBook,
} from 'lucide-react';
import { cn } from '@/lib/utils';
import {
  Popover,
  PopoverButton,
  PopoverPanel,
  Transition,
} from '@headlessui/react';
import { SiReddit, SiYoutube } from '@icons-pack/react-simple-icons';
import { Fragment } from 'react';
import { useChat } from '@/lib/hooks/useChat';

const focusModes = [
  {
    key: 'webSearch',
    title: 'All',
    description: 'Searches across all of the internet',
    icon: <Globe size={16} />,
  },
  {
    key: 'academicSearch',
    title: 'Academic',
    description: 'Search in published academic papers',
    icon: <SwatchBook size={16} />,
  },
  {
    key: 'writingAssistant',
    title: 'Writing',
    description: 'Chat without searching the web',
    icon: <Pencil size={16} />,
  },
  {
    key: 'wolframAlphaSearch',
    title: 'Wolfram Alpha',
    description: 'Computational knowledge engine',
    icon: <BadgePercent size={16} />,
  },
  {
    key: 'youtubeSearch',
    title: 'Youtube',
    description: 'Search and watch videos',
    icon: <SiYoutube className="h-[16px] w-auto mr-0.5" />,
  },
  {
    key: 'redditSearch',
    title: 'Reddit',
    description: 'Search for discussions and opinions',
    icon: <SiReddit className="h-[16px] w-auto mr-0.5" />,
  },
];

const Focus = () => {
  const { focusMode, setFocusMode } = useChat();

  return (
    <Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
      <PopoverButton
        type="button"
        className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none headless-open:text-black dark:headless-open:text-white text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
      >
        {focusMode !== 'webSearch' ? (
          <div className="flex flex-row items-center space-x-1">
            {focusModes.find((mode) => mode.key === focusMode)?.icon}
          </div>
        ) : (
          <div className="flex flex-row items-center space-x-1">
            <Globe size={16} />
          </div>
        )}
      </PopoverButton>
      <Transition
        as={Fragment}
        enter="transition ease-out duration-150"
        enterFrom="opacity-0 translate-y-1"
        enterTo="opacity-100 translate-y-0"
        leave="transition ease-in duration-150"
        leaveFrom="opacity-100 translate-y-0"
        leaveTo="opacity-0 translate-y-1"
      >
        <PopoverPanel className="absolute z-10 w-64 md:w-[500px] -right-4">
          <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-4 max-h-[200px] md:max-h-none overflow-y-auto">
            {focusModes.map((mode, i) => (
              <PopoverButton
                onClick={() => setFocusMode(mode.key)}
                key={i}
                className={cn(
                  'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-2 duration-200 cursor-pointer transition focus:outline-none',
                  focusMode === mode.key
                    ? 'bg-light-secondary dark:bg-dark-secondary'
                    : 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
                )}
              >
                <div
                  className={cn(
                    'flex flex-row items-center space-x-1',
                    focusMode === mode.key
                      ? 'text-[#24A0ED]'
                      : 'text-black dark:text-white',
                  )}
                >
                  {mode.icon}
                  <p className="text-sm font-medium">{mode.title}</p>
                </div>
                <p className="text-black/70 dark:text-white/70 text-xs">
                  {mode.description}
                </p>
              </PopoverButton>
            ))}
          </div>
        </PopoverPanel>
      </Transition>
    </Popover>
  );
};

export default Focus;
@@ -8,6 +8,7 @@ import {
|
||||
} from '@headlessui/react';
|
||||
import { Fragment } from 'react';
|
||||
import { useChat } from '@/lib/hooks/useChat';
|
||||
import { AnimatePresence, motion } from 'motion/react';
|
||||
|
||||
const OptimizationModes = [
|
||||
{
|
||||
@@ -24,7 +25,7 @@ const OptimizationModes = [
|
||||
},
|
||||
{
|
||||
key: 'quality',
|
||||
title: 'Quality (Soon)',
|
||||
title: 'Quality',
|
||||
description: 'Get the most thorough and accurate answer',
|
||||
icon: (
|
||||
<Star
|
||||
@@ -60,42 +61,50 @@ const Optimization = () => {
|
||||
/>
|
||||
</div>
|
||||
</PopoverButton>
|
||||
<Transition
|
||||
as={Fragment}
|
||||
enter="transition ease-out duration-150"
|
||||
enterFrom="opacity-0 translate-y-1"
|
||||
enterTo="opacity-100 translate-y-0"
|
||||
leave="transition ease-in duration-150"
|
||||
leaveFrom="opacity-100 translate-y-0"
|
||||
leaveTo="opacity-0 translate-y-1"
|
||||
<AnimatePresence>
|
||||
{open && (
|
||||
<PopoverPanel
|
||||
className="absolute z-10 w-64 md:w-[250px] left-0"
|
||||
static
|
||||
>
|
||||
<motion.div
|
||||
initial={{ opacity: 0, scale: 0.9 }}
|
||||
animate={{ opacity: 1, scale: 1 }}
|
||||
exit={{ opacity: 0, scale: 0.9 }}
|
||||
transition={{ duration: 0.1, ease: 'easeOut' }}
|
||||
className="origin-top-left flex flex-col space-y-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-2 max-h-[200px] md:max-h-none overflow-y-auto"
|
||||
>
|
||||
<PopoverPanel className="absolute z-10 w-64 md:w-[250px] left-0">
|
||||
<div className="flex flex-col gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-4 max-h-[200px] md:max-h-none overflow-y-auto">
|
||||
{OptimizationModes.map((mode, i) => (
|
||||
<PopoverButton
|
||||
onClick={() => setOptimizationMode(mode.key)}
|
||||
key={i}
|
||||
disabled={mode.key === 'quality'}
|
||||
className={cn(
|
||||
'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-1 duration-200 cursor-pointer transition focus:outline-none',
|
||||
optimizationMode === mode.key
|
||||
? 'bg-light-secondary dark:bg-dark-secondary'
|
||||
: 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
|
||||
mode.key === 'quality' && 'opacity-50 cursor-not-allowed',
|
||||
)}
|
||||
>
|
||||
<div className="flex flex-row items-center space-x-1 text-black dark:text-white">
|
||||
<div className="flex flex-row justify-between w-full text-black dark:text-white">
|
||||
<div className="flex flex-row space-x-1">
|
||||
{mode.icon}
|
||||
<p className="text-sm font-medium">{mode.title}</p>
|
||||
<p className="text-xs font-medium">{mode.title}</p>
|
||||
</div>
|
||||
{mode.key === 'quality' && (
|
||||
<span className="bg-sky-500/70 dark:bg-sky-500/40 border border-sky-600 px-1 rounded-full text-[10px] text-white">
|
||||
Beta
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
<p className="text-black/70 dark:text-white/70 text-xs">
|
||||
{mode.description}
|
||||
</p>
|
||||
</PopoverButton>
|
||||
))}
|
||||
</div>
|
||||
</motion.div>
|
||||
</PopoverPanel>
|
||||
</Transition>
|
||||
)}
|
||||
</AnimatePresence>
|
||||
</>
|
||||
)}
|
||||
</Popover>
|
||||
|
||||
src/components/MessageInputActions/Sources.tsx (new file, 93 lines)
@@ -0,0 +1,93 @@
import { useChat } from '@/lib/hooks/useChat';
import {
  Popover,
  PopoverButton,
  PopoverPanel,
  Switch,
} from '@headlessui/react';
import {
  GlobeIcon,
  GraduationCapIcon,
  NetworkIcon,
} from '@phosphor-icons/react';
import { AnimatePresence, motion } from 'motion/react';

const sourcesList = [
  {
    name: 'Web',
    key: 'web',
    icon: <GlobeIcon className="h-[16px] w-auto" />,
  },
  {
    name: 'Academic',
    key: 'academic',
    icon: <GraduationCapIcon className="h-[16px] w-auto" />,
  },
  {
    name: 'Social',
    key: 'discussions',
    icon: <NetworkIcon className="h-[16px] w-auto" />,
  },
];

const Sources = () => {
  const { sources, setSources } = useChat();

  return (
    <Popover className="relative">
      {({ open }) => (
        <>
          <PopoverButton className="flex items-center justify-center active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white">
            <GlobeIcon className="h-[18px] w-auto" />
          </PopoverButton>
          <AnimatePresence>
            {open && (
              <PopoverPanel
                static
                className="absolute z-10 w-64 md:w-[225px] right-0"
              >
                <motion.div
                  initial={{ opacity: 0, scale: 0.9 }}
                  animate={{ opacity: 1, scale: 1 }}
                  exit={{ opacity: 0, scale: 0.9 }}
                  transition={{ duration: 0.1, ease: 'easeOut' }}
                  className="origin-top-right flex flex-col bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-1 max-h-[200px] md:max-h-none overflow-y-auto shadow-lg"
                >
                  {sourcesList.map((source, i) => (
                    <div
                      key={i}
                      className="flex flex-row justify-between hover:bg-light-100 hover:dark:bg-dark-100 rounded-md py-3 px-2 cursor-pointer"
                      onClick={() => {
                        if (!sources.includes(source.key)) {
                          setSources([...sources, source.key]);
                        } else {
                          setSources(sources.filter((s) => s !== source.key));
                        }
                      }}
                    >
                      <div className="flex flex-row space-x-1.5 text-black/80 dark:text-white/80">
                        {source.icon}
                        <p className="text-xs">{source.name}</p>
                      </div>
                      <Switch
                        checked={sources.includes(source.key)}
                        className="group relative flex h-4 w-7 shrink-0 cursor-pointer rounded-full bg-light-200 dark:bg-white/10 p-0.5 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500 dark:data-[checked]:bg-sky-500"
                      >
                        <span
                          aria-hidden="true"
                          className="pointer-events-none inline-block size-3 translate-x-[1px] group-data-[checked]:translate-x-3 rounded-full bg-white shadow-lg ring-0 transition duration-200 ease-in-out"
                        />
                      </Switch>
                    </div>
                  ))}
                </motion.div>
              </PopoverPanel>
            )}
          </AnimatePresence>
        </>
      )}
    </Popover>
  );
};

export default Sources;
src/components/MessageRenderer/CodeBlock/CodeBlockDarkTheme.ts (new file, 102 lines)
@@ -0,0 +1,102 @@
|
||||
import type { CSSProperties } from 'react';
|
||||
|
||||
const darkTheme = {
|
||||
'hljs-comment': {
|
||||
color: '#8b949e',
|
||||
},
|
||||
'hljs-quote': {
|
||||
color: '#8b949e',
|
||||
},
|
||||
'hljs-variable': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-template-variable': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-tag': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-name': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-selector-id': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-selector-class': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-regexp': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-deletion': {
|
||||
color: '#ff7b72',
|
||||
},
|
||||
'hljs-number': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-built_in': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-builtin-name': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-literal': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-type': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-params': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-meta': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-link': {
|
||||
color: '#f2cc60',
|
||||
},
|
||||
'hljs-attribute': {
|
||||
color: '#58a6ff',
|
||||
},
|
||||
'hljs-string': {
|
||||
color: '#7ee787',
|
||||
},
|
||||
'hljs-symbol': {
|
||||
color: '#7ee787',
|
||||
},
|
||||
'hljs-bullet': {
|
||||
color: '#7ee787',
|
||||
},
|
||||
'hljs-addition': {
|
||||
color: '#7ee787',
|
||||
},
|
||||
'hljs-title': {
|
||||
color: '#79c0ff',
|
||||
},
|
||||
'hljs-section': {
|
||||
color: '#79c0ff',
|
||||
},
|
||||
'hljs-keyword': {
|
||||
color: '#c297ff',
|
||||
},
|
||||
'hljs-selector-tag': {
|
||||
color: '#c297ff',
|
||||
},
|
||||
hljs: {
|
||||
display: 'block',
|
||||
overflowX: 'auto',
|
||||
background: '#0d1117',
|
||||
color: '#c9d1d9',
|
||||
padding: '0.75em',
|
||||
border: '1px solid #21262d',
|
||||
borderRadius: '10px',
|
||||
},
|
||||
'hljs-emphasis': {
|
||||
fontStyle: 'italic',
|
||||
},
|
||||
'hljs-strong': {
|
||||
fontWeight: 'bold',
|
||||
},
|
||||
} satisfies Record<string, CSSProperties>;
|
||||
|
||||
export default darkTheme;
|
||||
src/components/MessageRenderer/CodeBlock/CodeBlockLightTheme.ts (new file, 102 lines)
@@ -0,0 +1,102 @@
|
||||
import type { CSSProperties } from 'react';
|
||||
|
||||
const lightTheme = {
|
||||
'hljs-comment': {
|
||||
color: '#6e7781',
|
||||
},
|
||||
'hljs-quote': {
|
||||
color: '#6e7781',
|
||||
},
|
||||
'hljs-variable': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-template-variable': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-tag': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-name': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-selector-id': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-selector-class': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-regexp': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-deletion': {
|
||||
color: '#d73a49',
|
||||
},
|
||||
'hljs-number': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-built_in': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-builtin-name': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-literal': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-type': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-params': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-meta': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-link': {
|
||||
color: '#b08800',
|
||||
},
|
||||
'hljs-attribute': {
|
||||
color: '#0a64ae',
|
||||
},
|
||||
'hljs-string': {
|
||||
color: '#22863a',
|
||||
},
|
||||
'hljs-symbol': {
|
||||
color: '#22863a',
|
||||
},
|
||||
'hljs-bullet': {
|
||||
color: '#22863a',
|
||||
},
|
||||
'hljs-addition': {
|
||||
color: '#22863a',
|
||||
},
|
||||
'hljs-title': {
|
||||
color: '#005cc5',
|
||||
},
|
||||
'hljs-section': {
|
||||
color: '#005cc5',
|
||||
},
|
||||
'hljs-keyword': {
|
||||
color: '#6f42c1',
|
||||
},
|
||||
'hljs-selector-tag': {
|
||||
color: '#6f42c1',
|
||||
},
|
||||
hljs: {
|
||||
display: 'block',
|
||||
overflowX: 'auto',
|
||||
background: '#ffffff',
|
||||
color: '#24292f',
|
||||
padding: '0.75em',
|
||||
border: '1px solid #e8edf1',
|
||||
borderRadius: '10px',
|
||||
},
|
||||
'hljs-emphasis': {
|
||||
fontStyle: 'italic',
|
||||
},
|
||||
'hljs-strong': {
|
||||
fontWeight: 'bold',
|
||||
},
|
||||
} satisfies Record<string, CSSProperties>;
|
||||
|
||||
export default lightTheme;
|
||||
src/components/MessageRenderer/CodeBlock/index.tsx (new file, 64 lines)
@@ -0,0 +1,64 @@
'use client';

import { CheckIcon, CopyIcon } from '@phosphor-icons/react';
import React, { useEffect, useMemo, useState } from 'react';
import { useTheme } from 'next-themes';
import SyntaxHighlighter from 'react-syntax-highlighter';
import darkTheme from './CodeBlockDarkTheme';
import lightTheme from './CodeBlockLightTheme';

const CodeBlock = ({
  language,
  children,
}: {
  language: string;
  children: React.ReactNode;
}) => {
  const { resolvedTheme } = useTheme();
  const [mounted, setMounted] = useState(false);

  const [copied, setCopied] = useState(false);

  useEffect(() => {
    setMounted(true);
  }, []);

  const syntaxTheme = useMemo(() => {
    if (!mounted) return lightTheme;
    return resolvedTheme === 'dark' ? darkTheme : lightTheme;
  }, [mounted, resolvedTheme]);

  return (
    <div className="relative">
      <button
        className="absolute top-2 right-2 p-1"
        onClick={() => {
          navigator.clipboard.writeText(children as string);
          setCopied(true);
          setTimeout(() => setCopied(false), 2000);
        }}
      >
        {copied ? (
          <CheckIcon
            size={16}
            className="absolute top-2 right-2 text-black/70 dark:text-white/70"
          />
        ) : (
          <CopyIcon
            size={16}
            className="absolute top-2 right-2 transition duration-200 text-black/70 dark:text-white/70 hover:text-gray-800/70 hover:dark:text-gray-300/70"
          />
        )}
      </button>
      <SyntaxHighlighter
        language={language}
        style={syntaxTheme}
        showInlineLineNumbers
      >
        {children as string}
      </SyntaxHighlighter>
    </div>
  );
};

export default CodeBlock;
@@ -6,11 +6,11 @@ import {
|
||||
Transition,
|
||||
TransitionChild,
|
||||
} from '@headlessui/react';
|
||||
import { Document } from '@langchain/core/documents';
|
||||
import { File } from 'lucide-react';
|
||||
import { Fragment, useState } from 'react';
|
||||
import { Chunk } from '@/lib/types';
|
||||
|
||||
const MessageSources = ({ sources }: { sources: Document[] }) => {
|
||||
const MessageSources = ({ sources }: { sources: Chunk[] }) => {
|
||||
const [isDialogOpen, setIsDialogOpen] = useState(false);
|
||||
|
||||
const closeModal = () => {
|
||||
@@ -37,7 +37,7 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
|
||||
</p>
|
||||
<div className="flex flex-row items-center justify-between">
|
||||
<div className="flex flex-row items-center space-x-1">
|
||||
{source.metadata.url === 'File' ? (
|
||||
{source.metadata.url.includes('file_id://') ? (
|
||||
<div className="bg-dark-200 hover:bg-dark-100 transition duration-200 flex items-center justify-center w-6 h-6 rounded-full">
|
||||
<File size={12} className="text-white/70" />
|
||||
</div>
|
||||
@@ -51,7 +51,9 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
|
||||
/>
|
||||
)}
|
||||
<p className="text-xs text-black/50 dark:text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
|
||||
{source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')}
|
||||
{source.metadata.url.includes('file_id://')
|
||||
? 'Uploaded File'
|
||||
: source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')}
|
||||
</p>
|
||||
</div>
|
||||
<div className="flex flex-row items-center space-x-1 text-black/50 dark:text-white/50 text-xs">
|
||||
|
||||
@@ -11,6 +11,7 @@ import {
|
||||
} from '@headlessui/react';
|
||||
import jsPDF from 'jspdf';
|
||||
import { useChat, Section } from '@/lib/hooks/useChat';
|
||||
import { SourceBlock } from '@/lib/types';
|
||||
|
||||
const downloadFile = (filename: string, content: string, type: string) => {
|
||||
const blob = new Blob([content], { type });
|
||||
@@ -28,35 +29,41 @@ const downloadFile = (filename: string, content: string, type: string) => {
|
||||
|
||||
const exportAsMarkdown = (sections: Section[], title: string) => {
|
||||
const date = new Date(
|
||||
sections[0]?.userMessage?.createdAt || Date.now(),
|
||||
sections[0].message.createdAt || Date.now(),
|
||||
).toLocaleString();
|
||||
let md = `# 💬 Chat Export: ${title}\n\n`;
|
||||
md += `*Exported on: ${date}*\n\n---\n`;
|
||||
|
||||
sections.forEach((section, idx) => {
|
||||
if (section.userMessage) {
|
||||
md += `\n---\n`;
|
||||
md += `**🧑 User**
|
||||
`;
|
||||
md += `*${new Date(section.userMessage.createdAt).toLocaleString()}*\n\n`;
|
||||
md += `> ${section.userMessage.content.replace(/\n/g, '\n> ')}\n`;
|
||||
}
|
||||
md += `*${new Date(section.message.createdAt).toLocaleString()}*\n\n`;
|
||||
md += `> ${section.message.query.replace(/\n/g, '\n> ')}\n`;
|
||||
|
||||
if (section.assistantMessage) {
|
||||
if (section.message.responseBlocks.length > 0) {
|
||||
md += `\n---\n`;
|
||||
md += `**🤖 Assistant**
|
||||
`;
|
||||
md += `*${new Date(section.assistantMessage.createdAt).toLocaleString()}*\n\n`;
|
||||
md += `> ${section.assistantMessage.content.replace(/\n/g, '\n> ')}\n`;
|
||||
md += `*${new Date(section.message.createdAt).toLocaleString()}*\n\n`;
|
||||
md += `> ${section.message.responseBlocks
|
||||
.filter((b) => b.type === 'text')
|
||||
.map((block) => block.data)
|
||||
.join('\n')
|
||||
.replace(/\n/g, '\n> ')}\n`;
|
||||
}
|
||||
|
||||
const sourceResponseBlock = section.message.responseBlocks.find(
|
||||
(block) => block.type === 'source',
|
||||
) as SourceBlock | undefined;
|
||||
|
||||
if (
|
||||
section.sourceMessage &&
|
||||
section.sourceMessage.sources &&
|
||||
section.sourceMessage.sources.length > 0
|
||||
sourceResponseBlock &&
|
||||
sourceResponseBlock.data &&
|
||||
sourceResponseBlock.data.length > 0
|
||||
) {
|
||||
md += `\n**Citations:**\n`;
|
||||
section.sourceMessage.sources.forEach((src: any, i: number) => {
|
||||
sourceResponseBlock.data.forEach((src: any, i: number) => {
|
||||
const url = src.metadata?.url || '';
|
||||
md += `- [${i + 1}] [${url}](${url})\n`;
|
||||
});
|
||||
@@ -69,7 +76,7 @@ const exportAsMarkdown = (sections: Section[], title: string) => {
|
||||
const exportAsPDF = (sections: Section[], title: string) => {
|
||||
const doc = new jsPDF();
|
||||
const date = new Date(
|
||||
sections[0]?.userMessage?.createdAt || Date.now(),
|
||||
sections[0]?.message?.createdAt || Date.now(),
|
||||
).toLocaleString();
|
||||
let y = 15;
|
||||
const pageHeight = doc.internal.pageSize.height;
|
||||
@@ -86,7 +93,6 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
doc.setTextColor(30);
|
||||
|
||||
sections.forEach((section, idx) => {
|
||||
if (section.userMessage) {
|
||||
if (y > pageHeight - 30) {
|
||||
doc.addPage();
|
||||
y = 15;
|
||||
@@ -96,15 +102,11 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
doc.setFont('helvetica', 'normal');
|
||||
doc.setFontSize(10);
|
||||
doc.setTextColor(120);
|
||||
doc.text(
|
||||
`${new Date(section.userMessage.createdAt).toLocaleString()}`,
|
||||
40,
|
||||
y,
|
||||
);
|
||||
doc.text(`${new Date(section.message.createdAt).toLocaleString()}`, 40, y);
|
||||
y += 6;
|
||||
doc.setTextColor(30);
|
||||
doc.setFontSize(12);
|
||||
const userLines = doc.splitTextToSize(section.userMessage.content, 180);
|
||||
const userLines = doc.splitTextToSize(section.message.query, 180);
|
||||
for (let i = 0; i < userLines.length; i++) {
|
||||
if (y > pageHeight - 20) {
|
||||
doc.addPage();
|
||||
@@ -121,9 +123,8 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
}
|
||||
doc.line(10, y, 200, y);
|
||||
y += 4;
|
||||
}
|
||||
|
||||
if (section.assistantMessage) {
|
||||
if (section.message.responseBlocks.length > 0) {
|
||||
if (y > pageHeight - 30) {
|
||||
doc.addPage();
|
||||
y = 15;
|
||||
@@ -134,7 +135,7 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
doc.setFontSize(10);
|
||||
doc.setTextColor(120);
|
||||
doc.text(
|
||||
`${new Date(section.assistantMessage.createdAt).toLocaleString()}`,
|
||||
`${new Date(section.message.createdAt).toLocaleString()}`,
|
||||
40,
|
||||
y,
|
||||
);
|
||||
@@ -142,7 +143,7 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
doc.setTextColor(30);
|
||||
doc.setFontSize(12);
|
||||
const assistantLines = doc.splitTextToSize(
|
||||
section.assistantMessage.content,
|
||||
section.parsedTextBlocks.join('\n'),
|
||||
180,
|
||||
);
|
||||
for (let i = 0; i < assistantLines.length; i++) {
|
||||
@@ -154,10 +155,14 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
y += 6;
|
||||
}
|
||||
|
||||
const sourceResponseBlock = section.message.responseBlocks.find(
|
||||
(block) => block.type === 'source',
|
||||
) as SourceBlock | undefined;
|
||||
|
||||
if (
|
||||
section.sourceMessage &&
|
||||
section.sourceMessage.sources &&
|
||||
section.sourceMessage.sources.length > 0
|
||||
sourceResponseBlock &&
|
||||
sourceResponseBlock.data &&
|
||||
sourceResponseBlock.data.length > 0
|
||||
) {
|
||||
doc.setFontSize(11);
|
||||
doc.setTextColor(80);
|
||||
@@ -167,7 +172,7 @@ const exportAsPDF = (sections: Section[], title: string) => {
|
||||
}
|
||||
doc.text('Citations:', 12, y);
|
||||
y += 5;
|
||||
section.sourceMessage.sources.forEach((src: any, i: number) => {
|
||||
sourceResponseBlock.data.forEach((src: any, i: number) => {
|
||||
const url = src.metadata?.url || '';
|
||||
if (y > pageHeight - 15) {
|
||||
doc.addPage();
|
||||
@@ -198,15 +203,16 @@ const Navbar = () => {
|
||||
const { sections, chatId } = useChat();
|
||||
|
||||
useEffect(() => {
|
||||
if (sections.length > 0 && sections[0].userMessage) {
|
||||
if (sections.length > 0 && sections[0].message) {
|
||||
const newTitle =
|
||||
sections[0].userMessage.content.length > 20
|
||||
? `${sections[0].userMessage.content.substring(0, 20).trim()}...`
|
||||
: sections[0].userMessage.content;
|
||||
sections[0].message.query.length > 30
|
||||
? `${sections[0].message.query.substring(0, 30).trim()}...`
|
||||
: sections[0].message.query || 'New Conversation';
|
||||
|
||||
setTitle(newTitle);
|
||||
const newTimeAgo = formatTimeDifference(
|
||||
new Date(),
|
||||
sections[0].userMessage.createdAt,
|
||||
sections[0].message.createdAt,
|
||||
);
|
||||
setTimeAgo(newTimeAgo);
|
||||
}
|
||||
@@ -214,10 +220,10 @@ const Navbar = () => {
|
||||
|
||||
useEffect(() => {
|
||||
const intervalId = setInterval(() => {
|
||||
if (sections.length > 0 && sections[0].userMessage) {
|
||||
if (sections.length > 0 && sections[0].message) {
|
||||
const newTimeAgo = formatTimeDifference(
|
||||
new Date(),
|
||||
sections[0].userMessage.createdAt,
|
||||
sections[0].message.createdAt,
|
||||
);
|
||||
setTimeAgo(newTimeAgo);
|
||||
}
|
||||
|
||||
@@ -17,7 +17,7 @@ const SearchImages = ({
  messageId,
}: {
  query: string;
  chatHistory: Message[];
  chatHistory: [string, string][];
  messageId: string;
}) => {
  const [images, setImages] = useState<Image[] | null>(null);

@@ -30,7 +30,7 @@ const Searchvideos = ({
  messageId,
}: {
  query: string;
  chatHistory: Message[];
  chatHistory: [string, string][];
  messageId: string;
}) => {
  const [videos, setVideos] = useState<Video[] | null>(null);
@@ -3,6 +3,7 @@ import {
|
||||
ArrowLeft,
|
||||
BrainCog,
|
||||
ChevronLeft,
|
||||
ExternalLink,
|
||||
Search,
|
||||
Sliders,
|
||||
ToggleRight,
|
||||
@@ -115,7 +116,8 @@ const SettingsDialogue = ({
|
||||
</div>
|
||||
) : (
|
||||
<div className="flex flex-1 inset-0 h-full overflow-hidden">
|
||||
<div className="hidden lg:flex flex-col w-[240px] border-r border-white-200 dark:border-dark-200 h-full px-3 pt-3 overflow-y-auto">
|
||||
<div className="hidden lg:flex flex-col justify-between w-[240px] border-r border-white-200 dark:border-dark-200 h-full px-3 pt-3 overflow-y-auto">
|
||||
<div className="flex flex-col">
|
||||
<button
|
||||
onClick={() => setIsOpen(false)}
|
||||
className="group flex flex-row items-center hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg"
|
||||
@@ -128,6 +130,7 @@ const SettingsDialogue = ({
|
||||
Back
|
||||
</p>
|
||||
</button>
|
||||
|
||||
<div className="flex flex-col items-start space-y-1 mt-8">
|
||||
{sections.map((section) => (
|
||||
<button
|
||||
@@ -146,6 +149,21 @@ const SettingsDialogue = ({
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex flex-col space-y-1 py-[18px] px-2">
|
||||
<p className="text-xs text-black/70 dark:text-white/70">
|
||||
Version: {process.env.NEXT_PUBLIC_VERSION}
|
||||
</p>
|
||||
<a
|
||||
href="https://github.com/itzcrazykns/perplexica"
|
||||
target="_blank"
|
||||
rel="noopener noreferrer"
|
||||
className="text-xs text-black/70 dark:text-white/70 flex flex-row space-x-1 items-center transition duration-200 hover:text-black/90 hover:dark:text-white/90"
|
||||
>
|
||||
<span>GitHub</span>
|
||||
<ExternalLink size={12} />
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
<div className="w-full flex flex-col overflow-hidden">
|
||||
<div className="flex flex-row lg:hidden w-full justify-between px-[20px] my-4 flex-shrink-0">
|
||||
<button
|
||||
|
||||
@@ -310,7 +310,7 @@ const SettingsSwitch = ({
      checked={isChecked}
      onChange={handleSave}
      disabled={loading}
      className="group relative flex h-6 w-12 shrink-0 cursor-pointer rounded-full bg-white/10 p-1 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500"
      className="group relative flex h-6 w-12 shrink-0 cursor-pointer rounded-full bg-light-200 dark:bg-white/10 p-1 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500 dark:data-[checked]:bg-sky-500"
    >
      <span
        aria-hidden="true"
src/components/Widgets/Calculation.tsx (new file, 46 lines)
@@ -0,0 +1,46 @@
'use client';

import { Calculator, Equal } from 'lucide-react';

type CalculationWidgetProps = {
  expression: string;
  result: number;
};

const Calculation = ({ expression, result }: CalculationWidgetProps) => {
  return (
    <div className="rounded-lg border border-light-200 dark:border-dark-200">
      <div className="p-4 space-y-4">
        <div className="space-y-2">
          <div className="flex items-center gap-2 text-black/60 dark:text-white/70">
            <Calculator className="w-4 h-4" />
            <span className="text-xs uppercase font-semibold tracking-wide">
              Expression
            </span>
          </div>
          <div className="rounded-lg border border-light-200 dark:border-dark-200 bg-light-secondary dark:bg-dark-secondary p-3">
            <code className="text-sm text-black dark:text-white font-mono break-all">
              {expression}
            </code>
          </div>
        </div>

        <div className="space-y-2">
          <div className="flex items-center gap-2 text-black/60 dark:text-white/70">
            <Equal className="w-4 h-4" />
            <span className="text-xs uppercase font-semibold tracking-wide">
              Result
            </span>
          </div>
          <div className="rounded-xl border border-light-200 dark:border-dark-200 bg-light-secondary dark:bg-dark-secondary p-5">
            <div className="text-4xl font-bold text-black dark:text-white font-mono tabular-nums">
              {result.toLocaleString()}
            </div>
          </div>
        </div>
      </div>
    </div>
  );
};

export default Calculation;
src/components/Widgets/Renderer.tsx (new file, 76 lines)
@@ -0,0 +1,76 @@
import React from 'react';
import { Widget } from '../ChatWindow';
import Weather from './Weather';
import Calculation from './Calculation';
import Stock from './Stock';

const Renderer = ({ widgets }: { widgets: Widget[] }) => {
  return widgets.map((widget, index) => {
    switch (widget.widgetType) {
      case 'weather':
        return (
          <Weather
            key={index}
            location={widget.params.location}
            current={widget.params.current}
            daily={widget.params.daily}
            timezone={widget.params.timezone}
          />
        );
      case 'calculation_result':
        return (
          <Calculation
            expression={widget.params.expression}
            result={widget.params.result}
            key={index}
          />
        );
      case 'stock':
        return (
          <Stock
            key={index}
            symbol={widget.params.symbol}
            shortName={widget.params.shortName}
            longName={widget.params.longName}
            exchange={widget.params.exchange}
            currency={widget.params.currency}
            marketState={widget.params.marketState}
            regularMarketPrice={widget.params.regularMarketPrice}
            regularMarketChange={widget.params.regularMarketChange}
            regularMarketChangePercent={
              widget.params.regularMarketChangePercent
            }
            regularMarketPreviousClose={
              widget.params.regularMarketPreviousClose
            }
            regularMarketOpen={widget.params.regularMarketOpen}
            regularMarketDayHigh={widget.params.regularMarketDayHigh}
            regularMarketDayLow={widget.params.regularMarketDayLow}
            regularMarketVolume={widget.params.regularMarketVolume}
            averageDailyVolume3Month={widget.params.averageDailyVolume3Month}
            marketCap={widget.params.marketCap}
            fiftyTwoWeekLow={widget.params.fiftyTwoWeekLow}
            fiftyTwoWeekHigh={widget.params.fiftyTwoWeekHigh}
            trailingPE={widget.params.trailingPE}
            forwardPE={widget.params.forwardPE}
            dividendYield={widget.params.dividendYield}
            earningsPerShare={widget.params.earningsPerShare}
            website={widget.params.website}
            postMarketPrice={widget.params.postMarketPrice}
            postMarketChange={widget.params.postMarketChange}
            postMarketChangePercent={widget.params.postMarketChangePercent}
            preMarketPrice={widget.params.preMarketPrice}
            preMarketChange={widget.params.preMarketChange}
            preMarketChangePercent={widget.params.preMarketChangePercent}
            chartData={widget.params.chartData}
            comparisonData={widget.params.comparisonData}
            error={widget.params.error}
          />
        );
      default:
        return <div key={index}>Unknown widget type: {widget.widgetType}</div>;
    }
  });
};

export default Renderer;
src/components/Widgets/Stock.tsx (new file, 517 lines)
@@ -0,0 +1,517 @@
|
||||
'use client';
|
||||
|
||||
import { Clock, ArrowUpRight, ArrowDownRight, Minus } from 'lucide-react';
|
||||
import { useEffect, useRef, useState } from 'react';
|
||||
import {
|
||||
createChart,
|
||||
ColorType,
|
||||
LineStyle,
|
||||
BaselineSeries,
|
||||
LineSeries,
|
||||
} from 'lightweight-charts';
|
||||
|
||||
type StockWidgetProps = {
|
||||
symbol: string;
|
||||
shortName: string;
|
||||
longName?: string;
|
||||
exchange?: string;
|
||||
currency?: string;
|
||||
marketState?: string;
|
||||
regularMarketPrice?: number;
|
||||
regularMarketChange?: number;
|
||||
regularMarketChangePercent?: number;
|
||||
regularMarketPreviousClose?: number;
|
||||
regularMarketOpen?: number;
|
||||
regularMarketDayHigh?: number;
|
||||
regularMarketDayLow?: number;
|
||||
regularMarketVolume?: number;
|
||||
averageDailyVolume3Month?: number;
|
||||
marketCap?: number;
|
||||
fiftyTwoWeekLow?: number;
|
||||
fiftyTwoWeekHigh?: number;
|
||||
trailingPE?: number;
|
||||
forwardPE?: number;
|
||||
dividendYield?: number;
|
||||
earningsPerShare?: number;
|
||||
website?: string;
|
||||
postMarketPrice?: number;
|
||||
postMarketChange?: number;
|
||||
postMarketChangePercent?: number;
|
||||
preMarketPrice?: number;
|
||||
preMarketChange?: number;
|
||||
preMarketChangePercent?: number;
|
||||
chartData?: {
|
||||
'1D'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'5D'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'1M'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'3M'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'6M'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'1Y'?: { timestamps: number[]; prices: number[] } | null;
|
||||
MAX?: { timestamps: number[]; prices: number[] } | null;
|
||||
} | null;
|
||||
comparisonData?: Array<{
|
||||
ticker: string;
|
||||
name: string;
|
||||
chartData: {
|
||||
'1D'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'5D'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'1M'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'3M'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'6M'?: { timestamps: number[]; prices: number[] } | null;
|
||||
'1Y'?: { timestamps: number[]; prices: number[] } | null;
|
||||
MAX?: { timestamps: number[]; prices: number[] } | null;
|
||||
};
|
||||
}> | null;
|
||||
error?: string;
|
||||
};
|
||||
|
||||
const formatNumber = (num: number | undefined, decimals = 2): string => {
|
||||
if (num === undefined || num === null) return 'N/A';
|
||||
return num.toLocaleString(undefined, {
|
||||
minimumFractionDigits: decimals,
|
||||
maximumFractionDigits: decimals,
|
||||
});
|
||||
};
|
||||
|
||||
const formatLargeNumber = (num: number | undefined): string => {
|
||||
if (num === undefined || num === null) return 'N/A';
|
||||
if (num >= 1e12) return `$${(num / 1e12).toFixed(2)}T`;
|
||||
if (num >= 1e9) return `$${(num / 1e9).toFixed(2)}B`;
|
||||
if (num >= 1e6) return `$${(num / 1e6).toFixed(2)}M`;
|
||||
if (num >= 1e3) return `$${(num / 1e3).toFixed(2)}K`;
|
||||
return `$${num.toFixed(2)}`;
|
||||
};
|
||||
|
||||
const Stock = (props: StockWidgetProps) => {
|
||||
const [isDarkMode, setIsDarkMode] = useState(false);
|
||||
const [selectedTimeframe, setSelectedTimeframe] = useState<
|
||||
'1D' | '5D' | '1M' | '3M' | '6M' | '1Y' | 'MAX'
|
||||
>('1M');
|
||||
const chartContainerRef = useRef<HTMLDivElement>(null);
|
||||
|
||||
useEffect(() => {
|
||||
const checkDarkMode = () => {
|
||||
setIsDarkMode(document.documentElement.classList.contains('dark'));
|
||||
};
|
||||
|
||||
checkDarkMode();
|
||||
|
||||
const observer = new MutationObserver(checkDarkMode);
|
||||
observer.observe(document.documentElement, {
|
||||
attributes: true,
|
||||
attributeFilter: ['class'],
|
||||
});
|
||||
|
||||
return () => observer.disconnect();
|
||||
}, []);
|
||||
|
||||
useEffect(() => {
|
||||
const currentChartData = props.chartData?.[selectedTimeframe];
|
||||
if (
|
||||
!chartContainerRef.current ||
|
||||
!currentChartData ||
|
||||
currentChartData.timestamps.length === 0
|
||||
) {
|
||||
return;
|
||||
}
|
||||
|
||||
const chart = createChart(chartContainerRef.current, {
|
||||
width: chartContainerRef.current.clientWidth,
|
||||
height: 280,
|
||||
layout: {
|
||||
background: { type: ColorType.Solid, color: 'transparent' },
|
||||
textColor: isDarkMode ? '#6b7280' : '#9ca3af',
|
||||
fontSize: 11,
|
||||
attributionLogo: false,
|
||||
},
|
||||
grid: {
|
||||
vertLines: {
|
||||
color: isDarkMode ? '#21262d' : '#e8edf1',
|
||||
style: LineStyle.Solid,
|
||||
},
|
||||
horzLines: {
|
||||
color: isDarkMode ? '#21262d' : '#e8edf1',
|
||||
style: LineStyle.Solid,
|
||||
},
|
||||
},
|
||||
crosshair: {
|
||||
vertLine: {
|
||||
color: isDarkMode ? '#30363d' : '#d0d7de',
|
||||
labelVisible: false,
|
||||
},
|
||||
horzLine: {
|
||||
color: isDarkMode ? '#30363d' : '#d0d7de',
|
||||
labelVisible: true,
|
||||
},
|
||||
},
|
||||
rightPriceScale: {
|
||||
borderVisible: false,
|
||||
visible: false,
|
||||
},
|
||||
leftPriceScale: {
|
||||
borderVisible: false,
|
||||
visible: true,
|
||||
},
|
||||
timeScale: {
|
||||
borderVisible: false,
|
||||
timeVisible: false,
|
||||
},
|
||||
handleScroll: false,
|
||||
handleScale: false,
|
||||
});
|
||||
|
||||
const prices = currentChartData.prices;
|
||||
let baselinePrice: number;
|
||||
|
||||
if (selectedTimeframe === '1D') {
|
||||
baselinePrice = props.regularMarketPreviousClose ?? prices[0];
|
||||
} else {
|
||||
baselinePrice = prices[0];
|
||||
}
|
||||
|
||||
const baselineSeries = chart.addSeries(BaselineSeries);
|
||||
|
||||
baselineSeries.applyOptions({
|
||||
baseValue: { type: 'price', price: baselinePrice },
|
||||
topLineColor: isDarkMode ? '#14b8a6' : '#0d9488',
|
||||
topFillColor1: isDarkMode
|
||||
? 'rgba(20, 184, 166, 0.28)'
|
||||
: 'rgba(13, 148, 136, 0.24)',
|
||||
topFillColor2: isDarkMode
|
||||
? 'rgba(20, 184, 166, 0.05)'
|
||||
: 'rgba(13, 148, 136, 0.05)',
|
||||
bottomLineColor: isDarkMode ? '#f87171' : '#dc2626',
|
||||
bottomFillColor1: isDarkMode
|
||||
? 'rgba(248, 113, 113, 0.05)'
|
||||
: 'rgba(220, 38, 38, 0.05)',
|
||||
bottomFillColor2: isDarkMode
|
||||
? 'rgba(248, 113, 113, 0.28)'
|
||||
: 'rgba(220, 38, 38, 0.24)',
|
||||
lineWidth: 2,
|
||||
crosshairMarkerVisible: true,
|
||||
crosshairMarkerRadius: 4,
|
||||
crosshairMarkerBorderColor: '',
|
||||
crosshairMarkerBackgroundColor: '',
|
||||
});
|
||||
|
||||
const data = currentChartData.timestamps.map((timestamp, index) => {
|
||||
const price = currentChartData.prices[index];
|
||||
return {
|
||||
time: (timestamp / 1000) as any,
|
||||
value: price,
|
||||
};
|
||||
});
|
||||
|
||||
baselineSeries.setData(data);
|
||||
|
||||
const comparisonColors = ['#8b5cf6', '#f59e0b', '#ec4899'];
|
||||
if (props.comparisonData && props.comparisonData.length > 0) {
|
||||
props.comparisonData.forEach((comp, index) => {
|
||||
const compChartData = comp.chartData[selectedTimeframe];
|
||||
if (compChartData && compChartData.prices.length > 0) {
|
||||
const compData = compChartData.timestamps.map((timestamp, i) => ({
|
||||
time: (timestamp / 1000) as any,
|
||||
value: compChartData.prices[i],
|
||||
}));
|
||||
|
||||
const compSeries = chart.addSeries(LineSeries);
|
||||
compSeries.applyOptions({
|
||||
color: comparisonColors[index] || '#6b7280',
|
||||
lineWidth: 2,
|
||||
crosshairMarkerVisible: true,
|
||||
crosshairMarkerRadius: 4,
|
||||
priceScaleId: 'left',
|
||||
});
|
||||
compSeries.setData(compData);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
chart.timeScale().fitContent();
|
||||
|
||||
const handleResize = () => {
|
||||
if (chartContainerRef.current) {
|
||||
chart.applyOptions({
|
||||
width: chartContainerRef.current.clientWidth,
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
window.addEventListener('resize', handleResize);
|
||||
|
||||
return () => {
|
||||
window.removeEventListener('resize', handleResize);
|
||||
chart.remove();
|
||||
};
|
||||
}, [
|
||||
props.chartData,
|
||||
props.comparisonData,
|
||||
selectedTimeframe,
|
||||
isDarkMode,
|
||||
props.regularMarketPreviousClose,
|
||||
]);
|
||||
|
||||
const isPositive = (props.regularMarketChange ?? 0) >= 0;
|
||||
const isMarketOpen = props.marketState === 'REGULAR';
|
||||
const isPreMarket = props.marketState === 'PRE';
|
||||
const isPostMarket = props.marketState === 'POST';
|
||||
|
||||
const displayPrice = isPostMarket
|
||||
? props.postMarketPrice ?? props.regularMarketPrice
|
||||
: isPreMarket
|
||||
? props.preMarketPrice ?? props.regularMarketPrice
|
||||
: props.regularMarketPrice;
|
||||
|
||||
const displayChange = isPostMarket
|
||||
? props.postMarketChange ?? props.regularMarketChange
|
||||
: isPreMarket
|
||||
? props.preMarketChange ?? props.regularMarketChange
|
||||
: props.regularMarketChange;
|
||||
|
||||
const displayChangePercent = isPostMarket
|
||||
? props.postMarketChangePercent ?? props.regularMarketChangePercent
|
||||
: isPreMarket
|
||||
? props.preMarketChangePercent ?? props.regularMarketChangePercent
|
||||
: props.regularMarketChangePercent;
|
||||
|
||||
const changeColor = isPositive
|
||||
? 'text-green-600 dark:text-green-400'
|
||||
: 'text-red-600 dark:text-red-400';
|
||||
|
||||
if (props.error) {
|
||||
return (
|
||||
<div className="rounded-lg bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 p-4">
|
||||
<p className="text-sm text-black dark:text-white">
|
||||
Error: {props.error}
|
||||
</p>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="rounded-lg border border-light-200 dark:border-dark-200 overflow-hidden">
|
||||
<div className="p-4 space-y-4">
|
||||
<div className="flex items-start justify-between gap-4 pb-4 border-b border-light-200 dark:border-dark-200">
|
||||
<div className="flex-1">
|
||||
<div className="flex items-center gap-2 mb-1">
|
||||
{props.website && (
|
||||
<img
|
||||
src={`https://logo.clearbit.com/${new URL(props.website).hostname}`}
|
||||
alt={`${props.symbol} logo`}
|
||||
className="w-8 h-8 rounded-lg"
|
||||
onError={(e) => {
|
||||
(e.target as HTMLImageElement).style.display = 'none';
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
<h3 className="text-2xl font-bold text-black dark:text-white">
|
||||
{props.symbol}
|
||||
</h3>
|
||||
{props.exchange && (
|
||||
<span className="px-2 py-0.5 text-xs font-medium rounded bg-light-100 dark:bg-dark-100 text-black/60 dark:text-white/60">
|
||||
{props.exchange}
|
||||
</span>
|
||||
)}
|
||||
{isMarketOpen && (
|
||||
<div className="flex items-center gap-1.5 px-2 py-0.5 rounded-full bg-green-100 dark:bg-green-950/40 border border-green-300 dark:border-green-800">
|
||||
<div className="w-1.5 h-1.5 rounded-full bg-green-500 animate-pulse" />
|
||||
<span className="text-xs font-medium text-green-700 dark:text-green-400">
|
||||
Live
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
{isPreMarket && (
|
||||
<div className="flex items-center gap-1.5 px-2 py-0.5 rounded-full bg-blue-100 dark:bg-blue-950/40 border border-blue-300 dark:border-blue-800">
|
||||
<Clock className="w-3 h-3 text-blue-600 dark:text-blue-400" />
|
||||
<span className="text-xs font-medium text-blue-700 dark:text-blue-400">
|
||||
Pre-Market
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
{isPostMarket && (
|
||||
<div className="flex items-center gap-1.5 px-2 py-0.5 rounded-full bg-orange-100 dark:bg-orange-950/40 border border-orange-300 dark:border-orange-800">
|
||||
<Clock className="w-3 h-3 text-orange-600 dark:text-orange-400" />
|
||||
<span className="text-xs font-medium text-orange-700 dark:text-orange-400">
|
||||
After Hours
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
<p className="text-sm text-black/60 dark:text-white/60">
|
||||
{props.longName || props.shortName}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="text-right">
|
||||
<div className="flex items-baseline gap-2 mb-1">
|
||||
<span className="text-3xl font-medium text-black dark:text-white">
|
||||
{props.currency === 'USD' ? '$' : ''}
|
||||
{formatNumber(displayPrice)}
|
||||
</span>
|
||||
</div>
|
||||
<div
|
||||
className={`flex items-center justify-end gap-1 ${changeColor}`}
|
||||
>
|
||||
{isPositive ? (
|
||||
<ArrowUpRight className="w-4 h-4" />
|
||||
) : displayChange === 0 ? (
|
||||
<Minus className="w-4 h-4" />
|
||||
) : (
|
||||
<ArrowDownRight className="w-4 h-4" />
|
||||
)}
|
||||
<span className="text-lg font-normal">
|
||||
{displayChange !== undefined && displayChange >= 0 ? '+' : ''}
|
||||
{formatNumber(displayChange)}
|
||||
</span>
|
||||
<span className="text-sm font-normal">
|
||||
(
|
||||
{displayChangePercent !== undefined && displayChangePercent >= 0
|
||||
? '+'
|
||||
: ''}
|
||||
{formatNumber(displayChangePercent)}%)
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{props.chartData && (
|
||||
<div className="bg-light-secondary dark:bg-dark-secondary rounded-lg overflow-hidden">
|
||||
<div className="flex items-center justify-between p-3 border-b border-light-200 dark:border-dark-200">
|
||||
<div className="flex items-center gap-1">
|
||||
{(['1D', '5D', '1M', '3M', '6M', '1Y', 'MAX'] as const).map(
|
||||
(timeframe) => (
|
||||
<button
|
||||
key={timeframe}
|
||||
onClick={() => setSelectedTimeframe(timeframe)}
|
||||
disabled={!props.chartData?.[timeframe]}
|
||||
className={`px-3 py-1.5 text-xs font-medium rounded transition-colors ${
|
||||
selectedTimeframe === timeframe
|
||||
? 'bg-black/10 dark:bg-white/10 text-black dark:text-white'
|
||||
: 'text-black/50 dark:text-white/50 hover:text-black/80 dark:hover:text-white/80'
|
||||
} disabled:opacity-30 disabled:cursor-not-allowed`}
|
||||
>
|
||||
{timeframe}
|
||||
</button>
|
||||
),
|
||||
)}
|
||||
</div>
|
||||
|
||||
{props.comparisonData && props.comparisonData.length > 0 && (
|
||||
<div className="flex items-center gap-3 ml-auto">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
{props.symbol}
|
||||
</span>
|
||||
{props.comparisonData.map((comp, index) => {
|
||||
const colors = ['#8b5cf6', '#f59e0b', '#ec4899'];
|
||||
return (
|
||||
<div
|
||||
key={comp.ticker}
|
||||
className="flex items-center gap-1.5"
|
||||
>
|
||||
<div
|
||||
className="w-2 h-2 rounded-full"
|
||||
style={{ backgroundColor: colors[index] }}
|
||||
/>
|
||||
<span className="text-xs text-black/70 dark:text-white/70">
|
||||
{comp.ticker}
|
||||
</span>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="p-4">
|
||||
<div ref={chartContainerRef} />
|
||||
</div>
|
||||
|
||||
<div className="grid grid-cols-3 border-t border-light-200 dark:border-dark-200">
|
||||
<div className="flex justify-between p-3 border-r border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
Prev Close
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
${formatNumber(props.regularMarketPreviousClose)}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-r border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
52W Range
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
${formatNumber(props.fiftyTwoWeekLow, 2)}-$
|
||||
{formatNumber(props.fiftyTwoWeekHigh, 2)}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
Market Cap
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
{formatLargeNumber(props.marketCap)}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-t border-r border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
Open
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
${formatNumber(props.regularMarketOpen)}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-t border-r border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
P/E Ratio
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
{props.trailingPE ? formatNumber(props.trailingPE, 2) : 'N/A'}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-t border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
Dividend Yield
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
{props.dividendYield
|
||||
? `${formatNumber(props.dividendYield * 100, 2)}%`
|
||||
: 'N/A'}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-t border-r border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
Day Range
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
${formatNumber(props.regularMarketDayLow, 2)}-$
|
||||
{formatNumber(props.regularMarketDayHigh, 2)}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-t border-r border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
Volume
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
{formatLargeNumber(props.regularMarketVolume)}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex justify-between p-3 border-t border-light-200 dark:border-dark-200">
|
||||
<span className="text-xs text-black/50 dark:text-white/50">
|
||||
EPS
|
||||
</span>
|
||||
<span className="text-xs text-black dark:text-white font-medium">
|
||||
$
|
||||
{props.earningsPerShare
|
||||
? formatNumber(props.earningsPerShare, 2)
|
||||
: 'N/A'}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default Stock;
|
||||
src/components/Widgets/Weather.tsx (new file, 422 lines)
@@ -0,0 +1,422 @@
'use client';

import { getMeasurementUnit } from '@/lib/config/clientRegistry';
import { Wind, Droplets, Gauge } from 'lucide-react';
import { useMemo, useEffect, useState } from 'react';

type WeatherWidgetProps = {
  location: string;
  current: {
    time: string;
    temperature_2m: number;
    relative_humidity_2m: number;
    apparent_temperature: number;
    is_day: number;
    precipitation: number;
    weather_code: number;
    wind_speed_10m: number;
    wind_direction_10m: number;
    wind_gusts_10m?: number;
  };
  daily: {
    time: string[];
    weather_code: number[];
    temperature_2m_max: number[];
    temperature_2m_min: number[];
    precipitation_probability_max: number[];
  };
  timezone: string;
};

const getWeatherInfo = (code: number, isDay: boolean, isDarkMode: boolean) => {
  const dayNight = isDay ? 'day' : 'night';

  const weatherMap: Record<
    number,
    { icon: string; description: string; gradient: string }
  > = {
    0: {
      icon: `clear-${dayNight}.svg`,
      description: 'Clear',
      gradient: isDarkMode
        ? isDay
          ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #E8F1FA, #7A9DBF 35%, #4A7BA8 60%, #2F5A88)'
          : 'radial-gradient(ellipse 150% 100% at 50% 100%, #5A6A7E, #3E4E63 40%, #2A3544 65%, #1A2230)'
        : isDay
          ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #DBEAFE 30%, #93C5FD 60%, #60A5FA)'
          : 'radial-gradient(ellipse 150% 100% at 50% 100%, #7B8694, #475569 45%, #334155 70%, #1E293B)',
    },
    1: {
      icon: `clear-${dayNight}.svg`,
      description: 'Mostly Clear',
      gradient: isDarkMode
        ? isDay
          ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #E8F1FA, #7A9DBF 35%, #4A7BA8 60%, #2F5A88)'
          : 'radial-gradient(ellipse 150% 100% at 50% 100%, #5A6A7E, #3E4E63 40%, #2A3544 65%, #1A2230)'
        : isDay
          ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #DBEAFE 30%, #93C5FD 60%, #60A5FA)'
          : 'radial-gradient(ellipse 150% 100% at 50% 100%, #7B8694, #475569 45%, #334155 70%, #1E293B)',
    },
    2: {
      icon: `cloudy-1-${dayNight}.svg`,
      description: 'Partly Cloudy',
      gradient: isDarkMode
        ? isDay
          ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #D4E1ED, #8BA3B8 35%, #617A93 60%, #426070)'
          : 'radial-gradient(ellipse 150% 100% at 50% 100%, #6B7583, #4A5563 40%, #3A4450 65%, #2A3340)'
        : isDay
          ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #E0F2FE 28%, #BFDBFE 58%, #93C5FD)'
          : 'radial-gradient(ellipse 150% 100% at 50% 100%, #8B99AB, #64748B 45%, #475569 70%, #334155)',
    },
    3: {
      icon: `cloudy-1-${dayNight}.svg`,
      description: 'Cloudy',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #B8C3CF, #758190 38%, #546270 65%, #3D4A58)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #F5F8FA, #CBD5E1 32%, #94A3B8 65%, #64748B)',
    },
    45: {
      icon: `fog-${dayNight}.svg`,
      description: 'Foggy',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #C5CDD8, #8892A0 38%, #697380 65%, #4F5A68)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #E2E8F0 30%, #CBD5E1 62%, #94A3B8)',
    },
    48: {
      icon: `fog-${dayNight}.svg`,
      description: 'Rime Fog',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #C5CDD8, #8892A0 38%, #697380 65%, #4F5A68)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #E2E8F0 30%, #CBD5E1 62%, #94A3B8)',
    },
    51: {
      icon: `rainy-1-${dayNight}.svg`,
      description: 'Light Drizzle',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #B8D4E5, #6FA4C5 35%, #4A85AC 60%, #356A8E)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #E5FBFF, #A5F3FC 28%, #67E8F9 60%, #22D3EE)',
    },
    53: {
      icon: `rainy-1-${dayNight}.svg`,
      description: 'Drizzle',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #B8D4E5, #6FA4C5 35%, #4A85AC 60%, #356A8E)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #E5FBFF, #A5F3FC 28%, #67E8F9 60%, #22D3EE)',
    },
    55: {
      icon: `rainy-2-${dayNight}.svg`,
      description: 'Heavy Drizzle',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #A5C5D8, #5E92B0 35%, #3F789D 60%, #2A5F82)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #D4F3FF, #7DD3FC 30%, #38BDF8 62%, #0EA5E9)',
    },
    61: {
      icon: `rainy-2-${dayNight}.svg`,
      description: 'Light Rain',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #A5C5D8, #5E92B0 35%, #3F789D 60%, #2A5F82)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #D4F3FF, #7DD3FC 30%, #38BDF8 62%, #0EA5E9)',
    },
    63: {
      icon: `rainy-2-${dayNight}.svg`,
      description: 'Rain',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #8DB3C8, #4D819F 38%, #326A87 65%, #215570)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #B8E8FF, #38BDF8 32%, #0EA5E9 65%, #0284C7)',
    },
    65: {
      icon: `rainy-3-${dayNight}.svg`,
      description: 'Heavy Rain',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #7BA3B8, #3D6F8A 38%, #295973 65%, #1A455D)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #9CD9F5, #0EA5E9 32%, #0284C7 65%, #0369A1)',
    },
    71: {
      icon: `snowy-1-${dayNight}.svg`,
      description: 'Light Snow',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #E5F0FA, #9BB5CE 32%, #7496B8 58%, #527A9E)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #F0F9FF 25%, #E0F2FE 55%, #BAE6FD)',
    },
    73: {
      icon: `snowy-2-${dayNight}.svg`,
      description: 'Snow',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #D4E5F3, #85A1BD 35%, #6584A8 60%, #496A8E)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #FAFEFF, #E0F2FE 28%, #BAE6FD 60%, #7DD3FC)',
    },
    75: {
      icon: `snowy-3-${dayNight}.svg`,
      description: 'Heavy Snow',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #BDD8EB, #6F92AE 35%, #4F7593 60%, #365A78)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #F0FAFF, #BAE6FD 30%, #7DD3FC 62%, #38BDF8)',
    },
    77: {
      icon: `snowy-1-${dayNight}.svg`,
      description: 'Snow Grains',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #E5F0FA, #9BB5CE 32%, #7496B8 58%, #527A9E)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #FFFFFF, #F0F9FF 25%, #E0F2FE 55%, #BAE6FD)',
    },
    80: {
      icon: `rainy-2-${dayNight}.svg`,
      description: 'Light Showers',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #A5C5D8, #5E92B0 35%, #3F789D 60%, #2A5F82)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #D4F3FF, #7DD3FC 30%, #38BDF8 62%, #0EA5E9)',
    },
    81: {
      icon: `rainy-2-${dayNight}.svg`,
      description: 'Showers',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #8DB3C8, #4D819F 38%, #326A87 65%, #215570)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #B8E8FF, #38BDF8 32%, #0EA5E9 65%, #0284C7)',
    },
    82: {
      icon: `rainy-3-${dayNight}.svg`,
      description: 'Heavy Showers',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #7BA3B8, #3D6F8A 38%, #295973 65%, #1A455D)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #9CD9F5, #0EA5E9 32%, #0284C7 65%, #0369A1)',
    },
    85: {
      icon: `snowy-2-${dayNight}.svg`,
      description: 'Light Snow Showers',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #D4E5F3, #85A1BD 35%, #6584A8 60%, #496A8E)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #FAFEFF, #E0F2FE 28%, #BAE6FD 60%, #7DD3FC)',
    },
    86: {
      icon: `snowy-3-${dayNight}.svg`,
      description: 'Snow Showers',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #BDD8EB, #6F92AE 35%, #4F7593 60%, #365A78)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #F0FAFF, #BAE6FD 30%, #7DD3FC 62%, #38BDF8)',
    },
    95: {
      icon: `scattered-thunderstorms-${dayNight}.svg`,
      description: 'Thunderstorm',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #8A95A3, #5F6A7A 38%, #475260 65%, #2F3A48)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #C8D1DD, #94A3B8 32%, #64748B 65%, #475569)',
    },
    96: {
      icon: 'severe-thunderstorm.svg',
      description: 'Thunderstorm + Hail',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #7A8593, #515C6D 38%, #3A4552 65%, #242D3A)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #B0BBC8, #64748B 32%, #475569 65%, #334155)',
    },
    99: {
      icon: 'severe-thunderstorm.svg',
      description: 'Severe Thunderstorm',
      gradient: isDarkMode
        ? 'radial-gradient(ellipse 150% 100% at 50% 100%, #6A7583, #434E5D 40%, #2F3A47 68%, #1C2530)'
        : 'radial-gradient(ellipse 150% 100% at 50% 100%, #9BA8B8, #475569 35%, #334155 68%, #1E293B)',
    },
  };

  return weatherMap[code] || weatherMap[0];
};

const Weather = ({
  location,
  current,
  daily,
  timezone,
}: WeatherWidgetProps) => {
  const [isDarkMode, setIsDarkMode] = useState(false);
  const unit = getMeasurementUnit();
  const isImperial = unit === 'imperial';
  const tempUnitLabel = isImperial ? '°F' : '°C';
  const windUnitLabel = isImperial ? 'mph' : 'km/h';

  const formatTemp = (celsius: number) => {
    if (!Number.isFinite(celsius)) return 0;
    return Math.round(isImperial ? (celsius * 9) / 5 + 32 : celsius);
  };

  const formatWind = (speedKmh: number) => {
    if (!Number.isFinite(speedKmh)) return 0;
    return Math.round(isImperial ? speedKmh * 0.621371 : speedKmh);
  };

  useEffect(() => {
    const checkDarkMode = () => {
      setIsDarkMode(document.documentElement.classList.contains('dark'));
    };

    checkDarkMode();

    const observer = new MutationObserver(checkDarkMode);
    observer.observe(document.documentElement, {
      attributes: true,
      attributeFilter: ['class'],
    });

    return () => observer.disconnect();
  }, []);

  const weatherInfo = useMemo(
    () =>
      getWeatherInfo(
        current?.weather_code || 0,
        current?.is_day === 1,
        isDarkMode,
      ),
    [current?.weather_code, current?.is_day, isDarkMode],
  );

  const forecast = useMemo(() => {
    if (!daily?.time || daily.time.length === 0) return [];

    return daily.time.slice(1, 7).map((time, idx) => {
      const date = new Date(time);
      const dayName = date.toLocaleDateString('en-US', { weekday: 'short' });
      const isDay = true;
      const weatherCode = daily.weather_code[idx + 1];
      const info = getWeatherInfo(weatherCode, isDay, isDarkMode);

      return {
        day: dayName,
        icon: info.icon,
        high: formatTemp(daily.temperature_2m_max[idx + 1]),
        low: formatTemp(daily.temperature_2m_min[idx + 1]),
        precipitation: daily.precipitation_probability_max[idx + 1] || 0,
      };
    });
  }, [daily, isDarkMode, isImperial]);

  if (!current || !daily || !daily.time || daily.time.length === 0) {
    return (
      <div className="relative overflow-hidden rounded-lg shadow-md bg-gray-200 dark:bg-gray-800">
        <div className="p-4 text-black dark:text-white">
          <p className="text-sm">Weather data unavailable for {location}</p>
        </div>
      </div>
    );
  }

  return (
    <div className="relative overflow-hidden rounded-lg shadow-md">
      <div
        className="absolute inset-0"
        style={{
          background: weatherInfo.gradient,
        }}
      />

      <div className="relative p-4 text-gray-800 dark:text-white">
        <div className="flex items-start justify-between mb-3">
          <div className="flex items-center gap-3">
            <img
              src={`/weather-ico/${weatherInfo.icon}`}
              alt={weatherInfo.description}
              className="w-16 h-16 drop-shadow-lg"
            />
            <div>
              <div className="flex items-baseline gap-1">
                <span className="text-4xl font-bold drop-shadow-md">
                  {formatTemp(current.temperature_2m)}°
                </span>
                <span className="text-lg">{tempUnitLabel}</span>
              </div>
              <p className="text-sm font-medium drop-shadow mt-0.5">
                {weatherInfo.description}
              </p>
            </div>
          </div>
          <div className="text-right">
            <p className="text-xs font-medium opacity-90">
              {formatTemp(daily.temperature_2m_max[0])}°{' '}
              {formatTemp(daily.temperature_2m_min[0])}°
            </p>
          </div>
        </div>

        <div className="mb-3 pb-3 border-b border-gray-800/20 dark:border-white/20">
          <h3 className="text-base font-semibold drop-shadow-md">{location}</h3>
          <p className="text-xs text-gray-700 dark:text-white/80 drop-shadow mt-0.5">
            {new Date(current.time).toLocaleString('en-US', {
              weekday: 'short',
              hour: 'numeric',
              minute: '2-digit',
            })}
          </p>
        </div>

        <div className="grid grid-cols-6 gap-2 mb-3 pb-3 border-b border-gray-800/20 dark:border-white/20">
          {forecast.map((day, idx) => (
            <div
              key={idx}
              className="flex flex-col items-center bg-gray-800/10 dark:bg-white/10 backdrop-blur-sm rounded-md p-2"
            >
              <p className="text-xs font-medium mb-1">{day.day}</p>
              <img
                src={`/weather-ico/${day.icon}`}
                alt=""
                className="w-8 h-8 mb-1"
              />
              <div className="flex items-center gap-1 text-xs">
                <span className="font-semibold">{day.high}°</span>
                <span className="text-gray-600 dark:text-white/60">
                  {day.low}°
                </span>
              </div>
              {day.precipitation > 0 && (
                <div className="flex items-center gap-0.5 mt-1">
                  <Droplets className="w-3 h-3 text-gray-600 dark:text-white/70" />
                  <span className="text-[10px] text-gray-600 dark:text-white/70">
                    {day.precipitation}%
                  </span>
                </div>
              )}
            </div>
          ))}
        </div>

        <div className="grid grid-cols-3 gap-2 text-xs">
          <div className="flex items-center gap-2 bg-gray-800/10 dark:bg-white/10 backdrop-blur-sm rounded-md p-2">
            <Wind className="w-4 h-4 text-gray-700 dark:text-white/80 flex-shrink-0" />
            <div>
              <p className="text-[10px] text-gray-600 dark:text-white/70">
                Wind
              </p>
              <p className="font-semibold">
                {formatWind(current.wind_speed_10m)} {windUnitLabel}
              </p>
            </div>
          </div>

          <div className="flex items-center gap-2 bg-gray-800/10 dark:bg-white/10 backdrop-blur-sm rounded-md p-2">
            <Droplets className="w-4 h-4 text-gray-700 dark:text-white/80 flex-shrink-0" />
            <div>
              <p className="text-[10px] text-gray-600 dark:text-white/70">
                Humidity
              </p>
              <p className="font-semibold">
                {Math.round(current.relative_humidity_2m)}%
              </p>
            </div>
          </div>

          <div className="flex items-center gap-2 bg-gray-800/10 dark:bg-white/10 backdrop-blur-sm rounded-md p-2">
            <Gauge className="w-4 h-4 text-gray-700 dark:text-white/80 flex-shrink-0" />
            <div>
              <p className="text-[10px] text-gray-600 dark:text-white/70">
                Feels Like
              </p>
              <p className="font-semibold">
                {formatTemp(current.apparent_temperature)}
                {tempUnitLabel}
              </p>
            </div>
          </div>
        </div>
      </div>
    </div>
  );
};

export default Weather;
@@ -1,6 +1,12 @@
import { Message } from '@/components/ChatWindow';
export const getSuggestions = async (chatHistory: [string, string][]) => {
  const chatTurns = chatHistory.map(([role, content]) => {
    if (role === 'human') {
      return { role: 'user', content };
    } else {
      return { role: 'assistant', content };
    }
  });

export const getSuggestions = async (chatHistory: Message[]) => {
  const chatModel = localStorage.getItem('chatModelKey');
  const chatModelProvider = localStorage.getItem('chatModelProviderId');

@@ -10,7 +16,7 @@ export const getSuggestions = async (chatHistory: Message[]) => {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      chatHistory: chatHistory,
      chatHistory: chatTurns,
      chatModel: {
        providerId: chatModelProvider,
        key: chatModel,
66 src/lib/agents/media/image.ts Normal file
@@ -0,0 +1,66 @@
/* These aren't really agents, but they live here to keep the structure consistent. */

import { searchSearxng } from '@/lib/searxng';
import {
  imageSearchFewShots,
  imageSearchPrompt,
} from '@/lib/prompts/media/image';
import BaseLLM from '@/lib/models/base/llm';
import z from 'zod';
import { ChatTurnMessage } from '@/lib/types';
import formatChatHistoryAsString from '@/lib/utils/formatHistory';

type ImageSearchChainInput = {
  chatHistory: ChatTurnMessage[];
  query: string;
};

type ImageSearchResult = {
  img_src: string;
  url: string;
  title: string;
};

const searchImages = async (
  input: ImageSearchChainInput,
  llm: BaseLLM<any>,
) => {
  const schema = z.object({
    query: z.string().describe('The image search query.'),
  });

  const res = await llm.generateObject<typeof schema>({
    messages: [
      {
        role: 'system',
        content: imageSearchPrompt,
      },
      ...imageSearchFewShots,
      {
        role: 'user',
        content: `<conversation>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation>\n<follow_up>\n${input.query}\n</follow_up>`,
      },
    ],
    schema: schema,
  });

  const searchRes = await searchSearxng(res.query, {
    engines: ['bing images', 'google images'],
  });

  const images: ImageSearchResult[] = [];

  searchRes.results.forEach((result) => {
    if (result.img_src && result.url && result.title) {
      images.push({
        img_src: result.img_src,
        url: result.url,
        title: result.title,
      });
    }
  });

  return images.slice(0, 10);
};

export default searchImages;
66 src/lib/agents/media/video.ts Normal file
@@ -0,0 +1,66 @@
import formatChatHistoryAsString from '@/lib/utils/formatHistory';
import { searchSearxng } from '@/lib/searxng';
import {
  videoSearchFewShots,
  videoSearchPrompt,
} from '@/lib/prompts/media/videos';
import { ChatTurnMessage } from '@/lib/types';
import BaseLLM from '@/lib/models/base/llm';
import z from 'zod';

type VideoSearchChainInput = {
  chatHistory: ChatTurnMessage[];
  query: string;
};

type VideoSearchResult = {
  img_src: string;
  url: string;
  title: string;
  iframe_src: string;
};

const searchVideos = async (
  input: VideoSearchChainInput,
  llm: BaseLLM<any>,
) => {
  const schema = z.object({
    query: z.string().describe('The video search query.'),
  });

  const res = await llm.generateObject<typeof schema>({
    messages: [
      {
        role: 'system',
        content: videoSearchPrompt,
      },
      ...videoSearchFewShots,
      {
        role: 'user',
        content: `<conversation>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation>\n<follow_up>\n${input.query}\n</follow_up>`,
      },
    ],
    schema: schema,
  });

  const searchRes = await searchSearxng(res.query, {
    engines: ['youtube'],
  });

  const videos: VideoSearchResult[] = [];

  searchRes.results.forEach((result) => {
    if (result.thumbnail && result.url && result.title && result.iframe_src) {
      videos.push({
        img_src: result.thumbnail,
        url: result.url,
        title: result.title,
        iframe_src: result.iframe_src,
      });
    }
  });

  return videos.slice(0, 10);
};

export default searchVideos;
99 src/lib/agents/search/api.ts Normal file
@@ -0,0 +1,99 @@
import { ResearcherOutput, SearchAgentInput } from './types';
import SessionManager from '@/lib/session';
import { classify } from './classifier';
import Researcher from './researcher';
import { getWriterPrompt } from '@/lib/prompts/search/writer';
import { WidgetExecutor } from './widgets';

class APISearchAgent {
  async searchAsync(session: SessionManager, input: SearchAgentInput) {
    const classification = await classify({
      chatHistory: input.chatHistory,
      enabledSources: input.config.sources,
      query: input.followUp,
      llm: input.config.llm,
    });

    const widgetPromise = WidgetExecutor.executeAll({
      classification,
      chatHistory: input.chatHistory,
      followUp: input.followUp,
      llm: input.config.llm,
    });

    let searchPromise: Promise<ResearcherOutput> | null = null;

    if (!classification.classification.skipSearch) {
      const researcher = new Researcher();
      searchPromise = researcher.research(SessionManager.createSession(), {
        chatHistory: input.chatHistory,
        followUp: input.followUp,
        classification: classification,
        config: input.config,
      });
    }

    const [widgetOutputs, searchResults] = await Promise.all([
      widgetPromise,
      searchPromise,
    ]);

    if (searchResults) {
      session.emit('data', {
        type: 'searchResults',
        data: searchResults.searchFindings,
      });
    }

    session.emit('data', {
      type: 'researchComplete',
    });

    const finalContext =
      searchResults?.searchFindings
        .map(
          (f, index) =>
            `<result index=${index + 1} title=${f.metadata.title}>${f.content}</result>`,
        )
        .join('\n') || '';

    const widgetContext = widgetOutputs
      .map((o) => {
        return `<result>${o.llmContext}</result>`;
      })
      .join('\n-------------\n');

    const finalContextWithWidgets = `<search_results note="These are the search results and assistant can cite these">\n${finalContext}\n</search_results>\n<widgets_result noteForAssistant="Its output is already shown to the user, assistant can use this information to answer the query but do not CITE this as a source">\n${widgetContext}\n</widgets_result>`;

    const writerPrompt = getWriterPrompt(
      finalContextWithWidgets,
      input.config.systemInstructions,
      input.config.mode,
    );

    const answerStream = input.config.llm.streamText({
      messages: [
        {
          role: 'system',
          content: writerPrompt,
        },
        ...input.chatHistory,
        {
          role: 'user',
          content: input.followUp,
        },
      ],
    });

    for await (const chunk of answerStream) {
      session.emit('data', {
        type: 'response',
        data: chunk.contentChunk,
      });
    }

    session.emit('end', {});
  }
}

export default APISearchAgent;
53 src/lib/agents/search/classifier.ts Normal file
@@ -0,0 +1,53 @@
import z from 'zod';
import { ClassifierInput } from './types';
import { classifierPrompt } from '@/lib/prompts/search/classifier';
import formatChatHistoryAsString from '@/lib/utils/formatHistory';

const schema = z.object({
  classification: z.object({
    skipSearch: z
      .boolean()
      .describe('Indicates whether to skip the search step.'),
    personalSearch: z
      .boolean()
      .describe('Indicates whether to perform a personal search.'),
    academicSearch: z
      .boolean()
      .describe('Indicates whether to perform an academic search.'),
    discussionSearch: z
      .boolean()
      .describe('Indicates whether to perform a discussion search.'),
    showWeatherWidget: z
      .boolean()
      .describe('Indicates whether to show the weather widget.'),
    showStockWidget: z
      .boolean()
      .describe('Indicates whether to show the stock widget.'),
    showCalculationWidget: z
      .boolean()
      .describe('Indicates whether to show the calculation widget.'),
  }),
  standaloneFollowUp: z
    .string()
    .describe(
      "A self-contained, context-independent reformulation of the user's question.",
    ),
});

export const classify = async (input: ClassifierInput) => {
  const output = await input.llm.generateObject<typeof schema>({
    messages: [
      {
        role: 'system',
        content: classifierPrompt,
      },
      {
        role: 'user',
        content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_query>\n${input.query}\n</user_query>`,
      },
    ],
    schema,
  });

  return output;
};
186 src/lib/agents/search/index.ts Normal file
@@ -0,0 +1,186 @@
import { ResearcherOutput, SearchAgentInput } from './types';
import SessionManager from '@/lib/session';
import { classify } from './classifier';
import Researcher from './researcher';
import { getWriterPrompt } from '@/lib/prompts/search/writer';
import { WidgetExecutor } from './widgets';
import db from '@/lib/db';
import { chats, messages } from '@/lib/db/schema';
import { and, eq, gt } from 'drizzle-orm';
import { TextBlock } from '@/lib/types';

class SearchAgent {
  async searchAsync(session: SessionManager, input: SearchAgentInput) {
    const exists = await db.query.messages.findFirst({
      where: and(
        eq(messages.chatId, input.chatId),
        eq(messages.messageId, input.messageId),
      ),
    });

    if (!exists) {
      await db.insert(messages).values({
        chatId: input.chatId,
        messageId: input.messageId,
        backendId: session.id,
        query: input.followUp,
        createdAt: new Date().toISOString(),
        status: 'answering',
        responseBlocks: [],
      });
    } else {
      await db
        .delete(messages)
        .where(
          and(eq(messages.chatId, input.chatId), gt(messages.id, exists.id)),
        )
        .execute();
      await db
        .update(messages)
        .set({
          status: 'answering',
          backendId: session.id,
          responseBlocks: [],
        })
        .where(
          and(
            eq(messages.chatId, input.chatId),
            eq(messages.messageId, input.messageId),
          ),
        )
        .execute();
    }

    const classification = await classify({
      chatHistory: input.chatHistory,
      enabledSources: input.config.sources,
      query: input.followUp,
      llm: input.config.llm,
    });

    const widgetPromise = WidgetExecutor.executeAll({
      classification,
      chatHistory: input.chatHistory,
      followUp: input.followUp,
      llm: input.config.llm,
    }).then((widgetOutputs) => {
      widgetOutputs.forEach((o) => {
        session.emitBlock({
          id: crypto.randomUUID(),
          type: 'widget',
          data: {
            widgetType: o.type,
            params: o.data,
          },
        });
      });
      return widgetOutputs;
    });

    let searchPromise: Promise<ResearcherOutput> | null = null;

    if (!classification.classification.skipSearch) {
      const researcher = new Researcher();
      searchPromise = researcher.research(session, {
        chatHistory: input.chatHistory,
        followUp: input.followUp,
        classification: classification,
        config: input.config,
      });
    }

    const [widgetOutputs, searchResults] = await Promise.all([
      widgetPromise,
      searchPromise,
    ]);

    session.emit('data', {
      type: 'researchComplete',
    });

    const finalContext =
      searchResults?.searchFindings
        .map(
          (f, index) =>
            `<result index=${index + 1} title=${f.metadata.title}>${f.content}</result>`,
        )
        .join('\n') || '';

    const widgetContext = widgetOutputs
      .map((o) => {
        return `<result>${o.llmContext}</result>`;
      })
      .join('\n-------------\n');

    const finalContextWithWidgets = `<search_results note="These are the search results and assistant can cite these">\n${finalContext}\n</search_results>\n<widgets_result noteForAssistant="Its output is already shown to the user, assistant can use this information to answer the query but do not CITE this as a source">\n${widgetContext}\n</widgets_result>`;

    const writerPrompt = getWriterPrompt(
      finalContextWithWidgets,
      input.config.systemInstructions,
      input.config.mode,
    );
    const answerStream = input.config.llm.streamText({
      messages: [
        {
          role: 'system',
          content: writerPrompt,
        },
        ...input.chatHistory,
        {
          role: 'user',
          content: input.followUp,
        },
      ],
    });

    let responseBlockId = '';

    for await (const chunk of answerStream) {
      if (!responseBlockId) {
        const block: TextBlock = {
          id: crypto.randomUUID(),
          type: 'text',
          data: chunk.contentChunk,
        };

        session.emitBlock(block);

        responseBlockId = block.id;
      } else {
        const block = session.getBlock(responseBlockId) as TextBlock | null;

        if (!block) {
          continue;
        }

        block.data += chunk.contentChunk;

        session.updateBlock(block.id, [
          {
            op: 'replace',
            path: '/data',
            value: block.data,
          },
        ]);
      }
    }

    session.emit('end', {});

    await db
      .update(messages)
      .set({
        status: 'completed',
        responseBlocks: session.getAllBlocks(),
      })
      .where(
        and(
          eq(messages.chatId, input.chatId),
          eq(messages.messageId, input.messageId),
        ),
      )
      .execute();
  }
}

export default SearchAgent;
129 src/lib/agents/search/researcher/actions/academicSearch.ts Normal file
@@ -0,0 +1,129 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { Chunk, SearchResultsResearchBlock } from '@/lib/types';
import { searchSearxng } from '@/lib/searxng';

const schema = z.object({
  queries: z.array(z.string()).describe('List of academic search queries'),
});

const academicSearchDescription = `
Use this tool to perform academic searches for scholarly articles, papers, and research studies relevant to the user's query. Provide a list of concise search queries that will help gather comprehensive academic information on the topic at hand.
You can provide up to 3 queries at a time. Make sure the queries are specific and relevant to the user's needs.

For example, if the user is interested in recent advancements in renewable energy, your queries could be:
1. "Recent advancements in renewable energy 2024"
2. "Cutting-edge research on solar power technologies"
3. "Innovations in wind energy systems"

If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed academic information.
`;

const academicSearchAction: ResearchAction<typeof schema> = {
  name: 'academic_search',
  schema: schema,
  getDescription: () => academicSearchDescription,
  getToolDescription: () =>
    "Use this tool to perform academic searches for scholarly articles, papers, and research studies relevant to the user's query. Provide a list of concise search queries that will help gather comprehensive academic information on the topic at hand.",
  enabled: (config) =>
    config.sources.includes('academic') &&
    config.classification.classification.skipSearch === false &&
    config.classification.classification.academicSearch === true,
  execute: async (input, additionalConfig) => {
    input.queries = input.queries.slice(0, 3);

    const researchBlock = additionalConfig.session.getBlock(
      additionalConfig.researchBlockId,
    );

    if (researchBlock && researchBlock.type === 'research') {
      researchBlock.data.subSteps.push({
        type: 'searching',
        id: crypto.randomUUID(),
        searching: input.queries,
      });

      additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
        {
          op: 'replace',
          path: '/data/subSteps',
          value: researchBlock.data.subSteps,
        },
      ]);
    }

    const searchResultsBlockId = crypto.randomUUID();
    let searchResultsEmitted = false;

    let results: Chunk[] = [];

    const search = async (q: string) => {
      const res = await searchSearxng(q, {
        engines: ['arxiv', 'google scholar', 'pubmed'],
      });

      const resultChunks: Chunk[] = res.results.map((r) => ({
        content: r.content || r.title,
        metadata: {
          title: r.title,
          url: r.url,
        },
      }));

      results.push(...resultChunks);

      if (
        !searchResultsEmitted &&
        researchBlock &&
        researchBlock.type === 'research'
      ) {
        searchResultsEmitted = true;

        researchBlock.data.subSteps.push({
          id: searchResultsBlockId,
          type: 'search_results',
          reading: resultChunks,
        });

        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
          {
            op: 'replace',
            path: '/data/subSteps',
            value: researchBlock.data.subSteps,
          },
        ]);
      } else if (
        searchResultsEmitted &&
        researchBlock &&
        researchBlock.type === 'research'
      ) {
        const subStepIndex = researchBlock.data.subSteps.findIndex(
          (step) => step.id === searchResultsBlockId,
        );

        const subStep = researchBlock.data.subSteps[
          subStepIndex
        ] as SearchResultsResearchBlock;

        subStep.reading.push(...resultChunks);

        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
          {
            op: 'replace',
            path: '/data/subSteps',
            value: researchBlock.data.subSteps,
          },
        ]);
      }
    };

    await Promise.all(input.queries.map(search));

    return {
      type: 'search_results',
      results,
    };
  },
};

export default academicSearchAction;
24 src/lib/agents/search/researcher/actions/done.ts Normal file
@@ -0,0 +1,24 @@
import z from 'zod';
import { ResearchAction } from '../../types';

const actionDescription = `
Use this action ONLY when you have completed all necessary research and are ready to provide a final answer to the user. This indicates that you have gathered sufficient information from previous steps and are concluding the research process.
YOU MUST CALL THIS ACTION TO SIGNAL COMPLETION; DO NOT OUTPUT FINAL ANSWERS DIRECTLY TO THE USER.
IT WILL BE AUTOMATICALLY TRIGGERED IF MAXIMUM ITERATIONS ARE REACHED SO IF YOU'RE LOW ON ITERATIONS, DON'T CALL IT AND INSTEAD FOCUS ON GATHERING ESSENTIAL INFO FIRST.
`;

const doneAction: ResearchAction<any> = {
  name: 'done',
  schema: z.object({}),
  getToolDescription: () =>
    'Only call this after __reasoning_preamble AND after any other needed tool calls when you truly have enough to answer. Do not call if information is still missing.',
  getDescription: () => actionDescription,
  enabled: (_) => true,
  execute: async (params, additionalConfig) => {
    return {
      type: 'done',
    };
  },
};

export default doneAction;
18 src/lib/agents/search/researcher/actions/index.ts Normal file
@@ -0,0 +1,18 @@
import academicSearchAction from './academicSearch';
import doneAction from './done';
import planAction from './plan';
import ActionRegistry from './registry';
import scrapeURLAction from './scrapeURL';
import socialSearchAction from './socialSearch';
import uploadsSearchAction from './uploadsSearch';
import webSearchAction from './webSearch';

ActionRegistry.register(webSearchAction);
ActionRegistry.register(doneAction);
ActionRegistry.register(planAction);
ActionRegistry.register(scrapeURLAction);
ActionRegistry.register(uploadsSearchAction);
ActionRegistry.register(academicSearchAction);
ActionRegistry.register(socialSearchAction);

export { ActionRegistry };
40 src/lib/agents/search/researcher/actions/plan.ts Normal file
@@ -0,0 +1,40 @@
import z from 'zod';
import { ResearchAction } from '../../types';

const schema = z.object({
  plan: z
    .string()
    .describe(
      'A concise natural-language plan in one short paragraph. Open with a short intent phrase (e.g., "Okay, the user wants to...", "Searching for...", "Looking into...") and lay out the steps you will take.',
    ),
});

const actionDescription = `
Use this tool FIRST on every turn to state your plan in natural language before any other action. Keep it short, action-focused, and tailored to the current query.
Make sure to not include reference to any tools or actions you might take, just the plan itself. The user isn't aware of tools, but they love to see your thought process.

Here are some examples of good plans:
<examples>
- "Okay, the user wants to know the latest advancements in renewable energy. I will start by looking for recent articles and studies on this topic, then summarize the key points." -> "I have gathered enough information to provide a comprehensive answer."
- "The user is asking about the health benefits of a Mediterranean diet. I will search for scientific studies and expert opinions on this diet, then compile the findings into a clear summary." -> "I have gathered information about the Mediterranean diet and its health benefits, I will now look for any recent studies to ensure the information is current."
</examples>

YOU CAN NEVER CALL ANY OTHER TOOL BEFORE CALLING THIS ONE FIRST, IF YOU DO, THAT CALL WOULD BE IGNORED.
`;

const planAction: ResearchAction<typeof schema> = {
  name: '__reasoning_preamble',
  schema: schema,
  getToolDescription: () =>
    'Use this FIRST on every turn to state your plan in natural language before any other action. Keep it short, action-focused, and tailored to the current query.',
  getDescription: () => actionDescription,
  enabled: (config) => config.mode !== 'speed',
  execute: async (input, _) => {
    return {
      type: 'reasoning',
      reasoning: input.plan,
    };
  },
};

export default planAction;
105 src/lib/agents/search/researcher/actions/registry.ts Normal file
@@ -0,0 +1,105 @@
import { Tool, ToolCall } from '@/lib/models/types';
import {
  ActionOutput,
  AdditionalConfig,
  ClassifierOutput,
  ResearchAction,
  SearchAgentConfig,
  SearchSources,
} from '../../types';

class ActionRegistry {
  private static actions: Map<string, ResearchAction> = new Map();

  static register(action: ResearchAction<any>) {
    this.actions.set(action.name, action);
  }

  static get(name: string): ResearchAction | undefined {
    return this.actions.get(name);
  }

  static getAvailableActions(config: {
    classification: ClassifierOutput;
    fileIds: string[];
    mode: SearchAgentConfig['mode'];
    sources: SearchSources[];
  }): ResearchAction[] {
    return Array.from(
      this.actions.values().filter((action) => action.enabled(config)),
    );
  }

  static getAvailableActionTools(config: {
    classification: ClassifierOutput;
    fileIds: string[];
    mode: SearchAgentConfig['mode'];
    sources: SearchSources[];
  }): Tool[] {
    const availableActions = this.getAvailableActions(config);

    return availableActions.map((action) => ({
      name: action.name,
      description: action.getToolDescription({ mode: config.mode }),
      schema: action.schema,
    }));
  }

  static getAvailableActionsDescriptions(config: {
    classification: ClassifierOutput;
    fileIds: string[];
    mode: SearchAgentConfig['mode'];
    sources: SearchSources[];
  }): string {
    const availableActions = this.getAvailableActions(config);

    return availableActions
      .map(
        (action) =>
          `<tool name="${action.name}">\n${action.getDescription({ mode: config.mode })}\n</tool>`,
      )
      .join('\n\n');
  }

  static async execute(
    name: string,
    params: any,
    additionalConfig: AdditionalConfig & {
      researchBlockId: string;
      fileIds: string[];
    },
  ) {
    const action = this.actions.get(name);

    if (!action) {
      throw new Error(`Action with name ${name} not found`);
    }

    return action.execute(params, additionalConfig);
  }

  static async executeAll(
    actions: ToolCall[],
    additionalConfig: AdditionalConfig & {
      researchBlockId: string;
      fileIds: string[];
    },
  ): Promise<ActionOutput[]> {
    const results: ActionOutput[] = [];

    await Promise.all(
      actions.map(async (actionConfig) => {
        const output = await this.execute(
          actionConfig.name,
          actionConfig.arguments,
          additionalConfig,
        );
        results.push(output);
      }),
    );

    return results;
  }
}

export default ActionRegistry;
139 src/lib/agents/search/researcher/actions/scrapeURL.ts Normal file
@@ -0,0 +1,139 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { Chunk, ReadingResearchBlock } from '@/lib/types';
import TurnDown from 'turndown';
import path from 'path';

const turndownService = new TurnDown();

const schema = z.object({
  urls: z.array(z.string()).describe('A list of URLs to scrape content from.'),
});

const actionDescription = `
Use this tool to scrape and extract content from the provided URLs. This is useful when the user has asked you to extract or summarize information from specific web pages. You can provide up to 3 URLs at a time. NEVER CALL THIS TOOL EXPLICITLY YOURSELF UNLESS INSTRUCTED TO DO SO BY THE USER.
You should only call this tool when the user has specifically requested information from certain web pages, never call this yourself to get extra information without user instruction.

For example, if the user says "Please summarize the content of https://example.com/article", you can call this tool with that URL to get the content and then provide the summary, or for "What does X mean according to https://example.com/page", you can call this tool with that URL to get the content and provide the explanation.
`;

const scrapeURLAction: ResearchAction<typeof schema> = {
  name: 'scrape_url',
  schema: schema,
  getToolDescription: () =>
    'Use this tool to scrape and extract content from the provided URLs. This is useful when the user has asked you to extract or summarize information from specific web pages. You can provide up to 3 URLs at a time. NEVER CALL THIS TOOL EXPLICITLY YOURSELF UNLESS INSTRUCTED TO DO SO BY THE USER.',
  getDescription: () => actionDescription,
  enabled: (_) => true,
  execute: async (params, additionalConfig) => {
    params.urls = params.urls.slice(0, 3);

    let readingBlockId = crypto.randomUUID();
    let readingEmitted = false;

    const researchBlock = additionalConfig.session.getBlock(
      additionalConfig.researchBlockId,
    );

    const results: Chunk[] = [];

    await Promise.all(
      params.urls.map(async (url) => {
        try {
          const res = await fetch(url);
          const text = await res.text();

          const title =
            text.match(/<title>(.*?)<\/title>/i)?.[1] || `Content from ${url}`;

          if (
            !readingEmitted &&
            researchBlock &&
            researchBlock.type === 'research'
          ) {
            readingEmitted = true;
            researchBlock.data.subSteps.push({
              id: readingBlockId,
              type: 'reading',
              reading: [
                {
                  content: '',
                  metadata: {
                    url,
                    title: title,
                  },
                },
              ],
            });

            additionalConfig.session.updateBlock(
              additionalConfig.researchBlockId,
              [
                {
                  op: 'replace',
                  path: '/data/subSteps',
                  value: researchBlock.data.subSteps,
                },
              ],
            );
          } else if (
            readingEmitted &&
            researchBlock &&
            researchBlock.type === 'research'
          ) {
            const subStepIndex = researchBlock.data.subSteps.findIndex(
              (step: any) => step.id === readingBlockId,
            );

            const subStep = researchBlock.data.subSteps[
              subStepIndex
            ] as ReadingResearchBlock;

            subStep.reading.push({
              content: '',
              metadata: {
                url,
                title: title,
              },
            });

            additionalConfig.session.updateBlock(
              additionalConfig.researchBlockId,
              [
                {
                  op: 'replace',
                  path: '/data/subSteps',
                  value: researchBlock.data.subSteps,
                },
              ],
            );
          }

          const markdown = turndownService.turndown(text);

          results.push({
            content: markdown,
            metadata: {
              url,
              title: title,
            },
          });
        } catch (error) {
          results.push({
            content: `Failed to fetch content from ${url}: ${error}`,
            metadata: {
              url,
              title: `Error fetching ${url}`,
            },
          });
        }
      }),
    );

    return {
      type: 'search_results',
      results,
    };
  },
};

export default scrapeURLAction;
129 src/lib/agents/search/researcher/actions/socialSearch.ts Normal file
@@ -0,0 +1,129 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { Chunk, SearchResultsResearchBlock } from '@/lib/types';
import { searchSearxng } from '@/lib/searxng';

const schema = z.object({
  queries: z.array(z.string()).describe('List of social search queries'),
});

const socialSearchDescription = `
Use this tool to perform social media searches for relevant posts, discussions, and trends related to the user's query. Provide a list of concise search queries that will help gather comprehensive social media information on the topic at hand.
You can provide up to 3 queries at a time. Make sure the queries are specific and relevant to the user's needs.

For example, if the user is interested in public opinion on electric vehicles, your queries could be:
1. "Electric vehicles public opinion 2024"
2. "Social media discussions on EV adoption"
3. "Trends in electric vehicle usage"

If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed social media information.
`;

const socialSearchAction: ResearchAction<typeof schema> = {
  name: 'social_search',
  schema: schema,
  getDescription: () => socialSearchDescription,
  getToolDescription: () =>
    "Use this tool to perform social media searches for relevant posts, discussions, and trends related to the user's query. Provide a list of concise search queries that will help gather comprehensive social media information on the topic at hand.",
  enabled: (config) =>
    config.sources.includes('discussions') &&
    config.classification.classification.skipSearch === false &&
    config.classification.classification.discussionSearch === true,
  execute: async (input, additionalConfig) => {
    input.queries = input.queries.slice(0, 3);

    const researchBlock = additionalConfig.session.getBlock(
      additionalConfig.researchBlockId,
    );

    if (researchBlock && researchBlock.type === 'research') {
      researchBlock.data.subSteps.push({
        type: 'searching',
        id: crypto.randomUUID(),
        searching: input.queries,
      });

      additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
        {
          op: 'replace',
          path: '/data/subSteps',
          value: researchBlock.data.subSteps,
        },
      ]);
    }

    const searchResultsBlockId = crypto.randomUUID();
    let searchResultsEmitted = false;

    let results: Chunk[] = [];

    const search = async (q: string) => {
      const res = await searchSearxng(q, {
        engines: ['reddit'],
      });

      const resultChunks: Chunk[] = res.results.map((r) => ({
        content: r.content || r.title,
        metadata: {
          title: r.title,
          url: r.url,
        },
      }));

      results.push(...resultChunks);

      if (
        !searchResultsEmitted &&
        researchBlock &&
        researchBlock.type === 'research'
      ) {
        searchResultsEmitted = true;

        researchBlock.data.subSteps.push({
          id: searchResultsBlockId,
          type: 'search_results',
          reading: resultChunks,
        });

        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
          {
            op: 'replace',
            path: '/data/subSteps',
            value: researchBlock.data.subSteps,
          },
        ]);
      } else if (
        searchResultsEmitted &&
        researchBlock &&
        researchBlock.type === 'research'
      ) {
        const subStepIndex = researchBlock.data.subSteps.findIndex(
          (step) => step.id === searchResultsBlockId,
        );

        const subStep = researchBlock.data.subSteps[
          subStepIndex
        ] as SearchResultsResearchBlock;

        subStep.reading.push(...resultChunks);

        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
          {
            op: 'replace',
            path: '/data/subSteps',
            value: researchBlock.data.subSteps,
          },
        ]);
      }
    };

    await Promise.all(input.queries.map(search));

    return {
      type: 'search_results',
      results,
    };
  },
};

export default socialSearchAction;
102 src/lib/agents/search/researcher/actions/uploadsSearch.ts Normal file
@@ -0,0 +1,102 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import UploadStore from '@/lib/uploads/store';

const schema = z.object({
  queries: z
    .array(z.string())
    .describe(
      'A list of queries to search in user uploaded files. Can be a maximum of 3 queries.',
    ),
});

const uploadsSearchAction: ResearchAction<typeof schema> = {
  name: 'uploads_search',
  enabled: (config) =>
    (config.classification.classification.personalSearch &&
      config.fileIds.length > 0) ||
    config.fileIds.length > 0,
  schema,
  getToolDescription: () =>
    `Use this tool to perform searches over the user's uploaded files. This is useful when you need to gather information from the user's documents to answer their questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.`,
  getDescription: () => `
Use this tool to perform searches over the user's uploaded files. This is useful when you need to gather information from the user's documents to answer their questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.
Always ensure that the queries you use are directly relevant to the user's request and pertain to the content of their uploaded files.

For example, if the user says "Please find information about X in my uploaded documents", you can call this tool with a query related to X to retrieve the relevant information from their files.
Never use this tool to search the web or for information that is not contained within the user's uploaded files.
`,
  execute: async (input, additionalConfig) => {
    input.queries = input.queries.slice(0, 3);

    const researchBlock = additionalConfig.session.getBlock(
      additionalConfig.researchBlockId,
    );

    if (researchBlock && researchBlock.type === 'research') {
      researchBlock.data.subSteps.push({
        id: crypto.randomUUID(),
        type: 'upload_searching',
        queries: input.queries,
      });

      additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
        {
          op: 'replace',
          path: '/data/subSteps',
          value: researchBlock.data.subSteps,
        },
      ]);
    }

    const uploadStore = new UploadStore({
      embeddingModel: additionalConfig.embedding,
      fileIds: additionalConfig.fileIds,
    });

    const results = await uploadStore.query(input.queries, 10);

    const seenIds = new Map<string, number>();

    const filteredSearchResults = results
      .map((result, index) => {
        if (result.metadata.url && !seenIds.has(result.metadata.url)) {
          seenIds.set(result.metadata.url, index);
          return result;
        } else if (result.metadata.url && seenIds.has(result.metadata.url)) {
          const existingIndex = seenIds.get(result.metadata.url)!;
          const existingResult = results[existingIndex];

          existingResult.content += `\n\n${result.content}`;

          return undefined;
        }

        return result;
      })
      .filter((r) => r !== undefined);

    if (researchBlock && researchBlock.type === 'research') {
      researchBlock.data.subSteps.push({
        id: crypto.randomUUID(),
        type: 'upload_search_results',
        results: filteredSearchResults,
      });

      additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
        {
          op: 'replace',
          path: '/data/subSteps',
          value: researchBlock.data.subSteps,
        },
      ]);
    }

    return {
      type: 'search_results',
      results: filteredSearchResults,
    };
  },
};

export default uploadsSearchAction;
182 src/lib/agents/search/researcher/actions/webSearch.ts Normal file
@@ -0,0 +1,182 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { searchSearxng } from '@/lib/searxng';
import { Chunk, SearchResultsResearchBlock } from '@/lib/types';

const actionSchema = z.object({
  type: z.literal('web_search'),
  queries: z
    .array(z.string())
    .describe('An array of search queries to perform web searches for.'),
});

const speedModePrompt = `
Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.
You are currently on speed mode, meaning you only get to call this tool once. Make sure to prioritize the most important queries that are likely to get you the needed information in one go.

Your queries should be very targeted and specific to the information you need; avoid broad or generic queries.
Your queries shouldn't be sentences but rather keywords that are SEO friendly and can be used to search the web for information.

For example, if the user is asking about the features of a new technology, you might use queries like "GPT-5.1 features", "GPT-5.1 release date", "GPT-5.1 improvements" rather than a broad query like "Tell me about GPT-5.1".

You can search for 3 queries in one go, so make sure to utilize all 3 queries to maximize the information you can gather. If a question is simple, then split your queries to cover different aspects or related topics to get a comprehensive understanding.
If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed information.
`;

const balancedModePrompt = `
Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.

You can call this tool several times if needed to gather enough information.
Start initially with broader queries to get an overview, then narrow down with more specific queries based on the results you receive.

Your queries shouldn't be sentences but rather keywords that are SEO friendly and can be used to search the web for information.

For example, if the user is asking about Tesla, your actions should be like:
1. __reasoning_preamble "The user is asking about Tesla. I will start with broader queries to get an overview of Tesla, then narrow down with more specific queries based on the results I receive." then
2. web_search ["Tesla", "Tesla latest news", "Tesla stock price"] then
3. __reasoning_preamble "Based on the previous search results, I will now narrow down my queries to focus on Tesla's recent developments and stock performance." then
4. web_search ["Tesla Q2 2025 earnings", "Tesla new model 2025", "Tesla stock analysis"] then
5. __reasoning_preamble "I have gathered enough information to provide a comprehensive answer." then
6. done.

You can search for 3 queries in one go, so make sure to utilize all 3 queries to maximize the information you can gather. If a question is simple, then split your queries to cover different aspects or related topics to get a comprehensive understanding.
If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed information. You can call this tool multiple times as needed.
`;

const qualityModePrompt = `
Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.

You have to call this tool several times to gather enough information unless the question is very simple (like greeting questions or basic facts).
Start initially with broader queries to get an overview, then narrow down with more specific queries based on the results you receive.
Never stop before at least 5-6 iterations of searches unless the user question is very simple.

Your queries shouldn't be sentences but rather keywords that are SEO friendly and can be used to search the web for information.

You can search for 3 queries in one go, so make sure to utilize all 3 queries to maximize the information you can gather. If a question is simple, then split your queries to cover different aspects or related topics to get a comprehensive understanding.
If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed information. You can call this tool multiple times as needed.
`;

const webSearchAction: ResearchAction<typeof actionSchema> = {
  name: 'web_search',
  schema: actionSchema,
  getToolDescription: () =>
    "Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.",
  getDescription: (config) => {
    let prompt = '';

    switch (config.mode) {
      case 'speed':
        prompt = speedModePrompt;
        break;
      case 'balanced':
        prompt = balancedModePrompt;
        break;
      case 'quality':
        prompt = qualityModePrompt;
        break;
      default:
        prompt = speedModePrompt;
        break;
    }

    return prompt;
  },
  enabled: (config) =>
    config.sources.includes('web') &&
    config.classification.classification.skipSearch === false,
  execute: async (input, additionalConfig) => {
    input.queries = input.queries.slice(0, 3);

    const researchBlock = additionalConfig.session.getBlock(
      additionalConfig.researchBlockId,
    );

    if (researchBlock && researchBlock.type === 'research') {
      researchBlock.data.subSteps.push({
        id: crypto.randomUUID(),
        type: 'searching',
        searching: input.queries,
      });

      additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
        {
          op: 'replace',
          path: '/data/subSteps',
          value: researchBlock.data.subSteps,
        },
      ]);
    }

    const searchResultsBlockId = crypto.randomUUID();
    let searchResultsEmitted = false;

    let results: Chunk[] = [];

    const search = async (q: string) => {
      const res = await searchSearxng(q);

      const resultChunks: Chunk[] = res.results.map((r) => ({
        content: r.content || r.title,
        metadata: {
          title: r.title,
          url: r.url,
        },
      }));

      results.push(...resultChunks);

      if (
        !searchResultsEmitted &&
        researchBlock &&
        researchBlock.type === 'research'
      ) {
        searchResultsEmitted = true;

        researchBlock.data.subSteps.push({
          id: searchResultsBlockId,
          type: 'search_results',
          reading: resultChunks,
        });

        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
          {
            op: 'replace',
            path: '/data/subSteps',
            value: researchBlock.data.subSteps,
          },
        ]);
      } else if (
        searchResultsEmitted &&
        researchBlock &&
        researchBlock.type === 'research'
      ) {
        const subStepIndex = researchBlock.data.subSteps.findIndex(
          (step) => step.id === searchResultsBlockId,
        );

        const subStep = researchBlock.data.subSteps[
          subStepIndex
        ] as SearchResultsResearchBlock;

        subStep.reading.push(...resultChunks);

        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
          {
            op: 'replace',
            path: '/data/subSteps',
            value: researchBlock.data.subSteps,
          },
        ]);
      }
    };

    await Promise.all(input.queries.map(search));

    return {
      type: 'search_results',
      results,
    };
  },
};

export default webSearchAction;
222  src/lib/agents/search/researcher/index.ts  Normal file
@@ -0,0 +1,222 @@
import { ActionOutput, ResearcherInput, ResearcherOutput } from '../types';
import { ActionRegistry } from './actions';
import { getResearcherPrompt } from '@/lib/prompts/search/researcher';
import SessionManager from '@/lib/session';
import { Message, ReasoningResearchBlock } from '@/lib/types';
import formatChatHistoryAsString from '@/lib/utils/formatHistory';
import { ToolCall } from '@/lib/models/types';

class Researcher {
  async research(
    session: SessionManager,
    input: ResearcherInput,
  ): Promise<ResearcherOutput> {
    let actionOutput: ActionOutput[] = [];
    let maxIteration =
      input.config.mode === 'speed'
        ? 2
        : input.config.mode === 'balanced'
          ? 6
          : 25;

    const availableTools = ActionRegistry.getAvailableActionTools({
      classification: input.classification,
      fileIds: input.config.fileIds,
      mode: input.config.mode,
      sources: input.config.sources,
    });

    const availableActionsDescription =
      ActionRegistry.getAvailableActionsDescriptions({
        classification: input.classification,
        fileIds: input.config.fileIds,
        mode: input.config.mode,
        sources: input.config.sources,
      });

    const researchBlockId = crypto.randomUUID();

    session.emitBlock({
      id: researchBlockId,
      type: 'research',
      data: {
        subSteps: [],
      },
    });

    const agentMessageHistory: Message[] = [
      {
        role: 'user',
        content: `
<conversation>
${formatChatHistoryAsString(input.chatHistory.slice(-10))}
User: ${input.followUp} (Standalone question: ${input.classification.standaloneFollowUp})
</conversation>
`,
      },
    ];

    for (let i = 0; i < maxIteration; i++) {
      const researcherPrompt = getResearcherPrompt(
        availableActionsDescription,
        input.config.mode,
        i,
        maxIteration,
        input.config.fileIds,
      );

      const actionStream = input.config.llm.streamText({
        messages: [
          {
            role: 'system',
            content: researcherPrompt,
          },
          ...agentMessageHistory,
        ],
        tools: availableTools,
      });

      const block = session.getBlock(researchBlockId);

      let reasoningEmitted = false;
      let reasoningId = crypto.randomUUID();

      let finalToolCalls: ToolCall[] = [];

      for await (const partialRes of actionStream) {
        if (partialRes.toolCallChunk.length > 0) {
          partialRes.toolCallChunk.forEach((tc) => {
            if (
              tc.name === '__reasoning_preamble' &&
              tc.arguments['plan'] &&
              !reasoningEmitted &&
              block &&
              block.type === 'research'
            ) {
              reasoningEmitted = true;

              block.data.subSteps.push({
                id: reasoningId,
                type: 'reasoning',
                reasoning: tc.arguments['plan'],
              });

              session.updateBlock(researchBlockId, [
                {
                  op: 'replace',
                  path: '/data/subSteps',
                  value: block.data.subSteps,
                },
              ]);
            } else if (
              tc.name === '__reasoning_preamble' &&
              tc.arguments['plan'] &&
              reasoningEmitted &&
              block &&
              block.type === 'research'
            ) {
              const subStepIndex = block.data.subSteps.findIndex(
                (step: any) => step.id === reasoningId,
              );

              if (subStepIndex !== -1) {
                const subStep = block.data.subSteps[
                  subStepIndex
                ] as ReasoningResearchBlock;
                subStep.reasoning = tc.arguments['plan'];
                session.updateBlock(researchBlockId, [
                  {
                    op: 'replace',
                    path: '/data/subSteps',
                    value: block.data.subSteps,
                  },
                ]);
              }
            }

            const existingIndex = finalToolCalls.findIndex(
              (ftc) => ftc.id === tc.id,
            );

            if (existingIndex !== -1) {
              finalToolCalls[existingIndex].arguments = tc.arguments;
            } else {
              finalToolCalls.push(tc);
            }
          });
        }
      }

      if (finalToolCalls.length === 0) {
        break;
      }

      if (finalToolCalls[finalToolCalls.length - 1].name === 'done') {
        break;
      }

      agentMessageHistory.push({
        role: 'assistant',
        content: '',
        tool_calls: finalToolCalls,
      });

      const actionResults = await ActionRegistry.executeAll(finalToolCalls, {
        llm: input.config.llm,
        embedding: input.config.embedding,
        session: session,
        researchBlockId: researchBlockId,
        fileIds: input.config.fileIds,
      });

      actionOutput.push(...actionResults);

      actionResults.forEach((action, i) => {
        agentMessageHistory.push({
          role: 'tool',
          id: finalToolCalls[i].id,
          name: finalToolCalls[i].name,
          content: JSON.stringify(action),
        });
      });
    }

    const searchResults = actionOutput
      .filter((a) => a.type === 'search_results')
      .flatMap((a) => a.results);

    const seenUrls = new Map<string, number>();

    const filteredSearchResults = searchResults
      .map((result, index) => {
        if (result.metadata.url && !seenUrls.has(result.metadata.url)) {
          seenUrls.set(result.metadata.url, index);
          return result;
        } else if (result.metadata.url && seenUrls.has(result.metadata.url)) {
          const existingIndex = seenUrls.get(result.metadata.url)!;

          const existingResult = searchResults[existingIndex];

          existingResult.content += `\n\n${result.content}`;

          return undefined;
        }

        return result;
      })
      .filter((r) => r !== undefined);

    session.emitBlock({
      id: crypto.randomUUID(),
      type: 'source',
      data: filteredSearchResults,
    });

    return {
      findings: actionOutput,
      searchFindings: filteredSearchResults,
    };
  }
}

export default Researcher;
122  src/lib/agents/search/types.ts  Normal file
@@ -0,0 +1,122 @@
import z from 'zod';
import BaseLLM from '../../models/base/llm';
import BaseEmbedding from '@/lib/models/base/embedding';
import SessionManager from '@/lib/session';
import { ChatTurnMessage, Chunk } from '@/lib/types';

export type SearchSources = 'web' | 'discussions' | 'academic';

export type SearchAgentConfig = {
  sources: SearchSources[];
  fileIds: string[];
  llm: BaseLLM<any>;
  embedding: BaseEmbedding<any>;
  mode: 'speed' | 'balanced' | 'quality';
  systemInstructions: string;
};

export type SearchAgentInput = {
  chatHistory: ChatTurnMessage[];
  followUp: string;
  config: SearchAgentConfig;
  chatId: string;
  messageId: string;
};

export type WidgetInput = {
  chatHistory: ChatTurnMessage[];
  followUp: string;
  classification: ClassifierOutput;
  llm: BaseLLM<any>;
};

export type Widget = {
  type: string;
  shouldExecute: (classification: ClassifierOutput) => boolean;
  execute: (input: WidgetInput) => Promise<WidgetOutput | void>;
};

export type WidgetOutput = {
  type: string;
  llmContext: string;
  data: any;
};

export type ClassifierInput = {
  llm: BaseLLM<any>;
  enabledSources: SearchSources[];
  query: string;
  chatHistory: ChatTurnMessage[];
};

export type ClassifierOutput = {
  classification: {
    skipSearch: boolean;
    personalSearch: boolean;
    academicSearch: boolean;
    discussionSearch: boolean;
    showWeatherWidget: boolean;
    showStockWidget: boolean;
    showCalculationWidget: boolean;
  };
  standaloneFollowUp: string;
};

export type AdditionalConfig = {
  llm: BaseLLM<any>;
  embedding: BaseEmbedding<any>;
  session: SessionManager;
};

export type ResearcherInput = {
  chatHistory: ChatTurnMessage[];
  followUp: string;
  classification: ClassifierOutput;
  config: SearchAgentConfig;
};

export type ResearcherOutput = {
  findings: ActionOutput[];
  searchFindings: Chunk[];
};

export type SearchActionOutput = {
  type: 'search_results';
  results: Chunk[];
};

export type DoneActionOutput = {
  type: 'done';
};

export type ReasoningResearchAction = {
  type: 'reasoning';
  reasoning: string;
};

export type ActionOutput =
  | SearchActionOutput
  | DoneActionOutput
  | ReasoningResearchAction;

export interface ResearchAction<
  TSchema extends z.ZodObject<any> = z.ZodObject<any>,
> {
  name: string;
  schema: z.ZodObject<any>;
  getToolDescription: (config: { mode: SearchAgentConfig['mode'] }) => string;
  getDescription: (config: { mode: SearchAgentConfig['mode'] }) => string;
  enabled: (config: {
    classification: ClassifierOutput;
    fileIds: string[];
    mode: SearchAgentConfig['mode'];
    sources: SearchSources[];
  }) => boolean;
  execute: (
    params: z.infer<TSchema>,
    additionalConfig: AdditionalConfig & {
      researchBlockId: string;
      fileIds: string[];
    },
  ) => Promise<ActionOutput>;
}
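// --- Editor's illustrative sketch (not part of this commit) ---
// A minimal, hypothetical action conforming to the ResearchAction interface
// above. The name `example_lookup`, its behaviour, and its file location
// (assumed to sit next to the other researcher actions) are invented for
// illustration; real actions such as webSearch and uploadsSearch follow the
// same shape and are wired up through the ActionRegistry used by Researcher.
import z from 'zod';
import { ResearchAction } from '../../types';

const exampleSchema = z.object({
  type: z.literal('example_lookup'),
  queries: z.array(z.string()).describe('Queries to look up.'),
});

const exampleAction: ResearchAction<typeof exampleSchema> = {
  name: 'example_lookup',
  schema: exampleSchema,
  getToolDescription: () => 'Look up example data for the given queries.',
  getDescription: () => 'Look up example data for the given queries.',
  // Only offer the tool when the classifier decided a search is needed.
  enabled: (config) =>
    config.classification.classification.skipSearch === false,
  // Return results in the SearchActionOutput shape so the researcher can
  // merge them with findings from other actions.
  execute: async (input) => ({
    type: 'search_results',
    results: input.queries.map((q) => ({
      content: `Placeholder result for ${q}`,
      metadata: { title: q, url: `https://example.com/${encodeURIComponent(q)}` },
    })),
  }),
};

export default exampleAction;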
71  src/lib/agents/search/widgets/calculationWidget.ts  Normal file
@@ -0,0 +1,71 @@
import z from 'zod';
import { Widget } from '../types';
import formatChatHistoryAsString from '@/lib/utils/formatHistory';
import { evaluate as mathEval } from 'mathjs';

const schema = z.object({
  expression: z
    .string()
    .describe('Mathematical expression to calculate or evaluate.'),
  notPresent: z
    .boolean()
    .describe('Whether there is no need for the calculation widget.'),
});

const system = `
<role>
Assistant is a calculation expression extractor. You will receive a user follow up and a conversation history.
Your task is to determine if there is a mathematical expression that needs to be calculated or evaluated. If there is, extract the expression and return it. If there is no need for any calculation, set notPresent to true.
</role>

<instructions>
Make sure that the extracted expression is valid and can be used to calculate the result with the Math.js library (https://mathjs.org/). If the expression is not valid, set notPresent to true.
If you feel like you cannot extract a valid expression, set notPresent to true.
</instructions>

<output_format>
You must respond in the following JSON format without any extra text, explanations or filler sentences:
{
  "expression": string,
  "notPresent": boolean
}
</output_format>
`;

const calculationWidget: Widget = {
  type: 'calculationWidget',
  shouldExecute: (classification) =>
    classification.classification.showCalculationWidget,
  execute: async (input) => {
    const output = await input.llm.generateObject<typeof schema>({
      messages: [
        {
          role: 'system',
          content: system,
        },
        {
          role: 'user',
          content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_follow_up>\n${input.followUp}\n</user_follow_up>`,
        },
      ],
      schema,
    });

    if (output.notPresent) {
      return;
    }

    const result = mathEval(output.expression);

    return {
      type: 'calculation_result',
      llmContext: `The result of the calculation for the expression "${output.expression}" is: ${result}`,
      data: {
        expression: output.expression,
        result,
      },
    };
  },
};

export default calculationWidget;
36  src/lib/agents/search/widgets/executor.ts  Normal file
@@ -0,0 +1,36 @@
import { Widget, WidgetInput, WidgetOutput } from '../types';

class WidgetExecutor {
  static widgets = new Map<string, Widget>();

  static register(widget: Widget) {
    this.widgets.set(widget.type, widget);
  }

  static getWidget(type: string): Widget | undefined {
    return this.widgets.get(type);
  }

  static async executeAll(input: WidgetInput): Promise<WidgetOutput[]> {
    const results: WidgetOutput[] = [];

    await Promise.all(
      Array.from(this.widgets.values()).map(async (widget) => {
        try {
          if (widget.shouldExecute(input.classification)) {
            const output = await widget.execute(input);
            if (output) {
              results.push(output);
            }
          }
        } catch (e) {
          console.log(`Error executing widget ${widget.type}:`, e);
        }
      }),
    );

    return results;
  }
}

export default WidgetExecutor;
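// --- Editor's illustrative note (not part of this commit) ---
// Registered widgets are typically run with a single call; each widget's
// shouldExecute() gate is driven by the classifier output, so only relevant
// widgets actually execute. Roughly:
//
//   const widgetOutputs = await WidgetExecutor.executeAll({
//     chatHistory,
//     followUp,
//     classification,
//     llm,
//   });
//
// Each WidgetOutput carries `llmContext` (extra context handed to the model)
// and `data` (the structured widget payload).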
10  src/lib/agents/search/widgets/index.ts  Normal file
@@ -0,0 +1,10 @@
import calculationWidget from './calculationWidget';
import WidgetExecutor from './executor';
import weatherWidget from './weatherWidget';
import stockWidget from './stockWidget';

WidgetExecutor.register(weatherWidget);
WidgetExecutor.register(calculationWidget);
WidgetExecutor.register(stockWidget);

export { WidgetExecutor };
434  src/lib/agents/search/widgets/stockWidget.ts  Normal file
@@ -0,0 +1,434 @@
import z from 'zod';
import { Widget } from '../types';
import YahooFinance from 'yahoo-finance2';
import formatChatHistoryAsString from '@/lib/utils/formatHistory';

const yf = new YahooFinance({
  suppressNotices: ['yahooSurvey'],
});

const schema = z.object({
  name: z
    .string()
    .describe(
      "The stock name for example Nvidia, Google, Apple, Microsoft etc. You can also return ticker if you're aware of it otherwise just use the name.",
    ),
  comparisonNames: z
    .array(z.string())
    .max(3)
    .describe(
      "Optional array of up to 3 stock names to compare against the base name (e.g., ['Microsoft', 'GOOGL', 'Meta']). Charts will show percentage change comparison.",
    ),
  notPresent: z
    .boolean()
    .describe('Whether there is no need for the stock widget.'),
});

const systemPrompt = `
<role>
You are a stock ticker/name extractor. You will receive a user follow up and a conversation history.
Your task is to determine if the user is asking about stock information and extract the stock name(s) they want data for.
</role>

<instructions>
- If the user is asking about a stock, extract the primary stock name or ticker.
- If the user wants to compare stocks, extract up to 3 comparison stock names in comparisonNames.
- You can use either stock names (e.g., "Nvidia", "Apple") or tickers (e.g., "NVDA", "AAPL").
- If you cannot determine a valid stock or the query is not stock-related, set notPresent to true.
- If no comparison is needed, set comparisonNames to an empty array.
</instructions>

<output_format>
You must respond in the following JSON format without any extra text, explanations or filler sentences:
{
  "name": string,
  "comparisonNames": string[],
  "notPresent": boolean
}
</output_format>
`;

const stockWidget: Widget = {
  type: 'stockWidget',
  shouldExecute: (classification) =>
    classification.classification.showStockWidget,
  execute: async (input) => {
    const output = await input.llm.generateObject<typeof schema>({
      messages: [
        {
          role: 'system',
          content: systemPrompt,
        },
        {
          role: 'user',
          content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_follow_up>\n${input.followUp}\n</user_follow_up>`,
        },
      ],
      schema,
    });

    if (output.notPresent) {
      return;
    }

    const params = output;
    try {
      const name = params.name;

      const findings = await yf.search(name);

      if (findings.quotes.length === 0)
        throw new Error(`Failed to find quote for name/symbol: ${name}`);

      const ticker = findings.quotes[0].symbol as string;

      const quote: any = await yf.quote(ticker);

      const chartPromises = {
        '1D': yf
          .chart(ticker, {
            period1: new Date(Date.now() - 2 * 24 * 60 * 60 * 1000),
            period2: new Date(),
            interval: '5m',
          })
          .catch(() => null),
        '5D': yf
          .chart(ticker, {
            period1: new Date(Date.now() - 6 * 24 * 60 * 60 * 1000),
            period2: new Date(),
            interval: '15m',
          })
          .catch(() => null),
        '1M': yf
          .chart(ticker, {
            period1: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000),
            interval: '1d',
          })
          .catch(() => null),
        '3M': yf
          .chart(ticker, {
            period1: new Date(Date.now() - 90 * 24 * 60 * 60 * 1000),
            interval: '1d',
          })
          .catch(() => null),
        '6M': yf
          .chart(ticker, {
            period1: new Date(Date.now() - 180 * 24 * 60 * 60 * 1000),
            interval: '1d',
          })
          .catch(() => null),
        '1Y': yf
          .chart(ticker, {
            period1: new Date(Date.now() - 365 * 24 * 60 * 60 * 1000),
            interval: '1d',
          })
          .catch(() => null),
        MAX: yf
          .chart(ticker, {
            period1: new Date(Date.now() - 10 * 365 * 24 * 60 * 60 * 1000),
            interval: '1wk',
          })
          .catch(() => null),
      };

      const charts = await Promise.all([
        chartPromises['1D'],
        chartPromises['5D'],
        chartPromises['1M'],
        chartPromises['3M'],
        chartPromises['6M'],
        chartPromises['1Y'],
        chartPromises['MAX'],
      ]);

      const [chart1D, chart5D, chart1M, chart3M, chart6M, chart1Y, chartMAX] =
        charts;

      if (!quote) {
        throw new Error(`No data found for ticker: ${ticker}`);
      }

      let comparisonData: any = null;
      if (params.comparisonNames.length > 0) {
        const comparisonPromises = params.comparisonNames
          .slice(0, 3)
          .map(async (compName) => {
            try {
              const compFindings = await yf.search(compName);

              if (compFindings.quotes.length === 0) return null;

              const compTicker = compFindings.quotes[0].symbol as string;
              const compQuote = await yf.quote(compTicker);
              const compCharts = await Promise.all([
                yf
                  .chart(compTicker, {
                    period1: new Date(Date.now() - 2 * 24 * 60 * 60 * 1000),
                    period2: new Date(),
                    interval: '5m',
                  })
                  .catch(() => null),
                yf
                  .chart(compTicker, {
                    period1: new Date(Date.now() - 6 * 24 * 60 * 60 * 1000),
                    period2: new Date(),
                    interval: '15m',
                  })
                  .catch(() => null),
                yf
                  .chart(compTicker, {
                    period1: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000),
                    interval: '1d',
                  })
                  .catch(() => null),
                yf
                  .chart(compTicker, {
                    period1: new Date(Date.now() - 90 * 24 * 60 * 60 * 1000),
                    interval: '1d',
                  })
                  .catch(() => null),
                yf
                  .chart(compTicker, {
                    period1: new Date(Date.now() - 180 * 24 * 60 * 60 * 1000),
                    interval: '1d',
                  })
                  .catch(() => null),
                yf
                  .chart(compTicker, {
                    period1: new Date(Date.now() - 365 * 24 * 60 * 60 * 1000),
                    interval: '1d',
                  })
                  .catch(() => null),
                yf
                  .chart(compTicker, {
                    period1: new Date(
                      Date.now() - 10 * 365 * 24 * 60 * 60 * 1000,
                    ),
                    interval: '1wk',
                  })
                  .catch(() => null),
              ]);
              return {
                ticker: compTicker,
                name: compQuote.shortName || compTicker,
                charts: compCharts,
              };
            } catch (error) {
              console.error(
                `Failed to fetch comparison ticker ${compName}:`,
                error,
              );
              return null;
            }
          });
        const compResults = await Promise.all(comparisonPromises);
        comparisonData = compResults.filter((r) => r !== null);
      }

      const stockData = {
        symbol: quote.symbol,
        shortName: quote.shortName || quote.longName || ticker,
        longName: quote.longName,
        exchange: quote.fullExchangeName || quote.exchange,
        currency: quote.currency,
        quoteType: quote.quoteType,

        marketState: quote.marketState,
        regularMarketTime: quote.regularMarketTime,
        postMarketTime: quote.postMarketTime,
        preMarketTime: quote.preMarketTime,

        regularMarketPrice: quote.regularMarketPrice,
        regularMarketChange: quote.regularMarketChange,
        regularMarketChangePercent: quote.regularMarketChangePercent,
        regularMarketPreviousClose: quote.regularMarketPreviousClose,
        regularMarketOpen: quote.regularMarketOpen,
        regularMarketDayHigh: quote.regularMarketDayHigh,
        regularMarketDayLow: quote.regularMarketDayLow,

        postMarketPrice: quote.postMarketPrice,
        postMarketChange: quote.postMarketChange,
        postMarketChangePercent: quote.postMarketChangePercent,
        preMarketPrice: quote.preMarketPrice,
        preMarketChange: quote.preMarketChange,
        preMarketChangePercent: quote.preMarketChangePercent,

        regularMarketVolume: quote.regularMarketVolume,
        averageDailyVolume3Month: quote.averageDailyVolume3Month,
        averageDailyVolume10Day: quote.averageDailyVolume10Day,
        bid: quote.bid,
        bidSize: quote.bidSize,
        ask: quote.ask,
        askSize: quote.askSize,

        fiftyTwoWeekLow: quote.fiftyTwoWeekLow,
        fiftyTwoWeekHigh: quote.fiftyTwoWeekHigh,
        fiftyTwoWeekChange: quote.fiftyTwoWeekChange,
        fiftyTwoWeekChangePercent: quote.fiftyTwoWeekChangePercent,

        marketCap: quote.marketCap,
        trailingPE: quote.trailingPE,
        forwardPE: quote.forwardPE,
        priceToBook: quote.priceToBook,
        bookValue: quote.bookValue,
        earningsPerShare: quote.epsTrailingTwelveMonths,
        epsForward: quote.epsForward,

        dividendRate: quote.dividendRate,
        dividendYield: quote.dividendYield,
        exDividendDate: quote.exDividendDate,
        trailingAnnualDividendRate: quote.trailingAnnualDividendRate,
        trailingAnnualDividendYield: quote.trailingAnnualDividendYield,

        beta: quote.beta,

        fiftyDayAverage: quote.fiftyDayAverage,
        fiftyDayAverageChange: quote.fiftyDayAverageChange,
        fiftyDayAverageChangePercent: quote.fiftyDayAverageChangePercent,
        twoHundredDayAverage: quote.twoHundredDayAverage,
        twoHundredDayAverageChange: quote.twoHundredDayAverageChange,
        twoHundredDayAverageChangePercent:
          quote.twoHundredDayAverageChangePercent,

        sector: quote.sector,
        industry: quote.industry,
        website: quote.website,

        chartData: {
          '1D': chart1D
            ? {
                timestamps: chart1D.quotes.map((q: any) => q.date.getTime()),
                prices: chart1D.quotes.map((q: any) => q.close),
              }
            : null,
          '5D': chart5D
            ? {
                timestamps: chart5D.quotes.map((q: any) => q.date.getTime()),
                prices: chart5D.quotes.map((q: any) => q.close),
              }
            : null,
          '1M': chart1M
            ? {
                timestamps: chart1M.quotes.map((q: any) => q.date.getTime()),
                prices: chart1M.quotes.map((q: any) => q.close),
              }
            : null,
          '3M': chart3M
            ? {
                timestamps: chart3M.quotes.map((q: any) => q.date.getTime()),
                prices: chart3M.quotes.map((q: any) => q.close),
              }
            : null,
          '6M': chart6M
            ? {
                timestamps: chart6M.quotes.map((q: any) => q.date.getTime()),
                prices: chart6M.quotes.map((q: any) => q.close),
              }
            : null,
          '1Y': chart1Y
            ? {
                timestamps: chart1Y.quotes.map((q: any) => q.date.getTime()),
                prices: chart1Y.quotes.map((q: any) => q.close),
              }
            : null,
          MAX: chartMAX
            ? {
                timestamps: chartMAX.quotes.map((q: any) => q.date.getTime()),
                prices: chartMAX.quotes.map((q: any) => q.close),
              }
            : null,
        },
        comparisonData: comparisonData
          ? comparisonData.map((comp: any) => ({
              ticker: comp.ticker,
              name: comp.name,
              chartData: {
                '1D': comp.charts[0]
                  ? {
                      timestamps: comp.charts[0].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[0].quotes.map((q: any) => q.close),
                    }
                  : null,
                '5D': comp.charts[1]
                  ? {
                      timestamps: comp.charts[1].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[1].quotes.map((q: any) => q.close),
                    }
                  : null,
                '1M': comp.charts[2]
                  ? {
                      timestamps: comp.charts[2].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[2].quotes.map((q: any) => q.close),
                    }
                  : null,
                '3M': comp.charts[3]
                  ? {
                      timestamps: comp.charts[3].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[3].quotes.map((q: any) => q.close),
                    }
                  : null,
                '6M': comp.charts[4]
                  ? {
                      timestamps: comp.charts[4].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[4].quotes.map((q: any) => q.close),
                    }
                  : null,
                '1Y': comp.charts[5]
                  ? {
                      timestamps: comp.charts[5].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[5].quotes.map((q: any) => q.close),
                    }
                  : null,
                MAX: comp.charts[6]
                  ? {
                      timestamps: comp.charts[6].quotes.map((q: any) =>
                        q.date.getTime(),
                      ),
                      prices: comp.charts[6].quotes.map((q: any) => q.close),
                    }
                  : null,
              },
            }))
          : null,
      };

      return {
        type: 'stock',
        llmContext: `Current price of ${stockData.shortName} (${stockData.symbol}) is ${stockData.regularMarketPrice} ${stockData.currency}. Other details: ${JSON.stringify(
          {
            marketState: stockData.marketState,
            regularMarketChange: stockData.regularMarketChange,
            regularMarketChangePercent: stockData.regularMarketChangePercent,
            marketCap: stockData.marketCap,
            peRatio: stockData.trailingPE,
            dividendYield: stockData.dividendYield,
          },
        )}`,
        data: stockData,
      };
    } catch (error: any) {
      return {
        type: 'stock',
        llmContext: 'Failed to fetch stock data.',
        data: {
          error: `Error fetching stock data: ${error.message || error}`,
          ticker: params.name,
        },
      };
    }
  },
};

export default stockWidget;
203  src/lib/agents/search/widgets/weatherWidget.ts  Normal file
@@ -0,0 +1,203 @@
import z from 'zod';
import { Widget } from '../types';
import formatChatHistoryAsString from '@/lib/utils/formatHistory';

const schema = z.object({
  location: z
    .string()
    .describe(
      'Human-readable location name (e.g., "New York, NY, USA", "London, UK"). Use this OR lat/lon coordinates, never both. Leave empty string if providing coordinates.',
    ),
  lat: z
    .number()
    .describe(
      'Latitude coordinate in decimal degrees (e.g., 40.7128). Only use when location name is empty.',
    ),
  lon: z
    .number()
    .describe(
      'Longitude coordinate in decimal degrees (e.g., -74.0060). Only use when location name is empty.',
    ),
  notPresent: z
    .boolean()
    .describe('Whether there is no need for the weather widget.'),
});

const systemPrompt = `
<role>
You are a location extractor for weather queries. You will receive a user follow up and a conversation history.
Your task is to determine if the user is asking about weather and extract the location they want weather for.
</role>

<instructions>
- If the user is asking about weather, extract the location name OR coordinates (never both).
- If using location name, set lat and lon to 0.
- If using coordinates, set location to empty string.
- If you cannot determine a valid location or the query is not weather-related, set notPresent to true.
- Location should be specific (city, state/region, country) for best results.
- You have to give the location so that it can be used to fetch weather data; it cannot be left empty unless notPresent is true.
- Make sure to infer short forms of location names (e.g., "NYC" -> "New York City", "LA" -> "Los Angeles").
</instructions>

<output_format>
You must respond in the following JSON format without any extra text, explanations or filler sentences:
{
  "location": string,
  "lat": number,
  "lon": number,
  "notPresent": boolean
}
</output_format>
`;

const weatherWidget: Widget = {
  type: 'weatherWidget',
  shouldExecute: (classification) =>
    classification.classification.showWeatherWidget,
  execute: async (input) => {
    const output = await input.llm.generateObject<typeof schema>({
      messages: [
        {
          role: 'system',
          content: systemPrompt,
        },
        {
          role: 'user',
          content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_follow_up>\n${input.followUp}\n</user_follow_up>`,
        },
      ],
      schema,
    });

    if (output.notPresent) {
      return;
    }

    const params = output;

    try {
      if (
        params.location === '' &&
        (params.lat === undefined || params.lon === undefined)
      ) {
        throw new Error(
          'Either location name or both latitude and longitude must be provided.',
        );
      }

      if (params.location !== '') {
        const openStreetMapUrl = `https://nominatim.openstreetmap.org/search?q=${encodeURIComponent(params.location)}&format=json&limit=1`;

        const locationRes = await fetch(openStreetMapUrl, {
          headers: {
            'User-Agent': 'Perplexica',
            'Content-Type': 'application/json',
          },
        });

        const data = await locationRes.json();

        const location = data[0];

        if (!location) {
          throw new Error(
            `Could not find coordinates for location: ${params.location}`,
          );
        }

        const weatherRes = await fetch(
          `https://api.open-meteo.com/v1/forecast?latitude=${location.lat}&longitude=${location.lon}&current=temperature_2m,relative_humidity_2m,apparent_temperature,is_day,precipitation,rain,showers,snowfall,weather_code,cloud_cover,pressure_msl,surface_pressure,wind_speed_10m,wind_direction_10m,wind_gusts_10m&hourly=temperature_2m,precipitation_probability,precipitation,weather_code&daily=weather_code,temperature_2m_max,temperature_2m_min,precipitation_sum,precipitation_probability_max&timezone=auto&forecast_days=7`,
          {
            headers: {
              'User-Agent': 'Perplexica',
              'Content-Type': 'application/json',
            },
          },
        );

        const weatherData = await weatherRes.json();

        return {
          type: 'weather',
          llmContext: `Weather in ${params.location} is ${JSON.stringify(weatherData.current)}`,
          data: {
            location: params.location,
            latitude: location.lat,
            longitude: location.lon,
            current: weatherData.current,
            hourly: {
              time: weatherData.hourly.time.slice(0, 24),
              temperature_2m: weatherData.hourly.temperature_2m.slice(0, 24),
              precipitation_probability:
                weatherData.hourly.precipitation_probability.slice(0, 24),
              precipitation: weatherData.hourly.precipitation.slice(0, 24),
              weather_code: weatherData.hourly.weather_code.slice(0, 24),
            },
            daily: weatherData.daily,
            timezone: weatherData.timezone,
          },
        };
      } else if (params.lat !== undefined && params.lon !== undefined) {
        const [weatherRes, locationRes] = await Promise.all([
          fetch(
            `https://api.open-meteo.com/v1/forecast?latitude=${params.lat}&longitude=${params.lon}&current=temperature_2m,relative_humidity_2m,apparent_temperature,is_day,precipitation,rain,showers,snowfall,weather_code,cloud_cover,pressure_msl,surface_pressure,wind_speed_10m,wind_direction_10m,wind_gusts_10m&hourly=temperature_2m,precipitation_probability,precipitation,weather_code&daily=weather_code,temperature_2m_max,temperature_2m_min,precipitation_sum,precipitation_probability_max&timezone=auto&forecast_days=7`,
            {
              headers: {
                'User-Agent': 'Perplexica',
                'Content-Type': 'application/json',
              },
            },
          ),
          fetch(
            `https://nominatim.openstreetmap.org/reverse?lat=${params.lat}&lon=${params.lon}&format=json`,
            {
              headers: {
                'User-Agent': 'Perplexica',
                'Content-Type': 'application/json',
              },
            },
          ),
        ]);

        const weatherData = await weatherRes.json();
        const locationData = await locationRes.json();

        return {
          type: 'weather',
          llmContext: `Weather in ${locationData.display_name} is ${JSON.stringify(weatherData.current)}`,
          data: {
            location: locationData.display_name,
            latitude: params.lat,
            longitude: params.lon,
            current: weatherData.current,
            hourly: {
              time: weatherData.hourly.time.slice(0, 24),
              temperature_2m: weatherData.hourly.temperature_2m.slice(0, 24),
              precipitation_probability:
                weatherData.hourly.precipitation_probability.slice(0, 24),
              precipitation: weatherData.hourly.precipitation.slice(0, 24),
              weather_code: weatherData.hourly.weather_code.slice(0, 24),
            },
            daily: weatherData.daily,
            timezone: weatherData.timezone,
          },
        };
      }

      return {
        type: 'weather',
        llmContext: 'No valid location or coordinates provided.',
        data: null,
      };
    } catch (err) {
      return {
        type: 'weather',
        llmContext: 'Failed to fetch weather data.',
        data: {
          error: `Error fetching weather data: ${err}`,
        },
      };
    }
  },
};

export default weatherWidget;
39  src/lib/agents/suggestions/index.ts  Normal file
@@ -0,0 +1,39 @@
import formatChatHistoryAsString from '@/lib/utils/formatHistory';
import { suggestionGeneratorPrompt } from '@/lib/prompts/suggestions';
import { ChatTurnMessage } from '@/lib/types';
import z from 'zod';
import BaseLLM from '@/lib/models/base/llm';

type SuggestionGeneratorInput = {
  chatHistory: ChatTurnMessage[];
};

const schema = z.object({
  suggestions: z
    .array(z.string())
    .describe('List of suggested questions or prompts'),
});

const generateSuggestions = async (
  input: SuggestionGeneratorInput,
  llm: BaseLLM<any>,
) => {
  const res = await llm.generateObject<typeof schema>({
    messages: [
      {
        role: 'system',
        content: suggestionGeneratorPrompt,
      },
      {
        role: 'user',
        content: `<chat_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</chat_history>`,
      },
    ],
    schema,
  });

  return res.suggestions;
};

export default generateSuggestions;
@@ -1,105 +0,0 @@
import {
  RunnableSequence,
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { searchSearxng } from '../searxng';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import LineOutputParser from '../outputParsers/lineOutputParser';

const imageSearchChainPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question so it is a standalone question that can be used by the LLM to search the web for images.
You need to make sure the rephrased question agrees with the conversation and is relevant to the conversation.
Output only the rephrased query wrapped in an XML <query> element. Do not include any explanation or additional text.
`;

type ImageSearchChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

interface ImageSearchResult {
  img_src: string;
  url: string;
  title: string;
}

const strParser = new StringOutputParser();

const createImageSearchChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: ImageSearchChainInput) => {
        return formatChatHistoryAsString(input.chat_history);
      },
      query: (input: ImageSearchChainInput) => {
        return input.query;
      },
    }),
    ChatPromptTemplate.fromMessages([
      ['system', imageSearchChainPrompt],
      [
        'user',
        '<conversation>\n</conversation>\n<follow_up>\nWhat is a cat?\n</follow_up>',
      ],
      ['assistant', '<query>A cat</query>'],

      [
        'user',
        '<conversation>\n</conversation>\n<follow_up>\nWhat is a car? How does it work?\n</follow_up>',
      ],
      ['assistant', '<query>Car working</query>'],
      [
        'user',
        '<conversation>\n</conversation>\n<follow_up>\nHow does an AC work?\n</follow_up>',
      ],
      ['assistant', '<query>AC working</query>'],
      [
        'user',
        '<conversation>{chat_history}</conversation>\n<follow_up>\n{query}\n</follow_up>',
      ],
    ]),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      const queryParser = new LineOutputParser({
        key: 'query',
      });

      return await queryParser.parse(input);
    }),
    RunnableLambda.from(async (input: string) => {
      const res = await searchSearxng(input, {
        engines: ['bing images', 'google images'],
      });

      const images: ImageSearchResult[] = [];

      res.results.forEach((result) => {
        if (result.img_src && result.url && result.title) {
          images.push({
            img_src: result.img_src,
            url: result.url,
            title: result.title,
          });
        }
      });

      return images.slice(0, 10);
    }),
  ]);
};

const handleImageSearch = (
  input: ImageSearchChainInput,
  llm: BaseChatModel,
) => {
  const imageSearchChain = createImageSearchChain(llm);
  return imageSearchChain.invoke(input);
};

export default handleImageSearch;
@@ -1,55 +0,0 @@
import { RunnableSequence, RunnableMap } from '@langchain/core/runnables';
import ListLineOutputParser from '../outputParsers/listLineOutputParser';
import { PromptTemplate } from '@langchain/core/prompts';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { ChatOpenAI } from '@langchain/openai';

const suggestionGeneratorPrompt = `
You are an AI suggestion generator for an AI powered search engine. You will be given a conversation below. You need to generate 4-5 suggestions based on the conversation. The suggestion should be relevant to the conversation that can be used by the user to ask the chat model for more information.
You need to make sure the suggestions are relevant to the conversation and are helpful to the user. Keep a note that the user might use these suggestions to ask a chat model for more information.
Make sure the suggestions are medium in length and are informative and relevant to the conversation.

Provide these suggestions separated by newlines between the XML tags <suggestions> and </suggestions>. For example:

<suggestions>
Tell me more about SpaceX and their recent projects
What is the latest news on SpaceX?
Who is the CEO of SpaceX?
</suggestions>

Conversation:
{chat_history}
`;

type SuggestionGeneratorInput = {
  chat_history: BaseMessage[];
};

const outputParser = new ListLineOutputParser({
  key: 'suggestions',
});

const createSuggestionGeneratorChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: SuggestionGeneratorInput) =>
        formatChatHistoryAsString(input.chat_history),
    }),
    PromptTemplate.fromTemplate(suggestionGeneratorPrompt),
    llm,
    outputParser,
  ]);
};

const generateSuggestions = (
  input: SuggestionGeneratorInput,
  llm: BaseChatModel,
) => {
  (llm as unknown as ChatOpenAI).temperature = 0;
  const suggestionGeneratorChain = createSuggestionGeneratorChain(llm);
  return suggestionGeneratorChain.invoke(input);
};

export default generateSuggestions;
@@ -1,110 +0,0 @@
import {
  RunnableSequence,
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { searchSearxng } from '../searxng';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import LineOutputParser from '../outputParsers/lineOutputParser';

const videoSearchChainPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question so it is a standalone question that can be used by the LLM to search Youtube for videos.
You need to make sure the rephrased question agrees with the conversation and is relevant to the conversation.
Output only the rephrased query wrapped in an XML <query> element. Do not include any explanation or additional text.
`;

type VideoSearchChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

interface VideoSearchResult {
  img_src: string;
  url: string;
  title: string;
  iframe_src: string;
}

const strParser = new StringOutputParser();

const createVideoSearchChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: VideoSearchChainInput) => {
        return formatChatHistoryAsString(input.chat_history);
      },
      query: (input: VideoSearchChainInput) => {
        return input.query;
      },
    }),
    ChatPromptTemplate.fromMessages([
      ['system', videoSearchChainPrompt],
      [
        'user',
        '<conversation>\n</conversation>\n<follow_up>\nHow does a car work?\n</follow_up>',
      ],
      ['assistant', '<query>How does a car work?</query>'],
      [
        'user',
        '<conversation>\n</conversation>\n<follow_up>\nWhat is the theory of relativity?\n</follow_up>',
      ],
      ['assistant', '<query>Theory of relativity</query>'],
      [
        'user',
        '<conversation>\n</conversation>\n<follow_up>\nHow does an AC work?\n</follow_up>',
      ],
      ['assistant', '<query>AC working</query>'],
      [
        'user',
        '<conversation>{chat_history}</conversation>\n<follow_up>\n{query}\n</follow_up>',
      ],
    ]),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      const queryParser = new LineOutputParser({
        key: 'query',
      });
      return await queryParser.parse(input);
    }),
    RunnableLambda.from(async (input: string) => {
      const res = await searchSearxng(input, {
        engines: ['youtube'],
      });

      const videos: VideoSearchResult[] = [];

      res.results.forEach((result) => {
        if (
          result.thumbnail &&
          result.url &&
          result.title &&
          result.iframe_src
        ) {
          videos.push({
            img_src: result.thumbnail,
            url: result.url,
            title: result.title,
            iframe_src: result.iframe_src,
          });
        }
      });

      return videos.slice(0, 10);
    }),
  ]);
};

const handleVideoSearch = (
  input: VideoSearchChainInput,
  llm: BaseChatModel,
) => {
  const videoSearchChain = createVideoSearchChain(llm);
  return videoSearchChain.invoke(input);
};

export default handleVideoSearch;
@@ -17,3 +17,13 @@ export const getShowWeatherWidget = () =>

export const getShowNewsWidget = () =>
  getClientConfig('showNewsWidget', 'true') === 'true';

export const getMeasurementUnit = () => {
  const value =
    getClientConfig('measureUnit') ??
    getClientConfig('measurementUnit', 'metric');

  if (typeof value !== 'string') return 'metric';

  return value.toLowerCase();
};
@@ -18,12 +18,18 @@ db.exec(`
`);

function sanitizeSql(content: string) {
-  return content
-    .split(/\r?\n/)
-    .filter(
-      (l) => !l.trim().startsWith('-->') && !l.includes('statement-breakpoint'),
-    )
-    .join('\n');
+  const statements = content
+    .split(/--> statement-breakpoint/g)
+    .map((stmt) =>
+      stmt
+        .split(/\r?\n/)
+        .filter((l) => !l.trim().startsWith('-->'))
+        .join('\n')
+        .trim(),
+    )
+    .filter((stmt) => stmt.length > 0);
+
+  return statements;
}

fs.readdirSync(migrationsFolder)
@@ -32,13 +38,14 @@ fs.readdirSync(migrationsFolder)
  .forEach((file) => {
    const filePath = path.join(migrationsFolder, file);
    let content = fs.readFileSync(filePath, 'utf-8');
-    content = sanitizeSql(content);
+    const statements = sanitizeSql(content);

    const migrationName = file.split('_')[0] || file;

    const already = db
      .prepare('SELECT 1 FROM ran_migrations WHERE name = ?')
      .get(migrationName);

    if (already) {
      console.log(`Skipping already-applied migration: ${file}`);
      return;
@@ -107,8 +114,167 @@

      db.exec('DROP TABLE messages;');
      db.exec('ALTER TABLE messages_with_sources RENAME TO messages;');
    } else if (migrationName === '0002') {
      /* Migrate chat */
      db.exec(`
        CREATE TABLE IF NOT EXISTS chats_new (
          id TEXT PRIMARY KEY,
          title TEXT NOT NULL,
          createdAt TEXT NOT NULL,
          sources TEXT DEFAULT '[]',
          files TEXT DEFAULT '[]'
        );
      `);

      const chats = db
        .prepare('SELECT id, title, createdAt, files FROM chats')
        .all();

      const insertChat = db.prepare(`
        INSERT INTO chats_new (id, title, createdAt, sources, files)
        VALUES (?, ?, ?, ?, ?)
      `);

      chats.forEach((chat: any) => {
        let files = chat.files;
        while (typeof files === 'string') {
          files = JSON.parse(files || '[]');
        }

        insertChat.run(
          chat.id,
          chat.title,
          chat.createdAt,
          '["web"]',
          JSON.stringify(files),
        );
      });

      db.exec('DROP TABLE chats;');
      db.exec('ALTER TABLE chats_new RENAME TO chats;');

      /* Migrate messages */

      db.exec(`
        CREATE TABLE IF NOT EXISTS messages_new (
          id INTEGER PRIMARY KEY,
          messageId TEXT NOT NULL,
          chatId TEXT NOT NULL,
          backendId TEXT NOT NULL,
          query TEXT NOT NULL,
          createdAt TEXT NOT NULL,
          responseBlocks TEXT DEFAULT '[]',
          status TEXT DEFAULT 'answering'
        );
      `);

      const messages = db
        .prepare(
          'SELECT id, messageId, chatId, type, content, createdAt, sources FROM messages ORDER BY id ASC',
        )
        .all();

      const insertMessage = db.prepare(`
        INSERT INTO messages_new (messageId, chatId, backendId, query, createdAt, responseBlocks, status)
        VALUES (?, ?, ?, ?, ?, ?, ?)
      `);

      let currentMessageData: {
        sources?: any[];
        response?: string;
        query?: string;
        messageId?: string;
        chatId?: string;
        createdAt?: string;
      } = {};
      let lastCompleted = true;

      messages.forEach((msg: any) => {
        if (msg.type === 'user' && lastCompleted) {
          currentMessageData = {};
          currentMessageData.messageId = msg.messageId;
          currentMessageData.chatId = msg.chatId;
          currentMessageData.query = msg.content;
          currentMessageData.createdAt = msg.createdAt;
          lastCompleted = false;
        } else if (msg.type === 'source' && !lastCompleted) {
          let sources = msg.sources;

          while (typeof sources === 'string') {
            sources = JSON.parse(sources || '[]');
          }

          currentMessageData.sources = sources;
        } else if (msg.type === 'assistant' && !lastCompleted) {
          currentMessageData.response = msg.content;
          insertMessage.run(
            currentMessageData.messageId,
            currentMessageData.chatId,
            `${currentMessageData.messageId}-backend`,
            currentMessageData.query,
            currentMessageData.createdAt,
            JSON.stringify([
              {
                id: crypto.randomUUID(),
                type: 'text',
                data: currentMessageData.response || '',
              },
              ...(currentMessageData.sources &&
              currentMessageData.sources.length > 0
                ? [
                    {
                      id: crypto.randomUUID(),
                      type: 'source',
                      data: currentMessageData.sources,
                    },
                  ]
                : []),
            ]),
            'completed',
          );

          lastCompleted = true;
        } else if (msg.type === 'user' && !lastCompleted) {
          /* Message wasn't completed so we'll just create the record with empty response */
          insertMessage.run(
            currentMessageData.messageId,
            currentMessageData.chatId,
            `${currentMessageData.messageId}-backend`,
            currentMessageData.query,
|
||||
currentMessageData.createdAt,
|
||||
JSON.stringify([
|
||||
{
|
||||
id: crypto.randomUUID(),
|
||||
type: 'text',
|
||||
data: '',
|
||||
},
|
||||
...(currentMessageData.sources &&
|
||||
currentMessageData.sources.length > 0
|
||||
? [
|
||||
{
|
||||
id: crypto.randomUUID(),
|
||||
type: 'source',
|
||||
data: currentMessageData.sources,
|
||||
},
|
||||
]
|
||||
: []),
|
||||
]),
|
||||
'completed',
|
||||
);
|
||||
|
||||
lastCompleted = true;
|
||||
}
|
||||
});
|
||||
|
||||
db.exec('DROP TABLE messages;');
|
||||
db.exec('ALTER TABLE messages_new RENAME TO messages;');
|
||||
} else {
|
||||
db.exec(content);
|
||||
// Execute each statement separately
|
||||
statements.forEach((stmt) => {
|
||||
if (stmt.trim()) {
|
||||
db.exec(stmt);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
db.prepare('INSERT OR IGNORE INTO ran_migrations (name) VALUES (?)').run(
|
||||
|
||||
@@ -1,26 +1,24 @@
import { sql } from 'drizzle-orm';
import { text, integer, sqliteTable } from 'drizzle-orm/sqlite-core';
import { Document } from '@langchain/core/documents';
import { Block } from '../types';
import { SearchSources } from '../agents/search/types';

export const messages = sqliteTable('messages', {
  id: integer('id').primaryKey(),
  role: text('type', { enum: ['assistant', 'user', 'source'] }).notNull(),
  chatId: text('chatId').notNull(),
  createdAt: text('createdAt')
    .notNull()
    .default(sql`CURRENT_TIMESTAMP`),
  messageId: text('messageId').notNull(),

  content: text('content'),

  sources: text('sources', {
    mode: 'json',
  })
    .$type<Document[]>()
  chatId: text('chatId').notNull(),
  backendId: text('backendId').notNull(),
  query: text('query').notNull(),
  createdAt: text('createdAt').notNull(),
  responseBlocks: text('responseBlocks', { mode: 'json' })
    .$type<Block[]>()
    .default(sql`'[]'`),
  status: text({ enum: ['answering', 'completed', 'error'] }).default(
    'answering',
  ),
});

interface File {
interface DBFile {
  name: string;
  fileId: string;
}
@@ -29,8 +27,12 @@ export const chats = sqliteTable('chats', {
  id: text('id').primaryKey(),
  title: text('title').notNull(),
  createdAt: text('createdAt').notNull(),
  focusMode: text('focusMode').notNull(),
  sources: text('sources', {
    mode: 'json',
  })
    .$type<SearchSources[]>()
    .default(sql`'[]'`),
  files: text('files', { mode: 'json' })
    .$type<File[]>()
    .$type<DBFile[]>()
    .default(sql`'[]'`),
});
@@ -1,13 +1,7 @@
|
||||
'use client';
|
||||
|
||||
import {
|
||||
AssistantMessage,
|
||||
ChatTurn,
|
||||
Message,
|
||||
SourceMessage,
|
||||
SuggestionMessage,
|
||||
UserMessage,
|
||||
} from '@/components/ChatWindow';
|
||||
import { Message } from '@/components/ChatWindow';
|
||||
import { Block } from '@/lib/types';
|
||||
import {
|
||||
createContext,
|
||||
useContext,
|
||||
@@ -22,25 +16,25 @@ import { toast } from 'sonner';
|
||||
import { getSuggestions } from '../actions';
|
||||
import { MinimalProvider } from '../models/types';
|
||||
import { getAutoMediaSearch } from '../config/clientRegistry';
|
||||
import { applyPatch } from 'rfc6902';
|
||||
import { Widget } from '@/components/ChatWindow';
|
||||
|
||||
export type Section = {
|
||||
userMessage: UserMessage;
|
||||
assistantMessage: AssistantMessage | undefined;
|
||||
parsedAssistantMessage: string | undefined;
|
||||
speechMessage: string | undefined;
|
||||
sourceMessage: SourceMessage | undefined;
|
||||
message: Message;
|
||||
widgets: Widget[];
|
||||
parsedTextBlocks: string[];
|
||||
speechMessage: string;
|
||||
thinkingEnded: boolean;
|
||||
suggestions?: string[];
|
||||
};
|
||||
|
||||
type ChatContext = {
|
||||
messages: Message[];
|
||||
chatTurns: ChatTurn[];
|
||||
sections: Section[];
|
||||
chatHistory: [string, string][];
|
||||
files: File[];
|
||||
fileIds: string[];
|
||||
focusMode: string;
|
||||
sources: string[];
|
||||
chatId: string | undefined;
|
||||
optimizationMode: string;
|
||||
isMessagesLoaded: boolean;
|
||||
@@ -51,8 +45,10 @@ type ChatContext = {
|
||||
hasError: boolean;
|
||||
chatModelProvider: ChatModelProvider;
|
||||
embeddingModelProvider: EmbeddingModelProvider;
|
||||
researchEnded: boolean;
|
||||
setResearchEnded: (ended: boolean) => void;
|
||||
setOptimizationMode: (mode: string) => void;
|
||||
setFocusMode: (mode: string) => void;
|
||||
setSources: (sources: string[]) => void;
|
||||
setFiles: (files: File[]) => void;
|
||||
setFileIds: (fileIds: string[]) => void;
|
||||
sendMessage: (
|
||||
@@ -179,8 +175,8 @@ const loadMessages = async (
|
||||
chatId: string,
|
||||
setMessages: (messages: Message[]) => void,
|
||||
setIsMessagesLoaded: (loaded: boolean) => void,
|
||||
setChatHistory: (history: [string, string][]) => void,
|
||||
setFocusMode: (mode: string) => void,
|
||||
chatHistory: React.MutableRefObject<[string, string][]>,
|
||||
setSources: (sources: string[]) => void,
|
||||
setNotFound: (notFound: boolean) => void,
|
||||
setFiles: (files: File[]) => void,
|
||||
setFileIds: (fileIds: string[]) => void,
|
||||
@@ -204,18 +200,26 @@ const loadMessages = async (
|
||||
|
||||
setMessages(messages);
|
||||
|
||||
const chatTurns = messages.filter(
|
||||
(msg): msg is ChatTurn => msg.role === 'user' || msg.role === 'assistant',
|
||||
);
|
||||
const history: [string, string][] = [];
|
||||
messages.forEach((msg) => {
|
||||
history.push(['human', msg.query]);
|
||||
|
||||
const history = chatTurns.map((msg) => {
|
||||
return [msg.role, msg.content];
|
||||
}) as [string, string][];
|
||||
const textBlocks = msg.responseBlocks
|
||||
.filter(
|
||||
(block): block is Block & { type: 'text' } => block.type === 'text',
|
||||
)
|
||||
.map((block) => block.data)
|
||||
.join('\n');
|
||||
|
||||
if (textBlocks) {
|
||||
history.push(['assistant', textBlocks]);
|
||||
}
|
||||
});
|
||||
|
||||
console.debug(new Date(), 'app:messages_loaded');
|
||||
|
||||
if (chatTurns.length > 0) {
|
||||
document.title = chatTurns[0].content;
|
||||
if (messages.length > 0) {
|
||||
document.title = messages[0].query;
|
||||
}
|
||||
|
||||
const files = data.chat.files.map((file: any) => {
|
||||
@@ -229,8 +233,8 @@ const loadMessages = async (
|
||||
setFiles(files);
|
||||
setFileIds(files.map((file: File) => file.fileId));
|
||||
|
||||
setChatHistory(history);
|
||||
setFocusMode(data.chat.focusMode);
|
||||
chatHistory.current = history;
|
||||
setSources(data.chat.sources);
|
||||
setIsMessagesLoaded(true);
|
||||
};
|
||||
|
||||
@@ -239,31 +243,33 @@ export const chatContext = createContext<ChatContext>({
|
||||
chatId: '',
|
||||
fileIds: [],
|
||||
files: [],
|
||||
focusMode: '',
|
||||
sources: [],
|
||||
hasError: false,
|
||||
isMessagesLoaded: false,
|
||||
isReady: false,
|
||||
loading: false,
|
||||
messageAppeared: false,
|
||||
messages: [],
|
||||
chatTurns: [],
|
||||
sections: [],
|
||||
notFound: false,
|
||||
optimizationMode: '',
|
||||
chatModelProvider: { key: '', providerId: '' },
|
||||
embeddingModelProvider: { key: '', providerId: '' },
|
||||
researchEnded: false,
|
||||
rewrite: () => {},
|
||||
sendMessage: async () => {},
|
||||
setFileIds: () => {},
|
||||
setFiles: () => {},
|
||||
setFocusMode: () => {},
|
||||
setSources: () => {},
|
||||
setOptimizationMode: () => {},
|
||||
setChatModelProvider: () => {},
|
||||
setEmbeddingModelProvider: () => {},
|
||||
setResearchEnded: () => {},
|
||||
});
|
||||
|
||||
export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
const params: { chatId: string } = useParams();
|
||||
|
||||
const searchParams = useSearchParams();
|
||||
const initialMessage = searchParams.get('q');
|
||||
|
||||
@@ -273,13 +279,15 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [messageAppeared, setMessageAppeared] = useState(false);
|
||||
|
||||
const [chatHistory, setChatHistory] = useState<[string, string][]>([]);
|
||||
const [researchEnded, setResearchEnded] = useState(false);
|
||||
|
||||
const chatHistory = useRef<[string, string][]>([]);
|
||||
const [messages, setMessages] = useState<Message[]>([]);
|
||||
|
||||
const [files, setFiles] = useState<File[]>([]);
|
||||
const [fileIds, setFileIds] = useState<string[]>([]);
|
||||
|
||||
const [focusMode, setFocusMode] = useState('webSearch');
|
||||
const [sources, setSources] = useState<string[]>(['web']);
|
||||
const [optimizationMode, setOptimizationMode] = useState('speed');
|
||||
|
||||
const [isMessagesLoaded, setIsMessagesLoaded] = useState(false);
|
||||
@@ -305,66 +313,44 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
|
||||
const messagesRef = useRef<Message[]>([]);
|
||||
|
||||
const chatTurns = useMemo((): ChatTurn[] => {
|
||||
return messages.filter(
|
||||
(msg): msg is ChatTurn => msg.role === 'user' || msg.role === 'assistant',
|
||||
);
|
||||
}, [messages]);
|
||||
|
||||
const sections = useMemo<Section[]>(() => {
|
||||
const sections: Section[] = [];
|
||||
|
||||
messages.forEach((msg, i) => {
|
||||
if (msg.role === 'user') {
|
||||
const nextUserMessageIndex = messages.findIndex(
|
||||
(m, j) => j > i && m.role === 'user',
|
||||
);
|
||||
|
||||
const aiMessage = messages.find(
|
||||
(m, j) =>
|
||||
j > i &&
|
||||
m.role === 'assistant' &&
|
||||
(nextUserMessageIndex === -1 || j < nextUserMessageIndex),
|
||||
) as AssistantMessage | undefined;
|
||||
|
||||
const sourceMessage = messages.find(
|
||||
(m, j) =>
|
||||
j > i &&
|
||||
m.role === 'source' &&
|
||||
m.sources &&
|
||||
(nextUserMessageIndex === -1 || j < nextUserMessageIndex),
|
||||
) as SourceMessage | undefined;
|
||||
|
||||
return messages.map((msg) => {
|
||||
const textBlocks: string[] = [];
|
||||
let speechMessage = '';
|
||||
let thinkingEnded = false;
|
||||
let processedMessage = aiMessage?.content ?? '';
|
||||
let speechMessage = aiMessage?.content ?? '';
|
||||
let suggestions: string[] = [];
|
||||
|
||||
if (aiMessage) {
|
||||
const sourceBlocks = msg.responseBlocks.filter(
|
||||
(block): block is Block & { type: 'source' } => block.type === 'source',
|
||||
);
|
||||
const sources = sourceBlocks.flatMap((block) => block.data);
|
||||
|
||||
const widgetBlocks = msg.responseBlocks
|
||||
.filter((b) => b.type === 'widget')
|
||||
.map((b) => b.data) as Widget[];
|
||||
|
||||
msg.responseBlocks.forEach((block) => {
|
||||
if (block.type === 'text') {
|
||||
let processedText = block.data;
|
||||
const citationRegex = /\[([^\]]+)\]/g;
|
||||
const regex = /\[(\d+)\]/g;
|
||||
|
||||
if (processedMessage.includes('<think>')) {
|
||||
const openThinkTag =
|
||||
processedMessage.match(/<think>/g)?.length || 0;
|
||||
if (processedText.includes('<think>')) {
|
||||
const openThinkTag = processedText.match(/<think>/g)?.length || 0;
|
||||
const closeThinkTag =
|
||||
processedMessage.match(/<\/think>/g)?.length || 0;
|
||||
processedText.match(/<\/think>/g)?.length || 0;
|
||||
|
||||
if (openThinkTag && !closeThinkTag) {
|
||||
processedMessage += '</think> <a> </a>';
|
||||
processedText += '</think> <a> </a>';
|
||||
}
|
||||
}
|
||||
|
||||
if (aiMessage.content.includes('</think>')) {
|
||||
if (block.data.includes('</think>')) {
|
||||
thinkingEnded = true;
|
||||
}
|
||||
|
||||
if (
|
||||
sourceMessage &&
|
||||
sourceMessage.sources &&
|
||||
sourceMessage.sources.length > 0
|
||||
) {
|
||||
processedMessage = processedMessage.replace(
|
||||
if (sources.length > 0) {
|
||||
processedText = processedText.replace(
|
||||
citationRegex,
|
||||
(_, capturedContent: string) => {
|
||||
const numbers = capturedContent
|
||||
@@ -379,7 +365,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
return `[${numStr}]`;
|
||||
}
|
||||
|
||||
const source = sourceMessage.sources?.[number - 1];
|
||||
const source = sources[number - 1];
|
||||
const url = source?.metadata?.url;
|
||||
|
||||
if (url) {
|
||||
@@ -393,38 +379,86 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
return linksHtml;
|
||||
},
|
||||
);
|
||||
speechMessage = aiMessage.content.replace(regex, '');
|
||||
speechMessage += block.data.replace(regex, '');
|
||||
} else {
|
||||
processedMessage = processedMessage.replace(regex, '');
|
||||
speechMessage = aiMessage.content.replace(regex, '');
|
||||
processedText = processedText.replace(regex, '');
|
||||
speechMessage += block.data.replace(regex, '');
|
||||
}
|
||||
|
||||
const suggestionMessage = messages.find(
|
||||
(m, j) =>
|
||||
j > i &&
|
||||
m.role === 'suggestion' &&
|
||||
(nextUserMessageIndex === -1 || j < nextUserMessageIndex),
|
||||
) as SuggestionMessage | undefined;
|
||||
|
||||
if (suggestionMessage && suggestionMessage.suggestions.length > 0) {
|
||||
suggestions = suggestionMessage.suggestions;
|
||||
}
|
||||
textBlocks.push(processedText);
|
||||
} else if (block.type === 'suggestion') {
|
||||
suggestions = block.data;
|
||||
}
|
||||
});
|
||||
|
||||
sections.push({
|
||||
userMessage: msg,
|
||||
assistantMessage: aiMessage,
|
||||
sourceMessage: sourceMessage,
|
||||
parsedAssistantMessage: processedMessage,
|
||||
return {
|
||||
message: msg,
|
||||
parsedTextBlocks: textBlocks,
|
||||
speechMessage,
|
||||
thinkingEnded,
|
||||
suggestions: suggestions,
|
||||
suggestions,
|
||||
widgets: widgetBlocks,
|
||||
};
|
||||
});
|
||||
}
|
||||
}, [messages]);
|
||||
|
||||
const isReconnectingRef = useRef(false);
|
||||
const handledMessageEndRef = useRef<Set<string>>(new Set());
|
||||
|
||||
const checkReconnect = async () => {
|
||||
if (isReconnectingRef.current) return;
|
||||
|
||||
setIsReady(true);
|
||||
console.debug(new Date(), 'app:ready');
|
||||
|
||||
if (messages.length > 0) {
|
||||
const lastMsg = messages[messages.length - 1];
|
||||
|
||||
if (lastMsg.status === 'answering') {
|
||||
setLoading(true);
|
||||
setResearchEnded(false);
|
||||
setMessageAppeared(false);
|
||||
|
||||
isReconnectingRef.current = true;
|
||||
|
||||
const res = await fetch(`/api/reconnect/${lastMsg.backendId}`, {
|
||||
method: 'POST',
|
||||
});
|
||||
|
||||
return sections;
|
||||
}, [messages]);
|
||||
if (!res.body) throw new Error('No response body');
|
||||
|
||||
const reader = res.body?.getReader();
|
||||
const decoder = new TextDecoder('utf-8');
|
||||
|
||||
let partialChunk = '';
|
||||
|
||||
const messageHandler = getMessageHandler(lastMsg);
|
||||
|
||||
try {
|
||||
while (true) {
|
||||
const { value, done } = await reader.read();
|
||||
if (done) break;
|
||||
|
||||
partialChunk += decoder.decode(value, { stream: true });
|
||||
|
||||
try {
|
||||
const messages = partialChunk.split('\n');
|
||||
for (const msg of messages) {
|
||||
if (!msg.trim()) continue;
|
||||
const json = JSON.parse(msg);
|
||||
messageHandler(json);
|
||||
}
|
||||
partialChunk = '';
|
||||
} catch (error) {
|
||||
console.warn('Incomplete JSON, waiting for next chunk...');
|
||||
}
|
||||
}
|
||||
} finally {
|
||||
isReconnectingRef.current = false;
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
useEffect(() => {
|
||||
checkConfig(
|
||||
@@ -440,7 +474,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
if (params.chatId && params.chatId !== chatId) {
|
||||
setChatId(params.chatId);
|
||||
setMessages([]);
|
||||
setChatHistory([]);
|
||||
chatHistory.current = [];
|
||||
setFiles([]);
|
||||
setFileIds([]);
|
||||
setIsMessagesLoaded(false);
|
||||
@@ -460,8 +494,8 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
chatId,
|
||||
setMessages,
|
||||
setIsMessagesLoaded,
|
||||
setChatHistory,
|
||||
setFocusMode,
|
||||
chatHistory,
|
||||
setSources,
|
||||
setNotFound,
|
||||
setFiles,
|
||||
setFileIds,
|
||||
@@ -479,34 +513,27 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
}, [messages]);
|
||||
|
||||
useEffect(() => {
|
||||
if (isMessagesLoaded && isConfigReady) {
|
||||
if (isMessagesLoaded && isConfigReady && newChatCreated) {
|
||||
setIsReady(true);
|
||||
console.debug(new Date(), 'app:ready');
|
||||
} else if (isMessagesLoaded && isConfigReady && !newChatCreated) {
|
||||
checkReconnect();
|
||||
} else {
|
||||
setIsReady(false);
|
||||
}
|
||||
}, [isMessagesLoaded, isConfigReady]);
|
||||
}, [isMessagesLoaded, isConfigReady, newChatCreated]);
|
||||
|
||||
const rewrite = (messageId: string) => {
|
||||
const index = messages.findIndex((msg) => msg.messageId === messageId);
|
||||
const chatTurnsIndex = chatTurns.findIndex(
|
||||
(msg) => msg.messageId === messageId,
|
||||
);
|
||||
|
||||
if (index === -1) return;
|
||||
|
||||
const message = chatTurns[chatTurnsIndex - 1];
|
||||
setMessages((prev) => prev.slice(0, index));
|
||||
|
||||
setMessages((prev) => {
|
||||
return [
|
||||
...prev.slice(0, messages.length > 2 ? messages.indexOf(message) : 0),
|
||||
];
|
||||
});
|
||||
setChatHistory((prev) => {
|
||||
return [...prev.slice(0, chatTurns.length > 2 ? chatTurnsIndex - 1 : 0)];
|
||||
});
|
||||
chatHistory.current = chatHistory.current.slice(0, index * 2);
|
||||
|
||||
sendMessage(message.content, message.messageId, true);
|
||||
const messageToRewrite = messages[index];
|
||||
sendMessage(messageToRewrite.query, messageToRewrite.messageId, true);
|
||||
};
|
||||
|
||||
useEffect(() => {
|
||||
@@ -520,95 +547,118 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [isConfigReady, isReady, initialMessage]);
|
||||
|
||||
const sendMessage: ChatContext['sendMessage'] = async (
|
||||
message,
|
||||
messageId,
|
||||
rewrite = false,
|
||||
) => {
|
||||
if (loading || !message) return;
|
||||
setLoading(true);
|
||||
setMessageAppeared(false);
|
||||
const getMessageHandler = (message: Message) => {
|
||||
const messageId = message.messageId;
|
||||
|
||||
if (messages.length <= 1) {
|
||||
window.history.replaceState(null, '', `/c/${chatId}`);
|
||||
}
|
||||
|
||||
let recievedMessage = '';
|
||||
let added = false;
|
||||
|
||||
messageId = messageId ?? crypto.randomBytes(7).toString('hex');
|
||||
|
||||
setMessages((prevMessages) => [
|
||||
...prevMessages,
|
||||
{
|
||||
content: message,
|
||||
messageId: messageId,
|
||||
chatId: chatId!,
|
||||
role: 'user',
|
||||
createdAt: new Date(),
|
||||
},
|
||||
]);
|
||||
|
||||
const messageHandler = async (data: any) => {
|
||||
return async (data: any) => {
|
||||
if (data.type === 'error') {
|
||||
toast.error(data.data);
|
||||
setLoading(false);
|
||||
setMessages((prev) =>
|
||||
prev.map((msg) =>
|
||||
msg.messageId === messageId
|
||||
? { ...msg, status: 'error' as const }
|
||||
: msg,
|
||||
),
|
||||
);
|
||||
return;
|
||||
}
|
||||
|
||||
if (data.type === 'sources') {
|
||||
setMessages((prevMessages) => [
|
||||
...prevMessages,
|
||||
{
|
||||
messageId: data.messageId,
|
||||
chatId: chatId!,
|
||||
role: 'source',
|
||||
sources: data.data,
|
||||
createdAt: new Date(),
|
||||
},
|
||||
]);
|
||||
if (data.data.length > 0) {
|
||||
setMessageAppeared(true);
|
||||
}
|
||||
}
|
||||
|
||||
if (data.type === 'message') {
|
||||
if (!added) {
|
||||
setMessages((prevMessages) => [
|
||||
...prevMessages,
|
||||
{
|
||||
content: data.data,
|
||||
messageId: data.messageId,
|
||||
chatId: chatId!,
|
||||
role: 'assistant',
|
||||
createdAt: new Date(),
|
||||
},
|
||||
]);
|
||||
added = true;
|
||||
setMessageAppeared(true);
|
||||
} else {
|
||||
setMessages((prev) =>
|
||||
prev.map((message) => {
|
||||
if (data.type === 'researchComplete') {
|
||||
setResearchEnded(true);
|
||||
if (
|
||||
message.messageId === data.messageId &&
|
||||
message.role === 'assistant'
|
||||
message.responseBlocks.find(
|
||||
(b) => b.type === 'source' && b.data.length > 0,
|
||||
)
|
||||
) {
|
||||
return { ...message, content: message.content + data.data };
|
||||
setMessageAppeared(true);
|
||||
}
|
||||
}
|
||||
|
||||
return message;
|
||||
if (data.type === 'block') {
|
||||
setMessages((prev) =>
|
||||
prev.map((msg) => {
|
||||
if (msg.messageId === messageId) {
|
||||
const exists = msg.responseBlocks.findIndex(
|
||||
(b) => b.id === data.block.id,
|
||||
);
|
||||
|
||||
if (exists !== -1) {
|
||||
const existingBlocks = [...msg.responseBlocks];
|
||||
existingBlocks[exists] = data.block;
|
||||
|
||||
return {
|
||||
...msg,
|
||||
responseBlocks: existingBlocks,
|
||||
};
|
||||
}
|
||||
|
||||
return {
|
||||
...msg,
|
||||
responseBlocks: [...msg.responseBlocks, data.block],
|
||||
};
|
||||
}
|
||||
return msg;
|
||||
}),
|
||||
);
|
||||
|
||||
if (
|
||||
(data.block.type === 'source' && data.block.data.length > 0) ||
|
||||
data.block.type === 'text'
|
||||
) {
|
||||
setMessageAppeared(true);
|
||||
}
|
||||
}
|
||||
|
||||
if (data.type === 'updateBlock') {
|
||||
setMessages((prev) =>
|
||||
prev.map((msg) => {
|
||||
if (msg.messageId === messageId) {
|
||||
const updatedBlocks = msg.responseBlocks.map((block) => {
|
||||
if (block.id === data.blockId) {
|
||||
const updatedBlock = { ...block };
|
||||
applyPatch(updatedBlock, data.patch);
|
||||
return updatedBlock;
|
||||
}
|
||||
return block;
|
||||
});
|
||||
return { ...msg, responseBlocks: updatedBlocks };
|
||||
}
|
||||
return msg;
|
||||
}),
|
||||
);
|
||||
}
|
||||
recievedMessage += data.data;
|
||||
}
|
||||
|
||||
if (data.type === 'messageEnd') {
|
||||
setChatHistory((prevHistory) => [
|
||||
...prevHistory,
|
||||
['human', message],
|
||||
['assistant', recievedMessage],
|
||||
]);
|
||||
if (handledMessageEndRef.current.has(messageId)) {
|
||||
return;
|
||||
}
|
||||
|
||||
handledMessageEndRef.current.add(messageId);
|
||||
|
||||
const currentMsg = messagesRef.current.find(
|
||||
(msg) => msg.messageId === messageId,
|
||||
);
|
||||
|
||||
const newHistory: [string, string][] = [
|
||||
...chatHistory.current,
|
||||
['human', message.query],
|
||||
[
|
||||
'assistant',
|
||||
currentMsg?.responseBlocks.find((b) => b.type === 'text')?.data ||
|
||||
'',
|
||||
],
|
||||
];
|
||||
|
||||
chatHistory.current = newHistory;
|
||||
|
||||
setMessages((prev) =>
|
||||
prev.map((msg) =>
|
||||
msg.messageId === messageId
|
||||
? { ...msg, status: 'completed' as const }
|
||||
: msg,
|
||||
),
|
||||
);
|
||||
|
||||
setLoading(false);
|
||||
|
||||
@@ -617,6 +667,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
const autoMediaSearch = getAutoMediaSearch();
|
||||
|
||||
if (autoMediaSearch) {
|
||||
setTimeout(() => {
|
||||
document
|
||||
.getElementById(`search-images-${lastMsg.messageId}`)
|
||||
?.click();
|
||||
@@ -624,43 +675,70 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
document
|
||||
.getElementById(`search-videos-${lastMsg.messageId}`)
|
||||
?.click();
|
||||
}, 200);
|
||||
}
|
||||
|
||||
/* Check if there are sources after message id's index and no suggestions */
|
||||
// Check if there are sources and no suggestions
|
||||
|
||||
const userMessageIndex = messagesRef.current.findIndex(
|
||||
(msg) => msg.messageId === messageId && msg.role === 'user',
|
||||
const hasSourceBlocks = currentMsg?.responseBlocks.some(
|
||||
(block) => block.type === 'source' && block.data.length > 0,
|
||||
);
|
||||
const hasSuggestions = currentMsg?.responseBlocks.some(
|
||||
(block) => block.type === 'suggestion',
|
||||
);
|
||||
|
||||
const sourceMessage = messagesRef.current.find(
|
||||
(msg, i) => i > userMessageIndex && msg.role === 'source',
|
||||
) as SourceMessage | undefined;
|
||||
if (hasSourceBlocks && !hasSuggestions) {
|
||||
const suggestions = await getSuggestions(newHistory);
|
||||
const suggestionBlock: Block = {
|
||||
id: crypto.randomBytes(7).toString('hex'),
|
||||
type: 'suggestion',
|
||||
data: suggestions,
|
||||
};
|
||||
|
||||
const suggestionMessageIndex = messagesRef.current.findIndex(
|
||||
(msg, i) => i > userMessageIndex && msg.role === 'suggestion',
|
||||
setMessages((prev) =>
|
||||
prev.map((msg) => {
|
||||
if (msg.messageId === messageId) {
|
||||
return {
|
||||
...msg,
|
||||
responseBlocks: [...msg.responseBlocks, suggestionBlock],
|
||||
};
|
||||
}
|
||||
return msg;
|
||||
}),
|
||||
);
|
||||
|
||||
if (
|
||||
sourceMessage &&
|
||||
sourceMessage.sources.length > 0 &&
|
||||
suggestionMessageIndex == -1
|
||||
) {
|
||||
const suggestions = await getSuggestions(messagesRef.current);
|
||||
setMessages((prev) => {
|
||||
return [
|
||||
...prev,
|
||||
{
|
||||
role: 'suggestion',
|
||||
suggestions: suggestions,
|
||||
chatId: chatId!,
|
||||
createdAt: new Date(),
|
||||
messageId: crypto.randomBytes(7).toString('hex'),
|
||||
},
|
||||
];
|
||||
});
|
||||
}
|
||||
}
|
||||
};
|
||||
};
|
||||
|
||||
const sendMessage: ChatContext['sendMessage'] = async (
|
||||
message,
|
||||
messageId,
|
||||
rewrite = false,
|
||||
) => {
|
||||
if (loading || !message) return;
|
||||
setLoading(true);
|
||||
setResearchEnded(false);
|
||||
setMessageAppeared(false);
|
||||
|
||||
if (messages.length <= 1) {
|
||||
window.history.replaceState(null, '', `/c/${chatId}`);
|
||||
}
|
||||
|
||||
messageId = messageId ?? crypto.randomBytes(7).toString('hex');
|
||||
const backendId = crypto.randomBytes(20).toString('hex');
|
||||
|
||||
const newMessage: Message = {
|
||||
messageId,
|
||||
chatId: chatId!,
|
||||
backendId,
|
||||
query: message,
|
||||
responseBlocks: [],
|
||||
status: 'answering',
|
||||
createdAt: new Date(),
|
||||
};
|
||||
|
||||
setMessages((prevMessages) => [...prevMessages, newMessage]);
|
||||
|
||||
const messageIndex = messages.findIndex((m) => m.messageId === messageId);
|
||||
|
||||
@@ -678,11 +756,14 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
},
|
||||
chatId: chatId!,
|
||||
files: fileIds,
|
||||
focusMode: focusMode,
|
||||
sources: sources,
|
||||
optimizationMode: optimizationMode,
|
||||
history: rewrite
|
||||
? chatHistory.slice(0, messageIndex === -1 ? undefined : messageIndex)
|
||||
: chatHistory,
|
||||
? chatHistory.current.slice(
|
||||
0,
|
||||
messageIndex === -1 ? undefined : messageIndex,
|
||||
)
|
||||
: chatHistory.current,
|
||||
chatModel: {
|
||||
key: chatModelProvider.key,
|
||||
providerId: chatModelProvider.providerId,
|
||||
@@ -702,6 +783,8 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
|
||||
let partialChunk = '';
|
||||
|
||||
const messageHandler = getMessageHandler(newMessage);
|
||||
|
||||
while (true) {
|
||||
const { value, done } = await reader.read();
|
||||
if (done) break;
|
||||
@@ -726,12 +809,11 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
<chatContext.Provider
|
||||
value={{
|
||||
messages,
|
||||
chatTurns,
|
||||
sections,
|
||||
chatHistory,
|
||||
chatHistory: chatHistory.current,
|
||||
files,
|
||||
fileIds,
|
||||
focusMode,
|
||||
sources,
|
||||
chatId,
|
||||
hasError,
|
||||
isMessagesLoaded,
|
||||
@@ -742,7 +824,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
optimizationMode,
|
||||
setFileIds,
|
||||
setFiles,
|
||||
setFocusMode,
|
||||
setSources,
|
||||
setOptimizationMode,
|
||||
rewrite,
|
||||
sendMessage,
|
||||
@@ -750,6 +832,8 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
|
||||
chatModelProvider,
|
||||
embeddingModelProvider,
|
||||
setEmbeddingModelProvider,
|
||||
researchEnded,
|
||||
setResearchEnded,
|
||||
}}
|
||||
>
|
||||
{children}
|
||||
|
||||
src/lib/models/base/embedding.ts (new file)
@@ -0,0 +1,9 @@
import { Chunk } from '@/lib/types';

abstract class BaseEmbedding<CONFIG> {
  constructor(protected config: CONFIG) {}
  abstract embedText(texts: string[]): Promise<number[][]>;
  abstract embedChunks(chunks: Chunk[]): Promise<number[][]>;
}

export default BaseEmbedding;
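Note: a minimal sketch of a concrete embedding class built on this base. The HTTP endpoint, config shape, and the assumption that a Chunk exposes its text as a content field are illustrative and not part of this diff.

import { Chunk } from '@/lib/types';
import BaseEmbedding from '@/lib/models/base/embedding';

interface ExampleEmbeddingConfig {
  baseURL: string; // hypothetical OpenAI-compatible server
  model: string;
}

class ExampleEmbedding extends BaseEmbedding<ExampleEmbeddingConfig> {
  async embedText(texts: string[]): Promise<number[][]> {
    // Assumed endpoint shape; adjust to the real provider API.
    const res = await fetch(`${this.config.baseURL}/embeddings`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model: this.config.model, input: texts }),
    });
    const data = await res.json();
    return data.data.map((d: any) => d.embedding);
  }

  async embedChunks(chunks: Chunk[]): Promise<number[][]> {
    // Assumes each Chunk carries its text in a `content` field.
    return this.embedText(chunks.map((c: any) => c.content));
  }
}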
src/lib/models/base/llm.ts (new file)
@@ -0,0 +1,22 @@
import z from 'zod';
import {
  GenerateObjectInput,
  GenerateOptions,
  GenerateTextInput,
  GenerateTextOutput,
  StreamTextOutput,
} from '../types';

abstract class BaseLLM<CONFIG> {
  constructor(protected config: CONFIG) {}
  abstract generateText(input: GenerateTextInput): Promise<GenerateTextOutput>;
  abstract streamText(
    input: GenerateTextInput,
  ): AsyncGenerator<StreamTextOutput>;
  abstract generateObject<T>(input: GenerateObjectInput): Promise<z.infer<T>>;
  abstract streamObject<T>(
    input: GenerateObjectInput,
  ): AsyncGenerator<Partial<z.infer<T>>>;
}

export default BaseLLM;
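Note: BaseLLM is the provider-agnostic surface the app now codes against: one-shot text, streamed text, and schema-constrained object generation. A minimal consumption sketch; the concrete fields of GenerateTextInput are defined in the models types file and are not shown in this hunk, so they stay abstract here.

import BaseLLM from '@/lib/models/base/llm';
import { GenerateTextInput } from '@/lib/models/types';

// llm is whatever a provider's loadChatModel() returns (a BaseLLM<any>).
async function sketch(llm: BaseLLM<any>, input: GenerateTextInput) {
  const full = await llm.generateText(input); // one-shot generation

  for await (const chunk of llm.streamText(input)) {
    // each StreamTextOutput chunk arrives incrementally
  }

  return full;
}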
@@ -1,7 +1,7 @@
import { Embeddings } from '@langchain/core/embeddings';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { Model, ModelList, ProviderMetadata } from '../types';
import { ModelList, ProviderMetadata } from '../types';
import { UIConfigField } from '@/lib/config/types';
import BaseLLM from './llm';
import BaseEmbedding from './embedding';

abstract class BaseModelProvider<CONFIG> {
  constructor(
@@ -11,8 +11,8 @@ abstract class BaseModelProvider<CONFIG> {
  ) {}
  abstract getDefaultModels(): Promise<ModelList>;
  abstract getModelList(): Promise<ModelList>;
  abstract loadChatModel(modelName: string): Promise<BaseChatModel>;
  abstract loadEmbeddingModel(modelName: string): Promise<Embeddings>;
  abstract loadChatModel(modelName: string): Promise<BaseLLM<any>>;
  abstract loadEmbeddingModel(modelName: string): Promise<BaseEmbedding<any>>;
  static getProviderConfigFields(): UIConfigField[] {
    throw new Error('Method not implemented.');
  }
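Note: under this updated contract a provider returns BaseLLM / BaseEmbedding wrappers instead of LangChain classes. A minimal sketch of a new provider written against the interface; the provider name, config, model list and base URL are invented for illustration, and a real provider would also be registered in the providers record in src/lib/models/providers/index.ts.

import { UIConfigField } from '@/lib/config/types';
import { Model, ModelList, ProviderMetadata } from '@/lib/models/types';
import BaseModelProvider from '@/lib/models/base/provider';
import BaseLLM from '@/lib/models/base/llm';
import BaseEmbedding from '@/lib/models/base/embedding';
import OpenAILLM from '@/lib/models/providers/openai/openaiLLM';

interface ExampleConfig {
  apiKey: string;
}

class ExampleProvider extends BaseModelProvider<ExampleConfig> {
  async getDefaultModels(): Promise<ModelList> {
    const chat: Model[] = [{ name: 'example-chat', key: 'example-chat' }];
    return { chat, embedding: [] };
  }

  async getModelList(): Promise<ModelList> {
    return this.getDefaultModels();
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    // Reuses the OpenAI-compatible LLM wrapper; the base URL is a placeholder.
    return new OpenAILLM({
      apiKey: this.config.apiKey,
      temperature: 0.7,
      model: key,
      baseURL: 'https://api.example.invalid/v1',
    });
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    throw new Error('Example provider does not support embedding models.');
  }

  static parseAndValidate(raw: any): ExampleConfig {
    if (!raw || typeof raw !== 'object' || !raw.apiKey)
      throw new Error('Invalid config provided. API key must be provided');
    return { apiKey: String(raw.apiKey) };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return [];
  }

  static getProviderMetadata(): ProviderMetadata {
    return { key: 'example', name: 'Example' };
  }
}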
@@ -1,152 +0,0 @@
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { Model, ModelList, ProviderMetadata } from '../types';
|
||||
import BaseModelProvider from './baseProvider';
|
||||
import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';
|
||||
import { Embeddings } from '@langchain/core/embeddings';
|
||||
import { UIConfigField } from '@/lib/config/types';
|
||||
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
|
||||
|
||||
interface AimlConfig {
|
||||
apiKey: string;
|
||||
}
|
||||
|
||||
const providerConfigFields: UIConfigField[] = [
|
||||
{
|
||||
type: 'password',
|
||||
name: 'API Key',
|
||||
key: 'apiKey',
|
||||
description: 'Your AI/ML API key',
|
||||
required: true,
|
||||
placeholder: 'AI/ML API Key',
|
||||
env: 'AIML_API_KEY',
|
||||
scope: 'server',
|
||||
},
|
||||
];
|
||||
|
||||
class AimlProvider extends BaseModelProvider<AimlConfig> {
|
||||
constructor(id: string, name: string, config: AimlConfig) {
|
||||
super(id, name, config);
|
||||
}
|
||||
|
||||
async getDefaultModels(): Promise<ModelList> {
|
||||
try {
|
||||
const res = await fetch('https://api.aimlapi.com/models', {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
Authorization: `Bearer ${this.config.apiKey}`,
|
||||
},
|
||||
});
|
||||
|
||||
const data = await res.json();
|
||||
|
||||
const chatModels: Model[] = data.data
|
||||
.filter((m: any) => m.type === 'chat-completion')
|
||||
.map((m: any) => {
|
||||
return {
|
||||
name: m.id,
|
||||
key: m.id,
|
||||
};
|
||||
});
|
||||
|
||||
const embeddingModels: Model[] = data.data
|
||||
.filter((m: any) => m.type === 'embedding')
|
||||
.map((m: any) => {
|
||||
return {
|
||||
name: m.id,
|
||||
key: m.id,
|
||||
};
|
||||
});
|
||||
|
||||
return {
|
||||
embedding: embeddingModels,
|
||||
chat: chatModels,
|
||||
};
|
||||
} catch (err) {
|
||||
if (err instanceof TypeError) {
|
||||
throw new Error(
|
||||
'Error connecting to AI/ML API. Please ensure your API key is correct and the service is available.',
|
||||
);
|
||||
}
|
||||
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
async getModelList(): Promise<ModelList> {
|
||||
const defaultModels = await this.getDefaultModels();
|
||||
const configProvider = getConfiguredModelProviderById(this.id)!;
|
||||
|
||||
return {
|
||||
embedding: [
|
||||
...defaultModels.embedding,
|
||||
...configProvider.embeddingModels,
|
||||
],
|
||||
chat: [...defaultModels.chat, ...configProvider.chatModels],
|
||||
};
|
||||
}
|
||||
|
||||
async loadChatModel(key: string): Promise<BaseChatModel> {
|
||||
const modelList = await this.getModelList();
|
||||
|
||||
const exists = modelList.chat.find((m) => m.key === key);
|
||||
|
||||
if (!exists) {
|
||||
throw new Error(
|
||||
'Error Loading AI/ML API Chat Model. Invalid Model Selected',
|
||||
);
|
||||
}
|
||||
|
||||
return new ChatOpenAI({
|
||||
apiKey: this.config.apiKey,
|
||||
temperature: 0.7,
|
||||
model: key,
|
||||
configuration: {
|
||||
baseURL: 'https://api.aimlapi.com',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async loadEmbeddingModel(key: string): Promise<Embeddings> {
|
||||
const modelList = await this.getModelList();
|
||||
const exists = modelList.embedding.find((m) => m.key === key);
|
||||
|
||||
if (!exists) {
|
||||
throw new Error(
|
||||
'Error Loading AI/ML API Embedding Model. Invalid Model Selected.',
|
||||
);
|
||||
}
|
||||
|
||||
return new OpenAIEmbeddings({
|
||||
apiKey: this.config.apiKey,
|
||||
model: key,
|
||||
configuration: {
|
||||
baseURL: 'https://api.aimlapi.com',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
static parseAndValidate(raw: any): AimlConfig {
|
||||
if (!raw || typeof raw !== 'object')
|
||||
throw new Error('Invalid config provided. Expected object');
|
||||
if (!raw.apiKey)
|
||||
throw new Error('Invalid config provided. API key must be provided');
|
||||
|
||||
return {
|
||||
apiKey: String(raw.apiKey),
|
||||
};
|
||||
}
|
||||
|
||||
static getProviderConfigFields(): UIConfigField[] {
|
||||
return providerConfigFields;
|
||||
}
|
||||
|
||||
static getProviderMetadata(): ProviderMetadata {
|
||||
return {
|
||||
key: 'aiml',
|
||||
name: 'AI/ML API',
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
export default AimlProvider;
|
||||
src/lib/models/providers/anthropic/anthropicLLM.ts (new file)
@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';

class AnthropicLLM extends OpenAILLM {}

export default AnthropicLLM;
@@ -1,10 +1,10 @@
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { Model, ModelList, ProviderMetadata } from '../types';
|
||||
import BaseModelProvider from './baseProvider';
|
||||
import { ChatAnthropic } from '@langchain/anthropic';
|
||||
import { Embeddings } from '@langchain/core/embeddings';
|
||||
import { UIConfigField } from '@/lib/config/types';
|
||||
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
|
||||
import { Model, ModelList, ProviderMetadata } from '../../types';
|
||||
import BaseEmbedding from '../../base/embedding';
|
||||
import BaseModelProvider from '../../base/provider';
|
||||
import BaseLLM from '../../base/llm';
|
||||
import AnthropicLLM from './anthropicLLM';
|
||||
|
||||
interface AnthropicConfig {
|
||||
apiKey: string;
|
||||
@@ -67,7 +67,7 @@ class AnthropicProvider extends BaseModelProvider<AnthropicConfig> {
|
||||
};
|
||||
}
|
||||
|
||||
async loadChatModel(key: string): Promise<BaseChatModel> {
|
||||
async loadChatModel(key: string): Promise<BaseLLM<any>> {
|
||||
const modelList = await this.getModelList();
|
||||
|
||||
const exists = modelList.chat.find((m) => m.key === key);
|
||||
@@ -78,14 +78,14 @@ class AnthropicProvider extends BaseModelProvider<AnthropicConfig> {
|
||||
);
|
||||
}
|
||||
|
||||
return new ChatAnthropic({
|
||||
return new AnthropicLLM({
|
||||
apiKey: this.config.apiKey,
|
||||
temperature: 0.7,
|
||||
model: key,
|
||||
baseURL: 'https://api.anthropic.com/v1',
|
||||
});
|
||||
}
|
||||
|
||||
async loadEmbeddingModel(key: string): Promise<Embeddings> {
|
||||
async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
|
||||
throw new Error('Anthropic provider does not support embedding models.');
|
||||
}
|
||||
|
||||
@@ -1,107 +0,0 @@
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { Model, ModelList, ProviderMetadata } from '../types';
|
||||
import BaseModelProvider from './baseProvider';
|
||||
import { ChatOpenAI } from '@langchain/openai';
|
||||
import { Embeddings } from '@langchain/core/embeddings';
|
||||
import { UIConfigField } from '@/lib/config/types';
|
||||
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
|
||||
|
||||
interface DeepSeekConfig {
|
||||
apiKey: string;
|
||||
}
|
||||
|
||||
const defaultChatModels: Model[] = [
|
||||
{
|
||||
name: 'Deepseek Chat / DeepSeek V3.2 Exp',
|
||||
key: 'deepseek-chat',
|
||||
},
|
||||
{
|
||||
name: 'Deepseek Reasoner / DeepSeek V3.2 Exp',
|
||||
key: 'deepseek-reasoner',
|
||||
},
|
||||
];
|
||||
|
||||
const providerConfigFields: UIConfigField[] = [
|
||||
{
|
||||
type: 'password',
|
||||
name: 'API Key',
|
||||
key: 'apiKey',
|
||||
description: 'Your DeepSeek API key',
|
||||
required: true,
|
||||
placeholder: 'DeepSeek API Key',
|
||||
env: 'DEEPSEEK_API_KEY',
|
||||
scope: 'server',
|
||||
},
|
||||
];
|
||||
|
||||
class DeepSeekProvider extends BaseModelProvider<DeepSeekConfig> {
|
||||
constructor(id: string, name: string, config: DeepSeekConfig) {
|
||||
super(id, name, config);
|
||||
}
|
||||
|
||||
async getDefaultModels(): Promise<ModelList> {
|
||||
return {
|
||||
embedding: [],
|
||||
chat: defaultChatModels,
|
||||
};
|
||||
}
|
||||
|
||||
async getModelList(): Promise<ModelList> {
|
||||
const defaultModels = await this.getDefaultModels();
|
||||
const configProvider = getConfiguredModelProviderById(this.id)!;
|
||||
|
||||
return {
|
||||
embedding: [],
|
||||
chat: [...defaultModels.chat, ...configProvider.chatModels],
|
||||
};
|
||||
}
|
||||
|
||||
async loadChatModel(key: string): Promise<BaseChatModel> {
|
||||
const modelList = await this.getModelList();
|
||||
|
||||
const exists = modelList.chat.find((m) => m.key === key);
|
||||
|
||||
if (!exists) {
|
||||
throw new Error(
|
||||
'Error Loading DeepSeek Chat Model. Invalid Model Selected',
|
||||
);
|
||||
}
|
||||
|
||||
return new ChatOpenAI({
|
||||
apiKey: this.config.apiKey,
|
||||
temperature: 0.7,
|
||||
model: key,
|
||||
configuration: {
|
||||
baseURL: 'https://api.deepseek.com',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async loadEmbeddingModel(key: string): Promise<Embeddings> {
|
||||
throw new Error('DeepSeek provider does not support embedding models.');
|
||||
}
|
||||
|
||||
static parseAndValidate(raw: any): DeepSeekConfig {
|
||||
if (!raw || typeof raw !== 'object')
|
||||
throw new Error('Invalid config provided. Expected object');
|
||||
if (!raw.apiKey)
|
||||
throw new Error('Invalid config provided. API key must be provided');
|
||||
|
||||
return {
|
||||
apiKey: String(raw.apiKey),
|
||||
};
|
||||
}
|
||||
|
||||
static getProviderConfigFields(): UIConfigField[] {
|
||||
return providerConfigFields;
|
||||
}
|
||||
|
||||
static getProviderMetadata(): ProviderMetadata {
|
||||
return {
|
||||
key: 'deepseek',
|
||||
name: 'Deepseek AI',
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
export default DeepSeekProvider;
|
||||
src/lib/models/providers/gemini/geminiEmbedding.ts (new file)
@@ -0,0 +1,5 @@
import OpenAIEmbedding from '../openai/openaiEmbedding';

class GeminiEmbedding extends OpenAIEmbedding {}

export default GeminiEmbedding;

src/lib/models/providers/gemini/geminiLLM.ts (new file)
@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';

class GeminiLLM extends OpenAILLM {}

export default GeminiLLM;
@@ -1,13 +1,11 @@
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { Model, ModelList, ProviderMetadata } from '../types';
|
||||
import BaseModelProvider from './baseProvider';
|
||||
import {
|
||||
ChatGoogleGenerativeAI,
|
||||
GoogleGenerativeAIEmbeddings,
|
||||
} from '@langchain/google-genai';
|
||||
import { Embeddings } from '@langchain/core/embeddings';
|
||||
import { UIConfigField } from '@/lib/config/types';
|
||||
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
|
||||
import { Model, ModelList, ProviderMetadata } from '../../types';
|
||||
import GeminiEmbedding from './geminiEmbedding';
|
||||
import BaseEmbedding from '../../base/embedding';
|
||||
import BaseModelProvider from '../../base/provider';
|
||||
import BaseLLM from '../../base/llm';
|
||||
import GeminiLLM from './geminiLLM';
|
||||
|
||||
interface GeminiConfig {
|
||||
apiKey: string;
|
||||
@@ -18,9 +16,9 @@ const providerConfigFields: UIConfigField[] = [
|
||||
type: 'password',
|
||||
name: 'API Key',
|
||||
key: 'apiKey',
|
||||
description: 'Your Google Gemini API key',
|
||||
description: 'Your Gemini API key',
|
||||
required: true,
|
||||
placeholder: 'Google Gemini API Key',
|
||||
placeholder: 'Gemini API Key',
|
||||
env: 'GEMINI_API_KEY',
|
||||
scope: 'server',
|
||||
},
|
||||
@@ -85,7 +83,7 @@ class GeminiProvider extends BaseModelProvider<GeminiConfig> {
|
||||
};
|
||||
}
|
||||
|
||||
async loadChatModel(key: string): Promise<BaseChatModel> {
|
||||
async loadChatModel(key: string): Promise<BaseLLM<any>> {
|
||||
const modelList = await this.getModelList();
|
||||
|
||||
const exists = modelList.chat.find((m) => m.key === key);
|
||||
@@ -96,14 +94,14 @@ class GeminiProvider extends BaseModelProvider<GeminiConfig> {
|
||||
);
|
||||
}
|
||||
|
||||
return new ChatGoogleGenerativeAI({
|
||||
return new GeminiLLM({
|
||||
apiKey: this.config.apiKey,
|
||||
temperature: 0.7,
|
||||
model: key,
|
||||
baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai',
|
||||
});
|
||||
}
|
||||
|
||||
async loadEmbeddingModel(key: string): Promise<Embeddings> {
|
||||
async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
|
||||
const modelList = await this.getModelList();
|
||||
const exists = modelList.embedding.find((m) => m.key === key);
|
||||
|
||||
@@ -113,9 +111,10 @@ class GeminiProvider extends BaseModelProvider<GeminiConfig> {
|
||||
);
|
||||
}
|
||||
|
||||
return new GoogleGenerativeAIEmbeddings({
|
||||
return new GeminiEmbedding({
|
||||
apiKey: this.config.apiKey,
|
||||
model: key,
|
||||
baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai',
|
||||
});
|
||||
}
|
||||
|
||||
@@ -137,7 +136,7 @@ class GeminiProvider extends BaseModelProvider<GeminiConfig> {
|
||||
static getProviderMetadata(): ProviderMetadata {
|
||||
return {
|
||||
key: 'gemini',
|
||||
name: 'Google Gemini',
|
||||
name: 'Gemini',
|
||||
};
|
||||
}
|
||||
}
|
||||
src/lib/models/providers/groq/groqLLM.ts (new file)
@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';

class GroqLLM extends OpenAILLM {}

export default GroqLLM;
@@ -1,10 +1,10 @@
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { Model, ModelList, ProviderMetadata } from '../types';
|
||||
import BaseModelProvider from './baseProvider';
|
||||
import { ChatGroq } from '@langchain/groq';
|
||||
import { Embeddings } from '@langchain/core/embeddings';
|
||||
import { UIConfigField } from '@/lib/config/types';
|
||||
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
|
||||
import { Model, ModelList, ProviderMetadata } from '../../types';
|
||||
import BaseEmbedding from '../../base/embedding';
|
||||
import BaseModelProvider from '../../base/provider';
|
||||
import BaseLLM from '../../base/llm';
|
||||
import GroqLLM from './groqLLM';
|
||||
|
||||
interface GroqConfig {
|
||||
apiKey: string;
|
||||
@@ -29,8 +29,7 @@ class GroqProvider extends BaseModelProvider<GroqConfig> {
|
||||
}
|
||||
|
||||
async getDefaultModels(): Promise<ModelList> {
|
||||
try {
|
||||
const res = await fetch('https://api.groq.com/openai/v1/models', {
|
||||
const res = await fetch(`https://api.groq.com/openai/v1/models`, {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
@@ -40,26 +39,19 @@ class GroqProvider extends BaseModelProvider<GroqConfig> {
|
||||
|
||||
const data = await res.json();
|
||||
|
||||
const models: Model[] = data.data.map((m: any) => {
|
||||
return {
|
||||
name: m.id,
|
||||
const defaultChatModels: Model[] = [];
|
||||
|
||||
data.data.forEach((m: any) => {
|
||||
defaultChatModels.push({
|
||||
key: m.id,
|
||||
};
|
||||
name: m.id,
|
||||
});
|
||||
});
|
||||
|
||||
return {
|
||||
embedding: [],
|
||||
chat: models,
|
||||
chat: defaultChatModels,
|
||||
};
|
||||
} catch (err) {
|
||||
if (err instanceof TypeError) {
|
||||
throw new Error(
|
||||
'Error connecting to Groq API. Please ensure your API key is correct and the Groq service is available.',
|
||||
);
|
||||
}
|
||||
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
async getModelList(): Promise<ModelList> {
|
||||
@@ -67,12 +59,15 @@ class GroqProvider extends BaseModelProvider<GroqConfig> {
|
||||
const configProvider = getConfiguredModelProviderById(this.id)!;
|
||||
|
||||
return {
|
||||
embedding: [],
|
||||
embedding: [
|
||||
...defaultModels.embedding,
|
||||
...configProvider.embeddingModels,
|
||||
],
|
||||
chat: [...defaultModels.chat, ...configProvider.chatModels],
|
||||
};
|
||||
}
|
||||
|
||||
async loadChatModel(key: string): Promise<BaseChatModel> {
|
||||
async loadChatModel(key: string): Promise<BaseLLM<any>> {
|
||||
const modelList = await this.getModelList();
|
||||
|
||||
const exists = modelList.chat.find((m) => m.key === key);
|
||||
@@ -81,15 +76,15 @@ class GroqProvider extends BaseModelProvider<GroqConfig> {
|
||||
throw new Error('Error Loading Groq Chat Model. Invalid Model Selected');
|
||||
}
|
||||
|
||||
return new ChatGroq({
|
||||
return new GroqLLM({
|
||||
apiKey: this.config.apiKey,
|
||||
temperature: 0.7,
|
||||
model: key,
|
||||
baseURL: 'https://api.groq.com/openai/v1',
|
||||
});
|
||||
}
|
||||
|
||||
async loadEmbeddingModel(key: string): Promise<Embeddings> {
|
||||
throw new Error('Groq provider does not support embedding models.');
|
||||
async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
|
||||
throw new Error('Groq Provider does not support embedding models.');
|
||||
}
|
||||
|
||||
static parseAndValidate(raw: any): GroqConfig {
|
||||
@@ -1,27 +1,21 @@
|
||||
import { ModelProviderUISection } from '@/lib/config/types';
|
||||
import { ProviderConstructor } from './baseProvider';
|
||||
import { ProviderConstructor } from '../base/provider';
|
||||
import OpenAIProvider from './openai';
|
||||
import OllamaProvider from './ollama';
|
||||
import TransformersProvider from './transformers';
|
||||
import AnthropicProvider from './anthropic';
|
||||
import GeminiProvider from './gemini';
|
||||
import TransformersProvider from './transformers';
|
||||
import GroqProvider from './groq';
|
||||
import DeepSeekProvider from './deepseek';
|
||||
import LMStudioProvider from './lmstudio';
|
||||
import LemonadeProvider from './lemonade';
|
||||
import AimlProvider from '@/lib/models/providers/aiml';
|
||||
import AnthropicProvider from './anthropic';
|
||||
|
||||
export const providers: Record<string, ProviderConstructor<any>> = {
|
||||
openai: OpenAIProvider,
|
||||
ollama: OllamaProvider,
|
||||
transformers: TransformersProvider,
|
||||
anthropic: AnthropicProvider,
|
||||
gemini: GeminiProvider,
|
||||
transformers: TransformersProvider,
|
||||
groq: GroqProvider,
|
||||
deepseek: DeepSeekProvider,
|
||||
aiml: AimlProvider,
|
||||
lmstudio: LMStudioProvider,
|
||||
lemonade: LemonadeProvider,
|
||||
anthropic: AnthropicProvider,
|
||||
};
|
||||
|
||||
export const getModelProvidersUIConfigSection =
|
||||
|
||||
@@ -1,10 +1,11 @@
|
||||
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
|
||||
import { Model, ModelList, ProviderMetadata } from '../types';
|
||||
import BaseModelProvider from './baseProvider';
|
||||
import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';
|
||||
import { Embeddings } from '@langchain/core/embeddings';
|
||||
import { UIConfigField } from '@/lib/config/types';
|
||||
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
|
||||
import BaseModelProvider from '../../base/provider';
|
||||
import { Model, ModelList, ProviderMetadata } from '../../types';
|
||||
import BaseLLM from '../../base/llm';
|
||||
import LemonadeLLM from './lemonadeLLM';
|
||||
import BaseEmbedding from '../../base/embedding';
|
||||
import LemonadeEmbedding from './lemonadeEmbedding';
|
||||
|
||||
interface LemonadeConfig {
|
||||
baseURL: string;
|
||||
@@ -41,22 +42,21 @@ class LemonadeProvider extends BaseModelProvider<LemonadeConfig> {
|
||||
|
||||
async getDefaultModels(): Promise<ModelList> {
|
||||
try {
|
||||
const headers: Record<string, string> = {
|
||||
'Content-Type': 'application/json',
|
||||
};
|
||||
|
||||
if (this.config.apiKey) {
|
||||
headers['Authorization'] = `Bearer ${this.config.apiKey}`;
|
||||
}
|
||||
|
||||
const res = await fetch(`${this.config.baseURL}/models`, {
|
||||
method: 'GET',
|
||||
headers,
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
...(this.config.apiKey
|
||||
? { Authorization: `Bearer ${this.config.apiKey}` }
|
||||
: {}),
|
||||
},
|
||||
});
|
||||
|
||||
const data = await res.json();
|
||||
|
||||
const models: Model[] = data.data.map((m: any) => {
|
||||
const models: Model[] = data.data
|
||||
.filter((m: any) => m.recipe === 'llamacpp')
|
||||
.map((m: any) => {
|
||||
return {
|
||||
name: m.id,
|
||||
key: m.id,
|
||||
@@ -91,7 +91,7 @@ class LemonadeProvider extends BaseModelProvider<LemonadeConfig> {
|
||||
};
|
||||
}
|
||||
|
||||
async loadChatModel(key: string): Promise<BaseChatModel> {
|
||||
async loadChatModel(key: string): Promise<BaseLLM<any>> {
|
||||
const modelList = await this.getModelList();
|
||||
|
||||
const exists = modelList.chat.find((m) => m.key === key);
|
||||
@@ -102,17 +102,14 @@ class LemonadeProvider extends BaseModelProvider<LemonadeConfig> {
|
||||
);
|
||||
}
|
||||
|
||||
return new ChatOpenAI({
|
||||
return new LemonadeLLM({
|
||||
apiKey: this.config.apiKey || 'not-needed',
|
||||
temperature: 0.7,
|
||||
model: key,
|
||||
configuration: {
|
||||
baseURL: this.config.baseURL,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async loadEmbeddingModel(key: string): Promise<Embeddings> {
|
||||
async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
|
||||
const modelList = await this.getModelList();
|
||||
const exists = modelList.embedding.find((m) => m.key === key);
|
||||
|
||||
@@ -122,12 +119,10 @@ class LemonadeProvider extends BaseModelProvider<LemonadeConfig> {
|
||||
);
|
||||
}
|
||||
|
||||
return new OpenAIEmbeddings({
|
||||
return new LemonadeEmbedding({
|
||||
apiKey: this.config.apiKey || 'not-needed',
|
||||
model: key,
|
||||
configuration: {
|
||||
baseURL: this.config.baseURL,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
src/lib/models/providers/lemonade/lemonadeEmbedding.ts (new file)
@@ -0,0 +1,5 @@
import OpenAIEmbedding from '../openai/openaiEmbedding';

class LemonadeEmbedding extends OpenAIEmbedding {}

export default LemonadeEmbedding;

src/lib/models/providers/lemonade/lemonadeLLM.ts (new file)
@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';

class LemonadeLLM extends OpenAILLM {}

export default LemonadeLLM;
@@ -1,148 +0,0 @@
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { Model, ModelList, ProviderMetadata } from '../types';
import BaseModelProvider from './baseProvider';
import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';
import { Embeddings } from '@langchain/core/embeddings';
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';

interface LMStudioConfig {
  baseURL: string;
}

const providerConfigFields: UIConfigField[] = [
  {
    type: 'string',
    name: 'Base URL',
    key: 'baseURL',
    description: 'The base URL for LM Studio server',
    required: true,
    placeholder: 'http://localhost:1234',
    env: 'LM_STUDIO_BASE_URL',
    scope: 'server',
  },
];

class LMStudioProvider extends BaseModelProvider<LMStudioConfig> {
  constructor(id: string, name: string, config: LMStudioConfig) {
    super(id, name, config);
  }

  private normalizeBaseURL(url: string): string {
    const trimmed = url.trim().replace(/\/+$/, '');
    return trimmed.endsWith('/v1') ? trimmed : `${trimmed}/v1`;
  }

  async getDefaultModels(): Promise<ModelList> {
    try {
      const baseURL = this.normalizeBaseURL(this.config.baseURL);

      const res = await fetch(`${baseURL}/models`, {
        method: 'GET',
        headers: {
          'Content-Type': 'application/json',
        },
      });

      const data = await res.json();

      const models: Model[] = data.data.map((m: any) => {
        return {
          name: m.id,
          key: m.id,
        };
      });

      return {
        embedding: models,
        chat: models,
      };
    } catch (err) {
      if (err instanceof TypeError) {
        throw new Error(
          'Error connecting to LM Studio. Please ensure the base URL is correct and the LM Studio server is running.',
        );
      }

      throw err;
    }
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [
        ...defaultModels.embedding,
        ...configProvider.embeddingModels,
      ],
      chat: [...defaultModels.chat, ...configProvider.chatModels],
    };
  }

  async loadChatModel(key: string): Promise<BaseChatModel> {
    const modelList = await this.getModelList();

    const exists = modelList.chat.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading LM Studio Chat Model. Invalid Model Selected',
      );
    }

    return new ChatOpenAI({
      apiKey: 'lm-studio',
      temperature: 0.7,
      model: key,
      streaming: true,
      configuration: {
        baseURL: this.normalizeBaseURL(this.config.baseURL),
      },
    });
  }

  async loadEmbeddingModel(key: string): Promise<Embeddings> {
    const modelList = await this.getModelList();
    const exists = modelList.embedding.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading LM Studio Embedding Model. Invalid Model Selected.',
      );
    }

    return new OpenAIEmbeddings({
      apiKey: 'lm-studio',
      model: key,
      configuration: {
        baseURL: this.normalizeBaseURL(this.config.baseURL),
      },
    });
  }

  static parseAndValidate(raw: any): LMStudioConfig {
    if (!raw || typeof raw !== 'object')
      throw new Error('Invalid config provided. Expected object');
    if (!raw.baseURL)
      throw new Error('Invalid config provided. Base URL must be provided');

    return {
      baseURL: String(raw.baseURL),
    };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'lmstudio',
      name: 'LM Studio',
    };
  }
}

export default LMStudioProvider;
@@ -1,10 +1,11 @@
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { Model, ModelList, ProviderMetadata } from '../types';
import BaseModelProvider from './baseProvider';
import { ChatOllama, OllamaEmbeddings } from '@langchain/ollama';
import { Embeddings } from '@langchain/core/embeddings';
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import BaseModelProvider from '../../base/provider';
import { Model, ModelList, ProviderMetadata } from '../../types';
import BaseLLM from '../../base/llm';
import BaseEmbedding from '../../base/embedding';
import OllamaLLM from './ollamaLLM';
import OllamaEmbedding from './ollamaEmbedding';

interface OllamaConfig {
  baseURL: string;

@@ -76,7 +77,7 @@ class OllamaProvider extends BaseModelProvider<OllamaConfig> {
    };
  }

  async loadChatModel(key: string): Promise<BaseChatModel> {
  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    const modelList = await this.getModelList();

    const exists = modelList.chat.find((m) => m.key === key);

@@ -87,14 +88,13 @@ class OllamaProvider extends BaseModelProvider<OllamaConfig> {
      );
    }

    return new ChatOllama({
      temperature: 0.7,
    return new OllamaLLM({
      baseURL: this.config.baseURL,
      model: key,
      baseUrl: this.config.baseURL,
    });
  }

  async loadEmbeddingModel(key: string): Promise<Embeddings> {
  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.embedding.find((m) => m.key === key);

@@ -104,9 +104,9 @@ class OllamaProvider extends BaseModelProvider<OllamaConfig> {
      );
    }

    return new OllamaEmbeddings({
    return new OllamaEmbedding({
      model: key,
      baseUrl: this.config.baseURL,
      baseURL: this.config.baseURL,
    });
  }
Some files were not shown because too many files have changed in this diff.