Compare commits

...

157 Commits

Author SHA1 Message Date
Kushagra Srivastava
d7b020e5bb Update README.md 2026-01-10 23:03:58 +05:30
ItzCrazyKns
d95ff9ccdd Update docker-compose.yaml 2026-01-08 22:36:01 +05:30
ItzCrazyKns
8347b798f3 feat(app): lint & beautify 2026-01-03 23:12:19 +05:30
ItzCrazyKns
a16472bcf3 feat(actions): prevent double conversion to object array 2026-01-01 21:56:46 +05:30
ItzCrazyKns
3b8d8be676 feat(package): bump version 2025-12-31 12:58:59 +05:30
ItzCrazyKns
b83f9bac78 feat(providers): extract/repair json before parsing 2025-12-31 12:58:24 +05:30
ItzCrazyKns
bd7c563137 feat(package): add json repair 2025-12-31 12:57:59 +05:30
ItzCrazyKns
23b903db9a Update searxng.ts 2025-12-30 22:16:06 +05:30
ItzCrazyKns
a98f0df83f feat(app): lint & beautify 2025-12-29 22:02:21 +05:30
ItzCrazyKns
164d528761 feat(compose): add build context, remove uploads 2025-12-28 13:11:05 +05:30
ItzCrazyKns
af4ec17117 Update docker-compose.yaml 2025-12-28 12:49:25 +05:30
ItzCrazyKns
1622e0893a feat(providers): add lm studio 2025-12-28 11:29:34 +05:30
ItzCrazyKns
55a4b9d436 feat(openai-llm): use function call index instead of type 2025-12-28 01:21:33 +05:30
ItzCrazyKns
b450d0e668 Merge branch 'canary' 2025-12-27 20:52:56 +05:30
ItzCrazyKns
0987ee4370 Update next.config.mjs 2025-12-27 20:29:11 +05:30
ItzCrazyKns
d1bd22786d Update package.json 2025-12-27 20:15:34 +05:30
ItzCrazyKns
bb7b7170ca feat(media, suggestions): handle chat history correctly 2025-12-27 20:03:34 +05:30
ItzCrazyKns
be7bd62a74 feat(prompts): update media 2025-12-27 20:02:49 +05:30
ItzCrazyKns
a691f3bab0 feat(chat-hook): fix history saving delay (async state), add delay before media search to allow component refresh 2025-12-27 20:02:36 +05:30
ItzCrazyKns
f1c9fa0e33 feat(package): bump version 2025-12-27 18:56:16 +05:30
ItzCrazyKns
d872cf5009 feat(chat-hook): prevent duplicate blocks 2025-12-27 18:36:13 +05:30
ItzCrazyKns
fdef718980 feat(transformer-provider): specify dtype 2025-12-27 18:36:01 +05:30
ItzCrazyKns
19dde42f22 feat(app): fix build errors, use webpack 2025-12-27 18:35:30 +05:30
ItzCrazyKns
c9f6893d99 feat(pdf-parse): fix DOMMatrix issues 2025-12-27 14:54:46 +05:30
ItzCrazyKns
53e9859b6c Update README.md 2025-12-27 13:35:47 +05:30
Kushagra Srivastava
53e39cd985 Merge pull request #950 from ItzCrazyKns/feat/improve-search-architecture
feat: improve search architecture, write custom API classes (remove langchain), add deep research & more
2025-12-27 13:33:54 +05:30
Kushagra Srivastava
7f3f881964 Update src/components/Navbar.tsx
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2025-12-27 13:32:20 +05:30
Kushagra Srivastava
9620e63e3f Update src/components/MessageActions/Copy.tsx
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2025-12-27 13:29:43 +05:30
ItzCrazyKns
ec5ff6f4a8 Update plan.ts 2025-12-27 13:26:07 +05:30
ItzCrazyKns
0ace778b03 Merge branch 'feat/improve-search-architecture' of https://github.com/ItzCrazyKns/Perplexica into feat/improve-search-architecture 2025-12-27 13:24:50 +05:30
ItzCrazyKns
6919ad1a0f feat(app): address review 2025-12-27 13:24:35 +05:30
Kushagra Srivastava
b5ba8c48c0 Update src/components/WeatherWidget.tsx
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2025-12-27 13:14:46 +05:30
ItzCrazyKns
65fdecb122 feat(docs): update architecture docs 2025-12-27 13:09:11 +05:30
ItzCrazyKns
5a44319d85 feat(guides): update contributing guides 2025-12-27 13:09:01 +05:30
ItzCrazyKns
cc183cd0cd feat(readme): update features & upcoming features 2025-12-27 13:08:28 +05:30
ItzCrazyKns
50ca7ac73a feat(api): update search api & related documentation 2025-12-27 13:07:59 +05:30
ItzCrazyKns
a31a4ab295 feat(agents): add api search agent 2025-12-27 13:07:42 +05:30
ItzCrazyKns
edba47aed8 feat(db): add migration scripts 2025-12-26 14:51:24 +05:30
ItzCrazyKns
ae132ebee8 feat(app): lint & beautify 2025-12-25 18:58:33 +05:30
ItzCrazyKns
60dd7a8108 feat(ui): fix theming issues 2025-12-24 17:24:07 +05:30
ItzCrazyKns
f5e054f6ea feat(chat): fix hidden input 2025-12-24 15:48:16 +05:30
ItzCrazyKns
452180356d feat(library): enhance ui & ux 2025-12-24 15:47:56 +05:30
ItzCrazyKns
0a9641a110 feat(providers): add anthropic 2025-12-24 15:24:06 +05:30
ItzCrazyKns
a2f2e17bbb feat(providers): add lemonade 2025-12-24 14:12:22 +05:30
ItzCrazyKns
e1afcbb787 feat(package): add google genai & bump transformers 2025-12-24 13:56:43 +05:30
ItzCrazyKns
fe2c1b8210 feat(providers): update index map 2025-12-24 13:56:24 +05:30
ItzCrazyKns
d40fcd57d9 feat(ollama): add nemotron to thinking list 2025-12-24 13:56:11 +05:30
ItzCrazyKns
86a43086cc feat(providers): add transformers 2025-12-24 13:55:56 +05:30
ItzCrazyKns
9ce17edd4a feat(providers): add groq 2025-12-24 13:55:42 +05:30
ItzCrazyKns
c4349f3d5c feat(providers): add gemini 2025-12-24 13:55:32 +05:30
ItzCrazyKns
d4c276ab93 Update types.ts 2025-12-24 13:55:12 +05:30
ItzCrazyKns
6ae885e0ed feat(steps): display after loading animation 2025-12-24 13:55:07 +05:30
ItzCrazyKns
dc74e7174f feat(researcher): rename 0_reasoning to __reasoning_preamble to comply with provider guidelines 2025-12-24 13:54:49 +05:30
ItzCrazyKns
53697bb42e feat(classifier-prompt): add calculation widget 2025-12-24 13:53:35 +05:30
ItzCrazyKns
eca66f0b5f feat(writer): add system instructions, send response block on response 2025-12-24 13:53:09 +05:30
ItzCrazyKns
cf95ea0af7 feat(app): lint & beautify 2025-12-23 18:54:01 +05:30
ItzCrazyKns
24c32ed881 feat(app): enhance attach transition 2025-12-23 18:53:40 +05:30
ItzCrazyKns
b47f522bf2 feat(app): update guide for run command 2025-12-23 18:40:30 +05:30
ItzCrazyKns
ea18c13326 feat(app): remove uploads 2025-12-23 18:38:25 +05:30
ItzCrazyKns
b706434bac feat(chat-window): display only when ready 2025-12-23 17:56:15 +05:30
ItzCrazyKns
2c65bd916b feat(chat-hook): set ready before reconnecting 2025-12-23 17:29:14 +05:30
ItzCrazyKns
c3b74a3fd0 feat(assistant-steps): only open last comp 2025-12-23 17:17:56 +05:30
ItzCrazyKns
5f04034650 feat(chat-hook): handle reconnect 2025-12-23 17:17:19 +05:30
ItzCrazyKns
5847379db0 Update types.ts 2025-12-23 17:15:46 +05:30
ItzCrazyKns
8520ea6fe5 feat(researcher): emit sources as block 2025-12-23 17:15:42 +05:30
ItzCrazyKns
a6d4f47130 feat(search-agent): save history 2025-12-23 17:15:32 +05:30
ItzCrazyKns
f278eb8bf1 feat(routes): add reconnect route 2025-12-23 17:15:02 +05:30
ItzCrazyKns
0e176e0b78 feat(chat-route): add history saving, disconnect on abort, use subscribe method 2025-12-23 17:14:02 +05:30
ItzCrazyKns
8ba64be446 feat(session): fix sessions getting disregarded due to reload 2025-12-23 17:12:56 +05:30
ItzCrazyKns
216332fb20 feat(session): add subscribe method, getAllBlocks 2025-12-23 17:12:15 +05:30
ItzCrazyKns
68a9e048ac feat(schema): change focusMode to sources 2025-12-23 17:11:38 +05:30
ItzCrazyKns
13d6bcf113 Update Optimization.tsx 2025-12-22 17:58:30 +05:30
ItzCrazyKns
94a24d4058 feat(message-input): add overflow to prevent blocked popovers 2025-12-21 19:56:43 +05:30
ItzCrazyKns
300cfa35c7 Update Optimization.tsx 2025-12-19 16:45:46 +05:30
ItzCrazyKns
85273493a0 feat(copy): fix type mismatch 2025-12-19 16:35:13 +05:30
ItzCrazyKns
6e2345bd2d feat(message-box): update markdown2jsx overrides to render codeblock 2025-12-19 16:27:55 +05:30
ItzCrazyKns
fdee29c93e feat(renderers): add code block 2025-12-19 16:26:51 +05:30
ItzCrazyKns
21cb0f5fd9 feat(app): add syntax highlighter 2025-12-19 16:26:38 +05:30
ItzCrazyKns
a82b605c70 feat(citation): move to message renderer 2025-12-19 16:26:13 +05:30
ItzCrazyKns
64683e3dec feat(assistant-steps): improve style 2025-12-19 16:25:56 +05:30
ItzCrazyKns
604774ef6e feat(social-search): add social search 2025-12-18 13:56:39 +05:30
ItzCrazyKns
ac183a90e8 feat(academic-search): add academic search 2025-12-18 13:56:26 +05:30
ItzCrazyKns
5511a276d4 Update Sources.tsx 2025-12-18 13:56:08 +05:30
ItzCrazyKns
473a04b6a5 feat(suggestions): prevent icon from shrinking 2025-12-18 13:56:04 +05:30
ItzCrazyKns
491136822f feat(app): lint & beautify 2025-12-17 21:17:21 +05:30
ItzCrazyKns
6e086953b1 feat(agents): add academic and social search 2025-12-17 21:17:08 +05:30
ItzCrazyKns
1961e4e707 feat(empty-chat-message-input): use sources 2025-12-15 23:49:26 +05:30
ItzCrazyKns
249889f55a feat(actions-registry): add sources, update web search to become active on web 2025-12-15 23:49:11 +05:30
ItzCrazyKns
9b2c229e9c feat(message-input): remove copilot toggle 2025-12-15 23:48:32 +05:30
ItzCrazyKns
4bdb90e150 feat(message-input-actions): update to use motion, improve animations 2025-12-15 23:48:14 +05:30
ItzCrazyKns
f9cc97ffb5 feat(message-input-actions): add sources 2025-12-15 23:47:48 +05:30
ItzCrazyKns
9dd670f46a feat(chat-hook): handle sources 2025-12-15 23:47:38 +05:30
ItzCrazyKns
bd3c5f895a feat(message-input-actions): remove copilot, focus selector 2025-12-15 23:47:21 +05:30
ItzCrazyKns
e6c8a0aa6f Add antialiased class to body element 2025-12-15 23:47:01 +05:30
ItzCrazyKns
b90b92079b feat(chat-route): accept sources 2025-12-15 23:46:46 +05:30
ItzCrazyKns
a3065d58ef feat(package): add motion, react tooltip, phosphor icons 2025-12-15 23:46:11 +05:30
ItzCrazyKns
ca4809f0f2 Update manager.ts 2025-12-14 19:32:09 +05:30
ItzCrazyKns
3d1d164f68 feat(app): lint & beautify 2025-12-13 22:23:54 +05:30
ItzCrazyKns
a99702d837 feat(app): update UI to handle uploads 2025-12-13 22:23:39 +05:30
ItzCrazyKns
60675955e4 feat(researcher-prompt): add user uploaded files 2025-12-13 22:23:08 +05:30
ItzCrazyKns
a6ff94d030 feat(api): update to use fileIds 2025-12-13 22:22:41 +05:30
ItzCrazyKns
748ee4d3c2 feat(actions): add uploads search action 2025-12-13 22:22:17 +05:30
ItzCrazyKns
1f3bf8da32 feat(researcher): use reasoning 2025-12-13 22:21:44 +05:30
ItzCrazyKns
8d471ac40e feat(registry): update to send fileIds 2025-12-13 22:21:22 +05:30
ItzCrazyKns
40b25a487b feat(uploads): update to use new manager 2025-12-13 22:20:26 +05:30
ItzCrazyKns
3949748bbd feat(suggestions-agent): fix type errors 2025-12-13 22:19:52 +05:30
ItzCrazyKns
56e47d6c39 feat(ollama-llm): use hash to generate id 2025-12-13 22:19:38 +05:30
ItzCrazyKns
fd745577d6 feat(writer-prompt): revert to old prompt to fix length issues 2025-12-13 22:19:06 +05:30
ItzCrazyKns
86ea3cde7e feat(types): add upload research blocks 2025-12-13 22:18:48 +05:30
ItzCrazyKns
aeb90cb137 feat(uploads): add uploads store with reciprocal rerank fusion 2025-12-13 22:18:33 +05:30
ItzCrazyKns
6473e51fde feat(uploads): add uploads manager 2025-12-13 22:18:07 +05:30
ItzCrazyKns
c7c327a7bb feat(utils): add token based text splitting 2025-12-13 22:17:51 +05:30
ItzCrazyKns
0688630863 feat(actions): update web search action to use reasoning 2025-12-13 22:17:02 +05:30
ItzCrazyKns
0b9e193ed1 feat(actions): rename plan to reasoning 2025-12-13 22:16:21 +05:30
ItzCrazyKns
8d1b04e05f feat(search-agent): use index + 1 to fix zero errors 2025-12-13 22:15:47 +05:30
ItzCrazyKns
ff4cf98b50 feat(media-search): fix type errors 2025-12-13 22:15:29 +05:30
ItzCrazyKns
13ae0b9451 feat(package): remove langchain, other unused packages 2025-12-13 22:14:29 +05:30
ItzCrazyKns
0cfa01422c Create fileSearch.ts 2025-12-12 23:56:34 +05:30
ItzCrazyKns
fdaa2f0646 feat(openai): update model list 2025-12-12 00:22:59 +05:30
ItzCrazyKns
fc0c444b6a feat(researcher-prompt): add mode based prompts 2025-12-09 11:42:11 +05:30
ItzCrazyKns
01b537ade1 feat(actions): add tool description, description 2025-12-09 11:41:55 +05:30
ItzCrazyKns
3bffc72422 feat(types): update research action type 2025-12-09 11:40:40 +05:30
ItzCrazyKns
6016090f12 feat(actions): stream results internally 2025-12-08 13:10:11 +05:30
ItzCrazyKns
8aed9518a2 feat(researcher): pass research block id 2025-12-08 13:09:52 +05:30
ItzCrazyKns
2df6250ba1 feat(weather): respect unit preference 2025-12-08 13:09:21 +05:30
ItzCrazyKns
85f6c3b901 feat(client-registry): add getMeasurementUnit 2025-12-08 13:08:52 +05:30
ItzCrazyKns
96001a9e26 feat(assistant-steps): handle reading, search_results 2025-12-08 13:08:26 +05:30
ItzCrazyKns
331387efa4 feat(search): add better context handling 2025-12-08 13:07:52 +05:30
ItzCrazyKns
d0e71e6482 feat(types): add search_results research block 2025-12-08 13:07:16 +05:30
ItzCrazyKns
e329820bc8 feat(package): update lucide-react, framer-motion 2025-12-08 13:06:58 +05:30
ItzCrazyKns
5174820554 feat(package): bump next version 2025-12-07 22:09:11 +05:30
ItzCrazyKns
1c3a5fe275 feat(actions): limit urls & queries to 3 2025-12-07 22:08:46 +05:30
ItzCrazyKns
d0124b9f06 feat(actions): add scrape URL action 2025-12-06 18:54:37 +05:30
ItzCrazyKns
a14f3e9464 feat(prompts): update researcher prompt 2025-12-06 15:38:55 +05:30
ItzCrazyKns
9afea48d31 feat(search-agent): use function calling 2025-12-06 15:38:40 +05:30
ItzCrazyKns
2d82cd65d9 feat(registry): register plan action 2025-12-06 15:38:20 +05:30
ItzCrazyKns
97838fd693 feat(actions): add plan, update done & web search 2025-12-06 15:38:07 +05:30
ItzCrazyKns
8ab675b119 feat(action-registry): use tool types, add tool methods 2025-12-06 15:37:36 +05:30
ItzCrazyKns
5e3001756b feat(search-types0: add reasoning action 2025-12-06 15:31:57 +05:30
ItzCrazyKns
4c4c1d1930 feat(ollama-llm): process ollama messages with tool calls 2025-12-06 15:31:35 +05:30
ItzCrazyKns
3c524b0f98 feat(openai-llm): process assistant message with tool calls 2025-12-06 15:25:48 +05:30
ItzCrazyKns
e99c8bdd50 feat(models-types): update to use Message 2025-12-06 15:25:15 +05:30
ItzCrazyKns
574b3d55e2 feat(types): separate user, assistant & system message 2025-12-06 15:24:46 +05:30
ItzCrazyKns
f2f2af9451 feat(message-input): hide content after input 2025-12-06 15:24:15 +05:30
ItzCrazyKns
65ef299d72 feat(settings): display app version, link 2025-12-06 15:22:06 +05:30
ItzCrazyKns
4fc810d976 feat(calculation-widget): enhance UI 2025-12-05 21:30:41 +05:30
ItzCrazyKns
a548fd694a feat(utils): compute cosine similarity, remove package 2025-12-05 21:28:15 +05:30
ItzCrazyKns
2c61f47088 feat(openai-llm): implement function calling 2025-12-05 21:17:43 +05:30
ItzCrazyKns
1c0e90c8e0 feat(ollama-llm): implement function calling 2025-12-05 21:17:28 +05:30
ItzCrazyKns
ee5d9172a4 feat(models): add tool, tool call 2025-12-05 21:16:41 +05:30
ItzCrazyKns
c35b684dc5 feat(types): add ToolMessage, Message 2025-12-05 21:08:37 +05:30
ItzCrazyKns
046f159528 feat(widgets): use new classifier, implement new widget executor, delete registry 2025-12-02 11:52:40 +05:30
ItzCrazyKns
6899b49ca0 Merge branch 'feat/improve-search-architecture' of https://github.com/ItzCrazyKns/Perplexica into feat/improve-search-architecture 2025-12-02 11:52:31 +05:30
ItzCrazyKns
dbc2137efb Revise writer prompt for warmer, conversational tone 2025-12-02 11:51:17 +05:30
ItzCrazyKns
1ea348ddb7 feat(classifier-prompt): update and add showCalculationWidget 2025-12-02 11:50:54 +05:30
ItzCrazyKns
b8a7fb936f feat(classifier): add showCalculationWidget 2025-12-02 11:50:26 +05:30
ItzCrazyKns
33c8f454a3 feat(weather-widget): do not round temperature 2025-12-02 11:49:37 +05:30
112 changed files with 6444 additions and 2994 deletions

View File

@@ -11,33 +11,63 @@ Perplexica's codebase is organized as follows:
 - **UI Components and Pages**:
   - **Components (`src/components`)**: Reusable UI components.
   - **Pages and Routes (`src/app`)**: Next.js app directory structure with page components.
-    - Main app routes include: home (`/`), chat (`/c`), discover (`/discover`), library (`/library`), and settings (`/settings`).
+    - Main app routes include: home (`/`), chat (`/c`), discover (`/discover`), and library (`/library`).
-  - **API Routes (`src/app/api`)**: API endpoints implemented with Next.js API routes.
+  - **API Routes (`src/app/api`)**: Server endpoints implemented with Next.js route handlers.
+    - `/api/chat`: Handles chat interactions.
+    - `/api/search`: Provides direct access to Perplexica's search capabilities.
+    - Other endpoints for models, files, and suggestions.
 - **Backend Logic (`src/lib`)**: Contains all the backend functionality including search, database, and API logic.
-  - The search functionality is present inside `src/lib/search` directory.
-  - All of the focus modes are implemented using the Meta Search Agent class in `src/lib/search/metaSearchAgent.ts`.
+  - The search system lives in `src/lib/agents/search`.
+  - The search pipeline is split into classification, research, widgets, and writing.
   - Database functionality is in `src/lib/db`.
-  - Chat model and embedding model providers are managed in `src/lib/providers`.
+  - Chat model and embedding model providers are in `src/lib/models/providers`, and models are loaded via `src/lib/models/registry.ts`.
-  - Prompt templates and LLM chain definitions are in `src/lib/prompts` and `src/lib/chains` respectively.
+  - Prompt templates are in `src/lib/prompts`.
+  - SearXNG integration is in `src/lib/searxng.ts`.
+  - Upload search lives in `src/lib/uploads`.
+
+### Where to make changes
+
+If you are not sure where to start, use this section as a map.
+
+- **Search behavior and reasoning**
+  - `src/lib/agents/search` contains the core chat and search pipeline.
+  - `classifier.ts` decides whether research is needed and what should run.
+  - `researcher/` gathers information in the background.
+- **Add or change a search capability** (see the sketch after this section)
+  - Research tools (web, academic, discussions, uploads, scraping) live in `src/lib/agents/search/researcher/actions`.
+  - Tools are registered in `src/lib/agents/search/researcher/actions/index.ts`.
+- **Add or change widgets**
+  - Widgets live in `src/lib/agents/search/widgets`.
+  - Widgets run in parallel with research and show structured results in the UI.
+- **Model integrations**
+  - Providers live in `src/lib/models/providers`.
+  - Add new providers there and wire them into the model registry so they show up in the app.
+- **Architecture docs**
+  - High level overview: `docs/architecture/README.md`
+  - High level flow: `docs/architecture/WORKING.md`
 
 ## API Documentation
 
-Perplexica exposes several API endpoints for programmatic access, including:
+Perplexica includes API documentation for programmatic access.
 
-- **Search API**: Access Perplexica's advanced search capabilities directly via the `/api/search` endpoint. For detailed documentation, see `docs/api/search.md`.
+- **Search API**: For detailed documentation, see `docs/API/SEARCH.md`.
 
 ## Setting Up Your Environment
 
 Before diving into coding, setting up your local environment is key. Here's what you need to do:
 
-1. In the root directory, locate the `sample.config.toml` file.
-2. Rename it to `config.toml` and fill in the necessary configuration fields.
-3. Run `npm install` to install all dependencies.
-4. Run `npm run db:migrate` to set up the local sqlite database.
-5. Use `npm run dev` to start the application in development mode.
+1. Run `npm install` to install all dependencies.
+2. Use `npm run dev` to start the application in development mode.
+3. Open http://localhost:3000 and complete the setup in the UI (API keys, models, search backend URL, etc.).
+
+Database migrations are applied automatically on startup.
+
+For full installation options (Docker and non Docker), see the installation guide in the repository README.
 
 **Please note**: Docker configurations are present for setting up production environments, whereas `npm run dev` is used for development purposes.
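The new contributing map points tool authors at the actions directory, but the action interface itself is not part of this diff. The sketch below is purely illustrative; every name in it is hypothetical and only conveys the general shape of a pluggable research tool.

```typescript
// Hypothetical sketch only: the real interface in
// src/lib/agents/search/researcher/actions is not shown in this PR.
interface ResearchAction {
  name: string; // tool name the model can call
  description: string; // tells the model when the tool applies
  execute(input: { query: string }): Promise<string>; // gathers results
}

// A trivial example action; per the map above, registration would happen
// in actions/index.ts via whatever mechanism that file actually uses.
const exampleAction: ResearchAction = {
  name: 'example_search',
  description: 'Illustrative tool that echoes its query.',
  async execute({ query }) {
    return `results for ${query}`;
  },
};
```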

View File

@@ -18,9 +18,11 @@ Want to know more about its architecture and how it works? You can read it [here
 🤖 **Support for all major AI providers** - Use local LLMs through Ollama or connect to OpenAI, Anthropic Claude, Google Gemini, Groq, and more. Mix and match models based on your needs.
 
-**Smart search modes** - Choose Balanced Mode for everyday searches, Fast Mode when you need quick answers, or wait for Quality Mode (coming soon) for deep research.
+**Smart search modes** - Choose Speed Mode when you need quick answers, Balanced Mode for everyday searches, or Quality Mode for deep research.
 
-🎯 **Six specialized focus modes** - Get better results with modes designed for specific tasks: Academic papers, YouTube videos, Reddit discussions, Wolfram Alpha calculations, writing assistance, or general web search.
+🧭 **Pick your sources** - Search the web, discussions, or academic papers. More sources and integrations are in progress.
+
+🧩 **Widgets** - Helpful UI cards that show up when relevant, like weather, calculations, stock prices, and other quick lookups.
 
 🔍 **Web search powered by SearxNG** - Access multiple search engines while keeping your identity private. Support for Tavily and Exa coming soon for even better results.

@@ -81,7 +83,7 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.
 Perplexica can be easily run using Docker. Simply run the following command:
 
 ```bash
-docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:latest
+docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:latest
 ```
 
 This will pull and start the Perplexica container with the bundled SearxNG search engine. Once running, open your browser and navigate to http://localhost:3000. You can then configure your settings (API keys, models, etc.) directly in the setup screen.

@@ -93,7 +95,7 @@ This will pull and start the Perplexica container with the bundled SearxNG searc
 If you already have SearxNG running, you can use the slim version of Perplexica:
 
 ```bash
-docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:slim-latest
+docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:slim-latest
 ```
 
 **Important**: Make sure your SearxNG instance has:

@@ -120,7 +122,7 @@ If you prefer to build from source or need more control:
 ```bash
 docker build -t perplexica .
-docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica perplexica
+docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica perplexica
 ```
 
 5. Access Perplexica at http://localhost:3000 and configure your settings in the setup screen.

@@ -237,13 +239,9 @@ Perplexica runs on Next.js and handles all API requests. It works right away on
 ## Upcoming Features
 
-- [x] Add settings page
-- [x] Adding support for local LLMs
-- [x] History Saving features
-- [x] Introducing various Focus Modes
-- [x] Adding API support
-- [x] Adding Discover
-- [ ] Finalizing Copilot Mode
+- [ ] Adding more widgets, integrations, search sources
+- [ ] Adding ability to create custom agents (name T.B.D.)
+- [ ] Adding authentication
 
 ## Support Us

View File

@@ -1,15 +1,14 @@
 services:
   perplexica:
     image: itzcrazykns1337/perplexica:latest
+    build:
+      context: .
     ports:
       - '3000:3000'
     volumes:
       - data:/home/perplexica/data
-      - uploads:/home/perplexica/uploads
     restart: unless-stopped
 
 volumes:
   data:
     name: 'perplexica-data'
-  uploads:
-    name: 'perplexica-uploads'

View File

@@ -57,7 +57,7 @@ Use the `id` field as the `providerId` and the `key` field from the models array
 ### Request
 
-The API accepts a JSON object in the request body, where you define the focus mode, chat models, embedding models, and your query.
+The API accepts a JSON object in the request body, where you define the enabled search `sources`, chat models, embedding models, and your query.
 
 #### Request Body Structure

@@ -72,7 +72,7 @@ The API accepts a JSON object in the request body, where you define the focus mo
     "key": "text-embedding-3-large"
   },
   "optimizationMode": "speed",
-  "focusMode": "webSearch",
+  "sources": ["web"],
   "query": "What is Perplexica",
   "history": [
     ["human", "Hi, how are you?"],

@@ -87,24 +87,25 @@ The API accepts a JSON object in the request body, where you define the focus mo
 ### Request Parameters
 
-- **`chatModel`** (object, optional): Defines the chat model to be used for the query. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
+- **`chatModel`** (object, required): Defines the chat model to be used for the query. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
   - `providerId` (string): The UUID of the provider. You can get this from the `/api/providers` endpoint response.
   - `key` (string): The model key/identifier (e.g., `gpt-4o-mini`, `llama3.1:latest`). Use the `key` value from the provider's `chatModels` array, not the display name.
-- **`embeddingModel`** (object, optional): Defines the embedding model for similarity-based searching. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
+- **`embeddingModel`** (object, required): Defines the embedding model for similarity-based searching. To get available providers and models, send a GET request to `http://localhost:3000/api/providers`.
   - `providerId` (string): The UUID of the embedding provider. You can get this from the `/api/providers` endpoint response.
   - `key` (string): The embedding model key (e.g., `text-embedding-3-large`, `nomic-embed-text`). Use the `key` value from the provider's `embeddingModels` array, not the display name.
-- **`focusMode`** (string, required): Specifies which focus mode to use. Available modes:
-  - `webSearch`, `academicSearch`, `writingAssistant`, `wolframAlphaSearch`, `youtubeSearch`, `redditSearch`.
+- **`sources`** (array, required): Which search sources to enable. Available values:
+  - `web`, `academic`, `discussions`.
 - **`optimizationMode`** (string, optional): Specifies the optimization mode to control the balance between performance and quality. Available modes:
   - `speed`: Prioritize speed and return the fastest answer.
   - `balanced`: Provide a balanced answer with good speed and reasonable quality.
+  - `quality`: Prioritize answer quality (may be slower).
 
 - **`query`** (string, required): The search query or question.

@@ -132,14 +133,14 @@ The response from the API includes both the final message and the sources used t
   "message": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online. Here are some key features and characteristics of Perplexica:\n\n- **AI-Powered Technology**: It utilizes advanced machine learning algorithms to not only retrieve information but also to understand the context and intent behind user queries, providing more relevant results [1][5].\n\n- **Open-Source**: Being open-source, Perplexica offers flexibility and transparency, allowing users to explore its functionalities without the constraints of proprietary software [3][10].",
   "sources": [
     {
-      "pageContent": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online.",
+      "content": "Perplexica is an innovative, open-source AI-powered search engine designed to enhance the way users search for information online.",
       "metadata": {
         "title": "What is Perplexica, and how does it function as an AI-powered search ...",
         "url": "https://askai.glarity.app/search/What-is-Perplexica--and-how-does-it-function-as-an-AI-powered-search-engine"
       }
     },
     {
-      "pageContent": "Perplexica is an open-source AI-powered search tool that dives deep into the internet to find precise answers.",
+      "content": "Perplexica is an open-source AI-powered search tool that dives deep into the internet to find precise answers.",
       "metadata": {
         "title": "Sahar Mor's Post",
         "url": "https://www.linkedin.com/posts/sahar-mor_a-new-open-source-project-called-perplexica-activity-7204489745668694016-ncja"

@@ -158,7 +159,7 @@ Example of streamed response objects:
 ```
 {"type":"init","data":"Stream connected"}
-{"type":"sources","data":[{"pageContent":"...","metadata":{"title":"...","url":"..."}},...]}
+{"type":"sources","data":[{"content":"...","metadata":{"title":"...","url":"..."}},...]}
 {"type":"response","data":"Perplexica is an "}
 {"type":"response","data":"innovative, open-source "}
 {"type":"response","data":"AI-powered search engine..."}

@@ -174,9 +175,9 @@ Clients should process each line as a separate JSON object. The different messag
 ### Fields in the Response
 
-- **`message`** (string): The search result, generated based on the query and focus mode.
+- **`message`** (string): The search result, generated based on the query and enabled `sources`.
 - **`sources`** (array): A list of sources that were used to generate the search result. Each source includes:
-  - `pageContent`: A snippet of the relevant content from the source.
+  - `content`: A snippet of the relevant content from the source.
   - `metadata`: Metadata about the source, including:
     - `title`: The title of the webpage.
     - `url`: The URL of the webpage.

@@ -185,5 +186,5 @@ Clients should process each line as a separate JSON object. The different messag
 If an error occurs during the search process, the API will return an appropriate error message with an HTTP status code.
 
-- **400**: If the request is malformed or missing required fields (e.g., no focus mode or query).
+- **400**: If the request is malformed or missing required fields (e.g., no `sources` or `query`).
 - **500**: If an internal server error occurs during the search.
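To make the new request shape concrete, here is a minimal sketch of a non-streaming call against a local instance; the `providerId` values are placeholders that would first be fetched from `GET /api/providers`.

```typescript
// Minimal sketch of a non-streaming request to the updated /api/search.
// Both providerId values are placeholders; list real ones via
// GET http://localhost:3000/api/providers first.
async function searchOnce(): Promise<void> {
  const res = await fetch('http://localhost:3000/api/search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      chatModel: { providerId: '<provider-uuid>', key: 'gpt-4o-mini' },
      embeddingModel: {
        providerId: '<provider-uuid>',
        key: 'text-embedding-3-large',
      },
      optimizationMode: 'speed',
      sources: ['web'],
      query: 'What is Perplexica',
      history: [],
      stream: false,
    }),
  });

  // Non-streaming responses carry the answer and its sources directly.
  const { message, sources } = await res.json();
  console.log(message, `(${sources.length} sources)`);
}
```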

View File

@@ -1,11 +1,38 @@
-# Perplexica's Architecture
+# Perplexica Architecture
 
-Perplexica's architecture consists of the following key components:
+Perplexica is a Next.js application that combines an AI chat experience with search.
 
-1. **User Interface**: A web-based interface that allows users to interact with Perplexica for searching images, videos, and much more.
-2. **Agent/Chains**: These components predict Perplexica's next actions, understand user queries, and decide whether a web search is necessary.
-3. **SearXNG**: A metadata search engine used by Perplexica to search the web for sources.
-4. **LLMs (Large Language Models)**: Utilized by agents and chains for tasks like understanding content, writing responses, and citing sources. Examples include Claude, GPTs, etc.
-5. **Embedding Models**: To improve the accuracy of search results, embedding models re-rank the results using similarity search algorithms such as cosine similarity and dot product distance.
-
-For a more detailed explanation of how these components work together, see [WORKING.md](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/WORKING.md).
+For a high level flow, see [WORKING.md](WORKING.md). For deeper implementation details, see [CONTRIBUTING.md](../../CONTRIBUTING.md).
+
+## Key components
+
+1. **User Interface**
+   - A web based UI that lets users chat, search, and view citations.
+2. **API Routes**
+   - `POST /api/chat` powers the chat UI.
+   - `POST /api/search` provides a programmatic search endpoint.
+   - `GET /api/providers` lists available providers and model keys.
+3. **Agents and Orchestration**
+   - The system classifies the question first.
+   - It can run research and widgets in parallel.
+   - It generates the final answer and includes citations.
+4. **Search Backend**
+   - A meta search backend is used to fetch relevant web results when research is enabled.
+5. **LLMs (Large Language Models)**
+   - Used for classification, writing answers, and producing citations.
+6. **Embedding Models**
+   - Used for semantic search over user uploaded files.
+7. **Storage**
+   - Chats and messages are stored so conversations can be reloaded.

View File

@@ -1,19 +1,72 @@
-# How does Perplexica work?
+# How Perplexica Works
 
-Curious about how Perplexica works? Don't worry, we'll cover it here. Before we begin, make sure you've read about the architecture of Perplexica to ensure you understand what it's made up of. Haven't read it? You can read it [here](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/README.md).
-
-We'll understand how Perplexica works by taking an example of a scenario where a user asks: "How does an A.C. work?". We'll break down the process into steps to make it easier to understand. The steps are as follows:
-
-1. The message is sent to the `/api/chat` route where it invokes the chain. The chain will depend on your focus mode. For this example, let's assume we use the "webSearch" focus mode.
-2. The chain is now invoked; first, the message is passed to another chain where it first predicts (using the chat history and the question) whether there is a need for sources and searching the web. If there is, it will generate a query (in accordance with the chat history) for searching the web that we'll take up later. If not, the chain will end there, and then the answer generator chain, also known as the response generator, will be started.
-3. The query returned by the first chain is passed to SearXNG to search the web for information.
-4. After the information is retrieved, it is based on keyword-based search. We then convert the information into embeddings and the query as well, then we perform a similarity search to find the most relevant sources to answer the query.
-5. After all this is done, the sources are passed to the response generator. This chain takes all the chat history, the query, and the sources. It generates a response that is streamed to the UI.
-
-## How are the answers cited?
-
-The LLMs are prompted to do so. We've prompted them so well that they cite the answers themselves, and using some UI magic, we display it to the user.
-
-## Image and Video Search
-
-Image and video searches are conducted in a similar manner. A query is always generated first, then we search the web for images and videos that match the query. These results are then returned to the user.
+This is a high level overview of how Perplexica answers a question.
+
+If you want a component level overview, see [README.md](README.md).
+If you want implementation details, see [CONTRIBUTING.md](../../CONTRIBUTING.md).
+
+## What happens when you ask a question
+
+When you send a message in the UI, the app calls `POST /api/chat`.
+
+At a high level, we do three things:
+
+1. Classify the question and decide what to do next.
+2. Run research and widgets in parallel.
+3. Write the final answer and include citations.
+
+## Classification
+
+Before searching or answering, we run a classification step.
+
+This step decides things like:
+
+- Whether we should do research for this question
+- Whether we should show any widgets
+- How to rewrite the question into a clearer standalone form
+
+## Widgets
+
+Widgets are small, structured helpers that can run alongside research.
+Examples include weather, stocks, and simple calculations.
+
+If a widget is relevant, we show it in the UI while the answer is still being generated.
+Widgets are helpful context for the answer, but they are not part of what the model should cite.
+
+## Research
+
+If research is needed, we gather information in the background while widgets can run.
+Depending on configuration, research may include web lookup and searching user uploaded files.
+
+## Answer generation
+
+Once we have enough context, the chat model generates the final response.
+
+You can control the tradeoff between speed and quality using `optimizationMode`:
+
+- `speed`
+- `balanced`
+- `quality`
+
+## How citations work
+
+We prompt the model to cite the references it used. The UI then renders those citations alongside the supporting links.
+
+## Search API
+
+If you are integrating Perplexica into another product, you can call `POST /api/search`.
+
+It returns:
+
+- `message`: the generated answer
+- `sources`: supporting references used for the answer
+
+You can also enable streaming by setting `stream: true`.
+
+## Image and video search
+
+Image and video search use separate endpoints (`POST /api/images` and `POST /api/videos`). We generate a focused query using the chat model, then fetch matching results from a search backend.
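Because the streamed response is newline-delimited JSON, a client can accumulate `response` chunks into the final answer. A minimal sketch, assuming a local instance and the event types documented in `docs/API/SEARCH.md`:

```typescript
// Minimal sketch: consume the newline-delimited JSON stream from
// POST /api/search with stream: true. The request body is abbreviated;
// fill it in per the Search API docs.
async function streamSearch(body: unknown): Promise<string> {
  const res = await fetch('http://localhost:3000/api/search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let answer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is one JSON event.
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';
    for (const line of lines) {
      if (!line.trim()) continue;
      const event = JSON.parse(line);
      if (event.type === 'response') answer += event.data;
      if (event.type === 'messageEnd') return answer;
    }
  }
  return answer;
}
```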

View File

@@ -10,7 +10,7 @@ Simply pull the latest image and restart your container:
 docker pull itzcrazykns1337/perplexica:latest
 docker stop perplexica
 docker rm perplexica
-docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:latest
+docker run -d -p 3000:3000 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:latest
 ```
 
 For slim version:

@@ -19,7 +19,7 @@ For slim version:
 docker pull itzcrazykns1337/perplexica:slim-latest
 docker stop perplexica
 docker rm perplexica
-docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data -v perplexica-uploads:/home/perplexica/uploads --name perplexica itzcrazykns1337/perplexica:slim-latest
+docker run -d -p 3000:3000 -e SEARXNG_API_URL=http://your-searxng-url:8080 -v perplexica-data:/home/perplexica/data --name perplexica itzcrazykns1337/perplexica:slim-latest
 ```
 
 Once updated, go to http://localhost:3000 and verify the latest changes. Your settings are preserved automatically.

View File

@@ -1,15 +1 @@
-PRAGMA foreign_keys=OFF;--> statement-breakpoint
-CREATE TABLE `__new_messages` (
-	`id` integer PRIMARY KEY NOT NULL,
-	`messageId` text NOT NULL,
-	`chatId` text NOT NULL,
-	`backendId` text NOT NULL,
-	`query` text NOT NULL,
-	`createdAt` text NOT NULL,
-	`responseBlocks` text DEFAULT '[]',
-	`status` text DEFAULT 'answering'
-);
---> statement-breakpoint
-DROP TABLE `messages`;--> statement-breakpoint
-ALTER TABLE `__new_messages` RENAME TO `messages`;--> statement-breakpoint
-PRAGMA foreign_keys=ON;
+/* do nothing */

View File

@@ -28,8 +28,8 @@
       "notNull": true,
       "autoincrement": false
     },
-    "focusMode": {
-      "name": "focusMode",
+    "sources": {
+      "name": "sources",
       "type": "text",
       "primaryKey": false,
       "notNull": true,

next-env.d.ts (vendored)
View File

@@ -1,5 +1,6 @@
 /// <reference types="next" />
 /// <reference types="next/image-types/global" />
+import './.next/dev/types/routes.d.ts';
 
 // NOTE: This file should not be edited
 // see https://nextjs.org/docs/app/api-reference/config/typescript for more information.

View File

@@ -1,3 +1,5 @@
+import pkg from './package.json' with { type: 'json' };
+
 /** @type {import('next').NextConfig} */
 const nextConfig = {
   output: 'standalone',

@@ -9,6 +11,16 @@ const nextConfig = {
     ],
   },
   serverExternalPackages: ['pdf-parse'],
+  outputFileTracingIncludes: {
+    '/api/**': [
+      './node_modules/@napi-rs/canvas/**',
+      './node_modules/@napi-rs/canvas-linux-x64-gnu/**',
+      './node_modules/@napi-rs/canvas-linux-x64-musl/**',
+    ],
+  },
+  env: {
+    NEXT_PUBLIC_VERSION: pkg.version,
+  },
 };
 
 export default nextConfig;
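With the version injected through `env`, client code can read it at build time. A minimal sketch (the function name is illustrative; the settings-page commit above presumably does something similar):

```typescript
// Next.js inlines NEXT_PUBLIC_* values at build time, so this works
// in client components as well as on the server.
export function appVersion(): string {
  return `v${process.env.NEXT_PUBLIC_VERSION}`;
}
```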

View File

@@ -1,71 +1,66 @@
 {
-  "name": "perplexica-frontend",
-  "version": "1.11.2",
+  "name": "perplexica",
+  "version": "1.12.1",
   "license": "MIT",
   "author": "ItzCrazyKns",
   "scripts": {
-    "dev": "next dev",
-    "build": "next build",
+    "dev": "next dev --webpack",
+    "build": "next build --webpack",
     "start": "next start",
     "lint": "next lint",
     "format:write": "prettier . --write"
   },
   "dependencies": {
+    "@google/genai": "^1.34.0",
     "@headlessui/react": "^2.2.0",
     "@headlessui/tailwindcss": "^0.2.2",
-    "@huggingface/transformers": "^3.7.5",
-    "@iarna/toml": "^2.2.5",
+    "@huggingface/transformers": "^3.8.1",
     "@icons-pack/react-simple-icons": "^12.3.0",
-    "@langchain/anthropic": "^1.0.1",
-    "@langchain/community": "^1.0.3",
-    "@langchain/core": "^1.0.5",
-    "@langchain/google-genai": "^1.0.1",
-    "@langchain/groq": "^1.0.1",
-    "@langchain/langgraph": "^1.0.1",
-    "@langchain/ollama": "^1.0.1",
-    "@langchain/openai": "^1.1.1",
-    "@langchain/textsplitters": "^1.0.0",
+    "@phosphor-icons/react": "^2.1.10",
+    "@radix-ui/react-tooltip": "^1.2.8",
     "@tailwindcss/typography": "^0.5.12",
+    "@toolsycc/json-repair": "^0.1.22",
     "axios": "^1.8.3",
     "better-sqlite3": "^11.9.1",
     "clsx": "^2.1.0",
-    "compute-cosine-similarity": "^1.1.0",
     "drizzle-orm": "^0.40.1",
-    "framer-motion": "^12.23.24",
-    "html-to-text": "^9.0.5",
-    "jspdf": "^3.0.1",
-    "langchain": "^1.0.4",
+    "js-tiktoken": "^1.0.21",
+    "jspdf": "^3.0.4",
     "lightweight-charts": "^5.0.9",
-    "lucide-react": "^0.363.0",
+    "lucide-react": "^0.556.0",
     "mammoth": "^1.9.1",
     "markdown-to-jsx": "^7.7.2",
     "mathjs": "^15.1.0",
-    "next": "^15.2.2",
+    "motion": "^12.23.26",
+    "next": "^16.0.7",
     "next-themes": "^0.3.0",
-    "officeparser": "^5.2.2",
     "ollama": "^0.6.3",
     "openai": "^6.9.0",
     "partial-json": "^0.1.7",
-    "pdf-parse": "^1.1.1",
+    "pdf-parse": "^2.4.5",
     "react": "^18",
     "react-dom": "^18",
+    "react-syntax-highlighter": "^16.1.0",
     "react-text-to-speech": "^0.14.5",
     "react-textarea-autosize": "^8.5.3",
     "rfc6902": "^5.1.2",
     "sonner": "^1.4.41",
     "tailwind-merge": "^2.2.2",
-    "winston": "^3.17.0",
+    "turndown": "^7.2.2",
     "yahoo-finance2": "^3.10.2",
     "yet-another-react-lightbox": "^3.17.2",
     "zod": "^4.1.12"
   },
   "devDependencies": {
     "@types/better-sqlite3": "^7.6.12",
-    "@types/html-to-text": "^9.0.4",
     "@types/jspdf": "^2.0.0",
     "@types/node": "^24.8.1",
     "@types/pdf-parse": "^1.1.4",
     "@types/react": "^18",
     "@types/react-dom": "^18",
+    "@types/react-syntax-highlighter": "^15.5.13",
+    "@types/turndown": "^5.0.6",
     "autoprefixer": "^10.0.1",
     "drizzle-kit": "^0.30.5",
     "eslint": "^8",

@@ -74,5 +69,8 @@
     "prettier": "^3.2.5",
     "tailwindcss": "^3.3.0",
     "typescript": "^5.9.3"
+  },
+  "optionalDependencies": {
+    "@napi-rs/canvas": "^0.1.87"
   }
 }

View File

@@ -1,10 +1,14 @@
-import crypto from 'crypto';
 import { z } from 'zod';
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
 import SearchAgent from '@/lib/agents/search';
 import SessionManager from '@/lib/session';
 import { ChatTurnMessage } from '@/lib/types';
+import { SearchSources } from '@/lib/agents/search/types';
+import db from '@/lib/db';
+import { eq } from 'drizzle-orm';
+import { chats } from '@/lib/db/schema';
+import UploadManager from '@/lib/uploads/manager';
 
 export const runtime = 'nodejs';
 export const dynamic = 'force-dynamic';

@@ -32,7 +36,7 @@ const bodySchema = z.object({
   optimizationMode: z.enum(['speed', 'balanced', 'quality'], {
     message: 'Optimization mode must be one of: speed, balanced, quality',
   }),
-  focusMode: z.string().min(1, 'Focus mode is required'),
+  sources: z.array(z.string()).optional().default([]),
   history: z
     .array(z.tuple([z.string(), z.string()]))
     .optional()

@@ -43,7 +47,6 @@ const bodySchema = z.object({
   systemInstructions: z.string().nullable().optional().default(''),
 });
 
-type Message = z.infer<typeof messageSchema>;
 type Body = z.infer<typeof bodySchema>;
 
 const safeValidateBody = (data: unknown) => {

@@ -65,6 +68,38 @@
   };
 };
 
+const ensureChatExists = async (input: {
+  id: string;
+  sources: SearchSources[];
+  query: string;
+  fileIds: string[];
+}) => {
+  try {
+    const exists = await db.query.chats
+      .findFirst({
+        where: eq(chats.id, input.id),
+      })
+      .execute();
+
+    if (!exists) {
+      await db.insert(chats).values({
+        id: input.id,
+        createdAt: new Date().toISOString(),
+        sources: input.sources,
+        title: input.query,
+        files: input.fileIds.map((id) => {
+          return {
+            fileId: id,
+            name: UploadManager.getFile(id)?.name || 'Uploaded File',
+          };
+        }),
+      });
+    }
+  } catch (err) {
+    console.error('Failed to check/save chat:', err);
+  }
+};
+
 export const POST = async (req: Request) => {
   try {
     const reqBody = (await req.json()) as Body;

@@ -121,29 +156,9 @@
     const writer = responseStream.writable.getWriter();
     const encoder = new TextEncoder();
 
-    let receivedMessage = '';
-
-    session.addListener('data', (data: any) => {
-      if (data.type === 'response') {
-        writer.write(
-          encoder.encode(
-            JSON.stringify({
-              type: 'message',
-              data: data.data,
-            }) + '\n',
-          ),
-        );
-        receivedMessage += data.data;
-      } else if (data.type === 'sources') {
-        writer.write(
-          encoder.encode(
-            JSON.stringify({
-              type: 'sources',
-              data: data.data,
-            }) + '\n',
-          ),
-        );
-      } else if (data.type === 'block') {
+    const disconnect = session.subscribe((event: string, data: any) => {
+      if (event === 'data') {
+        if (data.type === 'block') {
           writer.write(
             encoder.encode(
               JSON.stringify({

@@ -171,9 +186,7 @@
           ),
         );
       }
-    });
-
-    session.addListener('end', () => {
+      } else if (event === 'end') {
         writer.write(
           encoder.encode(
             JSON.stringify({

@@ -183,9 +196,7 @@
         );
         writer.close();
         session.removeAllListeners();
-    });
-
-    session.addListener('error', (data: any) => {
+      } else if (event === 'error') {
         writer.write(
           encoder.encode(
             JSON.stringify({

@@ -196,20 +207,35 @@
         );
         writer.close();
         session.removeAllListeners();
+      }
     });
 
     agent.searchAsync(session, {
       chatHistory: history,
       followUp: message.content,
+      chatId: body.message.chatId,
+      messageId: body.message.messageId,
       config: {
         llm,
         embedding: embedding,
-        sources: ['web'],
+        sources: body.sources as SearchSources[],
         mode: body.optimizationMode,
+        fileIds: body.files,
+        systemInstructions: body.systemInstructions || 'None',
       },
     });
 
-    /* handleHistorySave(message, humanMessageId, body.focusMode, body.files); */
+    ensureChatExists({
+      id: body.message.chatId,
+      sources: body.sources as SearchSources[],
+      fileIds: body.files,
+      query: body.message.content,
+    });
+
+    req.signal.addEventListener('abort', () => {
+      disconnect();
+      writer.close();
+    });
 
     return new Response(responseStream.readable, {
       headers: {
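Both routes now rely on a `subscribe` method that returns a disconnect function instead of bare `addListener` calls, which is what makes the abort cleanup above possible. SessionManager's internals are not part of this diff, so the following is only a sketch of the contract the routes depend on; the class body is hypothetical.

```typescript
import { EventEmitter } from 'events';

// Hypothetical sketch of the subscribe() contract: one callback receives
// (event, data) for every session event, and the returned closure detaches
// that callback. The real implementation in src/lib/session may differ.
type Listener = (event: string, data: any) => void;

class Session {
  private emitter = new EventEmitter();

  subscribe(listener: Listener): () => void {
    const onData = (data: any) => listener('data', data);
    const onEnd = () => listener('end', undefined);
    const onError = (data: any) => listener('error', data);

    this.emitter.on('data', onData);
    this.emitter.on('end', onEnd);
    this.emitter.on('error', onError);

    // Returning a disconnect closure lets a route detach exactly its own
    // listeners when the client aborts, without removeAllListeners().
    return () => {
      this.emitter.off('data', onData);
      this.emitter.off('end', onEnd);
      this.emitter.off('error', onError);
    };
  }
}
```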

View File

@@ -21,7 +21,10 @@ export const POST = async (req: Request) => {
     const images = await searchImages(
       {
-        chatHistory: body.chatHistory,
+        chatHistory: body.chatHistory.map(([role, content]) => ({
+          role: role === 'human' ? 'user' : 'assistant',
+          content,
+        })),
         query: body.query,
       },
       llm,

View File

@@ -0,0 +1,93 @@
import SessionManager from '@/lib/session';

export const POST = async (
  req: Request,
  { params }: { params: Promise<{ id: string }> },
) => {
  try {
    const { id } = await params;

    const session = SessionManager.getSession(id);

    if (!session) {
      return Response.json({ message: 'Session not found' }, { status: 404 });
    }

    const responseStream = new TransformStream();
    const writer = responseStream.writable.getWriter();
    const encoder = new TextEncoder();

    const disconnect = session.subscribe((event, data) => {
      if (event === 'data') {
        if (data.type === 'block') {
          writer.write(
            encoder.encode(
              JSON.stringify({
                type: 'block',
                block: data.block,
              }) + '\n',
            ),
          );
        } else if (data.type === 'updateBlock') {
          writer.write(
            encoder.encode(
              JSON.stringify({
                type: 'updateBlock',
                blockId: data.blockId,
                patch: data.patch,
              }) + '\n',
            ),
          );
        } else if (data.type === 'researchComplete') {
          writer.write(
            encoder.encode(
              JSON.stringify({
                type: 'researchComplete',
              }) + '\n',
            ),
          );
        }
      } else if (event === 'end') {
        writer.write(
          encoder.encode(
            JSON.stringify({
              type: 'messageEnd',
            }) + '\n',
          ),
        );
        writer.close();
        disconnect();
      } else if (event === 'error') {
        writer.write(
          encoder.encode(
            JSON.stringify({
              type: 'error',
              data: data.data,
            }) + '\n',
          ),
        );
        writer.close();
        disconnect();
      }
    });

    req.signal.addEventListener('abort', () => {
      disconnect();
      writer.close();
    });

    return new Response(responseStream.readable, {
      headers: {
        'Content-Type': 'text/event-stream',
        Connection: 'keep-alive',
        'Cache-Control': 'no-cache, no-transform',
      },
    });
  } catch (err) {
    console.error('Error in reconnecting to session stream: ', err);
    return Response.json(
      { message: 'An error has occurred.' },
      { status: 500 },
    );
  }
};
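A client that lost its connection can re-attach to the in-flight session and keep receiving block events. The diff does not show where this handler is mounted, so the URL below is a placeholder; the parsing mirrors the NDJSON events the handler writes (`block`, `updateBlock`, `researchComplete`, `messageEnd`, `error`).

```typescript
// Hedged sketch of a client re-attaching to a running session.
// The route path is a placeholder; substitute wherever this handler lives.
async function reconnectToSession(
  sessionId: string,
  onEvent: (event: { type: string }) => void,
): Promise<void> {
  const res = await fetch(`/api/<reconnect-route>/${sessionId}`, {
    method: 'POST',
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // One JSON event per newline-terminated line.
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';
    lines.filter(Boolean).forEach((line) => onEvent(JSON.parse(line)));
  }
}
```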

View File

@@ -1,12 +1,13 @@
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
 import SessionManager from '@/lib/session';
-import SearchAgent from '@/lib/agents/search';
 import { ChatTurnMessage } from '@/lib/types';
+import { SearchSources } from '@/lib/agents/search/types';
+import APISearchAgent from '@/lib/agents/search/api';
 
 interface ChatRequestBody {
-  optimizationMode: 'speed' | 'balanced';
-  focusMode: string;
+  optimizationMode: 'speed' | 'balanced' | 'quality';
+  sources: SearchSources[];
   chatModel: ModelWithProvider;
   embeddingModel: ModelWithProvider;
   query: string;

@@ -19,15 +20,15 @@ export const POST = async (req: Request) => {
   try {
     const body: ChatRequestBody = await req.json();
 
-    if (!body.focusMode || !body.query) {
+    if (!body.sources || !body.query) {
       return Response.json(
-        { message: 'Missing focus mode or query' },
+        { message: 'Missing sources or query' },
         { status: 400 },
       );
     }
 
     body.history = body.history || [];
-    body.optimizationMode = body.optimizationMode || 'balanced';
+    body.optimizationMode = body.optimizationMode || 'speed';
     body.stream = body.stream || false;
 
     const registry = new ModelRegistry();

@@ -48,17 +49,21 @@
     const session = SessionManager.createSession();
 
-    const agent = new SearchAgent();
+    const agent = new APISearchAgent();
 
     agent.searchAsync(session, {
       chatHistory: history,
       config: {
         embedding: embeddings,
         llm: llm,
-        sources: ['web', 'discussions', 'academic'],
-        mode: 'balanced',
+        sources: body.sources,
+        mode: body.optimizationMode,
+        fileIds: [],
+        systemInstructions: body.systemInstructions || '',
       },
       followUp: body.query,
+      chatId: crypto.randomUUID(),
+      messageId: crypto.randomUUID(),
     });
 
     if (!body.stream) {

@@ -70,13 +75,13 @@
           let message = '';
           let sources: any[] = [];
 
-          session.addListener('data', (data: string) => {
+          session.subscribe((event: string, data: Record<string, any>) => {
+            if (event === 'data') {
               try {
-                const parsedData = JSON.parse(data);
-                if (parsedData.type === 'response') {
-                  message += parsedData.data;
-                } else if (parsedData.type === 'sources') {
-                  sources = parsedData.data;
+                if (data.type === 'response') {
+                  message += data.data;
+                } else if (data.type === 'searchResults') {
+                  sources = data.data;
                 }
               } catch (error) {
                 reject(

@@ -86,19 +91,20 @@
                   ),
                 );
               }
-          });
-
-          session.addListener('end', () => {
+            }
+            if (event === 'end') {
               resolve(Response.json({ message, sources }, { status: 200 }));
-          });
-
-          session.addListener('error', (error: any) => {
+            }
+            if (event === 'error') {
               reject(
                 Response.json(
-                  { message: 'Search error', error },
+                  { message: 'Search error', error: data },
                   { status: 500 },
                 ),
               );
+            }
           });
         },
       );

@@ -130,23 +136,22 @@
             } catch (error) {}
           });
 
-          session.addListener('data', (data: string) => {
+          session.subscribe((event: string, data: Record<string, any>) => {
+            if (event === 'data') {
               if (signal.aborted) return;
 
               try {
-                const parsedData = JSON.parse(data);
-                if (parsedData.type === 'response') {
+                if (data.type === 'response') {
                   controller.enqueue(
                     encoder.encode(
                       JSON.stringify({
                         type: 'response',
-                        data: parsedData.data,
+                        data: data.data,
                       }) + '\n',
                     ),
                   );
-                } else if (parsedData.type === 'sources') {
-                  sources = parsedData.data;
+                } else if (data.type === 'searchResults') {
+                  sources = data.data;
                   controller.enqueue(
                     encoder.encode(
                       JSON.stringify({

@@ -159,9 +164,9 @@
               } catch (error) {
                 controller.error(error);
               }
-          });
-
-          session.addListener('end', () => {
+            }
+            if (event === 'end') {
               if (signal.aborted) return;
 
               controller.enqueue(

@@ -172,12 +177,13 @@
                 ),
               );
               controller.close();
-          });
-
-          session.addListener('error', (error: any) => {
+            }
+            if (event === 'error') {
               if (signal.aborted) return;
-              controller.error(error);
+              controller.error(data);
+            }
           });
         },
         cancel() {

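Both branches of the chat route now funnel every session event through a single subscribe callback with a pre-parsed payload, replacing the per-event addListener calls that passed JSON strings. The SessionManager internals are not part of this diff; a rough sketch of the contract the call sites imply might look like:

// Hypothetical sketch of the subscribe() contract implied above: one
// listener for all events, object payloads, and an unsubscribe handle
// for cleanup on abort. Not the actual SessionManager implementation.
type SessionEvent = 'data' | 'end' | 'error';
type SessionListener = (event: SessionEvent, data: Record<string, any>) => void;

class Session {
  private listeners = new Set<SessionListener>();

  subscribe(listener: SessionListener): () => void {
    this.listeners.add(listener);
    // Returning the unsubscribe function lets callers clean up on abort.
    return () => {
      this.listeners.delete(listener);
    };
  }

  emit(event: SessionEvent, data: Record<string, any> = {}): void {
    this.listeners.forEach((listener) => listener(event, data));
  }
}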

@@ -1,7 +1,6 @@
 import generateSuggestions from '@/lib/agents/suggestions';
 import ModelRegistry from '@/lib/models/registry';
 import { ModelWithProvider } from '@/lib/models/types';
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
 
 interface SuggestionsGenerationBody {
   chatHistory: any[];
@@ -21,7 +20,10 @@ export const POST = async (req: Request) => {
     const suggestions = await generateSuggestions(
       {
-        chatHistory: body.chatHistory,
+        chatHistory: body.chatHistory.map(([role, content]) => ({
+          role: role === 'human' ? 'user' : 'assistant',
+          content,
+        })),
       },
       llm,
     );

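The suggestions route (and the video search route below) now normalizes the legacy [role, content] tuple history into role/content objects before calling the agent. Concretely, the mapping shown above behaves like this:

// Legacy tuple history -> normalized chat messages.
const history: [string, string][] = [
  ['human', 'What is SearXNG?'],
  ['ai', 'SearXNG is a self-hostable metasearch engine.'],
];

const normalized = history.map(([role, content]) => ({
  role: role === 'human' ? 'user' : 'assistant',
  content,
}));
// -> [
//   { role: 'user', content: 'What is SearXNG?' },
//   { role: 'assistant', content: 'SearXNG is a self-hostable metasearch engine.' },
// ]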

@@ -1,40 +1,16 @@
 import { NextResponse } from 'next/server';
-import fs from 'fs';
-import path from 'path';
-import crypto from 'crypto';
-import { PDFLoader } from '@langchain/community/document_loaders/fs/pdf';
-import { DocxLoader } from '@langchain/community/document_loaders/fs/docx';
-import { RecursiveCharacterTextSplitter } from '@langchain/textsplitters';
-import { Document } from '@langchain/core/documents';
 import ModelRegistry from '@/lib/models/registry';
-import { Chunk } from '@/lib/types';
+import UploadManager from '@/lib/uploads/manager';
 
-interface FileRes {
-  fileName: string;
-  fileExtension: string;
-  fileId: string;
-}
-
-const uploadDir = path.join(process.cwd(), 'uploads');
-
-if (!fs.existsSync(uploadDir)) {
-  fs.mkdirSync(uploadDir, { recursive: true });
-}
-
-const splitter = new RecursiveCharacterTextSplitter({
-  chunkSize: 500,
-  chunkOverlap: 100,
-});
-
 export async function POST(req: Request) {
   try {
     const formData = await req.formData();
 
     const files = formData.getAll('files') as File[];
-    const embedding_model = formData.get('embedding_model_key') as string;
-    const embedding_model_provider = formData.get('embedding_model_provider_id') as string;
+    const embeddingModel = formData.get('embedding_model_key') as string;
+    const embeddingModelProvider = formData.get('embedding_model_provider_id') as string;
 
-    if (!embedding_model || !embedding_model_provider) {
+    if (!embeddingModel || !embeddingModelProvider) {
       return NextResponse.json(
         { message: 'Missing embedding model or provider' },
         { status: 400 },
@@ -43,81 +19,13 @@ export async function POST(req: Request) {
     const registry = new ModelRegistry();
-    const model = await registry.loadEmbeddingModel(embedding_model_provider, embedding_model);
+    const model = await registry.loadEmbeddingModel(embeddingModelProvider, embeddingModel);
 
-    const processedFiles: FileRes[] = [];
+    const uploadManager = new UploadManager({
+      embeddingModel: model,
+    });
 
-    await Promise.all(
-      files.map(async (file: any) => {
-        const fileExtension = file.name.split('.').pop();
-        if (!['pdf', 'docx', 'txt'].includes(fileExtension!)) {
-          return NextResponse.json(
-            { message: 'File type not supported' },
-            { status: 400 },
-          );
-        }
-
-        const uniqueFileName = `${crypto.randomBytes(16).toString('hex')}.${fileExtension}`;
-        const filePath = path.join(uploadDir, uniqueFileName);
-
-        const buffer = Buffer.from(await file.arrayBuffer());
-        fs.writeFileSync(filePath, new Uint8Array(buffer));
-
-        let docs: any[] = [];
-
-        if (fileExtension === 'pdf') {
-          const loader = new PDFLoader(filePath);
-          docs = await loader.load();
-        } else if (fileExtension === 'docx') {
-          const loader = new DocxLoader(filePath);
-          docs = await loader.load();
-        } else if (fileExtension === 'txt') {
-          const text = fs.readFileSync(filePath, 'utf-8');
-          docs = [
-            new Document({ pageContent: text, metadata: { title: file.name } }),
-          ];
-        }
-
-        const splitted = await splitter.splitDocuments(docs);
-
-        const extractedDataPath = filePath.replace(/\.\w+$/, '-extracted.json');
-        fs.writeFileSync(
-          extractedDataPath,
-          JSON.stringify({
-            title: file.name,
-            contents: splitted.map((doc) => doc.pageContent),
-          }),
-        );
-
-        const chunks: Chunk[] = splitted.map((doc) => {
-          return {
-            content: doc.pageContent,
-            metadata: doc.metadata,
-          };
-        });
-
-        const embeddings = await model.embedChunks(chunks);
-
-        const embeddingsDataPath = filePath.replace(
-          /\.\w+$/,
-          '-embeddings.json',
-        );
-        fs.writeFileSync(
-          embeddingsDataPath,
-          JSON.stringify({
-            title: file.name,
-            embeddings,
-          }),
-        );
-
-        processedFiles.push({
-          fileName: file.name,
-          fileExtension: fileExtension,
-          fileId: uniqueFileName.replace(/\.\w+$/, ''),
-        });
-      }),
-    );
+    const processedFiles = await uploadManager.processFiles(files);
 
     return NextResponse.json({
       files: processedFiles,

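The inline load, split, embed, and persist pipeline that used to live in this route now sits behind UploadManager. Its implementation is not included in this diff; the interface below is a sketch inferred purely from the call site, and every name beyond fileName/fileExtension/fileId (carried over from the old FileRes) is a guess:

// Sketch of the UploadManager surface implied by the route above.
// The embedChunks return type and the option names are assumptions.
interface ProcessedFile {
  fileName: string;
  fileExtension: string;
  fileId: string;
}

interface UploadManagerOptions {
  embeddingModel: {
    embedChunks(
      chunks: { content: string; metadata: Record<string, any> }[],
    ): Promise<number[][]>; // return shape assumed
  };
}

declare class UploadManager {
  constructor(options: UploadManagerOptions);
  // Accepts the raw File objects from formData and returns stored-upload
  // metadata, mirroring the old FileRes shape.
  processFiles(files: File[]): Promise<ProcessedFile[]>;
}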

@@ -21,7 +21,10 @@ export const POST = async (req: Request) => {
     const videos = await handleVideoSearch(
       {
-        chatHistory: body.chatHistory,
+        chatHistory: body.chatHistory.map(([role, content]) => ({
+          role: role === 'human' ? 'user' : 'assistant',
+          content,
+        })),
         query: body.query,
       },
       llm,


@@ -1,10 +1,5 @@
 'use client';
 
 import ChatWindow from '@/components/ChatWindow';
-import React from 'react';
 
-const Page = () => {
-  return <ChatWindow />;
-};
-
-export default Page;
+export default ChatWindow;


@@ -34,7 +34,7 @@ export default function RootLayout({
   return (
     <html className="h-full" lang="en" suppressHydrationWarning>
-      <body className={cn('h-full', montserrat.className)}>
+      <body className={cn('h-full antialiased', montserrat.className)}>
         <ThemeProvider>
           {setupComplete ? (
             <ChatProvider>


@@ -1,8 +1,8 @@
 'use client';
 
 import DeleteChat from '@/components/DeleteChat';
-import { cn, formatTimeDifference } from '@/lib/utils';
-import { BookOpenText, ClockIcon, Delete, ScanEye } from 'lucide-react';
+import { formatTimeDifference } from '@/lib/utils';
+import { BookOpenText, ClockIcon, FileText, Globe2Icon } from 'lucide-react';
 import Link from 'next/link';
 import { useEffect, useState } from 'react';
@@ -10,7 +10,8 @@ export interface Chat {
   id: string;
   title: string;
   createdAt: string;
-  focusMode: string;
+  sources: string[];
+  files: { fileId: string; name: string }[];
 }
 
 const Page = () => {
@@ -37,8 +38,38 @@ const Page = () => {
     fetchChats();
   }, []);
 
-  return loading ? (
-    <div className="flex flex-row items-center justify-center min-h-screen">
+  return (
+    <div>
+      <div className="flex flex-col pt-10 border-b border-light-200/20 dark:border-dark-200/20 pb-6 px-2">
+        <div className="flex flex-col lg:flex-row lg:items-end lg:justify-between gap-3">
+          <div className="flex items-center justify-center">
+            <BookOpenText size={45} className="mb-2.5" />
+            <div className="flex flex-col">
+              <h1
+                className="text-5xl font-normal p-2 pb-0"
+                style={{ fontFamily: 'PP Editorial, serif' }}
+              >
+                Library
+              </h1>
+              <div className="px-2 text-sm text-black/60 dark:text-white/60 text-center lg:text-left">
+                Past chats, sources, and uploads.
+              </div>
+            </div>
+          </div>
+          <div className="flex items-center justify-center lg:justify-end gap-2 text-xs text-black/60 dark:text-white/60">
+            <span className="inline-flex items-center gap-1 rounded-full border border-black/20 dark:border-white/20 px-2 py-0.5">
+              <BookOpenText size={14} />
+              {loading
+                ? 'Loading…'
+                : `${chats.length} ${chats.length === 1 ? 'chat' : 'chats'}`}
+            </span>
+          </div>
+        </div>
+      </div>
+      {loading ? (
+        <div className="flex flex-row items-center justify-center min-h-[60vh]">
       <svg
         aria-hidden="true"
         className="w-8 h-8 text-light-200 fill-light-secondary dark:text-[#202020] animate-spin dark:fill-[#ffffff3b]"
@@ -56,47 +87,56 @@ const Page = () => {
           />
         </svg>
       </div>
-  ) : (
-    <div>
-      <div className="flex flex-col pt-4">
-        <div className="flex items-center">
-          <BookOpenText />
-          <h1 className="text-3xl font-medium p-2">Library</h1>
-        </div>
-        <hr className="border-t border-[#2B2C2C] my-4 w-full" />
-      </div>
-
-      {chats.length === 0 && (
-        <div className="flex flex-row items-center justify-center min-h-screen">
-          <p className="text-black/70 dark:text-white/70 text-sm">
+      ) : chats.length === 0 ? (
+        <div className="flex flex-col items-center justify-center min-h-[70vh] px-2 text-center">
+          <div className="flex items-center justify-center w-12 h-12 rounded-2xl border border-light-200 dark:border-dark-200 bg-light-secondary dark:bg-dark-secondary">
+            <BookOpenText className="text-black/70 dark:text-white/70" />
+          </div>
+          <p className="mt-2 text-black/70 dark:text-white/70 text-sm">
             No chats found.
           </p>
+          <p className="mt-1 text-black/70 dark:text-white/70 text-sm">
+            <Link href="/" className="text-sky-400">
+              Start a new chat
+            </Link>{' '}
+            to see it listed here.
+          </p>
         </div>
-      )}
-      {chats.length > 0 && (
-        <div className="flex flex-col pb-20 lg:pb-2">
-          {chats.map((chat, i) => (
+      ) : (
+        <div className="pt-6 pb-28 px-2">
+          <div className="rounded-2xl border border-light-200 dark:border-dark-200 overflow-hidden bg-light-primary dark:bg-dark-primary">
+            {chats.map((chat, index) => {
+              const sourcesLabel =
+                chat.sources.length === 0
+                  ? null
+                  : chat.sources.length <= 2
+                    ? chat.sources
+                        .map((s) => s.charAt(0).toUpperCase() + s.slice(1))
+                        .join(', ')
+                    : `${chat.sources
+                        .slice(0, 2)
+                        .map((s) => s.charAt(0).toUpperCase() + s.slice(1))
+                        .join(', ')} + ${chat.sources.length - 2}`;
+              return (
                 <div
-              className={cn(
-                'flex flex-col space-y-4 py-6',
-                i !== chats.length - 1
-                  ? 'border-b border-white-200 dark:border-dark-200'
-                  : '',
-              )}
-              key={i}
+                  key={chat.id}
+                  className={
+                    'group flex flex-col gap-2 p-4 hover:bg-light-secondary dark:hover:bg-dark-secondary transition-colors duration-200 ' +
+                    (index !== chats.length - 1
+                      ? 'border-b border-light-200 dark:border-dark-200'
+                      : '')
+                  }
                 >
+                  <div className="flex items-start justify-between gap-3">
                     <Link
                       href={`/c/${chat.id}`}
-                className="text-black dark:text-white lg:text-xl font-medium truncate transition duration-200 hover:text-[#24A0ED] dark:hover:text-[#24A0ED] cursor-pointer"
+                      className="flex-1 text-black dark:text-white text-base lg:text-lg font-medium leading-snug line-clamp-2 group-hover:text-[#24A0ED] transition duration-200"
+                      title={chat.title}
                     >
                       {chat.title}
                     </Link>
-              <div className="flex flex-row items-center justify-between w-full">
-                <div className="flex flex-row items-center space-x-1 lg:space-x-1.5 text-black/70 dark:text-white/70">
-                  <ClockIcon size={15} />
-                  <p className="text-xs">
-                    {formatTimeDifference(new Date(), chat.createdAt)} Ago
-                  </p>
-                </div>
+                    <div className="pt-0.5 shrink-0">
                       <DeleteChat
                         chatId={chat.id}
                         chats={chats}
@@ -104,7 +144,31 @@ const Page = () => {
                       />
                     </div>
                   </div>
-          ))}
+                  <div className="flex flex-wrap items-center gap-2 text-black/70 dark:text-white/70">
+                    <span className="inline-flex items-center gap-1 text-xs">
+                      <ClockIcon size={14} />
+                      {formatTimeDifference(new Date(), chat.createdAt)} Ago
+                    </span>
+                    {sourcesLabel && (
+                      <span className="inline-flex items-center gap-1 text-xs border border-black/20 dark:border-white/20 rounded-full px-2 py-0.5">
+                        <Globe2Icon size={14} />
+                        {sourcesLabel}
+                      </span>
+                    )}
+                    {chat.files.length > 0 && (
+                      <span className="inline-flex items-center gap-1 text-xs border border-black/20 dark:border-white/20 rounded-full px-2 py-0.5">
+                        <FileText size={14} />
+                        {chat.files.length}{' '}
+                        {chat.files.length === 1 ? 'file' : 'files'}
+                      </span>
+                    )}
+                  </div>
+                </div>
+              );
+            })}
+          </div>
         </div>
       )}
     </div>

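The new sourcesLabel logic collapses a chat's source list into at most two capitalized names plus an overflow count. Pulled out as a standalone function for illustration, it behaves like this:

// Same truncation rule as the page above, extracted for clarity.
const sourcesLabel = (sources: string[]): string | null =>
  sources.length === 0
    ? null
    : sources.length <= 2
      ? sources.map((s) => s.charAt(0).toUpperCase() + s.slice(1)).join(', ')
      : `${sources
          .slice(0, 2)
          .map((s) => s.charAt(0).toUpperCase() + s.slice(1))
          .join(', ')} + ${sources.length - 2}`;

// sourcesLabel(['web'])                            -> 'Web'
// sourcesLabel(['web', 'academic'])                -> 'Web, Academic'
// sourcesLabel(['web', 'academic', 'discussions']) -> 'Web, Academic + 1'
// sourcesLabel([])                                 -> null (chip not rendered)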

@@ -1,6 +1,13 @@
 'use client';
 
-import { Brain, Search, FileText, ChevronDown, ChevronUp } from 'lucide-react';
+import {
+  Brain,
+  Search,
+  FileText,
+  ChevronDown,
+  ChevronUp,
+  BookSearch,
+} from 'lucide-react';
 import { motion, AnimatePresence } from 'framer-motion';
 import { useEffect, useState } from 'react';
 import { ResearchBlock, ResearchBlockSubStep } from '@/lib/types';
@@ -9,11 +16,17 @@ import { useChat } from '@/lib/hooks/useChat';
 const getStepIcon = (step: ResearchBlockSubStep) => {
   if (step.type === 'reasoning') {
     return <Brain className="w-4 h-4" />;
-  } else if (step.type === 'searching') {
+  } else if (step.type === 'searching' || step.type === 'upload_searching') {
     return <Search className="w-4 h-4" />;
-  } else if (step.type === 'reading') {
+  } else if (
+    step.type === 'search_results' ||
+    step.type === 'upload_search_results'
+  ) {
     return <FileText className="w-4 h-4" />;
+  } else if (step.type === 'reading') {
+    return <BookSearch className="w-4 h-4" />;
   }
   return null;
 };
@@ -25,26 +38,37 @@ const getStepTitle = (
     return isStreaming && !step.reasoning ? 'Thinking...' : 'Thinking';
   } else if (step.type === 'searching') {
     return `Searching ${step.searching.length} ${step.searching.length === 1 ? 'query' : 'queries'}`;
-  } else if (step.type === 'reading') {
+  } else if (step.type === 'search_results') {
     return `Found ${step.reading.length} ${step.reading.length === 1 ? 'result' : 'results'}`;
+  } else if (step.type === 'reading') {
+    return `Reading ${step.reading.length} ${step.reading.length === 1 ? 'source' : 'sources'}`;
+  } else if (step.type === 'upload_searching') {
+    return 'Scanning your uploaded documents';
+  } else if (step.type === 'upload_search_results') {
+    return `Reading ${step.results.length} ${step.results.length === 1 ? 'document' : 'documents'}`;
   }
   return 'Processing';
 };
 
 const AssistantSteps = ({
   block,
   status,
+  isLast,
 }: {
   block: ResearchBlock;
   status: 'answering' | 'completed' | 'error';
+  isLast: boolean;
 }) => {
-  const [isExpanded, setIsExpanded] = useState(true);
+  const [isExpanded, setIsExpanded] = useState(
+    isLast && status === 'answering' ? true : false,
+  );
   const { researchEnded, loading } = useChat();
 
   useEffect(() => {
-    if (researchEnded) {
+    if (researchEnded && isLast) {
       setIsExpanded(false);
-    } else if (status === 'answering') {
+    } else if (status === 'answering' && isLast) {
       setIsExpanded(true);
     }
   }, [researchEnded, status]);
@@ -91,10 +115,9 @@ const AssistantSteps = ({
                   initial={{ opacity: 0, x: -10 }}
                   animate={{ opacity: 1, x: 0 }}
                   transition={{ duration: 0.2, delay: 0 }}
-                  className="flex gap-3"
+                  className="flex gap-2"
                 >
-                  {/* Timeline connector */}
-                  <div className="flex flex-col items-center pt-0.5">
+                  <div className="flex flex-col items-center -mt-0.5">
                     <div
                       className={`rounded-full p-1.5 bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 ${isStreaming ? 'animate-pulse' : ''}`}
                     >
@@ -105,7 +128,6 @@ const AssistantSteps = ({
                     )}
                   </div>
 
-                  {/* Step content */}
                   <div className="flex-1 pb-1">
                     <span className="text-sm font-medium text-black dark:text-white">
                       {getStepTitle(step, isStreaming)}
@@ -151,7 +173,9 @@ const AssistantSteps = ({
                       </div>
                     )}
 
-                    {step.type === 'reading' && step.reading.length > 0 && (
+                    {(step.type === 'search_results' ||
+                      step.type === 'reading') &&
+                      step.reading.length > 0 && (
                         <div className="flex flex-wrap gap-1.5 mt-1.5">
                           {step.reading.slice(0, 4).map((result, idx) => {
                             const url = result.metadata.url || '';
@@ -162,8 +186,10 @@ const AssistantSteps = ({
                               : '';
 
                             return (
-                              <span
+                              <a
                                 key={idx}
+                                href={url}
+                                target="_blank"
                                 className="inline-flex items-center gap-1.5 px-2 py-0.5 rounded-md text-xs font-medium bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 border border-light-200 dark:border-dark-200"
                               >
                                 {faviconUrl && (
@@ -177,7 +203,50 @@ const AssistantSteps = ({
                                   />
                                 )}
                                 <span className="line-clamp-1">{title}</span>
-                              </span>
+                              </a>
                             );
                           })}
                         </div>
                       )}
+                    {step.type === 'upload_searching' &&
+                      step.queries.length > 0 && (
+                        <div className="flex flex-wrap gap-1.5 mt-1.5">
+                          {step.queries.map((query, idx) => (
+                            <span
+                              key={idx}
+                              className="inline-flex items-center px-2 py-0.5 rounded-md text-xs font-medium bg-light-100 dark:bg-dark-100 text-black/70 dark:text-white/70 border border-light-200 dark:border-dark-200"
+                            >
+                              {query}
+                            </span>
+                          ))}
+                        </div>
+                      )}
+                    {step.type === 'upload_search_results' &&
+                      step.results.length > 0 && (
+                        <div className="mt-1.5 grid gap-3 lg:grid-cols-3">
+                          {step.results.slice(0, 4).map((result, idx) => {
+                            const title =
+                              (result.metadata &&
+                                (result.metadata.title ||
+                                  result.metadata.fileName)) ||
+                              'Untitled document';
+                            return (
+                              <div
+                                key={idx}
+                                className="flex flex-row space-x-3 rounded-lg border border-light-200 dark:border-dark-200 bg-light-100 dark:bg-dark-100 p-2 cursor-pointer"
+                              >
+                                <div className="mt-0.5 h-10 w-10 rounded-md bg-cyan-100 text-cyan-800 dark:bg-sky-500 dark:text-cyan-50 flex items-center justify-center">
+                                  <FileText className="w-5 h-5" />
+                                </div>
+                                <div className="flex flex-col justify-center">
+                                  <p className="text-[13px] text-black dark:text-white line-clamp-1">
+                                    {title}
+                                  </p>
+                                </div>
+                              </div>
                             );
                           })}
                         </div>


@@ -59,7 +59,7 @@ const Chat = () => {
   }, [messages]);
 
   return (
-    <div className="flex flex-col space-y-6 pt-8 pb-44 lg:pb-32 sm:mx-4 md:mx-8">
+    <div className="flex flex-col space-y-6 pt-8 pb-44 lg:pb-28 sm:mx-4 md:mx-8">
       {sections.map((section, i) => {
         const isLast = i === sections.length - 1;
 
@@ -81,9 +81,23 @@ const Chat = () => {
       <div ref={messageEnd} className="h-0" />
       {dividerWidth > 0 && (
         <div
-          className="bottom-24 lg:bottom-10 fixed z-40"
+          className="fixed z-40 bottom-24 lg:bottom-6"
           style={{ width: dividerWidth }}
         >
+          <div
+            className="pointer-events-none absolute -bottom-6 left-0 right-0 h-[calc(100%+24px+24px)] dark:hidden"
+            style={{
+              background:
+                'linear-gradient(to top, #ffffff 0%, #ffffff 35%, rgba(255,255,255,0.95) 45%, rgba(255,255,255,0.85) 55%, rgba(255,255,255,0.7) 65%, rgba(255,255,255,0.5) 75%, rgba(255,255,255,0.3) 85%, rgba(255,255,255,0.1) 92%, transparent 100%)',
+            }}
+          />
+          <div
+            className="pointer-events-none absolute -bottom-6 left-0 right-0 h-[calc(100%+24px+24px)] hidden dark:block"
+            style={{
+              background:
+                'linear-gradient(to top, #0d1117 0%, #0d1117 35%, rgba(13,17,23,0.95) 45%, rgba(13,17,23,0.85) 55%, rgba(13,17,23,0.7) 65%, rgba(13,17,23,0.5) 75%, rgba(13,17,23,0.3) 85%, rgba(13,17,23,0.1) 92%, transparent 100%)',
+            }}
+          />
           <MessageInput />
         </div>
       )}


@@ -6,7 +6,8 @@ import EmptyChat from './EmptyChat';
 import NextError from 'next/error';
 import { useChat } from '@/lib/hooks/useChat';
 import SettingsButtonMobile from './Settings/SettingsButtonMobile';
-import { Block, Chunk } from '@/lib/types';
+import { Block } from '@/lib/types';
+import Loader from './ui/Loader';
 
 export interface BaseMessage {
   chatId: string;
@@ -21,35 +22,6 @@ export interface Message extends BaseMessage {
   status: 'answering' | 'completed' | 'error';
 }
 
-export interface UserMessage extends BaseMessage {
-  role: 'user';
-  content: string;
-}
-
-export interface AssistantMessage extends BaseMessage {
-  role: 'assistant';
-  content: string;
-  suggestions?: string[];
-}
-
-export interface SourceMessage extends BaseMessage {
-  role: 'source';
-  sources: Chunk[];
-}
-
-export interface SuggestionMessage extends BaseMessage {
-  role: 'suggestion';
-  suggestions: string[];
-}
-
-export type LegacyMessage =
-  | AssistantMessage
-  | UserMessage
-  | SourceMessage
-  | SuggestionMessage;
-
-export type ChatTurn = UserMessage | AssistantMessage;
-
 export interface File {
   fileName: string;
   fileExtension: string;
@@ -62,7 +34,8 @@ export interface Widget {
 }
 
 const ChatWindow = () => {
-  const { hasError, notFound, messages } = useChat();
+  const { hasError, notFound, messages, isReady } = useChat();
 
   if (hasError) {
     return (
       <div className="relative">
@@ -78,7 +51,8 @@ const ChatWindow = () => {
     );
   }
 
-  return notFound ? (
+  return isReady ? (
+    notFound ? (
       <NextError statusCode={404} />
     ) : (
       <div>
@@ -91,6 +65,11 @@ const ChatWindow = () => {
           <EmptyChat />
         )}
       </div>
+    )
+  ) : (
+    <div className="flex items-center justify-center min-h-screen w-full">
+      <Loader />
+    </div>
   );
 };


@@ -1,7 +1,7 @@
 import { ArrowRight } from 'lucide-react';
 import { useEffect, useRef, useState } from 'react';
 import TextareaAutosize from 'react-textarea-autosize';
-import Focus from './MessageInputActions/Focus';
+import Sources from './MessageInputActions/Sources';
 import Optimization from './MessageInputActions/Optimization';
 import Attach from './MessageInputActions/Attach';
 import { useChat } from '@/lib/hooks/useChat';
@@ -68,8 +68,8 @@ const EmptyChatMessageInput = () => {
           <Optimization />
           <div className="flex flex-row items-center space-x-2">
             <div className="flex flex-row items-center space-x-1">
+              <Sources />
               <ModelSelector />
-              <Focus />
               <Attach />
             </div>
             <button


@@ -2,6 +2,7 @@ import { Check, ClipboardList } from 'lucide-react';
 import { Message } from '../ChatWindow';
 import { useState } from 'react';
 import { Section } from '@/lib/hooks/useChat';
+import { SourceBlock } from '@/lib/types';
 
 const Copy = ({
   section,
@@ -15,15 +16,25 @@ const Copy = ({
   return (
     <button
       onClick={() => {
+        const sources = section.message.responseBlocks.filter(
+          (b) => b.type === 'source' && b.data.length > 0,
+        ) as SourceBlock[];
+
         const contentToCopy = `${initialMessage}${
-          section?.message.responseBlocks.filter((b) => b.type === 'source')
-            ?.length > 0 &&
-          `\n\nCitations:\n${section.message.responseBlocks
-            .filter((b) => b.type === 'source')
-            ?.map((source: any, i: any) => `[${i + 1}] ${source.metadata.url}`)
-            .join(`\n`)}`
+          sources.length > 0
+            ? `\n\nCitations:\n${sources
+                .map((source) => source.data)
+                .flat()
+                .map(
+                  (s, i) =>
+                    `[${i + 1}] ${s.metadata.url.startsWith('file_id://') ? s.metadata.fileName || 'Uploaded File' : s.metadata.url}`,
+                )
+                .join(`\n`)}`
+            : ''
         }`;
         navigator.clipboard.writeText(contentToCopy);
         setCopied(true);
         setTimeout(() => setCopied(false), 1000);
       }}

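With the file_id:// handling above, copied citations fall back to the uploaded file's name instead of leaking the internal URL scheme. An illustrative input/output for that mapping:

// Hypothetical flattened source list fed through the citation mapping above.
const flattened: { metadata: { url: string; fileName?: string } }[] = [
  { metadata: { url: 'https://example.com/article' } },
  { metadata: { url: 'file_id://abc123', fileName: 'report.pdf' } },
];

const citations = flattened
  .map(
    (s, i) =>
      `[${i + 1}] ${
        s.metadata.url.startsWith('file_id://')
          ? s.metadata.fileName || 'Uploaded File'
          : s.metadata.url
      }`,
  )
  .join('\n');
// citations === '[1] https://example.com/article\n[2] report.pdf'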

@@ -12,7 +12,7 @@ import {
   Plus,
   CornerDownRight,
 } from 'lucide-react';
-import Markdown, { MarkdownToJSX } from 'markdown-to-jsx';
+import Markdown, { MarkdownToJSX, RuleType } from 'markdown-to-jsx';
 import Copy from './MessageActions/Copy';
 import Rewrite from './MessageActions/Rewrite';
 import MessageSources from './MessageSources';
@@ -21,10 +21,11 @@ import SearchVideos from './SearchVideos';
 import { useSpeech } from 'react-text-to-speech';
 import ThinkBox from './ThinkBox';
 import { useChat, Section } from '@/lib/hooks/useChat';
-import Citation from './Citation';
+import Citation from './MessageRenderer/Citation';
 import AssistantSteps from './AssistantSteps';
 import { ResearchBlock } from '@/lib/types';
 import Renderer from './Widgets/Renderer';
+import CodeBlock from './MessageRenderer/CodeBlock';
 
 const ThinkTagProcessor = ({
   children,
@@ -49,7 +50,14 @@ const MessageBox = ({
   dividerRef?: MutableRefObject<HTMLDivElement | null>;
   isLast: boolean;
 }) => {
-  const { loading, sendMessage, rewrite, messages, researchEnded } = useChat();
+  const {
+    loading,
+    sendMessage,
+    rewrite,
+    messages,
+    researchEnded,
+    chatHistory,
+  } = useChat();
 
   const parsedMessage = section.parsedTextBlocks.join('\n\n');
   const speechMessage = section.speechMessage || '';
@@ -67,6 +75,21 @@ const MessageBox = ({
   const { speechStatus, start, stop } = useSpeech({ text: speechMessage });
 
   const markdownOverrides: MarkdownToJSX.Options = {
+    renderRule(next, node, renderChildren, state) {
+      if (node.type === RuleType.codeInline) {
+        return `\`${node.text}\``;
+      }
+
+      if (node.type === RuleType.codeBlock) {
+        return (
+          <CodeBlock key={state.key} language={node.lang || ''}>
+            {node.text}
+          </CodeBlock>
+        );
+      }
+
+      return next();
+    },
     overrides: {
       think: {
         component: ThinkTagProcessor,
@@ -115,12 +138,11 @@ const MessageBox = ({
             <AssistantSteps
               block={researchBlock}
               status={section.message.status}
+              isLast={isLast}
             />
           </div>
         ))}
 
-      {section.widgets.length > 0 && <Renderer widgets={section.widgets} />}
-
       {isLast &&
         loading &&
         !researchEnded &&
@@ -135,6 +157,8 @@ const MessageBox = ({
           </div>
         )}
 
+      {section.widgets.length > 0 && <Renderer widgets={section.widgets} />}
+
       <div className="flex flex-col space-y-2">
         {sources.length > 0 && (
           <div className="flex flex-row items-center space-x-2">
@@ -218,10 +242,10 @@ const MessageBox = ({
           className="group w-full py-4 text-left transition-colors duration-200"
         >
           <div className="flex items-center justify-between gap-3">
-            <div className="flex flex-row space-x-3 items-center ">
+            <div className="flex flex-row space-x-3 items-center">
               <CornerDownRight
-                size={17}
-                className="group-hover:text-sky-400 transition-colors duration-200"
+                size={15}
+                className="group-hover:text-sky-400 transition-colors duration-200 flex-shrink-0"
               />
               <p className="text-sm text-black/70 dark:text-white/70 group-hover:text-sky-400 transition-colors duration-200 leading-relaxed">
                 {suggestion}
@@ -248,11 +272,11 @@ const MessageBox = ({
       <div className="lg:sticky lg:top-20 flex flex-col items-center space-y-3 w-full lg:w-3/12 z-30 h-full pb-4">
         <SearchImages
           query={section.message.query}
-          chatHistory={messages}
+          chatHistory={chatHistory}
           messageId={section.message.messageId}
         />
         <SearchVideos
-          chatHistory={messages}
+          chatHistory={chatHistory}
           query={section.message.query}
           messageId={section.message.messageId}
         />


@@ -2,9 +2,6 @@
 import { ArrowUp } from 'lucide-react';
 import { useEffect, useRef, useState } from 'react';
 import TextareaAutosize from 'react-textarea-autosize';
-import Attach from './MessageInputActions/Attach';
-import CopilotToggle from './MessageInputActions/Copilot';
-import { File } from './ChatWindow';
 import AttachSmall from './MessageInputActions/AttachSmall';
 import { useChat } from '@/lib/hooks/useChat';
@@ -64,7 +61,7 @@ const MessageInput = () => {
         }
       }}
       className={cn(
-        'bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-hidden border border-light-200 dark:border-dark-200 shadow-sm shadow-light-200/10 dark:shadow-black/20 transition-all duration-200 focus-within:border-light-300 dark:focus-within:border-dark-300',
+        'relative bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-visible border border-light-200 dark:border-dark-200 shadow-sm shadow-light-200/10 dark:shadow-black/20 transition-all duration-200 focus-within:border-light-300 dark:focus-within:border-dark-300',
         mode === 'multi' ? 'flex-col rounded-2xl' : 'flex-row rounded-full',
       )}
     >
@@ -80,11 +77,16 @@ const MessageInput = () => {
         placeholder="Ask a follow-up"
       />
       {mode === 'single' && (
-        <div className="flex flex-row items-center space-x-4">
-          <CopilotToggle
-            copilotEnabled={copilotEnabled}
-            setCopilotEnabled={setCopilotEnabled}
-          />
+        <button
+          disabled={message.trim().length === 0 || loading}
+          className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
+        >
+          <ArrowUp className="bg-background" size={17} />
+        </button>
+      )}
+      {mode === 'multi' && (
+        <div className="flex flex-row items-center justify-between w-full pt-2">
+          <AttachSmall />
           <button
             disabled={message.trim().length === 0 || loading}
             className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
@@ -93,23 +95,6 @@ const MessageInput = () => {
           </button>
         </div>
       )}
-      {mode === 'multi' && (
-        <div className="flex flex-row items-center justify-between w-full pt-2">
-          <AttachSmall />
-          <div className="flex flex-row items-center space-x-4">
-            <CopilotToggle
-              copilotEnabled={copilotEnabled}
-              setCopilotEnabled={setCopilotEnabled}
-            />
-            <button
-              disabled={message.trim().length === 0 || loading}
-              className="bg-[#24A0ED] text-white text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
-            >
-              <ArrowUp className="bg-background" size={17} />
-            </button>
-          </div>
-        </div>
-      )}
     </form>
   );
 };


@@ -16,6 +16,8 @@ import {
 } from 'lucide-react';
 import { Fragment, useRef, useState } from 'react';
 import { useChat } from '@/lib/hooks/useChat';
+import { AnimatePresence } from 'motion/react';
+import { motion } from 'framer-motion';
 
 const Attach = () => {
   const { files, setFiles, setFileIds, fileIds } = useChat();
@@ -53,29 +55,33 @@ const Attach = () => {
   return loading ? (
     <div className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none text-black/50 dark:text-white/50 transition duration-200">
-      <LoaderCircle size={16} className="text-sky-400 animate-spin" />
+      <LoaderCircle size={16} className="text-sky-500 animate-spin" />
     </div>
   ) : files.length > 0 ? (
     <Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
+      {({ open }) => (
+        <>
           <PopoverButton
             type="button"
             className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none headless-open:text-black dark:headless-open:text-white text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
           >
-        <File size={16} className="text-sky-400" />
+            <File size={16} className="text-sky-500" />
           </PopoverButton>
-      <Transition
-        as={Fragment}
-        enter="transition ease-out duration-150"
-        enterFrom="opacity-0 translate-y-1"
-        enterTo="opacity-100 translate-y-0"
-        leave="transition ease-in duration-150"
-        leaveFrom="opacity-100 translate-y-0"
-        leaveTo="opacity-0 translate-y-1"
-      >
-        <PopoverPanel className="absolute z-10 w-64 md:w-[350px] right-0">
-          <div className="bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col">
+          <AnimatePresence>
+            {open && (
+              <PopoverPanel
+                className="absolute z-10 w-64 md:w-[350px] right-0"
+                static
+              >
+                <motion.div
+                  initial={{ opacity: 0, scale: 0.9 }}
+                  animate={{ opacity: 1, scale: 1 }}
+                  exit={{ opacity: 0, scale: 0.9 }}
+                  transition={{ duration: 0.1, ease: 'easeOut' }}
+                  className="origin-top-right bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col"
+                >
                   <div className="flex flex-row items-center justify-between px-3 py-2">
-              <h4 className="text-black dark:text-white font-medium text-sm">
+                    <h4 className="text-black/70 dark:text-white/70 text-sm">
                       Attached files
                     </h4>
                     <div className="flex flex-row items-center space-x-4">
@@ -102,7 +108,7 @@ const Attach = () => {
                       }}
                       className="flex flex-row items-center space-x-1 text-black/70 dark:text-white/70 hover:text-black hover:dark:text-white transition duration-200 focus:outline-none"
                     >
-                <Trash size={14} />
+                      <Trash size={13} />
                       <p className="text-xs">Clear</p>
                     </button>
                   </div>
@@ -114,15 +120,17 @@ const Attach = () => {
                       key={i}
                       className="flex flex-row items-center justify-start w-full space-x-3 p-3"
                     >
-                <div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-10 h-10 rounded-md">
+                      <div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-9 h-9 rounded-md">
                         <File
                           size={16}
                           className="text-black/70 dark:text-white/70"
                         />
                       </div>
-                <p className="text-black/70 dark:text-white/70 text-sm">
+                      <p className="text-black/70 dark:text-white/70 text-xs">
                         {file.fileName.length > 25
-                  ? file.fileName.replace(/\.\w+$/, '').substring(0, 25) +
+                          ? file.fileName
+                              .replace(/\.\w+$/, '')
+                              .substring(0, 25) +
                             '...' +
                             file.fileExtension
                           : file.fileName}
@@ -130,9 +138,12 @@ const Attach = () => {
                     </div>
                   ))}
                 </div>
-          </div>
+                </motion.div>
               </PopoverPanel>
-      </Transition>
+            )}
+          </AnimatePresence>
+        </>
+      )}
     </Popover>
   ) : (
     <button


@@ -1,21 +1,14 @@
-import { cn } from '@/lib/utils';
 import {
   Popover,
   PopoverButton,
   PopoverPanel,
   Transition,
 } from '@headlessui/react';
-import {
-  CopyPlus,
-  File,
-  LoaderCircle,
-  Paperclip,
-  Plus,
-  Trash,
-} from 'lucide-react';
+import { File, LoaderCircle, Paperclip, Plus, Trash } from 'lucide-react';
 import { Fragment, useRef, useState } from 'react';
-import { File as FileType } from '../ChatWindow';
 import { useChat } from '@/lib/hooks/useChat';
+import { AnimatePresence } from 'motion/react';
+import { motion } from 'framer-motion';
 
 const AttachSmall = () => {
   const { files, setFiles, setFileIds, fileIds } = useChat();
@@ -53,29 +46,33 @@ const AttachSmall = () => {
   return loading ? (
     <div className="flex flex-row items-center justify-between space-x-1 p-1 ">
-      <LoaderCircle size={20} className="text-sky-400 animate-spin" />
+      <LoaderCircle size={20} className="text-sky-500 animate-spin" />
     </div>
   ) : files.length > 0 ? (
     <Popover className="max-w-[15rem] md:max-w-md lg:max-w-lg">
+      {({ open }) => (
+        <>
           <PopoverButton
             type="button"
             className="flex flex-row items-center justify-between space-x-1 p-1 text-black/50 dark:text-white/50 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
           >
-        <File size={20} className="text-sky-400" />
+            <File size={20} className="text-sky-500" />
           </PopoverButton>
-      <Transition
-        as={Fragment}
-        enter="transition ease-out duration-150"
-        enterFrom="opacity-0 translate-y-1"
-        enterTo="opacity-100 translate-y-0"
-        leave="transition ease-in duration-150"
-        leaveFrom="opacity-100 translate-y-0"
-        leaveTo="opacity-0 translate-y-1"
-      >
-        <PopoverPanel className="absolute z-10 w-64 md:w-[350px] bottom-14 -ml-3">
-          <div className="bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col">
+          <AnimatePresence>
+            {open && (
+              <PopoverPanel
+                className="absolute z-10 w-64 md:w-[350px] bottom-14"
+                static
+              >
+                <motion.div
+                  initial={{ opacity: 0, scale: 0.9 }}
+                  animate={{ opacity: 1, scale: 1 }}
+                  exit={{ opacity: 0, scale: 0.9 }}
+                  transition={{ duration: 0.1, ease: 'easeOut' }}
+                  className="origin-bottom-left bg-light-primary dark:bg-dark-primary border rounded-md border-light-200 dark:border-dark-200 w-full max-h-[200px] md:max-h-none overflow-y-auto flex flex-col"
+                >
                   <div className="flex flex-row items-center justify-between px-3 py-2">
-              <h4 className="text-black dark:text-white font-medium text-sm">
+                    <h4 className="text-black/70 dark:text-white/70 font-medium text-sm">
                       Attached files
                     </h4>
                     <div className="flex flex-row items-center space-x-4">
@@ -92,7 +89,7 @@ const AttachSmall = () => {
                         multiple
                         hidden
                       />
-                <Plus size={18} />
+                      <Plus size={16} />
                       <p className="text-xs">Add</p>
                     </button>
                     <button
@@ -102,7 +99,7 @@ const AttachSmall = () => {
                       }}
                       className="flex flex-row items-center space-x-1 text-black/70 dark:text-white/70 hover:text-black hover:dark:text-white transition duration-200"
                     >
-                <Trash size={14} />
+                      <Trash size={13} />
                       <p className="text-xs">Clear</p>
                     </button>
                   </div>
@@ -114,15 +111,17 @@ const AttachSmall = () => {
                       key={i}
                       className="flex flex-row items-center justify-start w-full space-x-3 p-3"
                     >
-                <div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-10 h-10 rounded-md">
+                      <div className="bg-light-100 dark:bg-dark-100 flex items-center justify-center w-9 h-9 rounded-md">
                         <File
                           size={16}
                           className="text-black/70 dark:text-white/70"
                         />
                       </div>
-                <p className="text-black/70 dark:text-white/70 text-sm">
+                      <p className="text-black/70 dark:text-white/70 text-xs">
                         {file.fileName.length > 25
-                  ? file.fileName.replace(/\.\w+$/, '').substring(0, 25) +
+                          ? file.fileName
+                              .replace(/\.\w+$/, '')
+                              .substring(0, 25) +
                             '...' +
                             file.fileExtension
                           : file.fileName}
@@ -130,9 +129,12 @@ const AttachSmall = () => {
                     </div>
                   ))}
                 </div>
-          </div>
+                </motion.div>
               </PopoverPanel>
-      </Transition>
+            )}
+          </AnimatePresence>
+        </>
+      )}
     </Popover>
   ) : (
     <button


@@ -2,15 +2,11 @@
 import { Cpu, Loader2, Search } from 'lucide-react';
 import { cn } from '@/lib/utils';
-import {
-  Popover,
-  PopoverButton,
-  PopoverPanel,
-  Transition,
-} from '@headlessui/react';
-import { Fragment, useEffect, useMemo, useState } from 'react';
+import { Popover, PopoverButton, PopoverPanel } from '@headlessui/react';
+import { useEffect, useMemo, useState } from 'react';
 import { MinimalProvider } from '@/lib/models/types';
 import { useChat } from '@/lib/hooks/useChat';
+import { AnimatePresence, motion } from 'motion/react';
 
 const ModelSelector = () => {
   const [providers, setProviders] = useState<MinimalProvider[]>([]);
@@ -79,24 +75,28 @@ const ModelSelector = () => {
   return (
     <Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
+      {({ open }) => (
+        <>
           <PopoverButton
             type="button"
             className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none headless-open:text-black dark:headless-open:text-white text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
           >
             <Cpu size={16} className="text-sky-500" />
           </PopoverButton>
-      <Transition
-        as={Fragment}
-        enter="transition ease-out duration-100"
-        enterFrom="opacity-0 translate-y-1"
-        enterTo="opacity-100 translate-y-0"
-        leave="transition ease-in duration-100"
-        leaveFrom="opacity-100 translate-y-0"
-        leaveTo="opacity-0 translate-y-1"
-      >
-        <PopoverPanel className="absolute z-10 w-[230px] sm:w-[270px] md:w-[300px] -right-4">
-          <div className="bg-light-primary dark:bg-dark-primary max-h-[300px] sm:max-w-none border rounded-lg border-light-200 dark:border-dark-200 w-full flex flex-col shadow-lg overflow-hidden">
-            <div className="p-4 border-b border-light-200 dark:border-dark-200">
+          <AnimatePresence>
+            {open && (
+              <PopoverPanel
+                className="absolute z-10 w-[230px] sm:w-[270px] md:w-[300px] right-0"
+                static
+              >
+                <motion.div
+                  initial={{ opacity: 0, scale: 0.9 }}
+                  animate={{ opacity: 1, scale: 1 }}
+                  exit={{ opacity: 0, scale: 0.9 }}
+                  transition={{ duration: 0.1, ease: 'easeOut' }}
+                  className="origin-top-right bg-light-primary dark:bg-dark-primary max-h-[300px] sm:max-w-none border rounded-lg border-light-200 dark:border-dark-200 w-full flex flex-col shadow-lg overflow-hidden"
+                >
+                  <div className="p-2 border-b border-light-200 dark:border-dark-200">
                     <div className="relative">
                       <Search
                         size={16}
@@ -107,7 +107,7 @@ const ModelSelector = () => {
                         placeholder="Search models..."
                         value={searchQuery}
                         onChange={(e) => setSearchQuery(e.target.value)}
-                className="w-full pl-9 pr-3 py-2 bg-light-secondary dark:bg-dark-secondary rounded-lg placeholder:text-sm text-sm text-black dark:text-white placeholder:text-black/40 dark:placeholder:text-white/40 focus:outline-none focus:ring-2 focus:ring-sky-500/20 border border-transparent focus:border-sky-500/30 transition duration-200"
+                        className="w-full pl-8 pr-3 py-2 bg-light-secondary dark:bg-dark-secondary rounded-lg placeholder:text-xs placeholder:-translate-y-[1.5px] text-xs text-black dark:text-white placeholder:text-black/40 dark:placeholder:text-white/40 focus:outline-none border border-transparent transition duration-200"
                       />
                     </div>
                   </div>
@@ -146,7 +146,8 @@ const ModelSelector = () => {
                             type="button"
                             className={cn(
                               'px-3 py-2 flex items-center justify-between text-start duration-200 cursor-pointer transition rounded-lg group',
-                      chatModelProvider?.providerId === provider.id &&
+                              chatModelProvider?.providerId ===
+                                provider.id &&
                                 chatModelProvider?.key === model.key
                                 ? 'bg-light-secondary dark:bg-dark-secondary'
                                 : 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
@@ -166,7 +167,7 @@ const ModelSelector = () => {
                             />
                             <p
                               className={cn(
-                        'text-sm truncate',
+                                'text-xs truncate',
                                 chatModelProvider?.providerId ===
                                   provider.id &&
                                   chatModelProvider?.key === model.key
@@ -189,9 +190,12 @@ const ModelSelector = () => {
                     </div>
                   )}
                 </div>
-          </div>
+                </motion.div>
               </PopoverPanel>
-      </Transition>
+            )}
+          </AnimatePresence>
+        </>
+      )}
     </Popover>
   );
 };


@@ -1,43 +0,0 @@
import { cn } from '@/lib/utils';
import { Switch } from '@headlessui/react';
const CopilotToggle = ({
copilotEnabled,
setCopilotEnabled,
}: {
copilotEnabled: boolean;
setCopilotEnabled: (enabled: boolean) => void;
}) => {
return (
<div className="group flex flex-row items-center space-x-1 active:scale-95 duration-200 transition cursor-pointer">
<Switch
checked={copilotEnabled}
onChange={setCopilotEnabled}
className="bg-light-secondary dark:bg-dark-secondary border border-light-200/70 dark:border-dark-200 relative inline-flex h-5 w-10 sm:h-6 sm:w-11 items-center rounded-full"
>
<span className="sr-only">Copilot</span>
<span
className={cn(
copilotEnabled
? 'translate-x-6 bg-[#24A0ED]'
: 'translate-x-1 bg-black/50 dark:bg-white/50',
'inline-block h-3 w-3 sm:h-4 sm:w-4 transform rounded-full transition-all duration-200',
)}
/>
</Switch>
<p
onClick={() => setCopilotEnabled(!copilotEnabled)}
className={cn(
'text-xs font-medium transition-colors duration-150 ease-in-out',
copilotEnabled
? 'text-[#24A0ED]'
: 'text-black/50 dark:text-white/50 group-hover:text-black dark:group-hover:text-white',
)}
>
Copilot
</p>
</div>
);
};
export default CopilotToggle;


@@ -1,123 +0,0 @@
import {
BadgePercent,
ChevronDown,
Globe,
Pencil,
ScanEye,
SwatchBook,
} from 'lucide-react';
import { cn } from '@/lib/utils';
import {
Popover,
PopoverButton,
PopoverPanel,
Transition,
} from '@headlessui/react';
import { SiReddit, SiYoutube } from '@icons-pack/react-simple-icons';
import { Fragment } from 'react';
import { useChat } from '@/lib/hooks/useChat';
const focusModes = [
{
key: 'webSearch',
title: 'All',
description: 'Searches across all of the internet',
icon: <Globe size={16} />,
},
{
key: 'academicSearch',
title: 'Academic',
description: 'Search in published academic papers',
icon: <SwatchBook size={16} />,
},
{
key: 'writingAssistant',
title: 'Writing',
description: 'Chat without searching the web',
icon: <Pencil size={16} />,
},
{
key: 'wolframAlphaSearch',
title: 'Wolfram Alpha',
description: 'Computational knowledge engine',
icon: <BadgePercent size={16} />,
},
{
key: 'youtubeSearch',
title: 'Youtube',
description: 'Search and watch videos',
icon: <SiYoutube className="h-[16px] w-auto mr-0.5" />,
},
{
key: 'redditSearch',
title: 'Reddit',
description: 'Search for discussions and opinions',
icon: <SiReddit className="h-[16px] w-auto mr-0.5" />,
},
];
const Focus = () => {
const { focusMode, setFocusMode } = useChat();
return (
<Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
<PopoverButton
type="button"
className="active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none headless-open:text-black dark:headless-open:text-white text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
>
{focusMode !== 'webSearch' ? (
<div className="flex flex-row items-center space-x-1">
{focusModes.find((mode) => mode.key === focusMode)?.icon}
</div>
) : (
<div className="flex flex-row items-center space-x-1">
<Globe size={16} />
</div>
)}
</PopoverButton>
<Transition
as={Fragment}
enter="transition ease-out duration-150"
enterFrom="opacity-0 translate-y-1"
enterTo="opacity-100 translate-y-0"
leave="transition ease-in duration-150"
leaveFrom="opacity-100 translate-y-0"
leaveTo="opacity-0 translate-y-1"
>
<PopoverPanel className="absolute z-10 w-64 md:w-[500px] -right-4">
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-4 max-h-[200px] md:max-h-none overflow-y-auto">
{focusModes.map((mode, i) => (
<PopoverButton
onClick={() => setFocusMode(mode.key)}
key={i}
className={cn(
'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-2 duration-200 cursor-pointer transition focus:outline-none',
focusMode === mode.key
? 'bg-light-secondary dark:bg-dark-secondary'
: 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
)}
>
<div
className={cn(
'flex flex-row items-center space-x-1',
focusMode === mode.key
? 'text-[#24A0ED]'
: 'text-black dark:text-white',
)}
>
{mode.icon}
<p className="text-sm font-medium">{mode.title}</p>
</div>
<p className="text-black/70 dark:text-white/70 text-xs">
{mode.description}
</p>
</PopoverButton>
))}
</div>
</PopoverPanel>
</Transition>
</Popover>
);
};
export default Focus;


@@ -8,6 +8,7 @@ import {
} from '@headlessui/react';
import { Fragment } from 'react';
import { useChat } from '@/lib/hooks/useChat';
+import { AnimatePresence, motion } from 'motion/react';
const OptimizationModes = [
{
@@ -60,17 +61,19 @@ const Optimization = () => {
/>
</div>
</PopoverButton>
-<Transition
-as={Fragment}
-enter="transition ease-out duration-150"
-enterFrom="opacity-0 translate-y-1"
-enterTo="opacity-100 translate-y-0"
-leave="transition ease-in duration-150"
-leaveFrom="opacity-100 translate-y-0"
-leaveTo="opacity-0 translate-y-1"
+<AnimatePresence>
+{open && (
+<PopoverPanel
+className="absolute z-10 w-64 md:w-[250px] left-0"
+static
+>
+<motion.div
+initial={{ opacity: 0, scale: 0.9 }}
+animate={{ opacity: 1, scale: 1 }}
+exit={{ opacity: 0, scale: 0.9 }}
+transition={{ duration: 0.1, ease: 'easeOut' }}
+className="origin-top-left flex flex-col space-y-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-2 max-h-[200px] md:max-h-none overflow-y-auto"
>
-<PopoverPanel className="absolute z-10 w-64 md:w-[250px] left-0">
-<div className="flex flex-col gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-4 max-h-[200px] md:max-h-none overflow-y-auto">
{OptimizationModes.map((mode, i) => (
<PopoverButton
onClick={() => setOptimizationMode(mode.key)}
@@ -82,18 +85,26 @@ const Optimization = () => {
: 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
)}
>
-<div className="flex flex-row items-center space-x-1 text-black dark:text-white">
+<div className="flex flex-row justify-between w-full text-black dark:text-white">
+<div className="flex flex-row space-x-1">
{mode.icon}
-<p className="text-sm font-medium">{mode.title}</p>
+<p className="text-xs font-medium">{mode.title}</p>
+</div>
+{mode.key === 'quality' && (
+<span className="bg-sky-500/70 dark:bg-sky-500/40 border border-sky-600 px-1 rounded-full text-[10px] text-white">
+Beta
+</span>
+)}
</div>
<p className="text-black/70 dark:text-white/70 text-xs">
{mode.description}
</p>
</PopoverButton>
))}
-</div>
+</motion.div>
</PopoverPanel>
-</Transition>
+)}
+</AnimatePresence>
</>
)}
</Popover>


@@ -0,0 +1,93 @@
import { useChat } from '@/lib/hooks/useChat';
import {
Popover,
PopoverButton,
PopoverPanel,
Switch,
} from '@headlessui/react';
import {
GlobeIcon,
GraduationCapIcon,
NetworkIcon,
} from '@phosphor-icons/react';
import { AnimatePresence, motion } from 'motion/react';
const sourcesList = [
{
name: 'Web',
key: 'web',
icon: <GlobeIcon className="h-[16px] w-auto" />,
},
{
name: 'Academic',
key: 'academic',
icon: <GraduationCapIcon className="h-[16px] w-auto" />,
},
{
name: 'Social',
key: 'discussions',
icon: <NetworkIcon className="h-[16px] w-auto" />,
},
];
const Sources = () => {
const { sources, setSources } = useChat();
return (
<Popover className="relative">
{({ open }) => (
<>
<PopoverButton className="flex items-center justify-center active:border-none hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg focus:outline-none text-black/50 dark:text-white/50 active:scale-95 transition duration-200 hover:text-black dark:hover:text-white">
<GlobeIcon className="h-[18px] w-auto" />
</PopoverButton>
<AnimatePresence>
{open && (
<PopoverPanel
static
className="absolute z-10 w-64 md:w-[225px] right-0"
>
<motion.div
initial={{ opacity: 0, scale: 0.9 }}
animate={{ opacity: 1, scale: 1 }}
exit={{ opacity: 0, scale: 0.9 }}
transition={{ duration: 0.1, ease: 'easeOut' }}
className="origin-top-right flex flex-col bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-1 max-h-[200px] md:max-h-none overflow-y-auto shadow-lg"
>
{sourcesList.map((source, i) => (
<div
key={i}
className="flex flex-row justify-between hover:bg-light-100 hover:dark:bg-dark-100 rounded-md py-3 px-2 cursor-pointer"
onClick={() => {
if (!sources.includes(source.key)) {
setSources([...sources, source.key]);
} else {
setSources(sources.filter((s) => s !== source.key));
}
}}
>
<div className="flex flex-row space-x-1.5 text-black/80 dark:text-white/80">
{source.icon}
<p className="text-xs">{source.name}</p>
</div>
<Switch
checked={sources.includes(source.key)}
className="group relative flex h-4 w-7 shrink-0 cursor-pointer rounded-full bg-light-200 dark:bg-white/10 p-0.5 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500 dark:data-[checked]:bg-sky-500"
>
<span
aria-hidden="true"
className="pointer-events-none inline-block size-3 translate-x-[1px] group-data-[checked]:translate-x-3 rounded-full bg-white shadow-lg ring-0 transition duration-200 ease-in-out"
/>
</Switch>
</div>
))}
</motion.div>
</PopoverPanel>
)}
</AnimatePresence>
</>
)}
</Popover>
);
};
export default Sources;


@@ -0,0 +1,102 @@
import type { CSSProperties } from 'react';
const darkTheme = {
'hljs-comment': {
color: '#8b949e',
},
'hljs-quote': {
color: '#8b949e',
},
'hljs-variable': {
color: '#ff7b72',
},
'hljs-template-variable': {
color: '#ff7b72',
},
'hljs-tag': {
color: '#ff7b72',
},
'hljs-name': {
color: '#ff7b72',
},
'hljs-selector-id': {
color: '#ff7b72',
},
'hljs-selector-class': {
color: '#ff7b72',
},
'hljs-regexp': {
color: '#ff7b72',
},
'hljs-deletion': {
color: '#ff7b72',
},
'hljs-number': {
color: '#f2cc60',
},
'hljs-built_in': {
color: '#f2cc60',
},
'hljs-builtin-name': {
color: '#f2cc60',
},
'hljs-literal': {
color: '#f2cc60',
},
'hljs-type': {
color: '#f2cc60',
},
'hljs-params': {
color: '#f2cc60',
},
'hljs-meta': {
color: '#f2cc60',
},
'hljs-link': {
color: '#f2cc60',
},
'hljs-attribute': {
color: '#58a6ff',
},
'hljs-string': {
color: '#7ee787',
},
'hljs-symbol': {
color: '#7ee787',
},
'hljs-bullet': {
color: '#7ee787',
},
'hljs-addition': {
color: '#7ee787',
},
'hljs-title': {
color: '#79c0ff',
},
'hljs-section': {
color: '#79c0ff',
},
'hljs-keyword': {
color: '#c297ff',
},
'hljs-selector-tag': {
color: '#c297ff',
},
hljs: {
display: 'block',
overflowX: 'auto',
background: '#0d1117',
color: '#c9d1d9',
padding: '0.75em',
border: '1px solid #21262d',
borderRadius: '10px',
},
'hljs-emphasis': {
fontStyle: 'italic',
},
'hljs-strong': {
fontWeight: 'bold',
},
} satisfies Record<string, CSSProperties>;
export default darkTheme;


@@ -0,0 +1,102 @@
import type { CSSProperties } from 'react';
const lightTheme = {
'hljs-comment': {
color: '#6e7781',
},
'hljs-quote': {
color: '#6e7781',
},
'hljs-variable': {
color: '#d73a49',
},
'hljs-template-variable': {
color: '#d73a49',
},
'hljs-tag': {
color: '#d73a49',
},
'hljs-name': {
color: '#d73a49',
},
'hljs-selector-id': {
color: '#d73a49',
},
'hljs-selector-class': {
color: '#d73a49',
},
'hljs-regexp': {
color: '#d73a49',
},
'hljs-deletion': {
color: '#d73a49',
},
'hljs-number': {
color: '#b08800',
},
'hljs-built_in': {
color: '#b08800',
},
'hljs-builtin-name': {
color: '#b08800',
},
'hljs-literal': {
color: '#b08800',
},
'hljs-type': {
color: '#b08800',
},
'hljs-params': {
color: '#b08800',
},
'hljs-meta': {
color: '#b08800',
},
'hljs-link': {
color: '#b08800',
},
'hljs-attribute': {
color: '#0a64ae',
},
'hljs-string': {
color: '#22863a',
},
'hljs-symbol': {
color: '#22863a',
},
'hljs-bullet': {
color: '#22863a',
},
'hljs-addition': {
color: '#22863a',
},
'hljs-title': {
color: '#005cc5',
},
'hljs-section': {
color: '#005cc5',
},
'hljs-keyword': {
color: '#6f42c1',
},
'hljs-selector-tag': {
color: '#6f42c1',
},
hljs: {
display: 'block',
overflowX: 'auto',
background: '#ffffff',
color: '#24292f',
padding: '0.75em',
border: '1px solid #e8edf1',
borderRadius: '10px',
},
'hljs-emphasis': {
fontStyle: 'italic',
},
'hljs-strong': {
fontWeight: 'bold',
},
} satisfies Record<string, CSSProperties>;
export default lightTheme;


@@ -0,0 +1,64 @@
'use client';
import { CheckIcon, CopyIcon } from '@phosphor-icons/react';
import React, { useEffect, useMemo, useState } from 'react';
import { useTheme } from 'next-themes';
import SyntaxHighlighter from 'react-syntax-highlighter';
import darkTheme from './CodeBlockDarkTheme';
import lightTheme from './CodeBlockLightTheme';
const CodeBlock = ({
language,
children,
}: {
language: string;
children: React.ReactNode;
}) => {
const { resolvedTheme } = useTheme();
const [mounted, setMounted] = useState(false);
const [copied, setCopied] = useState(false);
useEffect(() => {
setMounted(true);
}, []);
const syntaxTheme = useMemo(() => {
if (!mounted) return lightTheme;
return resolvedTheme === 'dark' ? darkTheme : lightTheme;
}, [mounted, resolvedTheme]);
return (
<div className="relative">
<button
className="absolute top-2 right-2 p-1"
onClick={() => {
navigator.clipboard.writeText(children as string);
setCopied(true);
setTimeout(() => setCopied(false), 2000);
}}
>
{copied ? (
<CheckIcon
size={16}
className="absolute top-2 right-2 text-black/70 dark:text-white/70"
/>
) : (
<CopyIcon
size={16}
className="absolute top-2 right-2 transition duration-200 text-black/70 dark:text-white/70 hover:text-gray-800/70 hover:dark:text-gray-300/70"
/>
)}
</button>
<SyntaxHighlighter
language={language}
style={syntaxTheme}
showInlineLineNumbers
>
{children as string}
</SyntaxHighlighter>
</div>
);
};
export default CodeBlock;

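The `mounted` guard above avoids a hydration mismatch with next-themes: the server render and the first client render both fall back to the light theme, and the dark theme is applied only once `resolvedTheme` is known on the client. A minimal usage sketch (hypothetical call site; in the app a markdown renderer would supply the language and source text):

import CodeBlock from './CodeBlock'; // path assumed; the component is defined above

// Hypothetical call site: pass the fence's language and the raw source
// string as children, matching the props the component expects.
const Example = () => (
  <CodeBlock language="typescript">{`const answer = 42;`}</CodeBlock>
);

export default Example;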

@@ -37,7 +37,7 @@ const MessageSources = ({ sources }: { sources: Chunk[] }) => {
</p>
<div className="flex flex-row items-center justify-between">
<div className="flex flex-row items-center space-x-1">
-{source.metadata.url === 'File' ? (
+{source.metadata.url.includes('file_id://') ? (
<div className="bg-dark-200 hover:bg-dark-100 transition duration-200 flex items-center justify-center w-6 h-6 rounded-full">
<File size={12} className="text-white/70" />
</div>
@@ -51,7 +51,9 @@ const MessageSources = ({ sources }: { sources: Chunk[] }) => {
/>
)}
<p className="text-xs text-black/50 dark:text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
-{source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')}
+{source.metadata.url.includes('file_id://')
+? 'Uploaded File'
+: source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')}
</p>
</div>
<div className="flex flex-row items-center space-x-1 text-black/50 dark:text-white/50 text-xs">


@@ -205,8 +205,9 @@ const Navbar = () => {
useEffect(() => {
if (sections.length > 0 && sections[0].message) {
const newTitle =
-sections[0].message.query.substring(0, 30) + '...' ||
-'New Conversation';
+sections[0].message.query.length > 30
+? `${sections[0].message.query.substring(0, 30).trim()}...`
+: sections[0].message.query || 'New Conversation';
setTitle(newTitle);
const newTimeAgo = formatTimeDifference(


@@ -17,7 +17,7 @@ const SearchImages = ({
messageId,
}: {
query: string;
-chatHistory: Message[];
+chatHistory: [string, string][];
messageId: string;
}) => {
const [images, setImages] = useState<Image[] | null>(null);


@@ -30,7 +30,7 @@ const Searchvideos = ({
messageId,
}: {
query: string;
-chatHistory: Message[];
+chatHistory: [string, string][];
messageId: string;
}) => {
const [videos, setVideos] = useState<Video[] | null>(null);


@@ -3,6 +3,7 @@ import {
ArrowLeft,
BrainCog,
ChevronLeft,
+ExternalLink,
Search,
Sliders,
ToggleRight,
@@ -115,7 +116,8 @@ const SettingsDialogue = ({
</div>
) : (
<div className="flex flex-1 inset-0 h-full overflow-hidden">
-<div className="hidden lg:flex flex-col w-[240px] border-r border-white-200 dark:border-dark-200 h-full px-3 pt-3 overflow-y-auto">
+<div className="hidden lg:flex flex-col justify-between w-[240px] border-r border-white-200 dark:border-dark-200 h-full px-3 pt-3 overflow-y-auto">
+<div className="flex flex-col">
<button
onClick={() => setIsOpen(false)}
className="group flex flex-row items-center hover:bg-light-200 hover:dark:bg-dark-200 p-2 rounded-lg"
@@ -128,6 +130,7 @@ const SettingsDialogue = ({
Back
</p>
</button>
<div className="flex flex-col items-start space-y-1 mt-8">
{sections.map((section) => (
<button
@@ -146,6 +149,21 @@ const SettingsDialogue = ({
))}
</div>
</div>
+<div className="flex flex-col space-y-1 py-[18px] px-2">
+<p className="text-xs text-black/70 dark:text-white/70">
+Version: {process.env.NEXT_PUBLIC_VERSION}
+</p>
+<a
+href="https://github.com/itzcrazykns/perplexica"
+target="_blank"
+rel="noopener noreferrer"
+className="text-xs text-black/70 dark:text-white/70 flex flex-row space-x-1 items-center transition duration-200 hover:text-black/90 hover:dark:text-white/90"
+>
+<span>GitHub</span>
+<ExternalLink size={12} />
+</a>
+</div>
+</div>
<div className="w-full flex flex-col overflow-hidden">
<div className="flex flex-row lg:hidden w-full justify-between px-[20px] my-4 flex-shrink-0">
<button


@@ -310,7 +310,7 @@ const SettingsSwitch = ({
checked={isChecked}
onChange={handleSave}
disabled={loading}
-className="group relative flex h-6 w-12 shrink-0 cursor-pointer rounded-full bg-white/10 p-1 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500"
+className="group relative flex h-6 w-12 shrink-0 cursor-pointer rounded-full bg-light-200 dark:bg-white/10 p-1 duration-200 ease-in-out focus:outline-none transition-colors disabled:opacity-60 disabled:cursor-not-allowed data-[checked]:bg-sky-500 dark:data-[checked]:bg-sky-500"
>
<span
aria-hidden="true"


@@ -91,7 +91,7 @@ const WeatherWidget = () => {
setData({
temperature: data.temperature,
condition: data.condition,
-location: 'Mars',
+location: location.city,
humidity: data.humidity,
windSpeed: data.windSpeed,
icon: data.icon,


@@ -9,38 +9,30 @@ type CalculationWidgetProps = {
const Calculation = ({ expression, result }: CalculationWidgetProps) => {
return (
-<div className="rounded-lg bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 overflow-hidden shadow-sm">
-<div className="flex items-center gap-2 p-3 bg-light-100/50 dark:bg-dark-100/50 border-b border-light-200 dark:border-dark-200">
-<div className="rounded-full p-1.5 bg-light-100 dark:bg-dark-100">
-<Calculator className="w-4 h-4 text-black/70 dark:text-white/70" />
-</div>
-<span className="text-sm font-medium text-black dark:text-white">
-Calculation
-</span>
-</div>
-<div className="p-4 space-y-3">
-<div>
-<div className="flex items-center gap-1.5 mb-1.5">
-<span className="text-xs text-black/50 dark:text-white/50 font-medium">
+<div className="rounded-lg border border-light-200 dark:border-dark-200">
+<div className="p-4 space-y-4">
+<div className="space-y-2">
+<div className="flex items-center gap-2 text-black/60 dark:text-white/70">
+<Calculator className="w-4 h-4" />
+<span className="text-xs uppercase font-semibold tracking-wide">
Expression
</span>
</div>
-<div className="bg-light-100 dark:bg-dark-100 rounded-md p-2.5 border border-light-200 dark:border-dark-200">
+<div className="rounded-lg border border-light-200 dark:border-dark-200 bg-light-secondary dark:bg-dark-secondary p-3">
<code className="text-sm text-black dark:text-white font-mono break-all">
{expression}
</code>
</div>
</div>
-<div>
-<div className="flex items-center gap-1.5 mb-1.5">
-<Equal className="w-3.5 h-3.5 text-black/50 dark:text-white/50" />
-<span className="text-xs text-black/50 dark:text-white/50 font-medium">
+<div className="space-y-2">
+<div className="flex items-center gap-2 text-black/60 dark:text-white/70">
+<Equal className="w-4 h-4" />
+<span className="text-xs uppercase font-semibold tracking-wide">
Result
</span>
</div>
-<div className="bg-gradient-to-br from-light-100 to-light-secondary dark:from-dark-100 dark:to-dark-secondary rounded-md p-4 border-2 border-light-200 dark:border-dark-200">
+<div className="rounded-xl border border-light-200 dark:border-dark-200 bg-light-secondary dark:bg-dark-secondary p-5">
<div className="text-4xl font-bold text-black dark:text-white font-mono tabular-nums">
{result.toLocaleString()}
</div>


@@ -1,5 +1,6 @@
'use client';
+import { getMeasurementUnit } from '@/lib/config/clientRegistry';
import { Wind, Droplets, Gauge } from 'lucide-react';
import { useMemo, useEffect, useState } from 'react';
@@ -226,6 +227,20 @@ const Weather = ({
timezone,
}: WeatherWidgetProps) => {
const [isDarkMode, setIsDarkMode] = useState(false);
+const unit = getMeasurementUnit();
+const isImperial = unit === 'imperial';
+const tempUnitLabel = isImperial ? '°F' : '°C';
+const windUnitLabel = isImperial ? 'mph' : 'km/h';
+const formatTemp = (celsius: number) => {
+if (!Number.isFinite(celsius)) return 0;
+return Math.round(isImperial ? (celsius * 9) / 5 + 32 : celsius);
+};
+const formatWind = (speedKmh: number) => {
+if (!Number.isFinite(speedKmh)) return 0;
+return Math.round(isImperial ? speedKmh * 0.621371 : speedKmh);
+};
useEffect(() => {
const checkDarkMode = () => {
@@ -266,14 +281,12 @@ const Weather = ({
return {
day: dayName,
icon: info.icon,
-high: Math.round(daily.temperature_2m_max[idx + 1]),
-low: Math.round(daily.temperature_2m_min[idx + 1]),
-highF: Math.round((daily.temperature_2m_max[idx + 1] * 9) / 5 + 32),
-lowF: Math.round((daily.temperature_2m_min[idx + 1] * 9) / 5 + 32),
+high: formatTemp(daily.temperature_2m_max[idx + 1]),
+low: formatTemp(daily.temperature_2m_min[idx + 1]),
precipitation: daily.precipitation_probability_max[idx + 1] || 0,
};
});
-}, [daily, isDarkMode]);
+}, [daily, isDarkMode, isImperial]);
if (!current || !daily || !daily.time || daily.time.length === 0) {
return (
@@ -305,9 +318,9 @@ const Weather = ({
<div>
<div className="flex items-baseline gap-1">
<span className="text-4xl font-bold drop-shadow-md">
-{Math.round(current.temperature_2m)}°
+{formatTemp(current.temperature_2m)}°
</span>
-<span className="text-lg">°C</span>
+<span className="text-lg">{tempUnitLabel}</span>
</div>
<p className="text-sm font-medium drop-shadow mt-0.5">
{weatherInfo.description}
@@ -316,8 +329,8 @@ const Weather = ({
</div>
<div className="text-right">
<p className="text-xs font-medium opacity-90">
-{Math.round(daily.temperature_2m_max[0])}°{' '}
-{Math.round(daily.temperature_2m_min[0])}°
+{formatTemp(daily.temperature_2m_max[0])}°{' '}
+{formatTemp(daily.temperature_2m_min[0])}°
</p>
</div>
</div>
@@ -371,7 +384,7 @@ const Weather = ({
Wind
</p>
<p className="font-semibold">
-{Math.round(current.wind_speed_10m)} km/h
+{formatWind(current.wind_speed_10m)} {windUnitLabel}
</p>
</div>
</div>
@@ -395,7 +408,8 @@ const Weather = ({
Feels Like
</p>
<p className="font-semibold">
-{Math.round(current.apparent_temperature)}°C
+{formatTemp(current.apparent_temperature)}
+{tempUnitLabel}
</p>
</div>
</div>

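The unit handling above keeps the stored weather values in metric and converts only for display: °F = °C × 9/5 + 32 and mph ≈ km/h × 0.621371, both rounded. A standalone sketch of the same arithmetic (helper names chosen for illustration; nothing here depends on the widget):

// Display conversions matching the Weather widget diff above.
const celsiusToFahrenheit = (c: number): number => Math.round((c * 9) / 5 + 32);
const kmhToMph = (kmh: number): number => Math.round(kmh * 0.621371);

console.log(celsiusToFahrenheit(20)); // 68
console.log(kmhToMph(100)); // 62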

@@ -1,14 +1,4 @@
-import { Message } from '@/components/ChatWindow';
export const getSuggestions = async (chatHistory: [string, string][]) => {
-const chatTurns = chatHistory.map(([role, content]) => {
-if (role === 'human') {
-return { role: 'user', content };
-} else {
-return { role: 'assistant', content };
-}
-});
const chatModel = localStorage.getItem('chatModelKey');
const chatModelProvider = localStorage.getItem('chatModelProviderId');
@@ -18,7 +8,7 @@ export const getSuggestions = async (chatHistory: [string, string][]) => {
'Content-Type': 'application/json',
},
body: JSON.stringify({
-chatHistory: chatTurns,
+chatHistory,
chatModel: {
providerId: chatModelProvider,
key: chatModel,


@@ -29,7 +29,7 @@ const searchImages = async (
query: z.string().describe('The image search query.'),
});
-const res = await llm.generateObject<z.infer<typeof schema>>({
+const res = await llm.generateObject<typeof schema>({
messages: [
{
role: 'system',


@@ -28,7 +28,7 @@ const searchVideos = async (
query: z.string().describe('The video search query.'),
});
-const res = await llm.generateObject<z.infer<typeof schema>>({
+const res = await llm.generateObject<typeof schema>({
messages: [
{
role: 'system',

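Both hunks above swap the generic argument from the inferred output type (`z.infer<typeof schema>`) to the schema type itself (`typeof schema`), which suggests `generateObject` now derives its return type from the zod schema internally. A hypothetical signature consistent with these call sites (the real declaration lives elsewhere in the repo and may differ):

import z from 'zod';

// Sketch only: parameterizing over the schema type lets callers write
// generateObject<typeof schema>(...) and still receive z.infer<T> back.
declare function generateObject<T extends z.ZodTypeAny>(options: {
  messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
  schema?: T;
}): Promise<z.infer<T>>;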

@@ -0,0 +1,99 @@
import { ResearcherOutput, SearchAgentInput } from './types';
import SessionManager from '@/lib/session';
import { classify } from './classifier';
import Researcher from './researcher';
import { getWriterPrompt } from '@/lib/prompts/search/writer';
import { WidgetExecutor } from './widgets';
class APISearchAgent {
async searchAsync(session: SessionManager, input: SearchAgentInput) {
const classification = await classify({
chatHistory: input.chatHistory,
enabledSources: input.config.sources,
query: input.followUp,
llm: input.config.llm,
});
const widgetPromise = WidgetExecutor.executeAll({
classification,
chatHistory: input.chatHistory,
followUp: input.followUp,
llm: input.config.llm,
});
let searchPromise: Promise<ResearcherOutput> | null = null;
if (!classification.classification.skipSearch) {
const researcher = new Researcher();
searchPromise = researcher.research(SessionManager.createSession(), {
chatHistory: input.chatHistory,
followUp: input.followUp,
classification: classification,
config: input.config,
});
}
const [widgetOutputs, searchResults] = await Promise.all([
widgetPromise,
searchPromise,
]);
if (searchResults) {
session.emit('data', {
type: 'searchResults',
data: searchResults.searchFindings,
});
}
session.emit('data', {
type: 'researchComplete',
});
const finalContext =
searchResults?.searchFindings
.map(
(f, index) =>
`<result index=${index + 1} title=${f.metadata.title}>${f.content}</result>`,
)
.join('\n') || '';
const widgetContext = widgetOutputs
.map((o) => {
return `<result>${o.llmContext}</result>`;
})
.join('\n-------------\n');
const finalContextWithWidgets = `<search_results note="These are the search results and assistant can cite these">\n${finalContext}\n</search_results>\n<widgets_result noteForAssistant="Its output is already showed to the user, assistant can use this information to answer the query but do not CITE this as a souce">\n${widgetContext}\n</widgets_result>`;
const writerPrompt = getWriterPrompt(
finalContextWithWidgets,
input.config.systemInstructions,
input.config.mode,
);
const answerStream = input.config.llm.streamText({
messages: [
{
role: 'system',
content: writerPrompt,
},
...input.chatHistory,
{
role: 'user',
content: input.followUp,
},
],
});
for await (const chunk of answerStream) {
session.emit('data', {
type: 'response',
data: chunk.contentChunk,
});
}
session.emit('end', {});
}
}
export default APISearchAgent;

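Read off the emit calls above, the API agent streams three kinds of `data` payloads followed by a terminal `end`: the collected search findings, a research-complete marker, and incremental answer chunks. A type-level summary of those payloads (field names taken from this file; not an exported type in the repo):

// Payload shapes emitted on the session by APISearchAgent above (sketch).
type APISearchAgentData =
  | { type: 'searchResults'; data: unknown } // researcher findings
  | { type: 'researchComplete' } // research phase finished
  | { type: 'response'; data: string }; // streamed answer chunk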

@@ -23,6 +23,9 @@ const schema = z.object({
showStockWidget: z
.boolean()
.describe('Indicates whether to show the stock widget.'),
+showCalculationWidget: z
+.boolean()
+.describe('Indicates whether to show the calculation widget.'),
}),
standaloneFollowUp: z
.string()


@@ -1,26 +1,68 @@
import { ResearcherOutput, SearchAgentInput } from './types';
import SessionManager from '@/lib/session';
-import Classifier from './classifier';
-import { WidgetRegistry } from './widgets';
+import { classify } from './classifier';
import Researcher from './researcher';
import { getWriterPrompt } from '@/lib/prompts/search/writer';
-import fs from 'fs';
+import { WidgetExecutor } from './widgets';
+import db from '@/lib/db';
+import { chats, messages } from '@/lib/db/schema';
+import { and, eq, gt } from 'drizzle-orm';
+import { TextBlock } from '@/lib/types';
class SearchAgent {
async searchAsync(session: SessionManager, input: SearchAgentInput) {
-const classifier = new Classifier();
-const classification = await classifier.classify({
+const exists = await db.query.messages.findFirst({
+where: and(
+eq(messages.chatId, input.chatId),
+eq(messages.messageId, input.messageId),
+),
+});
+if (!exists) {
+await db.insert(messages).values({
+chatId: input.chatId,
+messageId: input.messageId,
+backendId: session.id,
+query: input.followUp,
+createdAt: new Date().toISOString(),
+status: 'answering',
+responseBlocks: [],
+});
+} else {
+await db
+.delete(messages)
+.where(
+and(eq(messages.chatId, input.chatId), gt(messages.id, exists.id)),
+)
+.execute();
+await db
+.update(messages)
+.set({
+status: 'answering',
+backendId: session.id,
+responseBlocks: [],
+})
+.where(
+and(
+eq(messages.chatId, input.chatId),
+eq(messages.messageId, input.messageId),
+),
+)
+.execute();
+}
+const classification = await classify({
chatHistory: input.chatHistory,
enabledSources: input.config.sources,
query: input.followUp,
llm: input.config.llm,
});
-const widgetPromise = WidgetRegistry.executeAll(classification.widgets, {
+const widgetPromise = WidgetExecutor.executeAll({
+classification,
+chatHistory: input.chatHistory,
+followUp: input.followUp,
llm: input.config.llm,
-embedding: input.config.embedding,
-session: session,
}).then((widgetOutputs) => {
widgetOutputs.forEach((o) => {
session.emitBlock({
@@ -37,7 +79,7 @@ class SearchAgent {
let searchPromise: Promise<ResearcherOutput> | null = null;
-if (!classification.skipSearch) {
+if (!classification.classification.skipSearch) {
const researcher = new Researcher();
searchPromise = researcher.research(session, {
chatHistory: input.chatHistory,
@@ -57,21 +99,26 @@ class SearchAgent {
});
const finalContext =
-searchResults?.findings
-.filter((f) => f.type === 'search_results')
-.flatMap((f) => f.results)
-.map((f) => `${f.metadata.title}: ${f.content}`)
+searchResults?.searchFindings
+.map(
+(f, index) =>
+`<result index=${index + 1} title=${f.metadata.title}>${f.content}</result>`,
+)
.join('\n') || '';
const widgetContext = widgetOutputs
.map((o) => {
-return `${o.type}: ${o.llmContext}`;
+return `<result>${o.llmContext}</result>`;
})
.join('\n-------------\n');
-const finalContextWithWidgets = `<search_results note="These are the search results and you can cite these">${finalContext}</search_results>\n<widgets_result noteForAssistant="Its output is already showed to the user, you can use this information to answer the query but do not CITE this as a souce">${widgetContext}</widgets_result>`;
+const finalContextWithWidgets = `<search_results note="These are the search results and assistant can cite these">\n${finalContext}\n</search_results>\n<widgets_result noteForAssistant="Its output is already showed to the user, assistant can use this information to answer the query but do not CITE this as a souce">\n${widgetContext}\n</widgets_result>`;
-const writerPrompt = getWriterPrompt(finalContextWithWidgets);
+const writerPrompt = getWriterPrompt(
+finalContextWithWidgets,
+input.config.systemInstructions,
+input.config.mode,
+);
const answerStream = input.config.llm.streamText({
messages: [
{
@@ -86,18 +133,53 @@ class SearchAgent {
],
});
-let accumulatedText = '';
+let responseBlockId = '';
for await (const chunk of answerStream) {
-accumulatedText += chunk.contentChunk;
-session.emit('data', {
-type: 'response',
-data: chunk.contentChunk,
-});
+if (!responseBlockId) {
+const block: TextBlock = {
+id: crypto.randomUUID(),
+type: 'text',
+data: chunk.contentChunk,
+};
+session.emitBlock(block);
+responseBlockId = block.id;
+} else {
+const block = session.getBlock(responseBlockId) as TextBlock | null;
+if (!block) {
+continue;
+}
+block.data += chunk.contentChunk;
+session.updateBlock(block.id, [
+{
+op: 'replace',
+path: '/data',
+value: block.data,
+},
+]);
+}
}
session.emit('end', {});
+await db
+.update(messages)
+.set({
+status: 'completed',
+responseBlocks: session.getAllBlocks(),
+})
+.where(
+and(
+eq(messages.chatId, input.chatId),
+eq(messages.messageId, input.messageId),
+),
+)
+.execute();
}
}

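The new streaming loop persists the answer as a block instead of firing raw `response` events: the first chunk creates a `TextBlock` via `emitBlock`, and each later chunk appends to `block.data` and publishes a JSON-Patch-style `replace` of `/data`. A minimal sketch of applying such a patch on the receiving side (single-segment paths only; nested paths like `/data/subSteps` would need full JSON Pointer handling, and this helper is an illustration, not code from the repo):

type ReplaceOp = { op: 'replace'; path: string; value: unknown };

// Apply one single-segment 'replace' op, e.g. { op: 'replace', path: '/data', value: text }.
function applyReplace(
  block: Record<string, unknown>,
  patch: ReplaceOp,
): Record<string, unknown> {
  const key = patch.path.slice(1); // '/data' -> 'data'
  return { ...block, [key]: patch.value };
}

// { id: '1', type: 'text', data: 'Hello' }
console.log(
  applyReplace(
    { id: '1', type: 'text', data: 'Hel' },
    { op: 'replace', path: '/data', value: 'Hello' },
  ),
);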

@@ -0,0 +1,129 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { Chunk, SearchResultsResearchBlock } from '@/lib/types';
import { searchSearxng } from '@/lib/searxng';
const schema = z.object({
queries: z.array(z.string()).describe('List of academic search queries'),
});
const academicSearchDescription = `
Use this tool to perform academic searches for scholarly articles, papers, and research studies relevant to the user's query. Provide a list of concise search queries that will help gather comprehensive academic information on the topic at hand.
You can provide up to 3 queries at a time. Make sure the queries are specific and relevant to the user's needs.
For example, if the user is interested in recent advancements in renewable energy, your queries could be:
1. "Recent advancements in renewable energy 2024"
2. "Cutting-edge research on solar power technologies"
3. "Innovations in wind energy systems"
If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed academic information.
`;
const academicSearchAction: ResearchAction<typeof schema> = {
name: 'academic_search',
schema: schema,
getDescription: () => academicSearchDescription,
getToolDescription: () =>
"Use this tool to perform academic searches for scholarly articles, papers, and research studies relevant to the user's query. Provide a list of concise search queries that will help gather comprehensive academic information on the topic at hand.",
enabled: (config) =>
config.sources.includes('academic') &&
config.classification.classification.skipSearch === false &&
config.classification.classification.academicSearch === true,
execute: async (input, additionalConfig) => {
input.queries = input.queries.slice(0, 3);
const researchBlock = additionalConfig.session.getBlock(
additionalConfig.researchBlockId,
);
if (researchBlock && researchBlock.type === 'research') {
researchBlock.data.subSteps.push({
type: 'searching',
id: crypto.randomUUID(),
searching: input.queries,
});
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
}
const searchResultsBlockId = crypto.randomUUID();
let searchResultsEmitted = false;
let results: Chunk[] = [];
const search = async (q: string) => {
const res = await searchSearxng(q, {
engines: ['arxiv', 'google scholar', 'pubmed'],
});
const resultChunks: Chunk[] = res.results.map((r) => ({
content: r.content || r.title,
metadata: {
title: r.title,
url: r.url,
},
}));
results.push(...resultChunks);
if (
!searchResultsEmitted &&
researchBlock &&
researchBlock.type === 'research'
) {
searchResultsEmitted = true;
researchBlock.data.subSteps.push({
id: searchResultsBlockId,
type: 'search_results',
reading: resultChunks,
});
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
} else if (
searchResultsEmitted &&
researchBlock &&
researchBlock.type === 'research'
) {
const subStepIndex = researchBlock.data.subSteps.findIndex(
(step) => step.id === searchResultsBlockId,
);
const subStep = researchBlock.data.subSteps[
subStepIndex
] as SearchResultsResearchBlock;
subStep.reading.push(...resultChunks);
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
}
};
await Promise.all(input.queries.map(search));
return {
type: 'search_results',
results,
};
},
};
export default academicSearchAction;


@@ -1,14 +1,19 @@
import z from 'zod';
import { ResearchAction } from '../../types';
+const actionDescription = `
+Use this action ONLY when you have completed all necessary research and are ready to provide a final answer to the user. This indicates that you have gathered sufficient information from previous steps and are concluding the research process.
+YOU MUST CALL THIS ACTION TO SIGNAL COMPLETION; DO NOT OUTPUT FINAL ANSWERS DIRECTLY TO THE USER.
+IT WILL BE AUTOMATICALLY TRIGGERED IF MAXIMUM ITERATIONS ARE REACHED SO IF YOU'RE LOW ON ITERATIONS, DON'T CALL IT AND INSTEAD FOCUS ON GATHERING ESSENTIAL INFO FIRST.
+`;
const doneAction: ResearchAction<any> = {
name: 'done',
-description:
-"Indicates that the research process is complete and no further actions are needed. Use this action when you have gathered sufficient information to answer the user's query.",
+schema: z.object({}),
+getToolDescription: () =>
+'Only call this after __reasoning_preamble AND after any other needed tool calls when you truly have enough to answer. Do not call if information is still missing.',
+getDescription: () => actionDescription,
enabled: (_) => true,
-schema: z.object({
-type: z.literal('done'),
-}),
execute: async (params, additionalConfig) => {
return {
type: 'done',


@@ -1,8 +1,18 @@
+import academicSearchAction from './academicSearch';
import doneAction from './done';
+import planAction from './plan';
import ActionRegistry from './registry';
+import scrapeURLAction from './scrapeURL';
+import socialSearchAction from './socialSearch';
+import uploadsSearchAction from './uploadsSearch';
import webSearchAction from './webSearch';
ActionRegistry.register(webSearchAction);
ActionRegistry.register(doneAction);
+ActionRegistry.register(planAction);
+ActionRegistry.register(scrapeURLAction);
+ActionRegistry.register(uploadsSearchAction);
+ActionRegistry.register(academicSearchAction);
+ActionRegistry.register(socialSearchAction);
export { ActionRegistry };


@@ -0,0 +1,40 @@
import z from 'zod';
import { ResearchAction } from '../../types';
const schema = z.object({
plan: z
.string()
.describe(
'A concise natural-language plan in one short paragraph. Open with a short intent phrase (e.g., "Okay, the user wants to...", "Searching for...", "Looking into...") and lay out the steps you will take.',
),
});
const actionDescription = `
Use this tool FIRST on every turn to state your plan in natural language before any other action. Keep it short, action-focused, and tailored to the current query.
Make sure to not include reference to any tools or actions you might take, just the plan itself. The user isn't aware about tools, but they love to see your thought process.
Here are some examples of good plans:
<examples>
- "Okay, the user wants to know the latest advancements in renewable energy. I will start by looking for recent articles and studies on this topic, then summarize the key points." -> "I have gathered enough information to provide a comprehensive answer."
- "The user is asking about the health benefits of a Mediterranean diet. I will search for scientific studies and expert opinions on this diet, then compile the findings into a clear summary." -> "I have gathered information about the Mediterranean diet and its health benefits, I will now look up for any recent studies to ensure the information is current."
</examples>
YOU CAN NEVER CALL ANY OTHER TOOL BEFORE CALLING THIS ONE FIRST, IF YOU DO, THAT CALL WOULD BE IGNORED.
`;
const planAction: ResearchAction<typeof schema> = {
name: '__reasoning_preamble',
schema: schema,
getToolDescription: () =>
'Use this FIRST on every turn to state your plan in natural language before any other action. Keep it short, action-focused, and tailored to the current query.',
getDescription: () => actionDescription,
enabled: (config) => config.mode !== 'speed',
execute: async (input, _) => {
return {
type: 'reasoning',
reasoning: input.plan,
};
},
};
export default planAction;


@@ -1,9 +1,11 @@
+import { Tool, ToolCall } from '@/lib/models/types';
import {
-ActionConfig,
ActionOutput,
AdditionalConfig,
ClassifierOutput,
ResearchAction,
+SearchAgentConfig,
+SearchSources,
} from '../../types';
class ActionRegistry {
@@ -19,26 +21,53 @@ class ActionRegistry {
static getAvailableActions(config: {
classification: ClassifierOutput;
+fileIds: string[];
+mode: SearchAgentConfig['mode'];
+sources: SearchSources[];
}): ResearchAction[] {
return Array.from(
this.actions.values().filter((action) => action.enabled(config)),
);
}
+static getAvailableActionTools(config: {
+classification: ClassifierOutput;
+fileIds: string[];
+mode: SearchAgentConfig['mode'];
+sources: SearchSources[];
+}): Tool[] {
+const availableActions = this.getAvailableActions(config);
+return availableActions.map((action) => ({
+name: action.name,
+description: action.getToolDescription({ mode: config.mode }),
+schema: action.schema,
+}));
+}
static getAvailableActionsDescriptions(config: {
classification: ClassifierOutput;
+fileIds: string[];
+mode: SearchAgentConfig['mode'];
+sources: SearchSources[];
}): string {
const availableActions = this.getAvailableActions(config);
return availableActions
-.map((action) => `------------\n##${action.name}\n${action.description}`)
+.map(
+(action) =>
+`<tool name="${action.name}">\n${action.getDescription({ mode: config.mode })}\n</tool>`,
+)
.join('\n\n');
}
static async execute(
name: string,
params: any,
-additionalConfig: AdditionalConfig,
+additionalConfig: AdditionalConfig & {
+researchBlockId: string;
+fileIds: string[];
+},
) {
const action = this.actions.get(name);
@@ -50,16 +79,19 @@ class ActionRegistry {
}
static async executeAll(
-actions: ActionConfig[],
-additionalConfig: AdditionalConfig,
+actions: ToolCall[],
+additionalConfig: AdditionalConfig & {
+researchBlockId: string;
+fileIds: string[];
+},
): Promise<ActionOutput[]> {
const results: ActionOutput[] = [];
await Promise.all(
actions.map(async (actionConfig) => {
const output = await this.execute(
-actionConfig.type,
-actionConfig.params,
+actionConfig.name,
+actionConfig.arguments,
additionalConfig,
);
results.push(output);

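With `getAvailableActionTools`, every enabled action is surfaced to the model as a function-calling tool built from its `name`, mode-aware `getToolDescription`, and zod `schema`, and the model's `ToolCall`s are routed back through `execute` by name with their parsed arguments. A minimal action conforming to the shape the registry reads (illustrative only; field names copied from the actions in this PR):

import z from 'zod';

// Hypothetical no-op action showing the fields ActionRegistry consumes.
const echoAction = {
  name: 'echo',
  schema: z.object({ text: z.string().describe('Text to echo back.') }),
  getToolDescription: () => 'Echoes the provided text back as a search result.',
  getDescription: () => 'Echoes the provided text back as a search result.',
  enabled: (_config: unknown) => true,
  execute: async (input: { text: string }) => ({
    type: 'search_results' as const,
    results: [{ content: input.text, metadata: { title: 'echo', url: '' } }],
  }),
};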

@@ -0,0 +1,139 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { Chunk, ReadingResearchBlock } from '@/lib/types';
import TurnDown from 'turndown';
import path from 'path';
const turndownService = new TurnDown();
const schema = z.object({
urls: z.array(z.string()).describe('A list of URLs to scrape content from.'),
});
const actionDescription = `
Use this tool to scrape and extract content from the provided URLs. This is useful when you the user has asked you to extract or summarize information from specific web pages. You can provide up to 3 URLs at a time. NEVER CALL THIS TOOL EXPLICITLY YOURSELF UNLESS INSTRUCTED TO DO SO BY THE USER.
You should only call this tool when the user has specifically requested information from certain web pages, never call this yourself to get extra information without user instruction.
For example, if the user says "Please summarize the content of https://example.com/article", you can call this tool with that URL to get the content and then provide the summary or "What does X mean according to https://example.com/page", you can call this tool with that URL to get the content and provide the explanation.
`;
const scrapeURLAction: ResearchAction<typeof schema> = {
name: 'scrape_url',
schema: schema,
getToolDescription: () =>
'Use this tool to scrape and extract content from the provided URLs. This is useful when you the user has asked you to extract or summarize information from specific web pages. You can provide up to 3 URLs at a time. NEVER CALL THIS TOOL EXPLICITLY YOURSELF UNLESS INSTRUCTED TO DO SO BY THE USER.',
getDescription: () => actionDescription,
enabled: (_) => true,
execute: async (params, additionalConfig) => {
params.urls = params.urls.slice(0, 3);
let readingBlockId = crypto.randomUUID();
let readingEmitted = false;
const researchBlock = additionalConfig.session.getBlock(
additionalConfig.researchBlockId,
);
const results: Chunk[] = [];
await Promise.all(
params.urls.map(async (url) => {
try {
const res = await fetch(url);
const text = await res.text();
const title =
text.match(/<title>(.*?)<\/title>/i)?.[1] || `Content from ${url}`;
if (
!readingEmitted &&
researchBlock &&
researchBlock.type === 'research'
) {
readingEmitted = true;
researchBlock.data.subSteps.push({
id: readingBlockId,
type: 'reading',
reading: [
{
content: '',
metadata: {
url,
title: title,
},
},
],
});
additionalConfig.session.updateBlock(
additionalConfig.researchBlockId,
[
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
],
);
} else if (
readingEmitted &&
researchBlock &&
researchBlock.type === 'research'
) {
const subStepIndex = researchBlock.data.subSteps.findIndex(
(step: any) => step.id === readingBlockId,
);
const subStep = researchBlock.data.subSteps[
subStepIndex
] as ReadingResearchBlock;
subStep.reading.push({
content: '',
metadata: {
url,
title: title,
},
});
additionalConfig.session.updateBlock(
additionalConfig.researchBlockId,
[
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
],
);
}
const markdown = turndownService.turndown(text);
results.push({
content: markdown,
metadata: {
url,
title: title,
},
});
} catch (error) {
results.push({
content: `Failed to fetch content from ${url}: ${error}`,
metadata: {
url,
title: `Error fetching ${url}`,
},
});
}
}),
);
return {
type: 'search_results',
results,
};
},
};
export default scrapeURLAction;


@@ -0,0 +1,129 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { Chunk, SearchResultsResearchBlock } from '@/lib/types';
import { searchSearxng } from '@/lib/searxng';
const schema = z.object({
queries: z.array(z.string()).describe('List of social search queries'),
});
const socialSearchDescription = `
Use this tool to perform social media searches for relevant posts, discussions, and trends related to the user's query. Provide a list of concise search queries that will help gather comprehensive social media information on the topic at hand.
You can provide up to 3 queries at a time. Make sure the queries are specific and relevant to the user's needs.
For example, if the user is interested in public opinion on electric vehicles, your queries could be:
1. "Electric vehicles public opinion 2024"
2. "Social media discussions on EV adoption"
3. "Trends in electric vehicle usage"
If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed social media information.
`;
const socialSearchAction: ResearchAction<typeof schema> = {
name: 'social_search',
schema: schema,
getDescription: () => socialSearchDescription,
getToolDescription: () =>
"Use this tool to perform social media searches for relevant posts, discussions, and trends related to the user's query. Provide a list of concise search queries that will help gather comprehensive social media information on the topic at hand.",
enabled: (config) =>
config.sources.includes('discussions') &&
config.classification.classification.skipSearch === false &&
config.classification.classification.discussionSearch === true,
execute: async (input, additionalConfig) => {
input.queries = input.queries.slice(0, 3);
const researchBlock = additionalConfig.session.getBlock(
additionalConfig.researchBlockId,
);
if (researchBlock && researchBlock.type === 'research') {
researchBlock.data.subSteps.push({
type: 'searching',
id: crypto.randomUUID(),
searching: input.queries,
});
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
}
const searchResultsBlockId = crypto.randomUUID();
let searchResultsEmitted = false;
let results: Chunk[] = [];
const search = async (q: string) => {
const res = await searchSearxng(q, {
engines: ['reddit'],
});
const resultChunks: Chunk[] = res.results.map((r) => ({
content: r.content || r.title,
metadata: {
title: r.title,
url: r.url,
},
}));
results.push(...resultChunks);
if (
!searchResultsEmitted &&
researchBlock &&
researchBlock.type === 'research'
) {
searchResultsEmitted = true;
researchBlock.data.subSteps.push({
id: searchResultsBlockId,
type: 'search_results',
reading: resultChunks,
});
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
} else if (
searchResultsEmitted &&
researchBlock &&
researchBlock.type === 'research'
) {
const subStepIndex = researchBlock.data.subSteps.findIndex(
(step) => step.id === searchResultsBlockId,
);
const subStep = researchBlock.data.subSteps[
subStepIndex
] as SearchResultsResearchBlock;
subStep.reading.push(...resultChunks);
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
}
};
await Promise.all(input.queries.map(search));
return {
type: 'search_results',
results,
};
},
};
export default socialSearchAction;


@@ -0,0 +1,102 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import UploadStore from '@/lib/uploads/store';
const schema = z.object({
queries: z
.array(z.string())
.describe(
'A list of queries to search in user uploaded files. Can be a maximum of 3 queries.',
),
});
const uploadsSearchAction: ResearchAction<typeof schema> = {
name: 'uploads_search',
enabled: (config) =>
(config.classification.classification.personalSearch &&
config.fileIds.length > 0) ||
config.fileIds.length > 0,
schema,
getToolDescription: () =>
`Use this tool to perform searches over the user's uploaded files. This is useful when you need to gather information from the user's documents to answer their questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.`,
getDescription: () => `
Use this tool to perform searches over the user's uploaded files. This is useful when you need to gather information from the user's documents to answer their questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.
Always ensure that the queries you use are directly relevant to the user's request and pertain to the content of their uploaded files.
For example, if the user says "Please find information about X in my uploaded documents", you can call this tool with a query related to X to retrieve the relevant information from their files.
Never use this tool to search the web or for information that is not contained within the user's uploaded files.
`,
execute: async (input, additionalConfig) => {
input.queries = input.queries.slice(0, 3);
const researchBlock = additionalConfig.session.getBlock(
additionalConfig.researchBlockId,
);
if (researchBlock && researchBlock.type === 'research') {
researchBlock.data.subSteps.push({
id: crypto.randomUUID(),
type: 'upload_searching',
queries: input.queries,
});
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
}
const uploadStore = new UploadStore({
embeddingModel: additionalConfig.embedding,
fileIds: additionalConfig.fileIds,
});
const results = await uploadStore.query(input.queries, 10);
const seenIds = new Map<string, number>();
const filteredSearchResults = results
.map((result, index) => {
if (result.metadata.url && !seenIds.has(result.metadata.url)) {
seenIds.set(result.metadata.url, index);
return result;
} else if (result.metadata.url && seenIds.has(result.metadata.url)) {
const existingIndex = seenIds.get(result.metadata.url)!;
const existingResult = results[existingIndex];
existingResult.content += `\n\n${result.content}`;
return undefined;
}
return result;
})
.filter((r) => r !== undefined);
if (researchBlock && researchBlock.type === 'research') {
researchBlock.data.subSteps.push({
id: crypto.randomUUID(),
type: 'upload_search_results',
results: filteredSearchResults,
});
additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
{
op: 'replace',
path: '/data/subSteps',
value: researchBlock.data.subSteps,
},
]);
}
return {
type: 'search_results',
results: filteredSearchResults,
};
},
};
export default uploadsSearchAction;


@@ -1,7 +1,7 @@
import z from 'zod';
import { ResearchAction } from '../../types';
import { searchSearxng } from '@/lib/searxng';
-import { Chunk } from '@/lib/types';
+import { Chunk, SearchResultsResearchBlock } from '@/lib/types';
const actionSchema = z.object({
type: z.literal('web_search'),
@@ -10,38 +10,164 @@ const actionSchema = z.object({
     .describe('An array of search queries to perform web searches for.'),
 });
 
-const actionDescription = `
-You have to use this action aggressively to find relevant information from the web to answer user queries. You can combine this action with other actions to gather comprehensive data. Always ensure that you provide accurate and up-to-date information by leveraging web search results.
-When this action is present, you must use it to obtain current information from the web.
-
-### How to use:
-1. For speed search mode, you can use this action once. Make sure to cover all aspects of the user's query in that single search.
-2. If you're on quality mode, you'll get to use this action up to two times. Use the first search to gather general information, and the second search to fill in any gaps or get more specific details based on the initial findings.
-3. If you're set on quality mode, then you will get to use this action multiple times to gather more information. Use your judgment to decide when additional searches are necessary to provide a thorough and accurate response.
-
-Input: An array of search queries. Make sure the queries are relevant to the user's request and cover different aspects if necessary. You can include a maximum of 3 queries.
+const speedModePrompt = `
+Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.
+You are currently on speed mode, meaning you would only get to call this tool once. Make sure to prioritize the most important queries that are likely to get you the needed information in one go.
+Your queries should be very targeted and specific to the information you need, avoid broad or generic queries.
+Your queries shouldn't be sentences but rather keywords that are SEO friendly and can be used to search the web for information.
+Make sure the queries are SEO friendly and not sentences rather keywords which can be used to search a search engine like Google, Bing, etc. For example, if the user is asking about the features of a new technology, you might use queries like "GPT-5.1 features", "GPT-5.1 release date", "GPT-5.1 improvements" rather than a broad query like "Tell me about GPT-5.1".
+You can search for 3 queries in one go, make sure to utilize all 3 queries to maximize the information you can gather. If a question is simple, then split your queries to cover different aspects or related topics to get a comprehensive understanding.
+If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed information.
+`;
+
+const balancedModePrompt = `
+Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.
+You can call this tool several times if needed to gather enough information.
+Start initially with broader queries to get an overview, then narrow down with more specific queries based on the results you receive.
+Your queries shouldn't be sentences but rather keywords that are SEO friendly and can be used to search the web for information.
+For example if the user is asking about Tesla, your actions should be like:
+1. __reasoning_preamble "The user is asking about Tesla. I will start with broader queries to get an overview of Tesla, then narrow down with more specific queries based on the results I receive." then
+2. web_search ["Tesla", "Tesla latest news", "Tesla stock price"] then
+3. __reasoning_preamble "Based on the previous search results, I will now narrow down my queries to focus on Tesla's recent developments and stock performance." then
+4. web_search ["Tesla Q2 2025 earnings", "Tesla new model 2025", "Tesla stock analysis"] then done.
+5. __reasoning_preamble "I have gathered enough information to provide a comprehensive answer."
+6. done.
+You can search for 3 queries in one go, make sure to utilize all 3 queries to maximize the information you can gather. If a question is simple, then split your queries to cover different aspects or related topics to get a comprehensive understanding.
+If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed information. You can call this tools, multiple times as needed.
+`;
+
+const qualityModePrompt = `
+Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.
+You have to call this tool several times to gather enough information unless the question is very simple (like greeting questions or basic facts).
+Start initially with broader queries to get an overview, then narrow down with more specific queries based on the results you receive.
+Never stop before at least 5-6 iterations of searches unless the user question is very simple.
+Your queries shouldn't be sentences but rather keywords that are SEO friendly and can be used to search the web for information.
+You can search for 3 queries in one go, make sure to utilize all 3 queries to maximize the information you can gather. If a question is simple, then split your queries to cover different aspects or related topics to get a comprehensive understanding.
+If this tool is present and no other tools are more relevant, you MUST use this tool to get the needed information. You can call this tools, multiple times as needed.
 `;
 
 const webSearchAction: ResearchAction<typeof actionSchema> = {
   name: 'web_search',
-  description: actionDescription,
   schema: actionSchema,
-  enabled: (config) => config.classification.intents.includes('web_search'),
-  execute: async (input, _) => {
+  getToolDescription: () =>
+    "Use this tool to perform web searches based on the provided queries. This is useful when you need to gather information from the web to answer the user's questions. You can provide up to 3 queries at a time. You will have to use this every single time if this is present and relevant.",
+  getDescription: (config) => {
+    let prompt = '';
+
+    switch (config.mode) {
+      case 'speed':
+        prompt = speedModePrompt;
+        break;
+      case 'balanced':
+        prompt = balancedModePrompt;
+        break;
+      case 'quality':
+        prompt = qualityModePrompt;
+        break;
+      default:
+        prompt = speedModePrompt;
+        break;
+    }
+
+    return prompt;
+  },
+  enabled: (config) =>
+    config.sources.includes('web') &&
+    config.classification.classification.skipSearch === false,
+  execute: async (input, additionalConfig) => {
+    input.queries = input.queries.slice(0, 3);
+
+    const researchBlock = additionalConfig.session.getBlock(
+      additionalConfig.researchBlockId,
+    );
+
+    if (researchBlock && researchBlock.type === 'research') {
+      researchBlock.data.subSteps.push({
+        id: crypto.randomUUID(),
+        type: 'searching',
+        searching: input.queries,
+      });
+
+      additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
+        {
+          op: 'replace',
+          path: '/data/subSteps',
+          value: researchBlock.data.subSteps,
+        },
+      ]);
+    }
+
+    const searchResultsBlockId = crypto.randomUUID();
+    let searchResultsEmitted = false;
+
     let results: Chunk[] = [];
 
     const search = async (q: string) => {
       const res = await searchSearxng(q);
 
-      res.results.forEach((r) => {
-        results.push({
-          content: r.content || r.title,
-          metadata: {
-            title: r.title,
-            url: r.url,
-          },
-        });
-      });
+      const resultChunks: Chunk[] = res.results.map((r) => ({
+        content: r.content || r.title,
+        metadata: {
+          title: r.title,
+          url: r.url,
+        },
+      }));
+
+      results.push(...resultChunks);
+
+      if (
+        !searchResultsEmitted &&
+        researchBlock &&
+        researchBlock.type === 'research'
+      ) {
+        searchResultsEmitted = true;
+
+        researchBlock.data.subSteps.push({
+          id: searchResultsBlockId,
+          type: 'search_results',
+          reading: resultChunks,
+        });
+
+        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
+          {
+            op: 'replace',
+            path: '/data/subSteps',
+            value: researchBlock.data.subSteps,
+          },
+        ]);
+      } else if (
+        searchResultsEmitted &&
+        researchBlock &&
+        researchBlock.type === 'research'
+      ) {
+        const subStepIndex = researchBlock.data.subSteps.findIndex(
+          (step) => step.id === searchResultsBlockId,
+        );
+
+        const subStep = researchBlock.data.subSteps[
+          subStepIndex
+        ] as SearchResultsResearchBlock;
+
+        subStep.reading.push(...resultChunks);
+
+        additionalConfig.session.updateBlock(additionalConfig.researchBlockId, [
+          {
+            op: 'replace',
+            path: '/data/subSteps',
+            value: researchBlock.data.subSteps,
+          },
+        ]);
+      }
+    };
 
     await Promise.all(input.queries.map(search));
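Note how research progress is streamed here: each action mutates the block's subSteps array and then re-publishes it with a JSON-Patch-style replace operation. A minimal sketch of that update pattern, assuming only the session/updateBlock API visible in the diff above:

// Sketch: push one sub-step and republish the whole array as a patch.
const subStep = {
  id: crypto.randomUUID(),
  type: 'searching' as const,
  searching: ['tesla q2 2025 earnings'], // illustrative query
};

researchBlock.data.subSteps.push(subStep);

session.updateBlock(researchBlockId, [
  { op: 'replace', path: '/data/subSteps', value: researchBlock.data.subSteps },
]);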

View File

@@ -1,46 +1,37 @@
-import z from 'zod';
-import {
-  ActionConfig,
-  ActionOutput,
-  ResearcherInput,
-  ResearcherOutput,
-} from '../types';
+import { ActionOutput, ResearcherInput, ResearcherOutput } from '../types';
 import { ActionRegistry } from './actions';
 import { getResearcherPrompt } from '@/lib/prompts/search/researcher';
 import SessionManager from '@/lib/session';
-import { ReasoningResearchBlock } from '@/lib/types';
+import { Message, ReasoningResearchBlock } from '@/lib/types';
 import formatChatHistoryAsString from '@/lib/utils/formatHistory';
+import { ToolCall } from '@/lib/models/types';
 
 class Researcher {
   async research(
     session: SessionManager,
     input: ResearcherInput,
   ): Promise<ResearcherOutput> {
-    let findings: string = '';
     let actionOutput: ActionOutput[] = [];
 
     let maxIteration =
       input.config.mode === 'speed'
-        ? 1
+        ? 2
         : input.config.mode === 'balanced'
-          ? 3
+          ? 6
           : 25;
 
-    const availableActions = ActionRegistry.getAvailableActions({
+    const availableTools = ActionRegistry.getAvailableActionTools({
       classification: input.classification,
-    });
-
-    const schema = z.object({
-      reasoning: z
-        .string()
-        .describe('The reasoning behind choosing the next action.'),
-      action: z
-        .union(availableActions.map((a) => a.schema))
-        .describe('The action to be performed next.'),
+      fileIds: input.config.fileIds,
+      mode: input.config.mode,
+      sources: input.config.sources,
     });
 
     const availableActionsDescription =
       ActionRegistry.getAvailableActionsDescriptions({
         classification: input.classification,
+        fileIds: input.config.fileIds,
+        mode: input.config.mode,
+        sources: input.config.sources,
       });
 
     const researchBlockId = crypto.randomUUID();
@@ -53,22 +44,7 @@ class Researcher {
       },
     });
 
-    for (let i = 0; i < maxIteration; i++) {
-      const researcherPrompt = getResearcherPrompt(
-        availableActionsDescription,
-        input.config.mode,
-        i,
-        maxIteration,
-      );
-
-      const actionStream = input.config.llm.streamObject<
-        z.infer<typeof schema>
-      >({
-        messages: [
-          {
-            role: 'system',
-            content: researcherPrompt,
-          },
+    const agentMessageHistory: Message[] = [
       {
         role: 'user',
         content: `
@@ -76,14 +52,28 @@ class Researcher {
 ${formatChatHistoryAsString(input.chatHistory.slice(-10))}
 User: ${input.followUp} (Standalone question: ${input.classification.standaloneFollowUp})
 </conversation>
-
-<previous_actions>
-${findings}
-</previous_actions>
 `,
       },
+    ];
+
+    for (let i = 0; i < maxIteration; i++) {
+      const researcherPrompt = getResearcherPrompt(
+        availableActionsDescription,
+        input.config.mode,
+        i,
+        maxIteration,
+        input.config.fileIds,
+      );
+
+      const actionStream = input.config.llm.streamText({
+        messages: [
+          {
+            role: 'system',
+            content: researcherPrompt,
+          },
+          ...agentMessageHistory,
         ],
-        schema,
+        tools: availableTools,
       });
 
       const block = session.getBlock(researchBlockId);
@@ -91,22 +81,26 @@ class Researcher {
       let reasoningEmitted = false;
       let reasoningId = crypto.randomUUID();
-      let finalActionRes: any;
+      let finalToolCalls: ToolCall[] = [];
 
       for await (const partialRes of actionStream) {
-        try {
+        if (partialRes.toolCallChunk.length > 0) {
+          partialRes.toolCallChunk.forEach((tc) => {
             if (
-              partialRes.reasoning &&
+              tc.name === '__reasoning_preamble' &&
+              tc.arguments['plan'] &&
               !reasoningEmitted &&
               block &&
               block.type === 'research'
             ) {
               reasoningEmitted = true;
 
               block.data.subSteps.push({
                 id: reasoningId,
                 type: 'reasoning',
-                reasoning: partialRes.reasoning,
+                reasoning: tc.arguments['plan'],
               });
 
               session.updateBlock(researchBlockId, [
                 {
                   op: 'replace',
@@ -115,7 +109,8 @@ class Researcher {
                 },
               ]);
             } else if (
-              partialRes.reasoning &&
+              tc.name === '__reasoning_preamble' &&
+              tc.arguments['plan'] &&
               reasoningEmitted &&
               block &&
               block.type === 'research'
@@ -123,11 +118,12 @@ class Researcher {
               const subStepIndex = block.data.subSteps.findIndex(
                 (step: any) => step.id === reasoningId,
               );
 
               if (subStepIndex !== -1) {
                 const subStep = block.data.subSteps[
                   subStepIndex
                 ] as ReasoningResearchBlock;
-                subStep.reasoning = partialRes.reasoning;
+                subStep.reasoning = tc.arguments['plan'];
 
                 session.updateBlock(researchBlockId, [
                   {
                     op: 'replace',
@@ -138,92 +134,87 @@ class Researcher {
               }
             }
 
-          finalActionRes = partialRes;
-        } catch (e) {
-          // nothing
-        }
+            const existingIndex = finalToolCalls.findIndex(
+              (ftc) => ftc.id === tc.id,
+            );
+
+            if (existingIndex !== -1) {
+              finalToolCalls[existingIndex].arguments = tc.arguments;
+            } else {
+              finalToolCalls.push(tc);
+            }
+          });
+        }
       }
 
-      if (finalActionRes.action.type === 'done') {
+      if (finalToolCalls.length === 0) {
         break;
       }
 
-      const actionConfig: ActionConfig = {
-        type: finalActionRes.action.type as string,
-        params: finalActionRes.action,
-      };
-
-      const queries = actionConfig.params.queries || [];
-
-      if (block && block.type === 'research') {
-        block.data.subSteps.push({
-          id: crypto.randomUUID(),
-          type: 'searching',
-          searching: queries,
-        });
-
-        session.updateBlock(researchBlockId, [
-          { op: 'replace', path: '/data/subSteps', value: block.data.subSteps },
-        ]);
+      if (finalToolCalls[finalToolCalls.length - 1].name === 'done') {
+        break;
       }
 
-      findings += `\n---\nIteration ${i + 1}:\n`;
-      findings += 'Reasoning: ' + finalActionRes.reasoning + '\n';
-      findings += `Executing Action: ${actionConfig.type} with params ${JSON.stringify(actionConfig.params)}\n`;
+      agentMessageHistory.push({
+        role: 'assistant',
+        content: '',
+        tool_calls: finalToolCalls,
+      });
 
-      const actionResult = await ActionRegistry.execute(
-        actionConfig.type,
-        actionConfig.params,
-        {
-          llm: input.config.llm,
-          embedding: input.config.embedding,
-          session: session,
-        },
-      );
-
-      actionOutput.push(actionResult);
+      const actionResults = await ActionRegistry.executeAll(finalToolCalls, {
+        llm: input.config.llm,
+        embedding: input.config.embedding,
+        session: session,
+        researchBlockId: researchBlockId,
+        fileIds: input.config.fileIds,
+      });
 
-      if (actionResult.type === 'search_results') {
-        if (block && block.type === 'research') {
-          block.data.subSteps.push({
-            id: crypto.randomUUID(),
-            type: 'reading',
-            reading: actionResult.results,
-          });
-
-          session.updateBlock(researchBlockId, [
-            {
-              op: 'replace',
-              path: '/data/subSteps',
-              value: block.data.subSteps,
-            },
-          ]);
-        }
-
-        findings += actionResult.results
-          .map(
-            (r) =>
-              `Title: ${r.metadata.title}\nURL: ${r.metadata.url}\nContent: ${r.content}\n`,
-          )
-          .join('\n');
-
-        findings += '\n---------\n';
-      }
+      actionOutput.push(...actionResults);
+
+      actionResults.forEach((action, i) => {
+        agentMessageHistory.push({
+          role: 'tool',
+          id: finalToolCalls[i].id,
+          name: finalToolCalls[i].name,
+          content: JSON.stringify(action),
+        });
+      });
     }
 
-    const searchResults = actionOutput.filter(
-      (a) => a.type === 'search_results',
-    );
-
-    session.emit('data', {
-      type: 'sources',
-      data: searchResults
-        .flatMap((a) => a.results)
-        .map((r) => ({
-          content: r.content,
-          metadata: r.metadata,
-        })),
+    const searchResults = actionOutput
+      .filter((a) => a.type === 'search_results')
+      .flatMap((a) => a.results);
+
+    const seenUrls = new Map<string, number>();
+
+    const filteredSearchResults = searchResults
+      .map((result, index) => {
+        if (result.metadata.url && !seenUrls.has(result.metadata.url)) {
+          seenUrls.set(result.metadata.url, index);
+          return result;
+        } else if (result.metadata.url && seenUrls.has(result.metadata.url)) {
+          const existingIndex = seenUrls.get(result.metadata.url)!;
+          const existingResult = searchResults[existingIndex];
+          existingResult.content += `\n\n${result.content}`;
+          return undefined;
+        }
+
+        return result;
+      })
+      .filter((r) => r !== undefined);
+
+    session.emitBlock({
+      id: crypto.randomUUID(),
+      type: 'source',
+      data: filteredSearchResults,
    });
 
     return {
       findings: actionOutput,
+      searchFindings: filteredSearchResults,
     };
   }
 }
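The researcher now keeps a real tool-calling transcript instead of a findings string: each iteration appends one assistant turn carrying the accumulated tool calls, then one tool turn per result, so the next streamText call sees the whole trajectory. A sketch of what agentMessageHistory might hold after a single web_search round, assuming the { id, name, arguments } ToolCall shape used above (ids and arguments are illustrative):

const exampleHistory: Message[] = [
  { role: 'user', content: '<conversation>...</conversation>' },
  {
    role: 'assistant',
    content: '',
    tool_calls: [
      { id: 'call_1', name: 'web_search', arguments: { queries: ['tesla news'] } },
    ],
  },
  {
    role: 'tool',
    id: 'call_1',
    name: 'web_search',
    content: JSON.stringify({ type: 'search_results', results: [] }),
  },
];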

View File

@@ -8,37 +8,32 @@ export type SearchSources = 'web' | 'discussions' | 'academic';
 export type SearchAgentConfig = {
   sources: SearchSources[];
+  fileIds: string[];
   llm: BaseLLM<any>;
   embedding: BaseEmbedding<any>;
   mode: 'speed' | 'balanced' | 'quality';
-  systemInstructions: string;
 };
 
 export type SearchAgentInput = {
   chatHistory: ChatTurnMessage[];
   followUp: string;
   config: SearchAgentConfig;
+  chatId: string;
+  messageId: string;
 };
 
-export interface Intent {
-  name: string;
-  description: string;
-  requiresSearch: boolean;
-  enabled: (config: { sources: SearchSources[] }) => boolean;
-}
-
-export type Widget<TSchema extends z.ZodObject<any> = z.ZodObject<any>> = {
-  name: string;
-  description: string;
-  schema: TSchema;
-  execute: (
-    params: z.infer<TSchema>,
-    additionalConfig: AdditionalConfig,
-  ) => Promise<WidgetOutput>;
+export type WidgetInput = {
+  chatHistory: ChatTurnMessage[];
+  followUp: string;
+  classification: ClassifierOutput;
+  llm: BaseLLM<any>;
 };
 
-export type WidgetConfig = {
+export type Widget = {
   type: string;
-  params: Record<string, any>;
+  shouldExecute: (classification: ClassifierOutput) => boolean;
+  execute: (input: WidgetInput) => Promise<WidgetOutput | void>;
 };
 
 export type WidgetOutput = {
@@ -62,6 +57,7 @@ export type ClassifierOutput = {
     discussionSearch: boolean;
     showWeatherWidget: boolean;
     showStockWidget: boolean;
+    showCalculationWidget: boolean;
   };
   standaloneFollowUp: string;
 };
@@ -81,6 +77,7 @@ export type ResearcherInput = {
 export type ResearcherOutput = {
   findings: ActionOutput[];
+  searchFindings: Chunk[];
 };
 
 export type SearchActionOutput = {
@@ -92,22 +89,34 @@ export type DoneActionOutput = {
   type: 'done';
 };
 
-export type ActionOutput = SearchActionOutput | DoneActionOutput;
+export type ReasoningResearchAction = {
+  type: 'reasoning';
+  reasoning: string;
+};
+
+export type ActionOutput =
+  | SearchActionOutput
+  | DoneActionOutput
+  | ReasoningResearchAction;
 
 export interface ResearchAction<
   TSchema extends z.ZodObject<any> = z.ZodObject<any>,
 > {
   name: string;
-  description: string;
   schema: z.ZodObject<any>;
-  enabled: (config: { classification: ClassifierOutput }) => boolean;
+  getToolDescription: (config: { mode: SearchAgentConfig['mode'] }) => string;
+  getDescription: (config: { mode: SearchAgentConfig['mode'] }) => string;
+  enabled: (config: {
+    classification: ClassifierOutput;
+    fileIds: string[];
+    mode: SearchAgentConfig['mode'];
+    sources: SearchSources[];
+  }) => boolean;
   execute: (
     params: z.infer<TSchema>,
-    additionalConfig: AdditionalConfig,
+    additionalConfig: AdditionalConfig & {
+      researchBlockId: string;
+      fileIds: string[];
+    },
   ) => Promise<ActionOutput>;
 }
-
-export type ActionConfig = {
-  type: string;
-  params: Record<string, any>;
-};
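Under the new contract a widget no longer exposes a Zod schema to the planner; it gates itself on the classifier output and receives the raw conversation. A hypothetical widget against this interface (unitConverterWidget, its gate, and its output fields are illustrative, assuming the llmContext/data shape the shipped widgets return):

const unitConverterWidget: Widget = {
  type: 'unitConverterWidget',
  // Gate purely on classifier flags, as the shipped widgets do (hypothetical flag here).
  shouldExecute: (classification) =>
    classification.classification.skipSearch === false,
  execute: async (input) => {
    // Returning void tells the executor there is nothing to render.
    if (!input.followUp.toLowerCase().includes('convert')) return;

    return {
      type: 'unit_conversion_result', // hypothetical output type
      llmContext: '10 km is approximately 6.21 miles.',
      data: { from: '10 km', to: '6.21 mi' },
    };
  },
};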

View File

@@ -1,66 +1,70 @@
 import z from 'zod';
 import { Widget } from '../types';
-import { evaluate as mathEval } from 'mathjs';
+import formatChatHistoryAsString from '@/lib/utils/formatHistory';
+import { exp, evaluate as mathEval } from 'mathjs';
 
 const schema = z.object({
-  type: z.literal('calculation'),
   expression: z
     .string()
-    .describe(
-      "A valid mathematical expression to be evaluated (e.g., '2 + 2', '3 * (4 + 5)').",
-    ),
+    .describe('Mathematical expression to calculate or evaluate.'),
+  notPresent: z
+    .boolean()
+    .describe('Whether there is any need for the calculation widget.'),
 });
 
-const calculationWidget: Widget<typeof schema> = {
-  name: 'calculation',
-  description: `Performs mathematical calculations and evaluates mathematical expressions. Supports arithmetic operations, algebraic equations, functions, and complex mathematical computations.
-
-**What it provides:**
-- Evaluates mathematical expressions and returns computed results
-- Handles basic arithmetic (+, -, *, /)
-- Supports functions (sqrt, sin, cos, log, etc.)
-- Can process complex expressions with parentheses and order of operations
-
-**When to use:**
-- User asks to calculate, compute, or evaluate a mathematical expression
-- Questions like "what is X", "calculate Y", "how much is Z" where X/Y/Z are math expressions
-- Any request involving numbers and mathematical operations
-
-**Example call:**
-{
-  "type": "calculation",
-  "expression": "25% of 480"
-}
-
-{
-  "type": "calculation",
-  "expression": "sqrt(144) + 5 * 2"
-}
-
-**Important:** The expression must be valid mathematical syntax that can be evaluated by mathjs. Format percentages as "0.25 * 480" or "25% of 480". Do not include currency symbols, units, or non-mathematical text in the expression.`,
-  schema: schema,
-  execute: async (params, _) => {
-    try {
-      const result = mathEval(params.expression);
-      return {
-        type: 'calculation_result',
-        llmContext: `The result of the expression "${params.expression}" is ${result}.`,
-        data: {
-          expression: params.expression,
-          result: result,
-        },
-      };
-    } catch (error) {
-      return {
-        type: 'calculation_result',
-        llmContext: 'Failed to evaluate mathematical expression.',
-        data: {
-          expression: params.expression,
-          result: `Error evaluating expression: ${error}`,
-        },
-      };
-    }
+const system = `
+<role>
+Assistant is a calculation expression extractor. You will recieve a user follow up and a conversation history.
+Your task is to determine if there is a mathematical expression that needs to be calculated or evaluated. If there is, extract the expression and return it. If there is no need for any calculation, set notPresent to true.
+</role>
+
+<instructions>
+Make sure that the extracted expression is valid and can be used to calculate the result with Math JS library (https://mathjs.org/). If the expression is not valid, set notPresent to true.
+If you feel like you cannot extract a valid expression, set notPresent to true.
+</instructions>
+
+<output_format>
+You must respond in the following JSON format without any extra text, explanations or filler sentences:
+{
+  "expression": string,
+  "notPresent": boolean
+}
+</output_format>
+`;
+
+const calculationWidget: Widget = {
+  type: 'calculationWidget',
+  shouldExecute: (classification) =>
+    classification.classification.showCalculationWidget,
+  execute: async (input) => {
+    const output = await input.llm.generateObject<typeof schema>({
+      messages: [
+        {
+          role: 'system',
+          content: system,
+        },
+        {
+          role: 'user',
+          content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_follow_up>\n${input.followUp}\n</user_follow_up>`,
+        },
+      ],
+      schema,
+    });
+
+    if (output.notPresent) {
+      return;
+    }
+
+    const result = mathEval(output.expression);
+
+    return {
+      type: 'calculation_result',
+      llmContext: `The result of the calculation for the expression "${output.expression}" is: ${result}`,
+      data: {
+        expression: output.expression,
+        result,
+      },
+    };
   },
 };

View File

@@ -0,0 +1,36 @@
import { Widget, WidgetInput, WidgetOutput } from '../types';
class WidgetExecutor {
static widgets = new Map<string, Widget>();
static register(widget: Widget) {
this.widgets.set(widget.type, widget);
}
static getWidget(type: string): Widget | undefined {
return this.widgets.get(type);
}
static async executeAll(input: WidgetInput): Promise<WidgetOutput[]> {
const results: WidgetOutput[] = [];
await Promise.all(
Array.from(this.widgets.values()).map(async (widget) => {
try {
if (widget.shouldExecute(input.classification)) {
const output = await widget.execute(input);
if (output) {
results.push(output);
}
}
} catch (e) {
console.log(`Error executing widget ${widget.type}:`, e);
}
}),
);
return results;
}
}
export default WidgetExecutor;
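A sketch of how a caller would fan out every registered widget with this executor (the surrounding chatHistory, followUp, classification, and llm variables are assumed from the calling context):

const widgetOutputs = await WidgetExecutor.executeAll({
  chatHistory,
  followUp,
  classification,
  llm,
});
// Widgets whose shouldExecute returned false, that returned void,
// or that threw (and were logged) are simply absent from the array.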

View File

@@ -1,10 +1,10 @@
 import calculationWidget from './calculationWidget';
-import WidgetRegistry from './registry';
+import WidgetExecutor from './executor';
 import weatherWidget from './weatherWidget';
 import stockWidget from './stockWidget';
 
-WidgetRegistry.register(weatherWidget);
-WidgetRegistry.register(calculationWidget);
-WidgetRegistry.register(stockWidget);
+WidgetExecutor.register(weatherWidget);
+WidgetExecutor.register(calculationWidget);
+WidgetExecutor.register(stockWidget);
 
-export { WidgetRegistry };
+export { WidgetExecutor };

View File

@@ -1,65 +0,0 @@
import {
AdditionalConfig,
SearchAgentConfig,
Widget,
WidgetConfig,
WidgetOutput,
} from '../types';
class WidgetRegistry {
private static widgets = new Map<string, Widget>();
static register(widget: Widget<any>) {
this.widgets.set(widget.name, widget);
}
static get(name: string): Widget | undefined {
return this.widgets.get(name);
}
static getAll(): Widget[] {
return Array.from(this.widgets.values());
}
static getDescriptions(): string {
return Array.from(this.widgets.values())
.map((widget) => `${widget.name}: ${widget.description}`)
.join('\n\n');
}
static async execute(
name: string,
params: any,
config: AdditionalConfig,
): Promise<WidgetOutput> {
const widget = this.get(name);
if (!widget) {
throw new Error(`Widget with name ${name} not found`);
}
return widget.execute(params, config);
}
static async executeAll(
widgets: WidgetConfig[],
additionalConfig: AdditionalConfig,
): Promise<WidgetOutput[]> {
const results: WidgetOutput[] = [];
await Promise.all(
widgets.map(async (widgetConfig) => {
const output = await this.execute(
widgetConfig.type,
widgetConfig.params,
additionalConfig,
);
results.push(output);
}),
);
return results;
}
}
export default WidgetRegistry;

View File

@@ -1,13 +1,13 @@
 import z from 'zod';
 import { Widget } from '../types';
 import YahooFinance from 'yahoo-finance2';
+import formatChatHistoryAsString from '@/lib/utils/formatHistory';
 
 const yf = new YahooFinance({
   suppressNotices: ['yahooSurvey'],
 });
 
 const schema = z.object({
-  type: z.literal('stock'),
   name: z
     .string()
     .describe(
@@ -19,60 +19,59 @@ const schema = z.object({
     .describe(
       "Optional array of up to 3 stock names to compare against the base name (e.g., ['Microsoft', 'GOOGL', 'Meta']). Charts will show percentage change comparison.",
     ),
+  notPresent: z
+    .boolean()
+    .describe('Whether there is no need for the stock widget.'),
 });
 
-const stockWidget: Widget<typeof schema> = {
-  name: 'stock',
-  description: `Provides comprehensive real-time stock market data and financial information for any publicly traded company. Returns detailed quote data, market status, trading metrics, and company fundamentals.
-
-You can set skipSearch to true if the stock widget can fully answer the user's query without needing additional web search.
-
-**What it provides:**
-- **Real-time Price Data**: Current price, previous close, open price, day's range (high/low)
-- **Market Status**: Whether market is currently open or closed, trading sessions
-- **Trading Metrics**: Volume, average volume, bid/ask prices and sizes
-- **Performance**: Price changes (absolute and percentage), 52-week high/low range
-- **Valuation**: Market capitalization, P/E ratio, earnings per share (EPS)
-- **Dividends**: Dividend rate, dividend yield, ex-dividend date
-- **Company Info**: Full company name, exchange, currency, sector/industry (when available)
-- **Advanced Metrics**: Beta, trailing/forward P/E, book value, price-to-book ratio
-- **Charts Data**: Historical price movements for visualization
-- **Comparison**: Compare up to 3 stocks side-by-side with percentage-based performance visualization
-
-**When to use:**
-- User asks about a stock price ("What's AAPL stock price?", "How is Tesla doing?")
-- Questions about company market performance ("Is Microsoft up or down today?")
-- Requests for stock market data, trading info, or company valuation
-- Queries about dividends, P/E ratio, market cap, or other financial metrics
-- Any stock/equity-related question for a specific company
-- Stock comparisons ("Compare AAPL vs MSFT", "How is TSLA doing vs RIVN and LCID?")
-
-**Example calls:**
-{
-  "type": "stock",
-  "name": "AAPL"
-}
-
-{
-  "type": "stock",
-  "name": "TSLA",
-  "comparisonNames": ["RIVN", "LCID"]
-}
-
-{
-  "type": "stock",
-  "name": "Google",
-  "comparisonNames": ["Microsoft", "Meta", "Amazon"]
-}
-
-**Important:**
-- You can use both tickers and names (prefer name when you're not aware of the ticker).
-- For companies with multiple share classes, use the most common one.
-- The widget works for stocks listed on major exchanges (NYSE, NASDAQ, etc.)
-- Returns comprehensive data; the UI will display relevant metrics based on availability
-- Market data may be delayed by 15-20 minutes for free data sources during trading hours`,
-  schema: schema,
-  execute: async (params, _) => {
+const systemPrompt = `
+<role>
+You are a stock ticker/name extractor. You will receive a user follow up and a conversation history.
+Your task is to determine if the user is asking about stock information and extract the stock name(s) they want data for.
+</role>
+
+<instructions>
+- If the user is asking about a stock, extract the primary stock name or ticker.
+- If the user wants to compare stocks, extract up to 3 comparison stock names in comparisonNames.
+- You can use either stock names (e.g., "Nvidia", "Apple") or tickers (e.g., "NVDA", "AAPL").
+- If you cannot determine a valid stock or the query is not stock-related, set notPresent to true.
+- If no comparison is needed, set comparisonNames to an empty array.
+</instructions>
+
+<output_format>
+You must respond in the following JSON format without any extra text, explanations or filler sentences:
+{
+  "name": string,
+  "comparisonNames": string[],
+  "notPresent": boolean
+}
+</output_format>
+`;
+
+const stockWidget: Widget = {
+  type: 'stockWidget',
+  shouldExecute: (classification) =>
+    classification.classification.showStockWidget,
+  execute: async (input) => {
+    const output = await input.llm.generateObject<typeof schema>({
+      messages: [
+        {
+          role: 'system',
+          content: systemPrompt,
+        },
+        {
+          role: 'user',
+          content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_follow_up>\n${input.followUp}\n</user_follow_up>`,
+        },
+      ],
+      schema,
+    });
+
+    if (output.notPresent) {
+      return;
+    }
+
+    const params = output;
+
     try {
       const name = params.name;

View File

@@ -1,8 +1,8 @@
 import z from 'zod';
 import { Widget } from '../types';
+import formatChatHistoryAsString from '@/lib/utils/formatHistory';
 
-const WeatherWidgetSchema = z.object({
-  type: z.literal('weather'),
+const schema = z.object({
   location: z
     .string()
     .describe(
@@ -18,38 +18,63 @@ const WeatherWidgetSchema = z.object({
     .describe(
       'Longitude coordinate in decimal degrees (e.g., -74.0060). Only use when location name is empty.',
     ),
+  notPresent: z
+    .boolean()
+    .describe('Whether there is no need for the weather widget.'),
 });
 
-const weatherWidget: Widget<typeof WeatherWidgetSchema> = {
-  name: 'weather',
-  description: `Provides comprehensive current weather information and forecasts for any location worldwide. Returns real-time weather data including temperature, conditions, humidity, wind, and multi-day forecasts.
-
-You can set skipSearch to true if the weather widget can fully answer the user's query without needing additional web search.
-
-**What it provides:**
-- Current weather conditions (temperature, feels-like, humidity, precipitation)
-- Wind speed, direction, and gusts
-- Weather codes/conditions (clear, cloudy, rainy, etc.)
-- Hourly forecast for next 24 hours
-- Daily forecast for next 7 days (high/low temps, precipitation probability)
-- Timezone information
-
-**When to use:**
-- User asks about weather in a location ("weather in X", "is it raining in Y")
-- Questions about temperature, conditions, or forecast
-- Any weather-related query for a specific place
-
-**Example call:**
-{
-  "type": "weather",
-  "location": "San Francisco, CA, USA",
-  "lat": 0,
-  "lon": 0
-}
-
-**Important:** Provide EITHER a location name OR latitude/longitude coordinates, never both. If using location name, set lat/lon to 0. Location should be specific (city, state/region, country) for best results.`,
-  schema: WeatherWidgetSchema,
-  execute: async (params, _) => {
+const systemPrompt = `
+<role>
+You are a location extractor for weather queries. You will receive a user follow up and a conversation history.
+Your task is to determine if the user is asking about weather and extract the location they want weather for.
+</role>
+
+<instructions>
+- If the user is asking about weather, extract the location name OR coordinates (never both).
+- If using location name, set lat and lon to 0.
+- If using coordinates, set location to empty string.
+- If you cannot determine a valid location or the query is not weather-related, set notPresent to true.
+- Location should be specific (city, state/region, country) for best results.
+- You have to give the location so that it can be used to fetch weather data, it cannot be left empty unless notPresent is true.
+- Make sure to infer short forms of location names (e.g., "NYC" -> "New York City", "LA" -> "Los Angeles").
+</instructions>
+
+<output_format>
+You must respond in the following JSON format without any extra text, explanations or filler sentences:
+{
+  "location": string,
+  "lat": number,
+  "lon": number,
+  "notPresent": boolean
+}
+</output_format>
+`;
+
+const weatherWidget: Widget = {
+  type: 'weatherWidget',
+  shouldExecute: (classification) =>
+    classification.classification.showWeatherWidget,
+  execute: async (input) => {
+    const output = await input.llm.generateObject<typeof schema>({
+      messages: [
+        {
+          role: 'system',
+          content: systemPrompt,
+        },
+        {
+          role: 'user',
+          content: `<conversation_history>\n${formatChatHistoryAsString(input.chatHistory)}\n</conversation_history>\n<user_follow_up>\n${input.followUp}\n</user_follow_up>`,
+        },
+      ],
+      schema,
+    });
+
+    if (output.notPresent) {
+      return;
+    }
+
+    const params = output;
+
     try {
       if (
         params.location === '' &&

View File

@@ -3,7 +3,6 @@ import { suggestionGeneratorPrompt } from '@/lib/prompts/suggestions';
 import { ChatTurnMessage } from '@/lib/types';
 import z from 'zod';
 import BaseLLM from '@/lib/models/base/llm';
-import { i } from 'mathjs';
 
 type SuggestionGeneratorInput = {
   chatHistory: ChatTurnMessage[];
@@ -19,7 +18,7 @@ const generateSuggestions = async (
   input: SuggestionGeneratorInput,
   llm: BaseLLM<any>,
 ) => {
-  const res = await llm.generateObject<z.infer<typeof schema>>({
+  const res = await llm.generateObject<typeof schema>({
     messages: [
       {
         role: 'system',

View File

@@ -17,3 +17,13 @@ export const getShowWeatherWidget = () =>
 export const getShowNewsWidget = () =>
   getClientConfig('showNewsWidget', 'true') === 'true';
+
+export const getMeasurementUnit = () => {
+  const value =
+    getClientConfig('measureUnit') ??
+    getClientConfig('measurementUnit', 'metric');
+
+  if (typeof value !== 'string') return 'metric';
+
+  return value.toLowerCase();
+};
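getMeasurementUnit reads the newer measureUnit key first and falls back to the legacy measurementUnit key, defaulting to 'metric' and normalising to lowercase. A usage sketch ('imperial' as the alternative value is an assumption, not shown in the diff):

const unit = getMeasurementUnit(); // e.g. 'metric' or 'imperial'
const tempLabel = unit === 'metric' ? '°C' : '°F';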

View File

@@ -45,6 +45,7 @@ fs.readdirSync(migrationsFolder)
     const already = db
       .prepare('SELECT 1 FROM ran_migrations WHERE name = ?')
       .get(migrationName);
+
     if (already) {
       console.log(`Skipping already-applied migration: ${file}`);
       return;
@@ -113,6 +114,160 @@ fs.readdirSync(migrationsFolder)
       db.exec('DROP TABLE messages;');
       db.exec('ALTER TABLE messages_with_sources RENAME TO messages;');
+    } else if (migrationName === '0002') {
+      /* Migrate chat */
+      db.exec(`
+        CREATE TABLE IF NOT EXISTS chats_new (
+          id TEXT PRIMARY KEY,
+          title TEXT NOT NULL,
+          createdAt TEXT NOT NULL,
+          sources TEXT DEFAULT '[]',
+          files TEXT DEFAULT '[]'
+        );
+      `);
+
+      const chats = db
+        .prepare('SELECT id, title, createdAt, files FROM chats')
+        .all();
+
+      const insertChat = db.prepare(`
+        INSERT INTO chats_new (id, title, createdAt, sources, files)
+        VALUES (?, ?, ?, ?, ?)
+      `);
+
+      chats.forEach((chat: any) => {
+        let files = chat.files;
+        while (typeof files === 'string') {
+          files = JSON.parse(files || '[]');
+        }
+
+        insertChat.run(
+          chat.id,
+          chat.title,
+          chat.createdAt,
+          '["web"]',
+          JSON.stringify(files),
+        );
+      });
+
+      db.exec('DROP TABLE chats;');
+      db.exec('ALTER TABLE chats_new RENAME TO chats;');
+
+      /* Migrate messages */
+      db.exec(`
+        CREATE TABLE IF NOT EXISTS messages_new (
+          id INTEGER PRIMARY KEY,
+          messageId TEXT NOT NULL,
+          chatId TEXT NOT NULL,
+          backendId TEXT NOT NULL,
+          query TEXT NOT NULL,
+          createdAt TEXT NOT NULL,
+          responseBlocks TEXT DEFAULT '[]',
+          status TEXT DEFAULT 'answering'
+        );
+      `);
+
+      const messages = db
+        .prepare(
+          'SELECT id, messageId, chatId, type, content, createdAt, sources FROM messages ORDER BY id ASC',
+        )
+        .all();
+
+      const insertMessage = db.prepare(`
+        INSERT INTO messages_new (messageId, chatId, backendId, query, createdAt, responseBlocks, status)
+        VALUES (?, ?, ?, ?, ?, ?, ?)
+      `);
+
+      let currentMessageData: {
+        sources?: any[];
+        response?: string;
+        query?: string;
+        messageId?: string;
+        chatId?: string;
+        createdAt?: string;
+      } = {};
+      let lastCompleted = true;
+
+      messages.forEach((msg: any) => {
+        if (msg.type === 'user' && lastCompleted) {
+          currentMessageData = {};
+          currentMessageData.messageId = msg.messageId;
+          currentMessageData.chatId = msg.chatId;
+          currentMessageData.query = msg.content;
+          currentMessageData.createdAt = msg.createdAt;
+          lastCompleted = false;
+        } else if (msg.type === 'source' && !lastCompleted) {
+          let sources = msg.sources;
+          while (typeof sources === 'string') {
+            sources = JSON.parse(sources || '[]');
+          }
+          currentMessageData.sources = sources;
+        } else if (msg.type === 'assistant' && !lastCompleted) {
+          currentMessageData.response = msg.content;
+
+          insertMessage.run(
+            currentMessageData.messageId,
+            currentMessageData.chatId,
+            `${currentMessageData.messageId}-backend`,
+            currentMessageData.query,
+            currentMessageData.createdAt,
+            JSON.stringify([
+              {
+                id: crypto.randomUUID(),
+                type: 'text',
+                data: currentMessageData.response || '',
+              },
+              ...(currentMessageData.sources &&
+              currentMessageData.sources.length > 0
+                ? [
+                    {
+                      id: crypto.randomUUID(),
+                      type: 'source',
+                      data: currentMessageData.sources,
+                    },
+                  ]
+                : []),
+            ]),
+            'completed',
+          );
+
+          lastCompleted = true;
+        } else if (msg.type === 'user' && !lastCompleted) {
+          /* Message wasn't completed so we'll just create the record with empty response */
+          insertMessage.run(
+            currentMessageData.messageId,
+            currentMessageData.chatId,
+            `${currentMessageData.messageId}-backend`,
+            currentMessageData.query,
+            currentMessageData.createdAt,
+            JSON.stringify([
+              {
+                id: crypto.randomUUID(),
+                type: 'text',
+                data: '',
+              },
+              ...(currentMessageData.sources &&
+              currentMessageData.sources.length > 0
+                ? [
+                    {
+                      id: crypto.randomUUID(),
+                      type: 'source',
+                      data: currentMessageData.sources,
+                    },
+                  ]
+                : []),
+            ]),
+            'completed',
+          );
+
+          lastCompleted = true;
+        }
+      });
+
+      db.exec('DROP TABLE messages;');
+      db.exec('ALTER TABLE messages_new RENAME TO messages;');
     } else {
       // Execute each statement separately
       statements.forEach((stmt) => {
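The 0002 migration collapses each legacy user/source/assistant triple into one row whose responseBlocks column holds an ordered JSON array. For a completed exchange the stored value looks roughly like this (ids are random UUIDs; the content is illustrative):

// Illustrative responseBlocks payload for one migrated message.
const responseBlocks = [
  { id: '9b2f0c1e-...', type: 'text', data: 'Tesla reported...' },
  {
    id: '4c1a7d2b-...',
    type: 'source',
    data: [{ content: '...', metadata: { title: '...', url: 'https://example.com' } }],
  },
];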

View File

@@ -1,6 +1,7 @@
 import { sql } from 'drizzle-orm';
 import { text, integer, sqliteTable } from 'drizzle-orm/sqlite-core';
 import { Block } from '../types';
+import { SearchSources } from '../agents/search/types';
 
 export const messages = sqliteTable('messages', {
   id: integer('id').primaryKey(),
@@ -26,7 +27,11 @@ export const chats = sqliteTable('chats', {
   id: text('id').primaryKey(),
   title: text('title').notNull(),
   createdAt: text('createdAt').notNull(),
-  focusMode: text('focusMode').notNull(),
+  sources: text('sources', {
+    mode: 'json',
+  })
+    .$type<SearchSources[]>()
+    .default(sql`'[]'`),
   files: text('files', { mode: 'json' })
     .$type<DBFile[]>()
     .default(sql`'[]'`),
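With focusMode replaced by a JSON sources column, new chats persist their selected source list directly. A sketch of an insert against the updated table, assuming a drizzle db handle from elsewhere in the app:

await db.insert(chats).values({
  id: crypto.randomUUID(),
  title: 'New chat',
  createdAt: new Date().toISOString(),
  sources: ['web'], // typed as SearchSources[]
  files: [],
});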

View File

@@ -34,7 +34,7 @@ type ChatContext = {
   chatHistory: [string, string][];
   files: File[];
   fileIds: string[];
-  focusMode: string;
+  sources: string[];
   chatId: string | undefined;
   optimizationMode: string;
   isMessagesLoaded: boolean;
@@ -48,7 +48,7 @@ type ChatContext = {
   researchEnded: boolean;
   setResearchEnded: (ended: boolean) => void;
   setOptimizationMode: (mode: string) => void;
-  setFocusMode: (mode: string) => void;
+  setSources: (sources: string[]) => void;
   setFiles: (files: File[]) => void;
   setFileIds: (fileIds: string[]) => void;
   sendMessage: (
@@ -175,8 +175,8 @@ const loadMessages = async (
   chatId: string,
   setMessages: (messages: Message[]) => void,
   setIsMessagesLoaded: (loaded: boolean) => void,
-  setChatHistory: (history: [string, string][]) => void,
-  setFocusMode: (mode: string) => void,
+  chatHistory: React.MutableRefObject<[string, string][]>,
+  setSources: (sources: string[]) => void,
   setNotFound: (notFound: boolean) => void,
   setFiles: (files: File[]) => void,
   setFileIds: (fileIds: string[]) => void,
@@ -233,8 +233,8 @@ const loadMessages = async (
   setFiles(files);
   setFileIds(files.map((file: File) => file.fileId));
 
-  setChatHistory(history);
-  setFocusMode(data.chat.focusMode);
+  chatHistory.current = history;
+  setSources(data.chat.sources);
   setIsMessagesLoaded(true);
 };
@@ -243,7 +243,7 @@ export const chatContext = createContext<ChatContext>({
   chatId: '',
   fileIds: [],
   files: [],
-  focusMode: '',
+  sources: [],
   hasError: false,
   isMessagesLoaded: false,
   isReady: false,
@@ -260,7 +260,7 @@ export const chatContext = createContext<ChatContext>({
   sendMessage: async () => {},
   setFileIds: () => {},
   setFiles: () => {},
-  setFocusMode: () => {},
+  setSources: () => {},
   setOptimizationMode: () => {},
   setChatModelProvider: () => {},
   setEmbeddingModelProvider: () => {},
@@ -269,6 +269,7 @@ export const chatContext = createContext<ChatContext>({
 export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
   const params: { chatId: string } = useParams();
   const searchParams = useSearchParams();
   const initialMessage = searchParams.get('q');
@@ -280,13 +281,13 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
   const [researchEnded, setResearchEnded] = useState(false);
 
-  const [chatHistory, setChatHistory] = useState<[string, string][]>([]);
+  const chatHistory = useRef<[string, string][]>([]);
 
   const [messages, setMessages] = useState<Message[]>([]);
 
   const [files, setFiles] = useState<File[]>([]);
   const [fileIds, setFileIds] = useState<string[]>([]);
 
-  const [focusMode, setFocusMode] = useState('webSearch');
+  const [sources, setSources] = useState<string[]>(['web']);
   const [optimizationMode, setOptimizationMode] = useState('speed');
 
   const [isMessagesLoaded, setIsMessagesLoaded] = useState(false);
@@ -401,6 +402,64 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
     });
   }, [messages]);
 
+  const isReconnectingRef = useRef(false);
+  const handledMessageEndRef = useRef<Set<string>>(new Set());
+
+  const checkReconnect = async () => {
+    if (isReconnectingRef.current) return;
+
+    setIsReady(true);
+    console.debug(new Date(), 'app:ready');
+
+    if (messages.length > 0) {
+      const lastMsg = messages[messages.length - 1];
+
+      if (lastMsg.status === 'answering') {
+        setLoading(true);
+        setResearchEnded(false);
+        setMessageAppeared(false);
+
+        isReconnectingRef.current = true;
+
+        const res = await fetch(`/api/reconnect/${lastMsg.backendId}`, {
+          method: 'POST',
+        });
+
+        if (!res.body) throw new Error('No response body');
+
+        const reader = res.body?.getReader();
+        const decoder = new TextDecoder('utf-8');
+
+        let partialChunk = '';
+
+        const messageHandler = getMessageHandler(lastMsg);
+
+        try {
+          while (true) {
+            const { value, done } = await reader.read();
+            if (done) break;
+
+            partialChunk += decoder.decode(value, { stream: true });
+
+            try {
+              const messages = partialChunk.split('\n');
+              for (const msg of messages) {
+                if (!msg.trim()) continue;
+                const json = JSON.parse(msg);
+                messageHandler(json);
+              }
+              partialChunk = '';
+            } catch (error) {
+              console.warn('Incomplete JSON, waiting for next chunk...');
+            }
+          }
+        } finally {
+          isReconnectingRef.current = false;
+        }
+      }
+    }
+  };
+
   useEffect(() => {
     checkConfig(
       setChatModelProvider,
@@ -415,7 +474,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
     if (params.chatId && params.chatId !== chatId) {
       setChatId(params.chatId);
       setMessages([]);
-      setChatHistory([]);
+      chatHistory.current = [];
       setFiles([]);
       setFileIds([]);
       setIsMessagesLoaded(false);
@@ -435,8 +494,8 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
       chatId,
       setMessages,
       setIsMessagesLoaded,
-      setChatHistory,
-      setFocusMode,
+      chatHistory,
+      setSources,
       setNotFound,
       setFiles,
       setFileIds,
@@ -454,13 +513,15 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
   }, [messages]);
 
   useEffect(() => {
-    if (isMessagesLoaded && isConfigReady) {
+    if (isMessagesLoaded && isConfigReady && newChatCreated) {
       setIsReady(true);
       console.debug(new Date(), 'app:ready');
+    } else if (isMessagesLoaded && isConfigReady && !newChatCreated) {
+      checkReconnect();
     } else {
       setIsReady(false);
     }
-  }, [isMessagesLoaded, isConfigReady]);
+  }, [isMessagesLoaded, isConfigReady, newChatCreated]);
 
   const rewrite = (messageId: string) => {
     const index = messages.findIndex((msg) => msg.messageId === messageId);
@@ -469,9 +530,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
     setMessages((prev) => prev.slice(0, index));
-    setChatHistory((prev) => {
-      return prev.slice(0, index * 2);
-    });
+    chatHistory.current = chatHistory.current.slice(0, index * 2);
 
     const messageToRewrite = messages[index];
     sendMessage(messageToRewrite.query, messageToRewrite.messageId, true);
@@ -488,38 +547,10 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
     // eslint-disable-next-line react-hooks/exhaustive-deps
   }, [isConfigReady, isReady, initialMessage]);
 
-  const sendMessage: ChatContext['sendMessage'] = async (
-    message,
-    messageId,
-    rewrite = false,
-  ) => {
-    if (loading || !message) return;
-
-    setLoading(true);
-    setResearchEnded(false);
-    setMessageAppeared(false);
-
-    if (messages.length <= 1) {
-      window.history.replaceState(null, '', `/c/${chatId}`);
-    }
-
-    messageId = messageId ?? crypto.randomBytes(7).toString('hex');
-    const backendId = crypto.randomBytes(20).toString('hex');
-
-    const newMessage: Message = {
-      messageId,
-      chatId: chatId!,
-      backendId,
-      query: message,
-      responseBlocks: [],
-      status: 'answering',
-      createdAt: new Date(),
-    };
-
-    setMessages((prevMessages) => [...prevMessages, newMessage]);
-
-    const receivedTextRef = { current: '' };
-
-    const messageHandler = async (data: any) => {
+  const getMessageHandler = (message: Message) => {
+    const messageId = message.messageId;
+
+    return async (data: any) => {
       if (data.type === 'error') {
         toast.error(data.data);
         setLoading(false);
@@ -536,7 +567,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
       if (data.type === 'researchComplete') {
         setResearchEnded(true);
         if (
-          newMessage.responseBlocks.find(
+          message.responseBlocks.find(
             (b) => b.type === 'source' && b.data.length > 0,
           )
         ) {
@@ -548,6 +579,20 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
         setMessages((prev) =>
           prev.map((msg) => {
             if (msg.messageId === messageId) {
+              const exists = msg.responseBlocks.findIndex(
+                (b) => b.id === data.block.id,
+              );
+
+              if (exists !== -1) {
+                const existingBlocks = [...msg.responseBlocks];
+                existingBlocks[exists] = data.block;
+
+                return {
+                  ...msg,
+                  responseBlocks: existingBlocks,
+                };
+              }
+
               return {
                 ...msg,
                 responseBlocks: [...msg.responseBlocks, data.block],
@@ -556,6 +601,13 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
             return msg;
           }),
         );
+
+        if (
+          (data.block.type === 'source' && data.block.data.length > 0) ||
+          data.block.type === 'text'
+        ) {
+          setMessageAppeared(true);
+        }
       }
 
       if (data.type === 'updateBlock') {
@@ -577,75 +629,28 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
         );
       }
 
-      if (data.type === 'sources') {
-        const sourceBlock: Block = {
-          id: crypto.randomBytes(7).toString('hex'),
-          type: 'source',
-          data: data.data,
-        };
-
-        setMessages((prev) =>
-          prev.map((msg) => {
-            if (msg.messageId === messageId) {
-              return {
-                ...msg,
-                responseBlocks: [...msg.responseBlocks, sourceBlock],
-              };
-            }
-            return msg;
-          }),
-        );
-
-        if (data.data.length > 0) {
-          setMessageAppeared(true);
-        }
-      }
-
-      if (data.type === 'message') {
-        receivedTextRef.current += data.data;
-
-        setMessages((prev) =>
-          prev.map((msg) => {
-            if (msg.messageId === messageId) {
-              const existingTextBlockIndex = msg.responseBlocks.findIndex(
-                (b) => b.type === 'text',
-              );
-
-              if (existingTextBlockIndex >= 0) {
-                const updatedBlocks = [...msg.responseBlocks];
-                const existingBlock = updatedBlocks[
-                  existingTextBlockIndex
-                ] as Block & { type: 'text' };
-                updatedBlocks[existingTextBlockIndex] = {
-                  ...existingBlock,
-                  data: existingBlock.data + data.data,
-                };
-                return { ...msg, responseBlocks: updatedBlocks };
-              } else {
-                const textBlock: Block = {
-                  id: crypto.randomBytes(7).toString('hex'),
-                  type: 'text',
-                  data: data.data,
-                };
-                return {
-                  ...msg,
-                  responseBlocks: [...msg.responseBlocks, textBlock],
-                };
-              }
-            }
-            return msg;
-          }),
-        );
-
-        setMessageAppeared(true);
-      }
-
       if (data.type === 'messageEnd') {
+        if (handledMessageEndRef.current.has(messageId)) {
+          return;
+        }
+
+        handledMessageEndRef.current.add(messageId);
+
+        const currentMsg = messagesRef.current.find(
+          (msg) => msg.messageId === messageId,
+        );
+
         const newHistory: [string, string][] = [
-          ...chatHistory,
-          ['human', message],
-          ['assistant', receivedTextRef.current],
+          ...chatHistory.current,
+          ['human', message.query],
+          [
+            'assistant',
+            currentMsg?.responseBlocks.find((b) => b.type === 'text')?.data ||
+              '',
+          ],
         ];
 
-        setChatHistory(newHistory);
+        chatHistory.current = newHistory;
 
         setMessages((prev) =>
           prev.map((msg) =>
@@ -662,6 +667,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
         const autoMediaSearch = getAutoMediaSearch();
 
         if (autoMediaSearch) {
+          setTimeout(() => {
             document
               .getElementById(`search-images-${lastMsg.messageId}`)
               ?.click();
@@ -669,12 +675,10 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
             document
               .getElementById(`search-videos-${lastMsg.messageId}`)
               ?.click();
+          }, 200);
         }
 
         // Check if there are sources and no suggestions
-        const currentMsg = messagesRef.current.find(
-          (msg) => msg.messageId === messageId,
-        );
-
         const hasSourceBlocks = currentMsg?.responseBlocks.some(
           (block) => block.type === 'source' && block.data.length > 0,
@@ -705,6 +709,36 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
         }
       }
     };
+  };
+
+  const sendMessage: ChatContext['sendMessage'] = async (
+    message,
+    messageId,
+    rewrite = false,
+  ) => {
+    if (loading || !message) return;
+
+    setLoading(true);
+    setResearchEnded(false);
+    setMessageAppeared(false);
+
+    if (messages.length <= 1) {
+      window.history.replaceState(null, '', `/c/${chatId}`);
+    }
+
+    messageId = messageId ?? crypto.randomBytes(7).toString('hex');
+    const backendId = crypto.randomBytes(20).toString('hex');
+
+    const newMessage: Message = {
+      messageId,
+      chatId: chatId!,
+      backendId,
+      query: message,
+      responseBlocks: [],
+      status: 'answering',
+      createdAt: new Date(),
+    };
+
+    setMessages((prevMessages) => [...prevMessages, newMessage]);
 
     const messageIndex = messages.findIndex((m) => m.messageId === messageId);
@@ -722,11 +756,14 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
         },
         chatId: chatId!,
         files: fileIds,
-        focusMode: focusMode,
+        sources: sources,
         optimizationMode: optimizationMode,
         history: rewrite
-          ? chatHistory.slice(0, messageIndex === -1 ? undefined : messageIndex)
-          : chatHistory,
+          ? chatHistory.current.slice(
+              0,
+              messageIndex === -1 ? undefined : messageIndex,
+            )
+          : chatHistory.current,
         chatModel: {
           key: chatModelProvider.key,
           providerId: chatModelProvider.providerId,
@@ -746,6 +783,8 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
     let partialChunk = '';
 
+    const messageHandler = getMessageHandler(newMessage);
+
     while (true) {
       const { value, done } = await reader.read();
       if (done) break;
@@ -771,10 +810,10 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
       value={{
         messages,
         sections,
-        chatHistory,
+        chatHistory: chatHistory.current,
         files,
         fileIds,
-        focusMode,
+        sources,
         chatId,
         hasError,
         isMessagesLoaded,
@@ -785,7 +824,7 @@ export const ChatProvider = ({ children }: { children: React.ReactNode }) => {
         optimizationMode,
         setFileIds,
         setFiles,
-        setFocusMode,
+        setSources,
         setOptimizationMode,
         rewrite,
         sendMessage,

View File

@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';
class AnthropicLLM extends OpenAILLM {}
export default AnthropicLLM;

View File

@@ -0,0 +1,115 @@
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import { Model, ModelList, ProviderMetadata } from '../../types';
import BaseEmbedding from '../../base/embedding';
import BaseModelProvider from '../../base/provider';
import BaseLLM from '../../base/llm';
import AnthropicLLM from './anthropicLLM';

interface AnthropicConfig {
  apiKey: string;
}

const providerConfigFields: UIConfigField[] = [
  {
    type: 'password',
    name: 'API Key',
    key: 'apiKey',
    description: 'Your Anthropic API key',
    required: true,
    placeholder: 'Anthropic API Key',
    env: 'ANTHROPIC_API_KEY',
    scope: 'server',
  },
];

class AnthropicProvider extends BaseModelProvider<AnthropicConfig> {
  constructor(id: string, name: string, config: AnthropicConfig) {
    super(id, name, config);
  }

  async getDefaultModels(): Promise<ModelList> {
    const res = await fetch('https://api.anthropic.com/v1/models?limit=999', {
      method: 'GET',
      headers: {
        'x-api-key': this.config.apiKey,
        'anthropic-version': '2023-06-01',
        'Content-Type': 'application/json',
      },
    });

    if (!res.ok) {
      throw new Error(`Failed to fetch Anthropic models: ${res.statusText}`);
    }

    const data = (await res.json()).data;

    const models: Model[] = data.map((m: any) => {
      return {
        key: m.id,
        name: m.display_name,
      };
    });

    return {
      embedding: [],
      chat: models,
    };
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [],
      chat: [...defaultModels.chat, ...configProvider.chatModels],
    };
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.chat.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading Anthropic Chat Model. Invalid Model Selected',
      );
    }

    return new AnthropicLLM({
      apiKey: this.config.apiKey,
      model: key,
      baseURL: 'https://api.anthropic.com/v1',
    });
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    throw new Error('Anthropic provider does not support embedding models.');
  }

  static parseAndValidate(raw: any): AnthropicConfig {
    if (!raw || typeof raw !== 'object')
      throw new Error('Invalid config provided. Expected object');

    if (!raw.apiKey)
      throw new Error('Invalid config provided. API key must be provided');

    return {
      apiKey: String(raw.apiKey),
    };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'anthropic',
      name: 'Anthropic',
    };
  }
}

export default AnthropicProvider;

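How a provider like this is consumed, as a rough sketch; the constructor arguments mirror `BaseModelProvider`, but the id/name values and env-based config here are illustrative assumptions, not the app's actual wiring (the real app resolves providers through its server registry, and `loadChatModel` assumes the provider is registered there):

// Hypothetical wiring, for illustration only.
const provider = new AnthropicProvider('anthropic-1', 'Anthropic', {
  apiKey: process.env.ANTHROPIC_API_KEY!,
});

// getDefaultModels hits the live /v1/models endpoint; getModelList would
// additionally merge models configured in the server registry.
const models = await provider.getDefaultModels();
console.log(models.chat.map((m) => m.key));

const llm = await provider.loadChatModel(models.chat[0].key);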
View File

@@ -0,0 +1,5 @@
import OpenAIEmbedding from '../openai/openaiEmbedding';
class GeminiEmbedding extends OpenAIEmbedding {}
export default GeminiEmbedding;

View File

@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';
class GeminiLLM extends OpenAILLM {}
export default GeminiLLM;

View File

@@ -0,0 +1,144 @@
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import { Model, ModelList, ProviderMetadata } from '../../types';
import GeminiEmbedding from './geminiEmbedding';
import BaseEmbedding from '../../base/embedding';
import BaseModelProvider from '../../base/provider';
import BaseLLM from '../../base/llm';
import GeminiLLM from './geminiLLM';

interface GeminiConfig {
  apiKey: string;
}

const providerConfigFields: UIConfigField[] = [
  {
    type: 'password',
    name: 'API Key',
    key: 'apiKey',
    description: 'Your Gemini API key',
    required: true,
    placeholder: 'Gemini API Key',
    env: 'GEMINI_API_KEY',
    scope: 'server',
  },
];

class GeminiProvider extends BaseModelProvider<GeminiConfig> {
  constructor(id: string, name: string, config: GeminiConfig) {
    super(id, name, config);
  }

  async getDefaultModels(): Promise<ModelList> {
    const res = await fetch(
      `https://generativelanguage.googleapis.com/v1beta/models?key=${this.config.apiKey}`,
      {
        method: 'GET',
        headers: {
          'Content-Type': 'application/json',
        },
      },
    );

    const data = await res.json();

    let defaultEmbeddingModels: Model[] = [];
    let defaultChatModels: Model[] = [];

    data.models.forEach((m: any) => {
      if (
        m.supportedGenerationMethods.some(
          (genMethod: string) =>
            genMethod === 'embedText' || genMethod === 'embedContent',
        )
      ) {
        defaultEmbeddingModels.push({
          key: m.name,
          name: m.displayName,
        });
      } else if (m.supportedGenerationMethods.includes('generateContent')) {
        defaultChatModels.push({
          key: m.name,
          name: m.displayName,
        });
      }
    });

    return {
      embedding: defaultEmbeddingModels,
      chat: defaultChatModels,
    };
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [
        ...defaultModels.embedding,
        ...configProvider.embeddingModels,
      ],
      chat: [...defaultModels.chat, ...configProvider.chatModels],
    };
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.chat.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading Gemini Chat Model. Invalid Model Selected',
      );
    }

    return new GeminiLLM({
      apiKey: this.config.apiKey,
      model: key,
      baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai',
    });
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.embedding.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading Gemini Embedding Model. Invalid Model Selected.',
      );
    }

    return new GeminiEmbedding({
      apiKey: this.config.apiKey,
      model: key,
      baseURL: 'https://generativelanguage.googleapis.com/v1beta/openai',
    });
  }

  static parseAndValidate(raw: any): GeminiConfig {
    if (!raw || typeof raw !== 'object')
      throw new Error('Invalid config provided. Expected object');

    if (!raw.apiKey)
      throw new Error('Invalid config provided. API key must be provided');

    return {
      apiKey: String(raw.apiKey),
    };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'gemini',
      name: 'Gemini',
    };
  }
}

export default GeminiProvider;

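The model-classification rule above is worth calling out: a Gemini model is treated as an embedding model if it supports `embedText` or `embedContent`, and as a chat model only if it supports `generateContent`. A self-contained sketch of that split; the sample data is invented for illustration:

type ApiModel = {
  name: string;
  displayName: string;
  supportedGenerationMethods: string[];
};

// Invented examples of what the v1beta/models endpoint returns.
const sample: ApiModel[] = [
  {
    name: 'models/text-embedding-004',
    displayName: 'Text Embedding 004',
    supportedGenerationMethods: ['embedContent'],
  },
  {
    name: 'models/gemini-2.0-flash',
    displayName: 'Gemini 2.0 Flash',
    supportedGenerationMethods: ['generateContent'],
  },
];

const embedding = sample.filter((m) =>
  m.supportedGenerationMethods.some(
    (g) => g === 'embedText' || g === 'embedContent',
  ),
);
// Embedding capability wins; only the rest are considered for chat.
const chat = sample.filter(
  (m) =>
    !embedding.includes(m) &&
    m.supportedGenerationMethods.includes('generateContent'),
);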
View File

@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';
class GroqLLM extends OpenAILLM {}
export default GroqLLM;

View File

@@ -0,0 +1,113 @@
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import { Model, ModelList, ProviderMetadata } from '../../types';
import BaseEmbedding from '../../base/embedding';
import BaseModelProvider from '../../base/provider';
import BaseLLM from '../../base/llm';
import GroqLLM from './groqLLM';

interface GroqConfig {
  apiKey: string;
}

const providerConfigFields: UIConfigField[] = [
  {
    type: 'password',
    name: 'API Key',
    key: 'apiKey',
    description: 'Your Groq API key',
    required: true,
    placeholder: 'Groq API Key',
    env: 'GROQ_API_KEY',
    scope: 'server',
  },
];

class GroqProvider extends BaseModelProvider<GroqConfig> {
  constructor(id: string, name: string, config: GroqConfig) {
    super(id, name, config);
  }

  async getDefaultModels(): Promise<ModelList> {
    const res = await fetch(`https://api.groq.com/openai/v1/models`, {
      method: 'GET',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${this.config.apiKey}`,
      },
    });

    const data = await res.json();

    const defaultChatModels: Model[] = [];

    data.data.forEach((m: any) => {
      defaultChatModels.push({
        key: m.id,
        name: m.id,
      });
    });

    return {
      embedding: [],
      chat: defaultChatModels,
    };
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [
        ...defaultModels.embedding,
        ...configProvider.embeddingModels,
      ],
      chat: [...defaultModels.chat, ...configProvider.chatModels],
    };
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.chat.find((m) => m.key === key);

    if (!exists) {
      throw new Error('Error Loading Groq Chat Model. Invalid Model Selected');
    }

    return new GroqLLM({
      apiKey: this.config.apiKey,
      model: key,
      baseURL: 'https://api.groq.com/openai/v1',
    });
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    throw new Error('Groq Provider does not support embedding models.');
  }

  static parseAndValidate(raw: any): GroqConfig {
    if (!raw || typeof raw !== 'object')
      throw new Error('Invalid config provided. Expected object');

    if (!raw.apiKey)
      throw new Error('Invalid config provided. API key must be provided');

    return {
      apiKey: String(raw.apiKey),
    };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'groq',
      name: 'Groq',
    };
  }
}

export default GroqProvider;

View File

@@ -2,10 +2,22 @@ import { ModelProviderUISection } from '@/lib/config/types';
 import { ProviderConstructor } from '../base/provider';
 import OpenAIProvider from './openai';
 import OllamaProvider from './ollama';
+import GeminiProvider from './gemini';
+import TransformersProvider from './transformers';
+import GroqProvider from './groq';
+import LemonadeProvider from './lemonade';
+import AnthropicProvider from './anthropic';
+import LMStudioProvider from './lmstudio';
 
 export const providers: Record<string, ProviderConstructor<any>> = {
   openai: OpenAIProvider,
   ollama: OllamaProvider,
+  gemini: GeminiProvider,
+  transformers: TransformersProvider,
+  groq: GroqProvider,
+  lemonade: LemonadeProvider,
+  anthropic: AnthropicProvider,
+  lmstudio: LMStudioProvider,
 };
 
 export const getModelProvidersUIConfigSection =

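With the registry above, resolving a provider class from a config entry is a plain object lookup. A rough sketch of that resolution step; the helper name and config shape are assumptions (the real code lives in the server registry), but `parseAndValidate` mirrors the static method each provider defines:

// Hypothetical resolution step, for illustration only.
const createProvider = (
  key: string,
  id: string,
  name: string,
  rawConfig: unknown,
) => {
  const Ctor = providers[key];
  if (!Ctor) throw new Error(`Unknown provider key: ${key}`);

  const config = Ctor.parseAndValidate(rawConfig); // throws on bad config
  return new Ctor(id, name, config);
};

const groq = createProvider('groq', 'groq-1', 'Groq', { apiKey: '...' });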
View File

@@ -0,0 +1,153 @@
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import BaseModelProvider from '../../base/provider';
import { Model, ModelList, ProviderMetadata } from '../../types';
import BaseLLM from '../../base/llm';
import LemonadeLLM from './lemonadeLLM';
import BaseEmbedding from '../../base/embedding';
import LemonadeEmbedding from './lemonadeEmbedding';

interface LemonadeConfig {
  baseURL: string;
  apiKey?: string;
}

const providerConfigFields: UIConfigField[] = [
  {
    type: 'string',
    name: 'Base URL',
    key: 'baseURL',
    description: 'The base URL for Lemonade API',
    required: true,
    placeholder: 'https://api.lemonade.ai/v1',
    env: 'LEMONADE_BASE_URL',
    scope: 'server',
  },
  {
    type: 'password',
    name: 'API Key',
    key: 'apiKey',
    description: 'Your Lemonade API key (optional)',
    required: false,
    placeholder: 'Lemonade API Key',
    env: 'LEMONADE_API_KEY',
    scope: 'server',
  },
];

class LemonadeProvider extends BaseModelProvider<LemonadeConfig> {
  constructor(id: string, name: string, config: LemonadeConfig) {
    super(id, name, config);
  }

  async getDefaultModels(): Promise<ModelList> {
    try {
      const res = await fetch(`${this.config.baseURL}/models`, {
        method: 'GET',
        headers: {
          'Content-Type': 'application/json',
          ...(this.config.apiKey
            ? { Authorization: `Bearer ${this.config.apiKey}` }
            : {}),
        },
      });

      const data = await res.json();

      const models: Model[] = data.data
        .filter((m: any) => m.recipe === 'llamacpp')
        .map((m: any) => {
          return {
            name: m.id,
            key: m.id,
          };
        });

      return {
        embedding: models,
        chat: models,
      };
    } catch (err) {
      if (err instanceof TypeError) {
        throw new Error(
          'Error connecting to Lemonade API. Please ensure the base URL is correct and the service is available.',
        );
      }

      throw err;
    }
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [
        ...defaultModels.embedding,
        ...configProvider.embeddingModels,
      ],
      chat: [...defaultModels.chat, ...configProvider.chatModels],
    };
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.chat.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading Lemonade Chat Model. Invalid Model Selected',
      );
    }

    return new LemonadeLLM({
      apiKey: this.config.apiKey || 'not-needed',
      model: key,
      baseURL: this.config.baseURL,
    });
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.embedding.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading Lemonade Embedding Model. Invalid Model Selected.',
      );
    }

    return new LemonadeEmbedding({
      apiKey: this.config.apiKey || 'not-needed',
      model: key,
      baseURL: this.config.baseURL,
    });
  }

  static parseAndValidate(raw: any): LemonadeConfig {
    if (!raw || typeof raw !== 'object')
      throw new Error('Invalid config provided. Expected object');

    if (!raw.baseURL)
      throw new Error('Invalid config provided. Base URL must be provided');

    return {
      baseURL: String(raw.baseURL),
      apiKey: raw.apiKey ? String(raw.apiKey) : undefined,
    };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'lemonade',
      name: 'Lemonade',
    };
  }
}

export default LemonadeProvider;

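One detail in `getDefaultModels` above deserves a note: because the Lemonade API key is optional, the `Authorization` header is added conditionally via an object spread. A standalone sketch of that pattern:

const buildHeaders = (apiKey?: string): Record<string, string> => ({
  'Content-Type': 'application/json',
  // Spreading an empty object adds nothing when no key is configured.
  ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
});

console.log(buildHeaders()); // { 'Content-Type': 'application/json' }
console.log(buildHeaders('secret')); // adds Authorization: 'Bearer secret'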
View File

@@ -0,0 +1,5 @@
import OpenAIEmbedding from '../openai/openaiEmbedding';
class LemonadeEmbedding extends OpenAIEmbedding {}
export default LemonadeEmbedding;

View File

@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';
class LemonadeLLM extends OpenAILLM {}
export default LemonadeLLM;

View File

@@ -0,0 +1,143 @@
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import BaseModelProvider from '../../base/provider';
import { Model, ModelList, ProviderMetadata } from '../../types';
import LMStudioLLM from './lmstudioLLM';
import BaseLLM from '../../base/llm';
import BaseEmbedding from '../../base/embedding';
import LMStudioEmbedding from './lmstudioEmbedding';

interface LMStudioConfig {
  baseURL: string;
}

const providerConfigFields: UIConfigField[] = [
  {
    type: 'string',
    name: 'Base URL',
    key: 'baseURL',
    description: 'The base URL for LM Studio server',
    required: true,
    placeholder: 'http://localhost:1234',
    env: 'LM_STUDIO_BASE_URL',
    scope: 'server',
  },
];

class LMStudioProvider extends BaseModelProvider<LMStudioConfig> {
  constructor(id: string, name: string, config: LMStudioConfig) {
    super(id, name, config);
  }

  private normalizeBaseURL(url: string): string {
    const trimmed = url.trim().replace(/\/+$/, '');
    return trimmed.endsWith('/v1') ? trimmed : `${trimmed}/v1`;
  }

  async getDefaultModels(): Promise<ModelList> {
    try {
      const baseURL = this.normalizeBaseURL(this.config.baseURL);

      const res = await fetch(`${baseURL}/models`, {
        method: 'GET',
        headers: {
          'Content-Type': 'application/json',
        },
      });

      const data = await res.json();

      const models: Model[] = data.data.map((m: any) => {
        return {
          name: m.id,
          key: m.id,
        };
      });

      return {
        embedding: models,
        chat: models,
      };
    } catch (err) {
      if (err instanceof TypeError) {
        throw new Error(
          'Error connecting to LM Studio. Please ensure the base URL is correct and the LM Studio server is running.',
        );
      }

      throw err;
    }
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [
        ...defaultModels.embedding,
        ...configProvider.embeddingModels,
      ],
      chat: [...defaultModels.chat, ...configProvider.chatModels],
    };
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.chat.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading LM Studio Chat Model. Invalid Model Selected',
      );
    }

    return new LMStudioLLM({
      apiKey: 'lm-studio',
      model: key,
      baseURL: this.normalizeBaseURL(this.config.baseURL),
    });
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.embedding.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading LM Studio Embedding Model. Invalid Model Selected.',
      );
    }

    return new LMStudioEmbedding({
      apiKey: 'lm-studio',
      model: key,
      baseURL: this.normalizeBaseURL(this.config.baseURL),
    });
  }

  static parseAndValidate(raw: any): LMStudioConfig {
    if (!raw || typeof raw !== 'object')
      throw new Error('Invalid config provided. Expected object');

    if (!raw.baseURL)
      throw new Error('Invalid config provided. Base URL must be provided');

    return {
      baseURL: String(raw.baseURL),
    };
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'lmstudio',
      name: 'LM Studio',
    };
  }
}

export default LMStudioProvider;

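`normalizeBaseURL` above tolerates user-entered URLs with or without a trailing slash or `/v1` suffix. A quick standalone sketch of its behavior:

const normalizeBaseURL = (url: string): string => {
  const trimmed = url.trim().replace(/\/+$/, ''); // drop trailing slashes
  return trimmed.endsWith('/v1') ? trimmed : `${trimmed}/v1`;
};

console.log(normalizeBaseURL('http://localhost:1234')); // http://localhost:1234/v1
console.log(normalizeBaseURL('http://localhost:1234/')); // http://localhost:1234/v1
console.log(normalizeBaseURL('http://localhost:1234/v1')); // http://localhost:1234/v1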
View File

@@ -0,0 +1,5 @@
import OpenAIEmbedding from '../openai/openaiEmbedding';
class LMStudioEmbedding extends OpenAIEmbedding {}
export default LMStudioEmbedding;

View File

@@ -0,0 +1,5 @@
import OpenAILLM from '../openai/openaiLLM';
class LMStudioLLM extends OpenAILLM {}
export default LMStudioLLM;

View File

@@ -7,8 +7,11 @@ import {
   GenerateTextOutput,
   StreamTextOutput,
 } from '../../types';
-import { Ollama } from 'ollama';
+import { Ollama, Tool as OllamaTool, Message as OllamaMessage } from 'ollama';
 import { parse } from 'partial-json';
+import crypto from 'crypto';
+import { Message } from '@/lib/types';
+import { repairJson } from '@toolsycc/json-repair';
 
 type OllamaConfig = {
   baseURL: string;
@@ -22,6 +25,7 @@ const reasoningModels = [
   'qwen3',
   'deepseek-v3.1',
   'magistral',
+  'nemotron-3-nano',
 ];
 
 class OllamaLLM extends BaseLLM<OllamaConfig> {
@@ -35,10 +39,54 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
     });
   }
 
+  convertToOllamaMessages(messages: Message[]): OllamaMessage[] {
+    return messages.map((msg) => {
+      if (msg.role === 'tool') {
+        return {
+          role: 'tool',
+          tool_name: msg.name,
+          content: msg.content,
+        } as OllamaMessage;
+      } else if (msg.role === 'assistant') {
+        return {
+          role: 'assistant',
+          content: msg.content,
+          tool_calls:
+            msg.tool_calls?.map((tc, i) => ({
+              function: {
+                index: i,
+                name: tc.name,
+                arguments: tc.arguments,
+              },
+            })) || [],
+        };
+      }
+
+      return msg;
+    });
+  }
+
   async generateText(input: GenerateTextInput): Promise<GenerateTextOutput> {
+    const ollamaTools: OllamaTool[] = [];
+
+    input.tools?.forEach((tool) => {
+      ollamaTools.push({
+        type: 'function',
+        function: {
+          name: tool.name,
+          description: tool.description,
+          parameters: z.toJSONSchema(tool.schema) as any, // full JSON schema, matching streamText below
+        },
+      });
+    });
+
     const res = await this.ollamaClient.chat({
       model: this.config.model,
-      messages: input.messages,
+      messages: this.convertToOllamaMessages(input.messages),
+      tools: ollamaTools.length > 0 ? ollamaTools : undefined,
+      ...(reasoningModels.find((m) => this.config.model.includes(m))
+        ? { think: false }
+        : {}),
       options: {
         top_p: input.options?.topP ?? this.config.options?.topP,
         temperature:
@@ -58,6 +106,12 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
     return {
       content: res.message.content,
+      toolCalls:
+        res.message.tool_calls?.map((tc) => ({
+          id: crypto.randomUUID(),
+          name: tc.function.name,
+          arguments: tc.function.arguments,
+        })) || [],
       additionalInfo: {
         reasoning: res.message.thinking,
       },
@@ -67,10 +121,27 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
   async *streamText(
     input: GenerateTextInput,
   ): AsyncGenerator<StreamTextOutput> {
+    const ollamaTools: OllamaTool[] = [];
+
+    input.tools?.forEach((tool) => {
+      ollamaTools.push({
+        type: 'function',
+        function: {
+          name: tool.name,
+          description: tool.description,
+          parameters: z.toJSONSchema(tool.schema) as any,
+        },
+      });
+    });
+
     const stream = await this.ollamaClient.chat({
       model: this.config.model,
-      messages: input.messages,
+      messages: this.convertToOllamaMessages(input.messages),
       stream: true,
+      ...(reasoningModels.find((m) => this.config.model.includes(m))
+        ? { think: false }
+        : {}),
+      tools: ollamaTools.length > 0 ? ollamaTools : undefined,
       options: {
         top_p: input.options?.topP ?? this.config.options?.topP,
         temperature:
@@ -91,6 +162,17 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
     for await (const chunk of stream) {
       yield {
         contentChunk: chunk.message.content,
+        toolCallChunk:
+          chunk.message.tool_calls?.map((tc, i) => ({
+            id: crypto
+              .createHash('sha256')
+              .update(
+                `${i}-${tc.function.name}`,
+              ) /* Ollama currently doesn't return a tool call ID so we're creating one based on the index and tool call name */
+              .digest('hex'),
+            name: tc.function.name,
+            arguments: tc.function.arguments,
+          })) || [],
         done: chunk.done,
         additionalInfo: {
           reasoning: chunk.message.thinking,
@@ -102,7 +184,7 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
   async generateObject<T>(input: GenerateObjectInput): Promise<T> {
     const response = await this.ollamaClient.chat({
       model: this.config.model,
-      messages: input.messages,
+      messages: this.convertToOllamaMessages(input.messages),
       format: z.toJSONSchema(input.schema),
       ...(reasoningModels.find((m) => this.config.model.includes(m))
         ? { think: false }
@@ -124,7 +206,13 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
     });
 
     try {
-      return input.schema.parse(JSON.parse(response.message.content)) as T;
+      return input.schema.parse(
+        JSON.parse(
+          repairJson(response.message.content, {
+            extractJson: true,
+          }) as string,
+        ),
+      ) as T;
     } catch (err) {
       throw new Error(`Error parsing response from Ollama: ${err}`);
     }
@@ -135,7 +223,7 @@ class OllamaLLM extends BaseLLM<OllamaConfig> {
     const stream = await this.ollamaClient.chat({
       model: this.config.model,
-      messages: input.messages,
+      messages: this.convertToOllamaMessages(input.messages),
      format: z.toJSONSchema(input.schema),
       stream: true,
       ...(reasoningModels.find((m) => this.config.model.includes(m))

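To see how the tool plumbing above fits together: a `Tool` declared with a zod schema is converted to Ollama's function-tool shape before the chat call. A short sketch of that conversion, assuming the same zod version the PR uses (where `z.toJSONSchema` is available); the weather tool is a hypothetical example, not part of the codebase:

import z from 'zod';

// A hypothetical tool in the shape the Tool type from this PR expects.
const weatherTool = {
  name: 'get_weather',
  description: 'Look up current weather for a city',
  schema: z.object({ city: z.string() }),
};

// Conversion mirroring streamText above: zod schema -> JSON schema.
const ollamaTool = {
  type: 'function' as const,
  function: {
    name: weatherTool.name,
    description: weatherTool.description,
    parameters: z.toJSONSchema(weatherTool.schema) as any,
  },
};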
View File

@@ -61,6 +61,22 @@ const defaultChatModels: Model[] = [
     name: 'GPT 5 Mini',
     key: 'gpt-5-mini',
   },
+  {
+    name: 'GPT 5 Pro',
+    key: 'gpt-5-pro',
+  },
+  {
+    name: 'GPT 5.1',
+    key: 'gpt-5.1',
+  },
+  {
+    name: 'GPT 5.2',
+    key: 'gpt-5.2',
+  },
+  {
+    name: 'GPT 5.2 Pro',
+    key: 'gpt-5.2-pro',
+  },
   {
     name: 'o1',
     key: 'o1',

View File

@@ -7,8 +7,18 @@ import {
   GenerateTextInput,
   GenerateTextOutput,
   StreamTextOutput,
+  ToolCall,
 } from '../../types';
 import { parse } from 'partial-json';
+import z from 'zod';
+import {
+  ChatCompletionAssistantMessageParam,
+  ChatCompletionMessageParam,
+  ChatCompletionTool,
+  ChatCompletionToolMessageParam,
+} from 'openai/resources/index.mjs';
+import { Message } from '@/lib/types';
+import { repairJson } from '@toolsycc/json-repair';
 
 type OpenAIConfig = {
   apiKey: string;
@@ -29,10 +39,54 @@ class OpenAILLM extends BaseLLM<OpenAIConfig> {
     });
   }
 
+  convertToOpenAIMessages(messages: Message[]): ChatCompletionMessageParam[] {
+    return messages.map((msg) => {
+      if (msg.role === 'tool') {
+        return {
+          role: 'tool',
+          tool_call_id: msg.id,
+          content: msg.content,
+        } as ChatCompletionToolMessageParam;
+      } else if (msg.role === 'assistant') {
+        return {
+          role: 'assistant',
+          content: msg.content,
+          ...(msg.tool_calls &&
+            msg.tool_calls.length > 0 && {
+              tool_calls: msg.tool_calls?.map((tc) => ({
+                id: tc.id,
+                type: 'function',
+                function: {
+                  name: tc.name,
+                  arguments: JSON.stringify(tc.arguments),
+                },
+              })),
+            }),
+        } as ChatCompletionAssistantMessageParam;
+      }
+
+      return msg;
+    });
+  }
+
   async generateText(input: GenerateTextInput): Promise<GenerateTextOutput> {
+    const openaiTools: ChatCompletionTool[] = [];
+
+    input.tools?.forEach((tool) => {
+      openaiTools.push({
+        type: 'function',
+        function: {
+          name: tool.name,
+          description: tool.description,
+          parameters: z.toJSONSchema(tool.schema),
+        },
+      });
+    });
+
     const response = await this.openAIClient.chat.completions.create({
       model: this.config.model,
-      messages: input.messages,
+      tools: openaiTools.length > 0 ? openaiTools : undefined,
+      messages: this.convertToOpenAIMessages(input.messages),
       temperature:
         input.options?.temperature ?? this.config.options?.temperature ?? 1.0,
       top_p: input.options?.topP ?? this.config.options?.topP,
@@ -49,6 +103,18 @@ class OpenAILLM extends BaseLLM<OpenAIConfig> {
     if (response.choices && response.choices.length > 0) {
       return {
         content: response.choices[0].message.content!,
+        toolCalls:
+          response.choices[0].message.tool_calls
+            ?.map((tc) => {
+              if (tc.type === 'function') {
+                return {
+                  name: tc.function.name,
+                  id: tc.id,
+                  arguments: JSON.parse(tc.function.arguments),
+                };
+              }
+            })
+            .filter((tc) => tc !== undefined) || [],
         additionalInfo: {
           finishReason: response.choices[0].finish_reason,
         },
@@ -61,9 +127,23 @@ class OpenAILLM extends BaseLLM<OpenAIConfig> {
   async *streamText(
     input: GenerateTextInput,
   ): AsyncGenerator<StreamTextOutput> {
+    const openaiTools: ChatCompletionTool[] = [];
+
+    input.tools?.forEach((tool) => {
+      openaiTools.push({
+        type: 'function',
+        function: {
+          name: tool.name,
+          description: tool.description,
+          parameters: z.toJSONSchema(tool.schema),
+        },
+      });
+    });
+
     const stream = await this.openAIClient.chat.completions.create({
       model: this.config.model,
-      messages: input.messages,
+      messages: this.convertToOpenAIMessages(input.messages),
+      tools: openaiTools.length > 0 ? openaiTools : undefined,
       temperature:
         input.options?.temperature ?? this.config.options?.temperature ?? 1.0,
       top_p: input.options?.topP ?? this.config.options?.topP,
@@ -78,10 +158,33 @@ class OpenAILLM extends BaseLLM<OpenAIConfig> {
       stream: true,
     });
 
+    const receivedToolCalls: { name: string; id: string; arguments: string }[] =
+      [];
+
     for await (const chunk of stream) {
       if (chunk.choices && chunk.choices.length > 0) {
+        const toolCalls = chunk.choices[0].delta.tool_calls;
+
         yield {
           contentChunk: chunk.choices[0].delta.content || '',
+          toolCallChunk:
+            toolCalls?.map((tc) => {
+              if (!receivedToolCalls[tc.index]) {
+                const call = {
+                  name: tc.function?.name!,
+                  id: tc.id!,
+                  arguments: tc.function?.arguments || '',
+                };
+                receivedToolCalls[tc.index] = call; // keyed by index so later argument deltas match up
+                return { ...call, arguments: parse(call.arguments || '{}') };
+              } else {
+                const existingCall = receivedToolCalls[tc.index];
+                existingCall.arguments += tc.function?.arguments || '';
+                return {
+                  ...existingCall,
+                  arguments: parse(existingCall.arguments),
+                };
+              }
+            }) || [],
           done: chunk.choices[0].finish_reason !== null,
           additionalInfo: {
             finishReason: chunk.choices[0].finish_reason,
@@ -93,7 +196,7 @@ class OpenAILLM extends BaseLLM<OpenAIConfig> {
   async generateObject<T>(input: GenerateObjectInput): Promise<T> {
     const response = await this.openAIClient.chat.completions.parse({
-      messages: input.messages,
+      messages: this.convertToOpenAIMessages(input.messages),
       model: this.config.model,
       temperature:
         input.options?.temperature ?? this.config.options?.temperature ?? 1.0,
@@ -111,7 +214,13 @@ class OpenAILLM extends BaseLLM<OpenAIConfig> {
     if (response.choices && response.choices.length > 0) {
       try {
-        return input.schema.parse(response.choices[0].message.parsed) as T;
+        return input.schema.parse(
+          JSON.parse(
+            repairJson(response.choices[0].message.content!, {
+              extractJson: true,
+            }) as string,
+          ),
+        ) as T;
       } catch (err) {
         throw new Error(`Error parsing response from OpenAI: ${err}`);
       }

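The streaming branch above deserves a gloss: OpenAI sends tool-call arguments as string fragments keyed by `index`, so the code accumulates fragments per index and re-parses the partial JSON on every chunk. A reduced sketch of that accumulation, using `parse` from partial-json as the file does; the `Delta` shape and sample fragments are invented for illustration:

import { parse } from 'partial-json';

type Delta = { index: number; id?: string; name?: string; args: string };

const received: { id: string; name: string; arguments: string }[] = [];

const accumulate = (tc: Delta) => {
  if (!received[tc.index]) {
    // First fragment for this call carries the id and name.
    received[tc.index] = { id: tc.id!, name: tc.name!, arguments: tc.args };
  } else {
    // Later fragments only extend the serialized arguments string.
    received[tc.index].arguments += tc.args;
  }
  // partial-json tolerates incomplete JSON, e.g. '{"ci'
  return {
    ...received[tc.index],
    arguments: parse(received[tc.index].arguments || '{}'),
  };
};

accumulate({ index: 0, id: 'call_1', name: 'get_weather', args: '{"ci' });
accumulate({ index: 0, args: 'ty": "Paris"}' }); // arguments now parse fully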
View File

@@ -0,0 +1,88 @@
import { UIConfigField } from '@/lib/config/types';
import { getConfiguredModelProviderById } from '@/lib/config/serverRegistry';
import { Model, ModelList, ProviderMetadata } from '../../types';
import BaseModelProvider from '../../base/provider';
import BaseLLM from '../../base/llm';
import BaseEmbedding from '../../base/embedding';
import TransformerEmbedding from './transformerEmbedding';

interface TransformersConfig {}

const defaultEmbeddingModels: Model[] = [
  {
    name: 'all-MiniLM-L6-v2',
    key: 'Xenova/all-MiniLM-L6-v2',
  },
  {
    name: 'mxbai-embed-large-v1',
    key: 'mixedbread-ai/mxbai-embed-large-v1',
  },
  {
    name: 'nomic-embed-text-v1',
    key: 'Xenova/nomic-embed-text-v1',
  },
];

const providerConfigFields: UIConfigField[] = [];

class TransformersProvider extends BaseModelProvider<TransformersConfig> {
  constructor(id: string, name: string, config: TransformersConfig) {
    super(id, name, config);
  }

  async getDefaultModels(): Promise<ModelList> {
    return {
      embedding: [...defaultEmbeddingModels],
      chat: [],
    };
  }

  async getModelList(): Promise<ModelList> {
    const defaultModels = await this.getDefaultModels();
    const configProvider = getConfiguredModelProviderById(this.id)!;

    return {
      embedding: [
        ...defaultModels.embedding,
        ...configProvider.embeddingModels,
      ],
      chat: [],
    };
  }

  async loadChatModel(key: string): Promise<BaseLLM<any>> {
    throw new Error('Transformers Provider does not support chat models.');
  }

  async loadEmbeddingModel(key: string): Promise<BaseEmbedding<any>> {
    const modelList = await this.getModelList();
    const exists = modelList.embedding.find((m) => m.key === key);

    if (!exists) {
      throw new Error(
        'Error Loading Transformers Embedding Model. Invalid Model Selected.',
      );
    }

    return new TransformerEmbedding({
      model: key,
    });
  }

  static parseAndValidate(raw: any): TransformersConfig {
    return {};
  }

  static getProviderConfigFields(): UIConfigField[] {
    return providerConfigFields;
  }

  static getProviderMetadata(): ProviderMetadata {
    return {
      key: 'transformers',
      name: 'Transformers',
    };
  }
}

export default TransformersProvider;

View File

@@ -0,0 +1,41 @@
import { Chunk } from '@/lib/types';
import BaseEmbedding from '../../base/embedding';
import { FeatureExtractionPipeline } from '@huggingface/transformers';

type TransformerConfig = {
  model: string;
};

class TransformerEmbedding extends BaseEmbedding<TransformerConfig> {
  private pipelinePromise: Promise<FeatureExtractionPipeline> | null = null;

  constructor(protected config: TransformerConfig) {
    super(config);
  }

  async embedText(texts: string[]): Promise<number[][]> {
    return this.embed(texts);
  }

  async embedChunks(chunks: Chunk[]): Promise<number[][]> {
    return this.embed(chunks.map((c) => c.content));
  }

  private async embed(texts: string[]) {
    if (!this.pipelinePromise) {
      this.pipelinePromise = (async () => {
        const { pipeline } = await import('@huggingface/transformers');
        const result = await pipeline('feature-extraction', this.config.model, {
          dtype: 'fp32',
        });
        return result as FeatureExtractionPipeline;
      })();
    }

    const pipe = await this.pipelinePromise;
    const output = await pipe(texts, { pooling: 'mean', normalize: true });

    return output.tolist() as number[][];
  }
}

export default TransformerEmbedding;

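Rough usage of the embedding class above; the model downloads on first call because the pipeline is created lazily, and the cosine-similarity helper here is illustrative, not part of the file:

const embedder = new TransformerEmbedding({
  model: 'Xenova/all-MiniLM-L6-v2',
});

const [a, b] = await embedder.embedText([
  'what is perplexica',
  'perplexica overview',
]);

// Vectors come back L2-normalized (normalize: true), so cosine similarity
// reduces to a plain dot product.
const cosine = a.reduce((sum, v, i) => sum + v * b[i], 0);
console.log(cosine.toFixed(3));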
View File

@@ -1,5 +1,5 @@
 import z from 'zod';
-import { ChatTurnMessage } from '../types';
+import { Message } from '../types';
 
 type Model = {
   name: string;
@@ -37,25 +37,40 @@ type GenerateOptions = {
   presencePenalty?: number;
 };
 
+type Tool = {
+  name: string;
+  description: string;
+  schema: z.ZodObject<any>;
+};
+
+type ToolCall = {
+  id: string;
+  name: string;
+  arguments: Record<string, any>;
+};
+
 type GenerateTextInput = {
-  messages: ChatTurnMessage[];
+  messages: Message[];
+  tools?: Tool[];
   options?: GenerateOptions;
 };
 
 type GenerateTextOutput = {
   content: string;
+  toolCalls: ToolCall[];
   additionalInfo?: Record<string, any>;
 };
 
 type StreamTextOutput = {
   contentChunk: string;
+  toolCallChunk: ToolCall[];
   additionalInfo?: Record<string, any>;
   done?: boolean;
 };
 
 type GenerateObjectInput = {
   schema: z.ZodTypeAny;
-  messages: ChatTurnMessage[];
+  messages: Message[];
   options?: GenerateOptions;
 };
@@ -83,4 +98,6 @@ export type {
   GenerateObjectInput,
   GenerateObjectOutput,
   StreamObjectOutput,
+  Tool,
+  ToolCall,
 };

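Given the updated types above, a consumer of `streamText` sees text and tool-call chunks interleaved in one stream. A hypothetical consumption loop (the import path is an assumption, and the real app's handling is more involved):

import type { StreamTextOutput } from './types'; // hypothetical import path

// Drain a streamText generator, keeping the latest (most complete) version
// of each tool call, since later chunks re-emit calls with fuller arguments.
async function drain(stream: AsyncGenerator<StreamTextOutput>) {
  let text = '';
  const calls = new Map<
    string,
    { name: string; arguments: Record<string, any> }
  >();

  for await (const chunk of stream) {
    text += chunk.contentChunk;
    for (const call of chunk.toolCallChunk) {
      calls.set(call.id, { name: call.name, arguments: call.arguments });
    }
    if (chunk.done) break;
  }

  return { text, toolCalls: [...calls.values()] };
}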
View File

@@ -3,6 +3,7 @@ import { ChatTurnMessage } from '@/lib/types';
 export const imageSearchPrompt = `
 You will be given a conversation below and a follow up question. You need to rephrase the follow-up question so it is a standalone question that can be used by the LLM to search the web for images.
 You need to make sure the rephrased question agrees with the conversation and is relevant to the conversation.
+Make sure the query is standalone and not overly broad; use context from the answers in the conversation to make it specific so the user gets the best image search results.
 Output only the rephrased query in query key JSON format. Do not include any explanation or additional text.
 `;

View File

@@ -3,6 +3,7 @@ import { ChatTurnMessage } from '@/lib/types';
 export const videoSearchPrompt = `
 You will be given a conversation below and a follow up question. You need to rephrase the follow-up question so it is a standalone question that can be used by the LLM to search Youtube for videos.
 You need to make sure the rephrased question agrees with the conversation and is relevant to the conversation.
+Make sure the query is standalone and not overly broad; use context from the answers in the conversation to make it specific so the user gets the best video search results.
 Output only the rephrased query in query key JSON format. Do not include any explanation or additional text.
 `;

View File

@@ -31,6 +31,10 @@ NOTE: BY GENERAL KNOWLEDGE WE MEAN INFORMATION THAT IS OBVIOUS, WIDELY KNOWN, OR
 - Set it to true if the user's query is specifically about current stock prices or stock related information for particular companies. Never use it for a market analysis or news about stock market.
 - Set it to true for queries like "What's the stock price of [Company]?" or "How is the [Stock] performing today?" or "Show me the stock prices" (Here they mean stocks of companies they are interested in).
 - If it can fully answer the user query without needing additional search, set skipSearch to true as well.
+7. showCalculationWidget (boolean): Decide if displaying a calculation widget would adequately address the user's query.
+  - Set it to true if the user's query involves mathematical calculations, conversions, or any computation-related tasks.
+  - Set it to true for queries like "What is 25% of 80?" or "Convert 100 USD to EUR" or "Calculate the square root of 256" or "What is 2 * 3 + 5?" or other mathematical expressions.
+  - If the widget can fully answer the user query without needing additional search, set skipSearch to true as well.
 </labels>
 
 <standalone_followup>
@@ -51,7 +55,8 @@ You must respond in the following JSON format without any extra text, explanations or comments:
     "academicSearch": boolean,
     "discussionSearch": boolean,
     "showWeatherWidget": boolean,
-    "showStockWidget": boolean
+    "showStockWidget": boolean,
+    "showCalculationWidget": boolean
   },
   "standaloneFollowUp": string
 }

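For concreteness, a classifier response matching the updated format might look like the following for the query "What is 12% of 250?"; the values are illustrative, and only the label fields visible in this diff are shown (the real schema has more):

// Illustrative output only; not produced by the actual classifier.
const example = {
  labels: {
    academicSearch: false,
    discussionSearch: false,
    showWeatherWidget: false,
    showStockWidget: false,
    showCalculationWidget: true,
  },
  standaloneFollowUp: 'What is 12% of 250?',
};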
Some files were not shown because too many files have changed in this diff.