Mirror of https://github.com/ItzCrazyKns/Perplexica.git, synced 2025-09-16 06:11:32 +00:00
Compare commits: `v1.10.1...0617ac9410` (57 commits)

Commit SHA1s (newest first): `0617ac9410`, `85605fe166`, `d839769d7e`, `ddfe8c607d`, `f65b168388`, `8796009141`, `6220822c7c`, `8241c87784`, `c8def1989a`, `a71e4ae10d`, `abf9dbb8ba`, `68e151b2bd`, `06ff272541`, `4154d5e4b1`, `b3aafba30c`, `9f7fd178e0`, `59a10d7d00`, `67ee9eff53`, `0bb860b154`, `c0705d1d9e`, `73b5e8832e`, `b2da9faeed`, `1a2ad8a59d`, `1862491496`, `073b5e897c`, `9a332e79e4`, `72450b9217`, `7e1dc33a08`, `aa240009ab`, `41b258e4d8`, `da1123d84b`, `627775c430`, `245573efca`, `28b9cca413`, `a85f762c58`, `3ddcceda0a`, `e226645bc7`, `5447530ece`, `ed6d46a440`, `588e68e93e`, `c4440327db`, `64e2d457cc`, `bf705afc21`, `2e4433a6b3`, `09661ae11d`, `a8d410bc2f`, `7d52fbb368`, `4b8e0ea1aa`, `5b1055e8c9`, `4b2a7916fd`, `e0817d1008`, `690ef42861`, `b84e4e4ce6`, `467905d9f2`, `18b6f5b674`, `2bdcbf20fb`, `8aaee2c40c`
**`.github/copilot-instructions.md`** (vendored, new file, 94 lines)

```markdown
# GitHub Copilot Instructions for Perplexica

This file provides context and guidance for GitHub Copilot when working with the Perplexica codebase.

## Project Overview

Perplexica is an open-source AI-powered search engine that uses advanced machine learning to provide intelligent search results. It combines web search capabilities with LLM-based processing to understand and answer user questions, similar to Perplexity AI but fully open source.

## Key Components

- **Frontend**: Next.js application with React components (in `/src/components` and `/src/app`)
- **Backend Logic**: Node.js backend with API routes (in `/src/app/api`) and library code (in `/src/lib`)
- **Search Engine**: Uses SearXNG as a metadata search engine
- **LLM Integration**: Supports multiple models including OpenAI, Anthropic, Groq, Ollama (local models)
- **Database**: SQLite database managed with Drizzle ORM

## Architecture

The system works through these main steps:

- User submits a query
- The system determines if web search is needed
- If needed, it searches the web using SearXNG
- Results are ranked using embedding-based similarity search
- LLMs are used to generate a comprehensive response with cited sources

## Key Technologies

- **Frontend**: React, Next.js, Tailwind CSS
- **Backend**: Node.js
- **Database**: SQLite with Drizzle ORM
- **AI/ML**: LangChain for orchestration, various LLM providers
- **Search**: SearXNG integration
- **Embedding Models**: For re-ranking search results

## Project Structure

- `/src/app`: Next.js app directory with page components and API routes
- `/src/components`: Reusable UI components
- `/src/lib`: Backend functionality
  - `/lib/search`: Search functionality and meta search agent
  - `/lib/db`: Database schema and operations
  - `/lib/providers`: LLM and embedding model integrations
  - `/lib/prompts`: Prompt templates for LLMs
  - `/lib/chains`: LangChain chains for various operations

## Focus Modes

Perplexica supports multiple specialized search modes:

- All Mode: General web search
- Local Research Mode: Research and interact with local files with citations
- Chat Mode: Have a creative conversation
- Academic Search Mode: For academic research
- YouTube Search Mode: For video content
- Wolfram Alpha Search Mode: For calculations and data analysis
- Reddit Search Mode: For community discussions

## Development Workflow

- Use `npm run dev` for local development
- Format code with `npm run format:write` before committing
- Database migrations: `npm run db:push`
- Build for production: `npm run build`
- Start production server: `npm run start`

## Configuration

The application uses a `config.toml` file (created from `sample.config.toml`) for configuration, including:

- API keys for various LLM providers
- Database settings
- Search engine configuration
- Similarity measure settings

## Common Tasks

When working on this codebase, you might need to:

- Add new API endpoints in `/src/app/api`
- Modify UI components in `/src/components`
- Extend search functionality in `/src/lib/search`
- Add new LLM providers in `/src/lib/providers`
- Update database schema in `/src/lib/db/schema.ts`
- Create new prompt templates in `/src/lib/prompts`
- Build new chains in `/src/lib/chains`

## AI Behavior

- Avoid conciliatory language
- It is not necessary to apologize
- If you don't know the answer, ask for clarification
- Do not add additional packages or dependencies unless explicitly requested
- Only make changes to the code that are relevant to the task at hand
```
**`.github/workflows/docker-build.yaml`** (vendored, +5)

```diff
@@ -114,6 +114,11 @@ jobs:
           username: ${{ secrets.DOCKER_USERNAME }}
           password: ${{ secrets.DOCKER_PASSWORD }}
+
+      - name: Extract version from release tag
+        if: github.event_name == 'release'
+        id: version
+        run: echo "RELEASE_VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_ENV

       - name: Create and push multi-arch manifest for main
         if: github.ref == 'refs/heads/master' && github.event_name == 'push'
         run: |
```
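The `${GITHUB_REF#refs/tags/}` shell expansion in the new step strips the `refs/tags/` prefix, leaving the bare tag name. As a minimal sketch, the same transformation in TypeScript (the helper name here is illustrative, not part of the workflow):

```typescript
// Mirrors the shell expansion ${GITHUB_REF#refs/tags/}: strip the
// "refs/tags/" prefix if present, otherwise return the ref unchanged.
function releaseVersionFromRef(ref: string): string {
  const prefix = 'refs/tags/';
  return ref.startsWith(prefix) ? ref.slice(prefix.length) : ref;
}

console.log(releaseVersionFromRef('refs/tags/v1.10.2')); // "v1.10.2"
console.log(releaseVersionFromRef('refs/heads/master')); // unchanged
```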
**`README.md`** (53 changed lines)

````diff
@@ -1,21 +1,5 @@
 # 🚀 Perplexica - An AI-powered search engine 🔎 <!-- omit in toc -->
-
-<div align="center" markdown="1">
-<sup>Special thanks to:</sup>
-<br>
-<br>
-<a href="https://www.warp.dev/perplexica">
-<img alt="Warp sponsorship" width="400" src="https://github.com/user-attachments/assets/775dd593-9b5f-40f1-bf48-479faff4c27b">
-</a>
-
-### [Warp, the AI Devtool that lives in your terminal](https://www.warp.dev/perplexica)
-
-[Available for MacOS, Linux, & Windows](https://www.warp.dev/perplexica)
-
-</div>
-
-<hr/>
-
 [](https://discord.gg/26aArMy8tT)
 
@@ -57,9 +41,10 @@ Want to know more about its architecture and how it works? You can read it [here
 - **Two Main Modes:**
   - **Copilot Mode:** (In development) Boosts search by generating different queries to find more relevant internet sources. Like normal search instead of just using the context by SearxNG, it visits the top matches and tries to find relevant sources to the user's query directly from the page.
   - **Normal Mode:** Processes your query and performs a web search.
-- **Focus Modes:** Special modes to better answer specific types of questions. Perplexica currently has 6 focus modes:
+- **Focus Modes:** Special modes to better answer specific types of questions. Perplexica currently has 7 focus modes:
   - **All Mode:** Searches the entire web to find the best results.
-  - **Writing Assistant Mode:** Helpful for writing tasks that do not require searching the web.
+  - **Local Research Mode:** Research and interact with local files with citations.
+  - **Chat Mode:** Have a truly creative conversation without web search.
   - **Academic Search Mode:** Finds articles and papers, ideal for academic research.
   - **YouTube Search Mode:** Finds YouTube videos based on the search query.
   - **Wolfram Alpha Search Mode:** Answers queries that need calculations or data analysis using Wolfram Alpha.
@@ -155,10 +140,42 @@ For more details, check out the full documentation [here](https://github.com/Itz
 
 Perplexica runs on Next.js and handles all API requests. It works right away on the same network and stays accessible even with port forwarding.
 
+### Running Behind a Reverse Proxy
+
+When running Perplexica behind a reverse proxy (like Nginx, Apache, or Traefik), follow these steps to ensure proper functionality:
+
+1. **Configure the BASE_URL setting**:
+   - In `config.toml`, set the `BASE_URL` parameter under the `[GENERAL]` section to your public-facing URL (e.g., `https://perplexica.yourdomain.com`)
+
+2. **Ensure proper headers forwarding**:
+   - Your reverse proxy should forward the following headers:
+     - `X-Forwarded-Host`
+     - `X-Forwarded-Proto`
+     - `X-Forwarded-Port` (if using non-standard ports)
+
+3. **Example Nginx configuration**:
+
+   ```nginx
+   server {
+       listen 80;
+       server_name perplexica.yourdomain.com;
+
+       location / {
+           proxy_pass http://localhost:3000;
+           proxy_set_header Host $host;
+           proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+           proxy_set_header X-Forwarded-Proto $scheme;
+           proxy_set_header X-Forwarded-Host $host;
+       }
+   }
+   ```
+
+This ensures that OpenSearch descriptions, browser integrations, and all URLs work properly when accessing Perplexica through your reverse proxy.
+
 ## One-Click Deployment
 
 [](https://usw.sealos.io/?openapp=system-template%3FtemplateName%3Dperplexica)
 [](https://repocloud.io/details/?app_id=267)
+[](https://template.run.claw.cloud/?referralCode=U11MRQ8U9RM4&openapp=system-fastdeploy%3FtemplateName%3Dperplexica)
 
 ## Upcoming Features
````
````diff
@@ -33,6 +33,7 @@ The API accepts a JSON object in the request body, where you define the focus mo
     ["human", "Hi, how are you?"],
     ["assistant", "I am doing well, how can I help you today?"]
   ],
+  "systemInstructions": "Focus on providing technical details about Perplexica's architecture.",
   "stream": false
 }
 ```
@@ -54,7 +55,7 @@ The API accepts a JSON object in the request body, where you define the focus mo
 
 - **`focusMode`** (string, required): Specifies which focus mode to use. Available modes:
 
-  - `webSearch`, `academicSearch`, `writingAssistant`, `wolframAlphaSearch`, `youtubeSearch`, `redditSearch`.
+  - `webSearch`, `academicSearch`, `localResearch`, `chat`, `wolframAlphaSearch`, `youtubeSearch`, `redditSearch`.
 
 - **`optimizationMode`** (string, optional): Specifies the optimization mode to control the balance between performance and quality. Available modes:
 
@@ -63,6 +64,8 @@ The API accepts a JSON object in the request body, where you define the focus mo
 
 - **`query`** (string, required): The search query or question.
 
+- **`systemInstructions`** (string, optional): Custom instructions provided by the user to guide the AI's response. These instructions are treated as user preferences and have lower priority than the system's core instructions. For example, you can specify a particular writing style, format, or focus area.
+
 - **`history`** (array, optional): An array of message pairs representing the conversation history. Each pair consists of a role (either 'human' or 'assistant') and the message content. This allows the system to use the context of the conversation to refine results. Example:
 
 ```json
````
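The updated `focusMode` list can be validated client-side before sending a request. A small sketch under the assumption that the seven listed mode strings are exhaustive (the `isValidFocusMode` guard is hypothetical, not part of Perplexica's API):

```typescript
// The seven focus modes the search API accepts after this change.
const FOCUS_MODES = [
  'webSearch',
  'academicSearch',
  'localResearch',
  'chat',
  'wolframAlphaSearch',
  'youtubeSearch',
  'redditSearch',
] as const;

type FocusMode = (typeof FOCUS_MODES)[number];

// Hypothetical guard: narrow an arbitrary string to a known focus mode.
function isValidFocusMode(mode: string): mode is FocusMode {
  return (FOCUS_MODES as readonly string[]).includes(mode);
}

console.log(isValidFocusMode('chat')); // true
console.log(isValidFocusMode('writingAssistant')); // false — replaced in this change
```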
**`package-lock.json`** (generated, new file, 11,860 lines; diff suppressed because it is too large)
**`package.json`**

```diff
@@ -1,10 +1,10 @@
 {
   "name": "perplexica-frontend",
-  "version": "1.10.1",
+  "version": "1.10.2",
   "license": "MIT",
   "author": "ItzCrazyKns",
   "scripts": {
-    "dev": "next dev",
+    "dev": "next dev --turbopack",
     "build": "npm run db:push && next build",
     "start": "next start",
     "lint": "next lint",
@@ -19,9 +19,11 @@
     "@langchain/community": "^0.3.36",
     "@langchain/core": "^0.3.42",
     "@langchain/google-genai": "^0.1.12",
+    "@langchain/ollama": "^0.2.0",
     "@langchain/openai": "^0.0.25",
     "@langchain/textsplitters": "^0.1.0",
     "@tailwindcss/typography": "^0.5.12",
+    "@types/react-syntax-highlighter": "^15.5.13",
     "@xenova/transformers": "^2.17.2",
     "axios": "^1.8.3",
     "better-sqlite3": "^11.9.1",
@@ -38,6 +40,7 @@
     "pdf-parse": "^1.1.1",
     "react": "^18",
     "react-dom": "^18",
+    "react-syntax-highlighter": "^15.6.1",
     "react-text-to-speech": "^0.14.5",
     "react-textarea-autosize": "^8.5.3",
     "sonner": "^1.4.41",
```
**`sample.config.toml`**

```diff
@@ -1,6 +1,7 @@
 [GENERAL]
 SIMILARITY_MEASURE = "cosine" # "cosine" or "dot"
 KEEP_ALIVE = "5m" # How long to keep Ollama models loaded into memory. (Instead of using -1 use "-1m")
+BASE_URL = "" # Optional. When set, overrides detected URL for OpenSearch and other public URLs
 
 [MODELS.OPENAI]
 API_KEY = ""
@@ -22,5 +23,11 @@ MODEL_NAME = ""
 [MODELS.OLLAMA]
 API_URL = "" # Ollama API URL - http://host.docker.internal:11434
 
+[MODELS.DEEPSEEK]
+API_KEY = ""
+
+[MODELS.LM_STUDIO]
+API_URL = "" # LM Studio API URL - http://host.docker.internal:1234
+
 [API_ENDPOINTS]
 SEARXNG = "" # SearxNG API URL - http://localhost:32768
```
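`SIMILARITY_MEASURE` selects how embedding similarity is scored when re-ranking search results. As a sketch of the two options named in the config (these standalone functions are illustrative, not Perplexica's internal implementation):

```typescript
// Dot product of two equal-length embedding vectors.
function dot(a: number[], b: number[]): number {
  return a.reduce((sum, x, i) => sum + x * b[i], 0);
}

// Cosine similarity: the dot product normalized by vector magnitudes,
// so only the angle between embeddings matters, not their length.
function cosine(a: number[], b: number[]): number {
  return dot(a, b) / (Math.hypot(...a) * Math.hypot(...b));
}

console.log(dot([1, 2], [3, 4])); // 11
console.log(cosine([2, 0], [5, 0])); // 1 — same direction, different magnitude
```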
```diff
@@ -1,26 +1,23 @@
-import prompts from '@/lib/prompts';
-import MetaSearchAgent from '@/lib/search/metaSearchAgent';
-import crypto from 'crypto';
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
-import { EventEmitter } from 'stream';
-import {
-  chatModelProviders,
-  embeddingModelProviders,
-  getAvailableChatModelProviders,
-  getAvailableEmbeddingModelProviders,
-} from '@/lib/providers';
-import db from '@/lib/db';
-import { chats, messages as messagesSchema } from '@/lib/db/schema';
-import { and, eq, gt } from 'drizzle-orm';
-import { getFileDetails } from '@/lib/utils/files';
-import { BaseChatModel } from '@langchain/core/language_models/chat_models';
-import { ChatOpenAI } from '@langchain/openai';
 import {
   getCustomOpenaiApiKey,
   getCustomOpenaiApiUrl,
   getCustomOpenaiModelName,
 } from '@/lib/config';
+import db from '@/lib/db';
+import { chats, messages as messagesSchema } from '@/lib/db/schema';
+import {
+  getAvailableChatModelProviders,
+  getAvailableEmbeddingModelProviders,
+} from '@/lib/providers';
 import { searchHandlers } from '@/lib/search';
+import { getFileDetails } from '@/lib/utils/files';
+import { BaseChatModel } from '@langchain/core/language_models/chat_models';
+import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
+import { ChatOllama } from '@langchain/ollama';
+import { ChatOpenAI } from '@langchain/openai';
+import crypto from 'crypto';
+import { and, eq, gte } from 'drizzle-orm';
+import { EventEmitter } from 'stream';
 
 export const runtime = 'nodejs';
 export const dynamic = 'force-dynamic';
@@ -34,6 +31,7 @@ type Message = {
 type ChatModel = {
   provider: string;
   name: string;
+  ollamaContextWindow?: number;
 };
 
 type EmbeddingModel = {
@@ -49,6 +47,12 @@ type Body = {
   files: Array<string>;
   chatModel: ChatModel;
   embeddingModel: EmbeddingModel;
+  systemInstructions: string;
+};
+
+type ModelStats = {
+  modelName: string;
+  responseTime?: number;
 };
 
 const handleEmitterEvents = async (
@@ -57,9 +61,12 @@ const handleEmitterEvents = async (
   encoder: TextEncoder,
   aiMessageId: string,
   chatId: string,
+  startTime: number,
 ) => {
   let recievedMessage = '';
   let sources: any[] = [];
+  let searchQuery: string | undefined;
+  let searchUrl: string | undefined;
 
   stream.on('data', (data) => {
     const parsedData = JSON.parse(data);
@@ -76,12 +83,22 @@ const handleEmitterEvents = async (
 
       recievedMessage += parsedData.data;
     } else if (parsedData.type === 'sources') {
+      // Capture the search query if available
+      if (parsedData.searchQuery) {
+        searchQuery = parsedData.searchQuery;
+      }
+      if (parsedData.searchUrl) {
+        searchUrl = parsedData.searchUrl;
+      }
+
       writer.write(
         encoder.encode(
           JSON.stringify({
             type: 'sources',
             data: parsedData.data,
+            searchQuery: parsedData.searchQuery,
             messageId: aiMessageId,
+            searchUrl: searchUrl,
           }) + '\n',
         ),
       );
@@ -89,12 +106,34 @@ const handleEmitterEvents = async (
       sources = parsedData.data;
     }
   });
+  let modelStats: ModelStats = {
+    modelName: '',
+  };
+
+  stream.on('stats', (data) => {
+    const parsedData = JSON.parse(data);
+    if (parsedData.type === 'modelStats') {
+      modelStats = parsedData.data;
+    }
+  });
+
   stream.on('end', () => {
+    const endTime = Date.now();
+    const duration = endTime - startTime;
+
+    modelStats = {
+      ...modelStats,
+      responseTime: duration,
+    };
+
     writer.write(
       encoder.encode(
         JSON.stringify({
           type: 'messageEnd',
           messageId: aiMessageId,
+          modelStats: modelStats,
+          searchQuery: searchQuery,
+          searchUrl: searchUrl,
         }) + '\n',
       ),
     );
@@ -109,6 +148,9 @@ const handleEmitterEvents = async (
       metadata: JSON.stringify({
         createdAt: new Date(),
         ...(sources && sources.length > 0 && { sources }),
+        ...(searchQuery && { searchQuery }),
+        modelStats: modelStats,
+        ...(searchUrl && { searchUrl }),
       }),
     })
       .execute();
@@ -172,7 +214,7 @@ const handleHistorySave = async (
     .delete(messagesSchema)
     .where(
       and(
-        gt(messagesSchema.id, messageExists.id),
+        gte(messagesSchema.id, messageExists.id),
        eq(messagesSchema.chatId, message.chatId),
      ),
    )
@@ -182,6 +224,7 @@ const handleHistorySave = async (
 
 export const POST = async (req: Request) => {
   try {
+    const startTime = Date.now();
     const body = (await req.json()) as Body;
     const { message } = body;
 
@@ -231,6 +274,11 @@ export const POST = async (req: Request) => {
     }) as unknown as BaseChatModel;
   } else if (chatModelProvider && chatModel) {
     llm = chatModel.model;
+
+    // Set context window size for Ollama models
+    if (llm instanceof ChatOllama && body.chatModel?.provider === 'ollama') {
+      llm.numCtx = body.chatModel.ollamaContextWindow || 2048;
+    }
   }
 
   if (!llm) {
@@ -278,13 +326,21 @@ export const POST = async (req: Request) => {
       embedding,
       body.optimizationMode,
       body.files,
+      body.systemInstructions,
     );
 
     const responseStream = new TransformStream();
     const writer = responseStream.writable.getWriter();
     const encoder = new TextEncoder();
 
-    handleEmitterEvents(stream, writer, encoder, aiMessageId, message.chatId);
+    handleEmitterEvents(
+      stream,
+      writer,
+      encoder,
+      aiMessageId,
+      message.chatId,
+      startTime,
+    );
     handleHistorySave(message, humanMessageId, body.focusMode, body.files);
 
     return new Response(responseStream.readable, {
```
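The `stats`/`end` handling merges provider-reported stats with a wall-clock response time measured from request start. The essential merge step, isolated as a pure sketch (the `finalizeStats` helper and the model name are illustrative):

```typescript
type ModelStats = { modelName: string; responseTime?: number };

// Merge stats reported on the model stream with the measured duration,
// as the 'end' handler does with Date.now() - startTime.
function finalizeStats(
  stats: ModelStats,
  startTime: number,
  endTime: number,
): ModelStats {
  return { ...stats, responseTime: endTime - startTime };
}

const finalStats = finalizeStats({ modelName: 'example-model' }, 1_000, 1_750);
console.log(finalStats); // { modelName: 'example-model', responseTime: 750 }
```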
```diff
@@ -1,5 +1,6 @@
 import {
   getAnthropicApiKey,
+  getBaseUrl,
   getCustomOpenaiApiKey,
   getCustomOpenaiApiUrl,
   getCustomOpenaiModelName,
@@ -7,6 +8,8 @@ import {
   getGroqApiKey,
   getOllamaApiEndpoint,
   getOpenaiApiKey,
+  getDeepseekApiKey,
+  getLMStudioApiEndpoint,
   updateConfig,
 } from '@/lib/config';
 import {
@@ -50,12 +53,15 @@ export const GET = async (req: Request) => {
 
     config['openaiApiKey'] = getOpenaiApiKey();
     config['ollamaApiUrl'] = getOllamaApiEndpoint();
+    config['lmStudioApiUrl'] = getLMStudioApiEndpoint();
     config['anthropicApiKey'] = getAnthropicApiKey();
     config['groqApiKey'] = getGroqApiKey();
     config['geminiApiKey'] = getGeminiApiKey();
+    config['deepseekApiKey'] = getDeepseekApiKey();
     config['customOpenaiApiUrl'] = getCustomOpenaiApiUrl();
     config['customOpenaiApiKey'] = getCustomOpenaiApiKey();
     config['customOpenaiModelName'] = getCustomOpenaiModelName();
+    config['baseUrl'] = getBaseUrl();
 
     return Response.json({ ...config }, { status: 200 });
   } catch (err) {
@@ -88,6 +94,12 @@ export const POST = async (req: Request) => {
       OLLAMA: {
         API_URL: config.ollamaApiUrl,
       },
+      DEEPSEEK: {
+        API_KEY: config.deepseekApiKey,
+      },
+      LM_STUDIO: {
+        API_URL: config.lmStudioApiUrl,
+      },
       CUSTOM_OPENAI: {
         API_URL: config.customOpenaiApiUrl,
         API_KEY: config.customOpenaiApiKey,
```
```diff
@@ -7,11 +7,13 @@ import {
 import { getAvailableChatModelProviders } from '@/lib/providers';
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
+import { ChatOllama } from '@langchain/ollama';
 import { ChatOpenAI } from '@langchain/openai';
 
 interface ChatModel {
   provider: string;
   model: string;
+  ollamaContextWindow?: number;
 }
 
 interface ImageSearchBody {
@@ -58,6 +60,10 @@ export const POST = async (req: Request) => {
     }) as unknown as BaseChatModel;
   } else if (chatModelProvider && chatModel) {
     llm = chatModel.model;
+    // Set context window size for Ollama models
+    if (llm instanceof ChatOllama && body.chatModel?.provider === 'ollama') {
+      llm.numCtx = body.chatModel.ollamaContextWindow || 2048;
+    }
   }
 
   if (!llm) {
```
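The `ollamaContextWindow || 2048` fallback picks a default context size when the client omits one. A minimal sketch of that resolution (the helper name is illustrative):

```typescript
// Resolve the Ollama context window, defaulting to 2048 tokens.
// Note: `||` (as used in the routes) also maps an explicit 0 to 2048,
// which `??` would not — likely intentional, since 0 is not a usable size.
function resolveNumCtx(ollamaContextWindow?: number): number {
  return ollamaContextWindow || 2048;
}

console.log(resolveNumCtx(8192)); // 8192
console.log(resolveNumCtx()); // 2048
```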
 63  src/app/api/opensearch/route.ts  (new file)
@@ -0,0 +1,63 @@
+import { NextResponse } from 'next/server';
+import { getBaseUrl } from '@/lib/config';
+
+/**
+ * Creates an OpenSearch XML response with the given origin URL
+ */
+function generateOpenSearchResponse(origin: string): NextResponse {
+  const opensearchXml = `<?xml version="1.0" encoding="utf-8"?>
+<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/" xmlns:moz="http://www.mozilla.org/2006/browser/search/">
+  <ShortName>Perplexica</ShortName>
+  <LongName>Search with Perplexica AI</LongName>
+  <Description>Perplexica is a powerful AI-driven search engine that understands your queries and delivers relevant results.</Description>
+  <InputEncoding>UTF-8</InputEncoding>
+  <Image width="16" height="16" type="image/x-icon">${origin}/favicon.ico</Image>
+  <Url type="text/html" template="${origin}/?q={searchTerms}"/>
+  <Url type="application/opensearchdescription+xml" rel="self" template="${origin}/api/opensearch"/>
+</OpenSearchDescription>`;
+
+  return new NextResponse(opensearchXml, {
+    headers: {
+      'Content-Type': 'application/opensearchdescription+xml',
+    },
+  });
+}
+
+export async function GET(request: Request) {
+  // Check if a BASE_URL is explicitly configured
+  const configBaseUrl = getBaseUrl();
+
+  // If BASE_URL is configured, use it, otherwise detect from request
+  if (configBaseUrl) {
+    // Remove any trailing slashes for consistency
+    let origin = configBaseUrl.replace(/\/+$/, '');
+    return generateOpenSearchResponse(origin);
+  }
+
+  // Detect the correct origin, taking into account reverse proxy headers
+  const url = new URL(request.url);
+  let origin = url.origin;
+
+  // Extract headers
+  const headers = Object.fromEntries(request.headers);
+
+  // Check for X-Forwarded-Host and related headers to handle reverse proxies
+  if (headers['x-forwarded-host']) {
+    // Determine protocol: prefer X-Forwarded-Proto, fall back to original or https
+    const protocol = headers['x-forwarded-proto'] || url.protocol.replace(':', '');
+    // Build the correct public-facing origin
+    origin = `${protocol}://${headers['x-forwarded-host']}`;
+
+    // Handle non-standard ports if specified in X-Forwarded-Port
+    if (headers['x-forwarded-port']) {
+      const port = headers['x-forwarded-port'];
+      // Don't append standard ports (80 for HTTP, 443 for HTTPS)
+      if (!((protocol === 'http' && port === '80') || (protocol === 'https' && port === '443'))) {
+        origin = `${origin}:${port}`;
+      }
+    }
+  }
+
+  // Generate and return the OpenSearch response
+  return generateOpenSearchResponse(origin);
+}
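For context, the reverse-proxy origin logic added in the new route above can be exercised in isolation. This is a minimal sketch, not part of the PR: `resolveOrigin` is a hypothetical helper name, and it takes the URL origin, protocol, and headers as plain values instead of reading them from a `Request`.

```typescript
// Hypothetical standalone version of the origin reconstruction in the hunk above.
// resolveOrigin and its parameters are illustrative names, not part of the PR.
function resolveOrigin(
  urlOrigin: string,
  urlProtocol: string,
  headers: Record<string, string>,
): string {
  let origin = urlOrigin;
  if (headers['x-forwarded-host']) {
    // Prefer X-Forwarded-Proto, fall back to the request's own protocol
    const protocol = headers['x-forwarded-proto'] || urlProtocol.replace(':', '');
    origin = `${protocol}://${headers['x-forwarded-host']}`;
    const port = headers['x-forwarded-port'];
    // Append only non-standard ports (skip 80 for HTTP, 443 for HTTPS)
    if (
      port &&
      !((protocol === 'http' && port === '80') || (protocol === 'https' && port === '443'))
    ) {
      origin = `${origin}:${port}`;
    }
  }
  return origin;
}
```

With no forwarded headers the request's own origin is returned unchanged; with `x-forwarded-host` and `x-forwarded-proto` set by a reverse proxy, the public-facing origin is rebuilt from those headers.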
@@ -13,12 +13,14 @@ import {
   getCustomOpenaiModelName,
 } from '@/lib/config';
 import { searchHandlers } from '@/lib/search';
+import { ChatOllama } from '@langchain/ollama';
 
 interface chatModel {
   provider: string;
   name: string;
   customOpenAIKey?: string;
   customOpenAIBaseURL?: string;
+  ollamaContextWindow?: number;
 }
 
 interface embeddingModel {
@@ -34,6 +36,7 @@ interface ChatRequestBody {
   query: string;
   history: Array<[string, string]>;
   stream?: boolean;
+  systemInstructions?: string;
 }
 
 export const POST = async (req: Request) => {
@@ -96,6 +99,10 @@ export const POST = async (req: Request) => {
       .model as unknown as BaseChatModel | undefined;
   }
 
+  if (llm instanceof ChatOllama && body.chatModel?.provider === 'ollama') {
+    llm.numCtx = body.chatModel.ollamaContextWindow || 2048;
+  }
+
   if (
     embeddingModelProviders[embeddingModelProvider] &&
     embeddingModelProviders[embeddingModelProvider][embeddingModel]
@@ -125,6 +132,7 @@ export const POST = async (req: Request) => {
     embeddings,
     body.optimizationMode,
     [],
+    body.systemInstructions || '',
   );
 
   if (!body.stream) {
@@ -8,10 +8,12 @@ import { getAvailableChatModelProviders } from '@/lib/providers';
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
 import { ChatOpenAI } from '@langchain/openai';
+import { ChatOllama } from '@langchain/ollama';
 
 interface ChatModel {
   provider: string;
   model: string;
+  ollamaContextWindow?: number;
 }
 
 interface SuggestionsGenerationBody {
@@ -57,6 +59,10 @@ export const POST = async (req: Request) => {
     }) as unknown as BaseChatModel;
   } else if (chatModelProvider && chatModel) {
     llm = chatModel.model;
+    // Set context window size for Ollama models
+    if (llm instanceof ChatOllama && body.chatModel?.provider === 'ollama') {
+      llm.numCtx = body.chatModel.ollamaContextWindow || 2048;
+    }
   }
 
   if (!llm) {
@@ -7,11 +7,13 @@ import {
 import { getAvailableChatModelProviders } from '@/lib/providers';
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
+import { ChatOllama } from '@langchain/ollama';
 import { ChatOpenAI } from '@langchain/openai';
 
 interface ChatModel {
   provider: string;
   model: string;
+  ollamaContextWindow?: number;
 }
 
 interface VideoSearchBody {
@@ -58,6 +60,10 @@ export const POST = async (req: Request) => {
     }) as unknown as BaseChatModel;
   } else if (chatModelProvider && chatModel) {
     llm = chatModel.model;
+    // Set context window size for Ollama models
+    if (llm instanceof ChatOllama && body.chatModel?.provider === 'ollama') {
+      llm.numCtx = body.chatModel.ollamaContextWindow || 2048;
+    }
   }
 
   if (!llm) {
@@ -26,6 +26,14 @@ export default function RootLayout({
 }>) {
   return (
     <html className="h-full" lang="en" suppressHydrationWarning>
+      <head>
+        <link
+          rel="search"
+          type="application/opensearchdescription+xml"
+          title="Perplexica Search"
+          href="/api/opensearch"
+        />
+      </head>
       <body className={cn('h-full', montserrat.className)}>
        <ThemeProvider>
          <Sidebar>{children}</Sidebar>
@@ -5,8 +5,9 @@ import { useEffect, useState } from 'react';
 import { cn } from '@/lib/utils';
 import { Switch } from '@headlessui/react';
 import ThemeSwitcher from '@/components/theme/Switcher';
-import { ImagesIcon, VideoIcon } from 'lucide-react';
+import { ImagesIcon, VideoIcon, Layers3 } from 'lucide-react';
 import Link from 'next/link';
+import { PROVIDER_METADATA } from '@/lib/providers';
 
 interface SettingsType {
   chatModelProviders: {
@@ -20,9 +21,12 @@ interface SettingsType {
   anthropicApiKey: string;
   geminiApiKey: string;
   ollamaApiUrl: string;
+  lmStudioApiUrl: string;
+  deepseekApiKey: string;
   customOpenaiApiKey: string;
   customOpenaiApiUrl: string;
   customOpenaiModelName: string;
+  ollamaContextWindow: number;
 }
 
 interface InputProps extends React.InputHTMLAttributes<HTMLInputElement> {
@@ -54,6 +58,38 @@ const Input = ({ className, isSaving, onSave, ...restProps }: InputProps) => {
   );
 };
 
+interface TextareaProps extends React.InputHTMLAttributes<HTMLTextAreaElement> {
+  isSaving?: boolean;
+  onSave?: (value: string) => void;
+}
+
+const Textarea = ({
+  className,
+  isSaving,
+  onSave,
+  ...restProps
+}: TextareaProps) => {
+  return (
+    <div className="relative">
+      <textarea
+        placeholder="Any special instructions for the LLM"
+        className="placeholder:text-sm text-sm w-full flex items-center justify-between p-3 bg-light-secondary dark:bg-dark-secondary rounded-lg hover:bg-light-200 dark:hover:bg-dark-200 transition-colors"
+        rows={4}
+        onBlur={(e) => onSave?.(e.target.value)}
+        {...restProps}
+      />
+      {isSaving && (
+        <div className="absolute right-3 top-3">
+          <Loader2
+            size={16}
+            className="animate-spin text-black/70 dark:text-white/70"
+          />
+        </div>
+      )}
+    </div>
+  );
+};
+
 const Select = ({
   className,
   options,
@@ -111,7 +147,14 @@ const Page = () => {
   const [isLoading, setIsLoading] = useState(false);
   const [automaticImageSearch, setAutomaticImageSearch] = useState(false);
   const [automaticVideoSearch, setAutomaticVideoSearch] = useState(false);
+  const [automaticSuggestions, setAutomaticSuggestions] = useState(true);
+  const [systemInstructions, setSystemInstructions] = useState<string>('');
   const [savingStates, setSavingStates] = useState<Record<string, boolean>>({});
+  const [contextWindowSize, setContextWindowSize] = useState(2048);
+  const [isCustomContextWindow, setIsCustomContextWindow] = useState(false);
+  const predefinedContextSizes = [
+    1024, 2048, 3072, 4096, 8192, 16384, 32768, 65536, 131072,
+  ];
 
   useEffect(() => {
     const fetchConfig = async () => {
@@ -123,6 +166,7 @@ const Page = () => {
       });
 
       const data = (await res.json()) as SettingsType;
+
       setConfig(data);
 
       const chatModelProvidersKeys = Object.keys(data.chatModelProviders || {});
@@ -171,6 +215,18 @@ const Page = () => {
       setAutomaticVideoSearch(
         localStorage.getItem('autoVideoSearch') === 'true',
       );
+      setAutomaticSuggestions(
+        localStorage.getItem('autoSuggestions') !== 'false', // default to true if not set
+      );
+      const storedContextWindow = parseInt(
+        localStorage.getItem('ollamaContextWindow') ?? '2048',
+      );
+      setContextWindowSize(storedContextWindow);
+      setIsCustomContextWindow(
+        !predefinedContextSizes.includes(storedContextWindow),
+      );
+
+      setSystemInstructions(localStorage.getItem('systemInstructions')!);
 
       setIsLoading(false);
     };
@@ -320,6 +376,8 @@ const Page = () => {
         localStorage.setItem('autoImageSearch', value.toString());
       } else if (key === 'automaticVideoSearch') {
         localStorage.setItem('autoVideoSearch', value.toString());
+      } else if (key === 'automaticSuggestions') {
+        localStorage.setItem('autoSuggestions', value.toString());
       } else if (key === 'chatModelProvider') {
         localStorage.setItem('chatModelProvider', value);
       } else if (key === 'chatModel') {
@@ -328,6 +386,10 @@ const Page = () => {
         localStorage.setItem('embeddingModelProvider', value);
       } else if (key === 'embeddingModel') {
         localStorage.setItem('embeddingModel', value);
+      } else if (key === 'ollamaContextWindow') {
+        localStorage.setItem('ollamaContextWindow', value.toString());
+      } else if (key === 'systemInstructions') {
+        localStorage.setItem('systemInstructions', value);
       }
     } catch (err) {
       console.error('Failed to save:', err);
@@ -470,6 +532,60 @@ const Page = () => {
                   />
                 </Switch>
               </div>
+
+              <div className="flex items-center justify-between p-3 bg-light-secondary dark:bg-dark-secondary rounded-lg hover:bg-light-200 dark:hover:bg-dark-200 transition-colors">
+                <div className="flex items-center space-x-3">
+                  <div className="p-2 bg-light-200 dark:bg-dark-200 rounded-lg">
+                    <Layers3
+                      size={18}
+                      className="text-black/70 dark:text-white/70"
+                    />
+                  </div>
+                  <div>
+                    <p className="text-sm text-black/90 dark:text-white/90 font-medium">
+                      Automatic Suggestions
+                    </p>
+                    <p className="text-xs text-black/60 dark:text-white/60 mt-0.5">
+                      Automatically show related suggestions after responses
+                    </p>
+                  </div>
+                </div>
+                <Switch
+                  checked={automaticSuggestions}
+                  onChange={(checked) => {
+                    setAutomaticSuggestions(checked);
+                    saveConfig('automaticSuggestions', checked);
+                  }}
+                  className={cn(
+                    automaticSuggestions
+                      ? 'bg-[#24A0ED]'
+                      : 'bg-light-200 dark:bg-dark-200',
+                    'relative inline-flex h-6 w-11 items-center rounded-full transition-colors focus:outline-none',
+                  )}
+                >
+                  <span
+                    className={cn(
+                      automaticSuggestions
+                        ? 'translate-x-6'
+                        : 'translate-x-1',
+                      'inline-block h-4 w-4 transform rounded-full bg-white transition-transform',
+                    )}
+                  />
+                </Switch>
+              </div>
+            </div>
+          </SettingsSection>
+
+          <SettingsSection title="System Instructions">
+            <div className="flex flex-col space-y-4">
+              <Textarea
+                value={systemInstructions}
+                isSaving={savingStates['systemInstructions']}
+                onChange={(e) => {
+                  setSystemInstructions(e.target.value);
+                }}
+                onSave={(value) => saveConfig('systemInstructions', value)}
+              />
             </div>
           </SettingsSection>
 
@@ -497,8 +613,9 @@ const Page = () => {
                     (provider) => ({
                       value: provider,
                       label:
+                        (PROVIDER_METADATA as any)[provider]?.displayName ||
                         provider.charAt(0).toUpperCase() +
                         provider.slice(1),
                     }),
                   )}
                 />
@@ -545,6 +662,78 @@ const Page = () => {
                     ];
                   })()}
                 />
+                {selectedChatModelProvider === 'ollama' && (
+                  <div className="flex flex-col space-y-1">
+                    <p className="text-black/70 dark:text-white/70 text-sm">
+                      Chat Context Window Size
+                    </p>
+                    <Select
+                      value={
+                        isCustomContextWindow
+                          ? 'custom'
+                          : contextWindowSize.toString()
+                      }
+                      onChange={(e) => {
+                        const value = e.target.value;
+                        if (value === 'custom') {
+                          setIsCustomContextWindow(true);
+                        } else {
+                          setIsCustomContextWindow(false);
+                          const numValue = parseInt(value);
+                          setContextWindowSize(numValue);
+                          setConfig((prev) => ({
+                            ...prev!,
+                            ollamaContextWindow: numValue,
+                          }));
+                          saveConfig('ollamaContextWindow', numValue);
+                        }
+                      }}
+                      options={[
+                        ...predefinedContextSizes.map((size) => ({
+                          value: size.toString(),
+                          label: `${size.toLocaleString()} tokens`,
+                        })),
+                        { value: 'custom', label: 'Custom...' },
+                      ]}
+                    />
+                    {isCustomContextWindow && (
+                      <div className="mt-2">
+                        <Input
+                          type="number"
+                          min={512}
+                          value={contextWindowSize}
+                          placeholder="Custom context window size (minimum 512)"
+                          isSaving={savingStates['ollamaContextWindow']}
+                          onChange={(e) => {
+                            // Allow any value to be typed
+                            const value =
+                              parseInt(e.target.value) ||
+                              contextWindowSize;
+                            setContextWindowSize(value);
+                          }}
+                          onSave={(value) => {
+                            // Validate only when saving
+                            const numValue = Math.max(
+                              512,
+                              parseInt(value) || 2048,
+                            );
+                            setContextWindowSize(numValue);
+                            setConfig((prev) => ({
+                              ...prev!,
+                              ollamaContextWindow: numValue,
+                            }));
+                            saveConfig('ollamaContextWindow', numValue);
+                          }}
+                        />
+                      </div>
+                    )}
+                    <p className="text-xs text-black/60 dark:text-white/60 mt-0.5">
+                      {isCustomContextWindow
+                        ? 'Adjust the context window size for Ollama models (minimum 512 tokens)'
+                        : 'Adjust the context window size for Ollama models'}
+                    </p>
+                  </div>
+                )}
               </div>
             )}
           </div>
@@ -639,8 +828,9 @@ const Page = () => {
                     (provider) => ({
                       value: provider,
                       label:
+                        (PROVIDER_METADATA as any)[provider]?.displayName ||
                         provider.charAt(0).toUpperCase() +
                         provider.slice(1),
                     }),
                   )}
                 />
@@ -788,6 +978,44 @@ const Page = () => {
                   onSave={(value) => saveConfig('geminiApiKey', value)}
                 />
               </div>
+
+              <div className="flex flex-col space-y-1">
+                <p className="text-black/70 dark:text-white/70 text-sm">
+                  Deepseek API Key
+                </p>
+                <Input
+                  type="text"
+                  placeholder="Deepseek API Key"
+                  value={config.deepseekApiKey}
+                  isSaving={savingStates['deepseekApiKey']}
+                  onChange={(e) => {
+                    setConfig((prev) => ({
+                      ...prev!,
+                      deepseekApiKey: e.target.value,
+                    }));
+                  }}
+                  onSave={(value) => saveConfig('deepseekApiKey', value)}
+                />
+              </div>
+
+              <div className="flex flex-col space-y-1">
+                <p className="text-black/70 dark:text-white/70 text-sm">
+                  LM Studio API URL
+                </p>
+                <Input
+                  type="text"
+                  placeholder="LM Studio API URL"
+                  value={config.lmStudioApiUrl}
+                  isSaving={savingStates['lmStudioApiUrl']}
+                  onChange={(e) => {
+                    setConfig((prev) => ({
+                      ...prev!,
+                      lmStudioApiUrl: e.target.value,
+                    }));
+                  }}
+                  onSave={(value) => saveConfig('lmStudioApiUrl', value)}
+                />
+              </div>
             </div>
           </SettingsSection>
         </div>
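The custom context window field above accepts any typed value but validates only on save, clamping to a 512-token minimum and falling back to 2048 when the input doesn't parse. That validation can be sketched as a standalone function (`clampContextWindow` is an illustrative name, not part of the diff):

```typescript
// Hypothetical standalone version of the onSave validation in the settings hunk above:
// Math.max(512, parseInt(value) || 2048).
function clampContextWindow(value: string): number {
  // parseInt yields NaN for non-numeric input; NaN (and 0) are falsy, so || 2048 applies
  return Math.max(512, parseInt(value) || 2048);
}
```

Note the two-stage behavior: non-numeric input becomes the 2048 default, while small numeric input is raised to the 512 floor.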
@@ -5,31 +5,111 @@ import MessageInput from './MessageInput';
 import { File, Message } from './ChatWindow';
 import MessageBox from './MessageBox';
 import MessageBoxLoading from './MessageBoxLoading';
+import { check } from 'drizzle-orm/gel-core';
 
 const Chat = ({
   loading,
   messages,
   sendMessage,
-  messageAppeared,
+  scrollTrigger,
   rewrite,
   fileIds,
   setFileIds,
   files,
   setFiles,
+  optimizationMode,
+  setOptimizationMode,
+  focusMode,
+  setFocusMode,
 }: {
   messages: Message[];
-  sendMessage: (message: string) => void;
+  sendMessage: (
+    message: string,
+    options?: {
+      messageId?: string;
+      rewriteIndex?: number;
+      suggestions?: string[];
+    },
+  ) => void;
   loading: boolean;
-  messageAppeared: boolean;
+  scrollTrigger: number;
   rewrite: (messageId: string) => void;
   fileIds: string[];
   setFileIds: (fileIds: string[]) => void;
   files: File[];
   setFiles: (files: File[]) => void;
+  optimizationMode: string;
+  setOptimizationMode: (mode: string) => void;
+  focusMode: string;
+  setFocusMode: (mode: string) => void;
 }) => {
   const [dividerWidth, setDividerWidth] = useState(0);
+  const [isAtBottom, setIsAtBottom] = useState(true);
+  const [manuallyScrolledUp, setManuallyScrolledUp] = useState(false);
   const dividerRef = useRef<HTMLDivElement | null>(null);
   const messageEnd = useRef<HTMLDivElement | null>(null);
+  const SCROLL_THRESHOLD = 250; // pixels from bottom to consider "at bottom"
+
+  // Check if user is at bottom of page
+  useEffect(() => {
+    const checkIsAtBottom = () => {
+      const position = window.innerHeight + window.scrollY;
+      const height = document.body.scrollHeight;
+      const atBottom = position >= height - SCROLL_THRESHOLD;
+
+      setIsAtBottom(atBottom);
+    };
+
+    // Initial check
+    checkIsAtBottom();
+
+    // Add scroll event listener
+    window.addEventListener('scroll', checkIsAtBottom);
+
+    return () => {
+      window.removeEventListener('scroll', checkIsAtBottom);
+    };
+  }, []);
+
+  // Detect wheel and touch events to identify user's scrolling direction
+  useEffect(() => {
+    const checkIsAtBottom = () => {
+      const position = window.innerHeight + window.scrollY;
+      const height = document.body.scrollHeight;
+      const atBottom = position >= height - SCROLL_THRESHOLD;
+
+      // If user scrolls to bottom, reset the manuallyScrolledUp flag
+      if (atBottom) {
+        setManuallyScrolledUp(false);
+      }
+
+      setIsAtBottom(atBottom);
+    };
+
+    const handleWheel = (e: WheelEvent) => {
+      // Positive deltaY means scrolling down, negative means scrolling up
+      if (e.deltaY < 0) {
+        // User is scrolling up
+        setManuallyScrolledUp(true);
+      } else if (e.deltaY > 0) {
+        checkIsAtBottom();
+      }
+    };
+
+    const handleTouchStart = (e: TouchEvent) => {
+      // Immediately stop auto-scrolling on any touch interaction
+      setManuallyScrolledUp(true);
+    };
+
+    // Add event listeners
+    window.addEventListener('wheel', handleWheel, { passive: true });
+    window.addEventListener('touchstart', handleTouchStart, { passive: true });
+
+    return () => {
+      window.removeEventListener('wheel', handleWheel);
+      window.removeEventListener('touchstart', handleTouchStart);
+    };
+  }, [isAtBottom]);
 
   useEffect(() => {
     const updateDividerWidth = () => {
@@ -47,6 +127,7 @@ const Chat = ({
     };
   });
 
+  // Scroll when user sends a message
   useEffect(() => {
     const scroll = () => {
       messageEnd.current?.scrollIntoView({ behavior: 'smooth' });
@@ -56,13 +137,28 @@ const Chat = ({
       document.title = `${messages[0].content.substring(0, 30)} - Perplexica`;
     }
 
-    if (messages[messages.length - 1]?.role == 'user') {
+    // Always scroll when user sends a message
+    if (messages[messages.length - 1]?.role === 'user') {
       scroll();
+      setIsAtBottom(true); // Reset to true when user sends a message
+      setManuallyScrolledUp(false); // Reset manually scrolled flag when user sends a message
     }
   }, [messages]);
 
+  // Auto-scroll for assistant responses only if user is at bottom and hasn't manually scrolled up
+  useEffect(() => {
+    const position = window.innerHeight + window.scrollY;
+    const height = document.body.scrollHeight;
+    const atBottom = position >= height - SCROLL_THRESHOLD;
+    setIsAtBottom(atBottom);
+
+    if (isAtBottom && !manuallyScrolledUp && messages.length > 0) {
+      messageEnd.current?.scrollIntoView({ behavior: 'smooth' });
+    }
+  }, [scrollTrigger, isAtBottom, messages.length, manuallyScrolledUp]);
+
   return (
-    <div className="flex flex-col space-y-6 pt-8 pb-44 lg:pb-32 sm:mx-4 md:mx-8">
+    <div className="flex flex-col space-y-6 pt-8 pb-48 sm:mx-4 md:mx-8">
       {messages.map((msg, i) => {
         const isLast = i === messages.length - 1;
 
@@ -85,20 +181,56 @@ const Chat = ({
           </Fragment>
         );
       })}
-      {loading && !messageAppeared && <MessageBoxLoading />}
+      {loading && <MessageBoxLoading />}
       <div ref={messageEnd} className="h-0" />
 
       {dividerWidth > 0 && (
         <div
           className="bottom-24 lg:bottom-10 fixed z-40"
           style={{ width: dividerWidth }}
         >
+          {/* Scroll to bottom button - appears above the MessageInput when user has scrolled up */}
+          {manuallyScrolledUp && !isAtBottom && (
+            <div className="absolute -top-14 right-2 z-10">
+              <button
+                onClick={() => {
+                  setManuallyScrolledUp(false);
+                  setIsAtBottom(true);
+                  messageEnd.current?.scrollIntoView({ behavior: 'smooth' });
+                }}
+                className="bg-[#24A0ED] text-white hover:bg-opacity-85 transition duration-100 rounded-full px-4 py-2 shadow-lg flex items-center justify-center"
+                aria-label="Scroll to bottom"
+              >
+                <svg
+                  xmlns="http://www.w3.org/2000/svg"
+                  className="h-5 w-5 mr-1"
+                  viewBox="0 0 20 20"
+                  fill="currentColor"
+                >
+                  <path
+                    fillRule="evenodd"
+                    d="M14.707 12.707a1 1 0 01-1.414 0L10 9.414l-3.293 3.293a1 1 0 01-1.414-1.414l4-4a1 1 0 011.414 0l4 4a1 1 0 010 1.414z"
+                    clipRule="evenodd"
+                    transform="rotate(180 10 10)"
+                  />
+                </svg>
+                <span className="text-sm">Scroll to bottom</span>
+              </button>
+            </div>
+          )}
+
           <MessageInput
+            firstMessage={messages.length === 0}
             loading={loading}
             sendMessage={sendMessage}
             fileIds={fileIds}
             setFileIds={setFileIds}
             files={files}
             setFiles={setFiles}
+            optimizationMode={optimizationMode}
+            setOptimizationMode={setOptimizationMode}
+            focusMode={focusMode}
+            setFocusMode={setFocusMode}
           />
         </div>
       )}
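The scroll hunks above gate auto-scrolling on a 250-pixel "at bottom" threshold computed from `window.innerHeight + window.scrollY` against `document.body.scrollHeight`. A pure version of that check can be sketched as follows (`isAtBottom` here is an illustrative standalone function, not the component state of the same name):

```typescript
// Pure sketch of the at-bottom check used by the scroll effects above.
// Parameters mirror window.innerHeight, window.scrollY, and document.body.scrollHeight.
const SCROLL_THRESHOLD = 250; // pixels from bottom to consider "at bottom"

function isAtBottom(
  innerHeight: number,
  scrollY: number,
  scrollHeight: number,
): boolean {
  return innerHeight + scrollY >= scrollHeight - SCROLL_THRESHOLD;
}
```

Because the viewport bottom only needs to come within 250px of the page bottom, auto-scroll stays engaged through small layout shifts while streaming, but disengages once the user scrolls meaningfully upward.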
@@ -13,6 +13,11 @@ import { Settings } from 'lucide-react';
 import Link from 'next/link';
 import NextError from 'next/error';
 
+export type ModelStats = {
+  modelName: string;
+  responseTime?: number;
+};
+
 export type Message = {
   messageId: string;
   chatId: string;
@@ -21,6 +26,9 @@ export type Message = {
   role: 'user' | 'assistant';
   suggestions?: string[];
   sources?: Document[];
+  modelStats?: ModelStats;
+  searchQuery?: string;
+  searchUrl?: string;
 };
 
 export interface File {
@@ -272,7 +280,7 @@ const ChatWindow = ({ id }: { id?: string }) => {
   }, []);
 
   const [loading, setLoading] = useState(false);
-  const [messageAppeared, setMessageAppeared] = useState(false);
+  const [scrollTrigger, setScrollTrigger] = useState(0);
 
   const [chatHistory, setChatHistory] = useState<[string, string][]>([]);
   const [messages, setMessages] = useState<Message[]>([]);
@@ -287,6 +295,16 @@ const ChatWindow = ({ id }: { id?: string }) => {
 
   const [notFound, setNotFound] = useState(false);
 
+  useEffect(() => {
+    const savedOptimizationMode = localStorage.getItem('optimizationMode');
+
+    if (savedOptimizationMode !== null) {
+      setOptimizationMode(savedOptimizationMode);
+    } else {
+      localStorage.setItem('optimizationMode', optimizationMode);
+    }
+  }, []);
+
   useEffect(() => {
     if (
       chatId &&
@@ -327,7 +345,28 @@ const ChatWindow = ({ id }: { id?: string }) => {
     }
   }, [isMessagesLoaded, isConfigReady]);
 
-  const sendMessage = async (message: string, messageId?: string) => {
+  const sendMessage = async (
+    message: string,
+    options?: {
+      messageId?: string;
+      rewriteIndex?: number;
+      suggestions?: string[];
+    },
+  ) => {
+    setScrollTrigger((x) => (x === 0 ? -1 : 0));
+    // Special case: If we're just updating an existing message with suggestions
+    if (options?.suggestions && options.messageId) {
+      setMessages((prev) =>
+        prev.map((msg) => {
+          if (msg.messageId === options.messageId) {
+            return { ...msg, suggestions: options.suggestions };
+          }
+          return msg;
+        }),
+      );
+      return;
+    }
+
     if (loading) return;
     if (!isConfigReady) {
       toast.error('Cannot send message before the configuration is ready');
@@ -335,13 +374,29 @@ const ChatWindow = ({ id }: { id?: string }) => {
     }
 
     setLoading(true);
-    setMessageAppeared(false);
 
    let sources: Document[] | undefined = undefined;
    let recievedMessage = '';
    let added = false;
+    let messageChatHistory = chatHistory;
 
-    messageId = messageId ?? crypto.randomBytes(7).toString('hex');
+    if (options?.rewriteIndex !== undefined) {
+      const rewriteIndex = options.rewriteIndex;
+      setMessages((prev) => {
+        return [...prev.slice(0, messages.length > 2 ? rewriteIndex - 1 : 0)];
+      });
+
+      messageChatHistory = chatHistory.slice(
+        0,
+        messages.length > 2 ? rewriteIndex - 1 : 0,
+      );
+      setChatHistory(messageChatHistory);
+
+      setScrollTrigger((prev) => prev + 1);
+    }
+
+    const messageId =
+      options?.messageId ?? crypto.randomBytes(7).toString('hex');
 
     setMessages((prevMessages) => [
       ...prevMessages,
@@ -372,12 +427,14 @@ const ChatWindow = ({ id }: { id?: string }) => {
             chatId: chatId!,
             role: 'assistant',
             sources: sources,
+            searchQuery: data.searchQuery,
+            searchUrl: data.searchUrl,
             createdAt: new Date(),
           },
         ]);
         added = true;
+        setScrollTrigger((prev) => prev + 1);
       }
-      setMessageAppeared(true);
     }
 
     if (data.type === 'message') {
@@ -391,6 +448,9 @@ const ChatWindow = ({ id }: { id?: string }) => {
             role: 'assistant',
             sources: sources,
             createdAt: new Date(),
+            modelStats: {
+              modelName: data.modelName,
+            },
           },
         ]);
         added = true;
@@ -407,7 +467,7 @@ const ChatWindow = ({ id }: { id?: string }) => {
       );
 
       recievedMessage += data.data;
-      setMessageAppeared(true);
+      setScrollTrigger((prev) => prev + 1);
     }
 
     if (data.type === 'messageEnd') {
@@ -417,12 +477,31 @@ const ChatWindow = ({ id }: { id?: string }) => {
         ['assistant', recievedMessage],
       ]);
 
+      // Always update the message, adding modelStats if available
+      setMessages((prev) =>
+        prev.map((message) => {
+          if (message.messageId === data.messageId) {
+            return {
+              ...message,
+              // Include model stats if available, otherwise null
+              modelStats: data.modelStats || null,
+              // Make sure the searchQuery is preserved (if available in the message data)
+              searchQuery: message.searchQuery || data.searchQuery,
+              searchUrl: message.searchUrl || data.searchUrl,
+            };
+          }
+          return message;
+        }),
+      );
+
       setLoading(false);
+      setScrollTrigger((prev) => prev + 1);
 
       const lastMsg = messagesRef.current[messagesRef.current.length - 1];
 
      const autoImageSearch = localStorage.getItem('autoImageSearch');
      const autoVideoSearch = localStorage.getItem('autoVideoSearch');
+      const autoSuggestions = localStorage.getItem('autoSuggestions');
 
      if (autoImageSearch === 'true') {
        document
@@ -440,7 +519,8 @@ const ChatWindow = ({ id }: { id?: string }) => {
        lastMsg.role === 'assistant' &&
        lastMsg.sources &&
        lastMsg.sources.length > 0 &&
-        !lastMsg.suggestions
+        !lastMsg.suggestions &&
+        autoSuggestions !== 'false' // Default to true if not set
      ) {
        const suggestions = await getSuggestions(messagesRef.current);
        setMessages((prev) =>
@@ -455,6 +535,18 @@ const ChatWindow = ({ id }: { id?: string }) => {
       }
     };
 
+    const ollamaContextWindow =
+      localStorage.getItem('ollamaContextWindow') || '2048';
+
+    // Get the latest model selection from localStorage
+    const currentChatModelProvider = localStorage.getItem('chatModelProvider');
+    const currentChatModel = localStorage.getItem('chatModel');
+
+    // Use the most current model selection from localStorage, falling back to the state if not available
+    const modelProvider =
+      currentChatModelProvider || chatModelProvider.provider;
+    const modelName = currentChatModel || chatModelProvider.name;
+
     const res = await fetch('/api/chat', {
       method: 'POST',
       headers: {
@@ -471,15 +563,19 @@ const ChatWindow = ({ id }: { id?: string }) => {
         files: fileIds,
         focusMode: focusMode,
         optimizationMode: optimizationMode,
-        history: chatHistory,
+        history: messageChatHistory,
         chatModel: {
-          name: chatModelProvider.name,
-          provider: chatModelProvider.provider,
+          name: modelName,
+          provider: modelProvider,
+          ...(chatModelProvider.provider === 'ollama' && {
+            ollamaContextWindow: parseInt(ollamaContextWindow),
+          }),
         },
         embeddingModel: {
           name: embeddingModelProvider.name,
           provider: embeddingModelProvider.provider,
         },
+        systemInstructions: localStorage.getItem('systemInstructions'),
       }),
     });
 
@@ -511,20 +607,14 @@ const ChatWindow = ({ id }: { id?: string }) => {
   };
 
   const rewrite = (messageId: string) => {
-    const index = messages.findIndex((msg) => msg.messageId === messageId);
-
-    if (index === -1) return;
-
-    const message = messages[index - 1];
-
-    setMessages((prev) => {
-      return [...prev.slice(0, messages.length > 2 ? index - 1 : 0)];
-    });
-    setChatHistory((prev) => {
-      return [...prev.slice(0, messages.length > 2 ? index - 1 : 0)];
-    });
-
-    sendMessage(message.content, message.messageId);
+    const messageIndex = messages.findIndex(
+      (msg) => msg.messageId === messageId,
+    );
+    if (messageIndex == -1) return;
+    sendMessage(messages[messageIndex - 1].content, {
+      messageId: messageId,
+      rewriteIndex: messageIndex,
+    });
   };
 
   useEffect(() => {
@@ -563,12 +653,16 @@ const ChatWindow = ({ id }: { id?: string }) => {
           loading={loading}
           messages={messages}
           sendMessage={sendMessage}
-          messageAppeared={messageAppeared}
+          scrollTrigger={scrollTrigger}
           rewrite={rewrite}
           fileIds={fileIds}
           setFileIds={setFileIds}
           files={files}
          setFiles={setFiles}
+          optimizationMode={optimizationMode}
+          setOptimizationMode={setOptimizationMode}
+          focusMode={focusMode}
+          setFocusMode={setFocusMode}
         />
       </>
     ) : (
@@ -1,8 +1,8 @@
 import { Settings } from 'lucide-react';
-import EmptyChatMessageInput from './EmptyChatMessageInput';
 import { useState } from 'react';
 import { File } from './ChatWindow';
 import Link from 'next/link';
+import MessageInput from './MessageInput';
 
 const EmptyChat = ({
   sendMessage,
@@ -38,7 +38,9 @@ const EmptyChat = ({
         <h2 className="text-black/70 dark:text-white/70 text-3xl font-medium -mt-8">
           Research begins here.
         </h2>
-        <EmptyChatMessageInput
+        <MessageInput
+          firstMessage={true}
+          loading={false}
           sendMessage={sendMessage}
           focusMode={focusMode}
           setFocusMode={setFocusMode}
--- a/src/components/EmptyChatMessageInput.tsx
+++ /dev/null
@@ -1,114 +0,0 @@
-import { ArrowRight } from 'lucide-react';
-import { useEffect, useRef, useState } from 'react';
-import TextareaAutosize from 'react-textarea-autosize';
-import CopilotToggle from './MessageInputActions/Copilot';
-import Focus from './MessageInputActions/Focus';
-import Optimization from './MessageInputActions/Optimization';
-import Attach from './MessageInputActions/Attach';
-import { File } from './ChatWindow';
-
-const EmptyChatMessageInput = ({
-  sendMessage,
-  focusMode,
-  setFocusMode,
-  optimizationMode,
-  setOptimizationMode,
-  fileIds,
-  setFileIds,
-  files,
-  setFiles,
-}: {
-  sendMessage: (message: string) => void;
-  focusMode: string;
-  setFocusMode: (mode: string) => void;
-  optimizationMode: string;
-  setOptimizationMode: (mode: string) => void;
-  fileIds: string[];
-  setFileIds: (fileIds: string[]) => void;
-  files: File[];
-  setFiles: (files: File[]) => void;
-}) => {
-  const [copilotEnabled, setCopilotEnabled] = useState(false);
-  const [message, setMessage] = useState('');
-
-  const inputRef = useRef<HTMLTextAreaElement | null>(null);
-
-  useEffect(() => {
-    const handleKeyDown = (e: KeyboardEvent) => {
-      const activeElement = document.activeElement;
-
-      const isInputFocused =
-        activeElement?.tagName === 'INPUT' ||
-        activeElement?.tagName === 'TEXTAREA' ||
-        activeElement?.hasAttribute('contenteditable');
-
-      if (e.key === '/' && !isInputFocused) {
-        e.preventDefault();
-        inputRef.current?.focus();
-      }
-    };
-
-    document.addEventListener('keydown', handleKeyDown);
-
-    inputRef.current?.focus();
-
-    return () => {
-      document.removeEventListener('keydown', handleKeyDown);
-    };
-  }, []);
-
-  return (
-    <form
-      onSubmit={(e) => {
-        e.preventDefault();
-        sendMessage(message);
-        setMessage('');
-      }}
-      onKeyDown={(e) => {
-        if (e.key === 'Enter' && !e.shiftKey) {
-          e.preventDefault();
-          sendMessage(message);
-          setMessage('');
-        }
-      }}
-      className="w-full"
-    >
-      <div className="flex flex-col bg-light-secondary dark:bg-dark-secondary px-5 pt-5 pb-2 rounded-lg w-full border border-light-200 dark:border-dark-200">
-        <TextareaAutosize
-          ref={inputRef}
-          value={message}
-          onChange={(e) => setMessage(e.target.value)}
-          minRows={2}
-          className="bg-transparent placeholder:text-black/50 dark:placeholder:text-white/50 text-sm text-black dark:text-white resize-none focus:outline-none w-full max-h-24 lg:max-h-36 xl:max-h-48"
-          placeholder="Ask anything..."
-        />
-        <div className="flex flex-row items-center justify-between mt-4">
-          <div className="flex flex-row items-center space-x-2 lg:space-x-4">
-            <Focus focusMode={focusMode} setFocusMode={setFocusMode} />
-            <Attach
-              fileIds={fileIds}
-              setFileIds={setFileIds}
-              files={files}
-              setFiles={setFiles}
-              showText
-            />
-          </div>
-          <div className="flex flex-row items-center space-x-1 sm:space-x-4">
-            <Optimization
-              optimizationMode={optimizationMode}
-              setOptimizationMode={setOptimizationMode}
-            />
-            <button
-              disabled={message.trim().length === 0}
-              className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 disabled:bg-[#e0e0dc] dark:disabled:bg-[#ececec21] hover:bg-opacity-85 transition duration-100 rounded-full p-2"
-            >
-              <ArrowRight className="bg-background" size={17} />
-            </button>
-          </div>
-        </div>
-      </div>
-    </form>
-  );
-};
-
-export default EmptyChatMessageInput;
--- /dev/null
+++ b/src/components/MessageActions/ModelInfo.tsx (new file, 82 additions)
@@ -0,0 +1,82 @@
+'use client';
+
+import React, { useState, useEffect, useRef } from 'react';
+import { Info } from 'lucide-react';
+import { ModelStats } from '../ChatWindow';
+import { cn } from '@/lib/utils';
+
+interface ModelInfoButtonProps {
+  modelStats: ModelStats | null;
+}
+
+const ModelInfoButton: React.FC<ModelInfoButtonProps> = ({ modelStats }) => {
+  const [showPopover, setShowPopover] = useState(false);
+  const popoverRef = useRef<HTMLDivElement>(null);
+  const buttonRef = useRef<HTMLButtonElement>(null);
+
+  // Always render, using "Unknown" as fallback if model info isn't available
+  const modelName = modelStats?.modelName || 'Unknown';
+
+  useEffect(() => {
+    const handleClickOutside = (event: MouseEvent) => {
+      if (
+        popoverRef.current &&
+        !popoverRef.current.contains(event.target as Node) &&
+        buttonRef.current &&
+        !buttonRef.current.contains(event.target as Node)
+      ) {
+        setShowPopover(false);
+      }
+    };
+
+    document.addEventListener('mousedown', handleClickOutside);
+    return () => {
+      document.removeEventListener('mousedown', handleClickOutside);
+    };
+  }, []);
+
+  return (
+    <div className="relative">
+      <button
+        ref={buttonRef}
+        className="p-1 ml-1 text-black/50 dark:text-white/50 rounded-full hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
+        onClick={() => setShowPopover(!showPopover)}
+        aria-label="Show model information"
+      >
+        <Info size={14} />
+      </button>
+      {showPopover && (
+        <div
+          ref={popoverRef}
+          className="absolute z-10 left-6 top-0 w-64 rounded-md shadow-lg bg-white dark:bg-dark-secondary border border-light-200 dark:border-dark-200"
+        >
+          <div className="py-2 px-3">
+            <h4 className="text-sm font-medium mb-2 text-black dark:text-white">
+              Model Information
+            </h4>
+            <div className="space-y-1 text-xs">
+              <div className="flex justify-between">
+                <span className="text-black/70 dark:text-white/70">Model:</span>
+                <span className="text-black dark:text-white font-medium">
+                  {modelName}
+                </span>
+              </div>
+              {modelStats?.responseTime && (
+                <div className="flex justify-between">
+                  <span className="text-black/70 dark:text-white/70">
+                    Response time:
+                  </span>
+                  <span className="text-black dark:text-white font-medium">
+                    {(modelStats.responseTime / 1000).toFixed(2)}s
+                  </span>
+                </div>
+              )}
+            </div>
+          </div>
+        </div>
+      )}
+    </div>
+  );
+};
+
+export default ModelInfoButton;
@@ -4,6 +4,7 @@
 import React, { MutableRefObject, useEffect, useState } from 'react';
 import { Message } from './ChatWindow';
 import { cn } from '@/lib/utils';
+import { getSuggestions } from '@/lib/actions';
 import {
   BookCopy,
   Disc3,
@@ -11,20 +12,92 @@ import {
   StopCircle,
   Layers3,
   Plus,
+  Sparkles,
+  Copy as CopyIcon,
+  CheckCheck,
 } from 'lucide-react';
 import Markdown, { MarkdownToJSX } from 'markdown-to-jsx';
 import Copy from './MessageActions/Copy';
 import Rewrite from './MessageActions/Rewrite';
+import ModelInfoButton from './MessageActions/ModelInfo';
 import MessageSources from './MessageSources';
 import SearchImages from './SearchImages';
 import SearchVideos from './SearchVideos';
 import { useSpeech } from 'react-text-to-speech';
 import ThinkBox from './ThinkBox';
+import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter';
+import { oneDark } from 'react-syntax-highlighter/dist/cjs/styles/prism';
 
 const ThinkTagProcessor = ({ children }: { children: React.ReactNode }) => {
   return <ThinkBox content={children as string} />;
 };
 
+const CodeBlock = ({
+  className,
+  children,
+}: {
+  className?: string;
+  children: React.ReactNode;
+}) => {
+  // Extract language from className (format could be "language-javascript" or "lang-javascript")
+  let language = '';
+  if (className) {
+    if (className.startsWith('language-')) {
+      language = className.replace('language-', '');
+    } else if (className.startsWith('lang-')) {
+      language = className.replace('lang-', '');
+    }
+  }
+
+  const content = children as string;
+
+  const [isCopied, setIsCopied] = useState(false);
+
+  const handleCopyCode = () => {
+    navigator.clipboard.writeText(content);
+    setIsCopied(true);
+    setTimeout(() => setIsCopied(false), 2000);
+  };
+
+  console.log('Code block language:', language, 'Class name:', className); // For debugging
+
+  return (
+    <div className="rounded-md overflow-hidden my-4 relative group border border-dark-secondary">
+      <div className="flex justify-between items-center px-4 py-2 bg-dark-200 border-b border-dark-secondary text-xs text-white/70 font-mono">
+        <span>{language}</span>
+        <button
+          onClick={handleCopyCode}
+          className="p-1 rounded-md hover:bg-dark-secondary transition duration-200"
+          aria-label="Copy code to clipboard"
+        >
+          {isCopied ? (
+            <CheckCheck size={14} className="text-green-500" />
+          ) : (
+            <CopyIcon size={14} className="text-white/70" />
+          )}
+        </button>
+      </div>
+      <SyntaxHighlighter
+        language={language || 'text'}
+        style={oneDark}
+        customStyle={{
+          margin: 0,
+          padding: '1rem',
+          borderRadius: 0,
+          backgroundColor: '#1c1c1c',
+        }}
+        wrapLines={true}
+        wrapLongLines={true}
+        showLineNumbers={language !== '' && content.split('\n').length > 1}
+        useInlineStyles={true}
+        PreTag="div"
+      >
+        {content}
+      </SyntaxHighlighter>
+    </div>
+  );
+};
+
 const MessageBox = ({
   message,
   messageIndex,
@@ -42,12 +115,43 @@ const MessageBox = ({
   dividerRef?: MutableRefObject<HTMLDivElement | null>;
   isLast: boolean;
   rewrite: (messageId: string) => void;
-  sendMessage: (message: string) => void;
+  sendMessage: (
+    message: string,
+    options?: {
+      messageId?: string;
+      rewriteIndex?: number;
+      suggestions?: string[];
+    },
+  ) => void;
 }) => {
   const [parsedMessage, setParsedMessage] = useState(message.content);
   const [speechMessage, setSpeechMessage] = useState(message.content);
+  const [loadingSuggestions, setLoadingSuggestions] = useState(false);
+  const [autoSuggestions, setAutoSuggestions] = useState(
+    localStorage.getItem('autoSuggestions'),
+  );
+
+  const handleLoadSuggestions = async () => {
+    if (
+      loadingSuggestions ||
+      (message?.suggestions && message.suggestions.length > 0)
+    )
+      return;
+
+    setLoadingSuggestions(true);
+    try {
+      const suggestions = await getSuggestions([...history]);
+      // We need to update the message.suggestions property through parent component
+      sendMessage('', { messageId: message.messageId, suggestions });
+    } catch (error) {
+      console.error('Error loading suggestions:', error);
+    } finally {
+      setLoadingSuggestions(false);
+    }
+  };
 
   useEffect(() => {
+    const citationRegex = /\[([^\]]+)\]/g;
     const regex = /\[(\d+)\]/g;
     let processedMessage = message.content;
 
@@ -67,13 +171,36 @@ const MessageBox = ({
     ) {
       setParsedMessage(
         processedMessage.replace(
-          regex,
-          (_, number) =>
-            `<a href="${
-              message.sources?.[number - 1]?.metadata?.url
-            }" target="_blank" className="bg-light-secondary dark:bg-dark-secondary px-1 rounded ml-1 no-underline text-xs text-black/70 dark:text-white/70 relative">${number}</a>`,
+          citationRegex,
+          (_, capturedContent: string) => {
+            const numbers = capturedContent
+              .split(',')
+              .map((numStr) => numStr.trim());
+
+            const linksHtml = numbers
+              .map((numStr) => {
+                const number = parseInt(numStr);
+
+                if (isNaN(number) || number <= 0) {
+                  return `[${numStr}]`;
+                }
+
+                const source = message.sources?.[number - 1];
+                const url = source?.metadata?.url;
+
+                if (url) {
+                  return `<a href="${url}" target="_blank" className="bg-light-secondary dark:bg-dark-secondary px-1 rounded ml-1 no-underline text-xs text-black/70 dark:text-white/70 relative">${numStr}</a>`;
+                } else {
+                  return `[${numStr}]`;
+                }
+              })
+              .join('');
+
+            return linksHtml;
+          },
         ),
       );
+      setSpeechMessage(message.content.replace(regex, ''));
       return;
     }
 
@@ -81,6 +208,18 @@ const MessageBox = ({
     setParsedMessage(processedMessage);
   }, [message.content, message.sources, message.role]);
 
+  useEffect(() => {
+    const handleStorageChange = () => {
+      setAutoSuggestions(localStorage.getItem('autoSuggestions'));
+    };
+
+    window.addEventListener('storage', handleStorageChange);
+
+    return () => {
+      window.removeEventListener('storage', handleStorageChange);
+    };
+  }, []);
+
   const { speechStatus, start, stop } = useSpeech({ text: speechMessage });
 
   const markdownOverrides: MarkdownToJSX.Options = {
@@ -88,6 +227,24 @@ const MessageBox = ({
       think: {
         component: ThinkTagProcessor,
       },
+      code: {
+        component: ({ className, children }) => {
+          // Check if it's an inline code block or a fenced code block
+          if (className) {
+            // This is a fenced code block (```code```)
+            return <CodeBlock className={className}>{children}</CodeBlock>;
+          }
+          // This is an inline code block (`code`)
+          return (
+            <code className="px-1.5 py-0.5 rounded bg-dark-secondary text-white font-mono text-sm">
+              {children}
+            </code>
+          );
+        },
+      },
+      pre: {
+        component: ({ children }) => children,
+      },
     },
   };
 
@@ -121,10 +278,32 @@ const MessageBox = ({
               Sources
             </h3>
           </div>
+          {message.searchQuery && (
+            <div className="mb-2 text-sm bg-light-secondary dark:bg-dark-secondary rounded-lg p-3">
+              <span className="font-medium text-black/70 dark:text-white/70">
+                Search query:
+              </span>{' '}
+              {message.searchUrl ? (
+                <a
+                  href={message.searchUrl}
+                  target="_blank"
+                  rel="noopener noreferrer"
+                  className="dark:text-white text-black hover:underline"
+                >
+                  {message.searchQuery}
+                </a>
+              ) : (
+                <span className="text-black dark:text-white">
+                  {message.searchQuery}
+                </span>
+              )}
+            </div>
+          )}
           <MessageSources sources={message.sources} />
         </div>
       )}
       <div className="flex flex-col space-y-2">
+        {' '}
         <div className="flex flex-row items-center space-x-2">
           <Disc3
             className={cn(
@@ -136,12 +315,16 @@ const MessageBox = ({
           <h3 className="text-black dark:text-white font-medium text-xl">
             Answer
           </h3>
+          {message.modelStats && (
+            <ModelInfoButton modelStats={message.modelStats} />
+          )}
         </div>
 
         <Markdown
           className={cn(
-            'prose prose-h1:mb-3 prose-h2:mb-2 prose-h2:mt-6 prose-h2:font-[800] prose-h3:mt-4 prose-h3:mb-1.5 prose-h3:font-[600] dark:prose-invert prose-p:leading-relaxed prose-pre:p-0 font-[400]',
-            'max-w-none break-words text-black dark:text-white',
+            'prose prose-h1:mb-3 prose-h2:mb-2 prose-h2:mt-6 prose-h2:font-[800] prose-h3:mt-4 prose-h3:mb-1.5 prose-h3:font-[600] prose-invert prose-p:leading-relaxed prose-pre:p-0 font-[400]',
+            'prose-code:bg-transparent prose-code:p-0 prose-code:text-inherit prose-code:font-normal prose-code:before:content-none prose-code:after:content-none',
+            'prose-pre:bg-transparent prose-pre:border-0 prose-pre:m-0 prose-pre:p-0',
+            'max-w-none break-words text-white',
           )}
           options={markdownOverrides}
         >
@@ -176,18 +359,37 @@ const MessageBox = ({
|
|||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
{isLast &&
|
{isLast && message.role === 'assistant' && !loading && (
|
||||||
message.suggestions &&
|
<>
|
||||||
message.suggestions.length > 0 &&
|
<div className="h-px w-full bg-light-secondary dark:bg-dark-secondary" />
|
||||||
message.role === 'assistant' &&
|
<div className="flex flex-col space-y-3 text-black dark:text-white">
|
||||||
!loading && (
|
<div className="flex flex-row items-center space-x-2 mt-4">
|
||||||
<>
|
<Layers3 />
|
||||||
<div className="h-px w-full bg-light-secondary dark:bg-dark-secondary" />
|
<h3 className="text-xl font-medium">Related</h3>{' '}
|
||||||
<div className="flex flex-col space-y-3 text-black dark:text-white">
|
{(!autoSuggestions || autoSuggestions === 'false') &&
|
||||||
<div className="flex flex-row items-center space-x-2 mt-4">
|
(!message.suggestions ||
|
||||||
<Layers3 />
|
message.suggestions.length === 0) ? (
|
||||||
<h3 className="text-xl font-medium">Related</h3>
|
<div className="bg-light-secondary dark:bg-dark-secondary">
|
||||||
</div>
|
<button
|
||||||
|
onClick={handleLoadSuggestions}
|
||||||
|
disabled={loadingSuggestions}
|
||||||
|
className="px-4 py-2 flex flex-row items-center justify-center space-x-2 rounded-lg bg-light-secondary dark:bg-dark-secondary hover:bg-light-200 dark:hover:bg-dark-200 transition duration-200 text-black/70 dark:text-white/70 hover:text-black dark:hover:text-white"
|
||||||
|
>
|
||||||
|
{loadingSuggestions ? (
|
||||||
|
<div className="w-4 h-4 border-2 border-t-transparent border-gray-400 dark:border-gray-500 rounded-full animate-spin" />
|
||||||
|
) : (
|
||||||
|
<Sparkles size={16} />
|
||||||
|
)}
|
||||||
|
<span>
|
||||||
|
{loadingSuggestions
|
||||||
|
? 'Loading suggestions...'
|
||||||
|
: 'Load suggestions'}
|
||||||
|
</span>
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
) : null}
|
||||||
|
</div>
|
||||||
|
{message.suggestions && message.suggestions.length > 0 ? (
|
||||||
<div className="flex flex-col space-y-3">
|
<div className="flex flex-col space-y-3">
|
||||||
{message.suggestions.map((suggestion, i) => (
|
{message.suggestions.map((suggestion, i) => (
|
||||||
<div
|
<div
|
||||||
@@ -212,9 +414,10 @@ const MessageBox = ({
|
|||||||
</div>
|
</div>
|
||||||
))}
|
))}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
) : null}
|
||||||
</>
|
</div>
|
||||||
)}
|
</>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
<div className="lg:sticky lg:top-20 flex flex-col items-center space-y-3 w-full lg:w-3/12 z-30 h-full pb-4">
|
<div className="lg:sticky lg:top-20 flex flex-col items-center space-y-3 w-full lg:w-3/12 z-30 h-full pb-4">
|
||||||
|
@@ -1,11 +1,11 @@
-import { cn } from '@/lib/utils';
-import { ArrowUp } from 'lucide-react';
+import { ArrowRight, ArrowUp } from 'lucide-react';
 import { useEffect, useRef, useState } from 'react';
 import TextareaAutosize from 'react-textarea-autosize';
-import Attach from './MessageInputActions/Attach';
-import CopilotToggle from './MessageInputActions/Copilot';
 import { File } from './ChatWindow';
-import AttachSmall from './MessageInputActions/AttachSmall';
+import Attach from './MessageInputActions/Attach';
+import Focus from './MessageInputActions/Focus';
+import ModelSelector from './MessageInputActions/ModelSelector';
+import Optimization from './MessageInputActions/Optimization';

 const MessageInput = ({
   sendMessage,
@@ -14,6 +14,11 @@ const MessageInput = ({
   setFileIds,
   files,
   setFiles,
+  optimizationMode,
+  setOptimizationMode,
+  focusMode,
+  setFocusMode,
+  firstMessage,
 }: {
   sendMessage: (message: string) => void;
   loading: boolean;
@@ -21,118 +26,123 @@ const MessageInput = ({
   setFileIds: (fileIds: string[]) => void;
   files: File[];
   setFiles: (files: File[]) => void;
+  optimizationMode: string;
+  setOptimizationMode: (mode: string) => void;
+  focusMode: string;
+  setFocusMode: (mode: string) => void;
+  firstMessage: boolean;
 }) => {
-  const [copilotEnabled, setCopilotEnabled] = useState(false);
   const [message, setMessage] = useState('');
-  const [textareaRows, setTextareaRows] = useState(1);
-  const [mode, setMode] = useState<'multi' | 'single'>('single');
+  const [selectedModel, setSelectedModel] = useState<{
+    provider: string;
+    model: string;
+  } | null>(null);

   useEffect(() => {
-    if (textareaRows >= 2 && message && mode === 'single') {
-      setMode('multi');
-    } else if (!message && mode === 'multi') {
-      setMode('single');
+    // Load saved model preferences from localStorage
+    const chatModelProvider = localStorage.getItem('chatModelProvider');
+    const chatModel = localStorage.getItem('chatModel');
+
+    if (chatModelProvider && chatModel) {
+      setSelectedModel({
+        provider: chatModelProvider,
+        model: chatModel,
+      });
     }
-  }, [textareaRows, mode, message]);
+  }, []);

   const inputRef = useRef<HTMLTextAreaElement | null>(null);

   useEffect(() => {
     const handleKeyDown = (e: KeyboardEvent) => {
       const activeElement = document.activeElement;

       const isInputFocused =
         activeElement?.tagName === 'INPUT' ||
         activeElement?.tagName === 'TEXTAREA' ||
         activeElement?.hasAttribute('contenteditable');

       if (e.key === '/' && !isInputFocused) {
         e.preventDefault();
         inputRef.current?.focus();
       }
     };

     document.addEventListener('keydown', handleKeyDown);

     return () => {
       document.removeEventListener('keydown', handleKeyDown);
     };
   }, []);

+  // Function to handle message submission
+  const handleSubmitMessage = () => {
+    // Only submit if we have a non-empty message and not currently loading
+    if (loading || message.trim().length === 0) return;
+
+    // Make sure the selected model is used when sending a message
+    if (selectedModel) {
+      localStorage.setItem('chatModelProvider', selectedModel.provider);
+      localStorage.setItem('chatModel', selectedModel.model);
+    }
+
+    sendMessage(message);
+    setMessage('');
+  };
+
   return (
     <form
       onSubmit={(e) => {
-        if (loading) return;
         e.preventDefault();
-        sendMessage(message);
-        setMessage('');
+        handleSubmitMessage();
       }}
       onKeyDown={(e) => {
-        if (e.key === 'Enter' && !e.shiftKey && !loading) {
+        if (e.key === 'Enter' && !e.shiftKey) {
           e.preventDefault();
-          sendMessage(message);
-          setMessage('');
+          handleSubmitMessage();
         }
       }}
-      className={cn(
-        'bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-hidden border border-light-200 dark:border-dark-200',
-        mode === 'multi' ? 'flex-col rounded-lg' : 'flex-row rounded-full',
-      )}
+      className="w-full"
     >
-      {mode === 'single' && (
-        <AttachSmall
-          fileIds={fileIds}
-          setFileIds={setFileIds}
-          files={files}
-          setFiles={setFiles}
+      <div className="flex flex-col bg-light-secondary dark:bg-dark-secondary px-5 pt-5 pb-2 rounded-lg w-full border border-light-200 dark:border-dark-200">
+        <TextareaAutosize
+          ref={inputRef}
+          value={message}
+          onChange={(e) => setMessage(e.target.value)}
+          minRows={2}
+          className="bg-transparent placeholder:text-black/50 dark:placeholder:text-white/50 text-sm text-black dark:text-white resize-none focus:outline-none w-full max-h-24 lg:max-h-36 xl:max-h-48"
+          placeholder={firstMessage ? 'Ask anything...' : 'Ask a follow-up'}
         />
-      )}
-      <TextareaAutosize
-        ref={inputRef}
-        value={message}
-        onChange={(e) => setMessage(e.target.value)}
-        onHeightChange={(height, props) => {
-          setTextareaRows(Math.ceil(height / props.rowHeight));
-        }}
-        className="transition bg-transparent dark:placeholder:text-white/50 placeholder:text-sm text-sm dark:text-white resize-none focus:outline-none w-full px-2 max-h-24 lg:max-h-36 xl:max-h-48 flex-grow flex-shrink"
-        placeholder="Ask a follow-up"
-      />
-      {mode === 'single' && (
-        <div className="flex flex-row items-center space-x-4">
-          <CopilotToggle
-            copilotEnabled={copilotEnabled}
-            setCopilotEnabled={setCopilotEnabled}
-          />
-          <button
-            disabled={message.trim().length === 0 || loading}
-            className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
-          >
-            <ArrowUp className="bg-background" size={17} />
-          </button>
-        </div>
-      )}
-      {mode === 'multi' && (
-        <div className="flex flex-row items-center justify-between w-full pt-2">
-          <AttachSmall
-            fileIds={fileIds}
-            setFileIds={setFileIds}
-            files={files}
-            setFiles={setFiles}
-          />
-          <div className="flex flex-row items-center space-x-4">
-            <CopilotToggle
-              copilotEnabled={copilotEnabled}
-              setCopilotEnabled={setCopilotEnabled}
+        <div className="flex flex-row items-center justify-between mt-4">
+          <div className="flex flex-row items-center space-x-2 lg:space-x-4">
+            <Focus focusMode={focusMode} setFocusMode={setFocusMode} />
+            <Attach
+              fileIds={fileIds}
+              setFileIds={setFileIds}
+              files={files}
+              setFiles={setFiles}
+              showText={firstMessage}
+            />
+            <ModelSelector
+              selectedModel={selectedModel}
+              setSelectedModel={setSelectedModel}
+            />
+          </div>
+          <div className="flex flex-row items-center space-x-1 sm:space-x-4">
+            <Optimization
+              optimizationMode={optimizationMode}
+              setOptimizationMode={setOptimizationMode}
             />
             <button
-              disabled={message.trim().length === 0 || loading}
-              className="bg-[#24A0ED] text-white text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
+              disabled={message.trim().length === 0}
+              className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 disabled:bg-[#e0e0dc] dark:disabled:bg-[#ececec21] hover:bg-opacity-85 transition duration-100 rounded-full p-2"
+              type="submit"
             >
-              <ArrowUp className="bg-background" size={17} />
+              {firstMessage ? (
+                <ArrowRight className="bg-background" size={17} />
+              ) : (
+                <ArrowUp className="bg-background" size={17} />
+              )}
             </button>
           </div>
         </div>
-      )}
+      </div>
     </form>
   );
 };
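The MessageInput hunk above persists the selected chat model to localStorage when a message is submitted and restores it on mount. The round trip can be sketched without React; `MemoryStorage` below is a hypothetical stand-in for `window.localStorage` so the sketch runs outside a browser, while the key names `chatModelProvider` and `chatModel` come from the diff:

```typescript
// Stand-in for window.localStorage so the sketch runs anywhere.
class MemoryStorage {
  private store = new Map<string, string>();
  getItem(key: string): string | null {
    return this.store.get(key) ?? null;
  }
  setItem(key: string, value: string): void {
    this.store.set(key, value);
  }
}

interface SelectedModel {
  provider: string;
  model: string;
}

// Mirrors handleSubmitMessage: write the selection before sending.
function saveSelectedModel(storage: MemoryStorage, m: SelectedModel): void {
  storage.setItem('chatModelProvider', m.provider);
  storage.setItem('chatModel', m.model);
}

// Mirrors the mount-time useEffect: restore only when both keys exist.
function loadSelectedModel(storage: MemoryStorage): SelectedModel | null {
  const provider = storage.getItem('chatModelProvider');
  const model = storage.getItem('chatModel');
  return provider && model ? { provider, model } : null;
}

const storage = new MemoryStorage();
console.log(loadSelectedModel(storage)); // null before anything is saved
saveSelectedModel(storage, { provider: 'ollama', model: 'llama3' });
console.log(loadSelectedModel(storage));
```

Requiring both keys before restoring means a partially written preference (only one of the two keys present) falls back to `null` rather than an inconsistent selection.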
@@ -5,7 +5,7 @@ import {
   PopoverPanel,
   Transition,
 } from '@headlessui/react';
-import { CopyPlus, File, LoaderCircle, Plus, Trash } from 'lucide-react';
+import { File, LoaderCircle, Paperclip, Plus, Trash } from 'lucide-react';
 import { Fragment, useRef, useState } from 'react';
 import { File as FileType } from '../ChatWindow';

@@ -176,8 +176,10 @@ const Attach = ({
         multiple
         hidden
       />
-      <CopyPlus size={showText ? 18 : undefined} />
-      {showText && <p className="text-xs font-medium pl-[1px]">Attach</p>}
+      <Paperclip size="18" />
+      {showText && (
+        <p className="text-xs font-medium pl-[1px] hidden lg:block">Attach</p>
+      )}
     </button>
   );
 };
@@ -2,6 +2,7 @@ import {
   BadgePercent,
   ChevronDown,
   Globe,
+  MessageCircle,
   Pencil,
   ScanEye,
   SwatchBook,
@@ -30,11 +31,23 @@ const focusModes = [
     icon: <SwatchBook size={20} />,
   },
   {
-    key: 'writingAssistant',
-    title: 'Writing',
-    description: 'Chat without searching the web',
+    key: 'chat',
+    title: 'Chat',
+    description: 'Have a creative conversation',
+    icon: <MessageCircle size={16} />,
+  },
+  {
+    key: 'localResearch',
+    title: 'Local Research',
+    description: 'Research and interact with local files with citations',
     icon: <Pencil size={16} />,
   },
+  {
+    key: 'redditSearch',
+    title: 'Reddit',
+    description: 'Search for discussions and opinions',
+    icon: <SiReddit className="h-5 w-auto mr-0.5" />,
+  },
   {
     key: 'wolframAlphaSearch',
     title: 'Wolfram Alpha',
@@ -47,12 +60,6 @@ const focusModes = [
     description: 'Search and watch videos',
     icon: <SiYoutube className="h-5 w-auto mr-0.5" />,
   },
-  {
-    key: 'redditSearch',
-    title: 'Reddit',
-    description: 'Search for discussions and opinions',
-    icon: <SiReddit className="h-5 w-auto mr-0.5" />,
-  },
 ];

 const Focus = ({
@@ -86,13 +93,13 @@ const Focus = ({
       <Transition
         as={Fragment}
         enter="transition ease-out duration-150"
-        enterFrom="opacity-0 translate-y-1"
+        enterFrom="opacity-0 -translate-y-1"
         enterTo="opacity-100 translate-y-0"
         leave="transition ease-in duration-150"
         leaveFrom="opacity-100 translate-y-0"
-        leaveTo="opacity-0 translate-y-1"
+        leaveTo="opacity-0 -translate-y-1"
       >
-        <PopoverPanel className="absolute z-10 w-64 md:w-[500px] left-0">
+        <PopoverPanel className="absolute z-10 w-64 md:w-[500px] left-0 bottom-full mb-2">
          <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-4 max-h-[200px] md:max-h-none overflow-y-auto">
           {focusModes.map((mode, i) => (
             <PopoverButton
src/components/MessageInputActions/ModelSelector.tsx (new file, 305 lines)
@@ -0,0 +1,305 @@
+import { useEffect, useState } from 'react';
+import { Cpu, ChevronDown, ChevronRight } from 'lucide-react';
+import { cn } from '@/lib/utils';
+import {
+  Popover,
+  PopoverButton,
+  PopoverPanel,
+  Transition,
+} from '@headlessui/react';
+import { Fragment } from 'react';
+
+interface ModelOption {
+  provider: string;
+  model: string;
+  displayName: string;
+}
+
+interface ProviderModelMap {
+  [provider: string]: {
+    displayName: string;
+    models: ModelOption[];
+  };
+}
+
+const ModelSelector = ({
+  selectedModel,
+  setSelectedModel,
+}: {
+  selectedModel: { provider: string; model: string } | null;
+  setSelectedModel: (model: { provider: string; model: string }) => void;
+}) => {
+  const [providerModels, setProviderModels] = useState<ProviderModelMap>({});
+  const [providersList, setProvidersList] = useState<string[]>([]);
+  const [loading, setLoading] = useState(true);
+  const [selectedModelDisplay, setSelectedModelDisplay] = useState<string>('');
+  const [selectedProviderDisplay, setSelectedProviderDisplay] =
+    useState<string>('');
+  const [expandedProviders, setExpandedProviders] = useState<
+    Record<string, boolean>
+  >({});
+
+  useEffect(() => {
+    const fetchModels = async () => {
+      try {
+        const response = await fetch('/api/models', {
+          headers: {
+            'Content-Type': 'application/json',
+          },
+        });
+
+        if (!response.ok) {
+          throw new Error(`Failed to fetch models: ${response.status}`);
+        }
+
+        const data = await response.json();
+        const providersData: ProviderModelMap = {};
+
+        // Organize models by provider
+        Object.entries(data.chatModelProviders).forEach(
+          ([provider, models]: [string, any]) => {
+            const providerDisplayName =
+              provider.charAt(0).toUpperCase() + provider.slice(1);
+            providersData[provider] = {
+              displayName: providerDisplayName,
+              models: [],
+            };
+
+            Object.entries(models).forEach(
+              ([modelKey, modelData]: [string, any]) => {
+                providersData[provider].models.push({
+                  provider,
+                  model: modelKey,
+                  displayName: modelData.displayName || modelKey,
+                });
+              },
+            );
+          },
+        );
+
+        // Filter out providers with no models
+        Object.keys(providersData).forEach((provider) => {
+          if (providersData[provider].models.length === 0) {
+            delete providersData[provider];
+          }
+        });
+
+        // Sort providers by name (only those that have models)
+        const sortedProviders = Object.keys(providersData).sort();
+        setProvidersList(sortedProviders);
+
+        // Initialize expanded state for all providers
+        const initialExpandedState: Record<string, boolean> = {};
+        sortedProviders.forEach((provider) => {
+          initialExpandedState[provider] = selectedModel?.provider === provider;
+        });
+
+        // Expand the first provider if none is selected
+        if (sortedProviders.length > 0 && !selectedModel) {
+          initialExpandedState[sortedProviders[0]] = true;
+        }
+
+        setExpandedProviders(initialExpandedState);
+        setProviderModels(providersData);
+
+        // Find the current model in our options to display its name
+        if (selectedModel) {
+          const provider = providersData[selectedModel.provider];
+          if (provider) {
+            const currentModel = provider.models.find(
+              (option) => option.model === selectedModel.model,
+            );
+
+            if (currentModel) {
+              setSelectedModelDisplay(currentModel.displayName);
+              setSelectedProviderDisplay(provider.displayName);
+            }
+          }
+        }
+
+        setLoading(false);
+      } catch (error) {
+        console.error('Error fetching models:', error);
+        setLoading(false);
+      }
+    };
+
+    fetchModels();
+  }, [selectedModel, setSelectedModel]);
+
+  const toggleProviderExpanded = (provider: string) => {
+    setExpandedProviders((prev) => ({
+      ...prev,
+      [provider]: !prev[provider],
+    }));
+  };
+
+  const handleSelectModel = (option: ModelOption) => {
+    setSelectedModel({
+      provider: option.provider,
+      model: option.model,
+    });
+
+    setSelectedModelDisplay(option.displayName);
+    setSelectedProviderDisplay(
+      providerModels[option.provider]?.displayName || option.provider,
+    );
+
+    // Save to localStorage for persistence
+    localStorage.setItem('chatModelProvider', option.provider);
+    localStorage.setItem('chatModel', option.model);
+  };
+
+  const getDisplayText = () => {
+    if (loading) return 'Loading...';
+    if (!selectedModelDisplay) return 'Select model';
+
+    return `${selectedModelDisplay} (${selectedProviderDisplay})`;
+  };
+
+  return (
+    <Popover className="relative">
+      {({ open }) => (
+        <>
+          <div className="relative">
+            <PopoverButton className="group flex items-center justify-center text-black/50 dark:text-white/50 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary active:scale-95 transition duration-200 hover:text-black dark:hover:text-white">
+              <Cpu size={18} />
+              <span className="mx-2 text-xs font-medium overflow-hidden text-ellipsis whitespace-nowrap max-w-44 hidden lg:block">
+                {getDisplayText()}
+              </span>
+              <ChevronDown
+                size={16}
+                className={cn(
+                  'transition-transform',
+                  open ? 'rotate-180' : 'rotate-0',
+                )}
+              />
+            </PopoverButton>
+          </div>
+
+          <Transition
+            as={Fragment}
+            enter="transition ease-out duration-200"
+            enterFrom="opacity-0 translate-y-1"
+            enterTo="opacity-100 translate-y-0"
+            leave="transition ease-in duration-150"
+            leaveFrom="opacity-100 translate-y-0"
+            leaveTo="opacity-0 translate-y-1"
+          >
+            <PopoverPanel className="absolute z-10 w-72 transform bottom-full mb-2">
+              <div className="overflow-hidden rounded-lg shadow-lg ring-1 ring-black/5 dark:ring-white/5 bg-white dark:bg-dark-secondary divide-y divide-light-200 dark:divide-dark-200">
+                <div className="px-4 py-3">
+                  <h3 className="text-sm font-medium text-black/90 dark:text-white/90">
+                    Select Model
+                  </h3>
+                  <p className="text-xs text-black/60 dark:text-white/60 mt-1">
+                    Choose a provider and model for your conversation
+                  </p>
+                </div>
+                <div className="max-h-72 overflow-y-auto">
+                  {loading ? (
+                    <div className="px-4 py-3 text-sm text-black/70 dark:text-white/70">
+                      Loading available models...
+                    </div>
+                  ) : providersList.length === 0 ? (
+                    <div className="px-4 py-3 text-sm text-black/70 dark:text-white/70">
+                      No models available
+                    </div>
+                  ) : (
+                    <div className="py-1">
+                      {providersList.map((providerKey) => {
+                        const provider = providerModels[providerKey];
+                        const isExpanded = expandedProviders[providerKey];
+
+                        return (
+                          <div
+                            key={providerKey}
+                            className="border-t border-light-200 dark:border-dark-200 first:border-t-0"
+                          >
+                            {/* Provider header */}
+                            <button
+                              className={cn(
+                                'w-full flex items-center justify-between px-4 py-2 text-sm text-left',
+                                'hover:bg-light-100 dark:hover:bg-dark-100',
+                                selectedModel?.provider === providerKey
+                                  ? 'bg-light-50 dark:bg-dark-50'
+                                  : '',
+                              )}
+                              onClick={() =>
+                                toggleProviderExpanded(providerKey)
+                              }
+                            >
+                              <div className="font-medium flex items-center">
+                                <Cpu
+                                  size={14}
+                                  className="mr-2 text-black/70 dark:text-white/70"
+                                />
+                                {provider.displayName}
+                                {selectedModel?.provider === providerKey && (
+                                  <span className="ml-2 text-xs text-[#24A0ED]">
+                                    (active)
+                                  </span>
+                                )}
+                              </div>
+                              <ChevronRight
+                                size={14}
+                                className={cn(
+                                  'transition-transform',
+                                  isExpanded ? 'rotate-90' : '',
+                                )}
+                              />
+                            </button>
+
+                            {/* Models list */}
+                            {isExpanded && (
+                              <div className="pl-6">
+                                {provider.models.map((modelOption) => (
+                                  <PopoverButton
+                                    key={`${modelOption.provider}-${modelOption.model}`}
+                                    className={cn(
+                                      'w-full text-left px-4 py-2 text-sm flex items-center',
+                                      selectedModel?.provider ===
+                                        modelOption.provider &&
+                                        selectedModel?.model ===
+                                          modelOption.model
+                                        ? 'bg-light-100 dark:bg-dark-100 text-black dark:text-white'
+                                        : 'text-black/70 dark:text-white/70 hover:bg-light-100 dark:hover:bg-dark-100',
+                                    )}
+                                    onClick={() =>
+                                      handleSelectModel(modelOption)
+                                    }
+                                  >
+                                    <div className="flex flex-col flex-1">
+                                      <span className="font-medium">
+                                        {modelOption.displayName}
+                                      </span>
+                                    </div>
+                                    {/* Active indicator */}
+                                    {selectedModel?.provider ===
+                                      modelOption.provider &&
+                                      selectedModel?.model ===
+                                        modelOption.model && (
+                                        <div className="ml-auto bg-[#24A0ED] text-white text-xs px-1.5 py-0.5 rounded">
+                                          Active
+                                        </div>
+                                      )}
+                                  </PopoverButton>
+                                ))}
+                              </div>
+                            )}
+                          </div>
+                        );
+                      })}
+                    </div>
+                  )}
+                </div>
+              </div>
+            </PopoverPanel>
+          </Transition>
+        </>
+      )}
+    </Popover>
+  );
+};
+
+export default ModelSelector;
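ModelSelector's fetch callback reshapes the `/api/models` payload into a provider-keyed map, drops providers with no models, and sorts the provider keys. That transformation can be isolated as a pure function and exercised without React or a network call; the payload shape follows the fields the component reads (`chatModelProviders`, per-model `displayName`), but the concrete provider and model names below are made up for illustration:

```typescript
interface ModelOption {
  provider: string;
  model: string;
  displayName: string;
}

interface ProviderModelMap {
  [provider: string]: { displayName: string; models: ModelOption[] };
}

// Same reshaping fetchModels performs, extracted as a pure function:
// group by provider, capitalize the provider name for display, drop
// providers with no models, and return the sorted provider keys.
function organizeModels(chatModelProviders: {
  [provider: string]: { [model: string]: { displayName?: string } };
}): { providers: string[]; map: ProviderModelMap } {
  const map: ProviderModelMap = {};

  for (const [provider, models] of Object.entries(chatModelProviders)) {
    const entry: ProviderModelMap[string] = {
      displayName: provider.charAt(0).toUpperCase() + provider.slice(1),
      models: [],
    };
    for (const [model, data] of Object.entries(models)) {
      entry.models.push({
        provider,
        model,
        displayName: data.displayName || model, // fall back to the key
      });
    }
    if (entry.models.length > 0) map[provider] = entry; // filter empties
  }

  return { providers: Object.keys(map).sort(), map };
}

// Hypothetical payload shaped like the component expects.
const { providers, map } = organizeModels({
  ollama: { 'llama3:8b': { displayName: 'Llama 3 8B' } },
  groq: {}, // no models — filtered out
  openai: { 'gpt-4o-mini': {} }, // missing displayName falls back to the key
});
console.log(providers); // ['ollama', 'openai']
console.log(map.openai.models[0].displayName); // 'gpt-4o-mini'
```

Keeping the reshaping separate from the component would also make the `useEffect` body shorter, though the diff keeps it inline.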
@@ -1,4 +1,4 @@
-import { ChevronDown, Sliders, Star, Zap } from 'lucide-react';
+import { ChevronDown, Minimize2, Sliders, Star, Zap } from 'lucide-react';
 import { cn } from '@/lib/utils';
 import {
   Popover,
@@ -7,7 +7,6 @@ import {
   Transition,
 } from '@headlessui/react';
 import { Fragment } from 'react';
-
 const OptimizationModes = [
   {
     key: 'speed',
@@ -41,8 +40,13 @@ const Optimization = ({
   optimizationMode: string;
   setOptimizationMode: (mode: string) => void;
 }) => {
+  const handleOptimizationChange = (mode: string) => {
+    setOptimizationMode(mode);
+    localStorage.setItem('optimizationMode', mode);
+  };
+
   return (
-    <Popover className="relative w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
+    <Popover className="relative">
       <PopoverButton
         type="button"
         className="p-2 text-black/50 dark:text-white/50 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
@@ -52,12 +56,12 @@ const Optimization = ({
           OptimizationModes.find((mode) => mode.key === optimizationMode)
             ?.icon
         }
-        <p className="text-xs font-medium">
+        {/* <p className="text-xs font-medium hidden lg:block">
           {
             OptimizationModes.find((mode) => mode.key === optimizationMode)
               ?.title
           }
-        </p>
+        </p> */}
         <ChevronDown size={20} />
       </div>
     </PopoverButton>
@@ -70,11 +74,11 @@ const Optimization = ({
         leaveFrom="opacity-100 translate-y-0"
         leaveTo="opacity-0 translate-y-1"
       >
-        <PopoverPanel className="absolute z-10 w-64 md:w-[250px] right-0">
-          <div className="flex flex-col gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-4 max-h-[200px] md:max-h-none overflow-y-auto">
+        <PopoverPanel className="absolute z-10 bottom-[100%] mb-2 left-1/2 transform -translate-x-1/2">
+          <div className="flex flex-col gap-2 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-max max-w-[300px] p-4 max-h-[200px] md:max-h-none overflow-y-auto">
           {OptimizationModes.map((mode, i) => (
             <PopoverButton
-              onClick={() => setOptimizationMode(mode.key)}
+              onClick={() => handleOptimizationChange(mode.key)}
               key={i}
               disabled={mode.key === 'quality'}
               className={cn(
@@ -35,9 +35,10 @@ const SearchImages = ({

     const chatModelProvider = localStorage.getItem('chatModelProvider');
     const chatModel = localStorage.getItem('chatModel');

     const customOpenAIBaseURL = localStorage.getItem('openAIBaseURL');
     const customOpenAIKey = localStorage.getItem('openAIApiKey');
+    const ollamaContextWindow =
+      localStorage.getItem('ollamaContextWindow') || '2048';

     const res = await fetch(`/api/images`, {
       method: 'POST',
@@ -54,6 +55,9 @@ const SearchImages = ({
           customOpenAIBaseURL: customOpenAIBaseURL,
           customOpenAIKey: customOpenAIKey,
         }),
+        ...(chatModelProvider === 'ollama' && {
+          ollamaContextWindow: parseInt(ollamaContextWindow),
+        }),
       },
     }),
   });
@@ -50,9 +50,10 @@ const Searchvideos = ({

     const chatModelProvider = localStorage.getItem('chatModelProvider');
     const chatModel = localStorage.getItem('chatModel');

     const customOpenAIBaseURL = localStorage.getItem('openAIBaseURL');
     const customOpenAIKey = localStorage.getItem('openAIApiKey');
+    const ollamaContextWindow =
+      localStorage.getItem('ollamaContextWindow') || '2048';

     const res = await fetch(`/api/videos`, {
       method: 'POST',
@@ -69,6 +70,9 @@ const Searchvideos = ({
           customOpenAIBaseURL: customOpenAIBaseURL,
           customOpenAIKey: customOpenAIKey,
         }),
+        ...(chatModelProvider === 'ollama' && {
+          ollamaContextWindow: parseInt(ollamaContextWindow),
+        }),
       },
     }),
   });
@@ -6,6 +6,8 @@ export const getSuggestions = async (chatHisory: Message[]) => {

   const customOpenAIKey = localStorage.getItem('openAIApiKey');
   const customOpenAIBaseURL = localStorage.getItem('openAIBaseURL');
+  const ollamaContextWindow =
+    localStorage.getItem('ollamaContextWindow') || '2048';

   const res = await fetch(`/api/suggestions`, {
     method: 'POST',
@@ -21,6 +23,9 @@ export const getSuggestions = async (chatHisory: Message[]) => {
         customOpenAIKey,
         customOpenAIBaseURL,
       }),
+      ...(chatModelProvider === 'ollama' && {
+        ollamaContextWindow: parseInt(ollamaContextWindow),
+      }),
     },
   }),
 });
@@ -10,6 +10,7 @@ const suggestionGeneratorPrompt = `
 You are an AI suggestion generator for an AI powered search engine. You will be given a conversation below. You need to generate 4-5 suggestions based on the conversation. The suggestion should be relevant to the conversation that can be used by the user to ask the chat model for more information.
 You need to make sure the suggestions are relevant to the conversation and are helpful to the user. Keep a note that the user might use these suggestions to ask a chat model for more information.
 Make sure the suggestions are medium in length and are informative and relevant to the conversation.
+If you are a thinking or reasoning AI, you should avoid using \`<suggestions>\` and \`</suggestions>\` tags in your thinking. Those tags should only be used in the final output.

 Provide these suggestions separated by newlines between the XML tags <suggestions> and </suggestions>. For example:

@@ -1,13 +1,21 @@
-import fs from 'fs';
-import path from 'path';
 import toml from '@iarna/toml';

+// Use dynamic imports for Node.js modules to prevent client-side errors
+let fs: any;
+let path: any;
+if (typeof window === 'undefined') {
+  // We're on the server
+  fs = require('fs');
+  path = require('path');
+}
+
 const configFileName = 'config.toml';

 interface Config {
   GENERAL: {
     SIMILARITY_MEASURE: string;
     KEEP_ALIVE: string;
+    BASE_URL?: string;
   };
   MODELS: {
     OPENAI: {
@@ -25,6 +33,12 @@ interface Config {
   OLLAMA: {
     API_URL: string;
   };
+  DEEPSEEK: {
+    API_KEY: string;
+  };
+  LM_STUDIO: {
+    API_URL: string;
+  };
   CUSTOM_OPENAI: {
     API_URL: string;
     API_KEY: string;
@@ -40,16 +54,25 @@ type RecursivePartial<T> = {
   [P in keyof T]?: RecursivePartial<T[P]>;
 };

-const loadConfig = () =>
-  toml.parse(
-    fs.readFileSync(path.join(process.cwd(), `${configFileName}`), 'utf-8'),
-  ) as any as Config;
+const loadConfig = () => {
+  // Server-side only
+  if (typeof window === 'undefined') {
+    return toml.parse(
+      fs.readFileSync(path.join(process.cwd(), `${configFileName}`), 'utf-8'),
+    ) as any as Config;
+  }
+
+  // Client-side fallback - settings will be loaded via API
+  return {} as Config;
+};

 export const getSimilarityMeasure = () =>
   loadConfig().GENERAL.SIMILARITY_MEASURE;

 export const getKeepAlive = () => loadConfig().GENERAL.KEEP_ALIVE;

+export const getBaseUrl = () => loadConfig().GENERAL.BASE_URL;
+
 export const getOpenaiApiKey = () => loadConfig().MODELS.OPENAI.API_KEY;

 export const getGroqApiKey = () => loadConfig().MODELS.GROQ.API_KEY;
@@ -63,6 +86,8 @@ export const getSearxngApiEndpoint = () =>

 export const getOllamaApiEndpoint = () => loadConfig().MODELS.OLLAMA.API_URL;

+export const getDeepseekApiKey = () => loadConfig().MODELS.DEEPSEEK.API_KEY;
+
 export const getCustomOpenaiApiKey = () =>
   loadConfig().MODELS.CUSTOM_OPENAI.API_KEY;

@@ -72,6 +97,9 @@ export const getCustomOpenaiApiUrl = () =>
 export const getCustomOpenaiModelName = () =>
   loadConfig().MODELS.CUSTOM_OPENAI.MODEL_NAME;

+export const getLMStudioApiEndpoint = () =>
+  loadConfig().MODELS.LM_STUDIO.API_URL;
+
 const mergeConfigs = (current: any, update: any): any => {
   if (update === null || update === undefined) {
     return current;
@@ -104,10 +132,13 @@ const mergeConfigs = (current: any, update: any): any => {
 };

 export const updateConfig = (config: RecursivePartial<Config>) => {
-  const currentConfig = loadConfig();
-  const mergedConfig = mergeConfigs(currentConfig, config);
-  fs.writeFileSync(
-    path.join(path.join(process.cwd(), `${configFileName}`)),
-    toml.stringify(mergedConfig),
-  );
+  // Server-side only
+  if (typeof window === 'undefined') {
+    const currentConfig = loadConfig();
+    const mergedConfig = mergeConfigs(currentConfig, config);
+    fs.writeFileSync(
+      path.join(path.join(process.cwd(), `${configFileName}`)),
+      toml.stringify(mergedConfig),
+    );
+  }
 };
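Both `loadConfig` and `updateConfig` in the config changes above gate file-system access behind `typeof window === 'undefined'`, the usual way to detect a server (Node.js) context in code that a Next.js build may also bundle for the browser. A hedged standalone sketch of that pattern (the `readServerConfig` name is illustrative, not from the repo; the sketch reads `window` via `globalThis` so it type-checks outside a DOM environment):

```typescript
// Sketch of the server-only guard pattern from the config hunks above.
// In a browser, `window` exists, so the file-backed branch is skipped and
// the caller gets an inert empty config (real settings arrive via an API).
type AppConfig = Record<string, unknown>;

function readServerConfig(loadFromDisk: () => string): AppConfig {
  const isServer = typeof (globalThis as any).window === 'undefined';
  if (isServer) {
    // Server side: safe to touch files, environment variables, etc.
    return JSON.parse(loadFromDisk()) as AppConfig;
  }
  // Client side: return an empty fallback instead of crashing the bundle.
  return {};
}

const cfg = readServerConfig(() => '{"KEEP_ALIVE":"5m"}');
```

The same guard on the write path (as in `updateConfig`) makes accidental client-side calls a silent no-op rather than a runtime error about a missing `fs` module.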
@@ -21,6 +21,10 @@ class LineOutputParser extends BaseOutputParser<string> {
   async parse(text: string): Promise<string> {
     text = text.trim() || '';

+    // First, remove all <think>...</think> blocks to avoid parsing tags inside thinking content
+    // This might be a little aggressive. Prompt massaging might be all we need, but this is a guarantee and should rarely mess anything up.
+    text = this.removeThinkingBlocks(text);
+
     const regex = /^(\s*(-|\*|\d+\.\s|\d+\)\s|\u2022)\s*)+/;
     const startKeyIndex = text.indexOf(`<${this.key}>`);
     const endKeyIndex = text.indexOf(`</${this.key}>`);
@@ -40,6 +44,17 @@ class LineOutputParser extends BaseOutputParser<string> {
     return line;
   }

+  /**
+   * Removes all content within <think>...</think> blocks
+   * @param text The input text containing thinking blocks
+   * @returns The text with all thinking blocks removed
+   */
+  private removeThinkingBlocks(text: string): string {
+    // Use regex to identify and remove all <think>...</think> blocks
+    // Using the 's' flag to make dot match newlines
+    return text.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
+  }
+
   getFormatInstructions(): string {
     throw new Error('Not implemented.');
   }
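The `removeThinkingBlocks` helper added above hinges on the regex `/<think>[\s\S]*?<\/think>/g`: the `[\s\S]` class matches any character including newlines (so the `s` flag is not actually required), and the lazy `*?` stops each match at the nearest closing tag, so multiple blocks are removed independently. A quick standalone check of that behavior:

```typescript
// Standalone demonstration of the thinking-block stripper from the hunk above.
function removeThinkingBlocks(text: string): string {
  // [\s\S] spans newlines; the lazy quantifier bounds each match at the
  // first subsequent </think>, so separate blocks are removed one by one.
  return text.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
}

const sample =
  '<think>first\nmultiline block</think><answer>42</answer><think>second</think>';
const cleaned = removeThinkingBlocks(sample);
// cleaned === '<answer>42</answer>'
```

Without the lazy quantifier, a greedy `[\s\S]*` would swallow everything between the first `<think>` and the last `</think>`, deleting the answer in between.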
@@ -21,6 +21,10 @@ class LineListOutputParser extends BaseOutputParser<string[]> {
   async parse(text: string): Promise<string[]> {
     text = text.trim() || '';

+    // First, remove all <think>...</think> blocks to avoid parsing tags inside thinking content
+    // This might be a little aggressive. Prompt massaging might be all we need, but this is a guarantee and should rarely mess anything up.
+    text = this.removeThinkingBlocks(text);
+
     const regex = /^(\s*(-|\*|\d+\.\s|\d+\)\s|\u2022)\s*)+/;
     const startKeyIndex = text.indexOf(`<${this.key}>`);
     const endKeyIndex = text.indexOf(`</${this.key}>`);
@@ -42,6 +46,17 @@ class LineListOutputParser extends BaseOutputParser<string[]> {
     return lines;
   }

+  /**
+   * Removes all content within <think>...</think> blocks
+   * @param text The input text containing thinking blocks
+   * @returns The text with all thinking blocks removed
+   */
+  private removeThinkingBlocks(text: string): string {
+    // Use regex to identify and remove all <think>...</think> blocks
+    // Using [\s\S] pattern to match all characters including newlines
+    return text.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
+  }
+
   getFormatInstructions(): string {
     throw new Error('Not implemented.');
   }
@@ -51,6 +51,10 @@ export const academicSearchResponsePrompt = `
 - If no relevant information is found, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?" Be transparent about limitations and suggest alternatives or ways to reframe the query.
 - You are set on focus mode 'Academic', this means you will be searching for academic papers and articles on the web.
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}

 ### Example Output
 - Begin with a brief introduction summarizing the event or query topic.
 - Follow with detailed sections under clear headings, covering all aspects of the query if possible.
src/lib/prompts/chat.ts — new file (19 lines)
@@ -0,0 +1,19 @@
+export const chatPrompt = `
+You are Perplexica, an AI model who is expert at having creative conversations with users. You are currently set on focus mode 'Chat', which means you will engage in a truly creative conversation without searching the web or citing sources.
+
+In Chat mode, you should be:
+- Creative and engaging in your responses
+- Helpful and informative based on your internal knowledge
+- Conversational and natural in your tone
+- Willing to explore ideas, hypothetical scenarios, and creative topics
+
+Since you are in Chat mode, you would not perform web searches or cite sources. If the user asks a question that would benefit from web search or specific data, you can suggest they switch to a different focus mode like 'All Mode' for general web search or another specialized mode.
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}
+
+<context>
+{context}
+</context>
+`;
@@ -11,7 +11,8 @@ import {
   wolframAlphaSearchResponsePrompt,
   wolframAlphaSearchRetrieverPrompt,
 } from './wolframAlpha';
-import { writingAssistantPrompt } from './writingAssistant';
+import { localResearchPrompt } from './localResearch';
+import { chatPrompt } from './chat';
 import {
   youtubeSearchResponsePrompt,
   youtubeSearchRetrieverPrompt,
|
|||||||
redditSearchRetrieverPrompt,
|
redditSearchRetrieverPrompt,
|
||||||
wolframAlphaSearchResponsePrompt,
|
wolframAlphaSearchResponsePrompt,
|
||||||
wolframAlphaSearchRetrieverPrompt,
|
wolframAlphaSearchRetrieverPrompt,
|
||||||
writingAssistantPrompt,
|
localResearchPrompt,
|
||||||
|
chatPrompt,
|
||||||
youtubeSearchResponsePrompt,
|
youtubeSearchResponsePrompt,
|
||||||
youtubeSearchRetrieverPrompt,
|
youtubeSearchRetrieverPrompt,
|
||||||
};
|
};
|
||||||
|
@@ -1,12 +1,16 @@
-export const writingAssistantPrompt = `
-You are Perplexica, an AI model who is expert at searching the web and answering user's queries. You are currently set on focus mode 'Writing Assistant', this means you will be helping the user write a response to a given query.
-Since you are a writing assistant, you would not perform web searches. If you think you lack information to answer the query, you can ask the user for more information or suggest them to switch to a different focus mode.
+export const localResearchPrompt = `
+You are Perplexica, an AI model who is expert at searching the web and answering user's queries. You are currently set on focus mode 'Local Research', this means you will be helping the user research and interact with local files with citations.
+Since you are in local research mode, you would not perform web searches. If you think you lack information to answer the query, you can ask the user for more information or suggest them to switch to a different focus mode.
 You will be shared a context that can contain information from files user has uploaded to get answers from. You will have to generate answers upon that.

 You have to cite the answer using [number] notation. You must cite the sentences with their relevent context number. You must cite each and every part of the answer so the user can know where the information is coming from.
 Place these citations at the end of that particular sentence. You can cite the same sentence multiple times if it is relevant to the user's query like [number1][number2].
 However you do not need to cite it using the same number. You can use different numbers to cite the same sentence multiple times. The number refers to the number of the search result (passed in the context) used to generate that part of the answer.
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}

 <context>
 {context}
 </context>
@@ -51,6 +51,10 @@ export const redditSearchResponsePrompt = `
 - If no relevant information is found, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?" Be transparent about limitations and suggest alternatives or ways to reframe the query.
 - You are set on focus mode 'Reddit', this means you will be searching for information, opinions and discussions on the web using Reddit.
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}

 ### Example Output
 - Begin with a brief introduction summarizing the event or query topic.
 - Follow with detailed sections under clear headings, covering all aspects of the query if possible.
@@ -1,8 +1,9 @@
 export const webSearchRetrieverPrompt = `
-You are an AI question rephraser. You will be given a conversation and a follow-up question, you will have to rephrase the follow up question so it is a standalone question and can be used by another LLM to search the web for information to answer it.
-If it is a smple writing task or a greeting (unless the greeting contains a question after it) like Hi, Hello, How are you, etc. than a question then you need to return \`not_needed\` as the response (This is because the LLM won't need to search the web for finding information on this topic).
+You are an AI question rephraser. You will be given a conversation and a follow-up question, you will have to rephrase the follow up question so it is a standalone question and can be used by another LLM to search the web for information to answer it. You should condense the question to its essence and remove any unnecessary details. You should also make sure that the question is clear and easy to understand. You should not add any new information or change the meaning of the question. You should also make sure that the question is grammatically correct and free of spelling errors.
+If it is a simple writing task or a greeting (unless the greeting contains a question after it) like Hi, Hello, How are you, etc. than a question then you need to return \`not_needed\` as the response (This is because the LLM won't need to search the web for finding information on this topic).
 If the user asks some question from some URL or wants you to summarize a PDF or a webpage (via URL) you need to return the links inside the \`links\` XML block and the question inside the \`question\` XML block. If the user wants to you to summarize the webpage or the PDF you need to return \`summarize\` inside the \`question\` XML block in place of a question and the link to summarize in the \`links\` XML block.
 You must always return the rephrased question inside the \`question\` XML block, if there are no links in the follow-up question then don't insert a \`links\` XML block in your response.
+If you are a thinking or reasoning AI, you should avoid using \`<question>\` and \`</question>\` tags in your thinking. Those tags should only be used in the final output. You should also avoid using \`<links>\` and \`</links>\` tags in your thinking. Those tags should only be used in the final output.

 There are several examples attached for your reference inside the below \`examples\` XML block

|
|||||||
https://example.com
|
https://example.com
|
||||||
</links>
|
</links>
|
||||||
\`
|
\`
|
||||||
|
|
||||||
|
6. Follow-up question: Get the current F1 constructor standings and return the results in a table
|
||||||
|
Rephrased question: \`
|
||||||
|
<question>
|
||||||
|
Current F1 constructor standings
|
||||||
|
</question>
|
||||||
|
\`
|
||||||
|
|
||||||
|
7. Follow-up question: What are the top 10 restaurants in New York? Show the results in a table and include a short description of each restaurant.
|
||||||
|
Rephrased question: \`
|
||||||
|
<question>
|
||||||
|
Top 10 restaurants in New York
|
||||||
|
</question>
|
||||||
|
\`
|
||||||
|
|
||||||
</examples>
|
</examples>
|
||||||
|
|
||||||
Anything below is the part of the actual conversation and you need to use conversation and the follow-up question to rephrase the follow-up question as a standalone question based on the guidelines shared above.
|
Anything below is the part of the actual conversation and you need to use conversation and the follow-up question to rephrase the follow-up question as a standalone question based on the guidelines shared above.
|
||||||
@@ -92,6 +108,10 @@ export const webSearchResponsePrompt = `
 - If the user provides vague input or if relevant information is missing, explain what additional details might help refine the search.
 - If no relevant information is found, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?" Be transparent about limitations and suggest alternatives or ways to reframe the query.
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}

 ### Example Output
 - Begin with a brief introduction summarizing the event or query topic.
 - Follow with detailed sections under clear headings, covering all aspects of the query if possible.
@@ -51,6 +51,10 @@ export const wolframAlphaSearchResponsePrompt = `
 - If no relevant information is found, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?" Be transparent about limitations and suggest alternatives or ways to reframe the query.
 - You are set on focus mode 'Wolfram Alpha', this means you will be searching for information on the web using Wolfram Alpha. It is a computational knowledge engine that can answer factual queries and perform computations.
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}

 ### Example Output
 - Begin with a brief introduction summarizing the event or query topic.
 - Follow with detailed sections under clear headings, covering all aspects of the query if possible.
@@ -51,6 +51,10 @@ export const youtubeSearchResponsePrompt = `
 - If no relevant information is found, say: "Hmm, sorry I could not find any relevant information on this topic. Would you like me to search again or ask something else?" Be transparent about limitations and suggest alternatives or ways to reframe the query.
 - You are set on focus mode 'Youtube', this means you will be searching for videos on the web using Youtube and providing information based on the video's transcrip
+
+### User instructions
+These instructions are shared to you by the user and not by the system. You will have to follow them but give them less priority than the above instructions. If the user has provided specific instructions or preferences, incorporate them into your response while adhering to the overall guidelines.
+{systemInstructions}

 ### Example Output
 - Begin with a brief introduction summarizing the event or query topic.
 - Follow with detailed sections under clear headings, covering all aspects of the query if possible.
|
@@ -1,6 +1,11 @@
 import { ChatAnthropic } from '@langchain/anthropic';
 import { ChatModel } from '.';
 import { getAnthropicApiKey } from '../config';
+
+export const PROVIDER_INFO = {
+  key: 'anthropic',
+  displayName: 'Anthropic',
+};
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 
 const anthropicChatModels: Record<string, string>[] = [
src/lib/providers/deepseek.ts (new file, 49 lines)
@@ -0,0 +1,49 @@
+import { ChatOpenAI } from '@langchain/openai';
+import { getDeepseekApiKey } from '../config';
+import { ChatModel } from '.';
+import { BaseChatModel } from '@langchain/core/language_models/chat_models';
+
+export const PROVIDER_INFO = {
+  key: 'deepseek',
+  displayName: 'Deepseek AI',
+};
+
+const deepseekChatModels: Record<string, string>[] = [
+  {
+    displayName: 'Deepseek Chat (Deepseek V3)',
+    key: 'deepseek-chat',
+  },
+  {
+    displayName: 'Deepseek Reasoner (Deepseek R1)',
+    key: 'deepseek-reasoner',
+  },
+];
+
+export const loadDeepseekChatModels = async () => {
+  const deepseekApiKey = getDeepseekApiKey();
+
+  if (!deepseekApiKey) return {};
+
+  try {
+    const chatModels: Record<string, ChatModel> = {};
+
+    deepseekChatModels.forEach((model) => {
+      chatModels[model.key] = {
+        displayName: model.displayName,
+        model: new ChatOpenAI({
+          openAIApiKey: deepseekApiKey,
+          modelName: model.key,
+          temperature: 0.7,
+          configuration: {
+            baseURL: 'https://api.deepseek.com',
+          },
+        }) as unknown as BaseChatModel,
+      };
+    });
+
+    return chatModels;
+  } catch (err) {
+    console.error(`Error loading Deepseek models: ${err}`);
+    return {};
+  }
+};
@@ -4,6 +4,11 @@ import {
 } from '@langchain/google-genai';
 import { getGeminiApiKey } from '../config';
 import { ChatModel, EmbeddingModel } from '.';
+
+export const PROVIDER_INFO = {
+  key: 'gemini',
+  displayName: 'Google Gemini',
+};
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 import { Embeddings } from '@langchain/core/embeddings';
 
@@ -40,8 +45,12 @@ const geminiChatModels: Record<string, string>[] = [
 
 const geminiEmbeddingModels: Record<string, string>[] = [
   {
-    displayName: 'Gemini Embedding',
-    key: 'gemini-embedding-exp',
+    displayName: 'Text Embedding 004',
+    key: 'models/text-embedding-004',
+  },
+  {
+    displayName: 'Embedding 001',
+    key: 'models/embedding-001',
   },
 ];
 
@@ -1,6 +1,11 @@
 import { ChatOpenAI } from '@langchain/openai';
 import { getGroqApiKey } from '../config';
 import { ChatModel } from '.';
+
+export const PROVIDER_INFO = {
+  key: 'groq',
+  displayName: 'Groq',
+};
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 
 const groqChatModels: Record<string, string>[] = [
@@ -72,6 +77,14 @@ const groqChatModels: Record<string, string>[] = [
     displayName: 'Llama 3.2 90B Vision Preview (Preview)',
     key: 'llama-3.2-90b-vision-preview',
   },
+  /* {
+    displayName: 'Llama 4 Maverick 17B 128E Instruct (Preview)',
+    key: 'meta-llama/llama-4-maverick-17b-128e-instruct',
+  }, */
+  {
+    displayName: 'Llama 4 Scout 17B 16E Instruct (Preview)',
+    key: 'meta-llama/llama-4-scout-17b-16e-instruct',
+  },
 ];
 
 export const loadGroqChatModels = async () => {
@@ -1,17 +1,60 @@
 import { Embeddings } from '@langchain/core/embeddings';
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
-import { loadOpenAIChatModels, loadOpenAIEmbeddingModels } from './openai';
+import {
+  loadOpenAIChatModels,
+  loadOpenAIEmbeddingModels,
+  PROVIDER_INFO as OpenAIInfo,
+  PROVIDER_INFO,
+} from './openai';
 import {
   getCustomOpenaiApiKey,
   getCustomOpenaiApiUrl,
   getCustomOpenaiModelName,
 } from '../config';
 import { ChatOpenAI } from '@langchain/openai';
-import { loadOllamaChatModels, loadOllamaEmbeddingModels } from './ollama';
-import { loadGroqChatModels } from './groq';
-import { loadAnthropicChatModels } from './anthropic';
-import { loadGeminiChatModels, loadGeminiEmbeddingModels } from './gemini';
-import { loadTransformersEmbeddingsModels } from './transformers';
+import {
+  loadOllamaChatModels,
+  loadOllamaEmbeddingModels,
+  PROVIDER_INFO as OllamaInfo,
+} from './ollama';
+import { loadGroqChatModels, PROVIDER_INFO as GroqInfo } from './groq';
+import {
+  loadAnthropicChatModels,
+  PROVIDER_INFO as AnthropicInfo,
+} from './anthropic';
+import {
+  loadGeminiChatModels,
+  loadGeminiEmbeddingModels,
+  PROVIDER_INFO as GeminiInfo,
+} from './gemini';
+import {
+  loadTransformersEmbeddingsModels,
+  PROVIDER_INFO as TransformersInfo,
+} from './transformers';
+import {
+  loadDeepseekChatModels,
+  PROVIDER_INFO as DeepseekInfo,
+} from './deepseek';
+import {
+  loadLMStudioChatModels,
+  loadLMStudioEmbeddingsModels,
+  PROVIDER_INFO as LMStudioInfo,
+} from './lmstudio';
+
+export const PROVIDER_METADATA = {
+  openai: OpenAIInfo,
+  ollama: OllamaInfo,
+  groq: GroqInfo,
+  anthropic: AnthropicInfo,
+  gemini: GeminiInfo,
+  transformers: TransformersInfo,
+  deepseek: DeepseekInfo,
+  lmstudio: LMStudioInfo,
+  custom_openai: {
+    key: 'custom_openai',
+    displayName: 'Custom OpenAI',
+  },
+};
 
 export interface ChatModel {
   displayName: string;
@@ -32,6 +75,8 @@ export const chatModelProviders: Record<
   groq: loadGroqChatModels,
   anthropic: loadAnthropicChatModels,
   gemini: loadGeminiChatModels,
+  deepseek: loadDeepseekChatModels,
+  lmstudio: loadLMStudioChatModels,
 };
 
 export const embeddingModelProviders: Record<
@@ -42,6 +87,7 @@ export const embeddingModelProviders: Record<
   ollama: loadOllamaEmbeddingModels,
   gemini: loadGeminiEmbeddingModels,
   transformers: loadTransformersEmbeddingsModels,
+  lmstudio: loadLMStudioEmbeddingsModels,
 };
 
 export const getAvailableChatModelProviders = async () => {
@@ -50,7 +96,14 @@ export const getAvailableChatModelProviders = async () => {
   for (const provider in chatModelProviders) {
     const providerModels = await chatModelProviders[provider]();
     if (Object.keys(providerModels).length > 0) {
-      models[provider] = providerModels;
+      // Sort models alphabetically by their keys
+      const sortedModels: Record<string, ChatModel> = {};
+      Object.keys(providerModels)
+        .sort()
+        .forEach((key) => {
+          sortedModels[key] = providerModels[key];
+        });
+      models[provider] = sortedModels;
     }
   }
 
@@ -85,7 +138,14 @@ export const getAvailableEmbeddingModelProviders = async () => {
   for (const provider in embeddingModelProviders) {
     const providerModels = await embeddingModelProviders[provider]();
     if (Object.keys(providerModels).length > 0) {
-      models[provider] = providerModels;
+      // Sort embedding models alphabetically by their keys
+      const sortedModels: Record<string, EmbeddingModel> = {};
+      Object.keys(providerModels)
+        .sort()
+        .forEach((key) => {
+          sortedModels[key] = providerModels[key];
+        });
+      models[provider] = sortedModels;
     }
   }
 
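The sorting change in the provider index rebuilds each model map with its keys in alphabetical order (string-keyed object properties in JavaScript preserve insertion order, so the rebuilt order is observable). A standalone sketch of the same idea — `sortByKey` is a hypothetical helper name, not code from the repository:

```typescript
// Rebuild a Record with its keys sorted alphabetically, mirroring the
// sorting applied to provider model maps in the diff above.
function sortByKey<T>(models: Record<string, T>): Record<string, T> {
  const sorted: Record<string, T> = {};
  Object.keys(models)
    .sort()
    .forEach((key) => {
      sorted[key] = models[key];
    });
  return sorted;
}

const sorted = sortByKey({ 'gpt-4o-mini': 1, 'gpt-4.1': 2, 'chatgpt-4o-latest': 3 });
console.log(Object.keys(sorted)); // keys now in alphabetical order
```

Note this is a lexicographic sort of the raw key strings, so `'gpt-4.1'` sorts before `'gpt-4o-mini'` (`.` precedes `o`), which is what the UI model picker ends up showing.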
src/lib/providers/lmstudio.ts (new file, 100 lines)
@@ -0,0 +1,100 @@
+import { getKeepAlive, getLMStudioApiEndpoint } from '../config';
+import axios from 'axios';
+import { ChatModel, EmbeddingModel } from '.';
+
+export const PROVIDER_INFO = {
+  key: 'lmstudio',
+  displayName: 'LM Studio',
+};
+import { ChatOpenAI } from '@langchain/openai';
+import { OpenAIEmbeddings } from '@langchain/openai';
+import { BaseChatModel } from '@langchain/core/language_models/chat_models';
+import { Embeddings } from '@langchain/core/embeddings';
+
+interface LMStudioModel {
+  id: string;
+  name?: string;
+}
+
+const ensureV1Endpoint = (endpoint: string): string =>
+  endpoint.endsWith('/v1') ? endpoint : `${endpoint}/v1`;
+
+const checkServerAvailability = async (endpoint: string): Promise<boolean> => {
+  try {
+    await axios.get(`${ensureV1Endpoint(endpoint)}/models`, {
+      headers: { 'Content-Type': 'application/json' },
+    });
+    return true;
+  } catch {
+    return false;
+  }
+};
+
+export const loadLMStudioChatModels = async () => {
+  const endpoint = getLMStudioApiEndpoint();
+
+  if (!endpoint) return {};
+  if (!(await checkServerAvailability(endpoint))) return {};
+
+  try {
+    const response = await axios.get(`${ensureV1Endpoint(endpoint)}/models`, {
+      headers: { 'Content-Type': 'application/json' },
+    });
+
+    const chatModels: Record<string, ChatModel> = {};
+
+    response.data.data.forEach((model: LMStudioModel) => {
+      chatModels[model.id] = {
+        displayName: model.name || model.id,
+        model: new ChatOpenAI({
+          openAIApiKey: 'lm-studio',
+          configuration: {
+            baseURL: ensureV1Endpoint(endpoint),
+          },
+          modelName: model.id,
+          temperature: 0.7,
+          streaming: true,
+          maxRetries: 3,
+        }) as unknown as BaseChatModel,
+      };
+    });
+
+    return chatModels;
+  } catch (err) {
+    console.error(`Error loading LM Studio models: ${err}`);
+    return {};
+  }
+};
+
+export const loadLMStudioEmbeddingsModels = async () => {
+  const endpoint = getLMStudioApiEndpoint();
+
+  if (!endpoint) return {};
+  if (!(await checkServerAvailability(endpoint))) return {};
+
+  try {
+    const response = await axios.get(`${ensureV1Endpoint(endpoint)}/models`, {
+      headers: { 'Content-Type': 'application/json' },
+    });
+
+    const embeddingsModels: Record<string, EmbeddingModel> = {};
+
+    response.data.data.forEach((model: LMStudioModel) => {
+      embeddingsModels[model.id] = {
+        displayName: model.name || model.id,
+        model: new OpenAIEmbeddings({
+          openAIApiKey: 'lm-studio',
+          configuration: {
+            baseURL: ensureV1Endpoint(endpoint),
+          },
+          modelName: model.id,
+        }) as unknown as Embeddings,
+      };
+    });
+
+    return embeddingsModels;
+  } catch (err) {
+    console.error(`Error loading LM Studio embeddings model: ${err}`);
+    return {};
+  }
+};
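The `ensureV1Endpoint` helper in the new LM Studio provider normalizes user-supplied endpoints so both `http://host:1234` and `http://host:1234/v1` resolve to the same OpenAI-compatible base URL. Its behavior can be checked in isolation (restated from the diff):

```typescript
// Restated from the new lmstudio.ts: append '/v1' only when the endpoint
// does not already end with it, so the config value can be given either way.
const ensureV1Endpoint = (endpoint: string): string =>
  endpoint.endsWith('/v1') ? endpoint : `${endpoint}/v1`;

console.log(ensureV1Endpoint('http://localhost:1234'));    // gains '/v1'
console.log(ensureV1Endpoint('http://localhost:1234/v1')); // unchanged
```

One caveat of this simple suffix check: an endpoint with a trailing slash (`http://host:1234/`) would become `http://host:1234//v1`, so the configured value is expected without a trailing slash.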
@@ -1,8 +1,13 @@
 import axios from 'axios';
 import { getKeepAlive, getOllamaApiEndpoint } from '../config';
 import { ChatModel, EmbeddingModel } from '.';
-import { ChatOllama } from '@langchain/community/chat_models/ollama';
-import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
+
+export const PROVIDER_INFO = {
+  key: 'ollama',
+  displayName: 'Ollama',
+};
+import { ChatOllama } from '@langchain/ollama';
+import { OllamaEmbeddings } from '@langchain/ollama';
 
 export const loadOllamaChatModels = async () => {
   const ollamaApiEndpoint = getOllamaApiEndpoint();
@@ -1,6 +1,11 @@
 import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';
 import { getOpenaiApiKey } from '../config';
 import { ChatModel, EmbeddingModel } from '.';
+
+export const PROVIDER_INFO = {
+  key: 'openai',
+  displayName: 'OpenAI',
+};
 import { BaseChatModel } from '@langchain/core/language_models/chat_models';
 import { Embeddings } from '@langchain/core/embeddings';
 
@@ -25,6 +30,18 @@ const openaiChatModels: Record<string, string>[] = [
     displayName: 'GPT-4 omni mini',
     key: 'gpt-4o-mini',
   },
+  {
+    displayName: 'GPT 4.1 nano',
+    key: 'gpt-4.1-nano',
+  },
+  {
+    displayName: 'GPT 4.1 mini',
+    key: 'gpt-4.1-mini',
+  },
+  {
+    displayName: 'GPT 4.1',
+    key: 'gpt-4.1',
+  },
 ];
 
 const openaiEmbeddingModels: Record<string, string>[] = [
@@ -1,5 +1,10 @@
 import { HuggingFaceTransformersEmbeddings } from '../huggingfaceTransformer';
+
+export const PROVIDER_INFO = {
+  key: 'transformers',
+  displayName: 'Hugging Face',
+};
 
 export const loadTransformersEmbeddingsModels = async () => {
   try {
     const embeddingModels = {
@@ -20,15 +20,24 @@ export const searchHandlers: Record<string, MetaSearchAgent> = {
     searchWeb: true,
     summarizer: false,
   }),
-  writingAssistant: new MetaSearchAgent({
+  localResearch: new MetaSearchAgent({
     activeEngines: [],
     queryGeneratorPrompt: '',
-    responsePrompt: prompts.writingAssistantPrompt,
+    responsePrompt: prompts.localResearchPrompt,
    rerank: true,
     rerankThreshold: 0,
     searchWeb: false,
     summarizer: false,
   }),
+  chat: new MetaSearchAgent({
+    activeEngines: [],
+    queryGeneratorPrompt: '',
+    responsePrompt: prompts.chatPrompt,
+    rerank: false,
+    rerankThreshold: 0,
+    searchWeb: false,
+    summarizer: false,
+  }),
   wolframAlphaSearch: new MetaSearchAgent({
     activeEngines: ['wolframalpha'],
     queryGeneratorPrompt: prompts.wolframAlphaSearchRetrieverPrompt,
@@ -33,6 +33,7 @@ export interface MetaSearchAgentType {
     embeddings: Embeddings,
     optimizationMode: 'speed' | 'balanced' | 'quality',
     fileIds: string[],
+    systemInstructions: string,
   ) => Promise<eventEmitter>;
 }
 
@@ -54,6 +55,8 @@ type BasicChainInput = {
 class MetaSearchAgent implements MetaSearchAgentType {
   private config: Config;
   private strParser = new StringOutputParser();
+  private searchQuery?: string;
+  private searxngUrl?: string;
 
   constructor(config: Config) {
     this.config = config;
@@ -79,6 +82,7 @@ class MetaSearchAgent implements MetaSearchAgentType {
         let question = this.config.summarizer
           ? await questionOutputParser.parse(input)
           : input;
+        console.log('question', question);
 
         if (question === 'not_needed') {
           return { query: '', docs: [] };
@@ -204,12 +208,15 @@ class MetaSearchAgent implements MetaSearchAgentType {
         } else {
           question = question.replace(/<think>.*?<\/think>/g, '');
 
-          const res = await searchSearxng(question, {
+          const searxngResult = await searchSearxng(question, {
             language: 'en',
             engines: this.config.activeEngines,
           });
 
-          const documents = res.results.map(
+          // Store the SearXNG URL for later use in emitting to the client
+          this.searxngUrl = searxngResult.searchUrl;
+
+          const documents = searxngResult.results.map(
             (result) =>
               new Document({
                 pageContent:
@@ -225,7 +232,7 @@ class MetaSearchAgent implements MetaSearchAgentType {
               }),
           );
 
-          return { query: question, docs: documents };
+          return { query: question, docs: documents, searchQuery: question };
         }
       }),
     ]);
@@ -236,9 +243,11 @@ class MetaSearchAgent implements MetaSearchAgentType {
     fileIds: string[],
     embeddings: Embeddings,
     optimizationMode: 'speed' | 'balanced' | 'quality',
+    systemInstructions: string,
   ) {
     return RunnableSequence.from([
       RunnableMap.from({
+        systemInstructions: () => systemInstructions,
         query: (input: BasicChainInput) => input.query,
         chat_history: (input: BasicChainInput) => input.chat_history,
         date: () => new Date().toISOString(),
@@ -261,6 +270,11 @@ class MetaSearchAgent implements MetaSearchAgentType {
 
           query = searchRetrieverResult.query;
           docs = searchRetrieverResult.docs;
+
+          // Store the search query in the context for emitting to the client
+          if (searchRetrieverResult.searchQuery) {
+            this.searchQuery = searchRetrieverResult.searchQuery;
+          }
         }
 
         const sortedDocs = await this.rerankDocs(
@@ -431,17 +445,30 @@ class MetaSearchAgent implements MetaSearchAgentType {
   private async handleStream(
     stream: AsyncGenerator<StreamEvent, any, any>,
     emitter: eventEmitter,
+    llm: BaseChatModel,
   ) {
     for await (const event of stream) {
       if (
         event.event === 'on_chain_end' &&
         event.name === 'FinalSourceRetriever'
       ) {
-        ``;
-        emitter.emit(
-          'data',
-          JSON.stringify({ type: 'sources', data: event.data.output }),
-        );
+        const sourcesData = event.data.output;
+        if (this.searchQuery) {
+          emitter.emit(
+            'data',
+            JSON.stringify({
+              type: 'sources',
+              data: sourcesData,
+              searchQuery: this.searchQuery,
+              searchUrl: this.searxngUrl,
+            }),
+          );
+        } else {
+          emitter.emit(
+            'data',
+            JSON.stringify({ type: 'sources', data: sourcesData }),
+          );
+        }
       }
       if (
         event.event === 'on_chain_stream' &&
@@ -456,6 +483,50 @@ class MetaSearchAgent implements MetaSearchAgentType {
         event.event === 'on_chain_end' &&
         event.name === 'FinalResponseGenerator'
       ) {
+        // Get model name safely with better detection
+        let modelName = 'Unknown';
+        try {
+          // @ts-ignore - Different LLM implementations have different properties
+          if (llm.modelName) {
+            // @ts-ignore
+            modelName = llm.modelName;
+            // @ts-ignore
+          } else if (llm._llm && llm._llm.modelName) {
+            // @ts-ignore
+            modelName = llm._llm.modelName;
+            // @ts-ignore
+          } else if (llm.model && llm.model.modelName) {
+            // @ts-ignore
+            modelName = llm.model.modelName;
+          } else if ('model' in llm) {
+            // @ts-ignore
+            const model = llm.model;
+            if (typeof model === 'string') {
+              modelName = model;
+              // @ts-ignore
+            } else if (model && model.modelName) {
+              // @ts-ignore
+              modelName = model.modelName;
+            }
+          } else if (llm.constructor && llm.constructor.name) {
+            // Last resort: use the class name
+            modelName = llm.constructor.name;
+          }
+        } catch (e) {
+          console.error('Failed to get model name:', e);
+        }
+
+        // Send model info before ending
+        emitter.emit(
+          'stats',
+          JSON.stringify({
+            type: 'modelStats',
+            data: {
+              modelName,
+            },
+          }),
+        );
+
         emitter.emit('end');
       }
     }
@@ -468,6 +539,7 @@ class MetaSearchAgent implements MetaSearchAgentType {
     embeddings: Embeddings,
     optimizationMode: 'speed' | 'balanced' | 'quality',
     fileIds: string[],
+    systemInstructions: string,
   ) {
     const emitter = new eventEmitter();
 
@@ -476,6 +548,7 @@ class MetaSearchAgent implements MetaSearchAgentType {
       fileIds,
      embeddings,
       optimizationMode,
+      systemInstructions,
     );
 
     const stream = answeringChain.streamEvents(
@@ -488,7 +561,7 @@ class MetaSearchAgent implements MetaSearchAgentType {
       },
     );
 
-    this.handleStream(stream, emitter);
+    this.handleStream(stream, emitter, llm);
 
     return emitter;
   }
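The model-name detection added to `handleStream` probes several property shapes because different LangChain chat-model classes expose the name differently. A reduced sketch of the same fallback chain (the object shapes below are illustrative stand-ins, not the real LangChain classes):

```typescript
// Illustrative only: probe common property shapes in order and fall back
// to the constructor name, mirroring the detection logic in the diff above.
function detectModelName(llm: any): string {
  if (llm?.modelName) return llm.modelName;
  if (llm?._llm?.modelName) return llm._llm.modelName;
  if (llm?.model && typeof llm.model === 'string') return llm.model;
  if (llm?.model?.modelName) return llm.model.modelName;
  return llm?.constructor?.name ?? 'Unknown';
}

console.log(detectModelName({ modelName: 'gpt-4o-mini' })); // direct property
console.log(detectModelName({ model: 'deepseek-chat' }));   // string 'model' field
```

Duck-typing like this is fragile (hence the `@ts-ignore` comments in the diff), but it avoids coupling the stream handler to any one provider's class.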
@@ -19,6 +19,12 @@ interface SearxngSearchResult {
   iframe_src?: string;
 }
 
+interface SearxngResponse {
+  results: SearxngSearchResult[];
+  suggestions: string[];
+  searchUrl: string;
+}
+
 export const searchSearxng = async (
   query: string,
   opts?: SearxngSearchOptions,
@@ -44,5 +50,16 @@ export const searchSearxng = async (
   const results: SearxngSearchResult[] = res.data.results;
   const suggestions: string[] = res.data.suggestions;
 
-  return { results, suggestions };
+  // Create a URL for viewing the search results in the SearXNG web interface
+  const searchUrl = new URL(searxngURL);
+  searchUrl.pathname = '/search';
+  searchUrl.searchParams.append('q', query);
+  if (opts?.engines?.length) {
+    searchUrl.searchParams.append('engines', opts.engines.join(','));
+  }
+  if (opts?.language) {
+    searchUrl.searchParams.append('language', opts.language);
+  }
+
+  return { results, suggestions, searchUrl: searchUrl.toString() };
 };
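The `searchUrl` construction above uses the WHATWG `URL` API, which handles query-string encoding automatically. A standalone sketch of the same approach (`buildSearchUrl` and the localhost base are placeholders, not repository code):

```typescript
// Build a link to the SearXNG web UI for a given query, as in the diff.
// 'http://localhost:8080' stands in for the configured SearXNG endpoint;
// URLSearchParams takes care of percent/plus encoding of the query.
function buildSearchUrl(base: string, query: string, engines?: string[]): string {
  const url = new URL(base);
  url.pathname = '/search';
  url.searchParams.append('q', query);
  if (engines?.length) {
    url.searchParams.append('engines', engines.join(','));
  }
  return url.toString();
}

console.log(buildSearchUrl('http://localhost:8080', 'open source search'));
console.log(buildSearchUrl('http://localhost:8080', 'integrate', ['wolframalpha']));
```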
@@ -64,7 +64,7 @@ export const getDocumentsFromLinks = async ({ links }: { links: string[] }) => {
     const splittedText = await splitter.splitText(parsedText);
     const title = res.data
       .toString('utf8')
-      .match(/<title>(.*?)<\/title>/)?.[1];
+      .match(/<title.*>(.*?)<\/title>/)?.[1];
 
     const linkDocs = splittedText.map((text) => {
       return new Document({