fix(api): History rewriting should delete the current message.

fix(UI): Model changes shouldn't submit the form.
Willie Zutz
2025-05-06 23:45:46 -06:00
parent 6220822c7c
commit 8796009141
3 changed files with 111 additions and 21 deletions

.github/copilot-instructions.md (vendored, new file, 94 lines)
@ -0,0 +1,94 @@
# GitHub Copilot Instructions for Perplexica
This file provides context and guidance for GitHub Copilot when working with the Perplexica codebase.
## Project Overview
Perplexica is an open-source AI-powered search engine that uses advanced machine learning to provide intelligent search results. It combines web search capabilities with LLM-based processing to understand and answer user questions, similar to Perplexity AI but fully open source.
## Key Components
- **Frontend**: Next.js application with React components (in `/src/components` and `/src/app`)
- **Backend Logic**: Node.js backend with API routes (in `/src/app/api`) and library code (in `/src/lib`)
- **Search Engine**: Uses SearXNG as a metasearch engine
- **LLM Integration**: Supports multiple models including OpenAI, Anthropic, Groq, Ollama (local models)
- **Database**: SQLite database managed with Drizzle ORM
## Architecture
The system works through these main steps:
- User submits a query
- The system determines if web search is needed
- If needed, it searches the web using SearXNG
- Results are ranked using embedding-based similarity search
- LLMs are used to generate a comprehensive response with cited sources
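The embedding-based re-ranking step above can be sketched in plain TypeScript. This is an illustrative sketch, not the project's actual meta search agent code; the `Doc` type and `rankBySimilarity` helper are hypothetical names.

```typescript
// Hypothetical sketch of embedding-based re-ranking: score each result by
// cosine similarity to the query embedding and sort best-first.
type Doc = { url: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Sort search results by similarity to the query embedding, best first.
function rankBySimilarity(queryEmbedding: number[], docs: Doc[]): Doc[] {
  return [...docs].sort(
    (x, y) =>
      cosineSimilarity(queryEmbedding, y.embedding) -
      cosineSimilarity(queryEmbedding, x.embedding),
  );
}
```

The same idea scales to real embeddings; only the vector dimensionality changes.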
## Key Technologies
- **Frontend**: React, Next.js, Tailwind CSS
- **Backend**: Node.js
- **Database**: SQLite with Drizzle ORM
- **AI/ML**: LangChain for orchestration, various LLM providers
- **Search**: SearXNG integration
- **Embedding Models**: For re-ranking search results
## Project Structure
- `/src/app`: Next.js app directory with page components and API routes
- `/src/components`: Reusable UI components
- `/src/lib`: Backend functionality
  - `/src/lib/search`: Search functionality and the meta search agent
  - `/src/lib/db`: Database schema and operations
  - `/src/lib/providers`: LLM and embedding model integrations
  - `/src/lib/prompts`: Prompt templates for LLMs
  - `/src/lib/chains`: LangChain chains for various operations
## Focus Modes
Perplexica supports multiple specialized search modes:
- All Mode: General web search
- Local Research Mode: Research and interact with local files with citations
- Chat Mode: Have a creative conversation
- Academic Search Mode: For academic research
- YouTube Search Mode: For video content
- Wolfram Alpha Search Mode: For calculations and data analysis
- Reddit Search Mode: For community discussions
## Development Workflow
- Use `npm run dev` for local development
- Format code with `npm run format:write` before committing
- Database migrations: `npm run db:push`
- Build for production: `npm run build`
- Start production server: `npm run start`
## Configuration
The application uses a `config.toml` file (created from `sample.config.toml`) for configuration, including:
- API keys for various LLM providers
- Database settings
- Search engine configuration
- Similarity measure settings
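For illustration, a fragment showing the general shape such a file might take. The section and key names below are assumptions, not the project's actual schema; the authoritative list lives in `sample.config.toml`.

```toml
# Illustrative only: consult sample.config.toml for the real key names.
[GENERAL]
SIMILARITY_MEASURE = "cosine"

[MODELS.OPENAI]
API_KEY = "sk-..."

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768"
```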
## Common Tasks
When working on this codebase, you might need to:
- Add new API endpoints in `/src/app/api`
- Modify UI components in `/src/components`
- Extend search functionality in `/src/lib/search`
- Add new LLM providers in `/src/lib/providers`
- Update database schema in `/src/lib/db/schema.ts`
- Create new prompt templates in `/src/lib/prompts`
- Build new chains in `/src/lib/chains`
## AI Behavior
- Avoid conciliatory language
- It is not necessary to apologize
- If you don't know the answer, ask for clarification
- Do not add additional packages or dependencies unless explicitly requested
- Only make changes to the code that are relevant to the task at hand

@ -1,27 +1,23 @@
-import prompts from '@/lib/prompts';
-import MetaSearchAgent from '@/lib/search/metaSearchAgent';
-import crypto from 'crypto';
-import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
-import { EventEmitter } from 'stream';
-import {
-  chatModelProviders,
-  embeddingModelProviders,
-  getAvailableChatModelProviders,
-  getAvailableEmbeddingModelProviders,
-} from '@/lib/providers';
-import db from '@/lib/db';
-import { chats, messages as messagesSchema } from '@/lib/db/schema';
-import { and, eq, gt } from 'drizzle-orm';
-import { getFileDetails } from '@/lib/utils/files';
-import { BaseChatModel } from '@langchain/core/language_models/chat_models';
-import { ChatOpenAI } from '@langchain/openai';
 import {
   getCustomOpenaiApiKey,
   getCustomOpenaiApiUrl,
   getCustomOpenaiModelName,
 } from '@/lib/config';
-import { ChatOllama } from '@langchain/ollama';
+import db from '@/lib/db';
+import { chats, messages as messagesSchema } from '@/lib/db/schema';
+import {
+  getAvailableChatModelProviders,
+  getAvailableEmbeddingModelProviders
+} from '@/lib/providers';
 import { searchHandlers } from '@/lib/search';
+import { getFileDetails } from '@/lib/utils/files';
+import { BaseChatModel } from '@langchain/core/language_models/chat_models';
+import { AIMessage, BaseMessage, HumanMessage } from '@langchain/core/messages';
+import { ChatOllama } from '@langchain/ollama';
+import { ChatOpenAI } from '@langchain/openai';
+import crypto from 'crypto';
+import { and, eq, gte } from 'drizzle-orm';
+import { EventEmitter } from 'stream';
 
 export const runtime = 'nodejs';
 export const dynamic = 'force-dynamic';
@ -202,7 +198,7 @@ const handleHistorySave = async (
       .delete(messagesSchema)
       .where(
         and(
-          gt(messagesSchema.id, messageExists.id),
+          gte(messagesSchema.id, messageExists.id),
           eq(messagesSchema.chatId, message.chatId),
         ),
       )
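The one-character change above (`gt` to `gte`) is the whole fix: with `gt`, the delete spared the message being rewritten; with `gte`, that message is deleted as well, so a fresh copy can be written in its place. A plain-TypeScript sketch of the semantics (the `Message` type and `rewriteHistory` helper are illustrative, not the Drizzle code itself):

```typescript
// Sketch of the history-rewrite semantics fixed above. With `>` (gt) the
// edited message itself survived the delete; with `>=` (gte) it is removed
// too, clearing the way for the rewritten version.
type Message = { id: number; chatId: string; content: string };

function rewriteHistory(
  messages: Message[],
  fromId: number,
  chatId: string,
): Message[] {
  // Mirrors: DELETE FROM messages WHERE id >= fromId AND chatId = ?
  return messages.filter((m) => !(m.id >= fromId && m.chatId === chatId));
}
```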

@ -254,7 +254,7 @@ const ModelSelector = ({
           {isExpanded && (
             <div className="pl-6">
               {provider.models.map((modelOption) => (
-                <button
+                <PopoverButton
                   key={`${modelOption.provider}-${modelOption.model}`}
                   className={cn(
                     'w-full text-left px-4 py-2 text-sm flex items-center',
@ -283,7 +283,7 @@ const ModelSelector = ({
                       Active
                     </div>
                   )}
-                </button>
+                </PopoverButton>
               ))}
             </div>
           )}
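The swap from a plain `<button>` to Headless UI's `PopoverButton` likely works because, per the HTML spec, a `<button>` with no explicit `type` attribute defaults to `type="submit"` inside a form, so clicking a model option submitted the surrounding form. A minimal model of that default rule (the helper below is hypothetical, not project code):

```typescript
// HTML missing-value default: a <button> without an explicit type attribute
// behaves as type="submit" when placed inside a <form>.
type ButtonType = 'submit' | 'button' | 'reset';

function effectiveButtonType(explicitType?: ButtonType): ButtonType {
  // Per the HTML spec, the default type for <button> is "submit".
  return explicitType ?? 'submit';
}
```

This is why setting `type="button"` explicitly, or using a component that does so, prevents accidental form submission.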