mirror of https://github.com/ItzCrazyKns/Perplexica.git
synced 2025-06-16 23:08:41 +00:00

Compare commits: 4 commits, v1.7.0 ... feat/ollam

| Author | SHA1 | Date |
| --- | --- | --- |
| | 2f20d845c8 | |
| | 7a7eafb8e7 | |
| | 043f66b767 | |
| | 811822c03b | |
.env.example (new file, +5)
@@ -0,0 +1,5 @@
PORT=3001
OLLAMA_URL=http://localhost:11434 # url of the ollama server
SIMILARITY_MEASURE=cosine # cosine or dot
SEARXNG_API_URL= # no need to fill this if using docker
MODEL_NAME=llama2
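The `SIMILARITY_MEASURE` value above is consumed by the backend when re-ranking search results. As a rough illustration, a helper driven by this variable might look like the sketch below; it assumes the `compute-cosine-similarity` and `compute-dot` packages that appear in `package.json` further down, and the actual helper in `src/utils/computeSimilarity.ts` may differ in detail.

```ts
// A minimal sketch of a SIMILARITY_MEASURE-driven helper (not the exact shipped code).
import cosineSimilarity from 'compute-cosine-similarity';
import dot from 'compute-dot';

const computeSimilarity = (x: number[], y: number[]): number => {
  // .env selects the measure: "cosine" (the default here) or "dot".
  if (process.env.SIMILARITY_MEASURE === 'dot') {
    return dot(x, y);
  }
  return cosineSimilarity(x, y);
};

export default computeSimilarity;
```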
.github/ISSUE_TEMPLATE/bug_report.md (vendored, 2 changes)
@@ -4,6 +4,7 @@ about: Create an issue to help us fix bugs
title: ''
labels: bug
assignees: ''

---

**Describe the bug**
@@ -11,7 +12,6 @@ A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
.github/ISSUE_TEMPLATE/custom.md (vendored, 3 changes)
@@ -4,4 +4,7 @@ about: Describe this issue template's purpose here.
title: ''
labels: ''
assignees: ''

---
.github/ISSUE_TEMPLATE/feature_request.md (vendored, 1 change)
@@ -4,6 +4,7 @@ about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
.gitignore (vendored, 9 changes)
@@ -6,7 +6,6 @@ yarn-error.log
# Build output
/.next/
/out/
/dist/

# IDE/Editor specific
.vscode/
@@ -20,9 +19,6 @@ yarn-error.log
.env.test.local
.env.production.local

# Config files
config.toml

# Log files
logs/
*.log
@@ -32,7 +28,4 @@ logs/

# Miscellaneous
.DS_Store
Thumbs.db

# Db
db.sqlite
Thumbs.db
(deleted file)
@@ -1,38 +0,0 @@
# Ignore all files in the node_modules directory
node_modules

# Ignore all files in the .next directory (Next.js build output)
.next

# Ignore all files in the .out directory (TypeScript build output)
.out

# Ignore all files in the .cache directory (Prettier cache)
.cache

# Ignore all files in the .vscode directory (Visual Studio Code settings)
.vscode

# Ignore all files in the .idea directory (IntelliJ IDEA settings)
.idea

# Ignore all files in the dist directory (build output)
dist

# Ignore all files in the build directory (build output)
build

# Ignore all files in the coverage directory (test coverage reports)
coverage

# Ignore all files with the .log extension
*.log

# Ignore all files with the .tmp extension
*.tmp

# Ignore all files with the .swp extension
*.swp

# Ignore all files with the .DS_Store extension (macOS specific)
.DS_Store
@@ -9,14 +9,16 @@ Perplexica's design consists of two main domains:
- **Frontend (`ui` directory)**: This is a Next.js application holding all user interface components. It's a self-contained environment that manages everything the user interacts with.
- **Backend (root and `src` directory)**: The backend logic is situated in the `src` folder, but the root directory holds the main `package.json` for backend dependency management.

Both the root directory (for backend configurations outside `src`) and the `ui` folder come with an `.env.example` file. These are templates for environment variables that you need to set up manually for the application to run correctly.

## Setting Up Your Environment

Before diving into coding, setting up your local environment is key. Here's what you need to do:

### Backend

1. In the root directory, locate the `sample.config.toml` file.
2. Rename it to `config.toml` and fill in the necessary configuration fields specific to the backend.
1. In the root directory, locate the `.env.example` file.
2. Rename it to `.env` and fill in the necessary environment variables specific to the backend.
3. Run `npm install` to install dependencies.
4. Use `npm run dev` to start the backend in development mode.
README.md (117 changes)
@@ -10,41 +10,34 @@
- [Installation](#installation)
  - [Getting Started with Docker (Recommended)](#getting-started-with-docker-recommended)
  - [Non-Docker Installation](#non-docker-installation)
  - [Ollama Connection Errors](#ollama-connection-errors)
- [Using as a Search Engine](#using-as-a-search-engine)
- [One-Click Deployment](#one-click-deployment)
- [Upcoming Features](#upcoming-features)
- [Support Us](#support-us)
  - [Donations](#donations)
- [Contribution](#contribution)
- [Help and Support](#help-and-support)
- [Acknowledgements](#acknowledgements)

## Overview

Perplexica is an open-source AI-powered searching tool, or AI-powered search engine, that goes deep into the internet to find answers. Inspired by Perplexity AI, it's an open-source option that not only searches the web but also understands your questions. It uses advanced machine learning techniques like similarity searching and embeddings to refine results, and provides clear answers with sources cited.

Using SearxNG to stay current and fully open source, Perplexica ensures you always get the most up-to-date information without compromising your privacy.

Want to know more about its architecture and how it works? You can read about it [here](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/README.md).

## Preview

![video-preview](.assets/perplexica-preview.gif)

## Features

- **Local LLMs**: You can make use of local LLMs such as Llama3 and Mixtral using Ollama.
- **Two Main Modes:**
  - **Copilot Mode:** (In development) Boosts search by generating different queries to find more relevant internet sources. Unlike normal search, which only uses the context returned by SearxNG, it visits the top matches and tries to find sources relevant to the user's query directly from the page.
  - **Normal Mode:** Processes your query and performs a web search.
- **Focus Modes:** Special modes to better answer specific types of questions. Perplexica currently has 6 focus modes:
  - **All Mode:** Searches the entire web to find the best results.
  - **Writing Assistant Mode:** Helpful for writing tasks that do not require searching the web.
  - **Academic Search Mode:** Finds articles and papers, ideal for academic research.
  - **YouTube Search Mode:** Finds YouTube videos based on the search query.
  - **Wolfram Alpha Search Mode:** Answers queries that need calculations or data analysis using Wolfram Alpha.
  - **Reddit Search Mode:** Searches Reddit for discussions and opinions related to the query.
- **Current Information:** Some search tools might give you outdated info because they use data from crawling bots, convert it into embeddings, and store it in an index. Unlike them, Perplexica uses SearxNG, a metasearch engine, to get the results, rerank them, and pick the most relevant source, ensuring you always get the latest information without the overhead of daily data updates.

1. **All Mode:** Searches the entire web to find the best results.
2. **Writing Assistant Mode:** Helpful for writing tasks that do not require searching the web.
3. **Academic Search Mode:** Finds articles and papers, ideal for academic research.
4. **YouTube Search Mode:** Finds YouTube videos based on the search query.
5. **Wolfram Alpha Search Mode:** Answers queries that need calculations or data analysis using Wolfram Alpha.
6. **Reddit Search Mode:** Searches Reddit for discussions and opinions related to the query.

- **Current Information:** Some search tools might give you outdated info because they use data from crawling bots, convert it into embeddings, and store it in an index (it's like converting the web into embeddings, which is quite expensive). Unlike them, Perplexica uses SearxNG, a metasearch engine, to get the results, rerank them, and pick the most relevant source, ensuring you always get the latest information without the overhead of daily data updates.

It has many more features like image and video search. Some of the planned features are mentioned in [upcoming features](#upcoming-features).
@@ -58,106 +51,54 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.
2. Clone the Perplexica repository:

   ```bash
   git clone https://github.com/ItzCrazyKns/Perplexica.git
   git clone -b feat/ollama-support https://github.com/ItzCrazyKns/Perplexica.git
   ```

3. After cloning, navigate to the directory containing the project files.

4. Rename the `sample.config.toml` file to `config.toml`. For Docker setups, you need only fill in the following fields:
4. Rename the `.env.example` file to `.env`. For Docker setups, you need only fill in the following fields:

   - `OPENAI`: Your OpenAI API key. **You only need to fill this if you wish to use OpenAI's models.**
   - `OLLAMA`: Your Ollama API URL. You should enter it as `http://host.docker.internal:PORT_NUMBER`. If you installed Ollama on port 11434, use `http://host.docker.internal:11434`. For other ports, adjust accordingly. **You need to fill this if you wish to use Ollama's models instead of OpenAI's.**
   - `GROQ`: Your Groq API key. **You only need to fill this if you wish to use Groq's hosted models.**

     **Note**: You can change these after starting Perplexica from the settings dialog.

   - `SIMILARITY_MEASURE`: The similarity measure to use (this is filled by default; you can leave it as is if you are unsure about it).
   - `OLLAMA_URL`: The URL where Ollama is running (also filled by default, but you need to replace it if your Ollama URL is different).
   - `MODEL_NAME`: The model to use (filled by default; you can change it if you want to use a different model).
   - `SIMILARITY_MEASURE`: The similarity measure to use (filled by default; you can leave it as is if you are unsure about it).

5. Ensure you are in the directory containing the `docker-compose.yaml` file and execute:

   ```bash
   docker compose up -d
   docker compose up
   ```

6. Wait a few minutes for the setup to complete. You can access Perplexica at http://localhost:3000 in your web browser.

   **Note**: After the containers are built, you can start Perplexica directly from Docker without having to open a terminal.
   **Note**: Once the terminal is stopped, Perplexica will also stop. To restart it, you will need to open Docker Desktop and run Perplexica again.

### Non-Docker Installation

1. Clone the repository and rename the `sample.config.toml` file to `config.toml` in the root directory. Ensure you complete all required fields in this file.
2. Rename the `.env.example` file to `.env` in the `ui` folder and fill in all necessary fields.
3. After populating the configuration and environment files, run `npm i` in both the `ui` folder and the root directory.
4. Install the dependencies and then execute `npm run build` in both the `ui` folder and the root directory.
5. Finally, start both the frontend and the backend by running `npm run start` in both the `ui` folder and the root directory.

For setups without Docker:

1. Follow the initial steps to clone the repository and rename the `.env.example` file to `.env` in the root directory. You will need to fill in all the fields in this file.
2. Additionally, rename the `.env.example` file to `.env` in the `ui` folder and complete all fields.
3. The non-Docker setup requires manual configuration of both the backend and frontend.

**Note**: Using Docker is recommended as it simplifies the setup process, especially for managing environment variables and dependencies.

See the [installation documentation](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/installation) for more information, like exposing it to your network, etc.
### Ollama Connection Errors

If you're encountering an Ollama connection error, it is likely due to the backend being unable to connect to Ollama's API. To fix this issue you can:

1. **Check your Ollama API URL:** Ensure that the API URL is correctly set in the settings menu.
2. **Update API URL Based on OS:**

   - **Windows:** Use `http://host.docker.internal:11434`
   - **Mac:** Use `http://host.docker.internal:11434`
   - **Linux:** Use `http://<private_ip_of_host>:11434`

   Adjust the port number if you're using a different one.

3. **Linux Users - Expose Ollama to Network:**

   - Serve Ollama over your network with the command:

     ```bash
     OLLAMA_HOST=0.0.0.0 ollama serve
     ```

   - Ensure that the port (default is 11434) is not blocked by your firewall.
## Using as a Search Engine

If you wish to use Perplexica as an alternative to traditional search engines like Google or Bing, or if you want to add a shortcut for quick access from your browser's search bar, follow these steps:

1. Open your browser's settings.
2. Navigate to the 'Search Engines' section.
3. Add a new site search with the following URL: `http://localhost:3000/?q=%s`. Replace `localhost` with your IP address or domain name, and `3000` with the port number if Perplexica is not hosted locally.
4. Click the add button. Now, you can use Perplexica directly from your browser's search bar.
## One-Click Deployment

[![Deploy to RepoCloud](https://repocloud.io/svg/button.svg)](https://repocloud.io/details/?app_id=267)

## Upcoming Features

- [x] Add settings page
- [x] Adding support for local LLMs
- [x] History Saving features
- [x] Introducing various Focus Modes
- [ ] Finalizing Copilot Mode
- [ ] Adding Discover
- [ ] Adding support for multiple local LLMs and LLM providers such as Anthropic, Google, etc.
- [ ] Adding Discover and History Saving features
- [x] Introducing various Focus Modes

## Support Us

If you find Perplexica useful, consider giving us a star on GitHub. This helps more people discover Perplexica and supports the development of new features. Your support is greatly appreciated.

### Donations

We also accept donations to help sustain our project. If you would like to contribute, you can use the following button to make a donation in cryptocurrency. Thank you for your support!

<a href="https://nowpayments.io/donation?api_key=RFFKJH1-GRR4DQG-HFV1DZP-00G6MMK&source=lk_donation&medium=referral" target="_blank">
  <img src="https://nowpayments.io/images/embeds/donation-button-white.svg" alt="Crypto donation button by NOWPayments">
</a>

If you find Perplexica useful, consider giving us a star on GitHub. This helps more people discover Perplexica and supports the development of new features. Your support is appreciated.

## Contribution

Perplexica is built on the idea that AI and large language models should be easy for everyone to use. If you find bugs or have ideas, please share them via GitHub Issues. For more information on contributing to Perplexica, read the [CONTRIBUTING.md](CONTRIBUTING.md) file.

## Help and Support
## Acknowledgements

If you have any questions or feedback, please feel free to reach out to us. You can create an issue on GitHub or join our Discord server. There, you can connect with other users, share your experiences and reviews, and receive more personalized help. [Click here](https://discord.gg/EFwsmQDgAu) to join the Discord server. To discuss matters outside of regular support, feel free to contact me on Discord at `itzcrazykns`.
Inspired by Perplexity AI, Perplexica aims to provide a similar service but always up-to-date and fully open source, thanks to SearxNG.

Thank you for exploring Perplexica, the AI-powered search engine designed to enhance your search experience. We are constantly working to improve Perplexica and expand its capabilities. We value your feedback and contributions, which help us make Perplexica even better. Don't forget to check back for updates and new features!
If you have any queries you can reach me via my Discord - `itzcrazykns`. Thanks for checking out Perplexica.
@@ -1,20 +1,16 @@
FROM nikolaik/python-nodejs:python3.12-nodejs20-bullseye
FROM node:alpine

ARG SEARXNG_API_URL
ENV SEARXNG_API_URL=${SEARXNG_API_URL}

WORKDIR /home/perplexica

COPY src /home/perplexica/src
COPY tsconfig.json /home/perplexica/
COPY config.toml /home/perplexica/
COPY drizzle.config.ts /home/perplexica/
COPY .env /home/perplexica/
COPY package.json /home/perplexica/
COPY yarn.lock /home/perplexica/

RUN sed -i "s|SEARXNG = \".*\"|SEARXNG = \"${SEARXNG_API_URL}\"|g" /home/perplexica/config.toml

RUN mkdir /home/perplexica/data

RUN yarn install
RUN yarn build
data/.gitignore (vendored, 2 deletions)
@@ -1,2 +0,0 @@
*
!.gitignore
@@ -1,14 +1,14 @@
services:
  searxng:
    image: docker.io/searxng/searxng:latest
    volumes:
      - ./searxng:/etc/searxng:rw
    build:
      context: .
      dockerfile: searxng.dockerfile
    expose:
      - 4000
    ports:
      - 4000:8080
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-backend:
    build:
      context: .
@@ -17,15 +17,12 @@ services:
      - SEARXNG_API_URL=http://searxng:8080
    depends_on:
      - searxng
    expose:
      - 3001
    ports:
      - 3001:3001
    volumes:
      - backend-dbstore:/home/perplexica/data
    extra_hosts:
      - 'host.docker.internal:host-gateway'
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-frontend:
    build:
@@ -36,14 +33,12 @@ services:
      - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
    depends_on:
      - perplexica-backend
    expose:
      - 3000
    ports:
      - 3000:3000
    networks:
      - perplexica-network
    restart: unless-stopped

networks:
  perplexica-network:

volumes:
  backend-dbstore:
@@ -1,11 +0,0 @@
## Perplexica's Architecture

Perplexica's architecture consists of the following key components:

1. **User Interface**: A web-based interface that allows users to interact with Perplexica for searching images, videos, and much more.
2. **Agent/Chains**: These components predict Perplexica's next actions, understand user queries, and decide whether a web search is necessary.
3. **SearXNG**: A metasearch engine used by Perplexica to search the web for sources.
4. **LLMs (Large Language Models)**: Utilized by agents and chains for tasks like understanding content, writing responses, and citing sources. Examples include Claude, GPTs, etc.
5. **Embedding Models**: To improve the accuracy of search results, embedding models re-rank the results using similarity search algorithms such as cosine similarity and dot product distance.

For a more detailed explanation of how these components work together, see [WORKING.md](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/WORKING.md).
@@ -1,19 +0,0 @@
## How does Perplexica work?

Curious about how Perplexica works? Don't worry, we'll cover it here. Before we begin, make sure you've read about the architecture of Perplexica to ensure you understand what it's made up of. Haven't read it? You can read it [here](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/architecture/README.md).

We'll understand how Perplexica works by taking an example of a scenario where a user asks: "How does an A.C. work?". We'll break down the process into steps to make it easier to understand. The steps are as follows:

1. The message is sent via WS to the backend server where it invokes the chain. The chain will depend on your focus mode. For this example, let's assume we use the "webSearch" focus mode.
2. The chain is now invoked; first, the message is passed to another chain where it first predicts (using the chat history and the question) whether there is a need for sources and searching the web. If there is, it will generate a query (in accordance with the chat history) for searching the web that we'll take up later. If not, the chain will end there, and then the answer generator chain, also known as the response generator, will be started.
3. The query returned by the first chain is passed to SearXNG to search the web for information.
4. The information retrieved at this stage comes from keyword-based search alone. We then convert both the information and the query into embeddings and perform a similarity search to find the sources most relevant to the query.
5. After all this is done, the sources are passed to the response generator. This chain takes all the chat history, the query, and the sources. It generates a response that is streamed to the UI.

### How are the answers cited?

The LLMs are prompted to do so. We've prompted them so well that they cite the answers themselves, and using some UI magic, we display it to the user.

### Image and Video Search

Image and video searches are conducted in a similar manner. A query is always generated first, then we search the web for images and videos that match the query. These results are then returned to the user.
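Step 4 is the re-ranking step that also shows up in the `src/agents` diffs later in this compare. A condensed sketch of it, assuming a LangChain `Embeddings` instance and the `computeSimilarity` helper sketched earlier:

```ts
import { Document } from '@langchain/core/documents';
import type { Embeddings } from '@langchain/core/embeddings';
import computeSimilarity from '../utils/computeSimilarity';

// Re-rank keyword search results by embedding similarity to the query
// (condensed from the rerankDocs logic visible in the agent diffs below).
const rerank = async (query: string, docs: Document[], embeddings: Embeddings) => {
  if (docs.length === 0) return docs;

  // Embed the documents and the query in parallel.
  const [docEmbeddings, queryEmbedding] = await Promise.all([
    embeddings.embedDocuments(docs.map((d) => d.pageContent)),
    embeddings.embedQuery(query),
  ]);

  // Score each document against the query and keep the top 15 sources.
  return docEmbeddings
    .map((emb, i) => ({ index: i, similarity: computeSimilarity(queryEmbedding, emb) }))
    .sort((a, b) => b.similarity - a.similarity)
    .slice(0, 15)
    .map((s) => docs[s.index]);
};
```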
@@ -1,109 +0,0 @@
# Expose Perplexica to a network

This guide will show you how to make Perplexica available over a network. Follow these steps to allow computers on the same network to interact with Perplexica. Choose the instructions that match the operating system you are using.

## Windows

1. Open PowerShell as Administrator

2. Navigate to the directory containing the `docker-compose.yaml` file

3. Stop and remove the existing Perplexica containers and images:

   ```
   docker compose down --rmi all
   ```

4. Open the `docker-compose.yaml` file in a text editor like Notepad++

5. Replace `127.0.0.1` with the IP address of the server Perplexica is running on in these two lines:

   ```
   args:
     - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
     - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
   ```

6. Save and close the `docker-compose.yaml` file

7. Rebuild and restart the Perplexica container:

   ```
   docker compose up -d --build
   ```

## macOS

1. Open the Terminal application

2. Navigate to the directory with the `docker-compose.yaml` file:

   ```
   cd /path/to/docker-compose.yaml
   ```

3. Stop and remove existing containers and images:

   ```
   docker compose down --rmi all
   ```

4. Open `docker-compose.yaml` in a text editor like Sublime Text:

   ```
   nano docker-compose.yaml
   ```

5. Replace `127.0.0.1` with the server IP in these lines:

   ```
   args:
     - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
     - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
   ```

6. Save and exit the editor

7. Rebuild and restart Perplexica:

   ```
   docker compose up -d --build
   ```

## Linux

1. Open the terminal

2. Navigate to the `docker-compose.yaml` directory:

   ```
   cd /path/to/docker-compose.yaml
   ```

3. Stop and remove containers and images:

   ```
   docker compose down --rmi all
   ```

4. Edit `docker-compose.yaml`:

   ```
   nano docker-compose.yaml
   ```

5. Replace `127.0.0.1` with the server IP:

   ```
   args:
     - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
     - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
   ```

6. Save and exit the editor

7. Rebuild and restart Perplexica:

   ```
   docker compose up -d --build
   ```
@@ -1,10 +0,0 @@
import { defineConfig } from 'drizzle-kit';

export default defineConfig({
  dialect: 'sqlite',
  schema: './src/db/schema.ts',
  out: './drizzle',
  dbCredentials: {
    url: './data/db.sqlite',
  },
});
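The deleted config above points drizzle-kit at `./data/db.sqlite`. For context, the runtime side of that pairing looks roughly like the sketch below, assuming the `better-sqlite3` driver listed in `package.json`; the actual `src/db` module may differ.

```ts
import Database from 'better-sqlite3';
import { drizzle } from 'drizzle-orm/better-sqlite3';

// Open the same database file the deleted drizzle.config.ts pointed at.
const sqlite = new Database('./data/db.sqlite');
export const db = drizzle(sqlite);
```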
package.json (17 changes)
@@ -1,42 +1,33 @@
{
  "name": "perplexica-backend",
  "version": "1.7.0",
  "version": "1.0.0",
  "license": "MIT",
  "author": "ItzCrazyKns",
  "scripts": {
    "start": "npm run db:push && node dist/app.js",
    "start": "node --env-file=.env dist/app.js",
    "build": "tsc",
    "dev": "nodemon src/app.ts",
    "db:push": "drizzle-kit push sqlite",
    "dev": "nodemon -r dotenv/config src/app.ts",
    "format": "prettier . --check",
    "format:write": "prettier . --write"
  },
  "devDependencies": {
    "@types/better-sqlite3": "^7.6.10",
    "@types/cors": "^2.8.17",
    "@types/express": "^4.17.21",
    "@types/readable-stream": "^4.0.11",
    "drizzle-kit": "^0.22.7",
    "nodemon": "^3.1.0",
    "prettier": "^3.2.5",
    "ts-node": "^10.9.2",
    "typescript": "^5.4.3"
  },
  "dependencies": {
    "@iarna/toml": "^2.2.5",
    "@langchain/openai": "^0.0.25",
    "@xenova/transformers": "^2.17.1",
    "axios": "^1.6.8",
    "better-sqlite3": "^11.0.0",
    "compute-cosine-similarity": "^1.1.0",
    "compute-dot": "^1.1.0",
    "cors": "^2.8.5",
    "dotenv": "^16.4.5",
    "drizzle-orm": "^0.31.2",
    "express": "^4.19.2",
    "langchain": "^0.1.30",
    "winston": "^3.13.0",
    "ws": "^8.17.1",
    "ws": "^8.16.0",
    "zod": "^3.22.4"
  }
}
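Both replacement scripts load `.env` before any application code runs: `node --env-file=.env` does it natively (Node 20.6+), while `nodemon -r dotenv/config` preloads the `dotenv` package. A hypothetical startup guard, assuming the variables from the new `.env.example`:

```ts
// Hypothetical startup guard (not part of the diff): with either script above,
// these variables are already on process.env before module code executes.
const required = ['PORT', 'OLLAMA_URL', 'SIMILARITY_MEASURE', 'MODEL_NAME'];

for (const name of required) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}
```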
@@ -1,11 +0,0 @@
[GENERAL]
PORT = 3001 # Port to run the server on
SIMILARITY_MEASURE = "cosine" # "cosine" or "dot"

[API_KEYS]
OPENAI = "" # OpenAI API key - sk-1234567890abcdef1234567890abcdef
GROQ = "" # Groq API key - gsk_1234567890abcdef1234567890abcdef

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768" # SearxNG API URL
OLLAMA = "" # Ollama API URL - http://host.docker.internal:11434
@@ -1071,6 +1071,25 @@ engines:
    require_api_key: false
    results: HTML

  - name: azlyrics
    shortcut: lyrics
    engine: xpath
    timeout: 4.0
    disabled: true
    categories: [music, lyrics]
    paging: true
    search_url: https://search.azlyrics.com/search.php?q={query}&w=lyrics&p={pageno}
    url_xpath: //td[@class="text-left visitedlyr"]/a/@href
    title_xpath: //span/b/text()
    content_xpath: //td[@class="text-left visitedlyr"]/a/small
    about:
      website: https://azlyrics.com
      wikidata_id: Q66372542
      official_api_documentation:
      use_official_api: false
      require_api_key: false
      results: HTML

  - name: mastodon users
    engine: mastodon
    mastodon_type: accounts
@@ -1550,6 +1569,11 @@ engines:
    shortcut: scc
    disabled: true

  - name: framalibre
    engine: framalibre
    shortcut: frl
    disabled: true

  # - name: searx
  #   engine: searx_engine
  #   shortcut: se
searxng.dockerfile (new file, +3)
@@ -0,0 +1,3 @@
FROM searxng/searxng

COPY searxng-settings.yml /etc/searxng/settings.yml
@@ -1,3 +0,0 @@
[botdetection.ip_limit]
# activate link_token method in the ip_limit method
link_token = true
@@ -1,50 +0,0 @@
[uwsgi]
# Who will run the code
uid = searxng
gid = searxng

# Number of workers (usually CPU count)
# default value: %k (= number of CPU core, see Dockerfile)
workers = %k

# Number of threads per worker
# default value: 4 (see Dockerfile)
threads = 4

# The right granted on the created socket
chmod-socket = 666

# Plugin to use and interpreter config
single-interpreter = true
master = true
plugin = python3
lazy-apps = true
enable-threads = 4

# Module to import
module = searx.webapp

# Virtualenv and python path
pythonpath = /usr/local/searxng/
chdir = /usr/local/searxng/searx/

# automatically set processes name to something meaningful
auto-procname = true

# Disable request logging for privacy
disable-logging = true
log-5xx = true

# Set the max size of a request (request-body excluded)
buffer-size = 8192

# No keep alive
# See https://github.com/searx/searx-docker/issues/24
add-header = Connection: close

# uwsgi serves the static files
static-map = /static=/usr/local/searxng/searx/static
# expires set to one day
static-expires = /* 86400
static-gzip-all = True
offload-threads = 4
@@ -9,16 +9,33 @@ import {
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { Ollama } from '@langchain/community/llms/ollama';
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { Document } from '@langchain/core/documents';
import { searchSearxng } from '../lib/searxng';
import { searchSearxng } from '../core/searxng';
import type { StreamEvent } from '@langchain/core/tracers/log_stream';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import formatChatHistoryAsString from '../utils/formatHistory';
import eventEmitter from 'events';
import computeSimilarity from '../utils/computeSimilarity';
import logger from '../utils/logger';

const chatLLM = new ChatOllama({
  baseUrl: process.env.OLLAMA_URL,
  model: process.env.MODEL_NAME,
  temperature: 0.7,
});

const llm = new Ollama({
  temperature: 0,
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const embeddings = new OllamaEmbeddings({
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const basicAcademicSearchRetrieverPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question if needed so it is a standalone question that can be used by the LLM to search the web for information.
@@ -42,7 +59,7 @@ Rephrased question:
`;

const basicAcademicSearchResponsePrompt = `
You are Perplexica, an AI model who is expert at searching the web and answering user's queries. You are set on focus mode 'Academic', this means you will be searching for academic papers and articles on the web.
You are Perplexica, an AI model who is expert at searching the web and answering user's queries. You are set on focus mode 'Acadedemic', this means you will be searching for academic papers and articles on the web.

Generate a response that is informative and relevant to the user's query based on provided context (the context consits of search results containg a brief description of the content of that page).
You must use this context to answer the user's query in the best way possible. Use an unbaised and journalistic tone in your response. Do not repeat the text.
@@ -97,139 +114,122 @@ const handleStream = async (
  }
};

const processDocs = async (docs: Document[]) => {
  return docs
    .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
    .join('\n');
};

const rerankDocs = async ({
  query,
  docs,
}: {
  query: string;
  docs: Document[];
}) => {
  if (docs.length === 0) {
    return docs;
  }

  const docsWithContent = docs.filter(
    (doc) => doc.pageContent && doc.pageContent.length > 0,
  );

  const docEmbeddings = await embeddings.embedDocuments(
    docsWithContent.map((doc) => doc.pageContent),
  );

  const queryEmbedding = await embeddings.embedQuery(query);

  const similarity = docEmbeddings.map((docEmbedding, i) => {
    const sim = computeSimilarity(queryEmbedding, docEmbedding);

    return {
      index: i,
      similarity: sim,
    };
  });

  const sortedDocs = similarity
    .sort((a, b) => b.similarity - a.similarity)
    .slice(0, 15)
    .map((sim) => docsWithContent[sim.index]);

  return sortedDocs;
};

type BasicChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

const createBasicAcademicSearchRetrieverChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    PromptTemplate.fromTemplate(basicAcademicSearchRetrieverPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      if (input === 'not_needed') {
        return { query: '', docs: [] };
      }

      const res = await searchSearxng(input, {
        language: 'en',
        engines: [
          'arxiv',
          'google scholar',
          'internetarchivescholar',
          'pubmed',
        ],
      });

      const documents = res.results.map(
        (result) =>
          new Document({
            pageContent: result.content,
            metadata: {
              title: result.title,
              url: result.url,
              ...(result.img_src && { img_src: result.img_src }),
            },
          }),
      );

      return { query: input, docs: documents };
    }),
  ]);
};

const createBasicAcademicSearchAnsweringChain = (
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const basicAcademicSearchRetrieverChain =
    createBasicAcademicSearchRetrieverChain(llm);

  const processDocs = async (docs: Document[]) => {
    return docs
      .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
      .join('\n');
  };

  const rerankDocs = async ({
    query,
    docs,
  }: {
    query: string;
    docs: Document[];
  }) => {
    if (docs.length === 0) {
      return docs;
const basicAcademicSearchRetrieverChain = RunnableSequence.from([
  PromptTemplate.fromTemplate(basicAcademicSearchRetrieverPrompt),
  llm,
  strParser,
  RunnableLambda.from(async (input: string) => {
    if (input === 'not_needed') {
      return { query: '', docs: [] };
    }

    const docsWithContent = docs.filter(
      (doc) => doc.pageContent && doc.pageContent.length > 0,
    );

    const [docEmbeddings, queryEmbedding] = await Promise.all([
      embeddings.embedDocuments(docsWithContent.map((doc) => doc.pageContent)),
      embeddings.embedQuery(query),
    ]);

    const similarity = docEmbeddings.map((docEmbedding, i) => {
      const sim = computeSimilarity(queryEmbedding, docEmbedding);

      return {
        index: i,
        similarity: sim,
      };
    const res = await searchSearxng(input, {
      language: 'en',
      engines: [
        'arxiv',
        'google_scholar',
        'internet_archive_scholar',
        'pubmed',
      ],
    });

    const sortedDocs = similarity
      .sort((a, b) => b.similarity - a.similarity)
      .slice(0, 15)
      .map((sim) => docsWithContent[sim.index]);

    return sortedDocs;
  };

  return RunnableSequence.from([
    RunnableMap.from({
      query: (input: BasicChainInput) => input.query,
      chat_history: (input: BasicChainInput) => input.chat_history,
      context: RunnableSequence.from([
        (input) => ({
          query: input.query,
          chat_history: formatChatHistoryAsString(input.chat_history),
    const documents = res.results.map(
      (result) =>
        new Document({
          pageContent: result.content,
          metadata: {
            title: result.title,
            url: result.url,
            ...(result.img_src && { img_src: result.img_src }),
          },
        }),
        basicAcademicSearchRetrieverChain
          .pipe(rerankDocs)
          .withConfig({
            runName: 'FinalSourceRetriever',
          })
          .pipe(processDocs),
      ]),
    }),
    ChatPromptTemplate.fromMessages([
      ['system', basicAcademicSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
  ]).withConfig({
    runName: 'FinalResponseGenerator',
  });
};
    );

const basicAcademicSearch = (
  query: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
    return { query: input, docs: documents };
  }),
]);

const basicAcademicSearchAnsweringChain = RunnableSequence.from([
  RunnableMap.from({
    query: (input: BasicChainInput) => input.query,
    chat_history: (input: BasicChainInput) => input.chat_history,
    context: RunnableSequence.from([
      (input) => ({
        query: input.query,
        chat_history: formatChatHistoryAsString(input.chat_history),
      }),
      basicAcademicSearchRetrieverChain
        .pipe(rerankDocs)
        .withConfig({
          runName: 'FinalSourceRetriever',
        })
        .pipe(processDocs),
    ]),
  }),
  ChatPromptTemplate.fromMessages([
    ['system', basicAcademicSearchResponsePrompt],
    new MessagesPlaceholder('chat_history'),
    ['user', '{query}'],
  ]),
  chatLLM,
  strParser,
]).withConfig({
  runName: 'FinalResponseGenerator',
});

const basicAcademicSearch = (query: string, history: BaseMessage[]) => {
  const emitter = new eventEmitter();

  try {
    const basicAcademicSearchAnsweringChain =
      createBasicAcademicSearchAnsweringChain(llm, embeddings);

    const stream = basicAcademicSearchAnsweringChain.streamEvents(
      {
        chat_history: history,
@@ -246,19 +246,14 @@ const basicAcademicSearch = (
      'error',
      JSON.stringify({ data: 'An error has occurred please try again later' }),
    );
    logger.error(`Error in academic search: ${err}`);
    console.error(err);
  }

  return emitter;
};

const handleAcademicSearch = (
  message: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const emitter = basicAcademicSearch(message, history, llm, embeddings);
const handleAcademicSearch = (message: string, history: BaseMessage[]) => {
  const emitter = basicAcademicSearch(message, history);
  return emitter;
};
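The left-hand (v1.7.0) side of this diff injects the model and embeddings into the handler instead of constructing Ollama clients at module load, which is what lets the settings dialog swap providers at runtime. A minimal invocation sketch for that signature; note that the default export and the `data`/`end` event names are assumptions (only the `error` emit is visible in the diff), and the WebSocket layer normally supplies the models.

```ts
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import handleAcademicSearch from './agents/academicSearchAgent';

const llm = new ChatOllama({ baseUrl: 'http://localhost:11434', model: 'llama2' });
const embeddings = new OllamaEmbeddings({ baseUrl: 'http://localhost:11434', model: 'llama2' });

// The handler returns an EventEmitter that streams the response.
const emitter = handleAcademicSearch('What is quantum entanglement?', [], llm, embeddings);
emitter.on('data', (data: string) => console.log(JSON.parse(data)));
emitter.on('error', (err: string) => console.error(JSON.parse(err)));
emitter.on('end', () => console.log('done'));
```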
@@ -4,11 +4,17 @@ import {
  RunnableLambda,
} from '@langchain/core/runnables';
import { PromptTemplate } from '@langchain/core/prompts';
import { Ollama } from '@langchain/community/llms/ollama';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { searchSearxng } from '../lib/searxng';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { searchSearxng } from '../core/searxng';

const llm = new Ollama({
  temperature: 0,
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const imageSearchChainPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question so it is a standalone question that can be used by the LLM to search the web for images.
@@ -38,47 +44,38 @@ type ImageSearchChainInput = {

const strParser = new StringOutputParser();

const createImageSearchChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: ImageSearchChainInput) => {
        return formatChatHistoryAsString(input.chat_history);
      },
      query: (input: ImageSearchChainInput) => {
        return input.query;
      },
    }),
    PromptTemplate.fromTemplate(imageSearchChainPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      const res = await searchSearxng(input, {
        engines: ['bing images', 'google images'],
      });
const imageSearchChain = RunnableSequence.from([
  RunnableMap.from({
    chat_history: (input: ImageSearchChainInput) => {
      return formatChatHistoryAsString(input.chat_history);
    },
    query: (input: ImageSearchChainInput) => {
      return input.query;
    },
  }),
  PromptTemplate.fromTemplate(imageSearchChainPrompt),
  llm,
  strParser,
  RunnableLambda.from(async (input: string) => {
    const res = await searchSearxng(input, {
      categories: ['images'],
      engines: ['bing_images', 'google_images'],
    });

      const images = [];
    const images = [];

      res.results.forEach((result) => {
        if (result.img_src && result.url && result.title) {
          images.push({
            img_src: result.img_src,
            url: result.url,
            title: result.title,
          });
        }
      });
    res.results.forEach((result) => {
      if (result.img_src && result.url && result.title) {
        images.push({
          img_src: result.img_src,
          url: result.url,
          title: result.title,
        });
      }
    });

      return images.slice(0, 10);
    }),
  ]);
};
    return images.slice(0, 10);
  }),
]);

const handleImageSearch = (
  input: ImageSearchChainInput,
  llm: BaseChatModel,
) => {
  const imageSearchChain = createImageSearchChain(llm);
  return imageSearchChain.invoke(input);
};

export default handleImageSearch;
export default imageSearchChain;
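On the v1.7.0 side, `handleImageSearch` takes the chain input plus an injected chat model and resolves to at most ten `{ img_src, url, title }` records. A brief usage sketch, reusing the hypothetical `llm` from the previous example:

```ts
import handleImageSearch from './agents/imageSearchAgent'; // assuming the default export shown above

const images = await handleImageSearch(
  { query: 'aurora borealis', chat_history: [] },
  llm, // the ChatOllama instance from the earlier sketch
);
// images: up to 10 entries of { img_src, url, title }, per the chain above
```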
@@ -9,16 +9,33 @@ import {
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { Ollama } from '@langchain/community/llms/ollama';
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { Document } from '@langchain/core/documents';
import { searchSearxng } from '../lib/searxng';
import { searchSearxng } from '../core/searxng';
import type { StreamEvent } from '@langchain/core/tracers/log_stream';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import formatChatHistoryAsString from '../utils/formatHistory';
import eventEmitter from 'events';
import computeSimilarity from '../utils/computeSimilarity';
import logger from '../utils/logger';

const chatLLM = new ChatOllama({
  baseUrl: process.env.OLLAMA_URL,
  model: process.env.MODEL_NAME,
  temperature: 0.7,
});

const llm = new Ollama({
  temperature: 0,
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const embeddings = new OllamaEmbeddings({
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const basicRedditSearchRetrieverPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question if needed so it is a standalone question that can be used by the LLM to search the web for information.
@@ -97,134 +114,118 @@ const handleStream = async (
  }
};

const processDocs = async (docs: Document[]) => {
  return docs
    .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
    .join('\n');
};

const rerankDocs = async ({
  query,
  docs,
}: {
  query: string;
  docs: Document[];
}) => {
  if (docs.length === 0) {
    return docs;
  }

  const docsWithContent = docs.filter(
    (doc) => doc.pageContent && doc.pageContent.length > 0,
  );

  const docEmbeddings = await embeddings.embedDocuments(
    docsWithContent.map((doc) => doc.pageContent),
  );

  const queryEmbedding = await embeddings.embedQuery(query);

  const similarity = docEmbeddings.map((docEmbedding, i) => {
    const sim = computeSimilarity(queryEmbedding, docEmbedding);

    return {
      index: i,
      similarity: sim,
    };
  });

  const sortedDocs = similarity
    .sort((a, b) => b.similarity - a.similarity)
    .slice(0, 15)
    .filter((sim) => sim.similarity > 0.3)
    .map((sim) => docsWithContent[sim.index]);

  return sortedDocs;
};

type BasicChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

const createBasicRedditSearchRetrieverChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    PromptTemplate.fromTemplate(basicRedditSearchRetrieverPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      if (input === 'not_needed') {
        return { query: '', docs: [] };
      }

      const res = await searchSearxng(input, {
        language: 'en',
        engines: ['reddit'],
      });

      const documents = res.results.map(
        (result) =>
          new Document({
            pageContent: result.content ? result.content : result.title,
            metadata: {
              title: result.title,
              url: result.url,
              ...(result.img_src && { img_src: result.img_src }),
            },
          }),
      );

      return { query: input, docs: documents };
    }),
  ]);
};

const createBasicRedditSearchAnsweringChain = (
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const basicRedditSearchRetrieverChain =
    createBasicRedditSearchRetrieverChain(llm);

  const processDocs = async (docs: Document[]) => {
    return docs
      .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
      .join('\n');
  };

  const rerankDocs = async ({
    query,
    docs,
  }: {
    query: string;
    docs: Document[];
  }) => {
    if (docs.length === 0) {
      return docs;
const basicRedditSearchRetrieverChain = RunnableSequence.from([
  PromptTemplate.fromTemplate(basicRedditSearchRetrieverPrompt),
  llm,
  strParser,
  RunnableLambda.from(async (input: string) => {
    if (input === 'not_needed') {
      return { query: '', docs: [] };
    }

    const docsWithContent = docs.filter(
      (doc) => doc.pageContent && doc.pageContent.length > 0,
    );

    const [docEmbeddings, queryEmbedding] = await Promise.all([
      embeddings.embedDocuments(docsWithContent.map((doc) => doc.pageContent)),
      embeddings.embedQuery(query),
    ]);

    const similarity = docEmbeddings.map((docEmbedding, i) => {
      const sim = computeSimilarity(queryEmbedding, docEmbedding);

      return {
        index: i,
        similarity: sim,
      };
    const res = await searchSearxng(input, {
      language: 'en',
      engines: ['reddit'],
    });

    const sortedDocs = similarity
      .sort((a, b) => b.similarity - a.similarity)
      .slice(0, 15)
      .filter((sim) => sim.similarity > 0.3)
      .map((sim) => docsWithContent[sim.index]);

    return sortedDocs;
  };

  return RunnableSequence.from([
    RunnableMap.from({
      query: (input: BasicChainInput) => input.query,
      chat_history: (input: BasicChainInput) => input.chat_history,
      context: RunnableSequence.from([
        (input) => ({
          query: input.query,
          chat_history: formatChatHistoryAsString(input.chat_history),
    const documents = res.results.map(
      (result) =>
        new Document({
          pageContent: result.content ? result.content : result.title,
          metadata: {
            title: result.title,
            url: result.url,
            ...(result.img_src && { img_src: result.img_src }),
          },
        }),
        basicRedditSearchRetrieverChain
          .pipe(rerankDocs)
          .withConfig({
            runName: 'FinalSourceRetriever',
          })
          .pipe(processDocs),
      ]),
    }),
    ChatPromptTemplate.fromMessages([
      ['system', basicRedditSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
  ]).withConfig({
    runName: 'FinalResponseGenerator',
  });
};
    );

const basicRedditSearch = (
  query: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
    return { query: input, docs: documents };
  }),
]);

const basicRedditSearchAnsweringChain = RunnableSequence.from([
  RunnableMap.from({
    query: (input: BasicChainInput) => input.query,
    chat_history: (input: BasicChainInput) => input.chat_history,
    context: RunnableSequence.from([
      (input) => ({
        query: input.query,
        chat_history: formatChatHistoryAsString(input.chat_history),
      }),
      basicRedditSearchRetrieverChain
        .pipe(rerankDocs)
        .withConfig({
          runName: 'FinalSourceRetriever',
        })
        .pipe(processDocs),
    ]),
  }),
  ChatPromptTemplate.fromMessages([
    ['system', basicRedditSearchResponsePrompt],
    new MessagesPlaceholder('chat_history'),
    ['user', '{query}'],
  ]),
  chatLLM,
  strParser,
]).withConfig({
  runName: 'FinalResponseGenerator',
});

const basicRedditSearch = (query: string, history: BaseMessage[]) => {
  const emitter = new eventEmitter();

  try {
    const basicRedditSearchAnsweringChain =
      createBasicRedditSearchAnsweringChain(llm, embeddings);
    const stream = basicRedditSearchAnsweringChain.streamEvents(
      {
        chat_history: history,
@@ -241,19 +242,14 @@ const basicRedditSearch = (
      'error',
      JSON.stringify({ data: 'An error has occurred please try again later' }),
    );
    logger.error(`Error in RedditSearch: ${err}`);
    console.error(err);
  }

  return emitter;
};

const handleRedditSearch = (
  message: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const emitter = basicRedditSearch(message, history, llm, embeddings);
const handleRedditSearch = (message: string, history: BaseMessage[]) => {
  const emitter = basicRedditSearch(message, history);
  return emitter;
};
@@ -1,55 +0,0 @@
import { RunnableSequence, RunnableMap } from '@langchain/core/runnables';
import ListLineOutputParser from '../lib/outputParsers/listLineOutputParser';
import { PromptTemplate } from '@langchain/core/prompts';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { ChatOpenAI } from '@langchain/openai';

const suggestionGeneratorPrompt = `
You are an AI suggestion generator for an AI powered search engine. You will be given a conversation below. You need to generate 4-5 suggestions based on the conversation. The suggestion should be relevant to the conversation that can be used by the user to ask the chat model for more information.
You need to make sure the suggestions are relevant to the conversation and are helpful to the user. Keep a note that the user might use these suggestions to ask a chat model for more information.
Make sure the suggestions are medium in length and are informative and relevant to the conversation.

Provide these suggestions separated by newlines between the XML tags <suggestions> and </suggestions>. For example:

<suggestions>
Tell me more about SpaceX and their recent projects
What is the latest news on SpaceX?
Who is the CEO of SpaceX?
</suggestions>

Conversation:
{chat_history}
`;

type SuggestionGeneratorInput = {
  chat_history: BaseMessage[];
};

const outputParser = new ListLineOutputParser({
  key: 'suggestions',
});

const createSuggestionGeneratorChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: SuggestionGeneratorInput) =>
        formatChatHistoryAsString(input.chat_history),
    }),
    PromptTemplate.fromTemplate(suggestionGeneratorPrompt),
    llm,
    outputParser,
  ]);
};

const generateSuggestions = (
  input: SuggestionGeneratorInput,
  llm: BaseChatModel,
) => {
  (llm as ChatOpenAI).temperature = 0;
  const suggestionGeneratorChain = createSuggestionGeneratorChain(llm);
  return suggestionGeneratorChain.invoke(input);
};

export default generateSuggestions;
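Note the `(llm as ChatOpenAI).temperature = 0` cast: the deleted module pins the model to deterministic output before generating suggestions. A hypothetical call, reusing the `llm` from the earlier sketches and a `history: BaseMessage[]` array:

```ts
import generateSuggestions from './agents/suggestionGeneratorAgent';

const suggestions = await generateSuggestions({ chat_history: history }, llm);
// suggestions: string[] parsed from the <suggestions>…</suggestions> block
```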
@@ -1,90 +0,0 @@
import {
  RunnableSequence,
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { PromptTemplate } from '@langchain/core/prompts';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { searchSearxng } from '../lib/searxng';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';

const VideoSearchChainPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question so it is a standalone question that can be used by the LLM to search Youtube for videos.
You need to make sure the rephrased question agrees with the conversation and is relevant to the conversation.

Example:
1. Follow up question: How does a car work?
Rephrased: How does a car work?

2. Follow up question: What is the theory of relativity?
Rephrased: What is theory of relativity

3. Follow up question: How does an AC work?
Rephrased: How does an AC work

Conversation:
{chat_history}

Follow up question: {query}
Rephrased question:
`;

type VideoSearchChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

const strParser = new StringOutputParser();

const createVideoSearchChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: VideoSearchChainInput) => {
        return formatChatHistoryAsString(input.chat_history);
      },
      query: (input: VideoSearchChainInput) => {
        return input.query;
      },
    }),
    PromptTemplate.fromTemplate(VideoSearchChainPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      const res = await searchSearxng(input, {
        engines: ['youtube'],
      });

      const videos = [];

      res.results.forEach((result) => {
        if (
          result.thumbnail &&
          result.url &&
          result.title &&
          result.iframe_src
        ) {
          videos.push({
            img_src: result.thumbnail,
            url: result.url,
            title: result.title,
            iframe_src: result.iframe_src,
          });
        }
      });

      return videos.slice(0, 10);
    }),
  ]);
};

const handleVideoSearch = (
  input: VideoSearchChainInput,
  llm: BaseChatModel,
) => {
  const VideoSearchChain = createVideoSearchChain(llm);
  return VideoSearchChain.invoke(input);
};

export default handleVideoSearch;
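Likewise, a hypothetical caller for the deleted video search chain; model choice and messages are illustrative. The chain rephrases the follow-up, queries SearXNG's youtube engine, and returns up to ten { img_src, url, title, iframe_src } entries:

import { ChatOpenAI } from '@langchain/openai';
import { HumanMessage } from '@langchain/core/messages';
import handleVideoSearch from './agents/videoSearchAgent';

const llm = new ChatOpenAI({ modelName: 'gpt-3.5-turbo', temperature: 0 });

handleVideoSearch(
  {
    chat_history: [new HumanMessage('We were discussing jet engines')],
    query: 'How does a turbofan work?',
  },
  llm,
).then((videos) => console.log(videos.length, 'videos found'));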
@@ -9,16 +9,33 @@ import {
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { Ollama } from '@langchain/community/llms/ollama';
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { Document } from '@langchain/core/documents';
import { searchSearxng } from '../lib/searxng';
import { searchSearxng } from '../core/searxng';
import type { StreamEvent } from '@langchain/core/tracers/log_stream';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import formatChatHistoryAsString from '../utils/formatHistory';
import eventEmitter from 'events';
import computeSimilarity from '../utils/computeSimilarity';
import logger from '../utils/logger';

const chatLLM = new ChatOllama({
  baseUrl: process.env.OLLAMA_URL,
  model: process.env.MODEL_NAME,
  temperature: 0.7,
});

const llm = new Ollama({
  temperature: 0,
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const embeddings = new OllamaEmbeddings({
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const basicSearchRetrieverPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question if needed so it is a standalone question that can be used by the LLM to search the web for information.
@@ -97,135 +114,117 @@ const handleStream = async (
  }
};

const processDocs = async (docs: Document[]) => {
  return docs
    .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
    .join('\n');
};

const rerankDocs = async ({
  query,
  docs,
}: {
  query: string;
  docs: Document[];
}) => {
  if (docs.length === 0) {
    return docs;
  }

  const docsWithContent = docs.filter(
    (doc) => doc.pageContent && doc.pageContent.length > 0,
  );

  const docEmbeddings = await embeddings.embedDocuments(
    docsWithContent.map((doc) => doc.pageContent),
  );

  const queryEmbedding = await embeddings.embedQuery(query);

  const similarity = docEmbeddings.map((docEmbedding, i) => {
    const sim = computeSimilarity(queryEmbedding, docEmbedding);

    return {
      index: i,
      similarity: sim,
    };
  });

  const sortedDocs = similarity
    .sort((a, b) => b.similarity - a.similarity)
    .filter((sim) => sim.similarity > 0.5)
    .slice(0, 15)
    .map((sim) => docsWithContent[sim.index]);

  return sortedDocs;
};

type BasicChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

const createBasicWebSearchRetrieverChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    PromptTemplate.fromTemplate(basicSearchRetrieverPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      if (input === 'not_needed') {
        return { query: '', docs: [] };
      }

      const res = await searchSearxng(input, {
        language: 'en',
      });

      const documents = res.results.map(
        (result) =>
          new Document({
            pageContent: result.content,
            metadata: {
              title: result.title,
              url: result.url,
              ...(result.img_src && { img_src: result.img_src }),
            },
          }),
      );

      return { query: input, docs: documents };
    }),
  ]);
};

const createBasicWebSearchAnsweringChain = (
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const basicWebSearchRetrieverChain = createBasicWebSearchRetrieverChain(llm);

  const processDocs = async (docs: Document[]) => {
    return docs
      .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
      .join('\n');
  };

  const rerankDocs = async ({
    query,
    docs,
  }: {
    query: string;
    docs: Document[];
  }) => {
    if (docs.length === 0) {
      return docs;
const basicWebSearchRetrieverChain = RunnableSequence.from([
  PromptTemplate.fromTemplate(basicSearchRetrieverPrompt),
  llm,
  strParser,
  RunnableLambda.from(async (input: string) => {
    if (input === 'not_needed') {
      return { query: '', docs: [] };
    }

    const docsWithContent = docs.filter(
      (doc) => doc.pageContent && doc.pageContent.length > 0,
    );

    const [docEmbeddings, queryEmbedding] = await Promise.all([
      embeddings.embedDocuments(docsWithContent.map((doc) => doc.pageContent)),
      embeddings.embedQuery(query),
    ]);

    const similarity = docEmbeddings.map((docEmbedding, i) => {
      const sim = computeSimilarity(queryEmbedding, docEmbedding);

      return {
        index: i,
        similarity: sim,
      };
    const res = await searchSearxng(input, {
      language: 'en',
    });

    const sortedDocs = similarity
      .sort((a, b) => b.similarity - a.similarity)
      .filter((sim) => sim.similarity > 0.5)
      .slice(0, 15)
      .map((sim) => docsWithContent[sim.index]);

    return sortedDocs;
  };

  return RunnableSequence.from([
    RunnableMap.from({
      query: (input: BasicChainInput) => input.query,
      chat_history: (input: BasicChainInput) => input.chat_history,
      context: RunnableSequence.from([
        (input) => ({
          query: input.query,
          chat_history: formatChatHistoryAsString(input.chat_history),
    const documents = res.results.map(
      (result) =>
        new Document({
          pageContent: result.content,
          metadata: {
            title: result.title,
            url: result.url,
            ...(result.img_src && { img_src: result.img_src }),
          },
        }),
        basicWebSearchRetrieverChain
          .pipe(rerankDocs)
          .withConfig({
            runName: 'FinalSourceRetriever',
          })
          .pipe(processDocs),
      ]),
    }),
    ChatPromptTemplate.fromMessages([
      ['system', basicWebSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
  ]).withConfig({
    runName: 'FinalResponseGenerator',
  });
};
    );

const basicWebSearch = (
  query: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
    return { query: input, docs: documents };
  }),
]);

const basicWebSearchAnsweringChain = RunnableSequence.from([
  RunnableMap.from({
    query: (input: BasicChainInput) => input.query,
    chat_history: (input: BasicChainInput) => input.chat_history,
    context: RunnableSequence.from([
      (input) => ({
        query: input.query,
        chat_history: formatChatHistoryAsString(input.chat_history),
      }),
      basicWebSearchRetrieverChain
        .pipe(rerankDocs)
        .withConfig({
          runName: 'FinalSourceRetriever',
        })
        .pipe(processDocs),
    ]),
  }),
  ChatPromptTemplate.fromMessages([
    ['system', basicWebSearchResponsePrompt],
    new MessagesPlaceholder('chat_history'),
    ['user', '{query}'],
  ]),
  chatLLM,
  strParser,
]).withConfig({
  runName: 'FinalResponseGenerator',
});

const basicWebSearch = (query: string, history: BaseMessage[]) => {
  const emitter = new eventEmitter();

  try {
    const basicWebSearchAnsweringChain = createBasicWebSearchAnsweringChain(
      llm,
      embeddings,
    );

    const stream = basicWebSearchAnsweringChain.streamEvents(
      {
        chat_history: history,
@@ -242,19 +241,14 @@ const basicWebSearch = (
        'error',
        JSON.stringify({ data: 'An error has occurred please try again later' }),
      );
    logger.error(`Error in websearch: ${err}`);
    console.error(err);
  }

  return emitter;
};

const handleWebSearch = (
  message: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const emitter = basicWebSearch(message, history, llm, embeddings);
const handleWebSearch = (message: string, history: BaseMessage[]) => {
  const emitter = basicWebSearch(message, history);
  return emitter;
};
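Both sides of this diff return a Node EventEmitter rather than a promise. A sketch of a consumer, assuming the usual 'data'/'end' counterparts to the 'error' event emitted above (the wiring is illustrative, not code from either branch):

import { EventEmitter } from 'events';

// Hypothetical consumer of the emitter returned by handleWebSearch:
// forward streamed chunks to a client as they arrive.
const attach = (emitter: EventEmitter, send: (msg: string) => void) => {
  emitter.on('data', (data: string) => send(data)); // sources/response chunks
  emitter.on('error', (err: string) => send(err)); // serialized error payload
  emitter.on('end', () => send(JSON.stringify({ type: 'messageEnd' })));
};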
@@ -9,15 +9,26 @@ import {
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { Ollama } from '@langchain/community/llms/ollama';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { Document } from '@langchain/core/documents';
import { searchSearxng } from '../lib/searxng';
import { searchSearxng } from '../core/searxng';
import type { StreamEvent } from '@langchain/core/tracers/log_stream';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import formatChatHistoryAsString from '../utils/formatHistory';
import eventEmitter from 'events';
import logger from '../utils/logger';

const chatLLM = new ChatOllama({
  baseUrl: process.env.OLLAMA_URL,
  model: process.env.MODEL_NAME,
  temperature: 0.7,
});

const llm = new Ollama({
  temperature: 0,
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const basicWolframAlphaSearchRetrieverPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question if needed so it is a standalone question that can be used by the LLM to search the web for information.
@@ -96,94 +107,81 @@ const handleStream = async (
  }
};

const processDocs = async (docs: Document[]) => {
  return docs
    .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
    .join('\n');
};

type BasicChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

const createBasicWolframAlphaSearchRetrieverChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    PromptTemplate.fromTemplate(basicWolframAlphaSearchRetrieverPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      if (input === 'not_needed') {
        return { query: '', docs: [] };
      }
const basicWolframAlphaSearchRetrieverChain = RunnableSequence.from([
  PromptTemplate.fromTemplate(basicWolframAlphaSearchRetrieverPrompt),
  llm,
  strParser,
  RunnableLambda.from(async (input: string) => {
    if (input === 'not_needed') {
      return { query: '', docs: [] };
    }

      const res = await searchSearxng(input, {
        language: 'en',
        engines: ['wolframalpha'],
      });
    const res = await searchSearxng(input, {
      language: 'en',
      engines: ['wolframalpha'],
    });

    const documents = res.results.map(
      (result) =>
        new Document({
          pageContent: result.content,
          metadata: {
            title: result.title,
            url: result.url,
            ...(result.img_src && { img_src: result.img_src }),
          },
        }),
    );

    return { query: input, docs: documents };
  }),
]);
};

const createBasicWolframAlphaSearchAnsweringChain = (llm: BaseChatModel) => {
  const basicWolframAlphaSearchRetrieverChain =
    createBasicWolframAlphaSearchRetrieverChain(llm);

  const processDocs = (docs: Document[]) => {
    return docs
      .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
      .join('\n');
  };

  return RunnableSequence.from([
    RunnableMap.from({
      query: (input: BasicChainInput) => input.query,
      chat_history: (input: BasicChainInput) => input.chat_history,
      context: RunnableSequence.from([
        (input) => ({
          query: input.query,
          chat_history: formatChatHistoryAsString(input.chat_history),
    const documents = res.results.map(
      (result) =>
        new Document({
          pageContent: result.content,
          metadata: {
            title: result.title,
            url: result.url,
            ...(result.img_src && { img_src: result.img_src }),
          },
        }),
        basicWolframAlphaSearchRetrieverChain
          .pipe(({ query, docs }) => {
            return docs;
          })
          .withConfig({
            runName: 'FinalSourceRetriever',
          })
          .pipe(processDocs),
      ]),
    }),
    ChatPromptTemplate.fromMessages([
      ['system', basicWolframAlphaSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
  ]).withConfig({
    runName: 'FinalResponseGenerator',
  });
};
);

const basicWolframAlphaSearch = (
  query: string,
  history: BaseMessage[],
  llm: BaseChatModel,
) => {
    return { query: input, docs: documents };
  }),
]);

const basicWolframAlphaSearchAnsweringChain = RunnableSequence.from([
  RunnableMap.from({
    query: (input: BasicChainInput) => input.query,
    chat_history: (input: BasicChainInput) => input.chat_history,
    context: RunnableSequence.from([
      (input) => ({
        query: input.query,
        chat_history: formatChatHistoryAsString(input.chat_history),
      }),
      basicWolframAlphaSearchRetrieverChain
        .pipe(({ query, docs }) => {
          return docs;
        })
        .withConfig({
          runName: 'FinalSourceRetriever',
        })
        .pipe(processDocs),
    ]),
  }),
  ChatPromptTemplate.fromMessages([
    ['system', basicWolframAlphaSearchResponsePrompt],
    new MessagesPlaceholder('chat_history'),
    ['user', '{query}'],
  ]),
  chatLLM,
  strParser,
]).withConfig({
  runName: 'FinalResponseGenerator',
});

const basicWolframAlphaSearch = (query: string, history: BaseMessage[]) => {
  const emitter = new eventEmitter();

  try {
    const basicWolframAlphaSearchAnsweringChain =
      createBasicWolframAlphaSearchAnsweringChain(llm);
    const stream = basicWolframAlphaSearchAnsweringChain.streamEvents(
      {
        chat_history: history,
@@ -200,19 +198,14 @@ const basicWolframAlphaSearch = (
        'error',
        JSON.stringify({ data: 'An error has occurred please try again later' }),
      );
    logger.error(`Error in WolframAlphaSearch: ${err}`);
    console.error(err);
  }

  return emitter;
};

const handleWolframAlphaSearch = (
  message: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const emitter = basicWolframAlphaSearch(message, history, llm);
const handleWolframAlphaSearch = (message: string, history: BaseMessage[]) => {
  const emitter = basicWolframAlphaSearch(message, history);
  return emitter;
};
@@ -4,12 +4,16 @@ import {
  MessagesPlaceholder,
} from '@langchain/core/prompts';
import { RunnableSequence } from '@langchain/core/runnables';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { StringOutputParser } from '@langchain/core/output_parsers';
import type { StreamEvent } from '@langchain/core/tracers/log_stream';
import eventEmitter from 'events';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import logger from '../utils/logger';

const chatLLM = new ChatOllama({
  baseUrl: process.env.OLLAMA_URL,
  model: process.env.MODEL_NAME,
  temperature: 0.7,
});

const writingAssistantPrompt = `
You are Perplexica, an AI model who is expert at searching the web and answering user's queries. You are currently set on focus mode 'Writing Assistant', this means you will be helping the user write a response to a given query.
@@ -41,30 +45,22 @@ const handleStream = async (
  }
};

const createWritingAssistantChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    ChatPromptTemplate.fromMessages([
      ['system', writingAssistantPrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
  ]).withConfig({
    runName: 'FinalResponseGenerator',
  });
};
const writingAssistantChain = RunnableSequence.from([
  ChatPromptTemplate.fromMessages([
    ['system', writingAssistantPrompt],
    new MessagesPlaceholder('chat_history'),
    ['user', '{query}'],
  ]),
  chatLLM,
  strParser,
]).withConfig({
  runName: 'FinalResponseGenerator',
});

const handleWritingAssistant = (
  query: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
const handleWritingAssistant = (query: string, history: BaseMessage[]) => {
  const emitter = new eventEmitter();

  try {
    const writingAssistantChain = createWritingAssistantChain(llm);
    const stream = writingAssistantChain.streamEvents(
      {
        chat_history: history,
@@ -81,7 +77,7 @@ const handleWritingAssistant = (
        'error',
        JSON.stringify({ data: 'An error has occurred please try again later' }),
      );
    logger.error(`Error in writing assistant: ${err}`);
    console.error(err);
  }

  return emitter;
@@ -9,16 +9,33 @@ import {
  RunnableMap,
  RunnableLambda,
} from '@langchain/core/runnables';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { Ollama } from '@langchain/community/llms/ollama';
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { Document } from '@langchain/core/documents';
import { searchSearxng } from '../lib/searxng';
import { searchSearxng } from '../core/searxng';
import type { StreamEvent } from '@langchain/core/tracers/log_stream';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import formatChatHistoryAsString from '../utils/formatHistory';
import eventEmitter from 'events';
import computeSimilarity from '../utils/computeSimilarity';
import logger from '../utils/logger';

const chatLLM = new ChatOllama({
  baseUrl: process.env.OLLAMA_URL,
  model: process.env.MODEL_NAME,
  temperature: 0.7,
});

const llm = new Ollama({
  temperature: 0,
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const embeddings = new OllamaEmbeddings({
  model: process.env.MODEL_NAME,
  baseUrl: process.env.OLLAMA_URL,
});

const basicYoutubeSearchRetrieverPrompt = `
You will be given a conversation below and a follow up question. You need to rephrase the follow-up question if needed so it is a standalone question that can be used by the LLM to search the web for information.
@@ -97,135 +114,118 @@ const handleStream = async (
  }
};

const processDocs = async (docs: Document[]) => {
  return docs
    .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
    .join('\n');
};

const rerankDocs = async ({
  query,
  docs,
}: {
  query: string;
  docs: Document[];
}) => {
  if (docs.length === 0) {
    return docs;
  }

  const docsWithContent = docs.filter(
    (doc) => doc.pageContent && doc.pageContent.length > 0,
  );

  const docEmbeddings = await embeddings.embedDocuments(
    docsWithContent.map((doc) => doc.pageContent),
  );

  const queryEmbedding = await embeddings.embedQuery(query);

  const similarity = docEmbeddings.map((docEmbedding, i) => {
    const sim = computeSimilarity(queryEmbedding, docEmbedding);

    return {
      index: i,
      similarity: sim,
    };
  });

  const sortedDocs = similarity
    .sort((a, b) => b.similarity - a.similarity)
    .slice(0, 15)
    .filter((sim) => sim.similarity > 0.3)
    .map((sim) => docsWithContent[sim.index]);

  return sortedDocs;
};

type BasicChainInput = {
  chat_history: BaseMessage[];
  query: string;
};

const createBasicYoutubeSearchRetrieverChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    PromptTemplate.fromTemplate(basicYoutubeSearchRetrieverPrompt),
    llm,
    strParser,
    RunnableLambda.from(async (input: string) => {
      if (input === 'not_needed') {
        return { query: '', docs: [] };
      }

      const res = await searchSearxng(input, {
        language: 'en',
        engines: ['youtube'],
      });

      const documents = res.results.map(
        (result) =>
          new Document({
            pageContent: result.content ? result.content : result.title,
            metadata: {
              title: result.title,
              url: result.url,
              ...(result.img_src && { img_src: result.img_src }),
            },
          }),
      );

      return { query: input, docs: documents };
    }),
  ]);
};

const createBasicYoutubeSearchAnsweringChain = (
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const basicYoutubeSearchRetrieverChain =
    createBasicYoutubeSearchRetrieverChain(llm);

  const processDocs = async (docs: Document[]) => {
    return docs
      .map((_, index) => `${index + 1}. ${docs[index].pageContent}`)
      .join('\n');
  };

  const rerankDocs = async ({
    query,
    docs,
  }: {
    query: string;
    docs: Document[];
  }) => {
    if (docs.length === 0) {
      return docs;
const basicYoutubeSearchRetrieverChain = RunnableSequence.from([
  PromptTemplate.fromTemplate(basicYoutubeSearchRetrieverPrompt),
  llm,
  strParser,
  RunnableLambda.from(async (input: string) => {
    if (input === 'not_needed') {
      return { query: '', docs: [] };
    }

    const docsWithContent = docs.filter(
      (doc) => doc.pageContent && doc.pageContent.length > 0,
    );

    const [docEmbeddings, queryEmbedding] = await Promise.all([
      embeddings.embedDocuments(docsWithContent.map((doc) => doc.pageContent)),
      embeddings.embedQuery(query),
    ]);

    const similarity = docEmbeddings.map((docEmbedding, i) => {
      const sim = computeSimilarity(queryEmbedding, docEmbedding);

      return {
        index: i,
        similarity: sim,
      };
    const res = await searchSearxng(input, {
      language: 'en',
      engines: ['youtube'],
    });

    const sortedDocs = similarity
      .sort((a, b) => b.similarity - a.similarity)
      .slice(0, 15)
      .filter((sim) => sim.similarity > 0.3)
      .map((sim) => docsWithContent[sim.index]);

    return sortedDocs;
  };

  return RunnableSequence.from([
    RunnableMap.from({
      query: (input: BasicChainInput) => input.query,
      chat_history: (input: BasicChainInput) => input.chat_history,
      context: RunnableSequence.from([
        (input) => ({
          query: input.query,
          chat_history: formatChatHistoryAsString(input.chat_history),
    const documents = res.results.map(
      (result) =>
        new Document({
          pageContent: result.content ? result.content : result.title,
          metadata: {
            title: result.title,
            url: result.url,
            ...(result.img_src && { img_src: result.img_src }),
          },
        }),
        basicYoutubeSearchRetrieverChain
          .pipe(rerankDocs)
          .withConfig({
            runName: 'FinalSourceRetriever',
          })
          .pipe(processDocs),
      ]),
    }),
    ChatPromptTemplate.fromMessages([
      ['system', basicYoutubeSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
  ]).withConfig({
    runName: 'FinalResponseGenerator',
  });
};
);

const basicYoutubeSearch = (
  query: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
    return { query: input, docs: documents };
  }),
]);

const basicYoutubeSearchAnsweringChain = RunnableSequence.from([
  RunnableMap.from({
    query: (input: BasicChainInput) => input.query,
    chat_history: (input: BasicChainInput) => input.chat_history,
    context: RunnableSequence.from([
      (input) => ({
        query: input.query,
        chat_history: formatChatHistoryAsString(input.chat_history),
      }),
      basicYoutubeSearchRetrieverChain
        .pipe(rerankDocs)
        .withConfig({
          runName: 'FinalSourceRetriever',
        })
        .pipe(processDocs),
    ]),
  }),
  ChatPromptTemplate.fromMessages([
    ['system', basicYoutubeSearchResponsePrompt],
    new MessagesPlaceholder('chat_history'),
    ['user', '{query}'],
  ]),
  chatLLM,
  strParser,
]).withConfig({
  runName: 'FinalResponseGenerator',
});

const basicYoutubeSearch = (query: string, history: BaseMessage[]) => {
  const emitter = new eventEmitter();

  try {
    const basicYoutubeSearchAnsweringChain =
      createBasicYoutubeSearchAnsweringChain(llm, embeddings);

    const stream = basicYoutubeSearchAnsweringChain.streamEvents(
      {
        chat_history: history,
@@ -242,19 +242,14 @@ const basicYoutubeSearch = (
        'error',
        JSON.stringify({ data: 'An error has occurred please try again later' }),
      );
    logger.error(`Error in youtube search: ${err}`);
    console.error(err);
  }

  return emitter;
};

const handleYoutubeSearch = (
  message: string,
  history: BaseMessage[],
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
  const emitter = basicYoutubeSearch(message, history, llm, embeddings);
const handleYoutubeSearch = (message: string, history: BaseMessage[]) => {
  const emitter = basicYoutubeSearch(message, history);
  return emitter;
};
@@ -3,10 +3,6 @@ import express from 'express';
import cors from 'cors';
import http from 'http';
import routes from './routes';
import { getPort } from './config';
import logger from './utils/logger';

const port = getPort();

const app = express();
const server = http.createServer(app);
@@ -23,8 +19,8 @@ app.get('/api', (_, res) => {
  res.status(200).json({ status: 'ok' });
});

server.listen(port, () => {
  logger.info(`Server is running on port ${port}`);
server.listen(process.env.PORT!, () => {
  console.log(`API server started on port ${process.env.PORT}`);
});

startWebSocketServer(server);
@@ -1,69 +0,0 @@
import fs from 'fs';
import path from 'path';
import toml from '@iarna/toml';

const configFileName = 'config.toml';

interface Config {
  GENERAL: {
    PORT: number;
    SIMILARITY_MEASURE: string;
  };
  API_KEYS: {
    OPENAI: string;
    GROQ: string;
  };
  API_ENDPOINTS: {
    SEARXNG: string;
    OLLAMA: string;
  };
}

type RecursivePartial<T> = {
  [P in keyof T]?: RecursivePartial<T[P]>;
};

const loadConfig = () =>
  toml.parse(
    fs.readFileSync(path.join(__dirname, `../${configFileName}`), 'utf-8'),
  ) as any as Config;

export const getPort = () => loadConfig().GENERAL.PORT;

export const getSimilarityMeasure = () =>
  loadConfig().GENERAL.SIMILARITY_MEASURE;

export const getOpenaiApiKey = () => loadConfig().API_KEYS.OPENAI;

export const getGroqApiKey = () => loadConfig().API_KEYS.GROQ;

export const getSearxngApiEndpoint = () => loadConfig().API_ENDPOINTS.SEARXNG;

export const getOllamaApiEndpoint = () => loadConfig().API_ENDPOINTS.OLLAMA;

export const updateConfig = (config: RecursivePartial<Config>) => {
  const currentConfig = loadConfig();

  for (const key in currentConfig) {
    if (!config[key]) config[key] = {};

    if (typeof currentConfig[key] === 'object' && currentConfig[key] !== null) {
      for (const nestedKey in currentConfig[key]) {
        if (
          !config[key][nestedKey] &&
          currentConfig[key][nestedKey] &&
          config[key][nestedKey] !== ''
        ) {
          config[key][nestedKey] = currentConfig[key][nestedKey];
        }
      }
    } else if (currentConfig[key] && config[key] !== '') {
      config[key] = currentConfig[key];
    }
  }

  fs.writeFileSync(
    path.join(__dirname, `../${configFileName}`),
    toml.stringify(config),
  );
};
69 src/core/agentPicker.ts Normal file
@@ -0,0 +1,69 @@
import { z } from 'zod';
import { OpenAI } from '@langchain/openai';
import { RunnableSequence } from '@langchain/core/runnables';
import { StructuredOutputParser } from 'langchain/output_parsers';
import { PromptTemplate } from '@langchain/core/prompts';

const availableAgents = [
  {
    name: 'webSearch',
    description:
      'It is expert at searching the web for information and answering user queries',
  },
  /* {
    name: 'academicSearch',
    description:
      'It is expert at searching academic databases for information and answering user queries. It is particularly good at finding research papers and articles on topics like science, engineering, and technology. Use this instead of wolframAlphaSearch if the user query is not mathematical or scientific in nature',
  },
  {
    name: 'youtubeSearch',
    description:
      'This model is expert at finding videos on youtube based on user queries',
  },
  {
    name: 'wolframAlphaSearch',
    description:
      'This model is expert at finding answers to mathematical and scientific questions based on user queries.',
  },
  {
    name: 'redditSearch',
    description:
      'This model is expert at finding posts and discussions on reddit based on user queries',
  },
  {
    name: 'writingAssistant',
    description:
      'If there is no need for searching, this model is expert at generating text based on user queries',
  }, */
];

const parser = StructuredOutputParser.fromZodSchema(
  z.object({
    agent: z.string().describe('The name of the selected agent'),
  }),
);

const prompt = `
You are an AI model who is expert at finding suitable agents for user queries. The available agents are:
${availableAgents.map((agent) => `- ${agent.name}: ${agent.description}`).join('\n')}

Your task is to find the most suitable agent for the following query: {query}

{format_instructions}
`;

const chain = RunnableSequence.from([
  PromptTemplate.fromTemplate(prompt),
  new OpenAI({ temperature: 0 }),
  parser,
]);

const pickSuitableAgent = async (query: string) => {
  const res = await chain.invoke({
    query,
    format_instructions: parser.getFormatInstructions(),
  });
  return res.agent;
};

export default pickSuitableAgent;
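A hypothetical caller for this new agent picker; it needs OPENAI_API_KEY in the environment because the chain instantiates OpenAI directly:

import pickSuitableAgent from './core/agentPicker';

// Hypothetical routing step: map a raw user query to an agent name.
pickSuitableAgent('What is the integral of x^2?').then((agent) => {
  console.log(agent); // 'webSearch' while the other agents stay commented out
});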
@@ -1,5 +1,4 @@
import axios from 'axios';
import { getSearxngApiEndpoint } from '../config';

interface SearxngSearchOptions {
  categories?: string[];
@@ -13,19 +12,15 @@ interface SearxngSearchResult {
  url: string;
  img_src?: string;
  thumbnail_src?: string;
  thumbnail?: string;
  content?: string;
  author?: string;
  iframe_src?: string;
}

export const searchSearxng = async (
  query: string,
  opts?: SearxngSearchOptions,
) => {
  const searxngURL = getSearxngApiEndpoint();

  const url = new URL(`${searxngURL}/search?format=json`);
  const url = new URL(`${process.env.SEARXNG_API_URL}/search?format=json`);
  url.searchParams.append('q', query);

  if (opts) {
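A sketch of calling the env-configured searchSearxng; the import path assumes a script sitting next to src/core, and the options mirror the ones the agents above actually pass:

import { searchSearxng } from './core/searxng';

// Hypothetical call against the SEARXNG_API_URL-configured instance.
const demo = async () => {
  const res = await searchSearxng('theory of relativity', {
    language: 'en',
    engines: ['wolframalpha'],
  });
  console.log(res.results[0]?.title, res.results[0]?.url);
};

demo();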
@@ -1,10 +0,0 @@
import { drizzle } from 'drizzle-orm/better-sqlite3';
import Database from 'better-sqlite3';
import * as schema from './schema';

const sqlite = new Database('data/db.sqlite');
const db = drizzle(sqlite, {
  schema: schema,
});

export default db;
@@ -1,19 +0,0 @@
import { text, integer, sqliteTable } from 'drizzle-orm/sqlite-core';

export const messages = sqliteTable('messages', {
  id: integer('id').primaryKey(),
  content: text('content').notNull(),
  chatId: text('chatId').notNull(),
  messageId: text('messageId').notNull(),
  role: text('type', { enum: ['assistant', 'user'] }),
  metadata: text('metadata', {
    mode: 'json',
  }),
});

export const chats = sqliteTable('chats', {
  id: text('id').primaryKey(),
  title: text('title').notNull(),
  createdAt: text('createdAt').notNull(),
  focusMode: text('focusMode').notNull(),
});
@@ -1,82 +0,0 @@
import { Embeddings, type EmbeddingsParams } from '@langchain/core/embeddings';
import { chunkArray } from '@langchain/core/utils/chunk_array';

export interface HuggingFaceTransformersEmbeddingsParams
  extends EmbeddingsParams {
  modelName: string;

  model: string;

  timeout?: number;

  batchSize?: number;

  stripNewLines?: boolean;
}

export class HuggingFaceTransformersEmbeddings
  extends Embeddings
  implements HuggingFaceTransformersEmbeddingsParams
{
  modelName = 'Xenova/all-MiniLM-L6-v2';

  model = 'Xenova/all-MiniLM-L6-v2';

  batchSize = 512;

  stripNewLines = true;

  timeout?: number;

  private pipelinePromise: Promise<any>;

  constructor(fields?: Partial<HuggingFaceTransformersEmbeddingsParams>) {
    super(fields ?? {});

    this.modelName = fields?.model ?? fields?.modelName ?? this.model;
    this.model = this.modelName;
    this.stripNewLines = fields?.stripNewLines ?? this.stripNewLines;
    this.timeout = fields?.timeout;
  }

  async embedDocuments(texts: string[]): Promise<number[][]> {
    const batches = chunkArray(
      this.stripNewLines ? texts.map((t) => t.replace(/\n/g, ' ')) : texts,
      this.batchSize,
    );

    const batchRequests = batches.map((batch) => this.runEmbedding(batch));
    const batchResponses = await Promise.all(batchRequests);
    const embeddings: number[][] = [];

    for (let i = 0; i < batchResponses.length; i += 1) {
      const batchResponse = batchResponses[i];
      for (let j = 0; j < batchResponse.length; j += 1) {
        embeddings.push(batchResponse[j]);
      }
    }

    return embeddings;
  }

  async embedQuery(text: string): Promise<number[]> {
    const data = await this.runEmbedding([
      this.stripNewLines ? text.replace(/\n/g, ' ') : text,
    ]);
    return data[0];
  }

  private async runEmbedding(texts: string[]) {
    const { pipeline } = await import('@xenova/transformers');

    const pipe = await (this.pipelinePromise ??= pipeline(
      'feature-extraction',
      this.model,
    ));

    return this.caller.call(async () => {
      const output = await pipe(texts, { pooling: 'mean', normalize: true });
      return output.tolist();
    });
  }
}
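A minimal check of this deleted local-embeddings wrapper; no API key is needed, since @xenova/transformers downloads the model on first use (the dimension noted in the comment assumes the bge-small model):

import { HuggingFaceTransformersEmbeddings } from './lib/huggingfaceTransformer';

const embeddings = new HuggingFaceTransformersEmbeddings({
  modelName: 'Xenova/bge-small-en-v1.5',
});

embeddings.embedQuery('ollama local models').then((vector) => {
  console.log(vector.length); // e.g. 384 dimensions for bge-small
});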
@@ -1,43 +0,0 @@
import { BaseOutputParser } from '@langchain/core/output_parsers';

interface LineListOutputParserArgs {
  key?: string;
}

class LineListOutputParser extends BaseOutputParser<string[]> {
  private key = 'questions';

  constructor(args?: LineListOutputParserArgs) {
    super();
    this.key = args?.key ?? this.key;
  }

  static lc_name() {
    return 'LineListOutputParser';
  }

  lc_namespace = ['langchain', 'output_parsers', 'line_list_output_parser'];

  async parse(text: string): Promise<string[]> {
    const regex = /^(\s*(-|\*|\d+\.\s|\d+\)\s|\u2022)\s*)+/;
    const startKeyIndex = text.indexOf(`<${this.key}>`);
    const endKeyIndex = text.indexOf(`</${this.key}>`);
    const questionsStartIndex =
      startKeyIndex === -1 ? 0 : startKeyIndex + `<${this.key}>`.length;
    const questionsEndIndex = endKeyIndex === -1 ? text.length : endKeyIndex;
    const lines = text
      .slice(questionsStartIndex, questionsEndIndex)
      .trim()
      .split('\n')
      .filter((line) => line.trim() !== '')
      .map((line) => line.replace(regex, ''));

    return lines;
  }

  getFormatInstructions(): string {
    throw new Error('Not implemented.');
  }
}

export default LineListOutputParser;
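A small illustration of this parser's behavior, matching the <suggestions> key the suggestion generator above configured it with:

import LineListOutputParser from './lib/outputParsers/listLineOutputParser';

// The parser slices out the tagged block, splits it into lines,
// and strips leading bullet/number markers from each line.
const parser = new LineListOutputParser({ key: 'suggestions' });

parser
  .parse('<suggestions>\n- First idea\n- Second idea\n</suggestions>')
  .then((lines) => console.log(lines)); // ['First idea', 'Second idea']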
@@ -1,187 +0,0 @@
import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import { HuggingFaceTransformersEmbeddings } from './huggingfaceTransformer';
import {
  getGroqApiKey,
  getOllamaApiEndpoint,
  getOpenaiApiKey,
} from '../config';
import logger from '../utils/logger';

export const getAvailableChatModelProviders = async () => {
  const openAIApiKey = getOpenaiApiKey();
  const groqApiKey = getGroqApiKey();
  const ollamaEndpoint = getOllamaApiEndpoint();

  const models = {};

  if (openAIApiKey) {
    try {
      models['openai'] = {
        'GPT-3.5 turbo': new ChatOpenAI({
          openAIApiKey,
          modelName: 'gpt-3.5-turbo',
          temperature: 0.7,
        }),
        'GPT-4': new ChatOpenAI({
          openAIApiKey,
          modelName: 'gpt-4',
          temperature: 0.7,
        }),
        'GPT-4 turbo': new ChatOpenAI({
          openAIApiKey,
          modelName: 'gpt-4-turbo',
          temperature: 0.7,
        }),
        'GPT-4 omni': new ChatOpenAI({
          openAIApiKey,
          modelName: 'gpt-4o',
          temperature: 0.7,
        }),
      };
    } catch (err) {
      logger.error(`Error loading OpenAI models: ${err}`);
    }
  }

  if (groqApiKey) {
    try {
      models['groq'] = {
        'LLaMA3 8b': new ChatOpenAI(
          {
            openAIApiKey: groqApiKey,
            modelName: 'llama3-8b-8192',
            temperature: 0.7,
          },
          {
            baseURL: 'https://api.groq.com/openai/v1',
          },
        ),
        'LLaMA3 70b': new ChatOpenAI(
          {
            openAIApiKey: groqApiKey,
            modelName: 'llama3-70b-8192',
            temperature: 0.7,
          },
          {
            baseURL: 'https://api.groq.com/openai/v1',
          },
        ),
        'Mixtral 8x7b': new ChatOpenAI(
          {
            openAIApiKey: groqApiKey,
            modelName: 'mixtral-8x7b-32768',
            temperature: 0.7,
          },
          {
            baseURL: 'https://api.groq.com/openai/v1',
          },
        ),
        'Gemma 7b': new ChatOpenAI(
          {
            openAIApiKey: groqApiKey,
            modelName: 'gemma-7b-it',
            temperature: 0.7,
          },
          {
            baseURL: 'https://api.groq.com/openai/v1',
          },
        ),
      };
    } catch (err) {
      logger.error(`Error loading Groq models: ${err}`);
    }
  }

  if (ollamaEndpoint) {
    try {
      const response = await fetch(`${ollamaEndpoint}/api/tags`, {
        headers: {
          'Content-Type': 'application/json',
        },
      });

      const { models: ollamaModels } = (await response.json()) as any;

      models['ollama'] = ollamaModels.reduce((acc, model) => {
        acc[model.model] = new ChatOllama({
          baseUrl: ollamaEndpoint,
          model: model.model,
          temperature: 0.7,
        });
        return acc;
      }, {});
    } catch (err) {
      logger.error(`Error loading Ollama models: ${err}`);
    }
  }

  models['custom_openai'] = {};

  return models;
};

export const getAvailableEmbeddingModelProviders = async () => {
  const openAIApiKey = getOpenaiApiKey();
  const ollamaEndpoint = getOllamaApiEndpoint();

  const models = {};

  if (openAIApiKey) {
    try {
      models['openai'] = {
        'Text embedding 3 small': new OpenAIEmbeddings({
          openAIApiKey,
          modelName: 'text-embedding-3-small',
        }),
        'Text embedding 3 large': new OpenAIEmbeddings({
          openAIApiKey,
          modelName: 'text-embedding-3-large',
        }),
      };
    } catch (err) {
      logger.error(`Error loading OpenAI embeddings: ${err}`);
    }
  }

  if (ollamaEndpoint) {
    try {
      const response = await fetch(`${ollamaEndpoint}/api/tags`, {
        headers: {
          'Content-Type': 'application/json',
        },
      });

      const { models: ollamaModels } = (await response.json()) as any;

      models['ollama'] = ollamaModels.reduce((acc, model) => {
        acc[model.model] = new OllamaEmbeddings({
          baseUrl: ollamaEndpoint,
          model: model.model,
        });
        return acc;
      }, {});
    } catch (err) {
      logger.error(`Error loading Ollama embeddings: ${err}`);
    }
  }

  try {
    models['local'] = {
      'BGE Small': new HuggingFaceTransformersEmbeddings({
        modelName: 'Xenova/bge-small-en-v1.5',
      }),
      'GTE Small': new HuggingFaceTransformersEmbeddings({
        modelName: 'Xenova/gte-small',
      }),
      'Bert Multilingual': new HuggingFaceTransformersEmbeddings({
        modelName: 'Xenova/bert-base-multilingual-uncased',
      }),
    };
  } catch (err) {
    logger.error(`Error loading local embeddings: ${err}`);
  }

  return models;
};
@@ -1,43 +0,0 @@
import express from 'express';
import logger from '../utils/logger';
import db from '../db/index';
import { eq } from 'drizzle-orm';
import { chats, messages } from '../db/schema';

const router = express.Router();

router.get('/', async (_, res) => {
  try {
    let chats = await db.query.chats.findMany();

    chats = chats.reverse();

    return res.status(200).json({ chats: chats });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(`Error in getting chats: ${err.message}`);
  }
});

router.get('/:id', async (req, res) => {
  try {
    const chatExists = await db.query.chats.findFirst({
      where: eq(chats.id, req.params.id),
    });

    if (!chatExists) {
      return res.status(404).json({ message: 'Chat not found' });
    }

    const chatMessages = await db.query.messages.findMany({
      where: eq(messages.chatId, req.params.id),
    });

    return res.status(200).json({ chat: chatExists, messages: chatMessages });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(`Error in getting chat: ${err.message}`);
  }
});

export default router;
@@ -1,63 +0,0 @@
import express from 'express';
import {
  getAvailableChatModelProviders,
  getAvailableEmbeddingModelProviders,
} from '../lib/providers';
import {
  getGroqApiKey,
  getOllamaApiEndpoint,
  getOpenaiApiKey,
  updateConfig,
} from '../config';

const router = express.Router();

router.get('/', async (_, res) => {
  const config = {};

  const [chatModelProviders, embeddingModelProviders] = await Promise.all([
    getAvailableChatModelProviders(),
    getAvailableEmbeddingModelProviders(),
  ]);

  config['chatModelProviders'] = {};
  config['embeddingModelProviders'] = {};

  for (const provider in chatModelProviders) {
    config['chatModelProviders'][provider] = Object.keys(
      chatModelProviders[provider],
    );
  }

  for (const provider in embeddingModelProviders) {
    config['embeddingModelProviders'][provider] = Object.keys(
      embeddingModelProviders[provider],
    );
  }

  config['openaiApiKey'] = getOpenaiApiKey();
  config['ollamaApiUrl'] = getOllamaApiEndpoint();
  config['groqApiKey'] = getGroqApiKey();

  res.status(200).json(config);
});

router.post('/', async (req, res) => {
  const config = req.body;

  const updatedConfig = {
    API_KEYS: {
      OPENAI: config.openaiApiKey,
      GROQ: config.groqApiKey,
    },
    API_ENDPOINTS: {
      OLLAMA: config.ollamaApiUrl,
    },
  };

  updateConfig(updatedConfig);

  res.status(200).json({ message: 'Config updated' });
});

export default router;
@@ -1,45 +1,21 @@
import express from 'express';
import handleImageSearch from '../agents/imageSearchAgent';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { getAvailableChatModelProviders } from '../lib/providers';
import { HumanMessage, AIMessage } from '@langchain/core/messages';
import logger from '../utils/logger';
import imageSearchChain from '../agents/imageSearchAgent';

const router = express.Router();

router.post('/', async (req, res) => {
  try {
    let { query, chat_history, chat_model_provider, chat_model } = req.body;
    const { query, chat_history } = req.body;

    chat_history = chat_history.map((msg: any) => {
      if (msg.role === 'user') {
        return new HumanMessage(msg.content);
      } else if (msg.role === 'assistant') {
        return new AIMessage(msg.content);
      }
    const images = await imageSearchChain.invoke({
      query,
      chat_history,
    });

    const chatModels = await getAvailableChatModelProviders();
    const provider = chat_model_provider ?? Object.keys(chatModels)[0];
    const chatModel = chat_model ?? Object.keys(chatModels[provider])[0];

    let llm: BaseChatModel | undefined;

    if (chatModels[provider] && chatModels[provider][chatModel]) {
      llm = chatModels[provider][chatModel] as BaseChatModel | undefined;
    }

    if (!llm) {
      res.status(500).json({ message: 'Invalid LLM model selected' });
      return;
    }

    const images = await handleImageSearch({ query, chat_history }, llm);

    res.status(200).json({ images });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(`Error in image search: ${err.message}`);
    console.log(err.message);
  }
});
|
||||
import express from 'express';
|
||||
import imagesRouter from './images';
|
||||
import videosRouter from './videos';
|
||||
import configRouter from './config';
|
||||
import modelsRouter from './models';
|
||||
import suggestionsRouter from './suggestions';
|
||||
import chatsRouter from './chats';
|
||||
|
||||
const router = express.Router();
|
||||
|
||||
router.use('/images', imagesRouter);
|
||||
router.use('/videos', videosRouter);
|
||||
router.use('/config', configRouter);
|
||||
router.use('/models', modelsRouter);
|
||||
router.use('/suggestions', suggestionsRouter);
|
||||
router.use('/chats', chatsRouter);
|
||||
|
||||
export default router;
|
||||
|
@@ -1,24 +0,0 @@
import express from 'express';
import logger from '../utils/logger';
import {
  getAvailableChatModelProviders,
  getAvailableEmbeddingModelProviders,
} from '../lib/providers';

const router = express.Router();

router.get('/', async (req, res) => {
  try {
    const [chatModelProviders, embeddingModelProviders] = await Promise.all([
      getAvailableChatModelProviders(),
      getAvailableEmbeddingModelProviders(),
    ]);

    res.status(200).json({ chatModelProviders, embeddingModelProviders });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(err.message);
  }
});

export default router;
@@ -1,46 +0,0 @@
import express from 'express';
import generateSuggestions from '../agents/suggestionGeneratorAgent';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { getAvailableChatModelProviders } from '../lib/providers';
import { HumanMessage, AIMessage } from '@langchain/core/messages';
import logger from '../utils/logger';

const router = express.Router();

router.post('/', async (req, res) => {
  try {
    let { chat_history, chat_model, chat_model_provider } = req.body;

    chat_history = chat_history.map((msg: any) => {
      if (msg.role === 'user') {
        return new HumanMessage(msg.content);
      } else if (msg.role === 'assistant') {
        return new AIMessage(msg.content);
      }
    });

    const chatModels = await getAvailableChatModelProviders();
    const provider = chat_model_provider ?? Object.keys(chatModels)[0];
    const chatModel = chat_model ?? Object.keys(chatModels[provider])[0];

    let llm: BaseChatModel | undefined;

    if (chatModels[provider] && chatModels[provider][chatModel]) {
      llm = chatModels[provider][chatModel] as BaseChatModel | undefined;
    }

    if (!llm) {
      res.status(500).json({ message: 'Invalid LLM model selected' });
      return;
    }

    const suggestions = await generateSuggestions({ chat_history }, llm);

    res.status(200).json({ suggestions: suggestions });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(`Error in generating suggestions: ${err.message}`);
  }
});

export default router;
@@ -1,46 +0,0 @@
import express from 'express';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { getAvailableChatModelProviders } from '../lib/providers';
import { HumanMessage, AIMessage } from '@langchain/core/messages';
import logger from '../utils/logger';
import handleVideoSearch from '../agents/videoSearchAgent';

const router = express.Router();

router.post('/', async (req, res) => {
  try {
    let { query, chat_history, chat_model_provider, chat_model } = req.body;

    chat_history = chat_history.map((msg: any) => {
      if (msg.role === 'user') {
        return new HumanMessage(msg.content);
      } else if (msg.role === 'assistant') {
        return new AIMessage(msg.content);
      }
    });

    const chatModels = await getAvailableChatModelProviders();
    const provider = chat_model_provider ?? Object.keys(chatModels)[0];
    const chatModel = chat_model ?? Object.keys(chatModels[provider])[0];

    let llm: BaseChatModel | undefined;

    if (chatModels[provider] && chatModels[provider][chatModel]) {
      llm = chatModels[provider][chatModel] as BaseChatModel | undefined;
    }

    if (!llm) {
      res.status(500).json({ message: 'Invalid LLM model selected' });
      return;
    }

    const videos = await handleVideoSearch({ chat_history, query }, llm);

    res.status(200).json({ videos });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(`Error in video search: ${err.message}`);
  }
});

export default router;
@@ -1,13 +1,10 @@
import dot from 'compute-dot';
import cosineSimilarity from 'compute-cosine-similarity';
import { getSimilarityMeasure } from '../config';

const computeSimilarity = (x: number[], y: number[]): number => {
  const similarityMeasure = getSimilarityMeasure();

  if (similarityMeasure === 'cosine') {
  if (process.env.SIMILARITY_MEASURE === 'cosine') {
    return cosineSimilarity(x, y);
  } else if (similarityMeasure === 'dot') {
  } else if (process.env.SIMILARITY_MEASURE === 'dot') {
    return dot(x, y);
  }
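Worth noting: the two measures switched on here agree whenever the embedding vectors are L2-normalized, since cosine similarity is just the dot product of unit vectors. A self-contained sketch (hand-rolled for illustration, not using the `compute-*` packages):

```ts
// Hand-rolled equivalents of the two similarity measures used above.
const dotProduct = (x: number[], y: number[]) =>
  x.reduce((sum, xi, i) => sum + xi * y[i], 0);

const cosine = (x: number[], y: number[]) =>
  dotProduct(x, y) /
  (Math.sqrt(dotProduct(x, x)) * Math.sqrt(dotProduct(y, y)));

// For unit-length vectors the two measures coincide:
const a = [0.6, 0.8];
const b = [0.8, 0.6];
console.log(dotProduct(a, b).toFixed(4)); // 0.9600
console.log(cosine(a, b).toFixed(4));     // 0.9600
```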
@@ -1,22 +0,0 @@
import winston from 'winston';

const logger = winston.createLogger({
  level: 'info',
  transports: [
    new winston.transports.Console({
      format: winston.format.combine(
        winston.format.colorize(),
        winston.format.simple(),
      ),
    }),
    new winston.transports.File({
      filename: 'app.log',
      format: winston.format.combine(
        winston.format.timestamp(),
        winston.format.json(),
      ),
    }),
  ],
});

export default logger;
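As configured above, each call writes twice: a colorized one-liner to the console and a timestamped JSON record to `app.log`. A small usage sketch (the log bodies shown in comments are what these winston formats typically produce):

```ts
// Usage sketch for the winston logger defined above.
// Console transport prints roughly: "info: server started"
// File transport appends roughly:   {"level":"info","message":"server started","timestamp":"..."}
logger.info('server started');
logger.error(`request failed: ${new Error('timeout').message}`);
```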
@@ -1,100 +1,11 @@
import { WebSocket } from 'ws';
import { handleMessage } from './messageHandler';
import {
  getAvailableEmbeddingModelProviders,
  getAvailableChatModelProviders,
} from '../lib/providers';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import type { IncomingMessage } from 'http';
import logger from '../utils/logger';
import { ChatOpenAI } from '@langchain/openai';

export const handleConnection = async (
  ws: WebSocket,
  request: IncomingMessage,
) => {
  try {
    const searchParams = new URL(request.url, `http://${request.headers.host}`)
      .searchParams;
export const handleConnection = (ws: WebSocket) => {
  ws.on(
    'message',
    async (message) => await handleMessage(message.toString(), ws),
  );

    const [chatModelProviders, embeddingModelProviders] = await Promise.all([
      getAvailableChatModelProviders(),
      getAvailableEmbeddingModelProviders(),
    ]);

    const chatModelProvider =
      searchParams.get('chatModelProvider') ||
      Object.keys(chatModelProviders)[0];
    const chatModel =
      searchParams.get('chatModel') ||
      Object.keys(chatModelProviders[chatModelProvider])[0];

    const embeddingModelProvider =
      searchParams.get('embeddingModelProvider') ||
      Object.keys(embeddingModelProviders)[0];
    const embeddingModel =
      searchParams.get('embeddingModel') ||
      Object.keys(embeddingModelProviders[embeddingModelProvider])[0];

    let llm: BaseChatModel | undefined;
    let embeddings: Embeddings | undefined;

    if (
      chatModelProviders[chatModelProvider] &&
      chatModelProviders[chatModelProvider][chatModel] &&
      chatModelProvider != 'custom_openai'
    ) {
      llm = chatModelProviders[chatModelProvider][chatModel] as
        | BaseChatModel
        | undefined;
    } else if (chatModelProvider == 'custom_openai') {
      llm = new ChatOpenAI({
        modelName: chatModel,
        openAIApiKey: searchParams.get('openAIApiKey'),
        temperature: 0.7,
        configuration: {
          baseURL: searchParams.get('openAIBaseURL'),
        },
      });
    }

    if (
      embeddingModelProviders[embeddingModelProvider] &&
      embeddingModelProviders[embeddingModelProvider][embeddingModel]
    ) {
      embeddings = embeddingModelProviders[embeddingModelProvider][
        embeddingModel
      ] as Embeddings | undefined;
    }

    if (!llm || !embeddings) {
      ws.send(
        JSON.stringify({
          type: 'error',
          data: 'Invalid LLM or embeddings model selected, please refresh the page and try again.',
          key: 'INVALID_MODEL_SELECTED',
        }),
      );
      ws.close();
    }

    ws.on(
      'message',
      async (message) =>
        await handleMessage(message.toString(), ws, llm, embeddings),
    );

    ws.on('close', () => logger.debug('Connection closed'));
  } catch (err) {
    ws.send(
      JSON.stringify({
        type: 'error',
        data: 'Internal server error.',
        key: 'INTERNAL_SERVER_ERROR',
      }),
    );
    ws.close();
    logger.error(err);
  }
  ws.on('close', () => console.log('Connection closed'));
};
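The v1.7.0 side of this hunk resolves model choices from the connection URL's query string. A hedged client sketch of opening such a socket; the parameter names match the `searchParams.get(...)` calls above, while the host, port, and the `ollama`/`llama2` values are illustrative assumptions:

```ts
// Hedged sketch of a client opening the socket with the query params
// the connection manager above reads. Host, port, and model names are assumptions.
const params = new URLSearchParams({
  chatModelProvider: 'ollama',
  chatModel: 'llama2',
  embeddingModelProvider: 'ollama',
  embeddingModel: 'llama2',
});

const socket = new WebSocket(`ws://localhost:3001?${params.toString()}`);

socket.addEventListener('open', () => console.log('connected'));
```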
@@ -6,24 +6,11 @@ import handleWritingAssistant from '../agents/writingAssistant';
import handleWolframAlphaSearch from '../agents/wolframAlphaSearchAgent';
import handleYoutubeSearch from '../agents/youtubeSearchAgent';
import handleRedditSearch from '../agents/redditSearchAgent';
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import type { Embeddings } from '@langchain/core/embeddings';
import logger from '../utils/logger';
import db from '../db';
import { chats, messages } from '../db/schema';
import { eq } from 'drizzle-orm';
import crypto from 'crypto';

type Message = {
  messageId: string;
  chatId: string;
  content: string;
};

type WSMessage = {
  message: Message;
  copilot: boolean;
  type: string;
  content: string;
  copilot: boolean;
  focusMode: string;
  history: Array<[string, string]>;
};
@@ -40,12 +27,8 @@ const searchHandlers = {
const handleEmitterEvents = (
  emitter: EventEmitter,
  ws: WebSocket,
  messageId: string,
  chatId: string,
  id: string,
) => {
  let recievedMessage = '';
  let sources = [];

  emitter.on('data', (data) => {
    const parsedData = JSON.parse(data);
    if (parsedData.type === 'response') {
@@ -53,71 +36,39 @@ const handleEmitterEvents = (
        JSON.stringify({
          type: 'message',
          data: parsedData.data,
          messageId: messageId,
          messageId: id,
        }),
      );
      recievedMessage += parsedData.data;
    } else if (parsedData.type === 'sources') {
      ws.send(
        JSON.stringify({
          type: 'sources',
          data: parsedData.data,
          messageId: messageId,
          messageId: id,
        }),
      );
      sources = parsedData.data;
    }
  });
  emitter.on('end', () => {
    ws.send(JSON.stringify({ type: 'messageEnd', messageId: messageId }));

    db.insert(messages)
      .values({
        content: recievedMessage,
        chatId: chatId,
        messageId: messageId,
        role: 'assistant',
        metadata: JSON.stringify({
          createdAt: new Date(),
          ...(sources && sources.length > 0 && { sources }),
        }),
      })
      .execute();
    ws.send(JSON.stringify({ type: 'messageEnd', messageId: id }));
  });
  emitter.on('error', (data) => {
    const parsedData = JSON.parse(data);
    ws.send(
      JSON.stringify({
        type: 'error',
        data: parsedData.data,
        key: 'CHAIN_ERROR',
      }),
    );
    ws.send(JSON.stringify({ type: 'error', data: parsedData.data }));
  });
};

export const handleMessage = async (
  message: string,
  ws: WebSocket,
  llm: BaseChatModel,
  embeddings: Embeddings,
) => {
export const handleMessage = async (message: string, ws: WebSocket) => {
  try {
    const parsedWSMessage = JSON.parse(message) as WSMessage;
    const parsedMessage = parsedWSMessage.message;

    const id = crypto.randomBytes(7).toString('hex');
    const parsedMessage = JSON.parse(message) as Message;
    const id = Math.random().toString(36).substring(7);

    if (!parsedMessage.content)
      return ws.send(
        JSON.stringify({
          type: 'error',
          data: 'Invalid message format',
          key: 'INVALID_FORMAT',
        }),
        JSON.stringify({ type: 'error', data: 'Invalid message format' }),
      );

    const history: BaseMessage[] = parsedWSMessage.history.map((msg) => {
    const history: BaseMessage[] = parsedMessage.history.map((msg) => {
      if (msg[0] === 'human') {
        return new HumanMessage({
          content: msg[1],
@@ -129,65 +80,17 @@ export const handleMessage = async (
      }
    });

    if (parsedWSMessage.type === 'message') {
      const handler = searchHandlers[parsedWSMessage.focusMode];

    if (parsedMessage.type === 'message') {
      const handler = searchHandlers[parsedMessage.focusMode];
      if (handler) {
        const emitter = handler(
          parsedMessage.content,
          history,
          llm,
          embeddings,
        );

        handleEmitterEvents(emitter, ws, id, parsedMessage.chatId);

        const chat = await db.query.chats.findFirst({
          where: eq(chats.id, parsedMessage.chatId),
        });

        if (!chat) {
          await db
            .insert(chats)
            .values({
              id: parsedMessage.chatId,
              title: parsedMessage.content,
              createdAt: new Date().toString(),
              focusMode: parsedWSMessage.focusMode,
            })
            .execute();
        }

        await db
          .insert(messages)
          .values({
            content: parsedMessage.content,
            chatId: parsedMessage.chatId,
            messageId: id,
            role: 'user',
            metadata: JSON.stringify({
              createdAt: new Date(),
            }),
          })
          .execute();
        const emitter = handler(parsedMessage.content, history);
        handleEmitterEvents(emitter, ws, id);
      } else {
        ws.send(
          JSON.stringify({
            type: 'error',
            data: 'Invalid focus mode',
            key: 'INVALID_FOCUS_MODE',
          }),
        );
        ws.send(JSON.stringify({ type: 'error', data: 'Invalid focus mode' }));
      }
    }
  } catch (err) {
    ws.send(
      JSON.stringify({
        type: 'error',
        data: 'Invalid message format',
        key: 'INVALID_FORMAT',
      }),
    );
    logger.error(`Failed to handle message: ${err}`);
  } catch (error) {
    console.error('Failed to handle message', error);
    ws.send(JSON.stringify({ type: 'error', data: 'Invalid message format' }));
  }
};
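The two sides of this diff expect differently shaped frames: on the v1.7.0 side the client wraps its content in a nested `message` object (per the `WSMessage` type above), while the feat/ollama side reads the fields at the top level. An illustrative v1.7.0 frame; the `chatId` value is made up for the example:

```ts
declare const ws: WebSocket; // an open socket, e.g. from the connection sketch above

// Shape of a v1.7.0 client frame, following the WSMessage type above.
const frame = {
  type: 'message',
  message: {
    chatId: 'b1946ac92492d2347c6235b4d2611184', // client-generated hex id (illustrative)
    content: 'What is Perplexica?',
  },
  focusMode: 'webSearch',
  history: [['human', 'What is Perplexica?']] as Array<[string, string]>,
  copilot: false,
};

ws.send(JSON.stringify(frame));
```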
@@ -1,16 +1,15 @@
import { WebSocketServer } from 'ws';
import { handleConnection } from './connectionManager';
import http from 'http';
import { getPort } from '../config';
import logger from '../utils/logger';

export const initServer = (
  server: http.Server<typeof http.IncomingMessage, typeof http.ServerResponse>,
) => {
  const port = getPort();
  const wss = new WebSocketServer({ server });

  wss.on('connection', handleConnection);
  wss.on('connection', (ws) => {
    handleConnection(ws);
  });

  logger.info(`WebSocket server started on port ${port}`);
  console.log(`WebSocket server started on port ${process.env.PORT}`);
};
@@ -1,8 +1,7 @@
{
  "compilerOptions": {
    "lib": ["ESNext"],
    "module": "Node16",
    "moduleResolution": "Node16",
    "module": "commonjs",
    "target": "ESNext",
    "outDir": "dist",
    "sourceMap": false,
@@ -1,7 +0,0 @@
import ChatWindow from '@/components/ChatWindow';

const Page = ({ params }: { params: { chatId: string } }) => {
  return <ChatWindow id={params.chatId} />;
};

export default Page;
5 ui/app/discover/page.tsx Normal file
@@ -0,0 +1,5 @@
const Page = () => {
  return <div>page</div>;
};

export default Page;
@@ -3,8 +3,6 @@ import { Montserrat } from 'next/font/google';
import './globals.css';
import { cn } from '@/lib/utils';
import Sidebar from '@/components/Sidebar';
import { Toaster } from 'sonner';
import ThemeProvider from '@/components/theme/Provider';

const montserrat = Montserrat({
  weight: ['300', '400', '500', '700'],
@@ -25,20 +23,9 @@ export default function RootLayout({
  children: React.ReactNode;
}>) {
  return (
    <html className="h-full" lang="en" suppressHydrationWarning>
    <html className="h-full" lang="en">
      <body className={cn('h-full', montserrat.className)}>
        <ThemeProvider>
          <Sidebar>{children}</Sidebar>
          <Toaster
            toastOptions={{
              unstyled: true,
              classNames: {
                toast:
                  'bg-light-primary dark:bg-dark-primary text-white rounded-lg p-4 flex flex-row items-center space-x-2',
              },
            }}
          />
        </ThemeProvider>
        <Sidebar>{children}</Sidebar>
      </body>
    </html>
  );
@@ -1,12 +0,0 @@
import { Metadata } from 'next';
import React from 'react';

export const metadata: Metadata = {
  title: 'Library - Perplexica',
};

const Layout = ({ children }: { children: React.ReactNode }) => {
  return <div>{children}</div>;
};

export default Layout;
@@ -1,104 +0,0 @@
'use client';

import { formatTimeDifference } from '@/lib/utils';
import { BookOpenText, ClockIcon, ScanEye } from 'lucide-react';
import Link from 'next/link';
import { useEffect, useState } from 'react';

interface Chat {
  id: string;
  title: string;
  createdAt: string;
  focusMode: string;
}

const Page = () => {
  const [chats, setChats] = useState<Chat[]>([]);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    const fetchChats = async () => {
      setLoading(true);

      const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/chats`, {
        method: 'GET',
        headers: {
          'Content-Type': 'application/json',
        },
      });

      const data = await res.json();

      setChats(data.chats);
      setLoading(false);
    };

    fetchChats();
  }, []);

  return loading ? (
    <div className="flex flex-row items-center justify-center min-h-screen">
      <svg
        aria-hidden="true"
        className="w-8 h-8 text-light-200 fill-light-secondary dark:text-[#202020] animate-spin dark:fill-[#ffffff3b]"
        viewBox="0 0 100 101"
        fill="none"
        xmlns="http://www.w3.org/2000/svg"
      >
        <path
          d="M100 50.5908C100.003 78.2051 78.1951 100.003 50.5908 100C22.9765 99.9972 0.997224 78.018 1 50.4037C1.00281 22.7993 22.8108 0.997224 50.4251 1C78.0395 1.00281 100.018 22.8108 100 50.4251ZM9.08164 50.594C9.06312 73.3997 27.7909 92.1272 50.5966 92.1457C73.4023 92.1642 92.1298 73.4365 92.1483 50.6308C92.1669 27.8251 73.4392 9.0973 50.6335 9.07878C27.8278 9.06026 9.10003 27.787 9.08164 50.594Z"
          fill="currentColor"
        />
        <path
          d="M93.9676 39.0409C96.393 38.4037 97.8624 35.9116 96.9801 33.5533C95.1945 28.8227 92.871 24.3692 90.0681 20.348C85.6237 14.1775 79.4473 9.36872 72.0454 6.45794C64.6435 3.54717 56.3134 2.65431 48.3133 3.89319C45.869 4.27179 44.3768 6.77534 45.014 9.20079C45.6512 11.6262 48.1343 13.0956 50.5786 12.717C56.5073 11.8281 62.5542 12.5399 68.0406 14.7911C73.527 17.0422 78.2187 20.7487 81.5841 25.4923C83.7976 28.5886 85.4467 32.059 86.4416 35.7474C87.1273 38.1189 89.5423 39.6781 91.9676 39.0409Z"
          fill="currentFill"
        />
      </svg>
    </div>
  ) : (
    <div>
      <div className="fixed z-40 top-0 left-0 right-0 lg:pl-[104px] lg:pr-6 lg:px-8 px-4 py-4 lg:py-6 border-b border-light-200 dark:border-dark-200">
        <div className="flex flex-row items-center space-x-2 max-w-screen-lg lg:mx-auto">
          <BookOpenText />
          <h2 className="text-black dark:text-white lg:text-3xl lg:font-medium">
            Library
          </h2>
        </div>
      </div>
      {chats.length === 0 && (
        <div className="flex flex-row items-center justify-center min-h-screen">
          <p className="text-black/70 dark:text-white/70 text-sm">
            No chats found.
          </p>
        </div>
      )}
      {chats.length > 0 && (
        <div className="flex flex-col pt-16 lg:pt-24">
          {chats.map((chat, i) => (
            <div
              className="flex flex-col space-y-4 border-b border-white-200 dark:border-dark-200 py-6 lg:mx-4"
              key={i}
            >
              <Link
                href={`/c/${chat.id}`}
                className="text-black dark:text-white lg:text-xl font-medium truncate transition duration-200 hover:text-[#24A0ED] dark:hover:text-[#24A0ED] cursor-pointer"
              >
                {chat.title}
              </Link>
              <div className="flex flex-row items-center justify-between w-full">
                <div className="flex flex-row items-center space-x-1 lg:space-x-1.5 text-black/70 dark:text-white/70">
                  <ClockIcon size={15} />
                  <p className="text-xs">
                    {formatTimeDifference(new Date(), chat.createdAt)} Ago
                  </p>
                </div>
              </div>
            </div>
          ))}
        </div>
      )}
    </div>
  );
};

export default Page;
@@ -1,6 +1,5 @@
import ChatWindow from '@/components/ChatWindow';
import { Metadata } from 'next';
import { Suspense } from 'react';

export const metadata: Metadata = {
  title: 'Chat - Perplexica',
@@ -10,9 +9,7 @@ export const metadata: Metadata = {
const Home = () => {
  return (
    <div>
      <Suspense>
        <ChatWindow />
      </Suspense>
      <ChatWindow />
    </div>
  );
};
@@ -1,6 +1,6 @@
'use client';

import { Fragment, useEffect, useRef, useState } from 'react';
import { useEffect, useRef, useState } from 'react';
import MessageInput from './MessageInput';
import { Message } from './ChatWindow';
import MessageBox from './MessageBox';
@@ -53,7 +53,7 @@ const Chat = ({
        const isLast = i === messages.length - 1;

        return (
          <Fragment key={msg.messageId}>
          <>
            <MessageBox
              key={i}
              message={msg}
@@ -63,12 +63,11 @@ const Chat = ({
              dividerRef={isLast ? dividerRef : undefined}
              isLast={isLast}
              rewrite={rewrite}
              sendMessage={sendMessage}
            />
            {!isLast && msg.role === 'assistant' && (
              <div className="h-px w-full bg-light-secondary dark:bg-dark-secondary" />
              <div className="h-px w-full bg-[#1C1C1C]" />
            )}
          </Fragment>
          </>
        );
      })}
      {loading && !messageAppeared && <MessageBoxLoading />}
@@ -78,7 +77,7 @@ const Chat = ({
          className="bottom-24 lg:bottom-10 fixed z-40"
          style={{ width: dividerWidth }}
        >
          <MessageInput loading={loading} sendMessage={sendMessage} />
          <MessageInput sendMessage={sendMessage} />
        </div>
      )}
    </div>
@@ -1,260 +1,48 @@
'use client';

import { useEffect, useRef, useState } from 'react';
import { useEffect, useState } from 'react';
import { Document } from '@langchain/core/documents';
import Navbar from './Navbar';
import Chat from './Chat';
import EmptyChat from './EmptyChat';
import crypto from 'crypto';
import { toast } from 'sonner';
import { useSearchParams } from 'next/navigation';
import { getSuggestions } from '@/lib/actions';

export type Message = {
  messageId: string;
  chatId: string;
  id: string;
  createdAt: Date;
  content: string;
  role: 'user' | 'assistant';
  suggestions?: string[];
  sources?: Document[];
};

const useSocket = (
  url: string,
  setIsWSReady: (ready: boolean) => void,
  setError: (error: boolean) => void,
) => {
const useSocket = (url: string) => {
  const [ws, setWs] = useState<WebSocket | null>(null);

  useEffect(() => {
    if (!ws) {
      const connectWs = async () => {
        let chatModel = localStorage.getItem('chatModel');
        let chatModelProvider = localStorage.getItem('chatModelProvider');
        let embeddingModel = localStorage.getItem('embeddingModel');
        let embeddingModelProvider = localStorage.getItem(
          'embeddingModelProvider',
        );

        if (
          !chatModel ||
          !chatModelProvider ||
          !embeddingModel ||
          !embeddingModelProvider
        ) {
          const providers = await fetch(
            `${process.env.NEXT_PUBLIC_API_URL}/models`,
            {
              headers: {
                'Content-Type': 'application/json',
              },
            },
          ).then(async (res) => await res.json());

          const chatModelProviders = providers.chatModelProviders;
          const embeddingModelProviders = providers.embeddingModelProviders;

          if (
            !chatModelProviders ||
            Object.keys(chatModelProviders).length === 0
          )
            return toast.error('No chat models available');

          if (
            !embeddingModelProviders ||
            Object.keys(embeddingModelProviders).length === 0
          )
            return toast.error('No embedding models available');

          chatModelProvider = Object.keys(chatModelProviders)[0];
          chatModel = Object.keys(chatModelProviders[chatModelProvider])[0];

          embeddingModelProvider = Object.keys(embeddingModelProviders)[0];
          embeddingModel = Object.keys(
            embeddingModelProviders[embeddingModelProvider],
          )[0];

          localStorage.setItem('chatModel', chatModel!);
          localStorage.setItem('chatModelProvider', chatModelProvider);
          localStorage.setItem('embeddingModel', embeddingModel!);
          localStorage.setItem(
            'embeddingModelProvider',
            embeddingModelProvider,
          );
        }

        const wsURL = new URL(url);
        const searchParams = new URLSearchParams({});

        searchParams.append('chatModel', chatModel!);
        searchParams.append('chatModelProvider', chatModelProvider);

        if (chatModelProvider === 'custom_openai') {
          searchParams.append(
            'openAIApiKey',
            localStorage.getItem('openAIApiKey')!,
          );
          searchParams.append(
            'openAIBaseURL',
            localStorage.getItem('openAIBaseURL')!,
          );
        }

        searchParams.append('embeddingModel', embeddingModel!);
        searchParams.append('embeddingModelProvider', embeddingModelProvider);

        wsURL.search = searchParams.toString();

        const ws = new WebSocket(wsURL.toString());

        const timeoutId = setTimeout(() => {
          if (ws.readyState !== 1) {
            ws.close();
            setError(true);
            toast.error(
              'Failed to connect to the server. Please try again later.',
            );
          }
        }, 10000);

        ws.onopen = () => {
          console.log('[DEBUG] open');
          clearTimeout(timeoutId);
          setError(false);
          setIsWSReady(true);
        };

        ws.onerror = () => {
          clearTimeout(timeoutId);
          setError(true);
          toast.error('WebSocket connection error.');
        };

        ws.onclose = () => {
          clearTimeout(timeoutId);
          setError(true);
          console.log('[DEBUG] closed');
        };

      const ws = new WebSocket(url);
      ws.onopen = () => {
        console.log('[DEBUG] open');
        setWs(ws);
      };

      connectWs();
    }

    return () => {
      ws?.close();
      console.log('[DEBUG] closed');
    };
  }, [ws, url, setIsWSReady, setError]);
  }, [ws, url]);

  return ws;
};

const loadMessages = async (
  chatId: string,
  setMessages: (messages: Message[]) => void,
  setIsMessagesLoaded: (loaded: boolean) => void,
  setChatHistory: (history: [string, string][]) => void,
  setFocusMode: (mode: string) => void,
) => {
  const res = await fetch(
    `${process.env.NEXT_PUBLIC_API_URL}/chats/${chatId}`,
    {
      method: 'GET',
      headers: {
        'Content-Type': 'application/json',
      },
    },
  );

  const data = await res.json();

  const messages = data.messages.map((msg: any) => {
    return {
      ...msg,
      ...JSON.parse(msg.metadata),
    };
  }) as Message[];

  setMessages(messages);

  const history = messages.map((msg) => {
    return [msg.role, msg.content];
  }) as [string, string][];

  console.log('[DEBUG] messages loaded');

  document.title = messages[0].content;

  setChatHistory(history);
  setFocusMode(data.chat.focusMode);
  console.log(data);
  setIsMessagesLoaded(true);
};

const ChatWindow = ({ id }: { id?: string }) => {
  const searchParams = useSearchParams();
  const initialMessage = searchParams.get('q');

  const [chatId, setChatId] = useState<string | undefined>(id);
  const [newChatCreated, setNewChatCreated] = useState(false);

  const [hasError, setHasError] = useState(false);
  const [isReady, setIsReady] = useState(false);

  const [isWSReady, setIsWSReady] = useState(false);
  const ws = useSocket(
    process.env.NEXT_PUBLIC_WS_URL!,
    setIsWSReady,
    setHasError,
  );

  const [loading, setLoading] = useState(false);
  const [messageAppeared, setMessageAppeared] = useState(false);

const ChatWindow = () => {
  const ws = useSocket(process.env.NEXT_PUBLIC_WS_URL!);
  const [chatHistory, setChatHistory] = useState<[string, string][]>([]);
  const [messages, setMessages] = useState<Message[]>([]);

  const [loading, setLoading] = useState(false);
  const [messageAppeared, setMessageAppeared] = useState(false);
  const [focusMode, setFocusMode] = useState('webSearch');

  const [isMessagesLoaded, setIsMessagesLoaded] = useState(false);

  useEffect(() => {
    if (
      chatId &&
      !newChatCreated &&
      !isMessagesLoaded &&
      messages.length === 0
    ) {
      loadMessages(
        chatId,
        setMessages,
        setIsMessagesLoaded,
        setChatHistory,
        setFocusMode,
      );
    } else if (!chatId) {
      setNewChatCreated(true);
      setIsMessagesLoaded(true);
      setChatId(crypto.randomBytes(20).toString('hex'));
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  const messagesRef = useRef<Message[]>([]);

  useEffect(() => {
    messagesRef.current = messages;
  }, [messages]);

  useEffect(() => {
    if (isMessagesLoaded && isWSReady) {
      setIsReady(true);
    }
  }, [isMessagesLoaded, isWSReady]);

  const sendMessage = async (message: string) => {
    if (loading) return;
    setLoading(true);
@@ -264,15 +52,10 @@ const ChatWindow = ({ id }: { id?: string }) => {
    let recievedMessage = '';
    let added = false;

    const messageId = crypto.randomBytes(7).toString('hex');

    ws?.send(
      JSON.stringify({
        type: 'message',
        message: {
          chatId: chatId!,
          content: message,
        },
        content: message,
        focusMode: focusMode,
        history: [...chatHistory, ['human', message]],
      }),
@@ -282,22 +65,15 @@ const ChatWindow = ({ id }: { id?: string }) => {
      ...prevMessages,
      {
        content: message,
        messageId: messageId,
        chatId: chatId!,
        id: Math.random().toString(36).substring(7),
        role: 'user',
        createdAt: new Date(),
      },
    ]);

    const messageHandler = async (e: MessageEvent) => {
    const messageHandler = (e: MessageEvent) => {
      const data = JSON.parse(e.data);

      if (data.type === 'error') {
        toast.error(data.data);
        setLoading(false);
        return;
      }

      if (data.type === 'sources') {
        sources = data.data;
        if (!added) {
@@ -305,8 +81,7 @@ const ChatWindow = ({ id }: { id?: string }) => {
            ...prevMessages,
            {
              content: '',
              messageId: data.messageId,
              chatId: chatId!,
              id: data.messageId,
              role: 'assistant',
              sources: sources,
              createdAt: new Date(),
@@ -323,8 +98,7 @@ const ChatWindow = ({ id }: { id?: string }) => {
            ...prevMessages,
            {
              content: data.data,
              messageId: data.messageId,
              chatId: chatId!,
              id: data.messageId,
              role: 'assistant',
              sources: sources,
              createdAt: new Date(),
@@ -335,7 +109,7 @@ const ChatWindow = ({ id }: { id?: string }) => {

        setMessages((prev) =>
          prev.map((message) => {
            if (message.messageId === data.messageId) {
            if (message.id === data.messageId) {
              return { ...message, content: message.content + data.data };
            }

@@ -353,28 +127,8 @@ const ChatWindow = ({ id }: { id?: string }) => {
          ['human', message],
          ['assistant', recievedMessage],
        ]);

        ws?.removeEventListener('message', messageHandler);
        setLoading(false);

        const lastMsg = messagesRef.current[messagesRef.current.length - 1];

        if (
          lastMsg.role === 'assistant' &&
          lastMsg.sources &&
          lastMsg.sources.length > 0 &&
          !lastMsg.suggestions
        ) {
          const suggestions = await getSuggestions(messagesRef.current);
          setMessages((prev) =>
            prev.map((msg) => {
              if (msg.messageId === lastMsg.messageId) {
                return { ...msg, suggestions: suggestions };
              }
              return msg;
            }),
          );
        }
      }
    };

@@ -382,7 +136,7 @@ const ChatWindow = ({ id }: { id?: string }) => {
  };

  const rewrite = (messageId: string) => {
    const index = messages.findIndex((msg) => msg.messageId === messageId);
    const index = messages.findIndex((msg) => msg.id === messageId);

    if (index === -1) return;

@@ -398,24 +152,7 @@ const ChatWindow = ({ id }: { id?: string }) => {
    sendMessage(message.content);
  };

  useEffect(() => {
    if (isReady && initialMessage) {
      sendMessage(initialMessage);
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [isReady, initialMessage]);

  if (hasError) {
    return (
      <div className="flex flex-col items-center justify-center min-h-screen">
        <p className="dark:text-white/70 text-black/70 text-sm">
          Failed to connect to the server. Please try again later.
        </p>
      </div>
    );
  }

  return isReady ? (
  return (
    <div>
      {messages.length > 0 ? (
        <>
@@ -436,25 +173,6 @@ const ChatWindow = ({ id }: { id?: string }) => {
        />
      )}
    </div>
  ) : (
    <div className="flex flex-row items-center justify-center min-h-screen">
      <svg
        aria-hidden="true"
        className="w-8 h-8 text-light-200 fill-light-secondary dark:text-[#202020] animate-spin dark:fill-[#ffffff3b]"
        viewBox="0 0 100 101"
        fill="none"
        xmlns="http://www.w3.org/2000/svg"
      >
        <path
          d="M100 50.5908C100.003 78.2051 78.1951 100.003 50.5908 100C22.9765 99.9972 0.997224 78.018 1 50.4037C1.00281 22.7993 22.8108 0.997224 50.4251 1C78.0395 1.00281 100.018 22.8108 100 50.4251ZM9.08164 50.594C9.06312 73.3997 27.7909 92.1272 50.5966 92.1457C73.4023 92.1642 92.1298 73.4365 92.1483 50.6308C92.1669 27.8251 73.4392 9.0973 50.6335 9.07878C27.8278 9.06026 9.10003 27.787 9.08164 50.594Z"
          fill="currentColor"
        />
        <path
          d="M93.9676 39.0409C96.393 38.4037 97.8624 35.9116 96.9801 33.5533C95.1945 28.8227 92.871 24.3692 90.0681 20.348C85.6237 14.1775 79.4473 9.36872 72.0454 6.45794C64.6435 3.54717 56.3134 2.65431 48.3133 3.89319C45.869 4.27179 44.3768 6.77534 45.014 9.20079C45.6512 11.6262 48.1343 13.0956 50.5786 12.717C56.5073 11.8281 62.5542 12.5399 68.0406 14.7911C73.527 17.0422 78.2187 20.7487 81.5841 25.4923C83.7976 28.5886 85.4467 32.059 86.4416 35.7474C87.1273 38.1189 89.5423 39.6781 91.9676 39.0409Z"
          fill="currentFill"
        />
      </svg>
    </div>
  );
};
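The reworked `useSocket` above resolves model settings from `localStorage` before connecting, seeding them from the first provider returned by `/models` when absent. For reference, the keys it reads and writes; the values below are purely illustrative:

```ts
// The localStorage keys the hook above reads and seeds. Values are illustrative.
localStorage.setItem('chatModelProvider', 'openai');
localStorage.setItem('chatModel', 'gpt-3.5-turbo');
localStorage.setItem('embeddingModelProvider', 'openai');
localStorage.setItem('embeddingModel', 'text-embedding-3-small');

// Only consulted when chatModelProvider === 'custom_openai':
localStorage.setItem('openAIApiKey', 'sk-...');
localStorage.setItem('openAIBaseURL', 'https://api.openai.com/v1');
```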
@@ -10,17 +10,15 @@ const EmptyChat = ({
  setFocusMode: (mode: string) => void;
}) => {
  return (
    <div className="relative">
      <div className="flex flex-col items-center justify-center min-h-screen max-w-screen-sm mx-auto p-2 space-y-8">
        <h2 className="text-black/70 dark:text-white/70 text-3xl font-medium -mt-8">
          Research begins here.
        </h2>
        <EmptyChatMessageInput
          sendMessage={sendMessage}
          focusMode={focusMode}
          setFocusMode={setFocusMode}
        />
      </div>
    <div className="flex flex-col items-center justify-center min-h-screen max-w-screen-sm mx-auto p-2 space-y-8">
      <h2 className="text-white/70 text-3xl font-medium -mt-8">
        Research begins here.
      </h2>
      <EmptyChatMessageInput
        sendMessage={sendMessage}
        focusMode={focusMode}
        setFocusMode={setFocusMode}
      />
    </div>
  );
};
@@ -1,8 +1,7 @@
import { ArrowRight } from 'lucide-react';
import { useEffect, useRef, useState } from 'react';
import { useState } from 'react';
import TextareaAutosize from 'react-textarea-autosize';
import CopilotToggle from './MessageInputActions/Copilot';
import Focus from './MessageInputActions/Focus';
import { Attach, CopilotToggle, Focus } from './MessageInputActions';

const EmptyChatMessageInput = ({
  sendMessage,
@@ -16,23 +15,6 @@ const EmptyChatMessageInput = ({
  const [copilotEnabled, setCopilotEnabled] = useState(false);
  const [message, setMessage] = useState('');

  const inputRef = useRef<HTMLTextAreaElement | null>(null);

  const handleKeyDown = (e: KeyboardEvent) => {
    if (e.key === '/') {
      e.preventDefault();
      inputRef.current?.focus();
    }
  };

  useEffect(() => {
    document.addEventListener('keydown', handleKeyDown);

    return () => {
      document.removeEventListener('keydown', handleKeyDown);
    };
  }, []);

  return (
    <form
      onSubmit={(e) => {
@@ -49,13 +31,12 @@ const EmptyChatMessageInput = ({
      }}
      className="w-full"
    >
      <div className="flex flex-col bg-light-secondary dark:bg-dark-secondary px-5 pt-5 pb-2 rounded-lg w-full border border-light-200 dark:border-dark-200">
      <div className="flex flex-col bg-[#111111] px-5 pt-5 pb-2 rounded-lg w-full border border-[#1C1C1C]">
        <TextareaAutosize
          ref={inputRef}
          value={message}
          onChange={(e) => setMessage(e.target.value)}
          minRows={2}
          className="bg-transparent placeholder:text-black/50 dark:placeholder:text-white/50 text-sm text-black dark:text-white resize-none focus:outline-none w-full max-h-24 lg:max-h-36 xl:max-h-48"
          className="bg-transparent placeholder:text-white/50 text-sm text-white resize-none focus:outline-none w-full max-h-24 lg:max-h-36 xl:max-h-48"
          placeholder="Ask anything..."
        />
        <div className="flex flex-row items-center justify-between mt-4">
@@ -70,7 +51,7 @@ const EmptyChatMessageInput = ({
          />
          <button
            disabled={message.trim().length === 0}
            className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 disabled:bg-[#e0e0dc] dark:disabled:bg-[#ececec21] hover:bg-opacity-85 transition duration-100 rounded-full p-2"
            className="bg-[#24A0ED] text-white disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#ececec21] rounded-full p-2"
          >
            <ArrowRight className="bg-background" size={17} />
          </button>
@@ -1,6 +1,6 @@
const Layout = ({ children }: { children: React.ReactNode }) => {
  return (
    <main className="lg:pl-20 bg-light-primary dark:bg-dark-primary min-h-screen">
    <main className="lg:pl-20 bg-[#0A0A0A] min-h-screen">
      <div className="max-w-screen-lg lg:mx-auto mx-4">{children}</div>
    </main>
  );
@@ -19,7 +19,7 @@ const Copy = ({
        setCopied(true);
        setTimeout(() => setCopied(false), 1000);
      }}
      className="p-2 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
      className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white"
    >
      {copied ? <Check size={18} /> : <ClipboardList size={18} />}
    </button>
@@ -10,10 +10,9 @@ const Rewrite = ({
  return (
    <button
      onClick={() => rewrite(messageId)}
      className="py-2 px-3 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white flex flex-row items-center space-x-1"
      className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white"
    >
      <ArrowLeftRight size={18} />
      <p className="text-xs font-medium">Rewrite</p>
    </button>
  );
};
@@ -1,5 +1,3 @@
'use client';

/* eslint-disable @next/next/no-img-element */
import React, { MutableRefObject, useEffect, useState } from 'react';
import { Message } from './ChatWindow';
@@ -7,18 +5,17 @@ import { cn } from '@/lib/utils';
import {
  BookCopy,
  Disc3,
  Volume2,
  StopCircle,
  Layers3,
  Plus,
  FilePen,
  PlusIcon,
  Share,
  ThumbsDown,
  VideoIcon,
} from 'lucide-react';
import Markdown from 'markdown-to-jsx';
import Copy from './MessageActions/Copy';
import Rewrite from './MessageActions/Rewrite';
import MessageSources from './MessageSources';
import SearchImages from './SearchImages';
import SearchVideos from './SearchVideos';
import { useSpeech } from 'react-text-to-speech';

const MessageBox = ({
  message,
@@ -28,7 +25,6 @@ const MessageBox = ({
  dividerRef,
  isLast,
  rewrite,
  sendMessage,
}: {
  message: Message;
  messageIndex: number;
@@ -37,39 +33,33 @@ const MessageBox = ({
  dividerRef?: MutableRefObject<HTMLDivElement | null>;
  isLast: boolean;
  rewrite: (messageId: string) => void;
  sendMessage: (message: string) => void;
}) => {
  const [parsedMessage, setParsedMessage] = useState(message.content);
  const [speechMessage, setSpeechMessage] = useState(message.content);

  useEffect(() => {
    const regex = /\[(\d+)\]/g;

    if (
      message.role === 'assistant' &&
      message?.sources &&
      message.sources.length > 0
    ) {
      const regex = /\[(\d+)\]/g;

      return setParsedMessage(
        message.content.replace(
          regex,
          (_, number) =>
            `<a href="${message.sources?.[number - 1]?.metadata?.url}" target="_blank" className="bg-light-secondary dark:bg-dark-secondary px-1 rounded ml-1 no-underline text-xs text-black/70 dark:text-white/70 relative">${number}</a>`,
            `<a href="${message.sources?.[number - 1]?.metadata?.url}" target="_blank" className="bg-[#1C1C1C] px-1 rounded ml-1 no-underline text-xs text-white/70 relative">${number}</a>`,
        ),
      );
    }

    setSpeechMessage(message.content.replace(regex, ''));
    setParsedMessage(message.content);
  }, [message.content, message.sources, message.role]);

  const { speechStatus, start, stop } = useSpeech({ text: speechMessage });

  return (
    <div>
      {message.role === 'user' && (
        <div className={cn('w-full', messageIndex === 0 ? 'pt-16' : 'pt-8')}>
          <h2 className="text-black dark:text-white font-medium text-3xl lg:w-9/12">
          <h2 className="text-white font-medium text-3xl lg:w-9/12">
            {message.content}
          </h2>
        </div>
@@ -84,10 +74,8 @@ const MessageBox = ({
          {message.sources && message.sources.length > 0 && (
            <div className="flex flex-col space-y-2">
              <div className="flex flex-row items-center space-x-2">
                <BookCopy className="text-black dark:text-white" size={20} />
                <h3 className="text-black dark:text-white font-medium text-xl">
                  Sources
                </h3>
                <BookCopy className="text-white" size={20} />
                <h3 className="text-white font-medium text-xl">Sources</h3>
              </div>
              <MessageSources sources={message.sources} />
            </div>
@@ -96,102 +84,46 @@ const MessageBox = ({
            <div className="flex flex-row items-center space-x-2">
              <Disc3
                className={cn(
                  'text-black dark:text-white',
                  'text-white',
                  isLast && loading ? 'animate-spin' : 'animate-none',
                )}
                size={20}
              />
              <h3 className="text-black dark:text-white font-medium text-xl">
                Answer
              </h3>
              <h3 className="text-white font-medium text-xl">Answer</h3>
            </div>
            <Markdown
              className={cn(
                'prose dark:prose-invert prose-p:leading-relaxed prose-pre:p-0',
                'max-w-none break-words text-black dark:text-white text-sm md:text-base font-medium',
              )}
            >
            <Markdown className="prose max-w-none break-words prose-invert prose-p:leading-relaxed prose-pre:p-0 text-white text-sm md:text-base font-medium">
              {parsedMessage}
            </Markdown>
            {loading && isLast ? null : (
              <div className="flex flex-row items-center justify-between w-full text-black dark:text-white py-4 -mx-2">
            {!loading && (
              <div className="flex flex-row items-center justify-between w-full text-white py-4 -mx-2">
                <div className="flex flex-row items-center space-x-1">
                  {/* <button className="p-2 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black text-black dark:hover:text-white">
                  <button className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white">
                    <Share size={18} />
                  </button> */}
                  <Rewrite rewrite={rewrite} messageId={message.messageId} />
                  </button>
                  <Rewrite rewrite={rewrite} messageId={message.id} />
                </div>
                <div className="flex flex-row items-center space-x-1">
                  <Copy initialMessage={message.content} message={message} />
                  <button
                    onClick={() => {
                      if (speechStatus === 'started') {
                        stop();
                      } else {
                        start();
                      }
                    }}
                    className="p-2 text-black/70 dark:text-white/70 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
                  >
                    {speechStatus === 'started' ? (
                      <StopCircle size={18} />
                    ) : (
                      <Volume2 size={18} />
                    )}
                  <button className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white">
                    <FilePen size={18} />
                  </button>
                  <button className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white">
                    <ThumbsDown size={18} />
                  </button>
                </div>
              </div>
            )}
            {isLast &&
              message.suggestions &&
              message.suggestions.length > 0 &&
              message.role === 'assistant' &&
              !loading && (
                <>
                  <div className="h-px w-full bg-light-secondary dark:bg-dark-secondary" />
                  <div className="flex flex-col space-y-3 text-black dark:text-white">
                    <div className="flex flex-row items-center space-x-2 mt-4">
                      <Layers3 />
                      <h3 className="text-xl font-medium">Related</h3>
                    </div>
                    <div className="flex flex-col space-y-3">
                      {message.suggestions.map((suggestion, i) => (
                        <div
                          className="flex flex-col space-y-3 text-sm"
                          key={i}
                        >
                          <div className="h-px w-full bg-light-secondary dark:bg-dark-secondary" />
                          <div
                            onClick={() => {
                              sendMessage(suggestion);
                            }}
                            className="cursor-pointer flex flex-row justify-between font-medium space-x-2 items-center"
                          >
                            <p className="transition duration-200 hover:text-[#24A0ED]">
                              {suggestion}
                            </p>
                            <Plus
                              size={20}
                              className="text-[#24A0ED] flex-shrink-0"
                            />
                          </div>
                        </div>
                      ))}
                    </div>
                  </div>
                </>
              )}
          </div>
        </div>
        <div className="lg:sticky lg:top-20 flex flex-col items-center space-y-3 w-full lg:w-3/12 z-30 h-full pb-4">
          <SearchImages
            query={history[messageIndex - 1].content}
            chat_history={history.slice(0, messageIndex - 1)}
          />
          <SearchVideos
            chat_history={history.slice(0, messageIndex - 1)}
            query={history[messageIndex - 1].content}
          />
          <SearchImages query={history[messageIndex - 1].content} />
          <div className="border border-dashed border-[#1C1C1C] px-4 py-2 flex flex-row items-center justify-between rounded-lg text-white text-sm w-full">
            <div className="flex flex-row items-center space-x-2">
              <VideoIcon size={17} />
              <p>Search videos</p>
            </div>
            <PlusIcon className="text-[#24A0ED]" size={17} />
          </div>
        </div>
      </div>
    )}
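The `useEffect` above turns bracketed citation markers like `[1]` in the answer text into anchor tags pointing at the matching source document. A stripped-down sketch of that transform, separated from the React state handling:

```ts
import { Document } from '@langchain/core/documents';

// Minimal version of the citation transform in the effect above:
// each "[n]" becomes a link to the nth source's URL.
const linkCitations = (content: string, sources: Document[]) =>
  content.replace(
    /\[(\d+)\]/g,
    (_, n) => `<a href="${sources[Number(n) - 1]?.metadata?.url}">${n}</a>`,
  );

// linkCitations('Perplexica is open source [1].', sources)
// -> 'Perplexica is open source <a href="https://...">1</a>.'
```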
@@ -1,9 +1,9 @@
const MessageBoxLoading = () => {
  return (
    <div className="flex flex-col space-y-2 w-full lg:w-9/12 bg-light-primary dark:bg-dark-primary animate-pulse rounded-lg py-3">
      <div className="h-2 rounded-full w-full bg-light-secondary dark:bg-dark-secondary" />
      <div className="h-2 rounded-full w-9/12 bg-light-secondary dark:bg-dark-secondary" />
      <div className="h-2 rounded-full w-10/12 bg-light-secondary dark:bg-dark-secondary" />
    <div className="flex flex-col space-y-2 w-full lg:w-9/12 bg-[#111111] animate-pulse rounded-lg p-3">
      <div className="h-2 rounded-full w-full bg-[#1c1c1c]" />
      <div className="h-2 rounded-full w-9/12 bg-[#1c1c1c]" />
      <div className="h-2 rounded-full w-10/12 bg-[#1c1c1c]" />
    </div>
  );
};
@@ -1,16 +1,13 @@
import { cn } from '@/lib/utils';
import { ArrowUp } from 'lucide-react';
import { useEffect, useRef, useState } from 'react';
import { useEffect, useState } from 'react';
import TextareaAutosize from 'react-textarea-autosize';
import Attach from './MessageInputActions/Attach';
import CopilotToggle from './MessageInputActions/Copilot';
import { Attach, CopilotToggle } from './MessageInputActions';

const MessageInput = ({
  sendMessage,
  loading,
}: {
  sendMessage: (message: string) => void;
  loading: boolean;
}) => {
  const [copilotEnabled, setCopilotEnabled] = useState(false);
  const [message, setMessage] = useState('');
@@ -25,52 +22,33 @@ const MessageInput = ({
    }
  }, [textareaRows, mode, message]);

  const inputRef = useRef<HTMLTextAreaElement | null>(null);

  const handleKeyDown = (e: KeyboardEvent) => {
    if (e.key === '/') {
      e.preventDefault();
      inputRef.current?.focus();
    }
  };

  useEffect(() => {
    document.addEventListener('keydown', handleKeyDown);

    return () => {
      document.removeEventListener('keydown', handleKeyDown);
    };
  }, []);

  return (
    <form
      onSubmit={(e) => {
        if (loading) return;
        e.preventDefault();
        sendMessage(message);
        setMessage('');
      }}
      onKeyDown={(e) => {
        if (e.key === 'Enter' && !e.shiftKey && !loading) {
        if (e.key === 'Enter' && !e.shiftKey) {
          e.preventDefault();
          sendMessage(message);
          setMessage('');
        }
      }}
      className={cn(
        'bg-light-secondary dark:bg-dark-secondary p-4 flex items-center overflow-hidden border border-light-200 dark:border-dark-200',
        'bg-[#111111] p-4 flex items-center overflow-hidden border border-[#1C1C1C]',
        mode === 'multi' ? 'flex-col rounded-lg' : 'flex-row rounded-full',
      )}
    >
      {mode === 'single' && <Attach />}
      <TextareaAutosize
        ref={inputRef}
        value={message}
        onChange={(e) => setMessage(e.target.value)}
        onHeightChange={(height, props) => {
          setTextareaRows(Math.ceil(height / props.rowHeight));
        }}
        className="transition bg-transparent dark:placeholder:text-white/50 placeholder:text-sm text-sm dark:text-white resize-none focus:outline-none w-full px-2 max-h-24 lg:max-h-36 xl:max-h-48 flex-grow flex-shrink"
        className="transition bg-transparent placeholder:text-white/50 placeholder:text-sm text-sm text-white resize-none focus:outline-none w-full px-2 max-h-24 lg:max-h-36 xl:max-h-48 flex-grow flex-shrink"
        placeholder="Ask a follow-up"
      />
      {mode === 'single' && (
@@ -80,8 +58,8 @@ const MessageInput = ({
            setCopilotEnabled={setCopilotEnabled}
          />
          <button
            disabled={message.trim().length === 0 || loading}
            className="bg-[#24A0ED] text-white disabled:text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
            disabled={message.trim().length === 0}
            className="bg-[#24A0ED] text-white disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#ececec21] rounded-full p-2"
          >
            <ArrowUp className="bg-background" size={17} />
          </button>
@@ -96,8 +74,8 @@ const MessageInput = ({
            setCopilotEnabled={setCopilotEnabled}
          />
          <button
            disabled={message.trim().length === 0 || loading}
            className="bg-[#24A0ED] text-white text-black/50 dark:disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#e0e0dc79] dark:disabled:bg-[#ececec21] rounded-full p-2"
            disabled={message.trim().length === 0}
            className="bg-[#24A0ED] text-white disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#ececec21] rounded-full p-2"
          >
            <ArrowUp className="bg-background" size={17} />
          </button>
@@ -1,16 +1,28 @@
import {
  BadgePercent,
  ChevronDown,
  CopyPlus,
  Globe,
  Pencil,
  ScanEye,
  SwatchBook,
} from 'lucide-react';
import { cn } from '@/lib/utils';
import { Popover, Transition } from '@headlessui/react';
import { Popover, Switch, Transition } from '@headlessui/react';
import { SiReddit, SiYoutube } from '@icons-pack/react-simple-icons';
import { Fragment } from 'react';

export const Attach = () => {
  return (
    <button
      type="button"
      className="p-2 text-white/50 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white"
    >
      <CopyPlus />
    </button>
  );
};

const focusModes = [
  {
    key: 'webSearch',
@@ -62,7 +74,7 @@ const focusModes = [
  },
];

const Focus = ({
export const Focus = ({
  focusMode,
  setFocusMode,
}: {
@@ -73,7 +85,7 @@ const Focus = ({
    <Popover className="fixed w-full max-w-[15rem] md:max-w-md lg:max-w-lg">
      <Popover.Button
        type="button"
        className="p-2 text-black/50 dark:text-white/50 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary active:scale-95 transition duration-200 hover:text-black dark:hover:text-white"
        className="p-2 text-white/50 rounded-xl hover:bg-[#1c1c1c] active:scale-95 transition duration-200 hover:text-white"
      >
        {focusMode !== 'webSearch' ? (
          <div className="flex flex-row items-center space-x-1">
@@ -97,7 +109,7 @@ const Focus = ({
        leaveTo="opacity-0 translate-y-1"
      >
        <Popover.Panel className="absolute z-10 w-full">
          <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-1 bg-light-primary dark:bg-dark-primary border rounded-lg border-light-200 dark:border-dark-200 w-full p-2 max-h-[200px] md:max-h-none overflow-y-auto">
          <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-1 bg-[#0A0A0A] border rounded-lg border-[#1c1c1c] w-full p-2">
            {focusModes.map((mode, i) => (
              <Popover.Button
                onClick={() => setFocusMode(mode.key)}
@@ -105,24 +117,20 @@ const Focus = ({
                className={cn(
                  'p-2 rounded-lg flex flex-col items-start justify-start text-start space-y-2 duration-200 cursor-pointer transition',
                  focusMode === mode.key
                    ? 'bg-light-secondary dark:bg-dark-secondary'
                    : 'hover:bg-light-secondary dark:hover:bg-dark-secondary',
                    ? 'bg-[#111111]'
                    : 'hover:bg-[#111111]',
                )}
              >
                <div
                  className={cn(
                    'flex flex-row items-center space-x-1',
                    focusMode === mode.key
                      ? 'text-[#24A0ED]'
                      : 'text-black dark:text-white',
                    focusMode === mode.key ? 'text-[#24A0ED]' : 'text-white',
                  )}
                >
                  {mode.icon}
                  <p className="text-sm font-medium">{mode.title}</p>
                </div>
                <p className="text-black/70 dark:text-white/70 text-xs">
                  {mode.description}
                </p>
                <p className="text-white/70 text-xs">{mode.description}</p>
              </Popover.Button>
            ))}
          </div>
@@ -132,4 +140,41 @@ const Focus = ({
  );
};

export default Focus;
export const CopilotToggle = ({
  copilotEnabled,
  setCopilotEnabled,
}: {
  copilotEnabled: boolean;
  setCopilotEnabled: (enabled: boolean) => void;
}) => {
  return (
    <div className="group flex flex-row items-center space-x-1 active:scale-95 duration-200 transition cursor-pointer">
      <Switch
        checked={copilotEnabled}
        onChange={setCopilotEnabled}
        className="bg-[#111111] border border-[#1C1C1C] relative inline-flex h-5 w-10 sm:h-6 sm:w-11 items-center rounded-full"
      >
        <span className="sr-only">Copilot</span>
        <span
          className={cn(
            copilotEnabled
              ? 'translate-x-6 bg-[#24A0ED]'
              : 'translate-x-1 bg-white/50',
            'inline-block h-3 w-3 sm:h-4 sm:w-4 transform rounded-full transition-all duration-200',
          )}
        />
      </Switch>
      <p
        onClick={() => setCopilotEnabled(!copilotEnabled)}
        className={cn(
          'text-xs font-medium transition-colors duration-150 ease-in-out',
          copilotEnabled
            ? 'text-[#24A0ED]'
            : 'text-white/50 group-hover:text-white',
        )}
      >
        Copilot
      </p>
    </div>
  );
};
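On the feat/ollama side, `Attach`, `Focus`, and `CopilotToggle` all live in this single `MessageInputActions` module, which is why the input components above import them with one named import. v1.7.0 splits them into per-component files with default exports (shown in the next two hunks). If the split files were kept, a barrel index could restore the single-import style; this is only a sketch, no such index file appears anywhere in this diff:

```ts
// Hypothetical MessageInputActions/index.ts barrel re-exporting the split files.
export { default as Attach } from './Attach';
export { default as CopilotToggle } from './Copilot';
export { default as Focus } from './Focus';
```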
@@ -1,14 +0,0 @@
import { CopyPlus } from 'lucide-react';

const Attach = () => {
  return (
    <button
      type="button"
      className="p-2 text-black/50 dark:text-white/50 rounded-xl hover:bg-light-secondary dark:hover:bg-dark-secondary transition duration-200 hover:text-black dark:hover:text-white"
    >
      <CopyPlus />
    </button>
  );
};

export default Attach;
@@ -1,43 +0,0 @@
import { cn } from '@/lib/utils';
import { Switch } from '@headlessui/react';

const CopilotToggle = ({
  copilotEnabled,
  setCopilotEnabled,
}: {
  copilotEnabled: boolean;
  setCopilotEnabled: (enabled: boolean) => void;
}) => {
  return (
    <div className="group flex flex-row items-center space-x-1 active:scale-95 duration-200 transition cursor-pointer">
      <Switch
        checked={copilotEnabled}
        onChange={setCopilotEnabled}
        className="bg-light-secondary dark:bg-dark-secondary border border-light-200/70 dark:border-dark-200 relative inline-flex h-5 w-10 sm:h-6 sm:w-11 items-center rounded-full"
      >
        <span className="sr-only">Copilot</span>
        <span
          className={cn(
            copilotEnabled
              ? 'translate-x-6 bg-[#24A0ED]'
              : 'translate-x-1 bg-black/50 dark:bg-white/50',
            'inline-block h-3 w-3 sm:h-4 sm:w-4 transform rounded-full transition-all duration-200',
          )}
        />
      </Switch>
      <p
        onClick={() => setCopilotEnabled(!copilotEnabled)}
        className={cn(
          'text-xs font-medium transition-colors duration-150 ease-in-out',
          copilotEnabled
            ? 'text-[#24A0ED]'
            : 'text-black/50 dark:text-white/50 group-hover:text-black dark:group-hover:text-white',
        )}
      >
        Copilot
      </p>
    </div>
  );
};

export default CopilotToggle;
@@ -1,31 +1,32 @@
 /* eslint-disable @next/next/no-img-element */
 import { cn } from '@/lib/utils';
 import { Dialog, Transition } from '@headlessui/react';
 import { Document } from '@langchain/core/documents';
 import Link from 'next/link';
 import { Fragment, useState } from 'react';
 
 const MessageSources = ({ sources }: { sources: Document[] }) => {
   const [isDialogOpen, setIsDialogOpen] = useState(false);
 
-  const closeModal = () => {
+  function closeModal() {
     setIsDialogOpen(false);
     document.body.classList.remove('overflow-hidden-scrollable');
-  };
+  }
 
-  const openModal = () => {
+  function openModal() {
     setIsDialogOpen(true);
     document.body.classList.add('overflow-hidden-scrollable');
-  };
+  }
 
   return (
     <div className="grid grid-cols-2 lg:grid-cols-4 gap-2">
       {sources.slice(0, 3).map((source, i) => (
         <a
-          className="bg-light-100 hover:bg-light-200 dark:bg-dark-100 dark:hover:bg-dark-200 transition duration-200 rounded-lg p-3 flex flex-col space-y-2 font-medium"
+          className="bg-[#111111] hover:bg-[#1c1c1c] transition duration-200 rounded-lg p-3 flex flex-col space-y-2 font-medium"
           key={i}
           href={source.metadata.url}
           target="_blank"
         >
-          <p className="dark:text-white text-xs overflow-hidden whitespace-nowrap text-ellipsis">
+          <p className="text-white text-xs overflow-hidden whitespace-nowrap text-ellipsis">
             {source.metadata.title}
           </p>
           <div className="flex flex-row items-center justify-between">
@@ -37,12 +38,12 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
               alt="favicon"
               className="rounded-lg h-4 w-4"
             />
-            <p className="text-xs text-black/50 dark:text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
+            <p className="text-xs text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
               {source.metadata.url.replace(/.+\/\/|www.|\..+/g, '')}
             </p>
           </div>
-          <div className="flex flex-row items-center space-x-1 text-black/50 dark:text-white/50 text-xs">
-            <div className="bg-black/50 dark:bg-white/50 h-[4px] w-[4px] rounded-full" />
+          <div className="flex flex-row items-center space-x-1 text-white/50 text-xs">
+            <div className="bg-white/50 h-[4px] w-[4px] rounded-full" />
             <span>{i + 1}</span>
           </div>
         </div>
@@ -51,7 +52,7 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
       {sources.length > 3 && (
         <button
           onClick={openModal}
-          className="bg-light-100 hover:bg-light-200 dark:bg-dark-100 dark:hover:bg-dark-200 transition duration-200 rounded-lg p-3 flex flex-col space-y-2 font-medium"
+          className="bg-[#111111] hover:bg-[#1c1c1c] transition duration-200 rounded-lg px-4 py-2 flex flex-col justify-between space-y-2"
         >
           <div className="flex flex-row items-center space-x-1">
             {sources.slice(3, 6).map((source, i) => (
@@ -65,7 +66,7 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
               />
             ))}
           </div>
-          <p className="text-xs text-black/50 dark:text-white/50">
+          <p className="text-xs text-white/50">
             View {sources.length - 3} more
           </p>
         </button>
@@ -83,19 +84,19 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
             leaveFrom="opacity-100 scale-200"
             leaveTo="opacity-0 scale-95"
           >
-            <Dialog.Panel className="w-full max-w-md transform rounded-2xl bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 p-6 text-left align-middle shadow-xl transition-all">
-              <Dialog.Title className="text-lg font-medium leading-6 dark:text-white">
+            <Dialog.Panel className="w-full max-w-md transform rounded-2xl bg-[#111111] border border-[#1c1c1c] p-6 text-left align-middle shadow-xl transition-all">
+              <Dialog.Title className="text-lg font-medium leading-6 text-white">
                 Sources
               </Dialog.Title>
               <div className="grid grid-cols-2 gap-2 overflow-auto max-h-[300px] mt-2 pr-2">
                 {sources.map((source, i) => (
                   <a
-                    className="bg-light-secondary hover:bg-light-200 dark:bg-dark-secondary dark:hover:bg-dark-200 border border-light-200 dark:border-dark-200 transition duration-200 rounded-lg p-3 flex flex-col space-y-2 font-medium"
+                    className="bg-[#111111] hover:bg-[#1c1c1c] border border-[#1c1c1c] transition duration-200 rounded-lg p-3 flex flex-col space-y-2 font-medium"
                     key={i}
                     href={source.metadata.url}
                     target="_blank"
                   >
-                    <p className="dark:text-white text-xs overflow-hidden whitespace-nowrap text-ellipsis">
+                    <p className="text-white text-xs overflow-hidden whitespace-nowrap text-ellipsis">
                       {source.metadata.title}
                     </p>
                     <div className="flex flex-row items-center justify-between">
@@ -107,15 +108,15 @@ const MessageSources = ({ sources }: { sources: Document[] }) => {
                         alt="favicon"
                         className="rounded-lg h-4 w-4"
                       />
-                      <p className="text-xs text-black/50 dark:text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
+                      <p className="text-xs text-white/50 overflow-hidden whitespace-nowrap text-ellipsis">
                         {source.metadata.url.replace(
                           /.+\/\/|www.|\..+/g,
                           '',
                         )}
                       </p>
                     </div>
-                    <div className="flex flex-row items-center space-x-1 text-black/50 dark:text-white/50 text-xs">
-                      <div className="bg-black/50 dark:bg-white/50 h-[4px] w-[4px] rounded-full" />
+                    <div className="flex flex-row items-center space-x-1 text-white/50 text-xs">
+                      <div className="bg-white/50 h-[4px] w-[4px] rounded-full" />
                       <span>{i + 1}</span>
                     </div>
                   </div>
@@ -38,17 +38,16 @@ const Navbar = ({ messages }: { messages: Message[] }) => {
   }, []);
 
   return (
-    <div className="fixed z-40 top-0 left-0 right-0 px-4 lg:pl-[104px] lg:pr-6 lg:px-8 flex flex-row items-center justify-between w-full py-4 text-sm text-black dark:text-white/70 border-b bg-light-primary dark:bg-dark-primary border-light-100 dark:border-dark-200">
+    <div className="fixed z-40 top-0 left-0 right-0 px-4 lg:pl-32 lg:pr-4 lg:px-8 flex flex-row items-center justify-between w-full py-4 text-sm text-white/70 border-b bg-[#0A0A0A] border-[#1C1C1C]">
       <Edit
         size={17}
         className="active:scale-95 transition duration-100 cursor-pointer lg:hidden"
       />
-      <div className="hidden lg:flex flex-row items-center justify-center space-x-2">
+      <div className="hidden lg:flex flex-row items-center space-x-2">
         <Clock size={17} />
         <p className="text-xs">{timeAgo} ago</p>
       </div>
       <p className="hidden lg:flex">{title}</p>
 
       <div className="flex flex-row items-center space-x-4">
         <Share
           size={17}
@@ -3,7 +3,6 @@ import { ImagesIcon, PlusIcon } from 'lucide-react';
 import { useState } from 'react';
 import Lightbox from 'yet-another-react-lightbox';
 import 'yet-another-react-lightbox/styles.css';
-import { Message } from './ChatWindow';
 
 type Image = {
   url: string;
@@ -11,13 +10,7 @@ type Image = {
   title: string;
 };
 
-const SearchImages = ({
-  query,
-  chat_history,
-}: {
-  query: string;
-  chat_history: Message[];
-}) => {
+const SearchImages = ({ query }: { query: string }) => {
   const [images, setImages] = useState<Image[] | null>(null);
   const [loading, setLoading] = useState(false);
   const [open, setOpen] = useState(false);
@@ -29,10 +22,6 @@ const SearchImages = ({
       <button
         onClick={async () => {
           setLoading(true);
-
-          const chatModelProvider = localStorage.getItem('chatModelProvider');
-          const chatModel = localStorage.getItem('chatModel');
-
           const res = await fetch(
             `${process.env.NEXT_PUBLIC_API_URL}/images`,
             {
@@ -42,9 +31,7 @@ const SearchImages = ({
               },
               body: JSON.stringify({
                 query: query,
-                chat_history: chat_history,
-                chat_model_provider: chatModelProvider,
-                chat_model: chatModel,
+                chat_history: [],
               }),
             },
           );
@@ -62,7 +49,7 @@ const SearchImages = ({
           );
           setLoading(false);
         }}
-        className="border border-dashed border-light-200 dark:border-dark-200 hover:bg-light-200 dark:hover:bg-dark-200 active:scale-95 duration-200 transition px-4 py-2 flex flex-row items-center justify-between rounded-lg dark:text-white text-sm w-full"
+        className="border border-dashed border-[#1C1C1C] hover:bg-[#1c1c1c] active:scale-95 duration-200 transition px-4 py-2 flex flex-row items-center justify-between rounded-lg text-white text-sm w-full"
       >
         <div className="flex flex-row items-center space-x-2">
           <ImagesIcon size={17} />
@@ -72,11 +59,11 @@ const SearchImages = ({
         </button>
       )}
       {loading && (
-        <div className="grid grid-cols-2 gap-2">
+        <div className="grid grid-cols-1 lg:grid-cols-2 gap-2">
          {[...Array(4)].map((_, i) => (
            <div
              key={i}
-              className="bg-light-secondary dark:bg-dark-secondary h-32 w-full rounded-lg animate-pulse aspect-video object-cover"
+              className="bg-[#1C1C1C] h-32 w-full rounded-lg animate-pulse aspect-video object-cover"
            />
          ))}
        </div>
@@ -98,7 +85,7 @@ const SearchImages = ({
                key={i}
                src={image.img_src}
                alt={image.title}
-                className="h-full w-full aspect-video object-cover rounded-lg transition duration-200 active:scale-95 hover:scale-[1.02] cursor-zoom-in"
+                className="h-full w-full aspect-video object-cover rounded-lg transition duration-200 active:scale-95 cursor-pointer"
              />
            ))
          : images.map((image, i) => (
@@ -114,13 +101,13 @@ const SearchImages = ({
                key={i}
                src={image.img_src}
                alt={image.title}
-                className="h-full w-full aspect-video object-cover rounded-lg transition duration-200 active:scale-95 hover:scale-[1.02] cursor-zoom-in"
+                className="h-full w-full aspect-video object-cover rounded-lg transition duration-200 active:scale-95 cursor-pointer"
              />
            ))}
        {images.length > 4 && (
          <button
            onClick={() => setOpen(true)}
-            className="bg-light-100 hover:bg-light-200 dark:bg-dark-100 dark:hover:bg-dark-200 transition duration-200 active:scale-95 hover:scale-[1.02] h-auto w-full rounded-lg flex flex-col justify-between text-white p-2"
+            className="bg-[#111111] hover:bg-[#1c1c1c] transition duration-200 active:scale-95 h-auto w-full rounded-lg flex flex-col justify-between text-white p-2"
          >
            <div className="flex flex-row items-center space-x-1">
              {images.slice(3, 6).map((image, i) => (
@@ -132,8 +119,8 @@ const SearchImages = ({
              />
            ))}
          </div>
-          <p className="text-black/70 dark:text-white/70 text-xs">
-            View {images.length - 3} more
+          <p className="text-white/70 text-xs">
+            View {images.slice(0, 2).length} more
          </p>
        </button>
      )}
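On this branch the image search reduces to a stateless POST with an empty chat history, while v1.7.0 forwards the conversation and the locally stored chat model. A standalone sketch of the request made above; the helper name is ours, and the `images` field on the response is an assumption by analogy with the `videos` endpoint shown in the next file.

// Illustrative standalone version of the fetch made by SearchImages.
type ImageResult = { url: string; img_src: string; title: string };

const searchImages = async (query: string): Promise<ImageResult[]> => {
  const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/images`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // chat_history is sent empty here, so every image search is standalone.
    body: JSON.stringify({ query, chat_history: [] }),
  });
  const data = await res.json();
  return data.images; // assumed response shape
};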
@@ -1,196 +0,0 @@
-/* eslint-disable @next/next/no-img-element */
-import { PlayCircle, PlayIcon, PlusIcon, VideoIcon } from 'lucide-react';
-import { useState } from 'react';
-import Lightbox, { GenericSlide, VideoSlide } from 'yet-another-react-lightbox';
-import 'yet-another-react-lightbox/styles.css';
-import { Message } from './ChatWindow';
-
-type Video = {
-  url: string;
-  img_src: string;
-  title: string;
-  iframe_src: string;
-};
-
-declare module 'yet-another-react-lightbox' {
-  export interface VideoSlide extends GenericSlide {
-    type: 'video-slide';
-    src: string;
-    iframe_src: string;
-  }
-
-  interface SlideTypes {
-    'video-slide': VideoSlide;
-  }
-}
-
-const Searchvideos = ({
-  query,
-  chat_history,
-}: {
-  query: string;
-  chat_history: Message[];
-}) => {
-  const [videos, setVideos] = useState<Video[] | null>(null);
-  const [loading, setLoading] = useState(false);
-  const [open, setOpen] = useState(false);
-  const [slides, setSlides] = useState<VideoSlide[]>([]);
-
-  return (
-    <>
-      {!loading && videos === null && (
-        <button
-          onClick={async () => {
-            setLoading(true);
-
-            const chatModelProvider = localStorage.getItem('chatModelProvider');
-            const chatModel = localStorage.getItem('chatModel');
-
-            const res = await fetch(
-              `${process.env.NEXT_PUBLIC_API_URL}/videos`,
-              {
-                method: 'POST',
-                headers: {
-                  'Content-Type': 'application/json',
-                },
-                body: JSON.stringify({
-                  query: query,
-                  chat_history: chat_history,
-                  chat_model_provider: chatModelProvider,
-                  chat_model: chatModel,
-                }),
-              },
-            );
-
-            const data = await res.json();
-
-            const videos = data.videos;
-            setVideos(videos);
-            setSlides(
-              videos.map((video: Video) => {
-                return {
-                  type: 'video-slide',
-                  iframe_src: video.iframe_src,
-                  src: video.img_src,
-                };
-              }),
-            );
-            setLoading(false);
-          }}
-          className="border border-dashed border-light-200 dark:border-dark-200 hover:bg-light-200 dark:hover:bg-dark-200 active:scale-95 duration-200 transition px-4 py-2 flex flex-row items-center justify-between rounded-lg dark:text-white text-sm w-full"
-        >
-          <div className="flex flex-row items-center space-x-2">
-            <VideoIcon size={17} />
-            <p>Search videos</p>
-          </div>
-          <PlusIcon className="text-[#24A0ED]" size={17} />
-        </button>
-      )}
-      {loading && (
-        <div className="grid grid-cols-2 gap-2">
-          {[...Array(4)].map((_, i) => (
-            <div
-              key={i}
-              className="bg-light-secondary dark:bg-dark-secondary h-32 w-full rounded-lg animate-pulse aspect-video object-cover"
-            />
-          ))}
-        </div>
-      )}
-      {videos !== null && videos.length > 0 && (
-        <>
-          <div className="grid grid-cols-2 gap-2">
-            {videos.length > 4
-              ? videos.slice(0, 3).map((video, i) => (
-                  <div
-                    onClick={() => {
-                      setOpen(true);
-                      setSlides([
-                        slides[i],
-                        ...slides.slice(0, i),
-                        ...slides.slice(i + 1),
-                      ]);
-                    }}
-                    className="relative transition duration-200 active:scale-95 hover:scale-[1.02] cursor-pointer"
-                    key={i}
-                  >
-                    <img
-                      src={video.img_src}
-                      alt={video.title}
-                      className="relative h-full w-full aspect-video object-cover rounded-lg"
-                    />
-                    <div className="absolute bg-white/70 dark:bg-black/70 text-black/70 dark:text-white/70 px-2 py-1 flex flex-row items-center space-x-1 bottom-1 right-1 rounded-md">
-                      <PlayCircle size={15} />
-                      <p className="text-xs">Video</p>
-                    </div>
-                  </div>
-                ))
-              : videos.map((video, i) => (
-                  <div
-                    onClick={() => {
-                      setOpen(true);
-                      setSlides([
-                        slides[i],
-                        ...slides.slice(0, i),
-                        ...slides.slice(i + 1),
-                      ]);
-                    }}
-                    className="relative transition duration-200 active:scale-95 hover:scale-[1.02] cursor-pointer"
-                    key={i}
-                  >
-                    <img
-                      src={video.img_src}
-                      alt={video.title}
-                      className="relative h-full w-full aspect-video object-cover rounded-lg"
-                    />
-                    <div className="absolute bg-white/70 dark:bg-black/70 text-black/70 dark:text-white/70 px-2 py-1 flex flex-row items-center space-x-1 bottom-1 right-1 rounded-md">
-                      <PlayCircle size={15} />
-                      <p className="text-xs">Video</p>
-                    </div>
-                  </div>
-                ))}
-            {videos.length > 4 && (
-              <button
-                onClick={() => setOpen(true)}
-                className="bg-light-100 hover:bg-light-200 dark:bg-dark-100 dark:hover:bg-dark-200 transition duration-200 active:scale-95 hover:scale-[1.02] h-auto w-full rounded-lg flex flex-col justify-between text-white p-2"
-              >
-                <div className="flex flex-row items-center space-x-1">
-                  {videos.slice(3, 6).map((video, i) => (
-                    <img
-                      key={i}
-                      src={video.img_src}
-                      alt={video.title}
-                      className="h-6 w-12 rounded-md lg:h-3 lg:w-6 lg:rounded-sm aspect-video object-cover"
-                    />
-                  ))}
-                </div>
-                <p className="text-black/70 dark:text-white/70 text-xs">
-                  View {videos.length - 3} more
-                </p>
-              </button>
-            )}
-          </div>
-          <Lightbox
-            open={open}
-            close={() => setOpen(false)}
-            slides={slides}
-            render={{
-              slide: ({ slide }) =>
-                slide.type === 'video-slide' ? (
-                  <div className="h-full w-full flex flex-row items-center justify-center">
-                    <iframe
-                      src={slide.iframe_src}
-                      className="aspect-video max-h-[95vh] w-[95vw] rounded-2xl md:w-[80vw]"
-                      allowFullScreen
-                      allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture"
-                    />
-                  </div>
-                ) : null,
-            }}
-          />
-        </>
-      )}
-    </>
-  );
-};
-
-export default Searchvideos;
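The thumbnail click handler in the deleted component reorders the slides so the lightbox opens on the clicked video: the i-th slide is rotated to the front while the order of the remaining slides is preserved. The same trick in isolation (the helper name is ours):

// Move the i-th element to the front, keeping the rest in order.
const moveToFront = <T,>(items: T[], i: number): T[] => [
  items[i],
  ...items.slice(0, i),
  ...items.slice(i + 1),
];

// Example: moveToFront(['a', 'b', 'c', 'd'], 2) returns ['c', 'a', 'b', 'd'].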
@@ -1,474 +0,0 @@
-import { cn } from '@/lib/utils';
-import { Dialog, Transition } from '@headlessui/react';
-import { CloudUpload, RefreshCcw, RefreshCw } from 'lucide-react';
-import React, {
-  Fragment,
-  useEffect,
-  useState,
-  type SelectHTMLAttributes,
-} from 'react';
-import ThemeSwitcher from './theme/Switcher';
-
-interface InputProps extends React.InputHTMLAttributes<HTMLInputElement> {}
-
-const Input = ({ className, ...restProps }: InputProps) => {
-  return (
-    <input
-      {...restProps}
-      className={cn(
-        'bg-light-secondary dark:bg-dark-secondary px-3 py-2 flex items-center overflow-hidden border border-light-200 dark:border-dark-200 dark:text-white rounded-lg text-sm',
-        className,
-      )}
-    />
-  );
-};
-
-interface SelectProps extends SelectHTMLAttributes<HTMLSelectElement> {
-  options: { value: string; label: string; disabled?: boolean }[];
-}
-
-export const Select = ({ className, options, ...restProps }: SelectProps) => {
-  return (
-    <select
-      {...restProps}
-      className={cn(
-        'bg-light-secondary dark:bg-dark-secondary px-3 py-2 flex items-center overflow-hidden border border-light-200 dark:border-dark-200 dark:text-white rounded-lg text-sm',
-        className,
-      )}
-    >
-      {options.map(({ label, value, disabled }) => {
-        return (
-          <option key={value} value={value} disabled={disabled}>
-            {label}
-          </option>
-        );
-      })}
-    </select>
-  );
-};
-
-interface SettingsType {
-  chatModelProviders: {
-    [key: string]: string[];
-  };
-  embeddingModelProviders: {
-    [key: string]: string[];
-  };
-  openaiApiKey: string;
-  groqApiKey: string;
-  ollamaApiUrl: string;
-}
-
-const SettingsDialog = ({
-  isOpen,
-  setIsOpen,
-}: {
-  isOpen: boolean;
-  setIsOpen: (isOpen: boolean) => void;
-}) => {
-  const [config, setConfig] = useState<SettingsType | null>(null);
-  const [selectedChatModelProvider, setSelectedChatModelProvider] = useState<
-    string | null
-  >(null);
-  const [selectedChatModel, setSelectedChatModel] = useState<string | null>(
-    null,
-  );
-  const [selectedEmbeddingModelProvider, setSelectedEmbeddingModelProvider] =
-    useState<string | null>(null);
-  const [selectedEmbeddingModel, setSelectedEmbeddingModel] = useState<
-    string | null
-  >(null);
-  const [customOpenAIApiKey, setCustomOpenAIApiKey] = useState<string>('');
-  const [customOpenAIBaseURL, setCustomOpenAIBaseURL] = useState<string>('');
-  const [isLoading, setIsLoading] = useState(false);
-  const [isUpdating, setIsUpdating] = useState(false);
-
-  useEffect(() => {
-    if (isOpen) {
-      const fetchConfig = async () => {
-        setIsLoading(true);
-        const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/config`, {
-          headers: {
-            'Content-Type': 'application/json',
-          },
-        });
-
-        const data = (await res.json()) as SettingsType;
-        setConfig(data);
-
-        const chatModelProvidersKeys = Object.keys(
-          data.chatModelProviders || {},
-        );
-        const embeddingModelProvidersKeys = Object.keys(
-          data.embeddingModelProviders || {},
-        );
-
-        const defaultChatModelProvider =
-          chatModelProvidersKeys.length > 0 ? chatModelProvidersKeys[0] : '';
-        const defaultEmbeddingModelProvider =
-          embeddingModelProvidersKeys.length > 0
-            ? embeddingModelProvidersKeys[0]
-            : '';
-
-        const chatModelProvider =
-          localStorage.getItem('chatModelProvider') ||
-          defaultChatModelProvider ||
-          '';
-        const chatModel =
-          localStorage.getItem('chatModel') ||
-          (data.chatModelProviders &&
-            data.chatModelProviders[chatModelProvider]?.[0]) ||
-          '';
-        const embeddingModelProvider =
-          localStorage.getItem('embeddingModelProvider') ||
-          defaultEmbeddingModelProvider ||
-          '';
-        const embeddingModel =
-          localStorage.getItem('embeddingModel') ||
-          (data.embeddingModelProviders &&
-            data.embeddingModelProviders[embeddingModelProvider]?.[0]) ||
-          '';
-
-        setSelectedChatModelProvider(chatModelProvider);
-        setSelectedChatModel(chatModel);
-        setSelectedEmbeddingModelProvider(embeddingModelProvider);
-        setSelectedEmbeddingModel(embeddingModel);
-        setCustomOpenAIApiKey(localStorage.getItem('openAIApiKey') || '');
-        setCustomOpenAIBaseURL(localStorage.getItem('openAIBaseURL') || '');
-        setIsLoading(false);
-      };
-
-      fetchConfig();
-    }
-    // eslint-disable-next-line react-hooks/exhaustive-deps
-  }, [isOpen]);
-
-  const handleSubmit = async () => {
-    setIsUpdating(true);
-
-    try {
-      await fetch(`${process.env.NEXT_PUBLIC_API_URL}/config`, {
-        method: 'POST',
-        headers: {
-          'Content-Type': 'application/json',
-        },
-        body: JSON.stringify(config),
-      });
-
-      localStorage.setItem('chatModelProvider', selectedChatModelProvider!);
-      localStorage.setItem('chatModel', selectedChatModel!);
-      localStorage.setItem(
-        'embeddingModelProvider',
-        selectedEmbeddingModelProvider!,
-      );
-      localStorage.setItem('embeddingModel', selectedEmbeddingModel!);
-      localStorage.setItem('openAIApiKey', customOpenAIApiKey!);
-      localStorage.setItem('openAIBaseURL', customOpenAIBaseURL!);
-    } catch (err) {
-      console.log(err);
-    } finally {
-      setIsUpdating(false);
-      setIsOpen(false);
-
-      window.location.reload();
-    }
-  };
-
-  return (
-    <Transition appear show={isOpen} as={Fragment}>
-      <Dialog
-        as="div"
-        className="relative z-50"
-        onClose={() => setIsOpen(false)}
-      >
-        <Transition.Child
-          as={Fragment}
-          enter="ease-out duration-300"
-          enterFrom="opacity-0"
-          enterTo="opacity-100"
-          leave="ease-in duration-200"
-          leaveFrom="opacity-100"
-          leaveTo="opacity-0"
-        >
-          <div className="fixed inset-0 bg-white/50 dark:bg-black/50" />
-        </Transition.Child>
-        <div className="fixed inset-0 overflow-y-auto">
-          <div className="flex min-h-full items-center justify-center p-4 text-center">
-            <Transition.Child
-              as={Fragment}
-              enter="ease-out duration-200"
-              enterFrom="opacity-0 scale-95"
-              enterTo="opacity-100 scale-100"
-              leave="ease-in duration-100"
-              leaveFrom="opacity-100 scale-200"
-              leaveTo="opacity-0 scale-95"
-            >
-              <Dialog.Panel className="w-full max-w-md transform rounded-2xl bg-light-secondary dark:bg-dark-secondary border border-light-200 dark:border-dark-200 p-6 text-left align-middle shadow-xl transition-all">
-                <Dialog.Title className="text-xl font-medium leading-6 dark:text-white">
-                  Settings
-                </Dialog.Title>
-                {config && !isLoading && (
-                  <div className="flex flex-col space-y-4 mt-6">
-                    <div className="flex flex-col space-y-1">
-                      <p className="text-black/70 dark:text-white/70 text-sm">
-                        Theme
-                      </p>
-                      <ThemeSwitcher />
-                    </div>
-                    {config.chatModelProviders && (
-                      <div className="flex flex-col space-y-1">
-                        <p className="text-black/70 dark:text-white/70 text-sm">
-                          Chat model Provider
-                        </p>
-                        <Select
-                          value={selectedChatModelProvider ?? undefined}
-                          onChange={(e) => {
-                            setSelectedChatModelProvider(e.target.value);
-                            setSelectedChatModel(
-                              config.chatModelProviders[e.target.value][0],
-                            );
-                          }}
-                          options={Object.keys(config.chatModelProviders).map(
-                            (provider) => ({
-                              value: provider,
-                              label:
-                                provider.charAt(0).toUpperCase() +
-                                provider.slice(1),
-                            }),
-                          )}
-                        />
-                      </div>
-                    )}
-                    {selectedChatModelProvider &&
-                      selectedChatModelProvider != 'custom_openai' && (
-                        <div className="flex flex-col space-y-1">
-                          <p className="text-black/70 dark:text-white/70 text-sm">
-                            Chat Model
-                          </p>
-                          <Select
-                            value={selectedChatModel ?? undefined}
-                            onChange={(e) =>
-                              setSelectedChatModel(e.target.value)
-                            }
-                            options={(() => {
-                              const chatModelProvider =
-                                config.chatModelProviders[
-                                  selectedChatModelProvider
-                                ];
-
-                              return chatModelProvider
-                                ? chatModelProvider.length > 0
-                                  ? chatModelProvider.map((model) => ({
-                                      value: model,
-                                      label: model,
-                                    }))
-                                  : [
-                                      {
-                                        value: '',
-                                        label: 'No models available',
-                                        disabled: true,
-                                      },
-                                    ]
-                                : [
-                                    {
-                                      value: '',
-                                      label:
-                                        'Invalid provider, please check backend logs',
-                                      disabled: true,
-                                    },
-                                  ];
-                            })()}
-                          />
-                        </div>
-                      )}
-                    {selectedChatModelProvider &&
-                      selectedChatModelProvider === 'custom_openai' && (
-                        <>
-                          <div className="flex flex-col space-y-1">
-                            <p className="text-black/70 dark:text-white/70 text-sm">
-                              Model name
-                            </p>
-                            <Input
-                              type="text"
-                              placeholder="Model name"
-                              defaultValue={selectedChatModel!}
-                              onChange={(e) =>
-                                setSelectedChatModel(e.target.value)
-                              }
-                            />
-                          </div>
-                          <div className="flex flex-col space-y-1">
-                            <p className="text-black/70 dark:text-white/70 text-sm">
-                              Custom OpenAI API Key
-                            </p>
-                            <Input
-                              type="text"
-                              placeholder="Custom OpenAI API Key"
-                              defaultValue={customOpenAIApiKey!}
-                              onChange={(e) =>
-                                setCustomOpenAIApiKey(e.target.value)
-                              }
-                            />
-                          </div>
-                          <div className="flex flex-col space-y-1">
-                            <p className="text-black/70 dark:text-white/70 text-sm">
-                              Custom OpenAI Base URL
-                            </p>
-                            <Input
-                              type="text"
-                              placeholder="Custom OpenAI Base URL"
-                              defaultValue={customOpenAIBaseURL!}
-                              onChange={(e) =>
-                                setCustomOpenAIBaseURL(e.target.value)
-                              }
-                            />
-                          </div>
-                        </>
-                      )}
-                    {/* Embedding models */}
-                    {config.embeddingModelProviders && (
-                      <div className="flex flex-col space-y-1">
-                        <p className="text-black/70 dark:text-white/70 text-sm">
-                          Embedding model Provider
-                        </p>
-                        <Select
-                          value={selectedEmbeddingModelProvider ?? undefined}
-                          onChange={(e) => {
-                            setSelectedEmbeddingModelProvider(e.target.value);
-                            setSelectedEmbeddingModel(
-                              config.embeddingModelProviders[e.target.value][0],
-                            );
-                          }}
-                          options={Object.keys(
-                            config.embeddingModelProviders,
-                          ).map((provider) => ({
-                            label:
-                              provider.charAt(0).toUpperCase() +
-                              provider.slice(1),
-                            value: provider,
-                          }))}
-                        />
-                      </div>
-                    )}
-                    {selectedEmbeddingModelProvider && (
-                      <div className="flex flex-col space-y-1">
-                        <p className="text-black/70 dark:text-white/70 text-sm">
-                          Embedding Model
-                        </p>
-                        <Select
-                          value={selectedEmbeddingModel ?? undefined}
-                          onChange={(e) =>
-                            setSelectedEmbeddingModel(e.target.value)
-                          }
-                          options={(() => {
-                            const embeddingModelProvider =
-                              config.embeddingModelProviders[
-                                selectedEmbeddingModelProvider
-                              ];
-
-                            return embeddingModelProvider
-                              ? embeddingModelProvider.length > 0
-                                ? embeddingModelProvider.map((model) => ({
-                                    label: model,
-                                    value: model,
-                                  }))
-                                : [
-                                    {
-                                      label: 'No embedding models available',
-                                      value: '',
-                                      disabled: true,
-                                    },
-                                  ]
-                              : [
-                                  {
-                                    label:
-                                      'Invalid provider, please check backend logs',
-                                    value: '',
-                                    disabled: true,
-                                  },
-                                ];
-                          })()}
-                        />
-                      </div>
-                    )}
-                    <div className="flex flex-col space-y-1">
-                      <p className="text-black/70 dark:text-white/70 text-sm">
-                        OpenAI API Key
-                      </p>
-                      <Input
-                        type="text"
-                        placeholder="OpenAI API Key"
-                        defaultValue={config.openaiApiKey}
-                        onChange={(e) =>
-                          setConfig({
-                            ...config,
-                            openaiApiKey: e.target.value,
-                          })
-                        }
-                      />
-                    </div>
-                    <div className="flex flex-col space-y-1">
-                      <p className="text-black/70 dark:text-white/70 text-sm">
-                        Ollama API URL
-                      </p>
-                      <Input
-                        type="text"
-                        placeholder="Ollama API URL"
-                        defaultValue={config.ollamaApiUrl}
-                        onChange={(e) =>
-                          setConfig({
-                            ...config,
-                            ollamaApiUrl: e.target.value,
-                          })
-                        }
-                      />
-                    </div>
-                    <div className="flex flex-col space-y-1">
-                      <p className="text-black/70 dark:text-white/70 text-sm">
-                        GROQ API Key
-                      </p>
-                      <Input
-                        type="text"
-                        placeholder="GROQ API Key"
-                        defaultValue={config.groqApiKey}
-                        onChange={(e) =>
-                          setConfig({
-                            ...config,
-                            groqApiKey: e.target.value,
-                          })
-                        }
-                      />
-                    </div>
-                  </div>
-                )}
-                {isLoading && (
-                  <div className="w-full flex items-center justify-center mt-6 text-black/70 dark:text-white/70 py-6">
-                    <RefreshCcw className="animate-spin" />
-                  </div>
-                )}
-                <div className="w-full mt-6 space-y-2">
-                  <p className="text-xs text-black/50 dark:text-white/50">
-                    We'll refresh the page after updating the settings.
-                  </p>
-                  <button
-                    onClick={handleSubmit}
-                    className="bg-[#24A0ED] flex flex-row items-center space-x-2 text-white disabled:text-white/50 hover:bg-opacity-85 transition duration-100 disabled:bg-[#ececec21] rounded-full px-4 py-2"
-                    disabled={isLoading || isUpdating}
-                  >
-                    {isUpdating ? (
-                      <RefreshCw size={20} className="animate-spin" />
-                    ) : (
-                      <CloudUpload size={20} />
-                    )}
-                  </button>
-                </div>
-              </Dialog.Panel>
-            </Transition.Child>
-          </div>
-        </div>
-      </Dialog>
-    </Transition>
-  );
-};
-
-export default SettingsDialog;
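The deleted dialog persists its choices client-side under localStorage keys that the search components read back. A compact sketch of that contract; the helper names and interface are ours, the keys come from the code above.

// Illustrative helpers around the localStorage keys used by SettingsDialog.
interface ModelSelection {
  chatModelProvider: string;
  chatModel: string;
  embeddingModelProvider: string;
  embeddingModel: string;
}

const loadModelSelection = (): ModelSelection => ({
  chatModelProvider: localStorage.getItem('chatModelProvider') ?? '',
  chatModel: localStorage.getItem('chatModel') ?? '',
  embeddingModelProvider: localStorage.getItem('embeddingModelProvider') ?? '',
  embeddingModel: localStorage.getItem('embeddingModel') ?? '',
});

const saveModelSelection = (s: ModelSelection): void => {
  localStorage.setItem('chatModelProvider', s.chatModelProvider);
  localStorage.setItem('chatModel', s.chatModel);
  localStorage.setItem('embeddingModelProvider', s.embeddingModelProvider);
  localStorage.setItem('embeddingModel', s.embeddingModel);
};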
@@ -1,29 +1,21 @@
 'use client';
 
 import { cn } from '@/lib/utils';
-import { BookOpenText, Home, Search, SquarePen, Settings } from 'lucide-react';
+import { BookOpenText, Home, Search, SquarePen } from 'lucide-react';
 import { SiGithub } from '@icons-pack/react-simple-icons';
 import Link from 'next/link';
 import { useSelectedLayoutSegments } from 'next/navigation';
-import React, { useState, type ReactNode } from 'react';
+import React from 'react';
 import Layout from './Layout';
-import SettingsDialog from './SettingsDialog';
-
-const VerticalIconContainer = ({ children }: { children: ReactNode }) => {
-  return (
-    <div className="flex flex-col items-center gap-y-3 w-full">{children}</div>
-  );
-};
 
 const Sidebar = ({ children }: { children: React.ReactNode }) => {
   const segments = useSelectedLayoutSegments();
 
-  const [isSettingsOpen, setIsSettingsOpen] = useState(false);
-
   const navLinks = [
     {
       icon: Home,
       href: '/',
-      active: segments.length === 0 || segments.includes('c'),
+      active: segments.length === 0,
       label: 'Home',
     },
     {
@@ -43,56 +35,52 @@ const Sidebar = ({ children }: { children: React.ReactNode }) => {
   return (
     <div>
       <div className="hidden lg:fixed lg:inset-y-0 lg:z-50 lg:flex lg:w-20 lg:flex-col">
-        <div className="flex grow flex-col items-center justify-between gap-y-5 overflow-y-auto bg-light-secondary dark:bg-dark-secondary px-2 py-8">
+        <div className="flex grow flex-col items-center justify-between gap-y-5 overflow-y-auto bg-[#111111] px-2 py-8">
           <a href="/">
-            <SquarePen className="cursor-pointer" />
+            <SquarePen className="text-white cursor-pointer" />
           </a>
-          <VerticalIconContainer>
+          <div className="flex flex-col items-center gap-y-3 w-full">
             {navLinks.map((link, i) => (
               <Link
                 key={i}
                 href={link.href}
                 className={cn(
-                  'relative flex flex-row items-center justify-center cursor-pointer hover:bg-black/10 dark:hover:bg-white/10 duration-150 transition w-full py-2 rounded-lg',
-                  link.active
-                    ? 'text-black dark:text-white'
-                    : 'text-black/70 dark:text-white/70',
+                  'relative flex flex-row items-center justify-center cursor-pointer hover:bg-white/10 hover:text-white duration-150 transition w-full py-2 rounded-lg',
+                  link.active ? 'text-white' : 'text-white/70',
                 )}
               >
                 <link.icon />
                 {link.active && (
-                  <div className="absolute right-0 -mr-2 h-full w-1 rounded-l-lg bg-black dark:bg-white" />
+                  <div className="absolute right-0 -mr-2 h-full w-1 rounded-l-lg bg-white" />
                 )}
               </Link>
             ))}
-          </VerticalIconContainer>
-
-          <Settings
-            onClick={() => setIsSettingsOpen(!isSettingsOpen)}
-            className="cursor-pointer"
-          />
-
-          <SettingsDialog
-            isOpen={isSettingsOpen}
-            setIsOpen={setIsSettingsOpen}
-          />
+          </div>
+          <Link
+            href="https://github.com/ItzCrazyKns/Perplexica"
+            className="flex flex-col items-center text-center justify-center"
+          >
+            <SiGithub
+              className="text-white"
+              onPointerEnterCapture={undefined}
+              onPointerLeaveCapture={undefined}
+            />
+          </Link>
         </div>
       </div>
 
-      <div className="fixed bottom-0 w-full z-50 flex flex-row items-center gap-x-6 bg-light-primary dark:bg-dark-primary px-4 py-4 shadow-sm lg:hidden">
+      <div className="fixed bottom-0 w-full z-50 flex flex-row items-center gap-x-6 bg-[#111111] px-4 py-4 shadow-sm lg:hidden">
         {navLinks.map((link, i) => (
           <Link
             href={link.href}
             key={i}
             className={cn(
               'relative flex flex-col items-center space-y-1 text-center w-full',
-              link.active
-                ? 'text-black dark:text-white'
-                : 'text-black dark:text-white/70',
+              link.active ? 'text-white' : 'text-white/70',
            )}
          >
            {link.active && (
-              <div className="absolute top-0 -mt-4 h-1 w-full rounded-b-lg bg-black dark:bg-white" />
+              <div className="absolute top-0 -mt-4 h-1 w-full rounded-b-lg bg-white" />
            )}
            <link.icon />
            <p className="text-xs">{link.label}</p>
@@ -1,16 +0,0 @@
-'use client';
-import { ThemeProvider } from 'next-themes';
-
-const ThemeProviderComponent = ({
-  children,
-}: {
-  children: React.ReactNode;
-}) => {
-  return (
-    <ThemeProvider attribute="class" enableSystem={false} defaultTheme="dark">
-      {children}
-    </ThemeProvider>
-  );
-};
-
-export default ThemeProviderComponent;
@@ -1,61 +0,0 @@
-'use client';
-import { useTheme } from 'next-themes';
-import { SunIcon, MoonIcon, MonitorIcon } from 'lucide-react';
-import { useCallback, useEffect, useState } from 'react';
-import { Select } from '../SettingsDialog';
-
-type Theme = 'dark' | 'light' | 'system';
-
-const ThemeSwitcher = ({ className }: { className?: string }) => {
-  const [mounted, setMounted] = useState(false);
-
-  const { theme, setTheme } = useTheme();
-
-  const isTheme = useCallback((t: Theme) => t === theme, [theme]);
-
-  const handleThemeSwitch = (theme: Theme) => {
-    setTheme(theme);
-  };
-
-  useEffect(() => {
-    setMounted(true);
-  }, []);
-
-  useEffect(() => {
-    if (isTheme('system')) {
-      const preferDarkScheme = window.matchMedia(
-        '(prefers-color-scheme: dark)',
-      );
-
-      const detectThemeChange = (event: MediaQueryListEvent) => {
-        const theme: Theme = event.matches ? 'dark' : 'light';
-        setTheme(theme);
-      };
-
-      preferDarkScheme.addEventListener('change', detectThemeChange);
-
-      return () => {
-        preferDarkScheme.removeEventListener('change', detectThemeChange);
-      };
-    }
-  }, [isTheme, setTheme, theme]);
-
-  // Avoid Hydration Mismatch
-  if (!mounted) {
-    return null;
-  }
-
-  return (
-    <Select
-      className={className}
-      value={theme}
-      onChange={(e) => handleThemeSwitch(e.target.value as Theme)}
-      options={[
-        { value: 'light', label: 'Light' },
-        { value: 'dark', label: 'Dark' },
-      ]}
-    />
-  );
-};
-
-export default ThemeSwitcher;
@@ -1,22 +0,0 @@
-import { Message } from '@/components/ChatWindow';
-
-export const getSuggestions = async (chatHisory: Message[]) => {
-  const chatModel = localStorage.getItem('chatModel');
-  const chatModelProvider = localStorage.getItem('chatModelProvider');
-
-  const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/suggestions`, {
-    method: 'POST',
-    headers: {
-      'Content-Type': 'application/json',
-    },
-    body: JSON.stringify({
-      chat_history: chatHisory,
-      chat_model: chatModel,
-      chat_model_provider: chatModelProvider,
-    }),
-  });
-
-  const data = (await res.json()) as { suggestions: string[] };
-
-  return data.suggestions;
-};
@@ -3,13 +3,7 @@ import { twMerge } from 'tailwind-merge';
 
 export const cn = (...classes: ClassValue[]) => twMerge(clsx(...classes));
 
-export const formatTimeDifference = (
-  date1: Date | string,
-  date2: Date | string,
-): string => {
-  date1 = new Date(date1);
-  date2 = new Date(date2);
-
+export const formatTimeDifference = (date1: Date, date2: Date): string => {
   const diffInSeconds = Math.floor(
     Math.abs(date2.getTime() - date1.getTime()) / 1000,
   );
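The v1.7.0 side widens both parameters to `Date | string` and normalizes them with `new Date(...)` up front, so callers can pass raw ISO timestamps without converting first. A usage sketch under that signature (the timestamp value is illustrative):

// Usage sketch for the widened v1.7.0 signature.
import { formatTimeDifference } from '@/lib/utils';

// ISO strings and Date objects both work; each argument is normalized
// before the absolute difference in seconds is computed.
const ago = formatTimeDifference(new Date(), '2024-05-01T12:00:00Z');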
@@ -1,6 +1,6 @@
 {
   "name": "perplexica-frontend",
-  "version": "1.7.0",
+  "version": "1.0.0",
   "license": "MIT",
   "author": "ItzCrazyKns",
   "scripts": {
@@ -20,12 +20,9 @@
     "lucide-react": "^0.363.0",
     "markdown-to-jsx": "^7.4.5",
     "next": "14.1.4",
-    "next-themes": "^0.3.0",
     "react": "^18",
     "react-dom": "^18",
-    "react-text-to-speech": "^0.14.5",
     "react-textarea-autosize": "^8.5.3",
-    "sonner": "^1.4.41",
     "tailwind-merge": "^2.2.2",
     "yet-another-react-lightbox": "^3.17.2",
     "zod": "^3.22.4"
@@ -1,17 +1,4 @@
 import type { Config } from 'tailwindcss';
-import type { DefaultColors } from 'tailwindcss/types/generated/colors';
-
-const themeDark = (colors: DefaultColors) => ({
-  50: '#0a0a0a',
-  100: '#111111',
-  200: '#1c1c1c',
-});
-
-const themeLight = (colors: DefaultColors) => ({
-  50: '#fcfcf9',
-  100: '#f3f3ee',
-  200: '#e8e8e3',
-});
 
 const config: Config = {
   content: [
@@ -19,33 +6,8 @@ const config: Config = {
     './components/**/*.{js,ts,jsx,tsx,mdx}',
     './app/**/*.{js,ts,jsx,tsx,mdx}',
   ],
-  darkMode: 'class',
   theme: {
-    extend: {
-      borderColor: ({ colors }) => {
-        return {
-          light: themeLight(colors),
-          dark: themeDark(colors),
-        };
-      },
-      colors: ({ colors }) => {
-        const colorsDark = themeDark(colors);
-        const colorsLight = themeLight(colors);
-
-        return {
-          dark: {
-            primary: colorsDark[50],
-            secondary: colorsDark[100],
-            ...colorsDark,
-          },
-          light: {
-            primary: colorsLight[50],
-            secondary: colorsLight[100],
-            ...colorsLight,
-          },
-        };
-      },
-    },
+    extend: {},
   },
   plugins: [require('@tailwindcss/typography')],
 };
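The removed helpers are what back utility classes such as `bg-light-primary` and `dark:bg-dark-secondary` seen throughout the component diffs above: the palette is registered under `light` and `dark` namespaces, and `darkMode: 'class'` keys the `dark:` variants off a class on the root element. A sketch of how that surfaces in markup (the component itself is illustrative):

// Illustrative component; the palette values come from the removed config:
// 'bg-light-primary' resolves to #fcfcf9 and 'dark:bg-dark-primary' to #0a0a0a.
const Panel = () => (
  <div className="bg-light-primary dark:bg-dark-primary text-black dark:text-white">
    Theme-aware surface
  </div>
);

export default Panel;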
35 ui/yarn.lock
@@ -2244,11 +2244,6 @@ natural-compare@^1.4.0:
   resolved "https://registry.yarnpkg.com/natural-compare/-/natural-compare-1.4.0.tgz#4abebfeed7541f2c27acfb29bdbbd15c8d5ba4f7"
   integrity sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw==
 
-next-themes@^0.3.0:
-  version "0.3.0"
-  resolved "https://registry.yarnpkg.com/next-themes/-/next-themes-0.3.0.tgz#b4d2a866137a67d42564b07f3a3e720e2ff3871a"
-  integrity sha512-/QHIrsYpd6Kfk7xakK4svpDI5mmXP0gfvCoJdGpZQ2TOrQZmsW0QxjaiLn8wbIKjtm4BTSqLoix4lxYYOnLJ/w==
-
 next@14.1.4:
   version "14.1.4"
   resolved "https://registry.yarnpkg.com/next/-/next-14.1.4.tgz#203310f7310578563fd5c961f0db4729ce7a502d"
@@ -2637,11 +2632,6 @@ react-is@^16.13.1:
   resolved "https://registry.yarnpkg.com/react-is/-/react-is-16.13.1.tgz#789729a4dc36de2999dc156dd6c1d9c18cea56a4"
   integrity sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ==
 
-react-text-to-speech@^0.14.5:
-  version "0.14.5"
-  resolved "https://registry.yarnpkg.com/react-text-to-speech/-/react-text-to-speech-0.14.5.tgz#f918786ab283311535682011045bd49777193300"
-  integrity sha512-3brr/IrK/5YTtOZSTo+Y8b+dnWelzfZiDZvkXnOct1e7O7fgA/h9bYAVrtwSRo/VxKfdw+wh6glkj6M0mlQuQQ==
-
 react-textarea-autosize@^8.5.3:
   version "8.5.3"
   resolved "https://registry.yarnpkg.com/react-textarea-autosize/-/react-textarea-autosize-8.5.3.tgz#d1e9fe760178413891484847d3378706052dd409"
@@ -2844,11 +2834,6 @@ slash@^3.0.0:
   resolved "https://registry.yarnpkg.com/slash/-/slash-3.0.0.tgz#6539be870c165adbd5240220dbe361f1bc4d4634"
   integrity sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==
 
-sonner@^1.4.41:
-  version "1.4.41"
-  resolved "https://registry.yarnpkg.com/sonner/-/sonner-1.4.41.tgz#ff085ae4f4244713daf294959beaa3e90f842d2c"
-  integrity sha512-uG511ggnnsw6gcn/X+YKkWPo5ep9il9wYi3QJxHsYe7yTZ4+cOd1wuodOUmOpFuXL+/RE3R04LczdNCDygTDgQ==
-
 source-map-js@^1.0.2, source-map-js@^1.2.0:
   version "1.2.0"
   resolved "https://registry.yarnpkg.com/source-map-js/-/source-map-js-1.2.0.tgz#16b809c162517b5b8c3e7dcd315a2a5c2612b2af"
@@ -2859,16 +2844,7 @@ streamsearch@^1.1.0:
   resolved "https://registry.yarnpkg.com/streamsearch/-/streamsearch-1.1.0.tgz#404dd1e2247ca94af554e841a8ef0eaa238da764"
   integrity sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==
 
-"string-width-cjs@npm:string-width@^4.2.0":
-  version "4.2.3"
-  resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
-  integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
-  dependencies:
-    emoji-regex "^8.0.0"
-    is-fullwidth-code-point "^3.0.0"
-    strip-ansi "^6.0.1"
-
-string-width@^4.1.0:
+"string-width-cjs@npm:string-width@^4.2.0", string-width@^4.1.0:
   version "4.2.3"
   resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
   integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
@@ -2932,14 +2908,7 @@ string.prototype.trimstart@^1.0.8:
     define-properties "^1.2.1"
     es-object-atoms "^1.0.0"
 
-"strip-ansi-cjs@npm:strip-ansi@^6.0.1":
-  version "6.0.1"
-  resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
-  integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
-  dependencies:
-    ansi-regex "^5.0.1"
-
-strip-ansi@^6.0.0, strip-ansi@^6.0.1:
+"strip-ansi-cjs@npm:strip-ansi@^6.0.1", strip-ansi@^6.0.0, strip-ansi@^6.0.1:
   version "6.0.1"
   resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
   integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==