Mirror of https://github.com/ItzCrazyKns/Perplexica.git (synced 2025-04-30 08:12:26 +00:00)
Compare commits
54 Commits
Commits (SHA1):
35a3eda213
dfed6a0ad8
e0d9522435
f7c3bc2823
6fb0c5b362
f4628ae52d
0ac971e6b4
4ff6502dae
795309cfe2
d04ba91c85
8bf4269208
4c7942d2e8
aa55206a30
27d7b000d0
7853c18b6f
64ea4b4289
c61facef13
fcff93a594
3bfaf9be28
68b595023e
8b9b4085ea
2e58dab30a
48018990be
ebbe18ab45
cef75279c5
180e204c2d
0e2f4514b4
0993c5a760
100872f2d9
22aee27cda
9d30224faa
b622df5a9f
1b18715f8f
9816eb1d36
828eeb0c77
c852bee8ed
954b4bf89a
3ef39c69a7
7a28be9e1a
a60145137c
c56a058a74
4e20c4ac56
e6c2042df6
7eace1e6bd
baef45b456
9a7af945b0
09463999c2
0f6986fc9b
5e940914a3
ac4cba32c8
0fedaef537
4f5f6be85f
17fbc28172
0af66f8b72
Makefile (new file, 20 lines)
@ -0,0 +1,20 @@
.PHONY: run
run:
	docker compose -f docker-compose.yaml up


.PHONY: rebuild-run
rebuild-run:
	docker compose -f docker-compose.yaml build --no-cache \
	&& docker compose -f docker-compose.yaml up


.PHONY: run-app-only
run-app-only:
	docker compose -f app-docker-compose.yaml up


.PHONY: rebuild-run-app-only
rebuild-run-app-only:
	docker compose -f app-docker-compose.yaml build --no-cache \
	&& docker compose -f app-docker-compose.yaml up
README.md (33 changed lines)
@ -11,6 +11,7 @@
- [Getting Started with Docker (Recommended)](#getting-started-with-docker-recommended)
- [Non-Docker Installation](#non-docker-installation)
- [Ollama connection errors](#ollama-connection-errors)
- [Using as a Search Engine](#using-as-a-search-engine)
- [One-Click Deployment](#one-click-deployment)
- [Upcoming Features](#upcoming-features)
- [Support Us](#support-us)
@ -92,6 +93,8 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.

**Note**: Using Docker is recommended as it simplifies the setup process, especially for managing environment variables and dependencies.

See the [installation documentation](https://github.com/ItzCrazyKns/Perplexica/tree/master/docs/installation) for more information, like exposing it to your network, etc.

### Ollama connection errors

If you're facing an Ollama connection error, it is usually because the backend cannot reach Ollama's API. You can fix it by updating your Ollama API URL in the settings menu to the following:
@ -102,10 +105,40 @@ On Linux: `http://private_ip_of_computer_hosting_ollama:11434`

You need to edit the ports accordingly.

## Using as a Search Engine

If you wish to use Perplexica as an alternative to traditional search engines like Google or Bing, or if you want to add a shortcut for quick access from your browser's search bar, follow these steps:

1. Open your browser's settings.
2. Navigate to the 'Search Engines' section.
3. Add a new site search with the following URL: `http://localhost:3000/?q=%s`. Replace `localhost` with your IP address or domain name, and `3000` with the port number if Perplexica is not hosted locally.
4. Click the add button. Now, you can use Perplexica directly from your browser's search bar.

## One-Click Deployment

[](https://repocloud.io/details/?app_id=267)

## Deploy Perplexica backend to Google GKE

0: Install `docker` and `terraform` (process specific to your system)
1a: Copy the `sample.env` file to `.env`
1b: Copy the `deploy/gcp/sample.env` file to `deploy/gcp/.env`
2a: Fill out the desired LLM provider access keys etc. in `.env`

- Note: you will have to come back and edit this file again once you have the address of the K8s backend deployment

2b: Fill out the GCP info in `deploy/gcp/.env`
3: Edit `GCP_REPO` to the correct Docker image repo path if you are using something other than Container Registry
4: Edit the `PREFIX` if you would like images and GKE entities to be prefixed with something else
5: In `deploy/gcp`, run `make init` to initialize Terraform
6: Follow the normal Perplexica configuration steps outlined in the project readme
7: Authenticate Docker with the appropriate credentials for the repo, e.g. for `gcr.io` -> `gcloud auth configure-docker`
8: In `deploy/gcp`, run `make build-deploy` to build and push the project images to the repo, create a GKE cluster, and deploy the app
9: Once deployed successfully, edit the `.env` file in the root project folder and update `REMOTE_BACKEND_ADDRESS` with the remote K8s deployment address and port
10: In the root project folder, run `make rebuild-run-app-only`

If you configured everything correctly, the frontend app will run locally and provide you with a local URL to open it.
Now you can run queries against the remotely deployed backend from your local machine. :celebrate:

## Upcoming Features

- [ ] Finalizing Copilot Mode
app-docker-compose.yaml (new file, 13 lines)
@ -0,0 +1,13 @@
services:
  perplexica-frontend:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - NEXT_PUBLIC_SUPER_SECRET_KEY=${SUPER_SECRET_KEY}
        - NEXT_PUBLIC_API_URL=https://${REMOTE_BACKEND_ADDRESS}/api
        - NEXT_PUBLIC_WS_URL=wss://${REMOTE_BACKEND_ADDRESS}
    expose:
      - 3000
    ports:
      - 3000:3000
app.dockerfile
@ -2,8 +2,11 @@ FROM node:alpine
ARG NEXT_PUBLIC_WS_URL
ARG NEXT_PUBLIC_API_URL
ARG NEXT_PUBLIC_SUPER_SECRET_KEY

ENV NEXT_PUBLIC_WS_URL=${NEXT_PUBLIC_WS_URL}
ENV NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}
ENV NEXT_PUBLIC_SUPER_SECRET_KEY=${NEXT_PUBLIC_SUPER_SECRET_KEY}

WORKDIR /home/perplexica

@ -12,4 +15,4 @@ COPY ui /home/perplexica/
RUN yarn install
RUN yarn build

CMD ["yarn", "start"]
deploy/gcp/.gitignore (vendored, new file, 6 lines)
@ -0,0 +1,6 @@
.env
.auto.tfvars
.terraform
terraform.tfstate
terraform.tfstate.*
.terraform.lock.hcl
deploy/gcp/Makefile (new file, 103 lines)
@ -0,0 +1,103 @@
# Adds all the deployment-relevant sensitive information about the project
include .env

# Adds secrets/keys we have defined for the project locally and for deployment
include ../../.env

# Use `location-id-docker.pkg` for Artifact Registry, e.g. west-1-docker.pkg
GCP_REPO=gcr.io
PREFIX=perplexica
SEARCH_PORT=8080
BACKEND_PORT=3001
SEARCH_IMAGE_TAG=$(GCP_REPO)/$(GCP_PROJECT_ID)/$(PREFIX)-searxng:latest
BACKEND_IMAGE_TAG=$(GCP_REPO)/$(GCP_PROJECT_ID)/$(PREFIX)-backend:latest
APP_IMAGE_TAG=$(GCP_REPO)/$(GCP_PROJECT_ID)/$(PREFIX)-app:latest
CLUSTER_NAME=$(PREFIX)-cluster


.PHONY: build-deploy
build-deploy: docker-build-all deploy


.PHONY: docker-build-all
docker-build-all: docker-build-push-searxng docker-build-push-backend docker-build-push-app


.PHONY: show_config
show_config:
	@echo $(GCP_PROJECT_ID) \
	&& echo $(CLUSTER_NAME) \
	&& echo $(GCP_REGION) \
	&& echo $(GCP_SERVICE_ACCOUNT_KEY_FILE) \
	&& echo $(SEARCH_IMAGE_TAG) \
	&& echo $(BACKEND_IMAGE_TAG) \
	&& echo $(APP_IMAGE_TAG) \
	&& echo $(SEARCH_PORT) \
	&& echo $(BACKEND_PORT) \
	&& echo $(OPENAI) \
	&& echo $(SUPER_SECRET_KEY)

.PHONY: docker-build-push-searxng
docker-build-push-searxng:
	cd ../../ && docker build -f ./deploy/gcp/searxng.dockerfile -t $(SEARCH_IMAGE_TAG) . --platform="linux/amd64"
	docker push $(SEARCH_IMAGE_TAG)


.PHONY: docker-build-push-backend
docker-build-push-backend:
	cd ../../ && docker build -f ./backend.dockerfile -t $(BACKEND_IMAGE_TAG) . --platform="linux/amd64"
	docker push $(BACKEND_IMAGE_TAG)


.PHONY: docker-build-push-app
docker-build-push-app:
	#
	# cd ../../ && docker build -f ./app.dockerfile -t $(APP_IMAGE_TAG) . --platform="linux/amd64"
	# docker push $(APP_IMAGE_TAG)


.PHONY: init
init:
	terraform init


.PHONY: deploy
deploy:
	export TF_VAR_project_id=$(GCP_PROJECT_ID) \
	&& export TF_VAR_cluster_name=$(CLUSTER_NAME) \
	&& export TF_VAR_region=$(GCP_REGION) \
	&& export TF_VAR_key_file=$(GCP_SERVICE_ACCOUNT_KEY_FILE) \
	&& export TF_VAR_search_image=$(SEARCH_IMAGE_TAG) \
	&& export TF_VAR_backend_image=$(BACKEND_IMAGE_TAG) \
	&& export TF_VAR_app_image=$(APP_IMAGE_TAG) \
	&& export TF_VAR_search_port=$(SEARCH_PORT) \
	&& export TF_VAR_backend_port=$(BACKEND_PORT) \
	&& export TF_VAR_open_ai=$(OPENAI) \
	&& export TF_VAR_secret_key=$(SUPER_SECRET_KEY) \
	&& terraform apply


.PHONY: teardown
teardown:
	export TF_VAR_project_id=$(GCP_PROJECT_ID) \
	&& export TF_VAR_cluster_name=$(CLUSTER_NAME) \
	&& export TF_VAR_region=$(GCP_REGION) \
	&& export TF_VAR_key_file=$(GCP_SERVICE_ACCOUNT_KEY_FILE) \
	&& export TF_VAR_search_image=$(SEARCH_IMAGE_TAG) \
	&& export TF_VAR_backend_image=$(BACKEND_IMAGE_TAG) \
	&& export TF_VAR_app_image=$(APP_IMAGE_TAG) \
	&& export TF_VAR_search_port=$(SEARCH_PORT) \
	&& export TF_VAR_backend_port=$(BACKEND_PORT) \
	&& export TF_VAR_open_ai=$(OPENAI) \
	&& export TF_VAR_secret_key=$(SUPER_SECRET_KEY) \
	&& terraform destroy


.PHONY: auth-kubectl
auth-kubectl:
	gcloud container clusters get-credentials $(CLUSTER_NAME) --region=$(GCP_REGION)


.PHONY: rollout-new-version-backend
rollout-new-version-backend: auth-kubectl
	kubectl rollout restart deploy backend
deploy/gcp/gke-cluster/main.tf (new file, 60 lines)
@ -0,0 +1,60 @@
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "5.28.0"
    }
  }
}

variable "project_id" {
  description = "The ID of the project in which resources will be deployed."
  type        = string
}

variable "name" {
  description = "The GKE Cluster name"
  type        = string
}

variable "region" {
  description = "The GCP region to deploy to."
  type        = string
}

variable "key_file" {
  description = "The path to the GCP service account key file."
  type        = string
}

provider "google" {
  credentials = file(var.key_file)
  project     = var.project_id
  region      = var.region
}

resource "google_container_cluster" "cluster" {
  name                     = var.name
  location                 = var.region
  initial_node_count       = 1
  remove_default_node_pool = true
}

resource "google_container_node_pool" "primary_preemptible_nodes" {
  name       = "${google_container_cluster.cluster.name}-node-pool"
  location   = var.region
  cluster    = google_container_cluster.cluster.name
  node_count = 1

  node_config {
    machine_type = "n1-standard-4"
    disk_size_gb = 25
    spot         = true
    oauth_scopes = [
      "https://www.googleapis.com/auth/cloud-platform",
      "https://www.googleapis.com/auth/devstorage.read_only",
      "https://www.googleapis.com/auth/logging.write",
      "https://www.googleapis.com/auth/monitoring",
    ]
  }
}
deploy/gcp/main.tf (new file, 238 lines)
@ -0,0 +1,238 @@
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "5.28.0"
    }
    kubernetes = {
      source = "hashicorp/kubernetes"
    }
  }
}

provider "google" {
  credentials = file(var.key_file)
  project     = var.project_id
  region      = var.region
}

data "google_client_config" "default" {
  depends_on = [module.gke-cluster]
}

# Defer reading the cluster data until the GKE cluster exists.
data "google_container_cluster" "default" {
  name       = var.cluster_name
  depends_on = [module.gke-cluster]
  location   = var.region
}

provider "kubernetes" {
  host  = "https://${data.google_container_cluster.default.endpoint}"
  token = data.google_client_config.default.access_token
  cluster_ca_certificate = base64decode(
    data.google_container_cluster.default.master_auth[0].cluster_ca_certificate,
  )
}

#####################################################################################################
# SearXNG - Search engine deployment and service
#####################################################################################################
resource "kubernetes_deployment" "searxng" {
  metadata {
    name = "searxng"
    labels = {
      app = "searxng"
    }
  }
  spec {
    replicas = 1
    selector {
      match_labels = {
        component = "searxng"
      }
    }
    template {
      metadata {
        labels = {
          component = "searxng"
        }
      }
      spec {
        container {
          image = var.search_image
          name  = "searxng-container"
          port {
            container_port = var.search_port
          }
        }
      }
    }
  }
}

resource "kubernetes_service" "searxng_service" {
  metadata {
    name      = "searxng-service"
    namespace = "default"
    annotations = {
      "networking.gke.io/load-balancer-type" = "Internal" # Remove to create an external loadbalancer
    }
  }

  spec {
    selector = {
      component = "searxng"
    }

    port {
      port        = var.search_port
      target_port = var.search_port
    }

    type = "LoadBalancer"
  }
}

#####################################################################################################
# Perplexica - backend deployment and service
#####################################################################################################
resource "kubernetes_deployment" "backend" {
  metadata {
    name = "backend"
    labels = {
      app = "backend"
    }
  }
  spec {
    replicas = 1
    selector {
      match_labels = {
        component = "backend"
      }
    }
    template {
      metadata {
        labels = {
          component = "backend"
        }
      }
      spec {
        container {
          image = var.backend_image
          name  = "backend-container"
          port {
            container_port = var.backend_port
          }
          env {
            # searxng service ip
            name  = "SEARXNG_API_URL"
            value = "http://${kubernetes_service.searxng_service.status[0].load_balancer[0].ingress[0].ip}:${var.search_port}"
          }
          env {
            # openai key
            name  = "OPENAI"
            value = var.open_ai
          }
          env {
            # port
            name  = "PORT"
            value = var.backend_port
          }
          env {
            # Access key for backend
            name  = "SUPER_SECRET_KEY"
            value = var.secret_key
          }
        }
      }
    }
  }
}

resource "kubernetes_service" "backend_service" {
  metadata {
    name      = "backend-service"
    namespace = "default"
  }

  spec {
    selector = {
      component = "backend"
    }

    port {
      port        = var.backend_port
      target_port = var.backend_port
    }

    type = "LoadBalancer"
  }
}

#####################################################################################################
# Variable and module definitions
#####################################################################################################
variable "project_id" {
  description = "The ID of the project in which the resources will be deployed."
  type        = string
}

variable "key_file" {
  description = "The path to the GCP service account key file."
  type        = string
}

variable "region" {
  description = "The GCP region to deploy to."
  type        = string
}

variable "cluster_name" {
  description = "The name of the GKE cluster."
  type        = string
}

variable "search_image" {
  description = "Tag for the searxng image"
  type        = string
}

variable "backend_image" {
  description = "Tag for the Perplexica backend image"
  type        = string
}

variable "app_image" {
  description = "Tag for the app image"
  type        = string
}

variable "open_ai" {
  description = "OPENAI access key"
  type        = string
}

variable "secret_key" {
  description = "Access key to secure backend endpoints"
  type        = string
}

variable "search_port" {
  description = "Port for searxng service"
  type        = number
}

variable "backend_port" {
  description = "Port for backend service"
  type        = number
}

module "gke-cluster" {
  source = "./gke-cluster"

  project_id = var.project_id
  name       = var.cluster_name
  region     = var.region
  key_file   = var.key_file
}
deploy/gcp/sample.env (new file, 7 lines)
@ -0,0 +1,7 @@
# Rename this file to .env
# 0: Update to your GCP project id
# 1: Update to the path where the GCP service account credential file is kept
# 2: Update the region to your desired GCP region
GCP_PROJECT_ID=name-of-your-gcp-project
GCP_SERVICE_ACCOUNT_KEY_FILE=/Path/to/your/gcp-service-account-key-file.json
GCP_REGION=us-east1
deploy/gcp/searxng.dockerfile (new file, 3 lines)
@ -0,0 +1,3 @@
FROM searxng/searxng

COPY searxng/ /etc/searxng/
docker-compose.yaml
@ -1,31 +1,45 @@
services:
  searxng:
    build:
      context: .
      dockerfile: searxng.dockerfile
    image: docker.io/searxng/searxng:latest
    volumes:
      - ./searxng:/etc/searxng:rw
    ports:
      - 4000:8080
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-backend:
    build:
      context: .
      dockerfile: backend.dockerfile
      args:
        - SEARXNG_API_URL=http://searxng:8080
        - SEARXNG_API_URL=null
    volumes:
      - "/Volumes/keys/headllamp/keys/:/var/keys/"
      - "${GOOGLE_APPLICATION_CREDENTIALS}:/var/keys/gcp_service_account.json"
    environment:
      SEARXNG_API_URL: 'http://searxng:8080'
      SUPER_SECRET_KEY: ${SUPER_SECRET_KEY}
      OPENAI: ${OPENAI}
      GROQ: ${GROQ}
      OLLAMA_API_URL: ${OLLAMA_API_URL}
      GOOGLE_APPLICATION_CREDENTIALS: /var/keys/gcp_service_account.json
      USE_JWT: ${USE_JWT}
    depends_on:
      - searxng
    ports:
      - 3001:3001
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-frontend:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - NEXT_PUBLIC_SUPER_SECRET_KEY=${SUPER_SECRET_KEY}
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
    depends_on:
@ -34,6 +48,7 @@ services:
      - 3000:3000
    networks:
      - perplexica-network
    restart: unless-stopped

networks:
  perplexica-network:
docs/architecture/WORKING.md
@ -5,7 +5,7 @@ Curious about how Perplexica works? Don't worry, we'll cover it here. Before we
We'll understand how Perplexica works by taking an example of a scenario where a user asks: "How does an A.C. work?". We'll break down the process into steps to make it easier to understand. The steps are as follows:

1. The message is sent via WS to the backend server where it invokes the chain. The chain will depend on your focus mode. For this example, let's assume we use the "webSearch" focus mode.
2. The chain is now invoked; first, the message is passed to another chain where it first predicts (using the chat history and the question) whether there is a need for sources or searching the web. If there is, it will generate a query (in accordance with the chat history) for searching the web that we'll take up later. If not, the chain will end there, and then the answer generator chain, also known as the response generator, will be started.
2. The chain is now invoked; first, the message is passed to another chain where it first predicts (using the chat history and the question) whether there is a need for sources and searching the web. If there is, it will generate a query (in accordance with the chat history) for searching the web that we'll take up later. If not, the chain will end there, and then the answer generator chain, also known as the response generator, will be started.
3. The query returned by the first chain is passed to SearXNG to search the web for information.
4. The information retrieved at this point comes from a keyword-based search. We then convert both the information and the query into embeddings and perform a similarity search to find the sources most relevant to answering the query.
5. After all this is done, the sources are passed to the response generator. This chain takes all the chat history, the query, and the sources. It generates a response that is streamed to the UI.
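The similarity-search step in point 4 can be sketched in TypeScript. This is only an illustration, not the project's actual rerank code: `computeCosineSimilarity` and `rerankDocs` are hypothetical names, while `embedQuery` and `embedDocuments` follow the LangChain `Embeddings` interface used elsewhere in this changeset.

```typescript
import type { Embeddings } from '@langchain/core/embeddings';
import { Document } from '@langchain/core/documents';

// Cosine similarity between two equal-length vectors.
const computeCosineSimilarity = (a: number[], b: number[]): number => {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const normA = Math.sqrt(a.reduce((s, v) => s + v * v, 0));
  const normB = Math.sqrt(b.reduce((s, v) => s + v * v, 0));
  return dot / (normA * normB);
};

// Embed the query and the keyword-search results, then order the results
// by how similar they are to the query.
const rerankDocs = async (
  query: string,
  docs: Document[],
  embeddings: Embeddings,
): Promise<Document[]> => {
  const [queryEmbedding, docEmbeddings] = await Promise.all([
    embeddings.embedQuery(query),
    embeddings.embedDocuments(docs.map((d) => d.pageContent)),
  ]);

  return docs
    .map((doc, i) => ({
      doc,
      score: computeCosineSimilarity(queryEmbedding, docEmbeddings[i]),
    }))
    .sort((a, b) => b.score - a.score)
    .map(({ doc }) => doc);
};
```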
docs/installation/NETWORKING.md (new file, 109 lines)
@ -0,0 +1,109 @@
# Expose Perplexica to a network

This guide will show you how to make Perplexica available over a network. Follow these steps to allow computers on the same network to interact with Perplexica. Choose the instructions that match the operating system you are using.

## Windows

1. Open PowerShell as Administrator

2. Navigate to the directory containing the `docker-compose.yaml` file

3. Stop and remove the existing Perplexica containers and images:

```
docker compose down --rmi all
```

4. Open the `docker-compose.yaml` file in a text editor like Notepad++

5. Replace `127.0.0.1` with the IP address of the server Perplexica is running on in these two lines:

```
args:
  - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
  - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```

6. Save and close the `docker-compose.yaml` file

7. Rebuild and restart the Perplexica container:

```
docker compose up -d --build
```

## macOS

1. Open the Terminal application

2. Navigate to the directory with the `docker-compose.yaml` file:

```
cd /path/to/docker-compose.yaml
```

3. Stop and remove existing containers and images:

```
docker compose down --rmi all
```

4. Open `docker-compose.yaml` in a text editor like Sublime Text:

```
nano docker-compose.yaml
```

5. Replace `127.0.0.1` with the server IP in these lines:

```
args:
  - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
  - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```

6. Save and exit the editor

7. Rebuild and restart Perplexica:

```
docker compose up -d --build
```

## Linux

1. Open the terminal

2. Navigate to the `docker-compose.yaml` directory:

```
cd /path/to/docker-compose.yaml
```

3. Stop and remove containers and images:

```
docker compose down --rmi all
```

4. Edit `docker-compose.yaml`:

```
nano docker-compose.yaml
```

5. Replace `127.0.0.1` with the server IP:

```
args:
  - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
  - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```

6. Save and exit the editor

7. Rebuild and restart Perplexica:

```
docker compose up -d --build
```
package.json
@ -1,12 +1,12 @@
{
  "name": "perplexica-backend",
  "version": "1.4.0",
  "version": "1.5.0",
  "license": "MIT",
  "author": "ItzCrazyKns",
  "scripts": {
    "start": "node dist/app.js",
    "build": "tsc",
    "dev": "nodemon src/app.ts" ,
    "dev": "nodemon src/app.ts",
    "format": "prettier . --check",
    "format:write": "prettier . --write"
  },
@ -21,6 +21,7 @@
  },
  "dependencies": {
    "@iarna/toml": "^2.2.5",
    "@langchain/google-vertexai": "^0.0.16",
    "@langchain/openai": "^0.0.25",
    "@xenova/transformers": "^2.17.1",
    "axios": "^1.6.8",
sample.config.toml
@ -8,4 +8,4 @@ GROQ = "" # Groq API key - gsk_1234567890abcdef1234567890abcdef

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768" # SearxNG API URL
OLLAMA = "" # Ollama API URL - http://host.docker.internal:11434
sample.env (new file, 24 lines)
@ -0,0 +1,24 @@
# Copy this file over to .env and fill in the desired config.
# .env will become available to docker compose and these values will be
# used when running docker compose up

# Edit to set OpenAI access key
OPENAI=ADD OPENAI KEY HERE

# Uncomment and edit to set GROQ access key
# GROQ: ${GROQ}

# Uncomment and edit to set OLLAMA Url
# OLLAMA_API_URL: ${OLLAMA_API_URL}

# Address and port of the remotely deployed Perplexica backend
REMOTE_BACKEND_ADDRESS=111.111.111.111:0000

# Uncomment and edit to configure backend to reject requests without token
# leave commented to have open access to all endpoints
# Secret key to "secure" backend
# SUPER_SECRET_KEY=THISISASUPERSECRETKEYSERIOUSLY

# Uncomment and edit to configure a specific service account key file to use to
# auth with VertexAI when running (backend) full Perplexica stack locally
# GOOGLE_APPLICATION_CREDENTIALS=/absolute/path/to/gcp-service-account-key-file.json
(deleted file)
@ -1,3 +0,0 @@
FROM searxng/searxng

COPY searxng-settings.yml /etc/searxng/settings.yml
searxng/limiter.toml (new file, 3 lines)
@ -0,0 +1,3 @@
[botdetection.ip_limit]
# activate link_token method in the ip_limit method
link_token = true
searxng/uwsgi.ini (new file, 50 lines)
@ -0,0 +1,50 @@
[uwsgi]
# Who will run the code
uid = searxng
gid = searxng

# Number of workers (usually CPU count)
# default value: %k (= number of CPU core, see Dockerfile)
workers = %k

# Number of threads per worker
# default value: 4 (see Dockerfile)
threads = 4

# The right granted on the created socket
chmod-socket = 666

# Plugin to use and interpreter config
single-interpreter = true
master = true
plugin = python3
lazy-apps = true
enable-threads = 4

# Module to import
module = searx.webapp

# Virtualenv and python path
pythonpath = /usr/local/searxng/
chdir = /usr/local/searxng/searx/

# automatically set processes name to something meaningful
auto-procname = true

# Disable request logging for privacy
disable-logging = true
log-5xx = true

# Set the max size of a request (request-body excluded)
buffer-size = 8192

# No keep alive
# See https://github.com/searx/searx-docker/issues/24
add-header = Connection: close

# uwsgi serves the static files
static-map = /static=/usr/local/searxng/searx/static
# expires set to one day
static-expires = /* 86400
static-gzip-all = True
offload-threads = 4
src/agents/academicSearchAgent.ts
@ -209,7 +209,6 @@ const createBasicAcademicSearchAnsweringChain = (
    ChatPromptTemplate.fromMessages([
      ['system', basicAcademicSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
src/agents/redditSearchAgent.ts
@ -205,7 +205,6 @@ const createBasicRedditSearchAnsweringChain = (
    ChatPromptTemplate.fromMessages([
      ['system', basicRedditSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
src/agents/suggestionGeneratorAgent.ts (new file, 55 lines)
@ -0,0 +1,55 @@
import { RunnableSequence, RunnableMap } from '@langchain/core/runnables';
import ListLineOutputParser from '../lib/outputParsers/listLineOutputParser';
import { PromptTemplate } from '@langchain/core/prompts';
import formatChatHistoryAsString from '../utils/formatHistory';
import { BaseMessage } from '@langchain/core/messages';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { ChatOpenAI } from '@langchain/openai';

const suggestionGeneratorPrompt = `
You are an AI suggestion generator for an AI powered search engine. You will be given a conversation below. You need to generate 4-5 suggestions based on the conversation. The suggestion should be relevant to the conversation that can be used by the user to ask the chat model for more information.
You need to make sure the suggestions are relevant to the conversation and are helpful to the user. Keep a note that the user might use these suggestions to ask a chat model for more information.
Make sure the suggestions are medium in length and are informative and relevant to the conversation.

Provide these suggestions separated by newlines between the XML tags <suggestions> and </suggestions>. For example:

<suggestions>
Tell me more about SpaceX and their recent projects
What is the latest news on SpaceX?
Who is the CEO of SpaceX?
</suggestions>

Conversation:
{chat_history}
`;

type SuggestionGeneratorInput = {
  chat_history: BaseMessage[];
};

const outputParser = new ListLineOutputParser({
  key: 'suggestions',
});

const createSuggestionGeneratorChain = (llm: BaseChatModel) => {
  return RunnableSequence.from([
    RunnableMap.from({
      chat_history: (input: SuggestionGeneratorInput) =>
        formatChatHistoryAsString(input.chat_history),
    }),
    PromptTemplate.fromTemplate(suggestionGeneratorPrompt),
    llm,
    outputParser,
  ]);
};

const generateSuggestions = (
  input: SuggestionGeneratorInput,
  llm: BaseChatModel,
) => {
  (llm as ChatOpenAI).temperature = 0;
  const suggestionGeneratorChain = createSuggestionGeneratorChain(llm);
  return suggestionGeneratorChain.invoke(input);
};

export default generateSuggestions;
src/agents/webSearchAgent.ts
@ -203,7 +203,6 @@ const createBasicWebSearchAnsweringChain = (
    ChatPromptTemplate.fromMessages([
      ['system', basicWebSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
src/agents/wolframAlphaSearchAgent.ts
@ -165,7 +165,6 @@ const createBasicWolframAlphaSearchAnsweringChain = (llm: BaseChatModel) => {
    ChatPromptTemplate.fromMessages([
      ['system', basicWolframAlphaSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
src/agents/writingAssistant.ts
@ -46,7 +46,6 @@ const createWritingAssistantChain = (llm: BaseChatModel) => {
    ChatPromptTemplate.fromMessages([
      ['system', writingAssistantPrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
src/agents/youtubeSearchAgent.ts
@ -205,7 +205,6 @@ const createBasicYoutubeSearchAnsweringChain = (
    ChatPromptTemplate.fromMessages([
      ['system', basicYoutubeSearchResponsePrompt],
      new MessagesPlaceholder('chat_history'),
      ['user', '{query}'],
    ]),
    llm,
    strParser,
src/app.ts (13 changed lines)
@ -3,7 +3,8 @@ import express from 'express';
import cors from 'cors';
import http from 'http';
import routes from './routes';
import { getPort } from './config';
import { requireAccessKey } from './auth';
import { getAccessKey, getPort } from './config';
import logger from './utils/logger';

const port = getPort();
@ -13,11 +14,21 @@ const server = http.createServer(app);

const corsOptions = {
  origin: '*',
  allowedHeaders: ['Authorization', 'Content-Type'],
};

app.use(cors(corsOptions));

if (getAccessKey()) {
  app.all('/api/*', requireAccessKey);
}

app.use(express.json());

app.get('/', (_, res) => {
  res.status(200).json({ status: 'ok' });
});

app.use('/api', routes);
app.get('/api', (_, res) => {
  res.status(200).json({ status: 'ok' });
src/auth.ts (new file, 29 lines)
@ -0,0 +1,29 @@
import { auth } from 'google-auth-library';
import { getAccessKey } from './config';

export const requireAccessKey = (req, res, next) => {
  const authHeader = req.headers.authorization;

  if (authHeader) {
    if (!checkAccessKey(authHeader)) {
      return res.sendStatus(403);
    }
    next();
  } else {
    res.sendStatus(401);
  }
};

export const checkAccessKey = (authHeader) => {
  const token = authHeader.split(' ')[1];
  return Boolean(authHeader && token === getAccessKey());
};

export const hasGCPCredentials = async () => {
  try {
    const credentials = await auth.getCredentials();
    return Object.keys(credentials).length > 0;
  } catch (e) {
    return false;
  }
};
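A hedged usage sketch of the new access-key check from a client's point of view: `checkAccessKey` only looks at the token after the first space in the `Authorization` header, so the `Bearer` scheme below is an assumption, and the URL values are placeholders.

```typescript
// Hypothetical client call against a backend started with SUPER_SECRET_KEY set.
const callProtectedApi = async () => {
  const apiUrl = process.env.NEXT_PUBLIC_API_URL; // e.g. http://127.0.0.1:3001/api
  const accessKey = process.env.NEXT_PUBLIC_SUPER_SECRET_KEY;

  const res = await fetch(`${apiUrl}/models`, {
    headers: {
      'Content-Type': 'application/json',
      // The middleware compares everything after the first space with the key.
      Authorization: `Bearer ${accessKey}`,
    },
  });

  if (res.status === 401 || res.status === 403) {
    throw new Error('Missing or incorrect access key');
  }

  return res.json();
};
```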
src/config.ts
@ -8,6 +8,7 @@ interface Config {
  GENERAL: {
    PORT: number;
    SIMILARITY_MEASURE: string;
    SUPER_SECRET_KEY: string;
  };
  API_KEYS: {
    OPENAI: string;
@ -28,18 +29,43 @@ const loadConfig = () =>
    fs.readFileSync(path.join(__dirname, `../${configFileName}`), 'utf-8'),
  ) as any as Config;

const loadEnv = () => {
  return {
    GENERAL: {
      PORT: Number(process.env.PORT),
      SIMILARITY_MEASURE: process.env.SIMILARITY_MEASURE,
      SUPER_SECRET_KEY: process.env.SUPER_SECRET_KEY,
    },
    API_KEYS: {
      OPENAI: process.env.OPENAI,
      GROQ: process.env.GROQ,
    },
    API_ENDPOINTS: {
      SEARXNG: process.env.SEARXNG_API_URL,
      OLLAMA: process.env.OLLAMA_API_URL,
    },
  } as Config;
};

export const getPort = () => loadConfig().GENERAL.PORT;

export const getAccessKey = () =>
  loadEnv().GENERAL.SUPER_SECRET_KEY || loadConfig().GENERAL.SUPER_SECRET_KEY;

export const getSimilarityMeasure = () =>
  loadConfig().GENERAL.SIMILARITY_MEASURE;

export const getOpenaiApiKey = () => loadConfig().API_KEYS.OPENAI;
export const getOpenaiApiKey = () =>
  loadEnv().API_KEYS.OPENAI || loadConfig().API_KEYS.OPENAI;

export const getGroqApiKey = () => loadConfig().API_KEYS.GROQ;
export const getGroqApiKey = () =>
  loadEnv().API_KEYS.GROQ || loadConfig().API_KEYS.GROQ;

export const getSearxngApiEndpoint = () => loadConfig().API_ENDPOINTS.SEARXNG;
export const getSearxngApiEndpoint = () =>
  loadEnv().API_ENDPOINTS.SEARXNG || loadConfig().API_ENDPOINTS.SEARXNG;

export const getOllamaApiEndpoint = () => loadConfig().API_ENDPOINTS.OLLAMA;
export const getOllamaApiEndpoint = () =>
  loadEnv().API_ENDPOINTS.OLLAMA || loadConfig().API_ENDPOINTS.OLLAMA;

export const updateConfig = (config: RecursivePartial<Config>) => {
  const currentConfig = loadConfig();
src/lib/outputParsers/listLineOutputParser.ts (new file, 43 lines)
@ -0,0 +1,43 @@
import { BaseOutputParser } from '@langchain/core/output_parsers';

interface LineListOutputParserArgs {
  key?: string;
}

class LineListOutputParser extends BaseOutputParser<string[]> {
  private key = 'questions';

  constructor(args?: LineListOutputParserArgs) {
    super();
    this.key = args?.key ?? this.key;
  }

  static lc_name() {
    return 'LineListOutputParser';
  }

  lc_namespace = ['langchain', 'output_parsers', 'line_list_output_parser'];

  async parse(text: string): Promise<string[]> {
    const regex = /^(\s*(-|\*|\d+\.\s|\d+\)\s|\u2022)\s*)+/;
    const startKeyIndex = text.indexOf(`<${this.key}>`);
    const endKeyIndex = text.indexOf(`</${this.key}>`);
    const questionsStartIndex =
      startKeyIndex === -1 ? 0 : startKeyIndex + `<${this.key}>`.length;
    const questionsEndIndex = endKeyIndex === -1 ? text.length : endKeyIndex;
    const lines = text
      .slice(questionsStartIndex, questionsEndIndex)
      .trim()
      .split('\n')
      .filter((line) => line.trim() !== '')
      .map((line) => line.replace(regex, ''));

    return lines;
  }

  getFormatInstructions(): string {
    throw new Error('Not implemented.');
  }
}

export default LineListOutputParser;
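A small usage sketch for the parser above, using the `<suggestions>` tag format from `suggestionGeneratorAgent.ts`; the sample text is illustrative only.

```typescript
import LineListOutputParser from './listLineOutputParser';

const parser = new LineListOutputParser({ key: 'suggestions' });

const sample = `
<suggestions>
Tell me more about SpaceX and their recent projects
What is the latest news on SpaceX?
Who is the CEO of SpaceX?
</suggestions>
`;

// parse() extracts the tagged block, strips leading list markers,
// and returns one suggestion per non-empty line.
parser.parse(sample).then((suggestions) => {
  console.log(suggestions);
  // ['Tell me more about SpaceX and their recent projects', ...]
});
```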
src/lib/providers.ts
@ -1,7 +1,10 @@
import { ChatOpenAI, OpenAIEmbeddings } from '@langchain/openai';
import { ChatOllama } from '@langchain/community/chat_models/ollama';
import { VertexAI } from "@langchain/google-vertexai";
import { GoogleVertexAIEmbeddings } from "@langchain/community/embeddings/googlevertexai";
import { OllamaEmbeddings } from '@langchain/community/embeddings/ollama';
import { HuggingFaceTransformersEmbeddings } from './huggingfaceTransformer';
import { hasGCPCredentials } from '../auth';
import {
  getGroqApiKey,
  getOllamaApiEndpoint,
@ -34,6 +37,11 @@ export const getAvailableChatModelProviders = async () => {
        modelName: 'gpt-4-turbo',
        temperature: 0.7,
      }),
      'GPT-4 omni': new ChatOpenAI({
        openAIApiKey,
        modelName: 'gpt-4o',
        temperature: 0.7,
      }),
    };
  } catch (err) {
    logger.error(`Error loading OpenAI models: ${err}`);
@ -112,6 +120,23 @@ export const getAvailableChatModelProviders = async () => {
    }
  }

  if (await hasGCPCredentials()) {
    try {
      models['vertexai'] = {
        'gemini-1.5-pro (preview-0409)': new VertexAI({
          temperature: 0.7,
          modelName: 'gemini-1.5-pro-preview-0409',
        }),
        'gemini-1.0-pro (Latest)': new VertexAI({
          temperature: 0.7,
          modelName: 'gemini-1.0-pro',
        }),
      };
    } catch (err) {
      logger.error(`Error loading VertexAI models: ${err}`);
    }
  }

  models['custom_openai'] = {};

  return models;
@ -157,12 +182,21 @@ export const getAvailableEmbeddingModelProviders = async () => {
      });
      return acc;
    }, {});

    } catch (err) {
      logger.error(`Error loading Ollama embeddings: ${err}`);
    }
  }

  if (await hasGCPCredentials()) {
    try {
      models['vertexai'] = {
        'Text Gecko default': new GoogleVertexAIEmbeddings(),
      }
    } catch (err) {
      logger.error(`Error loading VertexAI embeddings: ${err}`);
    }
  }

  try {
    models['local'] = {
      'BGE Small': new HuggingFaceTransformersEmbeddings({
@ -172,11 +206,11 @@ export const getAvailableEmbeddingModelProviders = async () => {
        modelName: 'Xenova/gte-small',
      }),
      'Bert Multilingual': new HuggingFaceTransformersEmbeddings({
        modelName: 'Xenova/bert-base-multilingual-uncased'
        modelName: 'Xenova/bert-base-multilingual-uncased',
      }),
    };
  } catch(err) {
    logger.error(`Error loading local embeddings: ${err}`);
  } catch (err) {
    logger.error(`Error loading local embeddings: ${err}`);
  }

  return models;
src/routes/images.ts
@ -20,8 +20,8 @@ router.post('/', async (req, res) => {
    });

    const chatModels = await getAvailableChatModelProviders();
    const provider = chat_model_provider || Object.keys(chatModels)[0];
    const chatModel = chat_model || Object.keys(chatModels[provider])[0];
    const provider = chat_model_provider ?? Object.keys(chatModels)[0];
    const chatModel = chat_model ?? Object.keys(chatModels[provider])[0];

    let llm: BaseChatModel | undefined;

src/routes/index.ts
@ -3,6 +3,7 @@ import imagesRouter from './images';
import videosRouter from './videos';
import configRouter from './config';
import modelsRouter from './models';
import suggestionsRouter from './suggestions';

const router = express.Router();

@ -10,5 +11,6 @@ router.use('/images', imagesRouter);
router.use('/videos', videosRouter);
router.use('/config', configRouter);
router.use('/models', modelsRouter);
router.use('/suggestions', suggestionsRouter);

export default router;
src/routes/suggestions.ts (new file, 46 lines)
@ -0,0 +1,46 @@
import express from 'express';
import generateSuggestions from '../agents/suggestionGeneratorAgent';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { getAvailableChatModelProviders } from '../lib/providers';
import { HumanMessage, AIMessage } from '@langchain/core/messages';
import logger from '../utils/logger';

const router = express.Router();

router.post('/', async (req, res) => {
  try {
    let { chat_history, chat_model, chat_model_provider } = req.body;

    chat_history = chat_history.map((msg: any) => {
      if (msg.role === 'user') {
        return new HumanMessage(msg.content);
      } else if (msg.role === 'assistant') {
        return new AIMessage(msg.content);
      }
    });

    const chatModels = await getAvailableChatModelProviders();
    const provider = chat_model_provider ?? Object.keys(chatModels)[0];
    const chatModel = chat_model ?? Object.keys(chatModels[provider])[0];

    let llm: BaseChatModel | undefined;

    if (chatModels[provider] && chatModels[provider][chatModel]) {
      llm = chatModels[provider][chatModel] as BaseChatModel | undefined;
    }

    if (!llm) {
      res.status(500).json({ message: 'Invalid LLM model selected' });
      return;
    }

    const suggestions = await generateSuggestions({ chat_history }, llm);

    res.status(200).json({ suggestions: suggestions });
  } catch (err) {
    res.status(500).json({ message: 'An error has occurred.' });
    logger.error(`Error in generating suggestions: ${err.message}`);
  }
});

export default router;
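For reference, a hedged sketch of calling this new endpoint from a client: the request body mirrors what the route destructures, while the base URL and the example messages are assumptions.

```typescript
// Hypothetical call to the POST /api/suggestions route.
const fetchSuggestions = async (): Promise<string[]> => {
  const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/suggestions`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      chat_history: [
        { role: 'user', content: 'How does an A.C. work?' },
        { role: 'assistant', content: 'An air conditioner works by ...' },
      ],
      // chat_model and chat_model_provider are optional; the route falls back
      // to the first available provider and model.
    }),
  });

  const { suggestions } = await res.json();
  return suggestions;
};
```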
src/routes/videos.ts
@ -20,8 +20,8 @@ router.post('/', async (req, res) => {
    });

    const chatModels = await getAvailableChatModelProviders();
    const provider = chat_model_provider || Object.keys(chatModels)[0];
    const chatModel = chat_model || Object.keys(chatModels[provider])[0];
    const provider = chat_model_provider ?? Object.keys(chatModels)[0];
    const chatModel = chat_model ?? Object.keys(chatModels[provider])[0];

    let llm: BaseChatModel | undefined;

src/websocket/connectionManager.ts
@ -9,6 +9,8 @@ import type { Embeddings } from '@langchain/core/embeddings';
import type { IncomingMessage } from 'http';
import logger from '../utils/logger';
import { ChatOpenAI } from '@langchain/openai';
import { getAccessKey } from '../config';
import { checkAccessKey } from '../auth';

export const handleConnection = async (
  ws: WebSocket,
@ -18,6 +20,20 @@ export const handleConnection = async (
  const searchParams = new URL(request.url, `http://${request.headers.host}`)
    .searchParams;

  if (getAccessKey()) {
    const securityProtocolHeader = request.headers['sec-websocket-protocol'];
    if (!checkAccessKey(securityProtocolHeader)) {
      ws.send(
        JSON.stringify({
          type: 'error',
          data: 'Incorrect or missing authentication token.',
          key: 'FAILED_AUTHORIZATION',
        }),
      );
      ws.close();
    }
  }

  const [chatModelProviders, embeddingModelProviders] = await Promise.all([
    getAvailableChatModelProviders(),
    getAvailableEmbeddingModelProviders(),
ui/app/page.tsx
@ -1,5 +1,6 @@
import ChatWindow from '@/components/ChatWindow';
import { Metadata } from 'next';
import { Suspense } from 'react';

export const metadata: Metadata = {
  title: 'Chat - Perplexica',
@ -9,7 +10,9 @@ export const metadata: Metadata = {
const Home = () => {
  return (
    <div>
      <ChatWindow />
      <Suspense>
        <ChatWindow />
      </Suspense>
    </div>
  );
};
ui/components/Chat.tsx
@ -1,6 +1,6 @@
'use client';

import { useEffect, useRef, useState } from 'react';
import { Fragment, useEffect, useRef, useState } from 'react';
import MessageInput from './MessageInput';
import { Message } from './ChatWindow';
import MessageBox from './MessageBox';
@ -53,7 +53,7 @@ const Chat = ({
        const isLast = i === messages.length - 1;

        return (
          <>
          <Fragment key={msg.id}>
            <MessageBox
              key={i}
              message={msg}
@ -63,11 +63,12 @@ const Chat = ({
              dividerRef={isLast ? dividerRef : undefined}
              isLast={isLast}
              rewrite={rewrite}
              sendMessage={sendMessage}
            />
            {!isLast && msg.role === 'assistant' && (
              <div className="h-px w-full bg-[#1C1C1C]" />
            )}
          </>
          </Fragment>
        );
      })}
      {loading && !messageAppeared && <MessageBoxLoading />}
ui/components/ChatWindow.tsx
@ -1,21 +1,26 @@
'use client';

import { useEffect, useState } from 'react';
import { useEffect, useRef, useState } from 'react';
import { Document } from '@langchain/core/documents';
import Navbar from './Navbar';
import Chat from './Chat';
import EmptyChat from './EmptyChat';
import { toast } from 'sonner';
import { useSearchParams } from 'next/navigation';
import { getSuggestions } from '@/lib/actions';
import { clientFetch } from '@/lib/utils';
import { getAccessKey } from '@/lib/config';

export type Message = {
  id: string;
  createdAt: Date;
  content: string;
  role: 'user' | 'assistant';
  suggestions?: string[];
  sources?: Document[];
};

const useSocket = (url: string) => {
const useSocket = (url: string, setIsReady: (ready: boolean) => void) => {
  const [ws, setWs] = useState<WebSocket | null>(null);

  useEffect(() => {
@ -34,14 +39,11 @@ const useSocket = (url: string) => {
        !embeddingModel ||
        !embeddingModelProvider
      ) {
        const providers = await fetch(
          `${process.env.NEXT_PUBLIC_API_URL}/models`,
          {
            headers: {
              'Content-Type': 'application/json',
            },
        const providers = await clientFetch('/models', {
          headers: {
            'Content-Type': 'application/json',
          },
        ).then(async (res) => await res.json());
        }).then(async (res) => await res.json());

        const chatModelProviders = providers.chatModelProviders;
        const embeddingModelProviders = providers.embeddingModelProviders;
@ -97,13 +99,28 @@ const useSocket = (url: string) => {

      wsURL.search = searchParams.toString();

      const ws = new WebSocket(wsURL.toString());
      let protocols: any[] = [];
      const secretToken = getAccessKey();

      if (secretToken) {
        protocols = ['Authorization', `${secretToken}`];
      }

      const ws = new WebSocket(wsURL.toString(), protocols);

      ws.onopen = () => {
        console.log('[DEBUG] open');
        setWs(ws);
      };

      const stateCheckInterval = setInterval(() => {
        if (ws.readyState === 1) {
          setIsReady(true);
          clearInterval(stateCheckInterval);
        }
      }, 100);

      setWs(ws);

      ws.onmessage = (e) => {
        const parsedData = JSON.parse(e.data);
        if (parsedData.type === 'error') {
@ -122,19 +139,29 @@ const useSocket = (url: string) => {
      ws?.close();
      console.log('[DEBUG] closed');
    };
  }, [ws, url]);
  }, [ws, url, setIsReady]);

  return ws;
};

const ChatWindow = () => {
  const ws = useSocket(process.env.NEXT_PUBLIC_WS_URL!);
  const searchParams = useSearchParams();
  const initialMessage = searchParams.get('q');

  const [isReady, setIsReady] = useState(false);
  const ws = useSocket(process.env.NEXT_PUBLIC_WS_URL!, setIsReady);

  const [chatHistory, setChatHistory] = useState<[string, string][]>([]);
  const [messages, setMessages] = useState<Message[]>([]);
  const messagesRef = useRef<Message[]>([]);
  const [loading, setLoading] = useState(false);
  const [messageAppeared, setMessageAppeared] = useState(false);
  const [focusMode, setFocusMode] = useState('webSearch');

  useEffect(() => {
    messagesRef.current = messages;
  }, [messages]);

  const sendMessage = async (message: string) => {
    if (loading) return;
    setLoading(true);
@ -163,7 +190,7 @@ const ChatWindow = () => {
      },
    ]);

    const messageHandler = (e: MessageEvent) => {
    const messageHandler = async (e: MessageEvent) => {
      const data = JSON.parse(e.data);

      if (data.type === 'error') {
@ -225,8 +252,28 @@ const ChatWindow = () => {
        ['human', message],
        ['assistant', recievedMessage],
      ]);

      ws?.removeEventListener('message', messageHandler);
      setLoading(false);

      const lastMsg = messagesRef.current[messagesRef.current.length - 1];

      if (
        lastMsg.role === 'assistant' &&
        lastMsg.sources &&
        lastMsg.sources.length > 0 &&
        !lastMsg.suggestions
      ) {
        const suggestions = await getSuggestions(messagesRef.current);
        setMessages((prev) =>
          prev.map((msg) => {
            if (msg.id === lastMsg.id) {
              return { ...msg, suggestions: suggestions };
            }
            return msg;
          }),
        );
      }
    }
  };

@ -250,7 +297,14 @@ const ChatWindow = () => {
    sendMessage(message.content);
  };

  return ws ? (
  useEffect(() => {
    if (isReady && initialMessage) {
      sendMessage(initialMessage);
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [isReady, initialMessage]);

  return isReady ? (
    <div>
      {messages.length > 0 ? (
        <>
ui/components/EmptyChatMessageInput.tsx
@ -1,7 +1,7 @@
import { ArrowRight } from 'lucide-react';
import { useState } from 'react';
import TextareaAutosize from 'react-textarea-autosize';
import { Attach, CopilotToggle, Focus } from './MessageInputActions';
import { CopilotToggle, Focus } from './MessageInputActions';

const EmptyChatMessageInput = ({
  sendMessage,
ui/components/MessageActions/Rewrite.tsx
@ -10,9 +10,10 @@ const Rewrite = ({
  return (
    <button
      onClick={() => rewrite(messageId)}
      className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white"
      className="py-2 px-3 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white flex flex-row items-center space-x-1"
    >
      <ArrowLeftRight size={18} />
      <p className="text-xs font-medium">Rewrite</p>
    </button>
  );
};
@ -4,7 +4,15 @@
|
||||
import React, { MutableRefObject, useEffect, useState } from 'react';
|
||||
import { Message } from './ChatWindow';
|
||||
import { cn } from '@/lib/utils';
|
||||
import { BookCopy, Disc3, Share, Volume2, StopCircle } from 'lucide-react';
|
||||
import {
|
||||
BookCopy,
|
||||
Disc3,
|
||||
Share,
|
||||
Volume2,
|
||||
StopCircle,
|
||||
Layers3,
|
||||
Plus,
|
||||
} from 'lucide-react';
|
||||
import Markdown from 'markdown-to-jsx';
|
||||
import Copy from './MessageActions/Copy';
|
||||
import Rewrite from './MessageActions/Rewrite';
|
||||
@ -21,6 +29,7 @@ const MessageBox = ({
|
||||
dividerRef,
|
||||
isLast,
|
||||
rewrite,
|
||||
sendMessage,
|
||||
}: {
|
||||
message: Message;
|
||||
messageIndex: number;
|
||||
@ -29,6 +38,7 @@ const MessageBox = ({
|
||||
dividerRef?: MutableRefObject<HTMLDivElement | null>;
|
||||
isLast: boolean;
|
||||
rewrite: (messageId: string) => void;
|
||||
sendMessage: (message: string) => void;
|
||||
}) => {
|
||||
const [parsedMessage, setParsedMessage] = useState(message.content);
|
||||
const [speechMessage, setSpeechMessage] = useState(message.content);
|
||||
@ -98,9 +108,9 @@ const MessageBox = ({
|
||||
{loading && isLast ? null : (
|
||||
<div className="flex flex-row items-center justify-between w-full text-white py-4 -mx-2">
|
||||
<div className="flex flex-row items-center space-x-1">
|
||||
<button className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white">
|
||||
{/* <button className="p-2 text-white/70 rounded-xl hover:bg-[#1c1c1c] transition duration-200 hover:text-white">
|
||||
<Share size={18} />
|
||||
</button>
|
||||
</button> */}
|
||||
<Rewrite rewrite={rewrite} messageId={message.id} />
|
||||
</div>
|
||||
<div className="flex flex-row items-center space-x-1">
|
||||
@ -124,6 +134,42 @@ const MessageBox = ({
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
{isLast &&
|
||||
message.suggestions &&
|
||||
message.suggestions.length > 0 &&
|
||||
message.role === 'assistant' &&
|
||||
!loading && (
|
||||
<>
|
||||
<div className="h-px w-full bg-[#1C1C1C]" />
|
||||
<div className="flex flex-col space-y-3 text-white">
|
||||
<div className="flex flex-row items-center space-x-2 mt-4">
|
||||
<Layers3 />
|
||||
<h3 className="text-xl font-medium">Related</h3>
|
||||
</div>
|
||||
<div className="flex flex-col space-y-3">
|
||||
{message.suggestions.map((suggestion, i) => (
|
||||
<div
|
||||
className="flex flex-col space-y-3 text-sm"
|
||||
key={i}
|
||||
>
|
||||
<div className="h-px w-full bg-[#1C1C1C]" />
|
||||
<div
|
||||
onClick={() => {
|
||||
sendMessage(suggestion);
|
||||
}}
|
||||
className="cursor-pointer flex flex-row justify-between font-medium space-x-2 items-center"
|
||||
>
|
||||
<p className="transition duration-200 hover:text-[#24A0ED]">
|
||||
{suggestion}
|
||||
</p>
|
||||
<Plus size={20} className="text-[#24A0ED]" />
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
<div className="lg:sticky lg:top-20 flex flex-col items-center space-y-3 w-full lg:w-3/12 z-30 h-full pb-4">
|
||||
|
@@ -4,6 +4,7 @@ import { useState } from 'react';
import Lightbox from 'yet-another-react-lightbox';
import 'yet-another-react-lightbox/styles.css';
import { Message } from './ChatWindow';
import { clientFetch } from '@/lib/utils';

type Image = {
url: string;
@@ -33,21 +34,18 @@ const SearchImages = ({
const chatModelProvider = localStorage.getItem('chatModelProvider');
const chatModel = localStorage.getItem('chatModel');

const res = await fetch(
`${process.env.NEXT_PUBLIC_API_URL}/images`,
{
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
query: query,
chat_history: chat_history,
chat_model_provider: chatModelProvider,
chat_model: chatModel,
}),
const res = await clientFetch('/images', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
);
body: JSON.stringify({
query: query,
chat_history: chat_history,
chat_model_provider: chatModelProvider,
chat_model: chatModel,
}),
});

const data = await res.json();
@@ -4,6 +4,7 @@ import { useState } from 'react';
import Lightbox, { GenericSlide, VideoSlide } from 'yet-another-react-lightbox';
import 'yet-another-react-lightbox/styles.css';
import { Message } from './ChatWindow';
import { clientFetch } from '@/lib/utils';

type Video = {
url: string;
@@ -46,21 +47,18 @@ const Searchvideos = ({
const chatModelProvider = localStorage.getItem('chatModelProvider');
const chatModel = localStorage.getItem('chatModel');

const res = await fetch(
`${process.env.NEXT_PUBLIC_API_URL}/videos`,
{
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
query: query,
chat_history: chat_history,
chat_model_provider: chatModelProvider,
chat_model: chatModel,
}),
const res = await clientFetch('/videos', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
);
body: JSON.stringify({
query: query,
chat_history: chat_history,
chat_model_provider: chatModelProvider,
chat_model: chatModel,
}),
});

const data = await res.json();
@@ -1,6 +1,7 @@
import { Dialog, Transition } from '@headlessui/react';
import { CloudUpload, RefreshCcw, RefreshCw } from 'lucide-react';
import React, { Fragment, useEffect, useState } from 'react';
import { clientFetch } from '@/lib/utils';

interface SettingsType {
chatModelProviders: {
@@ -42,7 +43,7 @@ const SettingsDialog = ({
if (isOpen) {
const fetchConfig = async () => {
setIsLoading(true);
const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/config`, {
const res = await clientFetch('/config', {
headers: {
'Content-Type': 'application/json',
},
@@ -89,7 +90,7 @@ const SettingsDialog = ({
setSelectedEmbeddingModelProvider(embeddingModelProvider);
setSelectedEmbeddingModel(embeddingModel);
setCustomOpenAIApiKey(localStorage.getItem('openAIApiKey') || '');
setCustomOpenAIBaseURL(localStorage.getItem('openAIBaseUrl') || '');
setCustomOpenAIBaseURL(localStorage.getItem('openAIBaseURL') || '');
setIsLoading(false);
};

@@ -102,7 +103,7 @@ const SettingsDialog = ({
setIsUpdating(true);

try {
await fetch(`${process.env.NEXT_PUBLIC_API_URL}/config`, {
await clientFetch('/config', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
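The hunk above aligns the localStorage read key with the URL casing used elsewhere ('openAIBaseURL' instead of 'openAIBaseUrl'). A minimal sketch of a consistent read/write pair is shown below; the save-side call is an assumption for illustration and is not part of the diff.

```ts
// Hypothetical helpers: keep the localStorage key in one place so the
// settings dialog reads and writes the same value.
const OPENAI_BASE_URL_KEY = 'openAIBaseURL';

export const readCustomOpenAIBaseURL = (): string =>
  localStorage.getItem(OPENAI_BASE_URL_KEY) || '';

export const saveCustomOpenAIBaseURL = (url: string): void =>
  localStorage.setItem(OPENAI_BASE_URL_KEY, url);
```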
ui/lib/actions.ts (new file, 23 lines)
@@ -0,0 +1,23 @@
import { Message } from '@/components/ChatWindow';
import { clientFetch } from '@/lib/utils';

export const getSuggestions = async (chatHisory: Message[]) => {
const chatModel = localStorage.getItem('chatModel');
const chatModelProvider = localStorage.getItem('chatModelProvider');

const res = await clientFetch('/suggestions', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
chat_history: chatHisory,
chat_model: chatModel,
chat_model_provider: chatModelProvider,
}),
});

const data = (await res.json()) as { suggestions: string[] };

return data.suggestions;
};
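For context, a caller of this helper would look roughly like the sketch below. The state setter and the point at which it runs (after the last assistant message finishes streaming) are assumptions for illustration; the actual wiring lives in the chat components and may differ.

```ts
// Minimal sketch: load "Related" suggestions for the current chat history.
import { getSuggestions } from '@/lib/actions';
import { Message } from '@/components/ChatWindow';

const loadSuggestions = async (
  history: Message[],
  setSuggestions: (suggestions: string[]) => void, // placeholder setter
) => {
  const suggestions = await getSuggestions(history);
  setSuggestions(suggestions);
};
```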
ui/lib/config.ts (new file, 22 lines)
@@ -0,0 +1,22 @@
interface Config {
GENERAL: {
NEXT_PUBLIC_SUPER_SECRET_KEY: string;
NEXT_PUBLIC_API_URL: string;
NEXT_PUBLIC_WS_URL: string;
};
}

const loadEnv = () => {
return {
GENERAL: {
NEXT_PUBLIC_SUPER_SECRET_KEY: process.env.NEXT_PUBLIC_SUPER_SECRET_KEY!,
NEXT_PUBLIC_API_URL: process.env.NEXT_PUBLIC_API_URL!,
NEXT_PUBLIC_WS_URL: process.env.NEXT_PUBLIC_WS_URL!,
},
} as Config;
};

export const getAccessKey = () =>
loadEnv().GENERAL.NEXT_PUBLIC_SUPER_SECRET_KEY;

export const getBackendURL = () => loadEnv().GENERAL.NEXT_PUBLIC_WS_URL;
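A rough illustration of how these getters are meant to be consumed, assuming a standard Next.js setup where `NEXT_PUBLIC_*` variables are inlined at build time. The example values in the comments are made up, not taken from the repository.

```ts
// Sketch: read the backend URL and optional access key from the env-backed config.
import { getAccessKey, getBackendURL } from '@/lib/config';

// e.g. NEXT_PUBLIC_API_URL=http://localhost:3001/api (illustrative value)
const apiBase = getBackendURL();

// e.g. NEXT_PUBLIC_SUPER_SECRET_KEY=<token>; leaving it unset disables the header
const accessKey = getAccessKey();

console.log(`API base: ${apiBase}, auth header enabled: ${Boolean(accessKey)}`);
```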
@@ -1,5 +1,6 @@
import clsx, { ClassValue } from 'clsx';
import { twMerge } from 'tailwind-merge';
import { getAccessKey, getBackendURL } from './config';

export const cn = (...classes: ClassValue[]) => twMerge(clsx(...classes));

@@ -19,3 +20,20 @@ export const formatTimeDifference = (date1: Date, date2: Date): string => {
else
return `${Math.floor(diffInSeconds / 31536000)} year${Math.floor(diffInSeconds / 31536000) !== 1 ? 's' : ''}`;
};

export const clientFetch = async (path: string, payload: any): Promise<any> => {
let headers = payload.headers;
const url = `${getBackendURL()}${path}`;
const secretToken = getAccessKey();

if (secretToken) {
if (headers == null) {
headers = {};
}

headers['Authorization'] = `Bearer ${secretToken}`;
payload.headers = headers;
}

return await fetch(url, payload);
};
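A minimal usage sketch of `clientFetch`, mirroring the `/config` call in the SettingsDialog hunk above: the helper prefixes the path with the configured backend URL and, when `NEXT_PUBLIC_SUPER_SECRET_KEY` is set, injects an `Authorization: Bearer <key>` header; otherwise the request options are passed through unchanged.

```ts
// Sketch only: fetch the backend config via the shared helper.
import { clientFetch } from '@/lib/utils';

const fetchConfig = async () => {
  const res = await clientFetch('/config', {
    headers: { 'Content-Type': 'application/json' },
  });
  return await res.json();
};
```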
@@ -1,6 +1,6 @@
{
"name": "perplexica-frontend",
"version": "1.4.0",
"version": "1.5.0",
"license": "MIT",
"author": "ItzCrazyKns",
"scripts": {
yarn.lock (192 lines changed)
@@ -79,6 +79,24 @@
uuid "^9.0.0"
zod "^3.22.3"

"@langchain/core@>0.1.56 <0.3.0", "@langchain/core@>0.1.56 <0.3.0":
version "0.2.0"
resolved "https://registry.yarnpkg.com/@langchain/core/-/core-0.2.0.tgz#19c6374a5ad80daf8e14cb58582bc988109a1403"
integrity sha512-UbCJUp9eh2JXd9AW/vhPbTgtZoMgTqJgSan5Wf/EP27X8JM65lWdCOpJW+gHyBXvabbyrZz3/EGaptTUL5gutw==
dependencies:
ansi-styles "^5.0.0"
camelcase "6"
decamelize "1.2.0"
js-tiktoken "^1.0.12"
langsmith "~0.1.7"
ml-distance "^4.0.0"
mustache "^4.2.0"
p-queue "^6.6.2"
p-retry "4"
uuid "^9.0.0"
zod "^3.22.4"
zod-to-json-schema "^3.22.3"

"@langchain/core@~0.1.44", "@langchain/core@~0.1.45":
version "0.1.52"
resolved "https://registry.yarnpkg.com/@langchain/core/-/core-0.1.52.tgz#7619310b83ffa841628efe2e1eda873ca714d068"
@@ -96,6 +114,32 @@
zod "^3.22.4"
zod-to-json-schema "^3.22.3"

"@langchain/google-common@~0.0.15":
version "0.0.16"
resolved "https://registry.yarnpkg.com/@langchain/google-common/-/google-common-0.0.16.tgz#e2ff43eaebcf7bea84a067f8bdaf7f01e23bc1c0"
integrity sha512-eQMdqEYfzcavkE5Cpk7LCUlFx2Gb+skNZci/DlS2zot4XCSVg8QDIYOkL+PrXtTZBsp36SyOnNfzHUzdbU8cPA==
dependencies:
"@langchain/core" ">0.1.56 <0.3.0"
uuid "^9.0.0"
zod-to-json-schema "^3.22.4"

"@langchain/google-gauth@~0.0.16":
version "0.0.16"
resolved "https://registry.yarnpkg.com/@langchain/google-gauth/-/google-gauth-0.0.16.tgz#164c865c0d6363385f3375e54e2ed66c6ed06cfd"
integrity sha512-mp68iw/XA/lbBwh8+6vV7FFsibP595mt+OZdEFU9QewpUv99YVHH1FT+mNoQAI9p6uZpSHYQ3Iip70nIU976sw==
dependencies:
"@langchain/core" ">0.1.56 <0.3.0"
"@langchain/google-common" "~0.0.15"
google-auth-library "^8.9.0"

"@langchain/google-vertexai@^0.0.16":
version "0.0.16"
resolved "https://registry.yarnpkg.com/@langchain/google-vertexai/-/google-vertexai-0.0.16.tgz#388ddf21dc9537d4632acc5c046583fe9ac8022a"
integrity sha512-tJTyPxg3vYSqhNyqx6/UViPNdn3NPeZL29JqNen26x/w4JYYMpde0Dm20KCd5TCsbdUfrkk7tMyJZjr2e30jMg==
dependencies:
"@langchain/core" ">0.1.56 <0.3.0"
"@langchain/google-gauth" "~0.0.16"

"@langchain/openai@^0.0.25", "@langchain/openai@~0.0.19":
version "0.0.25"
resolved "https://registry.yarnpkg.com/@langchain/openai/-/openai-0.0.25.tgz#8332abea1e3acb9b1169f90636e518c0ee90622e"
@@ -357,6 +401,13 @@ acorn@^8.4.1:
resolved "https://registry.yarnpkg.com/acorn/-/acorn-8.11.3.tgz#71e0b14e13a4ec160724b38fb7b0f233b1b81d7a"
integrity sha512-Y9rRfJG5jcKOE0CLisYbojUjIrIEE7AGMzA/Sm4BslANhbS+cDMpgBdcPT91oJ7OuJ9hYJBx59RjbhxVnrF8Xg==

agent-base@6:
version "6.0.2"
resolved "https://registry.yarnpkg.com/agent-base/-/agent-base-6.0.2.tgz#49fff58577cfee3f37176feab4c22e00f86d7f77"
integrity sha512-RZNwNclF7+MS/8bDg70amg32dyeZGZxiDuQmZxKLAlQjr3jGyLx+4Kkk58UO7D2QdgFIQCovuSuZESne6RG6XQ==
dependencies:
debug "4"

agentkeepalive@^4.2.1:
version "4.5.0"
resolved "https://registry.yarnpkg.com/agentkeepalive/-/agentkeepalive-4.5.0.tgz#2673ad1389b3c418c5a20c5d7364f93ca04be923"
@@ -392,6 +443,11 @@ array-flatten@1.1.1:
resolved "https://registry.yarnpkg.com/array-flatten/-/array-flatten-1.1.1.tgz#9a5f699051b1e7073328f2a008968b64ea2955d2"
integrity sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg==

arrify@^2.0.0:
version "2.0.1"
resolved "https://registry.yarnpkg.com/arrify/-/arrify-2.0.1.tgz#c9655e9331e0abcd588d2a7cad7e9956f66701fa"
integrity sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug==

async@^3.2.3:
version "3.2.5"
resolved "https://registry.yarnpkg.com/async/-/async-3.2.5.tgz#ebd52a8fdaf7a2289a24df399f8d8485c8a46b66"
@@ -459,11 +515,16 @@ base-64@^0.1.0:
resolved "https://registry.yarnpkg.com/base-64/-/base-64-0.1.0.tgz#780a99c84e7d600260361511c4877613bf24f6bb"
integrity sha512-Y5gU45svrR5tI2Vt/X9GPd3L0HNIKzGu202EjxrXMpuc2V2CiKgemAbUUsqYmZJvPtCXoUKjNZwBJzsNScUbXA==

base64-js@^1.3.1, base64-js@^1.5.1:
base64-js@^1.3.0, base64-js@^1.3.1, base64-js@^1.5.1:
version "1.5.1"
resolved "https://registry.yarnpkg.com/base64-js/-/base64-js-1.5.1.tgz#1b1b440160a5bf7ad40b650f095963481903930a"
integrity sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==

bignumber.js@^9.0.0:
version "9.1.2"
resolved "https://registry.yarnpkg.com/bignumber.js/-/bignumber.js-9.1.2.tgz#b7c4242259c008903b13707983b5f4bbd31eda0c"
integrity sha512-2/mKyZH9K85bzOEfhXDBFZTGd1CTs+5IHpeFQo9luiBG7hghdC851Pj2WAhb6E3R6b9tZj/XKhbg4fum+Kepug==

binary-extensions@^2.0.0, binary-extensions@^2.2.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/binary-extensions/-/binary-extensions-2.3.0.tgz#f6e14a97858d327252200242d4ccfe522c445522"
@@ -516,6 +577,11 @@ braces@~3.0.2:
dependencies:
fill-range "^7.0.1"

buffer-equal-constant-time@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/buffer-equal-constant-time/-/buffer-equal-constant-time-1.0.1.tgz#f8e71132f7ffe6e01a5c9697a4c6f3e48d5cc819"
integrity sha512-zRpUiDwd/xk6ADqPMATG8vc9VPrkck7T07OIx0gnjmJAnHnTVXNQG3vfvWNuiZIkwu9KrKdA1iJKfsfTVxE6NA==

buffer@^5.5.0:
version "5.7.1"
resolved "https://registry.yarnpkg.com/buffer/-/buffer-5.7.1.tgz#ba62e7c13133053582197160851a8f648e99eed0"
@@ -716,7 +782,7 @@ debug@2.6.9:
dependencies:
ms "2.0.0"

debug@^4:
debug@4, debug@^4:
version "4.3.4"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.3.4.tgz#1319f6579357f2338d3337d2cdd4914bb5dcc865"
integrity sha512-PRWFHuSU3eDtQJPvnNY7Jcket1j0t5OuOsFzPPzsekD52Zl8qUfFIPEiswXqIvHWGVHOgX+7G/vCNNhehwxfkQ==
@@ -787,6 +853,13 @@ dotenv@^16.4.5:
resolved "https://registry.yarnpkg.com/dotenv/-/dotenv-16.4.5.tgz#cdd3b3b604cb327e286b4762e13502f717cb099f"
integrity sha512-ZmdL2rui+eB2YwhsWzjInR8LldtZHGDoQ1ugH85ppHKwpUHL7j7rN0Ti9NCnGiQbhaZ11FpR+7ao1dNsmduNUg==

ecdsa-sig-formatter@1.0.11, ecdsa-sig-formatter@^1.0.11:
version "1.0.11"
resolved "https://registry.yarnpkg.com/ecdsa-sig-formatter/-/ecdsa-sig-formatter-1.0.11.tgz#ae0f0fa2d85045ef14a817daa3ce9acd0489e5bf"
integrity sha512-nagl3RYrbNv6kQkeJIpt6NJZy8twLB/2vtz6yN9Z4vRKHN4/QZJIEbqohALSgwKdnksuY3k5Addp5lg8sVoVcQ==
dependencies:
safe-buffer "^5.0.1"

ee-first@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/ee-first/-/ee-first-1.1.1.tgz#590c61156b0ae2f4f0255732a158b266bc56b21d"
@@ -888,11 +961,21 @@ express@^4.19.2:
utils-merge "1.0.1"
vary "~1.1.2"

extend@^3.0.2:
version "3.0.2"
resolved "https://registry.yarnpkg.com/extend/-/extend-3.0.2.tgz#f8b1136b4071fbd8eb140aff858b1019ec2915fa"
integrity sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g==

fast-fifo@^1.1.0, fast-fifo@^1.2.0:
version "1.3.2"
resolved "https://registry.yarnpkg.com/fast-fifo/-/fast-fifo-1.3.2.tgz#286e31de96eb96d38a97899815740ba2a4f3640c"
integrity sha512-/d9sfos4yxzpwkDkuN7k2SqFKtYNmCTzgfEpz82x34IM9/zc8KGxQoXg1liNC/izpRM/MBdt44Nmx41ZWqk+FQ==

fast-text-encoding@^1.0.0:
version "1.0.6"
resolved "https://registry.yarnpkg.com/fast-text-encoding/-/fast-text-encoding-1.0.6.tgz#0aa25f7f638222e3396d72bf936afcf1d42d6867"
integrity sha512-VhXlQgj9ioXCqGstD37E/HBeqEGV/qOD/kmbVG8h5xKBYvM1L3lR1Zn4555cQ8GkYbJa8aJSipLPndE1k6zK2w==

fecha@^4.2.0:
version "4.2.3"
resolved "https://registry.yarnpkg.com/fecha/-/fecha-4.2.3.tgz#4d9ccdbc61e8629b259fdca67e65891448d569fd"
@@ -985,6 +1068,24 @@ function-bind@^1.1.2:
resolved "https://registry.yarnpkg.com/function-bind/-/function-bind-1.1.2.tgz#2c02d864d97f3ea6c8830c464cbd11ab6eab7a1c"
integrity sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==

gaxios@^5.0.0, gaxios@^5.0.1:
version "5.1.3"
resolved "https://registry.yarnpkg.com/gaxios/-/gaxios-5.1.3.tgz#f7fa92da0fe197c846441e5ead2573d4979e9013"
integrity sha512-95hVgBRgEIRQQQHIbnxBXeHbW4TqFk4ZDJW7wmVtvYar72FdhRIo1UGOLS2eRAKCPEdPBWu+M7+A33D9CdX9rA==
dependencies:
extend "^3.0.2"
https-proxy-agent "^5.0.0"
is-stream "^2.0.0"
node-fetch "^2.6.9"

gcp-metadata@^5.3.0:
version "5.3.0"
resolved "https://registry.yarnpkg.com/gcp-metadata/-/gcp-metadata-5.3.0.tgz#6f45eb473d0cb47d15001476b48b663744d25408"
integrity sha512-FNTkdNEnBdlqF2oatizolQqNANMrcqJt6AAYt99B3y1aLLC8Hc5IOBb+ZnnzllodEEf6xMBp6wRcBbc16fa65w==
dependencies:
gaxios "^5.0.0"
json-bigint "^1.0.0"

get-intrinsic@^1.1.3, get-intrinsic@^1.2.4:
version "1.2.4"
resolved "https://registry.yarnpkg.com/get-intrinsic/-/get-intrinsic-1.2.4.tgz#e385f5a4b5227d449c3eabbad05494ef0abbeadd"
@@ -1008,6 +1109,28 @@ glob-parent@~5.1.2:
dependencies:
is-glob "^4.0.1"

google-auth-library@^8.9.0:
version "8.9.0"
resolved "https://registry.yarnpkg.com/google-auth-library/-/google-auth-library-8.9.0.tgz#15a271eb2ec35d43b81deb72211bd61b1ef14dd0"
integrity sha512-f7aQCJODJFmYWN6PeNKzgvy9LI2tYmXnzpNDHEjG5sDNPgGb2FXQyTBnXeSH+PAtpKESFD+LmHw3Ox3mN7e1Fg==
dependencies:
arrify "^2.0.0"
base64-js "^1.3.0"
ecdsa-sig-formatter "^1.0.11"
fast-text-encoding "^1.0.0"
gaxios "^5.0.0"
gcp-metadata "^5.3.0"
gtoken "^6.1.0"
jws "^4.0.0"
lru-cache "^6.0.0"

google-p12-pem@^4.0.0:
version "4.0.1"
resolved "https://registry.yarnpkg.com/google-p12-pem/-/google-p12-pem-4.0.1.tgz#82841798253c65b7dc2a4e5fe9df141db670172a"
integrity sha512-WPkN4yGtz05WZ5EhtlxNDWPhC4JIic6G8ePitwUWy4l+XPVYec+a0j0Ts47PDtW59y3RwAhUd9/h9ZZ63px6RQ==
dependencies:
node-forge "^1.3.1"

gopd@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/gopd/-/gopd-1.0.1.tgz#29ff76de69dac7489b7c0918a5788e56477c332c"
@@ -1015,6 +1138,15 @@ gopd@^1.0.1:
dependencies:
get-intrinsic "^1.1.3"

gtoken@^6.1.0:
version "6.1.2"
resolved "https://registry.yarnpkg.com/gtoken/-/gtoken-6.1.2.tgz#aeb7bdb019ff4c3ba3ac100bbe7b6e74dce0e8bc"
integrity sha512-4ccGpzz7YAr7lxrT2neugmXQ3hP9ho2gcaityLVkiUecAiwiy60Ii8gRbZeOsXV19fYaRjgBSshs8kXw+NKCPQ==
dependencies:
gaxios "^5.0.1"
google-p12-pem "^4.0.0"
jws "^4.0.0"

guid-typescript@^1.0.9:
version "1.0.9"
resolved "https://registry.yarnpkg.com/guid-typescript/-/guid-typescript-1.0.9.tgz#e35f77003535b0297ea08548f5ace6adb1480ddc"
@@ -1060,6 +1192,14 @@ http-errors@2.0.0:
statuses "2.0.1"
toidentifier "1.0.1"

https-proxy-agent@^5.0.0:
version "5.0.1"
resolved "https://registry.yarnpkg.com/https-proxy-agent/-/https-proxy-agent-5.0.1.tgz#c59ef224a04fe8b754f3db0063a25ea30d0005d6"
integrity sha512-dFcAjpTQFgoLMzC2VwU+C/CbS7uRL0lWmxDITmqm7C+7F0Odmj6s9l6alZc6AELXhrnggM2CeWSXHGOdX2YtwA==
dependencies:
agent-base "6"
debug "4"

humanize-ms@^1.2.1:
version "1.2.1"
resolved "https://registry.yarnpkg.com/humanize-ms/-/humanize-ms-1.2.1.tgz#c46e3159a293f6b896da29316d8b6fe8bb79bbed"
@@ -1143,6 +1283,13 @@ is-stream@^2.0.0:
resolved "https://registry.yarnpkg.com/is-stream/-/is-stream-2.0.1.tgz#fac1e3d53b97ad5a9d0ae9cef2389f5810a5c077"
integrity sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==

js-tiktoken@^1.0.12:
version "1.0.12"
resolved "https://registry.yarnpkg.com/js-tiktoken/-/js-tiktoken-1.0.12.tgz#af0f5cf58e5e7318240d050c8413234019424211"
integrity sha512-L7wURW1fH9Qaext0VzaUDpFGVQgjkdE3Dgsy9/+yXyGEpBKnylTd0mU0bfbNkKDlXRb6TEsZkwuflu1B8uQbJQ==
dependencies:
base64-js "^1.5.1"

js-tiktoken@^1.0.7, js-tiktoken@^1.0.8:
version "1.0.10"
resolved "https://registry.yarnpkg.com/js-tiktoken/-/js-tiktoken-1.0.10.tgz#2b343ec169399dcee8f9ef9807dbd4fafd3b30dc"
@@ -1157,11 +1304,35 @@ js-yaml@^4.1.0:
dependencies:
argparse "^2.0.1"

json-bigint@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/json-bigint/-/json-bigint-1.0.0.tgz#ae547823ac0cad8398667f8cd9ef4730f5b01ff1"
integrity sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ==
dependencies:
bignumber.js "^9.0.0"

jsonpointer@^5.0.1:
version "5.0.1"
resolved "https://registry.yarnpkg.com/jsonpointer/-/jsonpointer-5.0.1.tgz#2110e0af0900fd37467b5907ecd13a7884a1b559"
integrity sha512-p/nXbhSEcu3pZRdkW1OfJhpsVtW1gd4Wa1fnQc9YLiTfAjn0312eMKimbdIQzuZl9aa9xUGaRlP9T/CJE/ditQ==

jwa@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/jwa/-/jwa-2.0.0.tgz#a7e9c3f29dae94027ebcaf49975c9345593410fc"
integrity sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA==
dependencies:
buffer-equal-constant-time "1.0.1"
ecdsa-sig-formatter "1.0.11"
safe-buffer "^5.0.1"

jws@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/jws/-/jws-4.0.0.tgz#2d4e8cf6a318ffaa12615e9dec7e86e6c97310f4"
integrity sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==
dependencies:
jwa "^2.0.0"
safe-buffer "^5.0.1"

kuler@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/kuler/-/kuler-2.0.0.tgz#e2c570a3800388fb44407e851531c1d670b061b3"
@@ -1349,6 +1520,11 @@ ms@2.1.3, ms@^2.0.0, ms@^2.1.1:
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.3.tgz#574c8138ce1d2b5861f0b44579dbadd60c6615b2"
integrity sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==

mustache@^4.2.0:
version "4.2.0"
resolved "https://registry.yarnpkg.com/mustache/-/mustache-4.2.0.tgz#e5892324d60a12ec9c2a73359edca52972bf6f64"
integrity sha512-71ippSywq5Yb7/tVYyGbkBggbU8H3u5Rz56fH60jGFgr8uHwxs+aSKeqmluIVzM0m0kB7xQjKS6qPfd0b2ZoqQ==

napi-build-utils@^1.0.1:
version "1.0.2"
resolved "https://registry.yarnpkg.com/napi-build-utils/-/napi-build-utils-1.0.2.tgz#b1fddc0b2c46e380a0b7a76f984dd47c41a13806"
@@ -1376,13 +1552,18 @@ node-domexception@1.0.0:
resolved "https://registry.yarnpkg.com/node-domexception/-/node-domexception-1.0.0.tgz#6888db46a1f71c0b76b3f7555016b63fe64766e5"
integrity sha512-/jKZoMpw0F8GRwl4/eLROPA3cfcXtLApP0QzLmUT/HuPCZWyB7IY9ZrMeKw2O/nFIqPQB3PVM9aYm0F312AXDQ==

node-fetch@^2.6.7:
node-fetch@^2.6.7, node-fetch@^2.6.9:
version "2.7.0"
resolved "https://registry.yarnpkg.com/node-fetch/-/node-fetch-2.7.0.tgz#d0f0fa6e3e2dc1d27efcd8ad99d550bda94d187d"
integrity sha512-c4FRfUm/dbcWZ7U+1Wq0AwCyFL+3nt2bEw05wfxSz+DWpWsitgmSgYmy2dQdWyKC1694ELPqMs/YzUSNozLt8A==
dependencies:
whatwg-url "^5.0.0"

node-forge@^1.3.1:
version "1.3.1"
resolved "https://registry.yarnpkg.com/node-forge/-/node-forge-1.3.1.tgz#be8da2af243b2417d5f646a770663a92b7e9ded3"
integrity sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA==

nodemon@^3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/nodemon/-/nodemon-3.1.0.tgz#ff7394f2450eb6a5e96fe4180acd5176b29799c9"
@@ -2079,6 +2260,11 @@ zod-to-json-schema@^3.22.3:
resolved "https://registry.yarnpkg.com/zod-to-json-schema/-/zod-to-json-schema-3.22.5.tgz#3646e81cfc318dbad2a22519e5ce661615418673"
integrity sha512-+akaPo6a0zpVCCseDed504KBJUQpEW5QZw7RMneNmKw+fGaML1Z9tUNLnHHAC8x6dzVRO1eB2oEMyZRnuBZg7Q==

zod-to-json-schema@^3.22.4:
version "3.23.0"
resolved "https://registry.yarnpkg.com/zod-to-json-schema/-/zod-to-json-schema-3.23.0.tgz#4fc60e88d3c709eedbfaae3f92f8a7bf786469f2"
integrity sha512-az0uJ243PxsRIa2x1WmNE/pnuA05gUq/JB8Lwe1EDCCL/Fz9MgjYQ0fPlyc2Tcv6aF2ZA7WM5TWaRZVEFaAIag==

zod@^3.22.3, zod@^3.22.4:
version "3.22.4"
resolved "https://registry.yarnpkg.com/zod/-/zod-3.22.4.tgz#f31c3a9386f61b1f228af56faa9255e845cf3fff"