This repository houses a collection of code artifacts, including GitHub Actions, GitHub workflows, and the essential docker-compose files. These resources support efficient, automated staging and production deployments.
```bash
cd compose
```
- In the `/redis` folder, create a random password for each redis service (replace `PASSWORD`). See the `/redis/example.*.conf` files for the required inputs. (A sketch for generating these passwords follows the replica.key commands below.)
- Configure the environment for each service by creating your `.env.*` files. See the `example.env.*` files for the required inputs.
- Create `replica.key` for mongodb:
```bash
openssl rand -base64 756 > mongo/replica.key  # create replica.key
chmod 400 mongo/replica.key                   # read-only
sudo chown 999:999 mongo/replica.key          # change ownership
```
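For the redis password step above, one possible approach is a short shell loop. This is only a sketch: it assumes each `example.*.conf` contains a literal `PASSWORD` placeholder and that the final conf files are named like the example files with the `example.` prefix dropped.

```bash
# Sketch: generate a unique random password per redis service.
cd redis
for example in example.*.conf; do
  conf="${example#example.}"        # e.g. example.cache.conf -> cache.conf (assumed naming)
  secret="$(openssl rand -hex 32)"  # random password
  sed "s/PASSWORD/${secret}/g" "$example" > "$conf"
done
cd ..
```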
- On GitHub, generate an app and get its private key; place it in the file `github/githubapp.private-key.pem`.
- Create an origin certificate on Cloudflare. Place both the `.pem` and `.key` files in `/nginx/ssl` (make sure their names match the `DOMAIN` in `.env.nginx`).
- Allocate memswap:
```bash
free -h                          # Check existing swap space
sudo fallocate -l 32G /swapfile  # Create a 32GB swap file
sudo chmod 600 /swapfile         # Set the correct permissions
sudo mkswap /swapfile            # Set up the swap file
sudo swapon /swapfile            # Enable the swap file

# Verify swap space
swapon --show
free -h

# Persist the swap file across reboots
echo '/swapfile swap swap defaults 0 0' | sudo tee -a /etc/fstab
```
- Run `docker compose up -d`
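After bringing the stack up, it can be useful to confirm the containers are running before starting the database migrations; the service name in the logs command is only illustrative.

```bash
docker compose ps                        # list services and their state
docker compose logs -f --tail=50 nginx   # follow one service's logs (service name is illustrative)
```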
- Install the neo4j-migrations tool
- Create a `.migrations.properties` file under the db folder for neo4j, based on `.migrations.properties.example`
- Run `neo4j-migrations apply` within the `db` directory (the directory containing `.migrations.properties`) to run the neo4j migrations
- Install the mongodb-migrations tool
- Change directory to `db/mongo`
- Create a `config.ini` file based on `config.ini.example`
- Run `mongodb-migrate` within that directory
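Put together, the neo4j and mongodb migration steps above boil down to the following commands (a sketch; paths are relative to the repository layout described in the previous steps).

```bash
# Neo4j migrations (expects .migrations.properties in the db folder)
cd db
neo4j-migrations apply

# MongoDB migrations (expects config.ini in db/mongo)
cd mongo
mongodb-migrate
```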
The Qdrant migration scripts handle various data migration tasks, including renaming collections and transferring data between different storage systems.

This migration (`V001_collection_names.py`) renames Qdrant collections from the `[communityId]_[platformName]` format to the `[communityId]_[platformId]` format.
- Install the required dependencies:

```bash
cd db/qdrant
pip install -r v001_requirements.txt
```

- Set up environment variables by creating a `.env` file based on the `.env.v001.example` file
- Run the migration script:

```bash
cd db/qdrant
python V001_collection_names.py
```
The script will:

- Identify collections that match the pattern `[communityId]_[platformName]`
- Look up the corresponding platformId in MongoDB
- Create new collections with names in the format `[communityId]_[platformId]`
- Transfer all data from the old collections to the new ones
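To sanity-check the rename, the Qdrant REST API can list collection names before and after the run. The host, port, and api-key header below are assumptions about a typical local deployment, not values taken from this repository.

```bash
# List collection names on a local Qdrant instance (adjust host/port/key as needed).
curl -s -H "api-key: ${QDRANT_API_KEY:-}" http://localhost:6333/collections | python -m json.tool
```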
This migration (`V002_migrate_discord_pgvector.py`) moves Discord data from PostgreSQL vector storage (pgvector) to Qdrant vector storage for all Discord platforms, including both regular Discord messages and Discord summaries.
- Install the required dependencies:

```bash
cd db/qdrant
pip install -r v002_requirements.txt
```

- Set up environment variables by creating a `.env` file based on the `.env.v002.example` file
- Run the migration script:

```bash
cd db/qdrant
python V002_migrate_discord_pgvector.py
```
Optional flags:

- `--dry-run`: Run in dry-run mode to preview the migration without actually transferring data
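For example, a preview run followed by the real migration could look like this:

```bash
cd db/qdrant
python V002_migrate_discord_pgvector.py --dry-run   # preview only, nothing is written
python V002_migrate_discord_pgvector.py             # actual migration
```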
The script will:

- Connect to MongoDB to fetch all Discord platforms from the `Core` database
- For each Discord platform, connect to the corresponding PostgreSQL community database
- Retrieve Discord messages and summaries with their embeddings
- Transfer all data to Qdrant collections using the platform ID
- Create separate collections for regular messages and summaries (`platform_id` and `platform_id_summary`)
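Afterwards, the migrated collections can be spot-checked via the Qdrant REST API. The platform id below is a placeholder, and an `api-key` header may be needed depending on your setup.

```bash
# Inspect the migrated collections, including their point counts.
PLATFORM_ID="<platform-id>"   # placeholder: replace with a real platform id
curl -s "http://localhost:6333/collections/${PLATFORM_ID}" | python -m json.tool
curl -s "http://localhost:6333/collections/${PLATFORM_ID}_summary" | python -m json.tool
```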
To create HTTP basic auth credentials for nginx, generate an `.htpasswd` file:

```bash
cd /nginx/htpasswd/
htpasswd -c .htpasswd username
```
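Note that `-c` creates (and overwrites) the file, so it is only needed for the first user; further users can be appended to the existing file without it.

```bash
htpasswd .htpasswd another_username   # add or update a user in the existing file (name is illustrative)
```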