- Go v1.21
- Node v20.12.0
To compile the full REDEMOS web application, run the following command:
$ ./make.sh
This will build both the redemos backend service binary and the front-end Vue.js application inside a local ./dist folder.
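The script wraps the two underlying builds. If you prefer to run them by hand, they correspond roughly to a Go build plus an npm build; the exact paths below (in particular the frontend directory) are assumptions, not taken from make.sh:

$ go build -o ./bin/redemos .                      # backend binary, matching the run command below
$ (cd frontend && npm install && npm run build)    # assumption: front-end sources live in ./frontend and emit ./dist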
To run the redemos server, you need a configuration file. A default one is provided with the code; it will try to connect to a local SQLite database file called db.sqlite. To create that file, see the How to create the database section below.
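The exact keys of the configuration file are defined by the application itself. As a rough, hypothetical sketch of the kind of values it carries (all key names below are assumptions), it could look like this, based on the defaults mentioned above (SQLite file db.sqlite, server bound to 0.0.0.0:8080):

# data/config.yml: hypothetical sketch, actual key names may differ
server:
  host: 0.0.0.0
  port: 8080
database:
  driver: sqlite
  file: db.sqlite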
Run the following command to start the server with the default config:
$ ./bin/redemos --configFile data/config.yml
2024/03/09 19:15:14.687189 Read configuration file from data/config.yml
┌───────────────────────────────────────────────────┐
│                   Fiber v2.52.0                   │
│               http://127.0.0.1:8080               │
│       (bound on host 0.0.0.0 and port 8080)       │
│                                                   │
│ Handlers ............. 8  Processes ........... 1 │
│ Prefork ....... Disabled  PID ............. 96441 │
└───────────────────────────────────────────────────┘
Run redemos --help to find out all the arguments you can provide.
You can run database migrations and import data through the main redemos binary. To create the database from scratch, run the following command:
$ ./bin/redemos -c ./data/config.yml --migrate
A set of default data (aka the ALPHA redemos survey) is included inside the ./data folder as a set of YAML files. To import these into the created database, run the following command:
$ ./bin/redemos -c ./data/config.yml --import './data/alpha_survey/*.yml'
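The schema of these survey files is not documented in this section. Purely as an illustration, such a file might look roughly like the sketch below; all field names here are hypothetical, so check the files in ./data/alpha_survey for the real format:

# hypothetical illustration only, not the actual ALPHA survey schema
title: Example survey
questions:
  - id: q1
    text: How satisfied are you with the service?
    type: scale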
Both of the above commands can be combined in a single run. To create the database from scratch and import the data, without actually starting the server, use the following command:
$ ./bin/redemos -c ./data/config.yml -t --migrate --import './data/alpha_survey/*.yml'
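Putting it all together, a first-time setup amounts to the following sequence of the commands shown above: build, migrate and import without serving, then start the server.

$ ./make.sh
$ ./bin/redemos -c ./data/config.yml -t --migrate --import './data/alpha_survey/*.yml'
$ ./bin/redemos --configFile data/config.yml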
To build and run the application as a Docker container, first build the image with the following command:
$ docker build -t redemos .
After the build is finished, you can run the application with the following command:
$ docker run -p 8080:8080 redemos
This will perform database migrations and import all data as described in the previous section.
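Inside the container the server listens on port 8080 (the container-side port of the -p mapping above), so you can publish it on a different host port if 8080 is already taken on your machine, for example:

$ docker run -p 3000:8080 redemos

The application is then reachable at http://127.0.0.1:3000.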