
galaxy

Data Portal web app

Dependencies

Getting Started

Setup Environment

  1. Clone this repo
$ git clone https://github.com/kippnorcal/galaxy.git
  2. Install Pipenv
$ pip install pipenv
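
The app itself runs in Docker, but if you also want the dependencies in a local Pipenv environment (useful for the secret key one-liner further below, editor tooling, etc.; this assumes the repo's Pipfile lists them):

$ pipenv install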
  3. Install Docker
  4. Create .env file with project secrets

dev environment

SECRET_KEY=
POSTGRES_USER=
POSTGRES_PASSWORD=
POSTGRES_DB=
TABLEAU_TRUSTED_URL=
USER_DOMAIN=
SAML_ENTITY_ID=
SAML_URL=
SAML_SLO=
SAML_CERT=
ROLLBAR_TOKEN=
APP_DOMAIN=

prod environment (same variables as dev, plus the following):

SSL=1
ALLOWED_HOSTS=[]
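
As an illustration only (placeholder values, and the list syntax for ALLOWED_HOSTS is an assumption based on the [] default above — check how settings.py actually parses it), a prod deployment serving the app at portal.example.org might add:

SSL=1
ALLOWED_HOSTS=['portal.example.org']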

Generating a unique secret key can be done via Django:

from django.core.management.utils import get_random_secret_key 
get_random_secret_key()
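
If you'd rather not open an interactive shell, the same thing works as a one-liner (assuming Django is installed in your local Pipenv environment, as set up above):

$ pipenv run python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"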
  5. Build Docker Image
$ docker-compose build

Running the Server

dev environment

$ docker-compose up -d
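
To follow the app's logs while it runs (the service is named web, as in the other commands in this README):

$ docker-compose logs -f web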

prod environment

$ docker-compose -f docker-compose.prod.yml up -d

Note: The first time you run docker-compose up, you may get an error about the database not being available because the web container starts before Postgres is ready. Run docker-compose down and then rerun docker-compose up.
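
If this race keeps biting, one common fix is to have the web service wait for Postgres with a healthcheck. A minimal sketch, assuming the compose file defines services named db and web (adapt to the actual docker-compose.yml, and note that depends_on conditions require a Compose version that supports them):

services:
  db:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U $${POSTGRES_USER}"]
      interval: 5s
      retries: 5
  web:
    depends_on:
      db:
        condition: service_healthy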

Running Database Migrations

$ docker-compose run web python manage.py migrate
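
If you have changed any models, generate the migration files first (standard Django workflow; if your dev compose file mounts the source directory, the generated files will land in your working tree):

$ docker-compose run web python manage.py makemigrations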

Create a superuser (if starting fresh)

$ docker-compose run web python manage.py createsuperuser
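
On Django 3.0+ this can also be done non-interactively by passing DJANGO_SUPERUSER_* environment variables. Whether the username/email fields apply depends on the project's user model, so treat this as a sketch with placeholder credentials:

$ docker-compose run -e DJANGO_SUPERUSER_USERNAME=admin -e DJANGO_SUPERUSER_EMAIL=admin@example.org -e DJANGO_SUPERUSER_PASSWORD=changeme web python manage.py createsuperuser --noinput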

Export data from another instance

Note: docker-compose must be up to run the following command(s)

$ docker-compose exec web python manage.py dumpdata --indent 2 --exclude=contenttypes >> db.json

Copy the db.json file to your new repo
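
Depending on what you want to carry over, it can also help to exclude permissions and sessions, since Django recreates permissions on migrate and sessions are disposable (a sketch, adjust to taste):

$ docker-compose exec web python manage.py dumpdata --indent 2 --exclude=contenttypes --exclude=auth.permission --exclude=sessions > db.json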

Import data from another instance

$ docker-compose exec web python manage.py loaddata db.json

Taking the server down

$ docker-compose down
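
Note that plain docker-compose down leaves the database volume in place. Adding -v also removes the volumes, which wipes the Postgres data (assuming the compose file stores it in a volume), so only use it when you want a clean slate:

$ docker-compose down -v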

Running pytest (server must be running)

Run tests

$ docker-compose run web pytest

Run tests and check coverage for all modules

$ docker-compose run web pytest --cov=.

Run tests and check coverage for a specific module

$ docker-compose run web pytest --cov=accounts accounts
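
pytest-cov (the plugin behind --cov) can also write an HTML report, which is handy for browsing uncovered lines; by default it ends up in an htmlcov/ directory:

$ docker-compose run web pytest --cov=. --cov-report=html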

Collect static files for production

$ docker-compose -f docker-compose.prod.yml exec web python manage.py collectstatic --no-input --clear