Commit

Merge branch 'main' into incubator-initial-gha
tylerthome authored Oct 22, 2024
2 parents cbb213f + 4bccfda commit ade74dd
Showing 27 changed files with 651 additions and 968 deletions.
33 changes: 11 additions & 22 deletions .github/workflows/run-tests-v1.yml
@@ -11,34 +11,23 @@ jobs:
defaults:
run:
shell: bash
working-directory: ./flask-api
working-directory: ./backend
steps:
- uses: actions/checkout@main
with:
fetch-depth: 500
fetch-tags: true
- name: Set up Python 3.10
uses: actions/setup-python@v4
- name: Install poetry
run: pipx install poetry
- name: Set up Python 3.12
uses: actions/setup-python@v5
with:
python-version: "3.10"
cache: "pip"
- name: Upgrade pip
run: python -m pip install --upgrade pip
- name: Install API dev Dependencies
run: |
pip install -r requirements-dev.txt
- name: Run development tests with mocking enabled, using tox
run: |
# Use tox because it is configured to test against the same package type being deployed
tox
- name: Run release tests with mocking disabled, using tox
env:
COGNITO_REGION: ${{ secrets.COGNITO_REGION }}
COGNITO_ACCESS_ID: ${{ secrets.COGNITO_ACCESS_ID }}
COGNITO_ACCESS_KEY: ${{ secrets.COGNITO_ACCESS_KEY }}
run: |
echo "COGNITO_REGION set"
tox -e releasetest
python-version: "3.12"
cache: "poetry"
- name: Install API Dependencies
run: poetry install --with test
- name: Run tests
run: poetry run pytest
test-app:
runs-on: ubuntu-latest
defaults:
58 changes: 35 additions & 23 deletions README.md
@@ -14,28 +14,28 @@ This project is part of a larger initiative at Hack for LA around creating a sha

## Technology Overview

The HomeUniteUs project is structured as a multi-[docker](https://docs.docker.com/) container application (for local development and testing), with secret-protected access to networked resources. The project contains three containers, whose activities are coordinated using the `docker compose` configuration outlined in `docker-compose.yml`. The three containers are:

1. `app`: A frontend [React](https://reactjs.org/docs/getting-started.html) app developed using [TypeScript](https://www.typescriptlang.org/).
* We use [Redux](https://redux.js.org/) to manage client state, with the [Redux Toolkit](https://redux-toolkit.js.org/) to simplify development.
* We use the [Material UI](https://material-ui.com/) component library, for access to high quality UI components designed with accessibility in mind.
* We use the [Vite](https://vitejs.dev/) build tool, for fast dev builds and optimized production builds.
2. `api`: A backend python [connexion](https://connexion.readthedocs.io/en/latest/) REST API, hosted on [AWS](https://docs.aws.amazon.com/).
* We use `connexion` to simplify our API specification and validation. `connexion` uses the [OpenAPI](https://www.openapis.org/) [specification](https://spec.openapis.org/oas/v3.0.1.html) to map API URLs to specific python functions. It will handle routing, validation, security, and parameter passing when an API request is made. `connexion` also generates documentation for our API, and provides a useful user interface (at `{URL}/api/ui`) that can be used to easily interact with the API for development purposes. We use the variant of `connexion` that runs on top of [Flask](https://flask.palletsprojects.com/en/1.1.x/).
* We use the [SQLAlchemy](https://www.sqlalchemy.org/) SQL toolkit and [Object-Relational Mapper (ORM)](https://en.wikipedia.org/wiki/Object%E2%80%93relational_mapping). This serves as a developer-friendly layer between the python app and the database.
* We use [gunicorn](https://gunicorn.org/) as our WSGI server. A WSGI server acts as a communication intermediary between an HTTP proxy and the Home Unite Us API, handling client requests and passing them to the application, then returning the application's responses to the client.
* We use [nginx](https://nginx.org/en/docs/) as our HTTP server. This “reverse proxy” sits in front of the WSGI server, and handles a number of complex web server tasks. It is capable of load balancing across the WSGI server workers, managing TLS connections, serving static files, and more.
The HomeUniteUs project is structured as a multi-[docker](https://docs.docker.com/) container application, with secret-protected access to networked resources. The project contains five containers, whose activities are coordinated using the `docker compose` configuration outlined in `docker-compose.yml`. The five containers are:

1. `frontend`: A frontend [React](https://reactjs.org/docs/getting-started.html) app developed using [TypeScript](https://www.typescriptlang.org/).
* It uses [Redux](https://redux.js.org/) to manage client state, with the [Redux Toolkit](https://redux-toolkit.js.org/) to simplify development.
* It uses the [Material UI](https://material-ui.com/) component library, for access to high quality UI components designed with accessibility in mind.
* It uses the [Vite](https://vitejs.dev/) build tool, for fast dev builds and optimized production builds.
2. `backend`: A backend python API, hosted on [AWS](https://docs.aws.amazon.com/).
* It uses `FastAPI` as its web framework.
* It uses the [SQLAlchemy](https://www.sqlalchemy.org/) SQL toolkit and [Object-Relational Mapper (ORM)](https://en.wikipedia.org/wiki/Object%E2%80%93relational_mapping). This serves as a developer-friendly layer between the Python app and the database.
3. `db`: A [PostgreSQL](https://www.postgresql.org/) database container.
* The database is stored as a docker volume, `db-data`.
* If the volume is not found during spin-up, then an empty database volume will be created.

In the production environment, each of these services along with `nginx` are deployed onto an EC2 instance and managed as `systemd` service units instead of with Docker.
4. `motoserver`: A development tool. It runs [`moto`](http://docs.getmoto.org/en/latest/docs/server_mode.html) in Server Mode.
* It allows developers to mock AWS so that AWS secrets are not needed for local development. This tool is used because HUU uses AWS Cognito as its identity and access provider. However, most local development does not need to make actual calls to AWS Cognito for HUU feature development. Using this tool allows developers to log in to HUU on their development machine (a brief sketch of this pattern appears after this list).
* It has a dashboard located at http://127.0.0.1:5000/moto-api/
5. `pgadmin`: An optional development tool. It is a container running [pgAdmin4](https://www.pgadmin.org/). pgAdmin4 is a database administration and development platform for PostgreSQL.
    * This tool can be used to run queries against the PostgreSQL server running in the `db` container.
* It is accessed by going to http://127.0.0.1:5050
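
Because the backend reads `COGNITO_ENDPOINT_URL` from its environment (see `backend/.env.example` later in this diff), Cognito calls can be pointed at the local `motoserver` container instead of real AWS. The snippet below is only a minimal sketch of that pattern, assuming `boto3` is installed and the moto container is running on its default port; the pool name and the standalone script are illustrative (the repository's `startup_scripts/setup_moto_server.py` presumably automates the real setup).

```python
# Illustrative only: talk to the mocked Cognito provided by the motoserver
# container instead of real AWS. Credentials just need to be non-empty.
import boto3

cognito = boto3.client(
    "cognito-idp",
    region_name="us-east-1",               # matches COGNITO_REGION in .env.example
    aws_access_key_id="testing",
    aws_secret_access_key="testing",
    endpoint_url="http://127.0.0.1:5000",  # matches COGNITO_ENDPOINT_URL
)

# Create a throwaway user pool inside moto and confirm it exists.
pool = cognito.create_user_pool(PoolName="local-dev-pool")
print(pool["UserPool"]["Id"])
print([p["Name"] for p in cognito.list_user_pools(MaxResults=10)["UserPools"]])
```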

## Build Instructions

Before you can build the project, you will require a `.env` file containing access keys to the application's third-party services. Please message a team member on the [#home-unite-us slack channel](https://hackforla.slack.com/archives/CRWUG7X0C) once you've completed onboarding. See the [api](./api/README.md) and [app](./app/README.md) READMEs for more information about the required and optional environment variables.

Since this project is dockerized, you can choose to either build the backend and frontend apps as docker containers or directly onto your local machine. This guide will focus on docker builds, but full local build and deployment instructions can be found in the [api](./api/README.md) and [app](./app/README.md) READMEs.
Since this project is Dockerized, you can choose to either build the backend and frontend apps as Docker containers or directly onto your local machine. This guide will focus on Docker builds, but full local build and deployment instructions can be found in the [api](./backend/README.md) and [app](./frontend/README.md) READMEs.

Also note that the code in this repo *should* build without issue on Linux, Windows, and macOS. We do, however, utilize some Linux-only tools during deployment and primarily target the Linux platform.

@@ -45,16 +45,28 @@ Building with Docker is the simplest option, and debugging applications within t

#### Requirements

* A copy of the `.env` file described above
* An up-to-date installation of [docker](https://docs.docker.com/get-docker/)
* An up-to-date installation of [Docker](https://docs.docker.com/get-docker/)

#### Instructions

1. Place a copy of the `.env` file in the `app` directory
2. Place a copy of the `.env` file in the `api` directory
3. Build all three containers by running the `docker compose up` shell command from the root directory:
4. Verify there are no build errors, and open `localhost:4040` in any browser, to see the application
1. Build and run all containers by running the `docker compose up -d --build` shell command from the root directory:
2. Verify there are no build errors. If there are build errors, reach out to the development team.
3. Open `http://localhost:34828` in any browser to use Home Unite Us.

* `pgAdmin4` is available at http://localhost:5050/browser/ to query the database.
* `moto` server is available at http://localhost:5000/moto-api/ to view mocked AWS data.
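
As a quick check that the stack came up, something like the following can be run from any Python 3 environment; it is only a sketch that assumes the default ports listed above and no changes to `docker-compose.yml`.

```python
# Rough availability check for the local Docker stack; ports are the
# defaults mentioned above and may differ if the compose file was changed.
from urllib.request import urlopen

services = {
    "frontend": "http://localhost:34828",
    "moto (mock AWS)": "http://localhost:5000/moto-api/",
    "pgAdmin4": "http://localhost:5050/browser/",
}

for name, url in services.items():
    try:
        with urlopen(url, timeout=5) as resp:
            print(f"{name}: HTTP {resp.status}")
    except OSError as exc:
        print(f"{name}: not reachable ({exc})")
```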

#### Test Users

For local development, test users are created automatically when the project is started with Docker.

The password for all test users is `Test123!`.

- 1 Admin: [email protected]
- 26 Guests: guest[a-z]@email.com (e.g. `[email protected]`, `[email protected]`, ... `[email protected]`)
- 26 Coordinators: coordinator[a-z]@email.com (e.g. `[email protected]`, `[email protected]`, ... `[email protected]`)
- 26 Hosts: host[a-z]@email.com (e.g. `[email protected]`, `[email protected]`, ... `[email protected]`)

## Testing Instructions

Testing instructions for each application are in the [api](./api/README.md) and [app](./app/README.md) README files.
Testing instructions for each application are in the [backend](./backend/README.md) and [frontend](./frontend/README.md) README files.
19 changes: 10 additions & 9 deletions backend/.env.example
@@ -1,9 +1,10 @@
COGNITO_CLIENT_ID=
COGNITO_CLIENT_SECRET=
COGNITO_REGION=
COGNITO_REDIRECT_URI=http://localhost:34828/signin
COGNITO_USER_POOL_ID=
COGNITO_ACCESS_ID=
COGNITO_ACCESS_KEY=
ROOT_URL=http://localhost:34828
DATABASE_URL=sqlite:///./homeuniteus.db
COGNITO_CLIENT_ID=testing
COGNITO_CLIENT_SECRET=testing
COGNITO_REGION=us-east-1
COGNITO_REDIRECT_URI=http://localhost:4040/signin
COGNITO_USER_POOL_ID=testing
COGNITO_ACCESS_ID=testing
COGNITO_ACCESS_KEY=testing
COGNITO_ENDPOINT_URL=http://127.0.0.1:5000
ROOT_URL=http://localhost:4040
DATABASE_URL=postgresql+psycopg2://postgres:[email protected]:5432/huu
48 changes: 48 additions & 0 deletions backend/Dockerfile
@@ -0,0 +1,48 @@
FROM python:3.12-bookworm AS builder

# --- Install Poetry ---
ARG POETRY_VERSION=1.8

ENV POETRY_HOME=/opt/poetry
ENV POETRY_NO_INTERACTION=1
ENV POETRY_VIRTUALENVS_IN_PROJECT=1
ENV POETRY_VIRTUALENVS_CREATE=1
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Tell Poetry where to place its cache and virtual environment
ENV POETRY_CACHE_DIR=/opt/.cache

RUN pip install "poetry==${POETRY_VERSION}"

WORKDIR /app

# --- Reproduce the environment ---
COPY pyproject.toml .

# Install the dependencies and clear the cache afterwards.
# This may save some MBs.
RUN poetry install --no-root && rm -rf $POETRY_CACHE_DIR

# Now let's build the runtime image from the builder.
# We'll just copy the env and the PATH reference.
FROM python:3.12-bookworm AS runtime

ENV VIRTUAL_ENV=/app/.venv
ENV PATH="/app/.venv/bin:$PATH"

COPY --from=builder ${VIRTUAL_ENV} ${VIRTUAL_ENV}

COPY ./alembic /code/alembic
COPY ./alembic.ini /code/alembic.ini
COPY ./app /code/app
COPY ./form_data /code/form_data
COPY ./startup_scripts/entrypoint.sh /code/startup_scripts/entrypoint.sh
COPY ./startup_scripts/setup_moto_server.py /code/startup_scripts/setup_moto_server.py
COPY ./startup_scripts/create_groups_users.py /code/startup_scripts/create_groups_users.py

RUN chmod +x /code/startup_scripts/entrypoint.sh

WORKDIR /code
ENTRYPOINT ["/code/startup_scripts/entrypoint.sh"]
CMD []
EXPOSE 8000
53 changes: 45 additions & 8 deletions backend/README.md
@@ -13,49 +13,86 @@ This server uses:

## Requirements

You will need Python 3.8+ to install Poetry.
You will need Python 3.12+ to install Poetry.

Run `python -V` to check the Python version.

**Note**: On some systems, you might need to use the `python3` and `pip3` commands.

[Poetry](https://python-poetry.org/docs/#installation) is used to manage the project dependencies. Follow the [installation instructions](https://python-poetry.org/docs/#installation) to run the CLI globally.

[Docker](https://www.docker.com) is used to run required dependencies for development.

## Usage - Development

### Getting Started

#### Run Required Docker Containers

The API uses PostgreSQL and Moto server as its basic required services. Using Docker Compose, start these containers before running the API with the following command:

```shell
docker compose up -d --build pgadmin motoserver # Runs required docker services: PostgreSQL, Moto Server, pgAdmin4
```

The command above will run three containers. `pgAdmin4` is a convenient tool that allows developers to query the PostgreSQL database.
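
Before starting the API it can be useful to confirm the database container is reachable. A minimal check, assuming the `db` container is up, `psycopg2` is installed (the connection string presumes it), and the connection string from `.env.example` is unchanged, might look like:

```python
# Sketch: verify the PostgreSQL container accepts connections using the
# DATABASE_URL value from backend/.env.example (adjust if yours differs).
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://postgres:postgres@127.0.0.1:5432/huu")
with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())
```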

#### Configuration

The API configuration must be specified before running the application. Configuration variables are specified as entries within a `.env` file located within the `backend` directory. To get started, create a `.env` file within `/backend` and copy the values from `.env.example` into the new `.env` file. You may have to contact someone from the development team to get the necessary values.
The API configuration must be specified before running the application. Configuration variables are specified as entries within a `.env` file located within the `backend` directory. To get started, create a `.env` file within `/backend` and copy the values from `.env.example` into the new `.env` file.
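
For reference, `alembic/env.py` reads these values through `app.core.config.get_settings()`. The exact implementation lives in `app/core/config.py`; the sketch below is only a hypothetical illustration (using `pydantic-settings`) of how `.env` entries such as `DATABASE_URL` typically map onto a settings object.

```python
# Hypothetical illustration of a settings loader backed by the .env file.
# Only get_settings() and DATABASE_URL are confirmed by this repository
# (see backend/alembic/env.py); field names and defaults here are examples.
from functools import lru_cache
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    COGNITO_CLIENT_ID: str = "testing"
    COGNITO_REGION: str = "us-east-1"
    COGNITO_ENDPOINT_URL: str | None = None
    DATABASE_URL: str = "postgresql+psycopg2://postgres:postgres@127.0.0.1:5432/huu"


@lru_cache
def get_settings() -> Settings:
    return Settings()
```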

#### Setup and Run
#### Setup and Run API - non-Docker version

Once the `.env` file has been configured and Poetry is installed, run the following commands in the `backend` directory to install the required development dependencies and run the application.

```shell
poetry install # Installs all dependencies
poetry install # Installs all dependencies

poetry shell # Activates the virtual environment
poetry shell # Activates the virtual environment

poetry run fastapi dev app/main.py # Runs this server in developer mode
# If using a shell use this:
startup_scripts/entrypoint.sh # Creates test users and runs the API in developer mode

# If using PowerShell use this:
startup_scripts/entrypoint.ps1 # Creates test users and runs the API in developer mode
```

Your server is now running at:
```
http://127.0.0.1:8000
http://localhost:8000
```

And your API docs at:
```
http://127.0.0.1:8000/docs
http://localhost:8000/docs
```

pgAdmin4 is available at:
```
http://localhost:5050/browser
```

Moto server dashboard is available at:
```
http://localhost:5000/moto-api
```
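
Once the server is up, a quick smoke test is to fetch the OpenAPI schema, which FastAPI serves at `/openapi.json` by default; this sketch assumes the dev server from the steps above is listening on port 8000.

```python
# Smoke test: list the routes the running API exposes.
import json
from urllib.request import urlopen

with urlopen("http://localhost:8000/openapi.json") as resp:
    schema = json.load(resp)

print(schema["info"]["title"])   # API title from the FastAPI app
print(sorted(schema["paths"]))   # every route currently registered
```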

To exit the virtual environment, within the shell run:
```shell
exit
```

## Test Users

The `startup_scripts/entrypoint.sh` (or `startup_scripts/entrypoint.ps1` if using PowerShell) script creates the following users.

The password for all test users is `Test123!`.

- 1 Admin: [email protected]
- 26 Guests: guest[a-z]@example.com (e.g. `[email protected]`, `[email protected]`, ... `[email protected]`)
- 26 Coordinators: coordinator[a-z]@example.com (e.g. `[email protected]`, `[email protected]`, ... `[email protected]`)
- 26 Hosts: host[a-z]@example.com (e.g. `[email protected]`, `[email protected]`, ... `[email protected]`)

## Conventions

### API Endpoints
2 changes: 1 addition & 1 deletion backend/alembic.ini
@@ -60,7 +60,7 @@ version_path_separator = os # Use os.pathsep. Default configuration used for ne
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = sqlite:///./homeuniteus.db
# sqlalchemy.url =


[post_write_hooks]
38 changes: 21 additions & 17 deletions backend/alembic/env.py
@@ -1,17 +1,27 @@
from app.core.db import Base
from app.core.config import get_settings

import app.modules.access.models
import app.modules.intake_profile.models
import app.modules.intake_profile.forms.models
import app.modules.onboarding.models
import app.modules.matching.models
import app.modules.relationship_management.models
import app.modules.tenant_housing_orgs.models
import app.modules.workflow.models

from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy import create_engine

from alembic import context

import sys
import os

print(os.getcwd())
sys.path.append(os.getcwd())

from app import models as db

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
@@ -23,15 +23,15 @@

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = db.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.

database_url = get_settings().DATABASE_URL


def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.
Expand All @@ -45,9 +55,8 @@ def run_migrations_offline() -> None:
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
url=database_url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
@@ -70,16 +79,11 @@ def run_migrations_online() -> None:
# with the test engine configuration.
connectable = context.config.attributes.get("connection", None)
if connectable is None:
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
connectable = create_engine(database_url)

with connectable.connect() as connection:
context.configure(
connection=connection, target_metadata=target_metadata
)
context.configure(connection=connection,
target_metadata=target_metadata)

with context.begin_transaction():
context.run_migrations()