
[ISSUE 2716] Create user tables for basic login.gov #2760

Merged: 29 commits into main from 2716/basic-user-tables, Nov 12, 2024

Conversation

@babebe (Collaborator) commented on Nov 7, 2024

Summary

Fixes #2716

Time to review: 10 mins

Changes proposed

3 user tables
migration script
updated factories to create new users

Context for reviewers

User tables will be used for OAuth2.

Additional information

Screenshots, GIF demos, code examples or output to help show the changes working as expected.
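No code example was attached to this PR, so here is a minimal sketch of what three user tables for a basic login.gov integration might look like in `api/src/db/models/user_models.py`. The table names, columns, and the external-ID linkage are illustrative assumptions, not the actual schema from this PR; only the file path, the "3 user tables" count, and the login.gov/OAuth2 purpose come from the conversation above.

```python
# Hypothetical sketch of api/src/db/models/user_models.py.
# Table and column names are assumptions for illustration only.
import uuid
from datetime import datetime

from sqlalchemy import ForeignKey, UniqueConstraint
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship


class Base(DeclarativeBase):
    pass


class User(Base):
    """Core user record."""

    __tablename__ = "user"

    user_id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    created_at: Mapped[datetime]
    updated_at: Mapped[datetime]


class LinkExternalUser(Base):
    """Links a user to an external identity provider (e.g. login.gov)."""

    __tablename__ = "link_external_user"
    __table_args__ = (UniqueConstraint("external_user_type", "external_user_id"),)

    link_external_user_id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    user_id: Mapped[uuid.UUID] = mapped_column(ForeignKey("user.user_id"))
    external_user_type: Mapped[str]  # e.g. "login_gov"
    external_user_id: Mapped[str]  # subject identifier from the provider
    email: Mapped[str]

    user: Mapped[User] = relationship(User)


class UserTokenSession(Base):
    """Session/token bookkeeping for an authenticated user."""

    __tablename__ = "user_token_session"

    token_id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    user_id: Mapped[uuid.UUID] = mapped_column(ForeignKey("user.user_id"))
    expires_at: Mapped[datetime]
    is_valid: Mapped[bool] = mapped_column(default=True)

    user: Mapped[User] = relationship(User)
```

The updated factories in `api/tests/src/db/models/factories.py` would presumably build these models the same way the existing factories do; their exact fields are not shown in this conversation.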

@babebe changed the title from "add user tables" to "[ISSUE 2716] Create user tables for basic login.gov" on Nov 7, 2024
chouinar and others added 5 commits November 7, 2024 16:03
…ults (#2730)

## Summary
Fixes #2729

### Time to review: __3 mins__

## Changes proposed
Set `track_total_hits` to True when calling OpenSearch

## Context for reviewers
While the docs note that this setting can have a performance cost because all matching records must be counted, we already request counts for various facets anyway, so I imagine this won't matter at all.

## Additional information
https://opensearch.org/docs/latest/api-reference/search/

I loaded ~16k records into my local search index. Querying it with no
filters returns this pagination info now:
```json
 {
    "order_by": "opportunity_id",
    "page_offset": 1,
    "page_size": 25,
    "sort_direction": "ascending",
    "total_pages": 676,
    "total_records": 16884
  }
```
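For reference, a minimal sketch of where this flag would be passed when querying OpenSearch with the opensearch-py client; the host, index name, and query body below are placeholders for illustration, not code from this PR:

```python
# Illustrative only: host, index name, and query are placeholders.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

response = client.search(
    index="opportunity-index",
    body={
        "query": {"match_all": {}},
        "size": 25,
        # Without this, the hit count is capped at 10,000, so pagination
        # totals (total_pages / total_records) would be wrong for large indexes.
        "track_total_hits": True,
    },
)

total_records = response["hits"]["total"]["value"]
```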
## Context

This is currently failing a lot of CI builds
## Summary
Fixes #2665 

### Time to review: __1 min__

## Changes proposed
> What was added, updated, or removed in this PR.

Added `gh-transform-and-load` command to existing `make gh-data-export`
command. I'm not sure if this is sufficient or correct, but I'm taking a
guess based on what I see in
#2546 and
#2506.

## Context for reviewers
> Testing instructions, background context, more in-depth details of the
implementation, and anything else you'd like to call out or ask
reviewers. Explain how the changes were verified.

In the analytics work stream, we have a new CLI command `make
gh-transform-and-load` for transforming and loading (some) GitHub data.
Per issue #2665, that command should be run daily, after the existing
`gh-data-export` command which exports data from Github.

I see that `scheduled_jobs.tf` seems to be the mechanism by which `make
gh-data-export` runs daily. In this PR I'm taking an educated guess and
attempting to add `gh-transform-and-load` to the existing job, and
requesting feedback from @coilysiren as to whether this is the correct
approach.

## Additional information
> Screenshots, GIF demos, code examples or output to help show the
changes working as expected.

Co-authored-by: kai [they] <[email protected]>
## Summary
Fixes #2665 

### Time to review: __1 min__

## Changes proposed
> What was added, updated, or removed in this PR.
Added scheduled job to run `make init-db` 

## Context for reviewers
> Testing instructions, background context, more in-depth details of the
implementation, and anything else you'd like to call out or ask
reviewers. Explain how the changes were verified.

The GitHub data export, transform, and load job (see
#2759) depends on a
certain schema existing in Postgres. This PR creates a job to ensure the
schema exists.

## Additional information
> Screenshots, GIF demos, code examples or output to help show the
changes working as expected.
### Time to review: __1 min__

## Context for reviewers

Platform's assertion is this: whenever a deploy fails for any reason, the deploy is canceled, which leaves the other 3 jobs locked. Those 3 jobs remain locked indefinitely. On the next deploy, those 3 previously locked jobs fail because they are still locked, which in turn causes the first job to be canceled, so all 4 jobs end up locked. It's an avalanche effect: once 1 deploy fails, every deploy fails from that point onward.
@babebe babebe requested a review from chouinar November 8, 2024 15:56
@babebe babebe marked this pull request as draft November 8, 2024 17:03
@babebe babebe requested a review from chouinar November 8, 2024 17:05
@babebe babebe marked this pull request as ready for review November 12, 2024 18:37
@coilysiren (Collaborator) left a comment

deferring my review

@chouinar (Collaborator) left a comment

Just one minor adjustment on the factory - otherwise looks good

@babebe babebe merged commit ad988c4 into main Nov 12, 2024
2 checks passed
@babebe babebe deleted the 2716/basic-user-tables branch November 12, 2024 19:26