From 2c9981708abeb977520cef628c38c339973e4bbf Mon Sep 17 00:00:00 2001
From: Bărbuț-Dică Sami <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:11:36 +0300
Subject: [PATCH 1/8] DevOps improvements (#24)

* perf(quetzalcoatl): extract cors origins in envvar
* refactor(anubis): fix api endpoint misspelling
* feat(cors): enable cors policy for anubis, enki and quetzalcoatl
* perf(quetzalcoatl): simplify refresh token logic

  - now only a valid refresh token is required in order to issue new access tokens, as keeping the stale access tokens in sync with the refresh token using cookies is too intricate

* perf(odin): add lua filter for appending access/refresh tokens from/to requests/responses
* perf(odin): increase expiry time for newly created access tokens

  - the expiry time is currently hardcoded in the envoy.yaml file, as I haven't figured out how to use env vars here yet

* perf(register): set auth tokens as samesite lax
* perf(docker): do not restart containers if stopped
* docs: add fastendpoints logo to diagram
* perf(quetzalcoatl): update user dtos to use ProfilePictureId instead of ProfilePictureUrl
* refactor(quetzalcoatl): set cookies as secure on prod
* perf(prod): update SameSite attribute for cookies to None
* fix(quetzalcoatl): check profile picture not to be null before adding its id to dtos
* feat(users): add support for filtering-sorting-pagination for users endpoint
* perf(users): define all fsp params as optional
* feat: Add logging and error handling to refresh token endpoint

  Added logging statements to display the request path and HTTP method. Also added error handling for cases where cookies are missing or invalid, with appropriate response headers. The response headers now include `access-control-allow-credentials` and `access-control-allow-origin` to enable CORS support.

* feat(images): allow anonymous users to access the image endpoint
* perf(odin): add https support
* feat(compose): add service for midgard-web
* ops: add pantheonix profile to services
* feat(roles): add endpoints for roles management
* fix(odin): remove certs dependency as they are already set in lb
* fix(odin): remove https redirection
* fix(remove role): add role information to user response
* refactor(validators): update validation rules and add constants for user registration and update validators
* refactor(users): update GetAllUsersEndpoint to filter users by username instead of name
* perf(enki): add authorization to GetListAsync and GetAsync methods
* fix(quetzalcoatl): get total count of items after filtering for correct user pagination on GetAll endpoint
* fix(enki): count items after filtering for correct pagination of problems (both published and unpublished)
* feat(enki): add delete problem endpoint (propagate deletion event against hermes too)
* style(enki): format code
* fix(enki): allow proposer to keep the same name for an existing problem on problem update endpoint
* feat(anubis): add endpoint for retrieving the highest score submissions per user (and problem if specified)
* feat(enki+anubis): add pubsub support for eval metadata retrieval to improve performance

  Add RabbitMQ pub-sub using Dapr for publishing eval metadata related events when problems/tests are upserted/deleted, in order to mitigate the overhead from Anubis submission evaluation (sketch in the notes below)

* feat(docker-compose): add volumes for dapr redis and rabbitmq logs
* fix(anubis): update tests PK as the composition between id and problem_id
* feat(anubis): add create/get test case dapr client methods
* perf(anubis): improve http errors format using json
* perf(anubis): add problem name to get all submissions endpoint response
* feat(anubis-judge0): add nginx lb between anubis and judge0 replicas
* fix(anubis): configure CORS policy for Rocket
* perf(anubis): add problem name to get submission by id endpoint response
* ops(asgard): configure memory limits/reservations for docker containers
* perf(anubis): add is_published field to submissions dtos
* ops(compose): update resource limits and reservations
* fix(submission source code): show submission source code iff problem has been solved previously by user
* fix(anubis): remove unused import
* ops: update docker-compose.prod.yaml
* ops(anubis): scale up evaluator with more judge0 replicas
* ops(anubis): add rabbitmq instance to prod
* fix(anubis): add cors preflight catcher
* fix(ci/cd): add support for auto-tagging both for git and docker
* fix(ci/cd): update digitalocean production environment only when pushing to master
* fix(ci/cd): disable autogenerated tags for now
* fix(ci/cd): remove github action for autogenerated tags
* fix(ci/cd): disable triggers on pull-request
* ops(anubis): use cargo-chef for improving docker image building via deps caching (#18)

  Co-authored-by: WarriorsSami

* Polish CI pipelines for building/testing and testing environments (#19)

  * ops(ci): add ci pipelines for building and testing quetzalcoatl and enki
  * fix(ci): remove caching layer from quetzalcoatl and enki pipelines
  * fix(ci): cd into quetzalcoatl and enki microservices projects before executing pipelines
  * fix(ci): pwd for quetzalcoatl in pipeline
  * fix(ci-quetzalcoatl): specify path to project when building and testing
  * fix(ci-quetzalcoatl): restore tests project too
  * misc(quetzalcoatl): update .net sdk to 8 (and packages too)
  * fix(quetzalcoatl): configure testing environment
  * fix+test(quetzalcoatl): remove obsolete tests and add new type formatters for response parsing
  * fix(quetzalcoatl): configure testing environment for ci pipeline
  * fix(quetzalcoatl): use .net 8 sdk in ci pipeline
  * fix(quetzalcoatl): downgrade back to .net 7 as authorization failed to work in .net 8
  * test(quetzalcoatl): remove seed data for get all users test
  * style: format code
  * fix(quetzalcoatl): enable MARS for mssql dsn in integration tests
  * style: remove debugging prints
  * fix+ops(enki): prepare test environment for ci pipeline
  * ops(hermes): setup testing environment for ci pipeline
  * fix(hermes): remove format and analyze steps from ci pipeline
  * fix(hermes): cd into microservice project before running tests
  * fix(hermes): create log file for tests
  * ops: add pull-request triggers for ci pipelines and prepend ci logic to already existing cd pipelines
  * ops(anubis): setup ci pipeline for testing
  * refactor+ops: specify project dir as default per pipeline
  * fix(anubis): place cache action after rust setup in ci pipeline
  * fix(quetzalcoatl): specify path to test project .csproj in the testing action of the ci pipeline
  * test+ops(anubis): setup local environment for integration testing using docker compose
  * fix(anubis): use correct path to testing environment compose in ci pipeline
  * test(anubis): setup seeding data for integration tests
  * test(anubis): setup cache stub as go CRUD API wrapper over Redis for integration testing (sketch in the notes below)
  * test(anubis): add tests for get submission(s) endpoints and setup nextest
  * ops(anubis): specify nextest version to be compatible with rustc version in ci pipeline
  * ops(anubis): specify nextest version to be compatible with rustc version in ci pipeline
  * test(anubis): add tests for get the highest score submissions and create submission endpoints
  * ops: disable digitalocean deployment pipeline

  ---------

  Co-authored-by: WarriorsSami

* fix(anubis cd): specify version for cargo nextest
* fix(cd pipelines): make deploy job depend upon the build one
* fix(anubis cd): add docker compose step for setting up testing environment
* Seeding script for Asgard local environment (#20)

  * refactor(docker-compose): remove deprecated version key and expose new port for debugging dapr redis
  * feat(seeder): add support for deserializing fixtures yaml config file into golang struct (sketch in the notes below)
  * feat(seeder): implement login method in http client and return jwt access token iff logged in successfully (sketch in the notes below)
  * feat(seeder): implement create problem method
  * feat(seeder): implement create tests method
  * feat(seeder): implement publish problem method
  * feat(seeder): implement submission sending

    perf: use goroutines to leverage parallel processing of http requests
    perf: use errgroup to report errors encountered inside goroutines (sketch in the notes below)

  * perf(seeder): execute seeding worker dynamically using docker compose
  * fix(seeder): add compose.override file for seeder

* Add PR Verify Pipelines (#22)

  * refactor(ci-cd): define reusable workflows for pushing images to ghcr and docker hub
  * refactor(ci-cd): reconfigure ci pipelines as pr verify ones
  * refactor(ci-cd): integrate reusable workflows for image publishing in ci pipelines
  * fix(quetzalcoatl): pin explicit mssql tag for testcontainers in integration tests
  * fix: disable testing step in quetzalcoatl pipelines due to misbehaving mssql containers
  * feat(ci-cd): add release please pipeline and config

* Devops/ci cd enhancement (#23)

  * refactor(ci-cd): define reusable workflows for pushing images to ghcr and docker hub
  * refactor(ci-cd): reconfigure ci pipelines as pr verify ones
  * refactor(ci-cd): integrate reusable workflows for image publishing in ci pipelines
  * fix(quetzalcoatl): pin explicit mssql tag for testcontainers in integration tests
  * fix: disable testing step in quetzalcoatl pipelines due to misbehaving mssql containers
  * feat(ci-cd): add release please pipeline and config
  * refactor(release-please): change tag separator to slash
  * refactor(release-please): update target branch

---------

Co-authored-by: WarriorsSami
---
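The enki+anubis pubsub commit above routes eval-metadata events through Dapr's pub/sub building block backed by RabbitMQ. A minimal sketch of the publishing side against Dapr's documented HTTP endpoint (`POST /v1.0/publish/<pubsub-name>/<topic>`); the component name `pubsub`, the topic name, and the payload shape are illustrative assumptions, not the names the Enki/Anubis services actually use:

```go
// Package pubsub sketches publishing an eval-metadata event via the Dapr
// sidecar's HTTP API; Dapr then forwards it to the RabbitMQ component.
package pubsub

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// evalMetadataUpserted is an assumed payload, not the real event contract.
type evalMetadataUpserted struct {
	ProblemID string `json:"problemId"`
	TimeMs    int    `json:"timeMs"`
	MemoryKB  int    `json:"memoryKb"`
}

// PublishEvalMetadata posts the event to the local Dapr sidecar.
func PublishEvalMetadata(daprPort int, ev evalMetadataUpserted) error {
	body, err := json.Marshal(ev)
	if err != nil {
		return err
	}
	// Dapr HTTP publish endpoint: /v1.0/publish/<pubsub-name>/<topic>
	url := fmt.Sprintf("http://localhost:%d/v1.0/publish/pubsub/eval-metadata-upserted", daprPort)
	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 300 {
		return fmt.Errorf("publish failed: %s", resp.Status)
	}
	return nil // Dapr replies 204 No Content on success
}
```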
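The cache stub from the anubis test setup is a small Go CRUD API over Redis that stands in for the state store during integration tests. A sketch of the idea using github.com/redis/go-redis/v9; the routes and payload shape here are assumptions, not the actual contract of tests-setup/cache-stub/main.go:

```go
// A tiny HTTP CRUD wrapper over Redis, as used to stub out the state store
// in the anubis integration-test environment. Routes are illustrative.
package main

import (
	"context"
	"io"
	"log"
	"net/http"
	"strings"

	"github.com/redis/go-redis/v9"
)

func main() {
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})
	ctx := context.Background()

	// Liveness probe for docker-compose health checks.
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})

	// GET /items/<key> reads a value; POST/PUT /items/<key> writes one.
	http.HandleFunc("/items/", func(w http.ResponseWriter, r *http.Request) {
		key := strings.TrimPrefix(r.URL.Path, "/items/")
		switch r.Method {
		case http.MethodGet:
			val, err := rdb.Get(ctx, key).Result()
			if err == redis.Nil {
				http.NotFound(w, r)
				return
			} else if err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			io.WriteString(w, val)
		case http.MethodPost, http.MethodPut:
			body, _ := io.ReadAll(r.Body)
			if err := rdb.Set(ctx, key, body, 0).Err(); err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			w.WriteHeader(http.StatusNoContent)
		default:
			http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
		}
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```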
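The seeder's fixtures support boils down to reading a YAML config and unmarshalling it into Go structs. A minimal sketch of that pattern with gopkg.in/yaml.v3; the struct fields and keys below are illustrative assumptions, not the actual schema defined in seeder/fixtures.go and seeder/fixtures.yaml:

```go
// Deserialize a fixtures YAML config file into Go structs, as the
// feat(seeder) commit describes. Field names are assumed for illustration.
package main

import (
	"fmt"
	"log"
	"os"

	"gopkg.in/yaml.v3"
)

type Problem struct {
	Name    string   `yaml:"name"`
	Tests   []string `yaml:"tests"`
	Publish bool     `yaml:"publish"`
}

type Fixtures struct {
	BaseURL  string    `yaml:"base_url"`
	Problems []Problem `yaml:"problems"`
}

func main() {
	raw, err := os.ReadFile("fixtures.yaml")
	if err != nil {
		log.Fatalf("read fixtures: %v", err)
	}
	var f Fixtures
	if err := yaml.Unmarshal(raw, &f); err != nil {
		log.Fatalf("parse fixtures: %v", err)
	}
	fmt.Printf("loaded %d problems\n", len(f.Problems))
}
```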
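The seeder's login method returns a JWT access token iff the credentials are accepted; later seeding calls reuse it as a bearer token. A hedged sketch of that flow; the endpoint path and JSON field names are assumptions, not the real Quetzalcoatl contract implemented in seeder/pantheonix_client.go:

```go
// Package seeder sketches the login flow: POST credentials, and only on a
// 200 response decode and return the JWT access token.
package seeder

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type loginRequest struct {
	Username string `json:"username"`
	Password string `json:"password"`
}

type loginResponse struct {
	AccessToken string `json:"accessToken"`
}

// Login returns a JWT access token iff the credentials are accepted.
func Login(baseURL, username, password string) (string, error) {
	payload, err := json.Marshal(loginRequest{Username: username, Password: password})
	if err != nil {
		return "", err
	}
	// The path below is an assumed example, not the gateway's actual route.
	resp, err := http.Post(baseURL+"/api/identity/login", "application/json", bytes.NewReader(payload))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return "", fmt.Errorf("login rejected: %s", resp.Status)
	}
	var out loginResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.AccessToken, nil
}
```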
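The submission-sending commit fans HTTP requests out across goroutines and uses errgroup so an error raised inside any goroutine is reported back to the caller. A sketch of that pattern with golang.org/x/sync/errgroup; sendSubmission and the URL list are illustrative stand-ins for the real client code:

```go
// Package seeder sketches parallel submission sending with goroutines and
// errgroup, as described in the feat(seeder) submission-sending commit.
package seeder

import (
	"context"
	"fmt"
	"net/http"

	"golang.org/x/sync/errgroup"
)

func sendSubmission(ctx context.Context, client *http.Client, url string) error {
	req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, nil)
	if err != nil {
		return err
	}
	resp, err := client.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("submission failed: %s", resp.Status)
	}
	return nil
}

// SendAll fires one goroutine per URL; the derived ctx is cancelled as soon
// as any request fails, so in-flight requests stop early.
func SendAll(ctx context.Context, urls []string) error {
	g, ctx := errgroup.WithContext(ctx)
	client := &http.Client{}
	for _, u := range urls {
		u := u // capture loop variable (pre-Go 1.22 semantics)
		g.Go(func() error {
			return sendSubmission(ctx, client, u)
		})
	}
	// Wait blocks until all goroutines finish and returns the first error.
	return g.Wait()
}
```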
 .github/workflows/anubis-eval-ci.yaml | 62 ++++
 .github/workflows/anubis-eval-pr-verify.yaml | 44 +++
 .github/workflows/anubis-eval.yaml | 55 ---
 .github/workflows/asgard-deploy.yaml | 41 ---
 .github/workflows/dapr-config-ci.yaml | 26 ++
 .github/workflows/dapr-config.yaml | 55 ---
 .github/workflows/enki-problems-ci.yaml | 49 +++
 .../workflows/enki-problems-pr-verify.yaml | 31 ++
 .github/workflows/enki-problems.yaml | 55 ---
 .github/workflows/eval-lb-ci.yaml | 26 ++
 .github/workflows/eval-lb.yaml | 55 ---
 .github/workflows/hermes-tests-ci.yaml | 55 +++
 .github/workflows/hermes-tests-pr-verify.yaml | 39 +++
 .github/workflows/hermes-tests.yaml | 55 ---
 .github/workflows/odin-gateway-ci.yaml | 27 ++
 .github/workflows/odin-gateway.yaml | 55 ---
 .github/workflows/quetzalcoatl-auth-ci.yaml | 49 +++
 .../quetzalcoatl-auth-pr-verify.yaml | 31 ++
 .github/workflows/quetzalcoatl-auth.yaml | 55 ---
 .github/workflows/release-please.yaml | 22 ++
 .../workflows/step-deploy-to-docker-hub.yaml | 47 +++
 .github/workflows/step-deploy-to-ghcr.yaml | 47 +++
 .release-please-manifest.json | 9 +
 anubis-eval/.config/nextest.toml | 14 +
 anubis-eval/.dockerignore | 5 +-
 anubis-eval/.env.judge0.template | 2 +-
 anubis-eval/.env.template | 24 +-
 anubis-eval/Cargo.lock | 315 ++++++++++++------
 anubis-eval/Cargo.toml | 9 +-
 anubis-eval/Dockerfile | 31 +-
 .../src/api/create_submission_endpoint.rs | 217 +++++++++++-
 .../src/api/get_highest_score_submissions.rs | 149 +++++++++
 .../src/api/get_submission_endpoint.rs | 208 ++++++++++++
 .../src/api/get_submissions_endpoint.rs | 113 +++++++
 anubis-eval/src/api/health_check_endpoint.rs | 31 +-
 anubis-eval/src/api/middleware/auth.rs | 34 +-
 anubis-eval/src/application/dapr_client.rs | 4 +-
 anubis-eval/src/config/logger.rs | 123 ++++---
 .../src/contracts/create_submission_dtos.rs | 4 +-
 .../get_highest_score_submissions_dtos.rs | 18 +-
 .../src/contracts/get_submission_dtos.rs | 54 +--
 .../src/contracts/get_submissions_dtos.rs | 12 +-
 .../infrastructure/submission_repository.rs | 74 ++++
 anubis-eval/src/main.rs | 1 +
 anubis-eval/src/tests/mod.rs | 181 ++++++++++
 anubis-eval/src/tests/problem.rs | 69 ++++
 anubis-eval/src/tests/submission.rs | 93 ++++++
 anubis-eval/src/tests/user.rs | 55 +++
 .../tests-setup/cache-stub/.dockerignore | 0
 anubis-eval/tests-setup/cache-stub/.gitgnore | 25 ++
 anubis-eval/tests-setup/cache-stub/Dockerfile | 26 ++
 anubis-eval/tests-setup/cache-stub/go.mod | 37 ++
 anubis-eval/tests-setup/cache-stub/go.sum | 95 ++++++
 anubis-eval/tests-setup/cache-stub/main.go | 83 +++++
 .../cache-stub/requests/get_item.http | 1 +
 .../cache-stub/requests/healthy.http | 1 +
 .../cache-stub/requests/set_item.http | 13 +
 anubis-eval/tests-setup/docker-compose.yaml | 69 ++++
 .../tests-setup/fixtures/problems.yaml | 17 +
 .../tests-setup/fixtures/submissions.yaml | 59 ++++
 .../tests-setup/fixtures/test_cases.yaml | 103 ++++++
 anubis-eval/tests-setup/fixtures/tests.yaml | 20 ++
 anubis-eval/tests-setup/fixtures/users.yaml | 17 +
 docker-compose.override.yaml | 12 +
 docker-compose.yaml | 4 +-
 enki-problems/EnkiProblems.sln.DotSettings | 23 --
 .../EnkiProblemsDtoExtensions.cs | 3 +-
 .../ProblemEvalMetadataUpsertedEvent.cs | 2 +-
 .../Problems/Events/TestDeletedEvent.cs | 2 +-
 .../Problems/Events/TestUpsertedEvent.cs | 2 +-
 .../Problems/ProblemAppService.cs | 69 ++--
 .../Problems/Tests/HermesTestsGrpcService.cs | 2 +-
 .../EnkiProblemsDomainSharedModule.cs | 3 +-
 .../EnkiProblemsGlobalFeatureConfigurator.cs | 3 +-
 .../Data/EnkiProblemsDbMigrationService.cs | 3 +-
 .../OpenIddictDataSeedContributor.cs | 115 ++++---
 .../Problems/ProblemManager.cs | 4 +-
 .../EnkiProblemsHttpApiHostModule.cs | 170 +++++-----
 .../src/EnkiProblems.HttpApi.Host/Program.cs | 21 +-
 .../ProblemSubscriberController.cs | 34 +-
 .../MongoDb/EnkiProblemsMongoDbModule.cs | 10 +-
 .../EnkiProblemsApplicationTestBase.cs | 3 +-
 .../Problems/ProblemAppServiceTests.cs | 84 ++---
 .../EnkiProblemsDomainTestBase.cs | 3 +-
 .../Problems/ProblemManagerTests.cs | 2 +-
 .../EnkiProblems.MongoDB.Tests.csproj | 3 +-
 ...nkiProblemsMongoDbCollectionFixtureBase.cs | 3 +-
 .../MongoDb/EnkiProblemsMongoDbFixture.cs | 26 +-
 .../MongoDb/EnkiProblemsMongoDbTestBase.cs | 3 +-
 .../MongoDb/EnkiProblemsMongoDbTestModule.cs | 7 +-
 hermes-tests/.dockerignore | 1 +
 hermes-tests/.gitignore | 8 +-
 hermes-tests/bin/client.dart | 2 +-
 .../temp/test/archived/sum/1-invalid.tar.gz | Bin 0 -> 185 bytes
 .../temp/test/archived/sum/1-oversize.zip | Bin 0 -> 470 bytes
 .../temp/test/archived/sum/1-valid.zip | Bin 0 -> 320 bytes
 hermes-tests/temp/test/archived/sum/1.zip | Bin 0 -> 320 bytes
 hermes-tests/temp/test/archived/sum/4.zip | Bin 0 -> 185 bytes
 .../temp/test/unarchived/sum/2/input.txt | 1 +
 .../temp/test/unarchived/sum/2/output.txt | 1 +
 .../temp/test/unarchived/sum/6/input.txt | 1 +
 .../hermes_grpc_server_integration_test.dart | 20 +-
 .../download_test_use_case_unit_test.dart | 10 +-
 .../encode_test_use_case_unit_test.dart | 12 +-
 .../fragment_test_use_case_unit_test.dart | 10 +-
 .../decode_test_use_case_unit_test.dart | 10 +-
 .../defragment_test_use_case_unit_test.dart | 18 +-
 .../upload_test_use_case_unit_test.dart | 10 +-
 quetzalcoatl-auth/.dockerignore | 1 +
 quetzalcoatl-auth/.env.template | 3 +-
 .../Api/Features/Auth/Login/Endpoint.cs | 54 ++-
 .../Features/Auth/RefreshToken/Endpoint.cs | 14 +-
 .../Api/Features/Auth/RefreshToken/Mappers.cs | 2 +-
 .../Api/Features/Auth/Register/Endpoint.cs | 54 ++-
 .../Api/Features/Auth/Register/Validators.cs | 23 +-
 .../Core/ApplicationUserExtensions.cs | 6 +-
 .../Api/Features/Core/JwtExtensions.cs | 8 +-
 .../Api/Features/Core/LinqExtensions.cs | 2 +-
 .../Api/Features/Users/Delete/Endpoint.cs | 3 +-
 .../Api/Features/Users/Get/Mappers.cs | 6 +-
 .../Api/Features/Users/GetAll/Endpoint.cs | 28 +-
 .../Api/Features/Users/GetAll/Mappers.cs | 6 +-
 .../Api/Features/Users/Roles/Add/Endpoint.cs | 24 +-
 .../Api/Features/Users/Roles/Add/Models.cs | 2 +-
 .../Api/Features/Users/Roles/Add/Summary.cs | 8 +-
 .../Features/Users/Roles/Remove/Endpoint.cs | 24 +-
 .../Api/Features/Users/Roles/Remove/Models.cs | 2 +-
 .../Features/Users/Roles/Remove/Summary.cs | 8 +-
 .../Api/Features/Users/Update/Endpoint.cs | 7 +-
 .../Api/Features/Users/Update/Mappers.cs | 6 +-
 .../Api/Features/Users/Update/Validators.cs | 15 +-
 quetzalcoatl-auth/Api/Usings.cs | 2 +-
 .../Features/Users/CreateUser/Handler.cs | 3 +-
 .../Bootstrapper/Bootstrapper.csproj | 1 +
 .../Extensions/ServiceCollectionExtensions.cs | 4 +-
 quetzalcoatl-auth/Bootstrapper/Program.cs | 80 ++---
 quetzalcoatl-auth/Bootstrapper/Usings.cs | 3 +
 quetzalcoatl-auth/Dockerfile | 10 -
 .../Domain/Consts/ApplicationUserConsts.cs | 9 +-
 .../Domain/Consts/SystemConsts.cs | 6 +
 .../Infrastructure/ApplicationDbContext.cs | 3 +-
 .../Infrastructure/Infrastructure.csproj | 2 +-
 ...230315154826_AddGuidAsPKForIdentityUser.cs | 232 ++++++-------
 ...319104138_AddProfileImageToIdentityUser.cs | 13 +-
 .../20230509173144_AddRefreshTokenEntity.cs | 15 +-
 ...20230525105824_UpdateRefreshTokenEntity.cs | 19 +-
 ..._RemoveRedundantFieldsFromRefreshTokens.cs | 28 +-
 .../Triggers/DeleteStaleRefreshTokens.cs | 4 +-
 .../Features/Auth/RegisterEndpointTests.cs | 50 ++-
 .../Features/Images/GetImageEndpointTests.cs | 39 ---
 .../Api/Features/Users/DeleteEndpointTests.cs | 13 +-
 .../Api/Features/Users/GetAllEndpointTests.cs | 3 -
 .../Api/Features/Users/UpdateEndpointTests.cs | 70 ++--
 .../Tests.Integration/Core/ApiWebFactory.cs | 6 +-
 quetzalcoatl-auth/global.json | 2 +-
 release-please-config.json | 37 ++
 seeder/.dockerignore | 34 ++
 seeder/.gitignore | 1 +
 seeder/Dockerfile | 66 ++++
 seeder/fixtures.go | 71 ++++
 seeder/fixtures.yaml | 30 ++
 seeder/go.mod | 8 +
 seeder/go.sum | 6 +
 seeder/main.go | 26 ++
 seeder/pantheonix_client.go | 274 +++++++++++++++
 seeder/seeder.go | 86 +++++
 166 files changed, 4193 insertions(+), 1485 deletions(-)
 create mode 100644 .github/workflows/anubis-eval-ci.yaml
 create mode 100644 .github/workflows/anubis-eval-pr-verify.yaml
 delete mode 100644 .github/workflows/anubis-eval.yaml
 delete mode 100644 .github/workflows/asgard-deploy.yaml
 create mode 100644 .github/workflows/dapr-config-ci.yaml
 delete mode 100644 .github/workflows/dapr-config.yaml
 create mode 100644 .github/workflows/enki-problems-ci.yaml
 create mode 100644 .github/workflows/enki-problems-pr-verify.yaml
 delete mode 100644 .github/workflows/enki-problems.yaml
 create mode 100644 .github/workflows/eval-lb-ci.yaml
 delete mode 100644 .github/workflows/eval-lb.yaml
 create mode 100644 .github/workflows/hermes-tests-ci.yaml
 create mode 100644 .github/workflows/hermes-tests-pr-verify.yaml
 delete mode 100644 .github/workflows/hermes-tests.yaml
 create mode 100644 .github/workflows/odin-gateway-ci.yaml
 delete mode 100644 .github/workflows/odin-gateway.yaml
 create mode 100644 .github/workflows/quetzalcoatl-auth-ci.yaml
 create mode 100644 .github/workflows/quetzalcoatl-auth-pr-verify.yaml
 delete mode 100644 .github/workflows/quetzalcoatl-auth.yaml
 create mode 100644 .github/workflows/release-please.yaml
 create mode 100644 .github/workflows/step-deploy-to-docker-hub.yaml
 create mode 100644 .github/workflows/step-deploy-to-ghcr.yaml
 create mode 100644 .release-please-manifest.json
 create mode 100644 anubis-eval/.config/nextest.toml
 create mode 100644 anubis-eval/src/tests/mod.rs
 create mode 100644 anubis-eval/src/tests/problem.rs
 create mode 100644 anubis-eval/src/tests/submission.rs
 create mode 100644 anubis-eval/src/tests/user.rs
 create mode 100644 anubis-eval/tests-setup/cache-stub/.dockerignore
 create mode 100644 anubis-eval/tests-setup/cache-stub/.gitgnore
 create mode 100644 anubis-eval/tests-setup/cache-stub/Dockerfile
 create mode 100644 anubis-eval/tests-setup/cache-stub/go.mod
 create mode 100644 anubis-eval/tests-setup/cache-stub/go.sum
 create mode 100644 anubis-eval/tests-setup/cache-stub/main.go
 create mode 100644 anubis-eval/tests-setup/cache-stub/requests/get_item.http
 create mode 100644 anubis-eval/tests-setup/cache-stub/requests/healthy.http
 create mode 100644 anubis-eval/tests-setup/cache-stub/requests/set_item.http
 create mode 100644 anubis-eval/tests-setup/docker-compose.yaml
 create mode 100644 anubis-eval/tests-setup/fixtures/problems.yaml
 create mode 100644 anubis-eval/tests-setup/fixtures/submissions.yaml
 create mode 100644 anubis-eval/tests-setup/fixtures/test_cases.yaml
 create mode 100644 anubis-eval/tests-setup/fixtures/tests.yaml
 create mode 100644 anubis-eval/tests-setup/fixtures/users.yaml
 create mode 100644 docker-compose.override.yaml
 delete mode 100644 enki-problems/EnkiProblems.sln.DotSettings
 create mode 100644 hermes-tests/temp/test/archived/sum/1-invalid.tar.gz
 create mode 100644 hermes-tests/temp/test/archived/sum/1-oversize.zip
 create mode 100644 hermes-tests/temp/test/archived/sum/1-valid.zip
 create mode 100644 hermes-tests/temp/test/archived/sum/1.zip
 create mode 100644 hermes-tests/temp/test/archived/sum/4.zip
 create mode 100644 hermes-tests/temp/test/unarchived/sum/2/input.txt
 create mode 100644 hermes-tests/temp/test/unarchived/sum/2/output.txt
 create mode 100644 hermes-tests/temp/test/unarchived/sum/6/input.txt
 create mode 100644 quetzalcoatl-auth/Domain/Consts/SystemConsts.cs
 create mode 100644 release-please-config.json
 create mode 100644 seeder/.dockerignore
 create mode 100644 seeder/.gitignore
 create mode 100644 seeder/Dockerfile
 create mode 100644 seeder/fixtures.go
 create mode 100644 seeder/fixtures.yaml
 create mode 100644 seeder/go.mod
 create mode 100644 seeder/go.sum
 create mode 100644 seeder/main.go
 create mode 100644 seeder/pantheonix_client.go
 create mode 100644 seeder/seeder.go

diff --git a/.github/workflows/anubis-eval-ci.yaml b/.github/workflows/anubis-eval-ci.yaml
new file mode 100644
index 0000000..47abe93
--- /dev/null
+++ b/.github/workflows/anubis-eval-ci.yaml
@@ -0,0 +1,62 @@
+name: Anubis - Build Docker image and publish to GHCR and Docker Hub
+
+on:
+  push:
+    tags:
+      - "anubis/**"
+
+env:
+  NAMESPACE: pantheonix
+  REPOSITORY: anubis
+  IMAGE_NAME: anubis-eval
+  BUILD_CONTEXT: anubis-eval
+
+jobs:
+  build:
+    name: Build and Test Anubis Eval Microservice
+    runs-on: ubuntu-latest
+    defaults:
+      run:
+        working-directory: anubis-eval
+
+    steps:
+      - uses: actions/checkout@v4
+
+      - name: Run docker-compose
+        uses: hoverkraft-tech/compose-action@v2.0.1
+        with:
+          compose-file: "./anubis-eval/tests-setup/docker-compose.yaml"
+          up-flags: "--build -d"
+
+      - name: Setup rust
+        uses: hecrj/setup-rust-action@v2
+        with:
+          rust-version: '1.72.0'
+
+      - name: Install cargo-nextest
+        uses: baptiste0928/cargo-install@v3
+        with:
+          crate: cargo-nextest
+          version: '0.9.64'
+          locked: true
+
+      - name: Build
+        run: cargo build --release
+
+      - name: Test
+        run: cargo nextest run --all-features --profile ci
+
+  deploy-to-ghcr:
+    needs: build
+    uses: ./.github/workflows/step-deploy-to-ghcr.yaml
+    with:
+      image_name: ${{ env.IMAGE_NAME }}
+      build_context: ${{ env.BUILD_CONTEXT }}
+
+  deploy-to-docker-hub:
+    needs: build
+    uses: ./.github/workflows/step-deploy-to-docker-hub.yaml
+    with:
+      namespace: ${{ env.NAMESPACE }}
+      repository: ${{ env.REPOSITORY }}
+      build_context: ${{ env.BUILD_CONTEXT }}
diff --git a/.github/workflows/anubis-eval-pr-verify.yaml b/.github/workflows/anubis-eval-pr-verify.yaml new file mode 100644 index 0000000..5b6bc6e --- /dev/null +++ b/.github/workflows/anubis-eval-pr-verify.yaml @@ -0,0 +1,44 @@ +name: Anubis - PR Verify + +on: + pull_request: + branches: + - develop + paths: + - "anubis-eval/**" + - ".github/workflows/anubis-eval-pr-verify.yaml" + +jobs: + build: + name: Build and Test Anubis Eval Microservice + runs-on: ubuntu-latest + defaults: + run: +
working-directory: anubis-eval + + steps: + - uses: actions/checkout@v4 + + - name: Run docker-compose + uses: hoverkraft-tech/compose-action@v2.0.1 + with: + compose-file: "./anubis-eval/tests-setup/docker-compose.yaml" + up-flags: "--build -d" + + - name: Setup rust + uses: hecrj/setup-rust-action@v2 + with: + rust-version: '1.72.0' + + - name: Install cargo-nextest + uses: baptiste0928/cargo-install@v3 + with: + crate: cargo-nextest + version: '0.9.64' + locked: true + + - name: Build + run: cargo build --release + + - name: Test + run: cargo nextest run --all-features --profile ci diff --git a/.github/workflows/anubis-eval.yaml b/.github/workflows/anubis-eval.yaml deleted file mode 100644 index 6b568ca..0000000 --- a/.github/workflows/anubis-eval.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Anubis - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "anubis-eval/**" - - ".github/workflows/anubis-eval.yaml" - - pull_request: - branches: - - develop - - paths: - - "anubis-eval/**" - - ".github/workflows/anubis-eval.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: anubis-eval - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: anubis-eval - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/asgard-deploy.yaml b/.github/workflows/asgard-deploy.yaml deleted file mode 100644 index 5e9f3d2..0000000 --- a/.github/workflows/asgard-deploy.yaml +++ /dev/null @@ -1,41 +0,0 @@ -name: Asgard Deployment - Deploy to DigitalOcean Droplet - -on: -# push: -# branches: -# - master -# - develop - - pull_request: - branches: - - master - -env: - REGISTRY: ghcr.io - -jobs: - deploy: - runs-on: ubuntu-latest - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Install sshpass - run: sudo apt-get install sshpass - - - name: Copy docker-compose.yaml to droplet - run: sshpass -v -p ${{ secrets.DROPLET_PASSWORD }} scp -o StrictHostKeyChecking=no docker-compose.prod.yaml root@${{ vars.DROPLET_IP }}:~/docker-compose.yaml - - - name: Deploy - uses: appleboy/ssh-action@master - with: - host: ${{ vars.DROPLET_IP }} - username: root - password: ${{ secrets.DROPLET_PASSWORD }} - script: | - cd ~ - docker-compose --profile asgard down - echo ${{ secrets.TOKEN }} | docker login ghcr.io -u ${{ github.actor }} --password-stdin - docker-compose --profile asgard pull - docker-compose --profile asgard up -d diff --git a/.github/workflows/dapr-config-ci.yaml b/.github/workflows/dapr-config-ci.yaml new file mode 100644 index 0000000..a407702 --- /dev/null +++ b/.github/workflows/dapr-config-ci.yaml @@ -0,0 +1,26 @@ +name: Dapr - Build Docker image and publish to GHCR and Docker Hub + +on: + push: + tags: + - "dapr/**" + +env: + NAMESPACE: pantheonix + REPOSITORY: asgard-dapr + IMAGE_NAME: asgard-dapr-config + 
BUILD_CONTEXT: dapr + +jobs: + deploy-to-ghcr: + uses: ./.github/workflows/step-deploy-to-ghcr.yaml + with: + image_name: ${{ env.IMAGE_NAME }} + build_context: ${{ env.BUILD_CONTEXT }} + + deploy-to-docker-hub: + uses: ./.github/workflows/step-deploy-to-docker-hub.yaml + with: + namespace: ${{ env.NAMESPACE }} + repository: ${{ env.REPOSITORY }} + build_context: ${{ env.BUILD_CONTEXT }} \ No newline at end of file diff --git a/.github/workflows/dapr-config.yaml b/.github/workflows/dapr-config.yaml deleted file mode 100644 index 25caafc..0000000 --- a/.github/workflows/dapr-config.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Dapr - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "dapr/**" - - ".github/workflows/dapr-config.yaml" - - pull_request: - branches: - - develop - - paths: - - "dapr-config/**" - - ".github/workflows/dapr-config.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: asgard-dapr-config - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: dapr - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/enki-problems-ci.yaml b/.github/workflows/enki-problems-ci.yaml new file mode 100644 index 0000000..c6d40fc --- /dev/null +++ b/.github/workflows/enki-problems-ci.yaml @@ -0,0 +1,49 @@ +name: Enki - Build Docker image and publish to GHCR and Docker Hub + +on: + push: + tags: + - "enki/**" + +env: + NAMESPACE: pantheonix + REPOSITORY: enki + IMAGE_NAME: enki-problems + BUILD_CONTEXT: enki-problems + +jobs: + build: + name: Build and Test Enki Problems Microservice + runs-on: ubuntu-latest + defaults: + run: + working-directory: enki-problems + + steps: + - uses: actions/checkout@v4 + - name: Setup dotnet + uses: actions/setup-dotnet@v4 + with: + dotnet-version: '7.0.x' + - name: Build + run: | + dotnet restore "src/EnkiProblems.HttpApi.Host/EnkiProblems.HttpApi.Host.csproj" + dotnet restore "test/EnkiProblems.Application.Tests/EnkiProblems.Application.Tests.csproj" + dotnet build --no-restore + - name: Test + run: dotnet test -e DOTNET_SYSTEM_GLOBALIZATION_INVARIANT=false + + deploy-to-ghcr: + needs: build + uses: ./.github/workflows/step-deploy-to-ghcr.yaml + with: + image_name: ${{ env.IMAGE_NAME }} + build_context: ${{ env.BUILD_CONTEXT }} + + deploy-to-docker-hub: + needs: build + uses: ./.github/workflows/step-deploy-to-docker-hub.yaml + with: + namespace: ${{ env.NAMESPACE }} + repository: ${{ env.REPOSITORY }} + build_context: ${{ env.BUILD_CONTEXT }} diff --git a/.github/workflows/enki-problems-pr-verify.yaml b/.github/workflows/enki-problems-pr-verify.yaml new file mode 100644 index 0000000..fe115ef --- /dev/null +++ b/.github/workflows/enki-problems-pr-verify.yaml @@ -0,0 +1,31 @@ +name: Enki - PR Verify + +on: + pull_request: + branches: + - develop + paths: + - 
"enki-problems/**" + - ".github/workflows/enki-problems-pr-verify.yaml" + +jobs: + build: + name: Build and Test Enki Problems Microservice + runs-on: ubuntu-latest + defaults: + run: + working-directory: enki-problems + + steps: + - uses: actions/checkout@v4 + - name: Setup dotnet + uses: actions/setup-dotnet@v4 + with: + dotnet-version: '7.0.x' + - name: Build + run: | + dotnet restore "src/EnkiProblems.HttpApi.Host/EnkiProblems.HttpApi.Host.csproj" + dotnet restore "test/EnkiProblems.Application.Tests/EnkiProblems.Application.Tests.csproj" + dotnet build --no-restore + - name: Test + run: dotnet test -e DOTNET_SYSTEM_GLOBALIZATION_INVARIANT=false \ No newline at end of file diff --git a/.github/workflows/enki-problems.yaml b/.github/workflows/enki-problems.yaml deleted file mode 100644 index 6863428..0000000 --- a/.github/workflows/enki-problems.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Enki - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "enki-problems/**" - - ".github/workflows/enki-problems.yaml" - - pull_request: - branches: - - develop - - paths: - - "enki-problems/**" - - ".github/workflows/enki-problems.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: enki-problems - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: enki-problems - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/eval-lb-ci.yaml b/.github/workflows/eval-lb-ci.yaml new file mode 100644 index 0000000..82044d3 --- /dev/null +++ b/.github/workflows/eval-lb-ci.yaml @@ -0,0 +1,26 @@ +name: Eval Nginx Load Balancer - Build Docker image and publish to GHCR and Docker Hub + +on: + push: + tags: + - "eval-lb/**" + +env: + NAMESPACE: pantheonix + REPOSITORY: eval-lb + IMAGE_NAME: asgard-eval-lb + BUILD_CONTEXT: anubis-eval/eval-lb + +jobs: + deploy-to-ghcr: + uses: ./.github/workflows/step-deploy-to-ghcr.yaml + with: + image_name: ${{ env.IMAGE_NAME }} + build_context: ${{ env.BUILD_CONTEXT }} + + deploy-to-docker-hub: + uses: ./.github/workflows/step-deploy-to-docker-hub.yaml + with: + namespace: ${{ env.NAMESPACE }} + repository: ${{ env.REPOSITORY }} + build_context: ${{ env.BUILD_CONTEXT }} diff --git a/.github/workflows/eval-lb.yaml b/.github/workflows/eval-lb.yaml deleted file mode 100644 index 8f8c229..0000000 --- a/.github/workflows/eval-lb.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Eval Nginx Load Balancer - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "anubis-eval/eval-lb/**" - - ".github/workflows/eval-lb.yaml" - - pull_request: - branches: - - develop - - paths: - - "anubis-eval/eval-lb/**" - - ".github/workflows/eval-lb.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: asgard-eval-lb - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - 
contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: anubis-eval/eval-lb - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/hermes-tests-ci.yaml b/.github/workflows/hermes-tests-ci.yaml new file mode 100644 index 0000000..c4a59aa --- /dev/null +++ b/.github/workflows/hermes-tests-ci.yaml @@ -0,0 +1,55 @@ +name: Hermes - Build Docker image and publish to GHCR and Docker Hub + +on: + push: + tags: + - "hermes/**" + +env: + NAMESPACE: pantheonix + REPOSITORY: hermes + IMAGE_NAME: hermes-tests + BUILD_CONTEXT: hermes-tests + HERMES_CONFIG: ${{ secrets.HERMES_CONFIG }} + +jobs: + build: + name: Build and Test Hermes Tests Microservice + runs-on: ubuntu-latest + defaults: + run: + working-directory: hermes-tests + + steps: + - uses: actions/checkout@v4 + + - name: Setup dart + uses: dart-lang/setup-dart@v1 + with: + sdk: 2.19.2 + + - name: Create logs/test.log file + run: | + mkdir -p logs + touch logs/test.log + + - name: Get dependencies + run: dart pub get + + - name: Test + run: dart test + + deploy-to-ghcr: + needs: build + uses: ./.github/workflows/step-deploy-to-ghcr.yaml + with: + image_name: ${{ env.IMAGE_NAME }} + build_context: ${{ env.BUILD_CONTEXT }} + + deploy-to-docker-hub: + needs: build + uses: ./.github/workflows/step-deploy-to-docker-hub.yaml + with: + namespace: ${{ env.NAMESPACE }} + repository: ${{ env.REPOSITORY }} + build_context: ${{ env.BUILD_CONTEXT }} diff --git a/.github/workflows/hermes-tests-pr-verify.yaml b/.github/workflows/hermes-tests-pr-verify.yaml new file mode 100644 index 0000000..c1fcefd --- /dev/null +++ b/.github/workflows/hermes-tests-pr-verify.yaml @@ -0,0 +1,39 @@ +name: Hermes - PR Verify + +on: + pull_request: + branches: + - develop + paths: + - "hermes-tests/**" + - ".github/workflows/hermes-tests-pr-verify.yaml" + +env: + HERMES_CONFIG: ${{ secrets.HERMES_CONFIG }} + +jobs: + build: + name: Build and Test Hermes Tests Microservice + runs-on: ubuntu-latest + defaults: + run: + working-directory: hermes-tests + + steps: + - uses: actions/checkout@v4 + + - name: Setup dart + uses: dart-lang/setup-dart@v1 + with: + sdk: 2.19.2 + + - name: Create logs/test.log file + run: | + mkdir -p logs + touch logs/test.log + + - name: Get dependencies + run: dart pub get + + - name: Test + run: dart test \ No newline at end of file diff --git a/.github/workflows/hermes-tests.yaml b/.github/workflows/hermes-tests.yaml deleted file mode 100644 index f9a2a5f..0000000 --- a/.github/workflows/hermes-tests.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Hermes - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "hermes-tests/**" - - ".github/workflows/hermes-tests.yaml" - - pull_request: - branches: - - develop - - paths: - - "hermes-tests/**" - - ".github/workflows/hermes-tests.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: 
hermes-tests - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: hermes-tests - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/odin-gateway-ci.yaml b/.github/workflows/odin-gateway-ci.yaml new file mode 100644 index 0000000..b5e5484 --- /dev/null +++ b/.github/workflows/odin-gateway-ci.yaml @@ -0,0 +1,27 @@ +name: Odin - Build Docker image and publish to GHCR and Docker Hub + +on: + push: + tags: + - "odin/**" + +env: + NAMESPACE: pantheonix + REPOSITORY: odin + IMAGE_NAME: odin-api-gateway + BUILD_CONTEXT: odin-gateway + +jobs: + deploy-to-ghcr: + uses: ./.github/workflows/step-deploy-to-ghcr.yaml + with: + image_name: ${{ env.IMAGE_NAME }} + build_context: ${{ env.BUILD_CONTEXT }} + + deploy-to-docker-hub: + uses: ./.github/workflows/step-deploy-to-docker-hub.yaml + with: + namespace: ${{ env.NAMESPACE }} + repository: ${{ env.REPOSITORY }} + build_context: ${{ env.BUILD_CONTEXT }} + diff --git a/.github/workflows/odin-gateway.yaml b/.github/workflows/odin-gateway.yaml deleted file mode 100644 index fb0bb83..0000000 --- a/.github/workflows/odin-gateway.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Odin - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "odin-gateway/**" - - ".github/workflows/odin-gateway.yaml" - - pull_request: - branches: - - develop - - paths: - - "odin-gateway/**" - - ".github/workflows/odin-gateway.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: odin-api-gateway - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: odin-gateway - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/quetzalcoatl-auth-ci.yaml b/.github/workflows/quetzalcoatl-auth-ci.yaml new file mode 100644 index 0000000..aba5d5d --- /dev/null +++ b/.github/workflows/quetzalcoatl-auth-ci.yaml @@ -0,0 +1,49 @@ +name: Quetzalcoatl - Build Docker image and publish to GHCR and Docker Hub + +on: + push: + tags: + - "quetzalcoatl/**" + +env: + NAMESPACE: pantheonix + REPOSITORY: quetzalcoatl + IMAGE_NAME: quetzalcoatl-auth + BUILD_CONTEXT: quetzalcoatl-auth + 
+jobs: + build: + name: Build and Test Quetzalcoatl Auth Microservice + runs-on: ubuntu-latest + defaults: + run: + working-directory: quetzalcoatl-auth + + steps: + - uses: actions/checkout@v4 + - name: Setup dotnet + uses: actions/setup-dotnet@v4 + with: + dotnet-version: '7.0.x' + - name: Build + run: | + dotnet restore "Bootstrapper/Bootstrapper.csproj" + dotnet restore "Tests.Integration/Tests.Integration.csproj" + dotnet build --no-restore +# - name: Test +# run: dotnet test -e ASPNETCORE_ENVIRONMENT=Testing "Tests.Integration/Tests.Integration.csproj" + + deploy-to-ghcr: + needs: build + uses: ./.github/workflows/step-deploy-to-ghcr.yaml + with: + image_name: ${{ env.IMAGE_NAME }} + build_context: ${{ env.BUILD_CONTEXT }} + + deploy-to-docker-hub: + needs: build + uses: ./.github/workflows/step-deploy-to-docker-hub.yaml + with: + namespace: ${{ env.NAMESPACE }} + repository: ${{ env.REPOSITORY }} + build_context: ${{ env.BUILD_CONTEXT }} diff --git a/.github/workflows/quetzalcoatl-auth-pr-verify.yaml b/.github/workflows/quetzalcoatl-auth-pr-verify.yaml new file mode 100644 index 0000000..c0b2131 --- /dev/null +++ b/.github/workflows/quetzalcoatl-auth-pr-verify.yaml @@ -0,0 +1,31 @@ +name: Quetzalcoatl - PR Verify + +on: + pull_request: + branches: + - develop + paths: + - "quetzalcoatl-auth/**" + - ".github/workflows/quetzalcoatl-auth-pr-verify.yaml" + +jobs: + build: + name: Build and Test Quetzalcoatl Auth Microservice + runs-on: ubuntu-latest + defaults: + run: + working-directory: quetzalcoatl-auth + + steps: + - uses: actions/checkout@v4 + - name: Setup dotnet + uses: actions/setup-dotnet@v4 + with: + dotnet-version: '7.0.x' + - name: Build + run: | + dotnet restore "Bootstrapper/Bootstrapper.csproj" + dotnet restore "Tests.Integration/Tests.Integration.csproj" + dotnet build --no-restore +# - name: Test +# run: dotnet test -e ASPNETCORE_ENVIRONMENT=Testing "Tests.Integration/Tests.Integration.csproj" \ No newline at end of file diff --git a/.github/workflows/quetzalcoatl-auth.yaml b/.github/workflows/quetzalcoatl-auth.yaml deleted file mode 100644 index dcb4376..0000000 --- a/.github/workflows/quetzalcoatl-auth.yaml +++ /dev/null @@ -1,55 +0,0 @@ -name: Quetzalcoatl - Build Docker image and publish to GitHub Packages - -on: - push: - branches: - - develop - - paths: - - "quetzalcoatl-auth/**" - - ".github/workflows/quetzalcoatl-auth.yaml" - - pull_request: - branches: - - develop - - paths: - - "quetzalcoatl-auth/**" - - ".github/workflows/quetzalcoatl-auth.yaml" - -env: - REGISTRY: ghcr.io - IMAGE_NAME: quetzalcoatl-auth - -jobs: - build: - runs-on: ubuntu-latest - - permissions: - contents: read - packages: write - - steps: - - name: Checkout code - uses: actions/checkout@v4 - - - name: Log in to the Container registry - uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1 - with: - registry: ${{ env.REGISTRY }} - username: ${{ github.actor }} - password: ${{ secrets.TOKEN }} - - - name: Extract metadata (tags, labels) for Docker - id: meta - uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7 - with: - images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/${{ env.IMAGE_NAME }} - - - name: Build and push Docker image - uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4 - with: - context: quetzalcoatl-auth - push: true - tags: ${{ steps.meta.outputs.tags }} - labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/release-please.yaml b/.github/workflows/release-please.yaml new file mode 100644 index 
0000000..2952070 --- /dev/null +++ b/.github/workflows/release-please.yaml @@ -0,0 +1,22 @@ +name: Release Please + +on: + push: + branches: + - develop + +permissions: + contents: read + pull-requests: write + +jobs: + release-please: + runs-on: ubuntu-latest + steps: + - uses: googleapis/release-please-action@v4 + with: + token: ${{ secrets.GITHUB_TOKEN }} + target-branch: 'develop' + config-file: release-please-config.json + manifest-file: .release-please-manifest.json + include-component-in-tag: 'true' diff --git a/.github/workflows/step-deploy-to-docker-hub.yaml b/.github/workflows/step-deploy-to-docker-hub.yaml new file mode 100644 index 0000000..9bfd034 --- /dev/null +++ b/.github/workflows/step-deploy-to-docker-hub.yaml @@ -0,0 +1,47 @@ +name: "Step - Push Docker image to Docker Hub Registry" + +on: + workflow_call: + inputs: + namespace: + type: string + required: true + repository: + type: string + required: true + build_context: + type: string + required: true + +jobs: + deploy: + name: "Deploy to Docker Hub" + runs-on: ubuntu-latest + + permissions: + contents: read + packages: write + + steps: + - name: Checkout code + uses: actions/checkout@v4 + + - name: Log in to Docker Hub + uses: docker/login-action@v3 + with: + username: ${{ secrets.DOCKER_USERNAME }} + password: ${{ secrets.DOCKER_PASSWORD }} + + - name: Extract metadata (tags, labels) for Docker + id: meta + uses: docker/metadata-action@v5 + with: + images: ${{ inputs.namespace }}/${{ inputs.repository }} + + - name: Build and push Docker image + uses: docker/build-push-action@v6 + with: + context: ${{ inputs.build_context }} + push: true + tags: ${{ steps.meta.outputs.tags }} + labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/step-deploy-to-ghcr.yaml b/.github/workflows/step-deploy-to-ghcr.yaml new file mode 100644 index 0000000..e3e69e5 --- /dev/null +++ b/.github/workflows/step-deploy-to-ghcr.yaml @@ -0,0 +1,47 @@ +name: "Step - Push Docker image to GitHub Container Registry (GHCR)" + +on: + workflow_call: + inputs: + image_name: + type: string + required: true + build_context: + type: string + required: true + +jobs: + deploy: + name: "Deploy to GHCR" + runs-on: ubuntu-latest + + permissions: + contents: read + packages: write + + steps: + - name: Checkout code + uses: actions/checkout@v4 + + - name: Log in to the Container registry + uses: docker/login-action@v3 + with: + registry: ghcr.io + username: ${{ github.actor }} + password: ${{ secrets.TOKEN }} + + - name: Extract metadata (tags, labels) for Docker + id: meta + uses: docker/metadata-action@v5 + with: + images: ghcr.io/${{ github.repository_owner }}/${{ inputs.image_name }} + tags: | + type=semver,pattern={{version}} + + - name: Build and push Docker image + uses: docker/build-push-action@v6 + with: + context: ${{ inputs.build_context }} + push: true + tags: ${{ steps.meta.outputs.tags }} + labels: ${{ steps.meta.outputs.labels }} diff --git a/.release-please-manifest.json b/.release-please-manifest.json new file mode 100644 index 0000000..65251c0 --- /dev/null +++ b/.release-please-manifest.json @@ -0,0 +1,9 @@ +{ + "quetzalcoatl": "2.0.0", + "enki": "2.0.0", + "hermes": "2.0.0", + "anubis": "2.0.0", + "odin": "2.0.0", + "dapr": "2.0.0", + "eval-lb": "2.0.0" +} \ No newline at end of file diff --git a/anubis-eval/.config/nextest.toml b/anubis-eval/.config/nextest.toml new file mode 100644 index 0000000..90ace23 --- /dev/null +++ b/anubis-eval/.config/nextest.toml @@ -0,0 +1,14 @@ +[test-groups] +serial-integration = { max-threads = 
1 } + +[[profile.default.overrides]] +filter = 'package(anubis-eval)' +platform = 'cfg(unix)' +test-group = 'serial-integration' + +[profile.ci] +# Print out output for failing tests as soon as they fail, and also at the end +# of the run (for easy scrollability). +failure-output = "immediate-final" +# Do not cancel the test run on the first failure. +fail-fast = false \ No newline at end of file diff --git a/anubis-eval/.dockerignore b/anubis-eval/.dockerignore index 9e6ce3c..ad5e997 100644 --- a/anubis-eval/.dockerignore +++ b/anubis-eval/.dockerignore @@ -5,5 +5,8 @@ README.md .idea/ target/ logs/ +tests/ .env* -.gitignore \ No newline at end of file +.gitignore +eval-lb/ +tests-setup/ \ No newline at end of file diff --git a/anubis-eval/.env.judge0.template b/anubis-eval/.env.judge0.template index 6382583..1808029 100644 --- a/anubis-eval/.env.judge0.template +++ b/anubis-eval/.env.judge0.template @@ -196,7 +196,7 @@ REDIS_PORT= # Specify Redis password. Cannot be blank. # Default: NO DEFAULT! MUST BE SET! -REDIS_PASSWORD=YourPasswordHere1234 +REDIS_PASSWORD=pass ################################################################################ diff --git a/anubis-eval/.env.template b/anubis-eval/.env.template index ab40040..40fee56 100644 --- a/anubis-eval/.env.template +++ b/anubis-eval/.env.template @@ -1,24 +1,24 @@ ROCKET_ADDRESS=0.0.0.0 ROCKET_PORT=5213 -#ROCKET_DATABASES={anubis-submissions={url="postgres://postgres:2002@localhost:5435/anubis-submissions"}} -ROCKET_DATABASES={anubis-submissions={url="postgres://postgres:2002@anubis-psql-db:5432/anubis-submissions"}} +ROCKET_DATABASES={anubis-submissions={url="postgres://postgres:2002@127.0.0.1:5433/anubis-submissions"}} +#ROCKET_DATABASES={anubis-submissions={url="postgres://postgres:2002@anubis-psql-db:5432/anubis-submissions"}} CONFIG_JWT_SECRET_KEY=z7F+ut_aphaxeja0&ba*p9spew!4fe0rAFRO5HestitIKOv5nistlz3b=+edu1aP -CONFIG_DAPR_HTTP_PORT=3503 +CONFIG_DAPR_HTTP_PORT=8080 CONFIG_DAPR_EVAL_METADATA_ENDPOINT=http://dapr-app-id:enki-problems@127.0.0.1:3503/api/enki/problem/{problem_id}/eval-metadata -CONFIG_DAPR_JUDGE_SUBMISSION_BATCH_ENDPOINT=http://judge0-lb:4000/submissions/batch -CONFIG_DAPR_JUDGE_SUBMISSION_ENDPOINT=http://judge0-lb:4000/submissions -CONFIG_DAPR_GET_SUBMISSION_BATCH_ENDPOINT=http://judge0-lb:4000/submissions/batch?tokens={tokens} -CONFIG_DAPR_GET_SUBMISSION_ENDPOINT=http://judge0-lb:4000/submissions/{token} -CONFIG_DAPR_STATE_STORE_POST_ENDPOINT=http://127.0.0.1:3503/v1.0/state/statestore -CONFIG_DAPR_STATE_STORE_GET_ENDPOINT=http://127.0.0.1:3503/v1.0/state/statestore/{key} -CONFIG_EVAL_CRON_SCHEDULE='1/5 * * * * *' +CONFIG_DAPR_JUDGE_SUBMISSION_BATCH_ENDPOINT=http://127.0.0.1:2358/submissions/batch +CONFIG_DAPR_JUDGE_SUBMISSION_ENDPOINT=http://127.0.0.1:2358/submissions +CONFIG_DAPR_GET_SUBMISSION_BATCH_ENDPOINT=http://127.0.0.1:2358/submissions/batch?tokens={tokens} +CONFIG_DAPR_GET_SUBMISSION_ENDPOINT=http://127.0.0.1:2358/submissions/{token} +CONFIG_DAPR_STATE_STORE_POST_ENDPOINT=http://127.0.0.1:8080/v1.0/state/statestore +CONFIG_DAPR_STATE_STORE_GET_ENDPOINT=http://127.0.0.1:8080/v1.0/state/statestore/{key} +CONFIG_EVAL_CRON_SCHEDULE='*/1 * * * * *' CONFIG_DEFAULT_NO_SUBMISSIONS_PER_PAGE=10 CONFIG_EVAL_BATCH_SIZE=4 CONFIG_ALLOWED_ORIGINS="https://localhost:10000;http://localhost:10000;https://pantheonix.live;https://pantheonix-midgard.web.app;https://pantheonix-midgard.firebaseapp.com" -#DATABASE_URL=postgres://postgres:2002@localhost:5435/anubis-submissions 
-DATABASE_URL=postgres://postgres:2002@anubis-psql-db:5432/anubis-submissions +DATABASE_URL=postgres://postgres:2002@127.0.0.1:5433/anubis-submissions +#DATABASE_URL=postgres://postgres:2002@anubis-psql-db:5432/anubis-submissions POSTGRES_USER=postgres POSTGRES_PASSWORD=2002 diff --git a/anubis-eval/Cargo.lock b/anubis-eval/Cargo.lock index 8e8b130..0bf9dd4 100644 --- a/anubis-eval/Cargo.lock +++ b/anubis-eval/Cargo.lock @@ -53,8 +53,11 @@ version = "0.1.0" dependencies = [ "anyhow", "async-scoped", + "async_once", + "cder", "chrono", "cloudevents-sdk", + "ctor", "diesel 2.1.3", "diesel_migrations", "dotenvy", @@ -123,7 +126,7 @@ checksum = "16e62a023e7c117e27523144c5d2459f4397fcc3cab0085af8e2224f643a0193" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -134,9 +137,15 @@ checksum = "c980ee35e870bd1a4d2c8294d4c04d0499e67bca1e4b5cefcc693c2fa00caea9" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] +[[package]] +name = "async_once" +version = "0.2.6" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2ce4f10ea3abcd6617873bae9f91d1c5332b4a778bd9ce34d0cd517474c1de82" + [[package]] name = "atomic" version = "0.5.3" @@ -181,9 +190,9 @@ checksum = "3441f0f7b02788e948e47f457ca01f1d7e6d92c693bc132c22b087d3141c03ff" [[package]] name = "base64" -version = "0.21.4" +version = "0.21.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "9ba43ea6f343b788c8764558649e08df62f86c6ef251fdaeb1ffd010a9ae50a2" +checksum = "9d297deb1925b89f2ccc13d7635fa0714f12c87adce1c75356b39ca9b7178567" [[package]] name = "bigdecimal" @@ -255,9 +264,9 @@ checksum = "1fd0f2584146f6f2ef48085050886acf353beff7305ebd1ae69500e27c67f64b" [[package]] name = "bytes" -version = "1.5.0" +version = "1.6.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a2bd12c1caf447e69cd4528f47f94d203fd2582878ecb9e9465484c4148a8223" +checksum = "514de17de45fdb8dc022b1a7975556c53c86f9f0aa5f534b98977b171857c2c9" [[package]] name = "cc" @@ -268,6 +277,19 @@ dependencies = [ "libc", ] +[[package]] +name = "cder" +version = "0.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "d30fa19274b4780f8aa61b9ae6a4d98eaf39eb84619d6366f108e7aea71d5114" +dependencies = [ + "anyhow", + "once_cell", + "regex", + "serde", + "serde_yaml 0.9.25", +] + [[package]] name = "cfg-if" version = "1.0.0" @@ -286,7 +308,7 @@ dependencies = [ "num-traits", "serde", "wasm-bindgen", - "windows-targets", + "windows-targets 0.48.5", ] [[package]] @@ -403,6 +425,16 @@ dependencies = [ "typenum", ] +[[package]] +name = "ctor" +version = "0.2.8" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "edb49164822f3ee45b17acd4a208cfc1251410cf0cad9a833234c9890774dd9f" +dependencies = [ + "quote", + "syn 2.0.67", +] + [[package]] name = "delegate-attr" version = "0.2.9" @@ -485,7 +517,7 @@ dependencies = [ "proc-macro2", "proc-macro2-diagnostics", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -538,7 +570,7 @@ dependencies = [ "diesel_table_macro_syntax", "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -558,7 +590,7 @@ version = "0.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "fc5557efc453706fed5e4fa85006fe9817c224c3f480a34c7e5959fd700921c5" dependencies = [ - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -591,9 +623,9 @@ checksum = "1435fa1053d8b2fbbe9be7e97eca7f33d37b28409959813daefc1446a14247f1" [[package]] name = 
"either" -version = "1.9.0" +version = "1.12.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a26ae43d7bcc3b814de94796a5e736d4029efb0ee900c12e2d54c993ad1a1e07" +checksum = "3dca9240753cf90908d7e4aac30f630662b02aebaa1b58a3cadabdb23385b58b" [[package]] name = "encoding_rs" @@ -626,7 +658,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ac3e13f66a2f95e32a39eaa81f6b95d42878ca0e1db0c7543723dfe12557e860" dependencies = [ "libc", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -658,7 +690,7 @@ dependencies = [ "cfg-if", "libc", "redox_syscall 0.3.5", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -762,7 +794,7 @@ checksum = "89ca545a94061b6365f2c7355b4b32bd20df3ff95f02da9329b34ccc3bd6ee72" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -1116,7 +1148,7 @@ checksum = "cb0889898416213fab133e1d33a0e5858a48177452750691bde3666d0fdbaf8b" dependencies = [ "hermit-abi", "rustix", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -1140,7 +1172,7 @@ version = "9.1.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "155c4d7e39ad04c172c5e3a99c434ea3b4a7ba7960b38ecd562b270b097cce09" dependencies = [ - "base64 0.21.4", + "base64 0.21.7", "pem", "ring", "serde", @@ -1176,9 +1208,9 @@ checksum = "e2abad23fbc42b3700f2f279844dc832adb2b2eb069b2df918f455c4e18cc646" [[package]] name = "libc" -version = "0.2.149" +version = "0.2.155" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "a08173bc88b7955d1b3145aa561539096c421ac8debde8cbc3612ec635fee29b" +checksum = "97b3888a4aecf77e811145cadf6eef5901f4782c53886191b2f693f24761847c" [[package]] name = "libm" @@ -1242,7 +1274,7 @@ dependencies = [ "serde", "serde-value", "serde_json", - "serde_yaml", + "serde_yaml 0.8.26", "thiserror", "thread-id", "typemap-ors", @@ -1281,9 +1313,9 @@ dependencies = [ [[package]] name = "memchr" -version = "2.6.4" +version = "2.7.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f665ee40bc4a3c5590afb1e9677db74a508659dfd71e126420da8274909a0167" +checksum = "78ca9ab1a0babb1e7d5695e3530886289c18cf2f87ec19a575a0abdce112e3a3" [[package]] name = "migrations_internals" @@ -1329,14 +1361,14 @@ dependencies = [ [[package]] name = "mio" -version = "0.8.8" +version = "0.8.11" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "927a765cd3fc26206e66b296465fa9d3e5ab003e651c1b3c060e7956d96b19d2" +checksum = "a4a650543ca06a924e8b371db273b2756685faae30f8487da1b56505a8f78b0c" dependencies = [ "libc", "log", "wasi", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -1363,7 +1395,7 @@ dependencies = [ "cfg-if", "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -1420,7 +1452,7 @@ version = "1.1.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ec60c60a693226186f5d6edf073232bfb6464ed97eb22cf3b01c1e8198fd97f5" dependencies = [ - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -1439,7 +1471,7 @@ dependencies = [ "log", "mio", "walkdir", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -1514,9 +1546,9 @@ dependencies = [ [[package]] name = "once_cell" -version = "1.18.0" +version = "1.19.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "dd8b5dd2ae5ed71462c540258bedcb51965123ad7e7ccf4b9a8cafaa4a63576d" +checksum = "3fdb12b2476b595f9358c5161aa467c2438859caa136dec86c26fdd2efe17b92" [[package]] name = "openssl" @@ 
-1541,7 +1573,7 @@ checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -1597,7 +1629,7 @@ dependencies = [ "libc", "redox_syscall 0.4.1", "smallvec", - "windows-targets", + "windows-targets 0.48.5", ] [[package]] @@ -1629,7 +1661,7 @@ dependencies = [ "proc-macro2", "proc-macro2-diagnostics", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -1638,7 +1670,7 @@ version = "3.0.2" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "3163d2912b7c3b52d651a055f2c7eec9ba5cd22d26ef75b8dd3a59980b185923" dependencies = [ - "base64 0.21.4", + "base64 0.21.7", "serde", ] @@ -1679,7 +1711,7 @@ dependencies = [ "pest_meta", "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -1748,7 +1780,7 @@ checksum = "266c042b60c9c76b8d53061e52b2e0d1116abc57cefc8c5cd671619a56ac3690" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -1842,9 +1874,9 @@ dependencies = [ [[package]] name = "proc-macro2" -version = "1.0.78" +version = "1.0.86" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "e2422ad645d89c99f8f3e6b88a9fdeca7fabeac836b1002371c4367c8f984aae" +checksum = "5e719e8df665df0d1c8fbfd238015744736151d4445ec0836b8e628aae103b77" dependencies = [ "unicode-ident", ] @@ -1857,16 +1889,16 @@ checksum = "af066a9c399a26e020ada66a034357a868728e72cd426f3adcd35f80d88d88c8" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", "version_check", "yansi", ] [[package]] name = "quote" -version = "1.0.35" +version = "1.0.36" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "291ec9ab5efd934aaf503a6466c5d5251535d108ee747472c3977cc5acc868ef" +checksum = "0fa76aaf39101c457836aec0ce2316dbdc3ab723cdda1c6bd4e6ad4208acaca7" dependencies = [ "proc-macro2", ] @@ -1957,19 +1989,19 @@ checksum = "7f7473c2cfcf90008193dd0e3e16599455cb601a9fce322b5bb55de799664925" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] name = "regex" -version = "1.10.2" +version = "1.10.5" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "380b951a9c5e80ddfd6136919eef32310721aa4aacd4889a8d39124b026ab343" +checksum = "b91213439dad192326a0d7c6ee3955910425f441d7038e0d6933b0aec5c4517f" dependencies = [ "aho-corasick", "memchr", - "regex-automata 0.4.3", - "regex-syntax 0.8.2", + "regex-automata 0.4.7", + "regex-syntax 0.8.4", ] [[package]] @@ -1983,13 +2015,13 @@ dependencies = [ [[package]] name = "regex-automata" -version = "0.4.3" +version = "0.4.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "5f804c7828047e88b2d32e2d7fe5a105da8ee3264f01902f796c8e067dc2483f" +checksum = "38caf58cc5ef2fed281f89292ef23f6365465ed9a41b7a7754eb4e26496c92df" dependencies = [ "aho-corasick", "memchr", - "regex-syntax 0.8.2", + "regex-syntax 0.8.4", ] [[package]] @@ -2000,9 +2032,9 @@ checksum = "f162c6dd7b008981e4d40210aca20b4bd0f9b60ca9271061b07f78537722f2e1" [[package]] name = "regex-syntax" -version = "0.8.2" +version = "0.8.4" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "c08c74e62047bb2de4ff487b251e4a92e24f48745648451635cec7d591162d9f" +checksum = "7a66a03ae7c801facd77a29370b4faec201768915ac14a721ba36f20bc9c209b" [[package]] name = "reqwest" @@ -2010,7 +2042,7 @@ version = "0.11.22" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = 
"046cd98826c46c2ac8ddecae268eb5c2e58628688a5fc7a2643704a73faba95b" dependencies = [ - "base64 0.21.4", + "base64 0.21.7", "bytes", "encoding_rs", "futures-core", @@ -2053,7 +2085,7 @@ dependencies = [ "libc", "spin", "untrusted", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -2116,7 +2148,7 @@ dependencies = [ "proc-macro2", "quote", "rocket_http", - "syn 2.0.50", + "syn 2.0.67", "unicode-xid", "version_check", ] @@ -2202,7 +2234,7 @@ dependencies = [ "errno", "libc", "linux-raw-sys", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -2232,7 +2264,7 @@ version = "0.1.22" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0c3733bf4cf7ea0880754e19cb5a462007c4a8c1914bff372ccc95b464f1df88" dependencies = [ - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -2306,7 +2338,7 @@ checksum = "1e48d1f918009ce3145511378cf68d613e3b3d9137d67272562080d68a2b32d5" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -2353,6 +2385,19 @@ dependencies = [ "yaml-rust", ] +[[package]] +name = "serde_yaml" +version = "0.9.25" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "1a49e178e4452f45cb61d0cd8cebc1b0fafd3e41929e996cef79aa3aca91f574" +dependencies = [ + "indexmap 2.0.2", + "itoa", + "ryu", + "serde", + "unsafe-libyaml", +] + [[package]] name = "sha2" version = "0.10.8" @@ -2420,9 +2465,9 @@ dependencies = [ [[package]] name = "smallvec" -version = "1.11.1" +version = "1.13.2" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "942b4a808e05215192e39f4ab80813e599068285906cc91aa64f923db842bd5a" +checksum = "3c5e1a9a646d36c3599cd173a41282daf47c44583ad367b8e6837255952e5c67" [[package]] name = "snafu" @@ -2457,12 +2502,12 @@ dependencies = [ [[package]] name = "socket2" -version = "0.5.4" +version = "0.5.7" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4031e820eb552adee9295814c0ced9e5cf38ddf1e8b7d566d6de8e2538ea989e" +checksum = "ce305eb0b4296696835b71df73eb912e0f1ffd2556a501fcede6e0c50349191c" dependencies = [ "libc", - "windows-sys", + "windows-sys 0.52.0", ] [[package]] @@ -2502,9 +2547,9 @@ dependencies = [ [[package]] name = "syn" -version = "2.0.50" +version = "2.0.67" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "74f1bdc9872430ce9b75da68329d1c1746faf50ffac5f19e02b71e37ff881ffb" +checksum = "ff8655ed1d86f3af4ee3fd3263786bc14245ad17c4c7e85ba7187fb3ae028c90" dependencies = [ "proc-macro2", "quote", @@ -2542,7 +2587,7 @@ dependencies = [ "fastrand", "redox_syscall 0.3.5", "rustix", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -2575,22 +2620,22 @@ checksum = "3369f5ac52d5eb6ab48c6b4ffdc8efbcad6b89c765749064ba298f2c68a16a76" [[package]] name = "thiserror" -version = "1.0.50" +version = "1.0.61" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "f9a7210f5c9a7156bb50aa36aed4c95afb51df0df00713949448cf9e97d382d2" +checksum = "c546c80d6be4bc6a00c0f01730c08df82eaa7a7a61f11d656526506112cc1709" dependencies = [ "thiserror-impl", ] [[package]] name = "thiserror-impl" -version = "1.0.50" +version = "1.0.61" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "266b2e40bc00e5a6c09c3584011e08b06f123c00362c92b975ba9843aaaa14b8" +checksum = "46c3384250002a6d5af4d114f2845d37b57521033f30d5c3f46c4d70e1197533" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -2659,9 +2704,9 @@ checksum = 
"1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20" [[package]] name = "tokio" -version = "1.33.0" +version = "1.38.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "4f38200e3ef7995e5ef13baec2f432a6da0aa9ac495b2c0e8f3b7eec2c92d653" +checksum = "ba4f4a02a7a80d6f274636f0aa95c7e383b912d41fe721a31f29e29698585a4a" dependencies = [ "backtrace", "bytes", @@ -2671,9 +2716,9 @@ dependencies = [ "parking_lot", "pin-project-lite", "signal-hook-registry", - "socket2 0.5.4", + "socket2 0.5.7", "tokio-macros", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] @@ -2693,13 +2738,13 @@ dependencies = [ [[package]] name = "tokio-macros" -version = "2.1.0" +version = "2.3.0" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "630bdcf245f78637c13ec01ffae6187cca34625e8c63150d424b59e55af2675e" +checksum = "5f5ae998a069d4b5aba8ee9dad856af7d520c3699e6159b185c2acd48155d39a" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -2714,9 +2759,9 @@ dependencies = [ [[package]] name = "tokio-stream" -version = "0.1.14" +version = "0.1.15" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "397c988d37662c7dda6d2208364a706264bf3d6138b11d436cbac0ad38832842" +checksum = "267ac89e0bec6e691e5813911606935d77c476ff49024f98abcea3e7b15e37af" dependencies = [ "futures-core", "pin-project-lite", @@ -2725,16 +2770,15 @@ dependencies = [ [[package]] name = "tokio-util" -version = "0.7.9" +version = "0.7.11" source = "registry+https://github.com/rust-lang/crates.io-index" -checksum = "1d68074620f57a0b21594d9735eb2e98ab38b17f80d3fcb189fca266771ca60d" +checksum = "9cf6b47b3771c49ac75ad09a6162f53ad4b8088b76ac60e8ec1455b31a189fe1" dependencies = [ "bytes", "futures-core", "futures-sink", "pin-project-lite", "tokio", - "tracing", ] [[package]] @@ -2821,7 +2865,7 @@ checksum = "34704c8d6ebcbc939824180af020566b01a7c01f80641264eba0999f6c2b6be7" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", ] [[package]] @@ -2995,6 +3039,12 @@ dependencies = [ "destructure_traitobject", ] +[[package]] +name = "unsafe-libyaml" +version = "0.2.11" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "673aac59facbab8a9007c7f6108d11f63b603f7cabff99fabf650fea5c32b861" + [[package]] name = "untrusted" version = "0.9.0" @@ -3129,7 +3179,7 @@ dependencies = [ "once_cell", "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", "wasm-bindgen-shared", ] @@ -3163,7 +3213,7 @@ checksum = "54681b18a46765f095758388f2d0cf16eb8d4169b639ab575a8f5693af210c7b" dependencies = [ "proc-macro2", "quote", - "syn 2.0.50", + "syn 2.0.67", "wasm-bindgen-backend", "wasm-bindgen-shared", ] @@ -3221,7 +3271,7 @@ version = "0.48.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "e686886bc078bc1b0b600cac0147aadb815089b6e4da64016cbd754b6342700f" dependencies = [ - "windows-targets", + "windows-targets 0.48.5", ] [[package]] @@ -3230,7 +3280,7 @@ version = "0.51.1" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "f1f8cf84f35d2db49a46868f947758c7a1138116f7fac3bc844f43ade1292e64" dependencies = [ - "windows-targets", + "windows-targets 0.48.5", ] [[package]] @@ -3239,7 +3289,16 @@ version = "0.48.0" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "677d2418bec65e3338edb076e806bc1ec15693c5d0104683f2efe857f61056a9" dependencies = [ - "windows-targets", + "windows-targets 0.48.5", +] + +[[package]] +name = "windows-sys" 
+version = "0.52.0" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "282be5f36a8ce781fad8c8ae18fa3f9beff57ec1b52cb3de0789201425d9a33d" +dependencies = [ + "windows-targets 0.52.5", ] [[package]] @@ -3248,13 +3307,29 @@ version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "9a2fa6e2155d7247be68c096456083145c183cbbbc2764150dda45a87197940c" dependencies = [ - "windows_aarch64_gnullvm", - "windows_aarch64_msvc", - "windows_i686_gnu", - "windows_i686_msvc", - "windows_x86_64_gnu", - "windows_x86_64_gnullvm", - "windows_x86_64_msvc", + "windows_aarch64_gnullvm 0.48.5", + "windows_aarch64_msvc 0.48.5", + "windows_i686_gnu 0.48.5", + "windows_i686_msvc 0.48.5", + "windows_x86_64_gnu 0.48.5", + "windows_x86_64_gnullvm 0.48.5", + "windows_x86_64_msvc 0.48.5", +] + +[[package]] +name = "windows-targets" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "6f0713a46559409d202e70e28227288446bf7841d3211583a4b53e3f6d96e7eb" +dependencies = [ + "windows_aarch64_gnullvm 0.52.5", + "windows_aarch64_msvc 0.52.5", + "windows_i686_gnu 0.52.5", + "windows_i686_gnullvm", + "windows_i686_msvc 0.52.5", + "windows_x86_64_gnu 0.52.5", + "windows_x86_64_gnullvm 0.52.5", + "windows_x86_64_msvc 0.52.5", ] [[package]] @@ -3263,42 +3338,90 @@ version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "2b38e32f0abccf9987a4e3079dfb67dcd799fb61361e53e2882c3cbaf0d905d8" +[[package]] +name = "windows_aarch64_gnullvm" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "7088eed71e8b8dda258ecc8bac5fb1153c5cffaf2578fc8ff5d61e23578d3263" + [[package]] name = "windows_aarch64_msvc" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "dc35310971f3b2dbbf3f0690a219f40e2d9afcf64f9ab7cc1be722937c26b4bc" +[[package]] +name = "windows_aarch64_msvc" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "9985fd1504e250c615ca5f281c3f7a6da76213ebd5ccc9561496568a2752afb6" + [[package]] name = "windows_i686_gnu" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "a75915e7def60c94dcef72200b9a8e58e5091744960da64ec734a6c6e9b3743e" +[[package]] +name = "windows_i686_gnu" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "88ba073cf16d5372720ec942a8ccbf61626074c6d4dd2e745299726ce8b89670" + +[[package]] +name = "windows_i686_gnullvm" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "87f4261229030a858f36b459e748ae97545d6f1ec60e5e0d6a3d32e0dc232ee9" + [[package]] name = "windows_i686_msvc" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "8f55c233f70c4b27f66c523580f78f1004e8b5a8b659e05a4eb49d4166cca406" +[[package]] +name = "windows_i686_msvc" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "db3c2bf3d13d5b658be73463284eaf12830ac9a26a90c717b7f771dfe97487bf" + [[package]] name = "windows_x86_64_gnu" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "53d40abd2583d23e4718fddf1ebec84dbff8381c07cae67ff7768bbf19c6718e" +[[package]] +name = "windows_x86_64_gnu" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = 
"4e4246f76bdeff09eb48875a0fd3e2af6aada79d409d33011886d3e1581517d9" + [[package]] name = "windows_x86_64_gnullvm" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "0b7b52767868a23d5bab768e390dc5f5c55825b6d30b86c844ff2dc7414044cc" +[[package]] +name = "windows_x86_64_gnullvm" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "852298e482cd67c356ddd9570386e2862b5673c85bd5f88df9ab6802b334c596" + [[package]] name = "windows_x86_64_msvc" version = "0.48.5" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "ed94fce61571a4006852b7389a063ab983c02eb1bb37b47f8272ce92d06d9538" +[[package]] +name = "windows_x86_64_msvc" +version = "0.52.5" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "bec47e5bfd1bff0eeaf6d8b485cc1074891a197ab4225d504cb7a1ab88b02bf0" + [[package]] name = "winnow" version = "0.5.17" @@ -3315,7 +3438,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "524e57b2c537c0f9b1e69f1965311ec12182b4122e45035b1508cd24d2adadb1" dependencies = [ "cfg-if", - "windows-sys", + "windows-sys 0.48.0", ] [[package]] diff --git a/anubis-eval/Cargo.toml b/anubis-eval/Cargo.toml index 2cf8c01..2ad3408 100644 --- a/anubis-eval/Cargo.toml +++ b/anubis-eval/Cargo.toml @@ -22,7 +22,6 @@ rocket-validation = "0.2.0" validator = "0.16.1" rocket_dyn_templates = { version = "0.1.0", features = ["tera"] } rocket_sync_db_pools = { version = "0.1.0", features = ["diesel_postgres_pool"] } -mockall = "0.12.1" lazy_static = "1.4.0" log4rs = "1.2.0" jsonwebtoken = "9.0.0" @@ -33,4 +32,10 @@ reqwest = { version = "0.11.22", features = ["json"] } futures = "0.3.28" cloudevents-sdk = "0.7.0" anyhow = "1.0.80" -async-scoped = { version = "0.9.0", features = ["use-tokio"] } \ No newline at end of file +async-scoped = { version = "0.9.0", features = ["use-tokio"] } + +[dev-dependencies] +cder = "0.2.1" +mockall = "0.12.1" +ctor = "0.2.8" +async_once = "0.2.6" diff --git a/anubis-eval/Dockerfile b/anubis-eval/Dockerfile index 073181c..c86b677 100644 --- a/anubis-eval/Dockerfile +++ b/anubis-eval/Dockerfile @@ -1,22 +1,21 @@ -FROM rust:1.71.1-slim-buster AS build +FROM lukemathwalker/cargo-chef:0.1.67-rust-slim-buster AS chef +WORKDIR /app + +FROM chef AS planner +COPY . . +RUN cargo chef prepare --recipe-path recipe.json + +FROM chef as builder LABEL stage=builder ENV CARGO_TERM_COLOR always RUN apt-get update && apt-get install -y libpq-dev libsqlite3-dev libmariadbclient-dev-compat pkg-config libssl1.1 libssl-dev && apt-get clean && rm -rf /var/lib/apt/lists/* - -# create empty project for caching dependencies -RUN USER=root cargo new --bin anubis-eval -WORKDIR /anubis-eval -COPY ./Cargo.lock ./ -COPY ./Cargo.toml ./ - -# cache dependencies -RUN cargo install --path . --locked +COPY --from=planner /app/recipe.json recipe.json +RUN cargo chef cook --release --recipe-path recipe.json COPY . . -RUN touch src/main.rs -RUN cargo install --path . 
--locked - -FROM debian:buster-slim +RUN cargo build --release --bin anubis-eval +FROM debian:buster-slim AS runtime +WORKDIR /app RUN apt-get update && apt-get install -y libpq-dev libsqlite3-dev libmariadbclient-dev-compat pkg-config libssl1.1 libssl-dev && apt-get clean && rm -rf /var/lib/apt/lists/* -COPY --from=build /usr/local/cargo/bin/anubis-eval /usr/local/bin/anubis-eval -CMD ["anubis-eval"] \ No newline at end of file +COPY --from=builder /app/target/release/anubis-eval /usr/local/bin +ENTRYPOINT ["/usr/local/bin/anubis-eval"] \ No newline at end of file diff --git a/anubis-eval/src/api/create_submission_endpoint.rs b/anubis-eval/src/api/create_submission_endpoint.rs index 57e938d..5498b9f 100644 --- a/anubis-eval/src/api/create_submission_endpoint.rs +++ b/anubis-eval/src/api/create_submission_endpoint.rs @@ -1,3 +1,12 @@ +use std::str::FromStr; + +use rocket::futures::future::join_all; +use rocket::serde::json::Json; +use rocket::serde::{Deserialize, Serialize}; +use rocket::{debug, error, info, post, Responder}; +use rocket_validation::{Validate, Validated}; +use uuid::Uuid; + use crate::api::middleware::auth::JwtContext; use crate::application::dapr_client::DaprClient; use crate::config::di::CONFIG; @@ -6,14 +15,8 @@ use crate::contracts::dapr_dtos::{CreateSubmissionBatchDto, CreateSubmissionTest use crate::domain::application_error::ApplicationError; use crate::domain::submission::{Language, Submission, TestCase, TestCaseStatus}; use crate::infrastructure::db::Db; -use rocket::futures::future::join_all; -use rocket::serde::json::Json; -use rocket::{debug, error, info, post, Responder}; -use rocket_validation::{Validate, Validated}; -use std::str::FromStr; -use uuid::Uuid; -#[derive(Debug, serde::Deserialize, Validate)] +#[derive(Debug, Serialize, Deserialize, Validate)] #[serde(crate = "rocket::serde")] pub struct CreateSubmissionRequest { problem_id: Uuid, @@ -169,3 +172,203 @@ pub async fn create_submission( }) .await } + +#[cfg(test)] +mod tests { + use std::str::FromStr; + + use rocket::http::{Header, Status}; + use uuid::Uuid; + + use crate::api::create_submission_endpoint::CreateSubmissionRequest; + use crate::api::middleware::auth::tests::encode_jwt; + use crate::config::di::DB_CONN; + use crate::contracts::create_submission_dtos::CreateSubmissionResponseDto; + use crate::domain::submission::Submission; + use crate::tests::common::{Result, ROCKET_CLIENT}; + use crate::tests::problem::tests::PROBLEMS; + use crate::tests::user::tests::{User, UserProfile}; + + #[tokio::test] + async fn unauthenticated_user_cannot_create_submission() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let problem = PROBLEMS.get("SumAB")?; + let submission_req = CreateSubmissionRequest { + problem_id: Uuid::from_str(problem.id.as_str()).unwrap(), + language: "Rust".to_string(), + source_code: "fn main() { let mut s = String::new(); std::io::stdin().read_line(&mut s).unwrap(); let v: Vec = s.trim().split_whitespace().map(|x| x.parse().unwrap()).collect(); println!(\"{}\", v[0] + v[1]); }".to_string(), + }; + + // Act + let response = client + .post("/api/submissions") + .header(Header::new("Content-Type", "application/json")) + .body(serde_json::to_string(&submission_req)?) 
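// No Authorization header is attached here, so dispatching the request
// exercises the JwtContext guard on its own; the assertion below expects
// the guard to short-circuit with 401 Unauthorized.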
+ .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Unauthorized, + "Unauthenticated user cannot create submission" + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_create_submission_for_unpublished_problem_if_proposer( + ) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let problem = PROBLEMS.get("DiffAB")?; + let token = encode_jwt(User::get(UserProfile::Admin))?; + + let submission_req = CreateSubmissionRequest { + problem_id: Uuid::from_str(problem.id.as_str()).unwrap(), + language: "Rust".to_string(), + source_code: "fn main() { let mut s = String::new(); std::io::stdin().read_line(&mut s).unwrap(); let v: Vec = s.trim().split_whitespace().map(|x| x.parse().unwrap()).collect(); println!(\"{}\", v[0] - v[1]); }".to_string(), + }; + + // Act + let response = client + .post("/api/submissions") + .header(Header::new("Content-Type", "application/json")) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .body(serde_json::to_string(&submission_req)?) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Created, + "Authenticated user can create submission for unpublished problem if proposer" + ); + + let body: CreateSubmissionResponseDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + let conn = DB_CONN.clone(); + let mut conn = conn.lock().await; + + let (submission, _) = Submission::find_by_id(&body.id, &mut conn)?; + + // clear the submission after evaluation + Submission::delete_by_id(&body.id, &mut conn)?; + + assert_eq!(submission.id(), Uuid::from_str(&body.id)?); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_cannot_create_submission_for_unpublished_problem() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let problem = PROBLEMS.get("DiffAB")?; + let token = encode_jwt(User::get(UserProfile::Proposer))?; + + let submission_req = CreateSubmissionRequest { + problem_id: Uuid::from_str(problem.id.as_str()).unwrap(), + language: "Rust".to_string(), + source_code: "fn main() { let mut s = String::new(); std::io::stdin().read_line(&mut s).unwrap(); let v: Vec = s.trim().split_whitespace().map(|x| x.parse().unwrap()).collect(); println!(\"{}\", v[0] - v[1]); }".to_string(), + }; + + // Act + let response = client + .post("/api/submissions") + .header(Header::new("Content-Type", "application/json")) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .body(serde_json::to_string(&submission_req)?) 
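// Same dispatch pattern as above, but authenticated as the Proposer fixture,
// who did not propose DiffAB (the Admin fixture did, per the previous test);
// submitting to someone else's unpublished problem should yield 403 Forbidden.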
+ .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Forbidden, + "Authenticated user cannot create submission for unpublished problem" + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_create_submission() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let problem = PROBLEMS.get("SumAB")?; + let token = encode_jwt(User::get(UserProfile::Ordinary))?; + + let submission_req = CreateSubmissionRequest { + problem_id: Uuid::from_str(problem.id.as_str()).unwrap(), + language: "Rust".to_string(), + source_code: "fn main() { let mut s = String::new(); std::io::stdin().read_line(&mut s).unwrap(); let v: Vec = s.trim().split_whitespace().map(|x| x.parse().unwrap()).collect(); println!(\"{}\", v[0] + v[1]); }".to_string(), + }; + + // Act + let response = client + .post("/api/submissions") + .header(Header::new("Content-Type", "application/json")) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .body(serde_json::to_string(&submission_req)?) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Created, + "Authenticated user can create submission and it is accepted" + ); + + let body: CreateSubmissionResponseDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + let conn = DB_CONN.clone(); + let mut conn = conn.lock().await; + + let (submission, _) = Submission::find_by_id(&body.id, &mut conn)?; + + // clear the submission after evaluation + Submission::delete_by_id(&body.id, &mut conn)?; + + assert_eq!(submission.id(), Uuid::from_str(&body.id)?); + + // let submission_id = Arc::new(Mutex::new(body.id.clone())); + // let conn = DB_CONN.clone(); + // + // let eval_task = async move { + // let mut conn = conn.lock().await; + // let submission_id = submission_id.lock().await.clone(); + // let mut ticker = tokio::time::interval(Duration::from_millis(100)); + // + // loop { + // ticker.tick().await; + // let (submission, _) = Submission::find_by_id(&submission_id, &mut conn).unwrap(); + // if submission.status() != SubmissionStatus::Evaluating { + // return Ok::(submission); + // } + // } + // }; + // let eval_handle = tokio::spawn(eval_task); + // + // // clear the submission after evaluation + // let conn = DB_CONN.clone(); + // let mut conn = conn.lock().await; + // let submission_id = body.id; + // Submission::delete_by_id(&submission_id, &mut conn)?; + // + // match timeout(Duration::from_secs(5), eval_handle).await { + // Ok(Ok(Ok(submission))) => { + // assert_eq!(submission.status(), SubmissionStatus::Accepted); + // } + // _ => { + // panic!("TIMEOUT: Submission did not complete evaluation in time"); + // } + // } + + Ok(()) + } +} diff --git a/anubis-eval/src/api/get_highest_score_submissions.rs b/anubis-eval/src/api/get_highest_score_submissions.rs index 13afcd4..b68fa59 100644 --- a/anubis-eval/src/api/get_highest_score_submissions.rs +++ b/anubis-eval/src/api/get_highest_score_submissions.rs @@ -46,3 +46,152 @@ pub async fn get_highest_score_submissions( }) .await } + +#[cfg(test)] +mod tests { + use crate::api::middleware::auth::tests::encode_jwt; + use crate::contracts::get_highest_score_submissions_dtos::GetHighestScoreSubmissionsDto; + use crate::tests::common::{Result, ROCKET_CLIENT}; + use crate::tests::problem::tests::PROBLEMS; + use crate::tests::user::tests::{User, UserProfile}; + use rocket::http::{Header, Status}; + + #[tokio::test] + async fn unauthenticated_user_cannot_get_submissions() -> Result<()> { + // Arrange + let 
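// (Arrange) ROCKET_CLIENT is the lazily built, pre-seeded Rocket test client
// shared by the whole suite (see tests/mod.rs); cloning the Arc is cheap and
// keeps every test on the same instance and database state.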
client = ROCKET_CLIENT.get().await.clone(); + let user_profile = User::get(UserProfile::Ordinary); + + // Act + let response = client + .get(format!("/api/submissions/user/{}", user_profile.id)) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Unauthorized, + "Unauthenticated user cannot get highest score submissions" + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_own_submissions() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let user_profile = User::get(UserProfile::Ordinary); + let token = encode_jwt(user_profile.clone())?; + + // Act + let response = client + .get(format!("/api/submissions/user/{}", user_profile.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get own highest score submissions" + ); + + let body: GetHighestScoreSubmissionsDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submissions.len(), 2); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_submissions_for_specific_problem() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let user_profile = User::get(UserProfile::Ordinary); + let problem = PROBLEMS.get("SumAB")?; + let token = encode_jwt(user_profile.clone())?; + + // Act + let response = client + .get(format!( + "/api/submissions/user/{}?problem_id={}", + user_profile.id, problem.id + )) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get highest score submissions for specific problem" + ); + + let body: GetHighestScoreSubmissionsDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submissions.len(), 1); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_submissions_as_proposer() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let user_profile = User::get(UserProfile::Ordinary); + let token = encode_jwt(User::get(UserProfile::Admin))?; + + // Act + let response = client + .get(format!("/api/submissions/user/{}", user_profile.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get highest score submissions as proposer" + ); + + let body: GetHighestScoreSubmissionsDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submissions.len(), 2); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_submissions_only_for_published_problems_if_not_proposer( + ) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let user_profile = User::get(UserProfile::Ordinary); + let token = encode_jwt(User::get(UserProfile::Proposer))?; + + // Act + let response = client + .get(format!("/api/submissions/user/{}", user_profile.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get highest score submissions only for published problems if not proposer" + ); + + let body: GetHighestScoreSubmissionsDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submissions.len(), 
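// Expected count is 1: the viewer holds a Proposer token but did not propose
// DiffAB, so the Ordinary user's unpublished DiffAB submission is filtered
// out and only the published SumAB one remains.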
1); + + Ok(()) + } +} diff --git a/anubis-eval/src/api/get_submission_endpoint.rs b/anubis-eval/src/api/get_submission_endpoint.rs index f388362..585e429 100644 --- a/anubis-eval/src/api/get_submission_endpoint.rs +++ b/anubis-eval/src/api/get_submission_endpoint.rs @@ -78,3 +78,211 @@ pub async fn get_submission( }) .await } + +#[cfg(test)] +mod tests { + use crate::api::middleware::auth::tests::encode_jwt; + use crate::contracts::get_submission_dtos::GetSubmissionWithTestCasesDto; + use crate::tests::common::{Result, ROCKET_CLIENT}; + use crate::tests::submission::tests::SUBMISSIONS; + use crate::tests::user::tests::{User, UserProfile}; + use rocket::http::{Header, Status}; + use uuid::Uuid; + + #[tokio::test] + async fn unauthenticated_user_cannot_get_submission() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let submission = SUBMISSIONS.get("Ordinary_SumAB_Submission1")?; + + // Act + let response = client + .get(format!("/api/submissions/{}", submission.id)) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Unauthorized, + "Unauthenticated user cannot get submission" + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_own_submission() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Ordinary))?; + let submission = SUBMISSIONS.get("Ordinary_SumAB_Submission1")?; + + // Act + let response = client + .get(format!("/api/submissions/{}", submission.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get own submission" + ); + + let body: GetSubmissionWithTestCasesDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submission.id, submission.id); + assert_eq!( + body.submission.source_code, + Some(submission.source_code.clone()) + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_other_user_submission_without_source_code_for_still_unsolved_problem( + ) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Admin))?; + let submission = SUBMISSIONS.get("Ordinary_SumAB_Submission1")?; + + // Act + let response = client + .get(format!("/api/submissions/{}", submission.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get other user's submission without source code for still unsolved problem" + ); + + let body: GetSubmissionWithTestCasesDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submission.id, submission.id); + assert_eq!(body.submission.source_code, None); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_other_user_submission_with_source_code_for_already_solved_problem( + ) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Proposer))?; + let submission = SUBMISSIONS.get("Ordinary_SumAB_Submission1")?; + + // Act + let response = client + .get(format!("/api/submissions/{}", submission.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get other user's 
submission with source code for already solved problem" + ); + + let body: GetSubmissionWithTestCasesDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submission.id, submission.id); + assert_eq!( + body.submission.source_code, + Some(submission.source_code.clone()) + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_own_submission_for_prior_published_problem() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Ordinary))?; + let submission = SUBMISSIONS.get("Ordinary_DiffAB_Submission5")?; + + // Act + let response = client + .get(format!("/api/submissions/{}", submission.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get own submission for prior published problem" + ); + + let body: GetSubmissionWithTestCasesDto = + serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.submission.id, submission.id); + assert_eq!( + body.submission.source_code, + Some(submission.source_code.clone()) + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_cannot_get_non_existent_submission() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Ordinary))?; + + // Act + let response = client + .get(format!("/api/submissions/{}", Uuid::new_v4().to_string())) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::NotFound, + "Authenticated user cannot get non-existent submission" + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_cannot_get_other_user_submission_for_unpublished_problem( + ) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Proposer))?; + let submission = SUBMISSIONS.get("Admin_DiffAB_Submission4")?; + + // Act + let response = client + .get(format!("/api/submissions/{}", submission.id)) + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Forbidden, + "Authenticated user cannot get other user's submission for unpublished problem" + ); + + Ok(()) + } +} diff --git a/anubis-eval/src/api/get_submissions_endpoint.rs b/anubis-eval/src/api/get_submissions_endpoint.rs index c3288f9..7e76af9 100644 --- a/anubis-eval/src/api/get_submissions_endpoint.rs +++ b/anubis-eval/src/api/get_submissions_endpoint.rs @@ -43,3 +43,116 @@ pub async fn get_submissions( ) .await } + +#[cfg(test)] +mod tests { + use crate::api::middleware::auth::tests::encode_jwt; + use crate::contracts::get_submissions_dtos::GetSubmissionsDto; + use crate::tests::common::{Result, ROCKET_CLIENT}; + use crate::tests::user::tests::{User, UserProfile}; + use rocket::http::{Header, Status}; + + #[tokio::test] + async fn unauthenticated_user_cannot_get_submissions() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + + // Act + let response = client.get("/api/submissions").dispatch().await; + + // Assert + assert_eq!( + response.status(), + Status::Unauthorized, + "Unauthenticated user cannot get submissions" + ); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_submissions_including_own_for_prior_published_problem( + 
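// Ordinary's default view below should contain 4 submissions: the three
// SumAB ones (SumAB is published) plus its own DiffAB submission, which the
// author keeps seeing even though DiffAB has since been unpublished.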
) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Ordinary))?; + + // Act + let response = client + .get("/api/submissions") + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get submissions including own for prior published problem" + ); + + let body: GetSubmissionsDto = serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.items, 4); + assert_eq!(body.total_pages, 1); + assert_eq!(body.submissions.len(), 4); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_submissions_including_those_for_own_unpublished_problem( + ) -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Admin))?; + + // Act + let response = client + .get("/api/submissions") + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get submissions including those for own unpublished problem" + ); + + let body: GetSubmissionsDto = serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.items, 5); + assert_eq!(body.total_pages, 1); + assert_eq!(body.submissions.len(), 5); + + Ok(()) + } + + #[tokio::test] + async fn authenticated_user_can_get_submissions() -> Result<()> { + // Arrange + let client = ROCKET_CLIENT.get().await.clone(); + let token = encode_jwt(User::get(UserProfile::Proposer))?; + + // Act + let response = client + .get("/api/submissions") + .header(Header::new("Authorization", format!("Bearer {}", token))) + .dispatch() + .await; + + // Assert + assert_eq!( + response.status(), + Status::Ok, + "Authenticated user can get submissions" + ); + + let body: GetSubmissionsDto = serde_json::from_str(&response.into_string().await.unwrap())?; + assert_eq!(body.items, 3); + assert_eq!(body.total_pages, 1); + assert_eq!(body.submissions.len(), 3); + + Ok(()) + } +} diff --git a/anubis-eval/src/api/health_check_endpoint.rs b/anubis-eval/src/api/health_check_endpoint.rs index 72e35f3..13b50e1 100644 --- a/anubis-eval/src/api/health_check_endpoint.rs +++ b/anubis-eval/src/api/health_check_endpoint.rs @@ -1,6 +1,33 @@ +use crate::infrastructure::db::Db; +use diesel::RunQueryDsl; use rocket::get; #[get("/health_check")] -pub fn health_check() -> &'static str { - "Healthy" +pub async fn health_check(db: Db) -> &'static str { + db.run( + move |conn| match diesel::sql_query("SELECT 1").execute(conn) { + Ok(_) => "Healthy!", + Err(_) => "Unhealthy!", + }, + ) + .await +} + +#[cfg(test)] +mod tests { + use crate::tests::common::Result; + use crate::tests::common::ROCKET_CLIENT; + use rocket::http::Status; + + #[tokio::test] + async fn test_health_check() -> Result<()> { + let client = ROCKET_CLIENT.get().await.clone(); + + let response = client.get("/api/health_check").dispatch().await; + + assert_eq!(response.status(), Status::Ok); + assert_eq!(response.into_string().await.unwrap(), "Healthy!"); + + Ok(()) + } } diff --git a/anubis-eval/src/api/middleware/auth.rs b/anubis-eval/src/api/middleware/auth.rs index 5f29b45..fe3ba12 100644 --- a/anubis-eval/src/api/middleware/auth.rs +++ b/anubis-eval/src/api/middleware/auth.rs @@ -67,7 +67,7 @@ fn decode_jwt(token: String) -> Result { let token = token.trim_start_matches("Bearer").trim(); match decode::( - 
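// `token` is already a &str after trim(), and jsonwebtoken's decode() takes
// &str, so the extra borrow below is dropped (deref coercion made &token
// compile, but it was a needless double reference).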
&token, + token, &DecodingKey::from_secret(secret.as_bytes()), &Validation::new(Algorithm::HS256), ) { @@ -75,3 +75,35 @@ fn decode_jwt(token: String) -> Result { Err(err) => Err(err.kind().to_owned()), } } + +#[cfg(test)] +pub mod tests { + use crate::config::di::CONFIG; + use crate::tests::user::tests::User; + use rocket::serde::{Deserialize, Serialize}; + + #[derive(Debug, Deserialize, Serialize)] + pub struct Claims { + sub: String, + email: String, + role: Vec, + exp: usize, + } + + pub fn encode_jwt(user: User) -> Result> { + let secret = CONFIG.clone().jwt_secret_key; + let claims = Claims { + sub: user.id.to_string(), + email: user.email, + role: user.role.iter().map(|r| r.to_string()).collect(), + exp: (chrono::Utc::now() + chrono::Duration::hours(1)).timestamp() as usize, + }; + + jsonwebtoken::encode( + &jsonwebtoken::Header::default(), + &claims, + &jsonwebtoken::EncodingKey::from_secret(secret.as_ref()), + ) + .map_err(|e| e.into()) + } +} diff --git a/anubis-eval/src/application/dapr_client.rs b/anubis-eval/src/application/dapr_client.rs index 2f2f715..2c0dbd6 100644 --- a/anubis-eval/src/application/dapr_client.rs +++ b/anubis-eval/src/application/dapr_client.rs @@ -264,7 +264,7 @@ impl DaprClient { Ok(response) } - async fn get_item_from_state_store( + pub(crate) async fn get_item_from_state_store( &self, key: &str, ) -> Result, ApplicationError> { @@ -296,7 +296,7 @@ impl DaprClient { Ok(response) } - async fn set_items_in_state_store( + pub(crate) async fn set_items_in_state_store( &self, items: Vec, ) -> Result<(), ApplicationError> { diff --git a/anubis-eval/src/config/logger.rs b/anubis-eval/src/config/logger.rs index b1f8835..5af6af1 100644 --- a/anubis-eval/src/config/logger.rs +++ b/anubis-eval/src/config/logger.rs @@ -5,49 +5,86 @@ use log4rs::encode::pattern::PatternEncoder; use rocket::log::private::LevelFilter; pub fn init_logger() { - match log4rs::init_file("log4rs.yaml", Default::default()) { - Ok(_) => { - println!("Logger initialized using log4rs.yaml"); - } - Err(err) => { - println!( - "Failed to initialize logger, using default configuration due to error: {:?}", - err - ); - println!("Logger initialized using default configuration"); - - let file_appender = FileAppender::builder() - .encoder(Box::new(PatternEncoder::new( - "{d(%Y-%m-%d %H:%M:%S%.3f)} {h({l})} {M} - {m}{n}", - ))) - .build("logs/anubis.logs") - .unwrap(); - - let console_appender = ConsoleAppender::builder() - .encoder(Box::new(PatternEncoder::new( - "{d(%Y-%m-%d %H:%M:%S%.3f)} {h({l})} {M} - {m}{n}", - ))) - .build(); - - let log_config = log4rs::config::Config::builder() - .appender(Appender::builder().build("file_appender", Box::new(file_appender))) - .appender(Appender::builder().build("console_appender", Box::new(console_appender))) - .logger( - Logger::builder() - .appender("file_appender") - .appender("console_appender") - .additive(false) - .build("anubis", LevelFilter::Info), - ) - .build( - Root::builder() - .appender("file_appender") - .appender("console_appender") - .build(LevelFilter::Info), - ) - .unwrap(); - - log4rs::init_config(log_config).unwrap(); + if !cfg!(test) { + match log4rs::init_file("log4rs.yaml", Default::default()) { + Ok(_) => { + println!("Logger initialized using log4rs.yaml"); + } + Err(err) => { + println!( + "Failed to initialize logger, using default configuration due to error: {:?}", + err + ); + println!("Logger initialized using default configuration"); + + let file_appender = FileAppender::builder() + .encoder(Box::new(PatternEncoder::new( + 
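// log4rs pattern, token by token: {d(...)} timestamp, {h({l})} level with
// highlighting, {M} emitting module path, {m} message, {n} newline.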
"{d(%Y-%m-%d %H:%M:%S%.3f)} {h({l})} {M} - {m}{n}", + ))) + .build("logs/anubis.logs") + .unwrap(); + + let console_appender = ConsoleAppender::builder() + .encoder(Box::new(PatternEncoder::new( + "{d(%Y-%m-%d %H:%M:%S%.3f)} {h({l})} {M} - {m}{n}", + ))) + .build(); + + let log_config = log4rs::config::Config::builder() + .appender(Appender::builder().build("file_appender", Box::new(file_appender))) + .appender( + Appender::builder().build("console_appender", Box::new(console_appender)), + ) + .logger( + Logger::builder() + .appender("file_appender") + .appender("console_appender") + .additive(false) + .build("anubis", LevelFilter::Info), + ) + .build( + Root::builder() + .appender("file_appender") + .appender("console_appender") + .build(LevelFilter::Info), + ) + .unwrap(); + + log4rs::init_config(log_config).unwrap(); + } } } } + +#[cfg(test)] +pub mod tests { + use log4rs::append::console::ConsoleAppender; + use log4rs::config::{Appender, Logger, Root}; + use log4rs::encode::pattern::PatternEncoder; + use rocket::log::private::LevelFilter; + + pub(crate) fn init_logger() { + let console_appender = ConsoleAppender::builder() + .encoder(Box::new(PatternEncoder::new( + "{d(%Y-%m-%d %H:%M:%S%.3f)} {h({l})} {M} - {m}{n}", + ))) + .build(); + + let log_config = log4rs::config::Config::builder() + .appender(Appender::builder().build("console_appender", Box::new(console_appender))) + .logger( + Logger::builder() + .appender("console_appender") + .additive(false) + .build("anubis", LevelFilter::Info), + ) + .build( + Root::builder() + .appender("console_appender") + .build(LevelFilter::Info), + ) + .unwrap(); + + log4rs::init_config(log_config).unwrap(); + } +} diff --git a/anubis-eval/src/contracts/create_submission_dtos.rs b/anubis-eval/src/contracts/create_submission_dtos.rs index d4666de..5036448 100644 --- a/anubis-eval/src/contracts/create_submission_dtos.rs +++ b/anubis-eval/src/contracts/create_submission_dtos.rs @@ -1,6 +1,6 @@ -use rocket::serde::Serialize; +use rocket::serde::{Deserialize, Serialize}; -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct CreateSubmissionResponseDto { pub(crate) id: String, diff --git a/anubis-eval/src/contracts/get_highest_score_submissions_dtos.rs b/anubis-eval/src/contracts/get_highest_score_submissions_dtos.rs index 184bc25..b6dca4d 100644 --- a/anubis-eval/src/contracts/get_highest_score_submissions_dtos.rs +++ b/anubis-eval/src/contracts/get_highest_score_submissions_dtos.rs @@ -1,11 +1,11 @@ use crate::domain::problem::Problem; use crate::domain::submission::Submission; -use rocket::serde::Serialize; +use rocket::serde::{Deserialize, Serialize}; -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetHighestScoreSubmissionsDto { - submissions: Vec, + pub submissions: Vec, } #[rocket::async_trait] @@ -32,14 +32,14 @@ impl From> for GetHighestScoreSubmissionsDto { } } -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetHighestScoreSubmissionDto { - id: String, - problem_id: String, - problem_name: String, - is_published: bool, - score: usize, + pub id: String, + pub problem_id: String, + pub problem_name: String, + pub is_published: bool, + pub score: usize, } impl From<(Submission, Problem)> for GetHighestScoreSubmissionDto { diff --git a/anubis-eval/src/contracts/get_submission_dtos.rs b/anubis-eval/src/contracts/get_submission_dtos.rs index 6e25141..691d25a 100644 --- 
a/anubis-eval/src/contracts/get_submission_dtos.rs +++ b/anubis-eval/src/contracts/get_submission_dtos.rs @@ -1,14 +1,14 @@ use crate::domain::problem::Problem; use crate::domain::submission::Submission; use chrono::{DateTime, Utc}; -use rocket::serde::Serialize; +use rocket::serde::{Deserialize, Serialize}; -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetSubmissionWithTestCasesDto { #[serde(flatten)] - submission: GetSubmissionDto, - test_cases: Vec, + pub submission: GetSubmissionDto, + pub test_cases: Vec, } #[rocket::async_trait] @@ -63,34 +63,34 @@ impl From<(Submission, Problem)> for GetSubmissionWithTestCasesDto { } } -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetSubmissionDto { - id: String, - problem_id: String, - problem_name: String, - is_published: bool, - user_id: String, - language: String, - source_code: Option, - status: String, - score: usize, + pub id: String, + pub problem_id: String, + pub problem_name: String, + pub is_published: bool, + pub user_id: String, + pub language: String, + pub source_code: Option, + pub status: String, + pub score: usize, #[serde(with = "chrono::serde::ts_seconds")] - created_at: DateTime, - avg_time: f32, - avg_memory: f32, + pub created_at: DateTime, + pub avg_time: f32, + pub avg_memory: f32, } -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetSubmissionTestCaseDto { - id: usize, - status: String, - time: f32, - memory: f32, - expected_score: usize, - eval_message: String, - compile_output: String, - stdout: String, - stderr: String, + pub id: usize, + pub status: String, + pub time: f32, + pub memory: f32, + pub expected_score: usize, + pub eval_message: String, + pub compile_output: String, + pub stdout: String, + pub stderr: String, } diff --git a/anubis-eval/src/contracts/get_submissions_dtos.rs b/anubis-eval/src/contracts/get_submissions_dtos.rs index 67ad26f..8258c67 100644 --- a/anubis-eval/src/contracts/get_submissions_dtos.rs +++ b/anubis-eval/src/contracts/get_submissions_dtos.rs @@ -1,14 +1,14 @@ use crate::domain::problem::Problem; use crate::domain::submission::Submission; use chrono::{DateTime, Utc}; -use rocket::serde::Serialize; +use rocket::serde::{Deserialize, Serialize}; -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetSubmissionsDto { - submissions: Vec, - items: usize, - total_pages: usize, + pub submissions: Vec, + pub items: usize, + pub total_pages: usize, } #[rocket::async_trait] @@ -51,7 +51,7 @@ impl From> for GetSubmissionsDto { } } -#[derive(Serialize)] +#[derive(Serialize, Deserialize)] #[serde(crate = "rocket::serde")] pub struct GetSubmissionDto { id: String, diff --git a/anubis-eval/src/infrastructure/submission_repository.rs b/anubis-eval/src/infrastructure/submission_repository.rs index bcc4670..1521c2f 100644 --- a/anubis-eval/src/infrastructure/submission_repository.rs +++ b/anubis-eval/src/infrastructure/submission_repository.rs @@ -41,6 +41,44 @@ impl Submission { Ok(()) } + pub fn upsert(&self, conn: &mut PgConnection) -> Result<(), ApplicationError> { + // check if submission fails to insert + let submission: SubmissionModel = self.clone().into(); + + diesel::insert_into(all_submissions) + .values(&submission) + .on_conflict(crate::schema::submissions::dsl::id) + .do_update() + .set(&submission) + .execute(conn) + .map_err(|source| ApplicationError::SubmissionSaveError { + 
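// Classic diesel upsert: INSERT ... ON CONFLICT (id) DO UPDATE SET, so
// re-evaluating a submission overwrites the existing row instead of failing
// on the primary key; any diesel error is wrapped with the submission id: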
submission_id: self.id().to_string(), + source, + })?; + + // check if any of the test cases fail to insert + self.test_cases() + .iter() + .map(|testcase| testcase.upsert(conn)) + .collect::, _>>()?; + + Ok(()) + } + + pub fn delete_by_id(id: &String, conn: &mut PgConnection) -> Result<(), ApplicationError> { + // delete submission and its test cases + TestCase::delete_by_submission_id(id, conn)?; + + diesel::delete(all_submissions.find(id.to_string())) + .execute(conn) + .map_err(|source| ApplicationError::SubmissionSaveError { + submission_id: id.to_string(), + source, + })?; + + Ok(()) + } + pub fn find_by_id( id: &String, conn: &mut PgConnection, @@ -341,6 +379,24 @@ impl TestCase { Ok(()) } + fn upsert(&self, conn: &mut PgConnection) -> Result<(), ApplicationError> { + let testcase: TestCaseModel = self.clone().into(); + + diesel::insert_into(all_testcases) + .values(testcase.clone()) + .on_conflict(crate::schema::submissions_testcases::dsl::token) + .do_update() + .set(&testcase) + .execute(conn) + .map_err(|source| ApplicationError::TestCaseSaveError { + testcase_id: testcase.testcase_id.to_string(), + submission_id: testcase.submission_id.clone(), + source, + })?; + + Ok(()) + } + pub fn find_by_submission_id( submission_id: &String, conn: &mut PgConnection, @@ -409,4 +465,22 @@ impl TestCase { Ok(()) } + + pub fn delete_by_submission_id( + submission_id: &String, + conn: &mut PgConnection, + ) -> Result<(), ApplicationError> { + diesel::delete( + all_testcases + .filter(crate::schema::submissions_testcases::dsl::submission_id.eq(submission_id)), + ) + .execute(conn) + .map_err(|source| ApplicationError::TestCaseSaveError { + testcase_id: "".to_string(), + submission_id: submission_id.clone(), + source, + })?; + + Ok(()) + } } diff --git a/anubis-eval/src/main.rs b/anubis-eval/src/main.rs index 3bc7b49..052bb6e 100644 --- a/anubis-eval/src/main.rs +++ b/anubis-eval/src/main.rs @@ -14,6 +14,7 @@ mod contracts; mod domain; mod infrastructure; mod schema; +mod tests; #[launch] async fn rocket() -> _ { diff --git a/anubis-eval/src/tests/mod.rs b/anubis-eval/src/tests/mod.rs new file mode 100644 index 0000000..3cf5d23 --- /dev/null +++ b/anubis-eval/src/tests/mod.rs @@ -0,0 +1,181 @@ +pub mod problem; +pub mod submission; +pub mod user; + +#[cfg(test)] +pub mod common { + use std::ops::DerefMut; + use std::sync::Arc; + + use crate::application::dapr_client::DaprClient; + use async_once::AsyncOnce; + use diesel::PgConnection; + use lazy_static::lazy_static; + use rocket::info; + use rocket::local::asynchronous::Client; + use serde_json::Value; + + use crate::config::di::{Atomic, DAPR_CLIENT, DB_CONN}; + use crate::config::logger::tests::init_logger; + use crate::contracts::dapr_dtos::StateStoreSetItemDto; + use crate::domain::problem::Problem; + use crate::domain::submission::Submission; + use crate::rocket; + use crate::tests::problem::tests::{PROBLEMS, TESTS}; + use crate::tests::submission::tests::{SUBMISSIONS, TEST_CASES}; + + pub type DefaultError = Box; + pub type DefaultAtomicError = Box; + pub type Result = std::result::Result; + + lazy_static! 
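// One Rocket instance per test binary: lazy_static holds an AsyncOnce that
// builds and seeds the client on first await, and every test gets a clone of
// the same Arc, e.g.
//   let client = ROCKET_CLIENT.get().await.clone();
//   let response = client.get("/api/health_check").dispatch().await;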
{ + pub static ref ROCKET_CLIENT: AsyncOnce> = AsyncOnce::new(async { + let client = setup_rocket().await.expect("Failed to setup rocket client"); + Arc::new(client) + }); + } + + #[ctor::ctor] + fn setup() { + dotenvy::from_filename(".env.template").expect("Failed to load env vars"); + init_logger(); + } + + async fn setup_rocket() -> Result { + let client = Client::tracked(rocket().await).await?; + seed().await?; + + Ok(client) + } + + async fn seed() -> Result<()> { + let conn = DB_CONN.clone(); + seed_problems(conn.clone()).await?; + seed_submissions(conn.clone()).await?; + + let dapr_client = DAPR_CLIENT.clone(); + seed_tests_cache(dapr_client.clone()).await?; + + Ok(()) + } + + async fn seed_problems(conn: Atomic) -> Result<()> { + let mut db = conn.lock().await; + + let raw_test1 = TESTS.get("SumAB_Test1")?.to_owned(); + let raw_test2 = TESTS.get("SumAB_Test2")?.to_owned(); + + let raw_problem = PROBLEMS.get("SumAB")?.to_owned(); + let problem: Problem = (raw_problem, vec![raw_test1, raw_test2]).into(); + + problem.upsert(db.deref_mut())?; + + let raw_test1 = TESTS.get("DiffAB_Test1")?.to_owned(); + + let raw_problem = PROBLEMS.get("DiffAB")?.to_owned(); + let problem: Problem = (raw_problem, vec![raw_test1]).into(); + + problem.upsert(db.deref_mut())?; + + Ok(()) + } + + async fn seed_submissions(conn: Atomic) -> Result<()> { + let mut db = conn.lock().await; + + let raw_testcase1 = TEST_CASES.get("Submission1_TestCase1")?.to_owned(); + let raw_testcase2 = TEST_CASES.get("Submission1_TestCase2")?.to_owned(); + + let raw_submission = SUBMISSIONS.get("Ordinary_SumAB_Submission1")?.to_owned(); + let submission: Submission = (raw_submission, vec![raw_testcase1, raw_testcase2]).into(); + + submission.upsert(db.deref_mut())?; + + let raw_testcase1 = TEST_CASES.get("Submission2_TestCase1")?.to_owned(); + let raw_testcase2 = TEST_CASES.get("Submission2_TestCase2")?.to_owned(); + + let raw_submission = SUBMISSIONS.get("Proposer_SumAB_Submission2")?.to_owned(); + let submission: Submission = (raw_submission, vec![raw_testcase1, raw_testcase2]).into(); + + submission.upsert(db.deref_mut())?; + + let raw_testcase1 = TEST_CASES.get("Submission3_TestCase1")?.to_owned(); + let raw_testcase2 = TEST_CASES.get("Submission3_TestCase2")?.to_owned(); + + let raw_submission = SUBMISSIONS.get("Admin_SumAB_Submission3")?.to_owned(); + let submission: Submission = (raw_submission, vec![raw_testcase1, raw_testcase2]).into(); + + submission.upsert(db.deref_mut())?; + + let raw_testcase1 = TEST_CASES.get("Submission4_TestCase1")?.to_owned(); + + let raw_submission = SUBMISSIONS.get("Admin_DiffAB_Submission4")?.to_owned(); + let submission: Submission = (raw_submission, vec![raw_testcase1]).into(); + + submission.upsert(db.deref_mut())?; + + let raw_testcase1 = TEST_CASES.get("Submission5_TestCase1")?.to_owned(); + + let raw_submission = SUBMISSIONS.get("Ordinary_DiffAB_Submission5")?.to_owned(); + let submission: Submission = (raw_submission, vec![raw_testcase1]).into(); + + submission.upsert(db.deref_mut())?; + + Ok(()) + } + + async fn seed_tests_cache(dapr_client: Atomic) -> Result<()> { + let dapr_client = dapr_client.lock().await; + + let problem1_id = PROBLEMS.get("SumAB")?.id.clone(); + dapr_client + .save_test_for_problem(1, problem1_id.clone(), ("1 2".to_string(), "3".to_string())) + .await?; + dapr_client + .save_test_for_problem(2, problem1_id.clone(), ("3 4".to_string(), "7".to_string())) + .await?; + + let problem2_id = PROBLEMS.get("DiffAB")?.id.clone(); + dapr_client + .save_test_for_problem( + 
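// Tests are cached in the Dapr state store as two entries per test, keyed
// "{problem_id}-{test_id}-input" / "-output"; for SumAB test 1 below that is
// input "1 2" paired with expected output "3".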
+
+    impl DaprClient {
+        async fn save_test_for_problem(
+            &self,
+            test_id: usize,
+            problem_id: String,
+            (input, output): (String, String),
+        ) -> Result<()> {
+            let input_key = format!("{}-{}-input", problem_id, test_id);
+            info!("Saving test input: {}", input_key);
+            let input_item = StateStoreSetItemDto {
+                key: input_key.clone(),
+                value: Value::String(input.clone()),
+            };
+
+            let output_key = format!("{}-{}-output", problem_id, test_id);
+            let output_item = StateStoreSetItemDto {
+                key: output_key,
+                value: Value::String(output.clone()),
+            };
+
+            self.set_items_in_state_store(vec![input_item, output_item])
+                .await?;
+
+            let val = self
+                .get_item_from_state_store(input_key.clone().as_str())
+                .await?;
+            info!("Value: {:?}", val);
+
+            Ok(())
+        }
+    }
+}
diff --git a/anubis-eval/src/tests/problem.rs b/anubis-eval/src/tests/problem.rs
new file mode 100644
index 0000000..f89e7b5
--- /dev/null
+++ b/anubis-eval/src/tests/problem.rs
@@ -0,0 +1,69 @@
+#[cfg(test)]
+pub mod tests {
+    use crate::domain;
+    use cder::{Dict, StructLoader};
+    use lazy_static::lazy_static;
+    use serde::Deserialize;
+
+    #[derive(Debug, Clone, Deserialize)]
+    pub struct Problem {
+        pub id: String,
+        pub name: String,
+        pub proposer_id: String,
+        pub is_published: bool,
+        pub time: f32,
+        pub stack_memory: f32,
+        pub total_memory: f32,
+    }
+
+    impl From<(Problem, Vec<Test>)> for domain::problem::Problem {
+        fn from((problem, tests): (Problem, Vec<Test>)) -> Self {
+            domain::problem::Problem::new(
+                problem.id.parse().unwrap(),
+                problem.name,
+                problem.proposer_id.parse().unwrap(),
+                problem.is_published,
+                Some(problem.time),
+                Some(problem.stack_memory),
+                Some(problem.total_memory),
+                tests.into_iter().map(|test| test.into()).collect(),
+            )
+        }
+    }
+
+    #[derive(Debug, Clone, Deserialize)]
+    pub struct Test {
+        pub id: i32,
+        pub problem_id: String,
+        pub score: i32,
+        pub input_url: String,
+        pub output_url: String,
+    }
+
+    impl From<Test> for domain::test::Test {
+        fn from(test: Test) -> Self {
+            domain::test::Test::new(
+                test.id,
+                test.problem_id.parse().unwrap(),
+                test.score,
+                test.input_url,
+                test.output_url,
+            )
+        }
+    }
+
+    lazy_static! {
+        pub static ref PROBLEMS: StructLoader<Problem> = {
+            let mut loader = StructLoader::<Problem>::new("problems.yaml", "tests-setup/fixtures");
+            loader.load(&Dict::<String>::new()).unwrap();
+
+            loader
+        };
+        pub static ref TESTS: StructLoader<Test> = {
+            let mut loader = StructLoader::<Test>::new("tests.yaml", "tests-setup/fixtures");
+            loader.load(&Dict::<String>::new()).unwrap();
+
+            loader
+        };
+    }
+}
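The loaders above resolve records by their top-level keys in the YAML fixtures added later in this patch. A minimal sketch of how a record resolves, with values taken from problems.yaml below:

// Illustrative only: "SumAB" is a top-level key in
// tests-setup/fixtures/problems.yaml (added later in this patch).
let sum_ab = PROBLEMS.get("SumAB").unwrap().to_owned();
assert_eq!(sum_ab.id, "e8d818c3-cb9b-4164-a1f8-ba0967008086");
assert!(sum_ab.is_published);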
diff --git a/anubis-eval/src/tests/submission.rs b/anubis-eval/src/tests/submission.rs
new file mode 100644
index 0000000..c3647fa
--- /dev/null
+++ b/anubis-eval/src/tests/submission.rs
@@ -0,0 +1,93 @@
+#[cfg(test)]
+pub mod tests {
+    use crate::domain;
+    use cder::{Dict, StructLoader};
+    use lazy_static::lazy_static;
+    use serde::Deserialize;
+    use std::time::SystemTime;
+
+    #[derive(Debug, Clone, Deserialize)]
+    pub struct Submission {
+        pub id: String,
+        pub user_id: String,
+        pub problem_id: String,
+        pub language: String,
+        pub source_code: String,
+        pub status: String,
+        pub score: i32,
+        pub created_at: String,
+        pub avg_time: f32,
+        pub avg_memory: f32,
+    }
+
+    impl From<(Submission, Vec<TestCase>)> for domain::submission::Submission {
+        fn from((submission, test_cases): (Submission, Vec<TestCase>)) -> Self {
+            domain::submission::Submission::new(
+                submission.id.parse().unwrap(),
+                submission.user_id.parse().unwrap(),
+                submission.problem_id.parse().unwrap(),
+                submission.language.into(),
+                submission.source_code,
+                submission.status.into(),
+                submission.score,
+                SystemTime::now(),
+                Some(submission.avg_time),
+                Some(submission.avg_memory),
+                test_cases
+                    .into_iter()
+                    .map(|test_case| test_case.into())
+                    .collect(),
+            )
+        }
+    }
+
+    #[derive(Debug, Clone, Deserialize)]
+    pub struct TestCase {
+        pub token: String,
+        pub submission_id: String,
+        pub status: String,
+        pub time: f32,
+        pub memory: f32,
+        pub eval_message: String,
+        pub stdout: String,
+        pub stderr: String,
+        pub testcase_id: i32,
+        pub expected_score: i32,
+        pub compile_output: String,
+    }
+
+    impl From<TestCase> for domain::submission::TestCase {
+        fn from(test_case: TestCase) -> Self {
+            domain::submission::TestCase::new(
+                test_case.token.parse().unwrap(),
+                test_case.submission_id.parse().unwrap(),
+                test_case.testcase_id,
+                test_case.status.into(),
+                test_case.time,
+                test_case.memory,
+                test_case.expected_score,
+                Some(test_case.eval_message),
+                Some(test_case.compile_output),
+                Some(test_case.stdout),
+                Some(test_case.stderr),
+            )
+        }
+    }
+
+    lazy_static! {
+        pub static ref SUBMISSIONS: StructLoader<Submission> = {
+            let mut loader =
+                StructLoader::<Submission>::new("submissions.yaml", "tests-setup/fixtures");
+            loader.load(&Dict::<String>::new()).unwrap();
+
+            loader
+        };
+        pub static ref TEST_CASES: StructLoader<TestCase> = {
+            let mut loader =
+                StructLoader::<TestCase>::new("test_cases.yaml", "tests-setup/fixtures");
+            loader.load(&Dict::<String>::new()).unwrap();
+
+            loader
+        };
+    }
+}
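Note that the conversion above stamps SystemTime::now() as the creation time instead of parsing the fixture's created_at string, so seeded submissions always look fresh. A minimal sketch of building a domain submission from the loaders, mirroring the seed_submissions helper earlier in this patch:

// Illustrative only: record names come from submissions.yaml and
// test_cases.yaml in tests-setup/fixtures (added later in this patch).
let raw = SUBMISSIONS.get("Ordinary_SumAB_Submission1").unwrap().to_owned();
let tc = TEST_CASES.get("Submission1_TestCase1").unwrap().to_owned();
let submission: domain::submission::Submission = (raw, vec![tc]).into();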
diff --git a/anubis-eval/src/tests/user.rs b/anubis-eval/src/tests/user.rs
new file mode 100644
index 0000000..449c20f
--- /dev/null
+++ b/anubis-eval/src/tests/user.rs
@@ -0,0 +1,55 @@
+#[cfg(test)]
+pub mod tests {
+    use cder::{Dict, StructLoader};
+    use lazy_static::lazy_static;
+    use serde::Deserialize;
+    use std::fmt;
+    use uuid::Uuid;
+
+    #[derive(Debug, Clone, Deserialize)]
+    pub struct User {
+        pub id: Uuid,
+        pub email: String,
+        pub role: Vec<Role>,
+    }
+
+    pub enum UserProfile {
+        Admin,
+        Proposer,
+        Ordinary,
+    }
+
+    impl User {
+        pub fn get(user_profile: UserProfile) -> User {
+            match user_profile {
+                UserProfile::Admin => USERS.get("Admin").unwrap().to_owned(),
+                UserProfile::Proposer => USERS.get("Proposer").unwrap().to_owned(),
+                UserProfile::Ordinary => USERS.get("Ordinary").unwrap().to_owned(),
+            }
+        }
+    }
+
+    #[derive(Debug, Clone, Deserialize)]
+    pub enum Role {
+        Admin,
+        Proposer,
+    }
+
+    impl fmt::Display for Role {
+        fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
+            match self {
+                Role::Admin => write!(f, "Admin"),
+                Role::Proposer => write!(f, "Proposer"),
+            }
+        }
+    }
+
+    lazy_static! {
+        static ref USERS: StructLoader<User> = {
+            let mut loader = StructLoader::<User>::new("users.yaml", "tests-setup/fixtures");
+            loader.load(&Dict::<String>::new()).unwrap();
+
+            loader
+        };
+    }
+}
diff --git a/anubis-eval/tests-setup/cache-stub/.dockerignore b/anubis-eval/tests-setup/cache-stub/.dockerignore
new file mode 100644
index 0000000..e69de29
diff --git a/anubis-eval/tests-setup/cache-stub/.gitignore b/anubis-eval/tests-setup/cache-stub/.gitignore
new file mode 100644
index 0000000..64ae7f3
--- /dev/null
+++ b/anubis-eval/tests-setup/cache-stub/.gitignore
@@ -0,0 +1,25 @@
+# If you prefer the allow list template instead of the deny list, see community template:
+# https://github.com/github/gitignore/blob/main/community/Golang/Go.AllowList.gitignore
+#
+# Binaries for programs and plugins
+*.exe
+*.exe~
+*.dll
+*.so
+*.dylib
+
+# Test binary, built with `go test -c`
+*.test
+
+# Output of the go coverage tool, specifically when used with LiteIDE
+*.out
+
+# Dependency directories (remove the comment below to include it)
+# vendor/
+
+# Go workspace file
+go.work
+go.work.sum
+
+# env file
+.env
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/cache-stub/Dockerfile b/anubis-eval/tests-setup/cache-stub/Dockerfile
new file mode 100644
index 0000000..df48461
--- /dev/null
+++ b/anubis-eval/tests-setup/cache-stub/Dockerfile
@@ -0,0 +1,26 @@
+# specify the base image to be used for the application, alpine or ubuntu
+FROM golang:1.22.4-alpine AS build
+
+# create a working directory inside the image
+WORKDIR /app
+
+# copy Go modules and dependencies to image
+COPY go.mod ./
+COPY go.sum ./
+
+# download Go modules and dependencies
+RUN go mod download
+
+# copy directory files, i.e. all files ending with .go
+COPY *.go ./
+
+# compile application
+RUN go build -o /cache-stub
+
+FROM scratch
+
+WORKDIR /
+
+COPY --from=build /cache-stub /cache-stub
+
+ENTRYPOINT ["/cache-stub"]
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/cache-stub/go.mod b/anubis-eval/tests-setup/cache-stub/go.mod
new
file mode 100644 index 0000000..8f49fc5 --- /dev/null +++ b/anubis-eval/tests-setup/cache-stub/go.mod @@ -0,0 +1,37 @@ +module cache-stub + +go 1.22.4 + +require github.com/gin-gonic/gin v1.10.0 + +require ( + github.com/bytedance/sonic v1.11.6 // indirect + github.com/bytedance/sonic/loader v0.1.1 // indirect + github.com/cespare/xxhash/v2 v2.2.0 // indirect + github.com/cloudwego/base64x v0.1.4 // indirect + github.com/cloudwego/iasm v0.2.0 // indirect + github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect + github.com/gabriel-vasile/mimetype v1.4.3 // indirect + github.com/gin-contrib/sse v0.1.0 // indirect + github.com/go-playground/locales v0.14.1 // indirect + github.com/go-playground/universal-translator v0.18.1 // indirect + github.com/go-playground/validator/v10 v10.20.0 // indirect + github.com/goccy/go-json v0.10.2 // indirect + github.com/json-iterator/go v1.1.12 // indirect + github.com/klauspost/cpuid/v2 v2.2.7 // indirect + github.com/leodido/go-urn v1.4.0 // indirect + github.com/mattn/go-isatty v0.0.20 // indirect + github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect + github.com/modern-go/reflect2 v1.0.2 // indirect + github.com/pelletier/go-toml/v2 v2.2.2 // indirect + github.com/redis/go-redis/v9 v9.5.3 // indirect + github.com/twitchyliquid64/golang-asm v0.15.1 // indirect + github.com/ugorji/go/codec v1.2.12 // indirect + golang.org/x/arch v0.8.0 // indirect + golang.org/x/crypto v0.23.0 // indirect + golang.org/x/net v0.25.0 // indirect + golang.org/x/sys v0.20.0 // indirect + golang.org/x/text v0.15.0 // indirect + google.golang.org/protobuf v1.34.1 // indirect + gopkg.in/yaml.v3 v3.0.1 // indirect +) diff --git a/anubis-eval/tests-setup/cache-stub/go.sum b/anubis-eval/tests-setup/cache-stub/go.sum new file mode 100644 index 0000000..b12dfca --- /dev/null +++ b/anubis-eval/tests-setup/cache-stub/go.sum @@ -0,0 +1,95 @@ +github.com/bytedance/sonic v1.11.6 h1:oUp34TzMlL+OY1OUWxHqsdkgC/Zfc85zGqw9siXjrc0= +github.com/bytedance/sonic v1.11.6/go.mod h1:LysEHSvpvDySVdC2f87zGWf6CIKJcAvqab1ZaiQtds4= +github.com/bytedance/sonic/loader v0.1.1 h1:c+e5Pt1k/cy5wMveRDyk2X4B9hF4g7an8N3zCYjJFNM= +github.com/bytedance/sonic/loader v0.1.1/go.mod h1:ncP89zfokxS5LZrJxl5z0UJcsk4M4yY2JpfqGeCtNLU= +github.com/cespare/xxhash/v2 v2.2.0 h1:DC2CZ1Ep5Y4k3ZQ899DldepgrayRUGE6BBZ/cd9Cj44= +github.com/cespare/xxhash/v2 v2.2.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs= +github.com/cloudwego/base64x v0.1.4 h1:jwCgWpFanWmN8xoIUHa2rtzmkd5J2plF/dnLS6Xd/0Y= +github.com/cloudwego/base64x v0.1.4/go.mod h1:0zlkT4Wn5C6NdauXdJRhSKRlJvmclQ1hhJgA0rcu/8w= +github.com/cloudwego/iasm v0.2.0 h1:1KNIy1I1H9hNNFEEH3DVnI4UujN+1zjpuk6gwHLTssg= +github.com/cloudwego/iasm v0.2.0/go.mod h1:8rXZaNYT2n95jn+zTI1sDr+IgcD2GVs0nlbbQPiEFhY= +github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= +github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c= +github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= +github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f h1:lO4WD4F/rVNCu3HqELle0jiPLLBs70cWOduZpkS1E78= +github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cuUVRXasLTGF7a8hSLbxyZXjz+1KgoB3wDUb6vlszIc= +github.com/gabriel-vasile/mimetype v1.4.3 h1:in2uUcidCuFcDKtdcBxlR0rJ1+fsokWf+uqxgUFjbI0= +github.com/gabriel-vasile/mimetype v1.4.3/go.mod h1:d8uq/6HKRL6CGdk+aubisF/M5GcPfT7nKyLpA0lbSSk= +github.com/gin-contrib/sse v0.1.0 
h1:Y/yl/+YNO8GZSjAhjMsSuLt29uWRFHdHYUb5lYOV9qE= +github.com/gin-contrib/sse v0.1.0/go.mod h1:RHrZQHXnP2xjPF+u1gW/2HnVO7nvIa9PG3Gm+fLHvGI= +github.com/gin-gonic/gin v1.10.0 h1:nTuyha1TYqgedzytsKYqna+DfLos46nTv2ygFy86HFU= +github.com/gin-gonic/gin v1.10.0/go.mod h1:4PMNQiOhvDRa013RKVbsiNwoyezlm2rm0uX/T7kzp5Y= +github.com/go-playground/assert/v2 v2.2.0 h1:JvknZsQTYeFEAhQwI4qEt9cyV5ONwRHC+lYKSsYSR8s= +github.com/go-playground/assert/v2 v2.2.0/go.mod h1:VDjEfimB/XKnb+ZQfWdccd7VUvScMdVu0Titje2rxJ4= +github.com/go-playground/locales v0.14.1 h1:EWaQ/wswjilfKLTECiXz7Rh+3BjFhfDFKv/oXslEjJA= +github.com/go-playground/locales v0.14.1/go.mod h1:hxrqLVvrK65+Rwrd5Fc6F2O76J/NuW9t0sjnWqG1slY= +github.com/go-playground/universal-translator v0.18.1 h1:Bcnm0ZwsGyWbCzImXv+pAJnYK9S473LQFuzCbDbfSFY= +github.com/go-playground/universal-translator v0.18.1/go.mod h1:xekY+UJKNuX9WP91TpwSH2VMlDf28Uj24BCp08ZFTUY= +github.com/go-playground/validator/v10 v10.20.0 h1:K9ISHbSaI0lyB2eWMPJo+kOS/FBExVwjEviJTixqxL8= +github.com/go-playground/validator/v10 v10.20.0/go.mod h1:dbuPbCMFw/DrkbEynArYaCwl3amGuJotoKCe95atGMM= +github.com/goccy/go-json v0.10.2 h1:CrxCmQqYDkv1z7lO7Wbh2HN93uovUHgrECaO5ZrCXAU= +github.com/goccy/go-json v0.10.2/go.mod h1:6MelG93GURQebXPDq3khkgXZkazVtN9CRI+MGFi0w8I= +github.com/google/go-cmp v0.5.5 h1:Khx7svrCpmxxtHBq5j2mp/xVjsi8hQMfNLvJFAlrGgU= +github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE= +github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg= +github.com/json-iterator/go v1.1.12 h1:PV8peI4a0ysnczrg+LtxykD8LfKY9ML6u2jnxaEnrnM= +github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo= +github.com/klauspost/cpuid/v2 v2.0.9/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg= +github.com/klauspost/cpuid/v2 v2.2.7 h1:ZWSB3igEs+d0qvnxR/ZBzXVmxkgt8DdzP6m9pfuVLDM= +github.com/klauspost/cpuid/v2 v2.2.7/go.mod h1:Lcz8mBdAVJIBVzewtcLocK12l3Y+JytZYpaMropDUws= +github.com/knz/go-libedit v1.10.1/go.mod h1:MZTVkCWyz0oBc7JOWP3wNAzd002ZbM/5hgShxwh4x8M= +github.com/leodido/go-urn v1.4.0 h1:WT9HwE9SGECu3lg4d/dIA+jxlljEa1/ffXKmRjqdmIQ= +github.com/leodido/go-urn v1.4.0/go.mod h1:bvxc+MVxLKB4z00jd1z+Dvzr47oO32F/QSNjSBOlFxI= +github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY= +github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y= +github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q= +github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg= +github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q= +github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M= +github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk= +github.com/pelletier/go-toml/v2 v2.2.2 h1:aYUidT7k73Pcl9nb2gScu7NSrKCSHIDE89b3+6Wq+LM= +github.com/pelletier/go-toml/v2 v2.2.2/go.mod h1:1t835xjRzz80PqgE6HHgN2JOsmgYu/h4qDAS4n929Rs= +github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM= +github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4= +github.com/redis/go-redis/v9 v9.5.3 h1:fOAp1/uJG+ZtcITgZOfYFmTKPE7n4Vclj1wZFgRciUU= +github.com/redis/go-redis/v9 v9.5.3/go.mod h1:hdY0cQFCN4fnSYT6TkisLufl/4W5UIXyv0b/CLO2V2M= +github.com/stretchr/objx v0.1.0/go.mod 
h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME= +github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw= +github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo= +github.com/stretchr/objx v0.5.2/go.mod h1:FRsXN1f5AsAjCGJKqEizvkpNtU+EGNCLh3NxZ/8L+MA= +github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI= +github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg= +github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg= +github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU= +github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4= +github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo= +github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg= +github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY= +github.com/twitchyliquid64/golang-asm v0.15.1 h1:SU5vSMR7hnwNxj24w34ZyCi/FmDZTkS4MhqMhdFk5YI= +github.com/twitchyliquid64/golang-asm v0.15.1/go.mod h1:a1lVb/DtPvCB8fslRZhAngC2+aY1QWCk3Cedj/Gdt08= +github.com/ugorji/go/codec v1.2.12 h1:9LC83zGrHhuUA9l16C9AHXAqEV/2wBQ4nkvumAE65EE= +github.com/ugorji/go/codec v1.2.12/go.mod h1:UNopzCgEMSXjBc6AOMqYvWC1ktqTAfzJZUZgYf6w6lg= +golang.org/x/arch v0.0.0-20210923205945-b76863e36670/go.mod h1:5om86z9Hs0C8fWVUuoMHwpExlXzs5Tkyp9hOrfG7pp8= +golang.org/x/arch v0.8.0 h1:3wRIsP3pM4yUptoR96otTUOXI367OS0+c9eeRi9doIc= +golang.org/x/arch v0.8.0/go.mod h1:FEVrYAQjsQXMVJ1nsMoVVXPZg6p2JE2mx8psSWTDQys= +golang.org/x/crypto v0.23.0 h1:dIJU/v2J8Mdglj/8rJ6UUOM3Zc9zLZxVZwwxMooUSAI= +golang.org/x/crypto v0.23.0/go.mod h1:CKFgDieR+mRhux2Lsu27y0fO304Db0wZe70UKqHu0v8= +golang.org/x/net v0.25.0 h1:d/OCCoBEUq33pjydKrGQhw7IlUPI2Oylr+8qLx49kac= +golang.org/x/net v0.25.0/go.mod h1:JkAGAh7GEvH74S6FOH42FLoXpXbE/aqXSrIQjXgsiwM= +golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= +golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= +golang.org/x/sys v0.20.0 h1:Od9JTbYCk261bKm4M/mw7AklTlFYIa0bIp9BgSm1S8Y= +golang.org/x/sys v0.20.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA= +golang.org/x/text v0.15.0 h1:h1V/4gjBv8v9cjcR6+AR5+/cIYK5N/WAgiv4xlsEtAk= +golang.org/x/text v0.15.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU= +golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543 h1:E7g+9GITq07hpfrRu66IVDexMakfv52eLZ2CXBWiKr4= +golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0= +google.golang.org/protobuf v1.34.1 h1:9ddQBjfCyZPOHPUiPxpYESBLc+T8P3E+Vo4IbKZgFWg= +google.golang.org/protobuf v1.34.1/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos= +gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM= +gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0= +gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM= +gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA= +gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM= +nullprogram.com/x/optparse v1.0.0/go.mod h1:KdyPE+Igbe0jQUrVfMqDMeJQIJZEuyV7pjYmp6pbG50= +rsc.io/pdf v0.1.1/go.mod h1:n8OzWcQ6Sp37PL01nO98y4iUCRdTGarVfzxY20ICaU4= diff --git a/anubis-eval/tests-setup/cache-stub/main.go 
b/anubis-eval/tests-setup/cache-stub/main.go
new file mode 100644
index 0000000..ab52c96
--- /dev/null
+++ b/anubis-eval/tests-setup/cache-stub/main.go
@@ -0,0 +1,84 @@
+package main
+
+import (
+	"github.com/gin-gonic/gin"
+	"github.com/redis/go-redis/v9"
+	"os"
+)
+
+type State struct {
+	Key   string `json:"key"`
+	Value string `json:"value"`
+}
+
+func main() {
+	// read the redis dsn from the environment variable REDIS_DSN
+	redisDSN := os.Getenv("REDIS_DSN")
+
+	if redisDSN == "" {
+		redisDSN = "redis://localhost:6379"
+	}
+
+	// create a new redis client
+	opt, err := redis.ParseURL(redisDSN)
+	if err != nil {
+		panic(err)
+	}
+
+	client := redis.NewClient(opt)
+
+	r := gin.Default()
+	r.GET("/healthy", func(ctx *gin.Context) {
+		// check if the redis server is up
+		_, err := client.Ping(ctx).Result()
+		if err != nil {
+			ctx.JSON(500, gin.H{
+				"status": "unhealthy",
+				"error":  err.Error(),
+			})
+		} else {
+			ctx.JSON(200, gin.H{
+				"status": "healthy",
+			})
+		}
+	})
+
+	r.GET("/v1.0/state/statestore/:key", func(ctx *gin.Context) {
+		key := ctx.Param("key")
+		val, err := client.Get(ctx, key).Result()
+		if err != nil {
+			ctx.JSON(404, gin.H{
+				"error": err.Error(),
+			})
+		} else {
+			ctx.JSON(200, val)
+		}
+	})
+
+	r.POST("/v1.0/state/statestore", func(ctx *gin.Context) {
+		var state []State
+		if err := ctx.BindJSON(&state); err != nil {
+			ctx.JSON(400, gin.H{
+				"error": err.Error(),
+			})
+			return
+		}
+
+		for _, s := range state {
+			if err := client.Set(ctx, s.Key, s.Value, 0).Err(); err != nil {
+				ctx.JSON(500, gin.H{
+					"error": err.Error(),
+				})
+				// stop after the first failed write so the handler does not
+				// send a second (success) response below
+				return
+			}
+		}
+
+		ctx.JSON(200, gin.H{
+			"status": "success",
+		})
+	})
+
+	if err := r.Run(); err != nil {
+		return
+	}
+}
diff --git a/anubis-eval/tests-setup/cache-stub/requests/get_item.http b/anubis-eval/tests-setup/cache-stub/requests/get_item.http
new file mode 100644
index 0000000..d4d55d7
--- /dev/null
+++ b/anubis-eval/tests-setup/cache-stub/requests/get_item.http
@@ -0,0 +1 @@
+GET http://127.0.0.1:8080/v1.0/state/statestore/e8d818c3-cb9b-4164-a1f8-ba0967008086-2-input
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/cache-stub/requests/healthy.http b/anubis-eval/tests-setup/cache-stub/requests/healthy.http
new file mode 100644
index 0000000..4dcf69f
--- /dev/null
+++ b/anubis-eval/tests-setup/cache-stub/requests/healthy.http
@@ -0,0 +1 @@
+GET http://127.0.0.1:8080/healthy
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/cache-stub/requests/set_item.http b/anubis-eval/tests-setup/cache-stub/requests/set_item.http
new file mode 100644
index 0000000..1578b8c
--- /dev/null
+++ b/anubis-eval/tests-setup/cache-stub/requests/set_item.http
@@ -0,0 +1,13 @@
+POST http://127.0.0.1:8080/v1.0/state/statestore
+Content-Type: application/json
+
+[
+  {
+    "key": "hello",
+    "value": "there"
+  },
+  {
+    "key": "foo",
+    "value": "bar"
+  }
+]
\ No newline at end of file
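The stub above mimics the two Dapr state-store routes the evaluator talks to, backed directly by Redis. As a rough sketch of how cached test data is read back over that API — assuming the reqwest crate; the real DaprClient implementation is not part of this patch:

// Keys follow the "{problem_id}-{test_id}-input" convention used by
// save_test_for_problem earlier in this patch.
async fn fetch_cached_input(problem_id: &str, test_id: usize) -> Result<String, reqwest::Error> {
    let key = format!("{}-{}-input", problem_id, test_id);
    let url = format!("http://127.0.0.1:8080/v1.0/state/statestore/{}", key);
    // The Go stub serializes the stored value as a JSON string.
    reqwest::get(&url).await?.json::<String>().await
}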
diff --git a/anubis-eval/tests-setup/docker-compose.yaml b/anubis-eval/tests-setup/docker-compose.yaml
new file mode 100644
index 0000000..50710ba
--- /dev/null
+++ b/anubis-eval/tests-setup/docker-compose.yaml
@@ -0,0 +1,69 @@
+version: "3.8"
+
+services:
+  anubis-psql-db:
+    image: postgres:14.1
+    command: postgres -c 'max_connections=250'
+    restart: unless-stopped
+    ports:
+      - "5433:5432"
+    env_file:
+      - ../.env.template
+
+  server:
+    image: judge0/judge0:1.13.0
+    volumes:
+      - ../.env.judge0.template:/judge0.conf:ro
+    ports:
+      - "2358:2358"
+    privileged: true
+    restart: unless-stopped
+    depends_on:
+      workers:
+        condition: service_started
+      redis:
+        condition: service_started
+      db:
+        condition: service_started
+
+  workers:
+    image: judge0/judge0:1.13.0
+    command: [ "./scripts/workers" ]
+    volumes:
+      - ../.env.judge0.template:/judge0.conf:ro
+    privileged: true
+    restart: unless-stopped
+
+  db:
+    image: postgres:13.0
+    env_file: ../.env.judge0.template
+    restart: unless-stopped
+
+  redis:
+    image: redis:6.0
+    command:
+      [
+        "bash",
+        "-c",
+        'docker-entrypoint.sh --appendonly yes --requirepass "$$REDIS_PASSWORD"',
+      ]
+    ports:
+      - "6379:6379"
+    env_file: ../.env.judge0.template
+    restart: unless-stopped
+
+  cache-stub:
+    build:
+      context: ./cache-stub
+      dockerfile: Dockerfile
+    ports:
+      - "8080:8080"
+    environment:
+      - REDIS_DSN=redis://default:pass@redis:6379
+    restart: unless-stopped
+
+  rabbitmq:
+    image: rabbitmq:3-management-alpine
+    ports:
+      - "5672:5672"
+    restart: unless-stopped
diff --git a/anubis-eval/tests-setup/fixtures/problems.yaml b/anubis-eval/tests-setup/fixtures/problems.yaml
new file mode 100644
index 0000000..e5035ab
--- /dev/null
+++ b/anubis-eval/tests-setup/fixtures/problems.yaml
@@ -0,0 +1,17 @@
+SumAB:
+  id: e8d818c3-cb9b-4164-a1f8-ba0967008086
+  name: SumAB
+  proposer_id: dabdfa1f-0e29-4c74-81e4-39e91b126dff
+  is_published: true
+  time: 0.2
+  stack_memory: 8
+  total_memory: 64
+
+DiffAB:
+  id: 855efecc-0953-479f-b9fd-a64e78a8c1e6
+  name: DiffAB
+  proposer_id: db4c3d60-229f-4d49-84f9-ee560f9160d5
+  is_published: false
+  time: 0.2
+  stack_memory: 8
+  total_memory: 64
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/fixtures/submissions.yaml b/anubis-eval/tests-setup/fixtures/submissions.yaml
new file mode 100644
index 0000000..4aa5298
--- /dev/null
+++ b/anubis-eval/tests-setup/fixtures/submissions.yaml
@@ -0,0 +1,59 @@
+Ordinary_SumAB_Submission1:
+  id: a0acb0f8-b3ce-4784-86a3-149fd53d1fac
+  user_id: f0833576-d077-4c08-89b5-9fc47dfcb7d2
+  problem_id: e8d818c3-cb9b-4164-a1f8-ba0967008086
+  language: "Rust"
+  source_code: "fn main() { println!(\"Hello, world!\"); }"
+  status: "Accepted"
+  score: 100
+  created_at: "2021-01-01T00:00:00Z"
+  avg_time: 0.1
+  avg_memory: 32
+
+Proposer_SumAB_Submission2:
+  id: 1b1b1b1b-1b1b-1b1b-1b1b-1b1b1b1b1b1b
+  user_id: dabdfa1f-0e29-4c74-81e4-39e91b126dff
+  problem_id: e8d818c3-cb9b-4164-a1f8-ba0967008086
+  language: "Rust"
+  source_code: "fn main() { println!(\"Hello, world!\"); }"
+  status: "Accepted"
+  score: 100
+  created_at: "2021-01-01T00:00:00Z"
+  avg_time: 0.2
+  avg_memory: 64
+
+Admin_SumAB_Submission3:
+  id: 2b2b2b2b-2b2b-2b2b-2b2b-2b2b2b2b2b2b
+  user_id: db4c3d60-229f-4d49-84f9-ee560f9160d5
+  problem_id: e8d818c3-cb9b-4164-a1f8-ba0967008086
+  language: "OCaml"
+  source_code: "print_endline \"Hello, world!\""
+  status: "Rejected"
+  score: 50
+  created_at: "2021-01-01T00:00:00Z"
+  avg_time: 0.2
+  avg_memory: 32
+
+Admin_DiffAB_Submission4:
+  id: 3c3c3c3c-3c3c-3c3c-3c3c-3c3c3c3c3c3c
+  user_id: db4c3d60-229f-4d49-84f9-ee560f9160d5
+  problem_id: 855efecc-0953-479f-b9fd-a64e78a8c1e6
+  language: "Python"
+  source_code: "print(\"Hello, world!\")"
+  status: "Accepted"
+  score: 100
+  created_at: "2021-01-01T00:00:00Z"
+  avg_time: 0.2
+  avg_memory: 64
+
+Ordinary_DiffAB_Submission5:
+  id: 4d4d4d4d-4d4d-4d4d-4d4d-4d4d4d4d4d4d
+  user_id: f0833576-d077-4c08-89b5-9fc47dfcb7d2
+  problem_id: 855efecc-0953-479f-b9fd-a64e78a8c1e6
+  language: "Python"
+  source_code: "print(\"Hello, world!\")"
+  status: "Rejected"
+  score: 0
+  created_at: "2021-01-01T00:00:00Z"
+  avg_time: 0.2
+  avg_memory: 32
\ No newline at end of file
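The fixture records are linked by raw UUID strings rather than by reference, so the ids must line up across files. A minimal sketch of that invariant, using the loaders defined earlier in this patch:

// Illustrative only: Ordinary_SumAB_Submission1 points at the SumAB problem.
let sub = SUBMISSIONS.get("Ordinary_SumAB_Submission1").unwrap();
let problem = PROBLEMS.get("SumAB").unwrap();
assert_eq!(sub.problem_id, problem.id);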
diff --git a/anubis-eval/tests-setup/fixtures/test_cases.yaml b/anubis-eval/tests-setup/fixtures/test_cases.yaml
new file mode 100644
index 0000000..e9bf0b2
--- /dev/null
+++ b/anubis-eval/tests-setup/fixtures/test_cases.yaml
@@ -0,0 +1,103 @@
+Submission1_TestCase1:
+  token: a478f90d-b4d2-49d5-b85f-9b28afbfcfb7
+  submission_id: a0acb0f8-b3ce-4784-86a3-149fd53d1fac
+  status: "Accepted"
+  time: 0.1
+  memory: 32
+  eval_message: "Accepted"
+  stdout: ""
+  stderr: ""
+  testcase_id: 1
+  expected_score: 50
+  compile_output: ""
+
+Submission1_TestCase2:
+  token: 65f28cfd-02b0-4f52-85ee-b93bb7c095a6
+  submission_id: a0acb0f8-b3ce-4784-86a3-149fd53d1fac
+  status: "Accepted"
+  time: 0.1
+  memory: 32
+  eval_message: "Accepted"
+  stdout: ""
+  stderr: ""
+  testcase_id: 2
+  expected_score: 50
+  compile_output: ""
+
+Submission2_TestCase1:
+  token: bd5b08b1-3cb0-40da-8b32-e9ad30e565bb
+  submission_id: 1b1b1b1b-1b1b-1b1b-1b1b-1b1b1b1b1b1b
+  status: "Accepted"
+  time: 0.2
+  memory: 64
+  eval_message: "Accepted"
+  stdout: ""
+  stderr: ""
+  testcase_id: 1
+  expected_score: 50
+  compile_output: ""
+
+Submission2_TestCase2:
+  token: 60ba7e13-f775-46b3-bf4d-c9b1354831a8
+  submission_id: 1b1b1b1b-1b1b-1b1b-1b1b-1b1b1b1b1b1b
+  status: "Accepted"
+  time: 0.2
+  memory: 64
+  eval_message: "Accepted"
+  stdout: ""
+  stderr: ""
+  testcase_id: 2
+  expected_score: 50
+  compile_output: ""
+
+Submission3_TestCase1:
+  token: ddc53d46-1915-4836-83ec-852bec5e9f5e
+  submission_id: 2b2b2b2b-2b2b-2b2b-2b2b-2b2b2b2b2b2b
+  status: "Rejected"
+  time: 0.2
+  memory: 32
+  eval_message: "Wrong Answer"
+  stdout: ""
+  stderr: ""
+  testcase_id: 1
+  expected_score: 50
+  compile_output: ""
+
+Submission3_TestCase2:
+  token: 1b1b1b1b-1b1b-1b1b-1b1b-1b1b1b1b1b1b
+  submission_id: 2b2b2b2b-2b2b-2b2b-2b2b-2b2b2b2b2b2b
+  status: "Accepted"
+  time: 0.2
+  memory: 32
+  eval_message: "Accepted"
+  stdout: ""
+  stderr: ""
+  testcase_id: 2
+  expected_score: 50
+  compile_output: ""
+
+Submission4_TestCase1:
+  token: 38af049a-208e-4892-9d54-28e6138644b6
+  submission_id: 3c3c3c3c-3c3c-3c3c-3c3c-3c3c3c3c3c3c
+  status: "Accepted"
+  time: 0.2
+  memory: 64
+  eval_message: "Accepted"
+  stdout: ""
+  stderr: ""
+  testcase_id: 1
+  expected_score: 100
+  compile_output: ""
+
+Submission5_TestCase1:
+  token: ed62ccad-6437-4806-aa87-3127cda27826
+  submission_id: 4d4d4d4d-4d4d-4d4d-4d4d-4d4d4d4d4d4d
+  status: "Runtime Error"
+  time: 0.2
+  memory: 32
+  eval_message: "Runtime Error"
+  stdout: ""
+  stderr: "Traceback (most recent call last):\n  File \"main.py\", line 1, in <module>\n    print(\"Hello, world!\")\nNameError: name 'print' is not defined\n"
+  testcase_id: 1
+  expected_score: 0
+  compile_output: ""
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/fixtures/tests.yaml b/anubis-eval/tests-setup/fixtures/tests.yaml
new file mode 100644
index 0000000..fa1c45c
--- /dev/null
+++ b/anubis-eval/tests-setup/fixtures/tests.yaml
@@ -0,0 +1,20 @@
+SumAB_Test1:
+  id: 1
+  problem_id: e8d818c3-cb9b-4164-a1f8-ba0967008086
+  score: 50
+  input_url: "https://example.com/SumAB/input1.txt"
+  output_url: "https://example.com/SumAB/output1.txt"
+
+SumAB_Test2:
+  id: 2
+  problem_id: e8d818c3-cb9b-4164-a1f8-ba0967008086
+  score: 50
+  input_url: "https://example.com/SumAB/input2.txt"
+  output_url: "https://example.com/SumAB/output2.txt"
+
+DiffAB_Test1:
+  id: 1
+  problem_id: 855efecc-0953-479f-b9fd-a64e78a8c1e6
+  score: 100
+  input_url: "https://example.com/DiffAB/input1.txt"
+  output_url: "https://example.com/DiffAB/output1.txt"
\ No newline at end of file
diff --git a/anubis-eval/tests-setup/fixtures/users.yaml
b/anubis-eval/tests-setup/fixtures/users.yaml new file mode 100644 index 0000000..dcc2d8d --- /dev/null +++ b/anubis-eval/tests-setup/fixtures/users.yaml @@ -0,0 +1,17 @@ +Ordinary: + id: f0833576-d077-4c08-89b5-9fc47dfcb7d2 + email: "user@gmail.com" + role: + +Proposer: + id: dabdfa1f-0e29-4c74-81e4-39e91b126dff + email: "proposer@gmail.com" + role: + - !Proposer + +Admin: + id: db4c3d60-229f-4d49-84f9-ee560f9160d5 + email: "admin@gmail.com" + role: + - !Admin + - !Proposer diff --git a/docker-compose.override.yaml b/docker-compose.override.yaml new file mode 100644 index 0000000..6291cef --- /dev/null +++ b/docker-compose.override.yaml @@ -0,0 +1,12 @@ +services: + seeder: + container_name: asgard-seeder + build: + context: seeder + dockerfile: ../seeder/Dockerfile + volumes: + - ./seeder/fixtures.yaml:/temp/fixtures.yaml + - ../ProblemArchive/:/temp/ProblemArchive/ + network_mode: host + profiles: + - seeding \ No newline at end of file diff --git a/docker-compose.yaml b/docker-compose.yaml index c09dfa3..1d9424b 100644 --- a/docker-compose.yaml +++ b/docker-compose.yaml @@ -1,6 +1,6 @@ # Define services for Asgard # TODO: add health checks, restart policies and extract hardcoded ports/tags/etc. to .env files -version: "3.8" +# version: "3.8" x-logging: &default-logging logging: @@ -610,7 +610,7 @@ services: container_name: asgard-redis image: redis:alpine ports: - - "6379" + - "6380:6379" restart: unless-stopped volumes: - dapr-redis-data:/data diff --git a/enki-problems/EnkiProblems.sln.DotSettings b/enki-problems/EnkiProblems.sln.DotSettings deleted file mode 100644 index cb0b2c9..0000000 --- a/enki-problems/EnkiProblems.sln.DotSettings +++ /dev/null @@ -1,23 +0,0 @@ - - True - WARNING - WARNING - WARNING - WARNING - WARNING - WARNING - WARNING - WARNING - Required - Required - Required - Required - False - True - False - False - True - False - False - SQL - \ No newline at end of file diff --git a/enki-problems/src/EnkiProblems.Application.Contracts/EnkiProblemsDtoExtensions.cs b/enki-problems/src/EnkiProblems.Application.Contracts/EnkiProblemsDtoExtensions.cs index 13be7db..92a2719 100644 --- a/enki-problems/src/EnkiProblems.Application.Contracts/EnkiProblemsDtoExtensions.cs +++ b/enki-problems/src/EnkiProblems.Application.Contracts/EnkiProblemsDtoExtensions.cs @@ -8,8 +8,7 @@ public static class EnkiProblemsDtoExtensions public static void Configure() { - OneTimeRunner.Run(() => - { + OneTimeRunner.Run(() => { /* You can add extension properties to DTOs * defined in the depended modules. 
* diff --git a/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/ProblemEvalMetadataUpsertedEvent.cs b/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/ProblemEvalMetadataUpsertedEvent.cs index 8ccecdb..4c840d6 100644 --- a/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/ProblemEvalMetadataUpsertedEvent.cs +++ b/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/ProblemEvalMetadataUpsertedEvent.cs @@ -21,4 +21,4 @@ public class ProblemEvalMetadataUpsertedEvent : EntityDto public IoTypeEnum IoType { get; set; } public IEnumerable Tests { get; set; } -} \ No newline at end of file +} diff --git a/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestDeletedEvent.cs b/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestDeletedEvent.cs index 0a3a560..359fbbb 100644 --- a/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestDeletedEvent.cs +++ b/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestDeletedEvent.cs @@ -6,4 +6,4 @@ namespace EnkiProblems.Problems.Events; public class TestDeletedEvent : EntityDto { public Guid ProblemId { get; set; } -} \ No newline at end of file +} diff --git a/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestUpsertedEvent.cs b/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestUpsertedEvent.cs index f4cfb5f..9071bb2 100644 --- a/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestUpsertedEvent.cs +++ b/enki-problems/src/EnkiProblems.Application.Contracts/Problems/Events/TestUpsertedEvent.cs @@ -10,4 +10,4 @@ public class TestUpsertedEvent : EntityDto public string InputDownloadUrl { get; set; } public string OutputDownloadUrl { get; set; } -} \ No newline at end of file +} diff --git a/enki-problems/src/EnkiProblems.Application/Problems/ProblemAppService.cs b/enki-problems/src/EnkiProblems.Application/Problems/ProblemAppService.cs index 3f7e4c3..f31de36 100644 --- a/enki-problems/src/EnkiProblems.Application/Problems/ProblemAppService.cs +++ b/enki-problems/src/EnkiProblems.Application/Problems/ProblemAppService.cs @@ -69,7 +69,10 @@ public async Task CreateAsync(CreateProblemDto input) problem = await _problemRepository.InsertAsync(problem); - var problemEvalMetadataUpsertedEvent = ObjectMapper.Map(problem); + var problemEvalMetadataUpsertedEvent = ObjectMapper.Map< + Problem, + ProblemEvalMetadataUpsertedEvent + >(problem); await _daprClient.PublishEventAsync( EnkiProblemsConsts.PubSubName, @@ -121,7 +124,9 @@ public async Task> GetListAsync(ProblemListFilterDto } [Authorize] - public async Task> GetListUnpublishedAsync(ProblemListFilterDto input) + public async Task> GetListUnpublishedAsync( + ProblemListFilterDto input + ) { _logger.LogInformation( "Getting unpublished problems list for user {UserId}", @@ -142,8 +147,9 @@ public async Task> GetListUnpublishedAsync(P var problemQueryable = await _problemRepository.GetQueryableAsync(); - problemQueryable = problemQueryable - .Where(p => !p.IsPublished && p.ProposerId == CurrentUser.Id); + problemQueryable = problemQueryable.Where(p => + !p.IsPublished && p.ProposerId == CurrentUser.Id + ); if (!string.IsNullOrEmpty(input.Name)) { @@ -273,8 +279,15 @@ public async Task UpdateAsync(Guid id, UpdateProblemDto input) updatedProblem = await _problemRepository.UpdateAsync(updatedProblem); - var problemEvalMetadataUpsertedEvent = ObjectMapper.Map(updatedProblem); - 
_logger.LogInformation("Publishing ProblemEvalMetadataUpsertedEvent for problem {ProblemId}: {Event}", id, problemEvalMetadataUpsertedEvent); + var problemEvalMetadataUpsertedEvent = ObjectMapper.Map< + Problem, + ProblemEvalMetadataUpsertedEvent + >(updatedProblem); + _logger.LogInformation( + "Publishing ProblemEvalMetadataUpsertedEvent for problem {ProblemId}: {Event}", + id, + problemEvalMetadataUpsertedEvent + ); await _daprClient.PublishEventAsync( EnkiProblemsConsts.PubSubName, @@ -282,9 +295,7 @@ await _daprClient.PublishEventAsync( problemEvalMetadataUpsertedEvent ); - return ObjectMapper.Map( - updatedProblem - ); + return ObjectMapper.Map(updatedProblem); } [Authorize] @@ -304,19 +315,31 @@ public async Task DeleteAsync(Guid id) } // TODO: convert to permission - if (problem.IsPublished && CurrentUser.Roles.All(r => r != EnkiProblemsConsts.AdminRoleName)) + if ( + problem.IsPublished && CurrentUser.Roles.All(r => r != EnkiProblemsConsts.AdminRoleName) + ) { - _logger.LogError("User {UserId} is not allowed to delete problem {ProblemId}", CurrentUser.Id, id); + _logger.LogError( + "User {UserId} is not allowed to delete problem {ProblemId}", + CurrentUser.Id, + id + ); throw new AbpAuthorizationException( EnkiProblemsDomainErrorCodes.NotAllowedToDeletePublishedProblem ); } // TODO: convert to permission - if (CurrentUser.Roles.All(r => r != EnkiProblemsConsts.AdminRoleName) && - CurrentUser.Id != id) + if ( + CurrentUser.Roles.All(r => r != EnkiProblemsConsts.AdminRoleName) + && CurrentUser.Id != id + ) { - _logger.LogError("User {UserId} is not allowed to delete problem {ProblemId}", CurrentUser.Id, id); + _logger.LogError( + "User {UserId} is not allowed to delete problem {ProblemId}", + CurrentUser.Id, + id + ); throw new AbpAuthorizationException( EnkiProblemsDomainErrorCodes.ProblemCannotBeDeleted ); @@ -327,7 +350,11 @@ public async Task DeleteAsync(Guid id) foreach (var problemTest in problemTests) { var deleteResponse = await _testService.DeleteTestAsync( - new DeleteTestRequest { ProblemId = id.ToString(), TestId = problemTest.Id.ToString() } + new DeleteTestRequest + { + ProblemId = id.ToString(), + TestId = problemTest.Id.ToString() + } ); if (deleteResponse.Status.Code != StatusCode.Ok) @@ -338,9 +365,9 @@ public async Task DeleteAsync(Guid id) deleteResponse.Status.Message ); throw new BusinessException( - EnkiProblemsDomainErrorCodes.TestDeleteFailed, - $"Test delete failed with status code {deleteResponse.Status.Code}: {deleteResponse.Status.Message}." - ) + EnkiProblemsDomainErrorCodes.TestDeleteFailed, + $"Test delete failed with status code {deleteResponse.Status.Code}: {deleteResponse.Status.Message}." 
+ ) .WithData("id", problem.Id) .WithData("testId", problemTest.Id); } @@ -632,11 +659,7 @@ public async Task DeleteTestAsync(Guid id, int testId) await _daprClient.PublishEventAsync( EnkiProblemsConsts.PubSubName, EnkiProblemsConsts.TestDeletedTopic, - new TestDeletedEvent - { - Id = testId, - ProblemId = id, - } + new TestDeletedEvent { Id = testId, ProblemId = id, } ); return ObjectMapper.Map(updatedProblem); diff --git a/enki-problems/src/EnkiProblems.Application/Problems/Tests/HermesTestsGrpcService.cs b/enki-problems/src/EnkiProblems.Application/Problems/Tests/HermesTestsGrpcService.cs index aeb111d..e53f1e4 100644 --- a/enki-problems/src/EnkiProblems.Application/Problems/Tests/HermesTestsGrpcService.cs +++ b/enki-problems/src/EnkiProblems.Application/Problems/Tests/HermesTestsGrpcService.cs @@ -111,7 +111,7 @@ public async Task DownloadTestAsync(DownloadRequest input } } - response: + response: return new DownloadTestStreamDto { ProblemId = metadata.ProblemId, diff --git a/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsDomainSharedModule.cs b/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsDomainSharedModule.cs index 10357f2..42365b9 100644 --- a/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsDomainSharedModule.cs +++ b/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsDomainSharedModule.cs @@ -43,8 +43,7 @@ public override void ConfigureServices(ServiceConfigurationContext context) Configure(options => { options - .Resources - .Add("en") + .Resources.Add("en") .AddBaseTypes(typeof(AbpValidationResource)) .AddVirtualJson("/Localization/EnkiProblems"); diff --git a/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsGlobalFeatureConfigurator.cs b/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsGlobalFeatureConfigurator.cs index 2f737f5..ca53885 100644 --- a/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsGlobalFeatureConfigurator.cs +++ b/enki-problems/src/EnkiProblems.Domain.Shared/EnkiProblemsGlobalFeatureConfigurator.cs @@ -8,8 +8,7 @@ public static class EnkiProblemsGlobalFeatureConfigurator public static void Configure() { - OneTimeRunner.Run(() => - { + OneTimeRunner.Run(() => { /* You can configure (enable/disable) global features of the used modules here. * * YOU CAN SAFELY DELETE THIS CLASS AND REMOVE ITS USAGES IF YOU DON'T NEED TO IT! 
diff --git a/enki-problems/src/EnkiProblems.Domain/Data/EnkiProblemsDbMigrationService.cs b/enki-problems/src/EnkiProblems.Domain/Data/EnkiProblemsDbMigrationService.cs index 3c6b16a..7d8b1d9 100644 --- a/enki-problems/src/EnkiProblems.Domain/Data/EnkiProblemsDbMigrationService.cs +++ b/enki-problems/src/EnkiProblems.Domain/Data/EnkiProblemsDbMigrationService.cs @@ -54,8 +54,7 @@ public async Task MigrateAsync() if (tenant.ConnectionStrings.Any()) { var tenantConnectionStrings = tenant - .ConnectionStrings - .Select(x => x.Value) + .ConnectionStrings.Select(x => x.Value) .ToList(); if (!migratedDatabaseSchemas.IsSupersetOf(tenantConnectionStrings)) diff --git a/enki-problems/src/EnkiProblems.Domain/OpenIddict/OpenIddictDataSeedContributor.cs b/enki-problems/src/EnkiProblems.Domain/OpenIddict/OpenIddictDataSeedContributor.cs index 8b84145..ce74ef1 100644 --- a/enki-problems/src/EnkiProblems.Domain/OpenIddict/OpenIddictDataSeedContributor.cs +++ b/enki-problems/src/EnkiProblems.Domain/OpenIddict/OpenIddictDataSeedContributor.cs @@ -82,9 +82,8 @@ private async Task CreateApplicationsAsync() var webClientId = configurationSection["EnkiProblems_Web:ClientId"]; if (!webClientId.IsNullOrWhiteSpace()) { - var webClientRootUrl = configurationSection["EnkiProblems_Web:RootUrl"].EnsureEndsWith( - '/' - ); + var webClientRootUrl = configurationSection["EnkiProblems_Web:RootUrl"] + .EnsureEndsWith('/'); /* EnkiProblems_Web client is only needed if you created a tiered * solution. Otherwise, you can delete this client. */ @@ -110,9 +109,8 @@ await CreateApplicationAsync( var consoleAndAngularClientId = configurationSection["EnkiProblems_App:ClientId"]; if (!consoleAndAngularClientId.IsNullOrWhiteSpace()) { - var consoleAndAngularClientRootUrl = configurationSection[ - "EnkiProblems_App:RootUrl" - ]?.TrimEnd('/'); + var consoleAndAngularClientRootUrl = configurationSection["EnkiProblems_App:RootUrl"] + ?.TrimEnd('/'); await CreateApplicationAsync( name: consoleAndAngularClientId!, type: OpenIddictConstants.ClientTypes.Public, @@ -161,7 +159,8 @@ await CreateApplicationAsync( { var blazorServerTieredRootUrl = configurationSection[ "EnkiProblems_BlazorServerTiered:RootUrl" - ].EnsureEndsWith('/'); + ] + .EnsureEndsWith('/'); await CreateApplicationAsync( name: blazorServerTieredClientId!, @@ -273,9 +272,9 @@ private async Task CreateApplicationAsync( }.All(grantTypes.Contains) ) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.ResponseTypes.CodeIdToken); + application.Permissions.Add( + OpenIddictConstants.Permissions.ResponseTypes.CodeIdToken + ); if ( string.Equals( @@ -285,12 +284,12 @@ private async Task CreateApplicationAsync( ) ) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.ResponseTypes.CodeIdTokenToken); - application - .Permissions - .Add(OpenIddictConstants.Permissions.ResponseTypes.CodeToken); + application.Permissions.Add( + OpenIddictConstants.Permissions.ResponseTypes.CodeIdTokenToken + ); + application.Permissions.Add( + OpenIddictConstants.Permissions.ResponseTypes.CodeToken + ); } } @@ -313,9 +312,9 @@ private async Task CreateApplicationAsync( { if (grantType == OpenIddictConstants.GrantTypes.AuthorizationCode) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.GrantTypes.AuthorizationCode); + application.Permissions.Add( + OpenIddictConstants.Permissions.GrantTypes.AuthorizationCode + ); application.Permissions.Add(OpenIddictConstants.Permissions.ResponseTypes.Code); } @@ -324,9 +323,9 @@ private async Task 
CreateApplicationAsync( || grantType == OpenIddictConstants.GrantTypes.Implicit ) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.Endpoints.Authorization); + application.Permissions.Add( + OpenIddictConstants.Permissions.Endpoints.Authorization + ); } if ( @@ -338,55 +337,55 @@ private async Task CreateApplicationAsync( ) { application.Permissions.Add(OpenIddictConstants.Permissions.Endpoints.Token); - application - .Permissions - .Add(OpenIddictConstants.Permissions.Endpoints.Revocation); - application - .Permissions - .Add(OpenIddictConstants.Permissions.Endpoints.Introspection); + application.Permissions.Add( + OpenIddictConstants.Permissions.Endpoints.Revocation + ); + application.Permissions.Add( + OpenIddictConstants.Permissions.Endpoints.Introspection + ); } if (grantType == OpenIddictConstants.GrantTypes.ClientCredentials) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.GrantTypes.ClientCredentials); + application.Permissions.Add( + OpenIddictConstants.Permissions.GrantTypes.ClientCredentials + ); } if (grantType == OpenIddictConstants.GrantTypes.Implicit) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.GrantTypes.Implicit); + application.Permissions.Add( + OpenIddictConstants.Permissions.GrantTypes.Implicit + ); } if (grantType == OpenIddictConstants.GrantTypes.Password) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.GrantTypes.Password); + application.Permissions.Add( + OpenIddictConstants.Permissions.GrantTypes.Password + ); } if (grantType == OpenIddictConstants.GrantTypes.RefreshToken) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.GrantTypes.RefreshToken); + application.Permissions.Add( + OpenIddictConstants.Permissions.GrantTypes.RefreshToken + ); } if (grantType == OpenIddictConstants.GrantTypes.DeviceCode) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.GrantTypes.DeviceCode); + application.Permissions.Add( + OpenIddictConstants.Permissions.GrantTypes.DeviceCode + ); application.Permissions.Add(OpenIddictConstants.Permissions.Endpoints.Device); } if (grantType == OpenIddictConstants.GrantTypes.Implicit) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.ResponseTypes.IdToken); + application.Permissions.Add( + OpenIddictConstants.Permissions.ResponseTypes.IdToken + ); if ( string.Equals( type, @@ -395,20 +394,20 @@ private async Task CreateApplicationAsync( ) ) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.ResponseTypes.IdTokenToken); - application - .Permissions - .Add(OpenIddictConstants.Permissions.ResponseTypes.Token); + application.Permissions.Add( + OpenIddictConstants.Permissions.ResponseTypes.IdTokenToken + ); + application.Permissions.Add( + OpenIddictConstants.Permissions.ResponseTypes.Token + ); } } if (!buildInGrantTypes.Contains(grantType)) { - application - .Permissions - .Add(OpenIddictConstants.Permissions.Prefixes.GrantType + grantType); + application.Permissions.Add( + OpenIddictConstants.Permissions.Prefixes.GrantType + grantType + ); } } @@ -429,9 +428,9 @@ private async Task CreateApplicationAsync( } else { - application - .Permissions - .Add(OpenIddictConstants.Permissions.Prefixes.Scope + scope); + application.Permissions.Add( + OpenIddictConstants.Permissions.Prefixes.Scope + scope + ); } } diff --git a/enki-problems/src/EnkiProblems.Domain/Problems/ProblemManager.cs b/enki-problems/src/EnkiProblems.Domain/Problems/ProblemManager.cs index 5c0dcb1..07b2b9b 100644 --- 
a/enki-problems/src/EnkiProblems.Domain/Problems/ProblemManager.cs +++ b/enki-problems/src/EnkiProblems.Domain/Problems/ProblemManager.cs @@ -85,7 +85,9 @@ public async Task UpdateAsync( ); } - var oldProblem = await _problemRepository.FirstOrDefaultAsync(p => p.Name == name && p.Id != problem.Id); + var oldProblem = await _problemRepository.FirstOrDefaultAsync(p => + p.Name == name && p.Id != problem.Id + ); if (oldProblem is not null) { _logger.LogError("Problem {Name} already exists", name); diff --git a/enki-problems/src/EnkiProblems.HttpApi.Host/EnkiProblemsHttpApiHostModule.cs b/enki-problems/src/EnkiProblems.HttpApi.Host/EnkiProblemsHttpApiHostModule.cs index e4cce2a..abf1e18 100644 --- a/enki-problems/src/EnkiProblems.HttpApi.Host/EnkiProblemsHttpApiHostModule.cs +++ b/enki-problems/src/EnkiProblems.HttpApi.Host/EnkiProblemsHttpApiHostModule.cs @@ -67,21 +67,20 @@ public override void ConfigureServices(ServiceConfigurationContext context) private void ConfigureLogger(ServiceConfigurationContext context, IConfiguration configuration) { - context - .Services - .AddLogging(config => - { - config.AddConfiguration(configuration.GetSection("Logging")); - config.AddConsole(); - config.AddDebug(); - }); + context.Services.AddLogging(config => + { + config.AddConfiguration(configuration.GetSection("Logging")); + config.AddConsole(); + config.AddDebug(); + }); } - private void ConfigureHttpClient(ServiceConfigurationContext context, IConfiguration configuration) + private void ConfigureHttpClient( + ServiceConfigurationContext context, + IConfiguration configuration + ) { - context - .Services - .AddHttpClient(); + context.Services.AddHttpClient(); } private void ConfigureDapr(ServiceConfigurationContext context, IConfiguration configuration) @@ -90,9 +89,8 @@ private void ConfigureDapr(ServiceConfigurationContext context, IConfiguration c var address = configuration["Dapr:GrpcEndpoint"]; context - .Services - .AddSingleton( - _ => new() { HermesContext = new() { { "dapr-app-id", hermesAppId! } } } + .Services.AddSingleton(_ => + new() { HermesContext = new() { { "dapr-app-id", hermesAppId! 
} } } ) .AddDaprClient(options => { @@ -109,9 +107,9 @@ IConfiguration configuration var address = configuration["Dapr:GrpcEndpoint"]; var channel = GrpcChannel.ForAddress(address!); - context - .Services - .AddSingleton(_ => new(channel)); + context.Services.AddSingleton(_ => + new(channel) + ); context.Services.AddScoped(); } @@ -129,15 +127,13 @@ private void ConfigureConventionalControllers() { options.ConventionalControllers.FormBodyBindingIgnoredTypes.Add(typeof(CreateTestDto)); options.ConventionalControllers.FormBodyBindingIgnoredTypes.Add(typeof(UpdateTestDto)); - options - .ConventionalControllers - .Create( - typeof(EnkiProblemsApplicationModule).Assembly, - opts => - { - opts.RootPath = "enki"; - } - ); + options.ConventionalControllers.Create( + typeof(EnkiProblemsApplicationModule).Assembly, + opts => + { + opts.RootPath = "enki"; + } + ); }); } @@ -147,8 +143,7 @@ IConfiguration configuration ) { context - .Services - .AddAuthentication(JwtBearerDefaults.AuthenticationScheme) + .Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme) .AddJwtBearer(options => { options.TokenValidationParameters = new TokenValidationParameters @@ -170,44 +165,42 @@ private static void ConfigureSwaggerServices( IConfiguration configuration ) { - context - .Services - .AddAbpSwaggerGen(options => - { - options.SwaggerDoc( - "v1", - new OpenApiInfo { Title = "EnkiProblems API", Version = "v1" } - ); - options.DocInclusionPredicate((docName, description) => true); - options.CustomSchemaIds(type => type.FullName); - options.AddSecurityDefinition( - "Bearer", - new OpenApiSecurityScheme - { - Description = - "JWT Authorization header using the Bearer scheme. Example: \"Authorization: Bearer {token}\"", - Name = "Authorization", - In = ParameterLocation.Header, - Type = SecuritySchemeType.ApiKey - } - ); - options.AddSecurityRequirement( - new OpenApiSecurityRequirement + context.Services.AddAbpSwaggerGen(options => + { + options.SwaggerDoc( + "v1", + new OpenApiInfo { Title = "EnkiProblems API", Version = "v1" } + ); + options.DocInclusionPredicate((docName, description) => true); + options.CustomSchemaIds(type => type.FullName); + options.AddSecurityDefinition( + "Bearer", + new OpenApiSecurityScheme + { + Description = + "JWT Authorization header using the Bearer scheme. 
Example: \"Authorization: Bearer {token}\"", + Name = "Authorization", + In = ParameterLocation.Header, + Type = SecuritySchemeType.ApiKey + } + ); + options.AddSecurityRequirement( + new OpenApiSecurityRequirement + { { + new OpenApiSecurityScheme { - new OpenApiSecurityScheme + Reference = new OpenApiReference { - Reference = new OpenApiReference - { - Type = ReferenceType.SecurityScheme, - Id = "Bearer" - } - }, - Array.Empty() - } + Type = ReferenceType.SecurityScheme, + Id = "Bearer" + } + }, + Array.Empty() } - ); - }); + } + ); + }); } private void ConfigureDataProtection( @@ -217,8 +210,7 @@ IWebHostEnvironment hostingEnvironment ) { var dataProtectionBuilder = context - .Services - .AddDataProtection() + .Services.AddDataProtection() .SetApplicationName("EnkiProblems"); if (!hostingEnvironment.IsDevelopment()) { @@ -235,39 +227,33 @@ private void ConfigureDistributedLocking( IConfiguration configuration ) { - context - .Services - .AddSingleton(sp => - { - var connection = ConnectionMultiplexer.Connect( - configuration["Redis:Configuration"] - ); - return new RedisDistributedSynchronizationProvider(connection.GetDatabase()); - }); + context.Services.AddSingleton(sp => + { + var connection = ConnectionMultiplexer.Connect(configuration["Redis:Configuration"]); + return new RedisDistributedSynchronizationProvider(connection.GetDatabase()); + }); } private void ConfigureCors(ServiceConfigurationContext context, IConfiguration configuration) { - context - .Services - .AddCors(options => + context.Services.AddCors(options => + { + options.AddDefaultPolicy(builder => { - options.AddDefaultPolicy(builder => - { - builder - .WithOrigins( - configuration["AllowedOrigins"] - ?.Split(";", StringSplitOptions.RemoveEmptyEntries) - .Select(o => o.RemovePostFix("/")) - .ToArray() ?? Array.Empty() - ) - .WithAbpExposedHeaders() - .SetIsOriginAllowedToAllowWildcardSubdomains() - .AllowAnyHeader() - .AllowAnyMethod() - .AllowCredentials(); - }); + builder + .WithOrigins( + configuration["AllowedOrigins"] + ?.Split(";", StringSplitOptions.RemoveEmptyEntries) + .Select(o => o.RemovePostFix("/")) + .ToArray() ?? 
Array.Empty() + ) + .WithAbpExposedHeaders() + .SetIsOriginAllowedToAllowWildcardSubdomains() + .AllowAnyHeader() + .AllowAnyMethod() + .AllowCredentials(); }); + }); } public override void OnApplicationInitialization(ApplicationInitializationContext context) diff --git a/enki-problems/src/EnkiProblems.HttpApi.Host/Program.cs b/enki-problems/src/EnkiProblems.HttpApi.Host/Program.cs index 5f82077..ae6e138 100644 --- a/enki-problems/src/EnkiProblems.HttpApi.Host/Program.cs +++ b/enki-problems/src/EnkiProblems.HttpApi.Host/Program.cs @@ -14,22 +14,15 @@ public static async Task Main(string[] args) { Log.Logger = new LoggerConfiguration() #if DEBUG - .MinimumLevel - .Debug() + .MinimumLevel.Debug() #else - .MinimumLevel - .Information() + .MinimumLevel.Information() #endif - .MinimumLevel - .Override("Microsoft", LogEventLevel.Information) - .MinimumLevel - .Override("Microsoft.EntityFrameworkCore", LogEventLevel.Warning) - .Enrich - .FromLogContext() - .WriteTo - .Async(c => c.File("Logs/logs.txt")) - .WriteTo - .Async(c => c.Console()) + .MinimumLevel.Override("Microsoft", LogEventLevel.Information) + .MinimumLevel.Override("Microsoft.EntityFrameworkCore", LogEventLevel.Warning) + .Enrich.FromLogContext() + .WriteTo.Async(c => c.File("Logs/logs.txt")) + .WriteTo.Async(c => c.Console()) .CreateLogger(); try diff --git a/enki-problems/src/EnkiProblems.HttpApi/Controllers/ProblemSubscriberController.cs b/enki-problems/src/EnkiProblems.HttpApi/Controllers/ProblemSubscriberController.cs index 58771e3..4bbc1c3 100644 --- a/enki-problems/src/EnkiProblems.HttpApi/Controllers/ProblemSubscriberController.cs +++ b/enki-problems/src/EnkiProblems.HttpApi/Controllers/ProblemSubscriberController.cs @@ -15,7 +15,11 @@ public class ProblemSubscriberController : EnkiProblemsController private readonly DaprClient _daprClient; private readonly ILogger _logger; - public ProblemSubscriberController(HttpClient httpClient, DaprClient daprClient, ILogger logger) + public ProblemSubscriberController( + HttpClient httpClient, + DaprClient daprClient, + ILogger logger + ) { _httpClient = httpClient; _daprClient = daprClient; @@ -36,13 +40,19 @@ public async Task HandleTestUpsertedAsync([FromBody] TestUpsertedE if (inputResponse.StatusCode != HttpStatusCode.OK) { - _logger.LogError("Failed to download input file from {InputDownloadUrl}", @event.InputDownloadUrl); + _logger.LogError( + "Failed to download input file from {InputDownloadUrl}", + @event.InputDownloadUrl + ); return BadRequest(); } if (outputResponse.StatusCode != HttpStatusCode.OK) { - _logger.LogError("Failed to download output file from {OutputDownloadUrl}", @event.OutputDownloadUrl); + _logger.LogError( + "Failed to download output file from {OutputDownloadUrl}", + @event.OutputDownloadUrl + ); return BadRequest(); } @@ -51,7 +61,7 @@ public async Task HandleTestUpsertedAsync([FromBody] TestUpsertedE await _daprClient.DeleteStateAsync( EnkiProblemsConsts.StateStoreName, $"{@event.ProblemId}-{@event.Id}-{EnkiProblemsConsts.TestInputSuffix}" - ); + ); await _daprClient.SaveStateAsync( EnkiProblemsConsts.StateStoreName, @@ -64,7 +74,7 @@ await _daprClient.SaveStateAsync( await _daprClient.DeleteStateAsync( EnkiProblemsConsts.StateStoreName, $"{@event.ProblemId}-{@event.Id}-{EnkiProblemsConsts.TestOutputSuffix}" - ); + ); await _daprClient.SaveStateAsync( EnkiProblemsConsts.StateStoreName, @@ -84,15 +94,15 @@ public async Task HandleTestDeletedAsync([FromBody] TestDeletedEve // delete the input/output files from the dapr statestore await 
_daprClient.DeleteStateAsync( - EnkiProblemsConsts.StateStoreName, - $"{@event.ProblemId}-{@event.Id}-{EnkiProblemsConsts.TestInputSuffix}" - ); + EnkiProblemsConsts.StateStoreName, + $"{@event.ProblemId}-{@event.Id}-{EnkiProblemsConsts.TestInputSuffix}" + ); await _daprClient.DeleteStateAsync( - EnkiProblemsConsts.StateStoreName, - $"{@event.ProblemId}-{@event.Id}-{EnkiProblemsConsts.TestOutputSuffix}" - ); + EnkiProblemsConsts.StateStoreName, + $"{@event.ProblemId}-{@event.Id}-{EnkiProblemsConsts.TestOutputSuffix}" + ); return Ok(); } -} \ No newline at end of file +} diff --git a/enki-problems/src/EnkiProblems.MongoDB/MongoDb/EnkiProblemsMongoDbModule.cs b/enki-problems/src/EnkiProblems.MongoDB/MongoDb/EnkiProblemsMongoDbModule.cs index 9df5f4a..74341a8 100644 --- a/enki-problems/src/EnkiProblems.MongoDB/MongoDb/EnkiProblemsMongoDbModule.cs +++ b/enki-problems/src/EnkiProblems.MongoDB/MongoDb/EnkiProblemsMongoDbModule.cs @@ -27,12 +27,10 @@ public class EnkiProblemsMongoDbModule : AbpModule { public override void ConfigureServices(ServiceConfigurationContext context) { - context - .Services - .AddMongoDbContext<EnkiProblemsMongoDbContext>(options => - { - options.AddDefaultRepositories(); - }); + context.Services.AddMongoDbContext<EnkiProblemsMongoDbContext>(options => + { + options.AddDefaultRepositories(); + }); Configure(options => { diff --git a/enki-problems/test/EnkiProblems.Application.Tests/EnkiProblemsApplicationTestBase.cs b/enki-problems/test/EnkiProblems.Application.Tests/EnkiProblemsApplicationTestBase.cs index 0728c06..4a4ee7b 100644 --- a/enki-problems/test/EnkiProblems.Application.Tests/EnkiProblemsApplicationTestBase.cs +++ b/enki-problems/test/EnkiProblems.Application.Tests/EnkiProblemsApplicationTestBase.cs @@ -1,5 +1,4 @@ namespace EnkiProblems; public abstract class EnkiProblemsApplicationTestBase - : EnkiProblemsTestBase -{ } + : EnkiProblemsTestBase { } diff --git a/enki-problems/test/EnkiProblems.Application.Tests/Problems/ProblemAppServiceTests.cs b/enki-problems/test/EnkiProblems.Application.Tests/Problems/ProblemAppServiceTests.cs index 616bb78..32217c9 100644 --- a/enki-problems/test/EnkiProblems.Application.Tests/Problems/ProblemAppServiceTests.cs +++ b/enki-problems/test/EnkiProblems.Application.Tests/Problems/ProblemAppServiceTests.cs @@ -3,6 +3,7 @@ using System.Linq; using System.Threading.Tasks; using Asgard.Hermes; +using Dapr.Client; using EnkiProblems.Problems.Tests; using Microsoft.Extensions.DependencyInjection; using NSubstitute; @@ -26,6 +27,7 @@ public class ProblemAppServiceTests : EnkiProblemsApplicationTestBase private readonly EnkiProblemsTestData _testData; private ICurrentUser _currentUser; private ITestService _testService; + private DaprClient _daprClient; public ProblemAppServiceTests() { @@ -42,6 +44,9 @@ protected override void AfterAddApplication(IServiceCollection services) _testService = Substitute.For<ITestService>(); services.AddSingleton(_testService); + + _daprClient = Substitute.For<DaprClient>(); + services.AddSingleton(_daprClient); } #region CreateAsync @@ -125,14 +130,11 @@ public async Task Should_List_Published_Problems_When_Current_User_Is_Anonymous( var problemListDto = await _problemAppService.GetListAsync(new ProblemListFilterDto()); problemListDto.TotalCount.ShouldBe(1); - problemListDto - .Items - .ShouldContain( - p => - p.Name == _testData.ProblemName1 - && p.ProposerId == _testData.ProposerUserId1 - && p.IsPublished == true - ); + problemListDto.Items.ShouldContain(p => + p.Name == _testData.ProblemName1 + && p.ProposerId == _testData.ProposerUserId1 + && p.IsPublished == true + ); } 
#endregion @@ -142,26 +144,23 @@ public async Task Should_List_Unpublished_Problems_Only_For_Current_User_When_Cu { Login(_testData.ProposerUserId1, _testData.ProposerUserRoles); - var problemListDto = await _problemAppService.GetListUnpublishedAsync(new ProblemListFilterDto()); + var problemListDto = await _problemAppService.GetListUnpublishedAsync( + new ProblemListFilterDto() + ); problemListDto.TotalCount.ShouldBe(1); - problemListDto - .Items - .ShouldContain( - p => - p.Name == _testData.ProblemName1 - && p.ProposerId == _testData.ProposerUserId1 - && p.IsPublished == false - ); + problemListDto.Items.ShouldContain(p => + p.Name == _testData.ProblemName1 + && p.ProposerId == _testData.ProposerUserId1 + && p.IsPublished == false + ); problemListDto .Items[0] - .Tests - .ShouldContain( - t => - t.Score == _testData.TestScore1 - && t.InputDownloadUrl == _testData.TestInputLink1 - && t.OutputDownloadUrl == _testData.TestOutputLink1 + .Tests.ShouldContain(t => + t.Score == _testData.TestScore1 + && t.InputDownloadUrl == _testData.TestInputLink1 + && t.OutputDownloadUrl == _testData.TestOutputLink1 ); } @@ -222,14 +221,11 @@ public async Task Should_Get_Published_Problem_When_Current_User_Is_Proposer_And problemDto.ShouldNotBeNull(); problemDto.Id.ShouldBe(_testData.ProblemId1); problemDto.Name.ShouldBe(_testData.ProblemName1); - problemDto - .Tests - .ShouldContain( - t => - t.Score == _testData.TestScore1 - && t.InputDownloadUrl == _testData.TestInputLink1 - && t.OutputDownloadUrl == _testData.TestOutputLink1 - ); + problemDto.Tests.ShouldContain(t => + t.Score == _testData.TestScore1 + && t.InputDownloadUrl == _testData.TestInputLink1 + && t.OutputDownloadUrl == _testData.TestOutputLink1 + ); } [Fact] @@ -242,14 +238,11 @@ public async Task Should_Get_Unpublished_Problem_When_Current_User_Is_Proposer_A problemDto.ShouldNotBeNull(); problemDto.Id.ShouldBe(_testData.ProblemId1); problemDto.Name.ShouldBe(_testData.ProblemName1); - problemDto - .Tests - .ShouldContain( - t => - t.Score == _testData.TestScore1 - && t.InputDownloadUrl == _testData.TestInputLink1 - && t.OutputDownloadUrl == _testData.TestOutputLink1 - ); + problemDto.Tests.ShouldContain(t => + t.Score == _testData.TestScore1 + && t.InputDownloadUrl == _testData.TestInputLink1 + && t.OutputDownloadUrl == _testData.TestOutputLink1 + ); } [Fact] @@ -1096,14 +1089,11 @@ public async Task Should_Get_Eval_Metadata_For_Valid_Problem() evalMetadataDto.StackMemory.ShouldBe(_testData.ProblemStackMemoryLimit1); evalMetadataDto.IoType.ShouldBe(_testData.ProblemIoType1); evalMetadataDto.Tests.Count().ShouldBe(1); - evalMetadataDto - .Tests - .ShouldContain( - t => - t.Score == _testData.TestScore1 - && t.InputDownloadUrl == _testData.TestInputLink1 - && t.OutputDownloadUrl == _testData.TestOutputLink1 - ); + evalMetadataDto.Tests.ShouldContain(t => + t.Score == _testData.TestScore1 + && t.InputDownloadUrl == _testData.TestInputLink1 + && t.OutputDownloadUrl == _testData.TestOutputLink1 + ); } [Fact] diff --git a/enki-problems/test/EnkiProblems.Domain.Tests/EnkiProblemsDomainTestBase.cs b/enki-problems/test/EnkiProblems.Domain.Tests/EnkiProblemsDomainTestBase.cs index 39b31eb..38b08d9 100644 --- a/enki-problems/test/EnkiProblems.Domain.Tests/EnkiProblemsDomainTestBase.cs +++ b/enki-problems/test/EnkiProblems.Domain.Tests/EnkiProblemsDomainTestBase.cs @@ -1,5 +1,4 @@ namespace EnkiProblems; public abstract class EnkiProblemsDomainTestBase - : EnkiProblemsTestBase -{ } + : EnkiProblemsTestBase { } diff --git 
a/enki-problems/test/EnkiProblems.Domain.Tests/Problems/ProblemManagerTests.cs b/enki-problems/test/EnkiProblems.Domain.Tests/Problems/ProblemManagerTests.cs index 66513e2..a202cfd 100644 --- a/enki-problems/test/EnkiProblems.Domain.Tests/Problems/ProblemManagerTests.cs +++ b/enki-problems/test/EnkiProblems.Domain.Tests/Problems/ProblemManagerTests.cs @@ -132,7 +132,7 @@ await Assert.ThrowsAsync(async () => { await _problemManager.UpdateAsync( problem, - _testData.ProblemName1, + _testData.ProblemName3, _testData.ProblemBrief2, _testData.ProblemDescription2, _testData.ProblemSourceName2, diff --git a/enki-problems/test/EnkiProblems.MongoDB.Tests/EnkiProblems.MongoDB.Tests.csproj b/enki-problems/test/EnkiProblems.MongoDB.Tests/EnkiProblems.MongoDB.Tests.csproj index dcf9a2f..6516357 100644 --- a/enki-problems/test/EnkiProblems.MongoDB.Tests/EnkiProblems.MongoDB.Tests.csproj +++ b/enki-problems/test/EnkiProblems.MongoDB.Tests/EnkiProblems.MongoDB.Tests.csproj @@ -15,7 +15,8 @@ - + + diff --git a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbCollectionFixtureBase.cs b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbCollectionFixtureBase.cs index f3e6222..33d4509 100644 --- a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbCollectionFixtureBase.cs +++ b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbCollectionFixtureBase.cs @@ -3,5 +3,4 @@ namespace EnkiProblems.MongoDB; public class EnkiProblemsMongoDbCollectionFixtureBase - : ICollectionFixture<EnkiProblemsMongoDbFixture> -{ } + : ICollectionFixture<EnkiProblemsMongoDbFixture> { } diff --git a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbFixture.cs b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbFixture.cs index 778bc24..14b30ae 100644 --- a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbFixture.cs +++ b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbFixture.cs @@ -1,24 +1,16 @@ -using System; -using Mongo2Go; +using System.Threading.Tasks; +using Testcontainers.MongoDb; +using Xunit; namespace EnkiProblems.MongoDB; -public class EnkiProblemsMongoDbFixture : IDisposable +public class EnkiProblemsMongoDbFixture : IAsyncLifetime { - private static readonly MongoDbRunner MongoDbRunner; - public static readonly string ConnectionString; + private static readonly MongoDbContainer MongoDbContainer = new MongoDbBuilder().Build(); - static EnkiProblemsMongoDbFixture() - { - MongoDbRunner = MongoDbRunner.Start( - singleNodeReplSet: true, - singleNodeReplSetWaitTimeout: 20 - ); - ConnectionString = MongoDbRunner.ConnectionString; - } + public static string GetConnectionString() => MongoDbContainer.GetConnectionString(); - public void Dispose() - { - MongoDbRunner?.Dispose(); - } + public Task InitializeAsync() => MongoDbContainer.StartAsync(); + + public Task DisposeAsync() => MongoDbContainer.StopAsync(); } diff --git a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestBase.cs b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestBase.cs index 90a987a..db4be58 100644 --- a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestBase.cs +++ b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestBase.cs @@ -1,5 +1,4 @@ namespace EnkiProblems.MongoDB; public abstract class EnkiProblemsMongoDbTestBase - : EnkiProblemsTestBase -{ } + : EnkiProblemsTestBase { } diff --git 
a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestModule.cs b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestModule.cs index 7995475..868ece9 100644 --- a/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestModule.cs +++ b/enki-problems/test/EnkiProblems.MongoDB.Tests/MongoDb/EnkiProblemsMongoDbTestModule.cs @@ -9,13 +9,8 @@ public class EnkiProblemsMongoDbTestModule : AbpModule { public override void ConfigureServices(ServiceConfigurationContext context) { - var stringArray = EnkiProblemsMongoDbFixture.ConnectionString.Split('?'); var connectionString = - stringArray[0].EnsureEndsWith('/') - + "Db_" - + Guid.NewGuid().ToString("N") - + "/?" - + stringArray[1]; + $"{EnkiProblemsMongoDbFixture.GetConnectionString().EnsureEndsWith('/')}Db_{Guid.NewGuid():N}?authSource=admin"; Configure(options => { diff --git a/hermes-tests/.dockerignore b/hermes-tests/.dockerignore index c3cd8b6..1f3ec55 100644 --- a/hermes-tests/.dockerignore +++ b/hermes-tests/.dockerignore @@ -1,6 +1,7 @@ .vscode/ .idea/ .dart_tool/ +.fvm* test/ temp/test/ *.md diff --git a/hermes-tests/.gitignore b/hermes-tests/.gitignore index 692cac2..fe4bde3 100644 --- a/hermes-tests/.gitignore +++ b/hermes-tests/.gitignore @@ -3,7 +3,11 @@ .dart_tool/ .idea/ .vscode/ -temp/ +#temp/ logs/ config.json -.env \ No newline at end of file +.env + +# FVM Version Cache +.fvm/ +.fvmrc \ No newline at end of file diff --git a/hermes-tests/bin/client.dart b/hermes-tests/bin/client.dart index f64ad73..12e3a06 100644 --- a/hermes-tests/bin/client.dart +++ b/hermes-tests/bin/client.dart @@ -6,7 +6,7 @@ import 'package:hermes_tests/domain/core/file_log_output.dart'; import 'package:logger/logger.dart'; Future<void> main(List<String> arguments) async { - final config = Config.fromJsonFile('config.json'); + final config = Config.fromEnv('HERMES_CONFIG'); final serverConfig = ServerConfig.fromJson(config.dev); final logger = Logger( diff --git a/hermes-tests/temp/test/archived/sum/1-invalid.tar.gz b/hermes-tests/temp/test/archived/sum/1-invalid.tar.gz new file mode 100644 index 0000000000000000000000000000000000000000..22fbc75347a98d003eed28787496fe8e17dddc22 GIT binary patch literal 185 [base85-encoded binary data omitted] literal 0 HcmV?d00001 diff --git a/hermes-tests/temp/test/archived/sum/1-oversize.zip b/hermes-tests/temp/test/archived/sum/1-oversize.zip new file mode 100644 index 0000000000000000000000000000000000000000..899e08720074569e447df565aaa3939a74104309 GIT binary patch literal 470 [base85-encoded binary data omitted] literal 0 HcmV?d00001 diff --git a/hermes-tests/temp/test/archived/sum/1-valid.zip b/hermes-tests/temp/test/archived/sum/1-valid.zip new file mode 100644 index 0000000000000000000000000000000000000000..9a2c576912634c91965634d5d191d28aa4f25397 GIT binary patch literal 320 [base85-encoded binary data omitted] literal 0 HcmV?d00001 diff --git 
a/hermes-tests/temp/test/unarchived/sum/2/input.txt b/hermes-tests/temp/test/unarchived/sum/2/input.txt new file mode 100644 index 0000000..92880af --- /dev/null +++ b/hermes-tests/temp/test/unarchived/sum/2/input.txt @@ -0,0 +1 @@ +1 1 \ No newline at end of file diff --git a/hermes-tests/temp/test/unarchived/sum/2/output.txt b/hermes-tests/temp/test/unarchived/sum/2/output.txt new file mode 100644 index 0000000..d8263ee --- /dev/null +++ b/hermes-tests/temp/test/unarchived/sum/2/output.txt @@ -0,0 +1 @@ +2 \ No newline at end of file diff --git a/hermes-tests/temp/test/unarchived/sum/6/input.txt b/hermes-tests/temp/test/unarchived/sum/6/input.txt new file mode 100644 index 0000000..92880af --- /dev/null +++ b/hermes-tests/temp/test/unarchived/sum/6/input.txt @@ -0,0 +1 @@ +1 1 \ No newline at end of file diff --git a/hermes-tests/test/api/server/hermes_grpc_server_integration_test.dart b/hermes-tests/test/api/server/hermes_grpc_server_integration_test.dart index babd364..b94f718 100644 --- a/hermes-tests/test/api/server/hermes_grpc_server_integration_test.dart +++ b/hermes-tests/test/api/server/hermes_grpc_server_integration_test.dart @@ -50,9 +50,9 @@ void main() { 'Then the uploaded test is accessible from the remote firebase cloud storage', () async { // Arrange - final String testPath = 'temp/test/archived/marsx/1-valid.zip'; + final String testPath = 'temp/test/archived/sum/1-valid.zip'; final Metadata testMetadata = Metadata() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '10' ..testSize = File(testPath).lengthSync(); @@ -100,7 +100,7 @@ void main() { () async { // Arrange final request = DownloadRequest() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '9'; // Act @@ -133,9 +133,9 @@ void main() { 'Then the test is deleted from the remote firebase cloud storage', () async { // Arrange - final String testPath = 'temp/test/archived/marsx/1-valid.zip'; + final String testPath = 'temp/test/archived/sum/1-valid.zip'; final Metadata testMetadata = Metadata() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '10' ..testSize = File(testPath).lengthSync(); @@ -154,6 +154,14 @@ void main() { // Assert expect(response.status.code, StatusCode.Ok); + final String localTestArchivePath = + '${testConfig.tempLocalArchivedTestFolder}/${testMetadata.problemId}/${testMetadata.testId}.zip'; + final String localTestPath = + '${testConfig.tempLocalUnarchivedTestFolder}/${testMetadata.problemId}/${testMetadata.testId}'; + + FileManager.disposeLocalFile(localTestArchivePath); + FileManager.disposeLocalDirectory(localTestPath); + client.close(); }); @@ -163,7 +171,7 @@ void main() { 'Then the download link is successfully retrieved', () async { // Arrange final request = GetDownloadLinkForTestRequest() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '9'; // Act diff --git a/hermes-tests/test/application/use_cases/download/download_test_use_case_unit_test.dart b/hermes-tests/test/application/use_cases/download/download_test_use_case_unit_test.dart index c49f9a8..7011d4f 100644 --- a/hermes-tests/test/application/use_cases/download/download_test_use_case_unit_test.dart +++ b/hermes-tests/test/application/use_cases/download/download_test_use_case_unit_test.dart @@ -23,7 +23,7 @@ void main() { group('Download Test UseCase Unit Tests', () { setUpAll(() { testConfig = ServerConfig.fromJson( - Config.fromJsonFile('config.json').test, + Config.fromEnv('HERMES_CONFIG').test, ); mockTestRepository = MockTestRepository(); final logger = Logger( @@ -52,7 +52,7 @@ void main() { () 
async { // Arrange final TestMetadata requestTestMetadata = TestMetadata.testToDownload( - problemId: 'marsx', + problemId: 'sum', testId: '2', fromDir: testConfig.remoteUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, @@ -84,7 +84,7 @@ void main() { 'Then localTestNotFound storage failure is returned', () async { // Arrange final TestMetadata requestTestMetadata = TestMetadata.testToDownload( - problemId: 'marsx', + problemId: 'sum', testId: '6', fromDir: testConfig.remoteUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, @@ -122,7 +122,7 @@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata requestTestMetadata = TestMetadata.testToDownload( - problemId: 'marsx', + problemId: 'sum', testId: '5', fromDir: testConfig.remoteUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, @@ -163,7 +163,7 @@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata requestTestMetadata = TestMetadata.testToUpload( - problemId: 'marsx', + problemId: 'sum', testId: '5', fromDir: testConfig.remoteUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, diff --git a/hermes-tests/test/application/use_cases/download/encode_test_use_case_unit_test.dart b/hermes-tests/test/application/use_cases/download/encode_test_use_case_unit_test.dart index 9aaecfd..42ea027 100644 --- a/hermes-tests/test/application/use_cases/download/encode_test_use_case_unit_test.dart +++ b/hermes-tests/test/application/use_cases/download/encode_test_use_case_unit_test.dart @@ -19,7 +19,7 @@ void main() { group('Encode Test UseCase Unit Tests', () { setUpAll(() { testConfig = ServerConfig.fromJson( - Config.fromJsonFile('config.json').test, + Config.fromEnv('HERMES_CONFIG').test, ); final logger = Logger( output: FileLogOutput( @@ -42,7 +42,7 @@ void main() { 'Then metadata for corresponding archived test is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToEncode( - problemId: 'marsx', + problemId: 'sum', testId: '2', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalUnarchivedTestFolder, @@ -77,7 +77,7 @@ void main() { 'Then metadata for corresponding archived test is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToEncode( - problemId: 'marsx', + problemId: 'sum', testId: '1', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalUnarchivedTestFolder, @@ -103,7 +103,7 @@ void main() { 'Then localTestNotFound storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToEncode( - problemId: 'marsx', + problemId: 'sum', testId: '3', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalUnarchivedTestFolder, @@ -135,7 +135,7 @@ void main() { 'Then invalidLocalTestFormat storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToEncode( - problemId: 'marsx', + problemId: 'sum', testId: '6', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalUnarchivedTestFolder, @@ -167,7 +167,7 @@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToDecode( - problemId: 'marsx', + problemId: 'sum', testId: '6', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalUnarchivedTestFolder, diff --git 
a/hermes-tests/test/application/use_cases/download/fragment_test_use_case_unit_test.dart b/hermes-tests/test/application/use_cases/download/fragment_test_use_case_unit_test.dart index 4084bf0..daa9a3f 100644 --- a/hermes-tests/test/application/use_cases/download/fragment_test_use_case_unit_test.dart +++ b/hermes-tests/test/application/use_cases/download/fragment_test_use_case_unit_test.dart @@ -19,7 +19,7 @@ void main() { group('Fragment Test UseCase Unit Tests', () { setUpAll(() { testConfig = ServerConfig.fromJson( - Config.fromJsonFile('config.json').test, + Config.fromEnv('HERMES_CONFIG').test, ); final logger = Logger( output: FileLogOutput( @@ -42,7 +42,7 @@ void main() { 'Then a stream of chunks is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToFragment( - problemId: 'marsx', + problemId: 'sum', testId: '1', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, @@ -78,7 +78,7 @@ void main() { 'Then localTestNotFound storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToFragment( - problemId: 'marsx', + problemId: 'sum', testId: '7', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, @@ -108,7 +108,7 @@ void main() { 'Then invalidLocalTestFormat storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToFragment( - problemId: 'marsx', + problemId: 'sum', testId: '4', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, @@ -138,7 +138,7 @@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToUpload( - problemId: 'marsx', + problemId: 'sum', testId: '4', fromDir: testConfig.tempLocalUnarchivedTestFolder, toDir: testConfig.remoteUnarchivedTestFolder, diff --git a/hermes-tests/test/application/use_cases/upload/decode_test_use_case_unit_test.dart b/hermes-tests/test/application/use_cases/upload/decode_test_use_case_unit_test.dart index bf88dfe..b49c4ca 100644 --- a/hermes-tests/test/application/use_cases/upload/decode_test_use_case_unit_test.dart +++ b/hermes-tests/test/application/use_cases/upload/decode_test_use_case_unit_test.dart @@ -19,7 +19,7 @@ void main() { group('Decode Test UseCase Unit Tests', () { setUpAll(() { testConfig = ServerConfig.fromJson( - Config.fromJsonFile('config.json').test, + Config.fromEnv('HERMES_CONFIG').test, ); final logger = Logger( output: FileLogOutput( @@ -43,7 +43,7 @@ void main() { () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToDecode( - problemId: 'marsx', + problemId: 'sum', testId: '1', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, @@ -86,7 +86,7 @@ void main() { 'Then localTestNotFound storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToDecode( - problemId: 'marsx', + problemId: 'sum', testId: '3', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, @@ -118,7 +118,7 @@ void main() { 'Then invalidLocalTestFormat storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToDecode( - problemId: 'marsx', + problemId: 'sum', testId: '4', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, @@ -150,7 +150,7 
@@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToEncode( - problemId: 'marsx', + problemId: 'sum', testId: '4', archiveTypeExtension: testConfig.archiveTypeExtension, fromDir: testConfig.tempLocalArchivedTestFolder, diff --git a/hermes-tests/test/application/use_cases/upload/defragment_test_use_case_unit_test.dart b/hermes-tests/test/application/use_cases/upload/defragment_test_use_case_unit_test.dart index b499af7..7b510ba 100644 --- a/hermes-tests/test/application/use_cases/upload/defragment_test_use_case_unit_test.dart +++ b/hermes-tests/test/application/use_cases/upload/defragment_test_use_case_unit_test.dart @@ -21,7 +21,7 @@ void main() { group('Defragment Test UseCase Unit Tests', () { setUpAll(() async { testConfig = ServerConfig.fromJson( - Config.fromJsonFile('config.json').test, + Config.fromEnv('HERMES_CONFIG').test, ); final logger = Logger( output: FileLogOutput( @@ -44,11 +44,11 @@ void main() { 'Then the test is successfully written on disk ' 'and associated metadata is returned', () async { // Arrange - final String inputPath = 'temp/test/archived/marsx/1-valid.zip'; + final String inputPath = 'temp/test/archived/sum/1-valid.zip'; final int testSize = File(inputPath).lengthSync(); final Metadata metadata = Metadata() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '2' ..testSize = testSize; @@ -89,11 +89,11 @@ void main() { 'When defragment test use case is called, ' 'Then invalidLocalTestFormat storage failure is returned', () async { // Arrange - final String inputPath = 'temp/test/archived/marsx/1-invalid.tar.xz'; + final String inputPath = 'temp/test/archived/sum/1-invalid.tar.gz'; final int testSize = File(inputPath).lengthSync(); final Metadata metadata = Metadata() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '2' ..testSize = testSize; @@ -134,11 +134,11 @@ void main() { 'When defragment test use case is called, ' 'Then testSizeLimitExceeded storage failure is returned', () async { // Arrange - final String inputPath = 'temp/test/archived/marsx/1-oversize.zip'; + final String inputPath = 'temp/test/archived/sum/1-oversize.zip'; final int testSize = File(inputPath).lengthSync(); final Metadata metadata = Metadata() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '2' ..testSize = testSize; @@ -179,11 +179,11 @@ void main() { 'When defragment test use case is called, ' 'Then unexpected storage failure is returned', () async { // Arrange - final String inputPath = 'temp/test/archived/marsx/1-oversize.zip'; + final String inputPath = 'temp/test/archived/sum/1-oversize.zip'; final int testSize = File(inputPath).lengthSync(); final Metadata metadata = Metadata() - ..problemId = 'marsx' + ..problemId = 'sum' ..testId = '2' ..testSize = testSize; diff --git a/hermes-tests/test/application/use_cases/upload/upload_test_use_case_unit_test.dart b/hermes-tests/test/application/use_cases/upload/upload_test_use_case_unit_test.dart index 451b57b..f031d7b 100644 --- a/hermes-tests/test/application/use_cases/upload/upload_test_use_case_unit_test.dart +++ b/hermes-tests/test/application/use_cases/upload/upload_test_use_case_unit_test.dart @@ -23,7 +23,7 @@ void main() { group('Upload Test UseCase Unit Tests', () { setUpAll(() { testConfig = ServerConfig.fromJson( - Config.fromJsonFile('config.json').test, + Config.fromEnv('HERMES_CONFIG').test, ); mockTestRepository = MockTestRepository(); final logger = Logger( @@ -50,7 +50,7 @@ void main() { 'Then no storage failure 
is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToUpload( - problemId: 'marsx', + problemId: 'sum', testId: '2', fromDir: testConfig.tempLocalUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, @@ -85,7 +85,7 @@ void main() { 'Then localTestNotFound storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToUpload( - problemId: 'marsx', + problemId: 'sum', testId: '3', fromDir: testConfig.tempLocalUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, @@ -126,7 +126,7 @@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToUpload( - problemId: 'marsx', + problemId: 'sum', testId: '2', fromDir: testConfig.tempLocalUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, @@ -167,7 +167,7 @@ void main() { 'Then unexpected storage failure is returned', () async { // Arrange final TestMetadata testMetadata = TestMetadata.testToDownload( - problemId: 'marsx', + problemId: 'sum', testId: '2', fromDir: testConfig.tempLocalUnarchivedTestFolder, toDir: testConfig.tempLocalUnarchivedTestFolder, diff --git a/quetzalcoatl-auth/.dockerignore b/quetzalcoatl-auth/.dockerignore index 4a02cf3..09e5f65 100644 --- a/quetzalcoatl-auth/.dockerignore +++ b/quetzalcoatl-auth/.dockerignore @@ -6,5 +6,6 @@ .env* QuetzalcoatlAuth.sln.DotSettings.user *.md +Tests.Integration/ Dockerfile* .dockerignore diff --git a/quetzalcoatl-auth/.env.template b/quetzalcoatl-auth/.env.template index 55b0184..18cee60 100644 --- a/quetzalcoatl-auth/.env.template +++ b/quetzalcoatl-auth/.env.template @@ -1,5 +1,6 @@ -ASPNETCORE_ENVIRONMENT=Development +ASPNETCORE_ENVIRONMENT=Testing ASPNETCORE_URLS=http://+:5210 +DOTNET_SYSTEM_GLOBALIZATION_INVARIANT=false JwtConfig__SecretKey="z7F+ut_aphaxeja0&ba*p9spew!4fe0rAFRO5HestitIKOv5nistlz3b=+edu1aP" JwtConfig__JwtAccessTokenLifeTime="0:01:00:00.0" diff --git a/quetzalcoatl-auth/Api/Features/Auth/Login/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Auth/Login/Endpoint.cs index 31d2cbe..4862a4d 100644 --- a/quetzalcoatl-auth/Api/Features/Auth/Login/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Auth/Login/Endpoint.cs @@ -47,39 +47,29 @@ public override async Task HandleAsync(LoginUserRequest req, CancellationToken c } ); - HttpContext - .Response - .Cookies - .Append( - CookieAuthenticationDefaults.CookiePrefix + "AccessToken", - tokenResponse.AccessToken, - new CookieOptions - { - HttpOnly = true, - SameSite = SameSiteMode.None, - Secure = true, - Expires = DateTimeOffset - .UtcNow - .AddTicks(_jwtConfig.JwtAccessTokenLifetime.Ticks) - } - ); + HttpContext.Response.Cookies.Append( + CookieAuthenticationDefaults.CookiePrefix + "AccessToken", + tokenResponse.AccessToken, + new CookieOptions + { + HttpOnly = true, + SameSite = SameSiteMode.None, + Secure = true, + Expires = DateTimeOffset.UtcNow.AddTicks(_jwtConfig.JwtAccessTokenLifetime.Ticks) + } + ); - HttpContext - .Response - .Cookies - .Append( - CookieAuthenticationDefaults.CookiePrefix + "RefreshToken", - tokenResponse.RefreshToken, - new CookieOptions - { - HttpOnly = true, - SameSite = SameSiteMode.None, - Secure = true, - Expires = DateTimeOffset - .UtcNow - .AddTicks(_jwtConfig.JwtRefreshTokenLifetime.Ticks) - } - ); + HttpContext.Response.Cookies.Append( + CookieAuthenticationDefaults.CookiePrefix + "RefreshToken", + tokenResponse.RefreshToken, + new CookieOptions + { + HttpOnly = true, + SameSite = SameSiteMode.None, 
+ Secure = true, + Expires = DateTimeOffset.UtcNow.AddTicks(_jwtConfig.JwtRefreshTokenLifetime.Ticks) + } + ); await SendOkAsync( response: new UserTokenResponse diff --git a/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Endpoint.cs index b442c3c..827acd7 100644 --- a/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Endpoint.cs @@ -60,16 +60,12 @@ public override async Task PersistTokenAsync(UserTokenResponse response) public override async Task RefreshRequestValidationAsync(UserTokenRequest req) { - _logger.LogInformation( - "Validating the refresh token for user {UserId}", - req.UserId - ); + _logger.LogInformation("Validating the refresh token for user {UserId}", req.UserId); - var storedRefreshToken = await _tokenRepository.GetRefreshTokenAsync( - rt => - rt.Token == Guid.Parse(req.RefreshToken) - && rt.UserId == Guid.Parse(req.UserId) - && rt.ExpiryDate > DateTime.UtcNow + var storedRefreshToken = await _tokenRepository.GetRefreshTokenAsync(rt => + rt.Token == Guid.Parse(req.RefreshToken) + && rt.UserId == Guid.Parse(req.UserId) + && rt.ExpiryDate > DateTime.UtcNow ); if (storedRefreshToken is null) diff --git a/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Mappers.cs b/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Mappers.cs index f320b07..308d885 100644 --- a/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Mappers.cs +++ b/quetzalcoatl-auth/Api/Features/Auth/RefreshToken/Mappers.cs @@ -10,4 +10,4 @@ public UserTokenResponseToRefreshTokenEntityProfile() .ForMember(dest => dest.ExpiryDate, opt => opt.MapFrom(src => src.RefreshExpiry)) .ForMember(dest => dest.CreationDate, opt => opt.MapFrom(_ => DateTime.UtcNow)); } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Auth/Register/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Auth/Register/Endpoint.cs index d614939..979cb33 100644 --- a/quetzalcoatl-auth/Api/Features/Auth/Register/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Auth/Register/Endpoint.cs @@ -46,39 +46,29 @@ public override async Task HandleAsync(RegisterUserRequest req, CancellationToke } ); - HttpContext - .Response - .Cookies - .Append( - CookieAuthenticationDefaults.CookiePrefix + "AccessToken", - tokenResponse.AccessToken, - new CookieOptions - { - HttpOnly = true, - SameSite = SameSiteMode.None, - Secure = true, - Expires = DateTimeOffset - .UtcNow - .AddTicks(_jwtConfig.JwtAccessTokenLifetime.Ticks) - } - ); + HttpContext.Response.Cookies.Append( + CookieAuthenticationDefaults.CookiePrefix + "AccessToken", + tokenResponse.AccessToken, + new CookieOptions + { + HttpOnly = true, + SameSite = SameSiteMode.None, + Secure = true, + Expires = DateTimeOffset.UtcNow.AddTicks(_jwtConfig.JwtAccessTokenLifetime.Ticks) + } + ); - HttpContext - .Response - .Cookies - .Append( - CookieAuthenticationDefaults.CookiePrefix + "RefreshToken", - tokenResponse.RefreshToken, - new CookieOptions - { - HttpOnly = true, - SameSite = SameSiteMode.None, - Secure = true, - Expires = DateTimeOffset - .UtcNow - .AddTicks(_jwtConfig.JwtRefreshTokenLifetime.Ticks) - } - ); + HttpContext.Response.Cookies.Append( + CookieAuthenticationDefaults.CookiePrefix + "RefreshToken", + tokenResponse.RefreshToken, + new CookieOptions + { + HttpOnly = true, + SameSite = SameSiteMode.None, + Secure = true, + Expires = DateTimeOffset.UtcNow.AddTicks(_jwtConfig.JwtRefreshTokenLifetime.Ticks) + } + ); await SendCreatedAtAsync( endpointName: 
$"/api/users/{user.Id}", diff --git a/quetzalcoatl-auth/Api/Features/Auth/Register/Validators.cs b/quetzalcoatl-auth/Api/Features/Auth/Register/Validators.cs index 23fdae4..2c3d3b4 100644 --- a/quetzalcoatl-auth/Api/Features/Auth/Register/Validators.cs +++ b/quetzalcoatl-auth/Api/Features/Auth/Register/Validators.cs @@ -8,7 +8,9 @@ public Validator() .NotEmpty() .WithMessage("Username is required") .MinimumLength(ApplicationUserConsts.UsernameMinLength) - .WithMessage($"Username must be at least {ApplicationUserConsts.UsernameMinLength} characters long"); + .WithMessage( + $"Username must be at least {ApplicationUserConsts.UsernameMinLength} characters long" + ); RuleFor(x => x.Email) .NotEmpty() @@ -20,9 +22,13 @@ public Validator() .NotEmpty() .WithMessage("Password is required") .MinimumLength(ApplicationUserConsts.PasswordMinLength) - .WithMessage($"Password must be at least {ApplicationUserConsts.PasswordMinLength} characters long") + .WithMessage( + $"Password must be at least {ApplicationUserConsts.PasswordMinLength} characters long" + ) .MaximumLength(ApplicationUserConsts.PasswordMaxLength) - .WithMessage($"Password must be at most {ApplicationUserConsts.PasswordMaxLength} characters long") + .WithMessage( + $"Password must be at most {ApplicationUserConsts.PasswordMaxLength} characters long" + ) .Matches(ApplicationUserConsts.PasswordRegex) .WithMessage( "Password must contain at least one uppercase letter, one lowercase letter, one number and one special character" @@ -30,12 +36,16 @@ public Validator() RuleFor(x => x.Fullname) .MaximumLength(ApplicationUserConsts.FullnameMaxLength) - .WithMessage($"Fullname must be at most {ApplicationUserConsts.FullnameMaxLength} characters long") + .WithMessage( + $"Fullname must be at most {ApplicationUserConsts.FullnameMaxLength} characters long" + ) .When(x => !string.IsNullOrWhiteSpace(x.Fullname)); RuleFor(x => x.Bio) .MaximumLength(ApplicationUserConsts.BioMaxLength) - .WithMessage($"Bio must be at most {ApplicationUserConsts.BioMaxLength} characters long") + .WithMessage( + $"Bio must be at most {ApplicationUserConsts.BioMaxLength} characters long" + ) .When(x => !string.IsNullOrWhiteSpace(x.Bio)); RuleFor(x => x.ProfilePicture) @@ -47,7 +57,8 @@ public Validator() .When(x => x.ProfilePicture is not null); } - private static bool IsAllowedSize(long length) => length <= ApplicationUserConsts.ProfilePictureMaxLength; + private static bool IsAllowedSize(long length) => + length <= ApplicationUserConsts.ProfilePictureMaxLength; private static bool IsAllowedType(string contentType) => ApplicationUserConsts.AllowedProfilePictureTypes.Contains(contentType); diff --git a/quetzalcoatl-auth/Api/Features/Core/ApplicationUserExtensions.cs b/quetzalcoatl-auth/Api/Features/Core/ApplicationUserExtensions.cs index 06df51b..3df1d9f 100644 --- a/quetzalcoatl-auth/Api/Features/Core/ApplicationUserExtensions.cs +++ b/quetzalcoatl-auth/Api/Features/Core/ApplicationUserExtensions.cs @@ -26,7 +26,7 @@ public static class SortUsersByExtensions public static IEnumerable SortUsers( this IEnumerable query, SortUsersBy sortBy - ) + ) { return sortBy switch { @@ -39,7 +39,7 @@ SortUsersBy sortBy public static IAsyncEnumerable SortUsers( this IAsyncEnumerable query, SortUsersBy sortBy - ) + ) { return sortBy switch { @@ -48,4 +48,4 @@ SortUsersBy sortBy _ => query.OrderBy(user => user.Username), }; } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Core/JwtExtensions.cs b/quetzalcoatl-auth/Api/Features/Core/JwtExtensions.cs index 
7da210d..2a2d4fe 100644 --- a/quetzalcoatl-auth/Api/Features/Core/JwtExtensions.cs +++ b/quetzalcoatl-auth/Api/Features/Core/JwtExtensions.cs @@ -26,9 +26,9 @@ out var validatedToken private static bool IsJwtWithValidSecurityAlgorithm(SecurityToken validatedToken) { return (validatedToken is JwtSecurityToken jwtSecurityToken) - && jwtSecurityToken - .Header - .Alg - .Equals(SecurityAlgorithms.HmacSha256, StringComparison.InvariantCultureIgnoreCase); + && jwtSecurityToken.Header.Alg.Equals( + SecurityAlgorithms.HmacSha256, + StringComparison.InvariantCultureIgnoreCase + ); } } diff --git a/quetzalcoatl-auth/Api/Features/Core/LinqExtensions.cs b/quetzalcoatl-auth/Api/Features/Core/LinqExtensions.cs index 956b458..3e9c799 100644 --- a/quetzalcoatl-auth/Api/Features/Core/LinqExtensions.cs +++ b/quetzalcoatl-auth/Api/Features/Core/LinqExtensions.cs @@ -11,4 +11,4 @@ public static IAsyncEnumerable Paginate(IAsyncEnumerable query, int pag { return query.Skip((page - 1) * pageSize).Take(pageSize); } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Delete/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Users/Delete/Endpoint.cs index 85c7d48..336e67c 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Delete/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Delete/Endpoint.cs @@ -39,8 +39,7 @@ public override async Task HandleAsync(DeleteUserRequest req, CancellationToken if (!result.Succeeded) { var errors = result - .Errors - .Select(e => e.Description) + .Errors.Select(e => e.Description) .Aggregate("Identity Errors: ", (a, b) => $"{a}, {b}"); _logger.LogWarning( diff --git a/quetzalcoatl-auth/Api/Features/Users/Get/Mappers.cs b/quetzalcoatl-auth/Api/Features/Users/Get/Mappers.cs index 146169f..7a190b9 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Get/Mappers.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Get/Mappers.cs @@ -27,9 +27,9 @@ public ApplicationUserToGetUserResponseProfile() .ForMember( dest => dest.ProfilePictureId, opt => - opt.MapFrom( - src => src.ProfilePicture != null ? src.ProfilePicture!.Id : null - ) + opt.MapFrom(src => + src.ProfilePicture != null ? src.ProfilePicture!.Id : null + ) ); } } diff --git a/quetzalcoatl-auth/Api/Features/Users/GetAll/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Users/GetAll/Endpoint.cs index ff3175d..74db2d7 100644 --- a/quetzalcoatl-auth/Api/Features/Users/GetAll/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Users/GetAll/Endpoint.cs @@ -29,20 +29,23 @@ public override async Task HandleAsync(GetAllUsersRequest req, CancellationToken { _logger.LogInformation("Getting all users"); - var users = _userManager.Users.AsAsyncEnumerable().SelectAwait(async user => - { - var userDto = _mapper.Map(user); - userDto.Roles = await _userManager.GetRolesAsync(user); - return userDto; - }).AsAsyncEnumerable(); - + var users = _userManager + .Users.AsAsyncEnumerable() + .SelectAwait(async user => + { + var userDto = _mapper.Map(user); + userDto.Roles = await _userManager.GetRolesAsync(user); + return userDto; + }) + .AsAsyncEnumerable(); + if (!string.IsNullOrWhiteSpace(req.Username)) { users = users.Where(user => user.Username.Contains(req.Username)); } var totalCount = await users.CountAsync(cancellationToken: ct); - + if (req.SortBy is not null) { users = users.SortUsers(req.SortBy.Value); @@ -50,6 +53,13 @@ public override async Task HandleAsync(GetAllUsersRequest req, CancellationToken users = LinqExtensions.Paginate(users, req.Page ?? 1, req.PageSize ?? 
10); - await SendOkAsync(response: new GetAllUsersResponse { Users = users.ToEnumerable(), TotalCount = totalCount }, ct); + await SendOkAsync( + response: new GetAllUsersResponse + { + Users = users.ToEnumerable(), + TotalCount = totalCount + }, + ct + ); } } diff --git a/quetzalcoatl-auth/Api/Features/Users/GetAll/Mappers.cs b/quetzalcoatl-auth/Api/Features/Users/GetAll/Mappers.cs index c7d6681..d961313 100644 --- a/quetzalcoatl-auth/Api/Features/Users/GetAll/Mappers.cs +++ b/quetzalcoatl-auth/Api/Features/Users/GetAll/Mappers.cs @@ -27,9 +27,9 @@ public ApplicationUserToUserDtoProfile() .ForMember( dest => dest.ProfilePictureId, opt => - opt.MapFrom( - src => src.ProfilePicture != null ? src.ProfilePicture!.Id : null - ) + opt.MapFrom(src => + src.ProfilePicture != null ? src.ProfilePicture!.Id : null + ) ); } } diff --git a/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Endpoint.cs index 5496750..d794467 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Endpoint.cs @@ -26,7 +26,11 @@ public override void Configure() public override async Task HandleAsync(AddRoleRequest req, CancellationToken ct) { - _logger.LogInformation("Adding role {Role} to user with id {Id}", req.Role.ToString(), req.Id.ToString()); + _logger.LogInformation( + "Adding role {Role} to user with id {Id}", + req.Role.ToString(), + req.Id.ToString() + ); if (!Enum.TryParse<ApplicationRole>(req.Role, out var role)) { @@ -47,7 +51,11 @@ public override async Task HandleAsync(AddRoleRequest req, CancellationToken ct) if (await _userManager.IsInRoleAsync(user, role.ToString())) { - _logger.LogWarning("User with id {Id} already has role {Role}", req.Id.ToString(), role.ToString()); + _logger.LogWarning( + "User with id {Id} already has role {Role}", + req.Id.ToString(), + role.ToString() + ); var errors = $"User with id {req.Id.ToString()} already has role {role}"; AddError(errors); } @@ -58,11 +66,15 @@ public override async Task HandleAsync(AddRoleRequest req, CancellationToken ct) if (!result.Succeeded) { var errors = result - .Errors - .Select(e => e.Description) + .Errors.Select(e => e.Description) .Aggregate("Identity Errors: ", (a, b) => $"{a}, {b}"); - _logger.LogWarning("Failed to add role {Role} to user with id {Id}: {errors}", role.ToString(), req.Id.ToString(), errors); + _logger.LogWarning( + "Failed to add role {Role} to user with id {Id}: {errors}", + role.ToString(), + req.Id.ToString(), + errors + ); AddError(errors); } ThrowIfAnyErrors(); @@ -73,4 +85,4 @@ public override async Task HandleAsync(AddRoleRequest req, CancellationToken ct) await SendOkAsync(response, ct); } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Models.cs b/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Models.cs index 1b6dfb8..1cd64a0 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Models.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Models.cs @@ -6,4 +6,4 @@ public class AddRoleRequest public Guid Id { get; set; } = Guid.Empty; public string Role { get; set; } = string.Empty; -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Summary.cs b/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Summary.cs index 5ee06b9..cb0ad30 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Summary.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Roles/Add/Summary.cs @@ -6,7 +6,11 @@ public AddRoleSummary() { Summary = "Add 
a role to a user"; Description = "Add a role to a user by id"; - ExampleRequest = new AddRoleRequest { Id = Guid.NewGuid(), Role = ApplicationRole.Proposer.ToString() }; + ExampleRequest = new AddRoleRequest + { + Id = Guid.NewGuid(), + Role = ApplicationRole.Proposer.ToString() + }; Response( 200, "Role added successfully", @@ -25,4 +29,4 @@ public AddRoleSummary() Response(401, "Unauthorized access"); Response(500, "Internal server error"); } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Endpoint.cs index 0c9d832..c9250e6 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Endpoint.cs @@ -26,7 +26,11 @@ public override void Configure() public override async Task HandleAsync(RemoveRoleRequest req, CancellationToken ct) { - _logger.LogInformation("Removing role {Role} from user with id {Id}", req.Role.ToString(), req.Id.ToString()); + _logger.LogInformation( + "Removing role {Role} from user with id {Id}", + req.Role.ToString(), + req.Id.ToString() + ); if (!Enum.TryParse(req.Role, out var role)) { @@ -47,7 +51,11 @@ public override async Task HandleAsync(RemoveRoleRequest req, CancellationToken if (!await _userManager.IsInRoleAsync(user, role.ToString())) { - _logger.LogWarning("User with id {Id} does not have role {Role}", req.Id.ToString(), role.ToString()); + _logger.LogWarning( + "User with id {Id} does not have role {Role}", + req.Id.ToString(), + role.ToString() + ); var errors = $"User with id {req.Id.ToString()} does not have role {role}"; AddError(errors); } @@ -58,11 +66,15 @@ public override async Task HandleAsync(RemoveRoleRequest req, CancellationToken if (!result.Succeeded) { var errors = result - .Errors - .Select(e => e.Description) + .Errors.Select(e => e.Description) .Aggregate("Identity Errors: ", (a, b) => $"{a}, {b}"); - _logger.LogWarning("Failed to remove role {Role} from user with id {Id}: {errors}", role.ToString(), req.Id.ToString(), errors); + _logger.LogWarning( + "Failed to remove role {Role} from user with id {Id}: {errors}", + role.ToString(), + req.Id.ToString(), + errors + ); AddError(errors); } ThrowIfAnyErrors(); @@ -73,4 +85,4 @@ public override async Task HandleAsync(RemoveRoleRequest req, CancellationToken await SendOkAsync(response, ct); } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Models.cs b/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Models.cs index 1380e28..227bed2 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Models.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Models.cs @@ -6,4 +6,4 @@ public class RemoveRoleRequest public Guid Id { get; set; } = Guid.Empty; public string Role { get; set; } = string.Empty; -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Summary.cs b/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Summary.cs index 23e5d85..c0e34a7 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Summary.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Roles/Remove/Summary.cs @@ -6,7 +6,11 @@ public RemoveRoleSummary() { Summary = "Remove a role from a user"; Description = "Remove a role from a user by id"; - ExampleRequest = new RemoveRoleRequest { Id = Guid.NewGuid(), Role = ApplicationRole.Proposer.ToString() }; + ExampleRequest = new RemoveRoleRequest + { + Id = Guid.NewGuid(), + Role = 
ApplicationRole.Proposer.ToString() + }; Response( 200, "Role removed successfully", @@ -25,4 +29,4 @@ public RemoveRoleSummary() Response(401, "Unauthorized access"); Response(500, "Internal server error"); } -} \ No newline at end of file +} diff --git a/quetzalcoatl-auth/Api/Features/Users/Update/Endpoint.cs b/quetzalcoatl-auth/Api/Features/Users/Update/Endpoint.cs index 89d6473..5c93f03 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Update/Endpoint.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Update/Endpoint.cs @@ -28,8 +28,8 @@ public override async Task HandleAsync(UpdateUserRequest req, CancellationToken { _logger.LogInformation("Updating user with id {Id}", req.Id.ToString()); - var subClaim = User.Claims - .Where(c => c.Type == ClaimTypes.NameIdentifier) + var subClaim = User + .Claims.Where(c => c.Type == ClaimTypes.NameIdentifier) .Select(c => c.Value) .FirstOrDefault(); @@ -52,8 +52,7 @@ public override async Task HandleAsync(UpdateUserRequest req, CancellationToken if (!result.Succeeded) { var errors = result - .Errors - .Select(e => e.Description) + .Errors.Select(e => e.Description) .Aggregate("Identity Errors: ", (a, b) => $"{a}, {b}"); _logger.LogWarning( diff --git a/quetzalcoatl-auth/Api/Features/Users/Update/Mappers.cs b/quetzalcoatl-auth/Api/Features/Users/Update/Mappers.cs index 2dae3e8..a176cc1 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Update/Mappers.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Update/Mappers.cs @@ -61,9 +61,9 @@ public ApplicationUserToUpdateUserResponseProfile() .ForMember( dest => dest.ProfilePictureId, opt => - opt.MapFrom( - src => src.ProfilePicture != null ? src.ProfilePicture!.Id : null - ) + opt.MapFrom(src => + src.ProfilePicture != null ? src.ProfilePicture!.Id : null + ) ); } } diff --git a/quetzalcoatl-auth/Api/Features/Users/Update/Validators.cs b/quetzalcoatl-auth/Api/Features/Users/Update/Validators.cs index 44614df..f6bf1bb 100644 --- a/quetzalcoatl-auth/Api/Features/Users/Update/Validators.cs +++ b/quetzalcoatl-auth/Api/Features/Users/Update/Validators.cs @@ -8,7 +8,9 @@ public Validator() RuleFor(x => x.Username) .MinimumLength(ApplicationUserConsts.UsernameMinLength) - .WithMessage($"Username must be at least {ApplicationUserConsts.UsernameMinLength} characters long") + .WithMessage( + $"Username must be at least {ApplicationUserConsts.UsernameMinLength} characters long" + ) .When(x => !x.Username.IsNullOrEmpty()); RuleFor(x => x.Email) @@ -18,12 +20,16 @@ public Validator() RuleFor(x => x.Fullname) .MaximumLength(ApplicationUserConsts.FullnameMaxLength) - .WithMessage($"Fullname must be at most {ApplicationUserConsts.FullnameMaxLength} characters long") + .WithMessage( + $"Fullname must be at most {ApplicationUserConsts.FullnameMaxLength} characters long" + ) .When(x => !string.IsNullOrWhiteSpace(x.Fullname)); RuleFor(x => x.Bio) .MaximumLength(ApplicationUserConsts.BioMaxLength) - .WithMessage($"Bio must be at most {ApplicationUserConsts.BioMaxLength} characters long") + .WithMessage( + $"Bio must be at most {ApplicationUserConsts.BioMaxLength} characters long" + ) .When(x => !string.IsNullOrWhiteSpace(x.Bio)); RuleFor(x => x.ProfilePicture) @@ -35,7 +41,8 @@ public Validator() .When(x => x.ProfilePicture is not null); } - private static bool IsAllowedSize(long length) => length <= ApplicationUserConsts.ProfilePictureMaxLength; + private static bool IsAllowedSize(long length) => + length <= ApplicationUserConsts.ProfilePictureMaxLength; private static bool IsAllowedType(string contentType) => 
ApplicationUserConsts.AllowedProfilePictureTypes.Contains(contentType); diff --git a/quetzalcoatl-auth/Api/Usings.cs b/quetzalcoatl-auth/Api/Usings.cs index cd5ed68..02b0b22 100644 --- a/quetzalcoatl-auth/Api/Usings.cs +++ b/quetzalcoatl-auth/Api/Usings.cs @@ -7,6 +7,7 @@ global using Application.Features.Users.ValidateUserCredentials; global using AutoMapper; global using Domain.Configs; +global using Domain.Consts; global using Domain.Entities; global using Domain.Interfaces; global using FastEndpoints; @@ -21,4 +22,3 @@ global using Microsoft.Extensions.Options; global using Microsoft.IdentityModel.Tokens; global using IMapper = AutoMapper.IMapper; -global using Domain.Consts; diff --git a/quetzalcoatl-auth/Application/Features/Users/CreateUser/Handler.cs b/quetzalcoatl-auth/Application/Features/Users/CreateUser/Handler.cs index b82cedd..798d22d 100644 --- a/quetzalcoatl-auth/Application/Features/Users/CreateUser/Handler.cs +++ b/quetzalcoatl-auth/Application/Features/Users/CreateUser/Handler.cs @@ -30,8 +30,7 @@ public override async Task ExecuteAsync( if (!result.Succeeded) { var errors = result - .Errors - .Select(e => e.Description) + .Errors.Select(e => e.Description) .Aggregate("Identity Errors: ", (a, b) => $"{a}, {b}"); _logger.LogError( diff --git a/quetzalcoatl-auth/Bootstrapper/Bootstrapper.csproj b/quetzalcoatl-auth/Bootstrapper/Bootstrapper.csproj index eb3650d..5c415c3 100644 --- a/quetzalcoatl-auth/Bootstrapper/Bootstrapper.csproj +++ b/quetzalcoatl-auth/Bootstrapper/Bootstrapper.csproj @@ -9,6 +9,7 @@ + diff --git a/quetzalcoatl-auth/Bootstrapper/Extensions/ServiceCollectionExtensions.cs b/quetzalcoatl-auth/Bootstrapper/Extensions/ServiceCollectionExtensions.cs index d899933..6c21087 100644 --- a/quetzalcoatl-auth/Bootstrapper/Extensions/ServiceCollectionExtensions.cs +++ b/quetzalcoatl-auth/Bootstrapper/Extensions/ServiceCollectionExtensions.cs @@ -5,8 +5,8 @@ public static class ServiceCollectionExtensions public static void RemoveDbContext<T>(this IServiceCollection services) where T : DbContext { - var descriptor = services.SingleOrDefault( - d => d.ServiceType == typeof(DbContextOptions<T>) + var descriptor = services.SingleOrDefault(d => + d.ServiceType == typeof(DbContextOptions<T>) ); if (descriptor != null) services.Remove(descriptor); diff --git a/quetzalcoatl-auth/Bootstrapper/Program.cs b/quetzalcoatl-auth/Bootstrapper/Program.cs index 04db81e..bffeb5c 100644 --- a/quetzalcoatl-auth/Bootstrapper/Program.cs +++ b/quetzalcoatl-auth/Bootstrapper/Program.cs @@ -1,10 +1,7 @@ Log.Logger = new LoggerConfiguration() - .MinimumLevel - .Override("Microsoft", LogEventLevel.Information) - .Enrich - .FromLogContext() - .WriteTo - .Console() + .MinimumLevel.Override("Microsoft", LogEventLevel.Information) + .Enrich.FromLogContext() + .WriteTo.Console() .CreateLogger(); try @@ -13,6 +10,11 @@ var builder = WebApplication.CreateBuilder(args); + if (builder.Environment.IsEnvironment(SystemConsts.TestingEnvironment)) + { + _ = builder.Configuration.AddDotNetEnv(".env.template", LoadOptions.TraversePath()).Build(); + } + builder.Services.Configure<JwtConfig>(builder.Configuration.GetSection(nameof(JwtConfig))); builder.Services.Configure<AdminConfig>(builder.Configuration.GetSection(nameof(AdminConfig))); @@ -28,26 +30,26 @@ }; var dsnConnectionString = builder.Configuration.GetConnectionString("DefaultConnection"); - builder.Services.AddHealthChecks().AddSqlServer(dsnConnectionString!); + if (!builder.Environment.IsEnvironment(SystemConsts.TestingEnvironment)) + { 
builder.Services.AddHealthChecks().AddSqlServer(dsnConnectionString!); + } - builder - .Host - .UseSerilog( - (context, services, configuration) => - configuration - .ReadFrom - .Configuration(context.Configuration) - .ReadFrom - .Services(services) - ); + builder.Host.UseSerilog( + (context, services, configuration) => + configuration.ReadFrom.Configuration(context.Configuration).ReadFrom.Services(services) + ); builder - .Services - .AddDbContext(options => + .Services.AddDbContext(options => { - options.UseSqlServer(dsnConnectionString); - options.UseTriggers( - triggerOptions => triggerOptions.AddTrigger() + if (!builder.Environment.IsEnvironment(SystemConsts.TestingEnvironment)) + { + options.UseSqlServer(dsnConnectionString); + } + + options.UseTriggers(triggerOptions => + triggerOptions.AddTrigger() ); }) .AddScoped() @@ -66,12 +68,11 @@ var corsOrigins = builder.Configuration.GetSection("AllowedOrigins").Value?.Split(';'); Log.Information("Allowed origins: {CorsOrigins}", corsOrigins); builder - .Services - .AddCors(options => + .Services.AddCors(options => { - options.AddDefaultPolicy(builder => + options.AddDefaultPolicy(corsPolicyBuilder => { - builder + corsPolicyBuilder .WithOrigins( corsOrigins ?? new[] { "http://localhost:10000", "https://pantheonix.live" } ) @@ -80,6 +81,9 @@ .AllowCredentials(); }); }) + .AddSingleton(tokenValidationParameters) + .AddJWTBearerAuth(jwtConfig.SecretKey) + .AddAutoMapper(typeof(IApiMarker), typeof(IApplicationMarker)) .AddFastEndpoints(options => { options.DisableAutoDiscovery = true; @@ -88,14 +92,6 @@ typeof(IApiMarker).Assembly, typeof(IApplicationMarker).Assembly }; - }) - .AddSingleton(tokenValidationParameters) - .AddJWTBearerAuth(jwtConfig.SecretKey) - .AddAutoMapper(typeof(IApiMarker), typeof(IApplicationMarker)) - .AddSwaggerDoc(settings => - { - settings.Title = "Quetzalcoatl Auth API"; - settings.Version = "v1"; }); var app = builder.Build(); @@ -105,11 +101,17 @@ await app.UseSeedData(); } - app.MapHealthChecks( - "/_health", - new HealthCheckOptions { ResponseWriter = UIResponseWriter.WriteHealthCheckUIResponse, } - ) - .RequireHost("*:5210"); + if (!app.Environment.IsEnvironment(SystemConsts.TestingEnvironment)) + { + app.MapHealthChecks( + "/_health", + new HealthCheckOptions + { + ResponseWriter = UIResponseWriter.WriteHealthCheckUIResponse, + } + ) + .RequireHost("*:5210"); + } app.UseSerilogRequestLogging() .UseDefaultExceptionHandler() diff --git a/quetzalcoatl-auth/Bootstrapper/Usings.cs b/quetzalcoatl-auth/Bootstrapper/Usings.cs index 7ea42a5..a46420b 100644 --- a/quetzalcoatl-auth/Bootstrapper/Usings.cs +++ b/quetzalcoatl-auth/Bootstrapper/Usings.cs @@ -3,8 +3,11 @@ global using Application; global using Bootstrapper.Extensions; global using Domain.Configs; +global using Domain.Consts; global using Domain.Entities; global using Domain.Interfaces; +global using DotNetEnv; +global using DotNetEnv.Configuration; global using FastEndpoints; global using FastEndpoints.Security; global using FastEndpoints.Swagger; diff --git a/quetzalcoatl-auth/Dockerfile b/quetzalcoatl-auth/Dockerfile index 59a1b60..edfaa22 100644 --- a/quetzalcoatl-auth/Dockerfile +++ b/quetzalcoatl-auth/Dockerfile @@ -23,16 +23,6 @@ COPY Infrastructure/ Infrastructure/ WORKDIR /src/Bootstrapper RUN dotnet build -c release --no-restore -# Test Stage -# FROM build AS test -# LABEL stage=builder -# WORKDIR /src - -# COPY Tests.Integration/*.csproj Tests.Integration/ -# RUN dotnet restore Tests.Integration/Tests.Integration.csproj - -# ENTRYPOINT ["dotnet", 
"test", "--no-build", "--no-restore", "--verbosity=normal", "--logger:trx"] - # Publish Stage FROM build AS publish LABEL stage=builder diff --git a/quetzalcoatl-auth/Domain/Consts/ApplicationUserConsts.cs b/quetzalcoatl-auth/Domain/Consts/ApplicationUserConsts.cs index ebd3b1a..b50474b 100644 --- a/quetzalcoatl-auth/Domain/Consts/ApplicationUserConsts.cs +++ b/quetzalcoatl-auth/Domain/Consts/ApplicationUserConsts.cs @@ -9,5 +9,10 @@ public static class ApplicationUserConsts public const int BioMaxLength = 300; public const int ProfilePictureMaxLength = 10_000_000; public const string PasswordRegex = @"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[^\da-zA-Z]).{6,20}$"; - public static readonly string[] AllowedProfilePictureTypes = { "image/png", "image/jpeg", "image/jpg" }; -} \ No newline at end of file + public static readonly string[] AllowedProfilePictureTypes = + { + "image/png", + "image/jpeg", + "image/jpg" + }; +} diff --git a/quetzalcoatl-auth/Domain/Consts/SystemConsts.cs b/quetzalcoatl-auth/Domain/Consts/SystemConsts.cs new file mode 100644 index 0000000..b25eda7 --- /dev/null +++ b/quetzalcoatl-auth/Domain/Consts/SystemConsts.cs @@ -0,0 +1,6 @@ +namespace Domain.Consts; + +public static class SystemConsts +{ + public const string TestingEnvironment = "Testing"; +} diff --git a/quetzalcoatl-auth/Infrastructure/ApplicationDbContext.cs b/quetzalcoatl-auth/Infrastructure/ApplicationDbContext.cs index 767f298..346d8e9 100644 --- a/quetzalcoatl-auth/Infrastructure/ApplicationDbContext.cs +++ b/quetzalcoatl-auth/Infrastructure/ApplicationDbContext.cs @@ -26,7 +26,8 @@ protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder) { base.OnConfiguring(optionsBuilder); - if (optionsBuilder.IsConfigured) return; + if (optionsBuilder.IsConfigured) + return; var connectionString = Environment.GetEnvironmentVariable("QUETZALCOATL_DSN"); optionsBuilder.UseSqlServer(connectionString!); diff --git a/quetzalcoatl-auth/Infrastructure/Infrastructure.csproj b/quetzalcoatl-auth/Infrastructure/Infrastructure.csproj index d89d3e1..d537e21 100644 --- a/quetzalcoatl-auth/Infrastructure/Infrastructure.csproj +++ b/quetzalcoatl-auth/Infrastructure/Infrastructure.csproj @@ -26,4 +26,4 @@ - + \ No newline at end of file diff --git a/quetzalcoatl-auth/Infrastructure/Migrations/20230315154826_AddGuidAsPKForIdentityUser.cs b/quetzalcoatl-auth/Infrastructure/Migrations/20230315154826_AddGuidAsPKForIdentityUser.cs index 3e1da52..b937e98 100644 --- a/quetzalcoatl-auth/Infrastructure/Migrations/20230315154826_AddGuidAsPKForIdentityUser.cs +++ b/quetzalcoatl-auth/Infrastructure/Migrations/20230315154826_AddGuidAsPKForIdentityUser.cs @@ -13,29 +13,25 @@ protected override void Up(MigrationBuilder migrationBuilder) { migrationBuilder.CreateTable( name: "AspNetRoles", - columns: table => - new - { - Id = table.Column( - type: "uniqueidentifier", - nullable: false, - defaultValueSql: "newsequentialid()" - ), - Name = table.Column( - type: "nvarchar(256)", - maxLength: 256, - nullable: true - ), - NormalizedName = table.Column( - type: "nvarchar(256)", - maxLength: 256, - nullable: true - ), - ConcurrencyStamp = table.Column( - type: "nvarchar(max)", - nullable: true - ) - }, + columns: table => new + { + Id = table.Column( + type: "uniqueidentifier", + nullable: false, + defaultValueSql: "newsequentialid()" + ), + Name = table.Column( + type: "nvarchar(256)", + maxLength: 256, + nullable: true + ), + NormalizedName = table.Column( + type: "nvarchar(256)", + maxLength: 256, + nullable: true + ), + 
ConcurrencyStamp = table.Column(type: "nvarchar(max)", nullable: true) + }, constraints: table => { table.PrimaryKey("PK_AspNetRoles", x => x.Id); @@ -44,51 +40,47 @@ protected override void Up(MigrationBuilder migrationBuilder) migrationBuilder.CreateTable( name: "AspNetUsers", - columns: table => - new - { - Id = table.Column( - type: "uniqueidentifier", - nullable: false, - defaultValueSql: "newsequentialid()" - ), - UserName = table.Column( - type: "nvarchar(256)", - maxLength: 256, - nullable: true - ), - NormalizedUserName = table.Column( - type: "nvarchar(256)", - maxLength: 256, - nullable: true - ), - Email = table.Column( - type: "nvarchar(256)", - maxLength: 256, - nullable: true - ), - NormalizedEmail = table.Column( - type: "nvarchar(256)", - maxLength: 256, - nullable: true - ), - EmailConfirmed = table.Column(type: "bit", nullable: false), - PasswordHash = table.Column(type: "nvarchar(max)", nullable: true), - SecurityStamp = table.Column(type: "nvarchar(max)", nullable: true), - ConcurrencyStamp = table.Column( - type: "nvarchar(max)", - nullable: true - ), - PhoneNumber = table.Column(type: "nvarchar(max)", nullable: true), - PhoneNumberConfirmed = table.Column(type: "bit", nullable: false), - TwoFactorEnabled = table.Column(type: "bit", nullable: false), - LockoutEnd = table.Column( - type: "datetimeoffset", - nullable: true - ), - LockoutEnabled = table.Column(type: "bit", nullable: false), - AccessFailedCount = table.Column(type: "int", nullable: false) - }, + columns: table => new + { + Id = table.Column( + type: "uniqueidentifier", + nullable: false, + defaultValueSql: "newsequentialid()" + ), + UserName = table.Column( + type: "nvarchar(256)", + maxLength: 256, + nullable: true + ), + NormalizedUserName = table.Column( + type: "nvarchar(256)", + maxLength: 256, + nullable: true + ), + Email = table.Column( + type: "nvarchar(256)", + maxLength: 256, + nullable: true + ), + NormalizedEmail = table.Column( + type: "nvarchar(256)", + maxLength: 256, + nullable: true + ), + EmailConfirmed = table.Column(type: "bit", nullable: false), + PasswordHash = table.Column(type: "nvarchar(max)", nullable: true), + SecurityStamp = table.Column(type: "nvarchar(max)", nullable: true), + ConcurrencyStamp = table.Column(type: "nvarchar(max)", nullable: true), + PhoneNumber = table.Column(type: "nvarchar(max)", nullable: true), + PhoneNumberConfirmed = table.Column(type: "bit", nullable: false), + TwoFactorEnabled = table.Column(type: "bit", nullable: false), + LockoutEnd = table.Column( + type: "datetimeoffset", + nullable: true + ), + LockoutEnabled = table.Column(type: "bit", nullable: false), + AccessFailedCount = table.Column(type: "int", nullable: false) + }, constraints: table => { table.PrimaryKey("PK_AspNetUsers", x => x.Id); @@ -97,16 +89,15 @@ protected override void Up(MigrationBuilder migrationBuilder) migrationBuilder.CreateTable( name: "AspNetRoleClaims", - columns: table => - new - { - Id = table - .Column(type: "int", nullable: false) - .Annotation("SqlServer:Identity", "1, 1"), - RoleId = table.Column(type: "uniqueidentifier", nullable: false), - ClaimType = table.Column(type: "nvarchar(max)", nullable: true), - ClaimValue = table.Column(type: "nvarchar(max)", nullable: true) - }, + columns: table => new + { + Id = table + .Column(type: "int", nullable: false) + .Annotation("SqlServer:Identity", "1, 1"), + RoleId = table.Column(type: "uniqueidentifier", nullable: false), + ClaimType = table.Column(type: "nvarchar(max)", nullable: true), + ClaimValue = table.Column(type: 
"nvarchar(max)", nullable: true) + }, constraints: table => { table.PrimaryKey("PK_AspNetRoleClaims", x => x.Id); @@ -122,16 +113,15 @@ protected override void Up(MigrationBuilder migrationBuilder) migrationBuilder.CreateTable( name: "AspNetUserClaims", - columns: table => - new - { - Id = table - .Column(type: "int", nullable: false) - .Annotation("SqlServer:Identity", "1, 1"), - UserId = table.Column(type: "uniqueidentifier", nullable: false), - ClaimType = table.Column(type: "nvarchar(max)", nullable: true), - ClaimValue = table.Column(type: "nvarchar(max)", nullable: true) - }, + columns: table => new + { + Id = table + .Column(type: "int", nullable: false) + .Annotation("SqlServer:Identity", "1, 1"), + UserId = table.Column(type: "uniqueidentifier", nullable: false), + ClaimType = table.Column(type: "nvarchar(max)", nullable: true), + ClaimValue = table.Column(type: "nvarchar(max)", nullable: true) + }, constraints: table => { table.PrimaryKey("PK_AspNetUserClaims", x => x.Id); @@ -147,20 +137,16 @@ protected override void Up(MigrationBuilder migrationBuilder) migrationBuilder.CreateTable( name: "AspNetUserLogins", - columns: table => - new - { - LoginProvider = table.Column( - type: "nvarchar(450)", - nullable: false - ), - ProviderKey = table.Column(type: "nvarchar(450)", nullable: false), - ProviderDisplayName = table.Column( - type: "nvarchar(max)", - nullable: true - ), - UserId = table.Column(type: "uniqueidentifier", nullable: false) - }, + columns: table => new + { + LoginProvider = table.Column(type: "nvarchar(450)", nullable: false), + ProviderKey = table.Column(type: "nvarchar(450)", nullable: false), + ProviderDisplayName = table.Column( + type: "nvarchar(max)", + nullable: true + ), + UserId = table.Column(type: "uniqueidentifier", nullable: false) + }, constraints: table => { table.PrimaryKey( @@ -179,12 +165,11 @@ protected override void Up(MigrationBuilder migrationBuilder) migrationBuilder.CreateTable( name: "AspNetUserRoles", - columns: table => - new - { - UserId = table.Column(type: "uniqueidentifier", nullable: false), - RoleId = table.Column(type: "uniqueidentifier", nullable: false) - }, + columns: table => new + { + UserId = table.Column(type: "uniqueidentifier", nullable: false), + RoleId = table.Column(type: "uniqueidentifier", nullable: false) + }, constraints: table => { table.PrimaryKey("PK_AspNetUserRoles", x => new { x.UserId, x.RoleId }); @@ -207,28 +192,23 @@ protected override void Up(MigrationBuilder migrationBuilder) migrationBuilder.CreateTable( name: "AspNetUserTokens", - columns: table => - new - { - UserId = table.Column(type: "uniqueidentifier", nullable: false), - LoginProvider = table.Column( - type: "nvarchar(450)", - nullable: false - ), - Name = table.Column(type: "nvarchar(450)", nullable: false), - Value = table.Column(type: "nvarchar(max)", nullable: true) - }, + columns: table => new + { + UserId = table.Column(type: "uniqueidentifier", nullable: false), + LoginProvider = table.Column(type: "nvarchar(450)", nullable: false), + Name = table.Column(type: "nvarchar(450)", nullable: false), + Value = table.Column(type: "nvarchar(max)", nullable: true) + }, constraints: table => { table.PrimaryKey( "PK_AspNetUserTokens", - x => - new - { - x.UserId, - x.LoginProvider, - x.Name - } + x => new + { + x.UserId, + x.LoginProvider, + x.Name + } ); table.ForeignKey( name: "FK_AspNetUserTokens_AspNetUsers_UserId", diff --git a/quetzalcoatl-auth/Infrastructure/Migrations/20230319104138_AddProfileImageToIdentityUser.cs 
b/quetzalcoatl-auth/Infrastructure/Migrations/20230319104138_AddProfileImageToIdentityUser.cs index d0e1071..a1f3296 100644 --- a/quetzalcoatl-auth/Infrastructure/Migrations/20230319104138_AddProfileImageToIdentityUser.cs +++ b/quetzalcoatl-auth/Infrastructure/Migrations/20230319104138_AddProfileImageToIdentityUser.cs @@ -13,13 +13,12 @@ protected override void Up(MigrationBuilder migrationBuilder) { migrationBuilder.CreateTable( name: "Pictures", - columns: table => - new - { - Id = table.Column(type: "uniqueidentifier", nullable: false), - Data = table.Column(type: "varbinary(max)", nullable: false), - UserId = table.Column(type: "uniqueidentifier", nullable: false) - }, + columns: table => new + { + Id = table.Column(type: "uniqueidentifier", nullable: false), + Data = table.Column(type: "varbinary(max)", nullable: false), + UserId = table.Column(type: "uniqueidentifier", nullable: false) + }, constraints: table => { table.PrimaryKey("PK_Pictures", x => x.Id); diff --git a/quetzalcoatl-auth/Infrastructure/Migrations/20230509173144_AddRefreshTokenEntity.cs b/quetzalcoatl-auth/Infrastructure/Migrations/20230509173144_AddRefreshTokenEntity.cs index 5fef439..a0e4044 100644 --- a/quetzalcoatl-auth/Infrastructure/Migrations/20230509173144_AddRefreshTokenEntity.cs +++ b/quetzalcoatl-auth/Infrastructure/Migrations/20230509173144_AddRefreshTokenEntity.cs @@ -13,14 +13,13 @@ protected override void Up(MigrationBuilder migrationBuilder) { migrationBuilder.CreateTable( name: "RefreshTokens", - columns: table => - new - { - Id = table.Column(type: "uniqueidentifier", nullable: false), - Token = table.Column(type: "nvarchar(max)", nullable: false), - ExpiryDate = table.Column(type: "datetime2", nullable: false), - UserId = table.Column(type: "uniqueidentifier", nullable: false) - }, + columns: table => new + { + Id = table.Column(type: "uniqueidentifier", nullable: false), + Token = table.Column(type: "nvarchar(max)", nullable: false), + ExpiryDate = table.Column(type: "datetime2", nullable: false), + UserId = table.Column(type: "uniqueidentifier", nullable: false) + }, constraints: table => { table.PrimaryKey("PK_RefreshTokens", x => x.Id); diff --git a/quetzalcoatl-auth/Infrastructure/Migrations/20230525105824_UpdateRefreshTokenEntity.cs b/quetzalcoatl-auth/Infrastructure/Migrations/20230525105824_UpdateRefreshTokenEntity.cs index fb438f1..f057092 100644 --- a/quetzalcoatl-auth/Infrastructure/Migrations/20230525105824_UpdateRefreshTokenEntity.cs +++ b/quetzalcoatl-auth/Infrastructure/Migrations/20230525105824_UpdateRefreshTokenEntity.cs @@ -13,16 +13,15 @@ protected override void Up(MigrationBuilder migrationBuilder) { migrationBuilder.CreateTable( name: "RefreshTokens", - columns: table => - new - { - Token = table.Column(type: "uniqueidentifier", nullable: false), - ExpiryDate = table.Column(type: "datetime2", nullable: false), - IsUsed = table.Column(type: "bit", nullable: false), - IsInvalidated = table.Column(type: "bit", nullable: false), - Jti = table.Column(type: "uniqueidentifier", nullable: false), - UserId = table.Column(type: "uniqueidentifier", nullable: false) - }, + columns: table => new + { + Token = table.Column(type: "uniqueidentifier", nullable: false), + ExpiryDate = table.Column(type: "datetime2", nullable: false), + IsUsed = table.Column(type: "bit", nullable: false), + IsInvalidated = table.Column(type: "bit", nullable: false), + Jti = table.Column(type: "uniqueidentifier", nullable: false), + UserId = table.Column(type: "uniqueidentifier", nullable: false) + }, constraints: 
table => { table.PrimaryKey("PK_RefreshTokens", x => x.Token); diff --git a/quetzalcoatl-auth/Infrastructure/Migrations/20231201120945_RemoveRedundantFieldsFromRefreshTokens.cs b/quetzalcoatl-auth/Infrastructure/Migrations/20231201120945_RemoveRedundantFieldsFromRefreshTokens.cs index a84ea53..24aeddb 100644 --- a/quetzalcoatl-auth/Infrastructure/Migrations/20231201120945_RemoveRedundantFieldsFromRefreshTokens.cs +++ b/quetzalcoatl-auth/Infrastructure/Migrations/20231201120945_RemoveRedundantFieldsFromRefreshTokens.cs @@ -11,49 +11,45 @@ public partial class RemoveRedundantFieldsFromRefreshTokens : Migration /// protected override void Up(MigrationBuilder migrationBuilder) { - migrationBuilder.DropPrimaryKey( - name: "PK_RefreshTokens", - table: "RefreshTokens"); + migrationBuilder.DropPrimaryKey(name: "PK_RefreshTokens", table: "RefreshTokens"); - migrationBuilder.DropColumn( - name: "Jti", - table: "RefreshTokens"); + migrationBuilder.DropColumn(name: "Jti", table: "RefreshTokens"); - migrationBuilder.DropColumn( - name: "IsUsed", - table: "RefreshTokens"); + migrationBuilder.DropColumn(name: "IsUsed", table: "RefreshTokens"); migrationBuilder.AddPrimaryKey( name: "PK_RefreshTokens", table: "RefreshTokens", - column: "Token"); + column: "Token" + ); } /// protected override void Down(MigrationBuilder migrationBuilder) { - migrationBuilder.DropPrimaryKey( - name: "PK_RefreshTokens", - table: "RefreshTokens"); + migrationBuilder.DropPrimaryKey(name: "PK_RefreshTokens", table: "RefreshTokens"); migrationBuilder.AddColumn( name: "Jti", table: "RefreshTokens", type: "uniqueidentifier", nullable: false, - defaultValue: new Guid("00000000-0000-0000-0000-000000000000")); + defaultValue: new Guid("00000000-0000-0000-0000-000000000000") + ); migrationBuilder.AddColumn( name: "IsUsed", table: "RefreshTokens", type: "bit", nullable: false, - defaultValue: false); + defaultValue: false + ); migrationBuilder.AddPrimaryKey( name: "PK_RefreshTokens", table: "RefreshTokens", - columns: new[] { "Token", "Jti" }); + columns: new[] { "Token", "Jti" } + ); } } } diff --git a/quetzalcoatl-auth/Infrastructure/Triggers/DeleteStaleRefreshTokens.cs b/quetzalcoatl-auth/Infrastructure/Triggers/DeleteStaleRefreshTokens.cs index 81e3503..e2b0c51 100644 --- a/quetzalcoatl-auth/Infrastructure/Triggers/DeleteStaleRefreshTokens.cs +++ b/quetzalcoatl-auth/Infrastructure/Triggers/DeleteStaleRefreshTokens.cs @@ -17,8 +17,8 @@ CancellationToken cancellationToken { if (context.ChangeType is ChangeType.Added or ChangeType.Modified) { - await _tokenRepository.DeleteRefreshTokenAsync( - token => token.IsInvalidated || token.ExpiryDate < DateTime.UtcNow + await _tokenRepository.DeleteRefreshTokenAsync(token => + token.IsInvalidated || token.ExpiryDate < DateTime.UtcNow ); } } diff --git a/quetzalcoatl-auth/Tests.Integration/Api/Features/Auth/RegisterEndpointTests.cs b/quetzalcoatl-auth/Tests.Integration/Api/Features/Auth/RegisterEndpointTests.cs index 656c93f..941812c 100644 --- a/quetzalcoatl-auth/Tests.Integration/Api/Features/Auth/RegisterEndpointTests.cs +++ b/quetzalcoatl-auth/Tests.Integration/Api/Features/Auth/RegisterEndpointTests.cs @@ -1,3 +1,5 @@ +using System.Net.Http.Formatting; + namespace Tests.Integration.Api.Features.Auth; public class RegisterEndpointTests : IClassFixture @@ -37,13 +39,13 @@ public async Task GivenValidUser_WhenRegistering_ThenReturnsCreated() "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new MediaTypeWithQualityHeaderValue("application/json")); - _client - 
.DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); @@ -108,13 +110,13 @@ public async Task GivenInvalidUser_WhenRegistering_ThenReturnsBadRequest() "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new MediaTypeWithQualityHeaderValue("application/json")); - _client - .DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); @@ -144,11 +146,25 @@ public async Task GivenInvalidUser_WhenRegistering_ThenReturnsBadRequest() // check if AccessToken and RefreshToken cookies exist response.Headers.TryGetValues("Set-Cookie", out _).Should().BeFalse(); - var result = await response.Content.ReadAsAsync(); + var formatters = new MediaTypeFormatterCollection(); + formatters.JsonFormatter.SupportedMediaTypes.Add( + new MediaTypeHeaderValue("application/json") + ); + formatters.JsonFormatter.SupportedMediaTypes.Add( + new MediaTypeHeaderValue("application/problem+json") + ); + + var result = await response.Content.ReadAsAsync(formatters: formatters); result.Should().NotBeNull(); - result!.Errors.Keys.Should().Contain(nameof(request.Password)); - result.Errors.Keys.Should().Contain(nameof(request.ProfilePicture)); + result! + .Errors.Keys.Select(r => r.ToUpper()) + .Should() + .Contain(nameof(request.Password).ToUpper()); + result + .Errors.Keys.Select(r => r.ToUpper()) + .Should() + .Contain(nameof(request.ProfilePicture).ToUpper()); #endregion } diff --git a/quetzalcoatl-auth/Tests.Integration/Api/Features/Images/GetImageEndpointTests.cs b/quetzalcoatl-auth/Tests.Integration/Api/Features/Images/GetImageEndpointTests.cs index 480ac9f..337737d 100644 --- a/quetzalcoatl-auth/Tests.Integration/Api/Features/Images/GetImageEndpointTests.cs +++ b/quetzalcoatl-auth/Tests.Integration/Api/Features/Images/GetImageEndpointTests.cs @@ -21,45 +21,6 @@ public GetImageEndpointTests(ApiWebFactory apiWebFactory) #endregion - [Fact] - public async Task GivenAnonymousUser_WhenGettingImage_ThenReturnsUnauthorized() - { - #region Arrange - - using var scope = _apiWebFactory.Services.CreateScope(); - var userManager = scope.ServiceProvider.GetRequiredService>(); - - var profilePictureData = await ImageHelpers.GetImageAsByteArrayAsync( - "https://picsum.photos/200" - ); - var profilePicture = new Picture { Data = profilePictureData }; - - var applicationUser = _applicationUserFaker - .Clone() - .RuleFor(rule => rule.ProfilePicture, profilePicture) - .Generate(); - - const string validPassword = "P@ssw0rd!"; - await userManager.CreateAsync(applicationUser, validPassword); - - var request = new GetImageRequest { Id = applicationUser.ProfilePicture!.Id }; - - #endregion - - #region Act - - var response = await _client.GETAsync(request); - - #endregion - - #region Assert - - response.Should().NotBeNull(); - response.StatusCode.Should().Be(HttpStatusCode.Unauthorized); - - #endregion - } - [Fact] public async Task GivenAuthorizedUserAndNonExistingImageId_WhenGettingImage_ThenReturnsNotFound() { diff --git 
a/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/DeleteEndpointTests.cs b/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/DeleteEndpointTests.cs index d4e76b5..4b03052 100644 --- a/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/DeleteEndpointTests.cs +++ b/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/DeleteEndpointTests.cs @@ -41,6 +41,7 @@ public async Task GivenAnonymousUser_WhenDeletingUser_ThenReturnsUnauthorized() const string validPassword = "P@ssw0rd!"; await userManager.CreateAsync(applicationUser, validPassword); + var users = await userManager.Users.ToListAsync(); var request = new DeleteUserRequest { Id = applicationUser.Id }; @@ -132,9 +133,9 @@ public async Task GivenAuthorizedUserAndIdOfNonExistingUser_WhenDeletingUser_The using var scope = _apiWebFactory.Services.CreateScope(); var userManager = scope.ServiceProvider.GetRequiredService>(); - var roleManager = scope - .ServiceProvider - .GetRequiredService>>(); + var roleManager = scope.ServiceProvider.GetRequiredService< + RoleManager> + >(); await roleManager.CreateAsync(new IdentityRole(ApplicationRole.Admin.ToString())); @@ -204,9 +205,9 @@ public async Task GivenAuthorizedUser_WhenDeletingUser_ThenReturnsNoContent() using var scope = _apiWebFactory.Services.CreateScope(); var userManager = scope.ServiceProvider.GetRequiredService>(); - var roleManager = scope - .ServiceProvider - .GetRequiredService>>(); + var roleManager = scope.ServiceProvider.GetRequiredService< + RoleManager> + >(); await roleManager.CreateAsync(new IdentityRole(ApplicationRole.Admin.ToString())); diff --git a/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/GetAllEndpointTests.cs b/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/GetAllEndpointTests.cs index fd8ad46..f066d74 100644 --- a/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/GetAllEndpointTests.cs +++ b/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/GetAllEndpointTests.cs @@ -29,9 +29,6 @@ public async Task GivenAuthorizedUser_WhenGettingAllUsers_ThenReturnsOk() using var scope = _apiWebFactory.Services.CreateScope(); var userManager = scope.ServiceProvider.GetRequiredService>(); - var existingUsers = await userManager.Users.ToListAsync(); - await userManager.DeleteAsync(existingUsers.ElementAt(0)); - var profilePictureData = await ImageHelpers.GetImageAsByteArrayAsync( "https://picsum.photos/200" ); diff --git a/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/UpdateEndpointTests.cs b/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/UpdateEndpointTests.cs index 0f2584b..e15ae38 100644 --- a/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/UpdateEndpointTests.cs +++ b/quetzalcoatl-auth/Tests.Integration/Api/Features/Users/UpdateEndpointTests.cs @@ -47,13 +47,13 @@ public async Task GivenAnonymousUser_WhenUpdatingUser_ThenReturnsUnauthorized() "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new MediaTypeWithQualityHeaderValue("application/json")); - _client - .DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); @@ -128,13 +128,13 @@ public async Task GivenAuthorizedUserAndInvalidRequest_WhenUpdatingUser_ThenRetu "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new 
MediaTypeWithQualityHeaderValue("application/json")); - _client - .DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); @@ -229,13 +229,13 @@ public async Task GivenAuthorizedUserAndRequestForUpdatingOtherUserThanSelf_When "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new MediaTypeWithQualityHeaderValue("application/json")); - _client - .DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); @@ -331,13 +331,13 @@ public async Task GivenAuthorizedUserAndValidRequestForPartialUpdate_WhenUpdatin "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new MediaTypeWithQualityHeaderValue("application/json")); - _client - .DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); @@ -439,13 +439,13 @@ public async Task GivenAuthorizedUserAndValidRequest_WhenUpdatingUser_ThenReturn "demo.jpg" ); - _client - .DefaultRequestHeaders - .Accept - .Add(new MediaTypeWithQualityHeaderValue("application/json")); - _client - .DefaultRequestHeaders - .TryAddWithoutValidation("Content-Type", "multipart/form-data"); + _client.DefaultRequestHeaders.Accept.Add( + new MediaTypeWithQualityHeaderValue("application/json") + ); + _client.DefaultRequestHeaders.TryAddWithoutValidation( + "Content-Type", + "multipart/form-data" + ); var requestForm = new MultipartFormDataContent(); diff --git a/quetzalcoatl-auth/Tests.Integration/Core/ApiWebFactory.cs b/quetzalcoatl-auth/Tests.Integration/Core/ApiWebFactory.cs index e5208a8..8979f5e 100644 --- a/quetzalcoatl-auth/Tests.Integration/Core/ApiWebFactory.cs +++ b/quetzalcoatl-auth/Tests.Integration/Core/ApiWebFactory.cs @@ -1,9 +1,11 @@ +using DotNet.Testcontainers.Builders; + namespace Tests.Integration.Core; public class ApiWebFactory : WebApplicationFactory, IAsyncLifetime { private readonly MsSqlContainer _database = new MsSqlBuilder() - .WithImage("mcr.microsoft.com/mssql/server:2022-latest") + .WithImage("mcr.microsoft.com/mssql/server:2022-CU14-ubuntu-22.04") .Build(); protected override void ConfigureWebHost(IWebHostBuilder builder) @@ -13,7 +15,7 @@ protected override void ConfigureWebHost(IWebHostBuilder builder) services.RemoveDbContext(); services.AddDbContext(options => { - options.UseSqlServer(_database.GetConnectionString()); + options.UseSqlServer($"{_database.GetConnectionString()};MultipleActiveResultSets=true"); }); services.ApplyMigrations(); }); diff --git a/quetzalcoatl-auth/global.json b/quetzalcoatl-auth/global.json index 7cd6a1f..dee0d43 100644 --- a/quetzalcoatl-auth/global.json +++ b/quetzalcoatl-auth/global.json @@ -1,6 +1,6 @@ { "sdk": { - "version": "7.0.0", + "version": "7.0.119", "rollForward": "latestMajor", "allowPrerelease": true } diff --git 
a/release-please-config.json b/release-please-config.json new file mode 100644 index 0000000..49d2612 --- /dev/null +++ b/release-please-config.json @@ -0,0 +1,37 @@ +{ + "include-component-in-tag": true, + "include-v-in-tag": true, + "tag-separator": "/", + "separate-pull-requests": true, + "packages": { + "quetzalcoatl": { + "path": "quetzalcoatl-auth", + "release-type": "simple" + }, + "enki": { + "path": "enki-problems", + "release-type": "simple" + }, + "hermes": { + "path": "hermes-tests", + "release-type": "dart" + }, + "anubis": { + "path": "anubis-eval", + "release-type": "rust" + }, + "odin": { + "path": "odin-gateway", + "release-type": "simple" + }, + "dapr": { + "path": "dapr", + "release-type": "simple" + }, + "eval-lb": { + "path": "anubis-eval/eval-lb", + "release-type": "simple" + } + }, + "$schema": "https://raw.githubusercontent.com/googleapis/release-please/main/schemas/config.json" +} \ No newline at end of file diff --git a/seeder/.dockerignore b/seeder/.dockerignore new file mode 100644 index 0000000..57600d9 --- /dev/null +++ b/seeder/.dockerignore @@ -0,0 +1,34 @@ +# Include any files or directories that you don't want to be copied to your +# container here (e.g., local build artifacts, temporary files, etc.). +# +# For more help, visit the .dockerignore file reference guide at +# https://docs.docker.com/go/build-context-dockerignore/ + +**/.DS_Store +**/.classpath +**/.dockerignore +**/.env +**/.git +**/.gitignore +**/.project +**/.settings +**/.toolstarget +**/.vs +**/.vscode +**/*.*proj.user +**/*.dbmdl +**/*.jfm +**/bin +**/charts +**/docker-compose* +**/compose.y*ml +**/Dockerfile* +**/node_modules +**/npm-debug.log +**/obj +**/secrets.dev.yaml +**/values.dev.yaml +**/.idea +LICENSE +README.md +fixtures.yaml diff --git a/seeder/.gitignore b/seeder/.gitignore new file mode 100644 index 0000000..c2efc81 --- /dev/null +++ b/seeder/.gitignore @@ -0,0 +1 @@ +.devcontainer/ \ No newline at end of file diff --git a/seeder/Dockerfile b/seeder/Dockerfile new file mode 100644 index 0000000..4ff8d4a --- /dev/null +++ b/seeder/Dockerfile @@ -0,0 +1,66 @@ +# Create a stage for building the application. +ARG GO_VERSION=1.22 +FROM --platform=$BUILDPLATFORM golang:${GO_VERSION} AS build +WORKDIR /src + +# Download dependencies as a separate step to take advantage of Docker's caching. +# Leverage a cache mount to /go/pkg/mod/ to speed up subsequent builds. +# Leverage bind mounts to go.sum and go.mod to avoid having to copy them into +# the container. +RUN --mount=type=cache,target=/go/pkg/mod/ \ + --mount=type=bind,source=go.sum,target=go.sum \ + --mount=type=bind,source=go.mod,target=go.mod \ + go mod download -x + +# This is the architecture you're building for, which is passed in by the builder. +# Placing it here allows the previous steps to be cached across architectures. +ARG TARGETARCH + +# Build the application. +# Leverage a cache mount to /go/pkg/mod/ to speed up subsequent builds. +# Leverage a bind mount to the current directory to avoid having to copy the +# source code into the container. +RUN --mount=type=cache,target=/go/pkg/mod/ \ + --mount=type=bind,target=. \ + CGO_ENABLED=0 GOARCH=$TARGETARCH go build -o /bin/seeder . + +################################################################################ +# Create a new stage for running the application that contains the minimal +# runtime dependencies for the application. This often uses a different base +# image from the build stage where the necessary files are copied from the build +# stage. 
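+#
+# Usage sketch (an assumed invocation, for illustration only): BUILDPLATFORM
+# and TARGETARCH referenced in the build stage above are populated
+# automatically by BuildKit when the image is built with buildx, e.g.
+#   docker buildx build --platform linux/amd64,linux/arm64 -t seeder:local .
+# where the "seeder:local" tag is just an illustrative placeholder.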
+#
+# The example below uses the alpine image as the foundation for running the app.
+# By specifying the "latest" tag, it will also use whatever happens to be the
+# most recent version of that image when you build your Dockerfile. If
+# reproducibility is important, consider using a versioned tag
+# (e.g., alpine:3.17.2) or SHA (e.g., alpine@sha256:c41ab5c992deb4fe7e5da09f67a8804a46bd0592bfdf0b1847dde0e0889d2bff).
+FROM alpine:latest AS final
+
+# Install any runtime dependencies that are needed to run your application.
+# Leverage a cache mount to /var/cache/apk/ to speed up subsequent builds.
+RUN --mount=type=cache,target=/var/cache/apk \
+    apk --update add \
+        ca-certificates \
+        tzdata \
+        && \
+        update-ca-certificates
+
+# Create a non-privileged user that the app will run under.
+# See https://docs.docker.com/go/dockerfile-user-best-practices/
+ARG UID=10001
+RUN adduser \
+    --disabled-password \
+    --gecos "" \
+    --home "/nonexistent" \
+    --shell "/sbin/nologin" \
+    --no-create-home \
+    --uid "${UID}" \
+    appuser
+USER appuser
+
+# Copy the executable from the "build" stage.
+COPY --from=build /bin/seeder /bin/
+
+# What the container should run when it is started.
+ENTRYPOINT [ "/bin/seeder" ]
diff --git a/seeder/fixtures.go b/seeder/fixtures.go
new file mode 100644
index 0000000..ca74938
--- /dev/null
+++ b/seeder/fixtures.go
@@ -0,0 +1,71 @@
+package main
+
+import (
+    "fmt"
+    "gopkg.in/yaml.v3"
+    "os"
+)
+
+type Fixtures struct {
+    BaseUrl  string          `yaml:"base_url"`
+    Users    UserFixtures    `yaml:"users"`
+    Problems ProblemFixtures `yaml:"problems"`
+}
+
+type UserFixtures struct {
+    Endpoints struct {
+        Register string `yaml:"register"`
+        Login    string `yaml:"login"`
+    }
+    Data []*UserData `yaml:"data,flow"`
+}
+
+type UserData struct {
+    Email    string `yaml:"email" json:"email"`
+    Password string `yaml:"password" json:"password"`
+}
+
+type ProblemFixtures struct {
+    Endpoints struct {
+        Create           string `yaml:"create"`
+        Update           string `yaml:"update"`
+        CreateTest       string `yaml:"create_test"`
+        CreateSubmission string `yaml:"create_submission"`
+    }
+    Data []*ProblemData `yaml:"data,flow"`
+}
+
+type ProblemData struct {
+    CreateReqPath string            `yaml:"create_req_path"`
+    Tests         []*TestData       `yaml:"tests,flow"`
+    Submissions   []*SubmissionData `yaml:"submissions,flow"`
+}
+
+type TestData struct {
+    TestZipPath string `yaml:"test_zip_path"`
+    Score       int    `yaml:"score"`
+}
+
+type SubmissionData struct {
+    SourceCodePath string `yaml:"source_code_path"`
+    Language       string `yaml:"language"`
+}
+
+// LoadFixtures loads fixtures from a YAML file
+func LoadFixtures(filename string) (*Fixtures, error) {
+    fixtures := &Fixtures{}
+
+    // Read the whole file up front and check the read error before decoding.
+    content, err := os.ReadFile(filename)
+    if err != nil {
+        return nil, fmt.Errorf("failed to open fixtures file: %s", err)
+    }
+
+    if err := yaml.Unmarshal(content, fixtures); err != nil {
+        return nil, fmt.Errorf("failed to decode fixtures: %s", err)
+    }
+
+    return fixtures, nil
+}
diff --git a/seeder/fixtures.yaml b/seeder/fixtures.yaml
new file mode 100644
index 0000000..1c5ef63
--- /dev/null
+++ b/seeder/fixtures.yaml
@@ -0,0 +1,30 @@
+base_url: http://localhost/api
+users:
+  endpoints:
+    register: /identity/auth/register
+    login: /identity/auth/login
+  data:
+    - email: admin@gmail.com
+      password: Password@123
+problems:
+  endpoints:
+    create: /problems
+    update: /problems/{problem_id}
+    create_test: /problems/{problem_id}/test
+    create_submission: /eval/submissions
+  data:
+    - create_req_path: /temp/ProblemArchive/vecsum/create_req.json
+      tests:
+        - test_zip_path: /temp/ProblemArchive/vecsum/tests/1/1.zip
+          score: 20
+        - test_zip_path: /temp/ProblemArchive/vecsum/tests/2/2.zip
+          score: 20
+        - test_zip_path: /temp/ProblemArchive/vecsum/tests/3/3.zip
+          score: 20
+        - test_zip_path: /temp/ProblemArchive/vecsum/tests/4/4.zip
+          score: 20
+        - test_zip_path: /temp/ProblemArchive/vecsum/tests/5/5.zip
+          score: 20
+      submissions:
+        - source_code_path: /temp/ProblemArchive/vecsum/solutions/main.rs
+          language: Rust
diff --git a/seeder/go.mod b/seeder/go.mod
new file mode 100644
index 0000000..a33eb91
--- /dev/null
+++ b/seeder/go.mod
@@ -0,0 +1,8 @@
+module seeder
+
+go 1.22
+
+require (
+    golang.org/x/sync v0.8.0
+    gopkg.in/yaml.v3 v3.0.1
+)
diff --git a/seeder/go.sum b/seeder/go.sum
new file mode 100644
index 0000000..c6f3269
--- /dev/null
+++ b/seeder/go.sum
@@ -0,0 +1,6 @@
+golang.org/x/sync v0.8.0 h1:3NFvSEYkUoMifnESzZl15y791HH1qU2xm6eCJU5ZPXQ=
+golang.org/x/sync v0.8.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
+gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
+gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
+gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
diff --git a/seeder/main.go b/seeder/main.go
new file mode 100644
index 0000000..cb06dd3
--- /dev/null
+++ b/seeder/main.go
@@ -0,0 +1,26 @@
+package main
+
+import (
+    "context"
+    "log"
+)
+
+func main() {
+    // Load fixtures from fixtures.yaml
+    fixtures, err := LoadFixtures("/temp/fixtures.yaml")
+    if err != nil {
+        log.Fatalf("failed to load fixtures: %s", err)
+    }
+
+    // Create a new Pantheonix client
+    client := NewPantheonixClient(fixtures)
+
+    // Create a new seeder
+    seeder := NewSeeder(client)
+
+    // Seed problems, tests and submissions
+    ctx := context.Background()
+    if err := seeder.SeedProblems(ctx); err != nil {
+        log.Fatalf("failed to seed problems: %s", err)
+    }
+}
diff --git a/seeder/pantheonix_client.go b/seeder/pantheonix_client.go
new file mode 100644
index 0000000..c2214bb
--- /dev/null
+++ b/seeder/pantheonix_client.go
@@ -0,0 +1,274 @@
+package main
+
+import (
+    "bytes"
+    "context"
+    "encoding/json"
+    "fmt"
+    "io"
+    "log"
+    "mime/multipart"
+    "net/http"
+    "os"
+    "path/filepath"
+    "strings"
+    "time"
+)
+
+type PantheonixClient struct {
+    config *Fixtures
+    *http.Client
+}
+
+func NewPantheonixClient(config *Fixtures) *PantheonixClient {
+    return &PantheonixClient{config, &http.Client{Timeout: 30 * time.Second}}
+}
+
+func (c *PantheonixClient) Endpoint(endpoint string) string {
+    return c.config.BaseUrl + endpoint
+}
+
+func (t *BearerToken) SetAccessToken(token string, cookie *http.Cookie) {
+    t.AccessToken = token
+    t.Cookie = cookie
+}
+
+func (c *PantheonixClient) Login(user *UserData) (*BearerToken, error) {
+    token := NewBearerToken()
+
+    // Send a login request
+    bodyJson, err := json.Marshal(user)
+    if err != nil {
+        return token, fmt.Errorf("failed to serialize user: %s", err)
+    }
+
+    bodyReader := bytes.NewReader(bodyJson)
+    res, err := c.Post(c.Endpoint(c.config.Users.Endpoints.Login), "application/json", bodyReader)
+    // Check the request error before touching res: deferring res.Body.Close()
+    // first would dereference a nil response whenever the request fails.
+    if err != nil {
+        return token, fmt.Errorf("failed to login: %s", err)
+    }
+    defer res.Body.Close()
+
+    if res.StatusCode != http.StatusOK {
+        return token, fmt.Errorf("failed to login: %s", res.Status)
+    }
+
+    // Extract the bearer token from the response
+    cookies := res.Cookies()
+
+    for _, cookie := range cookies {
+        if strings.Contains(cookie.Name, "AccessToken") {
+            token.SetAccessToken(cookie.Value, cookie)
+            break
+        }
+    }
+
+    return token, nil
+}
+
+func (c *PantheonixClient) CreateProblem(ctx context.Context, token *BearerToken, problemId int) (*ProblemDto, error) {
+    problem := c.config.Problems.Data[problemId]
+
+    reqBodyJson, err := os.ReadFile(problem.CreateReqPath)
+    if err != nil {
+        return nil, fmt.Errorf("failed to parse create problem request file content for problem %d: %v", problemId, err)
+    }
+
+    bodyReader := bytes.NewReader(reqBodyJson)
+    req, err := http.NewRequest(http.MethodPost, c.Endpoint(c.config.Problems.Endpoints.Create), bodyReader)
+    if err != nil {
+        return nil, fmt.Errorf("failed to create problem with id %d: %v", problemId, err)
+    }
+    req.Header.Set("Content-Type", "application/json")
+    req.AddCookie(token.Cookie)
+
+    res, err := c.Do(req)
+    if err != nil {
+        return nil, fmt.Errorf("failed to create problem with id %d: %s", problemId, err)
+    }
+    defer res.Body.Close()
+
+    if res.StatusCode != http.StatusOK {
+        return nil, fmt.Errorf("failed to create problem: %s", res.Status)
+    }
+
+    problemDto := &ProblemDto{}
+    resBody, err := io.ReadAll(res.Body)
+    if err != nil {
+        return nil, fmt.Errorf("failed to create problem with id %d: %s", problemId, err)
+    }
+
+    if err := json.Unmarshal(resBody, problemDto); err != nil {
+        return nil, fmt.Errorf("failed to create problem with id %d: %s", problemId, err)
+    }
+
+    // Upload tests
+    tests := c.config.Problems.Data[problemId].Tests
+    for testId := range len(tests) {
+        if err := c.CreateTest(token, problemDto.Id, problemId, testId); err != nil {
+            return nil, err
+        }
+    }
+
+    // Publish problem
+    if err := c.PublishProblem(token, problemDto.Id); err != nil {
+        return nil, err
+    }
+
+    log.Printf("Successfully created problem with id %d and guid %s\n", problemId, problemDto.Id)
+
+    return problemDto, nil
+}
+
+func (c *PantheonixClient) CreateTest(token *BearerToken, problemGuid string, problemId, testId int) error {
+    test := c.config.Problems.Data[problemId].Tests[testId]
+
+    extraParams := map[string]string{
+        "score": fmt.Sprintf("%d", test.Score),
+    }
+    createTestEndpoint := c.Endpoint(c.config.Problems.Endpoints.CreateTest)
+    createTestEndpoint = strings.Replace(createTestEndpoint, "{problem_id}", problemGuid, 1)
+
+    req, err := newFileUploadRequest(createTestEndpoint, test.TestZipPath, "archiveFile", extraParams)
+    if err != nil {
+        return fmt.Errorf("failed to compose create request for test with id %d for problem %d: %s", testId, problemId, err)
+    }
+    req.AddCookie(token.Cookie)
+
+    res, err := c.Do(req)
+    if err != nil {
+        return fmt.Errorf("failed to send create request test with id %d for problem %d: %s", testId, problemId, err)
+    }
+    defer res.Body.Close()
+
+    if res.StatusCode != http.StatusOK {
+        return fmt.Errorf("failed to create test with id %d for problem %d: %s", testId, problemId, res.Status)
+    }
+
+    log.Printf("Successfully created test with id %d\n", testId)
+
+    return nil
+}
+
+func (c *PantheonixClient) PublishProblem(token *BearerToken, problemGuid string) error {
+    publishProblemEndpoint := c.Endpoint(c.config.Problems.Endpoints.Update)
+    publishProblemEndpoint = strings.Replace(publishProblemEndpoint, "{problem_id}", problemGuid, 1)
+
+    bodyJson, err := json.Marshal(map[string]bool{"isPublished": true})
+    if err != nil {
+        return fmt.Errorf("failed to serialize publish request for problem %s: %s", problemGuid, err)
+    }
+
+    body := bytes.NewReader(bodyJson)
+    req, err := http.NewRequest(http.MethodPut, publishProblemEndpoint, body)
+    if err != nil {
+        return fmt.Errorf("failed to compose publish request for problem %s: %s", problemGuid, err)
+    }
+    req.Header.Set("Content-Type", "application/json")
+    req.AddCookie(token.Cookie)
+
+    res, err := c.Do(req)
+    if err != nil {
+        return fmt.Errorf("failed to send publish request for problem %s: %s", problemGuid, err)
+    }
+    defer res.Body.Close()
+
+    if res.StatusCode != http.StatusOK {
+        return fmt.Errorf("failed to publish problem %s: %s", problemGuid, res.Status)
+    }
+
+    log.Printf("Successfully published problem %s\n", problemGuid)
+
+    return nil
+}
+
+func (c *PantheonixClient) CreateSubmission(token *BearerToken, submissionDto *SubmissionDto) error {
+    createSubmissionEndpoint := c.Endpoint(c.config.Problems.Endpoints.CreateSubmission)
+
+    bodyJson, err := json.Marshal(submissionDto)
+    if err != nil {
+        return fmt.Errorf("failed to serialize submission: %s", err)
+    }
+
+    body := bytes.NewReader(bodyJson)
+    req, err := http.NewRequest(http.MethodPost, createSubmissionEndpoint, body)
+    if err != nil {
+        return fmt.Errorf("failed to compose submission request: %s", err)
+    }
+    req.Header.Set("Content-Type", "application/json")
+    req.AddCookie(token.Cookie)
+
+    res, err := c.Do(req)
+    if err != nil {
+        return fmt.Errorf("failed to send submission request: %s", err)
+    }
+    defer res.Body.Close()
+
+    if res.StatusCode != http.StatusCreated {
+        return fmt.Errorf("failed to submit: %s", res.Status)
+    }
+
+    log.Printf("Successfully submitted solution for problem %s\n", submissionDto.ProblemId)
+
+    return nil
+}
+
+func newFileUploadRequest(uri string, filePath string, testFileParamName string, params map[string]string) (*http.Request, error) {
+    fileReader, err := os.Open(filePath)
+    if err != nil {
+        return nil, err
+    }
+    defer fileReader.Close()
+
+    body := &bytes.Buffer{}
+    writer := multipart.NewWriter(body)
+    part, err := writer.CreateFormFile(testFileParamName, filepath.Base(filePath))
+    if err != nil {
+        return nil, err
+    }
+
+    // Propagate a copy failure instead of silently overwriting err below.
+    _, err = io.Copy(part, fileReader)
+    if err != nil {
+        return nil, err
+    }
+
+    for key, val := range params {
+        err = writer.WriteField(key, val)
+        if err != nil {
+            return nil, err
+        }
+    }
+
+    err = writer.Close()
+    if err != nil {
+        return nil, err
+    }
+
+    req, err := http.NewRequest(http.MethodPost, uri, body)
+    if err != nil {
+        return nil, err
+    }
+    req.Header.Set("Content-Type", writer.FormDataContentType())
+
+    return req, nil
+}
+
+type ProblemDto struct {
+    Id string `json:"id"`
+}
+
+type SubmissionDto struct {
+    ProblemId  string `json:"problem_id"`
+    Language   string `json:"language"`
+    SourceCode string `json:"source_code"`
+}
+
+type BearerToken struct {
+    AccessToken string
+    Cookie      *http.Cookie
+}
+
+func NewBearerToken() *BearerToken {
+    return &BearerToken{
+        AccessToken: "",
+        Cookie:      nil,
+    }
+}
diff --git a/seeder/seeder.go b/seeder/seeder.go
new file mode 100644
index 0000000..00c3f3f
--- /dev/null
+++ b/seeder/seeder.go
@@ -0,0 +1,86 @@
+package main
+
+import (
+    "context"
+    "fmt"
+    "golang.org/x/sync/errgroup"
+    "log"
+    "os"
+)
+
+type Seeder struct {
+    client *PantheonixClient
+}
+
+func NewSeeder(client *PantheonixClient) *Seeder {
+    return &Seeder{client}
+}
+
+func (s *Seeder) SeedUsers() error {
+    return nil
+}
+
+func (s *Seeder) SeedProblems(ctx context.Context) error {
+    admin := s.client.config.Users.Data[0]
+    token, err := s.client.Login(admin)
+
+    if err != nil {
+        return err
+    }
+
+    g, ctx := errgroup.WithContext(ctx)
+    problems := s.client.config.Problems.Data
+    for i, problem := range problems {
+        g.Go(func() error {
+            problemDto, err := s.client.CreateProblem(ctx, token, i)
+            if err != nil {
+                return err
+            }
+
+            if err := s.SeedSubmissions(ctx, token, problemDto, problem); err != nil {
+                return err
+            }
+
+            return nil
+        })
+    }
+
+    if err := g.Wait(); err != nil {
+        return err
+    }
+
+    log.Println("Successfully created problems and tests")
+
+    return nil
+}
+
+func (s *Seeder) SeedSubmissions(ctx context.Context, token *BearerToken, problemDto *ProblemDto, problemData *ProblemData) error {
+    g, ctx := errgroup.WithContext(ctx)
+    for i, submission := range problemData.Submissions {
+        g.Go(func() error {
+            sourceCode, err := os.ReadFile(submission.SourceCodePath)
+            if err != nil {
+                return fmt.Errorf("failed to read source code for submission %d: %v", i, err)
+            }
+
+            submissionDto := &SubmissionDto{
+                ProblemId:  problemDto.Id,
+                Language:   submission.Language,
+                SourceCode: string(sourceCode),
+            }
+            if err := s.client.CreateSubmission(token, submissionDto); err != nil {
+                return err
+            }
+
+            return nil
+        })
+    }
+
+    if err := g.Wait(); err != nil {
+        return err
+    }
+
+    log.Printf("Successfully created submissions for problem %s\n", problemDto.Id)
+
+    return nil
+}

From 09d78a4fbf9930a6efee74c4b1370bd8689f175d Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:31:29 +0300
Subject: [PATCH 2/8] chore(develop): release anubis 0.1.0 (#35)

---
 .release-please-manifest.json |  3 ++-
 anubis-eval/CHANGELOG.md      | 32 ++++++++++++++++++++++++++++++++
 2 files changed, 34 insertions(+), 1 deletion(-)
 create mode 100644 anubis-eval/CHANGELOG.md

diff --git a/.release-please-manifest.json b/.release-please-manifest.json
index 65251c0..2f9cbba 100644
--- a/.release-please-manifest.json
+++ b/.release-please-manifest.json
@@ -5,5 +5,6 @@
   "anubis": "2.0.0",
   "odin": "2.0.0",
   "dapr": "2.0.0",
-  "eval-lb": "2.0.0"
+  "eval-lb": "2.0.0",
+  "anubis-eval": "0.1.0"
 }
\ No newline at end of file
diff --git a/anubis-eval/CHANGELOG.md b/anubis-eval/CHANGELOG.md
new file mode 100644
index 0000000..4e84904
--- /dev/null
+++ b/anubis-eval/CHANGELOG.md
@@ -0,0 +1,32 @@
+# Changelog
+
+## 0.1.0 (2024-09-21)
+
+
+### Features
+
+* **anubis-judge0:** add nginx lb between anubis and judge0 replicas ([13fbc85](https://github.com/Pantheonix/Asgard/commit/13fbc85d6ea9a4c436484668d040932d560d7265))
+* **anubis:** add create/get test case dapr client methods ([35702ea](https://github.com/Pantheonix/Asgard/commit/35702ea227af61eb5a90e61bea5e2bf0a480e111))
+* **anubis:** add endpoint for retrieving the highest score submissions per user (and problem if specified) ([251ff99](https://github.com/Pantheonix/Asgard/commit/251ff99bcc461c143114a37a04a0dac5891bb258))
+* **cors:** enable cors policy for anubis, enki and quetzalcoatl ([6652dbc](https://github.com/Pantheonix/Asgard/commit/6652dbcdbfaceeb94c36369423fecb7b2682d9d5))
+* **enki+anubis:** add pubsub support for eval metadata retrieval to improve performance ([35391c9](https://github.com/Pantheonix/Asgard/commit/35391c968ef2e91a89a86f20288890b866756bd7))
+
+
+### Bug Fixes
+
+* **anubis:** add cors preflight catcher ([80f5651](https://github.com/Pantheonix/Asgard/commit/80f5651f9ef37327db4edfc0f2c6d9d0a5337729))
+* **anubis:** configure CORS policy for Rocket ([f25b830](https://github.com/Pantheonix/Asgard/commit/f25b83054e91ee37a52e16075e5c4e4bb0d85f98))
+* **anubis:** fix compilation error in application error mapping
([54fa5ff](https://github.com/Pantheonix/Asgard/commit/54fa5ff4165f07ebfbe4f0ed1c2b5618697f69a7)) +* **anubis:** remove unused import ([5984d6c](https://github.com/Pantheonix/Asgard/commit/5984d6cde4e9a4d2a32073b325c5d330d270f6ba)) +* **anubis:** split submission batch into chunks ([8bc3f87](https://github.com/Pantheonix/Asgard/commit/8bc3f87e1440a456360a62fe606f9474869f2a49)) +* **anubis:** update tests PK as the composition between id and problem_id ([87c7b5b](https://github.com/Pantheonix/Asgard/commit/87c7b5bd6c7ec9653071daabe9088b7f9e6cac8b)) +* **submission source code:** show submission source code iff problem has been solved previously by user ([7b60949](https://github.com/Pantheonix/Asgard/commit/7b609495c69f89db2ad79e22b5c139b2790594f1)) + + +### Performance Improvements + +* **anubis:** add is_published field to submissions dtos ([a30c434](https://github.com/Pantheonix/Asgard/commit/a30c4348ae0505b34c1f69c16b45becee7a73937)) +* **anubis:** add ocaml and lua support ([a8a79eb](https://github.com/Pantheonix/Asgard/commit/a8a79ebd32b973e936fc3f8890ff2d81a51f9ad2)) +* **anubis:** add problem name to get all submissions endpoint response ([096f6b7](https://github.com/Pantheonix/Asgard/commit/096f6b70551eaafaa40d1e5d5b0713f926e110f0)) +* **anubis:** add problem name to get submission by id endpoint response ([94fa6c6](https://github.com/Pantheonix/Asgard/commit/94fa6c619b183165de65c715b25a1f76dd03c707)) +* **anubis:** improve http errors format using json ([d47cba0](https://github.com/Pantheonix/Asgard/commit/d47cba0b128c0e2ae8be36be16b9f47bea7cd046)) From 645bfe05353660d920e026c974abaf0e8ce13647 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com> Date: Sat, 21 Sep 2024 15:43:42 +0300 Subject: [PATCH 3/8] CI: remove redundant env vars from ci pipelines (#36) * fix(release-please): use custom github pat * fix(ci-cd): remove redundant env vars from ci pipelines --- .github/workflows/anubis-eval-ci.yaml | 16 +++++----------- .github/workflows/dapr-config-ci.yaml | 16 +++++----------- .github/workflows/enki-problems-ci.yaml | 16 +++++----------- .github/workflows/eval-lb-ci.yaml | 16 +++++----------- .github/workflows/hermes-tests-ci.yaml | 17 +++++------------ .github/workflows/odin-gateway-ci.yaml | 16 +++++----------- .github/workflows/quetzalcoatl-auth-ci.yaml | 16 +++++----------- 7 files changed, 35 insertions(+), 78 deletions(-) diff --git a/.github/workflows/anubis-eval-ci.yaml b/.github/workflows/anubis-eval-ci.yaml index 47abe93..62d0031 100644 --- a/.github/workflows/anubis-eval-ci.yaml +++ b/.github/workflows/anubis-eval-ci.yaml @@ -5,12 +5,6 @@ on: tags: - "anubis/**" -env: - NAMESPACE: pantheonix - REPOSITORY: anubis - IMAGE_NAME: anubis-eval - BUILD_CONTEXT: anubis-eval - jobs: build: name: Build and Test Anubis Eval Microservice @@ -50,13 +44,13 @@ jobs: needs: build uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: anubis-eval + build_context: anubis-eval deploy-to-docker-hub: needs: build uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} + namespace: pantheonix + repository: anubis + build_context: anubis-eval diff --git a/.github/workflows/dapr-config-ci.yaml b/.github/workflows/dapr-config-ci.yaml index a407702..a0dfac4 100644 --- a/.github/workflows/dapr-config-ci.yaml +++ 
b/.github/workflows/dapr-config-ci.yaml @@ -5,22 +5,16 @@ on: tags: - "dapr/**" -env: - NAMESPACE: pantheonix - REPOSITORY: asgard-dapr - IMAGE_NAME: asgard-dapr-config - BUILD_CONTEXT: dapr - jobs: deploy-to-ghcr: uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: asgard-dapr-config + build_context: dapr deploy-to-docker-hub: uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} \ No newline at end of file + namespace: pantheonix + repository: asgard-dapr + build_context: dapr \ No newline at end of file diff --git a/.github/workflows/enki-problems-ci.yaml b/.github/workflows/enki-problems-ci.yaml index c6d40fc..ff6d4bb 100644 --- a/.github/workflows/enki-problems-ci.yaml +++ b/.github/workflows/enki-problems-ci.yaml @@ -5,12 +5,6 @@ on: tags: - "enki/**" -env: - NAMESPACE: pantheonix - REPOSITORY: enki - IMAGE_NAME: enki-problems - BUILD_CONTEXT: enki-problems - jobs: build: name: Build and Test Enki Problems Microservice @@ -37,13 +31,13 @@ jobs: needs: build uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: enki-problems + build_context: enki-problems deploy-to-docker-hub: needs: build uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} + namespace: pantheonix + repository: enki + build_context: enki-problems diff --git a/.github/workflows/eval-lb-ci.yaml b/.github/workflows/eval-lb-ci.yaml index 82044d3..ef104f1 100644 --- a/.github/workflows/eval-lb-ci.yaml +++ b/.github/workflows/eval-lb-ci.yaml @@ -5,22 +5,16 @@ on: tags: - "eval-lb/**" -env: - NAMESPACE: pantheonix - REPOSITORY: eval-lb - IMAGE_NAME: asgard-eval-lb - BUILD_CONTEXT: anubis-eval/eval-lb - jobs: deploy-to-ghcr: uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: asgard-eval-lb + build_context: anubis-eval/eval-lb deploy-to-docker-hub: uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} + namespace: pantheonix + repository: eval-lb + build_context: anubis-eval/eval-lb diff --git a/.github/workflows/hermes-tests-ci.yaml b/.github/workflows/hermes-tests-ci.yaml index c4a59aa..53256cc 100644 --- a/.github/workflows/hermes-tests-ci.yaml +++ b/.github/workflows/hermes-tests-ci.yaml @@ -5,13 +5,6 @@ on: tags: - "hermes/**" -env: - NAMESPACE: pantheonix - REPOSITORY: hermes - IMAGE_NAME: hermes-tests - BUILD_CONTEXT: hermes-tests - HERMES_CONFIG: ${{ secrets.HERMES_CONFIG }} - jobs: build: name: Build and Test Hermes Tests Microservice @@ -43,13 +36,13 @@ jobs: needs: build uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: hermes-tests + build_context: hermes-tests deploy-to-docker-hub: needs: build uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} + namespace: pantheonix + repository: hermes + build_context: hermes-tests diff --git 
a/.github/workflows/odin-gateway-ci.yaml b/.github/workflows/odin-gateway-ci.yaml index b5e5484..40159a9 100644 --- a/.github/workflows/odin-gateway-ci.yaml +++ b/.github/workflows/odin-gateway-ci.yaml @@ -5,23 +5,17 @@ on: tags: - "odin/**" -env: - NAMESPACE: pantheonix - REPOSITORY: odin - IMAGE_NAME: odin-api-gateway - BUILD_CONTEXT: odin-gateway - jobs: deploy-to-ghcr: uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: odin-api-gateway + build_context: odin-gateway deploy-to-docker-hub: uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} + namespace: pantheonix + repository: odin + build_context: odin-gateway diff --git a/.github/workflows/quetzalcoatl-auth-ci.yaml b/.github/workflows/quetzalcoatl-auth-ci.yaml index aba5d5d..f01a0df 100644 --- a/.github/workflows/quetzalcoatl-auth-ci.yaml +++ b/.github/workflows/quetzalcoatl-auth-ci.yaml @@ -5,12 +5,6 @@ on: tags: - "quetzalcoatl/**" -env: - NAMESPACE: pantheonix - REPOSITORY: quetzalcoatl - IMAGE_NAME: quetzalcoatl-auth - BUILD_CONTEXT: quetzalcoatl-auth - jobs: build: name: Build and Test Quetzalcoatl Auth Microservice @@ -37,13 +31,13 @@ jobs: needs: build uses: ./.github/workflows/step-deploy-to-ghcr.yaml with: - image_name: ${{ env.IMAGE_NAME }} - build_context: ${{ env.BUILD_CONTEXT }} + image_name: quetzalcoatl-auth + build_context: quetzalcoatl-auth deploy-to-docker-hub: needs: build uses: ./.github/workflows/step-deploy-to-docker-hub.yaml with: - namespace: ${{ env.NAMESPACE }} - repository: ${{ env.REPOSITORY }} - build_context: ${{ env.BUILD_CONTEXT }} + namespace: pantheonix + repository: quetzalcoatl + build_context: quetzalcoatl-auth From e0491e242a01538a906e5886a32bed156dce8b4d Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com> Date: Sat, 21 Sep 2024 15:46:47 +0300 Subject: [PATCH 4/8] chore(develop): release odin 1.0.0 (#32) --- .release-please-manifest.json | 3 +-- odin-gateway/CHANGELOG.md | 23 +++++++++++++++++++++++ 2 files changed, 24 insertions(+), 2 deletions(-) create mode 100644 odin-gateway/CHANGELOG.md diff --git a/.release-please-manifest.json b/.release-please-manifest.json index 2f9cbba..33f86c7 100644 --- a/.release-please-manifest.json +++ b/.release-please-manifest.json @@ -2,9 +2,8 @@ "quetzalcoatl": "2.0.0", "enki": "2.0.0", "hermes": "2.0.0", - "anubis": "2.0.0", - "odin": "2.0.0", "dapr": "2.0.0", "eval-lb": "2.0.0", + "odin-gateway": "1.0.0" "anubis-eval": "0.1.0" } \ No newline at end of file diff --git a/odin-gateway/CHANGELOG.md b/odin-gateway/CHANGELOG.md new file mode 100644 index 0000000..1c0746d --- /dev/null +++ b/odin-gateway/CHANGELOG.md @@ -0,0 +1,23 @@ +# Changelog + +## 1.0.0 (2024-09-21) + + +### Features + +* Add logging and error handling to refresh token endpoint ([fdfc9ba](https://github.com/Pantheonix/Asgard/commit/fdfc9baee1ff9d693e6221ea337fcf824a3a94a9)) + + +### Bug Fixes + +* **odin:** remove certs dependency as they are already set in lb ([92ec8a0](https://github.com/Pantheonix/Asgard/commit/92ec8a024559c4b5aaf9f60d68eb54d449b36f44)) +* **odin:** remove https redirection ([11cdfb6](https://github.com/Pantheonix/Asgard/commit/11cdfb66ff40c5c4736a6f6a337cd11eeb6f0c70)) +* **odin:** use correct name for access token 
From e0491e242a01538a906e5886a32bed156dce8b4d Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:46:47 +0300
Subject: [PATCH 4/8] chore(develop): release odin 1.0.0 (#32)

---
 .release-please-manifest.json |  3 +--
 odin-gateway/CHANGELOG.md     | 23 +++++++++++++++++++++++
 2 files changed, 24 insertions(+), 2 deletions(-)
 create mode 100644 odin-gateway/CHANGELOG.md

diff --git a/.release-please-manifest.json b/.release-please-manifest.json
index 2f9cbba..33f86c7 100644
--- a/.release-please-manifest.json
+++ b/.release-please-manifest.json
@@ -2,9 +2,8 @@
   "quetzalcoatl": "2.0.0",
   "enki": "2.0.0",
   "hermes": "2.0.0",
-  "anubis": "2.0.0",
-  "odin": "2.0.0",
   "dapr": "2.0.0",
   "eval-lb": "2.0.0",
+  "odin-gateway": "1.0.0",
   "anubis-eval": "0.1.0"
 }
\ No newline at end of file
diff --git a/odin-gateway/CHANGELOG.md b/odin-gateway/CHANGELOG.md
new file mode 100644
index 0000000..1c0746d
--- /dev/null
+++ b/odin-gateway/CHANGELOG.md
@@ -0,0 +1,23 @@
+# Changelog
+
+## 1.0.0 (2024-09-21)
+
+
+### Features
+
+* Add logging and error handling to refresh token endpoint ([fdfc9ba](https://github.com/Pantheonix/Asgard/commit/fdfc9baee1ff9d693e6221ea337fcf824a3a94a9))
+
+
+### Bug Fixes
+
+* **odin:** remove certs dependency as they are already set in lb ([92ec8a0](https://github.com/Pantheonix/Asgard/commit/92ec8a024559c4b5aaf9f60d68eb54d449b36f44))
+* **odin:** remove https redirection ([11cdfb6](https://github.com/Pantheonix/Asgard/commit/11cdfb66ff40c5c4736a6f6a337cd11eeb6f0c70))
+* **odin:** use correct name for access token ([d7aabcb](https://github.com/Pantheonix/Asgard/commit/d7aabcb7e8508fb095942a24e84547b020e89442))
+
+
+### Performance Improvements
+
+* **odin:** add https support ([17d199d](https://github.com/Pantheonix/Asgard/commit/17d199d6329cea36a0f0c363a144067c190ec750))
+* **odin:** add lua filter for appending access/refresh tokens from/to requests/responses ([884b168](https://github.com/Pantheonix/Asgard/commit/884b1685d1a2067f8758b2d1fb6d1418f6c2c47f))
+* **odin:** increase expiry time for newly created access tokens ([9e86619](https://github.com/Pantheonix/Asgard/commit/9e86619c839023e53060f6724eb380dbaf13703e))
+* **prod:** update SameSite attribute for cookies to None ([7d143ad](https://github.com/Pantheonix/Asgard/commit/7d143adde34b824adc3a90f6a062bb5453038912))

From eea3440d63146b341b439e0c6a9c4a5d154c8e87 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:47:39 +0300
Subject: [PATCH 5/8] chore(develop): release hermes 1.0.0 (#34)

---
 .release-please-manifest.json | 2 +-
 hermes-tests/CHANGELOG.md     | 9 +++++++++
 2 files changed, 10 insertions(+), 1 deletion(-)

diff --git a/.release-please-manifest.json b/.release-please-manifest.json
index 33f86c7..147dd09 100644
--- a/.release-please-manifest.json
+++ b/.release-please-manifest.json
@@ -1,9 +1,9 @@
 {
   "quetzalcoatl": "2.0.0",
   "enki": "2.0.0",
-  "hermes": "2.0.0",
   "dapr": "2.0.0",
   "eval-lb": "2.0.0",
+  "hermes-tests": "1.0.0",
   "odin-gateway": "1.0.0",
   "anubis-eval": "0.1.0"
 }
\ No newline at end of file
diff --git a/hermes-tests/CHANGELOG.md b/hermes-tests/CHANGELOG.md
index effe43c..0f6b265 100644
--- a/hermes-tests/CHANGELOG.md
+++ b/hermes-tests/CHANGELOG.md
@@ -1,3 +1,12 @@
+# Changelog
+
+## 1.0.0 (2024-09-21)
+
+
+### Features
+
+* **cors:** enable cors policy for anubis, enki and quetzalcoatl ([6652dbc](https://github.com/Pantheonix/Asgard/commit/6652dbcdbfaceeb94c36369423fecb7b2682d9d5))
+
 ## 1.0.0
 
 - Initial version.
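These two release patches also rename the manifest keys from service nicknames (`odin`, `hermes`) to the package directories that release-please actually tracks (`odin-gateway`, `hermes-tests`), resetting each component to 1.0.0. The manifest has to stay strictly valid JSON, with a comma after every entry but the last, or release-please will fail to parse it on its next run; commas are easy to drop when merging concurrent release PRs by hand. Reconstructed from the diffs, the file after PATCH 5/8 would read (shown in a yaml fence, since valid JSON is also valid YAML):

```yaml
{
  "quetzalcoatl": "2.0.0",
  "enki": "2.0.0",
  "dapr": "2.0.0",
  "eval-lb": "2.0.0",
  "hermes-tests": "1.0.0",
  "odin-gateway": "1.0.0",
  "anubis-eval": "0.1.0"
}
```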
From 31620f8b26469c0e2a597edc57fa852d8dda34ea Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:48:18 +0300
Subject: [PATCH 6/8] chore(develop): release quetzalcoatl 1.0.0 (#33)

---
 .release-please-manifest.json  |  2 +-
 quetzalcoatl-auth/CHANGELOG.md | 29 +++++++++++++++++++++++++++++
 2 files changed, 30 insertions(+), 1 deletion(-)
 create mode 100644 quetzalcoatl-auth/CHANGELOG.md

diff --git a/.release-please-manifest.json b/.release-please-manifest.json
index 147dd09..f0184c5 100644
--- a/.release-please-manifest.json
+++ b/.release-please-manifest.json
@@ -1,8 +1,8 @@
 {
-  "quetzalcoatl": "2.0.0",
   "enki": "2.0.0",
   "dapr": "2.0.0",
   "eval-lb": "2.0.0",
+  "quetzalcoatl-auth": "1.0.0",
   "hermes-tests": "1.0.0",
   "odin-gateway": "1.0.0",
   "anubis-eval": "0.1.0"
diff --git a/quetzalcoatl-auth/CHANGELOG.md b/quetzalcoatl-auth/CHANGELOG.md
new file mode 100644
index 0000000..b82cf5b
--- /dev/null
+++ b/quetzalcoatl-auth/CHANGELOG.md
@@ -0,0 +1,29 @@
+# Changelog
+
+## 1.0.0 (2024-09-21)
+
+
+### Features
+
+* **cors:** enable cors policy for anubis, enki and quetzalcoatl ([6652dbc](https://github.com/Pantheonix/Asgard/commit/6652dbcdbfaceeb94c36369423fecb7b2682d9d5))
+* **images:** allow anonymous users to access the image endpoint ([9689d00](https://github.com/Pantheonix/Asgard/commit/9689d003b1b09b5c4b01abd0577a21844f26af8a))
+* **roles:** add endpoints for roles management ([33e1b88](https://github.com/Pantheonix/Asgard/commit/33e1b88092bfd53c6e1aee14c6a05d36ec525f9e))
+* **users:** add support for filtering-sorting-pagination for users endpoint ([464b181](https://github.com/Pantheonix/Asgard/commit/464b1810efe519e155309573d55da5603bcb0a53))
+
+
+### Bug Fixes
+
+* **quetzalcoatl:** check profile picture not to be null before adding its id to dtos ([f869580](https://github.com/Pantheonix/Asgard/commit/f8695804e16dc01123b4e70f6726d3555885bb58))
+* **quetzalcoatl:** get total count of items after filtering for correct user pagination on GetAll endpoint ([8fdd7ec](https://github.com/Pantheonix/Asgard/commit/8fdd7ec9fbae2dc880b373abdab2f115bbd4e40d))
+* **remove role:** add role information to user response ([c1ad42e](https://github.com/Pantheonix/Asgard/commit/c1ad42efd1e1f497e1a84009bea0f81aef9a2479))
+
+
+### Performance Improvements
+
+* **prod:** update SameSite attribute for cookies to None ([7d143ad](https://github.com/Pantheonix/Asgard/commit/7d143adde34b824adc3a90f6a062bb5453038912))
+* **quetzalcoatl:** extract cors origins in envvar ([cffef24](https://github.com/Pantheonix/Asgard/commit/cffef24f5af89ee8c3328e0a8fd88e091c0f0939))
+* **quetzalcoatl:** set cookies as samesite none ([6108f12](https://github.com/Pantheonix/Asgard/commit/6108f12a86e7060eab56506a7059bf13745bed5e))
+* **quetzalcoatl:** simplify refresh token logic ([36f3105](https://github.com/Pantheonix/Asgard/commit/36f3105b35d0be6469a7afeae2a0f33a34ba0365))
+* **quetzalcoatl:** update user dtos to use ProfilePictureId instead of ProfilePictureUrl ([67da983](https://github.com/Pantheonix/Asgard/commit/67da983e7155937f9ea956afa2717e7d9144837f))
+* **register:** set auth tokens as samesite lax ([c4537b6](https://github.com/Pantheonix/Asgard/commit/c4537b6524e956215f278c32194b4a63a51634ba))
+* **users:** define all fsp params as optional ([54ec5c2](https://github.com/Pantheonix/Asgard/commit/54ec5c24bebfe674ffaf973c1bb61b2b79cf5170))
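For a manifest rename like `quetzalcoatl` to `quetzalcoatl-auth` to take effect, the matching key in `release-please-config.json` must use the same directory path. That config file is not part of this patch series, so the sketch below is only an assumption about its shape rather than the repo's actual contents; `packages`, `release-type`, and per-package `component` are standard release-please config fields:

```yaml
# Hypothetical release-please-config.json excerpt (assumed, not from this series)
{
  "packages": {
    "quetzalcoatl-auth": {
      "release-type": "simple",
      "component": "quetzalcoatl"
    },
    "odin-gateway": {
      "release-type": "simple",
      "component": "odin"
    }
  }
}
```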
From bbcfa2f7d173aae02f33d518d476d640eb5bba17 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:48:47 +0300
Subject: [PATCH 7/8] chore(develop): release enki 1.0.0 (#31)

---
 .release-please-manifest.json |  2 +-
 enki-problems/CHANGELOG.md    | 22 ++++++++++++++++++++++
 2 files changed, 23 insertions(+), 1 deletion(-)
 create mode 100644 enki-problems/CHANGELOG.md

diff --git a/.release-please-manifest.json b/.release-please-manifest.json
index f0184c5..30223f9 100644
--- a/.release-please-manifest.json
+++ b/.release-please-manifest.json
@@ -1,7 +1,7 @@
 {
-  "enki": "2.0.0",
   "dapr": "2.0.0",
   "eval-lb": "2.0.0",
+  "enki-problems": "1.0.0",
   "quetzalcoatl-auth": "1.0.0",
   "hermes-tests": "1.0.0",
   "odin-gateway": "1.0.0",
diff --git a/enki-problems/CHANGELOG.md b/enki-problems/CHANGELOG.md
new file mode 100644
index 0000000..d089ab6
--- /dev/null
+++ b/enki-problems/CHANGELOG.md
@@ -0,0 +1,22 @@
+# Changelog
+
+## 1.0.0 (2024-09-21)
+
+
+### Features
+
+* **anubis-judge0:** add nginx lb between anubis and judge0 replicas ([13fbc85](https://github.com/Pantheonix/Asgard/commit/13fbc85d6ea9a4c436484668d040932d560d7265))
+* **cors:** enable cors policy for anubis, enki and quetzalcoatl ([6652dbc](https://github.com/Pantheonix/Asgard/commit/6652dbcdbfaceeb94c36369423fecb7b2682d9d5))
+* **enki+anubis:** add pubsub support for eval metadata retrieval to improve performance ([35391c9](https://github.com/Pantheonix/Asgard/commit/35391c968ef2e91a89a86f20288890b866756bd7))
+* **enki:** add delete problem endpoint (propagate deletion event against hermes too) ([f29eaa5](https://github.com/Pantheonix/Asgard/commit/f29eaa59984119224f0af2c2250265cbcb84e50e))
+
+
+### Bug Fixes
+
+* **enki:** allow proposer to keep the same name for an existing problem on problem update endpoint ([b1226a1](https://github.com/Pantheonix/Asgard/commit/b1226a17304bd8d334723aef6507e6b4373d78bd))
+* **enki:** count items after filtering for correct pagination of problems (both published and unpublished) ([15a607b](https://github.com/Pantheonix/Asgard/commit/15a607bf6bb27f62149d8a85c0232b9f8c599e46))
+
+
+### Performance Improvements
+
+* **enki:** add authorization to GetListAsync and GetAsync methods ([be8e58b](https://github.com/Pantheonix/Asgard/commit/be8e58ba2d7b6a679aa67550428469cf1601304a))

From 86c427d759636c45cfb49157d599319884a1276c Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?B=C4=83rbu=C8=9B-Dic=C4=83=20Sami?= <70215066+WarriorsSami@users.noreply.github.com>
Date: Sat, 21 Sep 2024 15:49:30 +0300
Subject: [PATCH 8/8] chore(develop): release dapr 2.1.0 (#30)

---
 dapr/CHANGELOG.md | 8 ++++++++
 1 file changed, 8 insertions(+)
 create mode 100644 dapr/CHANGELOG.md

diff --git a/dapr/CHANGELOG.md b/dapr/CHANGELOG.md
new file mode 100644
index 0000000..244a135
--- /dev/null
+++ b/dapr/CHANGELOG.md
@@ -0,0 +1,8 @@
+# Changelog
+
+## [2.1.0](https://github.com/Pantheonix/Asgard/compare/dapr-v2.0.0...dapr/v2.1.0) (2024-09-21)
+
+
+### Features
+
+* **enki+anubis:** add pubsub support for eval metadata retrieval to improve performance ([35391c9](https://github.com/Pantheonix/Asgard/commit/35391c968ef2e91a89a86f20288890b866756bd7))
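The `dapr` package released here is the configuration bundle built from the `dapr` build context in the CI workflow above, and the 2.1.0 feature entry implies it now ships a pub/sub building block for the eval-metadata events exchanged between enki and anubis. The repo's actual component file is not shown in this series; the sketch below is only the conventional shape of a Dapr RabbitMQ pub/sub component, with the component name and connection string being assumptions:

```yaml
# Hypothetical dapr/components/pubsub.yaml (illustrative; the component
# name and connection string are assumed, not taken from the repo)
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: eval-metadata-pubsub
spec:
  type: pubsub.rabbitmq
  version: v1
  metadata:
    - name: connectionString
      value: "amqp://guest:guest@rabbitmq:5672"
    - name: durable
      value: "true"
```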