This repository contains a collection of engines that power the core stack for Prisma, most prominently Prisma Client and Prisma Migrate.
If you're looking for how to install Prisma or any of the engines, the Getting Started guide might be useful.
This document describes some of the internals of the engines, and how to build and test them.
This repository contains the following core components:
- Query compiler: compiles Prisma Client queries into executable plans (SQL + orchestration) that the client runs through driver adapters in JavaScript land.
- Driver adapters & executor harness: TypeScript utilities that load query plans, talk to database drivers, and expose the legacy protocol for existing tooling.
- Schema engine: creates and runs migrations, and performs introspection.
- Prisma Format: historically a formatter for Prisma schemas, now also serves LSP features.
Additionally, the psl (Prisma Schema Language) is the library that defines how the language looks, how it's parsed, etc.
You'll also find:
- `libs`, for various (small) libraries such as macros, user-facing errors, various connector/database-specific libraries, etc.
- a `docker-compose.yml` file that's helpful for running tests and bringing up containers for various databases
- a `shell.nix` file for bringing up all dependencies and making it easy to build the code in this repository (the use of this file and `nix` is entirely optional, but can be a good and easy way to get started)
- an `.envrc` file to make it easier to set everything up, including the `nix shell`
The API docs (cargo doc) are published on our fabulous repo page.
Prerequisites:
- Installed the latest stable version of the Rust toolchain. You can get the toolchain at rustup or the package manager of your choice.
- Linux only: OpenSSL must be installed.
- Installed direnv, then `direnv allow` on the repository root.
  - Make sure direnv is hooked into your shell.
  - Alternatively: load the environment defined in `./.envrc` manually in your shell.
Note for nix users: it should be enough to `direnv allow`.
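For reference, one possible way to get set up locally is sketched below; the `apt-get` package names are an assumption for Debian/Ubuntu and will differ on other distributions and package managers:

```sh
# install the latest stable Rust toolchain via rustup
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Linux only: OpenSSL headers (Debian/Ubuntu package names; adjust for your distro)
sudo apt-get install -y libssl-dev pkg-config

# on the repository root, load the environment defined in .envrc
direnv allow
```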
How to build:
To build all engines, simply execute `cargo build` on the repository root. This
builds non-production debug binaries. If you want to build the optimized
binaries in release mode, the command is `cargo build --release`.
Depending on how you invoked `cargo` in the previous step, you can find the compiled binaries inside
the repository root in the `target/debug` (without `--release`) or `target/release` directories (with
`--release`):
| Prisma Component | Path to Binary |
|---|---|
| Schema Engine | ./target/[debug|release]/schema-engine |
| Prisma Format | ./target/[debug|release]/prisma-fmt |
The query compiler is a library crate. To produce the Wasm bundles that power the JS runtime, use
`make build-qc-wasm`. Driver adapters are compiled via `make build-driver-adapters-kit-qc`.
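For convenience, the build commands described above, collected in one place:

```sh
cargo build                          # debug binaries in target/debug
cargo build --release                # optimized binaries in target/release
make build-qc-wasm                   # query-compiler Wasm bundles for the JS runtime
make build-driver-adapters-kit-qc    # driver adapters / executor harness
```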
The Prisma Schema Language is a library which defines the data structures and parsing rules for prisma files, including the available database connectors. For more technical details, please check the library README.
The PSL is used throughout the schema engine, as well as prisma format. The DataModel (DML), which is an annotated version of the PSL, is also used as input for the query compiler and driver adapters.
Prisma Client now executes queries through the query compiler and TypeScript driver adapters:
- The Rust query compiler consumes the DML and produces query plans describing the SQL and orchestration steps required to satisfy a Prisma query.
- Driver adapters (see `libs/driver-adapters`) wrap database drivers in JavaScript. They are used by the `@prisma/client-engine-runtime` package in the main repo, which implements the query plan interpreter and transaction management. Driver adapters are also used directly from Rust by the early-stage, work-in-progress Wasm port of the schema engine.
- The connector test kit (`query-engine/connector-test-kit-rs`) exercises this end-to-end by spawning the executor process and driving requests through the adapters.
You will typically touch three layers when working on the query stack:
- Rust planner logic (`query-compiler`, `query-core`, `query-structure`, etc.).
- The driver adapter executor (`libs/driver-adapters/executor`).
- Integration tests (`cargo test -p query-engine-tests`, usually via `make dev-*-qc`).
There is no standalone query engine binary anymore. The compatibility harness lives in JavaScript and is
bundled from this repository using `make build-driver-adapters-kit-qc`.
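A typical edit-build-test loop across these layers might look like the sketch below; Postgres and the `pg` adapter are just one example, see the testing section further down for the full list of `dev-*-qc` targets:

```sh
# spin up Postgres, build the query-compiler Wasm and driver adapters, write .test_config
make dev-pg-qc

# ...edit Rust planner logic or adapter/executor code...

# rebuild the Wasm bundle and the adapter kit after changes
make build-qc-wasm
make build-driver-adapters-kit-qc

# run the integration tests against the configured adapter
cargo test -p query-engine-tests
```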
The Schema Engine does a couple of things:
- creates new migrations by comparing the prisma file with the current state of the database, in order to bring the database in sync with the prisma file
- runs these migrations and keeps track of which migrations have been executed
- (re-)generates a prisma schema file starting from a live database
The engine uses:
- the prisma files, as the source of truth
- the database it connects to, for diffing and running migrations, as well as keeping track of migrations in the `_prisma_migrations` table
- the `prisma/migrations` directory, which acts as a database of existing migrations
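For illustration, the `prisma/migrations` directory (which lives in the project that uses Prisma, not in this repository) typically looks something like this; the timestamped folder name is a made-up example:

```
prisma/
├── schema.prisma
└── migrations/
    ├── migration_lock.toml
    └── 20240101120000_init/
        └── migration.sql
```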
Prisma format can format prisma schema files. It also comes as a Wasm module via a node package. You can read more here.
When trying to debug code, here are a few things that might be useful:
- use the language server; being able to go to definition and reason about code can make things a lot easier,
- add `dbg!()` statements to validate code paths, inspect variables, etc.,
- you can control the amount of logs you see, and where they come from, using the `RUST_LOG` environment variable; see the documentation
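For example, `RUST_LOG` uses the standard tracing/env_logger filter syntax; the crate and test names below are placeholders:

```sh
# debug-level logs for a single crate while running one test (names are placeholders)
RUST_LOG=my_crate=debug cargo test -p query-engine-tests -- --nocapture some_test_name

# or simply turn on debug logs everywhere
RUST_LOG=debug cargo test -p query-engine-tests
```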
There are two test suites for the engines: Unit tests and integration tests.
- Unit tests: They test internal functionality of individual crates and components.
  You can find them across the whole codebase, usually in `./tests` folders at the root of modules. These tests can be executed via `cargo test`. Note that some of them will require the `TEST_DATABASE_URL` environment variable to be set up.
- Integration tests: They run GraphQL/JSON requests through the driver adapter executor (wrapping the query compiler) and assert that the responses match expectations.
  You can find them at `./query-engine/connector-test-kit-rs`.
Note
Help needed: document how to run
- `quaint` tests
- schema engine tests
This can be a good first contribution.
To run unit tests for the whole workspace (except crates that require external services such as
`quaint`, `sql-migration-tests`, or the connector test kit), use `make test-unit`.
This target wires up the appropriate `--exclude` list. If you prefer plain cargo, replicate the
exclusions used in the Makefile when invoking `cargo test --workspace --all-features`, for example as sketched below.
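A plain-cargo invocation could look roughly like the sketch below; the exact `--exclude` list must be copied from the Makefile, the crates shown are only the ones named above:

```sh
cargo test --workspace --all-features \
  --exclude quaint \
  --exclude sql-migration-tests \
  --exclude query-engine-tests
```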
Prerequisites:
- Rust toolchain
- Docker (for SQL connectors)
- Node.js ≥ 20 and pnpm (driver adapters)
- `direnv allow` in the repository root, or load `.envrc` manually
Setup:
Use the `dev-*-qc` helpers to spin up a database (when needed), build the query-compiler Wasm, build
the driver adapters, and write the `.test_config` consumed by the connector test kit:
- `make dev-pg-qc`
- `make dev-pg-cockroachdb-qc`
- `make dev-mssql-qc`
- `make dev-planetscale-qc`
- `make dev-mariadb-qc`
- `make dev-libsql-qc`
- `make dev-better-sqlite3-qc`
- `make dev-d1-qc`
- `make dev-neon-qc`
The non-`*-qc` helpers (e.g. `make dev-postgres13`) are still available when you only need a database,
but they do not build driver adapters for you.
On Windows without WSL: replicate what the Make targets do manually (start the container, run
`make build-qc-wasm`, run `make build-driver-adapters-kit-qc`, and create `.test_config`).
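A rough manual equivalent, using Postgres as an example, could look like the sketch below; the docker-compose service name is an assumption (check `docker-compose.yml` for the actual names), and the expected fields of `.test_config` are documented in the connector test kit guide:

```sh
# start a database container (service name is an assumption, see docker-compose.yml)
docker compose up -d postgres16

# build the query-compiler Wasm bundle and the driver adapter kit
make build-qc-wasm
make build-driver-adapters-kit-qc

# finally, create .test_config in the repository root
# (connector/adapter fields as described in the connector test kit guide)
```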
Run:
`cargo test -p query-engine-tests -- --nocapture`
Set `DRIVER_ADAPTER=<adapter>` when invoking `make test-qe` to run against a specific adapter, e.g.
`DRIVER_ADAPTER=pg make test-qe`. Refer to the connector test kit guide for the full list of adapters, environment variables, and troubleshooting notes.
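Putting it together, a full run against a specific adapter could look like this; `pg` is just an example adapter, and filtering by test name is standard `cargo test` behavior:

```sh
make dev-pg-qc                    # database + Wasm + adapters + .test_config
DRIVER_ADAPTER=pg make test-qe    # run the suite through the pg adapter

# or invoke cargo directly, optionally filtering by test name (placeholder below)
cargo test -p query-engine-tests -- --nocapture some_test_name
```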
Please refer to the Testing driver adapters section in the connector-test-kit-rs README.
ℹ️ Important note on developing features that require changes to both the query compiler and driver adapter code
`make test-qe` (optionally with `DRIVER_ADAPTER=...`) ensures you have prisma/prisma checked out
next to this repository. The driver adapter sources are symlinked from there so that engines and
client stay in lockstep.
When working on a feature or bugfix spanning adapters and query-compiler code, you will need sibling
PRs in prisma/prisma and prisma/prisma-engines. Locally, each time you run
`DRIVER_ADAPTER=$adapter make test-qe`, tests use the adapters built from your local `../prisma`
clone.
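For reference, the expected checkout layout is simply two sibling directories; the parent directory name is up to you:

```
some-parent-directory/
├── prisma-engines/   # this repository
└── prisma/           # prisma/prisma checkout; adapters are symlinked from here
```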
In CI we need to denote which branch of prisma/prisma should be consumed. By default CI clones the
main branch, which will not include your local adapter changes. To test in integration, add the
following tag to your PR description on a separate line:
/prisma-branch your/branch
Replace your/branch with the name of your branch in the prisma repository.
GitHub Actions will then pick up the branch name and use it to clone that branch of prisma/prisma, and build the driver adapters code from there.
When it's time to merge the sibling PRs, you'll need to merge the prisma/prisma PR first, so that when merging the engines PR the adapter code is already in the prisma/prisma main branch.
You can trigger releases from this repository to npm that can be used for testing the engines in prisma/prisma either automatically or manually:
Any branch name starting with integration/ will, first, run the full test suite in GH Actions and, second, run the release workflow (build and upload engines to S3 & R2).
To trigger the release on any other branch, you have two options:
- Either run the build-engines workflow on a specified branch manually.
- Or add the `[integration]` string anywhere in your commit messages.
The journey through the pipeline is the same as a commit on the main branch.
- It will trigger `prisma/engines-wrapper` and publish a new `@prisma/engines-version` npm package, but on the `integration` tag.
- Which triggers `prisma/prisma` to create a `chore(Automated Integration PR): [...]` PR with a branch name also starting with `integration/`.
- Since in `prisma/prisma` we also trigger the publish pipeline when a branch name starts with `integration/`, this will publish all `prisma/prisma` monorepo packages to npm on the `integration` tag.
- Our ecosystem-tests will automatically pick up this new version and run tests; results will show in GitHub Actions.
This end-to-end process will take a minimum of ~1h20 to complete, but is completely automated 🤖
Notes:
- tests and publishing workflows are run in parallel in both `prisma/prisma-engines` and `prisma/prisma` repositories. So, it is possible that the engines are published and only then does the test suite discover a defect. It is advised to keep an eye on both test and publishing workflows.
In addition to the automated integration release for `integration/` branches, you can also trigger a publish by pushing a commit whose message contains `[integration]`.
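For example, an empty commit is enough to trigger it; the branch name below is hypothetical:

```sh
git commit --allow-empty -m "ci: trigger engines publish [integration]"
git push origin my-feature-branch
```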
When rust-analyzer runs `cargo check` it will lock the build directory and stop any cargo commands from running until it has completed. This makes the build process feel a lot longer. It is possible to avoid this by setting a different build path for rust-analyzer: open the VSCode settings and search for Check on Save: Extra Args. Look for the Rust-analyzer › Check On Save: Extra Args setting and add a new target directory for rust-analyzer. Something like:
`--target-dir:/tmp/rust-analyzer-check`
To trigger an Automated integration release from this repository to npm or a Manual integration release from this repository to npm, branches of forks need to be pulled into this repository so the GitHub Actions job is triggered. You can use these GitHub and git CLI commands to achieve that easily:
gh pr checkout 4375
git checkout -b integration/sql-nested-transactions
git push --set-upstream origin integration/sql-nested-transactions
If there is a need to re-create this branch because it has been updated, deleting it and re-creating it will make sure the content is identical and avoid any conflicts.
git branch --delete integration/sql-nested-transactions
gh pr checkout 4375
git checkout -b integration/sql-nested-transactions
git push --set-upstream origin integration/sql-nested-transactions --force
If you have a security issue to report, please contact us at [email protected]