feat(mongo): full index vocabulary, validators, and collection options #329
Note: Reviews paused. This branch is under active development, so CodeRabbit has automatically paused this review to avoid overwhelming you with comments from an influx of new commits. This behavior can be configured in the CodeRabbit settings.

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info: configuration used: `.coderabbit.yml`, review profile: CHILL, plan: Pro. 📒 Files selected for processing: 1.
📝 Walkthrough

Adds collection-level DDL (create/drop/collMod), collection validators and options; expands the index vocabulary (wildcardProjection, collation, weights, language fields); introduces canonicalize for order-independent keying; adds PSL→JSON Schema derivation; reshapes contract index/collection types; and removes the CLI Mongo statement extractor.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Planner as Mongo Planner
    participant IR as Schema IR
    participant Executor as Command Executor
    participant MongoDB
    Client->>Planner: requestPlan(origin,destination,policy)
    Planner->>IR: contractToMongoSchemaIR(origin/destination)
    IR-->>Planner: schemaIR (collections,indexes,validators,options)
    Planner->>Planner: canonicalize() keys, diff validators/options
    alt immutable option conflict
        Planner-->>Client: plan failure (conflicts)
    else
        Planner-->>Client: operations list (createIndex/createCollection/collMod/dropCollection/...)
    end
    Client->>Executor: execute(operations)
    Executor->>MongoDB: run commands (createCollection/createIndex/collMod/dropCollection)
    MongoDB-->>Executor: results
    Executor-->>Client: execution summary
```
```mermaid
sequenceDiagram
    participant User
    participant PSLParser as PSL Parser
    participant Interpreter as Contract Interpreter
    participant SchemaDeriver as JSONSchema Deriver
    participant Contract as Mongo Contract
    User->>PSLParser: supply PSL schema
    PSLParser-->>Interpreter: parsed AST
    Interpreter->>Interpreter: collect models, indexes, attributes
    Interpreter->>SchemaDeriver: deriveJsonSchema(model fields)
    SchemaDeriver-->>Interpreter: MongoStorageValidator
    Interpreter->>Contract: attach collection (indexes + validator + options)
    Interpreter-->>User: MongoContract
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)

⚠️ Warning: the review ran into problems. Timed out fetching pipeline failures after 30000ms.
@prisma-next/mongo-runtime
@prisma-next/family-mongo
@prisma-next/sql-runtime
@prisma-next/family-sql
@prisma-next/extension-paradedb
@prisma-next/extension-pgvector
@prisma-next/postgres
@prisma-next/sql-orm-client
@prisma-next/sqlite
@prisma-next/target-mongo
@prisma-next/adapter-mongo
@prisma-next/driver-mongo
@prisma-next/contract
@prisma-next/utils
@prisma-next/config
@prisma-next/errors
@prisma-next/framework-components
@prisma-next/operations
@prisma-next/contract-authoring
@prisma-next/ids
@prisma-next/psl-parser
@prisma-next/psl-printer
@prisma-next/cli
@prisma-next/emitter
@prisma-next/migration-tools
@prisma-next/vite-plugin-contract-emit
@prisma-next/runtime-executor
@prisma-next/mongo-codec
@prisma-next/mongo-contract
@prisma-next/mongo-value
@prisma-next/mongo-contract-psl
@prisma-next/mongo-contract-ts
@prisma-next/mongo-emitter
@prisma-next/mongo-schema-ir
@prisma-next/mongo-query-ast
@prisma-next/mongo-orm
@prisma-next/mongo-pipeline-builder
@prisma-next/mongo-lowering
@prisma-next/mongo-wire
@prisma-next/sql-contract
@prisma-next/sql-errors
@prisma-next/sql-operations
@prisma-next/sql-schema-ir
@prisma-next/sql-contract-psl
@prisma-next/sql-contract-ts
@prisma-next/sql-contract-emitter
@prisma-next/sql-lane-query-builder
@prisma-next/sql-relational-core
@prisma-next/sql-builder
@prisma-next/target-postgres
@prisma-next/target-sqlite
@prisma-next/adapter-postgres
@prisma-next/adapter-sqlite
@prisma-next/driver-postgres
@prisma-next/driver-sqlite
Actionable comments posted: 10
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts (1)
Lines 27-41: ⚠️ Potential issue | 🟠 Major: Tighten Arktype schemas to validate nested command options instead of using blind casts at deserialization.

At this persistence boundary, `wildcardProjection`, `weights`, `timeseries`, `changeStreamPreAndPostImages`, and `clusteredIndex` are defined as `Record<string, unknown>` and only narrowed via blind casts (e.g., `as Record<string, 0 | 1>`, `as Record<string, number>`, `as CreateCollectionCommand['timeseries']`). This allows malformed payloads to pass validation. Replace these with precise Arktype schemas using the `{ '[string]': Schema }` record syntax per coding guidelines.

Suggested fix:

```diff
+const WildcardProjectionJson = type({ '[string]': '0 | 1' });
+const WeightsJson = type({ '[string]': 'number' });
+const ChangeStreamPreAndPostImagesJson = type({ enabled: 'boolean' });
+const TimeseriesJson = type({
+  timeField: 'string',
+  'metaField?': 'string',
+  'granularity?': '"seconds" | "minutes" | "hours"',
+});
+const ClusteredIndexJson = type({
+  key: { '[string]': 'number' },
+  unique: 'boolean',
+  'name?': 'string',
+});
+
 const CreateIndexJson = type({
   kind: '"createIndex"',
   collection: 'string',
   keys: IndexKeyJson.array().atLeastLength(1),
@@
-  'wildcardProjection?': 'Record<string, unknown>',
+  'wildcardProjection?': WildcardProjectionJson,
   'collation?': 'Record<string, unknown>',
-  'weights?': 'Record<string, unknown>',
+  'weights?': WeightsJson,
   'default_language?': 'string',
   'language_override?': 'string',
 });
@@
-  'timeseries?': 'Record<string, unknown>',
+  'timeseries?': TimeseriesJson,
   'collation?': 'Record<string, unknown>',
-  'changeStreamPreAndPostImages?': 'Record<string, unknown>',
-  'clusteredIndex?': 'Record<string, unknown>',
+  'changeStreamPreAndPostImages?': ChangeStreamPreAndPostImagesJson,
+  'clusteredIndex?': ClusteredIndexJson,
 });
@@
-  wildcardProjection: data.wildcardProjection as Record<string, 0 | 1> | undefined,
+  wildcardProjection: data.wildcardProjection,
   collation: data.collation,
-  weights: data.weights as Record<string, number> | undefined,
+  weights: data.weights,
   default_language: data.default_language,
   language_override: data.language_override,
@@
-  timeseries: data.timeseries as CreateCollectionCommand['timeseries'],
+  timeseries: data.timeseries,
   collation: data.collation,
-  changeStreamPreAndPostImages: data.changeStreamPreAndPostImages as
-    | { enabled: boolean }
-    | undefined,
-  clusteredIndex: data.clusteredIndex as CreateCollectionCommand['clusteredIndex'],
+  changeStreamPreAndPostImages: data.changeStreamPreAndPostImages,
+  clusteredIndex: data.clusteredIndex,
@@
-  changeStreamPreAndPostImages: data.changeStreamPreAndPostImages as
-    | { enabled: boolean }
-    | undefined,
+  changeStreamPreAndPostImages: data.changeStreamPreAndPostImages,
```

Also applies to: lines 49-76, 168-216.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts` around lines 27 - 41, The CreateIndexJson and other Arktype schemas in mongo-ops-serializer currently use Record<string, unknown> for nested command option fields (e.g., wildcardProjection, weights, timeseries, changeStreamPreAndPostImages, clusteredIndex) and rely on blind casts at deserialization; replace those Record<string, unknown> entries with precise Arktype record schemas using the '{ "[string]": Schema }' syntax (for example '{ "[string]": literal(0).or(literal(1)) }' for 0|1 projection values or '{ "[string]": number() }' for numeric weights) so nested shapes are validated, and update any other schemas in this file that use Record<string, unknown> the same way (including the other index/collection/command schemas referenced in this module) to remove blind casts and enforce correct types at validation time.
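The fix above hinges on validating nested shapes rather than asserting them. Arktype aside, the difference can be shown with a plain hand-rolled type guard (an illustrative sketch only; the actual code should use Arktype's `{ '[string]': Schema }` record syntax as the comment says):

```typescript
// Illustrative type guard: performs the same runtime check the review asks
// Arktype's `{ '[string]': '0 | 1' }` schema to enforce. A blind cast
// (`as Record<string, 0 | 1>`) would skip this check entirely.
type WildcardProjection = Record<string, 0 | 1>;

function isWildcardProjection(value: unknown): value is WildcardProjection {
  if (typeof value !== 'object' || value === null || Array.isArray(value)) {
    return false;
  }
  // Every value in the projection document must be exactly 0 or 1.
  return Object.values(value).every((v) => v === 0 || v === 1);
}
```

With validation at the boundary, a malformed payload such as `{ 'user.name': 2 }` is rejected instead of being silently accepted and cast.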
🧹 Nitpick comments (8)
packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts (1)
Lines 163-186: Test covers M2 index options partially.

The test verifies `default_language` and `language_override` are present in the output, but doesn't assert the `collation` and `weights` options that are also passed. Consider adding assertions for completeness:

```diff
 expect(result[0]).toContain('default_language: "english"');
 expect(result[0]).toContain('language_override: "lang"');
+expect(result[0]).toContain('collation:');
+expect(result[0]).toContain('weights:');
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts` around lines 163 - 186, The test for CreateIndexCommand in ddl-formatter.test.ts only asserts default_language and language_override; update the test that builds the op using CreateIndexCommand and the call to formatMongoOperations to also assert that the formatted output contains the collation object (e.g., 'collation: { locale: "en", strength: 2 }') and the weights mapping (e.g., 'weights: { bio: 10 }'), and if applicable include wildcardProjection assertions; locate the test block using the identifiers CreateIndexCommand and formatMongoOperations and add expect(...).toContain(...) assertions for those strings.

packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts (1)
Lines 1197-1224: Blind casts should be avoided in production code.

The `as unknown as` casts at lines 1210 and 1223 violate the coding guideline forbidding blind casts outside tests. Consider using a type-safe approach:

♻️ Proposed refactor to eliminate blind casts:

```diff
 function toStorageIndex(index: MongoIndex): MongoStorageIndex {
   const keys = Object.entries(index.fields).map(([field, direction]) => ({
     field,
     direction,
   }));
-  const result: Record<string, unknown> = { keys };
-  if (index.options) {
-    for (const [key, value] of Object.entries(index.options)) {
-      if (value !== undefined) {
-        result[key] = value;
-      }
-    }
-  }
-  return result as unknown as MongoStorageIndex;
+  return {
+    keys,
+    ...(index.options?.unique !== undefined && { unique: index.options.unique }),
+    ...(index.options?.sparse !== undefined && { sparse: index.options.sparse }),
+    ...(index.options?.expireAfterSeconds !== undefined && { expireAfterSeconds: index.options.expireAfterSeconds }),
+    ...(index.options?.partialFilterExpression !== undefined && { partialFilterExpression: index.options.partialFilterExpression }),
+    ...(index.options?.wildcardProjection !== undefined && { wildcardProjection: index.options.wildcardProjection }),
+    ...(index.options?.collation !== undefined && { collation: index.options.collation }),
+    ...(index.options?.weights !== undefined && { weights: index.options.weights }),
+    ...(index.options?.default_language !== undefined && { default_language: index.options.default_language }),
+    ...(index.options?.language_override !== undefined && { language_override: index.options.language_override }),
+  };
 }
```

Alternatively, use `ifDefined()` from `@prisma-next/utils/defined` for cleaner conditional spreads per coding guidelines.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts` around lines 1197 - 1224, Both functions use blind "as unknown as" casts; change them to construct properly typed return values instead. In toStorageIndex: declare result as Partial<MongoStorageIndex> or directly as MongoStorageIndex with required fields (e.g., const result: Partial<MongoStorageIndex> = { keys }) then copy only defined index.options entries into that object and finally return result as MongoStorageIndex after ensuring required properties exist. In toStorageCollectionOptions: type result as Partial<MongoStorageCollectionOptions> (or MongoStorageCollectionOptions) and set capped, timeseries, collation, changeStreamPreAndPostImages, clusteredIndex only when present; replace the "as unknown as" cast with a proper typed object and consider using ifDefined() for conditional spreads to keep code concise. Ensure you reference the functions toStorageIndex and toStorageCollectionOptions and the local vars keys, result, opts when making the changes.

packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts (1)
Lines 784-916: Split these new suites into dedicated interpreter test files.

This file is already far past the repo's size limit for test files, and adding separate index-authoring and validator-derivation suites here makes navigation and failures harder to localize. Please move these blocks into focused files such as `interpreter.indexes.test.ts` and `interpreter.validators.test.ts`.

As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files."
Also applies to: 918-1047
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts` around lines 784 - 916, The "index authoring" describe block (describe('index authoring', ...) and its it(...) cases) and the separate "validator-derivation" suite referenced in the comment should be split out of the large interpreter.test.ts into their own focused test files (e.g., interpreter.indexes.test.ts and interpreter.validators.test.ts); copy the describe('index authoring'...) block and the validator-related describe(...) block into new files, keep existing helper imports/fixtures used by interpretOk and any shared setup/teardown, update imports/exports so interpretOk and other helpers are referenced from their original module, remove the duplicated blocks from the original file, and run the test suite to ensure names and scopes remain the same so assertions (e.g., expectations against storage collections and index properties in the 'index authoring' tests) still pass.

packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts (1)
Lines 45-49: Use `ifDefined()` for the optional collection fields.

Lines 48-49 use inline conditional spreads instead of the repo's standard helper, which makes this pattern harder to keep consistent across the adapter.

As per coding guidelines, "Use `ifDefined()` from `@prisma-next/utils/defined` for conditional object spreads instead of inline conditional spread patterns".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts` around lines 45 - 49, Replace the inline conditional spreads for validator and options in the MongoSchemaCollection constructor with the repo helper ifDefined from `@prisma-next/utils/defined`: instead of using ...(def.validator != null && { validator: convertValidator(def.validator) }) and ...(def.options != null && { options: convertOptions(def.options) }), import ifDefined and pass converted values through it so the optional fields are added only when defined (apply to the call that constructs new MongoSchemaCollection in contract-to-schema.ts, referencing convertValidator and convertOptions).

packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts (1)
Lines 457-645: Please split these validation suites into smaller test files.

This file now bundles index validation, validator validation, and collection-options validation into a single suite that is over the 500-line limit. Breaking it into focused files would make it much easier to scan and maintain as the Mongo contract vocabulary keeps expanding.
As per coding guidelines "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts` around lines 457 - 645, The test file contains multiple large suites—specifically the describe blocks named 'storage validator validation' and 'storage collection options validation' (and the index-related suites nearby) that use helpers like validateMongoContract and makeValidContractJson—and should be split into focused test files under 500 lines; extract each suite into its own file (e.g., validator.test.ts, collection-options.test.ts, indexes.test.ts), move the related tests and any shared helpers (makeValidContractJson) to a common test-utils file or import them from the original module, update imports to reference validateMongoContract and makeValidContractJson as needed, and ensure test runner exports/registrations remain intact so each new file runs independently.

packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts (1)
Lines 383-928: Split this test file by concern before it grows further.

The added validator and collection-lifecycle suites push this file well past the 500-line cap, and it now mixes index diffing, validator behavior, policy gating, and collection lifecycle in one place. Splitting it into focused files such as `mongo-planner.indexes.test.ts`, `mongo-planner.validators.test.ts`, and `mongo-planner.collections.test.ts` will keep failures easier to localize.

As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts` around lines 383 - 928, The test file has grown past the 500-line guideline and mixes concerns; split the suites into three focused files by moving the "M2 index vocabulary" describe block (index diffing tests referencing makeContract, MongoSchemaIndex, planSuccess), the "validator diffing" describe block (validator tests referencing MongoSchemaValidator, CollModCommand, operationClass checks), and the "collection lifecycle" describe block (collection create/drop/options tests referencing CreateCollectionCommand, DropCollectionCommand, ALL_CLASSES_POLICY) into separate test files (e.g., mongo-planner.indexes.test.ts, mongo-planner.validators.test.ts, mongo-planner.collections.test.ts), preserving any shared test helpers/imports (planner, makeContract, irWithCollection, emptyIR) by extracting common setup into a test helper or re-importing them; run and update imports/exports so each new file has the necessary fixtures and keeps test behavior identical.

test/integration/test/mongo/migration-m2-vocabulary.test.ts (2)
Lines 54-56: Remove the unused `_db` parameter.

The `_db` parameter is declared but never used within `planAndApply`. Remove it to avoid confusion.

🧹 Proposed fix:

```diff
 async function planAndApply(
-  _db: Db,
   replSetUri: string,
   origin: MongoContract | null,
   destination: MongoContract,
 ): Promise<void> {
```

Update all call sites accordingly (e.g., line 148: `await planAndApply(replSetUri, null, contract);`).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts` around lines 54 - 56, Remove the unused _db parameter from the function signature of planAndApply and update all call sites to match the new signature; specifically change async function planAndApply(_db: Db, replSetUri: string) to async function planAndApply(replSetUri: string, ...) (keep other parameters intact) and update calls such as await planAndApply(replSetUri, null, contract) to drop the removed argument so they call planAndApply(replSetUri, null, contract) with the new parameter order; ensure references to _db inside planAndApply (if any) are removed and tests still compile.
Lines 1-589: Consider splitting the test file to stay under the 500-line guideline.

This file is 589 lines. Per coding guidelines, test files should stay under 500 lines. Consider splitting by functionality:

- `migration-m2-vocabulary.indexes.test.ts` (compound, text, TTL, hashed, 2dsphere, partial, collation, wildcard)
- `migration-m2-vocabulary.lifecycle.test.ts` (modify indexes, validators, collection options, drops, full lifecycle)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts` around lines 1 - 589, The test file exceeds the 500-line guideline; split it into two smaller test files as suggested and move shared helpers into a common helper module: create migration-m2-vocabulary.indexes.test.ts containing the describe block "compound indexes" through "wildcard indexes", and migration-m2-vocabulary.lifecycle.test.ts containing "modify indexes" through "full lifecycle"; extract the top-level functions makeContract and planAndApply (and the ALL_POLICY constant) into a new helper (import them into both new test files) so both tests retain the same imports (contractToMongoSchemaIR, MongoMigrationPlanner, MongoMigrationRunner, mongoControlDriver, MongoMemoryReplSet, timeouts, MongoClient/Db, etc.), keep the original describe titles and expectations unchanged, and ensure beforeAll/beforeEach/afterAll lifecycle setup is present in each new file (or also moved to the helper as reusable setup functions) so tests run identically but with each file under 500 lines.
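Several of the comments above point at `ifDefined()` from `@prisma-next/utils/defined` as the preferred alternative to inline conditional spreads and blind casts. Its exact signature is not shown in this review, but the pattern can be sketched as follows (hypothetical helper, simplified stand-in types):

```typescript
// Hypothetical sketch of ifDefined(): produce `{ [key]: value }` when the
// value is defined and `{}` otherwise, so optional fields can be added via
// spread without a blind `as unknown as` cast on the whole result. The real
// helper in `@prisma-next/utils/defined` may differ.
function ifDefined<K extends string, V>(
  key: K,
  value: V | undefined,
): Partial<Record<K, V>> {
  // The cast is local and safe: the object has exactly the key K.
  return value === undefined ? {} : ({ [key]: value } as Partial<Record<K, V>>);
}

// Simplified stand-in for MongoStorageIndex.
interface StorageIndex {
  keys: Array<{ field: string; direction: 1 | -1 }>;
  unique?: boolean;
  sparse?: boolean;
}

function toStorageIndex(
  keys: StorageIndex['keys'],
  options?: { unique?: boolean; sparse?: boolean },
): StorageIndex {
  return {
    keys,
    ...ifDefined('unique', options?.unique),
    ...ifDefined('sparse', options?.sparse),
  };
}
```

Undefined options never appear as keys in the result, which matters when the object is later serialized or diffed.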
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts`:
- Around line 283-286: ClusteredIndexSchema currently only validates name;
update ClusteredIndexSchema to require and validate the clustered-index shape by
adding a required key field (an object/record describing index keys) and a
required unique field (boolean) using Arktype validators (keep '+': 'reject' and
'name?' as-is), and mirror the same stricter required fields for the adjacent
index schema definitions that follow ClusteredIndexSchema so all index contracts
enforce key and unique consistently.
In `@packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts`:
- Around line 31-38: The derive-json-schema logic for value-object fields (where
field.type.kind === 'valueObject') currently returns the result of
deriveObjectSchema(vo.fields, valueObjects) (or an array with items = voSchema
when 'many' in field) without honoring field.nullable; update the branch
handling value objects to wrap the object schema in an anyOf that permits null
when field.nullable is true (for single object) and for arrays allow either null
or the array form as appropriate—use the existing deriveObjectSchema(vo.fields,
valueObjects) output as the non-null alternative and ensure the anyOf
alternatives use { bsonType: "null" } and the object/array schema so embedded
required rules are preserved.
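The nullable wrapping described in that comment can be sketched with a small helper (schema shapes here are assumptions, not the actual types in derive-json-schema.ts):

```typescript
// Sketch: when a value-object field is nullable, the derived $jsonSchema
// fragment becomes an anyOf of the object (or array) schema and BSON null,
// so the embedded `required` rules of the object schema stay intact.
type JsonSchema = Record<string, unknown>;

function allowNull(schema: JsonSchema, nullable: boolean): JsonSchema {
  return nullable ? { anyOf: [schema, { bsonType: 'null' }] } : schema;
}

const addressSchema: JsonSchema = {
  bsonType: 'object',
  required: ['street', 'city'],
  properties: {
    street: { bsonType: 'string' },
    city: { bsonType: 'string' },
  },
};

// Nullable single value object: object shape or null.
const nullableAddress = allowNull(addressSchema, true);

// Nullable list of value objects: array form or null.
const nullableAddressList = allowNull(
  { bsonType: 'array', items: addressSchema },
  true,
);
```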
In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts`:
- Around line 536-537: The code currently assigns collections[collectionName] =
... when building indexes (via collectIndexes(pslModel, fieldMappings,
modelNames)), which overwrites any existing metadata and causes later
single-collection polymorphism (resolvePolymorphism) to drop earlier
indexes/validator; instead, either (a) defer creating collection entries until
after resolvePolymorphism() and build one entry per final collection, or (b)
merge into an existing collections[collectionName] object (append indexes to an
existing indexes array and merge/override validator/other keys) rather than
replacing it; apply the same merge-or-defer fix where similar assignments occur
(the block around lines 559–565) so multiple models sharing a collection
accumulate indexes/metadata instead of the last model winning.
- Around line 278-285: parseIndexDirection currently defaults unknown/typo index
types to 1 which silently produces wrong indexes; change parseIndexDirection to
validate and not coerce: when raw is missing return 1 as before, but if a
stripped value is not a numeric 1/-1 and not one of the allowed strings
('text','2dsphere','2d','hashed'), return undefined (or throw a validation
error) instead of returning 1; update all call sites that use
parseIndexDirection (the callers that build index descriptors) to detect
undefined and emit a diagnostic like "Unknown index type '<value>'" and skip
creating that index rather than generating an ascending index. Ensure the
function signature reflects the new possible undefined return
(MongoIndexKeyDirection | undefined) and that diagnostics include the original
raw value.
- Around line 352-369: The text-index weights and language fields are left in
PSL names while other `keys` are normalized to storage names; update the
handling around parseJsonArg/getNamedArgument so that after building `weights`
from `rawWeights` you map each PSL key to its storage name using the same
normalization/mapping logic used for `keys` (i.e., replace PSL names with the
`@map` target before assigning into `weights`), and similarly map
`rawDefaultLang`/`rawLangOverride` (the single-field values that become
`default_language` and `language_override`) through the same name-mapping
function so stored index options reference the actual storage field names rather
than the PSL names.
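One of the interpreter fixes above, the validating `parseIndexDirection`, can be sketched like this (the type name, accepted values, and quote-stripping are assumptions based on the comment, not the actual interpreter code):

```typescript
// Sketch of a validating parseIndexDirection. Unknown index types return
// undefined so the caller can emit a diagnostic ("Unknown index type
// '<value>'") and skip the index, instead of silently defaulting to an
// ascending index.
type MongoIndexKeyDirection = 1 | -1 | 'text' | '2dsphere' | '2d' | 'hashed';

function parseIndexDirection(
  raw: string | undefined,
): MongoIndexKeyDirection | undefined {
  if (raw === undefined) return 1; // a missing direction still defaults to ascending
  const stripped = raw.trim().replace(/^["']|["']$/g, ''); // drop surrounding quotes
  if (stripped === '1') return 1;
  if (stripped === '-1') return -1;
  if (
    stripped === 'text' ||
    stripped === '2dsphere' ||
    stripped === '2d' ||
    stripped === 'hashed'
  ) {
    return stripped;
  }
  return undefined; // a typo such as 'txet' no longer becomes an ascending index
}
```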
In `@packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts`:
- Around line 3-6: The current key-order-sensitive comparisons in
index-equivalence.ts use deepEqual for option subdocuments like
partialFilterExpression, wildcardProjection, collation, and weights; update the
equality checks inside the main equivalence function (e.g., isIndexEquivalent)
to canonicalize() both sides of those subdocuments first (or canonicalize the
whole options object for those keys) and then compare (instead of direct
deepEqual) so key-order independence matches the lookup-key builder behavior;
apply the same change to the duplicate checks around the other occurrence (the
block referenced at the 51-56 range).
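The order-independent comparison this comment (and the planner comment below) calls for can be sketched as canonicalize-then-compare (illustrative only; the repo's own `canonicalize` may differ in details such as non-JSON value handling):

```typescript
// Sketch of canonicalize(): recursively sort object keys so documents that
// differ only in key order serialize identically. Arrays keep their order,
// since key order is insignificant but element order is not.
function canonicalize(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(canonicalize);
  if (value !== null && typeof value === 'object') {
    const sorted: Record<string, unknown> = {};
    for (const key of Object.keys(value).sort()) {
      sorted[key] = canonicalize((value as Record<string, unknown>)[key]);
    }
    return sorted;
  }
  return value;
}

// Order-independent structural equality for option subdocuments such as
// partialFilterExpression, wildcardProjection, collation, and weights.
function canonicallyEqual(a: unknown, b: unknown): boolean {
  return JSON.stringify(canonicalize(a)) === JSON.stringify(canonicalize(b));
}
```

Using the same canonical form in both the lookup-key builder and the equivalence checks keeps the two from disagreeing about whether an index changed.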
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts`:
- Around line 150-181: validatorsEqual and classifyValidatorUpdate incorrectly
use the key-order-sensitive deepEqual to compare validator.jsonSchema and thus
treat reordered keys as destructive changes; replace deepEqual(jsonSchema, ...)
with an order-independent structural comparison such as canonicalize(jsonSchema)
(or JSON.stringify(canonicalize(...))) before comparing, i.e., compute canonical
forms of a.jsonSchema and b.jsonSchema in validatorsEqual and of
origin.jsonSchema and dest.jsonSchema in classifyValidatorUpdate and then
compare those canonicalized strings/objects so key order differences do not
trigger spurious destructive classifications.
- Around line 217-255: The postcheck filters currently only assert
options.validationLevel and can false-positive; update the postcheck entries
that use ListCollectionsCommand and MongoAndExpr.of to also assert the stored
validator payload and action: for the "add" branch include
MongoFieldFilter.eq('options.validator', destValidator.validator) and
MongoFieldFilter.eq('options.validationAction', destValidator.validationAction)
alongside the existing validationLevel check, and for the "remove" branch assert
MongoFieldFilter.eq('options.validator', {}) and
MongoFieldFilter.eq('options.validationAction', 'error') (in addition to the
validationLevel check) so the runner verifies the actual validator body and
action were applied for collName.
- Around line 260-269: In hasImmutableOptionChange, don't early-return when
origin or dest is undefined; instead compare each immutable option field
individually (origin?.capped vs dest?.capped, origin?.timeseries vs
dest?.timeseries, origin?.collation vs dest?.collation, origin?.clusteredIndex
vs dest?.clusteredIndex) using deepEqual and return the field name on any
difference so additions or removals are detected as conflicts; update the
function body to perform four separate deepEqual checks against origin?.field
and dest?.field rather than requiring both option objects to exist.
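The per-field check described in the last comment can be sketched as follows (simplified option types; Node's `assert.deepStrictEqual` stands in for the adapter's own `deepEqual` helper):

```typescript
import { deepStrictEqual } from 'node:assert';

// Simplified stand-in for the adapter's collection options type.
interface CollectionOptions {
  capped?: boolean;
  timeseries?: Record<string, unknown>;
  collation?: Record<string, unknown>;
  clusteredIndex?: Record<string, unknown>;
}

// Stand-in for the adapter's deepEqual helper.
function deepEqual(a: unknown, b: unknown): boolean {
  try {
    deepStrictEqual(a, b);
    return true;
  } catch {
    return false;
  }
}

const IMMUTABLE_FIELDS = ['capped', 'timeseries', 'collation', 'clusteredIndex'] as const;

// Returns the name of the first immutable option that differs, or undefined.
// Note: no early return when one side is undefined, so adding or removing
// the options object entirely is still flagged as a conflict.
function hasImmutableOptionChange(
  origin: CollectionOptions | undefined,
  dest: CollectionOptions | undefined,
): string | undefined {
  for (const field of IMMUTABLE_FIELDS) {
    if (!deepEqual(origin?.[field], dest?.[field])) return field;
  }
  return undefined;
}
```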
---
Outside diff comments:
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts`:
- Around line 27-41: The CreateIndexJson and other Arktype schemas in
mongo-ops-serializer currently use Record<string, unknown> for nested command
option fields (e.g., wildcardProjection, weights, timeseries,
changeStreamPreAndPostImages, clusteredIndex) and rely on blind casts at
deserialization; replace those Record<string, unknown> entries with precise
Arktype record schemas using the '{ "[string]": Schema }' syntax (for example '{
"[string]": literal(0).or(literal(1)) }' for 0|1 projection values or '{
"[string]": number() }' for numeric weights) so nested shapes are validated, and
update any other schemas in this file that use Record<string, unknown> the same
way (including the other index/collection/command schemas referenced in this
module) to remove blind casts and enforce correct types at validation time.
---
Nitpick comments:
In `@packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts`:
- Around line 457-645: The test file contains multiple large suites—specifically
the describe blocks named 'storage validator validation' and 'storage collection
options validation' (and the index-related suites nearby) that use helpers like
validateMongoContract and makeValidContractJson—and should be split into focused
test files under 500 lines; extract each suite into its own file (e.g.,
validator.test.ts, collection-options.test.ts, indexes.test.ts), move the
related tests and any shared helpers (makeValidContractJson) to a common
test-utils file or import them from the original module, update imports to
reference validateMongoContract and makeValidContractJson as needed, and ensure
test runner exports/registrations remain intact so each new file runs
independently.
In `@packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts`:
- Around line 784-916: The "index authoring" describe block (describe('index
authoring', ...) and its it(...) cases) and the separate "validator-derivation"
suite referenced in the comment should be split out of the large
interpreter.test.ts into their own focused test files (e.g.,
interpreter.indexes.test.ts and interpreter.validators.test.ts); copy the
describe('index authoring'...) block and the validator-related describe(...)
block into new files, keep existing helper imports/fixtures used by interpretOk
and any shared setup/teardown, update imports/exports so interpretOk and other
helpers are referenced from their original module, remove the duplicated blocks
from the original file, and run the test suite to ensure names and scopes remain
the same so assertions (e.g., expectations against storage collections and index
properties in the 'index authoring' tests) still pass.
In `@packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts`:
- Around line 1197-1224: Both functions use blind "as unknown as" casts; change
them to construct properly typed return values instead. In toStorageIndex:
declare result as Partial<MongoStorageIndex> or directly as MongoStorageIndex
with required fields (e.g., const result: Partial<MongoStorageIndex> = { keys })
then copy only defined index.options entries into that object and finally return
result as MongoStorageIndex after ensuring required properties exist. In
toStorageCollectionOptions: type result as
Partial<MongoStorageCollectionOptions> (or MongoStorageCollectionOptions) and
set capped, timeseries, collation, changeStreamPreAndPostImages, clusteredIndex
only when present; replace the "as unknown as" cast with a proper typed object
and consider using ifDefined() for conditional spreads to keep code concise.
Ensure you reference the functions toStorageIndex and toStorageCollectionOptions
and the local vars keys, result, opts when making the changes.
In `@packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts`:
- Around line 45-49: Replace the inline conditional spreads for validator and
options in the MongoSchemaCollection constructor with the repo helper ifDefined
from `@prisma-next/utils/defined`: instead of using ...(def.validator != null && {
validator: convertValidator(def.validator) }) and ...(def.options != null && {
options: convertOptions(def.options) }), import ifDefined and pass converted
values through it so the optional fields are added only when defined (apply to
the call that constructs new MongoSchemaCollection in contract-to-schema.ts,
referencing convertValidator and convertOptions).
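The ifDefined pattern the review asks for can be sketched as follows. This is a hypothetical rendering: the real helper lives in `@prisma-next/utils/defined` and its actual signature is not shown in this PR, and the CollectionDef type and the convertValidator/convertOptions bodies here are stand-ins.

```typescript
// Hypothetical sketch of a conditional-field helper in the spirit of ifDefined.
// The real helper's signature in @prisma-next/utils/defined is assumed.
function ifDefined<K extends string, T, R>(
  key: K,
  value: T | null | undefined,
  convert: (v: T) => R,
): Partial<Record<K, R>> {
  const out: Partial<Record<K, R>> = {};
  if (value != null) out[key] = convert(value);
  return out;
}

// Stand-in for the collection definition shape used by contract-to-schema.ts.
interface CollectionDef {
  validator?: { jsonSchema: object };
  options?: { capped?: boolean; size?: number };
}

// Stand-ins for the adapter's convertValidator/convertOptions.
const convertValidator = (v: { jsonSchema: object }) => ({ ...v });
const convertOptions = (o: { capped?: boolean; size?: number }) => ({ ...o });

function toCollectionArgs(def: CollectionDef) {
  // Optional fields are added only when defined, replacing the inline
  // `...(x != null && { ... })` spreads the review flags.
  return {
    ...ifDefined('validator', def.validator, convertValidator),
    ...ifDefined('options', def.options, convertOptions),
  };
}
```

The benefit over inline `!= null &&` spreads is that the absent-versus-present logic lives in one typed place, which also plays well with exactOptionalPropertyTypes.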
In `@packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts`:
- Around line 163-186: The test for CreateIndexCommand in ddl-formatter.test.ts
only asserts default_language and language_override; update the test that builds
the op using CreateIndexCommand and the call to formatMongoOperations to also
assert that the formatted output contains the collation object (e.g.,
'collation: { locale: "en", strength: 2 }') and the weights mapping (e.g.,
'weights: { bio: 10 }'), and if applicable include wildcardProjection
assertions; locate the test block using the identifiers CreateIndexCommand and
formatMongoOperations and add expect(...).toContain(...) assertions for those
strings.
In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts`:
- Around line 383-928: The test file has grown past the 500-line guideline and
mixes concerns; split the suites into three focused files by moving the "M2
index vocabulary" describe block (index diffing tests referencing makeContract,
MongoSchemaIndex, planSuccess), the "validator diffing" describe block
(validator tests referencing MongoSchemaValidator, CollModCommand,
operationClass checks), and the "collection lifecycle" describe block
(collection create/drop/options tests referencing CreateCollectionCommand,
DropCollectionCommand, ALL_CLASSES_POLICY) into separate test files (e.g.,
mongo-planner.indexes.test.ts, mongo-planner.validators.test.ts,
mongo-planner.collections.test.ts), preserving any shared test helpers/imports
(planner, makeContract, irWithCollection, emptyIR) by extracting common setup
into a test helper or re-importing them; run and update imports/exports so each
new file has the necessary fixtures and keeps test behavior identical.
In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts`:
- Around line 54-56: Remove the unused _db parameter from the function signature
of planAndApply and update all call sites to match the new signature;
specifically change async function planAndApply(_db: Db, replSetUri: string, ...)
to async function planAndApply(replSetUri: string, ...) (keep the other
parameters intact) and update each call to drop the removed first argument so it
matches the new parameter order; ensure references to _db inside planAndApply
(if any) are removed and tests still compile.
- Around line 1-589: The test file exceeds the 500-line guideline; split it into
two smaller test files as suggested and move shared helpers into a common helper
module: create migration-m2-vocabulary.indexes.test.ts containing the describe
block "compound indexes" through "wildcard indexes", and
migration-m2-vocabulary.lifecycle.test.ts containing "modify indexes" through
"full lifecycle"; extract the top-level functions makeContract and planAndApply
(and the ALL_POLICY constant) into a new helper (import them into both new test
files) so both tests retain the same imports (contractToMongoSchemaIR,
MongoMigrationPlanner, MongoMigrationRunner, mongoControlDriver,
MongoMemoryReplSet, timeouts, MongoClient/Db, etc.), keep the original describe
titles and expectations unchanged, and ensure beforeAll/beforeEach/afterAll
lifecycle setup is present in each new file (or also moved to the helper as
reusable setup functions) so tests run identically but with each file under 500
lines.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: 87b1cc7f-d88c-4a71-b92a-ae35c2daf53c
⛔ Files ignored due to path filters (4)
- pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
- projects/mongo-schema-migrations/plan.md is excluded by !projects/**
- projects/mongo-schema-migrations/plans/m2-plan.md is excluded by !projects/**
- projects/mongo-schema-migrations/specs/m2-full-vocabulary.spec.md is excluded by !projects/**
📒 Files selected for processing (48)
- docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md
- docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md
- docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts
- packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/contract-types.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts
- packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts
- packages/2-mongo-family/2-authoring/contract-psl/README.md
- packages/2-mongo-family/2-authoring/contract-psl/package.json
- packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts
- packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts
- packages/2-mongo-family/2-authoring/contract-psl/src/psl-helpers.ts
- packages/2-mongo-family/2-authoring/contract-psl/test/derive-json-schema.test.ts
- packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts
- packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts
- packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/canonicalize.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection-options.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-validator.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts
- packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts
- packages/2-mongo-family/4-query/query-ast/src/ddl-visitors.ts
- packages/2-mongo-family/4-query/query-ast/src/exports/control.ts
- packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts
- packages/2-mongo-family/9-family/src/core/control-instance.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts
- packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/contract-to-schema.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-runner.test.ts
- test/integration/test/mongo/migration-m2-vocabulary.test.ts
- test/integration/test/mongo/migration-psl-authoring.test.ts
💤 Files with no reviewable changes (2)
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts
Resolved review comments (4):
- packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts
- packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts (outdated)
…alidation Extends MongoStorageCollection with optional indexes array containing key definitions (field + direction) and index options (unique, sparse, expireAfterSeconds, partialFilterExpression). Adds Arktype schema validation and test coverage for valid/invalid index shapes.
Add _prisma_migrations collection support for MongoDB: - readMarker, initMarker, updateMarker (CAS), writeLedgerEntry - MongoControlDriverInstance extending ControlDriverInstance with db access - Wire readMarker() on MongoControlFamilyInstance - Integration tests using mongodb-memory-server
…tors, and collection options M2 extends every migration pipeline layer to cover the full breadth of MongoDB server-side configuration. The spec covers: - Complete index vocabulary: wildcardProjection, collation, text index options (weights, default_language, language_override), compound wildcard indexes, clustered indexes - $jsonSchema validators with widening/destructive classification - Collection options: capped, timeseries, collation, changeStreamPreAndPostImages, clusteredIndex - PSL authoring: @@index, @@unique, @unique in Mongo interpreter - Emitter: auto-derive $jsonSchema from model fields - Canonical serialization for key-order-independent index matching The plan breaks work into 6 phases (19 tasks) with explicit dependency graph matching the M1 plan structure.
…tion, weights, default_language, language_override) Extend MongoStorageIndex type and Arktype validation schema with the remaining MongoDB index options needed for full vocabulary coverage.
…ctionOptions Add validator (jsonSchema, validationLevel, validationAction) and collection options (capped, timeseries, collation, changeStreamPreAndPostImages, clusteredIndex) to the contract type system with Arktype validation.
Add wildcardProjection, collation, weights, default_language, language_override to schema IR and update indexesEquivalent to compare new object-valued options via deepEqual.
…tionOptionsNode Add validator and collection options nodes to schema IR. Update visitor interface from unknown placeholders to typed parameters. Update collection to accept optional validator and options.
…mand, CollModCommand Add DDL command classes for collection lifecycle and collMod operations. Extend CreateIndexCommand with M2 index options. Update visitor interface with new command methods.
…erialization Used by the planner to produce deterministic index lookup keys for object-valued options like partialFilterExpression, collation, wildcardProjection, and weights.
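A canonical serializer like the one this commit describes can be sketched as a recursive key-sort plus JSON encoding. This is an illustrative version, not the package's actual `canonicalize` implementation, which may differ in edge-case handling.

```typescript
// Sketch of a key-order-independent canonical serializer, in the spirit of
// canonicalize() in mongo-schema-ir. Objects get their keys sorted recursively;
// arrays keep their order, since key order in MongoDB index specs is significant
// only for arrays/key lists, not for option objects.
function canonicalize(value: unknown): string {
  if (Array.isArray(value)) {
    return '[' + value.map(canonicalize).join(',') + ']';
  }
  if (value !== null && typeof value === 'object') {
    const entries = Object.entries(value as Record<string, unknown>)
      .sort(([a], [b]) => (a < b ? -1 : a > b ? 1 : 0))
      .map(([k, v]) => JSON.stringify(k) + ':' + canonicalize(v));
    return '{' + entries.join(',') + '}';
  }
  return JSON.stringify(value);
}
```

With this, a `partialFilterExpression` authored as `{ age: { $gt: 21 }, active: true }` and one read back from the server as `{ active: true, age: { $gt: 21 } }` produce the same lookup key, avoiding spurious drop/create churn.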
… and new index options Convert MongoStorageValidator to MongoSchemaValidator, MongoStorageCollectionOptions to MongoSchemaCollectionOptionsNode, and pass through M2 index options.
… options Add Arktype schemas and deserialization for CreateCollectionCommand, DropCollectionCommand, CollModCommand, and M2 index options on CreateIndexCommand (weights, collation, wildcardProjection, etc).
…options Add formatting for createCollection, dropCollection, collMod commands. Include M2 index options (collation, weights, wildcardProjection, etc) in createIndex formatting.
… index options Use canonicalize() for key-order-independent comparison of object-valued index options. Pass wildcardProjection, collation, weights, default_language, and language_override through to CreateIndexCommand.
…ion lifecycle Add validator add/remove/change detection via collMod. Add collection create/drop operations with options. Detect immutable option conflicts (capped, timeseries, collation, clusteredIndex). Handle mutable option changes (changeStreamPreAndPostImages). Order: creates > drops > indexes > validators > drops.
Add createCollection, dropCollection, collMod methods. Pass M2 index options (collation, wildcardProjection, weights, etc) to MongoDB driver.
…l fields Add deriveJsonSchema() utility that maps ContractField types to BSON types for $jsonSchema validation. Handles scalar fields, nullable fields, array fields, and nested value objects recursively. Returns a MongoStorageValidator with strict/error defaults.
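The derivation described above can be sketched roughly as follows. The FieldDef shape and the scalar-to-BSON mapping here are assumptions for illustration; the real deriveJsonSchema() works on ContractField types, which are not shown in this PR.

```typescript
// Hedged sketch of deriving a $jsonSchema validator from model fields.
// Field shapes and the scalar→BSON mapping are assumed, not the package's API.
type ScalarKind = 'string' | 'int' | 'float' | 'boolean' | 'datetime';

interface FieldDef {
  name: string;
  kind: ScalarKind;
  nullable?: boolean;
  many?: boolean;
}

const BSON_TYPE: Record<ScalarKind, string> = {
  string: 'string',
  int: 'int',
  float: 'double',
  boolean: 'bool',
  datetime: 'date',
};

interface DerivedValidator {
  jsonSchema: { bsonType: 'object'; required: string[]; properties: Record<string, object> };
  validationLevel: 'strict';
  validationAction: 'error';
}

function deriveJsonSchema(fields: FieldDef[]): DerivedValidator {
  const properties: Record<string, object> = {};
  const required: string[] = [];
  for (const f of fields) {
    let schema: object = { bsonType: BSON_TYPE[f.kind] };
    if (f.many) schema = { bsonType: 'array', items: schema };
    if (f.nullable) {
      // Nullable fields accept null alongside the base type.
      schema = { anyOf: [{ bsonType: 'null' }, schema] };
    } else {
      required.push(f.name);
    }
    properties[f.name] = schema;
  }
  // Matches the strict/error defaults mentioned in the commit message.
  return {
    jsonSchema: { bsonType: 'object', required, properties },
    validationLevel: 'strict',
    validationAction: 'error',
  };
}
```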
…interpreter After building model fields and value objects, call deriveJsonSchema() for each model and populate storage.collections[].validator with the derived schema (strict/error defaults). Validators are now always present alongside indexes in emitted contracts.
Add migration-m2-vocabulary.test.ts exercising compound, text, TTL,
hashed, 2dsphere, partial, collation, and wildcard indexes, plus
validator add/remove via collMod, capped collections, collection drops,
and a multi-step lifecycle test.
Fix planner to always emit dropCollection when a collection is removed
(not gated on collectionHasOptions). Fix text index pre/postchecks to
use key._fts instead of the original key spec, since MongoDB stores
text index keys as { _fts: "text", _ftsx: 1 } internally.
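A small illustration of the fix: MongoDB reports text index keys in an internal form, so pre/postchecks must test for `key._fts` rather than comparing against the authored key spec. The key shapes below follow MongoDB's documented behavior for text indexes.

```typescript
// The authored spec names the field directly, but listIndexes reports the
// internal representation, so structural comparison against the authored
// spec never matches a text index.
const authoredKey = { bio: 'text' };
const storedKey = { _fts: 'text', _ftsx: 1 }; // what the server actually stores

function isTextIndexKey(key: Record<string, unknown>): boolean {
  return key['_fts'] === 'text';
}
```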
Prove the PSL -> contract -> plan -> apply pipeline for indexes, unique constraints, $jsonSchema validators (including nullable, value objects), and @Map field name mappings against a real MongoDB instance.
origin/main introduced MongoIndex (fields: Record) and MongoCollectionOptions while the M2 branch uses MongoStorageIndex (keys: Array) and MongoStorageCollectionOptions. This commit: - Updates contract-builder to convert MongoIndex -> MongoStorageIndex - Updates contract-to-schema to import MongoContract properly - Adds mongo-contract dep to contract-psl package - Fixes type casts in tests for exactOptionalPropertyTypes - Aligns all test assertions with the keys-array index format - Removes useless constructor in MongoFamilyInstance - Fixes unused parameter lint warning
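The fields-record to keys-array conversion this merge commit describes can be sketched as below. Both type shapes are paraphrased from the PR description (MongoIndex with `fields: Record`, MongoStorageIndex with `keys: Array`), not the packages' actual exports.

```typescript
// Rough sketch of converting the origin/main MongoIndex shape to the M2
// MongoStorageIndex shape. Type names suffixed with "Like" are stand-ins.
type Direction = 1 | -1;

interface MongoIndexLike {
  fields: Record<string, Direction>;
  unique?: boolean;
}

interface MongoStorageIndexLike {
  keys: Array<{ field: string; direction: Direction }>;
  unique?: boolean;
}

function toStorageIndex(index: MongoIndexLike): MongoStorageIndexLike {
  return {
    keys: Object.entries(index.fields).map(([field, direction]) => ({ field, direction })),
    // Copy the flag only when set, to stay compatible with
    // exactOptionalPropertyTypes (mentioned in the commit above).
    ...(index.unique !== undefined && { unique: index.unique }),
  };
}
```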
…checks Validator operations now follow the spec classification matrix: - Removal: widening (was destructive) - Add: destructive - validationAction error->warn: widening; warn->error: destructive - validationLevel strict->moderate: widening; moderate->strict: destructive - jsonSchema body change: destructive (conservative default) - Mixed widening+destructive: destructive Also adds ListCollections-based prechecks (collection exists) and postchecks (validator applied/removed) to validator operations, and classifies disabling changeStreamPreAndPostImages as destructive (enabling remains widening).
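The classification matrix above can be sketched as a function over origin/destination validators. This is an interpretive sketch of the spec's rules, not the planner's actual classifyValidatorUpdate; in particular, the real code compares jsonSchema bodies with canonicalize() rather than JSON.stringify, which is used here only for brevity.

```typescript
// Hedged sketch of the validator-change classification matrix.
type Classification = 'widening' | 'destructive';

interface Validator {
  jsonSchema: object;
  validationLevel: 'strict' | 'moderate';
  validationAction: 'error' | 'warn';
}

function classifyValidatorChange(
  origin: Validator | undefined,
  dest: Validator | undefined,
): Classification | 'none' {
  if (origin == null && dest == null) return 'none';
  if (origin == null) return 'destructive'; // adding a validator constrains existing docs
  if (dest == null) return 'widening'; // removal relaxes constraints
  const parts: Classification[] = [];
  if (origin.validationAction !== dest.validationAction) {
    parts.push(dest.validationAction === 'warn' ? 'widening' : 'destructive');
  }
  if (origin.validationLevel !== dest.validationLevel) {
    parts.push(dest.validationLevel === 'moderate' ? 'widening' : 'destructive');
  }
  // Simplification: the real planner uses canonicalize() here (see A08).
  if (JSON.stringify(origin.jsonSchema) !== JSON.stringify(dest.jsonSchema)) {
    parts.push('destructive'); // conservative default for body changes
  }
  if (parts.length === 0) return 'none';
  // Mixed widening+destructive collapses to destructive, per the matrix.
  return parts.includes('destructive') ? 'destructive' : 'widening';
}
```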
…de, type PSL index builder - Remove extract-mongo-statements.ts (incomplete, duplicates ddl-formatter.ts) - Return undefined for mongo family in extractOperationStatements - Rename MongoSchemaCollectionOptionsNode -> MongoSchemaCollectionOptions for consistency with MongoSchemaCollection, MongoSchemaIndex, MongoSchemaValidator - Replace Record<string, unknown> index builder in PSL interpreter with typed construction; validate weights values are numbers - Fix typo: exercices -> exercises in integration test
…ation test Add negative validation tests for: - capped option without required size field - invalid wildcardProjection values (2 instead of 0|1) - validator missing jsonSchema, validationLevel, or validationAction Add language_override to text index integration test to verify it flows through the full plan->apply->verify pipeline.
…for M2 ADR 187: All four schema IR nodes are now implemented (validator, collection options); update visitor types from unknown to concrete; update index node to include M2 options. ADR 188: Update DDL vocabulary to include CreateCollectionCommand, DropCollectionCommand, and CollModCommand. ADR 189: Update buildIndexLookupKey pseudocode to use canonicalize() instead of JSON.stringify; add M2 index options to the lookup key. schema-ir README: Update indexesEquivalent description with M2 options; fix dependents section (IR produced by contractToMongoSchemaIR in adapter-mongo); add canonicalize to responsibilities. contract-psl README: Document known PSL limitations (collation, partialFilterExpression, wildcardProjection not supported). Add JSDoc to deepEqual noting key-order sensitivity.
…sts (F12, F14) F14: add unit tests verifying ADDITIVE_ONLY_POLICY rejects destructive validator-add operations and widening policy permits validator removal. F12: add integration tests for collection-level collation, changeStreamPreAndPostImages (enable + toggle), timeseries, and clusteredIndex options via the full plan-and-apply E2E pipeline.
…s access
Cast listCollections results to Record<string, unknown> before
accessing options, since the MongoDB driver union type
(CollectionInfo | Pick<CollectionInfo, "name"|"type">) does not
expose options directly. Also fix MongoStorage cast in
migration-psl-authoring by routing through unknown first.
Update validator-removal test to accept MongoDB behavior where
collMod with validator:{} leaves an empty validator object.
Add tests covering uncovered branches across four files: - command-executor: text-index options, wildcardProjection, changeStreamPreAndPostImages, collation, timeseries, clusteredIndex - ddl-formatter: wildcardProjection, partialFilterExpression, changeStreamPreAndPostImages for createCollection and collMod - mongo-ops-serializer: invalid and/or/not filter validation, createCollection M2 options round-trip - mongo-runner: marker-origin-mismatch edge cases (marker exists with no plan origin, no marker with plan origin), CAS failure Branch coverage rises from 87.56% to 93.05%.
…ncy skip
The validator-removal operation used a postcheck that matched
options.validationLevel=strict, which was already satisfied when
the original validator also used strict. This caused the idempotency
probe to skip the collMod entirely, leaving the validator in place.
Empty the postcheck array so the operation always executes. The collMod
with validator:{} is inherently idempotent so no safety is lost.
Force-pushed from c6c3885 to b13db34.
A07: Use canonicalize() instead of deepEqual() for object-valued index options (partialFilterExpression, wildcardProjection, collation, weights) in indexesEquivalent. This prevents spurious drop/create churn from harmless key-order differences. A08: Use canonicalize() instead of deepEqual() for validator jsonSchema comparison in validatorsEqual and classifyValidatorUpdate. Prevents spurious destructive collMod plans from key reordering. A09: Strengthen add-validator postcheck to also assert validationAction, not just validationLevel. A10: Fix hasImmutableOptionChange to compare each field individually via origin?.field vs dest?.field, detecting additions and removals even when one side has no options object.
…ormat
The generated contract fixture used the old {fields, options} index
shape but MongoStorageIndex now expects {keys: [{field, direction}],
unique?, ...}. Update fixture and test assertions to match.
Actionable comments posted: 3
♻️ Duplicate comments (4)
packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts (3)
535-537: ⚠️ Potential issue | 🟠 Major — Don't materialize collections[...] before polymorphism is resolved.
These assignments overwrite any existing collection entry per model. When multiple models collapse into one Mongo collection, earlier indexes/validators are lost, and variant-only collection entries can survive in storage.collections even after line 624 rewrites the model's storage.collection.
Also applies to: 559-565
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts` around lines 535 - 537, The code currently assigns collections[collectionName] immediately when processing each model (via models[pslModel.name] = ... and collections[collectionName] = modelIndexes.length > 0 ? { indexes: modelIndexes } : {}), which overwrites prior entries and can leave variant-only collection metadata before polymorphism resolution; instead defer or merge collection metadata: do not materialize collections[collectionName] directly inside the per-model loop—collect modelIndexes (using collectIndexes) and accumulate them per collectionName (merging indexes/validators) or store them in a temporary structure keyed by collectionName, then after polymorphism is resolved (where storage.collection is finalized) write a single merged entry into collections for each concrete collection; update any logic that assumes one-shot assignment (references: models[pslModel.name], collectIndexes, collections[collectionName], and model storage.collection) to merge rather than overwrite.
278-285: ⚠️ Potential issue | 🟠 Major — Reject unknown type: values instead of coercing them to ascending.
A typo here still produces direction: 1, so PSL authoring emits a valid-but-wrong index instead of surfacing a diagnostic. Make this parser return undefined/an error for unknown values and skip building the index at line 341.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts` around lines 278 - 285, parseIndexDirection currently coerces unknown type strings to 1; change its signature to return MongoIndexKeyDirection | undefined and make it return undefined for unrecognized values (i.e., remove the final "return 1" fallback). Then update the index-building logic that calls parseIndexDirection (where index keys are assembled from PSL field entries) to skip creating that index if any field direction is undefined and emit a diagnostic/error for the unknown `type:` value instead of silently using ascending. Refer to parseIndexDirection and the index-construction loop that consumes its result and add the conditional skip+diagnostic there.
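The suggested fix can be sketched as below. The accepted direction literals (Asc, Desc, and so on) and the diagnostic-reporting callback are assumptions for illustration; only the shape of the fix (return undefined, skip the index, surface a diagnostic) comes from the review comment.

```typescript
// Sketch: an unknown `type:` value yields undefined so the caller can skip
// the index and emit a diagnostic, instead of silently defaulting to 1.
type MongoIndexKeyDirection = 1 | -1 | 'text' | 'hashed' | '2dsphere';

function parseIndexDirection(raw: string): MongoIndexKeyDirection | undefined {
  switch (raw) {
    case 'Asc': return 1;
    case 'Desc': return -1;
    case 'Text': return 'text';
    case 'Hashed': return 'hashed';
    case 'Geo2dSphere': return '2dsphere';
    default: return undefined; // no silent fallback to ascending
  }
}

function buildKeys(
  entries: Array<{ field: string; type: string }>,
  report: (msg: string) => void,
): Array<{ field: string; direction: MongoIndexKeyDirection }> | undefined {
  const keys: Array<{ field: string; direction: MongoIndexKeyDirection }> = [];
  for (const e of entries) {
    const direction = parseIndexDirection(e.type);
    if (direction === undefined) {
      report(`unknown index type "${e.type}" on field "${e.field}"`);
      return undefined; // skip building this index entirely
    }
    keys.push({ field: e.field, direction });
  }
  return keys;
}
```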
352-369: ⚠️ Potential issue | 🟠 Major — Normalize text-index option field names through @map.
keys are converted to storage names at lines 343-346, but weights keys and language_override still keep PSL names. For mapped fields, the text index targets the stored field while its option payload still points at the authoring name.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts` around lines 352 - 369, The weights and language override/default_language options are left using PSL (authoring) names while the index keys were already converted to storage names; update the parsing for rawWeights, rawDefaultLang and rawLangOverride so that weights' keys are converted with the same mapping used for the index keys (the same conversion logic used when transforming keys at lines 343-346) and apply that mapping to the single field name extracted for language_override/default_language as well (i.e., map the authoring name to the storage name before assigning into weights and before setting language_override/default_language).
packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts (1)
31-38: ⚠️ Potential issue | 🟠 Major — Honor nullable for embedded value objects.
This branch returns the object schema directly, so optional value-object fields still reject null. Wrap the derived object schema in an anyOf with { bsonType: 'null' } so nested required rules are preserved when the field is non-null.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts` around lines 31 - 38, The valueObject branch currently returns voSchema directly so nullable value-object fields still reject null; update the branch in derive-json-schema.ts (the code that checks field.type.kind === 'valueObject' and uses deriveObjectSchema(vo.fields,...)) to honor field.nullable: if field.nullable is true, wrap the result in an anyOf that includes { bsonType: 'null' } and the original schema; for the 'many' case return anyOf: [{ bsonType: 'null' }, { bsonType: 'array', items: voSchema }], and for the non-many case return anyOf: [{ bsonType: 'null' }, voSchema] so nested required rules are preserved when non-null.
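The suggested wrapping can be sketched as a small helper. The function name and the field-shape parameters here are hypothetical; only the anyOf-with-null structure comes from the review comment.

```typescript
// Minimal sketch of the nullable wrapping suggested for value-object fields.
// wrapNullable is a made-up name; deriveObjectSchema's output is passed in
// as a plain BsonSchema stand-in.
interface BsonSchema { [key: string]: unknown; }

function wrapNullable(voSchema: BsonSchema, nullable: boolean, many: boolean): BsonSchema {
  const base: BsonSchema = many ? { bsonType: 'array', items: voSchema } : voSchema;
  // When nullable, accept null alongside the object/array schema so nested
  // `required` rules still apply whenever the value is non-null.
  return nullable ? { anyOf: [{ bsonType: 'null' }, base] } : base;
}
```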
🧹 Nitpick comments (3)
test/integration/test/mongo/migration-m2-vocabulary.test.ts (1)
99-730: Split this E2E suite by feature area.
The new suite is already partitioned into index vocabulary, validators, collection options, and lifecycle flows, but keeping all of that in one file leaves it well over the repo limit. Separate files will make environment-specific failures much easier to isolate.
As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files." and "Split test files when they exceed 500 lines, contain multiple distinct concerns that can be logically separated, or have multiple top-level describe blocks that can be split by functionality."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts` around lines 99 - 730, The file exceeds the 500-line guideline and mixes distinct concerns; split the single E2E suite into multiple test files grouped by feature area (e.g., index vocabulary tests, validator tests, collection-options tests, and lifecycle/drop tests). Locate usages of makeContract and planAndApply and move the relevant describe blocks (e.g., "compound indexes", "text indexes", "TTL indexes", "hashed indexes", "2dsphere indexes", "partial indexes", "indexes with collation", "wildcard indexes" → indexes file; "validators via collMod" → validators file; "collection with options" → collection options file; "collection drops" and "full lifecycle: create → modify → remove" → lifecycle file) into their respective new test files, exporting or reusing any shared helpers (makeContract, planAndApply, timeouts, MongoMemoryReplSet setup) by extracting common beforeAll/afterAll/test helpers into a shared test util module so each new file remains under 500 lines and reuses setup/teardown logic.
packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts (1)
428-971: Split planner coverage into feature-area files.
This added validator/collection coverage pushes the file close to 1,000 lines, and the suites already separate naturally into indexes, validators, and collection lifecycle. Please split it so each area is easier to maintain and debug.
As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files." and "Split test files when they exceed 500 lines, contain multiple distinct concerns that can be logically separated, or have multiple top-level describe blocks that can be split by functionality."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts` around lines 428 - 971, The test file is too large and should be split by feature area; move the "M2 index vocabulary", "validator diffing", and "collection lifecycle" top-level describe blocks into separate test files so each file stays under ~500 lines. Create three new test files each containing one of those describe suites and copy any helper references they use (planner, makeContract, planSuccess, irWithCollection, emptyIR, MongoSchemaCollection, MongoSchemaIndex, MongoSchemaValidator, CreateIndexCommand, CollModCommand, CreateCollectionCommand, DropCollectionCommand, MongoMigrationPlanOperation, ALL_CLASSES_POLICY) or import those from a shared test-utils file; if helpers are duplicated across the new files, extract common setup into a shared module to avoid duplication and ensure beforeEach/afterEach/setup remains consistent, then remove the moved suites from the original file so the original stays concise.
packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts (1)
477-696: Split this serializer suite by round-trip vs rejection cases.
These additions push the file well past the repo's 500-line limit, and the happy-path command coverage is now mixed with malformed-input validation. Breaking this into feature-area files will keep failures easier to navigate.
As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files." and "Split test files when they exceed 500 lines, contain multiple distinct concerns that can be logically separated, or have multiple top-level describe blocks that can be split by functionality."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts` around lines 477 - 696, The test file exceeds the 500-line guideline and mixes happy-path serialization round-trip tests with malformed-input rejection cases; split it into two files: one for round-trip happy-paths and one for rejection/validation cases. Create e.g. mongo-ops-serializer.roundtrip.test.ts containing the tests named "round-trips createIndex with M2 options", "round-trips createCollection command", "round-trips dropCollection command", "round-trips collMod command", and "round-trips createCollection with M2 options" and keep their helpers (serializeMongoOps/deserializeMongoOps, command classes) imported; create mongo-ops-serializer.rejections.test.ts containing "rejects and filter with non-array exprs", "rejects or filter with non-array exprs", and "rejects not filter with missing expr" with the same shared imports. Update any top-level describe blocks or shared setup so both new files run independently and remove the moved tests from the original file.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts`:
- Around line 348-379: The interpreter currently parses sparse,
expireAfterSeconds, weights, default_language and language_override from attr
but omits partialFilterExpression and wildcardProjection; update the same
parsing pattern to read getNamedArgument(attr, 'partialFilterExpression') and
getNamedArgument(attr, 'wildcardProjection') (use parseJsonArg for both, similar
to rawWeights) and normalize/validate the results (accept only object values for
partialFilterExpression and wildcardProjection, stripping quotes where needed
like rawDefaultLang), then include them in the MongoStorageIndex literal (add
...(partialFilterExpression != null && { partialFilterExpression }) and
...(wildcardProjection != null && { wildcardProjection })). Ensure you reference
the existing symbols (attr, getNamedArgument, parseJsonArg, rawWeights, index)
so the change mirrors the existing parsing and conditional inclusion style.
- Around line 335-346: The index positional argument is being treated as raw
names by parseFieldList so entries like createdAt(sort: Desc) are not parsed for
per-field directions; update the index parsing in interpreter.ts to parse the
positional argument structurally (not via parseFieldList) — use
getPositionalArgument(attr, 0) to obtain the raw AST/token for the bracketed
list, iterate its elements and for each element parse any per-field modifiers
(e.g., sort or type) to compute the correct direction via parseIndexDirection
(or an adapted helper) instead of applying a single direction to all fields;
then build keys using fieldMappings.pslNameToMapped.get(name) ?? name with each
element’s computed direction so descending and mixed compound indexes are
handled correctly.
In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts`:
- Around line 579-599: The clusteredIndex fixture in the test is incomplete
(only { name: 'myCluster' }), causing valid-server runs to treat it as invalid;
update the contract passed to makeContract to include a full clusteredIndex
object with at minimum a key spec (e.g., { field: 1 } or similar) and a unique
boolean (unique: true/false) so it matches the shape expected by the
create-collection command in ddl-commands.ts; adjust the
clustered.options.clusteredIndex in the test contract and re-run planAndApply to
validate the feature test.
---
Duplicate comments:
In `@packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts`:
- Around line 31-38: The valueObject branch currently returns voSchema directly
so nullable value-object fields still reject null; update the branch in
derive-json-schema.ts (the code that checks field.type.kind === 'valueObject'
and uses deriveObjectSchema(vo.fields,...)) to honor field.nullable: if
field.nullable is true, wrap the result in an anyOf that includes { bsonType:
'null' } and the original schema; for the 'many' case return anyOf: [{ bsonType:
'null' }, { bsonType: 'array', items: voSchema }], and for the non-many case
return anyOf: [{ bsonType: 'null' }, voSchema] so nested required rules are
preserved when non-null.
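To make the suggested nullable wrapping concrete, here is a minimal sketch assuming a simplified schema shape; the helper name `wrapValueObjectSchema` and the options object are invented for illustration, and the real derive-json-schema.ts branches on `field.type.kind` instead:

```typescript
// Sketch only: models the anyOf wrapping the comment above describes.
type BsonSchema = Record<string, unknown>;

function wrapValueObjectSchema(
  voSchema: BsonSchema,
  opts: { nullable?: boolean; many?: boolean },
): BsonSchema {
  // Arrays of value objects nest the item schema under `items`.
  const base: BsonSchema = opts.many ? { bsonType: 'array', items: voSchema } : voSchema;
  // A nullable field accepts null OR the full schema, so nested `required`
  // rules still apply whenever a non-null value is present.
  return opts.nullable ? { anyOf: [{ bsonType: 'null' }, base] } : base;
}

// Example: a nullable list of address value objects.
const addressSchema: BsonSchema = {
  bsonType: 'object',
  required: ['street'],
  properties: { street: { bsonType: 'string' } },
};
const wrapped = wrapValueObjectSchema(addressSchema, { nullable: true, many: true });
```

The key point is that `anyOf` preserves the inner `required` list: documents with a non-null value are still validated against the full value-object schema.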
In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts`:
- Around line 535-537: The code currently assigns collections[collectionName]
immediately when processing each model (via models[pslModel.name] = ... and
collections[collectionName] = modelIndexes.length > 0 ? { indexes: modelIndexes
} : {}), which overwrites prior entries and can leave variant-only collection
metadata before polymorphism resolution; instead defer or merge collection
metadata: do not materialize collections[collectionName] directly inside the
per-model loop—collect modelIndexes (using collectIndexes) and accumulate them
per collectionName (merging indexes/validators) or store them in a temporary
structure keyed by collectionName, then after polymorphism is resolved (where
storage.collection is finalized) write a single merged entry into collections
for each concrete collection; update any logic that assumes one-shot assignment
(references: models[pslModel.name], collectIndexes, collections[collectionName],
and model storage.collection) to merge rather than overwrite.
- Around line 278-285: parseIndexDirection currently coerces unknown type
strings to 1; change its signature to return MongoIndexKeyDirection | undefined
and make it return undefined for unrecognized values (i.e., remove the final
"return 1" fallback). Then update the index-building logic that calls
parseIndexDirection (where index keys are assembled from PSL field entries) to
skip creating that index if any field direction is undefined and emit a
diagnostic/error for the unknown `type:` value instead of silently using
ascending. Refer to parseIndexDirection and the index-construction loop that
consumes its result and add the conditional skip+diagnostic there.
- Around line 352-369: The weights and language override/default_language
options are left using PSL (authoring) names while the index keys were already
converted to storage names; update the parsing for rawWeights, rawDefaultLang
and rawLangOverride so that weights' keys are converted with the same mapping
used for the index keys (the same conversion logic used when transforming keys
at lines 343-346) and apply that mapping to the single field name extracted for
language_override/default_language as well (i.e., map the authoring name to the
storage name before assigning into weights and before setting
language_override/default_language).
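The remapping asked for above can be sketched as follows; `remapTextOptions` is an invented helper and `pslNameToMapped` stands in for the interpreter's `fieldMappings`, so treat this as illustrative rather than the actual implementation:

```typescript
// Illustrative sketch: translate text-index option field names from PSL
// (authoring) names to storage names, mirroring the mapping applied to keys.
function remapTextOptions(
  pslNameToMapped: Map<string, string>,
  weights: Record<string, number> | undefined,
  languageOverride: string | undefined,
): { weights?: Record<string, number>; language_override?: string } {
  const out: { weights?: Record<string, number>; language_override?: string } = {};
  if (weights) {
    out.weights = {};
    for (const [pslName, w] of Object.entries(weights)) {
      // Fall back to the authoring name when no mapping exists.
      out.weights[pslNameToMapped.get(pslName) ?? pslName] = w;
    }
  }
  if (languageOverride) {
    out.language_override = pslNameToMapped.get(languageOverride) ?? languageOverride;
  }
  return out;
}
```

Applying one mapping to both keys and options keeps the emitted index internally consistent, so the planner's equivalence check never sees storage-named keys next to authoring-named weights.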
---
Nitpick comments:
In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts`:
- Around line 477-696: The test file exceeds the 500-line guideline and mixes
happy-path serialization round-trip tests with malformed-input rejection cases;
split it into two files: one for round-trip happy-paths and one for
rejection/validation cases. Create e.g. mongo-ops-serializer.roundtrip.test.ts
containing the tests named "round-trips createIndex with M2 options",
"round-trips createCollection command", "round-trips dropCollection command",
"round-trips collMod command", and "round-trips createCollection with M2
options" and keep their helpers (serializeMongoOps/deserializeMongoOps, command
classes) imported; create mongo-ops-serializer.rejections.test.ts containing
"rejects and filter with non-array exprs", "rejects or filter with non-array
exprs", and "rejects not filter with missing expr" with the same shared imports.
Update any top-level describe blocks or shared setup so both new files run
independently and remove the moved tests from the original file.
In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts`:
- Around line 428-971: The test file is too large and should be split by feature
area; move the "M2 index vocabulary", "validator diffing", and "collection
lifecycle" top-level describe blocks into separate test files so each file stays
under ~500 lines. Create three new test files each containing one of those
describe suites and copy any helper references they use (planner, makeContract,
planSuccess, irWithCollection, emptyIR, MongoSchemaCollection, MongoSchemaIndex,
MongoSchemaValidator, CreateIndexCommand, CollModCommand,
CreateCollectionCommand, DropCollectionCommand, MongoMigrationPlanOperation,
ALL_CLASSES_POLICY) or import those from a shared test-utils file; if helpers
are duplicated across the new files, extract common setup into a shared module
to avoid duplication and ensure beforeEach/afterEach/setup remains consistent,
then remove the moved suites from the original file so the original stays
concise.
In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts`:
- Around line 99-730: The file exceeds the 500-line guideline and mixes distinct
concerns; split the single E2E suite into multiple test files grouped by feature
area (e.g., index vocabulary tests, validator tests, collection-options tests,
and lifecycle/drop tests). Locate usages of makeContract and planAndApply and
move the relevant describe blocks (e.g., "compound indexes", "text indexes",
"TTL indexes", "hashed indexes", "2dsphere indexes", "partial indexes", "indexes
with collation", "wildcard indexes" → indexes file; "validators via collMod" →
validators file; "collection with options" → collection options file;
"collection drops" and "full lifecycle: create → modify → remove" → lifecycle
file) into their respective new test files, exporting or reusing any shared
helpers (makeContract, planAndApply, timeouts, MongoMemoryReplSet setup) by
extracting common beforeAll/afterAll/test helpers into a shared test util module
so each new file remains under 500 lines and reuses setup/teardown logic.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: dffa3a81-6be3-4bd3-bd3c-006483ab3c7c
⛔ Files ignored due to path filters (4)
- pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
- projects/mongo-schema-migrations/plan.md is excluded by !projects/**
- projects/mongo-schema-migrations/plans/m2-plan.md is excluded by !projects/**
- projects/mongo-schema-migrations/specs/m2-full-vocabulary.spec.md is excluded by !projects/**
📒 Files selected for processing (48)
- docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md
- docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md
- docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts
- packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/contract-types.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts
- packages/2-mongo-family/1-foundation/mongo-contract/test/validate.test.ts
- packages/2-mongo-family/2-authoring/contract-psl/README.md
- packages/2-mongo-family/2-authoring/contract-psl/package.json
- packages/2-mongo-family/2-authoring/contract-psl/src/derive-json-schema.ts
- packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts
- packages/2-mongo-family/2-authoring/contract-psl/src/psl-helpers.ts
- packages/2-mongo-family/2-authoring/contract-psl/test/derive-json-schema.test.ts
- packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts
- packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts
- packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/canonicalize.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection-options.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-validator.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts
- packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts
- packages/2-mongo-family/4-query/query-ast/src/ddl-visitors.ts
- packages/2-mongo-family/4-query/query-ast/src/exports/control.ts
- packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts
- packages/2-mongo-family/9-family/src/core/control-instance.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/contract-to-schema.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-ops-serializer.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts
- packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/contract-to-schema.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-ops-serializer.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-runner.test.ts
- test/integration/test/mongo/migration-m2-vocabulary.test.ts
- test/integration/test/mongo/migration-psl-authoring.test.ts
💤 Files with no reviewable changes (2)
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-operation-statements.ts
- packages/1-framework/3-tooling/cli/src/control-api/operations/extract-mongo-statements.ts
✅ Files skipped from review due to trivial changes (11)
- packages/2-mongo-family/2-authoring/contract-psl/README.md
- packages/2-mongo-family/2-authoring/contract-psl/package.json
- packages/2-mongo-family/2-authoring/contract-ts/test/contract-builder.dsl.test.ts
- docs/architecture docs/adrs/ADR 188 - MongoDB migration operation model.md
- packages/2-mongo-family/1-foundation/mongo-contract/src/exports/index.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-index.ts
- packages/2-mongo-family/2-authoring/contract-psl/test/derive-json-schema.test.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-validator.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/README.md
- test/integration/test/mongo/migration-psl-authoring.test.ts
- docs/architecture docs/adrs/ADR 189 - Structural index matching for MongoDB migrations.md
🚧 Files skipped from review as they are similar to previous changes (19)
- packages/1-framework/3-tooling/cli/test/extract-operation-statements.test.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/types.ts
- packages/2-mongo-family/4-query/query-ast/test/ddl-commands.test.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/ddl-formatter.ts
- packages/3-mongo-target/2-mongo-adapter/test/ddl-formatter.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/command-executor.test.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/canonicalize.ts
- packages/2-mongo-family/2-authoring/contract-ts/src/contract-builder.ts
- docs/architecture docs/adrs/ADR 187 - MongoDB schema representation for migration diffing.md
- packages/3-mongo-target/2-mongo-adapter/src/core/command-executor.ts
- packages/2-mongo-family/1-foundation/mongo-contract/src/contract-schema.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/schema-collection-options.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/visitor.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/exports/index.ts
- packages/2-mongo-family/9-family/src/core/control-instance.ts
- packages/2-mongo-family/2-authoring/contract-psl/test/interpreter.test.ts
```ts
const fieldsArg = getPositionalArgument(attr, 0);
if (!fieldsArg) continue;
const fieldNames = parseFieldList(fieldsArg);
if (fieldNames.length === 0) continue;

const typeArg = getNamedArgument(attr, 'type');
const direction = parseIndexDirection(typeArg);

const keys = fieldNames.map((name) => ({
  field: fieldMappings.pslNameToMapped.get(name) ?? name,
  direction,
}));
```
@@index([field(sort: Desc)]) is misparsed here.
parseFieldList() only splits the raw string, so entries like createdAt(sort: Desc) become literal field names and still get direction: 1. Descending PSL indexes and mixed-sort compound indexes will be compiled incorrectly until the field list is parsed structurally.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts` around
lines 335 - 346, The index positional argument is being treated as raw names by
parseFieldList so entries like createdAt(sort: Desc) are not parsed for
per-field directions; update the index parsing in interpreter.ts to parse the
positional argument structurally (not via parseFieldList) — use
getPositionalArgument(attr, 0) to obtain the raw AST/token for the bracketed
list, iterate its elements and for each element parse any per-field modifiers
(e.g., sort or type) to compute the correct direction via parseIndexDirection
(or an adapted helper) instead of applying a single direction to all fields;
then build keys using fieldMappings.pslNameToMapped.get(name) ?? name with each
element’s computed direction so descending and mixed compound indexes are
handled correctly.
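As a sketch of what per-entry parsing must produce, assuming PSL-style `sort: Asc|Desc` modifiers; the real fix should walk the AST obtained from getPositionalArgument rather than parse strings, so this regex-based version only illustrates the per-field direction handling:

```typescript
// Hypothetical parser for index field entries like "createdAt(sort: Desc)".
interface IndexFieldEntry {
  name: string;
  direction: 1 | -1;
}

function parseIndexFieldEntry(entry: string): IndexFieldEntry | undefined {
  const m = /^(\w+)(?:\(\s*sort\s*:\s*(\w+)\s*\))?$/.exec(entry.trim());
  if (!m) return undefined;
  const [, name, sort] = m;
  if (sort === undefined || sort === 'Asc') return { name, direction: 1 };
  if (sort === 'Desc') return { name, direction: -1 };
  return undefined; // unknown sort keyword: caller should emit a diagnostic
}
```

Returning `undefined` for unknown modifiers (instead of defaulting to ascending) lets the caller skip the index and surface a diagnostic, which also addresses the `parseIndexDirection` fallback noted earlier.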
```ts
const unique = attr.name === 'unique' ? true : undefined;
const sparse = parseBooleanArg(getNamedArgument(attr, 'sparse'));
const expireAfterSeconds = parseNumericArg(getNamedArgument(attr, 'expireAfterSeconds'));

const rawWeights = parseJsonArg(getNamedArgument(attr, 'weights'));
let weights: Record<string, number> | undefined;
if (rawWeights) {
  weights = {};
  for (const [k, v] of Object.entries(rawWeights)) {
    if (typeof v === 'number') weights[k] = v;
  }
}

const rawDefaultLang = getNamedArgument(attr, 'default_language');
const default_language = rawDefaultLang
  ? rawDefaultLang.replace(/^["']/, '').replace(/["']$/, '')
  : undefined;

const rawLangOverride = getNamedArgument(attr, 'language_override');
const language_override = rawLangOverride
  ? rawLangOverride.replace(/^["']/, '').replace(/["']$/, '')
  : undefined;

const index: MongoStorageIndex = {
  keys,
  ...(unique != null && { unique }),
  ...(sparse != null && { sparse }),
  ...(expireAfterSeconds != null && { expireAfterSeconds }),
  ...(weights != null && { weights }),
  ...(default_language != null && { default_language }),
  ...(language_override != null && { language_override }),
};
```
PSL authoring still drops partialFilterExpression and wildcardProjection.
The storage/planning layers understand these options, but this interpreter never reads them from @@index / @@unique, so the PSL path cannot express them. Parse them alongside sparse and expireAfterSeconds.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/2-mongo-family/2-authoring/contract-psl/src/interpreter.ts` around
lines 348 - 379, The interpreter currently parses sparse, expireAfterSeconds,
weights, default_language and language_override from attr but omits
partialFilterExpression and wildcardProjection; update the same parsing pattern
to read getNamedArgument(attr, 'partialFilterExpression') and
getNamedArgument(attr, 'wildcardProjection') (use parseJsonArg for both, similar
to rawWeights) and normalize/validate the results (accept only object values for
partialFilterExpression and wildcardProjection, stripping quotes where needed
like rawDefaultLang), then include them in the MongoStorageIndex literal (add
...(partialFilterExpression != null && { partialFilterExpression }) and
...(wildcardProjection != null && { wildcardProjection })). Ensure you reference
the existing symbols (attr, getNamedArgument, parseJsonArg, rawWeights, index)
so the change mirrors the existing parsing and conditional inclusion style.
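The "accept only object values" rule and the conditional-spread inclusion style the prompt describes can be sketched like this; `asObjectOption` and `sampleIndex` are hypothetical names for illustration, not code from the interpreter:

```typescript
// Illustrative normalizer: object-valued index options such as
// partialFilterExpression and wildcardProjection must be plain objects.
function asObjectOption(value: unknown): Record<string, unknown> | undefined {
  if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
    return value as Record<string, unknown>;
  }
  return undefined; // reject arrays, primitives, and null
}

// Conditional spread keeps absent options out of the index literal entirely.
const partialFilterExpression = asObjectOption({ status: { $eq: 'active' } });
const sampleIndex = {
  keys: [{ field: 'status', direction: 1 }],
  ...(partialFilterExpression != null && { partialFilterExpression }),
};
```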
```ts
it('creates a collection with clusteredIndex', async () => {
  const contract = makeContract(
    {
      clustered: {
        options: {
          clusteredIndex: { name: 'myCluster' },
        },
      },
    },
    'clustered-coll',
  );

  try {
    await planAndApply(db, replSetUri, null, contract);
  } catch (e) {
    const msg = e instanceof Error ? e.message : String(e);
    if (msg.includes('not supported') || msg.includes('requires') || msg.includes('unknown')) {
      console.log(`Skipping clusteredIndex test: ${msg}`);
      return;
    }
    throw e;
```
Use a complete clusteredIndex fixture here.
The create-collection command shape in packages/2-mongo-family/4-query/query-ast/src/ddl-commands.ts (Lines 92-97) expects clusteredIndex to include at least key and unique, but this test only passes { name: 'myCluster' }. On a server that actually supports clustered collections, that turns this into an invalid-payload test instead of a feature test.
🧪 Example fixture update
```diff
   options: {
-    clusteredIndex: { name: 'myCluster' },
+    clusteredIndex: { key: { _id: 1 }, unique: true, name: 'myCluster' },
   },
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```ts
it('creates a collection with clusteredIndex', async () => {
  const contract = makeContract(
    {
      clustered: {
        options: {
          clusteredIndex: { key: { _id: 1 }, unique: true, name: 'myCluster' },
        },
      },
    },
    'clustered-coll',
  );

  try {
    await planAndApply(db, replSetUri, null, contract);
  } catch (e) {
    const msg = e instanceof Error ? e.message : String(e);
    if (msg.includes('not supported') || msg.includes('requires') || msg.includes('unknown')) {
      console.log(`Skipping clusteredIndex test: ${msg}`);
      return;
    }
    throw e;
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@test/integration/test/mongo/migration-m2-vocabulary.test.ts` around lines 579
- 599, The clusteredIndex fixture in the test is incomplete (only { name:
'myCluster' }), causing valid-server runs to treat it as invalid; update the
contract passed to makeContract to include a full clusteredIndex object with at
minimum a key spec (e.g., { field: 1 } or similar) and a unique boolean (unique:
true/false) so it matches the shape expected by the create-collection command in
ddl-commands.ts; adjust the clustered.options.clusteredIndex in the test
contract and re-run planAndApply to validate the feature test.
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts (1)
35-50: ⚠️ Potential issue | 🔴 Critical

Index verification does not cover the expanded M2 vocabulary.

`buildIndexLookupKey()` now treats `partialFilterExpression`, `wildcardProjection`, `collation`, `weights`, `default_language`, and `language_override` as part of index identity, but the runtime filter still only matches key/unique — and for text indexes just `key._fts = 'text'`. If one of those fields is dropped by formatting or serialization, the runner can still mark the migration successful with the wrong index definition. Build the pre/post filters from the same option set used for equivalence.

Also applies to: 65-71, 88-99
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts` around lines 35 - 50, The index verification is incomplete: update the runtime pre/post index filters to be derived from the same option set used by buildIndexLookupKey (the opts array: unique, sparse, ttl/expireAfterSeconds, partialFilterExpression, wildcardProjection, collation, weights, default_language, language_override) so that the matcher compares all those properties (using canonicalize for object-valued options) instead of only key/unique (and key._fts for text); locate buildIndexLookupKey and the index comparison/matching code that currently only checks key/unique/text and change it to construct filters from that same opts list and compare each option consistently for equivalence.
♻️ Duplicate comments (1)
packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts (1)
217-225: ⚠️ Potential issue | 🔴 Critical

Validator postcheck still omits the `$jsonSchema` body.

`options.validationLevel` and `options.validationAction` can both match while `options.validator` is stale or missing, so this operation can still advance the marker with the wrong validator. Please include the full `{ $jsonSchema: destValidator.jsonSchema }` payload in the add/update `ListCollectionsCommand` filter as well.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts` around lines 217 - 225, The postcheck filter for ensuring the validator was applied is missing the validator body; update the ListCollectionsCommand filter used in the postcheck array (the block referencing postcheck, ListCollectionsCommand, MongoFieldFilter and MongoAndExpr) to also assert the collection's options.validator contains the full JSON Schema by adding a MongoFieldFilter.eq for 'options.validator' (or 'options.validator.$jsonSchema' depending on stored structure) comparing to destValidator.jsonSchema so the postcheck verifies the actual { $jsonSchema: destValidator.jsonSchema } payload for collName.
🧹 Nitpick comments (1)
packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts (1)
428-1045: Please split this planner suite by feature area.

The new index-vocabulary, validator, and collection-lifecycle coverage pushes this file past 1,000 lines with several top-level `describe()` blocks. Breaking it into smaller files will keep failures much easier to navigate.

As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts` around lines 428 - 1045, The test file has grown too large—split the top-level describe blocks into separate test files: move the "M2 index vocabulary" suite into a new mongo-planner.index-vocabulary.test.ts, the "validator diffing" suite into mongo-planner.validator.test.ts, and the "collection lifecycle" suite into mongo-planner.collection-lifecycle.test.ts; for each new file copy the relevant describe(...) block and its tests, keep references to helper symbols (planner, planSuccess, makeContract, irWithCollection, MongoSchemaIndex, MongoSchemaCollection, MongoSchemaValidator, MongoSchemaCollectionOptions, CreateIndexCommand, CollModCommand, CreateCollectionCommand, DropCollectionCommand, ALL_CLASSES_POLICY) and the same imports at top of each file, remove the moved blocks from the original file, run tests and fix any missing exports/imports for shared helpers so the test runner still discovers and executes all suites.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts`:
- Around line 262-304: The postcheck arrays for planCreateCollection,
planDropCollection, and the collMod execution path are empty; add
ListCollectionsCommand-based postchecks that re-query the collection and assert
existence/non-existence and the concrete option fields changed (e.g., collation,
clusteredIndex, timeseries, changeStreamPreAndPostImages, validator ->
$jsonSchema, validationLevel, validationAction, capped/size/max) so the plan can
verify the effective server-side settings after execute; locate
planCreateCollection, planDropCollection and the collMod branch in
mongo-planner.ts and append appropriate postcheck entries using
ListCollectionsCommand + MongoFieldFilter.eq('name', collName) and checks that
compare the returned options to the values used in the CreateCollectionCommand /
collMod operations.
- Around line 251-258: The hasImmutableOptionChange function is key-order
sensitive because it compares origin and dest options (capped, timeseries,
collation, clusteredIndex) with deepEqual directly; update
hasImmutableOptionChange to canonicalize/sort object keys for each compared
option (same approach used for validators/index options) before calling
deepEqual so semantically identical objects with different key orders do not
trigger an immutable-change result—apply the canonicalization to
origin?.capped/dest?.capped, origin?.timeseries/dest?.timeseries,
origin?.collation/dest?.collation, and
origin?.clusteredIndex/dest?.clusteredIndex before comparison.
---
Outside diff comments:
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts`:
- Around line 35-50: The index verification is incomplete: update the runtime
pre/post index filters to be derived from the same option set used by
buildIndexLookupKey (the opts array: unique, sparse, ttl/expireAfterSeconds,
partialFilterExpression, wildcardProjection, collation, weights,
default_language, language_override) so that the matcher compares all those
properties (using canonicalize for object-valued options) instead of only
key/unique (and key._fts for text); locate buildIndexLookupKey and the index
comparison/matching code that currently only checks key/unique/text and change
it to construct filters from that same opts list and compare each option
consistently for equivalence.
---
Duplicate comments:
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts`:
- Around line 217-225: The postcheck filter for ensuring the validator was
applied is missing the validator body; update the ListCollectionsCommand filter
used in the postcheck array (the block referencing postcheck,
ListCollectionsCommand, MongoFieldFilter and MongoAndExpr) to also assert the
collection's options.validator contains the full JSON Schema by adding a
MongoFieldFilter.eq for 'options.validator' (or 'options.validator.$jsonSchema'
depending on stored structure) comparing to destValidator.jsonSchema so the
postcheck verifies the actual { $jsonSchema: destValidator.jsonSchema } payload
for collName.
---
Nitpick comments:
In `@packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts`:
- Around line 428-1045: The test file has grown too large—split the top-level
describe blocks into separate test files: move the "M2 index vocabulary" suite
into a new mongo-planner.index-vocabulary.test.ts, the "validator diffing" suite
into mongo-planner.validator.test.ts, and the "collection lifecycle" suite into
mongo-planner.collection-lifecycle.test.ts; for each new file copy the relevant
describe(...) block and its tests, keep references to helper symbols (planner,
planSuccess, makeContract, irWithCollection, MongoSchemaIndex,
MongoSchemaCollection, MongoSchemaValidator, MongoSchemaCollectionOptions,
CreateIndexCommand, CollModCommand, CreateCollectionCommand,
DropCollectionCommand, ALL_CLASSES_POLICY) and the same imports at top of each
file, remove the moved blocks from the original file, run tests and fix any
missing exports/imports for shared helpers so the test runner still discovers
and executes all suites.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: ec68cf28-945d-41fc-8d11-990c162f5add
⛔ Files ignored due to path filters (2)
- test/integration/test/mongo/fixtures/generated/contract.d.ts is excluded by !**/generated/**
- test/integration/test/mongo/fixtures/generated/contract.json is excluded by !**/generated/**
📒 Files selected for processing (6)
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts
- packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-planner.test.ts
- test/integration/test/mongo/migration-e2e.test.ts
- test/integration/test/mongo/orm.test.ts
✅ Files skipped from review due to trivial changes (1)
- test/integration/test/mongo/orm.test.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/2-mongo-family/3-tooling/mongo-schema-ir/src/index-equivalence.ts
- packages/2-mongo-family/3-tooling/mongo-schema-ir/test/schema-ir.test.ts
```ts
function hasImmutableOptionChange(
  origin: MongoSchemaCollectionOptions | undefined,
  dest: MongoSchemaCollectionOptions | undefined,
): string | undefined {
  if (!deepEqual(origin?.capped, dest?.capped)) return 'capped';
  if (!deepEqual(origin?.timeseries, dest?.timeseries)) return 'timeseries';
  if (!deepEqual(origin?.collation, dest?.collation)) return 'collation';
  if (!deepEqual(origin?.clusteredIndex, dest?.clusteredIndex)) return 'clusteredIndex';
```
Immutable option comparison is still key-order sensitive.
The planner already canonicalizes validators and index options for this reason, but immutable collection options are still compared with deepEqual(). A pure key reorder in capped, timeseries, collation, or clusteredIndex will now look like an immutable change and fail the plan even though the option is semantically identical.
♻️ Proposed fix

```diff
 function hasImmutableOptionChange(
   origin: MongoSchemaCollectionOptions | undefined,
   dest: MongoSchemaCollectionOptions | undefined,
 ): string | undefined {
-  if (!deepEqual(origin?.capped, dest?.capped)) return 'capped';
-  if (!deepEqual(origin?.timeseries, dest?.timeseries)) return 'timeseries';
-  if (!deepEqual(origin?.collation, dest?.collation)) return 'collation';
-  if (!deepEqual(origin?.clusteredIndex, dest?.clusteredIndex)) return 'clusteredIndex';
+  if (canonicalize(origin?.capped) !== canonicalize(dest?.capped)) return 'capped';
+  if (canonicalize(origin?.timeseries) !== canonicalize(dest?.timeseries)) return 'timeseries';
+  if (canonicalize(origin?.collation) !== canonicalize(dest?.collation)) return 'collation';
+  if (canonicalize(origin?.clusteredIndex) !== canonicalize(dest?.clusteredIndex))
+    return 'clusteredIndex';
   return undefined;
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
function hasImmutableOptionChange(
  origin: MongoSchemaCollectionOptions | undefined,
  dest: MongoSchemaCollectionOptions | undefined,
): string | undefined {
  if (canonicalize(origin?.capped) !== canonicalize(dest?.capped)) return 'capped';
  if (canonicalize(origin?.timeseries) !== canonicalize(dest?.timeseries)) return 'timeseries';
  if (canonicalize(origin?.collation) !== canonicalize(dest?.collation)) return 'collation';
  if (canonicalize(origin?.clusteredIndex) !== canonicalize(dest?.clusteredIndex))
    return 'clusteredIndex';
  return undefined;
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts` around
lines 251 - 258, The hasImmutableOptionChange function is key-order sensitive
because it compares origin and dest options (capped, timeseries, collation,
clusteredIndex) with deepEqual directly; update hasImmutableOptionChange to
canonicalize/sort object keys for each compared option (same approach used for
validators/index options) before calling deepEqual so semantically identical
objects with different key orders do not trigger an immutable-change
result—apply the canonicalization to origin?.capped/dest?.capped,
origin?.timeseries/dest?.timeseries, origin?.collation/dest?.collation, and
origin?.clusteredIndex/dest?.clusteredIndex before comparison.
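For intuition, the `canonicalize()` referenced here behaves like `JSON.stringify` with recursively sorted object keys. This standalone sketch (not the package's actual implementation) shows why that makes key order irrelevant:

```typescript
// Minimal key-order-independent serializer, assumed to approximate the
// planner's canonicalize(); undefined-valued keys are dropped, as JSON does.
function canonicalize(value: unknown): string {
  if (value === null || typeof value !== 'object') return JSON.stringify(value);
  if (Array.isArray(value)) return `[${value.map(canonicalize).join(',')}]`;
  const entries = Object.entries(value as Record<string, unknown>)
    .filter(([, v]) => v !== undefined)
    .sort(([a], [b]) => (a < b ? -1 : a > b ? 1 : 0))
    .map(([k, v]) => `${JSON.stringify(k)}:${canonicalize(v)}`);
  return `{${entries.join(',')}}`;
}

// Same collation options, different key order: the raw JSON strings differ,
// but the canonical forms match.
const a = { locale: 'en', strength: 2 };
const b = { strength: 2, locale: 'en' };
console.log(canonicalize(a) === canonicalize(b)); // true
console.log(JSON.stringify(a) === JSON.stringify(b)); // false
```

Whether the real `canonicalize()` also normalizes arrays or drops `undefined` values is an implementation detail of the package; the sketch only demonstrates the key-sorting idea behind the proposed fix.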
```ts
function planCreateCollection(
  collName: string,
  dest: MongoSchemaCollection,
): MongoMigrationPlanOperation {
  const opts = dest.options;
  const validator = dest.validator;
  return {
    id: `collection.${collName}.create`,
    label: `Create collection ${collName}`,
    operationClass: 'additive',
    precheck: [
      {
        description: `collection ${collName} does not exist`,
        source: new ListCollectionsCommand(),
        filter: MongoFieldFilter.eq('name', collName),
        expect: 'notExists',
      },
    ],
    execute: [
      {
        description: `create collection ${collName}`,
        command: new CreateCollectionCommand(collName, {
          capped: opts?.capped ? true : undefined,
          size: opts?.capped?.size,
          max: opts?.capped?.max,
          timeseries: opts?.timeseries,
          collation: opts?.collation,
          clusteredIndex: opts?.clusteredIndex
            ? {
                key: { _id: 1 } as Record<string, number>,
                unique: true as boolean,
                ...(opts.clusteredIndex.name != null ? { name: opts.clusteredIndex.name } : {}),
              }
            : undefined,
          validator: validator ? { $jsonSchema: validator.jsonSchema } : undefined,
          validationLevel: validator?.validationLevel,
          validationAction: validator?.validationAction,
          changeStreamPreAndPostImages: opts?.changeStreamPreAndPostImages,
        }),
      },
    ],
    postcheck: [],
  };
```
Collection-level ops need postchecks.
planCreateCollection(), planDropCollection(), and the mutable collMod path all return empty postchecks. That leaves the plan with no way to re-read create-only fields such as collation or clusteredIndex after execution, and it makes re-running an already-applied plan much harder to verify safely than the index path. Please add ListCollectionsCommand postchecks for existence/non-existence and for the specific option fields being changed.
Also applies to: 307-320, 323-345
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/3-mongo-target/2-mongo-adapter/src/core/mongo-planner.ts` around
lines 262 - 304, The postcheck arrays for planCreateCollection,
planDropCollection, and the collMod execution path are empty; add
ListCollectionsCommand-based postchecks that re-query the collection and assert
existence/non-existence and the concrete option fields changed (e.g., collation,
clusteredIndex, timeseries, changeStreamPreAndPostImages, validator ->
$jsonSchema, validationLevel, validationAction, capped/size/max) so the plan can
verify the effective server-side settings after execute; locate
planCreateCollection, planDropCollection and the collMod branch in
mongo-planner.ts and append appropriate postcheck entries using
ListCollectionsCommand + MongoFieldFilter.eq('name', collName) and checks that
compare the returned options to the values used in the CreateCollectionCommand /
collMod operations.
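To make the suggested shape concrete, here is a self-contained sketch of what such postchecks could look like. The types are stubs standing in for the adapter's real ListCollectionsCommand/MongoFieldFilter API, and every field name beyond the precheck shape quoted in the diff is an assumption:

```typescript
// Stub check shape mirroring the precheck entries in planCreateCollection
// (description/source/filter/expect); 'listCollections' stands in for
// `new ListCollectionsCommand()` and the filter object for MongoFieldFilter.eq.
type Check = {
  description: string;
  source: string;
  filter: { field: string; value: string };
  expect: 'exists' | 'notExists';
};

// Hypothetical postchecks for planCreateCollection: after execution the
// collection must exist, so re-running an already-applied plan stays verifiable.
function createCollectionPostchecks(collName: string): Check[] {
  return [
    {
      description: `collection ${collName} exists`,
      source: 'listCollections',
      filter: { field: 'name', value: collName },
      expect: 'exists',
    },
  ];
}

// A planDropCollection postcheck would mirror this with expect: 'notExists';
// option-level checks (collation, clusteredIndex, validator) would additionally
// compare the re-read collection options against the planned values.
const checks = createCollectionPostchecks('users');
console.log(checks[0].expect); // prints "exists"
```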
…nd PSL validators
- mongo-demo contract.json: migrate index from {fields,options} to
{keys:[{field,direction}],unique} format
- side-by-side mongo fixture: update expected contract.json to include
PSL-derived validators (now emitted by the PSL interpreter)
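The PSL-derived validators mentioned here come from deriveJsonSchema(), which maps model field types to BSON types. The following is a standalone sketch of that idea; the codec IDs, field shape, and signature are illustrative, not the interpreter's actual API:

```typescript
// Illustrative codec-ID → BSON type table; the contract's real codec IDs differ.
const CODEC_TO_BSON: Record<string, string> = {
  string: 'string',
  int: 'int',
  float: 'double',
  boolean: 'bool',
  date: 'date',
  objectId: 'objectId',
};

// Sketch of deriving a $jsonSchema validator from model fields. A nullable
// field gets the same bsonType as a required one here (no ['string', 'null']
// union), matching the limitation noted in this PR's follow-ups.
function deriveJsonSchema(
  fields: Record<string, { codec: string; required: boolean }>,
): Record<string, unknown> {
  const properties: Record<string, unknown> = {};
  const required: string[] = [];
  for (const [name, field] of Object.entries(fields)) {
    const bsonType = CODEC_TO_BSON[field.codec];
    if (!bsonType) continue; // skip unmapped codecs rather than guess
    properties[name] = { bsonType };
    if (field.required) required.push(name);
  }
  return {
    bsonType: 'object',
    ...(required.length > 0 ? { required } : {}),
    properties,
  };
}

const schema = deriveJsonSchema({
  email: { codec: 'string', required: true },
  age: { codec: 'int', required: false },
});
// schema → { bsonType: 'object', required: ['email'],
//            properties: { email: { bsonType: 'string' }, age: { bsonType: 'int' } } }
```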
🧹 Nitpick comments (1)

test/integration/test/authoring/side-by-side-contracts.test.ts (1)

Lines 221-256: Extract a single strip helper to avoid comparison drift. The stripping logic is duplicated in two local functions. A single helper keeps Mongo comparison rules consistent in one place when fields evolve.
♻️ Suggested refactor

```diff
+function stripMongoValidatorAndStorageHash(input: Record<string, unknown>): Record<string, unknown> {
+  const storage = input['storage'] as Record<string, unknown>;
+  const collections = storage['collections'] as Record<string, Record<string, unknown>>;
+  const strippedCollections: Record<string, unknown> = {};
+
+  for (const [name, coll] of Object.entries(collections)) {
+    const { validator: _validator, ...rest } = coll;
+    strippedCollections[name] = rest;
+  }
+
+  const { storageHash: _storageHash, ...restStorage } = storage;
+  return { ...input, storage: { ...restStorage, collections: strippedCollections } };
+}
 ...
-const stripValidatorFields = (contract: typeof normalizedTs) => {
-  ...
-};
-expect(stripValidatorFields(normalizedTs)).toEqual(stripValidatorFields(normalizedPsl));
+expect(
+  stripMongoValidatorAndStorageHash(normalizedTs as unknown as Record<string, unknown>),
+).toEqual(
+  stripMongoValidatorAndStorageHash(normalizedPsl as unknown as Record<string, unknown>),
+);
 ...
-const stripForComparison = (json: string) => {
-  ...
-};
-expect(stripForComparison(emittedTs.contractJson)).toEqual(stripForComparison(emittedPsl.contractJson));
+expect(stripMongoValidatorAndStorageHash(parseContractJson(emittedTs.contractJson))).toEqual(
+  stripMongoValidatorAndStorageHash(parseContractJson(emittedPsl.contractJson)),
+);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/test/authoring/side-by-side-contracts.test.ts` around lines 221 - 256, The duplicated stripping logic in stripValidatorFields and stripForComparison should be extracted into a single helper (e.g., stripStorageValidators) and used in both places so Mongo comparison rules stay consistent; update usages where you compare normalizedTs/normalizedPsl (currently using stripValidatorFields) and where you compare emitted JSONs (currently using stripForComparison on emittedTs.contractJson and emittedPsl.contractJson) to call the new helper, ensuring it accepts both parsed contract objects and parsed JSON results (or provide small adapter wrappers) and preserves existing behavior of removing collection.validator and storage.storageHash.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@test/integration/test/authoring/side-by-side-contracts.test.ts`:
- Around line 221-256: The duplicated stripping logic in stripValidatorFields
and stripForComparison should be extracted into a single helper (e.g.,
stripStorageValidators) and used in both places so Mongo comparison rules stay
consistent; update usages where you compare normalizedTs/normalizedPsl
(currently using stripValidatorFields) and where you compare emitted JSONs
(currently using stripForComparison on emittedTs.contractJson and
emittedPsl.contractJson) to call the new helper, ensuring it accepts both parsed
contract objects and parsed JSON results (or provide small adapter wrappers) and
preserves existing behavior of removing collection.validator and
storage.storageHash.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: 981e4e81-6f11-4582-abf0-805b07ddf162
📒 Files selected for processing (3)
- examples/mongo-demo/src/contract.json
- test/integration/test/authoring/side-by-side-contracts.test.ts
- test/integration/test/authoring/side-by-side/mongo/contract.json
✅ Files skipped from review due to trivial changes (1)
- examples/mongo-demo/src/contract.json
closes TML-2231
Walkthrough — MongoDB Schema Migrations (M1 + M2)
Intent
Build a complete MongoDB migration pipeline — from contract types through planning, serialization, execution, and CLI display — so that MongoDB server-side configuration (indexes, validators, collection options) is managed through the same graph-based migration model used for SQL DDL. M1 proved the architecture with a single ascending index; M2 fills in the full vocabulary.
Key snippet (new capability)
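The new index shape can be illustrated with a small sketch based on the fixture migration in this PR ({fields,options} → {keys:[{field,direction}],unique}); any field beyond keys/field/direction/unique is illustrative, not the contract's exact schema:

```typescript
// Sketch of contract index definitions in the new shape. The weights and
// default_language fields mirror options named in this PR, but their exact
// placement in the contract is an assumption.
const emailIndex = {
  name: 'email_1',
  keys: [{ field: 'email', direction: 1 as const }],
  unique: true,
};

const titleTextIndex = {
  name: 'title_text',
  keys: [{ field: 'title', direction: 'text' as const }],
  weights: { title: 10 },
  default_language: 'english',
};

console.log(emailIndex.keys.map((k) => `${k.field}:${k.direction}`).join(',')); // prints "email:1"
```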
The story

1. Define the data model. Extend `MongoStorageIndex` with all M2 options (wildcardProjection, collation, weights, text language settings). Add `MongoStorageValidator` and `MongoStorageCollectionOptions` to the contract. Back them with Arktype schemas for runtime validation.
2. Build the schema IR. Create `MongoSchemaValidator` and `MongoSchemaCollectionOptions` classes following the established frozen AST pattern. Add `canonicalize()` for key-order-independent structural serialization. Extend `indexesEquivalent()` to compare all M2 index options.
3. Extend the DDL command vocabulary. Add `CreateCollectionCommand`, `DropCollectionCommand`, and `CollModCommand` to the DDL command AST, each with visitor dispatch. Add new index options to `CreateIndexCommand`. The `MongoDdlCommandVisitor` interface forces every consumer (executor, formatter, serializer) to handle the new commands at compile time.
4. Teach the planner to diff everything. Extend `MongoMigrationPlanner` to diff validators (via `validatorsEqual` + `classifyValidatorUpdate` for widening/destructive classification + `collMod` operations), collection options (immutable option conflict detection + mutable `changeStreamPreAndPostImages` via `collMod`), and collection lifecycle (`createCollection` for new collections with options, `dropCollection` for removed collections). Replace `JSON.stringify` with `canonicalize()` in the index lookup key for key-order independence.
5. Extend the runner and executor. Add `createCollection`, `dropCollection`, and `collMod` handlers to `MongoCommandExecutor`. The runner's three-phase loop (prechecks → execute → postchecks) handles all new commands without structural changes. A CAS check on `updateMarker` prevents concurrent migration corruption.
6. Make it serializable. Extend `mongo-ops-serializer` with Arktype schemas and deserializer cases for all new DDL and inspection command kinds. Plans can be persisted to `ops.json` and reloaded.
7. Wire the CLI and target descriptor. Update the CLI operation dispatch to fall back to operation metadata display for non-SQL families. Update the Mongo target descriptor and control instance to expose the planner, runner, and `contractToSchema` as migration capabilities. Wire the Mongo driver's control exports.
8. Add PSL authoring. Extend the Mongo PSL interpreter to handle `@@index`, `@@unique`, and `@unique` attributes with Mongo-specific options (sparse, TTL, type, weights, default_language, language_override). Implement `deriveJsonSchema()` to auto-derive a `$jsonSchema` validator from model field types, mapping contract codec IDs to BSON types.
9. Prove it end-to-end. Integration tests against `mongodb-memory-server` verify: compound indexes, text indexes with weights and language_override, TTL, hashed, 2dsphere, partial, collation, wildcard, validators (add/remove), capped collections, drop collection, and multi-step lifecycle flows (create → modify → strip). A separate PSL authoring test proves the full PSL → contract → plan → apply → verify cycle.

Behavior changes & evidence
- Adds full index vocabulary to the migration pipeline: the contract, IR, planner, serializer, and executor now support all MongoDB index key types (`1`, `-1`, `text`, `2dsphere`, `2d`, `hashed`, `$**`) and all identity-significant options (`unique`, `sparse`, `expireAfterSeconds`, `partialFilterExpression`, `wildcardProjection`, `collation`, `weights`, `default_language`, `language_override`). Evidence: `buildIndexLookupKey`.
- Adds canonical serialization for structural index matching: `canonicalize()` produces key-order-independent strings for object-valued index options, and `buildIndexLookupKey` uses it for `partialFilterExpression`, `wildcardProjection`, `collation`, and `weights`. Previously `JSON.stringify` was used, which is key-order dependent: two indexes with the same `partialFilterExpression` but different key ordering would be incorrectly treated as different indexes.
- Adds `$jsonSchema` validators as a migration subject: the contract carries `MongoStorageValidator` (jsonSchema + validationLevel + validationAction). The planner diffs validators and emits `CollModCommand` operations with nuanced classification: removal and relaxation (error→warn, strict→moderate) are widening; tightening and schema body changes are destructive. `$jsonSchema` validators are a core server-side configuration mechanism; without migration support, validator changes require manual DDL. Evidence: `planValidatorDiff`, `classifyValidatorUpdate`.
- Adds collection lifecycle management: new collections with explicit options (capped, timeseries, clusteredIndex, collation) are created via `CreateCollectionCommand`. Removed collections produce `DropCollectionCommand`. Immutable option changes on existing collections are detected as `MigrationPlannerConflict`. Mutable options (`changeStreamPreAndPostImages`) use `CollModCommand` with directional classification (enabling = widening, disabling = destructive). Evidence: `CreateCollectionCommand`, `DropCollectionCommand`, `planCreateCollection`, `planDropCollection`, `hasImmutableOptionChange`, `planMutableOptionsDiff`.
- Adds PSL authoring for MongoDB indexes and validators: the PSL interpreter handles `@@index`, `@@unique`, and `@unique` attributes with Mongo-specific options. `deriveJsonSchema()` auto-derives a `$jsonSchema` validator from model field definitions. Evidence: `collectIndexes`.
- Removes the CLI Mongo DDL formatter: the incomplete `extract-mongo-statements.ts` (which only handled `createIndex`/`dropIndex`) has been deleted. The CLI falls back to operation metadata display for Mongo operations, with the statement extractor yielding `undefined` for non-SQL operations.
- No behavior change to existing code: the `query-ast` package export rename from `index.ts` to `execution.ts` is a structural refactor (adding a `/control` entrypoint). All existing consumers update their imports.

Compatibility / migration / risk
The export rename (`index.ts` → `execution.ts`) requires all consumers to update imports from `@prisma-next/mongo-query-ast` to `@prisma-next/mongo-query-ast/execution`. All internal consumers are updated on this branch; no external consumers exist.

Follow-ups / open questions
- `$jsonSchema` derivation: currently, a nullable value object field produces the same schema as a required one (no `null` union). Whether this matters depends on usage patterns.
- `collation` support: the interpreter doesn't parse per-index `collation` from PSL attributes. Documented as a known limitation in the contract-psl README.
- `contract-builder.ts` type safety: `toStorageIndex` and `toStorageCollectionOptions` use `as unknown as` double-casts; these should be replaced with typed construction.

Non-goals / intentionally out of scope
The `destructive` default is intentional.

Summary by CodeRabbit

New Features
- Index definitions now use `keys` arrays.

Tests