diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/10-cloud-integration.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/09-cloud-integration.temp.md similarity index 98% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/10-cloud-integration.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/09-cloud-integration.temp.md index 2b4c9daa7..4eecf8fe3 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/10-cloud-integration.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/09-cloud-integration.temp.md @@ -6,7 +6,7 @@ metaDescription: 'Learn how to build a data connector in Typescript for Hasura D Going over the process of creating and deploying a project to Hasura DDN is beyond the scope of this course and we don't want to go too off-track, but is covered in the Hasura Docs which you can -[check out here](https://hasura.io/docs/3.0/local-dev/). +[check out here](https://hasura.io/docs/3.0/getting-started/overview). We have created and included a Hasura DDN metadata configuration in the [repo for this course](https://github.com/hasura/ndc-typescript-learn-course/blob/main/hasura/) which you can use to diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/01-fetch-aggregates.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/01-fetch-aggregates.md new file mode 100644 index 000000000..9ac9f6784 --- /dev/null +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/01-fetch-aggregates.md @@ -0,0 +1,160 @@ +--- +title: "Implementing Aggregates" +metaTitle: 'Aggregates | Hasura DDN Typescript Data Connector Tutorial' +metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' +--- + +Let's implement aggregates in our SQLite connector. + +Like we've done before, we won't implement aggregates in their full generality, and instead we're going to implement two +types of aggregates, called `star_count` and `column_count`. Other aggregates like `SUM` and `MAX` that you know from +Postgres will come under the umbrella of _custom aggregate functions_, and we'll cover those separately in another +tutorial. + +Let's start by adding the `aggregates` capability to our capabilities response: + +```typescript +return { + version: "0.1.2", + capabilities: { + query: { aggregates: {} }, + mutation: {}, + relationships: {} + } +} +``` +Aggregate queries are indicated by the presence of the `aggregates` field in the query request body. Just like the +`fields` property that we handled previously, each aggregate is named with a key, and has a `type`, in this case +`star_count`. So we're going to handle aggregates very similarly to fields, by building up a SQL target list from these +aggregates. + +The NDC spec says that each aggregate should act over the same set of rows that we consider when returning `rows`. That +is, we should apply any predicates, sorting, pagination, and so on, and then apply the aggregate functions over the +resulting set of rows. + +So assuming we have a function called `fetch_aggregates` which builds the SQL in this way, we can fill in the +`aggregates` in the response: + +```typescript +const aggregates = request.query.aggregates && await fetch_aggregates(state, request); +``` + +Now let's start to fill in a `fetch_aggregates` function. 
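+
+Before we write it, it's worth recalling the shape of request we're handling. An aggregate query request, such as the ones `ndc-test` sends, looks something like this:
+
+```json
+{
+  "collection": "albums",
+  "query": {
+    "aggregates": {
+      "count": {
+        "type": "star_count"
+      }
+    },
+    "limit": 10
+  },
+  "arguments": {},
+  "collection_relationships": {}
+}
+```
+
+And for reference, here is a sketch of how the `aggregates` value slots into the row set returned by our `query` function, alongside the `rows` we computed in earlier sections:
+
+```typescript
+async function query(configuration: Configuration, state: State, request: QueryRequest): Promise<QueryResponse> {
+    console.log(JSON.stringify(request, null, 2));
+
+    // Each of these is only computed if the request asks for it.
+    const rows = request.query.fields && await fetch_rows(state, request);
+    const aggregates = request.query.aggregates && await fetch_aggregates(state, request);
+
+    return [{ rows, aggregates }];
+}
+```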
+ +We'll actually copy/paste the `fetch_rows` function and create a new function for handling aggregates. It'd be possible to extract that commonality into a shared function, but arguably not worth it, since so much is already extracted out into small helper functions anyway. + +```typescript +async function fetch_aggregates(state: State, request: QueryRequest): Promise<{ + [k: string]: unknown +}> { + +} +``` + +The first difference is the return type. Instead of `RowFieldValue`, we're going to return a value directly from the database, so let's change that to `unknown`. + +Next, we want to generate the target list using the requested aggregates, so let's change that. + +```typescript +const target_list = []; + +for (const aggregateName in request.query.aggregates) { + if (Object.prototype.hasOwnProperty.call(request.query.aggregates, aggregateName)) { + const aggregate = request.query.aggregates[aggregateName]; + switch (aggregate.type) { + case 'star_count': + // TODO + case 'column_count': + // TODO + case 'single_column': + // TODO + } + } +} +``` + +For now, we'll handle the first two cases here, and save the last for when we talk about custom aggregates. + +In the first case, we want to generate a target list element which uses the `COUNT` SQL aggregate function. + +```typescript +case 'star_count': + target_list.push(`COUNT(1) AS ${aggregateName}`); + break; +``` + +In the second case, we'll also use the `COUNT` function, but this time, we're counting non-null values in a single column: + +```typescript +case 'column_count': + target_list.push(`COUNT(${aggregate.column}) AS ${aggregateName}`); + break; +``` + +We also need to interpret the `distinct` property of the aggregate object, and insert the `DISTINCT` keyword if needed: + +```typescript +case 'column_count': + target_list.push(`COUNT(${aggregate.distinct ? 'DISTINCT ' : ''}${aggregate.column}) AS ${aggregateName}`); + break; +``` + +Now let's update our generated SQL to use the generated target list: + +```typescript +const sql = `SELECT ${target_list.length ? target_list.join(", ") : "1 AS __empty"} FROM ( + ( + SELECT * FROM ${request.collection} ${where_clause} ${order_by_clause} ${limit_clause} ${offset_clause} + )`; +``` + +Note that we form the set of rows to be aggregated first, so that the limit and offset clauses are applied correctly. + +And instead of returning all rows, we're going to assume that we only get a single row back, so we can match on that and return the single row of aggregates: + +```typescript +const result = await state.db.get(sql, ...parameters); + +delete result.__empty; + +if (result === undefined) { + throw new InternalServerError("Unable to fetch aggregates"); +} + +return result; +``` + +That's it, so let's test our connector one more time, and hopefully see some passing tests this time. + +```sh +ndc-test test --endpoint http://localhost:8080 --snapshots-dir snapshots + +... +├ Query ... +│ ├ albums ... +│ │ ├ Simple queries ... +│ │ │ ├ Select top N ... OK +│ │ │ ├ Predicates ... OK +│ │ │ ├ Sorting ... OK +│ │ ├ Relationship queries ... +│ │ ├ Aggregate queries ... +│ │ │ ├ star_count ... OK +│ │ │ ├ column_count ... OK +│ │ │ ├ single_column ... OK +... +``` + +Note that `ndc-test` is now testing aggregates automatically, since we turned on the `aggregates` capability. + +And let's check that we're generating the right SQL. 
Picking a random example from the logs, we can see that we are indeed generating well-formed SQL: + +```sql +SELECT + COUNT(id) AS id_count, + COUNT(DISTINCT id) AS id_distinct_count, + COUNT(name) AS name_count, + COUNT(DISTINCT name) AS name_distinct_count +FROM ( + SELECT * FROM artists LIMIT 10 +) +``` \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/1-video.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/1-video.md deleted file mode 100644 index 0813a092e..000000000 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/1-video.md +++ /dev/null @@ -1,10 +0,0 @@ ---- -title: "Video Walkthrough" -metaTitle: 'Aggregates | Hasura DDN Data Connector Tutorial' -metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' ---- - -import YoutubeEmbed from "../../src/YoutubeEmbed.js"; - - - diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/2-fetch-aggregates.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/2-fetch-aggregates.md deleted file mode 100644 index fbbd52aa8..000000000 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/aggregates/2-fetch-aggregates.md +++ /dev/null @@ -1,219 +0,0 @@ ---- -title: "Implementing Aggregates" -metaTitle: 'Aggregates | Hasura DDN Typescript Data Connector Tutorial' -metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' ---- - -Let's implement aggregates in our SQLite connector. - -Like we've done before, we won't implement aggregates in their full generality, and instead we're going to implement two -types of aggregates, called `star_count` and `column_count`. Other aggregates like `SUM` and `MAX` that you know from -Postgres will come under the umbrella of _custom aggregate functions_. - -If we take a look at our failing tests, we see that aggregate queries are indicated by the presence of the `aggregates` -field in the query request body. Just like the `fields` property that we handled previously, each aggregate is named -with a key, and has a `type`, in this case `star_count`. So we're going to handle aggregates very similarly to fields, -by building up a SQL target list from these aggregates. - -```JSON -{ - "collection": "albums", - "query": { - "aggregates": { - "count": { - "type": "star_count" - } - }, - "limit": 10 - }, - "arguments": {}, - "collection_relationships": {} -} -``` - -[The NDC spec](https://hasura.github.io/ndc-spec/specification/queries/aggregates.html) says that each aggregate should -act over the same set of rows that we consider when returning `rows`. That is, we should apply any predicates, sorting, -pagination, and so on, and then apply the aggregate functions over the resulting set of rows. - -So assuming we have a function called `fetch_aggregates` which builds the SQL in this way, we can fill in the -`aggregates` in the response. 
- -If the `query` function, add this line and amend the return type to include aggregates: - -```typescript -const aggregates = request.query.aggregates && await fetch_aggregates(state, request); - -return [{ rows,aggregates }]; -``` - -So the final query function becomes: -```typescript -async function query(configuration: Configuration, state: State, request: QueryRequest): Promise { - console.log(JSON.stringify(request, null, 2)); - - const rows = request.query.fields && await fetch_rows(state, request); - const aggregates = request.query.aggregates && await fetch_aggregates(state, request); - - return [{ rows, aggregates }]; -} -``` - -Now let's start to fill in a `fetch_aggregates` helper function. - -We'll actually copy/paste the `fetch_rows` function and create a new function for handling aggregates. It would be -possible to extract that commonality into a shared function, but arguably not worth it, since so much is already -extracted out into small helper functions anyway. - -The first difference is the return type. Instead of `RowFieldValue`, we're going to return a value directly from the -database, so let's change that to `unknown`. - -Next, we want to generate the target list using the requested aggregates, so let's change that. - -```typescript -async function fetch_aggregates(state: State, request: QueryRequest): Promise<{ [k: string]: unknown }> { - const target_list = []; - - for (const aggregateName in request.query.aggregates) { - if (Object.prototype.hasOwnProperty.call(request.query.aggregates, aggregateName)) { - const aggregate = request.query.aggregates[aggregateName]; - switch(aggregate.type) { - case 'star_count': - - case 'column_count': - - case 'single_column': - } - } - } - -} -``` - -For now, we'll handle the first two cases here. - -In the first case, we want to generate a target list element which uses the `COUNT` SQL aggregate function. - -```typescript -// ... -case 'star_count': - target_list.push(`COUNT(1) AS ${aggregateName}`); - break; -// ... -``` - -In the second case, we'll also use the `COUNT` function, but this time, we're counting non-null values in a single column: - -```typescript -// ... -case 'column_count': - target_list.push(`COUNT(${aggregate.column}) AS ${aggregateName}`); - break; -// ... -``` - -We also need to interpret the `distinct` property of the aggregate object, and insert the `DISTINCT` keyword if needed: - -```typescript -// ... -case 'column_count': - target_list.push(`COUNT(${aggregate.distinct ? 'DISTINCT ' : ''}${aggregate.column}) AS ${aggregateName}`); - break; -// ... -``` - -We'll create a new generated SQL function within `fetch_aggregates()` to use the generated target list: - -```typescript -// ... -const sql = `SELECT ${target_list.join(", ")} FROM ( - ( - SELECT * FROM ${request.collection} ${where_clause} ${order_by_clause} ${limit_clause} ${offset_clause} - )`; -// ... -``` - -Note that we form the set of rows to be aggregated first, so that the limit and offset clauses are applied correctly. 
- -And instead of returning all rows, we're going to assume that we only get a single row back, so we can match on that and -return the single row of aggregates: - -```typescript -const result = await state.db.get(sql, ...parameters); - -if (result === undefined) { - throw new InternalServerError("Unable to fetch aggregates"); -} - -return result; -``` - -Here's the full function: - -```typescript -async function fetch_aggregates(state: State, request: QueryRequest): Promise<{ - [k: string]: unknown -}> { - const target_list = []; - - for (const aggregateName in request.query.aggregates) { - if (Object.prototype.hasOwnProperty.call(request.query.aggregates, aggregateName)) { - const aggregate = request.query.aggregates[aggregateName]; - switch(aggregate.type) { - case 'star_count': - target_list.push(`COUNT(1) AS ${aggregateName}`); - break; - case 'column_count': - target_list.push(`COUNT(${aggregate.distinct ? 'DISTINCT ' : ''}${aggregate.column}) AS ${aggregateName}`); - break; - case 'single_column': - throw new NotSupported("custom aggregates not yet supported"); - } - } - } - - const parameters: any[] = []; - - const limit_clause = request.query.limit == null ? "" : `LIMIT ${request.query.limit}`; - - const offset_clause = request.query.offset == null ? "" : `OFFSET ${request.query.offset}`; - - const where_clause = request.query.where == null ? "" : `WHERE ${visit_expression(parameters, request.query.where)}`; - - const order_by_clause = request.query.order_by == null ? "" : `ORDER BY ${visit_order_by_elements(request.query.order_by.elements)}`; - - const sql = `SELECT ${target_list.join(", ")} FROM ( - SELECT * FROM ${request.collection} ${where_clause} ${order_by_clause} ${limit_clause} ${offset_clause} - )`; - - console.log(JSON.stringify({ sql, parameters }, null, 2)); - - const result = state.db.get(sql, ...parameters); - - if (result === undefined) { - throw new InternalServerError("Unable to fetch aggregates"); - } - - return result; -} -``` - -That's it, so let's test our connector one more time, and hopefully see some passing tests this time. - -Remember to delete the snapshots first, so that we can generate new ones: - -```bash -rm -rf snapshots -``` - -And re-run the tests with the snapshots directory: - -```shell -ndc-test test --endpoint http://0.0.0.0:8100 --snapshots-dir snapshots -``` - -OR -```shell -cargo run --bin ndc-test -- test --endpoint http://localhost:8100 --snapshots-dir snapshots -``` - -Nice! We've now implemented the `star_count` and `column_count` aggregates, and we've seen how to generate SQL for them. \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started.md index c4483119a..36cc803ea 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started.md @@ -4,16 +4,16 @@ metaTitle: 'Get Started | Hasura DDN Data Connector Tutorial' metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' --- -This video series focuses on building a native data connector for Hasura in Typescript, which enables the integration of -various data sources into your Hasura Supergraph. +This tutorial focuses on building a native data connector for Hasura in Typescript, which thus enables the +integration of various data sources into your Hasura Supergraph. 
This initial section goes through the basic setup and scaffolding of a connector using the Hasura TypeScript connector SDK and a local SQLite database. -It covers the creation of types for configurations and state, and the implementation of essential functions -following the SDK guidelines. - -We also introduce a test suite and shows the integration of the connector with Hasura, -demonstrating the process through a practical example. +It covers: +- The creation of types for configurations and state +- The implementation of essential functions following the SDK guidelines. +- A test suite to ensure the connector's functionality. +- The integration of the connector with Hasura. Let's get started... \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/02-clone.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/01-clone.md similarity index 88% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/02-clone.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/01-clone.md index 8ba8dd9ee..18f148c97 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/02-clone.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/01-clone.md @@ -4,7 +4,7 @@ metaTitle: 'Clone the Repo | Hasura DDN Data Connector Tutorial' metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' --- -You can use this course by watching the videos and reading, but you can also +You can use this course by following this guide but you can also [clone the finished repo](https://github.com/hasura/ndc-typescript-learn-course) to see the finished result in action straight away. Or, to follow along starting from a skeleton project, clone the repo and checkout the `follow-along` branch: @@ -28,7 +28,7 @@ npm install You can build and run the connector, when you need to, with: ```shell -npm run build && node dist/index.js serve --configuration configuration.json +npm run build && node dist/index.js serve --configuration . ``` However, you can run nodemon to watch for changes and rebuild automatically: diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/01-video.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/01-video.md deleted file mode 100644 index 143a90645..000000000 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/01-video.md +++ /dev/null @@ -1,10 +0,0 @@ ---- -title: "Video Walkthrough" -metaTitle: 'Get Started | Hasura DDN Data Connector Tutorial' -metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' ---- - -import YoutubeEmbed from "../../src/YoutubeEmbed.js"; - - - diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/03-basics.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/02-basics.md similarity index 61% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/03-basics.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/02-basics.md index 9b829ce71..7203b28a3 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/03-basics.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/02-basics.md @@ -26,7 +26,7 @@ function which take a `connector` of type `Connector`. 
In your `src/index.ts` file, add the following: ```typescript -const connector: Connector = {}; +const connector: Connector = {}; start(connector); ``` @@ -34,12 +34,42 @@ start(connector); We will also need some imports over the course of the tutorial. Paste these at the top of your index.ts file: ```typescript -import sqlite3 from 'sqlite3'; -import { Database, open } from 'sqlite'; -import { BadRequest, CapabilitiesResponse, CollectionInfo, ComparisonValue, Connector, ExplainResponse, InternalServerError, MutationRequest, MutationResponse, NotSupported, ObjectField, ObjectType, OrderByElement, QueryRequest, QueryResponse, RowFieldValue, ScalarType, SchemaResponse, start } from "@hasura/ndc-sdk-typescript"; -import { JSONSchemaObject } from "@json-schema-tools/meta-schema"; -import { ComparisonTarget, Expression } from '@hasura/ndc-sdk-typescript/dist/generated/typescript/QueryRequest'; +import opentelemetry from "@opentelemetry/api"; +import sqlite3 from "sqlite3"; +import { readFile } from "fs/promises"; +import { resolve } from "path"; +import { Database, open } from "sqlite"; +import { + BadGateway, + BadRequest, + CapabilitiesResponse, + CollectionInfo, + ComparisonTarget, + ComparisonValue, + Connector, + ConnectorError, + ExplainResponse, + Expression, + ForeignKeyConstraint, + InternalServerError, + MutationRequest, + MutationResponse, + NotSupported, + ObjectField, + ObjectType, + OrderByElement, + Query, + QueryRequest, + QueryResponse, + Relationship, + RowFieldValue, + ScalarType, + SchemaResponse, + start, +} from "@hasura/ndc-sdk-typescript"; +import { withActiveSpan } from "@hasura/ndc-sdk-typescript/instrumentation"; +import { Counter, Registry } from "prom-client"; ``` You'll notice that your IDE will complain about the `connector` object not having the correct type, and -`RawConfiguration, Configuration, State` all being undefined. Let's fix that in the next section... \ No newline at end of file +`Configuration, State` all being undefined. Let's fix that in the next section... \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/04-configuration-and-state.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/03-configuration-and-state.md similarity index 55% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/04-configuration-and-state.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/03-configuration-and-state.md index 2e0298bec..d7a6e959b 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/04-configuration-and-state.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/03-configuration-and-state.md @@ -6,12 +6,12 @@ metaDescription: 'Learn how to build a data connector in Typescript for Hasura D We need to fill in implementations for each of the required functions, but we won't need all of these to work just yet. -First, you'll see that we define three types: `RawConfiguration`, `Configuration`, and `State`. +First, you'll see that we define two types: `Configuration`, and `State`. 
Let's define those now above the `connector` and `start` function: ```typescript -type RawConfiguration = { +type Configuration = { tables: TableConfiguration[]; }; @@ -22,30 +22,23 @@ type TableConfiguration = { type Column = {}; -type Configuration = RawConfiguration; - type State = { db: Database; }; ``` -`RawConfiguration` is the type of configuration that the user will see. By convention, this configuration should be -enough to reproducibly determine the connector's schema, so for our SQLite connector, we configure the connector -with an array of tables that we want to expose. Each of these table types in `TableConfiguration` is defined by its -name and a list of columns. - -`Column`s don't have any specific configuration yet, but we leave an empty object type here because we might want to -capture things like column types later on. - -The `Configuration` type is supposed to be a validated version of the raw configuration, but for our purposes, we'll -reuse the same type. +`Configuration` is the type of the connector's configuration, which will be read from a directory on disk. By +convention, this configuration should be enough to reproducibly determine the NDC schema, so for our SQLite connector, +we configure the connector with a list of tables that we want to expose. Each table is defined by its name and a list of +columns. Columns don't have any specific configuration yet, but we leave an empty object type here because we might want +to capture things like column types later on. [//]: # (TODO: What does it mean to validate the configuration? What does it mean to have a validated configuration?) ## State The `State` type is for things like connection pools, handles, or any non-serializable state that gets allocated on -startup, and which lives for the lifetime of the connector. For our connector, we need to keep a handle to our SQLite +startup, and which lives for the lifetime of the connector. For our connector, we need to keep a handle to our sqlite database. Cool, so now that we've got our types defined, we can fill in the function definitions which the connector requires diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/04-function-definitions.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/04-function-definitions.md new file mode 100644 index 000000000..e021e614d --- /dev/null +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/04-function-definitions.md @@ -0,0 +1,80 @@ +--- +title: "Function Definitions" +metaTitle: 'Function Definitions | Hasura DDN Data Connector Tutorial' +metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' +--- + +Now let's fill in some function definitions, these are the functions required to provide to the connector to satisfy +the Hasura connector specification, and we'll be implementing them as we go through the course. + +Copy and paste the following required functions into the `src/index.ts` file. Note that the amended `connector` is +also included at the bottom, overwriting the previous connector definition with this which takes these functions as +arguments. 
+ +```typescript +async function parseConfiguration(configurationDir: string): Promise { + throw new Error("Function not implemented."); +} + +async function tryInitState(configuration: Configuration, registry: Registry): Promise { + throw new Error("Function not implemented."); +} + +function getCapabilities(configuration: Configuration): CapabilitiesResponse { + throw new Error("Function not implemented."); +} + +async function getSchema(configuration: Configuration): Promise { + throw new Error("Function not implemented."); +} + +async function query(configuration: Configuration, state: State, request: QueryRequest): Promise { + throw new Error("Function not implemented."); +} + +async function fetchMetrics(configuration: Configuration, state: State): Promise { + throw new Error("Function not implemented."); +} + +async function healthCheck(configuration: Configuration, state: State): Promise { + throw new Error("Function not implemented."); +} + +async function queryExplain(configuration: Configuration, state: State, request: QueryRequest): Promise { + throw new Error("Function not implemented."); +} + +async function mutationExplain(configuration: Configuration, state: State, request: MutationRequest): Promise { + throw new Error("Function not implemented."); +} + +async function mutation(configuration: Configuration, state: State, request: MutationRequest): Promise { + throw new Error("Function not implemented."); +} +``` + +Now we need to update the `connector` definition to include these functions. + +```typescript +const connector: Connector = { + parseConfiguration, + tryInitState, + getCapabilities, + getSchema, + query, + fetchMetrics, + healthCheck, + queryExplain, + mutationExplain, + mutation +}; +``` + +Ok, moving on swiftly, for this course we will only need to implement the first five functions: +- `parseConfiguration`: which reads the configuration from files on disk. +- `tryInitState`: which initializes our database connection. +- `getCapabilities`: which returns the NDC capabilities of our connector. +- `getSchema`: which returns an NDC schema containing our tables and columns. +- `query`: which actually responds to query requests. + +Let's do that now. \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/05-function-definitions.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/05-function-definitions.md deleted file mode 100644 index 4ab53897b..000000000 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/05-function-definitions.md +++ /dev/null @@ -1,88 +0,0 @@ ---- -title: "Function Definitions" -metaTitle: 'Function Definitions | Hasura DDN Data Connector Tutorial' -metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' ---- - -Now let's fill in some function definitions, these are the functions required to provide to the connector to satisfy -the Hasura connector specification, and we'll be implementing them as we go through the course. - -Copy and paste the following required functions into the `src/index.ts` file. Note that the amended `connector` is -also included at the bottom, overwriting the previous connector definition with this which takes these functions as -arguments. 
- -```typescript -function get_raw_configuration_schema(): JSONSchemaObject { - throw new Error("Function not implemented."); -} - -function get_configuration_schema(): JSONSchemaObject { - throw new Error("Function not implemented."); -} - -function make_empty_configuration(): RawConfiguration { - throw new Error("Function not implemented."); -} - -async function update_configuration(configuration: RawConfiguration): Promise { - throw new Error("Function not implemented."); -} - -async function fetch_metrics(configuration: RawConfiguration, state: State): Promise { - throw new Error("Function not implemented."); -} - -async function health_check(configuration: RawConfiguration, state: State): Promise { - throw new Error("Function not implemented."); -} - -async function explain(configuration: RawConfiguration, state: State, request: QueryRequest): Promise { - throw new Error("Function not implemented."); -} - -async function mutation(configuration: RawConfiguration, state: State, request: MutationRequest): Promise { - throw new Error("Function not implemented."); -} - -// Implement these 5 functions below for this course - -async function validate_raw_configuration(configuration: RawConfiguration): Promise { - throw new Error("Function not implemented."); -} - -async function try_init_state(configuration: RawConfiguration, metrics: unknown): Promise { - throw new Error("Function not implemented."); -} - -function get_capabilities(configuration: RawConfiguration): CapabilitiesResponse { - throw new Error("Function not implemented."); -} - -async function get_schema(configuration: RawConfiguration): Promise { - throw new Error("Function not implemented."); -} - -async function query(configuration: RawConfiguration, state: State, request: QueryRequest): Promise { - throw new Error("Function not implemented."); -} - -const connector: Connector = { - get_raw_configuration_schema, - get_configuration_schema, - make_empty_configuration, - update_configuration, - validate_raw_configuration, - try_init_state, - fetch_metrics, - health_check, - get_capabilities, - get_schema, - explain, - mutation, - query -}; -``` - -Ok, moving on swiftly, for this course we will only need to implement the last 5 functions of -`validate_raw_configuration`, `try_init_state`, `get_capabilities`, `get_schema`, and `query`, in order to get a -basic working connector. Let's do that now. diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/05-implementation.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/05-implementation.md new file mode 100644 index 000000000..4c23e8e97 --- /dev/null +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/05-implementation.md @@ -0,0 +1,60 @@ +--- +title: "Implementation" +metaTitle: 'Implementation | Hasura DDN Data Connector Tutorial' +metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' +--- + +Right now, we only need to implement five required functions: +- `parseConfiguration`: which reads the configuration from files on disk. +- `tryInitState`: which initializes our database connection. +- `getCapabilities`: which returns the NDC capabilities of our connector. +- `getSchema`: which returns an NDC schema containing our tables and columns. +- `query`: which actually responds to query requests. 
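+
+As a point of reference, here is roughly what the `configuration.json` for the sample database used in this course (which exposes `artists` and `albums` tables) might contain. Treat it as an illustration only, since the course repo ships its own copy:
+
+```json
+{
+  "tables": [
+    { "tableName": "artists", "columns": { "id": {}, "name": {} } },
+    { "tableName": "albums", "columns": { "id": {}, "title": {}, "artist_id": {} } }
+  ]
+}
+```
+
+The path to the SQLite database file itself is not stored here; we derive it from the configuration directory in `parseConfiguration` below.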
+ +We'll skip configuration validation entirely for now, and just read the raw configuration from a `configuration.json` +file in the configuration directory: + +```typescript +async function parseConfiguration(configurationDir: string): Promise { + const configuration_file = resolve(configurationDir, 'configuration.json'); + const configuration_data = await readFile(configuration_file); + const configuration = JSON.parse(configuration_data.toString()); + return { + filename: resolve(configurationDir, 'database.db'), + ...configuration + }; +} +``` + +To initialize our state, which in our case contains a connection to the database, we'll use the `open` function to +open a connection to it, and store the resulting connection object in our state by returning it: + +```typescript +async function tryInitState( + configuration: Configuration, + registry: Registry +): Promise { + const db = await open({ + filename: configuration.filename, + driver: sqlite3.Database + }); + + return { db }; +} +``` + +[//]: # (TODO: Link to the relevant part of the spec) +Our capabilities response will be very simple, because we won't support many capabilities yet. We just return the +version range of the specification that we are compatible with, and the basic `query` and `mutation` capabilities. + +```typescript +function getCapabilities(configuration: Configuration): CapabilitiesResponse { + return { + version: "0.1.2", + capabilities: { + query: {}, + mutation: {} + } + } +} +``` \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/07-implement-get-schema-and-query.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/06-implement-get-schema-and-query.md similarity index 51% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/07-implement-get-schema-and-query.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/06-implement-get-schema-and-query.md index f165fe387..ea781a379 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/07-implement-get-schema-and-query.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/06-implement-get-schema-and-query.md @@ -4,72 +4,50 @@ metaTitle: 'The Get Schema Function | Hasura DDN Data Connector Tutorial' metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' --- -`get_schema` is the first interesting function. It needs to return a spec-compatible schema containing our tables and -columns. +`getSchema` is the first interesting function. We're going to define scalar types, and an object type and a collection +for each table in the configuration. Let's first define the scalar types. In fact, we're only going to define one, +called `any`: In it, we're going to define scalar types, and an object type and a collection for each table in the configuration. For this course, we're going to ignore the `functions` and `procedures` fields, but we'll cover those in a later courses. It takes the RawConfiguration as an argument, and returns an abject in the format of the `SchemaResponse` -shape eg: - -```typescript -export interface SchemaResponse { - /** - * Collections which are available for queries and/or mutations - */ - collections: CollectionInfo[]; - /** - * Functions (i.e. 
collections which return a single column and row) - */ - functions: FunctionInfo[]; - /** - * A list of object types which can be used as the types of arguments, or return types of procedures. Names should not overlap with scalar type names. - */ - object_types: { - [k: string]: ObjectType; - }; - /** - * Procedures which are available for execution as part of mutations - */ - procedures: ProcedureInfo[]; - /** - * A list of scalar types which will be used as the types of collection columns - */ - scalar_types: { - [k: string]: ScalarType; - }; -} -``` +shape. Let's first define the scalar types. In fact, we're only going to define one, called `any` as a string literal: ```typescript -async function get_schema(configuration: RawConfiguration): Promise { +async function getSchema(configuration: RawConfiguration): Promise { let scalar_types: { [k: string]: ScalarType } = { 'any': { aggregate_functions: {}, - comparison_operators: {}, - update_operators: {}, + comparison_operators: { + 'eq': { + type: 'equal' + } + }, } }; } ``` [//]: # (TODO: This is confusing name because of the any type in typescript) -`any` is a generic scalar type that we'll use as the type of all of our columns. It doesn't have any comparison -operators or aggregates defined. Later, when we talk about those features, we'll need to split this type up into several -different scalar types. +`any` is a generic scalar type that we'll use as the type of all of our columns. It doesn't have any aggregates defined, +and only a single equality comparison operator, `eq`. Later, when we talk about those features, we'll need to split this +type up into several different scalar types. Now let's define the object types. ```typescript {5} -async function get_schema(configuration: RawConfiguration): Promise { +async function getSchema(configuration: RawConfiguration): Promise { let scalar_types: { [k: string]: ScalarType } = { 'any': { aggregate_functions: {}, - comparison_operators: {}, - update_operators: {}, + comparison_operators: { + 'eq': { + type: 'equal' + } + }, } }; @@ -96,42 +74,33 @@ async function get_schema(configuration: RawConfiguration): Promise { - let scalar_types: { [k: string]: ScalarType } = { - 'any': { - aggregate_functions: {}, - comparison_operators: {}, - update_operators: {}, - } +let collections: CollectionInfo[] = configuration.tables.map((table) => { + return { + arguments: {}, + name: table.tableName, + deletable: false, + foreign_keys: {}, + uniqueness_constraints: {}, + type: table.tableName, }; +}); +``` - let object_types: { [k: string]: ObjectType } = {}; - - for (const table of configuration.tables) { - let fields: { [k: string]: ObjectField } = {}; - - for (const columnName in table.columns) { - fields[columnName] = { - type: { - type: 'named', - name: 'any' - } - }; - } +Again, we define one collection per table in the configuration, and we use the object type with the same name that we +just defined. 
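+
+For the sample database, the generated `ObjectType` and `CollectionInfo` entries for an `artists` table with `id` and `name` columns come out roughly like this (written out by hand here purely as an illustration):
+
+```typescript
+// What the loops above produce for one table: every column is typed with the `any` scalar.
+const artists_object_type = {
+    fields: {
+        id: { type: { type: 'named', name: 'any' } },
+        name: { type: { type: 'named', name: 'any' } },
+    },
+};
+
+// The matching collection entry reuses the table name as its object type name.
+const artists_collection = {
+    arguments: {},
+    name: 'artists',
+    deletable: false,
+    foreign_keys: {},
+    uniqueness_constraints: {},
+    type: 'artists',
+};
+```
+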
- object_types[table.tableName] = { - fields - }; - } +Now we can put the schema response together: +```typescript +async function getSchema(configuration: Configuration): Promise { let collections: CollectionInfo[] = configuration.tables.map((table) => { return { arguments: {}, @@ -142,21 +111,15 @@ async function get_schema(configuration: RawConfiguration): Promise { let scalar_types: { [k: string]: ScalarType } = { 'any': { aggregate_functions: {}, - comparison_operators: {}, - update_operators: {}, + comparison_operators: { + 'eq': { + type: 'equal' + } + }, } }; @@ -179,17 +142,6 @@ async function get_schema(configuration: RawConfiguration): Promise { - return { - arguments: {}, - name: table.tableName, - deletable: false, - foreign_keys: {}, - uniqueness_constraints: {}, - type: table.tableName, - }; - }); - return { functions: [], procedures: [], @@ -200,8 +152,7 @@ async function get_schema(configuration: RawConfiguration): Promise { - return configuration; -} -``` - -To initialize our state, which in our case contains a connection to the database, we'll use the `open` function to -open a connection to it, and store the resulting connection object in our state by returning it: - -```typescript -async function try_init_state(configuration: RawConfiguration, metrics: unknown): Promise { - const db = await open({ - filename: 'database.db', - driver: sqlite3.Database - }); - - return { db }; -} -``` - -[//]: # (TODO: Link to the relevant part of the spec) -Our capabilities response will be very simple, because we won't support many capabilities yet. We just return the -version range of the specification that we are compatible with, and the basic `query` capability. - -```typescript -function get_capabilities(configuration: RawConfiguration): CapabilitiesResponse { - return { - versions: "^0.1.0", - capabilities: { - query: {} - } - } -} -``` \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/08-testing.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/07-testing.md similarity index 68% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/08-testing.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/07-testing.md index c424da060..51e1adb5a 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/08-testing.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/07-testing.md @@ -4,14 +4,18 @@ metaTitle: 'Testing | Hasura DDN Data Connector Tutorial' metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' --- -So we have one more function to define, which is the query function, but before we do, let's talk about tests. The [NDC -specification repository](https://github.com/hasura/ndc-spec/) provides a +So we have one more function to define, which is the `query` function, but before we do, let's talk about tests. The +[NDC specification repository](https://github.com/hasura/ndc-spec/) provides a [test runner executable](https://github.com/hasura/ndc-spec/tree/main/ndc-test) called `ndc-test`, which can be used to implement a test suite for a connector. We can also use `ndc-test` to run some automatic tests and validate the work we've done so far. Let's compile and run our connector, and then use the test runner with the running connector. +```sh +npm run build && node dist/index.js serve --configuration . 
+``` + Back in your `ndc-typescript-learn-course` directory that we cloned during setup, you have a `configuration.json` file which you can use to run the connector against your sample database. @@ -28,27 +32,29 @@ Now, let's run the tests. (You will need to have the your machine.) ```shell -ndc-test test --endpoint http://localhost:8100 +ndc-test test --endpoint http://localhost:8080 ``` OR ```shell -cargo run --bin ndc-test -- test --endpoint http://localhost:8100 +cargo run --bin ndc-test -- test --endpoint http://localhost:8080 ```` Some tests fail, but we expected them to fail, but we can already see that our schema response is good. ```text -cargo run --bin ndc-test -- test --endpoint http://localhost:8100 - Finished dev [unoptimized + debuginfo] target(s) in 0.21s - Running `/Users/me/ndc-spec/target/debug/ndc-test test --endpoint 'http://localhost:8100'` +cargo run --bin ndc-test -- test --endpoint http://localhost:8080 + Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.23s + Running `/Users/sean/ProjectsHasura/ndc-spec/target/debug/ndc-test test --endpoint 'http://localhost:8080'` + ├ Capabilities ... │ ├ Fetching /capabilities ... OK │ ├ Validating capabilities ... OK ├ Schema ... │ ├ Fetching schema ... OK │ ├ Validating schema ... +│ │ ├ scalar_types ... OK │ │ ├ object_types ... OK │ │ ├ Collections ... │ │ │ ├ albums ... @@ -63,13 +69,24 @@ cargo run --bin ndc-test -- test --endpoint http://localhost:8100 │ ├ albums ... │ │ ├ Simple queries ... │ │ │ ├ Select top N ... FAIL -│ │ ├ Aggregate queries ... -│ │ │ ├ star_count ... FAIL │ ├ artists ... │ │ ├ Simple queries ... │ │ │ ├ Select top N ... FAIL -│ │ ├ Aggregate queries ... -│ │ │ ├ star_count ... FAIL +Failed with 2 test failures: + +[1] Select top N + in Query + in albums + in Simple queries + in Select top N +Details: error communicating with the connector: error in response: status code 500 Internal Server Error + +[2] Select top N + in Query + in artists + in Simple queries + in Select top N +Details: error communicating with the connector: error in response: status code 500 Internal Server Error ``` In the next section, we'll start to implement the query function, and see some of these tests pass. \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/09-query-function.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/08-query-function.md similarity index 78% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/09-query-function.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/08-query-function.md index 0f9ccd5b9..2edd4a0e7 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/09-query-function.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/get_started/08-query-function.md @@ -13,22 +13,28 @@ async function query(configuration: RawConfiguration, state: State, request: Que } ``` -Let's run the tests again. +Remember to rebuild and restart the connector: + +```shell +npm run build && node dist/index.js serve --configuration . +``` + +Let's run the tests again. 
In the `ndc-test` directory: ```shell rm -rf snapshots ``` ```shell -ndc-test test --endpoint http://0.0.0.0:8100 --snapshots-dir snapshots +ndc-test test --endpoint http://0.0.0.0:8080 --snapshots-dir snapshots ``` OR ```shell -cargo run --bin ndc-test -- test --endpoint http://localhost:8100 --snapshots-dir snapshots +cargo run --bin ndc-test -- test --endpoint http://localhost:8080 --snapshots-dir snapshots ``` -In the logs of the app, we can see the request +In the logs of the running app, we can see the request that was sent. It identifies the name of the collection, and a query object to run. The query has a list of fields to retrieve, and a limit of 10 rows. With this as a guide, we can start to implement our query function in the next section. @@ -36,20 +42,18 @@ section. ```text ... { - "collection": "albums", + "collection": "artists", "query": { "fields": { - "artist_id": { - "type": "column", - "column": "artist_id" - }, "id": { "type": "column", - "column": "id" + "column": "id", + "fields": null }, - "title": { + "name": { "type": "column", - "column": "title" + "column": "name", + "fields": null } }, "limit": 10 @@ -76,7 +80,9 @@ Later, we'll also implement aggregates here. Let's define the `fetch_rows` function the `query` function is delegating to: ```typescript -async function fetch_rows(state: State, request: QueryRequest): Promise<{ [k: string]: RowFieldValue }[]> { +async function fetch_rows(state: State, request: QueryRequest): Promise<{ + [k: string]: RowFieldValue +}[]> { const fields = []; for (const fieldName in request.query.fields) { @@ -89,7 +95,6 @@ async function fetch_rows(state: State, request: QueryRequest): Promise<{ [k: st case 'relationship': throw new Error("Relationships are not supported"); } - } } @@ -98,16 +103,18 @@ async function fetch_rows(state: State, request: QueryRequest): Promise<{ [k: st } const limit_clause = request.query.limit == null ? "" : `LIMIT ${request.query.limit}`; - const offset_clause = request.query.offset == null ? "" : `OFFSET ${request.query.offset}`; - const sql = `SELECT ${fields.join(", ")} FROM ${request.collection} ${limit_clause} ${offset_clause}`; + const sql = `SELECT ${fields.length ? fields.join(", ") : '1 AS __empty'} FROM ${request.collection} ${limit_clause} ${offset_clause}`; console.log(JSON.stringify({ sql }, null, 2)); + + const rows = await state.db.all(sql, {}); - return state.db.all(sql); + return rows.map((row) => { delete row.__empty; return row; }); } ``` + This function breaks down the request that we saw earlier and produces SQL with a basic shape. Here is what `fetch_rows` does: @@ -142,40 +149,19 @@ connectors - we get to push down the query execution to the data sources themsel Now let's see it work in the test runner. We'll rebuild and restart the connector, and run the tests again. ```text -cargo run --bin ndc-test -- test --endpoint http://localhost:8100 - Finished dev [unoptimized + debuginfo] target(s) in 0.29s - Running `/Users/me/ndc-spec/target/debug/ndc-test test --endpoint 'http://localhost:8100'` -├ Capabilities ... -│ ├ Fetching /capabilities ... OK -│ ├ Validating capabilities ... OK -├ Schema ... -│ ├ Fetching schema ... OK -│ ├ Validating schema ... -│ │ ├ object_types ... OK -│ │ ├ Collections ... -│ │ │ ├ albums ... -│ │ │ │ ├ Arguments ... OK -│ │ │ │ ├ Collection type ... OK -│ │ │ ├ artists ... -│ │ │ │ ├ Arguments ... OK -│ │ │ │ ├ Collection type ... OK -│ │ ├ Functions ... -│ │ │ ├ Procedures ... +... ├ Query ... │ ├ albums ... │ │ ├ Simple queries ... 
│ │ │ ├ Select top N ... OK │ │ │ ├ Predicates ... OK │ │ │ ├ Sorting ... FAIL -│ │ ├ Aggregate queries ... -│ │ │ ├ star_count ... FAIL │ ├ artists ... │ │ ├ Simple queries ... │ │ │ ├ Select top N ... OK │ │ │ ├ Predicates ... OK │ │ │ ├ Sorting ... FAIL -│ │ ├ Aggregate queries ... -│ │ │ ├ star_count ... FAIL +... ``` [//]: # (TODO - why are predicates passing here? They have not been implemented) diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/introduction.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/introduction.md index f805cef11..09d62a5b1 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/introduction.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/introduction.md @@ -6,7 +6,7 @@ metaDescription: 'Learn how to build a data connector for Hasura DDN' In this course we will go through the process of creating a Hasura DDN data connector in TypeScript step-by-step. -A data connector in Hasura DDN is an agent which allows you to connect Hasura to any arbitrary data source. +Data connectors allow us to target arbitrary data sources and bring their data into our Hasura graph. ## What will we be building? {#what-will-we-be-building} @@ -14,15 +14,11 @@ We will build a basic connector to an [SQLite](https://www.sqlite.org/index.html run locally on your machine. This will familiarize you with the process of creating a connector to the Hasura DDN specification. You can then -take the concepts you've learned and apply it to any data source you'd like to integrate with Hasura DDN. +take the concepts you've learned and apply them to any data source you'd like to integrate with Hasura DDN. ## How do I follow along? {#how-do-I-follow-along} -You can watch the walkthrough videos in each section and also follow along with the code in this tutorial by first -cloning the [repo](/get-started/2-clone/). - -The YouTube playlist for all the videos is -[here](https://www.youtube.com/playlist?list=PLTRTpHrUcSB_WmbGviXZUx0z-jVZXm4Yc). +You can follow along with the code in this tutorial by first cloning the [repo](/get-started/01-clone/). ## What will I learn? {#what-will-i-learn} @@ -49,9 +45,8 @@ About an hour. ## Additional Resources {#additional-resources} - The [NDC Specification](https://hasura.github.io/ndc-spec/specification/) details the specification for data connectors which work with Hasura DDN. -- [Reference Implementation with Tutorial](https://github.com/hasura/ndc-spec/blob/main/ndc-reference/README.md) - SDKs for data connectors built with Rust, and the one which we will be using in this course, TypeScript. - - [NDC Rust SDK](https://github.com/hasura/ndc-hub) + - [NDC Rust SDK](https://github.com/hasura/ndc-sdk-rs) - [NDC Typescript SDK](https://github.com/hasura/ndc-sdk-typescript) - Examples of existing native data connectors built by Hasura. 
- [Clickhouse](https://github.com/hasura/ndc-clickhouse) (Rust) diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/2-order_by.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/01-order_by.md similarity index 95% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/2-order_by.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/01-order_by.md index e3c2200a4..c2f40bee2 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/2-order_by.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/01-order_by.md @@ -11,8 +11,6 @@ sorting, and see more of our tests turn green. Implementing sorting is much simpler than implementing predicates, because there is no recursive structure to process. Instead, we have a simple list of orderings that we will turn into a SQL `ORDER BY` clause. -Let's get started. - ## Order By Just like with the `WHERE` clause last time, we will modify our SQL template to add a new `ORDER BY` clause, and @@ -21,7 +19,7 @@ delegate to a new function to generate the SQL for that new clause. ```typescript const order_by_clause = request.query.order_by == null ? "" : `ORDER BY ${visit_order_by_elements(request.query.order_by.elements)}`; -const sql = `SELECT ${fields.join(", ")} FROM ${request.collection} ${where_clause} ${order_by_clause} ${limit_clause} ${offset_clause}`; +const sql = `SELECT ${fields.length ? fields.join(", ") : '1 AS __empty'} FROM ${request.collection} ${where_clause} ${order_by_clause} ${limit_clause} ${offset_clause}`; ``` In this case, our new helper function is called `visit_order_by_elements`, and it breaks down the `order_by` property diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/1-video.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/1-video.md deleted file mode 100644 index 6c0eaf782..000000000 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/ordering/1-video.md +++ /dev/null @@ -1,9 +0,0 @@ ---- -title: "Video Walkthrough" -metaTitle: 'Ordering | Hasura DDN Data Connector Tutorial' -metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' ---- - -import YoutubeEmbed from "../../src/YoutubeEmbed.js"; - - \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/2-where-clause.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/01-where-clause.md similarity index 56% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/2-where-clause.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/01-where-clause.md index fef4bc627..57e481494 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/2-where-clause.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/01-where-clause.md @@ -13,33 +13,48 @@ For example: a `where` clause. ## Where Clause Let's pick up from where we left off. We can modify our SQL template in our `fetch_rows` function to now include a -`WHERE clause`: +`WHERE` clause: ```typescript -const sql = `SELECT ${fields.join(", ")} FROM ${request.collection} ${where_clause} ${limit_clause} ${offset_clause}`; +const sql = `SELECT ${fields.length ? 
fields.join(", ") : '1 AS __empty'} FROM ${request.collection} ${where_clause} ${limit_clause} ${offset_clause}`; ``` To generate our `WHERE` clause, we will need to interpret the contents of the `where` property of the query request. To -see what this will look like, we can find some examples in the snapshots we generated last time: +see what this will look like, we can find some examples in the query snapshots we generated last time: ```JSON { - "Limit": 10, - "where": { - "type": "binary_comparison_operator", - "column": { - "type": "column", - "name": "artist_id", - "path": [] - }, - "operator": { - "type": "equal" - }, - "value": { - "type": "scalar", - "value": 5 - } + "collection": "albums", + "query": { + "fields": { + "id": { + "type": "column", + "column": "id", + "fields": null + }, + "title": { + "type": "column", + "column": "title", + "fields": null + } + }, + "limit": 10, + "predicate": { + "type": "binary_comparison_operator", + "column": { + "type": "column", + "name": "artist_id", + "path": [] + }, + "operator": "eq", + "value": { + "type": "scalar", + "value": 5 + } } + }, + "arguments": {}, + "collection_relationships": {} } ``` diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/02-expression-types.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/02-expression-types.md new file mode 100644 index 000000000..2f80e1e40 --- /dev/null +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/02-expression-types.md @@ -0,0 +1,22 @@ +--- +title: "Expression Types" +metaTitle: 'Expression Types | Hasura DDN Data Connector Tutorial' +metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN' +--- + +In the SDK, these predicate expressions are given the TypeScript type `Expression`, and we can see that there are +several different types of expression. + +These are all expression types which can be used in the `where` clause of a query. Our `query` function will need to +handle them via the `fetch_rows` function. + +There are logical expressions like `and`, `or`, and `not`, which serve to combine other simpler expressions. + +There are unary (eg: `NULL`, `IS NOT NULL`, etc...) and binary (eg: `=` (equal), `!=` (not-equal), `>` (greater-than), +`<` (less-than), `>=` (greater-or-equal)) comparison operator expressions. + +And there are `exists` expressions, which are expressed using a sub-query against another collection. + +For now, we'll concentrate on logical expressions and comparison operator expressions. + +Let's begin to construct the where clause in the next section. \ No newline at end of file diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/4-building-the-where-clause.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/03-building-the-where-clause.md similarity index 93% rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/4-building-the-where-clause.md rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/03-building-the-where-clause.md index 5591c1943..e70b204ea 100644 --- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/4-building-the-where-clause.md +++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/03-building-the-where-clause.md @@ -118,10 +118,19 @@ case "unary_comparison_operator": // ... 
```

-For `binary_comparison_operator` expressions, we can switch on `expr.operator.type`. We will only implement the `equal`
-operator, because our schema doesn't advertise any custom binary operators. If we wanted to add another operator, like a
+For `binary_comparison_operator` expressions, we can switch on `expr.operator`. We only need to implement the `eq`
+operator, because our schema doesn't advertise any other binary operators. If we wanted to add another operator, like a
"greater than" operator for numbers, we would do that here, and also advertise that operator in the NDC schema response.
+```typescript
+switch (expr.operator) {
+  case 'eq':
+    return `${visit_comparison_target(expr.column)} = ${visit_comparison_value(parameters, expr.value)}`;
+  default:
+    throw new BadRequest("Unknown comparison operator");
+}
+```
+
Also, one new helper function `visit_comparison_value` is needed here, defined later, and we'll call it as per below:

```typescript
@@ -178,8 +187,7 @@ will be added later when we support the `relationships` capability.
In the `visit_comparison_value` function, we only handle the `scalar` case, in which we push the value onto our
parameter list. Again, the other cases correspond to capabilities we haven't implemented yet.

-The other two expression types are unsupported for now, so we'll throw an error here. We can also come back to these
-later.
+`exists` expressions are unsupported for now, so we'll throw an error here. We can come back to these later.

```typescript
// ...
diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/5-testing.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/04-testing.md
similarity index 93%
rename from tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/5-testing.md
rename to tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/04-testing.md
index 9ad615a8f..3fd0b0128 100644
--- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/5-testing.md
+++ b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/04-testing.md
@@ -17,7 +17,7 @@ rm -rf snapshots
```

```shell
-ndc-test test --endpoint http://0.0.0.0:8100 --snapshots-dir snapshots
+ndc-test test --endpoint http://localhost:8080/ --snapshots-dir snapshots
```

OR

@@ -25,8 +25,8 @@ OR
```
cargo run --bin ndc-test -- test --endpoint http://localhost:8100 --snapshots-dir snapshots
```

-We can see that predicate tests are passing, but some other test cases are not. That's okay - we'll keep iterating over
-the next few videos until we have all green tests here.
+We can see that predicate tests are passing, but some other test cases are not. That's okay - we'll keep iterating
+until we have all green tests here.

[//]: # (TODO predicate tests were passing before)

@@ -134,7 +134,7 @@ Response:
]
```

-Now let's deploy to Hasura and see how the GraphQL schema looks. Check out the video to follow along here.
+Now let's deploy to Hasura and see how the GraphQL schema looks.
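
As an aside (not part of the patch above), the predicate handling can also be sanity-checked by hand before deploying,
by sending the snapshot request from earlier in this section straight to the connector. A sketch, assuming the
connector is running locally on port 8080, that it exposes the standard NDC `/query` endpoint, and that Node 18+
provides a global `fetch`:

```typescript
// Standalone sketch: POST the predicate query from the snapshot to the running connector.
const queryRequest = {
  collection: "albums",
  query: {
    fields: {
      id: { type: "column", column: "id", fields: null },
      title: { type: "column", column: "title", fields: null },
    },
    limit: 10,
    predicate: {
      type: "binary_comparison_operator",
      column: { type: "column", name: "artist_id", path: [] },
      operator: "eq",
      value: { type: "scalar", value: 5 },
    },
  },
  arguments: {},
  collection_relationships: {},
};

async function main(): Promise<void> {
  const response = await fetch("http://localhost:8080/query", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(queryRequest),
  });
  // Expect rows for albums with artist_id = 5.
  console.log(JSON.stringify(await response.json(), null, 2));
}

main().catch(console.error);
```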

[//]: # (TODO Need to have the deploy section done)
diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/1-video.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/1-video.md
deleted file mode 100644
index aaa9c57be..000000000
--- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/1-video.md
+++ /dev/null
@@ -1,9 +0,0 @@
----
-title: "Video Walkthrough"
-metaTitle: 'Predicates | Hasura DDN Data Connector Tutorial'
-metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN'
----
-
-import YoutubeEmbed from "../../src/YoutubeEmbed.js";
-
-
\ No newline at end of file
diff --git a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/3-expression-types.md b/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/3-expression-types.md
deleted file mode 100644
index 7619449bb..000000000
--- a/tutorials/backend/hasura-v3-ts-connector/tutorial-site/content/predicates/3-expression-types.md
+++ /dev/null
@@ -1,51 +0,0 @@
----
-title: "Expression Types"
-metaTitle: 'Expression Types | Hasura DDN Data Connector Tutorial'
-metaDescription: 'Learn how to build a data connector in Typescript for Hasura DDN'
----
-
-In the SDK, these predicate expressions are given the TypeScript type `Expression`, and we can see that there are
-several different types of expression. These are all expression types which can be used in the `where` clause of a
-query our `query` function will need to handle them via the `fetch_rows` function.
-
-```typescript
-export type Expression = {
-  expressions: Expression[];
-  type: "and";
-} | {
-  expressions: Expression[];
-  type: "or";
-} | {
-  expression: Expression;
-  type: "not";
-} | {
-  column: ComparisonTarget;
-  operator: UnaryComparisonOperator;
-  type: "unary_comparison_operator";
-} | {
-  column: ComparisonTarget;
-  operator: BinaryComparisonOperator;
-  type: "binary_comparison_operator";
-  value: ComparisonValue;
-} | {
-  column: ComparisonTarget;
-  operator: BinaryArrayComparisonOperator;
-  type: "binary_array_comparison_operator";
-  values: ComparisonValue[];
-} | {
-  in_collection: ExistsInCollection;
-  type: "exists";
-  where: Expression;
-};
-```
-
-There are logical expressions like `and`, `or`, and `not`, which serve to combine other simpler expressions.
-
-There are unary (eg: `NULL`, `IS NOT NULL`, etc...) and binary (eg: `=` (equal), `!=` (not-equal), `>` (greater-than),
-`<` (less-than), `>=` (greater-or-equal)) comparison operator expressions.
-
-And there are `exists` expressions, which are expressed using a sub-query against another collection.
-
-For now, we'll concentrate on logical expressions and comparison operator expressions.
-
-Let's begin to construct the where clause in the next section.
\ No newline at end of file
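
For reference (not part of the patch above), here is a rough sketch of how a where-clause visitor might dispatch on
the expression types described earlier in this patch and shown in the removed file above. The `Expression` type is
simplified here (unary and `exists` variants omitted), the helper name `visit_expression` is an assumption rather than
the tutorial's actual function name, and only the `eq` operator is rendered, mirroring what the schema advertises:

```typescript
// Simplified, self-contained sketch of dispatching on predicate expression types.
type Expression =
  | { type: "and"; expressions: Expression[] }
  | { type: "or"; expressions: Expression[] }
  | { type: "not"; expression: Expression }
  | { type: "binary_comparison_operator"; column: string; operator: string; value: unknown };

function visit_expression(parameters: unknown[], expr: Expression): string {
  switch (expr.type) {
    case "and":
      // An empty AND is vacuously true.
      return expr.expressions.length
        ? `(${expr.expressions.map(e => visit_expression(parameters, e)).join(" AND ")})`
        : "TRUE";
    case "or":
      // An empty OR is vacuously false.
      return expr.expressions.length
        ? `(${expr.expressions.map(e => visit_expression(parameters, e)).join(" OR ")})`
        : "FALSE";
    case "not":
      return `NOT (${visit_expression(parameters, expr.expression)})`;
    case "binary_comparison_operator":
      // Only equality is rendered; the value becomes a bound SQL parameter.
      if (expr.operator !== "eq") {
        throw new Error("Unsupported comparison operator");
      }
      parameters.push(expr.value);
      return `${expr.column} = ?`;
    default:
      throw new Error("Unsupported expression type");
  }
}

// Example: (artist_id = 5) AND NOT (title = 'Test')
const whereParameters: unknown[] = [];
const whereClause = visit_expression(whereParameters, {
  type: "and",
  expressions: [
    { type: "binary_comparison_operator", column: "artist_id", operator: "eq", value: 5 },
    { type: "not", expression: { type: "binary_comparison_operator", column: "title", operator: "eq", value: "Test" } },
  ],
});
console.log(whereClause, whereParameters); // "(artist_id = ? AND NOT (title = ?))" [ 5, 'Test' ]
```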