diff --git a/README.md b/README.md
index 4f00f89..1d4fcc4 100644
--- a/README.md
+++ b/README.md
@@ -8,7 +8,7 @@ Connector Installation
 Clone the connector from Github repository and refer this [link](./documentation/QUICKSTART.md) for quickstart.
 
-##Prerequisites
+## Prerequisites
 The following are required to run the ScyllaDB Sink Connector:
 * Kafka Broker: Confluent Platform 3.3.0 or above.
 * Connect: Confluent Platform 4.1.0 or above.
@@ -81,4 +81,4 @@ Reporting Kafka Metrics
 -----------------------
 Refer the following [confluent documentation](https://docs.confluent.io/current/kafka/metrics-reporter.html)
-to access kafka related metrics.
\ No newline at end of file
+to access kafka related metrics.
diff --git a/documentation/CONFIG.md b/documentation/CONFIG.md
index 17c9390..b5f5747 100644
--- a/documentation/CONFIG.md
+++ b/documentation/CONFIG.md
@@ -1,4 +1,4 @@
-#ScyllaDB Sink Connector
+# ScyllaDB Sink Connector
 
 Configuration Properties
 ------------------------
@@ -9,7 +9,7 @@ To use this connector, specify the name of the connector class in the ``connecto
 
 Connector-specific configuration properties are described below.
 
-###Connection
+### Connection
 
 ``scylladb.contact.points``
 
@@ -87,7 +87,7 @@ Connector-specific configuration properties are described below.
  * Default: false
  * Importance: high
 
-###SSL
+### SSL
 
 ``scylladb.ssl.truststore.path``
 
@@ -114,7 +114,7 @@ Connector-specific configuration properties are described below.
  * Valid Values: [JDK, OPENSSL, OPENSSL_REFCNT]
  * Importance: low
 
-###Keyspace
+### Keyspace
 
 **Note**: Both keyspace and table names consist of only alphanumeric characters,
 cannot be empty and are limited in size to 48 characters (that limit exists
@@ -149,7 +149,7 @@ can be forced by using double-quotes ("myTable" is different from mytable).
  * Valid Values: [1,...]
  * Importance: high
 
-###Table
+### Table
 
 ``scylladb.table.manage.enabled``
 
@@ -177,7 +177,7 @@ can be forced by using double-quotes ("myTable" is different from mytable).
 * Importance: Low
 * Default: kafka_connect_offsets
 
-###Topic to Table
+### Topic to Table
 
 These configurations can be specified for multiple Kafka topics from which records are being processed.
 Also, these topic level configurations will be override the behavior of Connector level configurations such as
@@ -210,7 +210,7 @@ Also, these topic level configurations will be override the behavior of Connecto
 should processed as delete request.
 
-###Write
+### Write
 
 ``scylladb.consistency.level``
 
@@ -278,28 +278,31 @@ Also, these topic level configurations will be override the behavior of Connecto
  * Valid Values: [0,...]
  * Default Value: 0
 
-###ScyllaDB
+### ScyllaDB
 
 ``behavior.on.error``
 
- Error handling behavior setting. Must be configured to one of the following:
+Error handling behavior setting. Must be configured to one of the following:
 
- ``fail`` - The Connector throws ConnectException and stops processing records when an error occurs while processing or inserting records into ScyllaDB.
+``fail``
 
- ``ignore`` - Continues to process next set of records when error occurs while processing or inserting records into ScyllaDB.
+The Connector throws ConnectException and stops processing records when an error occurs while processing or inserting records into ScyllaDB.
+
+``ignore``
+
+Continues to process the next set of records when an error occurs while processing or inserting records into ScyllaDB.
 
- ``log`` - Logs the error via connect-reporter when an error occurs while processing or inserting records into ScyllaDB and continues to process next set of records, available in the kafka topics.
+``log``
 
- * Type: string
- * Default: FAIL
- * Valid Values: [FAIL, LOG, IGNORE]
- * Importance: medium
+Logs the error via connect-reporter when an error occurs while processing or inserting records into ScyllaDB and continues to process the next set of records available in the Kafka topics.
+* Type: string
+* Default: FAIL
+* Valid Values: [FAIL, LOG, IGNORE]
+* Importance: medium
 
-###Confluent Platform Configurations.
+
+### Confluent Platform Configurations
 
 ``tasks.max``
 
@@ -323,7 +326,3 @@ The name of the topics to consume data from and write to ScyllaDB.
  * Type: list
  * Default: localhost:9092
  * Importance: high
-
-------------------------
-
-
diff --git a/documentation/QUICKSTART.md b/documentation/QUICKSTART.md
index e2724b2..5ff04d3 100644
--- a/documentation/QUICKSTART.md
+++ b/documentation/QUICKSTART.md
@@ -14,8 +14,8 @@ Command to start ScyllaDB docker container:
 $ docker run --name some-scylla --hostname some-scylla -d scylladb/scylla
 ```
 Running `docker ps` will show you the exposed ports, which should look something like the following:
-``` 
+```
 $ docker ps
 CONTAINER ID   IMAGE                     COMMAND                  CREATED       STATUS          PORTS                                              NAMES
 26cc6d47efe3   replace-with-image-name   "/docker-entrypoint.…"   4 hours ago   Up 23 seconds   0.0.0.0:32777->1883/tcp, 0.0.0.0:32776->9001/tcp   anonymous_my_1
@@ -26,13 +26,13 @@ CONTAINER ID   IMAGE                     COMMAND                  CREATED
 
 If you are new to Confluent then follow this [link](https://www.confluent.io/download) to download the Confluent Platform .
 
-1 - Click on DOWNLOAD FREE under Self managed software.
+1. Click on DOWNLOAD FREE under Self managed software.
 
-2 - Click on Zip archive then fill the Email address then Accept the T&C and lastly click on Download Version 5.X.X.
+2. Click on Zip archive, fill in the email address, accept the T&C, and click on Download Version 5.X.X.
 
-3 - Extract the downloaded file and paste it to the desired location.
+3. Extract the downloaded file to the desired location.
 
-4 - Now follow this [link](https://docs.confluent.io/current/quickstart/ce-quickstart.html#ce-quickstart) to complete the installation.
+4. Now follow this [link](https://docs.confluent.io/current/quickstart/ce-quickstart.html#ce-quickstart) to complete the installation.
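The configuration constraints documented above (the FAIL/LOG/IGNORE values for ``behavior.on.error``, the non-empty alphanumeric keyspace name of at most 48 characters, and a non-empty topic list) can be sanity-checked before a config is submitted. A minimal sketch under those documented rules; the `validate_config` helper is hypothetical and not part of the connector:

```python
# Hypothetical pre-flight validator mirroring the documented constraints.
# It is illustrative only: the connector itself performs its own validation.
VALID_ERROR_BEHAVIORS = {"FAIL", "LOG", "IGNORE"}
MAX_NAME_LENGTH = 48  # keyspace/table name limit noted in the Keyspace section

def validate_config(config):
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    behavior = str(config.get("behavior.on.error", "FAIL")).upper()
    if behavior not in VALID_ERROR_BEHAVIORS:
        problems.append("behavior.on.error must be one of FAIL, LOG, IGNORE")
    keyspace = config.get("scylladb.keyspace", "")
    if not keyspace or len(keyspace) > MAX_NAME_LENGTH or not keyspace.isalnum():
        problems.append("scylladb.keyspace must be non-empty, alphanumeric, and at most 48 characters")
    if not config.get("topics"):
        problems.append("topics must name at least one Kafka topic")
    return problems

# The quickstart's example configuration passes all checks:
config = {
    "connector.class": "io.connect.scylladb.ScyllaDbSinkConnector",
    "tasks.max": "1",
    "topics": "topic1,topic2,topic3",
    "scylladb.contact.points": "scylladb-hosts",
    "scylladb.keyspace": "test",
}
print(validate_config(config))  # prints []
```

Running such a check before POSTing the JSON to the Connect REST API surfaces misconfigurations earlier than a failed connector start.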
 ### Manual Installation Of The Connector
@@ -87,11 +87,11 @@ Your output should resemble:
 "io.connect.scylladb.ScyllaDbSinkConnector"
 ```
 
-#####Connector Configuration
+#### Connector Configuration
 
 Save these configs in a file *kafka-connect-scylladb.json* and run the following command:
 
-```
+```json
 {
   "name" : "scylladb-sink-connector",
   "config" : {
@@ -100,10 +100,11 @@ Save these configs in a file *kafka-connect-scylladb.json* and run the following
     "topics" : "topic1,topic2,topic3",
     "scylladb.contact.points" : "scylladb-hosts",
     "scylladb.keyspace" : "test"
+  }
 }
 ```
 
-Use this command to load the connector :
+Use this command to load the connector:
 
 ```
 curl -s -X POST -H 'Content-Type: application/json' --data @kafka-connect-scylladb.json http://localhost:8083/connectors
@@ -120,21 +121,21 @@ Once the Connector is up and running, use the command ``kafka-avro-console-produ
 Example:
 
 ```
-kafka-avro-console-producer
---broker-list localhost:9092
---topic topic1
---property parse.key=true
---property key.schema='{"type":"record",name":"key_schema","fields":[{"name":"id","type":"int"}]}'
---property "key.separator=$"
---property value.schema='{"type":"record","name":"value_schema","fields":[{"name":"id","type":"int"},
-{"name":"firstName","type":"string"},{"name":"lastName","type":"string"}]}'
+kafka-avro-console-producer \
+  --broker-list localhost:9092 \
+  --topic topic1 \
+  --property parse.key=true \
+  --property key.schema='{"type":"record","name":"key_schema","fields":[{"name":"id","type":"int"}]}' \
+  --property "key.separator=$" \
+  --property value.schema='{"type":"record","name":"value_schema","fields":[{"name":"id","type":"int"},{"name":"firstName","type":"string"},{"name":"lastName","type":"string"}]}'
 
 {"id":1}${"id":1,"firstName":"first","lastName":"last"}
 ```
 
 Output upon running the select query in ScyllaDB:
 
-select * from test.topic1;
 ```
+select * from test.topic1;
+
 id | firstname | lastname
 ----+-----------+----------
@@ -143,9 +144,9 @@ select *
 from test.topic1;
 ```
 
-##Modes in ScyllaDB
+## Modes in ScyllaDB
 
-###Standard
+### Standard
 
 Use this command to load the connector in :
 
@@ -164,7 +165,7 @@ example.
 
 **Distributed Mode JSON**
 
-```
+```json
 {
   "name" : "scylladb-sink-connector",
   "config" : {
@@ -174,7 +175,7 @@ example.
     "scylladb.contact.points" : "scylladb-hosts",
     "scylladb.keyspace" : "test",
     "key.converter" : "org.apache.kafka.connect.json.JsonConverter",
-    "value.converter" : "org.apache.kafka.connect.json.JsonConverter"
+    "value.converter" : "org.apache.kafka.connect.json.JsonConverter",
 
     "key.converter.schemas.enable" : "true",
     "value.converter.schemas.enable" : "true",
@@ -249,7 +250,7 @@ example.
 
 **Distributed Mode**
 
-```
+```json
 {
   "name" : "scylladbSinkConnector",
   "config" : {
@@ -280,7 +281,7 @@ scylladb.username=example
 scylladb.password=password
 ```
 
-###Logging
+### Logging
 
 To check logs for the Confluent Platform use:
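Alongside the logs, the connector's health can be checked through the standard Kafka Connect REST API's `/connectors/<name>/status` endpoint. A hedged sketch: the host, port, and connector name below are this guide's example values and may differ in your deployment.

```python
import json
from urllib.request import urlopen  # standard library only

# Example values from this guide; adjust for your Connect worker.
STATUS_URL = "http://localhost:8083/connectors/scylladb-sink-connector/status"

def fetch_status(url=STATUS_URL):
    """Fetch the status document from a running Connect worker."""
    with urlopen(url) as response:
        return json.load(response)

def failed_tasks(status):
    """Return the ids of tasks reported in the FAILED state."""
    return [task["id"] for task in status.get("tasks", [])
            if task.get("state") == "FAILED"]

# A status payload in the shape the REST API returns (sample data, not a
# live response):
example_status = {
    "name": "scylladb-sink-connector",
    "connector": {"state": "RUNNING", "worker_id": "127.0.0.1:8083"},
    "tasks": [{"id": 0, "state": "RUNNING", "worker_id": "127.0.0.1:8083"}],
}
print(failed_tasks(example_status))  # prints []
```

If any task ids come back, the per-task `trace` field in the same status document usually carries the stack trace that also appears in the Connect logs.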