Fix some typos in JSON files and markdown formatting issues #15

Open · wants to merge 1 commit into base: master
4 changes: 2 additions & 2 deletions README.md
@@ -8,7 +8,7 @@ Connector Installation

Clone the connector from the GitHub repository and refer to this [link](./documentation/QUICKSTART.md) for a quickstart.

##Prerequisites
## Prerequisites
The following are required to run the ScyllaDB Sink Connector:
* Kafka Broker: Confluent Platform 3.3.0 or above.
* Connect: Confluent Platform 4.1.0 or above.
@@ -81,4 +81,4 @@ Reporting Kafka Metrics
-----------------------

Refer to the following [confluent documentation](https://docs.confluent.io/current/kafka/metrics-reporter.html)
to access kafka related metrics.
to access kafka related metrics.
47 changes: 23 additions & 24 deletions documentation/CONFIG.md
@@ -1,4 +1,4 @@
#ScyllaDB Sink Connector
# ScyllaDB Sink Connector

Configuration Properties
------------------------
@@ -9,7 +9,7 @@ To use this connector, specify the name of the connector class in the ``connector.class`` configuration property.

Connector-specific configuration properties are described below.

###Connection
### Connection

``scylladb.contact.points``
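For orientation, a minimal sketch of how these connection settings appear in a worker properties file; the host names are placeholders, and the keyspace value is taken from the examples later in this document:

```
# Comma-separated ScyllaDB hosts the connector should contact (placeholders).
scylladb.contact.points=scylla-node1,scylla-node2
# Keyspace to write to, matching the quickstart examples below.
scylladb.keyspace=test
```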

@@ -87,7 +87,7 @@ Connector-specific configuration properties are described below.
* Default: false
* Importance: high

###SSL
### SSL

``scylladb.ssl.truststore.path``

@@ -114,7 +114,7 @@ Connector-specific configuration properties are described below.
* Valid Values: [JDK, OPENSSL, OPENSSL_REFCNT]
* Importance: low

###Keyspace
### Keyspace

**Note**: Both keyspace and table names consist of only alphanumeric characters,
cannot be empty and are limited in size to 48 characters (that limit exists
@@ -149,7 +149,7 @@ can be forced by using double-quotes ("myTable" is different from mytable).
* Valid Values: [1,...]
* Importance: high

###Table
### Table

``scylladb.table.manage.enabled``

@@ -177,7 +177,7 @@ can be forced by using double-quotes ("myTable" is different from mytable).
* Importance: Low
* Default: kafka_connect_offsets
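As a sketch of the table-management toggle named above (the boolean value is an assumption based on the property name; the folded description above has the authoritative default):

```
# Sketch: let the connector create and modify ScyllaDB tables itself.
# The "true" value is an assumption; check the property's documented default.
scylladb.table.manage.enabled=true
```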

###Topic to Table
### Topic to Table

These configurations can be specified for multiple Kafka topics from which records are being processed.
Also, these topic-level configurations will override the behavior of connector-level configurations such as
@@ -210,7 +210,7 @@
should be processed as a delete request.


###Write
### Write

``scylladb.consistency.level``

@@ -278,28 +278,31 @@
* Valid Values: [0,...]
* Default Value: 0

###ScyllaDB
### ScyllaDB

``behavior.on.error``

Error handling behavior setting. Must be configured to one of the following:

``fail``

The Connector throws ConnectException and stops processing records when an error occurs while processing or inserting records into ScyllaDB.

``ignore``

Continues to process the next set of records when an error occurs while processing or inserting records into ScyllaDB.

``log``

Logs the error via connect-reporter when an error occurs while processing or inserting records into ScyllaDB and continues to process the next set of records available in the Kafka topics.

* Type: string
* Default: FAIL
* Valid Values: [FAIL, LOG, IGNORE]
* Importance: medium
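For instance, a sketch of keeping the connector running while recording failures, using a value from the list above:

```
# Sketch: log failing records via connect-reporter and continue processing.
behavior.on.error=LOG
```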

###Confluent Platform Configurations.

### Confluent Platform Configurations.

``tasks.max``

@@ -323,7 +326,3 @@ The name of the topics to consume data from and write to ScyllaDB.
* Type: list
* Default: localhost:9092
* Importance: high
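A minimal sketch of these Connect-level settings, reusing the placeholder topic names that appear elsewhere in this document:

```
# Sketch: run a single task and consume from the example topics.
tasks.max=1
topics=topic1,topic2,topic3
```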

------------------------


47 changes: 24 additions & 23 deletions documentation/QUICKSTART.md
@@ -14,8 +14,8 @@ Command to start ScyllaDB docker container:
```
$ docker run --name some-scylla --hostname some-scylla -d scylladb/scylla
```
Running `docker ps` will show you the exposed ports, which should look something like the following:

```
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
26cc6d47efe3 replace-with-image-name "/docker-entrypoint.…" 4 hours ago Up 23 seconds 0.0.0.0:32777->1883/tcp, 0.0.0.0:32776->9001/tcp anonymous_my_1
```
@@ -26,13 +26,13 @@
If you are new to Confluent, follow this [link](https://www.confluent.io/download) to download the Confluent Platform.


1 - Click on DOWNLOAD FREE under Self managed software.
1. Click on DOWNLOAD FREE under Self managed software.

2 - Click on Zip archive, fill in the email address, accept the T&C, and click Download Version 5.X.X.

2. Click on Zip archive, fill in the email address, accept the T&C, and click Download Version 5.X.X.

3 - Extract the downloaded file and move it to the desired location.

3. Extract the downloaded file and move it to the desired location.

4 - Now follow this [link](https://docs.confluent.io/current/quickstart/ce-quickstart.html#ce-quickstart) to complete the installation.
4. Now follow this [link](https://docs.confluent.io/current/quickstart/ce-quickstart.html#ce-quickstart) to complete the installation.


### Manual Installation Of The Connector
@@ -87,11 +87,11 @@ Your output should resemble:
```
"io.connect.scylladb.ScyllaDbSinkConnector"
```

#####Connector Configuration
#### Connector Configuration

Save these configs in a file *kafka-connect-scylladb.json* and run the following command:

```json
{
"name" : "scylladb-sink-connector",
"config" : {
"topics" : "topic1,topic2,topic3",
"scylladb.contact.points" : "scylladb-hosts",
"scylladb.keyspace" : "test"
}
}
```

Use this command to load the connector :
Use this command to load the connector:

```
curl -s -X POST -H 'Content-Type: application/json' --data @kafka-connect-scylladb.json http://localhost:8083/connectors
```
@@ -120,21 +121,21 @@ Once the Connector is up and running, use the command ``kafka-avro-console-producer`` to write records to the Kafka topic.
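Before producing records, it can help to confirm that the connector registered successfully. A quick sketch using Kafka Connect's standard REST status endpoint, with the port and connector name configured above:

```
# Sketch: check that the connector and its task are in the RUNNING state.
curl -s http://localhost:8083/connectors/scylladb-sink-connector/status
```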
Example:

```
kafka-avro-console-producer \
--broker-list localhost:9092 \
--topic topic1 \
--property parse.key=true \
--property key.schema='{"type":"record","name":"key_schema","fields":[{"name":"id","type":"int"}]}' \
--property "key.separator=$" \
--property value.schema='{"type":"record","name":"value_schema","fields":[{"name":"id","type":"int"},{"name":"firstName","type":"string"},{"name":"lastName","type":"string"}]}'
{"id":1}${"id":1,"firstName":"first","lastName":"last"}
```

**Review comment** (on the ``key.schema`` line): Yeah! I was getting an issue because of the missing double quote before the "name" attribute as well.

**Review comment** from @izhukov1992 (Mar 31, 2021, on the ``key.separator`` line): I prefer ``--property key.separator="$"``. I checked it on my side and it works too, but it looks better to me.

Output upon running the select query in ScyllaDB:

```
select * from test.topic1;

id | firstname | lastname

----+-----------+----------
  1 |     first |     last
```
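To run this query yourself, one option is to open a CQL shell inside the container started earlier; a sketch assuming the container name from the `docker run` command above:

```
# Sketch: attach a cqlsh session to the running ScyllaDB container.
docker exec -it some-scylla cqlsh
```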


##Modes in ScyllaDB
## Modes in ScyllaDB

###Standard
### Standard

Use this command to load the connector in standalone mode:

@@ -164,7 +165,7 @@ example.

**Distributed Mode JSON**

```json
{
"name" : "scylladb-sink-connector",
"config" : {
"scylladb.contact.points" : "scylladb-hosts",
"scylladb.keyspace" : "test",
"key.converter" : "org.apache.kafka.connect.json.JsonConverter",
"value.converter" : "org.apache.kafka.connect.json.JsonConverter"
"value.converter" : "org.apache.kafka.connect.json.JsonConverter",
"key.converter.schemas.enable" : "true",
"value.converter.schemas.enable" : "true",

```
@@ -249,7 +250,7 @@

**Distributed Mode**

```json
{
"name" : "scylladbSinkConnector",
"config" : {
```
@@ -280,7 +281,7 @@
```
scylladb.username=example
scylladb.password=password
```

###Logging
### Logging

To check logs for the Confluent Platform, use:
