Commit 2fd30a0

Cluster Topology Refresh + General Maintenance (#19)
1 parent 34c64f9 commit 2fd30a0

22 files changed: +162 additions, −175 deletions

CHANGELOG.md

Lines changed: 9 additions & 0 deletions

@@ -4,6 +4,15 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.2.0] - 2021-02-13
+### Added
+- Handle Redis cluster topology changes on the fly
+
+### Changed
+- Upgraded various dependencies
+- Upgraded demo to use Confluent Platform 6.1.0
+- Integration testing improvements
+
 ## [1.1.0] - 2020-12-11
 ### Added
 - Parallelization for source connector based on channels/patterns

LICENSE.md

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 The MIT License (MIT)
 
-Copyright (c) 2020 Jared Petersen
+Copyright (c) 2021 Jared Petersen
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal

docs/demo/README.md

Lines changed: 2 additions & 2 deletions

@@ -27,7 +27,7 @@ docker build -t jaredpetersen/redis:latest .
 
 Next, we'll need to build a docker image for Kafka Connect Redis. Navigate to `demo/docker/kafka-connect-redis` and run the following commands:
 ```bash
-curl -O https://repo1.maven.org/maven2/io/github/jaredpetersen/kafka-connect-redis/1.1.0/kafka-connect-redis-1.1.0.jar
+curl -O https://repo1.maven.org/maven2/io/github/jaredpetersen/kafka-connect-redis/1.2.0/kafka-connect-redis-1.2.0.jar
 docker build -t jaredpetersen/kafka-connect-redis:latest .
 ```
 
@@ -50,7 +50,7 @@ Be patient, this can take a few minutes.
 
 Run the following command to configure Redis to run in cluster mode instead of standalone mode:
 ```bash
-kubectl -n kcr-demo run -it --rm redis-client --image redis:6 -- redis-cli --pass IEPfIr0eLF7UsfwrIlzy80yUaBG258j9 --cluster create $(kubectl -n kcr-demo get pods -l app=redis-cluster -o jsonpath='{range.items[*]}{.status.podIP}:6379 ') --cluster-yes
+kubectl -n kcr-demo run -it --rm redis-client --image redis:6 -- redis-cli --pass IEPfIr0eLF7UsfwrIlzy80yUaBG258j9 --cluster create $(kubectl -n kcr-demo get pods -l app=redis-cluster -o jsonpath='{range.items[*]}{.status.podIP}:6379 {end}') --cluster-yes
 ```
 
 ## Usage
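
The `--cluster create` change above adds the `{end}` terminator that kubectl's JSONPath `{range ...}` loop requires. The node list that the template produces can be sketched with hypothetical pod IPs standing in for the live `kubectl get pods` output:

```shell
# Hypothetical pod IPs standing in for the live `kubectl get pods` output
pod_ips="10.244.0.5 10.244.0.6 10.244.0.7"

# Mirror what `{range .items[*]}{.status.podIP}:6379 {end}` emits:
# each pod IP suffixed with the Redis port and a trailing space
nodes=""
for ip in $pod_ips; do
  nodes="${nodes}${ip}:6379 "
done

echo "$nodes"
```

The resulting space-separated string is what `redis-cli --cluster create` receives as its node list.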

docs/demo/SINK.md

Lines changed: 30 additions & 16 deletions

@@ -1,24 +1,31 @@
 # Demo: Kafka Connect Sink
 ## Install Connector
-Send a request to the Kafka Connect REST API to configure it to use Kafka Connect Redis:
+Send a request to the Kafka Connect REST API to configure it to use Kafka Connect Redis.
 
-### Avro
-**IMPORTANT:** The Avro demo utilizes multiple topics in order to work around [a bug in the Avro console producer](https://github.com/confluentinc/schema-registry/issues/898). A fix has been merged but Confluent has not published a new Docker image for it yet (6.1.0+). Kafka Connect Redis works with Avro on a single topic; this is just a problem with the console producer provided by Confluent.
+First, expose the Kafka Connect server:
+```bash
+kubectl -n kcr-demo port-forward service/kafka-connect :rest
+```
+
+Kubectl will choose an available port for you that you will need to use for the cURLs (`$PORT`).
 
+### Avro
 ```bash
 curl --request POST \
-  --url "$(minikube -n kcr-demo service kafka-connect --url)/connectors" \
+  --url "localhost:$PORT/connectors" \
   --header 'content-type: application/json' \
   --data '{
     "name": "demo-redis-sink-connector",
     "config": {
      "connector.class": "io.github.jaredpetersen.kafkaconnectredis.sink.RedisSinkConnector",
      "key.converter": "io.confluent.connect.avro.AvroConverter",
      "key.converter.schema.registry.url": "http://kafka-schema-registry:8081",
+     "key.converter.key.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy",
      "value.converter": "io.confluent.connect.avro.AvroConverter",
      "value.converter.schema.registry.url": "http://kafka-schema-registry:8081",
+     "value.converter.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy",
      "tasks.max": "3",
-     "topics": "redis.commands.set,redis.commands.expire,redis.commands.expireat,redis.commands.pexpire,redis.commands.sadd,redis.commands.geoadd,redis.commands.arbitrary",
+     "topics": "redis.commands",
      "redis.uri": "redis://IEPfIr0eLF7UsfwrIlzy80yUaBG258j9@redis-cluster",
      "redis.cluster.enabled": true
    }

@@ -28,7 +35,7 @@ curl --request POST \
 ### Connect JSON
 ```bash
 curl --request POST \
-  --url "$(minikube -n kcr-demo service kafka-connect --url)/connectors" \
+  --url "localhost:$PORT/connectors" \
   --header 'content-type: application/json' \
   --data '{
     "name": "demo-redis-sink-connector",

@@ -48,17 +55,18 @@ curl --request POST \
 ### Avro
 Create an interactive ephemeral query pod:
 ```bash
-kubectl -n kcr-demo run -it --rm kafka-write-records --image confluentinc/cp-schema-registry:6.0.0 --command /bin/bash
+kubectl -n kcr-demo run -it --rm kafka-write-records --image confluentinc/cp-schema-registry:6.1.0 --command /bin/bash
 ```
 
-Write records to the `redis.commands` topics:
+Write records to the `redis.commands` topic:
 
 ```bash
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisSetCommand","type":"record","fields":[{"name":"key","type":"string"},{"name":"value","type":"string"},{"name":"expiration","type":["null",{"name":"RedisSetCommandExpiration","type":"record","fields":[{"name":"type","type":{"name":"RedisSetCommandExpirationType","type":"enum","symbols":["EX","PX","KEEPTTL"]}},{"name":"time","type":["null","long"]}]}],"default":null},{"name":"condition","type":["null",{"name":"RedisSetCommandCondition","type":"enum","symbols":["NX","XX","KEEPTTL"]}],"default":null}]}' \
-  --topic redis.commands.set
+  --topic redis.commands
 >{"key":"{user.1}.username","value":"jetpackmelon22","expiration":null,"condition":null}
 >{"key":"{user.2}.username","value":"anchorgoat74","expiration":{"io.github.jaredpetersen.kafkaconnectredis.RedisSetCommandExpiration":{"type":"EX","time":{"long":2100}}},"condition":{"io.github.jaredpetersen.kafkaconnectredis.RedisSetCommandCondition":"NX"}}
 >{"key":"product.milk","value":"$2.29","expiration":null,"condition":null}

@@ -70,35 +78,39 @@ kafka-avro-console-producer \
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisExpireCommand","type":"record","fields":[{"name":"key","type":"string"},{"name":"seconds","type":"long"}]}' \
-  --topic redis.commands.expire
+  --topic redis.commands
 >{"key":"product.milk","seconds":1800}
 ```
 
 ```bash
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisExpireatCommand","type":"record","fields":[{"name":"key","type":"string"},{"name":"timestamp","type":"long"}]}' \
-  --topic redis.commands.expireat
+  --topic redis.commands
 >{"key":"product.bread","timestamp":4130464553}
 ```
 
 ```bash
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisPexpireCommand","type":"record","fields":[{"name":"key","type":"string"},{"name":"milliseconds","type":"long"}]}' \
-  --topic redis.commands.pexpire
+  --topic redis.commands
 >{"key":"product.waffles","milliseconds":1800000}
 ```
 
 ```bash
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisSaddCommand","type":"record","fields":[{"name":"key","type":"string"},{"name":"values","type":{"type":"array","items":"string"}}]}' \
-  --topic redis.commands.sadd
+  --topic redis.commands
 >{"key":"{user.1}.interests","values":["reading"]}
 >{"key":"{user.2}.interests","values":["sailing","woodworking","programming"]}
 ```

@@ -107,17 +119,19 @@ kafka-avro-console-producer \
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisGeoaddCommand","type":"record","fields":[{"name":"key","type":"string"},{"name":"values","type":{"type":"array","items":{"name":"RedisGeoaddCommandGeolocation","type":"record","fields":[{"name":"longitude","type":"double"},{"name":"latitude","type":"double"},{"name":"member","type":"string"}]}}}]}' \
-  --topic redis.commands.geoadd
+  --topic redis.commands
 >{"key":"Sicily","values":[{"longitude":13.361389,"latitude":13.361389,"member":"Palermo"},{"longitude":15.087269,"latitude":37.502669,"member":"Catania"}]}
 ```
 
 ```bash
 kafka-avro-console-producer \
   --broker-list kafka-broker-0.kafka-broker:9092 \
   --property schema.registry.url='http://kafka-schema-registry:8081' \
+  --property value.subject.name.strategy='io.confluent.kafka.serializers.subject.TopicRecordNameStrategy' \
   --property value.schema='{"namespace":"io.github.jaredpetersen.kafkaconnectredis","name":"RedisArbitraryCommand","type":"record","fields":[{"name":"command","type":"string"},{"name":"arguments","type":{"type":"array","items":"string"}}]}' \
-  --topic redis.commands.arbitrary
+  --topic redis.commands
 >{"command":"TS.CREATE","arguments":["temperature:3:11", "RETENTION", "60", "LABELS", "sensor_id", "2", "area_id", "32"]}
 >{"command":"TS.ADD","arguments":["temperature:3:11", "1548149181", "30"]}
 >{"command":"TS.ADD","arguments":["temperature:3:11", "1548149191", "42"]}

@@ -126,7 +140,7 @@ kafka-avro-console-producer \
 ### Connect JSON
 Create an interactive ephemeral query pod:
 ```bash
-kubectl -n kcr-demo run -it --rm kafka-write-records --image confluentinc/cp-kafka:6.0.0 --command /bin/bash
+kubectl -n kcr-demo run -it --rm kafka-write-records --image confluentinc/cp-kafka:6.1.0 --command /bin/bash
 ```
 
 Write records to the `redis.commands` topic:
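
The `TopicRecordNameStrategy` settings added throughout this file are what let every command type share the single `redis.commands` topic: with that strategy, a schema registers under a subject derived from the topic name plus the record's fully qualified name, so multiple Avro record types can coexist on one topic. A sketch of the naming rule, using record names from the demo (the `{topic}-{record full name}` format is Confluent's documented behavior for this strategy):

```shell
# TopicRecordNameStrategy derives the Schema Registry subject as
# "<topic>-<record full name>", one subject per record type on the topic.
topic="redis.commands"

for record in \
  io.github.jaredpetersen.kafkaconnectredis.RedisSetCommand \
  io.github.jaredpetersen.kafkaconnectredis.RedisExpireCommand \
  io.github.jaredpetersen.kafkaconnectredis.RedisSaddCommand
do
  echo "${topic}-${record}"
done
```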

docs/demo/SOURCE.md

Lines changed: 11 additions & 4 deletions

@@ -2,10 +2,17 @@
 ## Install Connector
 Send a request to the Kafka Connect REST API to configure it to use Kafka Connect Redis. We'll be listening to all keyspace and keyevent notifications in Redis by using the channel pattern `__key*__:*`.
 
+First, expose the Kafka Connect server:
+```bash
+kubectl -n kcr-demo port-forward service/kafka-connect :rest
+```
+
+Kubectl will choose an available port for you that you will need to use for the cURLs (`$PORT`).
+
 ### Avro
 ```bash
 curl --request POST \
-  --url "$(minikube -n kcr-demo service kafka-connect --url)/connectors" \
+  --url "localhost:$PORT/connectors" \
   --header 'content-type: application/json' \
   --data '{
     "name": "demo-redis-source-connector",

@@ -28,7 +35,7 @@ curl --request POST \
 ### Connect JSON
 ```bash
 curl --request POST \
-  --url "$(minikube -n kcr-demo service kafka-connect --url)/connectors" \
+  --url "localhost:$PORT/connectors" \
   --header 'content-type: application/json' \
   --data '{
     "name": "demo-redis-source-connector",

@@ -75,7 +82,7 @@ SMEMBERS {user.2}.interests
 ### Avro
 Create an interactive ephemeral query pod:
 ```bash
-kubectl -n kcr-demo run -it --rm kafka-tail-records --image confluentinc/cp-schema-registry:6.0.0 --command /bin/bash
+kubectl -n kcr-demo run -it --rm kafka-tail-records --image confluentinc/cp-schema-registry:6.1.0 --command /bin/bash
 ```
 
 Tail the topic, starting from the beginning:

@@ -92,7 +99,7 @@ kafka-avro-console-consumer \
 ### Connect JSON
 Create an interactive ephemeral query pod:
 ```bash
-kubectl -n kcr-demo run -it --rm kafka-tail-records --image confluentinc/cp-kafka:6.0.0 --command /bin/bash
+kubectl -n kcr-demo run -it --rm kafka-tail-records --image confluentinc/cp-kafka:6.1.0 --command /bin/bash
 ```
 
 Tail the topic, starting from the beginning:
Tail the topic, starting from the beginning:
Lines changed: 1 addition & 1 deletion

@@ -1,3 +1,3 @@
-ARG VERSION=6.0.0
+ARG VERSION=6.1.0
 FROM confluentinc/cp-kafka-connect-base:${VERSION}
 COPY kafka-connect-redis-*.jar /usr/share/java/kafka-connect-redis/

docs/demo/kubernetes/kafka-broker/statefulset.yaml

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ spec:
     spec:
       containers:
         - name: kafka-broker
-          image: confluentinc/cp-kafka:6.0.0
+          image: confluentinc/cp-kafka:6.1.0
           command:
             - /bin/bash
           args:

docs/demo/kubernetes/schema-registry/deployment.yaml

Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ spec:
     spec:
       containers:
        - name: kafka-schema-registry
-          image: confluentinc/cp-schema-registry:6.0.0
+          image: confluentinc/cp-schema-registry:6.1.0
           ports:
             - containerPort: 8081
               name: rest

docs/demo/kubernetes/zookeeper/statefulset.yaml

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ spec:
     spec:
       containers:
        - name: zookeeper
-          image: confluentinc/cp-zookeeper:6.0.0
+          image: confluentinc/cp-zookeeper:6.1.0
           # Use downward api after https://github.com/kubernetes/kubernetes/pull/68719 is merged
           command:
             - /bin/bash

pom.xml

Lines changed: 10 additions & 10 deletions

@@ -5,7 +5,7 @@
 
   <groupId>io.github.jaredpetersen</groupId>
   <artifactId>kafka-connect-redis</artifactId>
-  <version>1.1.0</version>
+  <version>1.2.0</version>
   <packaging>jar</packaging>
 
   <name>kafka-connect-redis</name>

@@ -62,26 +62,26 @@
     <dependency>
       <groupId>org.apache.kafka</groupId>
       <artifactId>connect-api</artifactId>
-      <version>2.6.0</version>
+      <version>2.7.0</version>
       <scope>provided</scope>
     </dependency>
 
     <dependency>
       <groupId>io.lettuce</groupId>
       <artifactId>lettuce-core</artifactId>
-      <version>6.0.1.RELEASE</version>
+      <version>6.0.2.RELEASE</version>
     </dependency>
 
     <dependency>
       <groupId>io.projectreactor</groupId>
       <artifactId>reactor-core</artifactId>
-      <version>3.3.10.RELEASE</version>
+      <version>3.4.2</version>
     </dependency>
 
     <dependency>
       <groupId>org.projectlombok</groupId>
       <artifactId>lombok</artifactId>
-      <version>1.18.16</version>
+      <version>1.18.18</version>
       <scope>provided</scope>
     </dependency>
 

@@ -95,34 +95,34 @@
     <dependency>
       <groupId>org.junit.jupiter</groupId>
       <artifactId>junit-jupiter</artifactId>
-      <version>5.7.0</version>
+      <version>5.7.1</version>
       <scope>test</scope>
     </dependency>
 
     <dependency>
       <groupId>org.mockito</groupId>
       <artifactId>mockito-core</artifactId>
-      <version>3.5.15</version>
+      <version>3.7.7</version>
       <scope>test</scope>
     </dependency>
 
     <dependency>
       <groupId>org.testcontainers</groupId>
       <artifactId>testcontainers</artifactId>
-      <version>1.15.0-rc2</version>
+      <version>1.15.2</version>
       <scope>test</scope>
     </dependency>
     <dependency>
       <groupId>org.testcontainers</groupId>
       <artifactId>junit-jupiter</artifactId>
-      <version>1.15.0-rc2</version>
+      <version>1.15.2</version>
       <scope>test</scope>
     </dependency>
 
     <dependency>
       <groupId>io.projectreactor</groupId>
       <artifactId>reactor-test</artifactId>
-      <version>3.3.10.RELEASE</version>
+      <version>3.4.2</version>
       <scope>test</scope>
     </dependency>
   </dependencies>
