**CONTRIBUTING.md** (+3 −3)
````diff
@@ -207,18 +207,18 @@ Steps to update:
 git checkout 063a9ae7a65cebdf1cc128da9815c05f91a2a996 # for version 1.8.2
 ```

+If you get an error during that checkout command, double-check that the submodule was initialized / cloned! You may need to run `git submodule update --init --recursive`.
+
 1. Update [`config.d.ts`](https://github.com/Blizzard/node-rdkafka/blob/master/config.d.ts) and [`errors.d.ts`](https://github.com/Blizzard/node-rdkafka/blob/master/errors.d.ts) TypeScript definitions by running:
 ```bash
 node ci/librdkafka-defs-generator.js
 ```
 Note: This is run automatically during CI flows, but it's good to run it during the version upgrade pull request.

-1. Run `npm install` to build with the new version and fix any build errors that occur.
+1. Run `npm install --lockfile-version 2` to build with the new version and fix any build errors that occur.

 1. Run unit tests: `npm run test`

-1. Run end to end tests: `npm run test:e2e`. This requires running kafka & zookeeper locally.
-
 1. Update the version numbers referenced in the [`README.md`](https://github.com/Blizzard/node-rdkafka/blob/master/README.md) file to the new version.
````
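Taken together, the update steps amount to something like the following (a sketch: the submodule path `deps/librdkafka` is an assumption, and the pinned hash shown is the 1.8.2 example from the hunk above — it changes per release):

```bash
# Make sure the librdkafka submodule is initialized / cloned
git submodule update --init --recursive

# Pin the submodule to the target librdkafka release
cd deps/librdkafka
git checkout 063a9ae7a65cebdf1cc128da9815c05f91a2a996  # for version 1.8.2
cd ../..

# Regenerate the TypeScript definitions, rebuild, and run the unit tests
node ci/librdkafka-defs-generator.js
npm install --lockfile-version 2
npm run test
```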
**README.md** (+40 −44)
````diff
@@ -17,7 +17,7 @@ I am looking for *your* help to make this project even better! If you're interested
 
 The `node-rdkafka` library is a high-performance NodeJS client for [Apache Kafka](http://kafka.apache.org/) that wraps the native [librdkafka](https://github.com/edenhill/librdkafka) library. All the complexity of balancing writes across partitions and managing (possibly ever-changing) brokers should be encapsulated in the library.
 
-__This library currently uses `librdkafka` version `1.9.2`.__
+__This library currently uses `librdkafka` version `2.3.0`.__
 
 ## Reference Docs
 
@@ -60,11 +60,7 @@ Using Alpine Linux? Check out the [docs](https://github.com/Blizzard/node-rdkafk
 
 ### Windows
 
-<<<<<<< HEAD
-Windows build **is not** compiled from `librdkafka` source but it is rather linked against the appropriate version of [NuGet librdkafka.redist](https://www.nuget.org/packages/librdkafka.redist/) static binary that gets downloaded from `https://globalcdn.nuget.org/packages/librdkafka.redist.1.8.2.nupkg` during installation. This download link can be changed using the environment variable `NODE_RDKAFKA_NUGET_BASE_URL` that defaults to `https://globalcdn.nuget.org/packages/` when it's no set.
-=======
-Windows build **is not** compiled from `librdkafka` source but it is rather linked against the appropriate version of [NuGet librdkafka.redist](https://www.nuget.org/packages/librdkafka.redist/) static binary that gets downloaded from `https://globalcdn.nuget.org/packages/librdkafka.redist.1.9.2.nupkg` during installation. This download link can be changed using the environment variable `NODE_RDKAFKA_NUGET_BASE_URL` that defaults to `https://globalcdn.nuget.org/packages/` when it's no set.
->>>>>>> 52b40e99abc811b2c4be1d3e62dd021e4bb1f6d4
+The Windows build **is not** compiled from `librdkafka` source; rather, it is linked against the appropriate version of the [NuGet librdkafka.redist](https://www.nuget.org/packages/librdkafka.redist/) static binary, which is downloaded from `https://globalcdn.nuget.org/packages/librdkafka.redist.2.3.0.nupkg` during installation. This download link can be changed using the environment variable `NODE_RDKAFKA_NUGET_BASE_URL`, which defaults to `https://globalcdn.nuget.org/packages/` when it's not set.
 
 Requirements:
 * [node-gyp for Windows](https://github.com/nodejs/node-gyp#on-windows)
````
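For example, pointing the Windows installer at an internal NuGet mirror via the `NODE_RDKAFKA_NUGET_BASE_URL` override described above might look like this (a sketch; the mirror URL is hypothetical):

```bash
# Hypothetical mirror; it must serve the same /packages/ layout as globalcdn.nuget.org
export NODE_RDKAFKA_NUGET_BASE_URL="https://nuget.mirror.example.com/packages/"
npm install node-rdkafka
```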
````diff
@@ -96,12 +92,12 @@ npm install node-rdkafka
 To use the module, you must `require` it.
 
 ```js
-var Kafka = require('node-rdkafka');
+const Kafka = require('node-rdkafka');
 ```
 
 ## Configuration
 
-You can pass many configuration options to `librdkafka`. A full list can be found in `librdkafka`'s [Configuration.md](https://github.com/edenhill/librdkafka/blob/v1.9.2/CONFIGURATION.md)
+You can pass many configuration options to `librdkafka`. A full list can be found in `librdkafka`'s [Configuration.md](https://github.com/edenhill/librdkafka/blob/v2.3.0/CONFIGURATION.md).
 
 Configuration keys that have the suffix `_cb` are designated as callbacks. Some
 of these keys are informational and you can choose to opt-in (for example, `dr_cb`). Others are callbacks designed to
````
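As a concrete example of the `_cb` opt-in pattern: setting `dr_cb: true` makes the producer emit `delivery-report` events. A minimal sketch — the broker address and topic name are placeholders, and a poll interval is set so queued events actually get dispatched:

```js
const Kafka = require('node-rdkafka');

const producer = new Kafka.Producer({
  'metadata.broker.list': 'localhost:9092',
  'dr_cb': true // opt in to delivery reports
});

producer.connect();

producer.on('ready', () => {
  // Poll regularly so queued events (like delivery reports) are dispatched
  producer.setPollInterval(100);
  producer.produce('topic-name', null, Buffer.from('Awesome message'));
});

producer.on('delivery-report', (err, report) => {
  // Fired once the broker acknowledges (or rejects) the message
  if (err) console.error(err);
  else console.log('Delivered:', report.topic, report.partition, report.offset);
});
```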
````diff
@@ -136,25 +132,25 @@ You can also get the version of `librdkafka`
 const Kafka = require('node-rdkafka');
 console.log(Kafka.librdkafkaVersion);
 
-// #=> 1.9.2
+// #=> 2.3.0
 ```
 
 ## Sending Messages
 
 A `Producer` sends messages to Kafka. The `Producer` constructor takes a configuration object, as shown in the following example:
 …
-A `Producer` requires only `metadata.broker.list` (the Kafka brokers) to be created. The values in this list are separated by commas. For other configuration options, see the [Configuration.md](https://github.com/edenhill/librdkafka/blob/v1.9.2/CONFIGURATION.md) file described previously.
+A `Producer` requires only `metadata.broker.list` (the Kafka brokers) to be created. The values in this list are separated by commas. For other configuration options, see the [Configuration.md](https://github.com/edenhill/librdkafka/blob/v2.3.0/CONFIGURATION.md) file described previously.
 
 The following example illustrates a list with several `librdkafka` options set.
 
 ```js
-var producer = new Kafka.Producer({
+const producer = new Kafka.Producer({
   'client.id': 'kafka',
   'metadata.broker.list': 'localhost:9092',
   'compression.codec': 'gzip',
````
````diff
@@ -175,14 +171,14 @@ You can easily use the `Producer` as a writable stream immediately after creation
 ```js
 // Our producer with its Kafka brokers
 // This call returns a new writable stream to our topic 'topic-name'
@@ … @@
 Additionally, you can add serializers to modify the key or value of a message before it is sent over to Kafka.
 
 ```js
-producer.setValueSerializer(function(value) {
+producer.setValueSerializer((value) => {
   return Buffer.from(JSON.stringify(value));
 });
 ```
````
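Keys can be handled the same way — a sketch assuming the symmetric `setKeySerializer` counterpart:

```js
producer.setKeySerializer((key) => {
  // Serialize keys to Buffers just like values
  return Buffer.from(String(key));
});
```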
````diff
@@ -346,7 +342,7 @@ Otherwise the behavior of the class should be exactly the same.
 To read messages from Kafka, you use a `KafkaConsumer`. You instantiate a `KafkaConsumer` object as follows:
 
 ```js
-var consumer = new Kafka.KafkaConsumer({
+const consumer = new Kafka.KafkaConsumer({
   'group.id': 'kafka',
   'metadata.broker.list': 'localhost:9092',
 }, {});
````
````diff
@@ -361,10 +357,10 @@ The `group.id` and `metadata.broker.list` properties are required for a consumer
 Rebalancing is managed internally by `librdkafka` by default. If you would like to override this functionality, you may provide your own logic as a rebalance callback.
 
 ```js
-var consumer = new Kafka.KafkaConsumer({
+const consumer = new Kafka.KafkaConsumer({
   'group.id': 'kafka',
   'metadata.broker.list': 'localhost:9092',
-  'rebalance_cb': function(err, assignment) {
+  'rebalance_cb': (err, assignment) => {
 
     if (err.code === Kafka.CODES.ERRORS.ERR__ASSIGN_PARTITIONS) {
       // Note: this can throw when you are disconnected. Take care and wrap it in
````
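For reference, the rebalance callback typically assigns or unassigns partitions based on the error code. A sketch of the usual body — note that after the switch to an arrow function, `this` is no longer bound to the consumer, so the outer `consumer` variable is used instead:

```js
    if (err.code === Kafka.CODES.ERRORS.ERR__ASSIGN_PARTITIONS) {
      // Note: this can throw when you are disconnected; wrap it in a
      // try/catch if you want to handle that case gracefully.
      consumer.assign(assignment);
    } else if (err.code === Kafka.CODES.ERRORS.ERR__REVOKE_PARTITIONS) {
      // Give back the partitions that were revoked
      consumer.unassign();
    } else {
      // We had a real error
      console.error(err);
    }
```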
````diff
@@ -389,10 +385,10 @@ var consumer = new Kafka.KafkaConsumer({
 When you commit in `node-rdkafka`, the standard way is to queue the commit request up with the next `librdkafka` request to the broker. When doing this, there isn't a way to know the result of the commit. Luckily, there is another callback you can listen to in order to get this information:
````
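A minimal sketch of that opt-in via the `offset_commit_cb` configuration callback (the option name comes from node-rdkafka's documentation rather than this diff):

```js
const consumer = new Kafka.KafkaConsumer({
  'group.id': 'kafka',
  'metadata.broker.list': 'localhost:9092',
  'offset_commit_cb': (err, topicPartitions) => {
    if (err) {
      // The commit failed
      console.error(err);
    } else {
      // The commit succeeded; topicPartitions lists what was committed
      console.log(topicPartitions);
    }
  }
}, {});
```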
````diff
@@ -459,15 +455,15 @@ The following example illustrates flowing mode:
 consumer.connect();
 
 consumer
-  .on('ready', function() {
+  .on('ready', () => {
     consumer.subscribe(['librdtesting-01']);
 
     // Consume from the librdtesting-01 topic. This is what determines
     // the mode we are running in. By not specifying a callback (or specifying
     // only a callback) we get messages as soon as they are available.
     consumer.consume();
   })
-  .on('data', function(data) {
+  .on('data', (data) => {
     // Output the actual message contents
     console.log(data.value.toString());
   });
````
````diff
@@ -478,17 +474,17 @@ The following example illustrates non-flowing mode:
 consumer.connect();
 
 consumer
-  .on('ready', function() {
+  .on('ready', () => {
     // Subscribe to the librdtesting-01 topic
     // This makes subsequent consumes read from that topic.
     consumer.subscribe(['librdtesting-01']);
 
     // Read one message every 1000 milliseconds
-    setInterval(function() {
+    setInterval(() => {
       consumer.consume(1);
     }, 1000);
   })
-  .on('data', function(data) {
+  .on('data', (data) => {
     console.log('Message found! Contents below.');
     console.log(data.value.toString());
   });
````
````diff
@@ -528,15 +524,15 @@ The following table lists events for this API.
 Sometimes you find yourself in a situation where you need to know the latest (and earliest) offset for one of your topics. Connected producers and consumers both allow you to query for these through `queryWaterMarkOffsets` as follows:
````
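A sketch of such a query — assuming the method is spelled `queryWatermarkOffsets` in code, takes `(topic, partition, timeoutMs, callback)`, and returns an `offsets` object exposing `lowOffset`/`highOffset`:

```js
// Works on any connected producer or consumer
consumer.queryWatermarkOffsets('librdtesting-01', 0, 5000, (err, offsets) => {
  if (err) {
    return console.error(err);
  }
  console.log('Low (earliest) offset:', offsets.lowOffset);
  console.log('High (latest) offset:', offsets.highOffset);
});
```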