
Commit d9524b3

Merge pull request #28 from aviflax/patch-1

Replace README contents with an OBSOLETE notice

2 parents: 7db4ec7 + 982d10e

README.md

Lines changed: 2 additions & 183 deletions
@@ -1,184 +1,3 @@
**Removed:**

# kafka-rb

kafka-rb allows you to produce and consume messages to/from the Kafka distributed messaging service.
This is an improved version of the original Ruby client written by Alejandro Crosa,
and is used in production at wooga.
## Ubuntu Pre-install

    apt-get install build-essential gcc g++ liblzo2-dev
    apt-get install ruby1.9.1-dev
    apt-get install libsnappy1 libsnappy-dev
## Requirements

You need to have access to your Kafka instance and be able to connect to it through TCP.
You can obtain a copy of Kafka, and instructions on how to set it up, at http://incubator.apache.org/kafka/
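Before running any client code, it can be worth a quick check that the broker is actually reachable over TCP, for example with netcat (a sketch assuming the stock Kafka port 9092 used elsewhere in this README; replace the host placeholder with your own):

    nc -vz YOUR_KAFKA_SERVER_HERE 9092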
To make Snappy compression available, add

    gem "snappy"

to your Gemfile.
## Installation

    sudo gem install kafka-rb

(should work fine with JRuby, Ruby 1.8 and 1.9)
## Usage

### Sending a simple message

    require 'kafka'
    producer = Kafka::Producer.new
    message = Kafka::Message.new("some random message content")
    producer.push(message)
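The zero-argument constructor above relies on the client defaults; to point the producer at a specific broker and topic you would pass an options hash. A sketch, assuming `Kafka::Producer.new` accepts the same `:host`, `:port`, and `:topic` options that the `Kafka::Consumer.new` example further down uses:

    require 'kafka'

    # assumed options, mirroring the Consumer options shown later in this README
    producer = Kafka::Producer.new(
      :host  => "YOUR_KAFKA_SERVER_HERE",  # broker hostname (placeholder)
      :port  => 9092,                      # default Kafka port
      :topic => "events")
    producer.push(Kafka::Message.new("some random message content"))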
### Sending a sequence of messages

    require 'kafka'
    producer = Kafka::Producer.new
    message1 = Kafka::Message.new("some random message content")
    message2 = Kafka::Message.new("some more content")
    producer.push([message1, message2])
### Batching a bunch of messages using the block syntax

    require 'kafka'
    producer = Kafka::Producer.new
    producer.batch do |messages|
      puts "Batching a push of multiple messages.."
      messages << Kafka::Message.new("first message to push")
      messages << Kafka::Message.new("second message to push")
    end

* the messages will all be sent at once, after the block finishes executing
### Consuming messages one by one

    require 'kafka'
    consumer = Kafka::Consumer.new
    messages = consumer.consume
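`consume` returns the batch of messages fetched so far; each message carries its raw bytes in `payload`, which is how the Avro example further down reads them. A minimal sketch:

    messages = consumer.consume
    messages.each do |message|
      puts message.payload   # raw bytes of each fetched message
    end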
### Consuming messages using a block loop

    require 'kafka'
    consumer = Kafka::Consumer.new
    consumer.loop do |messages|
      puts "Received"
      puts messages
    end
## A more real-world example, using partitions and reading Avro

    #!/usr/bin/env ruby
    require 'kafka'
    require 'snappy'
    require 'avro'
    require 'json'
    require 'stringio'

    # avro schema readers
    @schema = Avro::Schema.parse(File.open("events.avsc", "rb").read)
    @reader = Avro::IO::DatumReader.new(@schema)

    # converts raw bytes into a well-defined avro object
    def to_event(bits)
      blob = StringIO.new(bits)
      decoder = Avro::IO::BinaryDecoder.new(blob)
      obj = @reader.read(decoder)
      puts obj.to_json
    end

    # define a listener for each partition you have
    def subscribe(partition = 0)
      trap(:INT) { exit }
      puts "polling partition #{partition}..."
      consumer = Kafka::Consumer.new({
        :host => "YOUR_KAFKA_SERVER_HERE",
        :port => 9092,
        :topic => "events",
        :max_size => 1024 * 1024 * 10,
        :polling => 1,
        :partition => partition })
      begin
        consumer.loop do |messages|
          puts "polling #{partition} partition..."
          messages.each do |message|
            to_event message.payload
          end
        end
      rescue
        puts "Whoops, critical error #{$!} on partition #{partition}!!!"
      end
    end

    puts "Master listener started..."

    # listen to 16 partitions on kafka
    # (be sure to match the partition count of your kafka cluster)
    subscribers = []
    16.times do |partition_index|
      subscribers[partition_index] = Thread.new { subscribe(partition_index) }
    end

    # block forever, pulling messages on all subscriber threads
    subscribers.each do |sub|
      sub.join
    end
### Using the CLI

There are two CLI programs for talking to Kafka from the command line, intended
mainly for debugging: `kafka-publish` and `kafka-consumer`. You can configure
them via command-line arguments or by setting the environment variables
*KAFKA_HOST*, *KAFKA_PORT*, *KAFKA_TOPIC*, and *KAFKA_COMPRESSION*.
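For example, the broker coordinates can come from the environment rather than from flags (a sketch assuming the variables behave as their names suggest; the host is a placeholder):

```
$ KAFKA_HOST=YOUR_KAFKA_SERVER_HERE KAFKA_PORT=9092 kafka-consumer -t events
```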
#### kafka-publish

```
$ kafka-publish --help
Usage: kafka-publish [options]

    -h, --host HOST                   Set the kafka hostname
    -p, --port PORT                   Set the kafka port
    -t, --topic TOPIC                 Set the kafka topic
    -c, --compression no|gzip|snappy  Set the compression method
    -m, --message MESSAGE             Message to send
```
If _message_ is omitted, `kafka-publish` will read from *STDIN* until EOF or
SIGINT.

NOTE: kafka-publish doesn't batch messages for the moment.

This can be quite handy for piping logs directly into Kafka:

```
$ tail -f /var/log/syslog | kafka-publish -t syslog
```
#### kafka-consumer

```
$ kafka-consumer --help
Usage: kafka-consumer [options]

    -h, --host HOST    Set the kafka hostname
    -p, --port PORT    Set the kafka port
    -t, --topic TOPIC  Set the kafka topic
```

`kafka-consumer` will loop and wait for messages until it is interrupted.

This can be handy, for example, for grabbing a quick sample of the messages on a topic.
## Questions?

alejandrocrosa at(@) gmail.com
http://twitter.com/alejandrocrosa

**Added:**

# kafka-rb is OBSOLETE AND NO LONGER MAINTAINED

kafka-rb supports only the pre-0.8 Kafka API and has been supplanted by [Poseidon](https://github.com/bpot/poseidon).
