Improved README
edenhill committed May 25, 2016
commit 1652550528b3fcd3ec2909a74039ba23ce50a320
82 changes: 66 additions & 16 deletions README.md
@@ -1,40 +1,94 @@
Confluent's Apache Kafka client for Python
==========================================

Confluent's Kafka client for Python wraps the librdkafka C library, providing
full Kafka protocol support with great performance and reliability.
Contributor:
support at great -> support with great


Prerequisites
===============
The Python bindings provide a high-level Producer and Consumer with support
for the balanced consumer groups of Apache Kafka 0.9.

librdkafka >=0.9.1 (or master>=2016-04-13)
py.test (pip install pytest)
See the [API documentation](http://docs.confluent.io/3.0.0/clients/confluent-kafka-python/index.html) for more info.
Contributor:
If you want this link versioned, that's fine. You can also use http://docs.confluent.io/current/clients/confluent-kafka-python/index.html if you want something that doesn't need updating w/ each version



Build
Usage
=====

python setup.py build
**Producer:**

    from confluent_kafka import Producer

    p = Producer({'bootstrap.servers': 'mybroker,mybroker2'})
    for data in some_data_source:
        p.produce('mytopic', data.encode('utf-8'))
    p.flush()

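Since `produce()` is asynchronous, a delivery callback can be used to confirm that each message actually reached the broker. The sketch below is a hedged example: the `on_delivery` keyword and serving callbacks via `poll()` are assumptions based on the API documentation linked above, not part of the original example.

    from confluent_kafka import Producer

    def delivery_report(err, msg):
        # Called once per produced message to signal the delivery result.
        if err is not None:
            print('Delivery failed: %s' % err)
        else:
            print('Delivered to %s [%d] @ %d'
                  % (msg.topic(), msg.partition(), msg.offset()))

    p = Producer({'bootstrap.servers': 'mybroker'})
    # The 'on_delivery' kwarg is assumed here; see the API docs for the exact signature.
    p.produce('mytopic', 'some value'.encode('utf-8'), on_delivery=delivery_report)
    p.poll(0)   # serve any queued delivery callbacks
    p.flush()   # wait for outstanding messages before exiting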

**High-level Consumer:**

    from confluent_kafka import Consumer

    c = Consumer({'bootstrap.servers': 'mybroker', 'group.id': 'mygroup',
                  'default.topic.config': {'auto.offset.reset': 'smallest'}})
    c.subscribe(['mytopic'])
    while running:
        msg = c.poll()
        if not msg.error():
            print('Received message: %s' % msg.value().decode('utf-8'))
    c.close()


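The balanced consumer groups mentioned above can also be observed directly, since `subscribe()` can take rebalance callbacks. This is a minimal sketch assuming the `on_assign`/`on_revoke` keyword arguments described in the API documentation; the callbacks are served from within `poll()`.

    from confluent_kafka import Consumer

    def on_assign(consumer, partitions):
        # Invoked when the group rebalances and partitions are assigned to this member.
        print('Assigned: %s' % partitions)

    def on_revoke(consumer, partitions):
        # Invoked before the current assignment is taken away.
        print('Revoked: %s' % partitions)

    c = Consumer({'bootstrap.servers': 'mybroker', 'group.id': 'mygroup'})
    # The on_assign/on_revoke kwargs are assumed here; check the API docs for details.
    c.subscribe(['mytopic'], on_assign=on_assign, on_revoke=on_revoke)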

See [examples](examples) for more examples.



Prerequisites
=============

* Python >= 2.7 or Python 3.x
* [librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.1



Install
=======
Preferably in a virtualenv:

**Install from PyPI:**

pip install confluent-kafka


**Install from source / tarball:**

pip install .


Run unit-tests
==============
Build
=====

python setup.py build




Tests
=====


**Run unit-tests:**

py.test

**NOTE**: Requires py.test, install with `pip install pytest`

Run integration tests
=====================
**WARNING**: These tests require an active Kafka cluster and will make use of a topic named 'test'.

**Run integration tests:**

examples/integration_test.py <kafka-broker>

**WARNING**: These tests require an active Kafka cluster and will make use of a topic named 'test'.




Generate documentation
@@ -51,7 +105,3 @@ or:
Documentation will be generated in `docs/_build/`


Examples
========

See [examples](examples)