A lightweight command-line tool for producing, consuming, and inspecting Apache Kafka messages, similar to netcat for Kafka.
kcat is a lightweight, non-JVM command-line utility for interacting with Apache Kafka clusters. It lets users produce messages to Kafka topics from stdin, consume messages from topics to stdout, and inspect cluster metadata, functioning like a netcat for Kafka. It provides a fast, minimal alternative to the heavier Java-based Kafka command-line interfaces.
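A minimal sketch of that stdin/stdout workflow, assuming a broker reachable at `localhost:9092` and a topic named `mytopic` (both placeholders for your own cluster):

```shell
# Produce: read lines from stdin, sending one message per line (-P = producer mode)
echo "hello kafka" | kcat -b localhost:9092 -t mytopic -P

# Consume: print the topic's messages to stdout (-C = consumer mode),
# exiting once the end of the topic is reached (-e)
kcat -b localhost:9092 -t mytopic -C -e
```

Both commands require a running Kafka broker at the address given to `-b`.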
Developers, DevOps engineers, and data engineers who work with Apache Kafka and need a quick, scriptable way to test, debug, and monitor Kafka topics and clusters from the command line.
Developers choose kcat for its speed, small footprint (under 150KB statically linked), and ease of use in shell scripts and automated workflows; it offers features like Avro deserialization, custom output formatting, and mock cluster support without JVM overhead.
Generic command line non-JVM Apache Kafka producer and consumer
Statically linked to under 150KB, kcat runs quickly without JVM overhead, making it ideal for resource-constrained environments or quick scripts.
Supports producer and consumer modes, metadata listing, Avro deserialization via Schema Registry, custom output formatting, and an in-memory mock Kafka cluster for testing.
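A few of those features sketched as commands, assuming the placeholder addresses `localhost:9092` for the broker and `localhost:8081` for a Schema Registry, and a kcat build with Avro/serdes support:

```shell
# List brokers, topics, and partitions in the cluster (-L = metadata mode)
kcat -b localhost:9092 -L

# Consume with a custom output format string (-f):
# topic, partition, offset, key, and message value per line
kcat -b localhost:9092 -t mytopic -C \
  -f 'topic=%t part=%p offset=%o key=%k value=%s\n'

# Deserialize Avro-encoded message values using a Schema Registry
# (-s selects the serdes, -r points at the registry)
kcat -b localhost:9092 -t avro-topic -C -s value=avro -r http://localhost:8081
```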
Designed as a netcat-like utility, it seamlessly integrates with shell pipelines and scripts, allowing for easy automation of Kafka operations from the command line.
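As an illustration of that pipeline-friendliness, a common pattern is chaining a consumer into a producer to copy messages between topics; the broker address and topic names below are placeholders:

```shell
# Copy all existing messages from one topic to another via a shell pipe:
# the consumer exits at end of topic (-e), runs quietly (-q), and its
# stdout feeds the producer's stdin
kcat -b localhost:9092 -t source-topic -C -e -q | \
  kcat -b localhost:9092 -t dest-topic -P
```

Note that this simple form carries over message values only; keys and headers would need the producer's key-related options to be preserved.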
Requires dependencies like librdkafka and optional libraries for Avro and JSON support, which can be cumbersome to set up, especially on Windows where specific Visual Studio tools are needed.
Being a command-line-only tool, it lacks graphical interfaces for visual debugging or monitoring, which might hinder users who prefer GUI-based Kafka management.
While versatile, it doesn't support advanced Kafka features like stream processing or schema management beyond deserialization, relying on external tools or services for those needs.