I have Kafka messages defined in protobuf, and I want to consume them easily.

Kafka ships with a command-line tool called kafka-console-consumer, which accepts a --value-deserializer option. In theory, a Java deserializer that decodes protobuf-encoded Kafka messages should work with it. However, first, I don’t want to write extra code; second, --value-deserializer doesn’t actually work (https://stackoverflow.com/questions/45581082/kafka-console-consumer-sh-custom-deserializer).

Confluent has a Schema Registry which lets you store message schemas. However, as of this writing it supports only Avro.

Thanks to https://github.com/shanipribadi, who recommended a nice tool called pq (https://github.com/sevagh/pq). With it, I can easily consume Kafka messages encoded in protobuf.

Install the Rust toolchain and build pq on your machine.

> brew install rustup
> rustup-init
> git clone https://github.com/sevagh/pq && cd pq
> cargo build --release

Compile your protobuf definitions into a descriptor set and save it under ~/.pq.

> protoc -o dog.fdset dog.proto
> mkdir -p ~/.pq
> cp dog.fdset ~/.pq/

Here, the protobuf message type is com.example.dog.Dog.

syntax = "proto2"; // "required" fields exist only in proto2

package com.example.dog;

message Dog {
  required string breed = 1;
  required int32 age = 2;
  optional string temperament = 3;
}
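To get a feel for what pq has to decode, here is a hand-rolled sketch of how the example Dog message is laid out on the protobuf wire format. It is pure Python with no protobuf library, and the field numbers come from dog.proto above; real producers should use protoc-generated classes instead.

```python
# Illustrative sketch of the protobuf wire format for the Dog message.
# Field numbers (1, 2, 3) come from dog.proto; this is not production code.

def tag(field_number: int, wire_type: int) -> bytes:
    # A tag is (field_number << 3) | wire_type; one byte for small field numbers.
    return bytes([(field_number << 3) | wire_type])

def varint(n: int) -> bytes:
    # Varints store 7 bits per byte; the MSB is set while more bytes follow.
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        out.append(b | (0x80 if n else 0))
        if not n:
            return bytes(out)

def length_delimited(field_number: int, payload: bytes) -> bytes:
    # Wire type 2: tag, then the payload length as a varint, then the payload.
    return tag(field_number, 2) + varint(len(payload)) + payload

def encode_dog(breed: str, age: int, temperament: str) -> bytes:
    return (
        length_delimited(1, breed.encode())          # required string breed = 1
        + tag(2, 0) + varint(age)                    # required int32 age = 2
        + length_delimited(3, temperament.encode())  # optional string temperament = 3
    )

print(encode_dog("gsd", 10, "aggressive"))
# b'\n\x03gsd\x10\n\x1a\naggressive'
```

The compiled .fdset is what tells pq which field number maps to which name and type; the bytes on the wire carry only the numbers.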

Consume the topic from the beginning (--beginning) and stop after one message (--count 1).

> pq kafka my_topic --brokers 192.168.0.1:9092 --beginning --count 1 --msgtype com.example.dog.Dog
{
  "age": 10,
  "breed": "gsd",
  "temperament": "aggressive"
}
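Conceptually, pq walks the wire-format bytes and maps each field number to the name and type recorded in the .fdset, producing the JSON above. A minimal pure-Python sketch of that decoding step, with the field-number-to-name table hard-coded from dog.proto (pq itself reads it from the compiled descriptor set):

```python
# Minimal sketch of decoding the Dog wire bytes back into a dict,
# mirroring the JSON that pq prints. Illustrative only.

def read_varint(data: bytes, pos: int):
    # Varints store 7 bits per byte; the MSB is set while more bytes follow.
    result = shift = 0
    while True:
        b = data[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        shift += 7
        if not b & 0x80:
            return result, pos

FIELD_NAMES = {1: "breed", 2: "age", 3: "temperament"}  # from dog.proto

def decode_dog(data: bytes) -> dict:
    fields, pos = {}, 0
    while pos < len(data):
        key, pos = read_varint(data, pos)
        field_number, wire_type = key >> 3, key & 0x07
        if wire_type == 0:                   # varint (the int32 age field)
            value, pos = read_varint(data, pos)
        elif wire_type == 2:                 # length-delimited (the string fields)
            length, pos = read_varint(data, pos)
            value = data[pos:pos + length].decode()
            pos += length
        else:
            raise ValueError(f"unexpected wire type {wire_type}")
        fields[FIELD_NAMES[field_number]] = value
    return fields

print(decode_dog(b'\n\x03gsd\x10\n\x1a\naggressive'))
# {'breed': 'gsd', 'age': 10, 'temperament': 'aggressive'}
```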

You might have a large number of .proto files with dependencies between them. In that case, compile them all into a single descriptor set (--include_imports pulls in the imported files):

> cd src/main/proto
> find . -iname '*.proto' | xargs protoc -o example.fdset --include_imports --include_source_info