- Node.js - Download & Install Node.js and the npm package manager.
- Confluent Platform
- Zookeeper
- Kafka
- Schema Registry
- Rest Proxy (optional)
```bash
npm install
npm start
```

Go here → http://localhost:8666
Edit `config.json` to configure the endpoints for the application, Kafka, the Schema Registry and the Rest Proxy:

```json
{
  "APPLICATION_HOSTNAME": "localhost",
  "APPLICATION_PORT": 8666,
  "KAFKA_BROKER": "localhost:9092",
  "SCHEMA_REGISTRY_URL": "http://localhost:8081",
  "REST_PROXY_URL": "http://localhost:8082"
}
```
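For reference, a hedged sketch of how these values could be read from Node.js with a plain `require` (not necessarily how Topic Tailer itself loads them):

```js
// Sketch: load config.json and derive the endpoints referenced in this README.
const config = require('./config.json');

const appUrl = `http://${config.APPLICATION_HOSTNAME}:${config.APPLICATION_PORT}`;
console.log('UI:             ', appUrl);                     // http://localhost:8666
console.log('Kafka broker:   ', config.KAFKA_BROKER);        // localhost:9092
console.log('Schema Registry:', config.SCHEMA_REGISTRY_URL); // http://localhost:8081
console.log('Rest Proxy:     ', config.REST_PROXY_URL);      // http://localhost:8082
```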
- Press TAB to open/close the configuration pane
- Choose topics to tail
- Move tables around using drag and drop
- Resize tables and columns
- Press SPACE to pause/resume tailing topics
- Rinse and repeat
Kafka → Node.js (kafka-avro) → WebSockets → Browser
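A minimal sketch of that pipeline, assuming the `kafka-avro` and `ws` npm packages; the topic name, consumer group id and port below are illustrative and not taken from Topic Tailer's source:

```js
// Sketch: consume Avro-decoded messages and broadcast them to browsers over WebSockets.
const KafkaAvro = require('kafka-avro');
const WebSocket = require('ws');

const kafkaAvro = new KafkaAvro({
  kafkaBroker: 'localhost:9092',           // KAFKA_BROKER from config.json
  schemaRegistry: 'http://localhost:8081'  // SCHEMA_REGISTRY_URL from config.json
});

const wss = new WebSocket.Server({ port: 8666 }); // APPLICATION_PORT (illustrative)

// Push a message to every connected browser session.
function broadcast(payload) {
  const json = JSON.stringify(payload);
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(json);
  });
}

kafkaAvro.init()
  // Group id is hypothetical; 'latest' keeps the consumer tailing the end of the topic.
  .then(() => kafkaAvro.getConsumer({ 'group.id': 'topic-tailer-demo' },
                                     { 'auto.offset.reset': 'latest' }))
  .then((consumer) => {
    consumer.connect();
    consumer.on('ready', () => {
      consumer.subscribe(['my-topic']); // hypothetical topic name
      consumer.consume();
    });
    // kafka-avro exposes the Avro-decoded value as rawData.parsed.
    consumer.on('data', (rawData) => {
      broadcast({ topic: rawData.topic, value: rawData.parsed });
    });
  });
```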
Topic Tailer communicates with the Rest Proxy to determine the list of available topics; if the Rest Proxy is not available, it falls back to the Schema Registry. Each time TAB is pressed, the topic list is refreshed via a new call to the Rest Proxy/Schema Registry.
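As a rough illustration of that discovery step (assuming the `node-fetch` package; this is not Topic Tailer's actual code), the Rest Proxy exposes `GET /topics` and the Schema Registry exposes `GET /subjects`, where subject names conventionally end in `-value`/`-key`:

```js
// Sketch: list topics via the Rest Proxy, falling back to the Schema Registry.
const fetch = require('node-fetch');

const REST_PROXY_URL = 'http://localhost:8082';      // REST_PROXY_URL from config.json
const SCHEMA_REGISTRY_URL = 'http://localhost:8081'; // SCHEMA_REGISTRY_URL from config.json

async function listTopics() {
  try {
    // Rest Proxy: GET /topics returns an array of topic names.
    const res = await fetch(`${REST_PROXY_URL}/topics`);
    if (!res.ok) throw new Error(`Rest Proxy returned ${res.status}`);
    return await res.json();
  } catch (err) {
    // Fallback: GET /subjects on the Schema Registry returns subject names,
    // conventionally "<topic>-value" / "<topic>-key", so strip the suffix.
    const res = await fetch(`${SCHEMA_REGISTRY_URL}/subjects`);
    const subjects = await res.json();
    return [...new Set(subjects.map((s) => s.replace(/-(value|key)$/, '')))];
  }
}

listTopics().then((topics) => console.log('available topics:', topics));
```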
Avro and JSON are supported; everything else gets converted to a string.
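A small sketch of that fallback for non-Avro values (Avro decoding itself is handled upstream by kafka-avro; the function name is hypothetical):

```js
// Sketch: render a raw (non-Avro) Kafka message value for the browser.
// Try JSON first; anything that does not parse is shown as a plain string.
function decodeValue(buffer) {
  const text = buffer.toString('utf8');
  try {
    return JSON.parse(text); // JSON payloads are displayed as objects
  } catch (err) {
    return text;             // everything else becomes a string
  }
}
```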
Topic Tailer only consumes the latest messages; it does not replay a topic from the beginning.
Messages that arrive while tailing is paused get thrown away; when Topic Tailer is resumed, it restarts consumption from the latest offset.
Topic Tailer creates at most one consumer thread per tailed topic. Messages received from a single consumer thread are broadcast to multiple browser sessions via WebSockets.
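A hedged sketch of how that sharing could be structured (illustrative only; `createConsumerFor` stands in for the kafka-avro setup shown earlier):

```js
// Sketch: share one consumer per topic across all browser sessions.
const consumers = new Map(); // topic -> { consumer, sockets, paused }

function tail(topic, socket, createConsumerFor) {
  let entry = consumers.get(topic);
  if (!entry) {
    entry = { sockets: new Set(), paused: false };
    entry.consumer = createConsumerFor(topic, (message) => {
      if (entry.paused) return;                     // paused: incoming messages are dropped
      const json = JSON.stringify(message);
      entry.sockets.forEach((ws) => ws.send(json)); // broadcast to every session
    });
    consumers.set(topic, entry);                    // at most one consumer per topic
  }
  entry.sockets.add(socket);
}
```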