Commit

tyt2y3 committed Mar 25, 2023
1 parent ed85232 commit 726d5d3
Showing 4 changed files with 25 additions and 15 deletions.
README.md: 20 changes (12 additions & 8 deletions)
@@ -42,7 +42,7 @@ Let's build real-time (multi-threaded, no GC), self-contained (aka easy to deplo
Add the following to your `Cargo.toml`

```toml
-sea-streamer = { version = "0", features = ["kafka", "stdio", "socket", "runtime-tokio"] }
+sea-streamer = { version = "0", features = ["kafka", "redis", "stdio", "socket", "runtime-tokio"] }
```

Here is a basic [stream consumer](https://github.com/SeaQL/sea-streamer/tree/main/examples/src/bin/consumer.rs):
@@ -57,9 +57,8 @@ async fn main() -> Result<()> {
    let streamer = SeaStreamer::connect(stream.streamer(), Default::default()).await?;

    let mut options = SeaConsumerOptions::new(ConsumerMode::RealTime);
-    options.set_kafka_consumer_options(|options| {
-        options.set_auto_offset_reset(AutoOffsetReset::Earliest);
-    });
+    options.set_auto_stream_reset(SeaStreamReset::Earliest);

    let consumer: SeaConsumer = streamer
        .create_consumer(stream.stream_keys(), options)
        .await?;
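For orientation, the hunk above swaps the Kafka-only `set_kafka_consumer_options` call for the backend-agnostic `set_auto_stream_reset`, so the same consumer runs unchanged on Redis or Kafka. Below is a minimal, self-contained sketch of such a consumer: the import list and the receive loop are reconstructed from the facade API rather than copied from the collapsed part of this diff, the stream URL is a made-up example, and `anyhow` plus `tokio` (with macro and runtime features enabled) are assumed as extra dependencies.

```rust
use sea_streamer::{
    Buffer, Consumer, ConsumerMode, ConsumerOptions, Message, SeaConsumer, SeaConsumerOptions,
    SeaMessage, SeaStreamReset, SeaStreamer, StreamUrl, Streamer,
};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Hypothetical stream URL; `kafka://localhost:9092/hello1` works the same way
    let stream: StreamUrl = "redis://localhost:6379/hello1".parse()?;

    let streamer = SeaStreamer::connect(stream.streamer(), Default::default()).await?;

    let mut options = SeaConsumerOptions::new(ConsumerMode::RealTime);
    // Backend-agnostic replacement for the Kafka-specific auto offset reset
    options.set_auto_stream_reset(SeaStreamReset::Earliest);

    let consumer: SeaConsumer = streamer
        .create_consumer(stream.stream_keys(), options)
        .await?;

    loop {
        let mess: SeaMessage = consumer.next().await?;
        println!("[{}] {}", mess.timestamp(), mess.message().as_str()?);
    }
}
```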
@@ -131,15 +130,18 @@ async fn main() -> Result<()> {

Now, let's put them into action.

-With Kafka:
+With Redis / Kafka:

```shell
+STREAMER_URI="redis://localhost:6379" # or
+STREAMER_URI="kafka://localhost:9092"
+
# Produce some input
-cargo run --bin producer -- --stream kafka://localhost:9092/hello1 &
+cargo run --bin producer -- --stream $STREAMER_URI/hello1 &
# Start the processor, producing some output
-cargo run --bin processor -- --input kafka://localhost:9092/hello1 --output kafka://localhost:9092/hello2 &
+cargo run --bin processor -- --input $STREAMER_URI/hello1 --output $STREAMER_URI/hello2 &
# Replay the output
-cargo run --bin consumer -- --stream kafka://localhost:9092/hello2
+cargo run --bin consumer -- --stream $STREAMER_URI/hello2
# Remember to stop the processes
kill %1 %2
```
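The shell snippet drives three example binaries from the repository. To make the data flow concrete, here is a rough sketch of a producer that could feed the `hello1` stream above, written against the same facade API and under the same dependency assumptions as the consumer sketch earlier; in particular, `stream_key()`, `send()` and `end()` are assumptions about the facade's producer API, not lines copied from the shipped example.

```rust
use std::time::Duration;

use sea_streamer::{Producer, SeaProducer, SeaStreamer, StreamUrl, Streamer};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Hypothetical stream URL; swap in `kafka://localhost:9092/hello1` for Kafka
    let stream: StreamUrl = "redis://localhost:6379/hello1".parse()?;

    let streamer = SeaStreamer::connect(stream.streamer(), Default::default()).await?;

    let producer: SeaProducer = streamer
        .create_producer(stream.stream_key()?, Default::default())
        .await?;

    for tick in 0..10 {
        let message = format!("tick {tick}");
        eprintln!("{message}");
        producer.send(message)?; // assumed to be non-blocking
        tokio::time::sleep(Duration::from_secs(1)).await;
    }

    producer.end().await?; // assumed to flush pending messages

    Ok(())
}
```

Running it alongside the consumer sketch (or the `consumer` binary above) against the same URL should show the messages flowing end to end.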
@@ -295,6 +297,8 @@ In Redis, shards and nodes is a M-N mapping - shards can be moved among nodes *a
It makes testing much more difficult.
Let us know if you'd like to help!

+This crate is built on top of [`redis`](https://docs.rs/redis).

### `sea-streamer-stdio`: Standard I/O Backend

This is the `stdio` backend implementation for SeaStreamer. It is designed to be connected together with unix pipes,
sea-streamer-redis/README.md: 2 changes (2 additions & 0 deletions)
@@ -50,3 +50,5 @@ It's quite a difficult task, because clients have to take responsibility when wo
In Redis, shards and nodes form an M-N mapping - shards can be moved among nodes *at any time*.
It makes testing much more difficult.
Let us know if you'd like to help!

+This crate is built on top of [`redis`](https://docs.rs/redis).
sea-streamer-redis/src/lib.rs: 2 changes (2 additions & 0 deletions)
@@ -50,6 +50,8 @@
//! In Redis, shards and nodes form an M-N mapping - shards can be moved among nodes *at any time*.
//! It makes testing much more difficult.
//! Let us know if you'd like to help!
+//!
+//! This crate is built on top of [`redis`](https://docs.rs/redis).
#![cfg_attr(docsrs, feature(doc_cfg))]
#![deny(missing_debug_implementations)]
src/lib.rs: 16 changes (9 additions & 7 deletions)
@@ -57,9 +57,8 @@
//!     let streamer = SeaStreamer::connect(stream.streamer(), Default::default()).await?;
//!
//!     let mut options = SeaConsumerOptions::new(ConsumerMode::RealTime);
-//!     options.set_kafka_consumer_options(|options| {
-//!         options.set_auto_offset_reset(AutoOffsetReset::Earliest);
-//!     });
+//!     options.set_auto_stream_reset(SeaStreamReset::Earliest);
//!
//!     let consumer: SeaConsumer = streamer
//!         .create_consumer(stream.stream_keys(), options)
//!         .await?;
@@ -131,15 +130,18 @@
//!
//! Now, let's put them into action.
//!
-//! With Kafka:
+//! With Redis / Kafka:
//!
//! ```shell
+//! STREAMER_URI="redis://localhost:6379" # or
+//! STREAMER_URI="kafka://localhost:9092"
+//!
//! # Produce some input
-//! cargo run --bin producer -- --stream kafka://localhost:9092/hello1 &
+//! cargo run --bin producer -- --stream $STREAMER_URI/hello1 &
//! # Start the processor, producing some output
-//! cargo run --bin processor -- --input kafka://localhost:9092/hello1 --output kafka://localhost:9092/hello2 &
+//! cargo run --bin processor -- --input $STREAMER_URI/hello1 --output $STREAMER_URI/hello2 &
//! # Replay the output
-//! cargo run --bin consumer -- --stream kafka://localhost:9092/hello2
+//! cargo run --bin consumer -- --stream $STREAMER_URI/hello2
//! # Remember to stop the processes
//! kill %1 %2
//! ```
