I am very intrigued by Redis Streams. (They look like a way to build little systems powered by append-only logs, Kafka-style, but without all of Kafka's operational overhead.)
It looks straightforward to XADD an entry to a log/stream and to consume entries from it. But what if you want to join across two streams?
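For concreteness, the produce/consume side I have in mind is something like this minimal redis-py sketch (stream name and fields are made up):

```python
import redis

r = redis.Redis(decode_responses=True)

# Append an entry to a stream (creates the stream if it doesn't exist)
entry_id = r.xadd("orders", {"order_id": "42", "amount": "19.99"})

# Read entries after ID 0, i.e. everything currently in the stream
for stream, entries in r.xread({"orders": "0"}, count=10):
    for entry_id, fields in entries:
        print(stream, entry_id, fields)
```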
Kafka Streams, Flink, Spark, etc. provide means for doing this. Is there an equivalent in the Redis universe?
If not, I guess I'll just need to roll my own thing that consumes from both streams, applies its own join logic to the messages, and publishes the results back out to a new stream. If others have experience doing this with Redis Streams, please do share your pointers or warnings.
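In case it helps frame the discussion, here's the rough shape of the hand-rolled version I'm imagining. This is only a sketch: the stream names ("clicks", "impressions", "joined") and the join key are hypothetical, and the buffer is a naive in-memory dict rather than anything production-ready:

```python
import redis

r = redis.Redis(decode_responses=True)

# Hypothetical streams and join key. A real version would want consumer
# groups (XREADGROUP/XACK), a persisted buffer, and expiry of entries
# that never find a match on the other stream.
STREAMS = {"clicks": "0", "impressions": "0"}
JOIN_KEY = "user_id"

pending = {"clicks": {}, "impressions": {}}  # join_key -> fields, per stream

while True:
    # Block up to 5s waiting for new entries on either stream
    for stream, entries in r.xread(STREAMS, block=5000) or []:
        other = "impressions" if stream == "clicks" else "clicks"
        for entry_id, fields in entries:
            STREAMS[stream] = entry_id  # advance our read cursor
            key = fields.get(JOIN_KEY)
            if key is None:
                continue
            match = pending[other].pop(key, None)
            if match is not None:
                # Both sides seen: publish the joined record to a new stream
                r.xadd("joined", {**match, **fields})
            else:
                pending[stream][key] = fields
```

Obviously this punts on ordering, time-windowing, and unmatched entries, which is exactly the sort of thing Kafka Streams or Flink would handle for me, hence the question.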