I have a Kafka Streams application that operates on incoming events and needs to store state before writing to the next topic. The write should occur only after the state has been updated in the local store. Something like this:
stream.map(this::getAndUpdateState)
      .map(this::processStateAndEvent)
      .to("topicname");
So that in getAndUpdateState() I can do something like:
state = store.get(key);            // or create new state if null
state = updateState(state, event); // apply the event to the state
store.put(key, state);             // write the state back to the store
return state;
How do I implement these simple get() and put() operations on a Kafka state store? I already tried using a KeyValueStore directly, but ran into issues because I had to wire it up with source and sink processors myself.
Alternatively, a way to get and put state in Kafka using a KTable or some other concept would also be fine.
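One way to sketch this in the Kafka Streams DSL, without wiring source/sink processors by hand, is to register a state store on the StreamsBuilder and access it from transformValues(). Streams then manages the store and its changelog topic for you, and the put() happens before the record is forwarded downstream. This is a minimal sketch, not a definitive implementation; the store name "state-store", the topic names "input" and "topicname", and the String-concatenation updateState() are all placeholder assumptions:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.ValueTransformerWithKey;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.StoreBuilder;
import org.apache.kafka.streams.state.Stores;

public class StatefulUpdate {
    static final String STORE = "state-store"; // hypothetical store name

    public static StreamsBuilder buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();

        // Register the store with the topology; Streams creates the
        // backing changelog topic, so no manual source/sink processors.
        StoreBuilder<KeyValueStore<String, String>> storeBuilder =
            Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore(STORE),
                Serdes.String(), Serdes.String());
        builder.addStateStore(storeBuilder);

        builder.<String, String>stream("input")
            // transformValues gives per-record access to the named store
            .transformValues(StateUpdater::new, STORE)
            .to("topicname");
        return builder;
    }

    // get current state, update it, put it back, then forward downstream
    static class StateUpdater
            implements ValueTransformerWithKey<String, String, String> {
        private KeyValueStore<String, String> store;

        @SuppressWarnings("unchecked")
        @Override
        public void init(ProcessorContext context) {
            store = (KeyValueStore<String, String>) context.getStateStore(STORE);
        }

        @Override
        public String transform(String key, String event) {
            String state = store.get(key);              // null if key is new
            String updated = updateState(state, event); // apply the event
            store.put(key, updated);                    // write back first
            return updated;                             // then emit downstream
        }

        @Override
        public void close() {}
    }

    // Placeholder merge logic; substitute your real updateState().
    static String updateState(String state, String event) {
        return state == null ? event : state + "," + event;
    }
}
```

For the KTable route, groupByKey().aggregate(...) achieves a similar effect: the aggregate is the stored state and the resulting KTable's changelog can be written onward with toStream().to("topicname"). Note that in recent Kafka Streams versions transformValues() is superseded by processValues(), but the store-wiring idea is the same.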