Apache Kafka

learn APACHE KAFKA

Kafka has three key capabilities:

  1. To publish (write) and subscribe to (read) streams of events, including continuous import/export of your data from other systems (a minimal producer/consumer sketch follows this list).
  2. To store streams of events durably and reliably for as long as you want.
  3. To process streams of events as they occur or retrospectively.
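
A minimal sketch of the publish/subscribe capability using the Java client. The broker address (localhost:9092) and topic name ("events") are assumptions for illustration, not part of these notes:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class PublishSubscribeDemo {
    public static void main(String[] args) {
        // Publish (write) a few events to the assumed "events" topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("events", "key-1", "hello"));
            producer.send(new ProducerRecord<>("events", "key-2", "world"));
        }

        // Subscribe to (read) the same topic and print whatever comes back.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("events"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```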

concepts

KAFKA CONNECT

Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other systems. It makes it simple to quickly define connectors that move large collections of data into and out of Kafka.
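
As an illustration, a connector is defined mostly through configuration. Below is a sketch for the FileStreamSource example connector from the Kafka quickstart, run via bin/connect-standalone.sh; the file path and topic name are placeholder assumptions, and depending on the Kafka version the FileStream connectors may need to be enabled through Connect's plugin.path:

```properties
# Hypothetical standalone source connector: tail a local file into a Kafka topic.
# "file" and "topic" below are placeholder values.
name=local-file-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
file=/tmp/demo.txt
topic=connect-demo
```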

KAFKA STREAMS

Kafka Streams is a client library for building real-time applications where the input and/or output data is stored in Kafka clusters. It combines the simplicity of writing and deploying standard Java and Scala applications on the client side with the benefits of Kafka's server-side cluster technology, making these applications highly scalable, elastic, fault-tolerant, and distributed.

https://github.com/apache/kafka/blob/3.6/streams/examples/src/main/java/org/apache/kafka/streams/examples/wordcount/WordCountDemo.java
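
A condensed word-count sketch in the same spirit as the linked demo, assuming a broker at localhost:9092 and the input/output topic names used by the official example:

```java
import java.util.Arrays;
import java.util.Locale;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read each line from the input topic, split it into words,
        // group by word, and maintain a running count per word.
        KStream<String, String> textLines = builder.stream("streams-plaintext-input");
        KTable<String, Long> wordCounts = textLines
                .flatMapValues(line -> Arrays.asList(line.toLowerCase(Locale.ROOT).split("\\W+")))
                .groupBy((key, word) -> word)
                .count();

        // Write the continuously updated counts to the output topic.
        wordCounts.toStream().to("streams-wordcount-output",
                Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the Streams application cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```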

resources