Basic Event Streaming - Fundamentals of Kafka Studies (BESt-FunKS)

jpgsaraceni/best-funks

Apache Kafka

My study repo for Apache Kafka. Based on this tutorial.


Overview

Kafka is a distributed event streaming platform. Events (also called "records" or "messages") are things that happened; they are stored in logs called "topics". Topics are persisted to disk for a configurable retention period, which may be indefinite. Instead of storing only the current state of an object in a database, Kafka stores the sequence of events that produced that state.

Kafka is especially useful in microservice architectures, where services communicate by producing and consuming each other's events. It also makes real-time analytics of events more straightforward.

Kafka Connect streams data between Kafka and external systems: for example, a source connector reads rows from a database and writes them to a topic.

Kafka Streams is a Java API that performs stream-processing operations such as aggregation, grouping, and enrichment (joins) on Kafka topics.

Key Terms

Event

Something that happened. Also referred to as a message or record, an event is represented by a key, a value, a timestamp, and optional metadata.

Topic

A log of events.

Producer

Client applications that write (publish) to a topic.

Consumer

Client applications that read (subscribe) from a topic.

Partition

Parts of a topic spread over buckets on Kafka brokers. Events with the same event key are always stored in the same partition, and consumers of a given topic partition always read events in the order they were written.
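
Because ordering is only guaranteed within a partition, keys matter. As a sketch (assuming the `quickstart-events` topic and the broker on `localhost:9092` from the Getting Started section below), the console producer can publish keyed events via its `parse.key` and `key.separator` properties:

```shell
# Read "key:value" lines from stdin and produce them as keyed events;
# events sharing a key always land in the same partition.
bin/kafka-console-producer.sh --topic quickstart-events \
  --bootstrap-server localhost:9092 \
  --property "parse.key=true" \
  --property "key.separator=:"
```

Typing `user42:logged in` then produces an event with key `user42`, so all of that user's events keep their relative order.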

Getting Started

Based on the step-by-step guide on the official website.

Install

  1. Download the suggested version from the link above.

  2. Extract:

    tar -xzf kafka_2.13-3.1.0.tgz
    cd kafka_2.13-3.1.0
  3. Install Java (Kafka requires version 8 or newer; the commands below assume Debian/Ubuntu):

    sudo apt update
    sudo apt install default-jre
    java -version

Run

  1. Start Kafka environment:

    Start the ZooKeeper server:

    bin/zookeeper-server-start.sh config/zookeeper.properties

    And in another terminal instance, start the Kafka broker:

    bin/kafka-server-start.sh config/server.properties
  2. Create a topic (in another terminal instance):

    bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092

    Run bin/kafka-topics.sh with no arguments to display usage information.

  3. Write events into the topic:

    Run the console producer client:

    bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092

    Enter your events:

    >An event
    >Another event

    Press Ctrl+C to exit the producer.

  4. Read the events:

    Run the console consumer client:

    bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
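
Once events are flowing, the same kafka-topics.sh script used in step 2 can also inspect the topic (assuming the broker from step 1 is still running):

```shell
# Show partition count, replication factor, and partition leaders
# for the quickstart topic.
bin/kafka-topics.sh --describe --topic quickstart-events \
  --bootstrap-server localhost:9092
```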

Clear data

To remove all created topics and events, first stop the broker and ZooKeeper with Ctrl+C, then delete their data directories:

rm -rf /tmp/kafka-logs /tmp/zookeeper

Kafka Clients

There are a variety of clients available for using Kafka from inside an application. In the client directory of this repo I will build an implementation of Confluent's Go client, following their tutorial.
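
As a starting point for that client work (the module path below is for v2 of Confluent's Go client and may differ for other versions), the dependency can be added to a Go module with:

```shell
# Fetch Confluent's Go client for Kafka (assumes an initialized Go module).
go get github.com/confluentinc/confluent-kafka-go/v2/kafka
```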
