Introduction
Nowadays, whenever we think of ingesting, storing, processing, or analysing streaming data, one event streaming platform leads the field: Apache Kafka. Confluent complements Apache Kafka by providing additional tools, services, support, and more. Here is a short introduction to Kafka. If you are keen to learn its architecture, you can refer here.
This quickstart guide helps you get up and running with Confluent Platform using Docker containers.
Learning Objectives
- Installing Docker on Windows
- Installing Docker on Mac
- Operating System Compatibility with Various Docker Versions
- Downloading Confluent Platform
- Starting Confluent Platform on Docker
- Apache Kafka Basic Commands
Installing Docker on Windows
Docker version 1.11 or later is required. Let’s install Docker on Windows:
You can install Docker by visiting the official Docker website, then going to Get Started > Docker Desktop > Download for Windows.
Note: We are using the Docker Hub stable version.
Download Docker Community Edition 2.0.0.0-win81 (2018-12-07) from this link. Once the download is complete, run the executable file.
Installing Docker on Mac
Docker Desktop for Mac is the Community version of Docker for Mac. You can download Docker Desktop for Mac from Docker Hub.
Note: Docker Desktop requires macOS 10.14 or newer, i.e. Mojave (10.14), Catalina (10.15), or Big Sur (11.0). Mac hardware must be a 2010 or newer model with an Intel processor.
After installation, check that Docker is allocated at least 6 GB of memory. When using Docker Desktop for Mac, the default Docker memory allocation is 2 GB; you can raise it to 6 GB by navigating to Preferences > Resources > Advanced.
Operating System Compatibility with Various Docker Versions
Confluent Platform and its clients support a range of operating systems; check the compatibility matrix in the Confluent documentation before choosing a Docker version.
Downloading Confluent Platform on Docker
Download or copy the contents of the Confluent Platform all-in-one Docker Compose file, for example:
curl --silent --output docker-compose.yml https://raw.githubusercontent.com/confluentinc/cp-all-in-one/6.1.0-post/cp-all-in-one/docker-compose.yml
Starting Confluent Platform on Docker
Start Confluent Platform with the -d option to run in detached mode:
docker-compose up -d
The above command starts Confluent Platform, with a separate container for each Confluent Platform component.
To verify that the services are up and running, run the following command:
docker-compose ps
Log in to the broker container using the command below:
docker exec -it broker bash
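Alternatively, assuming the containers from the Compose file above are running, you can run a single Kafka CLI command inside the broker container without opening an interactive shell:

```shell
# Run one command inside the 'broker' container and return to the host shell
# (no -it flags needed, since no interactive session is opened)
docker exec broker kafka-topics --list --bootstrap-server localhost:9092
```

This is convenient for scripting, since each command exits back to your host shell when it finishes.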
Kafka Basic Commands
You will explore basic Kafka operations using the command-line tools that ship with Confluent Platform. You will create a Kafka topic, then use the console producer and consumer to write messages to the topic and read them back.
Execute the following command to create a Kafka topic:
kafka-topics --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic users
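The --partitions setting matters because Kafka's default partitioner routes each keyed message to a partition by hashing the key, so messages with the same key always land in the same partition and keep their order. A minimal Python sketch of the idea (using MD5 as a deterministic stand-in for the murmur2 hash Kafka actually uses):

```python
import hashlib

def pick_partition(key: str, num_partitions: int) -> int:
    """Map a message key to a partition index, as Kafka's default
    partitioner does conceptually (Kafka uses murmur2; MD5 is a stand-in)."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always maps to the same partition,
# so per-key ordering is preserved within that partition.
assert pick_partition("alice", 3) == pick_partition("alice", 3)
print(pick_partition("alice", 3), pick_partition("bob", 3))
```

With --partitions 1, as in the command above, every message goes to the single partition 0 regardless of its key.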
List the Kafka topics:
kafka-topics --list --zookeeper zookeeper:2181
Sending some messages
Kafka comes with a command line client that will take input from a file or from standard input and send it out as messages to the Kafka cluster.
In the Producer terminal window, execute the following command to send some messages to the Kafka cluster:
kafka-console-producer --broker-list localhost:9092 --topic users
Enter the following messages at the prompt. Hit the enter key after each message:
>hello world
>another message
Keep the Terminal window open.
Reading messages from the Kafka cluster
Kafka also has a command-line consumer that will dump out messages to standard output.
Enter the following command to read the messages:
kafka-console-consumer --bootstrap-server localhost:9092 --topic users --from-beginning
Notice that after a few seconds the consumer prints the messages (without the producer's > prompt):
hello world
another message
Now:
- Switch back to the Producer Terminal window and enter a few more messages.
- Switch back to the Consumer Terminal window and notice that the newly entered messages automatically show up!
- Switch to the Producer Terminal window and press Ctrl+C to stop the producer. Keep the Terminal window open.
- Switch to the Consumer Terminal window and press Ctrl+C to stop the consumer. Keep the Terminal window open.
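The behavior you just observed can be modeled as a simple append-only log: the producer appends records, a consumer started with --from-beginning reads from offset 0, and a consumer without that flag starts at the current end of the log and only sees new records. A toy Python model of those offset semantics (not the real client API):

```python
class ToyTopicPartition:
    """A toy append-only log modeling one Kafka topic partition."""

    def __init__(self):
        self.log = []

    def produce(self, message: str) -> int:
        """Append a record and return its offset."""
        self.log.append(message)
        return len(self.log) - 1

    def consume(self, from_beginning: bool = False) -> list:
        """Return records from offset 0, or from the current log end."""
        start = 0 if from_beginning else len(self.log)
        return self.log[start:]

tp = ToyTopicPartition()
tp.produce("hello world")
tp.produce("another message")

print(tp.consume(from_beginning=True))  # → ['hello world', 'another message']
print(tp.consume())                     # → [] (starts at the log's end)
```

This is why the consumer started with --from-beginning replayed your earlier messages, while new messages appear in it as soon as the producer sends them.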
Deleting a Kafka Topic
In this part, you will delete the Kafka topic which you previously created in this lab.
In the Producer Terminal window, run the following command.
kafka-topics --delete --zookeeper zookeeper:2181 --topic users
This should show in the console output:
Topic users is marked for deletion.
Use the following command to verify that the ‘users’ topic doesn’t exist anymore:
kafka-topics --list --zookeeper zookeeper:2181
If the topic is not deleted, go back and make sure that you closed the producer and consumer prompts, then retry the delete and check again.
Note: The configuration file used by the Kafka server, server.properties, can be found at /etc/kafka/server.properties inside the container.
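When you are finished with the lab, you can stop and remove all the Confluent Platform containers that the Compose file started:

```shell
# Stop and remove the containers defined in docker-compose.yml
docker-compose down
# Add -v to also remove the named volumes (this deletes all topic data):
# docker-compose down -v
```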
Conclusion
Congrats, you have installed Confluent Platform using Docker on your system. This enables you to leverage the functionality of the various Confluent Platform components, such as Apache Kafka, Kafka Connect, ksqlDB, Control Center, and much more. Once you have set up Confluent Platform, you can prepare for a Confluent certification exam or build PoCs for various use cases, including clickstream analysis, real-time data analysis, etc. If you are keen on attending official Confluent training, you can check the courses on our site – datacouch.io
Keep Learning, All the best!