Tutorial
Workshop: Get started with Apache Kafka
Learn how to use Kafka to build applications that react to events as they happen
Apache Kafka is a distributed event-streaming platform. Its main components are the broker, the Producer, Consumer, and Admin client APIs, Kafka Connect, and Kafka Streams.

Learning objectives
In this workshop, you'll learn how to build an end-to-end streaming pipeline using Apache Kafka, Kafka Connect, and Kafka Streams.

You'll learn how to:
- Configure the Kafka command line tools
- Create, list, and describe topics using the `kafka-topics.sh` tool
- Consume records with the `kafka-console-consumer.sh` tool
- Produce records with the `kafka-console-producer.sh` tool
- Describe consumer groups with the `kafka-consumer-groups.sh` tool
- Configure and run the Kafka Connect runtime in distributed mode
- Configure and run the FileStreamSource Kafka connector
- Run a Kafka Streams application
Prerequisites
- Apache Kafka CLI
- Java SDK, version 8 or later
- Gradle, version 6 or later
Estimated time
Completing this workshop should take about 1 hour.
Steps
- Install and configure a Kafka cluster
- Send and consume messages
- Integrating data with Kafka Connect
- Processing data with Kafka Streams
Step 1: Install and configure a Kafka cluster
In part 1 of this workshop, you'll set up a single-node Kafka cluster.
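As a rough sketch, assuming a Kafka 3.x distribution unpacked locally and run in KRaft mode (the property file path and options vary by Kafka version), a single-node cluster can be formatted and started like this:

```shell
# Generate a cluster ID and format the storage directory (KRaft mode, Kafka 3.x)
KAFKA_CLUSTER_ID="$(bin/kafka-storage.sh random-uuid)"
bin/kafka-storage.sh format -t "$KAFKA_CLUSTER_ID" -c config/kraft/server.properties

# Start the broker; by default it listens on localhost:9092
bin/kafka-server-start.sh config/kraft/server.properties
```

Older Kafka releases run in ZooKeeper mode instead, in which case you start ZooKeeper first and skip the storage-format step.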
Step 2: Sending and consuming messages
In part 2 of this workshop, you'll use the Kafka CLI to create a topic, send some messages, and consume some messages. You'll also learn how to set up a consumer group.
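The commands below sketch that flow, assuming a broker on `localhost:9092`; the topic name `demo` and group name `demo-group` are placeholders for this example:

```shell
# Create a topic with one partition (assumes a broker on localhost:9092)
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic demo --partitions 1 --replication-factor 1

# Produce a few records interactively (type messages, then Ctrl+C to exit)
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic demo

# Consume from the beginning of the topic as part of a consumer group
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic demo --from-beginning --group demo-group

# Inspect the group's partition assignments, offsets, and lag
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group demo-group
```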
Step 3: Integrating data with Kafka Connect
In part 3 of this workshop, you'll configure the Kafka Connect runtime for your environment. Then, you'll configure, start, and test the connector.
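As a sketch, assuming a broker on `localhost:9092` and the stock `connect-distributed.properties` file, you can start the Connect runtime and register a FileStreamSource connector through its REST API (port 8083 by default); the connector name `file-source`, input file path, and topic name here are placeholders:

```shell
# Start the Kafka Connect runtime in distributed mode
bin/connect-distributed.sh config/connect-distributed.properties

# Register a FileStreamSource connector via the Connect REST API
curl -X POST -H "Content-Type: application/json" \
  http://localhost:8083/connectors -d '{
    "name": "file-source",
    "config": {
      "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
      "file": "/tmp/input.txt",
      "topic": "demo"
    }
  }'
```

Each line appended to `/tmp/input.txt` is then published as a record to the `demo` topic, which you can verify with `kafka-console-consumer.sh`.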
Step 4: Processing data with Kafka Streams
In part 4 of this workshop, you'll learn how to use Kafka Streams to process streams of data in real time using the built-in sample application.
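For reference, Kafka ships a WordCount example application that can be run straight from the distribution (assuming a broker on `localhost:9092`; the topic names below are the defaults the demo expects):

```shell
# Create the demo's input topic, then run the WordCount example shipped with Kafka
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic streams-plaintext-input
bin/kafka-run-class.sh org.apache.kafka.streams.examples.wordcount.WordCountDemo
```

Records produced to `streams-plaintext-input` are split into words and counted, with running counts written to the `streams-wordcount-output` topic.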
Summary
In this workshop, you learned how to build an end-to-end streaming pipeline, with data flowing into Apache Kafka from an external system and then being processed in real time. By exploring the key concepts and components of Apache Kafka, you can now build reliable, scalable, and performant streaming environments.
Next steps
Perhaps you're ready to try one of these tutorials and code patterns to build your Kafka skills further:
- Tutorial: Develop Java programs to produce and consume messages to and from Apache Kafka using the Kafka Producer and Consumer APIs
- Tutorial: Developing a stream processor with Apache Kafka using Kafka Streams