Build real-time streaming data pipelines or streaming applications
Apache Kafka is an open-source distributed streaming platform for building real-time data pipelines and streaming applications. It lets you publish and subscribe to streams of records, process those streams as they occur, and store them in a fault-tolerant way.
What is messaging and why is it useful when developing applications? How does it differ from APIs, and where do events fit in? How can messaging improve your microservices applications? Get the answers to these questions and more by reading the following articles in this Messaging Fundamentals learning path.
IBM Event Streams is an event-streaming platform that helps you build smart applications that can react to events as they happen. IBM Event Streams offers a fully managed Apache Kafka service, ensuring durability and high availability.
In this tutorial, we review an IBM Event Streams Java sample application and look at how to write the client code, so that you learn how to produce and consume messages from Apache Kafka.
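To give a flavor of what that client code looks like, here is a minimal producer sketch using the standard Apache Kafka Java client. It is not the sample application itself: the broker address, topic name, and record contents are placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; an Event Streams instance also needs SASL/TLS settings.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send a handful of records to a hypothetical "greetings" topic.
            for (int i = 0; i < 5; i++) {
                producer.send(new ProducerRecord<>("greetings", Integer.toString(i), "hello " + i));
            }
            producer.flush();
        }
    }
}
```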
In this tutorial, you'll build a small application yourself. First, however, you will run a small binary application that produces some records to a topic. Your challenge is then to write a consumer application that consumes records from the topic and recovers the secret message hidden inside one of them.
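A consumer for that challenge might look roughly like the sketch below. The topic name, group ID, and the way records are printed are assumptions for illustration, not the actual solution; the tutorial defines how the secret message is marked.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ChallengeConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "challenge-consumer");      // consumer group ID
        props.put("auto.offset.reset", "earliest");       // read the topic from the beginning
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("challenge-topic")); // assumed topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Inspect each record; spotting the secret message is left to the challenge.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```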
In this learning path, you learn the core concepts of Apache Kafka and IBM Event Streams, gain hands-on experience with an IBM Event Streams instance through a sample application, try out a coding challenge by developing a solution to a problem, and learn how to troubleshoot and debug simple errors and connectivity issues.
Learn some of the common use cases for Apache Kafka, and then dig into its core concepts: how producers and consumers work, and how Kafka Streams and Kafka Connect can be used to create powerful data streaming pipelines.
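As a small illustration of the Kafka Streams API mentioned above, the following sketch reads records from one topic, filters them, and writes the matches to another topic. The topic names and the filter condition are invented for the example.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FilterPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filter-pipeline");   // unique application ID
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read every record from "orders", keep only those marked "priority", and forward them.
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> value != null && value.contains("priority"))
              .to("priority-orders");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```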
In this tutorial, we’ll show you just how easy it is to deploy a Kafka instance using the IBM Event Streams service on IBM Cloud, and then connect to it and run one of the sample applications.
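Connecting a client to an Event Streams instance generally means pointing it at the service's bootstrap endpoints and authenticating over SASL/TLS with the service credentials. The sketch below shows what those properties often look like; the broker endpoint and API key are placeholders, and the values to use come from your own service credentials.

```java
import java.util.Properties;

public class EventStreamsConfig {
    // A hedged sketch of typical IBM Event Streams connection settings.
    public static Properties buildConnectionProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers",
                "broker-0-example.eventstreams.cloud.ibm.com:9093"); // placeholder endpoint
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"token\" password=\"<YOUR_API_KEY>\";"); // placeholder API key
        return props;
    }
}
```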
Explore the typical high-level event-driven architecture (EDA) usage patterns for Apache Kafka. We will also look at how these usage patterns typically mature over time to handle issues such as topic governance, stream processing, and AI.