
Tutorial

Enable OMS to read Kafka messages with IBM MQ as bridge

Learn how to use Kafka Connect and IBM MQ to deliver Kafka messages to IBM Sterling OMS

By

Prashant Pillai

IBM Sterling Order Management System (OMS) supports Kafka as a producer, allowing it to send messages directly to Kafka topics. However, OMS does not support Kafka as a consumer, so it cannot read messages from Kafka topics directly.

This document outlines a proof-of-concept (POC) that uses a Kafka Connect Sink Connector to move messages from a Kafka topic to IBM MQ, which OMS can read natively.

The POC demonstrates how to set up this integration using OMS and IBM MQ from the OMS Developer Toolkit (DTK). The setup steps are similar for other OMS environments, such as on-premises deployments. A standalone Kafka instance is used for simplicity.

Note: This is not official IBM documentation and has not been performance-tested or officially supported. It is intended as a starting point for implementing this integration approach.

Setting up Kafka

You can either use an existing Kafka setup or create a standalone Kafka instance.

Option 1: Use an existing Kafka topic

If you already have a Kafka instance and topic, simply replace quickstart-events with your topic name wherever it's referenced.

Option 2: Set up a standalone Kafka instance

Follow these steps to install and run Kafka locally:

  1. Download Kafka from the Apache Software Foundation website.

  2. Extract the package:

    tar -xzf kafka_2.13-3.8.0.tgz

  3. To start ZooKeeper and the Kafka broker, first navigate to the Kafka folder:

    cd kafka_2.13-3.8.0

    In one terminal, start ZooKeeper:

    bin/zookeeper-server-start.sh config/zookeeper.properties

    In another terminal, start the Kafka broker:

    bin/kafka-server-start.sh config/server.properties

  4. To create a Kafka topic, open a new terminal and run:

    bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
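
To confirm the topic was created, you can describe it (a quick check, not part of the original quickstart):

    bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092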

Getting the Kafka Connect MQ Sink Connector

You can either download a prebuilt JAR file or build it from source.

Option 1: Download the prebuilt JAR

Download the latest prebuilt JAR file from the official GitHub releases page:

Kafka Connect MQ Sink releases

Look for the file named:

kafka-connect-mq-sink-2.2.1-jar-with-dependencies.jar

Option 2: Build from source

Prerequisites

  • Git

  • Maven 3.0 or later

  • JDK 8 or later

Steps to build

  1. Clone the repository:

    git clone https://github.com/ibm-messaging/kafka-connect-mq-sink.git

  2. Navigate to the project directory:

    cd kafka-connect-mq-sink

  3. To build the project, run the following commands in order:

    mvn test
    mvn integration-test
    mvn clean package

  4. After a successful build, the JAR file will be available in the target folder as:

    kafka-connect-mq-sink-2.2.1-jar-with-dependencies.jar
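
To confirm the artifact exists before wiring it into Kafka Connect, you can list it; the version number in the file name depends on the release you built:

    ls target/kafka-connect-mq-sink-*-jar-with-dependencies.jar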

Preparing IBM MQ to accept messages from Kafka Sink Connector

The IBM OMS Developer Toolkit (DTK) includes a container named om-mqserver, which runs an instance of IBM MQ. Follow these steps to configure it for use with the Kafka sink connector.

  1. Create a user in the MQ Container.

    a. Exec into the om-mqserver container.

    b. Create a new OS-level user named mqm1. This user will be used by the Kafka sink connector to authenticate with IBM MQ. (A command-line sketch covering this step and the access grants in step 3 follows this list.)

  2. Use an existing queue. The queue DEV.QUEUE.1 under the queue manager OM_QMGR will be used to receive messages from Kafka.

  3. Grant access to the user.

    a. Open the IBM MQ web console: https://<ip:port>/ibmmq/console/login.html#/

    b. Grant the user mqm1 access to:

    • The queue manager: OM_QMGR

    • The queue: DEV.QUEUE.1
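
As an illustration, the user creation in step 1 and the access grants in step 3 can also be done entirely from the command line. This is a minimal sketch; it assumes the DTK MQ container provides standard Linux user-management tools and the setmqaut command, and <password> is a placeholder you choose:

    # Open a shell in the MQ container
    docker exec -it om-mqserver bash

    # Create the OS-level user the sink connector authenticates as
    useradd mqm1
    echo "mqm1:<password>" | chpasswd

    # Grant connect/inquire on the queue manager and put/get/browse on the queue
    setmqaut -m OM_QMGR -t qmgr -p mqm1 +connect +inq
    setmqaut -m OM_QMGR -t queue -n DEV.QUEUE.1 -p mqm1 +put +get +browse +inq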

Set up a basic integration service in OMS to consume messages from MQ

Create a simple OMS integration service in the Developer Toolkit (DTK) to read messages from IBM MQ.

JMS Queue component settings

  • Destination Name: DEV.QUEUE.1

  • Initial Context Factory: File

  • Provider URL: <path to DTK folder>/devtoolkit_docker/jndi

  • Connection Factory: AGENT_QCF

  • Server Name: CreateItemFromQ

API component settings

  • Standard API: createItem

Run the Sink Connector and post a test message

Follow these steps to run the Kafka sink connector and post a message from Kafka to IBM MQ, which OMS will consume.

  1. Start the CreateItemFromQ integration server in the OMS Developer Toolkit.

  2. Create a copy of the sink properties file:

    mkdir -p ~/MQConnect
    cp <root directory of sink>/config/mq-sink.properties ~/MQConnect

    If you're using the prebuilt JAR, you can also download the config from: https://github.com/ibm-messaging/kafka-connect-mq-sink/tree/main/config

    Edit the file ~/MQConnect/mq-sink.properties and update the following:

    topics=quickstart-events
    mq.queue.manager=OM_QMGR
    mq.connection.name.list=<ip>(1414)
    mq.channel.name=SYSTEM.ADMIN.SVRCONN
    mq.queue=DEV.QUEUE.1
    mq.user.name=mqm1
    mq.password=<pwd>

  3. Run the following command from the Kafka directory:

    CLASSPATH=<path to sink jar>/kafka-connect-mq-sink-2.2.1-jar-with-dependencies.jar \
    ./bin/connect-standalone.sh config/connect-standalone.properties ~/MQConnect/mq-sink.properties

  4. Post a message to Kafka by using the Kafka console producer:

    bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092

    Then enter the following message (modify the item ID and org code as needed):

    <ItemList><Item ItemID="Item005" OrganizationCode="TestOrg" UnitOfMeasure="EACH"></Item></ItemList>

    Once the message is posted to Kafka, it will be sent to IBM MQ and consumed by OMS. The item Item005 should now appear in the TestOrg catalog organization.
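
If you want to confirm that messages are actually landing in MQ, you can check the queue depth from inside the om-mqserver container. Note that the depth drops back to zero as soon as the CreateItemFromQ server consumes the message, so stop that server first if you want to see messages accumulate:

    runmqsc OM_QMGR
    DISPLAY QLOCAL(DEV.QUEUE.1) CURDEPTH
    EXIT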

Use mTLS instead of user credentials for authentication

The earlier approach uses a username and password to authenticate Kafka with IBM MQ. This requires storing credentials in a properties file on the Kafka side.

A more secure alternative is to use mutual TLS (mTLS). It not only removes the need for credentials but also encrypts the data exchanged between Kafka and IBM MQ.

Key changes required

  • On the IBM MQ side:

    • Enable mTLS on the queue manager.

    • Add the Kafka client certificate to the MQ trust store.

  • On the Kafka side:

    • Add the IBM MQ certificate to the Kafka trust store.

    • Update the sink connector properties file to include the trust store location and cipher suite.

Follow the steps below to switch from username/password authentication to mTLS for Kafka-to-IBM MQ communication.

Configure IBM MQ for mTLS

Follow these steps on the IBM MQ container to enable mTLS authentication.

  1. To require TLS on the server-connection channel, run the following commands inside the container.

    runmqsc OM_QMGR

    Then run:

    ALTER CHANNEL(SYSTEM.ADMIN.SVRCONN) CHLTYPE(SVRCONN) SSLCIPH(ANY_TLS12)
    ALTER CHANNEL(SYSTEM.ADMIN.SVRCONN) CHLTYPE(SVRCONN) SSLCAUTH(REQUIRED)

  2. Create a key database. This file serves as both the key store and the trust store for the queue manager.

    runmqakm -keydb -create -db key.kdb -pw passw0rd -stash

  3. Create and extract a certificate.

    runmqakm -cert -create -db key.kdb -stashed -dn "cn=qm,o=ibm,c=uk" -label ibmwebspheremqom_qmgr
    runmqakm -cert -extract -label ibmwebspheremqom_qmgr -db key.kdb -stashed -file QM.cert

    Note: The label format must be: ibmwebspheremq<queue manager name in lowercase>. The file QM.cert is the public certificate to be copied to the Kafka trust store.
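
One detail to double-check: the queue manager only uses key.kdb if its key repository attribute (SSLKEYR) points at that file. A minimal check, assuming the key database was created in (or moved to) the queue manager's default SSL directory; note that SSLKEYR takes the path stem without the .kdb extension:

    runmqsc OM_QMGR
    DISPLAY QMGR SSLKEYR
    ALTER QMGR SSLKEYR('/var/mqm/qmgrs/OM_QMGR/ssl/key')
    EXIT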

Configure Kafka for mTLS

  1. To transfer the IBM MQ certificate, copy the QM.cert file (exported from IBM MQ) to the machine where Kafka is running.

  2. Import the IBM MQ certificate into the Kafka trust store.

    keytool -keystore clientTrustStore.jks -storetype jks -importcert -file QM.cert -alias server-certificate

    This adds the IBM MQ certificate to Kafka’s trust store.

  3. Create Kafka’s certificate and private key.

    keytool -genkeypair -keyalg RSA -alias client-key -keystore clientKeystore.jks -storepass passw0rd -storetype jks

    This generates a Kafka certificate and key pair, stored in clientKeystore.jks.

  4. Export the Kafka certificate.

    keytool -export -alias client-key -file clientCertificate.crt -keystore clientKeystore.jks

    Transfer the clientCertificate.crt file to the IBM MQ machine, for example, to the /tmp directory. This certificate will be imported into IBM MQ’s trust store.
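
Before restarting the connector, you can sanity-check both stores; keytool prompts for the store passwords set above:

    # Should show the MQ certificate under the alias server-certificate
    keytool -list -keystore clientTrustStore.jks

    # Should show the Kafka key pair under the alias client-key
    keytool -list -keystore clientKeystore.jks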

Final steps on IBM MQ side

  1. To import the Kafka certificate into IBM MQ trust store, log into the IBM MQ container and run:

    runmqakm -cert -add -db key.kdb -stashed -label ibmwebspheremqapp -file /tmp/clientCertificate.crt

    This adds the Kafka certificate to IBM MQ’s trust store.

  2. Refresh TLS settings.

    runmqsc OM_QMGR
    REFRESH SECURITY(*) TYPE(SSL)
    EXIT

    This applies the updated SSL configuration.
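
To confirm that both certificates are now present in the MQ key database, list its contents:

    runmqakm -cert -list -db key.kdb -stashed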

Final configuration on Kafka side

  1. To update the Kafka sink connector properties, edit ~/MQConnect/mq-sink.properties and add the following (make sure there is no trailing whitespace after the values, because it becomes part of the value):

    mq.ssl.cipher.suite=*TLS12
    mq.ssl.keystore.location=<path>/clientKeystore.jks
    mq.ssl.keystore.password=<keystore password>
    mq.ssl.truststore.location=<path>/clientTrustStore.jks
    mq.ssl.truststore.password=<truststore password>

    You can now comment out or remove the following, as they're no longer needed:

    mq.user.name=mqm1
    mq.password=<pwd>

    This enables mTLS-based authentication between Kafka and IBM MQ.

  2. Restart the sink connector and post messages to the topic as before to verify that mTLS is working correctly.

Notes and recommendations

  • If IBM MQ is temporarily down, the sink connector will keep trying to reconnect. Messages will still be written to Kafka. Once MQ is back up, the connector resumes posting messages to the queue automatically.

  • If MQ is up but a message fails to be committed (for example, due to size limits), the connector stops processing further messages. After fixing the issue (for example, increasing max message size), restart the connector to resume message flow. Ensure message size limits match on both sides.

  • The connector supports "exactly-once" delivery with specific configuration on both the Kafka and MQ sides, at a cost in throughput.

  • The connector can be configured to include the Kafka topic, partition, and offset as JMS message properties (see the sketch after this list).

  • You can run multiple connector tasks using the appropriate configuration. However, in failure scenarios, this may lead to duplicate messages in MQ.

  • By default, the sink connector posts the Kafka message payload to MQ as a string. In this POC the payload is XML, which OMS handles natively; other formats are not handled natively by OMS.
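
For the topic/partition/offset bullet above, the connector exposes optional properties that name the JMS message properties to populate. Below is a sketch of what that might look like in mq-sink.properties; the property names are taken from the kafka-connect-mq-sink documentation, and the values (the JMS property names) are illustrative, so verify both against the connector version you are running:

    mq.message.builder.topic.property=KafkaTopic
    mq.message.builder.partition.property=KafkaPartition
    mq.message.builder.offset.property=KafkaOffset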
