Enable OMS to read Kafka messages with IBM MQ as bridge
IBM Sterling Order Management System (OMS) supports Kafka as a producer, allowing it to send messages directly to Kafka topics. However, OMS does not support Kafka as a consumer, so it cannot read messages from Kafka topics directly.
This document outlines a proof-of-concept (POC) that uses a Kafka Connect Sink Connector to move messages from a Kafka topic to IBM MQ, which OMS can read natively.
The POC demonstrates how to set up this integration using OMS and IBM MQ from the OMS Developer Toolkit (DTK). The setup steps are similar for other OMS environments, such as on-premises deployments. A standalone Kafka instance is used for simplicity.
Note: This is not official IBM documentation and has not been performance-tested or officially supported. It is intended as a starting point for implementing this integration approach.
Setting up Kafka
You can either use an existing Kafka setup or create a standalone Kafka instance.
Option 1: Use an existing Kafka topic
If you already have a Kafka instance and topic, simply replace quickstart-events with your topic name wherever it's referenced.
Option 2: Set up a standalone Kafka instance
Follow these steps to install and run Kafka locally:
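The commands below are a minimal sketch based on the Apache Kafka quickstart, assuming a recent 3.x release run in KRaft mode on the local machine; the version number shown is only an example, so substitute the release you actually download.
# Extract the downloaded Kafka release (version is an example)
tar -xzf kafka_2.13-3.7.0.tgz
cd kafka_2.13-3.7.0
# Format storage and start a single-node broker in KRaft mode
KAFKA_CLUSTER_ID="$(bin/kafka-storage.sh random-uuid)"
bin/kafka-storage.sh format -t $KAFKA_CLUSTER_ID -c config/kraft/server.properties
bin/kafka-server-start.sh config/kraft/server.properties
# In a second terminal, create the topic used throughout this POC
bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092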
Preparing IBM MQ to accept messages from Kafka Sink Connector
The IBM OMS Developer Toolkit (DTK) includes a container named om-mqserver, which runs an instance of IBM MQ. Follow these steps to configure it for use with the Kafka sink connector.
Create a user in the MQ container.
a. Exec into the om-mqserver container.
b. Create a new OS-level user named mqm1. This user will be used by the Kafka sink connector to authenticate with IBM MQ.
Use an existing queue. The queue DEV.QUEUE.1 under the queue manager OM_QMGR will be used to receive messages from Kafka.
Grant access to the user (a command-line alternative to the web console is sketched after these steps).
a. Open the IBM MQ web console:
https://<ip:port>/ibmmq/console/login.html#/
b. Grant the user mqm1 access to:
The queue manager: OM_QMGR
The queue: DEV.QUEUE.1
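Both the user creation (step 1) and the access grants (step 3) can also be scripted. The sketch below assumes the om-mqserver container name from the DTK, root access to a shell inside it, and an authority list that is merely sufficient for this POC, so adjust it to your security requirements.
docker exec -it om-mqserver /bin/bash   # open a shell in the MQ container
useradd mqm1                            # OS-level user the sink connector authenticates as
passwd mqm1                             # set the password referenced in the sink properties
To grant the authorities without the web console, run runmqsc inside the same container:
runmqsc OM_QMGR
SET AUTHREC OBJTYPE(QMGR) PRINCIPAL('mqm1') AUTHADD(CONNECT,INQ)
SET AUTHREC PROFILE(DEV.QUEUE.1) OBJTYPE(QUEUE) PRINCIPAL('mqm1') AUTHADD(PUT,GET,BROWSE,INQ)
REFRESH SECURITY TYPE(AUTHSERV)
END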
Set up a basic integration service in OMS to consume messages from MQ
Create a simple OMS integration service in the Developer Toolkit (DTK) to read messages from IBM MQ.
JMS Queue component settings
Destination Name: DEV.QUEUE.1
Initial Context Factory: File
Provider URL: <path to DTK folder>/devtoolkit_docker/jndi
Connection Factory: AGENT_QCF
Server Name: CreateItemFromQ
API component settings
Standard API: createItem
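The Provider URL above points to a file-based JNDI directory that the DTK already ships with AGENT_QCF defined, so normally nothing needs to be created here. If you ever need to rebuild those bindings, a JMSAdmin sketch is shown below; the hostname, port, and channel are assumptions, so match them to your MQ listener before running IBM MQ's JMSAdmin tool against that JNDI directory.
DEFINE QCF(AGENT_QCF) QMANAGER(OM_QMGR) TRANSPORT(CLIENT) HOSTNAME(localhost) PORT(1414) CHANNEL(SYSTEM.ADMIN.SVRCONN)
DEFINE Q(DEV.QUEUE.1) QUEUE(DEV.QUEUE.1) QMANAGER(OM_QMGR)
END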
Run the Sink Connector and post a test message
Follow these steps to run the Kafka sink connector and post a message from Kafka to IBM MQ, which OMS will consume.
Start the CreateItemFromQ integration server in the OMS Developer Toolkit.
Create a copy of the sink properties file:
cp <root directory of sink>/config/mq-sink.properties ~/MQConnect
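Then edit the copy with the connection details for the DTK queue manager. The sketch below uses configuration names from IBM's kafka-connect-mq-sink connector; the host, port, and channel are assumptions for a local DTK setup (the channel is inferred from the TLS section later), so adjust them to your environment.
# ~/MQConnect/mq-sink.properties (edited copy)
name=mq-sink-connector
connector.class=com.ibm.eventstreams.connect.mqsink.MQSinkConnector
tasks.max=1
topics=quickstart-events
mq.queue.manager=OM_QMGR
mq.connection.name.list=localhost(1414)
mq.channel.name=SYSTEM.ADMIN.SVRCONN
mq.queue=DEV.QUEUE.1
mq.user.name=mqm1
mq.password=<pwd>
mq.message.body.jms=true
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
Then start the connector in standalone mode from the Kafka installation directory (assuming the connector jar is on the worker's plugin.path) and publish a test message with the console producer. The XML shown is only a minimal example payload and assumes the integration service passes the message body straight to the createItem API; the attributes your item setup requires may differ.
bin/connect-standalone.sh config/connect-standalone.properties ~/MQConnect/mq-sink.properties
# In another terminal, post a test message to the topic
bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
<Item ItemID="Item005" OrganizationCode="TestOrg" UnitOfMeasure="EACH"/>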
Once the message is posted to Kafka, it will be sent to IBM MQ and consumed by OMS. The item Item005 should now appear in the TestOrg catalog organization.
Use mTLS instead of user credentials for authentication
The earlier approach uses a username and password to authenticate Kafka with IBM MQ. This requires storing credentials in a properties file on the Kafka side.
A more secure alternative is to use mutual TLS (mTLS). It not only removes the need for credentials but also encrypts the data exchanged between Kafka and IBM MQ.
Key changes required
On the IBM MQ side:
Enable mTLS on the queue manager.
Add the Kafka client certificate to the MQ trust store.
On the Kafka side:
Add the IBM MQ certificate to the Kafka trust store.
Update the sink connector properties file to include the trust store location and cipher suite.
Follow the steps below to switch from username/password authentication to mTLS for Kafka-to-IBM MQ communication.
Configure IBM MQ for mTLS
Follow these steps on the IBM MQ container to enable mTLS authentication.
To update the queue manager's server-connection channel, run the following commands inside the container.
runmqsc OM_QMGR
Then run:
ALTER CHANNEL(SYSTEM.ADMIN.SVRCONN) CHLTYPE(SVRCONN) SSLCIPH(ANY_TLS12)
ALTER CHANNEL(SYSTEM.ADMIN.SVRCONN) CHLTYPE(SVRCONN) SSLCAUTH(REQUIRED)
Note: The label format must be: ibmwebspheremq<queue manager name in lowercase>. The file QM.cert is the public certificate to be copied to the Kafka trust store.
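A runmqakm sketch for creating the key repository, a certificate with that label, and the QM.cert file follows, assuming a self-signed certificate is acceptable for this POC; the repository path shown is an example, so check DISPLAY QMGR SSLKEYR for the path your queue manager actually uses.
# Create the CMS key repository referenced by the queue manager's SSLKEYR attribute
runmqakm -keydb -create -db /var/mqm/qmgrs/OM_QMGR/ssl/key.kdb -pw <keystore pwd> -type cms -stash
# Create a self-signed certificate with the required label
runmqakm -cert -create -db /var/mqm/qmgrs/OM_QMGR/ssl/key.kdb -stashed -label ibmwebspheremqom_qmgr -dn "CN=OM_QMGR" -size 2048
# Extract the public certificate as QM.cert for the Kafka trust store
runmqakm -cert -extract -db /var/mqm/qmgrs/OM_QMGR/ssl/key.kdb -stashed -label ibmwebspheremqom_qmgr -target /tmp/QM.cert -format ascii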
Configure Kafka for mTLS
To transfer the IBM MQ certificate, copy the QM.cert file (exported from IBM MQ) to the machine where Kafka is running.
Transfer the clientCertificate.crt file to the IBM MQ machine, for example, to the /tmp directory. This certificate will be imported into IBM MQ’s trust store.
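If the trust store, client key pair, and clientCertificate.crt do not exist yet, the keytool sketch below shows one way to create them on the Kafka machine before the transfer; the file names, passwords, JKS store type, and self-signed client certificate are assumptions for this POC.
# Import the MQ public certificate into a trust store for the connector
keytool -importcert -alias om_qmgr -file QM.cert -keystore /home/<user>/MQConnect/mq.truststore.jks -storetype JKS -storepass <truststore pwd> -noprompt
# Create a client key pair and export its public certificate for IBM MQ
keytool -genkeypair -alias kafkaclient -keyalg RSA -keysize 2048 -dname "CN=kafkaclient" -validity 365 -keystore /home/<user>/MQConnect/mq.keystore.jks -storetype JKS -storepass <keystore pwd>
keytool -exportcert -alias kafkaclient -file clientCertificate.crt -keystore /home/<user>/MQConnect/mq.keystore.jks -storetype JKS -storepass <keystore pwd> -rfc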
Final steps on IBM MQ side
To import the Kafka client certificate into the IBM MQ trust store (key repository), log into the IBM MQ container and run:
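A runmqakm sketch, assuming the same key repository as above and /tmp as the transfer location for clientCertificate.crt:
runmqakm -cert -add -db /var/mqm/qmgrs/OM_QMGR/ssl/key.kdb -stashed -label kafkaclient -file /tmp/clientCertificate.crt -format ascii
Afterwards, refresh the TLS configuration so the queue manager picks up the new certificate:
runmqsc OM_QMGR
REFRESH SECURITY TYPE(SSL)
END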
You can now comment out or remove the following, as they're no longer needed:
mq.user.name=mqm1
mq.password=<pwd>
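In their place, add the TLS settings for the connector. The property names below come from the kafka-connect-mq-sink connector; the store locations, passwords, and cipher suite are assumptions that must match the stores created earlier and be compatible with the channel's SSLCIPH(ANY_TLS12) setting (on non-IBM JVMs the connector's mq.ssl.use.ibm.cipher.mappings option may also be relevant).
mq.ssl.cipher.suite=TLS_RSA_WITH_AES_128_CBC_SHA256
mq.ssl.truststore.location=/home/<user>/MQConnect/mq.truststore.jks
mq.ssl.truststore.password=<truststore pwd>
mq.ssl.keystore.location=/home/<user>/MQConnect/mq.keystore.jks
mq.ssl.keystore.password=<keystore pwd>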
This enables mTLS-based authentication between Kafka and IBM MQ.
Restart the Kafka Connect worker and post messages to the topic as before to verify that mTLS is working correctly.
Notes and recommendations
If IBM MQ is temporarily down, the sink connector will keep trying to reconnect. Messages will still be written to Kafka. Once MQ is back up, the connector resumes posting messages to the queue automatically.
If MQ is up but a message fails to be committed (for example, because it exceeds a size limit), the connector stops processing further messages. After fixing the issue (for example, by increasing the maximum message size), restart the connector to resume message flow. Ensure the message size limits match on both the Kafka and MQ sides (see the sketch after this list).
The connector supports "exactly-once" delivery with specific configuration on both the Kafka and MQ sides, though this comes at a performance cost.
The connector can be configured to include Kafka topic, partition, and offset as JMS message properties.
You can run multiple connector tasks using the appropriate configuration. However, in failure scenarios, this may lead to duplicate messages in MQ.
By default, the sink connector posts messages as XML strings to MQ. Other formats are not handled natively by OMS.
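For the message-size note above, the limits live in several places; the sketch below shows where to look, with 4194304 bytes (4 MB, the MQ default) used only as an example value. On the IBM MQ side, inside runmqsc OM_QMGR:
ALTER QLOCAL(DEV.QUEUE.1) MAXMSGL(4194304)
ALTER QMGR MAXMSGL(4194304)
ALTER CHANNEL(SYSTEM.ADMIN.SVRCONN) CHLTYPE(SVRCONN) MAXMSGL(4194304)
On the Kafka side, the corresponding broker/topic limit is message.max.bytes (for example, message.max.bytes=4194304 in server.properties), and each record must also fit within the producer's max.request.size.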