This is a cache of https://developer.ibm.com/tutorials/local-ai-agent-workflow-mcp-watsonx-orchestrate/. It is a snapshot of the page as it appeared on 2025-11-14T12:59:59.295+0000.
Set up an end-to-end use case for local AI agents with MCP server and watsonx Orchestrate - IBM Developer
Integrating agents with tools, data, and workflows often requires extra code and complex setup. The Model Context Protocol (MCP) simplifies this process by standardizing how AI agents connect to services, making integration with existing systems easier.
Containers and Docker Compose (including networking)
Working with GitHub
Tools
Git installed
Python 3.11–3.13 (supported by ADK)
GitHub account
Developer computer (~16 GB RAM)
Container engine: Docker Desktop, Rancher Desktop, or Colima
Note: Ensure that your container engine is running before you start the setup. Initial image pulls might take a few minutes.
Architecture of a local AI Agent workflow
This tutorial uses a local MCP server from the Galaxium Travels infrastructure and connects it to watsonx Orchestrate Developer Edition using Docker Compose.
The Galaxium Travels Example is an open source project that helps you build and test AI-driven applications by using REST APIs, an MCP server, and containerized environments. Licensed under the Apache 2.0 license, it is ideal for experimentation and learning.
This example represents a realistic enterprise setup where systems expose REST APIs that are used by front ends, backend services, and AI agents. It includes:
A UI that uses a booking REST API backend through OpenAPI.
A booking system MCP backend built with FastMCP for AI agent integration.
A Human Resources (HR) system that exposes APIs without a front end.
A modular design that separates traditional and AI components.
Note: Refer to the GitHub repository's README for the complete list of ports and service links.
The FastMCP backend operates independently of the REST API, allowing AI agents to interact with it directly, similar to real-world production environments.
The example infrastructure setup
You will use the local development Docker Compose stack that is included in watsonx Orchestrate Developer Edition. This setup includes:
The MCP server exposing Galaxium tools such as list_flights.
Why MCP rather than plain REST? REST endpoints need extra wrappers and schemas before an LLM can reason over them. MCP, by contrast, defines tools, prompts, and schemas for LLMs directly, with built-in versioning and a standardized transport.
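To make that difference concrete, a tool exposed over MCP carries its own JSON Schema, so an agent can discover how to call it without any wrapper code. A hypothetical tool description in the shape MCP uses (the tool name and fields are illustrative, not taken from the Galaxium server):

```json
{
  "name": "list_flights",
  "description": "List available Galaxium flights, optionally filtered by destination.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "destination": {
        "type": "string",
        "description": "Destination to filter by, for example 'Moon'"
      }
    },
    "required": []
  }
}
```

The LLM reads this schema at runtime, which is why no per-endpoint wrapper needs to be written or maintained.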
Monitoring with Langfuse:
Use Langfuse to trace tool usage, latency, and errors. This helps during local testing and debugging.
Troubleshooting tips:
Containers not starting: Increase Docker CPU or RAM.
MCP Inspector not connecting: Check server URL, port, and network settings.
Tool import issues: Ensure the connection points to the service name, not localhost.
LiteChat not showing results: Verify MCP server and agent tool configuration.
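The "service name, not localhost" point follows from Docker Compose networking: containers reach each other by service name on the Compose network, while localhost inside a container refers to that container itself. A hypothetical Compose fragment (the service names are illustrative, not the actual Galaxium Compose file):

```yaml
services:
  mcp-server:            # hypothetical service name
    build: ./mcp
    ports:
      - "8080:8080"      # from the host: http://localhost:8080

  orchestrate-agent:     # hypothetical service name
    build: ./agent
    environment:
      # From inside the Compose network, address the MCP server
      # by its service name, not localhost:
      MCP_SERVER_URL: "http://mcp-server:8080"
```

If a tool import fails with a connection error, checking that the configured URL uses the service name is usually the first fix to try.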
Summary and next steps
With your environment set up, you can now test and refine agents and tools, monitor interactions in Langfuse, and expand your setup as needed. As next steps:
Add another MCP tool, such as book_flight, and connect it to your agent.
Compare agent performance between REST-based and MCP-based tools.
Explore authentication and security options for MCP servers before using them in production.
Acknowledgments
This tutorial was produced as part of the IBM Open Innovation Community initiative: Agentic AI (AI for Developers and Ecosystem).
The authors deeply appreciate Ahmed Azraq and Bindu Umesh for their guidance and expertise in reviewing and contributing to this tutorial.