This is a cache of https://developer.ibm.com/tutorials/local-ai-agent-workflow-mcp-watsonx-orchestrate/. It is a snapshot of the page as it appeared on 2025-11-14T12:59:59.295+0000.

Tutorial

Set up an end-to-end use case for local AI agents with MCP server and watsonx Orchestrate

Learn how to connect a local MCP server with watsonx Orchestrate Developer Edition to build, test, and monitor AI agents using Docker Compose

By Thomas Suedbroecker and Maximilian Jesch

Integrating agents with tools, data, and workflows often requires extra code and complex setup. Model Context Protocol (MCP) simplifies this process by standardizing how AI agents connect to services, making integration easier across existing systems.

In this tutorial, learn how to connect a local MCP server from the Galaxium Travels example to watsonx Orchestrate Developer Edition by using Docker Compose, and test agent-to-tool interaction with Langfuse observability.

Prerequisites

Technical knowledge

  • Python packaging and virtual environments
  • Containers and Docker Compose (including networking)
  • Working with GitHub

Tools

  • Git installed
  • Python 3.11–3.13 (supported by ADK)
  • GitHub account
  • Developer computer (~16 GB RAM)
  • Container engine: Docker Desktop, Rancher Desktop, or Colima
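Before installing the ADK, it can help to confirm your Python interpreter falls in the supported range. The following sketch (the `adk_supported` helper is hypothetical, written only to encode the 3.11–3.13 range stated above) shows one way to check:

```python
import sys

def adk_supported(version: tuple[int, ...]) -> bool:
    """Check a Python version against the ADK-supported range 3.11-3.13
    stated in the prerequisites."""
    return (3, 11) <= version[:2] <= (3, 13)

# Print whether the current interpreter is in the supported range.
print(adk_supported(sys.version_info[:2]))
```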

Note: Ensure that your container engine is running before starting the setup. The first image pulls might take a few minutes.

Architecture of a local AI Agent workflow

This tutorial uses a local MCP server from the Galaxium Travels infrastructure and connects it to watsonx Orchestrate Developer Edition using Docker Compose.

Figure: Architecture of a local AI agent workflow

The Galaxium Travels Example is an open source project that helps you build and test AI-driven applications by using REST APIs, an MCP server, and containerized environments. Licensed under Apache 2.0, it is ideal for experimentation and learning.

This example represents a realistic enterprise setup where systems expose REST APIs that are used by front ends, backend services, and AI agents. It includes:

  • A UI that uses a booking REST API backend through OpenAPI.
  • A booking system MCP backend built with FastMCP for AI agent integration.
  • A Human Resources (HR) system that exposes APIs without a front end.
  • A modular design that separates traditional and AI components.

Note: Refer to the GitHub repository's README for the complete list of ports and service links.

The FastMCP backend operates independently of the REST API, allowing AI agents to interact with it directly, similar to real-world production environments.

The example infrastructure setup

You will use the local development Docker Compose stack that is included in watsonx Orchestrate Developer Edition.

Figure: Local development with Docker Compose

The components in the Galaxium Travels Example can run in multiple ways:

  • As stand-alone Python applications.
  • As containerized services.
  • As Docker Compose stacks.
  • As deployments on IBM Cloud Code Engine.

In this tutorial, you will use Docker Compose for an easy and portable setup.

Video walkthrough and code repository

Watch the video for a full walkthrough of the following steps:

  1. Installing the Galaxium Travels infrastructure.
  2. Installing watsonx Orchestrate Agent Development Kit (ADK) and starting watsonx Orchestrate Developer Edition in Docker.
  3. Inspecting the MCP server locally with MCP inspector.
  4. Importing a tool (for example, list_flights) into watsonx Orchestrate.
  5. Creating the Galaxium Traveler Agent.
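When you inspect the MCP server in step 3, MCP Inspector enumerates the server's tools over JSON-RPC 2.0. A minimal sketch of that request, built with the Python standard library (the `tools/list` method name comes from the MCP specification; the request `id` is arbitrary):

```python
import json

# JSON-RPC 2.0 request that an MCP client such as MCP Inspector sends
# to enumerate the tools a server exposes.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}
payload = json.dumps(request)
print(payload)
```

The server's response lists each tool with its name, description, and input schema, which is what watsonx Orchestrate uses when importing a tool such as list_flights.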



Source code repository

All example setup scripts and instructions are available in the GitHub repository.

By the end, you will have a working local setup similar to the GitHub repository and walkthrough video, ready for experimentation.

Key takeaways

  • MCP versus REST APIs:

    • REST endpoints need extra wrappers and schema descriptions before an LLM can reason over them.
    • MCP defines tools, prompts, and schemas for LLMs, with built-in versioning and standardized transport.
  • Monitoring with Langfuse:

    • Use Langfuse to trace tool usage, latency, and errors. This helps during local testing and debugging.
  • Troubleshooting tips:

    • Containers not starting: Increase Docker CPU or RAM.
    • MCP Inspector not connecting: Check server URL, port, and network settings.
    • Tool import issues: Ensure the connection points to the service name, not localhost.
    • LiteChat not showing results: Verify MCP server and agent tool configuration.
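The service-name-versus-localhost pitfall comes from Docker networking: inside the watsonx Orchestrate containers, "localhost" refers to the container itself, so the MCP server must be addressed by its Docker Compose service name. The hostname "galaxium-mcp" and port 8001 below are hypothetical placeholders, not values from the repository:

```python
from urllib.parse import urlparse

# Works from the host machine, but NOT from inside another container:
host_url = "http://localhost:8001/mcp"

# Works between containers on the same Docker Compose network, because
# Compose resolves the service name to the container's address:
compose_url = "http://galaxium-mcp:8001/mcp"

print(urlparse(compose_url).hostname)
```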

Summary and next steps

With your environment set up, you can now test and refine agents and tools, monitor interactions in Langfuse, and expand your setup as needed. As next steps:

  • Add another MCP tool, such as book flight, and connect it to your agent.
  • Compare agent performance between REST-based and MCP-based tools.
  • Explore authentication and security options for MCP servers before using them in production.

Acknowledgments

This tutorial was produced as part of the IBM Open Innovation Community initiative: Agentic AI (AI for Developers and Ecosystem).

The authors deeply appreciate the guidance and expertise of Ahmed Azraq and Bindu Umesh in reviewing and contributing to this tutorial.