IBM Developer

Article

Simplifying LLM Integration with MCP and API Connect for GraphQL

Explore a standardized way to connect data to any LLM

By Carlos Eberhardt, Roy Derks

In the rapidly evolving landscape of AI development, large language models (LLMs) have become powerful tools for creating intelligent applications. However, a persistent challenge has been connecting these models to the diverse data sources they need to provide context-aware responses. Today, this integration becomes dramatically simpler by combining the Model Context Protocol (MCP) and GraphQL.

The integration challenge

LLMs like Claude, GPT-4, Granite, and others have impressive reasoning capabilities, but they're most valuable when they can access relevant data from your systems. Unfortunately, connecting LLMs to these data sources has typically required:

  • Building custom integration code for each data source
  • Managing complex authentication and security concerns
  • Creating and maintaining separate endpoints for each integration
  • Rebuilding integrations when switching between LLM providers

This complexity has been a significant barrier to creating truly useful AI applications. What developers need is a standardized way to connect their data to any LLM.

MCP: The USB-C for AI applications

The Model Context Protocol (MCP) has emerged as a solution to this problem. Think of MCP like a USB-C port for AI applications: a standardized way for LLMs to connect with various data sources and tools.

MCP follows a client-server architecture:

  • MCP Hosts: Applications like Claude Desktop or AI-enhanced IDEs that need to access data.
  • MCP Clients: Protocol clients that maintain connections with servers.
  • MCP Servers: Lightweight programs exposing capabilities through the standardized protocol.
  • Data Sources: Your files, databases, and services that MCP servers can access.

The protocol standardizes how applications provide context to LLMs, making it easier to build complex AI workflows while maintaining the flexibility to switch between LLM providers.
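Under this client-server architecture, hosts and servers exchange JSON-RPC 2.0 messages. As a minimal sketch, a client discovering a server's tools might exchange messages like the following (the method name follows the MCP specification; the tool entry shown is illustrative, not from a real server):

```python
import json

# Request an MCP client sends to discover a server's tools (JSON-RPC 2.0)
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# A server's reply: each tool carries a name, a description the LLM reads,
# and a JSON Schema for its arguments. This tool entry is illustrative.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "customers-tool",
                "description": "Lookup information about customers",
                "inputSchema": {"type": "object"},
            }
        ]
    },
}

print(json.dumps(list_request))
```

Because every server speaks this same wire format, a host can swap one data source for another without changing its integration code.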

API Connect for GraphQL for MCP server deployment

While MCP provides the standard, implementing it still requires work. That's where API Connect for GraphQL comes in. We've developed a system that allows you to:

  1. Take multiple data sources and combine them into a single endpoint.
  2. Deploy that endpoint to a serverless hosting solution.
  3. Automatically expose it as an MCP server.

The magic happens through API Connect for GraphQL, which enables you to federate multiple data sources through GraphQL and then expose them via MCP with minimal effort.

How it works

API Connect for GraphQL handles the complexity of data integration through a few key components:

  1. Data Source Connection: Connect to virtually any data source, including:

    • Databases (PostgreSQL, MySQL, MSSQL, Oracle, MongoDB, Presto/Trino)
    • REST or SOAP endpoints
    • GraphQL APIs
  2. GraphQL Federation: We convert all backend sources to GraphQL, providing a unified query language that LLMs naturally understand.

  3. MCP Server Generation: The federated GraphQL API is automatically exposed as an MCP server with configurable tools.

  4. Tool Definition Control: Fine-tune exactly what parts of your data are exposed through the @tool custom directive.

Here's how the tool definition works:

extend schema
  # A tool that only exposes Query.customers
  @tool(
    name: "customers-tool"
    description: "Lookup information about customers: {graphql_tool}"
    graphql: {expose: true, types: "Query", fields: "customers"}
  )

This directive allows you to create precisely scoped tools that expose only the data you want, with the descriptions LLMs need to use them effectively.

A simple example: Places and events integration

Let's walk through a practical example that demonstrates the power of this approach. We'll create an MCP server that integrates place data with event information, allowing an LLM to answer questions like "What restaurants are near the concert venue?" or "Are there any events happening near this coffee shop this weekend?"

Step 1: Connect your data sources

First, we'll connect to two different APIs:

  • A Places API (like Google Places)
  • An Events API (like Ticketmaster)

With API Connect for GraphQL, this is as simple as creating connection configurations:

type Query {
  # Places API
  placesByLocation(lat: Float!, lng: Float!, radius: Int!, type: String): [Place!]!
    @rest(
      endpoint: "https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=$lat,$lng&radius=$radius&type=$type&key=YOUR_API_KEY"
    )

  # Events API
  eventsByDateAndLocation(startDate: String!, endDate: String, lat: Float!, lng: Float!, radius: Int!): [Event!]!
    @rest(
      endpoint: "https://app.ticketmaster.com/discovery/v2/events.json?latlong=$lat,$lng&radius=$radius&startDateTime=$startDate&endDateTime=$endDate&apikey=YOUR_API_KEY"
    )
}

In the code example above, the custom @rest directive connects each field to a REST API. Connectors exist for many other data sources as well, including additional API standards and a wide range of databases.
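Conceptually, @rest interpolates the field's arguments into the endpoint template at request time. The following Python sketch illustrates that substitution, under the assumption that the placeholders behave like $-style template variables (this is not the actual API Connect implementation, and the URL is a stand-in):

```python
from string import Template  # stdlib; supports $name placeholders


def expand_endpoint(template: str, args: dict) -> str:
    """Substitute $placeholders in a @rest-style endpoint template
    with the GraphQL field's argument values."""
    return Template(template).safe_substitute(
        {key: str(value) for key, value in args.items()}
    )


# Hypothetical endpoint template mirroring the placesByLocation example
endpoint = "https://example.com/places?location=$lat,$lng&radius=$radius"
url = expand_endpoint(endpoint, {"lat": 41.8781, "lng": -87.6298, "radius": 5000})
print(url)  # https://example.com/places?location=41.8781,-87.6298&radius=5000
```

The point is that each GraphQL argument maps one-to-one onto a placeholder in the endpoint string, so the schema alone describes the whole REST call.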

Step 2: Define your types

Now we'll define the GraphQL types for our data:

type Place {
  id: ID!
  name: String!
  location: Location!
  types: [String!]!
  rating: Float
  openingHours: [OpenHours!]
}

type Event {
  id: ID!
  name: String!
  venue: Venue!
  startTime: DateTime!
  endTime: DateTime
  category: String
}

type Location {
  latitude: Float!
  longitude: Float!
}

type Venue {
  id: ID!
  name: String!
  location: Location!
}

# Additional types omitted for brevity

You can also use the API Connect for GraphQL CLI to connect to a data source, which autogenerates the GraphQL types for you. See the documentation for more information.

Step 3: Create enhanced relationships

The real power comes when we connect these types:

extend type Place {
  nearbyEvents(radius: Int!, startDate: String!, endDate: String): [Event!]!
    @materializer(
      query: "eventsByDateAndLocation",
      arguments: [
        { name: "lat", field: "location.latitude" },
        { name: "lng", field: "location.longitude" },
        { name: "radius", argument: "radius" },
        { name: "startDate", argument: "startDate" },
        { name: "endDate", argument: "endDate" }
      ]
    )
}

extend type Event {
  nearbyPlaces(radius: Int!, type: String): [Place!]!
    @materializer(
      query: "placesByLocation",
      arguments: [
        { name: "lat", field: "venue.location.latitude" },
        { name: "lng", field: "venue.location.longitude" },
        { name: "radius", argument: "radius" },
        { name: "type", argument: "type" }
      ]
    )
}

These extensions create relationships between places and events based on location, allowing natural traversal between the types.
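Conceptually, @materializer fills each mapped argument by walking the parent object along the dotted field path before calling the target query. A hypothetical Python sketch of that lookup (the real resolver is internal to API Connect for GraphQL):

```python
from functools import reduce


def resolve_path(obj: dict, path: str):
    """Walk a dotted field path like 'venue.location.latitude'
    through nested dictionaries, mirroring the `field:` mapping."""
    return reduce(lambda acc, key: acc[key], path.split("."), obj)


# A parent Event object, as the materializer for nearbyPlaces would see it
event = {"venue": {"location": {"latitude": 41.8781, "longitude": -87.6298}}}

# Build the argument map for placesByLocation from the parent's fields
args = {
    "lat": resolve_path(event, "venue.location.latitude"),
    "lng": resolve_path(event, "venue.location.longitude"),
}
print(args)  # {'lat': 41.8781, 'lng': -87.6298}
```

Arguments marked with `argument:` instead of `field:` are simply passed through from the caller, which is how `radius` and `type` reach the underlying query.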

Step 4: Define your MCP tool

Now we'll create an MCP tool that exposes our integrated API:

extend schema
  @tool(
    name: "city-explorer"
    description: "Find places and events in cities, including relationships between them. Use this for questions about finding places near events or events near places."
    graphql: {expose: true, types: "Query", fields: "placesByLocation|eventsByDateAndLocation"}
  )

By default, the MCP server also includes a generic GraphQL tool that exposes the complete schema and lets clients query it directly. The @tool definition above, however, gives you finer control over what the tool can return.

Step 5: Deploy your MCP server

With API Connect for GraphQL, deployment is as simple as running:

stepzen deploy

This deploys your federated GraphQL API as a serverless endpoint and automatically makes it available as an MCP server.

Step 6: Connect to an LLM

Now your LLM can use this tool to answer complex questions. For example, if a user asks:

"Are there any concerts near Italian restaurants in Chicago this weekend?"

The LLM can formulate the appropriate queries:

{
  placesByLocation(lat: 41.8781, lng: -87.6298, radius: 5000, type: "restaurant") {
    id
    name
    types
    nearbyEvents(radius: 1000, startDate: "2023-08-19", endDate: "2023-08-20") {
      name
      startTime
      venue {
        name
      }
    }
  }
}

This query becomes the input to a tool on the MCP server, which executes it against the federated API and returns the requested data.
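On the wire, a query like the one above is carried inside an MCP tools/call message. A rough sketch follows; the method name comes from the MCP specification, but the "query" argument key is an assumption, so check the generated tool's input schema for the actual name:

```python
# JSON-RPC 2.0 message an MCP client would send to invoke the tool.
# The "query" argument key is illustrative, not confirmed by the article.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "city-explorer",
        "arguments": {
            "query": '{ placesByLocation(lat: 41.8781, lng: -87.6298, '
                     'radius: 5000, type: "restaurant") { id name } }'
        },
    },
}
print(call_request["method"])  # tools/call
```

The LLM never sees the REST endpoints or database connections behind the schema; it only composes GraphQL against the tool's exposed types.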

Advanced control with field visibility

Sometimes you need more granular control over what data is exposed in your tools. The @tool directive allows you to precisely define the visible parts of your schema:

extend schema
  @tool(
    name: "employee-directory"
    description: "Search for employees and their public information"
    graphql: [
      {expose: true, types: "Query", fields: "employee|department"},
      {expose: false, types: "Employee", fields: "salary|ssn|homeAddress"}
    ]
  )

This creates a tool that can access employee and department data but explicitly excludes sensitive fields like salary, SSN, and home address.

You can even create multiple tools from the same backend, each with different visibility scopes for different use cases or security requirements.
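One way to picture these visibility rules is as an ordered filter over (type, field) pairs: start from the exposed schema, then strip anything covered by an `expose: false` entry. A simplified Python sketch of that filtering logic (not the production implementation):

```python
def apply_visibility(schema: dict, rules: list) -> dict:
    """Start from the full {type: [fields]} schema, then remove any
    field matched by an expose: false rule. Rules use the same
    pipe-separated syntax as the @tool directive."""
    visible = {type_name: list(fields) for type_name, fields in schema.items()}
    for rule in rules:
        if rule["expose"]:
            continue  # exposed fields are already present in the schema
        hidden = rule["fields"].split("|")
        for type_name in rule["types"].split("|"):
            visible[type_name] = [
                f for f in visible.get(type_name, []) if f not in hidden
            ]
    return visible


# Mirrors the employee-directory example above
schema = {
    "Query": ["employee", "department"],
    "Employee": ["name", "salary", "ssn", "homeAddress"],
}
rules = [
    {"expose": True, "types": "Query", "fields": "employee|department"},
    {"expose": False, "types": "Employee", "fields": "salary|ssn|homeAddress"},
]
print(apply_visibility(schema, rules))
```

Running this leaves `Query.employee`, `Query.department`, and `Employee.name` visible while the sensitive fields disappear from the tool's view of the schema.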

Summary and next steps

This approach to LLM integration offers several key benefits:

  1. Simplicity: Connect multiple data sources without complex custom code.
  2. Standardization: Use the MCP protocol for consistent integration.
  3. Security: Control exactly what data is exposed.
  4. Flexibility: Switch between LLM providers as needed.

In future articles, we'll explore more complex examples, including automation use cases like PR review assistants and support ticket triage systems.

To get started with API Connect for GraphQL for MCP server deployment, read our @tool directive documentation or register for a free trial.

Also, explore the MCP tools provided by IBM, which include reusable components and reference implementations that you can adapt for your own projects.

We can't wait to see how you'll use this technology to make LLMs truly useful for your specific needs.