Skip to content

LangGraph agents querying IBM AS/400 DB2 with `ibm_db`

Legacy IBM iSeries (AS/400) systems run the backbone of global logistics and banking, yet they remain notoriously difficult to access for modern AI agents. The native protocols are opaque, and modern REST APIs are often missing.

This guide provides a production-ready Model Context Protocol (MCP) server to bridge LangGraph agents with IBM DB2 on iSeries using the native ibm_db driver.

Instead of trying to teach a LangChain or LangGraph agent how to compile C-extensions for DB2, we containerize the connection logic. The Agent communicates via the standardized MCP protocol, while the container handles the heavy lifting of talking to the “Big Iron.”

  1. FastMCP Server: Exposes SQL execution capabilities as tools.
  2. ibm_db Driver: The official IBM driver (requires specific system dependencies).
  3. LangGraph Node: A Python client that consumes the MCP tools.

🛠️ Step 1: The Bridge Code (server.py)

Section titled “🛠️ Step 1: The Bridge Code (server.py)”

This server exposes two critical tools: query_as400 for data retrieval and get_schema for agent self-discovery.

import os
import ibm_db
import json
from fastmcp import FastMCP
# Initialize FastMCP
mcp = FastMCP("AS400_Gateway")
def get_db_connection():
"""
Establishes a connection to the IBM i (AS/400) DB2 database.
Requires environment variables: DB_HOST, DB_NAME, DB_USER, DB_PASS.
"""
db_host = os.getenv("DB_HOST")
db_name = os.getenv("DB_NAME")
db_user = os.getenv("DB_USER")
db_pass = os.getenv("DB_PASS")
db_port = os.getenv("DB_PORT", "50000")
# Standard connection string for TCP/IP connection to iSeries
conn_str = (
f"DATABASE={db_name};"
f"HOSTNAME={db_host};"
f"PORT={db_port};"
f"PROTOCOL=TCPIP;"
f"UID={db_user};"
f"PWD={db_pass};"
)
try:
conn = ibm_db.connect(conn_str, "", "")
return conn
except Exception as e:
error_msg = ibm_db.conn_errormsg()
raise RuntimeError(f"Failed to connect to AS/400: {e} - {error_msg}")
@mcp.tool()
def query_as400(sql_query: str) -> str:
"""
Executes a read-only SQL query against the IBM AS/400 DB2 database.
Use this to fetch live data like inventory, order status, or customer records.
Args:
sql_query: A valid SQL SELECT statement.
"""
conn = None
try:
conn = get_db_connection()
stmt = ibm_db.exec_immediate(conn, sql_query)
results = []
dictionary = ibm_db.fetch_assoc(stmt)
while dictionary != False:
results.append(dictionary)
dictionary = ibm_db.fetch_assoc(stmt)
return json.dumps(results, default=str)
except Exception as e:
return f"Error executing query: {str(e)}"
finally:
if conn:
ibm_db.close(conn)
@mcp.tool()
def get_table_schema(table_name: str) -> str:
"""
Retrieves the column definitions for a specific table in QSYS2.
Useful for the agent to understand data structure before querying.
"""
conn = None
try:
conn = get_db_connection()
# QSYS2.SYSCOLUMNS is the standard catalog for iSeries DB2
sql = f"""
SELECT COLUMN_NAME, DATA_TYPE, LENGTH, COLUMN_TEXT
FROM QSYS2.SYSCOLUMNS
WHERE TABLE_NAME = '{table_name.upper()}'
"""
stmt = ibm_db.exec_immediate(conn, sql)
columns = []
row = ibm_db.fetch_assoc(stmt)
while row != False:
columns.append(row)
row = ibm_db.fetch_assoc(stmt)
if not columns:
return f"No columns found for table {table_name}. Check capitalization."
return json.dumps(columns, default=str)
except Exception as e:
return f"Error fetching schema: {str(e)}"
finally:
if conn:
ibm_db.close(conn)
if __name__ == "__main__":
mcp.run()

IBM’s ibm_db driver has compiled C-dependencies. Using a Docker container is the safest way to ensure portability across agent environments.

Note: We expose port 8000 to allow this to run as a web service (e.g., on Railway or AWS ECS) compatible with MCP HTTP transport protocols.

# Use a slim Python base that supports wheel builds
FROM python:3.11-slim
# Install system dependencies required for ibm_db
# libxml2 is often needed for DB2 client libraries
RUN apt-get update && apt-get install -y \
gcc \
libxml2 \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /app
# Install Python dependencies
# fastmcp handles the MCP protocol
# ibm_db is the official driver
RUN pip install --no-cache-dir fastmcp ibm_db
# Copy server code
COPY server.py .
# Environment variables should be injected at runtime
# ENV DB_HOST=as400.example.com
# ENV DB_NAME=BIGDB
# ENV DB_USER=QSECOFR
# Expose the port for HTTP/SSE transport
EXPOSE 8000
# FastMCP CLI can serve the file over SSE on port 8000
CMD ["fastmcp", "run", "server.py", "--transport", "sse", "--port", "8000", "--host", "0.0.0.0"]

Once your Docker container is running (e.g., at http://localhost:8000/sse), you can connect a LangGraph agent to it.

from langgraph.prebuilt import create_react_agent
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
async def run_agent():
# 1. Connect to the MCP Server
async with MultiServerMCPClient() as client:
await client.connect_to_server(
"as400_server",
url="http://localhost:8000/sse"
)
# 2. Extract tools automatically
tools = client.get_tools()
# 3. Create LangGraph Agent
model = ChatOpenAI(model="gpt-4-turbo")
agent = create_react_agent(model, tools)
# 4. Invoke
response = await agent.ainvoke({
"messages": [
("user", "Check the QSYS2 schema for table 'ORDERS' and tell me the column names.")
]
})
print(response["messages"][-1].content)
  1. EBCDIC Encoding: The ibm_db driver handles EBCDIC to ASCII conversion automatically in most cases, but watch out for CCSID errors if your database is configured with an obscure locale.
  2. Journaling: UPDATE or INSERT operations (not covered in this read-only guide) often fail on AS/400 if the target table is not journaled.
  3. Connection Limits: Legacy AS/400 systems often have strict limits on concurrent JDBC/ODBC connections. Ensure your agent closes connections (the finally block in our code handles this).

  • Status: ✅ Verified
  • Environment: Python 3.11
  • Auditor: AgentRetrofit CI/CD

Transparency: This page may contain affiliate links.