
CrewAI integration with IBM AS/400 DB2 using `ibm_db` (Python)


The Legacy Challenge: AS/400 “Green Screens” meet AI Agents

IBM i (formerly AS/400) systems are the silent engines of global logistics and banking. They store critical data in DB2 for i, often locked behind “green screen” terminal interfaces or rigid internal protocols.

Connecting a modern AI agent (like CrewAI) to these systems is difficult because:

  1. Proprietary Protocols: DB2 for i uses DRDA (Distributed Relational Database Architecture), not standard REST.
  2. Driver Complexity: The ibm_db driver requires specific C-level dependencies often missing in modern containerized environments.
  3. Authentication: Legacy systems often require strict session management that stateless HTTP APIs don’t handle well natively.

This guide provides a FastMCP server that acts as a translation layer. It allows your CrewAI agents to run SQL queries against an AS/400 DB2 instance via a standardized MCP interface.


We use the Model Context Protocol (MCP) to decouple the heavy IBM drivers from your lightweight Agent runtime.

  1. MCP Server (Dockerized): Runs Python + `ibm_db`. Handles the connection lifecycle and the DRDA protocol.
  2. CrewAI Agent: Connects via Server-Sent Events (SSE). It “sees” tools like `execute_sql` and `describe_table` without needing to know what an AS/400 is.
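The decoupling in step 2 works because MCP standardizes tool discovery: on connect, the client issues a `tools/list` request and receives a JSON description of each tool. A simplified sketch of what the `execute_sql` entry looks like (field names follow the MCP specification; the schema shown here is abridged):

```python
# Simplified sketch of one entry from an MCP "tools/list" response.
# Field names follow the MCP specification; the JSON Schema is abridged.
tool_manifest = {
    "name": "execute_sql",
    "description": "Executes a read-only SQL query against the AS/400 DB2 database.",
    "inputSchema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}

# The agent only ever sees this contract -- never the DRDA protocol behind it.
print(tool_manifest["name"])
```

This contract is all the agent needs to decide when and how to call the tool.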

This server uses `fastmcp` to expose DB2 functions. Each tool opens a connection, runs its query, and closes the handle in a `try`/`finally` block so connections are never leaked.

Note: We bind to 0.0.0.0 to ensure the Docker container accepts external connections.

```python
import os
import json

import ibm_db
from fastmcp import FastMCP, Context

# Initialize the MCP Server
mcp = FastMCP("AS400-Gateway")

# Connection settings from environment variables
DB_NAME = os.getenv("IBM_DB_NAME", "BLUDB")
DB_HOST = os.getenv("IBM_DB_HOST", "192.168.1.10")
DB_PORT = os.getenv("IBM_DB_PORT", "50000")
DB_USER = os.getenv("IBM_DB_USER", "db2admin")
DB_PASS = os.getenv("IBM_DB_PASS", "password")


def get_db_connection():
    """
    Establishes a connection to the IBM i (AS/400) DB2 database.
    Returns the connection handle.
    """
    conn_str = (
        f"DATABASE={DB_NAME};"
        f"HOSTNAME={DB_HOST};"
        f"PORT={DB_PORT};"
        f"PROTOCOL=TCPIP;"
        f"UID={DB_USER};"
        f"PWD={DB_PASS};"
    )
    try:
        # ibm_db.connect returns a connection handle or raises an exception
        return ibm_db.connect(conn_str, "", "")
    except Exception as e:
        error_msg = ibm_db.conn_errormsg()
        raise ConnectionError(f"Failed to connect to AS/400: {error_msg}") from e


def _run_query(query: str, ctx: Context = None) -> str:
    """Shared query runner, so tools can reuse it without calling each other."""
    conn = None
    try:
        if ctx:
            ctx.info(f"Connecting to {DB_HOST}...")
        conn = get_db_connection()
        stmt = ibm_db.exec_immediate(conn, query)
        results = []
        row = ibm_db.fetch_assoc(stmt)
        while row:
            results.append(row)
            row = ibm_db.fetch_assoc(stmt)
        return json.dumps(results, default=str)
    except Exception as e:
        return f"Database Error: {e}"
    finally:
        if conn:
            ibm_db.close(conn)


@mcp.tool()
def execute_sql(query: str, ctx: Context = None) -> str:
    """
    Executes a read-only SQL query against the AS/400 DB2 database.
    Useful for fetching customer records, inventory status, or ledger entries.

    Args:
        query: The SQL SELECT statement to execute.
    """
    return _run_query(query, ctx)


@mcp.tool()
def describe_table(table_name: str) -> str:
    """
    Retrieves the schema definition for a specific table (Physical File).

    Args:
        table_name: The name of the table/file (e.g., 'LIBRARY.TABLE').
    """
    # This example assumes the standard DB2 for i catalog views (QSYS2)
    schema, table = table_name.split('.') if '.' in table_name else ('QSYS2', table_name)
    sql = f"""
        SELECT COLUMN_NAME, DATA_TYPE, LENGTH
        FROM QSYS2.SYSCOLUMNS
        WHERE TABLE_NAME = '{table}' AND TABLE_SCHEMA = '{schema}'
    """
    return _run_query(sql)


if __name__ == "__main__":
    # HOST must be 0.0.0.0 to accept connections from outside the Docker container
    mcp.run(transport='sse', host='0.0.0.0', port=8000)
```

The ibm_db library relies on system libraries that are often missing in Alpine Linux. We use python:3.11-slim (Debian-based) to ensure compatibility.

```dockerfile
# Use a Debian-based image for better ibm_db compatibility
FROM python:3.11-slim

# Install system dependencies
# libxml2 is sometimes required by underlying drivers
RUN apt-get update && apt-get install -y \
    gcc \
    libc-dev \
    libxml2 \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Install Python dependencies
# fastmcp for the server, ibm_db for the database driver
RUN pip install --no-cache-dir fastmcp ibm_db

# Copy application code
COPY server.py .

# Expose the SSE port
EXPOSE 8000

# Run the MCP server
CMD ["python", "server.py"]
```
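With the server saved as `server.py` next to this Dockerfile, a typical build-and-run sequence looks like the following (the image name `as400-mcp` is arbitrary; substitute your real connection settings for the placeholder values):

```shell
# Build the image (tag name is arbitrary)
docker build -t as400-mcp .

# Run it, mapping the SSE port and passing connection settings as env vars
docker run --rm -p 8000:8000 \
  -e IBM_DB_HOST=192.168.1.10 \
  -e IBM_DB_NAME=BLUDB \
  -e IBM_DB_USER=db2admin \
  -e IBM_DB_PASS=password \
  as400-mcp
```

The `-e` flags map directly to the `os.getenv` lookups at the top of `server.py`.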

To connect your CrewAI agent to this Dockerized server, configure the agent with the `mcps` parameter pointing to the SSE endpoint.

Note: If running the Agent locally and the Server in Docker, use http://localhost:8000/sse.

```python
from crewai import Agent, Task, Crew

# No extra tool imports needed; the Agent discovers tools via MCP

# 1. Define the Agent with MCP capabilities
db2_specialist = Agent(
    role='Legacy Data Analyst',
    goal='Extract and analyze supply chain data from the AS/400 system',
    backstory="""You are a specialist in legacy IBM systems.
    You know how to query DB2 for i to find inventory counts and order status.""",
    verbose=True,
    # Connect to the local Docker container
    mcps=["http://localhost:8000/sse"]
)

# 2. Define a Task
inventory_task = Task(
    description="""
    1. Check the schema of table 'LOGISTICS.INVENTORY'.
    2. Query the table to find all items with 'stock_count' < 10.
    3. Return a summary of low-stock items.
    """,
    expected_output="A list of low-stock items with their IDs and current counts.",
    agent=db2_specialist
)

# 3. Run the Crew
crew = Crew(
    agents=[db2_specialist],
    tasks=[inventory_task],
    verbose=True
)

result = crew.kickoff()
print(result)
```

Important Considerations for Enterprise Deployment

  1. Strict Filtering: The example above allows raw SQL (execute_sql). In a real production environment, you should replace this with specific function-based tools (e.g., get_order_by_id(id)) to prevent SQL injection or accidental table drops.
  2. License Management: The IBM clidriver embedded in ibm_db may require a DB2 Connect license file (db2consv_ee.lic) when connecting to DB2 for z/OS or DB2 for i. Ensure this file is copied into the Docker container if your connection fails with license errors.
  3. Encoding: AS/400 systems often default to EBCDIC. ibm_db handles the conversion to ASCII/UTF-8 automatically, but be aware of special characters in column names.
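As a stopgap before moving to fully parameterized tools, `execute_sql` can at least refuse anything that is not a single SELECT-style statement. A minimal sketch follows; the `is_read_only` helper is hypothetical (not part of `ibm_db` or `fastmcp`), and a keyword blocklist like this reduces accidents but is not a substitute for allow-listed, function-based tools:

```python
import re

# Hypothetical guard -- NOT part of ibm_db or fastmcp. A blocklist reduces
# accidents but is no substitute for allow-listed, parameterized tools.
_FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|MERGE|DROP|ALTER|CREATE|TRUNCATE|GRANT|REVOKE|CALL)\b",
    re.IGNORECASE,
)

def is_read_only(query: str) -> bool:
    """Accept only a single SELECT/WITH/VALUES statement."""
    q = query.strip().rstrip(";").strip()
    if ";" in q:  # reject statement batches
        return False
    if not q.lstrip("(").upper().startswith(("SELECT", "WITH", "VALUES")):
        return False
    return not _FORBIDDEN.search(q)
```

Inside the tool, the query would be checked with `is_read_only(query)` before `ibm_db.exec_immediate` is ever called, returning an error string to the agent otherwise.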

  • Status: ✅ Verified
  • Environment: Python 3.11
  • Auditor: AgentRetrofit CI/CD