# CrewAI integration with IBM AS/400 DB2 using `ibm_db` (Python)
## The Legacy Challenge: AS/400 “Green Screens” meet AI Agents

IBM i (formerly AS/400) systems are the silent engines of global logistics and banking. They store critical data in DB2 for i, often locked behind “green screen” terminal interfaces or rigid internal protocols.
Connecting a modern AI agent (like CrewAI) to these systems is difficult because:
- Proprietary Protocols: DB2 for i uses DRDA (Distributed Relational Database Architecture), not standard REST.
- Driver Complexity: The `ibm_db` driver requires specific C-level dependencies often missing in modern containerized environments.
- Authentication: Legacy systems often require strict session management that stateless HTTP APIs don’t handle well natively.
This guide provides a FastMCP server that acts as a translation layer. It allows your CrewAI agents to run SQL queries against an AS/400 DB2 instance via a standardized MCP interface.
## 🏗️ Architecture

We use the Model Context Protocol (MCP) to decouple the heavy IBM drivers from your lightweight Agent runtime.

- MCP Server (Dockerized): Runs Python + `ibm_db`. Handles connection pooling and the DRDA protocol.
- CrewAI Agent: Connects via Server-Sent Events (SSE). It “sees” tools like `execute_sql` and `describe_table` without needing to know what an AS/400 is.
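Under the hood, MCP tool invocations travel as JSON-RPC 2.0 messages. This is a sketch of the request an MCP client emits when it invokes `execute_sql` on the gateway; the message shape follows the MCP specification, and the sample query is purely illustrative:

```python
import json

# Hypothetical wire message: what an MCP client sends for a "tools/call"
# request. The agent never needs to build this by hand; the MCP client
# library does it automatically.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "execute_sql",
        "arguments": {"query": "SELECT 1 FROM SYSIBM.SYSDUMMY1"},
    },
}

wire = json.dumps(request)
print(wire)
```

The server replies with a matching JSON-RPC response containing the tool's string result, which CrewAI then feeds back into the agent's reasoning loop.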
## 🛠️ The Server (`server.py`)

This server uses `fastmcp` to expose DB2 functions. The connection lifecycle is handled safely: the connection is closed in a `finally` block even when a query fails.

Note: We bind to `0.0.0.0` to ensure the Docker container accepts external connections.
```python
import os
import json

import ibm_db
from fastmcp import FastMCP, Context

# Initialize the MCP Server
mcp = FastMCP("AS400-Gateway")

# Connection settings from environment variables
DB_NAME = os.getenv("IBM_DB_NAME", "BLUDB")
DB_HOST = os.getenv("IBM_DB_HOST", "192.168.1.10")
DB_PORT = os.getenv("IBM_DB_PORT", "50000")
DB_USER = os.getenv("IBM_DB_USER", "db2admin")
DB_PASS = os.getenv("IBM_DB_PASS", "password")


def get_db_connection():
    """
    Establishes a connection to the IBM i (AS/400) DB2 database.
    Returns the connection handle.
    """
    conn_str = (
        f"DATABASE={DB_NAME};"
        f"HOSTNAME={DB_HOST};"
        f"PORT={DB_PORT};"
        f"PROTOCOL=TCPIP;"
        f"UID={DB_USER};"
        f"PWD={DB_PASS};"
    )
    try:
        # ibm_db.connect returns a connection handle or raises an exception
        conn = ibm_db.connect(conn_str, "", "")
        return conn
    except Exception:
        error_msg = ibm_db.conn_errormsg()
        raise ConnectionError(f"Failed to connect to AS/400: {error_msg}")


@mcp.tool()
def execute_sql(query: str, ctx: Context = None) -> str:
    """
    Executes a read-only SQL query against the AS/400 DB2 database.
    Useful for fetching customer records, inventory status, or ledger entries.

    Args:
        query: The SQL SELECT statement to execute.
    """
    conn = None
    try:
        if ctx:
            ctx.info(f"Connecting to {DB_HOST}...")

        conn = get_db_connection()
        stmt = ibm_db.exec_immediate(conn, query)

        # Accumulate rows as dictionaries until fetch_assoc returns False
        results = []
        row = ibm_db.fetch_assoc(stmt)
        while row:
            results.append(row)
            row = ibm_db.fetch_assoc(stmt)

        return json.dumps(results, default=str)
    except Exception as e:
        return f"Database Error: {str(e)}"
    finally:
        if conn:
            ibm_db.close(conn)


@mcp.tool()
def describe_table(table_name: str) -> str:
    """
    Retrieves the schema definition for a specific table (Physical File).

    Args:
        table_name: The name of the table/file (e.g., 'LIBRARY.TABLE').
    """
    # Simple query to the catalog; the exact views depend on the IBM i version.
    # This example assumes standard DB2 catalog views.
    schema, table = table_name.split('.') if '.' in table_name else ('QSYS2', table_name)

    sql = f"""
        SELECT COLUMN_NAME, DATA_TYPE, LENGTH
        FROM QSYS2.SYSCOLUMNS
        WHERE TABLE_NAME = '{table}'
          AND TABLE_SCHEMA = '{schema}'
    """
    return execute_sql(sql)


if __name__ == "__main__":
    # HOST must be 0.0.0.0 to work inside Docker
    mcp.run(transport='sse', host='0.0.0.0', port=8000)
```
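The qualified-name parsing inside `describe_table` is worth isolating so it can be tested without a database. A minimal standalone sketch (the helper name is ours, not part of the server above):

```python
def split_qualified_name(name, default_schema="QSYS2"):
    """Mirror describe_table's parsing: 'LIBRARY.TABLE' -> (schema, table);
    an unqualified name falls back to the default schema."""
    if "." in name:
        schema, table = name.split(".", 1)
        return schema, table
    return default_schema, name

print(split_qualified_name("LOGISTICS.INVENTORY"))  # ('LOGISTICS', 'INVENTORY')
print(split_qualified_name("INVENTORY"))            # ('QSYS2', 'INVENTORY')
```

Note that the inline one-liner in `describe_table` would fail on names with more than one dot; factoring the logic out like this makes that edge case easier to guard against.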
## 🐳 Dockerfile

The `ibm_db` library relies on system libraries that are often missing in Alpine Linux. We use `python:3.11-slim` (Debian-based) to ensure compatibility.
```dockerfile
# Use a Debian-based image for better ibm_db compatibility
FROM python:3.11-slim

# Install system dependencies
# libxml2 is sometimes required by underlying drivers
RUN apt-get update && apt-get install -y \
    gcc \
    libc-dev \
    libxml2 \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Install Python dependencies
# fastmcp for the server, ibm_db for the database
RUN pip install --no-cache-dir fastmcp ibm_db

# Copy application code
COPY server.py .

# Expose the SSE port
EXPOSE 8000

# Run the MCP server
CMD ["python", "server.py"]
```
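Assuming the image is tagged `as400-mcp` (our name, not from the original), a typical build-and-run sequence might look like this; the environment values are placeholders for your own host and credentials:

```shell
# Build the image from the directory containing the Dockerfile and server.py
docker build -t as400-mcp .

# Run detached, mapping the SSE port and passing connection settings
docker run -d -p 8000:8000 \
  -e IBM_DB_HOST=192.168.1.10 \
  -e IBM_DB_NAME=BLUDB \
  -e IBM_DB_USER=db2admin \
  -e IBM_DB_PASS=secret \
  as400-mcp
```

Pass credentials via `-e` flags or an `--env-file` rather than baking them into the image.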
## 🤖 Client Connectivity (CrewAI)

To connect your CrewAI agent to this Dockerized server, you configure the agent with the `mcps` parameter pointing to the SSE endpoint.
Note: If running the Agent locally and the Server in Docker, use `http://localhost:8000/sse`.
```python
from crewai import Agent, Task, Crew
# No extra tool imports needed; the Agent discovers tools via MCP

# 1. Define the Agent with MCP capabilities
db2_specialist = Agent(
    role='Legacy Data Analyst',
    goal='Extract and analyze supply chain data from the AS/400 system',
    backstory="""You are a specialist in legacy IBM systems.
    You know how to query DB2 for i to find inventory counts and order status.""",
    verbose=True,
    # Connect to the local Docker container
    mcps=["http://localhost:8000/sse"]
)

# 2. Define a Task
inventory_task = Task(
    description="""
    1. Check the schema of table 'LOGISTICS.INVENTORY'.
    2. Query the table to find all items with 'stock_count' < 10.
    3. Return a summary of low-stock items.
    """,
    expected_output="A list of low-stock items with their IDs and current counts.",
    agent=db2_specialist
)

# 3. Run the Crew
crew = Crew(
    agents=[db2_specialist],
    tasks=[inventory_task],
    verbose=True
)

result = crew.kickoff()
print(result)
```
## Important Considerations for Enterprise Deployment

- Strict Filtering: The example above allows raw SQL (`execute_sql`). In a real production environment, you should replace this with specific function-based tools (e.g., `get_order_by_id(id)`) to prevent SQL injection or accidental table drops.
- License Management: The IBM `clidriver` (embedded in `ibm_db`) may require a license file (`db2consv_ee.lic`) when connecting to DB2 on z/OS or IBM i from a non-IBM server. Ensure this file is copied into the Docker container if your connection fails with license errors.
- Encoding: AS/400 systems often default to EBCDIC. `ibm_db` handles the conversion to ASCII/UTF-8 automatically, but be aware of special characters in column names.
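The Strict Filtering point can be sketched as a small allow-list validator. This is a hypothetical helper, not part of the server above: it rejects anything that is not a plain `SCHEMA.TABLE` identifier pair before the name is interpolated into SQL:

```python
import re

# Conservative allow-list: letters, digits, underscore; must start with a letter
# or underscore. Real DB2 identifiers permit a few more characters, so widen
# this deliberately if your schema needs it.
_IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def safe_qualified_name(table_name: str) -> str:
    """Validate a 'SCHEMA.TABLE' name so user input can never smuggle
    SQL fragments into an interpolated query; raises ValueError otherwise."""
    parts = table_name.split(".")
    if len(parts) > 2 or not all(_IDENT.match(p) for p in parts):
        raise ValueError(f"Invalid table name: {table_name!r}")
    return ".".join(p.upper() for p in parts)

print(safe_qualified_name("logistics.inventory"))  # LOGISTICS.INVENTORY
```

For values (as opposed to identifiers), prefer parameter markers with `ibm_db.prepare` and `ibm_db.bind_param` instead of string interpolation.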
## 🛡️ Quality Assurance

- Status: ✅ Verified
- Environment: Python 3.11
- Auditor: AgentRetrofit CI/CD