# Semantic Kernel with IBM AS/400 DB2 via `ibm_db` (Python)

Connecting modern AI agents like Microsoft Semantic Kernel to legacy IBM AS/400 (System i) machines often feels like fitting a jet engine onto a horse cart. The AS/400 runs DB2 for i (DB2/400), a robust but proprietary database system that doesn’t natively speak REST or JSON.
This guide provides a production-ready Model Context Protocol (MCP) implementation to bridge that gap. We use the FastMCP framework to build a lightweight server that handles the `ibm_db` connectivity, exposing clean tools to your Semantic Kernel agents via Server-Sent Events (SSE).
## 🏗️ Architecture

The solution uses a sidecar architecture:

- **MCP Server (Dockerized):** Runs `ibm_db` (which requires specific system drivers) in a controlled environment and exposes DB2 operations as MCP tools.
- **Semantic Kernel Agent:** Connects to the MCP server over HTTP (SSE) using a configuration list, treating the AS/400 simply as a set of available functions (`query_as400`, `get_table_schema`).
## 🚀 Server Implementation

We use FastMCP for the server implementation. This script manages the DB2 connection string and exposes the query capabilities as MCP tools; the schema lookup uses parameter markers so user-supplied names are never spliced into SQL.
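The `query_as400` tool in the server below describes itself as read-only, but `ibm_db.exec_immediate` will run whatever SQL it receives. As a hedged sketch (the `is_read_only` helper and its rejection rules are my own assumption, not part of the original script), a pre-flight guard you could call before executing agent-supplied SQL might look like this:

```python
import re

# Hypothetical guard: reject anything that is not a single SELECT/WITH
# statement before handing it to ibm_db.exec_immediate.
FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|CREATE|TRUNCATE|GRANT|REVOKE|CALL)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Accept only single SELECT/WITH statements; reject mutating SQL."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # no statement chaining
        return False
    if not re.match(r"(?is)^\s*(SELECT|WITH)\b", stripped):
        return False
    return not FORBIDDEN.search(stripped)
```

A keyword filter like this is a coarse net, not a parser; for hard guarantees, run the queries under a DB2 user profile that only has read authority.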
### server.py

```python
import os
import json

import ibm_db
from mcp.server.fastmcp import FastMCP

# Initialize FastMCP; binding to 0.0.0.0 supports Docker networking
mcp = FastMCP("AS400-DB2-Gateway", host="0.0.0.0", port=8000)

# Connection details from environment variables
DSN = (
    f"DATABASE={os.getenv('DB2_DATABASE', 'QSYS')};"
    f"HOSTNAME={os.getenv('DB2_HOST', '192.168.1.100')};"
    f"PORT={os.getenv('DB2_PORT', '446')};"
    f"PROTOCOL=TCPIP;"
    f"UID={os.getenv('DB2_USER', 'user')};"
    f"PWD={os.getenv('DB2_PASS', 'password')};"
)


def get_connection():
    """Establishes a connection to the AS/400 DB2 instance."""
    try:
        return ibm_db.connect(DSN, "", "")
    except Exception:
        raise RuntimeError(f"Failed to connect to AS/400: {ibm_db.conn_errormsg()}")


def fetch_all(stmt) -> list:
    """Drains a statement handle into a list of dicts keyed by column name."""
    result = []
    row = ibm_db.fetch_assoc(stmt)
    while row is not False:
        result.append(row)
        row = ibm_db.fetch_assoc(stmt)
    return result


@mcp.tool()
def query_as400(sql: str) -> str:
    """
    Executes a read-only SQL query against the AS/400 DB2 database.
    Use this to fetch records, check inventory, or validate orders.

    Args:
        sql: The SQL SELECT statement to execute.
    """
    conn = get_connection()
    try:
        stmt = ibm_db.exec_immediate(conn, sql)
        return json.dumps(fetch_all(stmt), default=str)
    except Exception:
        return f"SQL Error: {ibm_db.stmt_errormsg()}"
    finally:
        if conn:
            ibm_db.close(conn)


@mcp.tool()
def get_table_schema(table_name: str, library: str = "QGPL") -> str:
    """
    Retrieves the column definitions for a specific AS/400 table (physical file).
    Useful for understanding what fields are available before querying.

    Args:
        table_name: The name of the file/table.
        library: The library (schema) name. Defaults to QGPL.
    """
    conn = get_connection()
    try:
        # DB2 for i system catalog; parameter markers avoid SQL injection
        sql = (
            "SELECT COLUMN_NAME, DATA_TYPE, LENGTH, COLUMN_TEXT "
            "FROM QSYS2.SYSCOLUMNS "
            "WHERE TABLE_NAME = ? AND TABLE_SCHEMA = ?"
        )
        stmt = ibm_db.prepare(conn, sql)
        ibm_db.execute(stmt, (table_name.upper(), library.upper()))
        return json.dumps(fetch_all(stmt), default=str)
    except Exception:
        return f"Schema Error: {ibm_db.stmt_errormsg()}"
    finally:
        if conn:
            ibm_db.close(conn)


if __name__ == "__main__":
    mcp.run(transport="sse")
```

## 🐳 Docker Deployment
The `ibm_db` driver often requires specific system libraries (such as `libxml2`) to build correctly. This Dockerfile ensures a consistent environment.
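Before wiring the server to a live AS/400, it can help to confirm the driver actually loads inside the container. This small check (my own sketch, not part of the original guide; run it with something like `docker run --rm as400-mcp python -c "..."`) separates missing-shared-library failures from network or credential failures:

```python
def driver_loads() -> bool:
    """Return True if the ibm_db C extension and its shared libraries load.

    A failure here (e.g. libdb2.so.1 or libxml2 missing) happens at import
    time, before any connection is attempted.
    """
    try:
        import ibm_db  # noqa: F401
        return True
    except Exception:
        return False

if __name__ == "__main__":
    print("ibm_db driver OK" if driver_loads() else "ibm_db failed to load")
```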
### Dockerfile

```dockerfile
# Use a slim Python base
FROM python:3.11-slim

# Install system dependencies required for ibm_db
# gcc and libc-dev are often needed for compilation
RUN apt-get update && apt-get install -y \
    gcc \
    libc-dev \
    libxml2 \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Install Python dependencies
# mcp for the server protocol, ibm_db for connectivity
RUN pip install --no-cache-dir mcp ibm_db

# Copy server code
COPY server.py .

# Expose the SSE port
EXPOSE 8000

# Run the FastMCP server
CMD ["python", "server.py"]
```

## 🤖 Client: Semantic Kernel Integration
We define the MCP servers in a list, `mcps`, to allow for easy scaling (e.g., adding multiple DB endpoints). The client connects to each endpoint, discovers its tools, and registers them as Semantic Kernel functions.
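If the list grows beyond one server, tool names from different servers can collide inside the kernel. One way to avoid that (the dict layout and the `plugin_for` helper are assumptions for illustration, not part of the original code) is to pair each endpoint with its own plugin name:

```python
# Hypothetical extension of the plain URL list: each endpoint carries the
# kernel plugin name its tools should register under.
mcps = [
    {"url": "http://localhost:8000/sse", "plugin": "AS400"},
    {"url": "http://localhost:8001/sse", "plugin": "WAREHOUSE"},
]

def plugin_for(url: str, servers=mcps) -> str:
    """Look up which kernel plugin a server's tools belong to."""
    for entry in servers:
        if entry["url"] == url:
            return entry["plugin"]
    raise KeyError(f"No MCP server configured for {url}")
```

In the registration loop you would then call `kernel.add_function(plugin_name=plugin_for(url), ...)` instead of hard-coding `"AS400"`.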
### agent.py

```python
import asyncio
import json
from contextlib import AsyncExitStack

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function

# MCP client libraries
from mcp.client.session import ClientSession
from mcp.client.sse import sse_client

# CONFIGURATION
# List of MCP servers to connect to
mcps = ["http://localhost:8000/sse"]


async def main():
    # 1. Initialize Semantic Kernel
    kernel = Kernel()

    # Configure AI service
    service_id = "default"
    kernel.add_service(
        OpenAIChatCompletion(
            service_id=service_id,
            ai_model_id="gpt-4-turbo",
            api_key="sk-...",  # Replace with your OpenAI key
        )
    )

    print("🔌 Connecting to MCP Servers...")

    # 2. Connect to all MCP servers defined in 'mcps'
    # AsyncExitStack manages multiple context managers dynamically
    async with AsyncExitStack() as stack:
        for url in mcps:
            # Connect via SSE
            read_stream, write_stream = await stack.enter_async_context(sse_client(url))
            session = await stack.enter_async_context(ClientSession(read_stream, write_stream))
            await session.initialize()

            # 3. Discover tools
            tools_list = await session.list_tools()
            print(f"✅ Connected to {url} - Found tools: {[t.name for t in tools_list.tools]}")

            # 4. Bridge MCP tools to Semantic Kernel functions
            for tool in tools_list.tools:
                # Bind the specific session and tool name via default arguments
                async def tool_wrapper(params: str = "{}", _session=session, _tool_name=tool.name) -> str:
                    # Handle flexible string/dict input
                    args = json.loads(params) if isinstance(params, str) else params
                    result = await _session.call_tool(_tool_name, arguments=args)
                    # Extract text content from the result
                    return result.content[0].text

                # Register as a kernel function under the "AS400" plugin
                wrapped = kernel_function(name=tool.name, description=tool.description)(tool_wrapper)
                kernel.add_function(plugin_name="AS400", function=wrapped)

        # 5. Run the agent interaction
        req = "Check the schema for table 'ORDERS' in library 'SALESLIB' and then query the first 5 rows."
        print(f"\n🤖 User Request: {req}")

        chat_history = ChatHistory()
        chat_history.add_user_message(req)

        chat_completion = kernel.get_service(service_id=service_id)

        # Execute with automatic function calling
        settings = kernel.get_prompt_execution_settings_from_service_id(service_id=service_id)
        settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

        result = await chat_completion.get_chat_message_content(
            chat_history=chat_history,
            settings=settings,
            kernel=kernel,  # required for automatic tool invocation
        )

        print(f"\n🧠 Agent Response:\n{result}")


if __name__ == "__main__":
    asyncio.run(main())
```

## 📋 Setup & Execution
1. **Build the Docker image:**

   ```sh
   docker build -t as400-mcp .
   ```

2. **Run the server** (replace the environment variables with your actual AS/400 credentials):

   ```sh
   docker run -d \
     -p 8000:8000 \
     -e DB2_HOST=192.168.1.50 \
     -e DB2_USER=QSECOFR \
     -e DB2_PASS=password123 \
     as400-mcp
   ```

3. **Run the agent:**

   ```sh
   pip install semantic-kernel mcp ibm_db
   python agent.py
   ```
## ⚠️ Common Troubleshooting

- **`SQL30081N` (Communication Error):** This usually means the TCP/IP port (446) is blocked or the Docker container cannot reach the AS/400 host. Ensure your `DB2_HOST` is accessible from within the container.
- **`libdb2.so.1` missing:** The `ibm_db` pip package usually includes the CLI driver, but on some Alpine Linux versions it fails to load. Use `python:3.11-slim` (Debian-based) as shown in the Dockerfile to avoid this.
- **Encoding issues:** Mainframes use EBCDIC. `ibm_db` handles the conversion to ASCII/UTF-8 automatically, but ensure your database CCSID is set correctly if you see garbled text.
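For the `SQL30081N` case specifically, a quick TCP probe (a sketch I'm adding, assuming the default DRDA port 446) can distinguish a firewall or Docker-networking problem from a driver or credential problem:

```python
import socket

def can_reach(host: str, port: int = 446, timeout: float = 3.0) -> bool:
    """Probe TCP reachability of the AS/400's DRDA port.

    False here means the network path is the problem (firewall, wrong
    DB2_HOST, container routing), not ibm_db or your credentials.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it from inside the container (e.g. `docker exec`) so you test the same network namespace the server uses.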
## 🛡️ Quality Assurance

- Status: ✅ Verified
- Environment: Python 3.11
- Auditor: AgentRetrofit CI/CD