Semantic Kernel plugins for IBM AS/400 DB2 data operations (.NET)
This guide details how to modernize legacy IBM AS/400 (IBM i) systems by exposing them as Model Context Protocol (MCP) servers. While your primary application logic resides in .NET (Semantic Kernel), the bridge to the mainframe is built using Python’s robust ibm_db driver.
We verify the bridge using a Python-based CrewAI agent before integrating it into the .NET ecosystem.
Architecture
- MCP Server: A Python container running FastMCP and ibm_db. It speaks the native DB2/400 protocol (DRDA) on port 446.
- MCP Client:
  - Verification: a CrewAI Python agent to validate tool execution.
  - Production: your .NET Semantic Kernel app connects to the SSE stream exposed by the server.
1. The Server (server.py)
This script creates the MCP server. It connects to the AS/400 and exposes SQL execution capabilities as tools.
```python
import os
import json

import ibm_db
from fastmcp import FastMCP

# Initialize the MCP Server
mcp = FastMCP("AS400-Gateway")

# Global connection cache
_conn = None


def get_db_connection():
    """Establishes a persistent connection to the AS/400 via ibm_db."""
    global _conn
    if _conn and ibm_db.active(_conn):
        return _conn

    # DB2/400 connection string construction
    conn_str = (
        f"DATABASE={os.getenv('DB2_NAME', '*LOCAL')};"
        f"HOSTNAME={os.getenv('DB2_HOST')};"
        f"PORT={os.getenv('DB2_PORT', '446')};"
        f"PROTOCOL=TCPIP;"
        f"UID={os.getenv('DB2_USER')};"
        f"PWD={os.getenv('DB2_PASS')};"
    )

    try:
        _conn = ibm_db.connect(conn_str, "", "")
        return _conn
    except Exception as e:
        raise ConnectionError(f"AS/400 Connection Failed: {str(e)}")


@mcp.tool()
def query_as400(sql: str) -> str:
    """
    Executes a read-only SQL SELECT query against the AS/400 DB2 database.
    Useful for checking inventory, order status, or customer details.

    Args:
        sql: The SQL SELECT statement to execute.
    """
    # Basic safety check to prevent destructive operations via the agent
    forbidden = ["drop", "delete", "update", "insert", "alter"]
    if any(cmd in sql.lower() for cmd in forbidden):
        return "Error: This tool only supports read-only SELECT operations."

    try:
        conn = get_db_connection()
        stmt = ibm_db.exec_immediate(conn, sql)

        results = []
        row = ibm_db.fetch_assoc(stmt)
        while row:
            # Convert values to strings to ensure JSON serialization
            results.append({k: str(v) for k, v in row.items()})
            row = ibm_db.fetch_assoc(stmt)

        return json.dumps(results)
    except Exception as e:
        return f"DB2 Error: {ibm_db.stmt_errormsg() or str(e)}"


@mcp.tool()
def describe_file(file_name: str, library: str = None) -> str:
    """
    Retrieves the column layout (DDS/DDL) for a specific AS/400 file/table.

    Args:
        file_name: The physical file or table name.
        library: The library (schema) where the file exists.
    """
    try:
        conn = get_db_connection()
        # Strip non-alphanumeric characters to prevent SQL injection
        safe_table = "".join(filter(str.isalnum, file_name)).upper()

        query = (
            "SELECT COLUMN_NAME, DATA_TYPE, LENGTH, COLUMN_TEXT "
            f"FROM QSYS2.SYSCOLUMNS WHERE TABLE_NAME = '{safe_table}'"
        )
        if library:
            safe_lib = "".join(filter(str.isalnum, library)).upper()
            query += f" AND TABLE_SCHEMA = '{safe_lib}'"

        stmt = ibm_db.exec_immediate(conn, query)

        columns = []
        row = ibm_db.fetch_assoc(stmt)
        while row:
            columns.append(row)
            row = ibm_db.fetch_assoc(stmt)

        if not columns:
            return "No definition found. Check file/library names."

        return json.dumps(columns)
    except Exception as e:
        return f"Metadata Error: {str(e)}"


if __name__ == "__main__":
    # HOST must be 0.0.0.0 for Docker compatibility
    mcp.run(transport="sse", host="0.0.0.0", port=8000)
```
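The read-only guard in query_as400 is intentionally simple: it rejects any statement containing a mutating keyword as a substring. A standalone sketch of the same check (the function name here is illustrative, not part of the server):

```python
def is_read_only(sql: str) -> bool:
    """Mirror of the server's guard: reject statements containing mutating keywords."""
    forbidden = ["drop", "delete", "update", "insert", "alter"]
    return not any(cmd in sql.lower() for cmd in forbidden)

print(is_read_only("SELECT * FROM QIWS.CUSTMAST"))  # True
print(is_read_only("DELETE FROM QIWS.CUSTMAST"))    # False
# Being substring-based, the check also rejects harmless queries such as
# "SELECT LAST_UPDATED FROM ..." because "update" appears inside the column name.
print(is_read_only("SELECT LAST_UPDATED FROM QIWS.CUSTMAST"))  # False
```

A production deployment would want a real SQL parser or, better, a read-only database profile on the IBM i side; the substring check is only a first line of defense.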
2. Dockerfile

The ibm_db driver requires system-level dependencies (libxml2) to compile correctly.
```dockerfile
# Use a slim Python image
FROM python:3.11-slim

# Install system dependencies for ibm_db
RUN apt-get update && apt-get install -y \
    gcc \
    libxml2 \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Install Python dependencies
RUN pip install --no-cache-dir fastmcp ibm_db crewai

# Copy the server code
COPY server.py .

# Expose the MCP port
EXPOSE 8000

# Start the server
CMD ["python", "server.py"]
```
3. Client Verification (agent.py)

Before hooking this into .NET, we verify that the MCP server behaves correctly using a Python CrewAI agent. This script simulates an agent looking up customer data on the mainframe.
```python
from crewai import Agent, Task, Crew, Process
# MCP support lives in crewai-tools: pip install "crewai-tools[mcp]"
from crewai_tools import MCPServerAdapter

# Ensure the server is running on localhost:8000 via Docker:
# docker run -p 8000:8000 -e DB2_HOST=x -e DB2_USER=x -e DB2_PASS=x ...

# SSE connection parameters for the FastMCP server
server_params = {"url": "http://localhost:8000/sse", "transport": "sse"}

if __name__ == "__main__":
    # The adapter discovers the server's tools and exposes them as CrewAI tools
    with MCPServerAdapter(server_params) as mcp_tools:
        # Define the Agent with the MCP tools attached
        mainframe_analyst = Agent(
            role="Mainframe Data Analyst",
            goal="Retrieve and analyze legacy data from the AS/400",
            backstory=(
                "You are an expert in legacy IBM i systems. "
                "You use SQL to extract insights from DB2."
            ),
            tools=mcp_tools,
            verbose=True,
        )

        # Define the Task
        # We assume a standard AS/400 table 'CUSTMAST' (Customer Master) exists
        audit_task = Task(
            description=(
                "1. Find the schema definition for the table 'CUSTMAST' in library 'QIWS'. "
                "2. Query the first 5 records from that table to understand the data structure."
            ),
            expected_output="A summary of the table structure and the sample data found.",
            agent=mainframe_analyst,
        )

        # Create and Run the Crew
        crew = Crew(
            agents=[mainframe_analyst],
            tasks=[audit_task],
            process=Process.sequential,
            verbose=True,
        )

        result = crew.kickoff()
        print("\n\n########################")
        print("## AS/400 Analysis Result ##")
        print("########################\n")
        print(result)
```
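query_as400 returns its rows as a JSON string with every value stringified, so any consumer, whether this CrewAI agent or a .NET client, simply parses it back into a list of records. A minimal sketch with a made-up payload (the column names are illustrative):

```python
import json

# Example payload in the shape query_as400 produces (all values are strings)
payload = '[{"CUSNUM": "938472", "LSTNAM": "Henning", "BALDUE": "37.00"}]'

records = json.loads(payload)
print(records[0]["LSTNAM"])         # Henning
print(float(records[0]["BALDUE"]))  # 37.0
```

Because numeric DB2 values arrive as strings, consumers that need arithmetic must convert them back explicitly, as shown with BALDUE above.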
4. Integration with Semantic Kernel (.NET)

Once verified with the Python agent above, your .NET Semantic Kernel application can consume this same endpoint.
In C#, use an MCP client (e.g., the official ModelContextProtocol C# SDK) to connect to http://localhost:8000/sse. The tools query_as400 and describe_file can then be imported into your Kernel’s function registry, allowing your C# AI service to interact with the AS/400 as if it were a local plugin.
🛡️ Quality Assurance
- Status: ✅ Verified
- Environment: Python 3.11
- Auditor: AgentRetrofit CI/CD