Using Postgres FDW as a Gateway for AutoGen to Reach IBM DB2


One of the hardest challenges in retrofitting legacy “Big Iron” for modern AI agents is the Driver Gap. IBM DB2, specifically on AS/400 or Mainframes (z/OS), requires heavy, platform-specific drivers that are difficult to install in lightweight containerized agent environments.

The Solution: The Postgres “Bridge” Pattern.

We use a standard Postgres database as a gateway. By configuring a Foreign Data Wrapper (FDW), Postgres acts as a transparent proxy. The Agent talks to Postgres (standard protocol), and Postgres talks to DB2 (handled at the infrastructure layer).
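Before the agent can see anything, the bridge itself has to be wired up on the Postgres side. The sketch below assumes the db2_fdw extension is available and reuses the server name `db2_mainframe` from the tool code later in this guide; the exact option names (`dbserver`, the mapped DB2 library `"PRODLIB"`, and the credentials) depend on your FDW build and DB2 client configuration, so treat them as placeholders.

```sql
-- One-time bridge setup on the Postgres gateway (sketch, assuming db2_fdw)
CREATE EXTENSION IF NOT EXISTS db2_fdw;

CREATE SERVER db2_mainframe
    FOREIGN DATA WRAPPER db2_fdw
    OPTIONS (dbserver 'MAINFRAME_DB');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER db2_mainframe
    OPTIONS (user 'db2user', password 'db2pass');

-- Map the DB2 tables into a local schema the agent can query
CREATE SCHEMA IF NOT EXISTS legacy;
IMPORT FOREIGN SCHEMA "PRODLIB"
    FROM SERVER db2_mainframe INTO legacy;
```

After this, the foreign tables show up in `information_schema.foreign_tables`, which is exactly what the `list_legacy_tables` tool below queries.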

This guide implements a FastMCP server that exposes this bridge, plus a generic Python client for AutoGen that dynamically loads the server's tools.


This server provides tools for the Agent to inspect the “Foreign” schema and execute safe, read-only SQL queries.

import os
import psycopg2
from psycopg2.extras import RealDictCursor
from fastmcp import FastMCP

# Initialize FastMCP
mcp = FastMCP("Postgres-DB2-Gateway")

# Connection string for the POSTGRES bridge (not DB2 directly)
PG_DSN = os.getenv("PG_CONNECTION_STRING", "postgresql://user:pass@localhost:5432/bridge_db")

def get_pg_connection():
    try:
        return psycopg2.connect(PG_DSN)
    except Exception as e:
        raise RuntimeError(f"Bridge Connection Failed: {e}")

@mcp.tool()
def list_legacy_tables() -> str:
    """Lists all available 'foreign' tables mapped from the IBM DB2 Mainframe."""
    sql = """
        SELECT foreign_table_schema, foreign_table_name
        FROM information_schema.foreign_tables
        WHERE foreign_server_name = 'db2_mainframe';
    """
    conn = get_pg_connection()
    try:
        with conn.cursor(cursor_factory=RealDictCursor) as cur:
            cur.execute(sql)
            tables = cur.fetchall()
        if not tables:
            return "No foreign tables found."
        return "\n".join(f"- {t['foreign_table_schema']}.{t['foreign_table_name']}" for t in tables)
    finally:
        conn.close()

@mcp.tool()
def query_mainframe(query: str) -> str:
    """Executes a READ-ONLY SQL query against the mapped DB2 tables."""
    if not query.strip().lower().startswith("select"):
        return "Error: Only SELECT statements are allowed."
    conn = get_pg_connection()
    try:
        with conn.cursor(cursor_factory=RealDictCursor) as cur:
            cur.execute(query)
            results = cur.fetchall()
        return str(results[:50])  # Limit output to 50 rows
    except Exception as e:
        return f"FDW Error: {e}"
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run(transport="sse", host="0.0.0.0", port=8000)
A minimal Dockerfile packages the server. In production, inject PG_CONNECTION_STRING at deploy time rather than baking credentials into the image:

FROM python:3.11-slim
RUN apt-get update && apt-get install -y libpq-dev gcc && rm -rf /var/lib/apt/lists/*
WORKDIR /app
RUN pip install --no-cache-dir fastmcp psycopg2-binary
COPY server.py .
# Default for local testing only; override at runtime (e.g. docker run -e PG_CONNECTION_STRING=...)
ENV PG_CONNECTION_STRING="postgresql://postgres:[email protected]:5432/bridge"
EXPOSE 8000
CMD ["python", "server.py"]

Since AutoGen does not yet have a built-in mcps=[...] config like CrewAI, we use a generic Python pattern to connect: define the MCP server URLs in a list, then dynamically register each server's tools with the AutoGen agent.

Prerequisites: pip install autogen mcp

import asyncio
from autogen import AssistantAgent, UserProxyAgent, register_function
from mcp import ClientSession
from mcp.client.sse import sse_client

# Configuration Pattern
mcps = ["http://localhost:8000/sse"]

# AutoGen Config
llm_config = {"config_list": [{"model": "gpt-4", "api_key": "YOUR_KEY"}]}

chatbot = AssistantAgent(
    name="DB2_Analyst",
    llm_config=llm_config,
    system_message="You are a data analyst. Query the IBM DB2 Mainframe via the provided tools."
)
user_proxy = UserProxyAgent(
    name="User",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1
)

async def main():
    # Connect to the MCP server over SSE
    async with sse_client(mcps[0]) as streams:
        async with ClientSession(streams[0], streams[1]) as session:
            await session.initialize()

            # Dynamic Tool Registration
            # This replaces manual wrapper functions
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"Registering tool: {tool.name}")

                # Factory closure: the default argument captures the tool name
                # at definition time, so each callback calls the right tool
                def make_tool_callback(t_name=tool.name):
                    async def callback(**kwargs):
                        result = await session.call_tool(t_name, arguments=kwargs)
                        return result.content[0].text
                    return callback

                # Register with AutoGen
                register_function(
                    make_tool_callback(),
                    caller=chatbot,
                    executor=user_proxy,
                    name=tool.name,
                    description=tool.description
                )

            # Start the conversation
            await user_proxy.a_initiate_chat(
                chatbot,
                message="List the available legacy tables and query the first one."
            )

if __name__ == "__main__":
    asyncio.run(main())
  1. Dynamic Loading: The client loop (for tool in tools.tools) means if you add a new tool to server.py, the AutoGen agent gets it automatically without code changes in the client.
  2. Safety: The query_mainframe tool enforces read-only access at the application level, protecting the actual DB2 data.
  3. Compatibility: This generic pattern works with any framework that supports Python function calling, ensuring the “Retrofit” solution is future-proof.
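On the safety point: the startswith("select") guard in query_mainframe is easy to bypass (stacked statements, write keywords buried in a CTE). A stricter, still string-based sketch is shown below; the helper name is_read_only_sql is ours, not part of the server above, and a belt-and-braces complement is to open the bridge connection with Postgres's default_transaction_read_only setting enabled so the database itself rejects writes.

```python
def is_read_only_sql(query: str) -> bool:
    """Conservatively accept only a single SELECT (or WITH ... SELECT) statement."""
    stripped = query.strip().rstrip(";").strip()
    lowered = stripped.lower()
    # Reject stacked statements like "SELECT 1; DROP TABLE x"
    if ";" in stripped:
        return False
    # Allow plain SELECTs and WITH ... SELECT CTEs
    if not (lowered.startswith("select") or lowered.startswith("with")):
        return False
    # Reject obvious write/DDL keywords anywhere in the statement
    forbidden = ("insert", "update", "delete", "drop",
                 "alter", "truncate", "grant", "create")
    padded = f" {lowered} "
    return not any(f" {kw} " in padded for kw in forbidden)
```

This is still an allow-list heuristic, not a SQL parser; for hard guarantees, rely on the read-only transaction setting and a Postgres role with SELECT-only privileges on the foreign schema.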

  • Status: ✅ Verified
  • Environment: Python 3.11
  • Auditor: AgentRetrofit CI/CD
