LangGraph agent workflows for consuming SOAP web services (Python)
As enterprises modernize, they often face a “Tower of Babel” scenario: modern AI agents speak JSON and REST, while critical backend systems (banking cores, logistics ERPs) speak only SOAP and XML.

This guide provides a production-ready Model Context Protocol (MCP) server that acts as a translation layer. It allows LangGraph agents to consume legacy SOAP web services dynamically, handling WSDL parsing, XML serialization, and enterprise proxy configurations (such as BrightData) without polluting your agent’s cognitive logic.
🏗️ Architecture
We use a FastMCP server to expose the zeep SOAP client as a tool. The LangGraph agent connects to this server via the MCP protocol, treating the legacy SOAP service as just another function call.
- Server: FastMCP (Python) + Zeep
- Protocol: SOAP 1.1/1.2 over HTTP
- Agent Framework: LangGraph
- Transport: SSE (Server-Sent Events) on Port 8000
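The net effect of this architecture is that every legacy SOAP operation collapses into a uniform three-string contract: a WSDL URL, an operation name, and a JSON-encoded argument string. A minimal sketch of that contract (the payload values mirror the NumberConversion example used later in this guide):

```python
import json

# The uniform contract the agent uses for every SOAP call: three strings.
tool_call = {
    "wsdl_url": "https://www.dataaccess.com/webservicesserver/NumberConversion.wso?WSDL",
    "operation": "NumberToWords",
    # Operation arguments are packed into a JSON string so the tool
    # signature stays identical for every WSDL.
    "parameters": json.dumps({"ubiNum": 500}),
}

# The server side unpacks the arguments before handing them to zeep
args = json.loads(tool_call["parameters"])
print(args)  # {'ubiNum': 500}
```

Because the contract never changes, the agent needs exactly one tool definition regardless of how many operations the WSDL exposes.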
🚀 The Bridge Code
1. The MCP Server (server.py)
This server exposes a generic soap_request tool. It handles the “heavy lifting” of the SOAP protocol, including WSDL parsing and proxy injection.
```python
import json

import requests
from fastmcp import FastMCP
from zeep import Client, Settings
from zeep.helpers import serialize_object
from zeep.transports import Transport

# Initialize FastMCP
mcp = FastMCP("SOAP Legacy Gateway")


@mcp.tool()
def soap_request(wsdl_url: str, operation: str, parameters: str) -> str:
    """Executes a SOAP operation against a legacy web service.

    Args:
        wsdl_url: The full URL to the WSDL definition (e.g., http://ws.example.com?wsdl).
        operation: The specific SOAP operation name to call (e.g., 'GetCustomerInfo').
        parameters: A JSON string dictionary of arguments expected by the operation.
    """
    try:
        # Parse parameters from the JSON string
        params_dict = json.loads(parameters)

        # Configure the transport session. Note: zeep's synchronous Transport
        # expects a requests.Session (its AsyncTransport is the httpx-based one).
        # Many legacy systems sit behind corporate firewalls requiring proxies:
        # session.proxies = {
        #     'http': 'http://user:pass@brightdata-proxy-ip:port',
        #     'https': 'http://user:pass@brightdata-proxy-ip:port',
        # }
        # For production, inject your BrightData proxy URL here.
        session = requests.Session()
        session.verify = True

        transport = Transport(session=session)
        settings = Settings(strict=False, xml_huge_tree=True)

        # Initialize the Zeep client (fetches and parses the WSDL)
        client = Client(wsdl_url, transport=transport, settings=settings)

        # Dynamically retrieve the service operation.
        # This allows the agent to call ANY operation defined in the WSDL.
        service = client.service
        if not hasattr(service, operation):
            return f"Error: Operation '{operation}' not found in WSDL."

        func = getattr(service, operation)

        # Execute the SOAP call; Zeep handles the XML serialization automatically
        result = func(**params_dict)

        # Zeep returns complex objects; flatten them with serialize_object
        # so the result can be JSON-encoded for the agent.
        serializable_result = serialize_object(result)
        return json.dumps(serializable_result, default=str)

    except Exception as e:
        return f"SOAP Fault or Connection Error: {str(e)}"


if __name__ == "__main__":
    # Bind to 0.0.0.0 to ensure Docker compatibility
    mcp.run(transport="sse", host="0.0.0.0", port=8000)
```
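The dynamic-dispatch core of soap_request (validate the operation name, then `getattr` into the service proxy) can be exercised without a live SOAP endpoint. A small sketch with a stand-in service object — `FakeService` and `dispatch` are illustrative names, not part of zeep:

```python
import json


class FakeService:
    """Stand-in for zeep's client.service, used to illustrate dynamic dispatch."""

    def NumberToWords(self, ubiNum: int) -> str:
        return f"words({ubiNum})"


def dispatch(service, operation: str, parameters: str) -> str:
    # Same pattern as soap_request: validate, then getattr-dispatch
    if not hasattr(service, operation):
        return f"Error: Operation '{operation}' not found in WSDL."
    func = getattr(service, operation)
    return func(**json.loads(parameters))


print(dispatch(FakeService(), "NumberToWords", json.dumps({"ubiNum": 500})))
# → words(500)
print(dispatch(FakeService(), "DoesNotExist", "{}"))
# → Error: Operation 'DoesNotExist' not found in WSDL.
```

This works against zeep because its service proxy raises `AttributeError` for unknown operations, so `hasattr` doubles as an existence check.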
2. The Dockerfile

This containerizes the bridge, ensuring Zeep’s system dependencies (libxml2) are correctly installed and the port is exposed for Railway or other cloud deployment.
```dockerfile
# Use a lightweight Python base
FROM python:3.11-slim

# Install system dependencies for XML processing (required by Zeep/lxml)
RUN apt-get update && apt-get install -y \
    libxml2-dev \
    libxslt-dev \
    gcc \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Install Python libraries
RUN pip install --no-cache-dir \
    fastmcp \
    zeep \
    requests \
    httpx \
    mcp

# Copy server code
COPY server.py .

# Expose the SSE port
EXPOSE 8000

# Run the server
CMD ["python", "server.py"]
```

🔌 Client Connectivity: LangGraph
We define the client in agent.py. This script iterates through a configuration list of MCP servers (mcps), connects to each one, converts its capabilities into LangChain-compatible tools, and injects them into a LangGraph agent.
Create a file named agent.py:
```python
import asyncio
from contextlib import AsyncExitStack

from mcp import ClientSession
from mcp.client.sse import sse_client
from mcp.types import CallToolResult
from pydantic import BaseModel

from langchain_core.messages import HumanMessage
from langchain_core.tools import StructuredTool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# --- Configuration ---
# List of MCP servers to connect to
mcps = [
    "http://localhost:8000/sse",
]


class SoapRequestArgs(BaseModel):
    """Argument schema for the generic soap_request tool."""
    wsdl_url: str
    operation: str
    parameters: str


async def run_agent():
    async with AsyncExitStack() as stack:
        langgraph_tools = []

        # 1. Connect to all MCP servers
        for url in mcps:
            print(f"Connecting to MCP Server at {url}...")
            read_stream, write_stream = await stack.enter_async_context(sse_client(url))
            session = await stack.enter_async_context(ClientSession(read_stream, write_stream))
            await session.initialize()

            # 2. Discover and convert tools
            tools_list = await session.list_tools()
            for tool_info in tools_list.tools:
                print(f"  - Found tool: {tool_info.name}")

                # Define a dynamic coroutine to call this specific tool.
                # Default arguments bind the current session and tool name.
                async def make_tool_call(wsdl_url: str, operation: str, parameters: str,
                                         _session=session, _name=tool_info.name) -> str:
                    result: CallToolResult = await _session.call_tool(
                        _name,
                        arguments={
                            "wsdl_url": wsdl_url,
                            "operation": operation,
                            "parameters": parameters,
                        },
                    )
                    if result.content and hasattr(result.content[0], "text"):
                        return result.content[0].text
                    return "No content returned."

                # Wrap as a LangChain StructuredTool; the explicit args_schema
                # keeps the bound _session/_name defaults out of the tool schema.
                langgraph_tools.append(StructuredTool.from_function(
                    func=None,
                    coroutine=make_tool_call,
                    name=tool_info.name,
                    description=tool_info.description or "MCP Tool",
                    args_schema=SoapRequestArgs,
                ))

        # 3. Initialize the LangGraph agent
        llm = ChatOpenAI(model="gpt-4o", temperature=0)

        # Create a ReAct agent injected with the discovered MCP tools
        agent_executor = create_react_agent(llm, tools=langgraph_tools)

        # 4. Execute the workflow
        # Example: using a public NumberConversion SOAP service
        user_query = (
            "Use the 'soap_request' tool. I need to convert the number 500 to words. "
            "The WSDL is 'https://www.dataaccess.com/webservicesserver/NumberConversion.wso?WSDL'. "
            "The operation is 'NumberToWords'. The parameter is {'ubiNum': 500}."
        )

        print("\n--- Starting Agent Execution ---")
        async for event in agent_executor.astream({"messages": [HumanMessage(content=user_query)]}):
            for value in event.values():
                print("Agent Step:", value)


if __name__ == "__main__":
    asyncio.run(run_agent())
```
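One subtlety in the prompt above: the server parses `parameters` with `json.loads`, so the model must emit a strict JSON string (double quotes), even though the prompt shows a Python-style dict literal. A quick stdlib-only illustration of the difference:

```python
import json

# Python-style single quotes are NOT valid JSON and will be rejected
try:
    json.loads("{'ubiNum': 500}")
except json.JSONDecodeError:
    print("Rejected: not valid JSON")

# Strict JSON (double quotes) parses cleanly
print(json.loads('{"ubiNum": 500}'))  # {'ubiNum': 500}
```

In practice GPT-4-class models normalize this reliably, but if you see `SOAP Fault or Connection Error: Expecting property name enclosed in double quotes`, the tool-call arguments are the first place to look.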
1. Start the server:

```shell
docker build -t soap-mcp .
docker run -p 8000:8000 soap-mcp
```

2. Run the agent:

```shell
# Ensure OPENAI_API_KEY is set
export OPENAI_API_KEY=sk-...
python agent.py
```
🛠️ Common Integration Errors
| Error Code | Context | Solution |
|---|---|---|
| `TransportError: 403 Forbidden` | Enterprise Proxy | Inject the BrightData proxy URL into the `proxies` dict in `server.py`. |
| `TypeError: Object of type X is not JSON serializable` | Zeep Output | Use `zeep.helpers.serialize_object` before returning data to the agent. |
| `lxml.etree.XMLSyntaxError` | Malformed Response | The legacy server may be returning HTML error pages instead of XML. Check the URL and authentication. |
| `ConnectionRefused` | Docker Networking | Ensure `server.py` binds `host='0.0.0.0'` and Docker maps port 8000. |
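For the 403 Forbidden case, the fix amounts to assembling a proxy URL and wiring it into the transport session in server.py. The host, port, and credentials below are placeholders in a typical BrightData-style format — they are assumptions for illustration, not real endpoint values:

```python
# Placeholder BrightData-style credentials (assumptions, not real values)
PROXY_USER = "brd-customer-XXXX-zone-YYYY"
PROXY_PASS = "your_zone_password"
PROXY_HOST = "brd.superproxy.io"
PROXY_PORT = 22225

proxy_url = f"http://{PROXY_USER}:{PROXY_PASS}@{PROXY_HOST}:{PROXY_PORT}"

# Assign this dict to the session used by zeep's Transport,
# e.g. session.proxies = proxies
proxies = {"http": proxy_url, "https": proxy_url}
print(proxies["https"])
```

Keep the credentials in environment variables rather than hard-coding them; the literals here are only to make the URL shape visible.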
🛡️ Quality Assurance
- Status: ✅ Verified
- Environment: Python 3.11
- Auditor: AgentRetrofit CI/CD
Transparency: This page may contain affiliate links.