Enterprise Application Integration Using Zachman Matrix: Microservices and APIs
Legacy enterprises often have 50-200 disconnected applications. Order management in one system, inventory in another, accounting in a third. Data doesn't flow; reporting requires manual reconciliation; processes are manual and error-prone.
Modern enterprises solve this through integration architecture - microservices, APIs, and event-driven workflows. Zachman Framework ensures integration architecture is complete and coherent.
The Integration Problem
Legacy monolith (all in one system):
Customer → CRM (monolithic) ← → Accounting ← → Manufacturing
↓
Order processing all tangled together
Change one thing, break three other things.
Modern microservices (each system focused):
Customer Service → Order Service → Inventory Service → Manufacturing Service
↓ ↓ ↓ ↓
Customer data Order data Inventory levels Production queue
Integration layers needed:
- APIs (synchronous communication)
- Message queues (asynchronous events)
- Data pipelines (sync data across systems)
Challenge: How to design integration without creating new silos?
Zachman Integration Architecture
Row 1: Strategic Intent - What is Integration For?
Column 1 (What):
- Unified customer view across all systems
- Order-to-delivery process end-to-end
- Real-time inventory visibility
Column 2 (Why):
- Business objective: Reduce order-to-delivery time from 8 weeks to 2 weeks
- Requires: Real-time inventory visibility (not batch sync)
- Enables: Data-driven decisions (customer 360°)
Row 2: Current State - Inventory of Applications
Assessment:
50 Applications across enterprise:
By category:
- CRM: Salesforce (4 modules, heavily customized)
- ERP: SAP (procurement, inventory, accounting)
- Manufacturing: MES (Manufacturing Execution System, legacy)
- Warehouse: WMS (Warehouse Management, 10 years old)
- Finance: NetSuite (accounting, GL)
- HR: Workday (payroll, benefits)
- Reporting: Tableau (BI dashboards)
- Custom: 20 custom applications (Java, .NET, legacy)
Data flow today:
- Batch files (overnight): CSV exports/imports
- Manual data entry: Customer service enters order in 4 systems
- Point-to-point integrations: SAP talks to WMS, Salesforce talks to SAP, and so on, totaling 20+ point-to-point integrations (spaghetti)
Problems:
- Data inconsistency: Customer has different email in SAP vs. Salesforce
- Delays: Batch sync nightly (reports wrong until morning refresh)
- Brittleness: If WMS is down, nothing works (no graceful degradation)
- No visibility: Customer service can't see order status in manufacturing
- Manual workarounds: "Check SAP, then WMS, then call warehouse"
Row 3: Target Architecture - Integration Design
Design principle: "How should systems communicate to enable business processes?"
Column 1 (What): Shared data model
Master data (single source of truth):
Customer:
- ID, email, phone (master in Salesforce)
- Billing address (shared across systems)
- Segment (calculated, shared via data lake)
Product:
- Master in ERP (SAP)
- Price (in ERP, pricing rules in pricing engine)
- Availability (real-time from inventory)
Order:
- Master in Order service (new)
- References customer (ID from Salesforce)
- References product (SKU from ERP)
- References inventory (real-time from inventory)
Column 2 (How): Integration patterns
Pattern 1: Synchronous APIs (for real-time needs)
Customer Service app needs customer balance:
1. Calls REST API: GET /api/customer/123/balance
2. Finance service returns: $5,432.10
3. Instant result (no waiting)
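The synchronous lookup above can be sketched in a few lines. This is an illustrative, in-process stand-in: `fake_finance_service`, `get_customer_balance`, and the timeout parameter are hypothetical names, not part of any real system described here; a real client would make the HTTP call with its library's timeout option.

```python
# Minimal sketch of Pattern 1 (synchronous request/response).
# fake_finance_service stands in for a real HTTP call such as
# GET /api/customer/123/balance.

def fake_finance_service(customer_id):
    """Stand-in for the remote Finance service."""
    balances = {123: 5432.10}
    return balances.get(customer_id)

def get_customer_balance(customer_id, call=fake_finance_service, timeout_s=0.2):
    """Synchronous lookup: the caller blocks until the answer arrives.

    A real client would pass timeout_s to its HTTP library so a slow
    Finance service cannot stall order validation indefinitely.
    """
    balance = call(customer_id)
    if balance is None:
        raise LookupError(f"customer {customer_id} not found")
    return balance

print(get_customer_balance(123))  # → 5432.1
```

The trade-off is coupling: the caller cannot proceed until the Finance service answers, which is exactly why this pattern is reserved for questions that must be answered now.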
Use case: Real-time order validation (can the customer pay?)
Pattern 2: Asynchronous Events (for eventual consistency)
Order created:
1. Order service publishes: "OrderCreated" event
2. Listeners subscribe:
- Inventory service: Reduces stock
- Accounting service: Records revenue
- Notification service: Sends confirmation email
3. All happen within seconds (eventual consistency, not real-time)
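The publish/subscribe flow above can be made concrete with a tiny in-memory event bus. This is a sketch only: in production the broker would be Kafka or RabbitMQ, and the `event_handler`/`publish_event` names here simply mirror the decorator style used in the implementation examples later in this article.

```python
# Minimal in-memory sketch of Pattern 2 (publish/subscribe).
# A dict of handler lists stands in for a real message broker.

_handlers = {}

def event_handler(event_type):
    """Register a function as a subscriber for event_type."""
    def register(fn):
        _handlers.setdefault(event_type, []).append(fn)
        return fn
    return register

def publish_event(event_type, payload):
    """Deliver the event to every subscriber, isolating failures."""
    for handler in _handlers.get(event_type, []):
        try:
            handler(payload)
        except Exception as exc:  # one failing consumer must not block the rest
            print(f"handler {handler.__name__} failed: {exc}")

processed = []

@event_handler('OrderCreated')
def reduce_stock(event):
    processed.append(('inventory', event['order_id']))

@event_handler('OrderCreated')
def record_revenue(event):
    processed.append(('accounting', event['order_id']))

publish_event('OrderCreated', {'order_id': 42})
print(processed)  # → [('inventory', 42), ('accounting', 42)]
```

Note that the publisher knows nothing about its subscribers: adding a new listener requires no change to the Order service, which is the decoupling payoff.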
Use case: Decouple systems, allow them to fail independently
Pattern 3: Data synchronization (for analytics)
End-of-day: All data synced to data lake
- Customer: Salesforce → data lake
- Orders: Order service → data lake
- Inventory: Warehouse → data lake
- Financial: SAP → data lake
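The merge step of this nightly sync (matching customers across systems by email, with the CRM record as master for contact fields) can be sketched as follows. The record shapes and field names (`phone`, `credit_limit`) are hypothetical.

```python
# Illustrative sketch of the nightly transform: match customers on email
# and merge, treating the CRM record as master for contact fields.

def merge_customers(crm_rows, erp_rows):
    """Merge two extracts into one record per (case-insensitive) email."""
    merged = {}
    for row in crm_rows:
        merged[row['email'].lower()] = dict(row, source='crm')
    for row in erp_rows:
        key = row['email'].lower()
        if key in merged:
            # CRM wins for contact data; keep ERP-only fields like credit_limit
            for field, value in row.items():
                merged[key].setdefault(field, value)
        else:
            merged[key] = dict(row, source='erp')
    return list(merged.values())

crm = [{'email': 'a@x.com', 'phone': '555-1234'}]
erp = [{'email': 'A@X.com', 'credit_limit': 10000},
       {'email': 'b@x.com', 'credit_limit': 5000}]
result = merge_customers(crm, erp)
print(len(result))  # → 2
```

In a real pipeline this logic would run inside the transform task, with data-quality checks (valid emails, no empty keys) before the load step.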
Use case: Historical reporting, machine learning
Column 3 (Where): Architecture diagram
                  API Gateway
                 (front door)
                      ↓
    ┌─────────────────┼─────────────────┐
    ↓                 ↓                 ↓
Customer Service  Order Service  Inventory Service
 (Salesforce)      (new, core)     (Warehouse)
    ↓                 ↓                 ↓
    └─────────────────┼─────────────────┘
                      ↓
                Message Queue
                (Event broker)
                      ↓
     ┌──────────┬─────┴─────┬──────────┐
     ↓          ↓           ↓          ↓
 Accounting Notification Fulfillment Reporting
  Service     Service     Service   (Data lake)
Column 4 (Who): Integration governance
API Review Board:
- Approve new APIs (ensure consistency)
- Deprecate old APIs (planned migration)
- Enforce API standards (documentation, versioning)
Data Governance:
- Customer master: Owned by CRM (Salesforce)
- Order master: Owned by Order service (new)
- Product master: Owned by ERP (SAP)
SLA Agreements:
- API response time: <200ms (p95)
- Message processing: <5 seconds
- Data consistency: <1 minute
Column 5 (When): Integration lifecycle
Synchronous APIs (for real-time):
- Response time: 50-200ms
- Use case: Interactive features (can check now)
Asynchronous events (for eventual consistency):
- Processing time: 1-5 seconds
- Use case: Decoupled services, parallel processing
Batch synchronization (for analytics):
- Frequency: Nightly (end-of-day)
- Use case: Historical reporting, reconciliation
Column 6 (Why): Business outcomes
Reduce order-to-delivery:
- Before: 8 weeks (manual, batch processes, system hops)
- After: 2 weeks (real-time visibility, automated)
Reduce manual work:
- Before: 40% of orders require manual intervention
- After: 5% (mostly exceptions)
Improve accuracy:
- Before: Data inconsistency, manual errors
- After: Single source of truth, automated sync
Enable analytics:
- Before: No integrated reporting
- After: Complete 360° customer view, predictive analytics
Row 4: Technology Choices
Column 1 (What): Data integration technology
- ETL tools: Talend, Apache Airflow
- Real-time sync: Debezium (CDC - Change Data Capture)
- Data warehouse: Snowflake, BigQuery
Column 2 (How): API technology
- REST APIs (HTTP-based, standard)
- GraphQL (query language for APIs)
- gRPC (high-performance, binary protocol)
Column 3 (Where): Message queue technology
- Kafka (distributed event streaming)
- RabbitMQ (traditional message queue)
- AWS SNS/SQS (cloud-based)
Column 4 (Who): API gateway
- Kong (open-source)
- AWS API Gateway
- Azure API Management
Column 5 (When): Scheduling and routing
- Apache Airflow (schedule batch jobs)
- Kubernetes (deploy and scale microservices)
- Service mesh (Istio, Linkerd)
Column 6 (Why): Integration patterns
- Circuit breaker (if downstream service down, fail gracefully)
- Retry logic (handle transient failures)
- Throttling (prevent overwhelming downstream systems)
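The circuit-breaker pattern listed above can be sketched in a few lines. This is a simplified illustration (class and function names are hypothetical); production systems typically get this from a resilience library or a service mesh policy, and a full breaker also has a half-open state that periodically probes the downstream service, omitted here for brevity.

```python
# Sketch of a circuit breaker: after repeated downstream failures,
# stop calling the failing service and return a fallback immediately.

class CircuitBreaker:
    """Fail fast after too many consecutive downstream failures."""

    def __init__(self, max_failures=3):
        self.max_failures = max_failures
        self.failures = 0

    def call(self, fn, *args, fallback=None):
        if self.failures >= self.max_failures:  # circuit is open
            return fallback                     # degrade gracefully, no call made
        try:
            result = fn(*args)
            self.failures = 0                   # success closes the circuit
            return result
        except ConnectionError:
            self.failures += 1
            return fallback

def flaky_inventory_api(sku):
    """Stand-in for a downstream service that is currently down."""
    raise ConnectionError("WMS is down")

breaker = CircuitBreaker(max_failures=3)
for _ in range(5):
    stock = breaker.call(flaky_inventory_api, "SKU-1", fallback="unknown")
print(stock)  # → unknown
```

The key behavior: after the third failure the breaker stops hitting the dead service entirely, so a WMS outage degrades gracefully instead of cascading timeouts through every caller.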
Row 5: Implementation
Phase 1: Build Order Service (Months 1-3)
# New Order Service API
@app.post("/api/v1/orders")
def create_order(request: OrderRequest):
    """Create a new order: validate customer, check inventory, process payment."""
    # Step 1: Validate customer
    customer = call_customer_api(request.customer_id)
    if not customer:
        raise CustomerNotFound()

    # Step 2: Check inventory (real-time)
    for item in request.items:
        inventory = call_inventory_api(item.sku)
        if inventory.quantity < item.quantity:
            raise OutOfStock(item.sku)

    # Step 3: Process payment (synchronous)
    payment = call_payment_api(customer.id, request.total)
    if not payment.successful:
        raise PaymentFailed()

    # Step 4: Create order
    order = Order(
        customer_id=customer.id,
        items=request.items,
        total=request.total,
        status='CONFIRMED',
    )
    db.save(order)

    # Step 5: Publish event (asynchronous)
    publish_event('OrderCreated', {
        'order_id': order.id,
        'customer_id': customer.id,
        'items': request.items,
    })
    return order

Phase 2: Build event subscribers (Months 4-6)
# Accounting service: listens for the OrderCreated event
@event_handler('OrderCreated')
def on_order_created(event):
    """Record revenue when an order is created."""
    invoice = Invoice(
        order_id=event['order_id'],
        customer_id=event['customer_id'],
        amount=sum(item['price'] * item['qty'] for item in event['items']),
        status='PENDING',
    )
    db.save(invoice)

# Notification service: listens for the OrderCreated event
@event_handler('OrderCreated')
def on_order_created(event):
    """Send a confirmation email to the customer."""
    customer = get_customer(event['customer_id'])
    send_email(
        to=customer.email,
        subject="Your order is confirmed",
        body=f"Order ID: {event['order_id']}",
    )

# Fulfillment service: listens for the OrderCreated event
@event_handler('OrderCreated')
def on_order_created(event):
    """Create a fulfillment task."""
    fulfillment = Fulfillment(
        order_id=event['order_id'],
        status='NEW',
        warehouse_id='wh-001',
    )
    db.save(fulfillment)

Phase 3: Data synchronization (Months 7-9)
# Apache Airflow DAG (declarative sketch): sync customer data to the data lake
dag_name: daily_customer_sync
schedule: "0 2 * * *"  # 2 AM daily
tasks:
  1_extract_salesforce:
    type: salesforce_api
    query: "SELECT * FROM Account WHERE modified >= yesterday"
    output: /data/salesforce/customers_raw.csv
  2_extract_sap:
    type: sap_api
    query: "SELECT * FROM KNA1 WHERE AEDAT >= yesterday"
    output: /data/sap/customers_raw.csv
  3_transform:
    type: python_script
    script: transform_customers.py
    inputs: [/data/salesforce/customers_raw.csv, /data/sap/customers_raw.csv]
    logic: |
      - Match on email (find duplicates)
      - Merge customer info (Salesforce master for CRM, SAP for orders)
      - Validate data quality
  4_load:
    type: snowflake
    target: data_lake.customers
    action: upsert  # update if exists, insert if new

Row 6: Operational Metrics
Column 1 (What): Data consistency
Customer master data consistency: 98% (target: 99.5%)
- Salesforce: 50,000 records
- SAP: 49,500 records (500 only in Salesforce, not in SAP)
- Action: Identify orphaned customers, clean up
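Identifying those orphaned records reduces to a set comparison between the two systems' customer IDs. A minimal sketch (the ID values and function name are illustrative):

```python
# Illustrative reconciliation: find customers present in one system but
# not the other by comparing ID sets.

def find_orphans(salesforce_ids, sap_ids):
    """Return IDs that exist in only one of the two systems."""
    sf, sap = set(salesforce_ids), set(sap_ids)
    return {'only_in_salesforce': sf - sap, 'only_in_sap': sap - sf}

orphans = find_orphans({'C1', 'C2', 'C3'}, {'C2', 'C3', 'C4'})
print(orphans['only_in_salesforce'])  # → {'C1'}
```

Run against full extracts, this is the core of the daily consistency check: the orphan lists feed the clean-up queue, and their size over total records gives the consistency percentage tracked above.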
Order-to-revenue sync: 100%
- All orders in Order service have corresponding revenue in accounting
- Daily reconciliation audit
Column 2 (How): API performance
Order API (create order):
- Latency (p95): 180ms (target: <200ms) ✓
- Throughput: 5,000 orders/minute (target: 2,000) ✓
- Error rate: 0.1% (target: <0.5%) ✓
Inventory API (check stock):
- Latency (p95): 45ms (target: <100ms) ✓
- Success rate: 99.8%
Column 3 (Where): Message queue health
Kafka brokers: 3 (production, staging, backup)
Topics: 12 (OrderCreated, PaymentProcessed, InventoryUpdated, etc.)
Message throughput: 100k/minute (capacity: 500k/minute)
Consumer lag: <5 seconds (how far behind real-time) ✓
Dead-letter queue: 23 messages (investigate and reprocess)
Column 4 (Who): Integration governance
API versioning:
- v1 (in use by 98% of consumers)
- v2 (new, in pilot)
- Deprecation: v1 sunset in 6 months
Data ownership clarity: 100%
- Customer data: CRM is master
- Order data: Order service is master
- Inventory data: Warehouse is master
Column 5 (When): Synchronization SLAs
Real-time APIs: <200ms (p95)
Event processing: <5 seconds (from event published to processed)
Batch sync: Within 2 hours of end-of-day
Column 6 (Why): Business impact
Manual intervention reduced: 40% → 5%
- Before: Customer service had to check 4 systems
- After: Single system of record
Order cycle time: 8 weeks → 2 weeks
- Real-time inventory visibility enables faster decision-making
- Automated fulfillment (vs. manual warehouse picking)
Revenue impact: +$50M (faster fulfillment, higher customer satisfaction)
Common Integration Anti-Patterns
- Point-to-point mesh (everything talks to everything): Creates unmaintainable spaghetti
- No versioning: APIs change, consumers break
- Synchronous everywhere: Tightly coupled, one failure cascades
- No error handling: Message lost if consumer down
- No monitoring: Can't see integration failures until customers complain
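The error-handling anti-pattern has a standard antidote: idempotent consumers. Brokers typically redeliver a message when a consumer fails mid-processing, so each handler must be safe to run twice. A sketch (event shape and names are illustrative; in production the seen-ID store would be a database table or Redis set, not process memory):

```python
# Sketch of an idempotent consumer: redelivered events are processed
# exactly once by keying on a unique event ID.

seen_event_ids = set()   # production: durable store, not process memory
invoices = []

def on_order_created(event):
    """Record revenue once, even if the broker delivers the event twice."""
    if event['event_id'] in seen_event_ids:
        return                      # duplicate delivery: ignore
    seen_event_ids.add(event['event_id'])
    invoices.append(event['order_id'])

on_order_created({'event_id': 'e-1', 'order_id': 42})
on_order_created({'event_id': 'e-1', 'order_id': 42})  # broker redelivery
print(len(invoices))  # → 1
```

Without this guard, a crash between "record invoice" and "acknowledge message" would bill the customer twice on redelivery.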
Key Takeaways
- Integration is architectural: Requires design (Row 3), not just tools (Row 4).
- APIs are contracts: Version them, document them, treat them as public interfaces.
- Asynchronous patterns reduce coupling: Events allow systems to fail and evolve independently.
- Master data management is critical: A single source of truth prevents chaos.
- Monitoring integration is essential: If you can't see it, you can't manage it.
Next Steps
- Inventory current applications and data flows (understand current state)
- Define master data entities (customer, product, order, etc.)
- Design API contract (what should systems communicate?)
Modern enterprises are built on integration. Design it well from the start.
