Lending Journey
- Anand Nerurkar
📌 End-to-End Parallel Lending Journey (Text Architecture Flow)
Step 1 – Login / Authentication
Persona: Customer (Parag Shah)
Action: Logs in to Lending App
Flow:
Frontend (Web/Mobile) → Azure API Management → Auth Microservice (AKS).
Auth MS validates Parag’s credentials with Azure AD B2C.
A JWT is issued, and the user profile is fetched from the PostgreSQL BDR customer_profile table.
Redis Cache Write:
Cache user profile: customer:{parag_id} → {name, email, mobile, KYC status, loan summary}
Kafka Topic:
Publish CustomerLoginEvent → consumed by Customer360 MS to update last login timestamp in PostgreSQL.
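The login-time cache write and the CustomerLoginEvent can be sketched as below; the field names and IDs are illustrative assumptions, not the exact production schema.

```python
import json
from datetime import datetime, timezone

def build_login_cache_entry(profile: dict) -> str:
    """Serialize the profile cached under customer:{id} after login."""
    return json.dumps({
        "name": profile["name"],
        "email": profile["email"],
        "mobile": profile["mobile"],
        "kycStatus": profile["kyc_status"],
        "loanSummary": profile.get("loan_summary", []),  # empty until a loan exists
    })

def build_login_event(customer_id: str) -> dict:
    """CustomerLoginEvent consumed by Customer360 to stamp the last login."""
    return {
        "eventType": "CustomerLoginEvent",
        "customerId": customer_id,
        "lastLogin": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical customer record fetched from PostgreSQL after Azure AD B2C auth
profile = {"name": "Parag Shah", "email": "parag@example.com",
           "mobile": "9876543210", "kyc_status": "verified"}
cache_key = "customer:cust-001"
cache_value = build_login_cache_entry(profile)
login_event = build_login_event("cust-001")
```

In production the `cache_value` would be written with a Redis SET (typically with a TTL), and `login_event` published to the Kafka topic.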
Step 2 – Loan Application Initiation
Persona: Parag Shah
Action: Starts loan application (selects product, amount)
Flow:
Loan Application MS writes to PostgreSQL BDR → loan_application table (status = "Initiated").
Publishes LoanInitiatedEvent to Kafka.
Consumers:
KYC MS → triggers KYC process.
Customer360 MS → updates loan summary in Cosmos DB.
Cosmos DB Insert:
{ "eventId": "evt-101", "customerId": "cust-001", "loanAppId": "loan-101", "eventType": "LoanInitiated", "timestamp": "2025-08-30T10:01:00Z", "payload": { "amount": 500000, "tenure": 36 } }
Redis Cache Update:
Update cache key customer:{parag_id}:loan_summary with {loanAppId, status=Initiated}.
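The Cosmos DB document and the Redis summary update above can be built as follows; this is a minimal sketch mirroring the sample JSON, with illustrative IDs.

```python
import json

def build_loan_initiated_event(event_id: str, customer_id: str, loan_app_id: str,
                               amount: int, tenure: int, timestamp: str) -> dict:
    """Event document inserted into Cosmos DB when a loan is initiated."""
    return {
        "eventId": event_id,
        "customerId": customer_id,
        "loanAppId": loan_app_id,
        "eventType": "LoanInitiated",
        "timestamp": timestamp,
        "payload": {"amount": amount, "tenure": tenure},
    }

doc = build_loan_initiated_event("evt-101", "cust-001", "loan-101",
                                 500000, 36, "2025-08-30T10:01:00Z")

# Redis loan-summary update keyed per customer
summary_key = "customer:cust-001:loan_summary"
summary_value = json.dumps({"loanAppId": doc["loanAppId"], "status": "Initiated"})
```

The same dict doubles as the LoanInitiatedEvent payload published to Kafka, which keeps the event store and the stream consistent by construction.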
Step 4 – Parallel Execution: KYC + Credit Score + Fraud Check
Action: After consent and document upload (the document-uploaded event from Step 3), the orchestrator publishes three request events in parallel.
Kafka Topics Fired Simultaneously:
kyc-check-requested
credit-score-requested
fraud-check-requested
Consumers:
| Topic | Consumer Service | Action | Cache Update (Redis) |
| --- | --- | --- | --- |
| kyc-check-requested | KYCService | Calls external KYC API → updates PostgreSQL kyc_result table | `SET kyc:paragshah '{"status":"completed","result":"pass"}'` |
| credit-score-requested | CreditScoreService | Calls credit bureau → inserts into credit_score_result table | `SET credit:paragshah '{"score":780,"status":"completed"}'` |
| fraud-check-requested | FraudDetectionService | Runs ML model → inserts into fraud_result table | `SET fraud:paragshah '{"risk":"low","status":"completed"}'` |
All three publish result events:
kyc-completed, credit-score-completed, fraud-check-completed
LoanEvaluationService listens to all three and waits until all results are in (fan-in pattern).
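The fan-in above can be sketched as a small aggregator: it buffers each result event and evaluates only once all three topics have reported. The eligibility thresholds (e.g. the 700 score cut-off) are illustrative assumptions, not the bank's actual rules.

```python
class LoanEvaluationAggregator:
    """Fan-in: collect the three result events, then decide."""
    REQUIRED = {"kyc-completed", "credit-score-completed", "fraud-check-completed"}

    def __init__(self):
        self.results: dict[str, dict] = {}

    def on_event(self, topic: str, payload: dict):
        """Record one result; return a decision only when all three are in."""
        self.results[topic] = payload
        if not self.REQUIRED <= self.results.keys():
            return None  # still waiting on at least one upstream result
        return self._evaluate()

    def _evaluate(self) -> dict:
        eligible = (
            self.results["kyc-completed"]["result"] == "pass"
            and self.results["credit-score-completed"]["score"] >= 700  # assumed cut-off
            and self.results["fraud-check-completed"]["risk"] == "low"
        )
        return {"status": "OFFER_GENERATED" if eligible else "REJECTED"}

agg = LoanEvaluationAggregator()
assert agg.on_event("kyc-completed", {"result": "pass"}) is None
assert agg.on_event("fraud-check-completed", {"risk": "low"}) is None
decision = agg.on_event("credit-score-completed", {"score": 780})
# decision == {"status": "OFFER_GENERATED"}
```

In the real system each `on_event` call would be driven by a Kafka consumer, and the buffered results would live in a durable store (e.g. Redis keyed by loanAppId) so the fan-in survives pod restarts.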
Step 5 – Loan Evaluation & Offer
Once all three results are available, LoanEvaluationService calculates eligibility:
Updates loan_application table → status = OFFER_GENERATED
Publishes Kafka event: loan-offer-generated
Consumers:
CacheUpdaterConsumer updates the cache: SET loan:123:status '{"status":"offer_generated","offer_amount":500000}'
CosmosDBWriterConsumer stores event for dashboard history.
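The CacheUpdaterConsumer's work can be sketched like this; a plain dict stands in for Redis, and in production the same write would be a SET against Azure Cache for Redis from a Kafka consumer loop.

```python
import json

# In-memory stand-in for Redis, used only for this sketch
fake_redis: dict[str, str] = {}

def on_loan_offer_generated(event: dict) -> None:
    """Mirror the loan-offer-generated event into the cache for fast reads."""
    key = f"loan:{event['loanId']}:status"
    fake_redis[key] = json.dumps({
        "status": "offer_generated",
        "offer_amount": event["offerAmount"],
    })

# Hypothetical event payload consumed from the loan-offer-generated topic
on_loan_offer_generated({"loanId": 123, "offerAmount": 500000})
```

The write is idempotent (a repeated event simply overwrites the same key), which is the behaviour you want for an at-least-once Kafka consumer.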
Step 6 – Loan Agreement & e-Sign
Parag reviews offer → clicks "Accept"
LoanAgreementService generates the agreement PDF in Azure Blob Storage → sends it for e-Sign
Updates PostgreSQL table loan_agreement → signed = true
Publishes Kafka event: loan-agreement-signed
Cache: SET loan:123:agreement '{"status":"signed"}'
Step 7 – Disbursement
DisbursementService is triggered → updates the disbursement table
Publishes Kafka event: loan-disbursed
Cache: SET loan:123:status '{"status":"disbursed","amount":500000}'
CosmosDB: Final event inserted → used for dashboard analytics.
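The status lifecycle implied by Steps 2–7 can be captured as a small transition table; the state names follow the text, but treating them as a guarded state machine (rather than free-form status writes) is an illustrative design choice.

```python
# Allowed loan-status transitions across the journey
ALLOWED: dict[str, set[str]] = {
    "Initiated": {"OFFER_GENERATED"},
    "OFFER_GENERATED": {"AGREEMENT_SIGNED"},
    "AGREEMENT_SIGNED": {"DISBURSED"},
    "DISBURSED": set(),  # terminal state
}

def advance(current: str, target: str) -> str:
    """Move the loan to the next status, rejecting illegal jumps."""
    if target not in ALLOWED.get(current, set()):
        raise ValueError(f"illegal transition {current} -> {target}")
    return target

state = "Initiated"
for nxt in ("OFFER_GENERATED", "AGREEMENT_SIGNED", "DISBURSED"):
    state = advance(state, nxt)
```

Guarding transitions this way in the services that update loan_application prevents out-of-order Kafka events (e.g. a late loan-offer-generated after disbursement) from corrupting the record.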
📊 Data Storage Across Layers
| Component | Sample Data for Parag Shah |
| --- | --- |
| PostgreSQL BDR | Tables: customer_profile, loan_application, kyc_result, credit_score_result, fraud_result, loan_agreement, disbursement |
| CosmosDB | loan_events container with JSON docs for each event (loan-application-initiated, kyc-completed, credit-score-completed, etc.) |
| Redis Cache | Keys: session:paragshah, loan:123:status, kyc:paragshah, credit:paragshah, fraud:paragshah → all stored dynamically |
| Kafka | Topics: loan-application-initiated, document-uploaded, kyc-check-requested, kyc-completed, credit-score-requested, credit-score-completed, fraud-check-requested, fraud-check-completed, loan-offer-generated, loan-agreement-signed, loan-disbursed |
⚡ Azure Components
API Gateway → Azure API Management
Orchestration → Azure Logic Apps or Durable Functions
Compute → AKS for all microservices
Data → Azure Database for PostgreSQL (BDR enabled), CosmosDB (NoSQL), Azure Redis Cache, Azure Blob
Event Streaming → Azure Event Hubs (Kafka interface)
Monitoring → Azure Monitor, Application Insights, Grafana
