Legacy Lending System (PL/SQL SP, Pro*C) Modernization
- Anand Nerurkar
Legacy State (Before Modernization)
PL/SQL Packages:
Business logic for loan eligibility calculation was embedded inside Oracle PL/SQL stored procedures.
Example: a procedure CALC_LOAN_ELIGIBILITY that fetched customer salary and liabilities and calculated the debt-to-income (DTI) ratio.
Pro*C Components:
Pro*C programs interacting with Oracle DB for batch processing of loan applications.
Example: nightly batch jobs that checked pending applications and updated loan status.
Challenges:
Tight coupling of business logic with the database.
Limited scalability & performance bottlenecks during peak load.
Hard to expose APIs for digital channels (mobile/web).
Target State (After Modernization with Java Microservices)
Modernization Approach:
One of the critical PL/SQL packages handled loan eligibility rules. We extracted that business logic, rewrote it in Java Spring Boot, exposed it as a LoanEvaluation REST API, and ensured it was stateless. The service used PostgreSQL for persistence and Kafka for event-driven communication. This eliminated dependency on Oracle DB and aligned with our cloud-native roadmap.
Analysis with CAST & vFunction
Used CAST to perform code inventory analysis and identify dependencies between PL/SQL procedures, Pro*C modules, and application code.
Classified stored procedures into categories:
Simple CRUD → move to ORM (Hibernate/JPA in Java); see the sketch after this list.
Business rules → refactor into domain-driven microservices.
Reporting/ETL → migrate to data pipeline (Kafka + Spark/Azure Data Factory).
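As a minimal sketch of the first category, assuming Spring Boot 3 (Jakarta Persistence) and illustrative entity fields, a simple lookup-style stored procedure collapses into a JPA entity plus a Spring Data repository. This CustomerRepository is the same dependency the LoanEligibilityController shown later autowires:
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import org.springframework.data.jpa.repository.JpaRepository;
import java.math.BigDecimal;
import java.util.List;

// Illustrative entity mapped to a PostgreSQL table; field names are assumptions
@Entity
@Table(name = "customer")
public class Customer {
    @Id
    private Long id;
    private String fullName;
    private BigDecimal monthlySalary;
    private BigDecimal monthlyLiabilities;
    // getters and setters omitted for brevity
}

// Spring Data JPA generates the CRUD SQL that the PL/SQL procedures used to hand-code
interface CustomerRepository extends JpaRepository<Customer, Long> {
    List<Customer> findByFullNameContainingIgnoreCase(String name);
}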
Refactoring PL/SQL Procedure Example
The loan eligibility PL/SQL procedure was refactored into a LoanEligibility Microservice in Spring Boot.
Instead of directly calling PL/SQL, the service now:
Exposes a REST API at /eligibility.
Uses Spring Data JPA to fetch customer & loan details.
Calls an external CreditScore microservice (instead of a DB join).
Applies business rules in Java code (a rule engine like Drools, or a custom rules service).
Before (PL/SQL)
-- 1. Eligibility check based on the bureau credit score
CREATE OR REPLACE PROCEDURE check_eligibility(cust_id IN NUMBER, result OUT VARCHAR2) IS
  credit_score NUMBER;
BEGIN
  SELECT score INTO credit_score FROM credit_bureau WHERE customer_id = cust_id;
  IF credit_score > 700 THEN
    result := 'APPROVED';
  ELSE
    result := 'REJECTED';
  END IF;
END;
/

-- 2. EMI calculation (monthly rate = annual rate / 12 / 100)
CREATE OR REPLACE PROCEDURE calculate_emi(
  loan_amt IN NUMBER,
  rate     IN NUMBER,
  tenure   IN NUMBER,
  emi      OUT NUMBER
) AS
BEGIN
  emi := (loan_amt * rate / 1200) / (1 - POWER(1 + rate / 1200, -tenure));
END;
/
After (Java Microservice)
// 1. Eligibility check exposed as a REST API
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/eligibility")
public class LoanEligibilityController {

    @Autowired private CustomerRepository customerRepo;
    @Autowired private CreditScoreClient creditScoreClient;

    @GetMapping("/{custId}")
    public String checkEligibility(@PathVariable Long custId) {
        // Credit score now comes from the CreditScore microservice, not a DB join
        int creditScore = creditScoreClient.getScore(custId);
        return creditScore > 700 ? "APPROVED" : "REJECTED";
    }
}
// 2. EMI calculation exposed as a REST API
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/loan")
public class LoanController {

    @GetMapping("/emi")
    public double calculateEmi(@RequestParam double loanAmt,
                               @RequestParam double rate,
                               @RequestParam int tenure) {
        // Same formula as the PL/SQL version: monthly rate = rate / 1200
        return (loanAmt * (rate / 1200)) / (1 - Math.pow(1 + (rate / 1200), -tenure));
    }
}
API Call Example
GET /loan/emi?loanAmt=100000&rate=8.5&tenure=60
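For the request above, the formula yields an EMI of roughly 2,052 per month (8.5% annual rate → about 0.708% monthly over 60 instalments).
The CreditScoreClient autowired into LoanEligibilityController is not shown in the original snippet; a minimal sketch, assuming Spring Cloud OpenFeign and a hypothetical /scores/{custId} endpoint on the CreditScore microservice, could look like this:
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;

// Declarative HTTP client; replaces the former SQL join against the credit_bureau table.
// Service name and path are assumptions for illustration.
@FeignClient(name = "credit-score-service", path = "/scores")
public interface CreditScoreClient {

    @GetMapping("/{custId}")
    int getScore(@PathVariable("custId") Long custId);
}
With @EnableFeignClients on the application class, Spring generates the client implementation at runtime.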
Refactoring Pro*C Example
A Pro*C batch job that performed daily settlement was migrated to:
Kafka-based event streaming for real-time processing.
Spring Batch microservice for scheduled tasks.
Earlier: Pro*C batch jobs running nightly.
Now: Event-driven architecture using Kafka.
Example: When a loan application is submitted → loan.initiated event is published.
LoanEligibilityService consumes the event, performs the eligibility check in real time, and publishes the result to loan.eligibility.checked.
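A minimal sketch of this flow, assuming Spring Kafka with plain String payloads (the listener class name and payload shape are illustrative; the topic names follow the article):
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Replaces the nightly Pro*C batch: each application is evaluated as soon as it arrives
@Service
public class LoanEligibilityListener {

    private final KafkaTemplate<String, String> kafkaTemplate;
    private final CreditScoreClient creditScoreClient;

    public LoanEligibilityListener(KafkaTemplate<String, String> kafkaTemplate,
                                   CreditScoreClient creditScoreClient) {
        this.kafkaTemplate = kafkaTemplate;
        this.creditScoreClient = creditScoreClient;
    }

    @KafkaListener(topics = "loan.initiated", groupId = "loan-eligibility-service")
    public void onLoanInitiated(String custId) {
        // Same rule as the REST endpoint, applied per event instead of per nightly batch
        int creditScore = creditScoreClient.getScore(Long.valueOf(custId));
        String result = creditScore > 700 ? "APPROVED" : "REJECTED";
        kafkaTemplate.send("loan.eligibility.checked", custId, result);
    }
}
Work that genuinely must stay batch-oriented (e.g., end-of-day settlement) moves to the Spring Batch microservice mentioned above rather than this listener.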
Database Migration
Data from Oracle was migrated to PostgreSQL (with schema optimization).
Migrated customer and loan tables to PostgreSQL using Azure DMS.
Removed Oracle-specific constructs (e.g., sequences, PL/SQL packages).
Used Liquibase/Flyway for schema versioning (see the sketch after this list).
Stored procedure logic was refactored into Java microservices (business logic in code, not DB).
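For the schema-versioning step, a minimal sketch using Flyway's Java-based migration API (the table and index names are illustrative assumptions):
import org.flywaydb.core.api.migration.BaseJavaMigration;
import org.flywaydb.core.api.migration.Context;
import java.sql.Statement;

// Versioned schema change, picked up by Flyway alongside plain SQL migrations
public class V2__Add_loan_application_status_index extends BaseJavaMigration {

    @Override
    public void migrate(Context context) throws Exception {
        try (Statement statement = context.getConnection().createStatement()) {
            // Index and table names are assumptions for illustration
            statement.execute("CREATE INDEX idx_loan_application_status ON loan_application (status)");
        }
    }
}
In practice most changes ship as plain versioned SQL scripts (V1__baseline.sql and so on) applied by the same Flyway step in the CI/CD pipeline.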
Automation, Cloud Adoption, AI/ML
Infra provisioning → Terraform + Azure DevOps pipeline.
Security → DevSecOps pipeline with SAST/DAST scans.
AI/ML Model: Predict loan default probability (enrichment to eligibility check).
GenAI (RAG Layer): Build knowledge base for loan officers.
Agentic AI: Auto-trigger manual review assignment if eligibility is borderline.
Outcome:
✅ Reduced dependency on Oracle DB → cut license costs.
✅ Improved scalability and agility (microservices deployed on AKS/EKS/GKE).
✅ Reduced loan eligibility processing time from hours (batch) to seconds (real-time microservices).
✅ Faster change cycle (business rules now configurable in microservices).
✅ Enabled cloud-native data pipelines instead of batch Pro*C jobs.
✅ Decoupled business logic from the database → improved agility.
✅ Enabled omnichannel API access (mobile, web, partner apps).