
Pro*C Module to Java Microservices

  • Writer: Anand Nerurkar
  • 5 min read

End-to-end example of migrating a Pro*C module to a Java microservice (target: PostgreSQL on Azure). I’ll use a realistic use case and show the mapping from Pro*C constructs to modern Java, plus the steps you’d actually run in a migration program.


Use case

Legacy Pro*C module: “Repayment Schedule Generator”

  • Reads a LOAN record from Oracle (ExaCC)

  • Calculates EMI and an amortization schedule

  • Writes rows into REPAYMENT_SCHEDULE

  • Runs in batch with simple error handling via SQLCA/WHENEVER

Target: Spring Boot microservice (repayment-svc) on Azure, writing to Azure Database for PostgreSQL.

1) Inventory & analysis (what you do first)

  1. Locate the Pro*C programs (usually .pc files) and list:

    • Entry points (main functions), include files, global variables

    • SQL statements (SELECT/INSERT/UPDATE/COMMIT/ROLLBACK)

    • Cursors, array/bulk DML, dynamic SQL usage

    • Error handling (WHENEVER SQLERROR, sqlca.sqlcode)

    • External dependencies (config files, flat files)

  2. Data model used:

    • Tables: LOAN, REPAYMENT_SCHEDULE, sequences, triggers

    • Oracle types (NUMBER, DATE, VARCHAR2, CLOB)

  3. Runtime/ops:

    • How it’s scheduled (cron), input parameters, logging, SLA

  4. Decide migration approach:

    • Re-implement logic in a domain microservice

    • Move “calculation” out of DB into service code

    • Replace Oracle specifics (sequences, date math, hints)

Tools like CAST help extract call graphs and SQL hotspots; vFunction helps group code into domain services (e.g., “repayment”, “underwriting”, “customer”).

2) Typical Pro*C pattern (simplified legacy snippet)

/* repayment.pc */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

EXEC SQL INCLUDE sqlca;

EXEC SQL BEGIN DECLARE SECTION;
long    loan_id;
double  principal;
double  annual_rate;
int     tenor_months;
int     m;
double  emi, interest, principal_comp, bal;
char    user[31], pwd[31];
EXEC SQL END DECLARE SECTION;

void die() {
  fprintf(stderr, "SQL error: %ld\n", sqlca.sqlcode);
  EXEC SQL ROLLBACK RELEASE;
  exit(1);
}

int main(int argc, char** argv) {
  if (argc < 2) { fprintf(stderr, "usage: repayment <loan_id>\n"); return 1; }
  loan_id = atol(argv[1]);

  /* credentials typically loaded from environment/config */
  EXEC SQL CONNECT :user IDENTIFIED BY :pwd;
  EXEC SQL WHENEVER SQLERROR DO die();

  /* fetch loan */
  EXEC SQL SELECT principal, annual_rate, tenor_months
           INTO :principal, :annual_rate, :tenor_months
           FROM LOAN
           WHERE loan_id = :loan_id;

  double r = annual_rate / 12.0 / 100.0;  /* monthly rate; used only in C code */
  emi = principal * r * pow(1+r, tenor_months) / (pow(1+r, tenor_months) - 1);

  /* delete existing schedule */
  EXEC SQL DELETE FROM REPAYMENT_SCHEDULE WHERE loan_id = :loan_id;

  bal = principal;
  for (m = 1; m <= tenor_months; m++) {
     interest = bal * r;
     principal_comp = emi - interest;
     bal -= principal_comp;

     EXEC SQL INSERT INTO REPAYMENT_SCHEDULE
       (loan_id, installment_no, emi, interest_comp, principal_comp, balance)
       VALUES (:loan_id, :m, :emi, :interest, :principal_comp, :bal);
  }

  EXEC SQL COMMIT RELEASE;
  return 0;
}

What matters here (to map later):

  • Host variables → Java DTOs

  • WHENEVER SQLERROR/sqlca → Java exceptions

  • Cursor loop → Java loop over tenor

  • Array/batch inserts → JDBC batch

  • COMMIT → @Transactional

  • Oracle types/date handling → PG types & Java LocalDate/BigDecimal
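On the first mapping above, the host-variable block translates naturally to a small immutable carrier. A minimal sketch (the `LoanInput` name and shape are illustrative, not from the original module), with `BigDecimal` replacing `double` for money fields:

```java
import java.math.BigDecimal;

// Hypothetical DTO mirroring the Pro*C host-variable block;
// BigDecimal replaces double for the money fields.
record LoanInput(long loanId,
                 BigDecimal principal,
                 BigDecimal annualRate,
                 int tenorMonths) {}
```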

3) Schema migration Oracle → PostgreSQL (high level)

Oracle:

CREATE TABLE LOAN (
  LOAN_ID        NUMBER PRIMARY KEY,
  PRINCIPAL      NUMBER(18,2),
  ANNUAL_RATE    NUMBER(5,2),
  TENOR_MONTHS   NUMBER(5),
  START_DATE     DATE
);

CREATE TABLE REPAYMENT_SCHEDULE (
  LOAN_ID          NUMBER,
  INSTALLMENT_NO   NUMBER,
  EMI              NUMBER(18,2),
  INTEREST_COMP    NUMBER(18,2),
  PRINCIPAL_COMP   NUMBER(18,2),
  BALANCE          NUMBER(18,2),
  DUE_DATE         DATE
);

PostgreSQL (generated via ora2pg, then reviewed manually):

CREATE TABLE loan (
  loan_id        BIGINT PRIMARY KEY,
  principal      NUMERIC(18,2),
  annual_rate    NUMERIC(5,2),
  tenor_months   INTEGER,
  start_date     DATE
);

CREATE TABLE repayment_schedule (
  loan_id          BIGINT REFERENCES loan(loan_id),
  installment_no   INTEGER,
  emi              NUMERIC(18,2),
  interest_comp    NUMERIC(18,2),
  principal_comp   NUMERIC(18,2),
  balance          NUMERIC(18,2),
  due_date         DATE,
  PRIMARY KEY (loan_id, installment_no)
);

Notes:

  • NUMBER → NUMERIC, DATE → DATE/TIMESTAMP as needed

  • Add composite PK; move DB logic from triggers into service code

  • Use Azure DMS to migrate existing data

4) Target microservice (Spring Boot + PostgreSQL)

Tech choices

  • Spring Boot 3.x, Spring Web, Spring Data JDBC/JPA or jOOQ (explicit SQL is nice for finance)

  • Postgres driver, HikariCP

  • @Transactional for atomic schedule rebuild

  • For batch inserts: JDBC batch or COPY (for very large schedules)
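The JDBC batch pattern boils down to `addBatch()` per row and one `executeBatch()` per fixed-size slice. The slicing itself can be sketched (and tested) without a live connection; the `Batching` class below is an illustrative helper, not a library API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the chunking behind a JDBC batch insert: slice rows into
// fixed-size batches, then run one executeBatch() per slice.
final class Batching {
  static <T> List<List<T>> partition(List<T> rows, int batchSize) {
    List<List<T>> batches = new ArrayList<>();
    for (int i = 0; i < rows.size(); i += batchSize) {
      batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
    }
    return batches;
  }
  // With a PreparedStatement ps, each batch would then be:
  //   for (Row r : batch) { bind(ps, r); ps.addBatch(); }
  //   ps.executeBatch();
}
```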

Entities/records (example with JPA)

@Entity
@Table(name="loan")
public class Loan {
  @Id Long loanId;
  @Column(precision=18, scale=2) BigDecimal principal;
  @Column(precision=5, scale=2)  BigDecimal annualRate;
  Integer tenorMonths;
  LocalDate startDate;
  // getters/setters
}

@Entity
@Table(name="repayment_schedule")
@IdClass(RepaymentId.class) // composite key class: loanId + installmentNo
public class Repayment {
  @Id Long loanId;
  @Id Integer installmentNo;
  @Column(precision=18, scale=2) BigDecimal emi;
  @Column(precision=18, scale=2) BigDecimal interestComp;
  @Column(precision=18, scale=2) BigDecimal principalComp;
  @Column(precision=18, scale=2) BigDecimal balance;
  LocalDate dueDate;
  // getters/setters
}

Service logic (maps the Pro*C loop)

@Service
public class RepaymentService {

  private final LoanRepository loanRepo;
  private final RepaymentRepository repaymentRepo;

  public RepaymentService(LoanRepository loanRepo, RepaymentRepository repaymentRepo) {
    this.loanRepo = loanRepo;
    this.repaymentRepo = repaymentRepo;
  }

  @Transactional
  public List<Repayment> regenerateSchedule(Long loanId) {
    Loan loan = loanRepo.findById(loanId)
         .orElseThrow(() -> new NotFoundException("Loan not found"));

    BigDecimal r = loan.getAnnualRate()
        .divide(BigDecimal.valueOf(1200), 12, RoundingMode.HALF_UP); // monthly rate as a fraction

    int n = loan.getTenorMonths();
    BigDecimal onePlusRPowerN =
        BigDecimal.ONE.add(r).pow(n, new MathContext(20, RoundingMode.HALF_UP));
    BigDecimal emi = loan.getPrincipal()
        .multiply(r).multiply(onePlusRPowerN)
        .divide(onePlusRPowerN.subtract(BigDecimal.ONE), 2, RoundingMode.HALF_UP);

    // delete existing schedule
    repaymentRepo.deleteByLoanId(loanId);

    BigDecimal bal = loan.getPrincipal();
    LocalDate due = loan.getStartDate().plusMonths(1);

    List<Repayment> out = new ArrayList<>(n);
    for (int m = 1; m <= n; m++) {
      BigDecimal interest = bal.multiply(r).setScale(2, RoundingMode.HALF_UP);
      BigDecimal principalComp = emi.subtract(interest).setScale(2, RoundingMode.HALF_UP);
      bal = bal.subtract(principalComp).setScale(2, RoundingMode.HALF_UP);

      Repayment row = new Repayment();
      row.setLoanId(loanId);
      row.setInstallmentNo(m);
      row.setEmi(emi);
      row.setInterestComp(interest);
      row.setPrincipalComp(principalComp);
      row.setBalance(bal.max(BigDecimal.ZERO)); // avoid negative last cent
      row.setDueDate(due);

      out.add(row);
      due = due.plusMonths(1);
    }

    // batch save
    repaymentRepo.saveAll(out);
    return out;
  }
}
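A useful parity check for this logic is to re-run the arithmetic standalone and verify the amortization invariant: after n installments the residual balance should be within a few cents of zero. A sketch (the class and figures are illustrative; 1,000,000 at 12% annual over 12 months works out to an EMI of about 88,848.79):

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

// Standalone re-run of the service arithmetic, usable as a parity
// check against the legacy Pro*C output.
final class AmortizationCheck {
  static BigDecimal emi(BigDecimal principal, BigDecimal annualRate, int n) {
    BigDecimal r = annualRate.divide(BigDecimal.valueOf(1200), 12, RoundingMode.HALF_UP);
    BigDecimal f = BigDecimal.ONE.add(r).pow(n, new MathContext(20, RoundingMode.HALF_UP));
    return principal.multiply(r).multiply(f)
        .divide(f.subtract(BigDecimal.ONE), 2, RoundingMode.HALF_UP);
  }

  // Residual balance after n installments; should be within a few
  // cents of zero if the schedule is internally consistent.
  static BigDecimal residual(BigDecimal principal, BigDecimal annualRate, int n) {
    BigDecimal r = annualRate.divide(BigDecimal.valueOf(1200), 12, RoundingMode.HALF_UP);
    BigDecimal installment = emi(principal, annualRate, n);
    BigDecimal bal = principal;
    for (int m = 1; m <= n; m++) {
      BigDecimal interest = bal.multiply(r).setScale(2, RoundingMode.HALF_UP);
      bal = bal.subtract(installment.subtract(interest)).setScale(2, RoundingMode.HALF_UP);
    }
    return bal;
  }
}
```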

API endpoints

@RestController
@RequestMapping("/loans")
public class RepaymentController {
  private final RepaymentService service;

  public RepaymentController(RepaymentService service) {
    this.service = service;
  }

  @PostMapping("/{id}/schedule:regenerate")
  public List<RepaymentDto> regenerate(@PathVariable Long id) {
    return service.regenerateSchedule(id).stream()
        .map(RepaymentDto::fromEntity)
        .toList();
  }

  @GetMapping("/{id}/schedule")
  public List<RepaymentDto> get(@PathVariable Long id) {
    return service.getSchedule(id).stream()
        .map(RepaymentDto::fromEntity)
        .toList();
  }
}

5) Mapping cheat-sheet (Pro*C → Java)

  • Host variables (EXEC SQL BEGIN DECLARE SECTION) → DTOs/entities/method params

  • WHENEVER SQLERROR / sqlca → Exceptions + @Transactional(rollbackFor=…)

  • Cursors + loops → JDBC/JPA queries + Java loops

  • Array DML (INSERT … VALUES (:arr)) → JDBC batch / saveAll

  • COMMIT/ROLLBACK → Spring @Transactional

  • Sequences (NEXTVAL) → PG sequences / GENERATED BY DEFAULT AS IDENTITY

  • DATE arithmetic → Java LocalDate, LocalDateTime

  • PL/SQL packages → Service classes or functions

  • Hints / proprietary SQL → Standard SQL / PG functions

  • NLS formats → ISO date/time; explicit formatters
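One subtle difference behind the DATE-arithmetic row: Oracle’s ADD_MONTHS snaps to month-end whenever the input is the last day of a month (ADD_MONTHS of 28-FEB gives 31-MAR), while java.time only clamps when the target month is shorter. Worth a regression test during parity runs; a small sketch (the `DateMath` helper is illustrative):

```java
import java.time.LocalDate;

// java.time clamps the day-of-month rather than snapping to month-end:
// Jan 31 + 1 month -> Feb 28/29, but Feb 28 + 1 month -> Mar 28
// (Oracle ADD_MONTHS would give Mar 31 for the latter).
final class DateMath {
  static LocalDate nextDue(LocalDate due) {
    return due.plusMonths(1);
  }
}
```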

6) Data migration (Oracle ExaCC → Azure PostgreSQL)

  1. Schema: ora2pg (export DDL) → review → apply to PG.

  2. Data: Azure DMS online migration (or export/import for small sets).

  3. Dual-run strategy (optional):

    • Use CDC (GoldenGate/Debezium) from Oracle to PG while validating

    • Shadow write schedule in both systems during transition

  4. Cutover: freeze writes → final sync → switch microservice to PG.

7) Testing & parity

  • Golden dataset: same input loans in Oracle & PG, compare EMI and last-installment balance (allow tiny rounding deltas).

  • Query parity: ensure indexes & execution plans keep SLAs.

  • Backfill schedule: run for existing loans, verify counts and sums.
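The “allow tiny rounding deltas” rule can be expressed as a simple epsilon comparison when diffing Oracle vs PostgreSQL outputs; a minimal sketch (the `Parity` helper and the one-cent epsilon are illustrative choices):

```java
import java.math.BigDecimal;

// Treat two monetary values as equal if they differ by at most eps
// (e.g. one cent), to absorb rounding differences between engines.
final class Parity {
  static boolean withinDelta(BigDecimal a, BigDecimal b, BigDecimal eps) {
    return a.subtract(b).abs().compareTo(eps) <= 0;
  }
}
```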

8) Non-functional & ops

  • Performance: use JDBC batch (size 500–1000), proper indexes, connection pooling.

  • Precision: always use BigDecimal and fixed scales for finance.

  • Observability: log correlation ID per loan, timings, metrics (schedule gen ms/loan).

  • Security: parameterized SQL only; Key Vault/Managed Identity for secrets; network via Private Link; Zero Trust.

  • Resilience: idempotent regeneration, retry policies, circuit breakers.

9) What changed (and why it’s better)

  • Business logic moved out of the DB into a clean domain service you can test and scale.

  • Vendor-neutral SQL; easier to move clouds.

  • API-first enables other channels (mobile/web/BPMS) to call the service.

  • Faster CI/CD and safer rollouts; observability built-in.

Bonus: If the Pro*C used dynamic SQL or bulk loaders

  • Dynamic SQL → PreparedStatements with named params (or jOOQ DSL).

  • SQL*Loader → PostgreSQL COPY or COPY FROM STDIN via PGJDBC for high-volume loads.

  • Pipes/external tables → Event-driven ingestion (Blob Storage → Function → Service).
