Credit Score NLP with Ollama
- Anand Nerurkar
We will walk through a scenario where a customer uses a copilot-style assistant to query information from a structured data set, for example "What is my credit score?". The Ollama model then responds in natural language.
Objective
-----
👉 This is a complete, compilable, enterprise-style Spring Boot GenAI POC with:
Prompt stored as code in Git
Prompt registry
Fake microservice API
Prompt injection
Local Ollama call
Swagger UI testing
Model + prompt version returned in API
Assumptions:
1. We have a contextual API that returns the credit score for a given customer id. This is hardcoded for the POC; in a real enterprise it would call the CIBIL API to get the score.
2. Java 17 and Maven are installed locally.
3. Ollama is installed locally.
How to install Ollama locally and start the server
Below are clear, OS-specific steps to install Ollama locally and run the LLM server. This walkthrough covers Windows.
✅ 1. INSTALL OLLAMA LOCALLY (WINDOWS)
🔹 Step 1: Download Installer
Go to: https://ollama.com/download
Click: Download for Windows
This downloads:
OllamaSetup.exe
🔹 Step 2: Install
Double-click OllamaSetup.exe
Follow the usual Next → Next → Install
After install, RESTART YOUR PC (important)
✅ 2. VERIFY OLLAMA IS INSTALLED
Open Command Prompt (CMD), not PowerShell, and run:
ollama --version
✅ If installed correctly, you should see:
ollama version x.x.x
❌ If you still get:
'ollama' is not recognized
Then Ollama is NOT added to PATH; restart your PC or re-run the installer so the PATH entry is picked up.
✅ 3. START OLLAMA SERVER
Ollama automatically runs as a background service on Windows.
To verify server is running:
curl http://localhost:11434/api/tags
✅ If running, you'll get JSON with the available models.
❌ If not running, restart Windows OR run:
ollama serve
(keep this window open)
✅ 4. DOWNLOAD A MODEL (IMPORTANT)
Pull a LOCAL model (a cloud-only model won't work offline). This POC uses Qwen:
✅ Recommended LOCAL MODELS
Use ONE of these:
ollama pull qwen2.5:7b
OR
ollama pull llama3:8b
OR (low RAM)
ollama pull phi3:mini
This downloads the model to your machine.
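To confirm the model is available locally, list the installed models:
ollama list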
✅ 5. RUN MODEL ONCE (WARMUP)
ollama run qwen2.5:7b
Type:
Hello
If it responds, ✔️ your LLM is working.
Exit with:
/bye
✅ 6. CONFIGURE YOUR SPRING BOOT APP
Update your file:
📄 application.yml
ollama:
  url: http://localhost:11434/api/generate
  model: qwen2.5:7b
⚠️ Do NOT use:
qwen3-v1-235b-cloud
That is cloud-only and will fail locally.
✅ 7. TEST OLLAMA API DIRECTLY (IMPORTANT)
Before running your Spring app, test:
curl http://localhost:11434/api/generate ^
-d "{\"model\":\"qwen2.5:7b\",\"prompt\":\"What is my credit score?\",\"stream\":false}"
✅ You should receive JSON with:
"response": "..."
If this works → your Spring app WILL WORK.
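For reference, a non-streaming /api/generate reply typically looks like the abridged sketch below (the exact field set can vary by Ollama version, and the response text is illustrative):

{
  "model": "qwen2.5:7b",
  "created_at": "2025-01-01T10:00:00Z",
  "response": "Your credit score information is ...",
  "done": true
}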
Project Structure
===
genai-credit-poc/
│
├── pom.xml
│
└── src/
    └── main/
        ├── java/
        │   └── com/
        │       └── digitallending/
        │           │
        │           ├── Application.java
        │           │
        │           ├── controller/
        │           │   ├── ChatController.java
        │           │   └── FakeScoreApiController.java
        │           │
        │           ├── model/
        │           │   ├── ChatRequest.java
        │           │   ├── ChatResponse.java
        │           │   └── CustomerScore.java
        │           │
        │           ├── prompt/
        │           │   ├── PromptBuilder.java
        │           │   └── PromptRegistry.java
        │           │
        │           └── service/
        │               ├── ChatOrchestratorService.java
        │               ├── CreditScoreService.java
        │               └── OllamaClient.java
        │
        └── resources/
            ├── application.yml
            └── prompts/
                └── credit-score-nlp-v1.0.prompt
Swagger UI: http://localhost:8080/swagger-ui/index.html

Architecture Flow
===
User (Swagger UI)
      |
      v
ChatController
      |
      v
ChatOrchestratorService
      |
      +--> CreditScoreService → FakeScoreApi
      |
      +--> PromptRegistry → PromptBuilder (Prompt as Code)
      |
      +--> OllamaClient → Local LLM Server
      |
      v
GenAI NLP Response
Folders & Files
====
Project & Config
pom.xml
src/main/resources/application.yml
src/main/resources/prompts/credit-score-nlp-v1.0.prompt
Main App
Application.java
Controllers
ChatController.java (Swagger entry point)
FakeScoreApiController.java (fake internal score API)
Models
ChatRequest.java, ChatResponse.java, CustomerScore.java
Prompt as Code
PromptRegistry.java, PromptBuilder.java
Services
ChatOrchestratorService.java, CreditScoreService.java, OllamaClient.java
✅ Mapping of Files → Folders (So You Don’t Misplace Anything)
| File | Exact Location |
| --- | --- |
| pom.xml | /genai-credit-poc/pom.xml |
| Application.java | /src/main/java/com/digitallending/Application.java |
| ChatController.java | /src/main/java/com/digitallending/controller/ChatController.java |
| FakeScoreApiController.java | /src/main/java/com/digitallending/controller/FakeScoreApiController.java |
| CustomerScore.java | /src/main/java/com/digitallending/model/CustomerScore.java |
| ChatRequest.java | /src/main/java/com/digitallending/model/ChatRequest.java |
| ChatResponse.java | /src/main/java/com/digitallending/model/ChatResponse.java |
| PromptRegistry.java | /src/main/java/com/digitallending/prompt/PromptRegistry.java |
| PromptBuilder.java | /src/main/java/com/digitallending/prompt/PromptBuilder.java |
| CreditScoreService.java | /src/main/java/com/digitallending/service/CreditScoreService.java |
| OllamaClient.java | /src/main/java/com/digitallending/service/OllamaClient.java |
| ChatOrchestratorService.java | /src/main/java/com/digitallending/service/ChatOrchestratorService.java |
| application.yml | /src/main/resources/application.yml |
| credit-score-nlp-v1.0.prompt | /src/main/resources/prompts/credit-score-nlp-v1.0.prompt |
✅ ROOT
📄 genai-credit-poc/pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.digitallending</groupId>
    <artifactId>genai-credit-poc</artifactId>
    <version>1.0.0</version>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.2</version>
    </parent>

    <properties>
        <java.version>17</java.version>
    </properties>

    <dependencies>
        <!-- Spring Web -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <!-- Swagger UI -->
        <dependency>
            <groupId>org.springdoc</groupId>
            <artifactId>springdoc-openapi-starter-webmvc-ui</artifactId>
            <version>2.3.0</version>
        </dependency>

        <!-- JSON -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </dependency>
    </dependencies>
</project>
✅ src/main/resources
📄 src/main/resources/application.yml
server:
  port: 8080

ollama:
  url: http://localhost:11434/api/generate
  model: qwen2.5:7b
📄 src/main/resources/prompts/credit-score-nlp-v1.0.prompt
You are a professional digital banking assistant.
Convert the following structured financial data into a polite,
customer-friendly explanation.
DATA:
Customer Id: {{customerId}}
Credit Score: {{creditScore}}
Fraud Score: {{fraudScore}}
Rules:
- Do NOT change numeric values
- Use BFSI compliant language
- Do NOT provide financial advice
- Keep response under 3 sentences
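For example, for CUST1001 (credit score 720, fraud score 11 in the fake score API below), the PromptBuilder resolves the placeholders so the DATA block sent to the model reads:

Customer Id: CUST1001
Credit Score: 720
Fraud Score: 11

The rest of the template (instructions and rules) is sent unchanged.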
✅ src/main/java/com/digitallending
📄 src/main/java/com/digitallending/Application.java
package com.digitallending;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
✅ controller
📄 src/main/java/com/digitallending/controller/ChatController.java
package com.digitallending.controller;

import com.digitallending.model.ChatRequest;
import com.digitallending.model.ChatResponse;
import com.digitallending.service.ChatOrchestratorService;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/api/chat")
public class ChatController {

    private final ChatOrchestratorService orchestrator;

    public ChatController(ChatOrchestratorService orchestrator) {
        this.orchestrator = orchestrator;
    }

    @PostMapping("/message")
    public ChatResponse chat(@RequestHeader("X-Customer-Id") String customerId,
                             @RequestBody ChatRequest request) {
        return orchestrator.process(customerId, request.getMessage());
    }
}
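Once the app is running, this endpoint can also be exercised outside Swagger; a sample call in the same Windows curl style used earlier:

curl http://localhost:8080/api/chat/message ^
  -H "Content-Type: application/json" ^
  -H "X-Customer-Id: CUST1001" ^
  -d "{\"message\":\"What is my credit score?\"}"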
📄 src/main/java/com/digitallending/controller/FakeScoreApiController.java
package com.digitallending.controller;

import com.digitallending.model.CustomerScore;
import org.springframework.web.bind.annotation.*;

import java.util.Map;

@RestController
@RequestMapping("/api/internal/scores")
public class FakeScoreApiController {

    private static final Map<String, CustomerScore> DATA = Map.of(
            "CUST1001", new CustomerScore("CUST1001", 720, 11),
            "CUST1002", new CustomerScore("CUST1002", 650, 28),
            "CUST1003", new CustomerScore("CUST1003", 540, 65)
    );

    @GetMapping("/{customerId}")
    public CustomerScore getScore(@PathVariable String customerId) {
        return DATA.get(customerId);
    }
}
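The fake API can be tested on its own; Jackson serializes CustomerScore roughly as shown below, and an unknown customer id currently returns an empty body (this is the spot where a real CIBIL call would plug in, as noted in the assumptions):

curl http://localhost:8080/api/internal/scores/CUST1001

{"customerId":"CUST1001","creditScore":720,"fraudScore":11}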
✅ model
📄 src/main/java/com/digitallending/model/CustomerScore.java
package com.digitallending.model;

public class CustomerScore {

    private String customerId;
    private int creditScore;
    private int fraudScore;

    public CustomerScore() {}

    public CustomerScore(String customerId, int creditScore, int fraudScore) {
        this.customerId = customerId;
        this.creditScore = creditScore;
        this.fraudScore = fraudScore;
    }

    public String getCustomerId() { return customerId; }
    public int getCreditScore() { return creditScore; }
    public int getFraudScore() { return fraudScore; }

    // Setters so Jackson can populate the object when
    // CreditScoreService deserializes the fake API response.
    public void setCustomerId(String customerId) { this.customerId = customerId; }
    public void setCreditScore(int creditScore) { this.creditScore = creditScore; }
    public void setFraudScore(int fraudScore) { this.fraudScore = fraudScore; }
}
📄 src/main/java/com/digitallending/model/ChatRequest.java
package com.digitallending.model;

public class ChatRequest {

    private String message;

    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
}
📄 src/main/java/com/digitallending/model/ChatResponse.java
package com.digitallending.model;

public class ChatResponse {

    private String reply;
    private String promptVersion;
    private String model;

    public ChatResponse(String reply, String promptVersion, String model) {
        this.reply = reply;
        this.promptVersion = promptVersion;
        this.model = model;
    }

    public String getReply() { return reply; }
    public String getPromptVersion() { return promptVersion; }
    public String getModel() { return model; }
}
✅ prompt
📄 src/main/java/com/digitallending/prompt/PromptRegistry.java
package com.digitallending.prompt;

import jakarta.annotation.PostConstruct;
import org.springframework.core.io.ClassPathResource;
import org.springframework.stereotype.Component;

import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

@Component
public class PromptRegistry {

    private final Map<String, String> registry = new HashMap<>();

    @PostConstruct
    public void loadPrompts() {
        try {
            ClassPathResource resource =
                    new ClassPathResource("prompts/credit-score-nlp-v1.0.prompt");

            // Read via the stream so the prompt also loads when packaged inside a jar
            String content = new String(
                    resource.getInputStream().readAllBytes(), StandardCharsets.UTF_8);

            registry.put("credit-score-nlp-v1.0", content);
        } catch (Exception e) {
            throw new RuntimeException("Failed to load prompts", e);
        }
    }

    public String getPrompt(String key) {
        return registry.get(key);
    }
}
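Because prompts live in Git next to the code, adding a new version is just another file plus a registry entry. A hypothetical sketch (file name and key are illustrative) that would sit inside loadPrompts() next to the v1.0 entry:

// Hypothetical: load an additional prompt version alongside v1.0
ClassPathResource v11 = new ClassPathResource("prompts/credit-score-nlp-v1.1.prompt");
registry.put("credit-score-nlp-v1.1",
        new String(v11.getInputStream().readAllBytes(), StandardCharsets.UTF_8));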
📄 src/main/java/com/digitallending/prompt/PromptBuilder.java
package com.digitallending.prompt;

import com.digitallending.model.CustomerScore;
import org.springframework.stereotype.Component;

@Component
public class PromptBuilder {

    private final PromptRegistry registry;

    public PromptBuilder(PromptRegistry registry) {
        this.registry = registry;
    }

    public String build(CustomerScore score) {
        String template = registry.getPrompt("credit-score-nlp-v1.0");

        return template.replace("{{customerId}}", score.getCustomerId())
                .replace("{{creditScore}}", String.valueOf(score.getCreditScore()))
                .replace("{{fraudScore}}", String.valueOf(score.getFraudScore()));
    }

    public String version() {
        return "credit-score-nlp-v1.0";
    }
}
✅ service
📄 src/main/java/com/digitallending/service/CreditScoreService.java
package com.digitallending.service;

import com.digitallending.model.CustomerScore;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
public class CreditScoreService {

    private final RestTemplate restTemplate = new RestTemplate();

    public CustomerScore fetch(String customerId) {
        return restTemplate.getForObject(
                "http://localhost:8080/api/internal/scores/" + customerId,
                CustomerScore.class
        );
    }
}
📄 src/main/java/com/digitallending/service/OllamaClient.java
package com.digitallending.service;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

import java.util.Map;

@Component
public class OllamaClient {

    @Value("${ollama.url}")
    private String url;

    @Value("${ollama.model}")
    private String model;

    private final RestTemplate restTemplate = new RestTemplate();
    private final ObjectMapper mapper = new ObjectMapper();

    public String call(String prompt) {
        try {
            Map<String, Object> req = Map.of(
                    "model", model,
                    "prompt", prompt,
                    "stream", false
            );

            String response = restTemplate.postForObject(url, req, String.class);

            return mapper.readTree(response).get("response").asText();
        } catch (Exception e) {
            return "LLM Error: " + e.getMessage();
        }
    }
}
📄 src/main/java/com/digitallending/service/ChatOrchestratorService.java
package com.digitallending.service;

import com.digitallending.model.ChatResponse;
import com.digitallending.model.CustomerScore;
import com.digitallending.prompt.PromptBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

@Service
public class ChatOrchestratorService {

    private final CreditScoreService creditScoreService;
    private final OllamaClient ollamaClient;
    private final PromptBuilder promptBuilder;

    @Value("${ollama.model}")
    private String model;

    public ChatOrchestratorService(CreditScoreService c,
                                   OllamaClient o,
                                   PromptBuilder p) {
        this.creditScoreService = c;
        this.ollamaClient = o;
        this.promptBuilder = p;
    }

    public ChatResponse process(String customerId, String message) {
        CustomerScore score = creditScoreService.fetch(customerId);
        String prompt = promptBuilder.build(score);
        String llmReply = ollamaClient.call(prompt);

        return new ChatResponse(
                llmReply,
                promptBuilder.version(),
                model   // report the configured local model instead of a hardcoded cloud name
        );
    }
}
How to Run POC
===
✅ HOW TO RUN LOCALLY
1️⃣ Start Ollama (from desktop app)
Ensure the API server is running at http://localhost:11434
Test:
curl http://localhost:11434/api/tags
2️⃣ Run Spring Boot
cd genai-credit-poc
mvn spring-boot:run
3️⃣ Open Swagger UI: http://localhost:8080/swagger-ui/index.html
4️⃣ Test API
POST /api/chat/message

Header:
X-Customer-Id: CUST1001

Body:
{
  "message": "What is my credit score?"
}
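✅ Expected response shape (the reply text is illustrative; the actual wording comes from the model):

{
  "reply": "Dear customer, your credit score is 720 and your fraud risk indicator is 11 ...",
  "promptVersion": "credit-score-nlp-v1.0",
  "model": "qwen2.5:7b"
}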
✅ You now have a complete enterprise-style GenAI POC:
Microservice pattern
Prompt registry
Prompt as code
Ollama LLM integration
Swagger testing
BFSI-safe NLP generation
Request
===
(Swagger request screenshot)

Response
===
(Swagger response screenshot)