ROSTAN Technologies
Building AI-Powered Applications with Oracle APEX and OCI Generative AI

  • Article by: Rostan Team
  • Jan 24, 2024

Oracle APEX 24.1 introduced native integration with OCI Generative AI Service, enabling developers to embed LLM capabilities directly inside APEX applications without leaving the Oracle ecosystem. This post explores the architecture, configuration steps, and real-world patterns for building production-ready AI apps.

Architecture Overview

The integration uses APEX Web Credentials to securely store OCI API keys, and the APEX_AI PL/SQL package to invoke models like Cohere Command R+ and Meta Llama 3 hosted on OCI. Responses are streamed or returned as JSON and can be surfaced directly in Interactive Reports, Forms, or custom regions.

Setting Up OCI Generative AI in APEX

First, create a Web Credential in APEX Workspace > Security > Web Credentials pointing to your OCI endpoint. Then use the following PL/SQL block to call the chat completion API:

DECLARE
  l_response  CLOB;
BEGIN
  -- Send the user's prompt to the configured OCI Generative AI service.
  -- 'OCI_GENAI' must match the static ID of the AI service / Web Credential
  -- configured in the workspace.
  l_response := APEX_AI.CHAT(
    p_message    => :P1_USER_PROMPT,
    p_service    => 'OCI_GENAI',
    p_model      => 'cohere.command-r-plus',
    p_max_tokens => 512
  );

  -- Surface the model's reply in a page item.
  :P1_AI_RESPONSE := l_response;
END;

RAG Pattern with Oracle Vector Search

Oracle Database 23ai ships with native vector datatype and VECTOR_DISTANCE() function. You can load document embeddings generated by OCI Embedding models directly into an Oracle table and perform similarity search inside PL/SQL:

SELECT chunk_text
FROM   doc_chunks
ORDER  BY VECTOR_DISTANCE(embedding, :query_vector, COSINE)
FETCH  FIRST 5 ROWS ONLY;
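As a sketch of the schema behind that query (the table and column names follow the query above; the 1024 dimension is an assumption and should match the OCI embedding model you actually use):

-- Hypothetical DDL for the doc_chunks table used above.
CREATE TABLE doc_chunks (
  chunk_id    NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  doc_id      NUMBER NOT NULL,
  chunk_text  VARCHAR2(4000),
  embedding   VECTOR(1024, FLOAT32)  -- dimension must match the embedding model
);

-- An embedding returned by the OCI Embedding API (a JSON-style array of
-- numbers) can be inserted as text; the array below is abbreviated.
INSERT INTO doc_chunks (doc_id, chunk_text, embedding)
VALUES (1, 'Oracle APEX is a low-code development platform.', '[0.12, -0.03, ...]');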

Combining this with APEX AI Chat, you get a full Retrieval-Augmented Generation (RAG) pipeline entirely within the Oracle stack — no external Python microservice required.
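A hedged sketch of what that pipeline can look like in a single PL/SQL block, reusing the two snippets above. It assumes the doc_chunks table from the vector-search example, the APEX_AI.CHAT call shown earlier, and a hypothetical :P1_QUERY_VECTOR item holding the embedding of the user's question:

DECLARE
  l_context  CLOB;
  l_answer   CLOB;
BEGIN
  -- 1. Retrieve the five most similar chunks as grounding context.
  FOR r IN (
    SELECT chunk_text
    FROM   doc_chunks
    ORDER  BY VECTOR_DISTANCE(embedding, :P1_QUERY_VECTOR, COSINE)
    FETCH  FIRST 5 ROWS ONLY
  ) LOOP
    l_context := l_context || r.chunk_text || CHR(10);
  END LOOP;

  -- 2. Augment the prompt with the retrieved context and generate.
  l_answer := APEX_AI.CHAT(
    p_message    => 'Answer using only this context:' || CHR(10) ||
                    l_context || CHR(10) ||
                    'Question: ' || :P1_USER_PROMPT,
    p_service    => 'OCI_GENAI',
    p_model      => 'cohere.command-r-plus',
    p_max_tokens => 512
  );

  :P1_AI_RESPONSE := l_answer;
END;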

Use Cases in Enterprise APEX Apps

  • Smart Search: Natural language search over Oracle tables using semantic embeddings
  • Document Summarisation: Upload a PDF, parse it with APEX_DATA_PARSER, summarise with OCI GenAI
  • AI Form Assistant: Auto-fill form fields from unstructured text or voice transcripts
  • Code Explainer: Paste PL/SQL — get a plain-English explanation back inline

Performance and Cost Considerations

OCI Generative AI is billed per token. For APEX apps with many users, implement a response cache table that stores recent prompts, their embeddings, and the model's responses. Before calling the API, check whether the new prompt's embedding falls within a vector distance of 0.1 of a cached entry and serve the stored response on a hit. This alone can cut costs by 60–70% in typical enterprise workloads.
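A minimal sketch of that cache lookup, assuming a hypothetical ai_response_cache table with prompt_embedding and response columns and a :P1_QUERY_VECTOR item holding the new prompt's embedding:

DECLARE
  l_cached  CLOB;
BEGIN
  -- Reuse a stored response when a semantically similar prompt was
  -- already answered (cosine distance below the 0.1 hit threshold).
  SELECT response
  INTO   l_cached
  FROM   ai_response_cache
  WHERE  VECTOR_DISTANCE(prompt_embedding, :P1_QUERY_VECTOR, COSINE) < 0.1
  ORDER  BY VECTOR_DISTANCE(prompt_embedding, :P1_QUERY_VECTOR, COSINE)
  FETCH  FIRST 1 ROW ONLY;

  :P1_AI_RESPONSE := l_cached;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    -- Cache miss: call APEX_AI.CHAT as in the earlier example, then
    -- insert the prompt embedding and response into ai_response_cache.
    NULL;
END;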

Oracle APEX + OCI GenAI is the fastest path for Oracle shops to ship AI features — no new cloud accounts, no Python infra, no vendor sprawl. Just PL/SQL and a web credential.

