
Bridging the Gap: Connecting Unsupported AI Models to ODC

  • Writer: Natasha De Guzman

OutSystems Developer Cloud's custom connection feature lets you integrate any AI provider beyond the built-in options.


Here's how we leveraged this extensibility to connect Vertex AI Gemini with enterprise-grade security in one weekend. 







TL;DR Summary


For OutSystems Developers: 

ODC’s "Custom Connection" feature is more powerful than it looks. If you can implement the contract, you can integrate any model. 

For Python Developers:

FastAPI + Pydantic = The dream team. Let Google’s SDK deal with the IAM authentication; you have better things to debug. 

For Enterprise Architects: 

You don't have to wait for platform vendors to support every provider. A mediation layer buys you flexibility and keeps the governance team happy.



The Tale of Two Geminis 

Here's the thing most people don't realize: Gemini, Google's family of large language models (similar to OpenAI's GPT or Anthropic's Claude), comes in two flavors. And they're not interchangeable when you're dealing with enterprise requirements. 


While the underlying model is the same, the authentication model, data boundaries, and compliance posture are fundamentally different.

Gemini in public vs Gemini in your cloud boundary. Here’s how they compare:

| Category | Public Gemini (ODC Native Option) | Vertex AI Gemini (Enterprise Option) |
| --- | --- | --- |
| Authentication | Simple API key | Service account authentication (JSON keys / IAM) |
| Data Location | Google's shared public cloud | Stays inside your GCP project |
| Security Model | Shared infrastructure | Isolated enterprise environment |
| Compliance Fit | Limited control over data residency | Full enterprise-grade governance & residency control |
| Best For | Prototypes, experiments, personal projects | Production enterprise workloads |
| The Compliance Conversation | "Where exactly is our customer data going?" | "Data remains within our controlled cloud boundary." |


The Key Difference 

Public Gemini sends your data to Google's shared cloud; Vertex AI Gemini keeps it inside your own Google Cloud Platform (GCP) project. Same AI, completely different security posture.


When Enterprise Requirements Meet Platform Extensibility 


This is exactly where ODC's custom integration feature shines. You can create a connector service that handles enterprise-specific requirements while ODC focuses on what it does best: building apps. 


Understanding ODC's Custom Connection Architecture 

So how do we build this connector? ODC's documentation describes something called a custom connection - a way to connect to LLMs beyond the native Azure OpenAI and Amazon Bedrock support. 

Here's what ODC requires for a custom connector service: 

✅ An intermediary web service (the "connector") 

✅ A synchronous REST endpoint using POST method 

✅ Implementation of OutSystems' API contract (defined in their Swagger spec) 
✅ Supported authentication scheme (Bearer token, OAuth, etc.) 

✅ Endpoint accessible to ODC (public URL or private gateway) 

✅ Standard HTTP status codes for errors 


Think of it as ODC saying: "We can't support every AI provider directly, but if you build a translator that speaks our language, we'll talk to it." 

Perfect! We'll build that translator. 


The Master Plan (Or: How We Made This Work)


Here's what we built and the technologies we chose: 



  1. FastAPI: Built for Speed  

We needed to code this fast (pun intended), and FastAPI delivered: 

  • Automatic API documentation - Swagger UI out of the box, which is perfect because ODC provides a Swagger spec we need to implement. We can compare our generated docs against their spec to ensure compliance. 

  • Type validation with Pydantic - ODC's API contract requires specific property names and types. Pydantic catches validation errors before they become runtime issues.

  • Python - The language that Google's own Vertex AI SDK speaks fluently. Fighting this would be like swimming upstream while carrying a piano. 

Could we have used Node.js? Sure. But when Google's Vertex AI SDK is written in Python, and ODC's API contract is straightforward to implement in any language, Python was the path of least resistance.
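To give a flavor of what that validation buys you, here's a minimal Pydantic sketch of an OpenAI-style request body. The field names follow the OpenAI convention; ODC's actual contract is defined by their Swagger spec, so treat these models as illustrative rather than authoritative:

```python
from typing import List, Optional

from pydantic import BaseModel, Field, ValidationError


class ChatMessage(BaseModel):
    """One OpenAI-style chat message (illustrative field names)."""
    role: str
    content: str


class ChatRequest(BaseModel):
    """Sketch of an OpenAI-style chat completion request body."""
    model: str
    messages: List[ChatMessage]
    # Out-of-range temperatures are rejected before any Gemini call happens
    temperature: Optional[float] = Field(default=1.0, ge=0.0, le=2.0)


# A malformed payload (missing content, temperature out of range)
# fails fast with a clear error instead of a cryptic runtime failure
try:
    ChatRequest(model="gemini", messages=[{"role": "user"}], temperature=5.0)
except ValidationError:
    print("validation error caught")
```

FastAPI runs these models automatically on every incoming request body, which is exactly the "catch it before runtime" behavior described above.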


  2. Google's Vertex AI SDK: The Obvious Choice 


When you're talking to Google's AI, you use Google's SDK. This isn't the time to be clever with raw HTTP requests. 

The SDK handles: 

  • Authentication - Service accounts, tokens, all that jazz 

  • Retry logic - Because networks are unreliable 

  • Streaming responses - For that ChatGPT-style typing effect 

  • All the weird edge cases - Google already solved them 

Most importantly, it provides the `vertexai=True` parameter that's the whole reason we're building this connector: 

from google import genai

client = genai.Client(
    vertexai=True,  # Enterprise Vertex AI configuration
    project="your-gcp-project",
    location="us-central1"
)

  3. LocalTunnel: The "But ODC Lives in the Cloud" Solution


Here's a fun problem: ODC is cloud-hosted. Your development server is on localhost. ODC can't exactly reach into your laptop to test your API.  

Enter LocalTunnel, the magical tool that gives your localhost a public URL:  


lt --port 8000
# Your tunnel is: https://random-words-1234.loca.lt

  • Is it production-ready? No. 

  • Is it perfect for testing? Absolutely. 

  • Does the URL change every time you restart it? Unfortunately, yes. 


But for rapid development and testing with ODC, it's like having a temporary tunnel between your laptop and the cloud.


Putting it All Together 

With our tech stack chosen, we built a FastAPI application in Python that implements ODC's API contract. The core logic is straightforward: receive OpenAI-formatted requests from ODC, translate them to Vertex AI's format using Google’s SDK, call Gemini with enterprise authentication, then translate the response back to OpenAI format. Add some Pydantic models for validation, Bearer token authentication, and proper error handling, and we had a working connector service ready to test. 
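The round trip described above can be sketched in plain Python. These helpers are illustrative only: the real connector leans on Google's SDK for the Gemini call itself, and the exact field names come from ODC's spec.

```python
def openai_to_gemini(messages):
    """Map OpenAI-style chat messages to Gemini's contents format."""
    contents = []
    for msg in messages:
        # Gemini uses "model" where OpenAI uses "assistant";
        # system messages are down-converted to user context (see pitfalls)
        role = "model" if msg["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    return contents


def gemini_to_openai(reply_text, model):
    """Wrap Gemini's reply text in an OpenAI-style completion response."""
    return {
        "object": "chat.completion",
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply_text},
            "finish_reason": "stop",
        }],
    }


contents = openai_to_gemini([{"role": "user", "content": "Hello!"}])
print(contents[0]["role"])  # user
```

Everything else in the connector (auth, validation, error mapping) wraps around this core translation.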


Testing Time: Does This Thing Actually Work? 


Time for the moment of truth - let's see if this connector really works! 

Step 1: Set Up the Environment 

Create a `.env` file with your configuration: 

# Google Cloud Project Configuration
GCP_PROJECT_ID=apps-outsystems-dev
GCP_LOCATION=us-central1
GOOGLE_APPLICATION_CREDENTIALS=path/to/service-account.json

# Connector API Key (for ODC to authenticate)
MIDDLEWARE_API_KEY=your-secret-key-here


Step 2: Start the Connector Service


# Create a virtual environment (because we're professionals)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install fastapi uvicorn google-genai python-dotenv

# Start the server
uvicorn main:app --reload --port 8000

Server running? ✅ 

Visit `http://localhost:8000/docs` to see the auto-generated Swagger documentation. Compare it against ODC's Swagger spec to validate. 



Step 2.5: Test Locally (Optional but Recommended) 

Before exposing your connector to the internet, validate that everything works: 

# Run the test suite
python test_middleware.py

This test script validates: 

  • Health check: Is the server responding? 

  • Authentication: Does Bearer token validation work? 

  • Model validation: Are invalid models rejected? 

  • Chat completion: Can we talk to Gemini? 

  • Multi-turn conversation: Does context work across messages? 

  • Parameter validation: Are temperature boundaries enforced? 

  • Error handling: Do we return proper status codes? 

If all tests pass, you're ready to expose it to ODC. If not, fix the issues before proceeding.
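`test_middleware.py` is our own script, but the chat-completion check at its heart boils down to asserting on the OpenAI-style response shape. A standalone sketch, with a canned payload standing in for a live call to the connector:

```python
def check_chat_completion(response_json):
    """Validate that a connector response looks like an OpenAI chat completion."""
    assert "choices" in response_json, "missing choices array"
    choice = response_json["choices"][0]
    assert choice["message"]["role"] == "assistant", "reply must come from assistant"
    assert isinstance(choice["message"]["content"], str), "content must be text"
    assert choice["finish_reason"] in ("stop", "length"), "unexpected finish_reason"
    return True


# Canned response standing in for a live call to the connector
sample = {
    "choices": [{
        "index": 0,
        "message": {"role": "assistant", "content": "Hello from Gemini!"},
        "finish_reason": "stop",
    }]
}
print(check_chat_completion(sample))  # True
```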



Step 3: Expose to the Internet with LocalTunnel 


# Install LocalTunnel globally
npm install -g localtunnel

# Create a tunnel to your local server
lt --port 8000

# Output: your url is: https://brave-elephant-42.loca.lt

Public URL acquired? ✅ 

Pro tip: Keep this terminal window open. If you close it, the tunnel dies and ODC can't reach your connector anymore. 


Step 4: Configure ODC 


Now for the fun part: telling ODC about your custom connector service. 

Add a Custom connection AI Model in the ODC Portal: 

Remember to add a custom header: 

  • Header name: `Authorization` 

  • Header value: `Bearer your-secret-key-here`


ODC configured? ✅ 



Step 5: The Moment of Truth 

We tested with a simple chat in an ODC app: 

User: "Hello, how are you?"

Gemini (via our connector): "Hello! I'm doing well, thank you for asking. How can I help you today?" 


🎉 IT WORKS! 🎉 


But wait, there's more! Let's test the real power: document processing. 



Step 6: Testing with a Real Multi-Agent App 


Instead of building a test app from scratch, we reused the Loan Origination app from OutSystems' Agentic AI training. This is a complete multi-agent workflow app with four AI agents (Intake, Enrichment, Underwriter, Communication) that processes loan applications: it analyzes uploaded documents (pay stub, tax form, ID card, bank statement), extracts data, calculates financial metrics, and provides an interactive conversational interface with action-calling capabilities.


All we did: Swapped out the AI model configuration in ODC Portal to use our custom Vertex AI Gemini connector instead of the default model. 


Gemini (via our connector): Successfully executed the full 4-agent workflow - validating documents, calculating financial metrics, checking policy adherence, and providing conversational responses. 


🎉 IT REALLY WORKS! 🎉 



The Pitfalls That Made Us Question Our Life Choices 


Of course, nothing works perfectly on the first try. Here are some errors that made us earn our stripes: 


  1. "The System Role Situation" 

The Error: 


400 INVALID_ARGUMENT: Request contains an invalid argument.

The Investigation: We sent a perfectly valid OpenAI-format request with a system message:

{
  "messages": [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Hello!"}
  ]
}

The Problem:

Gemini DOES support system instructions, but not as a "system" role message in the chat history. System instructions need to be passed through the SDK's `system_instruction` configuration (a `GenerateContentConfig` field in the google-genai SDK), not as part of the messages array that gets sent to `generate_content()`. 


Since ODC sends system messages as part of the messages array (OpenAI style), and we're using `generate_content()` with a contents array, we can't include system role messages there. 


The Solution:

Convert all system messages to user messages as a workaround. Gemini treats them as context either way: 

def convert_to_gemini_format(messages):
    """Convert OpenAI format to Gemini format."""
    gemini_messages = []
    for msg in messages:
        # Map roles: assistant -> model, everything else -> user
        if msg.role == "assistant":
            role = "model"
        else:
            # Gemini says "no" to system role, convert to user
            role = "user"
        
        gemini_messages.append({
            "role": role,
            "parts": [{"text": msg.content}]
        })
    return gemini_messages

Lesson learned:

When building a connector service, you're not just translating data formats; you're also translating conceptual differences between AI providers. Gemini's system instructions work differently than OpenAI's system messages. 
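An alternative to down-converting, if you'd rather keep Gemini's native behavior: strip system messages out of the history and pass their text through the SDK's `system_instruction` configuration (a `GenerateContentConfig` field in the google-genai SDK). The extraction itself is plain Python; whether it's worth the extra branch over the simple workaround is a judgment call:

```python
def split_system_messages(messages):
    """Separate system messages (for system_instruction) from the chat history."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    # The joined text would be passed as system_instruction;
    # the remaining chat messages go into the contents array
    return "\n".join(system_parts), chat


instruction, chat = split_system_messages([
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Hello!"},
])
print(instruction)  # You are a helpful assistant
print(len(chat))    # 1
```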


  2. "The Document Processing Mystery"

The Error: 

TypeError: expected string or bytes-like object

The Investigation:

We tested with a simple text message, worked perfectly. Then we tried uploading a PDF document from ODC, boom, error. 

The Problem:

ODC sends images and PDFs as base64-encoded data URLs like: 

data:image/png;base64,iVBORw0KGgoAAAANS...
data:application/pdf;base64,JVBERi0xLjQK...

But we were trying to process them as plain text. Here's the reality: PDFs are not really just text, and images are definitely not text. 

The Solution:

Detect data URLs, extract the MIME type and base64 data, decode it, and send it to Gemini as a proper blob.


import base64
from google.genai import types

def process_content_part(part):
    """Process a content part (text or image/document)."""
    if part.get("type") == "image_url":
        url = part["image_url"]["url"]
        
        # Parse data URL: data:image/png;base64,iVBORw0KG...
        if url.startswith("data:"):
            # Extract MIME type and base64 data
            header, base64_data = url.split(",", 1)
            mime_type = header.split(":")[1].split(";")[0]
            
            # Decode and create blob
            file_data = base64.b64decode(base64_data)
            return types.Part.from_bytes(
                data=file_data,
                mime_type=mime_type
            )
    
    # Regular text content
    return types.Part.from_text(text=part.get("text", ""))

Now Gemini can "see" the images and read the PDFs! This is crucial for multi-modal AI applications where users upload documents for analysis.


Lesson learned:

ODC's API contract supports multi-modal content, so your connector service needs to handle it properly. Don't assume everything is text!  


The Victory Lap (And What We Learned) 


We did it!

By the end of the weekend, we didn’t just have a working endpoint.  We had a connector that fully implemented ODC’s custom AI contract, translated OpenAI-style payloads into Vertex AI calls, handled multi-modal content correctly, enforced authentication properly, and returned responses in exactly the format ODC expects. 


Enterprise authentication? Service accounts. 

Data residency? Inside our own GCP project.

Multi-modal support? PDFs and images processed without drama. 

Error handling? Predictable instead of mysterious. 

Most importantly, ODC didn’t need to know any of it. It just saw a compliant AI endpoint and moved on. Which, architecturally speaking, is the highest compliment.  



What's Next?


Currently, our connector service works beautifully... on our laptop... with a LocalTunnel URL that changes every time we restart it. 

That's great for testing and demos, but not exactly production-ready.

For production use, you'll want to:  

  • Deploy to a serverless platform like Google Cloud Run for stable HTTPS URLs  

  • Integrate with Secret Manager for secure credential management  

  • Set up Cloud Logging for monitoring and debugging  

  • Configure auto-scaling to handle traffic spikes  

  • Implement proper health checks and uptime monitoring  

The good news? It's mostly a deployment step, not a rewrite. The bad news? That's still another weekend, plus however long it takes your privacy team to approve it. 


Key Takeaway 

So here's the thing: ODC's custom integration feature is exactly what extensibility should look like. When you need enterprise-specific configurations, you don't have to compromise - you build a connector that handles your requirements.


One rainy weekend later, we had a working connector that lets ODC talk to Vertex AI Gemini like they were always meant to work together.


The security team got their enterprise authentication. The AI agents got their multi-modal capabilities. And we learned that the gap between "not supported" and "working in production" is often smaller than it looks.


Sometimes the best solution isn't waiting for the platform to catch up, it's building the bridge yourself. 


About the Author 


NDG is a Tech Consultant who believes that when platforms say "you can't do that," it really means "you can't do that yet."


She likes making systems work together, even when they weren't designed to. When not tinkering with APIs, AIs, and low-code platforms, she's probably writing about them or explaining to compliance officers why Vertex AI is better than public AI endpoints. 

P.S.  If you're from OutSystems and reading this: please add native Vertex AI support to ODC. We'll happily retire this connector service. Until then, we'll keep building bridges!


