001  /  Infrastructure Layer  /  Full Control

One platform. Every environment.

Public cloud. Private cloud. On-premise. Air-gapped.

Saptiva AI runs wherever your compliance framework requires. The same applications, the same orchestration, the same audit surface — across every deployment mode. When the framework changes, the platform follows. You don't replatform.

002 / Constant

Everything listed below is the same in every environment.

Deployment mode is a runtime concern, not a product concern. The applications, the orchestration engine, the policy language, the audit surface — they are identical across public cloud, hybrid, on-premise, and air-gapped deployments. Only the compute placement changes.

Saptiva Studio applications run identically
FrIdA orchestration runs identically
Policy language is the same file, the same repo
Audit record format is identical and portable
Forward Deployed Engineering model is the same
Upgrade path is the same
003 / Modes

Four deployment modes. One platform.

Every Saptiva AI deployment sits in one (or more) of these four modes. FrIdA routes workloads across them according to the policy you author. A single customer deployment routinely runs workloads in multiple modes simultaneously, against different data classes.
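The routing idea can be sketched in a few lines. This is an illustrative sketch only: FrIdA's actual policy language and API are not shown on this page, so every name below (the `POLICY` mapping, the `route` function, the mode and data-class labels) is hypothetical.

```python
# Hypothetical policy: each data class maps to the deployment modes
# that are allowed to run it, most-preferred first.
POLICY = {
    "public":     ["public_cloud", "sovereign_cloud", "on_prem", "air_gapped"],
    "pii":        ["sovereign_cloud", "on_prem", "air_gapped"],
    "financial":  ["sovereign_cloud", "on_prem", "air_gapped"],
    "classified": ["air_gapped"],
}

def route(data_class: str) -> str:
    """Return the first mode the policy permits for this data class.

    Deny by default: a data class with no policy entry is never dispatched.
    """
    allowed = POLICY.get(data_class)
    if not allowed:
        raise PermissionError(f"no policy covers data class {data_class!r}")
    return allowed[0]
```

A real evaluation would also weigh provider, region, and residency constraints; the point is that placement is a policy decision made at dispatch time, not something baked into application code.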

MODE 01

Public Cloud

Elastic compute for non-resident workloads.

SUITS

Non-sensitive workloads, general-purpose tasks, bursting capacity, development and staging environments. Not suitable for regulated data flows in most LatAm jurisdictions.

COMPUTE TARGETS
AWS: us, eu, latam regions
AZURE: any region
GCP: any region
POLICY POSTURE

FrIdA forbids dispatch of workloads carrying regulated data unless the cloud provider and region explicitly satisfy the policy.

MODE 02

Private / Sovereign Cloud

In-country cloud for regulated workloads.

SUITS

Regulated data at volume, multi-tenant banking workloads, KYC at scale, customer operations with data residency requirements. The primary deployment mode for most of our banking and insurance customers.

COMPUTE TARGETS
SOVEREIGN REGIONS: br · mx · cl
PRIVATE VPC: customer-operated
COLOCATION: in-country facilities
POLICY POSTURE

Residency enforced by architecture. Egress forbidden. Customer-managed encryption keys. Full audit trail retained in-country.

MODE 03

On-Premise

Customer data center, customer hardware.

SUITS

Institutions with an on-prem mandate, regulated workloads that cannot leave customer-controlled infrastructure, governments, and the highest-sensitivity financial and healthcare data flows.

COMPUTE TARGETS
HPE: distribution partner
DELL: distribution partner
NVIDIA: gpu infrastructure
POLICY POSTURE

Network egress controlled at the customer's perimeter. Audit stream consumed locally. Model weights resident on customer hardware.

MODE 04

Air-Gapped

Disconnected environments, maximum isolation.

SUITS

Defense, intelligence, critical infrastructure, the most regulated government and financial workloads. Environments where the connectivity requirement itself is a risk.

COMPUTE TARGETS
ISOLATED RACKS: customer-operated
OFFLINE MODEL SYNC: authorized media
LOCAL INFERENCE: no external deps
POLICY POSTURE

Zero external connectivity at runtime. Model and policy updates via authorized, signed offline transfer. Full audit surface available locally.

004 / When To Use What

Picking the mode for your workload.

A single deployment typically uses more than one mode: different data classes are routed to different environments by the policy FrIdA evaluates. Common routings across LatAm regulated enterprises:

A bank's credit origination copilot operates on PII plus financial data under CNBV residency requirements.
TYPICAL MODE
Private / Sovereign Cloud — Mexico region.
An insurer's policy document processing runs against dense, high-volume document flows with CNSF oversight.
TYPICAL MODE
On-premise — customer data center.
A government AI initiative requires sovereign execution with no cross-border dependency.
TYPICAL MODE
On-premise or sovereign cloud, operated in-country.
A fintech's internal knowledge system operates on non-sensitive corporate documents with bursting compute needs.
TYPICAL MODE
Public cloud — approved provider and region.
A defense contractor runs classified workflows under zero external connectivity requirements.
TYPICAL MODE
Air-gapped — fully isolated infrastructure.
005 / Portability

The deployment mode can change. Your applications cannot tell.

Most enterprise AI platforms treat deployment mode as a rebuild event. Moving from cloud to on-prem means new code, new configuration, new audit surface, new certification. Saptiva AI was built from the start to treat mode as a runtime concern.

I

Regulations tighten. You move from sovereign cloud to on-prem.

Update the policy file. FrIdA starts routing the affected workloads on-premise on the next dispatch. The application code does not change. The audit record remains continuous.
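The mechanic described above can be sketched as follows. This is a hypothetical illustration, not FrIdA's actual configuration format: tightening one policy entry changes where workloads run on the next dispatch, while the application never names an environment itself.

```python
# Hypothetical policy mapping a data class to its allowed modes.
policy = {"financial": ["sovereign_cloud", "on_prem"]}

def dispatch(data_class: str) -> str:
    # Applications call dispatch(); placement comes from policy alone.
    return policy[data_class][0]

before = dispatch("financial")   # "sovereign_cloud"

# Regulation tightens: sovereign cloud is removed from the allowed list.
policy["financial"] = ["on_prem"]

after = dispatch("financial")    # "on_prem" — no application change
```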

II

You expand into a new country. Different residency rules apply.

Author a country-specific policy extension. The same Studio applications deploy against local compute. Two countries, one codebase, two policy regimes.

III

Your preferred cloud changes. Procurement switches vendors.

FrIdA's compute target changes by configuration. Zero application rewrite. Zero model retraining. Audit record remains portable.

IV

You need to add air-gapped capacity. For a specific workload.

Install the air-gapped runtime at the customer site. Route the specific workload by policy. The other workloads are unaffected. The platform remains one platform.

006 / Partners

The partners who ship with us.

Every deployment mode relies on infrastructure we do not sell. Our partners provide the hardware, the cloud capacity, and the in-country distribution. Saptiva AI orchestrates above them.

GPU · VALIDATION
NVIDIA
NVIDIA-validated infrastructure across on-premise, private cloud, and sovereign cloud deployments. KAL runs on NVIDIA infrastructure under Mexican jurisdiction.
ON-PREM · DISTRIBUTION
HPE
Distribution partner for Saptiva AI on-premise and hybrid deployments across regulated enterprises in the region.
ON-PREM · DISTRIBUTION
Dell
Distribution partner delivering integrated AI infrastructure into LatAm financial services and government customers.
PUBLIC CLOUD
AWS · Azure · GCP
All three major hyperscalers are available as compute targets. Saptiva AI is cloud-agnostic; our policy layer governs where workloads run, not which provider you choose.
REGIONAL SI
Bajaware
Regional systems integrator distributing Saptiva AI inside existing enterprise relationships across Mexico and Central America.
REGIONAL SI
Intelimetrica
Enterprise distribution partner across the region's financial services and government sectors.
007 / Get In Touch

Bring the environment you have.

Hybrid rack you're still standing up. Sovereign region you're migrating to. Air-gapped cluster your compliance team insists on. Saptiva AI is already there. A Forward Deployed Engineer will respond within 48 hours.

Request a deployment review