Getting Started
Start with Argy: pick a deployment model, set up governance, and publish a first golden path.
Argy is the operating system for enterprise AI: a unified platform to standardize, automate, and govern AI and DevSecOps workflows.
It combines:
- Interfaces: Portal (web), Argy Chat (web), Argy Code (terminal)
- A governed AI core: LLM Gateway (multi-provider, quotas, filters, audit)
- A catalog of versioned modules (golden paths)
- Projects, deployments, pipelines, observability, and governance
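The governed AI core sits as a single entry point in front of multiple model providers, enforcing quotas and filters before any request is forwarded. A minimal sketch of that idea, assuming a per-team monthly token quota (all names here — `QUOTAS`, `route_request` — are illustrative, not Argy's actual API):

```python
# Illustrative per-team token quota check, of the kind the LLM Gateway
# performs before forwarding a request to a provider. Not Argy's real API.

QUOTAS = {"platform-team": 1_000_000}   # tokens per month, per team
usage = {"platform-team": 999_500}      # tokens consumed so far this month

def route_request(team: str, prompt_tokens: int) -> str:
    """Reject requests that would push the team past its monthly quota."""
    if usage.get(team, 0) + prompt_tokens > QUOTAS.get(team, 0):
        return "rejected: quota exceeded"
    usage[team] = usage.get(team, 0) + prompt_tokens
    return "forwarded to provider"

print(route_request("platform-team", 400))  # forwarded to provider
print(route_request("platform-team", 400))  # rejected: quota exceeded
```

The same gatekeeping point is where content filters (PII, secrets, prompt injection) and audit logging naturally attach, since every request passes through it.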
Prerequisites
- A clear first use case to standardize (DevSecOps is typically the starting point).
- Your core toolchain context (Git, CI/CD, cloud and/or Kubernetes).
- Stakeholders for governance (Platform, Security/Compliance, Engineering leadership).
Setup steps (recommended)
1) Choose a deployment model
- SaaS (AKS West Europe, managed by Argy)
- Hybrid (SaaS control plane + execution agents in your infrastructure)
- On‑prem (Kubernetes deployment; LLM Gateway on-prem available as Enterprise add-on)
See: Deployment Guide
2) Set up identity and access
- Passwordless-first authentication (passkeys + magic links)
- SSO via OIDC / SAML
- SCIM provisioning
- Role-based access control (RBAC)
See: Security Model
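The identity pieces above fit together as one configuration surface: OIDC for sign-in, SCIM for lifecycle, RBAC for authorization. A sketch of that shape, where every field name is an assumption for illustration (Argy's actual settings schema may differ):

```python
# Illustrative identity-and-access configuration; field names are
# assumptions, not Argy's real settings schema.
sso_config = {
    "oidc": {
        "issuer": "https://idp.example.com",      # your IdP's discovery URL
        "client_id": "argy-portal",
        "scopes": ["openid", "profile", "email"],
    },
    "scim": {
        "base_url": "https://argy.example.com/scim/v2",
        "provisioning": "automatic",   # create/deactivate users from the IdP
    },
    "rbac": {
        "default_role": "viewer",      # least privilege by default
        "roles": ["admin", "maintainer", "developer", "viewer"],
    },
}
```

Wiring SCIM alongside SSO matters in practice: SSO alone lets users in, but only provisioning deactivates them when they leave the IdP directory.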
3) Define governance
- Approval policies for sensitive actions (deploy, publish, delete, etc.)
- LLM quotas and filters (PII, secrets, prompt injection, forbidden topics)
- Audit retention and export needs
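The three governance levers above can be expressed as one policy document: who must approve which actions, which filters run on LLM traffic, and how long audit evidence is kept. A hedged sketch, with illustrative keys (not Argy's actual policy format):

```python
# Illustrative governance policy; action names, filter ids, and keys
# are assumptions for the sketch, not Argy's real schema.
governance = {
    "approvals": {
        "deploy:prod":     {"required_approvers": 2, "roles": ["maintainer", "admin"]},
        "module:publish":  {"required_approvers": 1, "roles": ["admin"]},
        "project:delete":  {"required_approvers": 2, "roles": ["admin"]},
    },
    "llm_filters": ["pii", "secrets", "prompt_injection", "forbidden_topics"],
    "audit": {"retention_days": 365, "export": "object-storage"},
}

def requires_approval(action: str) -> bool:
    """True if the action is gated behind an approval policy."""
    return action in governance["approvals"]
```

Keeping the policy declarative like this is what makes it auditable: the same document that gates actions can be exported as evidence.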
4) Publish your first golden path
- Use Studio to design a workflow module
- Configure inputs/outputs schemas and bindings
- Simulate, publish, and version the module
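A golden-path module is essentially a versioned manifest: a name, declared inputs and outputs, and validation before execution. A minimal sketch of what that might look like, with a hypothetical manifest shape (the real Studio schema may differ):

```python
# Illustrative golden-path module manifest; the field names and the
# validate_inputs helper are assumptions, not the real Studio schema.
module = {
    "name": "ci-pipeline-scaffold",
    "version": "1.0.0",
    "inputs": {
        "repo_url": {"type": "string", "required": True},
        "language": {"type": "string", "enum": ["python", "go", "typescript"]},
    },
    "outputs": {
        "pipeline_file": {"type": "string"},  # path to the generated CI config
    },
}

def validate_inputs(manifest: dict, values: dict) -> list[str]:
    """Return the names of required inputs missing from `values`."""
    return [k for k, spec in manifest["inputs"].items()
            if spec.get("required") and k not in values]

print(validate_inputs(module, {"language": "go"}))  # ['repo_url']
```

Validating against the declared schema before a run is what lets "simulate" catch bad bindings before anything is published or deployed.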
5) Deploy through projects and environments
- Create a project
- Deploy a module to Dev/Staging/Prod
- Track runs, logs, outputs, and approvals
What you get first
- Self-service with control: teams move fast without bypassing governance.
- Traceability: audit logs, approvals, and execution evidence.
- Operational visibility: pipelines, logs, and DORA-oriented observability.
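DORA-oriented observability boils down to computing a handful of metrics from deployment records. As an example, lead time for changes is the average interval from commit to production deploy; a sketch, assuming a hypothetical record shape with `committed_at`/`deployed_at` timestamps:

```python
# Lead time for changes (one of the four DORA metrics) computed from
# deployment records; the record shape is an assumption for illustration.
from datetime import datetime

deployments = [
    {"committed_at": datetime(2024, 5, 1, 9),  "deployed_at": datetime(2024, 5, 1, 15)},
    {"committed_at": datetime(2024, 5, 2, 10), "deployed_at": datetime(2024, 5, 2, 12)},
]

def mean_lead_time_hours(records: list[dict]) -> float:
    """Average commit-to-deploy interval across records, in hours."""
    total = sum((r["deployed_at"] - r["committed_at"]).total_seconds()
                for r in records)
    return total / len(records) / 3600

print(mean_lead_time_hours(deployments))  # 4.0
```

Deployment frequency falls out of the same records (count per time window); change failure rate and time to restore need incident data joined in as well.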
Next reads: Developer Guide and LLM Gateway.