You can deploy the Literal AI platform anywhere by provisioning the infrastructure, setting up the environment variables and running the Docker image.
## Get the Image

```shell
docker login --username literalai
# When prompted for a password, use the token you have been provided with
docker pull literalai/platform:latest
```
## Provision the Infrastructure

To run the Literal AI platform, you need to set up the following services:

- PostgreSQL database (version 15 or later)
- File storage (AWS S3, Google Cloud Storage, or Azure Blob Storage)
- [Optional] Redis cache
- [Optional] SMTP server
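For a quick local evaluation only (not a production setup), you could run PostgreSQL and Redis as containers; the image tags, container names, and credentials below are placeholders, not requirements of the platform:

```shell
# Local evaluation only — start PostgreSQL 15 and Redis in containers.
# Names and credentials are placeholders; replace them with your own.
docker run -d --name literalai-postgres \
  -e POSTGRES_USER=literal \
  -e POSTGRES_PASSWORD=changeme \
  -e POSTGRES_DB=literalai \
  -p 5432:5432 \
  postgres:15

docker run -d --name literalai-redis -p 6379:6379 redis:7
```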
### Disclaimer

It is your responsibility to ensure that the infrastructure is secure and compliant with your company's standards. Here are some recommendations:

- Run the database and Redis cache in a private network so that only the container running the Literal AI platform can access them.
- Disallow public access to the file storage.
- Disable credentials (email/password) authentication and use OAuth providers instead.
## Configure Environment Variables

Before running the container, you need to set up the environment variables.
### 1. Database Configuration

#### Required

You can either pass the full connection string or the individual components.

Variable | Description | Example |
---|---|---|
DATABASE_URL | URL for database communication (PostgreSQL supported). | postgresql://username:password@host:port/db_name |
OR
Variable | Description |
---|---|
DATABASE_HOST | Host of the database. |
DATABASE_USERNAME | Database username. |
DATABASE_PASSWORD | Password for the DB user. |
DATABASE_NAME | Database name. |
#### Optional

Variable | Description | Example |
---|---|---|
DATABASE_DIRECT_URL | Direct URL for database migrations (cannot use pg-bouncer). | postgresql://username:password@host:port/db_name |
DATABASE_SSL | Set to 'true' if using a secure network. | false |
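For example, using the full connection string (host, credentials, and database name below are placeholders):

```shell
# Placeholder values — replace with your own database coordinates.
export DATABASE_URL=postgresql://literal:changeme@postgres.internal:5432/literalai
# Optional: a direct (non-pooled) URL for migrations, and the SSL toggle.
export DATABASE_DIRECT_URL=postgresql://literal:changeme@postgres.internal:5432/literalai
export DATABASE_SSL=false
```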
### 2. Authentication Configuration

#### Provider Agnostic

Variable | Description | Example |
---|---|---|
NEXTAUTH_SECRET | Secret for encrypting JWTs and hashing tokens. | your secret |
NEXTAUTH_URL | URL for your domain. | https://yourdomain.com |
ENABLE_CREDENTIALS_AUTH | Enable credentials (email/password) authentication. | false |
#### Provider Specific

Remember that you will have to allow the OAuth redirect URL in your provider's settings.

Provider | Variables | OAuth Redirect URL |
---|---|---|
Azure | AZURE_AD_CLIENT_ID, AZURE_AD_CLIENT_SECRET, AZURE_AD_TENANT_ID | /api/auth/callback/azure-ad |
Google | GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET | /api/auth/callback/google |
Okta | OKTA_CLIENT_ID, OKTA_CLIENT_SECRET, OKTA_ISSUER | /api/auth/callback/okta |
GitHub | GITHUB_ID, GITHUB_SECRET | /api/auth/callback/github |
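As an illustration, you might generate the NextAuth secret with openssl and use Google as the OAuth provider; the client ID and secret below are placeholders to be replaced by your own Google OAuth credentials:

```shell
# One way to generate a random NextAuth secret.
export NEXTAUTH_SECRET="$(openssl rand -base64 32)"
export NEXTAUTH_URL=https://yourdomain.com
export ENABLE_CREDENTIALS_AUTH=false

# Example: Google as the OAuth provider (placeholder values).
export GOOGLE_CLIENT_ID=your-client-id.apps.googleusercontent.com
export GOOGLE_CLIENT_SECRET=your-client-secret
# Remember to allow https://yourdomain.com/api/auth/callback/google
# as a redirect URL in your Google OAuth settings.
```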
### 3. File Storage Configuration

#### Provider Agnostic

Variable | Description | Example |
---|---|---|
BUCKET_NAME | Name of the bucket. | my_bucket |
#### Provider Specific

Provider | Variables | Comment |
---|---|---|
AWS S3 | APP_AWS_ACCESS_KEY, APP_AWS_SECRET_KEY, APP_AWS_REGION | |
Google Cloud Storage | APP_GCS_PROJECT_ID, APP_GCS_CLIENT_EMAIL, APP_GCS_PRIVATE_KEY | The private key should be base64 encoded. |
Azure Blob Storage | APP_AZURE_STORAGE_ACCOUNT, APP_AZURE_STORAGE_ACCESS_KEY | |
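For instance, an S3-backed setup could export the variables below (placeholder values). The commented line shows one possible way to base64-encode a Google Cloud service-account private key, assuming it has been saved to a PEM file named gcs_private_key.pem (a hypothetical filename):

```shell
# Placeholder values for an S3-backed setup.
export BUCKET_NAME=my_bucket
export APP_AWS_ACCESS_KEY=your-access-key-id
export APP_AWS_SECRET_KEY=your-secret-access-key
export APP_AWS_REGION=eu-west-1

# For Google Cloud Storage, the private key should be base64 encoded, e.g.:
# export APP_GCS_PRIVATE_KEY="$(openssl base64 -A -in gcs_private_key.pem)"
```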
### 4. Cache Configuration [Optional]

Setting up a cache is optional, but recommended to decrease latency. You can use either REDIS_URL, or REDISHOST and REDISPORT.

Variable | Description | Example |
---|---|---|
REDIS_URL | Full Redis URL. | redis://127.0.0.1:6379/0 |
REDISHOST | Redis host (useful with Google Cloud Run). | 127.0.0.1 |
REDISPORT | Redis port. | 6379 |
The cache allows for asynchronous task handling, such as:

- step ingestion
- AI evals monitoring
- experiment runs

Control over the queues and workers handling these asynchronous tasks requires an authentication token, defined via the ADMIN_AUTH_TOKEN environment variable. Set it to a non-empty value to be able to investigate your cache-based queueing system.
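For instance (placeholder Redis host; the admin token can be any non-empty string):

```shell
# Point the platform at your Redis instance (placeholder host).
export REDIS_URL=redis://redis.internal:6379/0
# Non-empty token required to investigate queues and workers.
export ADMIN_AUTH_TOKEN="$(openssl rand -hex 16)"
```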
### 5. SMTP Configuration [Optional]

Setting up SMTP is optional. It enables Literal AI to send emails for password resets and project invitations.

Variable | Description | Example |
---|---|---|
EMAIL_SERVER_HOST | Mail server host. | "servicehost" |
EMAIL_SERVER_PORT | Mail server port. | "4000" |
EMAIL_SERVICE | Email service provider. | "gmail" |
EMAIL_USER | Email username. | "username" |
EMAIL_PASS | Email password. | "password" |
EMAIL_FROM | Sender email address. | "noreply@service.com" |
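For example (placeholder values for a generic SMTP provider):

```shell
# Placeholder SMTP settings — replace with your provider's details.
export EMAIL_SERVER_HOST=smtp.example.com
export EMAIL_SERVER_PORT=587
export EMAIL_USER=username
export EMAIL_PASS=password
export EMAIL_FROM=noreply@example.com
```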
The server within the Docker image runs on port 3000. Ensure this port is exposed.
## Start the Container

You should now be able to start the container and access the Literal AI platform.
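A minimal launch command might look like the following, assuming the variables from the previous sections are exported in the current shell (a `-e VAR` flag with no value makes Docker forward the variable from the calling environment); the container name and the exact selection of variables are illustrative:

```shell
# Start the platform, forwarding the previously exported variables
# and exposing the web server on port 3000.
docker run -d --name literalai-platform \
  -p 3000:3000 \
  -e DATABASE_URL -e DATABASE_SSL \
  -e NEXTAUTH_SECRET -e NEXTAUTH_URL -e ENABLE_CREDENTIALS_AUTH \
  -e GOOGLE_CLIENT_ID -e GOOGLE_CLIENT_SECRET \
  -e BUCKET_NAME -e APP_AWS_ACCESS_KEY -e APP_AWS_SECRET_KEY -e APP_AWS_REGION \
  -e REDIS_URL -e ADMIN_AUTH_TOKEN \
  literalai/platform:latest
```

Once the container is running, the platform should be reachable on port 3000 of the host, or at the domain configured in NEXTAUTH_URL once your reverse proxy or load balancer is in place.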