You can deploy the Literal AI platform anywhere by provisioning the infrastructure, setting up the environment variables and running the Docker image.

Get the Image

```shell
docker login --username literalai
# When prompted for a password, use the token you have been provided with
docker pull literalai/platform:latest
```

Provision the Infrastructure

To run the Literal AI platform, you need to set up the following services:

  1. PostgreSQL database >= 15
  2. File storage (AWS S3, Google Cloud Storage, Azure Blob Storage)
  3. [Optional] Redis cache
  4. [Optional] SMTP server

Disclaimer

It is your responsibility to ensure that the infrastructure is secure and compliant with your company standards. Here are some recommendations:

  • Run the database and Redis cache in a private network so that only the container running the Literal AI platform can access them.
  • Disallow public access to the file storage.
  • Disable credential authentication and use OAuth providers for authentication.

Configure Environment Variables

Before running the container, you need to set up the environment variables.

1. Database Configuration

The pgcrypto extension is required for PostgreSQL. When using Azure, you will need to add the extension manually.
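For example, if your database user has sufficient privileges, the extension can be enabled with a single statement run against the Literal AI database:

```sql
-- Enable the pgcrypto extension (required by the platform)
CREATE EXTENSION IF NOT EXISTS pgcrypto;
```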

Required

You can either pass the full connection string or the individual components.

| Variable | Description | Example |
| --- | --- | --- |
| DATABASE_URL | URL for database communication (PostgreSQL supported). | postgresql://username:password@host:port/db_name |

OR

| Variable | Description |
| --- | --- |
| DATABASE_HOST | Host of the database. |
| DATABASE_USERNAME | Database username. |
| DATABASE_PASSWORD | Password for the DB user. |
| DATABASE_NAME | Database name. |

Optional

| Variable | Description | Example |
| --- | --- | --- |
| DATABASE_DIRECT_URL | Direct URL for database migrations (cannot use pg-bouncer). | postgresql://username:password@host:port/db_name |
| DATABASE_SSL | Set to `true` if using a secure network. | false |
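As a sketch, the database variables might be collected in an env file later passed to the container; all values here are placeholders:

```shell
# .env — database settings (placeholder values)
DATABASE_URL=postgresql://username:password@host:5432/db_name
# Or, alternatively, the individual components:
# DATABASE_HOST=host
# DATABASE_USERNAME=username
# DATABASE_PASSWORD=password
# DATABASE_NAME=db_name
DATABASE_SSL=false
```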

2. Authentication Configuration

Provider Agnostic

| Variable | Description | Example |
| --- | --- | --- |
| NEXTAUTH_SECRET | Secret for encrypting JWT and hashing tokens. | your secret |
| NEXTAUTH_URL | URL for your domain. | https://yourdomain.com |
| ENABLE_CREDENTIALS_AUTH | Enable credentials (email/password) authentication. | false |
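NEXTAUTH_SECRET should be a long random value. One common way to generate it is with OpenSSL:

```shell
# Generate a 32-byte random secret, base64-encoded (44 characters)
NEXTAUTH_SECRET="$(openssl rand -base64 32)"
echo "$NEXTAUTH_SECRET"
```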

Provider Specific

Remember you will have to allow the OAuth redirect URL in your provider's settings.

| Provider | Variables | OAuth Redirect URL |
| --- | --- | --- |
| Azure | AZURE_AD_CLIENT_ID, AZURE_AD_CLIENT_SECRET, AZURE_AD_TENANT_ID | /api/auth/callback/azure-ad |
| Google | GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET | /api/auth/callback/google |
| Okta | OKTA_CLIENT_ID, OKTA_CLIENT_SECRET, OKTA_ISSUER | /api/auth/callback/okta |
| GitHub | GITHUB_ID, GITHUB_SECRET | /api/auth/callback/github |

3. File Storage Configuration

Provider Agnostic

| Variable | Description | Example |
| --- | --- | --- |
| BUCKET_NAME | Name of the bucket. | my_bucket |

Provider Specific

| Provider | Variables | Comment |
| --- | --- | --- |
| AWS S3 | APP_AWS_ACCESS_KEY, APP_AWS_SECRET_KEY, APP_AWS_REGION | |
| Google Cloud Storage | APP_GCS_PROJECT_ID, APP_GCS_CLIENT_EMAIL, APP_GCS_PRIVATE_KEY | Private key should be base64 encoded. |
| Azure Blob Storage | APP_AZURE_STORAGE_ACCOUNT, APP_AZURE_STORAGE_ACCESS_KEY | |

Remember to configure CORS for your storage solution, so that the browser can upload and fetch attachments directly from the bucket.
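As an illustration, a minimal CORS policy for an AWS S3 bucket might look like the following (the allowed origin should be your platform's domain; GCS and Azure offer equivalent settings):

```json
[
  {
    "AllowedOrigins": ["https://yourdomain.com"],
    "AllowedMethods": ["GET", "PUT", "POST"],
    "AllowedHeaders": ["*"],
    "MaxAgeSeconds": 3600
  }
]
```

On S3 this can be applied with, for example, `aws s3api put-bucket-cors --bucket my_bucket --cors-configuration file://cors.json`.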

4. Cache Configuration [Optional]

Setting up a cache is optional, but recommended to decrease latency.

Provide either REDIS_URL, or both REDISHOST and REDISPORT.

| Variable | Description | Example |
| --- | --- | --- |
| REDIS_URL | Full Redis URL. | redis://127.0.0.1:6379/0 |
| REDISHOST | Redis host (useful with Google Cloud Run). | 127.0.0.1 |
| REDISPORT | Redis port. | 6379 |
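Once Redis is provisioned, you can sanity-check that it is reachable with redis-cli, assuming the tool is installed on the machine or network you run the check from (the URL here is the example value from the table above):

```shell
# Should print PONG if the cache is reachable
redis-cli -u "redis://127.0.0.1:6379/0" ping
```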

The cache allows for asynchronous task handling such as:

  • step ingestion
  • AI evals monitoring
  • experiment runs

Controlling the queues and workers that handle these asynchronous tasks requires an authentication token, defined via the ADMIN_AUTH_TOKEN environment variable. Set it to a non-empty value to be able to inspect your cache-based queueing system.

5. SMTP Configuration [Optional]

Setting up SMTP is optional. It enables Literal AI to send emails for password resets and project invitations.
| Variable | Description | Example |
| --- | --- | --- |
| EMAIL_SERVER_HOST | Mail server host. | "servicehost" |
| EMAIL_SERVER_PORT | Mail server port. | "4000" |
| EMAIL_SERVICE | Email service provider. | "gmail" |
| EMAIL_USER | Email username. | "username" |
| EMAIL_PASS | Email password. | "password" |
| EMAIL_FROM | Sender email address. | "noreply@service.com" |

Configure the Port and Start the Container

The server within the Docker image runs on port 3000. Ensure this port is exposed.

Start the Container

You should now be able to start the container and access the Literal AI Platform.
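Putting it together, a minimal launch might look like the following sketch (the container name, env file, and host port mapping are illustrative; adapt them to your setup):

```shell
# Run the platform, mapping the internal port 3000 to the host
docker run -d \
  --name literalai-platform \
  --env-file .env \
  -p 3000:3000 \
  literalai/platform:latest
```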