
Pipelines: Automated Render Workflows

Beta: The Glossi API is currently in beta. Endpoints, request/response formats, and behavior may change as we iterate. If you run into issues or have feedback, reach out at [email protected].

Pipelines let you connect an external file storage system (like Google Drive or OneDrive) directly to Glossi. When new 3D model files appear in your source folder, Glossi automatically picks them up, processes them, renders images using your chosen template, and exports the finished renders to your output folder.

No code, no polling, no manual steps -- just drop files and get renders.

You can manage pipelines via the dashboard at /settings/pipelines or programmatically through the API.


How It Works

Your Source Folder → Glossi Pipeline → Your Output Folder

Glossi polls your source folder on a schedule. Each new file it finds is downloaded, processed, rendered using your chosen template, and exported to your output folder.

Each file goes through these stages automatically:

| Stage | What Happens |
|---|---|
| Queued | New file detected in source folder |
| Ingesting | File downloaded and uploaded to Glossi |
| Processing | 3D model being converted and prepared |
| Rendering | Images/videos being rendered using your template |
| Exporting | Finished renders uploaded to your output folder |
| Completed | Done! |

If any stage fails, the pipeline retries automatically (up to 3 times by default).


Quick Start

Prerequisites

  1. A Glossi template to use for rendering (browse templates)

  2. A Google Drive or OneDrive account (see connector setup below)

1. Set Up Your Connector

Currently supported: Google Drive and OneDrive. More connectors (Dropbox, S3) coming soon.

Choose your connector below:

OneDrive Setup

OneDrive uses a Sign in with Microsoft flow -- just click a button on the dashboard and authorize Glossi to access your files. Works with both personal Microsoft accounts and Microsoft 365 business accounts.

  1. Go to Settings → Pipelines (/settings/pipelines) and click Create Pipeline

  2. On the Connectors step, click Sign in with Microsoft

  3. Sign in with your Microsoft account and grant Glossi permission to access your OneDrive files

  4. After signing in, you'll be redirected back and your account will be connected automatically

  5. Click Browse Folders to navigate your OneDrive and select your input and output folders

Google Drive Setup

Google Drive uses a service account for authentication -- no user login required, fully automated.

Important: For output folders (where renders are exported), you must use a Shared Drive (requires Google Workspace). Service accounts don't have storage quota and cannot upload to regular folders. For input folders (where you drop 3D files), regular folders work fine.

Step 1: Create a Service Account

  1. Go to Google Cloud Console and create a project (or use an existing one)

  2. Go to APIs & Services → Library, search for "Google Drive API", and click Enable

  3. Go to IAM & Admin → Service Accounts → Create Service Account

  4. Give it a name (e.g., "Glossi Pipeline"), click Create and Continue, then Done

  5. Click on your new service account, go to the Keys tab → Add Key → Create new key → JSON

  6. Download the JSON file -- it contains your credentials

From the JSON file, you'll need:

  • client_email -- the service account email address

  • private_key -- the private key (include the full key with -----BEGIN PRIVATE KEY----- markers)
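If you have jq installed, you can pull both values straight out of the downloaded key file (the filename service-account.json below is just an example):

```bash
# Read the two credential fields Glossi needs from the JSON key file.
jq -r '.client_email' service-account.json
jq -r '.private_key'  service-account.json
```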

Step 2: Set up your Input Folder (regular folder)

  1. In Google Drive, create or choose a folder for input files

  2. Right-click the folder → Share

  3. Add the service account email (e.g., [email protected])

  4. Give it Editor access

Step 3: Set up your Output Folder (Shared Drive required)

Service accounts cannot upload to regular folders because they don't have storage quota. You must use a Shared Drive:

  1. In Google Drive, click Shared drives in the left sidebar

  2. Click New to create a Shared Drive (e.g., "Glossi Renders")

  3. Click the Shared Drive name → Manage members

  4. Add the service account email with Content Manager role

Note: Shared Drives are only available with Google Workspace (paid). If you have a free @gmail.com account, consider using OneDrive for output instead (an S3 connector is coming soon).

Get your Folder IDs:

Open the folder in Google Drive and copy the ID from the URL -- it's the last path segment, as in drive.google.com/drive/folders/<FOLDER_ID>.

For Shared Drives, the folder ID appears in the URL the same way.
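As a quick sketch, the ID can be pulled off the URL in the shell (the URL below is a made-up example):

```bash
# The folder ID is the last path segment of the folder URL.
url="https://drive.google.com/drive/folders/1AbCdEfGhIjKlMnOpQrStUv"
folder_id="${url##*/}"        # strip everything through the last '/'
folder_id="${folder_id%%\?*}" # drop a trailing query string like ?usp=sharing
echo "$folder_id"
```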

2. Create a Pipeline

The easiest way to create a pipeline is through the Glossi dashboard. If you prefer to automate pipeline creation programmatically, you can use the REST API instead.

Option A: Dashboard (recommended)

Go to Settings → Pipelines (/settings/pipelines) and click Create Pipeline. The guided form walks you through:

  1. Connectors -- choose your connector type, sign in with Microsoft (for OneDrive) or enter your service account credentials (for Google Drive), then select your input and output folders

  2. Basics -- name your pipeline and choose a template

  3. Advanced -- adjust polling interval, batch size, retry limits, and other options (the defaults work well for most use cases)

Once created, Glossi starts watching your input folder immediately.

Option B: REST API

If you'd rather create pipelines programmatically, you'll need a Glossi API key.

Using Google Drive:
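A sketch of what the request might look like. The base URL, endpoint path, and top-level payload fields (name, templateId, source, destination, options) are assumptions here -- only the connector fields (connector, clientEmail, privateKey, folderId) and option names come from this guide, so check the API reference for the exact shape:

```bash
# Hypothetical endpoint and payload shape -- verify against the API reference.
curl -X POST "$GLOSSI_API_URL/pipelines" \
  -H "Authorization: Bearer $GLOSSI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Product Renders",
    "templateId": "<template-id>",
    "source": {
      "connector": "googledrive",
      "clientEmail": "<service-account-email>",
      "privateKey": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "folderId": "<input-folder-id>"
    },
    "destination": {
      "connector": "googledrive",
      "clientEmail": "<service-account-email>",
      "privateKey": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "folderId": "<shared-drive-output-folder-id>"
    },
    "options": {
      "pollIntervalMinutes": 5,
      "processExistingFiles": true
    }
  }'
```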

Using OneDrive:

Note: OneDrive pipelines require a one-time Sign in with Microsoft via the dashboard (to authorize Glossi), so create OneDrive pipelines from the dashboard at /settings/pipelines rather than the API.

3. Monitor Progress

Any .glb, .gltf, .usdz, .usd, .fbx, .obj, .step, .stp, or .stl files that appear in your input folder will be automatically processed.

You can monitor progress from the dashboard at /settings/pipelines -- click on a pipeline to see job counts by stage, browse individual jobs, and filter by status.

Or check status via the API:
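For example (the endpoint path here is an assumption -- see the API reference for the exact route):

```bash
# Fetch a pipeline's status and per-stage job counts (hypothetical route).
curl "$GLOSSI_API_URL/pipelines/<pipeline-id>" \
  -H "Authorization: Bearer $GLOSSI_API_KEY"
```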

To view individual jobs (e.g. failed ones):
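A sketch, assuming jobs are listed under the pipeline with a status filter (route and query parameter are assumptions):

```bash
# List a pipeline's jobs, filtered to failures (hypothetical route).
curl "$GLOSSI_API_URL/pipelines/<pipeline-id>/jobs?status=failed" \
  -H "Authorization: Bearer $GLOSSI_API_KEY"
```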


Pipeline Options

| Option | Default | Description |
|---|---|---|
| pollIntervalMinutes | 5 | How often Glossi checks your folder for new files (5--60 minutes) |
| processExistingFiles | true | Whether to process files already in the folder when the pipeline is first created |
| reprocessOnModified | false | Whether to re-render a file if it's updated after the initial render |
| batchSize | 10 | Maximum files to process in parallel per cycle |
| maxRetries | 3 | How many times to retry a failed file before giving up |

Render Settings

| Setting | Default | Description |
|---|---|---|
| renderBookmarks | true | Render all camera bookmark images |
| renderShots | true | Render all video shots |
| renderVariants | true | Render all model variants |
| imageQuality | 1 | Image resolution: 0 = 720p, 1 = 1080p, 2 = 4K |
| videoQuality | 1 | Video quality: 0 = Fastest, 1 = Balanced, 2 = Best |


Managing Pipelines

You can pause, resume, and delete pipelines from the dashboard at /settings/pipelines with one click. If you prefer the API:

Pause a Pipeline
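A sketch, assuming pause is a PATCH on the pipeline resource (the route and status field are assumptions):

```bash
# Pause: stop polling without deleting the pipeline (hypothetical shape).
curl -X PATCH "$GLOSSI_API_URL/pipelines/<pipeline-id>" \
  -H "Authorization: Bearer $GLOSSI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"status": "paused"}'
```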

Resume a Pipeline
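Resuming is a PATCH request (this guide confirms PATCH for resume; the exact field name is an assumption):

```bash
# Resume polling on a paused pipeline (hypothetical payload).
curl -X PATCH "$GLOSSI_API_URL/pipelines/<pipeline-id>" \
  -H "Authorization: Bearer $GLOSSI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"status": "active"}'
```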

This also resets the error counter if the pipeline was in an error state.

Delete a Pipeline
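A sketch, assuming a conventional DELETE route (verify the exact path in the API reference):

```bash
# Hypothetical route -- deletion is permanent.
curl -X DELETE "$GLOSSI_API_URL/pipelines/<pipeline-id>" \
  -H "Authorization: Bearer $GLOSSI_API_KEY"
```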

This removes the pipeline and all its job history.


How File Detection Works

Pipelines use efficient change tracking to detect new files:

  • First poll: Scans the folder and either queues all existing files (processExistingFiles: true) or just starts tracking from that point (processExistingFiles: false)

  • Subsequent polls: Only checks for changes since the last poll, so even a folder with 1,000 files is scanned instantly if nothing changed

  • Deduplication: Each file is only ever processed once. If the same file appears in multiple scans, it's ignored after the first time

  • Modified files: If reprocessOnModified is enabled, files that are updated after their initial render will be re-processed


Error Handling

Pipelines are designed to handle failures gracefully:

  • Automatic retries: Failed files are retried with increasing delays (1 min, 2 min, 4 min, etc.)

  • Max retries: After the configured number of retries (default 3), the file is marked as permanently failed

  • Circuit breaker: If 10 files fail in a row, the pipeline pauses automatically to prevent runaway errors. Resume it with a PATCH request once you've investigated

  • Job history: Every failure is recorded with the error message, the stage it failed at, and a timestamp -- useful for debugging
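The doubling retry schedule above works out as follows (a sketch of the arithmetic, using the default maxRetries of 3):

```bash
# Retry delays double each attempt: 1 min, 2 min, 4 min for maxRetries=3.
max_retries=3
delay=1
for attempt in $(seq 1 "$max_retries"); do
  echo "attempt $attempt: wait ${delay} min before retrying"
  delay=$((delay * 2))
done
```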

Viewing Failed Jobs

Each failed job includes:

  • lastError -- the most recent error message

  • retryCount -- how many retry attempts have been made

  • sourceFile.name -- which file failed


Output Folder Structure

Rendered files are organized in your output folder by model name:
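An illustrative layout -- the folder and file names below are made up, since actual names come from your models and template:

```
<output-folder>/
├── chair-01/
│   ├── bookmark-front.png
│   ├── bookmark-side.png
│   └── shot-turntable.mp4
└── lamp-02/
    └── bookmark-front.png
```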


Supported File Types

| Extension | Format |
|---|---|
| .glb | glTF Binary |
| .gltf | glTF |
| .usdz | Universal Scene Description |
| .usd | Universal Scene Description |
| .fbx | Autodesk FBX |
| .obj | Wavefront OBJ |
| .step, .stp | STEP (CAD) |
| .stl | Stereolithography |


Available Connectors

| Connector | Status | Description |
|---|---|---|
| onedrive | Available | Microsoft OneDrive (personal and business accounts) |
| googledrive | Available | Google Drive (output requires Shared Drive / Google Workspace) |
| dropbox | Coming soon | Dropbox |
| s3 | Coming soon | Amazon S3 / compatible |

Check available connectors:
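A sketch, assuming a simple listing route (the path is an assumption -- check the API reference):

```bash
# List connector types and their availability (hypothetical route).
curl "$GLOSSI_API_URL/connectors" \
  -H "Authorization: Bearer $GLOSSI_API_KEY"
```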

OneDrive Connector

Authenticated via Sign in with Microsoft on the dashboard. Works with personal Microsoft accounts (@outlook.com, @hotmail.com) and Microsoft 365 business accounts. No Azure app registration or admin setup required.

| Field | Description |
|---|---|
| refreshToken | OAuth refresh token (set automatically after sign-in via the dashboard) |
| driveId | OneDrive drive ID (set automatically after sign-in) |
| folderId | Folder item ID (enter the ID of the folder to watch or export to) |

Google Drive Connector

Uses a Google Cloud service account for authentication. No user login required.

| Field | Description |
|---|---|
| clientEmail | Service account email (from JSON key file client_email) |
| privateKey | Service account private key (from JSON key file private_key, include BEGIN/END markers) |
| folderId | Google Drive folder ID (from the folder URL) |

Important:

  • Input folders: Share the folder with the service account email (Editor access)

  • Output folders: Must use a Shared Drive (Google Workspace required). Add the service account as a Content Manager. Service accounts cannot upload to regular folders due to storage quota limitations.


Dashboard

Everything in this guide can also be done from the Glossi dashboard at /settings/pipelines -- no API key or curl commands needed. The dashboard and API share the same underlying service, so changes made in either place are reflected everywhere.


Security

Connector credentials (such as Google service account keys and OneDrive OAuth tokens) are protected at multiple layers:

  • Encrypted at rest: Sensitive fields (private keys, refresh tokens, client secrets) are encrypted with AES-256-GCM before being stored in the database. They are only decrypted internally when the pipeline needs to communicate with your storage service.

  • Redacted in responses: API and dashboard responses never expose raw secrets. Secret fields are replaced with •••••••• in all query responses.

  • OAuth tokens: OneDrive uses short-lived access tokens (1 hour) that are refreshed automatically. If Microsoft rotates the refresh token, Glossi stores the new one securely.

  • Transit: All API communication uses HTTPS/TLS.


Need Help?

  • Support: Contact us at [email protected]

  • API Reference: See the Getting Started Guide

  • Webhooks: Configure notifications for pipeline events
