Unlike other workflow engines that are purely ephemeral, dataflows builds a persistent "Context" of your company's data. This allows for historical analysis, time-travel debugging, and AI context generation.
The core schema is designed to be flexible yet strict where it matters.
- `service_clients`: Manages the identity of external systems.
- `connections`: Stores the actual credentials (tokens) linking a User or Account to a Service Client.
- `executions`: The audit log of everything that happens in the system.
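To make the relationships concrete, here is a rough TypeScript sketch of the shapes these tables imply. This is illustrative only; all field names are assumptions, not the actual schema:

```ts
// Illustrative shapes only; the real schema may differ.
interface ServiceClient {
  id: string
  name: string // e.g. "clickup"
}

interface Connection {
  id: string
  service_client_id: string
  user_id: string | null // a connection belongs to a User or an Account
  account_id: string | null
  access_token: string // the stored credential
}

interface Execution {
  id: string
  connection_id: string
  status: 'running' | 'succeeded' | 'failed'
  started_at: Date
}
```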
dataflows is designed to synchronize data from external systems into its own warehouse tables.
```ts
// Example: Syncing ClickUp Tasks to the local DB
// upsertAgencyProject / upsertAgencyTask are app-specific helpers;
// adjust the import paths to wherever your data layer lives.
import { upsertAgencyProject, upsertAgencyTask } from './db'
import type { TaskModel } from './models'

export async function saveToDatabase(model: TaskModel) {
  'use step'

  // Upsert the parent project first so the task can reference it
  const project = await upsertAgencyProject({
    clickup_id: model.project.id,
    name: model.project.name,
  })

  // Upsert the task, linking it to the upserted project row
  await upsertAgencyTask({
    clickup_id: model.id,
    project_id: project.id,
    name: model.name,
    status: model.status,
  })
}
```
This allows you to query your data using standard SQL, even if the source API is slow or down.
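For instance, here is a minimal sketch using node-postgres; the `agency_projects` and `agency_tasks` table names are assumptions inferred from the upsert helpers above:

```ts
import { Pool } from 'pg'

const pool = new Pool({ connectionString: process.env.DATABASE_URL })

// Reads the locally synced copy; no call to the ClickUp API is involved.
const { rows } = await pool.query(`
  SELECT p.name AS project, count(*) AS open_tasks
  FROM agency_tasks t
  JOIN agency_projects p ON p.id = t.project_id
  WHERE t.status <> 'closed'
  GROUP BY p.name
`)
console.log(rows)
```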
The true power of the dataflows warehouse is its ability to turn raw API data into actionable business intelligence.
We don't just dump JSON into the database. We create structured SQL Views that join data across different services.
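As a sketch of that pattern, the view below reshapes the synced ClickUp tables into a reporting-friendly summary. The view name and columns are illustrative, reusing the assumed table names from the earlier example:

```ts
import { Pool } from 'pg'

const pool = new Pool({ connectionString: process.env.DATABASE_URL })

// Illustrative view over the synced ClickUp tables; the same pattern
// applies to joins across tables synced from other services.
await pool.query(`
  CREATE OR REPLACE VIEW project_task_summary AS
  SELECT p.name AS project,
         count(*) FILTER (WHERE t.status <> 'closed') AS open_tasks,
         count(*) AS total_tasks
  FROM agency_projects p
  JOIN agency_tasks t ON t.project_id = p.id
  GROUP BY p.name
`)
```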
Because the data is in standard PostgreSQL, you can connect any visualization tool: