Architecture

Data Layer & Warehouse

The schema that powers the dataflows context.

The Data Warehouse

Unlike other workflow engines that are purely ephemeral, dataflows builds a persistent "Context" of your company's data. This allows for historical analysis, time-travel debugging, and AI context generation.

Core Schema

The core schema is designed to be flexible yet strict where it matters.

Service Clients (service_clients)

Manages the identity of external systems.

  • Auth Types: OAuth2, API Key, Basic Auth.
  • Security Schema: Maps to OpenAPI security definitions.
  • Scopes: Defines what permissions the client has.
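To make the shape concrete, here is a minimal sketch of what a service client record might look like, together with a scope check. The field names and the `hasScopes` helper are illustrative assumptions, not the actual dataflows schema.

```typescript
// Hypothetical sketch — field names are illustrative, not the real schema.
type AuthType = 'oauth2' | 'api_key' | 'basic'

interface ServiceClient {
  id: string
  authType: AuthType
  // Mirrors an OpenAPI security scheme, e.g. { type: 'oauth2', flows: { ... } }
  securitySchema: Record<string, unknown>
  scopes: string[]
}

// Does this client hold every scope an operation requires?
function hasScopes(client: ServiceClient, required: string[]): boolean {
  return required.every((scope) => client.scopes.includes(scope))
}

const github: ServiceClient = {
  id: 'github',
  authType: 'oauth2',
  securitySchema: { type: 'oauth2' },
  scopes: ['repo', 'read:user']
}

console.log(hasScopes(github, ['repo']))      // true
console.log(hasScopes(github, ['admin:org'])) // false
```

Mapping the security schema onto OpenAPI definitions means the same record can drive both request signing and generated documentation.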

Connections (connections)

Stores the actual credentials (tokens) linking a User or Account to a Service Client.

  • Token Management: Automatically handles Access and Refresh tokens.
  • User Info: Caches user profile data from the provider.
  • Encryption: All sensitive tokens are encrypted at rest.
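The refresh logic can be sketched roughly as follows. The `Connection` fields and the `needsRefresh` helper are illustrative assumptions; the real table stores tokens encrypted, as noted above.

```typescript
// Hypothetical sketch — the real connections table/columns may differ.
interface Connection {
  userId: string
  serviceClientId: string
  accessToken: string   // encrypted at rest
  refreshToken?: string // encrypted at rest
  expiresAt: number     // unix ms
}

// Refresh slightly before expiry so in-flight requests never race the deadline.
const REFRESH_MARGIN_MS = 60_000

function needsRefresh(conn: Connection, now: number = Date.now()): boolean {
  return now >= conn.expiresAt - REFRESH_MARGIN_MS
}

const conn: Connection = {
  userId: 'u1',
  serviceClientId: 'github',
  accessToken: '<encrypted>',
  refreshToken: '<encrypted>',
  expiresAt: 1_000_000
}

console.log(needsRefresh(conn, 500_000)) // false — well before expiry
console.log(needsRefresh(conn, 950_000)) // true — inside the refresh margin
```

Refreshing ahead of the deadline, rather than on a 401 response, keeps workflows from failing mid-run on an expired token.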

Executions (executions)

The audit log of everything that happens in the system.

  • Payload: The full event data that triggered the workflow.
  • Result: The final output of the workflow.
  • Status: Running, Success, Error.
  • Timing: Precise start and completion timestamps.
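An execution row could be modelled like this; the column names and the `durationMs` helper are a hypothetical sketch, not the actual schema.

```typescript
// Hypothetical sketch of an execution row — the actual columns may differ.
type ExecutionStatus = 'running' | 'success' | 'error'

interface Execution {
  id: string
  workflowId: string
  status: ExecutionStatus
  payload: unknown     // the full event data that triggered the workflow
  result?: unknown     // the final output of the workflow
  startedAt: number    // unix ms
  completedAt?: number // unset while the execution is still running
}

// Duration in ms, or null while the execution is still running.
function durationMs(e: Execution): number | null {
  return e.completedAt == null ? null : e.completedAt - e.startedAt
}

const run: Execution = {
  id: 'exec_1',
  workflowId: 'sync-clickup-task',
  status: 'success',
  payload: { taskId: 't1' },
  result: { upserted: 2 },
  startedAt: 1_000,
  completedAt: 4_500
}

console.log(durationMs(run)) // 3500
```

Because both payload and result are stored, a failed run can be replayed with the exact event that triggered it, which is what makes time-travel debugging possible.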

Data Synchronization

dataflows is designed to synchronize data from external systems into its own warehouse tables.

// Example: Syncing ClickUp Tasks to the local warehouse
export const syncClickUpTask = defineWorkflow({
  id: 'sync-clickup-task',
  trigger: clickup.onTaskUpdated(),

  async run({ event, step }) {
    const task = event.payload

    // Each upsert is a durable step — retried on failure,
    // skipped on replay if it already succeeded.
    const project = await step.run('upsert-project', () =>
      db.projects.upsert({
        clickup_id: task.project.id,
        name: task.project.name
      })
    )

    await step.run('upsert-task', () =>
      db.tasks.upsert({
        clickup_id: task.id,
        project_id: project.id,
        name: task.name,
        status: task.status
      })
    )
  }
})

Because the synced data lives in your own warehouse tables, you can query it with standard SQL even when the source API is slow or down.


Analytics & BI Readiness

The true power of the dataflows warehouse is its ability to turn raw API data into actionable business intelligence.

Reporting Views

We don't just dump JSON into the database. We create structured SQL Views that join data across different services.

  • Unified Customer View: Join Stripe payments with HubSpot contacts.
  • Project Profitability: Join ClickUp time entries with Xero invoices.

Visualization

Because the data is in standard PostgreSQL, you can connect any visualization tool:

  • Internal Dashboards: Build custom Nuxt UI dashboards directly in the dataflows app.
  • BI Tools: Connect Power BI, Tableau, or Metabase directly to the warehouse.
  • AI Context: This structured data serves as the "Long Term Memory" for your AI agents, allowing them to answer questions like "Which projects were over budget last month?"

Ready for automation that just works?

Tell us about your process. We will tell you if and how dataflows can help.