Create backfill
POST /api/v1/beam/backfills
Stages data from a Snowflake table for the given window, then streams it to one or more destinations. The job progresses through staging → transporting → completed (or failed / cancelled).
curl -X POST https://api.allium.so/api/v1/beam/backfills \
  -H "X-API-Key: ${ALLIUM_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "job_name": "USDC transfers on Polygon — Jan 2024",
    "table": "polygon.assets.erc20_token_transfers",
    "backfill_window": {
      "partition_key": "block_timestamp",
      "start": "2024-01-01T00:00:00",
      "end": "2024-02-01T00:00:00"
    },
    "filter": {
      "column": "token_address",
      "values": ["0x3c499c542cef5e3811e1192ce70d8cc03d5c3359"]
    },
    "destinations": [
      { "type": "kafka", "name": "polygon-usdc-transfers-backfill" }
    ]
  }'
Response:
{
  "job_id": "f47ac10b-58cc-4372-a567-0e02b2c3d479",
  "workflow_id": "beam-backfill-org123-f47ac10b-58cc-4372-a567-0e02b2c3d479"
}
Use job_id to poll job status via List backfills or to cancel via Cancel backfill.
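A typical client polls until the job reaches one of the terminal states named above (completed, failed, cancelled). A minimal sketch of that loop, with the status lookup injected as a callable so it can wrap whatever HTTP client you use for the List backfills endpoint (the function and parameter names here are illustrative, not part of the API):

```python
import time

# Terminal states per the job lifecycle described above.
TERMINAL_STATES = {"completed", "failed", "cancelled"}

def wait_for_backfill(get_status, job_id, poll_interval=30.0, max_polls=1000):
    """Poll until the backfill reaches a terminal state.

    get_status is any callable mapping a job_id to its current status
    string (e.g. a thin wrapper around the List backfills endpoint);
    it is injected so the loop itself stays transport-agnostic.
    """
    status = None
    for _ in range(max_polls):
        status = get_status(job_id)
        if status in TERMINAL_STATES:
            return status
        time.sleep(poll_interval)
    raise TimeoutError(f"backfill {job_id} still {status!r} after {max_polls} polls")
```

Backfills over large windows can run for a long time, so prefer a generous poll_interval over tight polling.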
Request body:
{
  "job_name": str,
  "table": str,
  "backfill_window": { ... },
  "filter": { ... },
  "destinations": [ ... ],
  "static_egress_ip": bool
}
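If you build the request body programmatically, one detail worth handling is that filter and static_egress_ip are optional: omit them when unset rather than sending null. A hedged sketch (the helper name is ours, not part of any SDK):

```python
def build_backfill_request(job_name, table, backfill_window, destinations,
                           filter=None, static_egress_ip=False):
    """Assemble the JSON body for POST /api/v1/beam/backfills.

    Field names mirror the request-body schema above; optional
    fields are left out of the payload entirely when unset.
    """
    body = {
        "job_name": job_name,
        "table": table,
        "backfill_window": backfill_window,
        "destinations": destinations,
    }
    if filter is not None:
        body["filter"] = filter
    if static_egress_ip:
        body["static_egress_ip"] = True
    return body
```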

backfill_window

start is inclusive, end is exclusive, and start must be less than end.
To backfill by block height instead of timestamp, set partition_key to "block":
{
  "partition_key": "block",
  "start": 12345,
  "end": 12500
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| partition_key | "block" | Yes | Must be block |
| start | integer | Yes | Inclusive start block |
| end | integer | Yes | Exclusive end block |
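Since a bad window fails only after the job is submitted, it can be worth checking the half-open [start, end) invariant client-side first. A sketch covering both partition modes shown in this doc (integer block heights and the ISO-8601 timestamps used in the curl example); this validation is local and illustrative, not something the API exposes:

```python
def validate_backfill_window(window):
    """Check a backfill_window object before submitting it.

    Handles both partition modes in this doc: integer block heights
    ("block") and ISO-8601 timestamp strings ("block_timestamp").
    """
    key = window["partition_key"]
    start, end = window["start"], window["end"]
    if key == "block":
        if not (isinstance(start, int) and isinstance(end, int)):
            raise TypeError("block windows take integer start/end")
    elif key == "block_timestamp":
        # ISO-8601 timestamps of equal form compare correctly as
        # strings, so the same ordering check applies below.
        if not (isinstance(start, str) and isinstance(end, str)):
            raise TypeError("timestamp windows take ISO-8601 strings")
    else:
        raise ValueError(f"unknown partition_key: {key!r}")
    if not start < end:
        raise ValueError("start must be strictly less than end")
    return window
```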

filter (optional)

{
  "column": "token_address",
  "values": ["0x3c499c542cef5e3811e1192ce70d8cc03d5c3359"]
}
Translates to a WHERE column IN (values) clause on the Snowflake COPY INTO query.
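To make that translation concrete, here is an illustrative reconstruction of how the filter object maps to SQL (the exact quoting and escaping performed server-side may differ):

```python
def filter_to_where(filt):
    """Render a filter object as a WHERE ... IN (...) clause, the
    shape the service applies to its Snowflake COPY INTO query."""
    # Single-quote each value, doubling embedded quotes per SQL rules.
    quoted = ", ".join("'{}'".format(v.replace("'", "''")) for v in filt["values"])
    return "WHERE {} IN ({})".format(filt["column"], quoted)
```

For the filter shown above this produces WHERE token_address IN ('0x3c49...') restricted to the one token contract.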

destinations

Uses the same sink types as pipelines. See the Sinks reference.