A Beam pipeline config describes the full data flow: where data comes from (source), how it’s processed (transforms), and where it goes (sinks).
source → transforms[] → sinks[]

BeamConfig

The top-level object returned by all config endpoints.
{
  "id": "abc123",
  "name": "USDC Transfer Monitor",
  "description": "Monitors USDC ERC20 transfers on Base",
  "tags": ["production", "base"],
  "static_egress_ip": false,
  "owner_org_team_user_id": "org_xyz",
  "created_at": "2025-01-15T10:30:00Z",
  "updated_at": "2025-01-15T10:30:00Z",
  "pipeline_config": {
    "source": { ... },
    "transforms": [ ... ],
    "sinks": [ ... ]
  }
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| id | string | Auto-generated | Pipeline ID. Omit when creating — the server generates it. |
| name | string | Yes | Pipeline name |
| description | string | Yes | Pipeline description |
| tags | string[] | No | Tags for organizing pipelines. Default: null |
| static_egress_ip | boolean | No | Enable a static outbound IP for webhook sinks. Default: false |
| owner_org_team_user_id | string | Auto-set | Owner ID, set from the authenticated API key |
| created_at | datetime | Auto-set | Creation timestamp |
| updated_at | datetime | Auto-set | Last update timestamp |
| pipeline_config | PipelineConfig | Yes | The pipeline’s source, transforms, and sinks |

CreateBeamConfigRequest

Used as the request body for Create pipeline. Same as BeamConfig but without auto-generated fields.
{
  "name": "USDC Transfer Monitor",
  "description": "Monitors USDC ERC20 transfers on Base",
  "tags": ["production", "base"],
  "static_egress_ip": false,
  "pipeline_config": {
    "source": { ... },
    "transforms": [ ... ],
    "sinks": [ ... ]
  }
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| name | string | Yes | Pipeline name |
| description | string | Yes | Pipeline description |
| tags | string[] | No | Tags for organizing pipelines |
| static_egress_ip | boolean | No | Enable a static outbound IP. Default: false |
| pipeline_config | PipelineConfig | Yes | The pipeline’s source, transforms, and sinks |
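As a sketch, the request body above can be assembled and sent like this. The endpoint path (`/api/v1/beam/configs`), base URL, and `X-API-Key` header name are assumptions for illustration; check the Create pipeline reference for the exact values.

```python
import json
import urllib.request

def build_create_request(name, description, pipeline_config,
                         tags=None, static_egress_ip=False):
    """Assemble a CreateBeamConfigRequest body (no auto-generated fields)."""
    body = {
        "name": name,
        "description": description,
        "static_egress_ip": static_egress_ip,
        "pipeline_config": pipeline_config,
    }
    if tags:  # tags are optional
        body["tags"] = tags
    return body

def create_pipeline(api_key, body, base_url="https://api.allium.so"):
    # Hypothetical endpoint and auth header -- verify against the API reference.
    req = urllib.request.Request(
        f"{base_url}/api/v1/beam/configs",
        data=json.dumps(body).encode(),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Note that `id`, `owner_org_team_user_id`, `created_at`, and `updated_at` never appear in the request: the server sets all of them.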

PipelineConfig

{
  "source": { ... },
  "transforms": [ ... ],
  "sinks": [ ... ]
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| source | Source | Yes | Where data comes from |
| transforms | Transform[] | Yes | Processing steps applied in order |
| sinks | Sink[] | Yes | Where processed data is delivered |
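Putting the three required fields together, a minimal pipeline_config assembled from the source, transform, and sink shapes documented on this page looks like the following. The specific chain, entity, expression, and names are illustrative.

```python
# Minimal pipeline_config: one source, one transform, one sink.
pipeline_config = {
    "source": {
        "chain": "base",
        "entity": "erc20_token_transfer",
        "is_zerolag": False,
    },
    "transforms": [
        # uid omitted: the server assigns one on creation
        {"type": "redis_set_filter", "filter_expr": "root = this.token_address"},
    ],
    "sinks": [
        {"type": "kafka", "name": "usdc-transfers"},
    ],
}

def validate_pipeline_config(cfg):
    """Check that the three required top-level fields are present."""
    missing = {"source", "transforms", "sinks"} - cfg.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return True
```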

Source

Connects to Allium’s Datastreams. Select a blockchain and entity type.
{
  "type": "pubsub",
  "chain": "base",
  "entity": "erc20_token_transfer",
  "is_zerolag": false
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| type | string | Auto-set | Source type (set automatically based on chain) |
| chain | string | Yes | Blockchain to source data from |
| entity | string | Yes | Entity type to stream |
| is_zerolag | boolean | No | Stream from the tip of the blockchain before finality. Lower latency but may include reorged data. Default: false |

Supported chains and entities

| Chain | Entities | Zerolag |
| --- | --- | --- |
| Polygon | log, decoded_log, erc20_token_transfer, erc721_token_transfer, erc1155_token_transfer | Yes |
| Base | log, decoded_log, erc20_token_transfer, erc721_token_transfer, erc1155_token_transfer | Yes |
| Solana | nonvoting_transaction | No |
| Hyperliquid | block, trade, fill, order, misc_event | No |
More chains and entities are being added regularly. Check the Datastreams catalog for the latest availability, or use the GET /api/v1/beam/sources endpoint.
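A client-side sanity check mirroring the table above can catch invalid combinations before submitting a config. This static copy goes stale as chains are added, so prefer the GET /api/v1/beam/sources endpoint for live availability.

```python
# Entities shared by the EVM chains in the table above.
EVM_ENTITIES = {"log", "decoded_log", "erc20_token_transfer",
                "erc721_token_transfer", "erc1155_token_transfer"}

SUPPORTED = {
    "polygon":     {"entities": EVM_ENTITIES, "zerolag": True},
    "base":        {"entities": EVM_ENTITIES, "zerolag": True},
    "solana":      {"entities": {"nonvoting_transaction"}, "zerolag": False},
    "hyperliquid": {"entities": {"block", "trade", "fill", "order",
                                 "misc_event"}, "zerolag": False},
}

def check_source(source):
    """Raise ValueError if a Source config uses an unsupported combination."""
    chain = SUPPORTED.get(source["chain"])
    if chain is None:
        raise ValueError(f"unknown chain: {source['chain']}")
    if source["entity"] not in chain["entities"]:
        raise ValueError(f"{source['chain']} does not stream {source['entity']}")
    if source.get("is_zerolag") and not chain["zerolag"]:
        raise ValueError(f"zerolag is not available on {source['chain']}")
```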

Transforms

Transforms process data in order. Two types are available. Each transform has an auto-generated uid — omit it when creating new transforms and the server will assign one.

redis_set_filter

Filters data by matching a field value against a set. Only records whose extracted value exists in your set pass through. Sets support 10M+ values.
{
  "type": "redis_set_filter",
  "uid": "tf-001",
  "filter_expr": "root = this.token_address"
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| type | "redis_set_filter" | Yes | Must be redis_set_filter |
| uid | string | No | Auto-generated if omitted. Include to reference an existing transform. |
| filter_expr | string | Yes | Bloblang expression to extract the field value for filtering |
Common filter_expr patterns:
| Expression | Filters by |
| --- | --- |
| root = this.address | Contract address |
| root = this.topic0 | Event signature |
| root = this.from_address | Transaction sender |
| root = this.to_address | Transaction recipient |
Filter values are managed separately via the filter values endpoints. Changes take effect immediately — no redeploy needed.
Use lowercase values when filtering by addresses, labels, or symbols.
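Because set membership is an exact string match, it helps to normalize values the same way the extracted field will appear before uploading them through the filter values endpoints. A minimal sketch:

```python
def normalize_filter_values(values):
    """Lowercase, trim, and de-duplicate filter values, keeping first-seen order.

    Matches the guidance above: addresses, labels, and symbols should be
    lowercase so they compare equal to the extracted field values.
    """
    seen = {}
    for v in values:
        seen.setdefault(v.strip().lower(), None)
    return list(seen)
```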

Sinks

Sinks define where processed data is delivered. Each sink has an auto-generated uid — omit it when creating new sinks. Four types are available.

kafka

Delivers to an Allium-managed Kafka topic. After deployment, you receive connection credentials and consumer code snippets.
{
  "type": "kafka",
  "uid": "sk-001",
  "name": "usdc-transfers"
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| type | "kafka" | Yes | Must be kafka |
| uid | string | No | Auto-generated if omitted |
| name | string | Yes | Topic name suffix. Full topic: beam.{config_id}.{name} |
Contact support@allium.so if you need a specific sink type.
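Consumers need the full topic name, which follows the beam.{config_id}.{name} pattern from the table above. A small helper keeps that convention in one place:

```python
def kafka_topic(config_id, sink_name):
    """Full topic name for an Allium-managed Kafka sink: beam.{config_id}.{name}."""
    return f"beam.{config_id}.{sink_name}"
```

For example, the pipeline and sink shown earlier on this page would produce the topic beam.abc123.usdc-transfers.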

ID fields and creation behavior

When creating resources (pipelines, transforms, sinks), omit the id / uid field and the server will auto-generate one. When updating, include the id / uid to reference the existing resource.
| Resource | ID field | Behavior when omitted |
| --- | --- | --- |
| Pipeline | id | New pipeline created with auto-generated ID |
| Transform | uid | New transform created with auto-generated UID |
| Sink | uid | New sink created with auto-generated UID |
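The create-versus-update convention can be sketched as one helper (the function name is illustrative): strip the uid when creating so the server assigns a fresh one, and require it when updating so the server targets the existing resource.

```python
def prepare_transform(transform, *, update=False):
    """Prepare a transform dict for a create or update request."""
    t = dict(transform)  # copy so the caller's dict is untouched
    if not update:
        t.pop("uid", None)  # omit uid -> server creates a new transform
    elif "uid" not in t:
        raise ValueError("update requires the existing transform's uid")
    return t
```

The same pattern applies to sinks (uid) and to the pipeline itself (id).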