Allium’s Datastream APIs enable you to transform raw blockchain streams into actionable signals by applying custom filters and routing the results to your preferred destination.

Supported destinations: webhook endpoints, PubSub topics, and Kafka topics.

How It Works

Stream transformations follow a three-step process:
  1. Filter Data Sources - Define reusable lists of values (wallet addresses, contract addresses, token addresses)
  2. Filters - Apply logic to check if incoming messages match your criteria
  3. Workflows - Connect source streams, filters, and destinations together

Example Use Case

Scenario: Monitor transactions for a specific set of DeFi protocol contracts

Step 1: Create a Filter Data Source

Create a data source containing your list of contract addresses:
{
  "name": "defi_protocol_contracts",
  "type": "string_array",
  "values": ["0xabc...", "0xdef...", "0x123..."]
}
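
As a rough sketch, the request below registers this data source over HTTP with Python's requests library. The base URL, endpoint path, auth header, and response shape are assumptions for illustration, not confirmed API details; check the Datastreams API reference for the exact contract.

import requests

# Assumed base URL, path, and auth header; replace with the values from the
# Datastreams API reference.
API_BASE = "https://api.allium.so/api/v1/developer/datastreams"
HEADERS = {"X-API-KEY": "your-api-key"}

payload = {
    "name": "defi_protocol_contracts",
    "type": "string_array",
    "values": ["0xabc...", "0xdef...", "0x123..."],
}

resp = requests.post(f"{API_BASE}/filter-data-sources", headers=HEADERS, json=payload)
resp.raise_for_status()
data_source_id = resp.json()["id"]  # response shape assumed
print("Created filter data source:", data_source_id)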

Step 2: Create a Filter

Define filter logic to match transactions involving these contracts:
{
  "name": "defi_transactions_filter",
  "filter": {
    "field": "contract_address",
    "operator": "IN",
    "type": "data_source",
    "data_source_id": "your-data-source-id"
  }
}
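
To make the matching semantics concrete, here is a small local Python sketch of what this IN filter expresses: a message passes when its contract_address field appears in the referenced data source's values. The message shape is illustrative.

# Local illustration of the IN operator against a data-source-backed list.
defi_protocol_contracts = {"0xabc...", "0xdef...", "0x123..."}

def matches_filter(message: dict) -> bool:
    # True when the message's contract_address is in the data source values.
    return message.get("contract_address") in defi_protocol_contracts

# Illustrative transaction messages:
print(matches_filter({"hash": "0x01", "contract_address": "0xabc..."}))  # True
print(matches_filter({"hash": "0x02", "contract_address": "0x999..."}))  # False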

Step 3: Create a Workflow

Connect everything together by specifying your source stream, applying the filter, and routing matching messages to your destination:
{
  "name": "defi_monitoring_pipeline",
  "data_source_config": {
    "type": "PUBSUB",
    "topic": "ethereum.transactions"
  },
  "filter_id": "your-filter-id",
  "data_destination_config": {
    "type": "PUBSUB",
    "delivery_type": "PUSH",
    "webhook_url": "https://your-app.com/webhook"
  }
}
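
Because the destination above uses PUSH delivery to a webhook_url, matching messages arrive as HTTP POSTs to your endpoint. A minimal Flask receiver sketch is shown below; the payload shape is an assumption, so consult the Datastreams documentation for the exact delivery envelope.

from flask import Flask, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def handle_event():
    # Payload shape assumed: a JSON object describing a filtered transaction.
    event = request.get_json(force=True)
    print("Filtered transaction received:", event.get("hash"))
    return "", 204  # acknowledge receipt

if __name__ == "__main__":
    app.run(port=8080)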

API Endpoints

Manage your stream transformations with the following endpoints:
Component             Available Operations
Filter Data Sources   Create, Read, Update, Delete, List, Get Values
Filters               Create, Read, Update, Delete, List
Workflows             Create, Read, Update, Delete, List
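
As a hedged example of the List operations above, the loop below fetches each component collection; the endpoint paths, auth header, and list response shape are assumptions for illustration only.

import requests

# Assumed base URL, paths, and auth header; see the API reference for the
# actual endpoints behind the table above.
API_BASE = "https://api.allium.so/api/v1/developer/datastreams"
HEADERS = {"X-API-KEY": "your-api-key"}

for resource in ("filter-data-sources", "filters", "workflows"):
    resp = requests.get(f"{API_BASE}/{resource}", headers=HEADERS)
    resp.raise_for_status()
    items = resp.json()  # list response shape assumed
    print(f"{resource}: {len(items)} item(s)")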
For more details on stream transformations and use cases, visit the Datastreams documentation.