Allium Datashares
Data delivery straight to your data lake for privacy and control.
Allium handles bulk data delivery and the full life-cycle management of delivered data, which involves hidden complexities that are commonly overlooked:
Bulk delivery - Allium can deliver hundreds of terabytes of data to any destination in days, so customers avoid writing their own scripts against a static API
Seamless schema updates & migrations - Allium uses the Apache Iceberg table format together with clearly defined SOPs for any schema changes to ensure backward compatibility
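Backward compatibility for additive schema changes means a consumer pinned to an older column set keeps working when new columns appear. A minimal sketch of that principle, with illustrative column names that are not Allium's actual schema:

```python
# Sketch of the backward-compatibility guarantee for additive schema changes.
# Column names here are illustrative, not Allium's actual schema.

OLD_SCHEMA = ["block_number", "tx_hash", "value"]

def read_row(row: dict, schema: list) -> dict:
    """Project a row onto the columns a consumer knows about,
    ignoring any columns added by later schema versions."""
    return {col: row.get(col) for col in schema}

# A row written after a new column ("gas_used") was added:
new_row = {"block_number": 19_000_000, "tx_hash": "0xabc", "value": 1, "gas_used": 21_000}

# A consumer pinned to the old schema still reads it cleanly:
print(read_row(new_row, OLD_SCHEMA))
# {'block_number': 19000000, 'tx_hash': '0xabc', 'value': 1}
```

Iceberg's column-ID-based schema evolution gives table readers this property natively; the SOPs cover changes (renames, drops, type changes) that need coordination.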
Benefits of Datashares
Seamlessly join your own data with Allium's blockchain data in your own secure environment
Privacy - Allium cannot see the queries customers run with this method
Manage your own compute workloads
Run your own complex analyses on top of the full one-petabyte dataset (e.g., constructing a graph database on top of it)
Allium supports a wide range of data connectors and data lake destinations to seamlessly integrate blockchain data with your own data lake.
Our data is available in nearly every major data query engine, across multiple regions - US East, Central, and West, as well as several European regions.
Supported Data Exports
Datashares
Query blockchain data directly via your data lake.
Data Dumps
Get data delivered to your data lake or pull data from Allium's data lake.
⛁ Amazon S3, Google Cloud Storage, etc.
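Bulk dumps to object storage typically land as partitioned files that you enumerate before loading. The key layout below (chain/table/date partition) is an assumption for illustration, not Allium's documented layout:

```python
# Sketch: data dumps land as partitioned object-store files. The key
# layout below is an assumed example, not Allium's documented layout.

keys = [
    "ethereum/transactions/date=2024-01-01/part-000.parquet",
    "ethereum/transactions/date=2024-01-02/part-000.parquet",
    "solana/transfers/date=2024-01-01/part-000.parquet",
]

def partitions_for(chain: str, table: str, keys: list) -> list:
    """Return the sorted date partitions present for one chain/table prefix."""
    prefix = f"{chain}/{table}/"
    return sorted(
        {k.split("/")[2].removeprefix("date=") for k in keys if k.startswith(prefix)}
    )

print(partitions_for("ethereum", "transactions", keys))
# ['2024-01-01', '2024-01-02']
```

With real buckets you would list keys via the S3 or GCS client and load only the partitions newer than your last sync.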
Supported Data Platforms and Regions
Snowflake
75+ chains* in GCP US Central 1
Can deliver worldwide with 3 hour freshness*
Solana is available.*
BigQuery
30 chains in GCP US Central 1
15 chains in EU West 2
Databricks
3 chains in US Central
3 chains in US East 1
Contact us if you would like us to support your data destination and region.