Architecture
How It Works
Create a New Pipeline
Log in to the GoldRush Platform and navigate to Manage Pipelines. Click Create New Pipeline.
Select your Destination Type
Select a destination type - ClickHouse, Postgres, Kafka, Object Storage (S3/GCS/R2), AWS SQS, or Webhook - and provide your connection credentials. You can also test whether the Pipeline API can connect to your destination.
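The built-in connection test validates your full credentials. As a rough local sanity check before entering them, you can verify that the destination host is even reachable at the TCP level; this sketch only checks network reachability, and the host and port values are placeholders:

```python
import socket

def destination_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check a hypothetical ClickHouse host before configuring the pipeline.
# destination_reachable("clickhouse.internal.example", 9000)
```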
Select your Data Object Type
Choose a chain and data type. Pick from event logs, transactions, token transfers, or protocol-specific data streams like Hyperliquid fills or Solana DEX trades.
Select your Data Range
Configure the block range for your pipeline. Pipelines can run in bounded mode (with fixed start and end block heights) or in unbounded mode, delivering data continuously as new blocks arrive.
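The two modes can be sketched as follows; the field names are illustrative, not the Pipeline API's actual configuration schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BlockRange:
    start_block: int           # first block to process
    end_block: Optional[int]   # None = unbounded: keep streaming new blocks

    @property
    def bounded(self) -> bool:
        return self.end_block is not None

# A bounded backfill over a fixed window, and an unbounded live stream.
backfill = BlockRange(start_block=18_000_000, end_block=18_100_000)
live = BlockRange(start_block=18_000_000, end_block=None)
```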
Apply SQL transforms and ABI decoding
Optionally add SQL transforms to filter rows, select specific columns, or compute derived fields before data reaches your destination. Additionally, you can provide ABI definitions to decode raw event logs or call data into structured, typed columns.
Destination Types
ClickHouse
Stream data into ClickHouse for high-performance analytical queries over large volumes of blockchain data.
Postgres
Push decoded blockchain data into Postgres for application backends and transactional workloads.
Kafka
Publish raw data to Kafka topics for downstream consumers, stream processing, and event-driven architectures.
Object Storage
Write data to S3, GCS, or Cloudflare R2 as Parquet or JSON files for data lake and batch processing workflows.
AWS SQS
Deliver events to SQS queues for reliable, decoupled message processing with at-least-once delivery.
Webhook
Receive HTTP POST callbacks for each event, enabling lightweight integrations without managing infrastructure.
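A webhook destination only needs an HTTP endpoint that accepts POST requests. A minimal receiver sketch, assuming JSON payloads (the actual event schema depends on your pipeline's data type):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_event(body: bytes) -> dict:
    """Decode one webhook delivery; raises ValueError on malformed JSON."""
    try:
        return json.loads(body)
    except json.JSONDecodeError as exc:
        raise ValueError("malformed webhook payload") from exc

class PipelineWebhook(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            event = parse_event(self.rfile.read(length))
        except ValueError:
            self.send_response(400)
            self.end_headers()
            return
        # Hand the event to your own processing logic here.
        self.send_response(200)  # acknowledge receipt
        self.end_headers()

# To run: HTTPServer(("0.0.0.0", 8080), PipelineWebhook).serve_forever()
```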
Key Features
ABI Decoding
Automatically decode raw EVM event logs into structured, typed columns using standard Solidity ABI definitions.
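For example, an ERC-20 `Transfer(address,address,uint256)` log stores the sender and recipient in indexed topics and the amount in the data field. A hand-rolled sketch of the kind of output ABI decoding produces (the example topic and data values are fabricated placeholders):

```python
def decode_erc20_transfer(topics: list[str], data: str) -> dict:
    """Decode a raw Transfer log into typed columns.

    topics[0] is the event signature hash; topics[1] and topics[2] are
    the indexed from/to addresses, left-padded to 32 bytes; data holds
    the uint256 amount.
    """
    strip = lambda h: h[2:] if h.startswith("0x") else h
    return {
        "from": "0x" + strip(topics[1])[-40:],  # last 20 bytes
        "to": "0x" + strip(topics[2])[-40:],
        "value": int(strip(data), 16),          # uint256 amount
    }

log_topics = [
    # keccak256("Transfer(address,address,uint256)")
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
    "0x000000000000000000000000" + "aa" * 20,  # placeholder from-address
    "0x000000000000000000000000" + "bb" * 20,  # placeholder to-address
]
log_data = "0x" + hex(1_000_000)[2:].rjust(64, "0")
```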
SQL Transforms
Apply SQL expressions to filter rows, select specific columns, or compute derived fields before data reaches your destination. Reduce storage costs and processing overhead at the source.
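As an illustration, a transform on a token-transfers stream might keep only large transfers and add a derived column. The table and column names here are hypothetical, not the Pipeline API's actual schema; the Python function shows the equivalent row-level logic:

```python
# Hypothetical SQL transform (illustrative schema):
#   SELECT tx_hash, value, value / 1e6 AS value_usdc
#   FROM transfers
#   WHERE value >= 1000000

def transform(rows):
    """Filter small transfers and compute a derived, human-readable amount."""
    for row in rows:
        if row["value"] >= 1_000_000:                # WHERE clause
            yield {
                "tx_hash": row["tx_hash"],           # SELECT specific columns
                "value": row["value"],
                "value_usdc": row["value"] / 1e6,    # derived field
            }

rows = [
    {"tx_hash": "0xabc", "value": 5_000_000, "extra": "dropped"},
    {"tx_hash": "0xdef", "value": 10},
]
out = list(transform(rows))
```

Because filtering happens before delivery, rows that fail the predicate never reach your destination at all.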
Protocol-Native Schemas
Purpose-built normalizers for Hyperliquid and Solana DEX data that produce clean, protocol-aware tables rather than raw byte arrays.
Supported Chains
| Chain | Status |
|---|---|
| Base | Live |
| Hyperliquid | Live |
| Solana | Live |
| Polygon | Coming Soon |
| Arbitrum | Coming Soon |
| Ethereum Mainnet | Coming Soon |
For the full list of supported chains and available data entities, see the Supported Chains page.
Pipeline API vs Foundational and Streaming
Each GoldRush API serves a different access pattern. Choose based on how your application consumes data.

| | Foundational API | Streaming API | Pipeline API |
|---|---|---|---|
| Pattern | Pull (REST) | Push (WebSocket) | Push (to your infra) |
| Best for | On-demand queries, wallet lookups, historical data | Real-time UI updates, trading bots, live feeds | Data warehousing, analytics, ETL, backend indexing |
| Data delivery | Request/response | WebSocket subscription | Continuous destination delivery |
| Destination | Your application | Your application | Your database, warehouse, queue, or webhook |
| Transforms | None (query parameters only) | None (filter on subscribe) | SQL transforms, ABI decoding, normalizers |