Prerequisites

Using any of the GoldRush developer tools requires an API key.

Vibe Coders

$10/mo - Built for solo builders and AI-native workflows.

Teams

$250/mo - Production-grade with 50 RPS and priority support.
You will also need a running Postgres instance that is accessible from the internet. Have your connection host, port, database name, username, and password ready.
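The connection details listed above can also be assembled into a single libpq-style connection URL, which is the form the pipeline builder accepts. A minimal sketch in Python (the host, database, and credentials below are placeholders, not real values):

```python
from urllib.parse import quote_plus

def build_postgres_url(host: str, port: int, database: str,
                       user: str, password: str) -> str:
    """Assemble a libpq-style connection URL from its parts.

    quote_plus escapes characters like '@' or '/' that would
    otherwise break URL parsing if they appear in the password.
    """
    return (
        f"postgresql://{quote_plus(user)}:{quote_plus(password)}"
        f"@{host}:{port}/{database}"
    )

# Placeholder values for illustration only:
url = build_postgres_url("db.example.com", 5432, "pipelines",
                         "goldrush", "s3cret/pass")
print(url)
# → postgresql://goldrush:s3cret%2Fpass@db.example.com:5432/pipelines
```

Escaping matters here: an unescaped `/` or `@` in a password produces a URL that parses to the wrong host or database.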

Create Your First Pipeline

This walkthrough creates a pipeline that streams decoded EVM log events from Base Mainnet into a Postgres database.
1
Log in to the GoldRush Platform and navigate to Manage Pipelines in the left sidebar. Click Create New Pipeline to open the pipeline builder.
2
Select PostgreSQL as the destination type.
3
Configure your PostgreSQL connection with your connection URL, username, and password.
4
Test your connection to verify that the Pipeline API can reach your PostgreSQL instance.
5
Under EVM Chains, pick Base as your chain and EVM Log Events as your object type.
6
Choose a source topic to define what blockchain data your pipeline will ingest. Select Base Mainnet as the chain and EVM Block Logs as the data type. This maps to the topic base.mainnet.ref.block.logs, which emits every log event from every block on Base Mainnet in real time.
7
Since we want to stream data continuously, pick Unbounded and set the starting block height to 30,000,000.
8
Pick an existing project (or create a new project) to group this pipeline under. For the ABI decoding, upload the ERC20 ABI.
9
Review your configuration and click “Create Pipeline” to deploy.
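The erc20.json uploaded in step 8 is a standard Solidity JSON ABI. A minimal sketch that generates one covering only the Transfer event, which is enough to decode standard token transfers (the full ERC-20 ABI also includes Approval and the read/write functions):

```python
import json

# Minimal ERC-20 ABI: just the Transfer event, per the ERC-20 standard.
ERC20_TRANSFER_ABI = [
    {
        "anonymous": False,
        "inputs": [
            {"indexed": True,  "name": "from",  "type": "address"},
            {"indexed": True,  "name": "to",    "type": "address"},
            {"indexed": False, "name": "value", "type": "uint256"},
        ],
        "name": "Transfer",
        "type": "event",
    }
]

# Write it out as the erc20.json file referenced in step 8.
with open("erc20.json", "w") as f:
    json.dump(ERC20_TRANSFER_ABI, f, indent=2)
```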

Under the Hood

The platform UI generates a YAML configuration for each pipeline. The pipeline you just created produces a config equivalent to this:
# Pipeline Configuration
project: "research"
topic: "base.mainnet.ref.block.logs"

# Destination
destination:
  type: "postgres"
  url: "postgresql://hosted.postgreshost.com"
  user: "goldrush"
  password: "••••••••"

# Execution
execution:
  mode: "unbounded"
  start_from: "30000000"

# ABI Decoding
abi:
  files:
    - "erc20.json"
  unmatched: "skip"

The topic field determines the chain and data type. Destination credentials are injected as environment variables at deploy time so they are never stored in plaintext.
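The dotted topic name can be read as chain, network, and data type. A small illustrative parser, assuming the `<chain>.<network>.ref.<data_type>` shape seen in this guide (this is my reading of the base.mainnet.ref.block.logs example, not an official schema):

```python
from typing import NamedTuple

class Topic(NamedTuple):
    chain: str      # e.g. "base"
    network: str    # e.g. "mainnet"
    data_type: str  # e.g. "block.logs"

def parse_topic(topic: str) -> Topic:
    """Split a topic like 'base.mainnet.ref.block.logs'.

    Assumes the '<chain>.<network>.ref.<data_type>' shape used in
    this guide; other topic families may differ.
    """
    chain, network, marker, *rest = topic.split(".")
    if marker != "ref":
        raise ValueError(f"unexpected topic shape: {topic!r}")
    return Topic(chain, network, ".".join(rest))

print(parse_topic("base.mainnet.ref.block.logs"))
# → Topic(chain='base', network='mainnet', data_type='block.logs')
```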

Next Steps

ABI Decoding

Automatically decode raw log data into human-readable event fields.

SQL Transforms

Filter and reshape data in-flight with SQL before it reaches your destination.

Destination Types

Explore all supported destination types beyond Postgres.

Guides

End-to-end tutorials for common pipeline patterns.