What You’ll Build
Real-time DEX pair discovery feed (Monad)
Early-life liquidity tracking (≥ $5,000 threshold)
Latency instrumentation (p50 / p95 / max)
Deterministic deduplication (pair + event keys)
Production-style JSON logs you can ship to a collector
A typical “sniper moment” during a DeFi token launch isn’t a single moment; it’s a brief sequence of rapid on-chain events that blur together. Only a few key on-chain facts are involved:
A pool/pair is created.
Liquidity arrives (sometimes in multiple additions, sometimes briefly, sometimes as bait).
First swaps hit, spreads go wide, and competition spikes.
Price discovers violently in the first few dozen trades.
Within minutes, you learn whether the pair is real tradable liquidity or just chain noise wearing a ticker.
Catching fresh opportunities manually doesn’t usually fail because you “don’t know what to do.” It fails because you can’t watch everything at once, and you don’t get enough context fast enough to avoid clicking into every noisy pair creation.
If you’re building a sniper bot as a product, or even as internal tooling, the first complicated problem usually isn’t execution. It’s observability: seeing newly tradable pairs quickly, confirming they’re actually tradable (not just “created”), and measuring how your system behaves under burst conditions—your latency, your miss rate, and whether the pipeline stays stable when activity spikes.
This series walks through building a Monad sniper bot end-to-end: real-time observability first, then validation and filtering, and finally execution and operational guardrails so the system can act safely under burst conditions. Across the four parts:
Part 1 builds the real-time radar: detect new pairs, track their first minutes, and baseline your reliability with measurements (lag, duplicates, reconnects, and “pairs seen vs pairs that become liquid”).
Part 2 turns that raw feed into a validation layer: liquidity reality checks, early swap confirmation, token risk constraints, and “should we even consider this?” gating logic.
Part 3 wires execution: deterministic entry rules, slippage and failure handling, and safe transaction submission patterns.
Part 4 operationalizes the bot: monitoring, alerting, backfills, runbooks, and the boring guardrails that keep production systems alive.
In this part (part 1), we’ll ship a small, real-time pair radar service for new tokens on Monad. It watches DEX pairs as they appear, logs them in a consistent schema, and produces the baseline metrics you’ll use to decide whether sniping is feasible for your setup.
Earlier, we published a guide on building an HFT-style trading bot on Sonic using real-time streams. That guide is about continuous market updates: pulling live price/pair changes and reacting to micro-changes quickly.
This Monad sniper series overlaps in tooling (stream subscriptions, reconnect handling, state hygiene, a deterministic “decision plane”), but the market structure differs.
What’s similar
You still need low-latency data and a clean state.
You still want a strict split between data plane (what happened) and decision plane (what you do about it).
You still need guardrails: rate limits, timeouts, “do nothing” defaults, and safe shutdowns.
What’s different
HFT is driven by continuous deltas (prices/spreads/flow proxies).
Sniping is driven by discrete lifecycle events (pair created → liquidity appears → first swaps).
Sniping needs heavier validation before you ever sign a swap: liquidity reality checks, token risk checks, and “is this even tradable?” logic.
So yes, you can reuse architecture patterns from the Sonic HFT bot. But don’t reuse the mental model of “always trade when signal fires.” Sniping is closer to incident response: detect → verify → act.

People use “sniper bot” to describe a lot of things, including shady behavior. That’s not what we’re building.
In this series, a sniper bot is:
An event-driven system that detects newly tradable pairs, verifies tradability and risk constraints, then optionally executes a predefined entry strategy.
In this part of the series, we do not execute trades. We build the radar and the measurement baseline first. That’s deliberate: if your radar is noisy or slow, execution just burns time and money faster.
Under the hood, most sniper bots reduce to three modules (sketched as interfaces just after this list):
Detection: Identify new pairs/pools and liquidity events.
Validation: Check whether the launch has meaningful liquidity and real activity (not just a created pool with nothing behind it).
Execution: Construct and submit a buy/sell transaction quickly, with tight slippage and risk rules.
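To make those boundaries concrete, here’s a minimal sketch of the three modules as TypeScript interfaces. Every name in it (PairEvent, Detector, and so on) is illustrative naming for this series, not an SDK type:

// Illustrative module boundaries: all names here are hypothetical, not SDK types.
type PairEvent = {
  pairAddress: string;
  token0: string;
  token1: string;
  observedAtMs: number;
};

interface Detector {
  // Emits pair/liquidity lifecycle events as they are observed on-chain.
  onPairEvent(handler: (evt: PairEvent) => void): void;
}

interface Validator {
  // Answers: is this launch a real, tradable market right now?
  isTradable(evt: PairEvent): Promise<boolean>;
}

interface Executor {
  // Builds and submits an entry transaction under strict slippage/risk rules.
  enter(evt: PairEvent): Promise<void>;
}

Part 1 builds the Detector (plus its instrumentation); Parts 2 and 3 fill in the other two.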
Core Modules of a Sniper Bot: Detection, Validation, Execution, and Instrumentation
One reason sniper guides get confusing is that people blur together three separate things:
Token creation (mint + metadata)
Market creation (pair/pool creation on a DEX)
Tradability (liquidity that persists + first real swaps)
A sniper bot cares about (2) and (3). Token creation alone is not enough. Plenty of tokens exist with no meaningful market, or with markets that are traps.
Here’s the lifecycle we’re instrumenting.
When a DEX factory emits a new pool address, it’s only a creation event, not proof of a live market. At that point, liquidity may be zero, the pair may never be traded, and similar pairs can be created repeatedly through spam, tests, or bait deployments.
The reliable signals are the pair address, the base/quote token addresses, the DEX identity, and the on-chain creation timestamp. GoldRush surfaces these events in real time through its newPairs stream subscription.
This is where most “sniper” logic should live, because this is where the difference between a tradable market and junk becomes measurable.
Liquidity “being added” is not one thing. You’ll see patterns like:
Drip liquidity: multiple small adds over ~30–120 seconds.
Flash liquidity: a big add, then partial remove, then add again.
Bait liquidity: enough to trigger bots, not enough to exit safely.
Skewed liquidity: base/quote reserves wildly imbalanced.
A real radar should treat liquidity as a state that evolves (“Is it trending real?”), not a boolean.
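To make “liquidity as an evolving state” concrete, here’s a hedged sketch of a classifier over a pair’s recent liquidity samples. The labels match the patterns above, but the thresholds are illustrative assumptions, not values from any API:

// Hypothetical classifier: labels a pair's liquidity trajectory from recent
// USD samples. All thresholds are illustrative, not canonical.
type LiquiditySample = { atMs: number; usd: number };
type LiquidityShape = "drip" | "flash" | "bait" | "unknown";

function classifyLiquidity(
  samples: LiquiditySample[],
  minExitUsd = 5_000 // roughly: "could we exit without destroying the price?"
): LiquidityShape {
  if (samples.length < 3) return "unknown"; // not enough history yet
  const usd = samples.map((s) => s.usd);
  const peak = Math.max(...usd);
  const latest = usd[usd.length - 1];
  // Flash: a big add followed by a large retrace.
  if (peak > 0 && latest < peak * 0.5) return "flash";
  // Bait: liquidity is present, but never deep enough to exit safely.
  if (latest < minExitUsd) return "bait";
  // Drip: mostly monotonic small adds (allowing a little wobble).
  const rising = usd.every((v, i) => i === 0 || v >= usd[i - 1] * 0.95);
  return rising ? "drip" : "unknown";
}

The point isn’t these exact rules; it’s that the radar should re-evaluate the label as new samples arrive.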
The first swaps are a particularly delicate moment: sharp price fluctuations are common, spreads are at their widest, competition is at its peak, and any delay or error in your data plane turns into poor fills at the execution layer.
This is why Part 1 emphasizes instrumentation: you want to measure lag and reliability from the outset rather than uncovering issues live, when the stakes are highest.
When “launch chaos” hits on-chain, a lot of what you’re seeing is MEV in plain sight. MEV (maximal extractable value) is a broad bucket, but launches are among the few moments it’s retail-visible. Early trading becomes a contest over ordering and latency. In practice, the MEV-shaped behaviors you’ll usually see include:
Fast buyers race into the earliest blocks the moment trading becomes possible
Copy-trading wallets follow the same pair/liquidity signals within seconds
Sandwiching and liquidity games once volume spikes and routes become predictable
That’s also why data quality matters here. If your “new pair” signal arrives late or lacks sufficient context, you’re not responding to the launch. You’re responding to everyone who already saw it.

Monad is an EVM-compatible Layer 1 built for high-throughput, low-latency execution. The core idea is to keep the Ethereum programming model familiar while redesigning the execution and networking paths so the chain can process much more activity without feeling “slow” at the edges. Monad’s own docs describe targets of 10,000 TPS, 400ms block times, and ~800ms time-to-finality (on the order of 2 blocks).

Sub-second blocks: React to launches faster
Local mempool: Different MEV dynamics
Parallel execution: Handle burst activity
The Monad chain has a few traits that matter for real-time bots:
Tighter time budgets: with sub-second blocks, a “small” delay can mean you’re multiple blocks behind (see the quick arithmetic after this list).
Different propagation assumptions: Monad describes a local mempool model rather than Ethereum’s single global mempool, so finalized events and verified state are the safer foundation for signals.
Parallel-friendly execution: the system is designed to handle a lot of concurrent work, so your data pipeline needs to stay stable under bursts (no drops, no confusing ordering).
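To see how quickly the time budget evaporates, here’s the arithmetic, assuming the ~400ms block-time target from Monad’s docs:

// With ~400ms blocks, event lag converts directly into blocks of disadvantage.
const BLOCK_TIME_MS = 400; // Monad's documented target

function blocksBehind(eventLagMs: number): number {
  return Math.floor(eventLagMs / BLOCK_TIME_MS);
}

blocksBehind(300); // => 0 (still inside the current block)
blocksBehind(1_200); // => 3 (a "small" 1.2s delay is already ~3 blocks late)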
Monad is also young: its mainnet launched on November 24, 2025. That early phase changes the opportunity surface for token-launch sniper bots:
Liquidity is still forming, so many “new pairs” are not yet tradable.
Noise is higher: spam pairs, test deployments, bait liquidity, copy-paste tokens.
Attention cycles are intense, which makes latency and validation matter more, not less.
For this series, GoldRush is the data rail. We rely on it for the two most time-sensitive streams a sniper system needs:
New DEX Pairs Stream (newPairs): real-time updates when new liquidity pairs are created on DEXes.
Wallet Activity Stream (walletTxs): real-time updates on transactions and interactions, with decoded types like swaps, transfers, approvals, deposits/withdrawals.
GoldRush supports MONAD_MAINNET as a chain option for these streams.
By the end of this tutorial, you’ll have a small TypeScript service that:
Listens for newly created DEX pairs (a “pair discovery feed”).
Tracks each pair for a short early-life window to see whether liquidity becomes real and whether swaps begin.
Logs clean, structured events (so you can analyze them later without re-parsing console noise).
Measures baseline readiness metrics you’ll need before execution:
event-to-log lag (approximate)
duplicates (how noisy discovery is)
reconnects (how stable your stream is)
“pairs seen vs pairs that become liquid” ratio
This is the “instrumentation first” approach: you don’t want to discover that your pipeline is late or unstable when the first real opportunities arise.
You can follow this on Windows/macOS/Linux.
You’ll need:
Node.js 20+
npm (bundled with Node)
A code editor (VS Code is fine)
A GoldRush API key
One important note: GoldRush Streams are exposed as GraphQL subscriptions (for example, a documented stream for new DEX pairs is commonly referred to as newPairs). Stream availability can vary by chain and account tier, so in Step 5 we’ll include a quick “sanity check” to verify the SDK includes the chain enum and subscription method you need before you write too much code.
Step 1: Create the project

mkdir monad-sniper-part1-pair-radar
cd monad-sniper-part1-pair-radar
npm init -y

Step 2: Install dependencies

npm install @covalenthq/client-sdk ws dotenv zod
npm install --save-dev typescript tsx @types/node

@covalenthq/client-sdk: GoldRush client (includes StreamingService helpers).
ws: WebSocket implementation for Node (the SDK needs a WebSocket runtime).
dotenv: loads .env config.
zod: validates env vars so your app fails early and clearly.
tsx: lets you run TypeScript directly without a build step while iterating.
typescript / @types/node: standard TS setup.

Step 3: Initialize TypeScript

npx tsc --init

Open package.json and add scripts:

{
"scripts": {
"dev": "tsx src/index.ts",
"typecheck": "tsc --noEmit"
}
}
npm run dev runs your service in one command.
npm run typecheck validates types without compiling output.

Step 4: Create a .env file

COVALENT_API_KEY=YOUR_GOLDRUSH_KEY_HERE
# Chain selector (you will verify the exact enum/string in Step 5)
STREAM_CHAIN=MONAD_MAINNET
# “Early life” tracking window for each discovered pair (seconds)
EARLY_LIFE_SECONDS=180
# Treat a pair as “meaningfully liquid” once it crosses this USD value
MIN_LIQUIDITY_USD=5000
# Optional: don’t mark “active” unless we see at least this many swaps
MIN_SWAP_EVENTS=1
# Print a metrics summary every N pairs
PRINT_METRICS_EVERY=25

COVALENT_API_KEY: lets the SDK authenticate.
EARLY_LIFE_SECONDS: how long you keep watching each pair after discovery.
MIN_LIQUIDITY_USD: your first “is this real?” threshold.
MIN_SWAP_EVENTS: helps separate “liquidity added but dead market” from “trading started”.
PRINT_METRICS_EVERY: keeps logs readable by emitting periodic summaries.

Step 5: Sanity-check the SDK

npm ls @covalenthq/client-sdk

Then search node_modules for the stream subscription methods:

# macOS/Linux
grep -R "subscribeTo" node_modules/@covalenthq/client-sdk -n | head -n 40

# Windows PowerShell
Select-String -Path "node_modules\@covalenthq\client-sdk\**\*" -Pattern "subscribeTo" | Select-Object -First 40

You’re looking for:
A new-pairs subscription method (e.g., subscribeToNewPairs).
A pair-updates subscription method (e.g., subscribeToUpdatePairs). “Update pairs” is a documented stream concept.
The chain identifier format it expects (e.g., StreamingChain.MONAD_MAINNET, or a string you pass through depending on SDK version).

Step 6: Create the folder structure

mkdir -p src
mkdir -p src/streams
mkdir -p src/core

src/core/* — config, metrics, state tracking (logic)
src/streams/* — stream wiring (plumbing)
src/index.ts — main entry point

Step 7: Load and validate config

This module does two things. First, it loads your .env file, which is essential for managing environment variables. Secondly, it validates the types of these variables, meaning that if there’s an error, such as setting EARLY_LIFE_SECONDS to a non-numeric value like "abc", the application will fail immediately with a clear and readable error message.

Create src/core/config.ts:

// src/core/config.ts
import "dotenv/config";
import { z } from "zod";
const EnvSchema = z.object({
COVALENT_API_KEY: z.string().min(1, "COVALENT_API_KEY is required"),
STREAM_CHAIN: z.string().min(1, "STREAM_CHAIN is required"),
EARLY_LIFE_SECONDS: z.coerce.number().int().positive().default(180),
MIN_LIQUIDITY_USD: z.coerce.number().nonnegative().default(5000),
MIN_SWAP_EVENTS: z.coerce.number().int().nonnegative().default(1),
PRINT_METRICS_EVERY: z.coerce.number().int().positive().default(25),
});
const env = EnvSchema.parse(process.env);
export const CONFIG = {
apiKey: env.COVALENT_API_KEY,
streamChain: env.STREAM_CHAIN,
earlyLifeSeconds: env.EARLY_LIFE_SECONDS,
minLiquidityUsd: env.MIN_LIQUIDITY_USD,
minSwapEvents: env.MIN_SWAP_EVENTS,
printMetricsEvery: env.PRINT_METRICS_EVERY,
};
Step 8: Define the log schema

Create src/core/logSchema.ts:

// src/core/logSchema.ts
export type PairDiscoveredLog = {
kind: "pair_discovered";
observed_at: string; // local timestamp
chain: string;
pair_address: string;
dex_name?: string;
token0?: string;
token1?: string;
created_at_chain?: string; // chain timestamp if provided by stream payload
event_lag_ms?: number; // observed_at - created_at_chain
is_duplicate: boolean;
};
export type PairEarlyLifeLog = {
kind: "pair_early_life";
observed_at: string;
chain: string;
pair_address: string;
age_s: number;
liquidity_usd?: number; // if the stream includes a USD quote
swap_count_seen: number;
flags: {
crossed_liquidity_threshold: boolean;
became_active: boolean; // crossed threshold + swaps (basic “real market” signal)
};
};
export type MetricsSummaryLog = {
kind: "metrics_summary";
observed_at: string;
chain: string;
totals: {
pairs_seen: number;
duplicates: number;
reconnects: number;
pairs_tracked: number;
pairs_became_liquid: number;
pairs_became_active: number;
};
lag_ms: {
samples: number;
p50?: number;
p95?: number;
max?: number;
};
};
export type RadarLog = PairDiscoveredLog | PairEarlyLifeLog | MetricsSummaryLog;

Step 9: Structured logging

Create src/core/logger.ts:

// src/core/logger.ts
import { RadarLog } from "./logSchema";
export function logEvent(evt: RadarLog) {
// JSONL-friendly logs: 1 event per line, easy to pipe into files.
console.log(JSON.stringify(evt));
}
export function nowIso(): string {
return new Date().toISOString();
}
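A quick sanity check of the logger, with made-up values that match the PairDiscoveredLog shape (the address is a placeholder):

// Example only: drop into src/core to test; the address is a placeholder.
import { logEvent, nowIso } from "./logger";

logEvent({
  kind: "pair_discovered",
  observed_at: nowIso(),
  chain: "MONAD_MAINNET",
  pair_address: "0x0000000000000000000000000000000000000000",
  is_duplicate: false,
});
// Prints one JSON line, e.g.:
// {"kind":"pair_discovered","observed_at":"2025-…","chain":"MONAD_MAINNET",…}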
Because each event is a single JSON line, you can capture a dataset with npm run dev > radar.jsonl.

Step 10: Metrics

The Metrics class exposes recordLag(ms) to store lag samples, and summary() to return counters + quick lag stats (p50/p95/max).

Create src/core/metrics.ts:

// src/core/metrics.ts
function quantile(sorted: number[], q: number): number | undefined {
if (!sorted.length) return undefined;
const pos = (sorted.length - 1) * q;
const base = Math.floor(pos);
const rest = pos - base;
const baseVal = sorted[base];
const nextVal = sorted[base + 1] ?? baseVal;
return baseVal + rest * (nextVal - baseVal);
}
export class Metrics {
pairsSeen = 0;
duplicates = 0;
reconnects = 0;
pairsTracked = 0; // number we actually followed in early life
pairsBecameLiquid = 0; // crossed liquidity threshold at least once
pairsBecameActive = 0; // crossed threshold AND saw swaps (basic “real”)
// Keep a bounded buffer so memory doesn’t grow forever
private lagSamples: number[] = [];
private lagCap = 5000;
recordLag(ms: number) {
if (!Number.isFinite(ms) || ms < 0) return;
this.lagSamples.push(ms);
if (this.lagSamples.length > this.lagCap) {
this.lagSamples.shift();
}
}
summary() {
const sorted = [...this.lagSamples].sort((a, b) => a - b);
return {
totals: {
pairs_seen: this.pairsSeen,
duplicates: this.duplicates,
reconnects: this.reconnects,
pairs_tracked: this.pairsTracked,
pairs_became_liquid: this.pairsBecameLiquid,
pairs_became_active: this.pairsBecameActive,
},
lag_ms: {
samples: sorted.length,
p50: quantile(sorted, 0.5),
p95: quantile(sorted, 0.95),
max: sorted.at(-1),
},
};
}
}
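You can exercise Metrics in isolation to confirm the quantile math; the samples below are arbitrary:

import { Metrics } from "./metrics";

const m = new Metrics();
m.recordLag(120);
m.recordLag(450);
m.recordLag(900);

console.log(m.summary().lag_ms);
// => { samples: 3, p50: 450, p95: 855, max: 900 }
// p95 interpolates between the two highest samples: 450 + 0.9 * (900 - 450).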
Step 11: Pair state store

Create src/core/pairStore.ts:

// src/core/pairStore.ts
import { CONFIG } from "./config";
export type PairState = {
pairAddress: string;
firstObservedAtMs: number;
createdAtChainMs?: number;
sawLiquidityCross: boolean;
swapCountSeen: number;
lastLiquidityUsd?: number;
trackingEndsAtMs: number;
};
export class PairStore {
private pairs = new Map<string, PairState>();
normalize(addr: string) {
return addr.trim().toLowerCase();
}
has(addr: string) {
return this.pairs.has(this.normalize(addr));
}
upsertDiscovery(input: {
pairAddress: string;
createdAtChainMs?: number;
}) {
const pair = this.normalize(input.pairAddress);
const existing = this.pairs.get(pair);
if (existing) return existing;
const now = Date.now();
const st: PairState = {
pairAddress: pair,
firstObservedAtMs: now,
createdAtChainMs: input.createdAtChainMs,
sawLiquidityCross: false,
swapCountSeen: 0,
lastLiquidityUsd: undefined,
trackingEndsAtMs: now + CONFIG.earlyLifeSeconds * 1000,
};
this.pairs.set(pair, st);
return st;
}
get(addr: string) {
return this.pairs.get(this.normalize(addr));
}
markLiquidity(st: PairState, liquidityUsd?: number) {
st.lastLiquidityUsd = liquidityUsd;
if (
typeof liquidityUsd === "number" &&
liquidityUsd >= CONFIG.minLiquidityUsd
) {
st.sawLiquidityCross = true;
}
}
markSwap(st: PairState, n: number = 1) {
st.swapCountSeen += n;
}
shouldStillTrack(st: PairState) {
return Date.now() <= st.trackingEndsAtMs;
}
}
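A quick illustration of the store’s dedup and threshold behavior. The address is a placeholder, and note that pairStore reads its thresholds from CONFIG, so a valid .env must be present:

import { PairStore } from "./pairStore";

const store = new PairStore();
const addr = "0xAbC0000000000000000000000000000000000000"; // placeholder

const st = store.upsertDiscovery({ pairAddress: addr });
store.has("0xabc0000000000000000000000000000000000000"); // true: lookup is normalized

store.markLiquidity(st, 7_500); // crosses the default $5,000 threshold
console.log(st.sawLiquidityCross); // true
store.markSwap(st); // swapCountSeen -> 1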
Duplicates are detected via a normalized-address lookup (has(pair)), and each pair is tracked until its trackingEndsAtMs deadline.

Step 12: Stream client factory

Create src/streams/client.ts:

// src/streams/client.ts
import WebSocket from "ws";
import { GoldRushClient } from "@covalenthq/client-sdk";
import { CONFIG } from "../core/config";
// GoldRush SDK expects a WebSocket implementation in Node.
(global as any).WebSocket = WebSocket;
export function makeClient(on: {
connecting?: () => void;
opened?: () => void;
closed?: () => void;
error?: (err: unknown) => void;
}) {
const client = new GoldRushClient(
CONFIG.apiKey,
{},
{
onConnecting: on.connecting,
onOpened: on.opened,
onClosed: on.closed,
onError: on.error,
}
);
return client;
}
Step 13: Subscribe to new pairs

Create src/streams/newPairs.ts:
// src/streams/newPairs.ts
import { makeClient } from "./client";
import { CONFIG } from "../core/config";
import { Metrics } from "../core/metrics";
import { PairStore } from "../core/pairStore";
import { logEvent, nowIso } from "../core/logger";
function parseChainTimeMs(v: any): number | undefined {
// Streams often include a chain timestamp field (name varies by stream/payload).
// We handle common variants defensively.
const candidates = [
v?.block_signed_at,
v?.created_at,
v?.timestamp,
v?.block_timestamp,
].filter(Boolean);
for (const c of candidates) {
const t = Date.parse(String(c));
if (Number.isFinite(t)) return t;
}
return undefined;
}
export async function startNewPairsStream(opts: {
metrics: Metrics;
store: PairStore;
onPairDiscovered: (pairAddress: string) => void;
}) {
const client = makeClient({
connecting: () => console.log("[stream] connecting…"),
opened: () => console.log("[stream] connected"),
closed: () => console.log("[stream] closed"),
error: (err) => console.error("[stream] error:", err),
});
// IMPORTANT:
// - Method name may differ slightly by SDK version.
// - If your SDK uses a different method, replace it here (Step 5 showed how to find it).
const unsubscribe = (client as any).StreamingService.subscribeToNewPairs(
{
chain_name: CONFIG.streamChain, // kept as string for flexibility across SDK versions
},
{
next: (payload: any) => {
// Payload shapes can vary. We only need a few fields.
const pairAddress =
payload?.pair_address ||
payload?.pairAddress ||
payload?.data?.pair_address ||
payload?.data?.pairAddress;
if (!pairAddress) return;
opts.metrics.pairsSeen++;
const normalized = pairAddress.trim().toLowerCase();
const isDup = opts.store.has(normalized);
if (isDup) opts.metrics.duplicates++;
const createdAtMs = parseChainTimeMs(payload);
const st = opts.store.upsertDiscovery({
pairAddress: normalized,
createdAtChainMs: createdAtMs,
});
// Approx event lag: how “late” we are relative to chain timestamp, if present.
let lagMs: number | undefined = undefined;
if (createdAtMs) {
lagMs = Date.now() - createdAtMs;
opts.metrics.recordLag(lagMs);
}
logEvent({
kind: "pair_discovered",
observed_at: nowIso(),
chain: CONFIG.streamChain,
pair_address: normalized,
dex_name: payload?.dex_name || payload?.dex,
token0: payload?.token0_address || payload?.token0,
token1: payload?.token1_address || payload?.token1,
created_at_chain: createdAtMs ? new Date(createdAtMs).toISOString() : undefined,
event_lag_ms: lagMs,
is_duplicate: isDup,
});
// Only trigger early-life tracking the first time we see the pair.
if (!isDup && st) {
opts.onPairDiscovered(normalized);
}
},
error: (err: any) => {
opts.metrics.reconnects++; // treat stream errors as “stability hits”
console.error("[newPairs] subscription error:", err?.message || err);
},
complete: () => {
console.log("[newPairs] complete");
},
}
);
return {
stop: async () => {
if (unsubscribe) unsubscribe();
await (client as any).StreamingService.disconnect?.();
},
};
}
A note on the (client as any) cast: streaming method names and typings can differ across SDK versions, so we keep the call loosely typed and centralize the method name where it’s easy to swap.

Step 14: Track each pair’s early life (updatePairs)

This stream typically takes explicit pair addresses (which is why Part 1 is structured as discovery → then updates).

Create src/streams/updatePairs.ts:

// src/streams/updatePairs.ts
import { makeClient } from "./client";
import { CONFIG } from "../core/config";
import { Metrics } from "../core/metrics";
import { PairStore, PairState } from "../core/pairStore";
import { logEvent, nowIso } from "../core/logger";
function extractLiquidityUsd(payload: any): number | undefined {
// Different DEX streams expose liquidity differently.
// We try common fields; if none exist, this stays undefined.
const candidates = [
payload?.liquidity_usd,
payload?.liquidityUSD,
payload?.quote_usd,
payload?.tvl_usd,
payload?.reserve_quote_usd,
];
for (const c of candidates) {
const n = Number(c);
if (Number.isFinite(n)) return n;
}
return undefined;
}
function isSwapLike(payload: any): boolean {
// Some payloads include decoded_type or event category fields.
const t = String(payload?.decoded_type || payload?.event_type || "").toUpperCase();
if (!t) return false;
return t.includes("SWAP");
}
export async function trackPairEarlyLife(args: {
pairAddress: string;
metrics: Metrics;
store: PairStore;
}) {
const st = args.store.get(args.pairAddress);
if (!st) return;
args.metrics.pairsTracked++;
const client = makeClient({
error: (err) => console.error("[updatePairs] error:", err),
});
const unsubscribe = (client as any).StreamingService.subscribeToUpdatePairs(
{
chain_name: CONFIG.streamChain,
pair_addresses: [args.pairAddress],
},
{
next: (payload: any) => {
const state = args.store.get(args.pairAddress);
if (!state) return;
const liqUsd = extractLiquidityUsd(payload);
if (liqUsd !== undefined) {
const before = state.sawLiquidityCross;
args.store.markLiquidity(state, liqUsd);
if (!before && state.sawLiquidityCross) {
args.metrics.pairsBecameLiquid++;
}
}
if (isSwapLike(payload)) {
const beforeSwaps = state.swapCountSeen;
args.store.markSwap(state, 1);
// We only count “became active” once (threshold crossed + swaps seen)
if (
beforeSwaps < CONFIG.minSwapEvents &&
state.swapCountSeen >= CONFIG.minSwapEvents &&
state.sawLiquidityCross
) {
args.metrics.pairsBecameActive++;
}
}
const ageS = (Date.now() - state.firstObservedAtMs) / 1000;
logEvent({
kind: "pair_early_life",
observed_at: nowIso(),
chain: CONFIG.streamChain,
pair_address: args.pairAddress,
age_s: Math.max(0, Math.round(ageS)),
liquidity_usd: liqUsd,
swap_count_seen: state.swapCountSeen,
flags: {
crossed_liquidity_threshold: state.sawLiquidityCross,
became_active: state.sawLiquidityCross && state.swapCountSeen >= CONFIG.minSwapEvents,
},
});
// Stop tracking once window expires.
if (!args.store.shouldStillTrack(state)) {
unsubscribe?.();
}
},
error: (err: any) => {
args.metrics.reconnects++;
console.error("[updatePairs] subscription error:", err?.message || err);
},
complete: () => {
// nothing special
},
}
);
// Hard stop fallback in case we never get more events.
setTimeout(() => unsubscribe?.(), CONFIG.earlyLifeSeconds * 1000 + 2_000);
return {
stop: async () => {
unsubscribe?.();
await (client as any).StreamingService.disconnect?.();
},
};
}
Together, discovery plus per-pair tracking gives each new pair a short observation window for liquidity and activity confirmation.

Step 15: Wire everything together

Create src/index.ts:

// src/index.ts
import { CONFIG } from "./core/config";
import { Metrics } from "./core/metrics";
import { PairStore } from "./core/pairStore";
import { logEvent, nowIso } from "./core/logger";
import { startNewPairsStream } from "./streams/newPairs";
import { trackPairEarlyLife } from "./streams/updatePairs";
async function main() {
console.log("Monad Sniper Bot (Part 1) — Pair Radar");
console.log(`Chain: ${CONFIG.streamChain}`);
console.log(
`Early-life window: ${CONFIG.earlyLifeSeconds}s | Min liquidity: $${CONFIG.minLiquidityUsd} | Min swaps: ${CONFIG.minSwapEvents}\n`
);
const metrics = new Metrics();
const store = new PairStore();
const newPairs = await startNewPairsStream({
metrics,
store,
onPairDiscovered: async (pairAddress) => {
// Fire-and-forget early-life tracking.
// In production you’d likely batch these to avoid too many concurrent subs.
trackPairEarlyLife({ pairAddress, metrics, store }).catch((e) =>
console.error("[trackPairEarlyLife] failed:", e)
);
// Periodic metrics summary
if (metrics.pairsSeen % CONFIG.printMetricsEvery === 0) {
const s = metrics.summary();
logEvent({
kind: "metrics_summary",
observed_at: nowIso(),
chain: CONFIG.streamChain,
totals: s.totals,
lag_ms: s.lag_ms,
});
}
},
});
process.on("SIGINT", async () => {
console.log("\n[main] shutting down…");
await newPairs.stop();
process.exit(0);
});
}
main().catch((err) => {
console.error("[main] fatal:", err);
process.exit(1);
});
Step 16: Run it

npm run dev

You should see JSON lines as pairs and updates arrive. To capture them for later analysis:

npm run dev > radar.jsonl

The radar.jsonl file is your “ground truth feed” for Parts 2–4.
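Once you’ve captured a radar.jsonl, a few lines of TypeScript turn it into the funnel numbers Part 2 builds on. This sketch relies only on the log schema from Step 8:

// analyze.ts: run with `npx tsx analyze.ts radar.jsonl`
import { readFileSync } from "node:fs";

const events = readFileSync(process.argv[2] ?? "radar.jsonl", "utf8")
  .split("\n")
  .filter(Boolean)
  .map((line) => JSON.parse(line));

// Unique discoveries (ignore duplicate sightings).
const discovered = events.filter(
  (e) => e.kind === "pair_discovered" && !e.is_duplicate
);

// Pairs that crossed the liquidity threshold at least once.
const becameLiquid = new Set(
  events
    .filter((e) => e.kind === "pair_early_life" && e.flags?.crossed_liquidity_threshold)
    .map((e) => e.pair_address)
);

console.log({
  pairs_discovered: discovered.length,
  pairs_became_liquid: becameLiquid.size,
  liquid_ratio: discovered.length ? becameLiquid.size / discovered.length : 0,
});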
Troubleshooting

1) subscribeToNewPairs is not a function
Your SDK version uses a different method name.
Go back to Step 5 and search for the correct subscription function in node_modules, then replace it in src/streams/newPairs.ts.
2) No events at all
Your chain enum/string might be wrong (STREAM_CHAIN).
Or your account/plan may not have that stream enabled for Monad yet.
Confirm the exact chain identifier expected by your SDK version.
3) Too many update subscriptions
If discovery volume spikes, you may open too many concurrent “early-life” trackers.
The next refinement is batching: track only the top N, or queue tracking jobs (a minimal sketch follows below).
For Part 1, we keep it simple so the logic stays transparent.
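If you hit that wall, here’s a minimal sketch of the queueing refinement: a small concurrency gate in front of trackPairEarlyLife. It assumes nothing about the SDK; note that for a true bound, the gated task should resolve when the early-life window ends, not when the subscription is merely set up:

// Minimal concurrency gate: run at most `limit` tasks at once; queue the rest.
function makeLimiter(limit: number) {
  let active = 0;
  const waiters: Array<() => void> = [];

  return async function run(task: () => Promise<unknown>): Promise<void> {
    if (active >= limit) {
      // Park this task until a running one finishes.
      await new Promise<void>((resolve) => waiters.push(resolve));
    }
    active++;
    try {
      await task();
    } finally {
      active--;
      waiters.shift()?.(); // wake the next queued task, if any
    }
  };
}

// Hypothetical usage inside onPairDiscovered:
// const limitTrack = makeLimiter(10);
// limitTrack(() => trackPairEarlyLife({ pairAddress, metrics, store }));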
At this point, you’ve built the part most sniper bots quietly depend on: a data-plane radar that’s measurable and auditable. You’re not trading yet—by design. You’re proving you can see the chain clearly, quickly, and consistently enough to justify execution later.
In Part 2, you’ll take these logs and start turning them into filters and validation rules that reduce noise—so your sniper bot learns to ignore pairs that are “created” but not genuinely tradable. In Parts 3 and 4, you’ll wire execution and then wrap it in operational guardrails to ensure it behaves safely under burst conditions.