Overview
bulkEvent() sends a batch of events in one HTTP request. Use it for backfills, importing historical data, or batching events you’ve already collected server-side.
For ongoing high-frequency tracking, use the EventBatcher, which handles buffering and flushing automatically.
Signature
churn.bulkEvent(
  events: BulkEventItem[],
  options?: CallOptions
): Promise<{ ok: boolean; accepted: number }>
Parameters
events — Array of BulkEventItem objects. Maximum 500 items per call.
options — Optional per-call settings (CallOptions).
BulkEventItem
interface BulkEventItem {
  userId: string
  event: string
  properties?: EventProperties // optional key-value metadata
  timestamp?: string // ISO 8601 — defaults to server time if omitted
}
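A minimal sketch of building a valid item. The field names come from the interface above; EventProperties is simplified here to a plain record so the snippet stands alone, and the example values are illustrative.

```typescript
// Shape from the BulkEventItem interface above, with EventProperties
// simplified to Record<string, unknown> for self-containment.
interface BulkEventItem {
  userId: string
  event: string
  properties?: Record<string, unknown>
  timestamp?: string
}

// Date#toISOString() yields the ISO 8601 format the timestamp field
// expects; omit timestamp to let the server assign ingestion time.
const item: BulkEventItem = {
  userId: "user_123",
  event: "subscription_cancelled",
  properties: { plan: "pro", seats: 4 },
  timestamp: new Date("2024-01-15T09:30:00Z").toISOString(),
}
```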
Examples
Backfill historical events
const events = historicalData.map((row) => ({
  userId: row.user_id,
  event: row.event_name,
  properties: row.metadata,
  timestamp: row.created_at,
}))
const { accepted } = await churn.bulkEvent(events)
console.log(`Accepted ${accepted} events`)
Batch events collected in memory
const buffer: BulkEventItem[] = []

function trackLater(userId: string, event: string, props?: Record<string, unknown>) {
  buffer.push({ userId, event, properties: props })
}

// Flush up to 500 buffered events every minute. If the call fails,
// put the batch back at the front so the next tick retries it
// instead of silently dropping events.
setInterval(async () => {
  if (buffer.length === 0) return
  const batch = buffer.splice(0, 500)
  try {
    await churn.bulkEvent(batch)
  } catch {
    buffer.unshift(...batch)
  }
}, 60_000)
Chunking large datasets
const CHUNK_SIZE = 500
async function backfill(allEvents: BulkEventItem[]) {
  for (let i = 0; i < allEvents.length; i += CHUNK_SIZE) {
    const chunk = allEvents.slice(i, i + CHUNK_SIZE)
    await churn.bulkEvent(chunk)
    console.log(`Flushed events ${i + 1}–${i + chunk.length} of ${allEvents.length}`)
  }
}
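Because each chunk is a separate call, rapid sequential flushes can hit the RATE_LIMITED error listed below. One way to reduce that risk is to pause between chunks. A minimal sketch: the `send` parameter stands in for the actual call (e.g. `(batch) => churn.bulkEvent(batch)`), and the 250 ms default delay is an assumption to tune against your account's limits.

```typescript
interface BulkEventItem {
  userId: string
  event: string
  properties?: Record<string, unknown>
  timestamp?: string
}

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms))
}

// Send events in chunks, pausing between calls to stay under the
// rate limit. `send` is the real bulk call, injected so this sketch
// is testable; the 250 ms default delay is an assumption.
async function pacedBackfill(
  allEvents: BulkEventItem[],
  send: (batch: BulkEventItem[]) => Promise<unknown>,
  chunkSize = 500,
  delayMs = 250,
): Promise<void> {
  for (let i = 0; i < allEvents.length; i += chunkSize) {
    await send(allEvents.slice(i, i + chunkSize))
    if (i + chunkSize < allEvents.length) await sleep(delayMs)
  }
}
```

Usage: `await pacedBackfill(allEvents, (batch) => churn.bulkEvent(batch))`.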
Return value
{
  ok: true,
  accepted: 247 // number of events successfully ingested
}
Limits
| Limit | Value |
|---|---|
| Max events per call | 500 |
| Max payload size | 1 MB |
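Events with large properties can hit the 1 MB payload cap well before the 500-item cap, so splitting by count alone is not always enough. A sketch of splitting on both limits, under the assumption that the wire payload is roughly the JSON encoding of the items (the real request adds envelope bytes, so leave headroom):

```typescript
interface BulkEventItem {
  userId: string
  event: string
  properties?: Record<string, unknown>
  timestamp?: string
}

const MAX_ITEMS = 500
const MAX_BYTES = 1_000_000 // 1 MB limit from the table above

// Split events into batches that respect both the item-count and the
// payload-size limit. Byte size is estimated from each item's JSON
// encoding, which undercounts envelope and separator bytes slightly.
function toBatches(events: BulkEventItem[]): BulkEventItem[][] {
  const batches: BulkEventItem[][] = []
  let current: BulkEventItem[] = []
  let bytes = 0
  for (const e of events) {
    const size = new TextEncoder().encode(JSON.stringify(e)).length
    if (current.length > 0 && (current.length >= MAX_ITEMS || bytes + size > MAX_BYTES)) {
      batches.push(current)
      current = []
      bytes = 0
    }
    current.push(e)
    bytes += size
  }
  if (current.length > 0) batches.push(current)
  return batches
}
```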
Errors
| Code | When |
|---|---|
| VALIDATION_ERROR | Array is empty or contains invalid items |
| PAYLOAD_TOO_LARGE | More than 500 events, or payload over 1 MB |
| RATE_LIMITED | Too many bulk calls in a short window |
| TIMEOUT | Request exceeded the configured timeout |
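RATE_LIMITED and TIMEOUT are transient, so a retry with backoff is usually appropriate, while VALIDATION_ERROR and PAYLOAD_TOO_LARGE are not fixed by retrying. A sketch under one assumption: that the SDK surfaces these codes as a `code` property on the thrown error (adjust the extraction to the real error shape).

```typescript
// Codes from the table above that are worth retrying.
const RETRYABLE = new Set(["RATE_LIMITED", "TIMEOUT"])

// Retry a bulk call on transient errors with exponential backoff.
// Assumption: errors carry the code as a `code` property -- adapt
// this to however your SDK actually exposes error codes.
async function sendWithRetry<T>(
  call: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 1000,
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await call()
    } catch (err) {
      const code = (err as { code?: string }).code
      if (attempt >= maxAttempts || !code || !RETRYABLE.has(code)) throw err
      // 1s, 2s, 4s, ... before the next attempt
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)))
    }
  }
}
```

Usage: `await sendWithRetry(() => churn.bulkEvent(chunk))`.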