Bulk & Batch Operations¶
Overview¶
When you need to create, update, or delete multiple records at once, Centrali offers two paths: bulk and batch. Both accept up to 1,000 records per call, but they differ in how they process records and which events they emit.
Choosing the right path depends on whether your downstream triggers need to process records individually or as a group.
When to Use Which¶
| | Bulk | Batch |
|---|---|---|
| Transaction model | All-or-nothing | Per-record (partial failure OK) |
| Events emitted | 1 aggregate event | N individual events |
| Trigger behavior | Trigger fires once with all record IDs | Trigger fires once per record |
| Failure handling | Entire operation rolls back | Returns successes and failures separately |
| `stopOnError` | N/A | Yes (stop processing on first failure) |
| Speed | Faster (single transaction) | Slower (per-record processing and events) |
Use bulk when:
- You need all records to succeed or fail together
- Downstream triggers process records as a group (e.g., rebuild a search index, send a batch notification)
- Speed matters more than per-record event granularity
Use batch when:
- You need per-record webhook or trigger execution
- Partial failure is acceptable (some records can succeed while others fail)
- Each record needs individual validation, notification, or side-effect processing
Also covered in Triggers docs
The Triggers guide has a quick comparison focused on event-driven trigger selection. This page covers the full operational details.
Event Publishing¶
This is the critical difference between the two paths.
Bulk Events¶
Bulk operations emit a single aggregate event containing all affected record IDs:
| Operation | Event | Failure Event |
|---|---|---|
| Create | `records_bulk_created` | `records_bulk_created_failed` |
| Update | `records_bulk_updated` | `records_bulk_updated_failed` |
| Delete | `records_bulk_deleted` | `records_bulk_deleted_failed` |
A trigger listening for `records_bulk_created` fires once and receives all record IDs in the payload.
Batch Events¶
Batch operations emit individual per-record events:
| Operation | Event (per record) | Failure Event (per record) |
|---|---|---|
| Create | `record_created` | `record_created_failed` |
| Update | `record_updated` | `record_updated_failed` |
| Delete | `record_deleted` | `record_deleted_failed` |
A trigger listening for `record_created` fires N times — once for each record in the batch.
Impact on Triggers¶
If you create 100 records:
- Bulk: Your `records_bulk_created` trigger fires 1 time. Your function receives an array of 100 record IDs.
- Batch: Your `record_created` trigger fires 100 times. Each execution receives a single record's data.
If you have no triggers listening for bulk events, bulk operations still succeed — events are simply not consumed.
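The fan-out difference can be sketched in plain JavaScript. This simulates only the event model; `emitBulk` and `emitBatch` are illustrative helpers, not Centrali APIs:

```javascript
// Simulate event fan-out for 100 created records.
function emitBulk(recordIds, handler) {
  // One aggregate event: the handler runs once with every ID.
  handler({ recordIds });
  return 1;
}

function emitBatch(recordIds, handler) {
  // One event per record: the handler runs N times.
  for (const id of recordIds) handler({ recordId: id });
  return recordIds.length;
}

const ids = Array.from({ length: 100 }, (_, i) => `uuid-${i}`);

let bulkPayloadSize = 0;
const bulkFires = emitBulk(ids, (payload) => {
  bulkPayloadSize = payload.recordIds.length;
});

let batchFires = 0;
emitBatch(ids, () => {
  batchFires += 1;
});

console.log(bulkFires, bulkPayloadSize, batchFires); // 1 100 100
```

One handler invocation with 100 IDs versus 100 invocations with one record each: that is the entire trade-off the table above describes.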
API Endpoints¶
All endpoints are under:

```
$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/{collection}
```

with `/bulk` or `/batch` appended, where `{collection}` is the collection slug (`orders` in the examples below).
Bulk Create¶
```bash
curl -X POST "$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/orders/bulk" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '[
    { "customer": "Alice", "total": 100 },
    { "customer": "Bob", "total": 200 }
  ]'
```
Request body is a JSON array of record objects (not wrapped in `{ records: [...] }`).
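The same call can be made from Node 18+ with the built-in `fetch`. A minimal sketch, assuming the endpoint above and credentials supplied via environment variables:

```javascript
// Sketch: bulk create over HTTP from Node 18+ (global fetch).
// CENTRALI_URL, WORKSPACE, and TOKEN are assumed environment variables.
const records = [
  { customer: "Alice", total: 100 },
  { customer: "Bob", total: 200 },
];

// The body is the bare JSON array, not wrapped in { records: [...] }.
const body = JSON.stringify(records);

async function bulkCreate() {
  const res = await fetch(
    `${process.env.CENTRALI_URL}/data/workspace/${process.env.WORKSPACE}/api/v1/records/slug/orders/bulk`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.TOKEN}`,
        "Content-Type": "application/json",
      },
      body,
    }
  );
  if (!res.ok) throw new Error(`Bulk create failed: ${res.status}`);
  return res.json();
}
```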
Bulk Update¶
Applies the same changes to all specified records:
```bash
curl -X PATCH "$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/orders/bulk" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "ids": ["uuid-1", "uuid-2", "uuid-3"],
    "data": { "status": "shipped" }
  }'
```
Bulk Delete¶
```bash
curl -X DELETE "$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/orders/bulk" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "ids": ["uuid-1", "uuid-2", "uuid-3"]
  }'
```
Batch Create¶
```bash
curl -X POST "$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/orders/batch" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '[
    { "customer": "Alice", "total": 100 },
    { "customer": "Bob", "total": 200 }
  ]'
```
Batch Update¶
Each record gets individual changes:
```bash
curl -X PATCH "$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/orders/batch" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "updates": [
      { "id": "uuid-1", "changes": { "status": "shipped" } },
      { "id": "uuid-2", "changes": { "status": "cancelled" } }
    ],
    "options": {
      "stopOnError": false,
      "returnRecords": true
    }
  }'
```
Batch Delete¶
```bash
curl -X DELETE "$CENTRALI_URL/data/workspace/$WORKSPACE/api/v1/records/slug/orders/batch" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "ids": ["uuid-1", "uuid-2", "uuid-3"],
    "options": {
      "hardDelete": true,
      "stopOnError": false
    }
  }'
```
Request Body Summary¶
| Endpoint | Method | Body Shape |
|---|---|---|
| `.../bulk` (create) | POST | `[ { field: value }, ... ]` |
| `.../bulk` (update) | PATCH | `{ ids: [...], data: { ... } }` |
| `.../bulk` (delete) | DELETE | `{ ids: [...] }` |
| `.../batch` (create) | POST | `[ { field: value }, ... ]` |
| `.../batch` (update) | PATCH | `{ updates: [{ id, changes }], options? }` |
| `.../batch` (delete) | DELETE | `{ ids: [...], options? }` |
Key difference for updates: bulk applies the same `data` to all IDs, while batch applies individual `changes` per record.
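A small client-side guard can catch a wrong update body before the request is sent and fails with a 400. This is a hypothetical helper, not part of the API:

```javascript
// Hypothetical guard: classify an update body as bulk-shaped or batch-shaped.
function updateBodyKind(body) {
  if (Array.isArray(body.ids) && body.data && typeof body.data === "object") {
    return "bulk"; // { ids: [...], data: {...} }: same data applied to all IDs
  }
  if (Array.isArray(body.updates) && body.updates.every((u) => u.id && u.changes)) {
    return "batch"; // { updates: [{ id, changes }], options? }: per-record changes
  }
  return "invalid";
}

const bulkBody = { ids: ["uuid-1", "uuid-2"], data: { status: "shipped" } };
const batchBody = { updates: [{ id: "uuid-1", changes: { status: "shipped" } }] };

console.log(updateBodyKind(bulkBody)); // "bulk"
console.log(updateBodyKind(batchBody)); // "batch"
```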
Compute Function API¶
Inside compute functions, both paths are available via the api object.
Batch Methods¶
```javascript
// Batch create — fires record_created per record
const result = await api.batchCreateRecords('orders', [
  { customer: 'Alice', total: 100 },
  { customer: 'Bob', total: 200 },
]);
// result: { created: 2, failed: 0 }
```

```javascript
// Batch update — individual changes per record
const result = await api.batchUpdateRecords('orders', [
  { id: 'uuid-1', changes: { status: 'shipped' } },
  { id: 'uuid-2', changes: { status: 'cancelled' } },
], { stopOnError: true, returnRecords: true });
// result: { updated: 2, failed: 0, records: [...] }
```

```javascript
// Batch delete
const result = await api.batchDeleteRecords('orders', ['uuid-1', 'uuid-2'], {
  hardDelete: true,
  stopOnError: false,
});
// result: { deleted: 2, failed: 0 }
```
Bulk Methods¶
```javascript
// Bulk create — fires one records_bulk_created event
const result = await api.bulkCreateRecords('orders', [
  { customer: 'Alice', total: 100 },
  { customer: 'Bob', total: 200 },
]);
// result: { created: 2, failed: 0, recordIds: ['...', '...'] }
```

```javascript
// Bulk update — same changes applied to all IDs
const result = await api.bulkUpdateRecords('orders',
  ['uuid-1', 'uuid-2', 'uuid-3'],
  { status: 'shipped' }
);
// result: { updated: 3, failed: 0, recordIds: [...] }
```

```javascript
// Bulk delete — fires one records_bulk_deleted event
const result = await api.bulkDeleteRecords('orders', ['uuid-1', 'uuid-2'], {
  hardDelete: true,
});
// result: { deleted: 2, failed: 0 }
```
SDK¶
The Centrali JavaScript/TypeScript SDK does not currently have dedicated bulk or batch record methods. Use the HTTP API directly or the compute function API shown above.
For bulk reads, the SDK provides `bulkGet`.
Limits¶
| Limit | Value |
|---|---|
| Max records per bulk operation | 1,000 |
| Max records per batch operation | 1,000 |
| Max request body size | 10 MB |
These limits apply to both the HTTP API and the compute function API.
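Imports larger than the cap must be split across multiple calls. A minimal chunking sketch (note that the 10 MB body limit may force smaller chunks for wide records):

```javascript
// Split a large import into chunks that respect the 1,000-record cap.
const MAX_RECORDS_PER_CALL = 1000;

function chunk(records, size = MAX_RECORDS_PER_CALL) {
  const chunks = [];
  for (let i = 0; i < records.length; i += size) {
    chunks.push(records.slice(i, i + size));
  }
  return chunks;
}

const records = Array.from({ length: 2500 }, (_, i) => ({ n: i }));
const chunks = chunk(records);
console.log(chunks.length); // 3
console.log(chunks[2].length); // 500
```

Each chunk then becomes one bulk or batch call; remember that with bulk, each chunk is its own transaction, so a failing chunk rolls back only its own records.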
Error Handling¶
Bulk Errors¶
Bulk operations are transactional. If any record fails validation, the entire operation is rolled back and nothing is persisted.
```json
{
  "error": "Validation failed for 2 record(s)",
  "code": "VALIDATION_ERROR",
  "details": {
    "recordErrors": [
      { "index": 0, "errors": [{ "field": "email", "message": "Invalid email format" }] },
      { "index": 3, "errors": [{ "field": "total", "message": "Must be a number" }] }
    ]
  }
}
```
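To surface these failures in logs or user-facing messages, the `details.recordErrors` array can be flattened into one line per failing field. A sketch against the response shape shown above:

```javascript
// Flatten a bulk VALIDATION_ERROR response into per-record messages.
// errorResponse mirrors the shape documented above.
const errorResponse = {
  error: "Validation failed for 2 record(s)",
  code: "VALIDATION_ERROR",
  details: {
    recordErrors: [
      { index: 0, errors: [{ field: "email", message: "Invalid email format" }] },
      { index: 3, errors: [{ field: "total", message: "Must be a number" }] },
    ],
  },
};

function describeBulkErrors(response) {
  return response.details.recordErrors.flatMap(({ index, errors }) =>
    errors.map((e) => `record[${index}].${e.field}: ${e.message}`)
  );
}

const messages = describeBulkErrors(errorResponse);
console.log(messages[0]); // "record[0].email: Invalid email format"
```

The `index` values refer to positions in the request array, so they can be mapped straight back to the input records you submitted.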
Batch Errors¶
Batch operations support partial failure. The response tells you exactly what succeeded and what failed:
```json
{
  "created": 8,
  "failed": 2,
  "records": [ ... ],
  "errors": [
    { "index": 2, "error": "Validation failed for field 'email'" },
    { "index": 7, "error": "Duplicate key violation" }
  ]
}
```
stopOnError Option¶
Batch update and batch delete accept a stopOnError option:
- `false` (default): Process all records, collecting errors along the way
- `true`: Stop at the first failure; records already processed remain committed
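The two modes can be illustrated with a client-side simulation. This mimics the semantics described above; it is not the server implementation:

```javascript
// Simulate batch processing under the two stopOnError modes.
// A record with ok: false stands in for one that fails validation.
function runBatch(records, { stopOnError }) {
  const result = { processed: 0, failed: 0, errors: [] };
  for (let i = 0; i < records.length; i++) {
    try {
      if (!records[i].ok) throw new Error("validation failed");
      result.processed += 1; // already-processed records stay committed
    } catch (err) {
      result.failed += 1;
      result.errors.push({ index: i, error: err.message });
      if (stopOnError) break; // stop at the first failure
    }
  }
  return result;
}

const records = [{ ok: true }, { ok: false }, { ok: true }];

console.log(runBatch(records, { stopOnError: false })); // processed: 2, failed: 1
console.log(runBatch(records, { stopOnError: true })); // processed: 1, failed: 1
```

With `stopOnError: true`, the third record is never attempted, but the first record's write is not rolled back.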
Common Mistakes¶
Using bulk when you need per-record triggers¶
If you have a trigger on `record_created` that sends a welcome email per user, a bulk create will not fire that trigger. You need batch create, or a separate trigger on `records_bulk_created`.
Using batch for large imports when bulk would be faster¶
If you're importing 1,000 records and don't need per-record triggers, bulk is significantly faster because it runs in a single transaction and emits one event.
Not handling partial failures in batch results¶
Always check the failed count and errors array in batch responses. A 200 status does not mean every record succeeded.
```javascript
const result = await api.batchCreateRecords('orders', records);
if (result.failed > 0) {
  api.log({ message: `${result.failed} records failed`, errors: result.errors });
}
```
Confusing bulk update body shape with batch update¶
Bulk update applies the same changes to multiple IDs (`{ ids, data }`). Batch update applies different changes per record (`{ updates: [{ id, changes }] }`). Sending the wrong shape results in a 400 error.
Related Documentation¶
- Triggers — Event-driven trigger setup and the bulk vs batch event model
- Trigger Parameters — Payload shapes for bulk and per-record events
- Writing Functions — Compute function API reference
- Limits & Quotas — Full platform limits
- CSV Import — File-based import (uses bulk internally)