# Bulk operations

Best practices for importing and updating data at scale
## Batch sizing
The bulk endpoints accept arrays of records. Choose batch size based on data confidence:
| Scenario | Batch size | Why |
|---|---|---|
| Clean, validated data | 25 | Maximum per request |
| First import, some unknowns | 10 | Limits blast radius |
| Retrying failures | 1 | Isolate the bad record |
The bulk endpoints accept at most 25 records per request, and batches are atomic: one bad record fails the entire batch, so a single duplicate email takes down every other record in the same request.
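A minimal sketch of this strategy in TypeScript, assuming a hypothetical `/people/bulk` endpoint and base URL (neither is from this guide): send batches of 25, and when an atomic batch fails, retry its records one at a time to isolate the bad one.

```typescript
// Hypothetical endpoint and base URL; substitute your own.
async function bulkCreate(records: object[]): Promise<void> {
  const res = await fetch("https://api.example.com/v1/people/bulk", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ records }),
  });
  if (!res.ok) throw new Error(`Batch failed: ${res.status}`);
}

// Send in batches of 25; because batches are atomic, a failed batch is
// retried one record at a time so the single bad record can be isolated.
async function importAll(records: object[], batchSize = 25): Promise<object[]> {
  const failed: object[] = [];
  for (let i = 0; i < records.length; i += batchSize) {
    const batch = records.slice(i, i + batchSize);
    try {
      await bulkCreate(batch);
    } catch {
      for (const record of batch) {
        try {
          await bulkCreate([record]); // batch size 1: isolate the bad record
        } catch {
          failed.push(record); // collect for manual inspection
        }
      }
    }
  }
  return failed;
}
```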
## Import order
Always import in this order so foreign keys resolve (a sketch follows the list):
1. Companies: no dependencies
2. People: can reference `company_id`
3. Deals: can reference `company_id`
4. Associations: link people to deals
5. Transcripts: link to meetings and people
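To see why the order matters, here is a sketch of resolving `company_id` during import. The endpoint paths, and the assumption that bulk create returns the new records with server-assigned IDs, are both hypothetical:

```typescript
// Hypothetical helper: bulk create that returns the new records with IDs.
async function bulkCreate<T>(path: string, records: object[]): Promise<T[]> {
  const res = await fetch(`https://api.example.com/v1${path}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ records }),
  });
  if (!res.ok) throw new Error(`${path} failed: ${res.status}`);
  return (await res.json()).records;
}

// 1. Companies first: nothing references them yet.
const companies = await bulkCreate<{ id: string; domain: string }>(
  "/companies/bulk",
  [{ name: "Acme", domain: "acme.com" }],
);

// Map a natural key (domain) to each server-assigned ID...
const companyIdByDomain = new Map<string, string>();
for (const c of companies) companyIdByDomain.set(c.domain, c.id);

// 2. ...so that company_id resolves when people are imported.
await bulkCreate("/people/bulk", [
  { email: "ada@acme.com", company_id: companyIdByDomain.get("acme.com") },
]);
```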
## Rate limiting
There's no documented rate limit, but we recommend the following (sketched in code below):
- 25 records per batch (the API maximum)
- 1 second delay between batches
- Back off exponentially on 429 responses
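A sketch of the pacing loop in TypeScript; `sendBatch` is a placeholder for whatever function POSTs one batch, and the 60-second cap is an illustrative choice, not a documented limit:

```typescript
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Pace batches: 1s between batches, exponential backoff on 429 responses.
async function paceBatches(
  batches: object[][],
  sendBatch: (batch: object[]) => Promise<Response>,
): Promise<void> {
  for (const batch of batches) {
    let delay = 1_000; // start from the recommended 1 second
    for (;;) {
      const res = await sendBatch(batch);
      if (res.status !== 429) break; // done, or a non-rate-limit error to handle elsewhere
      await sleep(delay);
      delay = Math.min(delay * 2, 60_000); // 1s, 2s, 4s, ... capped at 60s
    }
    await sleep(1_000); // fixed gap between batches
  }
}
```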
## Common errors
| Status | Meaning | Fix |
|---|---|---|
| 400 | Duplicate record (email/domain collision) | Match against existing records first and use PATCH |
| 404 | ID not found (bulk PATCH) | Verify the record IDs exist |
| 422 | Validation error (`id` included in a POST body, enum mismatch) | Fix the payload |
| 500 | Server error on single-record PATCH of a custom object | Use the bulk PATCH endpoint instead |
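For the 500 row, a sketch of the workaround: route a single-record update through the bulk PATCH endpoint as a one-element batch. The path and payload shape are assumptions, not documented:

```typescript
// Workaround: single PATCH on custom objects can return 500, so send the
// update as a one-record bulk PATCH. Path and payload shape are assumed.
async function updateCustomRecord(objectSlug: string, id: string, fields: object) {
  const res = await fetch(`https://api.example.com/v1/objects/${objectSlug}/bulk`, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ records: [{ id, ...fields }] }),
  });
  if (!res.ok) throw new Error(`Update failed: ${res.status}`);
  return res.json();
}
```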