Clarify API (Beta)
Guides

Bulk operations

Best practices for importing and updating data at scale

Batch sizing

The bulk endpoints accept arrays of records. Choose batch size based on data confidence:

Scenario                     Batch size   Why
Clean, validated data        25           Maximum per request
First import, some unknowns  10           Limits blast radius
Retrying failures            1            Isolate the bad record

The bulk endpoints accept a maximum of 25 records per request. Batches are atomic: if any record is rejected, the whole batch fails, so a single duplicate email blocks every other record in that batch.
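The batch-sizing table above can be sketched as a retry strategy: send at the maximum size, and when a batch fails, resend its records one at a time to isolate the bad one. This is an illustrative sketch, not Clarify client code — `send` stands in for whatever function posts one batch to a bulk endpoint and reports success.

```python
from typing import Callable, List, Sequence

MAX_BATCH = 25  # documented per-request maximum


def chunk(records: Sequence[dict], size: int) -> List[List[dict]]:
    """Split records into batches no larger than `size`."""
    return [list(records[i:i + size]) for i in range(0, len(records), size)]


def import_with_isolation(records: Sequence[dict],
                          send: Callable[[List[dict]], bool],
                          size: int = MAX_BATCH) -> List[dict]:
    """Send batches of `size`; on failure, retry each record alone.

    Batches are atomic, so a False from `send` means at least one record
    in the batch was rejected. Resending with batch size 1 isolates it
    while letting the innocent records through.
    """
    failed = []
    for batch in chunk(records, size):
        if send(batch):
            continue
        for record in batch:  # batch size 1: isolate the bad record
            if not send([record]):
                failed.append(record)
    return failed
```

Because atomic failures waste a whole request, starting at a smaller batch size (10) on a first import keeps the blast radius, and the number of isolation retries, small.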

Import order

Always import in this order so foreign keys resolve:

  1. Companies — no dependencies
  2. People — can reference company_id
  3. Deals — can reference company_id
  4. Associations — link people to deals
  5. Transcripts — link to meetings and people
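The ordering above can be sketched as a single driver that records the IDs returned by each create call and uses them to fill in foreign keys for later steps. Everything here is illustrative: `create(kind, record)` stands in for the real POST endpoints, and the source-data keys (`company`, `person`, `deal`) are assumptions about your export format. Transcripts (step 5) follow the same pattern against meeting and person IDs and are omitted for brevity.

```python
from typing import Callable, Dict


def import_all(data: dict, create: Callable[[str, dict], int]) -> Dict[str, dict]:
    """Import in dependency order so foreign keys always resolve.

    `create` posts one record and returns its new ID; `ids` maps each
    source key (name/email/title) to the created record's ID.
    """
    ids: Dict[str, dict] = {"companies": {}, "people": {}, "deals": {}}

    # 1. Companies: no dependencies
    for c in data["companies"]:
        ids["companies"][c["name"]] = create("companies", c)

    # 2. People: reference company_id
    for p in data["people"]:
        record = dict(p)
        record["company_id"] = ids["companies"][record.pop("company")]
        ids["people"][p["email"]] = create("people", record)

    # 3. Deals: reference company_id
    for d in data["deals"]:
        record = dict(d)
        record["company_id"] = ids["companies"][record.pop("company")]
        ids["deals"][d["title"]] = create("deals", record)

    # 4. Associations: link people to deals by their created IDs
    for a in data["associations"]:
        create("associations", {"person_id": ids["people"][a["person"]],
                                "deal_id": ids["deals"][a["deal"]]})
    return ids
```

Keeping the ID map in memory (or persisting it between runs) also makes re-runs idempotent-ish: records that already have an ID can be PATCHed instead of re-POSTed.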

Rate limiting

There's no documented rate limit, but we recommend:

  • 25 records per batch (the API maximum)
  • 1 second delay between batches
  • Back off exponentially on 429 responses
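The three recommendations above can be combined into one sending loop: a fixed pause between batches plus exponential backoff whenever the server answers 429. This is a sketch, not Clarify client code — `post` stands in for whatever function sends one batch and returns the HTTP status, and `sleep` is injectable so the loop can be tested without waiting.

```python
import time
from typing import Callable, Iterable, List


def backoff_delays(max_retries: int = 5, base: float = 1.0) -> List[float]:
    """Exponential backoff schedule for 429 responses: 1s, 2s, 4s, ..."""
    return [base * (2 ** i) for i in range(max_retries)]


def send_batches(batches: Iterable[List[dict]],
                 post: Callable[[List[dict]], int],
                 sleep: Callable[[float], None] = time.sleep,
                 pause: float = 1.0) -> None:
    """Post each batch with throttling.

    Waits `pause` seconds between batches, and backs off exponentially
    whenever `post` reports a 429, giving up after the schedule runs out.
    """
    for batch in batches:
        for delay in backoff_delays():
            status = post(batch)
            if status != 429:
                break
            sleep(delay)  # back off exponentially on 429
        sleep(pause)      # fixed delay between batches
```

With no documented limit, treating any 429 as authoritative and widening the pause is safer than probing for the real ceiling.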

Common errors

Status   Meaning                                        Fix
400      Duplicate record (email/domain collision)      Pre-match and use PATCH
404      ID not found (bulk PATCH)                      Verify record IDs exist
422      Validation error (id in POST, enum mismatch)   Fix payload
500      Server error on custom object single PATCH     Use bulk PATCH endpoint instead
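In an import script, the table above usually ends up as a small dispatch that turns each status into a remediation, so failures surface with an actionable message instead of a raw status code. A minimal sketch, with the wording taken from the table:

```python
def remediation(status: int) -> str:
    """Map a bulk-endpoint error status to the suggested fix."""
    fixes = {
        400: "duplicate record (email/domain collision): pre-match and use PATCH",
        404: "ID not found: verify the record IDs exist before a bulk PATCH",
        422: "validation error: fix the payload (no id in POST, check enum values)",
        500: "server error on custom object single PATCH: use the bulk PATCH endpoint",
    }
    return fixes.get(status, "unexpected status: back off, retry, then inspect the response body")
```

Logging `remediation(status)` next to the failing batch's record keys makes the 1-record isolation pass (see Batch sizing) much faster to act on.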