I need to export a large number of users' deals from HubSpot through the API. Each user can have up to 100k deals that need to be exported. However, the process of using the API to export everything is quite slow due to rate limits. I have been exploring the recently introduced CSV export functionality as an alternative, but I'm facing a challenge with the output format.
The issue is that HubSpot's CSV exports use human-readable column headers (the property labels) rather than the internal property names, which makes them awkward for machine processing. Is there a way to export the CSV files with machine-readable column names, or to accurately match the CSV columns back to the requested properties?
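One workaround, assuming the export headers are the property labels: fetch the property definitions (the CRM Properties API, e.g. `GET /crm/v3/properties/deals`, returns each property's internal `name` alongside its display `label`), build a label-to-name map, and rewrite the CSV header row. This is a sketch, not a guaranteed fix, since labels are not required to be unique, so you should check for collisions in your portal first.

```python
import csv
import io


def build_label_to_name(properties: list) -> dict:
    """Map display labels (as seen in CSV headers) to internal property names.

    `properties` is the `results` list from GET /crm/v3/properties/deals.
    """
    return {p["label"]: p["name"] for p in properties}


def rename_headers(csv_text: str, label_to_name: dict) -> str:
    """Rewrite a HubSpot CSV export's header row to internal property names.

    Columns whose label is not in the map are left untouched.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    rows[0] = [label_to_name.get(h, h) for h in rows[0]]
    out = io.StringIO()
    csv.writer(out).writerows(rows)
    return out.getvalue()
```

If two properties share a label, the map silently keeps only one of them, so a production version should detect duplicates and fall back to manual mapping for those columns.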
Additionally, I attempted to use the regular deals API to list and export the deals, but this approach is also hindered by the rate limits, so exporting such a large number of deals becomes very time-consuming. I also tried exporting each pipeline separately, allowing users to begin working with the data as it comes in. However, the HubSpot Search API, which I am using to filter by pipeline, has its own rate limits. For instance, if a pipeline contains 70k deals and I can only make 4 requests per second with a maximum of 100 deals per request, this solution is not viable.
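To make the arithmetic above concrete, here is a back-of-envelope lower bound on wall-clock time for paging through one pipeline at the stated limits (4 requests/second, 100 deals/page). The numbers are per user and per pipeline, so they compound quickly across many users:

```python
import math


def search_export_seconds(total_deals: int,
                          page_size: int = 100,
                          requests_per_second: int = 4) -> float:
    """Lower bound on seconds needed to page through `total_deals`
    via the Search API at the given page size and request rate."""
    requests_needed = math.ceil(total_deals / page_size)
    return requests_needed / requests_per_second


# A 70k-deal pipeline needs 700 requests -> 175 s (~3 minutes) at best;
# 100k deals need 1000 requests -> 250 s, again per user.
```

And that is the best case, ignoring request latency and any per-day limits, which is why serial paging does not scale to many users.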
Considering these challenges, do you have any suggestions or ideas on how to efficiently export up to 100k deals from HubSpot while overcoming rate limits and ensuring accurate column matching for machine processing purposes? Any insights or recommendations would be greatly appreciated!
The only thing I can imagine you could still try is using GraphQL to make the API calls. It has a different rate limit than the regular API endpoints, so it might offer a solution. I haven't tested it with these amounts, though.
I tried retrieving 1100 records from HubSpot with a single call, and that works if you only need a single property. If you need multiple properties, you have to calculate how many points it will take to retrieve all 100k records.
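That calculation can be sketched as below. The cost model and constants here are placeholder assumptions, not HubSpot's documented numbers: I'm assuming a fixed per-request point budget plus a per-record cost that grows with the number of properties requested. Measure the real costs from your own portal's responses before relying on this.

```python
def max_records_per_call(points_budget: int,
                         base_record_cost: int,
                         cost_per_property: int,
                         n_properties: int) -> int:
    """How many records fit in one GraphQL call under a per-request
    point budget.

    base_record_cost and cost_per_property are PLACEHOLDERS: derive
    the real values empirically from your portal's responses.
    """
    per_record = base_record_cost + cost_per_property * n_properties
    return points_budget // per_record


def calls_needed(total_records: int, records_per_call: int) -> int:
    """Total GraphQL calls to fetch all records at a given batch size."""
    return -(-total_records // records_per_call)  # ceiling division
```

For example, if one record with a single property costs ~27 points and the per-request budget is 30,000 points, you get roughly 1100 records per call, which matches what I observed; each extra property shrinks that batch size, and `calls_needed(100_000, batch)` tells you how many calls the full export takes.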