I'm currently building a C# (.NET) application that needs to insert a large volume of records into a Cloudflare D1 database using its REST API (raw SQL endpoint).
I’m facing two main challenges:
When I try to insert a large number of records in a single request, I get a "Statement too long" error. The documentation mentions this happens when the SQL statement exceeds the maximum allowed length.
I need to process and insert around 200,000 rows per day, so performance and reliability are important.
My current approach is to batch the data and send each batch via a POST request using an HttpClient, but I'm not sure this is the most efficient and scalable way to handle it.
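Here's a simplified version of what I'm doing today. The table, columns, record type, and batch size are placeholders, and the endpoint URL and the { sql, params } payload shape are what I understand from the D1 HTTP API docs, so please correct me if that part is off:

```csharp
// Simplified version of my current approach. Table name, columns, record type and
// batch size are placeholders; the endpoint URL and the { sql, params } payload shape
// are what I understand from the D1 HTTP API docs.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Threading.Tasks;

public sealed record MyRow(string ColA, int ColB);

public sealed class D1BulkInserter
{
    private readonly HttpClient _http;
    private readonly string _queryUrl;

    public D1BulkInserter(HttpClient http, string accountId, string databaseId, string apiToken)
    {
        _http = http;
        _http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiToken);
        _queryUrl =
            $"https://api.cloudflare.com/client/v4/accounts/{accountId}/d1/database/{databaseId}/query";
    }

    // Splits the rows into fixed-size batches and sends one multi-row INSERT per request.
    public async Task InsertAsync(IReadOnlyList<MyRow> rows, int batchSize = 500)
    {
        foreach (var chunk in rows.Chunk(batchSize))
        {
            // Builds "INSERT INTO my_table (col_a, col_b) VALUES (?, ?), (?, ?), ..."
            var placeholders = string.Join(", ", chunk.Select(_ => "(?, ?)"));
            var sql = $"INSERT INTO my_table (col_a, col_b) VALUES {placeholders};";
            var parameters = chunk.SelectMany(r => new object[] { r.ColA, r.ColB }).ToArray();

            // @params so the serialized JSON property is named "params".
            var response = await _http.PostAsJsonAsync(_queryUrl, new { sql, @params = parameters });
            response.EnsureSuccessStatusCode();
        }
    }
}
```

With 200,000 rows per day and a batch size of 500 that works out to roughly 400 requests daily, sent sequentially at the moment, which is why I'm wondering about parallelizing them.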
What I need help with:
- Best practices for bulk-inserting data into D1 via the REST API.
- How to split large insert operations into multiple requests efficiently.
- Recommendations for automating daily inserts (e.g., 200k+ rows).
- Whether it's advisable to parallelize API calls or to use a queuing system (a rough sketch of the throttled parallelism I'm considering is below this list).
- Any limitations regarding rate limits, request size, or concurrency when working with the D1 API.
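On the parallelism point, this is roughly the throttled approach I'm weighing. The degree of concurrency, the retry/backoff numbers, and the sendBatchAsync delegate are all placeholders I made up for illustration, not something I've validated against D1's actual rate limits:

```csharp
// Rough sketch of throttled parallel sends with a naive retry on HTTP 429.
// maxConcurrency and the backoff values are guesses, not tuned numbers.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public static class ThrottledSender
{
    public static async Task SendAllAsync(
        IEnumerable<object[]> batches,                             // pre-chunked parameter batches
        Func<object[], Task<HttpResponseMessage>> sendBatchAsync,  // e.g. wraps the POST shown above
        int maxConcurrency = 4)
    {
        using var gate = new SemaphoreSlim(maxConcurrency);

        var tasks = batches.Select(async batch =>
        {
            await gate.WaitAsync();
            try
            {
                // Retry with exponential backoff when rate-limited; a library like Polly
                // would handle this more robustly.
                for (var attempt = 0; attempt < 5; attempt++)
                {
                    var response = await sendBatchAsync(batch);
                    if (response.StatusCode != HttpStatusCode.TooManyRequests)
                    {
                        response.EnsureSuccessStatusCode();
                        return;
                    }
                    await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
                }
                throw new HttpRequestException("Batch still rate-limited after 5 attempts.");
            }
            finally
            {
                gate.Release();
            }
        }).ToList();

        await Task.WhenAll(tasks);
    }
}
```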
If anyone has experience with high-volume inserts into Cloudflare D1 or similar serverless databases via HTTP, I'd really appreciate your insights.
Thanks in advance!