I periodically do site updates in which I change 400-500 HTML pages per site.
While uploading the files via the API, I occasionally get non-200 responses with no error message or body; the response is completely blank. I then keep retrying the file upload until it eventually returns 200, and move on to the next file. It can take several minutes of retrying a request before it returns 200, which is quite annoying.
This never happens on smaller site deployments (e.g. 50 HTML files).
I thought this might be related to rate limiting, but I can’t see an ‘X-RateLimit-Remaining’ header in the responses, so I can’t tell. I only see the ‘x-concurrent-limit’ header.
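The retry loop described above can be sketched with exponential backoff and jitter, so repeated blank non-200 responses don't turn into a tight hammering loop. This is a minimal sketch, not the poster's actual script: `upload_with_retry` and `do_upload` are hypothetical names, and the upload call itself is assumed to be wrapped in a zero-argument callable that returns the HTTP status code.

```python
import random
import time


def upload_with_retry(do_upload, max_attempts=8, base_delay=1.0, max_delay=60.0):
    """Call do_upload() until it returns HTTP 200, backing off exponentially
    (with a little jitter) between attempts. do_upload is any zero-argument
    callable returning a status code; returns the number of attempts used."""
    for attempt in range(max_attempts):
        status = do_upload()
        if status == 200:
            return attempt + 1
        # Blank non-200 response: wait 1s, 2s, 4s, ... (capped), plus jitter,
        # instead of retrying immediately in a tight loop.
        delay = min(max_delay, base_delay * (2 ** attempt))
        time.sleep(delay + random.uniform(0, 0.1))
    raise RuntimeError(f"upload failed after {max_attempts} attempts")
```

A backoff like this usually shortens the "several minutes of retrying" pattern when the failures are concurrency- or rate-related, since spaced-out retries are more likely to land inside the allowed window.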
The number of files is not really the issue here - we should be able to handle that - but I’m wondering whether there are any large files in there that might be impacting performance. Can you do an audit and tell us how big the largest files you’re trying to upload are?
None of the files are very big. The biggest is an image that is 409 KB, and the biggest HTML files are 28 KB. The uploads seem to hang on the HTML files. So you think it’s not related to the total number of files in the deployment? It seems to happen only with my big deploys.
Can you link me to a successful deploy for that site in our UI’s deploy log pages, so I can take a look at what we received? Not that I don’t trust your description, but if there’s an unexpected 500 MB file in there, or I see 20k files in a deploy, then we’ll want to develop a shared understanding of what’s happening.