laravel-backup: HEAD - 503 Slow Down

Error executing “HeadObject” on “https://xxx.sfo2.digitaloceanspaces.com/xxx/xxx-10-17-02-00-02.zip”; AWS HTTP error: Server error: HEAD https://xxx.sfo2.digitaloceanspaces.com/xxx/xxx2018-10-17-02-00-02.zip resulted in a 503 Slow Down response (server): 503 Slow Down -

related: #618

Any fixes so far? @mdavis1982 @freekmurze @leeuwd

About this issue

  • State: closed
  • Created 6 years ago
  • Reactions: 1
  • Comments: 20 (3 by maintainers)

Most upvoted comments

@TobyMaxham So you added a new cleanup strategy, if I understood correctly. Interesting. Would you mind sharing it with us, Toby?

Hey @jpmurray, I found your comment through a Google search since we had the same issue. We were using Backblaze B2 as storage. Because the default Spatie Backup cleanup strategy, \Spatie\Backup\Tasks\Cleanup\Strategies\DefaultStrategy::class, checks whether each backup exists, it eventually calls the API more often than the purchased rate limit allows.

We replaced the cleanup strategy with our own class, and after that everything worked fine.
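Roughly, it looked like the sketch below. This assumes laravel-backup's CleanupStrategy base class with its abstract deleteOldBackups(BackupCollection $backups) method and that Backup exposes date() and delete(); the class name and the keep_newest config key are just illustrative. The idea is to keep only the N newest backups using the listing laravel-backup already fetched, without calling exists() per file (which is what fires the HeadObject requests):

    <?php

    namespace App\Backup;

    use Spatie\Backup\BackupDestination\BackupCollection;
    use Spatie\Backup\Tasks\Cleanup\CleanupStrategy;

    class KeepNewestStrategy extends CleanupStrategy
    {
        public function deleteOldBackups(BackupCollection $backups)
        {
            // Illustrative config key; adjust to your own config layout.
            $keep = $this->config->get('backup.cleanup.keep_newest', 7);

            $backups
                // Newest backups first.
                ->sortByDesc(function ($backup) {
                    return $backup->date()->timestamp;
                })
                // Everything beyond the first $keep entries gets removed.
                ->slice($keep)
                // One delete call per old backup, no per-file exists() checks.
                ->each(function ($backup) {
                    $backup->delete();
                });
        }
    }

Then point the cleanup.strategy key in config/backup.php at \App\Backup\KeepNewestStrategy::class instead of the default strategy.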

@pmochine Yeah, I've been at this for a few months now. Spaces is nice and inexpensive, but I had to stop using it as general image storage because I ran into rate-limiting errors all the time, despite spreading the load across 4 buckets. For backups I do still use DigitalOcean Spaces, but my staging backup cleanup has been failing for 3 months now.

I've been talking to the maintainer of the S3 adapter package at https://github.com/thephpleague/flysystem-aws-s3-v3/issues/205, and perhaps exponential backoff combined with some other tricks can make this work, but it seems like a hassle and is tough for me to implement.
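In the meantime, one thing that may buy headroom is raising the AWS SDK's built-in retry count, since its retry middleware already backs off exponentially on throttled and 5xx responses. A minimal sketch, assuming your Laravel version forwards extra disk config keys straight through to the underlying Aws\S3\S3Client (worth verifying for your setup; the env variable names are just illustrative):

    // config/filesystems.php
    'disks' => [
        'spaces' => [
            'driver'   => 's3',
            'key'      => env('DO_SPACES_KEY'),
            'secret'   => env('DO_SPACES_SECRET'),
            'region'   => 'sfo2',
            'bucket'   => env('DO_SPACES_BUCKET'),
            'endpoint' => 'https://sfo2.digitaloceanspaces.com',
            // Assumption: this key is passed on to the Aws\S3\S3Client
            // constructor, whose retry middleware retries throttled/5xx
            // responses with exponential backoff up to this many attempts.
            'retries'  => 10,
        ],
    ],

This does not eliminate the HeadObject calls, so it only softens the 503s; reducing the number of calls (for example with a leaner cleanup strategy) is still the real fix.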

For project image storage we are now considering volumes, and we are even considering moving elsewhere, to our Dutch provider TransIP to be precise. They offer Big Storage that you can attach to VPSs with relative ease, and it is not too pricey. I would prefer not to, though, as I like DigitalOcean: their database management, floating IP addresses, API and so on. But this is a serious issue.

NB: Using Amazon S3 would also be an option, as they are more generous with their rate limits, but they are more expensive and calculating the cost is a major pain.