rclone: [GDrive + FUSE] 403 Forbidden Errors - API daily limit exceeded

As discussed in the forum (https://forum.rclone.org/t/google-drive-vs-acd-for-plex/471), users are getting 403 Forbidden errors and are unable to access files when using an rclone FUSE mount. This happens especially when using Plex to access the mount. It appears to be related to exceeding the daily API limit: https://developers.google.com/drive/v3/web/handle-errors . Affected users get a temporary ban from accessing files via the rclone FUSE mount or downloading files. Access to the Google Drive website still works, and uploads still work without issue. The only viable workaround seems to be a local cache, as mentioned in #897.
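Google's error-handling guide linked above recommends retrying rate-limit 403s with exponential backoff (note this only helps transient rate limits, not a 24-hour ban once the daily quota is exhausted). A minimal sketch of such a wrapper — the `retry_backoff` name and its arguments are made up for illustration:

```shell
# Sketch: retry a command with exponential backoff.
# Usage: retry_backoff <max-attempts> <initial-delay-seconds> <command...>
retry_backoff() {
    max=$1; delay=$2; shift 2
    attempt=1
    until "$@"; do
        if [ "$attempt" -ge "$max" ]; then
            return 1   # give up after max attempts
        fi
        sleep "$delay"
        delay=$((delay * 2))      # double the wait each time
        attempt=$((attempt + 1))
    done
}

# e.g. wrap the failing copy from the log below:
# retry_backoff 5 1 rclone copy --verbose gdrive:test/jellyfish-40-mbps-hd-h264.mkv ~/tmp
```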

What is your rclone version (eg output from rclone -V)

v1.34-75-gcbfec0dβ

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Linux Ubuntu

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy --verbose --no-traverse gdrive:test/jellyfish-40-mbps-hd-h264.mkv ~/tmp

A log from the command with the -v flag (eg output from rclone -v copy /tmp remote:tmp)

2016/12/26 05:43:12 Local file system at /home/xxxxxx/tmp: Modify window is 1ms
2016/12/26 05:43:13 Local file system at /home/xxxxxx/tmp: Waiting for checks to finish
2016/12/26 05:43:13 Local file system at /home/xxxxxx/tmp: Waiting for transfers to finish
2016/12/26 05:43:13 jellyfish-40-mbps-hd-h264.mkv: Failed to copy: failed to open source object: bad response: 403: 403 Forbidden
2016/12/26 05:43:13 Attempt 1/3 failed with 1 errors and: failed to open source object: bad response: 403: 403 Forbidden
2016/12/26 05:43:13 Local file system at /home/xxxxxx/tmp: Waiting for checks to finish
2016/12/26 05:43:13 Local file system at /home/xxxxxx/tmp: Waiting for transfers to finish
2016/12/26 05:43:13 jellyfish-40-mbps-hd-h264.mkv: Failed to copy: failed to open source object: bad response: 403: 403 Forbidden
2016/12/26 05:43:13 Attempt 2/3 failed with 1 errors and: failed to open source object: bad response: 403: 403 Forbidden
2016/12/26 05:43:13 Local file system at /home/xxxxxx/tmp: Waiting for checks to finish
2016/12/26 05:43:13 Local file system at /home/xxxxxx/tmp: Waiting for transfers to finish
2016/12/26 05:43:13 jellyfish-40-mbps-hd-h264.mkv: Failed to copy: failed to open source object: bad response: 403: 403 Forbidden
2016/12/26 05:43:13 Attempt 3/3 failed with 1 errors and: failed to open source object: bad response: 403: 403 Forbidden
2016/12/26 05:43:13 Failed to copy: failed to open source object: bad response: 403: 403 Forbidden 

About this issue

  • Original URL
  • State: closed
  • Created 8 years ago
  • Comments: 65 (2 by maintainers)

Most upvoted comments

@tcf909 What I have that works great is:

  1. plexdrive /mnt/plexdrivero
  2. rclone /mnt/plexdriverw
  3. unionfs with plexdrivero=RO:plexdriverw=RW /mnt/plexdrive

Reading gets done by plexdrive and writing gets done by rclone —

best of both worlds.
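The three-step setup above might look something like this on the command line — a sketch only: the `gdrive:` remote name and the unionfs-fuse options are assumptions, and plexdrive/unionfs-fuse invocations vary between versions, so check your own:

```shell
# Create the three mount points
mkdir -p /mnt/plexdrivero /mnt/plexdriverw /mnt/plexdrive

# 1. Read-only mount via plexdrive (invocation as in the comment above)
plexdrive /mnt/plexdrivero &

# 2. Read-write mount via rclone
rclone mount gdrive: /mnt/plexdriverw --allow-other &

# 3. Union of the two: reads prefer the plexdrive branch,
#    writes land in the rclone branch (copy-on-write)
unionfs -o cow,allow_other \
    /mnt/plexdriverw=RW:/mnt/plexdrivero=RO /mnt/plexdrive
```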

Would directory caching functionality in rclone fix this issue? Other tools that cache the directory structure don’t seem to run into the issues expressed above. I could have sworn I saw a feature request for something like this. Thoughts?
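Purely as an illustration of what directory caching buys you, here is a hypothetical wrapper (the `cached_ls` name, cache path, and 300-second TTL are all made up) that reuses a recent `rclone ls` listing instead of hitting the API on every lookup:

```shell
# Sketch: serve a cached directory listing if it is fresh enough,
# otherwise refresh it from the remote and cache the result.
cached_ls() {
    dir=$1
    cache="/tmp/rclone-ls-cache-$(echo "$dir" | tr '/:' '__')"
    ttl=300   # seconds before the cached listing is considered stale
    now=$(date +%s)
    if [ -f "$cache" ] && [ $((now - $(stat -c %Y "$cache"))) -lt "$ttl" ]; then
        cat "$cache"                # cache hit: no API call
    else
        rclone ls "$dir" | tee "$cache"   # cache miss: fetch and store
    fi
}
```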

We are all hoping to see this fixed.

On 3 Feb 2017, at 12:32 PM, jackalblood notifications@github.com wrote:

I also really hope to see a fix for this. I’m running both acd and gdrive, and the peering with gdrive is far superior.



What is their reason for the ban?

Here is their claim from when they banned me twice within a week:

Now to your issue about the locked Amazon Drive:

Your account was locked on 2016/11/25 and again on 2016/11/30 because you downloaded hundreds of terabytes of data in a single 24 hour period.

Since it is not generally possible for a single user with a non-commercial internet connection to do this, nor would it be reasonable for a single user to do so when the account only contains a fraction of this much data, it indicates to us that you are using Amazon Drive in a manner which violates our terms of use.

I was uploading from 4 servers at close to full 1 Gbps speed each when those bans happened, but the claim of hundreds of terabytes is still a bit absurd. I told them I wish I had access to such awesome internet, as even a 10 Gbps line could not do it 😃

I explained to them that I am using acd on 2 PCs at home plus my notebook and my work PC, and that I am moving all my backups to Amazon Drive.

I also asked for an official email with the exact reason why my account was blocked, so they apologized and unblocked me, and someone else called me within an hour and told me that my account had been blocked by auto scripts that block all high-bandwidth users.

They would not reveal (he did not know) what the limit is, even though I explained that I could stay under it if I knew what it was.

Since then my account was blocked at least 3 or 4 more times, but after mid-December all blocks stopped, even though I was intentionally pushing tons of data to acd.

                     rx      /      tx      /     total    /   estimated
 enp7s0:
   Jan '17     124.31 TiB / 125.72 TiB / 250.03 TiB
   Feb '17      67.32 TiB /  67.78 TiB / 135.10 TiB / 340.33 TiB
   yesterday     8.31 TiB /   8.31 TiB /  16.62 TiB
   today       988.06 GiB / 988.24 GiB /   1.93 TiB /  16.74 TiB

Btw, on my 5th or 6th ban I called support twice, as the person on the other side just would not discuss it and told me they would look into it. So I just called again, got another representative, and told him that I need to get my data as my family is visiting and we have all our home videos and pictures on … He did not even ask anything, just told me my account is unblocked.

I really don’t believe there is a permanent ban with Amazon, except if they ban for illegal content, e.g. unencrypted data.

P.S. There is a guy with 1.5 PB uploaded to acd so far who has not been banned once, but he is uploading at just 1 Gbps.

I’m doing some testing with the suggested settings above. I’ve got a pretty big archive of crap; I’ll report back my findings. I am using my own API key, of course.