InvenTree: [BUG] Configuring S3 backend for backups failing

Deployment Method

  • Docker Production

Describe the problem*

Hi! First of all, thanks for this great piece of software.

I’m trying to configure InvenTree (latest version) to use AWS S3 for backups, according to these instructions. I’m using Docker and have successfully deployed a server using the production setup guide.

Steps to Reproduce

Following the Docker production setup instructions, I’ve added these lines to my .env in an attempt to make InvenTree use an S3 bucket for backups:

INVENTREE_BACKUP_STORAGE=storages.backends.s3boto3.S3Boto3Storage
INVENTREE_BACKUP_OPTIONS={'access_key': '<my access key>', 'secret_key': '<my secret key>', 'bucket_name': 'inventree-backup-backupbucket', 'default_acl': 'private'}

When running docker-compose run inventree-server invoke update, I get an error. I suspect the problem is how INVENTREE_BACKUP_OPTIONS is specified: the value does not seem to be interpreted as a Python dictionary, but passed through as a plain string.

Relevant log output

Backing up InvenTree database...
TypeError: dbbackup.storage.Storage() argument after ** must be a mapping, not str
  File "/root/.local/lib/python3.9/site-packages/dbbackup/utils.py", line 120, in wrapper
    func(*args, **kwargs)
  File "/root/.local/lib/python3.9/site-packages/dbbackup/management/commands/dbbackup.py", line 80, in handle
    self.storage = get_storage()
  File "/root/.local/lib/python3.9/site-packages/dbbackup/storage.py", line 33, in get_storage
    return Storage(path, **options)

Traceback (most recent call last):
  File "/home/inventree/InvenTree/manage.py", line 23, in <module>
    execute_from_command_line(sys.argv)
  File "/root/.local/lib/python3.9/site-packages/django/core/management/__init__.py", line 419, in execute_from_command_line
    utility.execute()
  File "/root/.local/lib/python3.9/site-packages/django/core/management/__init__.py", line 413, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/root/.local/lib/python3.9/site-packages/django/core/management/base.py", line 354, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/root/.local/lib/python3.9/site-packages/django/core/management/base.py", line 398, in execute
    output = self.handle(*args, **options)
  File "/root/.local/lib/python3.9/site-packages/dbbackup/utils.py", line 120, in wrapper
    func(*args, **kwargs)
  File "/root/.local/lib/python3.9/site-packages/dbbackup/management/commands/dbbackup.py", line 80, in handle
    self.storage = get_storage()
  File "/root/.local/lib/python3.9/site-packages/dbbackup/storage.py", line 33, in get_storage
    return Storage(path, **options)
TypeError: dbbackup.storage.Storage() argument after ** must be a mapping, not str
Jan 16 11:08:00 cloud-init[2447]: util.py[WARNING]: Failed running /var/lib/cloud/instance/scripts/part-001 [1]
Jan 16 11:08:00 cloud-init[2447]: cc_scripts_user.py[WARNING]: Failed to run module scripts-user (scripts in /var/lib/cloud/instance/scripts)
Jan 16 11:08:00 cloud-init[2447]: util.py[WARNING]: Running module scripts-user (<module 'cloudinit.config.cc_scripts_user' from '/usr/lib/python2.7/site-packages/cloudinit/config/cc_scripts_user.pyc'>) failed
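The TypeError at the end of the trace can be reproduced in isolation: environment variables always arrive as strings, and Python refuses to unpack a string with `**`. A minimal sketch (bucket name taken from the report; the `ast.literal_eval` parse is just one possible way to coerce such a string, not necessarily what InvenTree itself does):

```python
import ast

# Environment variables are plain strings, so unpacking one with ** fails
# with the same TypeError that appears in the log above.
raw = "{'bucket_name': 'inventree-backup-backupbucket', 'default_acl': 'private'}"
try:
    dict(**raw)
except TypeError as exc:
    print(exc)  # ... argument after ** must be a mapping, not str

# Parsing the string into a real dict first makes the unpacking work.
options = ast.literal_eval(raw)
print(type(options).__name__)  # dict
```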

About this issue

  • Original URL
  • State: closed
  • Created a year ago
  • Comments: 17 (9 by maintainers)

Most upvoted comments

If you do not have anything that is encrypted in your database, it is not a problem. Plugins have the option to encrypt external (Slack, ServiceNow, …) API tokens when saving; without the secret key those could not be decrypted. Tokens for access to the InvenTree API can be reissued easily.

I went the route of writing scripts that back up and restore the exported data.json plus the media folder. Since the server runs Amazon Linux on EC2, the tools needed for transferring to/from S3 are already installed by default, so I haven’t experimented further with django-dbbackup.
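The script-based approach described above could be sketched roughly as follows. This is a hypothetical outline, not the author's actual script: the bucket name, data paths, and the `export-records` invoke task name are assumptions based on the report and the InvenTree docs.

```python
import subprocess

# Hypothetical backup sketch: export the database to JSON via InvenTree's
# invoke task, then copy the export and the media folder to S3 with the
# AWS CLI (preinstalled on Amazon Linux).
BUCKET = "s3://inventree-backup-backupbucket"  # assumed bucket name

def backup():
    # Export database records to JSON inside the server container.
    subprocess.run(
        ["docker-compose", "run", "--rm", "inventree-server",
         "invoke", "export-records", "--filename", "data/data.json"],
        check=True,
    )
    # Copy the export and media files to S3.
    subprocess.run(
        ["aws", "s3", "cp", "inventree-data/data.json", f"{BUCKET}/data.json"],
        check=True,
    )
    subprocess.run(
        ["aws", "s3", "sync", "inventree-data/media", f"{BUCKET}/media"],
        check=True,
    )
```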

Regarding the secret_key.txt, I have not backed it up, but I’ve tried to tear down my EC2 instance and restart it again, importing the exported database data.json and the media folder. It seems to work fine. Is it necessary to back up the secret key?

@felixeriksson SETUP_EXTRA_PIP is only working when using the bare metal installer. Where did you read that those can be used with docker?

My bad, I was searching for ways of making InvenTree pip install additional libraries, and stumbled across that option without realizing it applies only to bare-metal setups.

The plugins are only installed when running the ‘invoke update’ command or after the server is started up.

Does that mean that plugins.txt cannot be used to install django-storages[boto3] in a way that makes backups with the S3 backend work?

The canonical way of adding dependencies to Docker images would be to create a custom image that builds upon the original one. Would that mean modifying the inventree-worker image to also install django-storages[boto3]? Would you recommend this as an approach to be able to use AWS S3 as the backend for the backup system?
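For what it’s worth, the custom-image route mentioned above might look something like this minimal Dockerfile sketch (the base image tag is an assumption; the server and worker containers typically share one image, so pointing both services in docker-compose at the custom image should cover them):

```dockerfile
# Hypothetical custom image: extend the official InvenTree image and add
# the S3 storage backend used by django-dbbackup.
FROM inventree/inventree:latest

# django-storages with the boto3 extra provides S3Boto3Storage.
RUN pip install --no-cache-dir "django-storages[boto3]"
```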

I don’t have very much experience with Docker yet, so I would have to do some research on how complex that would be, and how it could be integrated in our setup.

To give some context, I’m automating the setup of an EC2 instance (using CloudFormation) that sets up InvenTree using Docker, as the instructions specify. The setup of the instance works well, but I also want it to include a solid way of doing backups. I also want the instance to fetch the latest backup from S3 during setup, so it can be torn down and rebuilt from scratch while keeping all the data. An alternative to django-dbbackup could be to simply back up the database and media files, exporting the database as JSON as the Docker setup instructions indicate. But I haven’t found any instructions on how to restore such a backup - do you have any hints?
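A restore counterpart to the JSON-export backup described above might be sketched like this. Again a hypothetical outline under the same assumptions (bucket name, data paths), plus the assumption that InvenTree's `import-records` invoke task is the counterpart to `export-records`:

```python
import subprocess

# Hypothetical restore sketch: fetch the backup from S3, restore the media
# folder, then load the database export via InvenTree's import task.
BUCKET = "s3://inventree-backup-backupbucket"  # assumed bucket name

def restore():
    # Fetch the database export and media files from S3.
    subprocess.run(
        ["aws", "s3", "cp", f"{BUCKET}/data.json", "inventree-data/data.json"],
        check=True,
    )
    subprocess.run(
        ["aws", "s3", "sync", f"{BUCKET}/media", "inventree-data/media"],
        check=True,
    )
    # Load the exported records inside the server container.
    subprocess.run(
        ["docker-compose", "run", "--rm", "inventree-server",
         "invoke", "import-records", "--filename", "data/data.json"],
        check=True,
    )
```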