restic: out of memory during restic prune

Output of restic version

restic 0.13.1 compiled with go1.18 on windows/amd64

How did you run restic exactly?

RESTIC_CACHE_DIR=F:\Restic RESTIC_PASSWORD=C:\Restic\passwd.txt RESTIC_REPOSITORY=C:\Restic\restic.txt restic prune
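
For reference: the prompt in the output below is PowerShell, which does not accept the Unix-style VAR=value prefix syntax shown above, and the password/repository values here are file paths, so restic’s RESTIC_PASSWORD_FILE and RESTIC_REPOSITORY_FILE variables were presumably intended. A PowerShell sketch of an equivalent setup (an assumption; it presumes those files hold the password and the repository location):

$env:RESTIC_CACHE_DIR = 'F:\Restic'
$env:RESTIC_PASSWORD_FILE = 'C:\Restic\passwd.txt'
$env:RESTIC_REPOSITORY_FILE = 'C:\Restic\restic.txt'
restic prune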

What backend/server/service did you use to store the repository?

Wasabi S3 Storage (https://wasabi.com/)

Expected behavior

Delete unused objects from the repository.

Actual behavior

PS C:\> restic prune
repository 45f0f1a4 opened successfully, password is correct
loading indexes...
loading all snapshots...
finding data that is still in use for 57 snapshots
[0:46] 100.00%  57 / 57 snapshots...
searching used packs...
runtime: VirtualAlloc of 623902720 bytes failed with errno=1455
fatal error: out of memory

runtime stack:
runtime.throw({0x16394c0?, 0xc247e60000?})
        /usr/local/go/src/runtime/panic.go:992 +0x76
runtime.sysUsed(0xc225200000, 0x25300000)
        /usr/local/go/src/runtime/mem_windows.go:83 +0x1c9
runtime.(*mheap).allocSpan(0x1d75880, 0x12980, 0x0, 0x1)
        /usr/local/go/src/runtime/mheap.go:1279 +0x428
runtime.(*mheap).alloc.func1()
        /usr/local/go/src/runtime/mheap.go:912 +0x65
runtime.systemstack()
        /usr/local/go/src/runtime/asm_amd64.s:469 +0x4e

goroutine 1 [running]:
runtime.systemstack_switch()
        /usr/local/go/src/runtime/asm_amd64.s:436 fp=0xc18a84aa30 sp=0xc18a84aa28 pc=0xc32020
runtime.(*mheap).alloc(0x25300000?, 0x12980?, 0xc5?)
        /usr/local/go/src/runtime/mheap.go:906 +0x65 fp=0xc18a84aa78 sp=0xc18a84aa30 pc=0xbf7065
runtime.(*mcache).allocLarge(0x27750b4d6a8?, 0x25300000, 0x1)
        /usr/local/go/src/runtime/mcache.go:213 +0x85 fp=0xc18a84aac8 sp=0xc18a84aa78 pc=0xbe7305
runtime.mallocgc(0x25300000, 0x1584140, 0x1)
        /usr/local/go/src/runtime/malloc.go:1096 +0x5a5 fp=0xc18a84ab40 sp=0xc18a84aac8 pc=0xbdd6e5
runtime.newarray(0xc18a84aba0?, 0xbd7052?)
        /usr/local/go/src/runtime/malloc.go:1281 +0x52 fp=0xc18a84ab68 sp=0xc18a84ab40 pc=0xbddbf2
runtime.makeBucketArray(0xbd6dc5?, 0xa0?, 0xbd36fa?)
        /usr/local/go/src/runtime/map.go:363 +0x18e fp=0xc18a84aba8 sp=0xc18a84ab68 pc=0xbdea2e
runtime.hashGrow(0x1569b60?, 0xc03885f5f0)
        /usr/local/go/src/runtime/map.go:1049 +0x79 fp=0xc18a84abe8 sp=0xc18a84aba8 pc=0xbe0339
runtime.mapassign(0x15d0c80, 0xc03885f5f0, 0xc000472800?)
        /usr/local/go/src/runtime/map.go:658 +0xd4 fp=0xc18a84ac68 sp=0xc18a84abe8 pc=0xbdf2d4
github.com/restic/restic/internal/restic.BlobSet.Insert(...)
        /restic/internal/restic/blob_set.go:26
main.prune({_, {_, _}, _, {_, _}, _, _}, {{0x0, 0x0}, ...}, ...)
        /restic/cmd/restic/cmd_prune.go:222 +0x329 fp=0xc18a84b7e8 sp=0xc18a84ac68 pc=0x1410d09
main.runPruneWithRepo({_, {_, _}, _, {_, _}, _, _}, {{0x0, 0x0}, ...}, ...)
        /restic/cmd/restic/cmd_prune.go:162 +0x205 fp=0xc18a84b9a0 sp=0xc18a84b7e8 pc=0x1410925
main.runPrune({_, {_, _}, _, {_, _}, _, _}, {{0x0, 0x0}, ...})
        /restic/cmd/restic/cmd_prune.go:140 +0x213 fp=0xc18a84bb78 sp=0xc18a84b9a0 pc=0x1410573
main.glob..func18(0x1d4b920?, {0x162fe94?, 0x0?, 0x0?})
        /restic/cmd/restic/cmd_prune.go:35 +0x8a fp=0xc18a84bd08 sp=0xc18a84bb78 pc=0x140fc2a
github.com/spf13/cobra.(*Command).execute(0x1d4b920, {0x1db3930, 0x0, 0x0})
        /home/build/go/pkg/mod/github.com/spf13/cobra@v1.2.1/command.go:856 +0x67c fp=0xc18a84bde0 sp=0xc18a84bd08 pc=0xedb7bc
github.com/spf13/cobra.(*Command).ExecuteC(0x1d4d220)
        /home/build/go/pkg/mod/github.com/spf13/cobra@v1.2.1/command.go:974 +0x3b4 fp=0xc18a84be98 sp=0xc18a84bde0 pc=0xedbe34
github.com/spf13/cobra.(*Command).Execute(...)
        /home/build/go/pkg/mod/github.com/spf13/cobra@v1.2.1/command.go:902
main.main()
        /restic/cmd/restic/main.go:98 +0x32 fp=0xc18a84bf80 sp=0xc18a84be98 pc=0x1428c92
runtime.main()
        /usr/local/go/src/runtime/proc.go:250 +0x1fe fp=0xc18a84bfe0 sp=0xc18a84bf80 pc=0xc0925e
runtime.goexit()
        /usr/local/go/src/runtime/asm_amd64.s:1571 +0x1 fp=0xc18a84bfe8 sp=0xc18a84bfe0 pc=0xc34381

goroutine 6 [chan receive, 2 minutes]:
github.com/restic/restic/internal/restic.init.0.func1.1()
        /restic/internal/restic/lock.go:250 +0x77
created by github.com/restic/restic/internal/restic.init.0.func1
        /restic/internal/restic/lock.go:247 +0x25

goroutine 18 [syscall, 2 minutes]:
os/signal.signal_recv()
        /usr/local/go/src/runtime/sigqueue.go:151 +0x2f
os/signal.loop()
        /usr/local/go/src/os/signal/signal_unix.go:23 +0x19
created by os/signal.Notify.func1.1
        /usr/local/go/src/os/signal/signal.go:151 +0x2a

goroutine 8 [select]:
go.opencensus.io/stats/view.(*worker).start(0xc000162800)
        /home/build/go/pkg/mod/go.opencensus.io@v0.23.0/stats/view/worker.go:276 +0xad
created by go.opencensus.io/stats/view.init.0
        /home/build/go/pkg/mod/go.opencensus.io@v0.23.0/stats/view/worker.go:34 +0x8d

goroutine 9 [chan receive, 2 minutes]:
main.CleanupHandler(0x0?)
        /restic/cmd/restic/cleanup.go:59 +0x39
created by main.init.0
        /restic/cmd/restic/cleanup.go:21 +0x8b

goroutine 140 [select]:
github.com/restic/restic/internal/repository.(*Index).Each.func1.2(0xc06b0dff58)
        /restic/internal/repository/index.go:267 +0x196
github.com/restic/restic/internal/repository.(*indexMap).foreach(...)
        /restic/internal/repository/indexmap.go:59
github.com/restic/restic/internal/repository.(*Index).Each.func1()
        /restic/internal/repository/index.go:266 +0x1e2
created by github.com/restic/restic/internal/repository.(*Index).Each
        /restic/internal/repository/index.go:258 +0xe5

goroutine 25 [select, 2 minutes]:
main.refreshLocks(0x0?, 0xc0002ac300)
        /restic/cmd/restic/lock.go:77 +0xd1
created by main.lockRepository
        /restic/cmd/restic/lock.go:54 +0x1b1

goroutine 139 [select]:
github.com/restic/restic/internal/repository.(*MasterIndex).Each.func1()
        /restic/internal/repository/master_index.go:269 +0x279
created by github.com/restic/restic/internal/repository.(*MasterIndex).Each
        /restic/internal/repository/master_index.go:260 +0xe5

Steps to reproduce the behavior

Just run restic prune after forgetting a few snapshots, as in the example below.
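
For example (a sketch; the actual forget policy used is not given in this report):

restic forget --keep-last 30
restic prune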

Do you have any idea what may have caused this?

Maybe the repository size (36 TB).

Do you have an idea how to solve the issue?

no

Did restic help you today? Did it make you happy in any way?

Restic is fantastic and I’ve been using it for a few years on many projects.

About this issue

  • Original URL
  • State: open
  • Created 2 years ago
  • Reactions: 1
  • Comments: 19 (8 by maintainers)

Most upvoted comments

I didn’t see any documentation for restic about what could be considered large, or potentially too large, for a repository. Is this something you could elaborate on? I’d like to know whether I’m going to run into problems executing a prune or other commands.

restic is currently probably most suitable for repositories in the single-digit-TB range; double-digit-TB repositories will probably work too, but a few performance limits may start to show up.

The larger a repository is, the more memory restic will require; a very rough estimate is 200 MB per 1 million files and also per 1 TB of data. There are a few exceptions, such as multi-TB files or folders that directly (i.e., not in a subfolder) contain millions of files, which can lead to higher memory usage.
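
To make that rule of thumb concrete, here is a minimal Go sketch (the constants and function are illustrative, not part of restic’s code) applying the ~200 MB per 1 million files plus ~200 MB per 1 TB estimate to a repository like the 36 TB one in this report:

package main

import "fmt"

// Very rough rule of thumb from the comment above: ~200 MB per
// 1 million files plus ~200 MB per 1 TB of data. Actual usage varies.
const (
	mbPerMillionFiles = 200.0
	mbPerTB           = 200.0
)

func estimateMemoryMB(files, dataTB float64) float64 {
	return files/1e6*mbPerMillionFiles + dataTB*mbPerTB
}

func main() {
	// The 36 TB repository from this report, with a hypothetical
	// 20 million files (the real file count is not in the report):
	fmt.Printf("~%.0f MB\n", estimateMemoryMB(20e6, 36))
	// Prints: ~11200 MB, i.e. on the order of 11 GB of RAM.
}

That picture is consistent with the crash above: errno 1455 on Windows is ERROR_COMMIT_LIMIT_EXCEEDED ("the paging file is too small for this operation to complete"), meaning the machine could not commit the memory restic asked for.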

The larger a repository is, the more time prune/check will take. As these commands currently prevent concurrent access to the repository, this is also something you might want to take into account.
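
One practical consequence of that exclusive access: a prune that dies mid-run, as in the crash above, typically leaves a stale lock in the repository. Once you are sure no other restic process is still running, it can be removed with restic’s unlock command:

restic unlock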