restic: Segfault/error when run under Windows Task Scheduler
EDIT: 2023-01-07: I renamed the issue, as the crash has in the meantime also happened without --compression max. However, it looks like it is only triggered when restic is run via the Windows Task Scheduler.
SOLVED: 2023-01-22: I solved the issue by changing the memory priority. Please see below.
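For reference, the workaround amounts to changing the priority of the scheduled task itself: tasks created by the Windows Task Scheduler default to priority 7, which runs the process below normal CPU priority and with low memory priority. Exporting the task definition, setting the `<Priority>` element to 4 (normal), and re-importing it avoids the low-memory-priority mode. The fragment below is a sketch based on the standard Task Scheduler XML schema; the surrounding trigger/action elements are elided:

```xml
<!-- Exported task definition, e.g. via: schtasks /query /tn "restic backup" /xml -->
<Task xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
  <!-- ... RegistrationInfo, Triggers, Actions etc. unchanged ... -->
  <Settings>
    <!-- Default is 7 (below normal CPU priority, low memory priority).
         4 = normal priority. -->
    <Priority>4</Priority>
  </Settings>
</Task>
```

The modified XML can then be re-registered with `schtasks /create /tn "restic backup" /xml task.xml /f` (task name here is a placeholder).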
Last week, I set up restic on another machine. I did the initial backup without --compression max. After a few manual backups (also without --compression max), I activated my automatic nightly backup script, which uses --compression max. The first automatic run worked without problems. The next night, I got errors.
The backup set consists of about 6.5 million files, many of them very small, which is maybe a bit uncommon. The machine is not some overclocked gaming rig; it is a ProLiant DL325 with a 16-core EPYC 7302P that has been running smoothly for 2 years. So I think we can rule out hardware problems.
Output of restic version
restic 0.14.0 compiled with go1.19 on windows/amd64
How did you run restic exactly?
c:\tools\restic.exe backup --compression max --json -vv --use-fs-snapshot --limit-upload=90 --exclude-caches --iexclude=tmp --iexclude=temp --iexclude=backup-*-restic.txt.bz2 --iexclude=.cache --iexclude=AppData/Local --iexclude=AppData/LocalLow --iexclude=AppData/Roaming/**/Cache --iexclude=AppData/Roaming/Code/* --iexclude=!AppData/Roaming/Code/User --iexclude=AppData/Roaming/Microsoft/Teams --iexclude=AppData/Roaming/Microsoft/Search --iexclude=AppData/Roaming/AMD/*Installer* --iexclude=AppData/Roaming/Python c:\Users c:\own c:\tools e:\Daten
Environment variables:
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
RESTIC_REPOSITORY=s3:https://s3.eu-central-2.wasabisys.com/...
RESTIC_PASSWORD=...
RESTIC_PROGRESS_FPS=0.016666
What backend/server/service did you use to store the repository?
Wasabi S3
Expected behavior
Backup runs without errors
Actual behavior
I ran it 3 times with --compression max. It always croaked after about 650,000-700,000 files.
The messages were not exactly the same, but they always referenced the compression functions:
first output
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xc0000005 code=0x0 addr=0x2b pc=0x15b5cc2]
goroutine 223 [running]:
github.com/klauspost/compress/zstd.(*bestFastEncoder).Encode(0xc00d08a000, 0xc047c51dc0, {0xc0c6626000, 0x20000, 0x800000})
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/enc_best.go:189 +0x282
github.com/klauspost/compress/zstd.(*Encoder).EncodeAll(0xc000118140, {0xc0c6626000, 0x802cb, 0x800000}, {0x0, 0x0, 0x0})
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/encoder.go:603 +0x8ee
github.com/restic/restic/internal/repository.(*Repository).saveAndEncrypt(0xc000224480, {0x1faed90, 0xc0b738be80}, 0x1, {0xc0c6626000, 0x802cb, 0x800000}, {0xe1, 0xf3, 0x38, ...})
/restic/internal/repository/repository.go:403 +0x192
github.com/restic/restic/internal/repository.(*Repository).SaveBlob(0xc000224480, {0x1faed90, 0xc0b738be80}, 0x1, {0xc0c6626000, 0x802cb, 0x800000}, {0x0, 0x0, 0x0, ...}, ...)
/restic/internal/repository/repository.go:826 +0x285
github.com/restic/restic/internal/archiver.(*BlobSaver).saveBlob(0xc0eba663f0?, {0x1faed90?, 0xc0b738be80?}, 0x1?, {0xc0c6626000?, 0x802cb, 0x800000?})
/restic/internal/archiver/blob_saver.go:101 +0x85
github.com/restic/restic/internal/archiver.(*BlobSaver).worker(0x0?, {0x1faed90, 0xc0b738be80}, 0xc0004f2240)
/restic/internal/archiver/blob_saver.go:128 +0x131
github.com/restic/restic/internal/archiver.NewBlobSaver.func1()
/restic/internal/archiver/blob_saver.go:33 +0x29
golang.org/x/sync/errgroup.(*Group).Go.func1()
/home/build/go/pkg/mod/golang.org/x/sync@v0.0.0-20220819030929-7fc1605a5dde/errgroup/errgroup.go:75 +0x64
created by golang.org/x/sync/errgroup.(*Group).Go
/home/build/go/pkg/mod/golang.org/x/sync@v0.0.0-20220819030929-7fc1605a5dde/errgroup/errgroup.go:72 +0xa5
(that’s all, no more error text)
second output
unexpected fault address 0x87cbab
fatal error: fault
[signal 0xc0000005 code=0x0 addr=0x87cbab pc=0x15b89ee]
goroutine 211 [running]:
runtime.throw({0x1dedb99?, 0x0?})
/usr/local/go/src/runtime/panic.go:1047 +0x65 fp=0xc0004ef150 sp=0xc0004ef120 pc=0x11d8a85
runtime.sigpanic()
/usr/local/go/src/runtime/signal_windows.go:261 +0x125 fp=0xc0004ef198 sp=0xc0004ef150 pc=0x11ebf25
encoding/binary.littleEndian.Uint32(...)
/usr/local/go/src/encoding/binary/binary.go:81
github.com/klauspost/compress/zstd.load3232(...)
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/zstd.go:136
github.com/klauspost/compress/zstd.(*bestFastEncoder).Encode.func3(0xa1c0a000?, 0xc0?, 0xa1c0a000?, 0xc0?)
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/enc_best.go:209 +0x6e fp=0xc0004ef208 sp=0xc0004ef198 pc=0x15b89ee
github.com/klauspost/compress/zstd.(*bestFastEncoder).Encode(0xc02f32a000, 0xc077e6a8c0, {0xc0a1c0a000, 0x20000, 0x800000})
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/enc_best.go:229 +0x828 fp=0xc0004ef8d8 sp=0xc0004ef208 pc=0x15b6268
github.com/klauspost/compress/zstd.(*Encoder).EncodeAll(0xc000876640, {0xc0a1c0a000, 0x823de, 0x800000}, {0x0, 0x0, 0x0})
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/encoder.go:603 +0x8ee fp=0xc0004efa88 sp=0xc0004ef8d8 pc=0x15c756e
github.com/restic/restic/internal/repository.(*Repository).saveAndEncrypt(0xc000828780, {0x1faed90, 0xc0e9dfe780}, 0x1, {0xc0a1c0a000, 0x823de, 0x800000}, {0x6, 0x11, 0x93, ...})
/restic/internal/repository/repository.go:403 +0x192 fp=0xc0004efbb8 sp=0xc0004efa88 pc=0x15f5992
github.com/restic/restic/internal/repository.(*Repository).SaveBlob(0xc000828780, {0x1faed90, 0xc0e9dfe780}, 0x1, {0xc0a1c0a000, 0x823de, 0x800000}, {0x0, 0x0, 0x0, ...}, ...)
/restic/internal/repository/repository.go:826 +0x285 fp=0xc0004efd08 sp=0xc0004efbb8 pc=0x15f8c25
github.com/restic/restic/internal/archiver.(*BlobSaver).saveBlob(0xd55f8e1c0982710e?, {0x1faed90?, 0xc0e9dfe780?}, 0xb9?, {0xc0a1c0a000?, 0x823de, 0x7f31a9920b526486?})
/restic/internal/archiver/blob_saver.go:101 +0x85 fp=0xc0004efdd8 sp=0xc0004efd08 pc=0x15621a5
github.com/restic/restic/internal/archiver.(*BlobSaver).worker(0x82bc8503a82727d4?, {0x1faed90, 0xc0e9dfe780}, 0xc0000bc1e0)
/restic/internal/archiver/blob_saver.go:128 +0x131 fp=0xc0004eff48 sp=0xc0004efdd8 pc=0x1562431
github.com/restic/restic/internal/archiver.NewBlobSaver.func1()
/restic/internal/archiver/blob_saver.go:33 +0x29 fp=0xc0004eff78 sp=0xc0004eff48 pc=0x1561dc9
golang.org/x/sync/errgroup.(*Group).Go.func1()
/home/build/go/pkg/mod/golang.org/x/sync@v0.0.0-20220819030929-7fc1605a5dde/errgroup/errgroup.go:75 +0x64 fp=0xc0004effe0 sp=0xc0004eff78 pc=0x1519984
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0004effe8 sp=0xc0004effe0 pc=0x1206e01
created by golang.org/x/sync/errgroup.(*Group).Go
/home/build/go/pkg/mod/golang.org/x/sync@v0.0.0-20220819030929-7fc1605a5dde/errgroup/errgroup.go:72 +0xa5
goroutine 1 [semacquire, 17 minutes]:
runtime.gopark(0x25f754e2512?, 0xc000672fb0?, 0x0?, 0x60?, 0x11ba646?)
/usr/local/go/src/runtime/proc.go:363 +0xd6 fp=0xc000672f30 sp=0xc000672f10 pc=0x11db556
... and so on...
third output
unexpected fault address 0x844a7d
fatal error: fault
[signal 0xc0000005 code=0x0 addr=0x844a7d pc=0x15b89ee]
goroutine 200 [running]:
runtime.throw({0x1dedb99?, 0xab94a137dbb03a8?})
/usr/local/go/src/runtime/panic.go:1047 +0x65 fp=0xc05f6bf150 sp=0xc05f6bf120 pc=0x11d8a85
runtime.sigpanic()
/usr/local/go/src/runtime/signal_windows.go:261 +0x125 fp=0xc05f6bf198 sp=0xc05f6bf150 pc=0x11ebf25
encoding/binary.littleEndian.Uint32(...)
/usr/local/go/src/encoding/binary/binary.go:81
github.com/klauspost/compress/zstd.load3232(...)
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/zstd.go:136
github.com/klauspost/compress/zstd.(*bestFastEncoder).Encode.func3(0x61c9?, 0xc0?, 0x20000?, 0xffffffff?)
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/enc_best.go:209 +0x6e fp=0xc05f6bf208 sp=0xc05f6bf198 pc=0x15b89ee
github.com/klauspost/compress/zstd.(*bestFastEncoder).Encode(0xc03142c000, 0xc099a1cee0, {0xc0debfe000, 0x20000, 0x800000})
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/enc_best.go:222 +0x310 fp=0xc05f6bf8d8 sp=0xc05f6bf208 pc=0x15b5d50
github.com/klauspost/compress/zstd.(*Encoder).EncodeAll(0xc0005ea140, {0xc0debfe000, 0x83ffb, 0x800000}, {0x0, 0x0, 0x0})
/home/build/go/pkg/mod/github.com/klauspost/compress@v1.15.9/zstd/encoder.go:603 +0x8ee fp=0xc05f6bfa88 sp=0xc05f6bf8d8 pc=0x15c756e
github.com/restic/restic/internal/repository.(*Repository).saveAndEncrypt(0xc0005a2300, {0x1faed90, 0xc08866e500}, 0x1, {0xc0debfe000, 0x83ffb, 0x800000}, {0x2, 0x2a, 0x1a, ...})
/restic/internal/repository/repository.go:403 +0x192 fp=0xc05f6bfbb8 sp=0xc05f6bfa88 pc=0x15f5992
github.com/restic/restic/internal/repository.(*Repository).SaveBlob(0xc0005a2300, {0x1faed90, 0xc08866e500}, 0x1, {0xc0debfe000, 0x83ffb, 0x800000}, {0x0, 0x0, 0x0, ...}, ...)
/restic/internal/repository/repository.go:826 +0x285 fp=0xc05f6bfd08 sp=0xc05f6bfbb8 pc=0x15f8c25
github.com/restic/restic/internal/archiver.(*BlobSaver).saveBlob(0x10?, {0x1faed90?, 0xc08866e500?}, 0x68?, {0xc0debfe000?, 0x83ffb, 0x87b10bb7ee3671ba?})
/restic/internal/archiver/blob_saver.go:101 +0x85 fp=0xc05f6bfdd8 sp=0xc05f6bfd08 pc=0x15621a5
github.com/restic/restic/internal/archiver.(*BlobSaver).worker(0xc06a6b2000?, {0x1faed90, 0xc08866e500}, 0xc0007ae0c0)
/restic/internal/archiver/blob_saver.go:128 +0x131 fp=0xc05f6bff48 sp=0xc05f6bfdd8 pc=0x1562431
github.com/restic/restic/internal/archiver.NewBlobSaver.func1()
/restic/internal/archiver/blob_saver.go:33 +0x29 fp=0xc05f6bff78 sp=0xc05f6bff48 pc=0x1561dc9
golang.org/x/sync/errgroup.(*Group).Go.func1()
/home/build/go/pkg/mod/golang.org/x/sync@v0.0.0-20220819030929-7fc1605a5dde/errgroup/errgroup.go:75 +0x64 fp=0xc05f6bffe0 sp=0xc05f6bff78 pc=0x1519984
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc05f6bffe8 sp=0xc05f6bffe0 pc=0x1206e01
created by golang.org/x/sync/errgroup.(*Group).Go
/home/build/go/pkg/mod/golang.org/x/sync@v0.0.0-20220819030929-7fc1605a5dde/errgroup/errgroup.go:72 +0xa5
goroutine 1 [semacquire, 9 minutes]:
runtime.gopark(0x22a77e92c12?, 0xc000744fb0?, 0x0?, 0x66?, 0x11ba646?)
/usr/local/go/src/runtime/proc.go:363 +0xd6 fp=0xc000744f30 sp=0xc000744f10 pc=0x11db556
... and so on...
After I got the errors, I ran restic prune to remove stale packs, which worked without problems.
Steps to reproduce the behavior
Use --compression max on a machine with lots of small files (?)
Do you have any idea what may have caused this?
It seems to be related to the --compression max option.
As this repository is broken now anyway, I will do another run with --compression max, just to see if it is reproducible (at least on this machine).
Do you have an idea how to solve the issue?
I removed --compression max. Since then I have had two runs without problems.
I wanted to keep an eye on it for a few weeks, and also turn --compression max back on after some time just for testing, and then create the bug report once I had more info/observations. However, today restic check reported some errors, so I did not want to wait any longer.
# /opt/restic/restic check --read-data-subset 1%
using temporary cache in /tmp/restic-check-cache-1005701136
create exclusive lock for repository
load indexes
check all packs
check snapshots, trees and blobs
[4:27] 100.00% 10 / 10 snapshots
read 1.0% of data packs
[11:08] 100.00% 190 / 190 packs
error for tree 32731bc8: decompressing blob 32731bc850ac982f38d812b653bb277b410fef0f42cfd4e7fec8aeacf3f38386 failed: output (131094) bigger than max block size (131072)
Fatal: repository contains errors
# /opt/restic/restic diff --json 8c433dfe 739c167a
error: decompressing blob 32731bc850ac982f38d812b653bb277b410fef0f42cfd4e7fec8aeacf3f38386 failed: output (131094) bigger than max block size (131072)
Besides finding the bug which caused this, I am also interested in whether it is possible to recover this repository somehow.
My guess would be to run restic check --read-data and then run another backup with --force?
Did restic help you today? Did it make you happy in any way?
It still makes me sleep better every night 😃
About this issue
- State: open
- Created 2 years ago
- Comments: 28 (8 by maintainers)
Let’s keep this open for now. If no new error shows up in let’s say a month, then we can close it.