badger: Potential deadlock in value log
Saw this happening; the relevant goroutine stacks are below. It could be caused by some other bug (move keys), or by improper log-file locking in the value log.
```
sync.runtime_Semacquire(0xc4672ce6d8)
	/usr/lib/go-1.10/src/runtime/sema.go:56 +0x39
sync.(*RWMutex).Lock(0xc4672ce6d0)
	/usr/lib/go-1.10/src/sync/rwmutex.go:98 +0x6e
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*logFile).doneWriting(0xc4672ce6c0, 0x5bb227e, 0x0, 0x0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/value.go:156 +0x7c
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*valueLog).write.func1(0x11b7c40, 0xc48529ff50)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/value.go:854 +0x289
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*valueLog).write(0xc4201b39c8, 0xc48eecb000, 0x753, 0x900, 0x0, 0x0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/value.go:898 +0x36e
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*DB).writeRequests(0xc4201b3880, 0xc48eecb000, 0x753, 0x900, 0xc47d1ce600, 0xc4822c3180)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/db.go:585 +0xfe
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*DB).doWrites.func1(0xc48eecb000, 0x753, 0x900)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/db.go:651 +0x55
created by github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*DB).doWrites
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/db.go:700 +0x327
```

---

```
goroutine 6030265 [semacquire, 17 minutes]:
sync.runtime_Semacquire(0xc4672ce6dc)
	/usr/lib/go-1.10/src/runtime/sema.go:56 +0x39
sync.(*RWMutex).RLock(0xc4672ce6d0)
	/usr/lib/go-1.10/src/sync/rwmutex.go:50 +0x49
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*valueLog).getFileRLocked(0xc4201b39c8, 0x12, 0x0, 0x0, 0x0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/value.go:919 +0xc6
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*valueLog).readValueBytes(0xc4201b39c8, 0x5b00000012, 0xc400adc6e1, 0xc4a35c9ac0, 0xdad86c, 0x81e, 0x2, 0x0, 0x9, 0x4)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/value.go:944 +0x40
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*valueLog).Read(0xc4201b39c8, 0x5b00000012, 0xadc6e1, 0xc4a35c9ac0, 0xc466316d50, 0x7f051aaa0109, 0xf, 0xc4716f8e00, 0xc47d13ea58, 0xad9250)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/value.go:933 +0x8b
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*Item).yieldItemValue(0xc4c5a1a840, 0x0, 0xc4915da000, 0xb, 0x7f06f5bc16c8, 0x0, 0xc47d13eb70)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/iterator.go:162 +0x16b
github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger.(*Item).Value(0xc4c5a1a840, 0x17, 0x20, 0xc478f37740, 0x17, 0x20)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/vendor/github.com/dgraph-io/badger/iterator.go:103 +0x63
github.com/dgraph-io/dgraph/posting.ReadPostingList(0xc478f37740, 0x17, 0x20, 0xc4bd598f50, 0x7f06f5bc16c8, 0x0, 0x8b3f12)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/posting/mvcc.go:411 +0x22d
github.com/dgraph-io/dgraph/worker.(*grpcWorker).PredicateAndSchemaData.func2(0xc478f37740, 0x17, 0x20, 0xc4bd598f50, 0x17, 0xc478f37740, 0x0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/worker/predicate.go:253 +0x28f
github.com/dgraph-io/dgraph/worker.(*streamLists).produceKVs.func1(0xc4406e1d70, 0x10, 0x10, 0x0, 0x0, 0x0, 0x0, 0x0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/worker/stream_lists.go:150 +0x306
github.com/dgraph-io/dgraph/worker.(*streamLists).produceKVs(0xc46aa1b320, 0x13103a0, 0xc46aa1af90, 0xc4386378c0, 0xc43b007c20, 0xc43b3a2060, 0x0, 0xc4865a7fb0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/worker/stream_lists.go:169 +0x233
github.com/dgraph-io/dgraph/worker.(*streamLists).orchestrate.func1(0xc4406e1ae0, 0xc46aa1b320, 0x13103a0, 0xc46aa1af90, 0xc4386378c0, 0xc43b007c20, 0xc43b3a2060, 0xc43b3a20c0)
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/worker/stream_lists.go:54 +0x94
created by github.com/dgraph-io/dgraph/worker.(*streamLists).orchestrate
	/home/mrjn/go/src/github.com/dgraph-io/dgraph/worker/stream_lists.go:52 +0x19a
```
About this issue
- State: closed
- Created 6 years ago
- Comments: 15 (9 by maintainers)
Sorry @manishrjain, I responded too early. The issue on my side is still there. At some point Badger tries to close a vlog file and stays waiting for the mutex lock. I get no error or panic…
As this operation is supposed to run in a single goroutine, I don't see where the contention could be. Unsurprisingly, `-race` reported nothing either. Setting GOMAXPROCS to 1 sometimes resolves the issue, but it really depends… adding some delays or heavy logging in the code also resolves it, sometimes. It looks like a race, but I have no clue… and it works with 1.4.0.