tower: Panicked at 'missing cancelation' in tower-ready-cache

My code just hit this .expect in tower-ready-cache. It’s using tower-balance::pool (and thus also tower-balance::p2c). I’m not sure what caused it, but figured maybe @olix0r can tease it out from the backtrace?

thread 'tokio-runtime-worker' panicked at 'missing cancelation', /home/ubuntu/.cargo/registry/src/github.com-1ecc6299db9ec823/tower-ready-cache-0.3.0/src/cache.rs:236:37
stack backtrace:
   0: backtrace::backtrace::libunwind::trace
             at /cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.40/src/backtrace/libunwind.rs:88
   1: backtrace::backtrace::trace_unsynchronized
             at /cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.40/src/backtrace/mod.rs:66
   2: std::sys_common::backtrace::_print_fmt
             at src/libstd/sys_common/backtrace.rs:77
   3: <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt
             at src/libstd/sys_common/backtrace.rs:59
   4: core::fmt::write
             at src/libcore/fmt/mod.rs:1052
   5: std::io::Write::write_fmt
             at src/libstd/io/mod.rs:1428
   6: std::sys_common::backtrace::_print
             at src/libstd/sys_common/backtrace.rs:62
   7: std::sys_common::backtrace::print
             at src/libstd/sys_common/backtrace.rs:49
   8: std::panicking::default_hook::{{closure}}
             at src/libstd/panicking.rs:204
   9: std::panicking::default_hook
             at src/libstd/panicking.rs:224
  10: std::panicking::rust_panic_with_hook
             at src/libstd/panicking.rs:470
  11: rust_begin_unwind
             at src/libstd/panicking.rs:378
  12: core::panicking::panic_fmt
             at src/libcore/panicking.rs:85
  13: core::option::expect_failed
             at src/libcore/option.rs:1203
  14: core::option::Option<T>::expect
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libcore/option.rs:347
  15: tower_ready_cache::cache::ReadyCache<K,S,Req>::poll_pending
             at ./.cargo/registry/src/github.com-1ecc6299db9ec823/tower-ready-cache-0.3.0/src/cache.rs:236
  16: tower_balance::p2c::service::Balance<D,Req>::promote_pending_to_ready
             at ./.cargo/registry/src/github.com-1ecc6299db9ec823/tower-balance-0.3.0/src/p2c/service.rs:151
  17: <tower_balance::p2c::service::Balance<D,Req> as tower_service::Service<Req>>::poll_ready
             at ./.cargo/registry/src/github.com-1ecc6299db9ec823/tower-balance-0.3.0/src/p2c/service.rs:238
  18: <tower_balance::pool::Pool<MS,Target,Req> as tower_service::Service<Req>>::poll_ready
             at ./.cargo/registry/src/github.com-1ecc6299db9ec823/tower-balance-0.3.0/src/pool/mod.rs:373
  19: <tower_buffer::worker::Worker<T,Request> as core::future::future::Future>::poll
             at ./.cargo/registry/src/github.com-1ecc6299db9ec823/tower-buffer-0.3.0/src/worker.rs:169
  20: tokio::task::core::Core<T>::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/core.rs:128
  21: tokio::task::harness::Harness<T,S>::poll::{{closure}}::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/harness.rs:119
  22: core::ops::function::FnOnce::call_once
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libcore/ops/function.rs:232
  23: <std::panic::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panic.rs:318
  24: std::panicking::try::do_call
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panicking.rs:303
  25: __rust_maybe_catch_panic
             at src/libpanic_unwind/lib.rs:86
  26: std::panicking::try
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panicking.rs:281
  27: std::panic::catch_unwind
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panic.rs:394
  28: tokio::task::harness::Harness<T,S>::poll::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/harness.rs:100
  29: tokio::loom::std::causal_cell::CausalCell<T>::with_mut
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/loom/std/causal_cell.rs:41
  30: tokio::task::harness::Harness<T,S>::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/harness.rs:99
  31: tokio::task::raw::RawTask::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/raw.rs:113
  32: tokio::task::Task<S>::run
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/mod.rs:381
  33: tokio::runtime::thread_pool::worker::GenerationGuard::run_task
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/worker.rs:459
  34: tokio::runtime::thread_pool::worker::GenerationGuard::process_available_work
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/worker.rs:317
  35: tokio::runtime::thread_pool::worker::GenerationGuard::run
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/worker.rs:282
  36: tokio::runtime::thread_pool::worker::Worker::run::{{closure}}::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/worker.rs:160
  37: std::thread::local::LocalKey<T>::try_with
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/thread/local.rs:262
  38: std::thread::local::LocalKey<T>::with
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/thread/local.rs:239
  39: tokio::runtime::thread_pool::worker::Worker::run::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/worker.rs:136
  40: tokio::runtime::thread_pool::current::set::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/current.rs:47
  41: std::thread::local::LocalKey<T>::try_with
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/thread/local.rs:262
  42: std::thread::local::LocalKey<T>::with
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/thread/local.rs:239
  43: tokio::runtime::thread_pool::current::set
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/current.rs:29
  44: tokio::runtime::thread_pool::worker::Worker::run
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/worker.rs:132
  45: tokio::runtime::thread_pool::Workers::spawn::{{closure}}::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/thread_pool/mod.rs:113
  46: <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/blocking/task.rs:30
  47: tokio::task::core::Core<T>::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/core.rs:128
  48: tokio::task::harness::Harness<T,S>::poll::{{closure}}::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/harness.rs:119
  49: core::ops::function::FnOnce::call_once
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libcore/ops/function.rs:232
  50: <std::panic::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panic.rs:318
  51: std::panicking::try::do_call
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panicking.rs:303
  52: __rust_maybe_catch_panic
             at src/libpanic_unwind/lib.rs:86
  53: std::panicking::try
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panicking.rs:281
  54: std::panic::catch_unwind
             at /rustc/fc23a81831d5b41510d3261c20c34dd8d32f0f31/src/libstd/panic.rs:394
  55: tokio::task::harness::Harness<T,S>::poll::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/harness.rs:100
  56: tokio::loom::std::causal_cell::CausalCell<T>::with_mut
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/loom/std/causal_cell.rs:41
  57: tokio::task::harness::Harness<T,S>::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/harness.rs:99
  58: tokio::task::raw::RawTask::poll
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/raw.rs:113
  59: tokio::task::Task<S>::run
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/task/mod.rs:381
  60: tokio::runtime::blocking::pool::run_task
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/blocking/pool.rs:311
  61: tokio::runtime::blocking::pool::Inner::run
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/blocking/pool.rs:230
  62: tokio::runtime::blocking::pool::Spawner::spawn_thread::{{closure}}::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/blocking/pool.rs:210
  63: tokio::runtime::context::enter
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/context.rs:72
  64: tokio::runtime::handle::Handle::enter
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/handle.rs:34
  65: tokio::runtime::blocking::pool::Spawner::spawn_thread::{{closure}}
             at ./.cargo/git/checkouts/tokio-377c595163f99a10/3fb217f/tokio/src/runtime/blocking/pool.rs:209

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 31

Most upvoted comments

Yeah, I have looked at the FuturesUnordered impl myself before, and I also do not believe it does any buffering. The receiver notification is the part I’m currently suspicious of too, though it would be crazy if a send on a oneshot were not seen by a receiver polled by the same future. There’s currently some discussion about this on Discord. Alice suggested:

Initially, key K contains a task A with associated cancel tx Atx in the map. A new task B with the same key and cancel tx Btx is added (a toy model of this interleaving follows the list):

  1. self.pending_cancel_txs.insert inserts Btx and takes out Atx
  2. self.pending.push(B) happens
  3. self.poll_pending happens and decides to poll task A
  4. Task A says it’s done, and the cancel tx stored under key K (which is now Btx) is removed along with it.
  5. c.send(()) is called on Atx
  6. self.poll_pending happens again, this time B is polled and says it’s done, but there’s no cancel token in the map.
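
To make the interleaving concrete, here is a toy model of that bookkeeping as a plain HashMap. The key and handle names are made up, and this is not the real tower-ready-cache type or its field names, just the shape of the race:

```rust
use std::collections::HashMap;

fn main() {
    // Maps a key to the cancel handle of the *latest* pending task for that key.
    let mut pending_cancel_txs: HashMap<&str, &str> = HashMap::new();

    // Initially key K is pending as task A, with cancel handle Atx.
    pending_cancel_txs.insert("K", "Atx");

    // Step 1: a new task B for the same key replaces Atx with Btx.
    let displaced = pending_cancel_txs.insert("K", "Btx");
    assert_eq!(displaced, Some("Atx")); // Atx is taken out, to be signalled in step 5.

    // Steps 3 and 4: poll_pending polls the *old* task A, which completes, so the
    // cancel handle stored under key K is removed; but that is now Btx.
    let removed = pending_cancel_txs.remove("K");
    assert_eq!(removed, Some("Btx"));

    // Step 6: task B completes and poll_pending looks up its cancel handle.
    // Nothing is left under key K: this is the `.expect("missing cancelation")`
    // that panics in cache.rs.
    assert!(pending_cancel_txs.remove("K").is_none());
    println!("no cancel handle left for key K when B completes");
}
```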

The un-reordered version of my execution is that the poll_pending in step 3 happened entirely before the push_pending, but steps 1 and 2 got reordered backwards into the preceding poll_pending.

Though that execution seems insane: it would mean that writing to a field through &mut self before yielding, and then reading that same field after yielding, could result in the load not seeing the store.
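
For reference, here is the property being doubted there, written out as a small standalone example (hypothetical, not code from the issue; assumes a recent tokio with the macros and multi-threaded runtime features enabled): a field written through &mut self and a oneshot send performed before a yield must both be observable when the same future is polled again, even if it is re-polled on another worker thread.

```rust
use tokio::sync::oneshot;
use tokio::task::yield_now;

struct State {
    flag: bool,
}

impl State {
    async fn run(&mut self) {
        let (tx, rx) = oneshot::channel::<()>();

        // Store through &mut self and send on the oneshot before yielding.
        self.flag = true;
        tx.send(()).expect("receiver is still alive");

        // Yield; the task may be re-polled on a different worker thread.
        yield_now().await;

        // Both effects from before the yield must be visible here.
        assert!(self.flag);
        rx.await.expect("the send before the yield must be observed");
    }
}

#[tokio::main]
async fn main() {
    let mut state = State { flag: false };
    state.run().await;
    println!("store and oneshot send were both observed after the yield");
}
```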

This is, indeed, a very weird error. I have to admit ignorance of the newer std::future APIs. I would personally want to look at the translation from futures 0.1 to see if there’s anything that changed semantically. Or perhaps there’s a notification race that we haven’t seen before. I’ll refresh myself and see if I can come up with a better theory.