transformers: Can’t run parallel inference

  • transformers version: 4.12.3
  • Platform: Darwin-20.6.0-x86_64-i386-64bit
  • Python version: 3.7.0
  • PyTorch version (GPU?): 1.10.0 (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: parallel

Hi, I get the following warning while using a pipeline:

[W ParallelNative.cpp:214] Warning: Cannot set number of intraop threads after parallel work has started or after set_num_threads call when using native parallel backend (function set_num_threads)

from transformers import pipeline

pipe = pipeline("sentiment-analysis", model=model, padding=True, max_length=200, truncation=True)
results = pipe(texts)
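Not a confirmed fix, but a common workaround for this class of warning is to pin the thread settings before PyTorch is imported, since the native parallel backend can only be configured once per process. A minimal sketch (the thread count 4 is an arbitrary example, not a recommendation):

```python
# Workaround sketch: configure intraop threading BEFORE importing torch.
import os

# These environment variables are read when the backend starts up;
# changing them after parallel work has begun has no effect.
os.environ["OMP_NUM_THREADS"] = "4"   # intraop threads (OpenMP)
os.environ["MKL_NUM_THREADS"] = "4"   # MKL threads, if PyTorch uses MKL

# The torch import must come after the assignments above, e.g.:
#   import torch
#   torch.set_num_threads(4)  # only safe before any parallel work runs
```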

It happened on both models:

  • “distilbert-base-uncased-finetuned-sst-2-english”
  • “cardiffnlp/twitter-roberta-base-sentiment”

Only one CPU core is used!

Any suggestions? Thanks

@Narsil @LysandreJik

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Comments: 15 (8 by maintainers)

Most upvoted comments

Okay, I tried every single thing I could, but I am unable to reproduce.

I am guessing the issue lies in Darwin at this point. I looked into some PyTorch issues that seem relevant (but cannot confirm at this time): https://github.com/pytorch/pytorch/issues/58585 https://github.com/pytorch/pytorch/issues/46409
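For anyone trying to narrow this down, a quick way to check whether PyTorch is really limited to one core is to compare the CPU count the OS reports with the sizes of PyTorch's thread pools. A small diagnostic sketch (the torch calls are shown as comments since this snippet only assumes the standard library):

```python
# Diagnostic sketch: compare OS core count with PyTorch's thread pools.
import os

print("logical CPUs reported by the OS:", os.cpu_count())

# With PyTorch installed, these report the intraop/interop pool sizes:
#   import torch
#   print("intraop threads:", torch.get_num_threads())
#   print("interop threads:", torch.get_num_interop_threads())
```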