Pytorch : W ParallelNative.cpp:206

I'm trying to use a pre-trained model on my own image set, following this tutorial: https://pytorch.org/tutorials/beginner/finetuning_torchvision_models_tutorial.html

However, I always get this warning when I run my code, and the console locks up:

[W ParallelNative.cpp:206] Warning: Cannot set number of intraop threads after parallel work has started or after set_num_threads call when using native parallel backend (function set_num_threads)

Thank you in advance for your help,

Aboriginal answered 10/11, 2020 at 15:59 Comment(2)
Please update with the whole error message, in text format. – Ascanius
Nearly 3 years later, this is still not solved. The PyTorch data loader does not work on Mac M1 for num_workers > 0. Setting multiprocessing_context in the data loader to spawn or fork does not help either. – Corona
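
For context, the attempt mentioned in the comment above would look roughly like this; the dataset here is a hypothetical placeholder, and multiprocessing_context is the standard DataLoader argument for choosing the worker start method:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical placeholder dataset.
dataset = TensorDataset(torch.randn(8, 3, 224, 224), torch.randint(0, 2, (8,)))

# Forcing the worker start method ("spawn" or "fork"), which the comment above
# reports does not resolve the warning on Mac M1 when num_workers > 0.
loader = DataLoader(dataset, batch_size=4, num_workers=2,
                    multiprocessing_context="spawn")
```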

I have the same problem. Mac, Python 3.6 (it also reproduces on 3.8), PyTorch 1.7.

It seems that when this warning appears, the dataloaders don't (or can't) use parallel computing. You can silence the warning (this will not fix the underlying problem) in two ways; a minimal sketch of both follows the list.

  1. If you can access your dataloaders, set num_workers=0 when creating them.
  2. Set the environment variable OMP_NUM_THREADS=1 (e.g. export OMP_NUM_THREADS=1 in your shell) before starting the script.
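
A minimal sketch of both workarounds, assuming a toy stand-in dataset; note that OMP_NUM_THREADS must be set before torch is imported (or exported in the shell before launching the script):

```python
import os

# Workaround 2: limit intraop threads. This must happen before torch is
# imported (equivalent to `export OMP_NUM_THREADS=1` in the shell).
os.environ["OMP_NUM_THREADS"] = "1"

import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset standing in for the image set from the tutorial.
dataset = TensorDataset(torch.randn(100, 3, 224, 224),
                        torch.randint(0, 2, (100,)))

# Workaround 1: num_workers=0 keeps data loading in the main process, which
# avoids the ParallelNative warning at the cost of parallel loading.
loader = DataLoader(dataset, batch_size=8, shuffle=True, num_workers=0)

for images, labels in loader:
    pass  # the training step from the tutorial would go here
```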

Again, both workarounds disable parallel data loading and may slow it down (and therefore slow down training). I look forward to an efficient solution or a patch in PyTorch 1.7.

Designation answered 16/11, 2020 at 9:31 Comment(1)
Me neither, looking for another workaround. – Dexterous
