Hybrid 2022.03.20.1: No module named 'vsdpir'
#44
I think that the part regarding the VapourSynth + pip install and the related modules should produce the same output as the file "Hybrid_torch_addon.7z" that you sent me (with the exception of the vsgan_models folder). So I think I can skip these steps (unless you think your archive is not reliable enough for this test).

Now to create my CUDA setup I used the following files:

cuda_11.4.3_472.50_win10.exe
cudnn-11.4-windows-x64-v8.2.2.26.zip
TensorRT-8.0.3.4.Windows10.x86_64.cuda-11.3.cudnn8.2.zip

and, as I already wrote, with this setup vs-dpir and onnxruntime are working perfectly.
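
For reference, a minimal check could look like this (a sketch, assuming the installed vsdpir module exposes a DPIR() function, as in HolyWu's vs-dpir; the exact parameters may differ between versions):

# Minimal smoke test for vs-dpir (adjust the DPIR() call to the installed version).
import vapoursynth as vs
from vsdpir import DPIR

core = vs.core

# A blank RGBS clip is enough for a smoke test, no source file needed.
clip = core.std.BlankClip(format=vs.RGBS, width=640, height=360, length=10)

# If CUDA/cuDNN (and TensorRT, where used) are set up correctly this runs on the GPU;
# otherwise it fails with an error about missing providers/DLLs.
clip = DPIR(clip, strength=15, task='deblock')
clip.set_output()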

Theoretically, it should be enough to:

delete the env variables CUDA_PATH and CUDA_PATH_V11_4
rename "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.4" to "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.4_fake" (just to avoid uninstalling CUDA)
reboot the PC

then, just to be sure, copy all the DLLs from "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.4_fake\bin" into "D:\Programs\Hybrid\64bit\Vapoursynth\Lib\site-packages\onnxruntime\capi"
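
To double-check the result of these steps, a quick sanity check like this could be used (the paths are the ones of this setup, adjust them if different):

# Sanity check: CUDA env variables removed and DLLs copied next to onnxruntime.
import os
import glob

# After deleting CUDA_PATH / CUDA_PATH_V11_4 and rebooting, these should be unset.
for var in ("CUDA_PATH", "CUDA_PATH_V11_4"):
    print(var, "=", os.environ.get(var, "<not set>"))

# DLLs copied from the renamed CUDA folder into onnxruntime's capi directory.
capi_dir = r"D:\Programs\Hybrid\64bit\Vapoursynth\Lib\site-packages\onnxruntime\capi"
dlls = sorted(glob.glob(os.path.join(capi_dir, "*.dll")))
print(len(dlls), "DLLs found in", capi_dir)
for dll in dlls:
    print(" ", os.path.basename(dll))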

I think I already tested a configuration like this in the past, but I will perform the test again tomorrow.

In the meanwhile, I found the following code in the onnxruntime source tree (file: onnxruntime_pybind_state.cc):

#ifdef USE_CUDA
    // If the environment variable 'ORT_CUDA_UNAVAILABLE' exists, then we do not load cuda. This is set by _ld_preload for the manylinux case
    // as in that case, trying to load the library itself will result in a crash due to the way that auditwheel strips dependencies.
    if (Env::Default().GetEnvironmentVar("ORT_CUDA_UNAVAILABLE").empty()) {
      if (auto* cuda_provider_info = TryGetProviderInfo_CUDA()) {
        const CUDAExecutionProviderInfo info = GetCudaExecutionProviderInfo(cuda_provider_info,
                                                                            provider_options_map);

        // This variable is never initialized because the APIs by which it should be initialized are deprecated, however they still
        // exist and are in-use. Nevertheless, it is used to return CUDAAllocator, hence we must try to initialize it here if we can
        // since FromProviderOptions might contain external CUDA allocator.
        external_allocator_info = info.external_allocator_info;
        return cuda_provider_info->CreateExecutionProviderFactory(info)->CreateProvider();
      } else {
        if (!Env::Default().GetEnvironmentVar("CUDA_PATH").empty()) {
          ORT_THROW("CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/reference/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.");
        }
      }
    }
    LOGS_DEFAULT(WARNING) << "Failed to create " << type << ". Please reference https://onnxruntime.ai/docs/reference/execution-providers/CUDA-ExecutionProvider.html#requirements to ensure all dependencies are met.";
#endif

At first glance it looks as if the CUDA library is not loaded when the environment variable CUDA_PATH is not found, but actually the module always tries to initialize CUDA as long as the environment variable ORT_CUDA_UNAVAILABLE is not defined; CUDA_PATH is only checked afterwards, to decide whether to raise an error when loading fails.
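
Based on that snippet, the behaviour should be easy to verify from Python (a sketch, assuming some small ONNX model file is available; "model.onnx" below is just a placeholder):

# Check which execution providers onnxruntime actually activates.
# Setting ORT_CUDA_UNAVAILABLE (e.g. in the shell before launching Python)
# should make onnxruntime skip loading CUDA entirely and fall back to CPU.
import onnxruntime as ort

sess = ort.InferenceSession(
    "model.onnx",  # placeholder, any small ONNX model will do
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# If CUDA was loaded, 'CUDAExecutionProvider' is listed first;
# otherwise only 'CPUExecutionProvider' remains (with the warning quoted above).
print(sess.get_providers())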