27.03.2024, 17:58
(27.03.2024, 16:40)zspeciman Wrote: The interesting thing is, if I Enable DDColor by itself, it works. If I enable DeOldify by itself with DDColor Combine=DDColor Only, and everything else unchecked it doesn't work. Aren't these
In the log that you provided there is this line
# adding colors using DeOldify
from vsdeoldify import ddeoldify
clip = ddeoldify(clip=clip, model=0, sat=[1.00,1.00], hue=[0.00,0.00], dd_method=0, dd_weight=0.50, dd_model=1)
dd_method=0 activates DeOldify only, so you probably selected "DDColor Combine=DeOldify Only".
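In other words, dd_method appears to act as a switch between the two colorizers. A minimal sketch of that dispatch, with illustrative names only (this is not the filter's actual internal code):

```python
# Hedged sketch of the dd_method dispatch described above:
# 0 = DeOldify only (the path shown in the log); other values
# presumably bring DDColor into play. Names are illustrative.
def selected_colorizers(dd_method: int) -> list:
    if dd_method == 0:
        return ["DeOldify"]              # matches the logged call
    return ["DeOldify", "DDColor"]       # assumption for dd_method != 0
```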
But the error is reported in __init__.py of DDColor.
The only lines in the filter that can load that module are the following:
os.environ["CUDA_MODULE_LOADING"] = "LAZY"
os.environ["NUMEXPR_MAX_THREADS"] = "8"
from vsddcolor import ddcolor
In the file __init__.py of DDColor there are the following lines
from __future__ import annotations
import os
from threading import Lock
import kornia
import numpy as np
import torch
import torch.nn.functional as F
import vapoursynth as vs
from .ddcolor_arch import DDColor
__version__ = "1.0.0"
os.environ["CUDA_MODULE_LOADING"] = "LAZY"
model_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), "models")
The failure is reported in the torch import.
But if that were the case, stand-alone DDColor should not work either.
I'm sorry, but I can't reproduce this problem.
Upgrading your GPU to an RTX 20/30 series card would probably help.
Dan
In the next release, 3.0.0, I will delay the import of vsddcolor so that it is loaded only when really necessary.
Maybe this way you will be able to use DeOldify.
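A deferred import along those lines could be sketched like this (the module and environment-variable names follow the snippets quoted above; the surrounding function is illustrative, not the actual 3.0.0 code):

```python
# Sketch of a lazy import: vsddcolor (and therefore torch) is only
# loaded when DDColor is actually requested, so a broken torch/DDColor
# install cannot break the DeOldify-only path. Illustrative only.
def apply_colorizer(clip, dd_method=0):
    if dd_method == 0:
        # DeOldify-only path: vsddcolor is never imported
        return clip  # placeholder for the DeOldify pipeline
    import os
    os.environ["CUDA_MODULE_LOADING"] = "LAZY"
    from vsddcolor import ddcolor  # heavy torch import happens only here
    return ddcolor(clip)
```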
Dan