RE: Deoldify Vapoursynth filter - Dan64 - 26.09.2024
For dlib see this post: #13
For mmvc see this post: #1
For vs-ddcolor install: https://github.com/dan64/vs-deoldify/releases/download/v4.0.0/vsddcolor-1.0.1-py3-none-any.whl
For spatial_correlation see post: #612
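(A minimal install sketch, assuming the pip bundled with the Hybrid Vapoursynth Python is used; pip can install the vs-ddcolor wheel straight from the release URL, and the wheels from the posts above the same way once downloaded. The local path below is just a placeholder:
python -m pip install https://github.com/dan64/vs-deoldify/releases/download/v4.0.0/vsddcolor-1.0.1-py3-none-any.whl
python -m pip install path\to\downloaded_wheel.whl
)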
Dan
P.S.
But with this installation, is ColorMNet working?
RE: Deoldify Vapoursynth filter - Selur - 26.09.2024
Quote:For dlib see this post: #13
For mmvc see this post: #1
For vs-ddcolor install: https://github.com/dan64/vs-deoldify/releases/download/v4.0.0/vsddcolor-1.0.1-py3-none-any.whl
For spatial_correlation see post: #612
Those are the files I used. (I checked: I did use that vs-ddcolor version and made a mistake when writing the post above.)
Quote:But with this installation ColorMNet is working ?
No, I get the exact same error when using ColorMNet.
Using:
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
I get:
2024-09-26 18:11:54.387
F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\kornia\feature\lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
@torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\kornia\feature\lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
@torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\deepex\models\vgg19_gray.py:130: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
model.load_state_dict(torch.load(vgg19_gray_path))
F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\deepex\models\vgg19_gray.py:130: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
model.load_state_dict(torch.load(vgg19_gray_path))
2024-09-26 18:12:10.191
Failed to evaluate the script:
Python exception: DLL load failed while importing spatial_correlation_sampler_backend: Die angegebene Prozedur wurde nicht gefunden.
Traceback (most recent call last):
File "src\\cython\\vapoursynth.pyx", line 3387, in vapoursynth._vpy_evaluate
File "src\\cython\\vapoursynth.pyx", line 3388, in vapoursynth._vpy_evaluate
File "J:\tmp\tempPreviewVapoursynthFile18_11_49_389.vpy", line 45, in
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\__init__.py", line 297, in HAVC_main
clip_colored = HAVC_deepex(clip=clip, clip_ref=clip_ref, method=DeepExMethod, render_speed=DeepExPreset,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\__init__.py", line 574, in HAVC_deepex
clip_colored = vs_colormnet(clip, clip_ref, image_size=-1, enable_resize=enable_resize,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\vsslib\vsmodels.py", line 37, in vs_colormnet
return vs_colormnet_batch(clip, clip_ref, image_size, enable_resize, frame_propagate, max_memory_frames)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\__init__.py", line 199, in vs_colormnet_batch
colorizer = colormnet_colorizer(image_size=image_size, vid_length=vid_length, enable_resize=enable_resize,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\__init__.py", line 45, in colormnet_colorizer
return ColorMNetRender(image_size=image_size, vid_length=vid_length, enable_resize=enable_resize,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\colormnet_render.py", line 83, in __init__
self._colorize_init(image_size, vid_length, propagate)
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\colormnet_render.py", line 137, in _colorize_init
self.network = ColorMNet(self.config, self.config['model']).cuda().eval()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\model\network.py", line 37, in __init__
self.short_term_attn = LocalGatedPropagation(d_qk=64, # 256
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\model\attention.py", line 763, in __init__
from spatial_correlation_sampler import SpatialCorrelationSampler
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\spatial_correlation_sampler\__init__.py", line 1, in
from .spatial_correlation_sampler import SpatialCorrelationSampler, spatial_correlation_sample
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\spatial_correlation_sampler\spatial_correlation_sampler.py", line 6, in
import spatial_correlation_sampler_backend as correlation
ImportError: DLL load failed while importing spatial_correlation_sampler_backend: Die angegebene Prozedur wurde nicht gefunden.
I have a backup of an old environment, but not being able to get it working with Vapoursynth R70 is a problem.
Cu Selur
RE: Deoldify Vapoursynth filter - Dan64 - 26.09.2024
I don't think that R70 is the problem.
Can you try updating only Vapoursynth to R70, without reinstalling the packages?
I noticed that there are two setuptools versions, v70 and v75; I'm not sure which one is installed, but you should use the version released in my mmvc archive (v70).
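(A quick way to check which setuptools is actually installed, and to pin it back to a v70 release if needed; just a sketch, the exact v70 build shipped in the mmvc archive may differ from the one PyPI serves:
python -m pip show setuptools
python -m pip install "setuptools==70.*"
)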
Right now I'm busy fixing the GPU memory problem in ColorMNet; I hope to be able to solve it.
Once that is solved, I will look at this one.
Thanks,
Dan
RE: Deoldify Vapoursynth filter - Selur - 26.09.2024
No hurry, the memory issue is more important.
Correct: using the old working setup, copying R70 into it and installing the Vapoursynth whl file, ColorMNet is still working.
But I have no clue where exactly the problem is.
(Using different Python versions isn't the problem either; it's probably somehow related to the CUDA versions.)
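(A small diagnostic sketch to compare the two setups, printing the PyTorch build and the CUDA/cuDNN versions it was compiled against; run it with the Hybrid Vapoursynth Python in both environments:
import torch
print("torch:", torch.__version__)               # e.g. stable release vs. cu124 nightly
print("built for CUDA:", torch.version.cuda)     # CUDA toolkit the wheel targets
print("cuDNN:", torch.backends.cudnn.version())  # cuDNN bundled with the wheel
)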
Will do some more testing tomorrow.
Cu Selur
RE: Deoldify Vapoursynth filter - Selur - 27.09.2024
It's:
python -m pip install --pre -U torch torchvision torch_tensorrt --index-url https://download.pytorch.org/whl/nightly/cu124 --extra-index-url https://pypi.nvidia.com
that's breaking DeOldify.
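If the cu124 nightly is what breaks the prebuilt spatial_correlation_sampler backend, a possible (untested) rollback sketch is to force-reinstall the stable cu121 wheels and see whether the error goes away; whether torch_tensorrt then still needs the nightly index is a separate question:
python -m pip install --force-reinstall torch torchvision --index-url https://download.pytorch.org/whl/cu121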
Cu Selur
RE: Deoldify Vapoursynth filter - Dan64 - 27.09.2024
Hello Selur,
Good news! I solved the problem regarding the GPU memory usage of ColorMNet when it runs inside the Vapoursynth environment.
I wrote a frame server that runs in a separate thread and a different environment. This way the GPU usage is very low (<2 GB even with 20000 frames).
This is an innovative approach that allows running, inside Vapoursynth, rendering programs that are too complex to be executed directly in the Vapoursynth environment.
I tried to keep the changes small, so I added only one new parameter, server_port, which represents the network port used to exchange data between Vapoursynth and the remote frame server.
For the moment I decided to keep the 3 encoding modes of ColorMNet; the only difference is that mode 0 (previously named "batch") has been renamed to "remote" and represents the new encoding mode that uses the remote frame server. This mode is now the default encoding mode for ColorMNet.
I tested it and it seems quite robust; in any case I decided to keep mode 1 (async) and mode 2 (sync) in case there are problems using the "remote" mode.
Given that the new "remote" mode allows monitoring the progress, there is no reason to implement a progress monitor for the "sync" mode, which will probably be removed once the "remote" mode has been fully tested on different environments and PCs and is considered robust.
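(A usage sketch based on the description above, reusing the HAVC_main call from the earlier posts and only varying DeepExEncMode; server_port is left at its default because its exact placement isn't shown here:
# assumes the usual script setup: from vsdeoldify import HAVC_main, clip already loaded
# DeepExEncMode: 0 = "remote" (new default, uses the remote frame server), 1 = async, 2 = sync
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
# fallback if the "remote" mode causes problems on a given setup:
# clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=1, DeepExMaxMemFrames=0)
)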
I think this is the first Vapoursynth filter developed in this way, and I prefer to take time to test it.
Please try it and let me know what you think.
Thanks,
Dan
P.S.
Attached is the new RC17 with the new "remote" rendering.
RE: Deoldify Vapoursynth filter - Selur - 28.09.2024
Congratulations!
Happy you found a solution.
Quote:the only difference is that mode 0 (previously named "batch") has been renamed to "remote"
I adjusted Hybrid and uploaded a new deoldify test version for you.
Quote:I added only one new parameter, server_port, which represents the network port used to exchange data between Vapoursynth and the remote frame server.
Okay. Hybrid will not expose this option in the GUI, since I don't think a user will/should change it. Maybe I will add it at a later point in time if there is a good reason for it.
Quote:This mode is now the default encoding mode for ColorMNet.
Setting a rather untested option as 'default' seems like a bad idea.
Quote:Given that the new "remote" mode allows monitoring the progress, there is no reason to implement a progress monitor for the "sync" mode, which will probably be removed once the "remote" mode has been fully tested on different environments and PCs and is considered robust.
Okay.
Side note: Some firewalls or antivirus software will probably falsely identify the communication as malicious and stop it (worst case: without reporting), but that is 'okay' and kind of unavoidable.
---------------
Quote:I think this is the first Vapoursynth filter developed in this way, and I prefer to take time to test it.
No problem, won't include it in a dev version until you say so.
--------------
Did a quick test (in my old R68 setup) and I can confirm that:
a. it does work
b. VRAM consumption for:
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
is ~6.3-8 GB, independent of the source resolution.
Side note: ColorMNet isn't good with nature content either.
------------------
General questions (too lazy to read the whole code):
- Are 'Ref merge' and 'Ref Frame Dir' used when the method is 'HAVC'?
- Is 'Ref Merge' somehow used when 'Ref Frame Dir' isn't set?
Cu Selur
RE: Deoldify Vapoursynth filter - Dan64 - 28.09.2024
(28.09.2024, 06:21)Selur Wrote: - Are 'Ref merge' and 'Ref Frame Dir' used when the method is 'HAVC'?
- Is 'Ref Merge' somehow used when 'Ref Frame Dir' isn't set?
The answer to both questions is NO.
(27.09.2024, 20:50)Selur Wrote: python -m pip install --pre -U torch torchvision torch_tensorrt --index-url https://download.pytorch.org/whl/nightly/cu124 --extra-index-url https://pypi.nvidia.com
that's breaking DeOldify.
When I compiled the Pytorch-Correlation-extension package, I used the following conda command to set up PyTorch in the colormnet environment:
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
see post: Questions regarding the code
But it is strange; I would expect the package to work with any CUDA 12.x installation.
Dan
RE: Deoldify Vapoursynth filter - Selur - 28.09.2024
Yeah, according to semantic versioning, minor version changes add functionality in a backward compatible manner (so they could also include patch version changes and fix bugs, but always backward compatible).
No clue why it isn't compatible.
Cu Selur
RE: Deoldify Vapoursynth filter - Selur - 28.09.2024
As a side note: I just noticed that the dlib wheel file also does not work.
2024-09-28 21:41:26.603
F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vscodeformer\__init__.py:98: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
module.load_state_dict(torch.load(model_path, map_location="cpu")["params_ema"])
F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vscodeformer\__init__.py:98: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
module.load_state_dict(torch.load(model_path, map_location="cpu")["params_ema"])
2024-09-28 21:41:26.826
Failed to evaluate the script:
Python exception: cannot access local variable 'dlib' where it is not associated with a value
Traceback (most recent call last):
File "src\\cython\\vapoursynth.pyx", line 3387, in vapoursynth._vpy_evaluate
File "src\\cython\\vapoursynth.pyx", line 3388, in vapoursynth._vpy_evaluate
File "J:\tmp\tempPreviewVapoursynthFile21_41_23_246.vpy", line 73, in
clip = CodeFormer(clip=clip, upscale=1, detector=1, weight=1.000, num_streams=3) # 720x576
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vscodeformer\__init__.py", line 103, in codeformer
FaceRestoreHelper(upscale, det_model=detection_model, use_parse=True, device=device) for _ in range(num_streams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vscodeformer\face_restoration_helper.py", line 113, in __init__
self.face_detector, self.shape_predictor_5 = self.init_dlib()
^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vscodeformer\face_restoration_helper.py", line 173, in init_dlib
face_detector = dlib.cnn_face_detection_model_v1(detection_path)
^^^^
UnboundLocalError: cannot access local variable 'dlib' where it is not associated with a value
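The UnboundLocalError likely masks the real failure: the import of dlib inside init_dlib presumably failed and was swallowed, so the name was never bound. A small diagnostic sketch to surface the actual import error and check whether the wheel is a CUDA build:
# run with the Hybrid Vapoursynth Python
import dlib                               # should raise the real ImportError if the wheel is broken
print("dlib:", dlib.__version__)
print("CUDA build:", dlib.DLIB_USE_CUDA)  # True only for a CUDA-enabled wheel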
I've got some time tomorrow and will do more testing of my environment.
=> moved to dlib cuda thread, to not spam this here