Deoldify Vapoursynth filter
Hello Selur,

  I completed the development of vs-deoldify 4.5.0.
  I attached the latest version, RC20 (the last one, I hope).
  Compared to the previous version, an exception is now raised if ColorMNet[remote] cannot be initialized.
  I also improved the scene change detection implementation.
  I spent a lot of time trying the available scene detection implementations, including PySceneDetect and CLIP.
  Unfortunately, none of them was good enough to detect all the scene changes correctly in the clips used for testing.
  If an implementation worked well for one clip, it did not work for another, and vice versa.

  In the end, the Vapoursynth implementation of scene detection turned out to be the most general-purpose and the fastest of the implementations tested.

  So I decided to improve it, using the following approach:

  Run the scene detection with high sensitivity parameters:

  sc_threshold = 0.03
  sc_min_freq = 5

  This way, all the scene changes are detected for sure. The problem is that frames which are actually very similar also get flagged as scene changes.
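
  For example, the first pass could look like the following sketch, which just uses the standard misc.SCDetect filter (sc_min_freq is a vs-deoldify parameter and is not modeled here):

import vapoursynth as vs
core = vs.core

def detect_candidates(gray: vs.VideoNode, sc_threshold: float = 0.03) -> vs.VideoNode:
    # SCDetect tags frames with the _SceneChangePrev/_SceneChangeNext properties;
    # a deliberately low threshold over-detects, so very similar frames are also flagged
    return core.misc.SCDetect(gray, threshold=sc_threshold)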

  To handle this problem I introduced a second scene detection filter based on SSIM (Structural Similarity Index Measure) that analyses the frames selected in the previous step and discards the ones that are too similar. The SSIM computation is quite complex and slow (I used the skimage implementation), but since it is applied only to the selected frames, the impact on encoding speed is low (only about 3% slower).
  To control the SSIM filter I introduced the parameter sc_tht_ssim, which represents the threshold used to discard similar images. Suggested values are between 0.45 and 0.65, with 0.60 being the most effective. By default this parameter is set to 0, so the second-step filter is disabled.
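
  To give an idea of the second pass, here is a minimal sketch using skimage's structural_similarity; the property handling and the keep/discard rule are just my description above turned into code, not the actual vs-deoldify implementation (assumes a GRAY clip and a recent Vapoursynth API4):

import numpy as np
import vapoursynth as vs
from skimage.metrics import structural_similarity

core = vs.core

def ssim_filter_candidates(gray: vs.VideoNode, sc_tht_ssim: float = 0.60) -> list[int]:
    # Keep a candidate flagged by SCDetect only if it is different enough
    # (SSIM below the threshold) from the previous frame.
    kept = []
    for n in range(1, gray.num_frames):
        f = gray.get_frame(n)
        if f.props.get('_SceneChangePrev', 0) != 1:
            continue  # only the frames pre-selected by SCDetect are analysed
        prev = np.asarray(gray.get_frame(n - 1)[0])
        cur = np.asarray(f[0])
        if structural_similarity(prev, cur) < sc_tht_ssim:
            kept.append(n)  # different enough: accept as a real scene change
    return kept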

Dan


Attached Files
.zip   vsdeoldify-4.5.0_RC20.zip (Size: 396,75 KB / Downloads: 22)
Reply
Have you tried the different scene change detection methods in Vapoursynth?
Quote: sc_threshold = 0.03
3% seems really low; I would have expected 10-15% to be a more reasonable value.

I'll try: RC20

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
I attached the new version RC21, where I increased the values of sc_tht_black and sc_tht_white (two parameters not exported to the GUI).

Don't worry about the high sensitivity of sc_threshold. All the similar frames will be discarded by SSIM.
The pre-filter with misc.SCDetect() is necessary only to reduce the number of frames to be analysed by SSIM.

Dan


Attached Files
.zip   vsdeoldify-4.5.0_RC21.zip (Size: 396,76 KB / Downloads: 22)
Reply
As expected when using the R70 (with updated CUDA) and ColorMnet, it fails:
2024-10-06 14:17:29.914
[VSE Server]: socket is ready to be read
[VSE Server]: connection open: true
[VSE Server]: connection readable: true
[VSE Server] - Message received: changeTo ### J:\tmp\tempPreviewVapoursynthFile14_17_29_906.vpy ###
2024-10-06 14:17:35.117
Failed to evaluate the script:
Python exception: Failed to initialize ColorMNet[remote] try ColorMNet[local]

Traceback (most recent call last):
File "src\\cython\\vapoursynth.pyx", line 3387, in vapoursynth._vpy_evaluate
File "src\\cython\\vapoursynth.pyx", line 3388, in vapoursynth._vpy_evaluate
File "J:\tmp\tempPreviewVapoursynthFile14_17_29_906.vpy", line 45, in
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\__init__.py", line 305, in HAVC_main
clip_colored = HAVC_deepex(clip=clip, clip_ref=clip_ref, method=DeepExMethod, render_speed=DeepExPreset,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\__init__.py", line 587, in HAVC_deepex
clip_colored = vs_colormnet(clip, clip_ref, image_size=-1, enable_resize=enable_resize,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\vsslib\vsmodels.py", line 45, in vs_colormnet
return vs_colormnet_remote(clip, clip_ref, image_size, enable_resize, frame_propagate, max_memory_frames)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\__init__.py", line 185, in vs_colormnet_remote
HAVC_LogMessage(MessageType.EXCEPTION, "Failed to initialize ColorMNet[remote] try ColorMNet[local]")
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\vsslib\vsutils.py", line 47, in HAVC_LogMessage
raise vs.Error(message_text)
vapoursynth.Error: Failed to initialize ColorMNet[remote] try ColorMNet[local]

Using R68 (with old CUDA):
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
works.
clip = HAVC_main(clip=clip, EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, ScThtSSIM=0.10, DeepExMaxMemFrames=0)
Does seem to work too.
=> Updated the deoldify test download.

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
Quote: I attached the new version RC21, where I increased the values of sc_tht_black and sc_tht_white (two parameters not exported to the GUI).
Okay, then the test version should still be fine. Smile

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
I attached a sample to test scene detection.

SCDetect with threshold = 0.05 generates only one frame.
All the methods tested provided wrong selections (see folders).
SCDetect with threshold = 0.03 detected too many frames (13).
SCDetect with threshold = 0.03 and sc_tht_ssim = 0.4 provided only 5 frames.

Try also your scene change detection implementations to see what happens.

Dan


Attached Files
.zip   SCDetectSample.zip (Size: 4,42 MB / Downloads: 21)
Reply
They all have issues with these blend scene changes and either see no scene change, or, if the threshold is lowered, see way too many. Wink
(totally forgot that sc.Detect is the predecessor of misc.SCDetect)

Cu Selur

Ps.: when doing the scene change detection, to speed things up, you could always resize the clip to ...x320.
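
Something along these lines (just a sketch, assuming a recent Vapoursynth that provides std.CopyFrameProps):

import vapoursynth as vs
core = vs.core

def scdetect_on_proxy(clip: vs.VideoNode, sc_threshold: float = 0.03) -> vs.VideoNode:
    # run the detection on a small proxy, then copy the frame properties back
    w = (round(clip.width * 320 / clip.height) + 3) // 4 * 4  # keep aspect ratio, mod-4 width
    proxy = core.resize.Bilinear(clip, width=w, height=320)
    proxy = core.misc.SCDetect(proxy, threshold=sc_threshold)
    # copies all frame props from the proxy, including _SceneChangePrev/_SceneChangeNext
    return core.std.CopyFrameProps(clip, prop_src=proxy)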
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
Hello Selur,

  I released the new version RC22.

  In this version, the GUI speed "Preset" (fast, medium, slow) also applies to ColorMNet.
  The GUI parameters "Ref merge", "Weight" and "Threshold" are now applicable to ColorMNet as well.

  To allow SCDetect() to perform better on blend scenes, I introduced a new boolean parameter: "sc_normalize" in HAVC_ddeoldify() and "ScNormalize" in HAVC_main().
  If this flag is enabled, the B&W frames used by misc.SCDetect() are "normalized"; in this way the SCDetect() filter becomes more sensitive and is able to detect more blend scenes.
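
  Just to illustrate the idea (the actual normalization inside vs-deoldify may differ), a per-frame stretch of the luma to full range before SCDetect could look like this, assuming an 8-bit clip:

import vapoursynth as vs
core = vs.core

def normalize_luma(gray: vs.VideoNode) -> vs.VideoNode:
    # per-frame contrast stretch driven by PlaneStats; only an illustration
    # of the idea behind sc_normalize, not the vs-deoldify code
    stats = core.std.PlaneStats(gray)

    def stretch(n, f):
        lo, hi = f.props['PlaneStatsMin'], f.props['PlaneStatsMax']
        if hi <= lo:
            return gray  # flat frame, nothing to stretch
        return core.std.Levels(gray, min_in=lo, max_in=hi,
                               min_out=0, max_out=255, planes=0)

    return core.std.FrameEval(gray, stretch, prop_src=stats)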

  I hope that you can include it in the GUI.

Thanks,
Dan


Attached Files
.zip   vsdeoldify-4.5.0_RC22.zip (Size: 397,5 KB / Downloads: 25)
Reply
Updated the deoldify test version.

Quote: To allow SCDetect() to perform better on blend scenes, I introduced a new boolean parameter: "sc_normalize" in HAVC_ddeoldify() and "ScNormalize" in HAVC_main().
If this flag is enabled, the B&W frames used by misc.SCDetect() are "normalized"; in this way the SCDetect() filter becomes more sensitive and is able to detect more blend scenes.
Nice. Smile

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
Hi, saw you released 4.5.0 Smile

Installing vsdeoldify-4.5.0-py3-none-any with:
python -m pip install vsdeoldify-4.5.0-py3-none-any
and extracting 'spatial_correlation_sampler-0.5.0-py312-cp312-win_amd64.whl' into my 'Hybrid\64bit\Vapoursynth\Lib\site-packages' folder, then enabling 'DeOldify with Exemplar Models' in Hybrid, I get:
Failed to evaluate the script:
Python exception: Failed to initialize ColorMNet[remote] try ColorMNet[local]

Traceback (most recent call last):
File "src\\cython\\vapoursynth.pyx", line 3387, in vapoursynth._vpy_evaluate
File "src\\cython\\vapoursynth.pyx", line 3388, in vapoursynth._vpy_evaluate
File "J:\tmp\tempPreviewVapoursynthFile12_38_31_085.vpy", line 48, in
clip = HAVC_main(clip=clip, ColorFix="none", EnableDeepEx=True, DeepExMethod=0, DeepExRefMerge=0, ScFrameDir=None, DeepExModel=0, DeepExEncMode=0, DeepExMaxMemFrames=0)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\__init__.py", line 317, in HAVC_main
clip_colored = HAVC_deepex(clip=clip, clip_ref=clip_ref, method=DeepExMethod, render_speed=DeepExPreset,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\__init__.py", line 615, in HAVC_deepex
clip_colored = vs_colormnet(clip, clip_ref, clip_sc, image_size=-1, enable_resize=enable_resize,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\vsslib\vsmodels.py", line 45, in vs_colormnet
return vs_colormnet_remote(clip, clip_ref, clip_sc, image_size, enable_resize, frame_propagate,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\__init__.py", line 136, in vs_colormnet_remote
HAVC_LogMessage(MessageType.EXCEPTION, "Failed to initialize ColorMNet[remote] try ColorMNet[local]")
File "F:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\vsslib\vsutils.py", line 47, in HAVC_LogMessage
raise vs.Error(message_text)
vapoursynth.Error: Failed to initialize ColorMNet[remote] try ColorMNet[local]
DINOv2FeatureV6_LocalAtten_s2_154000.pth is present in "Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\colormnet\weights"

Any idea what I'm missing?
(updated Hybrid deoldify test version download)

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply

