Deoldify Vapoursynth filter
I will also add an option to repeat the average more times, but I'm thinking of limiting the maximum number of repetitions to 3.
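Just to illustrate the idea, a minimal sketch of how the repetition cap could look (single_average_pass is only a placeholder name for whatever performs one averaging pass, not the actual implementation):

def repeated_average(clip, passes: int = 1):
    passes = max(1, min(passes, 3))  # cap the number of repetitions at 3
    for _ in range(passes):
        clip = single_average_pass(clip)  # placeholder: one temporal-average pass
    return clip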

Dan
Reply
I uploaded a new dev version (same link) which fixes the SplitYUV preview; I think it can really help to spot the UV problems.
Enable 'Filtering->Compare view' and set 'Filtering->Vapoursynth->Misc->Preview->Split Compare View' to 'splitYUV'.

Cu Selur

Ps.: updated again and added 'splitYUV&interleaved' as a 'Split Compare View' option.
PPs.: attached a reencode where I used dpir_deblock (to address the blocking in the chroma planes), Spotless (trying to lessen the flickering in the chroma planes) and resized to 1024x... (1000 seems to cause problems with NVEncC).


Attached Files
.zip   dpirdeplock_spotless_resize1024.zip (Size: 5,62 MB / Downloads: 32)
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
I will test it.
In the meanwhile I discovered that "AverageFrames" is now available as a standard plugin, so it can be called as:

clip = core.std.AverageFrames

In the new release I will switch to the "std" version, so there is no need to load the "MiscFilters.dll" filter.
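For example, a simple equal-weight temporal average of an existing clip could look like this (the weights here are just an assumption for illustration; with a single clip, AverageFrames expects an odd number of weights):

import vapoursynth as vs
core = vs.core

# equal-weight average over the current frame and its 2 neighbours on each side
clip = core.std.AverageFrames(clip, weights=[1, 1, 1, 1, 1])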

Dan
Reply
Nice, didn't know that AverageFrames was included in the core filters now. :)

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
More good news.

I was finally able to properly calculate the frame average using Vapoursynth, with this code:

from functools import partial

import vapoursynth as vs

def clip_color_stabilizer(clip: vs.VideoNode = None, nframes: int = 5, smooth_type: int = 0) -> vs.VideoNode:
    max_frames = max(1, min(nframes, 31))
    # per-frame callback: averages the colors of the current frame with the previous (max_frames - 1) frames
    def smooth_frame(n, f, clip_base: vs.VideoNode = None, max_frames: int = 3):
        f_out = f.copy()
        if n < max_frames:
            return f_out
        # frame_to_image / image_to_frame / color_temporal_stabilizer are helper functions defined elsewhere in the filter
        img_f = [frame_to_image(f)]
        for i in range(1, max_frames):
            img_f.append(frame_to_image(clip_base.get_frame(n - i)))
        img_m = color_temporal_stabilizer(img_f, max_frames)
        return image_to_frame(img_m, f_out)
    clip = clip.std.ModifyFrame(clips=[clip], selector=partial(smooth_frame, clip_base=clip, max_frames=max_frames))
    return clip
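A minimal usage sketch, assuming the helper functions used above are in scope:

clip = clip_color_stabilizer(clip, nframes=5)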

I calculated the images using 4 frames; this average should be equivalent to a "left" average using 7 frames.

Here is the comparison: https://imgsli.com/MjQ5MTQ2

The average improved.

Dan
Reply
More good news.

I found this project: deoldify-onnx

The interesting thing is that it is possible to convert the onnx model to fp16 with this code:

from onnxmltools.utils.float16_converter import convert_float_to_float16
from onnxmltools.utils import load_model, save_model

# load the fp32 model, convert its weights (and I/O) to float16 and save it
onnx_model = load_model('models/deoldify_256_fp32.onnx')
new_onnx_model = convert_float_to_float16(onnx_model)
save_model(new_onnx_model, 'models/deoldify_256_fp16.onnx')
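A quick way to verify that the converted graph is still valid is the onnx checker (a minimal sketch; it operates on the ModelProto returned by load_model above):

import onnx
onnx.checker.check_model(new_onnx_model)  # raises an exception if the converted graph is malformed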
 
Unfortunately the available model "deoldify.onnx" was built assuming a frame size of 256x256 (render_factor=16).
The images colored with this model are very de-saturated, but it could be useful to stabilize "ddcolor".
I will test it...

Dan
Reply
Yeah, there is a deoldify .pth model for VSGAN too (https://openmodeldb.info/models/4x-Deoldify), but it isn't really useful either.

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
To be able to use onnx in Hybrid, I had to run:

.\python -m pip install onnxruntime-gpu
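To check that the GPU build is actually picked up, something like this can be used (a minimal sketch; the model path is just the fp16 file from the earlier post, and the input name/shape are read from the session rather than assumed):

import numpy as np
import onnxruntime as ort

print(ort.get_available_providers())  # should include 'CUDAExecutionProvider'

session = ort.InferenceSession('models/deoldify_256_fp16.onnx',
                               providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # replace dynamic dims with 1
out = session.run(None, {inp.name: np.zeros(shape, dtype=np.float16)})
print(inp.name, inp.shape, [o.shape for o in out])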

I'm going to test it.

Dan
Reply
The vs-mlrt addon uses onnx models.

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
Ok, implemented and tested the onnx version.

It can be used with the following commands:

from vsdeoldify import ddeoldify_onnx
clip = ddeoldify_onnx(clip=clip, enableFP16=True)

The speed is good, but the quality is bad, so I'm not going to release it.

If you are curious, you can download the onnx version here:

vsdeoldify-2.0.0_onnx.zip

Dan.
Reply

