I will also add an option to repeat the average multiple times, but I'm thinking of limiting the maximum number of repetitions to 3.
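Roughly like this (just a sketch; single_average_pass is a hypothetical name for one pass of the averaging):
Code:
def repeated_average(clip, repetitions: int = 1):
    # limit the max number of repetitions to 3
    repetitions = max(1, min(repetitions, 3))
    for _ in range(repetitions):
        clip = single_average_pass(clip)  # hypothetical: one averaging pass
    return clip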
Dan
I uploaded a new dev version (same link) which fixes the SplitYUV Preview; I think it can really help to spot the UV problems.
Enable 'Filtering->Compare view' and set 'Filtering->Vapoursynth->Misc->Preview->Split Compare View' to 'splitYUV'.
Cu Selur
Ps.: updated again and added 'splitYUV&interleaved' as a 'Split Compare View' option.
PPs.: attached a re-encode where I used dpir_deblock (to address the blocking in the chroma planes) and Spotless (trying to lessen the flickering in the chroma planes), and resized to 1024x... (1000 seems to cause problems with NVEncC).
I will test it.
In the meantime I discovered that "AverageFrames" is now available as a standard plugin, so it can be called as:
Code:
clip = core.std.AverageFrames(clip, weights=[1] * 5)  # e.g. a uniform temporal average over 5 frames
In the new release I will switch to the "std" version; there is no need to load the "MiscFilters.dll" filter anymore.
Dan
Nice, didn't know that AverageFrames was included in the core filters now.
Cu Selur
More good news.
I was finally able to properly calculate the frame average using VapourSynth, with this code:
Code:
import vapoursynth as vs
from functools import partial

# frame_to_image, image_to_frame and color_temporal_stabilizer are
# helper functions defined elsewhere (omitted here)
def clip_color_stabilizer(clip: vs.VideoNode = None, nframes: int = 5, smooth_type: int = 0) -> vs.VideoNode:
    max_frames = max(1, min(nframes, 31))

    def smooth_frame(n, f, clip_base: vs.VideoNode = None, max_frames: int = 3):
        f_out = f.copy()
        # not enough preceding frames yet: return the frame unchanged
        if n < max_frames:
            return f_out
        # collect the current frame and the (max_frames - 1) preceding ones
        img_f = list()
        img_f.append(frame_to_image(f))
        for i in range(1, max_frames):
            img_f.append(frame_to_image(clip_base.get_frame(n - i)))
        # average the collected images to stabilize the colors
        img_m = color_temporal_stabilizer(img_f, max_frames)
        return image_to_frame(img_m, f_out)

    clip = clip.std.ModifyFrame(clips=[clip], selector=partial(smooth_frame, clip_base=clip, max_frames=max_frames))
    return clip
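For example, to average each frame with the 3 preceding ones:
Code:
clip = clip_color_stabilizer(clip, nframes=4)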
I calculated the images using 4 frames; this average should be equivalent to a "left" average using 7 frames (presumably because two passes of a 4-frame window cover 4 + 4 - 1 = 7 frames).
Here is the comparison:
https://imgsli.com/MjQ5MTQ2
The average improved.
Dan
More good news.
I found this project:
deoldify-onnx
The interesting thing is that it is possible to convert the ONNX model to fp16 with this code:
Code:
from onnxmltools.utils.float16_converter import convert_float_to_float16
from onnxmltools.utils import load_model, save_model

# load the fp32 model, convert its weights to fp16 and save it
onnx_model = load_model('models/deoldify_256_fp32.onnx')
new_onnx_model = convert_float_to_float16(onnx_model)
save_model(new_onnx_model, 'models/deoldify_256_fp16.onnx')
Unfortunately the available model "deoldify.onnx" was built assuming a frame size of 256x256 (render_factor=16).
The images colored with this model are very desaturated, but it could be useful for stabilizing "ddcolor".
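The idea would be something like this (just a rough sketch; clip_onnx and clip_ddcolor are hypothetical names for the two colored clips, and the 50/50 weight is only a placeholder):
Code:
# blend the stable but desaturated deoldify-onnx colors with the more
# saturated (but flickering) ddcolor colors
# clip_onnx / clip_ddcolor: hypothetical names for the two colored clips
clip = core.std.Merge(clipa=clip_onnx, clipb=clip_ddcolor, weight=0.5)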
I will test it...
Dan
To be able to use ONNX in Hybrid, I had to run:
Code:
.\python -m pip install onnxruntime-gpu
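After that, a quick sanity check that the GPU provider is picked up and that the fp16 model loads; note that the input name, the NCHW layout and the 1x3x256x256 shape are my assumptions about the exported model:
Code:
import numpy as np
import onnxruntime as ort

# ask for the CUDA provider first (falls back to CPU if unavailable)
session = ort.InferenceSession('models/deoldify_256_fp16.onnx',
                               providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
name = session.get_inputs()[0].name
# the model was exported for 256x256 frames (render_factor=16)
dummy = np.random.rand(1, 3, 256, 256).astype(np.float16)
out = session.run(None, {name: dummy})[0]
print(out.shape)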
I'm going to test it.
Dan
The vs-mlrt addon uses ONNX models.
Cu Selur
Ok, I implemented and tested the ONNX version.
It can be used with the following code:
Code:
from vsdeoldify import ddeoldify_onnx
clip = ddeoldify_onnx(clip=clip, enableFP16=True)
The speed is good, but the quality is bad, so I'm not going to release it.
If you are curious, you can download the onnx version here:
vsdeoldify-2.0.0_onnx.zip
Dan.