Deoldify Vapoursynth filter
(24.03.2024, 12:22)Selur Wrote: Ah, post wasn't finished. (Images were just shown as attachments, without explanation)

I substituted the last line with:

clip = clipMask.resize.Bicubic(format=vs.RGB24, matrix_in_s="709", range_s="limited", dither_type="error_diffusion")

And I confirm that the mask is always "white".
I'm planning to write my own motion-mask filter...
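
(As a side note, one way to check this numerically rather than by eye, using only the standard std.PlaneStats and text.FrameProps filters; adding these two lines is just a suggested check, not part of the original script:)

stats = vs.core.std.PlaneStats(clipMask)
clipMask = vs.core.text.FrameProps(stats, props=["PlaneStatsMin", "PlaneStatsAverage", "PlaneStatsMax"])
# for a fully white mask, min, average and max will all sit at the same (white) value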

Dan
Ah okay.
Something like:
import vapoursynth as vs
core = vs.core

# 'clip' and the vs_clip_chroma_stabilizer helper are assumed to be defined earlier in the script
deviation = 0.05
steps: int = 1
tht = 10

max_steps = max(min(steps, 5), 1)

# apply the chroma stabilizer max_steps times in total
clip_limited = vs_clip_chroma_stabilizer(clip, deviation=deviation)
for i in range(1, max_steps):
    clip_limited = vs_clip_chroma_stabilizer(clip_limited, deviation=deviation)

# calculate motion mask
org = clip
clipMask = clip
clipMask = clipMask.resize.Bicubic(format=vs.GRAY8, matrix_s="470bg", range_s="limited")
clipMask = vs.core.motionmask.MotionMask(clip=clipMask, th1=tht, th2=tht, tht=tht) # pixels with abs(diff) < tht will be black (static parts)
clipMask = vs.core.std.InvertMask(clip=clipMask) # invert so that static parts are white (weight=1)
# merge in YUV color space      
clipMask = clipMask.resize.Bicubic(format=vs.YUV444PS, matrix_s="470bg", range_s="limited")
clip_limited = clip_limited.resize.Bicubic(format=vs.YUV444PS, matrix_s="470bg", range_s="limited")
clip = clip.resize.Bicubic(format=vs.YUV444PS, matrix_s="470bg", range_s="limited")
clip = vs.core.std.MaskedMerge(clipa=clip, clipb=clip_limited, mask=clipMask) # MotionMask
# restore RGB24 color space
clip = clip.resize.Bicubic(format=vs.RGB24, matrix_in_s="709", range_s="limited", dither_type="error_diffusion")
clipMask = clipMask.resize.Bicubic(format=vs.RGB24, matrix_in_s="709", range_s="limited", dither_type="error_diffusion")
org = org.resize.Bicubic(format=vs.RGB24, matrix_in_s="709", range_s="limited", dither_type="error_diffusion")

clip = core.std.StackVertical([
core.std.StackHorizontal([org.text.Text("Original"), clipMask.text.Text("Mask")]),
core.std.StackHorizontal([clip.text.Text("Filtered"), clip.text.Text("Filtered")]) # filtered clip shown twice so both rows have the same width
])
might help to better see what's happening. Smile

Dan Wrote: And I confirm that the mask is always "white".
I'm planning to write my own motion-mask filter...
What's wrong with the current one?
I can confirm that this filter is flawed. It seems to have been written (or tested) to work only in a narrow range around "tht=10".

Here is the mask with tht=10:

[Image: attachment.php?aid=2318]

Here is the inverted mask:

[Image: attachment.php?aid=2319]

If I increase "tht", the inverted mask becomes completely "white"; this is not what I want. The expectation is that by increasing "tht" the moving parts will be considered "static", but the filter is unable to produce this result.
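
For comparison, here is a minimal sketch of a threshold-based motion mask built only from std.Expr and std.Invert (simple_motion_mask is my own illustration under these assumptions, not the motionmask plugin's implementation and not Dan's planned filter); with this construction the effect of tht on the mask can be inspected directly:

import vapoursynth as vs
core = vs.core

def simple_motion_mask(clip: vs.VideoNode, tht: int = 10) -> vs.VideoNode:
    # luma-only 8-bit copy of the input (same conversion as used above)
    gray = clip.resize.Bicubic(format=vs.GRAY8, matrix_s="470bg", range_s="limited")
    # previous frame: repeat frame 0 once and drop the last frame
    prev = gray[0] + gray[:-1]
    # pixels whose absolute difference to the previous frame exceeds tht become white (moving)
    moving = core.std.Expr([gray, prev], expr=f"x y - abs {tht} > 255 0 ?")
    # invert so that static parts are white (weight = 1 in MaskedMerge)
    return core.std.Invert(moving)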

Dan


Hmm... I agree, higher values should detect fewer changes.


Cu Selur
I updated the Hybrid deoldify test version, which should now work with v2.0.1.
(I hope I adjusted to all the changes Wink)

Cu Selur
The new dev-version is working! Smile 

I noted that the new temporal filter "color_limiter" is penalized when used inside the VapourSynth encoding pipeline.

For example, the encoding speed on my PC with SimpleMerge

clip = ddeoldify(clip=clip, model=0, sat=[1.00,1.00], hue=[0.00,0.00], chroma_resize=True, dd_method=2)
 
is 5.28fps

The encoding speed with all post-process filters enabled, excluding only "color_limiter",
 
clip = ddeoldify(clip=clip, model=0, sat=[1.00,1.00], hue=[0.00,0.00], chroma_resize=True, dd_method=2, dark_darkness=[True,0.1,0.2,0.6,0.65], color_stabilizer=[True,False,True,5,"arithmetic",True])

is 5.20fps, only 2% slower.

But if I add the temporal filter "color_limiter"
 
clip = ddeoldify(clip=clip, model=0, sat=[1.00,1.00], hue=[0.00,0.00], chroma_resize=True, dd_method=2, dark_darkness=[True,0.1,0.2,0.6,0.65], color_stabilizer=[True,False,True,5,"arithmetic",True], color_limiter=[True,0.02])

the speed decreases to 2.58fps, about 51% slower.

But if I apply the filter after "ddeoldify"

clip = ddeoldify(clip=clip, model=0, sat=[1.00,1.00], hue=[0.00,0.00], chroma_resize=True, dd_method=2, dark_darkness=[True,0.1,0.2,0.6,0.65], color_stabilizer=[True,False,True,5,"arithmetic",True])

clip = dd_video_stabilizer(clip=clip, chroma_resize=[True,24], color_limiter=[True, 0.02])

The encoding speed is 5.04fps, only 5% slower.

The reason for such a difference in speed is not clear, but since it is a "temporal" filter, it is penalized when applied together with the other filters in the same pipeline.
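
To reproduce the comparison, one rough approach (my own sketch, not something from the thread; measure_fps is a hypothetical helper) is to render every frame of the finished clip and compute the achieved frame rate for each arrangement:

import time
import vapoursynth as vs

def measure_fps(clip: vs.VideoNode) -> float:
    # request (and therefore render) every frame once, then report frames per second
    start = time.time()
    for _ in clip.frames():
        pass
    return clip.num_frames / (time.time() - start)

Calling this once on the clip with color_limiter inside ddeoldify and once on the clip with dd_video_stabilizer applied afterwards should show roughly the same gap as the encoding speeds above.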

I don't know if you are able to observe the same decrease in speed, but if so, it could be worth making "dd_video_stabilizer" available in Hybrid as well, as a separate post-processing step.  Angel

Thanks,
Dan
Yes, I get similar speed impacts.

I agree that calling:
1. ddeoldify
and then
2. dd_video_stabilizer
seems like the better approach.

I'm really reluctant to add dd_video_stabilizer, since the whole thing gets more and more unusable.
This would add another 14+ UI elements,...

Would it make sense to remove color_stabilizer and color_limiter from ddeoldify
(this would remove the possibility to use color_stabilizer for ddeoldify and ddcolor separately) and then call
1. ddeoldify(..)
and then
2. dd_video_stabilizer(..) in case color_stabilizer, color_limiter or color_smoothing is enabled?
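
In script terms, the proposal would roughly mean generating a pattern like this sketch (the *_enabled flags are placeholders for whatever Hybrid tracks internally, and the parameter values are just the examples used earlier in this thread):

clip = ddeoldify(clip=clip, model=0, sat=[1.00, 1.00], hue=[0.00, 0.00], chroma_resize=True, dd_method=2)
# only add the extra temporal pass when one of the stabilizer options is actually enabled
if color_stabilizer_enabled or color_limiter_enabled or color_smoothing_enabled:
    clip = dd_video_stabilizer(clip=clip, chroma_resize=[True, 24], color_limiter=[True, 0.02])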

Cu Selur
I understand your point of view.

The more conservative approach is to not touch the GUI, but in case "color_limiter" is enabled, it can be run separately using dd_video_stabilizer(..) after ddeoldify(..).
For the moment it is better to enable this separate process only for "color_limiter".

I'm still working on a VapourSynth version of "color_stabilizer" that does not introduce gray frames.
In case I manage to find a working solution, it would be better to run this version of the filter separately as well, like "color_limiter", because it is a temporal filter.
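
Just to illustrate the general idea (this is only my sketch under stated assumptions, not Dan's work-in-progress filter): averaging only the chroma planes over a short temporal window leaves the luma untouched, which is one way a temporal color stabilizer can avoid producing gray frames.

import vapoursynth as vs
core = vs.core

def chroma_temporal_average(clip: vs.VideoNode, radius: int = 2) -> vs.VideoNode:
    # average only the chroma planes (U, V) over a 2*radius+1 frame window;
    # the luma plane is copied unchanged, so brightness cannot drift towards gray
    yuv = clip.resize.Bicubic(format=vs.YUV444PS, matrix_s="709", range_s="limited")
    smoothed = core.std.AverageFrames(yuv, weights=[1] * (2 * radius + 1), planes=[1, 2])
    return smoothed.resize.Bicubic(format=vs.RGB24, matrix_in_s="709", range_s="limited",
                                   dither_type="error_diffusion")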

Thanks,
Dan
Uploaded a new dev version (same link), let me know what you think of this one.
(It loads ddeoldify and then dd_video_stabilizer.)

Cu Selur
The GUI is more rational now, I like it.
Also the speed is good.

Thanks,
Dan

In the next few days I will release a new version (only to you), where I will clean up the ddeoldify filter by removing the post-process filters, and I will probably rename "dd_video_stabilizer" to "ddeoldify_stabilizer".

Dan