
Filter limitation when frame changes strongly
#1
Hello!

Is it possible to apply a filter only when the frame changes a lot, similar to what "FillDuplicateFrame" already does?
For example, to limit temporal noise suppressors so that they don't blur the picture when the camera moves in anime?
#2
Sounds like a motion mask would be better suited, especially for anime.
Enable 'Filtering->Vapoursynth->Misc->UI->Show 'Masking'-controls' to see the controls for masking, select the 'MotionMask' and see if that helps.
Other than that, there is no option to apply a filter only while a certain amount of change is present.
Most denoisers with temporal components have settings to control the strength/threshold of the temporal part.
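
For reference, outside of Hybrid the same idea can be written directly in VapourSynth: build a mask from the per-pixel difference between neighbouring frames and only keep the denoised result where the mask says the picture is static. A minimal sketch, assuming 'clip' is an already-loaded YUV420P8 clip; the AverageFrames call is just a crude stand-in for a real temporal denoiser, and the threshold is an untuned placeholder:

import vapoursynth as vs
core = vs.core

# clip = ...  # placeholder: the source clip (YUV420P8 assumed), loaded elsewhere

# Crude temporal average over -1/0/+1 frames, only as a stand-in for a real temporal denoiser.
denoised = core.std.AverageFrames(clip, weights=[1, 2, 1])

def motion_mask(clip: vs.VideoNode, thr: int = 4) -> vs.VideoNode:
    # Absolute per-pixel difference between the current and the previous frame.
    prev = clip[0] + clip[:-1]  # shift by one frame, duplicating the first frame
    diff = core.std.Expr([clip, prev], "x y - abs")
    # Binarize, grow and soften the mask so moving areas become solid white.
    mask = core.std.Binarize(diff, threshold=thr)
    mask = core.std.Maximum(mask)
    mask = core.std.BoxBlur(mask)
    return mask

mask = motion_mask(clip)
# Black (static) areas take the denoised clip, white (moving) areas keep the source,
# so static shots get the full temporal denoising and moving shots are left alone.
clip = core.std.MaskedMerge(clipa=denoised, clipb=clip, mask=mask, first_plane=True)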

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
#3
Since your post is in the section which is specifically not about Hybrid, here is also an example of how to do what you described:
import vapoursynth as vs
import functools

core = vs.core

# Define a function that will apply the filter only if the change in the frame meets a specified comparison condition.
def filterOnlyIf(clip, filtered, thres: float=0.5, compare_method: str="<", debug: bool=False):
    def filterOnly(n, f, clip):
        # Extract the PlaneStatsDiff property for the current frame.
        diff = f.props['PlaneStatsDiff']
       
        # Define comparison logic based on 'compare_method'
        if compare_method == "<":
            condition = diff < thres
        elif compare_method == "<=":
            condition = diff <= thres
        elif compare_method == "=":
            condition = diff == thres
        elif compare_method == ">":
            condition = diff > thres
        elif compare_method == ">=":
            condition = diff >= thres
        else:
            raise ValueError(f"Invalid comparison method: {compare_method}")

        # Apply filter or return the original clip based on the comparison result
        if condition:
            # If the condition is met, return the filtered clip.
            ret = filtered
            if debug:
                ret = ret.text.Text(f"filtered, diff: {diff}")
        else:
            # If the condition is not met, skip the filter and return the original clip.
            ret = clip
            if debug:
                ret = ret.text.Text(f"skip, diff: {diff}")
       
        return ret

    # Apply the 'filterOnly' function frame-by-frame using FrameEval.
    # Note: 'differences' (a clip carrying PlaneStats properties) must be defined
    # before this function is called; see the usage example below.
    clip = core.std.FrameEval(clip=clip, eval=functools.partial(filterOnly, clip=clip), prop_src=differences)
    return clip


# Example usage with '>' as comparison method and core.std.Levels as filter:

# Generate PlaneStats for the difference between frames
differences = core.std.PlaneStats(clipa=clip, clipb=clip[0] + clip)

# Convert the clip from YUV420P8 (Rec.601 matrix) to RGB24 before applying the Levels example
clip = core.resize.Bicubic(clip=clip, format=vs.RGB24, matrix_in_s="470bg", range_s="limited")

# Apply the Levels filter to the clip (this will adjust color levels, range from 16-235).
# In this case, min_in=16, max_in=235 (input range), min_out=16, max_out=235 (output range), gamma=2.00 (adjust gamma).
filtered = core.std.Levels(clip=clip, min_in=16, max_in=235, min_out=16, max_out=235, gamma=2.00)

clip = filterOnlyIf(clip, filtered, thres=0.004, compare_method=">", debug=True)  # use the '>' comparison
This example will boost the gamma whenever the frame-to-frame change is above 0.004.

Cu Selur

Ps.: if a few users think this is something that could be useful, I could add this as a general option. Won't add this atm., since I don't really have a use case for this. Like I wrote before, I think a motion mask would be the better fit for your scenario.
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
#4
WOW!!! This is awesome!

Thank you very much!

I just want to preserve fine texture (for example, wood grain) while the camera is moving, which temporal noise reduction otherwise blurs. I could not get MotionMask to capture fine texture: it only picks up larger details, and when I configured it to catch fine texture, the denoising of static shots got worse. My idea is to use weaker noise reduction while the camera is moving and stronger noise reduction when it is static. Thank you very much! Heart
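
For what it's worth, a rough sketch of that idea, reusing the filterOnlyIf helper from post #3 ('weak' and 'strong' are placeholder names for the output of the temporal denoiser at two strengths, and 0.004 is an untuned threshold):

# Frame-to-frame difference stats, as in post #3 (filterOnlyIf reads this 'differences' clip).
differences = core.std.PlaneStats(clipa=clip, clipb=clip[0] + clip)

# weak = ...    # placeholder: light temporal denoise of 'clip'
# strong = ...  # placeholder: heavy temporal denoise of 'clip'

# When the difference is large (camera movement), take the weak denoise;
# otherwise keep the strong denoise for static shots.
clip = filterOnlyIf(strong, weak, thres=0.004, compare_method=">", debug=True)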
#5
Have you tried using an (inverted) edge mask and/or different denoisers (e.g. fft3dfilter with cas applied afterwards)?
Other than that, you will probably have to try other masks, write your own code to combine masks, or write your own masking code.
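
A rough sketch of that route, assuming the fft3dfilter and cas plugins are installed (all parameter values are untuned placeholders): denoise, sharpen slightly with CAS, then let an edge mask keep the original pixels where there is fine texture.

# Temporal denoise plus a light CAS sharpen afterwards (namespaces/parameters assume
# the common fft3dfilter and cas VapourSynth ports).
denoised = core.fft3dfilter.FFT3DFilter(clip, sigma=1.5, bt=3)
denoised = core.cas.CAS(denoised, sharpness=0.3)

# Edge mask from the source luma; white = edges/fine texture to protect.
edges = core.std.Sobel(clip, planes=[0])
edges = core.std.Maximum(edges)
edges = core.std.BoxBlur(edges)

# Edge areas keep the original, everything else takes the denoised clip
# (equivalent to inverting the mask and swapping clipa/clipb).
clip = core.std.MaskedMerge(clipa=denoised, clipb=clip, mask=edges, first_plane=True)

To only protect texture while the camera moves, such an edge mask could be combined with a motion mask, e.g. core.std.Expr([edges, motion], "x y min"), before the MaskedMerge.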
Maybe interesting:
Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.

