"Blur" filters in Hybrid
#11
Hello Selur,

   It worked! Also, changing the bit depth as follows

clip = kfg.conditional_resize(clip.fmtc.bitdepth(bits=16), width=1280, height=688, thr=0.00015)

   works.

Thanks,
Dan

Hello Selur,

  Looking at the implementation, the function is in effect doing what it says: it is just a "conditional resize", and inside the function the kernels used for the comparison are fixed and quite simple. I think the idea is not bad, but it should be improved.
  I think it would be better to pass as input the image already resized (the "oversharpened" one obtained, for example, with "realesr") and the image downsampled with a simpler resizer. But to be useful it would be necessary to implement what is missing in the TODO: "implement blending zone in which both clips are merged to avoid abrupt and visible kernel changes".
   The merge could be implemented using VapourSynth's AverageFrames function, but as currently implemented the function is not useful and not needed in Hybrid.
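   For illustration, a minimal sketch of such a blend (the names 'down' and 'sharp' are placeholders, not from the actual script; core.std.Merge(down, sharp, 0.5) would do the same job):

Code:
import vapoursynth as vs
core = vs.core

# 'down' = simple downscale, 'sharp' = oversharpened AI upscale, same format and size
def blend_50_50(down: vs.VideoNode, sharp: vs.VideoNode) -> vs.VideoNode:
    # AverageFrames with two clips and equal weights averages the corresponding
    # frames, i.e. a 50/50 blend of the two resizes
    return core.std.AverageFrames(clips=[down, sharp], weights=[1, 1])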

Thanks,
Dan
#12
Hello Selur,

   I modified the function conditional_resize to try to fix the implementation problems. To do that I wrote the Python script kagefuncMOD.py.
   The inputs are now the files already resized. To simplify the test script I used 2 external files that were already resampled, but the idea is that the resize is performed directly in the script using 2 different methods.
   In the new function conditional_resize_ext the test is simplified (I don't know if it could be improved). I also changed the test condition to use the abs() function, since the error could go in both directions (too sharp or too smooth). In the case of "thr=0" the 2 clips are always merged with the weight given as input (default = 0.5). With weight=0 the "down" clip is returned; vice versa, with weight=1.0 the "sharp" clip is always returned.
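   For reference, one possible reading of that selection logic as a minimal sketch (this is not the actual kagefuncMOD.py; the clip names, the PlaneStats-based difference, and the per-frame selection are assumptions):

Code:
import vapoursynth as vs
core = vs.core

def conditional_resize_ext_sketch(down: vs.VideoNode, sharp: vs.VideoNode,
                                  thr: float = 0.0001, weight: float = 0.5) -> vs.VideoNode:
    # 'down' = simple downscale, 'sharp' = oversharpened clip, same format and size
    if weight <= 0.0:
        return down
    if weight >= 1.0:
        return sharp
    merged = core.std.Merge(down, sharp, weight)
    if thr == 0.0:
        return merged  # thr=0: always blend with the given weight
    # per-frame luma difference between the two candidates
    stats = core.std.PlaneStats(core.std.ShufflePlanes(down, 0, vs.GRAY),
                                core.std.ShufflePlanes(sharp, 0, vs.GRAY))
    def select(n, f):
        # blend only where the clips differ noticeably, otherwise keep the sharp clip
        return merged if abs(f.props['PlaneStatsDiff']) > thr else sharp
    return core.std.FrameEval(sharp, select, prop_src=stats)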
   I also attached an archive with the comparison.

   Let me know what do you think and if it is feasible to include this script in Hybrid.
   Feel free to change the script to improve it or to adapt to your needs.

Thanks,
Dan
#13
Hello Selur,

   I found a good solution to the problem of the excessive sharpness applied by realesr-general.
   It is enough to apply the BlindDeHalo3 filter after the resize, with a strength between 50 and 125.
   The effect obtained this way was exactly what I was looking for.
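   Roughly along these lines (assuming the havsfunc port of BlindDeHalo3 is installed; the wrapper name and the 'upscaled' clip are just placeholders for the realesr-general output):

Code:
import vapoursynth as vs
import havsfunc as haf
core = vs.core

def soften(upscaled: vs.VideoNode, strength: float = 100) -> vs.VideoNode:
    # apply BlindDeHalo3 after the AI resize; strength somewhere in the 50-125 range
    return haf.BlindDeHalo3(upscaled, strength=strength)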

Dan
#14
Finally, I had some time to look at this.
Got a few questions regarding your 'conditional_resize_ext':
a. Why keep the dependencies on mvsfunc and fvsfunc?
From what I see, get_y is the only function not defined in your script itself, so simply replacing it with 'core.std.ShufflePlanes(XXX, [0], vs.GRAY)' should remove the dependencies on mvsfunc and fvsfunc.
b. Shouldn't it be named 'luma_diff_select'?

c. About what your script actually does:

Looking at:
    clip = oversharpened

    # we only need luma for the comparison
    rescaled = get_y(down)
    oversharpened_up = get_y(oversharpened)
    src_luma = get_y(clip)
Do you realize that 'oversharpened_up' and 'src_luma' are the same?
-> I don't think the script does what you intended. :)

Quote:It is enough to apply the BlindDeHalo3 filter after the resize, with a strength between 50 and 125.
The effect obtained this way was exactly what I was looking for.
Happy that worked out. :)

Cu Selur
#15
Hello Selur,

  I'm quite satisfied: I obtained the amount of "blur" necessary to mitigate the side effects of realesr-general.
  I have attached an example where it was possible to reduce the "joker effect" introduced by realesr-general.

Thank you,
Dan
#16
Quote:I have attached an example where it was possible to reduce the "joker effect" introduced by realesr-general.
at least a bit :D (btw. this will happen with other AI resizers too)
#17
Hello Selur,

  
Quote:btw. this will happen with other AI resizers too

   I noted that in the frame regarding the resize there is the check box "Stepped Resize", with the option to perform the resize in a second step.

  Given that some AI resizers introduce artifacts, it could be possible to reduce their impact by adding a second step, or an option where the image produced by the AI resizer is merged with another image produced by a simpler resizer. The merge could be implemented with a function like this one:

PHP Code:
import vapoursynth as vs
core = vs.core

def resize_smooth(clip: vs.VideoNode, oversharpened: vs.VideoNode, kernel='bicubic', weight=0.5) -> vs.VideoNode:
    """
    Fix oversharpened upscales by merging the video with a blurry bicubic kernel downscale.
    """
    down = clip.fmtc.resample(oversharpened.width, oversharpened.height, kernel=kernel)
    return core.std.Merge(down, oversharpened, weight)


   where the kernel used for the resize can be selected as an input and the weight allows reducing the effect of the AI resizer.

   The difference with respect to using a DeHalo filter is that such a filter is applied to the upscaled image. If the upscaler "invented" something wrong, the filter is applied to the "invented" image, so the chance of reducing the wrong effect is very limited. By merging the upscaled image with the output of an upscaler that does not tend to "invent" detail, it may be possible to obtain a better result than with a DeHalo filter.

Dan
#18
What "Stepped Resize" does it divides the difference between the target resolution and the starting resolution by the number of steps and then applies multiple consecutive resizes, which differ by the calculated amount. (additionally it can denoise&sharpen after each step)
Using it with resizers that have a fixed scale factor is possible, but I would not recommend it.
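A small illustration of that calculation (a hypothetical helper, not Hybrid's actual code):

Code:
# split the width/height difference evenly over the steps
def stepped_resolutions(src_w, src_h, dst_w, dst_h, steps):
    return [(src_w + (dst_w - src_w) * i // steps,
             src_h + (dst_h - src_h) * i // steps) for i in range(1, steps + 1)]

# e.g. stepped_resolutions(720, 576, 1440, 1152, 2) -> [(1080, 864), (1440, 1152)]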

Sure, in theory, I could add an option to all resizers that takes two different resize methods and averages them by some weight.
But there is quite a bit of coding involved, since it would have to make sure those two resize outputs use the same color space. (Your 'resize_smooth' has problems there: it only works for 16-bit input, otherwise it will fail in the merge step, since fmtc.resample always outputs 16 bit.) Also, handling of interlaced content would need to be tested and adjusted.
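For what it's worth, a hedged tweak of Dan's sketch that sidesteps the bit-depth mismatch (this is only a suggestion, not Hybrid's implementation):

Code:
import vapoursynth as vs
core = vs.core

def resize_smooth(clip: vs.VideoNode, oversharpened: vs.VideoNode,
                  kernel: str = 'bicubic', weight: float = 0.5) -> vs.VideoNode:
    down = clip.fmtc.resample(oversharpened.width, oversharpened.height, kernel=kernel)
    # fmtc.resample outputs 16 bit by default, so convert back to the depth of
    # the oversharpened clip before merging
    down = core.fmtc.bitdepth(down, bits=oversharpened.format.bits_per_sample)
    return core.std.Merge(down, oversharpened, weight)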

If more users think this is useful, since they would use it, I could look into implementing this, but atm. I am not planning to add support for this.

--
As a general side note:
Your resize_smooth would also be better off taking a grayscale mask (values 0.00-1.00) instead of a weight as parameter and using MaskedMerge, since that probably makes more sense than a constant weight for the whole image.
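A rough sketch of that mask-based variant (how the mask is built, e.g. an edge/halo mask derived from the oversharpened clip, is left as an assumption):

Code:
import vapoursynth as vs
core = vs.core

def resize_smooth_masked(oversharpened: vs.VideoNode, down: vs.VideoNode,
                         mask: vs.VideoNode) -> vs.VideoNode:
    # 'mask' is a gray clip in the same bit depth as the video; where it is
    # bright, the blurry downscale replaces the oversharpened pixels
    return core.std.MaskedMerge(oversharpened, down, mask, first_plane=True)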

Cu Selur
#19
Sent you a link to a dev version with some basic weighted resizing.

Cu Selur
#20
Hello Selur,

    I downloaded the new dev version and performed some basic tests.
    Here are some results:

    1) Test1: https://imgsli.com/MTQ3NDY0
        Test1: original vs realesr

    2) Test1: https://imgsli.com/MTQ3NDY1
        Test1: BlindDeHalo3 vs Weighted Smooth (your version)

    3) Test2: https://imgsli.com/MTQ3NDY2
        Test2: original vs realesr

    4) Test2: https://imgsli.com/MTQ3NDY3
        Test2: BlindDeHalo3 vs Weighted Smooth (your version)

  Depending on the weight used there is some small improvement.
  I need to perform some more tests, but already in this version it seems a little better than BlindDeHalo3.

Thanks,
Dan