05.01.2022, 01:34
(04.01.2022, 21:47)Selur Wrote: What they are doing is mixing EdgeCleaner, DeRinging and FFmpeg's (un-)sharpen.
I think mixing FFmpeg with AviSynth/VapourSynth filters is in general a bad idea and often stupid:
a. it can easily cause color issues
b. you easily lose track of the bit depth & co. you are working in (at least last time I checked, figuring out what color formats FFmpeg is really using was a real pain, and controlling how RGB<>YUV conversions are handled is also rather complicated).
If you use AviSynth, instead of FFmpeg's (un-)sharpen one could use one of the AviSynth unsharpening filters (http://avisynth.nl/index.php/External_filters) through the custom addition.
If you use VapourSynth, there are also (un-)sharpen filters, see: https://vsdb.top/
Personally I'm no fan of unsharpen filters (which is why in Hybrid only FFmpeg's unsharpen filter is supported out of the box). Unsharpen filters also often have problems with noisy content.
Cu Selur
RGB to YUV conversion in VapourSynth appears more straightforward than in AviSynth, at least in my experience. All you have to do is call the resize plugin on vapoursynth.core, pick a resampling method, and define the target color format plus matrix/transfer/primaries in that call's keyword arguments. I don't understand how this process can be complicated to control, assuming you aren't repeating the conversion every time you call a filter, which shouldn't be necessary unless each filter has different color format requirements.
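A rough sketch of what I mean (the source plugin, file path and the matrix/transfer/primaries values are just placeholders here and would have to match the actual clip):

import vapoursynth as vs
core = vs.core

clip = core.lsmas.LWLibavSource(r"input.mkv")  # placeholder source
# YUV -> RGB: pick a resampler, state the target format and describe the source
# with matrix/transfer/primaries keyword arguments
rgb = core.resize.Bicubic(clip, format=vs.RGB24,
                          matrix_in_s="2020ncl",
                          transfer_in_s="st2084",
                          primaries_in_s="2020")
# RGB -> YUV: same idea in the other direction, now declaring the output matrix
yuv = core.resize.Bicubic(rgb, format=vs.YUV420P10, matrix_s="2020ncl")

That is the whole conversion, done once in one place.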
Looking at AviSynth specifically, I think your argument against mixing AviSynth and FFmpeg filters is meaningless, because the number of color conversions that happens with AviSynth alone is already really bad. Furthermore, the majority of FFmpeg filters support a wide variety of color/pixel formats.
I have copied the settings shown in the video guide (I was the one who created the video), and this is the partial AviSynth+ script that Hybrid generated. The 10 bit video is loaded into a 16 bit format (instead of YUV420P10?) and then dithered down to 8 bit:
# color sampling YV12@10, matrix: bt2020nc, scantyp: progressive, luminance scale: limited
LWLibavVideoSource("C:\Users\Goose\DOCUME~1\VAPOUR~1\DAYOFT~1.MKV",cache=false,format="YUV420P16", prefer_hw=0,repeat=true)
# current resolution: 1920x1080
# filtering
# deringing using MosquitoNR
MosquitoNR()
EdgeCleaner()
# Dithering from 16 to 8bit for encoder
ConvertBits(8)
PreFetch(8)
# setting output fps to 23.976fps
AssumeFPS(24000,1001)
# output: color sampling YV12@8, matrix: bt2020nc, scantyp: progressive, luminance scale: limited
return last
This doesn't appear to be a problem in VapourSynth, but there is no EdgeCleaner port for it, which is why I stuck with AviSynth in the video guide.
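For comparison, a rough VapourSynth sketch of the same pipeline (placeholder path, and since MosquitoNR/EdgeCleaner have no ports the filtering step is only marked with a comment); the clip can stay at its native 10 bit the whole time, with the dither to 8 bit done explicitly at the end:

import vapoursynth as vs
core = vs.core

# placeholder path; L-SMASH Works decodes the clip at its native YUV420P10
clip = core.lsmas.LWLibavSource(r"input.mkv")
# deringing/edge cleaning would go here, but there is no EdgeCleaner port
# explicit dither down to 8 bit for the encoder
clip = core.resize.Point(clip, format=vs.YUV420P8, dither_type="error_diffusion")
clip.set_output()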
Regarding FFmpeg's (un)sharpen, figuring out which color formats FFmpeg uses should be no struggle. Just go to libavfilter in FFmpeg's source code; there is usually an enum in each filter's source listing the pixel formats that filter supports. Looking at the unsharp filter, there appears to be wide support for color/pixel formats (see line 200: https://github.com/FFmpeg/FFmpeg/blob/ma..._unsharp.c). 10 bit support for unsharp was also released over a year ago, so there is essentially no color loss from conversions for the majority of videos.