Selur's Little Message Board

Full Version: Reinterpolation411-style script insertion
I have tripod-shot DV AVI footage with vertical banding (seemingly from chroma conversion issues with NTSC DV). I have all my other VapourSynth settings for deinterlacing, denoising, and sharpening in place.
Since there is no ReInterpolate411 filter in VapourSynth or AviSynth, an A.I. built the script below to mimic the original C++/DLL code.

# ReInterpolate411-style chroma restoration for NTSC DV 4:1:1
def beforeDeinterlace(clip):
    # Separate planes
    y = core.std.ShufflePlanes(clip, 0, vs.GRAY)
    u = core.std.ShufflePlanes(clip, 1, vs.GRAY)
    v = core.std.ShufflePlanes(clip, 2, vs.GRAY)

    # Reduce to even columns
    u_even = core.std.SelectEvery(u, cycle=2, offsets=[0])
    v_even = core.std.SelectEvery(v, cycle=2, offsets=[0])

    # Resize to full width via horizontal interpolation
    u_interp = core.resize.Bicubic(u_even, width=clip.width, height=clip.height, src_left=0.25)
    v_interp = core.resize.Bicubic(v_even, width=clip.width, height=clip.height, src_left=0.25)

    # Merge back to YUV
    return core.std.ShufflePlanes([y, u_interp, v_interp], [0, 0, 0], vs.YUV)


I'm thinking I'd apply this to all the clips in my library; I'm archiving old home videos.
I inserted it in the custom tab, set to run before deinterlacing.
The A.I. says the script Hybrid generates has errors, so I would appreciate help inserting the script properly (or confirming it).
This may be my last message before the A.I. drives me to hang myself... (not literally, maybe)
I'm open to any best-practice filtering workflow for an optimal result. Everything was shot on a 2000-era Sony miniDV camcorder.
I attached a zoomed snip of the artifact I'm trying to resolve. It's from a 29.97 fps BFF NTSC source converted to an output in Hybrid (QTGMC, denoise, sharpen). The irony is that the bands only seem to show in static footage (like tripod shots). Maybe indoor lighting plays a part.

I'm open to any suggestions, either on how to properly insert the script into the workflow or on any other filtering that would be more appropriate.

Once I have final success, I'd be happy to post the workflow I've spent two months' time compiling. Maybe I can save someone some pain (or show what not to do). I've also created subtitles from the DV subcode datecode.

A bit off topic: I need to change some subcode. DVDate mishandles this, and I cannot find any other app that can perform the change. I'd appreciate any other apps, ideas, or experiences on that.
Quote:The A.I. says the script Hybrid generates has errors, so I would appreciate help inserting the script properly (or confirming it).
Care to elaborate? Without seeing the complete code Hybrid uses, my guess is that your code generator messed up.

Looking at your code:
(I assume this should do some strange 4:1:1 to 4:4:4 upsampling.)
Code:
def beforeDeinterlace(clip):
  # Separate planes; Personally I would use https://www.vapoursynth.com/doc/functions/video/splitplanes.html here
  y = core.std.ShufflePlanes(clip, 0, vs.GRAY)
  u = core.std.ShufflePlanes(clip, 1, vs.GRAY)
  v = core.std.ShufflePlanes(clip, 2, vs.GRAY)

  # Reduce to even columns; No clue why this is used (https://www.vapoursynth.com/doc/functions/video/selectevery.html). SelectEvery selects frames, not columns, so this would drop every second chroma frame; it does not make sense to me. I assume the intention was to do something like SeparateFields?
  u_even = core.std.SelectEvery(u, cycle=2, offsets=[0])
  v_even = core.std.SelectEvery(v, cycle=2, offsets=[0])

  # Resize to full width via horizontal interpolation; That seems fine. https://www.vapoursynth.com/doc/functions/video/resize.html
  u_interp = core.resize.Bicubic(u_even, width=clip.width, height=clip.height, src_left=0.25)
  v_interp = core.resize.Bicubic(v_even, width=clip.width, height=clip.height, src_left=0.25)

  # Merge back to YUV;  That seems fine
  return core.std.ShufflePlanes([y, u_interp, v_interp], [0, 0, 0], vs.YUV)
(Side note: indentation is significant in Python, so best use 'code'-tags when posting code.)
Assuming the syntax is correct, you also need to let Hybrid know that after this function the output is YUV444P8, by adding:
Code:
# colorformat YUV444P8
to the code; otherwise Hybrid won't know that the color format changed.
=> the whole code seems wrong to me

Cu Selur

PS: Machine learning models are not that good at the moment at code generation for anything slightly complex, especially when not provided with all the necessary information. If you can share a sample of the unprocessed source you feed to Hybrid and properly explain what you aim to do, I can probably add a suggestion on what to do.
Hello, and thank you for helping. My goal is archiving in a "playable" format. I was debating tossing the source interlaced footage after muxing in the timecode subtitles, but my better judgement prevailed.

I humbly admit I don't know anything about coding. Yes, I've been over-relying on A.I., to the point of frustration. I'm also new to this level of depth in conversion workflows.

My current workflow is preprocessing individual clips in Hybrid: DV AVI --> QTGMC --> denoise --> sharpen --> output to CQ 14 H.265 .mkv at 59.94 fps (audio passthrough, PCM).
(I'm looking for a "best" average that works for everything, with an exception here and there for correcting over- or underexposure.)

Post-timeline editing is done in Shotcut (concatenation, transitions, and audio to AAC, with the same CQ H.265 output).

During my workflow I ran across clips with vertical banding in the source files and read a thread about ReInterpolate411 and chroma cleanup. The common theme appears to be indoor lighting, and maybe static shots, inducing this effect.

Hybrid doesn't have a preloaded ReInterpolate411 filter, and the A.I. said "sure, I can read the source filter files and provide a VapourSynth equivalent script." And this is where I'm stuck on execution, and it sounds like on script formatting too.


One change I'll be making to my workflow is to export FFV1 lossless from Hybrid and do the H.265 CQ 14 output from Shotcut.
I'm open to modifying existing presets based on your expertise.

I attached source footage, the ReInterpolate411 files, my Hybrid settings, a snippet of the input script, and the resulting VapourSynth script.
Will look at it after work.
Quote:A fast and simple filter to correct the improper 4:1:1 to 4:2:2 conversion that seems to occur with some NTSC DV/4:1:1 codecs. It assumes the odd chroma pixels are duplicates and discards them, replacing them with the average of the two horizontally adjacent even chroma pixels.
source: http://avisynth.nl/index.php/ReInterpolate411
Have you checked that this is the case for your content?
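If it is, a rough VapourSynth take on what that description says could look like the following. This is just an untested sketch of mine (the function name is made up), not the actual DLL code: it drops the odd chroma columns of a 4:2:2 clip and rebuilds them as the average of their even neighbours.
Code:
import vapoursynth as vs
core = vs.core

def reinterp411_sketch(clip):
    # expects the broken YUV 4:2:2 clip (4:1:1 upsampled with duplicated columns)
    y, u, v = core.std.SplitPlanes(clip)

    def fix(plane):
        # keep only the even columns: point resampling at half width,
        # src_left=-0.5 centers the sampling grid on columns 0, 2, 4, ...
        half = core.resize.Point(plane, width=plane.width // 2,
                                 height=plane.height, src_left=-0.5)
        # bilinear back to full width: each rebuilt odd column is the average
        # of its two even neighbours, as the filter description states
        return core.resize.Bilinear(half, width=plane.width,
                                    height=plane.height, src_left=0.25)

    return core.std.ShufflePlanes([y, fix(u), fix(v)], [0, 0, 0], vs.YUV)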

Quote:Hybrid doesn't have a preloaded ReInterpolate411 filter,
That is correct; it would use normal conversions from 4:1:1 to whatever subsampling is needed.
Quote:During my workflow I ran across clips with vertical banding in the source files and read a thread about ReInterpolate411 and chroma cleanup. The common theme appears to be indoor lighting, and maybe static shots, inducing this effect.
If you can share said clip, I can take a look at it.

About your function:
Instead of:
Code:
# Separate planes
y = core.std.ShufflePlanes(clip, 0, vs.GRAY)
u = core.std.ShufflePlanes(clip, 1, vs.GRAY)
v = core.std.ShufflePlanes(clip, 2, vs.GRAY)

# Reduce to even columns
u_even = core.std.SelectEvery(u, cycle=2, offsets=[0])
v_even = core.std.SelectEvery(v, cycle=2, offsets=[0])

# Resize to full width via horizontal interpolation
u_interp = core.resize.Bicubic(u_even, width=clip.width, height=clip.height, src_left=0.25)
v_interp = core.resize.Bicubic(v_even, width=clip.width, height=clip.height, src_left=0.25)

# Merge back to YUV
return core.std.ShufflePlanes([y, u_interp, v_interp], [0, 0, 0], vs.YUV)
better use:
Code:
# colorformat YUV444P8

# Separate planes
[y, u, v] = core.std.SplitPlanes(clip)
# Reduce to even columns
u_even = core.std.SelectEvery(u, cycle=2, offsets=[0])
v_even = core.std.SelectEvery(v, cycle=2, offsets=[0])

# Resize to full width via horizontal interpolation
u_interp = core.resize.Bicubic(u_even, width=clip.width, height=clip.height, src_left=0.25)
v_interp = core.resize.Bicubic(v_even, width=clip.width, height=clip.height, src_left=0.25)

# Merge back to YUV
return core.std.ShufflePlanes([y, u_interp, v_interp], [0, 0, 0], vs.YUV)
But this only makes sense if, in your source, "the odd chroma pixels are duplicates"; otherwise your chroma will be wrong.
Here's an example where I took a normal 4:1:1 DV file and applied that code: https://imgsli.com/MzkwMjY2

=> You probably should not use this or similar code.

Reading http://avisynth.nl/index.php/ReInterpolate411 again, you might notice that this is not meant to be used on 4:1:1 content, but it is meant to be used on YUY2/YUV4:2:2 content where the 4:1:1 => 4:2:2 conversion was done improperly.
Side note: one could use ReInterpolate411.dll by switching to AviSynth 32-bit (requires an add-on) in Hybrid and then loading and applying that filter in a custom script.

Cu Selur
Just noticed that 2003.12.10_19.27.17_1.zip contains a sample => applying ReInterpolate411 is wrong for your source.
For filtering try something like:
Code:
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_BOTTOM) # bff
# adjusting color space from YUV411P8 to YUV444P16 for vsQTGMC
clip = core.resize.Bicubic(clip=clip, format=vs.YUV444P16, range_s="limited")
# Deinterlacing using QTGMC
clip = qtgmc.QTGMC(Input=clip, Preset="Fast", TFF=False, opencl=True) # new fps: 59.94
# Making sure content is perceived as frame-based
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # progressive
# denoising using KNLMeansCL
clip = denoise.KNLMeansCL(clip=clip, d=0) # maybe increase strength from the default 1.2
# contrast sharpening using CAS
clip = core.cas.CAS(clip=clip, sharpness=0.700) # maybe lower sharpness to 0.5
# applying dehalo using YAHR
clip = dehalo.YAHR(clip, depth=16)
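For orientation: in the full script Hybrid generates, those filter lines sit between the source loading and the output. Roughly like this (a sketch only, not the exact generated code; the path is a placeholder, and Hybrid pulls qtgmc/denoise/dehalo from its own bundled modules):
Code:
import vapoursynth as vs
core = vs.core
# Hybrid's imports of its bundled wrapper modules (qtgmc, denoise, dehalo) would go here
clip = core.lsmas.LWLibavSource(source=r"C:\clips\sample.avi")  # placeholder path
# ... the filter lines from above ...
clip.set_output()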
Quote:Reading http://avisynth.nl/index.php/ReInterpolate411 again, you might notice that this is not meant to be used on 4:1:1 content, but it is meant to be used on YUY2/YUV4:2:2 content where the 4:1:1 => 4:2:2 conversion was done improperly.
Oops. Hopefully this thread and the AviSynth 32-bit plugin info can serve someone in need. I had been unable to source a 64-bit ReInterpolate411.

I assume I can apply your script via settings in the Hybrid presets? I haven't imported a script into Hybrid yet and could see myself making mistakes.

Also, does KNLMeansCL need a standalone GPU to function? At the moment I only have the integrated graphics of my i9-13900K, and I recall errors when I initially tried using KNLMeansCL.
Yes, I created the script through Hybrid.
I simply:
  • loaded the source
  • overwrote the scan type to bff
  • configured QTGMC
  • enabled and configured KNLMeans, which requires OpenCL drivers and should work with your onboard GPU (see the snippet after this list)
  • enabled and configured CAS
  • enabled and configured YAHR
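Regarding the GPU question: KNLMeansCL runs on any OpenCL device, not just discrete GPUs. If the Hybrid wrapper gives you trouble, the raw plugin lets you pick the device explicitly. A sketch, not Hybrid's generated call (a=2, s=4, h=1.2 are the plugin defaults; device_type="auto" picks the best available OpenCL device):
Code:
# sketch only: explicit OpenCL device selection with the raw knlm plugin
clip = core.knlm.KNLMeansCL(clip=clip, d=0, a=2, s=4, h=1.2, device_type="auto")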

Cu Selur
Code:
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_BOTTOM) # bff
# adjusting color space from YUV411P8 to YUV444P16 for vsQTGMC
clip = core.resize.Bicubic(clip=clip, format=vs.YUV444P16, range_s="limited")
# Deinterlacing using QTGMC
clip = qtgmc.QTGMC(Input=clip, Preset="Fast", TFF=False, opencl=True) # new fps: 59.94
# Making sure content is perceived as frame-based
clip = core.std.SetFrameProps(clip=clip, _FieldBased=vs.FIELD_PROGRESSIVE) # progressive
# denoising using KNLMeansCL
clip = denoise.KNLMeansCL(clip=clip, d=0) # maybe increase strength from the default 1.2
# contrast sharpening using CAS
clip = core.cas.CAS(clip=clip, sharpness=0.700) # maybe lower sharpness to 0.5
# applying dehalo using YAHR
clip = dehalo.YAHR(clip, depth=16)

I can identify some presets but not all. Could I assume the first four you list are handled automatically by Hybrid once I set the QTGMC preset? I notice you selected "Fast"; would a slower preset be beneficial to quality?
I can identify the last three. For KNLMeansCL, is d=0 a distance setting?