12.03.2024, 21:40
I'm sorry about the flu, I hope you get better soon.
In the meantime I added another boolean parameter, called chroma_resize (default = True).
When this parameter is set to True, the encoding speed increases by about 10% (see table below).
![[Image: attachment.php?aid=2284]](https://forum.selur.net/attachment.php?aid=2284)
The speed increase will not reduce the final output quality, which will be the same as that obtained with chroma_resize = False.
So it is safe to enable this parameter by default.
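For reference, the chroma_resize idea can be sketched roughly like this: colorization runs at the (smaller) model inference size, and only the chroma planes are upscaled back and paired with the original full-resolution luma, so the expensive step works on fewer pixels. This is a minimal numpy sketch of the concept; `apply_chroma_resize` is a hypothetical helper for illustration, not part of ddeoldify():

```python
import numpy as np

def apply_chroma_resize(luma_full, chroma_small):
    """Pair full-resolution luma with low-resolution colorized chroma,
    upscaled here with a simple nearest-neighbour lookup."""
    H, W = luma_full.shape
    h, w, _ = chroma_small.shape
    # map every full-res pixel to its nearest low-res chroma sample
    ys = np.arange(H) * h // H
    xs = np.arange(W) * w // W
    chroma_up = chroma_small[ys[:, None], xs[None, :], :]
    # stack into an (H, W, 3) YUV-like frame
    return np.dstack([luma_full, chroma_up])

# tiny demo: 4x4 luma, 2x2 colorized chroma
y = np.arange(16, dtype=np.float32).reshape(4, 4)
uv = np.full((2, 2, 2), 0.5, dtype=np.float32)
frame = apply_chroma_resize(y, uv)
```

Since the human eye is much less sensitive to chroma resolution than to luma resolution, this kind of shortcut usually costs nothing visible, which matches the claim that the output quality is unchanged.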
Happy recovery!
Dan
I also added more explanations in ddeoldify(); now all the parameters are explained.
Dan
def ddeoldify(
    clip: vs.VideoNode, model: int = 0, render_factor: int = 24, sat: list = [1.0, 1.0], hue: list = [0.0, 0.0],
    dd_model: int = 1, dd_render_factor: int = 24, dd_tweak_luma_bind: list = [False, 0.0, 0.0], dd_bright: float = 0, dd_cont: float = 1, dd_gamma: float = 1.0,
    dd_method: int = 2, dd_method_params: list = [0.5, 0.6, 0.15, 0.2], chroma_resize: bool = True, device_index: int = 0, n_threads: int = 8, dd_num_streams: int = 1,
    torch_hub_dir: str = model_dir
) -> vs.VideoNode:
    """A Deep Learning based project for colorizing and restoring old images and video
    :param clip:               clip to process, only RGB24 format is supported.
    :param model:              deoldify model to use (default = 0):
                                   0 = ColorizeVideo_gen
                                   1 = ColorizeStable_gen
                                   2 = ColorizeArtistic_gen
    :param render_factor:      render factor for the model, range: 10-44 (default = 24).
    :param sat:                list with the saturation parameters to apply to the color models (default = [1, 1])
                                   [0] : saturation for deoldify
                                   [1] : saturation for ddcolor
    :param hue:                list with the hue parameters to apply to the color models (default = [0, 0])
                                   [0] : hue for deoldify
                                   [1] : hue for ddcolor
    :param dd_model:           ddcolor model (default = 1):
                                   0 = ddcolor_modelscope
                                   1 = ddcolor_artistic
    :param dd_render_factor:   ddcolor input size, equivalent to render_factor; if 0 it will be auto-selected
                               (default = 24), range: [0, 10-64]
    :param dd_tweak_luma_bind: parameters for the luma-constrained ddcolor preprocess
                                   [0] : luma_constrained_tweak -> luma-constrained ddcolor preprocess enabled (default = False), range: [True, False]
                                         when enabled, the average luma of a video clip will be forced not to fall below the value
                                         defined by the parameter "luma_min". The function also allows adjusting the gamma
                                         of the clip if the average luma is below the parameter "gamma_luma_min"
                                   [1] : luma_min -> minimum luma (%) for tweak activation (default = 0, no activation), range: [0-1]
                                   [2] : gamma_luma_min -> minimum luma (%) for gamma tweak activation (default = 0, no activation), range: [0-1]
    :param dd_bright:          ddcolor tweak's brightness (default = 0)
    :param dd_cont:            ddcolor tweak's contrast (default = 1)
    :param dd_gamma:           ddcolor tweak's gamma (default = 1.0)
    :param dd_method:          method used to combine deoldify with ddcolor (default = 2):
                                   0 : deoldify only (no merge)
                                   1 : ddcolor only (no merge)
                                   2 : Simple Merge:
                                       the images are combined using a weighted merge, where the parameter clipb_weight
                                       represents the weight assigned to the colors provided by ddcolor()
                                   3 : Adaptive Luma Merge:
                                       given that ddcolor() performs quite badly on dark scenes, the images are
                                       combined by decreasing the weight assigned to ddcolor() when the luma is
                                       below the threshold given by luma_threshold.
                                       For example, with luma_threshold = 0.6 the weight assigned to ddcolor() will
                                       start to decrease linearly when the luma < 60%, down to "min_weight"
                                   4 : Constrained Chroma Merge:
                                       given that the colors provided by deoldify() are more conservative and stable
                                       than the colors obtained with ddcolor(), the images are combined by limiting
                                       the difference in chroma values between deoldify() and ddcolor(); this limit
                                       is defined by the parameter chroma_threshold. The limit is applied to the
                                       image converted to "YUV". For example, with threshold = 0.1 the chroma
                                       values "U", "V" of the ddcolor() image will be constrained to an absolute
                                       percentage difference with respect to the "U", "V" provided by deoldify() of no more than 10%
    :param dd_method_params:   list with the parameters to apply to the selected dd_method:
                                   [0] : clipb_weight (%), used by: SimpleMerge, AdaptiveLumaMerge, ConstrainedChromaMerge, range: [0-1]
                                   [1] : luma_threshold (%), used by: AdaptiveLumaMerge, range: [0-1]
                                   [2] : min_weight (%), used by: AdaptiveLumaMerge, range: [0-1]
                                   [3] : chroma_threshold (%), used by: ConstrainedChromaMerge, range: [0-1]
    :param chroma_resize:      if True, chroma resize is enabled: the colorization will be applied to a clip with the same
                               size used for the models' inference, but the final resolution will be that of the original clip.
    :param device_index:       device ordinal of the GPU, choices: GPU0...GPU7, CPU = 99 (default = 0)
    :param n_threads:          number of threads used by numpy, range: 1-32 (default = 8)
    :param dd_num_streams:     number of CUDA streams used to enqueue the kernels (default = 1)
    :param torch_hub_dir:      torch hub dir location, default is the model directory;
                               if set to None it will switch to the torch cache dir.
    """
Dan