Posts: 121
	Threads: 6
	Joined: Aug 2022
	
	
 
	
	
Looks great, that's a clever idea to borrow the colors in such a way.
I tried to get DeepRemaster going with your instructions, just to try it out for fun, but I'm not having much success there.
Selur, is there a chance to get a Hybrid.exe version of it, as you've done for DeepDeinterlace? :-)
	
	
	
	
	
 
 
	
	
	
		
	Posts: 12.057
	Threads: 66
	Joined: May 2017
	
	
 
	
		
		
		25.06.2024, 04:22 
(This post was last modified: 25.06.2024, 04:24 by Selur.)
		
	 
	
		Quote: Not having much success there.  
Where is the problem?
Did you download the remasternet.pth.tar?
Cu Selur
	
 
	
	
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
	
	
 
 
	
	
	
		
	Posts: 121
	Threads: 6
	Joined: Aug 2022
	
	
 
	
	
Yes, I have placed everything in the right folders, including remasternet.pth.
What is confusing to me is what to do next. I drop the sample video test_green_bw.mp4 into the Hybrid Base tab, and what is the next step? Do I go to vsViewer and open the vsremaster_test_green.vpy script? Or do I drop the script directly onto the Hybrid Base tab? You've made everything so easy with a few clicks that we've been spoiled, lol. If the steps are too complicated to walk through, involving a deeper understanding of commands, I'll understand if you skip it.
	
	
	
	
	
 
 
	
	
	
		
	Posts: 12.057
	Threads: 66
	Joined: May 2017
	
	
 
	
	
Using a custom section and adding:
# requires colorformat RGB24
# requires luma pc
from vsremaster import remaster_colorize
clip = remaster_colorize(clip=clip, length=2, render_vivid=False, ref_buffer_size=10, ref_dir=r"g:\Temp")
(the path of 'ref_dir' needs to be adjusted)
should work.
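A small aside on the ref_dir=r"g:\Temp" argument above: the r prefix matters for Windows paths, because without it sequences like "\t" are parsed as escape characters. A minimal pure-Python illustration (no VapourSynth needed; the path is just an example):

```python
# The r prefix makes a "raw" string: backslashes are kept literally,
# which is what you want for a Windows path passed as ref_dir.
plain = "g:\temp"   # "\t" is parsed as a TAB character -> 6 chars
raw = r"g:\temp"    # backslash preserved               -> 7 chars

print(len(plain), len(raw))  # 6 7
print("\t" in plain)         # True: the path silently contains a tab
print("\\" in raw)           # True: the raw string keeps the real backslash
```

So when adjusting the path in the custom section, either keep the r prefix or double the backslashes.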
Cu Selur
	
 
	
	
	
	
 
 
	
	
	
		
	Posts: 987
	Threads: 81
	Joined: Feb 2020
	
	
 
	
	
Hello Selur,
I noted that in the function HAVC_deepex(), the boolean parameters "dark" and "render_vivid" are being passed strings, as shown in the code below:

clip = HAVC_deepex(clip=clip, clip_ref=clipRef, render_speed="slow", render_vivid="False", ref_merge=0, dark="True", smooth=True)

Unfortunately Python, which is not a strongly typed language, doesn't raise any warning...
Dan
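Dan's point can be seen directly in the interpreter: any non-empty string is truthy, so render_vivid="False" silently behaves like True wherever the code branches on it. A short sketch (the require_bool helper is hypothetical, not part of HAVC):

```python
# Any non-empty string is truthy, so "False" behaves like True
# wherever the code does `if render_vivid: ...`.
print(bool("False"))  # True  (non-empty string)
print(bool(""))       # False (only the empty string is falsy)

# A hypothetical guard a library could add to catch the mistake early:
def require_bool(name, value):
    if not isinstance(value, bool):
        raise TypeError(f"{name} must be a bool, got {type(value).__name__}")

require_bool("smooth", True)               # fine
try:
    require_bool("render_vivid", "False")  # the bug from the post above
except TypeError as exc:
    print(exc)  # render_vivid must be a bool, got str
```

An isinstance check like this is one cheap way for a library to fail loudly instead of silently misbehaving.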
	
 
	
	
	
	
 
 
	
	
	
		
	Posts: 12.057
	Threads: 66
	Joined: May 2017
	
	
 
	
	
Uploaded a new deoldify test, which should fix the problem.
Cu Selur
	
	
	
	
	
 
 
	
	
	
		
	Posts: 987
	Threads: 81
	Joined: Feb 2020
	
	
 
	
	
		Problem fixed.
Thanks,
Dan
	
	
	
	
	
 
 
	
	
	
		
	Posts: 987
	Threads: 81
	Joined: Feb 2020
	
	
 
	
	
Added a maintenance release:
https://github.com/dan64/vs-deoldify/rel...tag/v4.0.1
Apart from code clean-up and bug fixes, I added the utility function:
HAVC_extract_reference_frames
It is just a utility function that can be used by more expert users.
It is not strictly related to HAVC and does not need to be added to Hybrid.
Dan
	
 
	
	
	
	
 
 
	
	
	
		
	Posts: 12.057
	Threads: 66
	Joined: May 2017
	
	
 
	
	
		Thanks for the info.
	
	
	
	
	
 
 
	
	
	
		
	Posts: 987
	Threads: 81
	Joined: Feb 2020
	
	
 
	
	
Good news on the ColorMNet side: the authors are working on improving their methodology.
I think the most important improvement is the decision to use a large pretrained visual model guided feature estimation (PVGFE) module.
Moreover, from a development point of view, they decided to move from TensorFlow to PyTorch; this switch should simplify the porting to Hybrid.
More info here:
https://github.com/yyang181/NTIRE23-VIDE.../issues/12
Dan