Deoldify Vapoursynth filter
Looks great, that's a clever idea to borrow the colors in such a way.
I tried to get DeepRemaster going with your instructions, just to try it out for fun, but I'm not having much success.
Selur, is there a chance to get a hybrid.exe version of it, as you did for deepdeinterlace? :-)
Quote: Not having much success there.
Where is the problem?
Did you download remasternet.pth.tar?

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Yes, I have placed everything in the right folders, including remasternet.pth.
What is confusing to me is what to do next. I drop the sample video test_green_bw.mp4 into Hybrid's Base tab, and then what is the next step? Is it to go to vsViewer and open the vsremaster_test_green.vpy script? Or is it to drop the script directly onto Hybrid's Base tab? You've made everything so easy with a few clicks that we've been spoiled, lol. If the steps are too complicated to walk through and involve a deeper understanding of the commands, I'll understand if you skip this.
Using a custom section and adding:
# requires colorformat RGB24
# requires luma pc
from vsremaster import remaster_colorize
clip = remaster_colorize(clip=clip, length=2, render_vivid=False, ref_buffer_size=10, ref_dir=r"g:\Temp")
(the path in 'ref_dir' needs to be adjusted)
should work.
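The two "# requires" comments mean the clip must already be full-range ("pc" luma) RGB24 when it reaches the filter; Hybrid handles that conversion when the comments are present. In a standalone .vpy script the conversion could look like the following sketch. This is an assumption-laden fragment, not Hybrid's actual generated code: `matrix_in_s="709"` presumes an HD source (SD material would typically use "170m"), and the `ref_dir` path is a placeholder.

```python
# Fragment for a standalone .vpy script; assumes 'clip' already holds a YUV clip.
import vapoursynth as vs
core = vs.core

# Convert limited-range YUV to full-range RGB24, as the filter requires.
# matrix_in_s="709" is an assumption about the source; adjust to your material.
clip = core.resize.Bicubic(clip, format=vs.RGB24,
                           matrix_in_s="709",
                           range_in_s="limited", range_s="full")

from vsremaster import remaster_colorize
clip = remaster_colorize(clip=clip, length=2, render_vivid=False,
                         ref_buffer_size=10, ref_dir=r"g:\Temp")
clip.set_output()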

Cu Selur
Hello Selur,

I noticed that in the function HAVC_deepex(), the boolean parameters "dark" and "render_vivid" are passed as strings, as shown in the code below:

 
clip = HAVC_deepex(clip=clip, clip_ref=clipRef, render_speed="slow", render_vivid="False", ref_merge=0, dark="True", smooth=True)
 

Unfortunately Python, not being a strictly typed language, doesn't raise any warning...
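To illustrate why this goes unnoticed (a minimal sketch; `havc_demo` and `require_bool` are hypothetical stand-ins, not part of vs-deoldify): any non-empty string in Python is truthy, so the string "False" silently behaves as True wherever the parameter is evaluated as a boolean.

```python
def havc_demo(render_vivid=False, dark=False):
    # Hypothetical stand-in for a function that evaluates its
    # parameters as booleans: bool("False") is True, because
    # "False" is a non-empty string.
    return {"render_vivid": bool(render_vivid), "dark": bool(dark)}

# Correct call with real booleans:
print(havc_demo(render_vivid=False)["render_vivid"])    # False
# Buggy call with strings: the flag is silently flipped on.
print(havc_demo(render_vivid="False")["render_vivid"])  # True

def require_bool(name, value):
    # Defensive check that would surface the mistake immediately.
    if not isinstance(value, bool):
        raise TypeError(f"{name} must be bool, got {type(value).__name__}")
```

A check like `require_bool` at the top of the function would turn the silent misbehavior into an immediate TypeError.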

Dan
Uploaded a new deoldify test, which should fix the problem.

Cu Selur
Problem fixed.

Thanks,
Dan
Added maintenance release: https://github.com/dan64/vs-deoldify/rel...tag/v4.0.1

Apart from code clean-up and bug fixing, I added the utility function HAVC_extract_reference_frames.
It is just a utility function intended for more expert users.

It is not strictly related to HAVC and does not need to be added in Hybrid.

Dan
Thanks for the info.
Good news on the ColorMNet side.

The authors are working to improve their methodology.

I think the most important improvement is the decision to use a large pre-trained visual model guided feature estimation (PVGFE) module.
Moreover, from a development point of view, they decided to move from TensorFlow to PyTorch; this switch should simplify the porting to Hybrid.

More info here: https://github.com/yyang181/NTIRE23-VIDE.../issues/12

Dan