Deoldify Vapoursynth filter
20.03.2024, 21:22

I will also add an option to repeat the average more times, but I'm thinking of limiting the maximum number of repetitions to 3.

Dan
I uploaded a new dev version (same link) which fixes the SplitYUV preview; I think it can really help to see the UV problems.

Enable 'Filtering->Compare view' and set 'Filtering->Vapoursynth->Misc->Preview->Split Compare View' to 'splitYUV'.

Cu Selur

Ps.: Updated again and added 'splitYUV&interleaved' as a 'Split Compare View' option.
PPs.: Attached a reencode where I used dpir_deblock (to address the blocking in the chroma planes) and Spotless (trying to lessen the flickering in the chroma planes), and resized to 1024x... (1000 seems to cause problems with NVEncC).
---- 
	
	
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page. 
		
		
21.03.2024, 22:13

I will test it.

In the meanwhile I discovered that "AverageFrames" is now available as a standard plugin, so it can be called as:

clip = core.std.AverageFrames

In the new release I will switch to the "std" version; no need to load the plugin "MiscFilters.dll".

Dan
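For each output pixel, AverageFrames computes a weighted sum of the co-located pixels in the neighbouring frames, divided by the sum of the weights. A minimal pure-Python sketch of that per-pixel arithmetic (illustration only: the real `core.std.AverageFrames` operates on whole clips, and the frame/weight values below are made up):

```python
# Pure-Python sketch of the per-pixel math behind AverageFrames.
def average_frames(frames, weights):
    """Weighted temporal average of co-located pixel values.

    frames:  list of equally sized pixel lists, one per temporal neighbour
    weights: one weight per frame; the result is normalized by their sum
    """
    total = sum(weights)
    return [
        round(sum(w * f[i] for w, f in zip(weights, frames)) / total)
        for i in range(len(frames[0]))
    ]

# Three neighbouring frames, uniform weights -> plain temporal mean.
prev, cur, nxt = [10, 20], [20, 40], [30, 60]
print(average_frames([prev, cur, nxt], [1, 1, 1]))  # [20, 40]
```

In a Vapoursynth script the equivalent call would simply be `clip = core.std.AverageFrames(clip, weights=[1, 1, 1])`, with no MiscFilters.dll load needed.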
		
		
		21.03.2024, 22:55 
		
	 
	
		Nice, didn't know that AverageFrames was included in the core filters now.  
	
	
Cu Selur
		
		
21.03.2024, 23:00

More good news.

I was finally able to properly calculate the average of frames using Vapoursynth, with this code:

def clip_color_stabilizer(clip: vs.VideoNode = None, nframes: int = 5, smooth_type: int = 0) -> vs.VideoNode:

I calculated the images using 4 frames; this average should be equivalent to a "left" average using 7 frames. Here is the comparison: https://imgsli.com/MjQ5MTQ2

The average improved.

Dan
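One plausible reading of the 4-frame vs. 7-frame equivalence: applying a left-looking 4-frame average twice is the same as one pass whose kernel is the convolution of the two kernels, and that combined kernel reaches back 6 frames, i.e. a weighted "left" average over 7 frames. A small sketch of the kernel arithmetic (the actual body of `clip_color_stabilizer` is not shown in the post, so this only illustrates the idea):

```python
# Composing two averaging passes: the combined pass uses the
# convolution of the two kernels.  Kernel index i = "i frames back".
def convolve(k1, k2):
    out = [0.0] * (len(k1) + len(k2) - 1)
    for i, a in enumerate(k1):
        for j, b in enumerate(k2):
            out[i + j] += a * b
    return out

four_tap = [0.25, 0.25, 0.25, 0.25]      # uniform left-looking 4-frame average
combined = convolve(four_tap, four_tap)  # effect of applying it twice

print(len(combined))                      # 7 -> spans 7 frames back to t-6
print([round(w * 16) for w in combined])  # triangular weights [1,2,3,4,3,2,1]
```

So the repeated 4-frame average is not a uniform 7-frame mean but a triangularly weighted one, which favours frames close to the current one.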
		
		
22.03.2024, 19:38

More good news.

I found this project: deoldify-onnx. The interesting thing is that it is possible to convert the onnx model to fp16 with this code:

from onnxmltools.utils.float16_converter import convert_float_to_float16

Unfortunately the available model "deoldify.onnx" was built assuming a frame size of 256x256 (render_factor=16). The images colored with this model are very de-saturated, but it could be useful to stabilize "ddcolor". I will test it...

Dan
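The fp16 conversion casts the model's float32 tensors to IEEE 754 half precision, roughly halving the file size at the cost of precision per weight. The per-value effect can be sketched with the standard library alone (Python's `struct` format `'e'` is half precision); this is an illustration of what happens to each weight, not of the onnxmltools API itself:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a float through IEEE 754 half precision,
    mimicking what the fp16 conversion does to each stored weight."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_fp16(0.5))  # 0.5 (exactly representable in fp16)
print(to_fp16(0.1))  # ~0.0999755859375 (rounded to a 10-bit mantissa)
```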
		
		
		22.03.2024, 20:30 
		
	 
	
		Yeah, there is a deoldify .pth model for VSGAN too (https://openmodeldb.info/models/4x-Deoldify), but it isn't really useful either. 
	
	
Cu Selur 
		
		
22.03.2024, 20:37

To be able to use onnx in Hybrid, I had to run:

.\python -m pip install onnxruntime-gpu

I'm going to test it.

Dan
		
		
		22.03.2024, 20:52 
		
	 
	
		The vs-mlrt addon uses onnx models. 
	
	
Cu Selur 
		
		
22.03.2024, 21:35

Ok, I implemented and tested the onnx version.

It can be used with the following command:

from vsdeoldify import ddeoldify_onnx

The speed is good, but the quality is bad, so I'm not going to release it. If you are curious, you can download the onnx version here: vsdeoldify-2.0.0_onnx.zip

Dan