
Deoldify Vapoursynth filter
It was released 6 hrs ago. Smile
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
Wow! You are very fast, but I'm faster: I already released a new version: https://github.com/dan64/vs-deoldify/rel...tag/v3.5.3

I added the option "MoreVivid" to VideoTune.

There is no urgency to release a new version; I added this option mainly to keep the symmetry in the range of options.
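
For anyone wanting to try it, here is a minimal VapourSynth sketch of how the option would be used. The function name (ddeoldify_main), the VideoTune parameter and the source path are assumptions based on this post, so check the vs-deoldify README for the exact API of the installed version:

# Minimal sketch of a VapourSynth script using the new option.
# NOTE: ddeoldify_main, the VideoTune parameter and the source file are
# assumptions -- consult the vs-deoldify README for the real signature.
import vapoursynth as vs
core = vs.core

import vsdeoldify as havc

clip = core.lsmas.LWLibavSource(source="input_bw.mkv")                # B&W source (hypothetical path)
clip = core.resize.Bicubic(clip, format=vs.RGB24, matrix_in_s="709")  # the models work on RGB

# VideoTune selects the stability/vividness trade-off; "MoreVivid" is the
# value added in v3.5.3.
colored = havc.ddeoldify_main(clip, VideoTune="MoreVivid")

colored = core.resize.Bicubic(colored, format=vs.YUV420P8, matrix_s="709")
colored.set_output()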

Meanwhile I discovered that in the NTIRE 2023 Video Colorization Challenge the benchmark to beat was DeOldify.

The winner of the challenge has not released any code yet, but in case it is made available, I could try to create a new Vapoursynth filter for it.

Dan
Reply
The next release will probably be at the end of May or in June.
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
Uploaded a new test version which supports "MoreVivid".

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
I tested the winner of the NTIRE 2023 Video Colorization Challenge (code here: BiSTNet).

Unfortunately, the results were not as good as expected.

The approach followed by this model is to use a reference frame to map the objects' colors and then try to keep temporal consistency.

Here is a simple comparison:

1) Start frame (for BiSTNet this is an external reference image)

[Image: attachment.php?aid=2462]

In this case the true starting frame was used as reference, while DDeoldify had to guess the colors.

2) Frame 25

[Image: attachment.php?aid=2463]

In this case (due to temporal consistency) the pants of the boy on the right start to turn red (like the girl's skirt).

3) Frame 80

[Image: attachment.php?aid=2464]

In this case the pants of the boy on the left start to turn red (like the girl's skirt).

4) Frame 125

[Image: attachment.php?aid=2465]

In this case the girl's skirt starts to turn green (like the girl's pants).

In summary it is not so good; the DDeoldify(Stable) colors are less rich but more stable.

Moreover, it is very slow: only 0.15 fps.

Dan


Reply
So it is probably more suited to fixing color loss in VHS transfers (where you only have a few frames without colors).
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
The previous comparison was unfair, because a ground-truth image was used as the reference image.
In a real situation the "ground truth" image is not available and cannot be provided manually inside an automatic video colorization process.

The most realistic solution is to provide, at every scene change, a reference image generated by another automatic image colorization model.

In this case the best candidate for providing the reference images is DDColor.
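
To make the idea concrete, here is a rough Python sketch of that pipeline. The names ddcolor_model, bistnet_model, colorize and is_scene_change are hypothetical placeholders, not the real DDColor or BiSTNet APIs:

# Conceptual sketch of the "reference at every scene change" pipeline.
# All model objects and method names below are hypothetical placeholders.
def colorize_video(frames, is_scene_change, ddcolor_model, bistnet_model):
    """frames: grayscale frames in order; is_scene_change(i) -> True on a new scene."""
    reference = None
    output = []
    for i, frame in enumerate(frames):
        if reference is None or is_scene_change(i):
            # At every scene change a new reference is produced by an automatic
            # image colorization model (DDColor in this comparison).
            reference = ddcolor_model.colorize(frame)
        # The exemplar-based model (BiSTNet here) propagates the reference colors
        # while trying to keep temporal consistency within the scene.
        output.append(bistnet_model.colorize(frame, reference))
    return output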

Given that the comparison with "DDeoldify(Stable)" was already provided, I will show the comparison with "DDColor(rf=32)".

1) Start frame (for BiSTNet the DDColor image was provided as reference)

[Image: attachment.php?aid=2466]

In this case the two images are the same by construction.

2) Frame 25 

[Image: attachment.php?aid=2467]

In this case BiSTNet is stable, while in DDColor the boys' shirts are starting to turn green.

3) Frame 80 

[Image: attachment.php?aid=2468]

Even in this case BiSTNet is stable, while in DDColor the boy's shirt has become green and his trousers are almost blue.

4) Frame 113

[Image: attachment.php?aid=2469]

In this case BiSTNet is still stable, while in DDColor the girl's skirt has also turned blue.

This example shows that BiSTNet could improve the stability of the images provided by DDColor when the DDColor images are used (only) as reference, but the comparison also provides a clear example of DDColor's instability.

Comparing the BiSTNet images obtained in this example with the "DDeoldify(Stable)" images provided previously, it is possible to see that (apart from the choice of colors) both show good color stability.
But "DDeoldify(Stable)" is about 33x faster than BiSTNet (roughly 5 fps vs 0.15 fps), which is too slow and in practice cannot be used.

The author wrote that he will provide a new version called ColorMNet that should be faster and less memory-hungry; let's see...

Dan


Reply
The NTIRE 2024 Video Colorization Challenge shouldn't be far away either... iirc, NTIRE was always in the summer.
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply
(Are there plans for a direct implementation of DeOldify into Hybrid?)
Reply
More directly than through the torch-addon: No.
(see https://selur.de/downloads/ for a link to the GoogleDrive where the addons are available for download; read the README.md on how to install the packages)

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.
Reply

