DeOldify VapourSynth filter
#1
Hello Selur,

   I have just completed the development of a VapourSynth filter for DeOldify.
   As you probably already know, DeOldify is a deep-learning-based project for colorizing and restoring old images and videos.

   Currently it is possible to use DeOldify via Jupyter or Stable Diffusion.

   With this implementation it is now possible to use it directly in VapourSynth.

   To use this filter it is necessary to install fastai version 1.0.60 with the command

python -m pip install fastai==1.0.60

  DeOldify is delivered with its own version of fastai, so this installation is needed "only" to install the dependencies.
  After the installation of fastai it is necessary to delete it from "Hybrid\64bit\Vapoursynth\Lib\site-packages" to avoid conflicts with the DeOldify version (it seems that this problem arises only when DeOldify is inside a package).
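  Instead of deleting the folder by hand, pip can also be used to remove just the fastai package while keeping all of its dependencies installed. A possible command (run with Hybrid's Python from the Hybrid\64bit\Vapoursynth folder):

python -m pip uninstall fastai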
 
  To install the filter, unzip the attached file vsdeoldify-1.0.0.zip into "Hybrid\64bit\Vapoursynth\Lib\site-packages".

  I have not included the models. They must be downloaded from: https://github.com/jantic/DeOldify
  and installed in the folder: "Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\models"

  For your convenience, here are the links:

   ColorizeArtistic_gen.pth
   ColorizeStable_gen.pth
   ColorizeVideo_gen.pth

   I have also attached the file Example.zip with a simple example.

  The usage is the following:

from vsdeoldify import ddeoldify
clip = ddeoldify(clip, model, render_factor, device_index, post_process)
where
- clip:  Clip to process. Only RGB24 format is supported.
- model: Model to use (default 0).
          0 = ColorizeVideo_gen
          1 = ColorizeStable_gen
          2 = ColorizeArtistic_gen
- render_factor: render factor for the model.
- device_index:  device ordinal of the GPU; choices: GPU0...GPU7, CPU=99 (default 0)
- post_process:  takes advantage of the fact that the human eye is much less sensitive to imperfections in chrominance than in luminance; enabling it lowers the memory usage (default True)
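
  For reference, a minimal VapourSynth script using the filter could look roughly like the sketch below. The source path, the source filter (LWLibavSource) and the matrix/range values are only placeholders and must be adapted to your own clip.

import vapoursynth as vs
core = vs.core
from vsdeoldify import ddeoldify

# hypothetical source load; any source filter returning a YUV clip works
clip = core.lsmas.LWLibavSource(r"D:\old_movie.mkv")
# ddeoldify expects RGB24; matrix_in_s/range_s must match the actual source
clip = core.resize.Bicubic(clip=clip, format=vs.RGB24, matrix_in_s="470bg", range_s="limited")
clip = ddeoldify(clip, model=0, render_factor=21, device_index=0, post_process=True)
# back to YUV for previewing/encoding
clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P8, matrix_s="470bg", range_s="limited")
clip.set_output()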

  This is a draft version. It would be nice if you could install it on your side to check whether it works on your installation. Smile

Thanks,
Dan


Attached Files
.zip   Example.zip (Size: 222,88 KB / Downloads: 19)
.zip   vsdeoldify-1.0.0.zip (Size: 239,67 KB / Downloads: 15)
#2
Nice.
Sadly, I'm busy today and tomorrow, but I'll test it and report back on Thursday after work.
How does it compare to ddcolor?

iirc render_factor slowed down deoldify quite a bit, but helped with stabilizing the colors (iirc one needed to use 20+ to get usable results).
Do you have any experience with this? (It's quite a while since I last used deoldify.)

Cu Selur
#3
(27.02.2024, 18:37)Selur Wrote: How does it compare to ddcolor?

  It depends on the model used: the Video model in "deoldify" has been calibrated to be flicker-free on videos. Unfortunately, in "ddcolor" all the models show some flickering.
  The colors in "ddcolor" are more saturated, while "deoldify" is more conservative and its colors are desaturated in order to avoid flickering.
  The two models behave differently, and maybe the best result can be reached by using them in combination (see Merge() and the sketch below).
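
  A rough sketch of such a combination, assuming a ddcolor VapourSynth wrapper that exposes a ddcolor() function taking the same RGB24 clip (module and function names, as well as the expected input format, may differ in the actual package):

import vapoursynth as vs
core = vs.core
from vsdeoldify import ddeoldify
from vsddcolor import ddcolor  # assumed import; adjust to the wrapper you actually use

# clip is assumed to already be RGB24
colored_a = ddeoldify(clip, model=0, render_factor=21)
colored_b = ddcolor(clip)  # defaults used for the sketch

# simple 50/50 blend of the two colorizations; the weight is only a starting point
clip = core.std.Merge(clipa=colored_a, clipb=colored_b, weight=0.5)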

(27.02.2024, 18:37)Selur Wrote: iirc render_factor slowed down deoldify quite a bit, but helped with stabilizing the colors (iirc one needed to use 20+ to get usable results).
Do you have any experience with this? (It's quite a while since I last used deoldify.)

Yes, render_factor controls the number of iterations for the convergence of the colors assigned to the image.
Higher values decrease the speed.
I forgot to specify that a reasonable range of values for render_factor is between 10 and 44, with 21 being a good default.
In terms of speed, render_factor affects deoldify the same way input_size affects "ddcolor".
With render_factor=30, "deoldify" has about the same speed as "ddcolor" with input_size=1024.
#4
Unexpectedly I had a few minutes of time, so I did the following:
cd f:\Hybrid\64bit\Vapoursynth\
python -m pip install fastai==1.0.60
which showed:
Collecting fastai==1.0.60
  Downloading fastai-1.0.60-py3-none-any.whl (237 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 237.3/237.3 kB 7.3 MB/s eta 0:00:00
Collecting bottleneck (from fastai==1.0.60)
  Downloading Bottleneck-1.3.8-cp311-cp311-win_amd64.whl.metadata (8.1 kB)
Collecting fastprogress>=0.2.1 (from fastai==1.0.60)
  Downloading fastprogress-1.0.3-py3-none-any.whl.metadata (5.6 kB)
Collecting beautifulsoup4 (from fastai==1.0.60)
  Downloading beautifulsoup4-4.12.3-py3-none-any.whl.metadata (3.8 kB)
Requirement already satisfied: matplotlib in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (3.7.1)
Collecting numexpr (from fastai==1.0.60)
  Downloading numexpr-2.9.0-cp311-cp311-win_amd64.whl.metadata (8.1 kB)
Requirement already satisfied: numpy>=1.15 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (1.26.2)
Collecting nvidia-ml-py3 (from fastai==1.0.60)
  Downloading nvidia-ml-py3-7.352.0.tar.gz (19 kB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: pandas in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (2.0.1)
Requirement already satisfied: packaging in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (23.2)
Requirement already satisfied: Pillow in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (9.5.0)
Requirement already satisfied: pyyaml in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (6.0)
Requirement already satisfied: requests in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (2.31.0)
Collecting scipy (from fastai==1.0.60)
  Downloading scipy-1.12.0-cp311-cp311-win_amd64.whl.metadata (60 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.4/60.4 kB ? eta 0:00:00
Requirement already satisfied: torch>=1.0.0 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (2.1.1+cu121)
Collecting spacy>=2.0.18 (from fastai==1.0.60)
  Downloading spacy-3.7.4-cp311-cp311-win_amd64.whl.metadata (27 kB)
Requirement already satisfied: torchvision in f:\hybrid\64bit\vapoursynth\lib\site-packages (from fastai==1.0.60) (0.16.1)
Collecting spacy-legacy<3.1.0,>=3.0.11 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading spacy_legacy-3.0.12-py2.py3-none-any.whl.metadata (2.8 kB)
Collecting spacy-loggers<2.0.0,>=1.0.0 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading spacy_loggers-1.0.5-py3-none-any.whl.metadata (23 kB)
Collecting murmurhash<1.1.0,>=0.28.0 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading murmurhash-1.0.10-cp311-cp311-win_amd64.whl.metadata (2.0 kB)
Collecting cymem<2.1.0,>=2.0.2 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading cymem-2.0.8-cp311-cp311-win_amd64.whl.metadata (8.6 kB)
Collecting preshed<3.1.0,>=3.0.2 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading preshed-3.0.9-cp311-cp311-win_amd64.whl.metadata (2.2 kB)
Collecting thinc<8.3.0,>=8.2.2 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading thinc-8.2.3-cp311-cp311-win_amd64.whl.metadata (15 kB)
Collecting wasabi<1.2.0,>=0.9.1 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading wasabi-1.1.2-py3-none-any.whl.metadata (28 kB)
Collecting srsly<3.0.0,>=2.4.3 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading srsly-2.4.8-cp311-cp311-win_amd64.whl.metadata (20 kB)
Collecting catalogue<2.1.0,>=2.0.6 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading catalogue-2.0.10-py3-none-any.whl.metadata (14 kB)
Collecting weasel<0.4.0,>=0.1.0 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading weasel-0.3.4-py3-none-any.whl.metadata (4.7 kB)
Collecting typer<0.10.0,>=0.3.0 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading typer-0.9.0-py3-none-any.whl.metadata (14 kB)
Collecting smart-open<7.0.0,>=5.2.1 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading smart_open-6.4.0-py3-none-any.whl.metadata (21 kB)
Requirement already satisfied: tqdm<5.0.0,>=4.38.0 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from spacy>=2.0.18->fastai==1.0.60) (4.66.1)
Collecting pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading pydantic-2.6.2-py3-none-any.whl.metadata (83 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 84.0/84.0 kB ? eta 0:00:00
Requirement already satisfied: jinja2 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from spacy>=2.0.18->fastai==1.0.60) (3.1.2)
Requirement already satisfied: setuptools in f:\hybrid\64bit\vapoursynth\lib\site-packages (from spacy>=2.0.18->fastai==1.0.60) (67.7.2)
Collecting langcodes<4.0.0,>=3.2.0 (from spacy>=2.0.18->fastai==1.0.60)
  Downloading langcodes-3.3.0-py3-none-any.whl.metadata (29 kB)
Requirement already satisfied: charset-normalizer<4,>=2 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from requests->fastai==1.0.60) (3.1.0)
Requirement already satisfied: idna<4,>=2.5 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from requests->fastai==1.0.60) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from requests->fastai==1.0.60) (2.0.2)
Requirement already satisfied: certifi>=2017.4.17 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from requests->fastai==1.0.60) (2023.5.7)
Requirement already satisfied: filelock in f:\hybrid\64bit\vapoursynth\lib\site-packages (from torch>=1.0.0->fastai==1.0.60) (3.12.0)
Requirement already satisfied: typing-extensions in f:\hybrid\64bit\vapoursynth\lib\site-packages (from torch>=1.0.0->fastai==1.0.60) (4.5.0)
Requirement already satisfied: sympy in f:\hybrid\64bit\vapoursynth\lib\site-packages (from torch>=1.0.0->fastai==1.0.60) (1.12)
Requirement already satisfied: networkx in f:\hybrid\64bit\vapoursynth\lib\site-packages (from torch>=1.0.0->fastai==1.0.60) (3.1)
Requirement already satisfied: fsspec in f:\hybrid\64bit\vapoursynth\lib\site-packages (from torch>=1.0.0->fastai==1.0.60) (2023.5.0)
Collecting soupsieve>1.2 (from beautifulsoup4->fastai==1.0.60)
  Downloading soupsieve-2.5-py3-none-any.whl.metadata (4.7 kB)
Requirement already satisfied: contourpy>=1.0.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from matplotlib->fastai==1.0.60) (1.0.7)
Requirement already satisfied: cycler>=0.10 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from matplotlib->fastai==1.0.60) (0.11.0)
Requirement already satisfied: fonttools>=4.22.0 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from matplotlib->fastai==1.0.60) (4.39.4)
Requirement already satisfied: kiwisolver>=1.0.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from matplotlib->fastai==1.0.60) (1.4.4)
Requirement already satisfied: pyparsing>=2.3.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from matplotlib->fastai==1.0.60) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from matplotlib->fastai==1.0.60) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from pandas->fastai==1.0.60) (2023.3)
Requirement already satisfied: tzdata>=2022.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from pandas->fastai==1.0.60) (2023.3)
Collecting annotated-types>=0.4.0 (from pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4->spacy>=2.0.18->fastai==1.0.60)
  Downloading annotated_types-0.6.0-py3-none-any.whl.metadata (12 kB)
Collecting pydantic-core==2.16.3 (from pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4->spacy>=2.0.18->fastai==1.0.60)
  Downloading pydantic_core-2.16.3-cp311-none-win_amd64.whl.metadata (6.6 kB)
Collecting typing-extensions (from torch>=1.0.0->fastai==1.0.60)
  Downloading typing_extensions-4.10.0-py3-none-any.whl.metadata (3.0 kB)
Requirement already satisfied: six>=1.5 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from python-dateutil>=2.7->matplotlib->fastai==1.0.60) (1.16.0)
Collecting blis<0.8.0,>=0.7.8 (from thinc<8.3.0,>=8.2.2->spacy>=2.0.18->fastai==1.0.60)
  Downloading blis-0.7.11-cp311-cp311-win_amd64.whl.metadata (7.6 kB)
Collecting confection<1.0.0,>=0.0.1 (from thinc<8.3.0,>=8.2.2->spacy>=2.0.18->fastai==1.0.60)
  Downloading confection-0.1.4-py3-none-any.whl.metadata (19 kB)
Requirement already satisfied: colorama in f:\hybrid\64bit\vapoursynth\lib\site-packages (from tqdm<5.0.0,>=4.38.0->spacy>=2.0.18->fastai==1.0.60) (0.4.6)
Requirement already satisfied: click<9.0.0,>=7.1.1 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from typer<0.10.0,>=0.3.0->spacy>=2.0.18->fastai==1.0.60) (8.1.3)
Collecting cloudpathlib<0.17.0,>=0.7.0 (from weasel<0.4.0,>=0.1.0->spacy>=2.0.18->fastai==1.0.60)
  Downloading cloudpathlib-0.16.0-py3-none-any.whl.metadata (14 kB)
Requirement already satisfied: MarkupSafe>=2.0 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from jinja2->spacy>=2.0.18->fastai==1.0.60) (2.1.2)
Requirement already satisfied: mpmath>=0.19 in f:\hybrid\64bit\vapoursynth\lib\site-packages (from sympy->torch>=1.0.0->fastai==1.0.60) (1.3.0)
Downloading fastprogress-1.0.3-py3-none-any.whl (12 kB)
Downloading spacy-3.7.4-cp311-cp311-win_amd64.whl (12.1 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 12.1/12.1 MB 12.8 MB/s eta 0:00:00
Downloading beautifulsoup4-4.12.3-py3-none-any.whl (147 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 147.9/147.9 kB 9.2 MB/s eta 0:00:00
Downloading Bottleneck-1.3.8-cp311-cp311-win_amd64.whl (110 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 110.1/110.1 kB 6.2 MB/s eta 0:00:00
Downloading numexpr-2.9.0-cp311-cp311-win_amd64.whl (96 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 96.6/96.6 kB ? eta 0:00:00
Downloading scipy-1.12.0-cp311-cp311-win_amd64.whl (46.2 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 46.2/46.2 MB 12.6 MB/s eta 0:00:00
Downloading catalogue-2.0.10-py3-none-any.whl (17 kB)
Downloading cymem-2.0.8-cp311-cp311-win_amd64.whl (39 kB)
Downloading langcodes-3.3.0-py3-none-any.whl (181 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.6/181.6 kB 10.7 MB/s eta 0:00:00
Downloading murmurhash-1.0.10-cp311-cp311-win_amd64.whl (25 kB)
Downloading preshed-3.0.9-cp311-cp311-win_amd64.whl (122 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 122.3/122.3 kB 7.5 MB/s eta 0:00:00
Downloading pydantic-2.6.2-py3-none-any.whl (394 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 394.9/394.9 kB 12.4 MB/s eta 0:00:00
Downloading pydantic_core-2.16.3-cp311-none-win_amd64.whl (1.9 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.9/1.9 MB 13.3 MB/s eta 0:00:00
Downloading smart_open-6.4.0-py3-none-any.whl (57 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.0/57.0 kB ? eta 0:00:00
Downloading soupsieve-2.5-py3-none-any.whl (36 kB)
Downloading spacy_legacy-3.0.12-py2.py3-none-any.whl (29 kB)
Downloading spacy_loggers-1.0.5-py3-none-any.whl (22 kB)
Downloading srsly-2.4.8-cp311-cp311-win_amd64.whl (479 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 479.7/479.7 kB 10.0 MB/s eta 0:00:00
Downloading thinc-8.2.3-cp311-cp311-win_amd64.whl (1.5 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 13.4 MB/s eta 0:00:00
Downloading typer-0.9.0-py3-none-any.whl (45 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.9/45.9 kB 2.2 MB/s eta 0:00:00
Downloading typing_extensions-4.10.0-py3-none-any.whl (33 kB)
Downloading wasabi-1.1.2-py3-none-any.whl (27 kB)
Downloading weasel-0.3.4-py3-none-any.whl (50 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.1/50.1 kB ? eta 0:00:00
Downloading blis-0.7.11-cp311-cp311-win_amd64.whl (6.6 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.6/6.6 MB 13.2 MB/s eta 0:00:00
Downloading cloudpathlib-0.16.0-py3-none-any.whl (45 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.0/45.0 kB ? eta 0:00:00
Downloading confection-0.1.4-py3-none-any.whl (35 kB)
Building wheels for collected packages: nvidia-ml-py3
  Building wheel for nvidia-ml-py3 (setup.py) ... done
  Created wheel for nvidia-ml-py3: filename=nvidia_ml_py3-7.352.0-py3-none-any.whl size=19182 sha256=9c0934880d851ead2bb487650b7f4bbcac8ef648cee672b45055e31f280d8488
  Stored in directory: c:\users\selur\appdata\local\pip\cache\wheels\47\50\9e\29dc79037d74c3c1bb4a8661fb608e8674b7e4260d6a3f8f51
Successfully built nvidia-ml-py3
Installing collected packages: nvidia-ml-py3, cymem, wasabi, typing-extensions, spacy-loggers, spacy-legacy, soupsieve, smart-open, scipy, numexpr, murmurhash, langcodes, fastprogress, cloudpathlib, catalogue, bottleneck, blis, annotated-types, typer, srsly, pydantic-core, preshed, beautifulsoup4, pydantic, confection, weasel, thinc, spacy, fastai
  Attempting uninstall: typing-extensions
    Found existing installation: typing_extensions 4.5.0
    Uninstalling typing_extensions-4.5.0:
      Successfully uninstalled typing_extensions-4.5.0
  WARNING: The script weasel.exe is installed in 'F:\Hybrid\64bit\Vapoursynth\Scripts' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
  WARNING: The script spacy.exe is installed in 'F:\Hybrid\64bit\Vapoursynth\Scripts' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed annotated-types-0.6.0 beautifulsoup4-4.12.3 blis-0.7.11 bottleneck-1.3.8 catalogue-2.0.10 cloudpathlib-0.16.0 confection-0.1.4 cymem-2.0.8 fastai-1.0.60 fastprogress-1.0.3 langcodes-3.3.0 murmurhash-1.0.10 numexpr-2.9.0 nvidia-ml-py3-7.352.0 preshed-3.0.9 pydantic-2.6.2 pydantic-core-2.16.3 scipy-1.12.0 smart-open-6.4.0 soupsieve-2.5 spacy-3.7.4 spacy-legacy-3.0.12 spacy-loggers-1.0.5 srsly-2.4.8 thinc-8.2.3 typer-0.9.0 typing-extensions-4.10.0 wasabi-1.1.2 weasel-0.3.4
so far so good.
Then I called:
python -m pip uninstall fastai
which should only remove fastai but keep the dependencies.
Found existing installation: fastai 1.0.60
Uninstalling fastai-1.0.60:
  Would remove:
    f:\hybrid\64bit\vapoursynth\lib\site-packages\fastai-1.0.60.dist-info\*
    f:\hybrid\64bit\vapoursynth\lib\site-packages\fastai\*
Proceed (Y/n)? Y
  Successfully uninstalled fastai-1.0.60
That worked as expected.
I then extracted the contents of vsdeoldify-1.0.0.zip to
f:\hybrid\64bit\vapoursynth\lib\site-packages
and the models to:
f:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\models
(Okay, I see why you didn't include those models.)

To test it, I did the following:
  • started Hybrid
  • loaded a source
  • enabled 'Filtering->Vapoursynth->Color->Basic->Tweak' and set 'Saturation' to 0
  • set 'Filtering->Vapoursynth->Custom->Insert before' to 'Grayworld', enabled the edit section and, for a quick test, entered:
    clip = core.resize.Bicubic(clip=clip, format=vs.RGB24, matrix_in_s="470bg", range_s="limited")
    from vsdeoldify import ddeoldify
    clip = ddeoldify(clip, model=0, render_factor=21)
    clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P8, matrix_s="470bg", range_s="limited")
    Pressing the preview triggered my firewall to pop up...
    Allowing the download gave:
    Downloading: "https://download.pytorch.org/models/resnet101-63fe2227.pth" to C:\Users\Selur/.cache\torch\hub\checkpoints\resnet101-63fe2227.pth
    (progress bar: 171M downloaded in about 14 seconds)
    [Image: grafik.png]
    => seems to work (~5.4GB VRAM used)
    Increasing the render_factor to 40 roughly doubled the VRAM usage.
    Using a 4K source didn't really increase the VRAM usage a lot (maybe 20-25%).
    [Image: grafik.png]
    (yes, deoldify isn't good with nature. Wink)
    I then threw some 1080p content at it
    [Image: grafik.png]
    (again, not suited for nature content)
    [Image: grafik.png]
    On the last one I also used ddcolor for comparison:
    [Image: grafik.png]

    So conclusion: It's slow, but working. Smile

    Also found a bug in my code regarding handling of custom sections.

    Cu Selur
#5
Switching to model 2 also triggers a download:
Downloading: "https://download.pytorch.org/models/resnet34-b627a593.pth" to C:\Users\Selur/.cache\torch\hub\checkpoints\resnet34-b627a593.pth
for amusement: https://imgsli.com/MjQyOTky

Side note: the next release will again support proper handling of the custom hinting:
# requires colorformat RGB24
from vsdeoldify import ddeoldify
clip = ddeoldify(clip, model=2, render_factor=21)
The '# requires colorformat RGB24' line lets Hybrid know that RGB24 is required.

The main issue I see atm. is the additional downloads into a folder which probably never gets cleaned up.

Cu Selur
#6
I'm happy to know that it is working as expected. Smile

The resnet network is required by torch and is downloaded automatically (only the first time).
It is the backbone of the generator: objects are detected more consistently and correctly with it.
I cannot do anything about that issue.

As you can see, the colors provided by ddcolor are more saturated than the colors applied by deoldify.

For amusement, I compared the effect of render_factor and input_size on deoldify and ddcolor:

https://imgsli.com/MjQzMDM1

The main differences are in the color of the hair and hands.

As I wrote, neither filter is perfect; maybe it is possible to obtain a better result by combining them.

Are you planning to add this filter to Hybrid?

Thanks,
Dan
#7
Quote:The main differences are in the color of the hair and hands.
It's all the colors. See the color of the tree (on the right) and the floors.

Quote:Are you planning to add this filter in Hybrid ?
Not sure atm., I will have to test whether:
a. one can download the "resnet101-63fe2227.pth" model beforehand and maybe place it in the models folder (<- that doesn't work).
I really don't like the idea of Hybrid triggering 'random' downloads.
b. I manage to put all the dependencies&co together so that there is a single download I could add as a separate addon package. (Alternatively, writing down the steps to install this manually would also be possible, but I prefer the package, since it makes it easier to reproduce problems.)

Quote:As I wrote none of the filter is perfect, maybe it is possible to obtain a better result by combining them.
Not sure whether that will work, since iirc both filters assume gray-colored content and don't try to just fix some colors.
I doubt that something like a simple merge or some masked merges will work.

Also the video model does seem to add some blue halos.

Cu Selur

Ps.: the main issue for me is the additional downloads. Not sure whether changing the 'TORCH_HOME' environment variable a. will work and b. will not cause issues with other filters.
Creating 'f:\Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\models\resnet\hub\checkpoints' with the files in it and using:
import os
os.environ['TORCH_HOME'] = 'F:/Hybrid/64bit/Vapoursynth/Lib/site-packages/vsdeoldify/models/resnet'
does seem to work, but no clue whether it causes any issues with other filters (mlrt or torch filters).
#8
The download of resnet101 is triggered by "fastai vision" in the files presnet.py, xresnet.py and xresnet2.py.
The function used is torch.utils.model_zoo.load_url().
The logic is embedded in the fastai version shipped with DeOldify, more precisely in the function:
def xresnet101(pretrained=False, **kwargs):
    """Constructs a XResNet-101 model.

    Args:
        pretrained (bool): If True, returns a model pre-trained on ImageNet
    """
    model = XResNet(Bottleneck, [3, 4, 23, 3], **kwargs)
    if pretrained: model.load_state_dict(model_zoo.load_url(model_urls['xresnet101']))
    return model

Since the use of the ".cache" directory is embedded directly in torch, I don't think it is possible to change this logic.

You can find more info here: https://pytorch.org/docs/stable/hub.html

In the meantime I discovered that it is not a problem to have "fastai==1.0.60" installed alongside vsdeoldify.
It was a problem for me during development, but once all the "imports" are correctly assigned, the problem disappears.

I fixed some small issues in the package and attached a new version.

Thanks,
Dan


Attached Files
.zip   vsdeoldify-1.0.2.zip (Size: 239,57 KB / Downloads: 9)
#9
I changed the file __init__.py, adding the following code:

torch.hub.set_dir(model_dir)

now the "resnet" files are saved in the directory: "Hybrid\64bit\Vapoursynth\Lib\site-packages\vsdeoldify\models\checkpoints"

I attached the new version of the filter.

Dan


Attached Files
.zip   vsdeoldify-1.0.3.zip (Size: 239,74 KB / Downloads: 13)
#10
So if they are downloaded beforehand, there should be no need to download them, right?

