Selur's Little Message Board
Open Models support ... - Printable Version

+- Selur's Little Message Board (https://forum.selur.net)
+-- Forum: Hybrid - Support (https://forum.selur.net/forum-1.html)
+--- Forum: Problems & Questions (https://forum.selur.net/forum-3.html)
+--- Thread: Open Models support ... (/thread-3528.html)



Open Models support ... - ToiletDuck - 31.12.2023

Hellu,

I am starting to explore the use of different models with VSGAN & vs-mlrt..
and I have a few questions in mind:

a. Where do I put the downloaded PyTorch and/or ONNX files for Hybrid (presumably depending on which filter I use)?  If I had to guess, I'd just add the torch files to the vsgan_models folder.. but.. it's never that easy for me..
b. What parameters do I have to use?
c. When no parameters are set, do the default settings (when checked) actually work well for VSGAN already?  Or is it best to use custom parameters for that filter?

I know torch files can be converted to ONNX, so my question is: does that make the models interchangeable between filters?

EDIT:  hmm... I kind of hoped to find a good model that works with other resizers, to speed things up while maintaining the same output quality as the ESRGAN resizer..
And I have found one that works with VSGAN, but the turtle-slow crawling speed is killing my NVIDIA GPU!?  Like 0.00x fps, ehh (°^°)

If anybody could provide me with a link to ESRGAN models for vs-mlrt, you're most welcome to do so...  preferably 2x, not 4x.. actually 4x looks much worse than 2x.. for some reason..
To be specific, I'm looking for ESRgan_2xPlus.. it looks great for the source I'm working with..


Cheerios
TD


RE: Open Models support ... - Selur - 01.01.2024

The model types VSGAN and vs-mlrt support are listed on their GitHub pages.
Regarding the model parameters: no clue, it depends on the architecture. (At the moment, Hybrid only supports ESRGAN models in VSGAN.)

About interchangeability: that is the idea; ONNX and PTH should be different representations of the same data.

Cu Selur


RE: Open Models support ... - ToiletDuck - 01.01.2024

(01.01.2024, 02:29)Selur Wrote: The model types VSGAN and vs-mlrt support are listed on their GitHub pages.
Regarding the model parameters: no clue, it depends on the architecture. (At the moment, Hybrid only supports ESRGAN models in VSGAN.)

About interchangeability: that is the idea; ONNX and PTH should be different representations of the same data.

Cu Selur

I have looked over there already, for a vs-mlrt (*.onnx) equivalent of VSGAN's RealESRGAN_x2Plus.
But I haven't found one yet...  

I am doing some reading about using the right params for the models.. so.. the jury is still out  Idea

(01.01.2024, 02:29)Selur Wrote: About interchangeability: that is the idea; ONNX and PTH should be different representations of the same data.
Cu Selur

Maybe so.. but that's the problem!  A PyTorch model can't be loaded in vs-mlrt, just like an ONNX model can't be loaded in VSGAN.
These have to be converted/exported to the right format, don't they?

Is there an easy/quick tool (GUI) to convert a PyTorch model into an *.onnx, and vice versa?  I am not a coder.. hence..


Thanks,


RE: Open Models support ... - Selur - 01.01.2024

One can use chaiNNer to convert model files.
[Image: grafik.png]

Cu Selur


RE: Open Models support ... - ToiletDuck - 01.01.2024

(01.01.2024, 15:54)Selur Wrote: One can use chaiNNer to convert model files.
[Image: grafik.png]

Cu Selur

I knew you would suggest that app..  I had tried that one already before I asked here..

It doesn't seem to be as straightforward and easy as in the picture you posted, though  Dodgy
I am using the portable version..

cheers,


RE: Open Models support ... - Selur - 01.01.2024

Quote:It doesn't seem to be as straightforward and easy as in the picture you posted, though
Works fine here.
What I do is:
  • Check under 'Manage Dependencies' (upper right corner) that everything you need is installed.
  • Add a 'PyTorch->Load Model' element and configure it to load your .pth file.
  • Add a 'PyTorch->Convert to ONNX' element.
  • Add an 'ONNX->Save Model' element.
Connect/configure the elements, then press the 'play' button at the top.
Seems rather straightforward to me.


Cu Selur


RE: Open Models support ... - ToiletDuck - 01.01.2024

(01.01.2024, 16:14)Selur Wrote:
Quote:It doesn't seem to be as straightforward and easy as in the picture you posted, though
Works fine here.
What I do is:
  • Check under 'Manage Dependencies' (upper right corner) that everything you need is installed.
  • Add a 'PyTorch->Load Model' element and configure it to load your .pth file.
  • Add a 'PyTorch->Convert to ONNX' element.
  • Add an 'ONNX->Save Model' element.
Connect/configure the elements, then press the 'play' button at the top.
Seems rather straightforward to me.


Cu Selur

By that I mean: in addition to the chaiNNer app itself, I had to search for and download extra files just to see the PyTorch node in chaiNNer   Wink
Everything was expanded, and although there was an ONNX node available among many others, only the one I needed was missing  Dodgy .. of course.. 

So yeah, I needed to download the dependencies separately.. a whopping 2GB+ >_> just to load A model..!
I gather you presumably didn't have to do that?  → a portable vs. installed version difference, mayhap..

cheers,
td


RE: Open Models support ... - Selur - 01.01.2024

¯\_(ツ)_/¯


RE: Open Models support ... - ToiletDuck - 01.01.2024

(01.01.2024, 16:23)Selur Wrote: ¯\_(ツ)_/¯

(°^°)