20.09.2024, 21:12
I released the new RC9 with the following updated description:
Quote:
:param max_memory_frames: Parameter used by the ColorMNet model; it specifies the maximum number of encoded frames to keep in memory.
Its value depends on encode_mode and must be set manually following the suggested values below.
encode_mode=0: represents the batch size; suggested values are:
min=1, max=6/8 : for 8GB GPU
min=1, max=12/14 : for 12GB GPU
min=1, max=24/26 : for 24GB GPU
If set to 0, it will be filled with the max value (depending on the total GPU RAM available).
encode_mode=1: suggested values are:
min=1, max=4/5 : for 8GB GPU
min=1, max=8/9 : for 12GB GPU
min=1, max=15/16 : for 24GB GPU
If set to 0, it will be filled with the max value (depending on the total GPU RAM available).
encode_mode=2: there is no limit to this value (it could be all the frames in the clip).
Suggested values are:
min=250, max=10000
If set to 0, it will be filled with the value 10000.
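As an illustration only, the sketch below shows how the auto-fill rule quoted above could look: when max_memory_frames is 0, pick the upper bound of the suggested range for the GPU size reported by torch. The helper name suggest_max_memory_frames and the tolerance on the reported RAM are my assumptions, not part of the filter's API.

Code:
import torch

# Hypothetical helper, NOT part of the filter's API: it mirrors the
# suggested values quoted above and picks a default max_memory_frames
# when the user passes 0.
def suggest_max_memory_frames(encode_mode: int, device: int = 0) -> int:
    if encode_mode == 2:
        return 10000  # no real limit in this mode, 10000 is the quoted default

    # GPU RAM as seen by torch (mem_get_info returns free and total bytes)
    free_bytes, total_bytes = torch.cuda.mem_get_info(device)
    total_gb = total_bytes / (1024 ** 3)

    # upper bounds of the suggested ranges: (nominal GPU GB, max frames)
    if encode_mode == 0:
        table = [(8, 8), (12, 14), (24, 26)]
    elif encode_mode == 1:
        table = [(8, 5), (12, 9), (24, 16)]
    else:
        raise ValueError(f"unsupported encode_mode: {encode_mode}")

    # pick the largest entry whose nominal size fits the detected RAM
    frames = table[0][1]
    for gpu_gb, max_frames in table:
        if total_gb >= gpu_gb - 0.5:  # torch usually reports a bit less than the nominal size
            frames = max_frames
    return frames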
To get the available GPU memory I use torch.cuda.mem_get_info(), because what matters is the amount of GPU RAM actually seen by torch.
ColorMNet has huge memory problems when it is used inside VapourSynth. The only encode mode without this limitation is encode_mode=2, where max_memory_frames can be set even above 10000.
The suggested values are heuristic estimates based on my GPU, which has 12GB of RAM (reported correctly by torch). If the number of frames is too high, the encoding will fail with the error: GPU out of memory.
There is nothing I can do about that; in this case the user must lower the value of max_memory_frames passed in input to the filter.
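Purely as a sketch of that failure mode: in a plain torch script one could catch the out-of-memory error and retry with a smaller value, as below. The colorize() function is only a placeholder standing in for the actual filter call; inside a VapourSynth script the practical fix is simply to restart the encode with a lower max_memory_frames.

Code:
import torch

# colorize() is only a stand-in for the real ColorMNet-based filter call,
# so that this sketch is self-contained and runnable.
def colorize(max_memory_frames: int):
    print(f"encoding with max_memory_frames={max_memory_frames}")

def colorize_with_fallback(max_memory_frames: int, min_frames: int = 1):
    # retry with progressively smaller values until the encode fits in GPU RAM
    frames = max_memory_frames
    while True:
        try:
            return colorize(frames)
        except torch.cuda.OutOfMemoryError:
            if frames <= min_frames:
                raise
            torch.cuda.empty_cache()  # drop cached blocks before retrying
            frames = max(min_frames, frames // 2)

colorize_with_fallback(14)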
Dan