Selur's Little Message Board
[HELP] why the capacity is too big? - Printable Version

+- Selur's Little Message Board (https://forum.selur.net)
+-- Forum: Hybrid - Support (https://forum.selur.net/forum-1.html)
+--- Forum: Problems & Questions (https://forum.selur.net/forum-3.html)
+--- Thread: [HELP] why the capacity is too big? (/thread-2145.html)



why the capacity is too big? - kwan0220 - 22.10.2021

I applied the settings below, set the output to 120 fps, and started the encoding. The progress shown during encoding indicated well below 800 MB, but the resulting file kept growing to 3-4 GB.

There seems to be something wrong, but I'm not sure what the problem is.

Just in case, I've taken screenshots of the settings I changed and uploaded them.

I'd really appreciate your help.
I'm a complete beginner at video encoding, so I'm sorry for asking questions so often.




[Image: unknown.png][Image: unknown.png][Image: unknown.png]


RE: why the capacity is too big? - Selur - 22.10.2021

Hybrid takes the progress output from NVEncC, which usually looks like:
[23.9%] 32441/128226 frames: 259.05 fps, 7213 kb/s, remain 0:06:38, GPU 10%, VE 104%, VD 22%, est out size 4860.5MB
It takes the reported bitrate, multiplies it by the length the output should have, and displays that as the estimated file size.
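For illustration, the bitrate can be pulled out of such a progress line and turned into a size estimate roughly like this (a minimal Python sketch; the regex and the example length are my own assumptions, not Hybrid's actual code):

```python
import re

# Example NVEncC progress line (as shown above)
line = ("[23.9%] 32441/128226 frames: 259.05 fps, 7213 kb/s, "
        "remain 0:06:38, GPU 10%, VE 104%, VD 22%, est out size 4860.5MB")

# Pull out the reported bitrate in kb/s
bitrate_kbps = float(re.search(r"([\d.]+) kb/s", line).group(1))

# Estimate the file size in MB for an assumed output length
length_s = 1442.11  # illustrative value, not taken from this progress line
est_mb = bitrate_kbps * 1000 / 8 / 1024 / 1024 * length_s
print(bitrate_kbps, round(est_mb, 2))
```

If the reported kb/s value or the assumed length is wrong, the estimate is off by exactly that factor, which is the failure mode discussed below.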

My wild guess would be that NVEncC either changed the bitrate unit to something other than kb/s, reports a wrong kb/s value, or the expected length (frame count) is not correct.

I would need a debug output of the processing to see what's happening.

Cu Selur


RE: why the capacity is too big? - kwan0220 - 22.10.2021

(22.10.2021, 11:00)Selur Wrote: Hybrid takes the progress output from NVEncC, which usually looks like:
[23.9%] 32441/128226 frames: 259.05 fps, 7213 kb/s, remain 0:06:38, GPU 10%, VE 104%, VD 22%, est out size 4860.5MB
It takes the reported bitrate, multiplies it by the length the output should have, and displays that as the estimated file size.

My wild guess would be that NVEncC either changed the bitrate unit to something other than kb/s, reports a wrong kb/s value, or the expected length (frame count) is not correct.

I would need a debug output of the processing to see what's happening.
Should I upload the debug file?


RE: why the capacity is too big? - Selur - 22.10.2021

Yes, otherwise I can't say what the cause of this is. Wink
I did a small test encode here, and it seemed to be okay.

Cu Selur


RE: why the capacity is too big? - kwan0220 - 22.10.2021

I put my debug file.txt in the link below.


https://workupload.com/file/jdQLXAJvZ3T


RE: why the capacity is too big? - Selur - 22.10.2021

Got it,
NVEnc reports:
64729 frames: 154.11 fps, 3717 kb/s, GPU 0%, VE 103%

(bitrate between 3700 and 4000)
and
encoded 173048 frames
encode time 0:18:24, CPU: 8.5, GPU: 0.2, VE: 95.6, GPUClock: 1910MHz, VEClock: 1709MHz
frame type IDR 722
frame type I 722, total size 160.51 MB
frame type P 172326, total size 3323.53 MB
output was 3484 MB

Input was frame rate: 23.976fps, frame count: 34576
Hybrid assumed the output to have frame rate: 120fps, frame count: 173053.
So that all looks fine.
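The assumed output frame count follows directly from the input values; a quick check (plain Python, using the numbers from the debug output above):

```python
# Input clip, as reported in the debug output
in_fps = 23.976
in_frames = 34576

# Target output frame rate after the 120 fps conversion
out_fps = 120.0

# Expected output frame count and job length
out_frames = round(in_frames * out_fps / in_fps)
duration_s = out_frames / out_fps
print(out_frames, round(duration_s, 2))  # 173053 frames, ~1442.11 s
```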

During the job process, Hybrid also assumed:
current frame count: 173053, rate: 120
and a job length of 1442.11s.
Taking the size and the length to calculate the average bitrate, I get:
3484*1024*1024/1000 / 1442.11s = ~2533 kb/s

which is way below what NVEnc reported.

-> I'll later create a new dev version which will also write what Hybrid showed to the user into the debug output.
Once you get it, please create another debug output.

I also noticed that the bitrate NVEncC reports during the first minute is below the average bitrate of the output, so you have to wait a bit for the estimate to become usable.

Cu Selur


RE: why the capacity is too big? - Selur - 22.10.2021

Sent you a new link to a dev version which writes the calculation details to the debug output.

Cu Selur


RE: why the capacity is too big? - kwan0220 - 22.10.2021

(22.10.2021, 13:11)Selur Wrote: Sent you a new link to a dev version which writes the calculation details to the debug output.

Cu Selur
I changed some settings.

https://workupload.com/file/xvUwgC2ubNc

[Image: unknown.png][Image: unknown.png][Image: unknown.png]


RE: why the capacity is too big? - Selur - 22.10.2021

Okay, here's what happens:
Calulcating file size with bitrate: 5577kbs, speed: 172.55fps
Updating progress (assumed length: 1442.11s): FPS: 172.55, Bitrate: 5577, Percent: 99.9341, RestTime: 00:00:00, EstimatedSize: 958.76
The length is correct and the bitrate is read correctly.
So the problem must be with the size calculation.
Hybrid assumes the bitrate is in kilobits per second:
QString::number(bitrate * 1000.0 / 8.0 / 1024.0 / 1024.0 * m_currentLength, 'f', 2);
* 1000 to get to bits per second
/ 8 to get to bytes per second
/ 1024 to get to kilobytes per second (=1024, since we want file size, not data rate)
/ 1024 to get to megabytes per second
* 1442.11s to get to megabytes.
This seems correct to me.
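Plugging the logged numbers into that conversion chain reproduces the EstimatedSize Hybrid showed, so the formula itself checks out (a quick sanity check in Python, mirroring the C++ line above):

```python
# Values from Hybrid's debug output above
bitrate_kbps = 5577   # reported bitrate in kb/s
length_s = 1442.11    # assumed output length in seconds

# kb/s -> bit/s -> byte/s -> KB/s -> MB/s, times the length in seconds
est_mb = bitrate_kbps * 1000.0 / 8.0 / 1024.0 / 1024.0 * length_s
print(round(est_mb, 2))  # 958.76, matching the logged EstimatedSize
```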

Taking the file size of 4618.59 MB:
/ 1442.11s to get to megabytes per second
* 1024 to get to kilobytes per second
* 1024 to get to bytes per second
* 8 to get to bits per second
/ 1000 to get to kilobits (=1000) per second
gives ≈ 50,478 kbit/s.

That's nearly a factor of 10 away from what was reported.
So either I'm overlooking something here, or this isn't Hybrid's fault but a bug in NVEncC.

Seeing that, I thought maybe NVEncC had switched to reporting in kilobytes per second, but then it should be wrong on my system too.

So I just did another test here:
Updating progress (assumed length: 269.242s): FPS: 707.84, Bitrate: 7284, Percent: 99.0808, RestTime: 00:00:00, EstimatedSize: 233.79
is what Hybrid predicted and
encoded 32237 frames, 706.92 fps, 7269.07 kbps, 232.79 MB
encode time 0:00:45, CPU: 4.4, GPU: 28.0, VE: 64.8, GPUClock: 1837MHz, VEClock: 1664MHz
frame type IDR    27
frame type I      27,  total size    1.48 MB
frame type P   32210,  total size  231.31 MB
is the output on my system.
-> The calculation of Hybrid is correct and matches the prediction of NVEnc.

I used the same settings as you:
NVEnc --y4m -i - --fps 120.000 --codec h265 --profile main10 --level auto --tier high --sar 1:1 --lookahead 16 --output-depth 10 --vbrhq 7200 --max-bitrate 240000 --gop-len 0 --ref 3 --bframes 0 --no-b-adapt --mv-precision Q-pel --preset default --colorrange limited --colormatrix bt470bg --cuda-schedule sync --keyfile GENERATED_KEY_FILE --output "E:\test.265"
I also double checked that the dev version comes with the same nvencc version I used.

No real clue why this is happening on your system.

My guess at the moment is that something on your system is interfering with NVEncC's bitrate reporting.
-> Try whether this changes if you disable any antivirus & co. software on your system.

Cu Selur


RE: why the capacity is too big? - kwan0220 - 22.10.2021

Quote:My guess at the moment is that something on your system is interfering with NVEncC's bitrate reporting.
-> Try whether this changes if you disable any antivirus & co. software on your system.
I turned off all antivirus programs except Windows' built-in Defender and tried encoding again, but the result still came out at about twice the expected size.