
[HELP] How to encode 8-bit to 10-bit correctly?
#9
a. No, that's not a bug. Everything is as it should be.
b. The x264 CLI preview always shows --demuxer raw, since it doesn't know yet what will feed it (ffmpeg, mencoder, avs2yuv, ...); during job creation the decoder which feeds the encoder is chosen as needed and the demuxer is adjusted accordingly (see the sketch after this list).
c. I couldn't find any case which produced better results than bicubic.
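
To illustrate point b, here is a minimal sketch of such a decoder→encoder pipe, assuming a hypothetical 8-bit 1080p source named input.mkv and an x264 build that supports --output-depth; the file names, resolution, frame rate and rate-control settings are placeholders, not what Hybrid actually generates:

  # y4m pipe: the y4m header carries resolution/fps/colorspace,
  # so the demuxer can be switched to y4m once the feeder is known
  ffmpeg -i input.mkv -f yuv4mpegpipe -pix_fmt yuv420p - | x264 --demuxer y4m --output-depth 10 --preset slow --crf 18 -o output.264 -

  # raw pipe (what the preview shows): the stream carries no header,
  # so geometry, fps and colorspace must be stated explicitly
  ffmpeg -i input.mkv -f rawvideo -pix_fmt yuv420p - | x264 --demuxer raw --input-csp i420 --input-res 1920x1080 --fps 24000/1001 --input-depth 8 --output-depth 10 --preset slow --crf 18 -o output.264 -

The y4m variant is self-describing, which is why the demuxer shown in the preview can safely differ from the one used in the actual job.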

In general: if you don't know an option, keep it at the value Hybrid chooses.

Cu Selur
----
Dev versions are in the 'experimental'-folder of my GoogleDrive, which is linked on the download page.