02.01.2021, 04:50
(01.01.2021, 20:29)Selur Wrote:
Quote:had the color noticeably darker and green-ish.
Might be a TV (16-235) vs. PC (0-255) scale issue.
Quote:So, what's the real problem here? Color matrix?
I doubt it's a color matrix issue, unless the player doesn't support the matrix or ignores it and always uses the same matrix (or picks one depending on the resolution).
-> if you want to change the color matrix:
a. use the colormatrix filter (https://www.ffmpeg.org/ffmpeg-all.html#toc-colormatrix) to convert from one color matrix to another
b. make sure to properly set the VUI settings (color matrix, color range, color primaries):
-color_range XXX -colorspace XXX -color_primaries XXX -color_trc XXX
(a combined example command is sketched after the links below)
-> read
a. https://trac.ffmpeg.org/wiki/colorspace which also has some examples
b. https://www.ffmpeg.org/ffmpeg-all.html for details on the options
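Putting a. and b. together, a minimal sketch (assuming a BT.601 source that should end up as BT.709; file names and the CRF value are just placeholders, not a definitive command):
ffmpeg -i input.mkv -vf colormatrix=bt601:bt709 -c:v libx264 -crf 18 -colorspace bt709 -color_primaries bt709 -color_trc bt709 -color_range tv -c:a copy output.mkv
The filter rewrites the actual pixel values, while the -colorspace/-color_primaries/-color_trc/-color_range options only write the matching VUI flags into the stream.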
Usually ITU-R recommends for:
1. HDR and UHD content BT.2020
2. SDR and FullHD content BT.709
but there's also UHD content with BT.709 and SDR content with BT.601 or other color matrices, so there is no strict rule that you need to use color matrix XY for resolution A or B. (Side note: when doing HDR to SDR conversions, tone mapping is recommended.)
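(For that side note, a commonly used HDR-to-SDR tone mapping chain looks roughly like this; it assumes an ffmpeg build with zimg/zscale, and the hable curve and file names are only example choices, not a definitive command:)
ffmpeg -i hdr_input.mkv -vf zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p -c:v libx264 -crf 18 -colorspace bt709 -color_primaries bt709 -color_trc bt709 -color_range tv -c:a copy sdr_output.mkv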
Problem is, not all players support all matrices, and some always assume that BT.2020 implies HDR, which it doesn't. SDR content can have a WCG (wide color gamut, see: https://en.wikipedia.org/wiki/Rec._2020) and you can have WCG without HDR (high dynamic range).
iirc (not totally sure) 10bit or higher is required to take full advantage of BT.2020 (it is required for HDR), but using it with 8bit should be possible.
-> I would use BT.709 for SDR content independent of the resolution.
And I would recommend using BT.2020 only for HDR content (independent of the resolution).
Cu Selur
Ahh, so it wasn't really the color matrix. I did suspect the color range: even though MediaInfo reports both my input and output as TV/Limited range, I'm aware those are just flags and they don't always reflect the actual properties of the content (the input could've been at PC/Full, but this is a movie, but then again it's at 10-bit depth… ahh, that's where I went wrong *realizes by the time of writing)
I ended up messing with the matrix thinking it would fix everything, just because I thought the matrix was what should change according to bit depth, damnit
Okay, so you suggest I can convert the color matrix using the colormatrix filter. What about scale? I kinda prefer that one. It shouldn't make any difference, right? I'm just using either of them to convert between matrices
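If I'm reading the scale docs right, the equivalent should look something like this (bt601 to bt709 just as an example, not tested):
-vf scale=in_color_matrix=bt601:out_color_matrix=bt709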
Also, assuming I'll use scale, I'm thinking of doing -vf scale=in_range=pc:out_range=tv hoping it would fix this color shifting issue. Unless I'm wrong, I think converting to a different 8-bit color range would prevent players from misinterpreting things. Or am I being an idiot again for not putting out_range=pc instead? Or is this whole idea pointless once again? Assume I will properly set the flags
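i.e. the full command I have in mind would be roughly this (file names and CRF are just placeholders, with the flag set to match the output):
ffmpeg -i input.mkv -vf scale=in_range=pc:out_range=tv -c:v libx264 -crf 18 -color_range tv -c:a copy output.mkv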
Lastly, thank you for providing references and elucidation for matrices BT709 and BT2020. Will definitely check them out (for my own sake anyways)
So I can use any matrix, where most of the time I can just pick BT.709. But when I want a wide color gamut video with HDR, I should choose BT.2020 at 10-bit color depth (at least?), I see
Thank you for replying, you're the beast (yes, beast)