How to enable GPU for MCTemporaldenoise filter?
#21
(26.10.2019, 13:02)Selur Wrote:
Quote:My input file is an 8 bit x264 YUV, not RGB. I'm encoding it to 10 bit x265, is that what you meant by high bit depth?
The important question is what color space and bit depth the content has in the filter chain at the point where it is fed to the filter.

Cu Selur

Oh, how do I determine what the color space and bit depth are at that point in the filter chain?
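
For reference, a minimal VapourSynth sketch (assuming the clip runs through Hybrid's VapourSynth pipeline; the ffms2 source call and 'input.mkv' are only placeholders) that reports the format a clip has at a given point in the chain:

import vapoursynth as vs
core = vs.core

# Placeholder source; Hybrid normally builds this part of the chain itself
clip = core.ffms2.Source(source='input.mkv')

# Inspect what the next filter would actually receive
print(clip.format.name)             # e.g. YUV420P8
print(clip.format.color_family)     # e.g. YUV
print(clip.format.bits_per_sample)  # e.g. 8

clip.set_output()

Running the script through vspipe --info also reports the output format without decoding any frames.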

I just tried KNLMeansCL using the last build you sent me - while it doesn't crash anymore, I don't think any denoising is being performed when I set the channels to UV.

Also - the similarity radius can't be increased beyond 4 in Hybrid, while the developer of KNLMeansCL stated some time back that its maximum value had been raised to 8.
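
For comparison, a minimal sketch of calling KNLMeansCL directly in a VapourSynth script, outside Hybrid (the parameter values are only illustrative; d, a, s, h and channels are the plugin's parameters, with s being the similarity-window radius discussed above):

import vapoursynth as vs
core = vs.core

clip = core.ffms2.Source(source='input.mkv')  # placeholder source

# Denoise only the chroma planes on the GPU; s=8 assumes a plugin build
# in which the similarity-window radius limit was raised from 4 to 8
denoised = core.knlm.KNLMeansCL(clip, d=1, a=2, s=8, h=1.2,
                                channels='UV', device_type='gpu')

denoised.set_output()

If s=8 is rejected with a parameter-range error, the installed plugin build likely predates that change.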