25.04.2026, 20:12
Hello everyone,
I would like to open a discussion about an important challenge for exemplar-based colorization methods:
Keyframe Color Consistency Challenge
We need to find a reliable way to achieve temporal consistency between colorized keyframes so they can be used for the full colorization of black and white videos.
At the moment, one of the main limitations is that even when keyframes are individually colorized well, colors often vary from one keyframe to another. This creates flickering, unstable tones, skin tone changes, clothing color drift, and inconsistencies in objects or environments when processing the full sequence.
Why This Matters
For real-world video restoration, getting beautiful frame-by-frame results is not enough. We need:
- Stable skin tones from scene to scene
- Consistent clothing and object colors
- Lighting adaptation without color drift
- Smooth transitions between distant keyframes
- Reliable propagation across long sequences
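To make the drift discussed above measurable rather than just "it looks wrong", here is a minimal sketch of how keyframe-to-keyframe color drift could be quantified by comparing mean opponent-color statistics. All function names and the synthetic frames are illustrative, not part of any existing tool; a real pipeline would convert to Lab space first.

```python
# Sketch: quantify color drift between colorized keyframes by comparing
# their mean chroma in a simple opponent-color space. Function names and
# the synthetic frames are illustrative, not from any released tool.
import numpy as np

def mean_chroma(frame_rgb: np.ndarray) -> np.ndarray:
    """Mean opponent-color vector of an RGB frame (H, W, 3), values in [0, 1]."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    rg = r - g                # red-green axis (stand-in for Lab a*)
    yb = 0.5 * (r + g) - b    # yellow-blue axis (stand-in for Lab b*)
    return np.array([rg.mean(), yb.mean()])

def drift(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Euclidean distance between the mean chroma of two keyframes."""
    return float(np.linalg.norm(mean_chroma(frame_a) - mean_chroma(frame_b)))

# Two synthetic "keyframes": one neutral gray, one with a warm cast.
gray = np.full((64, 64, 3), 0.5)
warm = gray.copy()
warm[..., 0] += 0.1  # push the red channel up -> warmer palette

print(drift(gray, gray))         # 0.0: identical palettes
print(drift(gray, warm) > 0.05)  # True: measurable warm shift
```

A metric like this makes it possible to compare workflows objectively: run the same sequence through two setups and track which one keeps the drift curve flatter.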
I experimented with several workflows using Automatic1111 + ControlNet + IP-Adapter.
Example Setup in Automatic1111
Main txt2img Prompt
Positive Prompt:
masterpiece, best quality, colorized photograph, vibrant, highly detailed
Negative Prompt:
black and white, grayscale, monochrome, bad quality, deformed
ControlNet Unit 0 (Preserve B&W Structure)
- Enable: Yes
- Pixel Perfect: Yes
- Input Image: Original black and white frame
- Control Type: Recolor
- Preprocessor: recolor_luminance or recolor_intensity
- Model: ioclab_sd15_recolor
- Weight: 1.0 (or 0.8 if too strong)
ControlNet Unit 1 (Color Reference)
- Enable: Yes
- Pixel Perfect: Yes
- Input Image: Previously colorized reference keyframe
- Control Type: IP-Adapter
- Preprocessor: ip-adapter_clip_sd15
- Model: ip-adapter_sd15
- Weight: 0.8 – 1.0
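For anyone who wants to batch this instead of clicking through the UI, the two units above can be expressed as an Automatic1111 API payload (POST to /sdapi/v1/txt2img with --api enabled). The exact field names inside "alwayson_scripts" depend on the ControlNet extension version, so treat this as a template, not a guarantee:

```python
# Sketch: the two ControlNet units above as an Automatic1111 API payload.
# Field names follow the ControlNet extension's "alwayson_scripts"
# convention and may vary between extension versions (assumption).
import json

def controlnet_unit(image_b64, module, model, weight):
    """One ControlNet unit in the extension's args format (assumed fields)."""
    return {
        "enabled": True,
        "pixel_perfect": True,
        "image": image_b64,   # base64-encoded PNG of the input image
        "module": module,     # preprocessor name
        "model": model,
        "weight": weight,
    }

payload = {
    "prompt": "masterpiece, best quality, colorized photograph, vibrant, highly detailed",
    "negative_prompt": "black and white, grayscale, monochrome, bad quality, deformed",
    "alwayson_scripts": {
        "controlnet": {
            "args": [
                # Unit 0: preserve the B&W structure of the original frame.
                controlnet_unit("<bw_frame_b64>", "recolor_luminance",
                                "ioclab_sd15_recolor", 1.0),
                # Unit 1: pull colors from the previously colorized keyframe.
                controlnet_unit("<ref_keyframe_b64>", "ip-adapter_clip_sd15",
                                "ip-adapter_sd15", 0.8),
            ]
        }
    },
}

print(json.dumps(payload, indent=2)[:80])
# Send with e.g.: requests.post(f"{base_url}/sdapi/v1/txt2img", json=payload)
```

Scripting it this way at least makes every keyframe run through identical settings, which removes one source of inconsistency.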
But Problems Remain
Even with this method:
- Colors still drift over time
- Different scenes generate different palettes
- The same person may change skin tone
- Sky, grass, and buildings vary between shots
- Propagation becomes unstable in long videos
From my tests, Qwen-Image-Edit currently delivers some of the best single-keyframe colorization results in terms of realism, detail, and natural palette generation.
However, there is still no practical way to maintain temporal color consistency across the remaining keyframes in a video.
So even if one keyframe looks excellent, the next keyframe may still shift in:
- Skin tone
- Clothing colors
- Environment palette
- Object consistency
- Scene mood
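One classical post-hoc fix for exactly these shifts is Reinhard-style statistics matching: pick one master keyframe and pull every other keyframe's per-channel mean and standard deviation toward it. A minimal sketch (done directly in RGB for brevity; in practice you would convert to Lab first, e.g. via skimage.color, and all names here are illustrative):

```python
# Sketch: Reinhard-style statistics matching to pull every keyframe toward
# one "master" keyframe's palette. Done in RGB for brevity; a production
# version would work in Lab space (assumption, not the author's workflow).
import numpy as np

def match_palette(frame: np.ndarray, master: np.ndarray) -> np.ndarray:
    """Shift per-channel mean/std of `frame` to match `master` (float RGB in [0, 1])."""
    out = frame.copy()
    for c in range(3):
        f_mu, f_sd = frame[..., c].mean(), frame[..., c].std() + 1e-8
        m_mu, m_sd = master[..., c].mean(), master[..., c].std() + 1e-8
        out[..., c] = (frame[..., c] - f_mu) * (m_sd / f_sd) + m_mu
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(0)
master = rng.uniform(0.2, 0.8, (64, 64, 3))
drifted = master * 0.9 + 0.08  # simulate a slightly washed-out keyframe
fixed = match_palette(drifted, master)

# After matching, per-channel means agree with the master keyframe.
print(np.allclose(fixed.mean(axis=(0, 1)), master.mean(axis=(0, 1)), atol=1e-4))
```

This won't fix semantic errors (a jacket colorized blue in one keyframe and green in the next), but it does flatten global palette and mood drift between shots.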
Could Qwen-Image-Edit be used to colorize all keyframes?
For example:
- Use one master keyframe style and propagate it
- Multi-keyframe palette memory
- Identity/object color locking
- Reference-aware batch processing
- Latent temporal consistency between generated keyframes
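The "multi-keyframe palette memory" idea above could be sketched as an exponential moving average of palette statistics that serves as the matching target, so no single keyframe dominates and slow drift gets averaged out. Everything here is hypothetical, a sketch of the idea rather than an existing feature:

```python
# Sketch of the "multi-keyframe palette memory" idea: keep an exponential
# moving average of per-channel color statistics and normalize each
# keyframe against it. All names are illustrative (assumption).
import numpy as np

class PaletteMemory:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # how fast new keyframes shift the memory
        self.mean = None     # running per-channel mean, shape (3,)
        self.std = None      # running per-channel std, shape (3,)

    def update(self, frame: np.ndarray) -> None:
        mu = frame.mean(axis=(0, 1))
        sd = frame.std(axis=(0, 1))
        if self.mean is None:
            self.mean, self.std = mu, sd
        else:
            self.mean = (1 - self.alpha) * self.mean + self.alpha * mu
            self.std = (1 - self.alpha) * self.std + self.alpha * sd

    def normalize(self, frame: np.ndarray) -> np.ndarray:
        """Match a keyframe's statistics to the remembered palette."""
        mu = frame.mean(axis=(0, 1))
        sd = frame.std(axis=(0, 1)) + 1e-8
        return np.clip((frame - mu) * (self.std / sd) + self.mean, 0.0, 1.0)

rng = np.random.default_rng(1)
memory = PaletteMemory(alpha=0.2)
frames = [np.clip(rng.uniform(0.3, 0.7, (32, 32, 3)) + i * 0.02, 0, 1)
          for i in range(5)]  # keyframes that slowly brighten (drift)
for f in frames:
    memory.update(f)
stabilized = [memory.normalize(f) for f in frames]
spread = np.ptp([f.mean() for f in stabilized])  # spread of frame means
print(spread < 0.01)  # much tighter than the raw drift of ~0.08
```

The same memory object could in principle be kept per scene, so lighting changes between scenes are respected while drift inside a scene is suppressed.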
Is there a way to use Qwen-Image-Edit to colorize all keyframes and create one unified temporal color source for all reference images (keyframes)?
This could be a major breakthrough for AI video colorization.
Your turn — ideas, experiments, thoughts?

