YCbCr levels and the Video Mixing Renderer 9
However, in VMR7/9 mode when not using the overlay and using a YUV output mode, both ATI and NVIDIA seem to output a luma range of 16-235, which is a pain considering the monitor works in RGB, where the luma range is 0-255. This requires either a software conversion to 0-255 or specifically adjusting the hardware color controls to compensate.
The VMR7/9 being referred to are the DirectShow Video Mixing Renderers, which are the filters that render the final video to the screen in a player like Windows Media Player. YCbCr data, which is frequently 8 bits per channel, nominally has a range of 16-235 for the luma (Y) channel, where 16 represents black and 235 represents white. Values outside this range are reserved; the usual way of handling them is to clamp to the valid range. This is sometimes a point of confusion because JPEG JFIF specifies a slightly different encoding in which YCbCr values use the full 0-255 range; mixing the two up causes contrast shifts in the video. Motion JPEG codecs that don't compensate for this difference when exchanging the output of the JPEG engine with client applications can trigger exactly this. It sounds like a similar mixup is occurring here: 16-235 data is being displayed as if it were 0-255, so black lands at 16/255 grey, white at 235/255, and roughly 14% of the contrast range is lost.
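To make the arithmetic concrete, here's a minimal sketch of the range expansion a player has to perform somewhere along the chain; the helper name and rounding scheme are mine, and out-of-range values are clamped as described above:

```c
/* Expand limited-range ("studio") luma, 16-235, to full-range 0-255.
   Values outside 16-235 are reserved and are clamped first. */
int expand_luma(int y)
{
    if (y < 16)  y = 16;
    if (y > 235) y = 235;
    /* scale by 255/219, rounding to nearest */
    return ((y - 16) * 255 + 219 / 2) / 219;
}
```

Skipping this step and displaying the data directly is exactly the bug described: black stays at level 16, white at 235, and contrast shrinks to 219/255, i.e. about a 14% loss.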
What's odd about Blight's statement, though, is that VMR7 and VMR9 are quite different beasts — VMR7 is DirectDraw 7 based, whereas VMR9 is Direct3D 9 based. It's somewhat unusual to see the same bug in both. I was curious, so I decided to look into it.
What Video Mixing Renderer 9 does
My first attempt to gain insight into VMR9's workings was to attach NVIDIA NVPerfHUD 3 to GraphEdit, with the aid of one of my proprietary adapter tools — but it crashed. Then I tried PIX for Windows to try to get a D3D call stream dump, but that never triggered. Trying to force it to use refrast failed too. Argh. So I just resorted to good old WinDbg and VTune.
I'd expected some neat pixel shaders to be used, but what VMR9 does in the common single-stream case is rather pedestrian: it calls CreateTexture() to allocate a texture and then StretchRect() to blast it onto the backbuffer. I'm not sure why it uses a texture, because an offscreen plain surface would actually be more flexible: more formats are usually available and fewer restrictions are imposed. If either UYVY or YUY2 is available as a texture format, it uses that; otherwise it falls back to X8R8G8B8 (32-bit RGB). It tries R8G8B8 (24-bit RGB) as well, but good luck finding a card that supports it. It also calls CreateTexture() without checking for format availability first, which is bad since the resulting error fires a breakpoint when the debug D3D runtime is enabled.
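The format selection just described can be sketched as a simple preference scan. The enum values below are hypothetical stand-ins for the corresponding D3DFORMAT codes, and real code would query IDirect3D9::CheckDeviceFormat() instead of consulting a flag array:

```c
#include <stddef.h>

/* Hypothetical stand-ins for the D3DFORMAT values involved. */
enum fmt { FMT_UYVY, FMT_YUY2, FMT_R8G8B8, FMT_X8R8G8B8, FMT_NONE };

/* Pick the first supported texture format, in the preference order the
   behavior above suggests: the YCbCr formats first, then 24-bit RGB
   (rarely supported), then 32-bit RGB as the last resort.
   'supported' is indexed by format; nonzero means available. */
enum fmt pick_format(const int supported[FMT_NONE])
{
    static const enum fmt prefs[] = {
        FMT_UYVY, FMT_YUY2, FMT_R8G8B8, FMT_X8R8G8B8
    };
    for (size_t i = 0; i < sizeof(prefs) / sizeof(prefs[0]); ++i)
        if (supported[prefs[i]])
            return prefs[i];
    return FMT_NONE; /* no usable format */
}
```

Doing the availability check up front, rather than just calling CreateTexture() and eating the failure, is also what keeps the debug runtime from firing a breakpoint.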
I was eventually able to track down the problem on an ATI X800XT with a hacked-up version of VirtualDub. On that card, if you create a UYVY or YUY2 surface, and then StretchRect() it, the luma values are interpreted as 0-255 and a contrast shift results. If you simply draw the texture to the screen with a quad, though, you get the expected 16-235 behavior. Odd.
On an NVIDIA GeForce Go6800, I was also able to see the difference in comparing VMR9 to the old Video Renderer with all acceleration disabled, but couldn't reproduce it programmatically. NVIDIA GPUs don't support UYVY or YUY2 textures, so the only option to use those formats is to use an off-screen plain surface — and yet, I couldn't see a problem StretchRect()ing one, nor could I catch VMR9 using one either. And yet, VTune doesn't show software conversion going on in VMR9 itself, and if I stick in an Infinite Pin Tee Filter to force a conversion elsewhere, the problem goes away. So I'm not really sure where the problem occurs here.
Judging by the behavior I saw, and the total lack of any description in the DirectX SDK as to how this conversion should work, I'm inclined to say that this isn't VMR9's fault, but either a common driver problem or an unfortunate mismatch between Direct3D and DIB/DirectDraw conventions for YCbCr. Fortunately, VirtualDub always draws quads when Direct3D display mode is enabled, so it's not affected by these oddities. Whew.
I dug up the old Reference Rasterizer source code from the DirectX 7 SDK, which I think is the last version that was publicly released, and examined its texture sampler. It too decodes as if black were 16/255 and white were 235/255, although I noticed that it doesn't interpolate chroma (boo). Unfortunately, refrast isn't helpful in determining which StretchRect() behavior is correct, because it doesn't support YCbCr-to-RGB conversions along that path. So much for a reference.
I didn't mean that the problem manifests itself with VMR7 (which usually uses overlay), just that our player supports VMR7 as a renderer (along with overlay mixer and VMR9).
There are actually several issues with color conversion (YUV to RGB) in either DirectShow or the display drivers when using VMR9. The first problem is the luma range, which usually shows up only if you output YV12. When outputting YUY2, the issue seems to go away (at least on my NVIDIA GF4).
I actually asked NVIDIA about this (a long time ago; this problem is as old as DirectX 9) and they said it's a Microsoft directive, or something of that nature, that the output luma range should be 16-235 (no conversion to the RGB luma range). Very confusing and inconsistent.
The other problem is that the conversion is inaccurate if it is carried out in the VMR9. Take two brightly lit scenes and take a screen capture with the VMR9 set to input YUY2 and then RGB32. Then just flip between the screen captures; you'll notice that the YUY2 image has a slight green tint to it. I'm not sure if this problem exists on all recent hardware, but it was very much visible on quite a few display cards. Possibly an NVIDIA issue, but I'm not 100% certain.
You can easily see this problem by playing, say, an Xvid AVI file with ffdshow as the decoder, setting ffdshow to output only YUY2 and then only RGB32 when taking the screenshots (this isn't specific to ffdshow; I've seen it with other decoders).
Blight (link) - 06 03 06 - 07:15
I noticed several things that appear to be related to this; at least they're related to VMR9. The following does not happen with MPC set to Overlay Mixer.
If I set ffdshow to output YUY2 only the video will be too bright. This carries over to screenshots made with MPC.
If I set it to output RGB32 the video looks good, both during playback and on screenshots.
If I set it to output YV12, however, the video is played back too dark, but screenshots look exactly like they do with RGB32 output.
Also, this does not seem to happen with all videos? On some of my XviDs it does happen, on some it doesn't. [?!]
I don't know much about video rendering, but I noticed that neither of you mentioned this. Maybe it helps.
NetSoerfer - 06 03 06 - 18:31
Argh. Wouldn't be the first time Direct3D had a dumb design decision. Let's see, half-pixel offset on NDC that makes projection matrices dependent on viewport size, mipmap sizes are rounded down, inconsistent vertical orientation of textures vs. render targets, can't read subrect with GetRenderTargetData(), GetFrontBufferData() is deliberately slow with an awful alpha-set loop....
I couldn't see any conversion code in VMR9, although I was running it on two SM3.0 systems. Unless someone got sloppy and rounded off a bunch of bits, I can't see why a green tint should result from such a conversion; GeForce 4s have at least 8 bits of fractional precision, and all a conversion takes is a handful of dp3s/mads.
Check your overlay color settings in the video driver's advanced configuration settings. Sometimes the overlay is set to a non-ideal setting on install.
Phaeron - 07 03 06 - 00:12
Strange, I thought it was the other way around though. Get this...
I disable DirectDraw completely; no overlays unless it's OGL I believe, which are disabled as well.
Overlays are disabled from the driver too, DD-based or something, I can't remember.
Meaning only one output window allowed; that's the best I can do there.
Anyway, the DirectDraw disable makes sure it's gone.
To the point.
Outputting to the default MS renderer, whatever it is...
Good color and all, best picture.
Buggy on 2K Pro for some reason but fine on 2K3; I get weird blocking effects in 2K Pro which would seem decoder related.
On VMR9 output...
I get around the same picture, a tiny bit blurry.
It's altered in some way, probably because it's a texture? (Even though I use a -15 mip bias.)
The color however is boosted.
This is because of the color boost stuff I've got going with the video card, an NVIDIA.
That color stuff isn't overlay specific; I don't know how those work or if they even do.
This is using Xvid as a playback decoder, because it handles most newer formats.
I force Xvid to output RGB 32-bit.
This totally fixes up my UYVY (or whatever it was called) captures.
Makes them look like they should, for some reason.
I mean, that's all I have to do: capture using a YUV format, 16- or 12-bit I believe, whatever it was.
Then convert during playback and it fixes it.
Perhaps it's doing some sort of interpolation with the colors, I dunno.
It may be affecting all my captures...
But that doesn't make sense.
I render out from a tee filter using VMR9.
Since I found VMR9 to be the fastest output window on a stock Win 2K3.
The tee filter splits the thing; the original, or what should be the original unless the tee filter screwed it up, is outputted directly to a file, no changes.
After the raw cap I convert to Xvid.
I don't touch the color format.
Straight from the file to Xvid.
Then I convert the color format in real time, during playback.
Saves a bit of space to do it this way...
Strange though, since I'm only using VMR9 during playback, "sometimes" (if I'm not stretching to 720x480, I'll use the default renderer).
That and to watch the cap in real time; and when I actually capture to my drives I do the same, so I can watch the progress, using a tee filter thingy.
Interesting topic though, for darn sure.
NEOAethyr - 11 03 06 - 17:35
As I've said before, a -15 mip bias is absolutely ridiculous. In this case, though, it makes no difference whatsoever because (a) there are no mipmaps on the texture, and (b) StretchRect() works from a surface, not a texture.
What can cause a blur is forcing full-scene antialiasing (FSAA) on in the driver. Supersampling causes multiple texture samples to be taken per pixel and blended together, which blurs the result. Even though multisampling writes the exact same color value into every subpixel, it too can blur the video because a blur post-pass is done over the result. If you have FSAA forced on, try turning it back to Application Controlled and see if that helps.
Phaeron - 12 03 06 - 16:48
It turns out a certain AVC decoder named CoreAVC is capable of tweaking the downstream renderer such that it always passes full PC range, even though the input pin has a YUV connection formattype. See here for more details:
Isochroma - 04 12 06 - 22:52
It is definitely a driver issue. For ATI cards the solution is to use drivers up to version 6.5. Newer ones use one color conversion for SD and a different one for HD. In 6.5 the output is always in the 16-235 scale, so the display can be calibrated for it.
I don't know why it was changed; there is no information about it in the release notes.
Piotr Wozniak - 27 02 07 - 13:57
The explanation is this:
ATI and NVIDIA followed the Microsoft recommendation for YCbCr->RGB conversion, and that is why drivers from both manufacturers have this problem.
It is not a bug in the sense that ATI and NVIDIA followed the Microsoft specification, but the Microsoft recommendation is not correct.
Microsoft specifies a conversion from YCbCr to studio video RGB; studio video RGB has an excursion of 16-235 like YCbCr. They should have specified computer RGB instead, which has an excursion of 0-255.
The graphics cards handle studio video RGB like computer RGB data, so studio video RGB black (16,16,16) is handled as computer RGB data and displayed as a dark grey.
The problem is on Microsoft's end, not ATI's or NVIDIA's.
touco - 10 02 08 - 09:19