Why does my AVI file show strange bit rates in Windows Vista?

Recently, a poster on my forum asked why Windows Vista reports the wrong bit rate when displaying Properties for AVI files written by VirtualDub, compared to GSpot, a common media file analysis utility.

After some experimentation, I found that the answer is a combination of some silliness in Windows Explorer and an omission in VirtualDub.

First, let's take a sample file with the following info (note that I'm particular about which items I list, for an important reason):

Let's start with Windows XP, which, as it turns out, computes data rates as follows:
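
In C++ terms, it boils down to something like this (a sketch only -- the function and parameter names are mine):

    // Windows XP Explorer's displayed data rates, approximately. The names
    // here are mine; audioBytesPerSec is the audio stream's average
    // bytes/sec (nAvgBytesPerSec in WAVEFORMATEX).
    double xpAudioRateKbps(double audioBytesPerSec) {
        return audioBytesPerSec * 8.0 / 1000.0;   // bits, SI units (1k = 1000)
    }

    double xpVideoRateKbps(double fileSizeBytes, double durationSec) {
        // The *entire* file size -- headers, audio, trailing junk and all --
        // divided using binary units (1k = 1024), yet still labeled "kbps".
        return fileSizeBytes / durationSec / 1024.0;
    }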

If you look closely, you'll notice there's a big units problem here -- the audio bit rate is computed using SI units (1k = 1000), but the video byte rate using binary units (1k = 1024), and both are labeled as kbps. That's incorrect and confusing, but it's what Windows XP Explorer does. In this case, the actual video rate is closer to 400 kbps (1 kbps = 1000 bits/second). The other thing that's really bogus here is that the video rate is computed from the size of the entire file, which includes headers, audio, and whatever junk may be at the end. As a test, I concatenated an AVI file to itself back-to-back, and the reported video rate doubled. Oops.

So, basically, ignore whatever Windows XP Explorer says.

That brings us to Windows Vista. For some AVI files, it does display reasonable bit rates, but for files written by VirtualDub, it still displays strange values. I'll spare you all of the experiments I did and just give you the actual algorithm used by Windows Vista Explorer:
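
Here it is as a C++ sketch -- the names are mine, and the logic is pieced together from the observed cases described below, not from Vista's actual code:

    #include <algorithm>

    struct VistaRates {
        double videoKbps;   // "Data rate" shown under Video
        double totalKbps;   // "Total bitrate" shown under Video
    };

    VistaRates vistaRates(double fileSizeBytes, double durationSec,
                          double audioBytesPerSec, unsigned dwMaxBytesPerSec) {
        double FileByteRate = fileSizeBytes / durationSec;      // bytes/sec
        double fileRateKbps = FileByteRate / 1000.0;            // bytes passed off as bits!
        double audioKbps    = audioBytesPerSec * 8.0 / 1000.0;  // an actual bit rate

        VistaRates r;
        if (dwMaxBytesPerSec == 0) {
            // The VirtualDub case. E.g. with FileByteRate = 491,000 bytes/sec:
            //   audio 1536 kbps -> video 491 kbps, total 1536 kbps
            //   audio  489 kbps -> video   2 kbps, total  491 kbps
            r.videoKbps = audioKbps >= fileRateKbps ? fileRateKbps
                                                    : fileRateKbps - audioKbps;
            r.totalKbps = std::max(fileRateKbps, audioKbps);
        } else {
            // Any nonzero value, even 1 byte/sec, yields the file byte rate
            // for both fields: video 491 kbps, total 491 kbps.
            r.videoKbps = fileRateKbps;
            r.totalKbps = fileRateKbps;
        }
        return r;
    }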

This is as awful as it looks. Bit rates and byte rates are mixed up again, if you look at how FileByteRate is used. If you have a file with an audio bit rate of 1536 kbps and a computed file byte rate of 491,000 bytes/sec, it will report a video data rate of 491 kbps and a total bit rate of 1536 kbps. However, if the audio stream is changed so that the audio bit rate is 489 kbps, the video bit rate changes to 2 kbps and the total bit rate to 491 kbps. And if the audio bit rate is 1536 kbps but avih.dwMaxBytesPerSec is set to 1 byte/sec, Vista reports video data rate = 491 kbps, total data rate = 491 kbps, audio bit rate = 1536 kbps. To add to the confusion, the total bit rate field is listed under Video, even though it sometimes reflects both audio and video and sometimes doesn't. Trust me, I spent a lot of time double-checking these results with a hex editor because I couldn't believe it myself.

There's a new field involved here, which is dwMaxBytesPerSec in the main AVI header. This is supposed to be the maximum number of bytes per second required to play the file. For legacy reasons which I can't remember or justify, VirtualDub writes zero into this field, and that's what causes the anomalous behavior. That leaves the question of what DirectShow puts in this field, and the answer is that it computes the field as follows:
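
Roughly, as a C++ sketch (names mine again):

    // What DirectShow writes into avih.dwMaxBytesPerSec, approximately:
    // every byte it wrote -- headers and indices included, trailing
    // sector-alignment junk excluded -- over the duration truncated to
    // whole seconds. (For a file under one second this would divide by
    // zero; that's untested territory.)
    unsigned computeMaxBytesPerSec(unsigned long long bytesWritten,
                                   double durationSec) {
        unsigned long long wholeSeconds = (unsigned long long)durationSec;
        return (unsigned)(bytesWritten / wholeSeconds);
    }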

DirectShow will sometimes write out some junk at the end of the file due to sector alignment, which is not counted in the above, but it still includes headers and indices in the bandwidth calculation. Since it truncates the duration, it will often be less accurate than Explorer for short video files. (I didn't test it with a video file shorter than one second.) I can't say that DirectShow's value makes much more sense: presumably this field was meant to let a player quickly detect that a CD-ROM drive couldn't keep up with a file, and an average over the entire file is hardly a good statistic for that, since it fails with variable bit rates. In the absence of an actual buffering model, though, there isn't much else that could be specified. MPEG-1 avoids this by specifying a buffering model called the Video Buffering Verifier (VBV), which precisely simulates when bits arrive in and leave a fixed-size buffer, so that a player can always tell whether a correctly labeled file will play. AVI specifies no such model, so any attempt to put a more faithful value here would likely be futile.
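
For illustration, here's a toy version of that kind of simulation in C++ -- everything about it (names, inputs, the instant-removal model) is a simplified assumption, not the actual MPEG-1 accounting:

    #include <vector>

    // Toy VBV-style check: bits arrive in a fixed-size buffer at a constant
    // rate, and each frame's bits leave instantly at its decode time.
    // Returns false if the buffer ever overflows or underflows.
    bool vbvCompliant(const std::vector<double>& frameBits,
                      double bitsPerSec, double fps, double bufferBits) {
        double fullness = 0.0;
        const double fillPerFrame = bitsPerSec / fps;  // bits arriving per frame period
        for (double bits : frameBits) {
            fullness += fillPerFrame;
            if (fullness > bufferBits)
                return false;   // overflow: bits arriving faster than they're consumed
            if (bits > fullness)
                return false;   // underflow: the frame's bits haven't all arrived yet
            fullness -= bits;   // decoder removes the whole frame at once
        }
        return true;
    }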

I'll probably just end up matching DirectShow's formula, for lack of a better way to compute this field and for consistency with Explorer's, um, interesting interpretation. In the meantime, if anyone's got access to the Windows bug database, feel free to reference this page....

Comments

This blog was originally open for comments when this entry was first posted, but commenting was later closed due to spam, and the comments themselves were removed after a migration away from the original blog software. Unfortunately, it would have been a lot of work to reformat the comments for republishing. The author thanks everyone who posted comments and added to the discussion.