| Posted: January 13 2004 at 4:51pm | IP Logged
Windows Media Encoder profiles allow the content creator to choose between emphasizing video quality and frame rate (video smoothness). Unfortunately, you can't have your cake and eat it too, so increasing quality in one area always means sacrificing the other. I think that's what is happening in your scenario: the encoder must choose between keeping the image quality high and keeping the frame rate high. This trade-off keeps CPU utilization from getting maxed out and also prevents the encoder from falling behind.
To illustrate this, start Windows Media Encoder and choose "Create a custom encoding session". Choose your sources, then click the Profiles tab and click the Manage Profiles button. Find the profile you're using, copy it, and edit the copy. Give the copied profile a new name, then click Next until you reach the "Common Stream Settings" screen. You'll notice an Advanced button there; clicking it lets you adjust the amount of buffering performed on the encoder side. This is the number of seconds of content stored in memory before encoding begins. A larger buffer gives better results, but the trade-off is higher memory usage. The player will also buffer the same number of seconds before playback in order to give the viewer a smoother experience, meaning that in live events the player will be behind the encoder by this number of seconds in addition to any buffering on the server.
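As a rough back-of-the-envelope check, the cost of a larger encoder buffer works out like this (the memory figure is only an estimate from the stream bitrate, not something the encoder reports, and the numbers below are made up for illustration):

```python
def buffer_tradeoff(buffer_seconds, bitrate_kbps, server_buffer_seconds=0):
    """Estimate encoder-side memory use and minimum live delay for a
    given encoder buffer size. Purely illustrative arithmetic."""
    # Approximate memory needed to hold the buffered stream, in kilobytes
    # (bitrate is in kilobits per second, so divide by 8).
    memory_kb = buffer_seconds * bitrate_kbps / 8
    # The player buffers the same number of seconds before playback, so a
    # live viewer lags the encoder by at least buffer + server buffering.
    min_live_delay_seconds = buffer_seconds + server_buffer_seconds
    return memory_kb, min_live_delay_seconds

mem, delay = buffer_tradeoff(buffer_seconds=5, bitrate_kbps=300,
                             server_buffer_seconds=3)
print(mem)    # 187.5 KB of buffered stream data on the encoder
print(delay)  # viewer is at least 8 seconds behind the live event
```

Doubling the buffer doubles both the memory use and the extra live delay, which is why the default is kept small.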
On the next screen, "Individual Video Stream Settings", you will see two settings in addition to the frame rate. The first is Image Quality, which can be set from 0 (smoother video) to 100 (clearer images). Even with the CPU at only 20% to 40%, setting this to 100 can result in dropped frames, because the encoder must decide whether to sacrifice image quality or frame rate, and it decides based on this setting to emphasize image quality over the frame rate. Without this decision-making process the encoder would try to do everything perfectly, CPU usage would increase dramatically, and it would eventually fall so far behind that dropped frames would increase radically. Try setting this to 0 and encoding a file with the same specs as before: in almost every instance no frames are dropped, but there is a significant decrease in image quality.
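The encoder's internals aren't public, but the decision process described above can be sketched conceptually. The function, threshold, and timings below are invented for illustration; this is not the actual Windows Media Encoder algorithm:

```python
def choose_frame_action(encode_time_ms, frame_budget_ms, quality_setting):
    """Conceptual model of the quality-vs-smoothness trade-off.
    quality_setting: 0 (favor smoothness) .. 100 (favor clear images).
    Invented for illustration only."""
    if encode_time_ms <= frame_budget_ms:
        # Keeping up with real time; no trade-off is needed.
        return "encode at full quality"
    if quality_setting >= 50:
        # Emphasize image quality: spend the encoding time anyway,
        # and drop frames when the encoder falls behind.
        return "drop frame, keep quality high"
    # Emphasize smoothness: compress more coarsely so every frame
    # fits within its time budget.
    return "reduce quality, keep frame"

# At 30 fps each frame has a ~33 ms budget; a 50 ms frame can't keep up.
print(choose_frame_action(50, 33, quality_setting=100))  # drop frame, keep quality high
print(choose_frame_action(50, 33, quality_setting=0))    # reduce quality, keep frame
```

This is why Image Quality = 100 produces dropped frames even with spare CPU, while Image Quality = 0 keeps every frame at the cost of clarity.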
The second setting is Key Frame Interval. Increasing this number costs slightly less image quality and is generally considered appropriate when the source video is somewhat static, as in "talking head" news footage or cartoons, where the background doesn't change much. Decreasing it is desirable when there is a lot of motion in the source video and maintaining high-quality images is important. If you experiment with this, you should find that leaving all the settings from your profile below the same and changing this to 1 increases the number of dropped frames slightly, while increasing it to, for example, 7 from 4 (which I think is the default for most profiles) decreases the number of dropped frames.
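To put the interval in concrete terms, here is the arithmetic, assuming (as in the WME stream settings) that the interval is specified in seconds; the frame rate and clip length are example values, not from your profile:

```python
def key_frame_stats(interval_seconds, fps, clip_seconds):
    """How often full (key) frames occur for a given interval.
    Example arithmetic only; all inputs are hypothetical."""
    # Every frame between key frames is a delta frame, which is
    # cheaper to encode but depends on the frames before it.
    frames_between_keys = interval_seconds * fps
    key_frames_in_clip = clip_seconds // interval_seconds
    return frames_between_keys, key_frames_in_clip

# A 7-second interval at 30 fps over a one-minute clip:
frames, keys = key_frame_stats(7, 30, 60)
print(frames)  # 210 frames between key frames
print(keys)    # 8 key frames in the clip
```

Fewer key frames means less work per second for the encoder, which is why a longer interval reduces dropped frames on static content.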
The settings I think are affecting your results are Image Quality and Key Frame Interval. I recommend experimenting with these to see how they impact your results.
Let me know if you have questions,
-Steve