Final Words

Quite frankly, neither method, in our opinion, is necessarily better or worse than the other; both get the overscan compensation job done. Granted, NVIDIA's overscan shift doesn't let you see the entire desktop at once and depends on where the cursor is, while ATI's method manipulates the timings of a specific resolution so that it is optimized for a specific HDTV format. On the other hand, NVIDIA's solution is much easier to set up and interact with at the settings level.
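To put the timing approach in concrete terms, here is a minimal sketch of the underlying idea, assuming a symmetric crop of roughly 5% per edge (a made-up figure; actual overscan varies from set to set, and this is not ATI's actual algorithm): shrink the active desktop area until it fits inside the portion of the picture the TV actually shows.

    # Rough illustration only: given an HDTV output mode and an assumed crop
    # per edge, find the largest "safe" desktop area that stays fully visible.
    def compensated_resolution(width, height, crop_per_edge=0.05):
        visible_w = int(width * (1 - 2 * crop_per_edge))
        visible_h = int(height * (1 - 2 * crop_per_edge))
        # Snap down to multiples of 8, which display timings generally prefer.
        return visible_w - visible_w % 8, visible_h - visible_h % 8

    print(compensated_resolution(1280, 720))    # 720p  -> (1152, 648)
    print(compensated_resolution(1920, 1080))   # 1080i -> (1728, 968)

The catch, as noted above, is that each compensated mode only suits one particular HDTV resolution.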

However, we should note that neither solution provides a decent way to fix the overscan problem as it relates to gaming. Either way you go, playing games, locally or online, is impractical, because you either can't see the whole screen or the image output is too small. Games run at predetermined resolutions, so, for example, running Jedi Academy at 1024 x 768 for a 720p output still produces overscan.
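To give a rough sense of the numbers (again assuming a hypothetical 5% crop per edge, which will vary by set), here is how much of a fixed-resolution game frame never makes it to the screen:

    # Hypothetical figures: a 1024 x 768 game frame output over 720p, shown on
    # a set that crops about 5% of the picture on every edge.
    game_w, game_h = 1024, 768
    crop = 0.05
    print(int(game_w * crop), int(game_h * crop))
    # ~51 columns and ~38 rows are lost along each edge, which is exactly
    # where HUD and menu elements tend to sit.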

To be clear, overscan in games is an issue with ATI's HDTV output. With NVIDIA, there is no overscan, but resolution handling is very poor: 1024 x 768 in Jedi Academy under a 720p output produces an image that takes up less than half as much of the screen as it does in desktop mode, and enlarging the overall image size doesn't change this. That doesn't put NVIDIA at the head of the pack either. If you are outputting to an HDTV for other purposes, such as PVR use or web browsing, this isn't much of an issue, since you are dealing with the overscan of the desktop resolution rather than that of a program. Basically, when it comes to gaming, we are going to recommend using native computer resolution output rather than toying around for hours with HDTV output settings that you can only somewhat tolerate.

If you are shopping around for a video card that can do HDTV output, ATI provides more purchasing options and is obviously the cheapest way to go if you already own a supported card, but its HDTV output support doesn't provide a fix that everyone is going to be happy with. If you can wait a bit, NVIDIA is supposed to add HDTV output via DVI in ForceWare 65.xx, which won't cost a dime (minus internet-associated costs) to download and try out if you already own a GeForce FX or GeForce 6800 card. Meanwhile, ATI has no comment on future driver support for HDTV output via DVI.

What NVIDIA does need is a component output dongle or some sort of separate attachment to provide more comprehensive HDTV output support across its various cards on the market. Until that happens, eVGA's GeForce FX 5700 Personal Cinema will have to do, which isn't saying much, considering that it is NVIDIA's fastest multimedia card that does HDTV output via component cables.

Comments

  • AndrewKu - Wednesday, August 25, 2004 - link

    #2 - True. Hopefully, there will be more convergence in the spec.
  • aw - Wednesday, August 25, 2004 - link

    Keep in mind that for HDTV gaming, consoles are actually ahead of computers. A lot of games on the Xbox and a few on PS2 come in 480p/720p/1080i flavors, so they look great on the HDTV without any screwing around. Hopefully, computer game makers will start offering standard HDTV resolutions soon in all games... I have no idea how practical that is, but if consoles can do it, I assume they can too...
  • Questar - Wednesday, August 25, 2004 - link

    "It is implemented deliberately on TV sets because of the different video input formats: composite, s-video, etc., all of which the TV needs for which to provide support. If overscan was not implemented as a factor of these different formats, there would likely be underscanning of different degrees on different TV sets. This is due to the different protocols and inherently different signals that the TV needs to handle."

    Where do you come up with this crap?

    Overscanning is to eliminate the black bars around the picture. This was done long, long before there was s-video, component inputs, etc.

    The type of input has nothing to do with overscan.
