Assuming one gets a good framerate at 1600x1200 with no anti-aliasing or anisotropic filtering...
or the same framerate at 1152x864 with 4xAA and 8xAF, which should look better?
Is there some rule of thumb for a trade-off between greater resolution vs lower res with AA and AF?
Specifically, I'm wondering about Half-Life 2 running on a 20.1" flat panel that has a native resolution of 1600x1200. Doom 3 is similar, but I didn't notice an AF adjustment, just AA.
Honesty compels me to admit I can't tell the difference, so the matter is academic - it would be nice to have the settings at their optimum for when the kids (whose eyes are better) visit, though. :oops:
With LCDs, always run at the native resolution if you can. Image quality degrades radically when the panel has to interpolate a non-native resolution.
Anisotropic Filtering (AF) has a small performance hit, and improves texture quality for objects rendered at a distance or at a funny angle to the point of view. Modern graphics cards can typically go up to at least 4xAF with little or no performance hit, and the visual difference is nice.
Doom 3 automatically enables 8xAF at High or Ultra quality; otherwise AF is off. You can force AF in the console if you want to. "seta image_anisotropy" followed by the number is the command IIRC.
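If I'm remembering the cvar right, forcing it would look something like this (open the console with Ctrl+Alt+~; I believe the valid values run from 1 to 8, with 1 effectively meaning no anisotropy):

```
seta image_anisotropy 8    // force 8x anisotropic filtering
vid_restart                // restart the renderer so the change takes effect
```

You can also put the seta line in your DoomConfig.cfg so it sticks between sessions.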
Anti-Aliasing (AA) smooths jagged edges. Its performance hit is much bigger than AF's, but modern graphics cards can handle 2x at high resolutions and 4x at lower resolutions. Overall, more resolution is better, but if jagged edges really annoy your eye, drop the resolution and increase the AA level.