I would strongly advise you to wait for the G80/R600. The main reason to wait for DX10 right now is NOT that the cards are DX10-compliant, but simply that you'll be getting much better performance for the price. And since you probably won't be upgrading again for a decent amount of time, at least you'll have yourself a fully DX10-compliant video card.
Currently on the release schedule is the nVidia G80 - two models announced:
8800GTX and 8800GTS. Word has it the 8800GTX performs around the same as a 7900GTX SLi setup, and the 8800GTS around 7900GT SLi. Release is expected around mid-November. Hopefully that won't be a freakin' paper launch, but who knows.
Features include higher clocks, Shader Model 4, 768MB GDDR3 (~640MB for GTS),
128 unified shaders.. and one of the best new features: OpenEXR HDR + AA support with FP16. That was the main reason that kept me off the 7950GX2 (or any of the G7x cards).. no OpenEXR HDR + AA support. It will be nice to test out 16xAA too.
ATi has their R600 in the works.. scheduled for Christmas but it may be delayed (the AMD acquisition, engineering issues.. whatever). The flagship model is expected to perform similarly to an X1950XTX CF setup.
If you REALLY can't wait, then the best bang/buck cards out right now are:
- 7950GT 256MB (easily OCs to GTX speeds.. only slightly more expensive than the 7900GT right now).
- 7900GT 256MB (get one for as low as $330; a volt mod will let you OC it close to GTX speeds).
- X1900XT (~$400 right now.. simply unbeatable when it comes to bang/buck)
- Gainward 7900GS 512MB (these cards boast a superior cooler and 1.4ns Samsung memory, and they OC to GTX speeds. In other words, you can get yourself an SLi system which performs between a 7950GX2 and a 7900GTX rig. The only reason they won't perform as well as a GTX when OCed to 650/1600 is that they're gimped by 4 pipes.)
- X1900PRO - ATi fixed up some important CF issues with these newer R580+ chips.. and AT's benchmarks speak for themselves. A very strong contender in the mid-range market.
WhiteDeth said:
The 7900GT is more than enough to run Oblivion at high settings (speaking from experience here). Hell, the 6800 Ultra would be good enough to run it on mid-high settings.
My laptop has a 7900GS, and it runs Oblivion at the highest settings with no problems. I won't be upgrading it or buying a new laptop anytime soon either; there's no need to. DX10 GPUs will only matter once an operating system that supports DX10, and games that support it, come out. At the moment, not only is Vista a good distance away, but games built for it will take even longer.
I'm betting on a timeframe of a year or so.
Oblivion is one of the most graphically demanding games out right now. You must:
a) be joking,
b) have done the volt mod and OCed the GT to GTX speeds, or
c) have a different definition of 'reasonable performance' from mine.
Go take a look at Anandtech's Oblivion performance guide. The conclusion was that this is possibly the first game ever to require SLi/CF to run at high settings with good performance. You won't have a problem in dungeons and towns, but once you hit the open land, with battles as well as the Oblivion gates, your frame rate will tank.
Perhaps we're at opposite ends of the spectrum here when it comes to 'running it at high settings'. I would personally define that as 12x10, most settings maxed, with HDR. Pretty much every single-card solution out right now besides the 7950GX2 runs pretty poorly at these settings in certain areas of the game. As for the 6800Ultra comment, I can tell you now that my X800XT-PE doesn't do so well at those settings at, say, the Oblivion gates. And I'd advise you to take a look at AT's benchmarks for the Gate test. The 6800Ultra isn't there, but the similarly performing 7600GT is. If you're telling me that 14.7fps is 'no problem'.. then I guess we'll just have to conclude our definitions of reasonable performance are quite different.