
ATi prepares for slaughter as nVidia announces 512MB 7800GTX

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
Recap: nVidia released their new top-end card, the Geforce 7800GTX 256MB a few months ago.
ATi, being quite late with its response.. is finally about to release the X1800XT in a week or so. It also comes with 512MB of RAM, which will appeal to buyers looking for longevity.

Now nVidia plans to announce a 512MB version of their own card, with a release date expected to land right around ATi's. Assuming half a brain is present, we know that's gonna fuck up ATi's sales.

OH, to add insult to injury.. it's supposed to be cheaper too.

But things aren't looking too bad, as ATi controls the Direct3D line of games and is starting to make hurtful advances into nVidia's OpenGL realm (recall countless games lately with the 'way it's meant to be played' tag getting owned by ATi). For this reason, in the long term I would say nVidia might be in for some deep shit.

Source
 

Luke_D

Member
Joined
May 2, 2005
Messages
106
Location
West Penno, Sydney
Gender
Male
HSC
2005
Haha! Cool info man :)

I personally like ATi. I've got a Radeon 9800 Pro right now. I remember when those were the bomb haha.
 

insert-username

Wandering the Lacuna
Joined
Jun 6, 2005
Messages
1,226
Location
NSW
Gender
Male
HSC
2006
Having the best card is nothing if consumers can't get their hands on it. I think nVidia will continue to eat into ATi's market share simply because it has a price advantage over its competitor (having released its current gen products earlier) and it has cards available on release day in volume.


I_F
 

Templar

P vs NP
Joined
Aug 11, 2004
Messages
1,979
Gender
Male
HSC
2004
And to add to the insult, ATi probably won't have sufficient stock for the currently planned release date. In the end, it doesn't matter how technologically advanced your product is: if you can't deliver on time, you've lost consumers' trust and, as a result, their wallets.
 

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
insert-username said:
Having the best card is nothing if consumers can't get their hands on it. I think nVidia will continue to eat into ATi's market share simply because it has a price advantage over its competitor (having released its current gen products earlier) and it has cards available on release day in volume.
Yup, the price advantage is gonna be a killer for ATi's sales.


playboy2njoy said:
Ah haha, its been stated NUMEROUS times that 512mb has no huge amount of difference compared to 256mb. This means nothing.
It's been stated numerous times that it doesn't for CURRENT games. But getting a 512MB card ensures LONGEVITY. I mean c'mon, the average user who does decide to invest in an $800 video card surely doesn't wanna upgrade again for a while.

Templar said:
And to add to the insult, ATi probably won't have sufficient stock for the currently planned release date. In the end, it doesn't matter how technologically advanced your product is, if you can't deliver on time, you've lost their trust and as a result their wallets.
Well I hope ATi has enough stock for release. I mean it would seem pretty stupid if they didn't learn that lesson from the X800s last year. I have a feeling they wouldn't release it this time without having sufficient stock.. but again you never know.
 

insert-username

Wandering the Lacuna
Joined
Jun 6, 2005
Messages
1,226
Location
NSW
Gender
Male
HSC
2006
Collin said:
I have a feeling they wouldn't release it this time without having sufficient stock.. but again you never know.

I assume that's why it's taken so long for the X1800 XT to appear. They want to make sure they have enough stock. Unfortunately, that causes them to fall way behind in the price cycle, making the extra stock somewhat useless. It's a catch-22 situation, really.


I_F
 

Enoch

ur a closet enoch-sexual!
Joined
Oct 15, 2004
Messages
452
Location
sydney
Gender
Male
HSC
2005
well if ur gonna compare those 2 cards the 1800xt will kill the 7800gtx ....cos 256mb and 512mb 7800gtx wont be that much of a difference
 

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
insert-username said:
I have a feeling they wouldn't release it this time without having sufficient stock.. but again you never know.

I assume that's why it's taken so long for the x1800 XT to appear. They want to make sure they have enough stock. Unfortunately, that lends them to fall way behind in the price cycle, making the extra stock somewhat useless. It's a catch-22 situation, really.


I_F
That's what I think too, although there were other reasons specified for it:

"The information on the R520 suggests that there are some extreme yield problems with the current design. Not only are there few working dice per wafer, but a large number of those dice only have 16 pixel units working, and others are lucky to get 24 working. The information I received suggested that the R520 was in fact designed with 32 pixel units (each with multiple ALU’s), but due to the issues that the chip is facing, very few of them so far are fully functioning. There are of course fully functioning parts that have been shown behind closed doors, and apparently Abit showed off a working card at ACon5 that scored some impressive 3DMarks." - Penstarsys

Another reason, according to the graphics card makers, is that the launch was delayed because ATI is trying to offload more X850 GPUs before releasing the R520. ATI has built up a considerably large inventory of X850 GPUs, so they are hoping the recent launch of the CrossFire editions will help clear it out.

enoch said:
well if ur gonna compare those 2 cards the 1800xt will kill the 7800gtx ....cos 256mb and 512mb 7800gtx wont be that much of a difference.
I wouldn't say 'kill'. At the moment the X1800XT is neck and neck with the 7800GTX in HL2 at 0xAA/AF. With 4xAA/8xAF, you get about a 20% increase over the 7800GTX. In Doom 3, the X1800XT gets flogged.. and erm, even has trouble keeping up with the 7800GT, priced about 40% less. Considering the 512MB GTX will cost less than the X1800, I don't see how the X1800 will 'kill' the GTX. I doubt either card will kill the other.. although ATi's range seems to have the better prospects at the moment (on-par performance without having to look at raising pipelines yet, better optimisation in the caches and memory controllers making it a dominator with AA/AF enabled etc.).. and already it seems ATi's SM 3.0 implementation is better than nVidia's second-gen attempt.
 

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
News just in: Looks like there's more to the 512MB version than meets the eye. Apparently nVidia is planning to raise the clocks on the card as well, from 430/1200MHz (original 7800GTX 256MB) to 550/1800MHz. Pretty much the same as calling it a '7800GTX Ultra', if you will.. although nVidia isn't planning to use a new name other than '7800GTX 512MB'.

So yes, very much looking forward to any benchmarks of the 512MB version..

If I got a 256MB.. I'd be very pissed off right now.. lololol
 

SashatheMan

StudyforEver
Joined
Apr 25, 2004
Messages
5,656
Location
Queensland
Gender
Male
HSC
N/A
The only way this new release will affect me is through the price drops on the other high-end cards, which might make me buy an nVidia 6800GT or something that's no longer top of the line.
 

insert-username

Wandering the Lacuna
Joined
Jun 6, 2005
Messages
1,226
Location
NSW
Gender
Male
HSC
2006
Collin said:
(on-par performance without having to look at raising pipelines yet, better optimisation in the caches and memory controllers making it a dominator with AA/AF enabled etc

ATi's X1800 XT is clocked 50% faster than the 7800 GTX, which is why they stay on par with fewer pixel pipelines. If ATi raises their pipeline count and nVidia raises their clockspeed, they should stay on a par. But a 550MHz 7800 GTX with 24 pipelines... that would be a monster of a card ...


I_F
 

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
insert-username said:
(on-par performance without having to look at raising pipelines yet, better optimisation in the caches and memory controllers making it a dominator with AA/AF enabled etc

ATi's X1800 XT is clocked 50% faster than the 7800 GTX, which is why they stay on par with fewer pixel pipelines. If ATi raises their pipeline count and nVidia raises their clockspeed, they should stay on a par.


I_F
Exactly.. what I meant was ATi hasn't had to raise pipes yet (but simply raise the clock) to achieve the same performance as a card with more pipes. I always tend to view this as the better option, since I see raising the pipes as the big move.. so if you can hold back from raising pipes by squeezing every last bit of performance out of the pipes you have now, that's a good way to do things. I mean, assuming ATi has totally exhausted all efforts to improve performance with their current 16-pipe architecture by squeezing every last MHz out of it.. then increasing the pipes to 24 like the GTX would result in a 625MHz, 24-pipeline monster! Of course.. it's not as simple as that once you consider heating issues too, but that's the basic drift. :)
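The pipes-versus-clock trade-off being argued here comes down to simple fill-rate arithmetic. A minimal sketch, using only the clock and pipeline figures quoted in this thread (theoretical peak pixel fill rate only; real performance also depends on memory bandwidth, drivers, AA/AF efficiency, etc.):

```python
# Theoretical peak pixel fill rate = pixel pipelines x core clock.
# Clocks and pipe counts are the figures quoted in the thread.
def fill_rate_mpix(pipes, clock_mhz):
    """Theoretical fill rate in megapixels per second."""
    return pipes * clock_mhz

gtx_256 = fill_rate_mpix(24, 430)   # GeForce 7800 GTX 256MB -> 10320
x1800xt = fill_rate_mpix(16, 625)   # Radeon X1800 XT        -> 10000
gtx_512 = fill_rate_mpix(24, 550)   # rumoured 512MB GTX     -> 13200

print(gtx_256, x1800xt, gtx_512)
```

This shows why the 16-pipe X1800XT keeps pace with the 24-pipe GTX (10,000 vs 10,320 Mpix/s), and what a hypothetical 24-pipe part at ATi's 625MHz clock would look like on paper: 15,000 Mpix/s.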

As for nVidia's new card, I'm quite scared. To be honest, I think I'm more of an ATi fanboy than an nVidia one (or perhaps I'm just siding with the underdog?).. and what with their stock troubles last gen, the criticisms against the X800s (Doom 3 performance, lack of SM 3.0).. etc., and now this, I can't help but feel sorry for them.
 

insert-username

Wandering the Lacuna
Joined
Jun 6, 2005
Messages
1,226
Location
NSW
Gender
Male
HSC
2006
I feel the problem with higher clockspeeds is that they raise your power requirements and heat production faster than more pixel pipelines do. The GTX is a single-slot design with 24 pipes, whereas the XT is double-slot with only 16 pipes. I actually take the opposite view to you: I think lower clockspeed/more pipes is the better option, because I regard the heat and cooling factor as very important (your point is very valid though, it takes a lot of R&D to add more pipes, much more than to raise clockspeed). NVIDIA tried the high-clockspeed/fewer-pipes option with their now-legendary FX 5800 series, and they got utterly blasted for the "lawnmower" cooler and lacklustre performance, whilst ATi's solution, with more pipes and a lower clockspeed, sailed merrily by. It'll definitely be interesting to see just what ATi's R580 will be able to do, and whether it'll take back the crown for them.
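The claim that clockspeed drives power and heat up faster than extra pipes do can be illustrated with the standard first-order CMOS dynamic power model, P ≈ C·V²·f. The numbers below are purely illustrative, not real GPU figures:

```python
# First-order CMOS dynamic power model: P = C * V^2 * f
# (C = switched capacitance, V = supply voltage, f = clock frequency).
# Adding pipes grows C roughly linearly at a fixed voltage and clock,
# but raising the clock usually requires a voltage bump as well,
# so power grows faster than linearly with frequency.
def dynamic_power(c, v, f):
    return c * v ** 2 * f

baseline     = dynamic_power(1.0, 1.0, 1.0)  # reference design
more_pipes   = dynamic_power(1.5, 1.0, 1.0)  # +50% pipes, same clock
higher_clock = dynamic_power(1.0, 1.1, 1.5)  # +50% clock, +10% voltage

print(more_pipes, higher_clock)
```

Under these illustrative assumptions, 50% more pipes costs 1.5x the power, while a 50% clock bump with its accompanying voltage increase costs about 1.8x, which matches the single-slot GTX vs double-slot XT observation above.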


I_F
 

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
Yup, heat is an issue alright. In fact it's pretty much certain that the new GTX will be dual-slot too.. but the Inquirer suggested that nVidia must have used a new cooling system altogether.. I mean, even a decent water-cooling system could only get the current 256MB GTX to around 500-520MHz (and 520MHz isn't even stable), so for nVidia to OC that bitch to 550MHz on stock air cooling.. well, I think they did quite a few more mods than we would have thought.

But anyway, I can't stop thinking about how cut you would be right now if you already got a 256MB GTX... lolol I mean, 30% higher clock + 512MB vRAM + CHEAPER TOO.. = LOL pwnz0r.
 

Templar

P vs NP
Joined
Aug 11, 2004
Messages
1,979
Gender
Male
HSC
2004
Looks like just when ATi thought it might gain the upper hand, nVidia unleashes the cavalry. If it all goes ahead, it looks like a very bloody battle... with all the gore coming from one combatant.
 

Collin

Active Member
Joined
Jun 17, 2003
Messages
5,084
Gender
Undisclosed
HSC
N/A
Yeh, technology is amazing. The first computer I got (in '93) was a 66MHz 486 with 4MB of RAM and a 250MB hard disk.. and that was around $2500 :rolleyes:

First proper video card I got was a Geforce 2 Titanium, I think 32MB vRAM. Then I jumped straight onto an X800XT-PE 256MB.. man, didn't that do wonders for my games. Buahaha.
 

seremify007

Junior Member
Joined
Apr 29, 2004
Messages
10,059
Location
Sydney, Australia
Gender
Male
HSC
2005
Uni Grad
2009
My first was an old 5MHz one... then came the 10MHz which had a Turbo button to boost it to 20MHz... then came a 66MHz Cyrix... then a Pentium 133... then a Pentium II 400.. then a Pentium III 900... then a Pentium 4 2.6.. then a Pentium 4 2.8.. and now a Centrino 1.7? The irony is that even with all these numerical speed increases, my computer takes longer than ever between powering on/the login screen and actually being able to USE the computer.
 

lcf

man. nature. technology.
Joined
Oct 24, 2005
Messages
656
Gender
Male
HSC
2005
Apple's new PowerMac G5 workstation line has just been rolled out with workstation graphics...

NVIDIA Quadro FX 4500
256-bit, 512MB RAM, 33.6GB/s throughput - two dual-link DVI ports + stereo 3D for CRT monitors.

Pretty cool - check it out here
http://www.apple.com/powermac/graphics.html
 
