Conroe breaks 5GHz mark..

Serius
Beyond Godlike
Joined: Nov 10, 2004 | Messages: 3,123 | Location: Wollongong | Gender: Male | HSC: 2005
I get scared when it comes to overclocking... I worry I might totally fuck something up, so all I've done before are small experimental clocks that seemed somewhat stable. I guess it's something you need to be shown... so for now I just buy solid parts.

Opterons, although stable, need a bit of tweaking to get to about 2.7GHz stable on air. All that benchmark shit people do isn't really stable, and they aren't using a permanent cooling solution.

e.g. OMGZ 3.5GHz! [with dice (dry ice)]. I dunno how you would get it to 4GHz and be able to use it to play games or do anything useful, other than bragging, for more than a few hours.
 

Collin
Active Member
Joined: Jun 17, 2003 | Messages: 5,084 | Gender: Undisclosed | HSC: N/A
Opteron 165s are freaking beautiful. I think they're around $500 now, and for its overclockability it's a bloody bargain.

As for extreme OCs, if it's stable and sustainable without too much effort, it certainly can be beneficial for gaming if you've got an extreme system. High-end SLi/Crossfire setups often end up bottlenecked by the CPU. I can only imagine the relevance of better CPUs increasing as dual-card setups become more efficient and powerful, and with the major graphics companies even experimenting with higher-than-dual-card setups (although nVidia's first attempt at quad-SLi was a bomb). But then again, we're shifting more work from the CPU towards the GPU, something which will only continue with DirectX 10. In fact, ATi has also proposed solutions on their graphics platforms to handle physics too, which shouldn't be much of a problem anyway given the mainstream rollout of dual-core processors (and eventually quad-core for consumer desktops come '08). Kinda makes me wonder whether all this hype about Ageia's projects is worth it.

Slight change in topic, anyone hear about Forbes' assertion that AMD is considering a business merger with ATi? Fuck fuck FUCK!
 

insert-username
Wandering the Lacuna
Joined: Jun 6, 2005 | Messages: 1,226 | Location: NSW | Gender: Male | HSC: 2006
Yeah, that PS3 thing is a bit overdone. PCs can still do plenty that consoles can't and probably won't be able to do for a while.

With shifting work from CPU to GPU, I read the other day that Mac OS X 10.4 (Tiger) actually creates its desktop and screen as a complete OpenGL texture, and there's an option to shift all screen generation calculations entirely to the video card and its memory (though it makes all sorts of crazy artifacts appear). Rumour's out that the next version of Mac OS X will activate the option by default, with the CPU sending drawing instructions and the GPU doing all the hard work. Vista's taking a similar line. It's good to see the OS makers actually taking advantage of the fastest piece of hardware they have available to them.
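
For the curious, here's a minimal sketch of the idea in C, assuming the fixed-function OpenGL API plus freeglut (the 64x64 buffer, colours and names are made up for illustration, not anything Apple or Microsoft actually ship): the CPU draws a "window" into a pixel buffer once, uploads it as a texture, and from then on every frame is just the GPU compositing that texture, which is roughly what keeping the desktop as an OpenGL texture amounts to.

/*
 * Minimal GPU-compositing sketch (assumes fixed-function OpenGL + freeglut).
 * The CPU fills a pixel buffer once and uploads it as a texture; after that,
 * each frame is just the GPU drawing a textured quad.
 */
#include <GL/glut.h>

static GLuint win_tex;

static void upload_window_contents(void)
{
    static unsigned char pixels[64 * 64 * 3];
    for (int i = 0; i < 64 * 64; i++) {
        pixels[i * 3 + 0] = 200;   /* dummy pale-blue "window contents" */
        pixels[i * 3 + 1] = 220;
        pixels[i * 3 + 2] = 255;
    }
    glGenTextures(1, &win_tex);
    glBindTexture(GL_TEXTURE_2D, win_tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 64, 64, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}

static void display(void)
{
    /* Per-frame work: the GPU composites the stored texture onto the screen. */
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, win_tex);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-0.5f, -0.5f);
    glTexCoord2f(1, 0); glVertex2f( 0.5f, -0.5f);
    glTexCoord2f(1, 1); glVertex2f( 0.5f,  0.5f);
    glTexCoord2f(0, 1); glVertex2f(-0.5f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(256, 256);
    glutCreateWindow("GPU compositing sketch");
    upload_window_contents();   /* CPU -> GPU upload happens once */
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}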

As for AMD/ATi, I saw that rumour. I doubt it'll eventuate though - I think AMD's got its work cut out for it competing with Intel, and branching out now may be a little dangerous.


I_F
 

Collin
Active Member
Joined: Jun 17, 2003 | Messages: 5,084 | Gender: Undisclosed | HSC: N/A
Sony's full of hype, but comments like this are just plain preposterous.

And I agree, now would be a bloody terrible time for AMD to progress further on any plans to absorb ATi. With little time to dwell on its disappointing (at least in performance terms) AM2 launch before Conroe crashes in (even though AMD explicitly wants consumers not to compare AM2 to Conroe, the mere mention of the new Intel chip still raises an uneasy aura of concern among AMD insiders), and with K8L not scheduled for release till early '08... I don't know what AMD's gonna do.

In fact I bloody well hope that there's no acquisition of ATi... ever. God knows what an AMD/ATi merger would do to the AMD/nVidia and Intel/ATi affiliations. I just hope those microprocessor corporations stay away from the graphics companies.
 

Templar
P vs NP
Joined: Aug 11, 2004 | Messages: 1,979 | Gender: Male | HSC: 2004
Perhaps AMD wants a share of the graphics market... Intel is the biggest supplier of graphics due to its integrated solutions. But that would be akin to General Motors, or perhaps the US military... trying to do too much with too little. AMD should focus on getting K8L on the table before expanding.

Consoles will always be hyped up as better. If you don't need a computer, sure, a console sold at a loss is a great piece of gaming equipment, but who doesn't need a computer in this age?
 

Collin
Active Member
Joined: Jun 17, 2003 | Messages: 5,084 | Gender: Undisclosed | HSC: N/A


[benchmark image] Benched on the new ATi RD600 chipset using a 1.86GHz Conroe OCed to 2.4GHz with an X1900.

[benchmark image] Same deal but with an Intel 975X.

Besides the nice benchies, notice that the results are odd - since this is an ATi document, you'd think they would want to show their boards beating out Intel's...? LOL.
 

Collin
Active Member
Joined: Jun 17, 2003 | Messages: 5,084 | Gender: Undisclosed | HSC: N/A
We've barely seen any benchies yet. Soon, some rich bastard out there is gonna get a 3GHz EE Conroe, OC it to 4GHz+ and benchmark it with a 7900/X1900 SLi/Crossfire setup. Maybe even quad-SLi (although generally most enthusiasts aren't stupid enough to invest in that yet).

And then... whoops, time to consider a new 3DMark again, guys... :eek:
 

Minai
Alumni
Joined: Jul 7, 2002 | Messages: 7,458 | Location: Sydney | Gender: Male | HSC: 2002 | Uni Grad: 2006
Tom's Hardware has a new benchmark piece with the 2.6GHz Conroe vs the FX-62.

Conroe beats the FX-62 in most benchmarks, and even beats an overclocked 3.0GHz FX-62 at times.
 

Templar
P vs NP
Joined: Aug 11, 2004 | Messages: 1,979 | Gender: Male | HSC: 2004
The K8 architecture has almost exhausted its options for performance increases at 90nm. Without moving to 65nm, its clock speed cannot be increased much further, plateauing at around 3GHz. This is compounded by a Prescott-equalling 125W TDP. It's all well and good to use 125W when you're the clear leader, but when the competition brings comparable if not better performance at half the wattage, it really calls into question the effectiveness of K8 against the Core architecture.
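
To put rough numbers on that (a sketch under assumptions: take the 125W TDP figure above for the top 90nm K8 parts and, per the "half the wattage" point, something like a 65W Conroe at comparable benchmark scores):

perf per watt (Core) / perf per watt (K8) ≈ (score / 65 W) / (score / 125 W) = 125 / 65 ≈ 1.9

So even a dead heat in raw performance would leave the Core part with close to twice the performance per watt, before any clockspeed headroom is considered.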

I'm not exactly sure what causes the difference in the gaming benchmarks, where one processor seems to be better at low resolutions while losing at higher ones.
 
