PC gaming hardware thread.
Takes down the torture rack.
ah. INRxP
DavPaz wrote:
get verified man! It takes seconds if you have online baking access!


<Insert poor joke about cookies>
JohnCoffey wrote:
I should have stayed in fucking bed :'(


Go to the pub and have a beer.
Kern wrote:
JohnCoffey wrote:
I should have stayed in fucking bed :'(


Go to the pub and have a beer.


I would love to go and get absolutely fucking hammered right now. I can't afford it though but next week I am going to drink myself into a fucking stupor :DD
I have a spare not great graphics card I am looking to sell, an HD3450.
I can't pay for it. So I'm going to have to pray that my cousin shows up sooner rather than later. Ho hum.
Ok, well if anyone else wants it, I am thinking like £25. It's basically brand new, I only used it the once.
Fuck it, I think I am ordering this, they are not far from here, so I could pick up in person.

Good idea/bad idea?
LewieP wrote:
Fuck it, I think I am ordering this, they are not far from here, so I could pick up in person.

Good idea/bad idea?


That's the card I have - runs everything I chuck at it at 1920x1200, maxed out. Although admittedly I've never tried Crysis on it.

Currently playing Modern Warfare 2 and Bad Company 2 beta and they're lovely.

Keeps nice and quiet under load as well.
I think that might be out of the question now sadly, I have realised I will be needing a new power supply too :(
LewieP wrote:
I think that might be out of the question now sadly, I have realised I will be needing a new power supply too :(


http://cgi.ebay.co.uk/700-watt-Alienwar ... 27aca92de4

Ignore their stupid pic of an old AT PSU. It's the same as mine but with 50w less. Be warned, it's probably the same size though !
Atrocity Exhibition wrote:
LewieP wrote:
Fuck it, I think I am ordering this, they are not far from here, so I could pick up in person.

Good idea/bad idea?


That's the card I have - runs everything I chuck at it at 1920x1200, maxed out. Although admittedly I've never tried Crysis on it.

Currently playing Modern Warfare 2 and Bad Company 2 beta and they're lovely.

Keeps nice and quiet under load as well.


:this:

However. If you want cooler, calmer, quieter and far less power usage then get a 5770. Now admittedly on its lonesome the 5770 isn't as fast, but it does offer full DX11 support.
JohnCoffey wrote:
Atrocity Exhibition wrote:
LewieP wrote:
Fuck it, I think I am ordering this, they are not far from here, so I could pick up in person.

Good idea/bad idea?


That's the card I have - runs everything I chuck at it at 1920x1200, maxed out. Although admittedly I've never tried Crysis on it.

Currently playing Modern Warfare 2 and Bad Company 2 beta and they're lovely.

Keeps nice and quiet under load as well.


:this:

However. If you want cooler, calmer, quieter and far less power usage then get a 5770. Now admittedly on its lonesome the 5770 isn't as fast, but it does offer full DX11 support.


The 4890 is the quietest 'proper' graphics card I've ever had. I've had a play around in CCC with manual fan speeds, and by my estimation the fan never spins above 35% when left to its own devices, even during an intensive gaming session - in other words, I can't hear it above my case and CPU fans. At stock speeds the 4890 is about 15% faster than a 5770.

I can max the 4890 out in CCC at 1GHz core and 1.2GHz RAM and it's entirely happy, but the fan does get audible when gaming - at that level it's kicking out polys over 25% faster than a 5770, and whilst the 5770 overclocks nicely as well, it can only narrow the difference back down to around 15%. (With pretty nasty diminishing returns at high resolutions due to a lack of memory bandwidth.)

DX11 is neither here nor there in my book, we've scarcely got any games with noticeable DX10 enhancements, let alone DX11 (Custom PC took a close look at DIRT 2 recently and basically said, 'Ummm, yeah, it says it's running in DX11 mode, but it looks the same and runs about 30% slower than if you leave it in DX9 mode') - considering pretty much all games are derived from console code these days (the 360 essentially being a five year old DX9 PC), don't get too excited about DX11 enhancements in games any time soon.

(The games I've played with DX10 modes, I've ended up going back to DX9 mode for speed and stability, even a brand new game (still in beta) such as Bad Company 2, the official advice when it comes to performance and stability is, 'Edit the ini file to force DX9 mode, even on DX10 capable hardware.')

All that said, the 5770 is a bit cheaper than the 4890, and it's certainly a pretty capable card, so either would be a reasonable choice IMO.

Here's one of many similar quotations you can find on various tech sites: the 1GB 5770 is slower than a 1GB 4870, let alone a 4890, and it's also lacking memory bandwidth, which will hurt in the most demanding games and/or at high resolutions. (This comment refers to a 4870, which is itself a fair bit slower than a 4890.)

"The value of the 5770 in particular is clearly not going to be in its performance. Compared to AMD’s 4870, it loses well more than it wins, and if we throw out Far Cry 2, it’s around 10% slower overall. It also spends most of its time losing to NVIDIA’s GTX 260, which unfortunately the 4870 didn’t have so much trouble with. AMD clearly has put themselves in to a hole with memory bandwidth, and the 5770 doesn’t have enough of it to reach the performance it needs to be at.......

So here’s the bottom line for the 5770: Unless you absolutely need to take advantage of the lower power requirements of the 40nm process (e.g. you pay a ton for power) or you strongly believe that DirectX 11 will have a developer adoption rate faster than anything we’ve seen before for DirectX, the 1GB 4870 or GTX 260 is still the way to go"
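As an aside, the bandwidth gap the reviewer describes is easy to put numbers on. A rough sketch (the bus widths and effective memory clocks are assumed from the cards' reference specs, so treat the figures as approximate):

```python
def mem_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Theoretical peak memory bandwidth in GB/s:
    bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Assumed reference specs:
#   HD 5770: 128-bit bus, GDDR5 at 4800 MHz effective
#   HD 4870: 256-bit bus, GDDR5 at 3600 MHz effective
print(mem_bandwidth_gbs(128, 4800))  # 76.8
print(mem_bandwidth_gbs(256, 3600))  # 115.2
```

On those assumed figures the 5770 has roughly two-thirds of the 4870's bandwidth, which fits the review's point about it struggling at high resolutions.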
Funny how reviews differ really. I read a review of the 5770 overclocked and it beat the 260. Thing is, with the reviews as they are they don't mean very much really. What counts is what it does when you actually put the card in your actual PC. Like Zio I thought the 280 GTX was incredibly underwhelming. It's not surprising to me that reviews said it was a tip-top card, because when you look at their test rigs they're running top-of-the-pile parts for everything.

I mean, in my PC a 5770 is pretty much as good as a 280 GTX. In fact, for some reason some things seem to run better. I now strongly suspect there was something wrong with the 280 from day one, but I can't swear to it.

I agree with what they're saying about memory bandwidth and so on, but that isn't going to be a problem once I strap two together. I'll pretty much go from a very capable little card to something other-worldly.

Also, it's a little unfair really to compare the 5770 with the 260 GTX. The 260 GTX costs at least £30 more and right now ATI don't really have anything in the same price bracket. The 5770 should be compared to the 250 GTS (because it costs pretty much penny-for-penny the same) and the 5770 wipes the floor with the 250 GTS. Maybe they'll come up with a 5800 or something for around £150-£170 that will be on the same price marker as the 260? Not sure.

I also had nothing but trouble with Dirt 2 on the 280. It constantly fucking stuttered and so on. I also had the same issues with NFS: Shift and blamed that on my Core 2 Duo (in fact that's what caused me to replace it). Now whether the 280 was already starting to die or whether it really was as crap as I remember it to be, I don't know. What I do know is that the 5770 is nippy as fuck in Dirt 2 (even though it benches at 20fps less it never stutters and feels far faster) and is a very good card overall.

As for fans spinning up? The 5770 NEVER spins up the fan. No matter how hard you game, or for how long (and I'm talking 13-hour Fallout 3 sessions), I have never heard it go over what could be considered stock speed, even when I used CCC (which is fucking AWESOME BTW) to clock it faster than the XXX edition costing £30 more. It's also never risen above 52c even after 13 hours of solid battering. Fuck sake, the 280 *idled* 4c more than that and quickly flew to the 80s every time.

I'm fucking glad to be shot of it in honesty.

Last thing to say again. The 5770 is a nice little card. However, it doesn't truly shine until you strap two of them together. And when I say shine? I mean shine. It turns into a two-headed beast and shits on pretty much everything within £100 above it. 5850? Slaughtered. 5870? It takes its head too. 285 superclocked? Fuck off and sit behind me. 295? Well, it actually gives that a bloody nose too.

Check this review out. You'll see why I'm beside myself with excitement at getting this stuff in and running. It is pretty long winded but the figures speak for themselves.

http://www.guru3d.com/article/radeon-hd ... iew-test/1
The problem with Crossfire (and SLI) is that you're relying on drivers and/or game support on a case-by-case basis.

When it shines, it's great, when it works poorly or not at all, you're screwed - you're really at the mercy of the coders.

Crossfire (and SLI) also eliminates the 'less power and heat' argument, and probably the 'less noise' one as well.

My take on this has always been to buy a strong single GPU card rather than bolt two mid-range GPUs together with a flaky software solution.

For the money you'd spend on two 5770s, why not just get a single 5850?
Because the 5770s are faster than a single 5850 by quite a margin.

The support issue isn't really an issue any more. I mean if you look at the benchmarks I posted it pretty much sums up my entire game collection hehe.

As for heat? well I don't see how two 52c graphics cards sitting side by side can produce a lot more heat than one really. Noise? again, maybe 1db over one. And power? Even two 5770s at idle use far less than a 4890.

I'm actually looking forward to seeing what you decide to go with next as an entire PC package as it goes :)

Also. I think the flaky driver argument is pretty much at an end. I mean yeah, the 5970 has its teething problems, but don't all new cards and hardware? That was pretty much a given (and I assume you read the same review in Custom PC as I have :DD )

But when you look at it, both Nvidia's current top-end card (the 295) and ATI's (the 5970) are SLI/Crossfire. They both have dual GPUs, both have two sets of everything and both need full SLI/Xfire support to run. I can't see either company not supporting their flagships, which will have a knock-on effect for everything to do with SLI/Xfire, including support from game manufacturers.

I know you're slow to adopt new technology dude (just something I have noted over the decade I have known you) but sooner or later, just like XP and some other stuff, you're going to be doing a U-turn :D I do see the sense in your stance (same with the SSD argument) but the actual fact is that these things really are all that. Speaking of SSDs, I know you have your reservations but seriously, I can honestly put my hand on my heart and swear to god I could never EVER go back now. It would be like having broadband and going back to dial-up. Seriously, it really does make that much of a difference.
JohnCoffey wrote:
As for heat? well I don't see how two 52c graphics cards sitting side by side can produce a lot more heat than one really. Noise? again, maybe 1db over one. And power? Even two 5770s at idle use far less than a 4890.

Your knowledge of physics is flaky at best, though.
What would you say to someone who said that they "don't see how two graphics cards sitting side by side can produce much faster frame rates than one"? Because it's a similar argument. If it does more work (e.g. better frame rates) then of course it's going to chuck out more heat, unless there's a different architecture.
JohnCoffey wrote:
As for heat? well I don't see how two 52c graphics cards sitting side by side can produce a lot more heat than one really.


8)
So if two things are exactly the same temperature sitting side by side they both get warmer?
JohnCoffey wrote:
So if two things are exactly the same temperature sitting side by side they both get warmer?
A radiator gives off heat & warms a room. If you installed a second radiator in the same room would you not expect the room to warm faster than before?
If you have a room with one radiator in it, would it be the same temperature as an equivalent room with two radiators?

Edit: Hi5
:metul:
LewieP wrote:
If you have a room with one radiator in it, would it be the same temperature as an equivalent room with two radiators?
Depends whether or not they have thermostatic valves fitted :P
LewieP wrote:
If you have a room with one radiator in it, would it be the same temperature as an equivalent room with two radiators?

Edit: Hi5


No... But I wouldn't expect the temperature to double. If the two heat sources are in pretty much exactly the same place and are pretty much exactly the same, how much warmer would it become? A little bit? Well yes, but I wouldn't expect it to be that much warmer than running one.

Either way I will know for sure within the next week or two (hopefully) and if the heat does bother me in the summer months I can always remove one. It would take about two seconds to do. In fact I'm absolutely certain that I mentioned doing just that recently.
Wullie wrote:
JohnCoffey wrote:
So if two things are exactly the same temperature sitting side by side they both get warmer?
A radiator gives off heat & warms a room. If you installed a second radiator in the same room would you not expect the room to warm faster than before?


Have you measured the size difference between a radiator and a graphics card recently? Last time I checked my Radeon was 8 x 4 x 1 inches.

Radiators, on the other hand, are a good few feet across.
Physics doesn't care.
Craster wrote:
Physics doesn't care.


Cool. Then I'll do an Adam and reject your reality and substitute my own.
JohnCoffey wrote:
Either way I will know for sure within the next week or two
We know for sure right now.

Adding a graphics card consumes another 50W (say) of power. That power is emitted as heat. If you took all the graphics cards out and used a low-power integrated GPU, the CPU would still be 50+ degC. Would you therefore not expect the computer to run cooler and use less power?
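To put a hedged number on the room-heating side of this: a quick sketch of the worst case, assuming a 30 m³ room that is completely sealed with no heat loss at all (real rooms leak heat through walls and ventilation constantly, so actual rises will be much smaller):

```python
def room_temp_rise_per_hour(extra_watts, room_volume_m3=30.0):
    """Upper-bound rise in room air temperature (degC per hour) from
    extra dissipated power, ignoring all losses through walls/windows."""
    air_density = 1.2       # kg/m^3, roughly, at room temperature
    specific_heat = 1005.0  # J/(kg*K) for air
    air_mass_kg = room_volume_m3 * air_density
    joules_per_hour = extra_watts * 3600
    return joules_per_hour / (air_mass_kg * specific_heat)

# An extra ~85W card, under these no-loss assumptions:
print(round(room_temp_rise_per_hour(85), 1))  # 8.5
```

So the extra card genuinely does warm the room; in practice ventilation and wall losses soak up most of it, which is why the effect feels smaller than the raw numbers suggest.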
Doctor Glyndwr wrote:
JohnCoffey wrote:
Either way I will know for sure within the next week or two
We know for sure right now.

Adding a graphics card consumes another 50W (say) of power. That power is emitted as heat. If you took all the graphics cards out and used a low-power integrated GPU, the CPU would still be 50+ degC. Would you therefore not expect the computer to run cooler and use less power?


Well if memory serves me correctly I don't recall saying it wouldn't change. I said it wouldn't make that much difference and I said that the noise wouldn't be that much worse.

For example: if you have one 12" subwoofer with a certain sensitivity and a certain output in dB (say, 102dB 1W/1m) and you plop another one straight next to it, you can expect to gain about 4dB max. If you add another one after that you can expect to gain about 1dB.

So the noise won't be much worse.

For the GPU heat I am using the philosophy of multi-core CPUs. If the two cores are 50c apiece and are right next to one another, then a few cm away on the top of the heatsink the temperature would not be much more than if you had a single core a few cm away running at 50c.
JohnCoffey wrote:
Doctor Glyndwr wrote:
JohnCoffey wrote:
Either way I will know for sure within the next week or two
We know for sure right now.

Adding a graphics card consumes another 50W (say) of power. That power is emitted as heat. If you took all the graphics cards out and used a low-power integrated GPU, the CPU would still be 50+ degC. Would you therefore not expect the computer to run cooler and use less power?


Well if memory serves me correctly I don't recall saying it wouldn't change. I said it wouldn't make that much difference and I said that the noise wouldn't be that much worse.

For example: if you have one 12" subwoofer with a certain sensitivity and a certain output in dB (say, 102dB 1W/1m) and you plop another one straight next to it, you can expect to gain about 4dB max. If you add another one after that you can expect to gain about 1dB.

So the noise won't be much worse.

For the GPU heat I am using the philosophy of multi-core CPUs. If the two cores are 50c apiece and are right next to one another, then a few cm away on the top of the heatsink the temperature would not be much more than if you had a single core a few cm away running at 50c.


Decibels are not a linear system, they're logarithmic. Completely different from temperature.
If you double the power, you essentially double the heat output. How is that so hard to understand?
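The point about decibels being logarithmic can be sketched in a few lines. This models incoherent sources, which is the usual assumption for fan noise (co-located subwoofers driven by the same signal can sum higher), so the exact gains are illustrative only:

```python
import math

def combined_spl(levels_db):
    """Sum the sound pressure levels of incoherent sources:
    convert each level to relative power, add, convert back to dB."""
    total_power = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_power)

# Two identical fans: roughly +3 dB, not double the number
one_fan = 40.0
two_fans = combined_spl([40.0, 40.0])
print(round(two_fans - one_fan, 2))  # 3.01
```

This is why a second identical fan barely registers to the ear, while a second identical card's heat output genuinely doubles: heat adds linearly, perceived loudness doesn't.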
John, you simply don't understand the physics here, as clearly evidenced by you not being able to distinguish between temperature and heat energy. Those concepts are related, but they are not synonyms. Stop arguing.

When you add that graphics card, your PSU will supply up to 85W more power. That electrical energy will become heat energy within the case (because of electrical physics I was taught at the age of 14) and that extra heat energy will cause the temperature inside the case to rise just as surely as the sun is going to rise tomorrow.
Mr Dave wrote:
Decibels are not a linear system, they're logarithmic. Completely different from temperature.
If you double the power, you essentially double the heat output. How is that so hard to understand?


Not very. So can you tell me, then, how much warmer, say, a bedroom will become with two small fans blowing air across two 1x1" GPUs?

As I say, personally I feel it won't make that much of a difference in the room. If it does then I have that covered.

I do feel that it is definitely a case of actually doing it and seeing the results for real. I do recall saying that a water cooling system complete with radiator on the back of the case would take heat out of the case making it cooler. I also remember science coming to the rescue to tell me I was wrong and I also remember my case temps dropping by 4c ambient.

Doc. I do understand the physics here. I have accepted it would be warmer. I have stated that it wouldn't be that much warmer in my room due to the extra card.
JohnCoffey wrote:
As for heat? well I don't see how two 52c graphics cards sitting side by side can produce a lot more heat than one really. Noise? again, maybe 1db over one. And power? Even two 5770s at idle use far less than a 4890.


OK? I'm not arguing and I don't need loads of scientific technically stated facts thrown at me. My statement is perfectly reasonable.

If someone can show me actual information on how much hotter it would be using two ATI Radeon 5770s side by side, both running at 52c max load, then fine.

Sorry, let me make this clear. I am accepting that yes, it will be more heat than one GPU. I also accept it will possibly make more noise. However, what I am trying to say is that I can't see the two cards making that much of a difference to the temperature of the room the PC sits in. The noise most certainly will not be an issue, because the 92mm fan I sit next to (it's in the side of the case) drowns out any other noise from the PC; it makes the most noise and is closest to my listening position.

So from another angle: adding another card running at 52c does not mean that I will have 104c coming out the back of my computer, yes? That's all I have been trying to point out.

Overall wattage and power usage are absolutely going to be higher on two cards. However, two of these buggers use 2/3 of the power the 280 used alone.
Go use that PSU power consumption tool you've previously linked to. If your computer currently uses 500W, and you add an 85W graphics card, it's going to generate 17% more heat. Temperature has nothing to do with it.
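The arithmetic behind that 17% figure, for what it's worth (the 500W and 85W numbers are the illustrative ones from the post, not measured values):

```python
def extra_heat_fraction(base_watts, added_watts):
    """Fraction of additional heat output when extra power draw is
    added; essentially all power a PC draws ends up as heat."""
    return added_watts / base_watts

# 85W on top of a 500W system:
print(f"{extra_heat_fraction(500, 85):.0%}")  # 17%
```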
Doctor Glyndwr wrote:
Go use that PSU power consumption tool you've previously linked to. If your computer currently uses 500W, and you add an 85W graphics card, it's going to generate 17% more heat. Temperature has nothing to do with it.


Right!

Now when I removed the 280 and put in the 5770 I noticed a massive ambient temperature drop in the room. In fact, I went from sitting here sweating with the heating off 24/7 to feeling quite chilly. The 280's 'body' was almost twice the size of the 5770's and it ran far hotter. The entire back end of the 280 was metal and got so hot you could not touch it until the card had been powered down for at least 20 minutes. The 5770, however? Well, the body of the card never gets any more than warm.

As I say, I accept the fact that a second card will add more heat. Of course it will! It has two physical sources of heat. I am not arguing with anyone here, nor do I want to turn this into a debate. I just truly believe that adding a second GPU alongside the first one won't change the room temp that much. Case temps/CPU temps etc.? I don't give a crap about those, as that is fully taken care of (fan upgrades, water coolers and so on). It's the temps in my room that matter.

If I am completely wrong and I feel like I am sitting in a sweat box? fine, I'll take one out. :)
The heat that the GTX 280 emits is the only reason I'm thinking of flogging mine on eBay and getting a Radeon. Well, that or buying a case with better cooling capability. It gets so warm inside the case at the moment that it's limiting what I can do with the computer.

For instance, I should be able to overclock the CPU far, far more than I am at present, but doing so makes the temps dangerously high and I'm sure it's heat from the GPU causing this (the CPU sits lower down and thus a fair bit closer to the top PCI-E socket on P55 motherboards). Also it's preventing me from being able to use my old 8800GT as a PhysX accelerator, because the heat the pair of them generate together is unbelievable.

The GTX 280 runs as hot at idle as the 8800GT did when being pushed to the limit and, as John says, becomes too hot to touch after you've been gaming with it for a couple of hours. I don't care if it's meant to be a hot card; I'm bloody certain running electronics at 70-odd degrees C for prolonged periods can't be good for it.
Well those were the things I thought were wrong with my card. So I guess there wasn't anything wrong with it after all and it really was just utterly ridiculous.

I wish I could have seen mine max at 70! 84 on a good day for me. I think mine died the day the screen went to garbled red when playing Dirt 2. You're right, though: how the fuck can you expect something to last if it runs so stupidly hot? Eventually the solder screws up. This is why people reflow them in the bloody oven!

I hope the same can't be said for the 8800GTX. I'll never hear the fucking end of it if I fit my cousin's one and it packs in.
JohnCoffey wrote:
Wullie wrote:
JohnCoffey wrote:
So if two things are exactly the same temperature sitting side by side they both get warmer?
A radiator gives off heat & warms a room. If you installed a second radiator in the same room would you not expect the room to warm faster than before?
Have you measured the size difference between a radiator and a graphics card recently? Last time I checked my Radeon was 8 x 4 x 1 inches.

Radiators, on the other hand, are a good few feet across.
& the last time I checked a metaphor was still a way that you could use something completely different to explain the original item in a more obvious way (in this case they both give off heat).

Stop & think before you start tapping those keys. When someone makes an incorrect statement (or we think they have) we usually try to correct them & share a bit of knowledge. It's not out of badness; we'd be dicks if we let you (or anyone else) walk around spouting rubbish. If you don't get what we're on about then don't be afraid to ask for clarification or a better explanation (PM if you don't want to do it in public). It's a lot easier than going off on a tangent about the differing dimensions of a graphics card & a metaphorical radiator.

Also, please to be bearing in mind that when you're talking about component temperatures most people will tend to think inside the box :DD when you were meaning the room the box sits in.
Wullie wrote:
& the last time I checked a metaphor was still a way that you could use something completely different to explain the original item in a more obvious way (in this case they both give off heat).


A fart gives off heat dude. It doesn't mean it causes global warming.

Wullie wrote:
Stop & think before you start tapping those keys. Usually when someone makes an incorrect statement (or we think they have) we usually try to correct them & share a bit of knowledge. It's not out of badness, we'd be dicks if we let you (or anyone else) walk around spouting rubbish. If you don't get what we're on about then don't be afraid to ask for clarification or a better explanation (PM if you don't want to do it in public). It's a lot easier than going off on a tangent about the differing dimensions of a graphics card & a metaphorical radiator.


But I didn't make an incorrect statement. And all of a sudden I have a fucking lynch party after me. I know full well what I'm on about, thanks; it was the three or four people that misunderstood what I said and then, like, totally ran away with it that didn't get the context. In fact, I had to repeat it about three times and even put it in bold before it seemed to dawn on people that I hadn't said it will make *no* difference; what I did say was that I don't think it will make *that much* of a difference. Immediately people think I am saying that adding another GPU will make no difference at all.

Wullie wrote:
Also, please to be bearing in mind that when you're talking about component temperatures most people will tend to think inside the box :DD when you were meaning the room the box sits in.


Well I guess the reason so many people didn't get what I had said was because they haven't experienced a graphics card that runs so ridiculously hot that it turns your PC into a storage heater and your room into a sauna. But fair play.
JohnCoffey wrote:
A fart gives off heat dude. It doesn't mean it causes global warming.


PEDANT WARNING.

Yes they do. Animal methane is a pretty big contributor to the greenhouse effect.
DavPaz wrote:
JohnCoffey wrote:
A fart gives off heat dude. It doesn't mean it causes global warming.


PEDANT WARNING.

Yes they do. Animal methane is a pretty big contributor to the greenhouse effect.

It's the biggest "man-made" contributor, I think.
Bugger driving five miles less a day (I'm still not sure how that's meant to work), eating less meat would be far better for the environment.
Grim... wrote:
DavPaz wrote:
JohnCoffey wrote:
A fart gives off heat dude. It doesn't mean it causes global warming.


PEDANT WARNING.

Yes they do. Animal methane is a pretty big contributor to the greenhouse effect.

Needing to grow animals is the biggest "man-made" contributor, I think.
Bugger driving five miles less a day (I'm still not sure how that's meant to work), eating less meat would be far better for the environment.
Grim... wrote:
Bugger driving five miles less a day (I'm still not sure how that's meant to work), eating less meat would be far better for the environment.


I'll drive 5 miles less a day, you eat less meat, and we'll compare notes in 20 years.
If Grim... is eating less meat, I'm going to eat more. Earth is fine.
Malabar Front wrote:
If Grim... is eating less meat, I'm going to eat more. Earth is fine.

You eat more meat, and I'll buy the 4x4 I'm looking at. I'll get the wife to turn the central heating on less, and we'll all balance out.
Craster wrote:
Grim... wrote:
Bugger driving five miles less a day (I'm still not sure how that's meant to work), eating less meat would be far better for the environment.


I'll drive 5 miles less a day, you eat less meat, and we'll compare notes in 20 years.

We'll clearly all have drowned by then.
Grim... wrote:
We'll clearly all have drowned by then.


Oh yeah, 20 years of Zio's seed.
Oh the humanity. This again?

JC: GET A SCIENCE :DD
Malabar Front wrote:
Grim... wrote:
We'll clearly all have drowned by then.


Oh yeah, 20 years of Zio's seed.


:o My poor, poor finances!
DavPaz wrote:
JohnCoffey wrote:
A fart gives off heat dude. It doesn't mean it causes global warming.


PEDANT WARNING.

Yes they do. Animal methane is a pretty big contributor to the greenhouse effect.


I was aware of that yes. NSFW.

http://www.youtube.com/watch?v=lStqYicP2Mk
Edited: Changed it to a link - NSFW vids should be links, just like pictures, thanks.
I'm starting to think about a PC upgrade now that I've got a bit of disposable cash. I don't game on the PC but would like something faster than what I've got now, though only if it's 1) a decent improvement and 2) not a lot of cash.

I currently have a socket 939 AMD X2 3800, a Radeon X1800GTO, 4 gig of RAM (I think it's PC3200, DDR, 4x1gig sticks) and an Asus A8N-E motherboard.

For about 200 quid, maybe a bit more, what could I get that would be a decent step up?

TBH I'd be quite tempted to keep the graphics card as it works OK for what I want (although I might be tempted to get a 30-40 quid passive cooled cheapy to cut down on some noise). I gather I am unlikely to be able to use my RAM in newer mobos though.

Like I say, not really interested in bleeding edge gaming; not bothered about mad1337hax overclocking either.

Oh, PSU... I have a 500/600W Thermaltake something or other; I forget the model.