2 Attachment(s)
So, yeah. I totally understand why fancy graphics cards are desirable.
https://www.miataturbo.net/attachmen...ine=1342589292 It's really helpful when you can actually see who you are shooting at, particularly in classes like Pyro and Heavy, where you tend to engage the enemy at extremely close range in which both of you are moving around rapidly to try and avoid one another while simultaneously trying to keep your opponent centered in the crosshairs. At 10 FPS, it's impossible. At 60+ FPS, you can finally start hitting the top of the leaderboard. |
Originally Posted by Joe Perez
(Post 904575)
So, yeah. I totally understand why fancy graphics cards are desirable.
https://www.miataturbo.net/attachmen...ine=1342589292 It's really helpful when you can actually see who you are shooting at, particularly in classes like Pyro and Heavy, where you tend to engage the enemy at extremely close range in which both of you are moving around rapidly to try and avoid one another while simultaneously trying to keep your opponent centered in the crosshairs. At 10 FPS, it's impossible. At 60+ FPS, you can finally start hitting the top of the leaderboard. |
Yay. Joe gets "it".
The last pair of cards I tried were HD6950s. There was no real improvement: while the framerate was higher than with a single card, any time it dropped below 60 there would be microstuttering, defeating the purpose of the higher framerate. The other parts in my system were all fully capable of supporting them. Just those fracking drivers. My GTX680 will slaughter your children with a butterknife. |
Originally Posted by Bryce
(Post 904578)
My GTX680 will slaughter your children with a butterknife.
And yeah- it's kind of interesting to be at least within visual range of the cutting edge again. Haven't felt this way since the original 3dfx Voodoo back in... whenever the hell it was. (Mid 90s, I think.) |
Originally Posted by Joe Perez
(Post 904594)
I drink your milkshake.
And yeah- it's kind of interesting to be at least within visual range of the cutting edge again. Haven't felt this way since the original 3dfx Voodoo back in... whenever the hell it was. (Mid 90s, I think.) |
Awww yeah. I remember playing Unreal Tournament on my big brother's Voodoo 3.
Originally Posted by Joe Perez
(Post 904594)
I drink your milkshake.
|
First video card I ever bought was a 3dfx Voodoo 3 2000 - ahh, the good ol' days of spending my Best Buy paycheck at Best Buy.
|
this thread makes me so sad.
|
ima simple person so i think i'm going to take bryce's advice and do single, at least at first.
this looks cheap:
CPU: i3-2125... $160
mobo: ASRock Z77 Extreme4 ... $124
joe thank u sir for the headsup on x8 vs x16. sounds like 2 x16 slots is a sandy bridge or one generation previous to that kind of deal. but looking at the newegg page it is not obvious to me at all that when they say 2 x16 slots that they are actually both electrically x16.
Newegg.com - ASRock Z77 Extreme4 LGA 1155 Intel Z77 HDMI SATA 6Gb/s USB 3.0 ATX Intel Motherboard
brain, your angel told me to tell you with that kind of attitude you can forget about fan service when she delivers your card :D |
Those who speak of what they know
Find, too late, that prudent silence is wise
Joseph Perez, hold your tongue
He will burn you with the heat of his eyes |
Originally Posted by jasonb
(Post 904797)
ima simple person so i think i'm going to take bryce's advice and do single, at least at first.
this looks cheap: CPU: i3-2125... $160 mobo: ASRock Z77 Extreme4 ... $124 joe thank u sir for the headsup on x8 vs x16. sounds like 2 x16 slots is a sandy bridge or one generation previous to that kind of deal. but looking at the newegg page it is not obvious to me at all that when they say 2 x16 slots that they are actually both electrically x16. Newegg.com - ASRock Z77 Extreme4 LGA 1155 Intel Z77 HDMI SATA 6Gb/s USB 3.0 ATX Intel Motherboard

Looking at the link you posted, here's what they say about the PCIe x16 ports: PCI Express 3.0 x16: 2 (x16/0 or x8/x8). Physically, this board has two x16 ports; however, it cannot run them both at full bandwidth. If only one card is installed (in the first or upper slot) then that card will run at x16 and the second port will be disabled (x16/0). If two cards are installed, then each will run at x8 (x8/x8). |
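As a rough back-of-the-envelope illustration of what x16 vs. x8 means in raw bandwidth, here is a sketch based on the published PCIe 3.0 numbers (8 GT/s per lane, 128b/130b line encoding); real-world throughput is lower due to protocol overhead:

```python
# Approximate PCIe 3.0 per-direction bandwidth from lane count.
# 8 GT/s per lane, 128b/130b line encoding, 8 bits per byte.
def pcie3_bandwidth_gb_s(lanes: int) -> float:
    gt_per_s = 8.0        # gigatransfers per second, per lane
    encoding = 128 / 130  # 128b/130b encoding efficiency
    return lanes * gt_per_s * encoding / 8

print(f"x16: {pcie3_bandwidth_gb_s(16):.2f} GB/s")  # ~15.75 GB/s
print(f"x8:  {pcie3_bandwidth_gb_s(8):.2f} GB/s")   # ~7.88 GB/s
```

The takeaway for the x8/x8 case is that each card still gets roughly 7.9 GB/s each way, which games of this era came nowhere near saturating.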
thread hijack!
Thoughts on a card.... Newegg.com - MSI R7850 Twin Frozr 2GD5/OC Radeon HD 7850 2GB 256-bit GDDR5 PCI Express 3.0 x16 HDCP Ready CrossFireX Support Video Card Going into an Asrock Z68/i5 ivy bridge/16gb ram/ssd'd build. Trying to stay around 200, this is 209 after rebate. I like the lower power usage and quiet reviews, along with being a decent performer. |
How is this threadjacking? This is a video card thread.
|
In that case... how does 100 for an EVGA GTX550 TI sound? Does DX11 and seems to be good bang for buck.
NEW NVIDIA GEFORCE GTX550 TI VIDEO CARD |
|
get the 560, the core count alone is worth it compared to the 550. My wife runs dual 550s in her pc, and I am running a single superclocked 560. Her SLI is just a tiny bit faster than my one card. I used to have dual 560s, but I fried one and returned it for warranty... ended up using the warranty credit for a new FX processor.
I'll still sell my GTX280OC for 90 bucks shipped... runs faster than its successor, the 285, and runs just under the 290 http://www.geforce.com/Active/en_US/...red/lineup.png |
Good grief. My 8800 GTS 512 is barely on that chart at all.
|
GTX 470 with Zalman VF3000 cooler for 150?
NVIDIA GTX 470 WITH ZALMAN VF3000 COOLER http://images.craigslist.org/5L15Kc5...31f0bb10c9.jpg |
Originally Posted by Splitime
(Post 905806)
GTX 470 with Zalman VF3000 cooler for 150?
|
Guy is accepting my 200 offer for the GTX560 TI 2gb card... so I think that makes sense.
Then I need a few games to play... been away a few years now. |
So assuming I get this card this weekend... what games should I be trying.
I used to mostly play good ole Counter Strike and Day of Defeat. |
|
Srsly!
And that was on the short list of new ones ;) I even remembered my old Steam account login! Friend mee1!!!!4!!!!!1111 (splitime... i know... shocker.) |
Or do I just get a 1gb model and save 30 bucks.
Brand new GTX 560 Ti 1GB - 3 Available Or buy both and go dual... which from this thread... probably not wise due to drivers never being quite right? |
Originally Posted by Joe Perez
(Post 905875)
Also we have a free game thread... so take a look at that and join up for APB: Reloaded and BF P4F |
Originally Posted by Joe Perez
(Post 904815)
I think I've figured out the informal jargon that the industry has developed to describe this.
Looking at the link you posted, here's what they say about the PCIe x16 ports: PCI Express 3.0 x16: 2 (x16/0 or x8/x8) Physically, this board has two x16 ports, however it cannot run them both at full bandwidth. If only one card is installed (in the first or upper slot) then that card will run at x16 and the second port will be disabled (x16/0). If two cards are installed, then each will run at x8 (x8/x8).

youdathunk i coulda figured that out since i worked at a commodity hardware vendor before. thank u sir. |
Originally Posted by Pen2_the_penguin
(Post 905510)
Now, if only I could buy a card to boost my free time by ANY amount... |
HD4830 chiming in. oh yeah, i just overclocked that baby from 575MHz to 675MHz. now what?
|
Originally Posted by Braineack
(Post 906414)
HD4830 chiming in. oh yeah, i just overclocked that baby from 575MHz to 675MHz. now what?
:party: |
hahah. I like the spy, but it gets boring. card worked well overclocked to 700MHz, too bad I don't have a baller card like the rest of you or I wouldn't be in second place for kills, just under Joe.
|
I'll try to join with my on-CPU Intel HD 2500 graphics....
|
It'll probably be fine to be honest. I used to play CS on my laptop if I was traveling and still get #1 on the server.
|
58 minutes until it's able to run... bah.
I wish one of these people would get back to me about video cards... so I can get up to speed and PWN you all ;) |
Originally Posted by Braineack
(Post 906441)
hahah. I like the spy, but it gets boring.
That's why I really like Pyro and Heavy- they're very simple classes to play. Just point and click. Pyro, I guess, requires a bit more coordination, as I find myself using the flare gun more and more, but it's still not all that complex. I'm also a big fan of Engie, especially on the CTF maps, and to a lesser degree, when playing Red on the PL maps. It's a very strategic class, but not a taxing one from a controls point of view. If done properly, your sentry should be doing all the work while you use the pistol only for spy-checking.
Originally Posted by Braineack
(Post 906441)
too bad I don't have a baller card like the rest of you or I wouldn't be in second place for kills, just under Joe.
Originally Posted by Splitime
(Post 906455)
58 minutes until its able to run... bah.
I wish one of these people would get back to me about video cards... so I can get up to speed and PWN you all ;) |
So, I have an old Dell Dimension 8400 and I would like to upgrade the video card. I want an HD-capable card for it. No gaming, just movies. Any suggestions?
|
Originally Posted by Joe Perez
(Post 906459)
Pyro, I guess, requires a bit more coordination, as I find myself using the flare gun more and more, but it's still not all that complex.
did you see how many flare gun kills I had? at least 10 of those 50 kills. That's my main selected weapon, then I hit Q to switch to the fire-breathing dragon of death. I got a rake kill in there, that was fun. |
Originally Posted by Joe Perez
(Post 906459)
In all seriousness, is there a reason you're focusing only on used cards advertised on Craigslist, which is probably the flakiest bunch of people on earth? The prices I was seeing in those links didn't seem significantly lower than what I'd expect to pay for a mid-range card in a store.

Went to Fry's and bought an EVGA GTX560TI DS, 219 after rebate. Emailed the one person I was waiting on responses from... he sold his used one for 220 and hadn't bothered to tell me. So I came home with a G330 headset and video card. Updating/configuring now... in soon :)
Originally Posted by Braineack
(Post 906481)
did you see how many flare gun kills I had?
Originally Posted by Splitime
(Post 906652)
Joe is all wise and knowing.
Went to Fry's and bought an EVGA GTX560TI DS, 219 after rebate. Emailed the one person I was waiting on responses from... he sold his used one for 220 and hadn't bothered to tell me. So I came home with a G330 headset and video card. Updating/configuring now... in soon :)

Good to see you in the game tonight. And from what I understand the 560TI is a pretty stellar unit. |
Originally Posted by Joe Perez
(Post 906688)
Well, you were flaming. :giggle:
They have a Fry's in Chicago?! Good to see you in the game tonight. And from what I understand the 560TI is a pretty stellar unit.

So far so good. I'll rock it at 100% stock for a while, thinking that I won't be running enough games to stress it and probably won't have to touch it. That said... holy crap I'm rusty at FPS games... |
Originally Posted by Splitime
(Post 906706)
That said... holy crap I'm rusty at FPS games...
|
Update:
This GTX 280 is proving to be less than reliable. I honestly don't know if it's the card itself or just the driver package. From time to time while playing TF2, it just locks up and goes to a black screen. Usually the game continues playing in the background (I can hear the audio) but the only way to recover is to Windows-key back to the desktop (which restores the display) and then force the application to close. Usually, this is accompanied by a little error message popping up which says "a problem was detected with the driver, which was recovered successfully." No, it wasn't recovered successfully- you forced me to crash out of a game right when I was about to make a capture.

I've installed a couple of different driver updates from nVidia, which have not helped. I also re-formatted the machine, re-installed the OS, etc. Same deal.

Thinking very seriously about going out and buying one of those ATI 6580 cards. I'd love to think that a GTX 560 would prove more reliable, but I'm not really willing to gamble on something that's going to be using the same driver package that I have right now. At least with the ATI I'd be eliminating (or at least changing) every possible culprit. http://www.newegg.com/Product/Produc...scrollFullInfo

Dammit, I hate spending money... |
Heat issues?
Also, running Solidworks, my rig would sometimes lock up for no apparent reason, and there was nothing I could do but wait. It was only for about 10-12 seconds, but it was very frustrating. Some sort of conflict with the drivers. I think the culprit was a printer I had hooked up (HP CP1518ni) which was using my computer as some sort of print server, but I never did get it to completely stop. Once my new computer came in, I haven't had any problems. |
I'm running an Nvidia GTX 460 and have been for a while. The girlfriend is running an ATI/AMD 6850 (iirc). Our hardware is a year behind, but here is what I've seen. Hers has been rock solid but in Steam/Source games (TF2/HL2/L4D) she seems to have issues rendering shadow shading stuff unless you are willing to sit down and tweak the graphics settings. This is pretty evident in Skyrim, but she is also running 10,000,000 add-on packs (as apparently this is normal for people who play Bethesda games) so who knows what is causing the issue. Mine renders everything just fine, but I used to get the occasional lockup as you describe.
Sidenote: The noise needs to be a non-issue, and Newegg reviewers are a bunch of bitches. My GTX460 runs something like 150 watts at full tilt. That's A LOT of heat and you need A LOT of air to dissipate it from such a tiny heatsink. Those bastards need to go grab a 100w incandescent light bulb that's been running for more than 5 minutes, then they will understand.

Your issue is most likely heat, not drivers. Heat is what causes my lockup issues with my 460. The built-in fan speed/temperature response curve from Nvidia is crap. It lets the card get too hot because it runs the fan too slow, so they don't get noise complaints. I started using a piece of software called EVGA Precision (EVGA | Software | EVGA Precision) to manually set the fan speed/temperature response curve and it's nearly eliminated the issue. I still get the occasional lockup, but it's after I flog the system for hours at a time and the ambient temp in my computer room climbs up into the 80s.

The software is shipped with EVGA cards, but it works fine with everything I've ever tried it on that had an Nvidia chipset on it. Before you spend more money, give it a shot. It's a free download and you have nothing to lose.

Bonus points: The software will also let you overclock/underclock the various settings on the card, but that's another level of insanity that even I'm not interested in. |
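The shape EO2K describes can be sketched as a simple linear fan speed/temperature curve; all the breakpoint numbers here are made-up illustrations, not values from Precision:

```python
def fan_duty(temp_c: float,
             idle_temp: float = 40.0, idle_duty: float = 30.0,
             max_temp: float = 65.0, max_duty: float = 100.0) -> float:
    """Linear fan speed/temperature curve: hold idle_duty below idle_temp,
    ramp linearly, and pin the fan at max_duty at max_temp and above."""
    if temp_c <= idle_temp:
        return idle_duty
    if temp_c >= max_temp:
        return max_duty
    frac = (temp_c - idle_temp) / (max_temp - idle_temp)
    return idle_duty + frac * (max_duty - idle_duty)

print(fan_duty(52.5))  # 65.0 -- halfway up the ramp
```

Precision lets you draw this curve graphically; the point is simply that fan duty should hit 100% well before the GPU reaches its throttle/crash temperature, rather than where the OEM curve puts it.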
I know you might want to stay away from nVidia products right now, but I recently rebuilt my rig with an EVGA GTX 670 FTW (Newegg.com - EVGA 02G-P4-2678-KR GeForce GTX 670 FTW 2GB 256-bit GDDR5 PCI Express 3.0 x16 HDCP Ready SLI Support Video Card) and it has been incredible.
How long ago did you get the card? Can you RMA it? |
Originally Posted by rleete
(Post 947438)
Heat issues?
The fan on the video card appears to be working fine, and when I had the machine apart last month to install a new SSD, I used the air compressor to thoroughly blow the dust out of everything, which included the video card. I wasn't able to figure out how to separate the plastic cover on it, though I did blow air at 120 PSI through it (from the rear outlet, and into the fan intake in all directions.) The case itself is exceedingly well ventilated. Two outwards-blowing rear fans (one case, one PSU) and the whole front is pretty much wide open. The CPU itself runs cool, though I will admit that I haven't quite figured out how to monitor the temperature of the GPU in real-time while in a game. (I'm sure there's some overlay thingy I can activate- I will try that this evening.) |
The above Precision software will display temp, and I believe it has a logging feature as well.
Just Sayin' |
Originally Posted by EO2K
(Post 947444)
Your issue most likely heat, not drivers. (...) I started using a piece of software called EVGA Precision EVGA | Software | EVGA Precision to manually set the fan speed/temperature response curve and its nearly eliminated the issue.
Originally Posted by palmtree
(Post 947445)
I know you might want to stay away from nVidia products right now, but I recently rebuilt my rig with an EVGA GTX 670 FTW
Originally Posted by palmtree
(Post 947445)
How long ago did you get the card? Can you RMA it?
|
Originally Posted by EO2K
(Post 947444)
Your issue most likely heat, not drivers.
(...) I started using a piece of software called EVGA Precision (EVGA | Software | EVGA Precision) to manually set the fan speed/temperature response curve

I set a new linear temp curve to crank the fan up to 100% by 65°C, and played for an hour without any glitches. Judging solely by the acoustic signature, I'm quite certain that the fan was not operating at a high speed previously- this sucker is LOUD when it's cranked up. Thanks for the tip on that. I guess this is what I get for owning an old GPU built on a 65nm process with 1.4 billion transistors. |
2 Attachment(s)
Hooray! I'm useful! https://www.miataturbo.net/attachmen...ine=1352330797 I'm having a wonderful time! |
Hey, how come he gets the credit when I suggested it first? I'm wounded.
|
Psssht! You had me by 5 minutes only because I was typing out an explanation and trying to find a link to my solution :party:
|
Originally Posted by rleete
(Post 947581)
Hey, how come he gets the credit when I suggested it first? I'm wounded.
I am really quite unable to reconcile how the manufacturer of this card could configure the fan control so poorly. I grant you, it is annoyingly loud when it's cranked up, but annoyingly loud and working is a hell of a lot better than quiet but non-functional. If I wanted the machine to be quiet above all else, I'd turn it off, throw it in a hole in the back yard, and pour concrete over it. The fact that I've gone to all the trouble of actually plugging it in and turning it on kind of suggests that I want to USE it, not listen to how quiet it is when it's just sitting there showing a black screen. |
People confuse me.
Out of curiosity, I started looking around at water-cooling solutions, knowing that this has been fashionable in recent years. Turns out that there are water blocks available for the GTX 260/280. They cost more than a brand new Radeon 7770, which outperforms the GTX 280 by about double and is apparently capable of doing so without overheating OR sounding like an F14 being launched off an aircraft carrier. Seriously? $150 for a poorly-machined block of copper? Who the hell is actually buying these things? |
Nobody is now, but when the card was $600 or whatever it was new, then maybe it made sense?
|
no credit to me as well? I feel sad.
|
Originally Posted by Joe Perez
(Post 947596)
I am really quite unable to reconcile how the manufacturer of this card could configure the fan control so poorly. I grant you, it is annoyingly loud when it's cranked up, but annoyingly loud and working is a hell of a lot better than quiet but non-functional.
For example: EVGA 02G-P4-2660-KR GeForce GTX 660 http://images17.newegg.com/is/image/...825-Z01?$S300$ http://images17.newegg.com/is/image/...825-Z03?$S300$

This is pretty much a "reference" cooler design. It has a shroud and a single fan, and looks exactly like the one on the NVidia website. It appeals to the "budget" buyer. It's usually a couple of copper slugs over the GPU and MOSFETs stuck to a cast/finned aluminum plate. It works, kinda. It's the bare minimum to get the job done. Most of the "gamer" targeted cards will have an "other than NVidia OEM" heat solution, because gamers know they are going to abuse these things.

Same card, same chipset, same RAM, but includes an "aftermarket"~ish cooling solution: GIGABYTE GV-N660OC-2GD GeForce GTX 660 http://images17.newegg.com/is/image/...-443-TS?$S300$ http://images17.newegg.com/is/image/...443-Z03?$S300$

The heat pipes, dual fans, and stacked sheetmetal finned heat sink massively increase surface area. The larger fans will move more air while being quieter than the blower in the OEM design. This card is going to run cooler for longer. Downside is, this will usually be more expensive. (bad example, these are within $10 of each other on Newegg, but you get the idea)

Why buy a Trackspeed Engineering radiator when I can get one from Radiatorbarn.com for 1/5th the price?! They both radiator, right? |
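The surface-area point above can be sanity-checked with Newton's law of cooling, P ≈ h·A·ΔT: for the same power draw and airflow, doubling the effective fin area roughly halves the GPU-to-air temperature rise. The h and A values below are illustrative guesses, not measurements of any real cooler:

```python
# Newton's law of cooling, crudely: P = h * A * dT, so the chip-to-air
# temperature rise is dT = P / (h * A). Numbers below are illustrative.
def temp_rise_c(power_w: float, h_w_per_m2k: float, area_m2: float) -> float:
    return power_w / (h_w_per_m2k * area_m2)

# Same 150 W card, same airflow (h), but doubled fin area:
small = temp_rise_c(150.0, 50.0, 0.05)  # 60.0 C rise over ambient
large = temp_rise_c(150.0, 50.0, 0.10)  # 30.0 C rise -- half the rise
print(small, large)
```

The heat-pipe coolers win on both terms at once: more area A, and bigger, slower fans that can still push enough air to keep h up without the blower whine.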
Originally Posted by EO2K
(Post 947620)
Honestly, it's review sites like HardOCP and NewEgg. "Casuals" want quiet cards. (...) The manufacturer knows this, and so they detune the fan to suppress noise complaints, and sell more cards to the masses due to favorable reviews.
That's my problem here. I grok the concept that 133T G4M3RZ will tend to gravitate towards fancy cooling solutions because they plan on overvolting / overclocking. Hell, 20 years ago (before water-cooling existed in its present form) people were literally submerging their entire motherboard in a fish tank filled with mineral oil just so they could crank an extra 20 or 30 MHz out of their Pentium MMX.

But I'm not doing that. This Zotac 280 that I have is an exact copy of the reference design, running at reference voltage and reference clock. It is bone stock. And I am running TF2 on it, which is a game that had already been out for a full year back in 2008 when this card was the new hotness. It should not be overheating.

I mean, imagine that you went to the dealership and bought a new car. Something fairly high-end but still mass-market, like a 911 or a Merc SLK55. You haven't turbocharged it or modified it in any way. And every single time you drive it on the highway at legal speeds for more than an hour, the radiator boils over and the engine shuts down. You'd be pretty pissed, amirite? That's my problem here. This video card cost $650 when it was new- hardly something made to appeal to the "budget buyer." And I don't overclock shit or screw around with it in any way, because I want it to just sit there and work reliably.

At the very least, give me a friggin' check-box somewhere in the stock driver / utility package that says "By clicking this, I acknowledge that I am going to cause the card to emit a loud noise while I am really pushing it, in return for which it will be reliable and not crash." For crying out loud, the OEM (nVidia) driver package for this sucker is 224 megabytes. You'd think that they'd have space in there somewhere for a box to enable the "work reliably" option. |
Originally Posted by Joe Perez
(Post 947630)
Yeah, I understand that. But if you're going to de-tune the fan, wouldn't it also make sense to de-tune the GPU so that it doesn't constantly overheat and shut down during normal operation? I'd think that reviews which say "This card is total piece of shit that locks up and crashes after half an hour" would be considered more unfavorable than reviews which say "This card works awesomely and is totally reliable, though it's a bit loud when I'm really pushing it."
(...) You'd think that they'd have space in there somewhere for a box to enable the "work reliably" option.

The fan is naturally loud, there is no getting around the turbine design. I'd recommend a solder reflow using an oven and it should be good as new. No lie. |
Originally Posted by EO2K
(Post 947620)
Honestly, it's review sites like HardOCP and NewEgg. "Casuals" want quiet cards. If they read reviews about how loud something is, that's going to scare them off. "Casuals" also don't push their systems like a lot of us do. The manufacturer knows this, and so they detune the fan to suppress noise complaints, and sell more cards to the masses due to favorable reviews. Knowledgeable folks understand that noise comes with performance, and thus don't usually care, or don't buy cards with OEM coolers.
(...) Why buy a Trackspeed Engineering radiator when I can get one from Radiatorbarn.com for 1/5th the price?! They both radiator, right?

My single GTX 280 OC from BFG had a max temp of 59c in BF3 (DirectX 10 only) at full fan speed. Leaving it to regulate itself got to a recorded extreme of 82c; it did eventually crash, and each crash reduced the temperature it could handle, until it finally artifacted at 53c. I just solder reflowed the card and replaced the caked-on old thermal paste with new, good paste. I tested it with a self-regulated max of 61c, and 57c at max fan. No crashes.

My single GTX 560ti AMP! had the dual cooling fans; it was louder, and its max temp in BF3 was 58c @ 100% fan speed.

My single GTX 670 was overclocked 110+ MHz @ 120% power usage (new term for the 600 series); max temp was 52c @ 80% fan speed. There are now two of them in SLI with the same clock, with one still being the temp above, and the other being only 55c @ 80% fan speed.

Every card above had heat pipes; nvidia DOES install them in the stock design. I'm an nvidia fan boy (pun intended). My case has a huge 120mm fan jet directing it to the pci cards; ventilation is key |
Originally Posted by Pen2_the_penguin
(Post 947686)
(point)
Originally Posted by Pen2_the_penguin
(Post 947689)
(counterpoint)
However, as an engineer, I can attest that designing an electronic product with the intention of running it at >80 °C is not good practice. At the very least, you are compromising on thermal margin for all of the semiconductors, and you are most likely also exposing electrolytic capacitors to ambient temperatures which will accelerate their decomposition. The fact that you have observed the cards to degrade with time supports this assertion.

I still go back to one simple point: the card is physically capable of cooling itself in a reasonable fashion given the supplied heat-management solution. Thus, even in a context in which acoustic properties are not to be discounted, logic demands that the user at least be presented with an option in the configuration software to elect to operate the card in a way that ensures its stability and longevity. |
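The thermal-margin argument can be put in numbers with the usual first-order model T_chip ≈ T_ambient + P·R_θ. The 236 W figure is the GTX 280's published TDP; the junction-to-ambient thermal resistances here are illustrative guesses, not measured values for any card:

```python
# First-order junction temperature estimate: T_j = T_ambient + P * R_theta.
# The R_theta values are purely illustrative, not measured.
def junction_temp_c(ambient_c: float, power_w: float,
                    r_theta_c_per_w: float) -> float:
    return ambient_c + power_w * r_theta_c_per_w

# A 236 W TDP card in a 30 C case, at two effective thermal resistances:
print(junction_temp_c(30.0, 236.0, 0.3))  # 100.8 C -- past comfortable margins
print(junction_temp_c(30.0, 236.0, 0.2))  # 77.2 C  -- where you'd want to be
```

Shaving even 0.1 °C/W off the effective thermal resistance (which is roughly what spinning the fan up does) moves the chip from the >100 °C zone into comfortable territory, consistent with the crashes stopping once the fan curve was raised.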
© 2024 MH Sub I, LLC dba Internet Brands