Miata Turbo Forum - Boost cars, acquire cats.


Joe Perez 06-28-2012 04:52 PM

Yet another video card thread
 
Just tell me which one I should buy, dammit.


Ok, so maybe a back-story is required.


I know nothing about these fancy graphics cards. The last video card I paid more than $50 for had a VLB connector on it, to give you some idea of how long it's been. I think that was in my '486.

I'm not a hardcore gamer. I don't want to spend a ton of money. I just want something to make TF2 run a bit more smoothly- using PyroVision for the first time last night really exposed what weaksauce my current display adapter is.


I've read (here and elsewhere) that the 6850 is the price / performance champion, and at $130-$150 they're at the upper end of what I'd prefer to spend. Is this still true?


Of course, silly me thought that "ATi 6850" referred to a specific graphics card. Who knew that chipset vendors were now outsourcing production to all sorts of little companies I'd never heard of, and that there are about a dozen different video cards which can all be described as being a 6850.

I start reading customer reviews. All that I've gathered from that is that every single video card in production today is an unreliable piece of crap which overheats, locks up, has poor driver support, sounds like a Saturn V rocket taking off, and something to do with a thing called "Skyrim" that I'd never heard of before. So the reviews aren't being especially helpful.

I've also learned that all ATi cards suck and that NVidia is better, except that all NVidia cards suck and ATi is better.

(This is why I hate people.)

And the manufacturers' naming conventions aren't helping either. I've found that all cards seem to be numbered, and that higher numbers indicate lower performance, except for when they don't. I think they're deliberately doing this to confuse me.

All I want is something that's reasonably quiet when not "in use", isn't a piece of crap, doesn't cost a fortune, and sucks less than what I have now. I'm not paying $200 for a graphics card, so don't even go there.


Just tell me what to buy, and I'll buy it. Maybe this one, which is in stock at the Fry's near me?

I also saw some folks here talking about the GTX 460, which I can get even more cheaply at Geeks, which is also a few miles from here: http://www.geeks.com/details.asp?inv...024-CO&cat=VCD

mgeoffriau 06-28-2012 05:22 PM

Best Graphics Cards For The Money: June 2012

Seems like a Radeon HD 6790 for around $115 would fit your needs.

mgeoffriau 06-28-2012 05:27 PM

Alternatively, I just found a Matrox Millennium video card in my PC parts box that you can have for the cost of shipping.

rleete 06-28-2012 05:30 PM

Be glad you don't also have the requirement that it runs a CAD program like SolidWorks or Inventor. That gives a whole new meaning to confusion.

Joe Perez 06-28-2012 06:16 PM


Originally Posted by rleete (Post 896657)
Be glad you don't also have the requirement that it runs a CAD program like SolidWorks or Inventor. That gives a whole new meaning to confusion.

Here at the office, we have one guy who uses Solidworks. He has it very easy. Just call the corporate IT department and say "I use Solidworks" and a week later, a new seventy-core Xeon machine with eleventy exabytes of RAM shows up that has nine Cray XT5s wired in parallel as a graphics chipset.

Or something like that.

So we're still talking about chipsets rather than cards, which is kind of not helping me. I'm assuming that it actually matters which company builds the card itself, insofar as part quality, layout, driver support, etc.

blaen99 06-28-2012 06:20 PM

Two questions, Joe.

Do you want the best price/performance ratio, or just the best performance at a reasonable price?

Secondly, do you just want to throw it on, or do you want to be able to play with it at all (I.e. being able to software mod a card into a higher variant of itself.)?

blaen99 06-28-2012 06:26 PM

Ah ---- it. Go with something like Newegg.com - Galaxy 56NGH6DH4TTX MDT GeForce GTX 560 (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Mult-Display Video Card

It's in your price range, and a hefty step above the 460.

Alternatively, Newegg.com - GIGABYTE Super Overclock Series GV-N56GSO-1GI GeForce GTX 560 (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card

In your price range after rebate, and blows the first one out of the water imnsho.

I've gotta GTFO now, unfortunately. Later folks!

Joe Perez 06-28-2012 06:50 PM


Originally Posted by blaen99 (Post 896675)
Do you want the best price/performance ratio, or just the best performance at a reasonable price?

Best *quality* for a reasonable price.

Quality includes:
  • Reasonable performance as compared to my current $30 GeForce 210.
  • Not a buggy piece of crap.
  • Doesn't consume 1.21 GW at idle.
  • Fan is completely inaudible at idle.





Secondly, do you just want to throw it on
Yes.


, or do you want to be able to play with it at all (I.e. being able to software mod a card into a higher variant of itself.)?
Couldn't care less. The last computer I overclocked or modified in any serious way had an 8 Mhz CPU and 512k of RAM.



So I'm gathering you like the 560, then.

What about any of these?

Newegg.com - ASUS ENGTX560 DCII OC/2DI/1GD5 GeForce GTX 560 (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card

Newegg.com - EVGA 01G-P3-1460-KR GeForce GTX 560 (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card

Newegg.com - PNY VCGGTX560XPB GeForce GTX 560 (Fermi) 1GB 256-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card

I ask because they're all three in stock at my local Fry's in the $140-$160 price range.

rleete 06-28-2012 06:56 PM


Originally Posted by Joe Perez (Post 896672)
Here at the office, we have one guy who uses Solidworks. He has it very easy. Just call the corporate IT department and say "I use Solidworks" and a week later, a new seventy-core Xeon machine with eleventy exabytes of RAM shows up that has nine Cray XT5s wired in parallel as a graphics chipset.

I wish. I'm running a 5 year old Dell with an aging ATI card. My home system is more up to date.



Originally Posted by Joe Perez (Post 896672)
So we're still talking about chipsets rather than cards, which is kind of not helping me. I'm assuming that it actually matters which company builds the card itself, insofar as part quality, layout, driver support, etc.

I'd personally go with the one mgeoffriau said (it is first on the list at the bottom of the page he linked) for one simple reason: cost. If you put a fancy card in, then next you'll need to upgrade memory. Then a new processor, which means you might as well get a MB. And you'd better upgrade that old P/S by now, too. It's a never ending cycle. Nickels and dimes, a hundred bucks a pop.

Upgrade to the cheaper card, and by the time that's falling flat, you'll need a whole new rig anyway.

Joe Perez 06-28-2012 08:31 PM


Originally Posted by rleete (Post 896691)
I wish. I'm running a 5 year old Dell with an aging ATI card. My home system is more up to date.

Don't get me wrong- the rest of us suffer with mediocre hardware. It's only the mechanical engineer who gets the fancy machine. A couple of my computers at work are actually Pentium 4's with a 2G ram limit. The only Core-series machine I have at the office is one I bought at Fry's and filed an expense report for.




I'd personally go with the one mgeoffriau said (it is first on the list at the bottom of the page he linked) for one simple reason: cost. If you put a fancy card in, then next you'll need to upgrade memory. Then a new processor, which means you might as well get a MB. And you'd better upgrade that old P/S by now, too. It's a never ending cycle. Nickels and dimes, a hundred bucks a pop.

Upgrade to the cheaper card, and by the time that's falling flat, you'll need a whole new rig anyway.

Which one, the AMD 6670?

I'm starting to get the impression that it really doesn't matter at all so long as it looks fancy and has a big fan on it...

Ironically, the rest of my home PC is already pretty up to date. i5 processor, 8 gigs of RAM, beefy power supply...

rleete 06-28-2012 09:00 PM


Originally Posted by Joe Perez (Post 896723)
Which one, the AMD 6670?

Ironically, the rest of my home PC is already pretty up to date. i5 processor, 8 gigs of RAM, beefy power supply...

That's the one I was suggesting, but if you've already upgraded the rest of the rig, it's probably better to go more middle of the road.

That one from Fry's you listed in the OP should be a good one, although I am always leery of stuff like the "eyefinity" thing. Proprietary stuff tends to go the way of the dodo, and leave you hanging.

I really like the PNY one you linked to earlier. Always had good luck with their memory chips, and bought lots of stuff from the egg over the years.

Joe Perez 06-28-2012 09:59 PM

Ok, so tomorrow I am going to go to Fry's and buy one of these two cards:

Either the ASUS EAH6850 for $156
FRYS.com | ASUS

Or the PNY GTX 560 for $155
FRYS.com | PNY

Tell me how to spend my money

Frankly, I don't care about Eyefinity or any of that crap. This is my home PC. I have one 32" monitor which supports both HDMI and DVI, 3d shutter glasses give me a headache, and I will never take advantage of crossfire or anything like it. The last time I used a piggyback video processor was in the late 90s, and that was a video capture card.

midpack 06-28-2012 10:30 PM

560 > 6850

Awesome GPU chart

blaen99 06-28-2012 11:13 PM

GTX 560. Unless you need Eyefinity, which you've said you don't.

I recommended the 560 for a reason, Joe. You can safely ignore the Tom's chart, it's a bit....off, but the 560 is definitely the best bang for your buck.

fooger03 06-29-2012 08:51 AM


Originally Posted by Joe Perez (Post 896762)
Either the ASUS EAH6850 for $156
FRYS.com | ASUS

Or the PNY GTX 560 for $155
FRYS.com | PNY

The EAH6850 is exactly the video card that I have; it is pure genius for the price.

If you're not a fan of mail-in rebates, then just buy the 6850.

If, however, you don't mind mail-in rebates, your decision becomes more difficult. This benchmark site I use:

PassMark Software - Video Card Benchmarks - High End Video Cards

Lists the 6850 and the GTX 560 as one right after the other - nearly identical in performance. I believe the EAH6850 core is overclocked by ASUS though, which is going to give you a small (maybe 5-8%) bump in performance. Not sure about the PNY - though I've always viewed PNY as a budget brand, I know they have performance lines too.

Lastly, ASUS is by FAR my favorite computer hardware manufacturer - with a few exceptions, I personally would pay a 5-10% premium to have an ASUS product over most other brands, and for the single problematic ASUS item I've ever purchased, their customer service was exceptional.

So: If I were you, I'd buy the EAH6850 w/no rebate, and be done with it. It doesn't have problems with SKYRIMS either... :giggle:

mgeoffriau 06-29-2012 09:55 AM

Agreed on ASUS. Used an ASUS vidcard and ASUS wifi card on my HTPC build.

blaen99 06-29-2012 10:36 AM


Originally Posted by fooger03 (Post 896889)
The EAH6850 is exactly the video card that I have; it is pure genius for the price.

If you're not a fan of mail-in rebates, then just buy the 6850.

If, however, you don't mind mail-in rebates, your decision becomes more difficult. This benchmark site I use:

PassMark Software - Video Card Benchmarks - High End Video Cards

Lists the 6850 and the GTX 560 as one right after the other - nearly identical in performance. I believe the EAH6850 core is overclocked by ASUS though, which is going to give you a small (maybe 5-8%) bump in performance. Not sure about the PNY - though I've always viewed PNY as a budget brand, I know they have performance lines too.

Lastly, ASUS is by FAR my favorite computer hardware manufacturer - with a few exceptions, I personally would pay a 5-10% premium to have an ASUS product over most other brands, and for the single problematic ASUS item I've ever purchased, their customer service was exceptional.

So: If I were you, I'd buy the EAH6850 w/no rebate, and be done with it. It doesn't have problems with SKYRIMS either... :giggle:

Although I love Asus myself (XFX or Asus video cards only here), please don't link synthetic tests as support for an argument about video card performance.

That's like trying to say "But McDonald's fries are AWESOME! So you should buy burgers every day from McDonald's!"

Synthetic tests have long held only passing similarity to real world results.

midpack 06-29-2012 02:17 PM


Originally Posted by blaen99 (Post 896783)
Although you can safely ignore the Tom's chart, it's a bit....off

How so?

blaen99 06-29-2012 11:25 PM


Originally Posted by midpack (Post 897056)
How so?

Any chart that tries to pack a lot of cards into a single, readable list is going to be off no matter how you do it, man. Specifically, certain cards do better at certain things than other cards, and vice versa across the list.

It's a similar problem to the synthetic benchmarking. Just because a card beats another in a purely synthetic benchmark doesn't mean it's going to win in, say, BF8.

mgeoffriau 06-30-2012 11:28 AM

That's why they wrote a whole article discussing the best cards at each price point.

rleete 06-30-2012 12:14 PM

Joe, What did you buy?

Please say you went to Fry's and got the PNY.

Joe Perez 06-30-2012 01:28 PM


Originally Posted by rleete (Post 897386)
Joe, What did you buy?

Nothing.

At the last minute, a forum member emailed a link to this thread to a friend who happened to be sitting on a pile of used high-end video cards, who subsequently sent me a GTX 280 for free.

Free is good.

Joe Perez 07-01-2012 02:03 AM

Arrrr. I really want this video card to arrive!

I scored a Strange Rainblower tonight in TF2 which carries with it a very effective taunt kill, except I can't use it because equipping the Rainblower forces you into Pyrovision, which forces HDR and Bloom on, which turns my system into a lagfest.

rleete 07-01-2012 01:21 PM

Shoulda bought the PNY at Fry's. It'd be humming right along by now.

Free is good, but time is money.

Joe Perez 07-01-2012 07:00 PM

Y U no liek GTX 280?

Amusing sidebar: I was doing some reading on this chipset, and apparently it holds the record for the largest physical GPU die ever constructed. A somewhat dubious honor (makes me wonder what the power consumption will be) but it was apparently the highest of the high-end cards in its day (circa 2008), retailing at over $600.

I can't even begin to comprehend spending that kind of money on a video card, and yet a quick trip 'round the Egg shows that the current-gen "gamer class" high end cards still sell for $600-$700.

Seriously, who the hell is spending $700 on a video card just to play videogames?


Still, at least I can see the merit in spending money on a video card, in that you do in fact get some performance return from it. Contrast this to the $70 Bigfoot Gaming Network Card. Seriously? It's an ethernet card. It's one link in a *VERY* long chain between you and the rest of the world. The ethernet adapter that came built into your motherboard is capable of a hundred times more throughput than your fancy cable modem, to say nothing of the dozen or two routers between your cable headend and whatever game server you're connected to.

rleete 07-01-2012 07:53 PM

Not that I don't like it, but that I couldn't stand to wait. I'm very impatient sometimes.

Oh, and edit out that double post. I expect it to be gone by the time I finish typing this.

blaen99 07-01-2012 09:01 PM

Hey, if someone offered me a GTX280, I'd jump on that quick.

IIRC, they have a full CUDA implementation, which is exactly why I bought my 470 originally. I should upgrade here in the near future, but nothing has offered a conclusive enough upgrade to entice me yet.

rleete 07-01-2012 09:11 PM


Originally Posted by Joe Perez (Post 897851)
It's an ethernet card. It's one link in a *VERY* long chain between you and the rest of the world.

And yet Monster cable sells gold plated power cords. Never mind your house wiring is all copper, and may have crappy connections. It's all about the bragging rights as to who can spend (waste) the most money.


Very good on editing out the double post, but you were a little slow this time.

Joe Perez 07-02-2012 12:03 AM


Originally Posted by rleete (Post 897917)
And yet Monster cable sells gold plated power cords. Never mind your house wiring is all copper, and may have crappy connections. It's all about the bragging rights as to who can spend (waste) the most money.

Yeah, most of the high-end "gamer class" stuff I find quite over the top, but a "gamer class" ethernet card is about as retarded as an "audiophile" power cord. And by retarded, I am talking short-bus here. Everyone knew of some kid in high school that made you ask "why is he/she here? They are literally not going to gain anything from this experience, and are merely consuming educational resources which could be used to prepare others here for a life of productive, lotto-playing, Escalade-coveting, minimum-wage employment."



Very good on editing out the double post, but you were a little slow this time.
Have you been having this issue with your browser frequently? I see no doppelpost.

blaen99 07-02-2012 12:29 PM

I am absolutely not saying "Go buy a monster network card" here, Joe, but unlike high-end audiophile crap, there are actually quantifiable improvements with the network card.

AnandTech - Bigfoot

The last review they had on a wired ethernet card showed FPS gains up to 10%, so...Yeah. If you are dropping $600-$1k on video cards, $70 on a network card that can also net you up to a 10% FPS gain is nothing. Wrong target market and all.

/Sadly, not even I am in the target market, but hey.

Joe Perez 07-02-2012 02:40 PM

1 Attachment(s)

Originally Posted by blaen99 (Post 898166)
The last review they had on a wired ethernet card they had showed FPS gains up to 10%,

This just doesn't make sense.

Seriously- I can think of no way in which swapping ethernet cards would cause a 10% increase in rendered framerate of a videogame, unless there was something so horribly wrong with the old one (or its drivers) that it was misbehaving badly and consuming tons of CPU / memory / bus resources.



At any rate:

https://www.miataturbo.net/attachmen...ine=1341254419



Irony:


When it comes to home-theater stuff, I usually bitch and moan about how all the cool new devices (e.g., set-top media players) only have HDMI outputs, but don't support my TV's component input needs.

My monitor at home only has HDMI inputs, whereas this card has two DVI ports and, amazingly, one analog component output.

Off to Fry's to pick up a DVI -> HDMI adapter.

blaen99 07-02-2012 05:58 PM


Originally Posted by Joe Perez (Post 898237)
This just doesn't make sense.

Seriously- I can think of no way in which swapping ethernet cards would cause a 10% increase in rendered framerate of a videogame, unless there was something so horribly wrong with the old one (or its drivers) that it was misbehaving badly and consuming tons of CPU / memory / bus resources.

I can think of lots of ways, Joe.

The simplest way to explain it conceptually is how much faster a hardware raid card is vs. software raid. Remember, everything is software LAN-wise now, so a pure hardware chip that offloads everything with heavily optimized drivers....

http://hardware.gotfrag.com/portal/story/34683/ goes into some detail on it, I'll later dig up the anandtech article on it that also goes into detail.

Joe Perez 07-03-2012 12:35 AM

7 Attachment(s)
So the wait wasn't so bad, rleete.


First things first. This card is MASSIVE. From front to back it's longer than the motherboard. The last time I owned an expansion card which fit that description, it had an 8-bit ISA connector on it.

It weighs a ton. Literally. As a test, I constructed a huge see-saw out of a couple of steel I-beams which I happened to have lying around left over from when I built the Empire State building using only a leatherman tool and three matchbooks from a jazz club in west Harlem, and with my Miata on one end and this card on the other, I only had to move the card 9 inches out from the fulcrum in order to counter-balance the car. In addition to two expansion slots, it also consumed two hard drive bays, forcing the relocation of two of my hard drives. (I wound up moving all three of them, just to tidy things up.) It covered two of the six SATA ports, and will have to be removed if I ever need to un-latch the RAM slots to remove a stick of memory:

https://www.miataturbo.net/attachmen...ine=1341290116


At any rate, everything finally managed to fit, though ironically my 1.8" SSD is now sitting in a 5.25" drive bay. (Yes, I actually still have a couple of 3.5" to 5.25" adapter kits lying around- complete with beige faceplate with embedded LED. Remember when hard drives had a little two-pin header on them for an external LED?) And I was honestly a bit surprised that I was able to find the correct power cables to attach it to my power supply. When I bought this unit two years ago, I just tossed 'em into a random box figuring that I'd never need 'em.


Fired it up, and then:

https://www.miataturbo.net/attachmen...ine=1341290116


Aaah, my old nemesis. It's been quite some time since we've been together.


Ok, into Safe Mode we go.

Works fine, so reboot into normal mode.


http://speedmaxpc.com/wp-content/upl...n-of-death.bmp

Hmmm.


Back into Safe Mode, works fine.

Hmm.

Too lazy to grab the laptop, so I boot into Safe Mode with Networking so I can do some googling. When suddenly:

Attachment 239910


Well, now this is interesting.

You may recall that just this morning I was extolling the virtues of the Atheros AR5BDT92 wi-fi card in another thread. Turns out that it and my new GTX280 hate each other. So sadly, I've had to remove it and go back to the ole' USB Netgear WN111. It seems to mildly dislike me, but is content with just the occasional disconnect.

So there we are. On to TF2.

Hmm. Graphics options. So many choices... Fuck it, let's just set everything to maximum and increase the resolution all the way to my monitor's native 1920x1200. And no sense beating around the bush, let's just jump straight into Pyroland with my shiny new Rainblower equipped.

http://up-ship.com/blog/wp-content/u..._beautiful.jpg

I'm nearly speechless. Visually, everything looks nearly the same (albeit at higher resolution) but the smoothness is just mind-blowing. I didn't really realize what I was missing, but it is so much easier to play now! Previously, close-in engagements were almost impossible, because the video became so choppy that it was impossible to circle around an opponent at close range while still maintaining accurate aim. But now it's just utterly fluid and seamless- I scored my first flamethrower kill of a Heavy tonight, simply because I could actually see him as I circled around him!

I also picked up FOUR charred/burned item drops and got two group invites in less than an hour of play.

Coincidence?

I could get used to this.

mgeoffriau 07-03-2012 12:43 AM

Send me that Atheros card and I'll send you the ASUS WiFi card in return...

Joe Perez 07-03-2012 12:52 AM


Originally Posted by mgeoffriau (Post 898555)
Send me that Atheros card and I'll send you the ASUS WiFi card in return...

That would be the one with "unwanted glitches like the repeated stuttering in video" I assume?

Bryce 07-03-2012 12:57 AM

Playing at 60+FPS is nice, isn't it? I hate having to spend $500 for the latest video card every few years to be able to accomplish the same thing with the newest games, but it has to be done.

Make sure you turn off VSYNC. It adds a small amount of input lag that is detrimental to your performance in online games. Get used to the screen tearing. You WILL slay more foes with it off.

Pen2_the_penguin 07-03-2012 12:59 AM

the problem is that the GTX280 doesn't support DX11, so some shaders in newer games are glitchy as FUUUU.

I loved my GTX280OC, it was tits... still runs BF3 in DX10 on max settings with a 60+ fps average on my work computer.

jasonb 07-03-2012 01:19 AM

so am i reading this right that gigabit ethernet chipsets @ household bitrates can generate enough interrupts to be measurable in terms of FPS? i'm a little bit surprised.

i'm not good with windows, but in linux there are various knobs you can turn for interrupt routing.

in the following case core 0 is handling all of the ethernet interrupts. if this same core is also handling your video card interrupts, then i can believe you would see the effect of one on the other.

in linux you can tweak these things. i.e.: SMP affinity and proper interrupt handling in Linux - Alex on Linux
or if you are feeling lazy on some distros you can run irqbalance daemon...

Code:

cat /proc/interrupts
            CPU0      CPU1      CPU2      CPU3      CPU4      CPU5      CPU6      CPU7     
  0:      53238          0          0          0          0          0          0          0  IO-APIC-edge      timer
  1:          4          0          0          0          0          0          0          0  IO-APIC-edge      i8042
  4:    312844          0          0          0          0          0          0          0  IO-APIC-edge      serial
  7:          1          0          0          0          0          0          0          0  IO-APIC-edge   
  8:      23520          0          0          0          0          0          0          0  IO-APIC-edge      rtc0
  9:          0          0          0          0          0          0          0          0  IO-APIC-fasteoi  acpi
  12:          6          0          0          0          0          0          0          0  IO-APIC-edge      i8042
  14:          0          0          0          0          0          0          0          0  IO-APIC-edge      pata_amd
  15:          0          0          0          0          0          0          0          0  IO-APIC-edge      pata_amd
  19:  51027885          0          0          0          0          0          0          0  IO-APIC-fasteoi  aacraid
  21:        45          0          0          0          0          0          0          0  IO-APIC-fasteoi  ohci_hcd:usb2
  22:        45          0          0          0          0          0          0          0  IO-APIC-fasteoi  ehci_hcd:usb1
  23:          0          0          0          0          0          0          0          0  IO-APIC-fasteoi  sata_nv
  47:          0          0          0          0          0          0          0          0  IO-APIC-fasteoi  sata_nv
  78: 2742263583          0          0          0          0          0          0          0  PCI-MSI-edge      eth2

edit: this is also worth a read: http://www.alexonlinux.com/msi-x-the...interrupt-load
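[A minimal sketch of the affinity tweak jasonb describes. The IRQ number (78) and NIC name (eth2) are taken from the `/proc/interrupts` dump above; the target CPU is just an example, and the actual write requires root.]

```shell
# Pin a NIC's interrupts to one CPU so they stop competing with
# whichever core the game and GPU driver are keeping busy.
# IRQ 78 / eth2 assumed from the /proc/interrupts dump above.

cpu=3                                 # target CPU for network interrupts
mask=$(printf '%x' $((1 << cpu)))     # affinity bitmask: CPU3 -> "8"

echo "as root: echo $mask > /proc/irq/78/smp_affinity"

# After the real write, the eth2 interrupt counts in /proc/interrupts
# should start climbing under the CPU3 column instead of CPU0.
```

Note that running the irqbalance daemon will periodically rewrite these masks, so pick one approach or the other.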

mgeoffriau 07-03-2012 01:45 AM


Originally Posted by Joe Perez (Post 898559)
That would be the one with "unwanted glitches like the repeated stuttering in video" I assume?

Oooh, fancy!

(Yes.)

Saml01 07-03-2012 09:19 AM

Joe. Whats your steam handle?

Braineack 07-03-2012 09:35 AM


Originally Posted by Joe Perez (Post 897405)
Nothing.

At the last minute, a forum member emailed a link to this thread to a friend who happened to be sitting on a pile of used high-end video cards, who subsequently sent me a GTX 280 for free.

Free is good.


WHOA WHOA WHOA WHOA.


Father once spoke of an angel, I used to dream he'd appear. Here in this forum he calls me softly, somewhere inside, hiding. Somehow I know he's always with me. He, the unseen genius.

Grant me your glory! Angel of GPUs, hide no longer! Secret, strange, Angel.

mgeoffriau 07-03-2012 09:41 AM

Woah, I just looked up the GTX 280. That's a great card to pick up for free. I thought I had done well when I snagged an 8800GTS for $40.

GeForce 8800 GTS (G92) vs GeForce GTX 280 – Performance Comparison Benchmarks @ Hardware Compare

Joe Perez 07-03-2012 12:22 PM

2 Attachment(s)

Originally Posted by Bryce (Post 898563)
Playing at 60+FPS is nice, isn't it? I hate having to spend $500 for the latest video card every few years to be able to accomplish the same thing with the newest games, but it has to be done.

Nah. This experience has pretty much confirmed my long-held belief that it's perfectly acceptable to wait a couple of years and spend $100-$150 (or better yet, nothing at all) on technology that's 3-4 years old and formerly bleeding-edge.

Ironically, I feel even more like this guy than ever now:

https://www.miataturbo.net/attachmen...ine=1341332561


I'm sure I probably won't be able to run Call of Duty XVI (or whatever they're up to now) at 1920x1200 in the highest quality on this card, but should I need to, I can just turn down some of the quality settings. I honestly don't care whether or not I can see the entire world ray-traced to a depth of 16 intersections reflecting off of every single drop of blood sprayed on a wall. That does not significantly enhance gameplay.




Originally Posted by Bryce (Post 898563)
Make sure you turn off VSYNC. It adds a small amount of input lag that is detrimental to your performance in online games. Get used to the screen tearing. You WILL slay more foes with it off.

I had seen that option in the menu, but never really understood what it was. Did some research, and you're quite right. A very cool concept, but potentially detrimental. At any rate, it's been turned off this whole time, and I've not noticed any of these tearing artifacts.


I still cannot get over how utterly fluid the display is now. When I first fired it up, I was actually sort of repulsed by it; kind of like how all the high-end LED/LCD TVs which support upconversion to 600 Hz refresh rates try to do all this fancy motion interpolation which winds up making movies look like ESPN-HD football games. In that context it's horribly off-putting, but the benefits in this context are obvious and massive.





Originally Posted by Pen2_the_penguin (Post 898564)
the problem is that the GTX280 doesnt support DX11, so some shaders in newer games are glitchy as FUUUU.

Fortunately, TF2 doesn't support DX11 either.

I guess we'll wait and see. Maybe in 3 or 4 years I'll have to upgrade to a 3 or 4 year old card that supports DX11. :D






Originally Posted by jasonb (Post 898576)
so am i reading this right that gigabit ethernet chipsets @ household bitrates can generate enough interrupts to be measurable in terms of FPS? i'm a little bit surprised.

Yeah, I understand the underlying concept of parking the TCP/IP stack in software on the main CPU vs. offloading it to a dedicated co-processor. I just question how it's even possible to code one so badly that it consumes enough system resources to cause a 10% reduction in the video framerate. In relative terms, the amount of computational power required to service an ethernet port is trivial, particularly given the fact that the actual network datarates involved in online game play are surprisingly small.

I won't claim to be an expert here, but I did a bit of light searching and found a couple of facts.

According to the Valve Developer's Guide, Source Engine games typically communicate with the server at a rate of only 20-30 packets per second. (source.)

Further, this source suggests that for the default network config in TF2, the maximum effective datarate is 12.2 KBps, which again is exceedingly trivial.

I'm not saying that the overhead here is zero. But the load should be so tiny that its effect on rendered framerate should be almost immeasurable- certainly not 10% under normal circumstances.
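A quick back-of-the-envelope check makes the point. Using the ~12.2 KBps figure from the TF2 config source above against gigabit ethernet's 125 MB/s line rate (the exact byte counts here are rounded for illustration):

```shell
# Rough sanity check: online-game traffic vs. what the onboard NIC can move.
game_Bps=12500          # ~12.2 KB/s, TF2 default network config (rounded)
gige_Bps=125000000      # gigabit ethernet line rate = 125 MB/s
ratio=$(( gige_Bps / game_Bps ))
echo "game traffic is 1/${ratio} of a GigE link"   # 1/10000
```

In other words, the game is using on the order of a hundredth of a percent of the link, which is why a 10% framerate swing from the NIC alone is hard to credit.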





Originally Posted by Saml01 (Post 898646)
Joe. Whats your steam handle?

Thraddax. My avatar is the Giant Chicken.




Originally Posted by Braineack (Post 898653)
Father once spoke of an angel, I used to dream he'd appear. Here in this forum he calls me softly, somewhere inside, hiding. Somehow I know he's always with me. He, the unseen genius.

Grant me your glory! Angel of GPUs, hide no longer! Secret, strange, Angel.

I really tried to come up with a good response to this. Searched all of the lyrics, and just couldn't make any fit. I'm not musically gifted.

But I appreciate it.





Originally Posted by mgeoffriau (Post 898654)
Woah, I just looked up the GTX 280. That's a great card to pick up for free. I thought I had done well when I snagged an 8800GTS for $40.

It's certainly not top-of-the-line. Both the 6850 and the GTX 560 seem to slightly out-perform it in most benchmarks, but I'm certainly not going to complain. Compared to the card I pulled out, this thing is like handing FaeFlora the keys to a fast, reliable car.


It does seem to consume some power. This is to be expected, given the fact that it has 240 processor cores scattered across a die the size of Nevada, with more transistors than there are people in China. (That part really freaks me out. 1.4 billion is an extremely large number of anything, much less things that fit into a box under my desk.)


Stupidly, I failed to benchmark my system's power draw under actual gaming load with the previous card (the maximum TDP for the GT210 is a paltry 31 watts.) I did, however, benchmark the system's total idle load at 45-50 watts as measured on the AC side by the UPS. With the new card, idle load has jumped to 95-110 watts, which is a staggering increase. I'm really quite surprised by this, as I would have assumed that the vast majority of the card would simply power itself off when not in use, but this does not seem to be the case.

It is reasonably quiet. It is audible at idle, but only just barely. (It should be noted that the rest of the system is almost totally silent- nothing but 120mm fans, and one of the two mechanical hard drives is normally spun down except for the nightly backup operation.) During gameplay the fan speeds up a bit, though it's not noticeable with headphones on. Certainly nothing at all like the "screaming jet engine" that some users bitched about in the reviews.

Braineack 07-03-2012 01:52 PM


Originally Posted by Joe Perez (Post 898759)
I really tried to come up with a good response to this. Searched all of the lyrics, and just couldn't make any fit. I'm not musically gifted.

But I appreciate it.


Angel, I hear you. Speak, I listen. Stay by my side. Guide me. Angel, my soul was weak; forgive me. Enter at last, Master.

Angel of GPUs, guide and guardian, grant to me your glory. Angel of GPUs, hide no longer. Come to me, strange Angel.


seriously.........

blaen99 07-03-2012 02:49 PM


Originally Posted by Joe Perez (Post 898759)
Yeah, I understand the underlying concept of parking the TCP/IP stack in software on the main CPU vs. offloading it to a dedicated co-processor. I just question how it's even possible to code one so badly that it consumes enough system resources to cause a 10% reduction in the video framerate. In relative terms, the amount of computational power required to service an ethernet port is trivial, particularly given the fact that the actual network datarates involved in online game play are surprisingly small.

Speaking as someone who has both had to write a TCP/IP stack in software and has had to fix a completely broken one....

It is very easy to code one so badly.

AnandTech - BigFoot Networks Killer NIC: Killer Marketing or Killer Product? As you can see, even AnandTech tested this extensively. How the cards achieved their results baffled them as much as it does you or me, but the results were repeatable and benchmarkable, and these cards had the snot reviewed out of them across dozens if not hundreds of reviews.

They still aren't worth the $70-$100 being asked for them, mind you, but the basic effect is real and testable.

jasonb 07-17-2012 10:52 AM

this stuff is contagious. so brain's angel appeared before me (she told me to tell brain she's sorry) and offered me 2 gtx 480's. i don't actually have any working x86 desktop hardware at home right now, everything broke lol. is it worth buying a board that supports 2 video cards, or is one fine?

if 2 video cards, do both slots need to be x16 or can i make do with 2 x8's?

Joe Perez 07-17-2012 11:00 AM

Your angel is giving away GTX 480s? Sheesh- my angel needs to get with the program. :rolleyes:

Given how much more highly the GTX 480 scores than the GTX 280 in the benchmarks, I can't imagine needing two of them. But that's just me, of course.

I don't know a lot about these fancy cards, but I do know one thing. A card with an x16 edge connector will not physically fit into an x8 slot. I suppose that if FaeFlora owned the cards in question he could figure out a way to make them fit into x8 slots, however this might have a negative impact on their performance.

Bryce 07-17-2012 11:12 AM

Do not bother with getting two video cards to play nicely. Just get the fastest single card you can afford (receive). I've run dual-GPU setups from both card manufacturers and have never, ever been happy with them. Between waiting for driver updates for the latest games, microstuttering, and flickering in all Source games, I won't be touching a dual setup until they've made significant advances in compatibility, which will be nevar.

fooger03 07-17-2012 12:06 PM

You can get an ATI motherboard with dual x16 support, but Intel architecture doesn't support dual x16 slots. You will have to put one card in an x16 slot, and the second card in a x8 slot. The x8 slot will either be "open" at the back, allowing the remaining connectors for the x16 slot to freeball, or else the x8 slot will physically be x16 in size, while having internal connectors to only support x8 mode.

I run a single video card now, though I used to run dual cards. I never noticed the dual-card "micro-freezing" that bothers some people, but it sounds like they've made pretty significant improvements on that front in the last 18-24 months.

Joe Perez 07-17-2012 12:19 PM


Originally Posted by fooger03 (Post 904246)
You can get an ATI motherboard with dual x16 support, but Intel architecture doesn't support dual x16 slots.

You sure about that?

I'm looking at some IA boards on Newegg that have up to eight physical x16 slots, which seem to be configurable in a variety of ways to allocate channel-capacity as needed. A few can only drive one slot at x16 and the other at x8 (or drop both to x8 when two cards are installed), however others claim to be able to support two x16 cards at full bandwidth.

Examples:

Newegg.com - ASRock X79 Extreme4 LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 ATX Intel Motherboard

Newegg.com - ASUS P9X79 DELUXE LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 ATX Intel Motherboard with UEFI BIOS

I'd still agree that dual cards are probably more trouble than they're worth.

fooger03 07-17-2012 12:56 PM

Damn you, Gordon E. Moore!!!!

Splitime 07-17-2012 01:05 PM

any reason i shouldn't be considering two of those ASUS EAH6850s set up using CrossFireX?

Spend 300 on a single card?

edit: dammit thread beat me to my question.

Joe Perez 07-17-2012 02:13 PM

4 Attachment(s)
Also, some clarification on the issue of PCIe 8x vs 16x, etc., as I was a little unclear on this, and have done some learning.

There are four different physical standards for PCIe:

https://www.miataturbo.net/attachmen...ine=1342548809

Mechanically, larger slots accept smaller cards, meaning that you can plug an x1 card into an x4 slot, for instance. But you cannot plug an x16 card into an x8 slot (or similar), as it just won't physically fit without using a hacksaw.

What I didn't realize was that the PCIe standard also allows dynamic allocation of bus resources, such that a x16 card plugged into a x16 slot might actually be set up to run at x8 speed (or more properly, x8 width). Certain older IA MoBo chipsets didn't support running two x16 cards at full width, so boards of that architecture with two x16 slots would commonly turn them both down to x8 when two cards were installed, splitting the bandwidth in half.

http://media.bestofmicro.com/G/D/287...ck_diagram.png


So that would be a case where you could claim to have an x16 card installed in an x8 slot- technically it's an x16 slot running at half-width, not a physical x8 slot. Not as horrible as it sounds, given that the quantity of data passing across the PCIe bus to the graphics card pales in comparison to the amount of data being generated on the card itself, between the on-board memory and the GPU.


The newer stuff, however, has more bandwidth available to service graphics cards, and can thus support 2 cards operating at x16, four cards at x8, etc:

https://www.miataturbo.net/attachmen...ine=1342548809
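To see why x8 isn't the handicap it sounds like, here's a rough sketch of per-slot bandwidth by generation. The per-lane figures are approximate numbers from the published PCIe specs (2.5/5.0 GT/s with 8b/10b encoding for gen 1/2, 8.0 GT/s with 128b/130b for gen 3); the function name is just for illustration:

```python
# Approximate usable bandwidth per lane, in MB/s, per PCIe generation.
PER_LANE_MBPS = {"1.x": 250, "2.0": 500, "3.0": 985}

def slot_bandwidth(gen: str, lanes: int) -> int:
    """Total one-direction bandwidth in MB/s for a slot of the given width."""
    return PER_LANE_MBPS[gen] * lanes

# An x16 card forced down to x8 width still moves a lot of data:
print(slot_bandwidth("2.0", 16))  # 8000 MB/s at full x16 width
print(slot_bandwidth("2.0", 8))   # 4000 MB/s when turned down to x8
```

Even the halved figure is several gigabytes per second, which is why games rarely show much difference between x8 and x16 in practice.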

Saml01 07-17-2012 02:31 PM

*yawn* tell me something I don't know.

;-)

You take your heart medication?

fooger03 07-17-2012 03:50 PM

1 Attachment(s)
Example of a PCIe x4 slot designed to accommodate an x8- or x16-capable card:

https://www.miataturbo.net/attachmen...ine=1342555045

review found here:
Gigabyte EX58-UD3R : X58 On A Budget: Seven Sub-$200 Core i7 Boards

Saml01 07-17-2012 04:43 PM


The real-time clock battery unfortunately prevents insertion of anything longer than an x4 card in that open-ended slot.
From the link you posted. Kinda funny.

Sadly, you can't just stick in any card you want. An x8 or x16 card has to be able to run at x1 or x4 (most likely x1, which I read is the native speed for open-ended slots, for compatibility reasons).

Splitime 07-17-2012 05:07 PM

1 Attachment(s)
Hurm, looks like mine is ready to go for that.

2x of the PCI Express 3.0 x16 slots. Which with my Ivy Bridge cpu... it supports.
https://www.miataturbo.net/attachmen...ine=1342559311

Hurm.

Braineack 07-17-2012 05:12 PM


Originally Posted by jasonb (Post 904206)
this stuff is contagious. so brain's angel appeared before me (she told me to tell brain she's sorry) and offered me 2 gtx 480's. i don't actually have any working x86 desktop hardware at home right now, everything broke lol. is it worth buying a board that supports 2 video cards, or is one fine?

if 2 video cards, do both slots need to be x16 or can i make do with 2 x8's?


WTF. MY ANGEL HAS YET TO APPEAR BEFORE EVEN ME!!!!!!!!!!!! ---- MY LIFE.

:party:

Joe Perez 07-17-2012 06:04 PM

Funny story:

In the 80s, there were dozens of different standards for external peripherals such as keyboards, mice, printers, plotters, scanners, floppy drives, CD-ROMs, hard drives, tape drives, modems, audio analyzers, ground-penetrating radar arrays, etc.

Now, in the 10s, we have USB- a single universal standard by which all peripherals may connect to all computers.



In the 80s, there was ISA, a single universal standard by which all internal expansion cards could plug into all x86-class machines.

Now, in the 10s, I'm not even sure how many expansion busses (and operating modes thereof) we have. There's AGP (four different versions), PCI (3.3 and 5 V), Mini-PCI (Type I, II and III), PCI-X, PCIe (x1, x4, x8, x16, plus half-height versions to fit into small-form-factor PCs), Mini PCIe, ExpressCard (two different sizes), and we still occasionally bump into MCA every now and then (both 16 and 32 bit).

Pen2_the_penguin 07-18-2012 01:25 AM

ill be an angel for one of you gheys... I have a used BFG GTX 280 OC ill send for 100 bucks shipped. Doesn't support DX11, but it ran BF3 with everything maxed.

