half-dude said:
That's odd in that case, since Nvidia's website and documentation that comes with the graphics cards still say you NEED to have the same model cards for SLI to work. 
Ah, sorry, it has to be the same processor architecture: GT, GTX, etc. But generation is not a factor. AND this only matters if you want them to share the processing load. If you want one card to do dedicated processing separately, it doesn't matter; it just needs to be a GeForce 8 series or newer.
half-dude said:
I'm confused by what you mean saying "improving graphics." Do you mean improving visual quality like texture quality, higher AA, higher resolution, etc., etc., or improving FPS with the graphics settings you currently have?
Yes. By "improving graphics" I mean having them co-process the rendering of the graphics scene. PhysX and CUDA are part of the GPGPU trend, general-purpose computing on graphics processing units. That means they're not part of the render pipeline; they can go off and do their own thing while the scene is being rendered, which offloads work from both the GPU and the CPU.
half-dude said:
I believe what you guys are saying, but I'm confused: how can SLI be so ineffective? From how they explain it, it sounds like the graphics cards share the workload, rendering 50% of the screen each... wouldn't having each one working roughly half as hard as usual leave close to half of each card's processing power free for more work? If it works like that, it sounds like SLI should give you double the speed of one card, like the ads claim.
Well, you need to understand that theoretical possibility and real-world reality are two different things. What they're claiming are theoretical maximums. Most of the math behind these kinds of claims is idealistic ... the real world is imperfect, the numbers aren't round. So it's unlikely, even in the best conditions, that the cards will reach maximum speed in SLI ... and on top of that, it depends on the game. If the developers don't optimize for SLI in any way, you might not see any improvement at all! There is also some overhead because they're two physically separate chips on two physically separate boards. Electrons do take time to travel, and interconnects slow them down a lot. (Electricity is only conceptually instantaneous ... that latency adds up to noticeable levels in reality.)
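To make the "theoretical max vs. reality" point concrete, here's a back-of-the-envelope sketch in Python. The parallel fraction and sync overhead numbers are made-up assumptions for illustration, not measurements from any actual card:

```python
# Rough model of why two GPUs in SLI don't give 2x the frames.
# All numbers here are illustrative assumptions, not benchmarks.

def sli_speedup(parallel_fraction, sync_overhead):
    """Estimate speedup from adding a second GPU.

    parallel_fraction: share of the frame work that can actually be
                       split across the two cards (game/driver dependent).
    sync_overhead:     extra frame time lost to inter-card communication,
                       as a fraction of the single-card frame time.
    """
    serial = 1.0 - parallel_fraction
    new_frame_time = serial + parallel_fraction / 2.0 + sync_overhead
    return 1.0 / new_frame_time

# Perfect world: everything splits, no overhead -> exactly 2x.
print(sli_speedup(1.0, 0.0))                 # 2.0

# More realistic guess: 80% splits, 5% lost to syncing -> ~1.54x.
print(round(sli_speedup(0.8, 0.05), 2))

# Game with no SLI optimization: almost nothing splits -> ~1.0x.
print(round(sli_speedup(0.1, 0.05), 2))
```

The point of the toy model: the part of the frame that can't be split, plus the cost of keeping the two boards in sync, is what eats the gap between "2x on the box" and what you actually see.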
half-dude said:
Lastly Chrono, you said "the 400 series is already on par with SLI'ing 2 200 series cards." That means you're saying if I SLI'd two 260 series cards it'd be 'on par' with a mid-range 400 series card, right? That SOUNDS like what I want, doesn't it? I mean, a GeForce 470 costs around $400, and a second GeForce 260 goes for around $130 (that's how much they were where I bought the one I have now... they had a lot of them lying around), so I'd save money and get the same performance boost... right?...
Seriously, ... after all these years, there is no H in my name.
No. That is not what I was saying. You see, when you SLI, the largest improvement you'll get in reality is about 80%, and that's the best-case scenario. You generally get an improvement in the range of 10%-65%, which works out to roughly 1.5X a single card on average.
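If it helps to see those percentages as frame rates, here's a quick sketch. The 40 FPS single-card baseline is just an assumed number for the arithmetic:

```python
# Turn the SLI scaling percentages into frame rates.
# The 40 FPS single-card figure is an assumption for illustration only.

single_card_fps = 40.0

typical_low      = single_card_fps * 1.10   # +10% scaling -> 44 FPS
typical_high     = single_card_fps * 1.65   # +65% scaling -> 66 FPS
theoretical_best = single_card_fps * 1.80   # ~80% best case -> 72 FPS
advertised       = single_card_fps * 2.00   # the "double the speed" claim -> 80 FPS

print(f"single card:   {single_card_fps:.0f} FPS")
print(f"typical SLI:   {typical_low:.0f}-{typical_high:.0f} FPS")
print(f"best case SLI: {theoretical_best:.0f} FPS")
print(f"as advertised: {advertised:.0f} FPS")
```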
Now, a GTX 285 on its own will only be slightly outperformed by two 260s in SLI. And the 470, for example, can mop the floor with the 280; by mop, I mean get almost double the framerate in Crysis Warhead at 1680x1050. And I'm assuming the 260 is the 216-core version, not the stock 192:64:28 configuration.
Now, I hear the 470 actually isn't the best bang for your buck. The 460 is comparable in performance (generally about 10 FPS less in games at the same settings on the same hardware) AND it's roughly a sub-$250 card; it generally goes for around $180-$260 ... while the GTX 260 is about $200. So the 460 is actually cheaper than trying to SLI your current rig ... ALSO, you can still SLI both cards together later, even if one is just dedicated to something else.
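A rough cost comparison of the two upgrade paths, using the prices quoted in this thread; the relative performance figures are loose assumptions based on the claims above, not benchmark results:

```python
# Compare "add a second GTX 260" against "buy one GTX 460".
# Prices are the rough figures from this thread; relative performance
# numbers are assumptions for illustration, not measured data.

second_gtx260_price = 200          # a second 260 for SLI, per the post above
gtx460_price_range  = (180, 260)   # typical street price range quoted above

sli_260_perf = 1.5    # typical SLI scaling vs. a single 260 (from earlier post)
gtx460_perf  = 1.5    # assumed: roughly on par with 260 SLI, per the thread

print(f"second GTX 260: ${second_gtx260_price}, ~{sli_260_perf:.1f}x a single 260")
print(f"one GTX 460:    ${gtx460_price_range[0]}-${gtx460_price_range[1]}, "
      f"~{gtx460_perf:.1f}x a single 260, and it can still be SLI'd later")
```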
All the reviews I've seen of the 460 are favorable. It can handle any game currently out at 1680x1050. Unless you're gaming at super high resolutions, it's a really good choice if you're sticking with nVidia.
Blame it on Microsoft, God does.