Orpheus said:
FPS isn't everything, but it may be enough.
This old 4850 I have now does great, but it runs out of RAM running Clear Sky. If I set everything to maximum, the barbed wire and camo netting disappear. If I lower the grass and some other non-essentials, the barbed wire and camo netting return.
Obviously 512 megs isn't enough. Perhaps 1024 is... for now.
My thought was that 2048 might justify the price.
That 560 looks sweet.
Anywho's
Well, it depends on the game and the engine and how it draws these things, just to note.
But, in general, anything dealing with textures (not just visible ones) has to be loaded into video memory, so that's really the main advantage of huge amounts of dedicated video memory. World objects and junk like that are still stored in system memory and fed to the GPU when they're needed in the next rendering batch.
The main advantage of larger video memory, again, is larger textures. You could have high-resolution visual textures skinned onto objects, or higher resolution shadow maps and lightmaps (both of which are stored as textures). Stuff like that.
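To put rough numbers on that, here's a back-of-the-envelope sketch (Python, assuming uncompressed 32-bit texels; real engines use compressed formats like DXT that shrink this a lot):

    # Rough VRAM cost of a single texture (assumed: uncompressed 32-bit texels;
    # a full mipmap chain adds roughly 1/3 on top of the base image).
    def texture_bytes(width, height, bytes_per_pixel=4, mipmapped=True):
        base = width * height * bytes_per_pixel
        return base * 4 // 3 if mipmapped else base

    print(texture_bytes(2048, 2048) / 2**20)                   # ~21.3 MB, one diffuse map
    print(texture_bytes(4096, 4096, mipmapped=False) / 2**20)  # 64.0 MB, one big shadow map

A few dozen textures like that and 512MB fills up fast, which is exactly the Clear Sky symptom above.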
The main difference between those 4850 and 5770 cards is that, even though they have the same memory size ... the 4850's memory bus is larger (double the width), but its memory makes fewer transfers per cycle, and the memory clock speeds differ too. (It's important to note that in the model number, the first digit is the generation, and the second digit denotes class: 8, 7, etc. The digits after that are variations on the chipset and are only comparable within the same class. So a 48XX card may well perform better than a 57XX card, just based on chipset class. Then again, the new generation could have all-around optimizations that make every card perform better ... it really depends on the cards.)
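Roughly, peak memory bandwidth is the bus width (in bytes) times the effective transfer rate. A quick sketch with the spec numbers as I remember them (double-check against the actual cards):

    # Peak bandwidth = (bus bits / 8) * memory clock * transfers per clock.
    def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock):
        return bus_bits / 8 * clock_mhz * transfers_per_clock / 1000

    print(bandwidth_gb_s(256, 993, 2))   # HD 4850: 256-bit GDDR3, ~63.6 GB/s
    print(bandwidth_gb_s(128, 1200, 4))  # HD 5770: 128-bit GDDR5, ~76.8 GB/s

So, if those specs are right, GDDR5's extra transfers per clock more than make up for the 5770's narrower bus.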
So, when running a game ... the GPU is the primary determinant of how well it will run at given graphics settings. HOWEVER, CPU speed, RAM size, and HDD access time DO play a part. If the CPU doesn't have enough speed or threads, you can see noticeable stuttering in a game simply because the CPU is running as fast as it can but can't service render batches as fast as the GPU can consume them. So, even though it's stuttering, that doesn't mean the GPU is the bottleneck.
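Here's a conceptual sketch of how you'd tell the two apart (Python, with hypothetical update() and render() placeholders standing in for the real engine work):

    import time

    # Time the CPU side (game logic, culling, batch building) separately from
    # the submit/present step, which is where a frame waits on the GPU.
    def frame_timings(update, render, frames=100):
        cpu = gpu = 0.0
        for _ in range(frames):
            t0 = time.perf_counter()
            batches = update()       # CPU-bound work
            t1 = time.perf_counter()
            render(batches)          # blocks until the GPU catches up
            t2 = time.perf_counter()
            cpu += t1 - t0
            gpu += t2 - t1
        # If the CPU side dominates, a faster video card won't fix the stutter.
        return cpu / frames, gpu / frames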
Something I ran into today was a RAM limitation, with the Crysis 2 demo. There was no input lag, but movement was stuttering like crazy. To note, animations, looking around, firing the weapon, etc. all ran at normal speed. When I minimized the game and looked at its resources, physical RAM was maxed out (~1.8GB) and swap space was around 2.5GB. What this means is that streaming the level was slowing the game down: whenever I moved, it had to stream in new data, but it just couldn't do that fast enough to be playable.
So, I need more RAM, so the game doesn't go to disk for things it's going to need every other frame (slow).
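If you want to check this on your own machine without digging through Task Manager, something like this works (Python, assuming the third-party psutil package is installed):

    import psutil  # pip install psutil

    vm, sw = psutil.virtual_memory(), psutil.swap_memory()
    print(f"RAM:  {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GB ({vm.percent}%)")
    print(f"Swap: {sw.used / 2**30:.1f} GB in use")
    # RAM pegged near 100% plus growing swap while you play means level
    # streaming is hitting the disk, and stutter is the symptom.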
So, yes, in these capacities, CPU, RAM, and HDD speed and size do have an impact on your performance. However, it's important to note that unless you did what I did (take a 5-year-old system and simply put in a new GPU), you'll likely not run into bottlenecks outside of the GPU. It's still possible, of course, but it's pretty unlikely.
All generations of cards have different base chipset architectures; that's what a new generation is. The second number roughly specifies chipset class: the higher the number, the more chipset features it has enabled. The third and fourth numbers are further granularity of that.
It's very possible for one card to perform worse just because it's missing specific hardware. (GTX 460 SE cards, for example, are missing hardware that the full GTX 460 cards have.)
Anyway, what I listed was two systems that were either identical or very similar (if they were in the same test, the only thing changed was the GPU), with the game at the EXACT same settings. It's the reason I linked it! It's one of the only ways you can compare performance: change only the one thing you're measuring. So in that respect, YES, fps IS a pretty good measure of their performance relative to one another, because the card was the only variable in the benchmarks.
To note, you should probably put fps requirements above eye candy. I can understand going down to even 25 fps; that's still playable without noticeable lag (consider that most cel animation is actually drawn at around 15 frames a second, so we're not as sensitive to this as people believe). But I wouldn't recommend sacrificing more than that. That's where the game starts to feel "sluggish" (even though it's still not noticeable lag at that point).
So, while I don't think your games HAVE to run at 60fps just to be "playable" like a lot of people claim (most games don't run at 60fps anyway), playing under 25-30 should be unacceptable for you. (It might even be a contributing factor to your motion sickness, since motion sickness comes from surprising movements, and that's exactly what choppy frame rates are.)
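For a sense of scale, here's the frame-time budget at each rate (quick Python arithmetic):

    # Milliseconds available per frame at a given frame rate.
    for fps in (60, 30, 25, 15):
        print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
    # 60 -> 16.7 ms, 30 -> 33.3 ms, 25 -> 40.0 ms, 15 -> 66.7 ms

Dropping from 25 to 15 fps nearly doubles the gap between frames, which is exactly where the choppiness gets obvious.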
                                            
                        Blame it on Microsoft, God does.