Graphics Card revelations

Re: Graphics Card revelations Posted by Juim on Thu Dec 3rd 2009 at 5:09am
726 posts 386 snarkmarks Registered: Feb 14th 2003 Occupation: Motion Picture Grip Location: Los Angeles
So, to preface this, I've been using r_netgraph in HL2 for many years now, and in general I've concluded that, while the Source engine is a bit dated, it is still capable of producing very attractive environments when used properly. I was used to playing with AA and AF settings at minimum. This gave me a frame rate that topped out at 299 (which I think reflects the default fps_max in the cfg).
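For reference, the console variables in question can be collected in a config file. This is only a sketch, and the cvar names are from memory, so treat them as assumptions rather than gospel:

```
// autoexec.cfg sketch -- the settings discussed in this thread
net_graph 3        // newer alias for r_netgraph; shows fps and net stats
fps_max 300        // frame rate cap; a "299 max" reading is this cap
mat_antialias 8    // MSAA sample count (0 = off)
mat_forceaniso 16  // anisotropic filtering level
mat_vsync 0        // 0 = off; 1 = "wait for vertical sync"
```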

Thus, I trained myself to look for those kinds of frame rates when critiquing maps. Even in my own maps (most notably dm_ff_stinger_rc1), I used these same parameters when deciding whether a map was suitable for release. While running through the map alone, I achieved exactly this kind of performance. However, in game with a full player load, I noticed my frames dropping to well under 100 whenever the graphics load became chaotic.

So I started searching Bing for reviews of my RADEON cards. In almost every case, the reviewer stated that the card really didn't start to show its true colors until AA and AF were turned up. So I decided to go into the graphics settings in HL2 and max everything, just for the heck of it. Max resolution, max AA, max AF, and max everything else. I even set wait for vertical sync to yes.

First of all, things started to look really nice. (Duh!).
But then I saw my FPS, and was dismayed to see them hovering around 59.
So I tried another map which I knew played well. Same thing. And again, same thing. So then I loaded up a map I knew to be a nasty performer under my previous settings, and still... 59/58 fps! Then I went after every POS map I could remember (you know, the kind where you're standing in an empty box and still get under 20 fps, at least at my old settings). And now I can't seem to get any less than 58 FPS, no matter how shoddy or how well made the map, and no matter the player load. And everything looks really tight. Awesome maps look amazing, and shoddy maps look better!

I've also tested other games at these settings and have achieved mixed results.
Crysis: not so good. This game is just insane. Is there actually any rig out there that can max out the settings and achieve desirable frame rates? I set almost everything to High, but I don't get good performance at the Highest settings.
Crysis Warhead: this game has been tooled to scale more evenly with the system it's played on, but it can still be a slide show if all settings are maxed out.
Far Cry 2: yes! This is quite possibly one of the best games ever designed. My cards love this engine. Sometimes I just pick a previously finished level and jump in, just to enjoy the scenery.
Fallout 3: borderline. I get respectable frame rates when settings are maxed, but they could be better.

So what's the point here? If you own a recent RADEON card (I would say anything from the X series and up), go ahead and use all you've got. The card won't show you what it's worth if you don't use it.
Re: Graphics Card revelations Posted by Junkyard God on Thu Dec 3rd 2009 at 3:36pm
654 posts 81 snarkmarks Registered: Oct 27th 2004 Occupation: Stoner/musician/level design Location: The Nether Regions
Even a 4850 in Crossfire performs silly well visually, to be honest, but I always turn everything up. I have an MSI one, and with the cooler that was on it I can use the utility to overclock it to the maximum allowed in there and it'll be fine in games etc.

I have 1x 4850 512MB :P

I get about 30-40 fps in all the games I've played: the new Wolfenstein, Half-Life 2, Crysis, Far Cry 2, STALKER, stuff like that.

The settings in the utility seem to override the game's settings and perform much, much better in most cases.
I would let it test itself every time you overclock, though, to make sure it doesn't fry :)
Re: Graphics Card revelations Posted by G4MER on Thu Dec 3rd 2009 at 10:00pm
2460 posts 360 snarkmarks Registered: Sep 6th 2003 Location: USA
I have a Radeon card, I hated it, so I replaced it with two BFG nVidia 260 cards. I can't get a bad frame rate even if I try.

I have to send stuff to my laptop to test it, because otherwise I just don't know an honest low-spec result.
Re: Graphics Card revelations Posted by haymaker on Fri Dec 4th 2009 at 4:03am
439 posts 921 snarkmarks Registered: Apr 1st 2007 Location: CAN
I have a 4850, and it screams through all the Source games I own; I can't put a dent in it (except with exceptionally harsh maps).

Be aware, Juim, that the vsync option limits you to a factor of your refresh rate! That's why you're seeing the apparent divebomb. In reality you could argue that there is no need to go higher than that, and you'd be correct, but I find it's better off for multiplayer.
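To make the cap concrete, here's a minimal Python sketch (the function name is mine) of how double-buffered vsync quantizes frame rate on an assumed 60 Hz display: each frame waits for the next refresh, so the rate snaps to refresh/1, refresh/2, and so on.

```python
import math

def vsync_fps(raw_fps, refresh_hz=60):
    """Effective frame rate with double-buffered vsync: each frame
    waits for the next refresh, so render time is rounded up to a
    whole number of refresh intervals."""
    interval = 1.0 / refresh_hz   # seconds per refresh tick
    frame_time = 1.0 / raw_fps    # unsynced render time per frame
    ticks = math.ceil(frame_time / interval - 1e-9)  # tolerate float error
    return refresh_hz / max(ticks, 1)

print(vsync_fps(299))  # 60.0 -> a 299 fps renderer is capped at refresh
print(vsync_fps(55))   # 30.0 -> just missing a refresh halves the rate
```

The 58/59 readings in the thread are consistent with the counter jittering just under a 60 Hz cap.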

Another strange Source thing I've found: if you crank fps_max higher than your apparent peak framerate (say you're seeing 290 and you override the cap to 600), it seems to allow the GPU to throttle up a bit, and you may find a couple of extra frames. Again, it's probably useless, but not long ago, when I was getting at best around 100 fps on my 6600GT, this seemed to help.

I tell you, it was a shock to play through Ep2 again with everything on high and check out the eye candy. I too had limited ideas of what the engine could do.

On a side note, I don't use an LCD, but I was idly wondering: does anybody like playing 16:9?
Re: Graphics Card revelations Posted by Crollo on Sat Dec 5th 2009 at 1:50am
148 posts 15 snarkmarks Registered: May 8th 2008 Location: Canada
Juim said:
Far Cry 2: YES!. This is quite possibly one of the best games ever designed.
:/
Re: Graphics Card revelations Posted by Le Chief on Sat Dec 5th 2009 at 3:55am
2605 posts 937 snarkmarks Registered: Jul 28th 2006 Location: Sydney, Australia
I have an nVidia 9800 GT, and I can run Episode 2 and Bioshock at the highest possible settings at 1900x1080 resolution with a good frame rate. I've yet to try Crysis, but I don't really take framerate readings too seriously; there are so many factors that determine what framerate you will get (even what season it is, summer or winter, can influence your frame rate). I prefer to use my mojo for determining what performs well, comparing games' performance, etc.

Juim, with your framerate being stuck at 58/59, I'd say what's happening there is that your frame rate is being capped when you turn on AA/AF (probably to prevent errors, who knows), and your graphics card is good enough that it can process those maps at full capacity. I'm sure if the map was meaty enough it would dip below 58 fps. :geek:
haymaker said:
on a side note I don't use an LCD but was idly wondering if anybody likes playing 16x9?
Hells yeah, widescreen is nice because it has a width:height ratio closer to human sight than 4:3. Anybody who prefers playing games on 4:3 is either just used to it and not familiar with widescreen, or should get an eye patch. :hee:
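The "closer to human sight" point can be made concrete. With Hor+ scaling (which, as an assumption here, is how Source and most modern engines handle widescreen), the vertical FOV stays fixed and the horizontal FOV grows with aspect ratio. A quick Python sketch (the function name is mine):

```python
import math

def horizontal_fov(vertical_fov_deg, aspect):
    """Horizontal FOV for a given vertical FOV and aspect ratio,
    under Hor+ scaling: hfov = 2 * atan(tan(vfov / 2) * aspect)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

# Same vertical FOV, wider screen -> more of the scene on screen.
print(round(horizontal_fov(60, 4 / 3), 1))   # ~75.2 degrees on 4:3
print(round(horizontal_fov(60, 16 / 9), 1))  # ~91.5 degrees on 16:9
```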
Aaron's Stuff
Re: Graphics Card revelations Posted by Juim on Sat Dec 5th 2009 at 2:17pm
726 posts 386 snarkmarks Registered: Feb 14th 2003 Occupation: Motion Picture Grip Location: Los Angeles
Well Aaron, actually the wait for V-sync setting was keeping the frames down, as Haymaker suspected. If I turn it off, it goes back to a max of 299. I also have a laptop, but realistically it's not a good indicator of map quality IMO. It's a relatively new model (about 2 yrs old) with a Core 2 Duo running Vista, but it just doesn't have the graphics capabilities. If I have a good internet connection, I can play HL2DM, but only at reduced screen sizes and with most of the eye candy turned off. I've considered upgrading the graphics in it, though. Sucker runs hot as it is. I mostly just use it for work.
Re: Graphics Card revelations Posted by Le Chief on Mon Dec 7th 2009 at 7:15am
2605 posts 937 snarkmarks Registered: Jul 28th 2006 Location: Sydney, Australia
Juim said:
Well Aaron, actually the wait for V-sync was keeping the frames down
Ah yes, of course. I did read that you turned it on, but I forgot about it :p
Re: Graphics Card revelations Posted by Crollo on Tue Dec 8th 2009 at 4:56am
148 posts 15 snarkmarks Registered: May 8th 2008 Location: Canada
aaron_da_killa said:
I've yet to try Crysis
You'll be disappointed with your framerate when you do.

Unless you have four nVidia 295s in SLI, don't expect to run at that high a resolution unless you turn anti-aliasing off or drop to low settings.
Re: Graphics Card revelations Posted by Le Chief on Tue Dec 8th 2009 at 5:10am
2605 posts 937 snarkmarks Registered: Jul 28th 2006 Location: Sydney, Australia
I have it right now but haven't installed it yet. I'm feeling confident; I haven't encountered any game that's given my video card issues, and I play my games on the highest graphical settings. 8-)

... that said I haven't tried a lot of games. :uncertain:

Surely Crysis can't be that bad... :scared:
Re: Graphics Card revelations Posted by larchy on Tue Dec 8th 2009 at 12:08pm
496 posts 87 snarkmarks Registered: Jan 14th 2008 Occupation: kitten fluffer Location: UK
If you run 4xMSAA edge detect w/adaptive and 16x Euclidean Aniso then Crysis will cause every GPU config to grind to a halt except Radeon 5850s and 5870s and their Crossfire configs.

With regard to the OP: MSAA was broken in R600 and needed extra passes as a kludge-fix for the broken hardware, so it will tank performance on that architecture.

The R700 series cards are fantastic with the 12x edge detect mode (4xMSAA), provided your SKU isn't a bandwidth-limited version. The top-end GDDR5 4870/4890s are awesome. nVidia had to rapidly uprate its GTX 260s to the 216-shader versions after they got totally reamed by the RV770XT.

Obviously, with Cypress now out of the hutch and Fermi delayed until the end of Q1 2010 (maybe longer), nVidia have abandoned everything above the low-mid range, as they have nothing to compete with.

You also want to be running with vsync on 99% of the time to prevent tearing, which is really horrific when modern cards get up to 3-4x the screen refresh and you have three or four visible tears at once.

Also, DX11 DiRT2 is awesome :D
Re: Graphics Card revelations Posted by Crollo on Tue Dec 8th 2009 at 9:42pm
148 posts 15 snarkmarks Registered: May 8th 2008 Location: Canada
larchy said:
Also, DX11 DiRT2 is awesome :D
Steam has said "PREPURCHASE DIRT2 NOW!!!" more than four different times, even though the game was released god knows how long before those messages were appearing.

EDIT: To the original topic: no go. I just opened Crysis, went to my highest resolution (1280x1024) at Very High settings with 8x AA, and it was a slideshow.
Re: Graphics Card revelations Posted by Juim on Thu Jan 14th 2010 at 4:20am
726 posts 386 snarkmarks Registered: Feb 14th 2003 Occupation: Motion Picture Grip Location: Los Angeles
Just an update to an older thread. Lately I have begun to notice a definite rise in GPU temps and a drop in performance over time with my dual Radeon HD4870 1GB cards. The problem started while I was playing through Crysis Warhead, and as I got to the later levels, I noticed two things. First, I was starting to see weird spots and square dots on certain specular surfaces, most noticeably snow, certain metals, and shiny vegetation (you know, reflective-type surfaces). Secondly, crashes became rather commonplace. I was getting really frustrated with the game. It was as though it had suddenly become necessary to nursemaid the game at every turn. I had to constantly reboot after crashes and tone down settings just to get it to play! Also, the temps were starting to creep upwards into (for me) very uncomfortable levels.
So, through some varied and diverse internet searches, I discovered a few things. One issue I found was that the ATI CCC caused a lot of problems. "TURN IT OFF" was one suggestion, and it actually worked! I downloaded GPU-Z to monitor my graphics specs, and after turning off ATI's own software, the game started acting right and performing properly at the "gamer" settings.

But I followed some more threads and wound up on an overclockers' forum, where a guy was asking how to take his card apart to remove dust. Within that thread was a comment about factory cooling solutions and how awful their thermal paste was. The sentiment was agreed upon by several others, so with that in mind I decided to take one of my two cards apart and see if a better thermal paste would make a difference. Understand that, firstly, I've built all my own computers, but never have I disassembled one of my graphics cards; and secondly, if I did it wrong, given my current financial situation, I was simply out of a card!
So I did it. I took apart my Diamond HD4870 1GB card. I have two; when I bought the first one, it was shrouded within a massive plastic and metal cover (the second one had a much simpler cooling solution). After a lot of screws and a certain amount of tension, I had the card apart. I cleaned it out from front to back (and quite frankly, there was hardly any paste on the GPU at all!) and re-applied some sexy Arctic Silver 5 thermal paste. The reassembly went relatively smoothly, and I plugged the card back in and hit the power button...

Nothing popped or fizzled, and I could smell no smoke (thank God). After determining that I had not fried anything, I cranked up GPU-Z and took a look. The GPU temp had dropped from 54C to 45C (about 16 degrees Fahrenheit!). This couldn't be bad, right? I then verified the temps with ATI's CCC; they concurred. So then I did the same thing with my second card, which, as I mentioned earlier, had a seriously simpler cooling solution: no cowling, and a simple four-screw cooler mount. Once again the temps took a similar dive of around 16 degrees Fahrenheit!

So now, both cards hover at 41 to 44 degrees Celsius at idle, and very rarely break 60 degrees Celsius under load. As I understand it, that's awesome, because several of the original reviews had the cards at around 70 Celsius at idle. (Note: I do use the CCC to set the card fans to 60%, which is quite frankly very loud, but I play with headphones so it's not so bad.)
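For the record on the unit conversion here: a temperature difference converts with the 9/5 scale factor only (the +32 offset applies to absolute readings, not deltas), so the 9 C drop works out to about 16 F. A quick sketch (function name is mine):

```python
def c_delta_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit.
    Deltas scale by 9/5 only; the +32 offset is for absolute temps."""
    return delta_c * 9 / 5

print(c_delta_to_f(54 - 45))  # 16.2
```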
See the Tom's Hardware review: http://www.tomshardware.com/reviews/radeon-hd-4870,1964-17.html
That's all for now. Thanks for reading.
Re: Graphics Card revelations Posted by Orpheus on Wed Jan 27th 2010 at 3:53am
13860 posts 2024 snarkmarks Registered: Aug 26th 2001 Occupation: Long Haul Trucking Location: Long Oklahoma - USA
I have a 1 gig geforce 9800, and it runs things well enough. I'd really like to see what a truly big machine can deliver.

They say ignorance is bliss, but I'd still like to see the differences with mine own eyes.

The difference between the 9800 and my old card is so dramatic, it's hard to imagine better still.

The best things in life aren't things.