GRAW demo performance analysis by Elite ######


So your conclusion on AA is that it will never be possible without redoing the lighting system in the engine... sigh, because the jagginess DOES kill the game a lot.

For those of us with what are now called midrange cards (a BFG 6800 GT is what I use ATM), we're kinda gonna have to get used to jaggies? Wife is gonna kill me, hmm, how to tell her I REALLY need a new vid card, ROFL! :rofl:



No, it is not that. The method they chose means no card that is currently out can do AA+HDR, ever.


OK...maybe someone can help me here.

I have this system:

Radeon 9800 Pro

3.4 AMD processor

1 gig of DDR ram

Sound Blaster Audigy ZS

In order to get good frame-rates and no mouse lag I have to set my resolution all the way down and then turn everything to low and turn off shadows.

Someone else here (a few people, actually) has a similar system and they can play the game at a higher res on medium settings.

In comparison (a rough comparison), I can play Battlefield 2 with all settings on high on a 64-player server and it runs perfectly smooth.

Anyway... what am I doing wrong? I've updated all my drivers... and I still get the ugly graphics. :/


Well, as you can see by Hanner's test, basically even the top cards on the market struggle for 40 fps, so it's not just you and your system, Destructo; it's pretty much everybody. GRIN will need to spend some time optimizing the engine, I think, as this is not a "gamer friendly" title at this point; it's beating the crap outta the high-end stuff just as much as the low.

I still ordered copies for me and the Missus today and will persevere through it, and hope they do some hot-fixing between now and the proposed add-on, not to mention see what kind of optimizations the later GFX drivers can do to help. I mean, it's not like ATI and NV didn't know, because they are both listed in the credits as tech partners along with AGEIA.


Hrmmm... now I have to convince the good lady wife that changing my whole rig platform to PCI-E is a great idea, after shelling out $300-plus for my last video card... ahh... yes... this'll be fun.

Better get down to the jewellery shop after the PC supplier! :wall:


Anyway... what am I doing wrong? I've updated all my drivers... and I still get the ugly graphics. :/

You should be able to play at 800x600 with everything on low and shadows off. I'm on a slightly slower system than yours and that's all that's playable for me, though mouse-lag is still a problem. That's basically what you'll have to live with on your current video card ... nothing else you can do.


You and me both, Stormcrow. I had intended to upgrade my PC at the end of June anyway and pass my current rig along to the Missus, tho watching my X800XT struggle at 30 or so FPS doesn't make it look all that promising for her to enjoy this game (tho she plays BF2 rather well with my old 9800XT 128MB at 2xAA and medium). I'm thinking maybe getting the 7800 GS Bliss 512MB for this and hoping it will run at least moderately well.

I intend on going with a Crossfire-capable mobo (Abit AT8 32X) and picking up one X1900XT and a 3800 X2, then picking up the master card down the road.


Anyway... what am I doing wrong? I've updated all my drivers... and I still get the ugly graphics. :/

You should be able to play at 800x600 with everything on low and shadows off. I'm on a slightly slower system than yours and that's all that's playable for me, though mouse-lag is still a problem. That's basically what you'll have to live with on your current video card ... nothing else you can do.

See, that's what I'm talking about... if they want to make a profit on this game, they are gonna have to make it playable for the casual gamer who doesn't have a high-end video card, etc...

As of right now, most people just can't play the game... and it's not worth playing when it's so ugly graphically at lower settings... people with higher settings will easily be able to see you, but you won't be able to see them due to the smudgy graphics and extra jaggies at that resolution.

That's why BF2 sold so many copies... it could look GREAT on a high-end system but could still look good on a low-end system, and that's on a 64-player server with large maps. I don't see how they expect to make a profit on a game the majority of gamers can't play decently.


Great, Striker, thanks for the input. The Missus isn't willing to fork out for the PCI-E switch-over, so my machine with a better GFX card will have to suffice.

P4 3.2E @ 3.6 (225 FSB) / Zalman 7000B-Cu

Abit IC-7

2x1024 Corsair XMS 3-3-3-8(1:1)

X800XT/Arctic Silencer 4

Samsung Spinpoint 120G SATA

SB Audigy 2 ZS

Tagan 420W

LG 1930BQ TFT 19" 12ms

Logitech G5

Saitek X-52

Logitech Driving Force Pro

3DMark05 6174

3DMark03 12917

3Dmark06 2162


I too am "stuck" on AGP; have you looked at BFG's 7800 GS OC?

http://www.bfgtech.com/7800GS_256.html

I have a 6800 GT and am considering the 7800 GS as an upgrade; not sure if it's enough of one to merit upgrading tho, LOL.

Actually, it's THIS I'm looking at, and it's a smoker; there has been one released even more recently based on the 7900GT core, which is a full 24-pipe card but costs pretty much the same as a X1900XTX. But basically THAT 7800GS is only available in Europe, as Gainward pulled out of the NA market.


Don't know about you guys, but I'm not impressed with those results. When I spend $50 on a game and $500-600 (CDN) on a top end GPU, I want top end framerates, not 30~35. :angry:

Maybe the best bet is to wait for the next gen of graphics cards. <_<



Can't really argue with that in principle, but what really bothers me is that GRIN would even choose such a resource-heavy, unsupported form of lighting rather than just use regular HDR or even bloom. Forcing the unsupported standard on us is my real issue with it. Not to bring up the Source engine, but it at least gave you the choice of bloom or HDR, which it implemented more recently, rather than sticking you with something that absolutely NOBODY can really make use of efficiently or make look the way they want.

Link to comment
Share on other sites

Here's something else odd to think about. I have two X1900 cards in a Crossfire configuration with an FX57 at 3GHz and 2 gigs of RAM. I get LESS FPS when running both cards in Crossfire compared to only one all by itself. Unbelievable. My system is stable in Crossfire mode, verified by endless hours of BF2 and many loops of 3DMark '05. Nothing like spending another $500 to get less FPS. Something has got to be going on with the R580 in these X1900 cards. This review showed it clearly, with the X1800 outperforming the X1900 at lower resolutions.

It's nice to see that 8xAF doesn't really hurt performance at all. I've tried running everything on high settings with 8xAF at 1920x1200 and I get about 25 FPS. Running the same settings with only one X1900 card gives about 30 FPS.

ATI or GRIN needs to get on the ball with these issues, because 7900 GTXs in an SLI configuration using alternate frame rendering seem to almost double their FPS compared to only one card.

Luckily this game doesn't require 60+FPS to be playable. The very slow movements of the players will likely allow most of us to handle 30-40FPS as playable.

If this ATI issue isn't resolved soon, I'll be switching over to 7900 GTXs, since X1900s have quite a nice resale value on eBay.
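To put the dual-card numbers quoted in this post in one place: a quick scaling-efficiency check (just napkin math on the FPS figures reported above, nothing measured by me) makes the Crossfire regression obvious at a glance.

```python
def scaling_factor(fps_single, fps_dual):
    """Ratio of dual-GPU to single-GPU frame rate: 1.0 means no gain,
    2.0 means perfect scaling, and below 1.0 means the second card hurts."""
    return fps_dual / fps_single

# Figures quoted above: one X1900 gives ~30 FPS, Crossfire gives ~25 FPS
# at 1920x1200 with everything on high and 8xAF.
crossfire = scaling_factor(30, 25)
print(round(crossfire, 2))  # 0.83 -- negative scaling from the second card
```

Anything under 1.0 means the second card is actively costing frames, which matches the complaint here; the 7900 GTX SLI result mentioned above would sit near 2.0 instead.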

Edited by Danger30Q

because the 7900 GTX's in SLI configuration using alternate frame rendering seem to almost double their FPS when compared to only 1 card.

A friend of mine gets the same frames using one 7800GTX 256 or two of them in SLI... maybe it depends on that 'alternate frame rendering' option? Could someone explain it better? I don't have SLI so I don't know, but I'd like to help him :)

Is SLI working OK with the GRAW demo right now?



Normally games are rendered in SFR mode (the default), but SLI supports two other modes as well, AFR and AFR2. With a program like NvTweak, you can create a new profile for the game and select/force another rendering mode (AFR in GRAW's case). It's quite simple, and some games perform better with the other rendering modes, but since even the latest driver doesn't include a GRAW profile (the next one will), you have to create one yourself.

Check out this older post of mine for more information on this topic:

http://www.ghostrecon.net/forums/index.php...16entry361616
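For anyone wondering what switching to AFR actually changes, here's a toy sketch of the two work-splitting schemes described above (purely illustrative; these function names are made up and this is nothing like real driver code). SFR carves each frame into bands that both GPUs render together, while AFR hands whole alternating frames to each GPU, which is why AFR can approach a 2x speedup when the game cooperates:

```python
def sfr_bands(height, n_gpus=2):
    """Split Frame Rendering: each frame is cut into horizontal scanline
    bands, one band per GPU, so all cards work on the same frame."""
    band = height // n_gpus
    return [(g * band, (g + 1) * band) for g in range(n_gpus)]

def afr_assignment(n_frames, n_gpus=2):
    """Alternate Frame Rendering: whole frames alternate between GPUs,
    so each card renders every other frame in its entirety."""
    return [frame % n_gpus for frame in range(n_frames)]

print(sfr_bands(1024))      # [(0, 512), (512, 1024)] -- one frame, two bands
print(afr_assignment(6))    # [0, 1, 0, 1, 0, 1] -- frames alternate GPUs
```

The practical upshot is the one given above: if a game's frames can be rendered independently, AFR parallelizes almost perfectly, whereas SFR's per-frame split leaves more shared work and scales worse.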


I don't understand why the X1900 XTX was outperformed by the X1800XT at lower settings.

Someone explain?

The GRAW demo detected my card (X1800XT 512MB) and set everything to high + 8xAF; strangely, the resolution was set at 800x600@60; maybe 800x600 is the standard/default detection? Anyway, playing GRAW at the default detection is excellent, very responsive and smooth, but not the best-looking game at that resolution, lol.

So I upped the resolution to 1280x1024@60 (my native LCD resolution anyway) and I suffer massive framerate issues; sheesh kebab, it fell to like 30 fps; in fact it never really climbed above 36 fps in most areas of the demo level. So I dropped down to 1024x768@60 and gained maybe 4-5 fps, ha ha.

So I'm interested to find out what single graphics card is going to run GRAW PC maxed out; I have a feeling it will have to be a dual-GFX setup for this game, even at 1280x1024. The GRAW demo is excellent and I'm sure the full game will be too. I suppose the only alternative would be to set most things at medium to low, maybe even turn some features off, which is a shame because the game looks amazing; oh well, never mind, onwards and upwards I suppose.
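A side note on why that 4-5 fps gain is so odd: if the demo were purely fill-rate-bound, FPS would scale roughly inversely with the number of pixels drawn, and the drop from 1280x1024 to 1024x768 should have bought far more. A rough back-of-the-envelope check (napkin math on the numbers reported above, not a benchmark):

```python
def fill_bound_fps(fps_ref, res_ref, res_new):
    """Naive fill-rate model: FPS inversely proportional to pixels drawn.
    Real engines are only partly fill-limited, so treat the result as an
    optimistic upper bound, not a prediction."""
    pixels = lambda r: r[0] * r[1]
    return fps_ref * pixels(res_ref) / pixels(res_new)

# ~36 fps at 1280x1024 would predict 60 fps at 1024x768 if purely
# fill-bound; the observed gain was only 4-5 fps, so the bottleneck
# is somewhere else (CPU, shaders, or the lighting pass).
print(fill_bound_fps(36, (1280, 1024), (1024, 768)))  # 60.0
```

The gap between the predicted 60 fps and the observed ~40 suggests that simply lowering resolution won't rescue this engine on these cards.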

my current rig

win xp pro sp2

nf4 premium

4000+ sd

2gbyte 3200 dual channel

x1800xt 512mb

10k rpm sata hdd

x-fi extreme

530w psu


Warcloud, your framerate is truly weird. I've used Fraps to work out my averages while playing. I've done 4 sessions of about 5 or 6 minutes each (with no going to the tactical map, because that seriously increases the average) and I'm averaging 39-40 fps. This is at 1280x960@70, everything on full except textures. With your GFX card I'm really surprised at what you're getting!
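For anyone comparing numbers the same way: the meaningful figure is total frames divided by total capture time, not the mean of per-second FPS readouts; the latter over-weights fast stretches like the tactical map, which is exactly why it's excluded above. A quick sketch (the frame times here are invented for illustration):

```python
def average_fps(frame_times_ms):
    """True average FPS for a session: total frames / total seconds.
    Averaging instantaneous FPS values instead would skew high whenever
    cheap-to-render scenes (menus, tactical map) dominate the capture."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical capture: 200 frames alternating between 25 ms and 26 ms.
times = [25, 26] * 100
print(round(average_fps(times), 1))  # 39.2
```

Run over a few multi-minute sessions as described above, this gives averages that are actually comparable between systems.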

