GRAW demo performance analysis by Elite ######


Warcloud, if I can play it maxed out with a few high-res textures at 1280x960 with an overclocked 6800, you will easily get twice as many fps as I have once ATI gets their head out of their ass and releases a proper driver for GRAW. No doubt!

BTW, what driver version are you currently using? Make sure AA/AF are set to "application-controlled" in the driver settings.

Where is GRIN when you need it? I think we would like to get some answers from them. It bothers me that they never said anything about the crappy FPS.

Some of the *ahem* more aggressive members chewed their ear off with endless requests and random nonsense. I doubt you'll hear from them any time soon. It would be nice, though, to get some feedback. The most I can squeeze out of my 7800GS, overclocked four weeks out of the box, is 40 fps at medium settings on a half-decent res.

Really, for the dough I forked out, it's not great... still, at least I can play the beast, which is more than can be said for some!

I think it all comes down to their foolish choice to use Deferred Lighting over traditional HDR or Bloom.

Quoted from an article over at Beyond3D (thanks capteenix for digging it up).

After skimming the article in your link, capteenix, I really have no clue why they chose this; it is not for gaming (though I have been saying since day one, WTH were they thinking using a basically unsupported form of lighting).

This lighting may be useful for, say, CGI movie making, but it certainly has no advantages for gaming.

Quoted from the article:

The main disadvantages are:

* Large frame-buffer size

* Potentially high fill-rate

* Multiple light equations difficult

* High hardware specifications

* Transparency is very hard

None of those things sound very game-friendly to me, and they are most likely the reason it's bringing even flagship GFX cards to their knees. And of course there's the fact that it is completely incompatible with AA, and it appears it always will be unless anti-aliasing gets completely overhauled.
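
To put the "large frame-buffer size" and "high hardware specifications" points in rough numbers, here is a toy C sketch. The buffer layout is my own guess at a typical deferred setup, not GRIN's actual format: instead of lighting each object as it is drawn, the geometry pass writes position, normal, and material data into a fat G-buffer, and the lighting pass reads it all back per pixel.

```c
/* Hypothetical G-buffer layout for a deferred renderer -- a sketch of a
 * typical setup, not GRIN's actual format. A forward renderer keeps a
 * plain color buffer; a deferred renderer must store everything the
 * later lighting pass will need, for every pixel on screen. */
#include <stdio.h>

typedef struct {
    float position[3]; /* world-space position (or reconstructed from depth) */
    float normal[3];   /* surface normal for the lighting equation */
    float albedo[3];   /* diffuse material color */
    float specular;    /* specular intensity */
    float depth;       /* depth, also used for shadowing */
} GBufferTexel;        /* 44 bytes vs. 4 bytes for a plain RGBA8 pixel */

int main(void) {
    const int w = 1280, h = 1024;
    double forward  = (double)w * h * 4;                     /* RGBA8 color */
    double deferred = (double)w * h * sizeof(GBufferTexel);  /* fat G-buffer */
    printf("forward:  %.1f MB\n", forward  / (1024.0 * 1024.0));
    printf("deferred: %.1f MB\n", deferred / (1024.0 * 1024.0));
    return 0;
}
```

At 1280x1024 that is roughly 5 MB of color buffer versus about 55 MB of G-buffer, which is exactly the kind of fill-rate and memory pressure that list is complaining about.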

Edit: Sorry I guess I should link to the original article

Deferred Lighting Article

Considering that the article is pretty old (July 2003), and even the best high-end cards back then are surpassed by today's low-end cards, I find the Conclusion paragraph pretty interesting.

Deferred Lighting is now possible in hardware on the latest video cards. It has its own unique advantages and disadvantages, and while in many cases the disadvantages outweigh the advantages on current hardware, in the right situations it can outperform and look better than conventional per-pixel lighting. In situations with complex procedural shaders with many lights and shadows, the savings can be massive; this is even truer in densely occluded environments. The same techniques used to accelerate objects (batching etc.) can be used to accelerate lights, and if a light isn't visible its cost is very low.

In the long term, the occlusion and geometric properties of deferred lighting are its most interesting factors: no other technique allows you to render so many lights affecting a single surface without crippling performance; deferred lighting scales much better.

****

What does that tell me? GRAW's engine has quite some headroom for optimization because it is fairly "young", and while DL has some disadvantages, it has major advantages as well. Lights and (soft) shadows are less expensive than in traditional implementations, but I personally do agree that Grin should have included a traditional static lighting system that would support AA. Either there wasn't enough development time for something like that, or the engine doesn't support two different lighting techniques in the same build.
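
For what it's worth, the "scales much better" claim comes down to loop structure. Here's a back-of-the-envelope cost model in C; the numbers are invented for illustration, not measured from GRAW:

```c
/* Toy cost model (invented numbers, not measured from GRAW) of why
 * deferred lighting scales better with light count: forward lighting
 * re-shades every lit object per light, while deferred lighting pays
 * for geometry once and then only for the pixels each light covers. */
#include <stdio.h>

int main(void) {
    long objects = 500, lights = 50;
    long pixels = 1280L * 1024L;  /* screen size */
    long px_per_object = 4000;    /* avg pixels an object covers (assumed) */
    long px_per_light  = 20000;   /* avg pixels a light volume covers (assumed) */

    /* forward: each object is re-rasterized and re-shaded per affecting light */
    long forward_cost = objects * lights * px_per_object;

    /* deferred: fill the G-buffer once, then shade covered pixels per light;
     * a light that isn't visible simply never enters the second loop */
    long deferred_cost = objects * px_per_object + pixels
                       + lights * px_per_light;

    printf("forward:  ~%ld pixel-shades\n", forward_cost);
    printf("deferred: ~%ld pixel-shades\n", deferred_cost);
    return 0;
}
```

With 50 lights the forward path does on the order of 100 million shading operations against roughly 4 million for the deferred path, and the gap only widens as lights are added. The flip side is that the fixed G-buffer cost is paid every frame, even in scenes with few lights.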

What does that tell me? GRAW's engine has quite some headroom for optimization because it is fairly "young", and while DL has some disadvantages, it has major advantages as well. Lights and (soft) shadows are less expensive than in traditional implementations, but I personally do agree that Grin should have included a traditional static lighting system that would support AA. Either there wasn't enough development time for something like that, or the engine doesn't support two different lighting techniques in the same build.

I agree with that to an extent, but I still stand by my view that this was a poor choice of technique for a game like this, and that this is not a good genre to make us guinea pigs with. I think it would be fine for a slower-paced MMORPG or some sort of RTS game where speed isn't so much a factor. Do I think it has room for optimization? I sure hope so. I do hope they consider adding another lighting path to the engine, as it's a tough pill to swallow at this point unless you can run 2000x1000+ resolutions to get rid of the aliasing.

Let's not forget non-traditional approaches to AA as well. Who says they all fail with the DL engine? You can do AA in shaders, for instance, or as a post-processing effect. It might be very expensive right now, but next-gen video cards would probably laugh at it.
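
For the curious, a post-process AA of that sort could look roughly like this toy C sketch. It's purely my own illustration, not anything Grin or ATI has announced: detect edges where depth changes sharply between neighbors, and blur color only across those pixels.

```c
/* Toy post-process AA sketch -- my own illustration, not anything Grin
 * or ATI has shipped. Edges are found from depth discontinuities, then
 * only edge pixels are blended with their four neighbors. Done in place
 * for brevity; a real filter would write into a second buffer. */
#define W 640
#define H 480

void postprocess_aa(float color[H][W][3], const float depth[H][W]) {
    for (int y = 1; y < H - 1; y++) {
        for (int x = 1; x < W - 1; x++) {
            float dx = depth[y][x + 1] - depth[y][x - 1];
            float dy = depth[y + 1][x] - depth[y - 1][x];
            if (dx * dx + dy * dy < 0.01f)   /* flat surface: leave untouched */
                continue;
            for (int c = 0; c < 3; c++)      /* edge pixel: 4-tap blur */
                color[y][x][c] = 0.5f * color[y][x][c]
                    + 0.125f * (color[y][x - 1][c] + color[y][x + 1][c]
                              + color[y - 1][x][c] + color[y + 1][x][c]);
        }
    }
}
```

It only smooths edges instead of supersampling them, so it won't match true MSAA quality, but it runs on the final image and therefore doesn't care how the lighting was computed.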

I really would love to have a mature discussion with Grin's engine guys about their opinions on DL and why they chose this tech (the first time DL has been used in a game, AFAIK). I have no doubt that they have good reasons, and I'd love to hear them. I'm sure I'm not the only one. :)

Warcloud, if I can play it maxed out with a few high-res textures at 1280x960 with an overclocked 6800, you will easily get twice as many fps as I have once ATI gets their head out of their ass and releases a proper driver for GRAW. No doubt!

BTW, what driver version are you currently using? Make sure AA/AF are set to "application-controlled" in the driver settings.

ATI CCC 6.4; it's application-managed, no overclock. Anyway, I have been doing some more testing with the demo, and I found setting the resolution to 800x600 @ 60 Hz (aspect ratio 5:3?), with all settings as high as they will go (dynamic lights, shadows full on, etc.) plus 8x/16x? AF, to be the best balance of framerate and eye candy.

Win XP Pro SP2

NF4 Premium

4000+ SD

2 GB PC3200 dual channel

X1800XT 512 MB

10k RPM SATA HDD

X-Fi Extreme

530 W PSU

17" LCD DVI-D

800x600 @ 60 Hz, GRAW demo maxed out :grin1:

[Screenshots: 800x600 @ 60 Hz high 8xAF; 1280x1024 low; 1280x1024 high 8xAF; 1024x768 low; 1024x768 high 8xAF]

Warcloud, that's an average 20fps hit from the lowest settings at 1024 to the highest settings at 1280? The performance hit is larger than I thought, but then again, the difference in image quality is huge. I'm confident that ATI can reduce that hit to 15fps with their next drivers.
