
Is GRAW just too advanced to play on today's hardware?


  

187 members have voted



That's a reflection on those who still refuse to see that the game has problems. It's all interrelated. If this were Lockdown or Raven Shield, there'd be complaints about low frames.

It's all about perspective and some peeps here are blinded by name brands.



I voted yes, but if it really is too advanced then I'm worried, because GRAW doesn't look that impressive for the resources it needs to run.

And I'm also concerned that people are seeing 20-30 fps as OK.

Totally agree! I mean, we have been told more than once now by GRIN that it's our hardware that's the limiting factor. I wonder if they could tell us WHAT hardware, or WHEN there will be hardware, that IS capable of running this game efficiently (more than 40 fps).

BTW, what's the deal with them forcing the game to High priority on certain machines? What's that supposed to accomplish, other than making people's keyboards and mice lag, along with anything else they might want to run, like, oh I don't know, comms, seeing as there isn't any great way to communicate in-game? :wall:
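If the game really is bumping itself to High priority, one workaround is to knock it back down to Normal once it's running. This is just a sketch using Python's psutil on Windows; the process name "graw.exe" is a guess on my part, so check Task Manager for the actual executable name:

# Sketch: drop GRAW back to Normal priority so input and voice comms stop lagging.
# Assumes the executable is called "graw.exe" -- check Task Manager if it isn't.
import psutil

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "graw.exe":
        proc.nice(psutil.NORMAL_PRIORITY_CLASS)  # Windows-only priority constant
        print("Reset priority for PID", proc.pid)

The same thing can be done by hand from Task Manager (right-click the process, Set Priority, Normal); it just has to be redone every time the game launches.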


Does anyone know what GRiN uses as system hardware?

What configuration? What setup did they use to test?

Here's a question for those more technical than me...

How can any developer properly stress-test the engine if the future hardware doesn't even exist yet?



That's what I'm saying... they couldn't have built the game for something that isn't even made yet. DX10, for example: not that I've seen it said outright, but it's implied that this game will work better with DX10, yet how would they know that without actually testing on DX10? Maybe it's Vista they're talking about; that could be, since people are beta testing it as we speak, but that OS keeps getting delayed too, so who knows when it will come out. If that's really the case, to where they actually built this game around Vista and DX10, then that's just crazy; it should not have been released until said platform was out. It would be nice to hear from GRIN on this topic, because it needs to be answered so we as consumers know what to expect.

As for that infamous question... what kind of system did they test it on? I've asked that too, plus I took it one step further and asked: did they see the jaggies that we do? <_<

Edited by }PW{ Postal


Mmmmmmmm...jaggies... :drool:

Not to diverge much from the topic (it's kinda related), but does anyone know how soon a video card that can do AA alongside deferred lighting will be coming out?

OUCH! I even violated the rules I posted in the first post. :nono: FLAME ON!


I can't answer that, but I can say it was said to be possible through drivers; it would just slow performance down to basically a standstill on the average 512 MB card. With that said, and I hate to think this, but Nvidia might be able to pull it off with a quad SLI setup (2 GB of onboard video memory :drool: ) if they can get the drivers to do it.


If you play FPS games, more so than any other genre, you have to accept that lower-end systems aren't going to perform very well at all with newer, high-end titles. It's a realisation you either make peace with, or you basically move on to a different genre. I have a 6600GT and can play the game at about 30-40 fps, with dips into the late teens when things get heated. During MP Domination I usually don't drop below 20. Is it pretty? No. Is it playable? Yes! And that, to me, is what matters.

As for 30 fps not being an acceptable frame rate for an FPS, and for MP modes in particular, I don't buy it. The fundamental gameplay style of GRAW is different from BF2, the current online squad-based shooter it inevitably gets compared with. Infantry encounter ranges are significantly further out in GRAW than in BF2, meaning things like reaction time and fps carry less weight. Even on closer-quarters maps the fps "issue" isn't one I'd worry about too much.

If your system is currently "high-end" and still only pulling 30 fps and you're not happy, simply turn down your settings and see if the frames rise. Just because you've got t3h l337 hardware that pwned BF2 doesn't mean you should immediately expect it to rock GRAW.



I, too, have a 6600GT. It's very playable, with no mouse lag, and while it's not as pretty as it could be, it's still pretty and more than playable. I have only done bona fide FRAPS tests in SP, where I get 50-60, but I run FRAPS during MP just to show the framerate, and it seems to float between 40 and 60, never lower than 30.

People need to uninstall their Bonzi Buddies. ;)
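For anyone comparing numbers: FRAPS can write a per-frame log while you benchmark, and averaging it is trivial. Here's a rough sketch that computes average fps from one of its frametimes CSVs; I'm assuming the usual layout of a header row plus cumulative millisecond timestamps in the second column, so adjust if your log looks different:

# Sketch: average fps from a FRAPS frametimes CSV.
# Assumes a header row and cumulative millisecond timestamps in column 2.
import csv
import sys

def average_fps(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times = [float(row[1]) for row in reader if len(row) >= 2]
    elapsed_ms = times[-1] - times[0]
    return (len(times) - 1) / (elapsed_ms / 1000.0)

if __name__ == "__main__":
    print("Average: %.1f fps" % average_fps(sys.argv[1]))

An average over a whole round says a lot more than the best-case number you see while standing still in a corner.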


That's what's funny about this engine: it doesn't matter whether you have a high-end rig or not, or whether you turn the settings up or down, there's only a small margin between the "can play" mark and the "can't play" mark.

Example... You have a 6600GT and you're getting 30-40 FPS average. I have an X1900XT and mine runs 50-60 average, with 50 being the norm really. But wait, there's more... behind door number 2 I have a $1000 FX-60 processor overclocked to 2.8 GHz, and behind door number 3 we have 2 gigs of Corsair XMS RAM at 2-3-3-6 timings.

Let's put it like this: build my system for around 2 to 2.5 grand, then build yours for what, 1 to 1.5 grand? (Keep in mind I don't know enough about your system, so that's a guess.) Then install GRAW on both and see if you see a difference in the FPS. You really wouldn't, at least not the 1000-to-1500-dollar difference you might expect.

Maybe my system should be the minimum spec, and stuff that's not even made yet should be the recommended spec, which by the way is going to cost quite a bit more than the 2500 I spent building this. (I didn't build this for GRAW, I built it because I wanted to.)

I don't expect this system to rock GRAW; it's just that, by their stated minimum and recommended specs, it really should, and it doesn't.

Edit..

I just wanted to add that in FEAR, if I crank everything up to absolute max settings, I get 60 FPS across the board. If I lower everything to the lowest settings I'm getting 120 to 150 average (yes, I said average FPS), plus the visual difference between the lowest and highest settings is outstanding. In GRAW, if I turn everything down to low it's not all that different visually.

Edited by }PW{ Postal

I wish it was too advanced! I've hit 75 FPS with my 12-pipeline X850 Pro, not overclocked. It's called Post Effects Quality = "off". It's a lot easier to see people that way too... I actually expected the game to crawl on my rig, but after playing it, I don't think I'm gonna get a CrossFire setup anymore. A liquid-cooled 1900 XTX should give me 100 frames easy.


Don't waste your money, you won't hit 100 frames in this game with that card... OK, well, you might hit it if you stand in the darkest corner of the map and don't move... oh wait, no you won't. I tried that and saw the same infamous 75 frames you saw. If I go out and play, it drops right back down to a 50-to-60-frame average. I have the X1900XT... trust me, if you're getting 75 frames, don't waste your money.



Well, XT is not XTX. And again, have you set the PostFx value to off in the XML? Because you can't do that from the options screen...


Absolutely not. GRAW is simply built on a sub-par engine. Look at Half-Life 2: Lost Coast; it has HDR, it looks better than GRAW, and it still runs great. I can't really blame GRIN for that, though; there aren't many developers with the experience and resources to build efficient and versatile engines. Epic, id Software and Valve do, and they are extremely good at it. The thing is, if GRIN had chosen a third-party engine like the Unreal engine, the Source engine or even the Cry engine instead of building their own, they would have saved a whole lot of time and the game would probably look, play and perform better. Why? Because Epic, id and Valve are veterans at building game engines. Look at the Doom 3 engine: it is scalable, so you can play Doom 3 on a very low-end PC and it's still playable, while on a very high-end PC it will look damn nice. And last but not least, their engine is mod-friendly, because they built it with mods in mind. Same goes for the Source engine.

I think it all comes down to programming skill. How do you explain that GRAW looks roughly on par with, say, Far Cry, yet runs 10 times worse? The way they coded it is not the most efficient way to do it. You can't really blame them for that, as they didn't have time to build a whole engine from scratch. And I doubt Ubisoft wanted to pay 500k-1M USD for an Unreal/Source/Cry engine license.

GRIN did not build their engine just for GRAW. GRAW is built on the sixth version of Diesel, which has been updated over the years just as the Unreal and Quake engines have been. If I remember correctly, the Doom 3 engine is brand new, still has some kinks to be worked out, and also has steep system requirements (I don't play games on those engines, so I'm not completely clear on what they need). Even the RSE engine is about six versions old.

From what I have seen, there haven't been many truly "new" game engines created recently. As for Ubi licensing an engine, they have licensed the Unreal engine for the SC series as well as for Raven Shield (though that upset many people while pleasing others, and caused some to say "I told you so" after it was used for a TC game). Ubi will be using the Unreal engine for R6: Vegas too (not sure that is a good idea, yet again).



Well, XT is not XTX. And again, have you set the PostFx value to off in the XML? Because you can't do that from the options screen...

Yes, I did that with the post effects: opened the XML file in Notepad, edited the value and saved.

There's not that much of a difference between the XT and the XTX. Anyway, I'm just trying to save ya some dough, but if you insist on buying it expecting 100 frames from this game... all I can say is post some screenies with the FPS counter up in the corner when you get it, and then we'll talk about this some more.
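For anyone who'd rather not hand-edit it in Notepad, here's a rough Python sketch of the same tweak. Fair warning: the file name and the attribute names below are my guesses for illustration only; open your own settings XML first and match whatever GRAW actually uses.

# Sketch: force the post-effects setting to "off" in GRAW's settings XML.
# "renderer_settings.xml" and the name/value attributes are assumptions --
# check the real file in your GRAW folder before running anything like this.
import xml.etree.ElementTree as ET

SETTINGS_FILE = r"C:\path\to\renderer_settings.xml"  # hypothetical path

tree = ET.parse(SETTINGS_FILE)
for elem in tree.getroot().iter():
    # Look for an element whose name attribute mentions post effects.
    if elem.get("name", "").lower().startswith("post_effect"):
        elem.set("value", "off")
tree.write(SETTINGS_FILE)

Keep a backup copy of the original file so you can restore it if the game refuses to load your edit.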


There is one option missing in the poll.

#5: Is the code unoptimized, and is that why it runs about the same no matter what system it's on?

The code for this engine is new, basically untested, and largely unoptimized. I implore you coders out there to open up the files, take a look and tell me I'm wrong.

Here is a small part of what's being said about the code on other forums:

The more I mess with this game in WPE, Olly and IDA Pro, the more I see that it is more PrehistoricGen than NextGen: broken and not optimized.

If they want to add some real anti-cheat scans/countermeasures to it, it will become unplayable for a lot of gamers/fans without SERIOUS and DEEP optimization. The funniest thing that could happen is if they use the Evenbalance spyware (as optimized and resource-hungry as GR3).

The game isn't even finished; in my own humble opinion they sold a beta because of the newbisoft vampires.

I messed around A LOT with Far Cry and the CryEngine; it's older than GR3, but the technology in it is way more optimized/advanced than in GR3.

Crysis will be a real NextGen game; GR3 is just big eye-candy garbage. It's sad, because there are some good gameplay details/ideas in it...


Thanks for posting that, Skratte. I saw that also and kind of had a chuckle about it. They just put a pretty candy coating on it, kinda like Lockdown using the original RS/GR engine, only LD ran better and in some respects looked better...

Edited by LT.INSTG8R


Postal, my system is similar to yours, and I'm getting frames similar to yours with a 6600GT, though probably at a lower res (800x600). I would expect much higher from a high-end card.

Moreover, if the review at Elitebastards is accurate, the 1800XT 512MB gets higher framerates than the 1900XTX. It might be some issue between the Diesel engine and the architecture of the 1900XT line, which is radically different from the 1800 line. Unless Diesel caps framerates, and I can't imagine why it would, I would expect to get close to 100 with the $300 1800XT 512 at my current settings and res.

Also, people need to remember that ATI and Nvidia both do driver revisions with specific games in mind. We could easily see driver optimizations and subsequent framerate increases in the future, even without patches from GRIN.


Yeah, I'm running at a res of 1280x1024, and I'm keeping my eye out for new drivers on a daily basis for just that reason. I know that my card is new (sorta), they are still working to optimize the drivers for it, and each game is different too, so that just adds more time.

Yeah, I expected a lot better frames per second than what I'm getting, purely because of the system I'm running. I'm not saying that 50 to 60 isn't good; heck, I probably wouldn't even notice a difference in gameplay if it jumped up to 100. It's not really a complaint, it's more an observation of how indifferent this engine seems to be to hardware.


Edit..

OK, I found it in my Catalyst Control Center. It's not an option to disable vsync, but an override for the refresh rate; I increased it to 120 Hz and it did raise my frames. Now I average 60 FPS and max out at 70 to 75. So maybe I was wrong in my previous post, sorry. It would be nice to see posts from other high-end users about what average FPS they are getting. I always measure during gameplay for a while to get my average... I mean, at one point I saw 99 FPS, but I don't count that because it only happened once. I could say that I max out at like 700 FPS, but that's on the black loading screen... :rofl:

Can you please tell me exactly where you found the control settings for the above? I have ATI Catalyst version 06.4.

Was the setting you changed under Display Options, then 3D Refresh Rate Override?

silent_op

Edited by silent_op


Yes, that's where it's located. You can change it to whatever you want; I put mine at 120 and saw an improvement in my FPS... not much, but I do see more frames per second. I couldn't find an option to turn off vsync, so I used that instead.

In the Catalyst panel it would be under 3D settings, under All Settings, or under your Primary/3D settings in the tray. By default it is set to "Off, unless application specifies". You can change it to "Always Off" if you so choose.

Yup, just found it... TY very much, I've been looking for that for a long time... lol, DOH. Mine was not set to Always Off though; I moved the slider over and then it said Always Off.

Edited by }PW{ Postal

Hi all,

I upgraded my machine after attempting to play GRAW on an Inspiron 9100 with a Mobility Radeon 9800 - I was getting ~25 FPS @ 800x600. Yeah, it sucked to have to upgrade, but it had been almost 2 years since my last one. Running GRAW on an AMD Athlon 64 3700+ and a 7900 GT has really changed my opinion of this game. Getting 40-60 fps @ 1280x720 makes this game a LOT better. I liked the game at the lower rez, but it really kicks ___ on a higher-end system.

Not too advanced for today's hardware, but definitely too advanced for a machine a year or two old.

