
Spoudazo

Members
  • Posts

    69
  • Joined

  • Last visited

Posts posted by Spoudazo

  1. Papa, yeah it's a resource hog, but I gotta tell you the new Unreal Engine is worse. The Splinter Cell demo brings my rig to its knees at the same settings as GRAW. We're talking about 10-13 fps for SCDA and 35-40 for GRAW (1280x720 medium settings / standard settings for SCDA). If GRAW2 uses the Diesel engine, by the time it's released it will be a middle-of-the-road game resource-wise. Games like SCDA and R6: Vegas are going to need more powerful hardware than GRAW, that's for sure. After that demo, and seeing the upcoming games, I think the arguments that GRAW's hardware requirements are too high will fade away.

    The Splinter Cell demo uses Unreal Engine 2.5, not UE3. ;)

  2. I know they are out there somewhere. I tried that extractor program that you use, but it messes with some game files, and I never got it working right.

    So anyone got some working yet? I keep getting killed, the Messicans are too much for me in this game! :P

  3. 512? That card usually comes with 256MB. Good enough to play, yes, but as good as a 6800GS or such? Probably not. Check the core speed and what kind of memory it has on it (DDR, DDR2, DDR3), as they play a big part in its speed - rough numbers below.

    I have a 7900GTX, and with 16xAF, 2GB of RAM, a modified XML file (to load more stuff into RAM), and a 3800+ X2 (with one core waiting for this game to actually use it :whistle: ), I get on average 40fps at 1680x1050.
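    To put some rough numbers on the memory-type point above, here's a quick back-of-the-envelope calculation in Python. The clocks and the 128-bit bus below are just figures I'm assuming for a typical mid-range card of that era, not vendor specs, so plug in whatever your card actually runs at.

        # Rough effective memory bandwidth: clock (MHz) x transfers per clock x bus width (bits) / 8.
        # The example clocks and the 128-bit bus are assumptions, not real specs.
        def bandwidth_gb_s(mem_clock_mhz, transfers_per_clock, bus_width_bits):
            bytes_per_second = mem_clock_mhz * 1e6 * transfers_per_clock * bus_width_bits / 8
            return bytes_per_second / 1e9

        print(bandwidth_gb_s(250, 2, 128))  # plain DDR at 250 MHz -> ~8 GB/s
        print(bandwidth_gb_s(500, 2, 128))  # GDDR3 at 500 MHz -> ~16 GB/s

    Same bus width, double the memory clock, double the bandwidth - which is why two "512MB" cards can perform nothing alike.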

  4. Since desmond said this,

    Also, it seems everybody wants a scapegoat for the system requirements, but deferred lighting is not it. Sure, you'll get a performance boost by turning off post effects - as long as the GPU is doing all the work - but as soon as you get into a firefight the CPU has some serious work to do, and then the framerate will drop regardless. It's not as simple as some of you self-proclaimed experts make it out to be.

    I'm assuming there is going to be a dual-core CPU patch, as this game is hard on the CPU.

    Any info?

  5. Well, now I have bought a 3800+ AMD single core, so the question is settled for me.

    Why would you do that? Games are going to be programmed for dual-core, quad-core, etc. in the very near future, and your single core is going to fall behind quite a bit when you could have spent a few more dollars (like $10 maybe?) and gotten a dual core.

  6. He's asking about SP cheats. And he's not the only person who likes to know them. I use them a lot in GR1. Without knowing the dev cheats, how does one debug a new mission without being shot? How does one test a complex bit of scripting without dying? SP cheats. Ask any scripter what they think of SP cheats; they'll tell you they are pretty important. So lose the high-and-mighty "SP cheats are for nOObs" or "learn to play right" stuff. They serve a purpose well beyond goofing off and blowing some things up on a map.

    :wub:

    Awww thanks :thumbsup: , lol

  7. I'm running the game at 1280x960 right now (at 85Hz), and after I set all of the "texture managed" options to TRUE (something that has been discussed here or on another forum board), it helped a good bit, as some of you may know. Now, I have 2GB of RAM; I've read that this isn't a good option if you have 1GB or less.

    My average FPS is probably around 45 now, w/ no dynamic lights and no shadows, 8xAF, and high textures on everything. Things are lookin' pretty good right now, much better than the 20-30fps average from before! :grin1:
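    In case anyone would rather script that tweak than edit the file by hand, here's a minimal Python sketch. Fair warning: the file name and the attribute name in it are my assumptions about how the settings XML is laid out, so check your own install (and back the file up) before running anything like this.

        # Minimal sketch: flip every "texture managed"-style flag to true in a settings XML.
        # Assumptions: the file name ("render_settings.xml") and the attribute name
        # ("texture_managed") are placeholders - check your own install first.
        import xml.etree.ElementTree as ET

        path = "render_settings.xml"   # hypothetical file name
        tree = ET.parse(path)

        changed = 0
        for elem in tree.iter():
            if elem.get("texture_managed") == "false":   # hypothetical attribute name
                elem.set("texture_managed", "true")
                changed += 1

        tree.write(path)
        print("Flipped %d texture_managed flags to true" % changed)

    And as noted above, this only seems worth doing with 2GB of RAM or more.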

  8. Apparently you might need two video cards to see the physics at their full potential (from what I've heard). I'm going to hold off until issues are resolved and the system is solid. This is sort of the first game out utilizing PPUs, so I'll give it a little time, but I'll end up getting one sooner or later anyhow.

    This was verified by a couple of websites: an SLI setup is needed. Unfortunately, the only PPU on the market now is PCI. Having two high-end PCI-E cards bottlenecking on a single PPU defeats the purpose.

    Per a suggestion from a good friend, wait for the PCI-E version to come out, or just scratch that itch somewhere else. Maybe go to 2-4GB of memory, perhaps.

    Remember, the first wheel ever invented had rough edges that caused too many bumps.

    Wait until they smooth out this new technology. :thumbsup:

    lol @ wheel/bumps :)

    I'll probably just spend the money on a SLI setup then :D

  9. I think it'd be cool to have a PPU. Now, if only one game supported it, then of course, no, I wouldn't buy one. But Unreal Tournament 2007 (and I think all Unreal Engine 3 games) will support it, so that's one of the main reasons I'm buying one whenever they are released. :)

    How about you? :D

  10. Well,

    I did some more tests and set everything to the lowest the in-game settings would allow. At first it was around 30fps or so; then after that, it stayed around 50fps. The textures, effects, and lighting/shadows were all turned down or off, and still it wouldn't average 60fps.

    I'm thinking the 3D engine isn't that well optimized for such huge landscapes, at least not yet. The reason the frames almost doubled after a minute or so of playing is perhaps that the rest of the level is still loading when you first start the game, or the textures, etc. are all getting settled in. This was with 2GB of RAM, so RAM was not lacking, nor was the burst speed, etc. of the HDD.

    You agree? :)
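    If anyone wants to check the warm-up theory on their own rig, here's a little Python sketch that compares the average fps of the first minute against the rest of a run. It assumes you have a plain text log with one fps sample per line, one per second (e.g. from a FRAPS benchmark run); the file name is made up.

        # Compare average fps during the first 60 seconds vs. the rest of the run,
        # to see whether streaming/warm-up explains the early dip.
        # "fps_log.txt" (one sample per line, one per second) is just a placeholder name.
        def load_samples(path):
            with open(path) as f:
                return [float(line) for line in f if line.strip()]

        samples = load_samples("fps_log.txt")
        warmup, rest = samples[:60], samples[60:]

        def avg(xs):
            return sum(xs) / len(xs) if xs else 0.0

        print("first 60s: %.1f fps, after that: %.1f fps" % (avg(warmup), avg(rest)))

    If the second number comes out well above the first, like it did for me, the "level still loading" explanation looks pretty plausible.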

  11. If your video card can hardly play any games at 1600x1200, then don't blame the developers of this nice game; rather, blame yourself for not (1) upgrading your PC

    I have a 7800GT, 2GB of RAM and an Athlon 64 3500+. I still don't get good frames at that resolution. Don't go preaching complete rubbish like this, you just don't have a clue.

    You should be able to. Try turning off shadows and tweaking the XML file. A 7800GT isn't far behind a 7800GTX. What do you consider playable?

    I always prefer 60+ fps, but I don't want to spend more money right now to get there ;)

    Edit: One day, oh yes, one day, I shall make a post without typos :wall:
