
Greetings from Mexico!



I didn't know gamers were obsessed with FSAA... I've just never felt the need to use it, ever, especially at high resolutions :S

While I'm in the action I don't notice any jaggies.

Edited by whoa182

Although I really love HDR lighting and think it works great at imitating an environment such as Mexico City, I think you should have given us more options. I am running GR:AW on all high settings at 1024x768 and the jaggies really get to me. I see all this beauty, but it looks like it has all been cut out by a two-year-old.

A bloom lighting and AA combination option would have been great for those who wish to get rid of the jaggies. I know people with extremely high-end cards are not getting these, but bloom lighting + high AA looks very nice.


Although I really love HDR lighting and think it works great at imitating an environment such as Mexico City, I think you should have given us more options. I am running GR:AW on all high settings at 1024x768 and the jaggies really get to me. I see all this beauty, but it looks like it has all been cut out by a two-year-old.

A bloom lighting and AA combination option would have been great for those who wish to get rid of the jaggies. I know people with extremely high-end cards are not getting these, but bloom lighting + high AA looks very nice.

Agreed, I would much rather have the option of Bloom + AA than HDR and no AA at all. I play at 1280x1024, and though things don't look too bad up close, anything more than half a block away looks sharp enough to chop wood on. Low FPS I can live with; 30-35 FPS is playable, but not when it looks this ugly. (Now granted, I will be upgrading in July to something more powerful, but I'm no slouch really.)

P4 3.2E @ 3.6 (225 FSB), Zalman 7000B-Cu

Abit IC-7

2x1024 Corsair XMS 3-3-3-8(1:1)

X800XT/Arctic Silencer 4

Samsung Spinpoint 120G SATA

SB Audigy 2 ZS

Tagan 420W

LG 1930BQ TFT 19" 12ms

Logitech G5

Saitek X-52

Logitech Driving Force Pro

3DMark05 6174

3DMark03 12917

3DMark06 2162

Edited by LT.INSTG8R

I didn't know gamers were obsessed with FSAA... I've just never felt the need to use it, ever, especially at high resolutions :S

While I'm in the action I don't notice any jaggies.

Neither did I. Most competitive gamers with half a brain leave it switched off when they play, because it sucks up framerates like nothing else. Sure, you can crank it up when playing ancient games like [GR], but any new game that pushes hardware limits generally requires that you leave it off to maintain a decent framerate anyway.


Neither did I. Most competitive gamers with half a brain leave it switched off when they play, because it sucks up framerates like nothing else. Sure, you can crank it up when playing ancient games like [GR], but any new game that pushes hardware limits generally requires that you leave it off to maintain a decent framerate anyway.

Ridiculous. Have you checked the benchmarks for modern video cards from the past two years? 2X AA is basically "free" with most cards these days, even midrange cards with modern titles - it certainly is on my 7600GT. I can often get better framerates at 1024x768 with 4X AA than I can at 1280x1024 with no AA, and again that's with recent titles. It makes a huge difference in visual quality - even 2X AA.
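Just to put rough numbers behind the "lower res with 4X AA can beat higher res with no AA" point, here's a back-of-the-envelope sketch in C++ (my own illustrative arithmetic, not anything measured in GR:AW; it only counts pixels and MSAA samples and ignores CPU limits and memory bandwidth specifics):

```cpp
#include <cstdio>

int main() {
    // Rough numbers only; real cost depends on whether a game is shader-, fill-, or CPU-bound.
    const long long lowRes  = 1024LL * 768;    //   786,432 pixels
    const long long highRes = 1280LL * 1024;   // 1,310,720 pixels
    const int msaa = 4;

    // MSAA runs the pixel shader once per pixel but stores N color/depth samples,
    // so shader work tracks pixel count while bandwidth tracks sample count.
    std::printf("shader work  : %lld vs %lld pixels (low-res 4xAA shades ~%.0f%% fewer)\n",
                lowRes, highRes, 100.0 * (1.0 - double(lowRes) / double(highRes)));
    std::printf("sample count : %lld vs %lld (low-res 4xAA writes ~%.1fx more samples)\n",
                lowRes * msaa, highRes, double(lowRes * msaa) / double(highRes));
    return 0;
}
```

In a shader-bound game, the 40% fewer shaded pixels can matter more than the extra sample storage, which is consistent with the framerates described above.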

Here's the problem: The screenshots recently released before the demo had high-quality AA. Gee, why was that?

Do you think that might be one of the reasons people are a little upset now? It's simple: DON'T RELEASE DOCTORED SCREENSHOTS. People were wondering why the PC version doesn't look as good as the 360 version (which even at this point is debatable; even without AA I don't think GR PC looks bad, and it has some advantages over the 360 version), then suddenly these "new" screenshots with lovely AA appear that look significantly better. "Phew!" cry the owners of $500 cards.

Well, too bad for you - that was marketing. I expect by now we'll see some doctored screenshots for the older consoles; there were always 1280x1024 released shots of, say, the new Jak and Daxter for the PS2 that everyone knew wouldn't look like the final game. But this is quite new for PC gamers.

It's simple: don't lie. Don't put out bogus screenshots, and you'll at least avoid leading people down a garden path. I understand that, as developers, you may not have had any say in this and Ubisoft requested downsampled screenshots, but it's damn annoying nonetheless. When I see something described as a screenshot, I expect to see an actual screenshot. Really, is that too much to ask?

As for "PC hardware can't handle it": Funny, sounds like what the developers of Oblivion said. A scant what - 2 weeks later? The "Chuck Patch" appears for ATI cards. To be more accurate: "Our implementation of HDR does not support AA. Yes, we release ATI X1000 cards can do FP16 blending with AA and Valve figured out how to support it with all PS 2.0 cards, but we went a different route that shall remain a mystery".

There ya go!
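For anyone wondering what "FP16 blending with AA" actually means at the API level, here's a minimal, hypothetical Direct3D 9 capability probe (not GRIN's, Bethesda's, or ATI's actual code; it simply asks the driver whether a 64-bit FP16 render target supports post-pixel-shader blending and whether it can be 4x multisampled):

```cpp
// Hypothetical capability check; build against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 not available\n"); return 1; }

    // Can the adapter blend into an FP16 (A16B16G16R16F) render target at all?
    HRESULT blend = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // And can that FP16 surface also be multisampled (i.e. FP16 HDR + MSAA)?
    DWORD quality = 0;
    HRESULT msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, &quality);

    std::printf("FP16 blending: %s\n", SUCCEEDED(blend) ? "yes" : "no");
    std::printf("FP16 4x MSAA : %s\n", SUCCEEDED(msaa)  ? "yes" : "no");

    d3d->Release();
    return 0;
}
```

On hardware where the second check fails, as it did on most cards of this era, an engine has to fall back to no AA, a lower-precision format, or a custom workaround, which is exactly the trade-off being argued about here.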


Holy crap, how many times does it need to be said that reducing the size of a screenshot WILL reduce, if not remove, the appearance of jagged edges!? I can't believe a "hardcore" PC gamer would claim ignorance on the issue, especially when Bo said that AA was not being used in the screenshots he made comments about.

Edited by Thoramir

Ridiculous. Have you checked the benchmarks for modern video cards from the past two years? 2X AA is basically "free" with most cards these days [...]

Awesome post. Thank you!


QUOTE (::Db::Korven @ Apr 27 2006, 11:31 PM)

GRAW won't survive without AA, and neither will GRIN.

AA is for morons, really.

AA is a stupid man's feature. You get a powerful card and use 50% of its resources to blur edges, when the same effect can be achieved by upping the resolution with a much smaller fraction of the GPU's processing power.

Sorry, I've got a 1900 XTX. I like to paint and look at detail, so while I can turn AA on, I choose to leave that option for the morons who think it makes a big visual difference. It's a graphics placebo, people - a reason to justify SLI.


I think this game is too "gray"; there's like no color at all. And the buildings all look the same. I am starting to think I'm playing "Mario Bros", not GR.

[attached image: 159876.jpg]

^^^^ LOL - this is what the textures of the buildings in GRAW look like.

Edited by USMC Maggot

As for "PC hardware can't handle it": Funny, sounds like what the developers of Oblivion said.  A scant what - 2 weeks later?  The "Chuck Patch" appears for ATI cards.  To be more accurate: "Our implementation of HDR does not support AA.  Yes, we release ATI X1000 cards can do FP16 blending with AA and Valve figured out how to support it with all PS 2.0 cards, but we went a different route that shall remain a mystery".

It's funny how you mention gamers with half a brain when you don't even realize that Valve does not use FP16 for HDR. They are not using true floating-point precision HDR; they are using a fake SM2 "bloom" effect.
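A tiny conceptual sketch of the distinction being made here, using made-up radiance values rather than any real engine's shader: an 8-bit target clamps bright pixels to 1.0 before a bloom pass ever sees them, while an FP16 target preserves the overbright values:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Made-up radiance for a bright highlight, in linear units where 1.0 = display white.
    float radiance = 4.0f;

    // 8-bit ("fake" SM2 bloom) path: the render target clamps to [0, 1],
    // so the bloom pass only ever sees 1.0 and the highlight's real intensity is lost.
    float ldrStored = std::min(radiance, 1.0f);            // 1.0
    float bloomLdr  = std::max(ldrStored - 1.0f, 0.0f);    // 0.0 -> no extra glow

    // FP16 path: the overbright value survives storage, so bloom and tone mapping
    // can react to how bright the pixel really was.
    float fp16Stored = radiance;                            // 4.0 fits easily in half precision
    float bloomHdr   = std::max(fp16Stored - 1.0f, 0.0f);   // 3.0 -> strong glow

    std::printf("LDR bloom term: %.1f   FP16 bloom term: %.1f\n", bloomLdr, bloomHdr);
    return 0;
}
```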

FP16 HDR + AA cannot be done without a major workaround. PERIOD. It doesn't work in Oblivion, it doesn't work in Lockdown, it doesn't work in Splinter Cell, it doesn't work in Far Cry, it doesn't work in TimeShift, it doesn't work in Serious Sam 2, and it doesn't work with BoS.

It can be done with X1900 XTX cards, but it requires a major workaround, and it's simply not worth it for 1% of the population. Frankly, I would rather have them put that dev time towards something else.

I would like to see the option to disable HDR, though; they may have left it out for MP balancing reasons.

Edited by jAkUp

I'm not that upset... I think it looks really good as is... it's just sad that you can buy the second-best NVIDIA card on the market today and it's still not good enough.

HACK

HACK, it wasn't plastered all over the forums, but I've mentioned it several times. I've been telling people for quite some time that this game would most likely be as hard on our systems as F.E.A.R., which anything short of these 512MB cards won't handle on its own at high res with everything turned on. Every time I do, someone comes in right behind me and says that a $150 X1600 will work just fine, or whatever. Sometimes I've quoted Bo's recommendation of the 512MB cards; sometimes I've just recommended them myself.

Man o' man I just spent some money...

here is my new rig

CPU: AMD X2 4400 *new*

MB:  Asus A8N32 Deluxe *new*

RAM: 2 GB  *new*

Video: UNKNOWN at this point... but will either be the ASUS or LEADTEK 7900GT (everyone is sold out of both)

Also, as far as video goes, if you have the $$$ to shell out for that other hardware, go the extra mile and drop the money on a 7900GTX. The 512MB will make a difference for GR:AW. :thumbsup:

Sorry I didn't quote Bo directly when I responded to you. It sounds like you may have been one of the ones who would have listened.

As far as the second-best card not being good enough, the game devs are in a no-win situation. If the hardware outworks the games, we complain that the game devs aren't exploiting the potential of these expensive cards we bought. If the games outwork the hardware, we complain that we have to buy new cards. And if any game got it just right, it would only be just right for six months; then the hardware addicts would be complaining that the game wasn't good enough. HL2, with a DX7, DX8, and DX9 path, is the only game I've ever seen to really get this right.

--Logos

I remember reading that... but I thought it was your opinion and not required specs. I did a lot of research before I bought my card, and everything I read said that a 7900 256MB would be more than enough for anything out there (of course, I guess GRAW wasn't out there yet).

As I said... I'm not too bothered by it... I think the game looks good. But you are right, I would probably have bitten the bullet and forked out the extra for 512MB. Looks like I might be moving to SLI sooner than I thought.

My point still stands, though: UBI should have had the specs posted for months.

HACK

What about a PhysX card, then?

You've got a nice graphics card already; I hate to see you (and me, as I am about to buy a completely new rig) having to settle for less than high resolution with that graphics card.

To put it plainly: can a PhysX card be a substitute for 512MB of memory on the graphics card?

A long shot: can an X-Fi card help out a bit by doing some of the CPU work, and by that give the CPU more capacity to help the graphics card out? :huh:


To put it plainly: can a PhysX card be a substitute for 512MB of memory on the graphics card?

NO!!

The physics card is for processing the engine's physics calculations. Without one, the physics is handled in software by the CPU and system memory.

A 512MB graphics card will give you more graphics memory for handling the high graphics settings and the improved detailing of the textures.
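To put some rough, assumed numbers behind that (generic texture-size arithmetic, not GR:AW's actual asset sizes):

```cpp
#include <cstdio>

int main() {
    // One 2048x2048 diffuse texture with a full mip chain (~4/3 of the base level).
    const double texels   = 2048.0 * 2048.0;
    const double mipScale = 4.0 / 3.0;
    const double mb       = 1024.0 * 1024.0;

    double uncompressed = texels * 4.0 * mipScale / mb;  // RGBA8, 4 bytes/texel  -> ~21 MB
    double dxt5         = texels * 1.0 * mipScale / mb;  // DXT5,  1 byte/texel   -> ~5.3 MB
    double dxt1         = texels * 0.5 * mipScale / mb;  // DXT1,  0.5 byte/texel -> ~2.7 MB

    std::printf("RGBA8: %.1f MB   DXT5: %.1f MB   DXT1: %.1f MB per texture\n",
                uncompressed, dxt5, dxt1);
    // A few dozen high-detail materials plus render targets adds up quickly,
    // which is where a 512MB card pulls ahead of a 256MB one.
    return 0;
}
```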


To put it plainly: can a PhysX card be a substitute for 512MB of memory on the graphics card?

NO!!

The physics card is for processing the engine's physics calculations. Without one, the physics is handled in software by the CPU and system memory.

A 512MB graphics card will give you more graphics memory for handling the high graphics settings and the improved detailing of the textures.

Yes, that is how I perceive it as well. What I mean is whether it could have knock-on effects by freeing the graphics card from some of the strain.

Perhaps substituting for 512MB of memory is far-fetched, but pushing a mid-range 256MB card up to perform like a high-end 256MB card could perhaps be possible? On top of delivering the PhysX effects, of course.

I mean, people get better graphics performance by adding X-Fi cards - a card that has very little to do with graphics (I'd better state that I know that, so I don't get it thrown back down my throat).

Edited by JASGripen

This topic is now closed to further replies.