Ghost Recon.net Forums


Deferred rendering is NOT the heatwave (which seems to be just a shader). It's the whole lighting system.

If the heatwave was the problem, they would have added an option to disable it and enable AA a long long time ago, long before the game went gold. :yes:

So then it sounds like some sort of supersampling, which renders at a higher res and then "converts" down to your screen res. But that should be a real framerate killer, shouldn't it?

What is the supersampling here, DR or the heatwave effect? The heatwave effect is NOT a framerate killer IMO; any decent card should be able to handle it with little performance penalty.
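For reference, the "supersampling" being talked about just means rendering more samples than there are screen pixels and averaging them back down. Here is a toy Python sketch of that idea, with a made-up render_sample() standing in for the actual renderer; it also shows why it scales badly, since the work and memory grow with the square of the factor.

```python
# Toy sketch of ordered-grid supersampling: render at FACTOR x the resolution,
# then average each FACTOR x FACTOR block down to one screen pixel.
# render_sample() is a hypothetical stand-in for a real renderer.

FACTOR = 2  # 2x2 supersampling -> 4x the shading work and buffer memory

def render_sample(x, y):
    # Fake per-sample "renderer" returning a grayscale value.
    return 1.0 if (x + y) % 3 == 0 else 0.0

def supersample(width, height):
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            acc = 0.0
            for sy in range(FACTOR):
                for sx in range(FACTOR):
                    acc += render_sample(x * FACTOR + sx, y * FACTOR + sy)
            row.append(acc / (FACTOR * FACTOR))  # box-filter down to screen res
        image.append(row)
    return image

print(supersample(4, 2))  # 4x2 screen pixels, each averaged from 4 samples
```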

DR works like this (AFAIK): the technique lets you spend the expensive pixel shader operations just once per pixel. The scene is first rendered into a fat offscreen buffer (XYZ, normal, material ID, mapping coordinates, ...) without applying any shaders, and then a screen-aligned pass applies the correct shader to each pixel (depending on the per-pixel information in the fat buffer). Tim Sweeney said that AA doesn't work well with deferred rendering, and this is logical if you think about it.
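To make that concrete, here is a toy software sketch of those two passes. The buffer layout, scene format and Lambert-only shading are simplifications made up for illustration; this is not GRAW's (or anyone's) actual renderer.

```python
# Toy software sketch of deferred shading. Pass 1 writes geometry attributes
# into a "fat" G-buffer with no shading; pass 2 then runs the (expensive)
# shading exactly once per covered screen pixel.

WIDTH, HEIGHT = 8, 4  # tiny framebuffer just for illustration

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def geometry_pass(scene):
    """Fill the G-buffer (position, normal, material id) without any lighting math."""
    gbuffer = [[None] * WIDTH for _ in range(HEIGHT)]
    for frag in scene:  # hypothetical scene: one visible fragment per entry
        x, y = frag["pixel"]
        gbuffer[y][x] = {"pos": frag["pos"], "normal": frag["normal"],
                         "material": frag["material"]}
    return gbuffer

def lighting_pass(gbuffer, light_dir, materials):
    """Screen-aligned pass: apply the correct shading once per pixel from the G-buffer."""
    image = [[(0.0, 0.0, 0.0)] * WIDTH for _ in range(HEIGHT)]
    for y in range(HEIGHT):
        for x in range(WIDTH):
            px = gbuffer[y][x]
            if px is None:          # background pixel, nothing to shade
                continue
            ndotl = max(0.0, dot(px["normal"], light_dir))
            r, g, b = materials[px["material"]]   # pick colour/shader by material id
            image[y][x] = (r * ndotl, g * ndotl, b * ndotl)
    return image

# Minimal usage with invented data:
scene = [{"pixel": (3, 1), "pos": (0, 0, 5), "normal": (0, 0, -1), "material": "metal"}]
materials = {"metal": (0.8, 0.8, 0.9)}
print(lighting_pass(geometry_pass(scene), light_dir=(0, 0, -1), materials=materials)[1][3])
```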

^^ That's what I read over at nV News. My simplified version: the way I understand it, (almost) everything is rendered in a SINGLE cycle, so AA CAN'T be applied (on current hardware) because there are no edges left that can be anti-aliased. You could add a blur filter, that sounds easy, but to add real AA you'd have to rip the first part of the rendering into pieces and re-write the way the rendering works.
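The "blur filter" route would look roughly like this: since MSAA can no longer see the geometric edges, you detect edges yourself from the G-buffer (depth discontinuities, silhouettes against the background) and blur only those pixels. A rough sketch, reusing the G-buffer layout from the snippet above; purely illustrative, with no claim that this is what GRIN actually does.

```python
# Edge-detect-and-blur post-process sketch for a deferred renderer.
# Uses the same gbuffer layout as the deferred-shading sketch above.

def is_edge(gbuffer, x, y, depth_eps=0.1):
    """Flag a pixel whose depth jumps sharply towards its right/bottom neighbour."""
    here = gbuffer[y][x]
    for nx, ny in ((x + 1, y), (x, y + 1)):
        if ny < len(gbuffer) and nx < len(gbuffer[0]):
            there = gbuffer[ny][nx]
            if (here is None) != (there is None):
                return True   # silhouette against the background
            if here and there and abs(here["pos"][2] - there["pos"][2]) > depth_eps:
                return True   # depth discontinuity between two surfaces
    return False

def edge_blur(image, gbuffer):
    """Crude 2-tap blur applied only at detected edge pixels."""
    out = [row[:] for row in image]
    for y in range(len(image)):
        for x in range(len(image[0]) - 1):
            if is_edge(gbuffer, x, y):
                a, b = image[y][x], image[y][x + 1]
                out[y][x] = tuple((ca + cb) / 2.0 for ca, cb in zip(a, b))
    return out
```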

However, ripping up the renderer like that doesn't seem to be practical and would be a real performance killer. ATI admitted that they can't do anything about it with their current hardware, unlike with Oblivion, which works in a totally different way and could be fixed by ATI.

It will be interesting to see how the Unreal Engine 3 will deal with deferred rendering. Will it be the standard and only rendering path? Not likely, but then again I highly doubt that AA will work with current hardware in UE3 when it's set to use deferred rendering either. The performance hit for AA and DR on current hardware must be ridiculous. That's how I understood Tim's comment over at nvnews.net.

Edited by agentkay

I highly doubt that AA will work with current hardware in UE3 when it's set to use deferred rendering either. The performance hit for AA and DR on current hardware must be ridiculous. That's how I understood Tim's comment over at nvnews.net.

Thanks for the simple explanation. I read that too but sometimes my eyes gloss over when it gets overly technical.


Geez, 4 gigs of RAM. About a decade ago a 32MB card was top of the line; then you were part of the "in" crowd if you owned a 64MB graphics card. :rofl: I'm just getting old.

@ agentkay. So would GRIN have to strip down GRAW and do it all over to get rid of deferred lighting to get AA?

This AA & deferred lighting stuff has me feeling... dizzy :wacko:

Edited by Papa6

I don't bother, because I never used AA in any game. I'm still drinking beer, and after 1 litre, AA is coming on my screen; there's no scaling, everything is blur. :)

Yeah, I tell you, beer does great things in hardware mode! :grin1:

So forget the software AA and purchase a pack of 26 Kronembourg (French beer): hardware AA!

:rofl:

Have fun

NTF-Phoenyx

French GRAW community site


@ agentkay. So would GRIN have to strip down GRAW and do it all over to get rid of deferred lighting to get AA?

They might be able to add a new (old) rendering path to their engine and use it to support AA. That sounds more practical than ditching DL, or re-writing DL to hell and still ending up with crap AA performance. I bet the earlier versions of their Diesel engine didn't use deferred lighting, but is the code still there, or was it dumped and can't be brought back? No idea. Even if it's still there, it would require a hell of a lot of work to make the game look half-decent. You see, you might not have jaggies anymore, but you would lose a lot of shaders and eye-candy along the way. The game might look more like GR1 (uniform lighting, lack of special effects). Would this make people happy? Honestly, I don't think so. Grin would be flamed to hell and back for releasing a game that looks like games did back in 2001.

I can't wait for the first (PC) game with the Unreal Engine 3 to see Epic's approach on this. Will deferred lighting be the one and only rendering path in UE3? I wish Tim Sweeney would answer this one for us.

My prediction: there will be a "low-end" non-deferred lighting rendering path in UE3, it will be based on the UE2 path and it will work with AA. To be able to enjoy all effects and the full beauty of UE3, you will have to switch to the deferred lighting path and live with the fact that you can't have AA on current hardware.


I still think it's a smart move to make it an option. I mean, you talk of UE3; well, that's next year, it will be DX10 compliant, and it will be released when the next generation of cards is around to handle the changes. This is where I think GRIN missed the mark. The "tech" is a cool idea, but it's FAR too early to force it on people when it has been said over and over that there is NO hardware to support it.

So like any other game that adds new tech, make it scalable, so that when there IS hardware that can keep up with the tech, the switch is there to turn it on.


UE3 is this year with Gears of War, but that one is console only (for the time being). UT2k7 might make it this holiday season or Q1 2007; either way it will still be DX9, because Vista and DX10 are looking like March 2007 right now, or even later considering how badly Vista Beta 2 runs. Yes, UE3 will support DX10 (confirmed), but only once the hardware AND software (Vista) are released and available, not any time sooner. The minimum specs for UE3 are good DX9 cards (minus the Nvidia 5-series) anyway, so the chances are good that we will see UE3 games that are DX9 only and won't get any DX10 goodies unless it's reasonable and the publisher of the game will fund such a decision.

I personally think Grin's decision to use DL mainly comes down to the 15 months they were given to deliver the game. DL is not "new", remember that article from 2003, but it just hasn't previously been used in a commercial game. One of the big benefits of DL is that you can do expensive next-gen effects and shaders, soft shadows, and tons of lights with a relatively small performance penalty. Who knows how the game would have looked if Grin had had the same funding (and dev manpower) as the 360 version? We still wouldn't have DL+AA, but most likely an additional, decent-looking rendering path that would support AA. Anyway, of course this is just my personal speculation and opinion. :)


Well, as for it not being a performance penalty, it most certainly is, as it requires a lot of frame buffer, which current 512MB cards "can" handle, but those aren't "mainstream" cards just yet (they're getting there), and that's the reason no one up until now has used it (it was originally thought up back in 1988). But then you bring up the point again that the UE3 engine is scalable, which really is my main point about all this: this game has very little scalability. I'm hoping this upcoming AA feature is an attempt at making this game more scalable and at least less of a hit on current cards, including the highest end.
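To put a rough number on the "lot of frame buffer" point, here is some back-of-the-envelope arithmetic. The render-target formats are assumptions picked for illustration, not GRAW's actual G-buffer layout.

```python
# Rough G-buffer memory footprint at a common 2006-era resolution.
width, height = 1280, 1024
layers = {                                  # assumed MRT layout, bytes per pixel
    "albedo + material id (RGBA8)":  4,
    "world normal (RGBA16F)":        8,
    "position / depth (RGBA16F)":    8,
    "misc parameters (RGBA8)":       4,
    "depth-stencil (D24S8)":         4,
}
total = sum(layers.values()) * width * height
print(f"G-buffer at {width}x{height}: {total / 2**20:.1f} MiB")   # ~35 MiB
# ...before the back buffer, textures or any AA samples; a 2x2 supersampled
# version would quadruple it, which is one reason AA gets so expensive here.
```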


The lighting used wasn't right for this type of game IMO. In tactical shooters AA makes a lot of difference, especially for snipers.

But talking about realism, it's realistic lighting, photo-realistic.

Apparently you've never seen Valve's The Lost Coast; now that's realistic lighting, and with AA filtering it runs circles around GRAW's HDR. Whenever you get a chance, set it to 4x AA and 16x AF and you will see that Valve's Source engine has a superior lighting system, and it runs smooth too.

Edited by UberSoldier

Well, for Lost Coast that is still only SM 2.0 HDR, though I'm sure the upcoming HL2 Episode 1 will most likely use SM 3.0 HDR but will still have the SM 2.0 path for older cards (I'll let you know June 1st when I can play it). But again, it's not the HDR that's killing the AA, it's the deferred shading.

Edited by LT.INSTG8R

Can someone from GRIN comment on this? How are you guys going to do this? :huh:

Good point Reb, beats a lot of speculation on our part.

I wonder what would have happened if Ubi had told Grin to use another game engine, like Crytek's CryEngine. With some modifications this would have been ideal.

CryENGINE is a game engine used for the first-person shooter computer game Far Cry. It was originally developed by Crytek as a technology demo for nVidia and, when the company saw its potential, it was turned into a game.

When video cards with support for 2.0 pixel and vertex shaders were released, Crytek released version 1.2 of the engine which used some of the capabilities for better graphics.

Later the company developed version 1.3, which had support for even longer 2.0b and 3.0 version shader instructions. In cooperation with ATI it created a machinima demo project called "The Project", loosely based on the world and levels of Far Cry.

The demo featured some of the most realistic real-time graphics to date, compared with ATI's previous demo, Doublecross. This technology will be used by Crytek for their next game, titled Crysis.

It also supports HDR.


It's the same technique that Epic is using for Gears of War, which <ding> <dong> <ding> won't have any AA on the X360 either! (Yes, Tim Sweeney confirmed that here > http://www.nvnews.net/vbulletin/showpost.p...675&postcount=1 ) The only way to get rid of the jaggies before DR-AA capable hardware is released is to run the game at a higher res, push the screen further away, or put some butter on the screen.

I don't see that quote in the transcript; all future games on X360 are using 4xAA and full HDR. The 360 GPU has extra eDRAM logic just for that, and its shaders are DX10 compliant.

edit: check his armour

gow29ce.jpg

gow16rg.jpg

Edited by Lysander

Lysander, at the bottom of the transcript it says:

One more question that was answered via email.

Jacob- Will UE3.0 support predicated tiling to make use of 4xAA on Xbox 360?

Sweeney- Gears of War runs natively at 1280x720p without multisampling. MSAA performance doesn't scale well to next-generation deferred rendering techniques, which UE3 uses extensively for shadowing, particle systems, and fog.

Don't believe the Microsoft BS about DX10 and AA on the 360. AA works only if the data can fit into the 10MB buffer; if it can't, the engine has to support tiling to be able to support AA, and if the engine can't do that, well, then you are out of luck. And that's just one reason; as you can see from Tim's comment, there are other reasons that come into play as well when a game or engine lacks AA.

And the 360 can't be DX10 because DX10 wasn't finalized by the time the hardware was finalized. The 360 is DX9+; how different the "+" is compared to DX9c on PC, well, I have no idea, but it isn't too big.
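The 10MB point is easy to sanity-check with quick arithmetic, assuming 32-bit colour plus 32-bit depth/stencil per sample (a common setup, not a statement about any particular 360 title):

```python
# Does a 1280x720 framebuffer fit in the 360's 10MB eDRAM at various AA levels?
width, height, bytes_color, bytes_depth = 1280, 720, 4, 4
edram = 10 * 2**20

def framebuffer_bytes(samples):
    return width * height * samples * (bytes_color + bytes_depth)

for samples in (1, 2, 4):
    size = framebuffer_bytes(samples)
    verdict = "fits" if size <= edram else "needs tiling"
    print(f"{samples}xAA: {size / 2**20:.1f} MiB -> {verdict}")
# 1x fits (~7 MiB); 2x and 4x overflow, so the engine has to support tiling.
```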

Edited by agentkay

You see, you might not have jaggies anymore, but you would lose a lot of shaders and eye-candy along the way. The game might look more like GR1 (uniform lighting, lack of special effects). Would this make people happy? Honestly, I don't think so. Grin would be flamed to hell and back for releasing a game that looks like games did back in 2001.

No it wouldn't. You can still achieve all those things by other means. You can get all those nifty shaders and eye-candy and special effects without any form of deferred lighting. The dynamic shadows from the trees would probably be the only thing you'd give up.

I can't wait for the first (PC) game with the Unreal Engine 3 to see Epic's approach on this. Will deferred lighting be the one and only rendering path in UE3? I wish Tim Sweeney would answer this one for us.

My prediction: there will be a "low-end" non-deferred lighting rendering path in UE3, it will be based on the UE2 path and it will work with AA. To be able to enjoy all effects and the full beauty of UE3, you will have to switch to the deferred lighting path and live with the fact that you can't have AA on current hardware.

UE3 approaches things much differently, and the developer can use deferred techniques, static light maps (the current standard technique), vertex lighting (still very useful), as well as any combination of those they see fit. There are no paths per se, but rather different light types and mesh settings that do different things.

-John


You see, you might not have jaggies anymore, but you would lose a lot of shaders and eye-candy along the way. The game might look more like GR1 (uniform lighting, lack of special effects). Would this make people happy? Honestly, I don't think so. Grin would be flamed to hell and back for releasing a game that looks like games did back in 2001.

No it wouldn't. You can still achieve all those things by other means. You can get all those nifty shaders and eye-candy and special effects without any form of deferred lighting. The dynamic shadows from the trees would probably be the only thing you'd give up.

But it would take a lot of work to make it look good, wouldn't it? I mean more work than you would spend on an average patch. I'm talking here about the current case as it stands with GRAW PC and its engine.

I can't wait for the first (PC) game with the Unreal Engine 3 to see Epic's approach on this. Will deferred lighting be the one and only rendering path in UE3? I wish Tim Sweeney would answer this one for us.

My prediction: there will be a "low-end" non-deferred lighting rendering path in UE3, it will be based on the UE2 path and it will work with AA. To be able to enjoy all effects and the full beauty of UE3, you will have to switch to the deferred lighting path and live with the fact that you can't have AA on current hardware.

UE3 approaches things much differently, and the developer can use deferred techniques, static light maps (the current standard technique), vertex lighting (still very useful), as well as any combination of those they see fit. There are no paths per se, but rather different light types and mesh settings that do different things.

-John

Edited by agentkay

Here is an excerpt from correspondence with the lead engine programmer of a very well-known engine... names and specific references removed.

There are several problems with deferred lighting/shading techniques that caused us to rule them out for XXXXXX:

1. They require buffering up all of the material parameters into render targets (using DirectX9's MRT support), and processing the results in frame buffer passes per lightsource, for 2X or more pixels more than would be covered by rerendering objects for light/object interaction pairs (because of background objects in the regions affected by a light). This uses an extreme amount of memory bandwidth, and in Andrew's PC-based experiments was slower and less flexible than the pass-per-lightsource approach. On consoles, the management of the memory footprint of this buffer could be a limiting factor.

2. They require using a uniform material model for all objects in the scene (or expensive techniques to dynamically branch and choose a material model for each pixel covered by a lightsource, requiring a combinatorial explosion in shader complexity). Though we happen to only have one top-level material class at this point, we're expecting lots of games to define new ones, for new lighting models (anisotropic lighting, cel shading, etc) and to intermix different material models freely.

The big argument for deferred shading is that it only requires one rendering pass over each triangle in a scene, as opposed to one pass for each lightsource. But this is valuable only if triangle rates are a limiting factor for a particular game, and in our content so far, triangle throughput isn't a limiting factor, and probably won't be in the XXXX timeframe since memory (video memory on PCs or total memory on consoles) constrains triangle counts short of the hardware's theoretical throughput.
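Point 2 is easier to see with a small sketch: the deferred lighting pass is a single full-screen shader that somehow has to apply a different material model to each pixel, either by branching on a material id (as below) or by compiling a combinatorial pile of shader variants. The material models here are invented placeholders, not anyone's real shaders.

```python
# Per-pixel material-model selection in a deferred lighting pass (toy version).

def shade_phong(px, light):
    return ("phong", px["albedo"], light)

def shade_anisotropic(px, light):
    return ("aniso", px["albedo"], light)

def shade_cel(px, light):
    return ("cel", px["albedo"], light)

MATERIAL_MODELS = {0: shade_phong, 1: shade_anisotropic, 2: shade_cel}

def deferred_light_pass(gbuffer_pixels, light):
    # One dynamic branch per covered pixel; on 2006-era GPUs this branching
    # (or the alternative explosion of pre-built shader variants) is exactly
    # the cost the email above is complaining about.
    return [MATERIAL_MODELS[px["material_id"]](px, light) for px in gbuffer_pixels]

# Minimal usage with invented data:
pixels = [{"material_id": 0, "albedo": (1, 0, 0)}, {"material_id": 2, "albedo": (0, 1, 0)}]
print(deferred_light_pass(pixels, light="sun"))
```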

There are a few specific deferred lighting techniques still used, but it does not use a full-on deferred lighting solution.

Just some info to help those confused by the what, where, why and how of this stuff.

-John


Lysander, at the bottom of the transcript it says:

One more question that was answered via email.

Jacob- Will UE3.0 support predicated tiling to make use of 4xAA on Xbox 360?

Sweeney- Gears of War runs natively at 1280x720p without multisampling. MSAA performance doesn't scale well to next-generation deferred rendering techniques, which UE3 uses extensively for shadowing, particle systems, and fog.

Don't believe the Microsoft BS about DX10 and AA on the 360. AA works only if the data can fit into the 10MB buffer; if it can't, the engine has to support tiling to be able to support AA, and if the engine can't do that, well, then you are out of luck. And that's just one reason; as you can see from Tim's comment, there are other reasons that come into play as well when a game or engine lacks AA.

And the 360 can't be DX10 because DX10 wasn't finalized by the time the hardware was finalized. The 360 is DX9+; how different the "+" is compared to DX9c on PC, well, I have no idea, but it isn't too big.

The engine is also used by Huxley and Too Human, and both will use AA. I think Tim and Mark are full of trash. Listen, hundreds of people played Gears of War MP at E3. No one said anything about jaggies.

Something is odd though: since MSAA on the 360 is not performed in the shaders anyway but on a separate chip, why would it interact with shader techniques? But natively (without going through the eDRAM chip) the res is 720p without AA, that's logical.

Regarding DX10, there is this from ATI which implies as much:

Guru3D: Looking at XBOX360, most of our readers obviously know that the graphics solution is provided by ATI; an architecture close to the R580 has been used. Can we expect the level of support for niche quality features as we see them on the PC consumer card platform as well? In other words, what level of support for eye candy like AA and HDR will we see for XBOX 360 titles now and in the near future?

Neal: As a point of clarification, the graphics system that we provided for the Xbox 360 is well beyond the R580. With that in mind, I think you can definitely expect support for amazing graphics features in Xbox 360 titles well into the future. That’s one of the best parts of having a console for several years. Each Christmas, we typically see a new "benchmark" for the graphic quality of console titles. As developers take better advantage of the graphics core inside the 360, the gaming experiences will be better.

Guru3D: Looking at eye-candy in the likes of HDR, what's the next big thing for ATI to introduce technology wise? What will be the next focus? Graphics cards are getting closer and closer towards a real cinematic experience rendered in real-time. We have seen dramatic improvements when shaders were introduced and now followed by HDR. A nice example is the Crytek 2 engine which obviously shows an incredible graphics experience. What's next, what do we need besides raw computational power?

Neal: I think that raw computational power is essential – but unless that is paired with a very high standard of visual quality, then that power is squandered. I think HDR + AA is a great example of that. The amount of work put into ATI’s Avivo display technologies is proof that ATI is concerned about the quality of these graphics features.

As I look at the not-too-distant future with DX10 around the corner, I see features there that excite me – more complex shaders, geometry shaders - water, fog, smoke, or dust that interacts with environment geometry. Now you’ve really got me started ; ) With the power behind ATI graphics cards, you can use instancing to animate thousands of individual characters with a variety of textures, animations, and geometry. Boy, this list is starting to get long…

From a technology perspective, one of the most amazing features of the Xbox 360 core that I’m looking at now – and looking forward to - is the Unified Shader Architecture (USA). In this architecture, there aren’t dedicated vertex and pixel shader engines, but a unified shader engine capable of handling both types of instructions. The possibilities there are only now being demonstrated.

link

Edited by Lysander
