Posts posted by agentkay

  1. This is the essential part of your entire post. The majority of PC gamers did not buy a new computer in 2006 or 2007, although I would have loved it if they had, as it would make things easier for developers. Like I said in my first post on this subject, only a few percent of PC gamers have such new hardware, and they would of course be able to play it. But that is also why the AVERAGE home PC can't run a game made for the 360.

    Nvidia has sold more than 3 million G8x chips (GeForce 8 series GPUs) to board manufacturers in the last 6 months. Sure, it doesn't mean that 3 million end users have DX10 hardware right now, but it shows the pace at which the industry is moving. ;) The PC hardware industry is far larger and generates more profit than the console hardware industry, which actually takes a loss on most consoles (except maybe the Wii), but that's a whole other topic.

    Could you take the 360 version of GRAW2 and make it run on a P4 3.0GHz with 1GB RAM and a GeForce 7600GS, for example, which is pretty much the midrange gamer PC in use today? I think not. As said before, you can't compare console hardware with the latest PC hardware; of course the PC will always win unless the console is brand new. You have to compare the console with the most common current PC hardware.

    I think yes, IF you optimized it to death and ran it at 800x600 or something like that, but it surely wouldn't be easy and might take a very long time. Just so you know, there is a reason why min. requirements are set, and they usually target hardware that gets around 20fps at 800x600 on the lowest settings. So yes, it would be possible (IMO), but no, it wouldn't look the same; the more time available, the better it could be optimized.

    A reason why some games don't run as well on the 360 as they should is that the developers don't care as much, since the console can get away with less optimized graphics and code than you see in PC games. As long as they stay above 30-40fps it's acceptable, so why put expensive time into optimizing when it's not needed? We saw this very clearly in the last generation of consoles. Take the PS2 for example: they could have made games that looked like the ones released for it last year right from the beginning, as the hardware was there, but they didn't, because the market didn't push developers to optimize their games enough to use the hardware to the maximum. You'll see the same on the 360 as it ages; games will look better and better while still running at about the same level, as developers are forced to optimize and think harder to get the most out of the same hardware. The PC always requires this, as there are constantly users with older hardware who must be able to run the software.

    You are saying that 360 games are not optimized? lol No offense, but that is very, VERY far from reality. Console games in general are heavily optimized, because consoles don't have the raw power to overcome poor coding or missing optimizations, UNLESS you make a crappy-looking game of course. ;) Console game development is all about dodging limitations (bandwidth, memory, raw power, fillrate, disc size, missing HDD, etc.). Why do you think they cap most games at 30fps? Because the console can't produce a higher average framerate, OR it would start to stutter or lock up (run into another hardware bottleneck). And 30fps is just one limit; what about the usually limited level size, heavy LODing, missing filtering, etc.? Hardware limits and/or limited development time. ;)

    The reason PS2 games started to look better as time passed is that the PS2 was NOT easy to code for, both on the hardware side BUT ESPECIALLY on the devtools side. Many people in the industry hated coding for it and many had to learn it, but as time passed they got better and managed to squeeze everything out of the hardware. The 360 devtools are far easier to work with and already very effective. I don't expect 360 games to advance visually the way PS2 games did. You can already see the limit of the hardware, and it's still in its first trimester. lol (joke ;) )

    Anyway, I'm done with this topic. My opinion is pretty clear by now, I guess. ;) Enjoy your PC and console games; I know I enjoy mine (especially the ones on my PC). ;)

  2. The 360 hardware is decent for a console, but modern PC hardware runs circles around it and leaves it in the dust. When a game is properly ported from the 360 to the PC, it will run faster, with higher AA/AF and resolution, on a modern PC than on the 360. I have a C2D at 3.2GHz and an 8800GTX, and I tested Test Drive Unlimited on both the PC and the 360. On the PC I could play it at 1280x1024 with 16xAF and 16xAA, fully maxed out, and it never dropped below 50fps, with my CPU cores running at around 70%. The 360 runs it at 1280x720 with 2xAA (maybe 4xAA, but it looked blurry) and no AF, capped at 30fps, and the HDR looked worse as well.

    Another good port is Oblivion, which runs like a dream on an 8800GTX and looks truly next-gen with Qarl's Texture Pack no. 3.

    Of course it's not a fair comparison, because my PC has far more raw power in almost all areas, but it shows how fast PC hardware caught up to and overtook the 360, and PCs have a much larger overhead to overcome as well (OS, API, poor ports, etc.).

    BTW, you can't compare the CPUs in PCs with the in-order triple-core CPU in the 360. It is not a good general-purpose CPU and lags badly in areas where PC CPUs shine, but it does a good job when code is written for its special architecture.

    The 360 hardware looks impressive on paper (and its marketing made it out to be bigger than it is), but when you look at the games, which usually have poor AI, simplified physics, limited resolution (1280x720), no filtering, limited AA, small levels, heavy LODing, and low-resolution textures (usually hidden with motion blur or depth-of-field effects), it actually shows how weak the system is compared to modern PC hardware, and the 360 is supposed to last another 4 years. Not very impressive to me. ;)

    Sure, I have a slight PC bias and I'm known to be a graphics and picture-quality fan, but I do own a 360 and I tried to be as objective as I could. I did want to show that the 360 is not "all that great" the way some people make it out to be, and certainly not what its marketing claims either, but in the end you get what you pay for. ;)

    Final words: proper ports from the 360 to the PC will run and look better on a modern PC bought in 2006 or later.

  3. I get the same bug with patch 1.16 (it didn't happen before). If I shot the two tanks myself, it obviously didn't crash, but when I tried to move my tanks to the exit, the crash was triggered. It seems that once my tanks got within a certain distance of the wrecked enemy tanks, it crashed. AMD 2700, nForce2, Nvidia 6800 (84.43 drivers), SoundBlaster ZS2

  4. The blur shader was the only realistic option in the short time they had. Not only does it work on almost all video cards, but the performance hit is also rather small. If Grin had decided to add supersampling AA, it would have worked only on Nvidia cards, since forced SSAA is a hardware/driver feature there, and at the same time it costs A LOT of performance (and VRAM, IIRC). I'm talking about a performance hit from 40fps down to 10fps or even worse, depending on the scene. It surely would have looked nice, since SSAA is the best kind of AA when it comes to image quality.
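    For anyone wondering what SSAA actually does: the frame is rendered at a multiple of the screen resolution and then averaged down, so the cost scales with the sample count regardless of content. A minimal sketch of the downsample step (my own illustration, nothing from Grin's code):

        #include <cstdint>
        #include <vector>

        struct Pixel { uint8_t r, g, b; };

        // 2x2 supersampling: the scene was rendered into a buffer twice as
        // wide and twice as tall (4x the pixels, hence roughly 4x the
        // fill-rate and VRAM cost); each output pixel averages 4 samples.
        std::vector<Pixel> downsample2x2(const std::vector<Pixel>& hi,
                                         int outW, int outH) {
            std::vector<Pixel> out(outW * outH);
            const int hiW = outW * 2;
            for (int y = 0; y < outH; ++y)
                for (int x = 0; x < outW; ++x) {
                    int r = 0, g = 0, b = 0;
                    for (int sy = 0; sy < 2; ++sy)      // 2x2 sample grid
                        for (int sx = 0; sx < 2; ++sx) {
                            const Pixel& p = hi[(y*2 + sy) * hiW + (x*2 + sx)];
                            r += p.r; g += p.g; b += p.b;
                        }
                    out[y * outW + x] = { uint8_t(r / 4), uint8_t(g / 4),
                                          uint8_t(b / 4) };
                }
            return out;
        }

        int main() {
            const int W = 2, H = 2;
            std::vector<Pixel> hi(W * 2 * H * 2, Pixel{255, 0, 0});
            return downsample2x2(hi, W, H)[0].r == 255 ? 0 : 1;
        }

    That fill-rate and memory multiplier is the whole reason a 40fps game can drop to 10fps with it enabled.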

  5. You see, you might not have jaggies anymore, but you would lose a lot of shaders and eye candy along the way. The game might look more like GR1 (uniform lighting, lack of special effects). Would this make people happy? Honestly, I don't think so. Grin would be flamed to hell and back for releasing a game that looks like games did back in 2001.

    No, it wouldn't. You can still achieve all those things by other means. You can get all those nifty shaders, eye candy, and special effects without any form of deferred lighting. The dynamic shadows from the trees would probably be the only thing you'd give up.

    But it would take a lot of work to make it look good, wouldn't it? I mean more work than you would spend on an average patch. I'm talking about the current situation with GRAW PC and its engine.

    I can't wait for the first (PC) game on Unreal Engine 3 to see Epic's approach to this. Will deferred lighting be the one and only rendering path in UE3? I wish Tim Sweeney would answer this one for us.

    My prediction: there will be a "low-end" non-deferred rendering path in UE3; it will be based on the UE2 path and it will work with AA. To enjoy all the effects and the full beauty of UE3, you will have to switch to the deferred lighting path and live with the fact that you can't have AA on current hardware.

    UE3 approaches things much differently, and the developer can use deferred techniques, static light maps (the current standard technique), vertex lighting (still very useful), as well as any combination of those they see fit. There are no paths per se, but rather different light types and mesh settings that do different things.

    -John

  6. Lysander, at the bottom of the transcript it says:

    One more question that was answered via email.

    Jacob- Will UE3.0 support predicated tiling to make use of 4xAA on Xbox 360?

    Sweeney- Gears of War runs natively at 1280x720p without multisampling. MSAA performance doesn't scale well to next-generation deferred rendering techniques, which UE3 uses extensively for shadowing, particle systems, and fog.

    Don't believe the Microsoft BS about DX10 and AA on the 360. AA works only if the framebuffer data fits into the 10MB eDRAM buffer; if it can't, the engine has to support tiling in order to do AA, and if the engine can't do that, well, then you are out of luck. And that's just one reason; as you can see from Tim's comment, other factors come into play as well when a game or engine lacks AA.
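    The math is easy to check yourself. A quick sketch of the eDRAM footprint at 720p, assuming the common 32-bit color plus 32-bit depth/stencil per sample (8 bytes; actual render-target formats vary per game):

        #include <cstdio>

        // Back-of-the-envelope eDRAM math for the 360's 10MB framebuffer.
        // Assumes 8 bytes per sample (32-bit color + 32-bit depth/stencil),
        // which is the common case; actual formats vary per game.
        int main() {
            const double edramMB = 10.0;
            const long long pixels = 1280LL * 720;     // native 720p
            for (int samples : {1, 2, 4}) {
                double mb = pixels * samples * 8.0 / (1024 * 1024);
                printf("%dxAA: %5.1f MB -> %s\n", samples, mb,
                       mb <= edramMB ? "fits, no tiling needed"
                                     : "too big, the engine must tile");
            }
            return 0;
        }

    Plain 720p comes out around 7MB and fits; 2xAA (about 14MB) and 4xAA (about 28MB) blow past the 10MB, which is exactly when tiling becomes mandatory.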

    And the 360 can't be DX10, because by the time the hardware was finalized, DX10 wasn't. The 360 is DX9+; how different the "+" is compared to DX9c on the PC I have no idea, but it isn't a big gap.

  7. UE3 ships this year with Gears of War, but that one is console-only (for the time being). UT2k7 might make it this holiday season or Q1 2007; either way it will still be DX9, because Vista and DX10 are looking like March 2007 right now, or even later considering how poorly Vista Beta 2 runs. Yes, UE3 will support DX10 (confirmed), but only once the hardware AND the software (Vista) are released and available, not any sooner. The min. specs for UE3 are decent DX9 cards (minus the Nvidia 5-series) anyway, so chances are good we will see UE3 games that are DX9-only and won't get any DX10 goodies unless it's reasonable and the publisher of the game will fund such a decision.

    I personally think Grin's decision to use DL comes down mainly to the 15 months they were given to deliver the game. DL is not "new" (remember that article from 2003); it just had not previously been used in a commercial game. One of the big benefits of DL is that you can do expensive next-gen effects and shaders, soft shadows, and tons of lights with a relatively small performance penalty. Who knows how the game would have looked if Grin had had the same funding (and dev manpower) as the 360 version? We still wouldn't have DL+AA, but most likely an additional, decent-looking rendering path that supports AA. Anyway, of course this is just my personal speculation and opinion. :)

  8. Actually, I can confirm two other users who have NO PPU slowdowns. They used GRAW 1.06 with the latest (beta) PhysX drivers, AND they both have dual-core AMDs. If you consider the latest GRAW patch and its PPU optimizations and improvements, it should be safe to say that the performance issue is history as long as you have a dual-core CPU (just to be on the safe side).

  9. @agentkay: So would GRIN have to strip down GRAW and redo it all to get rid of deferred lighting and get AA?

    They might be able to add a new (old) rendering path to their engine and use it to support AA. That sounds more practical than ditching DL, or rewriting DL to hell and still ending up with crap AA performance. I bet the earlier versions of their Diesel engine didn't use deferred lighting, but is that code still there, or was it dumped and can't be brought back? No idea. If it is still there, it would require a hell of a lot of work to make the game look half-decent. You see, you might not have jaggies anymore, but you would lose a lot of shaders and eye candy along the way. The game might look more like GR1 (uniform lighting, lack of special effects). Would this make people happy? Honestly, I don't think so. Grin would be flamed to hell and back for releasing a game that looks like games did back in 2001.

    I can't wait for the first (PC) game on Unreal Engine 3 to see Epic's approach to this. Will deferred lighting be the one and only rendering path in UE3? I wish Tim Sweeney would answer this one for us.

    My prediction: there will be a "low-end" non-deferred rendering path in UE3; it will be based on the UE2 path and it will work with AA. To enjoy all the effects and the full beauty of UE3, you will have to switch to the deferred lighting path and live with the fact that you can't have AA on current hardware.

  10. Deferred rendering is NOT the heatwave (which seems to be just a shader). It's the whole lighting system.

    If the heatwave were the problem, they would have added an option to disable it and enable AA a long, long time ago, long before the game went gold. :yes:

    So then it sounds like some sort of supersampling, which renders at a higher res and then "converts" down to your screen res. But that should be a real framerate killer, shouldn't it?

    What does supersampling have to do with DR or the heatwave effect? The heatwave effect is NOT a framerate killer IMO; any decent card should handle it with little performance penalty.

    DR works like this (AFAIK): the technique lets you spend expensive pixel shader operations just once per pixel. The scene is first rendered into a fat offscreen buffer (XYZ position, normal, material ID, mapping coordinates, ...) without applying any shaders; then a screen-aligned pass applies the correct shader to each pixel, based on the per-pixel information in the fat buffer. Tim Sweeney said that AA doesn't work well with deferred rendering, and that is logical if you think about it.

    ^^ That's what I read over at nVNews. My simplified version: the way I understand it, (almost) everything is lit in a single screen-space pass, so AA CAN'T be applied (on current hardware), because by the time lighting runs there are no geometric edges left to antialias. You could add a blur filter, that sounds easy, but to add AA you'd have to rip the first part of the rendering into pieces and rewrite the way the whole pipeline works.
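    To make that concrete, here is a toy CPU-side sketch of the two passes as I understand them (purely my own illustration with made-up names, not GRIN's Diesel code):

        #include <cmath>
        #include <vector>

        struct Vec3 { float x, y, z; };

        // Pass 1 fills one "fat buffer" entry like this per screen pixel;
        // no shading happens yet, just geometry attributes being stored.
        struct GBufferTexel {
            Vec3 worldPos;      // XYZ
            Vec3 normal;        // surface normal
            int  materialId;    // which shader/material to apply later
        };

        struct Light { Vec3 pos; float intensity; };

        // Pass 2: screen-aligned lighting using ONLY the per-pixel data.
        // By now the triangle edges are gone, so there is nothing left for
        // hardware multisampling to work with; that is the AA problem.
        float shadePixel(const GBufferTexel& g, const std::vector<Light>& lights) {
            float lum = 0.0f;
            for (const Light& l : lights) {
                Vec3 d = { l.pos.x - g.worldPos.x,
                           l.pos.y - g.worldPos.y,
                           l.pos.z - g.worldPos.z };
                float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
                float ndotl = (g.normal.x*d.x + g.normal.y*d.y + g.normal.z*d.z) / len;
                if (ndotl > 0.0f)
                    lum += l.intensity * ndotl / (len * len);  // diffuse falloff
            }
            return lum;
        }

        int main() {
            GBufferTexel g{{0, 0, 0}, {0, 1, 0}, 1};        // one pixel
            std::vector<Light> lights{{{0, 2, 0}, 4.0f}};   // one light
            return shadePixel(g, lights) > 0.0f ? 0 : 1;
        }

    Note how the expensive part (the loop over lights) runs once per screen pixel no matter how much geometry was drawn; that is where the "expensive shader operations just once per pixel" saving comes from.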

    However, that doesn't seem practical and would be a real performance killer. ATI admitted that they can't do anything about it with their current hardware, unlike with Oblivion, which works in a totally different way and could be solved on ATI's side.

    It will be interesting to see how Unreal Engine 3 deals with deferred rendering. Will it be the standard and only rendering path? Not likely, but then again I highly doubt AA will work on current hardware in UE3 when it's set to use deferred rendering either. The performance hit for AA plus DR on current hardware must be ridiculous. That's how I understood Tim's comment over at nvnews.net.

  11. Deferred rendering is NOT the heatwave (which seems to be just a shader). It's the whole lighting system. :) Every pixel in the game is affected by deferred rendering. The game doesn't have the kind of lighting system other games have, so you can't compare it with Oblivion, for instance. The next game to use DR will be Gears of War, and it won't have any AA either. I think DR will be used in quite a few Unreal Engine 3 powered games.

    If the heatwave were the problem, they would have added an option to disable it and enable AA a long, long time ago, long before the game went gold. :yes:

  12. I love the game as well! It gets a 9 out of 10 from me. Excellent GFX (lighting, animations, models, etc.), excellent sound, very nice level design (though sometimes a little too linear; a gameplay decision, I guess), good AI (often very good, sometimes frustrating, on average "good"), very nice physics, and lots of love for small details (bullets visible in mags, etc.).

    I would have given it 10/10 if the game had had 2 or 3 points from my "wishlist" post. http://www.ghostrecon.net/forums/index.php...95entry369495

  13. My list:

    1. More GFX settings/customizations.

    2. "Ultra High" Shadow setting that renders the shadows approx. 10m further away and pushes the "Shadow Fade-In" further away. Maybe a similar setting for ground objects/car-window reflections since they fade-in pretty early as well.

    3. Fix the levels (at least two of them) that have very light/barely visible shadows.

    4. One more rifle, preferably an SR25 or another AR10/15 variant

    5. One or two additional weapon mods (an ACOG, for instance)

    6. The EOTech sight mountable on the other rifles, not just the MR-C

    7. An animated bipod (when going prone) for the M99 sniper rifle

    8. More SP modes

    9. An Extreme difficulty mode (preferably with more rebels, instead of rebels with superpowers)

    10. A mod tool that allows creation of new SP missions.

    That's it. I don't play MP, so I don't care about it. :P

  14. lol, that's odd, but... did anyone know that if you get the grenade launcher on any rifle, once you rappel out of the Blackhawk you can blow it up with a nade? rofl

    No, I didn't manage that, but I accidentally killed one of my Ghosts today with the GL. I wanted to shoot a truck approx. 40m away with a nade, and one of my Ghosts was standing maybe 15m in front of me on the left side of my FOV. I fired, the nade accidentally hit a lamp post, didn't explode on that impact because it was too close, but bounced off the post and right onto my Ghost. Boom. :rofl: Totally amazing because the angle of the shot and the bounce were spot on, and hilarious because it was totally unexpected. :D

  15. Let's not forget non-traditional approaches to AA as well. Who says they all fail with a DL engine? You can do AA in shaders, for instance, or as a post-processing effect. It might be very expensive right now, but next-gen video cards would probably laugh at it.
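    For example, a naive shader-style approach: detect high-contrast pixels by luminance and smooth only those. This is purely my illustration of the idea, not GRAW's actual blur shader, but it shows why a post-process costs just one extra pass over the frame instead of re-rendering anything:

        #include <cmath>
        #include <cstdint>
        #include <vector>

        static float luma(uint8_t r, uint8_t g, uint8_t b) {
            return 0.299f * r + 0.587f * g + 0.114f * b;
        }

        // Naive post-process "AA" over a packed RGB framebuffer: compare
        // each pixel's luminance against its right and bottom neighbors,
        // and average only the pixels that sit on a contrast edge.
        void edgeBlur(std::vector<uint8_t>& rgb, int w, int h, float threshold) {
            const std::vector<uint8_t> src = rgb;       // read from a copy
            for (int y = 0; y < h - 1; ++y)
                for (int x = 0; x < w - 1; ++x) {
                    const uint8_t* c = &src[(y * w + x) * 3];
                    const uint8_t* r = &src[(y * w + x + 1) * 3];
                    const uint8_t* d = &src[((y + 1) * w + x) * 3];
                    float l = luma(c[0], c[1], c[2]);
                    bool edge =
                        std::fabs(l - luma(r[0], r[1], r[2])) > threshold ||
                        std::fabs(l - luma(d[0], d[1], d[2])) > threshold;
                    if (!edge) continue;                // leave flat areas alone
                    for (int ch = 0; ch < 3; ++ch)      // blend the 3 taps
                        rgb[(y * w + x) * 3 + ch] =
                            uint8_t((c[ch] + r[ch] + d[ch]) / 3);
                }
        }

        int main() {
            std::vector<uint8_t> img(8 * 8 * 3, 0);
            img[0] = img[1] = img[2] = 255;             // one bright "jaggy"
            edgeBlur(img, 8, 8, 32.0f);
            return img[0] < 255 ? 0 : 1;                // the edge got softened
        }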

    I really would love to have a mature discussion with Grin's engine guys about their take on DL and why they chose this tech (the first time DL has been used in a game, AFAIK). I have no doubt they have good reasons, and I'd love to hear them. I'm sure I'm not the only one. :)

  16. Considering the article is pretty old (July 2003) and even the best high-end cards from back then are surpassed by today's low-end cards, I find its conclusion paragraph pretty interesting:

    Deferred lighting is now possible in hardware on the latest video cards. It has its own unique advantages and disadvantages, and while in many cases the disadvantages outweigh the advantages on current hardware, in the right situations it can outperform and look better than conventional per-pixel lighting. In situations with complex procedural shaders and many lights and shadows the savings can be massive, and this is even truer in densely occluded environments. The same techniques used to accelerate objects (batching etc.) can be used to accelerate lights, and if a light isn't visible its cost is very low.

    In the long term, the occlusion and geometric properties of deferred lighting are its most interesting factors; no other technique allows you to render so many lights affecting a single surface without crippling performance. Deferred lighting simply scales much better.

    ****

    What does that tell me? GRAW's engine has quite some headroom for optimization because it is fairly "young", and while DL has some disadvantages, it has major advantages as well. Lights and (soft) shadows are less expensive than in traditional implementations. But I do personally agree that Grin should have included a traditional static lighting system that supports AA; either there wasn't enough development time for something like that, or the engine doesn't support two different lighting techniques in the same build.
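    Here is a rough cost model (my own simplification, with made-up overdraw and coverage numbers) of why light count scales so much better with DL: a forward renderer re-shades every mesh fragment once per light, while DL pays a fixed G-buffer fill and then only the screen pixels each light actually covers:

        #include <cstdio>

        int main() {
            const long long screenPixels  = 1280LL * 720;
            const long long meshFragments = 3 * screenPixels;  // ~3x overdraw (assumed)
            const int       lights        = 50;
            const long long lightCoverage = screenPixels / 20; // small volumes (assumed)

            // Forward: every fragment is shaded once per light.
            long long forwardCost = meshFragments * lights;

            // Deferred: fill the G-buffer once, then shade only the
            // pixels inside each light's screen-space volume.
            long long deferredCost = screenPixels + (long long)lights * lightCoverage;

            printf("forward : %lld shaded fragments\n", forwardCost);
            printf("deferred: %lld shaded fragments\n", deferredCost);
            return 0;
        }

    With those (assumed) numbers the forward path shades about 40x more fragments, and the gap only grows as you add lights; that matches the article's point about many lights on a single surface.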
