agentkay

Members
  • Posts

    386

  1. Nvidia has sold more than 3 million G8x chips (GeForce 8 series) to board manufacturers in the last 6 months. Sure, it doesn't mean that 3 million end-users have DX10 hardware right now, but it shows at what pace the industry is moving. The PC hardware industry is far larger and generates more profit than the console hardware industry, which actually loses money on most consoles (except maybe the Wii), but that's a whole other topic.

     I think yes, IF you optimized it to death and made it run at 800x600 or something like that, but it surely wouldn't be easy and might take a very long time. Just so you know, there is a reason minimum requirements are made, and they usually target hardware that manages about 20fps at 800x600 on the lowest settings. So yes, it would be possible (IMO), but no, it wouldn't look the same; the more time was available, the better it could be optimized.

     You are saying that 360 games are not optimized? lol No offense, but that is very VERY far from reality. Console games in general are strongly optimized because they don't have the raw power to overcome poor coding or missing optimizations, UNLESS you make a crappy-looking game of course. Console game development is all about dodging limitations (bandwidth, memory, raw power, fillrate, disc size, missing HDD etc.). Why do you think they cap most games at 30fps? Because the console can't produce a higher average framerate, OR it would start to stutter or lock up (run into another hardware bottleneck). And the 30fps cap is just one limit; what about the usually limited level size, heavy LODing, missing filtering etc.? Hardware limits and/or limited development time.

     The reason PS2 games started to look better as time passed is that the PS2 was NOT easy to code for, both on the hardware side BUT ESPECIALLY on the devtools side. Many people in the industry hated coding for it and many had to learn it, but as time went by they got better and managed to squeeze everything out of the hardware. The 360 devtools are far easier to work with and already very effective, so I don't expect 360 games to advance visually the way PS2 games did. You can already see the limit of the hardware, and it's still in its first trimester. lol (joke) Anyway, I'm done with this topic. My opinion is pretty clear now I guess. Enjoy your PC and console games, I know I enjoy mine (especially the ones on my PC)
  2. The 360 hardware is decent for a console, but modern PC hardware runs circles around it and leaves it in the dust. When a game is properly ported from the 360 to the PC, it will run faster, with higher AA/AF and resolution on a modern PC than on the 360. I have a C2D at 3.2GHz and an 8800GTX, and I tested Test Drive Unlimited on both the PC and the 360: on the PC I could play it at 1280x1024 with 16x AF and 16x AA fully maxed out, it never dropped below 50fps, and my CPU cores were running at around 70%. The 360 version runs at 1280x720 with 2x AA (maybe 4x AA, but it looked blurry) and no AF, it is capped at 30fps, and the HDR looked worse as well. Another good port is Oblivion, which runs like a dream on an 8800GTX and looks truly next-gen with Qarl's Texture Pack 3.

     Of course it's not a fair comparison because my PC has far more raw power in almost all areas, but it shows how fast PC hardware caught up with and overtook the 360, and PCs have a much larger overhead to overcome as well (OS, API, poor ports etc.). BTW, you can't compare the CPUs in PCs with the in-order triple-core CPU in the 360. It is not a good general-purpose CPU and badly lags in areas where PC CPUs shine, but it does a good job when code is written for its special architecture. The 360 hardware looks impressive on paper (and its marketing made it out to be more than it is as well), but when you look at the games, which usually have poor AI, simplified physics, limited resolution (1280x720), no filtering, limited AA, small levels, heavy LODing and low-resolution textures (usually hidden with motion blur or depth-of-field effects), it actually shows how weak the system is compared to modern PC hardware, and the 360 is supposed to last another 4 years. Not very impressive to me.

     Sure, I have a slight PC bias and I'm known to be a graphics and picture quality fan, but I do own a 360 and I tried to be as objective as I could. I did want to show that the 360 is not "all that great" as some people make it out to be, and surely not what its marketing claims either, but in the end you get what you pay for. Final words: proper ports from the 360 to PC will run and look better on a modern PC bought in 2006 or later.
  3. I get the same bug with patch 1.16 (it didn't happen before). If I shot the two tanks myself, it obviously didn't crash, but when I tried to move my tanks to the exit, the crash was triggered. It seems to me that once my tanks got within a certain distance of the wrecked enemy tanks, it crashed. AMD 2700, nForce 2, Nvidia 6800 (84.43 drivers), Soundblaster ZS2
  4. The blur shader was the only realistic option in the short time they had. Not only does it work on almost all video cards, the performance hit is also rather small. If Grin had decided to add supersampling AA, it would have worked only on Nvidia cards since SSAA is a hardware feature, but at the same time it costs A LOT of performance (and VRAM IIRC). I'm talking about a performance hit from 40fps to 10fps or even worse, depending on how many edges are present (there is a rough sketch of why supersampling costs that much after this post list). It surely would have looked nice, since SSAA is the best kind of AA when it comes to image quality.
  5. Actually there are beta drivers for the XFi soundcard available at this website: http://www.soundblaster.com/language.asp?s...pport/downloads
  6. "No it wouldn't. You can still achieve all those things by other means. You can get all those nifty shaders and eye-candy and special effects without any form of deferred lighting. The dynamic shadows from the trees would probably be the only thing you gave up." But it would take a lot of work to make it look good, wouldn't it? I mean more work than what you would spend on an average patch. I'm talking here about the current case as it stands with GRAW PC and its engine. "UE3 approaches things much differently and the developer can use deferred techniques, static light maps (the current standard technique), vertex lighting (still very useful) as well as any combination of those they see fit. There are no paths per se, but the use of different light types and mesh settings that do different things. -John"
  7. Lysander, at the bottom of the transcript it says: Don't believe the Microsoft BS about DX10 and AA on the 360. AA works only if the data can fit into the 10MB buffer; if it can't, the engine has to support tiling to be able to do AA, and if the engine can't do that, well, then you are out of luck (there is a quick buffer-size calculation after this post list). And that's just one reason; as you can see from Tim's comment, there are other reasons that come into play as well when a game or engine lacks AA. And the 360 can't be DX10, because DX10 wasn't finalized by the time the hardware was. The 360 is DX9+; how different the "+" is compared to DX9.0c on the PC I have no idea, but it isn't too big.
  8. UE3 is this year with Gears of War, but that one is console-only (for the time being). UT2k7 might make it this holiday season or Q1 2007; either way it will still be DX9, because Vista and DX10 are looking like March 2007 right now, or even later considering how badly Vista Beta 2 runs. Yes, UE3 will support DX10 (confirmed), but only once the hardware AND the software (Vista) are released and available, not any sooner. The min. specs for UE3 are good DX9 cards (minus the Nvidia 5-series) anyway, so the chances are good that we will see UE3 games that are DX9 only and won't get any DX10 goodies unless it's reasonable and the publisher of the game will fund such a decision. I personally think Grin's decision to use DL mainly comes down to the 15 months they were given to deliver the game. DL is not "new" (remember that article from 2003), it just had not previously been used in a commercial game. One of the big benefits of DL is that you can do expensive next-gen effects and shaders, soft shadows, and tons of lights with a relatively small performance penalty. Who knows how the game would have looked if Grin had had the same funding (and dev manpower) as the 360 version? We still wouldn't have DL+AA, but most likely an additional, decent-looking rendering path that would support AA. Anyway, of course this is just my personal speculation and opinion.
  9. Actually, I can confirm two other users who have NO PPU slowdowns. They used GRAW 1.06 with the latest (beta) PhysX drivers AND they both have dual-core AMDs. If you factor in the latest GRAW patch and its PPU optimizations and improvements, it should be safe to say that the performance issue is history, as long as you have a dual-core CPU (just to be on the safe side).
  10. They might be able to add a new (old) rendering path to their engine and use it to support AA. Sounds more practical than ditching DL, or rewriting DL to hell and still having crap AA performance. I bet the earlier versions of their Diesel engine didn't use deferred lighting, but is that code still there, or was it dumped and can't be brought back? No idea. If it is still there, it would still require a hell of a lot of work to make the game look half-decent. You see, you might not have jaggies anymore, but you would lose a lot of shaders and eye-candy along the way. The game might look more like GR1 (uniform lighting, lack of special effects). Would this make people happy? Honestly, I don't think so. Grin would be flamed to hell and back for releasing a game that looks like games did back in 2001. I can't wait for the first (PC) game with Unreal Engine 3 to see Epic's approach to this. Will deferred lighting be the one and only rendering path in UE3? I wish Tim Sweeney would answer this one for us. My prediction: there will be a "low-end" non-deferred lighting rendering path in UE3, it will be based on the UE2 path, and it will work with AA. To be able to enjoy all the effects and the full beauty of UE3, you will have to switch to the deferred lighting path and live with the fact that you can't have AA on current hardware.
  11. "So then it sounds like some sort of supersampling, which renders at higher res then 'converts' to your screen res. But that should be a real framerate killer, shouldn't it?" Which one is supposed to be like supersampling, DR or the heatwave effect? The heatwave effect is NOT a framerate killer IMO; any decent card should be able to handle it with little performance penalty. DR works like this (AFAIK): "This rendering technique allows you to spend expensive pixel shader operations just once per pixel. The scene is rendered into a fat offscreen buffer (XYZ, normal, material ID, mapping coordinates, ...) without applying any shaders, and then a screen-aligned pass applies the correct shader to each pixel (depending on the per-pixel information in the fat buffer). Tim Sweeney said that AA doesn't work well with deferred rendering, and this is logical if you think about it." ^^ That's what I read over at Nvnews. My simplified version: the way I understand it, (almost) everything is rendered in a SINGLE pass, so AA CAN'T be applied (on current hardware) because there are no edges left that can be anti-aliased anymore (there is a rough pseudocode sketch of this fat-buffer idea after this post list). You could add a blur filter, that sounds easy, but to add AA you'd have to rip the first part of the rendering into pieces and rewrite the way the rendering works. However, that doesn't seem practical and would be a real performance killer. ATI admitted that they can't do anything about it with their current hardware, unlike with Oblivion, which works in a totally different way and could be solved by ATI. It will be interesting to see how Unreal Engine 3 deals with deferred rendering. Will it be the standard and only rendering path? Not likely, but then again I highly doubt that AA will work on current hardware in UE3 when it's set to use deferred rendering either. The performance hit for AA plus DR on current hardware must be ridiculous. That's how I understood Tim's comment over at nvnews.net.
  12. Deferred rendering is NOT the heatwave (which seems to be just a shader). It's the whole lighting system. Every pixel in the game is affected by deferred rendering. The game doesn't have the kind of lighting system that other games have, so you can't compare it with Oblivion, for instance. The next game that will use DR will be Gears of War, and it won't have any AA either. I think DR will be used in quite a few Unreal Engine 3 powered games. If the heatwave were the problem, they would have added an option to disable it and enable AA a long, long time ago, long before the game went gold.
  13. Yes, that might be possible. A software AA algorithm might also work, but it would be slow as hell.
  14. It's the same technique that Epic is using for Gears of War, which <ding> <dong> <ding> won't have any AA on the X360 either! (Yes, Tim Sweeney confirmed that here > http://www.nvnews.net/vbulletin/showpost.p...675&postcount=1 ) The only way to get rid of the jaggies before hardware that can do AA with deferred rendering is released is to run the game at a higher res, push the screen further away, or put some butter on the screen.
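
A minimal sketch of the supersampling cost mentioned in post 4, purely as an illustration (the render_scene function is a hypothetical stand-in, not anything from GRAW): 2x2 SSAA shades four times as many pixels as the native frame before averaging them back down, which lines up with the 40fps-to-10fps kind of drop described above.

```python
# Minimal sketch of ordered-grid supersampling (SSAA), assuming a hypothetical
# render_scene() stand-in for the engine's forward renderer.
import numpy as np

def render_scene(width, height):
    """Hypothetical placeholder: returns an RGB image shaded at the given resolution."""
    return np.zeros((height, width, 3), dtype=np.float32)

def render_with_ssaa(width, height, factor=2):
    # Shade the whole frame at (factor*width) x (factor*height)...
    big = render_scene(width * factor, height * factor)
    # ...then box-filter each factor x factor block down to one output pixel.
    big = big.reshape(height, factor, width, factor, 3)
    return big.mean(axis=(1, 3))

# 2x2 SSAA shades 4x as many pixels as the native 1280x1024 frame,
# hence a framerate drop of roughly 40fps -> 10fps when shader-bound.
frame = render_with_ssaa(1280, 1024, factor=2)
print(frame.shape)  # (1024, 1280, 3)
```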
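
A quick back-of-the-envelope calculation for the 10MB buffer constraint from post 7, assuming 4 bytes of color and 4 bytes of depth/stencil per sample (my assumption, not a figure given in the post): a 1280x720 frame fits without AA, but with 2x or 4x MSAA it overflows 10MB and the engine has to render in tiles.

```python
# Rough buffer-size math for a 1280x720 frame against a 10MB limit.
# Assumes 4 bytes color + 4 bytes depth/stencil per sample (illustrative values).
pixels = 1280 * 720
bytes_per_sample = 4 + 4  # color + depth/stencil
for samples in (1, 2, 4):
    size_bytes = pixels * samples * bytes_per_sample
    tiles = -(-size_bytes // (10 * 1024 * 1024))  # ceiling division
    print(f"{samples}x AA: {size_bytes / (1024 * 1024):.1f} MB -> {tiles} tile(s)")
# 1x fits in one tile (~7 MB); 2x needs 2 tiles; 4x needs 3 tiles.
```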
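
And a rough sketch of the "fat buffer" (G-buffer) idea described in post 11, again just an illustration in plain Python rather than GRAW's or UE3's actual code (mesh.rasterize and the shaders table are hypothetical): the geometry pass stores per-pixel attributes with no lighting, and a screen-aligned lighting pass then shades each pixel exactly once from those attributes, which is also why ordinary multisample AA has nothing left to work with by that point.

```python
# Sketch of deferred rendering with a "fat" offscreen buffer; not any engine's real code.
from dataclasses import dataclass

@dataclass
class GBufferTexel:
    position: tuple   # world-space XYZ
    normal: tuple     # surface normal
    material_id: int  # which shader/material to apply later

def geometry_pass(scene, width, height):
    """Pass 1: fill the fat offscreen buffer, no lighting at all."""
    gbuffer = [[None] * width for _ in range(height)]
    for mesh in scene:
        for x, y, texel in mesh.rasterize(width, height):  # hypothetical rasterizer
            gbuffer[y][x] = texel                           # depth test omitted for brevity
    return gbuffer

def lighting_pass(gbuffer, lights, shaders):
    """Pass 2: screen-aligned, shade each pixel once using only the stored attributes."""
    image = []
    for row in gbuffer:
        image.append([
            shaders[t.material_id](t.position, t.normal, lights) if t else (0, 0, 0)
            for t in row
        ])
    return image

# Why MSAA struggles here: by the time lighting runs, the buffer holds one resolved
# sample per pixel and the triangle edges are gone, so there is nothing left for the
# hardware to multisample against.
```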