Ghost Recon.net Forums

nasa15

Members
  • Content count

    25
  • Joined

  • Last visited

Community Reputation

0 Neutral

About nasa15

  • Rank
    Recruit - 3rd Class

Profile Information

  • Location
    Marietta, GA

  1. Also, to add to the performance-hit comment: the PhysX card isn't magically killing performance by itself. The problem is that the PhysX hardware waits for the physical event to be resolved by the already-in-place Havok software, then just adds its effects on top of that. So you have a software step that has to run on the main system processor first, and then PhysX kicks in and adds more effects. If GRAW used nothing but the Ageia PhysX software in combination with the hardware, there would be virtually no performance hit at all. (A toy model of this is sketched after this list.)
  2. nasa15

    Are you buying a PPU?

    GRAW supports it, but there seem to be performance issues with it right now.
  3. Ahem. GRAW doesn't use bump maps, it uses normal maps. A bump map is usually a grayscale version of the color texture, or something resembling a height map; either way it is devoid of color. Normal maps are tri-colored textures (red, green, and blue, with each channel representing a direction in 3D space) which describe the orientation of each pixel on the surface. Bump maps are grayscale and contain less lighting detail than normal maps of the same resolution. (See the decoding sketch after this list.)
  4. nasa15

    Post_effect_quality

    Deferred lighting (google it) is the reason why AA does not work with the game, on any hardware. There's no way to do lighting "some other way" without completely rewriting the game's renderer. (A minimal sketch of such a lighting pass follows this list.)
  5. nasa15

    Not hearing your own gun?

    Dude, the game's sounds kick so much ass! I've got the volume cranked on my 200-watt THX-certified setup... my ears are in heaven. It's the first time a game has made me twitch when a ricochet hit the wall behind me.
  6. Doom 3 has physics. Fairly good physics, considering it was coded in-house. Knowledge is awesome.
  7. Whoever was responsible for coding the renderer for the game at GRIN has probably already considered this, but I figured I'd post it anyway. As I understand it, deferred shading (or deferred lighting) computes lighting effects as a post-processing step... it all happens in the frame buffer. The scene's geometry is rendered once, the normals for those objects are stored in a floating-point buffer, the light sources are applied to that buffer, and after a whole slew of other wonderful technical things you end up with complex lighting and shading at less of a performance cost than you would get with "normal" lighting. GSC, the team behind STALKER, knew that AA would not be possible when running the game in DX9 mode, so they incorporated an edge-detect filter pass into their scene; that gave them the ability to blur the detected edges, creating a smooth result not unlike what could be had with normal AA. (A rough stand-in for that filter is sketched after this list.) I'm curious what the developers' thoughts are on this!
  8. 5 - 10 years still, and that's a very optimistic prediction. ← Are you kidding? 5 years would be an "optimistic" estimate, at the very most! Look at where PC graphics have come from over the past 5 years, then multiply that advancement by three or four, and that's where we'll be in another 5: real-time global illumination, light bouncing (both of which can be faked pretty well already), and many, many other features that are currently "offline" only, meaning they have to be rendered out to still frames or video to be observed.

    edit: to add to my point, Return to Castle Wolfenstein was voted as having the best graphics of the year in 2001, based on a Gamespot survey. (Whether or not Gamespot is qualified as the only source for this info doesn't matter; they're popular enough.) This is what it looked like: http://img.gamespot.com/gamespot/images/20...w_screen002.jpg Flash forward 5 years to the present, to STALKER: Shadow of Chernobyl, and this is what you have: http://stalker.myexp.de/ger/community/scre...alker_dx9_1.jpg

    Games in 5 years should be closely approaching the appearance of reality. Of course, this also means they will take longer and longer to make, because of the complexity of the art assets needed to mimic reality. Either that, or developers will get smart and turn to photo-based texturing and rendering more, but that's a whole other story.
  9. A link to a post I made with 3 screenshots at max detail at 1680x1050 on a 7900GT Superclock from eVGA: http://www.ghostrecon.net/forums/index.php...ndpost&p=361384
  10. Very interesting. Perhaps the demo only contains Low and Medium texture assets, while the retail version will obviously have the additional textures used in High mode. I'm thinking the demo would have been larger if it contained all three sets of texture assets.
  11. Heh, did you only take two guys with you, or did you lose one along the way? ← Ahem, well, one of them sort of... got in the way of my fun with the grenade launcher at the beginning of the mission.
  12. 3 shots I took with the "everything on high and 1680x1050" settings:
  13. I edited the xml render file so that every medium setting reads "high", and upped the resolution to 1680x1050 for my 7900GT; it looks noticeably better than the medium settings. The soldiers look great. No crashing or anything.
  14. nasa15

    Demo Alert!

    To get the game to run at 1680x1050, make the first part of the render.xml file look like this, editing only the resolution and aspect_ratio lines:

    <render_config>
      <d3d_device
        adapter = "NVIDIA GeForce 7900 GT"
        driver = "nv4_disp.dll"
        resolution = "1680 1050"
        windowed = "false"
        refresh_rate = "60"
      />
      <render_settings>
        <variable name="aspect_ratio" value="1.6"/>
  15. As I understand it, the retail package will come with a demo DVD as well as the game CellFactor.
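
A toy model of the layering described in item 1, as a Python sketch. The millisecond figures are invented purely for illustration; the only point is that the effects step runs in series with, rather than instead of, the CPU physics step.

    # Toy frame-time model -- all numbers made up for illustration.
    havok_cpu_ms = 4.0   # gameplay physics resolved in software on the CPU
    ppu_extra_ms = 3.0   # PhysX effects layered on after that result arrives

    # GRAW today: the steps run in series, so their costs add up.
    layered_ms = havok_cpu_ms + ppu_extra_ms       # 7.0 ms per frame

    # Hypothetical all-PhysX path: the CPU step shrinks to handing off work.
    handoff_ms, ppu_all_ms = 0.5, 3.0
    offloaded_ms = handoff_ms + ppu_all_ms         # 3.5 ms per frame

    print(layered_ms, offloaded_ms)                # 7.0 vs. 3.5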
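To make the distinction in item 3 concrete, here is a minimal Python/numpy sketch; the function names are mine, and the inputs are assumed to be 8-bit texture images already loaded as arrays. A normal map stores the full per-pixel direction directly, while a bump/height map only lets you reconstruct a normal from neighboring height differences, which is why it carries less lighting detail at the same resolution.

    import numpy as np

    def decode_normal_map(rgb):
        """RGB normal map: each channel stores one axis of the surface
        normal, remapped from [-1, 1] into [0, 255]."""
        n = rgb.astype(np.float32) / 255.0 * 2.0 - 1.0   # undo the remap
        return n / np.linalg.norm(n, axis=-1, keepdims=True)

    def normals_from_bump_map(height, strength=1.0):
        """Grayscale bump/height map: normals must be reconstructed from
        neighboring height differences, so fine lighting detail is lost."""
        h = height.astype(np.float32) / 255.0
        dx = np.gradient(h, axis=1) * strength           # slope left-to-right
        dy = np.gradient(h, axis=0) * strength           # slope top-to-bottom
        n = np.dstack([-dx, -dy, np.ones_like(h)])
        return n / np.linalg.norm(n, axis=-1, keepdims=True)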
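A minimal sketch of the deferred lighting pass mentioned in items 4 and 7, again in Python/numpy with invented function and parameter names. It assumes a geometry pass has already filled a G-buffer with per-pixel positions and normals; lighting is then computed once per screen pixel. Ordinary MSAA averages samples before a pass like this runs, and an averaged normal or position is a meaningless input, which is why AA can't simply be switched on without reworking the renderer.

    import numpy as np

    def deferred_light_pass(positions, normals, albedo, light_pos, light_color):
        """positions/normals: HxWx3 float G-buffer layers; albedo: HxWx3 color.
        Cost scales with screen resolution and light count, not scene geometry."""
        to_light = light_pos - positions                     # vector to the light
        dist = np.linalg.norm(to_light, axis=-1, keepdims=True)
        ndotl = np.sum(normals * to_light / dist, axis=-1, keepdims=True)
        ndotl = np.clip(ndotl, 0.0, 1.0)                     # Lambert diffuse term
        return albedo * light_color * ndotl / (dist * dist)  # inverse-square falloff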
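And a crude CPU stand-in for the STALKER-style edge-detect pass from item 7. In a real renderer this would be a post-process shader; the function name and threshold here are mine. It finds strong brightness edges in the finished frame and blurs only those pixels, which smooths silhouettes much like ordinary AA would.

    import numpy as np

    def edge_blur_aa(frame, threshold=0.1):
        """frame: HxWx3 float image in [0, 1]."""
        gray = frame.mean(axis=-1)                       # rough luminance
        edges = (np.abs(np.gradient(gray, axis=1)) +
                 np.abs(np.gradient(gray, axis=0))) > threshold
        # 3x3 box blur built from shifted copies of the frame.
        blurred = sum(np.roll(np.roll(frame, i, axis=0), j, axis=1)
                      for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
        out = frame.copy()
        out[edges] = blurred[edges]                      # soften edge pixels only
        return out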