Ghost Recon.net Forums

Recommended Posts

So,

I'm confused about a small issue with my system - and hope someone can shed some light on it.

Basically I have been spending most of my gaming time of late playing the Joint Operations demo - and have begun to upgrade my system in preparation for the full release in a couple of weeks. I had noticed that for some reason my display performance was pretty sketchy - frame rates were very low and the game got very, VERY jerky at busy moments, making it both frustrating and difficult to play competitively.

I never thought too much about it, blaming the fact that my GPU is pretty basic (5200 FX), and tried to soldier on with medium graphics settings, shadows off, etc.

It wasn't until my bro (Potshot) visited and remarked on how lousy the performance was - so much so that he simply refused to play it, claiming it was 'impossible' when it was that jerky - that I got confused. You see, his system at the time was virtually identical to mine: same mobo, same CPU, and only the graphics card was different - he had a 440MX/64Mb.

The following week I was over at his place - and saw his JO in action. Silky smooth!?!

His settings were exactly the same (except Water had been forced to Low - no other option available), yet the performance was hugely improved over my rig. What makes it doubly confusing is that by this time I had upgraded my mobo to an Asus A7N8X-E Deluxe, upgraded to 1Gb of 333 RAM, and fitted a new CPU (AMD 3000+) - yet my machine still struggled with JO.

In other tests - including GR itself - it was great... better than my bro's, getting good fps at high levels of detail.

Then I realised that JO was probably the only game I was playing that was written with DX9.0 in mind... and began to wonder. I had heard that the 5200 FX was a bit of a doorstop when it came to DX9 and wasn't all it was cracked up to be - even for a low-end budget job.

So I nicked an old 440 MX off a mate who wasn't using his machine... and bunged it in my rig tonight... and guess what?

JO runs like a belter - super smooth in comparison to what I've been struggling with for the past 2 months!!! :wall::wall::wall:

What's going on here? I did notice that the MX does NOT support pixel shading (in the diagnostic screen of JO as it loads for the first time after a display change) whereas the FX does... is it just a case of the 5200 being unable to do some of the stuff that it TRIES to do and falling over, while the MX keeps things simple and does it fine?

Why is the 440MX so much better than the 5200 FX when it comes to me and JO??

What's the mystery??

Either way - I have a 9800XT on the way soon, so not too fussed - but would like to know.

My head hurts.

:wacko::stupid:


The FX5200 is a horrible card. It's meant as a replacement for the MX440.

I suspect the reason your performance was so much poorer on the FX is that the MX doesn't support the advanced features the 5200 does. The MX just goes on its merry way, ignoring all the DX9 calls, while the FX attempts to render them. So when the cards are about equal in general performance, but one is displaying a lot more 'detail', there's your difference in actual performance.


Seems very strange - I thought that FXs were better than MXs, but they were all basically crap. It probably is that the graphics cards don't support something that you need - maybe not just DX but other things too.

Anyway, you should have NO trouble with your 9800XT cos that is an awesome card and imho a very good choice if you can afford its high price tag - which is the main reason I don't have one and I'm gunna have to settle for a 9800 PRO for my new rig. :( Still good tho.

If you're upgrading everything else, why not just get a R9800 also?

Probably mainly cos they cost a BOMB!

Edited by [TCS]BlackMamba

Dannik - cheers - I suspected that was the deal. Will keep the MX in until my 9800 materialises.

:thumbsup::thumbsup:


Cheers Mamba - I'm hoping it will be an XT - though that depends on how generous (or devious) my local PC shop pal will be....might need to negotiate a Pro instead...but hey.

:rolleyes:


I need to start reading things more. :blink:

How much are you paying for your 9800XT? I find that an X800 Pro is only around $30 more on average, yet the X800 is far more advanced.

Hopefully zip.

I haven't paid for a computer component for some time.

A friendly computer shop owner (and wannabe musician) quite often 'wins' competitions in the music magazine that I edit... call it a swap shop... :shifty:

Why is the 440MX so much better than the 5200 FX when it comes to me and JO??

What's the mystery??

Here's the deal, Sync:

The GeForce 4 MX 440 is basically an upgraded GF2 MX; both "MX" cards are DirectX 7 only. These cards run DX7 games really well, but cannot run DX9 features. So if you are playing a DX9 game, as Dannik said, the card will simply skip executing all the DX9 feature calls.
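Roughly the idea, sketched in Python (hypothetical function and logic for illustration - not the actual JO engine code):

```python
def choose_render_path(pixel_shader_version: float) -> str:
    """Pick the highest feature path a card's reported shader support allows.

    A simplified sketch of engine fallback logic, assuming the engine keys
    off the pixel shader version the driver reports.
    """
    if pixel_shader_version >= 2.0:
        return "dx9"  # FX 5200: reports PS 2.0, so the expensive effects run
    if pixel_shader_version >= 1.1:
        return "dx8"  # GF3 / GF4 Ti territory
    return "dx7"      # GF2/4 MX: no pixel shaders, DX9 effect calls are skipped

print(choose_render_path(2.0))  # FX 5200 takes the slow DX9 path
print(choose_render_path(0.0))  # MX 440 stays on the cheap DX7 path
```

So both cards "work" - but the FX volunteers for work it is too slow to actually do.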

Note: The GF3 and the GF4 "Titanium" series (4200, 4600) are DX8 cards. Nvidia has been accused of misleading people by naming an upgraded GF2 MX the "GF4 MX". :wacko:

The GeForce FX 5200 is a DirectX 9 card, which means it can theoretically do fancy stuff like Vertex/Pixel Shaders 2.0 - but it does so at the absolute lowest spec, which results in pretty miserable performance.

The GeForce FX 5200 can also run DX7 games, but it is well known that it actually performs worse in DX7 games than a GF2/4 MX. Why? Because the 5200 has a different architecture from the GF2/4 MX - one which lets it run DX9, but at the expense of good DX7 performance. The GF2/4 MX was engineered just for DX7, so there is no "overhead". :ph34r:

Several games now have graphics options that let you turn off DX9 features to hopefully speed up the frame rate (Halo PC, for example), but even that won't let the 5200 catch up to the GF4 MX.
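A toggle like that amounts to overriding the detected path - something like this (again a hypothetical sketch, not any real game's settings code):

```python
def effective_render_path(detected: str, disable_dx9_effects: bool) -> str:
    """Apply a user's 'turn off DX9 effects' toggle on top of hardware detection.

    Hypothetical example: the names and the dx9 -> dx8 fallback step are
    assumptions for illustration only.
    """
    if detected == "dx9" and disable_dx9_effects:
        return "dx8"  # keep DX8-level shaders, drop the costly PS 2.0 effects
    return detected   # the toggle changes nothing on DX7/DX8 hardware

print(effective_render_path("dx9", True))   # FX 5200 with effects turned off
print(effective_render_path("dx7", True))   # MX 440: nothing to turn off
```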

Most dev studios making DX8/DX9 games these days make sure their game supports DX7, because GeForce 2/4 MX cards currently have the largest installed base.

There may come a time - say 2 years from now, when the next-gen Windows (Longhorn) makes DX9 necessary even to run your desktop (!) - when GF FX 5200 cards make up the largest installed base (even though they perform poorly, they have sold like hot cakes for Nvidia!). So it is possible that future games may REFUSE to run on a GF2/4 MX. :blink:

Does that answer your question?

