
GeForce FX


snakebite1967


Official GeForce FX Specs - 1/24/03 7:11 pm - By: Ben - Source: NVIDIA

Brian Burke has just passed along the official specs for the GeForce FX 5800 Ultra and the GeForce FX 5800.

GeForce FX 5800 Ultra

- 500 MHz core
- 1 GHz memory clock
- $399 preorders

GeForce FX 5800

- 400 MHz core
- 800 MHz memory clock
- lower price point

You can infer from the release of the information that we're close to seeing real reviews.

So they are going to release two cards to start with. That price being US currency, I wonder what it will be in Canadian dollars and how it will affect the other cards. Also, an interesting thing is happening on video card websites: all the new 8x AGP cards with different overclock rates based on the 4200 GPU are springing up, and there are rumours that the 4600 and maybe the 4400 GF4 cards will be phased out by year end. NVIDIA is concentrating on its main sellers, i.e. the 4200-series cards; the 4400 has not been selling well and the 4600 only moderately better, hence the reasoning behind the new revamped 4200 series, with everything from revamped copper heatsinks to 4600 boards running tweaked 4200 chips.

I think NVIDIA does not want to drop the prices on its flagship cards, so they will stop making them and instead offer 4200 cards in various states of tune and cooling that might even rival the 4400 or 4600 in power at an affordable price.


Does it have the 128-bit or 256-bit memory controller though? If it's 128, I'll put my money on ATI.

Exactly what I was thinking... and I was expecting a little more in price, perhaps around $450 to $500. But I'm still curious about ATI's secret project. Who knows, maybe the next card I get will be an ATI. :blink:


I have a very interesting article on ATI's new RAM and the speeds that are due out very soon; it's gonna be a hot summer, that's for sure. I also think that the FX range is not all that it will be; I'm sure NVIDIA has held something back just to see what the new ATI card is going to offer. The prices for sure are lower than expected, and with their plans to do away with the 4400 and 4600 (though not totally confirmed at this point), it would make sense that they will replace that line-up with something. Maybe the next FX cards will be a mid-level FX; who knows, only time will tell.

Another funny thing: the new FX cooling system appears to be a complete copy of the new Albatron Turbo 4200 cooling system, almost identical, and that 4200 clocks way over 4600 speeds and runs stable. I'm not sure what NVIDIA is up to here, but they are sure pushing 8x AGP on cards already available, which to me suggests they are trying to keep customers from jumping ship. The first 4800-series (4200) cards with bright new copper heatsinks have gone up on all the major card sites, though they were available last month. Also, MSI and a few others are offering retailers the chance to win 5 FX cards if they preorder the FX line. They certainly seem to be gearing up for something, though whether it's an attempt to hold the market or a complete product overhaul I'm not sure.

Another weird thing is that all the benchmarks I've seen have been done with non-production cards and beta drivers, with strict agreements enforced by NVIDIA to test only in a certain way. To me this sounds fishy this late in the game, unless NVIDIA is keeping something secret.

Sorry for the long post guys, I just love video cards.

Your opinions, plz.


ABIT GeForce4 TI 4200 OTES

Abit has always been known for pushing the envelope. A favorite among overclocking enthusiasts, Abit has established a reputation for bringing new features and functionality to the table to allow the consumer to squeeze every last drop of performance out of their products. In the same tradition, Abit has developed the OTES, or Outside Thermal Exhaust System. Seemingly nothing more than a fan and an enormous block of copper, the OTES heatsink assembly is actually based upon the impressive behavior of heatpipes.

Retail Packaging

In theory, a heatpipe is a hollow section of tubing containing a working fluid. As the base of the tube is heated, the fluid inside absorbs enough energy to boil, changing phase into a vapor and carrying the heat along the tube. At the far end, the vapor condenses back into a liquid, releasing its heat, and returns to the base of the tube for the process to repeat. To dissipate the heat at the end of the heatpipe, a copper radiator is used. This radiator is further cooled by a 7,000 rpm fan, which blows directly over the fins via a specialized air duct resting on top of the heatsink assembly.

OTES Heatsink Assembly

http://www.nvnews.net/reviews/gf4_ti_4200_...out/index.shtml
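To put the cooling requirements in rough perspective, here's a back-of-the-envelope sketch of the thermal resistance a heatsink like the OTES has to achieve. Every number in it (GPU wattage, temperatures) is an illustrative assumption, not an ABIT spec:

[code]
# Back-of-the-envelope heatsink requirement. All numbers are
# illustrative assumptions, not measured OTES or GeForce4 data.

gpu_power_w = 35.0  # assumed GPU heat output, watts
ambient_c = 35.0    # assumed case air temperature, deg C
max_die_c = 90.0    # assumed maximum safe die temperature, deg C

# The heatsink must present at most this much thermal resistance
# (degrees of die temperature rise per watt dissipated).
required_r = (max_die_c - ambient_c) / gpu_power_w
print(f"Required thermal resistance: {required_r:.2f} C/W")  # -> 1.57 C/W
[/code]

The heatpipe's job is to move the heat to the radiator while eating as little of that budget as possible, leaving the fan and fins to do the rest.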

And this is an interesting article that, if true, might decide who takes the video card crown.

ATI R350 Info - 1/24/03 3:24 pm - By: volt - Source: The Inquirer

The Inquirer has gathered some info on the upcoming R350 from ATI. They believe they have solid information about what to expect.

ATI's magic goal is to achieve 400 MHz/800 MHz for the core and memory, while we know that it can easily clock to 375 MHz, which can be considered the lowest frequency of the R350.

The final silicon should be back from TSMC any day now, as the product is almost ready, and when it arrives we will try to confirm the information about the final clock. 400 to 425 MHz is a dream target.

We heard that ATI's R350 will beat the GeForce FX in eight out of 10 benchmarks, and we learned that one of the benchmarks it will lose will be Quake 3. This could be true, since the R350 will have 6 GB/s more memory bandwidth than the 9700 Pro.

The card will be equipped with 128 MB of memory while also supporting 256 MB configurations; the 128 MB variants will be priced around the €400 mark.

http://www.digitimes.com/NewsShow/Article3...pages=04&seq=20
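The Inquirer's bandwidth figure is easy to sanity-check from the rumored clocks. A quick sketch, assuming the R350 keeps the 9700 Pro's 256-bit memory bus (an assumption, not a confirmed spec):

[code]
# Sanity check of the Inquirer's "6 GB/s more" claim. Assumes the R350
# keeps the 9700 Pro's 256-bit bus -- an assumption, not a confirmed spec.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a bus width and effective clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

r9700_pro = bandwidth_gb_s(256, 620)  # 310 MHz DDR -> 620 MHz effective
r350 = bandwidth_gb_s(256, 800)       # rumored 800 MHz effective

print(f"9700 Pro: {r9700_pro:.1f} GB/s")        # -> 19.8 GB/s
print(f"R350:     {r350:.1f} GB/s")             # -> 25.6 GB/s
print(f"Delta:    {r350 - r9700_pro:.1f} GB/s") # -> 5.8, roughly the quoted 6
[/code]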

Edited by snakebite1967
Link to comment
Share on other sites

Maximum PC got their hands on a beta FX card and tested it out. They shared the benchmarks with PC Gamer (they are sister magazines and I read both) and this is the deal:

Benchmark                       GeForce FX   Radeon 9700 Pro
Quake III*                      209 fps      147 fps
UT 2003 Asbestos Flyby*         140 fps      119 fps
3DMark2001 SE, Game 4 Nature*   41 fps       45 fps

*All tests run at 1600x1200, 32-bit color, 2x AA enabled

Remember that this was done on a beta card and not a retail card. The card will be a DX9+ part. It will use DDR2 memory and should be clocked at 500 MHz, meaning that it will, in effect, be running at 1,000 MHz -- a huge leap beyond the Radeon 9700's 310 MHz memory, which effectively operates at 620 MHz.

The GeForce FX will need that extra memory speed to help make up for the fact that it uses a 128-bit memory interface, half the width of ATI's 256-bit interface.**

As you can see, the new card looks promising even with its smaller memory pipeline, thanks to its faster RAM.

**Text in italics from PC Gamer magazine.
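Those clocks and bus widths translate directly into peak memory bandwidth, which is why the 128-bit question keeps coming up in this thread. A quick back-of-the-envelope sketch using only the figures quoted above, not official specs:

[code]
# Peak memory bandwidth = bus width (bytes) x effective memory clock.
# Figures are the ones quoted above, not official specs.

geforce_fx = 128 / 8 * 1000e6 / 1e9   # 128-bit bus, 1,000 MHz effective DDR2
radeon_9700 = 256 / 8 * 620e6 / 1e9   # 256-bit bus, 620 MHz effective DDR

print(f"GeForce FX:      {geforce_fx:.1f} GB/s")   # -> 16.0 GB/s
print(f"Radeon 9700 Pro: {radeon_9700:.1f} GB/s")  # -> 19.8 GB/s
[/code]

Even at double the effective clock, the half-width bus leaves the FX with less raw bandwidth than the 9700 Pro, which is exactly why the faster RAM has to work so hard.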

Link to comment
Share on other sites

Sadly it won't... The GPU is already taxed, or it wouldn't need such a huge heatsink. ATI will eventually respond with a DDR-II answer of its own, so the memory issue is moot. The lack of a 256-bit controller like the 9700's is the whole reason it is using DDR-II: to try and keep up. By the time the 9700 is kitted out with DDR-II memory, I don't think the FX will stand a chance.

Link to comment
Share on other sites

Oh, it does need a heatsink and fan, and a doozy: it takes up a whole PCI slot under the card and exhausts through a PCI slot bracket. It's just that these pics are all pre-release, and the individual manufacturers haven't released their own heatsink/fan combos yet.

Here's a basic look at the type of cooling involved:

http://www.theinquirer.net/?article=7423


Good article, White. There's still a lot of speculation on this card, but with the way sites are gearing up we'll find out soon enough whether it's the best thing or the runner-up.

Babel Fish translation, in English:

Test: First detailed benchmark of the GeForce FX. The GeForce FX received plenty of advance praise in the run-up to its launch; undreamt-of performance and innovative technology were supposed to set it apart. But the reality is sobering. BY BERNHARD HALUSCHAK. NVIDIA announced the GeForce FX chip back in November 2002. Only now, at the end of January 2003, is NVIDIA in a position to deliver test samples to the press. Our first test candidate is a card with the GeForce FX 5800 Ultra chip.

[45 kByte] Imposing: the card with the GeForce FX 5800 Ultra captivates with its extravagant cooling technology. The exterior of the graphics card impresses with its design and well-thought-out cooling technology, as well as its weight. The closed cooling module draws in air through an opening in the double-slot bracket and blows it across the radiator housing; a second air duct carries the warmed exhaust air back outside. The comparison with a turbine is obvious and fair, at least as far as noise is concerned: when the card switches from 2D to 3D operation, the fan speed, and with it the noise level, rises noticeably. [68 kByte] Rear view: the memory chips on the back also require suitable cooling. NVIDIA did not skimp on weight, either: the card weighs a total of 600 grams, while ATI's competitor makes do with 220 grams. For outputs, the test sample provides a DVI interface, an S-Video connector and a standard VGA output. In the following, you will see what performance NVIDIA's new generation delivers in practice. Detailed information on the features and chip architecture of the NVIDIA GeForce FX can be found in our article GeForce FX: The New Generation.

The test candidate: our test candidate is an NVIDIA reference card with the GeForce FX 5800 Ultra. The board is equipped with 128 MByte of GDDR-II memory, and both the core and the memory run at a clock frequency of 500 MHz. NVIDIA provided us with driver version 42.63.

This article is from a German site and needs translating. Use this to translate the site. :)

Also remember, folks, we don't know if this is legit or not, so take everything with a grain of salt.

volt: We just had a discussion with someone who knows more about this than we do. He said that the benchmarks won't come in earlier than 12 hours from now, so at 8:00 am GMT (9:00 am Central European Time) we will most likely see the reviews (Monday).

<@volt> Laxlunch: so the benchmarks will come tomorrow correct?

<@Laxlunch> volt they are already correct, they wont change them. but as it was too early to make those benchmarks public, they claim now that they "are currently working on the review"

<@Laxlunch> but the review is complete, probably they will still add the image quality review

<@Laxlunch> now cet is 8.44 am ... so just calculate :)

So there you go. This is coming from the front line, and it all points to Monday for reviews :)

It seems that the official benchmarks and reviews will be released tomorrow, and full-blown tests using a finished driver will be available. Sorry for the incomplete translation; it took me a while to find a tool that would do this well.


Well, a test has been released from Europe a bit early, but it seems legit.

The DUSTBUSTER, as they are calling the FX now, cannot beat the 9700 Pro in high-end tests. But anyway, read for yourselves; I won't give Rocky a conniption by posting it all here.

http://www.extremetech.com/article2/0,3973...3,846356,00.asp

This post suddenly went down this morning, and now it reads a little differently than it did last night. Not sure what's going on, but it's still a good read.

Now, admittedly, this test doesn't apply to those of us who play games at lesser settings, but it gives a good idea of the strengths and weaknesses. I don't think the story ends here, though: I have seen a picture on HEXUS of an MSI FX that is listed as 256 DDR2, and if that's not a hoax then this battle isn't over. At the moment the superior ATI memory is winning the memory-intensive tests, but only time will tell.

NV30.jpg


Geez, sorry White, dunno how that happened. Hmm, just goes to show: don't drink beer and post at the same time.

This article tests the FX and a few other cards in actual gaming situations and is more relevant to us.

NVIDIA takes the crown! No question about it - the GeForceFX 5800 Ultra is faster than the competition from ATI's Radeon 9700 PRO in the majority of the benchmarks. However, its lead is only slight, especially compared to the distance that ATI put between its Radeon 9700 PRO and the Ti 4600. Still, when compared to its predecessor, the GeForce4 Ti, the FX represents a giant step forward.

The GeForceFX 5800 Ultra is irrefutably the fastest card in most of the tests - but at what price? The power of the FX relies on high clock speeds, which in turn require high voltages and produce an enormous amount of heat. The consequence is that extensive (and expensive) cooling is necessary. Add to that the DDR-II memory, the price of which is quite high, due to the small production numbers. Even the 12-layer board layout is complex and expensive.

It will be difficult for NVIDIA to push its GeForceFX 5800 Ultra. Radeon 9700 PRO cards are only slightly slower, and, because they've been out on the market for months now, they're much less expensive. Also, because they deliver 3D performance with much slower clock speeds, they do not require extensive cooling - and that's nice for your pocketbook as well as your ears.

Still, despite expectations to the contrary, the official price for the FX 5800 Ultra is $399 plus tax, and that seems pretty aggressive and attractive. This makes it identical to the launch price of the GeForce4 Ti4600 and the ATI Radeon 9700 Pro. The "normal" version of the 5800 will be somewhat less expensive. It's surprising that the GeForceFX GPU, clocked at 500 MHz, only gains a small lead over the R300 GPU (VPU), which is modestly clocked at 325 MHz in comparison.

It remains to be seen how long NVIDIA, with its FX 5800, can maintain a lead over ATI. ATI has already started to hint at a faster-clocked R350 to come in the next weeks (according to rumor, it will have a 400-425 MHz core and 800 MHz memory).

Nevertheless, enthusiasts will, without a doubt, love the GeForceFX 5800 Ultra. It is a monster card! And it has a look that is similarly spectacular to the 3dfx Voodoo5 6000 at the time of its launch.
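To put that closing observation in numbers: if performance scaled with core clock alone, the FX's lead would be anything but small. A quick sketch using only the clocks quoted in the article:

[code]
# How big the FX's lead "should" be if performance scaled with core
# clock alone. Clocks are the ones quoted in the article above.

fx_core_mhz = 500
r300_core_mhz = 325

clock_advantage = fx_core_mhz / r300_core_mhz - 1
print(f"Core clock advantage: {clock_advantage:.0%}")  # -> 54%

# The benchmarks show only a slight lead, which points at something
# other than core clock (e.g. memory bandwidth) as the limiting factor.
[/code]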

And here's a link to the full article, with some sound files so you guys can all hear the Dustbuster fire up:

http://www.tomshardware.com/graphic/200301...0127/index.html


That box looks great at least... I plan to buy this card when it comes out; I think it will secure the top spot once again in the GPU battle between NVIDIA and ATI... though I'm sure ATI will come back with the monster R350 or whatever it is. I've always been one to buy NVIDIA products, so this will be much of the same... The link works too; I read the scoop on the card... It looks good, and I know there will be some fixes and touch-ups to this card before it is released... If only it had that 256-bit interface...


The NVIDIA GeForce FX graphics card has been eagerly awaited by an industry with a thirst for new technology. The hype surrounding the card has reached monumental proportions with expectations set very high indeed.

When the Radeon 9700 was launched back in August last year, its performance represented a significant improvement over the GeForce4 Ti4600, with both improved image quality and blisteringly fast performance. NVIDIA's task of outperforming the Radeon should not be underestimated.

When you consider that the GeForce FX card we tested is one of the first GPUs to use the 0.13-micron process, and that the card utilises a great deal of new technology, it is clear that NVIDIA have a very solid platform to build on and develop here. By comparison, both Intel and AMD did not produce their best 0.13-micron processors until they had run with the process for a while and yields had improved.

Some people may mistakenly interpret this very early, UK-exclusive first 'hands-on' look at the NVIDIA GeForce FX as a definitive guide on whether or not to buy, and may well be thinking that the GeForce FX is not living up to the hype and over-inflated expectation.

However, that would be totally unjust to NVIDIA, and HEXUS are responsible enough to give you our final verdict only when we have been given sufficient time to really reach one.

There can be no question that the GeForce FX outperforms the Radeon. We saw the demos of the Ogre, Dawn and the truck - these are incredible. We will have these in the media archive at the end of this review. There is more to the FX than just gaming and benchmarks. We believe that this card will be the future of games, with the CineFX engine, High-Precision Graphics and Intellisample Technology all needing support from developers to take full advantage of this card.

With a radical card comes some radical packaging. The cooling system is not the quietest, and in discussions with an ex-AKASA thermal guy we concluded that there were areas for improvement that could yield more efficient thermal control in conjunction with lower noise.

The fact that it uses the AGP slot and one PCI slot really is not a major issue for most enthusiasts. For a start, how many people actually use all of their PCI slots? Due to the PCI IRQ routing on many mainboards, the PCI slot beside the AGP slot - the one the GeForce FX encroaches upon - will be sharing an IRQ with the AGP slot anyway, which is probably not something you would want to use. The design of the card is wonderful; no picture does it justice. When held in your hand, the card feels very solid and robust.

NVIDIA's Achilles heel may be the price of the GeForce FX. If prices end up much higher than, say, the Connect3D Radeon 9700 PRO's, we feel that the minor performance gain offered by the GeForce FX may not tempt customers... but as we have said, we will reserve final judgement until we have had more testing time and can consider the kind of hardware and software bundles that GeForce FX manufacturers come to market with.

Pros

- Wonderful looks.
- Sharp image quality.
- Better performance than the Radeon 9700 Pro.
- NVIDIA driver support will make sure you get performance.

Cons

- Pricing/availability.
- Noisy.
- Produces a lot of heat.

We believe that this NVIDIA card is ahead of its time, and for sure we have an interesting few months coming up - these cards are going to be the best you can get at the moment. Will the R350 be on time? Will we see more from NVIDIA in the form of the NV31, etc.? Will we even see partners watercooling their cards? We're hearing that we will... We have heard an awful lot of information about future cards, and of course you will be able to read all of it on HEXUS.

This might be a good point: both ATI and NVIDIA are moving into a whole new realm of video cards. The FX may not be the ultimate king at this moment, but it is more advanced than anything out there. Both ATI and NVIDIA will have super cards available mid-year, both will be 256-bit DDR2, and ATI will have finished the new RAM. I know not many of us will be able to afford these cards, but the good news is that as long as these two giants keep upping the ante on each other, the mainstream cards we can afford will be better and cheaper.

Link to comment
Share on other sites

Initial reviews of the GeForce FX are out, and the general consensus among our visitors is that it's not what they expected. Generally speaking, we had been hoping for a 30% performance increase over ATI's Radeon 9700 Pro across the board, and less of an impact on performance when both antialiasing and balanced anisotropic texture filtering are enabled at the same time.

Before getting into the main part of this commentary, let's indulge ourselves by looking at two articles on previous NVIDIA graphics chipsets:

"Before we dive into the benchmark comparison we thought we would shine some light on exactly why, for the first time since NVIDIA started tending to the online community, that benchmarks haven't been released for so long. (This can be attributed to) drivers, an expensive launch (that) cannot be delayed, and honestly, there are no benchmarks."

and in another review:

"The drivers are the downfall to the (new graphics chipset's) performance. In some cases the performance difference between the (new graphics chipset) and the (previous graphics chipset) was absolutely nothing and in other cases the (new graphics chipset) was actually slower than the (previous graphics chipset)."

These comments may sound familiar if you've been following NVIDIA during the past couple of years and have also been keeping tabs on the GeForce FX. But, there's an interesting point to the observations. The first quote is from March 2001 and the second is from October 1999, describing the GeForce 3 (NV20) and original GeForce 256 (NV10), respectively. Similar issues and solutions were true of the TNT and TNT2 as well.

It's deja vu all over again.

HISTORY DOES REPEAT ITSELF

Has a six-month product cycle somehow made us forget the issues that befall NVx0 chipsets at launch? Well, it's happened before, and we should expect the same from the NV30. The drivers will improve, and performance will increase across the board as a result. The NV30 can be thought of as a concept card. It contains new features just as the GeForce 256 and the GeForce 3 did. However, NVIDIA's rollout plan is to expose new features and ensure stability and then concentrate on increasing performance.

GeForce FX Front View

If the NV30 happens to be the fastest graphics chipset on the market, it may or may not be a good thing (more on that later). But the NV35 is where NVIDIA's high-end performance hopes lie. Remember the improvements the GeForce 2 (NV15) had over the GeForce and the improvements the GeForce 4 had over the GeForce 3? Six months isn't a long time. NV30 was primarily delayed due to, once again, the move to a new manufacturing process. The same was true of the GeForce 256 and GeForce 3. And remember the GeForce2 Ultra? It was basically a placeholder for the GeForce3, which was pushed back from 2000 until early 2001.

Does that mean NV35 will be released any earlier? Probably not.

While the performance of the NV30 may not eclipse the Radeon 9700 Pro by a large margin at this time, the jury is still out on how these parts will perform when DirectX 9 features are used. Yes, the NV30 comes with an overwhelming cooling solution, and it requires a large amount of power to operate. But, as everyone seems to have forgotten, the GeForce 3 and GeForce 256 had their share of problems too.

SO WHAT CHANGED?

For the first time since 3dfx's Voodoo2, NVIDIA is experiencing real competition. Towards the end of its existence, 3dfx shot itself in the foot with delays of the Voodoo5. S3's Savage2000 remains, in my mind, the most disappointing graphics chipset ever released. While it contained hardware-assisted transform and lighting, the feature never actually worked in the drivers.

In regards to ATI, the first Radeon couldn't match the GeForce 2 in 3D graphics performance and the Radeon 8500 launch was problematic. With a combination of poor drivers and a naming fiasco, it was quickly relegated to second place after the release of the GeForce 4. But late last summer, something changed. ATI's R300 graphics chipset was mind-blowing not only due to its 3D performance, but also because it was an attractive alternative to the fastest NVIDIA had to offer. The R300 surprised everyone - even those left jaded by the Radeon 8500 launch. Still, six months later, the Radeon 9700 Pro remains an alternative for those upgrading, even (or especially, depending on how you look at it) compared to the NV30.

What does this mean for consumers? Well, think back to 1999 for a minute. AMD, long considered a maker of under-performing processors, introduces the Athlon. Suddenly, there's an alternative to Intel's Pentium 3 that performs faster and costs less. Everything changed. I doubt we'd have 3GHz computers, with meta-SMP on a chip and 64-bit processors coming out in force, if there were only a single choice for a high-performance consumer processor.

GeForce FX Side View

The same will be true of the battles between ATI and NVIDIA, which heated up after the Radeon 8500 had been out a few months. The driver bugs that marred its launch were being fixed, and performance was improving. The OEM "8500LE" clock speed debacle was resolved and the 8500 became a viable competitor. Still, since it was so long after launch, few people paid attention.

And then, we had the Radeon 9700. It's hard to deny that it's a great card with good drivers. It was fast, relatively quiet, and basically what everybody expected NV30 to be at the time.

Sounds like the Athlon, doesn't it?

FINAL THOUGHTS

The NV30 doesn't mean that NVIDIA is going the way of 3dfx. Far from it. It proves that they are still doing what made them an incredibly successful company. It also proves that ATI has stepped up to the challenge of making a graphics chipset that can compete with NVIDIA, and that ATI isn't going to sit around designing hardware that is crippled by mediocre drivers. With the Radeon 9700, we've seen ATI's driver development team step up to the plate, although the majority of responses to Rage Underground's 01/15/2003 user survey still rate ATI's drivers as unsatisfactory.

NVIDIA Supplied GeForce FX Benchmarks

Is it good for ATI? Absolutely. Is it good for NVIDIA? Actually, yes. Competition in the graphics chipset market will spur the development of faster products with more features and improved image quality. High levels of antialiasing and anisotropic filtering with a negligible performance cost may actually be on the horizon. Tile-based rendering may become common on high-end cards. And, maybe most importantly, prices will drop. The upcoming price wars will spur the adoption of new graphics cards faster, which means games will actually use new features on the card you just purchased.

So, in the end, who wins from all of this? Everybody. Just like with Intel and AMD in the processor market, prices will fall while performance will increase.

And if it seems like history has a way of repeating itself, maybe that's because it does.

DRIVER ANALYSIS

To illustrate the performance improvements attributable to continuous driver optimizations, a series of Quake 3 benchmarks was run on a GeForce4 Ti 4600 at default clock speeds using two different driver sets: the latest official Detonator 41.09 drivers, released on December 3, 2002, and the Detonator 28.32 drivers, released on April 26, 2002, two months after the debut of the GeForce4 Ti 4600.

The demo4 timedemo in version 1.30 of Quake 3 was benchmarked using maximum-quality graphics settings: the standard high-quality preset with maximum texture detail and high geometry. Sound and vsync were disabled. Resolutions tested were 1024x768, 1280x1024, and 1600x1200, using the following settings (a rough sketch for scripting runs like these follows the list):

No Antialiasing / No Anisotropic Texture Filtering

2X Antialiasing / No Anisotropic Texture Filtering

No Antialiasing / 4X Anisotropic Texture Filtering

2X Antialiasing / 4X Anisotropic Texture Filtering
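For anyone curious how runs like these can be scripted, here's a rough sketch of driving the resolution sweep. The install path, log location and demo name are assumptions based on common Quake 3 benchmarking practice, and since AA/AF were driver control panel settings in this era, each of the four combinations above still needs a manual pass:

[code]
# Rough sketch of scripting Quake 3 timedemo runs across the three
# resolutions used above. Paths and demo name are assumptions; verify
# them against your own install.

import re
import subprocess

QUAKE3 = r"C:\Games\Quake3\quake3.exe"             # assumed install path
QCONSOLE = r"C:\Games\Quake3\baseq3\qconsole.log"  # assumed log location
RESOLUTIONS = [(1024, 768), (1280, 1024), (1600, 1200)]

for width, height in RESOLUTIONS:
    subprocess.run([
        QUAKE3,
        "+set", "r_mode", "-1",                 # -1 = use custom resolution
        "+set", "r_customwidth", str(width),
        "+set", "r_customheight", str(height),
        "+set", "s_initsound", "0",             # sound disabled, as in the test
        "+set", "logfile", "2",                 # write console output to disk
        "+set", "timedemo", "1",
        "+demo", "four",                        # the timedemo shipped with v1.30
    ], check=True)

    # timedemo prints a line like "1260 frames, 5.5 seconds: 229.1 fps"
    with open(QCONSOLE) as log:
        fps = re.findall(r"([\d.]+) fps", log.read())
    print(f"{width}x{height}: {fps[-1] if fps else 'n/a'} fps")
[/code]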

System Specs

AMD Athlon XP 2700+ @ 2.17GHz - Thoroughbred Revision B *

ASUS A7N8X Preproduction nForce2 Motherboard *

NVIDIA nForce2 chipset with DualDDR Memory design *

Corsair PC3200 (CMX256A-3200C2) DDR SDRAM (2) 256MB DIMMs

Maxtor DiamondMax Plus 80GB 7200RPM ATA-133 Hard Disk Drive

Maxtor DiamondMax Plus 40GB 7200RPM ATA-100 Hard Disk Drive

Sony Multiscan E500 CRT Monitor - 21-Inch

NVIDIA GeForce4 Ti 4600 - 128MB

NVIDIA Detonator XP Driver Versions 41.09 and 28.32

32-Bit Color / Sound Disabled / Vsync Disabled / 60Hz Refresh Rate

Windows XP Professional with Service Pack 1 / DirectX 8.1

Quake 3 Arena Version 1.30

Notes

The AMD Athlon XP 2700+ is based on preproduction silicon

The ASUS A7N8X is a preproduction motherboard

The nForce2 chipset is based on preproduction silicon

BIOS Settings

Aggressive Settings

Expert System Performance

166MHz Front Side Bus

Memory Timing 4-2-2

CAS Latency 2.0

APIC Mode Disabled

The following bar charts list the average frames per second from each driver per resolution. Antialiasing is labeled as AA and anisotropic filtering is labeled as AF.

Quake 3 Performance - 1024x768

At 1024x768, antialiasing performance between the two driver versions is similar. However, 4X anisotropic filtering performance of the 41.09 drivers exceeds that of the 28.32 drivers by an impressive 42.5% (208 vs. 146fps).

Quake 3 Performance - 1280x1024

The improvements are even greater at 1280x1024 as 4X anisotropic filtering performance increased in Quake 3 by 52.7% during the course of seven months. 2X antialiasing performance increased by 7.4%, from 136 to 146fps, during this time.

Quake 3 Performance - 1600x1200

The largest relative increases in antialiasing performance (27%) and anisotropic filtering performance (59.4%) occurred at 1600x1200. With both settings simultaneously enabled, performance improved from 50 to 76fps, or 52%. Improvements of this magnitude are particularly useful, as the driver updates let gamers enjoy better image quality and improved performance at the same time. Under this type of scenario, you can have your cake and eat it too!
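The percentages in this section come straight from the fps pairs; a tiny helper reproduces them:

[code]
# Reproducing the percentage gains quoted above from their fps pairs.

def gain(old_fps, new_fps):
    """Percent improvement going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

print(f"{gain(146, 208):.1f}%")  # 4X AF, 1024x768          -> 42.5%
print(f"{gain(136, 146):.1f}%")  # 2X AA, 1280x1024         -> 7.4%
print(f"{gain(50, 76):.1f}%")    # 2X AA + 4X AF, 1600x1200 -> 52.0%
[/code]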

Quake 3

An image quality comparison between the two drivers can be made by clicking here for a 1600x1200 screenshot taken with the 41.09 drivers (2.64MB) and clicking here for a screenshot taken with the 28.32 drivers (2.58MB). Note that there was a screenshot "glitch" that appears on the far right edge of the image using the 28.32 drivers. Other than that, image quality appears to be similar.

Perhaps the most telling sign that performance is expected to improve on the GeForce FX comes from specific comments made by John Carmack in his latest .plan update. In the update, Carmack covers the performance and features of the NV30 (GeForce FX) and ATI's R300 (Radeon 9700 Pro) under the latest game he's developing - Doom 3.

Key phrases related to graphics drivers include the following:

"At the moment, the NV30 is slightly faster on most scenes in Doom than the R300."

"Nvidia assures me that there is a lot of room for improving the fragment program performance with improved driver compiler technology."

"I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality (ATI continues to improve, though)."

This analysis is an encouraging sign that performance improvements for the GeForce FX will be realized via driver optimizations. However, it's unknown how long that will take and by how much performance will improve. For example, 80% of the improvement could occur during the first month following the release of the GeForce FX, or incremental improvements could be spread over many months. The data does show that optimum performance of a new graphics chipset is typically realized a few months after its release, and with prices having dropped by then as well, consumers get an opportunity to get the best bang for their buck.

Are we blowing smoke here, or will updated drivers make a difference in the performance of the GeForce FX?


