
No Future For The Gpu?


CR6


I need to read more of it, but I don't know. We still see a slowdown in games when onboard audio is used, even on dual core systems, compared to when discrete audio is used. I expect gamers will want their quad core systems to have games written that put all 4 cores into play, and they won't want graphics done on the CPU if it decreases game performance.


I thought I'd read about something like this before, but I don't remember where. CPUs taking over the role of the GPU? I don't know if I like the sound of that, just because it sounds like you'd be buying CPU (A) that may not be as good as CPU (B), kinda like everyone jumps for the new video card every five minutes as it is, except now it will be the CPU.

I didn't read the whole article because some of the things he talks about I don't understand.

I can see it now:

"Your RAM is nice but your CPU is one day old and is worthless as a result!" (Basically what I read whenever someone has game-related tech problems, just replace CPU with Video Card)

Edited by Foxtrot360

Well, how many of you here still have the original R6 installed? It's very interesting to try to run it in "Software mode" on these new powerful dual/quad core CPUs. It still doesn't look as nice, but runs at a decent framerate.

I actually have it installed on my Athlon XP 3200 system at home. I haven't tried to run it on my Vista laptop yet. Maybe once I get home I will try it (only thing I have with a dual core proc right now).


The interesting thing about advances in multi-core CPUs is the programming for them. It's easy to write sequential code for a single CPU, but now that the number of cores is increasing, it will be very hard for developers to design games that take advantage of this. From a programming standpoint it's tricky because you have to worry about multiple cores accessing the same memory location and potentially overwriting each other's data. Not to mention that if two cores each depend on the other for some information, they can just get stuck (a deadlock). I don't see GPUs going away any time soon, but I suppose anything is possible in the future.

Edit: Just read the article and found this on page 4:

"And then there's usability—the programmer's point of view. How hard is it to write code for this architecture? Code that we can write in 10 lines and run on a single-threaded processor today—how many lines do we need to write to have that scale up to many cores and wide vectors in the future? Is it still 10 lines? That's the ideal. If a simple application today can be written simply and scale up to Teraflop performance, that would be great. But it might be a lot worse. You might have to write 20 lines of code, or 50 lines of code to scale up to multiple threads in the future."

Edited by firefly2442

I need to read more of it, but I don't know. We still see a slowdown in games when onboard audio is used, even on dual core systems, compared to when discrete audio is used. I expect gamers will want their quad core systems to have games written that put all 4 cores into play, and they won't want graphics done on the CPU if it decreases game performance.

Well, perhaps with multicore CPUs, this may turn out to be true.


So slightly going OT, but has anyone here "invested" in the latest gen GPUs? (GeForce 260, Radeon HD 4xxx series)

I wonder if most of us are just happy with our 8800's - no reason to upgrade this year.

Went from an 8800GTS 640MB to an ATI 4870 (and from an AMD Athlon X2 6400+ to an Intel E8400), and I must say it was an utterly useless upgrade for the most part. The only reason I did it was to give my younger brother my CPU and GPU so he had a good rig to use. With the 8800, though, I could run everything maxed out except Crysis. Most games I could also run at 4xAA. With the 4870 I can run everything maxed out WITH 4xAA, except Crysis, which runs a lot smoother. Overall, I wouldn't recommend an upgrade like that unless you're at high resolutions (mine is 1680x1050).


Hey Nutlink, thanks for the info. I actually don't know anyone personally who has anything faster than dual 8800's, so it's good to know.

The ATI 4870's are actually not priced that high, so it's tempting to upgrade, but I have no games that I'm playing now that make me feel I need an upgrade :P


So slightly going OT, but has anyone here "invested" in the latest gen GPUs? (GeForce 260, Radeon HD 4xxx series)

I wonder if most of us are just happy with our 8800's - no reason to upgrade this year.

Went from an 8800GTS 640MB to an ATI 4870 (and from an AMD Athlon X2 6400+ to an Intel E8400), and I must say it was an utterly useless upgrade for the most part. The only reason I did it was to give my younger brother my CPU and GPU so he had a good rig to use. With the 8800, though, I could run everything maxed out except Crysis. Most games I could also run at 4xAA. With the 4870 I can run everything maxed out WITH 4xAA, except Crysis, which runs a lot smoother. Overall, I wouldn't recommend an upgrade like that unless you're at high resolutions (mine is 1680x1050).

I'd agree with that. The 8800GT I have is good, runs everything fine ... except Crysis (tehehe)

Although my mobo is CrossFire-ready, I might consider some ATI cards soon ... along with a Phenom.

Oh, by the way, the 6400+ and the E8400 are very comparable in terms of performance. I checked them both out before I bought my 6400. I'm still an AMD nutter and think they will put a hurting on Nvidia and Intel again shortly. Just lending them a hand in their numbers; they need some investors, since their stock dropped so fast a couple of years ago.

Edited by pz3

So slightly going OT, but has anyone here "invested" in the latest gen GPUs? (GeForce 260, Radeon HD 4xxx series)

I wonder if most of us are just happy with our 8800's - no reason to upgrade this year.

Well, it depends a little on where you are upgrading from, I'd say.

Now, I have invested in a new GTX260 combined with a Q9550 quad core, and to me that is a worthwhile investment... as it's a pretty big leap from the old 6800 Ultra and Intel P4 640 3.2GHz CPU that I already have.

I tend to do my upgrades in leaps and bounds rather than the mini steps that most computer nutters do... so now I again have a system that will last me a few years without any major upgrades. In a few years it will likely be severely outdated, just like the old system is right now, though.


I'm happy with my 8800GTX... yes, it stuttered in the final scene of Crysis... but then again, who didn't?

Gamers with liquid-cooled quad cores and SLI 8800GTXs posted videos of the stuttering. According to them, it was the game's fault.

Wonder if the same problem shows up in Crysis: Warhead :unsure:

