Ghost Recon.net Forums

Professional Gamers

I'm upgrading my vidcard for GR:AW. I've already made that decision. And when I was reading an article a few days ago on high-end cards, something was said that didn't make sense to me.

Somewhere in the article it said something like "while these cards will allow people to run any current game at maximum resolutions with all the video options on high, professional gamers will still run at lower resolutions and options to achieve the highest framerates possible."

??? :wacko: ???

I don't get it. If the human eye can't tell the difference beyond 24 frames per second, why would professional gamers care, as long as they were running at least 35/40 frames per second to allow slack for a sudden drop?

Fluid motion is fluid motion, right? Or is there some other advantage to super-high framerates that I am unaware of?

--Logos


I'm not sure about this one, but:

25 fps in a game is slow, and I can definitely tell the difference.

About 45 fps is when I stop noticing the frames being slow...

But at 60-80 fps it's smooth as butter. When every frame really matters to you as a pro gamer, it's a big difference.


From my view, they don't really care about the visual effects or "eye candy." With lower resolution and medium details, your frames won't drop much during those crucial firefights.

So smoother gameplay with minimal frame drops is the goal, for a lot of people who play online for fun as well as competitively.

Personally, I run games on a 19" screen at 1024 resolution and medium quality for better performance. But I do turn up the quality if I do some singleplayer action.


Pro gamers don't set detail to low only for framerate; it also helps them spot enemies hiding in bushes and shadows that don't exist at those detail settings. :P


Yeah, that I know. My clan ran low environment textures in TvT for just that reason. However, the article seemed to suggest something else. I was just wondering.

Anyway, I found the following excerpt from fatal1ty's guide to setting up your system like a pro gamer. I don't think I want mine set up for Quake3 gaming to run GR:AW, but it does touch a bit on what we're talking about.

_________________________

Video settings are probably one of the most important settings in the game for a Professional Gamer. Basically with Professional Gamers, they work on their own settings to make it fit their needs to play at their peak performance. My settings are pretty basic, and I try to make the map as bright as possible. I'm not saying that my settings are the best for everything, but I do guarantee you that most of the people that play in the CPL tournaments have similar video settings as I do. I don't know many people that play in tournaments with high-resolution settings for Quake 3. I've heard of a few players running at 800x600 resolution, and my best guess as to why would be simply because some find it easier to rail at that resolution. I've actually noticed that my rail is pretty good at 800x600, but I rather stick to my game and just keep playing how I have been.

__________________________

That does seem to suggest that fatal1ty runs at 640x480. :wacko: I trust he knows far better than I do, however.

--Logos


I don't know your card's specification. However, if it's still a decent card, perhaps it's better to wait till the game comes out. You may be able to run the game with no problems, you may not like the game, or there might be a new card out by then, causing prices of other cards to drop.



No, no. It's a 6600GT. I know I could play GR:AW on it with the settings turned low enough. That's not the issue.

When I built the system a year ago, the 6600GT choice was a price concession for a system that overall was going to run me quite a bit. I always planned on upgrading it sooner than the rest of the system. At this time, $300 for a 7800GT is not that painful a hit to the wallet, and the GR:AW release is just my excuse to do it now. This wasn't really about my videocard upgrade, though; it was about the pro-gamer framerate question I had as a result of the research before that vidcard purchase.

Anyone want a Leadtek 6600GT PCI-Express?

--Logos


I suppose what they mean is, during online gaming, the comparison of:

a baseline 25 fps that drops to 17, versus

40 fps that drops to 32,

where the latter has an advantage and the former gets, for example, a stuttering screen when aiming.
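Put in milliseconds per frame (just quick arithmetic, nothing game-specific), the same size drop hurts a lot more at the low end:

```python
# Quick arithmetic: the same 8 fps drop, expressed in milliseconds per frame,
# which is closer to what you actually feel when aiming.
for start, end in ((25, 17), (40, 32)):
    before, after = 1000.0 / start, 1000.0 / end
    print(f"{start} -> {end} fps: {before:.1f} -> {after:.1f} ms per frame "
          f"(each frame takes {after - before:.1f} ms longer)")
```

Going from 25 to 17 fps adds almost 19 ms to every frame, while going from 40 to 32 fps adds only about 6 ms.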


Obvious benefit there. I get that.

But if they're running at 640x480 or 800x600 with low settings on a 7800GTX-512 or a 1900XT, which was the implication of the article, that's not a framerate of 40; that has to be well over 100 fps for newer games, if not over 200.

If there is someone out there with a 7800GT/GTX or better, would you check FEAR or BF2 at 640x480 with low settings and give us your average framerate over a few minutes? Pleeeeeeeeeease.

--Logos

I don't get it.  If the human eye can't tell the difference beyond 24 frames per second, why would professional gamers care, as long as they were running at least 35/40 frames per second to allow slack for a sudden drop?

I'm guessing that the statement about 24 fps goes back to the belief that films (Hollywood, cartoon) can be viewed smoothly at that rate, but this does not apply to video games, especially at a competitive level.

3dfx used to have a tech demo showing a bouncing ball at 30 fps vs 60 fps, and I'm sure anyone in this forum could tell the difference in the fluidity of motion. However, everyone is different. I can walk around a computer lab and tell you which CRTs are running at less than 70Hz, while other people might not notice (but wonder why they get a headache/eye fatigue sitting in front of a 60Hz screen :( )

The other thing I don't think people have mentioned here is the response time between what's on your screen and clicking your mouse. With some high-end mouses (mice?) taking 1500 images per second of the surface below them, a person playing at 30 fps will be at a disadvantage vs someone playing at twice as many fps. Hope that makes sense?
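To put toy numbers on that (the 1500 samples/sec figure is the one quoted above; everything else here is just a simple illustration, not any particular game's input code):

```python
# Toy numbers only: how many mouse sensor readings get lumped into each
# rendered frame at a few frame rates. The point is just that a game rendering
# fewer frames reacts to your hand in coarser, less frequent chunks.
SENSOR_HZ = 1500

for fps in (30, 60, 125):
    frame_ms = 1000.0 / fps
    samples_per_frame = SENSOR_HZ / fps
    print(f"{fps:3d} fps: a new frame every {frame_ms:5.1f} ms, "
          f"roughly {samples_per_frame:.0f} sensor readings folded into each frame")
```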

Anyhoo, I've played a lot of games at 20-30 fps during single player just because I want all the eye candy on, but if I'm competing for $1000's then I couldn't care less about eye candy and would want every (legal) advantage over the competitors.

One more thing ... often in competition, players don't get to bring their own rig (they may get to bring their own keyboard/mouse) so they have no control over what graphics card is in the box. The safest bet is just to turn down all settings.

3dfx used to have a tech demo showing a bouncing ball at 30 fps vs 60 fps, and I'm sure anyone in this forum could tell the difference in the fluidity of motion.

I'm going to look for this demo or a demo like it. If anyone else can find it, I would love to see it.

The other thing I don't think people have mentioned here is the response time between what's on your screen and clicking your mouse. With some high-end mouses (mice?) taking 1500 images per second of the surface below them, a person playing at 30 fps will be at a disadvantage vs someone playing at twice as many fps. Hope that makes sense?

I'm not sure how significant the difference would be, but I can see it. This makes sense.

--Logos



Alrighty, what I'm going to do is this: when I get my 7800GT, along with the other tests I was going to run, I'm going to take FEAR or COD2 and try to find some settings that will produce around 35 fps with the 6600GT and 70 fps with the 7800GT. If I can find ONE set of settings that produces those framerates when switching between the two cards, I'll FRAPS them and compare to see the difference. If it flies, I'll make the recordings available to people who want to see them.
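For crunching the numbers on each card, something like this is what I have in mind, assuming I can dump per-frame timestamps to a plain text log (whether FRAPS or some other tool produces it, the exact format here is just a guess on my part):

```python
# Sketch for comparing the two cards: read a log of per-frame timestamps in
# milliseconds (cumulative from the start of the run; the file format assumed
# here is hypothetical) and report average and worst-case fps.
import sys

def fps_stats(path):
    with open(path) as f:
        times = [float(line.strip().split(",")[-1])
                 for line in f
                 if line.strip() and line.strip()[0].isdigit()]
    deltas = [b - a for a, b in zip(times, times[1:])]  # per-frame time in ms
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    worst_fps = 1000.0 / max(deltas)
    return avg_fps, worst_fps

if __name__ == "__main__":
    for path in sys.argv[1:]:
        avg, worst = fps_stats(path)
        print(f"{path}: average {avg:.1f} fps, worst single frame {worst:.1f} fps")
```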

--Logos


This is a cool thread and makes sense on many levels, but isn't this more along the lines of Computer Discussion or Off Topic?

Good thread though. I can certainly tell low frame rates and low Hz settings on monitors. I had a 60Hz monitor running games at 30 fps a while back; now I have a 19" running at 1280x1024 at 85Hz and around 50-60 fps (on most but not all games), and the difference just blasts you in the face instantly.

The best way I can describe it is that everything looks liquid smooth to the eyes, as opposed to jagged and sharp... that's the only visuals-to-words way I can put it :rofl:

:rocky::rocky:

I'm going to look for this demo or a demo like it.  If anyone else can find it, I would love to see it.

Yeah, I was googling around for it but no luck. However, I did find some other folks that do remember the demo so I guess I'm not going too senile yet (unlike in a previous recent post of mine ;) )

http://www.sharkyforums.com/showpost.php?p...29&postcount=45

Back when 3dfx was speed-king (remember 16-bit color?), they put out something similar: it was a graphic of a 3d rendered ball bouncing at 30 and 60 fps. It looked worlds better than this one (yes, you can still tell the difference).

http://www.radiumsoftware.com/excerpts/60h..._holy_grail.txt

There was a 3DFX demo that split the screen and updated one side at 30Hz and the other at 60Hz. It was designed to show just how significant a higher frame rate can be. IIRC it had a simple animation of a ball bouncing.
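In the meantime, here's a rough toy recreation of that idea (my own sketch with pygame, not the original 3dfx demo): a ball bouncing on each half of the window, with the left half only refreshed at half the rate of the right.

```python
# Toy recreation of the split-screen idea (not the original 3dfx demo):
# the left ball is only updated every other frame (~30 Hz), the right ball
# every frame (~60 Hz). Both move at the same speed. Requires pygame.
import pygame

W, H, R = 800, 400, 20                      # window size and ball radius
SPEED = 400.0                               # ball speed in pixels per second

pygame.init()
screen = pygame.display.set_mode((W, H))
pygame.display.set_caption("~30 Hz (left) vs ~60 Hz (right)")
clock = pygame.time.Clock()

left = [H / 2, SPEED]                       # [y position, y velocity]
right = [H / 2, SPEED]

def step(ball, dt):
    """Move the ball and bounce it off the top and bottom of the window."""
    ball[0] += ball[1] * dt
    if ball[0] < R or ball[0] > H - R:
        ball[1] = -ball[1]
        ball[0] = max(R, min(H - R, ball[0]))

frame = 0
running = True
while running:
    dt = clock.tick(60) / 1000.0            # aim for 60 updates per second
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    step(right, dt)                         # right side: updated every frame
    if frame % 2 == 0:
        step(left, 2 * dt)                  # left side: updated every other frame

    screen.fill((0, 0, 0))
    pygame.draw.line(screen, (80, 80, 80), (W // 2, 0), (W // 2, H))
    pygame.draw.circle(screen, (255, 255, 255), (W // 4, int(left[0])), R)
    pygame.draw.circle(screen, (255, 255, 255), (3 * W // 4, int(right[0])), R)
    pygame.display.flip()
    frame += 1

pygame.quit()
```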

This is a cool thread and makes sense on many levels, but isn't this more along the lines of Computer Discussion or Off Topic?

Yeah, you're right that this type of discussion fits better over at Computer Discussion ... moving now ...

This is a cool thread and makes sense on many levels, but isn't this more along the lines of Computer Discussion or Off Topic?

Damn, so my single use of the word GR:AW in the opening post didn't slip this past you.

I'm upgrading my vidcard for GR:AW. 

Justified! Justified! :rofl:

My bad, guys. I suppose it should be moved.

--Logos


Why lower resolution?

Easier spotting of enemies in, for example, Counter-Strike.

In GR you had to play at 1024x768 minimum in order to spot enemies at long distance in bushy levels. Imo, at lower resolutions it was very hard to see the difference between moving pixels near trees, bushes, foggy areas, etc.

Or to get more fps: the more fluid the game, the easier it is to aim.

In some games fps also determines how easily you can "hit" your opponent; with bad fps it seems like you have rubber bullets.

The amount of dpi doesn't necessarily improve your aiming; some of the best aimers in the world still use old Microsoft mice with 400dpi.

If there were an option to turn grass into a texture with one single color of green, I would use it. Graphics aren't that important for me; gameplay is more important. And it's not just me: that's the reason for bright skins and all the commands to tweak games' graphics as far as possible.

24/25/30 fps is not playable imo

60 fps is still low

An fps that's locked to your monitor's refresh rate is fine (mostly 85+ fps).

In Quake 3 engine based games most "pro" players use 125 fps, maybe for the fluid motion, but it also allows them to jump further and higher. In games like Quake 3 and Call of Duty you can do jumps that aren't possible at 60 fps, only at special fps values like 333, 125, or 76 (frame times of a whole number of milliseconds: roughly 1000/3, 1000/8, 1000/13, etc.).
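Those frame-rate-dependent jumps come from the engine stepping the physics once per frame. A toy simulation (my own illustration with plain per-frame integration, not the actual Quake 3 movement code) shows how the jump apex shifts with the frame time:

```python
# Toy illustration, not the actual Quake 3 movement code: with simple
# per-frame (semi-implicit Euler) integration, the apex of a jump depends on
# how long each physics step is, so capping the game at different fps values
# changes how high you effectively jump. The real engine's exact numbers
# differ (they hinge on its whole-millisecond frame times), but the principle
# that the timestep changes the physics is the same.
GRAVITY = 800.0      # Quake-like gravity, units/s^2
JUMP_VEL = 270.0     # Quake-like initial jump velocity, units/s

def jump_apex(fps):
    dt = 1.0 / fps
    z, vz, apex = 0.0, JUMP_VEL, 0.0
    while vz > 0 or z > 0:
        vz -= GRAVITY * dt   # apply gravity for this frame...
        z += vz * dt         # ...then move with the updated velocity
        apex = max(apex, z)
    return apex

for fps in (43, 60, 76, 125, 333):
    print(f"{fps:3d} fps -> jump apex {jump_apex(fps):6.2f} units")
print(f"continuous-time limit: {JUMP_VEL**2 / (2 * GRAVITY):6.2f} units")
```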

In CoD you can do jumps on top of houses or ledges that aren't possible with a standard config. Info: http://pedsdesign.com/codjumper/tips.php

In Quake 3 this sort of became a skill in itself, moving around the map. It allowed good players to make a difference in a game through superior movement/jumping (gaining more and more speed, etc.). There is a game mode in Quake 3 called defrag where you have to get from point A to point B in the fastest possible time, or huge "stunt maps" just to trick around. There's more than just jumping :)

http://www.own-age.com/vids/video.aspx?id=1052

http://www.own-age.com/vids/video.aspx?id=1950

http://www.esreality.com/?a=post&id=889728

A clear look, easy-to-spot enemies, lowest details...

Yeah, I was googling around for it but no luck. However, I did find some other folks that do remember the demo so I guess I'm not going too senile yet (unlike in a previous recent post of mine ;) )

I actually may have found it, but it won't run because (go figure) Glide is not installed. I'll see what I can do.

Thanks, though, CR6, because I read the discussions at those links, and I wanted to post some extremely interesting points from them that are relevant to this discussion. All three came from:

Radium Software

Elite Quake players have always played with ultra-low resolution and quality settings to achieve the maximum frame rate. The reason is that not only the game world gets displayed 300 times per second (which your eyes might not see), but also the game samples your input 300 times per second, which, being a grandmaster with an insane mouse sensitivity, you'd definitely notice.

Which is one sound answer to my original question.

There are many other interactivity issues, aside from what the eye can detect. If you're running at 30fps, then the amount of latency between when the user does something, and when he sees it happen in the world, is going to be between 33ms and 82ms, with the average being in the middle. 82ms is a lot, and the variance between 33 and 82 is a lot.

And another, thanks again, CR6.

(I also wrote a gdmag article once where I traced through the i/o path of a client/server game and showed how in a lot of cases more of the lag is due to frame rates than network latency).

I would like to see this article. Does he mean my low framerates can create the illusion of network lag? That, I know.

Or, does he actually mean what the sentence technically says, which is that low framerates from one of the users can cause the server to lag? That doesn't make sense to me.

That said, a frame of television is typically gathered from film (or a video camera which emulates film). [When shooting a scene...] each frame of the film is exposed to the scene for X amount of time [probably 1/24th of a second], during which time any motion is blurred across the film. When played back at the proper rate [24 fps], all motion looks nice and continuous.

A game, on the other hand, is an instantaneous snapshot of a scene that contains no blurring. To provide a smooth illusion of motion, you need to increase the number of samples in time (the frame rate).

Which explains why 24fps is fine for film and television, but about 60 fps is what is needed for buttery smooth motion in a video game. That figure of 60fps, btw, is what seems to be a broad consensus over the different places around the net where I read about this, not any personal figure I've arrived at. Some people said 40-50, some said 70, but by far the most common thought was 60fps. Again, that's just for the buttery smooth motion and doesn't take into account the other advantages of high fps mentioned above.
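To put the latency excerpt in concrete numbers at a few frame rates, here's a back-of-the-envelope script using a simplified model of my own (best case about one frame, worst case about two and a half frames between input and display, chosen so 30 fps lands near the quoted 33-82 ms range; it is not the article's exact math):

```python
# Rough input-to-display latency at different frame rates, under the
# simplified model described above. This is an illustration only, not any
# specific engine's pipeline.
for fps in (30, 60, 125):
    frame_ms = 1000.0 / fps
    best, worst = 1.0 * frame_ms, 2.5 * frame_ms
    print(f"{fps:3d} fps: {frame_ms:5.1f} ms per frame, "
          f"latency roughly {best:.0f}-{worst:.0f} ms "
          f"(average ~{(best + worst) / 2:.0f} ms)")
```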

Hope you found it interesting reading.

--Logos

I'm upgrading my vidcard for GR:AW.  I've already made that decision. And when I was reading an article a few days ago on high-end cards, something was said that didn't make sense to me.

Somewhere in the article it said something like "while these cards will allow people to run any current game at maximum resolutions with all the video options on high, professional gamers will still run at lower resolutions and options to achieve the highest framerates possible."

??? :wacko: ???

I don't get it.  If the human eye can't tell the difference beyond 24 frames per second, why would professional gamers care, as long as they were running at least 35/40 frames per second to allow slack for a sudden drop? 

Fluid motion is fluid motion, right? Or is there some other advantage to super-high framerates that I am unaware of?

--Logos

I'm no professional gamer, but what I find interesting is, as you say, "can the human eye tell the difference beyond 24 frames per second?" Well, I don't know about that; I'll have to look into it. But what I can tell you from my point of view is that it's what I feel with the mouse pointer, or, if I'm in a first-person shooter, with the reticule. At 60 frames per second I struggle to keep the pointer or reticule steady and smooth; at 70 frames per second it's better. At 80 frames per second and above (according to FRAPS or an in-game frame rate counter) I have much better control of my movements in-game, and it feels much better. In Battlefield 2, for example, I run 1280x1024 @ 60Hz with the first three custom video settings on High and the rest on Medium, 100% draw distance, and I average 80+ frames per second, often hitting 90, and that feels superb. So for me frames per second is something I feel (in control of the mouse) rather than something I see.

