This component is, in my opinion, the single most important piece of hardware in a gaming system. Just remember: don't sacrifice other components, particularly the CPU, for a better GPU. You must find the best balance between the two. Otherwise all those cars full of megabytes will get stuck in a traffic jam at the CPU, so do them a favor and give them a big enough highway ;)
Important GPU Key Factors:
1. SLI and Crossfire
2. Frame Rate
3. GPU Memory
4. Expansion Card Standard
A graphics card is almost like a mini computer: it has a Graphics Processing Unit (GPU), its own memory, bus speeds, etc. Don't worry though, it's really not that complicated to determine what card is best for you, because both ATI and Nvidia (the two graphics card kings at the moment; Intel will join the competition with their Larrabee GPU next year) give all of their graphics cards a labeling system to help you tell a high end card from a performance card from a low end card. For example, the current series of ATI cards is the HD 4000: the top of the line is the 4890, the mid range is the 4670, and the low end is the 4350. There are also variations of the high, mid, and low range cards, such as the high end 4870, 4850, and 4830, each offering less performance than the one before it. Basically, the higher the number, the better. Nvidia cards have a similar number system, higher is better, but they also use a prefix: the GTX 2xx series is better than the GTS 2xx series, and that's all you really need to know about that.
SLI and Crossfire
Now, many motherboards offer support for multiple graphics cards. Next time I'll talk about motherboards and which ones offer such support. SLI is Nvidia's multiple-GPU setup, and Crossfire is ATI's. There is support for 2, 3, and 4 GPUs using these setups. However, just because you have 2 GPUs in SLI or Crossfire DOES NOT mean you get twice the performance. Sorry, but the average performance increase at higher resolutions is usually only about 20%. So you'll have to ask yourself whether the extra $250 for a second ATI 4890 is really worth the 10 extra frames per second.
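To put that question in dollars per frame, here's a quick sketch. The $250 card price and ~20% scaling figure come from the example above; the 50 fps baseline is just a hypothetical number for illustration, not a benchmark.

```python
# Back-of-envelope math for whether a second GPU is worth it.
# Prices and baseline fps are illustrative, not benchmark results.
def cost_per_extra_fps(card_price, base_fps, scaling=0.20):
    """Dollars paid per additional frame per second from a second card."""
    extra_fps = base_fps * scaling  # ~20% average scaling at high resolutions
    return card_price / extra_fps

# A second $250 card on top of a hypothetical 50 fps baseline:
print(cost_per_extra_fps(250, 50))  # 25.0 -> $25 per extra frame per second
```

Run your own numbers before buying: if the dollars-per-frame figure looks ugly, a single faster card is usually the better spend.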
If you are building a system now, you are one lucky SOB. When I built my first system, a high end card was about $400 to $500. Today they are around $200, and some go on sale for close to $150. SLI and Crossfire solutions will easily and quickly add up to around $400 to $500, but are only beneficial with a larger monitor.
For most people a single GPU solution will perform perfectly well, and it is nonsense to upgrade to SLI or Crossfire unless you are using a 24" LCD monitor or larger. Most gamers today prefer either 22" or 24" LCD monitors. 22" LCDs typically run 1680x1050 resolution. This is the number of pixels (little square dots on the screen that each produce a color) horizontally by vertically. The more pixels you have, the higher the resolution, and thus the more detailed the image. 24" LCD monitors typically run 1920x1200 pixels, which also covers full 1080p HD if you have an Xbox 360 or PS3 hooked up to one (1080p is a standard describing resolutions with ~1080 pixels vertically, and usually applies to HDTVs, though some TVs have fewer or more than 1080. The "p" stands for progressive scan, but that's irrelevant to this blog, so I'll let you use the Google machine to learn more if you want to).
Single GPUs begin to fall behind when trying to process all the information necessary to fill those higher resolution pictures on the 24" and up monitors. Some people even have a 30" LCD monitor, which retails for over $1000 and has a resolution of 2560x1600. This is where SLI and Crossfire shine, because they provide the performance boost necessary to drive such high resolutions. Keep in mind that this is still only a 20% performance increase on average. At lower resolutions, the increase is so negligible it's not even worth mentioning.
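The extra work at higher resolutions is easy to quantify: a frame's pixel count is just width times height. A quick sketch using the monitor resolutions mentioned above:

```python
# Total pixels per frame for the common monitor sizes discussed above.
# More pixels = more work for the GPU on every single frame.
resolutions = {
    '22-inch (1680x1050)': (1680, 1050),
    '24-inch (1920x1200)': (1920, 1200),
    '30-inch (2560x1600)': (2560, 1600),
}

base = 1680 * 1050  # 1,764,000 pixels on a typical 22-inch screen
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 22-inch load)")
```

The 30" panel pushes roughly 2.3 times as many pixels per frame as the 22", which is why that's the territory where a second GPU finally starts earning its keep.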
While this information is repeated on almost every system building website and forum, people still make the horrid mistake of buying two mid range or low end graphics cards for an SLI or Crossfire setup, thinking it will give them better performance than a single high end card. Not true. Put that money toward a single GPU solution, and only invest in a second card if you have a high resolution 24" or larger monitor.
Frame Rate (Frames per Second)
Listen closely. If I build a $500 system and get 30 frames per second (FPS) in a game, and you build a $3000 system and get 300fps in a game, you just wasted $2500.
A frame is a single rendered image, basically a still picture. Movies, TV, and computers play these pictures back to back so fast that the human brain thinks they are moving. Yup, your brain is being tricked. A theater film runs at 24 frames per second, which is considered just fast enough that your brain can't tell the difference. Video game "lag" is associated with frame rates so low, under 24fps, that your brain starts to notice the gap between each individual frame.
The general rule of thumb is that a normal person cannot tell the difference above 30 frames per second. Any higher and it all looks the same.
The trained eye, such as a TV salesman like me who watches TVs all day at Best Buy and knows what to look for, can tell the difference between 30 and 60 frames per second. Basically, it's just a smoother transition. (Movies don't increase their frame rate for a variety of reasons not relevant to this blog; once again, Google is your friend.)
Virtually NO one can tell the difference above 60 frames per second. Thus, if you spend more money just to have the uber highest frame rate in a game, you really just wasted a lot of money, because a cheaper build that holds at least 30 fps won't produce a noticeable difference for most people.
Note: one can argue that building a super system now can "future proof" the PC, meaning it will last longer and play future games for at least 3 to 4 years, whereas a cheap PC that can barely play current games will have to be upgraded within a year. This is both true AND false, and will be discussed a bit later in this series. I am trying to help you get the most bang for your buck, unless you're Bill Gates and money literally means absolutely nothing to you, only binary code.
GPU Memory
Since a variety of GPUs, even ones in the same series, offer varying amounts of memory, it is worth mentioning here. Basically, the more memory the better. Sometimes, though, the improved performance isn't worth the money: for example, a small 2% increase going from a 512MB card to a 1GB card that costs $40 more (this is just an example and not based on any empirical evidence). In all cases it's important to look at benchmarks when doing your research, but in current GPUs more memory does offer better performance, particularly with video game mods. Mods, short for modifications, are community/fan created games built on a game's source code. Counter-Strike originally began as a free mod created by a kid in his basement before Valve bought the mod rights, hired the guy, and released it as an official title. Anyway, mods typically soak up more GPU memory, so the more the merrier.
Expansion Card Standard
At the time of this article, all video cards use the PCI Express 2.0 card standard. What does this mean? It's essentially the same concept as the CPU socket standard discussed in the last article: it is the type of connection between the video card and the motherboard. Your motherboard must have the same type of expansion card slot as your video card. PCI Express (PCI-E) 2.0 comes in a 16x standard for one card, the "16x" referring to the bandwidth. Some very cheap mobos have an 8x standard, thus half the bandwidth, and as mentioned before, more bandwidth equals less bottlenecking. Bottlenecking bad.
Now, most mobos have multiple PCI-e 2.0 slots (this will be discussed again when we go over motherboards, but I feel it deserves a mention here as it pertains to video cards). If you're building an SLI or Crossfire system, it is essential to have 2 or more PCI-e slots, otherwise you just can't do it. Higher end mobos will support 16x on both channels, meaning both PCI-e slots. However, some mid range mobos will only support 8x on both PCI-e slots, thus half the bandwidth. While this isn't necessarily crucial and won't have much of a noticeable effect, it is something to keep in mind. And before anybody asks: no, the 8x slot will not cut performance in half compared to 16x, it just slows down the transfer of data. The full explanation is too complicated for this blog; all you need to know is that it does affect performance, but not nearly that much. In most games (except the latest, such as Crysis perhaps) the difference wouldn't be noticeable.
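For the curious, the x16 vs x8 difference can be put in raw numbers. A PCI-E 2.0 lane carries roughly 500 MB/s in each direction (5 GT/s raw with 8b/10b encoding), so slot bandwidth simply scales with the lane count:

```python
# PCI Express 2.0 moves ~500 MB/s per lane in each direction,
# so a slot's bandwidth scales with how many lanes it runs.
PCIE2_MB_PER_LANE = 500

def slot_bandwidth_gbs(lanes):
    """One-directional bandwidth in GB/s for a PCIe 2.0 slot."""
    return lanes * PCIE2_MB_PER_LANE / 1000

print(slot_bandwidth_gbs(16))  # 8.0 GB/s for a full x16 slot
print(slot_bandwidth_gbs(8))   # 4.0 GB/s for an x8 slot
```

So an x8 slot really does halve the pipe, but as noted above, current cards rarely saturate even that, which is why the real-world frame rate hit is small.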