Graphics Card Buyers Guide

From OCAU Wiki

This is a draft page used for proposing changes to the article before it becomes final.

Some rules:

  • If you need to discuss this draft, please do so on the talk page or contact the person who created or contributed it.
  • You can make changes provided your additions are correct and make sense.
  • Corrections to any spelling, grammar and punctuation errors are much appreciated.

The Graphics Card Buyers Guide by SlamaA

I have made this guide for private use only. All of it is my own work other than the history section, which has been taken from Wikipedia. I hope you find some use in this guide.

Please note: this guide is not yet complete.




What does the Graphics card (GPU) do?

A graphics card is a device inside your computer that uses pixels to create the images displayed on your monitor.


Pixels are tiny dots on your monitor that make up the images you see. At a standard resolution setting, a monitor will display over 1 million pixels, and the computer has to decide what to do with every one of them in order to create an image. To do this, the computer needs a translator: something that takes binary data from the CPU and turns it into an image. This translation takes place on the graphics card.
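
To put some rough numbers on that, here is a minimal sketch in Python. The 1280 x 1024 resolution is an assumption used only for the arithmetic; the guide itself does not name a specific resolution.

 # Minimal sketch of the pixel arithmetic, assuming a 1280 x 1024 screen.
 width, height = 1280, 1024
 pixels = width * height
 print(pixels)        # 1310720 -- just over 1.3 million pixels
 # At roughly 60 frames per second, the graphics card has to work out a
 # colour for every one of those pixels 60 times each second.
 print(pixels * 60)   # 78643200 pixel updates per second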


A software application, alongside the CPU, sends information about the image to the graphics card. The graphics card then decides how to use the pixels to create the image and sends the result to the monitor via a cable.


To create 3D images, the graphics card first builds wireframes out of straight lines. It then rasterises the image into points on a grid and fills in the remaining pixels, working out lighting, texture and colour along the way. For fast-paced games, the graphics card has to repeat this process about 60 times per second. Without a graphics card to translate the binary data into an image, the workload would be too much for the computer to handle.
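
For readers who like to see the idea spelled out, here is a very rough sketch of those stages in plain Python. Every function name, the model format and the placeholder bodies are invented purely for illustration; this is not how a real graphics card or driver is actually programmed.

 import time
 def build_wireframe(model):
     """Break the model down into triangles made of straight lines."""
     return model["triangles"]
 def rasterise(triangle):
     """Convert a triangle into the grid points (pixels) it covers."""
     return []  # placeholder: a real rasteriser returns pixel coordinates
 def shade(pixel):
     """Work out lighting, texture and colour for one pixel."""
     return (0, 0, 0)  # placeholder colour
 def render_frame(model):
     frame = {}
     for triangle in build_wireframe(model):
         for pixel in rasterise(triangle):
             frame[pixel] = shade(pixel)
     return frame
 # A fast-paced game repeats this whole process about 60 times per second.
 model = {"triangles": []}
 for _ in range(60):
     render_frame(model)
     time.sleep(1 / 60)  # on a real PC the graphics card does this work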


The History of the Graphics Card

1970s

Modern GPUs are descended from the monolithic graphic chips of the late 1970s and 1980s. These chips had limited BitBLT support in the form of sprites (if they had BitBLT support at all), and usually had no shape-drawing support. Some GPUs could run several operations in a display list, and could use DMA to reduce the load on the host processor; an early example was the ANTIC co-processor used in the Atari 800 and Atari 5200.

In the late 1980s and early 1990s, high-speed, general-purpose microprocessors became popular for implementing high-end GPUs. Several high-end graphics boards for PCs and computer workstations used TI's TMS340 series (a 32-bit CPU optimized for graphics applications, with a frame buffer controller on-chip) to implement fast drawing functions; these were especially popular for CAD applications. Also, many laser printers from Apple shipped with a PostScript raster image processor (a special case of a GPU) running on a Motorola 68000-series CPU, or a faster RISC CPU like the AMD 29000 or Intel i960. A few very specialised applications used digital signal processors for 3D support, such as Atari Games' Hard Drivin' and Race Drivin' games.

As chip process technology improved, it eventually became possible to move drawing and BitBLT functions onto the same board (and, eventually, into the same chip) as a regular frame buffer controller such as VGA. These cut-down "2D accelerators" were not as flexible as microprocessor-based GPUs, but were much easier to make and sell.


1980s

The Commodore Amiga was the first mass-market computer to include a blitter in its video hardware, and IBM's 8514 graphics system was one of the first PC video cards to implement 2D primitives in hardware.

The Amiga was unique, for the time, in that it featured what would now be recognised as a full video accelerator, offloading practically all video generation functions to hardware. This video hardware included a fast blitter, a hardware sprite engine, a display port scrolling engine and hardware resources to draw lines, fills and other primitives. Prior to this (and for quite some time after on most systems), the CPU had to draw the display.


1990s

By the early 1990s, the rise of Microsoft Windows sparked a surge of interest in high-speed, high-resolution 2D bitmapped graphics (which had previously been the domain of Unix workstations and the Apple Macintosh). For the PC market, the dominance of Windows meant PC graphics vendors could now focus development effort on a single programming interface, Graphics Device Interface (GDI).

In 1991, S3 Graphics introduced the first single-chip 2D accelerator, the S3 86C911 (which its designers named after the Porsche 911 as an indication of the speed increase it promised). The 86C911 spawned a host of imitators: by 1995, all major PC graphics chip makers had added 2D acceleration support to their chips. By this time, fixed-function Windows accelerators had surpassed expensive general-purpose graphics coprocessors in Windows performance, and these coprocessors faded away from the PC market.

Throughout the 1990s, 2D GUI acceleration continued to evolve. As manufacturing capabilities improved, so did the level of integration of graphics chips. Video acceleration became popular as standards such as VCD and DVD arrived, and the Internet grew in popularity and speed. Additional APIs arrived for a variety of tasks, such as Microsoft's WinG graphics library for Windows 3.x, and their later DirectDraw interface for hardware acceleration of 2D games within Windows 95 and later.

In the early and mid-1990s, CPU-assisted real-time 3D graphics were becoming increasingly common in computer and console games, which led to an increasing public demand for hardware-accelerated 3D graphics. Early examples of mass-marketed 3D graphics hardware can be found in fifth generation video game consoles such as the PlayStation and Nintendo 64. In the PC world, notable failed first-tries for low-cost 3D graphics chips were the S3 ViRGE, ATI Rage, and Matrox Mystique. These chips were essentially previous-generation 2D accelerators with 3D features bolted on. Many were even pin-compatible with the earlier-generation chips for ease of implementation and minimal cost. Initially, performance 3D graphics were possible only with separate add-on boards dedicated to accelerating 3D functions (and lacking 2D GUI acceleration entirely) such as the 3dfx Voodoo. However, as manufacturing technology again progressed, video, 2D GUI acceleration, and 3D functionality were all integrated into one chip. Rendition's Verite chipsets were the first to do this well enough to be worthy of note.

As DirectX advanced steadily from a rudimentary (and perhaps tedious) API for game programming to become one of the leading 3D graphics programming interfaces, 3D accelerators evolved seemingly exponentially as the years passed. Direct3D 5.0 was the first version of the burgeoning API to really dominate the gaming market and stomp out many of the proprietary interfaces. Direct3D 7.0 introduced support for hardware-accelerated transform and lighting (T&L). 3D accelerators moved beyond being simple rasterizers, adding another significant hardware stage to the 3D rendering pipeline. The nVidia GeForce 256 (also known as NV10) was the first card on the market with this capability. Hardware transform and lighting set the precedent for later pixel shader and vertex shader units, which were far more flexible and programmable.

2000 to present

With the advent of the DirectX 8.0 API and similar functionality in OpenGL, GPUs added programmable shading to their capabilities. Each pixel could now be processed by a short program that could include additional image textures as inputs, and each geometric vertex could likewise be processed by a short program before it was projected onto the screen. nVidia also held the crown for being the first to market with a chip capable of programmable shading, the GeForce 3 (also known as NV20). By October 2002, with the introduction of the ATI Radeon 9700 (also known as R300), the world's first Direct3D 9.0 accelerator, pixel and vertex shaders could implement looping and lengthy floating point math, and in general were quickly becoming as flexible as CPUs, and orders of magnitude faster for image-array operations.
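
To give a feel for what "a short program per pixel" means, here is a loose CPU-side analogy in plain Python. The pixel_shader function and the tiny three-pixel image are made up for illustration; real pixel shaders are written in shading languages such as HLSL or GLSL and run on the GPU itself.

 # A made-up "shader" that simply darkens each pixel by halving its colour.
 def pixel_shader(colour):
     r, g, b = colour
     return (r // 2, g // 2, b // 2)
 # The "image" is just a short list of (r, g, b) values.
 image = [(255, 200, 100), (10, 20, 30), (0, 128, 255)]
 shaded = [pixel_shader(p) for p in image]
 print(shaded)  # [(127, 100, 50), (5, 10, 15), (0, 64, 127)]
 # A GPU runs the same short program for millions of pixels in parallel,
 # which is why it is so much faster than a CPU for image-array work.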

Today, parallel GPUs have begun making computational inroads against the CPU, and a subfield of research, dubbed GPGPU (General-Purpose Computing on GPUs), has found its way into fields as diverse as oil exploration, scientific image processing, and even stock options pricing determination. There is increased pressure on GPU manufacturers from GPGPU users to improve hardware design, usually focusing on adding more flexibility to the programming model.

The newest version of DirectX, DirectX 10, will be released with Microsoft Windows Vista.

On the 8th of November 2006, NVIDIA released the GeForce 8800 series of graphics cards, which are top of the line and feature the latest and greatest technology at the time of writing.

Some of these technologies include:

  • Support for Direct3D 10 with Shader Model 4.0, as well as OpenGL 2.x. Direct3D 10 is only available through Windows Vista, and software must be written for Direct3D 10 to take advantage of its new capabilities.
  • Unified shaders, which can be switched dynamically between physics, geometry, vertex or pixel shading. Historically, graphics cards had fixed numbers of non-unified shaders or pipelines, so rendering power could become bottlenecked waiting on one high-demand shader type.
  • The new HDCP-compliant PureVideo HD, which uses the graphics card to help play high-definition video, such as the video on HD DVD or Blu-ray discs.
  • Quantum Effects technology, which allows the graphics card to do physics calculations. This feature is designed with video games in mind.
  • Support for Scalable Link Interface (SLI).
  • Use of the PCI Express bus.
  • Support for up to 16x anti-aliasing as well as 128-bit high dynamic range (HDR) lighting.

What Graphics Card is for YOU?

When purchasing a graphics card, the first question you must ask yourself is: what will I be using it for? Depending on the intended use, the card you need will cost a different amount of money.


There are five types of uses for graphics cards:

  • Hardcore Gaming
  • Casual Gaming
  • Video Editing
  • Email & Office
  • Home Theatre PC


Please note: hardware specs in this next section are up to date as of the 5th of November 2006.


Now we will look into each of the five categories.


Hardcore Gaming

If you want to use the graphics card for hardcore gaming, you are going to need a more powerful high-end card, which will cost you more; this also applies to people who will be using the card for video editing. These cards cost a lot more than the average card, with prices ranging from around $200 to $1200 depending on what you want and how much you are willing to spend.

Depending on how much you want to spend, there are different tiers of high-end video cards that will suit a hardcore gamer.

The price ranges for these types of cards are:

$200-$300 - Some possible solutions for this price range are:

  • ATI Radeon X1600 Series

http://www.ati.com/products/radeonx1600/index.html


$300-$400 - Some possible solutions for this price range are:

  • NVIDIA Geforce 7800 Series (GT or GTX)

http://www.nvidia.com/page/geforce_7800.html

  • ATI Radeon X1800 Series

http://www.ati.com/products/radeonx1800/index.html


$500-$700 - Some possible solutions for this price range are:

  • NVIDIA Geforce 7900 Series

http://www.nvidia.com/page/geforce_7900.html

  • ATI Radeon X1900 Series

http://www.ati.com/products/radeonx1900/index.html


$900-$1200 - Some possible solutions for this price range are:

  • NVIDIA Geforce 7950 Series

http://www.nvidia.com/page/geforce_7950.html

  • ATI Radeon X1950 Series

http://www.ati.com/products/radeonx1950/index.html


Also see SLI and CrossFire systems.


Casual Gamer

A casual gamer is someone who only plays games occasionally rather than on a regular basis. For this task the graphics card does not need to be top of the line, but it still needs to be half decent and up to date. Cards used for casual gaming range in price from $110 to $200, so it is not a very big field.


Some cards suitable for Casual Gamers for around $110-$200 are:


  • NVIDIA Geforce 6600 Series (Standard & GT)

http://www.nvidia.com/page/geforce_6600.html

  • NVIDIA Geforce 7300 Series (GS & GT)

http://www.nvidia.com/page/geforce_7300.html

  • ATI X700 Series

http://www.ati.com/products/radeonx700/index.html

  • ATI X1300 Series

http://www.ati.com/products/RadeonX1300/index.html


Video Editing

Video editing PCs need a high-end graphics card capable of rendering demanding graphics. See the Hardcore Gaming section above, as the same cards are suitable for video editing.


Email & Office

A standard computer can run almost any graphics card. People who don't run any graphically demanding applications can get by with lower-end or older, out-of-date cards, as the graphics card hardly matters for this kind of use.


Some cards that are suited to standard use are listed below, but any card can be used.


  • NVIDIA Geforce 7100 series (GS only)

http://www.nvidia.com/page/geforce_7100.html, for example the Leadtek PX7100GS PCI Express 128MB, as reviewed by OCAU member Draco II



  • ATI Radeon X1300 Series

http://www.ati.com/products/RadeonX1300/index.html

Home Theatre PC

A Home Theatre PC is a computer that is used as a surround-sound amp, DVD player and recorder, and digital TV. It needs a graphics card with a little more power than a standard card, but less than a casual gamer's card.

The two graphics card producing companies, NVIDIA and ATI, have special series of cards made specifically for home theatre PCs.

They are as follows:

NVIDIA FX Series

http://www.nvidia.com/page/mediacenter.html


ATI All in Wonder X1900 Series

http://www.ati.com/products/radeonx1900/aiwx1900/index.html

