XFX GeForce 8600 GT XXX Review - PAGE 2
J. Micah Grunert, Michael Nguyen - Saturday, April 21st, 2007
The new NVIDIA G80 core has thus far been a bag full of surprises. In the beginning we saw the release of the powerhouse G80-based cards, notable among them the BFG GeForce 8800 GTS and the flagship XFX GeForce 8800 GTX XXX. The G80 then returned in the more affordable but still powerful XFX GeForce 8800 GTS XXX 320MB version. Now the long-awaited cards priced below the $250 range have been released, headlined by the XFX GeForce 8600 GTS XXX. All deliver their expected measure of performance, with the 8800 versions shining above the rest. Though the 8800 320MB cards are otherwise essentially identical to their siblings, that cut in memory (from 640MB down to 320MB) did have an adverse effect on frame rates at the higher resolutions, especially at 2048x1536 and 2560x1600. Read the XFX GeForce 8800 GTS XXX 320MB article and you'll see what I mean.
As for the new G84 core, we have seen thus far that the reduction in Stream Processors is akin to cutting the number of cores in a CPU when trying to crunch multithreaded applications: performance will inevitably begin to suffer.
So to illustrate the differences between the new NVIDIA GPUs, the following chart helps determine where the various cores lie in terms of overall GPU specs.
Model | Release Date | Codename | Fab process (nm) | Core clock max (MHz) | Fillrate max (Gtexel/s) | Stream Processors | Shader clock (MHz) | Bandwidth max (GB/s) | Memory type | Bus width (bit) | Memory (MB) | Memory clock (MHz) | Power (W) | Transistors (millions) | Shader power (GFLOPS)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
GeForce 8300 GT | May 2007 | G86 | 80 | 500 | ? | 16 | ? | ? | GDDR2 | 128 | 128/256 | 1200 | ? | 210 | ?
GeForce 8400 GS | May 2007 | G86 | 80 | ? | ? | 16 | ? | ? | GDDR2/GDDR3 | ? | ? | ? | ? | 210 | ?
GeForce 8500 GT | 17th April 2007 | G86 | 80 | 450 | 3.60 | 16 | 900 | 12.80 | GDDR2 | 128 | 256/512 | 400 | 40 | 210 | 43.20
GeForce 8600 GT | 17th April 2007 | G84 | 80 | 540 | 8.64 | 32 | 1190 | 22.40 | GDDR3 | 128 | 256 | 700 | 43 | 289 | 113.28
GeForce 8600 GTS | 17th April 2007 | G84 | 80 | 675 | 10.80 | 32 | 1450 | 32.00 | GDDR3 | 128 | 256 | 1000 | 71 | 289 | 139.20
GeForce 8800 GTS | 8th November 2006 | G80 | 90 | 500 | 24.00 | 96 | 1200 | 64.00 | GDDR3 | 320 | 320/640 | 800 | 147 | 681 (~750) | 345.60
GeForce 8800 GTX | 8th November 2006 | G80 | 90 | 575 | 36.80 | 128 | 1350 | 86.40 | GDDR3 | 384 | 768 | 900 | 177 | 681 (~750) | 518.40
GeForce 8800 Ultra | 1st May 2007 | G80 | 90 | ? | ? | 128 | ? | 105.60 | GDDR4 | 384 | 768 | 1100 | ? | 681 (~750) | ?
The 8800 GTS's 96 Stream Processors are cut to a third for the 8600 GT, leaving us with 32. The 256MB of GDDR3 memory is standard for this price point. Some of the core speeds and memory speeds do make a difference, but every manufacturer clocks their cards a little differently. Our review card is XFX's XXX version of the 8600 GT, which is overclocked to 600MHz core / 1.6GHz memory speeds.
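The shader power figures in the chart above can be reproduced with simple arithmetic. A minimal sketch, assuming the 3 FLOPs per stream processor per clock (MADD + MUL) commonly quoted for this GPU generation:

```python
# Shader processing power for G80/G84-class GPUs.
# Assumption: 3 FLOPs per stream processor per clock (MADD + MUL),
# the convention behind the GFLOPS figures in the spec chart.

def shader_gflops(stream_processors, shader_clock_mhz, flops_per_clock=3):
    """Peak shader throughput in GFLOPS."""
    return stream_processors * shader_clock_mhz * flops_per_clock / 1000.0

print(shader_gflops(128, 1350))  # GeForce 8800 GTX -> 518.4
print(shader_gflops(32, 1450))   # GeForce 8600 GTS -> 139.2
```

Halving the stream processor count halves peak throughput at the same shader clock, which is why the G84 parts trail the 8800s so decisively here.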
There is a more serious downside to the new G84 architecture: its 128-bit memory bus. There are two other big highlights, though: GDDR3 versus GDDR4. As we can see, the GeForce 8500 GT (and lower) use GDDR2 memory, and in one case GDDR3. A shame really, simply for the high latency of GDDR2. But with a budget card like the 8500, low memory latency and fast read/write times aren't really a factor given the narrower memory bus. A card like that will probably spend most (if not all) of its time surfing the web and drifting around the desktop. Gamers need not apply to the GeForce 8300 through 8500.
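To see why the 128-bit bus stings, the bandwidth column of the chart follows directly from bus width and memory clock. A quick sketch, assuming the usual double-data-rate convention (two transfers per clock) for GDDR2/GDDR3:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# Assumption: data_rate=2 transfers per clock for DDR-type memory.

def bandwidth_gbs(bus_width_bits, mem_clock_mhz, data_rate=2):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * mem_clock_mhz * data_rate / 1000.0

print(bandwidth_gbs(128, 700))   # GeForce 8600 GT  -> 22.4 GB/s
print(bandwidth_gbs(384, 900))   # GeForce 8800 GTX -> 86.4 GB/s
print(bandwidth_gbs(384, 1100))  # GeForce 8800 Ultra -> 105.6 GB/s
```

Even with an identical memory clock, the 8600 GT's 128-bit bus moves a third of what a 384-bit bus would; no amount of core overclocking makes up for that at high resolutions.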
But I am very excited to see that the GeForce 8800 Ultra edition (launching in May 2007) will be moving to GDDR4 memory. This new memory will most likely remain exclusive to graphics cards, helping to facilitate some truly outrageous RAM bandwidth. But we can save the tastier details until the GeForce 8800 Ultra edition hits the streets.
At the risk of repeating ourselves from our 8600 GTS review, the most welcome addition to the 8600 series is the enhanced HD video feature set. PureVideo HD (VP2 + BSP + AES128 engine) is a method whereby all of the processing of HD video (HD-DVD, Blu-ray, H.264) is handled by the GPU, thus off-loading the system processor. In the past, HD-DVD and Blu-ray media required the CPU to help decode the media stream. The changes for the added PureVideo HD functionality were made within the G84 and G86 silicon itself: a bitstream processor and a second-generation video processor comprise most of the new hardware that now handles HD decoding.
With the GeForce 7900, and even the G80-based 8800 series, NVIDIA's decoding system would only finish off the last two steps of the process while the CPU handled the first two. The new core architecture now takes on all four steps to minimize CPU usage and prevent framerate drops during video playback. So in theory, a less powerful CPU is required for HD playback if you have an 8600 GTS, 8600 GT, or 8500 GT working in conjunction with it.
I was wondering how well the 8600 series performed versus my 7600 GT. I had been contemplating selling it in search of something newer, but now, nah. I had also thought about getting another one for SLI, but since they aren't DirectX 10 compatible I don't want to get another at this point in time; best to wait. Although the brass hats at id Software recently expressed that they weren't too impressed by DirectX 10 or Vista.