HCW Tech Blog


Reviewed by: Bryan Pizzuti, Carl Nelson [02.24.03]
Card Manufacturer: Powercolor
GPU Manufacturer: SiS
MSRP: $125


The Xabre600, while just released, is NOT a DirectX9-compliant part: unlike the ATI RADEON 9500 and 9700 chips, it doesn't support floating-point pixel pipelines.  It is, however, DirectX8 compliant, unlike the GeForce4 MX series.  It contains hardware pixel shaders (which the GF4MX chips lack) and quasi-hardware vertex shaders, a design that requires a bit of explanation.

Vertexelizer engine

This is a VERY fancy way of saying vertex shaders, but SiS implemented theirs in a slightly fancier way than most.  There are several methods of implementing a vertex shader engine.  The first is to simply let DirectX8 handle it through its own software engine.  This gives any card vertex shader support, but leaves the card manufacturer nothing to optimize, and it eats up a significant number of CPU cycles.  The second is to implement it completely in hardware, using one or more units on the GPU die.  This takes the major vertex-shading load off the computer's CPU, allowing much faster video performance; a GPU-based vertex shader unit can still be tweaked through video drivers, but it generally can't be upgraded.

The third way, which SiS uses, is a hybrid of the two.  The main vertex shader engine resides in the drivers, so it's initially software-based.  But the software engine is designed to use the Xabre's hardware T&L unit rather than relying exclusively on the computer's CPU, as other software implementations do.  This gives it an advantage over pure software solutions, and allows a smaller GPU die, reducing costs.
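To make the trade-off concrete, here is a minimal sketch (not SiS's actual code) of the kind of work a vertex shader does: every vertex, every frame, gets multiplied by a 4x4 transform matrix.  In a pure-software engine, all of this arithmetic lands on the CPU.

```python
# Illustrative sketch: the core arithmetic of a vertex shader is a
# 4x4 matrix times a 4-component vertex, done once per vertex per frame.
# In a pure-software fallback, the CPU performs every one of these.

def transform_vertex(matrix, vertex):
    """Multiply a 4x4 row-major matrix by a 4-component vertex."""
    return [sum(matrix[row][i] * vertex[i] for i in range(4))
            for row in range(4)]

# Identity transform: leaves the vertex unchanged.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]

# A simple translation by (2, 3, 0) in homogeneous coordinates.
translate = [[1.0, 0.0, 0.0, 2.0],
             [0.0, 1.0, 0.0, 3.0],
             [0.0, 0.0, 1.0, 0.0],
             [0.0, 0.0, 0.0, 1.0]]

v = [1.0, 1.0, 1.0, 1.0]
print(transform_vertex(identity, v))   # [1.0, 1.0, 1.0, 1.0]
print(transform_vertex(translate, v))  # [3.0, 4.0, 1.0, 1.0]
```

A hardware unit does the same multiply in dedicated silicon; SiS's hybrid keeps the shader program in the driver but hands this arithmetic to the Xabre's T&L unit.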

Another advantage of a software vertex engine, one few people mention, is upgradeability.  If the engine is etched into the GPU, it stays the way it is forever.  But if it's in the drivers, it can be upgraded, optimized, tweaked, and advanced to provide more and better performance down the road.  This is why SiS's part-hardware, part-software implementation of vertex shaders strikes me as a pretty good idea, especially from a cost and die-size standpoint.  Though the card will never be completely DirectX9 compliant, the Vertexelizer engine could be upgraded to support the v2.0 vertex shaders that are part of the DirectX9 specification.  But I wonder how multiple units could be implemented with this method, such as the 4 contained in the ATI Radeon 9500 Pro.  It's also bound to reduce vertex shading performance by SOME amount, since it has to operate through the AGP bus rather than staying completely on the chip.

Between a quasi-hardware vertex engine and actual hardware pixel shaders, the Xabre600 has a serious advantage over NVIDIA's GeForce4 MX series, which relies on DirectX8's software vertex shader engine and has no pixel shader capability whatsoever.  Feature-wise, this card is definitely a step above the GeForce4 MX series, and the Xabre600 has a few other unique features with fancy names as well.

XmartVision

So begins the "X" features list.  XmartVision is a fancy way of saying gamma correction: it brightens the picture when it's too dark, and darkens it when it's too bright.  The "Xmart" part is that it's done automatically when needed, rather than requiring manual input from the user.  This certainly beats fiddling with brightness controls while you're in the middle of a deathmatch.

Thankfully, you can disable this feature if you want to enjoy games the way the developers intended them to be seen.  I admit the feature is good for deathmatch games, though, when the ability to see your enemy is more important than the game's eye candy.
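SiS hasn't published how XmartVision actually picks its correction, but automatic gamma adjustment can be sketched from first principles: measure the average brightness of a frame and choose a gamma exponent that pulls it toward a comfortable midpoint.  Everything below (the target value, the per-frame averaging) is an assumption for illustration.

```python
import math

# Illustrative sketch of automatic gamma correction in the spirit of
# XmartVision (SiS's real algorithm is not published). Pixel values are
# normalized to 0.0-1.0. A dark frame gets gamma < 1 (brightening);
# a washed-out frame gets gamma > 1 (darkening).

def auto_gamma(pixels, target=0.5):
    """Pick a gamma so the mean of `pixels` moves toward `target`."""
    mean = sum(pixels) / len(pixels)
    if mean <= 0.0 or mean >= 1.0:
        return pixels[:]  # all-black or all-white: nothing to correct
    # Solve mean**gamma == target  =>  gamma = log(target) / log(mean)
    gamma = math.log(target) / math.log(mean)
    return [p ** gamma for p in pixels]

dark_frame = [0.1, 0.2, 0.3, 0.2]       # mean 0.2 -- too dark
corrected = auto_gamma(dark_frame)
print(sum(corrected) / len(corrected))  # much closer to 0.5 than 0.2 was
```

Disabling the feature, as the article notes you can, simply means skipping this step and passing pixel values through untouched.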

XmartAGP

This feature appears to just be a glorified way of saying that the card automatically detects the speed of your AGP bus.  SiS's information states the following:

"With the dizzying number of chipsets and motherboards available today, remembering your system's optimal AGP capability is probably not at the top of your "to-do" list. XmartAGP™ technology automatically queries the host computer, dynamically adjusting itself to maximize performance. Whether your system operates at a poky 1X AGP, or the latest 8X performance standard, Xmart-AGP™ brings out the best your computer has to offer."

This would seem to indicate some additional optimization occurs in addition to self-setting the AGP speed.  Unfortunately, I have been unable to obtain any additional details as to what sort of optimizations might be occurring.
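Whatever the extra optimizations turn out to be, the basic "query the host" part of XmartAGP is straightforward to sketch: the card and the motherboard chipset each support a set of AGP transfer rates, and the driver runs at the fastest rate common to both.  The function and rate sets below are hypothetical, for illustration only.

```python
# Hypothetical sketch of AGP rate negotiation: the card advertises the
# rates it supports, the host chipset advertises its own, and the driver
# picks the fastest rate both sides share. (The Xabre600 supports up to
# AGP 8X per its spec sheet; the negotiation logic here is illustrative.)

CARD_RATES = {1, 2, 4, 8}

def negotiate_agp(host_rates):
    """Return the fastest AGP multiplier both the card and host support."""
    common = CARD_RATES & set(host_rates)
    if not common:
        raise ValueError("no common AGP rate")
    return max(common)

print(negotiate_agp({1, 2, 4}))     # AGP 4X motherboard -> runs at 4X
print(negotiate_agp({1, 2, 4, 8}))  # AGP 8X motherboard -> runs at 8X
```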

XmartDrive

Unlike XmartAGP, which is still a bit mysterious, this feature is pretty cool.  Literally.  The drivers and card detect when you are not using the card in 3D mode to push polygons, and promptly throttle back the speed of the GPU, memory, and GPU fan.  First, this saves wear and tear on the card and fan.  It also makes your system more stable when you're not running games, since the card draws a LOT less power in 2D mode.  And it generates less heat, letting your system temperature drop a bit as well.
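The logic amounts to a two-state clock table.  The 3D clocks below are the Xabre600's rated speeds; the 2D clocks are made-up illustrative values, since SiS doesn't publish the throttled figures.

```python
# Sketch of XmartDrive-style throttling: two clock profiles, selected
# by whether a 3D workload is active. The 3D values are the Xabre600's
# rated clocks; the 2D values are illustrative assumptions.

CLOCKS_3D = {"gpu_mhz": 300, "mem_mhz": 600, "fan": "full"}
CLOCKS_2D = {"gpu_mhz": 150, "mem_mhz": 300, "fan": "low"}  # assumed values

def select_clocks(rendering_3d):
    """Return the clock profile for the current workload."""
    return CLOCKS_3D if rendering_3d else CLOCKS_2D

print(select_clocks(True)["gpu_mhz"])   # 300 -- full speed for games
print(select_clocks(False)["gpu_mhz"])  # 150 -- throttled on the desktop
```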

Comparison

Here's how the Xabre600 specifications stack up against some of its closest competitors:

Spec                      GF4MX440              Xabre600              GF4TI4200
Memory                    64 MB                 64 MB                 64 MB
GPU Speed                 270 MHz               300 MHz               250 MHz
RAM Technology            128-bit DDR,          128-bit DDR,          128-bit DDR,
                          4-channel crossbar    dual-channel          4-channel crossbar
RAM Speed (effective)     400 MHz               600 MHz               500 MHz
Bandwidth                 6.4 GB/sec            9.6 GB/sec            8 GB/sec
Pixel Pipes               2                     4                     4
Texture Units             2 per pipe            2 per pipe            2 per pipe
Textures per pass         4                     8                     8
(not per clock)
AGP                       4X                    8X                    4X
Pixel Shaders             No                    Yes                   Yes
Vertex Shaders            Software using        Software using        Hardware
                          DirectX               T&L engine            (2 units)
Multi-Monitor             Yes                   Yes                   Yes
AntiAliasing              4XS                   4X                    4XS
Z-Culling                 Yes                   Yes                   Yes
DirectX                   7                     8                     8
MSRP (US$)                $100                  $125                  $130
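The bandwidth row follows directly from the other two memory rows: peak bandwidth is the bus width in bytes times the effective (DDR) memory clock.

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective transfer
# rate. The "RAM Speed" row already lists the DDR effective rate, and
# all three cards use a 128-bit bus. (1 GB = 10**9 bytes here.)

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(128, 400))  # GF4MX440:  6.4
print(bandwidth_gb_s(128, 600))  # Xabre600:  9.6
print(bandwidth_gb_s(128, 500))  # GF4TI4200: 8.0
```

The Xabre600's 600 MHz effective memory is what gives it the highest raw bandwidth of the three, despite its mid-pack price.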
