Gigabyte GeForce GTS 250 1GB review

Is it time for the ageing GTS 250 chipset to be put out to pasture?

Our Verdict

For

  • Affordable
  • Proven DX10 architecture
  • Low power consumption
  • Cool and quiet
  • 256-bit memory bus

Against

  • Ageing technology
  • Soundly beaten by the Radeon HD 5770
  • Too expensive for what it is
  • Lacks DX11 support
  • Badge engineering at its worst

If longevity is any measure of success, Gigabyte's budget-busting GV-250 OC-1GI board is the daddy. After all, the GeForce GTS 250 GPU found at its heart can trace its roots right back to the GeForce 8800 GT circa late 2007.

OK, it's gone through a couple of rebranding exercises and a die shrink or two. But deep down, it remains essentially the same 128-shader, 256-bit bus, DX10 specimen it has been since birth.

That matters because, when it comes to graphics cards, the march of time is rarely kind. Two and a half years is an eternity in this game. Thus, the GV-250 OC-1GI gets seven shades beaten out of it by all comers bar Asus's Radeon HD 5670 board. The latter, if we're honest, is more of an all-round multimedia machine than an out-and-out gaming board.

Of course, you could argue it's more important how this card performs in isolation than how it compares with more expensive fare. Could you actually live with the performance on offer? Unfortunately, it's hard to tell thanks to instability in our Call of Duty benchmark.

It's the sort of game you would expect to show the GeForce GTS 250 chipset in its best light. But it won't run for more than a few seconds without crashing. We suspect a rare Nvidia driver glitch is the culprit rather than anything Gigabyte can be blamed for.

As for the other benchmarks, it's all a bit of a struggle. Unsurprisingly, playing Crysis at anything resembling high resolution or detail settings is completely off the menu. You can forget about all that parallax-mapped goodness. If you want smooth frame rates, you'll have to stomach flat textures and dull visuals.

More disappointing is the hard work the GV-250 OC-1GI makes of Just Cause 2. If you're rolling with a 1080p Full-HD monitor, running at native resolution isn't a realistic option with this card. What's more, if all that is true for today's games, things are only going to get worse with future titles.

We liked

If we had to pick our single favourite feature delivered by the Gigabyte GV-250 OC-1GI, it would have to be the pukka 256-bit memory bus. When it comes to pumping pixels there's no substitute for bus width. In fact, we don't think a performance graphics card should have anything less than a 256-bit bus.

We're also pleased to find a full complement of HDMI, DVI and VGA ports. Whether it's a standard monitor, HDTV or even a projector, the connectivity bases are covered.

Likewise, if your PC's overall power budget is limited, this card is a good option. It requires only a single six-pin external power connection.

We disliked

A beefy memory bus is all very well. But it's all for nought if the graphics architecture it feeds is old and creaky. The GeForce GTS 250 is simply too outdated to handle the latest and most demanding games. That's true both in terms of poor frame rates at high detail settings and the absent support for DX11. Desirable features such as tessellation and the Compute Shader are missing.

But the biggest killer is actually the price. In absolute terms, £110 is not expensive for a performance graphics board. But Nvidia has been cranking these things out in some form or other for years. You could pick up a very similar card for not much more money 18 months ago. At £75 or less, it would make for an infinitely more attractive option. As it is, it feels like a lot of money for old technology.

Verdict

The GPU that underpins this board was once one of the very best. It still offers many decent features including that 256-bit bus. But there's only so much you can do with a die shrink and a rebranding.

Frankly, the game is up for the GeForce GTS 250 chipset. Its dated architecture has finally caught up with it.

Follow TechRadar Reviews on Twitter: http://twitter.com/techradarreview