Tuesday 19 February 2013

NVIDIA Releases "GeForce GTX Titan" - Hardware Analysis


Numerous rumours circulated about a Kepler GK110 GPU joining the GeForce family. It started with talk of a GeForce GTX 685 - later forgotten once it seemed NVIDIA was done with the 600 Series. Speculation then turned towards a possible GeForce GTX Titan, allegedly the same card as the rumoured GeForce GTX 780 Ti, to fit NVIDIA's naming convention. Now the speculation is over and this is it, folks - I present to you the GeForce GTX Titan!

AMD claims to be happy with its Radeon HD 7000 Series GPUs and showed a roadmap indicating that the HD 7000 Series will remain its stable lineup throughout 2013, with the focus shifting to improving driver support. That doesn't mean, however, that there won't be new Radeon HD 8000 GPUs in 2013.

Now, NVIDIA is not holding back the big guns and has released what is, as of February 2013, the fastest single-GPU graphics card available.

GeForce GTX Titan is based on the same Kepler GK110 core used in the Tesla K20 compute card, but with 14 of the 15 SMX units enabled, giving it 2688 Shader Processing Units, 48 ROPs and 224 TMUs on a 384-bit bus of fast GDDR5. The core clock runs at 837MHz and rises to 876MHz in boost mode, while the memory clock operates at 1502MHz. GeForce GTX Titan draws no more than 250 Watts. Early GeForce GTX Titan benchmarks indicate its performance is up to 85% of the dual-GPU GeForce GTX 690!
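The memory bandwidth implied by these specifications can be sanity-checked with some quick back-of-the-envelope arithmetic (the 4x multiplier reflects GDDR5's effective data rate per pin):

```python
# Back-of-the-envelope memory bandwidth from the quoted specs.
bus_width_bits = 384
memory_clock_mhz = 1502      # GDDR5 command clock, as quoted above
transfers_per_clock = 4      # GDDR5 effectively moves 4 bits per pin per clock
effective_mtps = memory_clock_mhz * transfers_per_clock   # 6008 MT/s
bandwidth_gbs = effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 288.4 GB/s
```

That works out to roughly 288 GB/sec, in line with the card's quoted memory bandwidth.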

Looking at the specifications on paper and what is expected from GeForce GTX Titan, I would say this GPU has several major pros - but one enormous con as well.

If it really does offer 85% of GeForce GTX 690's performance while consuming only 250 Watts (exactly what the Radeon HD 7970 GHz Edition consumes), it's clearly an extremely energy-efficient GPU - though we expected no less from a Kepler-based design. On top of that, as a single-GPU card it's expected to run much cooler than GeForce GTX 690 and to rely far less on driver support.

With Titan comes a new technology called GPU Boost 2.0.



Instead of the regular Boost, in which the GPU's core and shader clocks automatically raise themselves to a set frequency (known as the Engine Clock), GeForce GTX Titan features an upgraded version in which the GPU dynamically adjusts voltage based on its temperature, allowing an even higher frequency than the Engine Clock, known as the Max Clock. In short: the lower the temperature, the higher the voltage the GPU can safely apply, and the higher the resulting Max Clock.
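As a rough illustration of the idea - not NVIDIA's actual algorithm; the linear ramp and the temperature target used here are assumptions for the sketch - the behaviour looks something like this:

```python
# Illustrative sketch of temperature-driven boost, in the spirit of GPU Boost 2.0.
# The linear ramp and the 80 C target are assumptions, not NVIDIA's real curve.
ENGINE_CLOCK = 837   # MHz, guaranteed base clock
MAX_CLOCK = 876      # MHz, the quoted boost ceiling
TEMP_TARGET_C = 80   # assumed temperature target

def boost_clock(gpu_temp_c: float) -> float:
    """Cooler GPU -> more thermal headroom -> higher clock, up to MAX_CLOCK."""
    if gpu_temp_c >= TEMP_TARGET_C:
        return ENGINE_CLOCK          # no headroom left: fall back to base
    headroom = (TEMP_TARGET_C - gpu_temp_c) / TEMP_TARGET_C   # 0.0 .. 1.0
    return ENGINE_CLOCK + headroom * (MAX_CLOCK - ENGINE_CLOCK)

print(boost_clock(85))   # 837 - at/above the target, no boost
print(boost_clock(40))   # 856.5 - halfway to the target, half the boost range
```

The real implementation is far more granular (it also factors in power draw), but the principle is the same: thermal headroom is converted into clock speed.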

Everything looks good, but GeForce GTX Titan is priced at around 900 US Dollars, which makes it extremely uncompetitive. Obviously, a GPU of this magnitude, with over 7 billion transistors, has an extremely high manufacturing cost. However, such a high price will leave it stranded in the market. It makes much more sense to get a Radeon HD 7970 CrossFire setup, which ends up costing less than 800 USD and performs better.

GeForce GTX Titan's price might not be final yet. In our opinion, if it remains that high, GeForce GTX Titan will end up being bought only by extremely wealthy gaming enthusiasts or benchmarking fanatics with money to burn. Unfortunately for NVIDIA, that's less than 1% of the GPU market.

Needless to say, if playing Crysis 3 on the Ultra preset, with 8xMSAA, on a 2560x1600 monitor is your dream, GeForce GTX Titan is your "girl". The 288 GB/sec of memory bandwidth plus the 2688 Shader Processing Units ought to do the job well. On top of that, Crysis 3 is optimized for NVIDIA GPUs.

Now, the big question is: is this trouble for AMD?

Assuming all of this holds true, here's the relative performance of the top high-end solutions:



Yeah, GeForce GTX 690 still takes the lead...

Here are their prices:



As usual, AMD's top high-end GPU costs less than NVIDIA's top high-end GPU.

Let's not forget the performance-per-watt ratio:



As mentioned, GeForce GTX Titan offers the best performance per watt. Almost all NVIDIA GPUs are more energy efficient than AMD's.
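A quick check with this article's own numbers bears that out - Titan at roughly 85% of GTX 690 performance for 250 W, against the GTX 690's 300 W TDP (the performance index is our estimate from the benchmarks, not a measured figure):

```python
# Rough performance-per-watt comparison from the figures in this article.
# GTX 690 is the baseline (performance index 100); the 85 index for Titan
# is the article's estimate, not a measurement.
cards = [
    ("GeForce GTX 690",   100, 300),   # name, perf index, TDP in watts
    ("GeForce GTX Titan",  85, 250),
]

for name, perf, watts in cards:
    print(f"{name}: {perf / watts:.3f} perf/W")
# GeForce GTX 690: 0.333 perf/W
# GeForce GTX Titan: 0.340 perf/W
```

So even against NVIDIA's own dual-GPU flagship, Titan comes out slightly ahead on efficiency.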

And last, but certainly not least, the price-to-performance ratio:



Radeon HD 7970 takes a clear win on this one, while GeForce GTX Titan is smashed - even by NVIDIA's own GeForce GTX 690.
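Putting the article's figures together makes the gap obvious. Note the caveats: GTX 690 is the baseline at index 100, Titan's 85 comes from the early benchmarks above, while the HD 7970 performance index and all three prices are rough assumptions for illustration, not measured data:

```python
# Illustrative price-to-performance ranking. The HD 7970 perf index and
# the street prices are assumptions; Titan's 85 is the article's estimate.
cards = {
    "Radeon HD 7970 GHz": {"price_usd": 450,  "perf": 60},
    "GeForce GTX 690":    {"price_usd": 1000, "perf": 100},
    "GeForce GTX Titan":  {"price_usd": 900,  "perf": 85},
}

ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["perf"] / kv[1]["price_usd"],
                reverse=True)
for name, c in ranked:
    print(f"{name}: {c['perf'] / c['price_usd'] * 1000:.0f} perf per 1000 USD")
```

Under these assumptions the HD 7970 tops the ranking and Titan lands at the bottom, behind even the pricier GTX 690 - exactly the picture the chart painted.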