Nvidia GeForce GTX 260 operating temperature. What video card temperature is considered normal. Game tests: World in Conflict

In this article we will talk about the nVidia GeForce GTX 260, once one of the top gaming video cards. The model appeared in 2009 and is considered a stripped-down version of the GTX 280 - the differences are discussed in more detail below. Given its purpose, the card is clearly aimed at gamers. Another pleasant find is that the card still runs games from 2015-2017 at decent frame rates, so it can rightfully be considered a real "dinosaur" that refuses to retire. But first things first.

Main characteristics

  • Process technology: 65 nm
  • Core clock: 576 MHz
  • Memory type: GDDR3
  • Memory frequency: 999 MHz
  • Video memory capacity: 896 MB
  • Memory bus: 448 bits
  • Rasterization units: 28
  • Texture blocks: 64
  • Number of execution processors: 192
  • Video memory bandwidth: 112 GB/sec
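
As a quick sanity check, both the unusual memory size and the bandwidth figure follow directly from this configuration: GDDR3 transfers data twice per clock, and the 448-bit bus is built from fourteen 32-bit channels, each served by a 64 MB chip.

\[
\underbrace{999 \times 2}_{\text{MT/s}} \times \underbrace{448/8}_{\text{bytes per transfer}} = 1998 \times 56 = 111\,888\ \text{MB/s} \approx 112\ \text{GB/s}
\]
\[
14 \times 32\ \text{bit} = 448\ \text{bit}, \qquad 14 \times 64\ \text{MB} = 896\ \text{MB}
\]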

Manufacturers

Reference card from nVidia

The first and most important video card manufacturer is NVIDIA.

Interestingly, this model is essentially a reject of the NVIDIA GeForce GTX 280: part of the chip's control (memory) units is disabled, which is why the GTX 260 ended up with a non-standard, so to speak, amount of memory - 896 MB. This contradicts the usual convention that video memory size should be a power of two. The GTX 280, with its full set of control units, follows this convention: its memory capacity is 1024 megabytes, that is, two to the tenth power. In addition, the stripped-down version, that is, the GTX 260, has a lower:

  • Video memory frequency (1998 MHz versus 2214 MHz for the GTX 280),
  • Memory bus width (448 bits versus 512)
  • GPU frequency (576 MHz versus 602 MHz)

The video card also has fewer unified shader processors (192 versus 240) and texture units (64 versus 80), as well as a lower recommended power supply rating (500 W versus 550 W for the GTX 280), and so on.

Due to all this, the GeForce GTX 280 naturally wins in performance, although it loses in power consumption and heat output (the GTX 280 heats up to around 100 degrees Celsius versus 80 degrees for the GTX 260), and it loses significantly in cost: at the time the GTX 280 was priced at $500, while the GTX 260 cost $200 less.

Modification of XFX GeForce GTX 260 from XFX

To be honest, nothing was changed in the technical specifications, but the card comes in a nicely designed box with many more accessories than the reference NVIDIA model, as well as a full version of Assassin's Creed. The card also comes with installation and operating instructions, and the box itself lists the minimum requirements for running the card, such as a 500 W power supply capable of delivering 36 A on the 12 V rail, and so on. The back of the box describes the various benefits of the graphics card.

Modification of ZOTAC GeForce GTX 260 AMP2! Edition by ZOTAC

When the company introduced its first modification of the GTX 260 (AMP1!), it was difficult to tell it apart from the XFX version, except for a different box with a dragon design and a different bundled game (GRID). However, ZOTAC redeemed itself: the new version is an improved card with 216 stream processors, which undoubtedly improves the speed of data exchange and, accordingly, delivers greater performance. In addition, there is one more difference from competitors' products, namely two connectors for auxiliary power (competitors have only one); otherwise everything is the same as in similar GTX 260 cards.

Comparison with analogues

The eternal battle of giants: nVidia GeForce GTX 260 vs ATi Radeon HD 4870

When these video cards were the most powerful gaming cards, they cost about the same - $300. But many people know AMD's disregard for its own products, and it could not be avoided here: AMD was indifferent to supporting its brainchild and did not release new drivers for the Radeon HD 4870. This caused a lot of negative feedback, and there were also problems with power consumption exceeding what was specified at release, so its price fell - at the moment the card costs about 6,500 rubles, while the GTX 260 has hardly fallen in price since 2009 and currently costs around 17,000 rubles.

Differences between GeForce GTX 260 and Radeon HD 4870 video cards

Characteristic | nVidia GeForce GTX 260 | ATi Radeon HD 4870
Process technology | 65 nm | 55 nm
Number of transistors | 1400 million | 956 million
GPU frequency | 576 MHz | 750 MHz
Video memory frequency (effective) | 1998 MHz | 3600 MHz
Video memory size | 896 MB | 512/1024 MB
Memory type | GDDR3 | GDDR5
Memory bus width | 448 bit | 256 bit
Execution (shader) processors | 192 | 800
Texture units | 64 | 40
Rasterization units (ROPs) | 28 | 16
Memory bandwidth | 112 GB/s | 115 GB/s

Despite the differences in characteristics, the two video cards are roughly equal in performance, as comparisons in benchmarks and some games show. But we should not forget about the Radeon's power consumption problem and the lack of support for it; therefore, despite the difference in cost and approximately equal performance, I would still prefer the nVidia GeForce GTX 260 when purchasing.

Tests

Testing the nVidia GeForce GTX 260 in benchmarks and games, with and without overclocking.

In testing, the video card behaves quite confidently and shows good results both in benchmarks and in games. As for the so-called optimization of the card by replacing the cooling system and overclocking the GTX 260, it is definitely not worth doing. Performance will not increase much, as the results below clearly show, and the money will be spent in vain. At the same time, the risk of overheating increases, since the card already runs at 80 degrees Celsius, and during overclocking the temperature climbs toward critical levels, at which point the GTX 260 will simply burn out. Here is what came out of testing the nVidia GeForce GTX 260 in various benchmarks, with and without overclocking.

Configuration of the system unit under test:

  • Motherboard: ASUSTeK P5K Deluxe/WiFi-AP (Intel P35), LGA 775, BIOS v0812
  • CPU: Intel Core 2 Extreme QX9650, 3.0 GHz, 1.25 V, L2 2 x 6 MB, FSB: 333 MHz x 4, (Yorkfield, C0)
  • CPU cooling system: Thermalright SI-128 SE (Scythe Ultra Kaze, ~1250 rpm);
  • RAM: 2 x 1024 MB DDR2 Corsair Dominator TWIN2X2048-9136C5D (Spec: 1142 MHz / 5-5-5-18 / 2.1 V) + 2 x 1024 MB DDR2 CSX DIABLO CSXO-XAC-1200-2GB-KIT (Spec: 1200 MHz / 5-5-5-16 / 2.4 V);
  • Hard drive: SATA-II 500 GB, Samsung HD501LJ, 7200 rpm, 16 MB, NCQ;
  • Optical drive: SATA-II DVD RAM & DVD±R/RW & CD±RW Samsung SH-S183L;
  • Power supply: Enermax Galaxy DXX (EGA1000EWL) 1000 Watt (standard fans: 135 mm for intake, 80 mm for exhaust).
  • Monitor: 24″ BenQ (Wide LCD, 1920 x 1200 / 60 Hz)

To rule out the possibility that the GTX 260's results would be limited by the rest of the PC hardware, it was decided to overclock the processor to 4 GHz and raise its voltage to 1.6 volts. The OS chosen for testing was Windows Vista Ultimate Edition x64 pre-SP1. On the video card itself, the GPU frequency was increased from 602 to 738 MHz and the video memory frequency from 2214 to 2484 MHz. The game settings were set to high, and testing was carried out at a resolution of 1920 x 1200.
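
For reference, those figures correspond to roughly the following frequency increases (simple percentage arithmetic, not measured data):

\[
\frac{738 - 602}{602} \approx 22.6\,\%\ \text{(GPU core)}, \qquad \frac{2484 - 2214}{2214} \approx 12.2\,\%\ \text{(memory)}
\]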

Results:

Test | GeForce GTX 260 | GeForce GTX 260 (overclocked)
3DMark 2006 | 10000 | 11681
3DMark Vantage | 4138 | 4989
S.T.A.L.K.E.R.: Shadow of Chernobyl | 67 fps | 80 fps
Call of Duty 4: Modern Warfare | 98 fps | 110 fps
Crysis | 35 fps | 42 fps

Based on the results above, we can conclude that when overclocked, the GeForce GTX 260 does not provide a large performance increase in benchmarks, although in games it shows an excellent gain, approaching the GTX 280 in performance. Whether this is worth the risk of overheating the card is up to you to decide.

P.S. In some modern games from 2013-2017, the 2009-vintage nVidia GeForce GTX 260 shows excellent performance: for example, 45 fps in GTA 5 on medium settings, 33 fps in Skyrim on medium, 100 fps in CS:GO on medium, and 40 fps in Dota 2 on maximum.

The graphics card is the busiest component inside your computer when it comes to gaming. It processes millions of instructions performing various operations during a game, and this causes it to heat up. Just as with the CPU, the GPU on the graphics card can overheat, which can lead to a variety of problems, including graphics card failure. The GPU is the main component of a graphics card where overheating occurs. The graphics memory can also get warm, but it usually does not reach dangerous levels. Overheating reduces the lifespan of the GPU and can also cause immediate damage to the graphics card.

What video card temperature is considered normal?

The answer to this question depends both on the manufacturer and on the specific graphics card model, but in general anything above 80 degrees Celsius is a cause for concern. If the GPU temperature exceeds 80°C, you should take measures to lower it, preferably into the range of 70-75°C or below.

If you are having problems with your computer's performance - especially if video editing, video processing or video playback starts to stutter, slow down or freeze - then the first thing to do is check your video card's temperature and compare it with the values in the table below.
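
If you prefer to read the temperature from a script rather than from a GPU-Z-style utility, a minimal sketch along these lines can work, assuming an NVIDIA driver with nvidia-smi is installed (very old cards such as the GTX 260 may not support this query, in which case GPU-Z, MSI Afterburner or similar tools remain the practical option):

import subprocess

def gpu_temperature() -> int:
    """Read the current GPU core temperature in degrees Celsius via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])  # first GPU only

if __name__ == "__main__":
    temp = gpu_temperature()
    verdict = "OK" if temp < 80 else "too hot - take measures to bring it down"
    print(f"GPU temperature: {temp} C ({verdict})")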

Acceptable temperatures for NVIDIA video cards

Video card | Idle temperature, °C | Permissible temperature, °C | Maximum temperature, °C
GeForce GTX 1080 Ti | 42 | 55-80 | 91
GeForce GTX 1080 | 42 | 60-84 | 94
GeForce GTX 1070 | 41 | 83 | 94
GeForce GTX 1060 | 38 | 55-75 | 94
GeForce GTX 1050 Ti | 35 | 55-80 | 97
GeForce GTX 1050 | 35 | 55-80 | 97
GeForce GT 1030 | 35 | 65-82 | 97
GeForce GTX TITAN X | 42 | 83 | 91
GeForce GTX TITAN (Z, Black) | 41 | 81 | 95
GeForce GTX 980 Ti | 42 | 85 | 92
GeForce GTX 980 | 42 | 81 | 98
GeForce GTX 970 | 44 | 73 | 98
GeForce GTX 960 | 37 | 50-78 | 98
GeForce GTX 950 | 30-35 | 75 | 95
GeForce GTX 780 Ti | 42 | 83 | 95
GeForce GTX 780 | 43 | 83 | 95
GeForce GTX 770 | 36 | 60-77 | 98
GeForce GTX 760 | 36 | 82 | 97
GeForce GTX 750 Ti | 33 | 55-70 | 95
GeForce GTX 750 | 33 | 76 | 95
GeForce GTX 690 | 34 | 77 | 98
GeForce GTX 680 | 37 | 80 | 98
GeForce GTX 670 | 36 | 55-80 | 97
GeForce GTX 660 Ti | 34 | 78 | 97
GeForce GTX 660 | 32 | 63 | 97
GeForce GTX 650 Ti Boost | 38 | 69 | 97
GeForce GTX 650 | 35 | 66 | 98
GeForce GTX 645 | - | - | 97
GeForce GT 640 | 34 | 75 | 102
GeForce GT 630 | 35 | 75 | 98
GeForce GT 620 | - | - | 98
GeForce GTX 590 | 37 | 81 | 97
GeForce GTX 580 | 42 | 81 | 97
GeForce GTX 570 | 44 | 81 | 97
GeForce GTX 560 Ti | 33 | 76 | 99
GeForce GTX 560 | 34 | 76 | 99
GeForce GTX 550 Ti | 36 | 67 | 100
GeForce GT 520 | 37 | 75 | 102
GeForce GTX 480 | 44 | 96 | 105
GeForce GTX 470 | 30-40 | 92 | 105
GeForce GTX 465 | - | 90 | 105
GeForce GTX 460 | 30 | 65-80 | 104
GeForce GTS 450 | - | 65-80 | 100
NVIDIA TITAN Xp | - | 80 | 94
NVIDIA TITAN X | - | 80 | 94

Measures to reduce GPU temperature

Here are the measures you can take to lower your graphics card's GPU temperature.

Disable GPU overclocking

If you have overclocked your graphics card, you should return the GPU to its original settings to keep the temperature from rising. If you plan to overclock again, make sure the card stays within a safe temperature range. Below you can read how to prevent your card from overheating.

Clean fan and radiator

Dust can accumulate on the heatsink and fan, reducing their performance and efficiency. Open the PC case and remove the graphics card. Then use a small brush and a vacuum cleaner to carefully remove the dust from the card. Reinstall the graphics card and monitor the temperature using GPU monitoring tools.

Changing Thermal Paste

It is possible that the thermal paste between the GPU and the heatsink has dried out and cracked, causing it to lose its effectiveness. You will have to remove the fan and heatsink, clean off the remaining old thermal paste, and carefully apply new paste. Read more about how to properly replace thermal paste.

Faulty fan

If the video card fan is not working properly, or is spinning very slowly, this may be the cause of the rising GPU temperature. In this case the only real option is to replace the faulty fan with a new one or try to have it repaired.

Install a more efficient cooling system

You can also install a good, higher-performance third-party (aftermarket) GPU cooler on your video card. And if you think the stock heatsink-fan (HSF) is not doing a good enough job, you can install a water cooling block on the card to bring the GPU temperature down.

Note: aftermarket coolers are only compatible with reference video cards or cards with a standard-size printed circuit board.

Increase air flow inside the PC case

Incorrect or poor airflow inside the computer case can also cause the graphics card's temperature to rise. To improve airflow inside your PC case, you can install additional exhaust fans.

Some time ago, our test laboratory tested several video adapters manufactured by GIGABYTE, but those were cards based on reference NVIDIA graphics chips. This time we were provided with factory-overclocked versions - that is, cards with increased memory and graphics core frequencies - from the Super Over Clock series, namely the GIGABYTE GeForce GTX 275 (GV-N275SO-18I) and GIGABYTE GeForce GTX 260 (GV-N26SO-896I). In this article we present the results of their testing, and for clarity we add the results of the previous GIGABYTE GeForce GTX 275 model (GV-N275UD-896I, the version without overclocking) and, of course, of the reference GIGABYTE GeForce GTX 295 (GV-N295-18I).

GIGABYTE GeForce GTX260 Super Over Clock (GV-N26SO-896I)

The GIGABYTE GeForce GTX 260 Super Over Clock video card is built on the NVIDIA GeForce GTX 260 graphics processor (GT200 chip), but has a number of differences from the reference card. First of all these are, of course, the changed specifications of the model. The appearance of the card and the design of its cooling system have undergone virtually no changes and look no different from the cooler installed on the reference GeForce GTX 260. It is worth noting that, according to GIGABYTE, the fan speed automatically changes depending on the current GPU temperature, but the nature of this dependence has been changed compared to the reference model. The cooling system is based on a massive aluminum radiator in a plastic casing that covers the entire printed circuit board of the video card.

When the GIGABYTE GeForce GTX 260 Overclock Edition video card is idle, the GPU temperature (according to the GPU-Z 0.3.4 utility) does not exceed 49 °C. The fan speed is 1381 rpm and does not change until the GPU reaches 85 °C, whereas on the reference model the fan speed begins to rise from 72 °C. Since a graphics card is typically not running at 100% load, the new rpm-temperature relationship gives the user a quieter card.

To determine the maximum fan speed and measure the maximum power consumption of the model, we used the FurMark 1.7.0 utility, designed for stress-loading video cards, and a hardware wattmeter. Testing showed that the maximum fan speed of the GIGABYTE GeForce GTX 260 Overclock Edition is 2910 rpm. At the same time, the GPU temperature does not exceed 86 °C, which can be considered a good result given that the card runs at increased graphics core and memory frequencies. The GPU core frequency is 680 MHz, the shader domain frequency is 1500 MHz, and the video memory frequency is 1250 MHz. Recall that on the reference card the graphics core frequency is 576 MHz, the shader domain runs at 1242 MHz, and the memory frequency is 999 MHz.

As you can see, all the main components of the video card are slightly overclocked, and therefore its performance should be at least slightly higher than the performance of the reference video card.

As for the remaining characteristics of the GIGABYTE GeForce GTX 260 Overclock Edition, they do not differ from those of the reference model, with one exception. The card is equipped with 896 MB of GDDR3 memory on a 448-bit memory bus, and its memory bandwidth is 130 GB/s.

The NVIDIA GeForce GTX 260 graphics processor (codenamed GT200), manufactured using a 55 nm process technology (die area 487 mm²), has 216 unified shader processors and 28 raster operation units (ROPs). Previous versions of video cards based on this chip used only 192 unified shader processors.

It remains to add that the GIGABYTE GeForce GTX 260 Overclock Edition has a DVI-I output, an HDMI output and a VGA (D-Sub) output, whereas the reference card was equipped only with DVI and VGA connectors. However, like all cards on this graphics chip, the model we tested does not allow three monitors to be connected at the same time; the three outputs are provided simply to give the user a choice. Like all recent video cards based on powerful graphics chips, this model occupies two slots in the system unit and supports the DirectX 10 API (Shader Model 4.0). We could, of course, also mention support for proprietary technologies such as CUDA, PhysX and so on, but let us separate purely marketing technologies from what users really need.

GIGABYTE GeForce GTX275 Super Over Clock (GV-N275SO-18I)

The GIGABYTE GeForce GTX 275 Super Over Clock video card, like the other GIGABYTE model in our testing, is built on the NVIDIA GT200 graphics processor. In this case we are talking about an overclocked version of the NVIDIA GeForce GTX 275, which is inferior in performance only to cards based on the NVIDIA GeForce GTX 285 and GeForce GTX 295 (among this company's chips, of course). All frequency characteristics have been changed on this card as well: the memory, graphics chip and unified (shader) processor clocks. While the reference model (GIGABYTE GeForce GTX 275, GV-N275UD-896I) has a memory frequency of 1200 MHz, the GIGABYTE GeForce GTX 275 Super Over Clock runs its memory at 1260 MHz. The graphics core in the new card operates at 715 MHz, whereas the reference value for cards on this chip is 633 MHz. The shader domain, that is, the unified processors, runs at 1550 MHz versus 1404 MHz on the reference model. In addition, the GIGABYTE GeForce GTX 275 Super Over Clock uses 896 MB of GDDR3 memory on a 448-bit memory bus.

Fig. 1. Comparison of video card characteristics

The cooling system used on the GIGABYTE GeForce GTX 275 Super Over Clock is the reference one; however, as with the GTX 260-based card described above, its behavior has been changed.

When the GPU is not loaded, its temperature is 43 °C and the fan speed is 1500 rpm. At maximum GPU load the temperature rises to 91 °C and the fan speed to 2150 rpm. Comparative characteristics of the GIGABYTE GeForce GTX 275 Super Over Clock and the reference video card are shown in the table.

Just like the other card we tested, the GIGABYTE GeForce GTX 275 Super Over Clock occupies two slots in the system unit, has a PCI Express 2.0 interface and three monitor outputs - DVI-I, VGA and HDMI. It remains to add that during testing this model suffered quite frequent failures in operation. This is most likely explained by the elevated clock frequencies, although it is also possible that the behavior was caused by some kind of manufacturing defect.

Testing methodology

The methodology for testing video cards is described in detail in the article "New methodology for testing processors, computers and video cards," published in the September issue of the magazine, so we will not repeat ourselves and will only briefly recall its main points. Note that in this testing we used the new Windows 7 Ultimate OS as the operating system. The reference results were also obtained on the new operating system with the latest NVIDIA ForceWare drivers available at the time of writing, version 190.62.

To test video cards, we use the ComputerPress Game Benchmark Script v.4.0, which fully automates the testing process: it allows you to select the games to test, the screen resolutions at which they are launched, the game settings (maximum display quality or maximum performance), and the number of runs for each game.

The ComputerPress Game Benchmark Script v.4.0 includes the following games and benchmarks:

  • Quake 4 (Patch 1.42);
  • S.T.A.L.K.E.R.: Shadow of Chernobyl (Patch 1.005);
  • S.T.A.L.K.E.R.: Clear Sky (Patch 1.007);
  • Half-Life: Episode 2;
  • Crysis v.1.2.1;
  • Left 4 Dead;
  • Call of Juarez Demo Benchmark v.1.1.1.0;
  • 3DMark06 v. 1.1.0;
  • 3DMark Vantage v. 1.0.1.

During testing, all games (with the exception of 3DMark Vantage v. 1.0.1) were launched at four screen resolutions: 1280x800 (or 1280x720), 1440x900, 1680x1050 and 1920x1200. The 3DMark Vantage v. 1.0.1 benchmark was run in each of its four presets (Entry, Performance, High and Extreme).

All games were launched in two settings modes: maximum performance and maximum quality. The maximum performance mode is achieved by disabling effects such as anisotropic texture filtering and full-screen anti-aliasing, setting image detail to low, and so on. In other words, this mode is aimed at obtaining the highest possible result (maximum FPS).

The maximum quality mode is achieved by using high detail settings, various effects, anisotropic texture filtering and full-screen anti-aliasing. In this mode, the result depends more on the performance of the video card and less on the performance of the processor.
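
The ComputerPress script itself is not publicly available, so the sketch below only illustrates the workflow it automates - iterate over games, quality modes, resolutions and runs, then average the runs. The run_game_benchmark command is a hypothetical placeholder, not the real tool:

import statistics
import subprocess

# Hypothetical test plan mirroring the methodology described above
GAMES = ["Crysis", "S.T.A.L.K.E.R.: Clear Sky", "Left 4 Dead"]
RESOLUTIONS = ["1280x800", "1440x900", "1680x1050", "1920x1200"]
MODES = ["max_performance", "max_quality"]
RUNS = 3  # number of runs per combination

def run_benchmark(game: str, resolution: str, mode: str) -> float:
    """Placeholder: launch a game's built-in benchmark and return its average FPS."""
    out = subprocess.run(
        ["run_game_benchmark", game, "--resolution", resolution, "--mode", mode],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip())

results = {}
for game in GAMES:
    for mode in MODES:
        for resolution in RESOLUTIONS:
            fps = [run_benchmark(game, resolution, mode) for _ in range(RUNS)]
            results[(game, mode, resolution)] = statistics.mean(fps)  # arithmetic mean of the runs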

Based on the results of all runs, the arithmetic mean was calculated for each test. In addition, an integral performance indicator was determined for each video card. To do this, a weighted average over all resolutions was first calculated for each game in each settings mode using a weighting formula.

After this, the geometric mean was calculated between the results obtained in this way for the maximum quality mode and the maximum performance mode. The resulting value served as the integral performance assessment for an individual game.

To obtain an integral performance assessment in the 3DMark Vantage test, the geometric mean of the results for all presets was calculated.

Next, the integral performance scores for each individual game were normalized to the corresponding results of the reference video card, and the geometric mean of all normalized integral results was calculated. For convenience of presentation, the resulting value was multiplied by 1000; this became the integral performance score of the video card. For the reference video card, the integral performance result is 1000 points.
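
In code form, the integral score calculation described above might look roughly like this. The per-resolution weighting formula is not reproduced in the article, so a plain mean stands in for it; everything else follows the description:

from math import prod
from statistics import mean

def geomean(values):
    """Geometric mean of a sequence of positive numbers."""
    values = list(values)
    return prod(values) ** (1.0 / len(values))

def integral_score(card_fps, reference_fps):
    """card_fps and reference_fps: {game: {mode: [fps per resolution]}}.
    The per-resolution weighting used in the article is unknown, so a plain
    mean over resolutions stands in for it here."""
    normalized = []
    for game, modes in card_fps.items():
        card_per_mode = [mean(fps) for fps in modes.values()]            # stand-in for the weighted average
        ref_per_mode = [mean(fps) for fps in reference_fps[game].values()]
        game_card = geomean(card_per_mode)   # combine max quality / max performance modes
        game_ref = geomean(ref_per_mode)
        normalized.append(game_card / game_ref)                          # normalize to the reference card
    return geomean(normalized) * 1000        # the reference card scores 1000 by construction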

The GeForce GTX 295 was used as the reference video card. The test stand had the following configuration:

  • processor - Intel Core i7 Extreme 965 (clock frequency 3.2 GHz);
  • motherboard - ASUS RAMPAGE II EXTREME;
  • motherboard chipset - Intel X58 Express;
  • Intel Chipset Device Software - 9.1.0.1007;
  • memory - DDR3-1066 (Qimonda IMSH1GU03A1F1C-10F PC3-8500);
  • memory capacity - 3 GB (three modules of 1024 MB each);
  • memory operating mode - DDR3-1333, three-channel mode;
  • memory timings - 7-7-7-20;
  • HDD - Western Digital WD2500JS;
  • operating system: Microsoft Windows 7 Ultimate;
  • video driver - ForceWare 190.62.

The absolute test results are presented in Fig. 2-20, and the integral performance indicators of the tested video cards are shown in Fig. 21.

Fig. 2. Test results in the game Quake IV (Patch 1.42)

Fig. 3. Test results in the game Quake IV (Patch 1.42)

Fig. 4. Test results in the game Half-Life: Episode 2 at maximum quality settings

Fig. 5. Test results in the game Half-Life: Episode 2 at minimum quality settings

Fig. 6. Test results at maximum quality settings

Fig. 7. Test results in the Call of Juarez DX10 Benchmark v.1.1.1.0 at minimum quality settings

Fig. 8. Test results at maximum quality settings

Fig. 9. Test results in the game S.T.A.L.K.E.R.: Shadow of Chernobyl (Patch 1.005) at minimum quality settings

Introduction

The past year was extremely difficult for Nvidia: the company suffered defeat after defeat, retreating in all sectors of discrete 3D graphics under an unexpectedly powerful onslaught from the suddenly revitalized graphics division of Advanced Micro Devices. At the same time, what happened cannot be written off as bad luck - Nvidia itself is largely to blame for the chain of failures, having initially chosen the wrong strategy for the development of its graphics processors and invested all its efforts in creating the G200. The unacceptably long delay in moving even the previous-generation core, the G92, to a more refined process technology - not to mention the flagship of the line, which desperately needed it - also played a negative role.

Although the G200 had good potential, on the 65 nm process technology it could not fully realize it: being extremely complex (1.4 billion transistors), it could not operate at high frequencies. Suffice it to recall that even the flagship model of the new family, the Nvidia GeForce GTX 280, had its shader processor domain limited to 1.3 GHz, and the Nvidia GeForce GTX 260 had to settle for a modest 1242 MHz. For comparison, the computing units ran at almost the same frequency in the original version of the Nvidia GeForce 8800 GTS, which used the 90 nm G80 core! With all this, in a number of cases the new product did not provide a decisive advantage over the much simpler ATI RV770.

Given the situation described, it is clear that the new, thinner process technology was vital for the Nvidia G200 if the company intended to keep using it as its main weapon. Firstly, moving the G200 to a 55 nm process made it possible to significantly increase its frequency potential while keeping power consumption and heat dissipation within acceptable limits; secondly, it opened the way to creating an answer to the ATI Radeon HD 4870 X2 in the form of a dual-chip Nvidia card. The latter would hardly have been possible with the 65 nm version of the G200 - a card based on two such GPUs would certainly have been too hot and uneconomical.

Obviously, Nvidia itself understood this well, having suffered greatly from the pressure exerted by the graphics division of Advanced Micro Devices. The alternative was to keep losing its already shaky position in the discrete graphics market - and while it is easy to lose influence in this sector, winning back every percentage point afterwards comes, as they say, with sweat and blood. The necessary work was therefore carried out, and its result was the 55 nm version of the G200, also known under the code names G200b, GT200b, GT206 and several others. For convenience we will call it G200b. Architecturally the G200b is nothing new: it remains the same G200, containing 240 unified shader processors, 80 texture units and 32 RBEs, but manufactured to 55 nm production standards. We should therefore expect from it either a cooler temper and greater economy, or, at comparable levels of power consumption and heat dissipation, higher performance than the 65 nm version of the G200.

Of course, the GeForce GTX 260 still uses chips with a truncated configuration: 216 ALUs, 72 texture processors and 28 RBE units. But even in the 55 nm version the complexity of the G200 remains the same, which means its cost remains high, so it is logical to assume that some of the cores installed on the GeForce GTX 260 Core 216 either failed frequency binning or have defective blocks, which prevents these chips from being used where they would be more profitable for the company - in the production of the GeForce GTX 285 and 295.

The first company to present graphics cards based on the new version of the G200 to the public was EVGA, one of Nvidia's main partners. This may indicate certain privileges, a kind of reward for loyalty to the "green" camp - EVGA supplies the market exclusively with solutions based on Nvidia chips - but the appearance of similar products from other manufacturers is now only a matter of time. Meanwhile, thanks to EVGA, we have the opportunity to be among the first to explore the capabilities of the G200b. The EVGA GeForce GTX 260 Core 216 Superclocked will help us with this.

Packaging and accessories

At first glance there is nothing unusual about the packaging of the product: it is a standard medium-sized box.


An orange stripe on a black background is practically the only design element; however, against the backdrop of the robots, dragons and busty girls in revealing armor that have long since set everyone's teeth on edge, this minimalist approach looks solid and does not irritate the eye. That said, we again see a common mistake: the inscription in the upper left corner says "896MB DDR3", while the card is, of course, equipped with GDDR3 memory, which has little in common with DDR3 and is architecturally closer to DDR2. The Superclocked inscription indicates factory overclocking, and the sticker under it indicates that the full version of the popular shooter Far Cry 2 is included in the kit.

On the back of the box there is a transparent window through which a section of the video adapter's printed circuit board with a serial number sticker is visible. Below the window there is a second sticker and an inscription warning the buyer that the serial numbers on both stickers must match. This matters for the lifetime warranty provided by EVGA, as well as for participation in the EVGA Step-Up program, which allows you to exchange the purchased card for a more powerful one within 90 days of purchase, paying the difference in cost, of course. Frankly, there is little point in this, since few people think about replacing a recently purchased graphics adapter within three months - especially when, as in this case, the card is already fast and expensive.

The quality of packaging is traditionally high: instead of a cardboard tray, EVGA uses a transparent plastic container in which the video adapter is securely fixed. This protects it from possible damage during transportation and storage. In addition to the video adapter, the box contained the following set of accessories:



  • DVI-I → D-Sub adapter
  • DVI-I → HDMI adapter
  • 2 adapters 2xPATA → PCIe
  • S/PDIF connecting cable
  • Quick installation guide
  • User guide
  • 2 EVGA branded stickers
  • CD with drivers and utilities
  • DVD with the full version of Far Cry 2

Even taking into account the manufacturer's recommended price, which is quite high at over $250, the bundle of the product deserves a flattering assessment, especially considering that it includes the full version of the popular 3D shooter Far Cry 2. The lack of a software player capable of handling modern high-definition video formats is somewhat disappointing, but given that the EVGA GeForce GTX 260 Core 216 Superclocked is clearly positioned as a gaming solution, this is not a fatal flaw, especially since the vast majority of other manufacturers do the same.

On the driver disc, in addition to the drivers themselves and an electronic version of the user manual, you can find a couple of useful utilities. One of them is the Fraps 2.9.6 testing tool known to all gamers, but the other deserves a few words of its own. EVGA Precision is a convenient graphics adapter overclocking utility, most likely based on the RivaTuner engine. In addition to controlling the clock speeds of the graphics core and memory, it allows you to manually adjust the cooling fan speed and monitor the temperatures of up to four GPUs in the system.



In addition, EVGA Precision allows you to create up to 10 profiles, assigning a different hotkey to each of them, and can display data, including the frame rate, not only on the main monitor but also on the screen of the popular Logitech G15 gaming keyboard. EVGA Precision cannot be called irreplaceable, but unlike some similar software products from other manufacturers it has a simple and functional design, which makes it suitable for everyday overclocking use.

As for Far Cry 2, this game - the heir to the legendary Far Cry, which in its day set a new standard of graphics quality for first-person shooters - needs no introduction: it is familiar to all fans of the genre and is used by us as one of the tests. It is a really good gift for the potential buyer, and we can only applaud EVGA for including it in the package - few graphics card suppliers bundle their products with such fresh games.

Overall, the packaging and bundle of the EVGA GeForce GTX 260 Core 216 Superclocked certainly deserve an "excellent" rating. The box itself looks solid and attractive, and in addition to the video adapter it contains everything necessary for its full operation. At the same time, the kit does not include outdated cables and adapters, and as a free add-on there is a genuinely modern and popular game.

PCB design

At first glance the card looks no different from the 65 nm version of the Nvidia GeForce GTX 260/GTX 260 Core 216. However, EVGA has tried to add some originality to its product, not limiting itself to the sticker on the cooling system casing: the side of the casing is painted red.






The EVGA GeForce GTX 260 Core 216 Superclocked PCB is 27 centimeters long, so this product will not suit owners of most compact cases. Considering that its direct competitor, the ATI Radeon HD 4870, uses a 23-centimeter board, this is a rather significant drawback.

Now about the differences, of which there are in fact far more than might appear at first glance. One would hardly expect a new printed circuit board design to be developed for the 55 nm version of the Nvidia GeForce GTX 260 Core 216: such an undertaking is not cheap and, given that the card is essentially a cut-down version of the Nvidia GeForce GTX 280, rather pointless. However, as soon as you remove the cooling system it becomes obvious that Nvidia has nevertheless taken this step - there are very, very many differences.



Firstly, it is immediately noticeable that all the memory chips are now located on the front side of the printed circuit board, and there are 14 of them - that is, this design provides for a 448-bit memory bus from the outset, with no possibility of expansion to 512 bits. It is difficult to say what prompted Nvidia to develop a new board; most likely it was the desire to reduce production costs at any price, although the gain is unlikely to be significant - even in its updated form the board looks quite complex and expensive to manufacture. Interestingly, the GeForce 8800 line developed along the same path in its time: back then, in most cases, the increase in clock frequencies successfully compensated for the reduction in memory bus width during the transition from the G80 to the G92.


Old GeForce GTX 260 PCB
New GeForce GTX 260 PCB


The power subsystem has also undergone a significant rework: where the old version used a five-phase regulator, obtained by trimming the original seven-phase regulator of the Nvidia GeForce GTX 280, in the 55 nm version of the Nvidia GeForce GTX 260 the GPU power regulator has four phases in total, and they use different power transistors. Bad news for owners of liquid cooling systems or non-standard air coolers: systems designed for the 65 nm Nvidia GeForce GTX 200 series will not fit the 55 nm cards - at least the monolithic models designed to cool not only the GPU and memory chips but also the power circuitry, precisely because of the different layout of the latter.


The heart of the regulator is the four-phase NCP5388 PWM controller, located on the reverse side of the printed circuit board. Next to it is an unknown chip of microscopic size, marked "BR=AL U07", apparently responsible for controlling a separate two-phase memory power regulator. As before, there are two connectors for external power. Both are six-pin PCIe 1.0 connectors rated at up to 75 W each. Other differences include a metal frame around the GPU, while the left side of the board, where the interface connectors and the NVIO chip are located, has remained virtually unchanged.


The product uses Samsung K4J52324QH-HJ1A GDDR3 memory chips with a capacity of 512 Mbit (16Mx32), rated for 1000 (2000) MHz at a supply voltage of 1.9 V.


Fourteen such chips form 896 MB of local video memory on a 448-bit bus. According to the official Nvidia specifications, GeForce GTX 260 memory, regardless of the GPU version used, should run at 1000 (2000) MHz, providing 112 GB/s of bandwidth, but EVGA overclocked the memory to 1053 (2106) MHz, which raises the bandwidth to 117.9 GB/s.

This is even slightly higher than the 115.2 GB/s of the ATI Radeon HD 4870, but given the use of chips originally rated for 1000 (2000) MHz it leaves practically no headroom for further overclocking - which, however, will be tested in the corresponding chapter of the review.
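
Both bandwidth figures are easy to verify from the clocks and bus widths (2106 MT/s on a 448-bit bus for the EVGA card, 3600 MT/s on a 256-bit bus for the Radeon HD 4870):

\[
2106 \times \frac{448}{8} = 117\,936\ \text{MB/s} \approx 117.9\ \text{GB/s}, \qquad 3600 \times \frac{256}{8} = 115\,200\ \text{MB/s} = 115.2\ \text{GB/s}
\]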

It was not possible to visually assess how much the transition to the 55 nm process reduced the area of the G200b die: the new version of the chip is covered by a metal heat spreader just like the old one, and we did not risk removing the cover, since we will need the card in the future. However, the numbers speak for themselves - the new version of the G200 has an area of 470 square millimeters versus 576 sq. mm for the old one, produced on 65 nm technology. This is not very impressive compared to the compactness of the ATI RV770, which measures only 260 sq. mm, but do not forget that the latter is almost one and a half times simpler in terms of transistor count. The chip is marked G200-103-B2, which immediately distinguishes it from the old version, marked G200-100-A2.


The official core frequencies of the Nvidia GeForce GTX 260 Core 216 are 576 MHz for the main domain and 1242 MHz for the compute domain, but today's article is about a factory-overclocked product, and here these parameters are 625 and 1350 MHz respectively. This is not a very significant increase, and it clearly does not let us judge whether the frequency potential of the G200b has grown compared to the G200 - in our practice we have encountered 65 nm versions of the Nvidia GeForce GTX 260 Core 216 overclocked by the manufacturer to higher frequencies. That is why the EVGA GeForce GTX 260 Core 216 Superclocked will be subjected to further overclocking attempts. As for the graphics core configuration, it is standard for the Nvidia GeForce GTX 260 Core 216: 216 unified shader processors, 72 texture processors and 28 RBE units. In theory, if further overclocking succeeds, the card can be expected to perform at the level of the Nvidia GeForce GTX 280 or even higher.

The card is equipped with a standard set of connectors: two dual-link DVI-I ports supporting resolutions up to 2560x1600 inclusive, a universal analog video output, two SLI connectors that allow up to three cards to be combined into a single system, and a two-pin S/PDIF connector used to pass an external S/PDIF audio stream to HDMI, for which a corresponding cable is included in the package.

Cooling system design

In many respects, the cooling system of the 55 nm version of the Nvidia GeForce GTX 260 Core 216 retains the previous design, but there are also significant changes dictated both by the new printed circuit board layout and by the use of a more economical version of the graphics core.


The most important difference, perhaps, is the significantly reduced radiator area: it has lost the section located directly in front of the mounting bracket, and has become noticeably shorter as a result. One of the heat pipes, which previously transferred heat from the power elements of the power supply system, has disappeared, and the base in contact with the GPU surface has become noticeably smaller.






Apparently, all these measures are also aimed at reducing costs, since we see no other objective reason to cut down a well-proven design. The efficiency of the new cooling system should, of course, drop noticeably, but in the end this drop can be compensated by the lower heat output of the G200b. Whether this is actually the case will be verified in the next chapter of the review.

Otherwise, as already mentioned, the general concept remains the same: a copper heat exchanger takes heat from the GPU and transfers it to the radiator via heat pipes. The usual dark gray thick thermal paste serves as the thermal interface. A radial fan blows through the radiator, and the heated air is expelled outside the system case through a series of slots in the video adapter's mounting bracket. Other elements that require cooling, such as the memory chips, the NVIO die and the power regulator transistors, contact the aluminum base of the system through its protrusions, fitted with fiber pads impregnated with white thermal paste.

The design as a whole looks quite successful, although it is somewhat simplified compared to the original version developed for the Nvidia GeForce GTX 200 family. Accordingly, one should expect somewhat lower cooling efficiency, but taking into account the use of the 55 nm version of the G200, the difference may be insignificant or may not appear at all. If the thermal conditions do not deteriorate, the simplification of the design can be considered justified.

Power consumption, thermal conditions, overclocking and noise

Since the 55 nm version of the Nvidia GeForce GTX 260 Core 216 arrived in our test laboratory for the first time, we did not fail to measure its power consumption and find out whether it has become more economical than the old 65 nm version, and how big the gain is. For this purpose, a test bench with the following configuration was used:

Processor Intel Pentium 4 560 (3.6 GHz, LGA775)
DFI LANParty UT ICFX3200-T2R/G motherboard (ATI CrossFire Xpress 3200)
Memory PC2-5300 (2x512 MB, 667 MHz)
Hard drive Western Digital Raptor WD360ADFD (36 GB)
Chieftec ATX-410-212 power supply (power 410 W)
Microsoft Windows Vista Ultimate SP1 32-bit
Futuremark PCMark05 Build 1.2.0
Futuremark 3DMark06 Build 1.1.0

According to the accepted methodology, the load in 3D mode is created with the first SM3.0/HDR test of the 3DMark06 suite, run in a loop at a resolution of 1600x1200 with forced 4x FSAA and 16x AF. The "peak 2D" mode is emulated using the "2D Transparent Windows" test from PCMark05. The latter test is quite relevant given that the Windows Vista Aero window interface makes use of the GPU.






The transition to the 55 nm process technology had the most beneficial effect on the economy of the G200 in general and the GeForce GTX 260 in particular. Although there was no noticeable gain in idle mode, under load in 3D mode it amounted to 34 W, so the efficiency crown rightfully passes from the ATI Radeon HD 4870 1GB to cards based on the G200b. This is a rather serious blow for ATI, which, we believe, should think seriously about the power consumption of its products, especially those built on a dual-processor design.

As for the distribution of load across the power lines, in the new version of the GeForce GTX 260 Core 216 the load on the external connectors is uneven, unlike in the old one, which used the PCB design and power circuitry developed for the GeForce GTX 280. The connector closest to the end of the board is loaded noticeably more heavily, by 10-13 W; however, given figures that are very far from the limit (75 W), this is not a cause for concern.

As you know, the 55 nm version of the Nvidia GeForce GTX 260 Core 216 operates at the same frequencies as the 65 nm version: 576 MHz for the main GPU domain and 1242 MHz for the shader processor domain. The EVGA GeForce GTX 260 Core 216 Superclocked described in this review was pre-overclocked by the manufacturer to 625/1350 MHz; however, we decided to go further and find out how much the frequency potential of the G200 has grown after its transfer to the 55 nm process. Without any additional measures, such as replacing the cooling system or modifying the power circuitry, the result was 715/1541 MHz for the graphics core and 1150 (2300) MHz for the memory. Not a bad result for a chip consisting of 1.4 billion transistors, especially considering that a similar sample of the Nvidia GeForce GTX 260 Core 216 using the 65 nm version of the G200 only managed to reach 650/1400 MHz. In percentage terms the gain was about 10%, but this already allows us to hope for performance at the level of the Nvidia GeForce GTX 280, which has 240 ALUs, 80 TMUs and 32 RBEs versus 216, 72 and 28 respectively for the Nvidia GeForce GTX 260 Core 216.
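
The "about 10%" figure refers to the gain in frequency ceiling relative to that 65 nm sample, which is easy to check:

\[
\frac{715}{650} \approx 1.10\ \text{(core)}, \qquad \frac{1541}{1400} \approx 1.10\ \text{(shader domain)}
\]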

A study of the temperature regime using RivaTuner showed the following picture:



At the same clock speeds, the maximum temperatures of the 65 nm and 55 nm versions of the G200 are also the same, which is apparently explained by the somewhat simplified cooling system of the latter. However, even with further overclocking of the EVGA GeForce GTX 260 Core 216 Superclocked, the maximum GPU temperature did not exceed that of the Nvidia GeForce GTX 280. Note that in 2D mode all members of the Nvidia GeForce GTX 200 family automatically reduce their GPU clocks to 300/600 MHz and the memory frequency to 100 (200) MHz, which helps them maintain low temperatures and noise levels.



Despite some design changes in the EVGA GeForce GTX 260 Core 216 Superclocked cooling system, noise measurements revealed no significant differences from the reference Nvidia GeForce GTX 280 cooler. This is not least due to the relatively high noise level of our test platform as a whole: even with completely passive cooling of the graphics card it is 43 dBA, which is quite a lot in itself. Nevertheless, we must admit that the cooler design used by Nvidia remains one of the most successful in the discrete graphics industry, combining high cooling efficiency with comfortable noise characteristics. There is no question of complete silence, but the acoustic character of the noise is favorable, since the main contribution comes from the rustle of the air flow, which hardly irritates the ear.

Test platform configuration and testing methodology

A comparative study of the performance of EVGA GeForce GTX 260 Core 216 Superclocked was conducted on a test platform with the following configuration:

Processor Intel Core i7-965 Extreme Edition (3.2 GHz, 6.4 GT/s QPI)
Motherboard Asus P6T Deluxe (Intel X58)
Memory Corsair XMS3-12800C9 (3x2 GB, 1333 MHz, 9-9-9-24, 2T)
Hard drive Maxtor MaXLine III 7B250S0 (250 GB, SATA-150, 16 MB buffer)
Power supply Enermax Galaxy DXX EGX1000EWL (power 1 kW)
Monitor Dell 3007WFP (30”, maximum resolution 2560x1600@60 Hz)
Microsoft Windows Vista Ultimate SP1 64-bit
ATI Catalyst 8.12 for ATI Radeon HD
Nvidia GeForce 180.48 WHQL for Nvidia GeForce

The graphics card drivers were configured to provide the highest possible texture filtering quality while minimizing the impact of default software optimizations. Transparency anti-aliasing was enabled, and multisampling mode was used for both architectures, since ATI solutions do not support supersampling for this function. As a result, the ATI Catalyst and Nvidia GeForce driver settings took the following form:

ATI Catalyst:

Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality

Nvidia GeForce:

Texture filtering - Quality: High quality
Texture filtering - Trilinear optimization: Off
Texture filtering - Anisotropic sample optimization: Off
Vertical sync: Force off
Antialiasing - Gamma correction: On
Antialiasing - Transparency: Multisampling
Other settings: default

The composition of the test suite has been significantly revised to better reflect current realities. As a result of the revision, it includes the following set of games and applications:

3D first-person shooters:

Call of Duty: World at War
Crysis Warhead
Enemy Territory: Quake Wars
Far Cry 2
S.T.A.L.K.E.R.: Clear Sky


3D shooters with third person view:

Dead Space
Devil May Cry 4
Grand Theft Auto IV


RPG:

Fallout 3
Mass Effect


Simulators:

Race Driver: GRID
X³: Terran Conflict


Strategies:

Red Alert 3
Spore
World in Conflict


Synthetic tests:

Futuremark 3DMark06
Futuremark 3DMark Vantage

Each game included in the test suite was configured to provide the highest possible level of detail using only the tools available in the game itself to any ordinary user. This means a fundamental rejection of manual editing of configuration files, since the player is not required to know how to do that. For some games exceptions were made, dictated by necessity; each of these exceptions is mentioned separately in the appropriate section of the review.

In addition to the EVGA GeForce GTX 260 Core 216 Superclocked, the testing participants included the following graphics cards:

Nvidia GeForce GTX 280 (G200, 602/1296/2214 MHz, 240 SP, 80 TMU, 32 RBE, 512-bit memory bus, 1024 MB GDDR3)
Nvidia GeForce GTX 260 Core 216 (G200, 576/1242/2000 MHz, 216 SP, 72 TMU, 28 RBE, 448-bit memory bus, 896 MB GDDR3)
ATI Radeon HD 4850 X2 (2xRV770, 650/650/2000 MHz, 1600 SP, 80 TMU, 32 RBE, 2x256-bit memory bus, 2x1024 MB GDDR3)
ATI Radeon HD 4870 (RV770, 750/750/3600 MHz, 800 SP, 40 TMU, 16 RBE, 256-bit memory bus, 1024 MB GDDR5)

Testing was carried out at resolutions of 1280x1024, 1680x1050, 1920x1200 and 2560x1600. The latter was used only for the main subjects of today's review, as well as for the ATI Radeon HD 4850 X2. Whenever possible, standard 16x anisotropic filtering was supplemented with 4x MSAA anti-aliasing. Anti-aliasing was enabled either by the game itself or, in the absence of such an option, forced via the appropriate ATI Catalyst and Nvidia GeForce driver settings. As already mentioned, no additional configuration tools were used.

To obtain performance data, we used the tools built into the games, recording original test demos whenever possible. In all other cases the Fraps 2.9.6 utility was used in manual mode, with three test passes and subsequent averaging of the final result. Whenever possible, not only average but also minimum performance was recorded.

Game tests: Call of Duty: World at War


The factory overclocking applied by EVGA is clearly not enough to compete on equal terms with the Nvidia GeForce GTX 280, but additional overclocking lets the card achieve this even at 2560x1600, despite the narrower memory bus. Moreover, in the latter case the minimum performance turns out to be high enough for comfortable gaming, and, overall, the overclocked EVGA GeForce GTX 260 Core 216 Superclocked is practically not inferior to the ATI Radeon HD 4850 X2 while consuming noticeably less power and making less noise.

Game tests: Crysis Warhead


The same can be seen in Crysis Warhead: additional overclocking brings the EVGA card up to the level of the Nvidia GeForce GTX 280. The exception here is the 2560x1600 resolution, where the narrower memory bus begins to tell, as does the smaller amount of memory itself - 896 versus 1024 MB.

Game tests: Enemy Territory: Quake Wars

ET: Quake Wars has an average performance cap locked at 30 frames per second, since multiplayer synchronizes all events at 30 Hz. To obtain more complete data on graphics card performance in Quake Wars, this limiter was disabled via the game console. Since testing uses the game's built-in tools, no minimum performance information is available.


The difference between the various members of the Nvidia GeForce GTX 200 family in this game is minimal; however, here too overclocking the EVGA card brings it level with the Nvidia GeForce GTX 280, and at 2560x1600 even slightly ahead of it. In all cases the performance margin is very large and far exceeds the game's requirements.

Game tests: Far Cry 2


In its original form the EVGA GeForce GTX 260 Core 216 Superclocked is slightly inferior to the Nvidia GeForce GTX 280, but overclocking the GPU to 715/1541 MHz and the memory to 1150 (2300) MHz practically closes the gap, leaving only a slight lag in minimum performance.

Game tests: S.T.A.L.K.E.R.: Clear Sky

To ensure an acceptable level of performance in this game, it was decided to forgo FSAA as well as resource-hungry options such as "Sun rays", "Wet surfaces" and "Volumetric smoke". Testing used the "Enhanced full dynamic lighting" (DX10) mode; for ATI cards the DirectX 10.1 mode was additionally used.


The EVGA product overclocked to its limit behaves similarly in Clear Sky: the increased clock speeds allow it to match the Nvidia GeForce GTX 280 in average performance, but the smaller number of functional units and the narrower memory bus cannot be compensated by the frequency gain, which negatively affects minimum performance.

Game tests: Dead Space


The fact that a deficit in ALUs, TMUs and RBEs is not so easy to compensate with clock frequencies is also confirmed in Dead Space: while in average figures the maximally overclocked EVGA card is 2-5% ahead of the Nvidia GeForce GTX 280, depending on resolution, the minimum performance remains the same. Moreover, at 2560x1600 overclocking alone is no longer enough, and the Nvidia GeForce GTX 280 again takes the lead. Given the game's relatively modest requirements, the overall performance level is very high, and the difference described above is impossible to notice with the naked eye.

Game tests: Devil May Cry 4


The overclocked EVGA card behaves in much the same way in Devil May Cry 4, demonstrating superiority over the Nvidia GeForce GTX 280 in average performance and parity in minimum performance. The exception is the 2560x1600 resolution, where it manages to significantly exceed the family's flagship model in the latter indicator.

Game tests: Grand Theft Auto IV

Since with the 512 MB of video memory that is standard today the game does not allow the "Texture Quality" option to be set above "Medium", and the maximum safe value of the "View Distance" option is 32, all tests were run with exactly these settings. All other options were set to their maximum values. Since testing uses the game's built-in tools, no minimum performance information is available.


Because of the features described above, significant differences between the test participants appeared only at 1920x1200 and 2560x1600. In the first case, overclocking the EVGA card brought its average performance to the level of the Nvidia GeForce GTX 280, and in the second it took it all the way to first place, ahead not only of Nvidia's flagship but also of the ATI Radeon HD 4850 X2. Judging by the results of the ATI Radeon HD 4870 1GB, it can also be assumed that the game favors the large number of texture processors in Nvidia's solutions.

Starting with the next review, the testing methodology in GTA IV will be adjusted to the new minimum video memory value of 896 MB.

Game tests: Fallout 3


ATI solutions are unrivaled here; however, cards based on the Nvidia G200 feel quite confident at all resolutions, including 2560x1600. Interestingly, even with only its factory overclock the EVGA GeForce GTX 260 Core 216 Superclocked is in no way inferior to the Nvidia GeForce GTX 280, and with additional overclocking it noticeably surpasses it, by 8-10%.

Game tests: Mass Effect


The situation already seen repeatedly in other tests repeats itself: maximum overclocking allows the EVGA card to compete on equal terms with the Nvidia GeForce GTX 280 in both average and minimum performance. Alas, the ATI Radeon HD 4870 1GB is no rival to G200/G200b-based solutions in this game; only the much hotter and very noisy ATI Radeon HD 4850 X2 managed to surpass them.

Game tests: Race Driver: GRID


The strong showing of the maximally overclocked EVGA card at 2560x1600, where it lost only to the dual-processor ATI monster, deserves a mention. In all other cases the picture is the usual one: factory overclocking is not enough to compete with the Nvidia GeForce GTX 280, and to achieve similar performance one has to raise the GPU and memory clocks further, while minimum performance barely increases.

Game tests: X³: Terran Conflict


The game clearly prefers ATI solutions; just look at the minimum performance at resolutions above 1280x1024. Even additional overclocking of the EVGA card helps little - at a resolution of 1680x1050, the minimum performance can only be raised to 22 frames per second, which is clearly not enough to ensure smooth gameplay. It is interesting to note that at resolutions of 1920x1200 and 2560x1600, the relatively modest factory overclocking undertaken by EVGA allows the GeForce GTX 260 Core 216 Superclocked to outperform the Nvidia GeForce GTX 280.

Game tests: Red Alert 3

The game contains a permanent average performance limiter, fixed at 30 frames per second.


So far, the performance of Nvidia cards in Red Alert 3 is far from ideal: with FSAA 4x enabled they all show unacceptably low results. We can only hope that a new version of the GeForce drivers will fix this problem. In any case, the behavior of the overclocked EVGA card is not fundamentally different from what was observed in the previous tests.

Game tests: Spore

The game contains a permanent average performance limiter, fixed at 30 frames per second. FSAA is not supported.


No useful information could be obtained, because all participating graphics cards hit the average performance cap; they are therefore all equally suitable for Spore. Since in most cases this test does not provide any meaningful information, we are discontinuing its use and will replace it with one of the popular and more informative games.

Game tests: World in Conflict


Further overclocking of the EVGA GeForce GTX 260 Core 216 Superclocked gave impressive results: at 1680x1050 the card was able to outperform even the ATI Radeon HD 4850 X2, and at 1920x1200 it turned out to be the only video adapter, apart from the aforementioned dual-processor ATI solution, to provide a comfortable level of performance. The maximum gain from additional overclocking over the Nvidia GeForce GTX 280 was almost 14%, which can fairly be considered an excellent result given the deficit in functional units as well as in video memory bandwidth and capacity.

Synthetic tests: Futuremark 3DMark06


The EVGA card did not show outstanding results in any of the 3DMark06 tests, either with the factory overclock or with the additional one we attempted. Apparently, it has hit a ceiling dictated by the hardware configuration of the GeForce GTX 260 Core 216.

Synthetic tests: Futuremark 3DMark Vantage

To minimize the influence of the central processor, the “Extreme” profile is used when testing in 3DMark Vantage, with a resolution of 1920x1200, FSAA 4x and anisotropic filtering. To complete the performance picture, results of the individual tests are from this point on taken across the entire resolution range.


Testing in 3DMark Vantage showed the exact opposite result: in the mode with additional overclocking, the EVGA GeForce GTX 260 Core 216 Superclocked was not only able to get ahead of the Nvidia GeForce GTX 280 but took first place among all test participants!


The greatest gain from overclocking is observed at 1280x1024; as the resolution increases, the gap between the EVGA GeForce GTX 260 Core 216 Superclocked and the Nvidia GeForce GTX 280 narrows: the latter still has a solid head start in memory subsystem performance, and superiority in GPU frequency alone is no longer enough to get by.


In the second test the picture is slightly different: at all resolutions the EVGA card is slightly behind the Nvidia GeForce GTX 280 in its standard mode and draws level with it when additionally overclocked. Apparently, the deficit in texture processors is successfully compensated by their higher operating frequency. Note that in this test both Nvidia solutions are noticeably inferior to the ATI Radeon HD 4850 X2.

Conclusion

Let's summarize. Testing reveals no performance differences between the 55 nm and 65 nm versions of the Nvidia GeForce GTX 260 Core 216 running at reference GPU and memory frequencies; however, the factory overclocking of the EVGA card gave it a noticeable advantage, averaging from 4.3% to 6.4% depending on the resolution. Additional overclocking added another 8-9%.
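For clarity, the percentage figures quoted here are simple relative gains between average frame rates. The short Python sketch below shows that arithmetic; the fps values in it are placeholders for illustration, not numbers taken from our charts.

def advantage_percent(fps_card: float, fps_reference: float) -> float:
    # Relative gain of one card over another, expressed in percent.
    return (fps_card / fps_reference - 1.0) * 100.0

factory_gain = advantage_percent(49.8, 47.0)  # ~6.0%: hypothetical gain from the factory overclock
extra_gain = advantage_percent(54.2, 49.8)    # ~8.8%: hypothetical gain from further overclocking
print(round(factory_gain, 1), round(extra_gain, 1))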

As a result, the main character of today's review, the EVGA GeForce GTX 260 Core 216 Superclocked, showed its best side, demonstrating that it can be a worthy rival to the ATI Radeon HD 4870 1GB. In many tests, the EVGA product was faster, but due to its losses in games such as Fallout 3, Race Driver: GRID, X³: Terran Conflict and Red Alert 3, the average advantage was only about 5%, and then only at resolutions no higher than 1680x1050. Thus, the choice in this case is determined solely by the player’s personal preferences.

As for the attempt to catch up with the Nvidia GeForce GTX 280, it did not succeed in the factory configuration; however, thanks to the new 55 nm version of the G200, the card demonstrated good overclocking potential, and with additional overclocking we managed to catch up with and even overtake the GTX 280 in almost all tests.


At standard resolutions, the average advantage of the EVGA GeForce GTX 260 Core 216 overclocked to 715/1541/2300 MHz over the Nvidia GeForce GTX 280 ranged from 3.1 to 3.8%, with the greatest effect observed in Devil May Cry 4 (1280x1024, 10.7% advantage), World in Conflict (1680x1050, 10.8%) and X³: Terran Conflict (1920x1200 and 2560x1600, 16.2% and 19.3% respectively). More than respectable, especially considering the significantly lower cost compared to the Nvidia GeForce GTX 280. The advisability of buying the latter is therefore questionable now that 55 nm versions of the Nvidia GeForce GTX 260 Core 216 have arrived, all the more so with GeForce GTX 285 cards reaching mass retail, which also use the 55 nm version of the G200 and accordingly have higher frequency potential than the GTX 280.

The product described in this review, the EVGA GeForce GTX 260 Core 216 Superclocked, deserves an extremely positive rating. In addition to attractive performance and good overclocking potential, it boasts a decent bundle that includes the full version of the popular shooter Far Cry 2 and a convenient overclocking tool, as well as a lifetime warranty and participation in the EVGA Step-Up program. The only thing that might scare off a potential buyer is the relatively high price traditionally characteristic of EVGA products: the official price of the ATI Radeon HD 4870 with 512 MB of GDDR5 memory has already fallen below $200, and the 1024 MB version is set at $239. Real prices in Moscow retail are, of course, higher than the manufacturers' official ones, but they correlate with them fairly well, so when EVGA GeForce GTX 260 Core 216 Superclocked cards go on sale here, we should expect them to cost more than the Radeon HD 4870 1GB as well.

EVGA GeForce GTX 260 Core 216 Superclocked: advantages and disadvantages

Advantages:

Using 55nm version of G200
High level of performance in modern games
Outperforms ATI Radeon HD 4870 1GB in many cases
With additional overclocking, it is ahead of Nvidia GeForce GTX 280
Performance independent of software multi-GPU support
Wide range of FSAA modes
Minimal impact of FSAA on performance
PhysX GPU acceleration support
Hardware support for HD video decoding
Support for S/PDIF audio output via HDMI
Relatively low energy consumption and heat dissipation
Relatively low noise level
Good overclocking potential
Includes the full version of Far Cry 2

Flaws:

Performance balance skewed towards texture processors and RBEs
Lack of support for DirectX 10.1 and Shader Model 4.1
Incomplete hardware support for VC-1 decoding
Lack of integrated sound core
The package does not include a software HD video player
High price


It would seem that a gaming video adapter with GDDR3 memory, long since superseded by the faster and more modern GDDR5, can no longer surprise anyone. And yet it can, if that adapter is the Nvidia GTX 260, whose characteristics can still compete with many modern devices thanks to a refined manufacturing process and numerous innovations.

Very expensive production

Judging by the fact that the rejection rate in GT200 chip production exceeded 50%, video adapters in this series were clearly overpriced on the market. But every buyer knew that they were getting not just another mass-market product from a famous manufacturer, but practically one of the best examples of first-class assembly. And we are talking about the gaming GeForce GTX 260 video card. Excellent build quality is just one item on the long list of qualities this video adapter has.

The reduction in the device's market price is due to the GDDR3 memory used. The manufacturer judged that this memory type would not be a serious bottleneck when transferring large amounts of data, because the adapter has many other interesting technologies that improve its performance. The savings are obvious: the card was cheaper to produce than a similar one with GDDR5 memory.

Gentleman's set

It cannot be said that the characteristics of the top line that includes the GeForce GTX 260 are ideal and far superior to the competition; quite the opposite. Take, for example, the 65 nm manufacturing process with a die area of 576 sq. mm - any gamer will tell you this is last century. Another interesting point is the video memory bandwidth, the most important characteristic of a gaming video adapter: 112 gigabytes per second, a very large figure even for modern gaming video cards.

The video adapter's high performance is ensured by its 448-bit memory bus. The expensive bus is apparently part of the reason the device's price is so high; lately Nvidia has preferred 384-bit buses in its top models, raising memory bandwidth by increasing chip frequencies instead, though how long that potential will last is unknown.
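The 112 GB/s bandwidth figure follows directly from the bus width and the effective GDDR3 clock. Below is a minimal Python sketch of that arithmetic, assuming the usual double-data-rate behaviour of GDDR3 and rounded clock values.

def memory_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    # Peak bandwidth = bytes transferred per clock x effective memory clock.
    bytes_per_transfer = bus_width_bits / 8   # 448 bits -> 56 bytes
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(memory_bandwidth_gb_s(448, 1998))  # ~111.9 GB/s: GTX 260 with its 448-bit bus
print(memory_bandwidth_gb_s(512, 2214))  # ~141.7 GB/s: 512-bit GTX 280, for comparison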

Something's wrong here

A very strange characteristic of the Nvidia GTX 260 is its memory size. The market norm is a capacity that is a multiple of 512 megabytes, yet here it is 896 MB. It all comes down to the bus width: the non-standard bus was obtained by disabling several memory controller units on the video adapter, which in turn reduced the total amount of memory.
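A small sketch of how the two figures are linked, assuming the GT200's memory is organised as eight 64-bit channels, each backed by 128 MB of GDDR3; disabling channels shrinks the bus width and the memory size in step.

def memory_config(active_channels: int, channel_width_bits: int = 64, mb_per_channel: int = 128):
    # Returns (bus width in bits, memory size in MB) for a given number of enabled channels.
    return active_channels * channel_width_bits, active_channels * mb_per_channel

print(memory_config(8))  # (512, 1024): the full chip, as on the GTX 280
print(memory_config(7))  # (448, 896):  one channel disabled, as on the GTX 260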

Dig a little deeper and you will find that the GTX 260 chip is essentially a reject from the older GTX 280 of the same line, which carries the standard 512-bit bus and 1024 MB of memory. It is impossible to bring the GeForce GTX 260 up to the performance of its older brother by software means alone, short of soldering in the missing controller units. At most, you can raise the core frequency by increasing its voltage: for this you will need the NVFlash utility, changing the standard voltage of 1.12 V to 1.18 V and writing the change to the video adapter's BIOS.
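A minimal sketch of that BIOS workflow, under explicit assumptions: the nvflash flag names below vary between versions of the utility, and the 1.12 V to 1.18 V change itself is normally made in a separate BIOS editor before the image is flashed back. Treat it as an outline rather than a ready-made tool.

import subprocess

def backup_bios(path: str = "gtx260_stock.rom") -> None:
    # Save the card's current BIOS to a file (assumed flag: --save; older builds use -b).
    subprocess.run(["nvflash", "--save", path], check=True)

def flash_bios(path: str = "gtx260_1.18v.rom") -> None:
    # Write the edited image back to the card; -4 -5 -6 on older builds skip ID checks (assumed flags).
    subprocess.run(["nvflash", "-4", "-5", "-6", path], check=True)

if __name__ == "__main__":
    backup_bios()
    # flash_bios()  # run only after editing and verifying the modified image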

Some goodies

The GTX 260's video outputs are also pleasing. All the devices on the market carry identical connectors, so you can attach anything, digital or analogue, from the ancient D-Sub port to the modern HDMI standard. There is even a TV output implemented as S-Video, and the bundled adapters let you connect devices with a component input.

On all Nvidia GTX 260 video cards the cooling system is impeccable. Not only is the cooling regulated automatically, but the high build quality also keeps it from causing discomfort during operation, which is quite rare on the video adapter market.

Best deal

XFX is known among gamers as one of the best video adapter manufacturers: no mass-market consumer goods, only powerful gaming devices are sold under this brand. So the appearance of the GeForce GTX 260 Black Edition on the market drew the attention of many XFX fans to the product.

The box alone is worth a mention: in size it can compete with a motherboard box. The many adapters, accessories, branded driver discs and detailed instructions will surprise few, but the included game Far Cry 2 was intended as a reward for users who use only licensed software.

The gaming card's gothic appearance is a little intimidating at first but then commands respect - not every user owns such a masterpiece. Naturally, the cooling system is top-notch, and in this respect there are no questions for the XFX GTX 260. The power consumption figure is a little puzzling: 236 W, the same as the older GTX 280 model. Apparently the factory memory overclock of 300 MHz had this effect on power draw.

Couldn't do it without Zotac

The Zotac brand, well known among gamers and until recently focused on chips from the top line, lost a little ground when it began producing devices in the low-cost segment. Nevertheless, Zotac put the GeForce GTX 260, whose technical characteristics apparently appealed to the manufacturer, onto its production line.

A branded box, excellent packaging in a shockproof bag, plenty of cables and adapters, CDs with drivers and instructions - a standard set for any top-end device. If you look closely at the adapter, you get the feeling that it is the same XFX card, only with a new Zotac dragon sticker. Even the cooling system of this GTX 260 is identical.

Apparently the factory decided not to change the chip's characteristics and left everything as is; the performance figures and the memory and core frequencies, including their maximums, confirm this. Yet in its price category the video card occupies one of the top spots, and where the catch lies is unknown.

Destroying the foundations

Gainward, Palit and Gigabyte are seen as more mass-market manufacturers, or at least that is the common opinion. Apparently the performance characteristics of the Nvidia GTX 260 interested them, and they released very capable devices that quickly found their buyers.

Judging by numerous consumer reviews in the media, there are no complaints about these three manufacturers at all: the bundle, the build quality and the performance are all impeccable. Naturally, buyers were also pleased with the price, since these manufacturers set very attractive prices for their gaming video cards.

However, there are questions about the video adapter from MSI. The device is presented without any proprietary modifications; nothing makes this GTX 260 stand out. Cooling has always come first for MSI, and with its “Cyclone” cooler the card is always easy to spot on the shelf.

Strange player in the market

The engineers and designers at GALAXY, whose creations are often found on the market in the form of video cards with a large number of fans in the cooling system, left the standard cooler on the GTX 260. Low core and memory chip temperatures are the advantage for which buyers go to the store for a GALAXY product.

And yet the manufacturer clearly did something to the video adapter: with the chip at factory specifications it performs excellently in tests, leaving behind most other cards based on the same chip. Interestingly, the core and memory temperatures remain normal.

Finally

Summing up the review, we can conclude that the GTX 260 video adapter, whose performance characteristics are very impressive for the gaming device market, is quite competitive in the top class. What is puzzling is something else entirely: all manufacturers, without exception, installed almost the same cooling system, and the user will not see any proprietary technologies from MSI or GALAXY. This oddity suggests that the card has essentially no overclocking potential, so there is no point in fitting improved cooling: raising the core voltage gives no gain in games and only increases power consumption and heat output.