
Monday, December 31, 2012

NVIDIA GTX 780 COMING IN MARCH 2013?

The GTX 700 series will arrive by March 2013 at the earliest, sources say, and will offer around 25-30% better performance and power efficiency as the 28nm process is refined. The legendary GK110 part that is speculated about so much is expected to remain in the non-consumer realm only – in the Tesla K20 and Quadro K6000 cards.
AMD's Sea Islands, aka the HD 8000 series, will probably debut slightly earlier than Nvidia's GTX 700 series, but it will face the same process yield problems as Nvidia.

Saturday, December 1, 2012

GeForce 310.64 Beta Drivers: An Essential For Far Cry 3

nVidia has released a new set of beta drivers for their GeForce cards, containing substantial improvements to help prepare for Far Cry 3.

The new beta drivers contain all the same improvements that the 310.54 and 310.61 beta drivers had; this set further optimizes performance in preparation for Far Cry 3's release in North America.
Release notes indicate a performance boost of up to 38% in Far Cry 3 using a GeForce GTX 680 and up to 6% using a GeForce GTX 660.

Along with this are the optimizations added previously for Call of Duty: Black Ops II and Assassin's Creed III, plus updates to the nVidia 3D Vision profiles for a handful of games, notably 007 Legends, The Amazing Spider-Man, and Medal of Honor: Warfighter. In their release notes, nVidia also states that they are actively working on fixing issues reported in games such as Battlefield 3 and Assassin's Creed III.

Far Cry 3 was released today in Europe but doesn't hit shelves in North America until Tuesday. nVidia is keeping up to date and constantly testing to ensure that the release goes smoothly for PC gamers and that GeForce cards can efficiently handle such a performance-driven game. by omegaphoenix

Tuesday, November 13, 2012

NVIDIA Releases New 310.54 Driver, Up to 16% Performance Improvement

NVIDIA is stepping up to the plate by offering their GeForce customers something similar. While the primary focus of the new 310.54 drivers is to offer optimal release-day performance and DX11 optimizations for Call of Duty: Black Ops 2 and Assassin's Creed III, there are some additional benefits included as well. According to the driver's release notes, many games will see up to 16% better framerates, even though the lion's share of improvements is reserved for the aforementioned big-name titles.

Call of Duty: Black Ops 2 and Assassin's Creed III are likely going to be two of this holiday season's hottest titles, and NVIDIA worked with both games' developers to include TXAA support. This should lead to high-quality anti-aliasing without a significant framerate reduction. Additional optimizations for these titles are included within the 310.54 driver as well, allowing for an approximate 26% framerate increase over the WHQL version currently posted on NVIDIA's download page. If you have an NVIDIA card and will be buying Call of Duty: Black Ops 2 or Assassin's Creed III, be sure to install this driver.
Source: GeForce.com
Alongside the aforementioned items, NVIDIA has added an automatic LOD bias for SGSSAA, which can be applied through NVIDIA's Inspector tool. The passage below is from a GeForce.com post detailing the changes to this ultra-high-quality AA routine:
Several years ago Fullscene Sparse Grid Supersampling Anti-Aliasing was added to the GeForce drivers as an advanced anti-aliasing option for those with high-end systems. When enabled and correctly configured, SGSSAA significantly increases the quality of Multisample Anti-Aliasing, helping remove aliasing that even 8xMSAA struggles with.
Configuring SGSSAA to reach this level of detail can at times be tricky due to the need to counteract texture blurring that occurs when using the technique. With a Negative LOD Bias, applied via NVIDIA Inspector, texture quality can be restored, but unfortunately the correct value can only be ascertained through trial and error.
Following repeated calls from users for a solution, we silently introduced an automatic LOD Bias feature in last month’s 310.33 beta driver, which we’re pleased to officially announce today. Now, users need only enable Sparse Grid Supersampling in the profile of a compatible game in NVIDIA Inspector, and the new feature will do the rest.
NVIDIA continues their work on improving multi-card scaling, and the 310.54 stack adds SLI profiles for Hitman: Absolution, Hawken, Natural Selection 2 and Primal Carnage. by skymtl

Monday, November 5, 2012

Danger Den to Close its Doors, Cuts Prices by 75%

After being in the business of supplying PC enthusiasts for over a decade, water cooling product manufacturer Danger Den is in the process of closing its doors and winding down operations.
Long thought of as a pioneer in the water cooling business, Danger Den has announced that its time as a leading manufacturer of everything from CPU blocks to reservoirs to PC cases is coming to an abrupt end. It seems that supplying and developing new products for an ever-shrinking market was simply no longer financially viable.

In a short post on their site, Dan, Jeremy, Dennis and Rokk announced their decision:
“After 12 years our hobby has come to an end. It’s time to pursue other interests and Danger Den will be closing its doors. Thank you for all your support over the years, we’ve enjoyed being part of your modding community.”

To many, this will be a particularly sad day, as Danger Den was always thought of as a major player within the water cooling niche. They have been around since the dawn of liquid cooling, and their products were well received by enthusiasts all over the world.

In order to clear out in-stock merchandise, their online store is offering 75% off every single item until Monday. At this time there isn't much left, but this could be an opportunity for some people to stock up on parts that won't be available anymore. by skymtl

Saturday, November 3, 2012

iPad Mini Hands-On


We just spent a good amount of time with the iPad mini, and the easiest way to describe the device is that it's lighter than you'd expect. The build quality and finish both feel as good as you'd expect, but the device is considerably lighter than the full-size iPad, which results in a superior in-hand feel.
The display doesn't feel cramped either thanks to the reasonably large diagonal size. It's clear that the iPad mini is a nod to those who want something even more portable than the standard iPad.
In terms of performance, there's a pretty noticeable difference between the A5 in the iPad mini and the A6X in the 4th-gen iPad, as you'd expect. I do wish that Apple had brought the A6 to the mini; however, something has to give in pursuit of the lower price point.
The LTE version of the iPad mini has an RF window at the top of the unit similar to the standard iPad, although it blends in a bit better on the black model. by anand lal shimpi

Friday, November 2, 2012

Best Graphics Cards For The Money: October

Since our last monthly update, Nvidia launched three new graphics cards: the GeForce GTX 650, 650 Ti, and 660.

The GeForce GTX 650 is essentially a GeForce GT 640 with a higher-clocked 1058 MHz core and 1250 MHz GDDR5 (instead of 891 MHz DDR3 memory). That increased memory bandwidth immediately uncorks this card's performance, putting it head-to-head against the Radeon HD 7750. A $120 price point sounds about right until you hop online and see AMD's Radeon HD 7750 selling for $105. The notably faster Radeon HD 7770 goes for $125. After a series of price drops from AMD, the GeForce GTX 650 needs to get closer to $105 before it's really competitive.
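The "uncorking" described above is simple arithmetic: peak memory bandwidth is the effective transfer rate times the bus width in bytes. A rough sketch of the comparison (assuming the usual 128-bit bus on both cards, with DDR3 moving data twice per clock and GDDR5 four times per clock):

```python
# Rough peak-bandwidth comparison: GT 640 (891 MHz DDR3) vs GTX 650
# (1250 MHz GDDR5). Assumption: 128-bit bus on both cards, DDR3 at
# 2 transfers per clock, GDDR5 at 4 transfers per clock.

def peak_bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

ddr3 = peak_bandwidth_gbs(891 * 2, 128)    # GT 640:  ~28.5 GB/s
gddr5 = peak_bandwidth_gbs(1250 * 4, 128)  # GTX 650: ~80.0 GB/s
print(f"GT 640 DDR3:   {ddr3:.1f} GB/s")
print(f"GTX 650 GDDR5: {gddr5:.1f} GB/s")
```

Nearly triple the theoretical bandwidth from the same bus width, which is why the memory swap alone lifts the card into HD 7750 territory.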

Next, the GeForce GTX 650 Ti is built around the same GK106 GPU found on Nvidia's GeForce GTX 660, but with a single GPC cluster disabled. The result is a processor with 768 shaders, 64 texture units, and two ROP partitions capable of 16 raster operations per clock. The card's core operates at 925 MHz, and its GDDR5 memory runs at 1350 MHz. All told, the GeForce GTX 650 Ti outperforms AMD's Radeon HD 6850 and Nvidia's GeForce GTX 460, nearly reaching the same performance levels as the GeForce GTX 560 and Radeon HD 6870. Unfortunately, a rather narrow 128-bit memory interface hampers frame rates at higher resolutions with MSAA enabled. Nevertheless, the GeForce GTX 650 Ti is the highest-performing $150 card on the market.
You might think that this fact alone would earn the GeForce GTX 650 Ti an easy recommendation, but AMD's counter-strike cannot be ignored: a discounted 1 GB version of the Radeon HD 7850 sells for $170. The 650 Ti is quite a bit slower, and saving $20 doesn't make up the difference. Nvidia's other issue is that street prices on the GeForce GTX 650 Ti are notably higher than $155. We'd want to see wider availability in the $140 to $150 range for this new card to earn more than an honorable mention.
Finally, the GeForce GTX 660 employs an uncut version of the same GK106 processor, giving it 960 cores, 80 texture units, and three ROP partitions. It sports a 980 MHz core clock rate and 1502 MHz GDDR5 memory on an aggregate 192-bit interface. Selling for $230, it's very attractive next to a $250 Radeon HD 7870. But because it takes a performance hit when MSAA is applied, it shares our recommendation around $240 with AMD's Radeon card. by don woligroski

Monday, October 15, 2012

Nvidia announces GeForce GTX 650 Ti and Assassin's Creed 3 bundles

PC gamers looking to improve their graphics quality without emptying their wallets can check out graphics cards based on the Nvidia GeForce GTX 650 Ti available Tuesday at a starting price of $150. The GTX 650 Ti uses the same Kepler architecture found in other GTX cards such as the $380 GTX 670, but at a far lower price point. You could get an even lower price by snapping up the GTX 650, but for about an extra $30 Nvidia says the GTX 650 Ti is up to 40 percent faster than the regular 650. Select versions of the GTX 650 Ti also come with a free copy of Assassin's Creed 3.
The reference design for Nvidia's new GTX 650 Ti features a 925 MHz core clock, 768 CUDA cores (double the amount in the GTX 650), and 1 GB of dedicated memory. The card uses one 6-pin PCI Express connector, and offers one HDMI and two dual-link DVI ports. Nvidia says you can expect the GTX 650 Ti to offer performance of about 42 frames per second on Battlefield 3, and around 40 frames per second on Borderlands 2 and similar games, on a 1920 x 1080 display. The GTX 650 Ti can support up to four separate monitors simultaneously depending on the version you purchase.
Card manufacturers may ship versions of the GTX 650 Ti in different configurations. One example is Zotac's ZT-61103-10M, which will include 2GB of frame buffer memory and a DisplayPort connector. Clock speeds may also vary from one model to the next.
The reference design is one slot wide and less than six inches long, so may fit in compact cases. Power consumption is rated at 110 watts, so most PC power supplies should be able to handle the load.
Not every card will be as small as Nvidia's reference design. Gigabyte's GV-N65TOC-2GI will offer dual cooling fans to help cool down an overclocked GPU. Overclocked cards may also draw more power, but power consumption is still low enough to enable them to run in most PC systems with 350W or greater power supplies.
Nvidia is aiming the GTX 650 Ti at users stepping up from older budget cards like the 9600 GT: price-conscious gamers who only upgrade every couple of years and aren't willing to spend more than $200.
The Assassin's Creed 3 bundle offering won't be offered with every version of the GTX 650 Ti, so you need to check that the card you're buying includes a download coupon for the game. A quick check on Newegg showed versions of the GTX 650 Ti from Asus, EVGA, Gigabyte, MSI, and Zotac all included the Assassin's Creed 3 bundle. Prices on Newegg started at $155.

Saturday, September 29, 2012

Best Gaming CPU: $200 And Up Price Range

Core i5-3570K
Codename: Ivy Bridge
Process: 22 nm
CPU Cores/Threads: 4/4
Clock Speed (Max. Turbo): 3.4 GHz (3.8 GHz)
Socket: LGA 1155
L2 Cache: 4 x 256 KB
L3 Cache: 6 MB
Thermal Envelope: 77 W


The Core i5-3570K is 300 MHz faster than the Core i5-3450 at stock speeds, and the K-series' unlocked ratio multiplier is a must-have for overclockers looking to unleash significant performance improvements. It is for this reason alone that a gamer should shell out the extra $30 over Intel's slower model. After all, the pricier chip's HD Graphics 4000 is hardly relevant when you plan to use a discrete card anyway.
If you don't plan to overclock, then we think that there's little reason to look past the Core i5-3450.
Read our review of the Ivy Bridge-based CPUs
CPUs priced over $230 offer rapidly diminishing returns when it comes to game performance. As such, we have a hard time recommending anything more expensive than the Core i5-3570K, especially since this multiplier-unlocked processor can be overclocked to great effect if more performance is desired. It meets or beats the $1000 Core i7-990X Extreme Edition when it comes to gaming.

But now that LGA 2011 is here, there's certainly an argument to be made for it as the ultimate gaming platform. LGA 2011-based CPUs have more available cache and as many as two more execution cores than the flagship LGA 1155 models. Additionally, more bandwidth is delivered through a quad-channel memory controller. And with 40 lanes of third-gen PCIe connectivity available from Sandy Bridge-E-based processors, the platform natively supports two x16 and one x8 slot, or one x16 and three x8 slots, alleviating potential bottlenecks in three- and four-way CrossFire or SLI configurations.

Although they sound impressive, those advantages don't necessarily translate into significant performance gains in modern titles. Our tests demonstrate fairly little difference between a $225 LGA 1155 Core i5-2500K and a $1000 LGA 2011 Core i7-3960X, even when three-way graphics card configurations are involved. It turns out that memory bandwidth and PCIe throughput don't hold back the performance of existing Sandy Bridge-based machines.
Where we do see the potential for Sandy Bridge-E to drive additional performance is in processor-bound games like World of Warcraft or the multiplayer component of Battlefield 3. If you're running a three- or four-way array of graphics cards already, there's a good chance that you already own more than enough rendering muscle. An overclocked Core i7-3960X or -3930K could help the rest of your platform catch up to an insanely powerful arrangement of GPUs.

While we generally recommend against purchasing any gaming CPU that retails for more than $220 from a value point of view (sink that money into graphics and the motherboard instead), there are those of you who have no trouble throwing down serious money on the best of the best, and who require the fastest possible performance available. If this describes your goals, the following CPU is for you:

Best Gaming CPU for $570:
Core i7-3930K

Core i7-3930K
Codename: Sandy Bridge-E
Process: 32 nm
CPU Cores/Threads: 6/12
Clock Speed (Max. Turbo): 3.2 GHz (3.8 GHz)
Socket: LGA 2011
L2 Cache: 6x 256 KB
L3 Cache: 12 MB
Thermal Envelope: 130 W
Take the $1000 Core i7-3960X, remove 3 MB of L3 cache, and drop the base clock rate by 100 MHz. What do you end up with? Four hundred dollars and change left over, and an Intel Core i7-3930K.
The 100 MHz difference in clock rate is hardly relevant, given unlocked multiplier ratios benefiting both CPUs. And you'd be hard-pressed to quantify the advantage of 15 MB of shared L3 cache over 12 MB. Moreover, a greater-than-$400 savings lets you buy a nice motherboard and cooler, while still getting the same four-channel memory subsystem and 40-lane PCI Express 3.0-capable controller.

Monday, September 24, 2012

NVIDIA GeForce GTX 660 2GB

Just a month ago, NVIDIA released its GeForce GTX 660 Ti, which was received with critical acclaim. Spec-wise, the card isn't too far off from the GTX 670, which costs $100 more. But at $300, the card wasn't exactly "affordable" by all standards. NVIDIA knew it had a duty to finally deliver a mainstream Kepler part as close to $200 as possible, and that's resulted in the $229 non-Ti GTX 660.
The GTX 660 is equipped with 960 cores, vs. 1344 in the Ti. That comparison alone would give us an idea of what to expect here, if NVIDIA hadn't, in its usual way, given the core clock a nice boost on the non-Ti edition. Fewer cores, but +65MHz on the clock. An interesting move, and not one that anyone will complain about.
NVIDIA GeForce GTX 660
Memory density and general architecture layout remain similar between the two cards, although while the 660 Ti is based on the GK104 chip, this non-Ti version uses GK106. Whereas a typical GPC (Graphics Processing Cluster) has two SMX units, GK106 splits one right down the middle, as the following diagram shows:
NVIDIA GeForce GTX 660 - GK106
This is an odd design, but it won't result in a bottleneck, as all SMX modules interface with their respective raster engine rather than with each other.
Though this non-Ti edition of the GTX 660 drops the core count quite significantly, its increase in core frequency negates some of the power savings we would otherwise have seen. As a result, the non-Ti is rated at 140W, vs. 150W for the Ti.

                   | Cores | Core MHz | Memory    | Mem MHz | Mem Bus | TDP
GeForce GTX 690    | 3072  | 915      | 2x 2048MB | 6008    | 256-bit | 300W
GeForce GTX 680    | 1536  | 1006     | 2048MB    | 6008    | 256-bit | 195W
GeForce GTX 670    | 1344  | 915      | 2048MB    | 6008    | 256-bit | 170W
GeForce GTX 660 Ti | 1344  | 915      | 2048MB    | 6008    | 192-bit | 150W
GeForce GTX 660    | 960   | 980      | 2048MB    | 6000    | 192-bit | 140W
GeForce GTX 650    | 384   | 1058     | 1024MB    | 5000    | 128-bit | 64W
GeForce GT 640     | 384   | 900      | 2048MB    | 5000    | 128-bit | 65W
GeForce GT 630     | 96    | 810      | 1024MB    | 3200    | 128-bit | 65W
GeForce GT 620     | 96    | 700      | 1024MB    | 1800    | 64-bit  | 49W
GeForce GT 610     | 48    | 810      | 1024MB    | 1800    | 64-bit  | 29W
 
Alongside the GTX 660, NVIDIA has also launched the GTX 650, although availability at this point is nil. Despite being briefed on both of these cards at the same time, I haven't found a single review online of the GTX 650, so it's to be assumed that NVIDIA isn't rushing that product out too fast, and if I had to guess, it's a card that exists only to finish off the sequential numbering system. Laugh all you want, but imagine the table above without the GTX 650. That'd look rather odd, wouldn't it?
That aside, for NVIDIA to call it the GTX 650 is a bit of an insult to the GTX name. In no possible way does this GPU deserve it - GTX has traditionally represented cards that could more than handle games being run with lots of detail and at current resolutions. "GTX" is certainly suitable for the 660, but how did NVIDIA deem the 650 worthy when it slots just barely in front of the $100 GT 640? Maybe next we'll see Porsche release a 4 cylinder 911 Turbo S.
Rant aside, the vendor providing us with a GTX 660 sample is GIGABYTE. Unfortunately, it's an "OC Version", which means we are unable to deliver baseline GTX 660 results (I am not keen on forcing turbo adjustments). Making matters a bit worse, the OC isn't that minor: memory remains the same, but the core gets +73MHz tacked on. An OC like this is great for consumers, but tough on reviewers who'd like to compare GPUs fairly.

With all of the launches NVIDIA's done in the past couple of months for Kepler, it was difficult to explore GTX 660 with anything more than minimal enthusiasm. However, from what we've seen throughout all of our testing, the GTX 660 is actually quite an impressive card, and possibly one of the most important to NVIDIA's entire 600 series line-up.
The reason for that boils down to the affordable price-point, and the performance it delivers. A $229 card that can handle a graphically gorgeous game like Battlefield 3 at Ultra detail at 1080p? Do I really need to explain why that's awesome?
Unfortunately, we didn't have a GPU directly comparable to this one, so there's no true apples-to-apples comparison. AMD's Radeon HD 7850 comes closest, at about $200. In that match-up, we saw the GTX 660 consistently outperform the HD 7850, with the smallest gains in the heavily AMD-favored DiRT: Showdown. Performance increases of 10-20% were not uncommon, and in the also AMD-favored SHOGUN 2, the GTX 660 averaged 50% faster.

Saturday, September 22, 2012

EVGA GTX 680 SUPERCLOCKED SIGNATURE 2

We are going to talk about a graphics card called the EVGA GeForce GTX 680 SC Signature 2.
Interestingly, the official EVGA website says that this product can only be ordered singly. Only one device for one buyer. The company seems to regard this graphics card as something special if they take such a careful approach to its distribution. So, let’s see what secrets are hidden under the name of EVGA GeForce GTX 680 SC Signature 2.

Technical Specifications

The detailed technical specifications of the new EVGA GeForce GTX 680 SC Signature 2 card are summed up in the table below, side by side with those of the reference Nvidia GeForce GTX 680 (the differences are marked in bold):
 
 

Performance

3DMark 2011

Metro 2033: The Last Refuge

Aliens vs. Predator (2010)

Total War: Shogun 2

Crysis 2

Battlefield 3

Sunday, August 12, 2012

Do Graphics Cards Need 4 GB of Memory?

Putting more memory on a graphics card is often not a real improvement but a marketing trick aimed at inexperienced users. It is most often employed with entry-level solutions, such as the GeForce GT 630, which do not need more than 1 gigabyte of onboard memory for any application they can cope with. However, this doesn't prevent manufacturers from installing additional memory, up to a fantastic 4 gigabytes, to make such cards look more attractive in the buyer's eyes.

It’s different with top-end solutions such as Nvidia’s GeForce GTX 680 and 670 and AMD’s Radeon HD 7950 and 7970. The AMD cards have 3 gigabytes of memory by default, which seems to be quite enough, but Nvidia is already criticized for equipping its Kepler-based GeForce series products with only 2 gigabytes of GDDR5 memory. Meanwhile, the GK104 Kepler GPU can actually work with either 2 or 4 GB of memory, according to its specifications, and some manufacturers have used this opportunity. One of them is EVGA whose GeForce GTX 670 4GB Superclocked+ w-Backplate is going to be reviewed in this article.

The EVGA GeForce GTX 670 Superclocked with 4 GB of memory is three times as fast as the regular GeForce GTX 670 here and twice as fast as the overclocked GeForce GTX 680. That would be perfect if its frame rate were not as low as 14-15 fps, bottoming out at 5-6 fps. Of course, there's no talk of smooth gameplay at such a low speed. Still, we can note that the doubled amount of memory does provide some benefit here.






Conclusion

Increasing the amount of memory on board GeForce GTX 670 and GTX 680 cards translates into obvious performance benefits only in specific cases, such as triple-monitor setups with 3240x1920 resolution and antialiasing enabled. Metro 2033: The Last Refuge and Sniper Elite V2 are the only games that need more than the standard 2 GB of graphics memory, but contemporary high-end graphics cards are too slow in those games anyway, even with 4 GB of video memory. In the rest of our games we could hardly see any difference between GeForce GTX 670s with 2 and 4 GB of memory at 3240x1920, and no difference at all at 2560x1440. So, purchasing a 4GB card isn't worth the investment unless you've got a triple-monitor configuration. But if you do have one, 4GB graphics cards really make sense for 2-, 3- and 4-way SLI configurations and playing contemporary games at high resolutions. by sergey lepilov

 

Monday, August 6, 2012

Nvidia GeForce GTX 660 Ti Finally Here?

A Swedish retailer posted a pre-order listing for the Asus GeForce GTX 660 Ti DirectCU II graphics card at a price of just under $400.

Based on the Kepler architecture, the 660 Ti will run 1,344 cores and integrate 2 GB of GDDR5 memory on a 192-bit memory bus (with 144.2 GB/s of bandwidth). The card is nearly identical in specs to the GTX 670 (which has a 256-bit memory interface with 192.2 GB/s of bandwidth), which also sells in the $400 neighborhood, but we expect the new card to use less power and distance itself in price. The price mentioned on the Swedish site may be a bit premature and optimistic. A $350 target for volume cards seems more realistic to us, but we'll find out soon. by wolfgang gruener
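The two bandwidth figures follow directly from the memory specs: the effective transfer rate (6008 MT/s for this GDDR5) times the bus width in bytes. A quick sanity check of the numbers above (note the article rounds the 256-bit result to 192.2 GB/s):

```python
# Peak memory bandwidth from the specs quoted above: 6008 MT/s effective
# GDDR5 on a 192-bit (660 Ti) vs 256-bit (670) bus.

def bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

print(f"GTX 660 Ti, 192-bit: {bandwidth_gbs(6008, 192):.1f} GB/s")  # ~144.2
print(f"GTX 670,    256-bit: {bandwidth_gbs(6008, 256):.1f} GB/s")  # ~192.3
```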

Thursday, August 2, 2012

Nvidia GeForce GTX 690 SLI Benchmarks

Performance

The two semi-synthetic benchmarks and the tech demo ran at their default settings in two graphics quality modes, although Unigine Heaven doesn't allow setting a resolution higher than 1920x1080 pixels. Anyway, we can still compare our graphics subsystems in these benchmarks.

3DMark Vantage


3DMark 2011

We’ve got the same picture in the newer version of 3DMark, except that the two GTX 690s in 4-way SLI mode enjoy a larger advantage over the single GTX 690.

Unigine Heaven Demo

Unigine Heaven is only capable of getting the best from our graphics subsystems at the highest settings with 8x antialiasing. The standings are the same as in the two versions of 3DMark, though.
Now let’s see what we have in real games.

S.T.A.L.K.E.R.: Call of Pripyat

There’s nothing extraordinary about these results except that the two Radeon HD 7970 GHz Edition cards are ahead of the single GTX 690.

Metro 2033: The Last Refuge

Except for the bottom speed, the two GeForce GTX 690s are just splendid here. No other graphics card or multi-GPU subsystem has ever delivered such a high frame rate in Metro 2033: The Last Refuge.

Just Cause 2

The results of this test are more predictable and easier to explain:
We can see that the single-GPU flagship products from AMD and Nvidia cannot make the game playable with comfort at 3240x1920 pixels, although Just Cause 2 was released over two years ago. The CrossFireX and SLI configurations are most appropriate here.

Aliens vs. Predator (2010)

The overall picture is like in the previous test:
One top-end graphics card wouldn’t be enough if you’ve got as many as three HD monitors. Take note of the high efficiency of the CrossFireX tandem built out of two Radeon HD 7970 GHz Edition cards. At the highest settings it is almost as fast as the two GTX 690s.

Lost Planet 2

The overclocked GeForce GTX 680 is ahead of the Radeon HD 7970 GHz Edition at 2560x1440 but the latter overtakes it when we switch to the multi-monitor configuration, even though the Lost Planet 2 engine is optimized for Nvidia's architecture. There's nothing we can add about the SLI and CrossFireX tandems except that the GTX 690s seem to be limited by the platform's performance at 2560x1440 pixels.

Sid Meier’s Civilization V

If you thought Civilization V a light application, you may want to reconsider after seeing the results of the single top-end graphics cards at the resolution of 3240x1920 pixels: 23 fps with the GTX 690 and 38 fps with the HD 7970 GHz Edition. So, that's where the SLI and CrossFireX tandems are going to come in handy. And we should also note that AMD routs its opponent here.

Total War: Shogun 2

The same goes for this test:
Nvidia is more or less competitive at the classic resolution of 2560x1600 pixels, but AMD’s solutions are unrivalled with our multi-monitor configuration at 3240x1920. Total War: Shogun 2 obviously needs more memory than 2 gigabytes installed on board the GTX 680 and 690 (per each GPU). Coupled with the 256-bit bus, the memory bandwidth slows the fast Kepler GPU down. Alas, this is not the only game in this test session where Nvidia's solutions behave like that.

Crysis 2

Like Metro 2033: The Last Refuge, this game may occasionally slow down very much, but the standings are overall the same as in most other tests. You need at least a single GeForce GTX 690 for the triple-monitor configuration or, better yet, two Radeon HD 7970 GHz Edition cards which have a much higher bottom frame rate.

Hard Reset Demo


Conclusion

Combined into a 4-way SLI configuration, two Nvidia GeForce GTX 690 cards are indeed overkill for ordinary users who play their games on a single monitor, even a high-resolution one. Well, we couldn't have any doubts about that really. The more surprising outcome of this test session is that such a tandem can indeed be helpful for a triple-monitor setup with a resolution of 3240x1920 pixels because a single GTX 690 wouldn’t cope. Such a high resolution translates into a very high load on the graphics subsystem, especially if you also enable full-screen antialiasing.
But the biggest surprise is that Nvidia isn't quite ready for that resolution with its Kepler-based graphics cards. by sergey lepilov

Tuesday, July 31, 2012

Nvidia Geforce GTX 660 Ti Specifications and Launch Date Rumored

Nvidia is set to release its new Kepler-based GTX 660 Ti, replacing the GTX 560 Ti. It was originally reported that the GTX 660 Ti would not hit the market until Q3, but now all signs point to an August 16 release. Nvidia could be using the upcoming GamesCom event as the stage to release its new card.
Specification    | GeForce GTX 660 Ti | GeForce GTX 670
Architecture     | Kepler             | Kepler
Technology       | 28 nm              | 28 nm
GPU              | GK104 (?)          | GK104
CUDA cores       | 1344               | 1344
Base frequency   | 915 MHz            | 915 MHz
Boost frequency  | 980 MHz            | 980 MHz
Memory bus       | 192-bit            | 256-bit
Amount of memory | 2 GB GDDR5         | 2 GB GDDR5
Memory frequency | 6008 MHz           | 6008 MHz
TDP              | 150 W              | 170 W

The GTX 660 Ti looks to be basically the same card as the GTX 670, except with a 192-bit memory bus and a lower TDP of 150 W. It is based on the Kepler architecture with 1344 CUDA cores (192*7), is equipped with four video outputs (two DVI, HDMI and DisplayPort), and is powered via two 6-pin PCI Express connectors. We got an early glimpse of what a stock EVGA GeForce GTX 660 Ti card looks like from a leaked image from Expreview. by doug crowthers, source: SweClockers
Image Leaked by: Expreview
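The "(192*7)" above reflects how Kepler groups its CUDA cores: 192 per SMX unit. A quick check against the core counts quoted across these posts (the SMX counts here are inferred from those numbers, not stated in the articles):

```python
# Kepler exposes 192 CUDA cores per SMX unit, so core counts are
# always multiples of 192. SMX counts below are inferred from the
# core counts quoted in these posts.

CORES_PER_SMX = 192

def cuda_cores(smx_units: int) -> int:
    return smx_units * CORES_PER_SMX

assert cuda_cores(7) == 1344  # GTX 660 Ti / GTX 670
assert cuda_cores(8) == 1536  # GTX 680
assert cuda_cores(5) == 960   # GTX 660
assert cuda_cores(4) == 768   # GTX 650 Ti
print("all SMX core counts check out")
```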


Monday, July 30, 2012

EVGA GTX 680 CLASSIFIED HANDS-ON

We have recently received several different GeForce cards covering a range of performance levels, prices, and cooler configurations. Over the next couple of weeks we’ll be taking a look at such GTX 670 and GTX 680 cards from Asus, EVGA, MSI, and Zotac. NVIDIA is going through a period of tight control over their partners’ designs, but this hasn’t stopped their partners from putting their own unique touches on their cards.
Nowhere is this embodied more than in our first card, EVGA's GeForce GTX 680 Classified. In EVGA's product hierarchy the Classified is their top-of-the-line product, where they typically go all-out to make customized products that scratch the itch of overclockers and premium buyers alike. The GTX 680 Classified, in turn, is EVGA's take on a premium GTX 680. What has EVGA seen fit to do with their fully-custom GTX 680, and does it live up to the hype and the price tag that come with the Classified name?

EVGA GeForce GTX 680 Condensed Product Lineup
                  | GTX 680 Classified | GTX 680 FTW+   | GTX 680 SC     | GTX 680
Stream Processors | 1536               | 1536           | 1536           | 1536
Texture Units     | 128                | 128            | 128            | 128
ROPs              | 32                 | 32             | 32             | 32
Core Clock        | 1111MHz            | 1084MHz        | 1058MHz        | 1006MHz
Boost Clock       | 1176MHz            | 1150MHz        | 1124MHz        | 1058MHz
Memory Clock      | 6.008GHz GDDR5     | 6.008GHz GDDR5 | 6.208GHz GDDR5 | 6.008GHz GDDR5
Memory Bus Width  | 256-bit            | 256-bit        | 256-bit        | 256-bit
Frame Buffer      | 4GB                | 4GB            | 2GB            | 2GB
Price             | $659               | $629           | $519           | $499

NVIDIA usually turns out solid reference card designs. For their high-end single-GPU cards NVIDIA typically uses balanced designs that are reasonably quiet, reasonably cool, and have some degree of overclocking potential. On the other hand NVIDIA also tends to go conservative in some ways, with NVIDIA favoring blowers so that their reference cards work in most cases, and rarely overbuilding their cards in order to keep the manufacturing cost of the card down.
This is where custom cards come in. NVIDIA's conservative reference designs lead their partners to create custom products, not only to differentiate themselves from each other but also to target specific niches that the reference design doesn't cover well. Even just replacing the cooler while keeping the reference board – what we call a semi-custom card – can have a big impact on noise and temperatures, and can improve overclocking. But at the end of the day there's only so much you can do with NVIDIA's reference boards, particularly when it comes to form factors and overclocking. This leads us to fully-custom cards.

Crysis, Metro, DiRT 3, Shogun 2, & Batman
Since the GTX 680 Classified doesn’t bring anything new to the table architecturally, we’ll keep our commentary on its stock performance brief. At stock it’s much like any other overclocked GTX 680 (factory or otherwise), with the only real room for differentiation being the greater amount of RAM and the higher power target. In practice the greater amount of RAM doesn’t make much of a difference in our single-GPU tests, as that much RAM is far more beneficial for the ultra-high resolutions of multi-monitor gaming, at which point you’re going to need a second card to provide the necessary horsepower.
The higher default power target on the other hand is quite interesting. The GTX 680 Classified will hit its top boost bin almost all of the time thanks to the generous power target, something the reference GTX 680 can have trouble with even at stock. So although reference cards can be overclocked to this level, it doesn’t necessarily mean they’ll match the GTX 680 Classified’s boost clocks in that state.
Starting off as always in Crysis, there’s actually not much to see. Since the reference GTX 680 is already memory bandwidth limited here and since the GTX 680 Classified doesn’t have a memory overclock, the factory core overclock does very little for its performance here.







Batman, on the other hand, doesn't do the GTX 680 Classified any favors, which is a bit odd. 3-5% just isn't what you'd expect here, since there's no real evidence that the game is CPU- or memory-bandwidth-bottlenecked.

FINAL THOUGHTS

Out of the box, the GTX 680 Classified is a very impressive card. EVGA's various touches – 4GB of RAM, a larger cooler, a factory overclock, and of course the additional VRM circuitry that enables a higher stock power target – all serve to make the GTX 680 Classified a clearly better card than the reference GTX 680. Furthermore, thanks in large part to EVGA's binning, there's even more overclocking headroom to play with, leading us to reach a 1211MHz core clock without ever increasing the voltage. It's a very good – if very expensive – GTX 680. by ryan smith