The NVIDIA GeForce RTX 2060 6GB Founders Edition Review: Not Quite Mainstream - AnandTech

Posted: 07 Jan 2019 06:00 AM PST

In the closing months of 2018, NVIDIA finally released the long-awaited successor to the Pascal-based GeForce GTX 10 series: the GeForce RTX 20 series of video cards. Built on their new Turing architecture, these GPUs were the biggest update to NVIDIA's GPU architecture in at least half a decade, leaving almost no part of NVIDIA's architecture untouched.

So far we’ve looked at the GeForce RTX 2080 Ti, RTX 2080, and RTX 2070 – and along with the highlights of Turing, we’ve seen that the GeForce RTX 20 series is designed on a hardware and software level to enable realtime raytracing and other new specialized features for games. While the RTX 2070 is traditionally the value-oriented enthusiast offering, NVIDIA's higher price tags this time around meant that even this part was $500 and not especially value-oriented. Instead, it would seem that the role of the enthusiast value offering is going to fall to the next member in line of the GeForce RTX 20 family. And that part is coming next week.

Launching next Tuesday, January 15th is the 4th member of the GeForce RTX family: the GeForce RTX 2060 (6GB). Based on a cut-down version of the same TU106 GPU that's in the RTX 2070, this new part shaves off some of RTX 2070's performance, but also a good deal of its price tag in the process. And for this launch, like the other RTX cards last year, NVIDIA is taking part by releasing their own GeForce RTX 2060 Founders Edition card, which we are taking a look at today.

NVIDIA GeForce Specification Comparison

                         RTX 2060 FE       GTX 1060 6GB      GTX 1070          RTX 2070
                                           (GDDR5)           (GDDR5)
CUDA Cores               1920              1280              1920              2304
ROPs                     48                48                64                64
Core Clock               1365MHz           1506MHz           1506MHz           1410MHz
Boost Clock              1680MHz           1709MHz           1683MHz           1620MHz (FE: 1710MHz)
Memory Clock             14Gbps GDDR6      8Gbps GDDR5       8Gbps GDDR5       14Gbps GDDR6
Memory Bus Width         192-bit           192-bit           256-bit           256-bit
VRAM                     6GB               6GB               8GB               8GB
Single Precision Perf.   6.5 TFLOPS        4.4 TFLOPS        6.5 TFLOPS        7.5 TFLOPS (FE: 7.9 TFLOPS)
"RTX-OPS"                37T               N/A               N/A               45T
SLI Support              No                No                Yes               No
TDP                      160W              120W              150W              175W (FE: 185W)
GPU                      TU106             GP106             GP104             TU106
Transistor Count         10.8B             4.4B              7.2B              10.8B
Architecture             Turing            Pascal            Pascal            Turing
Manufacturing Process    TSMC 12nm "FFN"   TSMC 16nm         TSMC 16nm         TSMC 12nm "FFN"
Launch Date              1/15/2019         7/19/2016         6/10/2016         10/17/2018
Launch Price             $349              MSRP: $249        MSRP: $379        MSRP: $499
                                           FE: $299          FE: $449          FE: $599

Like its older siblings, the GeForce RTX 2060 (6GB) comes in at a higher price-point relative to previous generations, and at $349 the cost is quite unlike the GeForce GTX 1060 6GB’s $299 Founders Edition and $249 MSRP split, let alone the GeForce GTX 960’s $199. At the same time, it still features Turing RT cores and tensor cores, bringing a new entry point for those interested in utilizing GeForce RTX platform features such as realtime raytracing.

Diving into the specs and numbers, the GeForce RTX 2060 sports 1920 CUDA cores, meaning we’re looking at a 30 SM configuration, versus RTX 2070’s 36 SMs. As the core architecture of Turing is designed to scale with the number of SMs, this means that all of the core compute features are being scaled down similarly, so the 17% drop in SMs means a 17% drop in the RT Core count, a 17% drop in the tensor core count, a 17% drop in the texture unit count, a 17% drop in L0/L1 caches, etc.

Unsurprisingly, clockspeeds are going to be very close to NVIDIA’s other TU106 card, RTX 2070. The base clockspeed is down a bit to 1365MHz, but the boost clock is up a bit to 1680MHz. So on the whole, RTX 2060 is poised to deliver around 87% of the RTX 2070’s compute/RT/texture performance, which is an uncharacteristically small gap between a xx70 card and an xx60 card. In other words, the RTX 2060 is in a good position to punch above its weight in compute/shading performance.
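As a sanity check, the peak FP32 figures in the specification table follow directly from core count and boost clock, at 2 FMA ops per core per clock. A quick sketch with the table's numbers:

```python
# Peak single-precision throughput: CUDA cores x 2 ops (FMA) x boost clock.
def fp32_tflops(cuda_cores, boost_clock_ghz):
    return cuda_cores * 2 * boost_clock_ghz / 1000.0  # in TFLOPS

rtx_2060 = fp32_tflops(1920, 1.680)  # ~6.45 TFLOPS, rounds to the quoted 6.5
rtx_2070 = fp32_tflops(2304, 1.620)  # ~7.46 TFLOPS, rounds to the quoted 7.5

print(f"RTX 2060: {rtx_2060:.2f} TFLOPS")
print(f"RTX 2070: {rtx_2070:.2f} TFLOPS")
print(f"Ratio: {rtx_2060 / rtx_2070:.0%}")  # ~86%, the roughly-87% gap noted above
```

The slightly higher boost clock partially offsets the 17% SM deficit, which is how the gap ends up closer to 13-14% than 17%.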

However, TU106 has taken a bigger trim on the backend, and in workloads that aren't pure compute, the drop will hit a bit harder. The card ships with just 6GB of GDDR6 VRAM, as opposed to 8GB on its bigger brother, because NVIDIA is not populating 2 of TU106's 8 memory controllers. The resulting 192-bit memory bus means that with the same 14Gbps GDDR6, the RTX 2060 offers only 75% of the memory bandwidth of the RTX 2070. Or to put this in numbers, the RTX 2060 will offer 336GB/sec of bandwidth to the RTX 2070's 448GB/sec.

And since the memory controllers, ROPs, and L2 cache are all tied together very closely in NVIDIA’s architecture, this means that ROP throughput and the amount of L2 cache are also being shaved by 25%. So for graphics workloads the practical performance drop is going to be greater than the 13% mark for compute throughput, but also generally less than the 25% mark for ROP/memory throughput.
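The bandwidth figures work the same way: per-pin data rate times bus width. A quick sketch with the table's numbers:

```python
# Memory bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8  # in GB/sec

rtx_2060 = mem_bandwidth_gbs(14, 192)  # 336 GB/sec
rtx_2070 = mem_bandwidth_gbs(14, 256)  # 448 GB/sec
print(rtx_2060 / rtx_2070)  # 0.75 -> the 25% backend cut discussed above
```

Since ROPs and L2 scale with the memory controllers, the same 0.75 ratio bounds ROP throughput and cache capacity, which is where the 13%-to-25% range for the practical performance drop comes from.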

Speaking of video memory, NVIDIA has simply called this card the RTX 2060, but early indications are that there will be other RTX 2060 configurations with less VRAM and possibly fewer CUDA cores and other hardware resources. Hence it seems forward-looking to refer to the product in this article as the RTX 2060 (6GB). As you might recall, the GTX 1060 6GB launched as just the 'GTX 1060' and appeared as such in our launch review, until the 'GTX 1060 3GB' arrived a month later carrying a name that indicated only its smaller framebuffer and not its cut-down GPU configuration. Combined with the ongoing GTX 1060 naming shenanigans, the GTX 1050 variants, and AMD's own Polaris naming shenanigans, it seems prudent to make this clarification now in the interest of future accuracy and consumer awareness.

NVIDIA GTX 1060 Variants Specification Comparison

                   GTX 1060 6GB   GTX 1060 6GB   GTX 1060 6GB   GTX 1060 5GB   GTX 1060 3GB
                                  (9 Gbps)       (GDDR5X)       (Regional)
CUDA Cores         1280           1280           1280           1280           1152
Texture Units      80             80             80             80             72
ROPs               48             48             48             40             48
Core Clock         1506MHz        1506MHz        1506MHz        1506MHz        1506MHz
Boost Clock        1708MHz        1708MHz        1708MHz        1708MHz        1708MHz
Memory Clock       8Gbps GDDR5    9Gbps GDDR5    8Gbps GDDR5X   8Gbps GDDR5    8Gbps GDDR5
Memory Bus Width   192-bit        192-bit        192-bit        160-bit        192-bit
VRAM               6GB            6GB            6GB            5GB            3GB
TDP                120W           120W           120W           120W           120W
GPU                GP106          GP106          GP104*         GP106          GP106
Launch Date        7/19/2016      Q2 2017        Q3 2018        Q3 2018        8/18/2016

Moving on, NVIDIA is rating the RTX 2060 for a TDP of 160W. This is down from the RTX 2070, but only slightly, as those cards are rated for 175W. Cut-down GPUs have limited options for reducing their power consumption, so it’s not unusual to see a card like this rated to draw almost as much power as its full-fledged counterpart.

All-in-all, the GeForce RTX 2060 (6GB) is quite the interesting card, as the value-enthusiast segment tends to be more attuned to price and power consumption than the performance-enthusiast segment. Additionally, as a value-enthusiast card and potential upgrade option it will also need to perform well on a wide range of older and newer games – in other words, traditional rasterization performance rather than hybrid rendering performance.

Meanwhile, in evaluating the RTX 2060 itself, measuring generalizable hybrid rendering performance remains an open question. DXR support arrived fairly recently with the Windows 10 October 2018 Update (1809). 3DMark's DXR benchmark, Port Royal, is due on January 8th, while Battlefield V remains the sole title with realtime raytracing for the moment, with optimization efforts ongoing as seen in recent driver releases. Meanwhile, it seems that some of Turing's other advanced shader features (Variable Rate Shading) are currently only available in Wolfenstein II.

Of course, RTX support for a number of titles has been announced and many are due this year, but there is no centralized resource for keeping track of availability. It's true that developers are ultimately responsible for this information and their games, but on the flipside, this has required very close cooperation between NVIDIA and developers for quite some time. In the end, RTX is a technology platform spearheaded by NVIDIA and inextricably linked to their hardware, so leaving potential RTX 20 series owners to research and collate which current games can make use of the specialized hardware features they purchased works to their detriment.

Planned NVIDIA Turing Feature Support for Games

Game                            Real Time Raytracing   Deep Learning Supersampling (DLSS)        Turing Advanced Shading
Anthem                          -                      Yes                                       -
Ark: Survival Evolved           -                      Yes                                       -
Assetto Corsa Competizione      Yes                    -                                         -
Atomic Heart                    Yes                    Yes                                       -
Battlefield V                   Yes (available)        Yes                                       -
Control                         Yes                    -                                         -
Dauntless                       -                      Yes                                       -
Darksiders III                  -                      Yes                                       -
Deliver Us The Moon: Fortuna    -                      Yes                                       -
Enlisted                        Yes                    -                                         -
Fear The Wolves                 -                      Yes                                       -
Final Fantasy XV                -                      Yes (available in standalone benchmark)   -
Fractured Lands                 -                      Yes                                       -
Hellblade: Senua's Sacrifice    -                      Yes                                       -
Hitman 2                        -                      Yes                                       -
In Death                        -                      -                                         Yes
Islands of Nyne                 -                      Yes                                       -
Justice                         Yes                    Yes                                       -
JX3                             Yes                    Yes                                       -
KINETIK                         -                      Yes                                       -
MechWarrior 5: Mercenaries      Yes                    Yes                                       -
Metro Exodus                    Yes                    -                                         -
Outpost Zero                    -                      Yes                                       -
Overkill's The Walking Dead     -                      Yes                                       -
PlayerUnknown's Battlegrounds   -                      Yes                                       -
ProjectDH                       Yes                    -                                         -
Remnant: From the Ashes         -                      Yes                                       -
SCUM                            -                      Yes                                       -
Serious Sam 4: Planet Badass    -                      Yes                                       -
Shadow of the Tomb Raider       Yes                    -                                         -
Stormdivers                     -                      Yes                                       -
The Forge Arena                 -                      Yes                                       -
We Happy Few                    -                      Yes                                       -
Wolfenstein II                  -                      -                                         Yes, Variable Shading (available)

So the RTX 2060 (6GB) is in a better situation than the RTX 2070. With comparative GTX 10 series products either very low on stock (GTX 1080, GTX 1070) or at higher prices (GTX 1070 Ti), there’s less potential for sales cannibalization. And as Ryan mentioned in the AnandTech 2018 retrospective on GPUs, with leftover Pascal inventory due to the cryptocurrency bubble, there’s much less pressure to sell Turing GPUs at lower prices. So the RTX 2060 leaves the existing GTX 1060 6GB (1280 cores) and 3GB (1152 cores) with breathing room. That being said, $350 is far from the usual ‘mainstream’ price-point, and even more expensive than the popular $329 enthusiast-class GTX 970.

Across the aisle, the recent Radeon RX 590 is in the mix, though its direct competition is the GTX 1060 6GB; the Radeon RX Vega 56 is the closer matchup for the RTX 2060. Also in the mix are concurrent events like AMD partner Sapphire's just-announced RX Vega price cuts, which will see the RX Vega 64 Nitro Plus moved to $379 and the RX Vega 56 Pulse to $329, both with an attached 3-game bundle. While not explicitly connected to the RTX 2060 (6GB) launch, the implications from the AMD side of the fence are clear.

Unfortunately we've not had the card in for testing as long as we would've liked, but regardless the RTX platform performance testing is in the same situation as during the RTX 2070 launch. Because the technology is still in the early days, we can’t accurately determine the performance suitability of RTX 2060 (6GB) as an entry point for the RTX platform. So the same caveats apply to gamers considering making the plunge.

Q1 2019 GPU Pricing Comparison

AMD                    Price       NVIDIA
Radeon RX Vega 56      $499        GeForce RTX 2070
                       $449        GeForce GTX 1070 Ti
                       $349        GeForce RTX 2060 (6GB)
                       $335        GeForce GTX 1070
Radeon RX 590          $279
                       $249        GeForce GTX 1060 6GB (1280 cores)
Radeon RX 580 (8GB)    $200/$209   GeForce GTX 1060 3GB (1152 cores)

As for the card itself, we've already seen the scheme with the RTX 2080 Ti, RTX 2080, and RTX 2070 Founders Editions, the main highlight being the new open air cooler design. This time around, the RTX 2060 Founders Edition has stock reference clocks and presumably stock TDP.

Like the RTX 2070 Founders Edition, the RTX 2060 Founders Edition has a single 8-pin power connector at the front of the card, and it lacks the NVLink SLI connectors, as only the RTX 2080 and above support SLI. Internally, the board appears very similar to the RTX 2070 Founders Edition's. Like the other RTX 20 cards, the RTX 2060 continues the trend of increased TDPs, standing at 160W compared to the 120W of the GTX 1060 6GB. I/O-wise it is the same story, including the DVI port customary for mid-range and mainstream cards, which are often paired with budget DVI monitors, particularly as a drop-in upgrade for an aging video card.

This is in addition to the VR-centric USB-C VirtualLink port, which carries an associated 30W that is not included in the overall TDP.

As mentioned in the other RTX 20 series launch articles, the reference design change poses a potential issue to OEMs, as unlike blowers, open air designs cannot guarantee self-cooling independent of chassis airflow. As a higher-volume and nominally mainstream part, the RTX 2060 Founders Edition would be the more traditional part found in OEM systems.

Keeping in mind that the GeForce RTX 2080 Ti, 2080, and 2070 Founders Editions featured non-reference clockspeeds and TDPs, we've opted to use reference results in order to keep comparisons apples-to-apples across vendors and generations. The RTX 2060 (6GB) Founders Edition's specifications are already reference, so emulating a reference card was unnecessary here.

CPU: Intel Core i7-7820X @ 4.3GHz
Motherboard: Gigabyte X299 AORUS Gaming 7 (F9g)
Power Supply: EVGA 1000 G3
Hard Disk: OCZ Toshiba RD400 (1TB)
Memory: G.Skill TridentZ DDR4-3200 4 x 8GB (16-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: LG 27UD68P-B
Video Cards: AMD Radeon RX Vega 64 (Air Cooled)
AMD Radeon RX Vega 56
AMD Radeon RX 590
AMD Radeon RX 580
AMD Radeon R9 390
NVIDIA GeForce RTX 2080
NVIDIA GeForce RTX 2070
NVIDIA GeForce RTX 2060 (6GB) Founders Edition
NVIDIA GeForce GTX 1080 Founders Edition
NVIDIA GeForce GTX 1070 Ti Founders Edition
NVIDIA GeForce GTX 1070 Founders Edition
NVIDIA GeForce GTX 1060 6GB Founders Edition
Video Drivers: NVIDIA Release 417.54 (Press)
AMD Radeon Software Adrenalin 2019 Edition 18.12.3
OS: Windows 10 Pro (1803)
Spectre/Meltdown Mitigations Yes (both)

Battlefield 1 returns from the 2017 benchmark suite with a bang, as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. The next Battlefield game from DICE, Battlefield V, completes the nostalgia circuit with a return to World War 2, but more importantly for us, it is one of the flagship titles for GeForce RTX realtime raytracing, although at this time its realtime raytracing isn't ready to be used as a generalizable benchmark.

We use the Ultra preset with no alterations. As these benchmarks are from the single player mode, our rule of thumb for multiplayer performance still applies: multiplayer framerates generally dip to around half of our single player framerates. Battlefield 1 also supports HDR (HDR10, Dolby Vision).

Battlefield 1 - 3840x2160 - Ultra QualityBattlefield 1 - 2560x1440 - Ultra QualityBattlefield 1 - 1920x1080 - Ultra Quality

Battlefield 1 has made the rounds for some time, and after the optimizations over the years both manufacturers generally enjoy solid performance across the board. The RTX 2060 (6GB) is no exception and fares well, splitting the difference between the GTX 1070 Ti and GTX 1080. This also means it opens a lead on the RX Vega 56.

Battlefield 1 - 99th Percentile - 3840x2160 - Ultra QualityBattlefield 1 - 99th Percentile - 2560x1440 - Ultra QualityBattlefield 1 - 99th Percentile - 1920x1080 - Ultra Quality

The latest title in Ubisoft's Far Cry series lands us right into the unwelcoming arms of an armed militant cult in Montana, one of the many middles-of-nowhere in the United States. With a charismatic and enigmatic adversary, gorgeous landscapes of the northwestern American flavor, and lots of violence, it is classic Far Cry fare. Graphically intensive in an open-world environment, the game mixes in action and exploration.

Far Cry 5 does support Vega-centric features with Rapid Packed Math and Shader Intrinsics. Far Cry 5 also supports HDR (HDR10, scRGB, and FreeSync 2). This testing was done without HD Textures enabled, a new option that was recently patched in.

Far Cry 5 - 3840x2160 - Ultra QualityFar Cry 5 - 2560x1440 - Ultra QualityFar Cry 5 - 1920x1080 - Ultra Quality

Once again, the RTX 2060 (6GB) slots in neatly between the GTX 1070 Ti and GTX 1080, useful given that the GTX 1070 Ti only had a slim lead on the RX Vega 56.

As for the high resolution texture pack, Far Cry 5 released a free 29GB patch adding toggleable HD Textures, and it's something we'll want to look into as we investigate VRAM limitations. Generally, high-resolution texture packs are a simple way of increasing visual fidelity without significantly hurting framerates, provided the card has enough framebuffer.

A veteran from both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games tailoring and designing the Nitrous Engine around such low-level APIs. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support, Ashes provides a good common ground between the forward-looking APIs of today. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; by offering such a tool publicly and as part-and-parcel of the game, it's an example that other developers should take note of.

Settings and methodology remain identical to its usage in the 2016 GPU suite. To note, we are utilizing the original Ashes Extreme graphical preset, which compares to the current Extreme preset with MSAA dialed down from 4x to 2x, as well as an adjusted Texture Rank (MipsToRemove in settings.ini).

Ashes of the Singularity: Escalation - 3840x2160 - Extreme QualityAshes of the Singularity: Escalation - 2560x1440 - Extreme QualityAshes of the Singularity: Escalation - 1920x1080 - Extreme Quality

Somewhat surprisingly, the RTX 2060 (6GB) performs poorly in Ashes, landing closer to the GTX 1070 than the GTX 1070 Ti. Although it is still ahead of the RX Vega 56, it's not an ideal situation, with the lead over the GTX 1060 6GB cut to around 40%.

Ashes: Escalation - 99th Percentile - 3840x2160 - Extreme QualityAshes: Escalation - 99th Percentile - 2560x1440 - Extreme QualityAshes: Escalation - 99th Percentile - 1920x1080 - Extreme Quality

 

id Software is popularly known for a few games involving shooting stuff until it dies, just with different 'stuff' for each one: Nazis, demons, or other players while scorning the laws of physics. Wolfenstein II is the latest of the first variety, the sequel in a modern reboot series developed by MachineGames and built on id Tech 6. While the tone is significantly less pulpy nowadays, the game is still a frenetic FPS at heart, succeeding DOOM as a modern Vulkan flagship title and arriving as a pure Vulkan implementation rather than the originally OpenGL-based DOOM.

Featuring a Nazi-occupied America of 1961, Wolfenstein II is lushly designed yet not oppressively intensive on the hardware, something that goes well with the pace of its action, which erupts suddenly from a level design flush with alternate-history details.

The highest quality preset, "Mein leben!", was used. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the preset, neither GPU Culling nor Deferred Rendering was enabled. NVIDIA Adaptive Shading was not enabled.

In summary, Wolfenstein II tends to scale well, hits high framerates with minimal CPU bottlenecking, enjoys running on modern GPU architectures, and consumes VRAM like nothing else. For the Turing-based RTX 2060 (6GB), this results in outpacing the GTX 1080 as well as the RX Vega 56 at 1080p/1440p. The 4K results can be deceiving; looking closer at 99th percentile framerates shows a much steeper dropoff, more likely than not related to the limitations of the 6GB framebuffer. We've already seen the GTX 980 and 970 struggle at even 1080p, chained by 4GB of video memory.

Upon arriving on PC in early 2018, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console, the fruits of Square Enix's successful partnership with NVIDIA, with hardly any hint of the troubles of Final Fantasy XV's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark that they have since updated. Using the Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the standalone benchmark received criticism for performance issues and general bugginess, as well as for confusing graphical presets and performance measurement by 'score'. In its original iteration, the graphical settings could not be adjusted, leaving the user to presets tied to resolution and to hidden settings such as GameWorks features.

Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes for better accuracy in profiling in-game performance and graphical options, though leaving the 'score' measurement. For our testing, we enable or adjust settings to the highest except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard. Final Fantasy XV also supports HDR, and it will support DLSS at some later date.

Final Fantasy XV - 3840x2160 - Ultra QualityFinal Fantasy XV - 2560x1440 - Ultra QualityFinal Fantasy XV - 1920x1080 - Ultra Quality

At 1080p and 1440p, the RTX 2060 (6GB) returns to its place between the GTX 1080 and GTX 1070 Ti. Final Fantasy is less favorable to the Vega cards so the RTX 2060 (6GB) is already faster than the RX Vega 64. With the relative drop in 4K performance, there are more hints of 6GB being potentially insufficient.

Final Fantasy XV - 99th Percentile - 3840x2160 - Ultra QualityFinal Fantasy XV - 99th Percentile - 2560x1440 - Ultra QualityFinal Fantasy XV - 99th Percentile - 1920x1080 - Ultra Quality

 

Now a truly venerable title, GTA V is a veteran of past game suites that remains as graphically demanding as they come. As an older DX11 title, it provides a glimpse into the graphically intensive games of yesteryear that don't incorporate the latest features. Originally released for consoles in 2013, the PC port came with a slew of graphical enhancements and options. Just as importantly, GTA V includes a rather intensive and informative built-in benchmark, somewhat uncommon in open-world games.

The settings are identical to its previous appearances, and are custom as GTA V does not have presets. To recap, a "Very High" quality configuration is used, with all primary graphics settings turned up to their highest setting except grass, which is at its own very high setting. Meanwhile, 4x MSAA is enabled for direct views and reflections. This configuration also involves turning on some of the advanced rendering features - the game's long shadows, high resolution shadows, and high definition flight streaming - but not increasing the view distance any further.

Grand Theft Auto V - 3840x2160 - Very High Quality

Grand Theft Auto V - 2560x1440 - Very High QualityGrand Theft Auto V - 1920x1080 - Very High Quality

Grand Theft Auto V - 99th Percentile - 3840x2160 - Very High Quality

Grand Theft Auto V - 99th Percentile - 2560x1440 - Very High QualityGrand Theft Auto V - 99th Percentile - 1920x1080 - Very High Quality

 

Next up is Middle-earth: Shadow of War, the sequel to Shadow of Mordor. Developed by Monolith, whose last hit was arguably F.E.A.R., Shadow of Mordor returned the studio to the spotlight with the Nemesis System, an innovative NPC rival generation and interaction system, along with a storyline based on J.R.R. Tolkien's legendarium, all running on a heavily modified version of the engine that originally powered F.E.A.R. in 2005.

Using the new LithTech Firebird engine, Shadow of War improves on the detail and complexity, and with free add-on high resolution texture packs, offers itself as a good example of getting the most graphics out of an engine that may not be bleeding edge. Shadow of War also supports HDR (HDR10).

Shadow of War - 3840x2160 - Ultra QualityShadow of War - 2560x1440 - Ultra QualityShadow of War - 1920x1080 - Ultra Quality

Succeeding F1 2016 is F1 2018, Codemasters' latest iteration of their official Formula One racing games. It features a slimmed down version of Codemasters' traditional built-in benchmarking tools and scripts, something that is surprisingly absent in DiRT 4.

Aside from keeping up-to-date on the Formula One world, F1 2017 added HDR support, which F1 2018 has maintained; otherwise, we should see any newer versions of Codemasters' EGO engine find their way into the F1 games first. Graphically demanding in its own right, F1 2018 keeps a useful racing-type graphics workload in our benchmarks.

F1 2018 - 3840x2160 - Ultra QualityF1 2018 - 2560x1440 - Ultra QualityF1 2018 - 1920x1080 - Ultra Quality

Last in our 2018 game suite is Total War: Warhammer II, built on the same engine as Total War: Warhammer. While there is a more recent Total War title, Total War Saga: Thrones of Britannia, that game was built on the 32-bit version of the engine. The first TW: Warhammer was a DX11 game that was to some extent developed with DX12 in mind, with preview builds showcasing DX12 performance. In Warhammer II, however, the matter appears to have been dropped, with DX12 mode still marked as beta and featuring performance regressions for both vendors.

It's unfortunate because Creative Assembly themselves have acknowledged the CPU-bound nature of their games, and with re-use of game engines as spin-offs, DX12 optimization would have continued to provide benefits, especially if the future of graphics in RTS-type games will lean towards low-level APIs.

There are now three benchmarks with varying graphics and processor loads; we've opted for the Battle benchmark, which appears to be the most graphics-bound.

Total War: Warhammer II - 3840x2160 - Ultra QualityTotal War: Warhammer II - 2560x1440 - Ultra QualityTotal War: Warhammer II - 1920x1080- Ultra Quality

Shifting gears, we'll look at the compute and synthetic aspects of the RTX 2060 (6GB). As a cut-down configuration of the TU106 GPU found in the RTX 2070, we should expect a similar progression in results.

Starting off with GEMM tests, the RTX 2060's tensor cores are pulled into action with half-precision matrix multiplication, though using binaries originally compiled for Volta. Because Turing is backwards compatible and in the same compute capability family as Volta (sm_75 compared to Volta's sm_70), the benchmark continues to work out-of-the-box, though without any particular Turing optimizations.

Compute: General Matrix Multiply Half Precision (HGEMM)

Compute: General Matrix Multiply Single Precision (SGEMM)

For Turing-based GeForce, FP32 accumulation on tensors is capped at half-speed, thus resulting in the observed halved performance. Aside from product segmentation, that higher-precision mode is primarily for deep learning training purposes, something that GeForce cards wouldn't be doing in games or consumer tasks.

Moving on, we have CompuBench 2.0, the latest iteration of Kishonti's GPU compute benchmark suite. It offers a wide array of different practical compute workloads, and we've decided to focus on level set segmentation, optical flow modeling, and N-Body physics simulations.

Compute: CompuBench 2.0 - Level Set Segmentation 256Compute: CompuBench 2.0 - N-Body Simulation 1024KCompute: CompuBench 2.0 - Optical Flow

Moving on, we'll also look at single precision floating point performance with FAHBench, the official Folding @ Home benchmark. Folding @ Home is the popular Stanford-backed research and distributed computing initiative that has work distributed to millions of volunteer computers over the internet, each of which is responsible for a tiny slice of a protein folding simulation. FAHBench can test both single precision and double precision floating point performance, with single precision being the most useful metric for most consumer cards due to their low double precision performance.

Compute: Folding @ Home Single Precision

Next is Geekbench 4's GPU compute suite. A multi-faceted test suite, Geekbench 4 runs seven different GPU sub-tests, ranging from face detection to FFTs, and then averages out their scores via their geometric mean. As a result Geekbench 4 isn't testing any one workload, but rather is an average of many different basic workloads.
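To illustrate why a geometric mean is used for the composite (the sub-tests and weighting are Geekbench's own; the scores below are made-up numbers), it rewards balanced results rather than letting one outlier sub-test carry the total:

```python
import math

def geometric_mean(scores):
    # n-th root of the product; equivalently exp(mean(log(scores)))
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

balanced = [100, 100, 100, 100]
spiky    = [400, 50, 50, 50]   # higher arithmetic mean (137.5), weaker overall

print(geometric_mean(balanced))  # 100.0
print(geometric_mean(spiky))     # ~84.1: one strong sub-test can't carry a weak set
```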

Compute: Geekbench 4 - GPU Compute - Total Score

We'll also take a quick look at tessellation performance.

Synthetic: TessMark, Image Set 4, 64x Tessellation

Finally, for looking at texel and pixel fillrates, we have the Beyond3D Test Suite. This suite offers a slew of additional tests - many of which we use behind the scenes or in our earlier architectural analysis - but for now we'll stick to simple pixel and texel fillrates.
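For reference, the theoretical peaks these tests measure against come straight from unit counts and clocks. A quick sketch; note the 48 ROPs and 1680MHz boost clock are from the specification table, while the texture unit count of 120 is an assumption inferred from 30 SMs at 4 texture units per SM (not listed in the table), and measured Beyond3D results will differ from these ceilings:

```python
# Theoretical fillrates at the 1680MHz boost clock.
boost_ghz = 1.680
rops = 48            # from the specification table
texture_units = 120  # assumed: 30 SMs x 4 texture units/SM on Turing

pixel_fill_gpix = rops * boost_ghz           # ~80.6 Gpixels/sec
texel_fill_gtex = texture_units * boost_ghz  # ~201.6 Gtexels/sec
print(pixel_fill_gpix, texel_fill_gtex)
```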

Synthetic: Beyond3D Suite - Pixel FillrateSynthetic: Beyond3D Suite - Integer Texture Fillrate (INT8)

 

As always, we'll take a look at the power, temperature, and noise of the RTX 2060 Founders Edition, though most of the highlights and trends are ones we've seen before with the RTX 2080 Ti, RTX 2080, and RTX 2070 Founders Edition launches. For the most part, the dual axial fan open air design provides straightforward benefits in lower noise and better cooling, which counterbalances the atypically large GPUs and new fixed-function hardware.

As this is a new GPU, we will quickly review the GeForce RTX 2060's stock voltages and clockspeeds as well.

NVIDIA GeForce Video Card Voltages

Model                                      Boost    Idle
GeForce RTX 2060 (6GB) Founders Edition    1.050v   0.725v
GeForce RTX 2070 Founders Edition          1.050v   0.718v
GeForce GTX 1060 6GB Founders Edition      1.062v   0.625v

The voltages are broadly comparable to those of the preceding 16nm GTX 1070. In comparison to pre-FinFET generations, these voltages are exceptionally low because of the FinFET process used, something we went over in detail in our GTX 1080 and 1070 Founders Edition review. As we said then, the 16nm FinFET process requires such low voltages as opposed to previous planar nodes, which can be limiting in scenarios where a lot of power and voltage are needed, i.e. high clockspeeds and overclocking. Of course, Turing (along with Volta, Xavier, and NVSwitch) is built on 12nm "FFN" rather than 16nm, but there is little detail on the exact process tweaks.

Power Consumption

The TDP increase to 160W brings the RTX 2060 (6GB) in between the 180W GTX 1080/1070 Ti and 150W GTX 1070. In turn, load consumption is more-or-less on that level, and nothing dissimilar to what we've seen. This also means that efficiency is around the same relative to performance, as opposed to the RTX 2070, 2080, and 2080 Ti.

Idle Power ConsumptionLoad Power Consumption - Battlefield 1Load Power Consumption - FurMark

 

 

Temperature & Noise

With an open air cooler design with dual axial fans, the results are in line with what we've seen with the other RTX Founders Editions.

Idle GPU TemperatureLoad GPU Temperature - Battlefield 1Load GPU Temperature - FurMark

Idle Noise LevelsLoad Noise Levels - Battlefield 1Load Noise Levels - FurMark

As we bring this to a close, we are again revisiting the central themes of the GeForce RTX 20 series across the launches: forward-looking featuresets that are not widely available, premium pricing based on those hardware features, and competition with existing Pascal products due to comparable conventional gaming performance. This time, however, the last two have played out a little differently. Pascal-based GTX 1080, 1070 Ti, and 1070s are not so readily available and/or are at higher prices. And although the price premium pushes the GeForce RTX 2060 (6GB) out of the traditional mainstream home of the x60 part, it puts it firmly in contention against the Radeon RX Vega cards and to a lesser extent the recently-launched Radeon RX 590.

As a whole, in developing Turing and the GeForce RTX 20 series, NVIDIA has invested heavily in hybrid rendering, offering less of a price-to-performance improvement for conventional gaming than is usual for a new GPU architecture. This has been compounded by excess Pascal inventory, a result of the cryptocurrency mining demand of the past year or two. The RTX 2060 (6GB) is no exception, and while it is a better price-to-performance offering than its older siblings, it’s simply no longer a ‘mainstream’ video card at $350, instead occupying the ‘value-enthusiast’ space.

For conventional gaming, if the RTX 2080 is akin to the GTX 1080 Ti and the RTX 2070 to the GTX 1080, then the RTX 2060 (6GB) truly performs like the GTX 1070 Ti. By the numbers, the RTX 2060 (6GB) is 2-3% faster than the GTX 1070 Ti at 1440p and 1080p, though the comparison becomes a wash at 4K. In turn, reference-to-reference, the RTX 2060 (6GB) is around 11% faster than the RX Vega 56 at 1440p/1080p, narrowing to 8% at 4K. There are hints that the 6GB framebuffer might be limiting, particularly the unexpectedly low 99th percentile framerates in Wolfenstein II at 4K, though nothing to the extent that the older 4GB GTX 900 series cards have experienced.

Potential VRAM bottlenecks are something that needs further investigation, but more to the point, this is a $350 card featuring only 6GB of VRAM. Now it is admittedly performing 14-15% ahead of the 8GB GTX 1070, a card that at MSRP was a relatively close $379, but this also means that NVIDIA has essentially regressed in VRAM capacity at this price point. In terms of the larger RTX lineup, 6GB is a somewhat more reasonable progression compared to the 8GB of the RTX 2070 and RTX 2080, but it is something to revisit if there are indeed lower-memory cut-down variants of the RTX 2060 on the way, or if games continue the historical path of always needing more framebuffer space. The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.

Generationally, the RTX 2060 (6GB) does bring more to the table, offering roughly 86% of the performance of the RTX 2070 for 70% of the price. Against its direct predecessor, the GTX 1060 6GB, it’s faster by around 59%. In context, the GTX 1060 6GB was 80-85% faster than the GTX 960 (2GB) at launch, whereas presently that gap is more along the lines of 2X or more, with the increased framebuffer as the primary driver. But at $200, the GTX 960 was a true mainstream card, as was the GTX 1060 6GB at its $249 MSRP, despite the $299 Founders Edition pricing.
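To put those generational numbers in rough perspective, here's a quick back-of-the-envelope sketch of normalized performance per dollar, using only the relative performance figures and launch MSRPs quoted above; actual results vary by game and resolution, so treat these as illustrative ratios rather than benchmark data:

```python
# Back-of-the-envelope performance-per-dollar comparison, using the relative
# performance figures and launch MSRPs cited in this review. Performance is
# normalized to the RTX 2070 = 1.00.
cards = {
    # name: (relative performance, launch price in USD)
    "RTX 2070":     (1.00, 499),
    "RTX 2060 6GB": (0.86, 349),         # ~86% of RTX 2070 performance
    "GTX 1060 6GB": (0.86 / 1.59, 249),  # RTX 2060 is ~59% faster than it
}

for name, (perf, price) in cards.items():
    # Higher is better: normalized performance per dollar, scaled by 1000
    # purely for readability.
    print(f"{name:>12}: {perf / price * 1000:.2f}")
```

By this rough measure the RTX 2060 (6GB) comes out ahead of both the RTX 2070 and the GTX 1060 6GB, which matches the framing here: a better value than its older RTX siblings, though not the kind of leap past prior mainstream cards that earlier generations delivered.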

What makes the $350 price at least somewhat more reasonable is its Radeon competition. Against RX Vega at its current prices the RTX 2060 (6GB) is near-lethal, but the price cuts that we hear are taking effect today should keep the cards closer together. Reference-to-reference, the RTX 2060 (6GB) already delivers around 95% of RX Vega 64 performance, so card pricing will make all the difference. The same goes for the RX 590, whose position in the ‘performance gap’ between the RX Vega 56 and RX 580 is now shared. And alongside any price changes, there are still the value-adds of game bundles and FreeSync compatibility.

At least, that would have been the straightforward case for AMD if not for yesterday’s announcement of game bundles for RTX cards, as well as ‘G-Sync Compatibility’, under which NVIDIA cards will support VESA Adaptive Sync. That driver is due the same day as the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage.

Like the RTX 2070, the RTX 2060 (6GB) is less suited as an upgrade option for most high-end GTX 10 series owners, and with 6GB of VRAM it’s a little less tempting than it could be as a move up from the GTX 1060 6GB or GTX 980 Ti. The card offers known performance along the lines of the GTX 1070 Ti at very similar power consumption, but brings better value than existing higher-end RTX 20 series models. And this time, there’s less of a spoiler effect from older Pascal models.

Compared to previous generations, it’s not breaking the price-to-performance curve, as it is still an RTX card pulling double-duty as the new entry point for RTX platform support. That being said, there is no mincing words about the continuing price creep of the past two GeForce generations. The price-to-performance characteristics of the RTX 2070, 2080, and 2080 Ti are what render the RTX 2060 (6GB) a better value in comparison, not necessarily because it is a great value in absolute terms. But as an upgrade from older mainstream cards, the RTX 2060 (6GB) price point is a lot more reasonable than the RTX 2070’s $500+, where more of the price premium comes from forward-looking hardware-accelerated features like realtime raytracing.

So the RTX 2060 (6GB) is most suitable for gamers who aren’t gung-ho early adopters or longtime enthusiasts. The caveat is the 6GB framebuffer, keeping in mind that the 4GB GTX 980 and 970 now punch below their weight in certain games, given the trends of HDR, HD texture packs, high refresh rates, and more. Beyond that, the RTX 2060 (6GB) and RTX 2070 come with a choice of Anthem or Battlefield V as part of the new bundle. For a prospective buyer, this might not justify $500 but might tip the scales at $350, especially as realtime raytracing can be tried out immediately in Battlefield V. In the same way, upcoming support for adaptive sync could do the same for those looking to upgrade to a monitor with variable refresh rate.


Google Assistant will soon be on a billion devices, and feature phones are next - The Verge

Posted: 07 Jan 2019 06:00 AM PST

As CES kicks off, Google has a massive presence: monorails, a booth that's three times larger than last year, and likely a giant pile of news to announce. But ahead of all the actual product news, the company wants to beat its chest a little by announcing some numbers. By the end of the month, it expects that Google Assistant will be on 1 billion devices — up from 500 million this past May.

That's 900 million more than the number Amazon just gave us for Alexa. But just like Amazon, Google's number comes with caveats. In an interview with The Verge, Manuel Bronstein, the company's vice president of Google Assistant, copped to it. "The largest footprint right now is on phones. On Android devices, we have a very very large footprint," he says. He characterizes the ratio of phones as "the vast majority" of that billion number, but he won't specify it more than that. Though he does argue that smart speakers and other connected home devices comprise a notable and growing portion.

In addition to the billion milestone, Google is also pointing out that Assistant now works in 30 languages and is available in 80 countries. Global active users have grown four times year over year, too. That last stat is not super useful, of course, because we don't know what the active users number was last year.

So while these are new numbers, they're not necessarily informative numbers. Just as with Amazon, the main thing you can take away is that Google has hit a big enough scale to be able to claim it's got an honest-to-god platform on its hands. And so the next step is expanding that platform to work on as many new devices as possible.

For Google, the next billion devices will come in emerging markets, specifically on feature phones. Here's what Bronstein has to say about them:

There are large, large numbers of feature phones in the market today — hundreds of millions. ... But if you think about writing, reading, and typing on those feature phones, it's not that simple. And the voice-first interaction is becoming increasingly important in those markets, and we're beginning to see traction there. We're going to start talking more about that at Mobile World Congress, but definitely, there's a massive opportunity for voice interaction and assistive technology in those markets as well.

We won't get the full details of what Google has planned for feature phones until February. But in the meantime, given Google's massive presence here at CES, there's sure to be a ton of other announcements to pay attention to in the coming days.


CES Kicks Off Tuesday & Samsung Surprises with an iTunes Announcement, HP Laptops with OLED Displays and Apple … - Patently Apple

Posted: 06 Jan 2019 12:55 PM PST

Samsung surprised the market today with the announcement that it will be offering iTunes Movies and TV Shows, along with AirPlay 2 support, on the 2019 Samsung Smart TV models launching this spring. Clearly Apple is gearing up its new content service, and getting listed on Samsung's bottom services bar as presented above is an excellent way to help kick off that service, rumored to be debuting in 2019.

Samsung notes that adding Apple's iTunes is an industry first: the iTunes Movies and TV Shows app will debut only on Samsung Smart TVs, in more than 100 countries. AirPlay 2 support will be available on Samsung Smart TVs in 190 countries worldwide. Support on older 2018 Samsung Smart TVs will be made available via firmware update.

With the new iTunes Movies and TV Shows app on Samsung Smart TVs, Samsung customers can access their existing iTunes library and browse the iTunes Store to buy or rent from a selection of hundreds of thousands of movies and TV episodes — including the largest selection of 4K HDR movies. iTunes Movies and TV Shows will work seamlessly with Samsung’s Smart TV Services, such as Universal Guide, the New Bixby and Search, to create a consistent experience across Samsung’s platform.

With AirPlay 2 support, Samsung customers will be able to effortlessly play videos, photos, music, podcasts and more from Apple devices directly to Samsung Smart TVs, including QLED 4K and 8K TVs, The Frame and Serif lifestyle TVs, as well as other Samsung UHD and HD models.

Won-Jin Lee, Executive Vice President, Service Business of Visual Display at Samsung Electronics: "We pride ourselves on working with top industry leaders to deliver the widest range of content services to our Smart TV platform. Bringing more content, value and open platform functionality to Samsung TV owners and Apple customers through iTunes and AirPlay is ideal for everyone."

Eddy Cue, senior vice president of Internet Software and Services at Apple: "We look forward to bringing the iTunes and AirPlay 2 experience to even more customers around the world through Samsung Smart TVs, so iPhone, iPad and Mac users have yet another way to enjoy all their favorite content on the biggest screen in their home."

Other interesting announcements beginning to leak out prior to the CES 2019 event kick starting on Tuesday include:

HP Breaks New Ground with a 15" AMOLED Display: It was only a matter of time before notebooks got a little of the attention that has gone to smartphones. While Apple made the switch to OLED with the iPhone X, HP is the one that will finally bring quality AMOLED displays to notebooks, starting with its Spectre x360.

With support for HDR, the Spectre x360 15 delivers 33 percent more colors than sRGB and has a stellar 100,000:1 contrast ratio. All of that comes in a new Spectre x360 body recently unveiled by HP, with a dual-chamfered design for easier opening, a privacy kill switch for the webcam, and very narrow bezels. More on this.

HP's Omen 15 is the first gaming laptop with a 240Hz display: While Apple wowed us with the iPad Pro and its 120Hz display for ultra-smooth scrolling, HP is upping the ante with a 240Hz display on its Omen 15" gaming laptop. More on this.

Lastly, HP's leaks also revealed that the company is adding Intel Core i9 CPUs and Nvidia RTX graphics to its elite Omen Obelisk desktop.

Oddly Apple

When Huawei drove trucks plastered with ads promoting its upcoming smartphone in front of UK Apple stores, I found it to be like a teenage prank. Huawei also heavily promoted its smartphones with massive ads around Apple's construction site in Milan, Italy.

Oddly, Apple is getting on board with this kind of marketing tactic, putting up a new in-your-face banner ad that reads: "What happens on your iPhone, stays on your iPhone." It's a play on the old saying about Vegas: "What happens in Vegas, stays in Vegas."

Apple has never had a presence at the annual CES event, so it's an oddity for Apple to stick its iPhone message in the face of competitors like Google.

Apple's CEO Tim Cook took aim at Google and Facebook over 'weaponized data' back in late October. The banner may be a follow-up to Apple's ongoing message on privacy that lashes out at social media companies.

