
Maingear FORCE

0. Intel i7 6900K,
ASUS Republic of Gamers Rampage V
Dual NVIDIA GeForce GTX 1080s SLI
Samsung 850 EVO SSDs
Corsair AX1500i PSU
HyperX Predator memory

1.ASUS Rampage V Extreme X99 motherboard


Intel i7 5960X processor
EPIC SUPERSTOCK Custom handcrafted Hardline liquid cooling
64GB Corsair Dominator Platinum DDR4-2800 Memory
1500 Watt Corsair AX1500i PSU
NVIDIA GeForce GTX Titan X in 4 Way SLI
6 X 1TB Samsung USA 850 PRO SSD in RAID 0
2.GIGABYTE Z97X-Gaming G1
Intel i7 4790K OC'd to 4.7 GHz
EK 420 + EK 480 Heat Array
32 GB Corsair Dominator Platinum DDR3
2x NVIDIA GeForce GTX 980s
Corsair AX1200i PSU
1TB Samsung 840 EVO SSD
1TB Samsung 850 EVO SSD
2 TB Seagate Barracuda
3.ASUS Maximus VII Formula Motherboard
Dual NVIDIA GeForce GTX 980 in SLI
Intel i7 4790k Processor
32GB Corsair Dominator memory
1X Samsung 850 EVO 250GB SSD
1X Samsung 850 EVO 1TB SSD
4.3x NVIDIA GeForce GTX 980 Video Cards
Intel i7 5960x Processor
32GB Corsair Gaming Vengeance DDR4- 2666
2X Samsung USA 850 PRO 256GB SSD in RAID 0
1500 Watt Corsair Professional Digital Series AX1500i 80+ Titanium Certified Modular Power Supply ROHS
5.Motherboard: ASUS North America Rampage V Extreme
CPU: Intel Core i7 5960X
Memory: 32GB Corsair Dominator Platinum DDR4-2667
Video Cards: NVIDIA GeForce Titan X in Four Way SLI
SSD: Samsung USA 850 PRO 1TB
6. Motherboard: ASUS Z170-Deluxe
CPU: Intel Core i7 6700K
Memory: 32GB HyperX Predator DDR4-2667
Video Cards: NVIDIA GeForce Titan X in Two Way SLI
SSD: Intel NVMe 750 PCIe SSD 1.2TB
PSU: Corsair AX1200i 1200 watt
7. Motherboard: ASUS Z10PE-D8 WS
CPU: 2 x Intel E5-2698 V3 w/ total of 32 Cores/64 Threads
Memory: 128GB Crucial DDR4-2133
GPU: 3 x NVIDIA GeForce Titan X
GPGPU: NVIDIA Tesla K40
SSD: 2 x 1TB Samsung USA 850 PRO
HDD: 3 x 6TB Seagate Barracuda
PSU: Corsair AX1500i 1500 Watt
Maingear SHIFT
1.ASUS Rampage V Extreme Motherboard
NVIDIA GeForce GTX 980
Intel i7 5960X Processor
16GB Corsair Dominator memory
1X Samsung 850 EVO 250GB SSD
1X Samsung 850 EVO 1TB SSD

2.ASUS North America X99 DELUXE Motherboard

Dual NVIDIA GeForce GTX 980 in SLI


Intel i7 5960X Processor
32GB Corsair Dominator memory
1X Samsung 850 EVO 250GB SSD
1X Samsung 850 EVO 1TB SSD

3.ASUS Rampage V Extreme Motherboard


Intel i7 5960X Processor Overclocked to 4.5 GHz
32GB Corsair Dominator Platinum DDR4 2800
3x NVIDIA GeForce GTX TITAN X
Corsair AX1200i PSU
EPIC 180 Liquid Cooling System
2X 256GB Samsung USA 850 EVO SSD in RAID 0
3TB Seagate Barracuda 7200 RPM HDD
4.Intel Core i7 5960X 8-Core overclocked to 4.5GHz
ASUS North America Rampage V Extreme Motherboard
KLEVV Cras 32GB DDR4-3000 Memory
Dual NVIDIA GeForce GTX Titan X GPU
Corsair AX1200i 1200 watt PSU
Samsung USA 850 Evo 500GB SSD
5.Motherboard: X99 ASUS North America Deluxe
CPU: Intel Core i7 5960X
Memory: 32GB Corsair Dominator
Video Cards: Dual NVIDIA GeForce 980 in SLI
SSD: Dual Samsung USA Evo 256GB
6. Custom One-of-a-kind Liquid Cooling with Hard Acrylic Tubing
Motherboard: ASUS North America X99 Rampage V Extreme
CPU: Intel Core i7 5960X 8 Core/16 Threads
Memory: 16GB Corsair Dominator Platinum DDR4-2800
Video Cards: 3-Way SLI NVIDIA GeForce GTX 980 4GB GDDR5
SSD: 256GB Samsung USA 850 Pro
HDD: 3TB Seagate Barracuda
PSU: Corsair AX1200i 1200 watt
Maingear THE RUSH
1.ASUS North America Maximus 7 Gene Motherboard
NVIDIA GeForce GTX 980 Video Card
Intel i7 4790K Processor
16GB Corsair Dominator memory
Dual Samsung USA 850 PRO 256GB SSD in RAID 0
Corsair AX860 power supply
2. Dual NVIDIA GeForce GTX 1080s in SLI,
Intel Core i7 6950X,
MSI USA X99 XPower Gaming Titanium
HyperX 32GB Predator DDR4-3000
Maingear X-CUBE XL
1.Motherboard: X99 ASUS North America Deluxe
CPU: Intel Core i7 5960X
Memory: 64GB Corsair Vengeance
Video Cards: Dual AMD RADEON R9 FURY X
SSD: Samsung USA Pro 512GB

2.Intel Core i7 5960X 8-Core overclocked to 4.5GHz


ASUS North America Rampage V Extreme Motherboard
KLEVV Cras 32GB DDR4-3000 Memory
Dual NVIDIA GeForce GTX Titan X GPU
Corsair AX1200i 1200 watt PSU
Samsung USA 850 Evo 500GB SSD
Arctic Panther
Intel Core i7 5960X CPU
Gigabyte GTX 980 G1 Gaming GPUs (x2)
ASUS X99 Deluxe Motherboard
G.Skill Ripjaws 4 32GB (4x8GB) DDR4 2400 memory (black)
Samsung 850 EVO 250GB SSD (OS) & 2x Samsung 850 PRO 500GB (RAID 0)
WD 4TB Black HDDs (x2, RAID 1)
Fractal Design Define R5 Case (Black)

Fractal Design Newton 1000W PSU

Ocaholic - Project Alpha Omega.


CASE - Corsair Italia Carbide SPEC-ALPHA
CPU - Intel Core i7 5930K
MB - ASUS ROG STRIX X99 Gaming
GPU - 2 x ASUS ROG GeForce GTX 1080 Strix Gaming OC
SLI - NVIDIA High Bandwidth SLI Bridge 3-Slot
RAM - 64GB Corsair Italia Dominator Platinum DDR4 2800MHz
SSD - 2 x Corsair Italia Force LE 480GB
PSU - Corsair Italia HX1000i
Custom Cables & LEDs - CableMod
Custom Watercooling
CPU - Bitspower Summit EF
RES - Bitspower Water Tank Z-Multi 250 (Clear acrylic)
PUMP - Bitspower DDC Plus with Bitspower DDC Upgrade Kit
RAD - Bitspower Leviathan Xtreme 360
FITTINGS - Bitspower Matte Black - various
TUBING - Bitspower 12mm Acrylic
FANS - Corsair Italia Air Series SP120 Quiet Edition x 4
COOLANT - Mayhems Aurora II Silver

Thermaltake Technology Inc Core P5, custom painted NVIDIA green, featuring NVIDIA GeForce GTX 1080s in SLI mounted to the GIGABYTE Gaming 7 Z170X motherboard, utilizing an Intel 6700K and 16GB of G.Skill Trident Z RAM. This build also features Thermaltake LCS components, using Mayhems Solutions Ltd Silver Aurora 2 and white X1 mix and Riing fans, as well as some V1 Tech fan grilles and backplates.

CORSAIR 780T-Y4NG ROG 1547/1547


SPEC:
i7 6950X
ASUS RAMPAGE V EDITION 10
ASUS GTX TITAN X SLI.
CORSAIR DOMINATOR PLATINUM 4X8/3200 ROG EDITION
SAMSUNG 950PRO M.2 512GB.
WD 3TB RED
CORSAIR AX1200I
CORSAIR 780T

Anthony's Video Card Test System Specifications

Motherboard: ASUS Rampage V Extreme - Buy from Amazon / Read our review
CPU: Intel Core i7 5960X - Buy from Amazon / Read our review
GPU: AMD Radeon R9 Fury X Crossfire x 4

Cooler: Corsair H110 - Buy from Amazon / Read our review


Memory: Kingston 16GB (4x4GB) HyperX Predator DDR4 3000MHz - Buy from Amazon
Storage #1: SanDisk Extreme II 240GB - Buy from Amazon / Read our review
Storage #2: Intel 730 Series 480GB - Buy from Amazon / Read our review
Case: Lian Li PC-T80 Open-Air - Buy from Amazon
Power Supply: Corsair AX1500i - Buy from Amazon / Read our review
OS: Microsoft Windows 7 Ultimate 64-bit - Buy from Amazon
Drivers: NVIDIA GeForce 355.65 and AMD Catalyst 15.7.1

Anthony's Video Card Test System Specifications

Motherboard: ASUS Rampage V Extreme - Buy from Amazon / Read our review
CPU: Intel Core i7 5960X - Buy from Amazon / Read our review
Cooler: Corsair H110 - Buy from Amazon / Read our review
Memory: Kingston 16GB (4x4GB) HyperX Predator DDR4 3000MHz - Buy from Amazon
Storage #1: SanDisk Extreme II 240GB - Buy from Amazon / Read our review
Storage #2: Intel 730 Series 480GB - Buy from Amazon / Read our review
Case: Lian Li PC-T80 Open-Air - Buy from Amazon
Power Supply: Corsair AX1500i - Buy from Amazon / Read our review
OS: Microsoft Windows 10 Home 64-bit - Buy from Amazon
Drivers: NVIDIA GeForce 361.91 and AMD Crimson 16.1.1

Rig of the Day - R40 ROG Edition


CPU: Intel i5-6600K Overclocked
Mobo: ASUS Maximus Gene VIII
Ram: 16GB Corsair Vengeance LPX
GPU: 2x ASUS GTX 980TI STRIX in SLI
Case: HEX GEAR R40 Black with custom red frost accent pack and panels
Power Supply: Corsair HX1000i with custom laser engraving
SSD: Corsair Neutron XT 240 gb
CPU cooling: BeQuiet Dark Rock TF

CORSAIR 780T
SPEC:
i7 6850K
ASUS RAMPAGE V EDITION 10
ASUS GTX1080 FE. SLI.
CORSAIR DOMINATOR PLATINUM 8X8/3000
SAMSUNG 950PRO M.2 512GB.
SAMSUNG 850PRO 2TB.
CORSAIR AX1200I
CORSAIR 780T
HRC SLI BRIDGE
WATERCOOLING:
EKWB SUPREMACY EVO NICKEL PLEXI
EKWB FC GTX 1080 NICKEL ACETAL
EKWB GTX 1080 BACKPLATE
EKWB RADIATOR SE360
BITSPOWER MULTI LINK ADAPTOR 12MM
EKWB DDC TOP MOD
LAING DDC 355 PUMP
EKWB PASTEL WHITE
HRC FLEX PIPE 10/12
GALAXY PC CABLE
THERMALTAKE RING RGB 120

GIGABYTE GA-Z170X-Gaming 7
Intel i7 6700K
2 x GIGABYTE GeForce GTX 980 Ti Windforce 3
G.Skill Ripjaws V Series 16GB (2 x 8GB) DDR4-2800
HyperX Predator 240GB M.2 PCIe SSD
SAMSUNG 850 EVO 1TB SSD
SeaSonic Snow Silent-1050 1050W
Caselabs S8
Rig of the Day - In Win 909 Skylake
Intel 6700k
MSI USA Xpower Gaming Titanium
Corsair Dominator Platinum
VisionTek Products, LLC. Fury X (2)
Intel 750 Series 1.2TB (Liquid cooled!)
HGST 4x4TB RAID 5
EVGA 1200 P2
Sanctum Sleeving
In Win USA 909 Silver
EK Water Blocks Custom Liquid Cooling
EK-Supremacy
EK-FC Fury X - Nickel + Nickel Backplates
EK-FC Terminal Triple Parallel Plexi
X3 150 Reservoir w/ white top & bottom
XTOP D5 Plexi + Nickel Cover Kit
XE360 Radiator + 2x SE 240 Radiators
EK-Vardar F4-120 ER White x11
EK-Ekoolant Pastel Blue
EK-ACF White 10/16 Fittings

THE UBER RIG


Monitor: HP Z27Q (5K) | Dell P2715Q (4K)
GPUs: 4x GTX TITAN X (PASCAL) @ 2050MHz / 11664MHz
MoBo: Asus Rampage V Extreme (X99)
CPU: i7 6950X @ 4.30GHz
RAM: Corsair Dominator Platinum DDR4 3200MHz (64GB)
PSU: Corsair AX1500i
OS: Samsung 850 Pro 256GB
Games/Programs: Samsung 840 EVO (RAID-0) / Samsung 850 EVO
THE ASUS RoG SWIFT (G-SYNC) RIG
Monitor: ASUS RoG Swift PG278Q (1440P)
GPUs: 2x EVGA GTX 980 Ti Classified SLI @ 1430MHz / 8100MHz
MoBo: ASRock Extreme11 (X79)
CPU: i7 3970X @ 4.50GHz
RAM: Corsair Dominator Platinum DDR3 2133MHz (32GB)
PSU: Antec HCP-1300
OS: Samsung 840 Pro 250GB
Games/Programs: WD Caviar Black (2TB)

$15,000 4-Way Titan X(P) Rig - http://bit.ly/2bX68Zh


Intel Core i7-6950X 3.0GHz 10-Core Processor - http://amzn.to/2chv5ea
Corsair H110i GTX 104.7 CFM Liquid CPU Cooler - http://amzn.to/2bWQ6N4
Asus X99-E WS SSI CEB LGA2011-3 Motherboard - http://amzn.to/2bO2Q7b
Corsair Vengeance LPX 128GB (8 x 16GB) DDR4-3000 Memory - http://amzn.to/2bNpgUv
OCZ RD400 1TB M.2-2280 Solid State Drive - http://amzn.to/2chvXzt
(x4) Samsung 850 PRO 2TB 2.5" SSD - http://amzn.to/2bWNRtc
(x4) Seagate IronWolf 10TB 3.5" 7200RPM Hard Drive - http://amzn.to/2bNoGWW
(x4) NVIDIA Titan X (Pascal) 12GB Video Card (4-Way SLI) - http://www.geforce.com/hardware/10ser...
CaseLabs MAGNUM TH10A Case - http://www.caselabs-store.com/magnum-...
EVGA SuperNOVA T2 1600W 80+ Titanium Modular ATX PSU - http://amzn.to/2bNpRW5
Pioneer BDR-209DBK Blu-Ray/DVD/CD Writer - http://amzn.to/2chvVbb
Windows 10 (from Kinguin) - http://bit.ly/2bUDWSc

Guru3D

Multi-GPU Mode Explained

Both Nvidia's SLI and AMD's Crossfire allow you to combine a second, third or sometimes even a fourth similar-generation graphics card (or more GPUs) with the one you already have in your PC. This way you effectively try to double, triple or even quadruple your raw rendering gaming performance (in theory). Starting with Pascal GPUs (GeForce GTX 1070 and 1080), however, Nvidia primarily supports 2-way SLI mode and is phasing out 3-way and 4-way SLI. We don't really mind: the reality is that the more GPUs are present, the worse the scaling becomes and the more driver issues you will run into. Honestly, two GPUs is ideal in most multi-GPU gaming scenarios; always remember that.
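The diminishing returns described above can be made concrete with a toy model. This is purely illustrative: the per-GPU efficiency figure is an assumption chosen for the sketch, not a measured SLI scaling number.

```python
# Toy scaling model: each additional GPU contributes only a fraction of a
# full GPU's worth of extra frame rate, and that fraction shrinks with
# every card added. The 0.7 efficiency value is an assumption, not data.
def effective_fps(single_gpu_fps, num_gpus, efficiency=0.7):
    extra = sum(efficiency ** i for i in range(1, num_gpus))
    return single_gpu_fps * (1 + extra)

# A hypothetical 60 FPS single-card baseline:
for n in range(1, 5):
    print(n, "GPU(s):", round(effective_fps(60, n), 1), "FPS")
```

With these made-up numbers the second GPU adds 42 FPS while the fourth adds only about 20, which is the shape of the argument being made here.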
You could, for example, place two or more AMD graphics cards into a Crossfire-compatible motherboard, or two GeForce graphics cards in SLI mode on a compatible motherboard. In today's article we'll use a 2-way SLI GeForce GTX 1080 configuration.

A Crossfire compatible motherboard is pretty much ANY motherboard with multiple PCIe x16 slots that is not an nForce motherboard.

An SLI-certified motherboard is an nForce motherboard with more than two PCIe x16 slots, or a certified P55, P67, Z68, X58, Z77, Z87, X79, Z97, X99 or Z170 motherboard. Please check with the motherboard manufacturer whether or not it is SLI compatible. Keep that in mind, but most of the latest-generation AMD and Intel based motherboards are compatible. A small note: if you are on an AMD processor, then on AMD's side the 900 series chipset supports SLI as well.

Once we seat the similar graphics cards on the carefully selected motherboard, we just bridge them together with a supplied Crossfire
connector or, in Nvidia's case, an SLI connector. Then install/update the drivers, after which most games can take advantage of the extra
horsepower we just added into the system.

Once you have your hardware set up, it's time to install the latest drivers. In the Nvidia control panel, make sure that 'Maximize 3D Performance' is activated. For an SLI + multi-monitor setup you need to click 'Span Displays with Surround'. Multi-GPU rendering is not a new idea at all. There are multiple ways to manage two cards rendering one frame: think of Super Tiling, a popular form of rendering; Alternate Frame Rendering, where each card renders a frame (even/uneven); or Split Frame Rendering, where one GPU simply renders the upper part of the frame and the other the lower part. So you see, there are many methods by which two or more GPUs can be utilized to bring you a substantial gain in performance.
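The work-sharing schemes just mentioned can be sketched in a few lines. This is a conceptual illustration only; real drivers do this inside the graphics stack, and the function names here are our own.

```python
def afr_assign(frame_ids, num_gpus=2):
    # Alternate Frame Rendering: whole frames dealt out round-robin,
    # frame n goes to GPU (n mod num_gpus).
    return [(frame, frame % num_gpus) for frame in frame_ids]

def sfr_assign(frame_height, num_gpus=2):
    # Split Frame Rendering: one frame cut into horizontal slices,
    # one slice per GPU (real drivers balance the split dynamically).
    slice_h = frame_height // num_gpus
    return [(gpu, gpu * slice_h, (gpu + 1) * slice_h)
            for gpu in range(num_gpus)]

print(afr_assign(range(4)))   # even frames -> GPU 0, odd frames -> GPU 1
print(sfr_assign(1080))       # GPU 0 renders rows 0-540, GPU 1 rows 540-1080
```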
The Computer Components Used
To understand what we are doing today, we need to briefly take you through some of the key components used for our PC. Today we have a home-built DIY (Do It Yourself) X99-based Core i7 system that consists of the following gear:
Benchmark Setup:

Core i7 5960X with all cores clocked at 4.4 GHz

Motherboard -- X99 MSI Godlike

Memory -- 16GB (4 x 4096MB)

512 GB SSD for storage

1200 Watt Power Supply

These are some pretty nifty parts; bear in mind, when you opt for multi-GPU gaming, always have your gear right. You'll need that quality power supply, a proper SLI-supporting motherboard, a fast processor, and a chassis with some very decent airflow to keep the graphics cards nicely chilled.

If you decide to go for high-end multi-GPU gaming, our current recommendation is a Core i7 processor on a Z87/Z97/X79/X99/Z170 motherboard, as these have plenty of PCIe gen 2.0 and 3.0 lanes and thus the cross-link bandwidth really is optimal. For installation, make sure you do not forget to use a proper SLI bridge -- we'll talk a bit more about that on the next page.

An Update To SLI
With Pascal there is a change invoked for SLI. One critical ingredient of NVIDIA's SLI technology is the SLI Bridge, a digital interface that transfers display data between GeForce graphics cards in a system. Two of these interfaces have historically been used to enable communication between three or more GPUs (i.e., 3-Way and 4-Way SLI configurations). The second SLI interface is required for these scenarios because all other GPUs need to transfer their rendered frames to the display connected to the master GPU, and up to this point each interface has been independent.

Beginning with NVIDIA Pascal GPUs, the two interfaces are now linked together to improve bandwidth between GPUs. This new dual-link SLI mode allows both SLI interfaces to be used in tandem to feed one high-resolution display or multiple displays for NVIDIA Surround. Dual-link SLI mode is supported with a new SLI bridge called SLI HB.

The bridge facilitates high-speed data transfer between GPUs, connecting both SLI interfaces, and is the best way to achieve full SLI clock speeds with GeForce GTX 1080 GPUs running in SLI. The GeForce GTX 1080 is also compatible with legacy SLI bridges; however, the GPU will be limited to the maximum speed of the bridge being used.

Using this new SLI HB bridge, the GeForce GTX 1080's new SLI interface runs at 650 MHz, compared to 400 MHz in previous GeForce GPUs using legacy SLI bridges. Where possible, though, older SLI bridges will also get a speed boost when used with Pascal. Specifically, custom bridges that include LED lighting will now operate at up to 650 MHz when used with the GTX 1080, taking advantage of Pascal's higher-speed IO.
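As a back-of-the-envelope check on those clocks: if link throughput scales linearly with interface clock and the HB bridge drives both interfaces in tandem, the gain over a single legacy link works out as follows. This is our simplification of the numbers above, not NVIDIA's spec sheet.

```python
LEGACY_MHZ = 400   # single legacy SLI interface clock
HB_MHZ = 650       # per-interface clock with the SLI HB bridge

single_legacy = LEGACY_MHZ * 1   # one interface
dual_hb = HB_MHZ * 2             # both interfaces used in tandem

print(dual_hb / single_legacy)   # 3.25x the legacy single-link rate
```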

Compared to prior DirectX APIs, Microsoft has made a number of changes that impact multi-GPU functionality in DirectX 12. At the highest
level, there are two basic options for developers to use multi-GPU on NVIDIA hardware in DX12: Multi Display Adapter (MDA) Mode, and
Linked Display Adapter (LDA) mode. LDA Mode has two forms: Implicit LDA Mode which NVIDIA uses for SLI, and Explicit LDA Mode where
game developers handle much of the responsibility needed for multi-GPU operation to work successfully. MDA and LDA Explicit Mode were
developed to give game developers more control. The following table summarizes the three modes supported on NVIDIA GPUs:

In LDA Mode, each GPU's memory can be linked together to appear as one large pool of memory to the developer (although there are certain corner-case exceptions regarding peer-to-peer memory); however, there is a performance penalty if the needed data resides in the other GPU's memory, since that memory is accessed through inter-GPU peer-to-peer communication (such as PCIe). In MDA Mode, each GPU's memory is allocated independently of the other's: each GPU cannot directly access the other's memory. LDA is intended for multi-GPU systems whose GPUs are similar to each other, while MDA Mode has fewer restrictions: discrete GPUs can be paired with integrated GPUs, or with discrete GPUs from another manufacturer. MDA Mode does, however, require the developer to manage more carefully all of the operations that are needed for the GPUs to communicate with each other. By default, GeForce GTX 1070/1080 SLI supports up to two GPUs.
3-Way and 4-Way SLI modes are no longer recommended. As games have evolved, it is becoming increasingly difficult for these SLI modes to
provide beneficial performance scaling for end users. For instance, many games become bottlenecked by the CPU when running 3-Way and
4-Way SLI, and games are increasingly using techniques that make it very difficult to extract frame-to-frame parallelism. Of course, systems
will still be built targeting other Multi-GPU software models including:

MDA or LDA Explicit targeted

2 Way SLI + dedicated PhysX GPU

3 & 4-Way SLI?


NVIDIA indeed no longer recommends 3-way or 4-way SLI and focuses on 2-way SLI only. For those of you who do want more than two GPUs, however, there is a (somewhat complex) way: the Enthusiast Key. While NVIDIA no longer recommends 3-way or 4-way SLI, the company knows that true enthusiasts will not be swayed, and in fact some games will continue to deliver great scaling beyond two GPUs. For this class of user, NVIDIA has developed an Enthusiast Key that can be downloaded from NVIDIA's website and loaded into an individual's GPU. This process involves:
1. Run an app locally to generate a signature for your GPU
2. Request an Enthusiast Key from an upcoming NVIDIA Enthusiast Key website
3. Download your key
4. Install your key to unlock the 3 and 4-way function
Full details on the process are available on the NVIDIA Enthusiast Key website, which will go live by the time GeForce GTX 1080/1070 GPUs are in users' hands.

GeForce GTX 1080 2-way SLI (preliminary review)


Two weeks ago we had a peek at the GeForce GTX 1080; this round, however, we bring in the big guns, as in plural. We will arm our big daddy PC rig with not one but two GeForce GTX 1080 cards and zoom in on SLI performance for the GeForce GTX 1080. In this review we'll run the standard benchmarks, but we will also have an even closer look at Ultra HD gaming performance, as well as a micro-stuttering analysis with the help of FCAT.

Before we start, one HUGE note: this article is revision one. The early 368.25 driver does not seem to support SLI for the majority of games, hence we'll look at a handful of games in this preliminary revision of the article first; once a proper, final SLI-supporting driver is out, we'll add the rest. So this is us jumping the gun to push some results out, not blaming Nvidia.
We will be looking at performance from a single-monitor point of view, so ideally, with so much horsepower, a Wide Quad HD resolution (2560x1440) is where you should start (but preferably Ultra HD, of course). You will notice great performance increases with 2-way SLI as the cards scale nicely, though certain scenarios will have a bit more in the way of scaling issues as well.

While Full HD (1920x1080/1200) and WQHD (2560x1440) have become the industry standard within the display industry, enthusiasts will never settle for just the 'standard' and are always looking for the next big innovation in technology. Ultra HD gaming is exactly that, the next evolution in immersion that gamers have been waiting for. Commonly addressed as Ultra HD, UHD or 4K, this resolution refers to ultra-high resolutions with approximately 4000 horizontal pixels; Ultra HD also has four times the number of pixels of a typical 1920x1080 resolution. It will be interesting to find out how the GeForce GTX 1080 cards, with their nice 8 GB framebuffers, handle such extreme resolutions. With UHD (Ultra High Definition) gaming rapidly becoming popular, we'll test multiple multi-GPU setups on such a monitor. Next to that, we'll perform FCAT tests to see how the cards behave anno 2016 in terms of micro-stuttering and frame pacing. Join us in this review where we'll once again look at everything. As if you figured just one card would be interesting.

Hardware Installation
Installation of any of the Nvidia GeForce cards is really easy. Once the card is seated in the PC, make sure you hook up the monitor and of course any external power connectors like 6- and/or 8-pin PEG power connectors. Preferably get yourself a power supply that has these PCIe PEG connectors natively (converting them from a Molex peripheral connector, anno 2014, we feel is a no-go).

Download new NVIDIA GeForce drivers here

Once done, we boot into Windows, install the latest drivers and after a reboot all should be working.

Load up the Nvidia control panel and check that SLI mode was successfully enabled; 'Maximize 3D performance' is the setting you are after. Once applied properly you will see the green indicator line showing 2-way SLI.

Power Consumption
Let's have a look at how much power draw we measure with this graphics card installed. The methodology: We have a device constantly
monitoring the power draw from the PC. We stress the GPU to the max, and the processor as little as possible. The before and after wattage
will tell us roughly how much power a graphics card is consuming under load. Our test system is based on an eight-core Intel Core i7-5960X
Extreme Edition setup on the X99 chipset platform. This setup is overclocked to 4.40 GHz on all cores. Next to that we have energy saving
functions disabled for this motherboard and processor (to ensure consistent benchmark results). We'll be calculating the GPU power
consumption here, not the total PC power consumption.
Measured Power Consumption
Mind you, the system wattage is measured at the wall socket side and there are other variables like PSU power efficiency. So this is an
estimated value, albeit a very good one. Below, a chart of relative power consumption. Again, the Wattage shown is the card with the GPU(s)
stressed 100%, showing only the peak GPU power draw, not the power consumption of the entire PC and not the average gaming power
consumption.
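The before/after method described above boils down to simple arithmetic. The wattages and PSU efficiency below are hypothetical example readings, not figures from this review:

```python
def gpu_power_estimate(idle_wall_watts, load_wall_watts, psu_efficiency=0.92):
    # The wall-socket delta between idle and GPU-stressed load, corrected
    # for PSU conversion losses, approximates the card's own draw.
    return (load_wall_watts - idle_wall_watts) * psu_efficiency

print(gpu_power_estimate(85, 480))   # example readings -> roughly 363 W
```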

Power consumption

Graphics card measured TDP: 0,364 kWh (per hour of load)
kWh price: 0,23
Cost at 2 hrs/day: 0,17
Cost at 4 hrs/day: 0,33
Cost per week (5 days/week, 4 hrs/day): 1,67
Cost per month: 7,26
Cost per year (5 days/week, 4 hrs/day): 87,07
Cost per year (5 days/week, 4 hrs/day) in $: $ 114,93
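The table's arithmetic can be reproduced directly (values in the review's currency, using its assumed 0,23-per-kWh rate):

```python
tdp_kw = 0.364      # measured card draw, in kW (364 W)
kwh_price = 0.23    # energy price per kWh assumed by the table

cost_2h = tdp_kw * kwh_price * 2      # two gaming hours per day
cost_4h = tdp_kw * kwh_price * 4      # four gaming hours per day
cost_week = cost_4h * 5               # five gaming days per week
cost_year = cost_week * 52
cost_month = cost_year / 12

print(round(cost_2h, 2), round(cost_4h, 2), round(cost_week, 2),
      round(cost_month, 2), round(cost_year, 2))
# -> 0.17 0.33 1.67 7.26 87.07, matching the table
```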

Power supply recommendation:

GeForce GTX 1080 - for an average system with this card we recommend a 550 Watt power supply unit.

GeForce GTX 1080 SLI - for an average system with both cards we recommend a 750 Watt power supply unit.

If you are going to overclock your GPU or processor, then we do recommend you purchase something with some more stamina. A PSU is most energy efficient at half load (50% usage), hence we recommend that bit extra. There are many good PSUs out there; please have a look at our many PSU reviews, as we have loads of recommended PSUs for you to check out. What could happen if your PSU can't cope with the load:

Bad 3D performance

Crashing games

Spontaneous reset or imminent shutdown of the PC

Freezing during gameplay

PSU overload can cause it to break down

Let's move to the next page where we'll look into GPU heat levels and noise levels coming from this graphics card.

Graphics Card Temperatures


So here we'll have a look at GPU temperatures. First up, IDLE (desktop) temperatures as reported through software on the thermal sensors of the GPU. For idle temperatures, overall anything below 50 degrees C is considered okay and anything below 40 degrees C is nice. We threw in some cards at random that we have recently tested in the above chart. But what happens when we are gaming? We fire off an intense game-like application at the graphics card and measure the highest temperature of the GPU.

With the cards fully stressed we kept monitoring temperatures and noted down the GPU temperature as reported by the thermal sensor.

The card's temperature under heavy game stress stabilized at 83~84 degrees C. We note down the hottest GPU reading, not the average. The two cards are set up in a worst-case scenario, right next to each other.

These tests were performed at a 20~21 degrees C room temperature; this is a peak temperature based on a FireStrike loop. For an SLI installation we recommend leaving as much space between the cards as possible. We deliberately place them close together in this review, though, to emulate a worst-case scenario.

Graphics Card Noise Levels


When graphics cards produce a lot of heat, that heat usually needs to be transported away from the hot core as fast as possible. Often you'll see massive active fan solutions that can indeed get rid of the heat, yet all these fans make the PC a noisy son of a gun. Do remember that the test we do is extremely subjective. We bought a certified dBA meter and will start measuring how many dBA originate from the PC. Why is this subjective, you ask? Well, there is always noise in the background, from the streets, from the HDD, the PSU fan, etc., so this is, by a mile or two, an imprecise measurement. You can only achieve an objective measurement in a sound test chamber.

The human hearing system has different sensitivities at different frequencies. This means that the perception of noise is not at all equal at
every frequency. Noise with significant measured levels (in dB) at high or low frequencies will not be as annoying as it would be when its
energy is concentrated in the middle frequencies. In other words, the measured noise levels in dB will not reflect the actual human perception
of the loudness of the noise. That's why we measure the dBA level. A specific circuit is added to the sound level meter to correct its reading in
regard to this concept. This reading is the noise level in dBA. The letter A is added to indicate the correction that was made in the
measurement. Frequencies below 1 kHz and above 6 kHz are attenuated, whereas frequencies between 1 kHz and 6 kHz are amplified by the
A weighting.
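The A-weighting correction described above is standardized (IEC 61672), and its usual closed form is shown below. Note that A(1 kHz) is 0 dB by construction, and low frequencies are strongly attenuated, just as the text says:

```python
import math

def a_weighting_db(f):
    # Standard A-weighting magnitude response, in dB, at frequency f (Hz).
    ra = (12194 ** 2 * f ** 4) / (
        (f ** 2 + 20.6 ** 2)
        * math.sqrt((f ** 2 + 107.7 ** 2) * (f ** 2 + 737.9 ** 2))
        * (f ** 2 + 12194 ** 2)
    )
    return 20 * math.log10(ra) + 2.00

print(round(a_weighting_db(1000), 2))   # ~0.0 dB at 1 kHz by definition
print(round(a_weighting_db(100), 1))    # ~-19.1 dB: low frequencies attenuated
print(round(a_weighting_db(2500), 1))   # slightly positive: mids amplified
```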

Examples of Sound Levels

Jet takeoff (200 feet) - 120 dBA - Intolerable
Construction site - 110 dBA - Intolerable
Shout (5 feet) - 100 dBA - Very noisy
Heavy truck (50 feet) - 90 dBA - Very noisy
Urban street - 80 dBA - Noisy
Automobile interior - 70 dBA - Noisy
Normal conversation (3 feet) - 60 dBA - Moderate
Office, classroom - 50 dBA - Moderate
Living room - 40 dBA - Quiet
Bedroom at night - 30 dBA - Quiet
Broadcast studio - 20 dBA - Barely audible
Rustling leaves - 10 dBA - Barely audible

There are a lot of differences in measurements among websites. Some even place the dBA meter 10 cm away from the card. Considering
that's not where your ear is located, we do it our way, at 75 cm distance.

For each dBA test we close the PC/chassis and move the dBA gun 75 cm away from the PC. Roughly the same proximity you'll have from a
PC in a real-world situation. Above, the IDLE (desktop mode) results where the GPU hardly has to do anything. The system idle results are
really good.

Once the card is in a fully stressed status (in-game) it touches only 42~43 dBA. The cards under stress are audible.

Test Environment & Equipment


Here is where we begin the benchmark portion of this article, but first let me show you our test system plus the software we used.
Mainboard
MSI X99A GODLIKE Gaming - Review
Processor
Core i7 5960X (Haswell-E) @ 4.4 GHz on all eight cores - Review
Graphics Cards

GeForce GTX 1080 - 8 GB GDDR5X graphics memory


Memory
16 GB (4x 4096 MB) 2,133 MHz DDR4
Power Supply Unit
1,200 Watts Platinum Certified Corsair AX1200i - Review
Monitor
Dell 3007WFP - QHD up to 2560x1600
ASUS PQ321 native 4K UHD Monitor at 3840 x 2160 - Review
OS related software
Windows 10 64-bit
DirectX 9/10/11/12 End User Runtime (Download)
AMD Radeon Software Crimson Driver 16.5.3.x (Download)
NVIDIA GeForce Driver 368.25 WHQL (Download)
Software benchmark suite

(DX11) Far Cry Primal

(DX11) Battlefield Hardline

(DX11) 3DMark 11

(DX11) 3DMark 2013 FireStrike

(DX11) Thief

(DX11) Alien: Isolation

(DX11) LOTR Middle Earth: Shadow of Mordor

A Word About "FPS"


What are we looking for in gaming, performance-wise? First off, obviously, Guru3D tends to think that all games should be played at the best image quality (IQ) possible. There's a dilemma, though: IQ often interferes with the performance of a graphics card. We measure this in FPS, the number of frames a graphics card can render per second; the higher it is, the more fluidly your game will display itself.

A game's frames per second (FPS) is a measured average over a series of tests. That test is often a time demo, a recorded part of the game which is a 1:1 representation of the actual game and its gameplay experience. After forcing the same image quality settings, this time demo is then used for all graphics cards so that the actual measuring is as objective as can be.

Frames per second - Gameplay

<30 FPS - Very limited gameplay
30-40 FPS - Average yet very playable
40-60 FPS - Good gameplay
>60 FPS - Best possible gameplay
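The buckets above as a small lookup, so the thresholds are explicit (how the exact boundary values fall is our reading of the table, not something the article spells out):

```python
def playability(fps):
    # Maps an average FPS measurement to the review's gameplay buckets.
    if fps < 30:
        return "very limited gameplay"
    if fps < 40:
        return "average yet very playable"
    if fps < 60:
        return "good gameplay"
    return "best possible gameplay"

print(playability(55))   # good gameplay
```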

So if a graphics card manages less than 30 FPS, then the game is not very playable; we want to avoid that at all cost.

With 30 FPS up to roughly 40 FPS you'll be able to play the game with perhaps a tiny stutter at certain graphically intensive parts. Overall a very enjoyable experience. Match the best possible resolution to this result and you'll have the best possible rendering quality versus resolution; hey, you want both of them to be as high as possible.

When a graphics card is doing 60 FPS on average or higher, you can rest assured that the game will likely play extremely smoothly at every point, so turn on every possible in-game IQ setting.

Over 100 FPS? You either have a MONSTER graphics card or a very old game.

Monitor Setup
Before playing games, setting up your monitor's contrast & brightness levels is a very important thing to do. I realized recently that a lot of you guys have set up your monitors improperly. How do we know this? Because every now and then we receive emails telling us that a reader can't distinguish between the benchmark charts (colors) in our reviews. If that happens, your monitor is not properly set up.

What Are You Looking For?

Top bar - This simple test pattern is evenly spaced from 0 to 255 brightness levels, with no profile embedded. If your monitor is
correctly set up, you should be able to distinguish each step, and each step should be visually distinct from its neighbours by the same
amount. Also, the dark-end step differences should be about the same as the light-end step differences. Finally, the first step should be
completely black.

The three lower blocks - The far left box is a black box with, in its middle, a little box a tint lighter than black. The middle box is a lined
square with a central grey square. The far right white box contains a smaller "grey" box that should barely be visible.

You should be able to distinguish all of these small differences; only then is your monitor set up properly, contrast and saturation wise.
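A step pattern like the top bar described above is simple to generate yourself. The sketch below is our own illustration (not the exact pattern used here): it writes a 16-step grayscale ramp, evenly spaced from 0 to 255, as a plain binary PGM image that any image viewer can open:

```python
def gray_steps(n=16):
    """n brightness levels evenly spaced from 0 (black) to 255 (white)."""
    return [round(i * 255 / (n - 1)) for i in range(n)]

def write_pgm(path, levels, step_width=64, height=128):
    """Write the step pattern as a binary PGM file; each level becomes one vertical bar."""
    width = step_width * len(levels)
    row = bytes(level for level in levels for _ in range(step_width))
    with open(path, "wb") as f:
        f.write(f"P5 {width} {height} 255\n".encode("ascii"))
        f.write(row * height)

levels = gray_steps()
write_pgm("steps.pgm", levels)
```

On a correctly configured monitor, every bar in the resulting image should be distinguishable from its neighbours, including the two darkest ones.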

Final Words & Conclusion


So you missed all the recent triple-A game titles as well as the DX12 titles, eh? Yep, we did too. SLI support does not seem to be fully enabled just yet
with the current 368.25 WHQL driver, so we'll add those results in a later update. We also did not include any Crossfire results for
comparison, as I want to re-do the complete test on all games on these cards with the newest driver as well. So please look at this article as
merely a set of preliminary SLI results.

For the titles in this preliminary article that did work, scaling is pretty okay; but you need to be at 2560x1440 at least, and preferably at Ultra HD,
as only there does scaling make sense. It's that old devil, CPU limitation, again. Something that would have been nice to test with DX12 titles, if
only they had worked. Ah well, we'll check it out later once a proper driver is available.

Our recommendation: with a single monitor at up to, say, 1920x1080 or 2560x1440, you'd be more than okay with just one card; two if you
want that extra boom-boom-pow (but the verdict is still out on that). Now, if you have a nice Ultra HD monitor with a 3840x2160 resolution,
that's where a second card could make a lot of sense. But A) I am seriously inclined to recommend the GTX 1070 price/performance wise, and B) I am
still inclined to steer away from multi-GPU SLI setups. I think you need to spend too much money for what you receive in return, scaling wise.

Processor power, then. We use an X99 / Core i7 5960X Extreme processor clocked at 4400 MHz. With multi-GPU gaming, these faster-clocked
six-core puppies do show an increase in performance. You do need to wonder, though, whether the 10~15% performance increase at lower resolutions
really justifies the money; but obviously, if you can afford two cards in SLI, you'll probably go for the best and fastest infrastructure as well.
That would be X99 with a nice 6-, 8-, or more recent 10-core processor.
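Scaling can be quantified with a simple ratio. The helper below (our own illustration, not a tool used in this review) expresses two-card performance as a percentage of the perfect doubling you would get with ideal scaling; the FPS figures in the example are hypothetical:

```python
def sli_scaling_efficiency(fps_one_card, fps_multi_gpu, num_gpus=2):
    """Percentage of perfect scaling achieved by a multi-GPU setup.

    100% would mean the second card doubles the frame rate; CPU limitation
    at lower resolutions pushes this figure well down from that ideal.
    """
    return 100.0 * fps_multi_gpu / (fps_one_card * num_gpus)

# Hypothetical figures: a title running 60 FPS on one card
# and 102 FPS with two cards in SLI at Ultra HD.
print(sli_scaling_efficiency(60, 102))  # 85.0
```

At 1920x1080 the same exercise typically yields a far lower figure, which is exactly the CPU-limitation effect discussed above.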

Noise & Heat


Depending on your configuration, the temperature target protection of the reference GeForce GTX 1080 Founders Edition cards hovers at 80
to 85 degrees C at maximum. As explained, in a poorly ventilated chassis this can have an adverse effect: since temperature is the top
priority, the cards can clock down (throttle) a bit once they want to pass that 83 degrees C target, a feature that protects your 1080s. Mind you,
all board partners offer 3rd-party coolers, and with all the AIB brands there will be far better options in cooling performance for
most of them! If you stick to Founders Edition cards, noise wise it's really OK, though with up to two cards you can definitely hear airflow under full GPU
stress. And again, with board partner cards like the ones from Gigabyte, MSI, Palit, Galax and ASUS, you will be surprised how silent
such an SLI setup can be.
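The throttling behaviour described above can be illustrated with a simplified model. This is a rough sketch of GPU Boost-style behaviour, not NVIDIA's actual algorithm; the 83 °C target and the 1607/1733 MHz base/boost clocks are the reference GTX 1080 figures, while the linear ramp and the 94 °C hard limit are assumptions for illustration:

```python
def boost_clock(temp_c, base_mhz=1607, boost_mhz=1733,
                temp_target=83, temp_limit=94):
    """Simplified temperature-driven clock model for a reference card.

    Below the temperature target the card runs its boost clock; between
    the target and the hard limit the clock scales down linearly toward
    the base clock; at or past the limit it drops to the base clock.
    """
    if temp_c < temp_target:
        return boost_mhz
    if temp_c >= temp_limit:
        return base_mhz
    # Linear interpolation between boost and base clock.
    span = (temp_limit - temp_c) / (temp_limit - temp_target)
    return round(base_mhz + (boost_mhz - base_mhz) * span)

for temp in (70, 83, 88, 94):
    print(temp, boost_clock(temp))
```

The practical takeaway matches the text: keep the cards under their target with decent chassis airflow and the clocks stay up.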

In closing
As stated many times, this is a preliminary article. I wanted to push some SLI results out; however, the driver does not seem to be ready just
yet. Hence I also passed on going deeper with Crossfire/SLI combos etc. For now this article is merely a reference point with some initial
results on the game titles that are supported. I do know that with 1080 SLI you'll need a spicy PC with a fast-clocked processor, as at
the somewhat lower resolutions we ran into CPU limitation pretty much everywhere. Really, 1080 SLI only makes sense STARTING at
2560x1440, but preferably Ultra HD, as that's where the benefits start to show.

We'll update this article once a better driver is out offering proper SLI support, and will then look at Hitman (2016), Tomb Raider
(2016), GTA5, Anno 2205, AOS, Doom, Witcher III, The Division and Total War: WARHAMMER. But for now, as limited as the review is,
you'll at least have an idea where 2-way SLI scaling is headed.
-H
Other Related reviews

MSI GeForce GTX 1080 GAMING X 8G review

GeForce GTX 1080 Founders reference review

GeForce GTX 1080 FCAT Frametime Analysis

GeForce GTX 1080 Overclock guide

GeForce GTX 1070 Founders reference review

GeForce GTX 1070 FCAT Frametime Analysis

Recommended Downloads

Unigine Heaven Stress test

MSI AfterBurner

3DMark 11

3DMark FireStrike (2013)

Download Latest Nvidia GeForce Drivers

Mainboard
MSI X99A GODLIKE Gaming - Review
Processor
Core i7 5960X (Haswell-E) @ 4.4 GHz on all eight cores - Review
Graphics Cards
Nvidia Titan (Pascal 2016) 12 GB GDDR5X graphics memory
Memory
16 GB (4x 4096 MB) 2,133 MHz DDR4
Power Supply Unit
1,200 Watts Platinum Certified Corsair AX1200i - Review
Monitor
Dell 3007WFP QHD up to 2560x1600
ASUS PQ321 native 4K UHD Monitor at 3840 x 2160 - Review

OS related software
Windows 10 64-bit
DirectX 9/10/11/12 End User Runtime (Download)
AMD Radeon Software Crimson Driver 16.7.2 / 16.7.3 (Download)
NVIDIA GeForce Driver 369.05 (Download)
Software benchmark suite

(OpenGL 4.5) Doom

(Vulkan) Doom

(DX12) Hitman (2016)

(DX12) Rise of the Tomb Raider (2016)

(DX12) Ashes of Singularity

(DX12) Total War: Warhammer

(DX11) The Division

(DX11) Far Cry Primal

(DX11) Anno 2205 (2016)

(DX11) Battlefield Hardline

(DX11) Grand Theft Auto V

(DX11) The Witcher III

(DX11) 3DMark 11

(DX11) 3DMark 2013 FireStrike

(DX11) Alien: Isolation

(DX11) LOTR Middle Earth: Shadow of Mordor
