Guru3D
Both Nvidia's SLI and AMD's Crossfire allow you to add a second, third or sometimes even a fourth graphics card of the same generation to the one you already have in your PC. This way you can, in theory, double, triple or even quadruple your raw rendering performance in games. Starting with the Pascal GPUs (GeForce GTX 1070 and 1080), however, Nvidia primarily supports 2-way SLI and is phasing out 3-way and 4-way SLI. We don't really mind; the reality is that the more GPUs are present, the worse the scaling becomes and the more driver issues you will run into. Honestly, two GPUs is ideal in most multi-GPU gaming scenarios, always remember that.
You could, for example, place two or more AMD graphics cards into a Crossfire compatible motherboard, or two GeForce graphics cards in SLI mode on a compatible motherboard. In today's article we'll use a 2-way SLI GeForce GTX 1080 configuration.
A Crossfire compatible motherboard is pretty much ANY motherboard with multiple PCIe x16 slots that is not an nForce motherboard.
An SLI certified motherboard is an nForce motherboard with more than two PCIe x16 slots, or a certified P55, P67, Z68, X58, Z77, Z87, X79, Z97, X99 or Z170 motherboard. Please check with the motherboard manufacturer whether or not it is SLI compatible, but most of the latest generation AMD and Intel based motherboards are. A small note: if you are on an AMD processor, the 900 series chipset supports SLI as well.
Once we seat the matching graphics cards on the carefully selected motherboard, we bridge them together with a supplied Crossfire connector or, in Nvidia's case, an SLI connector. Then we install/update the drivers, after which most games can take advantage of the extra horsepower we just added to the system.
Once your hardware is set up it's time to install the latest drivers. In the Nvidia control panel, make sure that 'Maximize 3D Performance' is activated. For an SLI + multi-monitor setup you need to click 'Span Displays with Surround'. Multi-GPU rendering as an idea is not new at all.
There are multiple ways to let two cards render one frame: think of Super Tiling, a popular form of rendering; Alternate Frame Rendering, where each card renders a frame (even/odd); or Split Frame Rendering, where one GPU simply renders the upper part of the frame and the other the lower part. So you see, there are several methods by which two or more GPUs can be utilized to bring you a substantial gain in performance.
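To make those modes a bit more tangible, here is a tiny Python sketch. It is purely illustrative: the function names and frame numbers are ours, no real driver API is involved. It shows how AFR hands out whole frames to alternating GPUs while SFR splits every single frame across them:

```python
# Sketch: how Alternate Frame Rendering (AFR) and Split Frame
# Rendering (SFR) divide work between two GPUs. Illustrative only --
# real drivers do this internally; no actual GPU API is used here.

def afr_assignment(frame_ids, gpu_count=2):
    """AFR: each GPU renders complete frames in round-robin order."""
    return {f: f % gpu_count for f in frame_ids}

def sfr_assignment(frame_height, gpu_count=2):
    """SFR: every frame is cut into horizontal slices, one per GPU."""
    slice_h = frame_height // gpu_count
    return [(g * slice_h, (g + 1) * slice_h) for g in range(gpu_count)]

if __name__ == "__main__":
    # Frames 0 and 2 land on GPU 0, frames 1 and 3 on GPU 1.
    print(afr_assignment(range(4)))
    # For a 1080-row frame: GPU 0 renders the top half, GPU 1 the bottom.
    print(sfr_assignment(1080))
```

AFR tends to scale better because each GPU works on an independent frame, which is exactly why frame-to-frame dependencies in modern engines hurt multi-GPU scaling.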
The Computer Components Used
To understand what we are doing today, we need to briefly take you through some of the key components used for our PC. Today we have a home built DIY (Do It Yourself) X99 based Core i7 system that consists of the following gear:
Benchmark Setup:
These are some pretty nifty parts, and bear in mind: when you opt for multi-GPU gaming, always get your gear right. You'll need a quality power supply, a proper SLI supporting motherboard, a fast processor, and a chassis with very decent airflow to keep the graphics cards nicely chilled.
If you decide to go for high-end multi-GPU gaming, our current recommendation is a Core i7 processor on a Z87/Z97/X79/X99/Z170 motherboard, as these have plenty of PCIe Gen 2.0 and 3.0 lanes and thus the cross-link bandwidth really is optimal. For installation, make sure you do not forget to use a proper SLI bridge; we'll talk a bit more about that on the next page.
An Update To SLI
With Pascal there is a change to SLI. One critical ingredient of NVIDIA's SLI technology is the SLI bridge, a digital interface that transfers display data between GeForce graphics cards in a system. Two of these interfaces have historically been used to enable communication between three or more GPUs (i.e., 3-way and 4-way SLI configurations). The second SLI interface is required for these scenarios because all other GPUs need to transfer their rendered frames to the display connected to the master GPU, and up to this point each interface has been independent.
Beginning with NVIDIA Pascal GPUs, the two interfaces are now linked together to improve bandwidth between GPUs. This new dual-link SLI mode allows both SLI interfaces to be used in tandem to feed one high-resolution display, or multiple displays for NVIDIA Surround. Dual-link SLI mode is supported with a new SLI bridge called SLI HB.
The bridge facilitates high-speed data transfer between GPUs, connecting both SLI interfaces, and is the best way to achieve full SLI clock speeds with GeForce GTX 1080 GPUs running in SLI. The GeForce GTX 1080 is also compatible with legacy SLI bridges; however, the GPU will be limited to the maximum speed of the bridge being used.
Using this new SLI HB bridge, the GeForce GTX 1080's new SLI interface runs at 650 MHz, compared to 400 MHz in previous GeForce GPUs using legacy SLI bridges. Where possible, though, older SLI bridges will also get a speed boost when used with Pascal. Specifically, custom bridges that include LED lighting will now operate at up to 650 MHz when used with the GTX 1080, taking advantage of Pascal's higher speed IO.
Compared to prior DirectX APIs, Microsoft has made a number of changes that impact multi-GPU functionality in DirectX 12. At the highest level, there are two basic options for developers to use multi-GPU on NVIDIA hardware in DX12: Multi Display Adapter (MDA) Mode and Linked Display Adapter (LDA) Mode. LDA Mode has two forms: Implicit LDA Mode, which NVIDIA uses for SLI, and Explicit LDA Mode, where game developers handle much of the responsibility needed for multi-GPU operation to work successfully. MDA and Explicit LDA Mode were developed to give game developers more control. In short, the three modes supported on NVIDIA GPUs are:
Implicit LDA - used by NVIDIA for SLI; the driver manages the multi-GPU work
Explicit LDA - the game developer manages the multi-GPU work; the GPUs remain linked
MDA - the game developer manages everything; fewest restrictions on GPU pairing
In LDA Mode, each GPU's memory can be linked together to appear as one large pool of memory to the developer (although there are certain corner-case exceptions regarding peer-to-peer memory); however, there is a performance penalty if the needed data resides in the other GPU's memory, since that memory is accessed through inter-GPU peer-to-peer communication (like PCIe). In MDA Mode, each GPU's memory is allocated independently of the other's: one GPU cannot directly access the other's memory. LDA is intended for multi-GPU systems with GPUs that are similar to each other, while MDA Mode has fewer restrictions (discrete GPUs can be paired with integrated GPUs, or with discrete GPUs from another manufacturer), but MDA Mode requires the developer to more carefully manage all of the operations needed for the GPUs to communicate with each other. By default, GeForce GTX 1070/1080 SLI supports up to two GPUs.
3-Way and 4-Way SLI modes are no longer recommended. As games have evolved, it is becoming increasingly difficult for these SLI modes to provide beneficial performance scaling for end users. For instance, many games become bottlenecked by the CPU when running 3-Way and 4-Way SLI, and games increasingly use techniques that make it very difficult to extract frame-to-frame parallelism. Of course, systems will still be built targeting other multi-GPU software models. NVIDIA does recognize, however, that some enthusiasts will still want to run more than two GPUs. For this class of user they have developed an Enthusiast Key that can be downloaded off of NVIDIA's website and loaded into an individual's GPU. This process involves:
1. Run an app locally to generate a signature for your GPU
2. Request an Enthusiast Key from the upcoming NVIDIA Enthusiast Key website
3. Download your key
4. Install your key to unlock 3-way and 4-way SLI
Full details on the process are available on the NVIDIA Enthusiast Key website, which will go live by the time GeForce GTX 1080/1070 GPUs are in users' hands.
Hardware Installation
Installation of any of the Nvidia GeForce cards is really easy. Once the card is seated in the PC, make sure you hook up the monitor and, of course, any external power connectors like 6- and/or 8-pin PEG power connectors. Preferably get yourself a power supply that has these PCIe PEG connectors natively (in this day and age, converting them from a Molex peripheral connector is a no-go in our book).
Once done, we boot into Windows, install the latest drivers and after a reboot all should be working.
Load up the Nvidia control panel and check whether SLI mode was successfully enabled; 'Maximize 3D performance' is the setting you are after. Once applied properly you will see the green indicator line showing 2-way SLI.
Power Consumption
Let's have a look at how much power draw we measure with this graphics card installed. The methodology: We have a device constantly
monitoring the power draw from the PC. We stress the GPU to the max, and the processor as little as possible. The before and after wattage
will tell us roughly how much power a graphics card is consuming under load. Our test system is based on an eight-core Intel Core i7-5960X
Extreme Edition setup on the X99 chipset platform. This setup is overclocked to 4.40 GHz on all cores. Next to that we have energy saving
functions disabled for this motherboard and processor (to ensure consistent benchmark results). We'll be calculating the GPU power
consumption here, not the total PC power consumption.
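The before/after arithmetic described above can be sketched in a few lines of Python. The wattage figures and the assumed PSU efficiency below are made-up example values, not our measured results:

```python
# Sketch of the wall-socket measurement arithmetic described above.
# All numbers here are hypothetical examples, not measured data.

def gpu_power_estimate(total_load_w, system_idle_w, psu_efficiency=0.90):
    """Estimate card power from the wall-socket draw delta.

    total_load_w   - wall draw with the GPU stressed 100%
    system_idle_w  - wall draw of the same system with the GPU near idle
    psu_efficiency - assumed PSU efficiency at that load (an assumption;
                     real efficiency varies with load and PSU model)
    """
    delta = total_load_w - system_idle_w   # extra wall draw caused by the GPU
    return delta * psu_efficiency          # DC power the card actually consumes

# Example: 520 W under load vs 155 W near idle at 90% PSU efficiency.
print(gpu_power_estimate(520, 155))
```

This is exactly why the measured value is an estimate: the PSU efficiency factor sits between the wall socket and the card, so we can only approximate the DC-side draw.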
Measured Power Consumption
Mind you, the system wattage is measured at the wall socket side and there are other variables like PSU power efficiency. So this is an
estimated value, albeit a very good one. Below, a chart of relative power consumption. Again, the Wattage shown is the card with the GPU(s)
stressed 100%, showing only the peak GPU power draw, not the power consumption of the entire PC and not the average gaming power
consumption.
Power consumption
TDP in KWh                        : 0,364
KWh price                         : 0,23
Cost 2 hrs per day                : 0,17
Cost 4 hrs per day                : 0,33
Cost per week (4 hrs/day, 5 days) : 1,67
Cost per month (4 hrs/day)        : 7,26
Cost per year (4 hrs/day)         : 87,07
Cost per year in dollars          : $ 114,93
GeForce GTX 1080 - for the card in an average system we recommend a 550 Watt power supply unit.
GeForce GTX 1080 SLI - for the two cards in an average system we recommend a 750 Watt power supply unit.
If you are going to overclock your GPU or processor, then we recommend you purchase something with some more stamina. A PSU is at its most energy efficient at around half load (50% usage), hence we recommend that bit extra. There are many good PSUs out there; please have a look at our many PSU reviews, as we have loads of recommended PSUs for you to check out. What could happen if your PSU can't cope with the load:
Bad 3D performance
Crashing games
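That 50%-load sweet spot is the reasoning behind our wattage advice; as a rough sketch (the system load figures below are hypothetical example numbers of ours, not measurements):

```python
import math

def recommend_psu(system_load_w, target_load_fraction=0.5, round_to=50):
    """Rule-of-thumb PSU sizing: aim for the efficiency sweet spot.

    Size the PSU so the expected peak system draw lands near
    target_load_fraction of its rating, rounded up to a common
    retail wattage class. This is our rough heuristic, not an
    official sizing guideline.
    """
    raw = system_load_w / target_load_fraction
    return math.ceil(raw / round_to) * round_to

# Hypothetical example loads for a single-card and an SLI system:
print(recommend_psu(275))  # -> 550 W class
print(recommend_psu(370))  # -> 750 W class
```

Overclocking pushes the peak draw up, which is why the same rule then points you at a beefier unit.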
Let's move to the next page where we'll look into GPU heat levels and noise levels coming from this graphics card.
With the cards fully stressed we kept monitoring temperatures and noted down the GPU temperature as reported by the thermal sensor.
The cards' temperature under heavy game stress stabilized at 83~84 degrees C. We note down the hottest GPU reading, not the average. The two cards are set up in a worst-case scenario, right next to each other.
These tests were performed at a 20~21 degrees C room temperature; this is a peak temperature based on a FireStrike loop. For an SLI installation we recommend leaving as much space between the cards as possible. We deliberately place them close together in the review, though, to emulate a worst-case scenario.
The human hearing system has different sensitivities at different frequencies. This means that the perception of noise is not at all equal at
every frequency. Noise with significant measured levels (in dB) at high or low frequencies will not be as annoying as it would be when its
energy is concentrated in the middle frequencies. In other words, the measured noise levels in dB will not reflect the actual human perception
of the loudness of the noise. That's why we measure the dBA level. A specific circuit is added to the sound level meter to correct its reading in
regard to this concept. This reading is the noise level in dBA. The letter A is added to indicate the correction that was made in the
measurement. Frequencies below 1 kHz and above 6 kHz are attenuated, whereas frequencies between 1 kHz and 6 kHz are amplified by the
A weighting.
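The A-weighting correction described above is standardized (IEC 61672). As a small sketch, here is the weighting curve in Python, normalized so that 1 kHz sits at roughly 0 dB:

```python
import math

def a_weight_db(f):
    """A-weighting correction in dB at frequency f (Hz), per IEC 61672."""
    f2 = f * f
    ra = (12194.0**2 * f2 * f2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    # The +2.0 dB offset normalizes the curve to ~0 dB at 1 kHz.
    return 20.0 * math.log10(ra) + 2.0

# Low and very high frequencies are attenuated; the midrange is not.
for f in (100, 1000, 2500, 10000):
    print(f, round(a_weight_db(f), 1))
```

Running this shows why a fan with most of its energy around 100 Hz measures much lower in dBA than in plain dB, while midrange whine is penalized hardly at all.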
110-120 dBA : Intolerable
90-100 dBA  : Very noisy
70-80 dBA   : Noisy
50-60 dBA   : Moderate
30-40 dBA   : Quiet
10-20 dBA   : Barely audible
There are a lot of differences in measurements among websites. Some even place the dBA meter 10 cm away from the card. Considering
that's not where your ear is located, we do it our way, at 75 cm distance.
For each dBA test we close the PC chassis and move the dBA meter 75 cm away from the PC, roughly the same distance you'll sit from a PC in a real-world situation. Above are the IDLE (desktop mode) results, where the GPU hardly has to do anything; the system idle results are really good.
Once the cards are in a fully stressed state (in-game) they reach only 42~43 dBA. The cards under stress are audible.
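Since dBA values are logarithmic, two equally loud cards do not double the figure; incoherent noise sources sum energetically, which adds about 3 dB. A quick sketch (the 42 dBA input simply mirrors the figure above; treating the cards as identical, independent sources is our simplification):

```python
import math

def combine_dba(levels):
    """Combine incoherent noise sources given in dBA (energetic sum)."""
    total = sum(10 ** (lvl / 10.0) for lvl in levels)
    return 10.0 * math.log10(total)

# Two identical 42 dBA cards combine to roughly 45 dBA, not 84.
print(round(combine_dba([42.0, 42.0]), 1))
```

This is why adding a second card in SLI makes the system noticeably, but not dramatically, louder.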
(DX11) 3DMark 11
(DX11) Thief
Gameplay:
<30 FPS   - not smoothly playable
30-40 FPS - playable with the odd stutter
40-60 FPS - good gameplay
>60 FPS   - excellent, smooth gameplay
So if a graphics card barely manages less than 30 FPS, the game is not very playable; we want to avoid that at all cost.
From 30 FPS up to roughly 40 FPS you'll be quite able to play the game, with perhaps a tiny stutter in certain graphically intensive parts. Overall a very enjoyable experience. Match the best possible resolution to this result and you'll have the best possible rendering quality versus resolution; you want both of them to be as high as possible.
When a graphics card is doing 60 FPS on average or higher, you can rest assured that the game will likely play extremely smoothly at every point, with every possible in-game IQ setting turned on.
Over 100 FPS? You either have a MONSTER graphics card or a very old game.
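Those brackets boil down to a simple classification; a minimal sketch (the category labels are our own shorthand for the descriptions above):

```python
def gameplay_rating(avg_fps):
    """Map an average framerate onto the rough categories used above."""
    if avg_fps < 30:
        return "not really playable"
    if avg_fps < 40:
        return "playable, occasional stutter"
    if avg_fps < 60:
        return "good gameplay"
    return "smooth at every point"

print(gameplay_rating(55))  # -> good gameplay
```

Averages hide stutter, of course, which is why we also watch minimum framerates and frame pacing in multi-GPU setups.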
Monitor Setup
Before playing games, setting up your monitor's contrast and brightness levels is a very important thing to do. I realized recently that a lot of you have set up your monitor improperly. How do we know this? Because we receive a couple of emails every now and then telling us that a reader can't distinguish between the (colors of the) benchmark charts in our reviews. If that happens, your monitor is not properly set up.
Top bar - This simple test pattern is evenly spaced from 0 to 255 brightness levels, with no profile embedded. If your monitor is
correctly set up, you should be able to distinguish each step, and each step should be visually distinct from its neighbours by the same
amount. Also, the dark-end step differences should be about the same as the light-end step differences. Finally, the first step should be
completely black.
The three lower blocks - The far left box is a black box with, in its middle, a little box a tint lighter than black. The middle box is a lined square with a central grey square. The far right white box has a smaller "grey" box that should barely be visible.
You should be able to distinguish all these small differences; only then is your monitor set up properly in terms of contrast and saturation.
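For the curious, a pattern like the top bar can be generated in a few lines of Python. This sketch writes a 16-step grayscale bar as a binary PGM file; the filename, step count and dimensions are arbitrary choices of ours, not the exact pattern used in the article:

```python
# Sketch: generate a simple grayscale step bar -- evenly spaced levels
# from 0 (black) to 255 (white) -- as a binary PGM file, which most
# image viewers can open. Parameters here are illustrative defaults.

def write_step_bar(path, steps=16, step_w=32, height=64):
    width = steps * step_w
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (width, height))  # PGM header
        row = bytearray()
        for x in range(width):
            # Each step is evenly spaced: step i gets level i*255//(steps-1).
            level = (x // step_w) * 255 // (steps - 1)
            row.append(level)
        f.write(bytes(row) * height)  # repeat the row for every scanline

write_step_bar("step_bar.pgm")
```

On a correctly adjusted monitor every one of the 16 steps should be visibly distinct, the first completely black and the last fully white.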
In closing
As stated many times, this is a preliminary article. I wanted to push some SLI results out; however, the driver does not seem to be ready just yet. Hence I also passed on going deeper into Crossfire/SLI combos etc. For now this article is merely a reference point with some initial results on game titles that are supported. I do know that with GTX 1080 SLI you'll need a spicy PC with a fast-clocked processor, as at the somewhat lower resolutions we ran into CPU limitations pretty much everywhere. Really, GTX 1080 SLI only makes sense STARTING at 2560x1440, but preferably Ultra HD, as that's where the benefits start to show.
We'll update this article once a better driver is out offering proper SLI support, and will then look at and update with Hitman (2016), Tomb Raider (2016), GTA V, Anno 2205, AOS, Doom, The Witcher III, The Division and Total War: WARHAMMER. But for now, as limited as the review is, you'll at least have an idea of where 2-way SLI scaling is headed.
-H
Other Related reviews
Recommended Downloads
MSI AfterBurner
3DMark 11
Mainboard
MSI X99A GODLIKE Gaming - Review
Processor
Core i7 5960X (Haswell-E) @ 4.4 GHz on all eight cores - Review
Graphics Cards
Nvidia Titan (Pascal 2016) 12 GB GDDR5X graphics memory
Memory
16 GB (4x 4096 MB) 2,133 MHz DDR4
Power Supply Unit
1,200 Watts Platinum Certified Corsair AX1200i - Review
Monitor
Dell 3007WFP QHD up to 2560x1600
ASUS PQ321 native 4K UHD Monitor at 3840 x 2160 - Review
OS related software
Windows 10 64-bit
DirectX 9/10/11/12 End User Runtime (Download)
AMD Radeon Software Crimson Driver 16.7.2 / 16.7.3 (Download)
NVIDIA GeForce Driver 369.05 (Download)
Software benchmark suite
(Vulkan) Doom
(DX11) 3DMark 11