Gaming Performance on GTX 770s

F1 2013

First up is F1 2013 by Codemasters. I am a big Formula 1 fan in my spare time, and nothing makes me happier than carving up the field in a Caterham, waving to the Red Bulls as I drive by (because I play on easy and take shortcuts). F1 2013 uses the EGO Engine and, like other Codemasters titles, is very playable on older hardware. In order to beef up the benchmark a bit, we devised the following scenario for the benchmark mode: one lap of Spa-Francorchamps in the heavy wet, following Jenson Button in the McLaren, who starts on the grid in 22nd place, with the rest of the field made up of 11 Williams cars, 5 Marussias and 5 Caterhams in that order. This puts the emphasis on the CPU to handle the AI in the wet, and allows for a good amount of overtaking during the automated benchmark. We test at 1920x1080 on Ultra graphical settings.

F1 2013 SLI, Average FPS


Bioshock Infinite

Bioshock Infinite was Zero Punctuation’s Game of the Year for 2013, uses the Unreal Engine 3, and is designed to scale with both cores and graphical prowess. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Bioshock Infinite SLI, Average FPS
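
The average and minimum figures noted in each test are simple reductions of a per-frame time log. As a rough illustration of the arithmetic (not the Adrenaline tool's own code), here is a minimal Python sketch that assumes a hypothetical frametimes.csv containing one frame time in milliseconds per line:

    # Minimal sketch: derive average and minimum FPS from a frame-time log.
    # "frametimes.csv" is a hypothetical file with one frame time (ms) per line;
    # the Adrenaline benchmark tool reports these figures itself, so this only
    # illustrates the arithmetic behind them.

    def fps_summary(path="frametimes.csv"):
        with open(path) as f:
            frame_times_ms = [float(line) for line in f if line.strip()]
        total_seconds = sum(frame_times_ms) / 1000.0
        average_fps = len(frame_times_ms) / total_seconds  # frames rendered per second overall
        minimum_fps = 1000.0 / max(frame_times_ms)         # slowest single frame, expressed as FPS
        return average_fps, minimum_fps

    avg_fps, min_fps = fps_summary()
    print("Average FPS: %.1f, Minimum FPS: %.1f" % (avg_fps, min_fps))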


Tomb Raider

The next benchmark in our test is Tomb Raider. Tomb Raider is an AMD-optimized game, lauded for its use of TressFX to create dynamic hair and increase in-game immersion. It uses a modified version of the Crystal Engine and thrives on raw horsepower. We test the benchmark using the Adrenaline benchmark tool and the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Tomb Raider SLI, Average FPS


Sleeping Dogs

Sleeping Dogs is a benchmarking wet dream – a highly complex benchmark that can bring even the toughest setup down to single-digit frame rates at high resolutions. Its extreme SSAO setting can do that, but at the right settings Sleeping Dogs is highly playable and enjoyable. We run the basic benchmark program laid out in the Adrenaline benchmark tool at the Xtreme (1920x1080, Maximum) performance setting, noting down the average frame rates and the minimum frame rates.

Sleeping Dogs SLI, Average FPS


Comments

  • wyewye - Friday, March 13, 2015 - link

    "this system is just a confused jumble of parts slapped together"
    This is the best conclusion for this mobo.

    I think they hope the marketing/sales guys will be able to bamboozle dummies into buying this as an 18-port RAID server mobo. Anyone who spends $600 on a high-end mobo without reading a review deserves what's coming to them.
  • swaaye - Wednesday, March 11, 2015 - link

    That chipset fan is cheesy. They most definitely could have come up with a better cooling solution.
  • Hairs_ - Wednesday, March 11, 2015 - link

    As pointed out above, this board doesn't answer a single question any user is asking, and it doesn't fulfill any logical use case.

    I'm struggling to see why it was reviewed, other than the possible reason "the reviewer only wants to review weird expensive stuff". Getting in a board whose only supposed reason to exist is the number of storage ports, then not testing the storage, and saying "I'm sure it's the same as the one I reviewed a few years ago" is... troubling. What was the point of getting it in for review at all?
  • ClockHound - Wednesday, March 11, 2015 - link

    What's the point?

    It's the new Anandtech, where the point is clicks! Catchy headlines with dubious content, it's how Purch is improving a once great review site. Thanks, Purch!
  • ap90033 - Friday, March 13, 2015 - link

    I think you may be on to something! Sad to see..
  • Stylex - Wednesday, March 11, 2015 - link

    I don't understand how motherboards still have usb2 ports. Did it seriously take this long to transition from usb1.1 to usb2?
  • DanNeely - Wednesday, March 11, 2015 - link

    At the time of the USB1.1-to-2.0 transition, Intel chipsets had at most 6 ports, and since the new standard didn't need any more IO pins they could cut everything over at once. Not needing more IO pins is important because pin count has been the limiting factor for chipset cost for a number of years, with the die size being determined by the number of output pins added.

    The bottleneck for USB3 has been the chipsets. Pre-IVB they had no USB3 at all. IVB added support for 4 ports, Haswell bumped it to 6, and the 9x series chipsets that were supposed to launch with Broadwell were essentially unchanged from the previous model. As a result, mobo makers who wanted to add more USB3 have had to spend extra money on 3rd party chips to do so. Initially that meant discrete USB3 controllers, which generally ate a PCIe lane for every pair of ports added. More recent designs use 4-port USB hub chips, which give better bang for the buck but still drive prices up. (There's a rough sketch of this port arithmetic after the comments.)

    When Skylake launches later this year, the situation should improve; its higher end versions will offer up to 10 ports. That might be enough to make all-USB3 port configurations possible in the mid range without either using a very small total number of ports or driving the board price up with extra controller chips. High end boards will probably still have some ports attached to a controller though, because Intel is expanding its use of flexible IO ports and native USB3 will be competing with PCIe storage for IO pins. In both cases, I suspect a number of boards will also expose the 4 remaining 2.0 ports; probably 1 internal header and 2 external ports. That won't just be a case of 'gotta use them all': older OSes (e.g. Win7) without native USB3 support are easier to install if you've got a few 2.0 ports available, and there will be residual demand for 2.0 headers from people with older cases or internal card readers.

    The situation is similar with AMD chipsets, but since they can only compete in the value segment of the market they're behind Intel, topping out at 4 native USB3 ports.
  • Stylex - Friday, March 13, 2015 - link

    Ah, the pin count makes a lot of sense, thanks for that insight!

    But still, how much more could it possibly drive up the price to use a third party controller, $5-10? I'd pay that for all usb3, especially on a board like this one.
  • DanNeely - Sunday, March 15, 2015 - link

    Estimates I've seen over the last few years put a 2-port USB3 PCIe controller at about $10 added to the retail price of a board, and a 4-port USB3 hub chip at about $5. The caveat is that hubs only add ports, not total bandwidth, which is fine if you just want to be able to plug a USB3 device into any port and use few enough of them that sticking multiple high-speed devices on the same hub isn't a problem. Controllers don't have that problem, but do need PCIe lanes, and those tend to be in short supply on Intel's 8x/9x chipsets. Using a 4-8 lane PLX switch on the chipset relieves that pressure somewhat but adds another $10 or $20 to the board price. The situation will be better for Skylake due to the 100 series chipsets having 28 high speed IO lanes instead of only 18, but that's partially counterbalanced by M2/SATA Express connections needing several lanes each. (The second sketch after the comments runs these numbers.)

    The lack of native 3.1 support means that the next generation of mobos will probably go the controller route, not hubs, to bump up the port count. With Intel rarely doing any major updates on the Tock versions of the chipset, it will probably be at least 2017 before external USB3 controllers mostly go away.
  • darkfalz - Thursday, March 12, 2015 - link

    You have a keyboard, mouse or gamepad that requires 100 MB/sec bandwidth, do you?
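
As a rough companion to the USB3 discussion in the comments above, here is a small Python sketch of the native port progression the commenters quote (4 ports on Ivy Bridge chipsets, 6 on the 8x/9x series, up to 10 expected on Skylake's 100 series). These figures are the thread's numbers, not datasheet values:

    # Native USB3 port counts as quoted in the comments above (assumptions,
    # not taken from Intel datasheets), and how many of a board's USB3 ports
    # would have to come from third-party controllers or hub chips if every
    # port on the board is to be USB3.

    NATIVE_USB3 = {
        "7-series (IVB)": 4,
        "8x/9x-series (Haswell)": 6,
        "100-series (Skylake)": 10,
    }

    def third_party_ports_needed(total_usb3_ports_wanted):
        return {platform: max(0, total_usb3_ports_wanted - native)
                for platform, native in NATIVE_USB3.items()}

    # A board exposing 12 USB3 ports in total:
    print(third_party_ports_needed(12))
    # -> {'7-series (IVB)': 8, '8x/9x-series (Haswell)': 6, '100-series (Skylake)': 2}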

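A second sketch runs the cost and lane estimates from the later comments (roughly $10 retail per 2-port controller, $5 per 4-port hub, 18 flexible IO lanes on the 8x/9x chipsets versus 28 on the 100 series). Again, these are ballpark figures from the thread, not bill-of-materials data:

    # Ballpark cost/lane arithmetic using the figures quoted in the comments
    # (assumptions, not BOM data): a 2-port USB3 controller adds ~$10 retail
    # and needs 1 PCIe lane; a 4-port hub chip adds ~$5 and no lanes, but its
    # four ports share one upstream link's bandwidth.

    def controller_route(extra_ports):
        chips = -(-extra_ports // 2)        # ceiling division: 2 ports per controller
        return {"chips": chips, "retail_cost": 10 * chips, "pcie_lanes_used": chips}

    def hub_route(extra_ports):
        chips = -(-extra_ports // 4)        # 4 shared-bandwidth ports per hub
        return {"chips": chips, "retail_cost": 5 * chips, "pcie_lanes_used": 0}

    for route in (controller_route, hub_route):
        result = route(8)                   # e.g. 8 extra USB3 ports on top of 6 native ones
        result["lanes_left_9x"] = 18 - result["pcie_lanes_used"]    # 8x/9x: 18 flexible lanes
        result["lanes_left_100"] = 28 - result["pcie_lanes_used"]   # 100 series: 28 lanes
        print(route.__name__, result)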