Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a bit of a rocky start with bugs and other issues needing to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title, and it's also one of the notable games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's an interesting corollary to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with the Metacritic average currently sitting at 70. That's not particularly good for a series that has otherwise reviewed well – the previous game, Black Flag, averages 84. Perhaps more telling is that the current Metacritic user score is an abysmal 2.1. The comments and reviews make it abundantly clear that ACU tends to run like a slug on a lot of systems.

I think part of the problem is the mistaken idea, common among gamers, that they should be able to max out the settings in most games. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU the requirements have shot up, especially at the higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium should keep most users content with the way the game looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked – but that seems to happen every year.

There's another element to the Assassin's Creed: Unity launch worth pointing out: this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the door to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles now sporting 8GB RAM, we've seen a large jump in resource requirements for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB VRAM may need to opt for lower quality settings; with ACU (at least in the current state of patch 1.2), you can drop the "may" from that statement and just go in knowing full well that GPUs with 2GB VRAM are going to struggle at times.

Test System and Benchmarks
122 Comments

  • silverblue - Saturday, November 22, 2014 - link

    Understood - definite incompetence, and on a grand scale too, considering somebody with multiple cards has put x times the money into the vendor as somebody who purchased just the one. I would find it hard to believe that they were unaware from their own internal testing. There's the possibility that whoever presides over this was given their marching orders and AMD set about fixing the damage, but I guess we'll never know.

    I apologise for the pedantry as well.
  • D. Lister - Saturday, November 22, 2014 - link

    No problem at all; it takes a big man to accept an opposing argument with such candor - well done.
  • FlushedBubblyJock - Wednesday, November 26, 2014 - link

    It's AMD's responsibility to work with game devs to make certain their cards work properly.
    Of course, AMD has been notorious for not doing that for many, many years - and then, of course, it's NVIDIA's fault.
    AMD might do well to say: "We take full responsibility."
    That would mean having the Catalyst makers do more than emailing and whining - showing up at game dev studios, taking an active hand, and having launch-day drivers ready.
    Of course, if they did that, what would their unbelievably incompetent, blame-misplacing fans have to do?
    I mean seriously, it's as bad as the worst politicians we've ever seen, pointing fingers in every direction but their own.
  • Lerianis - Friday, November 28, 2014 - link

    Agreed... it should be a year and a half at least for a game of this scale, given the manpower allotted to Ubisoft Montreal.
  • JarredWalton - Thursday, November 20, 2014 - link

    1440p High is probably playable on a single GTX 980 -- I just ran GTX 970 on that and got results of 30.4/23.6 Avg/Min, which is about 40% faster (44% to be precise) on average FPS and 65% faster on minimum FPS. If 980 sees the same scaling, it will be around 35/26 FPS at 1440p High. There's not a huge difference in performance or quality between the High and Medium presets, which means you really would need to drop to Low (or close to it) for 4K gaming.

    Why did I test these settings? Because you have to choose something, and we generally go with "Ultra" at 1440p -- just to see how the GPUs fare. I've tested 4K at Ultra in the past, but that was completely unplayable across the board so I dropped to High for this game. If I had dropped 1440p to High, I'm sure I'd get people wanting to see Ultra numbers -- you can't please everyone.

    Anyway, as someone who has a 4K display, I can tell you I'd rather play at 1440p or even 1080p with better graphics quality (High, Very High, or Ultra) than run at native 4K with the Low/Medium settings. YMMV.
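    Jarred's back-of-the-envelope projection can be sketched in a few lines of Python. Note the ~15% average / ~10% minimum uplift factors are illustrative assumptions chosen to reproduce his quoted ~35/26 FPS estimate for the GTX 980, not measured numbers.

    ```python
    # Rough projection of GTX 980 frame rates from measured GTX 970 numbers.
    # The uplift factors below are illustrative assumptions, not benchmark data.

    def project_fps(avg_fps, min_fps, avg_uplift=1.15, min_uplift=1.10):
        """Scale measured avg/min FPS by assumed per-card uplift factors."""
        return round(avg_fps * avg_uplift, 1), round(min_fps * min_uplift, 1)

    # Measured GTX 970 at 1440p High (from the comment): 30.4 avg / 23.6 min
    avg_980, min_980 = project_fps(30.4, 23.6)
    print(avg_980, min_980)  # roughly 35 / 26 FPS
    ```

    The same one-liner works for any pair of cards once you plug in whatever relative-performance gap you believe applies.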
  • AnnonymousCoward - Saturday, November 22, 2014 - link

    IMHO, as a 30" owner I'm more interested in 2560 benchmarks at a quality setting that gives 60fps on non-SLI cards.
  • Akrovah - Thursday, November 20, 2014 - link

    I disagree. I find 30 perfectly playable. That's the effective frame rate of television. Movies are 24, and nobody has issues with them not being "smooth enough." Heck, people almost got out pitchforks when someone dared film a movie at 48 fps.

    I mean yes, for gaming 60 fps is preferable and looks and feels better, but to call anything under that "awful" is going a little far. Especially when the game in question is not a twitch shooter. Action/adventure games like Assassin's Creed are perfectly enjoyable at 30 fps.
  • HanzNFranzen - Thursday, November 20, 2014 - link

    Well you know, this is the internet... comments must be exaggerated for effect. Either something is the greatest of all time or it's awful, never any middle ground. Anyways, I have a GTX 980 and a 5820K @ 4.0GHz, and I would say that my experience with "playability" in this game doesn't really mirror the benchmarks at 2560x1440/Ultra. Perhaps there are more taxing areas of the game that I haven't seen yet, but I'm not seeing frames dropping into the teens. I feel the controls hurt the playability of the game more than anything, as they just seem clunky.
  • theMillen - Friday, November 21, 2014 - link

    Exactly my experience: 3770K @ 4.8GHz and an EVGA 980 ACX OC'd to 1550, and at 1440/Ultra it is completely playable. I'm about 4 hours in and am completely satisfied with the results. Would I love to stay above 60fps at all times? Yes. Am I satisfied? Yup!
  • foxtrot1_1 - Thursday, November 20, 2014 - link

    There is a big difference between passively watching a 24fps film and interacting with a 24fps video game. I'm far from a pedant about these things, and even I find anything under 45-50 fps distractingly unplayable.
