Officially launched on October 28, Lords of the Fallen had a bit of a rocky start, so it took longer than usual for me to finish running benchmarks on the game – more on that momentarily. At its core, Lords of the Fallen is a third-person melee action-RPG similar in many ways to the Dark Souls games. Periodic boss fights shake things up as you play the convicted criminal Harkyn (whose facial tattoos publicly declare his sins), trying to stop the invasion of the demonic Rhogar. The game has received decent reviews, with a current Metacritic score of 72.

Like many recent releases, Lords of the Fallen is a multi-platform title that launched simultaneously for the PC, PS4, and Xbox One. The newer consoles sport more memory than the old PS3 and Xbox 360, which gives developers opportunities to do a lot more in terms of textures and graphics quality, and at launch I think that ended up creating some problems on the PC. The short summary is that GPUs with 2GB of VRAM or less tended to have quite poor performance. CrossFire also had issues: not only was it functioning poorly, it actually caused the game to crash to the desktop.

The first update to the game was released about a week later, and it fixed a few bugs and instability issues, but more importantly it offered much better performance on GPUs with limited VRAM. CrossFire is also "working" now – and I put that in quotes because CrossFire is actually causing degraded performance in some cases and is basically not scaling well enough to be worth the hassle so far. Need I mention that this is an NVIDIA "The Way It's Meant To Be Played" title? Not that the developers have intentionally crippled AMD performance, but I don't think AMD has spent as much time optimizing their drivers for the game. It runs well enough with the right hardware and settings, but this is definitely a game that favors NVIDIA.

We tested using the game's built-in Ultra and High settings, but there's not a significant difference in performance or quality between the two modes so I'm in the process of running another set of performance figures at Medium quality. (I'm also testing Assassin's Creed: Unity performance, which will be the next Benchmarked article, so it will be a bit before I can post the full 1080p Medium results.) Before we get to the performance, here's a quick look at image quality using the four presets:

The major difference between Ultra and High seems to be a minor change in the handling of shadows; I'm not sure you could definitively call Ultra "better", and performance is so close that it's mostly a moot point. Medium appears to disable Ambient Occlusion on the shadows, resulting in a much more noticeable change to the graphics, while Low also disables the Post Processing effect. I'm not sure that's actually a bad thing, though – the effect warps the image a bit, particularly on the right and left thirds of the screen, and tends to make everything look a little blurry/weird.

Lords of the Fallen also features support for some NVIDIA technologies, including PhysX APEX particle support. Many of the effects use PhysX on the CPU and thus work with all GPUs (as well as running on the PS4 and Xbox One), but there's an additional effect called Turbulence that's only available on NVIDIA GPUs. I didn't try to do thorough testing of performance with and without Turbulence enabled since it's an NVIDIA exclusive, but informally it looks like the performance hit is relatively small – around 5-10% – so if you're running an NVIDIA GPU it's probably worth enabling.

One final thing to note before we get to the benchmarks is that Lords of the Fallen is very demanding when it comes to GPUs. Moderate hardware (e.g. the Radeon R7 series and similar, or the NVIDIA GTX 750 Ti and lower) is going to struggle to break 30FPS at 1080p Ultra or High settings, so 1080p Medium or even 1600x900 Medium might be required. I'll add the Medium results in the next day or two once I finish retesting. And once more as a quick overview, here's the hardware used for our Benchmarked articles:

Gaming Benchmarks Test Systems
CPU: Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3), overclocked to 4.1GHz
Motherboard: Gigabyte G1.Sniper M5 Z87
Memory: 2x8GB Corsair Vengeance Pro DDR3-1866 CL9
GPUs:
  Desktop GPUs:
    Sapphire Radeon R9 280
    Sapphire Radeon R9 280X
    Gigabyte Radeon R9 290X
    EVGA GeForce GTX 770
    EVGA GeForce GTX 780
    Zotac GeForce GTX 970
    Reference GeForce GTX 980
  Laptop GPUs:
    GeForce GTX 980M (MSI GT72 Dominator Pro)
    GeForce GTX 880M (MSI GT70 Dominator Pro)
    GeForce GTX 870M (MSI GS60 Ghost 3K Pro)
    GeForce GTX 860M (MSI GE60 Apache Pro)
Storage: Corsair Neutron GTX 480GB
Power Supply: Rosewill Capstone 1000M
Case: Corsair Obsidian 350D
Operating System: Windows 7 64-bit

Lords of the Fallen Average FPS

As far as the target FPS for a decent experience, Lords of the Fallen isn't quite as twitch-heavy as some games, so I'd recommend shooting for anything above a 40FPS average. If you have a G-SYNC display with an NVIDIA GPU, that will also let you experience "smooth" gameplay without tearing at lower frame rates. For our testing, however, we disable VSYNC as usual.

Lords of the Fallen 4K Ultra

Lords of the Fallen QHD Ultra

Lords of the Fallen 1080p Ultra

Lords of the Fallen 1080p High

Starting with average frame rates, 4K is basically a stretch at best for even the fastest single GPU configurations. The GTX 980 can technically break 30FPS (barely), but it's not as smooth as I'd like so dropping down a notch is recommended. SLI reportedly works well, though I don't have the hardware to test this (yet), so two high-end NVIDIA GPUs might be enough to get into the playable frame rate territory at 4K. At present CrossFire R9 290X still falls well short, but that's also due to the fact that CrossFire scaling is very low right now.

There's a sizeable jump in performance going from 4K to QHD as expected, with most of the GPUs basically doubling their performance – not too surprising as 4K has 2.25X as many pixels to render as QHD. I mentioned earlier how the patch changed performance in some cases, particularly for GPUs with 2GB or less VRAM. The big beneficiary for higher performance GPUs ends up being the GTX 770, which saw a jump in QHD performance of over 70% with the patch (and a still significant increase of 30% at 1080p Ultra/High).
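
As a quick sanity check on that pixel math (a trivial sketch, not part of our benchmark tooling):

```python
# Pixel counts for "4K" UHD and QHD.
uhd = 3840 * 2160  # 8,294,400 pixels
qhd = 2560 * 1440  # 3,686,400 pixels
print(uhd / qhd)   # 2.25 – 4K renders 2.25X the pixels of QHD
```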

On the AMD side of the equation, the R9 GPUs don't do all that well compared to NVIDIA. We're used to seeing the 780/970 trade blows with the 290X in most games, but here the 290X is closer to a 770, with the 780/970 offering a solid 15-20% increase in performance. Meanwhile the 280X is mostly playable at QHD but certainly not ideal, and the 280 has to drop to 1080p before it can achieve "acceptable" performance. Overall, the R9 290X along with all of the GTX desktop GPUs I tested can handle QHD Ultra and provide a good experience.

Moving to the 1080p results and looking at the laptops, the GTX 980M is clearly a force to be reckoned with, essentially matching the R9 290X and the GTX 770 and easily handling 1080p Ultra. The next step down to the GTX 880M is a pretty big one – the 980M is about 35% faster than the 880M – but the 880M is still able to handle 1080p Ultra. The 870M meanwhile is in that "questionable" range, and dropping to High settings is only good for about 3-5% more performance on most of our GPUs, so a bit more tweaking is going to be required. Last but not least, the 860M falls short of even the 30FPS mark, and it will need some tuning or Medium quality before it's really acceptable at 1080p.

Our sole "low-end" GPU is the R7 250X, and as you can see it's really not doing well at 1080p High, falling below 20FPS. It did benefit quite a bit from the patch, improving by around 35% at 1080p High, but going from 13.7 FPS to 18.5 FPS still means it's unplayable. I also tested an Intel HD 4600 just for fun (though it's not shown in the charts since it only managed 6.5 FPS); even at 1366x768 and Low quality it's still far short of being playable, with frame rates of around 17 FPS.

Lords of the Fallen Minimum FPS

As with Civilization: Beyond Earth, for the "minimum" FPS I'm actually using an average of the bottom 1% of frame rates. What that means is that this is a realistic look at minimum frame rates, as our benchmark run typically consists of a couple thousand frames of data so we're looking at an average of 20+ frames. Thus, a single frame that took a long time to render won't have as great of an impact as consistently low frame rates. The goal here is to give you a better idea of what performance will be like in the most graphically intense situations.
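
To illustrate the idea, here's a minimal Python sketch of how a bottom-1% "minimum FPS" can be computed from per-frame render times. This is my own illustration of the metric, not our actual benchmark tooling; the function name and the millisecond input format are assumptions.

```python
def one_percent_low(frame_times_ms):
    """Average FPS of the slowest 1% of frames (the 'minimum FPS' metric).

    frame_times_ms: per-frame render times in milliseconds.
    """
    # Convert each frame time to an instantaneous FPS value, slowest first.
    fps = sorted(1000.0 / t for t in frame_times_ms)
    # Take the slowest 1% of frames (at least one frame).
    n = max(1, len(fps) // 100)
    worst = fps[:n]
    return sum(worst) / n

# Example: 2,000 frames at ~16.7ms (~60FPS) with 20 slow 40ms frames mixed in.
times = [16.7] * 1980 + [40.0] * 20
print(one_percent_low(times))  # 25.0 – the 20 slow frames fill the bottom 1%
```

Because the metric averages 20+ frames on a typical run, one isolated slow frame barely moves it, while consistently low frame rates show up clearly.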

Lords of the Fallen 4K Ultra Minimums

Lords of the Fallen QHD Ultra Minimums

Lords of the Fallen 1080p Ultra Minimums

Lords of the Fallen 1080p High Minimums

When we look at the minimum FPS, you can now see why I recommended at least 40FPS average frame rates for Lords of the Fallen to be "playable". That translates into minimum frame rates of roughly 30FPS, so even in higher complexity scenes the game will still stay reasonably smooth. On the other hand, if you're averaging closer to 30FPS, minimum FPS is going to drop into the low 20s, and that can be quite choppy.

The standings of the various GPUs don't really change much in our minimum FPS results. In most cases the minimum is around 70-75% of the average FPS, with GPUs that have less VRAM generally faring slightly worse than those with more. NVIDIA's minimums hold up a bit better relative to AMD's at 1080p than at QHD, but there aren't any clear issues on any of the GPUs.

Closing Thoughts

I never played any of the Dark Souls games for whatever reason (lack of time, mostly), so for me Lords of the Fallen is actually pretty fun. Of course, having benchmarked the same sequence I don't know how many times (well over 100) does become rather tedious. With so many other games coming out right now, I don't think I'd place Lords of the Fallen at the top of any recommendations list, but it has enough to warrant picking it up if it goes on sale. In the meantime, I'd suggest Middle-Earth: Shadow of Mordor or Assassin's Creed: Unity as better games, at least in my opinion.

Now that we've had a few of these Benchmarked articles, let me also ask for reader feedback. The good thing about these articles is that once I'm done with the initial benchmarking, I won't necessarily need to retest the same game on different systems for another year or two. It's also useful to increase the number of games we benchmark, as it helps keep the GPU manufacturers honest – they can't just optimize drivers for the ten or so games that most sites use for benchmarking, for example. But what do you think – do you like these articles? Short of testing even more configurations (always nice to have, but very time consuming to deliver), what else would you like to see? Are there any recently released games that you'd like to see us test? Let us know!



Comments

  • JarredWalton - Thursday, November 13, 2014 - link

    Unfortunately I don't have any AMD mobo/APU to do the testing on right now, but it wouldn't surprise me to see CPU bottlenecks come into play on lower performance CPUs/APUs. The CPU bottleneck appears to be around 110-115 FPS on my 4.1GHz i7-4770K (at Low/Medium quality), so at higher quality and with dual GPUs it would probably be lower than that. I think most people would be fine with a single fast GPU and an APU, but breaking 60 FPS might require something more potent.
  • yannigr2 - Thursday, November 13, 2014 - link

    On one hand AMD is letting people go, so maybe fewer programmers on the drivers team. On the other hand many games are starting to be made with AMD hardware in mind thanks to the consoles. These two things go in opposite directions. If AMD looks better in the driver area in a year or two, then the number of engineers wasn't as big a factor as we both think – it was the way games were made. Games were made on Intel+Nvidia hardware from what I remember and just tested for compatibility on AMD hardware. So Nvidia had to address fewer problems than AMD.
    Of course Ubisoft is the exception to the above, considering their tight relationship with Nvidia. They will always program games with Nvidia hardware in mind.
    But what I want to say is this: in a year or two, if AMD with less staff on the programming team is starting to look better, then those conspiracy theories I am talking about now could have a very large dose of reality.
  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    AMD has been letting people go for years.
    AMD already had the lesser of the people, because anyone with a brain wanted to work for the company with BILLIONS in the bank, nVidia.

    All the conspiracy theories in the world pale in comparison to the common sense dollar analysis, call it the terrible social consequences of capitalism for amd fans...

    Not to mention, AMD is to be pitied; its end-user fan base is constantly demanding more for less and is the worst of the penny-pinching freak jobs, the exact kind of customer and consumer every decent businessman wants to avoid 24/7/365... as they drain the viability of the company.

    Scrimping little penniless tightwad complainers... always demanding a better deal - the only upside being the insane loyalty and "for the poor underdog" mindset that can be used and wielded as a powerful yet dishonest PR ax.
    Well, in the end, that spells destruction, and in the mean time, it spells under performance, problems, lagging, and inferior quality in every area.

    Maybe the FED can print up a multi billion dollar bailout "for free".
  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    That isn't even true – take Dragon Age and its titles, for instance.
    AMD is the one pulling stunts to strip the competition of performance when one of their 4 "game devs" sends off email answers to game companies.

    Nvidia has literally 100 traversing the world, AMD had 4 last time I saw it publicly revealed.
    So there's only so much AMD can do with 4 underpaid disgruntled and overworked stringers facing the firing ax every time the quarterly loss sheet hits the news.

    Maybe the EU should make a new progressive law whereby nVidia must begin optimizing AMD GPUs, as "each according to its need" would please amd fans internationale.
  • Gigaplex - Wednesday, November 12, 2014 - link

    "but I don't think AMD has spent as much time optimizing their drivers for the game"

    I'm tired of seeing a need for special application-specific "optimisations" (read: corner-cutting) in the low level drivers just to make the game work. If the games used the APIs correctly, and the drivers implemented the APIs correctly, driver optimisations should benefit all games.
  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    Oh great, then mantle should be a smashing success.
  • TrackSmart - Wednesday, November 12, 2014 - link

    Maybe it's just me, but it's hard to keep up with all of the new GPU naming schemes. For those of us who are a little behind the times, it might be helpful to reference some of the older "equivalents" in articles like this. It would let us more quickly figure out where our existing hardware is likely to fall on these charts.

    Doing a bit of searching, I can see that the R7 250X is essentially equivalent to a Radeon HD 7770 (and hence not far off of the older 6770 and 5770). That's useful knowledge! It says that yesterday's "mid-range" videocard is going to have a tough time handling the current generation of games, even at compromised settings.

    Thanks for keeping us informed.
  • JarredWalton - Thursday, November 13, 2014 - link

    True. The R9 280 is the same as the HD 7950, and the R9 280X is basically the same as HD 7970 (GHz edition or overclocked a bit). R9 285 is a different core, but performance should be about the same as the R9 280 as well. On the NVIDIA side, there haven't been as many straight rebadges (at least for the GPUs I have), though in terms of performance I'd estimate the 860M as being relatively close to the old GTX 580 in performance. But I could be off there, so don't quote me. ;-)
  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    rebranding is no longer a topic since AMD dove into the lake of fire and lies head first in that area...

    I find that extremely unfortunate as all it would take is the old name of the card next to the new name of the card in the charts 7850 7870 9750 ... everyone would realize it's been like years since AMD made anything new - except for 290x hawaii which I'd assume is just a bloated out pitcairn or tahiti.

    I have to admit it's amazing their cards are even on the charts - I think AMD has pulled like 30% performance out of the driver pile across the board over time - the image quality has suffered but that is never a topic anymore either because we have to save the drowning underdog, it's a worldwide group effort.
  • BlueScreenJunky - Thursday, November 13, 2014 - link

    Nice article, waiting for AC:U as it might be my next purchase if it runs OK on PC.
    Could you maybe also run a few tests with either different CPUs or downclocking / disabling cores on your i7 in 1080p to see if some games may be CPU limited ?
