Still quite a bit of noise on the Internet between those who still protest about the downgrade and those who think the game looks beautiful and graphics aren’t the whole deal anyway. Neither side noticing that the two positions aren’t antithetical, but compatible.
And because I believe they are compatible, something still needs to be said about this downgrade, in particular to show how very, very far the game ended up from its expected landing point:
1- CDP always denied a downgrade, or even the possibility of one.
2- Just a couple of weeks before release they claimed the game would run at a locked 60 FPS on Ultra PC settings with Hairworks enabled.
In the end the two individual claims weren’t just completely false, they were DELIBERATELY false, intended to misdirect customers. But beyond that, it’s the SUM of the two that should tell you how far they are from their claims.
I have lots of stuff in the archive to put on BOTH blogs but I keep delaying that. For now here’s some bitching about The Witcher 3, even considering the engine is not too bad, after all.
– There was a downgrade (between the real game and the old trailers) and it was significant. All areas were hit: lighting is different, vegetation completely changed, clip plane murdered, object geometry reduced, detail pop-in, and NPC density. While the real game looks pretty, it still looks like a pretty, modern computer game, as opposed to the visual marvel the trailers showed. To anyone with minimal technical competency those trailers were very obvious “frauds”; you didn’t have to wait for the game to be surprised that that sort of quality is out of reach.
– Until a week ago the official word was: “fixed 60 fps, ultra preset, on a 980”. The actual truth is quite far from that. With everything maxed a 980 averages around 40-50 fps, depending on how demanding/rigorous the benchmark is. Keep in mind this is a game that went through multiple delays; before the latest ones it was expected in early 2014, so this was a game PLANNED to run on hardware that precedes Nvidia’s 9xx class, and this BEFORE the downgrade. So take the best videocard of that time, a 780 Ti. It means they wanted us to believe that a 780 Ti could run this game well, with the kind of graphics shown before the downgrade. Yeah.
770 class videocard = 30 fps at high, maybe with some non-intensive ultra
970 class videocard = 60 fps at high/ultra mix (Hairworks off)
980 class videocard = 60 fps ultra (Hairworks off)
PS4 usually at 30 fps, with mostly high settings (foliage high, shadows medium, worse LOD, though long-range shadows look better than on PC regardless of setting)
Watch Dogs and AC:Unity saw massive outrage from the public, and that should tell you how much perception counts. The Witcher 3 is still rather buggy, seems to crash a lot on PC (especially at weird points, like browsing the inventory), and has seen a more significant downgrade than those games. Yet it is a better game overall, with a flavor not as trite and shallow as Ubisoft games. Still, everyone is more than willing to close an eye to the many technical problems (and gameplay problems, and the very bad controls).
So, on one hand it’s understandable that The Witcher is much better received despite its technical problems. It is largely the game everyone expected, whereas Watch Dogs and AC:U were both technical failures on top of VERY stale (and for the most part shallow, uninspired) design. On the other hand CDP is too much of a fan favorite despite not really having earned that position.
The engine does one good thing: it requires relatively little CPU overall. Yet it’s incredibly taxing on the GPU, to unjustifiable levels. Why? What people see on screen is mostly those red light tones, and vegetation. Vegetation runs on middleware, an engine CDP bought. It’s literally about buying an engine and then buying individual trees from a library they offer you.
SpeedTree, the engine that powers trees, grass and wind effects, has had really, really bad performance since its early days of integration with that other bad piece of middleware, NetImmerse/Gamebryo. SpeedTree is what you saw in Oblivion and DAoC. Same engine. Sure it can look pretty, but that’s all your processing power being spent on a poorly optimized vegetation simulator.
So no real surprise at The Witcher running so badly. It’s a bit like Hairworks, but mandatory for the experience.
(another technical detail: if you look at the benchmarks, they largely favor AMD cards. This is totally unexpected, since this is an Nvidia-optimized game and AMD didn’t even release a custom driver for it, as Nvidia did. So what’s the reason for AMD winning so easily without even putting in the effort? SpeedTree again. It’s likely that wind physics and other physics simulations run on “Compute”, and Nvidia cards are traditionally bottlenecked far more than AMD cards on that specific feature.)
Besides that, there’s significant pop-in of objects and actors, delivered through that ugly dithering effect that seems to be a DX11 default.
Just think if, instead of those gorgeous trailers that made CDP win so many awards while the game was still years from release, they had shown this instead (specifically from 1:25 to 1:48):
Entire buildings warping and morphing in and out of existence, textures missing. And that’s the patched game. I’m quite sure that if they had shown that footage they wouldn’t have won so many prizes.
And that’s the problem. Looking back, they won. This practice of being deliberately dishonest and going for whatever is convenient in the moment has paid off. They get glowing reviews all over the place, with only minor notice of the bugs and technical problems.
Anyway, you might still want to enjoy the game for what it offers. If you have a 770-class videocard (or better) and you still want the best performance without losing video quality where it matters, here are my tips:
– Turn OFF that pointless in-game sharpness filter and the anti-aliasing (both replaced by SMAA, see below)
– Turn OFF Hairworks (it’s not worth the watts even if you have the hardware for it)
– SSAO looks better than HBAO. Use SSAO for better performance.
– Foliage range HUGELY affects performance, especially going from high to ultra. So keep on high.
– Foliage density has a small impact, but “medium” actually looks better than “ultra”. Use medium.
– Shadow quality medium looks decent enough. Use medium. (LOD levels are unaffected regardless of setting)
– Ultra/high texture quality: no visible difference at all. Ultra only caches more textures (so 2GB cards are perfectly fine on high)
– Set the other settings, like water detail, detail level and terrain, all to “high”. Most of these currently have no effect at all, so it’s all placebo.
THEN (the following tweaks come at very little performance cost):
– Open the archive and dump its contents into the Witcher’s /bin/x64/ directory, where the game’s .exe also is.
– Rename the ReShade64.dll file to dxgi.dll
– Run the game once to make sure it works (as soon as the game starts you should see some text saying it’s running)
– Go into the SweetFX directory and edit SweetFX_settings.txt: look at all the effects and make sure only USE_SMAA and USE_LUMASHARPEN are set to 1
– Scroll down to the LumaSharpen part and make sure the numbers are like this:
(I’ll have to check later for actual numbers, for now keep at default or tweak as you like)
If you want to tweak more you can always alt-tab, edit and save the .txt, and tab back into the game. The settings are reloaded dynamically as soon as you save the .txt
(Dark Souls edition means I simply used the profile I was using for that game. The final tweaks are more conservative and only apply proper anti-aliasing with some sharpening to improve the look of grass, if you squint)
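For reference, the relevant part of SweetFX_settings.txt looks roughly like this (a sketch from memory; the exact define names and comments vary between SweetFX versions, so match whatever your copy uses, and leave the LumaSharpen numbers at default as said above):

```
// SweetFX_settings.txt -- effect toggles (excerpt, layout varies by version)
#define USE_SMAA        1  // SMAA anti-aliasing: ON
#define USE_LUMASHARPEN 1  // sharpening: ON
#define USE_FXAA        0  // everything else: OFF
#define USE_VIBRANCE    0
```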
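The install steps above can be sketched as a tiny script. This is only an illustration: the directory here is a temporary stand-in for the game’s /bin/x64/ folder (real path depends on where you installed), and the dll is created empty instead of coming from the actual archive.

```shell
#!/bin/sh
# Sketch of the ReShade install steps, simulated in a temp dir.
# On a real install GAME_BIN would be ".../The Witcher 3/bin/x64".
set -eu
GAME_BIN="$(mktemp -d)"

# 1. "Dump the archive contents" next to the game .exe
#    (simulated: we only create the one file that matters here).
touch "$GAME_BIN/ReShade64.dll"

# 2. Rename ReShade64.dll to dxgi.dll so the game picks up the
#    injector in place of the DXGI runtime when it launches.
mv "$GAME_BIN/ReShade64.dll" "$GAME_BIN/dxgi.dll"

ls "$GAME_BIN"
```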
I just noticed on twitter a report that WoW’s subs dropped substantially again.
This follows a predictable peak after the expansion release in November, but the news here is that in 5 months they not only lost all the subscriptions they had gained, they are also now at an all-time low since the game’s release, and of course there isn’t going to be another expansion to change things anytime soon.
I don’t follow WoW, but I had a passing interest in the expansion and followed just enough to realize it was making very bad design mistakes on almost every possible level. In particular they went for a busywork style of play that requires you to complete a few daily tasks. That’s fine as a way to drive retention, but it also means they reduced the game to JUST a formula. That gets old very quickly.
So: the short-term good, long-term bad kind of design. WoW’s game design has gone downhill ever since Wrath of the Lich King. Cataclysm was a good idea, but all the good game design the game once had aplenty has been systematically siphoned out.
This is to say: WoW isn’t leaking subscriptions because it’s old, or even because of the (mostly shitty, if you exclude FFXIV) competition, but because the WoW of today is even WORSE than the original WoW. Not only was this game unable to progress, as every MMORPG must if it wants to continue to exist, it went backwards.
The simple fact that the expansion brought back so many players, and none of them stayed, is a sign of how bad the game has become. People WANT to play and love WoW, but what’s left of it is an empty shell.
For the record, another MMORPG also working hard to go backwards is now Guild Wars 2. Good luck with that expansion, too.