Books/mythology/stuff discussions moved to: loopingworld.com

This means that the site here won’t (usually) be updated and I’ll eventually copy all the book-related posts over there. The rest of the stuff will stay here for as long as the site stays up (I’m not planning to pull it down for the foreseeable future).

UPDATE: I’ll sporadically still post here, but it will be for writing about roguelike development, tracking my own (lack of) progress, or other quirky gaming things.


Nvidia and the bleaker future of GPUs

I should probably spend my time doing more worthwhile things rather than writing this. But it seems that no one else is writing it.

As usual when I deal with this stuff, I will be imprecise and simplify A LOT. But in general what I say is going to be practically correct: the big picture is the one I’m describing, without getting lost in the technical details.

The situation is this: in the last couple of GPU generations, namely the 7xx and the latest 9xx, Nvidia has won the market. They won with hardware that, at the same price level, can output better performance AND consistently better energy efficiency. So it’s a total win-win scenario, where Nvidia beats AMD in every case you can measure.

The problem is that it turns out this was achieved by removing certain scheduling hardware from the chips, a process that started with the 7xx class and continued with the 9xx. Putting it in the most simplistic way possible: there’s less “stuff” on the chip, and because of that the chip requires less power to run. Nvidia found out that they were able to improve performance by moving that specific logic away from the hardware and dealing with it in “software” instead, meaning the drivers. Stripping down and simplifying the hardware allowed Nvidia to create these energy-efficient GPUs, while also drastically reducing production costs. That’s how they won.
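To give an idea of what “scheduling in software” means, here’s a minimal sketch (a toy I made up, not Nvidia’s actual scheme): instead of the chip checking instruction dependencies at runtime, the driver’s shader compiler works out a safe issue order ahead of time and bakes the required stall counts into the instruction stream, so the chip can get away with less logic.

# Toy static scheduler: the kind of work that moves from hardware into the driver.
# Each instruction: (name, result latency in cycles, names of instructions it depends on).
PROGRAM = [
    ("load_a", 4, []),
    ("load_b", 4, []),
    ("mul",    2, ["load_a", "load_b"]),
    ("store",  1, ["mul"]),
]

def static_schedule(program):
    """Done offline by the driver/compiler: for each instruction, compute
    how many cycles to stall so that all its inputs are ready."""
    ready_at = {}          # name -> cycle when its result becomes available
    cycle = 0
    schedule = []
    for name, latency, deps in program:
        earliest = max([ready_at[d] for d in deps], default=cycle)
        stall = max(0, earliest - cycle)
        cycle += stall + 1                 # issuing takes one cycle
        ready_at[name] = cycle - 1 + latency
        schedule.append((name, stall))
    return schedule

# The chip then just executes this list with the pre-computed stalls,
# with no dependency-checking hardware of its own.
for name, stall in static_schedule(PROGRAM):
    print(f"stall {stall} cycle(s), then issue {name}")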

But this summer the first DirectX 12 benchmarks came out, and they showed not only that AMD performed a lot better compared to Nvidia, but that in a few cases Nvidia hardware performed WORSE in DX12 than in DX11. Turns out that DX12 implementations rely much more directly on the hardware scheduling that, guess what, is not physically present in recent Nvidia hardware.

What this reveals is important for both DX11 and DX12 future games, and the likely scenario is that the current 970s and 980s videocards will age VERY quickly and very poorly. The current excellent performance of these GPUs depends critically on Nvidia writing game-specific schedulers in the drivers. It means that critical optimization is done directly by Nvidia engineers at the compiler and driver level. Game programmers have NO ACCESS to this level of source code, so they cannot do anything besides calling Nvidia and hoping they care enough to allocate their engineer hours to fix certain issues. Right now the 970s and 980s are showing excellent performance because they have full support directly from those engineers, who write these custom schedulers for every big game coming out. These GPUs are crucially dependent on driver optimization because the driver is doing a job that is usually done at the hardware level, as in AMD’s case, but Nvidia stripped that hardware from the new chips, so it does that job in the drivers. And that’s also why new games are coming out that show very poor performance on the 7xx chips compared to the 9xx ones: Nvidia engineers focus more and more on the newer cards, and less and less effort goes into optimizing and writing drivers for older hardware. Widening the gap over time.

What happens when Nvidia releases new hardware next year, with proper support in hardware for the DX12 features? Everything changes. Nvidia engineers will be focused on optimization for the newer cards, because Nvidia’s job is to sell you new hardware. And because the current GPUs’ performance is so dependent on active driver optimization, more than it ever was because the schedulers are written in software, once Nvidia engineers stop putting all their work into that optimization the performance of the current cards will plummet.

So while the 970s and 980s are, by far, the best cards on the market right now, in the coming months and years we’ll see the scenario completely rewritten. Current cards are going to perform very badly and upgrades will be mandatory if you want to keep up with newer games. There’s going to be a significant step up in hardware requirements, way steeper than what we’ve seen in the last few years.

Yet it’s also not possible to determine whether Nvidia has already lost the market battle. Right now AMD hardware is much more future-proof than Nvidia’s, so AMD is better positioned strategically. But next year marks a shift in technology, a new beginning, and it’s probable that Nvidia will put the schedulers back in hardware, with proper DX12 support instead of emulation. But it’s a new beginning only for Nvidia and for whoever is ready to buy brand new hardware. For everyone else who sticks with Nvidia’s current generation it will only mean that this hardware will quickly be rendered obsolete.


A note on MMORPG business models

First, remember that MOST people can only see what happened after it happened, whereas other people learn enough to have a vision of what is going to happen. In a similar way, there are games created for an existing market and audience, and games that deliberately create a market that wasn’t there before, one that suddenly becomes canon and that everything else has to conform to from that point onward. A vision can open new paths, and these new paths become the foundation on which everything else is built.

That said, there’s this widespread myth that free to play has to “replace” subscription models, that it is some unavoidable destination. The discussion gets framed around a new model replacing an obsolete one, instead of around a game’s own merits. As if a game could fail or succeed purely because of its business model.

The truth of free to play versus subscription models is fairly simple and slightly different from the debates I usually see. The point is that a subscription model is more directly competitive, and so riskier. But it is not a case of “new” versus “old”, or of a model that is now obsolete. The rise of free to play is driven by the fact that the market is so competitive that no one would survive under a subscription model. Free to play is a way to virtually enlarge the pie. Understood?

The reason is also simple. Players out there can and will buy different games; they can shift their focus from one to the other. A subscription model instead leads to a situation where a player has to decide on one game to play. It’s very unlikely that a player will maintain multiple subscriptions. So the result is that subscription-based games compete much more directly with each other, and only the “King of the Hill” will survive and do well under these conditions. Every other title will fall short and struggle, which is the very simple reason why World of Warcraft dominated all these years. Or the reason why Elder Scrolls Online has to move to a subscription-free model: not because subscriptions are a “bad business model”, but merely because the title isn’t strong enough to face the competition under a subscription model. It’s like a two-tiered market where a couple of games can compete at the top and do well with a subscription model, while lesser competitors have to find a way to co-exist with less belligerent business models.

Again, the subscription model is still “ideally” the more appropriate business model for a long-term MMORPG that wants to grow as a virtual world, but for the practical needs of a market where you want to survive, the free to play model offers a way to squeeze more space out of that highly competitive, merciless market.

Rocksteady engineers trying to do the impossible

Gaming news these days is a joke.


Actual quote:

Warner said the above list is the priority, but it’s still working on the following:

– Skipping the boot up splash screens

Man, their most talented engineers are all hard at work to make splash screens skippable, but you have to accept when a task is simply above your skills.

Please desist, WB and Rocksteady, we will forgive you if you can’t fix what’s honestly way too complex to accomplish realistically.

I feel we should start a campaign to send them advice and maybe help them with this cyclopean task. Something like: “try moving the splash screen video files out of their directory, because it just might work.”

Dispelling the myth of DirectX 12

There are lots of articles out there detailing the merits of the new DirectX, but I think they all evoke expectations for the end-user that will never materialize.

The biggest claim is that DX12 is “more efficient”, and so offers free performance. Being compatible with older hardware means that the same engine on the same hardware will run better, especially by lowering the load on the CPU side. All this leads up to the myth that DX12 will extend the life cycle of current hardware.

My opinion is that the opposite will happen: DX12 is a way to push people once again to buy new videocards and new CPUs. As has always happened.

A couple of days ago Eurogamer published an article about the first somewhat relevant DX12 benchmark:

The most important aspect is how, on a fast CPU and a Nvidia card, DX12 is SLOWER than ancient DX11 technology. This single case doesn’t prove much on its own, beyond showing that it can actually happen: DX12 isn’t a sure improvement. It could just as well push things backward instead of forward. It’s not unambiguously “better”.

Here’s what I wrote about what might have happened (the beginning is an answer to someone claiming Nvidia DX12 drivers aren’t optimized yet):

Part 1: if we are at the bottom level, the activity of the driver isn’t very different from what DX11 does. If we are talking at a very basic level in DX12, it means dealing with basic instructions that DX11 already perfected. So there isn’t something intrinsic in DX12 that makes for a “tricky to develop” driver. The DX12 driver, compared to the DX11 one, is a driver that does less, at an even more basic level. So I’d assume for an engineer it’s much easier to write that driver (and there’s less to work with when it’s time to squeeze out more performance). The first reason why DX11 might be FASTER is that Nvidia engineers know how to make something faster *in the driver*, whereas the guys who wrote the DX12 code didn’t know as many tricks. Hence, DX11 is faster because it ends up having better custom code written by Nvidia.

Part 2: better multi-threading in DX12 still brings overhead. That’s why Nvidia’s backwards performance ONLY HAPPENS on 4+ cores and higher CPU frequencies. If the DX11 render thread can keep up (meaning that it doesn’t completely fill one core) then DX11 is FASTER than DX12, because single-threaded code is faster and because it leaves even more space on the remaining cores for the rest of the game logic. If instead you hit your CPU cap on that single thread THEN DX12 should ideally be faster, because you can spread the load better across the other cores.

The reason why the Final Fantasy 14 benchmark runs faster on DX9 than DX11 is somewhat similar. You can have fast single-threaded code, or slower multi-threaded code. If you add up the load of the multi-threaded code it ends up cumulatively higher (so slower) than the single-threaded code. The same happens with 64-bit vs 32-bit: 64-bit is marginally slower, but it allows you to tap into more resources.
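Here’s a toy model of that Part 2 trade-off, with numbers I made up purely for illustration (it isn’t a benchmark of anything real): splitting submission across threads pays a fixed coordination overhead, so it only wins once the single render thread is saturated.

CORES = 4
SPLIT_OVERHEAD_MS = 1.5   # assumed cost of splitting/synchronizing submission

def dx11_style(render_ms, game_ms):
    # One thread does all the render submission; the rest of the game
    # logic spreads over the remaining cores. The frame's CPU work is
    # done when the slower of the two finishes.
    return max(render_ms, game_ms / (CORES - 1))

def dx12_style(render_ms, game_ms):
    # Submission is split across all cores but pays the overhead;
    # game logic is shared across the same cores too.
    return render_ms / CORES + SPLIT_OVERHEAD_MS + game_ms / CORES

# Light render load: the single submission thread keeps up easily,
# so splitting the work only adds overhead.
print(dx11_style(render_ms=4.0, game_ms=9.0))    # 4.0 ms per frame
print(dx12_style(render_ms=4.0, game_ms=9.0))    # 4.75 ms -> DX11 "wins"

# Heavy render load: the single thread is saturated and becomes the
# bottleneck, so spreading the work wins despite the overhead.
print(dx11_style(render_ms=16.0, game_ms=9.0))   # 16.0 ms
print(dx12_style(render_ms=16.0, game_ms=9.0))   # 7.75 ms -> DX12 "wins"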

Those are aspects that might explain why DX11 ends up being actually faster than DX12. But the myth is that the ideal, better performance of an engine will become better performance for the end-user too. I think that’s false, and that it comes from a false perception of how game development works.

I’ll try to explain again why DX12 expectations may be overblown, as always happens when you focus on the technical aspects and not on the practical ones.

Optimizing a game is a never-ending process that takes development time. Development time = money.

For a game company the first priority is to do things QUICKLY, because doing things fast turns into money you save. That’s why the Batman game tanked: they didn’t want to allocate it enough time. They wanted it done FAST because the PC isn’t worth long development times.

Time spent on optimization and actual game performance for the end user sit on the same axis: you trade one for the other. That means that in a lot of cases the hypothetical speed of DX12 WILL NOT be translated into faster FPS for the end users, but into shorter optimization phases for the developer.

So, DX12 = the same performance as DX11 with a shorter development time (eventually), and so a lower cost for the developer.

That’s how it works. The speed of an engine isn’t due solely to technology, but also to the time spent on it. In practice, TIME is a more important variable for the developer than performance is for the end-user.

That means, again, that in practice DX12 will end up producing just about the same performance you see now in DX11. Every improvement in tech, in the HISTORY OF THE PC, has always been eaten very quickly by rising requirements. Always and without exception. The moment you give developers some gains, they fill them up on their side by cutting down the time.

That’s not even the whole picture. As everyone knows, video drivers are increasingly complex and optimized only for the newest cards. See The Witcher 3 performing badly on 7xx cards. That means that even if DX12 theoretically brings benefits to ALL cards, as time passes the engineers writing drivers will only have the time (and motivation) to optimize them well on newer hardware. And that’s without even considering the developers who write engines, who will never waste weeks and months writing specific optimizations for older hardware.

That means that all gains that DX12 might bring will be used to push new hardware, and not to make your current hardware live longer. It will mean less engineering effort to develop new cards while showing bigger performance gaps. Smoke & mirrors.

This is how things work in practice, since the world isn’t simply run by theoretical technology. What you expect from DX12 just WON’T HAPPEN. DX12 performance improvements are oversold, as has ALWAYS happened and will continue to happen with new technology.

<3 JRPG pixels

I’m fiddling a bit with Playstation 1 emulators.

Almost everyone uses ePSXe, because you can scale up the resolution and prettify the graphics. Though the result is never as good as people think it is. The images might be dark, so increase your brightness to better compare them, and maybe open them in separate browser tabs on a black background.

This is just scenery, taken from Vagrant Story, a game that looks immensely pretty when properly pixelated.

The first image is ePSXe with scaled up resolution and textures. Notice that the polygons are much better defined. But there’s a complete lack of dithering, so the surfaces are very smooth and plain, giving a washed out, bland look.

The one below is again ePSXe, but with the “software” mode plugin. Both pixels and dithering are back, but it doesn’t look so great. Yet I prefer this to the smoothness of the first image.

Now we change emulator. This is the one I always used, a fairly obscure Japanese emulator named “Xebra”, with no configurable plugins and no option to “scale up” resolution and textures. It just emulates the original hardware with the maximum possible accuracy. So the following image is vanilla Xebra. Also notice that Xebra produces slightly more vibrant colors compared to ePSXe. Even if there are big pixels, the image blends well and offers a natural effect. The dithering makes surfaces richer, with more depth.

Now, I noticed that if you disable OpenGL the image becomes sharper, and it looks way better. The problem is that it causes some graphical glitches and it’s a lot slower in that mode. But then I realized that SweetFX shaders could work on top of the emulator, and so I could sharpen the image to get the same results on OpenGL. The following image is the result of me playing with those shaders. I actually like the result a lot, I toyed with various intensity values, and the image is sharp. The downside is that, if you look at the bottom of the image, the effect enhances all those tiny squares, which are so sharp they actually obscure the real detail by standing out too much.

The last image is again Xebra, with default settings and just a sharpen shader pushed to its maximum value. I think this produces the best effect. The beautiful dithering is there and the tiny squares at the bottom blend much better with the background.

Now another (open in new window for full size). The first is ePSXe in software. It is pretty bad. Then there’s ePSXe at higher resolution. The textures are smoother, but, if you notice, the better resolution also ends up exposing the problems of the source. The face looks a bit unnatural with its pointy nose and chin. The lines are too sharp. That’s a common side effect when you scale up games that weren’t made for that sort of detail. And due to lack of dithering the shoulder only shows diagonal bands of colors that look quite bad. The next quadrant is Xebra with my sharpen shader, to compare with the last quadrant that is vanilla Xebra. In this case the last one produces the best results, the image is softer and more natural. Suggesting that maybe a compromise between the last two is ideal.

But again, the important point I’m trying to prove is that the smooth, high-resolution option most people use is far from being the best. Pixels are beautiful.


This modern counter-bias

So, it looks like the fantasy side of tabletop Warhammer has joined the list of things that got a “reboot”. It shouldn’t surprise anyone that it turned into shit.

It doesn’t take very long, looking at the internet, to see that the response to this reboot has been almost universally negative. The Warhammer fantasy universe has been reset, so all established lore has been thrown out, and it was also an opportunity to rewrite the rules and, guess what, make them more “casual”.

The main differences are the focus on a smaller number of units and more importance given to heroes with special abilities. So there’s a smaller scale to manage, where single units make the difference. Besides that, everyone complains that the removal of army points makes battles simply impossible to balance. And it sounds like a gaping hole of an oversight, however you want to look at it.

It should be evident that they now want a toy, and not a wargame.

But I’m pointing this out to underline two basic trends. One is about these “reboots” that systematically alienate the current players yet gain absolutely no new ones. The point here is that it doesn’t take any careful analysis to realize these plans are always terrible.

The second trend is that I was reading this article that was doing a good job explaining the situation:

The problem is that it falls into this trend of the “Social Justice Warrior” angle being forced onto everything, which only has the effect of undermining perfectly reasonable complaints. As I said, the article makes very good points, so it really wasn’t necessary to also lean on that silly angle. I’m linking it because it reads like a parody of those same issues.

One of the things the new rules seem to do is try to break the fictional layer of the game to engage THE PLAYER directly as a game mechanic. In some kind of parody game it could even be a good, goofy idea, but in actual Warhammer? It’s beyond stupid.

But I find it even funnier that on one side the game rules themselves break the fictional layer, while on the other side the guy writing that article pushes a political agenda onto a fictional game/product. So I guess two wrongs make a right. And the result is that perfectly reasonable complaints about a very goofy ruleset turn into very goofy complaints, in a kind of circular way.

And so the accusation:
“encouraging players to straight up mock people who suffer from mental illness”

About this rule:
“if, during your hero phase, you pretend to ride an imaginary horse, you can re-roll failed hit rolls”

Uh-oh. So very offensive. Worthy of a crusade. GRAB THE WARHAMMERS!

On a more serious note, this way of thinking is dangerous. It’s an argument wielded as a weapon, and it is now pervasive in our culture in plenty of more subtle ways: blaming people for imaginary intentions. If you ride an imaginary horse while playing a game, your INTENTION for doing so is to “mock people who suffer from mental illness”. And of course you cannot even defend yourself from the accusation, because the accusation pretends to reveal a HIDDEN purpose, so a denial will never be accepted. Like a dialectic bullet of entitlement. Beware, because this way of thinking is spreading.

A few things about E3

When I was a kid I used to be super-excited about new games being announced, but because of how I’m wired it wasn’t due to me being a kid; it was due to the different scenario and the new things in front of us.

I was excited when Quake was announced, with the promise of a true 3D engine and a medieval setting with dragons. And I was excited when Final Fantasy 9 was announced, as a return to a more medieval “fantasy” setting, which I liked so much better than the futuristic mishmash more typical of Square, still without losing the sense of wonder, and with character design going in the opposite direction from the VIII chapter.

And, in general, new games would push the boundaries of what could be done. Look at the timeline: with just one or two years between each, we went from Wolfenstein 3D, to Doom, to Quake. Since the early 2000s the progress curve has flattened, and new games simply fail to generate hype, not because I’m older, but because the value simply isn’t there.

Fallout 4 is a good example. Same engine, but new hardware allowing them to add more foliage and geometry. Still no shame exhibiting horrendous animations. Still terrible combat. Dialogue system turned to utter shit. Removal of all skills. But hey, one really good idea: turn all the clutter into actual useful resources.

And I almost agree with Sarkeesian (who at this E3 has been working hard to troll the internet with the most stupid claims ever): if the modular crafting system were less about armor and weapons and more about survival and other tools, the system and the potential gameplay would have been nothing short of amazing. But, alas, “streamlining” is the buzzword of shitty design. And that’s what we get nowadays.

Then, Elder Scrolls Online? No one cares about that, and it still has no vision of itself and what it is supposed to be. Yet Another Card Game (announced with a handful of concept art screens)? No thanks. Another program launcher to launch programs? No thanks. Dishonored 2? Just a CGI trailer, no gameplay being shown, nothing at all being said about the actual game besides the fact it has an option for a female protagonist.

And there’s Doom. Could be worse. This tweet already shows a lot. The idea of an “editor” that simply lets you paste together pre-made rooms is fucking terrible if they don’t also give players a real editor where you can edit actual geometry, and from what I read on Twitter this is very unlikely. So it’s basically all a fraud. Pasting together rooms is fun and exciting for about 10 minutes. That’s not what made Doom map-making popular.

At least they did get the fact that Doom’s core is about gameplay and you can’t turn it into a scripted, linear game. So: focus on gameplay, action and atmosphere. The problem is that what they’ve shown so far is problematic. I also thought that even if they pull that off, they’d basically have Bulletstorm.

That’s also a problem. That demo looked “nice”, but a kind of nice that could easily be done with the Unreal 3 engine (and it would probably run immensely better than whatever tech id has now). The level design was really poor, and that’s THE feature of a Doom game. And only a handful of monsters on screen. That means they are probably on the right track, but the game currently is very shallow.

EA presentation: utter shit. Nothing to comment.

Ubisoft: look at The Division. This year they replaced their fake/fraud gameplay trailers with one that is actually in-engine and running. It’s obvious because they announced it as playable at the show, and because you can observe some flickering textures (glitches are usually a good sign of non-doctored footage). The problem is that it looks immensely downgraded from the previous two years’ demos. And now that it looks barely average it has also lost most of its coolness.

Then Ghost Recon. This is interesting because of the interplay with The Division. On one hand we have a game that comes out of its “bullshit trailer” phase to show real footage, on the other we have a game that is only now entering the bullshit trailer phase. The effect is that people suddenly stop caring about The Division, while Ghost Recon steals the show.

Until Ghost Recon also exits its bullshit phase, I guess.

Then Sony. They obviously win by the biggest margin imaginable, but. The Last Guardian isn’t going to awe us with a scripted jumping scene. Is that Uncharted? If depth of interaction is replaced by scripted scenes then we simply get a shitty game.

Final Fantasy VII. Besides the universal hype, there’s the fact they’ve shown nothing at all. The CG wasn’t even THAT good without that brand attached. So? That’s a game that can be a total failure or a total success depending on how you do it. And no one can guess how they are going to do it. A good idea would be to keep it classic, turn-based, but with polished and deep systems. Streamlining here will simply kill it. Making it “realistic” or real-time would kill it too. So the way it would work great is to do what Pillars of Eternity did for Baldur’s Gate: something that at least aims to enhance the core, maybe radically, but without transforming it. So, great idea, but with very good chances of fucking it up.

Shenmue? Besides the ridiculous contradiction of an “indie” Kickstarter begging for money on the biggest stage (it’s like EA launching a Kickstarter for the next FIFA game), this is still a game series that was built on the hugest budgets imaginable. How can that work out for a respectable Shenmue 3? No idea, no explanations given. Just BELIEVE!

Horizon? That looked fucking cool. And because I’ve seen this happen before, and because this is the Killzone team, it’s also a load of bullshit. So wait a year (or two) to see what the real game looks like.

See the pattern? This E3 is about two things:

1- It is about a lot of hype for vaporware games. Games only shown as CGI, that barely exist as ideas. That’s how hype works: if you show something completely undefined, people will complete it with their wishes and imagination. That is going to be nothing like the final product. An endless cycle of hype and disappointment.

2- Most of the other games, the ones that are real, are games that look backward instead of ahead. From Doom, through The Last Guardian, to Shenmue, we only have classics of the past. But are they connected to something? Do they know their roots or are they just aping their own identity?

No new game seems to actually show you something that you can REALLY hype. Something you see and think is INDISPENSABLE right now. Battlefront looks good, but it’s Battlefield with Star Wars skins.

The industry seems only able to look backward and dig out old bones. Having a history is extremely important, but what I observe is BOTH a critical failure in understanding what’s good in what was left behind, and another failure of vision about what lies ahead.

And only hardware and graphics can brute-force things onward.

The Witcher 3 and technical exploitation

There’s still quite a bit of noise on the Internet between those who keep protesting about downgrades and those who think the game looks beautiful and the graphics aren’t the whole deal anyway. Without noticing that the two positions aren’t antithetical, but compatible.

And because I believe they are compatible, something still needs to be said about this downgrade, in particular to show how very, very far the game ended up from where it was expected to land:

1- CDP always denied a downgrade, or even the possibility of one.
2- Just a couple of weeks before release they claimed the game would run on Ultra PC settings + HairWorks enabled at a locked 60 FPS.

In the end the two individual claims were not just completely false, but DELIBERATELY false. Intended to be false in order to misdirect customers. But not only that: it’s the SUM of the two that should tell you how far they are from their claim.

Look at this Eurogamer article:

If you scroll you can see:
Core i7 4790K, Ultra settings, no HairWorks, 1080p (Low / Avg FPS)
GeForce GTX 980 4GB: 49.0 / 65.8

Things to say about The Witcher 3

I have lots of stuff in the archive to put on BOTH blogs but I keep delaying that. For now here’s some bitching about The Witcher 3, even though, all considered, the engine is not too bad.

– There was a downgrade (between the real game and the old trailers) and it was significant. All areas were hit: the lighting is different, the vegetation completely changed, the clip plane got murdered, object geometry was reduced, detail pop-in increased, and NPC density dropped. While the real game looks pretty, it still looks like a pretty, modern computer game, as opposed to the visual marvel the trailers showed. To anyone with minimal technical competency those trailers were very obviously “frauds”; you didn’t have to wait for the game to be surprised that that sort of quality is out of reach.

– Until a week ago the official word was: “fixed 60 fps ultra preset on a 980”. The actual truth is quite far from that. With everything maxed a 980 averages around 40-50 fps, depending on how demanding/rigorous the benchmark is. In any case, this is a game that had multiple delays; before the latest ones it was expected for early 2014, so this was a game PLANNED to run on hardware that precedes Nvidia’s 9xx class, and this BEFORE the downgrade. So take the best-class videocard at that time, a 780 Ti. It means they wanted us to believe that a 780 Ti could run this game well with the kind of graphics shown before the downgrade. Yeah.

770 class videocard = 30 fps at high, maybe with some non-intensive ultra
970 class videocard = 60 fps at high/ultra mix (Hairworks off)
980 class videocard = 60 fps ultra (Hairworks off)
PS4 usually at 30 fps, with mostly high settings (foliage is high, shadows medium, worse LOD but it looks like shadows on the long range are better than on PC regardless of setting)

Watch Dogs and AC: Unity have seen massive outrage from the public, and that should tell you how much perception counts. The Witcher 3 is still rather buggy, seems to crash a lot on PC (especially at weird points, like looking through the inventory), and has seen a more significant downgrade than those games. Yet it is a better game overall, and especially has a flavor not as trite and shallow as Ubisoft games. Still, everyone is more than willing to turn a blind eye to the many technical problems (and the gameplay, and the very bad controls).

So, on one hand it’s understandable that The Witcher is much better received despite its technical problems. It is largely the game everyone expected, whereas Watch Dogs and AC:U were both technical failures on top of VERY stale (and for the most part shallow, uninspired) design. On the other hand CDP is too much of a fan favorite despite not really having earned that position.

The engine does one good thing, and that is its overall low CPU usage. Yet it’s incredibly taxing on the GPU, to unjustifiable levels. Why? What people see on screen is mostly those red light tones, and vegetation. Vegetation runs on middleware, an engine CDP bought. It’s literally about buying an engine and then buying individual trees from a library they offer you.

SpeedTree, the engine that powers trees, grass and wind effects, has had really, really bad performance since its early days of integration with that other bad piece of middleware, NetImmerse/Gamebryo. SpeedTree is what you saw in Oblivion and DAoC. Same engine. Sure, it can look pretty, but that’s all your processing power being spent on a poorly optimized vegetation simulator.

So no real surprise at The Witcher running so badly. It’s a bit like Hairworks, but mandatory for the experience.

(Another technical detail: if you look, the benchmarks largely favor AMD cards. This is totally unexpected since this is a Nvidia-optimized game and AMD didn’t even release a custom driver for it, as Nvidia did. So what’s the reason for AMD winning so easily without even putting in an effort? SpeedTree again. It’s likely that wind physics and other physics simulations run on compute, and traditionally Nvidia cards are bottlenecked way more than AMD on that specific workload.)

Besides that, there’s significant pop-in of objects and actors, delivered through that ugly dithering effect that seems to be a DX11 default.

Just think if, instead of those gorgeous trailers that made CDP win so many awards even when the game was years from release, they had shown this instead (specifically from 1:25 to 1:48):

Entire buildings warping and morphing in and out of existence, textures missing. And that’s the patched game. I’m quite sure that if they had shown that footage they wouldn’t have won so many prizes.

And that’s the problem. Looking back, they won. This practice of being deliberately dishonest and going for whatever is convenient in the moment has paid off. They get glowing reviews all over the place, with only minor mentions of bugs and technical problems.

Anyway, you might still want to enjoy the game for what it offers. If you have a 770 class videocard (or better), and you still want the best performance without losing video quality where it matters, here are my tips:

– Turn OFF that pointless in-game sharpness filter and anti-aliasing (we’ll replace them with SMAA, see below)
– Turn OFF Hairworks (it’s not worth the watts even if you have the hardware for it)
– SSAO looks better than HBAO. Use SSAO for better performance.
– Foliage range HUGELY affects performance, especially going from high to ultra. So keep on high.
– Foliage density has a small impact, but “medium” actually looks better than “ultra”. Use medium.
– Shadow quality medium looks decent enough. Use medium. (LOD levels are unaffected regardless of setting)
– Ultra/High texture quality: no difference at all. Ultra only caches more textures. (So 2GB cards are perfectly fine on high.)
– Set other settings like water detail, detail level, terrain, all to “high”. Most of these have currently no effect at all, so it’s all placebo.

THEN (the following tweaks come at very little performance cost).

Download latest SweetFX:

SweetFX 2.0 (Preview 8 | ReShade 0.18.4)

– Open the archive and dump the contents into the Witcher’s /bin/x64/ directory, where the game .exe is.
– Rename the ReShade64.dll file to dxgi.dll
– Run the game once to make sure it works (as soon as the game starts you should see some text saying it’s running)
– Exit.
– Go into the SweetFX directory and edit SweetFX_settings.txt; look at all the effects and make sure only USE_SMAA and USE_LUMASHARPEN are set to 1 (see the excerpt a few lines down)
– Scroll down to the LumaSharpen part and make sure the numbers are like this:

(I’ll have to check later for actual numbers, for now keep at default or tweak as you like)

If you want to tweak more you can always alt-tab, edit and save the .txt, and tab back into the game. The settings are reloaded dynamically as soon as you save the .txt.
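For reference, the effect toggles at the top of SweetFX_settings.txt should end up looking roughly like this. The exact list of effects depends on the SweetFX version, so treat the names below as illustrative; the point is simply that everything stays at 0 except the two effects above, and the LumaSharpen values stay at their defaults as noted.

#define USE_SMAA          1   // the anti-aliasing replacing the in-game AA we turned off
#define USE_FXAA          0
#define USE_LUMASHARPEN   1   // the sharpening used to improve the look of grass
#define USE_BLOOM         0
#define USE_HDR           0
#define USE_VIBRANCE      0
// ...leave every other USE_ effect at 0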

Some screenshots.

Vanilla: http://www.cesspit.net/misc/w3a.jpg
Dark Souls edition: http://www.cesspit.net/misc/w3b.jpg
Vanilla again: http://www.cesspit.net/misc/w3c.jpg
Final tweaks: http://www.cesspit.net/misc/w3d.jpg

(Dark Souls edition means I simply used the profile I was using for that game. The final tweaks are more conservative and only apply proper anti-aliasing with some sharpening to improve the look of grass, if you squint)
