I think it's a safe bet The Witcher 4 will run just as well as its tech demo after talking with UE5 devs and analyzing Unreal Fest tech data

The Witcher 4 Unreal Engine 5 Tech Demo 4K | State of Unreal | Unreal Fest Orlando - YouTube

Unless you happened to be visiting the far side of the Moon these past few weeks, you can't have missed the Witcher 4 tech demo, which opened Epic's State of Unreal keynote at Unreal Fest 2025 in Orlando. The developers—CD Projekt Red and Epic Games—were at pains to point out that the graphics and tech showcased in the demo were works in progress, and that you shouldn't use them to judge what The Witcher 4 will eventually be like.

That didn't stop the masses on social media from expressing serious doubt that the game will look anywhere near as good and, more importantly, that it will run at 60 fps on a standard PlayStation 5. The sorry state of Cyberpunk 2077 at launch certainly hasn't faded from anyone's mind just yet.

But after attending Unreal Fest, sitting through various tech presentations, talking to developers and Epic themselves, I'm quietly confident that what we saw in the keynote will actually be pretty close to what we'll get in The Witcher 4 when it appears at some point in the future.

Of course, I don't expect anyone to just take my word for it, so let me walk you through some of the evidence I used to make this judgment. Before I start, though, let me be clear on one thing: The Witcher 4 will almost certainly have bugs, performance issues, and whatnot at launch—no AAA game ever releases in a perfect state—but the graphics you saw, all running at 60 fps? Yes, that's definitely possible, even likely.

There was a specific presentation at Unreal Fest, titled The Road to 60 fps, in which Epic's lead rendering programmer, Kevin Ortegren, and CDPR's expert core tech engineer, Jarosław Rudzki, went through every aspect of the demo's graphics and performance, breaking down what they wanted to achieve, how they went about it, and how they managed to reach the desired performance target.

Key technologies were Unreal Engine's virtual geometry system, Nanite, along with its ray-traced lighting system, Lumen, here running in its hardware-accelerated mode. We're all familiar with both of these, as just about every UE5 game that's appeared in the past two years has used one or both of them. We're perhaps also familiar with the superb graphics they can generate, along with the steep system requirements and choppy performance.

But it was actually a different presentation, one on Unreal Engine's animation framework in The Witcher 4 tech demo, that caught my eye, as I sat amongst the host of developers, animators, and 3D artists attending Unreal Fest. There was one slide that showed the various processing times for the entire run of the tech demo, including frame time, GPU render time, and thread times for the graphics and animation.

(Image credit: Epic Games/CD Projekt Red)

The legend in the top right-hand corner isn't easy to read, but the top blue line is the frame time, and you can see that it's almost constantly at 16.666 milliseconds for the entire demo. Convert that into a frame rate (divide 1000 by the time) and you get 60 fps. Just beneath it is the GPU render time, and the fact that it's practically the same 16 milliseconds shows that the demo is entirely GPU-bound.

That's a good sign because it means there is plenty of scope to add gameplay processes that don't involve the GPU. This point is backed up by the three thread times—the pale blue one is the central 'game' thread—because even at their slowest point (around 13 milliseconds) during the market scene, they still don't come close to making the demo CPU-bound.
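To make the arithmetic concrete, here's a minimal C++ sketch of the analysis above. The numbers are illustrative, not values read off the actual slide: convert the frame time into a frame rate, then compare the GPU time against the slowest CPU thread to see which one the frame is actually waiting on.

```cpp
#include <cstdio>

int main() {
    // Illustrative timings, based on the rough figures quoted above
    const double frameTimeMs  = 16.666; // top blue line: total frame time
    const double gpuTimeMs    = 16.0;   // GPU render time
    const double gameThreadMs = 13.0;   // slowest CPU thread (market scene)

    // 1000 ms divided by the frame time gives the frame rate: ~60 fps
    std::printf("Frame rate: %.1f fps\n", 1000.0 / frameTimeMs);

    // Whichever of the GPU and the slowest CPU thread takes longer is
    // what the frame is waiting on; here the GPU dominates, leaving
    // CPU headroom for gameplay code
    if (gpuTimeMs > gameThreadMs)
        std::printf("GPU-bound: %.1f ms of CPU headroom per frame\n",
                    gpuTimeMs - gameThreadMs);
    else
        std::printf("CPU-bound\n");
    return 0;
}
```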

Further evidence for this can be seen in a slide from the Road to 60 fps presentation, which shows the processing times for all the different threads for one frame during the market stage. First of all, note how many individual instructions are spread across the various threads (1x Game, 1x Render, 1x RHI, 2x Foreground, and 10x Background). So what, you might think. Well, take a closer look at it all.

A lot of things concerning graphics, including ray tracing instructions, get done very early on before moving on to the bulk of the work, which is stuff relating to the handling of all those NPCs and their animation. Then there's a relatively big gap of little CPU processing while the GPU executes the rendering before the frame is finally done.
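The underlying pattern here is a familiar one: split the per-NPC work into batches and hand them to background workers, so the game thread isn't stuck updating thousands of NPCs one after another. Below is a deliberately simplified C++ sketch of that idea using plain std::thread. It isn't Unreal's actual task system, and the crowd size and 'animation' update are stand-ins.

```cpp
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// A stand-in for a real NPC with animation state
struct Npc { float pose = 0.0f; };

// Update one contiguous batch of NPCs; placeholder for real animation work
void UpdateBatch(std::vector<Npc>& npcs, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i)
        npcs[i].pose += dt;
}

int main() {
    std::vector<Npc> npcs(3000);   // a market-scene-sized crowd (made up)
    const unsigned workers = 10;   // mirrors the 10 background threads
    const float dt = 1.0f / 60.0f; // one 60 fps frame step

    // Farm the batches out to background worker threads
    std::vector<std::thread> pool;
    const size_t per = npcs.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const size_t begin = w * per;
        const size_t end = (w == workers - 1) ? npcs.size() : begin + per;
        pool.emplace_back(UpdateBatch, std::ref(npcs), begin, end, dt);
    }

    // The game thread waits for the animation work, then hands off to rendering
    for (auto& t : pool) t.join();
    std::printf("Updated %zu NPCs across %u worker threads\n",
                npcs.size(), workers);
    return 0;
}
```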

It was mentioned a few times in the presentations, and by other people I spoke to, that there was plenty of room to not only improve thread times and management, but also utilise the CPU's 'downtime' even more effectively.

The second image, showing the CPU core utilisation, demonstrates this more clearly, and the third one shows how much better things get when you drop a third of the NPCs. In other words, there's either room already or there's scope to switch things around to give sufficient breathing space for all the usual game code not present in the tech demo.

And it's not just about the CPU, as the graphics pipeline is already pretty well optimised.

You might think the use of hardware-based Lumen is going to be a nail in The Witcher 4's performance coffin, but again, that's not the case. For the Unreal Engine 5.6 update, Epic improved how well HW Lumen runs, and the GPU profiling times for the market scene show just how little time the ray-traced lighting system takes up in the whole frame.

It also helps that while the final output is 4K, the tech demo runs two upscaling stages, with the primary frame rendering done at between 800p and 1080p, dynamically, and then the post-processing stage is handled at 1440p.
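As a rough illustration of how those stages stack up, here's a hedged C++ sketch: a toy dynamic-resolution step (the clamp range matches the demo's stated 800p to 1080p window, but the adjustment heuristic is my own invention), followed by the fixed 1440p post-processing and 4K output stages.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    const double budgetMs = 16.666; // the 60 fps frame budget
    const double gpuTimeMs = 18.2;  // pretend the last frame ran long
    int renderHeight = 1080;

    // Toy dynamic-resolution step: scale the render height by how far the
    // GPU is over or under budget, clamped to the demo's stated range
    renderHeight = static_cast<int>(renderHeight * budgetMs / gpuTimeMs);
    renderHeight = std::clamp(renderHeight, 800, 1080);

    const int postHeight = 1440;   // post-processing stage resolution
    const int outputHeight = 2160; // final 4K output

    std::printf("Scene: %dp -> upscale -> post: %dp -> upscale -> output: %dp\n",
                renderHeight, postHeight, outputHeight);
    return 0;
}
```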

Console games have typically rendered at a low resolution and used a checkerboard technique to get a high-resolution output, but of late, temporal upscaling has become the favoured solution, no doubt because that's what gaming PCs use.

I don't think it's possible to do a dual-upscaling sequence with FSR, DLSS, or XeSS, so this might be one aspect of the PC version of The Witcher 4 that doesn't go as well as it does for the console version. Then again, many gaming PCs have a better graphics processor than that in the PS5 and Xbox Series X, so it might not be an issue at all.

Epic did admit that the tech demo wasn't perfect, even though it hit a consistent 60 fps, and the company talked about how hitches (where the processing time exceeds 17 milliseconds for one or two frames) are challenging to deal with.
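In numbers, a hitch looks like this: one isolated frame spilling past the roughly 17-millisecond threshold while its neighbours hold the 16.666 ms budget. Here's a trivial C++ illustration with made-up frame times.

```cpp
#include <cstdio>
#include <vector>

int main() {
    const double hitchThresholdMs = 17.0;

    // Made-up frame times; frame 3 is the hitch
    const std::vector<double> frameTimesMs =
        {16.6, 16.7, 16.6, 21.3, 16.7, 16.6};

    for (size_t i = 0; i < frameTimesMs.size(); ++i)
        if (frameTimesMs[i] > hitchThresholdMs)
            std::printf("Hitch at frame %zu: %.1f ms (%.0f fps momentarily)\n",
                        i, frameTimesMs[i], 1000.0 / frameTimesMs[i]);
    return 0;
}
```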

For example, the demo's engine relies on previous frame data to work out occlusion culling (the removal of objects from the rendering sequence based on how visible they are), but if there's a sudden change in environment, camera position, etc, then all that previous data is useless.

What you then get is something called overdraw, where the engine pushes everything out for rendering, because it can't cull properly. Epic's solution was to use ray tracing information to generate a low-quality frame that effectively primes the occlusion culler with enough detail to chop the overdraw right down.
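Here's a loose, hypothetical C++ sketch of that logic. The priming pass is faked as a simple function, since the real thing involves actual ray tracing, but it shows why stale previous-frame data causes overdraw and how even a coarse visibility guess chops it down.

```cpp
#include <cstdio>
#include <vector>

// A scene object with its visibility result from the previous frame
struct Object { bool visibleLastFrame = true; };

// Stand-in for the low-quality ray-traced priming pass: pretend the rays
// found roughly 1 in 10 objects to be visible after the cut
bool CoarseRayVisibility(size_t index) { return index % 10 == 0; }

int main() {
    std::vector<Object> scene(1000);
    const bool cameraCut = true; // sudden change: previous-frame data is useless

    int drawn = 0;
    for (size_t i = 0; i < scene.size(); ++i) {
        const bool visible = cameraCut
            ? CoarseRayVisibility(i)       // primed guess after the cut
            : scene[i].visibleLastFrame;   // normal previous-frame path
        if (visible) ++drawn;
    }

    // Without the priming pass, a cut would force all 1,000 objects out for
    // rendering (overdraw); with it, only ~100 get submitted
    std::printf("Submitting %d of %zu objects for rendering\n",
                drawn, scene.size());
    return 0;
}
```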

Now, I know what some of you are thinking. "Err Nick, this is a tech demo. It's designed to be as flawless and fancy as possible, but that still doesn't mean the final game will be like this." I'll admit that such a line of thought is perfectly valid, and The Witcher 4 could well end up like v1.0 Cyberpunk 2077.


However, I do believe that there is enough evidence to suggest that, while still possible, it's less likely that Ciri's new adventure will be a hot rendered mess on first release. Some of that is because CD Projekt Red no longer has to create an entirely fresh engine and set of frameworks for the game—that's mostly Epic's job, although the two companies are clearly working closely together to make The Witcher 4 the poster child for Unreal Engine.

The rest of the evidence, beyond that which I've covered above, is word of mouth, from people who either didn't know I was press or were talking off the record. I know that's hardly compelling, but it did add weight to my impression that this was more than a mere tech demo: a genuine tease of what we can expect from The Witcher 4.

In fact, I'd go as far as to say that CDPR were only heavily stressing the demo aspect of it all because they know gamers are still a bit raw about Cyberpunk 2077's launch state, and they wanted to temper expectations accordingly.

It could all go pear-shaped over the coming years, of course, but with Epic and Unreal Engine looking stronger than ever, I honestly don't think that will happen. The game itself might ultimately not reach Witcher 3 levels in terms of story, gameplay, and vibe, but number four will almost certainly look and run considerably better than its predecessor.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?
