Hogwarts Legacy, Forspoken and Co.: Too little video memory for games after all – the future has caught up with us! A comment


February 8, 2023 update: Times are changing. What is modern today may already be outdated tomorrow. Loyal PCGH readers know that we always make our hardware recommendations with an eye on the future, so you don't back the wrong horse. This is sometimes seen as exaggerated, and it doesn't always pay off for those who upgrade their graphics card frequently – but the longer you use a graphics card, the more it pays to invest in a model we recommend. Did you buy a Geforce RTX 3060 12GB, Geforce RTX 3080 12GB, Radeon RX 6700 XT or Radeon RX 6800 XT instead of a Geforce RTX 3060 Ti or Geforce RTX 3080 10GB? Then now, at the latest, is your time to celebrate.

In May 2022 we put our somewhat hasty conclusions on record (see also the original article below); now the future predicted back then has caught up with us. The forerunners of the new generation of games – Hogwarts Legacy and Forspoken – expose the limits of every memory-stingy model. Yes, we mean you, Nvidia (and your product managers)! Fundamentally, the recent VRAM problems in Hogwarts Legacy, Forspoken and the Dead Space remake are neither new nor surprising – but their intensity is higher. Graphics cards with 8 GiByte were counted out long ago, but now the knockout is near. The generation change was delayed by various factors (Covid, mining, component shortages), but 2023 will bring the shift with many next-gen games. There will no doubt be some titles this year that are perfectly optimized and consequently run decently on older graphics cards, but the course is set. While we wait for final drivers for Hogwarts Legacy so we can present comprehensive benchmarks, we warmly recommend our freshly updated VRAM advice for 2023.


Two good reasons for a lot of graphics memory

Hogwarts Legacy: PCGH recommends 32 GiByte of RAM plus 16 GiByte of graphics memory for maximum enjoyment. If one or both are lacking, there is more or less pronounced stuttering. [Source: PC Games Hardware]

Forspoken: The game has – despite all the prophecies of doom – beautiful moments. We recommend at least 12 GiByte of VRAM plus 16 GiByte of RAM. [Source: PC Games Hardware]

Long story short, not every one of our predictions comes true, but time is working against hardware that is trimmed only for the present. It's good to have a buffer for emergencies and the future. Or do you see it differently? Are you on team "Play it safe", team "Upgrade more often" or even team "Why always ultra details"? Discuss with us!


Original article of May 22, 2022 (Insufficient video memory for games: I was probably a bit premature – a comment by Raffael Vötter): "I thought I was wrong once, but I was mistaken." I'm not that type. It's good to be able to compare earlier statements with new information and then admit that things turned out differently. In my everyday work, the combination of a large number of samples and experience saves me from overly bold statements that don't hold up. That experience, which allows the facts to be put into perspective, has to be applied in a well-measured dose as a multiplier. When I looked at the "graphics cards vs. games" topic, I anticipated a different development curve from the one that actually materialized.

At the end of 2020, the signs pointed to a new beginning: brand-new GPUs from AMD and Nvidia as well as a fresh generation of consoles promised a wonderful next-gen future. What we got instead has been rather unspectacular to this day. Sars-CoV-2 is still raging, while component shortages, crypto nonsense and now a completely unnecessary, brain-dead war are slowing the spread of new hardware. Without new hardware in players' systems, games keep being optimized for old components that have been installed millions of times. That also has an upside: graphics cards that I considered to have no memory headroom at the end of 2020 are still properly usable (8 GiB) to very usable (10 GiB) today.

When writing this text, I am of course looking at Nvidia's Geforce RTX 3080, which was undoubtedly fast when it was released, but had less memory than the previous top model, the RTX 2080 Ti. No, I still won't accept Nvidia's official justification – that the RTX 3080 is the successor to the RTX 2080 – as an excuse for the lack of progress. The years that followed played into the hands of the RTX 3080 10GB, and Nvidia is really good at delaying deficiency symptoms with the help of aggressive memory management. But postponed is not canceled. From my latest tests I can roughly deduce that 10 "Geforce GiByte" behave about the same as 12 "Radeon GiByte". If the memory actually fills up due to a high load that cannot be streamed, driver tricks no longer help either. The RTX 3080 10GB, RTX 3070 (Ti) and the like simply have too little capacity for their performance. There are good reasons why most new graphics cards come with 12 GiByte. Nvidia knows that too: the RTX 2060 12GB, RTX 3060, RTX 3080 12GB and RTX 3080 Ti are clear proof and are all "safe" on the memory side. That AMD has understood the value of generous memory should have been clear since the Fiji disaster at the latest. The Radeon makers do forget it at times (see the RX 5600 XT and RX 6500 XT), but those are not high-end cards, they are deliberate cost-cutting models.

The statement that the next generation of games will blow away many a model built with no memory headroom is still valid. Now for real, you could say, because things are finally starting to roll. The first games are cutting their old-gen ties (no more Xbox One and PS4 support), Unreal Engine 5 is ramping up, and DirectStorage is a major innovation in data streaming. If the past two years have taught us anything, it's not to be hasty. DirectStorage in particular won't be implemented overnight – we are talking about many years until widespread use. So was I premature? Yes. Was this development foreseeable? No. Is a graphics card with more memory better, more durable, more future-proof and more valuable when resold than one with less memory? That's a rhetorical question.

Source: www.pcgameshardware.de