Nonsense, or only for a very small definition of "many".
No, Daniel Owen did a pretty extensive examination of this. If you run at ultra, many (the traditional definition of many) of the latest graphically intensive games will be impacted to various degrees with 8GB VRAM.
For example, Horizon Forbidden West gets a whopping 44% advantage in 1% lows at 1080p very high, indicating dramatically smoother gameplay. That's easy to miss because the average is only 9% higher. Drop to 1080p high settings and the discrepancy disappears. Resident Evil 4 and path-traced Cyberpunk were also very memory-hungry. Lots of other games showed a 10-15% advantage in 1% lows, indicating they'd like more VRAM but can cope for now. The effect is more prevalent at 1440p than at 1080p, and DLSS framegen eats a ton of VRAM on top of that.
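To make the average-vs-1% low distinction concrete, here's a minimal sketch of how a 1% low figure is commonly computed from per-frame render times (reviewers' capture tools differ; this is not necessarily Daniel Owen's exact methodology, and the numbers are made up). The point is that a handful of long VRAM-spill hitches barely move the average but crater the 1% low:

```python
# Minimal sketch (not any reviewer's exact methodology): derive average FPS and
# "1% low" FPS from per-frame render times in milliseconds, using the common
# convention of averaging the slowest 1% of frames.
def fps_metrics(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    one_percent_low_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, one_percent_low_fps

# Hypothetical run: 990 smooth frames at 10 ms plus 10 hitches at 40 ms
# (e.g. from spilling textures over the PCIe bus when VRAM runs out).
times = [10.0] * 990 + [40.0] * 10
print(fps_metrics(times))  # ~ (97.1 avg FPS, 25.0 one-percent-low FPS)
```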
He tested Alan Wake 2 path-traced too. On the 4060ti 8GB/16GB cards he was using, at 1080p DLSS quality he found a 35% advantage in 1% lows for the 16GB card. Much smoother.
That isn't necessarily a problem, except $400+ GPUs are still being sold with 8GB VRAM, and people would reasonably expect them to run 1080p balls to the wall. But they won't. I certainly wouldn't recommend buying an 8GB GPU in 2024 unless it's a great deal and you know what you're getting. Today it only bites at ultra, but eventually high settings will be impacted too, and who wants to be bothered?
We had an "8GB VRAM is enough for anybody for now" thread here on Ars and I agreed with you at the time, a year or two back. I figured with so many 8GB cards on the market, developers would be forced to support them. And to be fair they still are, just not at ultra or with path-tracing.
Regarding a theoretical $800 RTX5080 with RTX4090-level performance: that would be attractive. At $1000, much less so.
View: https://youtu.be/nrpzzMcaE5k?t=729