(REPOST) My 4 GB of VRAM is being rapidly left behind by the gaming landscape. What happened to the days when an Xbox 360 with 512 MB of unified memory (shared between the CPU and GPU) was enough to do a whole crapload? Granted, console games are better optimized for their hardware, but in theory you could play the same game on PC with about 1 GB of VRAM. Nowadays you need at least 8 GB to run just about every AAA game, unless you want 40-ish fps on low settings. What happened to optimizing the graphics of modern games?

Also, if you compare Xbox 360-era games to modern games, there are no huge leaps and bounds in fidelity like there were from the 90s to the 2000s, mostly just 4K support. For those of us who don't need or want 4K, it looks about the same. How come they can't optimize for lower-end PCs anymore? Why does every spark, drop of water, and leaf need to be processed individually? I know 4 GB of VRAM has its limitations, but holy crap, the Xbox 360 had less than 1 GB!

I know I'm salty because I can't afford anything better, but that still doesn't justify shipping hugely unoptimized games on PC. Heck, I can't even get a decent framerate with the T-Series and a trailer, and before I could do that just fine. In the push for "graphical fidelity," they leave behind anyone with a skinny wallet. It just gets old when games look only slightly better than their predecessors but need twice the VRAM for a half-decent experience.

Spoiler: Xbox 360 vs PC

Even the PS4, with approximately 2 GB of VRAM, still runs modern games at 75+ FPS, granted at lower settings.

Spoiler: PS4 Performance
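For anyone wondering where the VRAM actually goes, here's a rough back-of-envelope sketch (my own assumptions, not from the original post: uncompressed RGBA8 textures with full mip chains; real games use compressed formats like BCn that shrink these numbers by 4-8x, but the scaling with resolution is the same):

```python
# Back-of-envelope: how fast texture memory grows with asset resolution.
# Assumes uncompressed RGBA8 (4 bytes/pixel) plus a full mip chain (~+1/3).

def texture_mib(width, height, bytes_per_pixel=4, mipmapped=True):
    """Approximate size of one texture in MiB."""
    size = width * height * bytes_per_pixel
    if mipmapped:
        size = size * 4 / 3  # full mip chain adds roughly one third
    return size / (1024 ** 2)

for res in (1024, 2048, 4096):
    per_tex = texture_mib(res, res)
    fit_in_4gib = int(4 * 1024 // per_tex)
    print(f"{res}x{res}: ~{per_tex:.0f} MiB each, ~{fit_in_4gib} fit in 4 GiB")

# Rough output:
# 1024x1024: ~5 MiB each, ~768 fit in 4 GiB
# 2048x2048: ~21 MiB each, ~192 fit in 4 GiB
# 4096x4096: ~85 MiB each, ~48 fit in 4 GiB
```

Each doubling of texture resolution quadruples the memory per asset, which is a big part of why budgets jumped from hundreds of MB to multiple GB even when the on-screen difference at 1080p looks modest.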