After building my PC, it didn’t take long for the problems to start. Strange green screen crashes greeted me with regularity. The card sounded like a jet engine taking off every time I played anything slightly demanding, and it ran so hot that it made the area around it boil. My computer was so loud that I had to move it as far away from me as I could. Granted, it was an aftermarket card, but even the reference 5700 XTs had plenty of reports of excessive heat and loudness. I’d occasionally lose picture for no apparent reason. After some time, the drivers improved things a bit. But I’d still routinely have games crash on me.
Not just that, games had a tendency to shut the entire PC off along with them. I even had crashes from watching Netflix. To put it mildly, I rued the decision. Sure, I saved $130, but I would have saved myself much grief had I just bought that 2070. I likely would have jumped on that soon, but, as we all know, the pandemic happened. All of a sudden, the $500 card I was too cheap to buy massively jumped in price, and finding one became difficult. I used that 5700 XT for two whole years and it gave me giant problems the whole way through.
Hate to walk away
To be clear, I can think of three games in the last six months alone that I had to cancel review plans for because of that card. They’d crash repeatedly, taking my whole PC with them. Not only did this make reviewing certain things borderline impossible, but I was worried that one of those times would end with me begrudgingly getting up for a reboot, only to find that the thing wouldn’t boot up at all. Not that what I sometimes got was far off from that. After one of the “your whole PC shut down” crashes, the AMD drivers would occasionally freak out again on restart, leading to yet another crash and reboot.
This happened several times. I learned something very interesting about game development this way, though. The thing is, since Nvidia cards have the majority of the market share, AMD cards are simply not used in development by indie devs as much. And those indie devs sometimes don’t (or can’t) do as much optimization as they need to. Thus, playing smaller games, especially pre-release, meant that I was going to keep seeing crashes for as long as I used that card. Naturally, this didn’t extend to AAA games. Those devs had the cash to test and optimize so that as many people as possible could play without issue.
Now, I’m the Reviews Editor here at PC Invasion, and I review a lot of indie games. To say that this combination made me nervous is an understatement. Every time I went to install a new indie game for review, I braced myself for crashes. The last game I had to apologize over and reassign to someone else led to the devs asking me for logs and info. According to a developer who looked into one of their games crashing my PC, it’s an issue that Unity games are more likely to run into, potentially related to AMD hardware. With this line of Navi cards, this isn’t exactly unheard of, as a great many people have also had routine issues with games crashing.
The light at the end
I was positive this was going to continue for the foreseeable future. I mean, how was I going to get a new card when they were so hard to find? This has been an especially difficult year for me, so I was far from hopeful. But the stars aligned and, shockingly, things turned out fantastically. I feel like I used up multiple years’ worth of accrued luck, but I was able to start using an Nvidia card again.
In the weeks since I started using it, I have had two crashes. Both of them were Cyberpunk 2077 being a dumpster fire made out of bugs. Neither of them crashed my PC. Both were followed up by CD Projekt Red asking me to send a bug report. It’ll probably still take them two years to fix the problem.
The positivity doesn’t stop there. I no longer have to hide from my GPU either, as this one is nice and quiet. I’m not even scared when I boot up indie games anymore. On top of that, I’ve finally gotten to see what ray tracing is all about. I can’t tell you how much of a relief it is to have a stable computer again, especially in this line of work. Of course, the old card threw one last nasty surprise my way when it got stuck in its PCIe slot. I actually had to rip out the plastic tab on the slot to even get it free, due to shoddy motherboard design. Not the card’s fault, but it seemed an appropriate final farewell.
“You get what you pay for” is an old adage that I tend to roll my eyes at. But, hey, it’s true. I know AMD is continually improving its GPUs and the current batch is supposed to be its best yet, but I don’t think that’s a pool I’ll be wading into again any time soon. I have heard some excellent things about AMD’s newest line, but time will tell how nicely the drivers end up playing with games. So, yes, I think I’ve learned my lesson this time. It might take a few years for GPU prices to go back to normal, but I’m going to be sticking with Nvidia on that front. I really can’t take years of crashes again. For this job and, yes, for my massively dwindled sanity.