Well, I've got kind of a game-related personal story for you guys.
I think I'm probably the biggest gamer on this board, but I've always been more of a console gamer than a PC gamer. I use my PC more for work than play, and I've always been fine with buying a pre-built machine that could only play the lower-end games. My first three computers were HP laptops, bought when I was still moving around a lot. They lasted an average of three years each, which was fine. When my third laptop started showing warning signs, I finally bought my first tower. That was about a year ago. It came with an awesome 4th-gen i7 4770 processor and 16GB of memory, but with an anemic GeForce GT 625 graphics card and a 350-watt power supply. Still, for the types of games I typically play on PC -- Diablo 3, Marvel Heroes, and Crusader Kings 2 -- I could still run everything I really wanted to play at my native resolution of 1080p; I just had to crank all the other settings down to minimum. Not being a graphics snob, that really didn't bother me much. But it did limit the types of games I could buy for PC. I never even considered getting something like Witcher 2, even though I was pretty sure I'd love it. I just knew my rig couldn't handle it.
Well, recently a game was released on Steam called Divinity: Original Sin. It's an isometric RPG (kinda like Diablo), but with turn-based combat and a D&D style of gameplay. It looked right up my alley, so I went ahead and bought it. It didn't seem like a game that would demand a lot of graphical power. But when I fired it up, I discovered that even with all the other settings on minimum, I couldn't get the game to run at a smooth 30fps at 1080; I actually had to bump the resolution down to 720. It frustrated the hell out of me, because I just didn't expect to have this problem with this game. I was forced to either play it non-native and have it look blurry as hell, or play windowed and have everything be very small. Grr.
It was the straw that broke the camel's back. I realized that I had never in my life had a half-decent graphics card. I wasn't ready to spend crazy money, but I decided I wanted to upgrade to something that wouldn't break a sweat running graphically lower-end games at max or near-max settings. After consulting with my brother, who knows much more about this stuff than I do, I ended up getting a GeForce GTX 750 Ti. Apparently it's one of the first cards to use the new Maxwell architecture, and while it isn't the beefiest card out there, it was really cost-effective ($130 plus tax, pretty good considering the next step up is twice as expensive), and it also happens to use very little power and run very cool and quiet. Even so, my brother told me I'd probably need to upgrade my 350-watt power supply, despite the online specs saying the card could run on a 300-watt supply. So I ordered a 600-watt supply too ($60).
And here's where the story gets a little hairy. After my order arrived from Newegg this afternoon, I went about trying to install everything. Now, I'm not a hardware guy at all. The most I've ever done to a computer is add a second hard drive. And while swapping graphics cards is pretty easy, swapping power supplies is a bit more complicated, since the supply has to connect to a number of different components. Anyway, I fiddled around with it for a while and managed to unplug most of the old power supply's connections from the rest of my rig. But then I hit a problem: I couldn't get the damned 24-pin connector off the motherboard. There was a little plastic thingy you were supposed to depress so that the latch would release, but for me it just wasn't working. After 15-20 minutes of screwing around with it, I decided I'd just try to install the graphics card alone. After all, the online specs said a 300-watt supply would be fine, and mine was 350. Not ideal, but whatever. So I plugged everything back in that I had unplugged from the old power supply, installed the card without any trouble, put the cover back on, went to fire it up and...
It wouldn't turn on. The power button just didn't work.
You can imagine what went through my head at that point. I had backed up everything important the day before (I have a 500GB external drive for this purpose), but I don't have the kind of money it would take to replace my whole machine, and then there's just the hassle of trying to get it fixed, getting a new system up and running... Suffice it to say, I was pretty manic. Definitely not in a happy place.
My brother lives in a whole other timezone from me, so he couldn't help. I didn't know anyone in my apartment complex who knew hardware really well. So I decided to take the whole mess to Best Buy and see if they could sort it out, despite the fact that I'd probably end up paying them $200+ for their time, and who knew if they could even fix it at all. For all I knew, I had fried my motherboard.
But here's where serendipity came in. I was walking out to the parking lot to head to Best Buy when I saw my friend Cathy (older woman, 60s) trying to start her girlfriend's beat-up old beast of a van. She wasn't having much luck (I knew she and her partner had had the thing towed away before when it wouldn't start... they're poor and can't afford a new car right now). Anyway, I knew her partner had some tech experience, mostly software and networking, but still. I stopped and asked her how comfortable her partner was with hardware. She said decent, but that they knew another couple in our complex who were better with hardware than her partner was. Delighted at the prospect of possibly not having to give Best Buy a decent-sized chunk of change, I paid these folks a visit. Luckily they weren't doing anything that couldn't be interrupted.
The guy came back to my apartment to try to troubleshoot this thing. The first thing we did was put the old card back in, but the machine still wouldn't turn on at all. That meant it probably was the old power supply that had failed. Either I'd physically damaged it, or its tiny little brain had been blown out by the new graphics card (contrary to what the online specifications had said, I discovered that the box for the card called for a 400-watt supply rather than a 300... mine was 350). In any case, we next went to install the new power supply. My new best friend was able to unplug that evil 24-pin connector by reaching underneath the latch on the bottom and pulling it up manually, rather than depressing the stupid little button that was supposed to accomplish the same thing. If I had known to do that, I might have avoided the scare I got from all this (then again, I might have busted the new supply, too).
Long story short, we got the new 600-watt power supply installed, and when we tested it, the damned thing finally started. I was overjoyed. But even better, we then unplugged it and put the new card back in with the new power supply, and again it started up fine. Mission accomplished. I started out wanting a better graphics card, but at that point I was even happier that my machine worked at all. The fact that the new components actually work now seems like a bonus next to the nightmare of nearly bricking my whole system.
But at the end of the day, it did all work out, and I got to test out my new rig. And the difference truly is night and day:
https://gpuboss.com/gpus/GeForce-GTX-750-Ti-vs-GeForce-GT-625
Another way to show the difference is that SystemRequirementsLab.com rated my old card in the 60th percentile of Nvidia graphics cards, and my new one in the 92nd percentile.
Games that I used to run on minimum spec at 1080p and 30fps I can now run at maximum spec at 60fps without a problem. I can even max out Divinity: Original Sin with no issue, which was the whole reason I got the card in the first place. And I have to say, it's pretty awesome, for the first time in my life, to not have to worry that I won't be able to run any particular game. It puts all sorts of games in reach that just weren't possibilities for me before now. Probably worth the $200 and the scare I went through to get it all working.
So yeah, that was my day today. Hopefully someone enjoyed reading about it.