Well, Nevermind.
A long time ago, I was told that texreduce would improve performance, but I never actually bothered to compare. It does feel like there are fewer hitches and stutters under more extreme conditions (see: This), but it seems that it doesn't actually provide any measurable performance boost <_<. The placebo effect may have something to do with it
>_>
If you're curious, this is my setup:
Code:
A 4 year old eMachines ET1831-05 "Everyday Computing" desktop (not a gaming machine by any means, even 4 years ago):
Single core Intel Celeron 450 @ 2.20 GHz
3.0 GB DDR2 RAM
The motherboard kinda sucks in general for reasons that I don't fully understand
Tiny power supply, only 250W
The desktop didn't even come with a case fan, I had to buy one separately
Added graphics card is an NVIDIA GeForce 6800 (256 MB), which is the downgraded commodity version of the GeForce 6800 Ultra, which came out in 2004. I actually do have a spare graphics card (an NVIDIA GeForce 8600 GTS, overclocked by BFG Tech), which is still a little old, but definitely a step up. The problem is that the power supply can't handle it: switching graphics cards would bring the total power usage slightly above 250W, according to some power-supply-requirement calculator thing. I've also heard that the power supplies eMachines used are less than reliable, so I don't want to risk it
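For anyone curious how those calculator things come up with a number, the basic idea is just summing estimated peak draw per component and comparing it against the PSU rating. Here's a minimal sketch with made-up ballpark wattages (the real calculator presumably uses different, more pessimistic estimates, plus a safety margin, since my rough numbers don't quite cross the 250W line on their own):

```python
# Rough power-budget check, the way online PSU calculators do it.
# Every wattage below is a ballpark guess, NOT a measured value.

PSU_WATTS = 250

# Estimated peak draw per component (all assumptions)
base_system = {
    "Celeron 450 CPU": 35,
    "motherboard":     25,
    "3 GB DDR2 RAM":   10,
    "hard drive":      10,
    "optical drive":   15,
    "fans":             5,
}

gpus = {
    "GeForce 6800":     55,  # current card
    "GeForce 8600 GTS": 75,  # spare card (overclocked, so maybe more)
}

for name, gpu_watts in gpus.items():
    total = sum(base_system.values()) + gpu_watts
    headroom = PSU_WATTS - total
    print(f"{name}: ~{total} W estimated, {headroom:+} W headroom")
```

Real calculators also tell you to leave headroom (something like 20-30%) for capacitor aging and load spikes, which is probably how the 8600 GTS ends up over the line for a 250W unit.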
I do have a dual-monitor setup, but sometimes I'll switch to single-monitor mode for the slight performance boost. For example, I did this when I watched Netflix, before I learned how to manually select a lower buffer bitrate. For whatever weird reason, on this computer, Netflix (with Silverlight; it doesn't happen with the HTML5 player) has a strange bug: whenever the video stream is reloaded (captions / language settings changed, "Allow HD" unchecked, seeking to anywhere in the video), Silverlight hangs and uses up 100% CPU, forcing me to kill the process. Of course, this doesn't save the status of the "Allow HD" checkbox or the position it tried to seek to
A few months ago, I got a laptop that's better for gaming. It was really top-of-the-line... about 2-3 years ago, lol. It's a refurbished Dell Studio 17, upgraded with an earlier-generation Core i7 processor (from an i3 or i5, I'm not sure which)
It has a slight overheating problem, so I normally use my desktop when I can. Besides, my desk is a mess and there's no room for the laptop on it, so it's more comfortable to just use the desktop
Yes, that's right, I'm using a downgraded version of a graphics card from 10 years ago.