I have quite a bit of hardware lying around that I've hoarded over the years or that has been donated to us, anywhere between 5 and 25 years old. I love trying to get older hardware to do things it probably shouldn't be able to do.
To celebrate 4000 subscribers I water-cooled a Pentium 2. To celebrate 5000 subscribers I wanted to do something a little bit strange. I had a random idea in the shower, which is where I come up with all my ideas: I wanted to see if I could upgrade an almost 10 year old all-in-one PC to run modern games in 4K. Let's see what happens.
The donor system is an HP Elite 8000, or 8200, or 8400. To be honest I seriously cannot remember which one it is, but it's old. The original configuration for this system came with either 2GB or 4GB of DDR3 RAM and a first-gen Intel i5 650.
A few years ago I swapped in a spare first-gen Intel i7 860, and that's what's in it now. The i7 860 has a base clock of 2.8GHz and turbos up to around 3.46GHz. It was, and still is, a pretty decent CPU. This system has appeared once before on the channel, when I did the DIY eGPU.
The idea was to remove the stock cooler, put something a little more substantial on it, and fit a full-sized GPU in the system, even though the GPU is almost as long as the whole system and the system itself is a low-profile design. I also had to figure out how to power the GPU, since the built-in power supply has no PCIe power connectors and the motherboard itself definitely cannot power an MSI Gaming X Trio RTX 2080. It didn't have to be pretty. It just had to work.
Performance wise I was pretty impressed with how it did in the game-based benchmarks and in actual gameplay, but to be honest, it wasn't really about the numbers at all. That's not the point of this system. The idea was just to see if this ugly, janky system would work at all. There wasn't any reason it shouldn't have, but I still needed to actually see.
But to appease the people who are really obsessed with numbers, let's take a look at how it performed in our usual suite of benchmarks, with the addition of a 20 minute stress test to get an average full-load CPU temperature.
Let's start off with our 20 minute CPU stress test. The CPU reached an average temperature of 46 degrees Celsius.
Onto Rise of the Tomb Raider. We use this for almost all of our GPU benchmarking, with the High preset and 4x SMAA in all our testing. For these tests we only ran DX11.
At 1080p we saw an overall score of around 87fps, whereas on our 8700K GPU test bench we saw an average of around 117fps. That's roughly a 34 percent gap. You'll find that at lower resolutions the CPU bottleneck is a lot more significant. On top of that, the HP Elite is using PCIe 2.0 whereas our GPU test bench is using PCIe 3.0, although the difference between PCIe 2.0 and 3.0 won't cause as big a gap as you might think. Well, not 34 percent anyway.
At 1440p we saw an overall score of around 70fps, which is around 11 percent slower than our 8700K bench. As you can see, the performance gap shrinks as the resolution goes up.
Lastly, onto 4K. At 4K we saw an overall score of 39fps, which is only 2 frames fewer, or around 5 percent slower, than on our 8700K test bench.
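For anyone curious how those percentages fall out of the raw numbers, here's a quick sketch (mine, not from the video) of the gap computed relative to the slower system, which is the convention the 34 percent figure uses. The 4K reference of 41fps is inferred from the "2 frames fewer" remark.

```python
# Percent by which the reference framerate exceeds the slower one.
def pct_gap(fps, ref_fps):
    return (ref_fps - fps) / fps * 100

# Rise of the Tomb Raider numbers quoted above: (HP Elite, 8700K bench).
# The 41fps 4K reference is inferred, not stated directly.
results = {"1080p": (87, 117), "4K": (39, 41)}

for res, (fps, ref) in results.items():
    print(f"{res}: about {pct_gap(fps, ref):.0f} percent gap")
# 1080p works out to about 34 percent, 4K to about 5 percent
```

Note that dividing by the faster system's framerate instead would give a smaller figure (around 26 percent at 1080p), so it matters which side of the gap you measure from.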
Onto Unigine Superposition. We performed three tests in total: the 4K Optimized preset, 1080p Extreme, and a custom 1440p test with depth of field and motion blur turned off.
At 1440p we saw an average framerate of around 92fps, which is around 53 percent slower than our 8700K GPU test bench. This has more to do with how Superposition uses the CPU, but I wasn't expecting anything special here.
At 4K we saw an average framerate of around 65fps, which is only 2 frames fewer, or 3 percent slower, than the 8700K test bench.
Lastly is the mother of all benchmarks: Superposition 1080p Extreme. No GPU is safe here. Very oddly, we saw an average framerate of 51fps, which is 1 frame more than on our 8700K bench. I ran the tests again on both systems just to make sure, and there was 1 frame in it every time.
Last but not least is the Final Fantasy XV benchmark. This test is a little bit different: it gives you a score based on the total number of rendered frames. I'm guessing you know the resolutions already.
At 1080p we saw a score of 6295, which is around 78 percent fewer rendered frames than our 8700K test bench.
At 1440p we saw a score of 6034, which is around 37 percent fewer rendered frames than our 8700K test bench.
Finally, at 4K we saw a score of 4617, which is around 4 percent fewer rendered frames than our 8700K test bench.
It's clear that the old i7 with the RTX 2080 underperforms compared to the 8700K test bench, but that's to be expected. It's a 9 year old CPU. The numbers don't really matter, because this was never meant to be a practical use case for an RTX card. Like I mentioned earlier, it was just to see if it would work at all. Which it does.
At the end of the day, if you REALLY wanted to use a new GPU in a really old system, you could. But should you? Probably not. The funniest thing about this whole adventure is that I had a Gigabyte socket 1156 board I could easily have used, and I could have built it in a new case with loads of RGB, but it just wouldn't have been as cool as making the ugliest computer I could think of.
I just want to thank you guys again for helping us reach 5000 subscribers. You are the reason we do all of this. I cannot believe we hit 5000 so quickly after 4000. It feels like just yesterday that I water-cooled that Pentium 2 system.