Does your motherboard support CrossFire or SLI?
Interesting. To be fair, testing a CPU's performance with benchmarks (especially FPS) rarely gives you an accurate result, because a better CPU mostly reduces frame drops and freezes while playing, while the game runs perfectly fine the rest of the time (or at least, that's what I can tell from my experience owning an i5 750).
It's too bad that Skyrim doesn't offer a benchmark; I'd be curious to know the performance leap between the i5 750 and a better CPU such as the i7 4770K. Skyrim with maxed-out settings tends to cause constant frame drops and freezing on the i5 750, especially when moving quickly across the map (for example, by horse). Not unplayable, but definitely annoying until you reduce textures/AA.
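A rough way to quantify that kind of stutter, if you can capture per-frame times (tools like FRAPS or PresentMon can log them), is to compare the average FPS against the 1% lows. The frame times below are made up purely for illustration:

```python
# Sketch: average FPS hides stutter; the 1% lows expose it.
# Hypothetical frame-time capture in milliseconds: mostly smooth
# 16.7 ms frames with a handful of 100 ms freezes mixed in.
frame_times = [16.7] * 990 + [100.0] * 10

# Average FPS over the whole run.
avg_fps = 1000 * len(frame_times) / sum(frame_times)

# 1% lows: average FPS over the worst 1% of frames.
worst = sorted(frame_times, reverse=True)[:max(1, len(frame_times) // 100)]
low_1pct_fps = 1000 * len(worst) / sum(worst)

print(f"average FPS: {avg_fps:.1f}")      # looks close to 60
print(f"1% low FPS:  {low_1pct_fps:.1f}") # reveals the freezes
```

The average comes out near 57 FPS while the 1% lows sit at 10 FPS, which is exactly the "runs fine on average but freezes" pattern a plain FPS benchmark can miss.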
Dunno, that's what I read on the net. Whatever setting I changed, it helped. ¯\_(ツ)_/¯
It could be that what you actually need, to get rid of Skyrim stutters, is not a new PC, but lower latency RAM in better configuration:
http://www.tomshardware.com/reviews/memory-bandwidth-latency-gaming,3409-8.html
Speaking as a pretty nooby person when it comes to computers, how would this make sense when my CPU is the one hitting 100% usage and not my RAM? Does the RAM affect your CPU that much?
Skyrim's scripting engine is a total mess in almost any sense of the word. It's part of the reason I always say Bethesda Game Studios' coding department consists of three drunken chimpanzees and an unpaid intern. And the sad part is that it's most likely true.
RAM latency measures how many cycles it takes for a RAM stick to respond to a request. Skyrim's memory handling is horrible and results in a lot of excessive, wasted cycles, meaning the higher the RAM latency, the more cycles are wasted on doing absolutely nothing.
Add that the engine's memory controller is absolutely ass at knowing which parts of memory it even tries to address, and you have something prone to memory leaks, leading to even more lag and instability. And now you know why people call them Bugthesda.
Some fanmade hacks try to address this issue with varying success though.
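To put the latency point in numbers: what matters is the absolute delay, which depends on both the CAS latency in cycles and the clock, so a "faster" kit isn't automatically lower latency. A quick back-of-the-envelope conversion (generic DDR3 figures, not tied to any specific kit):

```python
def cas_latency_ns(cl_cycles, transfer_rate_mt_s):
    """True CAS latency in nanoseconds.

    DDR transfers twice per clock, so the I/O clock in MHz
    is half the transfer rate in MT/s.
    """
    clock_mhz = transfer_rate_mt_s / 2
    return cl_cycles / clock_mhz * 1000

# Two common DDR3 configurations: the higher-clocked kit has a
# higher CL yet ends up with roughly the same absolute latency.
print(cas_latency_ns(9, 1600))   # DDR3-1600 CL9  -> 11.25 ns
print(cas_latency_ns(11, 1866))  # DDR3-1866 CL11 -> ~11.79 ns
```

So for a latency-sensitive engine, the CL number alone is misleading; it's the cycles-to-nanoseconds conversion that tells you how long each wasted round trip actually takes.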
Skyrim's problem isn't in the graphics engine, though, but in the gods-awful scripting engine Bethesda has used in every game since Morrowind. (Although, to be fair, this applies to quite a few games; scripts blocking the CPU from handling driver overhead can be a more common cause of frame drops than demanding graphical calculations.)
At this stage the only thing I can do is run a test on the i5 with CrossFire enabled or disabled, and test the i7 machine as-is. I'm not going to open the cases of these machines to shuffle GPUs, but perhaps running benchmarks on these machines as they are now can still reveal interesting results. Any specific benchmarks you have in mind?
I ran some similar tests when I upgraded my GPU a few months back,
http://www.3dmark.com/compare/fs/10304075/fs/10296156/fs/6697325#
As you can see in column 3, the CPU is the i5 2500, and compared with the i5 6600K in column 2 there is very little difference, both using the R9 380 Dual-X. However, as soon as the RX 480 is installed (column 1), the results show a clear improvement across the board.
With most games now being rendered increasingly by the GPU, CPU strength (for gaming at least) is becoming ever more redundant. Older machines can still perform well against the latest out there; it's all just a matter of GPU.
As most motherboard manufacturers constantly update their BIOS/UEFI these days to allow for more compatibility, CPUs seem to be becoming much less relevant (again, I'm applying this only to gaming, so keep yer knickers on), and as such one would expect the cost of CPUs to start dropping. If newer chips can't be utilised to capacity, with GPUs taking the workload, what is the point of upgrading them?
Obviously people who run multi-threaded or simultaneous programs will keep buying them, and I know a few guys (like me) who play Everquest on a free (legal) server and multibox. Multi-threaded CPUs are big on our servers because you can run more instances per machine with less lag/stuttering and shorter load times. I run 18-24 at times, and one guy I know runs 36, but even he uses a very old server machine with a crapload of RAM. He's been talking about getting an even older machine that runs dual CPUs so he can run even more toons, because the older CPUs still run the games; it's more the GPU that becomes an issue.
EDIT
One thing I forgot, and while it has no bearing on the benchmarks shown here, I still find it interesting...
When the RX 480 arrived and I installed it, I moved my R9 380 to the second PCIe x16 slot, the reason being that the new DX12 is supposed to support multiple GPUs even if they are not of the same architecture.
I ran the Time Spy benchmark (that's the DX12 benchmark from 3DMark; you might be able to find it if you browse my results) and then checked out Catalyst (AMD's driver centre).
Catalyst told me that my cards were "LINKED". Now, I thought this was strange, because I know they are NOT CrossFire compatible, being of totally different architectures.
I was impressed. Then I saw that there was a new driver version.
I updated it, and the cards were no longer showing as "LINKED".
Now this is what I found interesting... AMD (whose cards I've only recently started buying, over the past 3 years or so; I was an NVIDIA fanboy) made a huge deal out of the fact that multiple cards would be usable with Vulkan/Mantle and DX12, mainly because of NVIDIA's nasty little patch that removed PhysX functionality when using non-NVIDIA cards in conjunction with their cards. And YET, in a single driver upgrade, they disabled THEIR OWN CARD?!?
Was it an oversight? Was it even meant to link in the first place? Was my computer possessed? Did I have the wrong medication that day?
While I admit it could have been any of these things, or a combination, I know what I saw and have as yet been unable to replicate it :(
I haven't downgraded the drivers, no. The new card runs all my games (Fallout 4 / Skyrim LE / Styx etc.) at 1080p at 60fps, and even 40-50fps in 4K, so I'm happy with its performance running singly.
Once newer stuff starts coming out and I slow down, I might try it, or, as you say, contact AMD to find out what it's about.
My main point here was that it was linked, not CrossFired, and I have no idea whether that was intentional, a driver bug, or even what exactly it did, because although the multi-GPU component is supposed to be in DX12/Mantle/Vulkan, no one has actually used it yet (AFAIK).
I might contact AMD anyway and see what they have to say. I'll update you here if I hear anything.
OK so I contacted AMD and here's the rundown:
The link does indeed refer to the new API, but it is currently only supposed to be used with the ONBOARD graphics or APUs with a single GPU. It will eventually cover everything (APUs + multiple vendor-ID GPUs), but somehow I got my hands on a test driver. That driver was removed because of an undisclosed issue and was never meant to go public (beats me how I got it), and they recommended I update my drivers and try again.
I'm waiting for my wife to take the kids to the park with the dragon-in... err mother-in-law this afternoon, and I'm going to slip the old card back in the second PCIE slot to see the results.
OK - Just finished a 2 hour desk to head bang-a-thon.
The card still comes up as disabled.
I tried updating etc, but to no avail.
I rolled back the drivers and no good.
I installed some from my old backup drive, and we had a winner. It worked. Linked again. But running 3DMark showed that it still only used one GPU (the primary RX 480).
As there was a 4% decrease in performance, there's no point in keeping it in.
I have a single GPU and won't be running more than one, but I'm definitely getting a new one as soon as I can afford it. This old 1GB HD 6850 doesn't play some of the newer (and even several older) games well unless I turn all the settings down. I'm looking at getting an RX 480 so I'm hopefully future-proof for a while.
BTW, in this case upgrading to CrossFire didn't cost me anything, because I already owned the second HD 7850 and probably wouldn't even bother trying to sell it. I basically doubled the framerate in my HTPC without spending a cent. You can't beat that value :-)
That's always a possibility, but it's easy to disable CrossFire per game. That said, even in the one game I tested where the minimum frame rate dropped (BioShock Infinite), I couldn't see it, and what I did see was a generally much smoother framerate. I believe CrossFire is beneficial much more often than it isn't (though time will tell).
Faced with the same option (double framerate, no money spent), would you handle the HTPC GPU upgrade differently?
And don't compare the price of a brand-new GPU with the price of a second-hand GPU... all brand new or all second hand.
Why? The performance will be the same.
If you're planning to upgrade your PC to a new desktop CPU (and, as a result, a new board), why get an eGPU? I can see why you'd be interested in one if you planned on using an existing laptop for gaming.
Also, are there even tests of Bristol Ridge AM4? What makes you think it's going to be better than similarly priced Intel CPUs?
eGPU is for the laptop, Bristol Ridge for the HTPC. Haven't been using my desktop for a while and probably won't start using it any time soon.
The little I saw of Bristol Ridge AM4 wasn't that promising, and I'm not sure I'll go that way, but certainly if I want to use integrated graphics, AMD is going to be a better choice than Intel. The HTPC only supports half height cards and has a PSU that gets noisy when any significant amount of power is used (where 'significant' is 100W). A 65W APU should be a decent choice for that machine and be significantly faster than the current setup.
One solution for the HTPC is a different case. I'm using a SilverStone Grandia GD07 which has enough room for 2 full sized graphic cards and several hard drives.
Great post, thank you.
This is very relevant to me as I have an RX 470 4GB too.
Yeah, I've heard it said that nowadays the CPU is way less important than the GPU. I'm currently using an i7-2600K and honestly I've never noticed a bottleneck from it. I'm saving up for a 480 8 gig to replace my 260X, and I don't see a need to upgrade the CPU just yet.
That's actually something I hadn't really considered... good point. I'd have to do some maths to figure it out.
Dual GPUs offer such diminishing returns, and support for them is atrocious and actually fading. They were gaining ground, but over the last few releases we've seen both AMD and NVIDIA pulling back on support and consistency. It's an awful cycle: developers need to add support, it's rare that it works out of the box, and oftentimes both AMD and NVIDIA need to do their part with the drivers.
Dual GPUs may be worth it if you're going to buy a used card super cheap to add another year to your PC, but I'd definitely say they're not worth considering from the start of a new build, unless you seriously have too much money, or every game you're going to play supports it well and you're not concerned about future support.
Almost all had poor support at launch, with patches helping, but off the top of my head: Dishonored 2, Doom, Just Cause 3, Rise of the Tomb Raider, Titanfall 2, Battlefield 1 (only supported DX11), Forza Horizon... and it goes on.
A lot of them had it fixed either by the devs or by drivers; others needed custom profiles, which are spotty at best. Even then, you often only gain a 50% increase in performance for a 100% increase in cost if you buy them together.
Dual GPUs can be fantastic, but they can also be terrible, and they are not EVER worth a 100% blanket recommendation.
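The 50%-gain-for-100%-cost point can be made concrete with a quick performance-per-dollar comparison. The prices and scaling factor below are hypothetical, just to mirror that case:

```python
def perf_per_dollar(base_perf, cost, scaling=1.0):
    """Performance units delivered per dollar spent.

    scaling: total multi-GPU speedup relative to one card
    (1.5 means the pair is 50% faster than a single card).
    """
    return base_perf * scaling / cost

# Hypothetical numbers: one card costs $250 and scores 100 units.
single = perf_per_dollar(100, 250)        # one card
dual = perf_per_dollar(100, 500, 1.5)     # two cards bought together, +50% perf

print(single)  # 0.4 units per dollar
print(dual)    # 0.3 units per dollar, worse value up front
```

The maths flips, of course, if the second card is a cheap used one added later, which is the upgrade path the rest of the thread is describing.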
So basically, if you want to be an early adopter of new AAA games, get the best single GPU you can afford.
If you want the best bang for the buck and don't tend to buy new games when they haven't received any discounts (like most people on this site), consider getting another GPU of the same type you already have. Just make sure your board supports multiple GPUs.
In my example, I doubled the framerate of my HTPC without paying a cent. Even if I had to buy the second card (instead of moving it from another machine I just upgraded), it still would have been a fraction of the cost compared to getting similar performance by upgrading to a new single-GPU solution.
If you're upgrading later, yeah, I already conceded that point, but only if you do your research. You also chose games that happen, unusually, to scale very favorably and are absolutely outside the norm.
You're trying to make a blanket statement in support of dual+ GPUs, and none of the facts support that. It's super caveat emptor.
I understand you're excited with your setup, and that's great and all, but other benchmarks of real game performance simply don't match your bold statements, and most people will read them as if they should buy two cards right away, which the economics simply don't support unless you're looking for extremely specific performance gains in very few circumstances.
It is always better to get the best single card you can rather than two cards. It is certainly, oftentimes, better to upgrade later by getting a second card at a good discount, but even that was broken with the GTX 10 series line. There's just no room for super enthusiasm about dual+ GPU solutions when the trend is fading and the future market is explicitly shifting away from it.
EDIT:
Just re-read your OP and it seems my headache is affecting me, lol. Disregard my nitpicking, you're explicitly, consistently, saying upgrades later. My apologies.
I recently purchased a new GPU - a 4GB Sapphire Nitro RX 470. It replaced a Radeon HD 7850 in my desktop PC. I then moved the old card to my HTPC, where it joined another HD 7850 in CrossFire mode. The desktop PC has an i7 4770K CPU and 16 GB RAM, while the HTPC has an i5 750 CPU and 8 GB RAM.
Since in both cases I had a single HD 7850, this upgrade gave me the opportunity to compare the benefit of using two cards in CrossFire mode against the benefit of a much faster CPU (easily 5 times faster in some applications I'm using). Long story short, the benefit from CrossFire is amazing. In many cases it provides double the framerate compared to a single card, way beyond my expectations. The CPU, on the other hand, seems to bring no or negligible improvement in most games, not even remotely close to what I'm seeing in other applications. It's definitely not an upgrade I would recommend to anyone whose main interest is gaming and who doesn't frequently use CPU-intensive applications, at least not if they have a CPU similar to or better than this 2009 i5 750.
Lastly, the RX 470 is awesome. It's both faster and more power efficient than the CrossFire rig. However, although it's (relatively) inexpensive, it's still about 3-4 times more expensive than adding a second-hand HD 7850. If you're considering a GPU upgrade and your motherboard supports CrossFire (or the similar NVIDIA SLI), the option of adding another card is well worth looking into, especially if your budget is very limited.
Here are the results:
Note that some of these tests have slightly modified settings, based on recommendations suggested in this thread. I can provide more details if people are interested.
TLDR: When it comes to gaming rig upgrades, a second GPU in CrossFire mode can be awesome, while a much faster CPU can bring negligible improvements.