E8400 @ 3.0 GHz vs. Q6600 @ 3.0 GHz

How does the E8400 compare to the Q6600 (both at 3.0 GHz), in gaming and in everything else? And what if the Q6600 were above 3.0 GHz? Thanks.
The quad core is better, if only because four cores are more than two. That said, games right now are still being developed with two cores in mind. Of course you can OC the Q6600, but you can OC the E8400 a lot further. For gaming they're about equal at the moment; for multitasking, the quad is better.
The quad is better for multitasking and also for gaming: even though dual cores can manage better FPS at lower resolutions, quads gain more ground if your monitor is huge.
First, let me say that Core 2 Duos are aimed squarely at gaming, whereas Core 2 Quad processors are meant more for workstations.
Games are becoming more advanced, so if you want that Core 2 Duo, go for it, but I would prefer the quad, because next-generation games might need four cores.
Games don't take advantage of quad cores, so a high-end Core 2 Duo is the way to go. You'll see more performance from a C2D than from a C2Q.
I might have gone for the Core 2 Duo, but I multitask a lot, so I went quad.
The E8400 is a 45nm Penryn chip, whereas the Q6600 is derived from the older 65nm Conroe family.
Penryn CPUs are faster clock-for-clock due to architectural improvements, and also have the SSE4 instruction set, which some apps use to great effect. They also generate much less heat because of being built on a smaller process, which means much more overclocking headroom. (The Q6600 is disadvantaged enough by having twice the cores to keep cool; now it has to compete with a chip built on a smaller process that generates even less heat per core!)
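(Side note: here's a rough sketch of how an app might take a fast SSE4.1 path on chips like the E8400 and fall back to plain C on the Q6600. This is entirely my own illustration, not code from any real game; it assumes GCC or Clang on x86, and the function names are made up.)

[code]
#include <stdio.h>
#include <smmintrin.h>  /* SSE4.1 intrinsics */

/* SSE4.1 path: DPPS does a 4-float dot product in one instruction. */
__attribute__((target("sse4.1")))
static float dot4_sse41(const float *a, const float *b)
{
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    /* 0xF1 = multiply all four lanes, put the sum in lane 0 */
    return _mm_cvtss_f32(_mm_dp_ps(va, vb, 0xF1));
}

/* Plain-C fallback for CPUs without SSE4.1 (e.g., the Q6600). */
static float dot4_scalar(const float *a, const float *b)
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
}

int main(void)
{
    float a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8};
    float r = __builtin_cpu_supports("sse4.1") ? dot4_sse41(a, b)
                                               : dot4_scalar(a, b);
    printf("dot = %f\n", r);  /* 70.0 on either path */
    return 0;
}
[/code]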
If you want to game, you're probably better off with the E8400, overclocked to 4.0 GHz or more, unless you can pay up for a Q9650 without compromising on the rest of your system. That quad is also a Penryn chip, and with its 9x multiplier it should easily hit 4.0 GHz or more if you put some good cooling on it. Unfortunately, it costs US$500+, and all the cheaper quads are poor overclockers, either because their multipliers are low or because they're toasty 65nm chips.
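(To sanity-check that multiplier math with my own numbers: core clock = FSB x multiplier, so a 9x chip needs roughly a 445 MHz FSB to reach 4.0 GHz.)

[code]
#include <stdio.h>

int main(void)
{
    double multiplier = 9.0;    /* E8400 and Q9650 both run a 9x multi */
    double target_ghz = 4.0;    /* the overclock discussed above */
    double fsb_mhz = target_ghz * 1000.0 / multiplier;
    printf("Need ~%.0f MHz FSB (quad-pumped: ~%.0f MT/s)\n",
           fsb_mhz, fsb_mhz * 4.0);   /* ~444 MHz, i.e. FSB1778 */
    return 0;
}
[/code]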
If you want a multitasking machine, though, that's much easier with more CPU cores free to take up various workloads. Say you have an app that makes full use of a dual core but not a quad, and alongside it you want to run something else that also needs one or two powerful cores. The dual would take a performance penalty, but the quad would be unaffected, and both apps would run at full speed.
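(Here's a toy illustration of why the spare cores matter, again my own sketch rather than any particular app, and POSIX-only because of the sysconf call: an app that sizes itself for two cores saturates a dual but leaves a quad half idle for everything else.)

[code]
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* 2 on an E8400, 4 on a Q6600 */
    long cores = sysconf(_SC_NPROCESSORS_ONLN);

    /* a dual-core-aware app keeps at most two worker threads busy */
    long workers = cores < 2 ? 1 : 2;

    printf("%ld cores online, app keeps %ld busy, %ld left for other apps\n",
           cores, workers, cores - workers);
    return 0;
}
[/code]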
Finally, consider whether you want to upgrade to something like Nehalem in the future. If you do, a dual would be cheaper, which means more money to spend on whatever you're planning to upgrade to. If you don't, you can pay up for a quad now, which will be more future-proof than a dual. Sure, some would say that by the time most apps make use of quads, current quads will be obsolete; however, duals from the same era will be even more obsolete, and not everyone upgrades every time a new CPU architecture shows up!
Before you buy either CPU, you must ask yourself, ''Will I use Photoshop or AutoCAD at a professional level?'' or ''Will I just game?'' Core 2 Quads are good at everything, gaming included, but Core 2 Duos are better in that particular field; also, the E8400 @ 4 GHz is faster, more stable, and cooler than a Q6600 @ 3.0 GHz.
[QUOTE=''Games_pro''] Before you buy either CPU, you must ask yourself, ''Will I use Photoshop or AutoCAD at a professional level?'' or ''Will I just game?'' Core 2 Quads are good at everything, gaming included, but Core 2 Duos are better in that particular field; also, the E8400 @ 4 GHz is faster, more stable, and cooler than a Q6600 @ 3.0 GHz. [/QUOTE]I agree. I also believe that by the time developers are making games that use four or more cores (if gaming is what the OP will mainly use this CPU for), the Q6600 will be an outdated chip. Not that it won't still run well, but newer technology is already supplanting it.