When using this resource, please read FragKrag's opening post. The Tech Support forum regulars have helped create countless desktop systems without any compensation. The least you can do is provide all of the information required for them to help you properly.
On January 22 2013 05:13 MisterFred wrote: @The Swamp. As SkyR said, look at the post I wrote for Coil1 (last post, pg. 1357). Being near a Microcenter (which one?), you could also consider overclocking, as you can do it relatively cheaply there (you can often get Z77 boards cheap as part of an in-store bundle deal).
Other than a processor or mobo as part of a bundle/combo deal, Microcenter rarely offers prices better than newegg/us.ncix.com, however.
Your budget is a little higher, of course, but there's not really anything more to spend on unless you want to put some effort into quiet computing, overclock, or are planning on getting a better monitor. You could upgrade to a 7950 instead of 7870, but that'll have only a marginal difference on graphics performance for maybe Planetside 2. A 7870 is already horrendous overkill for SC2 & CS:GO, as SkyR mentioned.
Hi and thanks for the help. I live in Chicago. I bought my last PC at MicroCenter and they price matched everything. I'm not sure if they still do this, but if they do I'd rather get everything at the same time. If I were to overclock, would I need an extra fan? Also, to what Nvidia GPU would the 7950 and 7870 be comparable? I know nothing about ATI cards. Seriously though thanks so much!
GTX670 and 660ti would be around what you're looking for (7950/7870 range). A lot of people say that 7870 is the best bang for the buck though.
And yes, you will want to get an aftermarket cooler (instead of the one that comes with your CPU package) if you want to OC. It shouldn't cost you more than $30-35 unless you want to get a fancy one. You will also need a slightly more expensive motherboard.
Oh, and one more thing. Check with MC first to see what exactly their bundle deal is this time around. It used to be $50 off any Z77 motherboard if you buy an i7-3770K, i5-3570K, or i3-3225, but at least in my area, they changed it to $40 off any motherboard with either the 3570K or the 3225. The sales reps didn't allow other processors to be bundled.
Hi, sorry for taking so long to reply. As far as I know, to be able to OC an Intel CPU it has to have a K next to the name, correct? Also, my budget has become a little more flexible, so if there's something worth spending some extra money on, I can do that.
Overclocking requires a K suffix processor along with a Z chipset board (Z77, Z75, etc).
OK thanks! Will overclocking an i5 give me more than enough performance in games? Sorry for being overly cautious. I got really bad advice from a friend on my first build, and I just want to be absolutely sure about everything.
An i7 isn't really any faster than an i5 for gaming. If you want something faster than a 3570k for gaming you're stuck waiting on Haswell.
Or you could get the AMD 3850 ^^
Is this a troll? I can't think of a single cpu bound game that benefits from the 8350 (you meant this right?) over a 3570. Zero games that tax a 3570 use more than 4 threads. Here is an approximation of how you can expect every cpu bound game to perform on different cpus: http://www.anandtech.com/bench/CPU/129
No, this isn't 'intel biased', as x86 instructions aren't gimped for different configurations. You can look at any other cpu bound game and you'll find similar results, this graph is just the most comprehensive reliable bench I've seen.
If you like running benchmarks 24/7, the 3570 and 3770 will be the better choice. But if you're actually going to use it for gaming, you should consider the 8350. You will get a CPU that's $50 cheaper than the 3570K, the mobo will be cheaper, and you get way more SATA 6Gb/s ports on AMD chipsets. http://www.youtube.com/watch?v=eu8Sekdb-IE
I think the testing behind this review was found to be inaccurate.
Would you mind telling us why you think that it's inaccurate?
On January 14 2013 07:50 Myrmidon wrote: The problem is the complete lack of consistency in results. Does this really need to be spelled out? If you can't interpret results like these (edit: I mean, figure out that they might be inaccurate, at least if you have some background knowledge about how the Intel processors relate to one another, which I wouldn't expect of someone who doesn't follow these things), then you need to seriously develop some critical thinking skills.
Then they only tested gaming + streaming on one game, where the results are hugely in question.
Huge red flags: 3770 > 8350 at 1080p, yet 3770 < 8350 at 1440p. Uh, changing resolutions changes the workload on the GPU, not the CPU. 3770 > 3570 by huge margin without streaming, despite only 100 MHz / 2MB L3 / HT difference (and HT being irrelevant here). We all know 100 MHz won't make that kind of change, for processors running 3+ GHz, and if 2MB extra L3 and/or HT were somehow so important—they're not—then one would expect the 3820 to do a lot better despite being SB-E instead of IVB.
Some other issues too.
PS: who plays fps at those kinds of frame rates?
I didn't bother looking more after seeing the above. But what were the streaming settings? You can always choose a faster encoding preset for slightly worse quality.
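To make the "faster encoding preset" trade-off concrete, here is a minimal sketch of a stream-style x264 encode driven from Python; the file names and bitrates are made up for illustration, and the only point is that the -preset option trades a little quality per bitrate for a large drop in CPU load.

import subprocess

# Hypothetical recording -> encode. "veryfast" uses far less CPU than
# "medium" at a small cost in quality for the same bitrate.
subprocess.run([
    "ffmpeg", "-i", "gameplay.mkv",                      # made-up input file
    "-c:v", "libx264",
    "-preset", "veryfast",                               # compare with "medium" and watch CPU use
    "-b:v", "3500k", "-maxrate", "3500k", "-bufsize", "7000k",
    "-c:a", "aac", "-b:a", "160k",
    "encoded.mp4",
], check=True)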
On January 14 2013 15:51 Myrmidon wrote: Any situation where HT and slightly more L3 cache are going to make a difference, i7-3820 should be mostly close with i7-3770k. No such trend exists in that data.
When some things are off by dozens of percentage points, is the data really worth looking through? I mean, if you can figure out what's horribly awry, then that's good, but there's no basis for discussion here. It's not like even figuring out what's wrong would give us the "correct" results. May as well bring up the topic you want to discuss and ignore this debacle completely.
edit: more or less, even ignoring AMD vs. Intel and all other review sites' results, there are all sorts of inconsistencies and obvious double-digit percentage point errors that all results should be considered way beyond the "cast into doubt" stage.
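As a rough sanity check on the "100 MHz won't make that kind of change" point above, here is a back-of-envelope comparison with made-up FPS numbers: the gain attributable to clock speed alone is a few percent, so review data showing a far larger gap between near-identical CPUs is a red flag.

# Illustrative numbers only; not from any actual review.
base_clock = 3.4        # GHz, e.g. i5-3570K
other_clock = 3.5       # GHz, e.g. i7-3770K
expected_gain = other_clock / base_clock - 1             # about 3%

reported_fps = {"i5-3570K": 60.0, "i7-3770K": 85.0}      # hypothetical review data
reported_gain = reported_fps["i7-3770K"] / reported_fps["i5-3570K"] - 1

print(f"gain explainable by clock speed alone: {expected_gain:.1%}")
print(f"gain the review reports: {reported_gain:.1%}")
if reported_gain > 3 * expected_gain:
    print("red flag: gap is far larger than clock speed can explain")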
On January 28 2013 07:54 Myrmidon wrote: With a stock cooler, the distance between the PSU fan and the CPU fan is not that close and reversing it may not make much difference, but the larger and better the cooler, the closer the fans would be and the more interference / turbulence / problems you'd have as a result.
Haha yeah I noticed after letting the pc run for a while under load. I'll get the Shuriken 2 Big Rev. B. Seems like a good fit. Thank you for all your help
Intel is strong in this one
But seriously, how can the information be so wrong when he tested each game multiple times and got the same results?
Testing methodology, I believe.
I can run a benchmark of Metro or something at 1080p with a GT520 on an 8350, a 3570K, and a 3770K, and repeat it like 10 times, but that doesn't mean it's a good result. And when you get weird discrepancies between a 3570K and a 3770K, which are essentially identical for gaming (hyperthreading does nothing), you know something else has to be off.
Also: a comment from that website; "Depends on the power supply that you get. If you get a good one, 5 years from now, you wont have to upgrade the power supply in a new rebuild. Average custom gaming computer builds have been demanding 700watt psu's for the past 10 years, I highly doubt in the next 5 years, the demand will be much higher, so if you buy an 800-1000watt you should be good for any future upgrades. Just look at the long run."
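For a rough idea of why advice like that is overkill for a single-GPU build of the kind discussed in this thread, here is a quick load estimate with approximate, illustrative component figures (ballpark assumptions, not measurements):

# Rough peak gaming draw for a single-GPU box; numbers are ballpark only.
components_watts = {
    "i5-3570K under load": 77,                  # roughly its rated TDP
    "HD 7870 under gaming load": 175,           # approximate board power limit
    "motherboard + RAM + drives + fans": 60,
}
load = sum(components_watts.values())           # ~312 W
print(f"estimated gaming load: ~{load} W")
print(f"with ~30% headroom: ~{load * 1.3:.0f} W, so a quality 500-550 W unit is plenty")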
How can every other reviewer be wrong when they tested each game multiple times and got the same results as each other but different results from "this one guy"? You want to believe him, so you are assuming he's better at what he's doing than everyone else posting reviews, without any actual reason to believe he's the only one with superior testing powers.
Mainstream reviewers can be wrong sometimes. But seriously, there's no reason to think only one man on the internetz knows how to benchmark and everyone else is a paid shill or something.
If he were actually a top-notch reviewer he'd be looking into reasons why he got weird results. Bit-tech got really weird results with their Shogun 2: Total War test for their Ivy Bridge review. They flagged those results for their readers, swore they ran the tests multiple times, and let it go. It turned out that game shows huge benefits from minor changes in IB's instruction set. But sometimes weird results just mean you did something wrong.
At least the guy tested more than 3 weird, non-mainstream games for his second video, making him a little less of a hypocrite. Still the best studio set for an online review I've seen, though. Serious props on that.
oh ma gawd.
I'm 100% surprised that there isn't Dirt 3/Showdown or F1 (w/e it's called, based on the same engine as Dirt 3/Showdown) as one of the games tested. Nothing like having a graphically bound game skewing results! (albeit Metro 2033/Crysis are GPU-bound games, so their results should be around the same).
On January 28 2013 05:47 Rachnar wrote: i havent ever hit 4 gb even when streaming.... dunno how people who just play can hit 8 lol
While I definitely agree with the general stance on price/performance everyone takes here, I have to say that I can easily hit 4GB of RAM usage. Had I not gone with 8GB myself when I assembled my computer, I'd have been forced to go out and get more.
A dozen-plus browser tabs, HD streams, and all the miscellaneous software (Skype/TS3/IRC etc.) while also running a demanding game at high settings definitely breaks 4GB of RAM. I believe I've hit above 5 at some point.
Not to defend overspending on stupidly priced "gamer RAM" or whatever, but 8GB isn't too much if there's any room in the budget.
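If you would rather measure than guess before deciding between 4GB and 8GB, a small sketch like the following will log usage while you game/stream/browse; it assumes the third-party psutil package, which nobody in the thread actually mentioned.

import time
import psutil   # third-party: pip install psutil

# Sample system RAM usage once a minute for half an hour to see whether
# your own workload actually crosses the 4GB line.
for _ in range(30):
    mem = psutil.virtual_memory()
    used_gib = (mem.total - mem.available) / 1024**3
    print(f"{used_gib:.1f} GiB in use ({mem.percent:.0f}%)")
    time.sleep(60)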
On January 28 2013 02:57 ImANinjaBich wrote: is this motherboard compatible with this cpu?
ASRock H77M LGA 1155 Intel H77 i5 3470
thanks
Stop asking a million useless questions. Yes it is.
EDIT for clarity: On the product page the H77M says it supports LGA 1155 processors. Guess what?!? On the product page of the i5 3470 it says it's an LGA 1155 processor. wowow 10 seconds of reading makes a question not needed.
EDIT2 for more clarity:
Guess what!! O.o you're a dickhead!
EDIT for clarity: you're a dickhead!! O.o
EDIT2 for more clarity: you're still a dickhead! O.o
Is the Xigmatek Elysium natively compatible with a 360 rad in push/pull (both inside the case)? I found some pictures that seem to indicate it, but I can't really tell if any modding has been done.
When do you plan on building it? Between now and March 1st.
Do you plan on overclocking? Maybe, but most likely no.
Do you need an Operating System? No
Do you plan to add a second GPU for SLI or Crossfire? No
Where are you buying your parts from? Newegg/Best Buy/Micro center. I say Best buy cuz i work there, get discount ftw.
I have a 550W PSU that's good. I have a GTX 460 Cyclone that's currently overclocked 19%; I don't think this is bottlenecking me. I have a shitty mobo and a Phenom II X4 955 Black with 4GB of DDR2. I have a large case with 3 fans; anything will fit. I'm NOT using CD-ROM drives.
Basically, I'm looking for a CPU (likely an i5-3570K), a motherboard that's DECENT (I will replace the mobo again when I upgrade again, so it just needs to get the job done; it doesn't need any special features. Basically, where cost meets performance, a balance; I am not looking to spend $400 on a mobo), 8GB of DDR3 RAM, and an HDD that I'll pick.
I'd like to spend $180 on the i5 (Microcenter price, I can pick this up). I'd like to spend about $200-250 on RAM AND mobo. This will leave me with ~$300 later to upgrade the GPU. (Also, what's the best GPU for the money at the moment, around that price?)
On January 28 2013 10:27 Alryk wrote: Testing methodology, I believe.
Every tech forum has these idiots who talk out of their asses and have no clue what they are talking about.
On January 28 2013 10:33 MisterFred wrote: How can every other reviewer be wrong when they tested each game multiple times and got the same results as each other but different results from "this one guy"? You want to believe him, so you are assuming he's better at what he's doing than everyone else posting reviews, without any actual reason to believe he's the only one with superior testing powers.
Well, every other reviewer tested this CPU when it first came out, and Logan said something about using some updates that were not out for the masses yet.
Year ago, I think you are unaware of the update I am talking about. Atm you need to contact Microsoft to get it. It has something to do with fixing an issue where the CPU would randomly dump the cache.
On January 28 2013 19:43 Pusekatten wrote: Year ago, I think you are unaware of the update I am talking about. Atm you need to contact Microsoft to get it. It has something to do with fixing an issue where the CPU would randomly dump the cache.
Well then who cares? Come back later when someone's actually testing a real-world product available to the average consumer. If an ordinary windows update could produce those results, we'd all be saying "buy AMD then download this update" with an appropriate link to microsoft.com. After someone other than "one dude," no matter who said dude is, reproduced the same results, of course.
Edit: Not to mention that, if true, it would be a great reason to avoid/make fun of AMD: a company showing itself to be so incompetent that it doesn't ship software with its product that can practically double its performance.
And then there's one commenter right below speculating that they should have "dump[ed] all that cache" and instead used more of the die area on traditional CPU core logic (decoder, execution units, etc.). edit: which is pretty hard to say for sure if you didn't help design the architecture and know all its inner workings, but that would seem pretty plausible. That said, with current and future mobile derivatives and also HSA in mind as their intended future, it makes sense for the CPU cores themselves to be leaner so there is more room for GPU processing elements.
Anyone else have any input on a DECENT mobo that won't gimp me but doesn't kill the wallet? Not looking to overclock or anything. Prefer from Microcenter, for an i5-3570K.
On January 29 2013 00:25 Malpractice.248 wrote: Anyone else have any input on a DECENT mobo that won't gimp me but doesn't kill the wallet? Not looking to overclock or anything. Prefer from Microcenter, for an i5-3570K.
What features would you consider gimped by not having? Some users want a whole lot of USB ports out back and dislike USB hubs for whatever reason; others want coax S/PDIF, Bluetooth, SLI / Crossfire support, whatever. There's not much to say otherwise. Even the cheapest stuff will work for many users. If you don't need overclocking support, you can mostly pick any series 7 chipset motherboard (B75, H77, Z77, etc.) or even an older one.
I guess the i5-3570k is there because others aren't as discounted? You don't need the overclocking support and probably wouldn't use the HD 4000 either.
That has most of the things people are looking for and can be used to OC if you change your mind. You can save money by getting something with fewer features.