|
When using this resource, please read FragKrag's opening post. The Tech Support forum regulars have helped create countless desktop systems without any compensation. The least you can do is provide all of the information required for them to help you properly. |
On June 12 2013 22:32 Alryk wrote: Do you mean overclocked? If you mean stock settings, then everything can be tested just by using it. If the computer boots up and you don't find any weird issues, you're probably good to go; you don't really need to use stability tests for stock settings.
Thanks to everyone that responded. I am going to overclock, so I think I'll pick up Prime95 and CPU-Z to get overall monitoring for stability. I don't think I need FurMark since I don't really want to manually overclock my GPU. What about the HDD and SSD? Is it even necessary to monitor those? Is there a good all-inclusive monitoring program? Thanks, people.
|
On June 13 2013 00:19 Spec wrote: Thanks to everyone that responded. I am going to overclock, so I think I'll pick up Prime95 and CPU-Z to get overall monitoring for stability. I don't think I need FurMark since I don't really want to manually overclock my GPU. What about the HDD and SSD? Is it even necessary to monitor those? Is there a good all-inclusive monitoring program? Thanks, people.
CrystalDiskInfo will show you the SMART data, which lets you know if the HDD is deteriorating, and I think it will also tell you if the SSD is running out of writes - not 100% sure on that, though.
Also, benchmarking your SSD may be worth it to check that you have your drivers and settings straight. CrystalDiskMark works, and there's some other one, HDDtacho or something?
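If you'd rather check SMART from a script than from CrystalDiskInfo's GUI, here's a minimal sketch. It assumes the smartmontools package (the smartctl command) is installed and that you run it with admin/root rights; the device path "/dev/sda" is just an example and will differ per machine.

```python
import subprocess

def smart_report(device="/dev/sda"):
    # "-H" asks for the overall SMART health assessment (PASSED/FAILED).
    health = subprocess.run(["smartctl", "-H", device],
                            capture_output=True, text=True)
    # "-A" lists the vendor attributes (reallocated sectors, SSD wear, etc.).
    attrs = subprocess.run(["smartctl", "-A", device],
                           capture_output=True, text=True)
    return health.stdout, attrs.stdout

if __name__ == "__main__":
    overall, attributes = smart_report()
    print(overall)
    print(attributes)
```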
|
On June 12 2013 23:48 dunne_bo wrote: Hello gurus of TL.net, I need some advice on a build for my brother's new desktop. I don't know much about building desktops (I usually just buy the best my budget can afford). My brother works as a graphic designer, mostly Photoshop and some other vector drawing programs such as Illustrator. I think he requires a good GPU since he works with very high resolution images and stuff like that. He also wants to play some demanding games like BioShock Infinite. So I was doing some research, but I have no clue where to begin. I'm just gonna drop some preferences he has:
- Nvidia GPU is preferred
- Maybe a Haswell processor... not sure about this one; he usually sticks with a desktop for a long time, so he might wanna get the latest tech available
- ASUS motherboard (he likes the brand lol)
Well I hope this is enough info for a build suggestion. About the budget...I think he wants to spend 1.2K and no more than that.
Thanks in advance, guys!!

Tell him that if he wants bang for buck, he'll have to reconsider his prejudices. AMD generally has better bang for buck than Nvidia when we consider single-GPU configurations.
ASUS are a good brand. Nothing wrong with them. Their stuff is a bit expensive, but hey, you generally get what you pay for - but to the exclusion of other brands? I wouldn't agree at all.
Anyway, so that we may properly address your question, we need some more information. What is your budget? Do you need an OS? Do you have a monitor already? If not, what kind of monitor would you like? If you do have one, what size/res is it?
|
On June 11 2013 11:42 Cyro wrote: And I own a 4770K and a Silver Arrow. By the way: why did you choose the Silver Arrow over the NH-D14?
|
Hello fellow TL members. I'd like to hear your advice on a matter I was unable to find the answer to:
Will better integrated graphics improve the performance of a dedicated graphics card?
Specifically, I own a GTX 650 graphics card and have the possibility to pair it with an Intel Core i3-3220 (Intel HD 2500 graphics) or with an Intel Core i3-3225 (Intel HD 4000). The processing cores of those are identical (or so I believe); the only difference is the graphics part. So will using the i3-3225 improve the performance of the GTX 650, or is it just 'dead weight' and the graphics part of the processor is not used at all?
Thank you in advance
|
Hm, I'm probably really bad at OC, or my chip just doesn't like me much.
Got a GA-Z77X-D3H with a 3570K and it only gets stable at 4.1 GHz without WHEA errors etc. at 1.28 Vcore; at 4.2 GHz on 1.3 V I get dozens of WHEA errors while running Prime.
Temps with Prime run up to 69 C, but I run my HR-02 Macho on silent via BIOS (515 RPM), so temps could be better if I cranked the Macho up.
|
On June 13 2013 03:14 zdubius wrote: Hello fellow TL members. I'd like to hear your advice on a matter I was unable to find the answer to:
Will better integrated graphics improve the performance of a dedicated graphics card?
Specifically, I own a GTX 650 graphics card and have the possibility to pair it with an Intel Core i3-3220 (Intel HD 2500 graphics) or with an Intel Core i3-3225 (Intel HD 4000). The processing cores of those are identical (or so I believe); the only difference is the graphics part. So will using the i3-3225 improve the performance of the GTX 650, or is it just 'dead weight' and the graphics part of the processor is not used at all?
Thank you in advance

You shouldn't pay extra for the HD 4000.
It won't add anything to the GTX 650. If you have both the graphics card and the integrated graphics enabled, the two are still separate. The integrated graphics get used if you connect a monitor to the motherboard: the programs displayed on that monitor will use the integrated graphics, which isn't terribly useful, since you'll want programs to use the GTX 650, I assume.
There's also "Quick Sync" for encoding video, which might be neat to use (though I don't know which programs can do that), but I think that's no different between the HD 2500 and the HD 4000.
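As an illustration of the Quick Sync point (not something from the post above): ffmpeg builds that are compiled with QSV support expose an "h264_qsv" encoder, which runs the encode on the Intel iGPU. A rough sketch; the file names and bitrate are placeholders.

```python
import subprocess

def encode_with_quicksync(src="input.mp4", dst="output.mp4"):
    # "-c:v h264_qsv" selects the Quick Sync H.264 encoder; requires an ffmpeg
    # build with QSV support and a working Intel iGPU driver.
    subprocess.run(
        ["ffmpeg", "-i", src, "-c:v", "h264_qsv", "-b:v", "5M", dst],
        check=True,
    )

if __name__ == "__main__":
    encode_with_quicksync()
```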
|
On June 13 2013 03:51 {ToT}ColmA wrote: Hm, I'm probably really bad at OC, or my chip just doesn't like me much.
Got a GA-Z77X-D3H with a 3570K and it only gets stable at 4.1 GHz without WHEA errors etc. at 1.28 Vcore; at 4.2 GHz on 1.3 V I get dozens of WHEA errors while running Prime.
Temps with Prime run up to 69 C, but I run my HR-02 Macho on silent via BIOS (515 RPM), so temps could be better if I cranked the Macho up.

Check if you have "Load Line Calibration" (LLC) enabled. Your actual Vcore could be a lot lower than those 1.28 V under load.
That's what I posted the last time you mentioned something about overclocking, and there's a screenshot where you'll find it:

On June 08 2013 18:47 Ropid wrote:
On June 08 2013 17:32 {ToT}ColmA wrote: Okay, I was just insane after waking up and did OC again, just really newb-like, putting it on auto volt at 4 GHz. Not a really long Prime test, 2-ish hours, and max temp was 66; guess I could go a higher clock with that high voltage, but who cares, it should be fast enough anyway.
Try this: go into the UEFI (aka BIOS), go into the "3D Power Control" screen, and set "Voltage Response" to Fast and "VCore Loadline Calibration" to High. It's this screen: http://www.ninjalane.com/images/ga-z77x-ud3h/bios_3dpower.jpg
Go to the voltage screen and set VCore to "Normal" instead of Auto. The following line (named DVID) will light up, which is an offset you can add if you need more voltage. For your 4 GHz, an offset of zero will very likely still be enough. You can do your experimenting in Windows without rebooting, using the EasyTune6 software. After you are done, you can just take the offset voltage you used and set it in the UEFI.
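A rough sketch of why LLC matters here, with made-up numbers (the real droop depends on the board and the LLC level): the voltage the CPU actually sees under load is roughly the base VID plus the DVID offset minus whatever vdroop the board allows, and LLC is what reins the droop in.

```python
def load_vcore(vid, dvid_offset, vdroop):
    """Approximate voltage under load = base VID + DVID offset - vdroop."""
    return vid + dvid_offset - vdroop

# Illustrative values only:
print(load_vcore(vid=1.28, dvid_offset=0.0, vdroop=0.10))  # ~1.18 V, LLC off: WHEA errors likely
print(load_vcore(vid=1.28, dvid_offset=0.0, vdroop=0.01))  # ~1.27 V, LLC on High: close to the set value
```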
|
On June 13 2013 03:51 {ToT}ColmA wrote: Hm, I'm probably really bad at OC, or my chip just doesn't like me much.
Got a GA-Z77X-D3H with a 3570K and it only gets stable at 4.1 GHz without WHEA errors etc. at 1.28 Vcore; at 4.2 GHz on 1.3 V I get dozens of WHEA errors while running Prime.
Temps with Prime run up to 69 C, but I run my HR-02 Macho on silent via BIOS (515 RPM), so temps could be better if I cranked the Macho up.

Jesus, are you sure? Are you setting the voltage manually or dynamically? The dynamic Vcore can only be set + or - off the stock voltage (probably 1.1 V).
Find a stable manual voltage first, and then work on the dynamic Vcore (which is a bit trickier; I have that board and CPU).
|
What I did was just set a manual Vcore for the CPU and the multiplier to 41 / 42; that's all I did via the BIOS. Didn't install EasyTune6.
Totally missed your post the last time, @Ropid.
Gonna install EasyTune6 to check it out.
|
If I think about it, I'm pretty sure it's that LLC setting. Your temperatures are very low for 1.3 V. It should be towards 90 C or something (EDIT: that 90 C guess is probably off by a lot and exaggerated). LLC being off could mean perhaps 0.1 V lower voltage while prime95 is doing its stuff.
That EasyTune6 suggestion was just so you can be a bit lazy and increase the voltage without having to reboot if you see WHEA warnings. You actually have very few settings in EasyTune6 compared to the UEFI: you can't use a fixed voltage, and you can't change things like the 3D power settings.
|
On June 13 2013 03:59 Ropid wrote: You shouldn't pay extra for the HD 4000. [...]
Thank you very much, Ropid.
|
Did the changes you suggested; the CPU is getting a lot warmer now lol.
Past test 4 of Prime it's already at 78 C, ah well. CPU-Z shows a Vcore of 1.356 now lol.
|
Right! I bet the warnings are now gone. So you can go down a lot with the voltage now, or you could go higher with the multiplier until the errors start showing up again. Personally, I don't think 78 C is scary, even though prime95 isn't the stress test that can produce the most heat.
|
Well, I don't have any WHEA errors so far, temps at 79 C, fourth core always the hottest. What do you suggest I aim for temp-wise? Because that's what I'm most afraid of. Plus, I have no clue about LLC (or whatever it's called) and all these options in the BIOS :D
|
That 1.356 V is perhaps a bit much. Also, keep in mind that your temperatures might look pretty different if the graphics card is running at full speed for a while. It will heat up the case, and if that's 20 C, those 20 C get added to the CPU temperature. I wouldn't like that, even if it's only theoretical and there's no actual program that does both 100% graphics and 100% CPU at the same time.
EDIT: I looked through screenshots and stuff, and personally, it seems I aimed for prime95 staying below 80 C at all times. For my cooler, which is about half the size of yours, that's 1.255 V.
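Back-of-the-envelope version of that point, with made-up numbers: CPU load temperature roughly tracks the air temperature inside the case, so whatever the GPU adds to the case air gets added to the CPU reading too.

```python
def estimated_cpu_temp(case_air_temp, cpu_rise_over_air):
    # To a first approximation, CPU temp = case air temp + the CPU's own rise over it.
    return case_air_temp + cpu_rise_over_air

print(estimated_cpu_temp(25, 53))  # ~78 C with cool case air (CPU-only load)
print(estimated_cpu_temp(45, 53))  # ~98 C if the GPU warms the case air by 20 C
```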
|
Don't bother going over 1.3 V unless you have a high-end air cooler. Get 4.5/4.6 GHz stable and call it a day. Anything up to 85 C is perfectly safe.
|
Yeah, well, the problem is/was that my CPU was stable with my settings, without errors, at 4.1 GHz with a Vcore of 1.265, and 4.2 GHz wasn't stable with 1.3 Vcore, so I doubt it could go much lower if I wanted to go to a higher clock.
|
On June 13 2013 02:26 blueslobster wrote: By the way: why did you choose the Silver Arrow over the NH-D14?
It was on offer (more than 25% cheaper), and they are functionally pretty much the same.
|