|
On December 10 2012 16:25 QoGxDyNaSTy wrote: i run in windowed fullscreen and have dual monitors. would running ultra and having 2 monitors effect my VRAM?
Yes. Higher graphics settings (texture quality especially) and higher resolution (720p vs 1080p+) dramatically increase VRAM usage, and so does running 2 monitors.
With 1 monitor at ~720p on medium or high, you'll almost never break 500MB of VRAM in 1v1 or the unit preloader map. On Ultra you won't break 750MB in 1v1/unit preloader (it might spike past it once in a while, so 500MB is too little, but 768MB is more than comfortable).
At 1080p+ on Ultra, I imagine 1GB would be enough, but you'd be pushing it, and with 2 monitors I'm not sure 1GB is enough (it might be, but you'd definitely push past 800MB of VRAM). You could be comfortable, but I think you'd surpass 1GB with 2 monitors.
Here's what you should do: download HWiNFO and just run it. See if your GPU VRAM usage ever goes over your limit. You're definitely pushing it at 1080p+, Ultra, 2 monitors.
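If you'd rather log it than watch the HWiNFO window, a quick script along these lines works too. This is only a sketch and assumes an NVIDIA card with the nvidia-smi tool on the PATH; HWiNFO shows the same number in its sensor list without any scripting:

import subprocess
import time

def vram_used_mb():
    # ask nvidia-smi for dedicated VRAM in use, in MB, for the first GPU
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

peak = 0
try:
    while True:
        used = vram_used_mb()
        peak = max(peak, used)
        print("current: %d MB, peak so far: %d MB" % (used, peak))
        time.sleep(1.0)
except KeyboardInterrupt:
    print("peak VRAM observed: %d MB" % peak)

Leave it running during a 1v1 or the unit preloader map and the peak tells you whether you're actually bumping into your card's limit.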
|
On December 11 2012 05:20 Capped wrote:Forced to play on a laptop without its cooling fan (Shipped everything from UK to Sri lanka - not due for a few days) i dont know why i didnt ship its fan, FU  Basically, ive got it raised about 2/3 an inch off a glass table, big 16" pedastal fan directed at it (hot part, the fan vent etc) and its still running hot, i need to clean it as its running about 10 degrees higher on average idle / under load. Any ideas how to help reduce heat further? Also, while playing guild wars 2 Speedfan is registering temps of about 90-92oC but the moment i minimise (and im talking 2 seconds max) it drops straight back down to 80-84oC. This is the GFX. The CPU is sitting between 75-80 constant. Im aware i can run my GPU at about 80 without doing damage, 90 is too high though but i think its a false positive from speedfan, considering the massive jump downwards as soon as i minimise. Also, what is a safe operating temp for a CPU? is 75-80 ok-ish, for now? I am resting it every 30 mins to 1 hour, letting it completely cool to below 40 or outright turning it off for 10 mins too...
Laptop CPUs often push 80-90 degrees. 75 is nothing to worry about. It's going to run hotter simply because a laptop has far less cooling headroom than a desktop.
|
Nope. Sc2 with 4x AA will not come close to using 1GB of VRAM. Maybe BF3 with ultra textures and 16xAF/4xAA, but not sc2 by any stretch. Ever.
BF3 and other first-person shooters use relatively little VRAM for how demanding they are, because your on-screen vision is limited by hallways, buildings, et cetera - what's in front of you. In an FPS your view of the map is limited, so VRAM usage stays comparatively low.
SC2, on the other hand, doesn't take much VRAM in absolute terms because it's a fairly old game, but compared to a similarly intensive FPS (say, one from around 2008) it uses a lot more, because of how many units there are and how big the maps are. VRAM usage spikes based on what's on screen, but textures for the opponent's base, or for units that have died or left the screen, still sit in memory.
Throw in AA and you're going to significantly increase VRAM usage, a good 20%+.
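Rough numbers on why, as a back-of-envelope sketch only - the multisampled colour and depth targets alone grow several-fold with 4x AA, and real engines allocate plenty of other buffers on top of this, so treat it as a floor rather than a measurement:

width, height = 1920, 1080
bytes_per_pixel_color = 4      # RGBA8 colour target
bytes_per_pixel_depth = 4      # typical 24-bit depth + 8-bit stencil
samples = 4                    # 4x MSAA

no_aa = width * height * (bytes_per_pixel_color + bytes_per_pixel_depth)
with_aa = no_aa * samples      # multisampled colour + depth targets
print("colour+depth, no AA:  %.1f MiB" % (no_aa / 2**20))
print("colour+depth, 4x AA:  %.1f MiB" % (with_aa / 2**20))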
I tested VRAM usage in SC2 at 1360x768 and found that on Ultra with no AA, 512MB was way too little; I was generally maxing around 600-700MB in 1v1 and the unit preloader. I'm pretty sure that at 1080p+ with AA you're going to go way past 800MB. 1GB might be enough, but with AA, 1080p+, and 2 monitors you'd be pushing it; I really don't think it'd be enough on such a setup.
You are definitely wrong in saying "sc2 won't come close to 1gb" though. Definitely wrong.
You mean you'll outperform it at encoding? Maybe, but I doubt it, the i5 has 4 much faster cores than an OCed x6, they'd probably be close at 4.2, not much either way.
You seem to say a lot of things without knowing for sure...
Phenom II outperforms comparably priced Intel chips when it comes to H.264 encoding (note: a Phenom II X4 is comparable to a Pentium in price, not an i5, although an X6 is closer to an i3 or lower-end i5). Intel is the better chip for sure at the same price - they run cooler and do 95% of applications 5-10% better (not noticeably better, just consistently better). But when it comes to H.264 encoding performance, both first pass and second pass, Phenom beats Intel because it has more physical cores at the same price point.
http://www.anandtech.com/bench/Product/203?vs=363
As you can see, a Phenom X6 beats an i5-2400 in second pass and is only slightly behind in first pass. Overclock it by 25%, which is a very conservative 24/7 overclock on a Phenom X6, and it'll outperform a stock i5 like the i5-2400, which can't be overclocked.
The Phenom x6 is a horrible deal, both new and used - it's priced similarly to an i5, which if overclocked will crush an overclocked Phenom x6, and beat it in every application by a significant amount. I'm not defending the chip at all.
But when it comes to encoding performance, Phenom beats Intel at the same price point, especially when overclocked, although the i5 is just superior in every way once it's overclocked too (and it's not really fair to compare an overclocked Phenom X6 to a stock i5).
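For anyone unfamiliar with what "first pass / second pass" refers to here: it's a two-pass x264 encode, roughly as sketched below. The filenames and bitrate are placeholders, and it assumes the x264 command-line tool is installed (the /dev/null output is the usual Linux-style way to discard the first pass's video):

import subprocess

src, out, kbps = "capture.y4m", "stream.mkv", "3500"

# Pass 1: analyse the clip and write a stats file; the video output is discarded.
subprocess.run(["x264", "--pass", "1", "--bitrate", kbps,
                "--preset", "veryfast", "-o", "/dev/null", src], check=True)

# Pass 2: re-encode using the stats file for smarter bit allocation.
subprocess.run(["x264", "--pass", "2", "--bitrate", kbps,
                "--preset", "veryfast", "-o", out, src], check=True)

Both passes spread across all available cores, which is exactly where the X6's extra physical cores pay off.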
All I'm getting at here is that the poster can overclock his Phenom X6 for a significant increase in performance, and that his chip is far from weak when it comes to streaming.
http://www.anandtech.com/bench/Product/203?vs=701
Non-OC Phenom X6 vs. Ivy Bridge i5 (let's treat the IB i5 as a stand-in for an overclocked SB i5; factor in a 25% overclock on the Phenom and its H.264 performance is going to surpass the i5's).
I mean it's all very arguable and close: the i5 is way better than the Phenom X6 overall, and an OC i5 beats a Phenom X6 at streaming. But an OC Phenom X6 will beat a stock i5 at streaming. It loses in everything else, but at streaming it does very well.
The same story repeats with the Phenom II X4 vs. the Pentium G850, both of which are priced at ~$75 (the Intel is actually slightly more expensive). Except in that case the Phenom X4 actually performs almost twice as well at encoding as the G850, especially when overclocked. Again, the G850 beats the Phenom X4 slightly in everything else, but the Phenom X4 is the better contender for streaming plus an overclock.
You should generally buy Intel, and I've made points before about Phenom for budget overclock+streaming builds at the sub-$100 level, and about the i5 being superior when you can afford it; my point here is simply that the Phenom X6 will stream quite well.
A stock one, yes, but if he matches your overclock he'll have a significant (>50%) lead in CPU-bound game performance while matching the multi-threaded performance. That shows up as much higher minimum and average fps: the framerate won't dip nearly as low with screen-capture methods, and he can hit the same encoding settings as the X6 while holding >60fps (or close to it) for a higher percentage of the game or a battle, whereas the Phenom, being behind in game performance, dips a lot lower than the i5.
Of course. I'm just saying his Phenom X6 is no weakling when it comes to streaming. He should easily be able to stream 720p@60fps on a Phenom X6 - even an X4 can manage that.
You really have to overclock a Phenom II to make the chip worthwhile, especially the X6. Otherwise Intel is better at everything and only slightly worse at streaming at the same price point - the exception being the i5 vs. the X6, where the i5 wins outright.
Don't get me wrong, i5 >>>>>>>> X6. But that guy should have no problem streaming on his X6; it should be smooth as butter, especially with an overclock.
Forced to play on a laptop without its cooling fan (shipped everything from the UK to Sri Lanka - not due for a few days). I don't know why I didn't ship its fan, FU
Basically, I've got it raised about 2/3 of an inch off a glass table, with a big 16" pedestal fan directed at it (the hot part, the fan vent, etc.), and it's still running hot. I need to clean it, as it's running about 10 degrees higher than usual on average at idle and under load.
Any ideas how to help reduce heat further?
Also, while playing Guild Wars 2, SpeedFan registers temps of about 90-92°C, but the moment I minimise (and I'm talking 2 seconds max) it drops straight back down to 80-84°C. This is the GPU.
The CPU sits at a constant 75-80°C.
I'm aware I can run my GPU at about 80°C without doing damage; 90°C is too high, though I think it's a false reading from SpeedFan, considering the massive jump downwards as soon as I minimise.
Also, what is a safe operating temp for a CPU? Is 75-80°C OK-ish, for now?
I'm resting it every 30 minutes to an hour, letting it cool completely to below 40°C, or outright turning it off for 10 minutes too...
What CPU? What model laptop?
Depends on your CPU. If you're running an AM3 chip, that could be seriously damaging it. If it's an Intel, it might be okay. It really depends. I'd say 75-80°C at idle means you're destroying your laptop very quickly, considering load temps would then reach 90+. Turn off the laptop.
Short of getting a ton of dry ice or putting the laptop in a freezer with cables run out to a monitor and peripherals, you need to get a fan. How crucial it is that you stop using the laptop and turn it off - whether you're destroying it quickly or slowly - depends on what laptop you have and whether these are idle or load temps. You really aren't telling us enough...
You should be seeing a massive jump downwards in GPU temps as you minimize your game... A component cools quite rapidly as soon as it goes from load to idle: it takes continuous power to hold a part above ambient, and even more to push its temperature higher, so the moment the load drops, the temperature falls. Basic physics. It does not sound like a false reading.
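To put toy numbers on that (everything below is invented purely for illustration - real thermal mass, airflow and idle temps will differ), a simple exponential-decay model already reproduces roughly a 10°C drop within a couple of seconds:

import math

t_idle = 65.0   # assumed idle equilibrium temperature, degrees C
t_load = 91.0   # reading under load
k = 0.25        # assumed cooling constant, per second

for t in (0, 1, 2, 5, 10):
    # temperature decays exponentially from the load reading toward idle
    temp = t_idle + (t_load - t_idle) * math.exp(-k * t)
    print("%2d s after the load drops: %.1f C" % (t, temp))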
Also, 90°C can be perfectly okay for a GPU (it's on the high end; if those are temps from a GPU stress test and NOT a video game, it's okay, but if those are video-game temps it's actually a bit worrying). That doesn't mean it's okay for everything ON the graphics card, though. PCBs are commonly rated for only 90°C, so while your chip might be okay, your PCB might deteriorate and take permanent damage. It's fine once or twice for a few minutes, but it will definitely cause damage in the long term, if not the short term.
Then there are the GPU VRMs, which frequently blow out and may be rated for only 100°C. While they only fail outright at 100°C, they degrade seriously a good 10-20°C before then, causing hardware problems, corruption, artifacting and instability as power stops being delivered cleanly to the GPU components.
Until we know more about what you're running and you give us some more information, I'd strongly recommend you turn off your laptop, or at the very least stop fucking playing StarCraft 2 on it. You're not going to feel too smart when your laptop breaks or dies within 3 years because you had to play a couple of games.
There's a reason laptops come with cooling components stock... If it were a desktop, you'd have more wiggle room. Laptops and mobile devices are constantly fighting heat; if heat weren't a problem, laptops would be infinitely small and infinitely powerful. A dead fan on a desktop can be okay; a dead fan on a laptop is just death.
Laptops tend to be rated for slightly higher temps, but we don't know what you're running. On some systems your temps might be tolerable; on others they're bad. Get it? And a missing CPU fan doesn't just hurt your CPU - it's meant to cool the entire system with residual airflow, so everything is getting hotter.
|
First of all, Chill - the fuck - out.
By "cooling fan" i meant one of those cooling pads with a fan, the actual fan works fine -_-
I dont know if your tired or what but spouting endless nonsense about not knowing if temps are idle, under load via video games or prime95 and all because "I wanted to play a couple games of starcraft (fucking) 2" when i clearly stated "I get these temps from speedfan while playing guild wars 2" is quite baffling to me, along with the aggressive tone of your post, i didnt post my specs however.
The laptop is custom from PCSpecialist.
The CPU is an Intel i5 35somethingsomething. The GPU is a GT 520M.
And like I said, it's running hotter than it has in the past, but alas, my compressed air and cleaning tools are with the rest of my things in a box 70km away.
Thanks for taking the time to reply.
|
Laptop question for y'all - what's the best option for a laptop under $500? I only want to play 1v1s on low settings @ 1366x768 and be sure that late-game battles stay smooth, above 30fps. Candidates below.
+ Show Spoiler +
I would have guessed the i5 and HD 4000 combination to be best, but I'm a little confused as to why the A6 is performing as well as, or even slightly better than, the A8 processor. The previous-generation A8 is even weirder, as an A8-3520M 1.6GHz / Radeon HD 6620G combo seems just as good as current-generation AMD offerings. Thoughts?
|
On December 11 2012 09:15 pshosh wrote:Laptop question for ya'll - best option for a laptop under $500? I only want to play 1v1s, low settings @ 1366 x 768 and be sure that late game battles stay smooth, above 30fps. Candidates below. + Show Spoiler +I would have guessed the i5 and HD 4000 combination to be best, but I'm a little confused as to why the A6 is performing as well, even slightly better, than the A8 processor. The previous generation A8 is even weirder, as a A8-3520M 1.6GHz Radeon HD 6620G combo seems just as good as current generation AMD offerings. Thoughts? First of all, notebookcheck results are kind of spotty and inconsistent, so read between the lines.
But I don't see the A6-4400M / 7520G doing better. Where? You linked the integrated graphics pages. A6-4400M is a dual core (single module) at relatively high clock speeds, whereas the A8-4500M is a quad core (two modules) at lower clock speeds. For CPU performance, if a task doesn't make use of the extra cores, having the higher clock speed is an advantage.
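A crude way to see that trade-off (the base clocks below are approximate, and this ignores turbo, IPC and memory entirely, so it only illustrates the direction of the effect, not real SC2 frame rates):

def effective_throughput(cores, clock_ghz, threads_used):
    # naive model: work spreads perfectly over min(cores, threads) at the given clock
    return clock_ghz * min(cores, threads_used)

chips = {
    "A6-4400M (2 cores @ ~2.7 GHz)": (2, 2.7),
    "A8-4500M (4 cores @ ~1.9 GHz)": (4, 1.9),
}

for threads in (1, 2, 4):
    print("workload using %d thread(s):" % threads)
    for name, (cores, clock) in chips.items():
        print("  %s -> %.1f GHz-equivalent" % (name, effective_throughput(cores, clock, threads)))

With one or two threads the A6's clock advantage wins; only a workload that actually loads four cores lets the A8 pull ahead, and SC2, which mostly leans on a couple of threads, isn't that workload.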
Last-generation AMD Llano can be better than Trinity because Trinity uses updated Bulldozer (Piledriver) CPU cores, while Llano uses the old K10 cores from the desktop Phenom II / Athlon II, which are generally better per clock, since current iterations of Bulldozer are mostly a miserable failure. Which one wins depends on the workload and clock speeds.
For low, take the Ivy Bridge i5.
|
Trinity should be better; check actual laptop reviews to verify. NotebookCheck's benchmarks often don't seem very meaningful. What is low, medium, high? What did they do to test it?
The i5/HD 4000 could lose because Trinity has a considerably better iGPU. That's about it.
|
1v1 low @ 1366 x 768 should be no problem for HD 4000 running ~1100 MHz. It's more important for the CPU portion not to suck when we're talking late game minimum fps. For other games, Trinity is often a lot better.
|
But I don't see the A6-4400M / 7520G doing better. Where? You linked the integrated graphics pages. A6-4400M is a dual core (single module) at relatively high clock speeds, whereas the A8-4500M is a quad core (two modules) at lower clock speeds. For CPU performance, if a task doesn't make use of the extra cores, having the higher clock speed is an advantage.
I thought the SC2 benches were halfway down the page under that link. I see the 6620G and 7520G getting ~30fps under medium detail, while the 7640G gets about 23fps under similar conditions.
+ Show Spoiler +
Thanks for the comments, especially on clock speed. It was something I expected, but I'm very unfamiliar with AMD's processor lines.
|
Hey guys, when watching a replay it stutters every 10 in-game seconds - at 10, 20, 30 and so on - for maybe half a second each time. I'm running on a laptop, so it shouldn't be the SSD problem I've read about. Normally in game it runs at about 40-60fps, but in replays I have this perpetual stutter. Is this a bug? Is it fixable? Thanks
EDIT: I've tried partitioning my hard drive and putting the temp files on a separate partition, which hasn't helped. I also tried the same thing with a USB drive and it made it about 20 times worse.
|
On December 11 2012 06:34 Belial88 wrote:Show nested quote +On December 10 2012 16:25 QoGxDyNaSTy wrote: i run in windowed fullscreen and have dual monitors. would running ultra and having 2 monitors effect my VRAM? Yes. The higher the graphics settings, specifically your texture settings, and the higher your resolution (720 vs 1080+), will dramatically affect VRAM, as will having 2 monitors. On 1080+ Ultra, I imagine that 1GB would be enough, but you'd be pushing it, and with 2 monitors I'm not so sure if 1GB would be enough (it might be, but you'd definitely push past 800mb VRAM). I mean you could be comfortable, but I think you'd surpass 1GB on 2 monitors. Here's what you should do - download something called HWINFO. Just run it. See if your GPU VRAM usage ever goes your limit. Your definitely pushing it on 1080+, Ultra, 2 monitors. Secondary monitor is a pittance:
1600x1200 requires ~15 MB for double-buffered output (1600 × 1200 × 4 bytes (32-bit RGBA) × 2 for front and back buffers), or ~7.5 MB for one screenful.
1920x1080 only requires ~16,200 KB (~16 MB): 1920 × 1080 × 4 bytes × 2 for two screenfuls.
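Spelled out, the same arithmetic in script form (assuming a 32-bit colour buffer, front plus back):

def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=2):
    # bytes needed for double-buffered 32-bit output at a given resolution
    return width * height * bytes_per_pixel * buffers

for w, h in ((1600, 1200), (1920, 1080)):
    b = framebuffer_bytes(w, h)
    print("%dx%d: %d KB (~%.1f MB)" % (w, h, b // 1024, b / 2.0**20))

Either way it's on the order of 15 MB per extra monitor, which is why a second screen barely matters next to a 1 GB card.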
Belial88 wrote: BF3 and other first person shooters don't use much VRAM because your on-screen vision is limited by hallways, buildings, et cetera - what's in front of you. In FPS, your view of the map is limited so VRAM usage is actually quite low.
My 560ti with 1GB of vram uses all of it during BF3 with ultra textures, ultra mesh distance, 16xAF at 1080p (no AA) whilst maintaining about 70-80 fps outside. I have checked before with msi afterburner, it chews vram on ultra textures.
Sc2 does not as I recall, I'll play some games windowed and check.
EDIT: ya it uses like 700-750 mb on ultra for me.
No it wont (by a non-trivial amount anyway), they'll be neck and neck still if you're using the same cooler, as you'll get like .3-.4 ghz less on the x6 at least than the 2500k. Correct me if I'm wrong, but I don't see x6 systems on a 212+ above 4.0-4.2 ghz at all, and I see 2500ks at 4.6ish maximum usually. Hell my 2500k @4.4 with a 212+ doesn't break 65C ever with p95/intel burn test/whatever, and I have ambients above 30c inside (no A/C in the australian summer, yay). At stock they're pretty close, the x6 has a small lead over sb, small loss to ib, given the small advantage in overclocking (sb), they're going to be close enough not to matter.
Belial88 wrote: I mean it's all very arguably and close, the i5 is way better than the Phenom x6, and when it comes to streaming, an OC i5 beats a Phenom x6. But an OC Phenom x6 will beat a stock i5 when it comes to streaming. It'll lose in everything else, but when it comes to streaming, it does very well.
Yeah, but comparing a x6 oced to a non-oced i5 isn't very realistic, as someone that wants to overclock a phenom x6 is going to want to overclock an i5. But it is way cheaper, and pretty much on par with the i5 stock for stock, or oc for oc.
Belial88 wrote: You should always buy intel, and I've made points before about Phenom for budget overclock+streaming instead of intel at sub-$100 level, and that the i5 is superior when you can afford it, but my point here is simply that the phenom x6 will stream quite well.
Yeah, it will stream well, but you said "better" before, when they will be pretty much the same (as they're both more than fast enough to encode), or possibly the higher frame rate ability from the i5 will be nicer, in a starcraft 2 stream (doesn't hold for other games, but this is a sc2 website).
|
I'm sorry in advance since this isn't tech related, but I recently bought a 7870 and they gave me a voucher for a free game. The dilemma is Far Cry 3 or Hitman: Absolution - which one? I like both, but I want to know your opinions.
|
Hello everyone, here are my questions.
So my buddy currently has a Pentium Dual-Core E6500 CPU, a GeForce 7950 GX2 GPU, and a Saga+ 400R PSU (I believe this is the PSU). What he wants to do is plug in my old Radeon 4870 in place of his 7950 GX2, but we noticed that his PSU doesn't have the two 6-pin connectors the Radeon 4870 requires - it only has one. So is it OK to just get a 2x 4-pin Molex to 6-pin adapter and plug that bad boy in? Also, the Radeon 4870 appears to have higher power consumption than the 7950 GX2, so would this power supply even be powerful enough for such a card?
TL;DR: Can I use a 2x 4-pin Molex to 6-pin adapter to supply the missing 6-pin connector the Radeon 4870 requires, and can I expect power issues with this PSU being only 400W?
If any more info is required for more accurate answers please let me know.
Thanks in advance everyone!
Edit: Thanks for the reply, Myrmidon! I'll let him know he should upgrade his PSU to avoid any hiccups. (Turns out his bro had a spare one for him.)
|
That's what Molex adapters are for. It's just +12V and ground wires; the electricity doesn't care which cables you used (though if the wires are very thin - which they don't seem to be - you could have issues under heavy load, with higher currents causing nontrivial voltage drops across the wires and too much waste heat).
FSP Saga+ is really kind of outdated and cheap (Saga II not that much better), but it's not a ticking time bomb liability. I think a similar design is used for some of the really old low-end Cooler Master Elite Power type of stuff, at least the ones that FSP makes for them.
Note that it only claims 276W on +12V, 350W in all. If it actually has OCP on the claimed 10A and 13A +12V rails, if the limits are set tight, and if Molex and PCIe +12V power are on the same rail (this last part is likely), then you could easily overload that rail by running an HD 4870, in which case it should just shut off. Probably it doesn't actually have OCP, or at least not with the limit set as low as 13A.
276W on +12V is really not that much for running a HD 4870, but it should be okay.
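As a rough sanity check (the ~150 W board-power figure for the HD 4870 and the 30 W for the rest of the system are assumptions from period reviews, not measurements of this machine):

rail_watts = 276    # labelled +12V capacity
gpu_watts = 150     # assumed HD 4870 board power under gaming load
cpu_watts = 65      # Pentium E6500 TDP
misc_watts = 30     # fans, drives, conversion losses (guess)

total = gpu_watts + cpu_watts + misc_watts
print("estimated +12V draw: %d W of %d W (%.0f%% of the label)"
      % (total, rail_watts, 100.0 * total / rail_watts))
print("HD 4870 alone: about %.1f A spread across the slot and the PCIe plugs"
      % (gpu_watts / 12.0))

So it fits on paper, but with very little margin on an aged unit, which is exactly the "it should be okay, but I wouldn't recommend it" above.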
All in all, the power supply is not good, and I wouldn't recommend that setup, but it should work most likely. A good 400W power supply would handle that easily. I'm just a little bit hesitant on an older unit of mediocre build, that has a frankly inflated rating.
|
My 560ti with 1GB of vram uses all of it during BF3 with ultra textures, ultra mesh distance, 16xAF at 1080p (no AA) whilst maintaining about 70-80 fps outside. I have checked before with msi afterburner, it chews vram on ultra textures.
Sc2 does not as I recall, I'll play some games windowed and check.
EDIT: ya it uses like 700-750 mb on ultra for me.
I'd expect so with 16xAF at 1080p. I meant that BF3, and other FPS games, use relatively low VRAM for how demanding they are, not that they use a low amount in absolute terms. Plenty of people play BF3 on 1.5GB+ VRAM cards for a reason.
No it wont (by a non-trivial amount anyway), they'll be neck and neck still if you're using the same cooler, as you'll get like .3-.4 ghz less on the x6 at least than the 2500k. Correct me if I'm wrong, but I don't see x6 systems on a 212+ above 4.0-4.2 ghz at all, and I see 2500ks at 4.6ish maximum usually. Hell my 2500k @4.4 with a 212+ doesn't break 65C ever with p95/intel burn test/whatever, and I have ambients above 30c inside (no A/C in the australian summer, yay). At stock they're pretty close, the x6 has a small lead over sb, small loss to ib, given the small advantage in overclocking (sb), they're going to be close enough not to matter.
Well, it will pass it by a trivial amount. I wasn't implying the X6 would blow the i5 out of the water when it comes to streaming, like the X4 does to the i3. X6s can hit 4GHz with a 212, sure.
The X6 is a terrible deal and an i5 is a million times better; my point to the guy who already made the mistake of getting an X6 was just that it's still a strong-performing chip and will stream like a boss. It's just not cost-effective for its performance, at all.
Yeah, but comparing a x6 oced to a non-oced i5 isn't very realistic, as someone that wants to overclock a phenom x6 is going to want to overclock an i5. But it is way cheaper, and pretty much on par with the i5 stock for stock, or oc for oc.
I know, I mentioned that. I was simply saying the X6 is a good chip and will stream HD easily. It's not a good chip for the price, and it's stupid to buy an X6, but if you already have one, you shouldn't try to upgrade if you're just trying to stream and/or play games. In gaming and particularly for streaming, the chip will perform very well.
OC vs. OC, the X6 gets destroyed by the i5. I don't see where you get that it's way cheaper; the Phenom X6 seems to be equal in price to, or more expensive than, an i5 from what I can see.
|
On December 12 2012 16:00 Belial88 wrote:I don't see where you see it's way cheaper, the phenom x6 seems to be equal in price or more expensive than i5 from what I see. My bad, I didn't realise that 1090/1100t were the only BE x6 processors, the other x6's are significantly cheaper than those two. ^.^
|
I have a question... I'm going to the US for 6 months to study. I have an unlocked Android phone at home that I want to take with me and use in the US (California). But so far I have only found or heard about plans which INCLUDE a phone, and no option to simply get a plan with a SIM card and phone number WITHOUT a phone.
Where do I look?
|
Nevermind, solved my own problem.
|
Is it possible that Windows 8 reduces the FPS in Starcraft 2?
|
Question a friend is asking me: what's better for a gaming laptop, an i7-3632QM @ 2.2 GHz with a 7670M, or an A10 @ 2.3 GHz with a 7730M?
AFAIK the i7 is definitely better, but I'm not sure about the 7670M vs. the 7730M. I'm not particularly trying to find this friend a better deal or anything (I don't even know the prices of these); I just want to help answer this question.
|