Simple Questions Simple Answers - Page 244

Forum Index > Tech Support
Belial88
Profile Blog Joined November 2010
United States5217 Posts
December 10 2012 21:34 GMT
#4861
On December 10 2012 16:25 QoGxDyNaSTy wrote:
i run in windowed fullscreen and have dual monitors. would running ultra and having 2 monitors effect my VRAM?


Yes. Higher graphics settings (texture quality in particular) and higher resolution (720p vs. 1080p+) dramatically increase VRAM usage, as does running 2 monitors.

With 1 monitor at ~720p on medium or high, you would almost never break 500 MB of VRAM in 1v1 with the unit preloader. On ultra, you won't ever break 750 MB in 1v1/unit preloader (maybe once; 512 MB is too little, but 768 MB is more than comfortable).

At 1080p+ on Ultra, I imagine 1 GB would be enough, but you'd be pushing it, and with 2 monitors I'm not so sure 1 GB would suffice (it might be, but you'd definitely push past 800 MB of VRAM). You might be comfortable, but I think you'd surpass 1 GB on 2 monitors.

Here's what you should do: download a tool called HWiNFO and just run it. See if your GPU VRAM usage ever goes over your limit. You're definitely pushing it at 1080p+, Ultra, with 2 monitors.
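If you'd rather script this check than watch HWiNFO by hand, NVIDIA cards expose the same number through `nvidia-smi`. A minimal sketch, assuming an NVIDIA GPU; the canned line passed in at the bottom is illustrative output, not a live reading:

```python
import subprocess

def gpu_memory_mib(sample=None):
    """Return (used, total) VRAM in MiB, parsed from nvidia-smi CSV output."""
    if sample is None:
        # Live query; requires an NVIDIA driver to be installed.
        sample = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"], text=True)
    used, total = (int(x.strip()) for x in sample.split(","))
    return used, total

# Parsing a canned line so the sketch runs without a GPU:
used, total = gpu_memory_mib("712, 1024")
print(f"{used}/{total} MiB of VRAM in use")
```

Run it in a loop while playing and you can see whether you ever approach the card's limit.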
How to build a $500 i7-3770K Ultimate Computer:http://www.teamliquid.net/blogs/viewblog.php?topic_id=392709 ******** 100% Safe Razorless Delid Method! http://www.overclock.net/t/1376206/how-to-delid-your-ivy-bridge-cpu-with-out-a-razor-blade/0_100
Alryk
Profile Blog Joined November 2011
United States2718 Posts
December 10 2012 22:04 GMT
#4862
On December 11 2012 05:20 Capped wrote:
Forced to play on a laptop without its cooling fan (Shipped everything from UK to Sri lanka - not due for a few days) i dont know why i didnt ship its fan, FU

Basically, ive got it raised about 2/3 an inch off a glass table, big 16" pedastal fan directed at it (hot part, the fan vent etc) and its still running hot, i need to clean it as its running about 10 degrees higher on average idle / under load.

Any ideas how to help reduce heat further?

Also, while playing guild wars 2 Speedfan is registering temps of about 90-92oC but the moment i minimise (and im talking 2 seconds max) it drops straight back down to 80-84oC. This is the GFX.

The CPU is sitting between 75-80 constant.

Im aware i can run my GPU at about 80 without doing damage, 90 is too high though but i think its a false positive from speedfan, considering the massive jump downwards as soon as i minimise.

Also, what is a safe operating temp for a CPU? is 75-80 ok-ish, for now?

I am resting it every 30 mins to 1 hour, letting it completely cool to below 40 or outright turning it off for 10 mins too...


Laptop CPUs often push 80-90 degrees; 75 is nothing to worry about. It's going to run hotter because the design inherently leaves less cooling headroom.
Team Liquid, IM, ViOlet!
Belial88
Profile Blog Joined November 2010
United States5217 Posts
Last Edited: 2012-12-10 22:06:42
December 10 2012 22:05 GMT
#4863
Nope. Sc2 with 4x AA will not come close to using 1GB of VRAM. Maybe BF3 with ultra textures and 16xAF/4xAA, but not sc2 by any stretch. Ever.


BF3 and other first-person shooters use relatively little VRAM for their visual fidelity because your on-screen view is limited by hallways, buildings, et cetera: only what's in front of you. In an FPS your view of the map is constrained, so VRAM usage stays comparatively low.

SC2, on the other hand, doesn't take much VRAM in absolute terms since it's quite an old game, but compared to a similarly demanding FPS (say, a 2008 title), it uses a lot more because of how many units there are and how big the map is. While VRAM usage spikes based on what's on screen, textures at the opponent's base and for dead or off-screen units still occupy memory.

Throw in AA and you're going to significantly increase your VRAM usage, by a good 20%+.

I tested VRAM usage in SC2 at my 1360x768 resolution, and I found that on Ultra with no AA, 512 MB was way too little VRAM; I was generally maxing around 600-700 MB in 1v1 and the unit preloader. I'm pretty sure at 1080p+ with AA you're going to go way past 800 MB. 1 GB might be enough, but I seriously think AA, 1080p+, and 2 monitors would be pushing it. I really don't think it'd be enough on such a setup.

You are definitely wrong in saying "sc2 won't come close to 1gb" though. Definitely wrong.

You mean you'll outperform it at encoding? Maybe, but I doubt it, the i5 has 4 much faster cores than an OCed x6, they'd probably be close at 4.2, not much either way.


You seem to say a lot of things without knowing for sure...

Phenom II outperforms comparably priced Intel at H264 encoding (note: a Phenom II x4 is comparable to a Pentium in price, not an i5, although an x6 is closer to an i3 or lower-end i5 in price). Intel is the better chip for sure at the same price: they run cooler and do 95% of applications 5-10% better (not noticeably better, just consistently better). But when it comes to H264 encoding performance, both first pass and second pass, Phenom beats Intel because of its extra physical cores.

http://www.anandtech.com/bench/Product/203?vs=363

As you can see, a Phenom x6 beats an i5-2400 in second pass and is only slightly behind in first pass. Overclock it by 25%, which is a very conservative 24/7 overclock on a Phenom x6, and it'll outperform a stock i5 like the i5-2400, which can't be overclocked.

The Phenom x6 is a horrible deal, both new and used - it's priced similarly to an i5, which if overclocked will crush an overclocked Phenom x6, and beat it in every application by a significant amount. I'm not defending the chip at all.

But when it comes to encoding performance, Phenom beats Intel at the same price point, especially when overclocked, although the i5 is just superior in every way once overclocked (and, to be fair, it's not very even-handed to compare an overclocked Phenom x6 to a stock i5).

All I'm getting at here is that the poster can overclock his Phenom x6 for a significant increase in performance, and that his chip is far from weak when it comes to streaming.

http://www.anandtech.com/bench/Product/203?vs=701

Non-OC Phenom x6 vs. IB i5 (let's treat the IB i5 as a stand-in for an OC'd SB i5; factor in a 25% overclock and the Phenom's H264 performance is going to surpass the i5's).
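To make the clock-scaling argument concrete: encoding throughput on the same chip scales roughly linearly with core clock, so a 25% overclock buys roughly 25% more second-pass fps. A back-of-the-envelope sketch under that assumption; the stock fps figure is an illustrative placeholder, not a benchmark result:

```python
def scaled_fps(stock_fps, stock_ghz, oc_ghz):
    """Estimate encoding fps after an overclock, assuming linear clock scaling."""
    return stock_fps * (oc_ghz / stock_ghz)

# Phenom II x6 1090T: 3.2 GHz stock; 4.0 GHz is the conservative 25% OC above.
oc_fps = scaled_fps(stock_fps=20.0, stock_ghz=3.2, oc_ghz=4.0)
print(oc_fps)  # 25.0 -- a chip that ties the stock i5 now leads by ~25%
```

The assumption breaks down if encoding becomes memory-bound, but for x264 on these chips clock scaling is a reasonable first-order estimate.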

I mean it's all very arguable and close. The i5 is way better than the Phenom x6, and when it comes to streaming, an OC i5 beats a Phenom x6. But an OC Phenom x6 will beat a stock i5 when it comes to streaming. It'll lose in everything else, but at streaming it does very well.

The same story repeats with the Phenom x4 vs. the Pentium 850, both priced at ~$75 (the Intel is actually slightly more expensive). Except in that case the Phenom x4 performs almost twice as well as the Pentium 850, especially when overclocked. Again, the Pentium 850 slightly beats the Phenom x4 in everything else, but the Phenom x4 is the better contender for streaming plus overclocking.

You should always buy Intel, and I've made the case before for Phenom for budget overclocking + streaming at the sub-$100 level, with the i5 superior when you can afford it; but my point here is simply that the Phenom x6 will stream quite well.

A stock one, yes, but if he matches your overclock he will have a significant (>50%) lead in CPU-bound game performance while matching multi-threaded performance. That shows up as much higher minimum and average fps: fps not dipping nearly as low with screen-capture methods, and hitting the same encoding settings as the x6 while holding >60 fps (or close to it) for a larger share of a game or a battle, where the Phenom, being behind in performance, cuts it much lower than the i5.


Of course. I'm just saying his Phenom x6 is no weakling when it comes to streaming. He should easily be able to stream 720p@60fps on a phenom x6, much less an x4.

Really gotta overclock the Phenom ii to make the chip worthwhile, especially the x6. Otherwise Intel is better at everything and only slightly worse at streaming for the same price point, except the i5 vs x6.

Dont get me wrong, i5 >>>>>>>> x6. But that guy should have no problem with streaming on his x6, it should be as smooth as butter, especially with an overclock.

Forced to play on a laptop without its cooling fan (Shipped everything from UK to Sri lanka - not due for a few days) i dont know why i didnt ship its fan, FU

Basically, ive got it raised about 2/3 an inch off a glass table, big 16" pedastal fan directed at it (hot part, the fan vent etc) and its still running hot, i need to clean it as its running about 10 degrees higher on average idle / under load.

Any ideas how to help reduce heat further?

Also, while playing guild wars 2 Speedfan is registering temps of about 90-92oC but the moment i minimise (and im talking 2 seconds max) it drops straight back down to 80-84oC. This is the GFX.

The CPU is sitting between 75-80 constant.

Im aware i can run my GPU at about 80 without doing damage, 90 is too high though but i think its a false positive from speedfan, considering the massive jump downwards as soon as i minimise.

Also, what is a safe operating temp for a CPU? is 75-80 ok-ish, for now?

I am resting it every 30 mins to 1 hour, letting it completely cool to below 40 or outright turning it off for 10 mins too...


What CPU? What model laptop?

Depends on your CPU. If you're running an AM3 chip, that could be seriously damaging it. Serious damage. If it's an Intel, it might be okay. It really depends. I'd say 75-80 at idle means you're destroying your laptop very quickly, considering load temps will reach 90+. Turn off the laptop...

Aside from getting a ton of dry ice or putting the laptop in the freezer with cables run out to a monitor and peripherals, you need to get a fan. How crucial it is that you stop touching the laptop and turn it off, and whether you're destroying it quickly or slowly, depends on what laptop you have and whether these are idle or load temps. You really aren't telling us enough...

You should expect a big drop in GPU temps as soon as you minimize your game. A component cools quite rapidly when it goes from load to idle; it actually takes a ton of energy to keep something hot, and even more energy to raise its temperature. Basic physics. It does not sound like a false positive.

Also, 90°C can be okay for a GPU (it's on the high end; if that's the temp in a GPU stress test and NOT a video game it's okay, but if it's video-game temps that's actually a bit worrying), but that doesn't mean it's okay for everything ON the graphics card. PCBs are commonly rated for only 90°C, so while your chip might be okay, your PCB might deteriorate and take permanent damage. It's okay once or twice for a minute or so, but it will definitely cause damage in the long term, if not the short term.

Then there are also the GPU VRMs, which frequently blow out and may be rated for only 100°C. While they only blow out at 100°C, they suffer serious performance degradation a good 10-20°C before then, causing hardware problems, artifacting, and instability as power is no longer delivered cleanly to your GPU components.

Until we know more about what you're running and you give us some more information, and you figure out what you're actually doing, I'd strongly recommend you turn off your laptop, or at the very least stop fucking playing StarCraft 2 on it. You're not going to feel too smart when your laptop breaks or dies within 3 years because you had to play a couple of games.

There's a reason laptops come with cooling components stock... If it were a desktop, you'd have more wiggle room. Laptops and mobile devices are constantly fighting heat; if heat weren't a problem, laptops would be infinitely small and infinitely powerful. Having a dead fan on a desktop can be okay; having a dead fan on a laptop is just death.

Laptops tend to be rated for slightly higher temps, but we don't know what you're running; on some systems your temps might be tolerable, on others they're bad. Get it? A missing CPU fan isn't just hurting your CPU: it's meant to cool the entire system with residual airflow, so everything is getting hotter.
Capped
Profile Blog Joined June 2011
United Kingdom7236 Posts
Last Edited: 2012-12-10 23:18:30
December 10 2012 23:10 GMT
#4864
First of all, Chill - the fuck - out.

By "cooling fan" I meant one of those cooling pads with a fan; the actual fan works fine -_-

I don't know if you're tired or what, but spouting endless nonsense about not knowing whether the temps are idle, under load from video games, or from Prime95, all because "I wanted to play a couple games of starcraft (fucking) 2", when I clearly stated "I get these temps from SpeedFan while playing Guild Wars 2", is quite baffling to me, as is the aggressive tone of your post. I didn't post my specs, however.

The laptop is custom from PCSpecialist.

The CPU is an Intel i5 35somethingsomething.
The GPU is a GT 520M.

And like I said, it's running hotter than it has in the past, but alas, my compressed air and tools for cleaning these things are with the rest of my stuff in a box 70 km away.

Thanks for taking the time to reply.
Useless wet fish.
pshosh
Profile Joined December 2012
United States3 Posts
December 11 2012 00:15 GMT
#4865
Laptop question for y'all: best option for a laptop under $500? I only want to play 1v1s on low settings @ 1366x768 and be sure that late-game battles stay smooth, above 30 fps. Candidates below.


A8-4500M / 7640G - $380
http://www.notebookcheck.net/AMD-Radeon-HD-7640G.69836.0.html

A6-4400M / 7520G - $380 (this combination curiously gets better results on Notebook Check, any idea why?)
http://www.notebookcheck.net/AMD-Radeon-HD-7520G.71728.0.html

i5-3210M / HD 4000 - $430
http://www.notebookcheck.net/Intel-HD-Graphics-4000.69168.0.html


I would have guessed the i5 and HD 4000 combination to be best, but I'm a little confused as to why the A6 is performing as well as, or even slightly better than, the A8 processor. The previous-generation A8 is even weirder, as an A8-3520M 1.6 GHz / Radeon HD 6620G combo seems just as good as current-generation AMD offerings. Thoughts?
Myrmidon
Profile Blog Joined December 2004
United States9452 Posts
December 11 2012 00:59 GMT
#4866
On December 11 2012 09:15 pshosh wrote:
Laptop question for ya'll - best option for a laptop under $500? I only want to play 1v1s, low settings @ 1366 x 768 and be sure that late game battles stay smooth, above 30fps. Candidates below.


A8-4500M / 7640G - $380
http://www.notebookcheck.net/AMD-Radeon-HD-7640G.69836.0.html

A6-4400M / 7520G - $380 (this combination curiously gets better results on Notebook Check, any idea why?)
http://www.notebookcheck.net/AMD-Radeon-HD-7520G.71728.0.html

i5-3210M / HD 4000 - $430
http://www.notebookcheck.net/Intel-HD-Graphics-4000.69168.0.html


I would have guessed the i5 and HD 4000 combination to be best, but I'm a little confused as to why the A6 is performing as well, even slightly better, than the A8 processor. The previous generation A8 is even weirder, as a A8-3520M 1.6GHz
Radeon HD 6620G combo seems just as good as current generation AMD offerings. Thoughts?

First of all, notebookcheck results are kind of spotty and inconsistent, so read between the lines.

But I don't see the A6-4400M / 7520G doing better. Where? You linked the integrated graphics pages. A6-4400M is a dual core (single module) at relatively high clock speeds, whereas the A8-4500M is a quad core (two modules) at lower clock speeds. For CPU performance, if a task doesn't make use of the extra cores, having the higher clock speed is an advantage.

Last generation AMD Llano can be better than Trinity because Trinity uses updated Bulldozer (Piledriver) CPU cores, while Llano uses the old K10 cores from desktop Phenom II / Athlon II, which are generally better per clock because current iterations of Bulldozer are mostly a miserable failure. Depends on the workload and clock speeds as to which would be better.

For low settings, take the Ivy Bridge i5.
Womwomwom
Profile Blog Joined September 2009
5930 Posts
Last Edited: 2012-12-11 01:04:05
December 11 2012 01:00 GMT
#4867
Trinity should be better; check actual laptop reviews to verify. NotebookCheck's benchmarks often don't seem very meaningful. What counts as low, medium, high? What did they do to test it?

The i5/HD 4000 could lose because Trinity has a much better iGPU. That's about it.
Myrmidon
Profile Blog Joined December 2004
United States9452 Posts
December 11 2012 01:11 GMT
#4868
1v1 low @ 1366 x 768 should be no problem for HD 4000 running ~1100 MHz. It's more important for the CPU portion not to suck when we're talking late game minimum fps. For other games, Trinity is often a lot better.
pshosh
Profile Joined December 2012
United States3 Posts
December 11 2012 01:14 GMT
#4869

But I don't see the A6-4400M / 7520G doing better. Where? You linked the integrated graphics pages. A6-4400M is a dual core (single module) at relatively high clock speeds, whereas the A8-4500M is a quad core (two modules) at lower clock speeds. For CPU performance, if a task doesn't make use of the extra cores, having the higher clock speed is an advantage.


I thought the SC2 benches were halfway down the page at that link. I see the 6620G and 7520G getting ~30 fps on medium detail, while the 7640G gets about 23 fps under similar conditions.


http://www.notebookcheck.net/AMD-Radeon-HD-6620G.54675.0.html

Or, for instance, a comparison of HD 4000 graphics and 6620G http://www.anandtech.com/show/5772/mobile-ivy-bridge-and-asus-n56vm-preview/6

I suppose the best way would be direct comparisons of the models in question, but that kind of detail is hard to come by :/



Thanks for the comments, especially on clock speed. It was something I expected, but I'm very unfamiliar with AMD's processor lines.
Asthenic
Profile Joined March 2012
United Kingdom45 Posts
Last Edited: 2012-12-11 15:17:04
December 11 2012 01:26 GMT
#4870
Hey guys, when watching a replay it stutters every 10 in-game seconds: at 10, 20, 30, and so on, for maybe half a second each time. I'm running on a laptop, so it shouldn't be the SSD problem I've read about. Normally in-game it runs at about 40-60 fps, but in replays I have this perpetual stutter. Is this a bug? Is it fixable? Thanks

EDIT: I've tried partitioning my hard drive and running the temp files on a separate partition, which hasn't helped. I also tried that with a USB drive and it made it about 20 times worse.
For my latest casts and other content : http://www.youtube.com/user/AsthenicSC2. or https://twitter.com/AsthenicSC2
Rollin
Profile Joined March 2011
Australia1552 Posts
Last Edited: 2012-12-11 03:34:26
December 11 2012 02:46 GMT
#4871
On December 11 2012 06:34 Belial88 wrote:
On December 10 2012 16:25 QoGxDyNaSTy wrote:
i run in windowed fullscreen and have dual monitors. would running ultra and having 2 monitors effect my VRAM?


Yes. The higher the graphics settings, specifically your texture settings, and the higher your resolution (720 vs 1080+), will dramatically affect VRAM, as will having 2 monitors.

On 1080+ Ultra, I imagine that 1GB would be enough, but you'd be pushing it, and with 2 monitors I'm not so sure if 1GB would be enough (it might be, but you'd definitely push past 800mb VRAM). I mean you could be comfortable, but I think you'd surpass 1GB on 2 monitors.

Here's what you should do - download something called HWINFO. Just run it. See if your GPU VRAM usage ever goes your limit. Your definitely pushing it on 1080+, Ultra, 2 monitors.

Secondary monitor is a pittance:
1600x1200 requires ~15 MB for double-buffered output (1600 x 1200 x 4 bytes (32-bit RGBA) x 2 for front and back buffers), or ~7.5 MB for one screenful.

1920x1080 requires only 16200 KB (~16 MB): 1920 x 1080 x 4 x 2, for two screenfuls.
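That framebuffer arithmetic can be checked in a couple of lines (4 bytes per pixel for 32-bit RGBA, two buffers for double buffering):

```python
BYTES_PER_PIXEL = 4  # 32-bit RGBA

def framebuffer_bytes(width, height, buffers=2):
    """Bytes of VRAM for `buffers` screenfuls at the given resolution."""
    return width * height * BYTES_PER_PIXEL * buffers

print(framebuffer_bytes(1920, 1080) / 1024)       # 16200.0 KB, double-buffered
print(framebuffer_bytes(1600, 1200) / 1024 ** 2)  # ~14.6 MB, i.e. the "~15 MB" figure
```

Either way, tens of megabytes is noise next to the hundreds of megabytes of textures the game itself keeps resident.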


Belial88 wrote:
BF3 and other first person shooters don't use much VRAM because your on-screen vision is limited by hallways, buildings, et cetera - what's in front of you. In FPS, your view of the map is limited so VRAM usage is actually quite low.

My 560 Ti with 1 GB of VRAM uses all of it during BF3 with ultra textures, ultra mesh distance, and 16xAF at 1080p (no AA), whilst maintaining about 70-80 fps outside. I have checked before with MSI Afterburner; it chews VRAM on ultra textures.

Sc2 does not as I recall, I'll play some games windowed and check.

EDIT: ya it uses like 700-750 mb on ultra for me.
Belial88 wrote:
http://www.anandtech.com/bench/Product/203?vs=701

Non-OC Phenom x6 vs IB-i5 (lets assume ib-i5 is an oc sb-i5. Factor in a 25% overclock, and the Phenom's h264 performance is going to surpass the i5).

No it won't (by a non-trivial amount, anyway). They'll still be neck and neck if you're using the same cooler, since you'll get at least 0.3-0.4 GHz less out of the x6 than out of a 2500K. Correct me if I'm wrong, but I don't see x6 systems on a 212+ above 4.0-4.2 GHz at all, and I see 2500Ks at about 4.6 maximum usually. Hell, my 2500K @ 4.4 with a 212+ never breaks 65°C with P95/Intel Burn Test/whatever, and I have ambients above 30°C inside (no A/C in the Australian summer, yay). At stock they're pretty close: the x6 has a small lead over SB and a small loss to IB. Given SB's small advantage in overclocking, they're going to be close enough not to matter.

Belial88 wrote:
I mean it's all very arguably and close, the i5 is way better than the Phenom x6, and when it comes to streaming, an OC i5 beats a Phenom x6. But an OC Phenom x6 will beat a stock i5 when it comes to streaming. It'll lose in everything else, but when it comes to streaming, it does very well.

Yeah, but comparing an OC'd x6 to a non-OC'd i5 isn't very realistic, as someone who wants to overclock a Phenom x6 is going to want to overclock an i5 too. But it is way cheaper, and pretty much on par with the i5 stock for stock, or OC for OC.

Belial88 wrote:
You should always buy intel, and I've made points before about Phenom for budget overclock+streaming instead of intel at sub-$100 level, and that the i5 is superior when you can afford it, but my point here is simply that the phenom x6 will stream quite well.

Yeah, it will stream well, but you said "better" before, when they'll be pretty much the same (they're both more than fast enough to encode), or possibly the i5's ability to hold a higher frame rate will be nicer for a StarCraft 2 stream (that doesn't hold for other games, but this is an SC2 website).
Throw off those chains of reason, and your prison disappears. | Check your posting frequency timeline: http://www.teamliquid.net/mytlnet/post_activity_img.php
matiK23
Profile Joined May 2011
United States963 Posts
December 11 2012 21:28 GMT
#4872
I'm sorry in advance since this isn't tech related, but I recently bought a 7870 and they gave me a voucher for a free game. The dilemma is Far Cry 3 or Hitman: Absolution: which one? I like both, but I want to know your opinions.
Without a paddle up shit creek.
PigAntlers
Profile Joined February 2011
Canada32 Posts
Last Edited: 2012-12-12 06:40:01
December 12 2012 04:56 GMT
#4873
Hello everyone, here's my question.

So my buddy currently has a Pentium dual-core E6500 CPU, a GeForce 7950 GX2 GPU, and a Saga+ 400R PSU (I believe that's the PSU). What he wants to do is plug in my old Radeon 4870 instead of his 7950 GX2. What we noticed is that his PSU doesn't have the two 6-pin connectors the Radeon 4870 requires; it only has one. So is it OK to just get a 2x 4-pin Molex to 6-pin adapter and plug that bad boy in? Also, the Radeon 4870 appears to have higher power consumption than the 7950 GX2, so would this power supply even be powerful enough for such a card?

TL;DR: Can I use a 2x 4-pin Molex to 6-pin adapter to add the missing 6-pin connector the Radeon 4870 requires, and can I expect power issues with this PSU being only 400 W?

If any more info is required for more accurate answers please let me know.

Thanks in advance everyone!

Edit: Thanks for the reply, Myrmidon! I'll let him know he should upgrade his PSU to avoid any hiccups. (Turns out his bro had a spare one for him.)
Myrmidon
Profile Blog Joined December 2004
United States9452 Posts
December 12 2012 05:13 GMT
#4874
That's what Molex adapters are for. It's just +12V and ground wires; the electricity doesn't care which cables you use (though if the wires were very thin, which they don't seem to be, you could have issues under heavy load, with higher currents causing nontrivial voltage drops across the wires and too much waste heat).

FSP Saga+ is really kind of outdated and cheap (Saga II not that much better), but it's not a ticking time bomb liability. I think a similar design is used for some of the really old low-end Cooler Master Elite Power type of stuff, at least the ones that FSP makes for them.

Note that it only claims 276W on +12V, 350W in all. If it actually has OCP on the claimed 10A and 13A +12V rails, if the limits are set tight, and if molex and PCIe +12V power are on the same rail (this last part is likely), then you could easily overload that rail by running a HD 4870, in which case it should just shut off. Probably it doesn't actually have OCP, or at least the limit set around 13A.

276W on +12V is really not that much for running a HD 4870, but it should be okay.
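A rough +12V budget for that setup, using the label figures quoted above. The HD 4870's board power is an assumption (AMD specced it at roughly 150 W under load), and the E6500's 65 W TDP is drawn largely from +12V on modern boards:

```python
RAIL_12V_W = 276   # PSU label: 276 W combined on the +12V rails
# Sanity check: the claimed 10 A + 13 A rails at 12 V match the label figure.
assert 12 * (10 + 13) == RAIL_12V_W

HD4870_W = 150     # assumed typical HD 4870 board power under gaming load
CPU_W = 65         # Pentium E6500 TDP

headroom = RAIL_12V_W - (HD4870_W + CPU_W)
print(headroom)    # 61 W left on +12V for fans, drives, and conversion losses
```

So it fits on paper, but with little margin on an aging unit, which is why a better 400 W supply is the safer call.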

All in all, the power supply is not good, and I wouldn't recommend that setup, but it should work most likely. A good 400W power supply would handle that easily. I'm just a little bit hesitant on an older unit of mediocre build, that has a frankly inflated rating.
Belial88
Profile Blog Joined November 2010
United States5217 Posts
December 12 2012 07:00 GMT
#4875
My 560ti with 1GB of vram uses all of it during BF3 with ultra textures, ultra mesh distance, 16xAF at 1080p (no AA) whilst maintaining about 70-80 fps outside. I have checked before with msi afterburner, it chews vram on ultra textures.

Sc2 does not as I recall, I'll play some games windowed and check.

EDIT: ya it uses like 700-750 mb on ultra for me.


I'd expect so with 16xAF at 1080p. I meant that BF3 and other FPS games use relatively little VRAM for their fidelity, not that they use a low amount outright. Plenty of people play BF3 on 1.5 GB+ VRAM cards for a reason.

No it wont (by a non-trivial amount anyway), they'll be neck and neck still if you're using the same cooler, as you'll get like .3-.4 ghz less on the x6 at least than the 2500k. Correct me if I'm wrong, but I don't see x6 systems on a 212+ above 4.0-4.2 ghz at all, and I see 2500ks at 4.6ish maximum usually. Hell my 2500k @4.4 with a 212+ doesn't break 65C ever with p95/intel burn test/whatever, and I have ambients above 30c inside (no A/C in the australian summer, yay). At stock they're pretty close, the x6 has a small lead over sb, small loss to ib, given the small advantage in overclocking (sb), they're going to be close enough not to matter.


Well, it will pass it by a trivial amount. I was not implying the x6 would blow the i5 out of the water at streaming, like the x4 does to the i3. x6's can hit 4 GHz with a 212+, sure.

The x6 is a terrible deal and an i5 is a million times better; my point to the guy who already made the mistake of getting an x6 was just that it's still a strong-performing chip and will stream like a boss. It's just not at all cost-effective for its performance.

Yeah, but comparing a x6 oced to a non-oced i5 isn't very realistic, as someone that wants to overclock a phenom x6 is going to want to overclock an i5. But it is way cheaper, and pretty much on par with the i5 stock for stock, or oc for oc.


I know, I mentioned that. I was simply saying the x6 is a good chip and will stream HD easily. It's not a good chip for the price, and it's stupid to buy one, but if you already have one, you shouldn't upgrade if you're just trying to stream and/or play games. In gaming, and particularly for streaming, the chip will perform very well.

OC vs. OC, the x6 gets destroyed by the i5. I don't see where you see that it's way cheaper; the Phenom x6 seems equal in price to, or more expensive than, an i5 from what I see.


How to build a $500 i7-3770K Ultimate Computer: http://www.teamliquid.net/blogs/viewblog.php?topic_id=392709 ******** 100% Safe Razorless Delid Method! http://www.overclock.net/t/1376206/how-to-delid-your-ivy-bridge-cpu-with-out-a-razor-blade/0_100
Rollin
Profile Joined March 2011
Australia1552 Posts
December 12 2012 07:23 GMT
#4876
On December 12 2012 16:00 Belial88 wrote:I don't see where you see it's way cheaper, the phenom x6 seems to be equal in price or more expensive than i5 from what I see.

My bad, I didn't realise that the 1090T/1100T were the only Black Edition X6 processors; the other X6s are significantly cheaper than those two. ^.^
Throw off those chains of reason, and your prison disappears. | Check your posting frequency timeline: http://www.teamliquid.net/mytlnet/post_activity_img.php
Cutlery
Profile Joined December 2010
Norway565 Posts
December 13 2012 12:55 GMT
#4877
I have a question. I'm going to the US for 6 months to study. I have an unlocked Android phone at home that I want to take with me and use in the US (California). But so far I have only found or heard about plans that INCLUDE a phone, and no option to simply get a plan with a SIM card and phone number WITHOUT a phone.

Where do I look?
Maluk
Profile Joined August 2011
France987 Posts
Last Edited: 2012-12-14 00:15:22
December 14 2012 00:04 GMT
#4878
Nevermind, solved my own problem.
LukasG
Profile Joined July 2011
Germany95 Posts
December 14 2012 17:27 GMT
#4879
Is it possible that Windows 8 reduces FPS in StarCraft 2?
Grobyc
Profile Blog Joined June 2008
Canada18410 Posts
December 14 2012 20:39 GMT
#4880
Question a friend is asking me:
What's better for a gaming laptop?
An i7-3632QM processor @ 2.2 GHz & a 7670M
or
An A10 processor @ 2.3 GHz & a 7730M

AFAIK the i7 is definitely better, but I'm not sure about the 7670M vs. the 7730M. I don't particularly care about helping this friend find a better deal or anything (I don't even know the prices of these); I just want to help answer this question.
If you watch Godzilla backwards it's about a benevolent lizard who helps rebuild a city and then moonwalks into the ocean.