Computer Build, Upgrade & Buying Resource Thread - Page 96

Forum Index > Tech Support
When using this resource, please read the opening post. The Tech Support forum regulars have helped create countless desktop systems without any compensation. The least you can do is provide all of the information required for them to help you properly.
IMKR
Profile Joined August 2012
United States378 Posts
November 09 2013 11:48 GMT
#1901
What exactly is AMD's Mantle?
Can someone explain it in simple terms?

When I look up what Mantle is, more terms pop up that I have to dig into, and the more I dig, the more digging I end up having to do (it's a never-ending cycle).
Incognoto
Profile Blog Joined May 2010
France10239 Posts
Last Edited: 2013-11-09 11:56:47
November 09 2013 11:55 GMT
#1902
On November 09 2013 20:48 IMKR wrote:

I won't claim to have a precise definition of Mantle, but I understand the rough concept. Basically, when you play games today, your game engine sends information to your GPU so that the GPU can display what you see on your screen. For the GPU and the game engine to communicate, they need a link. That link is DirectX.

So basically it goes like this:

Game engine -> DirectX -> GPU -> Monitor

The way Mantle works is that it eliminates the DirectX step. This increases performance fairly significantly since we lose a step (less work to compute). So with Mantle it would be:

Game engine w/ Mantle -> GPU -> Monitor


There is a drawback, however: the game engine itself has to be coded with Mantle in mind, not DirectX like most games are. This means that developers, if they opt for Mantle, will be forced to code a game for both Mantle and DirectX (since many people own Nvidia cards). That's a huge investment of time and money for a gaming company. It already costs quite a bit to release games that work on both PCs and consoles, and Mantle would add to that. Nonetheless, the benefits of bypassing DirectX and working with the GPU directly are definitely real.

NB: this is my understanding of Mantle, so I may have left out some key points. Also noteworthy: Nvidia won't have Mantle, meaning that if Mantle works out and game developers start coding games for it, we have a pretty big win for AMD.
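The pipeline above can be sketched as a toy cost model (my own illustration; the function names and cost numbers are invented, not anything from AMD's or Microsoft's actual APIs):

```python
# Conceptual toy, not real DirectX or Mantle code: model the per-draw-call
# cost of going through a generic translation layer versus talking to the
# driver more directly. All cost numbers are made-up illustrative units.

def submit_via_directx(draw_calls, translate_cost=3, driver_cost=1):
    """Each call pays for API translation/validation plus driver work."""
    return draw_calls * (translate_cost + driver_cost)

def submit_via_mantle(draw_calls, driver_cost=1):
    """A thin API skips the translation step; only driver work remains."""
    return draw_calls * driver_cost

n = 10_000  # draw calls in one frame
dx_cost = submit_via_directx(n)
mantle_cost = submit_via_mantle(n)
print(f"DirectX path: {dx_cost} units, Mantle path: {mantle_cost} units")
```

The point of the sketch is only that the saving is per draw call, which is why CPU-limited games with many draw calls stood to gain the most.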
maru lover forever
caedmon-
Profile Joined April 2009
Australia64 Posts
November 09 2013 12:01 GMT
#1903
On November 09 2013 20:24 Incognoto wrote:
Look into a 280X, might be cheaper and it performs a bit better than a 7970 (it's a rebranded 7970). Good brands to look into are Gigabyte, ASUS and MSI, perhaps also Sapphire. Those brands have great heatsinks and you can overclock them.

You might be able to find a cheaper motherboard if you're not going to overclock. No reason to get an H87 board when B85 or H81 chipsets are cheaper and will work fine for a gaming rig.

Or you can dish out some more cash and get a 4670k and a Z87 motherboard, with OC capability.


Thanks for the reply!

The cheapest 280X is $350, $40 more than the 7970. Would the performance increase be worth the difference? It'd be the same brand and same warranty.

Would something like this for the motherboard be sufficient? http://www.pccasegear.com/index.php?main_page=product_info&products_id=24853&cPath=1491
There's an even cheaper one but it only has 2 USB ports.
Incognoto
Profile Blog Joined May 2010
France10239 Posts
Last Edited: 2013-11-09 12:44:32
November 09 2013 12:40 GMT
#1904
For $40 more you do get a faster card. Not taking OC into account, the PowerColor 7970 you're looking at runs at 955/1375 compared to the 280X's 1000/1500. There's your $40 difference. I would probably pay the extra $40, though it's up to you. Here's food for thought:

http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/12

^ What you'd be comparing is the XFX 280X (in red) to AMD's vanilla 7970 (in black). If you feel the edge the 280X has over the 7970 is worth $40, you have your answer. XFX and PowerColor aren't the same brand, but they're in the same "tier", so to speak: both use reference PCBs with their own non-reference coolers on top. Bigger brands have custom PCBs, and coolers to match, that allow for extra performance (through better overclocking capability), at a price.
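For reference, the uplift implied by those factory clocks works out as follows (simple percentage arithmetic; clock speed alone doesn't translate 1:1 into frame rate, so treat these as upper bounds):

```python
# Percentage uplift of the stock 280X clocks (1000/1500 MHz core/mem)
# over this PowerColor 7970 (955/1375 MHz).

def pct_uplift(new, old):
    return 100.0 * (new - old) / old

core_gain = pct_uplift(1000, 955)    # core clock uplift
mem_gain = pct_uplift(1500, 1375)    # memory clock uplift
print(f"core: +{core_gain:.1f}%, memory: +{mem_gain:.1f}%")
```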


The cheaper motherboard you mentioned shouldn't have only 2 USB ports; that sounds weird. I checked: http://www.asus.com/Motherboards/H81ME/#overview

It seems to have 4 on the back panel. There are only 2 USB 3.0 ports, however.

Edit: I could be wrong though. That's weird as fuck.

Edit 2: OK it says

ASUS H81M-E Motherboard, LGA1150, Intel H81 Chipset, 2x 1600MHz DDR3, PCI-E x16, 2x SATA3, 2x USB 3.0, Gigabit LAN, DVI, D-Sub, Micro ATX form factor. Backed by a 3 year ASUS warranty.


Which I believe means it does indeed have only 2 USB 3.0 ports, plus some number of 2.0 ports. Just 2 USB ports total on one motherboard wouldn't make any sense at all. lol ^^
maru lover forever
caedmon-
Profile Joined April 2009
Australia64 Posts
November 09 2013 13:14 GMT
#1905
On November 09 2013 21:40 Incognoto wrote:

Ah, so it's not an insignificant difference. I'd be saving $40 with the change in motherboard anyway, so I may as well go for it. Will two USB 3.0 ports be sufficient? Google tells me it doesn't make a difference for keyboards/mice, and I don't see myself regularly using external hard drives.

Cyro
Profile Blog Joined June 2011
United Kingdom20324 Posts
November 09 2013 16:59 GMT
#1906
Not worth the $40 imo

a 280X is a 7970 is a whatever, unless you care about the different coolers, voltage-unlocked versions, OC potential, etc.
"oh my god my overclock... I got a single WHEA error on the 23rd hour, 9 minutes" -Belial88
SnowSC2
Profile Blog Joined September 2010
United States678 Posts
November 09 2013 18:29 GMT
#1907
Quick question.

My friend just recently put a 770 into his system. It has 8 GB of RAM. I'm not sure what CPU he has, but it's an AMD at 3.6 GHz.

His PSU is some 600 W thing from a brand I've never heard of. The GPU requires both an 8-pin and a 6-pin PCIe connector. His PSU only had two 6-pins free, so he used a 6-pin-to-8-pin adaptor on one to connect the GPU. He says he can run BF4 on medium settings fine, but when he puts it up to ultra it starts lagging like hell.

I believe he has some Molex connectors free. Could the 6-to-8-pin adaptor be insufficient and cause the card to underperform? And if so, would a 2x-Molex-to-8-pin adaptor work just fine?
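For context on the connector side of this question: the nominal PCIe limits are 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin, and an adaptor only reshapes the plug rather than raising its wattage. A rough budget sketch (my own illustration; the ~230 W GTX 770 TDP is an assumption, not from the thread):

```python
# Nominal PCIe power limits per the spec; an adaptor only changes the
# plug shape, so a 6-pin feeding an 8-pin socket still budgets ~75 W.

PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150
GTX_770_TDP_W = 230  # approximate reference TDP (assumption)

def card_power_budget(connector_watts):
    """Nominal power available: slot plus each listed connector."""
    return PCIE_SLOT_W + sum(connector_watts)

native = card_power_budget([SIX_PIN_W, EIGHT_PIN_W])   # as the card expects
adapted = card_power_budget([SIX_PIN_W, SIX_PIN_W])    # 6-pin + adapted 6-pin
print(f"native: {native} W, adapted: {adapted} W, TDP: {GTX_770_TDP_W} W")
```

On these nominal numbers the adapted setup budgets 225 W against a roughly 230 W TDP, so it is tight on paper, though decent PSUs and adaptors usually deliver more than the nominal figure in practice.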
IdiotSavant
Profile Joined April 2011
United States88 Posts
November 09 2013 19:39 GMT
#1908
On November 09 2013 14:58 Cyro wrote:
Nvidia has ShadowPlay (it's semi-functional right now and may not be fully released for a while, as the beta was due to start in June, was delayed by 4 months, and still has little function), and AMD has Mantle.

It depends what you want. G-Sync looks really awesome, especially if they release some high-refresh-rate monitors with it, but it may cost a decent premium and will definitely limit monitor choice; it's also impossible to truly judge unless you have the two monitors physically in front of you. ShadowPlay also seems really good for recording games (there's nothing to replace G-Sync and ShadowPlay, they're kind of unique in function, while Mantle's major benefit is higher GPU performance, which you could always throw money at if you really cared; that's my view at least).

I got a 770, but it's just so hard to recommend at a 26% higher price; the 7970 has 3GB VRAM too.



Well, I have no plans to stream (only watch), and no plans to use 2 monitors... but if I do, a Radeon should be fine with that?

Does anyone have recommendations on which manufacturer to get? I have no experience with Gigabyte, but that's my current choice only because that's what was suggested for a good motherboard. I always hear ASUS is good but runs hot and loud, which is solved by having more fans, but I'll also be working in a windowless room with, I'd say, less-than-perfect airflow.
iTzSnypah
Profile Blog Joined February 2011
United States1738 Posts
November 09 2013 20:22 GMT
#1909
On November 10 2013 03:29 SnowSC2 wrote:

BF4 uses a ton of VRAM: more than the 2GB the 770 has once you crank up textures and AA.
Team Liquid needs more Terrans.
skyR
Profile Joined July 2009
Canada13817 Posts
November 09 2013 22:01 GMT
#1910
G-Sync has nothing to do with multiple displays. Cyro is saying it's impossible to judge how good it is without comparing a monitor with G-Sync and a monitor without, side by side. Both the GTX 770 and R9 280X support three monitors via DVI, DVI, and HDMI.

I'm not sure whether we're talking about the R9 280X or the GTX 770 anymore. Most manufacturers are fine; I guess it depends on what you're looking for.

The big three (ASUS, MSI, and Gigabyte) all offer a three-year warranty, as opposed to the two years offered by AMD's partners (Sapphire, HIS). On Nvidia's side, EVGA offers the same three-year warranty as the big three, and their post-sale support is generally considered the best in the GPU industry (option to purchase an extended warranty, step-up program, advance RMA, forum community, etc).

The noise and temperature differences between all of them should be similar under a gaming load and probably won't matter much at all.

For the ~$300 cards, the ASUS R9 280X probably has the best PCB and heatsink design. MSI has the shortest PCB of all the R9 280Xs, if that matters to you, but the rest of it is still reference design afaik. All the cards near ~$300 are voltage locked except the HIS IceQ X2, apparently. The Sapphire Toxic is clocked the highest but is also ~$40 more expensive.
Ropid
Profile Joined March 2009
Germany3557 Posts
November 09 2013 22:29 GMT
#1911
I think EVGA's warranty doesn't expire if you fiddle with their card and install an aftermarket air cooler or waterblock. They're perhaps the only ones that allow that?
"My goal is to replace my soul with coffee and become immortal."
Incognoto
Profile Blog Joined May 2010
France10239 Posts
November 09 2013 22:31 GMT
#1912
Is it worth spending more for a voltage unlock? OC is definitely a boost in performance, but how much would one pay for such a boost on a card at the 280X's level?
maru lover forever
Ropid
Profile Joined March 2009
Germany3557 Posts
Last Edited: 2013-11-09 22:36:03
November 09 2013 22:34 GMT
#1913
You first have to find out whether you can actually do that. Do the card's cooler and your PC case still keep everything cool enough if you increase the voltage? That did not work for me; I had to mod the card and replace its cooler. Once you start doing that, you can also mod the graphics card's BIOS (though I don't know if there's a hack available for every card).

EDIT: I don't have a 280X, was just talking in general. I have a GTX 560 Ti.
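A rule of thumb behind the cooling warning above (my own addition, not from the thread): dynamic power scales roughly with voltage squared times clock, so a small voltage bump raises heat output much faster than it raises performance.

```python
# P ~ C * V^2 * f for switching logic: relative power draw of an
# overclocked state versus stock. Numbers are illustrative, not measured.

def relative_power(v_new, v_old, f_new, f_old):
    """Overclocked power draw relative to stock (1.0 = stock)."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Example: +10% voltage to stabilize a +10% core clock.
ratio = relative_power(1.10, 1.00, 1100, 1000)
print(f"about +{100 * (ratio - 1):.0f}% power for +10% clock")
```

That extra ~33% of heat is exactly what the stock cooler and case airflow may not be sized for.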
"My goal is to replace my soul with coffee and become immortal."
Incognoto
Profile Blog Joined May 2010
France10239 Posts
November 09 2013 22:45 GMT
#1914
Well, yeah, we hypothetically assume that our case can keep everything cool enough and that the card's cooler is also up to it. The voltage is unlocked. Such a card is going to be more expensive than a reference PCB with an okay-but-not-great cooler.

I'm just wondering out loud how much extra it's worth paying to get such a card.
maru lover forever
IdiotSavant
Profile Joined April 2011
United States88 Posts
November 09 2013 23:22 GMT
#1915
On November 10 2013 07:01 skyR wrote:


Well, I was talking about the R9 280X, since you guys tell me it will be cheaper and better for me. The 770 is similar but offers things that I will probably never use (unless I'm mistaken about what you're saying).

Short or long PCB doesn't mean anything to me; I'll have to google it and read up on it.

There's a low chance of me OCing anything for now, so having the option to do it is only a bonus, definitely not a requirement.
IMKR
Profile Joined August 2012
United States378 Posts
November 09 2013 23:24 GMT
#1916
On November 09 2013 20:55 Incognoto wrote:

So if Mantle does catch on, does that mean games won't run on Nvidia cards, since it skips the DirectX part? Also, what if Mantle doesn't work out? Can cards that run Mantle still run DirectX, or is it one or the other?
skyR
Profile Joined July 2009
Canada13817 Posts
November 09 2013 23:28 GMT
#1917
On November 10 2013 07:45 Incognoto wrote:

All the R9 280Xs are using custom PCBs afaik. ASUS, Gigabyte, and the Sapphire Toxic use an 8+2 power phase design, while HIS and XFX use 6+2. Only MSI uses the reference 5+2 design, but its PCB is still custom. ASUS is apparently already running at a higher voltage, so it being voltage locked doesn't necessarily matter? There are a lot of mixed reports on whether a given card is voltage locked or unlocked, lol, but I think the general consensus is that the HIS, XFX, and ASUS Matrix are unlocked.
Ropid
Profile Joined March 2009
Germany3557 Posts
November 09 2013 23:41 GMT
#1918
On November 10 2013 08:24 IMKR wrote:

There's OpenGL and Direct3D at the moment. AMD and NVIDIA provide support for both in their drivers.

Direct3D is only on Windows (and something related and stripped-down on the consoles?). OpenGL is available everywhere: Windows, Apple's OS X, Linux, a stripped version on smartphones, etc.

The consoles have also always had something special, perhaps similar to Mantle. That's what people meant when they said games on a console can get more out of the hardware than on a PC.

So game programmers already work with and support a lot of different stuff to get their game engine running on all the different systems they might want to sell their games on.

Regarding Mantle and NVIDIA: in OpenGL, the GPU manufacturers are allowed to provide special extensions that are unique to their GPUs. NVIDIA already did something with that a few years ago to get a performance increase similar to what AMD is trying to do with Mantle. That extension did not do much for performance compared to normal OpenGL and Direct3D, and it was not used at all by games because it was exclusive to NVIDIA. The same might happen to Mantle: it might not do enough to be worth the work, and it might not get used because it's exclusive to AMD.
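The extension mechanism described above works roughly like this in practice: the driver reports a space-separated extension string, and an engine probes it before enabling a vendor fast path. A minimal sketch (the reported string is a hypothetical example, not real driver output):

```python
# Probe an OpenGL-style extension string for a vendor-specific entry.
# An engine falls back to the portable path when the extension is absent.

def has_extension(ext_string, name):
    """Exact-match lookup in a space-separated extension list."""
    return name in ext_string.split()

# Hypothetical driver report; GL_NV_command_list is a real NVIDIA
# extension name, used here purely as an example of a vendor entry.
reported = "GL_ARB_vertex_buffer_object GL_NV_command_list GL_EXT_texture"

if has_extension(reported, "GL_NV_command_list"):
    renderer = "vendor fast path"
else:
    renderer = "portable OpenGL path"
print(renderer)
```

Splitting on whitespace (rather than a substring search) matters, since one extension name can be a prefix of another.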
"My goal is to replace my soul with coffee and become immortal."
iTzSnypah
Profile Blog Joined February 2011
United States1738 Posts
November 10 2013 00:53 GMT
#1919
On November 10 2013 08:28 skyR wrote:

In theory the Gigabyte card can clock the highest. Gigabyte uses 60A IR3553B power stages on all its GPUs (well, I don't know about the entry-level cards), which are slightly more efficient than what the other boards use. So for a GPU like the 780 Ti with its 265W board limit (106% limit), even though all the cards can pull 265W, the Gigabyte card has slightly more usable power. In practice it doesn't really help clock-wise, though; it just means slightly lower power consumption.

Then there are the Classified cards from EVGA. They are completely unlocked once you flip a switch on the PCB.
Team Liquid needs more Terrans.
CrankOut
Profile Joined November 2013
187 Posts
November 10 2013 01:35 GMT
#1920
On November 10 2013 05:22 iTzSnypah wrote:


I play with an HD 6870 on Medium. If he can only play BF4 on Medium with a 770, his CPU is bottlenecking.