Anyway, Bulldozer is an epic failure: it's slower than even Thuban and Phenom in many cases, sucks for gaming, and uses twice the power under load of a comparable Intel rig.
It is mind-boggling that AMD worked on this processor for over six years and it sucks worse than its predecessor architecture. WHY AMD WHY!!!?? I am stunned and dumbfounded.
Sleepy right now, but I'll be reading this as soon as I wake up. I love this tech stuff so much, lol, and I hope it's not as bad as you make it sound. Thanks for bringing this to my attention. Funny though, I'm still using a Core 2 Duo from yesteryear and it's still working great. Anyway, thanks again, sir!
I skimmed through a couple of tests and it looks pretty bad.
I have owned Intels and AMDs, because I always simply buy whatever is the most bang for the bucks I'm willing to spend (and never understood Intel/AMD/nVIDIA/ATI or any corporate fandom). Currently that's a Phenom II 955, which I bought before Sandy Bridge's release. AMD falling further behind is really bad for the consumer, even if you never intend to buy an AMD CPU.
The future isn't looking bright, and when you compare Intel's budget with AMD's, it's not likely to get better any time soon. That's not even counting Intel's business practices.
There were already a bunch of leaked benchmarks before the NDA lift, and it wasn't like all of them were conspiring against AMD; the Oct 12 benchmarks only confirmed them. Everything after 2005 seems to have gone downhill for AMD.
It would seem they hope to sell the processors on the "8 cores" bit alone. Screw the performance people actually need. Maybe the lower-end processors will be more viable.
I was looking forward to this, but when I found out that my 790 board wouldn't be compatible my interest dropped off a while ago.
Still, it is a little silly that the 8150 loses to the 1100T on most occasions. I've read some of the new architecture's goals, and a relevant point is that it should be more "scalable" than the last architecture. My (last) hope is that they can somehow pull a Houdini, scale this thing to 16+ cores without much trouble, and shake up the server industry.
I may get an Intel next upgrade, but I'm still an AMD fanboy on the graphics side until they prove me wrong; they seem to have their act together on GPUs.
Haven't read the articles yet, but if what you say is true then I'm disappointed. I had high hopes for Bulldozer.
Edit: finished reading. While BD is quite disappointing, the OP is a little bit extreme. I think it's worth mentioning that in some heavily threaded applications, BD > Sandy Bridge. Unfortunately it's still not enough to make up for its shortcomings.
I generally like AMD products simply because you tend to get more bang for your buck, but meh... I was hoping they wouldn't fail. The fusion APU thing was pretty cool, so I was kinda expecting AMD to be on a roll, but this sucks.
Boggles the mind that they would release an obviously inferior product. Surely they would be aware of its less-than-stellar performance and simply scrap it rather than put it into production?
Sure, many companies put out a "budget" line of products to appeal to the lower price point of the market, but those products are usually last year's "top of the line" model, simply stripped of the fiddly bits to keep cost low.
You can't release a "brand new" product that was intended to be top of the line as a budget product and expect to recoup the development costs.
On October 12 2011 18:52 Playguuu wrote: Well, I used to be an AMD fan before the P4 came out. I wonder how they thought the specs were OK to release the chip.
They didn't, but it took over five years to design, so they had to run with what they had.
Notice it was delayed from Q4 2009 to Q2 2011, and then another four months in 2011 and in that time there were two respins (B1 -> B2 -> B2G) in an attempt to fix it.
Close to 500 watts when OC'd to ~4.7GHz, and it still doesn't come close to the 2500K in single-threaded performance. We're so far away from programs that would use this CPU to its full potential, and by that time Ivy Bridge and beyond will be out for Intel. It would've been easier to just die-shrink a Thuban and slap on two more cores.
I think it has been known since May that AMD had some big problems with single-threaded performance, and overall these procs seem to be a failure for the desktop market, since video games are coded like shit and mostly can't use more than two cores. SB is just a way better architecture here. They are probably still interesting for AMD fanboys who don't play games and only use their comps for video encoding and the like, but that is a niche already covered by the Thuban. Bulldozer looks like AMD's P4: more cores, higher frequencies, but no real increase in performance except in some multithreaded apps.
I don't really know what to say. Maybe it will have more success on the server market, but again, Intel is already so strong there and their procs just have better TDPs. It seems we will have to wait for the next gen (which will be an APU).
edit: also, they are fucking dumb to call that proc FX.
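The "games can't use more than two cores" point is basically Amdahl's law: if a big chunk of each frame is serial, extra cores barely help and per-core speed dominates. A quick sketch (the parallel fraction here is a made-up illustrative number, not a measurement of any real game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Ideal overall speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical game where only half the frame time parallelizes:
for cores in (2, 4, 8):
    print(cores, "cores ->", round(amdahl_speedup(0.5, cores), 2), "x")
# Eight cores buy you well under 2x over one core in this scenario,
# which is why single-threaded performance still decides gaming benchmarks.
```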
On October 12 2011 18:46 nalgene wrote: There were already a bunch of leaked benchmarks before the NDA lift, and it wasn't like all of them were conspiring against AMD; the Oct 12 benchmarks only confirmed them. Everything after 2005 seems to have gone downhill for AMD.
Tons of fakes too ;D
Phenom IIs and Athlon IIs (especially the X3s) were kinda cool pre-SB if you wanted value.
Is it possible that there are compatibility issues that need to be patched with firmware updates? I feel pretty bad for them since they've supported IPL 3
On October 12 2011 19:23 PepperoniPiZZa wrote: Is it possible that there are compatibility issues that need to be patched with firmware updates? I feel pretty bad for them since they've supported IPL 3
Well, it seems to me that the transition to 32nm hasn't been as smooth as they hoped. Anyway, there is still a speck of light in there for us budget computer builders. I think if they can work on the power consumption while getting a 10-15% increase every year, it's going to be OK.
On October 12 2011 19:23 PepperoniPiZZa wrote: Is it possible that there are compatibility issues that need to be patched with firmware updates? I feel pretty bad for them since they've supported IPL 3
There's some fixing that can be done through a BIOS update for the cache thrashing, and Windows 7 isn't optimized for BD's module design. Preliminary testing of the cache-thrashing fixes doesn't bring much more performance, and tests on the Windows 8 dev preview, with its better ability to align threads, brought less than a 1% increase in performance.
Other than that, it seems most likely to be a problem with GloFo and their yields.
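For anyone wondering what "aligning threads" means here: Windows 7 sees BD's eight integer cores as eight equal logical CPUs, so two threads can land on one module (sharing its front end and FPU) while other modules idle; a module-aware scheduler spreads threads across modules first. A toy simulation of the placement difference (the module pairing matches Bulldozer's layout; the two schedulers are deliberately simplified caricatures, not the real Windows algorithms):

```python
# Bulldozer pairs its integer cores into modules: (0,1), (2,3), (4,5), (6,7).
MODULES = [(0, 1), (2, 3), (4, 5), (6, 7)]

def naive_placement(n_threads):
    """Win7-style caricature: fill logical CPUs in order, ignoring modules."""
    return list(range(n_threads))

def module_aware_placement(n_threads):
    """Win8-style caricature: one thread per module first, then double up."""
    order = [m[0] for m in MODULES] + [m[1] for m in MODULES]
    return order[:n_threads]

def modules_shared(placement):
    """Count modules stuck running two threads on shared resources."""
    return sum(1 for a, b in MODULES if a in placement and b in placement)

# With four threads the naive scheduler packs two modules while two sit idle;
# the module-aware one gives each thread its own module:
print(modules_shared(naive_placement(4)))         # -> 2
print(modules_shared(module_aware_placement(4)))  # -> 0
```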
Meh. They'll get over it, and will probably sell quite a lot of these over a few years to unsuspecting customers (NEVER underestimate how many completely clueless people buy ready-built PCs without knowing what's in them). It's just a question of marketing them right, and hoping Intel doesn't pull some shit again to get AMD locked out of that market. Although I doubt Intel feels threatened enough by AMD to go to those lengths.
Other than that, it seems most likely to be a problem with GloFo and their yields.
I don't think so. Clock speeds are quite high already (4.2GHz Turbo).
The problem is that AMD has between 1/10 and 1/8 of the R&D funding and employees of Intel. No one should expect them to be competitive any more.
On paper the BD module design isn't bad. Yes, single-thread performance is going to be worse; that was a conscious design decision, but it shouldn't be as horrible as it is right now. That, along with the massive power usage overclocking brings, could very well be explained by GloFo's poor yields. We already know they were having issues with Llano. All we can do is wait and see if better manufacturing techniques bring better yields with the FX-8170 in Q1, but I'm not holding my breath.
To be fair though, who are they really trying to target by having chips with more cores and less single thread performance?
Gamers and standard users both really need the single thread performance.
The multi-thread performance is good for like servers and video rendering, etc, but why would you cater your entire lineup towards that?
Or are they just aiming towards the future?
To put it bluntly, these processors were not initially meant to compete with SB. The release was delayed by ~2 years, and it's been just about four years since AMD announced BD altogether; I think they believed more programmers would be writing and utilizing multithreaded apps and games by now.
The server market is a lot bigger than you think, but that's a whole different story, and we've yet to see BD server performance. The marketing division shot themselves in the foot with this one, though. They pulled the ultimate InControl with all the hype.
AMD aimed for a future that is coming way too slowly.
I'm guessing they just wanted to be better than Intel at something for PR reasons, rather than have something that doesn't stand out in any way even though it might give better overall performance. That's just my uneducated feeling, though. If Bulldozer can breathe enough fresh air into sales, maybe it buys them enough time to come out with something better.
As I type this, somewhere, someone from Intel is smashing the "CANCEL" button on the next round of price cuts for Sandy Bridge and slamming the brakes on Ivy Bridge development.
Everyone should cry, except for the people working at Intel.
On October 12 2011 20:35 KeksX wrote: Just get the AMD x6/x4 and be fine. I see no problem with the bulldozers being bad. I'll get my AMD x6 1100t anyway.
It's a huge concern when you only have two options on the market. If AMD can't keep up, that will (a) leave you with an inferior product if you choose AMD and (b) let Intel overcharge, impose restrictions, etc. across their lineup, because you have no option other than to pay what they're asking. Intel wins; customers and AMD lose.
The site is completely lacking credibility (I've been following them for a while; they take a scattergun approach to rumours and print everything in the hope that one thing is right), and BAPCo were clearly too close to Intel and skewing the benchmark (very important, because it's used worldwide for government procurement).
If it was just AMD's problem, why did Nvidia and VIA (the third-biggest x86 CPU company) leave too?
Doesn't really bother me; it was going to be terrible anyway. The future of AMD is with Fusion, away from the high end. But Sandy Bridge is really, really kicking ass in everything it goes into.
Anyway, it's not like Intel hasn't already started going back to their old ways: the number of sockets that have come out over the past few years, and are coming out soon, is ridiculous. Great value if you want a new computer now, not so great if you just wanted to upgrade a CPU ):
It seems like it might be a killer server chip, which has actually been AMD's bread and butter since the Opterons came out. That could explain a lot of the issues: AMD might be mostly focused on the server market, with the normal consumer market taking a back seat.
Still, we really need AMD to survive in the consumer market. A monopoly there wouldn't be a good thing.
I knew it. There was the fact that they kept delaying it, and the fact that early leaked benchmarks showed Bulldozer losing to the 2600K in the CPU area and only winning in the graphics area, which isn't even important since 90% of people have a dedicated graphics card anyway!
I think AMD is going to go bankrupt, which is really, really sad. I hope some wealthy investors buy a majority stake and pump in money so it can compete and Intel doesn't become a monopoly, but it's looking grim for AMD, and I wanted them to be competitive so Intel reduces prices!
Damn, this is disappointing. After my whole life using AMD, I will probably make the switch to Intel should I ever upgrade. I was really hyped for this release, but AMD really bombed it.
Oh well, I was always jealous of Intel owners anyways. ;D
I built my rig with a Phenom II X6 1055T a little more than a year ago and was looking forward to this release from AMD. Needless to say, I am disappointed. It's not that I was looking to upgrade my CPU in the immediate future, but it's still sad to witness such a letdown.
I think AMD is going to go bankrupt, which is really, really sad. I hope some wealthy investors buy a majority stake and pump in money so it can compete and Intel doesn't become a monopoly, but it's looking grim for AMD, and I wanted them to be competitive so Intel reduces prices!
I think you're forgetting that AMD has a GPU division, which would definitely keep them afloat (this generation of Radeons is pretty good, though I must admit I don't know too much about PC hardware nowadays).
They are actually pretty good in highly threaded applications, but so were the Phenom IIs, and price for price those are still the best bargain there. Problem is, the vast majority of games aren't threaded well, and people expect their computer to do more than play BF3. They basically made a chip that is less of a bargain for the same things the Phenom II did.
People need to understand that AMD has a market cap of about $3.3 billion compared to Intel's $120 billion. For them to make something that even comes close to beating what Intel can produce is pretty remarkable. What the two companies can spend on R&D is incomparable. That's personally why I like AMD. Sadly, there's no way I can buy these chips when they can barely beat Phenom IIs.
It's an honest shame that these chips are so disappointing, because we are rapidly losing real competition in the CPU market, which is never a good thing for the consumer.
On October 12 2011 21:34 TheBomb wrote: I knew it. There was the fact that they kept delaying it, and the fact that early leaked benchmarks showed Bulldozer losing to the 2600K in the CPU area and only winning in the graphics area, which isn't even important since 90% of people have a dedicated graphics card anyway!
How does Bulldozer beat Sandy Bridge in the graphics area when it doesn't even have an IGP?
Most of the comments here are really unqualified. The Bulldozer design is at the start of its lifecycle, and it starts where the Phenom II line stops. Bulldozer is for sure not inferior to the Phenom II line, it is comparable to the i7 line in many benchmarks, and it also outclasses the i5 in many benchmarks. Its flaw is single-core performance, but that performance is nowhere near as horrendously bad as some posters here state. Also, calling it a "fail" for gaming or for SC2 is utter nonsense when it can run SC2 at 90+ FPS with a bad graphics card.
I don't think the CPU was made to outclass Intel, but to create a successor for the Phenom II line, which had reached its technical limits, and to create a platform that can compete with Intel in the future, maybe not at every performance level but for sure with better pricing. I think AMD is going for more bang for the buck instead of a few percent more calculations for double the price.
I would place the chip between the i5 and i7, but with the price advantage going to AMD. And Zambezi does not have integrated graphics, like some people here stated.
I see uneducated people going for AMD because of the "omg it has 8 cores" so it must be better then what intel is offering. Overall consumers lose because of this chip being so underwhelming.
On October 12 2011 22:31 Holy_AT wrote: Most of the comments here are really unqualified. The Bulldozer design is at the start of its lifecycle, and it starts where the Phenom II line stops. Bulldozer is for sure not inferior to the Phenom II line, it is comparable to the i7 line in many benchmarks, and it also outclasses the i5 in many benchmarks. Its flaw is single-core performance, but that performance is nowhere near as horrendously bad as some posters here state. Also, calling it a "fail" for gaming or for SC2 is utter nonsense when it can run SC2 at 90+ FPS with a bad graphics card.
I don't think the CPU was made to outclass Intel, but to create a successor for the Phenom II line, which had reached its technical limits, and to create a platform that can compete with Intel in the future, maybe not at every performance level but for sure with better pricing. I think AMD is going for more bang for the buck instead of a few percent more calculations for double the price.
I would place the chip between the i5 and i7, but with the price advantage going to AMD. And Zambezi does not have integrated graphics, like some people here stated.
Yeah, I'm also hoping that this is just the first generation of processors we're seeing on this platform, that these have some flaws (like energy consumption, for example), and that new releases will somewhat solve them.
Overall, AMD has never really been able to compete with Intel's high-end performance, but they offer a cheaper alternative, which is at least why I've stuck with them.
Hard to overstate how much of an underachievement this is. At ~2B transistors and 3 cm² to have such low performance is disgusting. I am even surprised they decided to launch it at all.
The fabrication process is to blame for sure, but how bad can that be? They said 50% faster than their six-core, and this isn't close at all; they would have had to hit over 4.5GHz to get there. We can definitely see Llano is also suffering on clock speed and power consumption compared to previous generations, and this should have been a warning for AMD.
Also, let me add (I've said this before) that AMD has been selling us Opterons since 2003, and it is a big mistake. They have completely neglected the mobile segment, which was the engine of growth in the market, in favour of the very low-volume server market; this has led to the contraction of their CPU business, and their R&D most likely had to suffer as well.
This problem is only getting worse with time, so I'm afraid the company will go under pretty soon.
On October 12 2011 22:18 floor exercise wrote: People need to understand that AMD has a market cap of about $3.3 billion compared to Intel's $120 billion. For them to make something that even comes close to beating what Intel can produce is pretty remarkable. What the two companies can spend on R&D is incomparable. That's personally why I like AMD. Sadly, there's no way I can buy these chips when they can barely beat Phenom IIs.
How is it remarkable? AMD beat Intel pretty handily with their Athlon series all the way up to San Diego, and wasn't the gap even larger back then?
Also, I think the people saying AMD will go under because of this are overestimating the end-user PC enthusiast market. There is a lot more money to be made selling to server clusters and the like, and it seems like the architecture favors that.
The best we can hope for is a Presler kind of revelation towards the end of the BD cycle, but even that's somewhat optimistic. When I say Presler, I mean how the price/perf became somewhat compelling after the Prescott launch, but that was after two years, too.
AMD is not only NOT price-competitive, but also worse in IPC than the Thubans, and worse in power consumption to add insult to injury. I'm just a little miffed at the people who played their part in suggesting this was a good chip: chew at XS, the reddit thread with Icrontic giving updates... MovieMan even changed his avatar. << OK, that last one was good for a chuckle... but damn.
I know Kyle at [H] suggested it's a decent chip for BF3, but IMO, given the power consumption and price, when you can already get an i5 2500K for $149/$179 at MC, BD just isn't compelling.
On October 12 2011 22:53 mav451 wrote: AMD is not only NOT price-competitive, but also worse in IPC than the Thubans, and worse in power consumption to add insult to injury. I'm just a little miffed at the people who played their part in suggesting this was a good chip: chew at XS, the reddit thread with Icrontic giving updates... MovieMan even changed his avatar. << OK, that last one was good for a chuckle... but damn.
Misery loves company I guess.
For once I get to feel smarmy about being a generation behind in my processor.
I'm surprised they even released this. It makes the company look so bad. After this, they essentially look 100% incapable of competing with Intel.
It's interesting, especially because it's not like Intel is doing anything anti-competitive; Intel just totally and completely outclasses AMD. Does this create a sort of monopoly situation? I dunno, it seems so strange. I wonder why AMD can't even come close to keeping up.
On October 12 2011 22:18 floor exercise wrote: People need to understand that AMD has a market cap of about $3.3 billion compared to Intel's $120 billion. For them to make something that even comes close to beating what Intel can produce is pretty remarkable. What the two companies can spend on R&D is incomparable. That's personally why I like AMD. Sadly, there's no way I can buy these chips when they can barely beat Phenom IIs.
How is it remarkable? AMD beat Intel pretty handily with their Athlon series all the way up to San Diego, and wasn't the gap even larger back then?
Also, I think the people saying AMD will go under because of this are overestimating the end-user PC enthusiast market. There is a lot more money to be made selling to server clusters and the like, and it seems like the architecture favors that.
One small flash in the pan doesn't mean that much. All the other times it was basically "good effort AMD, but you're not good enough for Intel", and it'll stay "good effort AMD" till the end of time, because Intel isn't going to lose this lead.
AMD won't go under, because Intel will make sure of that. They are popular in HPC because they are dirt cheap, not because they perform exceptionally well: Magny Cours was basically handing out free silicon to stop market share from slipping further. That hasn't worked too well, since in servers overall Intel is the undisputed king, which is reflected in AMD's server market share slipping to single figures. You also have companies like Oracle selling server software by the core, which is obviously advantageous to Intel (Hyperthreading =/= core) and bad for AMD.
AMD is too far behind Intel at this point in time. No doubt they can design a good product (all companies have talented people; it just takes management to make them shine), but without the fabrication resources Intel has, they'll never be remotely competitive in any sector.
On October 12 2011 23:19 Mohdoo wrote: I'm surprised they even released this. It makes the company look so bad. After this, they essentially look 100% incapable of competing with Intel.
It's interesting, especially because it's not like Intel is doing anything anti-competitive; Intel just totally and completely outclasses AMD. Does this create a sort of monopoly situation? I dunno, it seems so strange. I wonder why AMD can't even come close to keeping up.
Intel pours money into their fabs, which are something like 18 months ahead of Global Foundries, which is what AMD generally uses to manufacture their processors. It doesn't help that this was a new processor architecture on an entirely new process, so you get mad yield issues.
In a perfect world, Bulldozer would probably be clocked extremely aggressively and perform quite well at stock settings. It was delayed for so long, I suppose, because they were incapable of reaching their target frequencies.
@Womwomwom, they can't really be dirt cheap when you look at the die size and transistor count; Bulldozer is extremely inefficient on both. If they switched their lineup to BD, they would make too little money per processor sold and would surely be driven into the ground.
Read my post again. It doesn't matter if their processors are fucking huge: AMD was pretty much handing out free silicon with Magny Cours (that two-Phenom-IIs-glued-together processor) to prevent their server market share from slipping further. And that is generally why HPC shops like using AMD processors: their tasks are well-threaded enough to use a billion cores, and AMD is willing to give you that many for cheap.
And haven't they pretty much switched to Bulldozer already? Phenom II EOL is like the end of this year or something.
On October 12 2011 18:38 Cocoabean wrote: So judging by the benchmarks, AMD basically did the impossible and reversed Moore's Law.
/clap
Not true. Moore's law is really about transistor counts doubling roughly every two years; the practical upshot is that you can buy a processor about twice as powerful for the same price every two years, and that still holds. You just can't buy that processor from AMD.
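Purely as a back-of-the-envelope on that doubling claim (a trivial sketch, nothing Bulldozer-specific):

```python
def doublings(years, period=2.0):
    """Expected growth factor from doubling every `period` years."""
    return 2 ** (years / period)

# Over a typical four-year upgrade cycle, the doubling rule predicts
# roughly 4x the transistor budget at a given price point:
print(round(doublings(4), 1))  # -> 4.0
```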
I'm no brand fanboy; my last three PCs have been AMD due to better price/performance at the time, and the first two were Intel. This sucks, as many have said: it means AMD is not competitive at this point. Sandy Bridge is amazing and definitely hard to compete with. It really is a David versus Goliath kind of scenario, and it seems like David's sling arm is getting rather tired.
I wouldn't call it a fail, but it's definitely not ideal.
Sure they can (especially since they only pay for the good processors); whether it earns them enough money to improve their prospects is a different matter. It's not like they can do anything about Global Foundries except hope they catch up to Intel's fabs and keep giving AMD sweet deals.
To be on topic, I'm curious about the performance after Microsoft deals with the scheduling issues Bulldozer has with Windows. Having performance worse than a Phenom II is rather... odd, and the fact that performance improves quite a bit in Windows 8 suggests the operating system is holding it back somewhat. Strange that AMD didn't get Microsoft to patch it before release.
On October 12 2011 22:31 Holy_AT wrote: Most of the comments here are really unqualified. The Bulldozer design is at the start of its lifecycle, and it starts where the Phenom II line stops. Bulldozer is for sure not inferior to the Phenom II line, it is comparable to the i7 line in many benchmarks, and it also outclasses the i5 in many benchmarks. Its flaw is single-core performance, but that performance is nowhere near as horrendously bad as some posters here state. Also, calling it a "fail" for gaming or for SC2 is utter nonsense when it can run SC2 at 90+ FPS with a bad graphics card.
I don't think the CPU was made to outclass Intel, but to create a successor for the Phenom II line, which had reached its technical limits, and to create a platform that can compete with Intel in the future, maybe not at every performance level but for sure with better pricing. I think AMD is going for more bang for the buck instead of a few percent more calculations for double the price.
I would place the chip between the i5 and i7, but with the price advantage going to AMD. And Zambezi does not have integrated graphics, like some people here stated.
This is just delusional. Bulldozer is more expensive than the Phenom II X6 and delivers only slightly better performance, for the price of a 2500K. If they drop prices by $50+ it might be somewhat appealing, but considering a decent overclock will put you somewhere in the range of 300W+, it makes little sense to go that route.
As a successor it fails; as competition it fails. Saying it outclasses a 2500K is ludicrous when it is blown out of the water in single-threaded performance and only barely bests it in the most heavily threaded benches.
Hopefully AMD can turn Piledriver into Bulldozer's Phenom II, but I'm guessing they're going to need more than a 10% increase to deal with Ivy Bridge.
On October 12 2011 23:57 Womwomwom wrote: Sure they can (especially since they only pay for the good processors); whether it earns them enough money to improve their prospects is a different matter. It's not like they can do anything about Global Foundries except hope they catch up to Intel's fabs and keep giving AMD sweet deals.
To be on topic, I'm curious about the performance after Microsoft deals with the scheduling issues Bulldozer has with Windows. Having performance worse than a Phenom II is rather... odd, and the fact that performance improves quite a bit in Windows 8 suggests the operating system is holding it back somewhat. Strange that AMD didn't get Microsoft to patch it before release.
I agree that having a suboptimal scheduler really sucks; in fact, a lot of these numbers probably deserve another review after Microsoft patches it up (if they bother).
I am really worried about the single-thread performance. I see almost no reason for AMD users to upgrade, and I will still recommend Sandy Bridge to people over the new AMD BD.
Programming multi-threaded application provides more challenges for programmers; so for AMD to follow this path of multicore parallel processing is non prevalent to the consumer needs at this time.
I think Bulldozer is a worthy successor to AMD's Phenom II; the platform is built around multi-GPU support, hence the dual PCI-E x16 slots. As a systems builder, you need to look at all aspects of the chipset and platform rather than relying on the CPU alone.
The pricing of these CPUs is also extremely attractive, and their overclockability will further sway consumers' purchasing decisions.
I read about the scheduler issue, but 3% seems to be the number quoted here. The flipside (or the optimistic question) is: why didn't AMD get that resolved before release if it would significantly help performance? So logically it probably doesn't help much.
When you consider the consistently bad performance in the leaks, and then even lab501 releasing similar numbers about a day and a half earlier, it really put to bed any real expectations of "surprising" performance boosts. Some AMD fans clung to BIOS updates or AGESA problems changing things (lol), but they really are grasping at straws at this point.
Which Bulldozer chip is priced competitively with Intel? Honestly, looking at the benchmarks, every Intel chip at the same price point performs better both with and without overclocking.
It feels like Intel finally has AMD beat in every possible way.
On October 13 2011 01:19 Grobyc wrote: Still glad I got my Intel SandyBridge then
My thoughts exactly.
I expect a CPU in the same price range that comes out 6 months later than the one I'm using right now to be at least 10% better in most benchmarks, not just in some specific cases.
This is just a total failure. I can't believe AMD actually released the chip in its current state. Either 8-core chips just aren't getting the software support they need, or this is a terrible concept to begin with, or both.
While it's basically AMD's Pentium 4, it's not a terrible chip.
It just has way too many weird quirks, since it favors parallelism over single-threaded performance, and the current revision kind of sucks at overclocking at reasonable levels. There are issues with the scheduler and cache thrashing and other things, but it's a decent starting block for a new line of processors.
The processor only really fails in the enthusiast sector, while not being particularly mind-blowing compared to Phenom II, and it leaks so much power that it could heat your home in winter. That's the only real problem with it, since they hyped it so much for 4 years :p.
The Bulldozer core is and was designed for the server space. All they did was alter some features, throw an FX brand on it, and label it their top-of-the-line consumer CPU.
The CPU does very well in multi-threaded situations and should serve its purpose in tackling the server space.
The Bulldozer core was never specifically designed to go after Sandy Bridge in gaming or single-threaded apps.
No, it does a functional job in multi-threaded situations. Or is "Slightly better than half the physical cores in an older CPU sometimes" equivalent to "does very well" now?
Shame. I wonder what would've happened without Intel's market manipulation in earlier years. AMD could have, and should have, grabbed a much larger market share during the Pentium 4 vs. Athlon XP/64 period, when AMD was on top both in raw performance and in bang for buck.
With the OEMs bullied away from AMD by Intel, AMD never got the market share and revenue to expand its R&D division. I fear for competition in the CPU market if AMD can't come back with a strong processor soon.
At least they're making a profit now, something that wasn't true a few years back. But still...
Yes, and look at the prices of an i7 vs. the FX as well. I would rather buy a CPU that does better (albeit marginally) for less money for my server space.
That, and Win7 isn't optimized when it comes to thread placement either, and we've only seen limited Win8 tests; however, thread scheduling is addressed there (or so I hear).
Also, I said "does very well". If an i7 does very well, and the FX is just as good (a few % above or below in different tests), that means it also does very well. [DISCLAIMER]: I am talking about multi-threaded tests ONLY.[/DISCLAIMER]
Don't nitpick my choice of words and grammar... You know darn well what I was saying, and that when people say "does very well", they are not saying "does totally better than the competition"... Otherwise they would just say that, if that's what they meant.
Who's nitpicking? I'm trying to understand how a newer CPU with a higher clock and twice the physical cores occasionally barely exceeding the competition is considered doing very well. It's counterintuitive.
You can try to make it sound good all you want, but if a CPU needs twice the cores and a higher clock to match performance on apps that use the cores even reasonably well, it just plain sucks.
Play fanboy all you want, but don't expect us to buy into this line of... uhm... thinking.
Who is a fanboy? And why all the attacks?
I was merely pointing out that people are getting a little too upset over the crappy results in some tests... as the chip wasn't really made for them... and also isn't a purely consumer-designed CPU.
Also... it doesn't really have twice the physical cores. The 8-core part has 4 "modules". Each module has one floating-point unit and 2 integer cores, so it's sort of in between 4 and 8 cores, as a regular core has one of each. I'm not making excuses for them or anything... just clarifying.
And your definition of "uses cores reasonably well" is, of course, subjective.
They wanted to increase scalability, and to do that they had to lengthen the pipeline. When the pipeline is lengthened, the chip does less work per cycle. When it does less per cycle, they had to ramp the clock speed up. When they had to ramp the speed up, Global Foundries couldn't get their fab process to work well enough. When Global Foundries couldn't make Bulldozer well, they delayed it. When they delayed it, they hyped it even more to buy time. When they tried to buy time by hyping it even more, they eventually had to release it about 2 years behind schedule. Etc., etc.
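The chain of causes above boils down to the usual back-of-the-envelope relation: performance ≈ IPC × clock. A deeper pipeline costs IPC, so the clock has to rise just to break even. A tiny sketch; the IPC and clock numbers are invented for illustration, not measured Bulldozer figures:

```python
# Rough single-thread model: perf ~ IPC * clock.
# A deeper pipeline tends to cost IPC (bigger branch-mispredict penalty),
# so the clock has to rise just to break even.
# All numbers below are invented for illustration.

def perf(ipc, clock_ghz):
    """Relative single-thread performance (arbitrary units)."""
    return ipc * clock_ghz

old = perf(ipc=1.0, clock_ghz=3.0)   # hypothetical short-pipeline chip
new_ipc = 0.75                       # assume 25% IPC lost to the deeper pipeline

# Clock the new design needs just to match the old one:
breakeven_ghz = old / new_ipc
print(f"break-even clock: {breakeven_ghz:.2f} GHz")  # 4.00 GHz
```

With a 25% IPC loss in this toy model, the new design needs a 33% higher clock just to tie, which is exactly the "ramp the speed up" step in the story above.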
Intel is so far ahead in the technological aspects of CPUs.
WTF, a 600 W draw under load with an overclock??? My whole system draws 550 W and that's my entire PSU; it never draws more than 400-450 W, even though I have a 4 GHz overclock on my 32 nm i3.
This is a complete joke.
It's 600 W system draw, not CPU draw. BD has a similar draw to the old i7s.
I don't think you can simply compare core counts and clock speeds and come up with a performance expectation. The variables that matter are price and possibly power consumption. There are various roads that lead to good performance. We already knew from the Pentium 4 that more clock speed doesn't necessarily mean better performance. Similarly, more cores doesn't automatically mean better performance (even in multi-threaded programs), as the individual cores can be rather weak, as we see now with Bulldozer. Compare performance across the set of applications and tasks you're interested in and match that against the price. Pulling in other statistics such as clock speed, core count, amount of L3 cache, or whatever, is fairly pointless.
Also note that despite AMD's marketing buzz, Bulldozer isn't a full 8-core CPU. It consists of 4 so-called "modules" that each contain almost 2 cores; "almost" meaning that some parts, most notably the floating-point unit and some of the cache, are shared between the 2 cores in a module. This design lies between Intel's Hyper-Threading and 8 actually distinct cores, both in performance and in the number of transistors required.
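That module layout is also why the Windows thread-placement issue discussed earlier matters: two threads on the same module contend for the shared FPU, while two threads on different modules don't. A toy model of the effect, with a made-up sharing penalty rather than any measured Bulldozer number:

```python
# Toy model of thread placement on a 4-module / 8-core part.
# Two threads on the SAME module share an FPU; the penalty value
# is invented for illustration, not a measured figure.

MODULES, CORES_PER_MODULE = 4, 2

def placement_cost(core_ids, shared_penalty=0.2):
    """Fraction of ideal throughput lost to module sharing."""
    loads = [0] * MODULES
    for core in core_ids:
        loads[core // CORES_PER_MODULE] += 1
    # Each module with both cores busy loses `shared_penalty` of one thread.
    shared = sum(1 for n in loads if n == CORES_PER_MODULE)
    return shared * shared_penalty

# 4 threads; a module-unaware scheduler packs cores 0-3 (two full modules):
print(placement_cost([0, 1, 2, 3]))   # 0.4
# A module-aware scheduler spreads them, one per module:
print(placement_cost([0, 2, 4, 6]))   # 0.0
```

In this sketch the same four threads lose throughput or none at all depending purely on which cores the scheduler picks, which is consistent with the reports of Windows 8's scheduler helping Bulldozer.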
Well, where's the test that shows off its merits? If it was intended for a specific niche, shouldn't that be the benchmark? I find it more likely that they hedged a lot of bets on a single research line that wasn't particularly fruitful, and decided to release something to try to recoup some of that investment.
Well, you'll get those with the server parts, since very few desktop workloads can use BD's strengths.
If you followed the industry at all, you would know that the new platform was designed for the server market.
However, I'm fully aware that not everyone is as nerdy as I am when it comes to this stuff... but any Google search for Bulldozer will eventually turn up results from before today, where you can plainly see this fact.
Sure, except they've tried to market it for all the things it's being shredded for not doing well.
Market an 8-core CPU, and I'll treat bench results like an 8-core CPU's. Tough shit for AMD on that; one of their own marketing points just makes the performance look worse.
And I know full well that clock and cores aren't the primary components of performance. The problem is when you release a CPU that does a miserable job at a lot of the things you marketed it for, even with edges in specs like those.
And anybody acting like BD is worth a damn outside incredibly limited areas, just like Ph2, is deluding themselves. Obviously nobody will say it's 100% useless for everything, but they've fallen so far short of so many of the marks they set for themselves and hyped that it's a huge disappointment.
I'm learning about processors, memory, memory management, caches, yada yada in one of my classes right now, so I'm genuinely interested in this stuff.
So, I literally just googled "AMD Bulldozer target market" and it's not "clear" whatsoever. It sounds like one needs to do some serious digging to find that kind of information. I mean, if it was so blatantly obvious, don't you think the guys testing the thing would have thrown it into a webserver to see how it performs under load? Did that little nugget of information just completely pass over everyone's heads? Or is that just an excuse?
After more website skimming, I'm seeing a lot of quotes along the lines of:
What's interesting to note is that the Bulldozer architecture is being launched for both the server and desktop markets.
If that's true, then desktop users are disappointed, and rightfully so, I believe.
So, how does the thing perform as a server? Does it blow the competition out of the water?
I haven't seen / can't find anything yet. Anyone else?
So AMD spent the past 5 years and all their R&D designing a chip that is tailored only to a super-specific niche of the computer market?
Let's not kid ourselves here. They released an absolute turd of a product. When the i7-2600K, which has been out for close to a year now, completely annihilates the FX-8150 in virtually every benchmark applicable to 99.9% of the consumer base, you know AMD has a massive failure on their hands.
Justifying this chip is like saying Honda released a new Accord at the same price as a new Camry, but with inferior performance in every single driving scenario except driving at exactly 87 mph on a gravel road in downtown San Francisco on October 27th, 2013.
Again, AMD managed to do the impossible and released CPU performance *significantly* inferior to parts that have been on the market for anywhere from 8 months to 2 years. The fact that this lineup does decently at a completely niche market is irrelevant.
I was merely pointing out the fact that people are getting a little too upset over the crappy results in some tests... as the chip wasn't really made for it... and also isn't a pure Consumer-designed CPU.
So AMD spent the past 5 years and all their R+D designing a chip that is tailored only for a super-specific niche computer market?
Servers are super niche? lol
Lets not kid ourselves here. They released an absolute turd of a product. When the i7-2600k, which has been out for almost close to a year now, completely annihilates the FX-8150 in virtually every benchmark that is applicable to 99.9% of the consumer base out there, you know AMD has a massive failure on their hands.
This is fair... except that the FX-8150 is not ~$314, but $245. Still, I think the price/performance ratio leans towards Intel.
To justify this chip is like saying Honda released a new Accord at the same price as a new Camry, but has inferior performance in every single driving scenario possible except if you drive exactly at 87mph on gravel road in downtown San Francisco on October 27th of 2013.
Again, AMD managed to do the impossible and actually released *significantly* inferior CPU performance to those that have already been out on the market for 8 months-2 years. The fact that this lineup of CPU's does decently better at a completely niche market is irrelevant.
I had no idea servers were niche... hmm.
Actually, when it comes to CPU sales, do you know what the market actually looks like right now? I'll give you a hint: gaming PCs are the niche.
Someone on [H] ran the numbers, and it's questionable whether the 2500s or one of the FX-81xx chips is better, since overclocking is kind of meh due to the power issues. At stock the FX parts are better, and overclocked they're about the same. The remaining factors are total system cost and the fact that BD motherboards have more PCI-E lanes, etc. So it's kind of a wash and really depends on what you want. I'd be more curious about the FX-4100.
Uhm, OCing might give the same end result on high-end custom loops, but according to AT a 2500K hits roughly the same clock on air as an 8150, with better clock-for-clock performance.
Big disappointment. Oh well, I'm no adamant fanboy but I've usually leaned towards Intel. Pretty hyped for Ivy Bridge release in March 2012, might build a new computer around that if money permits
AFAIK servers stay on 24/7, and you ideally don't want to spend $50 a day on the electricity bill for your computer. Please correct me if I'm wrong.
Wrong.
Any large-scale server/NOC centre would have no issue at all paying 50 dollars per day for a solid, reliable server.
EDIT: For clarification, the data centre/NOC I work for spends roughly $25,000 per month on its electricity bill, and we use the AMD platform in most of our servers.
Actually, BD has some good power-saving features for the true server market; it's home use where the power consumption blows. It's just that a home server with proper server qualifications is very niche, the enterprise server market doesn't shift much most of the time, and BD just plain hasn't impressed the majority of consumers yet.
I'm utterly against BD for consumer use, but I won't argue against its validity and viability for the server market. They just picked some bad directions for their consumer marketing that caused them to underwhelm. If they'd marketed it along a completely different route and targeted the so-called "budget enthusiast" bracket at release, they'd have done much better.
But they have issues paying $320 for a 2600K instead of $245 for a Bulldozer?
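For perspective on that purchase-price gap versus the power bill: using the CPU prices quoted in this thread, plus an assumed extra power draw and electricity rate (both made-up round numbers, not measurements), a quick back-of-the-envelope calc of how long a server runs before the power delta outweighs the sticker saving:

```python
# Back-of-the-envelope: CPU price delta vs. electricity over a server's life.
# CPU prices are the ones quoted in the thread; the wattage delta and
# electricity rate are assumed round numbers, not measured figures.

PRICE_INTEL, PRICE_AMD = 320, 245      # USD, from the thread
EXTRA_WATTS = 50                       # assumed extra draw for the AMD box
RATE_PER_KWH = 0.10                    # assumed USD per kWh

def breakeven_days(extra_watts=EXTRA_WATTS, rate=RATE_PER_KWH):
    """Days of 24/7 operation until the extra power bill
    eats the up-front purchase-price saving."""
    saving = PRICE_INTEL - PRICE_AMD               # 75 USD up front
    cost_per_day = extra_watts / 1000 * 24 * rate  # kWh/day * rate
    return saving / cost_per_day

print(round(breakeven_days()))  # 625
```

Under these assumptions the $75 saving lasts well over a year of continuous operation, so for a data centre the CPU sticker price and the power bill are both real line items; neither dominates trivially.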
Oh they're running it on stock?
First of all, that was not your initial point was it?
On October 13 2011 02:42 JingleHell wrote:
On October 13 2011 02:37 Shikyo wrote:
On October 13 2011 02:05 B00ts wrote:
Read my edit.
I'm not sure how you guys spending 10k more than you would with Intel CPUs is a good argument.
Enterprise server farms don't give a rat's ass about clock; it's about physical cores, thermals, and power use. The power-saving features help the thermals in the server farm environment, making them a good choice there.
Overclocking is home use shit. Server farms work COMPLETELY differently, and the market share is generally distributed based on specific use rather than silly things like price or power consumption alone.
EDIT: For clarification, the Data Centre/Noc I work for spends roughly $25,000 per month on our electricity bill, and we use the AMD platform in most of our servers.
I'm no server/performance guy, but isn't the total cost of ownership of a server farm dominated by both the salary of the guy you pay to maintain it, and the long-term power consumption costs? I can't think of any business that would just up and go "Yeah, fuck it, get the super heavy-duty chips, and to hell with the power costs!"
Dude, it's not an argument, it's fact. They use AMD in 90% of the servers here. I'm just letting you know; I'm not trying to debate semantics with you.
10 grand is nothing to these guys (and most enterprise-level data centres).
Believe it or not, AMD CPUs are really good at multi-threaded tasks. For a server, the more cores the better; who gives a fuck about clock speed when you have a server with 24 cores in it?
Well, enterprise server farms might, sure. I'm not sure why "server" has to mean $500,000 systems, but then let's all witness Bulldozer beat the 990X at that.
The power costs function differently in a server farm.
The home box running the clan Ventrilo and 3 CS:S dedis is NOT the server market.
Let's not kid ourselves here. They released an absolute turd of a product. When the i7-2600k, which has been out for close to a year now, completely annihilates the FX-8150 in virtually every benchmark that is applicable to 99.9% of the consumer base out there, you know AMD has a massive failure on their hands.
This is fair... except that the FX-8150 is not ~$314, but $245. Still, I think the price/performance ratio leans towards Intel.
Someone on [H] ran the numbers and it's questionable whether the 2500s or one of the FX-81xxs is better, since OCing is kinda meh due to the power issues. At stock the FX are better and at OC they are about the same. The only issues are the cost of the system and the fact that BD mobos have more PCIe lanes, etc. So it's kinda a wash and really depends on what you want. I'd be more curious about the FX-4100.
Uhm, OCing might be the same end result on high-end custom loops, but according to AT a 2500k hits roughly the same clock on air as an 8150, with better clock-for-clock performance.
You can get a 2500k to 4.6GHz on air cooling? I had no idea... I'm gunna have some fun tomorrow.
Most of them, yes, from what I've read.
Sad thing is, I'm at 4.7 on liquid, but can't go past that due to some fucked up voltage issues. Pisses me off.
Yes, and you can get a 2600k to 4.6GHz on air as well. At 4.6GHz it uses less than half the energy of the Bulldozer at the same clock and outperforms it in everything but maybe 7-Zip, so you get the extra $75 back over time through the electricity bill.
Btw, I don't really understand Bulldozer. If cores are all that matter for servers, why not just make 1.6GHz 16-core chips and such?
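Whether that $75 actually comes back depends entirely on usage; here's a rough sketch using assumed (not measured) numbers: a ~100 W load-power difference and $0.12/kWh electricity.

```python
# Rough payback estimate for the 2600k's ~$75 premium over the FX-8150,
# using assumed (not measured) numbers for the power gap and electricity rate.
def payback_days(price_delta_usd, power_delta_w, load_hours_per_day, usd_per_kwh=0.12):
    """Days until the energy savings cover the up-front price difference."""
    kwh_saved_per_day = power_delta_w / 1000.0 * load_hours_per_day
    return price_delta_usd / (kwh_saved_per_day * usd_per_kwh)

print(payback_days(75, 100, 4))    # ~4 hours of load a day: several years
print(payback_days(75, 100, 24))   # 24/7 full load (e.g. a server): under a year
```

At a few hours of gaming a day the payback takes years; only a box under constant load recoups the difference quickly, which is roughly the server-vs-desktop split being argued in this thread.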
...god dammit. Now we're one step closer to an Intel monopoly. I used to love supporting AMD back in the early 2000s, when their products were both higher-end AND better bang for the buck than Intel's. Sadly, I kinda already knew Bulldozer was going to be an enormous failure, given how badly it was delayed. I'm just hoping that, at the very least, when AMD goes under, ATI gets spun off and stays alive.
I'm no server/performance guy
Leave it at that please.
So, it's too difficult to directly answer, so you pick on that little tidbit? How mature. From the sound of it, you aren't a server guy yourself. Just bragging about what your business uses and claiming authority.
Limitations to how much you can mash onto one die? They just use multiple socket boards instead.
Just out of curiosity, what is the point of having many cores? To me it just seems like a fuckload of overhead and synchronization. Why doesn't the industry build really, really powerful dual cores instead of more, weaker cores?
I'm no server/performance guy, but isn't the total cost of ownership of a server farm dominated by both the salary of the guy you pay to maintain it, and the long-term power consumption costs? I can't think of any business that would just up and go "Yeah, fuck it, get the super heavy-duty chips, and to hell with the power costs!"
If you want a detailed answer, here you go.
The total cost of a server farm or data centre is completely IRRELEVANT. You're paying for good equipment for a couple of reasons: stability, failsafe mechanisms and data retention.
If you end up making money from the server farm, then great. People don't invest all this money into a huge data centre to make a profit; it's a guaranteed loss (in its own respect).
It's there to run the company and the infrastructure, and to make sure everything works as it should. It's a necessity.
A data centre tech or NOC manager makes about, oh I don't know, 70~90k a year, which is a few months of electricity. Believe it or not, the cooling and air conditioning in a data centre cost more than most hardware (to operate 24/7), so it's somewhat of a moot point.
The only way you can make a profit from a data centre is to rent out a colocation to another company, perhaps a reseller of your services, or a smaller fish who's renting some pipe from you. These usually cost about 15k, for a small 4x6 or 6x8 foot area.
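A quick sanity check of the figures quoted in this post (the $25,000/month power bill against the 70-90k salary range) bears out the "few months of electricity" comparison:

```python
# Compare the quoted monthly electricity bill with the quoted annual salary range.
power_bill_per_month = 25_000              # USD, quoted data centre power bill
salary_low, salary_high = 70_000, 90_000   # USD/year, quoted NOC salary range

months_low = salary_low / power_bill_per_month
months_high = salary_high / power_bill_per_month
print(f"a year's salary = {months_low:.1f} to {months_high:.1f} months of electricity")
```

So a full year of one tech's salary buys roughly three to four months of power, which is the post's point: staff costs don't dominate, the power and cooling do.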
On October 13 2011 03:02 KaiserJohan wrote: Just out of curiosity, what is the point of having many cores? To me it just seems like a fuckload of overhead and synchronization. Why doesn't the industry build really, really powerful dual cores instead of weaker but more cores?
Well, probably because doing more things at once almost as fast beats doing fewer things at once slightly faster. At least so far. Like folding and bitcoin mining: those only really work on GPUs with their shitload of slower cores. Different kinds of processing take different kinds of hardware.
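To put numbers on that intuition, here's a toy throughput model (entirely made up for illustration; `speed` is relative single-core performance, and jobs are assumed fully independent):

```python
import math

# Toy model: time to finish `tasks` independent jobs on `cores` cores,
# each core running at relative speed `speed` (1.0 = baseline).
# Assumes perfectly parallel work with no synchronization overhead.

def total_time(tasks, cores, speed):
    """Jobs run in batches of `cores`; each batch takes 1/speed time units."""
    batches = math.ceil(tasks / cores)
    return batches / speed

# 1000 independent jobs: eight slow cores beat two 50%-faster ones.
fast_dual = total_time(1000, cores=2, speed=1.5)   # ~333 time units
slow_octo = total_time(1000, cores=8, speed=1.0)   # 125 time units
assert slow_octo < fast_dual
```

The moment the work is highly parallel (folding, mining, servers), core count dominates; the catch is that most desktop software isn't like that.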
On October 13 2011 02:01 Bibdy wrote: Well, where's the test that shows off its merits? If it was intended for a specific niche, shouldn't that be the benchmark? I find it more likely that they hedged a lot of bets on a single research line that was not particularly fruitful, and they've decided to release something to try and get some of that investment back.
If you follow the industry at all... You would know that the new platform was designed for The Server market.
However, I'm fully aware that not everyone is as nerdy as I when it comes to this stuff... But any google search for Bulldozer will eventually get you search results from prior to today and you can plainly see this fact.
Irrelevant? Really?
So you're saying a 1% increase in core capacity would justify 500% more power consumption? Those three metrics are the ONLY ones under consideration? Please. Maybe to a business that likes to hemorrhage money, but over here in the real world, the total cost of ownership is a huge deal. It's pretty much the sole reason my company's IPBX product wins deals against the equivalents from Cisco and Avaya. At least in this industry, people give much less of a shit about the simpler, intuitive interface and the distributed architecture than they do about the simple fact that power consumption is stupid-low and you only need one guy to maintain it.
The BD release is turning AMD vs Intel into Mac vs Windows... it's fucking ridiculous. People are so busy judging the thing by their own use case that they can't admit its validity for a totally different market.
On October 12 2011 21:34 TheBomb wrote: I knew it. The fact that they were delaying it so much, and the fact that early leaked benchmarks showed Bulldozer losing to the i7-2600K in the CPU area and only winning in the graphics area, which is not even important as 90% of people have a dedicated graphics card anyway!
How does Bulldozer beat Sandy Bridge in the graphics area when it doesn't even have an IGP?
He is probably confusing Bulldozer with Llano. Next-gen Bulldozer will be an APU too.
Anyway, even if it is a failure I don't think the situation is desperate for AMD. Zacate is doing well (they can't ship enough, actually), Llano is doing decently too, and their GPUs are selling well.
They were in way worse shape during the Core era, before the Phenom/Athlon II, tbh. Still disappointing though.
First-gen Phenom was the darkest days; even the low-end enterprise server market had to move away from them. Phenom II had its moments at least, despite being relatively obsolete at release. Llano absolutely has awesome uses in the market for inexpensive notebooks that can play games to some degree, and BD will have some enterprise server market share.
Source on next generation Bulldozer being an APU?
Just because Trinity and Kaveri will be using Piledriver and Steamroller cores does not mean that the FX brand will be getting an IGP...
There is also the issue of the heat ceiling. Instead of 2 really extreme cores that would overheat, you spread the work over more cores at a lower clock frequency, thus reducing the maximum temperature. Phase-change / liquid cooling isn't exactly something you can mass-market, costs aside.
The problem is that most programs aren't designed to take proper advantage of multi-core hardware yet, and we are caught in this transition.
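That transition cost is what Amdahl's law captures: the serial fraction of a program caps the speedup from extra cores. A quick sketch (the 50% parallel fraction is an arbitrary example, not a measured figure for any real program):

```python
# Amdahl's law: speedup from N cores is limited by the serial fraction
# of the program. The 0.5 parallel fraction below is an arbitrary example.

def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.5, n):.2f}x")
# With half the work serial, even 16 cores give less than 2x overall.
```

Which is exactly why an 8-module chip with weak per-core performance struggles against a 4-core chip with strong per-core performance on typical desktop software.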
Huh? We knew BD tech would be utilized for Trinity — this was presented in the June 2011 keynote. Remember the guy holding up the Trinity APU? It's for this reason people are linking the success of the two together. If BD sucks, that kind of diminishes the excitement for Trinity, doesn't it?
GPU improvement will be just fine (ATi held up their end of the deal), but if the BD side stalls the CPU performance, it's not going to be a balanced approach.
Let's not kid ourselves here. They released an absolute turd of a product. When the i7-2600K, which has been out for close to a year now, completely annihilates the FX-8150 in virtually every benchmark that is applicable to 99.9% of the consumer base out there, you know AMD has a massive failure on their hands.
This is fair... except that the FX-8150 is not ~$314, but $245. Still, I think the price/performance ratio leans towards Intel.
Someone on [H] ran the numbers, and it's questionable whether the 2500K or one of the FX-81xx chips is better, since OCing is kinda meh due to the power issues. At stock the FX are better, and OC'd they are about the same. The only issues are the cost of the system and the fact that BD mobos have more PCIe lanes, etc. So it's kinda a wash and really depends on what you want. I'd be more curious about the FX-4100.
Uhm, OCing might give the same end result on high-end custom loops, but according to AT a 2500K hits roughly the same clock on air as an 8150, with better clock-for-clock performance.
You can get a 2500K to 4.6GHz on air cooling? I had no idea... I'm gonna have some fun tomorrow.
Mine is at 4.4GHz stable (no need to push it any further, really) with an aftermarket heatsink.
On October 13 2011 03:30 mav451 wrote: Huh? We knew BD tech would be utilized for Trinity - this was presented in the June 2011 keynote. Remember the guy holding up the Trinitiy APU? It's for this reason people are linking the success of the two together. If BD sucks, kind of diminishes the excitement for Trinity doesn't it?
GPU improvement will be just fine (ATi held up their end of the deal), but if BD-side stalls the CPU performance, it's not going to be a balanced approach.
I know Trinity and Kaveri will be using Piledriver and Steamroller cores... but when someone says next-generation Bulldozer, I assume they're talking about Enhanced Bulldozer, scheduled for Q1 2012, or Next-Generation Bulldozer on 22nm, scheduled for 2013. Is there actually information on these (which are the FX brand) being APUs...?
On October 12 2011 21:25 Phayze wrote: Rofl AMD. Kinda feels like they aren't even trying to compete in the consumer market any more.
Yeah, that's why Zacate is destroying the Atom lineup on the low-end mobile market, lol.
On October 12 2011 22:24 android_245 wrote:
Source on next generation Bulldozer being an APU?
Just because Trinity and Kaveri will be using Piledriver and Steamroller cores does not mean that the FX brand will be getting an IGP...
Next-gen Bulldozer will be used for APUs and pretty much their whole desktop offering. My bad. Yeah, they will have some high-end procs without an IGP, but those will only be interesting for servers, video editing, 3D rendering and the like. I would not be surprised if they end up limited to the Opterons, tbh. They will have to make huge improvements in single/dual-threaded performance if they want to be competitive in the mainstream/performance DESKTOP market (I mean, we are talking about gamers here).
The IGP will be the main selling point for AMD. If they can work on the drivers, it means you will be able to get a cheap Crossfire.
Just look at their current platform. The APUs are progressively replacing the older CPUs, starting at the bottom end. Mainstream is the next step.
edit: I mean, it just doesn't make sense not to ship an IGP with the proc. It is the only way to be competitive. Why is Llano competitive vs SB in low-end mobile? Because AMD's IGP is a better offering than SB's.
Um... take my word for it. I promise?.. hehehe.
They aren't technically called APUs in the roadmap... but they will be DirectX 11 capable.
EDIT: To clarify more: Llano's next gen, Trinity, will use Bulldozer cores, same with next-gen Bulldozer; both will have DX11-capable graphics.
On October 13 2011 03:37 Boblion wrote: Just look at their current platform. The APUs are progressively replacing the older CPUs, starting at the bottom end. Mainstream is the next step.
Yes, but FX is not mainstream; Komodo (the successor to Zambezi) won't feature an IGP.
Take the stats with a grain of salt; most benchmarks don't mean much, to be honest. Benchmarks are generally created to determine performance in a certain aspect, meaning that some processors will perform well on some benchmarks and badly on others. Just from a general overview of the benchmarks it seems to do ok, just not amazing.
OMG OMG THE CLOCK FREQUENCY ISN'T AS HIGH AS AN i7'S, IT'S TERRIBLE!?!?
AMD has pretty much always designed processors that run at slower clocks but do more instructions per clock period. Intel, on the other hand, does the exact opposite: they have a very fast clock and do tons of tiny instructions. In general they even out when you're performing any task (sometimes AMD even does better), but the higher clock rate is great for marketing.
Rule of thumb: don't trust the stats, they are generally misleading; test the hardware yourself with the program you will be buying the processor for.
EDIT: Maybe I should clarify. Yes, I saw that the clock rates for Bulldozer were higher than those of the i7. I was just talking about general trends, not Bulldozer specifically.
I am also not insinuating that AMD's chips perform better than Intel chips; I am just suggesting that AMD's chips are often underrated because they may not post as nice stats.
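The clocks-vs-IPC trade being argued here boils down to throughput ≈ IPC × clock, so a chip can lose on frequency and still win on work done. A sketch with made-up numbers (not measurements of any real CPU):

```python
# Throughput ~ instructions-per-clock x clock rate. Both figures below
# are invented for illustration, not measured values for real chips.

def perf_gips(ipc, ghz):
    """Rough billions of instructions retired per second."""
    return ipc * ghz

high_ipc_chip   = perf_gips(ipc=2.0, ghz=3.4)   # lower clock, higher IPC
high_clock_chip = perf_gips(ipc=1.4, ghz=4.2)   # higher clock, lower IPC
assert high_ipc_chip > high_clock_chip  # the higher clock alone doesn't win
```

The wrinkle in this thread is that the reviews measured Bulldozer behind Sandy Bridge on both IPC and (once overclocked) achievable clocks, so the usual "slower clock, more work per clock" defense doesn't apply here.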
An unedited version of the pic and a more recent roadmap. Edit: Or why do you think the "DX11 capable GPU" is a new point for the other releases but not for Komodo?
Whatever you're smoking, hook me up. Bulldozer has higher clocks. Obviously you didn't read any of the stuff you're trying to discredit.
And when the benchmarks from multiple sources all agree, it's hard to cry bias, especially when AMD buys ad space on half of the sites providing those reviews.
Bulldozer does NOT compete in the consumer market. But hey, delude yourself and waste money all you want.
On October 13 2011 03:47 Chimpalimp wrote: Take the stats with a grain of salt; most benchmarks don't mean much, to be honest. Benchmarks are generally created to determine performance in a certain aspect, meaning that some processors will perform well on some benchmarks and badly on others. Just from a general overview of the benchmarks it seems to do ok, just not amazing.
So... just so you know, I almost reported you thinking this was a troll... but I'll bite.
The Bulldozer core actually does LESS per clock tick (see Tom's Hardware review tests). And these Bulldozer chips actually ship with basically the same range of clock rates as Intel this time around.
Did you look at Bulldozer stats/benchmarks at all?
Way to be an ignorant, delusional AMD fanboy. Intel has had higher IPC for years now.
Not sure where you found that roadmap... I can't find the same one anywhere official.... Perhaps for a reason?
er
I don't think that's been true since it was Athlon 64 vs. P4.
On October 13 2011 03:58 Djzapz wrote: Dear lord AMD, you almost managed to take a step backward...
Hopefully Intel will keep their prices honest -_-
BWAHAHAHAHAHHAHAHAHA.
I LOL'ed IRL. For one, "keep"? That's a good one. For two, yeah right.
Right now they charge more for an i5 2500 non-k than a 2300. Same piece of silicon.
Well, it's probably binned, isn't it? Regardless, hardware manufacturers have been doing that even when competition was pretty even. Like NVIDIA and ATI: they would literally shut down some pipelines on their cards to sell them cheaper. Sometimes you could even turn them back on. Recently there was a GTX 465 that was just a gimped GTX 470 that you could flash back to GTX 470 firmware.
Anyway, $170 for a 2500K that'll last me for years is pretty alright. They could almost act like a monopoly at this point; they'd get slapped with fines for it, but that'd probably be covered by their profits.
By "keeping their prices honest", I meant that I hope they won't start acting like a monopoly.
I just want to see one of their 4 core Bulldozers benchmarked. Just glancing at the titles, and reading the Tom's Hardware article, those aren't available to be benchmarked yet? Or what?
So because everybody rips you off in the same ways, it's ok that they rip you off? And they're not always better binned; it depends on demand.
And if you think they aren't acting like a monopoly, look at 1366 pricing.
And the GTX 465 was the most insulting piece of hardware released in the last 2 years. Although the reference 6950s were kind of a nice gesture.
Intel does a good job at pretending they're not a monopoly. They delayed Ivy Bridge, stripped down the X79 chipset, and haven't invaded AMD's $100–$150 segment with quads yet.
On October 13 2011 04:07 FIStarcraft wrote: I just want to see one of their 4 core Bulldozers benchmarked. Just glancing at the titles, and reading the Tom's Hardware article, those aren't available to be benchmarked yet? Or what?
Only TechSpot has benchmarked the FX-4170, but that is not being released until 2012... the one being released now is the FX-4100, which is 600MHz slower. AMD probably asked reviewers not to benchmark the FX-4 line since its performance is so embarrassing that a Phenom II can beat it...
Kind of like not lifting the FX-8 NDA until they released the junk, so people wouldn't hear in advance how uncompetitive it was in the consumer market?
I don't know where you're pulling 1% and 500% from. Keep in mind we're not talking about future-gen BD architecture in servers; we're talking about current AMD platforms that are in place, which (right now) don't differ all that much in power consumption from Intel's.
The fact of the matter is, they're in use, and while obviously I can't speak to the total price or the cost/revenue of running AMD chips, it obviously works and there are reasons for it, which have been outlined by others in this thread. Another thing to consider: people running these server farms or data centres have deals with the local/provincial/state-run power companies and get their power at almost cost. So, like I said, it's totally irrelevant.
On October 12 2011 18:38 Cocoabean wrote: So judging by the benchmarks, AMD basically did the impossible and reversed Moore's Law.
/clap
Not true. Moore's law simply states that you can purchase a processor twice as powerful for the same price every 2 years. This fact still holds true. You just can't buy that processor from AMD.
No it doesn't. Moore's law just involves the transistor count (or density) on an integrated circuit. And "integrated circuit" isn't just the processor - it includes RAM, NAND, etc.
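For what it's worth, the transistor-count formulation is just exponential doubling; a quick sketch (the starting count and time horizon are arbitrary examples, not data about any particular chip):

```python
# Moore's law as commonly stated: transistor count on an
# integrated circuit doubles roughly every two years. This just
# illustrates the exponential form; inputs are arbitrary examples.
def transistors(start_count, years, doubling_period=2.0):
    return start_count * 2 ** (years / doubling_period)

# Starting from an example 1-billion-transistor die,
# 10 years at a 2-year doubling period is 2**5 = 32x:
print(f"{transistors(1e9, 10):.2e}")
```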
Anyway, on the topic of Bulldozer, looks disappointing, but I'm holding out hope that later iterations will make great enterprise server processors. That's the high margin market (the gaming market is a much lower margin niche).
AMD claims that the architecture was designed for scaling up clockspeeds, and rumors say that it's the shitty yield at Global Foundries that's preventing AMD from cranking it up. Though the "architecture designed for scaling up clockspeed" sounds ominously similar to Intel's NetBurst/P4 fiasco...
I'm really hoping that AMD's long-term strategy pays off - an Intel monopoly on high-end processors would suck for us all.
On October 13 2011 03:58 Djzapz wrote: Dear lord AMD, you almost managed to take a step backward...
Hopefully Intel will keep their prices honest -_-
BWAHAHAHAHAHHAHAHAHA.
I LOL'ed IRL. For one, "keep"? That's a good one. For two, yeah right.
Right now they charge more for an i5 2500 non-k than a 2300. Same piece of silicon.
Well it's probably binned, isn't it? Regardless, hardware manufacturers have been doing that even when competition was pretty even. Like between NVIDIA and ATI, they would literally shut down some pipelines on their cards to sell them cheaper. Sometimes you could even turn them back up. Recently there was a GTX465 that was just a gimped GTX470 that you could flash back to GTX470 firmware.
Anyway, $170 for a 2500k that'll last me for years is pretty alright. They could almost act like a monopoly at this point; they'd get slapped with fines for it, but that'd probably be covered by their profits.
By "keeping their prices honest", I meant that I hope they won't start acting like a monopoly.
So because everybody is ripping you off in the same ways it's ok they rip you off? And they're not always better binned, depending on demand.
And if you think they aren't acting like a monopoly, look at 1366 pricing.
And GTX 465 was the most insulting piece of hardware released in the last 2 years. Although reference 6950s were kind of a nice gesture.
I don't see how it's so outrageously wrong of me to call Intel "honest" when their pricing for LGA1155 has been relatively honest if we compare it to the industry... No reason to full caps "bwahaha..." at me unless you really want some attention.
As for 1366 pricing, it's for stupid people or enthusiasts, but I would agree that it's not particularly honest.
That is... disappointing, but I'm not surprised one bit. *le sigh*. Maybe AMD just shouldn't bother with high-end CPUs and should only target budget gamers, because there is no way in hell they can compete with Intel now. The monopolizing begins~
Looks like they really shot themselves in the foot going for a longer pipeline. As long as they continue to ignore their problems with memory management, no improvement in specific computational architecture is going to net a win over Intel.
They could be so, so much worse.
If all the casinos use rigged slot machines for net payouts 20% below what they should be, does that make any individual casino honest?
And just because they could be worse also doesn't make them honest. When Ivy Bridge comes out, we'll see if you still think they're honest.
So much for AMD's new advert they were playing at IPL. 500W on the CPU alone while overclocked at load? Are we getting a nuclear reactor to power the whole rig for that price tag?
I would compare the new Bulldozer to the Hummer H1: big and powerful, but it uses too much gas for everyday use.
I'm more annoyed by the lack of overclocking on most of their chips now. The days of grabbing a $100 chip and overclocking the pants off it are over until AMD's Bulldozer beta test is done and they release the real deal.
You can thank AMD for slacking off during the Athlon 64 days. Since we haven't had a competitive market for years, there is no reason why Intel would give you free performance.
You can also thank Intel for their illegal deals :3
Not so much slacking off; more like Intel throwing enough money at the Pentium 3 architecture to fund a nuclear program for a rogue state.
AMD has been brute-forcing its way through the market for a long time now. Even with GPUs, they spend so much of their R&D resources on the FPU, ALU, and pipeline configuration while largely neglecting memory management. They're forced to put so much power through the execution units to make up for such HUGE memory shortfalls that there is very little room left to overclock. They're essentially selling pre-overclocked chips in order to remain competitive.
Overall, Bulldozer is an absolute disappointment in the desktop market. It just doesn't have the per-core, per-cycle performance needed to compete. Being worse than a Phenom II is just not acceptable. It would have been better to shrink the K10.5 core down to 32nm and add 2 more cores; it would be a lot more powerful and probably less design work, and an 8-core 32nm Phenom II would perform better in single-threaded and threaded tasks alike.
On the bright side, Windows 8 will most likely increase performance by a few percent, and once GlobalFoundries sorts out the silicon the CPU will be able to clock a lot higher. Bulldozer is a decent server processor; the 16-core Interlagos might gain back some share in that department, which might put more funding into the next generation of Bulldozer. The architecture is also extremely scalable in terms of cores and clocks, kind of like the P4 in that regard. If they can get a 10% increase in per-cycle performance in Enhanced Bulldozer and get the clocks up, they might be able to compete with Ivy Bridge in the mid-range market.
I still have some hope for AMD, but as of right now there's no reason not to buy a 2500K if you have the money as a gamer. Eventually software will become more threaded, but right now 4 cores / 4 threads is all any gamer needs. AMD just looked too far ahead this time and didn't deliver for the now. Hopefully the FX IIs, if they come out, will be to the FX what the Phenom IIs were to the Phenoms. An Intel monopoly will mean bad news for the consumer.
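The "4 cores is all a gamer needs" point is basically Amdahl's law: once a workload has a significant serial fraction, extra cores give rapidly diminishing returns. A quick sketch (the 60% parallel fraction below is an arbitrary illustration, not a measured figure for any game):

```python
# Amdahl's law: speedup = 1 / (serial + parallel/cores).
# Illustrates why extra cores stop helping once a workload
# has a significant serial fraction.
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Example: a workload where only 60% of the work parallelises.
for n in (2, 4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.6, n):.2f}x")
```

With only 60% of the work parallelisable, 16 cores barely beat 4, which is why a strong quad like the 2500K wins over a weaker 8-core on most 2011-era games.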
Very nice analysis. I think what you said about AMD looking too far ahead is absolutely right, however it might pay off in the future. Here's to hoping!
I went with Phenom x6 because it was on sale at the local fry's electronics a year ago. I'm kind of regretting it now though, intel is destroying AMD at nearly every price point.
On October 13 2011 05:02 Drowsy wrote: I went with Phenom x6 because it was on sale at the local fry's electronics a year ago. I'm kind of regretting it now though, intel is destroying AMD at nearly every price point.
yes same. I bought the phenom x6 4 months ago and now i wish i got intel. so sad t.t
???
Do people post not knowing what they are talking about???
Intel is clearly winning the high-end desktop market... but that's about it.
AMD's Llano and Brazos family CPUs/APUs are doing very well in the low-to-mid markets. And while Bulldozer is bleh for desktops, it should help close the server-side market share gap.
And keep in mind... the high-end CPU market is the smallest CPU market segment.
EDIT: Yes, Intel is clearly winning overall... but AMD being destroyed in almost every segment?... C'mon....
And what kind of segment do you think the person you quoted fit into?
Wow.
He said "every price point"... Not "my price point".... I'm talking about what was said, not who said it.
Intel does have like 80% of the market to AMD's 10% or so.
Low-to-mid is what? I don't know much about laptops, but on the desktop, Sandy Bridge processors destroy AMD in the $50, $65-80, $125, and $175 brackets as well, unless you for some reason need a ton of cores without much single-thread performance.
Why would you be disappointed now? This doesn't change anything about your purchase. If your purchase was good then it is fine now....
Relatively honest stands. GGPLAY.
Relatively honest may stand, but go back to your original statement that you took offense at me laughing at.
I assume your definition of high end desktop market includes 100% of desktops that use only desktop hardware?
Because at the low end, where people need at most a dual, SB Pentium wins over anything AMD can offer at the same price. At the mid-range, SB i5 shreds... everything. And the high end you already conceded. If you mean relative, say relative.
The G620 wrecks their listed "top price/performance" pick, and the direct price-point competition is the G840 or G850, I believe? Especially since they're calling it a gaming CPU...
Sigh... I really wanted this to be good, at least close to Sandy Bridge, and I am very disappointed. I'm also not very surprised, seeing how even some of Intel's lowest-end processors (Pentium and i3) were better than AMD's now last-gen processors (Phenom II). The improvement over AMD's last generation isn't very impressive either: according to AnandTech Bench ( http://www.anandtech.com/bench/Product/434?vs=203 ), the highest-end Bulldozer only gets 2 fps more than the highest-end Phenom II hex-core in StarCraft 2, and is actually slightly slower (though within the margin of error) in some games (Dirt 3 and Crysis). I'm very unimpressed.
The Athlon II X3 is still the king below $100, and the cheap Phenoms can still be interesting, especially if you plan to OC.
I'd love to see the criteria on that.
The criteria is that you can OC.
You know, the same criteria the Intel fanboys always used before SB <3
Anyway, the conclusion of Anandtech is basically what I'm saying.
The decision tilts in AMD's favor if you start comparing to the Athlon II X3. In heavily threaded workloads, the Athlon II X3's third core helps put it ahead of the entire SNB Pentium lineup. If you're building a machine to do offline 3D rendering, multithreaded compiling or video transcoding then AMD continues to deliver the best performance per dollar. It's in the lighter, less threaded workloads that the Pentium pulls ahead. If you're building more of a general use system (email, web browsing, typical office applications and even discrete GPU gaming), the Pentium will likely deliver better performance thanks to its ILP advantages. What AMD has offered these past couple of years is an affordable way to get great multithreaded performance for those applications that need it.
Unfortunately the entire Sandy Bridge Pentium lineup is clock locked. Without turbo modes there's no support for overclocking at all. While these new Pentiums would have normally been great for enthusiasts looking to overclock, Intel has ensured that anyone looking to get more performance for free at the low end will have to shop AMD. Unfortunately Intel's advantage in single/lightly threaded performance is big enough that a clock speed advantage alone is generally not enough to make up for it (see G620 vs. Athlon II X2 265 comparison). It's sad that it has come to this. I was hoping we'd see more K-series SKUs at the low end but it seems like those will only be for the enthusiasts at the high end.
The new Pentiums are better in games, I will give you that.
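The threaded-vs-lightly-threaded split in that AnandTech conclusion is basically Amdahl's law. Here's a back-of-envelope sketch; the parallel fractions below are made-up illustrative numbers, not benchmark data:

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# work can run in parallel. Illustrative only; real workloads vary.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A transcoding/rendering-style job that's ~90% parallel: a third core pays off.
print(round(amdahl_speedup(0.90, 3), 2))  # 2.5
# A game-style workload that's only ~30% parallel: extra cores barely matter,
# so single-thread (ILP) strength wins instead.
print(round(amdahl_speedup(0.30, 3), 2))  # 1.25
```

Which is roughly why the Athlon's extra core helps in rendering and transcoding but not in the general-use cases the quote mentions.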
Well, do you know how hot they run and how much lower the thermal limit is in comparison? That means you need an aftermarket cooler, which adds about $30, and then you're looking at a $110 CPU.
Also, overclocking only gets it even with the G840, not ahead, and it'll use over double the power...
Oh, but if you can unlock the last core it has some merit, naturally.
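The "over double the power" part is plausible just from how dynamic CPU power scales: roughly P ∝ C·V²·f, so the voltage bump an overclock needs hurts quadratically (and leakage rises too, so real numbers can be worse). A quick sketch; the +30% clock / +25% voltage figures are hypothetical, not measurements of any specific chip:

```python
# Relative dynamic power after an overclock, from P ~ C * V^2 * f
# (the switched capacitance C cancels out when taking the ratio).
def relative_power(voltage_ratio, clock_ratio):
    return voltage_ratio ** 2 * clock_ratio

# e.g. a +30% clock that needs a +25% voltage bump:
print(round(relative_power(1.25, 1.30), 2))  # 2.03 -- already double stock power
```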
On October 13 2011 05:17 JingleHell wrote: Relatively honest may stand, but go back to your original statement that you took offense at me laughing at.
Well, you still should have taken my post with a grain of salt; I don't see the problem with it. All corporations are "dishonest" to an extent, so we shouldn't grasp at straws like that.
Oh, no need to convince me; I know the reasoning behind the OC argument is kind of dumb, especially for low-end CPUs, but for some people it makes sense. Remember the old Wolfdale and Clarkdale vs. Athlon II X3/X4 and Phenom II X2/X3 debate, when the Intel CPUs had better OC performance.
When I mentioned market segments in terms of low-mid-high, I should have been more clear.
Low-mid desktop CPUs are only a small part of the low-mid segment. I am encompassing everything from Brazos to Bulldozer: small form factor, laptops, desktops... even smartphones (lol, AMD smartphone CPUs, hehe).
In terms of market share, AMD is catching up in low-mid overall with Brazos and Llano, including both the mobile and desktop Brazos/Llano parts.
I was not referencing performance, but actual market-share growth. Apologies for the foggy comments... I can see the confusion.
Too bad it was originally supposed to come out in 2009, derp; 2009 was the first roadmap date for the Bulldozer release. Maybe it would have been cool back then, eh.
This is an old chart (accurate data up to 2010)... However, 2011 so far is even more skewed towards tablets/mini laptops than the 2011 projection below.
So in a few years desktops are projected to drop below 20%, and of that 20%... How much do you think is high end? 10%? 20%?... The high-end desktop CPU market will eventually be very niche, and I am starting to think AMD is banking on that and trying to just go after the fat segments.
Which sucks for us, letting Intel pretty much charge willy-nilly for their high-end stuff.
Aren't those prebuilts, and does it take into account that you can't build your own laptop?
Also I'm fairly certain there are countries in the world that are not the US.
Unfortunately the entire Sandy Bridge Pentium lineup is clock locked. Without turbo modes there's no support for overclocking at all. [...]
I have an overclocked i7-2600k. Explain that.
It's impossible to have a processor that is so-called 'clock-locked'. Even if you can't clock up, you can always clock down. And don't get caught up in the Turbo Boost technology on the SNB chips; that has nothing to do with the ability to overclock.
Oh, and the Athlon II X3 is not better than the SNB processors in heavily threaded workloads, especially not compared to any of the high-end i7s. It has a better performance/price ratio than a lot of the Intels, but it can barely compare in terms of overall performance.
It's not that it's a shit product or anything, but it's an $80 processor, and there's only so much AMD can do with that compared with the $300+ i7s.
I bought the Phenom II X4 955 about 8 months ago for a steal from newegg.ca, and I couldn't be happier with it for my gaming computer.
Sad to see that the newest technology doesn't even stand up to the older stuff.
BTW, a good-quality SSD is the best thing to throw in if you're building a gaming rig. Just install SC2 or whatever game to the SSD and enjoy loading before everyone else (and playing on Extreme graphics, but on a budget).
I have an overclocked i7-2600k. Explain that. [...]
That 2600K is not a Pentium, though...
And all SBs are pretty much multiplier-overclock-only, for two models, neither of which is cheap. You can't adjust the base clock (the FSB/HT/whatever-they-call-it clock) very much on SB without it going screwy on you. That said, all you have to do on SB is buy something that ends in 'K' and turn the multiplier up along with the voltage, instead of monkeying with it for a whole afternoon (but some people like doing that).
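For reference, the multiplier-only point is simple arithmetic: core clock = base clock × multiplier, and on Sandy Bridge the ~100 MHz base clock only moves a few percent before the other buses derived from it go unstable. A sketch with typical (illustrative) values:

```python
# Sandy Bridge core clock = BCLK * multiplier.
def core_clock_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

print(core_clock_mhz(100, 34))  # 3400 -- a stock i7-2600K
print(core_clock_mhz(100, 45))  # 4500 -- K-series OC: just raise the multiplier
print(core_clock_mhz(105, 34))  # 3570 -- about all a BCLK-only OC can add
```

So on a locked (non-K) chip the last line is roughly the ceiling, which is why the K suffix matters so much on this platform.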
On October 13 2011 06:39 Boblion wrote: We will all get ARM devices and cloud computing anyway !
You call them "Intel fanboys", when the ironic thing is that they will always go for the better price/performance brand, and you're the actual fanboy for defending that AMD processor for a low-end gaming rig.
Oh and the Athlon II X3 is not better than the snb processors in heavily threaded workloads. Especially not compared to any of the high-end i7's. [...]
Wow, thanks for the input, dude. I didn't know the 2600K was better than an Athlon II X3. Great advice!
Well, I guess Anandtech and Tom's Hardware are fanboys too... Edit: actually, you just don't know how to read, lol.
Just to be clear, I called fanboys the guys who said in the past that a Wolfdale > Athlon II / Phenom II because of the OC. I didn't call people in this thread fanboys.
You originally stated that the Athlon II X3 was the best processor sub-$100, then linked to a Tom's Hardware article on the "best gaming CPU"... Anandtech only implied it is a good value for multi-threaded work, not gaming.
The article you quoted from Anandtech even stated that even though the Athlons are overclockable, the clock-speed advantage is not enough to make up for the performance difference between the architectures...
And yes, we all know Tom's Hardware is biased. This is why you get no credibility when you link to that site.
Second-generation Phenoms and Athlons were good value back then because the performance gap between them and Intel wasn't as huge as it is now, and they were sub-$150 processors while the Wolfdales (E8) and Yorkfields (Q9) were all above $150.
On October 12 2011 18:37 Carnac wrote: I skimmed through a couple of tests and it looks pretty bad. [...]
Exactly my thoughts. The *acs have similar beliefs.
I've had both: a dual-core Intel, a Q6600, and now a Phenom II 955, which I got before Sandy Bridge was released. The processor is fine for me, didn't cost that much compared to an i5/i7, and I thought "oh, I'll be supporting Bulldozer, blah blah blah." Well, this sucks :/ With Bulldozer doing so poorly, we're getting closer to a true monopoly in processors.
On October 12 2011 19:52 rajssten wrote: ehh so disappointed...what should I be upgrading to nowadays? I always had AMD cpu before, I'm big fan of this company and its policy...
I guess I will be stickin with SouthBridge for next 2-3 years then :|
If you're thinking of upgrading soon, then obviously the i7-2600K or the i5-2500K (depending on your budget) is the way to go; the pricing is really, really reasonable too.
I was surprised how well the i7-2600K performed compared to the 990X, looking at the difference in prices.
I'm still running a Core 2 Quad 6800 (old, I know) and I will be upgrading to a 2600K in a few weeks; in terms of price and performance it's the best you're going to get.
Also, I'm probably going to get an HD 6950 unless someone convinces me otherwise; looking at the benchmarks (and the price), it seems to be the best option for my price range. I'm running a GTX 260 (Core 216) and it's starting to feel outdated (even though most things still run maxed, having no DX11 is starting to suck).
At that price point the 6950 and GTX 560 Ti are very competitive; the 560 Ti overclocks better, but the 6950 tends to be a bit better at stock frequencies. It depends largely on the game and which company you favor.
Meh, advertising via cherry-picking may be rather intentionally deceptive, but then, that's why we look for independent reviews. I'm less concerned with AMD doing it than with "review sites" and "experts" starting to massage numbers, ignore findings, and generally use scummy methods to make BD look better than it is. Advertising hype, by its nature, from any company tends to piss me off.
Is it just me, or will Bulldozer be better for server machines, since technically it's going to be better for highly multi-threaded applications rather than games and the like? Either way, it's quite disappointing, especially for those people who waited for Bulldozer to come out instead of buying the Phenom II X6; they could have bought and been using the X6 for like a year now.
On October 13 2011 08:55 rebuffering wrote: is it just me or will the bulldozer will better for server machines [...]
The server processors on the BD architecture should be excellent for servers.
But then, there's a reason people don't complain about Xeons underperforming in consumer applications. Namely: they don't get marketed for them.
Haha, yeah, that video is pretty bad; I kept seeing advertisements for it at IPL 3. I hadn't looked at any preliminary benchmarks yet or anything, so I thought the FX might actually be worth looking into.
Hmm, so the i7-2700K gets released in around 12 days; worth waiting for? The price difference looks to be extremely minimal (like $20-30, maybe less), and there doesn't seem to be too much of a difference: a slightly better clock speed and Turbo Boost is what I notice from the chart on the wiki.
Edit: actually, waiting another two weeks on top of that for the 3820 might be nice. Considering that I don't know how much overclocking I would actually do (the 3820 isn't unlocked), having PCIe 3.0, a bigger cache (10 MB), and support for DDR3-1600 might be nice. Also, it will be using the new socket LGA 2011, so upgrading from that eventually would be a lot more feasible, not to mention it's slightly cheaper. I guess the question is whether I am going to overclock or not. I've bought computers/processors/motherboards with the intention of overclocking before and not gone through with it (or have), so I guess that's sort of up in the air for me ><
PCI-E 3.0 is next to useless in a non multi-GPU configuration. Quad channel memory is next to useless for the majority of all consumers. The extra cache over LGA1155 is going to be negligible.
Lol, I'm not sure what makes you think an LGA2011 i7 will be less expensive than LGA1155 or Bulldozer... The Core i7-3820 is on the enthusiast LGA2011 socket that is replacing the existing enthusiast LGA1366. It's going to be priced the same as the Core i7-2600K or slightly higher, and LGA2011 motherboards will be much more expensive than LGA1155 motherboards, considering all the extra stuff you get with X79.
To get an enthusiast platform and not overclock is a gigantic waste of money.
On October 13 2011 09:31 skyR wrote: PCI-E 3.0 is next to useless in a non multi-GPU configuration. Quad channel memory is next to useless for the majority of all consumers. The extra cache over LGA1155 is going to be negligible.
Lol I'm not sure what makes you think a LGA 2011 i7 will be less expensive than a LGA1155 or Bulldozer .... Core i7 3820 is on the enthusiast LGA2011 socket that is replacing the existing enthusiast LGA1366. It's going to be priced the same as the core i7 2600k or slightly higher and LGA2011 motherboards will be much more expensive than LGA1155 motherboards considering all the crap you get with X79.
To get an enthusiast platform and not overclock is a gigantic waste of money.
Yeah, to expand on this, 1366 systems cost more than better 1155 systems, and that's now. Some of the prices have gone down considerably. You're saving over $100 on RAM alone. And very few consumers have a real use for it besides tinkering and benching for shits n gigs.
On October 13 2011 09:33 Childplay wrote: man why did amd spend all the time and money developing a chip thats slower than the one they already have in the most important part... gaming....
Perhaps they didn't intend to release it for gaming? There are other things to do with computers you know... And servers are pretty important too.
On October 12 2011 18:37 Carnac wrote: I skimmed through a couple of tests and it looks pretty bad.
I have owned Intels and AMDs, because I always simply buy what's the most bang for the amount of bucks I'm willing to spend (and never understood Intel/AMD/nVIDIA/ATI or any corporate fandom). Currently that's a Phenom II 955, which I bought before Sandy Bridge's release. AMD falling further behind is really bad for the consumer, even if you don't intend on ever buying an AMD cpu.
Not looking bright for the future and when you compare Intel's with AMD's budget it's not likely to get better any time soon. That's not even including Intel's business practices.
From my experience, and my friends', ATI cards honestly fail and fuck up more often than Nvidia's. And from what I've seen of AMD for the last X years, they have been making pretty crappy stuff compared to similarly priced Intel counterparts. That doesn't mean I'm a totally biased fanboy though. They've just yet to impress.
On October 13 2011 08:55 rebuffering wrote: is it just me or wouldn't the Bulldozer be better for server machines, since technically it's going to be better for highly multi-threaded applications rather than games and the like? Either way, it's quite disappointing, especially to those people who waited for Bulldozer to come out instead of buying the Phenom II X6; they could have bought and been using the Ph II X6 for like a year now.
The server processors on BD architecture should be excellent for servers.
But then, there's a reason people don't complain about Xeon underperforming in consumer applications. Namely: It doesn't get marketed for them.
Cray is using the Unholy Yellow-Brown RGB Union of Bulldozer and Tesla for its super computers IIRC.
On October 13 2011 09:33 Childplay wrote: man why did amd spend all the time and money developing a chip thats slower than the one they already have in the most important part... gaming....
Perhaps they didn't intend to release it for gaming? There are other things to do with computers you know... And servers are pretty important too.
You don't advertise for a market if you aren't willing to get benched in that market. Common sense. I don't see Quadro getting advertised for gaming.
Well, I think they incorrectly projected the CPU market when they started designing it all that time ago (4 years, I think). They thought parallelism would be the way to go because we computer scientists would write software to take advantage of it for everything. Well, 4 years later, computer scientists still don't quite know how best to do that in every application, and processing power needs have leveled out somewhat, so there's really not that much reason for that much parallelism in everyday stuff...
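The ceiling being described is commonly stated as Amdahl's law: if a fraction p of a program's work can be parallelized, the speedup on n cores is at most 1/((1-p) + p/n). A quick illustration with made-up numbers:

```python
def amdahl_speedup(p, n):
    """Upper bound on speedup when a fraction p of the work
    parallelizes perfectly across n cores (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a well-threaded program (90% parallel) stops scaling quickly:
for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
```

With 90% of the work parallel, 16 cores buy only about a 6.4x speedup, which is roughly the wall the post is describing.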
On October 12 2011 19:52 rajssten wrote: ehh so disappointed...what should I be upgrading to nowadays? I always had AMD cpu before, I'm big fan of this company and its policy...
I guess I will be stickin with SouthBridge for next 2-3 years then :|
If you're thinking of upgrading soon, then obviously the i7 2600K and the i5 2500K (depending on your budget) are the way to go; the pricing is really, really reasonable too.
I was surprised how well the i7 2600K performed compared to the 990X, looking at the difference in prices.
I'm still running a Core 2 Quad 6800 (old, I know) and I will be upgrading to a 2600K in a few weeks; in terms of price and performance it's the best you're gonna get.
I don't really care much what the reason is, I'm just tired of hearing the "intended for servers" excuse being offered for a CPU that received so much bullshit hype as a gaming CPU. Or was I the only one here who watched IPL?
Haha I think we all watched the IPL - otherwise these guys are on the wrong forum :p
Well, anyone who watched IPL shouldn't be acting like AMD didn't hype it as a gaming processor.
If they could have kept their marketing in check, they'd be getting a lot less hate.
They cannot recover from such a terrinle release. I own an AMD cpu myself and this is definitely cobfirmation I have to switch to intel for my new rig.
On October 13 2011 11:42 AxelTVx wrote: They cannot recover from such a terrinle release. I own an AMD cpu myself and this is definitely cobfirmation I have to switch to intel for my new rig.
By any chance do you live with a prankster? If so, they swapped your B and N keycaps.
You realize you linked to a year-old article? But good luck with that, considering most are hardware-level locked.
It still works.
Only if you get a card where they didn't physically deactivate the unused part of the GPU. And if there's no BIOS selector switch (they don't make them with that anymore), you have much better odds of ending up with a rather expensive paperweight.
Hope AMD can get it together. This 2600K is my first Intel cpu, the Phenoms were just not competitive enough. Bulldozer flopping only means bad things for the consumer, and for Intel too if AMD goes under and the Government starts riding them.
What's IPL? I'm being serious here. Some SC2 tournament?
You know, even though BD is relatively good at highly-threaded integer workloads, like for server use, on second glance, the die size is kind of disappointing for the performance you get.
BD 4 module FX-8150 is 315mm2, Sandy Bridge 4 core is 216mm2 yet has pretty much the same performance. I don't think that the corresponding Opteron and Xeon sizes will be that much off relatively. (and keep in mind that many of the Xeons don't have integrated graphics, and the 216mm2 includes the integrated graphics)
So the 2 module FX-4100 is another 315mm2 die and is trying to compete in price and performance with Intel's dual core i3-2100 at 131mm2. Seems expensive for AMD.
edit: maybe BD has too much cache? They've got 2MB L2 cache per module and 8MB L3 cache in all.
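Just to put the area argument in ratio form, using the die sizes quoted above and taking the stated "pretty much the same performance" at face value (illustrative arithmetic only):

```python
# Die sizes quoted in the post, in mm^2.
bd_die = 315   # FX-8150 (4-module Bulldozer)
sb_die = 216   # 4-core Sandy Bridge, including the integrated graphics
i3_die = 131   # dual-core i3-2100

# For similar performance, BD spends ~1.46x the silicon of Sandy Bridge:
print(round(bd_die / sb_die, 2))

# And the FX-4100 reportedly ships the same 315 mm^2 die against the i3:
print(round(bd_die / i3_die, 2))
```

Silicon area is a rough proxy for manufacturing cost per chip, which is why the ~1.46x and ~2.4x ratios matter for AMD's margins even where performance is competitive.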
On October 13 2011 22:50 Myrmidon wrote: You know, even though BD is relatively good at highly-threaded integer workloads, like for server use, on second glance, the die size is kind of disappointing for the performance you get.
BD 4 module FX-8150 is 315mm2, Sandy Bridge 4 core is 216mm2 yet has pretty much the same performance. I don't think that the corresponding Opteron and Xeon sizes will be that much off relatively. (and keep in mind that many of the Xeons don't have integrated graphics, and the 216mm2 includes the integrated graphics)
So the 2 module FX-4100 is another 315mm2 die and is trying to compete in price and performance with Intel's dual core i3-2100 at 131mm2. Seems expensive for AMD.
edit: maybe BD has too much cache? They've got 2MB L2 cache per module and 8MB L3 cache in all.
Pretty much the only thing that had me going was the 4CU/4C argument, but after seeing Hardware.fr's numbers for even that, it doesn't look good. Anyway, nobody should be considering a 1st gen BD if they're serious about value. Maybe things are different in the 2nd iteration, but AMD has got a lot of work to do
On October 13 2011 23:16 ilovelings wrote: There is no reason not to buy a 2500k now. On the other hand, software has shitty multi-core/thread management.
No it doesn't. Pretty much all CPU-heavy software uses multiple threads today, even games and OSes. Most games benefit more from physical cores than from logical ones, but they still use more than one core.
On October 13 2011 23:34 Nizaris wrote: ^don't bother with the i7 unless money isn't an issue and/or you do video encoding (streaming). i5 does everything else just as good.
So what you're saying is that if the cache size was reduced, the overall latency and performance ratio to mm would go up?
edit: Question: Does more cache mean more latency by default, or is that just better manufacturing and design by intel?
Well the comment about too much cache was just about die area. The overall design should have a much stronger impact on latency than how much cache there is.
I'm not sure how the cache is connected on Bulldozer, but Sandy Bridge uses a ring architecture to connect the cores, cache, IGP, and system agent.
A ring is a well-studied network topology used still in some networks today (but not like Ethernet, Wi-Fi, cell networks): http://en.wikipedia.org/wiki/Ring_network
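For intuition on the ring: each stop needs only two links, and a message between stops i and j of N travels min(|i-j|, N-|i-j|) hops, since it can go either direction around the ring. A toy illustration (not modeling Sandy Bridge's actual ring):

```python
def ring_hops(i, j, n):
    """Shortest hop count between nodes i and j on an n-node
    bidirectional ring (messages can travel either way around)."""
    d = abs(i - j) % n
    return min(d, n - d)

# On an 8-stop ring, the farthest any stop can be is 4 hops:
worst = max(ring_hops(0, j, 8) for j in range(8))
print(worst)  # 4
```

That worst case of N/2 hops is the trade-off: a ring is cheap to wire and easy to extend with more stops, but average latency grows with the number of stops, unlike a full crossbar.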
On October 14 2011 00:34 Integra wrote: 6 years of work for this? wow. Certainly not something a gamer should acquire.
Yeah. I'd maybe consider the 8120 if I wanted a machine to run a lot of dedi's off of, but that's about it. Seems like with CPU affinities, that would work ok.
This feels like the second coming of Pentium 4 where the architecture might be good if it can scale to lots of cores but will instead fail horribly because it will run into a thermal wall.
The power issue seems to be due to BD's ridiculously high turbo speeds, GlobalFoundries' 32nm process, and AMD clocking them so high to make up for the IPC decrease. Under normal idle conditions it's about 100 W from the wall, which is +10 W over an SB; idle while overclocked is the same or lower. It's only when all 8 cores are under 100% load that the power consumption goes insane, because it hits a wall and it becomes exponentially harder to get more processing power.
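The "hits a wall" behavior follows from the standard dynamic power approximation, P ≈ C·V²·f: higher clocks typically need higher voltage, so power grows much faster than frequency. A sketch with made-up C, V, and f values (not measured Bulldozer data):

```python
def dynamic_power(c_eff, volts, freq_ghz):
    """Approximate dynamic CPU power: P = C_eff * V^2 * f."""
    return c_eff * volts**2 * freq_ghz

base = dynamic_power(10.0, 1.20, 3.6)  # hypothetical stock settings
oc   = dynamic_power(10.0, 1.45, 4.6)  # higher clock needs more voltage

# Roughly 1.87x the power for only ~1.28x the frequency:
print(round(oc / base, 2), round(4.6 / 3.6, 2))
```

Because voltage enters squared, every extra few hundred MHz bought with more voltage costs disproportionately more watts, which matches the full-load numbers people are seeing.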
What's IPL? I'm being serious here. Some SC2 tournament?
And you call yourself an archon... *shakes head*
Yes, it's some sc2 tournament.
Been here long before SC2...
But let's get it back on track:
So, after this news has been out a couple of days with heated discussion going on here, would someone be so kind as to sum things up nicely? Was this AMD release a failure? Is BD a viable purchase, and in what cases? Are there any innovations in it people should know about? Or is it just an obsolete piece of hardware released way past when it should have been?
Edit: You know, simple explanation for people whose brain shuts off incoming audio/video after hearing/seeing phrases like "memory latency".
Sad panda, man. I'm quite loyal to the Intel market, but I always want someone to keep Intel honest. Now that AMD is on the ropes, so to speak, prices are going to be pretty rough on the Intel side.
Guess they should change that commercial jingle from "bum-bu-bum-bum bah" to "cha-chu-cha-ccha-ching!"
Bulldozer is not a viable purchase for gamers and the majority of consumers. For gaming and lightly threaded software, it is greatly outperformed by Intel's Sandy Bridge, and even AMD's older second-generation Phenom can rival it.
Bulldozer is only good at heavily threaded software. But imo, if you were serious about your work, you would invest the money into Intel's upcoming Core i7 3930K or its existing Core i7 2600K.
I don't see AMD succeeding with this architecture in the consumer market even if new iterations are due out every year. Intel's Ivy Bridge is likely to wipe the floor with the second iteration of Bulldozer, both of which are due in 2012.
I don't get it when people say that something isn't viable when it is perfectly capable of doing the job. That's like saying an i3 isn't viable because it's not an i5.
I am so bitterly disappointed by AMD's performance these past couple of years. I remember when I built my first computer and used a socket 939 3500+ from AMD; ah, the good old days. Ever since then, though, I've had no choice but to choose Intel CPUs time and time again, and it doesn't look like things are going to change.
On October 14 2011 02:32 Antisocialmunky wrote: I don't get it when people say that something isn't viable when it is perfectly capable of doing so. That's like saying that an i3 isn't viable because its not an i5.
I simply used the word because the question was phrased that way. But it's pretty common sense... The Intel equivalents of Bulldozer, the Core i5 2400 and Core i5 2500K, far outperform it and have an upgrade path to Ivy Bridge. Even AMD's older second-generation Phenoms rival it, and they're less expensive. There is absolutely no reason for the majority, if not all, consumers to be purchasing Bulldozer.
But hey, if you think buying a vastly inferior product is viable, then you're welcome to.
There are some bugs with the resource sharing in those "modules" of the BD. It gets higher IPC (sometimes even +35%) if you run it as 4 modules/4 cores (one core disabled in each) vs 2 modules/4 cores. It also seems to work better with Windows 8 than Windows 7.
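For anyone wanting to experiment with the one-thread-per-module idea without disabling cores in the BIOS: on Linux you can approximate it from userspace with CPU affinity. A sketch, assuming logical CPUs (0,1), (2,3), ... pair up into modules; that layout is a guess about the topology, not verified:

```python
import os

def one_core_per_module(n_cpus=8):
    """Pick the first logical CPU of each 2-CPU Bulldozer module,
    assuming CPUs (0,1), (2,3), ... share a module (hypothetical layout)."""
    return set(range(0, n_cpus, 2))

def pin_process(pid=0, n_cpus=8):
    # Linux-only (Python 3.3+): restrict the scheduler to one CPU
    # per module so threads never share a module's front end or FPU.
    os.sched_setaffinity(pid, one_core_per_module(n_cpus))
    return sorted(os.sched_getaffinity(pid))

print(one_core_per_module(8))  # {0, 2, 4, 6}
```

Calling `pin_process()` (pid 0 means the current process) would then keep everything on one core per module, which is roughly what the Windows 8 scheduler patches try to do automatically.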
Full load i7 at 3.4 GHz = 155 W; full load FX-8150 at 3.6 GHz = 223 W.
Obviously they suck for gaming, but from a work-related point of view they are not that bad:
Many pro applications are optimized for multiple cores and threads, and this CPU really shines there, beating an i5 2500K and sometimes the i7 2600K. They are priced about the same, around €270.
Since the Intel platform is more expensive than an AM3+ one, you can build yourself a cheaper and more powerful workstation than an Intel-based one. Add the potential overclocking gain and it's worth it. I can imagine some people who want to work at home, or some businesses, being interested in this architecture.
Since the Intel platform is more expensive than an AM3+ one, you can build yourself a cheaper and more powerful workstation than an Intel-based one. Add the potential overclocking gain and it's worth it.
Unfortunately not true. The cheapest 9xx AM3+ motherboard I can find on newegg, for example, is $100. Not even sure if it can OC. You can easily get $100 OC'ing 1155 motherboards, not to mention $60 non-OC motherboards.
You also need to spend more on the PSU, especially if OC'ing. An OC'd FX8150 consumes almost 3 times more than an OC'd 2600k.
The cheapest 9xx AM3+ motherboard I can find on newegg, for example, is $100. Not even sure if it can OC. You can easily get $100 OC'ing 1155 motherboards, not to mention $60 non-OC motherboards.
Search more; there are other motherboards with different chipsets that are even more affordable (in the €70 range, like the ASUS M5A87) and also do overclocking. Asus made some of good quality (I can tell, having one). Is there a particular reason you chose the 9xx chipset?
You also need to spend more on the PSU, especially if OC'ing. An OC'd FX8150 consumes almost 3 times more than an OC'd 2600k.
That absolutely does not mean a 3-times-more-expensive PSU. Also, at what clock speed are you OCing? I'm not sure it consumes 3 times more until you reach a very big overclock... BTW, it doesn't really make sense to compare the power requirements of an 8-core with a 4-core, even more so with a completely different architecture. To be fair it should be compared to the i7 960x.
On October 14 2011 06:20 renkin wrote: BTW it doesn't really make sense to compare the power requirements of an 8-core with a 4-core, even more so with a completely different architecture. To be fair it should be compared to the i7 960x.
Cores don't really matter, neither does clockspeed.
The 2600k is a competitor when it comes to price and performance(in multithreaded apps) and thus Bulldozer has to be compared to it.
Bulldozer doesn't even have real 8 cores :p
Would still love to see a detailed power-draw comparison at different loads though, not that "we used Prime or Linpack" shit. Has any site done this so far? HT4U has at least measured CPU+PWM alone to get mainboards out of the equation, but that doesn't satisfy me.
AMD has released a statement about AMD FX and the sub-par performance currently seen.
This week we launched the highly anticipated AMD FX series of desktop processors. Based on initial technical reviews, there are some in our community who feel the product performance did not meet their expectations of the AMD FX and the “Bulldozer” architecture. Over the past two days we’ve been listening to you and wanted to help you make sense of the new processors. As you begin to play with the AMD FX CPU processor, I foresee a few things will register: In our design considerations, AMD focused on applications and environments that we believe our customers use – and which we expect them to use in the future. The architecture focuses on high-frequency and resource sharing to achieve optimal throughput and speed in next generation applications and high-resolution gaming.
An excerpt.
And since a decent number of you might have read bits of me thrashing them, if you want to see what I had to say after seeing this, here it is in the interest of fairness.
Yes. Basically AMD has given up trying to compete directly with Intel in the current market. In order to catch up, AMD is trying to "head em off at the pass"
Oh? I thought it was more like "grasping at straws".
Like I said in my blog though, if they'd price them against a different part of the performance curve, they'd sell enough to make up for the reduced price. If an 8150 rig was priced around an i5 2300 or 2400 rig, they'd be damned competitive for game streaming, and college students in video or coding type stuff where they might want to leave cores running on something and do some light gaming in the evening.
It uses too much power for a workstation (plus sub-par performance compared to the competitor), and if they ever made a laptop version of it... it'd drain too much power...
I've never heard of a CPU where "can my PSU handle it" would ever be asked, until now perhaps... 229 watts at full load, alone...
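For anyone actually asking "can my PSU handle it", here's roughly how the headroom math goes. Only the 229 W CPU figure comes from the reviews; the GPU and rest-of-system wattages below are illustrative assumptions, so treat this as a sketch, not a sizing guide:

```python
# Rough PSU sizing sketch. The 229 W CPU load is the full-load figure
# quoted above; the GPU and rest-of-system wattages are illustrative
# assumptions, not measurements.
def psu_estimate(cpu_w, gpu_w, rest_w, headroom=1.3):
    """Return (total load, suggested PSU rating with ~30% headroom)."""
    load = cpu_w + gpu_w + rest_w
    return load, load * headroom

load, suggested = psu_estimate(cpu_w=229, gpu_w=250, rest_w=75)
print(load)              # 554 W total system load
print(round(suggested))  # 720 W suggested PSU rating
```

Point being: with a CPU alone pulling 229 W, a mid-range PSU that would comfortably run a Sandy Bridge rig starts to look marginal.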
On October 14 2011 10:33 JingleHell wrote: Oh? I thought it was more like "grasping at straws".
Like I said in my blog though, if they'd price them against a different part of the performance curve, they'd sell enough to make up for the reduced price. If an 8150 rig was priced around an i5 2300 or 2400 rig, they'd be damned competitive for game streaming, and college students in video or coding type stuff where they might want to leave cores running on something and do some light gaming in the evening.
I was more referring to the parts where Bulldozer benches significantly better in Windows 8 developer preview.
According to AMD, Windows 7 throws threads around "willy nilly" where win8 has proper thread scheduling.
If you can believe that.
Well, Vista is the OS I'm used to thinking of as being rather spastic.
Just remember, if the thread scheduling gets them 1-2% better performance, they're not lying. Wait for proper benches, IMO.
Not saying it isn't possible, but they've demonstrated enough ability to deceive via omission with this release that nobody in their right mind should take anything they say at any more than minimal face value.
I believe that. What's confusing is that this question was asked ages ago, "how will Windows 7 recognize modules?", and AMD responded "Microsoft will deal with it before launch". It seems they didn't, because it does stupidly better in Windows 8 than it does in Windows 7.
It's not a fantastic processor, since it doesn't quite handle the deficiency of all AMD processors (that is, weak single-threaded performance), but it's not that terrible. Still no point buying one over an i3 2100 or something, because holy shit, that power draw when Turbo Coring is unacceptable.
There were a few cases where Bulldozer did like 15 percent better on Win8; I can't recall which right now.
I think if Piledriver comes out and it's 15 percent better, and then Win8 comes out making the entire lineup 15 percent better as well? I think we just might have some competition.
Not right now though. I think they are crazy pricing it the way they are. Once the chip stops selling out I expect a pretty big price drop.
I'm disappointed by Bulldozer as it is currently, but let's not throw the architecture out the window and assume that it's a complete failure. The fact that standard clock speeds are in the 3.7-4 GHz range is insane.
And it does completely obliterate the i7-2600k in x264 encoding; that's proof of its potential. I'm hoping that the reason it's so bad in single-thread performance is because of scheduling issues. Only time will tell.
I don't believe that the full potential of Bulldozer will be seen until Windows 8, unless Microsoft releases a fix (which they have done in the past -- mind you, it was for Windows 95). I'd say as it stands right now Bulldozer isn't worth the purchase, but in due time I believe that it will live up to its hype and shine like it was supposed to.
On October 14 2011 11:33 PunkyBrewster wrote: I'm disappointed by Bulldozer as it is currently, but lets not throw the architecture out the window [...] I'm hoping that the reason it's so bad in single thread performance is because of scheduling issues. [...]
Signed up to post this shit? What's AMD's PR department like, anyway? Did your Advertising staff get canned?
How is the poor single thread performance because of scheduling issues? It's not hard to figure out where to schedule 1 thread (hint: any core will do).
It's probably some games or other workloads where resource sharing could be beneficial, where BD module-aware scheduling will help out. Also, smarter scheduling for some tasks may involve grouping tasks on the same modules so the remaining modules can be parked, allowing for higher Turbo Core frequency boosts.
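To make that grouping-and-parking idea concrete, here's a toy sketch of the two placement strategies on a 4-module/8-core part. This is purely illustrative; it is not the actual Windows 7 or Windows 8 scheduler logic, just the difference being described:

```python
# Toy model of thread placement on a Bulldozer-style CPU (4 modules x
# 2 cores). Hypothetical sketch of the scheduling difference discussed
# above, not real scheduler code.
def place_threads(n_threads, n_modules=4, module_aware=True):
    """Return per-module thread counts (n_threads <= 2 * n_modules).

    module_aware=True packs threads onto as few modules as possible,
    so idle modules can be parked for higher Turbo Core boosts.
    module_aware=False spreads threads round-robin, waking every module.
    """
    counts = [0] * n_modules
    for i in range(n_threads):
        if module_aware:
            counts[i // 2] += 1  # fill one module's 2 cores before the next
        else:
            counts[i % n_modules] += 1
    return counts

# 2 threads: packed onto one module vs. spread over two modules
print(place_threads(2, module_aware=True))   # [2, 0, 0, 0] -> 3 modules parked
print(place_threads(2, module_aware=False))  # [1, 1, 0, 0] -> 2 modules parked
```

The trade-off is exactly the one mentioned above: spreading avoids the shared front-end, while packing frees whole modules to park.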
Where in my post did I say that it was worth the purchase? If you don't think scheduling issues are the problem, then why have tests shown a 30% difference in speed between cores 1&3 and cores 2&4? Get past your Intel fanboyism; this processor has some INSANE potential. Or have you forgotten that Intel based their dual-core designs off the Athlon 64? It seems very unlikely that a company would go BACKWARDS, especially a company that has helped drive the processor market for quite a while.
All these reviews use DDR3-1600 when Bulldozer's NATIVE RAM speed is DDR3-1866 (and yes, I know the difference is minuscule most of the time); that would make quite a difference on quite a few of those benchmarks.
EDIT: I agree that it's probably a flaw in the architectural design of the processor, AMD hasn't been known for great single-threaded performance.. But it's a very likely scenario that Bulldozer's issue does lie with scheduling.
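For what the 1600-vs-1866 point is worth, the peak-bandwidth arithmetic is easy to check (standard DDR3 math: transfer rate times 8 bytes per transfer, per channel). Note this is theoretical peak; real-world gains are usually far smaller, as the post above concedes:

```python
# Theoretical peak DDR3 bandwidth: transfers/s x 8 bytes, per channel.
def ddr3_peak_mb_s(mt_per_s, channels=2):
    return mt_per_s * 8 * channels

bw_1600 = ddr3_peak_mb_s(1600)  # 25600 MB/s dual-channel
bw_1866 = ddr3_peak_mb_s(1866)  # 29856 MB/s dual-channel
print(round(bw_1866 / bw_1600 - 1, 2))  # 0.17 -> ~17% more peak bandwidth
```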
On October 14 2011 11:41 Myrmidon wrote: How is the poor single thread performance because of scheduling issues? It's not hard to figure out where to schedule 1 thread (hint: any core will do). [...]
Hey, none of those fancy technical details, we're talking about computers, powered by the same devilry as Tarot cards, Astrology, and Ouija boards!
And TL is famous enough to pop near the top of the Google results, so they sent their advertisers here to make up for the bad hype.
I shouldn't expect logical arguments from people on a gaming website; I thought people posting in this forum would act a bit more rationally. But talking about technical problems with people who are fans of a company because their favorite team is sponsored by them is what I should have expected. Why would you want Bulldozer to fail? Competition is what drives the market. It wasn't so long ago that AMD was king; or have all you fanboys forgotten about the P4?
Intel may own desktop processors, but AMD still has the majority of the server market. I'm not claiming that Bulldozer lived up to its hype, but I'm not going to make my judgment on the first few days of release. The design still has quite a bit of potential.
On October 14 2011 12:01 PunkyBrewster wrote: I shouldn't expect logical arguments from people on a gaming website. [...] But talking about technical problems with people that are fans of a company because their favorite team is sponsored by them is what I should have expected. [...]
This is the most moronic excuse for a "logical" argument I've heard on behalf of AMD yet. Just a few days ago I was repeatedly subjected to your horrendous hype over and over again watching IPL, just like everyone else here. AMD sponsors a ton of Starcraft related events, and if we went based on sponsorship, A: we'd all be broke with shitty PCs, and B, we'd all use AMD.
It looks like this bulldozer couldn't handle pushing the sand...
...off the bridge.
To add some content: I'm really, really disappointed in this. I really want to like AMD, but with the problems I'm having with their video drivers, and them releasing a CPU that, in some tests, is worse than its predecessor... sigh. No one wins.
On October 14 2011 12:01 PunkyBrewster wrote: [...] Intel may own desktop processors, but AMD still has the majority of the server market. [...]
... The only market AMD has the majority of is the bargain bin desktops. Everywhere else AMD either eats up too much power or doesn't offer the same robustness as Intel.
Don't get me wrong. I want AMD to do well in all markets, but not by ignoring their weaknesses compared to Intel. You said yourself that Intel based their dual-core designs off of the Athlon 64 (a claim I don't feel like validating right now), which shows they were willing to address the shortfalls of their design philosophy. AMD continues to believe they don't have memory-architecture (and philosophy) problems, and continues to push their processors to their power boundaries. I don't have respect for that approach.
As bad as AMD is doing in the desktop business they are actually getting eaten up even worse in the server market. I have no idea where that guy got his information from.
I haven't dabbled in it personally, but I'm pretty sure the Intel Xeon Westmere hexacores are imba imba.
On October 14 2011 12:31 JingleHell wrote: Oh, and I like how he says 8150 "completely obliterated 2600k in x264 encoding...
Why are you using the quick benchmark tool which can't account for system settings and versioning when you can instead quote Anand's own head to head...?
That last one would've still been the 2600k if they'd only bumped the multiplier by 2 to match it in frequency, and it'd still use significantly less power.
Glad I got my 2500k. It is interesting, if you are following along on XtremeSystems: someone did a proof of concept showing you can get a 20-30% performance boost by disabling one of the cores in each module to turn it into a much faster quad core, over just disabling 2 whole modules. He hasn't posted any gaming benchmarks yet, but it is definitely interesting, as you can clock higher and you remove the penalty for shared cache...
Hey, with enough tinkering and a hypothetical load balancing/throttling/core toggling software patch, it can beat Thuban! xD
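A back-of-the-envelope way to see why that proof of concept could work. Every number here is an assumption for illustration (the 15% shared-module penalty and both clock figures are made up, not measured values from the XtremeSystems thread):

```python
# Toy model of the "one core per module" experiment: a thread that owns
# its whole module avoids the shared front-end/L2 penalty, and fewer
# active cores leave thermal headroom for a higher overclock.
# The 15% penalty and the clocks are illustrative assumptions.
def per_thread_speed(clock_ghz, shares_module, penalty=0.15):
    """Relative per-thread speed under a flat sharing penalty."""
    return clock_ghz * (1.0 - (penalty if shares_module else 0.0))

both_cores_on = per_thread_speed(4.0, shares_module=True)    # stock-style config
one_per_module = per_thread_speed(4.2, shares_module=False)  # 4 cores, higher OC

print(round(one_per_module / both_cores_on - 1, 2))  # 0.24 -> ~24% faster per thread
```

With plausible numbers the gain lands in the same ballpark as the 20-30% being reported, at the cost of half the threads.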
Apparently, this ex-AMD engineer claims that part of Bulldozer's failure relates to a change in AMD's chip design process. Traditionally, AMD and Intel hand-crafted their chips, but at a certain point (not mentioned) AMD decided to switch to an automated design process. He claims that automation speeds up the design process (and yet Bulldozer was delayed for half a year), while its inefficiencies lead to a 20% bigger die area and 20% less performance. This would explain why a 2-billion-transistor CPU can't beat the older X6 1100T, which has far fewer transistors (~900 million), on most tests. And the relatively large size of the chip contributed to its high cost.
I have a feeling we'll see a lot of articles feeling out who the public will accept as a scapegoat before we hear about someone getting fired. First of many.
I still think it should be the marketing guys who overhyped it, and whoever set standard pricing.
Not saying this is impossible or anything, just looking at how the world tends to work.
Apparently, this ex-AMD engineer claims that part of Bulldozer's failure relates to a change in AMD's chip design process. [...]
Yeah, this was discussed months ago on MR too (where he made a post with that kind of information). I'm lowering engineering expectations for the next 18-24 months, and even then it's probably still too soon to expect any serious deviations from the engineering design decisions AMD made on BD.
Not really a fun time for builders, cuz there's nothing to debate on lol.
On October 15 2011 01:46 mav451 wrote: ...I'm lowering engineering expectations for the next 18-24 months, and even then, it's probably still too soon to expect any serious deviations from the engineering design decisions AMD made on BD
...
Well, we did just get a new CEO this quarter... Maybe something will... is... changing?
Apparently, this ex-AMD engineer claims that part of Bulldozer's failure relates to a change in AMD's chip design process. [...]
I realize I'm late quoting this and I'm the 3rd to do so, but I get the feeling that many don't quite get what's going on here. Actually I'm not so sure myself since I've never worked with VHDL / Verilog / whatever or any transistor design tools or digital logic design beyond the baby stuff.
A processor is logically just an insanely complicated digital logic circuit made up of transistors with various properties, in various arrangements, hooked up and laid out in a certain exact pattern. The key idea is that there are multiple ways to build something to do the exact same thing.
All the functional blocks like adders, registers, cache, etc. are made up of transistors. There should be tools for creating blocks as well as laying out parts of the design and figuring out how to connect them together.
For the best electrical characteristics (which may affect possible clock speeds and power consumption), fewer gates to pass through and thus fewer delays, using the fewest transistors, and so on, it's best to optimize the layout and construction of elements by hand. No automatic tools are perfect.
It's very roughly like writing C code, compiling, and then doing hand optimization of the resulting assembly code to make it a bit faster. And mostly C is just used for computational efficiency to begin with.
Obviously you still have to do a lot of stuff manually and plan things out, even if you're using some automatic tools to help with some of the design. There's just the question of how much skimping is going on, and how much difference that's really making.
I think the comment may be as much a critique of the culture shift as much as of the technical merits of taking such shortcuts.
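A software-flavored illustration of the "multiple ways to build the exact same thing" point above: two gate-level 1-bit full adders with identical truth tables but different gate counts. This is a toy analogy for what hand-optimization buys you, not actual chip-design tooling:

```python
# Two gate-level 1-bit full adders with identical behavior but
# different gate counts -- a toy stand-in for hand-optimized vs.
# mechanically generated logic.

def full_adder_naive(a, b, cin):
    # Straightforward construction: carry from 3 ANDs + 2 ORs,
    # sum from 2 XORs. 7 primitive gate operations total.
    s = (a ^ b) ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def full_adder_optimized(a, b, cin):
    # Share the intermediate a^b term: carry now needs only
    # 2 ANDs + 1 OR. 5 primitive gate operations total.
    p = a ^ b
    s = p ^ cin
    cout = (a & b) | (p & cin)
    return s, cout

# Same truth table either way:
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            assert full_adder_naive(a, b, cin) == full_adder_optimized(a, b, cin)
```

Scale that kind of per-block saving across a whole chip and you get the claimed die-area and performance gap between hand-crafted and auto-generated layouts.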
Apparently it works 17-35% better on Linux compared to its Phenom II X6 predecessor, without overclocking and/or requiring the ridiculous load voltages that come with overclocking...
On October 15 2011 06:59 JingleHell wrote: Maybe that ex-AMD engineer is trying to make the process sound bad because he got fired from a job doing the basic circuit design?
Nah, a chip that uses DOUBLE the number of transistors of the i7 should be able to get somewhat close in performance. But instead it's far, far behind and just uses a shitload more power. Sounds exactly like the type of crap that automated tools generate.
I sincerely hope AMD's new CEO kicks this practice out the door, because if they don't recover quickly, AMD is gonna be on its way out, which is bad for consumers... Intel has already shown its willingness to squeeze competitors out of the market and charge exorbitant prices. Imagine what it will be like if they have a true monopoly?
So apparently there's a new BIOS option on the MSI FM1 boards that can allow you to push the base clock near 170 MHz, which results in 4.0-5.0 GHz on the processor...
It is interesting to note that FM1 Llanos are basically 32nm Phenoms... There's even an Athlon II X4 631, which is a Llano with the graphics core disabled.
Man, that really is disappointing. I remember building this PC with an Athlon II X3 450 (3.2 GHz, 1.4 V, 2 GHz NB) and overclocking/unlocking it to an X4 at 3.415 GHz, 1.472 V, 2.53 GHz NB. I love my CPU. Everyone was hyping the imminent Bulldozer back then (man, they really delayed it), and I was thinking of getting it when it came out a year later.
What about the future bulldozers? Are they going to be good?
Does anyone here have a Bulldozer?
edit: on a side note, wtf is up with those APUs, Dragon, etc.? Why would anyone use one of those?
Intel beats AMD in performance at every price point. Bulldozer is only slightly better in performance than a Phenom II, and in some cases it is worse. They are supposed to be releasing an update to improve performance with Windows 7 and server versions, but by the time that is released Intel will have released Ivy Bridge. Basically, Bulldozer is a pile of shit, and you are better off going Intel if you are looking for the most performance out of your money.
Piledriver will improve on Bulldozer and I guess it's going to be a good option for multi-threaded tasks if you can't afford a native hex or octo Ivybridge.
Llanos are good for low-end gaming notebooks, small form factor PCs, and so on.
I was considering buying the 4200(?) Bulldozer on Amazon; it was £89, what could possibly go wrong?
Yeah, I'm glad I didn't. I'll stick with my X4 630 Propus.
Also, can someone explain how AMD went from awesome with the Athlon X2 64 FX (which I owned for 6 years and was awesome) to basically a bargain-bin processor developer? Don't get me wrong, my 630 Propus is amazing value for money and lets me play games like Skyrim on medium-high, BF3 on high and SWTOR on all high, but sometimes I feel like Intel has invented the wheel and AMD are still trying to figure out how to match it.
Bulldozer doesn't work with like 10 Steam games (they won't start up at all), which is pretty shitty. They also consume way too much power and suck in single-core performance.
In my opinion the best AMD CPU to get atm is the Phenom II X4 955.
Piledriver could be good; also, Llanos are good if you don't want a discrete graphics card, and at least their power consumption is normal (close to Sandy Bridge, not even comparable to Athlons or Phenoms).
In general they run a bit warmer than the Athlons due to the L3 cache, but they aren't particularly hot, and they overclock really well in regards to stability, so they are generally temperature limited. So what you get is a bunch of idiots getting motherboards that shit out. MSI boards, for example, have really shitty VRMs, although in general they are a quality brand with good service (although my MSI GPU is a lemon; they have offered to replace it, but I can't go 2 weeks without a GPU, even if I have to underclock it by 800 MHz).
So I wouldn't say they have a tendency to burn motherboards. Can't really trust what people say about overclocking, since most people don't know what they are doing.