Anyways, Bulldozer is an epic failure: it's slower than even Thuban and Phenom in many cases, it's bad for gaming, and it uses twice the power under load that an Intel rig does.
It is mind-boggling that AMD worked on this processor for over 6 years and it's worse than its predecessor architecture. WHY AMD WHY!!!?? I am stunned and dumbfounded.
Sleepy right now, but I'll be reading this as soon as I wake up. I love this tech stuff so much lol, and I hope it's not as bad as you make it sound lol. Thanks for bringing this to my attention. Funny though, I'm still using a Core 2 Duo from yesteryear, and it's still working great. Anyways, thanks again sir!
I skimmed through a couple of tests and it looks pretty bad.
I have owned Intels and AMDs, because I always simply buy whatever's the most bang for the amount of bucks I'm willing to spend (and have never understood Intel/AMD/nVIDIA/ATI or any corporate fandom). Currently that's a Phenom II 955, which I bought before Sandy Bridge's release. AMD falling further behind is really bad for the consumer, even if you never intend to buy an AMD CPU.
The future isn't looking bright, and when you compare AMD's budget with Intel's, it's not likely to get better any time soon. That's not even counting Intel's business practices.
There were already a bunch of leaked benchmarks before the NDA lifted, and it's not like all of them would have been conspiring against AMD. The Oct 12+ benchmarks only confirmed them. Everything after 2005 seems to have gone downhill for AMD.
It would seem they hope to sell the processors on the "8 core" bit alone. Screw the performance people actually need. Maybe the lower-end processors will be more viable.
I was looking forward to this, but when I found out that my 790 board wouldn't be compatible my interest dropped off a while ago.
Still, it is a little silly that the 8150 loses to the 1100T on most occasions. I've read some of the new architecture's goals, and a relevant point is that this architecture is supposed to be more "scalable" than the last one. My (last) hope is that they can somehow pull a Houdini and scale this thing to 16+ cores without much trouble and shake up the server industry.
I may go Intel for my next upgrade, but I'm still an AMD fanboy on the graphics side until they prove me wrong; they seem to have their act together on GPUs.
Haven't read the articles yet, but if what you say is true then I'm disappointed. I had high hopes for Bulldozer.
edit: finished reading. While BD is quite disappointing, the OP is a little bit extreme. I think it's worth mentioning that in some heavily threaded applications, BD > Sandy Bridge. Unfortunately it's still not enough to make up for its shortcomings.
I generally like AMD products simply because you tend to get more bang for your buck, but meh... I was hoping they wouldn't fail. The Fusion APU thing was pretty cool, so I was kinda expecting AMD to be on a roll, but this sucks.
Boggles the mind that they would release an obviously inferior product. Surely they would be aware of its less-than-stellar performance and simply scrap it rather than put it into production?
Sure, many companies put out a "budget" line of products to appeal to the lower price point of the market, but these products are usually last year's "top of the line" model that has simply been stripped of the fiddly bits to keep cost low.
You can't release a "brand new" product that was intended to be top of the line as a budget product and expect to recoup the development costs.
On October 12 2011 18:52 Playguuu wrote: Well, I used to be an AMD fan before the P4 came out. I wonder how they thought the specs were okay to release the chip.
They didn't, but it took over five years to design, so they had to run with what they had.
Notice it was delayed from Q4 2009 to Q2 2011, and then by another four months in 2011, and in that time there were two respins (B1 -> B2 -> B2G) in an attempt to fix it.
Close to 500 watts when OC'd to ~4.7 GHz, and it still doesn't come close to the 2500K in single-threaded performance. We're so far away from programs that would use this CPU to its full potential, and by that time Ivy Bridge and beyond will be out for Intel. It would've been easier to just die-shrink a Thuban and slap on two more cores.