|
A Joust to the finish.
The CPU market these days is an interesting place: we have three price segments, Low, Mid, and High (duals and quads), and three markets, Budget, Mainstream, and Professional. The tactics that AMD and Intel have been using have proven to be quite interesting.
Not too long ago AMD introduced the Phenom II line of quad core processors, which undercut Intel's Core2Quad line in price and brought similar performance. At that point, however, Intel still remained the performance king with its high-clocking E8000 dual core line, the formidable Q9550 quad, and the untouchable Core i7. The Phenom II line quickly sprouted off in multi-core directions, bringing the Phenom II X2 and X3 into the fray as well as the X4 955 and X4 965. It seemed like, for the first time in a long time, AMD was back on top, at least in the mid-range.
The Core2 line still technically provides better overall performance but fails in two important areas for its mid-range segment: price and gaming. The Phenom II architecture is a heavy hitter when it comes to gaming, giving nearly on-par clock-for-clock performance against the Intel flagship, the i7. So when AMD branched its Phenom II line into dual and triple core forms, it effectively stole the segment from Intel. This, in addition to backwards compatibility and the promise of long-term socket support, gave AMD a large edge and command of the mid-range. Good times for AMD.
The good times, however, would not last. A few months later Intel introduced socket 1156, which would power its 32nm Clarkdales in the future as well as the newly launched Core i5 line. The Core i5 line was a crippling blow to AMD's domination. The cheapest i5, the i5-750, was essentially an i7 without Hyper-Threading and triple channel memory. Those losses matter little in practice, and with Intel pricing the i5-750 at around $200, it cut the legs out from under AMD's mid-range quad domination.
AMD struck back at the low-end with the Athlon II X2 and X4 lines. The Athlon II X2 250 brought a 3GHz dual core to the low-end segment which performed around 8% worse than the Phenom II X2 550 at more than 20% less cost. The Athlon II X4 620 marked the first time a brand new quad core would sell for under $100 while keeping up with, and in some cases besting, Intel's $160+ quads. With the Athlon II X2 250 dominating the low-end, the Phenom II X2 550 BE dominating the high-end duals, the Athlon II X4 620 dominating the low-end quads, and the Phenom II X3 720 BE being the best bang-for-buck processor, AMD seemed to still be in good shape. On top of that, 2010 would see the introduction of AMD's 45nm hexa-cores and 32nm octo-cores.
As good a year as 2010 seems to be for AMD, it will be even better for Intel. Early next year, Intel will be introducing its answer to AMD's remaining mid-range domination: the 32nm Clarkdale. Early reports of Clarkdale told only half the story. Intel would release a 32nm dual core to combat AMD's Phenom II quads? Surely a losing proposition. As the release of Clarkdale grew nearer, revealing rumors began to surface. Clarkdale will have Hyper-Threading. Clarkdale will have Turbo.
Currently, Clarkdale will be separated into three lines: the i5, which will be high end ($170-$280); the i3, which will be mid-range ($120-$140); and the Pentium, which will be low end ($90). The i5 and i3 lines will include HT and Turbo. To make things interesting (and quite frankly, practical), let's assume the Clarkdale architecture will match the Phenom II clock-for-clock and core-for-core in gaming performance and surpass it in everything else. With the slowest mid-range Clarkdale clocking in at 2.93GHz, this would mean that AMD loses the dual core market, which is, in essence, the majority of the gaming market. That's not even taking into account Turbo mode.
But what about those gamers who also multitask heavily? Intel answers that in the form of Hyper-Threading which turns Clarkdale into a virtual quad core. This practically destroys most of the incentive for the average gamer to go with Phenom II. A higher clocking dual core is going to perform better than a lower clocking quad core while consuming less power and producing less heat, and unless you are doing something that requires 100% of one core, Hyper-Threading will provide adequate real world multitasking.
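If you want to see roughly what those extra logical cores buy you, time a CPU-bound job with different worker counts. The snippet below is a rough Python sketch; the worker counts and loop sizes are arbitrary illustration values, not a benchmark of any particular chip. On an HT dual core you would typically see a big drop going from 1 to 2 workers and a much smaller one going from 2 to 4, which is exactly the gap between a physical core and a logical one.

import time
from multiprocessing import Pool

def burn(n):
    # Simple CPU-bound busy work so the workers actually compete for cores.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [2_000_000] * 8              # eight equal chunks of busy work
    for workers in (1, 2, 4):           # one core, both physical cores, the HT "quad"
        start = time.time()
        with Pool(processes=workers) as pool:
            pool.map(burn, work)
        print(f"{workers} workers: {time.time() - start:.2f}s")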
This really only leaves AMD with consumers looking for incremental upgrades for their AMD motherboards, plus the low-to-mid-range quad core market for people who actually need 4 physical cores: a very small segment (practically nonexistent), as most people who require 4 physical cores will move up-market.
What's AMD's next move?
|
As someone who does not follow the trends of hardware much, this was a really interesting read. I'm interested in any debate that develops in this thread.
|
I never knew there was such competition between AMD and Intel. If this thread develops into a real debate, I'm sure to learn lots.
Edit: So manifesto basically said it better.
|
I have to say that I agree with the i5 cutting deeply into AMD territory. I was actually going to build a Phenom II x4 machine until I saw the i5 :|
|
Welcome to TL! Had you included some links in there I'd say you were a spam bot or marketer lol.
While there are enough techies around here, you might be on the wrong forum for this kind of stuff (seeing how we also have people deleting their rundll files ).
I guess we'll just have to wait and see? :p Nothing touches the i7 series at the moment, but I'm sure AMD is brewing up their own new line of CPU's.
For people wanting to do more research.. check these benchmarks! In some areas the i7 series are more than twice (!) as fast compared to the best AMD competitor. http://www.tomshardware.com/charts/2009-desktop-cpu-charts/benchmarks,60.html
Personally I prefer AMD for desktops & Intel for notebooks. However, if I were to buy a new desktop right now and had a budget to buy a monster comp, I'd definitely go for an i7 tho.
|
|
Nice write-up... it's really hard keeping track of new processors... I feel that the product cycle has really been speeding up recently
|
I'm actually thinking about buying some AMD stock, though my main worry is because they've been operating at a deficit the past year. It would be interesting to see what people's thoughts are on the company and its products.
|
I believe AMD has a 6 core proc coming out in 2010
|
It's like a weapons race.
|
thx for the writeup
|
This is AMD's desktop roadmap. Highly relevant.
|
rofl I bet they are all going to use AM3 socket
Except the 2011 cores
|
Somewhat on topic rant:
Anybody who buys AMD/Intel based on the company is retarded. There's a very simple way to buy hardware:
1. Decide on your price range.
2. Go to sites like anandtech, techreport, etc. and look at reviews of whatever hardware you want (ie CPU). Look for charts/graphics which give a bar graph of the products' performance in that range. There will probably be 10+ cpus in whatever your price range is, and their performance all nicely graphed out.
3. Buy the best performing part in your price range. Alternatively you might find that you can cut off 25% of what you pay for only 5% less performance with a certain part. Etc. The bottom line is you have the performance, you have the price, and you make the call.
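To make step 3 concrete, here's a toy sketch of the price/performance call. All the part names, prices, and scores below are made up for illustration; plug in whatever the review charts actually show.

budget = 200
cpus = [
    # (name, price in dollars, benchmark score from a review chart)
    ("CPU A", 199, 100),
    ("CPU B", 165, 95),
    ("CPU C", 120, 80),
]

# Keep only parts within budget, then rank by raw score and by score per dollar.
in_budget = [c for c in cpus if c[1] <= budget]
fastest = max(in_budget, key=lambda c: c[2])
best_value = max(in_budget, key=lambda c: c[2] / c[1])
print("Fastest in budget:", fastest[0])
print("Best score per dollar:", best_value[0])

Whether you go with the fastest part or the best value one is the judgment call, but either way you're deciding from the numbers, not the logo on the box.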
This is so obvious but it frustrates me to no end when I see people who post here in hardware / build a computer thread with advice like "Buy intel, they're better" or "Don't get ATI, nvidia is a lot better." Just understand the performance, then look at prices, and go from there.
edit: This wasn't a criticism or response to the OP really but rather to unaware people who might conclude from the OP that you should buy a certain company rather than buying based on facts of the hardware you are paying for
|
yeah, but sometimes there is reasoning behind it. As much as I hate to say it, AMD/ATi drivers sucked whereas nVidia drivers are generally much better. Intel has been dominating the market with its C2 and i7 lines, whereas the Phenom II came out at a horrible time
|
<3 my Core i7, I run Crysis full settings/fullscreen and I have never seen CPU usage go above 33%.
|
To me it's amazing how transistor feature size keeps shrinking so fast these days. Since chip size is no longer increasing, the decreasing feature size is the only thing keeping Moore's Law going. Practically every silicon chip on the market, including processors (and except maybe flash memory), can contain more transistors than people really know how to use. Right now everybody's cheating with processors, saying, "Oh, let's just put 2 4 8? cores on there. Since we don't know how to make a better processor with all our available transistors, let's just put the same old crap in duplicate all right next to each other to use up the space and hope people will be happy...oh, and all the huge excess space even after all that, let's make into cache." I mean, sure, extra cache helps for some stuff. And extra cores help for some things.
But I feel while Intel and AMD keep pushing hardware innovation, software to actually fully utilize the hardware is lagging way behind. It's not every task that can be parallelized so easily. People don't really know how to write programs to run in parallel (yes, lots of stuff does it, but it's far from mature).
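Amdahl's law puts a number on that: if only a fraction p of a program can actually run in parallel, then n cores give you at best a 1 / ((1 - p) + p / n) speedup. A quick sketch with illustrative values:

def amdahl_speedup(p, n):
    # Best-case speedup on n cores when a fraction p of the work parallelizes.
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.99):      # fraction of the work that runs in parallel
    for n in (2, 4, 8):         # core counts
        print(f"p={p}, {n} cores: {amdahl_speedup(p, n):.2f}x speedup")

Even with 90% of a program parallelized, 8 cores top out around a 4.7x speedup, which is why piling on cores only helps the software that's actually written for it.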
What I want to know is what happens when transistors of reasonable cost can no longer be made smaller. There are physical limits--you can't make layers less than an atom thick. This really isn't too far on the horizon, so it's something to consider soon.
|
On October 09 2009 13:23 Myrmidon wrote: To me it's amazing how transistor feature size keeps shrinking so fast these days. Since chip size is no longer increasing, the decreasing feature size is the only thing keeping Moore's Law going. Practically every silicon chip on the market, including processors (and except maybe flash memory), can contain more transistors than people really know how to use. Right now everybody's cheating with processors, saying, "Oh, let's just put 2 4 8? cores on there. Since we don't know how to make a better processor with all our available transistors, let's just put the same old crap in duplicate all right next to each other to use up the space and hope people will be happy...oh, and all the huge excess space even after all that, let's make into cache." I mean, sure, extra cache helps for some stuff. And extra cores help for some things.
But I feel while Intel and AMD keep pushing hardware innovation, software to actually fully utilize the hardware is lagging way behind. It's not every task that can be parallelized so easily. People don't really know how to write programs to run in parallel (yes, lots of stuff does it, but it's far from mature).
What I want to know is what happens when transistors of reasonable cost can no longer be made smaller. There are physical limits--you can't make layers less than an atom thick. This really isn't too far on the horizon, so it's something to consider soon. There's a couple proposed routes, the one that I can remember the best is quantum computing. http://en.wikipedia.org/wiki/Quantum_computing I also believe there's another proposed processor that uses lasers somehow.. I'm fuzzy on this one. Edit: This is probably it: http://en.wikipedia.org/wiki/Optical_computer
There are tech articles that come up on this subject every couple of months.
|
I used to keep up with this stuff, but I don't anymore. It was an interesting read.
The AMD/Intel rivalry is a very interesting rivalry. They're both incredibly well managed and well run companies, a perfect example of how competition is good for the consumer.
|
On October 09 2009 13:16 cz wrote: Somewhat on topic rant:
Anybody who buys AMD/Intel based on the company is retarded. There's a very simple way to buy hardware:
1. Decide on your price range 2. Go to sites like anandtech, techreport, etc and look at reviews of whatever hardware you want (ie CPU). Look for charts/graphics which give a bar graph of the products' performance in that range. There will probably be 10+ cpus in whatever your price range is, and their performance all nicely graphed out. 3. Buy the best performing part in your price range. Alternatively you might find that you can cut off 25% of what you pay for only 5% less performance with a certain part. Etc. The bottom line is you have the performance, you have the price, and you make the call.
This is so obvious but it frustrates me to no end when I see people who post here in hardware / build a computer thread with advice like "Buy intel, they're better" or "Don't get ATI, nvidia is a lot better." Just understand the performance, then look at prices, and go from there.
edit: This wasn't a criticism or response to the OP really but rather to unaware people who might conclude from the OP that you should buy a certain company rather than buying based on facts of the hardware you are paying for Not necessarily true, I would be willing to pay a bit extra to a company I trust and that I know the shit works. For example, I have had a terrible experience with amd in the past(Support issues/driver), so I am less likely to purchase their products even if one of their products is a bit cheaper/better.
|
T.O.P.
Back in the Pentium days, Intel cheated by offering rebates to manufacturers who only bought Intel processors or bought mostly Intel processors. Because of this, AMD lost market share despite having superior processors. AMD is basically screwed for life now. They own <20% of the market while Intel owns around 80%. They have an inferior design and an inferior manufacturing process. It costs less for Intel to produce a processor compared to AMD, but AMD has to sell its processors for much less because their performance is inferior. Every quarter AMD loses a lot of money, and every quarter they need a bailout from some Saudi Arabian prince just to make payroll; because of that, their shares get diluted. AMD will just keep falling behind because it can't afford to keep up with Intel in research and development. AMD is so hopeless, and it sucks because I bought AMD stock before they merged with ATI. At least the graphics division is doing well now.
|
Interesting read, I never knew about AMD's great success with their processors... This entire time I was on a Centrino ... Anyways, welcome to TL and great first post
|
On October 09 2009 13:06 FragKrag wrote: I believe AMD has a 6 core proc coming out in 2010 The problem with Thuban (the AMD hexa-core) is that it is built on 45nm technology while Intel's "equivalent" Gulftown hexa-core is 32nm. This means the Intel part will be able to clock faster, since each core takes less voltage. Thuban will probably be more accessible price-wise, though. Not many people will have use for a low-clocking hexa-core, especially since hardly any programs take advantage of quads at the moment. AMD's 8 core Bulldozer may bring high-end redemption, but it isn't due till late 2010/2011.
|
On October 09 2009 13:23 Myrmidon wrote: What I want to know is what happens when transistors of reasonable cost can no longer be made smaller. There are physical limits--you can't make layers less than an atom thick. This really isn't too far on the horizon, so it's something to consider soon. I don't really know much about the subject, but I am reading a book right now that touches on it and I feel like quoting a part of it here because you/others reading this thread might find it slightly interesting.
...Moore's law, which states that computer power doubles every eighteen months, is possible because of our ability to etch smaller and smaller transistors onto silicon chips via beams of ultraviolet radiation. Although Moore's law has revolutionized the technological landscape, it cannot continue forever. The most advanced Pentium chip has a layer twenty atoms across. Within fifteen to twenty years, scientists may be calculating on layers perhaps five atoms across. At these incredibly small distances, we have to abandon Newtonian mechanics and adopt the quantum mechanics, where the Heisenberg uncertainty principle takes over. As a consequence, we no longer know precisely where the electron is. This means that short circuits will take place as electrons drift outside insulators and semiconductors instead of staying within them. Source: Parallel Worlds by Michio Kaku, pp. 172-173.
Not entirely revolutionary as a quick google search could have given you the same information, (also it was published in 2005 so who knows if the numbers still hold true) but it is the clearest explanation of the need for quantum computing that I have ever read.
|
AMD vs nVidia is pretty interesting right now too, with the 58*0s out and Fermi still many months away. nVidia is making far less money on their GTX 2** line: older process, larger chip size, more complicated layout due to the larger bus (needed for the older GDDR3). People are questioning whether they are in fact going to be making any money on them at all, and if they will/should shut down the GTX 2** line and ride it out until Fermi gets here. Even when it does arrive, it's likely to be a monster in terms of power use, and I'm personally not interested in it, the same way the GTX line has been a no-go for me. I've never had an ATI card, but this looks likely to change.
There's also some outrage over nVidia disabling hardware PhysX processing on their GPUs when there is an ATI GPU in the system (i.e. no using an ATI card for graphics with an nVidia card for PhysX like you used to be able to do. They even disabled the old Ageia PPUs if you have an ATI card, even though those were made before nVidia bought Ageia). Not the happiest time for nVidia right now, and it gives AMD something to be happy about in the face of their CPU woes with the Intel competition.
|
Pretty interesting. I am not familiar with AMD vs Intel, as far as I knew AMD was owning it up when Intel had Pentium D's, and then from Core 2 Duo onwards AMD was finished 
ATI vs Nvidia is pretty interesting and I am a lot more familiar with that. Right now I would say that they are roughly even, although ATI has managed to release DX11 GPUs before Nvidia.
|
I thought AMD had been nearly dead for a while now?
Well... Glad to be wrong.
|
Back to AMD vs Intel: wtf is Intel doing with Westmere (their 32nm refresh of Nehalem)? I'd love a quad core Westmere: a lower power consumption/faster Lynnfield, but there won't be any (at least at first). For socket 1156, the only 32nm chips are dual core with an integrated gfx part D: On the socket 1366 side, they are only offering 6 cores (and socket 1366 is expensive)
Argh. I have a dual core Penryn right now, and have been waiting to go to quad core until I could get a quad with a TDP that's not too much higher than my current dual's. A quad core 32nm part might have done that, but all that's on offer is dual core (no thanks, want to move on) or 6 core (don't want to move that far: it would guarantee a higher TDP).
If you want a quad core i7, you have to stick with Lynnfield, at 45nm (which would mean a higher TDP for me). ffuuuuu Oh well, I can just wait it out until that 4-core part comes out. Fermi might be out by then, and even if I don't want it, its appearance might cause the price of AMD's GFX cards to drop a bit.
Keeping track of crap like this makes me understand some of the appeal of consoles. I like pain I guess.
|
The problem with Hyperthreading is that most of the improvements it shows over non-HT are in SYNTHETIC benchmarks; sometimes Hyperthreading can actually slow the system down.
The last time I built a PC, I could have chosen between a 9XXX Phenom and a Q6XXX (Kentsfield)/9XXX (Penryn) quad. Ultimately, what decided it for me was not the statistics on paper, but rather the overclockability of each platform. At the time, the 9XXX Phenoms were in poor shape: very few people managed to push them past 3GHz with acceptable stability, while batches of the Q6600 were in the 3.8-4.0GHz range.
I still have to say that ever since Core 2 was out, AMD has never dominated the upper mid-range market for CPUs. Phenom and all of its derivatives have never really matched the performance of the Intel chips they were meant to compete against. Instead, AMD has managed to stay afloat by slashing prices and appealing to the lower end.
Personally, I'll play the type that waits out the current manufacturing process refresh. My Q6600 rig's PSU died recently, so I'm on my Windsor A64 X2 rig; it's still running hella fast.
|
AMD vs INTEL
AMD got owned by their own strategy. Sure, one can say they are understaffed and all, but I would like to reveal a few mistakes that got them to the brink.
Firstly, AMD's major breakthrough was in 2003 with the first Opteron they released. Opteron was targeted specifically at the server market, but the processor was later adapted to the desktop market. Since 2003, AMD has been selling us (the same) Opterons. Intel was on weed during that time and was pushing the crappy Pentium 4 aggressively. No need to further discuss the suck of the P4, it has been covered. So AMD got the "performance crown", as it is known, in terms of x86 processors.
Many do not know this, but at that time, the so called "enterprise server" market was dominated by RISC processors, but nowadays x86 is king and will take over completely when the first large mainframe using x86 arrives. IBM is the last hurdle to overcome. AMD Opteron started all of this, and I want to stress this: AMD started by being consumer oriented and moved into the server market. Lesson to learn? The consumer market, you and I, is where the money is, and if you dominate there, you get rich and powerful, and therefore you have more money to invest in research and development.
We all have laptops these days, don't we? Well, guess what, INTEL saw this one coming long ago, and they decided to work on the old Pentium3 architecture to make something more power efficient for us. AMD was late, and when they decided to get into our laptops, they put an Opteron in. In recent years, the mobile computer market has been the source of growth, and AMD simply could not compete on anything except price! If an analogy is allowed, a Hummer Truck with hybrid propulsion is still a Hummer Truck. Today, laptops outsell desktops, and little AMD still works with an architecture that was designed with other things in mind.
Remember the "native quad core" bullshit in late 2006? Well, the quad core market did not exist at that time. Really, it didn't! It took off in 2008, and by that time mobiles were outselling desktops already so duals were still more profitable by a margin. AMD left their vulnerable dual cores to be eaten alive by INTEL while they were fuelling a market that didn't exist.
And a final question, since moving to a smaller fabrication process helps improve power consumption, what is the last thing AMD transitioned to 45nm? Was it the mobile computer oriented Turion64? Son, I am disappoint.
So AMD has effectively abandoned everything consumer oriented to gain the niche that is servers. And now this is coming back to haunt them, since the fat INTEL develops at a much higher rate and has now also produced a server oriented processor, the Nehalem. Which by coincidence runs circles around the Opteron.
AMD has had plenty of time to make the right strategic decisions, but unfortunately the highly anticipated Bulldozer microarchitecture will arrive later than INTEL's next step, called Sandy Bridge, and moreover, it is another Opteron.
|
On October 09 2009 14:23 MamiyaOtaru wrote: AMD vs nVidia is pretty interesting right now too, with the 58*0s out, and Fermi still many months away. nVidia is making far less money on their gtx 2** line. Older process, larger chip size, more complicated layout due to larger bus (needed for the older GDDR3). People are questioning whether they are in fact going to be making any money on them at all, and if they will/should shut down gtx 2** and ride it out until Fermi gets here. Even when it does It's likely to be a monster in terms of power use, and I'm personally not interested in it, the same way gtx has been a no go for me. I've never had an ATI card, but this looks likely to change.
There's also some outrage over nVidia disabling hardware physx processing on their GPUs when there is an ATI GPU in the system (ie no using an ATI for graphics with an nVidia card for physx like you used to be able to do. THey even disabled the old Ageia PPUs if you have an ATI card, even though they were made before nVidia bought Ageia). Not the happiest time for nVidia right now, and gives AMD something to be happy about in the face of their CPU woes with the Intel competition
^ agreed.
People seem to forget that AMD's greatest strength lies in areas where Intel doesn't compete directly. nVidia has been struggling hardcore since AMD introduced the 48** series chips. With the new 58** chips, AMD is in fact doing very well.
When you really think about it, anyone who upgrades their components will more likely get a graphics card before getting a CPU, as a new CPU will impose additional requirements on the mobo etc.
Intel, however, has a monopoly over the laptop market. I have not seen a single AMD laptop outside of the budget category, and I work for a computer retailer; laptops are like 70% of the PC hardware market. Intel at the moment has way too much hold on the pre-built vendor machines.
I still feel that we need to support AMD as consumers, since we need healthy and balanced competition in this industry.
|
On October 09 2009 18:25 50bani wrote: AMD vs INTEL
AMD got owned by their own strategy. Sure, one can say they are understaffed and all, but I would like to reveal a few mistakes that got them on the brink.
Firstly, AMD major breakthrough was in 2003 with the first Opteron they released. Opteron was targeted specifically at the server market, but the processor was later adapted to the desktop market.Since 2003, AMD has been selling us (the same) Opterons. Intel was on weed during that time and was pushing the crappy Pentium4 aggressively. No need to further discuss the suck of P4, it has been covered. So AMD got the "performance crown" as it is known, in terms of x86 processors.
Many do not know this, but at that time, the so called "enterprise server" market was dominated by RISC processors, but nowadays x86 is king and will take over completely when the first large mainframe using x86 arrives. IBM is the last hurdle to overcome. AMD Opteron started all of this, and I want to stress this: AMD started by being consumer oriented and moved into the server market. Lesson to learn? The consumer market, you and I, is where the money is, and if you dominate there, you get rich and powerful, and therefore you have more money to invest in research and development.
We all have laptops these days, don't we? Well, guess what, INTEL saw this one coming long ago, and they decided to work on the old Pentium3 architecture to make something more power efficient for us. AMD was late, and when they decided to get into our laptops, they put an Opteron in. In recent years, the mobile computer market has been the source of growth, and AMD simply could not compete on anything except price! If an analogy is allowed, a Hummer Truck with hybrid propulsion is still a Hummer Truck. Today, laptops outsell desktops, and little AMD still works with an architecture that was designed with other things in mind.
Remember the "native quad core" bullshit in late 2006? Well, the quad core market did not exist at that time. Really, it didn't! It took off in 2008, and by that time mobiles were outselling desktops already so duals were still more profitable by a margin. AMD left their vulnerable dual cores to be eaten alive by INTEL while they were fuelling a market that didn't exist.
And a final question, since moving to a smaller fabrication process helps improve power consumption, what is the last thing AMD transitioned to 45nm? Was it the mobile computer oriented Turion64? Son, I am disappoint.
So AMD has literally abandoned everything consumer oriented to gain the niche that are the servers. And now this is coming back to haunt them, since the fat INTEL develops at a much higher rate and has now also produced a server oriented processor, the Nehalem. Which by coincidence runs circles around the Opteron.
AMD has had plenty of time to make the right strategic decisions, but unfortunately the highly anticipated Bulldozer microarchitecture will arrive later than INTEL's next step, called Sandy Bridge, and moreover, it is another Opteron.
To be fair, AMD was cornered out of the consumer sector long before Opteron; even during the Pentium suckage, they failed to make significant penetration. So it makes sense for them to target a sector where less competition exists. It's very much like what they are doing with ATI at the moment, with the obvious difference being that graphics cards actually sell -_-.
Their biggest mistake was letting Intel get away with locking out vendors. Nowadays, they are just playing opportunists (the exact words of an Intel representative I met), which is not necessarily a bad thing.
|
no fucking idea what all this shenanigans is about.
but being an econ major, i don't think healthy competition is possible in all markets.
in simple things like farming, production of bulk goods, yes, because it's easy for a competitor to enter the market if already-existing suppliers suck.
but in things that involve development and research, no. It takes millions of dollars for a competitor to enter the market, because the further advanced the technology is, the more it takes to get started. That is why nothing will ever be a real competitor with Microsoft for OS. It's just a matter of time before AMD gets fucked by Intel.
|
amd cpus are more stable and faster, moreover they are cheaper. what more could one want?
|
On October 09 2009 19:27 Bliss wrote: amd cpus are more stable and faster, moreover they are cheaper. what more could one want? ...?
Haduken: I still feel that we need to support AMD as a consumer since we need healthy and balanced competition in this industry. I reached that same conclusion during a conversation with a friend the other day; the exchange after that consisted of us telling each other to get AMD and finding excuses out of it. We want all the benefits of having AMD around, but we don't quite want to buy AMD most of the time :p
|
On October 09 2009 18:33 haduken wrote:Show nested quote +On October 09 2009 18:25 50bani wrote: AMD vs INTEL
AMD got owned by their own strategy. Sure, one can say they are understaffed and all, but I would like to reveal a few mistakes that got them on the brink.
Firstly, AMD major breakthrough was in 2003 with the first Opteron they released. Opteron was targeted specifically at the server market, but the processor was later adapted to the desktop market.Since 2003, AMD has been selling us (the same) Opterons. Intel was on weed during that time and was pushing the crappy Pentium4 aggressively. No need to further discuss the suck of P4, it has been covered. So AMD got the "performance crown" as it is known, in terms of x86 processors.
Many do not know this, but at that time, the so called "enterprise server" market was dominated by RISC processors, but nowadays x86 is king and will take over completely when the first large mainframe using x86 arrives. IBM is the last hurdle to overcome. AMD Opteron started all of this, and I want to stress this: AMD started by being consumer oriented and moved into the server market. Lesson to learn? The consumer market, you and I, is where the money is, and if you dominate there, you get rich and powerful, and therefore you have more money to invest in research and development.
We all have laptops these days, don't we? Well, guess what, INTEL saw this one coming long ago, and they decided to work on the old Pentium3 architecture to make something more power efficient for us. AMD was late, and when they decided to get into our laptops, they put an Opteron in. In recent years, the mobile computer market has been the source of growth, and AMD simply could not compete on anything except price! If an analogy is allowed, a Hummer Truck with hybrid propulsion is still a Hummer Truck. Today, laptops outsell desktops, and little AMD still works with an architecture that was designed with other things in mind.
Remember the "native quad core" bullshit in late 2006? Well, the quad core market did not exist at that time. Really, it didn't! It took off in 2008, and by that time mobiles were outselling desktops already so duals were still more profitable by a margin. AMD left their vulnerable dual cores to be eaten alive by INTEL while they were fuelling a market that didn't exist.
And a final question, since moving to a smaller fabrication process helps improve power consumption, what is the last thing AMD transitioned to 45nm? Was it the mobile computer oriented Turion64? Son, I am disappoint.
So AMD has literally abandoned everything consumer oriented to gain the niche that are the servers. And now this is coming back to haunt them, since the fat INTEL develops at a much higher rate and has now also produced a server oriented processor, the Nehalem. Which by coincidence runs circles around the Opteron.
AMD has had plenty of time to make the right strategic decisions, but unfortunately the highly anticipated Bulldozer microarchitecture will arrive later than INTEL's next step, called Sandy Bridge, and moreover, it is another Opteron. To be fair, AMD was cornered out of the consumer sector long before opteron, even during the Pentium suckage, they failed to make significant penetration. So It make sense for them to target a sector where lesser competition exists, it's very much like what they are doing with ATI atm with the obvious difference being that graphic cards actually sells -_-. Their biggest mistake was letting Intel get away with locking out vendors. Now days, they are just playing opportunists (The exact same words from Intel representative I met), not necessaryly a bad thing.
Yeah.
I haven't been keeping up much with the current trends, but AMD's Athlon back in like '00/'01/'02 and maybe even '03 was kicking the crap out of the p2 then p3 and even p4 to some extent. They just couldn't get a good foothold in the market.
The complete failures of Microsoft in the OS department, like Windows Millennium Edition, probably didn't help AMD's plan to penetrate the market much either.
The computer I bought back in '01 has an Athlon which totally owned anything Intel had then. But holy crap, I got stuck with Windows Millennium Edition. Fail. The computer leaks memory so badly it will just freeze up within 12-24 hours of bootup even if I don't open any programs.
|
On October 09 2009 13:19 FragKrag wrote: yeah, but sometimes there is reasoning behind it. As much as I hate to say it, AMD/ATi drivers sucked whereas nVidia drivers are generally much better. Intel has been dominating the market with it's C2 and i7 lines whereas the Phenom II came out at a horrible time
ATI drivers have been as good as or better than nVidia's for at least 2 years
|
On October 09 2009 13:48 LittleBallOfHate wrote:Show nested quote +On October 09 2009 13:16 cz wrote: Somewhat on topic rant:
Anybody who buys AMD/Intel based on the company is retarded. There's a very simple way to buy hardware:
1. Decide on your price range 2. Go to sites like anandtech, techreport, etc and look at reviews of whatever hardware you want (ie CPU). Look for charts/graphics which give a bar graph of the products' performance in that range. There will probably be 10+ cpus in whatever your price range is, and their performance all nicely graphed out. 3. Buy the best performing part in your price range. Alternatively you might find that you can cut off 25% of what you pay for only 5% less performance with a certain part. Etc. The bottom line is you have the performance, you have the price, and you make the call.
This is so obvious but it frustrates me to no end when I see people who post here in hardware / build a computer thread with advice like "Buy intel, they're better" or "Don't get ATI, nvidia is a lot better." Just understand the performance, then look at prices, and go from there.
edit: This wasn't a criticism or response to the OP really but rather to unaware people who might conclude from the OP that you should buy a certain company rather than buying based on facts of the hardware you are paying for Not necessarily true, I would be willing to pay a bit extra to a company I trust and that I know the shit works. For example, I have had a terrible experience with amd in the past(Support issues/driver), so I am less likely to purchase their products even if one of their products is a bit cheaper/better.
There's no difference between AMD/ATI and nVidia or Intel in terms of support. First of all, video cards (ATI/nVidia) are almost always made by 3rd parties, so you get their support. CPUs almost never have problems, but when they do, I've never heard of either AMD or Intel having better tech support. You either RMA or you don't.
|
The Athlon x4 620 has such a great power/price ratio it is hilarious. I got one :p
|
On October 09 2009 14:07 T.O.P. wrote: Back in the pentium days, Intel cheated by offering rebates to manufacturers who only bought Intel processors or bought mostly Intel processors. Because of this, AMD lost market share despite having superior processors. AMD is basically screwed for life now. They own <20% of the market while Intel owns around 80%. They have a inferior design and a inferior manufacturing process. It costs less for Intel to produce a processor compared to AMD but AMD has to sell the processor for much less because it's performance is inferior. Every quarter AMD loses a lot of money and every quarter they need a bailout from some Saudi Arabian prince just to pay payroll, because of that it's shares gets diluted. AMD will just keep falling behind because it can't afford to keep up with Intel in research and development. AMD is so hopeless and it sucks cause I bought AMD stock before they merged with ATI. At least the graphics division is doing well now. Remember the K7? AMD was known as a shitty company making "budget" procs, then they produced the best proc available (which was also cheap and stomped the P3 performance-wise).
If their new architectures are as good as the K7 was in its time, they can be profitable.
|
AMD had a good run, but Intel has been back on top for a while now. I love my overclocked Q9550 :D Can't get an i7 yet because I blew 5K on a sound card hehe.
|
What sound card must cost $5K, DAMN YOU AUDIOPHILES.
|
Holy shit my entire desktop wasn't even 5K RMB
|
What sound card cost 5k? There isn't enough space physically to put $5k's worth of components on it afaik...
Actually, Haduken, if it costs 5k, I am willing to bet it isn't something that'd appeal to audiophiles.
The more I think about it, the odder I find it. Audiophiles would probably want a Xonar Essence or simply have a complete build using outside components; the former is nowhere near 5k (in fact, it isn't a far cry from 4 digits), and the latter isn't a card. Blowing 5k on a complete system is one thing, but on one sound card...?
|
Intel is far superior to AMD, there is no comparison between the two. I would also like to take this time to thank my sponsor Intel for providing me with my processor ^_^
|
I agree, AMD is an inferior product when paired with INTEL. Which is obviously the greatest ever.
Ever.
greatest..
|
iNcontroL
I'm going to have to agree with Bryce and Anna. Intel is quite possibly the Gandhi of modern times. Giving us such wondrous products while being so kind and supportive.
Thanks Intel.
I love you.
|
Physician
I bought a shitload of intel stocks after the crash.
|
On October 10 2009 13:38 Ecael wrote: What sound card cost 5k? There isn't enough space physically to put $5k's worth of components on it afaik...
Actually, Haduken, if it costs 5k, I am willing to bet it isn't something that'd appeal to audiophiles.
The more I think the odder I find it, Audiophiles would probably want a Xonar Essence or simply have a complete build using outside components, the former is nowhere near 5k (in fact, it isn't a far cry from 4 digits), and the latter isn't a card. Blowing 5k on a complete system is one thing, on one sound card...?
Who says it has to go inside the computer :p It's this DAC http://www.msbtech.com/products/gold4.php
|
Ugh... this isn't helping me decide on what processor I want to buy next T_T
It used to be so easy to compare them, now I'm totally lost.
Trying to wade through all the posts to make a good decision...
|
On October 09 2009 12:58 motbob wrote: I love AMD... they forced Intel to get their act together near the end of the Pentium 4's lifecycle. Now Intel is generally blowing them out of the water but that's a good thing for the consumer. History: AMD was seeded by IBM to create competition with Intel in providing them with chips.
AMD is at least doing something now that is worth mentioning.
Frankly, for the past several years they seemed to be more of a failure at providing good competition; after Intel's crappy Pentium cycle, AMD seemed great.
But I mean, when C2D hit the shelves it just blew the shit out of AMD in performance, and AMD has bled red ink for several years and still does, even with the purchase of ATI bringing in some decent profits. C2D and C2Q were, up until what, a year ago, undisputed in performance, especially with AMD's shitty start with their Phenom line. Thankfully Phenom II performed quite nicely against C2Q.
Still, Intel is king at the lower end for enthusiasts, especially the Wolfdale chips; you can OC one of those suckers to death on a pretty cheap board at 1.45V on air without too much risk. Luckily, the chips basically butchered down from AMD's quad core line now provide better performance at stock speeds, making AMD a good market choice in the $50-$130 range for those who won't OC. Intel is still pretty far ahead with the i5 and i7; in games the best of AMD can keep up, but elsewhere they just get totally slaughtered.
But keeping up in gaming performance is basically what AMD's fans have been begging for. Also, the 955 and 965 are hardly competition for the i5/i7 lineup; they best the cheapest chip in games, but that's it; elsewhere they again just get demolished.
Frankly, it's amazing how AMD can constantly bleed red ink for years now and still stay afloat. I really wish AMD would pick it up in their designs. I know they don't have the budget Intel has, but they're Intel's only competition in the x86 CPU business. AMD at least isn't hurting their fans anymore in gaming performance, but most people do want more, and the lure of Intel's lineup is pretty damn good.
Also, a comment: the i3 will most likely have HT disabled to keep the price/performance ladder clean for Intel, even though from a design standpoint the i5 didn't have to have HT disabled; nothing was stopping it from having HT. It's mainly done so Intel doesn't undermine itself like AMD does; AMD's price/performance board is so cluttered now, $50-180 with so many CPUs to fit your needs. HT isn't important in games currently, but I love it as a multi-tasker and someone that does encoding etc.
|
On October 10 2009 15:46 Saddened Izzy wrote: Also comment i3 most likely will have disabled HT to keep the price/performance clean for itnel although form a design standpoint the i5 didn't have to have HT disabled nothing is stopping it from having HT it's mainly done so intel doesn't undermined itself like AMD is, their price/performance board is so cluttered now 50-180 dollars so many cpu's to fit your needs from AMD HT isn't important in games currently but i love it as an multi-tasker and someone that does encoding etc. The rumor going around is that both the i5 and i3 will have HT. It makes a lot of sense: it allows Intel to compete with AMD's physical core advantage for those who don't necessarily need more cores but would like the added multitasking ability. The bottom end Pentium will, however, have HT disabled. I don't see HT cutting significantly into i5-750 territory and I definitely don't see it saturating itself.
|
On October 09 2009 12:42 zeroimagination wrote: But what about those gamers who also multitask heavily? Intel answers that in the form of Hyper-Threading which turns Clarkdale into a virtual quad core. This practically destroys most of the incentive for the average gamer to go with Phenom II. A higher clocking dual core is going to perform better than a lower clocking quad core while consuming less power and producing less heat, and unless you are doing something that requires 100% of one core, Hyper-Threading will provide adequate real world multitasking.
This is actually in response to the current AMD Phenom II X2 550 Black Edition, which can be unlocked into 4 cores. The core speed is 3.2GHz and can be overclocked to 3.6GHz. It is CURRENTLY being sold for about USD 100, compared to Clarkdale's i5 at the same speed, which will cost USD 176 and won't launch until next year... which is still a long while away.
AMD can win if more consumers are intelligently informed IMO. Therefore, AMD just needs to do more marketing, as well as capitalize on their limited timing window to capture consumers' hearts between now and the end of this year.
|
AMD has no moves left in terms of marketing. Intel outspends them and outsmarts them on every move.
What does AMD do in terms of marketing? A poster on your local shop's wall or a Google ad is as far as they go, while Intel logos are splashed everywhere, on top of the deals and arrangements that Intel has with vendors like HP, Lenovo, Acer, etc.
They would do well if they could convince major vendors to take their lineup (very difficult to do), but why would vendors care when Intel will more than likely match whatever offer AMD gives?
I think AMD's future lies in GPU/CPU integration, if they can make a significant play in this area then we will see some bounce back.
|
Intel or AMD: on price, AMD will try to be cheaper and nearly as good, though Intel is pricing very aggressively now, so both are viable. Intel chips all have Hyper-Threading. If this is important to you, Intel is very interesting.
If, however, you actually need the highest possible processing speed and not numbers of cores (e.g. single-threaded compiling), then just choose the highest 'speed', ignoring the number of cores etc. So for a programmer this is more ideal, though for a renderer it's not as ideal as more cores, as long as your software supports multiple cores. E.g. I can compile in about the same time on a £100 or £1000 PC at the moment, as speeds have plateaued around 2-3+GHz.
Nvidia vs ATI: ATI cards are significantly cheaper for their performance, so they are the best value for money. Nvidia cards are good too, but there is no overwhelming reason to take one or the other. In fact ATI will have the first DX11 cards, which run extremely well, and PhysX may very well die as a standard, as DX Compute may simply take over entirely (though so far only a few devs have used it, sparingly and to little effect). DX Compute may very well become the preferred method for future games overnight; Nvidia will most likely want to have the DX11 stamp on their next range of cards, so they may adopt DX Compute too, effectively giving up PhysX unless they're stubborn. But Microsoft won't care, so DX Compute will eventually become standard regardless, though it will take a long time, think 2-3 years, before we even see anything significant.
Luckily my m8 gave me 2x 8800 GTXs, so I can SLI for Starcraft 2 if need be, but I will probably buy the top Nvidia card of the day, with Starcraft 2's performance on both being the deciding factor. Unfortunately I have a 24" widescreen Dell UltraSharp which runs at 1920x1200, so I need a significant graphics card to make using it comfortable, which those 8800s do, but I dunno for how long.
The best decision you can make atm is to not make a decision until Starcraft 2 comes out or Beta comes out and some benchmarks are made.
|
need to sponsor a proteam again imo
|
I've had enough problems with my AMD and ATI hardware in the past. Then I made a brilliant move and decided to stick to Intel + nVidia and all of the problems went away... Seriously, I am willing to overpay sometimes if I know it's going to save me a lot of compatibility issues, shitty support etc. Besides, I wonder why people even consider buying ATI cards now since most new games are using PhysX which is nVidia stuff (sure, you can launch games that use PhysX without it, but the difference is dramatic).
|
On October 10 2009 22:28 Manit0u wrote: most new games are using PhysX which is nVidia stuff (sure, you can launch games that use PhysX without it, but the difference is dramatic). Batman Asylum and...? o,o
Dx11 man ...
|
On October 10 2009 22:28 Manit0u wrote: I've had enough problems with my AMD and ATI hardware in the past. Then I made a brilliant move and decided to stick to Intel + nVidia and all of the problems went away... Seriously, I am willing to overpay sometimes if I know it's going to save me a lot of compatibility issues, shitty support etc. Besides, I wonder why people even consider buying ATI cards now since most new games are using PhysX which is nVidia stuff (sure, you can launch games that use PhysX without it, but the difference is dramatic).
You are so blind. I don't even need to give arguments, simply go Google a bit.
|
So the next Intel series, "Clarkdale", won't have quad cores physically? That aside, will it outperform the current i7 series?
Sorry, I'm not very good with hardware.
|
If you're smart, you'd go for AMD. Why? Because AMD is a lot cheaper.
|
I've always trusted AMD's products. The margin of difference in whether AMD or Intel is better really isn't all that much, but the price differences are sometimes staggering, given how low AMD's prices go.
|
On October 11 2009 00:10 furymonkey wrote: So is the next Intel series "Clarkdale" won't have qaud cores phsyically? That aside, will it outperform the current i7 series?
Sorry i'm not very good with hardwares.
It's going to be a dual core based on the same architecture as current i7 and i5, but with hyperthreading. It should be roughly as powerful as a 3-core from AMD at similar clocks in multithreaded apps, but should be significantly faster in single threaded apps due to turbo.
In other words, it's a dual core that potentially encroaches on the entire triple core market in performance and wins outright on power consumption.
|
On October 10 2009 15:27 zgl wrote:Show nested quote +On October 10 2009 13:38 Ecael wrote: What sound card cost 5k? There isn't enough space physically to put $5k's worth of components on it afaik...
Actually, Haduken, if it costs 5k, I am willing to bet it isn't something that'd appeal to audiophiles.
The more I think the odder I find it, Audiophiles would probably want a Xonar Essence or simply have a complete build using outside components, the former is nowhere near 5k (in fact, it isn't a far cry from 4 digits), and the latter isn't a card. Blowing 5k on a complete system is one thing, on one sound card...? Who says it has to go inside the computer :p Its this DAC http://www.msbtech.com/products/gold4.php We don't call it a card at that point, we call it a DAC -.-
On October 10 2009 17:12 Aerox wrote:Show nested quote +On October 09 2009 12:42 zeroimagination wrote: But what about those gamers who also multitask heavily? Intel answers that in the form of Hyper-Threading which turns Clarkdale into a virtual quad core. This practically destroys most of the incentive for the average gamer to go with Phenom II. A higher clocking dual core is going to perform better than a lower clocking quad core while consuming less power and producing less heat, and unless you are doing something that requires 100% of one core, Hyper-Threading will provide adequate real world multitasking. This is actually in response to the current AMD Phenom II X2 550 Black Edition which can be unlocked into 4 cores. The core speed is 3.2GHz and can be overclocked to 3.6GHz. It is CURRENTLY being sold for about USD100 comparing to Clarkdale's i5 at the same speed costing USD176 which will launch next year... which is still a long while. AMD can win if more consumers are intelligently informed IMO. Therefore, AMD just needs to do more marketing as well as capitalize their limited eehan timing window to capture the consumers' hearts right now until the end of this year. If we want to go into the "can" category, i5s have seen overclocking up to 4ghz afaik.
|
On October 10 2009 23:12 Boblion wrote:Show nested quote +On October 10 2009 22:28 Manit0u wrote: most new games are using PhysX which is nVidia stuff (sure, you can launch games that use PhysX without it, but the difference is dramatic). Batman Asilyum and ? o,o Dx11 man ...
Dawn of War II, Mirror's Edge, Crysis.
Just to name a few. And DX11 doesn't mean anything to me since, like (I believe) the majority of Windows users, I'm still on XP, so DX9 is as far as I look.
|
On October 11 2009 01:46 Ecael wrote:Show nested quote +On October 10 2009 17:12 Aerox wrote:On October 09 2009 12:42 zeroimagination wrote: But what about those gamers who also multitask heavily? Intel answers that in the form of Hyper-Threading which turns Clarkdale into a virtual quad core. This practically destroys most of the incentive for the average gamer to go with Phenom II. A higher clocking dual core is going to perform better than a lower clocking quad core while consuming less power and producing less heat, and unless you are doing something that requires 100% of one core, Hyper-Threading will provide adequate real world multitasking. This is actually in response to the current AMD Phenom II X2 550 Black Edition which can be unlocked into 4 cores. The core speed is 3.2GHz and can be overclocked to 3.6GHz. It is CURRENTLY being sold for about USD100 comparing to Clarkdale's i5 at the same speed costing USD176 which will launch next year... which is still a long while. AMD can win if more consumers are intelligently informed IMO. Therefore, AMD just needs to do more marketing as well as capitalize their limited eehan timing window to capture the consumers' hearts right now until the end of this year. If we want to go into the "can" category, i5s have seen overclocking up to 4ghz afaik. Yup, but with almost double the price, I'd expect at least 50% to 100% increase in performance rather than only about 12% increase. This is why I mentioned that more consumers need to be informed of AMD's advantages and not just write them off simply with Intel's branding.
|
On October 11 2009 02:32 Aerox wrote:Show nested quote +On October 11 2009 01:46 Ecael wrote:On October 10 2009 17:12 Aerox wrote:On October 09 2009 12:42 zeroimagination wrote: But what about those gamers who also multitask heavily? Intel answers that in the form of Hyper-Threading which turns Clarkdale into a virtual quad core. This practically destroys most of the incentive for the average gamer to go with Phenom II. A higher clocking dual core is going to perform better than a lower clocking quad core while consuming less power and producing less heat, and unless you are doing something that requires 100% of one core, Hyper-Threading will provide adequate real world multitasking. This is actually in response to the current AMD Phenom II X2 550 Black Edition which can be unlocked into 4 cores. The core speed is 3.2GHz and can be overclocked to 3.6GHz. It is CURRENTLY being sold for about USD100 comparing to Clarkdale's i5 at the same speed costing USD176 which will launch next year... which is still a long while. AMD can win if more consumers are intelligently informed IMO. Therefore, AMD just needs to do more marketing as well as capitalize their limited eehan timing window to capture the consumers' hearts right now until the end of this year. If we want to go into the "can" category, i5s have seen overclocking up to 4ghz afaik. Yup, but with almost double the price, I'd expect at least 50% to 100% increase in performance rather than only about 12% increase. This is why I mentioned that more consumers need to be informed of AMD's advantages and not just write them off simply with Intel's branding. Seldom do we see performance increases of that scale, no? At that, if I recall correctly, not all 550 BE can be unlocked up to 4 cores without seeing functionality issues. In this case we have a good amount of those floating around though, so I suppose it isn't too huge of an issue.
However, of the people who can be convinced to put up with the trouble of building, there are even fewer who'd want to tinker with such modifications. AMD can't exactly march out and tell mainstream system makers to start modding their machines for them; at this point it isn't even that consumers need to be informed, it is that you are using the wrong standard to judge. People will choose a 975 over a 920 just because the former can do 4GHz more easily, even though the latter is just as capable of it. Why wouldn't they pay a mere 100% more than $100 for a reliable quad core with high performance?
|
On October 10 2009 17:12 Aerox wrote: This is actually in response to the current AMD Phenom II X2 550 Black Edition, which can be unlocked into 4 cores. The core speed is 3.2GHz and can be overclocked to 3.6GHz. It is CURRENTLY being sold for about USD100, compared to Clarkdale's i5 at the same speed costing USD176, which will launch next year... which is still a long while.
You know that's a horrible argument.
The X2 550 is a broken-ass quad core, I forget which one; AMD has too many CPUs on the market, it's one cluttered mess.
But those 2 other cores are disabled for a reason: unlocking them can lead to instability in your system, and possibly worse things besides, like miscalculations from your CPU.
Anyways, an E6500 from Intel (it's not your mama's 6500, it's a new Wolfdale with a 1066 FSB) can OC well into 3.6-3.8GHz and crush the 550. It's also cheaper, and if you OC you might as well just get the E6300 for even less.
AMD gained better CPUs at stock in the sub-$150 range, but Intel's CPUs have a much higher capacity to OC.
|
I thought that was one of the ones with more reliable quad core unlocks, or am I confusing it with something else?
|
On October 11 2009 02:32 Aerox wrote: Yup, but at almost double the price I'd expect at least a 50% to 100% increase in performance rather than only about a 12% increase. This is why I mentioned that more consumers need to be informed of AMD's advantages and not just write them off simply because of Intel's branding.
The problem with this is that the i5/i7 architecture is achieving that 12% better performance at stock clocks (though this isn't true in apps that don't use 4 threads, due to Turbo) while the AMD chip is running at a significantly faster clock speed. If we are talking about overclocking both to 4GHz, that performance gap will only widen. Take a look at the performance of the i7-870 or i7-975. Sure, they aren't in the same price range as a Phenom II 955, but you will achieve similar performance gains since they are in essence the same processors as the i5-750 / i7-920.
The reason why I would put most mid-to-high-end quads in the professional segment is this: for the average person, how much does a quad core increase productivity? The answer is subjective, but is it enough to warrant spending 2x more for a quad when a dual core can fit 90% of your needs? Probably not.
If, however, you fall under the category of those who use quads regularly to, let's say, render videos, the 12% increase (at stock clock, remember!) will be significant; and if we take into account overclocking, it just blows anything AMD has out of the water.
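A rough back-of-envelope check on why the gap widens with overclocking, assuming the commonly quoted stock clocks of about 2.66GHz for the i5-750 and 3.2GHz for the Phenom II X4 955 and the ~12% stock advantage mentioned above (a sketch for illustration, not benchmark data):

# Rough sketch: estimate the per-clock (IPC) advantage from the ~12% stock-speed gap,
# then project the gap when both chips are overclocked to the same frequency.
# The clock figures are assumptions for illustration, not measurements.
i5_stock_ghz = 2.66       # assumed stock clock of the i5-750
phenom_stock_ghz = 3.2    # assumed stock clock of the Phenom II X4 955
stock_advantage = 1.12    # ~12% faster overall at stock, per the post above

# Work done is roughly IPC * clock, so the per-clock advantage is:
per_clock_advantage = stock_advantage * (phenom_stock_ghz / i5_stock_ghz)
print(f"estimated clock-for-clock advantage: {per_clock_advantage:.2f}x")  # ~1.35x

# If both chips are pushed to the same 4GHz, the clock term cancels and
# the remaining gap is just the per-clock advantage:
print(f"estimated gap at an equal 4GHz overclock: {(per_clock_advantage - 1) * 100:.0f}%")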
|
Even if you are just a casual user, I think the significant amount of time you save (i7-920 vs. the nearest AMD processor) is enough to justify the difference of $60-$90. Really, for something you would likely use for 4-7 years, it's not that much.
e.g. using WinRAR, because I assume everyone uses it: http://www.tomshardware.com/charts/2009-desktop-cpu-charts/WinRAR-3.9-x64-Beta1,1399.html
Changing the chart to Photoshop or AVG (more commonly used programs) would give similar results.
|
On October 09 2009 13:31 xmShake wrote:
On October 09 2009 13:23 Myrmidon wrote: To me it's amazing how transistor feature size keeps shrinking so fast these days. Since chip size is no longer increasing, the decreasing feature size is the only thing keeping Moore's Law going. Practically every silicon chip on the market, including processors (and except maybe flash memory), can contain more transistors than people really know how to use. Right now everybody's cheating with processors, saying, "Oh, let's just put 2, 4, 8? cores on there. Since we don't know how to make a better processor with all our available transistors, let's just put the same old crap in duplicate all right next to each other to use up the space and hope people will be happy... oh, and all the huge excess space even after all that, let's make into cache." I mean, sure, extra cache helps for some stuff. And extra cores help for some things.
But I feel that while Intel and AMD keep pushing hardware innovation, software to actually fully utilize the hardware is lagging way behind. Not every task can be parallelized so easily. People don't really know how to write programs to run in parallel (yes, lots of stuff does it, but it's far from mature).
What I want to know is what happens when transistors of reasonable cost can no longer be made smaller. There are physical limits - you can't make layers less than an atom thick. This really isn't too far on the horizon, so it's something to consider soon. There are a couple of proposed routes; the one I remember best is quantum computing: http://en.wikipedia.org/wiki/Quantum_computing I also believe there's another proposed processor that uses lasers somehow... I'm fuzzy on this one. Edit: This is probably it: http://en.wikipedia.org/wiki/Optical_computer There are tech articles that come up on this subject every couple of months.
Yeah, I even remember something about biological computing, which would operate off of bacteria and such. Pretty crazy stuff.
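The point about parallelization lagging is worth a number: Amdahl's law says that if only a fraction p of a program's runtime can actually run in parallel, N cores give at most a 1 / ((1 - p) + p/N) speedup. A minimal sketch, with made-up illustration values for p:

# Amdahl's law: ideal speedup from N cores when only a fraction p of the work
# can run in parallel. Illustrates why extra cores help "for some things" but
# not everything. The p values below are illustrative assumptions.
def amdahl_speedup(p: float, n: int) -> float:
    """Ideal speedup with n cores if fraction p of the runtime parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.99):
    for n in (2, 4, 8):
        print(f"parallel fraction {p:.2f}, {n} cores -> {amdahl_speedup(p, n):.2f}x")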
|
On October 10 2009 14:38 {88}iNcontroL wrote: I'm going to have to agree with Bryce and Anna. Intel is quite possibly the Gandhi of modern times. Giving us such wondrous products while being so kind and supportive.
Thanks Intel.
I love you.
Wasn't Intel just recently fined a shitload of money for trying to do some illegal monopoly shit in Europe?
|
yea that was a long ass time ago cm :p
|
On October 11 2009 08:48 FragKrag wrote: yea that was a long ass time ago cm :p
jan 2009
|
Anyway, frequency alone means nothing. The P4 had very high clocks but got beaten really badly by slower-clocked Athlon T-Birds / XPs / 64s.
|
On October 11 2009 08:13 CharlieMurphy wrote: Wasn't Intel just recently fined a shitload of money for trying to do some illegal monopoly shit in Europe?
They were fined for abusing their relationships with major PC manufacturers in order to force them not to sell AMD-based systems (threatening to cut relations with them if they introduced AMD PCs/laptops), therefore not allowing AMD to capitalize on a (then) crushing performance advantage in its CPU offerings.
That aside: Intel have the superior products. Their performance/watt ratio is amazing, their overclocking and undervolting potential is amazing, and they will further push their lead up to (maybe) the Bulldozer core, which will compete with Sandy/Ivy Bridge (Intel's next steps after Westmere, and supposedly as big a jump from i7 as i7 was from Core2, or bigger). At the moment it makes no sense for Intel to introduce a 32nm quad-core - AMD cannot compete with current Intel offerings in that segment. They could probably lower prices and put AMD out of business, but then they'd risk being treated as a monopoly - so they're just keeping profits high instead.
The dual-core Clarkdales should outperform the venerable Q8200 in nearly every non-synthetic benchmark, putting the hurt on AMD's lower-end quad-core, all triple-core, and all dual-core offerings. The integrated graphics are a boon for non-gamers - an entire fully functional system could be pushed into a mini-ITX form factor, use under 120W at full load for all components, run fast enough for most needs, and come in under $500. Of course, SC2 would probably suck on such a machine.
|
On October 11 2009 07:25 kiykiy wrote: e.g. using WinRAR, because I assume everyone uses it.
WinRAR is for pirates and those who don't know any better.
7-Zip is the free, open-source alternative (like PeaZip) that uses LZMA compression, which beats WinRAR's best compression in just about all cases excluding WAV and BMP-type files. And the 7z format is compatible with WinRAR. I'm waiting for NanoZip to move out of alpha builds. There are much better compression tools, but most of them are command-line only with no shell extensions, so they aren't popular.
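Since both LZMA and a Deflate-style compressor ship in Python's standard library, here is a minimal sketch of the kind of comparison being described: compress the same data with both and compare sizes. "sample.bin" is a placeholder filename, and the outcome depends heavily on the input (already-compressed media barely shrinks with either).

# Quick sketch: compare LZMA (what 7-Zip uses) against zlib/Deflate on some data.
# Python's standard library ships both, so no external tools are needed.
import lzma
import zlib

with open("sample.bin", "rb") as f:  # placeholder path; use any file you like
    data = f.read()

lzma_size = len(lzma.compress(data, preset=9))   # maximum LZMA preset
zlib_size = len(zlib.compress(data, level=9))    # maximum Deflate level

print(f"original: {len(data)} bytes")
print(f"lzma    : {lzma_size} bytes ({lzma_size / len(data):.1%})")
print(f"zlib    : {zlib_size} bytes ({zlib_size / len(data):.1%})")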
|
On October 11 2009 09:01 CharlieMurphy wrote:
On October 11 2009 08:48 FragKrag wrote: yea that was a long ass time ago cm :p
jan 2009
That's one of the virtues of AMD's design: it doesn't have a tendency to fail at sub-zero temperatures. But that doesn't make it a better consumer-level overclocker, since most OCing is done on air or water.
On October 11 2009 09:13 Kazius wrote: They were fined for abusing their relationships with major PC manufacturers in order to force them not to sell AMD-based systems (threatening to cut relations with them if they introduced AMD PCs/laptops), therefore not allowing AMD to capitalize on a (then) crushing performance advantage in its CPU offerings.
They were fined by the idiot EU, the same people harassing Microsoft over how it's supposedly unfair for them to put their own browser on their own OS, which they developed from the ground up. The same people are forcing Microsoft to bundle Win 7 for the EU with other browsers on the install. The same people bitching about how the ballot system is working. The same people who don't see browser market trends and IE dropping off in use. The same people who said it's not good enough to have the ability to remove IE completely from your Windows install. The same people who first filed the lawsuit because you couldn't remove IE completely from the computer. Oh yeah, and it took them nearly 10 years to do this lawsuit too; that's right, it was filed 10 years ago. See how good the EU is at judgments, considering the EU had an N edition of XP with no browsers on it at all that didn't sell well.
The people running the tribunals for the EU are tech idiots.
Also, the basis for the fine is that they could be hurting EU citizens buying from those manufacturers. They had no definite proof that they were hurting anyone besides AMD. And they did not fine the manufacturers for agreeing to Intel's unspoken exclusivity contracts with them. And the EU collects the money, not AMD. Frankly, a large portion of tech forums claimed the EU was pulling money grabs against the tech industry, as most of it is Asia- and US-based. Let me rage some more! fjasdl;kga
|
On October 11 2009 09:57 Saddened Izzy wrote: They were fined by the idiot EU, the same people harassing Microsoft over how it's supposedly unfair for them to put their own browser on their own OS, which they developed from the ground up.
I think that MS did do wrong - Netscape was back then a far superior product to Explorer, and it lost its market share and became irrelevant, allowing MS's massive development funds to close the gap in a couple of years instead of what should have been closer to five (if Netscape had retained its 70%+ market share instead of quickly dropping to under 40%, it's likely it would have had more cash for R&D, which would have kept a stronger option than IE around for a longer time).
That would have also forced IE, much sooner, to stop being intentionally buggy and handling standards-compliant pages badly in order to screw over competition that wanted to develop to non-MS standards (this held prior to IE8 - which, ironically, renders Microsoft's own site buggy, as that site was built around the broken standards of earlier versions). It took nearly 7 years for the competition to be serious again, and suddenly MS is furiously working at improving their rather crappy software... so tell me competition isn't a good thing.
Of course, Netscape has become irrelevant (seriously, how many people remember that company?), but AMD could, and should, have reached a much higher market share, which would have given it a LOT more money to develop next-gen stuff, and if that were the case, we'd be seeing 32nm quad-cores from Intel by the end of this year. As it is today, AMD needs five years to work on a drastically new core, and we will be seeing Bulldozer in 2011 (manufacturing will optimistically start mid-2010), and up to that point they will not be competitive at the high end... and Intel are busy maximizing their profits instead of pushing the tech advantage faster. AMD has been forced to spin off its foundries in the meantime, giving it short-term relief instead of long-term integration between development and manufacturing. And I'm not even talking about how MS abused its relationship with manufacturers to gain a massive market share for MS Office, and then jacked up the price by 5-10 times once it was the standard (which now makes more money for them than OS sales, and has given us near stagnation in that field of computing).
If anything, MS should have been fined much more, much earlier, and Intel should have paid that money directly to AMD around 2005... and that would have been much better for the consumers by now. Better yet, all the manufacturers should have said no to Intel. Or best yet, Intel shouldn't have gone against free-market principles. Strong competition gave us the Core2 lineup, the nVidia 200 series, the ATI 4xxx series, etc. At the moment, Intel have no incentive to give us all their tech can offer (hence only dual core w/IGP or hexa-core for 1366 sockets at 32nm). They can focus on profit margins instead. Great for them, sucks for us.
|
On October 11 2009 12:04 Kazius wrote: As it is today, AMD needs five years to work on a drastically new core, and we will be seeing Bulldozer in 2011 (manufacturing will optimistically start mid-2010), and up to that point they will not be competitive at the high end... and Intel are busy maximizing their profits instead of pushing the tech advantage faster.
We will be seeing 32nm Intel hexacores in Q1 or Q2 of next year, and Intel already has plans to move to 22nm and 15nm after that. Intel has been pushing innovation for quite a while; it's AMD that can't catch up.
|
random fact... AMD stocks did pretty badly last month. one of the worst performing stocks in the S&P500 last month i think..
|
On October 11 2009 12:39 madnessman wrote: random fact... AMD stocks did pretty badly last month. one of the worst performing stocks in the S&P500 last month i think..
The stock does badly because the company is bleeding money out of its ass: decreasing gross revenue plus negative net income for the past 2 years. On a slightly unrelated side note, Electronic Arts is also losing money, even though they posted record-high net revenue for 2008.
|
I've had both AMD and Intel computers. I usually go with the CPU that produces the least heat for similar performance. Since I leave my computer running all the time, I don't need a mini-heater in the room or a power-sucking rig.
Had an earlier AMD; it was OK. Later had an Intel P4 HT, which was horridly hot and noisy. Now I have an Intel Core Duo, which is perfect.
|
On October 11 2009 12:35 zeroimagination wrote: We will be seeing 32nm Intel hexacores in Q1 or Q2 of next year, and Intel already has plans to move to 22nm and 15nm after that. Intel has been pushing innovation for quite a while; it's AMD that can't catch up.
All you say is true, but that's a hexacore for the high-end platform, not quad-cores for the mainstream. Do you seriously think that hexacore CPU will be in the $150-$250 price range? They delay everything nowadays due to lack of competition, from the Atom platform (Pine Trail was scheduled for Q3 2009, not Q1 2010), to mainstream quads (32nm "hopefully" late 2010 instead of early 2010), to the high end (32nm should enable octo-core CPUs in the same die size as 45nm quads). Intel has been at the forefront, but this is just like a race - you go faster when the competition breathes down your neck. As I said: the reason AMD doesn't have the resources to catch up is that when they did have the tech advantage, Intel used questionable business practices to keep AMD from capitalizing. That lack of competition allows Intel to slow down the pace. Just so you know - the next-gen quad-core CPUs are ready (and would be on the market in half a year if Intel willed it). But they don't need 32nm/new architectures, because Lynnfield/Clarksfield CPUs dominate AMD offerings as is.
|
On October 11 2009 15:50 [X]Ken_D wrote: I usually go with the CPU that produces the least heat for similar performance. ... Later had an Intel P4 HT
wat
|
On October 11 2009 12:04 Kazius wrote: If anything, MS should have been fined much more, much earlier, and Intel should have paid that money directly to AMD around 2005... and that would have been much better for the consumers by now.
To add some more fun facts to the MS vs EU discussion: the EU is just trying to keep the market competitive and monopoly-free. Let's look at why MS got fined many times here:
1. Windows Media Player - no one can develop and SELL their media player, because only an idiot would buy something you get for free right off the bat with your system. No?
2. Internet Explorer - most browsers are free, but most people won't bother downloading something they already have. No?
3. X-box - for selling it below the manufacturing price (or something like that; they sold them horrendously cheap and I don't remember the details of this case). Hard to beat that.
And some other random stuff about how MS wanted to dominate but failed: when the government of Austria wanted to unify all government machines by installing the same system on all of them (and by teaching new personnel only 1 thing instead of 3) - something that would be stable, profitable (i.e. cheap), and secure - the decision was between Linux and Windows. MS offered a 90% discount but was rejected anyway (10% of something is still > 0).
|
I think AMD is in a lot better position than they were... even if really all they have is the Phenom II at the top, for the moment it is pretty much the best bang for your buck. 3-4 months ago I was thinking they were pretty much down for the count, but since then I've at least considered getting a Phenom II.
I think they're in pretty good shape in the graphics card market too... their cards tend to be a bit cheaper than the equivalent Nvidia cards are. I got my first ATI graphics card about two months ago, and I'm a big fan of it.
|
On October 12 2009 23:12 Manit0u wrote: 3. X-box - for selling it below the manufacturing price (or something like that; they sold them horrendously cheap and I don't remember the details of this case). Hard to beat that.
Selling hardware below its production cost is a common practice. The 360 used to be sold below cost, the PS3 still is, and only the Wii is sold above manufacturing price. It's common practice: sell the hardware and make the money back on peripherals and games - somewhere around 8, I believe.
The EU decision is not a decision, it's a punishment; they have already decided that IE is unfair 3 times! They just keep fucking up on trying to "even the field".
But I mean, there is a ton of stuff that is considered a must for an OS, which Windows has, that is not consistent with the EU decision. In fact, if they studied any kind of market trends, IE has constantly lost market share since they first filed the lawsuit 11 years ago.
It's a free program that Microsoft develops and keeps updated all on their own.
One issue with the stupid-ass EU decision is that Microsoft will be blamed for bugs/security holes, etc., and for updates to a product they do not control.
There are a ton of products that are inconsistent with the EU decision, which, again, other people make money off of but which are generally free:
1. MS Paint
2. WordPad
3. Notepad (both MS products)
4. Windows Media Player
5. Windows zip support
6. Windows Live Messenger / MSN
7. Defragger (which is actually not really developed by them)
8. Windows Movie Maker
9. Photo/image viewer
10. Etc.
The list goes on. There are a ton of alternatives, and quite a few of these products cannot be removed from Windows fully. They operate just like what IE has been dogged about, yet IE is the only one to be flagged. This is just a bullshit protectionist failure, because the original lawsuit was filed by Opera, an EU-headquartered business, which does poorly in the desktop browser market but takes more than 60% of the mobile browser market, where there are no alternatives on a lot of phones.
Just saying: the EU is inconsistent and has had counter-lawsuits filed against it, because the people making the judgments have been linked to corporations competing with Microsoft, and apparently those are considered fair and unbiased judgments.
i call shenanigans!
|