|
When using this resource, please read FragKrag's opening post. The Tech Support forum regulars have helped build countless desktop systems without any compensation. The least you can do is provide all of the information required for them to help you properly. |
Yeah, that may be true these days, as I noted above. It's particularly true if your volume isn't as high as Intel's, which is part of why AMD is being realistic and not going after bleeding-edge process sizes anymore. Another part is that they don't really own any fabs anymore, obviously.
Next to be integrated on the CPU die is the southbridge. The trend has been: omg so much spare area -> add more cache to the die -> add more CPU cores -> add the memory controller and I/O controllers -> add integrated graphics -> add more I/O, etc. All of that is interspersed with "omg, add more cache to the die," of course. Anyhow, process sizes are shrinking faster than designers can come up with useful ways to spend the spare area, or at least the correspondence isn't proportional. The percentage of a CPU die that's actually traditional fetch/decode/execute logic has gone steadily down, even though the designs generally get a little more complicated.
By the way, also note that with larger die sizes, your die is more likely to have a defect on it that requires binning it lower or throwing it away (since the defect rate per square mm shouldn't depend on the size of the die).
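To put a rough number on that intuition, here's a minimal sketch using a simple Poisson yield model, i.e. assuming random defects land uniformly across the wafer so a bigger die is more likely to catch one. The defect density below is made up purely for illustration, not an actual Intel figure.

```python
# Toy illustration: with a roughly constant defect density per unit area,
# the chance that a die escapes all random defects drops as its area grows.
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Probability that a die of the given area picks up zero random defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

defect_density = 0.002  # defects per mm^2 -- invented number, for illustration only

for area in (100, 200, 400):
    print(f"{area:>3} mm^2 die -> ~{poisson_yield(area, defect_density):.1%} defect-free")
```

Double the die area and the fraction of defect-free dies falls faster than you might expect, which is why very large dies get binned down or thrown away more often.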
|
From my last post.
EDIT: Now that I think about it, are we sure that a smaller used silicon die area translates to a smaller overall wafer size? I can't imagine Intel cutting these things right up to the edge, if you know what I mean.
On February 15 2012 09:22 Myrmidon wrote: By the way, also note that with larger die sizes, your die is more likely to have a defect on it that requires binning it lower or throwing it away (since the defect rate per square mm shouldn't depend on the size of the die).
That doesn't make sense to me. Shouldn't etching a smaller process be more difficult than etching a larger one?
Or do you mean that etching the same nanometer process is complicated when you add more stuff to the silicon?
|
No, the wafer size is a certain standard. They just get more chips per wafer with smaller chips. 300 mm diameter has been the standard that a lot of people said was never going to be supplanted, but 450 mm is coming.
http://www.gsaglobal.org/email/2010/general/0222w.htm
On February 15 2012 09:24 Medrea wrote: On February 15 2012 09:22 Myrmidon wrote: By the way, also note that with larger die sizes, your die is more likely to have a defect on it that requires binning it lower or throwing it away (since the defect rate per square mm shouldn't depend on the size of the die). That doesn't make sense to me. Shouldn't etching a smaller process be more difficult than etching a larger one? Or do you mean that etching the same nanometer process is complicated when you add more stuff to the silicon?
The defect is from the silicon being impure IIRC. Hence the need for 99.9999% pure silicon, or whatever it is. I haven't looked at the process in a long time (and didn't just now), but if you want to read some Intel propaganda/whatever, they have some informative slides on the production process. http://newsroom.intel.com/docs/DOC-2476
![Ivy Bridge quad cores](http://i.imgur.com/xnYKl.jpg)
|
When I referred to the wafer, I was talking about the size of the piece residing inside the package, not the wafer they cut the chip from to begin with.
Also, from the sounds of it, the number of chips you can cut changes in plateaus, so not every die shrink results in being able to cut more from less.
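For what it's worth, the usual back-of-the-envelope estimate for how many die candidates fit on a wafer is just geometry: wafer area over die area, minus a correction for the partial dies wasted around the round edge. This ignores scribe lines and edge exclusion, so treat the numbers as illustrative only.

```python
# Rough "gross dies per wafer" approximation -- not Intel's actual figures.
import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Wafer area / die area, minus an edge-loss term for partial dies."""
    radius = wafer_diameter_mm / 2
    usable = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(usable - edge_loss)

for area in (80, 120, 160, 216):  # example die areas in mm^2
    print(f"{area:>3} mm^2 die on a 300 mm wafer -> ~{gross_dies_per_wafer(300, area)} candidates")
```

By this approximation the count moves fairly smoothly with die area rather than in big steps; any plateaus would come from how the rectangular dies actually pack against the round edge in a real layout.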
|
In AMD's whole lifespan, they haven't really given Intel a run for their money; the Netburst era was the closest they ever came. What makes you think Intel processors are going to be ungodly expensive without good competition, when AMD has been nothing but an alternative processor for, to put it simply, poor people?
"I want to buy AMD to stop Intel's price gouging" is such a stupid argument. Was stupid and is still stupid. There is zero reason for Intel to price their products out of the reach of their market: they don't like it because they can't sell enough processors to justify their fabrication processes, investors don't like it, and consumers don't like it.
Anyway, Intel has bigger problems with ARM-based devices starting to dissolve their existing markets. Mum and dad, as well as businesses, have started to ditch laptops for iPads because they don't do anything but consume media with them. They're cheap, the OS is completely locked (everyone but nerds want this), the screens are fantastic, the battery life is amazing, the weight/form factor/size is magnitudes smaller than a laptop, and most of the time the build quality is better than most $1,500 laptops.
|
AMD is still a point of reference, and another player marketing on the scene. Even if AMD has never truly been price/performance competitive with Intel, the masses are going to buy into the hype anyway. Having no competitor presence at all is still bad.
|
AMD has never been competition. They might as well not exist unless you can't afford anything but the cheapest hardware. AMD's point of reference is: we can match Intel's processors from two generations ago, but with shittier I/O speeds, shittier reliability, much higher power draw, defective power management features, and being heat-phobic. Any laptop with good hardware is going to be Intel based; any DIY builder wanting performance will look at Intel; anyone wanting low power draw and good single threaded performance for business use is going to look at Intel; any non-HPC servers will be using Intel processors.
So what does AMD offer to most people?
|
On February 15 2012 09:30 Womwomwom wrote: Anyway, Intel has bigger problems with ARM-based devices starting to dissolve their existing markets. Mum and dad, as well as businesses, have started to ditch laptops for iPads because they don't do anything but consume media with them. They're cheap, the OS is completely locked (everyone but nerds want this), the screens are fantastic, the battery life is amazing, the weight/form factor/size is magnitudes smaller than a laptop, and most of the time the build quality is better than most $1,500 laptops.
I'm not so sure about businesses, because Windows 8 on ARM will not be able to use x86 programs. Sure there will be special releases of Office and a lot of popular software, but that's probably not enough for many businesses, which rely on all sorts of expensive legacy solutions and software.
Also, I think Intel has a legitimate shot with the new Atom architecture in 2013 on 22 nm tri-gate to take back at least some mobile market share from ARM. The brute-force manufacturing advantage may actually be able to overcome the x86 CISC baggage versus ARM's RISC designs (whoops, I haven't heard anyone say "CISC" in years, so it's hard to type, lol), making this potentially more attractive as a smartphone/tablet SoC. Even the new 32 nm Atoms with more or less the old architecture are kind of competitive already.
|
Oh yeah they definitely won't replace the legacy desktop + Windows XP in businesses. I'm generally talking about portable field devices for business users. Most business users need a light portable device to access and show information for clients: the iPad is the best device for this. Businesses also like locking down their hardware to restrict usage and this is what the iPad does better than any other device on the market.
In this regard, they've replaced laptops in a lot of situations where laptops would typically be used. It's also creating new markets in the medical field, for instance. Even in the surveying field, with ArcGIS and AutoCAD apps appearing on the iPad, it could start replacing some of the field devices and workstations that do nothing but basic editing and checking of results.
|
I wonder which one introduces more defects into the equation.
Defects from pushing the process boundary.
Defects from larger die area from silicon impurities.
I'm guessing there is a sweet spot that Intel is always trying to stay inside, hence why the roadmap is what it is.
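Here's a toy way to think about that trade-off, as a sketch only: assume overall yield is a maturity-dependent systematic yield multiplied by the Poisson random-defect term from earlier, so a shrink buys you a smaller die but (at first) a worse defect density and more teething problems. Every number below is invented for illustration.

```python
# Toy model of the trade-off: process maturity vs. die area. Numbers invented.
import math

def good_die_fraction(die_area_mm2, defects_per_mm2, systematic_yield):
    """Systematic (process-maturity) yield times the Poisson random-defect term."""
    return systematic_yield * math.exp(-die_area_mm2 * defects_per_mm2)

# The same hypothetical design built on a mature node vs. a fresh shrink.
mature_node = good_die_fraction(die_area_mm2=216, defects_per_mm2=0.002, systematic_yield=0.98)
new_node    = good_die_fraction(die_area_mm2=120, defects_per_mm2=0.006, systematic_yield=0.90)

print(f"mature node: ~{mature_node:.1%} good dies")
print(f"new node:    ~{new_node:.1%} good dies")
# Which side wins flips as the new node's defect density comes down over time,
# which is presumably the crossover point a roadmap gets planned around.
```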
|
@Womwomwom I agree, particularly for use out in the field, demos, some control interfaces, and so on. There are a lot of laptops deployed out there that don't really need to be laptops and whose tasks might be better served by something else.
But like I said, I could conceive of Windows-based tablets using Atom becoming popular. Maybe the train's already left, maybe not. We'll see.
|
On February 15 2012 08:52 Medrea wrote: I'm going to need clarification, since Intel's chip area has never dropped below 75 mm squared. Since Intel is cutting out the same square area of silicon wafer for every single chip, how does a smaller die area reduce material usage? On February 15 2012 08:50 Josh_rakoons wrote: Noob question alert, what's a die shrink? When we talk about a die shrink, we are talking about how close the various elements on the chip are allowed to be. Smaller spaces allow current to travel through the processor faster, along with a bunch of other benefits. Overall die size rarely goes down, only up.
Yes! I understood what you just said! >_> <_< >_>.....
|
On February 15 2012 09:43 Womwomwom wrote: AMD has never been competition. They might as well not exist unless you can't afford anything but the cheapest hardware. AMD's point of reference is: we can match Intel's processors from two generations ago, but with shittier I/O speeds, shittier reliability, much higher power draw, defective power management features, and being heat-phobic. Any laptop with good hardware is going to be Intel based; any DIY builder wanting performance will look at Intel; anyone wanting low power draw and good single threaded performance for business use is going to look at Intel; any non-HPC servers will be using Intel processors.
So what does AMD offer to most people?
Around the Athlon vs P4 times, AMD was so much better than the Intel equivalent; they didn't just match Intel, they surpassed them by far in IPC, wattage, and overclockability. The interesting thing was that Intel still won in sales, due to bribing OEMs to only use Intel products and general consumers not knowing enough about processors: "Oh, a 3 GHz Intel processor vs a 1.8 GHz AMD, let's go for Intel!" Remember when AMD had to introduce the 3700+, 4000+, etc. as marketing terms because of this kind of misinformation? It's kind of sad that, due to those circumstances and all the dirty play by Intel, AMD could never really compete in terms of capital and investment. Like many of us are saying, the consumers will be the real losers by giving Intel free rein over the CPU market. Not only will prices increase, but consumers will have to pay a premium for processors with unlocked multipliers, laughable stuff like overclocking insurance, bad chipsets, and so on. I agree that boycotting is useless and even silly, especially since AMD isn't even trying anymore, but something must be done.
|
Yes, I mentioned that. Who cares; that was the single shining point in AMD's history, and it was more due to the huge fuck up from Intel. And it wasn't even that dire, because Intel still had the far superior fabs... and used them to create the Core 2 Duo.
|
7770 for $160.. lol AMD grasping at straws
|
These things use less power, and the GPU die itself is barely larger than Turks (HD 66xx). There's no reason it couldn't be priced at current HD 6770 levels; it just doesn't make sense for them to do that, since that's not how you make money. They've got to clear old stock, and it's not like they're really competing much with Nvidia in the sub-$150 category. Some people will want the new stuff anyway, because they didn't check any reviews, or because the new features or power consumption or whatever are important to them.
The only thing maybe worth a look now is the fanless HD 7750 tbh, maybe a couple of the other models too.
|
What is your budget?
$500-600
What is your resolution?
1440x900 / 1280x1024
What are you using it for?
Streaming starcraft 2, gaming
What is your upgrade cycle?
Whenever something dies, I replace it. Overall, probably 3 years
When do you plan on building it?
Not sure to be honest... within the next month
Do you plan on overclocking?
No
Do you need an Operating System?
Yes
Do you plan to add a second GPU for SLI or Crossfire?
No
Where are you buying your parts from?
Anywhere
|
One more attempt at laptop advice : (
Budget: $1200
Resolution: 1920x1080 17" screen (higher if a miracle can be found)
Use: Gaming and emulators will be the most taxing use. My goal is to maximize gaming power within my budget; I'll find ways to use any power I can get in that dollar range.
Overclocking: If necessary and if the cooling is sufficient to allow it.
Upgrade cycle: Difficult to predict at this point, so let's say 3 years.
Three months ago I got a Dell XPS 17 for $900 after tax with an i7 2670, GeForce 555M, 1 TB hard drive, 8 GB RAM, and a 17.3" 1920x1080 screen. I'm back in the market because it was stolen, and I would like to at least match those specs. I don't expect to find a deal that good in just a few weeks of looking when it took 4 months before. Right now I'm looking for advice on where to look for a good deal, as well as what brands I should avoid no matter the deal.
|
On February 15 2012 15:34 Myrmidon wrote: These things use less power, and the GPU die itself is barely larger than Turks (HD 66xx). There's no reason it couldn't be priced at current HD 6770 levels; it just doesn't make sense for them to do that, since that's not how you make money. They've got to clear old stock, and it's not like they're really competing much with Nvidia in the sub-$150 category. Some people will want the new stuff anyway, because they didn't check any reviews, or because the new features or power consumption or whatever are important to them.
The only thing maybe worth a look now is the fanless HD 7750 tbh, maybe a couple of the other models too. Well sure, but I still believe it's rather silly that you can buy a waaaay better card for less from the same manufacturer lol
(However 7750 with some decent sales should be a really nice card at that price bracket)
|
7770 is just too much money. You would have to really value the power consumption.
|