|
Man I love this forum! It's awesome debating computer tech with people on here.
Maybe I'll post an article here every day? Idk, if some of you don't like it, I can stop...
Intel's new Jasper Forest CPU will integrate the I/O hub into the CPU itself, saving space on the motherboard and using less energy in the process.
By integrating the I/O hub into the CPU, there's no need to have a north bridge on the motherboard anymore.
So, since we already got rid of the southbridge, all that was left was the north bridge, if I'm understanding this correctly.
It seems as though many computer hardware innovators are moving toward integrating more and more into the CPU and less on the motherboard.
So from this, I came up with a hypothesis:
Could Intel be moving toward integrating everything into the CPU, and getting rid of the motherboard all together?
I can't really think of a way to connect all of the other components to the CPU itself, but if that becomes possible in the near future, the bulky boxes we used to loathe carrying around could shrink down to something the size of your hand.
However, I don't think Intel will be able to throw out the motherboard altogether; motherboard vendors will have something to say about that.
Also, if Intel is trying to integrate EVERYTHING into the CPU, they will have the problem of shrinking every component in the computer down to the size of a chip (or a few transistors) and integrating them all into the CPU.
Well, that's my theory! But what do you guys think?
|
Impossible! They will ruin the market for a lot of vendors if they try to integrate everything into the CPU, and then they'll run into that monopoly problem again. My thought, though, is that if they do integrate everything into the CPU, that's a lot of space you'd be saving. I just really don't know how this would be possible, or how it would come out in the end. I feel like it would end up being something like the MacBook Air.
|
Yes, exactly my thought.
However, I'm guessing it will be possible in the future... I mean, we went from having just one GPU in a computer to having multiple in a matter of months. So, anything is possible.
However, I don't think Intel will have a problem with the whole "monopoly" thing.
If everything does get integrated into the CPU, there's still the problem of producing all of those transistors and small parts that will function as the GPU, hard drive, and other components of the computer.
Then maybe, just maybe, the vendors will still have a chance in the market.
|
I don't believe that is the direction Intel wants to move in the near future. A more likely purpose for eliminating the northbridge and letting the GPU talk directly to the CPU is to prepare an architecture more favorable for Larrabee or integrated CPU+GPU budget chips. I'm not aware of any upcoming Intel products that would give them an incentive to remove the southbridge.
|
Hmm true true...
Well, Larrabee is billed as a general-purpose GPU, and it can take on some of the roles the CPU traditionally handles, so I guess it would be more efficient just to connect the GPU directly to the CPU.
Again, this is just a theory, a thought that came to mind when I noticed how Intel keeps folding small parts of the motherboard into the CPU.
Though it may not seem possible now, it might be in the future. Anything can happen as the PC keeps evolving.
|
I agree with you that it will probably eventually happen. If you have a 32-core chip, stuffing all the motherboard functions onto a couple of those cores would make a lot of sense. I think it will be several years before anyone announces a fully integrated CPU, though.
|
Yeah, definitely not any time soon. Though several years may pass very quickly... XP
I'm thinking motherboard vendors will have a huge say in this, since they keep "losing parts" on their motherboards. Less material = cheaper boards? Not necessarily, but it might be the case.
So it might also mean less profits.
|
Well, I'm not sure about it, but I don't think Intel thinks they can break Nvidia and AMD's hold on the standalone graphics market yet.
|
Well, of course not!
And figuring out how to integrate those bricks we call GPUs seems nearly impossible right now!
Also, taking the power factor into account, I don't think we're going to see 400W CPUs with built-in GPU engines any time soon Xp
Which does sound awesome though...
|
Nah, I'd rather be able to upgrade my graphics card without having to buy a whole new CPU in order to make my graphics better :p
|
Skochkeyy: ahaha that is true. What would we do all day if we couldn't drool over awesome, huge graphics cards anymore!?
I came up with another problem with the whole integration idea though.
If, by some miracle, we do eventually integrate almost everything in the computer into a single processor, wouldn't it be hard to replace or fix when one small part of that processor keeps you from using your computer?
For example, if the GPU does get integrated and something goes wrong with it, you can no longer just go buy another GPU and fix the problem on your own. You may even have to buy a whole new processor.
Just a thought... whadya guys think?
|
There's a lot of potential in using GPUs for general computing tasks. Their massive parallelism is the obvious draw, and recent developments such as OpenCL mean the GPU will actually play an important role outside the gaming/graphics market. I don't really think this will be that big a deal (at least for the next 5 or 10 years or so). Vendors are not necessarily missing out; there has been a general push toward turning the personal computer into an integrated appliance instead of components that you build yourself.
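To make the OpenCL point a little more concrete, here's a rough sketch of handing an embarrassingly parallel job (an element-wise vector add) to the GPU. It's just the plain OpenCL 1.x C API with no error checking, and it assumes you have the OpenCL headers and a GPU device available; it's only meant to show the flavor of GPGPU code, nothing more:

```c
#include <CL/cl.h>
#include <stdio.h>

/* Kernel source: each work-item adds one pair of elements. */
const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Grab the first platform and its first GPU device. */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Build the kernel at runtime from the source string above. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    /* Copy inputs to device buffers, allocate the output buffer. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    /* Launch N work-items in parallel, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %f\n", c[10]);  /* expect 30.0; cleanup omitted for brevity */
    return 0;
}
```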
No matter what happens, traditional parts makers like ASUS and Gigabyte will still play a role.
Look at what Western Digital and Seagate have done. Instead of just selling bare hard drives, they now sell external drives too.
If a business can't adapt, then that's their problem.
But Intel and Nvidia won't cross swords soon. My opinion is that they will diverge to the point where they become the next AMD vs. Intel, and then we will see some real competition.
Which is of course good for the consumer.
Let's not start worrying about what-ifs. The bottom line of any business (tech firms included) is making money, and they have to do it by capturing markets. So if and when the GPU and CPU become one, there will be new buying models.
|
There is something called system-on-a-chip (SOC) you know...
Larrabee is really a dark horse in the GPU market. From what we've seen it is going to be huge and very hot, and unfortunately not very fast. Intel's plan would be for it to lead in pure performance when it comes out, but it might end up behind smaller and more efficient chips from AMD and Nvidia. In short, it will probably not make a huge impact on the market; not even the strength of the Intel brand could keep it afloat.
But what about the future? Well, the CPU and GPU might be merged and we might see some kind of Larrabee vs. IBM Cell battles. The current paradigm cannot be pushed much longer, but since the great majority of software designed today does not consist of embarrassingly parallel problems, chip manufacturers have to keep growing "per thread" performance. I would also point out that transitioning to new paradigms and technologies will not be easy, because customers require a "here and now" performance improvement to upgrade their systems. A 64-core Atom looks promising on paper, but would be worthless in the current software ecosystem.
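To put the "embarrassingly parallel" point in concrete terms, here's a tiny C sketch (the OpenMP pragma is just illustrative and assumes you compile with -fopenmp; without that flag the code still runs, only serially). The first loop has no dependencies between iterations, so it scales across however many cores you throw at it; the second loop carries a dependency from one iteration to the next, so only better per-thread performance helps it:

```c
#include <stdio.h>

#define N 1000000

static float a[N], b[N], c[N];

int main(void) {
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 1.0f; }

    /* Embarrassingly parallel: every iteration is independent of the
       others, so it maps cleanly onto a many-core chip (or a GPU). */
    #pragma omp parallel for
    for (int i = 0; i < N; ++i)
        c[i] = a[i] + b[i];

    /* Loop-carried dependency: iteration i needs the result of
       iteration i-1, so extra cores don't help here -- only faster
       "per thread" performance does. */
    float acc = 0.0f;
    for (int i = 0; i < N; ++i)
        acc = 0.5f * acc + a[i];

    printf("%f %f\n", c[N / 2], acc);
    return 0;
}
```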
|