|
For those of you who aren't exactly computer savvy, if you are using an operating system with a graphical interface (read: all of them), then you are currently using a graphics card.
A graphics card (also referred to as a VGA accelerator by some people, or just VGA card) is a piece of silicon with a single purpose: to display what you currently see on your screen. Their methods vary widely, and in the DOS era they were rarely standardized. Back in the days of the ISA system bus (the precursor to PCI, AGP, and PCI-E), graphics accelerators had pretty terrible performance and were really expensive. Not only that, but they were necessary: anyone who wanted to play some DOOM or Turok had to have one, and so came the first big graphics companies.
More than 10 years ago, there was a graphics card company called 3dfx. They were amazing and several years ahead of their time, developing technologies such as SLI (scan-line interleave, a technology that allowed the use of multiple graphics cards per computer, thus improving 3D performance) and power-saving features that weren't seen from the competition for years, with most of their entries into the market blowing the competition out of the water. Keep in mind that they first rose to power when 2MB of on-board video memory was almost superfluous.
However, due to rampant mismanagement, bad spending, and competition from the then brand-new GeForce 2 and GeForce 3 cards from the rising star Nvidia, 3dfx failed. Three days before they were slated to release what was supposed to be their life-saver, the Voodoo 5 series, they were sold by investors, ironically enough, to Nvidia. We don't see much of 3dfx anymore, unless you like to use multiple video cards in your computer, in which case only faint echoes of the original SLI technology remain on your monitor.
Fast forward to today. We have two major graphics card companies, ATI and Nvidia, as well as two major processor manufacturers. ATI was recently bought by processor manufacturer AMD when it was in dire financial straits, giving it the budget it needed to finally start succeeding again. With only one misstep, the fiasco that was the 2900 XT, ATI has been relatively successful lately, outselling Nvidia in key market segments.
Nvidia, on the other hand, has formed an informal alliance with Intel (aka Chipzilla), and their dishonest marketing strategies (such as "The Way It's Meant To Be Played", where they pay game development companies large sums to make their games perform better on Nvidia systems) have caused a lot of PR damage.
In 2008, as is standard, Nvidia and ATI released their new graphics cards at roughly the same time. ATI shocked the PC enthusiasts of the world with their HD 4870, which was more than twice as powerful as their previous entry, the 3870. The 4xxx series GPUs by ATI outsold the Nvidia GT200 cards in that round due to their better price/performance. Nvidia has slowly been losing market share over the past year to the AMD/ATI combination.
And now we have the current generation of graphics cards.
A couple months ago ATI launched their 5 series graphics cards. The flagship 5870 once again managed to double the performance of the last series of graphics cards, and was the first card ever released to support DirectX 11 (a standard 3D graphics API promoted by Microsoft). ATI managed to strike a hard blow against Nvidia, releasing their new, surprisingly cheap cards right at the end of the lifecycle of the GT200 cards. Because of this, ATI has gained approximately 8% of the graphics card market share in just a few months. Nvidia's new offering, the GT300, codenamed "Fermi", is nowhere to be seen.
Here's the problem:
Fermi has been delayed for nearly 6 months now. Originally scheduled to come out only a month after the 5xxx series cards, it's now been pushed back to March. While there have been a couple of teasing screenshots, the problems Nvidia has encountered are really telling.
One of the problems stems not from a failing of Nvidia, but from their new production process. They've moved to the 40nm production node (meaning the average width of a logic gate on the GPU die will be no greater than 40nm), which is buggy and untested. TSMC, the company responsible for manufacturing the chips, has managed to achieve only a 2% yield. That isn't a typo. That means that out of every 100 pieces of blank silicon TSMC is given, only 2 of them will yield working graphics chips. Blank silicon and hafnium wafers aren't cheap, and Nvidia has supposedly squandered millions of dollars on this technology. Not only that, but Nvidia has entirely switched the architecture of their GPU, rebuilding it from the ground up.
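To see why a 2% yield is so brutal, here's a quick back-of-the-envelope sketch. The wafer price and die count below are made-up illustrative numbers, not actual TSMC figures; the point is just how the cost of every dud gets loaded onto the few working chips:

```python
# Back-of-the-envelope cost per working die. All numbers are
# illustrative assumptions, not real TSMC pricing or die counts.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    """Amortize one wafer's cost over the dies that actually work."""
    good_dies = dies_per_wafer * yield_rate
    if good_dies < 1:
        raise ValueError("yield too low: no working dies per wafer")
    return wafer_cost / good_dies

# Hypothetical 40nm wafer: $5000 per wafer, 100 candidate dies.
healthy = cost_per_good_die(5000, 100, 0.60)  # 60% yield -> ~$83 per chip
broken = cost_per_good_die(5000, 100, 0.02)   # 2% yield -> $2500 per chip
print(round(healthy, 2), round(broken, 2))    # -> 83.33 2500.0
```

Same wafer, same silicon, but at 2% yield each working chip has to carry thirty times the cost. That's the hole Nvidia is supposedly in.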
The second problem is that Nvidia refuses to admit that it has failed. As a matter of fact, Nvidia was even caught faking GT300 reference cards. There have been no benchmarks released, and other than the constant delays, there is absolutely no sign that the Fermi cards will ever be released.
Now go back to the beginning.
3dfx found itself in exactly the same position. Mismanagement, underhanded corporate tactics, and massive delays led to the death of the company even after it had been so successful. A single failure can kill an industry giant in a market as volatile as the graphics card industry. And so that leaves us wondering: if Nvidia fails, who will step up to take their place?
|
Nvidia is nowhere near going out of business and they make WAayyAyaYAYyAy more money than 3dfx could have ever dreamed of.
|
Wow that's interesting. I haven't followed the hardware scene for ages and last time I checked (i.e. years ago) nVidia was raping ATI (and Intel was raping AMD). God I remember the good old voodoo and TNT cards. Was really leet stuff back then. :p
|
On January 09 2010 17:25 Matoo- wrote: Wow that's interesting. I haven't followed the hardware scene for ages and last time I checked (i.e. years ago) nVidia was raping ATI (and Intel was raping AMD). God I remember the good old voodoo and TNT cards. Was really leet stuff back then. :p
voodoo!!! holy crap im old.
|
MURICA15980 Posts
Even though ATI gained 8% last year, doesn't Nvidia still have the majority of the market share? I mean, sure, the top-of-the-line cards are cool and one side may outsell the other in that race, but the real money is made in the mid-level cards, isn't it? Don't Macs and Dells sell exclusively Nvidia? I'd say they're doing fine, but I honestly have no idea.
|
It's very telling that NVIDIA hasn't released a new product since the GT200 series in mid-2008. Their previous cycle was a new product every six months, now they are just constantly rehashing the GT200. If their next card fails I think it will have a very serious impact on their ability to remain competitive with ATI finally becoming a serious contender.
|
Nvidia has a lot of the market, but yes due to their fiasco with the 300 series they are coming up short.
Klogon: Apple now equips iMacs with 4850 (as a flagship gpu LOL), and Mac Pros with 4870s. Dell also uses AMD chips, like the HD 4650 in their desktops.
|
T.O.P.
Hong Kong4685 Posts
Even though Nvidia is losing market share, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly, and it's hurting Nvidia. It's more expensive for Nvidia to manufacture GPUs compared to ATI: ATI's GPU die size is significantly smaller than Nvidia's. ATI GPUs still run faster than Nvidia GPUs because of faster clock speeds and dual-GPU board technology. ATI is doing great because they sell GPUs with a good profit margin. Nvidia sells GPUs and loses money doing it.
|
I believe this is a good thing; as far as I know AMD/ATI was struggling very hard a few years ago. Nvidia won't fail with 1 big blunder, at least from what my limited knowledge can tell (:. It will give ATI/AMD some breathing room to build a buffer hopefully. We really need these companies to keep competing successfully; anything else can only be bad for the consumer.
|
T.O.P.
Hong Kong4685 Posts
On January 09 2010 17:44 R1CH wrote: It's very telling that NVIDIA hasn't released a new product since the GT200 series in mid-2008. Their previous cycle was a new product every six months, now they are just constantly rehashing the GT200. If their next card fails I think it will have a very serious impact on their ability to remain competitive with ATI finally becoming a serious contender. Yes, 8600 GT = 9600 GT = GT 230. At least in the mobile market. They've been renaming gpus for the last few years.
|
i don't think we can put the blame fully on nvidia. i believe both amd and nvidia get their chips supplied from the same manufacturer, tsmc. yields on ati cards may be better, but they are still unable to keep up with demand, which brought up questions about how well they would do during the holiday season, and inflated prices.
i don't think nvidia is in too bad shape as long as fermi kicks ass, which according to nvidia it does (lol obv).
i mean most people who can wait for fermi to come out will wait for it anyway.
|
Think of it this way: the GT200 die, being monolithic and expensive, was like 230mm². Fermi is going to be like 300+mm², increasing the prices that much more. (Hafnium isn't cheap, y'know)
|
On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it.
i think you are in opposite land
|
T.O.P.
Hong Kong4685 Posts
On January 09 2010 17:55 FragKrag wrote:Show nested quote +On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain
|
I don't really have an established opinion on the graphics card industry, but that was one hell of an awesome OP. I read every word (and even reread some stuff!!)
|
On January 09 2010 17:57 Day[9] wrote: I don't really have an established opinion on the graphics card industry, but that was one hell of an awesome OP. I read every word (and even reread some stuff!!)
Day[9] complimented my OP. My TL life is complete.
|
On January 09 2010 17:56 T.O.P. wrote:Show nested quote +On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain
oh
I don't know. It's just that nvidia has been rebranding their cards for the last 4 years and making money off of them.
Have you ever heard of the 200 series TOP?
|
On January 09 2010 17:56 T.O.P. wrote:Show nested quote +On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain i don't know for sure, but if ATI is like AMD in this case they are the one selling with a low profit margin, and nvidia is still earning money even though their graphic cards are performing worse per dollar, and losing market shares. it does make sense giving their respective positions too.
|
On January 09 2010 17:56 T.O.P. wrote:Show nested quote +On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain Have you heard of the G92 GPU? It's basically Nvidia's workhorse. It first debuted with the 8800 GT, and was later carried on to the 8800 GTS 512, the 9800 GTX, the 9800 GTX+, and the GTS 250. That GPU was such a baller that they used it like 5 times. It had 128 of those unified "stream processors" Nvidia is so proud of. To create the GTX 280, Nvidia built on the G92 design, creating the 240sp behemoth that we have today, and also the 192 and later 216sp GTX 260s. The G92 has always had amazing performance (well over that of ATI's best cards) but is also fairly expensive to produce. If you want top-of-the-line performance, go Nvidia. If you want to be able to afford some games to play on your new computer, go ATI.
|
T.O.P.
Hong Kong4685 Posts
On January 09 2010 17:59 FragKrag wrote:Show nested quote +On January 09 2010 17:56 T.O.P. wrote:On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain oh I don't know. It's just that nvidia has been rebranding their cards for the last 4 years and making money off of them. Have you ever heard of the 200 series TOP?
Yes, the 200 series is the reason why Nvidia is falling behind.
|
On January 09 2010 18:03 T.O.P. wrote:Show nested quote +On January 09 2010 17:59 FragKrag wrote:On January 09 2010 17:56 T.O.P. wrote:On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain oh I don't know. It's just that nvidia has been rebranding their cards for the last 4 years and making money off of them. Have you ever heard of the 200 series TOP? Yes, the 200 series is the reason why Nvidia is falling behind. I don't think i get the focus of your post. That graph shows the market share of nVidia getting raped in 2008, which is when they released the gt200 series, which is when their market share started falling...
|
T.O.P.
Hong Kong4685 Posts
On January 09 2010 18:01 nttea wrote:Show nested quote +On January 09 2010 17:56 T.O.P. wrote:On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain i don't know for sure, but if ATI is like AMD in this case they are the one selling with a low profit margin, and nvidia is still earning money even though their graphic cards are performing worse per dollar, and losing market shares. it does make sense giving their respective positions too. No, ATI is the one selling with the high profit margin because the die size of their gpu is small. Nvidia is selling with low profit margin because their die size is big. Die size big = high failure rate.
|
T.O.P.
Hong Kong4685 Posts
On January 09 2010 18:05 ghermination wrote:Show nested quote +On January 09 2010 18:03 T.O.P. wrote:On January 09 2010 17:59 FragKrag wrote:On January 09 2010 17:56 T.O.P. wrote:On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain oh I don't know. It's just that nvidia has been rebranding their cards for the last 4 years and making money off of them. Have you ever heard of the 200 series TOP? Yes, the 200 series is the reason why Nvidia is falling behind. That graph shows the market share of nVidia getting raped in 2008, which is when they released the gt200 series, which is when their market share started falling... Exactly the point I wanted to make.
|
it also houses 3 dual gpu cards whereas ATi releases like 1 dual gpu card per series
Nvidia makes up for the high cost of the process by inflating the prices of their GPUs.
|
On January 09 2010 18:06 T.O.P. wrote:Show nested quote +On January 09 2010 18:01 nttea wrote:On January 09 2010 17:56 T.O.P. wrote:On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain i don't know for sure, but if ATI is like AMD in this case they are the one selling with a low profit margin, and nvidia is still earning money even though their graphic cards are performing worse per dollar, and losing market shares. it does make sense giving their respective positions too. No, ATI is the one selling with the high profit margin because the die size of their gpu is small. Nvidia is selling with low profit margin because their die size is big. Die size big = high failure rate. Not necessarily. For example, the gt200 which was and still is very large had around a 60% yield rate, which isn't the best but it's respectable. To put it in perspective, the rv770 and rv870 cores have both had around 70% yields. Cores that are damaged/faulty are often rebinned as lower performing cards, but the big problem for nvidia is that they can't get away with rebinning their gt300 cards because they're on an entirely new and untested process.
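The die-size argument going back and forth here can be made concrete with the classic Poisson defect model: if manufacturing defects land randomly across the wafer, the chance a given die catches zero defects falls off exponentially with its area. A toy sketch (the defect density is a made-up illustrative number, not a real fab figure):

```python
import math

# Toy Poisson defect-yield model: the probability that a die of a given
# area has zero randomly-scattered defects is exp(-area * defect_density).
# The defect density below is a hypothetical number for illustration.
def poisson_yield(die_area_mm2, defects_per_mm2):
    """Probability that a die of the given area has zero random defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

d0 = 0.002  # hypothetical defects per mm^2
small = poisson_yield(256, d0)  # smaller die -> ~60% of dies defect-free
large = poisson_yield(512, d0)  # double the area -> ~36% defect-free
```

Note the nasty property: doubling the die area squares the per-die yield, so a big monolithic chip pays twice, once in fewer dies per wafer and again in a worse fraction of them working.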
|
T.O.P.
Hong Kong4685 Posts
We don't even know what fermi's gonna be like. Talking about fermi is like talking about SC2. I was talking about how the 200 series is hurting nvidia.
|
T.O.P.
Hong Kong4685 Posts
|
On January 09 2010 18:11 T.O.P. wrote: We don't even know how fermi's gonna be like. Talking about fermi is like talking about SC2. I was talking about how the 200 series is hurting nvidia. I was responding to the part where you said "die size big = big failure rate." Also: I don't think we've seen the end of the delays for Fermi. I mean, even if they were to suddenly dump all of the cards that exist onto the market, that would only be a couple hundred. At this rate, how are they ever going to produce enough of them to price them below the cost of the Asus Mars GTX 295? + Show Spoiler +that's a really geeky joke, but the Mars was made with two GTX 285s instead of two GTX 260s, making it like $1000+
|
The 4xxx cards had a wayyy better price-to-performance ratio compared to the expensive 200 series.
|
|
Bill307
Canada9103 Posts
On January 09 2010 18:06 T.O.P. wrote: Die size big = high failure rate. No, it's the opposite.
|
Well Intel is still >>>> AMD
but atm ATi > nVidia.
|
On January 09 2010 18:17 Bill307 wrote:Thanks for the informative post, ghermination.  Last I'd checked it was Intel + nVidia >> AMD + ATI. Glad things are evening up now. I remember hearing something ~4 years ago about the limit of our current chip fabrication technology being about 45(?) nm, because anything smaller runs into serious problems, such as heat dissipation. I'm surprised nVidia would commit themselves to a smaller size without sufficiently testing it first. Then again, bad management knows no bounds. =P Actually, the physical limit our technology can reach is 12nm. Samsung has even demonstrated a transistor fabricated at the 22nm process. While i'm no physicist, at that size we run into a problem called quantum tunneling. Although i don't fully understand the process, you can look it up on wikipedia if you're actually interested.
|
Maybe if the i7 prices go down, everyone just gets a Phenom II x4 and OCs it. I'm still stuck with an E6420 @ stock speeds. Damn old mobo!!!
|
On January 09 2010 18:18 FragKrag wrote: Well Intel is still >>>> AMD
but atm ATi > nVidia.
The C2D and Q series are great, but I guess even the Quad is just too low for today's standards.
|
you could always get like an E8400 or something.
|
On January 09 2010 18:22 Disregard wrote:Show nested quote +On January 09 2010 18:18 FragKrag wrote: Well Intel is still >>>> AMD
but atm ATi > nVidia. The C2D and Q series are great, guess even the Quad is just too low for today's standards.
Not necessarily. The Core architecture offers amazing performance, and if you look at the new 32nm i3s and i5s that were just released, they have some pretty impressive clocks. I believe the best dual-core i5 part is at 3.46GHz, and can easily be overclocked to 4.5+. New fabrication processes are amazing.
Also, they're fairly cheap. Once the 22nm hexa-core Sandy Bridge parts come out (~Q1-Q2 2011) we should see a huge price drop on the older i7 and i5 processors, considering Intel will probably be abandoning LGA 1155 at that time and will begin to phase out LGA 1366 (according to their roadmap and statements made at CES, they plan to phase out LGA 1366 by 2012).
|
On January 09 2010 18:24 ghermination wrote:Show nested quote +On January 09 2010 18:22 Disregard wrote:On January 09 2010 18:18 FragKrag wrote: Well Intel is still >>>> AMD
but atm ATi > nVidia. The C2D and Q series are great, guess even the Quad is just too low for today's standards. Not necessarily. The core architecture offers amazing performance, and if you look at the new 32nm i3 and i5's that were just released they have some pretty impressive clocks. I believe the best dual core i5 part is at 3.46ghz, and can easily be overclocked to 4.5+. New fabrication processes are amazing. http://www.teamliquid.net/forum/viewmessage.php?topic_id=109322
|
On January 09 2010 18:06 T.O.P. wrote:Show nested quote +On January 09 2010 18:01 nttea wrote:On January 09 2010 17:56 T.O.P. wrote:On January 09 2010 17:55 FragKrag wrote:On January 09 2010 17:47 T.O.P. wrote: Even though Nvidia is losing marketshare, they still own a big part of the market. However, ATI's great lineup from the bottom to the top forced Nvidia to lower prices significantly and it's hurting Nvidia. It more expensive for Nvidia to manufacture GPUs compared to ATI. ATI's gpu die size is significantly smaller than Nvidia. ATI gpus still run faster than Nvidia gpus because of faster clock speeds and 2 gpus on a board technology. ATI is doing great because they sell gpus with a good profit margin. Nvidia sells gpus and loses money doing it. i think you are in opposite land explain i don't know for sure, but if ATI is like AMD in this case they are the one selling with a low profit margin, and nvidia is still earning money even though their graphic cards are performing worse per dollar, and losing market shares. it does make sense giving their respective positions too. No, ATI is the one selling with the high profit margin because the die size of their gpu is small. Nvidia is selling with low profit margin because their die size is big. Die size big = high failure rate. ok thanks! i shouldn't speak out about stuff i know jack shit about but then again it doesn't seem to stop anyone else around here (:
|
To be completely honest I've got my eye on the g9650. As soon as I get some spare cash I'm going to get a Sempron 140, and then one of those, and I plan on finally busting out dry ice, something I haven't done in years. I'm sure those 32nm single-core processors overclock amazingly well on CO2.
Edit: Smoking a bowl for ATI. Go red team!.
|
This honestly makes me wonder why I don't encourage myself to learn how to OC. I always thought it was very technical; maybe one day.
|
On January 09 2010 18:30 Disregard wrote: This honestly makes me wonder why dont I encourage myself to learn how to OC. I always thought it was very technical, maybe one day.
"Technical" 1. go to BIOS 2. Raise multiplier. Alternatively, raise FSB. 3. If the computer fails to boot, increase voltages a little. If it still fails to boot, increase ram voltage. 4. Repeat 2 and 3 until you can't go any higher/are unable to boot. 5. ??? 6. PROFIT!
|
Bill307
Canada9103 Posts
On January 09 2010 18:20 ghermination wrote: Actually, the physical limit our technology can reach is 12nm. Samsung has even demonstrated a transistor fabricated at the 22nm process. While i'm no physicist, at that size we run into a problem called quantum tunneling. Although i don't fully understand the process, you can look it up on wikipedia if you're actually interested. Well, that's what I get for referencing 4 year old info on current technology. 
Quantum Tunnelling, for anyone who's wondering, is basically like... you know how sometimes Dragoons miss their target, even on level ground where they're supposed to have a 100% hit rate? Well similarly, at very small scales, particles can do things they're not supposed to have enough energy to do.
If you've studied chemistry, you've probably learned about the "activation energy" needed to initiate a chemical reaction, with the corresponding energy graph that looks like a hill. Well, Quantum Tunneling also involves a hill-shaped energy graph, where a particle needs to have x amount of energy to "climb over the hill" and do something. But the particle can do that same thing even if it doesn't have enough energy, as if it "tunnels" through the hill instead of climbing over it. Hence the name.
So what does this mean for sizes smaller than 12 nm? I don't know: someone else will have to look that up or explain it.
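For anyone who wants the textbook version of the "hill": for a rectangular barrier of height V and width d, a particle with energy E < V still gets through with a probability that falls off roughly exponentially with the barrier width. This is the standard quantum mechanics approximation, nothing specific to chip design:

```latex
% Transmission probability through a rectangular barrier
% (standard textbook approximation for E < V):
T \approx e^{-2\kappa d},
\qquad
\kappa = \frac{\sqrt{2m\,(V - E)}}{\hbar}
```

Classically T would be exactly zero, but quantum mechanically it's small and nonzero, and because it depends exponentially on d, making an insulating layer (like a transistor's gate oxide) even slightly thinner makes leakage shoot up. That's the basic reason tiny feature sizes get scary.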
|
I think it has something to do with the heat output of the chips and being unable to cool them effectively.
|
United States22883 Posts
The video card market is extremely volatile; these companies just need to have enough reserves to outlast their duds (GT200). Look at all the switches that have taken place since the R300/GF4 days. Nvidia's had two bad series in a row, but they just need one card like the 8800GT to turn it around again.
|
On January 09 2010 18:36 Bill307 wrote:Show nested quote +On January 09 2010 18:20 ghermination wrote: Actually, the physical limit our technology can reach is 12nm. Samsung has even demonstrated a transistor fabricated at the 22nm process. While i'm no physicist, at that size we run into a problem called quantum tunneling. Although i don't fully understand the process, you can look it up on wikipedia if you're actually interested. Well, that's what I get for referencing 4 year old info on current technology.  Quantum Tunnelling, for anyone who's wondering, is basically like... you know how sometimes Dragoons miss their target, even on level ground where they're supposed to have a 100% hit rate? Well similarly, at very small scales, particles can do things they're not supposed to have enough energy to do. If you've studied chemistry, you've probably learned about the "activation energy" needed to initiate a chemical reaction, with the corresponding energy graph that looks like a hill. Well, Quantum Tunneling also involves a hill-shaped energy graph, where a particle needs to have x amount of energy to "climb over the hill" and do something. But the particle can do that same thing even if it doesn't have enough energy, as if it "tunnels" through the hill instead of climbing over it. Hence the name. So what does this mean for sizes smaller than 12 nm? I don't know: someone else will have to look that up or explain it.  For sizes smaller than 12nm we'll need to invent a new process to bring us into the era of "nanoelectronics" (that is, electronics on the scale of <10nm). Currently immersion lithography can only do so much, and I think if we perfect it we may be able to break the 9nm barrier, but it would be really difficult and probably won't be realized until we've all got personal robot sex slaves and space ships.
|
Maybe when I replace my parts, soon.
E6420
2GB Mushkin DDR2 800
4770 512MB
MSI P6N (Forgot which version, but it only supports up to C2D I think).
And the shitty PSU: CoolerMaster ExtremePower 550 or 600W
God the damn thing sucks, the fan makes a very loud grinding noise or something, and well, I don't even need that much power in the first place. I regret getting it a couple of years ago, and yet I still kept it for this long.
|
On January 09 2010 18:40 Jibba wrote: The video card market is extremely volatile, these companies just need to have enough reserves to outlast their duds (GT200.) Look at all the switches that have taken place since the R300/GF4 days. Nvidia's had two bad series in a row but they just need one card like the 8800GT to turn it around again.
I honestly regretted not waiting for the 8800GT or the 8800GTS 512MB instead of getting the 320MB G80. Oh well, it died on me a few months ago.
|
On January 09 2010 18:42 Disregard wrote: Maybe when I replace my parts, soon.
E6420 2GB Muskin DDR2 800 4770 512MB MSI P6N (Forgot which version, but it only supports up to C2D I think). And the shitty PSU CoolerMaster ExtremePower 550 or 600W
God the damn thing sucks, the fan makes a very loud grinding noise or something and well I dont even need so much power in the first place. I regret getting it a couple of years ago, and yet I still kept it for this long.
Get a Phenom II X2 550 BE, a cheap AM3 board, and 4GB of DDR3-1333 and you're good to go.
|
Bill307:
On January 09 2010 18:38 FragKrag wrote: I think it has something to do with the heat output of the chips and being unable to cool them effectively.
That would confirm what I'd heard about going beyond 45 nm.
|
On January 09 2010 18:48 Bill307 wrote:
On January 09 2010 18:38 FragKrag wrote: I think it has something to do with the heat output of the chips and being unable to cool them effectively. That would confirm what I'd heard about going beyond 45 nm.
Fortunately this is wrong: with the use of heatpipe coolers and 120mm fans, even extremely overclocked chips run at most 55 degrees Celsius. My i7 920 @ 4.1GHz on a CNPS10X EXTREME!!! (lol Zalman) is currently running Firefox and Folding@home at a nice 48 degrees.
|
I recently bought a couple of 4870s for a Crossfire build; they were super cheap and have been amazing thus far. In another PC I have an 8800GT that's been running for 2 years and hasn't died yet.
And @ Bill, curiously enough, a complex analysis book I read a few years back had some stuff about quantum tunnelling... pretty interesting how it can cut a route through like that.
|
I for one always believed that anything 40-ish Celsius on idle is way too hot. The temps on my PC will get hotter when the new parts are installed, especially in the summer with stock coolers.
|
On January 09 2010 18:53 Disregard wrote: I for-one always believed that anything 40ish Celsius* on idle is way too hot. The temps on my PC will get hotter when the new parts are installed, especially in the summer with stock coolers.
40 degrees is perfectly acceptable. What you need to worry about is load temps breaking 60; that will wreck a processor in less than a year, whereas it's designed to operate constantly at temperatures above 40 degrees.
|
Doesn't the problem with smaller processes include the fact that the semiconductor will still conduct even when it's switched off, or something? I remember reading that as processes shrink, the distance between transistors becomes smaller and current leakage occurs.
not sure if what I said is right though
|
On January 09 2010 19:00 FragKrag wrote: Doesn't the problem with smaller processes include the fact that the semiconductor will still conduct even if it's turned off or something? I remember reading something on how as the processes become smaller, the distance between transistors becomes smaller and then volt leakages occur.
not sure if what I said is right though
It doesn't seem to be a problem they've run into so far. Other than the aforementioned quantum tunneling, I'm fairly sure they've fixed problems like this.
|
Yeah, the 5870 is great, but NOBODY CAN GET THEIR HANDS on one.
|
Lol, one bad year and people claim that Nvidia, a company worth the same as AMD (and that includes all of ATI), will go under. Sheesh, if that were true then AMD, which has been in debt since 2006, pretty much would have burned to the ground by now.
|
If ATI has 8% of the market, then where does the other 92% go?
|
On January 09 2010 19:06 Garnet wrote: If ATI has 8% of the market, then where is the other 92% go to.
8% is far too low for ATi imo
Most of the market should actually be in Intel's hands with their integrated gpus. (I think)
|
On January 09 2010 19:08 FragKrag wrote:
On January 09 2010 19:06 Garnet wrote: If ATI has 8% of the market, then where is the other 92% go to.
8% is far too low for ATi imo. Most of the market should actually be in Intel's hands with their integrated gpus. (I think)
All victims of pre-built laptops/netbooks and PCs.
edit: Well, building your own netbook or laptop... eh
|
T.O.P.:
Klogan meant that ATI increased their market share by 8%.
|
And no, I don't know why I did this.
|
Well written and very interesting OP, thanks!
|
I think you're reading too much into it. Nvidia and ATI have been going at it for years, with the advantage swinging to both sides.
Back in the beginning Nvidia was the best, no question about it; if you were a serious gamer you owned an Nvidia card and that was it.
Then ATI's 9xxx cards completely >>> Nvidia's Geforce FX series.
Fast forward a bit and we have the Geforce 8 series blowing the HD2xxx series out of the water.
HD3xxx vs Geforce 8/9 was pretty even imo. I bought a HD3850 over the 8800GT or 9600GT for the better price/performance ratio at the time.
Now we have the HD4xxx beating the Geforce 2xx series. This seems pretty normal imo; it's like how the metagame in Starcraft swings back and forth but ultimately the game is balanced.
|
IMHO nVidia is far from failing. The graphics market has always seen the biggest competition races in terms of technology: with every new generation it's either the red team or the green team winning the race. I don't even want to compare the situation nVidia is in ATM to any situation ATI has seen before; it's just different. ATI used to have graphics and later chipsets. nVidia has a more diverse product portfolio: graphics, HPC with Tesla, mobile graphics, the ARM-based mobile platform Tegra, chipsets (ION). And if you ask me they are secretly working on some x86 too. Fermi has been in development for a good while now, but it's also much more than "yet another graphics architecture". From the increase in double-precision performance alone, and the use of ECC RAM, you can see that nVidia is aiming to make a serious impact on the HPC scene. Fermi also seems to rock in OpenCL. As soon as Adobe products start to make broad use of OpenCL, everyone, even the non-gamers, will be buying top-of-the-line GPUs for their PCs. I agree that nVidia needs to get its act together, but they still have enough time.
|
There are many things happening beyond what is obvious in the graphics market. nVidia is afraid of the CPU-GPU fusion happening behind their back, leaving them out in the cold as AMD and Intel make them redundant. They are not 100% focused on 3d graphics and games any more.
Bill307: to simplify a little, a wafer contains roughly the same number of defects regardless of die size, so the bigger the die, the larger the fraction of chips affected. This is why GPUs are die harvested a lot: there are not many fully functional dies, but many partially functional ones. A defect is a lone transistor that doesn't work or an interconnect that is cut somewhere. If defects don't compromise the whole chip, you can work around them and make yourself a lot of money.
"Standard" CMOS will not scale beyond 22nm unless it is really necessary, such as there being a lack of breakthroughs in other fields. Silicon's days are numbered. Defect density will be higher, requiring smaller die sizes as well as more die harvesting, and clock speeds will start to decrease instead of increase.
|
Nice OP. Well, as a manager you have to make decisions, like investing in a new way of manufacturing graphics cards. Just imagine the 40nm technology yielding graphics cards so much faster than anything ATI can produce at 45nm. Boom, you gotta take risks. If James Cameron had said "oh fuck, Avatar is getting too expensive, forget it", probably everyone would have agreed. Now look at the success of the film.
|
Charlie Demerjian is a bit too much of a firebrand for some people to take him seriously. He is frequently accurate (for a rumor monger) and Fermi is definitely in trouble. But woe is you if you link to his articles in certain circles D:
|
Yes, the 200 series is the reason why Nvidia is falling behind.
I'm perhaps a bit late to say this, but that graph just shows that Nvidia's stock price fell in the recession, like everyone else's stock price. Compare and contrast with ATI/AMD (you'll find a similar price slump circa January 2009 for almost everybody except the oil companies).
|
Afaik, 3dfx died because they did not implement 32-bit color or something, and they had problems at higher resolutions, even though their chips were very fast and had good image quality.
Great OP. I am also a few years late on the topic and plan to buy a desktop PC in a year or so; good to know that I will buy an ATi card again.
|
First of all, ATI and Nvidia are positioned differently in the market: ATI produces their own cards with their own graphics chipset, while Nvidia only supplies the graphics chipset.
From my own experience, since Nvidia released the Riva TNT2, ATI was totally out. I've had very bad experiences with ATI chipsets ever since the Radeon: OpenGL not supported when it's supposed to be, compatibility issues on Unix platforms, and the list goes on...
Nvidia chipsets have always given me good performance for a reasonable price. AMD + ASUS + NVIDIA is my personal combo.
|
I never really knew how it's pronounced, ASUS. I always said it as A-sus, but it's really pronounced in the Latin form.
|
2 things to remember: Intel vs AMD = no big deal. For the last couple of years processors didn't really get any faster (after the P4), because the materials used in their production process just won't allow it. That's why we have seen the dawn of dual-, quad- and more-cores. Still, it doesn't make your computer all that much faster, because software is not designed with multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal; the goal is to design software so that every single task is processed by ALL cores at the same time).
nVidia vs ATI = no big deal either. ATI is trying to grab the market with their blazing fast cards that support stuff that's not out yet (which still gives nVidia time to counter them). Also, you forget that there are people who don't give a shit about a card supporting DX11 or not (OSX, Linux, Unix users). But all this aside, there's also another side to each of these companies, and that's support. I would never pick an ATI card over an nVidia one, for the sole fact that with nVidia I don't have to worry about compatibility issues most of the time (yes, I'm a Linux user now) and I get new, improved drivers released more frequently, making my card work better for a longer period of time.
For me it'll always be the Intel + nVidia combo, even if I have to overpay for it. Reliability >>> price.
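Manit0u's point about software design is the difference between task parallelism (each core doing a different job) and data parallelism (all cores working on slices of the same job). A minimal sketch of the latter in Python (the function names are mine; threads stand in for cores so the example runs anywhere, though for CPU-bound Python work you'd swap in ProcessPoolExecutor because of the GIL):

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(xs, n):
    """Split xs into up to n roughly equal slices, one per worker."""
    size = max(1, (len(xs) + n - 1) // n)
    return [xs[i:i + size] for i in range(0, len(xs), size)]

def parallel_sum(xs, workers=4):
    """Data-parallel sum: every worker processes a slice of the SAME
    task, and the partial results are combined at the end."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, chunked(xs, workers))  # map: one slice per worker
    return sum(partials)  # reduce: combine the per-worker partial sums
```

`parallel_sum(list(range(1000)))` gives the same answer as the serial sum; the speedup only appears when the per-slice work is heavy enough to amortize the coordination overhead, which is exactly the hard part of "every task on all cores" that Manit0u is gesturing at.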
|
Very informative OP. One of the only threads on TL where I have read every response.
|
There is no more ATI; ATI was sold to AMD and now AMD handles all of the graphics chipsets. ATI/AMD have stopped fabricating their own chips and have handed it off to third-party companies. This is a cycle: just 1-2 years ago, when nVidia was running their GT series (8800GT, 8800GTS), ATI/AMD was getting run into the ground and was extremely close to closing shop (I know this because my friend worked for ATI), but AMD bailed them out. Once that merger was complete there was a lot of disorganization when it came to their graphics cards. Drivers have always been ATI/AMD's Achilles' heel; they always had inferior drivers/software to nvidia, and IMO they still do. Last year ATI/AMD was able to gain market share because their mid-range cards were good, but their high-end cards, the ones only a few people could afford, were getting beaten down by nVidia's high-end cards.
This is how the computer industry works: one company gets ahead, then the other company drops. This is how it should be; there should never be just one company at the top, because then there is no competition. Competition is always good for the consumer.
|
On January 09 2010 22:40 Manit0u wrote: 2 things to remember: Intel vs AMD = no big deal. For a couple last years processors didn't really get any faster (after the P4) because the materials used in their production process just won't allow that. That's why we have seen a dawn of dual-, quad- and more-cores. Still it doesn't make your computer all that much faster, just because software is not designed in the multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal, the goal here is to design software in the way that every single task is processed by ALL cores at the same time).
nVidia vs ATI = no big deal either. ATI is trying to grab the market with their blazing fast cards that support stuff that's not out yet (which still gives nVidia time to counter them). Also you forget that there are people who don't give a shit about gcard supporting DX11 or not (OSX, Linux, Unix users). But all this aside, there's also another side to each of this companies, and that's support. I would never ever pick ATI card over nVidia one just for the sole fact that with nVidia I don't have to worry with compatibility issues most of the time (yes, I'm a Linux user now) and I am getting new, improved drivers released more frequently thus making my card work better for a longer period of time.
For me it'll always be Intel + nVidia combo. Even if I have to overpay for it. Reliability >>> price.
I don't use Linux myself, but hasn't the Linux driver support become much better since AMD bought ATi? At least that's what I was told by the people I know who use Linux.
I can't judge how good it really is on Linux, but at the moment there is pretty much no good reason to buy nvidia cards; ATi is ahead everywhere: performance-wise, price-wise, heat-wise, consumption-wise. That being said, I really hope nvidia's upcoming generation of GPUs performs well, because having two (or preferably more) strong competitors is much, much better from a consumer's perspective. I can't say that I'm really worried though.
Regarding AMD vs Intel: for gaming the Phenom II X4s are up there and much cheaper; in most other fields Intel is ahead though (especially in mobile CPUs). Intel's market share is huge, especially in prebuilt computers (partially due to their market practices in the past tho...). Here's hoping AMD picks it up one day :<
|
To be honest, I don't have much knowledge about graphics cards, but I recognized the brand name "3dfx" from a long time ago, which made me wonder what had happened to it. This was a very informative post, and I can't believe I read through it all lol
|
I once owned a "Diamond Monster 3D" (3DFX Voodoo Chipset). One of the first "3d" cards to come out.
Best card I ever had (relatively speaking)
|
On January 10 2010 00:10 Carnac wrote:
I don't use Linux myself, but hasn't the Linux driver support become much better since AMD bought ATi? At least that's what I was told by the people I know who use Linux.
nVidia is still a ways ahead of AMD in the Linux graphics driver department, but yes, they've improved a lot
|
This is really interesting; I didn't know much about any of this before the post =]
|
Isn't the Adobe Mercury Playback Engine currently only designed for nvidia cards?
Someone in the know please help me out.
|
Very informative OP!
I haven't been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia hasn't released a new one since.
I remember reading that the 4750 performs much better than the 280 in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago.
|
On January 10 2010 01:34 Highways wrote: Very informative OP!
I havnt been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia havnt released a new one since.
I remember reading that the 4750's performs much better than the 280's in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago.
Sorry, but that's just dumb. I understand being a fanboy of a team, sportsman, artist, ..., but of a piece of hardware? Why pay more for less?
|
ATI's Linux driver is supposed to be open-sourced now, so gradually we can expect good things to come out of it.
But Nvidia actually has a Linux driver division lol...
Who knows? I'd say the AMD/ATI merger did a lot of good for both companies. 2009 has been a very good year for them indeed. They were down but came out punching.
NVidia will have to step up their game. I think they grew a bit complacent during the ATI suckage and didn't expect ATI to last as long as it did. Think about it: a failing company, AMD, buying another failing company...
NVidia is still in a great position. They will just float along until they release another product... who cares, in the cut-throat world of GPUs LOL.
|
On January 10 2010 01:44 Carnac wrote:
On January 10 2010 01:34 Highways wrote: Very informative OP!
I havnt been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia havnt released a new one since.
I remember reading that the 4750's performs much better than the 280's in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago. sorry, but that's just dumb I understand being a fanboy of a team, sportsman, artist, ..., but of a piece of hardware? why pay more for less?
Blasphemy!
|
Wow, a 2% success rate? How is that even possible?
Did TSMC fire all their staff and get untested robots to make the cards, or something?
|
That was a pretty good summary, Ghermination.
I also think that Nvidia is in a bad situation. However, I don't think they will die like 3dfx; they will evolve like Matrox and provide cards for professional users (CUDA etc.) and maybe cheap integrated GPUs, if they don't get f***** by Intel and AMD.
On the other hand, ATI/AMD will basically have a monopoly on mid-to-high-level gaming cards. With the failure of the first-generation Larrabee, Intel shouldn't be a threat for a while.
I think it sucks, because monopolies are never good for consumers and because, eh, I loved my GeForce 256
|
On January 10 2010 02:19 Boblion wrote:
That was a pretty good summary Ghermination. I also think that Nvidia is in a bad situation. However i don't think they will die like 3dfx but will evolve like Matrox and provide cards for professional users ( CUDA etc .... ) and maybe cheap integrated GPU if they don't get f***** by Intel and AMD. On the other hand ATI/AMD will basicly have a monopoly for mid-high level gaming cards. With the failure of the first generation Larrabee Intel shouldn't be a threat for a while. I think it sucks because monopoly are never good for the consumers and because eh i loved my GeForce 256
What?
Doesn't Nvidia still control most of the market?
I think that if they keep failing and just rebranding their cards over and over their market share will continue to fall, but even in the worst case scenario it would take years for Nvidia to be considered as being out of the market.
|
On January 10 2010 02:26 CrimsonLotus wrote:
On January 10 2010 02:19 Boblion wrote: That was a pretty good summary Ghermination. I also think that Nvidia is in a bad situation. However i don't think they will die like 3dfx but will evolve like Matrox and provide cards for professional users ( CUDA etc .... ) and maybe cheap integrated GPU if they don't get f***** by Intel and AMD. On the other hand ATI/AMD will basicly have a monopoly for mid-high level gaming cards. With the failure of the first generation Larrabee Intel shouldn't be a threat for a while. I think it sucks because monopoly are never good for the consumers and because eh i loved my GeForce 256
What? Doesn't Nvidia still control most of the market? I think that if they keep failing and just rebranding their cards over and over their market share will continue to fall, but even in the worst case scenario it would take years for Nvidia to be considered as being out of the market.
You are so naive. I don't know their exact market share atm, but I'm pretty sure it is already falling, and it's all about deals with OEM manufacturers and shitty integrated GPUs. Problem is, that won't last forever. ATI CLEARLY has the best products atm, and I don't know why OEM manufacturers would keep buying Nvidia components when ATI has cheaper and better GPUs.
Also, you have to remember that Nvidia stopped manufacturing most of its 2xx cards and now relies only on shitty rebranded stuff + integrated GPUs until the Fermi release. Oh, and the integrated market is highly dependent on Intel's and AMD's good will (I hope you understand why). Anyway, that's not where the profit-per-card ratio matters most; ATI probably makes more money selling one 5870 than 10+ shitty integrated cards.
On the mainstream GPU market a firm can die if it fails one generation of cards, because it is just too competitive. The Voodoo 4 and 5 failed -> bye 3dfx. The Parhelia 512 failed -> bye Matrox. And actually I'm pretty sure that if AMD hadn't bought ATI, they might have died too.
|
On January 09 2010 22:40 Manit0u wrote: 2 things to remember: Intel vs AMD = no big deal. For a couple last years processors didn't really get any faster (after the P4) because the materials used in their production process just won't allow that. That's why we have seen a dawn of dual-, quad- and more-cores. Still it doesn't make your computer all that much faster, just because software is not designed in the multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal, the goal here is to design software in the way that every single task is processed by ALL cores at the same time).
nVidia vs ATI = no big deal either. ATI is trying to grab the market with their blazing fast cards that support stuff that's not out yet (which still gives nVidia time to counter them). Also you forget that there are people who don't give a shit about gcard supporting DX11 or not (OSX, Linux, Unix users). But all this aside, there's also another side to each of this companies, and that's support. I would never ever pick ATI card over nVidia one just for the sole fact that with nVidia I don't have to worry with compatibility issues most of the time (yes, I'm a Linux user now) and I am getting new, improved drivers released more frequently thus making my card work better for a longer period of time.
For me it'll always be Intel + nVidia combo. Even if I have to overpay for it. Reliability >>> price.
You are incorrect about single-core processors not being able to increase in speed. The Sempron 140, made from a faulty Athlon II 4400e die, is only a single-core processor, but it can easily outpace any single-core from the past. Hafnium still has a long way to go. I mean, have you ever seen an x-ray of a processor die? We still haven't gained the ability to use 100% of the hafnium we print on, or the ability to do it in a perfect square (the transistors in a die look a lot like a pancake: round, with messed-up edges). I'm sure we could fit a lot more of them in if we developed new ways of printing or improved the already-aging immersion lithography methods.
|
On January 09 2010 21:21 Djin)ftw( wrote: nice op well as a manager u have to make decisions, like invest in a new way of manufacturing graphic cards. just imagine the 40nm technology yields graphic cards which are so much faster than anything ATI can produce with the 45nm technology. boom you gotta take risks. if james cameron would have said "oh fuck avatar gets too expensive, forget it" probably everyone would have agreed. now look at the success of the film.
Also, ATI is currently producing 40nm dies rather more successfully than Nvidia, because all they've had to cope with is a die shrink. I believe their yields are around 30-40%, which is pretty bad and is why there aren't very many 5-series cards on the market. Also, TSMC isn't the only producer capable of manufacturing these 40nm GPUs; iirc Samsung has been contracted to do it many times in the past, and is also currently working on a 22nm process.
|
On January 09 2010 19:08 FragKrag wrote:
On January 09 2010 19:06 Garnet wrote: If ATI has 8% of the market, then where is the other 92% go to.
8% is far too low for ATi imo. Most of the market should actually be in Intel's hands with their integrated gpus. (I think)
Intel holds like 50% thanks to selling most of their CPUs bundled with their (shitty for games) integrated graphics; nvidia then holds around 30% and ATI holds the last 20%. These are very rough numbers: Intel's is more like 49, nvidia is closer to 31, ATI is closer to about 18, and the rest is other cards like Matrox, last time I checked, which was around the start of 2009. But when you look at these numbers you should note that most of nvidia's and ATI's market share comes from OEM deals, i.e. laptops and other integrated GPUs; very few cards are sold as those monster flagship cards.
|
On January 10 2010 01:34 Highways wrote: Very informative OP!
I havnt been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia havnt released a new one since.
I remember reading that the 4750's performs much better than the 280's in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago.
That's just wrong, as GTA4's playability is based on the memory available to the card, and a 280 has more. Also, one game doesn't mean shit; there are plenty of biased games. The HL engine, which was developed very closely with ATI, runs better on ATI cards, no surprise. The OP also forgot to mention that the so-called dirty practice of "the way it's meant to be played" is done by both companies; it's a long-held practice that both give game companies millions worth of free engineering work. The only difference is that nvidia does this better than ATI because they are bigger. They also have a marketing team, so they came up with TWIMTBP and stuff like that. Remember, Intel and Nvidia have advertisements in stores and whatnot, while AMD and ATI only put adverts on websites whose visitors usually already know AMD and ATI, which is stupid.
And about the OP part where Intel and Nvidia are buddies??? Lol, that's damn wrong; they've been fighting over license agreements for years now, and nvidia was lucky enough to get it into court after the FTC decided to investigate Intel's practices, since their counterparts in Asia and the EU already had (although I find both those cases very shaky, and the EU's case, like the one against Microsoft, just plain stupid, or at least the reporting on it made it sound like the head of the investigation had nothing between his ears).
|
How big is Nvidia in the shitmobile stuff for mobile computers and integrated graphics (like shared virtual memory)? When I think about it, that has to be where a lot of the money comes in: the big contracts with Dell and HP and whatnot. (Maybe just Dell.)
On January 09 2010 17:38 Klogon wrote: Even though ATI gained 8% last year, doesn't Nvidia still have the majority of the market share? I mean, sure the top of the line cards are cool and one side may outsell the other in that race, but the real money is made in the mid-level cards, aren't they? Don't Macs and Dell sell exclusively Nvidia? I'd say they're doing fine, but I honestly have no idea.
I'm rocking a Macbook from like Q2 2008 from before the switch to the solid bodies and then back to the white book and it has an Intel chipset for graphics.
Which leads to my next question, on motherboards with integrated graphics cards and shared memory for video stuff are the CPUs doing the computation or do they still have separate GPUs?
|
On January 10 2010 03:50 ghermination wrote:
On January 09 2010 21:21 Djin)ftw( wrote: nice op well as a manager u have to make decisions, like invest in a new way of manufacturing graphic cards. just imagine the 40nm technology yields graphic cards which are so much faster than anything ATI can produce with the 45nm technology. boom you gotta take risks. if james cameron would have said "oh fuck avatar gets too expensive, forget it" probably everyone would have agreed. now look at the success of the film.
Also, ATI is currently producing 40nm dies fairly more successfully than Nvidia, because all they've had to cope with is a die shrink. I believe their numbers are around 30-40%, which is pretty bad and why there aren't very many 5 series cards in the market. Also TSMC isn't the only producer capable of manufacturing these 40nm gpu's, iirc Samsung has been contracted to do it many times in the past, and is also currently working on a 22nm process.
The numbers are much higher than that; that reporting was months ago! AMD has been able to get up to 70%, but by this time they should be at 99%; the yields are just bad. Nvidia has been able to get up to 40%, last I heard; they too should be at like 99% by now. In other words, 40nm has been a bitch to tame.
|
On January 10 2010 04:13 Virtue wrote:
On January 09 2010 19:08 FragKrag wrote:
On January 09 2010 19:06 Garnet wrote: If ATI has 8% of the market, then where is the other 92% go to.
8% is far too low for ATi imo Most of the market should actually be in Intel's hands with their integrated gpus. (I think)
very few cards are sold as those monster flagship cards.
But you make more money with one "monster" than with 10 shitty integrated cards.
|
On January 09 2010 18:57 ghermination wrote:
On January 09 2010 18:53 Disregard wrote: I for-one always believed that anything 40ish Celsius on idle is way too hot. The temps on my PC will get hotter when the new parts are installed, especially in the summer with stock coolers.
40 degrees is perfectly acceptable. What you need to worry about is load temps breaking 60. That will wreck a processor in less than a year, whereas it's designed to operate constantly at temperatures higher than 40 degrees.
That's not true, and depends a lot on the processor. For example, a lot of Intel mobile chips are designed to run at 70-100 C and they do that fine (for a mobile chip, you've got a confined space and don't want to waste battery on fans).
The newer core chips from intel run super cool though. This is part of why they're so overclockable.
On January 09 2010 22:40 Manit0u wrote: 2 things to remember: Intel vs AMD = no big deal. For a couple last years processors didn't really get any faster (after the P4) because the materials used in their production process just won't allow that. That's why we have seen a dawn of dual-, quad- and more-cores. Still it doesn't make your computer all that much faster, just because software is not designed in the multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal, the goal here is to design software in the way that every single task is processed by ALL cores at the same time).
Not true at all; core for core, an i7 has probably 8x the computational power of a Pentium 4. This is because of more cache, multiple issue, pipelining, hyperthreading, etc.
On January 10 2010 02:19 Boblion wrote: That was a pretty good summary Ghermination.
I also think that Nvidia is in a bad situation. However i don't think they will die like 3dfx but will evolve like Matrox and provide cards for professional users ( CUDA etc .... ) and maybe cheap integrated GPU if they don't get f***** by Intel and AMD.
On the other hand ATI/AMD will basicly have a monopoly for mid-high level gaming cards. With the failure of the first generation Larrabee Intel shouldn't be a threat for a while.
This seems premature to me. Sure, the 2xx series hasn't been great for nVidia, but they've dealt with this before (5xxx FX series). nVidia still has high market share, and ATI/AMD aren't exactly on great financial ground either. Plus, there are still reasons people purchase nVidia cards (purevideo, cuda/opencl, physx, possibly better drivers, linux, etc).
|
On January 10 2010 04:40 Boblion wrote:
On January 10 2010 04:13 Virtue wrote:
On January 09 2010 19:08 FragKrag wrote:
On January 09 2010 19:06 Garnet wrote: If ATI has 8% of the market, then where does the other 92% go? 8% is far too low for ATI imo
Most of the market should actually be in Intel's hands with their integrated GPUs. (I think)
Very few cards are sold as those monster flagship cards.
But they are the most profitable.
I'm pretty sure it's the opposite: Nvidia was selling their cards at near nothing, like the GTX260, while ATI cut their profits thin to undercut Nvidia's original pricing.
I believe the flagship cards are the companies' way of advertising in the community and a form of development for the lower-end cards, and that's why the companies make them.
Both companies make most of their money in the sub-$75 range; most of those GPUs are sold through OEM deals etc., because those cards cost near nothing to make but sell in much greater volume.
Tegra and ION have been the most profitable things for Nvidia this year, which is why Nvidia wants back into the chipset business; the chips people think little of are the most profitable to make.
|
I don't know enough about graphics cards to be totally conclusive about it, but it seems like ATI is just owning Nvidia in most price brackets... The 4650 and 4670 have an amazing price/performance ratio and are way better than the GeForce 9400/9500, and at the high end the 5850-5970 seems like a way better deal than the GTX275-295. I'm not 100% sure about the midrange, but it seems like the cards are about equal, and ATI's are always like $40-60 cheaper for about the same performance, like the 5770 vs the GTX260.
|
nVidia had a total public relations disaster last year when their mobile GPU manufacturing process came out flawed. They pointed fingers at just about everyone, including blaming their customers, the laptop assembly companies. They didn't make many friends during that episode, so they're losing market share because of it. ATI's Linux drivers still suck: there is almost no documentation, and it's hard to follow what is going on, especially when a problem arises. In general, ATI's non-Windows software package is bad.
In the end, reliability and availability count more than benchmarks for most people. It explains why many companies stuck with Intel even when it was going through its P4 performance dip and AMD had the server computing advantage with the Opteron.
nVidia's manufacturing flaw, its PR fiasco, and the lack of general availability of any new hardware on both sides just mean that everyone is competing on the last generation of GPUs, and ATI's 4870 beats nVidia's GTX285 on price-performance.
|
On January 09 2010 17:44 R1CH wrote: It's very telling that NVIDIA hasn't released a new product since the GT200 series in mid-2008. Their previous cycle was a new product every six months, now they are just constantly rehashing the GT200. If their next card fails I think it will have a very serious impact on their ability to remain competitive with ATI finally becoming a serious contender.
And at that point people can start to talk about them doing badly over the next few years. The graphics market is not the same as it used to be.
This is a fun and interesting story, but anyone who draws the conclusion that a company as huge and successful (in the long term, and somewhat in the short term) as nVidia is gonna pull a 3dfx is just fooling themselves.
The thing about being in a market of two competitors and being the top competitor is this: it's actually not a bad strategy to sit on your ass and have people send you their cash. You do little work (less investment in risky/newer technology) and still make approximately the same money in the short term. When your competition looks like it might be accomplishing something, you wake up from naptime and have a go at it again. This is simply a case of not enough competition coming from ATI.
|
On January 10 2010 05:31 TanGeng wrote: In the end, reliability and availability count more than benchmarks for most people. It explains why many companies stuck with Intel even when it was going through its P4 performance dip and AMD had the server computing advantage with the Opteron.
They stuck with Intel because of its anti-competitive practices. I doubt Intel has been fined $1+ billion for nothing -.-
|
AMD was really close to going under itself about 2 years ago. If it weren't for Hector Ruiz's deal with ATIC (Advanced Technology Investment Company) of Abu Dhabi and the Asset Light model, AMD would probably have gone bankrupt.
AMD collaborated with ATIC to create a company called GlobalFoundries, which now owns the Dresden fab that AMD used to run. AMD got an infusion of cash while still maintaining an approximately 50% stake in GF. AMD then turned into a pure design company, just like Nvidia is and ATI used to be.
The reduced cost of not having to pay for the fabs anymore allowed AMD to get somewhat back on its feet and buy itself more time until it can release a competitive product.
It's no secret that it's the CPU division of AMD that is pulling in the big cash. Even with the graphics division doing so well lately, it's not pulling in more than maybe a couple hundred million dollars per quarter.
AMD still has a huge debt, around 4-5 billion dollars, and it really needs to release a killer product if it wants to stay in the game.
A lot hinges on the upcoming release of "Bulldozer", due in the first half of 2011.
|
On January 10 2010 06:44 ruXxar wrote: AMD was really close to going under itself about 2 years ago. If it weren't for Hector Ruiz's deal with ATIC (Advanced Technology Investment Company) of Abu Dhabi and the Asset Light model, AMD would probably have gone bankrupt.
AMD collaborated with ATIC to create a company called GlobalFoundries, which now owns the Dresden fab that AMD used to run. AMD got an infusion of cash while still maintaining an approximately 50% stake in GF. AMD then turned into a pure design company, just like Nvidia is and ATI used to be.
The reduced cost of not having to pay for the fabs anymore allowed AMD to get somewhat back on its feet and buy itself more time until it can release a competitive product.
It's no secret that it's the CPU division of AMD that is pulling in the big cash. Even with the graphics division doing so well lately, it's not pulling in more than maybe a couple hundred million dollars per quarter.
AMD still has a huge debt, around 4-5 billion dollars, and it really needs to release a killer product if it wants to stay in the game.
A lot hinges on the upcoming release of "Bulldozer", due in the first half of 2011.
Hexa-core 32nm parts ftw. I wonder how long it will be until we see 4GHz stock clocks.
|
Thanks for this OP. Really interesting read. I don't know enough to comment, but it will be interesting to see how things go. I used to be a hardcore Nvidia user, but yeah, ATI has really stepped up in the last few generations. My next gfx card will probably be an ATI 5xxx.
|
Intel isn't going to be buying nVid soon. Not after nVid said, many years ago, that the CPU would be phased out by a unified GPU...
|
On January 10 2010 08:01 peidongyang wrote: Intel isn't going to be buying nVid soon. Not after nVid said one day many years ago that the CPU will be phased out by a unified GPU...
The funny part (not for nVidia) is that it's now the other way around. Intel, AMD, and ATI have all put significant money into integrating the GPU into the CPU rather than the other way around, and Intel just released new processors which do exactly that. I don't really know much about how they run and how they overclock, but I imagine we'll eventually have CPUs with integrated GPUs as powerful as today's discrete cards, making for something like a 250W+ processor on a 2500-pin socket, but reducing the total size of computers by a lot. Unfortunately, nVidia hasn't put any money at all into CPU/GPU integration, so they'll probably be left in the dust in that respect. I don't think we'll see the company outright fail, but if they somehow fail to release Fermi, I don't think they'll ever completely recover.
|
Man, this is so cut-throat; one product can change your luck completely. I wouldn't want to be working at these companies. Every day is like a rollercoaster.
|
Oh, I almost forgot to add one important thing to the discussion: PhysX. ATI can't have it = it sucks for them (compare something with and without the PhysX engine, like Mirror's Edge, and the difference is horrendous).
|
On January 10 2010 17:59 Manit0u wrote: Oh, I almost forgot to add one important thing to the discussion: PhysX. ATI can't have it = it sucks for them (compare something with and without the PhysX engine, like Mirror's Edge, and the difference is horrendous).
Dude, PhysX is just a gimmick. There are other physics systems that are more easily implemented and either open source or cheaper to license. Nvidia is definitely trying to sell PhysX as some sort of gaming revolution when in reality it has very little to do with anything.
|
I read the OP and about half of the posts... Really nice OP. Could you make a blog next time instead so it's easier to keep track of your posts (if you intend to make more threads like these)?
|
nVidia is a dirty company. They created the 9800 series to trick consumers: the 9800 is nearly identical to the 8800, with maybe a 10% overclock, so basically 8800 OC editions. I think the early 9800 cards were even labelled 8800 in the BIOS.
The Way It's Meant To Be Played is also a dirty scam. They paid to have Assassin's Creed on PC remove DX10.1 support after the game had already come out, because nVidia cards didn't have it but ATI cards did, and it made the game run better on ATI cards.
PhysX is also a bit of a scam. That stuff can be done using non-proprietary code, but nVidia pays TWIMTBP partners to use PhysX exclusively so they can claim their cards can do something that ATI cards can't.
Basically, nVidia is evil. Instead of just doing their best to be better, they pay game developers/publishers to give them an unfair advantage against ATI. Their price/performance ratio has been worse than ATI's ever since the HD4800 series came out, and when the GT300 series finally comes out this spring, it'll be overpriced for the performance yet again, since the massive die size and production yield problems will make the chips way more expensive to produce than the HD5800 series. Not only will the GT300 be 6 months late, it will also be insanely priced. nVidia is going to have a tough year.
|
On January 10 2010 18:10 Zzoram wrote:
The Way It's Meant To Be Played is also a dirty scam. They paid to have Assassin's Creed on PC remove DX10.1 support after the game had already come out, because nVidia cards didn't have it but ATI cards did, and it made the game run better on ATI cards.
lol I had no idea they did that. That's fucked up wow.
|
On January 11 2010 06:02 Drowsy wrote:
On January 10 2010 18:10 Zzoram wrote:
The Way It's Meant To Be Played is also a dirty scam. They paid to have Assassin's Creed on PC remove DX10.1 support after the game had already come out, because nVidia cards didn't have it but ATI cards did, and it made the game run better on ATI cards.
lol I had no idea they did that. That's fucked up, wow. A lot of people aren't aware of that TWIMTBP bullshit. For example, the framerates people experience in Borderlands are 10x smoother on nVidia cards than on ATI cards. It's almost as if the game plays differently; on ATI cards it always feels like sub-30 fps even though it's often far over that.
|
On January 10 2010 18:10 Zzoram wrote: and when the GT300 series finally comes out this spring, it'll be overpriced for the performance yet again, since the massive die size and production yield problems will make the chips way more expensive to produce than the HD5800 series. Not only will the GT300 be 6 months late, it will also be insanely priced. nVidia is going to have a tough year.
Consider this: when the GTX280 first launched, its prices ranged from $400 to $600. Its huge die (the GT200 was roughly 24x24mm, around 576mm²), with about a 60% yield rate, was rather expensive to produce, but it still delivered enough performance that nVidia was doing okay.
The GT300 die is supposedly in the same league or bigger, and with the rumored 2% yield rate we can only imagine what the prices will be. I'm expecting something around $600+ for the premium card (GTX380 or whatever), and probably between $200 and $300 for the GTX360.
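The arithmetic behind that worry is simple: your cost per working chip is the wafer cost divided by (dies per wafer × yield), so a big die gets hit twice, fewer candidates per wafer and a lower fraction of them working. A rough back-of-the-envelope sketch; the $5000 wafer price and the exact die-size/yield figures here are made-up illustrative assumptions, not nVidia's or TSMC's real numbers:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Gross die candidates per wafer: area ratio minus an edge-loss term
    for partial dies around the wafer's circumference (a common approximation)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_rate):
    """Wafer cost spread over only the dies that actually work."""
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate
    return wafer_cost / good_dies

# Hypothetical $5000 per 300mm wafer; die sizes/yields chosen for illustration.
small = cost_per_good_die(5000, 300, 180, 0.60)  # smaller die, healthy yield
big   = cost_per_good_die(5000, 300, 530, 0.20)  # huge die, poor yield
print(f"small die: ${small:.0f} per chip, big die: ${big:.0f} per chip")
```

With these made-up numbers the big low-yield die costs roughly ten times as much per working chip, which is why a ~550mm² part at single-digit yields would be brutal no matter how fast it is.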
|
Nvidia is a lot more diversified than 3dfx
|
What is this, an Nvidia hate thread? I know I'm quoting Fudzilla here, but he has a valid point: http://www.fudzilla.com/content/view/15834/1/ http://www.fudzilla.com/content/view/15794/34/
You guys forget both Nvidia and ATI practice giving away free money and free work to game devs, which improves quality a lot. Nvidia has a strong PR and marketing team, so they came up with TWIMTBP; it's also why there is a lot of Nvidia-edition crap like cases.
People think Nvidia will be like 3dfx; you mean bought out and melded into another company? But wait, that is ATI, lol. The only difference is that ATI's brand name wasn't damaged enough yet, so AMD kept it.
Nvidia is the last great public graphics card maker: Matrox is specialized and privately owned, 3dfx was bought out by Nvidia, ATI was bought out by AMD, I dunno what S3 does anymore, Creative jumped out of that business, and Intel only does IGPs.
So for Nvidia to fall would greatly hurt the graphics card business and competition; AMD would jack up the pricing of their cards in an instant.
|
On January 11 2010 07:55 Virtue wrote: What is this, an Nvidia hate thread? I know I'm quoting Fudzilla here, but he has a valid point: http://www.fudzilla.com/content/view/15834/1/ http://www.fudzilla.com/content/view/15794/34/ You guys forget both Nvidia and ATI practice giving away free money and free work to game devs, which improves quality a lot. Nvidia has a strong PR and marketing team, so they came up with TWIMTBP; it's also why there is a lot of Nvidia-edition crap like cases. People think Nvidia will be like 3dfx; you mean bought out and melded into another company? But wait, that is ATI, lol. The only difference is that ATI's brand name wasn't damaged enough yet, so AMD kept it. Nvidia is the last great public graphics card maker: Matrox is specialized and privately owned, 3dfx was bought out by Nvidia, ATI was bought out by AMD, I dunno what S3 does anymore, Creative jumped out of that business, and Intel only does IGPs. So for Nvidia to fall would greatly hurt the graphics card business and competition; AMD would jack up the pricing of their cards in an instant.
So you're saying that TWIMTBP isn't a rather unfair business practice? I'm sure being bribed to give performance advantages to Nvidia cards is great for the developers at those companies.
Also, I think you're overestimating the value of Nvidia being the "last great public graphics card maker". Nvidia still hasn't realized that within a surprisingly short amount of time they will be outdated, when you take into consideration that CPU/GPU integration is moving ahead by leaps and bounds.
The fact is that Nvidia has done great damage to itself, and even to the market, with its business practices, and if they fail and are bought out by Intel or something, it will be their own fault.
|