|
On January 09 2010 19:08 FragKrag wrote:
On January 09 2010 19:06 Garnet wrote: If ATI has 8% of the market, then where does the other 92% go? 8% is far too low for ATi imo
Most of the market should actually be in Intel's hands with their integrated GPUs. (I think)
All victims of pre-built laptops/netbooks and PCs.
edit: Well, building your own netbook or laptop... eh
|
Klogan meant that ATI increased their market share by 8%.
|
And no, I don't know why I did this.
|
Well written and very interesting OP, thanks!
|
I think you're reading too much into it. Nvidia and ATI have been going at it for years, with the advantage swinging back and forth between them.
Back in the beginning Nvidia was the best, no question about it: if you were a serious gamer you owned an Nvidia card and that was it.
Then ATI's 9xxx cards completely >>> Nvidia's Geforce FX series.
Fast forward a bit and we have the Geforce 8 series blowing the HD2xxx series out of the water.
HD3xxx vs Geforce 8/9 was pretty even imo. I bought an HD3850 over the 8800GT or 9600GT for the better price/performance ratio at the time.
Now we have the HD4xxx beating the Geforce 2xx series. This seems pretty normal imo; it's like how the metagame in Starcraft swings back and forth, but ultimately the game is balanced.
|
IMHO nVidia is far from failing. The graphics market has always seen the fiercest technology races: with every new generation it's either the red team or the green team winning. I don't even want to compare the situation nVidia is in ATM to any situation Ati has seen before. It's just different. Ati used to have graphics and later chipsets. nVidia has a more diverse product portfolio: they've got graphics, HPC with Tesla, mobile graphics, the ARM-based mobile platform Tegra, and chipsets (ION). And if you ask me, they are secretly working on some x86 too.
Fermi has been in development for a good while now. But it's also much more than 'yet another graphics architecture'. From the increase in double precision performance alone and the use of ECC RAM you can see that nVidia is aiming to make a serious impact on the HPC scene. Fermi also seems to rock in OpenCL. As soon as Adobe products start to make broad use of OpenCL, everyone, even non-gamers, will be buying top-of-the-line GPUs for their PCs. I agree that nVidia needs to get its act together. But they still have enough time.
|
There are many things happening beyond what is obvious in the graphics market. nVidia is afraid of the CPU-GPU fusion happening behind their back, leaving them out in the cold as AMD and Intel make them redundant. They are not 100% focused on 3D graphics and games any more.
Bill307: to simplify a little, a wafer contains roughly the same number of defects regardless of die size, so the bigger the die, the more likely any given die catches one. This is why GPUs are die harvested a lot: there are not many fully functional dies, but many partially functional ones. A defect can be a lone transistor that doesn't work or an interconnect that is cut somewhere. If defects don't compromise the whole chip, you can work around them and make yourself a lot of money.
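To make the yield argument concrete, here is a minimal sketch using a simple Poisson yield model; the defect density and die sizes are made-up illustrative numbers, not real foundry figures.

```python
import math

# Simple Poisson yield model: for a fixed defect density, the chance
# that a given die has zero defects falls exponentially with its area.
# All numbers below are assumed, for illustration only.
DEFECT_DENSITY = 0.5              # defects per cm^2 (assumed)
WAFER_AREA = math.pi * 15.0 ** 2  # 300 mm wafer (15 cm radius), in cm^2

for die_area in (1.0, 2.0, 4.0):  # die sizes in cm^2
    dies_per_wafer = WAFER_AREA / die_area  # ignoring edge loss
    perfect_rate = math.exp(-DEFECT_DENSITY * die_area)
    perfect = dies_per_wafer * perfect_rate
    print(f"{die_area:.0f} cm^2 die: ~{perfect:.0f} fully working "
          f"out of {dies_per_wafer:.0f} ({perfect_rate:.0%} perfect)")
```

The bigger the die, the larger the fraction that comes back only partially functional, which is exactly the gap die harvesting fills.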
"Standard" CMOS will not scale beyond 22nm unless it is really necessary, such as there being a lack of breakthroughs in other fields. Silicon's days are numbered. Defect density will be higher, requiring smaller die sizes, as well as more die harvesting, and clock-speeds will be starting to decrease instead of increase.
|
Nice OP. Well, as a manager you have to make decisions, like investing in a new way of manufacturing graphics cards. Just imagine the 40nm process yielding graphics cards which are so much faster than anything ATI can produce with the 45nm process. Boom, you gotta take risks. If James Cameron had said "oh fuck, Avatar is getting too expensive, forget it", probably everyone would have agreed. Now look at the success of the film.
|
Charlie Demerjian is a bit too much of a firebrand for some people to take him seriously. He is frequently accurate (for a rumor monger), and Fermi is definitely in trouble. But woe betide you if you link to his articles in certain circles D:
|
Yes, the 200 series is the reason why Nvidia is falling behind.
I'm perhaps a bit late to say this, but that graph just shows that Nvidia's stock price fell in the recession, like everyone else's stock price. Compare and contrast ATI/AMD (you'll find a similar slump circa January 2009 for almost everybody, except the oil companies).
|
Afaik, 3dfx died because they did not implement 32-bit color or something. And they had problems with higher resolutions, even though their cards were very fast and displayed a good image.
Great OP. I am also a few years late to the topic and plan to buy a desktop PC in a year or so; good to know that I will buy an ATi card again.
|
First of all, ATI and Nvidia are positioned differently in the market: ATI produces their own cards with their own graphics chipset, while Nvidia only makes the graphics chipset.
In my own experience, ever since Nvidia released the Riva TNT2, ATI was totally out. I have had very bad experiences with ATI chipsets since the Radeon: OpenGL not supported while it's supposed to be, compatibility issues on Unix platforms, and the list goes on...
Nvidia chipsets have always given me good performance for a reasonable price. AMD + ASUS + NVIDIA is my personal combo.
|
I never really knew how it's pronounced, ASUS. I always said it as A-sus, but it's really pronounced in the Latin form.
|
2 things to remember: Intel vs AMD = no big deal. For the last couple of years processors haven't really gotten any faster (after the P4), because the materials used in their production process just won't allow it. That's why we have seen the dawn of dual-, quad- and more-cores. Still, it doesn't make your computer all that much faster, because software is not designed with multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal; the goal is to design software so that every single task is processed by ALL cores at the same time).
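To illustrate that idea, here is a minimal Python sketch (render_tile is a hypothetical stand-in for one slice of work): instead of giving each core its own unrelated task, a single task is split into tiles so that every core processes pieces of the same task at once.

```python
from multiprocessing import Pool

def render_tile(tile_id: int) -> int:
    # Hypothetical stand-in for one slice of a single task,
    # e.g. one tile of a frame being rendered.
    start, stop = tile_id * 100_000, (tile_id + 1) * 100_000
    return sum(i * i for i in range(start, stop))

if __name__ == "__main__":
    # Task parallelism would hand whole, unrelated tasks to different
    # cores; here ONE task is cut into 32 tiles so all cores work on
    # pieces of the same task simultaneously (data parallelism).
    with Pool() as pool:
        total = sum(pool.map(render_tile, range(32)))
    print(total)
```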
nVidia vs ATI = no big deal either. ATI is trying to grab the market with their blazing fast cards that support stuff that's not out yet (which still gives nVidia time to counter them). Also you forget that there are people who don't give a shit about a graphics card supporting DX11 or not (OSX, Linux, Unix users). But all this aside, there's also another side to each of these companies, and that's support. I would never ever pick an ATI card over an nVidia one, for the sole fact that with nVidia I don't have to worry about compatibility issues most of the time (yes, I'm a Linux user now) and I get new, improved drivers released more frequently, making my card work better for a longer period of time.
For me it'll always be the Intel + nVidia combo. Even if I have to overpay for it. Reliability >>> price.
|
Very informative OP. One of the only threads on TL where i have read every response
|
There is no more ATI; ATI was sold to AMD, and now AMD handles all of the graphics chipsets. ATI/AMD have stopped fabricating their own chips and have handed production off to third-party companies. This is a cycle: just 1-2 years ago, when nVidia was running their GT series (8800GT, 8800GTS), ATI/AMD was getting run into the ground and was extremely close to closing shop (I know this because my friend worked for ATI), but AMD bailed them out. Then once that merger was complete there was a lot of disorganization when it came to their graphics cards. Drivers have always been ATI/AMD's Achilles' heel; they have always had inferior drivers/software compared to Nvidia, and IMO they still do. Last year ATI/AMD was able to gain ground in market share because their mid-range cards were good, but their high-end cards, the ones only a few people could afford, were getting beaten down by nVidia's high-end cards.
This is how the computer industry works: one company gets ahead, then the other drops, and so on. This is how it should be; there should never be just one company at the top, because then there is no competition. Competition is always good for the consumer.
|
On January 09 2010 22:40 Manit0u wrote: 2 things to remember: Intel vs AMD = no big deal. For the last couple of years processors haven't really gotten any faster (after the P4), because the materials used in their production process just won't allow it. That's why we have seen the dawn of dual-, quad- and more-cores. Still, it doesn't make your computer all that much faster, because software is not designed with multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal; the goal is to design software so that every single task is processed by ALL cores at the same time).
nVidia vs ATI = no big deal either. ATI is trying to grab the market with their blazing fast cards that support stuff that's not out yet (which still gives nVidia time to counter them). Also you forget that there are people who don't give a shit about a graphics card supporting DX11 or not (OSX, Linux, Unix users). But all this aside, there's also another side to each of these companies, and that's support. I would never ever pick an ATI card over an nVidia one, for the sole fact that with nVidia I don't have to worry about compatibility issues most of the time (yes, I'm a Linux user now) and I get new, improved drivers released more frequently, making my card work better for a longer period of time.
For me it'll always be the Intel + nVidia combo. Even if I have to overpay for it. Reliability >>> price.
I don't use Linux myself, but hasn't the Linux driver support become much better since AMD bought ATi? At least that's what I was told by the people I know who use Linux.
I can't judge how good it really is in Linux, but at the moment there is pretty much no good reason to buy nvidia cards. ATi is ahead everywhere: performance-wise, price-wise, heat-wise, power-consumption-wise. That being said, I really hope nvidia's upcoming generation of GPUs performs well, because having two (or preferably more) strong competitors is much, much better from a consumer's perspective. I can't say that I'm really worried though.
Regarding AMD vs Intel: for gaming the Phenom II X4s are up there and much cheaper; in most other fields Intel is ahead though (especially in mobile CPUs). Intel's market share is huge though, especially in prebuilt computers (partially due to their market practices in the past tho...). Here's hoping AMD picks it up one day :<
|
To be honest, I don't have much knowledge about graphics cards, but I recognized the brand name "3dfx" from a long time ago, which made me wonder what had happened to it. This was a very informative post, and I can't believe I read through it all lol
|
I once owned a "Diamond Monster 3D" (3DFX Voodoo Chipset). One of the first "3d" cards to come out.
Best card I ever had (relatively speaking)
|
On January 10 2010 00:10 Carnac wrote:
I don't use Linux myself, but hasn't the Linux driver support become much better since AMD bought ATi? At least that's what I was told by the people I know who use Linux.
nVidia is still a ways ahead of AMD in the Linux graphics driver department, but yes, they've improved a lot.
|