Nvidia the new 3dfx? - Page 5
Yogurt
United States4258 Posts
| ||
jeddus
United States832 Posts
Someone in the know please help me out. | ||
Highways
Australia6098 Posts
I haven't been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia hasn't released a new one since. I remember reading that the 4750 performs much better than the 280 in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago. | ||
eNoq
Netherlands502 Posts
| ||
Carnac
Germany / USA16648 Posts
On January 10 2010 01:34 Highways wrote: Very informative OP! I haven't been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia hasn't released a new one since. I remember reading that the 4750 performs much better than the 280 in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago. Sorry, but that's just dumb. I understand being a fanboy of a team, sportsman, artist, ..., but of a piece of hardware? Why pay more for less? | ||
haduken
Australia8267 Posts
But Nvidia actually has a Linux driver division lol... Who knows? I'd say the AMD/ATI merger did a lot of good for both companies. 2009 has been a very good year for them indeed. They were down but came out punching. Nvidia, for their part, will have to step up their game. I think they grew a bit complacent while ATI sucked and didn't expect ATI to last as long as it did. Think about it: a failing company, AMD, buying another failing company... Nvidia is still in a great position. They will just float until they release another product... who cares, it's the cut-throat world of GPUs LOL. | ||
haduken
Australia8267 Posts
On January 10 2010 01:44 Carnac wrote: Sorry, but that's just dumb. I understand being a fanboy of a team, sportsman, artist, ..., but of a piece of hardware? Why pay more for less? Blasphemy! | ||
Tangsta
Australia68 Posts
How is that even possible? Did TSMC fire all their staff and get untested robots to make the cards, or something? | ||
Boblion
France8043 Posts
I also think that Nvidia is in a bad situation. However, I don't think they will die like 3dfx; they'll evolve like Matrox and provide cards for professional users (CUDA etc.) and maybe cheap integrated GPUs, if they don't get f***** by Intel and AMD. On the other hand, ATI/AMD will basically have a monopoly on mid-to-high-end gaming cards. With the failure of the first-generation Larrabee, Intel shouldn't be a threat for a while. I think it sucks, because monopolies are never good for consumers, and because, eh, I loved my GeForce 256. | ||
CrimsonLotus
Colombia1123 Posts
On January 10 2010 02:19 Boblion wrote: That was a pretty good summary Ghermination. I also think that Nvidia is in a bad situation. However, I don't think they will die like 3dfx; they'll evolve like Matrox and provide cards for professional users (CUDA etc.) and maybe cheap integrated GPUs, if they don't get f***** by Intel and AMD. On the other hand, ATI/AMD will basically have a monopoly on mid-to-high-end gaming cards. With the failure of the first-generation Larrabee, Intel shouldn't be a threat for a while. I think it sucks, because monopolies are never good for consumers, and because, eh, I loved my GeForce 256. What? Doesn't Nvidia still control most of the market? I think that if they keep failing and just rebranding their cards over and over, their market share will continue to fall, but even in the worst-case scenario it would take years for Nvidia to be out of the market. | ||
Boblion
France8043 Posts
On January 10 2010 02:26 CrimsonLotus wrote: What? Doesn't Nvidia still control most of the market? I think that if they keep failing and just rebranding their cards over and over, their market share will continue to fall, but even in the worst-case scenario it would take years for Nvidia to be out of the market. You are so naive. I don't know their exact market share atm, but I'm pretty sure it is already falling, and it's all about the deals with OEM manufacturers and shitty integrated GPUs. Problem is, that won't last forever. ATI CLEARLY has the best products atm, and I don't know why OEM manufacturers would buy Nvidia components if ATI has cheaper and better GPUs. Also, you have to remember that Nvidia stopped manufacturing most of its 2xx cards and now relies only on shitty rebranded stuff + integrated GPUs until Fermi's release. Oh, and the integrated market is highly dependent on Intel's and AMD's goodwill (I hope you understand why). Anyway, that's not where the profit-per-card ratio matters most: ATI probably makes more money selling one 5870 than 10+ shitty integrated cards. On the mainstream GPU market, a firm can die if it fails one generation of cards, because it is just too competitive. The Voodoo 4 and 5 failed -> bye 3dfx. The Parhelia-512 failed -> bye Matrox. And actually, I'm pretty sure that if AMD hadn't bought ATI, they might have died too. | ||
ghermination
United States2851 Posts
On January 09 2010 22:40 Manit0u wrote: 2 things to remember: Intel vs AMD = no big deal. For the last couple of years processors didn't really get any faster (after the P4) because the materials used in their production process just won't allow it. That's why we have seen the dawn of dual-, quad- and more-cores. Still, it doesn't make your computer all that much faster, because software is not designed with the multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal; the goal is to design software so that every single task is processed by ALL cores at the same time). nVidia vs ATI = no big deal either. ATI is trying to grab the market with their blazing fast cards that support stuff that's not out yet (which still gives nVidia time to counter them). Also, you forget that there are people who don't give a shit about a gcard supporting DX11 or not (OSX, Linux, Unix users). But all this aside, there's also another side to each of these companies, and that's support. I would never ever pick an ATI card over an nVidia one, for the sole fact that with nVidia I don't have to worry about compatibility issues most of the time (yes, I'm a Linux user now) and I get new, improved drivers released more frequently, making my card work better for a longer period of time. For me it'll always be the Intel + nVidia combo. Even if I have to overpay for it. Reliability >>> price. You are incorrect about single-core processors not being able to increase in speed. The Sempron 140, made from a faulty Athlon II 4400e die, is only a single-core processor, but it can easily outpace any single-core from the past. Hafnium still has a long way to go. I mean, have you ever seen an x-ray of a processor die? We still haven't even gained the ability to use 100% of the hafnium we print on, or the ability to do it in a perfect square (the transistors in a die look a lot like a pancake: round, with messed-up edges). I'm sure we could fit a lot more of them in a piece of hafnium if we developed new ways of printing or improved the already-aging immersion lithography methods. | ||
ghermination
United States2851 Posts
On January 09 2010 21:21 Djin)ftw( wrote: nice op well as a manager you have to make decisions, like investing in a new way of manufacturing graphics cards. Just imagine that 40nm technology yields graphics cards which are so much faster than anything ATI can produce with 45nm technology. Boom, you gotta take risks. If James Cameron had said "oh fuck, Avatar is getting too expensive, forget it," probably everyone would have agreed. Now look at the success of the film. Also, ATI is currently producing 40nm dies considerably more successfully than Nvidia, because all they've had to cope with is a die shrink. I believe their numbers are around 30-40%, which is pretty bad and is why there aren't very many 5-series cards on the market. Also, TSMC isn't the only producer capable of manufacturing these 40nm GPUs; iirc Samsung has been contracted to do it many times in the past, and is also currently working on a 22nm process. | ||
semantics
10040 Posts
On January 09 2010 19:08 FragKrag wrote: 8% is far too low for ATi imo Most of the market should actually be in Intel's hands with their integrated gpus. (I think) Intel holds like 50% thanks to selling most of their CPUs bundled with their shitty-for-games graphics, Nvidia then holds around 30%, and ATI holds that last 20%. These are very rough numbers: Intel's is more like 49%, Nvidia is closer to 31%, ATI is closer to about 18%, and the rest is other cards like Matrox (last time I checked, which was around the start of 2009). But when you look at these numbers you should note that most of Nvidia's and ATI's market share comes from OEM deals, i.e. laptops and other integrated GPUs; very few cards are sold as those monster flagship cards. | ||
semantics
10040 Posts
On January 10 2010 01:34 Highways wrote: Very informative OP! I haven't been following the graphics card war since the nvidia 200 series was released. Can't believe nvidia hasn't released a new one since. I remember reading that the 4750 performs much better than the 280 in GTA4 back in 2008. But I'm a nvidia fanboy so I'll still stick with them. Just bought a 9800GT a few months ago. That's just wrong: how well GTA4 plays is based on the memory available to the card, and a 280 has more. Also, one game doesn't mean shit; there are plenty of biased games. The HL engine, which was developed very closely with ATI, runs better on ATI cards, no surprise. The OP forgot to mention that the so-called dirty practice of "The Way It's Meant to Be Played" is done by both companies; it's a long-held practice that both companies give game companies millions in free work from engineers, the only difference being that Nvidia does this better than ATI because they are bigger. They also have a marketing team, so they came up with TWIMTBP and crap like that. Remember, Intel and Nvidia run advertisements in stores and whatnot, while AMD and ATI only put adverts on websites whose visitors usually already know AMD and ATI, which is stupid. And about the part of the OP where Intel and Nvidia are buddies??? Lol, that's damn wrong; they've been fighting over license agreements for years now, and Nvidia was lucky enough to get it into court after the FTC decided to investigate Intel's practices, partly because Asia and the EU already had (although I find both those cases very shaky, and the EU's case, like the one with Microsoft, just plain stupid, or at least the reporting on it made it sound like the head of the investigation had nothing between his ears). | ||
cgrinker
United States3824 Posts
On January 09 2010 17:38 Klogon wrote: Even though ATI gained 8% last year, doesn't Nvidia still have the majority of the market share? I mean, sure, the top-of-the-line cards are cool and one side may outsell the other in that race, but the real money is made in the mid-level cards, isn't it? Don't Macs and Dells sell exclusively Nvidia? I'd say they're doing fine, but I honestly have no idea. I'm rocking a MacBook from like Q2 2008, from before the switch to the solid bodies and then back to the white book, and it has an Intel chipset for graphics. Which leads to my next question: on motherboards with integrated graphics and shared memory for video, is the CPU doing the computation or is there still a separate GPU? | ||
semantics
10040 Posts
On January 10 2010 03:50 ghermination wrote: Also, ATI is currently producing 40nm dies considerably more successfully than Nvidia, because all they've had to cope with is a die shrink. I believe their numbers are around 30-40%, which is pretty bad and is why there aren't very many 5-series cards on the market. Also, TSMC isn't the only producer capable of manufacturing these 40nm GPUs; iirc Samsung has been contracted to do it many times in the past, and is also currently working on a 22nm process. The numbers are much higher than that; that reporting was weeks, months ago! AMD has been able to get it up to 70%, but by this time they should be at 99%; the yields are just bad. Nvidia has been able to get it up to 40% last time I heard; they too should be at like 99% by now. In other words, 40nm has been a bitch to tame. | ||
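(A rough back-of-the-envelope sketch of why those yield percentages matter so much: every figure below is made up for illustration, not a real TSMC, AMD, or Nvidia number.)

```python
# Effective cost of each sellable die at a given yield.
# Wafer cost and dies-per-wafer are invented numbers.

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    """Spread the fixed wafer cost over only the dies that work."""
    good_dies = dies_per_wafer * yield_rate
    return wafer_cost / good_dies

WAFER_COST = 5000.0    # dollars per wafer (made up)
DIES_PER_WAFER = 100   # candidate dies per wafer (made up)

for y in (0.40, 0.70, 0.99):
    cost = cost_per_good_die(WAFER_COST, DIES_PER_WAFER, y)
    print(f"yield {y:.0%}: ${cost:.2f} per good die")
```

With these toy numbers, a 40% yield makes each good die cost about 2.5x what it would at 99%, which is the squeeze being described above.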
Boblion
France8043 Posts
On January 10 2010 04:13 Virtue wrote: very few cards are sold as those monster flagship cards. But you make more money on one "monster" than on 10 shitty integrated cards. | ||
yoden
United States64 Posts
On January 09 2010 18:57 ghermination wrote: 40 degrees is perfectly acceptable. What you need to worry about is load temps breaking 60. That will rape a processor in less than a year, whereas it's designed to operate constantly at temperatures higher than 40 degrees. That's not true, and it depends a lot on the processor. For example, a lot of Intel mobile chips are designed to run at 70-100 C and do that fine (for a mobile chip, you've got a confined space and don't want to waste battery on fans). The newer Core chips from Intel run super cool, though; this is part of why they're so overclockable. On January 09 2010 22:40 Manit0u wrote: 2 things to remember: Intel vs AMD = no big deal. For the last couple of years processors didn't really get any faster (after the P4) because the materials used in their production process just won't allow it. That's why we have seen the dawn of dual-, quad- and more-cores. Still, it doesn't make your computer all that much faster, because software is not designed with the multi-core architecture in mind (having different cores perform different tasks simultaneously is not optimal; the goal is to design software so that every single task is processed by ALL cores at the same time). Not true at all; core for core, an i7 probably has 8x the computational power of a Pentium 4, because of more cache, multiple issue, pipelining, hyperthreading, etc. On January 10 2010 02:19 Boblion wrote: That was a pretty good summary Ghermination. I also think that Nvidia is in a bad situation. However, I don't think they will die like 3dfx; they'll evolve like Matrox and provide cards for professional users (CUDA etc.) and maybe cheap integrated GPUs, if they don't get f***** by Intel and AMD. On the other hand, ATI/AMD will basically have a monopoly on mid-to-high-end gaming cards. With the failure of the first-generation Larrabee, Intel shouldn't be a threat for a while. This seems premature to me. Sure, the 2xx series hasn't been great for nVidia, but they've dealt with this before (the FX 5xxx series). nVidia still has high market share, and ATI/AMD isn't exactly on great financial ground either. Plus, there are still reasons people purchase nVidia cards (PureVideo, CUDA/OpenCL, PhysX, possibly better drivers, Linux, etc.). | ||
semantics
10040 Posts
I'm pretty sure it's the opposite: Nvidia was selling their cards for next to nothing, like the GTX 260 etc., while ATI cut their margins thin to undercut Nvidia's original pricing. I believe the flagship cards are the companies' way of advertising in the community and a form of development for the lower-end cards, and that's why they do it. Both companies make most of their money in the sub-$75 range, where most GPUs are sold through OEM deals etc., because those cards cost next to nothing to make but sell much better. Nvidia's Tegra and ION have been the most profitable things for Nvidia this year, which is why Nvidia wants back into the chipset business; the chips people think little of are the most profitable to make. | ||
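(A toy illustration of the margin-times-volume point above; the per-unit margins and unit counts are invented for the sake of the example, not actual Nvidia or ATI figures.)

```python
# Thin margins on huge OEM volume can beat fat margins on flagship volume.
# All figures below are invented for illustration.

def total_profit(unit_margin, units_sold):
    """Total profit is simply per-unit margin times units sold."""
    return unit_margin * units_sold

flagship = total_profit(unit_margin=100.0, units_sold=50_000)    # fat margin, small volume
oem      = total_profit(unit_margin=5.0, units_sold=20_000_000)  # thin margin, huge volume

print(f"flagship line: ${flagship:,.0f}")
print(f"OEM line:      ${oem:,.0f}")
```

Even at a twentieth of the per-unit margin, the hypothetical OEM line out-earns the flagship line by volume alone, which is why both posters keep circling back to OEM deals.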
| ||