ATI (AMD) versus Nvidia is a bit like arguing Mac versus PC. Bias can be inherent, but there's a point to it beyond just brand-name preference. I'll temper this by saying they both suck at times: there's a bit too much bleedin' edge, and that usually means someone has to bleed.
This is all IMHO, so feel free to throw your own opinion in:
ATI / AMD sucks at drivers.
Nvidia sucks with overheating.
I know you can give vice-versa examples, where Nvidia drivers failed to perform or an ATI card overheated, but I believe those to be generally true statements.
Why ATI sucking at drivers is good / bad:
ATI follows the "release early, release often" rule more closely and, as a result, has a wider variety of iterations in its chipsets. The upside is that ATI gets a better power-to-heat ratio, so its cards tend to run cooler and physically fail less. That's a great thing.
When I say "ATI sucks at drivers", that's in relation to newly released games. They do their due diligence and update drivers, but I think they're inherently behind because the little changes in each chipset make it more time-consuming to come up with unified driver fixes.
On that note, it's my strong opinion that drivers are a two-way street, with most of the traffic on the manufacturer's side. When someone complains that game X sucks because the developer didn't make it more compatible with ATI, I place the blame primarily at ATI's feet. It's their responsibility to get their cards working with games that are about to be released. I don't know of a developer on this planet that would close their doors to them, so that's just conspiracy talk when I hear it.
Why Nvidia overheating is good / bad:
Nvidia pushes the high end too, but they tend to keep their chipsets mostly the same and ramp up clock speeds for the latest-and-greatest. As a result, they're ahead of the game on drivers, and if it's working, don't break it, right?
The flip side: Nvidia's cards tend to run ultra-hot, and in this era of cheaply assembled PCs, that means they often overheat. Anyone who's recognized an overheated card can tell you that once it's gone past a certain point, it's going to act flaky forever. Many people don't recognize this and think their card is just defective, when really it's been heat-damaged. Call it defective-by-design if you like, but it is avoidable.
Why I prefer Nvidia:
I can handle the heat. That's under my control: I can get a case with better airflow, better fans, etc. Drivers, on the other hand, are not under my control, and I have little patience waiting for software to make my hardware work correctly.
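If you want to keep an eye on the heat yourself, it doesn't take much. Here's a minimal sketch in Python, assuming you're on an Nvidia card with nvidia-smi on your PATH; the 85°C alert threshold is my own arbitrary pick, not anything official, so tune it for your card:

```python
# Minimal sketch: poll GPU core temperature via nvidia-smi.
# Assumptions: nvidia-smi is installed and on the PATH, and
# 85 C is an arbitrary "too hot" threshold I picked myself.
import subprocess
import time

ALERT_C = 85  # arbitrary threshold; tune for your card

def gpu_temps():
    """Return a list of GPU core temperatures in Celsius, one per GPU."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    while True:
        for i, temp in enumerate(gpu_temps()):
            flag = "  <-- running hot!" if temp >= ALERT_C else ""
            print(f"GPU {i}: {temp} C{flag}")
        time.sleep(30)  # poll every 30 seconds
```

If your card is flirting with that number at idle, that's your cue to fix the airflow before it cooks itself into permanent flakiness.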
This is really another debate over whether PCs are still a hobbyist thing (ding! That's my opinion) or whether PCs are so widespread that they should just work. That's a grand idea, and probably why the iPad will do well with very lame / crippled hardware.