Nvidia’s chip fabrication problem

The Inquirer recently published an excellent article about Nvidia’s chip fabrication problem that I highly recommend reading. It doesn’t spare any of the technical details in explaining why Nvidia’s recent graphics chips are approaching a failure rate of 40% on some lines. The short summary: Nvidia put off some complex but necessary re-engineering on their new chip lines until it was too late to do it properly, so to meet engineering tolerances they swapped one type of component for another, causing a whole set of problems including high failure rates. Ouch.

With ATI’s surprisingly good new line of 4850 and 4870 graphics chips, their new and improved GNU/Linux support, and no reports of high failure rates, I’m thinking my next graphics card purchase will be an ATI. This latest round of news is really bad for Nvidia, but definitely good for ATI. I wouldn’t even consider buying another Nvidia card until I hear that the problems have been worked out.

3 Responses to “Nvidia’s chip fabrication problem”

  1. shadey Says:

    I have to ask: how do ATI drivers fare on Linux these days? FGLRX and AIGLX were very buggy the last time I tried, but that was quite a while ago.

  2. Ed Says:

    It’s a tricky business, this one… Both Intel and AMD have had their share of big problems in the past and have managed to turn things around. Let’s hope Nvidia can do the same; it’s much better to have some competition in this market than to have none.

  3. Gregory Maxwell Says:

    I grew tired of coding on the small laptop screen, so I built myself a new desktop system. I found a good deal on some very nice high end 26″ flat panels, so I needed a card that could drive them (I think all of my older cards were limited to 1600×1200 on their DVI outputs).

    I don’t want to use proprietary drivers. They’d be the only proprietary code on my entire system, they are a frequent source of kernel panics, they have a disturbing habit of dropping support for older cards, and from supporting other people who have chosen to use them I know they like to screw with the nice and orderly way the distribution does things. So I’d rather not use or support a company that ships only proprietary drivers, and as a result Nvidia was not an option.

    I want working 3D, but I don’t really care much about performance: I’m never going to run a game more intensive than xscorch or maybe bzflag.

    The new free software ATI stuff didn’t look too usable yet, so I opted for some older ATI X850 cards: they are supported, even with 3D, by the current free software ATI driver, and they were fairly inexpensive (dual-DVI PCI-Express cards for ~$40).

    I’ve been a little disappointed with my decision, and I now better understand why a couple of friends of mine have opted to run Ubuntu with its built-in proprietary drivers rather than Fedora on their ATI systems. There is currently a transition under way from the classic style of configuring X through xorg.conf to a new dynamic mode of configuration accomplished via xrandr. Basically, the xorg ATI driver ignores many kinds of configuration in xorg.conf (like the multihead stuff), but the GUI configuration tools in Fedora are confused by something in the driver’s xrandr behaviour and are unable to configure it (the same tools do xrandr configuration with the Intel chipsets in my laptops just fine, and I was previously unaware of the change to dynamic configuration). I wasted hours getting it working before figuring out that simply running the xrandr tool manually with the right options would do the trick. I threw “xrandr --output DVI-0 --right-of DVI-1” into my /etc/X11/xinit/xinitrc-common and was good to go. But I’m about as much of a guru as they come; if it took me a couple of hours to figure it out, I can understand why people switched to the proprietary drivers.
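    That xrandr one-liner can be wrapped so it only fires where xrandr exists; a minimal sketch for /etc/X11/xinit/xinitrc-common, assuming my output names (DVI-0 and DVI-1 — yours may differ; `xrandr --query` lists them):

```shell
# Sketch for /etc/X11/xinit/xinitrc-common: arrange two DVI heads
# side by side at X startup. DVI-0/DVI-1 are the names on my card;
# run `xrandr --query` to see what your driver calls yours.
if command -v xrandr >/dev/null 2>&1; then
    xrandr --output DVI-1 --auto
    xrandr --output DVI-0 --auto --right-of DVI-1
fi
```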

    My frustration was magnified by the increasing attempted-idiot-friendliness of modern desktop environments making it impossible to get the behaviour I want. None of Gnome, KDE 4.1, or XFCE4 allowed me to take a 3840×1200 jpg and display it spread across both screens. They would happily display the same image twice, once on each head, with a multitude of useless stretching and scaling options. I think KDE and XFCE were nice enough to let me select different images for each screen, so I could have accomplished what I wanted by cutting the image in half. Only XFCE was nice enough to provide an obvious option to leave the root window unmanaged, allowing me to set my image with xsetroot. Message to desktop environment authors? Stop trying to be so f*king smart. You’re not. Your second-guessing of my wishes is usually wrong. Just expose the damn options; hide them behind an advanced menu if you must. Linus complained about this three years ago and his complaint is as spot on now as it was then. Too bad KDE also seems to be going the feature-removal route in 4.1, though not as badly as Gnome.
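    For the record, a hedged sketch of the cut-the-image-in-half fallback, assuming ImageMagick’s convert is available (wide.jpg and the 3840×1200 size are illustrative):

```shell
# Split a side-by-side dual-head wallpaper into one image per screen.
# Assumes ImageMagick's `convert`; wide.jpg is an illustrative name.
W=3840; H=1200
HALF=$((W / 2))                 # width of each head: 1920
LEFT="${HALF}x${H}+0+0"         # crop geometry for the left half
RIGHT="${HALF}x${H}+${HALF}+0"  # crop geometry for the right half
convert wide.jpg -crop "$LEFT"  +repage left.jpg  2>/dev/null || true
convert wide.jpg -crop "$RIGHT" +repage right.jpg 2>/dev/null || true
```

    Each half can then be assigned to its own screen in the environments that allow per-screen images.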

    (FWIW, I’m using XFCE4 now, the deal maker being the aforementioned ability to leave the root window alone, allowing me to use video or xscreensavers as my background. I realize my usage isn’t typical: all I really want to do is run a bunch of statically positioned tabbed terminals (ideally without useless window decoration), a couple of emacs windows, and a web browser. Were it not for the browser popping up windows I wouldn’t even need a window manager at all. So, not typical… but actually a lot simpler than the typical usage. Too bad neither of the popular ‘desktop environments’ can meet my needs.)

    Performance in 2D is somewhat disappointing, which is really sad because I’m quite undemanding: when I switch to a virtual desktop covered with a dozen windows it takes a good full second to paint the screen. Otherwise the 2D performance is fine (screensavers are especially spiffy). 3D performance also meets my needs, but it seems a bit crash-prone when running full screen, so I am not using the compiz 3D desktop stuff. (Honestly, I wouldn’t be running it anyway, as it adds delays so you have time to see the pretty animated effects. I use it on my laptop when I’m in public just to get the ooohs and ahhhs from onlookers.)

    I kinda wish that I’d picked up a more modern radeonhd card: it likely would have taken a bit longer to get working, and might not currently work better, but at least I could expect it to improve. I don’t think anyone is putting much effort into the old ATI driver. At least it wasn’t expensive… I can just get a radeonhd card once the driver matures (and by then the cards should also be less expensive).

    What I’d really like is for Intel to produce a line of stand-alone video cards. They seem to be the only graphics chipset maker that really has this GNU/Linux thing figured out. They employ multiple developers who are quite active in the appropriate communities (one even wrote a patch directly in response to a bug report from me). Intel’s competence really shows: video on my laptops just works. It’s problem-free, performs well (though perhaps not for 3D games), and the drivers don’t do anything unneighbourly (like monopolize the PCI bus and screw up realtime applications). It seems that ATI will get there eventually, but it might take a while. Nvidia still seems like a lost cause.