Archive for the 'GNU/Linux' Category

Meet Vertumnus, my new GNU/Linux desktop (running on a Dell Inspiron 530)

Wednesday, June 11th, 2008

If this post seems a little glowing, don’t be alarmed; it’s because I’m still basking in the brilliant sheen of my new GNU/Linux desktop (which I am composing this blog post on as I type these very words — and these words, too). That’s right, I went through with my plans for setting up a GNU/Linux desktop, though I didn’t actually use the parts list I threw together two weeks ago. I ran across an amazing deal through Dell’s small business site (instant savings of nearly half off!) on an Inspiron 530 and I jumped on it. For $360 ($407 after shipping and state taxes), I got a nice little Dell mini-tower with an Intel Core 2 Duo E8200 processor, 2 GB of DDR2 PC2-6400 RAM, a 500 GB SATA hard drive with 16 MB cache, a SATA DVD burner, a keyboard, and an optical scroll mouse. It ended up being about the same price as the parts list I put together, but the performance is marginally better, with the added possibility of upgrading to 4 GB of RAM. It also came with Windows Vista Home Premium, which I suppose would be a value-add for some, but which just made me wince at how much cheaper I could have gotten this system without paying the Microsoft tax. Anyway, Vista’s in the trash now, where it belongs, and the price was good enough that I’m not worrying about it.

Installing the OS

I was going to install Kubuntu on my new system, but I opted for Ubuntu instead on a recommendation from Drinian, who says that Kubuntu isn’t quite as well put-together. The only reason I wanted Kubuntu was because I wanted to run KDE instead of Gnome, but it turns out that’s incredibly easy to accomplish in Ubuntu (just install the kubuntu-desktop meta-package in aptitude, then set your login session to KDE). So choosing Ubuntu over Kubuntu hasn’t left me disappointed in any way.
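If you want to try the same KDE-on-Ubuntu setup, it amounts to two steps (assuming the kubuntu-desktop meta-package name as of current Ubuntu releases):

$ sudo aptitude install kubuntu-desktop

Then log out, choose KDE from the Sessions menu on the login screen, and log back in.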

Unfortunately, installing Ubuntu GNU/Linux still wasn’t as easy as it should have been. I blame the problem on hardware incompatibilities, most likely with the SATA controller on the motherboard. The installation CD wouldn’t boot without passing the kernel parameter “all_generic_ide”, which is something I can handle but which is likely to turn off the average computer user. Then, after the installation completed, my system wouldn’t boot from the hard drive for the same reason, so I had to boot back into the LiveCD environment, mount my boot partition, and edit the menu.lst of GRUB (the bootloader) to pass that same kernel parameter. So, yeah, GNU/Linux isn’t exactly friendly for the masses, at least not on this hardware. Curiously enough, I had this exact same problem when dual-booting Fedora Core (another distribution of GNU/Linux) on my previous desktop. There’s definitely some room for improvement in this area by either the Linux kernel developers or the Ubuntu packagers. There’s no real reason this can’t be one of those things that “Just Works”.
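For anyone hitting the same problem, the menu.lst change looks something like this: append the parameter to the end of the kernel line of your boot entry in /boot/grub/menu.lst (the kernel version and UUID below are illustrative; yours will differ):

kernel /boot/vmlinuz-2.6.24-16-generic root=UUID=... ro quiet splash all_generic_ide

The LiveCD accepts the same parameter typed at its own boot prompt, which is how you get far enough to make this edit in the first place.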

Naming the system

But after the minor hitch with “all_generic_ide”, everything else worked just fine. It was the smoothest GNU/Linux installation I believe I’ve ever done. The GNU/Linux graphical installers have become quite advanced, putting anything Microsoft offers to shame. Actually, the part of the installation process that took the longest was picking a name for my new computer. I have a long history of naming computers after various mythologies, deities, or nerdy things (Ixion, Dark Anima, Fyre, Quezacoatl, Geminoid, Phoenix, etc.), so I wanted to continue the theme. Since this is the first time I’ve ever used a dedicated GNU/Linux system as my primary desktop (as opposed to Microsoft Windows), I wanted to emphasize the change this brings to my computing life. So I got into a lively discussion on IRC with someone who apparently knows a good deal about ancient Greek/Roman mythology, and his best suggestion was the Roman god Vertumnus, who is “the god of seasons, change and plant growth, as well as gardens and fruit trees”. I liked both the change aspect and the environmental aspect, so Vertumnus it was.

Read the rest of this entry »

Specs for a high power, cheap ($380) GNU/Linux desktop

Wednesday, May 28th, 2008

The other day, I realized that I don’t use GNU/Linux as often as I should. Sure, I run it exclusively on my servers, but I still use Windows on the desktop for the most part. That’s more out of habit than out of any need. Everything I currently do in Windows I can do in GNU/Linux, except for the games, which I’m playing less and less often these days. I was dual-booting my current desktop with Windows XP and GNU/Linux for a while, but it proved to be inconvenient. My computers’ uptimes, both servers and desktops, are typically measured in months (only going down for crashes and power losses). It takes a while to reboot and restart all of the applications I typically have running, so I don’t do it by choice. Thus you can see the problem with dual-booting: it entails constant rebooting, which I had to do as often as I felt like playing a Windows game. And once I was in Windows, I wouldn’t want to go through the hassle of booting into GNU/Linux only to boot back into Windows the next time I wanted to play a game. It simply wasn’t working.

So I now see the problem with my initial attempts at using GNU/Linux on the desktop. I simply don’t have the patience to put up with all of those constant reboots and interruptions in my computing environment. I’m too lazy. So I’m simply going to get another desktop to use exclusively for GNU/Linux, while making every effort to use my current Windows desktop only for playing games. And luckily, building a desktop computer is cheaper than it’s ever been. Here is a parts list I put together just yesterday for a killer GNU/Linux desktop.

The specs

This complete GNU/Linux system costs only $355. Throw in shipping and we’ll call it $380. That’s a really cheap price considering how powerful this system is. Avoiding the Microsoft tax by choosing a Free operating system pays huge dividends when the overall system is cheap. Allow me to explain the choices I made in putting this system together, with an analysis of each of the components:

The barebone system

First of all, I save a lot of money with this computer by building it from a barebone system. A price of $90 for a case, power supply, and motherboard is really hard to beat. You can easily spend over $90 on each of those individual components (and in fact, when I built my current desktop, I did). Getting a good barebone system is an excellent way to save a lot of money on a low-end desktop. If you’re not building a low-end desktop, though, I wouldn’t bother; the limitations can be significant. For instance, the motherboard that ships in the barebone I picked out supports a maximum of 2 GB of RAM; fine for a low-end system, but you really want 4 GB of RAM in a medium or high-end system. And the power supply is only 250W; again, fine for a low-end system, but don’t expect it to be able to power, say, a high-end discrete video card. And naturally the motherboard doesn’t support dual video cards, an upgrade path you might want to keep open on a system you’re spending more money on. It also doesn’t support quad-core processors. So there are limitations, but on a low-end system, you won’t run into them.

Read the rest of this entry »

Some useful WordPress plugins

Thursday, January 31st, 2008

In the year that I’ve been running this blog, I’ve accumulated a decent number of WordPress plugins. Some I now consider essential; others are merely neat. The list below contains every WordPress plugin I currently have running on this blog, in alphabetical order (if you wanted to duplicate the look of this blog entirely, this list would be a good place to start). And if you see any missing from the list that you think I should be running, well, let me know in the comments.

  1. All in One SEO Pack. I know the name sounds kind of evil (SEO is a four-letter word in many parts of the Internet), but this plugin is really innocuous. The main functionality that I use is its ability to nicely and cleanly give posts meta descriptions, which show up under the page title in a search engine listing. It also lets you adjust all sorts of keywords, other meta tags, page titles, etc., but I haven’t messed around with that stuff yet.
  2. AskApache RewriteRules Viewer displays a list of URL rewriting rules, both those of WordPress and any applicable .htaccess files. It helped me out when I was debugging a permalink structure change. Though I haven’t used it since, I still keep it around just in case.
  3. Flexible Upload is a very useful plugin that extends the functionality of uploading and using images in WordPress. It creates thumbnails of a given size on the fly and offers increased control over how an image is placed within a post (without having to manually adjust the HTML). It will also do automatic watermarking for you. One caveat though: version 1.9 has some sort of incompatibility with certain themes or with WordPress 2.3.x, so I’m running 1.8.
  4. Google XML Sitemaps automatically creates an XML sitemap of your blog and keeps it updated over time. XML sitemaps are still kind of a new thing, but Google is using them, and having one can potentially increase the visibility of some of your harder-to-find pages in search rankings. If nothing else, at least its output is nifty.
  5. In Series adds series functionality to WordPress. Think of a series as a category with ordering: each post gets a numbered table of contents linking to the other posts in the series, as well as “Previous” and “Next” links. I’m using it for my telescope-making work log and my Diamondback opinion columns. One caveat: the only way to prevent the table of contents (which can get long) from displaying on the front page is by hacking up your theme. The next version of this plugin will make it a simple configuration option.
  6. Live Comment Preview implements live comment previewing in pure JavaScript, with no AJAX or additional server calls required. One caveat: recent versions seem to have problems for logged-in users (i.e. you) with WordPress 2.3.x on non-Internet Explorer browsers, so I’m using version 1.7.
  7. No Self Pings prevents your newly written posts from generating pingbacks on previous posts. Some people like them; I don’t.
  8. Random Posts Widget displays a number of links to random posts on your blog. I used to have a “Best posts” heading, but maintaining it was too much of a hassle, so I removed it and simply went for the random links. The ability to browse through a blog randomly, rather than having to go chronologically or by category, is great, and by giving readers a choice of posts to click on, they can pick the one that most interests them. If you haven’t tried out the random links yet, do have a go. Most of the traffic on this blog is on recent posts, but the older posts are no lower in quality.
  9. Raz-Captcha adds a CAPTCHA to user login and/or user registration. I just have it turned on for registration, because too many spammers were automatically registering accounts in the vain hope that being logged in would let their spammy comments through my spam filter (it doesn’t).
  10. Recent Comments Widget displays the most recent comments anywhere on the blog. This is one of my favorite plugins. It fosters discussion and also makes tracking down spam comments on old posts easy. At a glance, anyone can see where the latest comments are, and then, if they feel like it, respond to them. Without this plugin, WordPress only displays a list of the latest comments in the blog’s admin interface.
  11. Redirection is a redirection manager that can change a bunch of different aspects on how redirects are handled, but I only use it for one thing. I changed my permalinks structure to remove the /index.php/ part recently, yet WordPress was sending back HTTP codes of 302, or “Moved Temporarily”, along with all of the redirects to the new permalink URLs. This is bad, as it can split up search engine karma across multiple pages. So I used Redirection to change the HTTP code to 301, or “Moved Permanently”. This tells search engines to update everything to point to the new URL.
  12. Spam Karma 2 is one of those plugins that I don’t know how I’d live without. It’s caught tens of thousands of spam comments so far. I cannot even imagine trying to handle all of that manually. And its false positive rate is amazingly low. Put simply, if you are running a WordPress blog on the public Internet, you need an anti-spam solution, and Spam Karma 2 is much more configurable and feature-full than WordPress’s default, Akismet.
  13. Update Manager keeps track of your plugins and lets you know when a new version of one is available. Not much more to say about that. Just be wary about upgrading; as the caveats above show, newer is not always better.
  14. WordPress.com Stats tabulates post view statistics in a blog-aware fashion (as opposed to the other stats tracker I use, awstats, which just knows about web pages in general). The plugin itself basically just farms out all of the work to WordPress.com’s servers (for which you need a free API key). If you don’t want them knowing the intimate details of your blog readership, you don’t want this plugin.
  15. WordPress Automatic Upgrade provides a smooth way of upgrading WordPress whenever new versions come out. Instead of having to manually back up your database and upload the new WordPress files, this plugin handles everything. It’s very nice, but you don’t end up using it that often, simply because WordPress updates don’t come out all that frequently.
  16. WP Super Cache is a caching plugin that stores rendered HTML versions of your blog pages. It’s very useful for keeping your site up and running if you were to be, say, Dugg or Slashdotted. I currently have it installed but not running, however, because the way it caches means that dynamic widgets like Recent Comments don’t update on individual pages until the cache expires. But I still have it ready to turn on at a moment’s notice should I get hit with a flood of traffic.

So that’s all of the WordPress plugins that I’m using. I hope I’ve at least given you some leads on useful ones. The WordPress software is pretty barebones, lacking a lot of near-required functionality that you only get through plugins. I just wish someone would release a plugin that auto-moderates all Trackbacks and Pingbacks. Yes, there are some older ones out there, but none are compatible with WordPress 2.3. I’ve had such a problem with splogs sending me pingbacks and trackbacks (which Spam Karma doesn’t catch, because those links actually exist; they’re just one of thousands of fake posts) that I’ve had to turn off Pingbacks and Trackbacks altogether. I really wish I could re-enable them. If you find out about a plugin for this that works with WordPress 2.3, please let me know!

Gentoo Linux tutorial: Playing m4a song files in Amarok

Wednesday, November 21st, 2007

Several years ago, I bought many albums from the iTunes Music Store. I know, it was a stupid thing to do, and I regret it. But back then, I had a PowerBook (which has since broken), so everything “just worked”. Well, I have a new laptop now that I’m running GNU/Linux on, so it no longer “just works”. Luckily, I’ve found a solution that does work.

The first step is stripping the abysmal “FairPlay” DRM (that’s Digital Restrictions Management to those in the know) off of the encrypted m4p song files that I purchased. This was fairly easy using a program called QTFairUse — unfortunately, it only runs on Windows. You won’t find anything to do it under GNU/Linux because it needs to use iTunes to work.

So once I stripped the encryption off the .m4p files, I was left with these .m4a song files. They’re not encrypted, but they’re also not a very widely used format, and aren’t supported by most audio player software. The simplest solution would just be to transcode them to ogg or mp3, but that’s a bad idea. You shouldn’t ever convert from one lossy format to another. It’s like making a xerox copy of a xerox copy; the quality loss accumulates. Now if you have a non-iPod portable digital music player, you don’t really have many choices, because none of them support m4a. Personally, I don’t see anything ethically wrong with downloading nice quality mp3s of songs you’ve already purchased just to avoid the transcoding quality loss, but I’m sure the law disagrees with me there.

Anyway, I digress. This tutorial is about playing m4a files natively under Gentoo GNU/Linux, without having to transcode them and suffer a loss of quality. It works perfectly for me, since I don’t even use my portable digital music player anymore. I just play everything on my computers. I’ll be using my favorite Free Software audio player program in this tutorial, Amarok. Luckily, everything necessary for the playback of m4a files is already in the source code — it just isn’t compiled in by default. So we’ll have to modify the USE flags to get it to work. But don’t fret; this is a very common procedure in Gentoo GNU/Linux.
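The full walkthrough follows, but in outline the change looks something like this. (The package and flag names below are illustrative; the m4a decoder typically lives in Amarok’s engine backend, such as xine-lib, so the USE flag change usually goes there rather than on Amarok itself.)

# /etc/portage/package.use
media-libs/xine-lib aac

$ emerge --ask --newuse media-libs/xine-lib media-sound/amarok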

Read the rest of this entry »

Making progress on NaNoWriMo, and a GNU Emacs tutorial

Thursday, November 1st, 2007

I’m making good progress on my novel for National Novel Writing Month, about my participation in which I’ve previously blogged. Incidentally, that previous sentence is a good example of why it isn’t a good idea to always obey the “no prepositions at the end of a sentence” rule, a rule I’m planning to make lots of violations of.

And now for something completely different. I’m writing my novel entirely on the command line in GNU Emacs. Why? Because I can. Oh, and because it allows me to work on my novel from anywhere with Internet access, as all I need to do is just ssh in. Thus I don’t have to worry about only being able to write from one computer, or having to carry around my novel as a file with me. Unfortunately, GNU Emacs is optimized for editing computer code. The default way it treats long lines isn’t very friendly to prose; it wraps on any character instead of only on spaces, and movement across long lines is unpleasant. Luckily it can be customized. Here’s the code I added to the .emacs file in my home directory. Modify (or create) yours accordingly and you can be writing long-form prose on the command line in no time.

;; Open any file with the .story extension in text-mode (a major mode).
(add-to-list 'auto-mode-alist '("\\.story\\'" . text-mode))
;; Turn on longlines-mode (a minor mode) whenever text-mode is activated.
;; Note that setq replaces any existing text-mode hooks; add-hook would
;; append to them instead.
(setq text-mode-hook '(longlines-mode))
;; Wrap long lines to fit the window instead of at a fixed column.
(custom-set-variables '(longlines-wrap-follows-window-size t))

If you have GNU Emacs version 22 or higher, that’s it, you’re set up! Longlines-mode is included. If you have an older version of Emacs, you’ll need to download longlines-mode separately. This configuration sets the mode to fire automatically only for files named *.story. If you want to use longlines-mode for other types of files, either adjust your configuration accordingly, or load it manually on a case-by-case basis.

Fixing clock skew problems in GNU/Linux

Monday, August 27th, 2007

I ran into a bit of trouble recently on my new Gentoo GNU/Linux laptop because I accidentally set the date a whole month in the future, and then proceeded to install lots of packages before realizing my mistake. I corrected the date, but then I was left with a slew of clock skew problems. The problems occur during boot init scripts, which complain that configuration files were created or modified in the future. This is little more than an annoyance, but the clock skew most definitely did mess up my attempted installation of VMware. Make uses files’ modified times to determine whether they should be recompiled; any target that appears to date from the future always looks up to date, so it will never be recompiled. The same problem trips up source-based package managers like Portage.

The simplest solution would be to wait out the extra time of the skew so that it doesn’t appear that files were created in the future. This is okay when you’re dealing with broken timezone settings, which can’t make the files appear to be from more than a day in the future, but it’s not good for dealing with skews of a month. I simply can’t wait that long to install new software. So I came up with a better solution. This fix should work for any GNU/Linux system:

(as root)

$ cd /
$ touch currtime
$ find . -cnewer /currtime -exec touch {} \;

The first two steps involve going to the root of the filesystem and creating a dummy file called currtime. This file is used solely for its timestamp, which effectively represents the present. Any file with a timestamp after currtime’s appears to have been modified in the future, which the third line then corrects (-cnewer matches files whose change time is newer than currtime’s). The program find has a bunch of parameters that can be used to specify how recent a file is (e.g. “modified within last X hours”), but they can’t take a negative number as a parameter (which would indicate modification in the future) because the negative sign is interpreted as part of the option syntax. Hence the dummy file currtime; it’s simply easiest to do the time comparisons between files.

One last thing to explain: the program touch has two separate functionalities. If called with a filename that doesn’t yet exist, it will create that file with null contents, which is what we use to create currtime. And if the file does exist, it will “touch” the file, meaning that the modified time is updated to the present but no changes are made to the actual contents of the file. So the full script works like this: make a dummy file, search the entire filesystem for anything that appears to have been modified more recently than the dummy file, and touch all of those files so they appear to have been modified in the present.

You will get some errors from non-touchable parts of your system, like devices, pseudo-files in /proc, etc. Don’t worry about them. Their modified times can’t be set, so you won’t do any harm, and they don’t need any fixing. The last thing to do is to clean up your root directory by removing the dummy file currtime. And that’s it — the clock skew is totally corrected.
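If you want to see the mechanics without touching your real filesystem, the same sequence can be rehearsed in a scratch directory. This sketch uses GNU touch’s -d option to fake a future-dated file, and for simplicity the mtime-based -newer test (the command above uses -cnewer, which compares change times instead):

```shell
# Rehearsal of the clock-skew fix in a throwaway directory.
tmp=$(mktemp -d)
cd "$tmp"
echo data > future-file
touch -d "next month" future-file   # simulate a file stamped in the future
touch currtime                      # dummy file whose mtime is "now"
# Re-stamp everything that claims to be newer than the present:
find . -newer currtime -exec touch {} \;
rm currtime                         # clean up the dummy file
```

Afterwards, future-file’s modified time is the present rather than a month out.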

Dell laptop hwclock incompatibilities in Gentoo GNU/Linux

Sunday, August 26th, 2007

I’m installing Gentoo GNU/Linux on my Dell Inspiron 9400 laptop right now, and as usual, things never are as easy as they would seem. I finally found out about and installed the 915resolution package, which allows me to use the laptop’s display in its native widescreen resolution. Then I ran into a problem with the hardware clock. I couldn’t set it. I would get an error message at shutdown saying it could not set the hardware clock to the system clock. And since the hardware clock was set in localtime rather than UTC (because the previous installation was Windows), I would keep getting all of these timestamp errors about files being modified in the future. I had to reset the system clock manually after each boot.

So I looked into what was going on, and it turns out the hwclock command wasn’t functioning; hwclock is the command used to read and set the hardware clock. Here is the error message I was getting:

$ hwclock
select() to /dev/rtc to wait for clock tick timed out

A preliminary Google search didn’t turn up anything useful, so I relaxed the search terms and came upon a thread in the ArchLinux users forum. One of the users mentioned an incompatibility with Dell laptop motherboards and suggested passing the parameter --directisa to hwclock. Testing it out from the command line, I confirmed that it works instantly, rather than freezing up for a few seconds and then timing out. This let me set the hardware clock manually by using the parameter.
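Once --directisa works, syncing the clocks by hand is a one-liner each way (run as root; --systohc writes the system time to the hardware clock, and --show reads the hardware clock back):

$ hwclock --directisa --systohc
$ hwclock --directisa --show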

But wait, I wasn’t finished yet. I was still going to get the same errors during shutdown that I got earlier, because the init.d script for hwclock wasn’t using that parameter. In addition to an annoying error message during shutdown, that means my hardware clock would slowly drift over time, and I would have to periodically reset it manually. That’s unacceptable. So I modified the /etc/init.d/clock init script as follows.

I changed the line

myopts="${myopts} ${CLOCK_OPTS}"

to

myopts="${myopts} ${CLOCK_OPTS} --directisa"

This line is located inside of the setupopts() function, which is called near the beginnings of both the start() and stop() functions. This is the simplest fix you can make to the clock init script so that hwclock is always called with the --directisa parameter, and thus, it always works.

And that’s it! The clock on my Dell Inspiron 9400 laptop is working perfectly now.

Write your own alarm clock

Thursday, August 23rd, 2007

I finally got fed up with my store-bought alarm clock. It just wasn’t smart enough. It didn’t understand the difference between weekdays and weekends, and every time I forgot to turn it off before going to bed on a Friday yielded a most unpleasant experience when it started blaring on Saturday morning. Likewise, there were bad ramifications for forgetting to turn it back on on Sunday night. And the sound it made when it went off was terribly annoying and not at all a good way to start the day. So I realized I could do better, and did.

One of my servers is within earshot of where I sleep. It has an internal PC speaker. So I made a simple shell script to output beeps in an aesthetically pleasing (for an alarm clock, anyway) manner and set up a cronjob so that it would run every weekday at 9:00am. No more having to worry about setting my alarm!

The entry in crontab is easy enough to understand. Let’s have a look at it:

0 9 * * Mon-Fri cyde sh /home/cyde/scripts/alarm.sh

As you can see, this runs on the zeroth minute of the ninth hour of every weekday. I’m running Ubuntu 7.04 on this server, and the way to make crontab entries in Ubuntu is to put the line in a text file inside of /etc/cron.d/. I named the file /etc/cron.d/alarm, naturally. Entries in /etc/cron.d take an extra field (here set to cyde) specifying which user account to run the command under; if you’re using a personal crontab (edited with crontab -e) instead, take that field out. And of course, /home/cyde/scripts/alarm.sh is the location of the following script:
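(The actual script is behind the cut, but as a rough, hypothetical sketch of the idea: BEL characters written to a console come out of the PC speaker as beeps, so an alarm script can be as simple as a loop around printf.)

```shell
#!/bin/sh
# Hypothetical sketch only -- not the actual alarm.sh from the full post.
# Each BEL character (\a) written to a console beeps the PC speaker.
alarm() {
    for delay in 0.5 0.5 0.25 0.25; do
        printf '\a'
        sleep "$delay"
    done
}
# On the real machine, run it against a console device (as root):
#   alarm > /dev/console
```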

Read the rest of this entry »

AMD announces open source GNU/Linux drivers for its video cards

Sunday, May 13th, 2007

Color me excited. AMD, the microprocessor company that is Intel’s chief competition and recently bought ATI, one of the two major players in the graphics card market, has announced that it will release open source drivers for its line of video cards. This is excellent, excellent news. Let me try to explain what this means to the non-techie audience.

The main thrust behind the GNU/Linux movement is free, open source, libre software. This means you can see the source code, you can redistribute the source code, you can modify the source code, and you can redistribute those modifications. Needless to say, the ramifications of these freedoms are extensive, and they are the major cause of GNU/Linux’s current success. By 1992, Richard Stallman and the GNU Project had put together all of the major components of a totally free operating system except for the kernel. With the addition of the Linux kernel to GNU, forming GNU/Linux, the world saw its first completely free modern operating system.

Unfortunately, there’s been a bit of backslide as of late. You can run your completely free operating system, but you won’t get very good performance out of your video cards. This is because, up until now, ATI and nVidia, the only real players in the high-performance graphics card market, have not released free versions of their graphics card drivers, nor have they released the specifications needed to create our own drivers. Reverse-engineered free drivers are out there, but they are bad, and they don’t take good advantage of the added power in the last few generations of graphics cards. So if you want to play a recent 3D commercial videogame under GNU/Linux, you really do need to use the proprietary drivers.

But the proprietary drivers have their own disadvantages. They aren’t as high quality as the Windows or Mac OS X drivers, but without the source code, we cannot fix their flaws. And they force us to do certain things that we do not wish done: for instance, the nVidia proprietary driver forces video-out output to enable Macrovision DRM, which degrades video quality. Those of us accustomed to using free software are driven crazy by this kind of nonsense, because, with free software, you have the freedom and the ability to modify the source code exactly as you see fit, so the software only does what you want it to do, and it certainly doesn’t do what a corporation is trying to force you to do if you don’t want it.

Thus, I am overjoyed by AMD’s announcement of upcoming open source drivers for their graphics cards. This will be a huge boon to free software everywhere. 3D applications (especially games) will run with much better performance. The only thing we need to watch out for is AMD’s clever use of the phrase “open source” rather than free. Open source does not always mean free, as Richard Stallman has pointed out. Microsoft has released some of its code under its own “open source” licenses, which don’t actually allow the essential free software freedoms, like being able to redistribute one’s modifications. If AMD releases their drivers in a truly free way, that will be excellent. If they release them “open source” but with non-free restrictions, it will be rubbish. I’m hoping they go the free route, and once they do, nVidia will really have no choice but to follow suit.

Mark Shuttleworth tackles Linux on commodity PCs

Wednesday, March 14th, 2007

Mark Shuttleworth, the financier behind Ubuntu (thanks Mark!), tackles the problem of selling Linux on commodity PCs in a recent blog post. He points out that profit margins are very low on these machines, and that co-marketing funds from Microsoft make up a significant proportion of the profits. Without these funds, the profit margins are so small that a problem with any single order can negate the profits on many orders. All it takes is one guy complaining that he “can’t install his [Windows] programs” and returning the computer to cancel out the profits on ten other sales. Unfortunately, the number of people who would do this kind of thing is way too high, as the average computer buyer really doesn’t know anything about Linux, and many sales of Linux PCs might end up being accidental, i.e. the person doesn’t realize what they’re getting into.

Mark also points out that it’s very expensive for Dell to try to cater to the wide range of desires that Linux users typically have. They want very specific things (e.g. this version of Ubuntu) and very specific types of hardware. Dell would have to deal with a huge compatibility list between dozens of distros and hardware configurations. In other words, not really practical.

So what’s the solution? Mark hits on it, but doesn’t fully consider it. It isn’t ideal, but then again, I don’t think there is an ideal solution to it. The idea is simple: ship the computers without an OS and include an install CD for your distro of choice in the box. All Dell would have to do is make sure their hardware is compatible with Linux (and that the distro they’re distributing has the correct drivers for it). Realistically, this is probably what most people would end up doing anyway. I ordered a machine pre-installed with Linux from Wal-Mart once, and the very first thing I did was install my own preferred distro. Even if a computer shipped with the latest version of Ubuntu, I don’t think I’d be able to resist the urge to reinstall. Who knows what Dell did to it? I’d rather just go with a default Gentoo install and make sure I know everything that’s going on.

So, as sad as that sounds, I think that is the solution: to just ship PCs without OSes and give the customer the opportunity to install the distro of their choice. This will help cut down Dell’s support costs; if the OS doesn’t come pre-installed, they don’t have to support it. And they can put prominent disclaimers on these OS-less computers saying that they’ll only support hardware issues. This should help to keep profits in the black, versus the unfortunate situation with customer support that I detailed above. This will be a good solution for experienced Linux users, and hey, for those that just want to try out Linux, I suppose an install guide could be shipped with the CD as well.

It’s just too bad about Microsoft’s monopoly. They hold such a stranglehold over the commodity PC market, and have successfully thrown their weight around for years to prevent Linux offerings. And now that Linux PCs from Dell may finally see the light of day, they’re going to be horribly stunted, as what we really want to do with them, have Linux pre-installed, is too expensive to support, and faces too many risks from the heavily Windows-centric PC user culture at large.