Stephen Fry celebrates GNU’s 25th birthday

Wednesday, September 3rd, 2008

Now this is a slightly unexpected, yet nevertheless entirely awesome, bit of news. Stephen Fry, the British comedian of Fry & Laurie fame (that's Hugh Laurie, the actor who plays Dr. House on House), has released a celebratory message to GNU on its 25th anniversary. It contains a good bit of background on GNU and Linux, though nothing that should be new to you if you've been involved in the Free Software community for a while.

Still, it's a nice video, and it's cool to see someone so, well, famous extolling the virtues of Free Software. Check it out! Unfortunately, it'll play a lot better in the United Kingdom than here in the United States, since people there actually know who he is. We just need to get an American equivalent to tape something equally complimentary about GNU/Linux. How about … Scarlett Johansson?

KDE 4.1 is out and good enough for everyday use

Thursday, August 14th, 2008

Version 4.1 of the K Desktop Environment (the first release of KDE 4.x suitable for everyday use) for GNU/Linux came out recently, and last week I decided to give it a try. It's pretty good, but it's still a bit unpolished. Installing it in Ubuntu was simple, except for an error with one of the packages that forced me to remove them all and then install them again in a specific order using the dist-upgrade command (the rough shape of the process is sketched below). Things will become smoother when the packages hit the main Ubuntu repositories, but for now, just be forewarned that it still has a bit of a "beta" feel to it.
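
For anyone who wants to follow along, the rough shape of my install was: add the Kubuntu team's KDE 4.1 package archive, install the KDE 4 metapackage, and let dist-upgrade pull in the rest. The archive line and metapackage name below are from memory, so treat them as placeholders and double-check them against the release announcement:

# /etc/apt/sources.list.d/kde41.list -- archive line is an assumption; verify it
deb http://ppa.launchpad.net/kubuntu-members-kde4/ubuntu hardy main

sudo apt-get update
sudo apt-get install kubuntu-kde4-desktop   # metapackage name is an assumption
sudo apt-get dist-upgrade                   # straightens out the remaining packages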

KDE 4.1 also did something weird with my xorg.conf, so I have to restore it to a working version after each boot or I get dumped back into the basic vesa driver, which can only display a resolution of 640×480. Luckily I hardly ever reboot, or this would be more of an annoyance. Again, I expect this to be fixed by the time the packages reach the main Ubuntu repositories. I don't think these problems are even KDE 4.1's fault; they look like problems with the way the Ubuntu packager configured things.
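
My workaround is unglamorous: keep a known-good copy of xorg.conf around, put it back whenever the broken one reappears, and restart X. Roughly (the backup file name is just my own convention, and the display manager to restart is whichever one you actually use):

# One time: stash the working config
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.good

# After a boot that dumps you into 640x480 vesa: restore it and restart X
# (this logs you out of the session)
sudo cp /etc/X11/xorg.conf.good /etc/X11/xorg.conf
sudo /etc/init.d/kdm restart   # or gdm, depending on your login manager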

So, after debugging the problems (and you wouldn't be interested in checking out bleeding-edge software in the first place if you seriously minded doing a little of the bleeding yourself), KDE 4.1 is up and running, and it's really nice. Whereas KDE 3.5 seemed to draw inspiration for its appearance from Windows 95, KDE 4.1 draws its inspiration from Windows Vista, and even gives it a run for its money. KDE 4.1 is a pleasure to look at, right from the boot-up screen all the way through the everyday tasks performed in the desktop environment. Even the applications menu, with its smoothly animated sliding pages, is well done. Appearance matters a lot more than most free software folks will admit, so it's good to see a free software project that really gets it.

KDE 4's best new feature has to be the desktop plasmoids. A plasmoid is a widget that lives on the desktop; the one I rely on is the folder view, which displays the contents of a folder right on the desktop, so it is always beneath all other opened applications and never shows up on the taskbar. The simplest use is to show the contents of the Desktop folder (a folder view plasmoid the size of the entire screen emulates the behavior of other desktop environments). Plasmoids are nice because they corral the icons that normally overwhelm a desktop into a nice sortable box. The real power is revealed when you create more of them: one plasmoid for your Documents folder, another for the download directory for Mozilla Firefox, another for the download directory for BitTorrent, and so on. All of the files you need are always at your fingertips, in a neat, orderly manner that doesn't overwhelm your taskbar. Organizing your files is as easy as dragging icons from one plasmoid to another. It's such an incredible user interface improvement that it makes you wonder why no one thought of it before. Oh wait, they sort of have — anyone remember Windows 3.1 and its persistent folder windows?

KDE 4.1 also lacks some configuration options compared to KDE 3.5, but it's apparently already a lot better than KDE 4.0 was, and most of the missing options should be restored in future releases. All of the basic options are there; you just don't have the kind of intricate configurability that long-time KDE users might expect.

I would love to write about all of the new desktop widgets, but alas, I couldn't get any of them working, and this error is echoed by others out there trying the Ubuntu build of KDE 4.1. This looks like another packaging error. Oh well. KDE 4.1 on Ubuntu is still perfectly usable as is; it just doesn't have all the bells and whistles. If the problems I've listed so far aren't deal-breakers for you, go ahead and download KDE 4.1 and try it out. Otherwise, you might want to wait another few weeks or so until the packages reach the main Ubuntu repositories. Even if you've never used KDE before (which is quite common for Ubuntu users, seeing as how Gnome is the default desktop environment), you ought to give it a serious try. You might really like it.

64-bit GNU/Linux is totally ready for mainstream use

Monday, June 16th, 2008

When I was installing Ubuntu GNU/Linux with KDE on my latest desktop purchase, I faced a seemingly agonizing decision between 32-bit and 64-bit. There are all sorts of peripheral arguments over performance, but the main arguments on each side are that 32-bit can only support 4 GB of RAM (not technically true, thanks to PAE) and that 64-bit has limited application support and is more buggy.

Well, I’m happy to report that all of the supposed caveats of 64-bit GNU/Linux completely failed to materialize. After over a week of heavy usage of 64-bit Ubuntu, and installation of a few hundred applications, I haven’t run across a single problem stemming from my decision to use 64-bit. So I would say the choice of 64-bit is a no-brainer. 64-bit has reached maturity, and all of the supposed problems with it are problems of the past. 64-bit is the future of computing (just like 32-bit was the future of computing back when 16-bit was still common). It’s better to make the switch now than to find yourself a year or two down the line facing a 64-bit reinstallation of a 32-bit system. This choice is pretty much set in stone when you install an operating system; there is no upgrade path. So make the correct choice now.

I should point out that not all processors support 64-bit OSes. The older the processor, the less likely it is to offer 64-bit support. So do your due diligence before you accidentally download the wrong version of a GNU/Linux distribution ISO; a quick check is shown below.
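
If you can boot any GNU/Linux environment on the machine in question (a LiveCD is fine), one quick check is to look for the "lm" (long mode) flag in /proc/cpuinfo; this is generic Linux, nothing distribution-specific:

# The lm flag means the CPU speaks x86-64, i.e. it can run a 64-bit OS
grep -qw lm /proc/cpuinfo && echo "64-bit capable" || echo "32-bit only"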

Meet Vertumnus, my new GNU/Linux desktop (running on a Dell Inspiron 530)

Wednesday, June 11th, 2008

If this post seems a little glowing, don’t be alarmed; it’s because I’m still basking in the brilliant sheen of my new GNU/Linux desktop (which I am composing this blog post on as I type these very words — and these words, too). That’s right, I went through with my plans for setting up a GNU/Linux desktop, though I didn’t actually use the parts list I threw together two weeks ago. I ran across an amazing deal through Dell’s small business site (instant savings of nearly half off!) on an Inspiron 530 and I jumped on it. For $360 ($407 after shipping and state taxes), I got a nice little Dell mini-tower with an Intel Core 2 Duo E8200 processor, 2 GB of DDR2 PC2 6400 RAM, 500GB SATA hard drive with 16 MB cache, SATA DVD burner, keyboard, and optical scroll mouse. It ended up being about the same price as the parts list I put together, but the performance is marginally better, with the added possibility of upgrading to 4 GB of RAM. It also came with Windows Vista Home Premium, which I suppose would be a value add-in for some, but which just made me wince at how much cheaper I could have gotten this system without paying the Microsoft tax. Anyway, Vista’s in the trash now, where it belongs, and the price was good enough that I’m not worrying about it.

Installing the OS

I was going to install Kubuntu on my new system, but I opted for Ubuntu instead on a recommendation from Drinian, who says that Kubuntu isn’t quite as well put-together. The only reason I wanted Kubuntu was because I wanted to run KDE instead of Gnome, but it turns out that’s incredibly easy to accomplish in Ubuntu (just install the kubuntu-desktop meta-package in aptitude, then set your login session to KDE). So choosing Ubuntu over Kubuntu hasn’t left me disappointed in any way.
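
For anyone wanting to do the same, this is essentially the whole procedure (the metapackage name is the one mentioned above; the session is switched from the login screen):

# Pull in KDE and the Kubuntu applications alongside the stock Gnome desktop
sudo aptitude install kubuntu-desktop

# The installer will ask whether to use gdm or kdm as the login manager; either
# works. Then log out and choose "KDE" under Options -> Select Session.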

Unfortunately, installing Ubuntu GNU/Linux still wasn’t as easy as it should have been. I blame the problem on hardware incompatibilities, most likely with the SATA controller on the motherboard. The installation CD wouldn’t boot without passing the kernel parameter “all_generic_ide”, which is something I can handle but the average computer user is likely to be turned off by. Then, after the installation completed, my system wouldn’t boot from the hard drive for the same reason, so I had to boot back into the LiveCD environment, mount my boot partition, and then edit grub’s (a bootloader) menu.lst to pass that same kernel parameter. So, yeah, GNU/Linux isn’t exactly friendly for the masses, at least not on this hardware. Curiously enough, I had this exact same problem when dual-booting Fedora Core (another distribution of GNU/Linux) on my previous desktop. There’s definitely some room for improvement in this area by either the Linux kernel developers or the Ubuntu packagers. There’s no real reason this can’t be one of those things that “Just Works”.
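
For reference, the menu.lst change just means appending the parameter to the kernel line(s); the kernel version, UUID, and device below are placeholders, not my actual values:

# /boot/grub/menu.lst, edited from the LiveCD with the installed partition mounted
title   Ubuntu 8.04, kernel 2.6.24-19-generic
root    (hd0,0)
kernel  /boot/vmlinuz-2.6.24-19-generic root=UUID=xxxx ro quiet splash all_generic_ide
initrd  /boot/initrd.img-2.6.24-19-generic

# Also append it to the "# kopt=..." line in the same file so that update-grub
# keeps the parameter when new kernels are installed.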

Naming the system

But after the minor hitch with "all_generic_ide", everything else worked just fine. It was the smoothest GNU/Linux installation I believe I've ever done. The GNU/Linux graphical installers have become quite advanced, putting anything Microsoft offers to shame. Actually, the part of the installation process that took the longest was picking a name for my new computer. I have a long history of naming computers after various mythologies, deities, or nerdy things (Ixion, Dark Anima, Fyre, Quezacoatl, Geminoid, Phoenix, etc.), so I wanted to continue the theme. Since this is the first time I've ever used a dedicated GNU/Linux system as my primary desktop (as opposed to Microsoft Windows), I wanted to emphasize the change this brings to my computing life. So I got into a lively discussion on IRC with someone who apparently knows a good deal about ancient Greek and Roman mythology, and his best suggestion was the Roman god Vertumnus, who is "the god of seasons, change and plant growth, as well as gardens and fruit trees". I liked both the change aspect and the environmental aspect, so Vertumnus it was.


Specs for a high power, cheap ($380) GNU/Linux desktop

Wednesday, May 28th, 2008

The other day, I realized that I don't use GNU/Linux as often as I should. Sure, I run it exclusively on my servers, but I still use Windows on the desktop for the most part. That's more out of habit than out of any need. Everything I currently do in Windows I can do in GNU/Linux, except for games, which I'm playing less and less these days. I dual-booted my current desktop with Windows XP and GNU/Linux for a while, but it proved to be inconvenient. My computers' uptimes, both servers and desktops, are typically measured in months (only going down for crashes and power losses). It takes a while to reboot and restart all of the applications I typically have running, so I don't do it by choice. Thus you can see the problem with dual-booting: it entails constant rebooting, which I had to do as often as I felt like playing a Windows game. And once I was in Windows, I wouldn't want to go through the hassle of booting into GNU/Linux only to boot back into Windows the next time I wanted to play a game. It simply wasn't working.

So now I see the problem with my initial attempts at using GNU/Linux on the desktop: I simply don't have the patience to put up with all of those constant reboots and interruptions in my computing environment. I'm too lazy. So I'm simply going to get another desktop to use exclusively for GNU/Linux, while making every effort to use my current Windows desktop only for playing games. And luckily, building a desktop computer is cheaper than it's ever been. Here is a parts list I put together just yesterday for a killer GNU/Linux desktop.

The specs

This complete GNU/Linux system costs only $355. Throw in shipping and we'll call it $380. That's a really cheap price considering how powerful this system is. Avoiding the Microsoft tax by choosing a Free operating system pays huge dividends when the overall system is this cheap. Allow me to explain the choices I made in putting this system together with an analysis of each of the components:

The barebone system

First of all, I save a lot of money with this computer by building it on a barebone system. A price of $90 for a case, power supply, and motherboard is really hard to beat. You can easily spend over $90 on each of those individual components (and in fact, when I built my current desktop, I did). Getting a good barebone system is an excellent way to save a lot of money on a low-end desktop. If you're not building a low-end desktop, though, I wouldn't bother; the limitations can be significant. For instance, the motherboard that ships in the barebone I picked out supports a maximum of 2 GB of RAM: fine for a low-end system, but you really want 4 GB of RAM in a medium or high-end system. And the power supply is only 250W; again, fine for a low-end system, but don't expect it to be able to power, say, a high-end discrete video card. Naturally, the motherboard doesn't support dual video cards either, which is an upgrade path you might want to keep open on a system you're spending more money on. It also doesn't support quad-core processors. So there are limitations, but on a low-end system, you won't run into them.


Gentoo Linux tutorial: Playing m4a song files in Amarok

Wednesday, November 21st, 2007

Several years ago, I bought many albums from the iTunes Music Store. I know, it was a stupid thing to do, and I regret it. But back then, I had a PowerBook (which has since broken), so everything “just worked”. Well, I have a new laptop now that I’m running GNU/Linux on, so it no longer “just works”. Luckily, I’ve found a solution that does work.

The first step is stripping the abysmal “FairPlay” DRM (that’s Digital Restrictions Management to those in the know) off of the encrypted m4p song files that I purchased. This was fairly easy using a program called QTFairUse — unfortunately, it only runs on Windows. You won’t find anything to do it under GNU/Linux because it needs to use iTunes to work.

So once I stripped the encryption off the .m4p files, I was left with .m4a song files. They're not encrypted, but they're also not a very widely used format, and they aren't supported by most audio player software. The simplest solution would be to transcode them to ogg or mp3, but that's a bad idea: you shouldn't ever convert from one lossy format to another. It's like making a xerox copy of a xerox copy; the quality loss accumulates. Now, if you have a non-iPod portable digital music player, you don't really have many choices, because hardly any of them support m4a. Personally, I don't see anything ethically wrong with downloading nice quality mp3s of songs you've already purchased just to avoid the transcoding quality loss, but I'm sure the law disagrees with me there.

Anyway, I digress. This tutorial is about playing m4a files natively under Gentoo GNU/Linux, without having to transcode them and suffer a loss of quality. It works perfectly for me, since I don’t even use my portable digital music player anymore. I just play everything on my computers. I’ll be using my favorite Free Software audio player program in this tutorial, Amarok. Luckily, everything necessary for the playback of m4a files is already in the source code — it just isn’t compiled in by default. So we’ll have to modify the USE flags to get it to work. But don’t fret; this is a very common procedure in Gentoo GNU/Linux.
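
Here is the rough shape of the change; the exact USE flag names, and whether they belong on Amarok itself or on its xine-lib backend, vary by version, so treat these as assumptions and verify them with equery uses (from gentoolkit) first:

# /etc/portage/package.use -- enable AAC/MP4 playback support (flag names are
# assumptions; check with `equery uses xine-lib` and `equery uses amarok`)
media-libs/xine-lib aac
media-sound/amarok aac mp4

# Rebuild the packages whose USE flags changed, then restart Amarok
emerge --ask --newuse media-libs/xine-lib media-sound/amarok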


Dell laptop hwclock incompatibilities in Gentoo GNU/Linux

Sunday, August 26th, 2007

I’m installing Gentoo GNU/Linux on my Dell Inspiron 9400 laptop right now, and as usual, things never are as easy as they would seem. I finally found out about and installed the 915resolution package, which allows me to use the laptop’s display in its native widescreen resolution. Then I ran into a problem with the hardware clock. I couldn’t set it. I would get an error message at shutdown saying it could not set the hardware clock to the system clock. And since the hardware clock was set in localtime rather than UTC (because the previous installation was Windows), I would keep getting all of these timestamp errors about files being modified in the future. I had to reset the system clock manually after each boot.

So I looked into what was going on, and it turns out the hwclock command wasn't functioning. This is the command used to read and set the hardware clock. Here is the error message I was getting:

$ hwclock
select() to /dev/rtc to wait for clock tick timed out

A preliminary Google search didn't turn up anything useful. So I relaxed the search terms and came upon a thread in the Arch Linux users forum. One of the users mentioned an incompatibility with Dell laptop motherboards and suggested passing the --directisa parameter to hwclock. Testing it out from the command line, I confirmed that it works instantly, rather than freezing up for a few seconds and then timing out. This allowed me to set the hardware clock manually.
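
For the record, the manual fix-up amounted to something like this (swap --utc for --localtime depending on how you keep the hardware clock):

# Read the hardware clock via direct ISA port I/O instead of /dev/rtc
hwclock --show --directisa

# Once the system clock is correct, write it back to the hardware clock
hwclock --systohc --utc --directisa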

But wait, I wasn’t finished yet. I was still going to get the same errors during shutdown that I got earlier, because the init.d script for hwclock wasn’t using that parameter. In addition to an annoying error message during shutdown, that means my hardware clock would slowly drift over time, and I would have to periodically reset it manually. That’s unacceptable. So I modified the /etc/init.d/clock init script as follows.

I changed the line

myopts="${myopts} ${CLOCK_OPTS}"

to

myopts="${myopts} ${CLOCK_OPTS} --directisa"

This line is located inside of the setupopts() function, which is called near the beginnings of both the start() and stop() functions. This is the simplest fix you can make to the clock init script so that hwclock is always called with the --directisa parameter, and thus, it always works.

And that’s it! The clock on my Dell Inspiron 9400 laptop is working perfectly now.

AMD announces open source GNU/Linux drivers for its video cards

Sunday, May 13th, 2007

Color me excited. AMD, the microprocessor company that is Intel's chief competition and that recently bought ATI, one of the two major players in the graphics card market, has announced that it will release open source drivers for its line of video cards. This is excellent, excellent news. Let me try to explain what this means to the non-techie audience.

The main thrust behind the GNU/Linux movement is free, open source, libre software. This means you can see the source code, you can redistribute the source code, you can modify the source code, and you can redistribute those modifications. Needless to say, the ramifications of these freedoms are extensive, and they are the major cause of GNU/Linux's current success. By the early 1990s, Richard Stallman and the GNU project had put together all of the major components of a totally free operating system except for the kernel. With the addition of the Linux kernel to GNU in 1992, forming GNU/Linux, the world saw its first completely free modern operating system.

Unfortunately, there's been a bit of a backslide of late. You can run your completely free operating system, but you won't get very good performance out of your video card. This is because, up until now, ATI and nVidia, the only real players in the high-performance graphics card market, have not released free versions of their graphics card drivers, nor have they released the specifications that would let us write our own. Reverse-engineered free drivers are out there, but they are bad, and they don't take advantage of the added power in the last few generations of graphics cards. So if you want to play a recent commercial 3D videogame under GNU/Linux, you really do need to use the proprietary drivers.

But the proprietary drivers have their own disadvantages. They aren't as high quality as the Windows or Mac OS X drivers, and without the source code, we cannot fix their flaws. They also force us to do things we do not wish done: for instance, the nVidia proprietary driver forces Macrovision DRM on the video-out port, which degrades video quality. Those of us accustomed to using free software are driven crazy by this kind of nonsense, because with free software you have the freedom and the ability to modify the source code exactly as you see fit, so the software does only what you want it to do, and certainly not what a corporation is trying to force on you.

Thus, I am overjoyed by AMD's announcement of upcoming open source drivers for their graphics cards. This will be a huge boon to free software everywhere. 3D applications (especially games) will run with much better performance. The only thing we need to watch out for is AMD's clever use of the phrase "open source" rather than "free". Open source does not always mean free, as Richard Stallman has pointed out. Microsoft has released some of its code under its own "open source" licenses, which don't actually allow the essential free software freedoms, like being able to redistribute one's modifications. If AMD releases its drivers in a truly free way, that will be excellent. If it releases them "open source" but with non-free restrictions, it will be rubbish. I'm hoping they go the free route, and once they do, nVidia will really have no choice but to follow suit.

Mark Shuttleworth tackles Linux on commodity PCs

Wednesday, March 14th, 2007

Mark Shuttleworth, the financier behind Ubuntu (thanks Mark!), tackles the problem of selling Linux on commodity PCs in a recent blog post. He points out that profit margins are very low on these machines, and that co-marketing funds from Microsoft make up a significant proportion of the profits. Without those funds, the margins are so small that a problem with any single order can negate the profits on many others. All it takes is one guy complaining that he "can't install his [Windows] programs" and returning the computer to cancel out the profits on ten other sales. Unfortunately, the number of people who would do this kind of thing is way too high, as the average computer buyer really doesn't know anything about Linux, and many sales of Linux PCs might end up being accidental, i.e. the buyer doesn't realize what they're getting into.

Mark also points out that it’s very expensive for Dell to try to cater to the wide range of desires that Linux users typically have. They want very specific things (e.g. this version of Ubuntu) and very specific types of hardware. Dell would have to deal with a huge compatibility list between dozens of distros and hardware configurations. In other words, not really practical.

So what’s the solution? Mark hits on it, but doesn’t fully consider it. It isn’t ideal, but then again, I don’t think there is an ideal solution to it. The idea is simple: ship the computers without an OS and include an install CD for your distro of choice in the box. All Dell would have to do is make sure their hardware is compatible with Linux (and that the distro they’re distributing has the correct drivers for it). Realistically, this is probably what most people would end up doing anyway. I ordered a machine pre-installed with Linux from Wal-Mart once, and the very first thing I did was install my own preferred distro. Even if a computer shipped with the latest version of Ubuntu, I don’t think I’d be able to resist the urge to reinstall. Who knows what Dell did to it? I’d rather just go with a default Gentoo install and make sure I know everything that’s going on.

So, as sad as that sounds, I think that is the solution: to just ship PCs without OSes and give the customer the opportunity to install the distro of their choice. This will help cut down Dell’s support costs; if the OS doesn’t come pre-installed, they don’t have to support it. And they can put prominent disclaimers on these OS-less computers saying that they’ll only support hardware issues. This should help to keep profits in the black, versus the unfortunate situation with customer support that I detailed above. This will be a good solution for experienced Linux users, and hey, for those that just want to try out Linux, I suppose an install guide could be shipped with the CD as well.

It's just too bad about Microsoft's monopoly. They hold such a stranglehold over the commodity PC market, and they have successfully thrown their weight around for years to prevent Linux offerings. And now that Linux PCs from Dell may finally see the light of day, they're going to be horribly stunted, because what we really want, Linux pre-installed, is too expensive to support and faces too many risks from the heavily Windows-centric PC user culture at large.

Beware of Linux incompatibilities: a tale of hardware woe

Tuesday, February 13th, 2007

I've learned an unfortunate lesson: if you're building a new computer and you intend to put Linux on it, you really ought to make sure the two are compatible first. See, pretty much every new computer component is compatible with Windows by default, but the same is not true for Linux. The Windows drivers are generally written by the hardware company itself; they are essential if the company wants to sell the hardware at all. But Linux drivers aren't such a high priority, and frequently they end up being written by Linux developers who have to reverse-engineer and hack them together because the specifications aren't open.

Thus my problem. The motherboard I bought, the GIGABYTE GA-965P-S3 LGA 775 Intel P965 Express ATX Intel Motherboard, has multiple incompatibilities with Linux. It uses a JMicron SATA controller. I was able to find an experimental driver out there that seemed to work, but it's not yet included in any mainstream Linux distribution or in the mainline Linux kernel. I did find a modified Gentoo install CD that included the driver, so I was able to detect the hard drives, but then I ran into another problem: the network interface isn't supported. D'oh! I still haven't been able to get Linux working on this machine. I'm thinking I'll try Ubuntu next; maybe I'll have more luck.

But anyway, the lesson is this: when buying new hardware, make sure that everything is supported by the Linux distribution of your choice. I could have just as easily gone with a motherboard of similar price and feature set that was actually compatible with Linux today. A quick pre-purchase sanity check is sketched below.
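
If you have any GNU/Linux environment handy (a LiveCD on the machine itself, or on a similar demo unit), a rough pre-purchase check is to pull the PCI vendor and device IDs of the onboard controllers and search for them, together with the word "linux", on your distribution's forums and hardware lists. Nothing here is distro-specific:

# List every PCI device along with its numeric [vendor:device] ID
lspci -nn

# Narrow it to the parts that usually cause grief: storage and networking
lspci -nn | grep -i -E 'sata|raid|ethernet|network'

# Search the web for the bracketed IDs (plus "linux") before handing over money.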