How to prevent Firefox from lagging badly when dragging selected text

Tuesday, October 28th, 2008

This past week I upgraded my system from Ubuntu 8.04 to Ubuntu 8.10. The upgrade was pretty smooth, with nothing much to report except that my system now boots without requiring the all_generic_ide kernel parameter, which is nice. One problem I immediately started seeing, however, was that my system would freeze up terribly whenever I selected more than a few words in Mozilla Firefox and tried dragging them anywhere. Depending on the size of the text block, my entire system could lock up for minutes at a time as it spent several seconds drawing each frame of the moving selection.

Well, I’d had enough of it, and I went looking for a solution. Firefox didn’t always render the entire contents of the selection being dragged and dropped; it used to display just a little icon next to the cursor. Here’s how to restore that older behavior and eliminate the lag caused by the fancy but ultimately unnecessary fully rendered dragging:

  1. Type about:config into Firefox’s location bar and hit Return.
  2. In the filter text edit box at the top of the window, type nglayout.
  3. Double-click on the nglayout.enable_drag_images row to change its value to false.
  4. That’s it! Firefox will no longer try to render the contents of the selection to the screen as you drag words around. For older systems or systems with poor graphical support (like mine, apparently), this is pretty much a mandatory change. Enjoy your new, faster Firefox!
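
If you’d like the setting re-applied automatically at every startup, the same preference can also be set in a user.js file in your Firefox profile directory (create the file if it doesn’t exist). A minimal sketch — the profile path varies by system, so the comment is illustrative:

```js
// In your Firefox profile directory (e.g. ~/.mozilla/firefox/<profile>/user.js).
// Preferences here are re-applied each time Firefox starts.
user_pref("nglayout.enable_drag_images", false);
```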

KDE 4.1 is out and good enough for everyday use

Thursday, August 14th, 2008

Version 4.1 of the K Desktop Environment (the first KDE 4.x release suitable for everyday use) for GNU/Linux came out recently, and last week I decided to give it a try. It’s pretty good, but it’s still a bit unpolished. Installing it in Ubuntu was simple, except for an error with one of the packages that forced me to remove them all and then reinstall them in a specific order using the dist-upgrade command. The process will become smoother when it hits the main Ubuntu repositories, but for now, just be forewarned that it still has a bit of a “beta” feel to it.

KDE 4.1 also did something weird to my xorg.conf, so I have to restore it to a working version upon each boot or I get dumped back into the basic vesa driver, which can only display a resolution of 640×480. Luckily I hardly ever reboot, or this would be more of an annoyance. Again, I expect this will be fixed by the final release. I don’t think these problems are even KDE 4.1’s fault, but rather a problem with the way the Ubuntu packagers configured things.

So, after debugging the problems (and you wouldn’t even be interested in checking out bleeding edge software releases if you seriously minded contributing to the edge being bleeding in the first place), KDE 4.1 is up and running, and it’s really nice. Whereas KDE 3.5 seemed to draw inspiration for its appearance from Windows 95, KDE 4.1 draws its inspiration from Windows Vista, and even gives it a run for its money. KDE 4.1 is a pleasure to look at, right from the boot-up screen all the way to the everyday tasks performed in the desktop environment. Even the applications menu, with its smoothly animated sliding pages, is well done. Appearance matters a lot more than most free software folks will admit to, so it’s good to see a free software project that really gets it.

KDE 4’s best new feature has to be the desktop plasmoids. A plasmoid is a view of a folder that sits on the desktop, so it is always beneath all other open applications and never shows up on the taskbar. The simplest use of a plasmoid is to show the contents of the desktop folder (creating a desktop folder plasmoid the size of the entire screen emulates the behavior of other desktop environments). Plasmoids are nice because they corral the icons that normally overwhelm a desktop into a nice sortable box. The real power of plasmoids is revealed when you create more of them — one for your Documents folder, another for Mozilla Firefox’s download directory, another for your BitTorrent download directory, etc. All of the files you need are always at your fingertips, in a neat, orderly manner that doesn’t overwhelm your taskbar. Organizing your files is as easy as dragging icons from one plasmoid to another. It’s such an incredible user interface improvement that it makes you wonder why no one thought of it before. Oh wait, they sort of have — anyone remember Windows 3.1 and its persistent folder windows?

KDE 4.1 also lacks some configuration options in comparison to KDE 3.5, but it’s apparently already a lot better than KDE 4.0 was, and most of the missing options should be restored in future releases. All of the basic options are there; you just don’t have the kind of intricate configurability that long-time KDE users might expect.

I would love to write about all of the new desktop widgets, but alas, I couldn’t get any of them working, and the same problem is reported by others trying out the Ubuntu build of KDE 4.1. This looks like another packaging error. Oh well. KDE 4.1 on Ubuntu is still perfectly usable as is; it just doesn’t have all the bells and whistles. If the problems I’ve listed so far aren’t deal-breakers for you, go ahead and download KDE 4.1 and try it out. Otherwise, you might want to wait another few weeks until the official mainline release is out. Even if you’ve never used KDE before (which is quite common for Ubuntu users, seeing as Gnome is the default desktop environment), you ought to give it a serious try. You might really like it.

How to learn Morse code in GNU/Linux

Monday, June 23rd, 2008

I know what you’re thinking — as if GNU/Linux and ham radio couldn’t possibly be nerdy enough when separate, let’s put them together! But let’s take a step back …

I started getting involved with ham radio just three months ago with VHF/UHF voice FM, and already I’m hungering for more. I don’t have an HF rig yet, and might not have one for a while, but since I know it’s something I’ll want to do eventually, I figure I should just start learning Morse code now. As for why I want to learn Morse code, I couldn’t exactly tell you — there’s just a certain romance to it, and pounding away on a key is such a delightfully different method of communicating than just speaking into a microphone. But ignoring why I want to learn it, here’s how I’m going about doing it, in GNU/Linux no less.

Learning Morse code on the computer is actually harder than it should be. I couldn’t find any Flash or Java applets that do something as simple as generate Morse code. Seriously. I found some really old Java applets that no longer function in current Java runtimes, but they don’t count. I found lots of DOS programs, many of which are pushing two decades old, but I wasn’t having much luck with them even under Windows. And since I’m running GNU/Linux as my primary desktop now, these programs weren’t helpful at all. Luckily, there’s a simple, up-to-date command-line utility for GNU/Linux that does all the basics with a minimum of fuss.

First, you’ll want the morse program. In Ubuntu or Debian GNU/Linux, you can do the following:

sudo apt-get install morse

If you’re not using Ubuntu or Debian, you should be able to find it using the package manager in your distro of choice.

Now, learning Morse is as simple as passing in the right command-line parameters to morse. Here’s what I’ve started with:

morse -rC 'ETAOINSHRDLU' -w 5 -Ts
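
To get a feel for what the utility is sending, here’s a rough shell sketch of the same letter-to-code translation, limited to the twelve practice letters in the command above. This is purely illustrative — the real morse program plays audio tones rather than printing dits and dahs:

```shell
#!/bin/bash
# Translate text to dits and dahs (ITU Morse) for the letters E T A O I N S H R D L U.
to_morse() {
    local text out="" c i
    text=$(printf '%s' "$1" | tr '[:lower:]' '[:upper:]')
    for ((i = 0; i < ${#text}; i++)); do
        c=${text:i:1}
        case $c in
            E) out+=". "    ;;  T) out+="- "    ;;
            A) out+=".- "   ;;  O) out+="--- "  ;;
            I) out+=".. "   ;;  N) out+="-. "   ;;
            S) out+="... "  ;;  H) out+=".... " ;;
            R) out+=".-. "  ;;  D) out+="-.. "  ;;
            L) out+=".-.. " ;;  U) out+="..- "  ;;
            " ") out+="/ "  ;;  # word separator
        esac
    done
    printf '%s\n' "${out% }"    # trim the trailing space
}

to_morse "sos"   # prints: ... --- ...
```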


64-bit GNU/Linux is totally ready for mainstream use

Monday, June 16th, 2008

When I was installing Ubuntu GNU/Linux with KDE on my latest desktop purchase, I faced a seemingly agonizing decision between 32-bit and 64-bit. There are all sorts of peripheral arguments over performance, but the main arguments on each side are that 32-bit can only support 4 GB of RAM (not technically true; PAE kernels can address more, though each 32-bit process is still limited) and that 64-bit has limited application support and is buggier.

Well, I’m happy to report that all of the supposed caveats of 64-bit GNU/Linux completely failed to materialize. After over a week of heavy usage of 64-bit Ubuntu, and installation of a few hundred applications, I haven’t run across a single problem stemming from my decision to use 64-bit. So I would say the choice of 64-bit is a no-brainer. 64-bit has reached maturity, and all of the supposed problems with it are problems of the past. 64-bit is the future of computing (just like 32-bit was the future of computing back when 16-bit was still common). It’s better to make the switch now than to find yourself a year or two down the line facing a 64-bit reinstallation of a 32-bit system. This choice is pretty much set in stone when you install an operating system; there is no upgrade path. So make the correct choice now.

I should point out that not all processors support 64-bit OSes. The older the processor, the less likely it is to offer 64-bit support. So do your due diligence before you accidentally end up downloading the wrong version of a GNU/Linux distribution ISO.
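
One quick way to do that due diligence on GNU/Linux: a 64-bit-capable x86 processor advertises the “lm” (long mode) flag, which the kernel reports in /proc/cpuinfo. A small sketch:

```shell
# A 64-bit-capable x86 CPU advertises the "lm" (long mode) flag in /proc/cpuinfo.
# -q: quiet (exit status only), -w: match "lm" as a whole word.
if grep -qw lm /proc/cpuinfo 2>/dev/null; then
    echo "This CPU can run a 64-bit OS"
else
    echo "32-bit only (or not an x86 GNU/Linux system)"
fi
```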

Meet Vertumnus, my new GNU/Linux desktop (running on a Dell Inspiron 530)

Wednesday, June 11th, 2008

If this post seems a little glowing, don’t be alarmed; it’s because I’m still basking in the brilliant sheen of my new GNU/Linux desktop (which I am composing this blog post on as I type these very words — and these words, too). That’s right, I went through with my plans for setting up a GNU/Linux desktop, though I didn’t actually use the parts list I threw together two weeks ago. I ran across an amazing deal through Dell’s small business site (instant savings of nearly half off!) on an Inspiron 530 and I jumped on it. For $360 ($407 after shipping and state taxes), I got a nice little Dell mini-tower with an Intel Core 2 Duo E8200 processor, 2 GB of DDR2 PC2-6400 RAM, a 500 GB SATA hard drive with 16 MB cache, a SATA DVD burner, a keyboard, and an optical scroll mouse. It ended up being about the same price as the parts list I put together, but the performance is marginally better, with the added possibility of upgrading to 4 GB of RAM. It also came with Windows Vista Home Premium, which I suppose would be a value add-in for some, but which just made me wince at how much cheaper I could have gotten this system without paying the Microsoft tax. Anyway, Vista’s in the trash now, where it belongs, and the price was good enough that I’m not worrying about it.

Installing the OS

I was going to install Kubuntu on my new system, but I opted for Ubuntu instead on a recommendation from Drinian, who says that Kubuntu isn’t quite as well put together. The only reason I wanted Kubuntu was that I wanted to run KDE instead of Gnome, but it turns out that’s incredibly easy to accomplish in Ubuntu (just install the kubuntu-desktop meta-package in aptitude, then set your login session to KDE). So choosing Ubuntu over Kubuntu hasn’t left me disappointed in any way.

Unfortunately, installing Ubuntu GNU/Linux still wasn’t as easy as it should have been. I blame the problem on hardware incompatibilities, most likely with the SATA controller on the motherboard. The installation CD wouldn’t boot without passing the kernel parameter “all_generic_ide”, which is something I can handle but which is likely to turn off the average computer user. Then, after the installation completed, my system wouldn’t boot from the hard drive for the same reason, so I had to boot back into the LiveCD environment, mount my boot partition, and edit menu.lst for GRUB (the bootloader) to pass the same kernel parameter. So, yeah, GNU/Linux isn’t exactly friendly for the masses, at least not on this hardware. Curiously enough, I had this exact same problem when dual-booting Fedora Core (another GNU/Linux distribution) on my previous desktop. There’s definitely room for improvement here by either the Linux kernel developers or the Ubuntu packagers. There’s no real reason this can’t be one of those things that “Just Works”.
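
For reference, the fix amounts to appending the parameter to the kernel line of the boot entry in /boot/grub/menu.lst. An illustrative entry — your kernel version and root device will differ:

```
title  Ubuntu 8.04, kernel 2.6.24-16-generic
root   (hd0,0)
kernel /boot/vmlinuz-2.6.24-16-generic root=/dev/sda1 ro quiet splash all_generic_ide
initrd /boot/initrd.img-2.6.24-16-generic
```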

Naming the system

But after the minor hitch with “all_generic_ide”, everything else worked just fine. It was the smoothest GNU/Linux installation I believe I’ve ever done. GNU/Linux graphical installers have become quite advanced, putting anything Microsoft offers to shame. Actually, the part of the installation process that took the longest was picking a name for my new computer. I have a long history of naming computers after various mythologies, deities, or nerdy things (Ixion, Dark Anima, Fyre, Quezacoatl, Geminoid, Phoenix, etc.), so I wanted to continue the theme. I figured since this is the first time I’ve ever used a dedicated GNU/Linux system as my primary desktop (as opposed to Microsoft Windows), I wanted to emphasize the change this brings to my computing life. So I got into a lively discussion on IRC with someone who apparently knows a good deal about ancient Greek/Roman mythology, and his best suggestion was the Roman god Vertumnus, who is “the god of seasons, change and plant growth, as well as gardens and fruit trees”. I liked both the change aspect and the environmental aspect, so Vertumnus it was.


Mark Shuttleworth tackles Linux on commodity PCs

Wednesday, March 14th, 2007

Mark Shuttleworth, the financier behind Ubuntu (thanks Mark!), tackles the problem of selling Linux pre-installed on commodity PCs in a recent blog post. He points out that profit margins are very low on these machines, and that co-marketing funds from Microsoft make up a significant proportion of the profits. Without those funds, the margins are so small that a problem with any single order can negate the profits on many others. All it takes is one guy complaining that he “can’t install his [Windows] programs” and returning the computer to cancel out the profits on ten other sales. Unfortunately, the number of people who would do this kind of thing is way too high, as the average computer buyer really doesn’t know anything about Linux, and many sales of Linux PCs might end up being accidental, i.e. the buyer doesn’t realize what they’re getting into.

Mark also points out that it’s very expensive for Dell to try to cater to the wide range of desires that Linux users typically have. They want very specific things (e.g. this version of Ubuntu) and very specific types of hardware. Dell would have to deal with a huge compatibility list between dozens of distros and hardware configurations. In other words, not really practical.

So what’s the solution? Mark hits on it, but doesn’t fully consider it. It isn’t ideal, but then again, I don’t think there is an ideal solution. The idea is simple: ship the computers without an OS and include an install CD for your distro of choice in the box. All Dell would have to do is make sure their hardware is compatible with Linux (and that the distro they’re shipping has the correct drivers for it). Realistically, this is probably what most Linux users would end up doing anyway. I once ordered a machine pre-installed with Linux from Wal-Mart, and the very first thing I did was install my own preferred distro. Even if a computer shipped with the latest version of Ubuntu, I don’t think I’d be able to resist the urge to reinstall. Who knows what Dell did to it? I’d rather just go with a default Gentoo install and make sure I know everything that’s going on.

So, as sad as that sounds, I think that is the solution: just ship PCs without OSes and give customers the opportunity to install the distro of their choice. This would also cut down Dell’s support costs: if the OS doesn’t come pre-installed, they don’t have to support it. And they can put prominent disclaimers on these OS-less computers saying that they’ll only support hardware issues. This should help keep these sales profitable, versus the unfortunate customer-support situation I detailed above. It would be a good solution for experienced Linux users, and hey, for those who just want to try out Linux, I suppose an install guide could be shipped with the CD as well.

It’s just too bad about Microsoft’s monopoly. They hold such a stranglehold over the commodity PC market, and have successfully thrown their weight around for years to prevent Linux offerings. And now that Linux PCs from Dell may finally see the light of day, they’re going to be horribly stunted, as what we really want to do with them, have Linux pre-installed, is too expensive to support, and faces too many risks from the heavily Windows-centric PC user culture at large.

Software upgrade miscellanea

Wednesday, January 3rd, 2007

A little bit of background on this server may be helpful to start off with. I bought this server in 2003 for $200. That’s not a typo. It was the lowest model in a line of bargain-basement Linux PCs offered by Wal-Mart. Other than adding a spare 256 MB RAM DIMM I had lying around (the computer only came with 128 MB!), I haven’t made any upgrades. That means it’s still running on the same slow 20 GB hard drive and the same bargain AMD Duron 1200 processor. Note that the only reason I call it a server is because that’s the function it fulfills; in everything but name, it’s really just a crappy, over-the-hill desktop.

Soon after I bought this server I ended up using it to host a site called terpy.net. It ran on Slashcode (the engine that drives Slashdot). It was aimed at University of Maryland students. I eventually killed off the site due to a lack of interest on my part in updating it, and I only revived this server two months ago to host this site.

Since the server is under-powered, it has a little bit of trouble handling even these modest webserving tasks. So I just finished up two modifications that should make it run a little bit faster. First, I changed DNS servers: there were DNS resolution issues with the router, so I pointed this server at different DNS servers (strictly speaking, this has nothing to do with the server’s speed). Second, I installed eAccelerator, which caches compiled PHP scripts. I can already feel the increased responsiveness of the server. In particular, the connection no longer seems to hang as long after the post comment button is clicked (SpamKarma2 is fairly computationally expensive). Incidentally, if anyone wants to know how to install eAccelerator on Ubuntu Linux, this is a good tutorial.

As for the blog itself, I’m already up to several thousand visits, which is a very good number considering this blog is less than a month old. However, that growth can mainly be attributed to a scam advertisement in Popular Science. I scanned in that terrible ad and sent an email off to PZ Myers, whom I know from back when we both used talk.origins a lot. He found it intriguing and posted it on his blog Pharyngula, which gave me over one thousand visits within a day. From there, the ad was apparently intriguing enough to spread through the blogosphere like ripples through a pond. It hit Punk Ass Blog, which was responsible for about a hundred visits, and also spread through the StumbleUpon social bookmarking/browsing site, which brought in even more visits than Pharyngula. So I think it’s safe to say that without that one blog entry there would be many fewer visitors to this site. Hopefully enough of those people stuck around long enough to check out some of the other content. Judging by the continuous hits on the RSS feed and the number of visitors to the scam ad entry who didn’t immediately leave that page, I’ll make a conservative guess that I have around a dozen regular-ish readers so far. That’s not bad. It could grow bigger, certainly, but for just starting out, it’s not bad at all.