Archive for June, 2008

Minor hardware upgrade news

Saturday, June 21st, 2008

Having just gotten a new computer a scant two weeks ago, I’ve already failed at resisting the urge to start pimping it out. I should note that the whole point of this endeavor was to build a cheap computer. Well, today I added another 2 GB of RAM (at a cost of $25) and a 400 GB hard drive (transferred from another computer). I’m lucky I already had that hard drive lying around; otherwise, I’d be out another, what, $80?

So the total price of my “cheap” system, if you don’t have any components lying around and have to buy everything from scratch, has ballooned to over $500. And that’s not even the end of it. I thought I could get away without a discrete graphics card; well, now I’m finding out that maybe I can’t. I’ve been playing around with Compiz, the 3D desktop manager, and have also gotten interested in running some 3D Windows games in Wine. So it looks like I will need something better than Intel integrated graphics after all. And with the recent news that ATI is beefing up its Linux support, it’s proving hard to resist.

I still contend it’s possible to build a decent GNU/Linux desktop computer for $300. It’s just not something I seem capable of. I have the upgrade bug. The first time I happen to examine top and notice that I’m using swap space (gah!), I’m off buying 2 GB more RAM. A similar thing happens when I fill up all my hard drives (the whole reason I added this 400 GB hard drive is that the 500 GB one the system came with is already full).
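
For fellow sufferers of the upgrade bug, here’s the check that sets me off, as a minimal sketch: reading swap usage straight out of /proc/meminfo, the same figures top reports. This is Linux-specific, and the parsing is my own quick hack rather than any standard library routine.

```python
# Report swap usage from /proc/meminfo (the same figures top displays).
def swap_used_kb():
    fields = {}
    with open('/proc/meminfo') as f:
        for line in f:
            key, value = line.split(':', 1)
            fields[key] = int(value.split()[0])  # values are reported in kB
    return fields['SwapTotal'] - fields['SwapFree']

used = swap_used_kb()
if used:
    print('Using %d kB of swap. Time to buy more RAM, apparently.' % used)
else:
    print('No swap in use. The RAM holds, for now.')
```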

Antenna preparations for ARRL Field Day

Friday, June 20th, 2008

It’s been a while since I’ve discussed non-computer-related construction projects on this blog, so to break the drought, here are some details on an upcoming antenna project.

The American Radio Relay League’s annual Field Day is coming up next weekend. Field Day is the largest weekend of the year for amateur radio operators. It includes all sorts of outreach activities, as well as heavy contesting (racing to see who can make the most radio contacts over the weekend). Since I only became involved with amateur radio recently, it’ll be my first Field Day. Unfortunately, the only antenna I’m operational on right now is a 44″ magnetic mount 70cm/2m dual-band whip antenna. It’s decent for operating mobile, but its performance isn’t anything to write home about.

Luckily, I bought a 17-foot 70cm/2m dual-band base station antenna at a hamfest in March. A 201.5″ antenna is a bit more impressive than a 44″ antenna, don’t you think? I haven’t actually gotten around to installing the antenna yet, but Field Day is as good a reason as any to finally get it done. I’ve already done all the prep work and assembled the mount, which you can see in the picture. The domestic house cat is for scale.

I bought all the parts from Home Depot at not-too-ridiculous prices. All of it is galvanized steel (and thus rustproof), except for the tee-junction, which this particular Home Depot seemed to be out of in galvi. I do have a can of clear gloss waterproofing spray paint lying around, though; hopefully a couple of layers will be enough to keep the tee-junction safe from the weather. Most of the piping is 1″ interior diameter.

As for how the mount works, it will be installed vertically just below the peak of the roof on the side of the house. The two flanges will be secured to the side of the house using four-inch-long bolt screws, which will, of course, be going into studs accessible from inside the attic. The aluminum tube you see attached to the top of the mounting assembly is the base of the antenna; the antenna itself simply drops into it once the mount is attached to the house. As for overall placement, I’m putting the mount on the side of the house instead of on top of it so I don’t have to drill any holes through the roof, which could potentially cause leaks.

Read the rest of this entry »

Why programmers make good editors

Thursday, June 19th, 2008

A couple days ago, whilst reading a post on a well-known blog (though I no longer remember which one), I noticed an unmatched parenthesis. A long parenthetical aside, fully two paragraphs in length, was not terminated with a matching right parenthesis. This is quite an easy mistake for most to make, and I do not fault the author overmuch. The length of a parenthetical aside is inversely proportional to how likely the reader is to remember he is still in a parenthetical aside by the time he reaches its end. In this case, the person doing the reading was also the post’s author going back through to edit it, and he simply missed the missing closing parenthesis.

But I’m the kind of person who notices these errors, and I’m also the kind of person who often thoroughly analyzes situations (note that I did not say “over-analyzes”), so I got to thinking, why do I notice these kinds of errors especially well when many people tend not to? I don’t think it’s just because I personally enjoy using parentheses so much that I keep a careful watch for abused parentheses everywhere I go, like some superheroic defender of downtrodden punctuation. No, that’s not it. Then it hit me. Keeping track of matching syntax is a very important activity in my day job — computer programming. Programmers run into time-consuming compiler errors early and often if they can’t keep their parentheses, angle brackets, curly braces, and square brackets tightly wrangled.

Therefore, it’s worthwhile to keep track of syntax nesting levels in your head as you write or read code, adding a mental “+1” for each opening character you come across and a mental “-1” for each associated closing character. By the end of the chunk of code, you should be back at the number you started with (for my fellow computer scientists out there, the best representation of this would be a stack (i.e., push a left symbol onto it and pop a right symbol off, and the stack should be empty at the end (this is how compilers work)). I’m not saying I’m perfect at it; when I’m twenty levels of parentheses deep in a particularly ugly Lisp subroutine, I have no choice but to rely on my editor’s auto-indentation to make the matching manageable. But I definitely think I’m better than most, simply because I regularly work in an environment where it matters a lot.
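
For the programmers following along, the mental bookkeeping above is exactly what a stack-based bracket matcher does. Here is a minimal sketch in Python; the function name and the choice of bracket pairs are mine, and I’ve left angle brackets out since they double as comparison operators.

```python
# Check that brackets nest properly, the way a compiler's parser would.
PAIRS = {')': '(', ']': '[', '}': '{'}
OPENERS = set(PAIRS.values())

def balanced(text):
    """Return True if every bracket in text closes in the right order."""
    stack = []
    for ch in text:
        if ch in OPENERS:
            stack.append(ch)        # the mental "+1": push the opener
        elif ch in PAIRS:
            if not stack or stack.pop() != PAIRS[ch]:
                return False        # a closer with no matching opener
    return not stack                # an empty stack means everything matched

print(balanced('f(g(x), h[i]) {ok}'))      # True
print(balanced('a (long (nested) aside'))  # False: an opener never closed
```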

So as I’m reading prose and I encounter a left parenthesis, some kind of state subconsciously switches in my mind, and I go into parenthetical aside mode. I stay in that mode until a right parenthesis is encountered. If one isn’t forthcoming, I quickly scan ahead in the text to see if there even is one, or back to see whether there was one already that I simply missed. More often than not, the author has simply forgotten the closing parenthesis; in my experience, unclosed long parenthetical asides are far more common than unclosed short ones. This same mental trick even works for parenthetical asides inside of parenthetical asides, and even parenthetical asides inside of parenthetical asides inside of parenthetical asides, but that’s about as far as I go. Luckily for me, you don’t tend to see parenthetical nesting four or more levels deep, and if you do, it’s probably some Lisp programmer forgetting what medium they’re writing in. If that’s the case, watch out for cars and cdrs as well.

But I don’t just notice mismatching syntax errors in the written word; I tend to notice all errors (so long as I’m reading with the intent of editing, anyway; when I’m speed reading, I often miss errors on account of not seeing them). I’m a good editor — and not to sound vain, I’ll balance that out by saying I’m a terrible bowler. But I can’t help but think that being good at editing comes naturally to computer scientists. Many of the skills — noticing slight deviations from the rules, especially in the form of syntax — are exactly the same. Both the English language and all programming languages have well-defined rules about how words/clauses may or may not be used together. It’s simply a matter of identifying violations of the rules.

I will add one rather large caveat to my thesis: I’ve known many programmers who cannot spell worth a damn (maybe they flee to computer science because it involves very little essay writing?). Some of them have been dyslexic. I don’t know if anyone’s established a correlation between dyslexia and going into computer science, but I definitely think there is one. So I think programmers make good editors, with the exception of the many programmers who cannot spell well. But if their spelling is good, then by virtue of their profession, I bet they’ll be darn good at noticing all of the other errors one encounters in prose.

And for those of you following along closely at home, did you notice the mismatched parenthesis in this post? In the comments below, let me know if you noticed it, and whether you are a programmer. Be honest! Let’s try to get some data that, while not conclusive, will at least be one step above anecdotal.

The biggest threat to the Internet since EMPs

Wednesday, June 18th, 2008

I’m angrier than a bull in a Communist china shop over a recent trend amongst ISPs towards metered payment schemes (use BugMeNot for access). I wouldn’t mind so much if they were charging market rates, which would be between 5 and 10 cents per gigabyte, but instead they’re going for outright extortion and charging one dollar per gigabyte. So instead of using their current revenue to build out their infrastructure to handle expected increases in traffic, which is what they should be doing, they’re hoping to cut utilization by charging more, and thus ramp up profits while slowly choking our Internet to death.

This stupid pricing scheme has the potential to inflict long-term damage on the Internet. There’s so much potential for over-the-net distribution of content (including high-def video and video games) that hasn’t quite materialized yet, and it never will if customers are charged this much for Internet access. The cost of “renting” a DVD-quality movie over the Internet roughly doubles at $1/GB pricing levels. It’s obscene. The United States was the world leader in Internet adoption for so long, but now we are falling hopelessly behind. ISPs in many nations (including Finland and South Korea) now offer connections that absolutely put ours to shame, like 50 Mbps symmetric for less than what 8/1.5 Mbps costs here. And we only have our money-grubbing, monopolistic communications companies to blame.
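
To put numbers behind that “doubles” claim, here is a back-of-the-envelope sketch. The 4.7 GB figure is a standard single-layer DVD; the $4 rental price is my own assumption about typical rates.

```python
# Back-of-the-envelope: what $1/GB metering adds to an online movie rental.
DVD_SIZE_GB = 4.7     # a standard single-layer DVD image
RENTAL_PRICE = 4.00   # dollars; an assumed typical rental price
METERED_RATE = 1.00   # dollars per GB, the rate ISPs are pushing
MARKET_RATE = 0.10    # dollars per GB, the top of the 5-10 cent range

fee = DVD_SIZE_GB * METERED_RATE
print('Fee at $1/GB: $%.2f on a $%.2f rental' % (fee, RENTAL_PRICE))
# About $4.70 in bandwidth charges alone: the rental price all over again.

print('Fee at market rates: $%.2f' % (DVD_SIZE_GB * MARKET_RATE))
# About $0.47, a rounding error by comparison.
```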

If you find yourself stuck with metered Internet access that charges unrealistic bandwidth rates, don’t put up with it. Complain. Loudly. Change your service to anything else that’s available in your area. And if you’re a big torrent user like I am, you may find it cheaper to buy a business-level unlimited connection than to pay $1/GB; to bring the costs down a good bit, share it with your neighbors over WiFi. Just don’t let the ISPs pillage the future of the Internet for the sake of making a quick buck.

The best Nerf war I have ever seen

Tuesday, June 17th, 2008

Okay, so I don’t usually do posts that consist solely of an embedded YouTube video, but this one you have to see. Check below the fold.

Read the rest of this entry »

Verizon takes the next step in killing Usenet

Tuesday, June 17th, 2008

I feel obligated to report on this story because many people of my Internet generation don’t know about the rich history of Usenet, and thus will not appreciate what is being lost. Usenet is the original Internet discussion system. It’s been around a lot longer than the World Wide Web. Prior to 1992, there was email, for private correspondence between small numbers of people, and then there was Usenet, for larger discussions amongst huge groups of people.

Usenet is still around in its original form, though it has long since been eclipsed by web-based discussion forums. It still has a certain appeal, though, and in high school I was a very active participant on talk.origins, a newsgroup devoted to the discussion and debate of biological and physical origins. I was also rather active in rec.arts.sf.written and rec.arts.sf.composition, the former of which deals with professional speculative fiction (aka science fiction and fantasy) and the latter of which is a resource for people writing their own speculative fiction. The alt.atheism group was fun too, although it naturally had quite a few trolls.

Nowadays, many ISPs no longer offer direct newsgroup access. It used to be that your ISP would run newsgroup servers (for instance, news.comcast.com), and you would connect to Usenet with a newsgroup client such as Xnews (which feels sort of like an email client such as Mozilla Thunderbird, which can itself also read newsgroups). These days many people just access newsgroups on the web through services such as Google Groups. I will point out that Google Groups is a rather confusing service, because Google doesn’t make much distinction between the Usenet hierarchy, which it merely displays messages from, and user-created groups that exist only within Google Groups and that Google fully controls. As a result, a lot of people using Google Groups think they’re just chatting away in a Google discussion group, when in reality they are merely using Google’s web portal to something much larger.
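
For the curious, talking to a news server programmatically is about as simple as protocols get. Here is a minimal sketch using Python’s standard nntplib module (the module was removed in Python 3.13, so this wants 3.12 or earlier, and the server name is a placeholder for your own provider’s NNTP host):

```python
# Peek at a newsgroup over NNTP with Python's standard nntplib module.
from nntplib import NNTP

# Placeholder host; substitute your own provider's news server.
with NNTP('news.example.com') as server:
    resp, count, first, last, name = server.group('talk.origins')
    print('%s: %d articles (%d-%d)' % (name, count, first, last))

    # Overview data (subject, author, date) for the ten newest articles.
    resp, overviews = server.over((max(first, last - 9), last))
    for number, fields in overviews:
        print(number, fields['subject'])
```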

I have lots of fond memories of Usenet, and I still check up on some of my favorite groups to this day. There’s a certain camaraderie that develops there that’s unlike the other kinds of social interaction you run across on the Internet. The users have generally been on the Internet forever (many since the pre-Web era), they trend toward academic types, and the discussions are fascinating. I met up with regulars from talk.origins a couple of times in Washington D.C.; these events were scheduled every so often in areas with a large concentration of participants, and people would even come down from New York City to the DC meet-ups! Hell, I believe a large part of what earned me my full scholarship to the University of Maryland was my education on evolution through talk.origins. One of the professors on my scholarship interview panel was a biologist, and I floored him with my knowledge of the rather complex topic of evolution and impressed him with my firm stance in favor of science over ignorance. So you can definitely say I have a soft spot in my heart for Usenet.

Now that you know what Usenet is and what it means to me, perhaps you will understand why I am so saddened that Verizon has removed the entire alt hierarchy from its Usenet servers. Verizon is a huge Internet Service Provider, and this will affect millions of their customers (including me!). The alt hierarchy was by far the largest hierarchy in all of Usenet; in web-equivalent terms, it’d be as if your ISP blocked access to all websites with the .com Top Level Domain. Their excuse is that there may have been some child porn on some of the alt.binaries groups (groups that are used for trading files, as opposed to discussions). Be that as it may, that doesn’t at all explain why they blocked access to the non-binary parts of alt, including, say, alt.atheism and thousands of other discussion groups. And to extend our Web analogy, child porn is available on .com sites as well, so by their logic, shouldn’t they be blocking all .com sites on the Web?

Read the rest of this entry »

Your mission: Download Firefox 3

Tuesday, June 17th, 2008

Your mission, should you choose to accept it, is to download and install Firefox 3. It was just released today, and it’s awesome. I can say that with some certainty because I’ve been using and marveling at the version 3 Release Candidate for a week and a half now. One of Firefox 3’s new features, the Awesome Bar, is a large part of what makes Firefox 3 so awesome (what, you didn’t think I was merely being hyperbolic, did you?). The Awesome Bar is the replacement for the old-school Location Bar, with a lot more features that make browsing even more convenient.

And I don’t want to rush you or anything, but you really should download Firefox 3 as soon as you read this. The Mozilla Foundation is trying to set a World Record for the most software downloads in 24 hours, and the clock runs out at 13:00 EDT on June 18. Having the World Record for most software downloads belong to a Free Software project would be an amazing argument in favor of Free Software, so please, pitch in!

So, whether you’re upgrading from Firefox 2 (a painless process), or making the switch from the evil and nefarious Internet Explorer, there’s never been a better time to download the latest version of Firefox.

“The Ghost Brigades” shows clear signs of Scalzi’s improvement as an author

Monday, June 16th, 2008

Having recently read Old Man’s War, I lost no time in reading its sequel, The Ghost Brigades. I liked Old Man’s War, but I found some significant flaws in it that hampered my enjoyment. Thankfully, most of those flaws were fixed in the sequel. The alien races aren’t nearly as implausible, not every character has the same dark, cynical sense of humor (which is totally a projection of John Scalzi’s own sense of humor, I should add), the writing style isn’t quite so absurd, and the cast of minor characters no longer consists solely of cliché cardboard cut-outs. In other words, John Scalzi has definitely matured as a writer between his first (or is it his second?) book and this one. I definitely look forward to reading the third book in this universe, The Last Colony.

Unfortunately, some things didn’t change. The novel is set in exactly the same implausible “science fantasy” universe, with hundreds of intelligent alien species that all happen to have roughly equal military capabilities, wave-of-the-hand FTL (“skip drive”), and a ridiculous over-emphasis on hand-to-hand combat. These aspects weren’t so grating as when I read the first book because I’ve been acclimated to them by now, but I still wish they weren’t there.

And ironically enough, after complaining bitterly that the protagonist of Old Man’s War, John Perry, was clearly a stand-in for the author, I actually liked the main character in that book better than the one in this book. The protagonist of The Ghost Brigades is Jared Dirac, a Special Forces soldier born into a cloned adult body. He’s fully intellectually capable from birth, yet has no memories, no past, no personality. As the book progresses and he puts on a few months, he develops more of a personality, but it isn’t one I feel much empathy for. The Special Forces (the “Ghost Brigades”) are bred from birth to be killing machines. They’re not emotionless by any means, but they are cruel, efficient, subordinate, and very focused on getting the mission done. In other words, not the best choice for a protagonist.

Judging by the progression from book one to book two, I’m guessing that The Last Colony will be even better. If John Scalzi writes a better main character and refrains from having all of his characters exhibit the exact same dry sense of humor that he has, he has the potential to come up with something really good. That last part is a particular sticking point: one of Scalzi’s consistent weaknesses is an inability to write characters significantly different from himself. I don’t know whether he’s capable of it; so far, nothing even demonstrates that he’s tried.

Caveats aside, put me down firmly in the “Endorsements” column for this book. Having read the first two books in this universe and itching to read the third, I’d say that John Scalzi has done a pretty decent job. I can’t say the same for many other books I’ve read, some of which have left me with little desire to read anything further by the author. So like I said with Old Man’s War: The Ghost Brigades isn’t great, but it’s a good, fun read. I recommend it.

64-bit GNU/Linux is totally ready for mainstream use

Monday, June 16th, 2008

When I was installing Ubuntu GNU/Linux with KDE on my latest desktop purchase, I faced a seemingly agonizing decision between 32-bit and 64-bit. There are all sorts of peripheral arguments over performance, but the main arguments on each side are that 32-bit can only support 4 GB of RAM (not technically true: a PAE-enabled 32-bit kernel can address more physical memory, though each individual process is still stuck with a 4 GB address space) and that 64-bit has limited application support and is buggier.

Well, I’m happy to report that all of the supposed caveats of 64-bit GNU/Linux completely failed to materialize. After over a week of heavy usage of 64-bit Ubuntu, and installation of a few hundred applications, I haven’t run across a single problem stemming from my decision to use 64-bit. So I would say the choice of 64-bit is a no-brainer. 64-bit has reached maturity, and all of the supposed problems with it are problems of the past. 64-bit is the future of computing (just like 32-bit was the future of computing back when 16-bit was still common). It’s better to make the switch now than to find yourself a year or two down the line facing a 64-bit reinstallation of a 32-bit system. This choice is pretty much set in stone when you install an operating system; there is no upgrade path. So make the correct choice now.

I should point out that not all processors support 64-bit OSes. The older the processor, the less likely it is to offer 64-bit support. So do your due diligence before you accidentally download the wrong version of a GNU/Linux distribution ISO.
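
If you’re not sure whether your processor qualifies, one quick check on an existing Linux install is to look for the “lm” (long mode) flag in /proc/cpuinfo, which is what marks an x86 chip as 64-bit capable. A minimal sketch:

```python
# Does this x86 processor support 64-bit ("long mode")? Linux-specific.
def cpu_is_64bit():
    with open('/proc/cpuinfo') as f:
        for line in f:
            if line.startswith('flags'):
                return 'lm' in line.split()  # 'lm' marks x86-64 long mode
    return False

print('64-bit capable' if cpu_is_64bit() else '32-bit only')
```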

Why Dianetics is better than the Bible

Sunday, June 15th, 2008

I am hardly a fan of Scientology, yet I find some humor in the fact that Scientology’s “holy book”, Dianetics, is better than the Bible in nearly every way. Here is my reasoning.

First of all, Dianetics is much more self-consistent than the Bible. One need only examine a brief list of contradictions in the Bible to see what I’m getting at. This isn’t even a fair comparison for the Bible, really, as it was written by dozens of authors over hundreds of years. Dianetics, by comparison, was written by one author in a few years. Dianetics would have to be pretty terrible to lose out in this comparison, and the truth is, it’s not that bad. The ideas it espouses, while wrong and dangerous, are at least coherent and consistent. You can’t say the same for the Bible.

Dianetics was written long after the invention of the modern printing press, which gives it a huge advantage. It has been faithfully reproduced, word-for-word, since it was originally penned by L. Ron Hubbard. By contrast, the Bible was written long before the invention of the printing press, and it was savaged by over a millennium of hand-copying and translation between multiple languages. The end result is that no one can really be sure what any specific passage in the Bible is even supposed to say, let alone what it means. Schisms between branches of Christianity have broken out over little more than which copy or translation should be considered canon and which is heresy.

Dianetics was written in modern English, and as such requires no translation for comprehension by a modern audience. Each word you read in it is exactly what L. Ron Hubbard intended. The Bible, on the other hand, has been pieced together from works written in Hebrew, Aramaic, and Greek. Any translation is inherently inaccurate, and multiple successive translations even more so. Even worse, the ancient translations were done by people with an agenda, and for many of the books in the Bible, those translations are all we have left. The translators had no concept of modern scholarship, where the goal is to render the work as accurately and as true to the original as possible. So it’s really hard to even say what the messages in the Bible are supposed to be, whereas Dianetics, written originally in modern English and still readable in its original form, has no such ambiguity.

There isn’t even any agreement on which books the Bible is supposed to be composed of. Compare Catholicism, with its deuterocanonical books (what Protestants call the Apocrypha), or the still-larger Eastern Orthodox canon. And that’s not even considering Mormons or Jehovah’s Witnesses. Dianetics, at least, has a single canonical version. There are no schisms about which text is supposed to be in it and which isn’t.

Face it, the comparison between the Bible and Dianetics is completely stacked against the Bible. Although, to be fair, the same would be true of any comparison between a modern book and an ancient one (which is why my arguments so far are more of a criticism of the Bible than an endorsement of Dianetics). The difference is that most ancient books are merely objects of scholarly study; no one is trying to base an entire belief system around them. Hopefully now you have a bit more insight into why I, and my fellow atheists, find religions based on ancient books so confounding.