30.4.08

Free music

Coldplay is running a promo this week offering a free download of a track from their new album. 

http://www.coldplay.com/song

The only catch is that you have to give them your email address...not sure if this could be a source of spam, but hopefully not (and anyhow, I have a Gmail account, which catches all of the spam I currently get). 

Of course, I'd be remiss if I didn't mention that Archive.org has a great collection of Grateful Dead concert recordings.  You can stream both soundboard and audience recordings and download the audience ones.

http://www.archive.org/details/GratefulDead

I'd particularly recommend April 6, 1985:

http://www.archive.org/details/gd85-04-06.oade-schoeps.sacks.24027.sbeok.flacf

Oh, and if you're downloading, I'd download the VBR, as I'm starting to wonder if low-bitrate mp3 encoding damages your hearing...my ears definitely ring more after listening to a 64kbps mp3 than a 192kbps one.

24.4.08

The story of Helmer

Ran across this guy online who used an Ikea cabinet as a case for a massively parallel computer:

http://helmer.sfe.se/

Actually, I'm pretty jealous as I've been thinking about setting up something like a Beowulf cluster for a video project I have at school.  Any ideas as to how much of an improvement this would be?

Apple ][gs laptop

The amazing Ben Heckendorn has been working again on an Apple ][gs laptop, and what can I say?  It's beautiful. 

http://benheck.com/04-14-2008/apple-iigs-original-hardware-laptop

The man is amazing.

22.4.08

Sin as Economics

So many times I find myself thinking of sin economically.  Not necessarily "how much money can I make off doing this sin?" (in fact, stealing isn't a temptation I struggle with =]) but more in the sense of sin vs. punishment.  Keep in mind that economics is the study of choices.  

In economics there are these imaginary units called utils that are used to measure satisfaction.  If you purchase something, that means you think you will get more utils out of the purchase than out of either keeping the money or spending it on something else.  

Carry this over to sin.  How often do we think of sin--breaking the law--in this fashion: "Well, how bad is the punishment?  Oh, just that.  That's nothing, so I'll go ahead and commit the crime"?  Take, for example, tax night.  My dad drives up to the post office at ~10:30 PM and finds that if he wants to turn in his taxes on time, he'll have to drive all the way downtown.  There are a few other guys there, and one of them asks just how big the late fine is.  See?  Not, "Well, I guess I'll be driving downtown because I'm a law-abiding citizen and it's the right thing to do," but rather, "Well, the fine isn't that bad, so I'll break the law, pay the fine, and be done with it."

The fundamental problem is a lack of fear of sin.  We often don't realize just how serious sin is, and think of it more in terms of how bad the punishment will be.  If we get more utils from committing the sin (minus the utils lost to the punishment) than we would if we did not sin, that sin suddenly becomes 'ok'.  We should instead respect sin for what it is: deserving of eternal damnation.  "The blood of Christ is made cheap," as one commentator put it.  
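
Since I apparently can't help thinking in code either, here's the flawed calculus spelled out as a little C++ sketch.  Every number in it is made up, which is rather the point:

#include <iostream>

// The flawed "sin calculus": weigh the utils gained from the sin against the
// utils lost to the punishment, and sin whenever it comes out ahead of doing right.
bool worth_sinning(double utils_from_sin, double utils_lost_to_punishment,
                   double utils_from_doing_right)
{
    return (utils_from_sin - utils_lost_to_punishment) > utils_from_doing_right;
}

int main()
{
    // Tax night: skipping the drive downtown vs. a small late fine.
    double skip_the_drive = 10.0;   // utils gained by just going home
    double late_fine      = 3.0;    // utils lost to the fine
    double doing_right    = 5.0;    // utils from driving downtown and filing on time

    if (worth_sinning(skip_the_drive, late_fine, doing_right)) {
        std::cout << "The fine isn't that bad, so the law gets broken.\n";
    }
    return 0;
}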

Also, punishments are not intended primarily as deterrents.  This may sound odd, but think about it for a minute.  If the law's punishments were primarily deterrents, the law would become draconian very quickly.  Hang the bank robber.  No one will ever rob a bank again, because they don't get any utils from it.  The problem with that is that the punishment does not fit the crime.  

Punishment fitting the crime is a Biblical concept that goes back a long time.  It attempts to set aright what was done wrong on earth, but does nothing related to what was done wrong in heaven.  That's left to Christ.

Let us not make Christ's blood cheap.

20.4.08

Apple's big mistake with multitouch

I remember when I posted about the new MacBooks and MacBook Pros here a while ago.  I remember being excited, but slightly confused, that there was multitouch in the new MacBook Pros ONLY.  But I was satisfied for the time; after all, multitouch is a pro feature, so Apple just doesn't want to give it to everyone, because that makes the machines more expensive, right?

Well, yes.  But Apple is shooting itself in the foot all the same. 

Multitouch sounds like an awesome idea, and I'm sure there are APIs that let third-party software interact with it.  And that's exactly where the problem starts.

How many people would justify spending ~$1000 more for a laptop to get multitouch functionality when, say, a base MacBook would work for them?  Not many!

So here's the problem.  Apple puts multitouch in the MBA and MBP, but not the MB, which is their most popular laptop.  Developers look at this and say, "Why bother writing all the code for multitouch when only a small subset of my users will actually use it?  After all, it's only on Apple's less common laptops."  I just can't see a lot of developers saying, "Well, multitouch only works on the MBPs and MBAs.  Sorry, you lowly MB user, but you can't use all that wonderful code I wrote for multitouch because of Apple."

Sure, pro software packages might add multitouch support.  But your average developer?  Probably not. 

And that's the problem.  Without a large dev base, multitouch won't catch on, and Apple will have something cool but practically useless. 

If it's the wave of the future (or something like that), why not focus on getting it in the hands of the masses?

19.4.08

Registry Woes

I just backed up my registry.  That was eye-opening.  It's 114 MB!  That's huge.  And yes, I have cleaned it with CCleaner (which is absolutely amazing).   And I wonder why I don't switch to Linux.  Hah.  No registry to corrupt there.  

Incidentally, I remember a friend who had a really messed-up registry.  The computer would check it when it started up, and he had to quit the check application via CTRL-ALT-DEL.  I eventually became a hero for fixing it (had to use a bit of DOS chops to get it done), but by then he'd gotten a new machine.  I should ask him for his old laptop.  Not a shabby machine to mess around with...but I digress.

Microsoft, why oh why did you make the registry in the first place?

To Linux or not to Linux?

I've been reading some of the copious (and sometimes almost unrelated) documentation for emacs, and I'm now considering switching entirely to a GNU/Linux setup, as well as releasing iKalk under the GPL (maybe after I clean some of the source up; it's really ugly =]).  I like the idea of GNU, and I've used Linux before.  In fact, I've been very pleased by both Linux and the programs for it.  It's been stable and fairly easy to use.  And I'm in love with BASH; it's so much more powerful than CMD.  I mainly use FLOSS software already: OpenOffice, the Mozilla suite, Eclipse, Audacity.

The one thing that really has me worried is those few Windows-only apps.  For instance, I need to be able to run Microsoft Publisher, and I'm loath to leave PowerPoint behind; it's just more powerful than OO Impress.  And there's ReBirth, a software synth that's available free.  I've only got 30 GB of space on my hard drive, so dual-boot is pretty much out of the question.  And at 1.2 GHz, virtualizing anything is horrible.  Anyone want to contribute to the pot for a new laptop for me?  I'm aware of Wine, I'm just not sure how well it runs those programs (need to check Wine HQ; someone want to do this for me, please?).

And then there's C and C++.  I know Linux is an excellent platform for learning those languages, but most books will expect you to have all the Windows APIs, right?  

Any advice/ideas/derision will be appreciated.

Default: goto(hell);

It's fairly common to hear people say that they're good enough to get into heaven.  Most of them would argue that their good works are what gets them into heaven; however, their main problem is that they think of heaven as the default case.  

"Default case" is a programmer's term used to describe, well, the default case.  In programming there's a thing in most languages called "select case".  It is basically a big IF-THEN structure. I'll show you an example:

IF-THEN structure:
if a = b then
    do THIS
else
    if a = c then
        do THAT
    else
        if a = d then
            do THEOTHER
        else
            do WHATEVER
        endif
    endif
endif

Select Case structure:
select case a
case b:
    do THIS
    break
case c:
    do THAT
    break
case d:
    do THEOTHER
    break
Default:
    do WHATEVER
end select

("break" tells the program to skip the rest of the stuff in the select.  If I didn't put the breaks in, and a = b, then the computer would do THIS, THAT, THEOTHER, and WHATEVER.  Yeah, it's kinda dumb, but it's useful.)

See?  People think that if they just live life and aren't mean very much, and do some good stuff, they'll go to heaven.  If they do bad stuff, like murder, then maybe they've been bad enough to go to hell.  Their idea of life is something like this:

select case Me
case Murdered:
    goto(hell)
    break
case Embezzled:
    if AmountEmbezzled > 1000000 then
        goto(hell)
    else
        goto(heaven)
    endif
    break
Default:
    goto(heaven)
end select

Whereas reality is:

select case Me
case Elect:
    goto(heaven)
    break
Default:
    goto(hell)
end select

Big difference, eh?

(yeah, I admit, as I have before, to being a nerd =])

Emacs devel mailing list humor

I recently got emacs set up on my computer, and ran across DEVEL.HUMOR, a file of humorous exchanges on the emacs devel list.  I thought I'd share:

  "In order to bring the user's attention to the minibuffer when an item such as 'Edit -> Search' is activated from the menu, I was just thinking that we could draw a big rectangle around the minibuffer, blinking (or zooming in-and-out) until some input is typed in."
  "How about dancing elephants?"
  "They don't fit in my office."
  "Well once the elephants are done, your office will be much... bigger."
                  -- Stefan Monnier, Miles Bader and Kai Grossjohann

I remember these versions as yard-rocks (is that between inch-pebbles and mile-stones?).
                  -- Kai Grossjohann

  "Aren't user-defined constants useful in other languages?"   "The only user-defined constant is ignorance.  (With programmers, this is a variable concept ;-)"
                  -- Juanma Barranquero and Thien-Thi Nguyen

  "Uh, 'archaic' and 'alive' is not a contradiction."
  "Yes it is.  'Archaic' does not mean 'old' or 'early'.  It means 'obsolete'."
  "'He arche' in Greek means 'the beginning'.  John 1 starts off with 'En arche en ho Logos': in the beginning, there was the word.  Now of course we all know that Emacs was there before Word, but this might have escaped John's notice."
                  -- David Kastrup and RMS

  "[T]here may be a good reason since the code explicitly checks for this; see keyboard.c:789 [...]"
  "I think I understand, but I can't find the code in keyboard.c.  Do you really mean 'line 789'?  Of which revision?"
  "Sorry; by 789, I mean 3262 :-P"
                  -- Chong Yidong and Stefan Monnier

  "Despite being a maths graduate, I can't think of any other such constants with anything like the universality of e and pi."
  "42"
                                -- Alan Mackenzie and David Hansen

Death of the Semicolon

I was installing some software recently that hooked into the keyboard to provide an instant-access thing: push a key, and a little box pops up where you can type stuff to, say, launch programs.  

What amused and outraged me about this program was that it suggested the default key be the semicolon key with no modifiers.  I laughed and promptly changed it to ctrl+;.  The program seemed outraged that I would change the shortcut so that I had to press TWO keys in order to launch it, rather than the "instant-access" semicolon.  
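
For the curious, grabbing ctrl+; as a global hotkey on Windows looks roughly like this.  This is only a sketch of the mechanism (RegisterHotKey with VK_OEM_1, the ;/: key on US layouts); I have no idea what the actual program does under the hood:

#include <windows.h>

int main()
{
    // Ask Windows to send us WM_HOTKEY whenever ctrl+; is pressed anywhere.
    if (!RegisterHotKey(NULL, 1, MOD_CONTROL, VK_OEM_1)) {
        return 1;   // some other program already owns that combination
    }

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        if (msg.message == WM_HOTKEY) {
            // This is where the little launcher box would pop up.
            MessageBeep(MB_OK);
        }
    }

    UnregisterHotKey(NULL, 1);
    return 0;
}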

So, does this indicate that the semicolon (and colon) are not used enough to merit having their own key?  I did not foresee the death of the semicolon; I use it all the time.  Have our communications become so fragmented that we no longer need semicolons?  

Also, what does that say if pressing ctrl+; is no longer "instant-access"?  Are we becoming so impatient that we can no longer press a key combination?  

And finally, I was surprised at the lack of regard for programmers.  We use semicolons everywhere (at least in C, C++, and Java).  Colons aren't that rare, either.  

Will we be seeing the death of the semicolon? 

Recursion

So the other day I was programming in QBASIC and discovered something disappointing: you cannot recurse in QBASIC.  (In case you are wondering what recursion is, you need to understand recursion in order to understand recursion.)  It's kind of a bummer for me because I was writing the typical beginner's exercise of recursively calculating factorials in order to brush up on my BASIC.  
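
For reference, here's the exercise I was going for, sketched in C++ instead of BASIC (nothing fancy, just the classic recursive definition):

#include <iostream>

// fact(n) calls itself with a smaller n until it hits the base case.
unsigned long long fact(unsigned int n)
{
    if (n <= 1) {
        return 1;                // base case: 0! = 1! = 1
    }
    return n * fact(n - 1);      // recursive case
}

int main()
{
    for (unsigned int i = 0; i <= 10; ++i) {
        std::cout << i << "! = " << fact(i) << "\n";
    }
    return 0;
}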

(I also discovered that my BASIC syntax has been thoroughly polluted by C++ and Java =[  ).

This post lands squarely in the category of "it might be important to know if you're at a party and someone asks...like that's ever gonna happen."

7.4.08

The future of programs...maybe

Here's an interesting essay on web apps and where they're going:

http://www.downloadsquad.com/2008/04/07/should-software-be-native-or-web-based/

I'd agree with that.  First example: email.  I have a Gmail account, but use Thunderbird for my email because I couldn't stand the speed of the web interface.  I've also started using NewsGator and their corresponding desktop software, FeedDemon, for RSS.  It's MUCH more powerful than Thunderbird (which is what I was using) and free. 

Here's another page about some related web->desktop interfaces (yes, in case you were wondering, I do program in C++):

http://www.techcrunch.com/2008/04/07/bridging-desktop-and-web-applications-part-2/

So, the question arises: how should an OS provide better integration with the web?  I'm not talking about cloud computing over thin clients here.  The web's not fast enough for that yet.  The question is more of, how far should an OS go with providing a seamless interface to the web and web content?

(As an aside, I'd like to say that this is Web 2.0, for real.  I've read essays about how Web 2.0 is pointless, or does nothing really good, and was made to make people happy after the .com bubble burst.  Nope.  Web 2.0 is allowing the average Joe to get on the web and create content.  Blogs, social networking, even things like XDrive and Google Docs are Web 2.0.  It's here, and it's not going away.)

Would you just want an OS to provide, say, something like Prism built in, or should it go farther?  Automatic syncing to Google Docs, XDrive, a CVS repository (for code)? 

Wouldn't it be awesome if whatever you did on your desktop was automatically cached on a web service somewhere?  Your project documents uploaded and synced with Google Docs, your music backed up to an XDrive, etc.?  In fact, you could turn your desktop into nothing more than a cache for your web services. 
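
Just to make the idea concrete, here's a minimal sketch of the "desktop as a cache" loop.  upload_to_web_service() is entirely hypothetical; it stands in for whatever Google Docs / XDrive / repository API you'd actually be talking to:

#include <chrono>
#include <filesystem>
#include <iostream>
#include <map>
#include <string>
#include <thread>

namespace fs = std::filesystem;

// Hypothetical stand-in for "push this file up to the web service".
void upload_to_web_service(const fs::path& file)
{
    std::cout << "would sync: " << file << "\n";
}

int main()
{
    const fs::path watched = "Documents";            // the folder we treat as a local cache
    if (!fs::exists(watched)) return 1;

    std::map<std::string, fs::file_time_type> seen;  // last-modified times we've already synced

    while (true) {
        for (const auto& entry : fs::recursive_directory_iterator(watched)) {
            if (!entry.is_regular_file()) continue;
            const auto stamp = entry.last_write_time();
            const auto key   = entry.path().string();
            auto it = seen.find(key);
            if (it == seen.end() || it->second != stamp) {
                upload_to_web_service(entry.path());  // new or changed file: push it up
                seen[key] = stamp;
            }
        }
        std::this_thread::sleep_for(std::chrono::seconds(30));
    }
}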

I've thought about something like this before.  It was before we had DSL, so the web wasn't sitting front-and-center in my mind, but the concept could be carried over to the web.  The idea was that you would have this tiny gadget like, say, a PalmPilot (or an iPod touch or iPhone, but the rumors of those didn't even exist when I thought this up) that was wirelessly connected to your laptop by something fast, but not necessarily long-range (maybe 20-30 feet).  You'd have your laptop sitting in its bag, off or in standby, but running software that let you turn on the hard drive and access data off of it via the wireless handheld gadget.  So you could play music (and have your entire music library available; keep in mind that this was when a 20 GB iPod was huge!), edit docs, and so on, and have the changes reflected on your laptop.  In theory, you could even harness the CPU/RAM of your laptop to do some of the heavy lifting for games, etc. 

Now imagine that for the web.  It's not necessarily easy, nor cheap.  The major flaw is that internet access isn't ubiquitous yet, and what is available isn't fast enough to make the idea realistic now.  But maybe in 10 years? 

I could even see being able to set up a ~2GB partition on an iPod or flash drive and loading your apps onto that.  Then you could take it around to public computers, which could provide a base OS, with you carrying the apps and account info on your portable drive.  Your docs would all get synced via the internet, so the public machine, in this case, is nothing more than a cache for your internet docs. 

Imagine combining that with Linux.  Voila!  Each public machine has a base Linux distro, say, Ubuntu.  On your flash drive you have a window manager, programs, etc.  So you pop your flash drive in and you get your desktop just as it usually is at home. 

And, theoretically, programs could be set up to keep only a little piece of themselves on your flash drive; when you launch them, they'd download the rest from a Web repository and cache it on the hard drive of the machine.  You could fit a lot more apps on your portable drive because each program would be about a meg in size.  Personal data could be synced to a web site, so you could conceivably have incredibly tiny applications sitting on your flash drive.

You could, of course, carry that even farther and provide each public computer with a default set of programs; your flash drive would just have the necessary login and authentication information to automatically load up your account and profiles when you sit down to work.  Rare apps could still be put on the flash drive.  If this were to come true, there would be a lot of life left in those 128 and 256 MB flash drives, as that's all you'd realistically need.  Taking 256 MB out of an 80 GB iPod is nothing, so as long as you had your iPod with you, you could have your desktop available at any public machine. 

Or am I just smoking crack?

5.4.08

Ancient computer pictures

I ran across this while hunting some stuff down to respond to a mailing list, and found it pretty interesting:

http://12.206.251.215/ultimedia57/

It's basically a 1990 version of a media PC. Wicked fast 486SL CPU, digital video capture card, audio, and a CD-ROM drive. Here's the picture of the front:

[photo of the front of the machine]

Yeah, and in case you were wondering, that black thing in the middle of the computer is, in fact, a CD drive...

2.4.08

Legal Statement

I have spent a large amount of time making sure that, in case of legal assault, I have proper legal documentation for this blog.  To this effect, I have posted a new legal statement, which can be read here:

http://linuxmercedes.homelinux.com/legal/legal.html

On Software: In which I beat upon OneNote

I'm starting a new series about software design in the vain hope that at least a few people get educated on what makes software good. And to kick it off, I'm reviewing OneNote and EverNote.

[Screenshots: OneNote above, EverNote below]

Take a good look at the two screenshots above. Notice anything? (No, the second was not taken on a Mac; I just have a heavily themed version of XP.)

You should notice a major difference between the two programs.

I had wandered, by way of a link in an RSS feed, from one blog to another, and from that one to another, and so on, until I hit Wikipedia, which is where the screenshot of OneNote comes from. OneNote, as you should know, is Microsoft's widely acclaimed note-taking program. I'd wondered what made it so good but, not needing a note-taking program before, never downloaded it. Then, when I did need one, I'd recently read about EverNote, so I downloaded that instead.

My first reaction when I saw the shot was, "So that's OneNote. Ok, cool. Nothing special here." But then I got to thinking (and dangerous things happened =]), and I realized a few other things.

First: there are about 2 million buttons on the screen. I don't know anyone who needs that many buttons to take good notes; in fact, it's distracting. I'm sure each of those buttons has a function, but it's overwhelming. I don't need to control fonts and text sizes and stuff while I'm jotting down a note.

Second: there's nothing special about being able to store notes in notebooks with sections and pages. Nothing. In fact, I could do that just about as easily in Word, with a bit of trickery (and a lot less expense).

Third: from what I'm seeing, it looks like it's overfunctional. In other words, it suffers from creeping featurism. OneNote looks like a conglomeration of a note-taking program, a word processor, and an annotation program. That's not what most people want in a note-taking program.

But overall, what I noticed was bleh. Nothing new or innovative. Nothing unique. It looks like something almost anyone could have designed (proper copyright permission from Microsoft notwithstanding).

Compare this with Evernote.

EverNote has a really simple interface. Just type to type. No click needed =]. There are a few buttons, and that's all. Also, EverNote does categories, and you can put a note in multiple categories. Another thing that blew me away when I started using EverNote was that it knew where I got my stuff from. For instance, I was copying and pasting stuff from Firefox. It automatically put the note in the Web Clips category, as well as noting the source URL so I could link back to the page without hassle (which makes citing sources much easier). EverNote also stores notes on a digital roll of paper, so you visually scroll through all your notes.

But most of all, EverNote just takes notes. That's it. This is good for many reasons. There's no extra overhead for markup and the like, both resource- and feature-wise; if I wanted to mark up a document or tweak fonts, I'd actually use a word processor. Everything about it is conducive to taking notes: its UI is specific to taking and quickly organizing notes.

My apologies if I've ripped unnecessarily on OneNote. I've never used it. But to my eyes (and I'm a software developer), it looks bland and has featuritis.

Oh, and in case you were wondering, no, EverNote's default notes do not have clips from webpages on Islamic ideology. That's my research =]