Janrain Social Sharing to Twitter: An Error Occurred

Just a quick note, mostly to myself, but maybe this will help out somebody else doing a google search down the road, because I sure didn't have much luck finding anything.

Today, for the second time, I got bitten by Janrain's poor diagnostics when it comes to error handling with their social sharing.

I got bitten by this same scenario probably six months ago, but had forgotten the details, which made it all the more aggravating when I wasted time trying to figure it out again today.  I was just putting the finishing touches on a new feature, and regression testing social sharing to Twitter through Janrain, when suddenly every attempt to share to Twitter began failing, with no information from Janrain other than "An error occurred".  Yeah, really helpful guys.

I was finally able to track down the issue.  I was just sending a test tweet on a private account, so I didn't really care about the content, and I was reusing the same content over and over.  Apparently the Twitter API detects at some point that the content is being duplicated and begins rejecting the tweets with an error (I don't have the exact code handy as I'm writing this, but it's obvious it's being rejected because it's a duplicate tweet).  Janrain, instead of reporting this detail, just squelches it and reports "An error occurred".  I was only able to figure this out by looking at the HTTP requests.

So, if you suddenly run into this mysterious generic error with Twitter and Janrain social sharing, and you're attempting to tweet the same content repeatedly, this could be the culprit!  Just change up the content and all your problems will be solved (until the next one).
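If you need to repeatedly test sharing, a trivial way to "change up the content" is to tack a unique suffix onto each test message.  A sketch (the function name here is just for illustration):

```shell
# Generate a unique test message each run, so Twitter's duplicate-status
# check never trips: epoch seconds plus the shell's PID as a suffix
make_test_tweet() {
  echo "Testing social sharing $(date +%s).$$"
}
make_test_tweet
```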

Porting Cell Phone Number Verizon -> Straight Talk -> Google Voice

In an effort to cut costs, my wife and I decided to switch our cell service from Verizon to Straight Talk.  We would have done it sooner had we realized that we could use our existing Verizon phones with a Straight Talk SIM card.  I had assumed since Verizon was CDMA, and not GSM, that we would have to buy new phones to switch, but that turned out not to be the case.  We decided I would be the guinea pig and port my phone first.  

I verified our phones were indeed eligible and ordered SIM cards from their BYOP (Bring Your Own Phone) site.  Even though we've lived in Tampa over a year now, I still had my Atlanta cell phone number. My original plan was to establish service with a new Tampa number (time to get local!), and keep my old service active for awhile until I determined whether Straight Talk's service was any good.  Then I could give people my new number, and discontinue my old service with the Atlanta number.  Seemed like a good time to make the switch.

The first hiccup in my plan was that in order to activate an already active phone on Straight Talk, you have to port your existing number from your current provider.  You can't just take out the old SIM card and put in the new one.  They are very explicit about this in the activation process.  I was hoping to avoid the potential pain of the porting process, but it seemed there wasn't really going to be any way around that.  Oh, well.

I started the activation process around 8pm, and hoped it would be done in a couple of hours.  It wasn't.  When I woke up the next morning, Straight Talk's website still indicated the porting process was "in progress", with no additional details.  Around noon, it still showed "in progress".  I finally initiated a customer service chat to make sure there weren't any issues.

When you fill out the porting form with your existing carrier account information, one of the things they ask for is your billing zip code.  As it turns out, they don't want your current billing zip code; they want the billing zip code you used when you created the account.  So if you moved at any point after establishing service with your original carrier, your port will get stuck.  But Straight Talk won't tell you that.  (To be fair, I guess they would have eventually.)  Straight Talk's porting form gives no indication of this; it just asks for your "billing zipcode".  But the customer service rep stated this requirement as if it should have been obvious.

I had to think back to when we first established service with Verizon to remember what our zip code was.  I gave it to the rep, and apparently I guessed right!  She told me, very specifically, my port request would be "completed today at 3:37 PM EST".  Pretty sure it was actually done a couple of hours before that.  Once it went through, my phone service was indistinguishable from when it had been directly with Verizon; it was just a lot cheaper.  The phone even still showed Verizon as the carrier, which makes sense, since Straight Talk is just reselling their network (as well as those of other carriers).

I still really wanted to get a local Tampa number. I found out that you have to order a new SIM card to do this.  After selling a bunch of stuff on eBay, I decided this would be a good time to finally upgrade to an iPhone 6+.  That way, I could establish the new number on my new phone, and keep the old service active on my old phone while I transitioned and gave people my new number.  

So, I ordered an AT&T-compatible GSM SIM card from Straight Talk, and when it was delivered, picked up my iPhone 6+ at the Apple Store.  That, by the way, was a smooth process.  I ordered the unlocked phone using the Apple Store app on my old phone, for pickup at the store.  Walked in, showed them my order with a QR code displayed by the app, showed them my ID, and they brought it right out and I was on my way.

But I digress!  This part of the process was much more straightforward.  I logged on to Straight Talk's website, entered all the requisite info about the SIM card, and my phone was activated pretty much instantly.  The one thing that was a little disappointing was not getting any kind of choice on the number.  With some phone services (e.g. magicJack) you can pick from a list, or even request a specific number if it's available, so you can pick one that's easy to remember, but with Straight Talk, you apparently just get whatever they decide to give you.

Now the conundrum:  what to do about my Atlanta number?  I set up call forwarding on that line to my new one, but that's only a very temporary solution.  Obviously I wasn't going to maintain two lines of service.  I gave my new number out to my family, but there are so many people that still have my old phone number from over the years.  What to do?

Then I wondered: could you port a cell phone number to a Google Voice account?  If I could associate my Atlanta number with a Google Voice account, I could just forward all the calls to my new number, and not have to pay for service on that old number.  I could also hold on to that number for a good long time!  Turns out you can!  Bingo!  I had an old Gmail account with an associated Google Voice number that I hadn't used in years.  It should be noted that Google charges a one-time $20 fee for porting the number, which is refunded if it turns out they can't complete the port for whatever reason.  You can even keep your original Google Voice number as well, if you want, for another $20.  Otherwise, it goes away 90 days after the porting process completes.

I did a few google searches to see what other people's experiences had been, and then I threw caution to the wind and decided to go for it.  A couple of hours after I initiated the process with Google, I got a notification from them that there was a problem with my port.  They indicated that I had given an invalid account number. 

Straight Talk didn't show me an account number on the "my account" screen.  According to the Google searches I had done prior, if you had a BYOP SIM, the account number was the last 15 digits of your SIM card number and your billing PIN was "0000".  I think that's probably true for GSM phones (e.g. AT&T or T-Mobile), but I finally found that since mine was a CDMA phone, the account number was actually the IMEI or MEID number of my phone.  (On the iPhone, you can find this in Settings -> General -> About.)  I believe I entered the IMEI number; at least on my phone they were the same number, except the MEID had one less digit at the end.  Also, while reviewing this, I realized I had set a PIN on my Straight Talk account, so I used that instead of "0000".

After resubmitting, I didn't hear back from Google for awhile.  I did receive a call from (866) 667-6470, which I didn't recognize, so I didn't answer.  In my experience, these types of numbers are usually either telemarketers or bill collectors looking for somebody I've never heard of.  I usually google these numbers just to check, and it turns out this one is associated with Straight Talk.  But, they didn't leave a message, so I didn't do anything.  I checked my account on their website a little while later, and noticed where my Atlanta phone number used to display, it now showed the SIM card number instead, and labeled it an "inactive phone".  Google, however, still showed the port in progress, and I checked, and still had service on the phone despite Straight Talk's page claiming it was inactive.

Nothing else happened until the next morning, when I woke up to an email from Google indicating the port was complete!  It took a total of about 22 hours from the original submission.  Mission accomplished!

Hopefully somebody else will be searching Google like I was, and find this information helpful!

My experience selling used items with Amazon's "FBA"

I've got a lot of stuff lying around that I no longer use and really need to get rid of.  Instead of using eBay or Craigslist, I decided to experiment with selling 3 items on Amazon using their "Fulfillment by Amazon" (FBA) program.  One reason it really appealed to me in my zeal to declutter is that Amazon actually handles all the fulfillment, so you ship the items to them immediately, and the clutter is gone.  Once they make their way to the warehouse, Amazon can fulfill them with Amazon Prime, which you figure a buyer has got to love.  Finally, it seems like most sellers price their used items so high that it should be easy to undercut with the lowest price and still feel good about the amount of money you're getting, even after Amazon takes their cut.

First, let me say, it's obvious that this program is really geared for people who are running a business with regular inventory.  As easy to use as the consumer facing Amazon web interface is, it is amazing just how clunky and baffling the workflow through the FBA interface is for a simple guy like me just trying to sell his stuff.  It's truly awful.

I decided to start with three items.  Here's what happened:

  1. One item, listed for $70, which Amazon has designated as damaged by the carrier. Supposedly I'm going to get reimbursed for that at some point, but it's not clear how much or when.
  2. The second item, listed for $160, sold relatively quickly and shipped to the buyer via Amazon Prime. This item was like new and in the original box. Two weeks later the buyer returned it to Amazon. Amazon has now designated it unsellable because it is defective. WTF? So now my only option is to have them send it back to me, and see if the guy really broke it, or what the deal is. Which of course I have to pay for.
  3. The third item, listed for $30, sold after a few days without incident (so far).

Amazon rocks for buying things, and they get a much larger portion of each of my paychecks than they probably should, but as far as selling my used stuff goes, I'm thinking I need to stick with either eBay or Craigslist ...

Surly LHT and Kona Dew Deluxe stolen [Brandon, FL]

Pisser of a day.  Around lunchtime I discovered two of my bikes had been stolen out of my garage.  As far as we can ascertain, nothing else was taken, including my wife's bikes.  Which makes me suspect it might have been a couple of people who took what they could easily ride off on without drawing too much attention.

We've been having trouble with our garage door randomly opening, which we reported to our landlord weeks ago (Waypoint Homes: DO NOT recommend) but they haven't been in any hurry to address it.  They're that way with most issues, it seems, but you'd think this kind of problem would be a little higher on their list.  They've pushed us up on the priority list, so now they are going to take a look July 1st, but it's a little late now.

To be fair, we are only assuming that they gained entry to the garage because of the door randomly being opened.  The times we've come home and found the door open, we felt fortunate that nothing seemed to be missing.  I had hooked up a workaround using a remote AC adapter that would let me turn off the power completely to the opener to prevent this.  I can only assume at some point I either forgot to switch it off or the power off signal didn't reach it without me realizing.

Anyway, I've filed a police report, and for whatever it's worth, let me put this out on the internet.  I've heard miraculous stories of recovery, so it's worth a shot...

The first bike was a Surly Long Haul Trucker/50cm frame/dark green.  It's already fairly distinctive because you just don't see too many of these out in the wild - they're really made for long distance touring.  But it made a great commuter bike when I was back in Georgia.  It's also distinctive because it not only has a rear rack but a front rack as well.  And it's the disc brake version of the LHT, making it even rarer. Here's a picture:

My beloved Surly LHT disc

The second bike was my original commuter bike, a Kona Dew Deluxe/53cm frame/black.  It wasn't nearly as expensive a bike, except that I had some super sturdy custom wheels built for it, which cost almost as much as the bike did originally.  It's also fairly distinctive because it has butterfly handlebars with red tape on them.  Here's a picture from when I took it grocery shopping:

My Kona Dew Deluxe

If you have any information, please email me at e r i c ---> ericasberry.com.  Alternatively you can contact the Hillsborough County Sheriff's Office @ 813-247-8000, the case # is 14-364609 and the deputy who took the report is Romano.

Migrating a Desktop VM to an ESXi Server

I have a Windows 7 VM on my primary Mac notebook that I use regularly for one particular application, and occasionally for browser testing or other random tasks.  I have an ESXi server running on an old desktop machine, and I thought it would be nice to move that VM to the server and just remote desktop to it.  It would have the added benefit of letting me run the app from any of my other machines.  I learned a few things along the way.

First, the straightforward thing to do would be to create a new virtual machine on the server, install Windows 7, and then install the application.  But like any good lazy programmer, I figured there had to be a better way.  What I _really_ wanted to do was just move my existing image off of my laptop onto the server.  Turns out it's not very difficult, but there are a few steps and at least one "gotcha".

If you don't already have it, you need to download the OVF Tool from VMware.  Extract the .tar.gz file and open the *.pkg file to install it.  Then, at the terminal, run a command similar to this:

/Applications/VMware\ OVF\ Tool/ovftool ./Documents/Virtual\ Machines.localized/Windows\ 7.vmwarevm/Windows\ 7.vmx ./Documents/ConvertedVM/

This assumes your current working directory is in your home directory, your VM's are stored in the same place mine are, etc.  ConvertedVM is the directory where it will output the converted VM as an OVF.  Season this recipe to taste.  Now, you can deploy the OVF to your server in the usual way.

As I mentioned before, there is one "gotcha".  I had recently upgraded to the latest version of VMware Fusion, and allowed it to upgrade my virtual machine's "hardware".  Don't do that.  Once you upgrade to hardware version "10", you can no longer edit the hardware settings of your VM using the Windows vSphere client.  You can edit them with the web vSphere client, but apparently that isn't available for the free version of ESXi I'm running in my home "lab".  I needed to change the hardware settings, though, because in the migration I lost my network adapter, and the Windows application in question needs network access.
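One workaround I've seen mentioned (but haven't tried myself) is to pin the VM back to an older hardware version by editing the .vmx before running ovftool.  The virtualHW.version key is what VMware writes into the .vmx; your version numbers may differ:

```shell
VMX="Windows 7.vmx"   # the example VM config from above
# Write a copy pinned to hardware version 9, leaving the original intact
if [ -f "$VMX" ]; then
  sed 's/virtualHW.version = "10"/virtualHW.version = "9"/' \
      "$VMX" > "${VMX%.vmx}.hw9.vmx"
fi
```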

I was able to work around it in a fairly brute-force way.  I removed the VM from the ESXi inventory (but did NOT delete the disk), created a new Windows VM, and reused the existing disk image.  It worked fine, although I did have to reactivate Windows.  There's also a workaround mentioned using "PowerCLI", but I haven't used that so I can't vouch for it.

EXCLUSIVE LIMITED-TIME BONUS TIP!  Gosh, I sure do miss the <blink> tag.  But, I digress.  I've been using "Remote Desktop Connection" to access Windows VM's.  I believe it was installed as part of Office for Mac 2011, or maybe it was something I downloaded from Microsoft many moons ago.  If you're still using that too, I've found that there's actually a much better free remote desktop application (also from Microsoft) available on the Mac App Store.  Get it now while supplies last!

Tracking Down Bandwidth Hogs: Netflow Home Edition

Recently Comcast announced they were going to be doing some testing of data caps in certain markets, including mine.  I'm a cord-cutter and I've been very happy with Comcast's speed and reliability, but when you view a lot of streaming video (Netflix, Hulu, Amazon Prime, iTunes) and do a lot of cloud backups of your data, 300GB suddenly doesn't seem like much.

Fortunately, Comcast has graciously decided to give its customers a grace period before they actually start charging for overages, so I've been trying to manage my data use a little more proactively.  Looking at previous months, I saw that I was typically using 500-600GB of data.  After cutting my streaming usage way back, it seemed that I was still using a tremendous amount of data and exceeding my cap well before the end of the month, and I couldn't really figure out why.

It so happens that the company I work for makes a great enterprise product for finding just these sorts of things using NetFlow.  Unfortunately, we don't (currently) produce a home edition, so I decided to try the next best thing.

The first step was to configure my router, which is running DD-WRT firmware, to export "RFlow".  As best I can tell, RFlow is an implementation of some version of NetFlow; presumably NetFlow is trademarked, so they have to call it something else.  Add nprobe and ntopng to the mix, and I was able to find that one host was using a very large percentage of the total bandwidth.
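For reference, the collector side looked roughly like this.  This is a sketch from memory; double-check the exact flags against the nprobe and ntopng documentation for your versions:

```shell
# Router (DD-WRT): enable RFlow under Services, exporting to the
# collector machine's IP on port 2055.

# Collector machine: nprobe receives the NetFlow/RFlow export and
# republishes it over ZMQ for ntopng to consume
nprobe -i none -n none --collector-port 2055 --zmq "tcp://*:5556"

# ntopng reads the ZMQ feed and provides the per-host traffic web UI
ntopng -i tcp://127.0.0.1:5556
```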


So the host, which is my MacBook Pro, had used over 1GB of bandwidth during the reporting period, but what really surprised me was that it wasn't predominantly downloading data, but sending it.  This set off all kinds of alarm bells and gave me a mild panic.  I'm thinking virus, botnet, who knows what evil malware I may have gotten, despite considering myself fairly savvy.

Unfortunately, I couldn't tell much more about the traffic other than it was SSL.  But now that I had it narrowed down to a host, I remembered a handy utility I'd discovered awhile back called Little Snitch.  It's basically a firewall which allows you to selectively allow and deny connections from your Mac.  Being required to whitelist each application that requests access to the network gets pretty tiresome after awhile, so I had stopped using it.  But turns out, the latest version has a "passive" mode, where it will monitor what's going on but won't actively block anything.  I let it run for awhile and was able to collect some interesting data.


Outlook?  What the hell are you doing?  After about an hour it had sent nearly half a gigabyte of data.  I'll grant you that my emails may be overly wordy at times, but I hadn't actually sent any.

Turns out, Outlook has a rather nasty little bug having to do with folder syncing that makes it use a metric crap-ton of bandwidth.  There doesn't seem to be a fix that I could find, so for now, I'm just shutting down Outlook when I'm not actively using it.  

And thus ends my bandwidth hog detective story.

Creating TTL Indexes in Spring Data for MongoDB

Recently I needed to programmatically create a TTL index in MongoDB using Spring Data.  Ordinarily, you would add indexes by creating a new Index object for the field and specifying the various attributes of the index.  However, there currently isn't a way to specify the TTL information.

Fortunately, it's easy enough to accomplish by just creating your own implementation of the IndexDefinition interface.  In the example below, myIndexField is a date field in my collection, represented by MyEntity.class.  The variable expireAfterSeconds is being passed in, since it's user-configurable (which is one of the reasons I need to do this programmatically).

IndexDefinition index = new IndexDefinition() {
    public DBObject getIndexKeys() {
        return new BasicDBObject("myIndexField", 1);
    }
    public DBObject getIndexOptions() {
        return new BasicDBObject("expireAfterSeconds", expireAfterSeconds);
    }
};
mongoTemplate.indexOps(MyEntity.class).ensureIndex(index);


Weird Problem: Gray Background Instead Of Wallpaper After Resuming OSX Lion

When I woke up my laptop this morning, I had a weird issue.  On my external display, instead of my usual wallpaper, I had a gray background.  I tried just resetting the wallpaper in system preferences but that didn’t work.  I tried disconnecting and reconnecting the display cable, too.  No joy.

I figured restarting and/or logging out would probably fix it but I didn’t really want to do that.  I found the answer (along with other possibilities) in this thread in the Apple Support forums.

Just open Activity Monitor and quit the “Dock” process.  It will automatically restart and your wallpapers will come back.  Note that anything you had minimized to the dock will be restored to a window.  Annoying, but much less painful than having to restart everything!  

UPDATE: Even easier, from a terminal window: killall Dock to accomplish the same thing.

OSX Lion: ssh_askpass: exec(/usr/libexec/ssh-askpass): No such file or directory

I keep running into this.  For various reasons I need to use password-based authentication on some test boxes that are regularly rebuilt, which makes key-based authentication difficult.

Ordinarily I set up an alias like this:

alias ssh1="sshpass -p secretpassword ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null"

… which lets me use a command like:

ssh1 eric@devbox

But sometimes (and I swear, it doesn’t always happen, but I haven’t figured out the variable yet), it fails, and I get:

ssh_askpass: exec(/usr/libexec/ssh-askpass): No such file or directory

… and indeed, the ssh-askpass binary doesn't exist.  After some flailing around and googling, I finally came up with this ugly hack.  I create a shell script (made executable with chmod +x) with the following contents:

#!/bin/sh
echo "secretpassword"

Then, I set the following environment variable, which directs ssh to use my script instead of looking for the missing ssh-askpass:

export SSH_ASKPASS=/Users/eric/scripts/ssh-askpass.sh
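Putting the whole hack together (the paths are just my own; season to taste):

```shell
# Create the fake askpass script that just prints the password
mkdir -p "$HOME/scripts"
cat > "$HOME/scripts/ssh-askpass.sh" <<'EOF'
#!/bin/sh
echo "secretpassword"
EOF
# It must be executable, or it will be silently ignored
chmod +x "$HOME/scripts/ssh-askpass.sh"
export SSH_ASKPASS="$HOME/scripts/ssh-askpass.sh"
```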

It’s ugly, but it works.  Of course, it assumes you’re always using the same “secretpassword”.  Some day I need to figure out a better way of addressing this, but today is not that day.  

See also:  http://apple.stackexchange.com/questions/18238/mac-os-x-lion-and-sshpass

The Most Expensive Album I Ever Bought

Every year around this time I start thinking about this record album I used to have as a kid, and start feeling nostalgic. It was a recording of "A Christmas Carol", most definitely geared toward children. But for my formative years, this album was the definitive version of "A Christmas Carol" for me. I used to have it on a vinyl LP, but at some point after I moved out on my own, or maybe even before, I got rid of the album. Why keep such a childish thing, after all?

Now that I'm older, I regret that. Of course, even if I had the LP, I haven't had a record player in many, many years. Probably almost 20 years. (Gasp! Allow me to pause for a moment while the reality of that hits me.) Even before CDs got popular I mostly listened to cassettes.

Anyhow, every year around this time, I find myself doing a little searching on the 'net, hoping beyond hope to find a digital version of this recording. I've never had very good results, partly because I couldn't remember enough about it to even know what exactly to search for. I wasn't even sure it was called "A Christmas Carol". I had no idea who actually recorded or published it. I did know that there was a flip side with another story called "The Fir Tree", which I also liked, but that was about it.

Well, this year, at Thanksgiving, I was talking to my sister about it, and she remembered the album, too. We were quoting parts of it and reminiscing, which inspired me to go looking again. I spent "Black Friday" morning trolling the 'net, and eventually found a site that listed the discography for "Peter Pan Records", and there found the information for the album I was looking for. It was "A Christmas Carol" by the "Peter Pan Players & Chorus". Apparently it was released in 1972! Wow!

With this additional information, I was able to refine my search, but as I expected, I could not find any kind of digital version of it for sale. I did find a couple of auctions on eBay that had this record, and when I saw the picture of the album cover I recognized it right away! I was sure it was the same one from my childhood memories. One of the auctions was "Buy It Now", and it was only $12.99, so rather than risk missing out on getting it, I went for it. I have no idea how many of these are floating around out there or what condition they're in, but having struck gold after all these years, I didn't want to chance it. Soon the record was on its way to me.

One problem: I don't have a record player. So I went to Amazon and found a USB turntable. At the time I ordered it the price was about $80, although I've found it interesting that as I've been sending links to friends who've asked about it, the price seems to change frequently; I've also seen it close to $100. So, I guess I got a deal! This was about the least expensive USB turntable I could find, it got a pretty good rating overall on Amazon, and it specifically said it would work on a Mac.

Anyway, the turntable arrived two days ago and I got my hands on the album today. I could not wait to get home and listen to this recording that I had not heard in so many years. I got the turntable set up, which was a little more involved than I had anticipated, but not too bad. I had to attach the cartridge and a counterweight on the arm, which had to be adjusted a little. I also had to install the platter and loop the drive belt over the motor. No big deal. The turntable comes bundled with some kind of "EZ" software as well as a full audio editor. I already had the latter installed on my Mac, and since it provides additional editing capabilities (e.g. filtering out the worst of the pops and clicks), I decided to just use that.

In just a few minutes, I was listening to "The Fir Tree". I decided to start with that one since it was shorter, and would serve as a trial run, as it had been awhile since I'd used the software. It was just as wonderful and depressing as I remembered! "Piece by piece he was fed into the fire. Gone. Gone. The little fir tree was gone."

I ran it through the pop/click filter and exported it as an MP3, and moved on to the best part: "A Christmas Carol". "Ebenezer Scrooooooooge ..." it began. Ah, the memories flooded back! Unfortunately, while "The Fir Tree" played without a hitch, there was one spot in "A Christmas Carol" where it skipped and just kept skipping. Upon inspection, I found a visible spot where the record was scratched a little. I suspected that if the weight on the cartridge were a little higher, it might not skip as easily. So I tried adjusting the counterweight on the arm, and fortunately I was able to get it past that point. I was able to clean up the skip pretty well in the editor, so it's hardly even noticeable.

I actually bought one more record, a recording of the only other album I remember having as a child. I mostly bought it to help further legitimize buying this USB turntable, which, once I finish digitizing everything, will probably go into a closet never to be used again.

So the record that probably only cost a couple of bucks when I was a kid cost me around $100 all-in this time around. I guess that's inflation for you, huh?

ThisMonkeyWriteMo or "What The Right Side of My Brain Is Up To Lately"

Those of you who have been reading this blog for a long time (uh... hello? Anybody still out there?) may remember this gem from 2005.

Wow. 2005? Has it really been that long?

Well, life happens, and despite my best intentions, not much has come of the things I planned on doing, which I wrote about in that blog entry.

This year, I've gotten involved in NaNoWriMo (those of you who follow me on Twitter are probably all too aware of this, as I've been updating my word count daily, but you may not know exactly what it is). If you care to know, read on.

NaNoWriMo is shorthand for "National Novel Writing Month". The short explanation is this: when you commit to doing NaNoWriMo, you commit to writing 50,000 words of fiction during the month of November. That's it. And it doesn't cost a thing!

OK, that's a lie. There's no money involved, but it does cost quite a commitment of time. Already, some of my co-workers are annoyed at how far behind I am on television. (Can you believe I have not even watched the new "V", and have no idea when I will? And that apparently there have been some kind of important baseball games being televised?) I'm that far behind already, and this is only the first week! I envision a future with much less television in my life. That is probably not a bad thing.

So, what's magic about 50,000 words? It's sort of arbitrary, but the guy who started this crazy thing established 50,000 words as sort of the minimum length for a novel. If you want to know more, check out the site I've linked above, particularly the FAQ.

A couple of people have asked me about this undertaking, wanting to know what my story is about, whether or not I'm going to get it published, who do I think I am with these crazy delusions, etc. Well, let's take the second question first. Absolutely not. I don't think I ever want anyone to read this story. It's pretty bad. Even after some heavy editing, I'm pretty sure it will be beyond redemption. But that's not the point. The point is to get the creative juices flowing. Prove to yourself that you can actually create something new. Art for art's sake (even if the first few attempts aren't beautiful).

I have to admit, while part of me is a little distressed at the amount of time it's taking out of each day, there is another, rapidly growing part that is starting to remember why I loved this so much when I was younger, before I became a hard core computer geek. The thing I've always loved about programming computers is the ability to create something new. You can read my 2005 blog entry to get an idea about how that's working out for me. Not much has really changed in four years with regard to getting to create new things every day.

On the other hand, it is really a lot of fun to create entire universes in your own stories. For the novel I'm working on right now, I have no outline, no real plan for where it's going. I'm just making it up as I go. It's a lot of fun and the right side of my brain is enjoying getting some attention for a change. Already there's been an accidental death, a revenge killing and an arson fire - and I'm only four chapters in!

The NaNoWriMo is a great, fun challenge, but I plan on continuing once this month is over. Probably not with the novel I'm working on now, but something else. I'm going to really start working on improving my writing skills. Mostly for fun, but hey, it'll be nice to have something to fall back on if this whole computer thing just turns out to be a passing fad!

Unattended Installation (aka Silent Install) of Sun JDK in Debian

Posting this short note to my blog in the hopes it may help someone else, as I had a hard time tracking this info down.

Debian provides a package (sun-java6-jdk) for installing the JDK, but when you have a situation where you need to do an unattended installation (aka "silent install") of the JDK you are stuck, because the install insists on making you interactively accept Sun's licensing agreement. Passing the -y option to apt-get has no effect. I was able to find a link on Sun's site for doing silent installs on Windows, but nothing for Linux.

Fortunately, I eventually tracked it down; all you have to do is run the following before your apt-get install command:

echo sun-java6-jdk shared/accepted-sun-dlj-v1-1 boolean true | debconf-set-selections

This tells the package's install hooks that you have already accepted the license agreement, so the annoying EULA dialog never appears.

UPDATE 11/3/09: I should note that all of the above research was in preparation for setting up an automated install with FAI. Unfortunately, I later found out that while the above works great at the regular command line, it doesn't work in the environment FAI installs run in. However, I also discovered there is a correct way of doing it with FAI.

Use "debconf preseeding". Create a class file, e.g. debconf/DEFAULT (I used debconf/easberry-vm-std-plt-01 for testing with my host class), and populate it with the following:

sun-java6-bin shared/accepted-sun-dlj-v1-1 boolean true
sun-java6-jdk shared/accepted-sun-dlj-v1-1 boolean true
sun-java6-jre shared/accepted-sun-dlj-v1-1 boolean true

This accomplishes basically the same thing, but in the FAI environment. Success!

No Fluff, Just Stuff Atlanta

I just finished attending a 3 day (well, technically 2 1/2 days) tech conference called the "Greater Atlanta Software Symposium". (Most people seem to just refer to it by the name of the organization, "No Fluff Just Stuff"). I highly recommend this conference, which is held at various locations throughout the year. This is the third one I've attended, and I always come away from them excited about the new tech I've learned and eager to apply it.

There were lots of good presentations. On the first day, I started out with a presentation on anti-patterns in software development. (I always wonder: if patterns and anti-patterns converge, do they annihilate each other?) Then I went to a presentation on Scala, which I've heard a lot of buzz about. It looks interesting, though some of the syntax made my eyes hurt. Then I went to a presentation on Flex development. This, I think, is something I really want to pursue. Flex looks pretty cool, and like it or not, Flash is a lot more ubiquitous than Java on the desktop ever was or will be. (Sorry, JavaFX.) After dinner, there was an interesting keynote on being an iconoclast. It was very thought provoking, although the speaker sort of lost me when he seemed to put the Dixie Chicks on par with Florence Nightingale and other historical figures as examples of iconoclasts.

The second day I attended several web design talks by a couple of different presenters. I really need to read "Don't Make Me Think" and "The Design of Everyday Things." I also attended a session on "Java.next", which was an overview of up-and-coming alternative languages on the JVM. (This is where I first heard the controversial statement I mention below.)

The final day (today) consisted of topics that are probably the most immediately applicable to my day job: presentations on the "Busy Java Developer's Guide to Collections" and a couple of deep dives into garbage collection on the JVM and various tools that are available to help troubleshoot running systems. (BTrace looks awesome!) I also attended a presentation on applying functional language approaches to Java development.

Functional languages on the JVM seemed to be one of the dominant themes this year. I think I need to spend some time learning either Scala or Clojure (or maybe both). One really provocative comment made by one of the presenters, and reaffirmed by the group at the expert panel talk, was that no "green field" projects should be written in Java. Of course, it's a particularly provocative comment given that it was made at a conference targeted at Java developers.

I have mixed feelings.

I'm convinced that these new languages provide a lot of demonstrable benefits over the aging Java language. Since they run on the JVM, they can even be grafted onto existing Java projects. Of course, it's always good to learn new technologies just to tickle your brain, and maybe even get some new insights and ways of thinking about various problems.

On the other hand, in my world I don't see a lot of call for programmers that know Scala or Clojure, or even Groovy (which is not a functional language, but it's in that "java.next" category, and it's about as incremental a step away from Java as you can make). Let's face it, in today's economy, that's got to be a major consideration! Also, in most of the positions I've held in my career, I'm generally not working on "green field" projects; I'm maintaining existing applications. There's no way I'm going to be able to sneak Clojure or Scala into my existing project (except maybe, as was suggested as a first step, for unit testing). Sure, those modules written in the other languages can be integrated with the existing Java code, and it may not be an issue in the short term, but what percentage of Java developers are going to be able to come in behind me and even recognize the language, much less be able to understand it or maintain it? When someone else raised this very issue during the Scala presentation I attended, the presenter's answer seemed to be that he was above working with such programmers, so it was a non-issue for him.

I still think it's worth learning one (or both) of these languages, even though for the short term at least, I can't see using them for anything other than personal projects. I just wish I knew where to place my bets and focus, because there seem to be a LOT of paths to the future.

Three Displays On My Macbook Pro

I've often heard that you can never be too rich, too thin or have too much monitor real estate. OK, I might have added that last thing myself. I'm definitely in no position to test the first two assertions, but thanks to the USB display adapter I recently purchased for my Macbook Pro I'm better able to test the third.

I purchased my adapter from OWC (Other World Computing) here, but I've since discovered what appears to be the same hardware available from Amazon here for about $20-$30 less. I verified that the one I purchased from OWC matches the model number of the device on Amazon, USB-1612. Here is the manufacturer's product page. It's possible the one on Amazon may only come with Windows drivers, since the product was apparently Windows-only initially. If that's the case you can download OSX drivers here. The CD that came with mine had both Windows and OSX drivers. As of this writing, it appears that Linux is unsupported. I've included a picture to give you an idea of the relative size compared with an iPhone, the handiest reference object I had available.

When I'm working on my Macbook Pro at my desk, I have an Acer 24" monitor running at 1920x1200 attached via DVI. It's a nice, large monitor, but I also have an older Dell 20" widescreen monitor which I like to use in more of a "tallscreen" orientation. The monitor can be rotated 90 degrees which is useful for certain applications (primarily reading ebooks, technical documents or long news articles - but it also comes in handy when reviewing credit card and bank statements online). Of course, the Macbook only has one external display port, so ordinarily there is no way to use both monitors at the same time. This device allows you to add additional displays (up to 4 in OSX, or 6 in Windows) via USB ports. I only tested it under OSX 10.5.7.

So, what's the verdict? Overall, I'm very pleased with it and it meets my needs, but there are some limitations you should be aware of, and my experience hasn't been problem-free.

The initial installation was fairly painless. I installed the OSX drivers using the supplied CD and restarted my Mac. I then connected the device to my USB hub. One nice touch is that the device draws its power over the USB connection, so there's not a separate power adapter to plug in - just the USB cable and the monitor cable (in my case a DVI cable). It comes packaged with a DVI to HDMI adapter as well as a DVI to VGA adapter - I didn't use either of those. As soon as I connected the device the new display was recognized. I just needed to bring up the OSX preference pane to rotate the image 90 degrees to match the orientation of my monitor, and I was in business! Everything was functioning properly and I picked a special wallpaper for my "tallscreen" monitor. But, I had a Safari 4 update pending, so I decided before I did anything else, I would go ahead and apply that update. It required a restart of the system, and that's where my problems began.

After rebooting, the system seemed to recognize the display, but the attached monitor was completely blank. The green light was illuminated on the monitor, indicating that it was receiving a signal, but the display was completely black. I tried unplugging/replugging the device, running "detect displays", and even rebooting again, none of which fixed the problem. After searching online for a bit, the only thing I could find was this link which didn't exactly match my scenario. The troubleshooting FAQ refers to problems with an already functional installation that may occur after applying the OSX 10.5.7 update. I was already running 10.5.7 before I initially installed the device. However, I figured it was possible that the browser update might have updated some system file, so with nothing better to try, I followed the suggestion, which basically involved uninstalling the drivers, rebooting, re-installing the drivers, and rebooting yet again. It was annoying, but it fixed the problem. Not able to leave well enough alone, I decided to reboot again, just to make sure everything was stable. I got the same symptoms - monitor getting a signal but no image being displayed. I did the uninstall driver/reboot system/reinstall driver/reboot system dance again, and once more the display came to life.

At this point I was getting pretty frustrated. Doing this dance every time I need to restart my Mac was not going to be acceptable. I did a little more poking around on my system, and launched the Console app, wherein I found this message repeating over and over:

DisplayLinkManager[739:65c3] Could not establish GA communication 0000044E

Often, I find that if you search for an error string on Google, you can find one or more people who have had the same issue, often with a solution, but in this case I had no such luck. I re-read the FAQ and decided to try that oft-suggested bit of OSX voodoo: running the Disk Utility program and repairing disk permissions. In all the time I have been running Macs (about 4 years now), I can honestly say that although performing that ritual usually finds and fixes some permissions problems, I have never had it actually cure the symptom that prompted me to repair permissions in the first place. Apparently this instance was the exception that proves the rule. After repairing permissions and rebooting the system, the display continued working. I rebooted a couple more times just to be sure, the display came up fine both times, and I have not had any problems since.

What are some limitations of the device? As I mentioned previously, if you want to use this under Linux, as far as I can tell you are currently out of luck. Also, it only supports two resolutions: 1600x1200 or 1680x1050 (which is what I'm running it at - with the image rotated 90 degrees so that I can run it in portrait mode). At least on the Mac, it doesn't support OpenGL acceleration, which means certain things won't run on the display, such as video editing in iMovie. Also, only Intel Macs are supported. One other limitation that I ran into was that Crossover for the Mac doesn't seem to behave well with it. I tried moving applications running under Crossover onto the display, but as soon as I finished dragging the window, OSX became unresponsive. I could still move the mouse pointer around, but I couldn't switch focus to any application (no matter which display it was on), nor could I bring up the system menu or the "Force Quit" dialog. My only recourse was holding down the power button to reboot the system. I encountered this behavior consistently, and could not find any fix or workaround. Annoying, but I don't use Crossover that much, and as long as I remember not to move it to that display, it's not a problem.

After getting the kinks worked out, I'm pretty happy, and it's serving the purpose I had for it. I do wish I had known I could get essentially the same device for $20-$30 cheaper from Amazon. Ah, well -- live and learn. Here's a picture of my three display setup (one of the displays is the built-in display of my Macbook Pro). You may notice the monitor in middle of the picture looks a little screwy - it's actually a composite image I created using an iPhone app called Pano and I'm not particularly skilled in making those images look quite right.

Forcing iTunes to automatically check for purchases

Yes, I am aware that there is an option in iTunes to automatically check for purchases (such as TV show subscriptions). And I have it checked. But for some mysterious reason, iTunes simply refuses to automatically check for downloads on my old PowerPC Mac Mini (still running Tiger) which I use as my media hub. It has been a minor annoyance to me for some time now, and I haven't been able to find a solution 'til now. This doesn't really fix the problem exactly - but it's a nifty little hack that achieves the result I'm looking for. Through the magic of AppleScript and cron, my problem is solved!

I found the magic URL for forcing iTunes to check for purchases, and created the following AppleScript to invoke it:

tell application "iTunes"
open location "itmss://phobos.apple.com/WebObjects/MZFinance.woa/wa/checkForPurchases?ign-mscache=1"
end tell

Then, edit crontab to run the script every 4 hours:
0 0,4,8,12,16,20 * * * osascript ~/Documents/check_itunes.scpt

I Love Obscure Error Messages

Obscure error messages are delightful. Like this one. It comes from Eclipse, my Java IDE of choice (well, not my choice really, but I digress...). So today I fire it up and get this obscure error message. Fortunately, the problem is really clear once you look at the log file it points you to:

!SESSION 2009-01-16 11:01:58.448 -----------------------------------------------
java.vendor=Sun Microsystems Inc.
BootLoader constants: OS=linux, ARCH=x86, WS=gtk, NL=en_US
Command-line arguments: -os linux -ws gtk -arch x86

!ENTRY org.eclipse.equinox.app 0 0 2009-01-16 11:01:59.038
!MESSAGE Product could not be found.

!ENTRY org.eclipse.osgi 4 0 2009-01-16 11:01:59.084
!MESSAGE Application error
java.lang.RuntimeException: No application id has been found.
at org.eclipse.equinox.internal.app.EclipseAppContainer.startDefaultApp(EclipseAppContainer.java:236)
at org.eclipse.equinox.internal.app.MainApplicationLauncher.run(MainApplicationLauncher.java:29)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:382)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:549)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:504)
at org.eclipse.equinox.launcher.Main.run(Main.java:1236)
Yes, that was sarcasm you heard there.

To make a long, painful story short, after trolling around on the interwebs, I found the answer lies in Eclipse's config.ini file. For whatever reason, and I have no idea what that reason is, after using Eclipse since June, my config file got mysteriously corrupted. The attribute eclipse.product had a blank value. Apparently it's supposed to either have the value org.eclipse.sdk.ide or (in my case) org.eclipse.platform.ide, depending on which version of Eclipse you have installed. Once I set eclipse.product=org.eclipse.platform.ide in the config.ini file, Eclipse started up without complaint.
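If this happens to you, the sanity check is easy to script. The sketch below is purely illustrative: the install path and ECLIPSE_HOME variable are hypothetical (adjust them to wherever your Eclipse lives), and the two product ids are the ones mentioned above.

```shell
# Check that eclipse.product has a non-blank value in config.ini
# (the /opt/eclipse default here is just a guess; override via ECLIPSE_HOME)
CONFIG="${ECLIPSE_HOME:-/opt/eclipse}/configuration/config.ini"
if grep -q '^eclipse.product=..*' "$CONFIG" 2>/dev/null; then
  STATUS="ok"
else
  # blank or missing: set eclipse.product=org.eclipse.platform.ide (or org.eclipse.sdk.ide)
  STATUS="missing or blank"
fi
echo "eclipse.product: $STATUS"
```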

I hope that this sad tale may help someone in the future. Perhaps even myself, should it happen again!

Matching multiline regular expressions in Java

Regular expressions are one of those things that I understand but don't use often enough to have really mastered. I use them even less in my Java programming. Today I found myself banging my head into my cubicle wall trying to parse out a file that looked something like this:
----START
anotherthing=ran outta foo words
----END

I needed to parse this out and create objects representing each section. Initially I started trying to read this in line by line, keeping track of where I was in relation to the markers, concatenating string buffers, etc. Then I realized how convoluted that approach was, and how using a regular expression would make it a lot simpler.

Brushing aside the mental cobwebs, I looked up a couple of references in the Java API and studied up on my friends Pattern and Matcher. I had trouble finding any examples for my specific case, where what I wanted to match spanned multiple lines. At first I thought I'd found the answer with the promising-sounding Pattern.MULTILINE argument to Pattern.compile(), but that only affects the matching of ^ and $. Without that option, those anchors match only at the beginning or end of the entire input; with it, they also match at newline boundaries within the text.

Turned out what I was looking for was the Pattern.DOTALL argument. By default, the dot operator does not match newlines; with this argument it does. An alternative is to prefix the regex pattern with (?s). It has the same effect, and the mnemonic stands for "single-line" mode, which is what it's called in Perl.

So, to extract the relevant sections from the input above, you can do the following:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

Pattern pattern = Pattern.compile("----START\n(.*?)\n----END\n", Pattern.DOTALL);
Matcher matcher = pattern.matcher(theInput);

while (matcher.find()) {
    System.out.println(matcher.group(1) + "\n");
}

That will print out:

anotherthing=ran outta foo words

You can get essentially the same result with the alternate method, using the embedded (?s) flag in the Pattern declaration (though without the explicit \n delimiters, the captured group will include the surrounding newlines):

Pattern pattern = Pattern.compile("(?s)----START(.*?)----END");

For further reference, have a look here.

4 Gigabytes of Fun!

It sometimes seems that no procedure involving a computer ever turns out to be as straightforward as you initially think it will be.

The desktop I have at work is a pretty nice system, a Dell Precision T3400 quad core with 2GB of RAM.  It's plenty fast, but with some of the memory-hogging software I run all the time (a Windows virtual machine, the Eclipse IDE and Firefox), sometimes it gets bogged down not by the processor but by swapping stuff in and out to disk.  RAM is a pretty cheap upgrade, and I managed to get my machine bumped up to 4GB.  Yay!  Problem: Ubuntu only recognizes 3.2GB of RAM, though the BIOS clearly sees all 4GB.  Boo!

Well, I have the 32 bit version of Ubuntu installed, not the 64 bit version.  It never occurred to me to install the 64 bit version (and it wouldn't have made a difference before today).  I didn't relish the thought of reinstalling to get the 64 bit version, so I did some googling.

The first line of attack that I ran across involved tweaking BIOS settings.  I found many suggestions to look for some kind of memory remapping setting in the BIOS, but I couldn't find anything like that in mine.  I noticed the BIOS was a few versions out of date, so I updated it, but that didn't seem to make a difference.  (Aside: very pleased to see that I could download a Linux version of the BIOS updater, which was basically just a single file that you chmod +x and run as root in single user mode.  Good on ya Dell!)

A little more googling and I discovered that some Linux kernels support Physical Address Extension (PAE), a workaround that lets a 32 bit OS address more than 4GB of physical memory. Of course, I only wanted to see 4GB of RAM, not more, but even that requires more than 4GB of address space, because other things, like video RAM, also have to be mapped into the address space. Poking around some more, it seems that the default Ubuntu desktop kernel does NOT include this support. Further confirmation was provided by this entry in /var/log/dmesg: Use a HIGHMEM64G enabled kernel. The solution is to install the server version of the kernel, which does provide that support.
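Before swapping kernels, a couple of quick checks against the standard /proc interfaces can confirm the diagnosis. A minimal sketch (the "-generic" vs "-server" kernel flavor naming is Ubuntu's):

```shell
# What kernel flavor is running, and how much RAM does it actually see?
KERNEL=$(uname -r)                                   # "-generic" lacks HIGHMEM64G; "-server" includes it
MEM_KB=$(awk '/MemTotal/ {print $2}' /proc/meminfo)  # memory visible to the kernel, in kB
echo "kernel $KERNEL sees $MEM_KB kB of RAM"

# Does the CPU support PAE at all?
grep -qw pae /proc/cpuinfo && echo "CPU supports PAE" || echo "no PAE flag found"
```

If MemTotal comes back around 3.2GB on a 4GB machine and the pae flag is present, you're in exactly the situation described above.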

So to the keyboard I ran (OK, that's a lie, I was already at the keyboard) and typed:  

apt-get install linux-image-server

Success! Victory is mine! But wait, upon rebooting, disaster struck. My X server kept trying to initialize over and over, and finally came up in an extremely low resolution mode and said it couldn't detect my video card anymore. Whaaaaaaaa????

At first I started wondering if something weird was going on, like maybe it was somehow remapping the address of the video RAM in a way that the Nvidia driver didn't like, but it turned out to be much simpler. After some further googling, it turns out that you need to install some kernel modules for the nvidia driver I'm using. I guess those modules must have already been installed for the generic version of the kernel, but not the server version. So a couple more terminal commands:

sudo nvidia-xconfig (to put my X configuration right again)
sudo apt-get install linux-restricted-modules-server (to provide the module support needed for the video driver)

Then, a quick reboot. There was one more minor hiccup, which actually took care of itself. When I fired up VMWare it said it needed to recompile some kernel modules (which it always does whenever you upgrade the kernel). One really nice thing in the VMWare 6.5 betas is that it just takes care of this for you, and gives you feedback about this operation in the GUI, with no action required on your part. No more going to the command line and running the old vmware-config.pl script.

I write this entry in the hopes that it will save someone else time in the future, as they can find all this info in one place, instead of the dozens of places that I had to gather it all from.

A Couple of Cool Mac Apps

If you like listening to audiobooks on your iPod/iPhone/MacBook and your audiobooks come from sources other than iTunes and Audible (for example: emusic, which has DRM-free MP3s, or audiobooks you have ripped from CD), you should check out Audiobook Builder.

One issue with audiobooks from emusic and other sources is that instead of having one nice audiobook file in the standard Apple format, you end up with a couple hundred MP3 files that you have to keep organized in a playlist. Those files also aren't categorized as an audiobook within iTunes, and trying to keep your place between listening sessions (especially if you listen to something else in between) can be a real pain.

Audiobook Builder lets you combine all those files into a standard Apple audiobook format file. It's only $10 for a license, and there is a demo available. Unfortunately, the demo is crippled, in that it will only do the first 20 minutes of audio, so you can't really try it out with a full audiobook. The demo does at least let you get a good feel for how the program works, and $10 is pretty cheap.

Another cool app, a little more expensive, is myTuneSync. If you have several computers with iTunes libraries, keeping them in sync can be a real pain. I have iTunes on my Macbook Pro, my wife has iTunes on her Dell Latitude, and I have another instance of iTunes running in a virtual machine to serve up media files to my AppleTV. That virtual machine is more or less my central iTunes repository, but it's a pain to keep music that has been added to the two laptops synced with it. This program lets you do things like that in a fairly painless manner. One nice surprise for those of you (like me) with both Windows and Mac is that the client works on both platforms.

Licensing is a bit strange. It's $20 for a license for a single computer, which doesn't make any sense, since the program is useless on a single machine. But a 3 license pack is available for $30. You will need at least two licenses to actually use this program so I'm not sure why you would ever buy the single license. You can try this app out with full functionality for 15 days, so if you're interested in a solution like this, you can give it a try with no risk.

(NOTE: I have no affiliation with either of these sites, other than being a happy customer.)

Just Another Post

As you can see, it's been awhile since I've posted to the blog. It has been an eventful year so far, though. Some good, some bad.

Of course, my big brother Greg passed away in April. It was a shock to the whole family. He was only 44. Those of you who know me well may know that I also lost my other brother in a car accident many, many years ago. So, I know from experience that the pain never really goes away. It gets a little easier to deal with, day by day, for the most part. And then some days, you see or hear something that brings back a memory, and you feel it all full force again, and it just completely catches you off guard. Like, a couple of weeks ago, I was sorting through things in my office and found a birthday card that I had bought for him. I had honestly forgotten all about it (yes, I have great organizational skills!). But when I found it and started looking at it, I remembered buying it ahead of time, being all proud for not waiting until the last minute, and thinking how funny he would think it was. He had such a great sense of humor. He was really a clown. He was so kind and giving. And he was tough! He was my big brother. How can he really be gone? Even now I sometimes read or hear something and think of sharing it with Greg or wonder what he'll think about it - and then I remember. Sometimes I dream about him -- really good, happy dreams. And then I wake up, and I'm a little disoriented for awhile, and then reality sinks in.

Yes, it still hurts a lot. And I still have a hard time talking about it. Even just typing this is really bringing up a lot of emotion. So, I'll move on for now, except just to say thank you to everyone who has expressed their condolences, who has said a kind word or shared a memory about my brother, or just let me know that my family and I were in their thoughts and prayers. It means a lot. It's not so much what you say, it's just knowing that people care.

On a more positive note, after 8 years with my previous, rather large employer, I have just started a new job. Like my previous job, this one is network security related, but it's a much smaller organization, which I think will suit me a lot better, and the product is actually quite different. Also, it's an opportunity to get back into primarily Java development, which was not something I was really able to do any more in my role at the previous job. As a general rule, I don't tend to talk directly about my work or employer on my blog, just because I don't ever want to have to deal with any impression that I'm speaking in any way for my employer on this blog. This blog is all mine. :) If you really care to know particulars, you can hit me up on LinkedIn, or even Facebook or MySpace. Suffice it to say that I am really enjoying the new challenge and have met a lot of friendly and hard working people.

One more thing I should mention. Though I don't post directly to this blog all that frequently as of late, I do post short messages to Twitter (see the upper right corner on the blog) pretty frequently, sometimes 2 or 3 times a day. Nothing profound, sometimes just mundane updates on whatever I'm doing, sometimes random outbursts directed at no one in particular about something that's aggravating me. You can always see the most recent of these updates in the upper right corner of my blog page, or if you want more instant updates, check out my Twitter page and see all the ways you can follow my updates. (For instance, you can get updates on instant messenger or even SMS.)

That's all for now!