My Sublime Text 2 Setup

I have been using Sublime Text 2 for a month or so now, and I still do not love it. Here’s hoping TextMate 2 gets its game on. But at least I have it working closer to the way I want it to work and look. Here are the minor changes I have made to make it work for me, and how it compares to TextMate 2 (Updated: See My TextMate 2 Setup - The Hiltmon).

Packages

Of course, the first thing you do with Sublime Text 2 is install Package Control. Why this is not part of the product astounds me, but that’s the way it is. You use Package Control to install the packages you need to run Sublime Text.

Then I installed the following packages (mostly for Ruby on Rails work):

I had a few others installed, but they conflicted with each other. I was surprised that happened, and had to remove them.

Defaults

I love the RailsCasts theme, so I copied the RailsCasts.tmTheme file to the correct folder (on OS X that is ~/Library/Application Support/Sublime Text 2/Packages/User/). My Preferences / Settings - User looks like:

{
  "color_scheme": "Packages/User/Railscasts.tmTheme",
  "ensure_newline_at_eof_on_save": false,
  "folder_exclude_patterns":
  [
      ".svn",
      ".git",
      ".hg",
      "CVS",
      ".sass-cache"
  ],
  "highlight_line": true,
  "ignored_packages":
  [
      "Vintage"
  ],
  "rulers":
  [
      80
  ],
  "save_on_focus_lost": true,
  "tab_size": 2,
  "theme": "Soda Light.sublime-theme",
  "word_wrap": true
}

In short, I set the Soda Light theme because the default dark theme is ugly, use RailsCasts for the editor colors, nuke the vi support, trigger save on lost focus, and fix up the folder exclude patterns to hide some Octopress folders.

I also changed two key bindings. In Preferences / Key Bindings - User, I have:

[
  { "keys": ["ctrl+shift+d"], "command": "duplicate_line" }, // Like TextMate and BBedit
  { "keys": ["ctrl+shift+."], "command": "erb", "context":
    [
      { "key": "selector", "operator": "equal", "operand": "text.html.ruby, text.haml, source.yaml, source.css, source.scss, source.js, source.coffee" }
    ]
  }
]

And that’s about it.

Compared to TextMate 2

So now Sublime Text 2 looks just like my TextMate 2 setup:

But there are subtleties that are still driving me nuts:

  • TextMate 2 is a Mac application, uses Mac conventions and components, and works like a Mac application. Sublime Text 2 does not; whether it’s the look of the find box or the use of the same drop-down for everything from commands to navigation, it’s just not right as a Mac user experience.
  • The Bundles a.k.a. Packages in Sublime Text 2 seem half done. I can forgive that the product is new and the bundles need to catch up, but since most of them are just TextMate bundles re-wrapped, I’m surprised they don’t feel like better quality. And there does not seem to be a way to figure out how good a package is until you install it and things go awry.

I’m going to stick with Sublime Text 2 for now, but am keeping an eye out for something better.

See also Multiple Themes in Sublime Text 2 and Multi-Platform Editing Is Sublime. And follow me at @hiltmon on Twitter or @hiltmon on App.Net.

Choosing Sunglasses

Two weeks ago I went kayaking on Long Island Sound, and lost my sunglasses. Once again, my friends, it was time to research the next pair.

My criteria for sunglasses may differ from yours; they are:

  • Good lenses: I program for a living, and need my eyes to remain in top condition for as long as possible. Cheap lenses increase eye strain for me, which is bad for my work and mood. UVA and UVB protection is obviously a must. So is having a light weight. Modern polycarbonate lenses achieve this goal.
  • Low Glare: The greatest damage to my eyes after UV comes from glare, so I need good anti-glare protection while maintaining excellent clarity. For this, an anti-reflective coating and well done polarization really helps.
  • Good Color: My hobby is photography, which means I also want to see a wide and true color range through my sunglass lenses. I seek a lens color that blocks a color gamut range I know about (and choose), and does not block other color gamut ranges. Cheap lenses shift colors, which annoys me. Neutral lens colors such as gray or green have been my preferred choice (but I tried brown for the last pair).
  • Wrap Around: Good glasses must not only protect my eyes from frontal light but also side light and reflections. I’m not looking for a full cycling wrap-around set, or side shields, but I do expect my sunglasses to curve around towards my temples to catch stray light.
  • Comfortable fit: I pretty much wear sunglasses whenever I am outside, which means that the fit needs to be exceptional. No side or nose pinching allowed, and the glasses must stay on as I lead my active lifestyle. And perch comfortably on top of my head when indoors.
  • Scratch and Moisture Resistant: I’m also not the most careful person out there, so having a pair of sunglasses that shake off dust, water and scratches is very valuable to me.
  • Looks Good: Honestly, this is probably the most important criterion for most people, but being the geek I am, it’s the last thing I look at. I do, however, have some self-respect, and so this is also a requirement.

Previously, I had always gone for the Ray-Ban classic green lenses, the ones made famous in their Wayfarer and Aviator ranges. I have always liked the protection and neutrality these lenses provide. In fact, my wife just replaced her sunglasses with modern Wayfarers, and they are most excellent. But they look horrible on me.

My most recent pair had brown Persol lenses, and they were very, very good, especially when it came to water and snow glare. They were also my first polarized lenses, and the polarization was not overdone. The frames fit well and the glasses still looked new after more than three years’ hard use. But I never got used to the warmness and color changes inherent in the brown lens. I had spent several hours at an optician to get them, trying on many pairs and testing them under sunlight and spotlight conditions.

In researching the next pair, I studiously ignored the major fashion brands of sunglasses as they usually choose style over function, and the information on their lenses was often quite sparse. I also passed over several of the good sports brands like Oakley because they always fit too close to my eyes; when I blink, my eyelashes touch the glass. For the record, my spare sunglasses for snowboarding remain an oversized old pair of plastic-framed Oakleys that have a very neutral lens.

I probably would have gone with a pair of the green photo-polar lenses from Persol as my next pair because of my experience with their brown lenses and great frames.

Until one of my very good friends showed me his Revo sunglasses.

The lens clarity was surprisingly excellent (even through the fingerprint smudges), the glare reduction was the best I had seen, the frame fit (especially with the wider, spring loaded arms) was brilliant, and the wrap was perfect. The only thing I did not like about his pair was the cobalt lens tint. It shifted the color too much.

Turns out, Revo has a neutral lens in a graphite color. And a matching black titanium frame that looks great. So, after all that research, the best pair of sunglasses today, for me, based on my criteria, is the Revo Discern Titanium sunglass with the graphite lens, which I tested and purchased.

The clarity and quality of their polycarbonate is wonderful, and the neutral lens color means that I still have an excellent idea how photos will turn out. The frame wrap is just right, and the spring loaded arms hold the sunglasses perfectly in front of my eyes without my eyelashes touching the lens. And they look good. I’m still getting used to the new polarization (looking at the lens instead of through it), but I had to do that with my last pair as well.

The Revos are not cheap. Yet I think it was the fastest purchase ever made at Sunglass Hut on 57th Street. I look forward to many years wearing these. And highly recommend them.

App.Net Funded

Today, App.Net got funded. Not many of us expected that to happen, but it did, and we’re happy to pay our $50 or $100 to kick it off. Most reports describe this service as a paid Twitter clone, and yes, that’s part of it. But the goals are bigger:

  • You are not the product: The product in Twitter is you, because it’s free. On App.Net, you are paying for the ID and the service and the access. It’s yours. Your ID. Your Data.
  • Low or no Spam: Adding a spam account to Twitter is easy; bots can sign up. If a spammer wants to use App.Net’s platform, they have to pay $50 per account, which is not a viable business model for them.
  • It’s a platform: The real plan for App.Net is to provide a real-time platform for inter-process communications across the web, with tweets being the first cab off the rank. Twitter is all about mass and messages, not about being a common carrier.

If you want to reserve your Twitter handle on App.Net, you had better back it today before funding closes at join.app.net.

You can find me on Twitter at @hiltmon and App.Net at @hiltmon.

TextMate 2 Is Now Open Source

My favorite programmer’s editor of all time, TextMate, recently updated to TextMate 2 Alpha. I’ve been using the alpha for a while and still love it, but it’s an alpha release and therefore buggy and slow. So I’ve been trying out Sublime Text 2 as a replacement. So far, I switch between TextMate 2 Alpha and Sublime Text 2 daily, only because closing the last tab in TextMate 2 closes the darn project editor and that annoys me.

Today I found out that TextMate 2 has gone open source (GPL 3); you can see it here on GitHub and on the blog. Wow, unexpected.

This changes everything.

Or does it?

If, like some open source projects, it gets a lot of followers and contributors, then TextMate 2 will win the programmer’s editor war. No doubt, the best programmer’s editor will just get better.

But if, like most open source projects, once the initial hype is over, it starts to languish, or become political, or fork too many times, it will just get worse.

I sincerely do trust Allan Odgaard (@sorbits) and his burgeoning team to maintain control and get this done and done right. Just like TextMate 1. It will take a lot of time, what they are doing is exceptionally hard, but I will support them the whole way. Watched.

Expect the Unexpected

There was a panic at one of my client sites today when the reporting software I wrote for them stopped working. Instead of presenting reports as usual, the software threw an unknown error.

WTF, an unknown error!

I wrote the darn system, I know all the errors, because I coded all of them!

A quick glance at the logs indicated that indeed, Passenger was throwing an unknown error when accessing one Rails URL (and all others were working just fine). It’s as if that Rails path just disappeared.

Drilling down the stack of log errors, it turned out to be quite a simple error — a single data point was nil instead of the expected number and any calculations or transforms on nil throw an error. As they should.

But this number was in the input file to the system, as it was in all previous input files, and it was verified to be there. And the input file loaded OK, as had all previous input files. So why, on this day, at this time, was there a nil when every other day there is a number?

It turns out that the company that sends the file accidentally sent us two files today, one being the usual file we expect, and another that we had never seen before but in the exact same format with the same keys! My code processed both files, overwriting the correct first file with the garbage from the second file.

The fix was easy: nuke the bad file, reprocess the good file, and we’re off and running. Business interruption was insignificant.

But the lesson learned is to expect the unexpected. I have now changed the file processing code to check that the file it receives is the one it expects to receive, and to ignore all others. That way, when this happens again, as it always does, the software will only process the good file.
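
A minimal sketch of what that kind of guard looks like, in Ruby; the file name pattern and the load_positions call here are illustrative stand-ins, not the client’s actual code:

# Only process files whose names match the feed we expect; warn about and skip the rest.
EXPECTED_FILE = /\Adaily_positions_\d{8}\.csv\z/  # hypothetical naming convention

def process_incoming(path)
  name = File.basename(path)
  unless name =~ EXPECTED_FILE
    warn "Ignoring unexpected file: #{name}"
    return
  end
  load_positions(path)  # the existing, already-trusted import routine
end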

Next time you see a one-off error, don’t just rectify the situation and walk away, find out what caused the problem, and ensure that it can never happen again.

Well Managed Scripts Better Than Platforms

One of the most misunderstood things in computing is the need and power of scripts. Most IT shops seek out platforms, tools and technologies to perform business functions, when a bunch of well architected and documented scripts is all that is really needed.

script /noun/

An automated series of instructions, carried out in a specific order, written in an interpreted computer language

Over the years, I have seen many organizations go out and buy very expensive platforms and hardware, and hire very expensive staff, without even once looking at the alternative: well managed scripts. As a result, they have had to deal with high costs and slow turnaround times when things change (as they inevitably do). In several of these cases, I would have recommended they get a good scripting programmer, and let that programmer create a series of well managed scripts, leading to cheaper and quicker implementations and more responsiveness to business change.

What do I mean by well managed scripts?

The common perception of scripting is that some cowboy programmer barfs up some horrible code in an interpreted language, gets it working, schedules it to run and walks away. You are left with obfuscated code that is unmaintainable, but also core to your business, so you just have to hope that it keeps on working.

That is not well managed scripts; that is bad programming.

Well managed scripts follow a series of conventions and standards to ensure that each script is just as readable and maintainable as the next, and that common code is shared between scripts in libraries. Just like a “real” program. In fact, the only difference between a set of well managed scripts and a real program is that the scripts run in an interpreted language and they can all run independently (no main function).

By sharing common code, new scripts can be generated more quickly. By using a naming convention, the right script can be found and run easily. By using coding conventions, each script can be made very readable and maintainable.

What follows are two cases where I believe well-managed scripts are better than a platform, and one case where they are not, all in the financial industry.

ETL Example

One of the biggest problems facing firms in the financial industry is the volume of data that needs to enter and exit the firm. Unfortunately, there’s no standard format, convention, protocol or structure in use. As a result, each financial firm has to parse a myriad of formats to get data in, and produce another myriad of formats to send data out, as well as try to map the naming of things because everyone is different.

Quite a few of these firms have attempted to resolve this issue by looking for a platform to do this for them. And the big software companies have delivered, calling these systems ETL systems (Extract, Transform and Load). Big software companies like SAS and Microsoft have spent millions developing ETL platforms and licensing them for 5 and 6 figure annual fees. And you know what, these systems are very, very good. They can take data in pretty much any format (extract), provide a plethora of tools and techniques to enable their users to program the conversion of data from one notation to another (transform), and bung that data into any other system or database (load).

But they are horrendously expensive to buy and even more expensive to run. Alongside the license, you need servers and databases, DBAs and system admin staff to keep them running, and programmers to code up each ETL run and ensure that it keeps working in a changing environment. And since these platforms are very complex and use their own programming languages and conventions, the staff required is highly specialized, hard to find and very expensive. But users of these platforms do land up with one single platform that does all their ETL as and when they need it.

Or they could write a series of scripts with a few shared custom mapping libraries and be done with it. You see, most scripting languages already have libraries to read and write almost any file type (extract and load); it’s what they do best. The middle component (transform), commonly called mapping, needs to be programmed anyway, whether you use scripts or a platform. And in most cases the mapping is the same across many scripts (so they can share the code). It’s a lot easier and cheaper to hire a scripting programmer who uses a common, well known scripting language that any other programmer can read and maintain. And it’s a lot quicker to change a script when the data changes than to redo it in a platform. The risk is that they can easily lose track of all their scripts unless someone manages them properly, and it’s harder to monitor whether the scripts ran successfully or not.
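
To make that concrete, here is a rough sketch of the shape such a script might take, in Ruby; the file names, field names and the shared SecurityMap library are made-up examples, not any particular firm’s code:

require 'csv'
require_relative 'lib/security_map'  # hypothetical shared mapping library

# Extract: read the counterparty's file in whatever layout they send.
rows = CSV.read('incoming/positions_20120809.csv', headers: true)

# Transform: map their security codes and field names onto our own.
mapped = rows.map do |row|
  {
    security_id: SecurityMap.lookup(row['SECURITY_CODE']),
    quantity:    row['QTY'].to_i,
    price:       row['PX'].to_f
  }
end

# Load: write a clean file for the downstream system (this could just as easily be a database insert).
CSV.open('outgoing/positions_clean.csv', 'w') do |csv|
  csv << mapped.first.keys unless mapped.empty?
  mapped.each { |row| csv << row.values }
end

The extract and load steps come almost for free from the standard library; only the mapping in the middle is real work, and that code lives in the shared library so every feed script uses the same mappings.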

All in all, though, the Total Cost of Ownership (TCO) of a platform to do ETL is way, way more than the TCO of well managed scripts to do the same thing. In this case, ETL is pretty much in the wheelhouse for scripting languages, and in my experience platforms are just overkill. And let’s not forget the time, cost and inertia of running a large staff to keep the process going versus having one or two script programmers doing it.

Risk System Example

Most risk management systems consist of a few very high performance mathematical libraries, a database, a lot of moving, mapping and preparing of data as input, and taking the output data and formatting and presenting it. The best languages for writing these high performance libraries are C and C++, because they have lots of very good math libraries, are compiled to native code, and use the vector units in modern CPUs (or GPUs if you are smart) to enable massive amounts of numeric computing in parallel. C/C++ also happens to be the worst language for doing all the other stuff.

At my first hedge fund, the risk system was completely written in C. It had some amazingly great math components and some insanely awful data management code. Fully 80% of the code base was hacks and code to get data ready for the calculations and get the results out. But the biggest issue with the system was that it was very hard to modify and change as the business changed.

In order to make the system more flexible, we extracted the math code and wrapped it in Perl libraries, which did not change that often. We then used Perl to handle the data coming in, glue the calculation runs together, and push the data out, the code that changed quite often. The result was that we landed up with a far smaller code base to maintain. We were able to adapt far more quickly to changing data, run the entire system much quicker, detect issues sooner in the pipeline and deal with them, and then do more risk analysis than before. Adding new math components in C still took time, but the rest was quick, easy and very flexible.
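
The pattern is simple: keep the heavy numerics compiled, and let the interpreted language own everything around them. As a rough illustration only, here it is sketched in Ruby with the ffi gem rather than the Perl we actually used, with an entirely made-up library and function name:

require 'ffi'

# Thin wrapper around a compiled C math library (library and function names are illustrative only).
module RiskMath
  extend FFI::Library
  ffi_lib 'riskmath'   # libriskmath.so / libriskmath.dylib built from the C code
  attach_function :value_at_risk, [:pointer, :int, :double], :double
end

# The script owns the parts that change often: getting data in, calling the math, formatting results.
returns = File.readlines('portfolio_returns.txt').map(&:to_f)
buffer  = FFI::MemoryPointer.new(:double, returns.size)
buffer.write_array_of_double(returns)

puts "95% VaR: #{RiskMath.value_at_risk(buffer, returns.size, 0.95)}"

The real Perl version was of course much larger, but the shape was the same: a thin wrapper around the compiled math, with the frequently changing glue living in the scripts.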

High Frequency Example

However, there are also times when a platform is not only more desirable, it’s necessary. Take, for example, a real-time trading algorithm or reporting system. In this case, the performance of a scripting language is just not good enough. You need platforms that can run and report calculations in real time with real-time data feeds. In the trading algorithm case, you get hundreds of security prices every second. The platform needs to calculate the optimal number of each security to hold, determine if it has enough cash to hold it, decide whether the trade passes compliance rules and decide whether to make a trade or not - in real time. You simply cannot do this in a script, but big expensive platform products like SAS handle this brilliantly. When faced with the need for high frequency trading at a previous hedge fund, we modeled the algorithms in Excel and MATLAB, then ran them in a specialized real-time trading platform to maximize performance and return.

Script first?

So next time you are looking to automate a business function, instead of looking for a prebuilt platform that requires skilled programmers to run and costs a lot of money, why not try to script it first? If it’s not time sensitive, operates in a changing environment, and is in an area where you want to manage cost, well managed scripts may be the best technology solution for you.

Mischief Managed: Update Tools, Learn New

Mischief Managed is a series of posts on tasks and technologies I use to maintain my computing environment. It’s part of what I do between projects. Try it out.

Most of us do not update our operating systems or toolkits while a project is ongoing. It increases the time to produce the product and adds the risk that things will stop working; it’s just not worth doing. But once the project is over, it’s time to update the tools and study new ones.

Since most of my work these days is in Xcode or Rails, here’s what I do for them and what I am learning now.

Xcode Work

Xcode usually gets updated when either OS X or iOS gets updated. When I am not involved in any OS X or iOS projects, I usually run the latest beta of Xcode to spike solutions or learn new APIs. As soon as a project starts, I remove the beta, install the latest release version and stay on the latest release for the duration of the project.

One thing: save the Xcode installer. Apple does have them on the Developer Downloads page, but sometimes they are lax about posting them. Also, save the installer for the version of OS X you are using. These days, with App Store updates, Apple deletes the OS installer after installing. I usually make a flash drive installer, just in case (see How to Burn Your Own OS X Lion Install DVD or USB Drive for how).

Now most Xcode developers usually remain on the same version of OS X and Xcode for the duration of the project, meaning that they often remain one or two versions behind to reduce platform risk. I am too impatient to do that. What I usually do during a project, if there is a major OS X or Xcode update, is update the laptop to the new version and create a project branch to test it. I do this in my spare time as it should not affect my customers. If it works, then I merge the branch and stay on the new version. If not, I nuke the branch and revert the laptop. So far, the reversion has only happened once, when Xcode 4.0 came out and was too buggy to use. 4.1, 4.2, 4.3 and now 4.4 have all gone smoothly.

Ruby and Rails Work

For Ruby and Rails work, I use rvm. It enables me to have project specific versions of Ruby, Rails and gemsets installed. For example, the Noverse.com website runs ruby-1.9.2-p320, Kifu runs on ruby 1.8.7-p358 and Hiltmon.com is on ruby-1.9.3-p194. Hmm, now that Kifu is in production, maybe I should update its toolkit.

With rvm I don’t have to worry about platform risk, and can upgrade projects to new versions independently.

But once projects are over, I like to install the latest version of Ruby and Rails as a new rvm for spikes, new projects and learning. It also helps to have these as default for downloading and trying out projects from GitHub.

New Platforms and Technologies

One thing I love to do between projects is play with and learn new platforms, languages and technologies. Not only do I learn and adopt new things this way, but I also import ideas from them into my more traditional projects. It is because of this ‘playing’ that I moved from Perl to Ruby for scripting, and from ASP.Net to Rails for web.

I believe it’s important for a technologist to study other technologies to become a better technologist.

Right now, I’m playing with node.js as a potential platform for some of my next personal projects, and comparing it to the Sinatra via Ruby stack. I have yet to be approached to program a real system using either of these stacks, but who knows what I’ll be working on next, and they are both fun products.

I’m also looking at the Clojure and Haskell functional programming languages to see if I can get my head around them for future work. And playing with Lua and Cocos2D for game development. This research is still in its early stages.

Mischief Managed: Archiving

Mischief Managed is a series of posts on tasks and technologies I use to maintain my computing environment. It’s part of what I do between projects. Try it out.

When a project or engagement is completed, support is over, and you’re on to the next thing (or taking a break), it’s best to move these files and tasks out of your current folders and into archive folders.

Here are some of the things I do:

Move to Old Projects

I move the entire project tree from my ~/Projects folder to my Old Projects folder on the slow drive. It’s still on the laptop (I have an SSD and an HDD), so I can access it and restore it if necessary, but it’s no longer available in my current projects folder where I will see it every day. Less clutter.

I also update my TextExpander shortcut that takes me to the code of that project in case I need to open it quickly in an editor. For example, the ;cdxx macro that was cd /Users/Hiltmon/Projects/XX/code/XX/ becomes cd /Volumes/Callisto/Old Projects/XX/code/XX/.

Archive OmniFocus

This is really a task that you should perform regularly, but I never seem to remember to. In OmniFocus, click on File / Move Old Data to Archive…. You’ll be prompted for a date, and then OmniFocus will move all completed tasks before that date to an archive file. It puts the archive into a file in ~/Library/Application Support/OmniFocus, so make sure you always archive from the same computer, and include this folder in backups.

The main benefit of this is that it reduces OmniFocus sync time as all the old completed tasks from your projects are no longer there.

Move old Mails

Clean up your inbox so all the email for the project is safely in its client folder, see my Mischief Managed: Clean Inbox post.

(OLD) Clean Up Documents

Since I moved to a standard Project Folder Layout, I no longer need to do this, but I’m documenting it just in case.

I used to have client files, art files, contracts and other documents relating to the project littering my ~/Documents tree until I started using my standard Project Folder Layout. At the end of each project, I would have to go through all the folders in ~/Documents and move the required documents to an Old Documents folder on the slow drive.

Archive Notes

The only documents I keep outside the project folder these days are any generic client notes in nvAlt. But now that I use VoodooPad for all key project info (see Project Specific Data), that too is changing.

I use a mixture of OpenMeta tags and Hazel rules to tag old notes as archive and Hazel moves them out of the Notational Velocity folder and into an archive folder. I then drag and drop them into the Project Folder in Old Projects.

Mischief Managed: Google 2-Factor Authentication

Mischief Managed is a series of posts on tasks and technologies I use to maintain my computing environment. It’s part of what I do between projects. Try it out.

I’ve been planning on doing this for ages; no more excuses, and you should do it too.

And then this happened. Last week, Mat Honan got hacked hard (see Yes, I was hacked. Hard.). Someone gained access to his Apple ID, reset his other accounts and nuked all his computers. You should read about his experience, resecure your accounts and start backing up too. Aside: ignore the comments; the haters simply embarrass themselves.

I run both my personal email (@gmail.com) and work email (@noverse.com) on Google, and forward all other accounts to one of these. Between these two and my Apple ID, they are the most commonly entered passwords I use.

2-Factor authentication means that you need to input an additional key the first time you access a service from a new, untrusted device. Google provides this key via an SMS, a phone call or its Authenticator app. Think of it as the same as those key fob thingies you get from your bank, with a number to input along with your password. With 2-factor enabled, bad folks trying to log into your account will trigger second-factor messages but will not have access to the codes; only you do.

Setting it up is easy: log in to Google as you usually do, click on the tiny arrow to the right of your avatar and click Account. Choose the second item on the left sidebar, Security, and hit the edit button next to 2-step verification. Google will walk you through the process of turning the service on.

But there is a problem: many applications, such as mail clients, mobile devices and third party software that do not use OAuth, do not support 2-factor authentication. Google solves this problem by enabling you to create application-specific passwords. I highly recommend you create these now, while you are setting everything up. In my case, I needed ones for Mail on my laptop, mail on my iPhone, mail on my iPad and GAget. I’ll need more for the desktop. Give each one a name and Google generates a really hard password. All you have to do is type it into the device; you’ll only need to do this once per application per device.

I also recommend getting the iPhone (or Android or Windows Phone) authenticator software and setting it up as the primary authentication device, with SMS to your mobile as secondary. It involves scanning a QR code on your screen and typing in a number. Simple, and the authenticator is on you at all times. Next time you try to log in on an untrusted device, you just launch the authenticator to get the code you need.

For the @noverse.com email, I had one more step to take before I started. By default, Google Apps does not have 2-factor enabled. You need to log in to your Google Apps domain and enable it under advanced tools.

As a final safety net, in case you lose your phone too, Google offers a set of 10 predefined one-time use keys that you can use to recover your account. I recommend downloading these as a text file and saving it on Dropbox, Evernote or somewhere you can easily retrieve it without your phone.

It’s only a small thing, but one-time passwords and 2-factor authentication mean that it’s less likely I’ll get hacked and suffer the same fate as Mat Honan.