On walkabout in life and technology

We're Better Than This

My thoughts on the toxic hell-stew that my Twitter feed is becoming. I follow (and occasionally interact with) a bunch of intelligent, opinionated, sensible tech folks whom I respect immensely and whose timelines and lives are being ruined by an impersonator, a gang of misogynists and their flock of followers.

We’re better than ganging up, taking sides and judging or expressing negative public opinions on people we do not know personally. Constructive, on-topic disagreement is great, and we thrive on it; personal attacks are not.

We’re better than letting one arsehole who impersonates someone else disrupt our sense of community, discourse and expression. You know who I mean.

We’re better than sniping at each other over made-up shit, clickbait, snark and snide remarks created intentionally to sow discord in our community.

We’re better than those who treat women, LGBT folks and minorities as second class citizens. Because we do not.

We’re better than those who dox, swat, spread hate and discord. Because we do not.

We’re better than to give attention where it is not owed or deserved. We have more important things to do with our time.

We’re better than to get angry over insignificant, stupid things when war, refugees, child killings, racism, guns, insane politics, a slow slide into the dark ages, climate change and a hundred other real issues deserve our attention and intellect.

We’re better than letting a few bad people ruin our community, one we have built over years of communication, trust and honesty.

We can and should unfollow, muffle, mute or block. We can shut them down together, as only a community can. Then ignore them.

Let’s get back to being who we are, to the real discussion, to sharing our interests, to discussing tech topics, and to making Twitter enjoyable again.

Let’s tweet a namaste (🙏🏽) to each other and put this behind us.

Maybe, just maybe, if we set a better example, as we have done in the past, they will find us implacable, unruffled, united and not worth messing with.



Dangerware

Dangerware is common in business and government. Dangerware is just ordinary software, but the way it comes into being creates the danger.

  • It starts with a basic prototype written in a hurry.
  • This is quickly put into production to run the business.
  • The prototype screws up repeatedly when faced with new scenarios.
  • Resources are tasked to add to (not update or correct) the prototype to deal with the latest screwup.
  • This process repeats until the resource (or original business person) is tasked to a new project, or the cost of the screwups becomes less than the cost of the resources needed to mitigate them.

I call this software dangerware.

And sadly, it runs most businesses and agencies. Dangerware is software written without requirements, design, tests, validation, checks and balances or even an understanding of the business, the big picture or the nature of the problems being solved.

It’s software without responsibility.


Scared?

You should be.

But it’s as common as desk chairs in the real world.

Think about it: the Excel models, VBA projects, Access databases and SQL queries, built by non-professional programmers, hobbyists, interns, outsourced programmers and juniors, that control and manage your business are all dangerware. Built where the need to ‘get something out’ completely outweighed the risks, both financial and professional. And where it was easier to blame someone else for the screwups (or for not recovering from them).

Dangerware is everywhere in business and government. Every single finance person has a horror story of a bad Excel formula that cost someone else their business. And yet they still trust in their own dangerware.

Can you imagine if your MRI machine or autonomous car’s software was created this way? You’d be dead.

The evolution of dangerware into bigger projects and the rush to start larger projects is a fair explanation as to why the vast majority of corporate and government software projects go so horrendously over budget and fail so badly.

Dangerware is easy to detect and prevent.

Detection is simple:

  • If the user is the programmer and not a professional full-time programmer, you will get dangerware.
  • If the programmer does not understand the business problem to be solved within the bigger picture, you will get dangerware.

Solving the first is easy. Get a professional to develop the application. Trust them, listen to them and allow them to do it right.

The second is a lot harder, but not as hard as you think. It boils down to process and communication. And it was taught to me when I was a cocky kid by a middle-aged man with thick glasses and a cane. Sadly, I do not remember his name.

He taught me a simple process to gain an understanding of the business. It was the first step in what used to be called Business Process Engineering and it is all about finding and following the workflows.

To understand a business or a business problem, you need to know that it exists and understand what it is. To do so, you need to learn the workflow, how it starts, how it does (or should) flow and where it ends up. And the first step is to walk through the first one you identify, and then each one it exposes. Follow the existing paperwork, see who gets involved, centrally and peripherally. See which flows depend on this flow and are triggered by it. Follow each variant of the flow, run scenarios on each, both success and failure, to understand the nuances.

And do this with real people. Not the managers and consultants, but with the actual people involved. Work with them to find out what you do not know. Assume nothing. Ask lots of questions, listen to them talk (and complain), ask about what happens before and after, ask why they do what they do to see if they even know. It’s amazing what you will find and just how much you did not know to start with.

What will emerge is a picture, often confusing to start, of intertwined people and processes, of contradictory and seemingly irrelevant steps, and a huge pile of exceptions to the rules.

And a lot more questions.

Unravel this picture to understand the flow.

You are not trying to reproduce the flow. Nor blame or replace the folks running it. Pull out what needs to be done, why it needs to be done, where it works and where it fails. This process always surfaces the things you would have missed had you not gone through it.

Then, and only then, design software to help.

That will protect you from dangerware. Because you understand the business problem and environment before solving for it and coding it up, you reduce the risks of failure, screwups and blame games.

The counter-argument is that there is never enough time to execute this process. “We’ll get something out and then, if we have time, we’ll figure it out later” is the bat-signal of dangerware. Even a single walkthrough and a few conversations with the folks involved, taking less than a few hours, will reveal just how much you do not know. And the time and cost spent learning is insignificant compared to the time needed to add more danger to dangerware and the cost of the screwups.

You’ll never know everything, but at least the big nasty dangers will be identified early, exposed and can be solved for in design before releasing dangerware.

A professional programmer will check their code. A professional programmer who understands the business flow will produce a product that is not dangerware.

And you, you can focus on building a better business instead of being distracted by the huge number of problems dangerware causes.


An El Capitan Markdown Spotlight Importer

All my text and writing is in Markdown formatted files and I would like to search them using Spotlight. The editors I use do not have an importer (they have Quicklook only), so this is not available directly.

The trick of changing the RichText Spotlight importer worked in previous versions of OS X (see how in A Simple Markdown Spotlight Importer), but since System Integrity Protection arrived in OS X El Capitan, it no longer works.

Never fear, there is another way.

The great Brett Terpstra to the rescue, again! Read about it at Fixing Spotlight indexing of Markdown content on his amazing site.

What I did was the following:

  • Downloaded this Zip file from his site and uncompressed it
  • Moved the Markdown.mdimporter file to ~/Library/Spotlight. I had to create the folder under my User’s Library folder. To find this folder in Finder, hold the Option key when clicking the Go menu to see the Library folder option.
  • Started a terminal shell

At the command prompt, I executed the following to activate the importer:

mdimport -r ~/Library/Spotlight/Markdown.mdimporter

And then, when nothing seemed to have happened, I recreated the entire Spotlight index on my computer.

There are two ways to do this.

The GUI way is to open System Preferences, select Spotlight and the Privacy tab. Drag and drop your Macintosh HD onto the big open space. Wait 20 seconds or so, then click the minus sign to delete it. OS X will start to recreate your Spotlight index.

Or use the following command:

sudo mdutil -E /

To see if this is working, run

mdimport -L

I get

2015-11-17 12:40:40.400 mdimport[53046:588670] Paths: id(501) (

After a long while, all my Markdown files were once again searchable in Spotlight. Thanks Brett!
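While you wait, you can also ask Spotlight whether indexing is enabled on the volume (an optional check on my part, not one of Brett’s steps):

mdutil -s /

It reports the indexing status of the root volume, so you know the rebuild request took.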


Making and Maintaining High Resolution Charts for InDesign CC

One of the biggest headaches I have using Adobe InDesign is the creation, and especially the maintenance, of charts and graphs. In my case, my fund publishes several high-quality books and one-pagers monthly, and I need to update a bunch of charts and graphs. I also need to print these at a very high DPI, hence InDesign.

I used to use Adobe Illustrator graphs. They are rudimentary, but very customizable. Every month I had to load an Illustrator file for each image, update the graph data and then spend time tweaking the results. Only then could I open InDesign, update links and move on. This took a lot of time.

The ideal solution would probably be embedded Excel charts, as in PowerPoint or Keynote. But those products are screen-focused and low DPI, and are no good for high-quality print output. And unfortunately, Excel produces graphs as bitmapped images, not vector art, so that is no good either.

But it turns out that Apple’s Numbers application (part of iWork on the Mac) does something special when exporting to PDF: it generates all charts as magnificent PDF vector art. And these include the new typography features and hairlines I need.

So here is how I work now.

I have a Numbers spreadsheet with a bunch of tabs. On the far right is the tab containing the source data tables. To the left are tabs containing one and only one graph per tab. The source data for each graph is in the data tab.

The reason the data is in the right-hand tabs is to ensure that the graphs are always published to PDF in the correct order and present on the same page number every time (see linking below).

Each graph has been created individually, designed to match the book or page theme, the necessary fonts and colors set, even the sizing is preset. All I need to do every month is update the data and let Numbers take care of graph changes.

I then export the entire document to PDF (File / Export / PDF…). I set the image quality to Best, which seems to produce nice, clean hairlines, and save the output into my InDesign assets folder, overwriting the previous period’s PDF.

In InDesign, I created the frames for each chart. When placing a chart, I check Show Import Options and select the PDF file. This brings up an import options dialog where I select the PDF page (the linking trick) containing the graph I want in that frame. A bit of resizing and repositioning and the graph is linked and placed in the document.

On update days, things could not be easier. I:

  • Open the spreadsheet and update the data.
  • Save it, then export to PDF as above, replacing the previous PDF asset file.
  • Open InDesign and update the links. Since the PDF pages and charts are in the same place and remain the same size, they all update perfectly.
  • After changing the monthly numbers and text, save and prepare for print.

Since Numbers produces high-resolution vector PDF art, I am able to generate quality high-DPI charts in no time and get the books and pages out on time without losing fidelity.


Brown M&Ms: A Quick Way to Determine Code Quality

My business runs on code. Every day, my team and I deploy new systems, patches and add new features to our mission-critical code base. And we rarely have a problem.

That’s because we have a quick way to determine if the programmer attended to the details of the product and code, and whether we then need to hold the deploy for a deeper check and test or can run a lighter test and confidently push it out.

We run a quick quality check.

We look for the presence of brown M&Ms.

Back in the day, the legendary band Van Halen put on complex live shows involving lots of staging, lighting and expensive, sensitive sound equipment. The setup for the show was documented in great detail in their standard concert contract, which dictated what the venue needed to do to set up and run a Van Halen concert.

One of the more unusual requests, under the “Munchies” section hidden in the middle of the contract, was a line requesting a bowl of M&M’s with the brown ones removed.

One would assume that they were just being prima-donna rock stars in requiring each venue to task a person to remove the brown M&Ms from the bowl.

But that was not the intent.

This requirement served a very practical purpose: to provide a simple way of determining whether the technical specifications of the contract had been thoroughly read and complied with.

In short, if there was a bowl of M&Ms with the brown ones removed, chances are the venue had read all the details of the contract, followed them all and had set up the concert properly and correctly. If not, Van Halen knew to run a detailed check, line by line, to find out what the venue had failed to set up properly and could put the concert at risk. Empirically, they usually found problems when the M&Ms requirement was not met, and rarely found problems when the bowl was properly prepared.

In our case, we don’t ask programmers to provide bowls of chocolate (although that would make me a lot happier). Instead, one of their requirements is to code to our quite capricious coding standards, where tech arguments on spacing, naming and formatting have been, in many cases, arbitrarily decided. You may not like our standards. Heck, even I don’t agree with parts of them and I designed them. They are there to help the team read each other’s code, and they also serve as our coal-mine canary.

When new code is delivered to us, the first thing we do is scan it for standards compliance. We can see, at a glance, if the code is to standard or not: are the headers present, are the files named correctly, is the spacing and layout right and does the code look ‘clean’? So, if the programmer has not followed the coding standards, we know to drill further. If they have followed the standards to the letter, chances are the code is better.

The programmer who takes the time to code to all our needs (the standards being just one) is more likely to have thought through the code, structured and tested it properly and produced a more reliable and maintainable product. If a programmer has been lazy, copy/pasted code and ignored the coding standard, it’s more likely that they have not fully examined the problem space, designed an elegant solution, tested it and made it maintainable.

It’s all about attention to detail. If the code layout details have been adhered to, chances are the other details have been too.

We’ve seen this time and time again. Code to standard is usually more reliable and correct. And because of that, adherence to standards has become our bowl of M&Ms test.

To be clear, we do not just take the pretty code and push it out. Code still gets reviewed, tested, challenged and examined before it goes out to run the business. If the attention to detail was applied to standardizing the code, then the likelihood of the same level of attention being paid to the functionality and feature set of the code is very high.

Empirically, we see fewer bugs and problems, have an easier time in review, better tests run with better coverage, and maintenance by the team is easier.

Code written to our standards is our brown M&M check, our way to get a quick feel for the quality of work we’re dealing with.

Ironically, I am one of the few people who actually likes the brown M&Ms. Feel free to remove the blue ones from my bowl.


Why I Subscribed to Apple Music

Just over 3 months ago, Apple released Apple Music and I signed up for the free trial as we all did. My expectation was that I would play with it for a few days, get over it and never use it again. I expected the same experience I got from Spotify and Pandora before and I never subscribed to them. I expected the same old same old and no reason to use it.

I did not expect to actually pay for it or even want to pay for it.

Five days ago, I happily paid and will do so from now on.

Here are the main reasons why.

The Human Curated Playlists

These are magnificent. They contain the best mixes created by people who know, live and understand music, and have led, time and time again, to me discovering artists and music I had never heard of before.

In fact, 90% of the time, I find myself listening to one of these mixes. I launch the Music app, scroll to one that looks interesting (and there are lots of these) and start playing. When it ends, I find another, and another, and another. Three months later, and I am still finding more. I am often surprised by what comes up next in a mix.

With Pandora and Spotify, I found the mixes to be boring, generic and pedestrian, when they were not purely amateur. Having been a reasonable mixtape maker myself in the good old days of cassette tapes and big hair, I know a mix needs to tell a story, set a mood, and comfort and surprise the listener at the same time. Apple Music mixes have this heart; Pandora’s and Spotify’s don’t.

Apple Music has a plethora of human created playlists containing magnificent music melding old favorites and new artists in thoughtful, expert, moving ways.

Access to High-Quality Everything

I have not been a music buyer for quite a while (excluding the few albums I picked up on iTunes). I do have a massive 500+ CD collection taking up space in boxes below my bed. I ripped them - quite badly - years ago and have listened to these rips for years.

I think I stopped buying music in general because

  • I stopped listening to the radio and therefore stopped discovering new music.
  • The music I do hear seems so generic, the same songs by the same few artists over and over again. We used to call it the Stock Aitken Waterman effect back in my day, where everything seemed to be bubblegum pop.
  • I’m getting older and according to the research, we stop keeping up.
  • I now live in a small apartment in New York; there’s no space for a music collection or a large sound system.
  • And let’s be fair, there’s no need to own music anymore when you can stream it anytime, anywhere.

My collection of terrible rips has been replaced by instant, anywhere access to Apple’s high quality versions of the same songs I know and love. And when I do discover a new talent, I can listen immediately to their high-quality albums. No need to remember their name, go to the record store and hope to buy their good album.

Apple Music provides full access to the entire record store anytime anywhere.

No Ads or Interruptions, Easy to Use

My wife listens to Pandora all day. When the music is on, things are pleasant. Then an extra-loud ad comes up next. It destroys the mood. I have always found that to be a problem with radio too. Which is why I used to listen to student radio stations.

Apple Music is music and nothing else. No ads, no interruptions, no mood breakers. Yes, I know about Beats 1, and it contains talk and ads and all the usual radio interruptions. I do not care for nor listen to Beats 1.

The user interface also works for me. It’s not perfect, but Spotify’s and Pandora’s are horrible. We both find it difficult to work through the mountains of data, the long scrolling lists and the awful search on those platforms. I find it easy to scan Apple Music to find what I want to listen to.

Apple Music is all about the music, not the show or fighting the user experience.


That’s why I subscribed. Great mixes, full access to quality recordings, no interruptions and easy access, anywhere, anytime.

And with that, my current mix just ended. Oooh, that one looks interesting. Gotta go, pressing play…


The Simple C++ Makefile - Executable Edition

I develop a lot of applications in C++ using Xcode on OS X and deploy them to CentOS Linux Servers to run. I follow the Simple C++ Project Structure (and the Xcode edition) to code up each product.

However, Xcode is not available on Linux. To compile and deploy (and to test compiles and deploys), I use standard Unix Makefiles, available almost everywhere.

In this post I will show you the Makefile I use for multi-platform C++ executable builds and explain what each line and command does in detail. For Library builds, I have a similar, but different Makefile, see The Simple C++ Makefile - Library Edition (Coming Soon).

The Standard Project

I have selected one of my real projects for this post. The project is laid out as follows (most of the source files have been removed to shorten the list):

├── Makefile
├── README.markdown
├── SantaCruzServer.xcodeproj
├── doc
│   └── SantaCruz-dev.yml
├── include
│   ├── caches
│   │   └── session_cache.h
│   ├── components
│   ├── connections
│   ├── queues
│   ├── workers
│   └── version.h
└── src
    ├── caches
    │   └── session_cache.cpp
    ├── components
    ├── connections
    ├── workers
    └── main.cpp

As expected, the C++ source files are under the src folder and includes are in the include tree.

And here is the Makefile, the actual one I am using. Scroll below to see the breakdown and why I did it this way.

#  Makefile
#  SantaCruzServer
#  Created by Hilton Lipschitz on 2015-09-01.
#  Copyright (c) 2015 Maritime Capital LP. All rights reserved.

# HIL: No spaces or comments after otherwise it captures them!
# Determine the platform
UNAME_S := $(shell uname -s)

# CC
ifeq ($(UNAME_S),Darwin)
  CC := clang++ -arch x86_64
else
  CC := g++
endif

# Folders
SRCDIR := src
BUILDDIR := build
TARGETDIR := bin

# Targets
EXECUTABLE := SantaCruzServer
TARGET := $(TARGETDIR)/$(EXECUTABLE)

# Final Paths
INSTALLBINDIR := /usr/local/bin

# Code Lists
SRCEXT := cpp
SOURCES := $(shell find $(SRCDIR) -type f -name *.$(SRCEXT))
OBJECTS := $(patsubst $(SRCDIR)/%,$(BUILDDIR)/%,$(SOURCES:.$(SRCEXT)=.o))

# Folder Lists
# Note: Intentionally excludes the root of the include folder so the lists are clean
INCDIRS := $(shell find include/**/* -name '*.h' -exec dirname {} \; | sort | uniq)
INCLIST := $(patsubst include/%,-I include/%,$(INCDIRS))
BUILDLIST := $(patsubst include/%,$(BUILDDIR)/%,$(INCDIRS))

# Shared Compiler Flags
CFLAGS := -c
INC := -I include $(INCLIST) -I /usr/local/include
LIB := -L /usr/local/lib -lsantacruzengine -lsantacruzlib -larcadia -lcorinth -lyaml-cpp -lzmq -lhiredis -lbondoas

# Platform Specific Compiler Flags
ifeq ($(UNAME_S),Linux)
    CFLAGS += -std=gnu++11 -O2 # -fPIC

    # PostgreSQL Special
    PG_VER := 9.3
    INC += -I /usr/pgsql-$(PG_VER)/include
    LIB += -L /usr/pgsql-$(PG_VER)/lib
else
  CFLAGS += -std=c++11 -stdlib=libc++ -O2
endif

$(TARGET): $(OBJECTS)
  @mkdir -p $(TARGETDIR)
  @echo "Linking..."
  @echo "  Linking $(TARGET)"; $(CC) $^ -o $(TARGET) $(LIB)

$(BUILDDIR)/%.o: $(SRCDIR)/%.$(SRCEXT)
  @mkdir -p $(BUILDLIST)
  @echo "Compiling $<..."; $(CC) $(CFLAGS) $(INC) -c -o $@ $<

clean:
  @echo "Cleaning $(TARGET)..."; $(RM) -r $(BUILDDIR) $(TARGET)

install:
  @echo "Installing $(EXECUTABLE)..."; cp $(TARGET) $(INSTALLBINDIR)

distclean:
  @echo "Removing $(EXECUTABLE)"; rm $(INSTALLBINDIR)/$(EXECUTABLE)

.PHONY: clean

Breaking the Makefile down

This Makefile compiles a program called SantaCruzServer into the local bin folder, and installs it into the standard shared OS X and Linux /usr/local/bin folder. Let’s see how it does it, line by line.

UNAME_S := $(shell uname -s)

The first step is to determine which platform the make is running on. I look for Darwin indicating an OS X computer or Linux for Linux.

# CC
ifeq ($(UNAME_S),Darwin)
    CC := clang++ -arch x86_64
else
    CC := g++
endif

OS X uses the new LLVM Clang compiler (I choose the C++ version), but my Linux servers run GCC 4.8. The $(CC) variable now contains the compiler command for the current platform.

# Folders
SRCDIR := src
BUILDDIR := build
TARGETDIR := bin

These variables set the location of the Source Code (src), where the build object files will go (build) and where the target will be saved (bin).

# Targets
EXECUTABLE := SantaCruzServer
TARGET := $(TARGETDIR)/$(EXECUTABLE)

This sets up the Makefile target to make bin/SantaCruzServer. Note that, by default, this Makefile builds to a local bin folder so it does not overwrite the running application in case of a failed compile on deploy.

# Final Paths
INSTALLBINDIR := /usr/local/bin

This sets where the executable will be installed.

# Code Lists
SRCEXT := cpp
SOURCES := $(shell find $(SRCDIR) -type f -name *.$(SRCEXT))
OBJECTS := $(patsubst $(SRCDIR)/%,$(BUILDDIR)/%,$(SOURCES:.$(SRCEXT)=.o))

Make needs a list of source code files to compile to object files. This list gets built here. I use the $(SRCEXT) variable to store my source code file extension. I then use a shell command to search the src folder (and its subfolders) for all .cpp files and build the $(SOURCES) list. I then create an $(OBJECTS) list from the $(SOURCES) substituting the source path and extension with the build path and the object extension. This makes our compile rule simpler.
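For the shortened project listing above, these two lists expand to something like the following (an illustrative sketch, not actual make output):

SOURCES = src/caches/session_cache.cpp src/main.cpp ...
OBJECTS = build/caches/session_cache.o build/main.o ...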

# Folder Lists
# Note: Intentionally excludes the root of the include folder so the lists are clean
INCDIRS := $(shell find include/**/* -name '*.h' -exec dirname {} \; | sort | uniq)
INCLIST := $(patsubst include/%,-I include/%,$(INCDIRS))
BUILDLIST := $(patsubst include/%,$(BUILDDIR)/%,$(INCDIRS))

The next set of lists are used to build the list of include folders and related lists.

Aside: I am one of those quirky C++ programmers who does not use pathed #includes (e.g. instead of #include "../caches/session_cache.h" I simply use #include "session_cache.h") and expects the compiler to figure out where things are. This way I can re-arrange the code-base by moving files around, not change anything and it still compiles and runs. Xcode behaves the same way by default.

So, the $(INCDIRS) variable contains a unique list of subfolders under the include folder where all my header files reside. The $(INCLIST) variable is a transformation of $(INCDIRS) into the format needed as compiler flags. For example include/caches is transformed into -I include/caches. The $(BUILDLIST) variable creates a list of pathed subfolders for the build folder, where include/caches becomes build/caches for the compile step.
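To make those transformations concrete, here is roughly what the three lists contain for the include tree shown earlier, assuming each subfolder holds headers (illustrative values only):

INCDIRS   = include/caches include/components include/connections include/queues include/workers
INCLIST   = -I include/caches -I include/components -I include/connections -I include/queues -I include/workers
BUILDLIST = build/caches build/components build/connections build/queues build/workers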

# Shared Compiler Flags
CFLAGS := -c
INC := -I include $(INCLIST) -I /usr/local/include
LIB := -L /usr/local/lib -lsantacruzengine -lsantacruzlib -larcadia -lcorinth -lyaml-cpp -lzmq -lhiredis -lbondoas

Most of the compiler flags are shared across both platforms and are set here. In this case, $(CFLAGS) is set to tell the compiler to compile only.

The $(INC) variable is set to help the compiler find include files automatically, so it adds the include folder where the root header files are, the list of include subfolders determined above and the system shared /usr/local/include folder for shared library includes. Now I no longer need to worry when I move files around the code-base.

The $(LIB) variable adds the shared system /usr/local/lib folder where the shared libraries can be found. The rest of that line contains a long list of libraries that this real program needs to link with (and is project specific - you need to set these for each project based on that project’s needs). Note that all my shared libraries are in /usr/local.

# Platform Specific Compiler Flags
ifeq ($(UNAME_S),Linux)
    CFLAGS += -std=gnu++11 -O2 # -fPIC

    # PostgreSQL Special
    PG_VER := 9.3
    INC += -I /usr/pgsql-$(PG_VER)/include
    LIB += -L /usr/pgsql-$(PG_VER)/lib
else
    CFLAGS += -std=c++11 -stdlib=libc++ -O2
endif

Not all settings are the same, which is why the Makefile now adds platform-specific parameters to our variables.

On Linux, I use the gnu++11 language standard (to go with g++) and this gets added to the $(CFLAGS) variable. Also, for some unknowable reason, the PostgreSQL include and library files are in a non-standard location on Linux, so they get added to the $(INC) and $(LIB) variables too.

On OS X, I set clang to use the c++11 language and to link with the libc++ system library.

$(TARGET): $(OBJECTS)
    @mkdir -p $(TARGETDIR)
    @echo "Linking..."
    @echo "  Linking $(TARGET)"; $(CC) $^ -o $(TARGET) $(LIB)

As a habit, I set the final goal of the Makefile first. In this case, make needs to make the $(TARGET) which is bin/SantaCruzServer given the list of $(OBJECTS) we generated above.

First it creates the bin folder if it does not exist, then links the objects into an executable. The $^ stands for the full list of prerequisites, in this case the object files.
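On OS X, that echoed rule expands into a single link command along these lines (a sketch, with the object list abbreviated):

clang++ -arch x86_64 build/caches/session_cache.o build/main.o ... -o bin/SantaCruzServer -L /usr/local/lib -lsantacruzengine -lsantacruzlib -larcadia -lcorinth -lyaml-cpp -lzmq -lhiredis -lbondoas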

Aside: I am also not one of those people who has to see a busy screen of detailed commands as they compile. So, for Makefile rules, I prepend them with a friendly message and do not print out the actual multi-line incomprehensible command. See the output later on for examples.

$(BUILDDIR)/%.o: $(SRCDIR)/%.$(SRCEXT)
    @mkdir -p $(BUILDLIST)
    @echo "Compiling $<..."; $(CC) $(CFLAGS) $(INC) -c -o $@ $<

The Makefile needs to know how to generate the $(OBJECTS), and that’s where the compile step comes in. For all files in the src folder, it compiles to a matching object file in the build folder.
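On OS X, for example, the expanded command for one source file looks something like this (a sketch; note that -c appears twice, once from $(CFLAGS) and once from the rule itself, which is harmless):

clang++ -arch x86_64 -c -std=c++11 -stdlib=libc++ -O2 -I include -I include/caches ... -I /usr/local/include -c -o build/caches/session_cache.o src/caches/session_cache.cpp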

That’s all we need to build the application, but the Makefile has to do more.

    @echo "Cleaning $(TARGET)..."; $(RM) -r $(BUILDDIR) $(TARGET)

The clean target tells make to remove the target bin/SantaCruzServer and the build folder. This gives us the ability to force a full recompile.

    @echo "Installing $(EXECUTABLE)..."; cp $(TARGET) $(INSTALLBINDIR)

The install target copies SantaCruzServer to /usr/local/bin where it will run in production.

    @echo "Removing $(EXECUTABLE)"; rm $(INSTALLBINDIR)/$(EXECUTABLE)

And to uninstall, we have the distclean target.

.PHONY: clean

This final line tells make that clean is a command, not a file to be built. Without it, if a file named clean ever existed in the project folder, make would consider the clean target up to date and skip the rule.
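If you want the same treatment for the other command-style targets, they can be listed as well; this is an optional tweak, not part of the original Makefile:

.PHONY: clean install distclean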

Using the Makefile

To start

$ make clean

This resets the environment. The output in this project is:

Cleaning bin/SantaCruzServer...

To build the executable

$ make -j

This tells make to build the primary target - in this case the first and only one in the file.

Tip: The -j parameter parallelizes the build for speed. The documentation for make says the parameter takes a number of jobs, but if you omit the number, make places no limit on how many jobs run at once.

In my case, the output is:

Compiling src/connections/control_channel_responder.cpp...
Compiling src/connections/request_listener.cpp...
Compiling src/connections/response_publisher.cpp...
Compiling src/caches/session_cache.cpp...
Compiling src/main.cpp...
Compiling src/sc_configuration.cpp...
Compiling src/sc_logger.cpp...
Compiling src/sc_server.cpp...
Compiling src/workers/worker.cpp...
Compiling src/workers/worker_pool.cpp...
  Linking bin/SantaCruzServer

The executable can now be found in bin/SantaCruzServer and run.

$ bin/SantaCruzServer
2015-09-28 21:50:53.999 Info: SantaCruz V0.01 Alpha Server Starting...
2015-09-28 21:50:53.006 Debug: Using SantaCruzLIB Version 0.1 Alpha
2015-09-28 21:50:53.013 Debug: Using SantaCruzEngine Version 0.1 Alpha

To install the executable in production:

sudo make install

This will copy the new executable into production. For development deploys, you do not need the sudo.

The output is:

Installing SantaCruzServer...

And to uninstall:

$ make distclean
Removing SantaCruzServer

And the executable is all gone.

Reusing this Makefile in the next project

The entire Makefile can be copied to the next project. Only a few changes are needed:

  • The $(EXECUTABLE) name needs to change for the new project.
  • The list of libraries in the $(LIB) variable needs to match the libraries linked to by the new project.
  • If PostgreSQL is not needed, remove it from the Linux Platform Specific Compiler Flags (and remove the $(INC) and $(LIB) additions).

And that’s all there is to it.

I am using this same Makefile pattern across many, many projects and it works well for me. I hope it can help you compile and deploy your C++ projects with ease.


Saturday Afternoons Watching Rugby Union

Growing up, Saturday afternoons were for watching Rugby Union games with friends and family. No matter what was going on at home, with friends, at school, or in the country I lived, the world stopped for the two games that were televised live. I am as far away from that world as one can be, yet it is with me right now.

It began with my grandfather. He played as a kid, as evidenced by a hook nose and a deep understanding of the game. There were only two times we could not bother him: his post-lunch nap, and when the rugby was on. But if we were there when a game was on, he would talk, explaining the rules, the nuances, the players’ positioning and decisions. Rugby was real-time, dynamic chess for him, a thinking person’s sport, and he imbued that thinking, understanding and love of the game in all his grandchildren.

We spent a lot of time in Cape Town as young kids. And much of that time was spent playing an ongoing game of Rugby with the cousins and neighborhood kids in my Aunt’s garden. When we arrived at the beginning of a vacation, her lawns were always perfect, flowerbeds immaculate and colors delightful. Within days, the garden was completely and utterly destroyed. We’d tackle each other into the flowers, ruining them. We’d ruck and maul deep gouges into the perfectly cut grass. When a ‘pile-up’ was called, all the kids would dive in, wrecking everything. And she never said a word. There was nowhere else we, the kids, wanted to be but in the game, unless a real game was on for us to watch. Newlands in Cape Town always had cheap bleacher seats for us kids, right up close to the field where we could see our heroes play.

When I got older, and we finally received our own TV, our weekends revolved around the games. Friends, never invited nor needing to be invited, came over. I have no idea how many meals my mum served a bunch of rowdy, hungry, growing boys in those days, nor how many beers we quaffed as we got older and started drinking. And she never said a word. Those friendships have lasted a lifetime and remain as strong as ever, even though we are worlds apart.

And later on, even older, I went to the games. More often than not, my best mate and I would just arrive at the stadium, buy some scalped tickets for the members section and go in to watch the game. I’m not sure we even cared who was playing sometimes, as long as we did it. Watched Rugby together.

International games were never missed. We’d get together wherever we were, have a barbecue and watch. In Japan, a friend set up a projector and screen for the big games. The Bledisloe Cup games between Australia and New Zealand were the high point of each season. Then came the Super League and the Five (now Six) Nations championships. Better games, more internationals, more opportunities to hang out and enjoy.

And now there is the World Cup. The best of the best. I have rarely missed watching a World Cup Rugby game live. When Australia beat England in 1991, we were watching. When South Africa won and overcame its racial divide in 1995, we were watching. France beating the All Blacks in 1999 was the best game ever (until Japan vs South Africa yesterday, maybe), and we were watching. Johnny Wilkinson’s drop goal in 2003 to help England defeat my Australia broke our hearts, and we were watching.

This weekend I watched the first 7 of the 8 World Cup 2015 games live online. I am here in New York, far away from friends and family, from where I come from. But I was not alone. I know those kids I played with all those years ago were doing the same, wherever they are. I know my family and cousins were watching, sharing our love of the game and each other with the next generation. And I know my mates were doing the same too. All of them were on the couch with me, cheering, commenting, calling out the refs, pointing out tactical errors and debating the players and rules.

The love of watching Rugby taught us a lot as we grew up. It taught us to think, to debate, to love, to really communicate, and to be better friends. It drew this nerd out of his shell - too far some say. It showed us that even though the game may just be a game, sportsmanship, courage, honesty and strength come from the head and the heart, not the muscles. It taught us to share and to enjoy the precious moments.

Saturday afternoons were for watching Rugby with family and friends. Decades later, we’re still watching Rugby as eagerly as ever. Even though we are thousands of miles away from each other, we’re not alone in our love for each other or the game that drew us in.


The 'It Has Happened Before, It Will Happen Again.' Apple

The Apple I knew and loved was doomed, so the press said. It was the crazy, emo teenager of a company. Willing to try anything, mad, crazy, radical, different and apt to succeed and fail in spectacular fashion. Against all odds, the Apple of old did not fail. Because of this we now have amazing computers, thin and light laptops, iPods, OS X, iPhones and iPads. Can you imagine a world without Apple products?

The modern Apple has grown up, thanks to Steve Jobs’ maturity, Tim Cook’s solid, intelligent leadership and a team of experienced people taking ownership, saying no and getting things done the Apple way. It’s the grown-up but still young adult. Smart enough to know its limits, but still daring enough to challenge the status quo and push the envelope. Because of this we now have solid and mature operating systems, a regular update schedule, and reliable iCloud. Yet the old teenage Apple is still pushing its limits with Apple Music, Apple Pay, the Apple Watch and maybe something in TV.

What Apple is trying to do is hard. It’s blitheringly hard to make good software, ridiculously hard to make hardware that’s both functional and beautiful, and insanely hard to perform and perfect design and documentation and support and services and research and manufacturing and distribution. The modern Apple does it all, and on a regular schedule no less. It’s seemingly impossible, yet year after year, they keep up the pace.

Next week will be the second of their annual announcement extravaganzas. The surprise is gone, we know how it will work, the keynote, the players, the japes and jokes, and roughly what will be announced.

It’s the “It has happened before, it will happen again.” Apple.

Solid, stable, regular, mature.

But not boring.

The teenage Apple released products when it could, letting product lines wither and seem abandoned until a major new release ambush or product death. The modern Apple still does this for new innovations and hobby products, Apple is still Apple. But for mature products, it now operates on a mature schedule. New OS X and iOS annually, new iPhones and iPads annually, amazingly all shipping on time to massive numbers of people world-wide.

In my opinion, the grown up Apple has found a wonderful balance. For maturing products, a regular update schedule, with a few surprises thrown in for good measure (Swift is an example) to show they still have it. For new innovations, the surprise and wonder of old remains. The recent Apple Watch and Apple Pay launches prove it.

So next week, I too will be following a regular schedule, closing the door to my office, watching the live stream of the Apple event, laughing at the jokes, seeing if there are any new features being announced on existing products, and waiting to hear the launch dates for the new OS X and iOS and iPhone and iPad.

I know the script. We all do.

Except for one part.

One more thing.

At some point (or maybe more than once), Apple’s presentation will pause. The crowd will go silent. The corner of the speaker’s mouth will curl up in a smirk. We will all hold our breaths. They will wait a moment longer.

And then they will announce “one more thing”.

The modern grown-up Apple will slip away, the gleeful teenage Apple will stand tall on the stage, and maybe, just maybe, will change the world again.


Open Extension-less Files in Your Favorite Editor on OS X

Xcode does a horrible job of editing Makefiles (and other extension-less files like dot-files). If you right-click in Xcode on a Makefile and Open with External Editor, OS X opens it in TextEdit. Which is worse.

The issue is that TextEdit is associated with extension-less files. Usually, to change how files are opened, you select the file in Finder, right-click and choose Get Info from the menu. You then choose the editor in the Open with dropdown and click Change All…. But this does not work for extension-less files; it just throws an error.

The solution is to use RCDefaultApp to change this association.

Install it now, it’s free. I’ll wait.

Once it’s installed, open Default Apps from System Preferences, go to Apps and select the application you prefer for extension-less files; I use trusty BBEdit, of course. Scroll down to find the public.data UTI, check it and click Set as Default. Extension-less files should now open in BBEdit from Finder and from Xcode.

I figured out the necessary UTI using the command mdls Makefile and seeing that the system had it as “public.data”. Many other sites say using “public.text” is the correct mapping but that did not work for me on OS X 10.10 Yosemite.
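If you want to check the UTI for yourself, mdls can print just that one attribute (shown as an illustration; the value may differ by file and OS X version):

mdls -name kMDItemContentType Makefile

On a system where extension-less files map to public.data, that is the value this command prints.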
