Hiltmon

On walkabout in life and technology

Xcode 4 Code Completion for External Build Projects

In Xcode and the Simple C++ Project Structure, I showed how to set up Xcode as your IDE for the Simple C++ Project Structure.

But one thing does not work: Code Sense. Xcode does not provide code completion or jump-to-definition for these projects.

Wouldn’t it be nice if we could enable this too?

The solution is simple: you need to use the Xcode build system to create the indices that the IDE uses. But since we are working with cross-platform Makefile builds, we do not want to switch compilers.

The answer is to create a new target that uses the Xcode build tools, and add all your source *.cpp files to it. Xcode will now index these files. And you can simply ignore the new target and continue to build against the old Makefile-based target.

To add this documentation target, click on the project at the top, then click the Add Target button. In this case, I chose a command-line tool. I like to give the new target the same name as the external target, with a -doc extension. Then save it.

The next step is to make sure each .cpp file gets added to the -doc target. One way to do this is to select the file and check the -doc target in the File Inspector.

Unfortunately, Xcode has polluted our simple C++ project and created a new main.cpp, a <name>.1 file, and a folder for this new target. We neither want nor need these.

In the new -doc target, remove the Copy Files step from the Build Phases tab by clicking the X at the top right of this step. This frees us up to remove these files from the new target (uncheck them in the File Inspector) and delete them (choose Move to Trash when asked).

Any time you add a new file to the main project, make sure you add its source to the -doc target. But never use the -doc target to compile the project; it will not work. However, if you ever need to reindex the project for some reason, performing a Clean and a Build on the -doc target will recreate the index.

Now we have a cross-platform external build C++ project that also takes advantage of Xcode’s autocomplete and jump to definition features.

Good Tools Change

In Good Tools, James Bennett writes a compelling article on how his investment in learning his most-used tool, Emacs, has paid huge dividends in his productivity for years and years. I recommend you read it; his arguments are cogent and his points well founded and well made. He writes:

Good tools, for programmers, are investments: you give them your time and your brain up-front, and then they pay dividends for years on end. I’m writing this post in GNU Emacs inside a terminal window, for example, and GNU Emacs inside a terminal window has been my day-to-day editor for over a decade now.

However, I disagree with his post on its fundamental assumptions: (1) that you can use the same tool for decades, and (2) that investing in a specific tool up front is guaranteed to pay these dividends where other investments would pay back less.

Maybe it’s just me, but I think investing too heavily in a specific tool, no matter how good that tool is, is the wrong thing to invest in.

I believe more in investing time and learning in two closely related, integrated areas: becoming a better programmer and becoming a better designer. To me, it does not matter what tools you use (even though I write about mine); it matters how well you design and create what you do with those tools.

As an analogy: a carpenter who makes tables does not spend their entire career learning to use the lathe better; they spend it learning to design and create better tables. A carpenter who studies the lathe and little else will produce the same rickety table over and over again.

To be clear, I’m not in any way saying that James Bennett did not also invest his time in becoming a better designer and programmer; that is simply outside the scope of his article. I would assume, based on the quality of his writing, that he did.

It’s just that my experience has been vastly different to his.

His seems to have been stable. Mine, not so much.

Over the past 23 years (yep, old guy), my investment in becoming a better designer and programmer has had me changing platforms and tools on an almost regular basis. My investments in learning Smalltalk and C back in the day made it easier for me to learn and write Objective-C and Ruby today. My investments in learning Aldus FrameMaker for print design made it easier for me to switch to web design later.

And over that period of time, I changed (amongst others) from DOS and Turbo Pascal to Unix and vi to OS/2 and Lotus Notes to Mac and BBEdit to Windows and Visual Studio to OS X and Xcode and back to straight UNIX (OS X development, Linux production) and TextMate 2. At each stage of this evolution, I have taken the time to learn the tools, just like James did, but my goal was to become a better programmer and designer on that platform instead.

And the best part was that the programming and design skills I had learned in earlier work made it easier and easier for me to adapt to new platforms and paradigms, and to produce better and better products. As I learned new languages and platforms, I took their ideas and integrated them into my programming, and found better ways to design and program things in later platforms.

Unlike James, I could not stay with the same toolkit my whole career. Or use the same programming language, or the same design tools. I knew this when I started out. I know this now. My current platform and tools will change over the next few years, and I need to be able to change with them.

If you are in a stable job working on a stable platform where there is little chance of it changing, then I agree with James: the investment you make in learning your tools is absolutely worth it.

But if you change as much as I have, I believe it’s more important to invest in your ability to design and program better, to adapt to change, and to pick up and discard platforms and tools as you go. Whatever you invest your learning time in now, whether it be tools or programming skills, know that this learning will not be wasted: it will make it easier to transition to the next tools or languages if and when that happens, and it will pay dividends until then.

Xcode and the Simple C++ Project Structure

In a previous post, I talked about A Simple C++ Project Structure that I am using to create a bunch of high-speed daemons for work.

It’s been fun using TextMate 2 and a Terminal to make and run the project, but now that I am getting to the meat of the coding, I’d prefer to use an IDE to help me navigate and debug the code.

Here’s how to set up Xcode 4 on the Mac to compile using our Makefile and run/debug the application.

Note that since these projects already exist, there is a minor shenanigan involved in setup.

Step 1: Create an External Build System Project

Start Xcode and choose File / New Project.

Click on Other then choose External Build System.

Click Next.

Input your project name. I use the existing project’s name so that the Xcode project file matches.

Then click Next.

Now you will save this project at the root of the Simple C++ project folder. Just select the project root and hit Create.

Step 2: Move the Project File

The problem is this: if you had chosen the root of your Projects folder, where the Simple C++ project resides, Xcode would have replaced the simple project folder with a blank folder and its *.xcodeproj file. That’s not what we want.

So instead, we chose the root of the project itself.

But Xcode has created a subfolder named after your project and placed the *.xcodeproj file in it. That’s not what we want either. So let’s fix it up.

Quit Xcode.

Drag the *.xcodeproj from the subfolder and drop it on the project root. Now delete the subfolder.

Double-click the moved *.xcodeproj to open the project again in Xcode.

Step 3: Add the sources

At the bottom left of the Xcode window, click the + icon. Then choose Add files to “Your Project Name”.

Select everything except the build folder and choose Add.

Step 4: Test the Build

Click on the project at the top to see the Project and Targets panel. You should see an External Build Tool Configuration already set up to use the Makefile we had before.

To build, press ⌘B.

Step 5: Run It

We still need to do one more thing: tell Xcode which executable to run.

Under the Product menu, choose Scheme, then Edit Scheme…, or press ⌘<.

Click on Run in the left pane. Then click on the drop-down next to Executable and choose Other…. Find the executable in the bin folder and click on it.

If you prefer, you can change the debugger to GDB as well.

To pass arguments to the default run, click the Arguments tab and add them there.

Click OK to save.

Press ⌘R to run, and Xcode will bring up a console window to display the program’s output.

Cool Benefits

Some cool stuff that comes along for free:

  • If you press ⇧⌘K (or choose Product / Clean), Xcode will run the clean target in the Makefile.
  • The Xcode organizer imports and loads the git source code control environment so you can branch and commit from Xcode.
  • You get a really nice GUI debugging environment (which was the real goal).

Also, don’t forget to commit these changes when done.

Now we have a Simple C++ Project that can and will compile under Linux using command-line tools, and a GUI IDE environment to develop and test it on the Mac.

Using the Spike Folder

In yesterday’s post A Simple C++ Project Structure, I mentioned the spike folder. In today’s post, I’ll write more about how I use it.

By the way, I have previously written about spike solutions, wherein I create solutions for the bigger technical problems at the start of a project to be sure they are achievable. This is different.

In this case, being back in C++ and rusty as an old door hinge, I also needed to create and test out snippets of code I could be using without having to make and run the entire product. So I create spike .cpp files in the spike folder to try things out.
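
For illustration, a spike file is usually nothing more than a throwaway main that exercises one idea in isolation. Here is a hypothetical example (the file name and the CSV-splitting problem are invented for this post, not taken from the real project) of the kind of snippet that ends up in the spike folder:

// spike/split_csv.cpp - hypothetical spike: split a comma-separated line into fields
// Built standalone (no Makefile entry needed), e.g. g++ -g spike/split_csv.cpp -o bin/split_csv
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Split a line on the given delimiter and return the fields in order.
std::vector<std::string> split(const std::string &line, char delim) {
    std::vector<std::string> fields;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, delim)) {
        fields.push_back(field);
    }
    return fields;
}

int main() {
    const std::string sample = "AAPL,420.05,1000";
    std::vector<std::string> fields = split(sample, ',');
    for (size_t i = 0; i < fields.size(); ++i) {
        std::cout << fields[i] << "\n";
    }
    return 0;
}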

But making and running these is such a pain. Instead, I use Nikolai Krill’s absolutely brilliant Code Runner (App Store link) app.

Just choose your language, spin up some code and hit ⌘R to run it. Errors and output are displayed in the panel at the bottom. What I did not know until recently was that it also handles compiled languages, so I could use it for experimental C++ files. And so I did.

Another great feature is that you can customize the compile and runtime environment in Code Runner, which means you can link to external libraries in spikes without having to create a Makefile entry.

Code Runner is another of those well-done, well thought out, single-use applications that we all love.

TextMate 2 users can do much the same: just open a .cpp file that contains an int main... and hit ⌘R to run it. But it does not let you set parameters or compilation flags like Code Runner does.

Nowadays, whenever I get stuck or need a scratchpad to see how to do things without affecting the application’s code, I just spin up Code Runner. If the code is any good, I save these spike programs in the spike folder for later, when I need that working code.

RIP Doug Engelbart

Doug Engelbart passed away last night at age 88. I remember him as one of the names behind the beginnings of modern computers, inventing the mouse, hypertext, video conferencing and collaboration. But these were not the ends of his contributions; they were the means by which he tried to achieve his intent: to augment human intellect.

Too bad we use his inventions for Facebook, Twitter, Netflix, LOLCats and flame wars; we degrade human intellect.

The best way to remember him is this, written by Bret Victor on his blog in A few words on Doug Engelbart (read it all):

The least important question you can ask about Engelbart is, "What did he build?" By asking that question, you put yourself in a position to admire him, to stand in awe of his achievements, to worship him as a hero. But worship isn't useful to anyone. Not you, not him.

The most important question you can ask about Engelbart is, "What world was he trying to create?" By asking that question, you put yourself in a position to create that world yourself.

A Simple C++ Project Structure

One of the things I need in my new job is a bunch of blazingly fast daemons to capture market information and trade data. I prototyped them in Ruby to see what comes down the line, but I have the need, the need for speed. Which means I need a UNIX C or C++ framework.

So I went old-school. Retro even. Plain old C++. My favorite programmer’s editor. And the good old terminal, er, iTerm 2, just Mac-like.

Since I am planning on creating a lot of these little projects, developing them on my Mac and deploying them to Linux servers, I decided to create a generic project folder layout and generic Makefile for each. And share it with you.

The Project Folder Tree

Note: I’m not making these to go outside my company, so the full GNU C++ standard project is overkill. Much of what follows does conform to the basics of their standard C++ project design though.

For each application, the folders are:

  • bin: The output executables go here, both for the app and for any tests and spikes.
  • build: This folder contains all object files, and is removed on a clean.
  • doc: Any notes, like my assembly notes and configuration files, are here. I decided to create the development and production config files in here instead of in a separate config folder as they “document” the configuration.
  • include: All project header files. All necessary third-party header files that do not exist under /usr/local/include are also placed here.
  • lib: Any libs that get compiled by the project, third-party or any needed in development. Prior to deployment, third-party libraries get moved to /usr/local/lib where they belong, leaving the project clean enough to compile on our Linux deployment servers. I really use this to test library versions that differ from the standard ones.
  • spike: I often write smaller classes or files to test technologies or ideas, and keep them around for future reference. They go here, where they do not dilute the real application’s files, but can still be found later.
  • src: The application and only the application’s source files.
  • test: All test code files. You do write tests, no?

.gitignore

Since I use git for source code control, the .gitignore file is:

# Ignore the build and lib dirs
build
lib/*

# Ignore any executables
bin/*

# Ignore Mac specific files
.DS_Store

Makefile

I do not need the extra effort or platform independence of autotools. These are great for users, but suck up developer time to make them work. Instead, I opted for a simple yet flexible and generic makefile (see notes below):

#
# TODO: Move `libmongoclient.a` to /usr/local/lib so this can work on production servers
#
 
CC := g++ # This is the main compiler
# CC := clang --analyze # and comment out the linker last line for sanity
SRCDIR := src
BUILDDIR := build
TARGET := bin/runner
 
SRCEXT := cpp
SOURCES := $(shell find $(SRCDIR) -type f -name *.$(SRCEXT))
OBJECTS := $(patsubst $(SRCDIR)/%,$(BUILDDIR)/%,$(SOURCES:.$(SRCEXT)=.o))
CFLAGS := -g # -Wall
LIB := -pthread -lmongoclient -L lib -lboost_thread-mt -lboost_filesystem-mt -lboost_system-mt
INC := -I include

$(TARGET): $(OBJECTS)
  @echo " Linking..."
  @echo " $(CC) $^ -o $(TARGET) $(LIB)"; $(CC) $^ -o $(TARGET) $(LIB)

$(BUILDDIR)/%.o: $(SRCDIR)/%.$(SRCEXT)
  @mkdir -p $(BUILDDIR)
  @echo " $(CC) $(CFLAGS) $(INC) -c -o $@ $<"; $(CC) $(CFLAGS) $(INC) -c -o $@ $<

clean:
  @echo " Cleaning..."; 
  @echo " $(RM) -r $(BUILDDIR) $(TARGET)"; $(RM) -r $(BUILDDIR) $(TARGET)

# Tests
tester:
  $(CC) $(CFLAGS) test/tester.cpp $(INC) $(LIB) -o bin/tester

# Spikes
ticket:
  $(CC) $(CFLAGS) spike/ticket.cpp $(INC) $(LIB) -o bin/ticket

.PHONY: clean

Notes on the Makefile:

  • The TODO at the top reminds me that I am using a different version of a library in development and it must be removed before deployment.
  • The TARGET is the main executable of the project, in this case bin/runner. Type make and this is what gets built.
  • I’m using g++ because it’s the same on Mac OS X and on the production Linux boxes.
  • If I uncomment the clang line, I get a failed link as the libraries are incompatible (unless I also comment out the linker line under $(TARGET):). But then I get the benefit of a clang static analyzer run to help me make my code better, well worth it.
  • I use as few compiler CFLAGS as possible when developing; optimization happens later.
  • The SOURCES list is dynamic; I don’t want to have to maintain it manually as I program. Anything in the src folder will be included in the compile as long as it has the SRCEXT extension.
  • The OBJECTS list is also dynamic and uses a Makefile patsubst trick to map each source file to its corresponding object file under build.
  • The LIB in this case uses a local library for MongoDB as I am testing it, but uses the default homebrew or yum installed libraries for boost. I normally do not use boost, but Mongo needs it.
  • The INC ensures all headers in the include folder are accessible.
  • I like to see the commands that run, hence the multitude of @echo lines.
  • Since there are so few of them, I manually add spike and test builds as new Makefile targets; see the ticket: target for example. A sketch of a matching test file follows this list.
  • The .PHONY clean is brilliant; it nukes the build folder and the main executable. It does not clean spike or test executables, though.
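
The tester: target above expects a test/tester.cpp to exist. The real tests are not shown in this post, but a minimal, hypothetical version is just a plain main with asserts, enough to exercise a function without pulling in a test framework:

// test/tester.cpp - hypothetical minimal test file built by the tester Makefile target
// Run with: make tester && bin/tester
#include <cassert>
#include <iostream>

// Stand-in for a function under test; a real test would include headers
// from include/ and exercise the daemon's classes instead.
static int add(int a, int b) { return a + b; }

int main() {
    assert(add(2, 2) == 4);
    assert(add(-1, 1) == 0);
    std::cout << "All tests passed" << std::endl;
    return 0;
}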

Aside: Why separate the includes and the sources? This is fundamentally not necessary for most of my expected projects as they will be stand-alone daemons. But I do expect to build a few shared libraries for these daemons, and having the include files separate makes them easier to deploy later on. So I may as well get into the practice of keeping them separate.
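
To make that layout concrete, here is a hypothetical pair of files (the TickSource name is invented for illustration) showing how a header in include and its implementation in src fit together; the -I include flag in the Makefile is what lets the source find the header:

// include/tick_source.h - hypothetical public header, easy to ship with a shared library later
#ifndef TICK_SOURCE_H
#define TICK_SOURCE_H

#include <string>

class TickSource {
public:
    explicit TickSource(const std::string &symbol);
    std::string symbol() const;

private:
    std::string symbol_;
};

#endif // TICK_SOURCE_H

// src/tick_source.cpp - implementation, picked up automatically by the SOURCES wildcard
#include "tick_source.h" // resolved via the -I include flag

TickSource::TickSource(const std::string &symbol) : symbol_(symbol) {}

std::string TickSource::symbol() const {
    return symbol_;
}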

Retro Programming

So the retro code-compile-run loop looks like this:

  • Code in TextMate 2.
  • ⌘⇥ to a terminal and make.
  • ⌘⇥ Fix any errors in TextMate 2.
  • ⌘⇥ to a terminal and make again until it compiles.
  • Type bin/runner params in the terminal to run the application. Tip: use shell history (!) expansion: typing !b alone reruns the last command starting with b, in this case bin/runner with its parameter list, so I have to type even less.

There is nothing fancy about this setup, but that is the whole point. A simple, retro environment for simple retro C++ programs.

Keep Building

Marco Arment (@marco) on his blog in Lockdown, writes:

We need to keep pushing forward without them, and do what we’ve always done before: route around the obstructions and maintain what’s great about the web. Keep building and supporting new tools, technologies, and platforms to empower independence, interoperability, and web property ownership.

The context is how the big internet companies like Facebook, Twitter and Google have locked down the open internet, and that we need to overcome their new barriers and keep building so that we can keep the great internet we know and love.

But I think this message applies to more than just the internet; it applies to work and play and socialization and life. We need to keep doing what we did to get to where we are today, to keep building, to overcome the obstructions and maintain what’s great about our lifestyles. We need to keep up the effort on freedom, rights, openness, the environment, and anything else that is big and important in our lives. With an open internet, we can do it, and we should do it.

Back in the Saddle

TL;DR: I got a job! Gonna keep writing though.

As of this week, I’m back in the saddle in the finance industry, designing and developing leading edge platforms and systems for a new asset manager, my favorite kind of business. The challenges in the new firm are new to me and epic. And I love a good challenge.

How does this affect…

  • My current consulting customers? Nothing changes: all your projects are in support, and the support levels will continue unaffected. I will not, however, be taking on new consulting engagements.
  • My blog? I love writing, and will continue to spend evenings and weekends writing and hopefully improving my craft. Actually, I think this blog may get even more geeky as I am using some very cool, modern open-source products and will likely nerd all over them and just have to share.
  • My products? Of course, they too will be supported and upgraded as needed. TimeToCall will get an iOS 7 facelift, and Kifu is already rock solid.

I have enjoyed the past few years being an indie consultant (other than working for a royal pain-in-the-arse – oh, wait, um, that’s me!) and delivering some amazing products to a bunch of wonderful customers. It has enabled me to become proficient in a whole bunch of new technologies, be exposed to new industries and processes, face and defeat untold challenges, and to make a lot of new friends like you.

I might be working for the man from now on, but that is all that really changes.

Pinbrowser for Pinboard Updated

I just downloaded the updated Pinbrowser for Pinboard from the App Store, an iOS app by Mikael Konutgan (@mkonutgan), and it just got a whole lot better.

Pinboard is a paid-for bookmarking site that focuses on speed, utility and longevity (and anti-socialness!). Pinbrowser is an iPhone and iPad application to access your Pinboard, others’ bookmarks (making it more social) and the famous popular bookmarks list.

It’s the social aspect that attracted me to the product in the first place. I love to see what other people I follow on Twitter and App.Net are finding on the Internet in their travels, and the ability to browse their Pinboard lists is the stand-out feature of Pinbrowser.

In this release, Mikael has gotten down to spotlighting the key uses of the application by adding new icons to separate the standard sidebar items from user-added ones, a Me list for your own bookmarks (strangely missing from the original) and a lovely new feature, the All Starred list. Oh, and the new icon is brilliant!

If you use Pinboard, Pinbrowser for Pinboard on the App Store for the iPhone or iPad is a lovely one-thing-well app that you should consider.

WWDC 2013 Post-Game Review

Just under a week ago, Tim Cook walked out on stage to present the keynote at WWDC 2013 (Video here). It was an astounding success. I think Apple has become Tim Cook’s Apple, and the products announced are living proof of that.

After letting them percolate for a few days, here are my impressions of what was announced.