Technology is a part of you

When I was in high school and college, I practiced Bujinkan Taijutsu, a form of ninpo. My teacher always emphasized that the weapon (katana, bo staff, or otherwise) is an extension of your own body. I think this is true. Part of your brain's job is to build models of your sensorimotor system; the resulting sense of your body's position and movement is called proprioception. The holy grail of machine learning for robotics is an algorithm that can build an accurate internal representation of an arbitrary set of sensors and actuators and learn how to use them to act on the world.

In infants, this seems to be done by random experimentation and updating. Babies twitch hundreds of thousands of times per day, exploring the reactions and most likely updating their internal proprioceptive representation, their internal model.[1] In the same way that a martial artist learns to use a bo staff through practice, babies learn to use their central nervous system, retinas, cochleas, and hands. They seem to do it in an unsupervised, or at least self-supervised, manner.

Our body, sensorimotor system, and bo staff are just a small fraction of the technology that we have access to in the modern world. Bikes, cars, shoes, and clothing act as extensions of our body. Television and radio act as extensions to our sight and hearing. Language, writing, calculators, computers, and the internet act as extensions of our minds.

And, just like babies developing models of their legs, which eventually become a natural extension of their proprioceptive experience, we develop models of our technology that become natural extensions of our biological hardware.

1. Blumberg, Mark S., et al. “Spatiotemporal structure of REM sleep twitching reveals developmental origins of motor synergies.” Current Biology 23.21 (2013): 2100-2109.

Array Tuttle: Software Engineer

Array Tuttle: Bloody eight-line matrix multiplication. Huh!
Sam Iterative: I suppose one has to expect a certain amount.
Array Tuttle: Why? I came into this game for the action, the excitement. Go anywhere, travel light, five-character GPU kernels, A ∘.× B, get in, get out, wherever there’s a linear algebra problem, a man alone. Now they got the whole country doing imperative programming; you can’t write a matrix multiplication without three for loops.

I think the future of programming languages will be something that looks like APL and runs nicely on GPUs, FPGAs, and other parallel hardware architectures.

Gravity: An orbital mechanics game written in Elm

Recently, I’ve been really excited about Elm. I’m currently using it for the on-hat web application for Lambda Hat. I’ve really enjoyed the FRP style and the Signals control flow mechanism.

When I was first getting started, I put together a little orbital mechanics game I call Gravity. Feel free to check out the source and demo below:

Google Glass Lifestream

I’m happy to announce the launch of our open source Google Glass Lifestream & Backup Tool for Mac OS X. In order for it to work, you must first turn on debug on your Glass: Settings > Device info > Turn on debug (see notes below as to why we do this).

After backing up your Glass to the chosen directory, it can optionally generate a lifestream from the images taken with your Glass. Here’s my latest lifestream:

Stephen Balaban's Lifestream - Created by the Lambda Labs Glass Backup Tool for Mac OS X

You can download the backup tool here:

Download the Google Glass Backup Tool

Simply download and run the file.

The source is available on GitHub. The dirty secret is that the entire app is just a shell script wrapped into a .app file! It’s the future of Cocoa development.

Tools used:

  • Platypus – for wrapping the shell script up as an app.
  • CocoaDialog – bash bindings for Cocoa.
  • adb – for moving files to and from the Glass without having to implement a PTP client; this is why the Glass needs to be in debug mode.
  • ImageMagick – for creating the optional GIF lifestream at the end.
  • GNU parallel – for downsizing the images in parallel using ImageMagick.
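The tools above boil down to a short pipeline. Here’s a minimal sketch of it, assuming the Glass is in debug mode and exposes its camera roll at /sdcard/DCIM/Camera (the device path, frame size, and GIF timing are illustrative assumptions, not the tool’s exact values):

```shell
#!/bin/sh
# Sketch of the backup + lifestream pipeline.
# Assumptions: adb, GNU parallel, and ImageMagick are on the PATH;
# photos live under /sdcard/DCIM/Camera on the Glass.
DEST="${1:-$HOME/glass-backup}"
mkdir -p "$DEST/small"

# 1. Pull photos off the Glass over adb (skipped if adb is unavailable).
if command -v adb >/dev/null 2>&1; then
    adb pull /sdcard/DCIM/Camera "$DEST"
else
    echo "adb not found; skipping device pull" >&2
fi

if command -v parallel >/dev/null 2>&1 && command -v convert >/dev/null 2>&1; then
    # 2. Downsize a copy of each image in parallel (ImageMagick via GNU parallel;
    #    {/} is parallel's basename replacement string).
    find "$DEST" -maxdepth 1 -name '*.jpg' |
        parallel --will-cite convert {} -resize 640x "$DEST/small/{/}"
    # 3. Stitch the downsized frames into an animated GIF lifestream.
    if ls "$DEST"/small/*.jpg >/dev/null 2>&1; then
        convert -delay 20 -loop 0 "$DEST"/small/*.jpg "$DEST/lifestream.gif"
    fi
fi
echo "Backed up to $DEST"
```

The real tool wraps a script like this with Platypus and uses CocoaDialog to prompt for the destination directory.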


iOS internationalization from day one

A few days ago we did a private beta release of the Babatuba Collage App. Here’s the map from the first 72 hours:


Babatuba was to be international from day one, and this decision fundamentally directed the product. Many aspects of an application change as a function of your audience’s locality. Sina Weibo is popular in China but not in the United States. Bandwidth varies wildly from nation to nation: we increased image compression to speed up loading times in China. We found the average bandwidth by country reported in Akamai’s State of the Internet useful. Ideally this would be calculated per user from bandwidth measurements on the phone.

Separating code and data is good engineering practice; don’t litter your code with literals, internationalize!

Here’s a 5-minute tutorial & example iOS Project:

iOS Internationalization in 5 minutes

Step 1. Use NSLocalizedString for all string constants

If you use the C preprocessor macro NSLocalizedString, which is defined as #define NSLocalizedString(key, comment) [[NSBundle mainBundle] localizedStringForKey:(key) value:@"" table:nil], you can use the shell script below to generate the body of the .strings file that you will use in the next step. Example:

- (void)viewDidLoad {
    [super viewDidLoad];

    self.labelIntl.text = NSLocalizedString(@"LABEL_TEXT", nil);
    [self.buttonBabatuba setTitle:NSLocalizedString(@"BUTTON_TITLE", nil)
                         forState:UIControlStateNormal];
}

Step 2. Set up a Localized.strings file & Localize your InfoPlist.strings

Create a Localizable.strings file: File >> New >> File >> iOS >> Resource >> Strings. Select the new file and click Localize.

Strings File
Localize iOS

Use the following shell script (included in the example project) to generate the boilerplate for all localizable strings in your project:

grep -hor "NSLocalizedString(@.*)" . | sed 's/NSLocalizedString(@//' |
     sed 's/,.*$//' | sort | uniq | sed 's/$/ = "";/'

This will generate the body of your Localizable.strings file. To set the name of your app as it appears on the springboard, localize your InfoPlist.strings file and add this to each file:

"CFBundleDisplayName" = "$LOCALIZED_NAME";
"CFBundleName" = "$LOCALIZED_NAME";

Step 3. Rejoice!

Just translate the .strings files and you’re set! Kind of ^^. App usage will vary from culture to culture, so you will need to localize your product’s features based on usage. However, that’s up to you and your team’s understanding of your target market! You can get the full iOS internationalization example project on GitHub.

There are other great tutorials for Apple’s internationalization APIs online.

In the meantime, you can participate in the Babatuba beta for just 99¢! It’s been internationalized for English, Spanish, and Chinese (both Simplified and Traditional).

Participate in the Babatuba Beta for iOS

Facebook Graph Search Breaks Your Privacy Settings

Facebook’s new Graph Search is broken.

It was a sunny afternoon in Chinatown. I was taking my new Graph Search invite for a spin. After exhausting the—surprisingly long—list of females under 30 living in San Francisco, who went to MIT and like Seinfeld, I went for something more mundane, but far more sinister. “Photos of My Brother”.

Pesky privacy-conscious sibling not letting you view their photos? There’s a Graph Search for that.™  Lo and behold, despite my brother’s draconian privacy settings, despite his profile advertising to me “No photos to show”, that simple search provided dozens of never before seen photos of my fraternal twin. He was outraged.

Facebook Graph Search "Photos of"

Luckily, a single predicate can realign the current implementation with my brother’s, and others’, expectations. If Mallory can’t see a photo on Alice’s profile, don’t show it to Mallory in a “Photos of Alice” Graph Search. This would have prevented the disabling of my brother’s Facebook account and untagging of his photos that ensued. This would fix Graph Search.

You may argue, “Even before Graph Search, Mallory could manually browse the public photos of Alice’s friends and view those photos of Alice.” While this is true, Graph Search makes this formerly Sisyphean task a matter of a few keystrokes. It broke my brother’s privacy settings.

“The new Request Removal Tool!”, you may cry. I’ve used it myself and assisted multiple friends in untagging their newly searchable photos. Here’s the interface and a photo of me enjoying a cup of coffee:

Facebook Request Removal Tool

A “removal” tool that requires you to scroll through your timeline for 15+ minutes, mindlessly clicking check box after check box, is not a removal tool. It’s a digital torture device. We want one-click Graph Search opt-out.

I don’t blame Facebook. The majority of their users don’t seem to care about privacy, and that’s fine for them. But some of us, like my brother, do care. Some of us have every possible privacy setting set to “Only Me”. We expect that to mean only me, not me and people with Graph Search. Regardless of your sharing preferences, you must agree: we deserve a system that respects our privacy. Facebook, please fix Graph Search.

Summary (for those at Facebook working on Graph Search, feel free to bring these points up in your next meeting or standup):

  1. Graph Search, in its current implementation, breaks the privacy model
  2. If a photo can’t be seen on my profile, don’t show it in a Graph Search
  3. Make tools and settings which allow us to easily manage our Graph Search privacy

Your thoughts and feelings are welcome. I’d especially like to hear from those currently working on the product. Tweet @stephenbalaban