Category Archives: Code

WWDC 2016 Recap

wwdc badge

As a beginning iOS developer, I remember learning a lot of the craft by trial and error, paired with many of the WWDC videos of years past. I was very lucky this year to 1) get chosen in the WWDC lottery and 2) work for a company that would support my trip to San Francisco. The week flew by and I went to some really interesting sessions. Here are the things I found most interesting.


Accessibility

I’ve been developing some accessibility features since starting at Starbucks, so I went to the What’s New in Accessibility session. The most interesting developer-facing features are the improved Accessibility Inspector (which lets you quickly run accessibility audits, even in the simulator) and custom rotors for secondary actions. I’m still pretty new to accessibility, but I imagine custom rotors will be nice, since until now I’ve used a hack with the headings rotor to let users skip between content.

Swift APIs

Although I haven’t been writing much Swift at work (we’re just starting to adopt it), I went to the Swift API Design Guidelines session. It was interesting to note the conscious decisions to make Swift read grammatically, and how developers are reconciling that with the ways Swift needs to stay compatible with Objective-C APIs. The talk itself was really interesting and worth watching in full, so I won’t try to summarize it here!

watchOS 3


As a person who actually bought an Apple Watch and still wears it every day, I would really like developers to take advantage of the performance gains in watchOS 3, not least of which is leaving apps in memory so they can load much faster! I was also really mesmerized by the power of rich notifications on the watch using SpriteKit and/or SceneKit. I’m worried that the watch is in a catch-22 right now: it’s not really worth the work to develop for it because there aren’t enough users, and there aren’t enough users because of the lack of good apps. Or it could be that the watch really just needs a good weather app and that’s it. Oh well.

Search APIs

I also went to a session about using search APIs like NSUserActivity and CoreSpotlight. I largely ignored these features in the past, but it would be pretty interesting to hook into the system in order to provide phone numbers or addresses to hand off to Maps or whatever other app the user is switching to.

Other Stuff

I went to quite a few other sessions on things like iOS photography (which I sort of just wandered into and stayed for), iMessage integration, and On-Demand Resources. Overall, the sessions were really great, and I found it really nice to be able to attend a talk in person and absorb the knowledge without distractions. It’s true that the sessions are all available to stream online shortly after they happen, but too often I’m watching a video while also working against some deadline or only paying half attention. It’s also pretty neat to be among other iOS developers, and it’s kind of hard to believe that companies have a hard time hiring when you can see so many developers in one room!

Hair Force One

hair force one

Craig Federighi has become one of the best parts of Apple keynotes, as he’s equal parts funny and informative. Whenever he showed up in the conference center, huge lines of people would gather to take a photo with him. He’s a really tall dude, and it’s really nice of him to put up with random iOS developers who want a photo with him.

The Jacket

wwdc jacket

I received what is apparently a WWDC tradition: a jacket with the year on it. I immediately felt weird about getting a nice jacket that basically screams “I am the most elite of the elite, because either I or my company can pay $1600 for me to go to a developer conference!” I spotted a few jackets from earlier years in the conference center, as though it’s a way to show that you’ve been in this game longer and you’re much cooler than people who just got their first jacket this year. I am in the process of de-labeling my jacket with nail polish remover (ask me about it if you’re interested!) and I’ll hopefully get to wear it embarrassment-free once the weather gets cooler (it’s a really nice jacket!). I’m hoping Apple stops giving these jackets out at some point and does something like, I dunno, plant a tree for each attendee? Then again, we are talking about a company that makes objects that people literally worship.

The Company Store

the company store

Speaking of products, there’s a company store set up in the conference center where you can buy shirts, baby stuff, and hats with Apple logos on them. I was going to buy something, but I realized that you’re literally paying $40 for the logo, as the shirt itself is a fairly standard Bella+Canvas tee. Plus, I already have a WWDC jacket that asserts my dominance as an iOS developer!

In Closing

Again, I feel incredibly lucky to have been able to go to WWDC this year, for the first time ever. The keynote and sessions were really great, and I got at least three ideas for future apps, which is a lot more than I’ve had in the past year or two. Hopefully I can go next year and do it all over again.

Collecting Wedding Photos with Twilio MMS

Last Saturday I got married to my longtime pal Emily. Emily handled a lot of the wedding planning and she did a really great job. I wanted to do some fun things for the wedding and reception and thought it would be fun to make a way for friends to share their photos of our day with us.

I was thinking that it would be easy enough to give people an Instagram hashtag and be done with it, but I wanted to make sure all (or at least most) of our guests would be able to participate. I figured that Twilio’s MMS would be a pretty straightforward way for our guests to share photos throughout the day without having to remember a hashtag or use a specific app.

I set up a Twilio phone number to receive and send text messages. Actually, I just re-used an old one that I still had enabled since I used it for voice only. Because I wanted to write as little boilerplate as possible, I decided on using the Django web framework which I’m already really familiar with. The Django admin panel was incredibly useful for debugging and made it dead simple to view photos that were sent. I used the django-twilio library which also uses the twilio-python library. I’ve been on a Heroku app hosting streak lately so I made sure the app could be easily deployable on Heroku for free.

Since I didn’t have a whole lot of time to build this app (wedding planning takes a lot of attention), I limited my scope to a few goals:

  1. Be able to broadcast messages to guests throughout the day.
  2. Accept messages that contain MMS images (one or more) and save them to a database.
  3. Allow guests to “unsubscribe” from messages in case they didn’t want them.

I ended up using Ngrok to test the app locally, then threw Runscope in front of it for good measure (yo dawg, I heard you like proxies). Runscope was pretty useful in that I didn’t have to keep sending test SMS and MMS messages to my app; I could just replay ones that I had sent previously. It probably saved me ~$1 in fees!

It was pretty straightforward to implement all three of these using the Django library I mentioned earlier. Because all Twilio text messages hit just one endpoint, I had to design around different use cases which made my code a bit messy. It was fine for an app I’m only going to use once, though. The code is gross but it worked and I’m glad I didn’t make it more complicated than it needed to be. I was originally going to use the broadcasting feature more than I ended up using it (I wanted to do something like cat facts) which added a bit to the codebase.
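Since everything hits one endpoint, the dispatch logic is essentially a chain of checks on Twilio’s standard webhook parameters (Body, From, NumMedia, MediaUrl0…N). Here’s a minimal, framework-free sketch of goals 2 and 3; the function and its return values are my own illustration, not the app’s actual code:

```python
def handle_incoming(params):
    """Decide what to do with one inbound Twilio message.

    `params` is the POST payload of Twilio's webhook request. Twilio
    sends NumMedia plus one MediaUrl0..MediaUrl{N-1} parameter per
    attached MMS image.
    """
    body = params.get("Body", "").strip().lower()
    if body in ("stop", "unsubscribe"):
        # Goal 3: flag this guest's number so broadcasts skip it.
        return ("unsubscribe", params.get("From"))
    # Goal 2: collect every attached image URL for saving.
    count = int(params.get("NumMedia", "0"))
    urls = [params["MediaUrl%d" % i] for i in range(count)]
    return ("save_photos", urls)
```

In the real app, the “save_photos” branch would create database rows via the Django ORM, and the “unsubscribe” branch would mark the sender’s number before any future broadcast.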

As far as final statistics, I received 70 images from guests during and after the event. Some of them are pretty awesome. In case anyone is interested in how the app works, I posted the source code with my hardcoded phone numbers removed and turned into environment variables. You can check it out here.

I am not posting any photos that my guests sent here since they were meant to be private, but here’s a photo I took with my selfie stick of us and the cake!

Cake Selfie


Side Project: Audubon

The Idea

A few weeks ago, I noticed a tweet from John Sheehan asking if there was an automated tweet -> screenshot tool:

Anyone know of a browser extension or something else that can turn a tweet into a single image for presentations, etc?

I thought it was a pretty interesting concept that would be fairly easy to implement, with a small enough scope that I could use it to learn some more JavaScript. After playing around with Ghost Inspector, I knew it would be possible to render a screen capture of a webpage. I just needed to implement the logic for figuring out what area of a page to render, rather than the whole page.

The Solution

I played around with CasperJS before finding a StackOverflow post describing how I could do all that I wanted in PhantomJS, which CasperJS is built on. I wrote a script that takes a URL as input and spits a PNG out to stdout. Then I wrote a Node web app in Express that takes the URL as a parameter and runs the script through child_process.spawn. I had the Express app write the stdout of the child process to a buffer and send it once the script finished. Done!
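The app itself is Node, but the spawn-and-buffer pattern translates to any language. Here’s a sketch in Python, with `rasterize.js` as a stand-in name for the capture script (not the project’s actual filename):

```python
import subprocess

def capture_stdout(argv, timeout=30):
    """Run a child process and buffer its entire stdout as bytes."""
    return subprocess.run(
        argv, stdout=subprocess.PIPE, check=True, timeout=timeout
    ).stdout

def render_png(url, script="rasterize.js"):
    """PhantomJS prints raw PNG bytes to stdout; buffer and return them
    so the web layer can send them as the response body."""
    return capture_stdout(["phantomjs", script, url])
```

The web handler then just sets an image/png content type and writes the returned bytes.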


I ran into a problem when I wanted to host my app on Heroku. Heroku apparently does not support writing to /dev/stdout, and I only found out about this when my images were being sent as empty files. I looked at some solutions that involved writing to an Amazon S3 bucket, but I didn’t want to incur that much of an operations overhead for something so lightweight.

As a workaround, I found that Heroku does allow writing to /tmp, though any files you throw in there are not guaranteed to remain there after the request is over. For me that’s perfect, since the file only needs to exist as long as the request lasts.
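The workaround pattern, sketched in Python (the app does the equivalent in Node): write the bytes into /tmp, read them back for the response, and don’t count on the file outliving the request.

```python
import os
import tempfile

def via_tmp(png_bytes):
    """Round-trip bytes through Heroku's writable /tmp directory.

    The file is ephemeral, which is fine: it only needs to exist for
    the duration of a single request.
    """
    fd, path = tempfile.mkstemp(suffix=".png", dir="/tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(png_bytes)
        with open(path, "rb") as f:
            return f.read()
    finally:
        # Clean up eagerly rather than trusting the dyno to do it.
        os.remove(path)
```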

Finally, I threw together an index view with a form and a button that takes a Twitter URL and loads the image into the same view when you click on “OK.” I got a nice theme from here and hardly customized it.

Screenshot in a screenshot inception.
Screenshot in a screenshot inception.

I also figured out how to use the “Deploy to Heroku” feature by adding an app.json file to my git project.
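A skeletal app.json is all the button needs. The manifest below is illustrative (placeholder repository URL, and the real app also needs a PhantomJS buildpack alongside Node’s), not Audubon’s actual file:

```json
{
  "name": "Audubon",
  "description": "Turn a tweet URL into a PNG screenshot",
  "repository": "https://github.com/<user>/audubon",
  "buildpacks": [
    { "url": "heroku/nodejs" }
  ]
}
```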

There were a couple of other issues I ran into that I haven’t described yet, mostly getting PhantomJS 2.0 to run on Heroku (because previous versions don’t render webfonts correctly) and setting up multiple buildpacks on Heroku. You can see what I ended up using by inspecting the GitHub project here.

I called the project Audubon after the Audubon Society which is really into birds (get it? birds, tweets?). You can deploy it yourself with this button:



There’s a couple of things I need to wrap up in this project. If anyone wants to they could also fork the project and throw a pull request at me, but I’m planning on doing these eventually:

  • Make the command line tool better for generating images (right now it just writes to /tmp so I should make that configurable).
  • Make a bookmarklet so it’s easy to create images from the Twitter website.
  • Make the web index page look a little nicer.
  • Maybe provide image format and quality options.

Overall, I think this was a really good side project in terms of scope and completability. I learned a lot about Node.js, Express, and PhantomJS. I’ve been meaning to level up my JavaScript web game, and this project has been a useful exercise.

How Ghost Inspector Helped Me Furnish My House

Act 1

Ikea Stockholm. The average college student would scoff at such luxury, but as a first-time homebuyer, I wanted the best. When I saw the Ikea Stockholm TV Unit (in walnut veneer), I knew I would have it. The problem was that it never seemed to be in stock.

Behold the majesty.
Behold the majesty.

I must have waited for months, maybe even a year, to see if the unit was available in my Ikea store, but apparently there were production issues. Ikea phone support was never useful and just told me to check again later. I was stuck waiting it out while my current (non-Stockholm) Ikea TV stand bowed under the weight of my 60″ television.

On a random visit in December, though, my interest was piqued. The “buy” button was actually there on the page! I tried faking an order and got a preliminary ship date! For whatever stupid reason, I thought this meant that the supply issues had been fixed. I delayed. I wanted to make sure the thing would fit in my living room. I missed out. Later that week, when I checked the site, the unit was gone.

Act 2

I noticed that the website did something weird when I tried buying the TV unit. The “buy” button was still working, and the item was added to my cart. But when I tried to enter my address, I would get an error in my cart saying the unit wasn’t available. There must’ve been some issue with their inventory system.

Instead of checking manually every day to see if Ikea had any real stock, I decided to automate the process using a tool called Ghost Inspector.

Ghost Inspector is a service that lets you define actions for a headless (ghostly) web browser to run for you. It’s mostly intended for running tests on parts of your website (like verifying that a user can log in, get some data, not get an error, etc). If you end up changing a part of your website and it breaks another, Ghost Inspector can automate the process of discovering that regression.

As the title of my blog post implies, Ghost Inspector can also tell you when that piece of furniture you want goes back in stock.

To get started, I created a free account with Ghost Inspector. The free tier allows you to run 100 automated tests per month, which was more than enough for me to check Ikea once a day. To create a “test,” you can either download a Chrome extension that assists you in recording one, or enter steps into the site itself using actions and CSS selectors (you can also use the extension to record a rough version of a test and edit it later, as I did).

This is what my first test run looked like after I recorded the process of adding the item to the cart, entering my address info and trying to check out:

As you can see, the test failed, which is bad, because I was testing to see if the item was out of stock (which I knew was true). The Ghost Inspector recorder picked up that I clicked on a button with a “pressed” state active. Unfortunately, the headless browser doesn’t hover over a button when clicking it, so it didn’t actually see this element and couldn’t click it. I fixed this by selecting the ‘a’ tag with the specific id that I knew would actually add the item to my cart (I also made the browser pause in case something needed to finish loading before hitting the button):

Here’s the screenshot of my successful (empty cart) test:

At the end of the test, I created an assertion that an element matching the selector “#cartTableError” exists, which shouldn’t happen if your cart is full of Stockholm awesomeness and ready to be shipped. With the test running automatically every day, all that was left for me to do was wait until Ghost Inspector emailed me about a test “failure,” which would mean the item was in stock.

Act 3

The fateful day. I got an email from Ghost Inspector that my test had failed. I hurriedly pulled up the Ikea site in my browser. The product page slowly revealed itself and I girded my loins for the moment I could click “buy.” But the buy button wasn’t there! The test failed because the website now correctly showed that the item was not for sale. Boo!

Test failed, but not the way I wanted
Test failed, but not the way I wanted

I had to write another test. Unfortunately, I couldn’t assert the non-existence of an element. Instead, I faked clicking the buy button (which apparently worked fine) and then tested whether the cart was empty when I clicked through to it. The test looked like this:

Apparently this is a stupid test, because it is actually still passing. But Ghost Inspector has a feature where it takes a screenshot after every passing test and compares it to the next run. In this case, the screenshot diff failed and I was notified a few days ago. It turns out that the TV stand is actually in stock!

A comparison of screenshots.
A comparison of screenshots.


I’ve ordered the TV stand and it should be on its way by the end of the month. Many thanks to Ghost Inspector for creating a free tier for this service so that furniture aficionados like me can automate their obsessive-compulsive shopping. But in future versions, please support:

  1. More specific selection of CSS selectors when using the Chrome recording tool.
  2. Negative assertions that check whether an element doesn’t exist. (I can probably do this by evaluating JavaScript, but tests take so long to run that it’s hard to debug.)

If you’re interested in creating automated tests for your website (or in getting notifications for furniture availability), definitely give Ghost Inspector a shot.

Graphing My Cycling Progress on Strava and Automating With Zapier

Since I moved to a new home that’s closer to work, I’ve been riding my bike to commute at least a few times each week. My commute to work is very easy as it’s almost completely downhill. I pay for it on the way back, though. I’ve been trying to get healthier, so commuting with my bike has definitely become a priority. In addition to having fun, I’d also like to motivate myself to ride more often and push myself to get better at it (especially the uphill part).

I’ve been using Strava to track my rides, and it turns out that my ride home includes a user-generated segment that automatically tracks my performance when I’m going uphill on Liberty. The first time I rode, it took 10:44 for me to complete. My best time so far is 6:58. The cool part is that I’m able to see the progress I’ve made, and I really do feel accomplished when I beat one of my best times. Here’s the view I’m using to see what my times are:

Strava Segment

This isn’t the easiest table to read, but you can see that the dates sort of correlate with my times, as the fastest one is in September and the ones in August are slower on average. While beating my fastest time is a good motivator, it’s not realistic to try to break my record every time I ride. I’d rather see consistent improvement as a motivator.

To visualize this, I decided to grab all of the times by viewing the full leaderboard (with just my results). I gathered the times and dates and threw them into a Google spreadsheet to visualize my trend. Here are the results so far.

Strava Trend

Success! It looks like I made some really good progress since I started in July, and my times have been inching downward as I get better at cycling uphill. The times also probably vary a bit because there are some stop signs and a traffic signal within the segment, which can slow me down.

The next problem I wanted to tackle was the data entry. Since it’s a pain to update the spreadsheet each time I ride, I wanted to automate the addition of new efforts (the term Strava uses to describe single instances of activities on a segment). This is where things get interesting.

Because I’m storing the efforts in a Google spreadsheet, and because Strava has an API, I just need to connect the two. Unfortunately, my go-to choice for this kind of thing, IFTTT, doesn’t have Strava as a channel. Luckily, I can do something similar with Zapier (btw, that’s a referral link, so be sure to click it so I can get more tasks), which is sort of like IFTTT but costs money (albeit with a nice free tier) and has more integrations. You can also set up your own integration, which is what I had to do with Strava.
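If you’d rather hit the API directly than go through Zapier, the relevant call at the time was the segment’s all_efforts endpoint, filterable by athlete_id. Here’s a rough sketch using only the standard library; the segment ID, athlete ID, and access token are placeholders you’d fill in from your own Strava account:

```python
import json
import urllib.parse
import urllib.request

API = "https://www.strava.com/api/v3/segments/%d/all_efforts"

def efforts_request(segment_id, athlete_id, token):
    """Build the request for one athlete's efforts on a segment."""
    url = API % segment_id + "?" + urllib.parse.urlencode(
        {"athlete_id": athlete_id, "per_page": 200})
    return urllib.request.Request(
        url, headers={"Authorization": "Bearer %s" % token})

def fetch_efforts(segment_id, athlete_id, token):
    """Return the decoded JSON list of efforts."""
    with urllib.request.urlopen(
            efforts_request(segment_id, athlete_id, token)) as resp:
        return json.load(resp)
```

Each returned effort includes fields like start_date_local and elapsed_time, which is all the spreadsheet needs.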

In order to get the list of my efforts on that particular segment, I had to create a few Zapier “triggers.” One to get my Strava user ID (for use in other API endpoints), another to grab my starred segments (so I could specify which segment I wanted to track), and finally a trigger to listen for any new efforts on my commute’s segment (limited to efforts created by me). I also had to post-process the last trigger so that I could get the date in a format that works in Google Spreadsheets. The result looks something like this:
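Two small pieces of that post-processing can be shown concretely: converting Strava’s ISO 8601 timestamps into something Google Spreadsheets parses as a date, and formatting an effort’s elapsed_time (seconds) as m:ss. This is my own sketch of the transformation, not Zapier’s actual code:

```python
from datetime import datetime

def sheet_date(iso_ts):
    """Strava returns ISO 8601 UTC timestamps; Google Spreadsheets is
    happier with a plain mm/dd/yyyy date."""
    dt = datetime.strptime(iso_ts, "%Y-%m-%dT%H:%M:%SZ")
    return dt.strftime("%m/%d/%Y")

def mmss(elapsed_seconds):
    """Format an effort's elapsed_time (in seconds) as m:ss."""
    minutes, seconds = divmod(elapsed_seconds, 60)
    return "%d:%02d" % (minutes, seconds)
```

For example, mmss(644) gives the first ride’s 10:44 and mmss(418) gives the current best of 6:58.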

Strava Zapier

Now, whenever I ride home on my bike while recording the segment with Strava, the effort will be automatically logged and graphed on my Google Spreadsheet! Apparently, I can embed the chart, so here it is!


While the Zapier integration had a somewhat steep learning curve, it’s nice to just set the “zap” and then forget about it. Any new integrations I might need to write will also go much quicker. As always, Runscope was a really useful tool for exploring the Strava API and getting real responses from the API to play around with. Finally, I learned the right way to spell athlete rather painfully (after misspelling athlete_id a billion times and wondering why my request wasn’t filtering correctly)!

I went from seeing some sketchy looking progress in my Strava results table to being able to visualize it on a graph, while also setting the graph up to update whenever I record a new effort! Automation for the win!

If you’re interested in the details of the integration, or if you want to try it out yourself, let me know and I can invite you as a tester. If there’s a lot of interest, I can also go over the creation of the integration, as I ran into a few gotchas while building it.