Android Wear Design Contest!

Last week I went to a GDG Ann Arbor Android meetup on Android Wear. The presentation was set up by Google and had a bunch of information on the upcoming Android Wear SDK. At the end, it was announced that there would be a few vouchers for Android Wear devices available. A design contest was announced, and the deadline was set for Sunday.

Being naturally competitive and really into new technology, I made a design mockup for a home automation app that I thought would be useful to have on a wearable device (the most common form factor happens to be a watch). Here’s my submission that I originally posted to Google Plus:

Home Arrival

The app uses the user’s location as a way to provide feedback only when it’s necessary, and to stay out of the way when it isn’t. In this example, the app can detect when a user is arriving home and can ask if they want to switch the lights on.

Voice Controls

Instead of switching individual lights on and off, the user can create preset groups and activate them with voice commands. Users can set commonly used groups like “living room” and “kitchen.”

Sleepytime Reminder

Night owl users can set reminders to go to bed, and the app can turn off lights if the user decides to stay up late.

Motion Sensor

The app can also use sensors to detect when motion is occurring in the home and alert the user if no one is supposed to be at home. The user can decide whether the motion is a false alarm or if further action should be taken.

There is obviously much more that can be done with a home automation app, but these are just a few scenarios that would work well on an Android Wear device. I’m looking forward to experimenting on Android Wear when it becomes available!

I heard this week that my submission was a winning entry, so I’ll get an Android Wear device as soon as they go on sale sometime this summer. Thanks to Google and the GDG group here in Ann Arbor for setting up the event!

Inspecting and Debugging API Versions with Runscope

This past month at FarmLogs, we implemented a versioning scheme for our internal API. API versioning is often a difficult transition to make, especially if you’re using a third-party API and dealing with deprecations (Facebook, I’m looking at you!). It’s often a chore to figure out exactly what has changed between API versions unless you have very good documentation.

Incidentally, Runscope also has a service called API Changelog that can send you notifications when a third-party API changes. Luckily for me, any changes to our internal API were subject to review by our team of engineers, so if I had any questions or concerns, I could bring them up in person. Even so, when dealing with a changing API, the truth is in the response, not necessarily the documentation (or Hipchat or Hackpad). For a while, I was updating the iOS client to the new version by copying and pasting the two responses into a text editor and looking at the differences. About halfway through, I remembered that this functionality is built into Runscope, so I started using that. I’ll show you how to compare diffs of two responses so you don’t have to waste half of your time like I did!


To follow along with this blog post, you’ll need just two things: a Runscope account and some versioned API data.

Some versioned API data

I’m not going to post the inner workings of the FarmLogs API (though it’s pretty easy to reverse engineer, like many other APIs) since that’s probably confidential knowledge or something. Instead, I’ll use the Foursquare API, which is one of my favorites to work with and (in my opinion) has some of the best documentation around. Browsing around their developer page, I found this interesting summary of API changes over the years.

There was a pretty huge change on 6/9/12 that included many breaking changes. Sounds like fun! I actually remember getting caught in this change and having to update my iOS auto-check-in app to fit the new API. Let’s pick an endpoint and see what the data used to look like. We’ll be able to compare it to the modern version of the API.

For our example, we’ll hit the /venues/explore/ endpoint. This endpoint takes a lat/long value and returns a list of suggested venues to try. If you are authenticated, it can give you personalized info about why those venues were suggested (like if your friends went there and liked it). If you go to this page, you can get a URL for that endpoint with an OAuth token added for you automatically.
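Pieced together, the request we’ll be working with looks roughly like this (the ll coordinates here are Ann Arbor, and OAUTH_TOKEN stands in for the token that page generates for you):

```
https://api.foursquare.com/v2/venues/explore?ll=42.2808,-83.7430&v=20140601&oauth_token=OAUTH_TOKEN
```

The v parameter is the version date, and it’s the one we’ll be changing in a minute.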

Hitting the endpoint in Runscope

Next we’ll copy and paste the Foursquare API URL into Runscope. You can create this request in a default bucket or make a new one specifically for this test. Just hit the “New Request” button and paste the URL in. Hit the “Launch Request” button to have Runscope make the API request on your behalf and save all of the data for you to compare later. If all goes well, you should see the response appear below the “Launch Request” button.

Next, we’ll edit the request so that the format matches the API before the big change on 6/9/12. Foursquare uses a pretty awesome convention for API versioning that uses dates for versions. To edit the request, simply hit the pencil icon on the upper right part of the response. You’ll be taken to the request editor. From there, change the “v” parameter to 20120608 (my 29th birthday!). Hit “Launch Request” again and Runscope will fetch the response from Foursquare as if the client were using a really old version of the API. After a short wait you should see another response appear. You can either hit the “Compare to Original” button or go to the “All Traffic” section to pick two requests to compare.

Comparing two versions of the same request

Finally we’re at the meat and potatoes of this post! Let’s explore the difference that two years can make to an API endpoint. (For all of these screenshots, the older version is on the left.)

The first thing to notice is that Foursquare is really polite about their deprecation warnings. I wonder how standard their meta dictionary with an error message is. I suppose it works well if you have humans reading your API (as we are now), but I’m going to assume most clients don’t care about the meta error being shown in this deprecated API call. I wonder if there’s an HTTP status code for “OK now but will be deprecated soon.”

Another thing to notice is that there is a new “reasons” dictionary for each venue suggestion that lists friends who have gone to that venue. This dictionary didn’t exist before the 6/9/12 API update, but it now shows up even when you request the older version, probably for compatibility reasons.

The tips section also seems to have lost some info about the user giving the tip. I’m not sure if this is for privacy reasons or something else. Another thing I noticed was that the older version included a tip that I created myself, while the newer version removed it (probably because the authenticated user isn’t interested in their own tips when exploring).

Finally, there’s the terrible implementation of icon images for categories that Foursquare provides. The old format had a prefix, a file format, and an array of possible sizes for the icon. You’d have to piece the three together like the silver monkey statue to get an icon for your app. The newer version did away with the list of sizes (now you just choose a size from 32, 44, 64, and 88).



As you can see, Runscope makes it really easy to compare the differences between API versions. Hopefully all of your API version migrations will be smooth and come with months of warning, but in case they don’t, Runscope can be a pretty handy tool to get a handle on what’s changed. If you have any questions or comments on this or other tools to ease the pain of API version migration, please let me know by posting a comment!

Ideas and Recipes For Home Automation With The Internet of Things

A few weeks ago my friend Emily asked me to speak at the Ann Arbor Mini Maker Faire for their speaker series. I said “sure” and figured I’d talk about something like app development or design. I ended up picking a more “Maker Faire-y” topic since I thought it would work better for the crowd.

I’ve been really getting into home automation and buying a lot of “internet of things” products lately. I decided to do an introductory talk on home automation along with ideas on how people can get up and running quickly. I did the talk last weekend and had a great turnout. Some people asked for more information so I thought I’d write up a blog post that more or less summarized the talk. So here it is!

What exactly is the “Internet of Things?”

“The Internet of Things” is a term that gets thrown around a lot but doesn’t seem to have an official definition. I like to think of it as the idea that, as devices gain more intelligence through the ability to communicate wirelessly and over the internet, they become more useful. In other words, things will form their own internet and will generally be awesome. I was reminded of the video game Megaman Battle Network, where literally everything, including a microwave, has a port to jack into (and can thus catch a virus).

It also means that your refrigerator might be able to tweet when you run out of milk, or stream music via Pandora.

Why am I hearing so much about the IoT right now?

In the past few decades, we’ve seen the cost of producing software go way down. Starting a software business simply costs less today than it did in the 90s. While hardware always brings more upfront costs, it seems like commodity hardware and Kickstarter are making it easier for smaller companies to get into the hardware game. On top of that, wireless technologies like Bluetooth LE and low-power Wi-Fi are enabling devices to talk to one another as well as to the internet. In the past few years we’ve seen large companies like Belkin get into this space, as well as smaller startups like Nest (though they were acquired by Google pretty recently).

Some things you can accomplish

I’m just going to list a few use cases that I consider to fall in the category of “Internet of Things.” After I give an initial survey, I’ll go into ways that you can use these off-the-shelf solutions to automate your home.

Turn on/off light switches


One of the simplest use cases for home automation is turning off/on light switches. You can do this with the Belkin Wemo (which is what I use) or with something like Ninja Blocks or SmartThings.

Detect Motion


You can use Dropcam to detect motion with video, or Wemo / SmartThings to detect motion events. This is mostly useful indoors, and only if you don’t have pets, since they can set off false alarms.

Detect when doors/windows are open


You can use SmartThings to detect when a door or window has been opened. This could be useful for knowing if you left a window open while it’s raining, though it would be even more helpful if it could close the window for you automatically.

Detect water moisture / floods


Apparently it really sucks to have a flood in your basement. It might be worth the extra money to install a SmartThings moisture detector to tell you if there’s a water leak or flood.

Lock or unlock your front door


I ordered a Lockitron a really long time ago and am happy to announce that it should be shipping soon! I also found a product by Schlage that seems to do something similar.

Keep track of your driving

Automatic helps you keep track of your MPG and total distance traveled by connecting to the data port of your car.

Don’t kill your plants


This is a fairly silly product called the Parrot Flower Power that can remind you to water your plant when the soil gets too dry. It’s probably more cost effective to just buy a new plant if yours dies from under- or over-watering, but whatever.

Keep track of how many eggs you have left


The Quirky Egg Minder keeps track of how many eggs you have in your fridge.

Recipes with IFTTT / Zapier

IFTTT (if this then that) is an online tool that makes it easy for non-programmers to connect different internet and mobile services to each other. Think of it as glue that can tie all of your IoT devices together via triggers and actions. A trigger might be that your Wemo has detected motion, and an action might be to open your garage door. IFTTT works well with many of the devices I’ve listed previously and offers an iOS and Android app that can hook into things like your location and photos.

Zapier is another tool that is similar to IFTTT but a bit more complicated and expensive to use. Some recipes are only compatible with Zapier (like Lockitron) so you might need to sign up for that service as well.

I’ll share a few recipes on IFTTT or Zapier that you can use to automate your home with your newfound gadgets.

Play the radio when a burglar enters your home

This recipe is possible without using IFTTT (and is probably more secure since it doesn’t require internet). I set up my motion detector to turn on a switch that’s connected to a portable radio. During certain hours when I’m not at home, if someone enters my house, the radio will start playing. The hope here is that the noise will be a deterrent to the invader. If I were in a Home Alone mood, I could also connect the switch to a tape player that played scenes from a mobster movie.

 Make sure your doors are locked when you leave home

When your IFTTT mobile app detects that you are leaving your home, you can send an email that Zapier can use to connect to Lockitron to ensure that your doors are locked. This is a pretty convoluted recipe since it uses both services, but there isn’t really a better way to do this, unless you want to simply lock your door with the Lockitron app when you leave.

Automatically start the coffee maker by waking up

IFTTT Recipe: “Turn my #coffee maker on the moment I wake up” (connects the Jawbone UP to the WeMo Switch)

This is a really cool IFTTT recipe I found that was shared by another user. The Jawbone UP 24 has a feature that can wake you up at the best time in your sleep cycle in the morning. The UP can send a message the moment you wake up to IFTTT that can turn on a switch that’s connected to your coffee maker. Voila, your coffee is ready as soon as you get out of bed.

Log all of your driving trips in Google Docs

IFTTT Recipe: “Keep track of all my driving trips” (connects Automatic to Google Drive)

I use this recipe to keep track of total miles driven and my fuel efficiency. It’s not particularly useful for me right now, but if I were a taxi driver (or drove for Lyft or UberX), it could be pretty handy for tax purposes.

DIY Home Automation

One thing I haven’t really touched on is creating DIY projects to automate your home. I’m a big fan of home-grown projects like this DIY Lockitron using Arduino. I recently bought a Raspberry Pi to act as an iBeacon in my home. I’m trying to create an app that’s aware of which room in my house I’m in (or at least which room my iPhone is in). I’ve heard that the best way to predict the future is to invent it, so if there’s anything in this space that you haven’t seen available, the best solution is to build it yourself!

If you have any interesting projects or recipes you’d like to share, I’d love to hear about them. I am about to move into a new house, so the automation potential is about to grow a lot more for me (versus living in an apartment). Good luck and try not to get any viruses in your microwaves!

Writing Google Glass Apps With the GDK Sneak Peek

Since the beginning of time, humans have wanted to mount small computers on their heads. In 2013 we got our wish, as Google started releasing Glass to explorers and developers alike. In November of 2013, Google released a “Sneak Peek” of the GDK, a way to write native code for Google Glass.

While the GDK is still under development and will probably change quite a bit before being finalized, those on the cutting edge will want to familiarize themselves with it.

In this quick start guide, I’ll go through the steps of writing your first GDK app, “Where Am I?” This app will ensure you are never lost. By simply shouting “Okay Glass, where am I?” the app will find your latitude and longitude, and try to find the nearest street address.


In order to start writing native apps for Google Glass, here’s what you’ll need:

  • Google Glass! There is currently no way to emulate the device and its sensors, so getting a real device is a must. You can request an invite directly from Google here, or search on Twitter for someone with a spare invite to share.
  • ADT – Android Development Tools is basically a plugin for Eclipse that lets you compile and load your native code onto Glass.
  • Source Code (optional) – I’ve included the source code for this app on GitHub here. If you want to follow along without copying and pasting a bunch of code, you can simply skip back and forth by running git commands that I’ll describe later on.

Getting Started

Google Glass runs on a special version of Android: the “Glass Development Kit Sneak Peek,” a tricked-out version of API 15. You’ll need to download this version with the Android SDK Manager within ADT (you can open it by clicking on the icon of the Android with a blue box and an arrow on it).

Glass Sneak Peek

Once you’ve done that, you’ll need to create a new Android project with the correct SDK settings. Use API 15 for Minimum and Target SDK, and set “Compile With” to the GDK Sneak Peek.

Congratulations! You’ve created a project that should run on Google Glass! If you try running the app on your device, it may or may not work, since it won’t have a voice trigger associated with it yet. We’ll add a voice trigger in the next step.

Adding a Voice Trigger

If you’re following along with git, you can type the following command to fill out all the code for this step:

Because Google Glass is such a new paradigm for user input, we’ll need to set up a voice trigger to allow the user to start our app. To do this, we’ll add some strings to our strings.xml and add an XML resource that describes the voice trigger. Finally, we’ll add an intent filter to our AndroidManifest.xml to register the voice command.

1. In res/values/strings.xml
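Something like the following should do. The resource names here are my guesses at sensible ones; any names work as long as the trigger XML in the next step points at the same string:

```xml
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <string name="app_name">Where Am I?</string>
    <!-- The phrase the user will speak after "Okay Glass" -->
    <string name="glass_voice_trigger">where am i</string>
</resources>
```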

2. In res/xml/whereami.xml (you can name this file anything as long as you refer to the same file in AndroidManifest.xml):
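The trigger file itself is tiny. In the Sneak Peek it’s a single trigger element pointing at a string resource (this sketch assumes the glass_voice_trigger name from step 1):

```xml
<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="@string/glass_voice_trigger" />
```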

3. In AndroidManifest.xml:
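Inside your activity’s entry, add an intent filter for the GDK’s VOICE_TRIGGER action and a meta-data element pointing at the XML file from step 2. The activity name is whatever your project generated; MainActivity is my assumption:

```xml
<activity
    android:name=".MainActivity"
    android:label="@string/app_name">
    <intent-filter>
        <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
    </intent-filter>
    <!-- Points at res/xml/whereami.xml from step 2 -->
    <meta-data
        android:name="com.google.android.glass.VoiceTrigger"
        android:resource="@xml/whereami" />
</activity>
```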

Now, if you run your code on Glass, you should be able to find the “Where Am I?” command in the Ok Glass menu!

Glass Voice Trigger

Using Glass Cards

We’ve gotten voice commands working, and that’s half the battle. Next we’ll learn how to use Cards, the native user interface for Glass. We’ll need to remove the existing style, then instantiate a Card object, set up its text and set it as the content view.

1. Remove the line in AndroidManifest.xml that reads:
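If you created the project from the standard ADT template, the line to remove from the application element is most likely the theme attribute (your style name may differ):

```xml
android:theme="@style/AppTheme"
```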

2. Import Card from the GDK and create a new Card instance in onCreate():
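A minimal sketch of that onCreate() (MainActivity is my assumed activity name, and the placeholder text is arbitrary):

```java
import android.app.Activity;
import android.os.Bundle;

import com.google.android.glass.app.Card;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Build a native Glass card and use its view as our content view.
        Card card = new Card(this);
        card.setText("Finding your location...");
        setContentView(card.toView());
    }
}
```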

If you run the app, you should see something like this:

Glass Card

Getting Location and Updating the Card

If you’re following along with git, you can type the following command to fill out all the code for this step:

Finally, we want to get Glass’s location and update the card with the appropriate info. Our lost Glass Explorer will be found and all will be well. We’ll do this by requesting location permissions, setting up a LocationManager, and handling its callbacks by updating the card with latitude, longitude, and a geocoded address, if available.

1. Ask for permissions for user location and internet in AndroidManifest.xml
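Both permissions go in the manifest, outside the application element: fine location for the fix itself, and internet access for the geocoding lookup.

```xml
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />
```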

2. Create a LocationManager that uses certain criteria to figure out which location services it can use on Glass in
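Glass doesn’t have its own GPS chip, so rather than asking for a single “best” provider, a sketch like this (assuming the activity implements LocationListener) requests updates from every enabled provider that meets our accuracy criteria:

```java
LocationManager locationManager =
        (LocationManager) getSystemService(Context.LOCATION_SERVICE);

Criteria criteria = new Criteria();
criteria.setAccuracy(Criteria.ACCURACY_FINE);

// Request updates from every matching provider, since Glass may be
// relaying location data from a paired phone.
List<String> providers = locationManager.getProviders(criteria, true);
for (String provider : providers) {
    locationManager.requestLocationUpdates(
            provider, 1000 /* ms */, 1 /* meters */, this);
}
```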

3. Implement callbacks that will update the card when you get a location update in
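Here’s roughly what the interesting callback might look like, assuming the Card from earlier is kept in a field named mCard (the other LocationListener callbacks can be left empty):

```java
@Override
public void onLocationChanged(Location location) {
    double lat = location.getLatitude();
    double lng = location.getLongitude();
    StringBuilder text = new StringBuilder(
            String.format("Lat: %.5f, Long: %.5f", lat, lng));

    // Try to reverse-geocode a street address; this needs a network connection.
    try {
        Geocoder geocoder = new Geocoder(this);
        List<Address> addresses = geocoder.getFromLocation(lat, lng, 1);
        if (addresses != null && !addresses.isEmpty()) {
            text.append("\n").append(addresses.get(0).getAddressLine(0));
        }
    } catch (IOException e) {
        // Geocoding failed; fall back to just showing coordinates.
    }

    mCard.setText(text.toString());
    setContentView(mCard.toView());
}
```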

The app should now start getting location updates on the Glass (it can take a while to get a good fix) and update the Card once it finds you. The result should look something like this:

Glass Card Complete

Congratulations! You’ve finished your first Glass app. Future generations will wonder how they ever knew where they were without a head mounted computer printing out their exact location on the earth into a glowing prism directly above their field of vision at all times!

For more information about the GDK, check out Google’s official documentation! Glass on!


Google Glass Megapost (+ win an invite!)

Here’s a giant post of stuff I’ve been doing with Google Glass! I made a survey, wrote an app and am running a contest, so that’s all bundled into this blog post.


I got a Google Glass invite after signing up at Google’s official waitlist. After deciding I would go ahead and get one, I was paralyzed by the choice of which color to order. I made a survey that asked participants to rank the five colors in order of awesomeness. The results surprised me. I was leaning towards white (cotton) and sky blue, but everyone seemed to really like tangerine (orange).

glass survey
Tangerine wins by a landslide!

I was going to go against the popular vote and get the white one, but both the white and black ones were out of stock at the time, so I went for the fan favorite. I ended up getting the Tangerine device the day after I ordered it (next day shipping FTW).


My intention in getting Google Glass is to try it out and see if I can make anything really cool with the additional sensors and the fact that the device is worn on a person’s head. I was really struggling with what to start working on, and then it hit me: a fart app. I was actually working on a Dragonball Z Scouter app, but that was going to be too complicated. I found that you can add your own “Okay Glass” prompts to start an app, so I figured I’d make a hello-world fart app.

The history of fart apps on advanced mobile operating systems is a long one, so I won’t go over all that. Suffice it to say I am proud to have written the first (as far as I know) native fart app for Google Glass. Here’s a video of me demonstrating it:

Having played around with the GDK a bit, and the device a bit more, I am still not sure if I will keep it. The $1,500 price tag is one factor. The public acceptance of the device is another. My opinions on my previous Glass post still hold. Even if I end up returning it, the opportunity is definitely worth the $1,500 “loan” to Google plus return shipping.


Before I even got my first Google Glass invite, I went to a GDG Android meeting in Ann Arbor. The organizer had some spare Glass invites, so I asked for one. I hadn’t heard back about that invite, but after ordering my Glass from my original invite, I got another one from the GDG group’s pool. I figured I’d give it away, and to drum up interest in the fart app, I’m going to post a tweet and ask users to retweet it for a chance to win.

Here are the rules:

  • Enter the contest by retweeting this tweet.
  • You can only enter once per Twitter account.
  • Don’t make multiple Twitter accounts to enter or you’ll be disqualified.
  • The contest is for the invite code. You still have to buy the Glass yourself.
  • The contest will end on Sunday, Dec 8th at 11:59PM ET.
  • The contest winner will be chosen at random and will get a DM with the code (you should also follow me so I can DM you).
  • The invite code expires some time on Dec 10th, so you’ll have a couple of days to order it.
  • I’m not liable for anything. Like if you’re mad about getting an invite and want to sue me or something.
  • If no one enters I’ll probably just tweet out the code and the first person to use it gets it!

Here are Google’s rules for eligibility (winners must):

  • be U.S. residents
  • be 18 years or older
  • purchase Glass
  • provide a U.S. shipping address OR pick up their Glass at one of our locations in New York, San Francisco or Los Angeles