Impressions of Android Wear (LG G Watch) with my Nexus 5

I finally got the LG G Watch that I won in a contest about a week ago. I replaced my Pebble smartwatch with the Android Wear device and swapped my iPhone for my spare Nexus 5. It’s been about four days since I made the switch, so I thought I’d write up my impressions.

The Watch


The LG G Watch is one of the two Android Wear devices currently available for purchase. My coworker Jason, who also won a watch in the same contest, got the Samsung model. We both would have rather had the Motorola watch, but it isn’t available until later this summer, so Google sent us these instead.

The LG watch is not particularly fashionable; it looks like a generic black rectangle with rounded corners. It comes with a pretty neat charging base that the watch attaches to magnetically. I would’ve preferred some kind of wireless charging, but this gets the job done, and the watch actually charges pretty quickly. The quick charge time is good, because the watch only lasts about 24 hours anyway.

The Software


The software for the LG watch is basically an enhanced version of the notification dock in Android. This actually mirrors Google Glass a bit, but the execution is less awkward because you’re not wearing a watch on your face. Each notification that lives on your phone will also appear on the watch. If the notification is enhanced, you might be able to take action on it through the watch. For example, you can reply to a text message or Google Hangout by dictating a message. This is probably the killer app of the watch, though I feel like talking to your watch is still going to be considered antisocial behavior.

Aside from notifications, you can also tell the watch to take notes, send e-mails, and set alarms. The watch still relies pretty heavily on your Android phone, though. Notes get saved to Google Keep (an app I’m not sure will stick around), and GPS navigation simply opens Google Maps on your phone. I’m guessing the watch will get smarter in the future, but for now I think that’s a pretty decent feature set.

I’ve been toying with the idea of writing an app that creates a persistent notification showing me how late the bus is going to be in the morning. Every morning I check the AATA website to see how late my bus is going to be. The information is contextual and time sensitive, and I only need it at that one time of day, so it makes sense as something that shows up on my watch when I need it and goes away when I don’t. Google Now could probably do this to some extent, but it’s not smart enough to know that I want to leave 5 minutes before my bus actually comes around, not when it’s “scheduled” to arrive.
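For the curious, here’s a minimal sketch of what that notification might look like on the phone side. Everything here is hypothetical: the route name, the notification ID, and the delay value (which would come from polling the AATA site) are all stand-ins.

```java
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;

public class BusDelayNotifier {
    private static final int BUS_NOTIFICATION_ID = 1; // arbitrary, app-defined

    // Posts (or updates) an ongoing notification; Android Wear mirrors it to
    // the watch automatically. delayMinutes would come from the AATA site.
    public static void showDelay(Context context, int delayMinutes) {
        NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                .setSmallIcon(android.R.drawable.ic_dialog_info) // placeholder icon
                .setContentTitle("Route 4")
                .setContentText("Running " + delayMinutes + " minutes late")
                .setOngoing(true); // persistent until we cancel it ourselves
        NotificationManagerCompat.from(context)
                .notify(BUS_NOTIFICATION_ID, builder.build());
    }
}
```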

Android

In order to fully test out the Android Wear watch, I took my SIM card out of my iPhone 5S and put it into my Nexus 5. So far the transition hasn’t been too difficult as I mostly rely on Google services anyway. There aren’t really any apps on the iPhone that I haven’t been able to use on my Android phone.

I may have said this before in a previous blog post, but I really think the quality of Google’s services is catching up to the polish of the iPhone. Consider how bad Apple’s online services are, and how good Google’s are. I can live without iCloud and Apple’s email service (does anyone use their .me address as a real email address?), but I am locked into Gmail, both personally and professionally.

I’m guessing Google could sharply increase Android’s market share by simply withdrawing support for all Google services on iOS. Meanwhile, Apple is relying more and more on third parties to fill out the expertise it lacks (I mean, really, who the fuck uses WebObjects?). I can’t see a long-term scenario where Apple beats Google on services, either directly or by convincing third-party developers to do it for them. As for whether I’m actually switching to Android as my main device, I’d need to see what Apple has in store for its next generation. (Sadly, this LG G Watch is not nearly awesome enough to make me switch on its own.)

On top of Android’s awesome built-in services, the material design stuff coming out of this past I/O is looking pretty sweet. If Android can give iOS a run for its money in look and feel, and in the ease of creating really nice custom views and animations, then developers might just defect en masse. I guess the only thing Android needs to do now is get away from Java (if Apple can do this with Swift, I’m sure Google can figure it out), because developing in it really sucks the joy out of being a developer.

Anyway, enough reverse-fanboying. I think it’s time for the…

Conclusion

While I think Android Wear does a lot of things right and has a lot of potential, the watch really isn’t all that revolutionary. To put it in terms of the HBO show Silicon Valley, it’s not disrupting or making the world a better place through compression algorithms. It also isn’t a particularly stylish piece of metal and glass, so I wouldn’t wear it if it didn’t serve a purpose. Having said that, Google’s stuff is all about the integrated services, so I’m bullish: Google Now keeps getting smarter, and developers should eventually be able to hook into it, giving the watch a bit more life than it has right now. Would I buy it if I hadn’t won it? Maybe if I were already an Android user, but I tend to buy too many gadgets as it is.

Android Wear Design Contest!

Last week I went to a GDG Ann Arbor Android meetup on Android Wear. The presentation was put together by Google and had a bunch of information on the upcoming Android Wear SDK. At the end, it was announced that a few vouchers for Android Wear devices were available, along with a design contest whose deadline was that Sunday.

Being naturally competitive and really into new technology, I made a design mockup for a home automation app that I thought would be useful to have on a wearable device (the most common form factor happens to be a watch). Here’s my submission that I originally posted to Google Plus:

Home Arrival

The app uses the user’s location as a way to provide feedback only when it’s necessary, and to stay out of the way when it isn’t. In this example, the app can detect when a user is arriving home and can ask if they want to switch the lights on.

Voice Controls

Instead of switching individual lights on and off, the user can create preset groups and activate them with voice commands. Users can set commonly used groups like “living room” and “kitchen.”

Sleepytime Reminder

Night owl users can set reminders to go to bed, and the app can turn off lights if the user decides to stay up late.

Motion Sensor

The app can also use sensors to detect when motion is occurring in the home and alert the user if no one is supposed to be at home. The user can decide whether the motion is a false alarm or if further action should be taken.

There is obviously much more that could be done with a home automation app, but these are just a few scenarios that would work well on an Android Wear device. I’m looking forward to experimenting with Android Wear when it becomes available!

I heard this week that my submission was a winning entry, so I’ll get an Android Wear device as soon as they go on sale sometime this summer. Thanks to Google and the GDG group here in Ann Arbor for setting up the event!

Inspecting and Debugging API Versions with Runscope

This past month at FarmLogs, we implemented a versioning scheme for our internal API. API versioning is often a difficult transition to make, especially if you’re using a third-party API and dealing with deprecations (Facebook, I’m looking at you!). It’s often a chore to figure out exactly what has changed between API versions unless you have very good documentation.

Incidentally, Runscope has a service called API Changelog that can send you notifications when a third-party API changes. Luckily for me, any changes to our internal API were subject to review by our team of engineers, so if I had any questions or concerns, I could bring them up in person. Even so, when dealing with a changing API, the truth is in the response, not necessarily the documentation (or HipChat or Hackpad). For a while, I was updating the iOS client to the new version by copying and pasting the two responses into a text editor and eyeballing the differences. About halfway through I remembered that this functionality is built into Runscope, so I started using that instead. I’ll show you how to compare diffs of two responses so you don’t have to waste half your time like I did!

Prerequisites

To follow along with this blog post, you’ll need just two things: a Runscope account, and some versioned API data.

Some versioned API data

I’m not going to post the inner workings of the FarmLogs API (though it’s pretty easy to reverse engineer, like many other APIs) since that’s probably confidential knowledge or something. Instead, I’ll use the Foursquare API, which is one of my favorites to work with and (in my opinion) has some of the best documentation around. Browsing around their developer page, I found this interesting summary of API changes over the years.

There was a pretty huge change on 6/9/12 that included many breaking changes. Sounds like fun! I actually remember getting caught in this change and having to update my iOS auto-check-in app to fit the new API. Let’s pick an endpoint and see what the data used to look like, and then compare it to the modern version of the API.

For our example, we’ll hit the /venues/explore/ endpoint. This endpoint takes a lat/long pair and returns a list of suggested venues to try. If you are authenticated, it can give you personalized info about why those venues were suggested (like whether your friends went there and liked it). If you go to this page, you can get a URL for that endpoint with an OAuth token added for you automatically.
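The URL you end up with should look roughly like this (the coordinates and token here are placeholders, and the “v” parameter will be whatever the current version date is):

```
https://api.foursquare.com/v2/venues/explore?ll=42.2808,-83.7430&oauth_token=YOUR_TOKEN&v=20140601
```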

Hitting the endpoint in Runscope

Next, we’ll copy and paste the Foursquare API URL into Runscope. You can create this request in a default bucket or make a new one specifically for this test. Just hit the “New Request” button and paste the URL in. Hit the “Launch Request” button to have Runscope make the API request on your behalf and save all of the data for you to compare later. If all goes well, you should see the response appear below the “Launch Request” button.

Next, we’ll edit the request so that it matches the API before the big change on 6/9/12. Foursquare uses a pretty awesome convention for API versioning that uses dates as version numbers. To edit the request, simply hit the pencil icon on the upper right of the response, and you’ll be taken to the request editor. From there, change the “v” parameter to 20120608 (my 29th birthday!). Hit “Launch Request” again and Runscope will fetch the response Foursquare serves to a client using a really old version of the API. After a short wait you should see another response appear. You can either hit the “Compare to Original” button or go to the “All Traffic” section and pick two requests to compare.

Comparing two versions of the same request

Finally, we’re at the meat and potatoes of this post! Let’s explore the difference that two years can make to an API endpoint. (For all of these comparisons, the older version is on the left.)

The first thing to notice is that Foursquare is really polite about its deprecation warnings. I wonder how standard their meta dictionary with an error message is. I suppose it works well if you have humans reading your API (as we are now), but I’m going to assume most clients don’t care about the meta error shown in this deprecated API call. I wonder if there’s an HTTP status code for “OK now, but will be deprecated soon.”

Another thing to notice is that there is a new “reasons” dictionary for each venue suggestion that lists friends who have gone to that venue. This dictionary didn’t exist before the 6/9/12 update, but it now shows up even when you request the older version, probably for compatibility reasons.

The tips section also seems to have lost some info about the user giving the tip. I’m not sure if this is for privacy reasons or something else. I also noticed that the older version included a tip I created myself, while the newer version removed it (probably because the authenticated user isn’t interested in their own tips when exploring).

Finally, there’s the terrible implementation of icon images for categories that Foursquare provides. The old format had a prefix, a file format, and an array of possible sizes for the icon; you’d have to piece the three together like the silver monkey statue to get an icon for your app. The newer version did away with the list of sizes (now you just choose a size from 32, 44, 64, and 88).
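To make that concrete, here’s a rough Java illustration of the two icon formats. The URLs and field values are made up for illustration; only the shape of the two schemes matters.

```java
// Old format: the response gave a prefix, an array of sizes, and a file
// extension, and the client glued all three together itself.
String oldPrefix = "https://foursquare.com/img/categories/food/default_";
int[] sizes = {32, 44, 64, 88};   // advertised by the API
String extension = ".png";
String oldIconUrl = oldPrefix + sizes[2] + extension;

// New format: just a prefix and a suffix; you pick a supported size yourself.
String newPrefix = "https://ss1.4sqi.net/img/categories_v2/food/default_";
String suffix = ".png";
String newIconUrl = newPrefix + 64 + suffix;
```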

Conclusion


As you can see, Runscope makes it really easy to compare the differences between API versions. Hopefully all of your API version migrations will be smooth and come with months of warning, but in case they don’t, Runscope can be a pretty handy tool to get a handle on what’s changed. If you have any questions or comments on this or other tools to ease the pain of API version migration, please let me know by posting a comment!

Ideas and Recipes For Home Automation With The Internet of Things

A few weeks ago my friend Emily asked me to speak at the Ann Arbor Mini Maker Faire for their speaker series. I said “sure” and figured I’d talk about something like app development or design. I ended up picking a more “Maker Faire-y” topic since I thought it would work better for the crowd.

I’ve been really getting into home automation and buying a lot of “internet of things” products lately. I decided to do an introductory talk on home automation along with ideas on how people can get up and running quickly. I did the talk last weekend and had a great turnout. Some people asked for more information so I thought I’d write up a blog post that more or less summarized the talk. So here it is!

What exactly is the “Internet of Things?”

“The Internet of Things” is a term that gets thrown around a lot but doesn’t seem to have an official definition. I like to think of it as the idea that, as devices gain intelligence through the ability to communicate wirelessly and over the internet, they become more useful. In other words, things will form their own internet and will generally be awesome. I was reminded of the video game Mega Man Battle Network, where literally everything, including a microwave, has a port to jack into (and can thus catch a virus).

It also means that your refrigerator might be able to tweet when you run out of milk, or stream music via Pandora.

Why am I hearing so much about the IoT right now?

In the past few decades, we’ve seen the cost of producing software go way down; starting a software business simply costs less today than it did in the 90s. Hardware still brings more upfront costs, but commodity components and Kickstarter are making it easier for smaller companies to get into the hardware game. On top of that, wireless technologies like Bluetooth LE and low-power WiFi are enabling devices to talk to one another as well as to the internet. In the past few years we’ve seen large companies like Belkin get into this space, as well as smaller startups like Nest (though they were acquired by Google pretty recently).

Some things you can accomplish

I’m just going to list a few use cases that I consider to fall in the category of “Internet of Things.” After I give an initial survey, I’ll go into ways that you can use these off-the-shelf solutions to automate your home.

Turn on/off light switches


One of the simplest use cases for home automation is turning light switches on and off. You can do this with the Belkin Wemo (which is what I use) or with something like Ninja Blocks or SmartThings.

Detect Motion


You can use Dropcam to detect motion with video, or Wemo / SmartThings to detect motion events. This is mostly useful indoors, and only if you don’t have pets, since they can set off false alarms.

Detect when doors/windows are open


You can use SmartThings to detect when a door or window has been opened. This could be useful for knowing if you left a window open while it’s raining, though it would be even more helpful if it could close the window for you automatically.

Detect water moisture / floods


Apparently it really sucks to have a flood in your basement. It might be worth the extra money to install a SmartThings moisture detector to tell you if there’s a water leak or flood.

Lock or unlock your front door


I ordered a Lockitron a really long time ago and am happy to announce that it should be shipping soon! I also found a product by Schlage that seems to do something similar.

Keep track of your driving

Automatic helps you keep track of your MPG and total distance traveled by connecting to the data port of your car.

Don’t kill your plants


This is a fairly stupid product called the Parrot Flower Power that can remind you to water your plant when the soil gets too dry. It’s probably more cost effective to just buy a new plant if yours dies from under- or over-watering, but whatever.

Keep track of how many eggs you have left


The Quirky Egg Minder keeps track of how many eggs you have in your fridge.

Recipes with IFTTT / Zapier

IFTTT (If This Then That) is an online tool that makes it easy for non-programmers to connect different internet and mobile services to each other. Think of it as glue that ties all of your IoT devices together via triggers and actions: a trigger might be that your Wemo has detected motion, and an action might be to open your garage door. IFTTT works well with many of the devices I’ve listed, and offers iOS and Android apps that can hook into things like your location and photos.

Zapier is a similar tool, but a bit more complicated and more expensive to use. Some recipes are only possible with Zapier (like anything involving Lockitron), so you might need to sign up for that service as well.

I’ll share a few recipes on IFTTT or Zapier that you can use to automate your home with your newfound gadgets.

Play the radio when a burglar enters your home

This recipe is possible without IFTTT (and is probably more secure, since it doesn’t require the internet). I set up my motion detector to turn on a switch that’s connected to a portable radio. During certain hours when I’m not at home, if someone enters my house, the radio starts playing, and the hope is that the noise deters the intruder. If I were in a Home Alone mood, I could also connect the switch to a tape player looping scenes from a mobster movie.

Make sure your doors are locked when you leave home

When the IFTTT mobile app detects that you are leaving your home, it can send an email that Zapier uses to tell Lockitron to make sure your doors are locked. This is a pretty convoluted recipe since it chains both services, but there isn’t really a better way to do it, short of simply locking your door from the Lockitron app when you leave.

Automatically start the coffee maker by waking up

IFTTT recipe: “Turn my #coffee maker on the moment I wake up” (connects Jawbone UP to WeMo Switch)

This is a really cool IFTTT recipe shared by another user. The Jawbone UP24 can wake you at the best point in your sleep cycle in the morning, and it can also ping IFTTT the moment you wake up, which in turn flips on a switch connected to your coffee maker. Voila: your coffee is ready as soon as you get out of bed.

Log all of your driving trips in Google Docs

IFTTT recipe: “Keep track of all my driving trips” (connects Automatic to Google Drive)

I use this recipe to keep track of total miles driven and my fuel efficiency. It’s not particularly useful for me right now, but if I were a taxi driver (or drove for Lyft or UberX), it might be pretty handy for tax purposes.

DIY Home Automation

One thing I haven’t really touched on is building DIY projects to automate your home. I’m a big fan of home-grown projects like this DIY Lockitron built with an Arduino. I recently bought a Raspberry Pi to act as an iBeacon in my home; I’m trying to create an app that knows which room of the house I’m in (or at least which room my iPhone is in). I’ve heard that the best way to predict the future is to invent it, so if there’s anything in this space that you haven’t seen available, the best solution is to build it yourself!

If you have any interesting projects or recipes you’d like to share, I’d love to hear about them. I am about to move into a new house, so the automation potential is about to grow a lot more for me (versus living in an apartment). Good luck and try not to get any viruses in your microwaves!

Writing Google Glass Apps With the GDK Sneak Peek

Since the beginning of time, humans have wanted to mount small computers on their heads. In 2013 we got our wish, as Google started releasing Glass to Explorers and developers alike. In November 2013, Google released a “Sneak Peek” of the GDK, a way to write native code for Google Glass.

While the GDK is still under development and will probably change quite a bit before being finalized, those on the cutting edge will want to familiarize themselves with it.

In this quick start guide, I’ll go through the steps of writing your first GDK app, “Where Am I?” This app will ensure you are never lost: simply shout “Okay Glass, where am I?” and the app will look up your latitude and longitude and try to find the nearest street address.

Prerequisites

In order to start writing native apps for Google Glass, here’s what you’ll need:

  • Google Glass! There is currently no way to emulate the device and its sensors, so getting a real device is a must. You can request an invite directly from Google here, or search on Twitter for someone with a spare invite to share.
  • ADT – Android Development Tools is basically a plugin for Eclipse that lets you compile and load your native code onto Glass.
  • Source Code (optional) – I’ve included the source code for this app on Github here. If you want to follow along without copying and pasting a bunch of code, you can simply skip back and forth by running git commands that I’ll describe later on.

Getting Started

Google Glass runs a special version of Android, the “Glass Development Kit Sneak Peek” – a tricked-out version of API 15. You’ll need to download this version with the Android SDK Manager within ADT (you can open it by clicking the icon of the Android with a blue box and an arrow on it).

Glass Sneak Peek

Once you’ve done that, you’ll need to create a new Android project with the correct SDK settings. Use API 15 for the Minimum and Target SDK, and set “Compile With” to the GDK Sneak Peek.

GDK Settings

Congratulations! You’ve created a project that should run on Google Glass! If you try running the app on your device, it may or may not work, since it won’t have a voice trigger associated with it yet. We’ll add one in the next step.

Adding a Voice Trigger

If you’re following along with git, you can type the following command to fill out all the code for this step:

Because Google Glass is such a new paradigm for user input, we’ll need to set up a voice trigger to allow the user to start our app. To do this, we’ll add some strings to our strings.xml, add an XML resource that describes the voice trigger, and finally add an intent filter to our AndroidManifest.xml to register the voice command.

1. In res/values/strings.xml:
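Something like the following should do it (the exact string names and values here are my assumptions, so adjust to taste):

```xml
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <!-- app_name shows in the menu; voice_trigger is the phrase you say -->
    <string name="app_name">Where Am I?</string>
    <string name="voice_trigger">where am I</string>
</resources>
```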

2. In res/xml/whereami.xml (you can name this file anything, as long as you refer to the same file in AndroidManifest.xml):
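The trigger resource is tiny; it just points at the keyword string. A sketch based on the Sneak Peek’s documented trigger format:

```xml
<?xml version="1.0" encoding="utf-8"?>
<trigger keyword="@string/voice_trigger" />
```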

3. In AndroidManifest.xml:
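The manifest entry is the intent filter plus a meta-data tag pointing at the trigger resource. Something like the following, assuming your activity is called MainActivity:

```xml
<activity
    android:name=".MainActivity"
    android:label="@string/app_name">
    <intent-filter>
        <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
    </intent-filter>
    <meta-data
        android:name="com.google.android.glass.VoiceTrigger"
        android:resource="@xml/whereami" />
</activity>
```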

Now, if you run your code on Glass, you should be able to find the “Where Am I?” command in the Ok, Glass menu!

Glass Voice Trigger

Using Glass Cards

We’ve gotten voice commands working, and that’s half the battle. Next we’ll learn how to use Cards, the native user interface for Glass. We’ll need to remove the existing style, instantiate a Card object, set its text, and set it as the content view.

1. Remove the line in AndroidManifest.xml that reads:
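In a freshly generated project, that’s the default theme attribute on the application tag, which looks something like this (removing it lets Glass apply its own styling):

```xml
android:theme="@style/AppTheme"
```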

2. Import Card from the GDK and create a new Card instance in onCreate():
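Here’s a sketch of the whole activity using the Sneak Peek’s Card class; the placeholder text is mine:

```java
import android.app.Activity;
import android.os.Bundle;

import com.google.android.glass.app.Card;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Build a native Glass card and use it as this activity's content view.
        Card card = new Card(this);
        card.setText("Finding your location...");
        setContentView(card.toView());
    }
}
```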

If you run the app, you should see something like this:

Glass Card

Getting Location and Updating the Card

If you’re following along with git, you can type the following command to fill out all the code for this step:

Finally, we want to get the Glass’s location and update the card with the appropriate info. Our lost Glass Explorer will be found, and all will be well. We’ll do this by requesting location permissions, setting up a LocationManager, and handling its callbacks by updating the card with the latitude/longitude and a geocoded address, if available.

1. Ask for permissions for user location and internet in AndroidManifest.xml:
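These two permissions cover the location fix and the Geocoder’s network lookup:

```xml
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />
```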

2. In MainActivity.java, create a LocationManager that uses criteria to figure out which location providers are available on Glass:
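Glass can get fixes from several providers (including the paired phone), so the recommended pattern is to request updates from every provider that matches your criteria rather than a single “best” one. A sketch, assuming this runs at the end of onCreate() and that MainActivity implements LocationListener:

```java
import android.content.Context;
import android.location.Criteria;
import android.location.LocationManager;
import java.util.List;

// ... inside onCreate(), after setting up the Card:
LocationManager locationManager =
        (LocationManager) getSystemService(Context.LOCATION_SERVICE);

Criteria criteria = new Criteria();
criteria.setAccuracy(Criteria.ACCURACY_FINE);

// Ask every enabled provider that satisfies the criteria for updates.
List<String> providers = locationManager.getProviders(criteria, true);
for (String provider : providers) {
    locationManager.requestLocationUpdates(provider, 1000, 1, this);
}
```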

3. Implement the LocationListener callbacks in MainActivity.java so the card updates whenever you get a location fix:
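A sketch of the interesting callback, assuming the Card from earlier is kept in a field named card:

```java
import android.location.Address;
import android.location.Geocoder;
import android.location.Location;
import android.os.Bundle;
import java.io.IOException;
import java.util.List;

@Override
public void onLocationChanged(Location location) {
    double lat = location.getLatitude();
    double lng = location.getLongitude();
    String text = String.format("Lat: %.4f, Long: %.4f", lat, lng);

    // Try to reverse-geocode a street address; this needs a network connection.
    try {
        Geocoder geocoder = new Geocoder(this);
        List<Address> addresses = geocoder.getFromLocation(lat, lng, 1);
        if (addresses != null && !addresses.isEmpty()) {
            text += "\n" + addresses.get(0).getAddressLine(0);
        }
    } catch (IOException e) {
        // No address available; the lat/long alone will have to do.
    }

    card.setText(text);
    setContentView(card.toView());
}

// Required by LocationListener, but unused in this example.
@Override public void onStatusChanged(String provider, int status, Bundle extras) {}
@Override public void onProviderEnabled(String provider) {}
@Override public void onProviderDisabled(String provider) {}
```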

The app should now start getting location updates on the Glass (it can take a while to get a good fix) and update the Card once it finds you. The result should look something like this:

Glass Card Complete

Congratulations! You’ve finished your first Glass app. Future generations will wonder how they ever knew where they were without a head mounted computer printing out their exact location on the earth into a glowing prism directly above their field of vision at all times!

For more information about the GDK, check out Google’s official documentation! Glass on!