Tag Archives: iOS

WWDC 2016 Recap

[Image: WWDC badge]

As a beginning iOS developer, I remember learning a lot of the craft by trial and error, paired with many of the WWDC videos from years past. I was very lucky this year to 1) get chosen in the WWDC lottery and 2) work for a company that would support my trip to San Francisco. The week flew by and I went to some really interesting sessions. Here are the things I found most interesting.

Accessibility

I’ve been developing some accessibility features since starting at Starbucks, so I went to the What’s New in Accessibility session. The most interesting developer-facing features are the improved Accessibility Inspector (which lets you quickly run accessibility audits, even in the simulator) and custom rotors for secondary actions. I’m still pretty new to accessibility, but I imagine custom rotors will be nice, since I’ve been using a hack with the headings rotor to let users skip between content.

Swift APIs

Although I haven’t been writing much Swift at work (we’re just starting to adopt it), I went to the Swift API Design Guidelines session. It was interesting to note the conscious decisions made so that Swift reads grammatically, and how the designers are reconciling that with Swift’s need to stay compatible with Objective-C APIs. The talk itself was really interesting and worth watching in full, so I’m not going to try to summarize it here!
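As a tiny illustration of the “reads grammatically” idea (my own example, not from the session), the guidelines push method and argument names that form English phrases at the call site:

```swift
var friends = ["Ana", "Bo", "Cy"]

// Call sites read like sentences: "insert Dee at 1", "remove at 0".
friends.insert("Dee", at: 1)
friends.remove(at: 0)

// Non-mutating counterparts get "-ed"/"-ing" names: sorted() vs. sort().
let ordered = friends.sorted()
print(ordered)
```

The argument labels (`at:`) carry the grammar that Objective-C used to put in long selector names like `insertObject:atIndex:`, which is part of how the two worlds stay bridgeable.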

watchOS 3

As a person who actually bought an Apple Watch and still wears it every day, I would really like developers to take advantage of the performance gains in watchOS 3, not least of which is leaving apps in memory so they can load much faster! I was also really mesmerized by the power of rich notifications on the watch using SpriteKit and/or SceneKit. I’m worried that the watch is in a catch-22 right now: it’s not really worth the work to develop for it because there aren’t enough users, and people aren’t buying the watch because of the lack of good apps. Or it could be that the watch really just needs a good weather app and that’s it. Oh well.

Search APIs

I also went to a session about using the search APIs, NSUserActivity and CoreSpotlight. I’ve largely ignored these features in the past, but it would be pretty interesting to hook into the system in order to provide phone numbers or addresses to hand off to Maps or whatever other app the user is switching to.
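For my own future reference, the CoreSpotlight side of this is pretty small. A rough Swift sketch of indexing an item so the system can surface it (all the names and values here are made up by me, not from the session):

```swift
import CoreSpotlight
import MobileCoreServices

// Hypothetical example: index a contact so Spotlight can find it
// and the system can offer its phone number for handoff.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeContact as String)
attributes.title = "Ada Lovelace"
attributes.phoneNumbers = ["555-0100"]

let item = CSSearchableItem(uniqueIdentifier: "contact-42",
                            domainIdentifier: "contacts",
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error)")
    }
}
```

NSUserActivity is the complementary half: you attach a similar attribute set to the activity representing the screen the user is on, rather than indexing items up front.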

Other Stuff

I went to quite a few other sessions on things like iOS Photography (which I sort of just went to accidentally and stayed for) and iMessage integration and On-Demand Resources. Overall, the sessions were really great and I found it really nice to be able to attend a talk in person and absorb the knowledge without distractions. It’s true that the sessions are all available either streaming or online shortly after happening, but too often I’m watching a video while also working against some work deadline or only paying half attention. It’s also pretty neat to be among other iOS developers, and it’s kind of hard to believe that companies have a hard time hiring when you can see so many developers in one room!

Hair Force One

[Image: Hair Force One]

Craig Federighi has become one of the best parts of Apple keynotes, as he’s equal parts funny and informative. Whenever he showed up in the conference center, huge lines of people would gather to take a photo with him. He’s a really tall dude, and it’s really nice of him to put up with the random iOS developers who want to take a photo with him.

The Jacket

[Image: WWDC jacket]

I received what is apparently a WWDC tradition: a jacket with the year on it. I immediately felt weird about getting a nice jacket that basically screams “I am the most elite of the elite, because either I or my company can pay $1600 for me to go to a developer conference!” I spotted a few jackets from earlier years in the conference center, as though wearing one is a way to show that you’ve been in this game longer and you’re much cooler than the people who just got their first jacket this year. I am in the process of de-labeling my jacket with nail polish remover (ask me about it if you’re interested!) and I’ll hopefully get to wear it embarrassment-free once the weather gets cooler (it’s a really nice jacket!). I am hoping that Apple stops giving these jackets out at some point and does something like, I dunno, plant a tree for each attendee? Then again, we are talking about a company that makes objects that people literally worship.

The Company Store

[Image: the company store]

Speaking of products, there’s a company store set up in the conference center where you can buy shirts and baby stuff and hats with Apple logos on them. I was going to buy something, but I realized that you’re literally paying $40 for the logo, as the shirt itself is a fairly standard Bella+Canvas tee. Plus I already have a WWDC jacket that asserts my dominance as an iOS developer!

In Closing

Again, I feel incredibly lucky to have been able to go to WWDC this year, for the first time ever. The keynote and sessions were really great, and I got at least 3 ideas for future apps, which is a lot more than I’ve had in the past year or two. Hopefully I can go next year and do it all over again.

Design Patterns: iOS and Android – Dispatch Queues and AsyncTask

For the past few weeks, I’ve been learning Android and porting one of my iOS apps to the platform. While there are many differences between the two platforms, it’s also kind of interesting to see common design patterns between them. I figured I should write up the common design patterns that I notice and maybe help out some iOS developers who are also learning Android like me.

Today I’ll look at doing asynchronous tasks in each platform. While there are many ways of doing this in both platforms, I’ll take a look at Dispatch Queues for iOS and AsyncTask in Android, since that’s what I’ve been using lately.

In iOS, you can use the dispatch_async call to run code on a background dispatch queue. Say we get an NSArray of JSON objects and want to save them to a Core Data store. We can call dispatch_async on a background queue, process all of the objects, and then update the UI by calling dispatch_async again on the main queue:
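A sketch of that pattern in Objective-C (the helper methods and the table view are placeholders of mine; the shape of the two dispatch_async calls is the point):

```objc
// Process JSON off the main queue, then hop back to the main queue for UI work.
- (void)saveJSONObjects:(NSArray *)jsonObjects {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        for (NSDictionary *json in jsonObjects) {
            [self insertManagedObjectFromJSON:json]; // hypothetical Core Data helper
        }
        [self saveContext]; // hypothetical save of the managed object context

        // All UI updates must happen on the main queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.tableView reloadData];
        });
    });
}
```

Note that if you go this route with Core Data, the background work should use a context belonging to that queue rather than the main thread’s context.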

In Android, performing a lightweight asynchronous task requires you to subclass AsyncTask. I guess I’m using the term “lightweight” loosely because creating a subclass just to do something asynchronously seems a bit heavy, but at least you get to reuse your code!

You must define three generic types, which describe the input type (in this example, a String), the progress type (an Integer), and the result type (a Boolean).
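A sketch of such a subclass (the class name and method bodies are my own invention; doInBackground runs off the main thread, while the other two callbacks run on the UI thread):

```java
// Typically declared as an inner class of an Activity.
private class SaveItemsTask extends AsyncTask<String, Integer, Boolean> {

    @Override
    protected Boolean doInBackground(String... jsonStrings) {
        for (int i = 0; i < jsonStrings.length; i++) {
            // ... parse and save jsonStrings[i] ...
            publishProgress(i); // triggers onProgressUpdate on the UI thread
        }
        return true;
    }

    @Override
    protected void onProgressUpdate(Integer... progress) {
        // Runs on the UI thread; update a progress bar here.
    }

    @Override
    protected void onPostExecute(Boolean success) {
        // Runs on the UI thread with the result returned by doInBackground.
    }
}
```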

Once you have that AsyncTask set up, you can call its execute method to run your asynchronous task. One tricky thing to remember is that execute takes a variable number of arguments and hands them to doInBackground as an array. I’m not BFF with Java, so the syntax threw me a bit, but apparently it’s using a feature called varargs that’s been around since Java 5.

That’s all for today. I hope this blog post was useful. I certainly found it useful, since I had to do some research to really understand what the heck I was writing about. I’ll probably write about UITableViewDelegate/Datasource vs. ListAdapter next, unless there’s something else that seems more timely.

iOS Photo Editing Controls With A Custom Camera Overlay

This is one of those blog posts that’s basically for me and anyone else who cares to Google these search terms, so yeah.

I’m currently working on an app that does camera capture. Instead of using the normal Apple controls, I’m using a custom overlay. Typically to do this, you do two things: set the UIImagePickerController’s allowsEditing property to YES and assign your overlay view to the controller’s cameraOverlayView property. Oh, and you also set the sourceType to UIImagePickerControllerSourceTypeCamera, obviously.
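For reference, the typical setup looks roughly like this (a sketch; overlayView is your own view, and presentation details will vary by app):

```objc
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.allowsEditing = YES;                  // requests the built-in crop/edit step
picker.showsCameraControls = NO;             // hide the standard controls...
picker.cameraOverlayView = self.overlayView; // ...and show your own instead
[self presentViewController:picker animated:YES completion:nil];
```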

I learned through searching about 10 Stack Overflow questions and doing my own testing that the built-in photo editing control only shows up when you use the normal Apple camera controls. So unless you want to do some weird hacking to get your overlay to show up over the normal controls and then have them disappear when the photo is taken, it’s not possible to use a custom overlay and still get the built-in photo editing tool.

Unless someone wants to correct me…

The annoying part is this is not captured in Apple’s official documentation (or anywhere, really). So hopefully this blog post helps someone who is trying to use a custom overlay and the built-in Apple photo cropping tool.

Mapskrieg iPhone/iPad App Launch!

It’s been about 4 years since I announced the launch of Mapskrieg, my Google Maps and craigslist mashup web app. Since then, I’ve gone to school (again), worked for Microsoft and quit, and made a few apps. Today, I’m happy to announce the launch of my newest app, Mapskrieg for iOS!

I’ve been working on this app for a few weeks, and I think it’s ready for public consumption. It’s basically Mapskrieg, but developed natively for the iPhone and iPad. In the past, I’ve taken a sort of iterative approach of releasing fairly minimal apps and improving on them. For example, I released Threadless as an iPhone-only app and later added iPad support. I wanted to release Mapskrieg on both platforms so the launch would have a little more bang. Plus I think the iPad app is the better of the two, and I really wanted that one to stand out for the release. I’ve been getting much more comfortable with mobile development, and I’m very happy with the rate at which I was able to conceive and release this app.

That’s not to say I didn’t struggle or learn anything new with it. While a lot of the concepts are borrowed from my Threadless iPad app, I had to do a lot of stuff I hadn’t done before. For example, since Mapskrieg is going to rely on iAds to make money, I decided that both the iPad and iPhone versions would support iAd. Apple, in their infinite wisdom, made the split view controller a very useful and now-standard design paradigm. Unfortunately, they don’t provide any support for using it with iAd at all. What the fucking fuck, Apple!? So I basically had to recreate the split view (well, the landscape mode at least) in order to support iAds. It’ll be well worth it if I can rake in some iAd dough, though!

This post also comes almost 1 year after I quit my job at Microsoft. That milestone probably deserves its own post, but I’ll just say that I have not regretted my decision in the least. This is as fun as it gets, folks!

Edit: Oh, I forgot to link to a demo video that I recorded for a contest. Check it out in case you don’t have an iOS device:

Note to Myself about UISplitViewController and Auto-Rotation

In order to prevent my future self from wasting like, 2 hours fucking around with UISplitViewController getting it to auto-rotate its subviews, here’s a little post!

Most of the time, UISplitViewController doesn’t want to rotate because you didn’t set it as the root subview of the Window. This is pretty well documented on StackOverflow, etc. I thought this was why my splitviewblahblahblah wasn’t rotating. It turns out that, according to Apple’s documentation,

A split view controller relies on its two view controllers to determine whether interface orientation changes should be made. If one or both of the view controllers do not support the new orientation, no change is made. This is true even in portrait mode, where the first view controller is not displayed. Therefore, you must override the shouldAutorotateToInterfaceOrientation: method for both view controllers and return YES for all supported orientations.

The key here is that the split view controller wasn’t autorotating because its child view controllers did not answer “YES!!!” (exclamation points added by me) to shouldAutorotateToInterfaceOrientation:. I was fucking around with the Split View Controller in Interface Builder (and later programmatically) to no avail. Returning YES from the stupid autorotate method in both child view controllers made it work (shouldn’t it default to YES!?!??!).
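For future me, the fix is just this override in both of the split view controller’s child view controllers:

```objc
// Required in BOTH child view controllers, or the split view won't rotate.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    return YES; // or restrict to the orientations you actually support
}
```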

Okay, lesson learned. Just don’t forget this next time, Hung!