(This post is reprinted from my blog on O'Reilly.)
I have previously written about the iPhone SDK from the perspective of the applications I would like to see as an iPod touch owner. (Side note: Apple is touting the touch as the first mainstream Wi-Fi mobile platform, and non-telephony apps built with the SDK should run on both the iPhone and the iPod touch.)
Today, I am going to propose a vocabulary for thinking about the building blocks of mobility applications in the iPhone/iPod touch universe, a vocabulary based largely on an analysis of the composite set of existing functions already supported in these devices.
My goal is twofold. The first is to connect the dots for would-be developers thinking about how they can emulate the best practices, services and workflows that have emerged around this platform.
The second is to spotlight how Apple can maximize leverage for third-party developers by exposing well-defined APIs and providing tools that make it easy to build applications around these function sets.
After all, the moral of the story from the PC revolution is that Microsoft became the gorilla specifically because they outflanked Apple in defining and supporting APIs and tools around their platform, and then systematically cultivated a developer ecosystem.
Let me disclaim upfront that I have no inside knowledge of what Apple is going to announce in SDK 1.0 or what its road-map plans are. Also, I have not spent any energy trying to map the terms I am using to Apple terminology, or to be exhaustive in documenting every feature these devices support; this is really a straw-man effort, designed to be picked apart and iterated on.
A final note is that whereas a general-purpose platform like a PC can reasonably cope with poor developer decisions around system resource allocation handling, the iPhone and iPod touch are highly optimized, performance-sensitive devices. You don’t want phone calls being dropped or music skipping because a neophyte developer’s slide show application has memory leaks.
Given that, I hope that under the hood Apple is incorporating logic into the development/runtime model to automate and optimize the handling of concurrent threads and applications, as well as garbage collection of system resources.
In any event, these are the vocabulary and SDK constructs that I hope will inspire better thinking about the iPhone/iPod touch as a development platform:
Rich Media Presentation and Manageability
- Media types: out of the box, the iPhone/iPod touch does a phenomenal job of handling entire music and photo libraries. Owing solely to storage limitations, it does only a decent job with video libraries. While iPhone users lament the usability of the YouTube application over EDGE, on a Wi-Fi connection it is a revelation. Similarly, the Google Maps application is a joy to use. Were Apple (and Google, in the case of Maps) to provide read/write access to these libraries, entire categories of rich media applications would become possible.
- Media listings: media can be presented in a thumbnail fashion with profile data showing title, description, media length, upload date, tags, ratings, etc. The user is always a click away from being able to search for specific media items, and a full history of past actions is always handy, greatly simplifying the process of re-finding past content of interest.
- Social media: filtered views of media items can spotlight featured content as well as content organized by parameters such as most viewed, most favorited, recently viewed, playlists or related content (either top-down via taxonomic definitions or bottom-up via folksonomic algorithms).
- Player controls: media can be played, fast-forwarded, paused, reversed, resized and/or traversed (if multiple views are supported, as in case of directions on a map itinerary). Individual items can easily be bookmarked for later recall and/or emailed to others in a click.
- Futures: there seems to be great potential for the creation of ‘live’ shows that combine music, photos, video, text, maps, graphical charts and enhanced visualizations of lists to allow interaction between iPhone/iPod touch users in a one-to-one, one-to-many or many-to-many fashion.
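Were such read/write access exposed, the listing and filtering behaviors described above reduce to straightforward operations over a media model. A minimal, language-agnostic sketch in Python (all of the names here are illustrative, not any Apple API):

```python
from dataclasses import dataclass, field

@dataclass
class MediaItem:
    """Hypothetical media item; fields mirror the profile data above."""
    title: str
    tags: list = field(default_factory=list)
    view_count: int = 0
    favorite_count: int = 0

def search(items, query):
    """Case-insensitive search against title and tags."""
    q = query.lower()
    return [i for i in items
            if q in i.title.lower() or any(q in t.lower() for t in i.tags)]

def most_viewed(items):
    """'Most viewed' filtered view, descending by view count."""
    return sorted(items, key=lambda i: i.view_count, reverse=True)

def most_favorited(items):
    """'Most favorited' filtered view, descending by favorite count."""
    return sorted(items, key=lambda i: i.favorite_count, reverse=True)
```

The point is not the trivial code but the contract: if the platform exposes the library plus a handful of profile fields, third parties can build every one of the filtered and social views listed above.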
Mobility User Interface and Control
- Three-tier UI structure: a popular user interface construct that has emerged with native apps is to have top-level ‘header controls’ (e.g., filter content based on today, this week or ALL parameters), a bottom-level set of ‘footer controls’ (e.g., compose, calendar, contact) and the main application presentation body in the middle. Given the generous screen size and the touch-based interface of these devices, this enables tremendous dynamism in application navigation and control.
- Touch modes: slider controls allow things like volume to be turned up or down with a single finger; two-fingered expansion and contraction allows zoom/focus levels to be adjusted in real time; multi-touch settings allow more complex configurations, like forms and/or filters, to be deployed without typing; traversal controls allow multi-screen views (like weather or time in different cities) to be flipped through in a single click; finger sorting allows items in a list to be easily moved up, down, or dragged to folders/garbage instantly. In the future, touch-based macros could be supported for invoking scripted actions.
- Virtual keyboard: as an input device, the iPhone/iPod touch pales in comparison to my trusty BlackBerry 7130e. However, Apple has compensated for this limitation in a couple of ways. One is a keystroke memory function that makes it easy to recall past searches, email addresses and the like in very few keystrokes. Another is optimized keys that shortcut the addition of text, such as appending ".com" to the end of email addresses. If the SDK supports it, there is tremendous potential to create application-specific keyboards and keystroke memory functions optimized for specific tasks. After all, the keyboard is virtual; why not make this attribute a virtue of the platform?
- Futures: given the dynamic nature of a virtual, touch-based interface, there seems to be room for tremendous innovation around drill-down functions (think: stock tracking or sports applications), whereas today most such applications let users traverse only one layer across or down, as with the stocks application. Innovation in this area could reinvent the way we think about search and topic-specific online research.
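The keystroke-memory idea above amounts to prefix recall over a recency-ordered history of entries. A minimal sketch in Python (the class and method names are hypothetical, not anything Apple has announced):

```python
class KeystrokeMemory:
    """Remembers completed entries (searches, email addresses, etc.)
    and recalls them by typed prefix, most recent first."""

    def __init__(self):
        self._history = []

    def remember(self, entry):
        """Record an entry, de-duplicating and moving it to the front."""
        if entry in self._history:
            self._history.remove(entry)
        self._history.insert(0, entry)

    def suggestions(self, prefix):
        """Past entries matching the typed prefix, case-insensitively."""
        p = prefix.lower()
        return [e for e in self._history if e.lower().startswith(p)]
```

An application-specific keyboard could pair one of these per input field, so a stock app recalls tickers while a mail app recalls addresses.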
iTunes Store Integration
- Downloads: it is well-chronicled that Apple intends to treat the iTunes Store as the distribution/download point for third-party applications, games, content, etc. If they provide interfaces and tools to enable developers to create virtual storefronts, they could create an eBay for the mobility universe.
- eWallet: via the iTunes Wi-Fi Music Store, your iPhone/iPod touch can function as an eWallet. Why not open up this model to Amazon and other retailers?
- Social lockers: this is definitely in the futures bucket, but why not enable consumers to publish their libraries, playlists, ratings, comments, posts, etc. as a way of letting like-minded individuals connect with one another in mobile environments?
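The eWallet idea reduces to a stored balance (or payment token) that any participating storefront could debit. A toy sketch in Python, assuming a hypothetical wallet type rather than any real iTunes Store interface:

```python
class EWallet:
    """Hypothetical on-device wallet; tracks a balance in cents and
    refuses purchases it cannot cover. Illustrative only."""

    def __init__(self, balance_cents):
        self.balance_cents = balance_cents

    def purchase(self, price_cents):
        """Debit the wallet. Returns True on success; returns False
        (leaving the balance untouched) when funds are insufficient."""
        if price_cents < 0 or price_cents > self.balance_cents:
            return False
        self.balance_cents -= price_cents
        return True
```

The interesting design question is who holds the balance: Apple, as with the Wi-Fi Music Store, or a third-party retailer like Amazon plugging into the same interface.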
Does this model resonate or miss the boat? Can you suggest refinements or alternatives?