The Future of Native Apps (as I See It)

When Apple first launched the Apple App Store, it made a lot of sense to develop a native app and host it there. HTML5 APIs were still in draft stages, and trying to hook into the phone's features, like the camera, voice and the like, was a fool's errand. If you really wanted to make the most of the phone's features in your app, there was one way to market if you expected any chance at monetization or virality, and that was through a native app marketplace.

Five years ago, the question of "how should I develop this app?" was answered for you: develop for iOS and host it on the Apple App Store. But soon Google got into the mix and opened the Play Store (originally named Android Market). Developers now had a choice as to whether they would develop for iOS or Android, but the market still dictated the decision, which was to develop for iOS and release on the Apple App Store. Then, if you had the resources, develop for Android too, but many didn't because it simply didn't make sense.

As Android gained traction in the markets Apple simply ignored, more interest was taken in the platform and further investment was made in it. It soon became arguably as polished and feature-rich as iOS, with a much larger user base, and now the market really had developers divided. Should I develop for Android, or should I develop for iOS? A tough decision that only the circumstances of the project could dictate. If you targeted one platform, you'd be missing out on a huge potential user pool. If you developed for both, you had to ensure you had the resources to do so without hurting the bottom line.

This created massive inconsistencies between versions of the same application on different platforms. The availability, feature sets and user interfaces differed widely between iOS, Android and the other now-mature mobile OSes, and each had to be maintained and managed separately (even with help from PhoneGap and other such tools).

Meanwhile, the HTML5 spec was being adopted by the major browser vendors, and out of it arose a single 'platform' that could be leveraged consistently across all the existing platforms while actually making use of the device's hardware.

Now, if you wanted to create a voice recorder, you could use the device microphone as the voice input via HTML5. If you wanted to save the recording for offline use in the app, that was also now a capability. Essentially, you had all the functionality of a native app within the browser.
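As a rough illustration, here's what microphone capture looks like with today's browser APIs (the vendor-prefixed versions of the time worked along the same lines; the function name and parameters below are just for the sketch):

```js
// A minimal sketch of in-browser voice recording, assuming a browser that
// supports navigator.mediaDevices.getUserMedia and MediaRecorder.
async function recordClip(durationMs) {
  // Ask the user for microphone access.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];

  recorder.ondataavailable = (event) => chunks.push(event.data);

  return new Promise((resolve) => {
    recorder.onstop = () => {
      // Release the mic and hand back the recording as a Blob,
      // which could then be stashed locally for offline use.
      stream.getTracks().forEach((track) => track.stop());
      resolve(new Blob(chunks, { type: 'audio/webm' }));
    };
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}

// Usage: record five seconds of audio, then play it back.
// recordClip(5000).then((blob) => new Audio(URL.createObjectURL(blob)).play());
```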

Similar to how you would get a new app from an app store, you could just call up the URL, it would download to your device cache, and you'd have the app you wanted. Similar to how you would update your native app, you could simply refresh the web app: if a new version with new features was available, it would be downloaded; if not, the local version would be reused at native speeds. If you didn't have wifi access, no worries, you would just keep using the old version until you did. And if a security update was waiting on the other side of that wifi connection, intruders couldn't reach the app without connectivity anyway, so no worries there either.
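The mechanism behind this at the time was the HTML5 application cache; the same idea survives today as service workers. A minimal sketch, assuming a worker file named sw.js that the page registers with navigator.serviceWorker.register('/sw.js') (the file list and cache name are hypothetical):

```js
// sw.js: cache the app shell on install, serve it from the device cache,
// and fall back to the network for anything not cached yet.
const CACHE = 'voice-recorder-v1';
const ASSETS = ['/', '/app.js', '/styles.css'];

self.addEventListener('install', (event) => {
  // Download the app shell into the device cache, much like installing a native app.
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
});

self.addEventListener('fetch', (event) => {
  // Serve the cached copy at local speed; hit the network only when needed.
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```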

The real kicker here is that developers now have a new (and, I believe, what should be the default) answer to "How should I develop this app?": develop for the web browser. Develop once, deploy everywhere. It makes much more sense to go this route now than it did at any other time.

I feel Chrome OS is an amazing example of how we will be interacting with applications in a few short years: entirely in a 'browser' we don't even notice. Firefox OS has already come to market supporting only HTML5-developed apps. The change has already started, and it's time to prepare for a non-native future in application development.

JavaScript... Who'da thought, amirite?! ;P