What occupies our geeky minds

How We Built American Honey PlayPacks – Part 1

PlayPacks, a project developed in partnership with HOST and What So Not, is our newest app for Android and iOS, and one of our most sophisticated. It combines augmented reality with sample-accurate playback of multiple audio streams, all connected to a web sharing backend. Making this work smoothly across both platforms proved to be a real technical challenge. In this post we’ll share some of the techniques we used to get it running on two very different operating systems.

The porting problem

Today there are just two mobile platforms that matter to consumers – Android and iOS. Reaching the widest possible audience for your app means releasing on both. Unfortunately this usually means creating and maintaining two separate codebases, effectively doubling the development and maintenance effort.

On the surface, Android and iOS development are radically different. Android uses the Java platform as a base and provides an application framework on top of it. iOS apps are written in Objective-C and built with the Cocoa framework, a library whose history (via NeXTSTEP) extends back to the ’80s. With different vendors, languages, and foundation libraries, the opportunities for code sharing seem limited.

A native strategy

To find the similarities we have to dig deeper, down to the operating system level. Beneath its Java layers Android is built on a Linux kernel, and iOS is based on Darwin, Apple’s BSD derivative. Both share a lineage with Unix, the great-granddaddy of modern operating systems. Although their paths diverged long ago, a common core of functionality remains, exposed through lower-level C and C++ interfaces in each environment.

To be clear, this isn’t enough to write an entire app. The common native libraries provide interfaces to low-level plumbing, including parts of the POSIX standard (threading, I/O, system utilities), the C and C++ standard libraries, and the OpenGL ES accelerated graphics interface. Standard platform features (e.g. user interface controls, app lifecycle management, AirPlay) must still be written within the platform framework (Cocoa, Android SDK) and connected with glue code.
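To make this concrete, here’s a minimal sketch of the kind of portable code we’re talking about – plain C++ against POSIX threading and the C standard library, compiling unchanged under both Xcode and the NDK (the function and file names here are ours, for illustration only):

    #include <pthread.h>
    #include <cstdio>

    // A worker that uses only POSIX threads and the C standard library,
    // so the same source builds for iOS and Android without changes.
    static void* decode_samples(void* arg) {
        const char* pack_name = static_cast<const char*>(arg);
        std::printf("decoding %s\n", pack_name);
        return nullptr;
    }

    int main() {
        pthread_t worker;
        const char* pack = "honey_pack_01";
        pthread_create(&worker, nullptr, decode_samples,
                       const_cast<char*>(pack));
        pthread_join(worker, nullptr);  // wait for the worker to finish
        return 0;
    }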

The payoff for living with these limitations is true code sharing. By writing the core features of the app in native code and using portable native libraries, we can reuse the exact same codebase across both platforms. Within this core set, features are written once, bugs are fixed once, and the changes propagate to both platforms.

Xcode supports C and C++ natively. On Android, the Native Development Kit (NDK) provides toolchains to build shared libraries from C or C++ sources. These libraries are packaged alongside the Java parts of the app and interoperate through the Java Native Interface (JNI), which lets Java call native code and vice versa.
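As a rough sketch of what that glue looks like on the Android side (the class and function names are hypothetical, not from the actual PlayPacks source):

    #include <jni.h>

    // Shared core function – the same symbol is linked into the iOS build
    // and called directly from Objective-C there.
    extern "C" int core_start_playback(int pack_id);

    // JNI glue: binds the Java method
    // com.example.playpacks.Engine.startPlayback(int) to the shared core.
    // Only this thin layer is Android-specific.
    extern "C" JNIEXPORT jint JNICALL
    Java_com_example_playpacks_Engine_startPlayback(JNIEnv* env, jobject thiz,
                                                    jint pack_id) {
        return core_start_playback(pack_id);
    }

On the Java side the method is simply declared native and the library loaded once with System.loadLibrary(), after which it can be called like any other Java method.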

Porting in practice

With PlayPacks we knew we would be producing Android and iOS versions in quick succession, so we designed the app from the outset with portability in mind. Our technique boils down to two golden rules:

  1. Write as much portable code as possible using native (C++) code
  2. Strictly adhere to an identical structure for platform specific code

Rule #1 – write and use portable native code

The unique technologies built into PlayPacks are the augmented reality and synchronized audio features, built on Vuforia and libpd respectively. Both are cross-platform native libraries wrapped in thin platform glue code – not coincidentally, exactly the technique we employ for portability – which made them easy to integrate into our source tree.

The core parts of the app are written almost completely in native code. This includes code to draw and animate the Mixer and AR views, play back audio, record and save audio sequences, store user configurations and interact with the augmented reality markers.

From the brief we knew that each screen would need custom rendering and animation with complete control of the drawing surface – tasks that OpenGL ES is particularly well suited to. On each platform these OpenGL ES views are embedded into platform-specific views that integrate with the native UI hierarchy.
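A minimal sketch of how the shared entry points might look (the names are ours; the real renderer is considerably more involved):

    #ifdef __APPLE__
    #include <OpenGLES/ES2/gl.h>   // iOS
    #else
    #include <GLES2/gl2.h>         // Android
    #endif

    // Shared renderer entry points. On iOS they are called from a GLKView's
    // draw callback; on Android from GLSurfaceView.Renderer.onDrawFrame().
    void renderer_resize(int width, int height) {
        glViewport(0, 0, width, height);
    }

    void renderer_draw_frame(float elapsed_seconds) {
        (void)elapsed_seconds;  // time step used to advance animations
        glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw the Mixer / AR geometry here ...
    }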

Similarly, sound is generated as a final mixed floating-point stream and handed to platform-specific interfaces to the audio hardware. I’ll go into more detail about media playback in a separate post.
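Since the audio engine is built on libpd, the portable side reduces to a render callback that fills a buffer of floats; each platform’s glue (a Core Audio render callback on iOS, an OpenSL ES buffer queue on Android) just pulls from it. A minimal sketch, with our own function names and an illustrative patch name:

    #include "z_libpd.h"

    // One-time setup: output-only audio plus a (hypothetical) Pd patch.
    void audio_init(int channels, int sample_rate) {
        libpd_init();
        libpd_init_audio(0, channels, sample_rate);
        libpd_openfile("playpack.pd", "/path/to/patches");  // name is illustrative
        // Switch DSP on, equivalent to sending [; pd dsp 1(
        libpd_start_message(1);
        libpd_add_float(1.0f);
        libpd_finish_message("pd", "dsp");
    }

    // Portable render callback: fills `output` with frames * channels
    // interleaved floats. libpd processes audio in ticks of 64 frames.
    void audio_render(float* output, int frames) {
        int ticks = frames / libpd_blocksize();
        libpd_process_float(ticks, nullptr, output);
    }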

Rule #2 – use a shared structure for platform specific code

As outlined above, accessing standard features still requires us to write a lot of platform specific code. In PlayPacks, all the menu controls, video playback, network access and interstitial screens are written against their respective platform libraries – Java/Android SDK on Android and Objective-C/Cocoa on iOS. Obviously reusing the source code in these environments is impossible.

Instead we do the next best thing – reuse the code structure. By exploiting the similarities between Java and Objective-C, and between the Android SDK and Cocoa, the shape of our platform-specific code is identical on each platform. Both languages are high level and object oriented, and common concepts map between the frameworks – see the table below. In this way, every Objective-C class is paired with a functionally equivalent Java class containing the same methods and calling patterns.

Mapping concepts between Cocoa and Android:

Cocoa                          Android SDK
-----                          -----------
Root controller                Activity
View controller containment    Fragment
UIView                         View
GLKit                          GLSurfaceView
NSThread                       Thread

This isn’t quite as low-maintenance as sharing source across platforms, but it’s a lot nicer than two decoupled codebases. We had the iOS version mostly complete before commencing work on Android, and following this rigid structure made the port very fast. With the iOS code as a reference there were no structural decisions to make – classes were converted method by method, line by line. Any differences imposed by the platform library were encapsulated as best as possible in helper classes.

The shared structure also helps with synchronizing changes between the two. When a change is made in one codebase it is easy to find and amend the equivalent code in the other, usually by looking up the same class and method name.

Before jumping into native code, it’s worth considering some of the pros and cons:

Pros

  • Low-level access to system facilities, including OpenGL ES
  • Identical code runs on both platforms
  • Modern language features (C++11 support)

Cons

  • Higher upfront development cost
  • Bigger binaries and longer build times – native code must be built and packaged for each supported architecture (see the build-file sketch below)
    • iOS – armv7, armv7s, arm64
    • Android – armeabi, armeabi-v7a, mips, x86
  • Harder to debug on Android – stack traces will be your friend
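For illustration, here is roughly what that multi-architecture build configuration looks like in an NDK Application.mk (a sketch – the actual ABI list depends on the devices you target):

    # Application.mk – tell the NDK which ABIs to build the shared core for.
    APP_ABI      := armeabi-v7a x86 mips
    APP_STL      := gnustl_static   # C++ standard library implementation to link
    APP_CPPFLAGS := -std=c++11      # the C++11 support mentioned under Pros

Each ABI in APP_ABI produces its own copy of the native libraries, which is where the bigger packages and longer build times come from.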

Closing thoughts

We’ve been playing with these technologies for a while now – a good chunk of the rendering code was cribbed directly from a long-term internal prototype. PlayPacks is the first app we’ve released that demonstrates the feasibility of this approach for deploying rapidly to iOS and Android while keeping feature parity. Sharing native code lets us target both platforms without compromising performance along the way. We hope that by evolving this technique we can improve our time to market while treating both platforms as first-class citizens.

Posted by Dan Venkitachalam in News, Projects

Barbie™ in The Pink Shoes Kinect App


Our latest collaboration with Gun Communications lets kids interact with Barbie® through Target’s shop windows. Working closely with Gun’s design team, we built a custom XNA app for Microsoft Kinect® – an interactive 3D storybook. The experience includes a suite of gesture-controlled mini-games: a Barbie® jigsaw puzzle, spin-the-ballerina, and a virtual painting game.

Posted by adapptor in Kinect, News, Projects

HBF Pocket Health App – Simple, Intuitive and Useful


Recently Adapptor and Hatchd were engaged for an incredibly exciting opportunity: an app that is unique and fulfils a real need.

Have you ever needed to know your blood type, but couldn’t remember it? Or needed to remember the date of that broken arm when you were young? Or what about remembering when your child was vaccinated?

Each time you find yourself rummaging through paperwork in the home office, or digging back through old calendar entries, when really all you need is a simple mobile app.

Several months ago HBF approached Adapptor and Hatchd with a request to build just this type of app. They asked us to have a look at the market, determine if there was a need, and then design something that would work on a smartphone to help their members.

When we’re engaged in such an opportunity, we like to consider a range of requirements, and we spend a good deal of time ensuring we have the key factors covered.

First and foremost the app needs to be useful. Even a game serves a function — providing a challenge that makes for an enjoyable experience — but most apps aren’t games and must meet some need. Ideally the app serves a need on a regular basis and people return to use the app often.

Secondly, like a game, an app should be enjoyable to use. It’s often not obvious what makes the experience enjoyable, and that can be part of its beauty. An app should flow like a game, and a good deal of attention should be spent on ensuring that it works well for users. Part of this means it needs to be easy to use.

Fortunately for us, HBF had already considered the first factor (the utility), so we could focus on the app’s flow. Our team spent hours locked in a room working on the key features and trying to understand how the app would work. Our top priorities were making it simple and intuitive, yet still very powerful.

This can often mean reducing the number of features and consolidating the top-level categories. In this case we managed to distill the app down to profiles, events and search. We considered how we could make it simple to use and yet still powerful, and ended up recommending that the main event interface be a list in reverse chronological order – something we’ve become very accustomed to from weblogs, photo streams, and apps like Twitter and Facebook. A long list of individual events that can easily be filtered makes a lot of sense on a small device. The app also needed to explain itself quickly to a user, so we added buttons to the main view that provide a quick way to record an array of health events with only a few presses.

We think that, along with HBF, we’ve helped create an incredibly useful app. So if you’re an HBF member, be sure to head over to the App Store and grab yourself a copy (Android version coming soon). Remember, you’ll need your myHBF login details to create your very own Pocket Health account.

Let us know how you go. We’d love to hear what you think.

Posted by Richard in Mobile, News, Projects

Barbie’s ‘Dream Closet’ XNA Kinect App


A sneak peek at the XNA Kinect app we built for Gun Communications and their client Mattel. This was an early version of the app, with a few things left to polish.

The technical solution is a custom-made application using Microsoft’s XNA Game Studio combined with the Kinect for Windows hardware and software development kit (SDK). The user interacts with the application through a natural gesture and overlay interface, augmented with audio and visual cues. Skeletal tracking is used to determine the position and orientation of the user, and various Barbie™ outfits are rescaled and rendered over the user in real time. After selecting an outfit the user is prompted to strike a pose, and after a short countdown the resulting postcard-style photo is uploaded to a Pyramid-powered web server. All approved images (users aged 13+ only, with guardian consent) can then be displayed in a gallery section of the Barbie® Australian Facebook page.

The app will be launched at a media event in Sydney on April 11, 2012. A series of public events will follow at Westfield centres during the school holidays, commencing at Westfield Parramatta (NSW, April 12 – 15), then Westfield Doncaster (VIC, June 28 – July 1) and Westfield Chermside (QLD, Sept 27 – 30).

Posted by adapptor in News, Projects

Roam the West Without Roaming Charges – Tourism WA Launches the Experience WA App


Yesterday the Western Australian Minister for Tourism, Dr. Kim Hames, launched the new Experience Western Australia mobile app.

The free app, built for Android and iPhone, lets users browse over 7,000 listings from around the state. It incorporates a range of ways to discover new places and experience the best of what the state has to offer. If you’re stuck for an idea and need some inspiration, the app will make suggestions based on your current mood; and if you’re planning a specific trip, you can browse suggested itineraries or even build your own.

But the key differentiating feature of the app is the ability to choose between online and offline modes. An optional preload of all data from the application server (and a flick of the switch to offline mode) lets visitors use the app on their long flight down under, or explore the furthest reaches of our vast state, without having to worry about roaming charges. The local copy of the database syncs with the application’s cloud-based server, so any changes to the data can be pushed to users’ phones without updating the app itself through the store.

A special thanks to HOST and Tourism WA for the opportunity to work on this project.

Posted by Richard in Mobile, News, Projects

A ‘NUI’ Generation of Users


[Image: Natural User Interface – Synergy Kinect App]

Take note, digital planners and designers: we’re undergoing the greatest revolution in user interface design since Microsoft and Apple thrust their point-and-click interfaces on us in the mid 1990s. The 2010s will see a shift away from mouse-controlled Graphical User Interfaces (GUIs) to Natural User Interfaces (NUIs), as users come to expect gesture, voice and touch in their everyday applications.

The NUI generation (born 2003 onwards) already demands a more natural interface, having known nothing but touch screens and waving their arms in front of an Xbox Kinect; but given the simplicity of NUI control, it will soon find favour amongst older folks and everyone in between.

Let’s face it, the mouse was always a bit fiddly and bound for the bottom drawer eventually, but it’s the speed of change that has caught many interface designers off guard. In just 12 months, smartphone, tablet and game designers have made touch controls expected; the next 24 months will see touch all but replace the mouse, aided by intuitive voice and gesture-based navigation. This change will be driven largely by computer games, but also by other application designers looking for more convenient, user-centred ways to engage with their content.

By way of example, we recently launched an application with 303 (for West Australian energy retailer Synergy) aimed squarely at kids aged 7 and up. The application sought to educate young Australians about energy consumption, and needed to engage kids, who have little time for written instruction. In short, it had to be natural.

Taking some ideas from game designers, we developed an Xbox Kinect-style game in which kids could simply step up to a mat and start running to power an application that taught them valuable lessons about energy consumption. By making it a movement-controlled game, we created an interface that was not only natural to our target audience but also one that invited them to participate.

The results had to be seen to be believed. Kids queued to participate as if at a theme park, while parents stood in the wings soaking up the energy conservation message. Only by exerting physical effort themselves could children truly experience the effort required to produce energy, and in turn why we should be more careful not to use it unnecessarily. A traditional web-based GUI just wouldn’t have worked.

As developers we can get caught up in the intricacies of the technology at play – using Kinect and complex code to convert body movement into an indicative energy output – but the success of this digital communication came down to a simple, natural interface that suited the target audience. No instructions required, no mouse, no keyboard, no information architecture: just run… and learn.

Posted by adapptor in Observations, Opinion, Projects