AUGMENTED & VIRTUAL REALITY:
DID YOU KNOW:
Wearable devices are oh so hot! Yet only a few years ago the vast majority of Australians had barely heard of them, let alone used one. This year wearable brands are becoming household names: from Jawbone to Fitbit, and from Pebble to Google Glass, there's no doubt wearable devices will be hot stocking stuffers this festive season. And with IDC forecasting 30 billion devices connected to the Internet of Things (IoT) by 2020, Santa's wish list will include wearable devices for many years to come.
The big question in corporate IT is how to integrate wearables into a digital strategy. What are the benefits to the wearer, the business and its stakeholders? How can wearable devices improve the interactions between users and business information systems? One thing is clear, a winning wearable strategy will bring measurable benefits to both the users and the business, via disruptive new services that they enable.
As with most new connected devices, large corporates tend to consider wearables only as a customer device and forget their applications within the enterprise. Not so HBF (WA's largest health insurer). They identified an opportunity to enhance their members' experience of HBF Fitness (their free fitness program for members) by offering a more personalised workout in a group fitness setting.
A relatively new service, HBF Fitness runs free sessions for members in 20 locations around Western Australia. Anyone can register to attend a session using HBF's online booking system, then check in to the event with their personalised code. Each trainer is responsible for managing check-ins, being aware of any participant's health conditions, running the session, teaching activities, and keeping everything on time. As you can imagine, that's a big task for large groups.
HBF looked to wearables to help streamline the process, provide members a more personalised workout, and ensure a standardised workout for all members regardless of where they train. It decided to trial Google Glass due to its unique characteristics, specifically that:
- it sits on your face like traditional glasses
- it can be operated hands-free
- and it can record what it ‘sees’
These features make it a great tool for undertaking tasks whilst users are very active. A trainer can focus his/her attention on the class, while still being able to dip into useful functions to assist the participants or keep the class schedule on track.
HBF partnered with Adapptor to build a bespoke Google Glass app specifically for trainers to assist with running their classes. We’re no stranger to Android, so with that as our foundation, we were able to pick up the unique parts of the Glass Development Kit and quickly create the application. The key requirements/features included:
- integration with HBF's Fitness schedules, allowing the trainer to control an intelligent stopwatch that matches the class schedule,
- integration with HBF's Fitness system to check in registrations and view participants' individual health and fitness information, so training sessions can be adapted accordingly,
- video recording and playback via the cloud, to show a participant how to improve their form,
- and real-time weather information.
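The "intelligent stopwatch" is worth unpacking: rather than a plain count-up clock, the timer knows the class schedule and tells the trainer which segment the session should be in. A minimal sketch of that logic (hypothetical code, shown here in C++ for brevity; the actual app was built in Java against the Glass Development Kit):

```cpp
#include <string>
#include <vector>

// Hypothetical model: a class schedule is an ordered list of timed segments.
struct Segment {
    std::string name;   // e.g. "Warm-up", "Circuit 1"
    int duration_secs;  // how long this segment runs
};

// Given the seconds elapsed since the session started, return the index of
// the segment the class should currently be in, or -1 once the schedule is
// finished. This is the core of a stopwatch that tracks the schedule.
int currentSegment(const std::vector<Segment>& schedule, int elapsed_secs) {
    int t = 0;
    for (size_t i = 0; i < schedule.size(); ++i) {
        t += schedule[i].duration_secs;
        if (elapsed_secs < t) return static_cast<int>(i);
    }
    return -1;  // schedule finished
}
```

With this shape, the Glass UI only needs to re-render when the segment index changes, which keeps the trainer's glanceable display simple.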
When developing for Glass we had to be particularly mindful of some unique UX challenges such as:
- the way the user interacts with Glass, whether by touch gestures or voice
- and the usable screen real estate (not much).
Once these UX components were carefully scoped, it was a matter of applying our Android skills to the mix. We worked closely with HBF's Digital Services team and delivered the final Glass app in just three weeks, ready for field testing.
With everything working as intended in the office, HBF organised a trial fitness class to put the app through its paces. This enabled us to observe the instructor use the application in a ‘live’ environment and pick up on any UX changes or environmental challenges. Other than leaving our team feeling incredibly unfit, the application worked brilliantly, and feedback from the trainer, Max, was very positive.
Other than being out of breath, we were left with a few key takeaways from the experience, insights that apply to most wearable projects:
- understand the device’s unique benefits;
- understand the device’s limitations;
- develop the features based on the user problem, device benefits and limitations;
- pay careful attention to the UX and UI throughout the scope, design, and development process;
- take your testing to the field.
Matching a device’s capabilities to a benefit and knowing its limitations are the keys to building successful wearable services. Fortunately HBF understood how it wanted to use the device right from the start which made our job much easier.
Now to go get fit, so the next wearables project isn’t so embarrassing.
(cover image: Giuseppe Costantino)
Apple’s announcement this week grabbed plenty of headlines. A couple of new phones, a watch, and a payment service…oh, and something about a new album from U2.
Although exciting for the Apple fanboys in our office, most of Apple's new offerings aren't groundbreaking. We've already seen larger phones, smartwatches, and the ability to pay via a smartphone. The main difference is the attention the products have gained, and the reach Apple has not only with consumers but also with major organisations (e.g. Visa, Amex, MasterCard). Is this the catalyst that will take wearables mainstream?
However, it's not just the shiny new objects that matter; there's a lot under the hood that will make a massive difference to the world of mobile in the coming months.
In fact, Apple's announcements a few months ago were perhaps more critical. HomeKit, HealthKit and now Apple Pay (as well as Google's equivalents), combined with the phones and watch, mean we're all about to see a new raft of innovation around mobile.
It’s these additions that Australian companies should pay attention to.
Over the last few years we've already seen a bunch of innovation in mobile, but most of it has centred on the smartphone: apps that people can use anywhere, yet which only dip into information and functionality. The recent additions from Apple and Google will extend that innovation beyond the smartphone, out to the smart devices in our homes and businesses, and even to what we wear.
Why does this matter to Australian business? Think of what services will now be possible in the health industry, for utilities, the media, property, retail, and the Government, etc.
Health insurance companies are already using technology to offer discounts the more active people are; hit your daily step goal, and save money on your health cover. Energy companies are also offering flexible pricing for customers using smart meters (already installed in Victoria).
This is just the tip of the iceberg. With smart devices about to boom, imagine the other innovations that will be enabled.
Australian organisations should evaluate what the boom in the Internet of Things means to their business. What will this technology mean to their existing services, and are there new services they can offer? Will it mean new players can enter their market and flip their business model on its head (look at what Uber has done to the taxi industry recently)?
It's now not enough for an organisation to have a mobile strategy; it needs a broader strategy that looks at how technology can enable, extend, improve, or add new services. If it isn't looking, its competitors are.
Mobile apps are dead. Long live the app-enabled consumer service.
MELBOURNE, JULY 23, 2014: Respected Perth creative applications company Adapptor today announced its Melbourne expansion on the back of a spate of recent large-scale national and international mobile projects.
Numerous projects have launched over the past two months, including a collaboration with Sydney agency HOST to release Play Packs, an app featuring What So Not (DJs Flume and Emoh Instead) for Wild Turkey label American Honey™. Other major releases include an all-new iiNet Customer Support App, a Barbie Kinect installation for Mattel® Malaysia and an AR mapping app for development giant Mirvac.
“The growth in work originating from agencies in Melbourne and Sydney, combined with the significant competitive advantages we are building in highly technical dev meant that an east coast office was a natural next step for the business,” explained Adapptor Managing Director, Marc Loveridge.
With considerable experience across the health, government, transport, tourism and construction/property sectors, the Melbourne office will aim to leverage these capabilities on the east coast – working as a skilled consultancy for agencies, and directly with IT groups and brands.
The company has appointed Sarah Sproule as Strategy and Client Service Director of the Melbourne office. Sarah was previously a Director of Gun Communications and, prior to that, was General Manager of Spin, Melbourne.
Adapptor plays an active role in the Australian digital marketing industry supporting the OzApp Awards, StartUp Weekend, Adschool, ADMA, AIMIA, Curtin Growth Ignition and the Emergence Creative Festival. Founded in 2010 the company boasts one of Australia’s most experienced iOS and Android developer teams with extensive experience in Objective C, Java and C# for mobile platforms. An invited member of the global Microsoft Kinect and Leap Motion developer programs, Adapptor’s labs program, called ‘The Vat’, actively prototypes gesture controlled applications, working closely with agency partners, to deliver unique digital activation experiences.
Recent technological achievements include an OpenGL music mix-deck (for HOST’s American Honey “Play Packs” App), server side geo-mapping of traffic incident impact zones (Right Move Perth), a cloud based shared family health record (HBF Pocket Health) and fully functioning offline slippy maps within iOS/Android (Experience WA Tourism WA App).
For further information or commentary, please contact:
Sarah Sproule // 0418 737 500 // sarah(at)adapptor.com.au
Marc Loveridge // 0413 059 070 // marc(at)adapptor.com.au
The Kinect application we created for the Australian "Barbie® Dream Closet" campaign a little while back has just commenced a tour of Malaysia. We performed a few tech updates to the Barbie® Dream Closet app before packaging it up and sending it over to Mattel's agency in Singapore.
Looking at these photos, it seems Barbie’s wardrobe is as popular as ever with little girls! Singapore was the first stop on the Dream Closet’s regional tour of Malaysia, taking place over the next few months.
PlayPacks, a project developed in partnership with HOST and What So Not, is our newest app for Android and iOS, and one of our most sophisticated. It combines augmented reality technology with multiple sample-accurate audio stream playback, all connected to a web sharing backend. Making this work smoothly across both platforms proved to be a real technical challenge. In this post we’ll share some of the techniques we used to get it running across two very different operating systems.
Today there are just two mobile platforms that are relevant to consumers – Android and iOS. Reaching the widest possible audience for your app means releasing on both. Unfortunately this usually requires creating and maintaining two codebases, an undertaking requiring as much work as writing two separate apps.
On the surface Android and iOS development are radically different. Android uses the Java platform as a base, and provides an application framework on top of it. iOS apps are written in Objective-C and built with the Cocoa framework, a library with a history extending back to the ’80s. Between different vendors, languages, and foundation libraries the opportunities for code sharing seem limited.
We have to dig deeper at the operating system level to find the similarities. Beneath its Java layers Android is built on a Linux kernel. And iOS is based on Darwin, Apple’s BSD derivative. Both share a lineage with Unix, the great-grandaddy of modern operating systems. Although their paths diverged a long time ago, there remains a common core set of functionality across each. This is exposed through lower level C and C++ code in their respective environments.
To be clear, this isn’t enough to write an entire app. The common native libraries provide interfaces to low-level plumbing, including parts of the POSIX standard (threading, I/O, system utilities), C and C++ standard libraries, and the OpenGL accelerated graphics interface. Standard platform features (e.g. user interface controls, app lifecycle management, AirPlay) must still be written within the platform framework (Cocoa, Android SDK) and interfaced with glue code.
The payoff for living with these limitations is true code sharing. By writing the core features of the app in native code and using portable native libraries, we can reuse the exact same codebase across both platforms. Within this core set, features are written once, bugs are fixed once, and the changes propagate to both platforms.
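As a toy illustration of the payoff (not actual PlayPacks code): a routine like the one below is plain standard C++, so the identical translation unit compiles under both Xcode and the NDK, and a bug fixed here is fixed on both platforms at once.

```cpp
#include <cmath>

// Example of portable "core" code: convert a gain in decibels to a linear
// multiplier. No platform headers, just the C++ standard library, so the
// same file builds for iOS and Android without modification.
float dbToLinear(float db) {
    return std::pow(10.0f, db / 20.0f);
}

// Apply a decibel gain to a single audio sample.
float applyGain(float sample, float db) {
    return sample * dbToLinear(db);
}
```

Anything like this lives once in the shared tree; only the thin glue that hands results to Cocoa or the Android SDK is duplicated.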
Xcode provides direct support for C and C++ code natively. On Android, the Native Development Kit (NDK) provides toolchains to build shared libraries from C or C++. The libraries are packaged with the Java parts of the app and interoperate through the Java Native Interface (JNI), a way to call native code from Java and vice versa.
With PlayPacks we knew we would produce Android and iOS versions in quick succession, so we designed the app upfront with portability in mind. Our technique boils down to two golden rules: write the app's core in portable native code, and mirror the structure of the platform-specific code across both platforms.
The unique technologies built into PlayPacks are the augmented reality and synchronized audio features, built on Vuforia and libpd respectively. Both are cross-platform native libraries wrapped with platform glue code, which is not coincidentally exactly the technique we employ for portability, and this made it easy to integrate them into our source build tree.
The core parts of the app are written almost completely in native code. This includes code to draw and animate the Mixer and AR views, play back audio, record and save audio sequences, store user configurations and interact with the augmented reality markers.
From the brief we knew each screen needed custom rendering and animation with complete control of the drawing surface, tasks that OpenGL ES is particularly well suited to. On each platform these OpenGL ES views are embedded into customized platform-specific views that integrate with the native UI hierarchy.
Similarly, sound is generated as a final mixed floating point stream and passed to platform-specific interfaces to the audio hardware. I'll go into more detail about media playback in a separate post.
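Conceptually the mix-down stage looks something like this hypothetical sketch: sum the sample-aligned streams into one buffer, clamp to the valid range so the hardware never clips, and hand the result to the platform audio interface.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical mix-down: sum several sample-aligned float streams into a
// single output buffer of `frames` samples, clamping each sample to
// [-1, 1]. The buffer is then passed as-is to the platform audio glue.
std::vector<float> mixDown(const std::vector<std::vector<float>>& streams,
                           size_t frames) {
    std::vector<float> out(frames, 0.0f);
    for (const auto& s : streams)
        for (size_t i = 0; i < frames && i < s.size(); ++i)
            out[i] += s[i];
    for (float& v : out)
        v = std::max(-1.0f, std::min(1.0f, v));
    return out;
}
```

Because this runs in the shared core, sample accuracy is guaranteed to behave identically on both platforms; only the hand-off to the audio hardware differs.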
As outlined above, accessing standard features still requires us to write a lot of platform specific code. In PlayPacks, all the menu controls, video playback, network access and interstitial screens are written against their respective platform libraries – Java/Android SDK on Android and Objective-C/Cocoa on iOS. Obviously reusing the source code in these environments is impossible.
Instead we do the next best thing – reusing the code structure. By exploiting the similarities between Java/Objective-C and Cocoa/Android SDK, the shape of our platform specific code is identical on each platform. Both languages are high level and object oriented. And we can map common concepts between frameworks – see the table below. In this way, we created every Objective-C class with a functionally equivalent Java class containing the same methods and calling patterns.
| Cocoa (iOS) | Android SDK |
| --- | --- |
| View controller containment | Fragment |
This isn’t quite as low maintenance as reusable source across platforms, but a lot nicer than decoupled codebases. We had the iOS version mostly completed before commencing work on Android, and following this rigid structure made the porting process very fast. Using the iOS code for reference, there were no decisions to be made about structure – instead classes were converted method by method, line by line. Any differences precipitated by the platform library were encapsulated as best as possible in helper classes.
The shared structure also helps with synchronizing changes between the two. When a change is made in one codebase it is easy to find and amend the equivalent code in the other, usually by looking up the same class and method name.
Before jumping into native code, it's worth considering some of the pros and cons:
We’ve been playing with these technologies for a while now – a good chunk of the rendering code was cribbed directly from a long-term internal prototype. PlayPacks is the first app we’ve released where we’ve been able to demonstrate the feasibility of this approach for rapidly deploying to iOS and Android while keeping feature parity. Code sharing with native code enables us to target both platforms without compromising performance along the way. We hope by evolving this technique we can improve our time to market while treating both platforms as first class citizens.
Our latest collaboration with Gun Communications lets kids interact with Barbie® through Target's shop windows. Working closely with Gun's design team we created a custom XNA app for Microsoft Kinect® to build an interactive 3D storybook. The experience included a suite of gesture-controlled mini-games including a Barbie® jigsaw puzzle, spin the ballerina and a virtual painting game.
It’s been exciting to track the buzz at SxSW for Leap Motion, probably the hottest NUI controller to hit the market since Kinect for Windows. As with Kinect, the potential of Leap Motion lies not in the hardware (which isn’t anything radical) but in the software and the potential for game changing gesture controlled applications.
Adapptor was lucky enough to be invited into Leap Motion’s Developer Program back in December 2012. Since then, and after a short wait for the hardware to arrive, we have spent some time familiarising ourselves with the SDK and racking our collective brains for useful applications of the technology.
As we collected these ideas we quickly identified that applications fell into one of three buckets: utility, education and play. Using just hand gestures to accurately control an interface opened up some exciting opportunities but, in our minds, any such use case must still present an improvement over the trusty mouse or touch screen.
It may be the mobile app blood pumping through our veins but this got us thinking about maps and more specifically way-finding in open spaces (think university campuses, shopping centres, theme parks and train stations).
As these types of destination migrate towards a digital signage strategy, how will they allow public interaction with their maps? Giant touch screens? Maybe, but the hardware is currently cost-prohibitive and requires users to be up close and personal with the screen, which puts the screens at perfect vandal height and makes them hard to weatherproof.
Enter Leap Motion, an $80 controller that works at a distance from the screen it's connected to, and claims millimetre-level accuracy to make for a compelling user experience. The Kinect device struggles with this type of precision tracking, so we were interested to see how Leap Motion would fare when put to the task. The answer, as you can see from the prototype mapping app we created, is, well, very well indeed.
Just like Fleetwood Mac, the Internet wants to be with you everywhere. On the train, in the café, even tucked up in bed on a Sunday morning, nursing a hangover, that little terrier we call the Internet runs in the room, licks your face and begs you to share your condition with friends and strangers alike.
It never used to be so clingy. We would 'log on' and, more importantly, 'log off'. There was, after all, a time and a place. Not anymore, no siree. Thanks to our mobiles the majority of Australians now wake up to the Internet, commute with the Internet, work on the Internet and 'relax' with the Internet.
By 2015 Telsyte predicts 87 percent of Australians (over 16 years old) will own a smartphone. So just about every adult will be online for 100 percent of their waking life: reading, playing, sharing and consuming company services via their mobile.
Even now, in 2013, the phone is our Internet device of choice, but ask any of Australia’s best digital marketing planners and they’ll tell you their clients still fail to invest the appropriate time, planning and budget towards ‘mobile’.
We’ve seen this kind of lag between digital opportunity and digital spend before. In Australia at least, we didn’t see an appropriate media investment in Google Adwords until 2008, a full eight years after Google launched the service. Similarly we’re now seeing investment in social media gaining favour, some three years after Facebook became Australia’s number one media destination.
But what’s causing the current lag in mobile app investment; is it an Australian thing, are we just too laid back or do we consciously ignore the data? Possibly, but a more likely cause is a simple lack of ideas, caused in part by a lack of understanding of what’s possible.
To understand what's possible, businesses need to understand the difference between branded apps and brand advertising. The best branded apps do not belong to a campaign; they are not simply another channel through which to communicate the current brand message or product promotion. They are a utility in their own right, something useful that earns the attention of consumers. This type of app should be the obvious starting point for marketing and operations managers: take an existing business process or service and mobilise it through an app, delivering an improved customer experience.
But the more exciting opportunity lies in using mobile apps to provide brand new services that only the mobile Internet can deliver. Look outside Australia and it's not hard to find examples of disruptive new mobile services. Walgreens, the US-based pharmacy chain, has developed an app-driven service allowing customers to 'scan and refill' their own prescriptions, an innovative use of mobile technology that provides a distinct point of differentiation over its competitors.
Australian businesses may have been a little slow out of the mobile blocks but 2013 will see the ideas (and investment) begin to flow. Financial services, health, telecommunications, education and real estate will likely lead the charge. Many of the apps will continue to mobilise existing business services but a few innovative companies will seize the opportunity to do something different.
As with Walgreens, it will be the brave first movers in Australia, who’ll reap the rewards through app differentiation. By considering ‘mobile’ to be an opportunity to deliver useful new services (and not just another channel to market) these disruptive Australian companies will be rewarded with publicity for their innovation, earn referral from their existing customers and increase profitability through more efficient service delivery.
Our latest digital installation created with our friends at Hatchd for Brookfield Place Perth uses the power of Microsoft Kinect to make fingerpainting 100% mess free. Kids (and adults alike) can forget the smock, step into the paint booth and move their hands in space to select and apply colours. Their daubs, strokes and splodges are projected onto the canvas in front of them, and when they’re finished become part of a collaborative mess-free masterpiece.