The State of Mobile 2023: What’s new in iOS 17 & Android 14

Generally, Apple and Google each ship one major software release a year. These releases span their entire hardware and software stacks, and it can be hard to keep track of which new features your business can take advantage of.

If you’re an app developer or manage app developers, this article will help you understand some of the new features available in iOS 17 and Android 14 and how they could be utilised.

First, however, some highlights of global app usage. Data.ai’s annual report showcases some interesting metrics for the past year:

+11% YoY growth in app usage (255 billion downloads)

+3% YoY growth in daily time spent per user (5 hours)

+14% increase in mobile ad spending ($366 billion)

Breaking this down a little, short-form video apps like TikTok are commanding the majority of attention, with over 22% YoY growth in app usage.

And interestingly, app usage has tracked financial market volatility: crypto trading app usage fell 55% YoY, whilst usage of personal finance loan apps jumped a staggering 81%.

So… app usage is steadily increasing, and more money is being spent not only on advertising but on in-app purchases too. How best do we take advantage of this as enterprise app developers?

What’s new in iOS 17

Interactive Widgets

Apple has made much fanfare about extending its widgets to allow for more interactivity and integration with live app data. Widgets can now include buttons, so users can pause events, place bets, change routes… any function within your app can be triggered from a UI element in the new widgets.
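As a rough sketch of how this wiring works: a button in the widget triggers an App Intent, which runs your app’s code without the app being opened. Everything here (the intent name, the reminder ID) is hypothetical and assumes iOS 17’s AppIntents/WidgetKit APIs.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent that marks a reminder as done when the widget button is tapped.
struct MarkDoneIntent: AppIntent {
    static var title: LocalizedStringResource = "Mark Reminder Done"

    @Parameter(title: "Reminder ID")
    var reminderID: String

    init() {}
    init(reminderID: String) {
        self.reminderID = reminderID
    }

    func perform() async throws -> some IntentResult {
        // Update your app's data store here; the widget timeline
        // is reloaded after the intent completes.
        return .result()
    }
}

// Widget view: Button(intent:) executes the intent in place,
// without launching the app.
struct ReminderWidgetView: View {
    var body: some View {
        HStack {
            Text("Buy milk")
            Spacer()
            Button("Done", intent: MarkDoneIntent(reminderID: "reminder-1")) // hypothetical ID
        }
        .padding()
    }
}
```

The same `Button(intent:)` pattern works for any action you expose as an intent, which is what makes the “pause events, place bets, change routes” examples possible.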

iOS 17 Interactive Widgets

Reminders can be marked as done within the widget, as well as lights controlled without needing to open an app.

Live Activities

Taking this one step further, real-time information from your app can now be pushed as an ‘activity’ to a widget, or even to the Dynamic Island, keeping users up to date without them needing to open the app. Think a current pulse reading, live event placings or the current score of a sporting match. 
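A minimal sketch of the ActivityKit flow, using the sports-score example above. The attribute and state names are hypothetical; the `Activity.request` and `update` calls are the standard ActivityKit API (iOS 16.2+).

```swift
import ActivityKit

// Hypothetical attributes for a live sports-score activity.
struct MatchAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var homeScore: Int
        var awayScore: Int
    }
    var matchName: String
}

func startMatchActivity() throws {
    let attributes = MatchAttributes(matchName: "Grand Final")
    let initialState = MatchAttributes.ContentState(homeScore: 0, awayScore: 0)

    // The activity appears on the Lock Screen and in the Dynamic Island.
    // Pass pushType: .token instead of nil to drive updates via push notifications.
    let activity = try Activity.request(
        attributes: attributes,
        content: .init(state: initialState, staleDate: nil),
        pushType: nil
    )

    // Later, update the score without the user opening the app:
    Task {
        let newState = MatchAttributes.ContentState(homeScore: 1, awayScore: 0)
        await activity.update(.init(state: newState, staleDate: nil))
    }
}
```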

Live activities modify widgets on the fly

Widgets can now show live app information and change based on that information state.


Shared Experiences

Using Live Activities within apps, users can share these experiences and collaborate. Imagine sharing a restaurant ordering experience with your friends: each person adds their order to the shared experience, which then pushes the completed order to the restaurant. No more passing the phone around. 

Sharing and collaborating experiences

Sharing and collaborating on playlists, food orders and more is now possible.

watchOS

Your app’s users can now view widgets in a new ‘Smart Stack’ mode, where widgets are ordered by relevance to the user. The first widget might be a countdown timer, the next calendar event, or a widget the user has pinned. Using this concept, your app could surface contextual information at the top of the stack: when the user’s bus is due to arrive, the latest shark sighting near them, or the result of the last bet they placed.

Smart Stack for watchOS

Smart Stack shows widgets in an intelligent order. Widgets are also interactive.

Any type of session can now also be controlled from watchOS widgets, whether it be a workout, a song, a timer or something more relevant to your app. Apple is moving towards faster interactivity for users, without the need to open your app.


TipKit

A great new addition to iOS is the ability to create guided tips for your users. Tips can serve several purposes: demonstrating newly launched features, highlighting hidden features that are generally underutilised, speeding users up with suggestions that improve their experience, or even letting users complete actions within the tip itself. TipKit is a fantastic feature.
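A minimal TipKit sketch, assuming iOS 17’s TipKit framework: the tip type and its copy are hypothetical, while `Tip`, `TipView` and `Tips.configure()` are the framework’s standard entry points.

```swift
import SwiftUI
import TipKit

// Hypothetical tip highlighting an underused feature.
struct FavouriteTip: Tip {
    var title: Text { Text("Save as Favourite") }
    var message: Text? { Text("Long-press any item to add it to your favourites.") }
    var image: Image? { Image(systemName: "star") }
}

struct ItemsView: View {
    private let favouriteTip = FavouriteTip()

    var body: some View {
        VStack {
            // Inline tip rendered directly in the view hierarchy;
            // TipKit handles dismissal and display frequency.
            TipView(favouriteTip)
            Text("Your items")
        }
        .task {
            // Configure once at launch (typically in the App struct).
            try? Tips.configure()
        }
    }
}
```

Tips can also be shown as popovers anchored to a specific control, which suits the “highlight a hidden feature” use case.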

TipKit Examples

Some examples of tips shown within TipKit

Machine Learning

Apple is focussing on greater on-device machine learning, with a swathe of features designed to improve workflows and extend existing applications.

Machine Learning Improvements

3D pose detection and document structure detection

Improved Image Lifting

More advanced image lifting ensures that copying and dragging individual subjects out of photos is more accurate than ever. 

Additional Visual Lookup Options

Visual lookup search (when a user holds the camera up to an object and receives information about it) now includes things such as foods, products, signs and symbols. 

Improved Optical Flow Tracking

Apple has increased the speed at which text is recognised and the refresh rate at which tracking boxes are displayed, creating a smoother experience. For apps that use this feature (AR and navigation apps, for example), recognition occurs faster, which can improve the user experience.

Currency Support For Scanning

When photographing documents or spreadsheets, iOS now recognises currencies, which can come in handy in copy/paste workflows. This could also be used in combination with improved optical flow tracking to dynamically adjust prices within augmented reality apps when a user holds their phone up to a price list.

Document Structure Detection

iOS can now, through on-device machine learning, understand what type of data is present within an image and even preserve its structure, allowing tables to be copied from images into Numbers, for instance. Very handy.

3D Pose Depth Detection

Previously, iOS detected the key positions of arms, legs, joints etc. with high accuracy. A new improvement allows iOS to also detect the depth of these points from the camera, opening up great potential for augmented reality and gaming applications, as well as improving posture accuracy for medical applications.
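A sketch of what 3D pose detection looks like in code, assuming iOS 17’s Vision framework (`VNDetectHumanBodyPose3DRequest`); the image URL is a placeholder.

```swift
import Vision

// Detect 3D body pose in a still image. Each recognised joint now carries
// a position with depth, not just a 2D point.
func detectPose(in imageURL: URL) throws {
    let request = VNDetectHumanBodyPose3DRequest()
    let handler = VNImageRequestHandler(url: imageURL)
    try handler.perform([request])

    guard let observation = request.results?.first else { return }

    // A joint's position is a 4x4 transform relative to the scene origin;
    // its last column holds the translation (in metres).
    let head = try observation.recognizedPoint(.centerHead)
    print("Head position:", head.position.columns.3)
}
```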

In-app Purchases

Some small improvements to the way in-app purchases are structured and managed are arriving in iOS 17, along with new purchase types.

In-app Purchase Examples

Greater flexibility in item types, plus an improved implementation workflow

StoreKit Flow Simplified

The previous implementation flow has been better abstracted to allow faster integration with your apps.

Specific Merchandise Types

Instead of just one-off payment items, a broader set of individual item types has been added for common use cases. For example, there are now merchandise types built in.
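As a sketch of the simplified workflow, iOS 17 adds ready-made SwiftUI storefront views to StoreKit, so a purchase UI can be declared in a few lines. The product identifiers below are hypothetical.

```swift
import SwiftUI
import StoreKit

// ProductView renders a single in-app purchase with its localised
// price and buy button; StoreView lists several products at once.
struct ShopView: View {
    var body: some View {
        VStack {
            ProductView(id: "com.example.megapack") { // hypothetical product ID
                Image(systemName: "shippingbox")      // placeholder product icon
            }
            .productViewStyle(.large)

            StoreView(ids: ["com.example.megapack", "com.example.minipack"])
        }
    }
}
```

StoreKit handles fetching product metadata, purchase confirmation and receipt handling behind these views, which is where most of the old boilerplate lived.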

Wallet & Apple Pay

A slower year for improvements to Apple Pay, with the aforementioned merchandise types being enabled for Apple Pay Later transactions, allowing payment instalments on specific in-app purchases within your app.

Items in Apple Pay Later

Users can now use Apple Pay Later with any type of StoreKit item

Maps

Apple Maps continues its resurgence in popularity, with further improvements rolling out to MapKit, including the ability to implement custom annotations and overlays faster within your apps. Using a custom map? Why not try adding custom waypoints, 3D model detail or area overlays in a breeze.
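A sketch of the expanded SwiftUI MapKit API in iOS 17, where markers, custom annotations and area overlays are declared directly in the map content. The venue name and coordinates are arbitrary example values.

```swift
import SwiftUI
import MapKit

struct VenueMap: View {
    // Hypothetical venue location (central Sydney).
    let venue = CLLocationCoordinate2D(latitude: -33.8568, longitude: 151.2153)

    var body: some View {
        Map {
            // Simple labelled pin.
            Marker("Box Office", coordinate: venue)

            // Fully custom annotation content.
            Annotation("Stage", coordinate: venue) {
                Image(systemName: "music.mic")
            }

            // Shaded area overlay.
            MapPolygon(coordinates: [
                venue,
                CLLocationCoordinate2D(latitude: -33.8570, longitude: 151.2160),
                CLLocationCoordinate2D(latitude: -33.8560, longitude: 151.2160),
            ])
            .foregroundStyle(.blue.opacity(0.3))
        }
        .mapStyle(.standard(elevation: .realistic)) // 3D terrain detail
    }
}
```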

New Map Annotations


CarPlay

Apple now allows app designers to fill irregular screen shapes. Additionally, vehicles equipped with navigation pucks can now move between the in-car infotainment system and CarPlay with ease.

CarPlay Puck Navigation

tvOS

Continuity Camera has now been enabled in tvOS allowing users to transfer a FaceTime call from their phone to TV by placing the phone in front of their TV.

tvOS Continuity Camera Example


Vision Pro

The Vision Pro is Apple’s first major new hardware release since the Apple Watch. The device is a VR/AR hybrid, with Apple coining the technology ‘spatial computing’. 

Vision Pro Developer

A high-end consumer device, it appears to target media and productivity applications - allowing users to interact with multiple large virtual displays within an augmented reality environment, for example. Or watching a film on the moon… 

Vision Pro Development Australia

It will be interesting to see how this potential unfolds after its release in Q1 ‘24.


What’s new in Android 14

Internationalisation

Android now allows per-app language preferences. Multilingual users can now set which language they wish to use within each app. This also extends to regional preferences, such as temperature units or date formats. 


Also added is support for grammatical inflection in languages that utilise it - Polish or Finnish, for example.

Android Regional Options

If your app is international it may be worth investigating these features to give your users the most natural experience possible. 
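For per-app language preferences to appear in system settings, an app declares the locales it supports in a resource file. A minimal sketch of that config (the chosen locales are examples), referenced from the manifest via `android:localeConfig="@xml/locales_config"`:

```xml
<!-- res/xml/locales_config.xml: the languages this app supports -->
<locale-config xmlns:android="http://schemas.android.com/apk/res/android">
    <locale android:name="en"/>
    <locale android:name="pl"/>
    <locale android:name="fi"/>
</locale-config>
```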

Accessibility, Sharing & App Store Changes

Google has rolled out non-linear font scaling for accessibility within Android apps which allows all text to scale in a natural manner and preserve visual hierarchy. 

Sharing Options

Additionally, app developers can now add custom share sheet actions. These could be useful in your app if there are specific ‘share’ experiences you wish to prioritise.

Lastly, app developers now have better control over when Android phones pick up and install app updates. Developers can tune update settings to ensure an app doesn’t update right before a user is predicted to use the device, for example.

Conclusion

iOS 17 and Android 14 are modest upgrades focussed on user flexibility and developer efficiency. Apple’s focus on Live Activities and shared experiences opens the door to a wide range of ideas to improve your enterprise app development. And if it’s ideas you’re looking for, why not reach out to us and see how we can help?
