What's new, what's missing, new challenges and new capabilities for iPhone and iPad
by Maximiliano Firtman (Twitter: @firt)
About 9 min reading time
Not every minor iOS version update includes changes in the Web platform, but iOS 14.5 is an exception. It has a new version of the WebKit engine, changes in WKWebView, Safari 14.1 as the browser, and an updated engine for PWAs. It's also the first update in 2021 for Web and PWA developers since the release of iOS 14. And, as usual, as of today, there are no Safari release notes for this version.
This version's research reminded me of the first article of this series 10 years ago: a mix of excitement and frustration after finding something new but completely undocumented. What should I call the series of iOS articles I've been doing so far? Some options: 'Safari on iOS: The missing documentation', 'The inflammatory* guide to Safari for Web professionals', 'Eternal sunshine of the documentation-less platform' 🤔
Anyway, iOS 14.5 is packed with new things you should know if you are a web developer, including some changes on the PWA platform.
Summary #
- Progressive Web Apps: Changes in Status Bar, Media Session API (experiment), Bug and Missing Features Report
- Speech Recognition API
- Contact Picker API (experiment)
- New `<model>` HTML element for 3D/AR models (experiment)
- Privacy-Preserving Ad Click Attribution
- Paint Timing API
- Other API changes: TransformStream API, Web Authentication 'Modern', WebRTC Sockets, changes in mouse events on iPadOS, Modern Web Audio API (with support for Audio Worklets)
- CSS: individual transform properties, animations for discrete properties and pseudo-elements, flexbox gap property
- Web Assembly: BigInt, sign-extension, bulk-memory, reference types
- Web Views: Added full support for getUserMedia and TouchID/FaceID.
- JavaScript: modules support in Workers and Service Workers, private static methods, top-level-await, and more
Progressive Web Apps on iOS 14.5 #
After defining the idea in 2007 and supporting all the specs in 2018, Apple has finally acknowledged the name 'Progressive Web App' and the platform as an alternative to the App Store. It's not how we were all expecting this to happen, but at least it's a start. No one from the WebKit and Safari teams is talking about PWAs; instead, Apple's lawyers cited PWAs and their compatibility on iOS and iPadOS as evidence against App Store monopoly claims before the Australian Competition and Consumer Commission (ACCC) (see full PDF document). This declaration supports what I said earlier this year about why Apple has kept the PWA platform as a minimal viable product.
Status Bar Change #
The first change I discovered for PWAs in 14.5 is the status bar: it stopped rendering white as it did in 14.4. So I started researching the meta tag with the name `apple-mobile-web-app-status-bar-style`, and I found something interesting. There are new values for this meta tag, never documented by Apple. But they do the trick on iOS 14.5, so if you are expecting a white status bar, you need to make a change.
The Apple documentation about this meta tag was last updated in 2014, and it's outdated. It currently lives in the Documentation Archive, and no other document has replaced it.
These are the values currently accepted by the meta tag: `light-content`, `dark-content`, `hidden`, and `default`. It also still takes two old values, now marked as deprecated: `black` and `black-translucent`. For some years, `white` was also accepted, but it was never documented. Now it's officially out.
If you want a white status bar (dark text over white), you now need to use the new `dark-content` value:
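```html
<meta name="apple-mobile-web-app-status-bar-style" content="dark-content">
```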
The value `black-translucent` is still the only way to simulate something similar to `display: fullscreen` in the Web Manifest on iOS and iPadOS. It's marked as deprecated, but Apple never documented its replacement. Now that I've found the `hidden` value, I think that's the one meant to replace it. However, it's not working: while I can confirm the value is set in the Web Clip (the icon on the home screen), nothing happens differently than with the default.

A white static status bar is not possible if the user has the dark appearance enabled, unless you use `black-translucent` and a white document background color.
Based on the appearance of the device (Light or Dark) and the value of the meta tag, the status bar of your PWA will follow this pattern:
| Meta value | Light Appearance | Dark Appearance |
|---|---|---|
| (no meta present) | W/B | W/B |
| default | B/W | W/B |
| light-content | W/B | W/B |
| dark-content | B/W | W/B |
| hidden | B/W | W/B |
| black (deprecated) | W/B | W/B |
| black-translucent (deprecated) | W/T | W/T |

B/W = black text over a white bar, W/B = white text over a black bar, W/T = white text over transparent (the viewport takes the full screen)
The new values make no sense, and it seems Apple is shipping a half-baked implementation.
We have:

- A deprecated value `black-translucent` with no alternative value
- Three values with the same result (`default`, `dark-content`, `hidden`)
- One value `light-content` equivalent to the deprecated `black` value
- One value `dark-content` equivalent to the removed and never-documented `white` value
- A `hidden` value that hides nothing
- Different results when no meta tag is present than when `default` is used as a value
For some reason, the iOS Simulator exposes different status bar colors than the ones I see on real devices.
Nothing seems to make sense, but it is what it is 🤷♂️
I'm not sure whether these new values appeared just now or have been hidden there for a while, but from iOS 14.5, the behavior of the deprecated `black` value and of `default` is changing, so the new undocumented (but actual) values appear to be the only solution.
What about theme-color? #
The Web App Manifest spec and the HTML Living Standard support the theme color, which Android and desktop PWAs use to tint the browser, the status bar, or the title bar. Safari has ignored this value in the past, and it's still missing in Safari 14.1. However, there is good news: WebKit has added support for it, it's now part of the Manifest data structure, and WKWebView will have a `themeColor` property.
It's still unclear if Apple will use that property, parsed by WebKit, to style PWAs and stop using the meta tag in the future; we'll need to wait some months, perhaps until iOS 15, or hope for the miracle of hearing something from the Safari team before then.
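For reference, this is how the value is declared in a Web App Manifest today on platforms that honor it (a minimal sketch; the name and color are placeholders):

```json
{
  "name": "My PWA",
  "display": "standalone",
  "theme_color": "#317efb"
}
```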
Background Audio #
After Web Push and a better installation prompt, background audio for PWAs is the next typical complaint. Well, still no luck. Background audio from a PWA continues playing in the iOS Simulator, but that's not the situation on an actual device. I presume it's because the process is suspended no matter whether you are playing audio.
But there is one hope: the Media Session API is now available as an experiment (disabled by default). With it, when your PWA is playing audio (or even video), the user can control playback from the Control Center and the notifications area. That means this may soon be how Safari allows PWAs to play audio in the background. We'll see!
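A minimal sketch of the Media Session API, assuming an audio element is already in the page and playing (titles and artwork are placeholders):

```js
const audio = document.querySelector('audio');

// Describe the current media so Control Center can display it
navigator.mediaSession.metadata = new MediaMetadata({
  title: 'Episode 12',
  artist: 'My Podcast',
  artwork: [{ src: '/cover-512.png', sizes: '512x512', type: 'image/png' }]
});

// React to the system's transport controls
navigator.mediaSession.setActionHandler('play', () => audio.play());
navigator.mediaSession.setActionHandler('pause', () => audio.pause());
```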
Bugs and Missing Features #
Some old bugs are still there, such as seeing 'Untitled' in the status bar's back button when you open a link from a PWA. Another bug I've seen, though not consistently, is that sometimes the service worker is not available when you are offline and open the app from the home screen.
Also, the missing features that a lot of people are waiting for are still pending, such as:
- PWAs in the App Library
- Web Push
- Background Sync
- Better installation flow (beforeinstallprompt or an App Banner)
- Ability for other browsers to install PWAs
Speech Recognition API #
Safari has partially supported the Web Speech API for years (only the Speech Synthesis part). But now, eight years after Google Chrome started supporting the Speech Recognition part of the spec, the team has decided it was time for Safari to do the same.
From iOS and iPadOS 14.5, Safari and PWAs can recognize speech from the user using the microphone and an Apple cloud-based service. It works only after a user interaction (such as a click handler), and it can recognize multiple languages.
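A minimal sketch, assuming a button with id `talk` in the page (Safari exposes the API with the webkit prefix):

```js
const SpeechRecognition =
  window.SpeechRecognition || window.webkitSpeechRecognition;

// Recognition only works after a user interaction, such as this click
document.querySelector('#talk').addEventListener('click', () => {
  const recognition = new SpeechRecognition();
  recognition.lang = 'en-US';
  recognition.onresult = (event) => {
    console.log('Heard:', event.results[0][0].transcript);
  };
  recognition.start();
});
```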
Update 27/4: Unfortunately, the Speech Recognition API does NOT work inside a standalone PWA. The API is available, so feature detection will find it, but nothing happens. My guess is that it has to do with the speech or microphone permission not working while in a PWA, and with Apple folks not testing within standalone apps, as usual.
The Model HTML Element experiment #
WebKit now supports a new HTML element as an experimental feature that was never proposed or discussed in any standard or spec as far as I know.
The feature is disabled by default, and it represents a 3D model that can potentially be rendered in virtual reality or augmented reality. The `model` element needs one or more `<source>` elements, presumably in different formats. However, while the element is available in the DOM, it's not working yet.
The code to render a 3D or AR model will presumably look something like this (a sketch; the formats shown are my assumption, as nothing is documented):
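```html
<model>
  <source src="teapot.usdz" type="model/vnd.usdz+zip">
  <source src="teapot.glb" type="model/gltf-binary">
</model>
```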
Right now, enabling the experiment does not render any model on the screen. The element is in the DOM with the new constructor `HTMLModelElement`, and the `currentSrc` property works, but nothing is rendered, so this seems like an experiment in its early stages.

The model element will support a `ready` promise to know whether the model could be loaded; a sketch of how it might be used (assuming the promise rejects on failure):
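```js
// In a module script, so top-level await is available
const model = document.querySelector('model');

try {
  await model.ready; // should resolve once the model has loaded
  console.log('Model ready');
} catch (error) {
  console.log('The model could not be loaded');
}
```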
We need to remember that the ability to render 3D/AR models was added to Safari 12 in 2018, but through a weird `<a>` element with `rel="ar"`.
It's interesting to see Apple innovating again with the Web; it's also odd to see them implementing something that was not discussed anywhere before when they always complain that Chrome implements features that are not yet standardized.
Contact Picker API #
The Contact Picker API is now an official experiment (disabled by default) in Safari and PWAs. It lets web apps pick a contact from a selector and get the value(s) as a JavaScript object. This API has been in Chrome for a year and a half, making this one of the shortest gaps between a Chrome and a Safari implementation in several years.
The properties available for querying in this version of iOS/iPadOS are:
- name
- tel
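A minimal sketch, assuming a button with id `pick` in the page (the experiment must be enabled first):

```js
document.querySelector('#pick').addEventListener('click', async () => {
  try {
    const contacts = await navigator.contacts.select(['name', 'tel'], {
      multiple: false
    });
    console.log(contacts); // e.g. [{ name: ['Jane Doe'], tel: ['+15550100'] }]
  } catch (error) {
    // The picker was dismissed, or an unsupported property was queried (TypeError)
    console.error(error);
  }
});
```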
The similar native component `CNContactPickerViewController` has more properties available, such as `givenName`, `familyName`, `jobTitle`, `organizationName`, `postalAddress`, and `socialProfiles`, that are not available from the Web platform at this time.
Chrome on Android has also supported `address` and `icon` as queryable properties since version 83. The spec makes the list of properties dynamic so that every platform can pick which properties to expose. Do not query for unsupported properties, or you will get a TypeError exception.
Privacy-Preserving Ad Click Attribution #
Private Click Measurement (PCM) is a new capability, available in Safari from iOS 14.5, to track click conversions and link attribution for seven days. It's also the first time WebKit has documented something new while shipping it in a beta version of iOS.

If you read the spec, the WebKit blog, and the Safari settings, you will see different names for the same feature: 'Privacy Preserving Ad Click Attribution,' 'Private Click Measurement,' 'Privacy Preserving Ad Measurement.'

To use this feature, each link must include new attributes, and attribution will be sent over well-known URLs. 256 source IDs can be tracked with 16 different conversion events. Attribution data is stored client-side for seven days and can be deleted or turned off by the user.
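A sketch of what a measured link looks like, using the attribute names from the WebKit proposal (the URLs and source ID here are hypothetical):

```html
<a href="https://shop.example/product/42"
   attributionsourceid="23"
   attributiondestination="https://shop.example">
  Check out this product
</a>
```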
Read more about this spec in the WebKit blog.
The solution also includes a way for native apps on iOS to send click attribution for websites. (Not sure how that part can become standard as it is, though)
There is an experiment that can be enabled to help debug the usage of this API: Private Click Measurement Debug Mode.
Paint Timing API #
The Paint Timing API is out of its experimental phase, and it's now enabled by default. It lets us query just one web performance metric: First Contentful Paint (`name` equals `first-contentful-paint`).
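A minimal sketch of querying the metric with a standard PerformanceObserver:

```js
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log(`FCP: ${entry.startTime} ms`);
    }
  }
}).observe({ type: 'paint', buffered: true });
```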
Other Changes #
Media and CSS #
- Individual transform properties: `rotate`, `scale`, `translate` (see the sketch after this list)
- Flexbox `gap` property is now available
- Web Animations now available for discrete properties and pseudo-elements
- Some reports say that the WebM video container format and the Ogg Vorbis audio codec should be available in this version, but I couldn't confirm it with several samples. It seems these formats are available in Safari 14.1 only on macOS.
- CSS Aspect Ratio (experiment, disabled by default)
- CSS Overscroll Behavior (experiment, disabled by default)
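A small sketch of the individual transform properties and the flexbox `gap` mentioned in the list above (selectors are illustrative):

```css
/* Individual transform properties: no need to combine everything
   into a single transform declaration anymore */
.card {
  translate: 0 8px;
  rotate: 10deg;
  scale: 1.05;
}

/* gap now works in flex containers, not only in grid */
.toolbar {
  display: flex;
  gap: 12px;
}
```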
Web Platform APIs #
- TransformStream API (see the sketch after this list)
- 'Modern' Web Audio API: support for Audio Worklets and other additions
- 'Modern' Web Authentication: I don't know what that means
- WebRTC Sockets
- WebRTC VP9 profile 0 enabled
- Web Assembly improvements: BigInt, sign-extension, bulk-memory, reference types
- MediaRecorder API seems to be there, but I can't confirm it yet; my code didn't work
- Changes in mouse events on iPadOS: `wheel` event gestures are now non-blocking (`preventDefault()` is only allowed on the first `wheel` event); `wheel` event listeners on the root are not passive; some changes to `hover`/`pointer` media queries are available
- Fixed `devicemotion` and `deviceorientation` events to work in third-party iframes when Feature-Policy allows it
- New permission dialog may be available for iframes using the Geolocation API
- WritableStream API (previously a disabled-by-default experiment)
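A quick sketch of the TransformStream and WritableStream classes from the list above (the uppercasing transform is just an example):

```js
// In a module script, so top-level await is available
const upper = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  }
});

const logger = new WritableStream({
  write(chunk) {
    console.log(chunk); // "HELLO SAFARI"
  }
});

upper.readable.pipeTo(logger);

const writer = upper.writable.getWriter();
await writer.write('hello safari');
await writer.close();
```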
JavaScript runtime #
- Modules support in Workers and Service Workers (see the sketch after this list)
- Private static methods
- Top-level await
- WeakRef
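A sketch of two of these features combined: registering a Service Worker as a module, and top-level await inside a module (file names are illustrative):

```js
// main.js: register a Service Worker written as an ES module
navigator.serviceWorker.register('/sw.js', { type: 'module' });

// config.js: top-level await, no async wrapper function needed
const response = await fetch('/config.json');
export const config = await response.json();
```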
WebViews and other browsers #
- Full support for getUserMedia
- TouchID and FaceID support using WebAuthentication
- Picture in Picture support
Experiments that are still disabled by default #
- WebGL 2
- HTTP/3
- Web Share Level 2
- Lazy Image
Other New Experiments (disabled by default) #
- Disable Media Experience PID Inheritance
- Lazy iframe loading
- ScreenCapture (I couldn't test it)
- WebRTC Insertable Streams
Anything else? #
Did you find anything else? Let me know on Twitter @firt
Learn about the key technologies and capabilities available in the iOS SDK, the toolkit you use to build apps for iPhone, iPad, or iPod touch. For detailed information on API changes in the latest released versions, including each beta release, see the iOS & iPadOS Release Notes.
iOS 15 SDK
With the iOS 15 SDK, you can build apps that create new kinds of shared experiences with SharePlay and the Group Activities API. Swift 5.5 introduces concurrency support, built into the language with async/await and Actors. Focus and notifications help users concentrate on what matters most, and provide new APIs for your app to differentiate which notifications users need to see most. ARKit and RealityKit provide powerful customization capabilities to help your AR experiences look even more convincing. Create ML gets easier and more powerful with Swift and playground integration, as well as on-device training. And web extensions come to Safari on iOS and iPadOS for even more flexible and powerful browsing experiences.
SharePlay and Group Activities
SharePlay offers a new way for people to share your app. Media streaming apps can let users share content through the new Group Activities API with full-fidelity video and all syncing handled by the system. And for shared experiences beyond media streaming, the GroupSessionMessenger API offers a secure data channel that syncs information between multiple instances of your apps across multiple users.
Focus and notifications
With Focus, users can have notifications delivered at times that work best for them, and with the Interruption Levels API, you can provide more nuanced delivery with one of four interruption levels (including new Passive and Time-Sensitive levels). Notifications from communication apps now have a distinctive appearance, and these apps can — with user permission — sync their status to reflect the user’s current system-level Focus status.
SwiftUI
SwiftUI brings new features, such as improved list views, better search experiences, and support for control focus areas. Gain more control over lower-level drawing primitives with the new Canvas API, a modern, GPU-accelerated equivalent of drawRect. And with the new Accessibility Representation API, your custom controls easily inherit full accessibility support from existing standard SwiftUI controls.
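A minimal sketch of the new Canvas API (the view and what it draws are illustrative):

```swift
import SwiftUI

struct DiagonalView: View {
    var body: some View {
        Canvas { context, size in
            // Build a simple diagonal path and stroke it;
            // drawing happens through a GPU-accelerated context
            var path = Path()
            path.move(to: .zero)
            path.addLine(to: CGPoint(x: size.width, y: size.height))
            context.stroke(path, with: .color(.blue), lineWidth: 2)
        }
    }
}
```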
UIKit
UIKit introduces sheet presentation controllers, which let you present your view controller as a customizable, resizable sheet. UIKit provides new APIs for configuring buttons, displaying pop-up buttons, a new chromeless bar appearance, image decoding, and creating a thumbnail version of an image. And starting in iOS 15, drag and drop on iPhone is enabled by default.
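A minimal sketch of a resizable sheet presentation (the presented controller here is a placeholder):

```swift
import UIKit

func presentDetail(from presenter: UIViewController) {
    let detail = UIViewController() // your real content controller here
    if let sheet = detail.sheetPresentationController {
        // Let the user resize between half-height and full-height
        sheet.detents = [.medium(), .large()]
        sheet.prefersGrabberVisible = true
    }
    presenter.present(detail, animated: true)
}
```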
Keyboard layout guides
New keyboard layout guides give you an easy way to adapt your app's layout based on the keyboard's size and position. Support for the new tracking layout guide in UIKit automatically enables and disables constraints when the keyboard is docked to the bottom of the screen, undocked, or floating over your app, letting you provide a great text input experience.
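A minimal UIKit sketch, pinning a hypothetical text field just above the keyboard:

```swift
import UIKit

final class ComposeViewController: UIViewController {
    private let textField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        textField.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(textField)

        // The guide tracks the keyboard as it docks, undocks, or floats
        NSLayoutConstraint.activate([
            textField.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 16),
            textField.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -16),
            textField.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor, constant: -8)
        ])
    }
}
```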
Core Location UI
CoreLocationUI is a brand-new framework that introduces the location button, which lets people grant your app temporary authorization to access their location at the moment it’s needed. This button interacts securely with Core Location to request authorization to access location data.
Accessibility
The Accessibility framework introduces audio graphs, a new way to represent data in your charts and graphs that allows VoiceOver to construct and play an audible representation of it. This framework also adds an API to query information relevant for MFi hearing devices, such as streaming preferences, streaming capabilities, and paired hearing devices.
Augmented Reality
ARKit 5
ARKit 5 brings Face Tracking support to the Ultra Wide camera in the latest iPad Pro (5th generation), letting it track up to three faces at once using the TrueDepth camera to power front-facing camera experiences like Memoji and Snapchat.
RealityKit 2
Turn photos from your iPhone or iPad into high-quality 3D models that are optimized for AR in minutes using the new Object Capture API on macOS. And brand-new capabilities give you more control over your AR objects and scene with custom render targets and materials, customizable loading for assets, player-controlled characters, and more.
Machine learning
Create ML
Create ML is now available as a Swift framework on iOS and iPadOS, in addition to macOS. You can programmatically experiment and automate model creation in Swift scripts or playgrounds. Build dynamic app features that leverage Create ML APIs to train models directly from user input or behavior on-device, allowing you to provide personalized and adaptive experiences while preserving user privacy.
Create ML adds the Hand Pose and Hand Action classifier tasks to both the Create ML API and the developer tool included with Xcode. These classifiers recognize hand positions in still images and hand movements in videos, respectively.
Core ML
Core ML adds ML Packages, a new, future-looking model format that provides the flexibility to edit metadata and visibility to track changes with source control. Core ML also adds ML Programs, a new model type that compiles more efficiently, decouples a model’s architecture from its weights, and offers more control over the computational precision of its intermediate tensors. The new MLShapedArray API lets you work with multidimensional data using idiomatic Swift that improves the code’s type safety and readability.
Tabular Data. Tabular Data makes it easy to programmatically import information from JSON and CSV files and prepare datasets ready for Core ML, Create ML, or your own custom solution. Use Tabular Data’s central DataFrame API to sort, join, group, split, encode, decode, explode, filter, slice, combine, and transform the rows and columns of your tabular data to meet your needs.
Sound Analysis. Sound Analysis adds a new sound classifier that your apps can use to identify over 300 unique sounds from live audio or an audio file. A new time window duration API lets you tune prediction accuracy versus time precision.
Games
GameKit
GameKit provides new ways to discover and invite players to participate in a game. Players can now invite contacts, message groups, and anyone with a phone number or email address. Players see the status of other players receiving and accepting invitations and, optionally, start with a minimum number of players while waiting for others to join.
Game Controller
The Game Controller framework adds virtual controllers — software emulations of real controllers that users interact with similarly to real controllers. You choose the configuration and the input elements to display specifically for your game.
StoreKit 2
StoreKit’s new In-App Purchase API provides a simple, powerful, and secure way to work with your app’s products and transactions. The new API takes advantage of modern Swift features, such as concurrency, to simplify your in-app purchase workflow. Its cryptographically signed transaction and subscription information uses the JSON Web Signature (JWS) format, which is a secure and simple way to parse on the client. A new entitlements API makes it easy to determine which content and services your app should unlock for users. Use the new StoreKit API throughout the in-app purchase process — from displaying in-app purchases, to managing access to content and providing customer service within your app.
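A sketch of the new async In-App Purchase flow, under the assumption of a product identifier "com.example.premium":

```swift
import StoreKit

func purchasePremium() async throws {
    // Load the product, then start a purchase with the modern async API
    guard let product = try await Product.products(for: ["com.example.premium"]).first else {
        return
    }
    let result = try await product.purchase()
    if case .success(let verification) = result,
       case .verified(let transaction) = verification {
        // Unlock content for the user, then tell the store we're done
        await transaction.finish()
    }
}
```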
Apple Pay
Give users more options by adding coupons, deferred payments, recurring payments, shipping dates, and read-only pickup addresses to your Apple Pay transactions.
Safari Web Extensions
Safari Web Extensions use HTML, CSS, and JavaScript to offer powerful browser customizations and new functionality across the web. With iOS 15, Safari Web Extensions are now available on all Apple devices that support Safari.
Screen Time
Apps with parental controls can support a wider range of tools for parents with the Screen Time API. You can use key features, such as core restrictions and device activity monitoring, in a way that puts privacy first.
ManagedSettings and ManagedSettingsUI. Use ManagedSettings to define usage policies and settings constraints on a parent or guardian’s device and apply them on other devices in the Family Sharing group. ManagedSettingsUI provides an opportunity for you to customize Screen Time API’s shielding views to match your app’s branding and style.
FamilyControls. FamilyControls gives control to parents or guardians in Family Sharing groups by requiring them to authorize parental controls on a device signed into a child’s iCloud account. FamilyControls provides a secure environment where only family members in the Family Sharing group can authorize access. It also provides a secure way to select apps, web domains, and categories that protects the user’s privacy.
DeviceActivity. DeviceActivity provides a privacy-preserving way for an app to monitor a user’s app and website activity.
ShazamKit
Enrich your app experience with audio recognition. Match music to the millions of songs in Shazam’s vast catalog or make any prerecorded audio recognizable by building your own custom catalog using audio from video, podcasts, and more.
MusicKit
Easily integrate Apple Music into your iOS and iPadOS apps using Swift. The MusicKit framework provides a new model layer for accessing music items in Swift, as well as playback support so you can add music to your app.
Nearby Interaction
Build apps that interact with accessories simply by being in close proximity to an Apple device that includes the U1 chip. Taking advantage of Ultra Wideband technology lets you create more precise, directionally aware app experiences.
HomeKit
HomeKit APIs in iOS 15 SDK automatically work with Matter-enabled accessories. Start testing your smart home apps with Matter, the unifying open-connectivity standard designed to increase the compatibility of smart home accessories, so they work seamlessly with your devices.
HealthKit
HealthKit adds the ability to request one-time access to a verifiable clinical record. These records bundle information about the user’s identity with clinical data, like an immunization record or a lab test result. The organization that produced the data cryptographically signs the bundle, which HealthKit apps can access and verify.
CloudKit
CloudKit builds on top of the new async/await support in Swift 5.5, making the asynchronous API easier to use and more configurable. CloudKit adds Record Zone Sharing, which builds on the existing sharing infrastructure to let users share the entire contents of a record zone with other iCloud users. You can now encrypt a record’s values using new APIs on CKRecord, helping you offer strong privacy guarantees to your users. The new CloudKit Schema Language allows you to retrieve and upload textual representations of your CloudKit schema, which means you can now version it using the same tools as your app’s source code.
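A small sketch of the async/await surface (the record ID and database choice are illustrative):

```swift
import CloudKit

func fetchNote(with id: CKRecord.ID) async throws -> CKRecord {
    let database = CKContainer.default().privateCloudDatabase
    return try await database.record(for: id)
}
```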
CloudKit Console. CloudKit improves your workflows with a brand-new CloudKit Console, an intuitive web-based control panel that you can use throughout the development lifecycle of your app, and cktool command line interface.
Core Data. Core Data provides new APIs that facilitate the sharing of managed objects with other iCloud users, specifically for CloudKit-backed persistent stores. In addition, you can now choose to encrypt an entity’s attributes before they’re saved to iCloud. Spotlight integration has also been enhanced, with additional APIs that allow for fine-grained control over what data is added to the index and when.
Virtual conference extension
Apps that provide virtual conference services can use this new app extension in EventKit to integrate directly into users’ calendar events. You’ll be able to provide custom locations for events, a link that lets people join the conference with a single tap, and additional information, like dial-in details.
iOS 14
With the iOS 14 SDK, users can more easily discover your app’s core functionality through app clips. SwiftUI introduces a new app life cycle and new view layouts. It supports the new WidgetKit framework, which allows your app to display information directly on the iOS Home screen. Machine learning adds style transfers and action classification to the models, and offers a CloudKit-based deployment solution. Vision API additions help your app analyze image and video more thoroughly. ARKit advances promote an even tighter integration with the world around the device, and you can include markups in your emails and websites that help Siri Event Suggestions surface your events.
App Clips
An app clip is a lightweight version of your app that offers users some of its functionality. It’s discoverable at the moment it’s needed, fast, and quick to launch. Users discover and open app clips from a number of places, including Safari, Maps, and Messages, as well as in the real world through QR codes and NFC tags. App clips also provide opportunities for users to download the full app from the App Store. To learn how to create your own app clips, see the app clips documentation.
Widgets
Widgets give users quick access to timely, at-a-glance information from your app right on the iOS Home screen. iOS 14 offers a redesigned widget experience. Your app can present widgets in multiple sizes, allow user customization, include interactive features, and update content at appropriate times. To learn about designing widgets, see the Human Interface Guidelines. To learn how to support widgets in your app, see the WidgetKit framework.
SwiftUI
SwiftUI provides a selection of new built-in views, including a progress indicator and a text editor. It also supports new view layouts, like grids and outlines. Grids and the new lazy version of stacks load items only as needed.
Starting in Xcode 12, you can now use SwiftUI to define the structure and behavior of an entire app. Compose your app from scenes containing the view hierarchies that define an app’s user interface. Add menu commands, handle life-cycle events, invoke system actions, and manage storage across all of your apps. By incorporating WidgetKit into your app, you can also create widgets that provide quick access to important content right on the iOS Home screen or the macOS Notification Center. For more information, see App Structure and Behavior.
ARKit
ARKit adds Location Anchors, which leverage the refined location data in the new Apple Maps to enable rear-camera AR experiences in specific geographic locations. A new Depth API lets you access even more precise distance and depth information captured by the LiDAR Scanner on iPad Pro. To learn more about these features, see the ARKit framework documentation.
Machine Learning
Your machine learning apps gain new functionality, flexibility, and security with the updates in iOS 14. Core ML adds model deployment with a dashboard for hosting and deploying models using CloudKit, so you can easily make updates to your models without updating your app or hosting the models yourself. Core ML model encryption adds another layer of security for your models, handling the encryption process and key management for you. The Core ML converter supports direct conversion of PyTorch models to Core ML.
The Create ML app’s new Style Transfer template stylizes photos and videos in real time, and the new Action Classification template classifies a single person’s actions in a video clip. For more information, see the Core ML and Create ML developer documentation.
Vision
With iOS 14, the Vision framework has added APIs for trajectory detection in video, hand and body pose estimation for images and video, contour detection to trace the edges of objects and features in image and video, and optical flow to define the pattern of motion between consecutive video frames. To learn more about these features, see the Vision framework documentation. In particular, read Building a Feature-Rich App for Sports Analysis to find out how these features come together in a sample app.
Natural Language
The Natural Language framework has new API to provide sentence embedding that creates a vector representation of any string; word tagging to train models that classify natural language, customized for your specific domain; and confidence scores that rank the framework’s predictions. For more information, see the Natural Language framework documentation.
App Store Privacy Information
Privacy is at the core of the entire iOS experience, and new privacy information in the App Store gives users even more transparency and control over their personal information. On iOS 14, apps will be required to ask users for permission to track them across apps and websites owned by other companies. Later this year, the App Store will help users understand apps’ privacy practices, and you’ll need to enter your privacy practice details into App Store Connect for display on your App Store product page.
Siri Event Suggestions Markup
You can use the Siri Event Suggestions Markup to provide event details on a webpage and in email. Siri parses travel arrangements, movies, sporting events, live shows, restaurant reservations, and social events. Once parsed, Siri can suggest driving directions, a ride share to a scheduled event, or activation of Do Not Disturb just before a show starts. To learn how to integrate your own events with Siri, see the Siri Event Suggestions Markup documentation.
PencilKit
PencilKit now enables handwriting recognition inside text fields. Using gestures, users can also select or delete text, and join or break up words. You can add data detection to your app, as well as text and shape recognition and selection. For more information, see the PencilKit framework documentation.
Accessibility
A new Accessibility framework lets your app dynamically deliver a subset of accessible content to a user based on context.
MetricKit
MetricKit adds Diagnostics, a new type of payload that tracks specific app failures, such as crashes or disk-write exceptions. For more information, see the MetricKit framework documentation.
Family Sharing for In-App Purchases
Family Sharing is a simple way for users to share subscriptions, purchases, and more with everyone in their household. And with iOS 14, you can choose to offer Family Sharing for your users’ in-app purchases and subscriptions so their whole family can enjoy the added benefits. See the SKProduct and SKPaymentTransactionObserver for the new APIs.
Screen Time
iOS 14 includes Screen Time APIs for sharing and managing web-usage data and observing changes a parent or guardian makes. For more details, see the Screen Time framework documentation.
Uniform Type Identifiers
Use the new Uniform Type Identifiers framework to describe file formats and in-memory data for transfer, such as the pasteboard; and to identify resources, such as directories, volumes, and packages.
File Compression
Use the new Apple Archive framework to perform fast, multithreaded, lossless compression of directories, files, and data in iOS.