Explore Code Samples for iOS 11 in Swift

MentorMate
12 min read · Apr 19, 2018


iOS 11, introduced at Apple’s Worldwide Developers Conference (WWDC) 2017, offers a range of significant improvements in the functions and capabilities of the mobile OS in response to mobile users’ evolving expectations.

Here are some of the features MentorMate’s iOS team finds most exciting — plus code samples that can bring your user experiences to the next level.

Drag and Drop

Drag and drop is a way to move data graphically from one application to another, or even within the same application. This feature allows for very fast, responsive, and efficient movement through and between apps. The drag and drop APIs are structured so that data is never moved unless it’s requested.

They are also designed to deliver the data to applications asynchronously. This means requests never block the UI/main thread, which would cause the app to freeze and prevent users from interacting with it while data is being processed.

Drag and Drop Makes it Easier to Share Content

With a single finger, a user can move or duplicate selected photos, text, or other content by dragging the content from one location to another, then raise the finger to drop it in place. Touching and holding selected content makes it appear to rise and adhere to the user’s finger. As the content is dragged, animations and visual cues identify possible destinations.

The system also displays a badge that indicates when dropping isn’t possible, or will result in duplicating the content rather than moving it. Drag and drop involves moving selected content from a source location to a destination. These locations can be in the same container, like a text view, or in different containers, like text views on opposite sides of a split view. Drag and drop can even be integrated into table views, collection views, or custom views.

The feature can be incorporated into any application aiming for a better user experience, regardless of its type or complexity. Drag and drop operations feel intuitive and easy to use.

How the New Drag and Drop API Improves Your App’s Security

This feature offers greater data security in a way that the pasteboard couldn’t in previous versions of iOS. In particular, data is only visible to the destination application that the user has selected.
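As an illustration, here is a minimal sketch of making an image view draggable with the new APIs; the view controller and `photoView` names are ours, not from Apple’s samples:

```swift
import UIKit

// A minimal sketch: making an image view draggable with the iOS 11
// drag and drop APIs. `photoView` is an assumed property of this screen.
class PhotoViewController: UIViewController, UIDragInteractionDelegate {

    let photoView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        photoView.isUserInteractionEnabled = true
        // Attaching a drag interaction lets the system lift the view's
        // content when the user touches and holds it.
        photoView.addInteraction(UIDragInteraction(delegate: self))
    }

    // Called when a drag begins. Note that only an item provider is
    // handed over here; the data itself is delivered asynchronously,
    // and only when the drop destination actually requests it.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = photoView.image else { return [] }
        return [UIDragItem(itemProvider: NSItemProvider(object: image))]
    }
}
```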

View code samples for Drag and Drop in Swift.

Natural Language Processing API

Natural Language Processing (NLP) is a field of artificial intelligence and computational linguistics concerned with the interactions between computers and human languages. Apps that use NLP are designed to make those interactions more intuitive, reducing the burden of manual data entry on the user.

The natural language processing APIs use machine learning to understand text through features such as language identification, tokenization, lemmatization, part-of-speech tagging, and named entity recognition. Gathering intelligence from natural language input is one of the most complex and sought-after AI capabilities, with many underlying processes involved.

NLP Lets You Talk to — Not at — Your Phone

The most recognizable application of NLP technology may be Siri, but it’s now present in almost every system application. For example, when you type a text message, iOS uses NLP to predict your next word or suggest a spelling correction.

From search to customer service, NLP is expected to become increasingly involved in people’s everyday use of technology in the future.
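To give a taste of the API, here is a minimal sketch using NSLinguisticTagger, the class behind these features in iOS 11; the sample sentence is ours for illustration:

```swift
import Foundation

// A minimal sketch of language identification and named entity
// recognition with the iOS 11 NSLinguisticTagger API.
let text = "Tim Cook introduced iOS 11 at WWDC in San Jose."

let tagger = NSLinguisticTagger(tagSchemes: [.language, .nameType], options: 0)
tagger.string = text

// Language identification
print(tagger.dominantLanguage ?? "unknown")   // "en"

// Named entity recognition: enumerate word tokens and keep only
// personal, place, and organization names.
let nameTags: [NSLinguisticTag] = [.personalName, .placeName, .organizationName]
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation, .joinNames]
tagger.enumerateTags(in: NSRange(text.startIndex..., in: text),
                     unit: .word,
                     scheme: .nameType,
                     options: options) { tag, tokenRange, _ in
    if let tag = tag, nameTags.contains(tag), let range = Range(tokenRange, in: text) {
        print("\(text[range]): \(tag.rawValue)")   // e.g. "Tim Cook: PersonalName"
    }
}
```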

View code samples for Natural Language Processing API in Swift.

Persistent History Tracking

Persistent History Tracking is one of the new Core Data features in iOS 11. It addresses the scenario where multiple contexts and extensions update the same persistent store or data file.

Instead of reloading a file after each small update to it — a very inefficient process — an app can fetch just the most recent changes thanks to persistent history tracking. This feature ensures that all data is appended to the data store, relieving the user of the burden of constantly hitting “save”. Even if the application crashes or exits unexpectedly, persistent history tracking ensures that the most recently added data remains in place.

Users Get More Peace of Mind with Persistent History Tracking

The feature can be used in any application that requires flawless persistent store synchronization across multiple processes. For example, persistent history tracking can be used by an application containing a notification service extension, like a messaging application:

  • Imagine the application is turned off and the device is locked. The user receives a notification, which triggers the notification service extension’s functionality (while this extension is independent of the main application, the two can share the same persistent store).
  • Once the notification is received, the extension could download some data from a web service and save it into the persistent store. When the user unlocks their phone and opens the main application, they can expect the downloaded data to be merged into the main app.
  • With this feature, the main application can determine when a data update was performed by the extension and take appropriate action. The user doesn’t receive any alert or other UI notification when this happens; once the main application is launched, all of the data is available, as sketched below.
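Here is a minimal sketch of how the main app might pick up those changes, assuming an NSPersistentContainer for a model named “Messages”, a store shared with the extension via an app group, and a `lastToken` persisted from the previous launch (these names are ours):

```swift
import CoreData

// A minimal sketch of persistent history tracking in the main app.
let container = NSPersistentContainer(name: "Messages")
var lastToken: NSPersistentHistoryToken? = nil   // nil on first run

// Opt in to history tracking before loading the store.
container.persistentStoreDescriptions.first?
    .setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
container.loadPersistentStores { _, error in
    if let error = error { fatalError("Store failed to load: \(error)") }
}

// Fetch only the changes made after the last token (for example, rows
// the notification extension inserted) and merge them.
let context = container.newBackgroundContext()
context.perform {
    let request = NSPersistentHistoryChangeRequest.fetchHistory(after: lastToken)
    guard let raw = try? context.execute(request),
          let result = raw as? NSPersistentHistoryResult,
          let transactions = result.result as? [NSPersistentHistoryTransaction]
    else { return }

    for transaction in transactions {
        container.viewContext.perform {
            container.viewContext.mergeChanges(
                fromContextDidSave: transaction.objectIDNotification())
        }
        lastToken = transaction.token   // save this for the next launch
    }
}
```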

View code samples for Persistent History Tracking in Swift.

PDFKit

The Portable Document Format, better known as PDF, is a popular file format used to present and exchange documents containing text, fonts, and graphics across platforms. PDFs are independent of any one application, software, hardware, or operating system, which explains their popularity. Given their universality, it’s important that PDFs are accommodated in the mobile world.

This reality has not been lost on iOS developers.

In the days before iOS 11, supporting PDFs in an application required the CoreGraphics PDF APIs. They allowed developers to read, write, and modify documents — but that was all. Live interactions with the document were not supported.

If the user’s goal was just to present a PDF document, UIDocumentInteractionController or even UIWebView would have sufficed. But for more complex actions, many developers preferred to use third-party libraries.

How PDFKit Breathes New Life into Mobile Viewing of PDFs

iOS 11 succeeds in overcoming previous PDF-related limitations with Apple’s PDFKit. It’s powerful and easy to use, and these are some of the features developers are most excited to incorporate into their work:

  • displaying documents
  • adding, swapping, inserting, and removing pages
  • creating new documents
  • choosing layout, direction, page padding, zoom factor, and zoom behavior
  • selecting text
  • highlighting search results
  • adding annotations
  • supporting accessibility
  • supporting VoiceOver
  • extracting text
  • filling out forms
  • creating a table of contents
  • jumping to URLs
  • encrypting documents on opening and saving
  • reading the attributes of a document
  • unlocking documents
  • adding custom drawings to pages

PDFKit Gives Users More Functionality and Reduces Developer Burden

The sample code we provide shows how effortless using PDFKit is. With just a few lines of code, we implemented text search in a PDF file.

Until recently, this would have required a web view with injected JavaScript functions, or the use of developer-unfriendly low-level APIs or C/C++ code.

With PDFKit, it’s all done in native code in a matter of minutes. And that is just a small example of the features your app can adopt with the powerful new framework.
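For illustration, here is a minimal sketch of such a search; the bundled file name “manual.pdf” and the search term are ours:

```swift
import PDFKit
import UIKit

// A minimal sketch of text search and highlighting with PDFKit.
class PDFSearchViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let pdfView = PDFView(frame: view.bounds)
        pdfView.autoScales = true
        view.addSubview(pdfView)

        guard let url = Bundle.main.url(forResource: "manual", withExtension: "pdf"),
              let document = PDFDocument(url: url) else { return }
        pdfView.document = document

        // Synchronous search: one PDFSelection per match.
        let matches = document.findString("iOS 11", withOptions: .caseInsensitive)
        for selection in matches {
            selection.color = .yellow   // highlight color for the match
        }
        pdfView.highlightedSelections = matches
    }
}
```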

View code samples for PDFKit in Swift.

Vision Framework

The release of iOS 11 also signaled Apple’s introduction of Vision, a brand new framework that allows developers to implement computer vision (the technological discipline concerned with enabling computers to interpret data from images) without advanced expertise in the field.

So what kind of computer vision problems does Vision solve for users and developers?

Let’s start with face detection.

Face Detection is Not New But Much Improved

While face detection is not new to the iOS SDK, Vision’s implementation is much improved. Its functionality is based on deep learning, supported by the new Core ML framework. This means that Vision functions faster and with greater accuracy, detecting even distant faces, profiles, and faces partially hidden by glasses and hats.

Vision also offers detection of distinct facial features, using a constellation of points to identify the contour of the chin or the nose, the outline of the mouth, eyebrows, pupils, etc.

Our sample code demonstrates how a simple overlay added to a chosen image can draw the lines connecting these points.
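Here is a minimal sketch of the detection step itself, assuming an input UIImage named `photo`:

```swift
import Vision
import UIKit

// A minimal sketch of face landmark detection; `photo` is an assumed input.
func detectFaceLandmarks(in photo: UIImage) {
    guard let cgImage = photo.cgImage else { return }

    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // Each region is a set of normalized points (0...1) that can
            // be connected with lines to draw an overlay.
            if let chin = face.landmarks?.faceContour {
                print("Chin contour: \(chin.pointCount) points")
            }
            if let mouth = face.landmarks?.outerLips {
                print("Mouth outline: \(mouth.pointCount) points")
            }
        }
    }

    // Vision requests can take time; keep them off the main thread.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```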

Here are other useful applications of Vision Framework:

  • image registration — aligning two images based on their content, e.g. for panoramas or image stacking
  • text detection — finding regions of visible text in an image
  • horizon detection — determining the horizon angle in an image
  • barcode detection — finding and recognizing barcodes in an image
  • object tracking — tracking the movement of previously identified arbitrary objects or rectangular regions across multiple images or video frames
  • custom models — integrating your own Core ML models directly

Image and video editing apps will benefit the most from the updated Vision Framework. But applications used for translation, shopping, and productivity — just to name a few categories — also stand to increase their power and utility thanks to this new framework.

Why Users and Developers Stand to Benefit From the New Vision Framework

Developers love it because they don’t need any knowledge of deep learning algorithms to use it successfully. They also love that it’s available with the new iOS version out of the box, making it “free.”

Vision functions entirely on the user’s iPhone. Since it doesn’t transfer data to the cloud, users retain a high degree of control over the data that Vision collects and processes.

View code samples for Vision Framework in Swift.

MusicKit

“All the ways you love music. All in one place.” New advancements in Apple’s MusicKit integrate the joy of listening to music into more of the apps that users love.

How?

MusicKit Syncs Users’ Tunes With Their Favorite Apps

MusicKit equips developers with the code needed to integrate Apple Music functionality natively into the applications they build. Additionally, the MusicKit API allows third-party applications to access the user’s Apple Music account, but only after the user grants permission. If a user is not already an Apple Music member, MusicKit offers a trial membership from within the third-party app.

From fitness apps to games, developers have the tools to let users integrate more of their favorite music into their daily tasks and favorite apps.

Our example code implements searching the Apple Music song catalog. You can easily form other queries on the catalog via analogous requests.
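Here is a minimal sketch of such a catalog search, pairing SKCloudServiceController authorization with a plain URLSession call to the Apple Music web API; “DEVELOPER_TOKEN” stands in for a token generated from your Apple Developer account, and the “us” storefront is ours for illustration:

```swift
import StoreKit

// A minimal sketch of a song search against the Apple Music web API.
func searchAppleMusic(for term: String) {
    SKCloudServiceController.requestAuthorization { status in
        guard status == .authorized else { return }

        let query = term.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed) ?? term
        let url = URL(string:
            "https://api.music.apple.com/v1/catalog/us/search?term=\(query)&types=songs&limit=10")!

        var request = URLRequest(url: url)
        request.setValue("Bearer DEVELOPER_TOKEN", forHTTPHeaderField: "Authorization")

        URLSession.shared.dataTask(with: request) { data, _, error in
            guard error == nil, let data = data,
                  let json = try? JSONSerialization.jsonObject(with: data) else { return }
            print(json)   // raw JSON describing the matching songs
        }.resume()
    }
}
```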

View code samples for MusicKit in Swift.

ARKit

An iPad or an iPhone becomes a window into another world thanks to augmented reality.

Apple’s ARKit is a mobile AR platform for developing augmented reality apps on iOS. It is a high level API providing a simple interface to support a powerful set of features.

ARKit Makes Cutting-edge Technology Available to the Masses

Notably, for all of its cutting edge applications, ARKit isn’t exclusive technology.

In fact, most iOS devices in use can support it; the full set of ARKit features requires an iPhone 6s or later.

Ever Wonder What Makes AR Tick?

World tracking is the essential functionality of ARKit — it’s what merges the real with the digital.

The mechanism behind this uses visual inertial odometry. The camera’s images and motion data provide a precise view of where the device is located and how it’s oriented relative to its surroundings.

For all of the complexity of the technology it uses, ARKit doesn’t require adding any additional sensors to the device. All necessary data is captured by the phone, with no additional input needed from the user.

ARKit enables a deeper integration between iOS technology, the physical world, and the user. A new generation of apps can help furniture shoppers visualize a new couch in their home or test whether their dream car can fit in their garage, not to mention many other inventive uses that have yet to come to fruition.
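To show how little code a session requires, here is a minimal sketch of starting world tracking in a view controller; the class and view names are ours:

```swift
import ARKit

// A minimal sketch of starting a world-tracking AR session.
class ARViewController: UIViewController {

    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARWorldTrackingConfiguration.isSupported else { return }

        // World tracking fuses camera images with motion data (visual
        // inertial odometry) to anchor virtual content in the room.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```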

View code samples for ARKit in Swift.

UIViewPropertyAnimator Enables Advanced Animations

Last year Apple introduced a better way to create animations on iOS.

UIViewPropertyAnimator is a class that allows technical teams to animate views and dynamically modify an animation before it finishes. In other words, they can easily create gesture-based interactive and interruptible animations.

iOS 11 adds two new properties to the UIViewPropertyAnimator class:

  • scrubsLinearly — Usually, when an interactive animation is interrupted, the timing of the animation transitions to a linear one. This serves to make the interaction smoother. Once the animation continues, the original timing function is restored. With the new animator’s property, scrubsLinearly, a developer can disable this transition so the animation will preserve its original timing curve throughout the interaction.
  • pausesOnCompletion — By default, when an animation finishes, it transitions to an inactive state. That means one can no longer manipulate, interact with, or even reverse the animation once it has run its course. But by enabling pausesOnCompletion, the animation stays paused at a fractionComplete of 1.0, and the developer can still adjust, reverse, and trigger the animation again.

Another update to UIViewPropertyAnimator is the ability to create an animator with no animations at all, then add them later via blocks and run them together right away.

Last but not least, the cornerRadius property of CALayer can now be fully animated.
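Here is a minimal sketch combining the new pieces; the `box` view is ours for illustration:

```swift
import UIKit

// A minimal sketch of the iOS 11 UIViewPropertyAnimator additions.
let box = UIView(frame: CGRect(x: 40, y: 80, width: 120, height: 120))
box.backgroundColor = .blue

// Create the animator with no animations yet; blocks can be added later.
let animator = UIViewPropertyAnimator(duration: 1.0, dampingRatio: 0.6)
animator.addAnimations {
    box.frame.origin.x += 160
    // cornerRadius is now fully animatable inside an animation block.
    box.layer.cornerRadius = 24
}

// Keep the original timing curve while the user scrubs interactively.
animator.scrubsLinearly = false

// Stay paused at fractionComplete == 1.0 instead of going inactive,
// so the animation can still be reversed or adjusted after it ends.
animator.pausesOnCompletion = true

animator.startAnimation()
```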

The sample code that we provide incorporates all of these new properties and shows them in action.

Native iOS apps have always featured smooth animations that are easy on the eyes, and every year Apple makes a point of arming developers with the code to make animations richer and more appealing to the users.

View code samples for Advanced Animations with UIKit in Swift.

Core NFC

Near Field Communication, or NFC (a collection of protocols and standards that have diverse applications), is a wireless technology that allows the exchange of information between two devices that are within just a few centimeters of each other.

NFC is by no means a new technology. Apple already uses NFC for secure payment in stores or within apps through Apple Pay and its Wallet app (initially known as Passbook).

But developers are excited because Apple’s Core NFC wasn’t available to them until iOS 11.

Users’ Digital and Physical Worlds Collide With New Core NFC

In iOS 11, Apple introduced Core NFC, a new framework that allows developers to integrate NFC into their apps for iPhone 7 and newer.

Core NFC gives developers more control over the way they choose to streamline users’ interactions with their physical environments.

For its initial launch, Apple focused on tag reading. Core NFC supports the NFC Data Exchange Format (or NDEF) for tags of Type 1 through Type 5. Products along this spectrum support various kinds of data transactions, from pairing with a Bluetooth-connected device to validating transit passes and e-tickets.

NFC could also be beneficial in fields like gaming, marketing, payments, and many more. The Core NFC code sample shows how uncomplicated the framework is to use. The NFC reader is initialized in just three lines of code, and the outcome of the reading is received in one delegate method.

That’s all that is needed to integrate NFC with an application’s functionalities.
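Here is a minimal sketch along those lines; the `NFCReader` class name is ours:

```swift
import CoreNFC

// A minimal sketch of NDEF tag reading with Core NFC.
class NFCReader: NSObject, NFCNDEFReaderSessionDelegate {

    var session: NFCNDEFReaderSession?

    func beginScanning() {
        // Initializing and starting the reader takes just a few lines.
        session = NFCNDEFReaderSession(delegate: self,
                                       queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.begin()
    }

    // The outcome of the reading arrives in a single delegate method.
    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print(String(data: record.payload, encoding: .utf8) ?? "binary payload")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        print("Session ended: \(error.localizedDescription)")
    }
}
```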

View code samples for Core NFC in Swift.

Looking Forward

iOS 11 is just the latest way Apple is demonstrating its commitment to improving the user experience. Developers have received better tools to create more powerful applications more easily, while users receive richer, more streamlined digital experiences.

iOS 11 has delivered on expectations. One can assume that Apple will continue to build on app performance and reliability, striving for excellence and consumer satisfaction.

View the complete iOS 11 Sampler repository.

Image Source: Unsplash, Xavier Wendling

Original post can be found here.

Innovate with us. Click here to access all of our free resources.
Authored by Dobrinka Tabakova and Nikolay Andonov.

Dobrinka writes code and considers UI/UX strategy daily as a MentorMate iOS Software Development Lead. The self-described detail-oriented technologist found development by way of interests in computer science and math. Dobrinka is motivated to create high-quality software defined by intuitive user experiences.

Nikolay does more than work at MentorMate as an iOS Developer. He mentors the next generation of technologists. When he isn’t advising an iOS Intern to help them develop their skills, you’ll find him digging into mobile projects, discussing functionality with teammates and performing code reviews. Inspired to keep learning, he joined MentorMate after participating in a company-hosted iOS code academy and hasn’t looked back.
