Top iOS 10 Objective-C and Swift 3.0 Code Samples

MentorMate
12 min read · Mar 7, 2017

iOS 10 is the tenth major release of the iOS mobile operating system developed by Apple. It was announced at the company’s Worldwide Developers Conference (WWDC) on June 13, 2016 and was released on September 13, 2016.

Just over five months after it was released to the public, the operating system is now installed on 79% of all active iOS devices, according to the latest adoption data shared by Apple.

Since the App Store launched in 2008, more than two million apps have been downloaded 130 billion times. The popularity of the iOS platform is no secret, and every new iteration comes with enormous expectations. With iOS 10, Apple gives iPhone and iPad developers more control over the software, introducing new frameworks and tools that enable new categories of applications and features. Apps developed for iOS 10 can also extend system services to offer more engaging functionality. There are many new features in iOS 10, but in this article we present samples for those we found the most interesting and helpful.

UIViewPropertyAnimator

iOS 10 has introduced a new way to write animation code: using the UIViewPropertyAnimator. This isn’t a replacement for the existing API, nor is it objectively “better”, but it does give development teams a level of control that wasn’t possible before.

Essentially, UIViewPropertyAnimator expands the options for creating animations in our applications. We can now stop an animation and resume it later (even with different timing parameters), finish it at any moment, reverse it, or scrub to any chosen point in its progress, and more.

Another novelty: apart from the familiar timing options such as ease-in-ease-out, we can now define our own timing function based on the control points of a cubic curve. Until iOS 10, performing gesture-based, interruptible animations on iOS was a troublesome task, often requiring a third-party framework.
The fine control over animation timing alone would make UIViewPropertyAnimator an improvement over our existing UIView animations. But where it really shines is with animations that aren't just fire-and-forget, when we want the user to be able to grab an animating object and do something interactive with it.
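
To make that concrete, here is a minimal Swift 3 sketch (boxView is a stand-in view created just for the example) that starts an animation, then pauses, scrubs, and reverses it:

```swift
import UIKit

// A stand-in view for the example
let boxView = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))

let animator = UIViewPropertyAnimator(duration: 1.0, curve: .easeInOut) {
    boxView.center.x += 200
}
animator.startAnimation()

// Later, e.g. from a gesture recognizer:
animator.pauseAnimation()
animator.fractionComplete = 0.5   // scrub to the halfway point
animator.isReversed = true        // play the animation backwards
animator.startAnimation()

// A custom cubic timing curve is one initializer away:
// UIViewPropertyAnimator(duration: 1.0,
//                        controlPoint1: CGPoint(x: 0.1, y: 0.7),
//                        controlPoint2: CGPoint(x: 1.0, y: 0.1)) { ... }
```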

View code samples for UIViewPropertyAnimator in Objective-C and Swift 3.0.

CallKit

CallKit is a brand-new framework, introduced at WWDC 2016. It lets your Voice over Internet Protocol (VoIP) app integrate tightly with the native Phone UI, immensely enhancing the user experience.

VoIP Complications

Before CallKit, a VoIP call was just a notification. Users could not distinguish an incoming text message notification from an incoming call notification, which naturally led to plenty of missed calls. On the lock screen, even if the user managed to answer by sliding the notification, they had to enter their passcode, were redirected to the app, and only then could begin speaking. On an unlocked screen the experience was equally poor, as the incoming call notification was just a banner at the top of the screen. Moreover, making an outgoing call required launching the VoIP app and starting the call from there.

iOS 10 CallKit Advancements

CallKit resolves all of these complications. With iOS 10 CallKit, a third-party VoIP app can become the primary way for the user to make and receive calls. Incoming calls get the rich native UI with answer and decline buttons and the user's custom ringtone. A call can be started from the native Phone app's Contacts, Favorites, and Recents, from Siri, or via Bluetooth or CarPlay.

iOS 10 CallKit lets VoIP calls interoperate with telephone calls, FaceTime calls, and even other VoIP calls. Because the system knows about all calls, it handles them with the same priority. This means that VoIP calls are no longer put on hold when a telephone call comes in. Additionally, the user can use the "Do Not Disturb" functionality, block contacts, mute calls, and even swap between active and held calls, regardless of their type. **Note: This sample must be built and run on a physical device.**
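
To give a taste of the API, here is a minimal Swift 3 sketch of reporting an incoming call so that the system shows the native call UI (the app name and caller handle are placeholders):

```swift
import CallKit

// Describes how the system presents calls from this app
let configuration = CXProviderConfiguration(localizedName: "MyVoIPApp") // placeholder name
configuration.supportsVideo = true

let provider = CXProvider(configuration: configuration)
// A CXProviderDelegate would receive the answer/end/mute actions:
// provider.setDelegate(self, queue: nil)

// Report the incoming call; the native UI appears even on the lock screen
let update = CXCallUpdate()
update.remoteHandle = CXHandle(type: .generic, value: "alice@example.com") // placeholder
provider.reportNewIncomingCall(with: UUID(), update: update) { error in
    if let error = error {
        print("Call was not reported: \(error)")
    }
}
```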

View code samples for CallKit in Objective-C and Swift 3.0.

SiriKit

SiriKit is a new framework that allows developers to integrate their apps' content and services with Siri, so that users can get things done using just their voice.

SiriKit supports six different kinds of apps (plus three conditional ones), which cover a wide range of common and popular App Store offerings. The supported domains, and the tasks each can perform, are:

  • VoIP Calling: Initiate video and audio calls and search the user's call history with VoIP apps
  • Messaging: Send text messages and search the user's received messages with apps that support messaging services
  • Payments: Send and request payments to and from other people, and pay and search for bills using apps that support personal payments
  • Photos: Search for photos and videos of a particular content type and play slideshows in photo library apps
  • Workouts: Start, pause, resume, end, and cancel workouts in workout apps
  • Ride booking: Book a ride, receive the status of a booked ride, and get a list of available rides through apps that provide taxi-like services
  • Car commands (automotive vendors only): Activate car signals, get and set the status of the car's locks, and get the current fuel or power level in a car
  • CarPlay (automotive vendors only): Set the climate control settings, the defroster settings, the seat temperature, the radio station, and the audio source, and save and restore vehicle settings to and from a profile
  • Restaurant reservations (requires additional support from Apple): Book a reservation, get the available reservation times, get the user's current reservations, get default values to use when requesting a reservation, and get user information to associate with a booking

After choosing a suitable domain, development teams can leverage SiriKit by building an extension registered with it. Siri handles the user's request in four steps: Speech, when the user articulates the command; Intent, when the command is interpreted and matched with something the app can do; Action, when the app carries out what the Intent specifies; and Response, when the user is asked to confirm that the Intent is correct and whether they want to continue with the action. **Note: This sample must be built and run on a physical device.**
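
As a sketch of what handling an intent looks like, here is a minimal Swift 3 handler for the messaging domain; the actual message delivery is left out:

```swift
import Intents

class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Called once Siri has resolved the recipients and the content
    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the message over to the app's messaging service here...
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```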

View code samples for SiriKit in Objective-C and Swift 3.0.

NSPersistentContainer

Setting up the Core Data stack used to be quite a bit of work. We would need to create the model, then a persistent store coordinator, then a managed object context. The code to do that was rather long and almost exactly the same for each project. Learning about these classes and the way they work together can be a relatively large obstacle for someone who just wants to start writing code. That’s why alternative databases began gaining popularity. The new NSPersistentContainer class now wraps up all of that tedious work for us as well as offering some handy new features.

NSPersistentContainer encapsulates the whole Core Data stack setup. What we get is a simple interface that lets us ignore the existence of the persistent store and the persistent store coordinator. It also facilitates common Core Data operations such as saving and fetching, while providing us with thread-safe working contexts to suit our needs.
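
Here is a minimal Swift 3 sketch, assuming a data model file named Model.xcdatamodeld:

```swift
import CoreData

// One object replaces the old model / coordinator / context boilerplate
let container = NSPersistentContainer(name: "Model") // placeholder model name
container.loadPersistentStores { storeDescription, error in
    if let error = error {
        fatalError("Failed to load store: \(error)")
    }
}

// A main-queue context for driving the UI
let viewContext = container.viewContext

// Thread-safe background work on a private-queue context
container.performBackgroundTask { context in
    // ... insert or fetch managed objects here ...
    try? context.save()
}
```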

View code samples for NSPersistentContainer in Objective-C and Swift 3.0.

NSQueryGenerationToken

The key feature of the Core Data framework is undoubtedly faulting. It is because of faulting that Core Data is as performant as it is, and faulting ensures Core Data's memory footprint remains at an acceptable level.

But faulting can sometimes lead to unexpected problems. If a fault's underlying data is deleted from the persistent store, Core Data can no longer fulfill the fault, potentially causing undesired behavior or even crashes. If we are struggling to make the user interface cope with data that is no longer present in the persistent store, Core Data query generations are the solution to our problem.

Query generations are available in iOS 10. As the name suggests, a query generation is a snapshot of the data in the persistent store. A managed object context can choose to pin itself to a query generation, which means that it interacts with a snapshot of the data in the persistent store. The managed object context provides a window into that query generation. No matter what happens to the data in the persistent store, the managed object context continues to see and interact with the data of the query generation.

For applications that use multiple managed object contexts, each managed object context can work in isolation. Changes made by one managed object context don’t necessarily affect other managed object contexts. We can only use query generations if the persistent store of our applications is an SQLite database in WAL mode. This is the most common setup when using Core Data in a project, which means we don’t need to make any changes to start benefiting from query generations.
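
Pinning a context is nearly a one-liner; here is a minimal Swift 3 sketch, assuming a container set up as in the previous section:

```swift
import CoreData

let context = container.viewContext

// Pin the context to the current snapshot of the persistent store
try? context.setQueryGenerationFrom(NSQueryGenerationToken.current)

// Fetches now see a consistent snapshot, no matter what other
// contexts write to the store in the meantime.

// Un-pin to move back to the latest data
try? context.setQueryGenerationFrom(nil)
```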

View code samples for NSQueryGenerationToken in Objective-C and Swift 3.0.

Speech Recognition

Speech recognition is not a new concept, but it certainly is a feature that greatly facilitates the way we operate our devices. iOS users are accustomed to talking to Siri and, when a keyboard is visible, to using dictation to capture their speech. The Speech Recognition framework lets us extend and enhance the speech recognition experience within our applications without requiring a keyboard.

The new framework uses the same underlying technology that powers Siri and Dictation. It provides fast and accurate results that are transparently customized to the user, without the app having to collect any user data. The framework also provides more information than just text: alternative interpretations of what the user might have said, confidence levels, and timing information. We can control when a dictation stops, we can show results as the user speaks, and the speech recognition engine automatically adapts to the user's preferences (language, vocabulary, names, etc.).

Audio can be provided to the API either from pre-recorded files or from a live source such as the microphone. iOS 10 supports over 50 languages and dialects, from Arabic to Vietnamese, and any device that runs iOS 10 is supported. The speech recognition API typically does its heavy lifting on Apple's servers, which requires an internet connection; however, some newer devices also support on-device speech recognition. **Note: This sample must be built and run on a physical device.**
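
A minimal Swift 3 sketch that transcribes a pre-recorded file (the file URL is a placeholder):

```swift
import Speech

let audioFileURL = URL(fileURLWithPath: "/path/to/recording.m4a") // placeholder

SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized else { return }

    let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    let request = SFSpeechURLRecognitionRequest(url: audioFileURL)

    recognizer?.recognitionTask(with: request) { result, error in
        if let result = result {
            // Partial results arrive while the audio is being processed
            print(result.bestTranscription.formattedString)
        }
    }
}
```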

View code samples for SpeechRecognition in Objective-C and Swift 3.0.

AVCapturePhotoOutput

AVCapturePhotoOutput is a new interface for taking photos in iOS 10, part of AVFoundation's camera capture APIs. The camera and associated hardware in the newer iPhones and iPads are incredibly powerful, and AVCapturePhotoOutput is the API that exposes these sophisticated camera capabilities to developers. Beyond basic still image capture, AVCapturePhotoOutput supports an impressive range of professional-grade image capture formats and features:

  • Compressed images such as JPEGs
  • Uncompressed (but processed) images in popular pixel buffer formats
  • RAW images: camera sensor data with minimal processing (iPhone 6s/7, iPhone 6s/7 Plus, iPhone SE, iPad Pro)
  • DNG file format for RAW images
  • RAW + JPEG simultaneous capture
  • Wide gamut color capture on supported devices (iPhone 7, iPhone 7 Plus, iPad Pro)
  • Simultaneous delivery of preview-sized images
  • Bracketed capture of multiple exposures
  • Live Photos: photos accompanied by a short video clip capturing the moments immediately before and after the photo was taken (iPhone 6s/7, iPhone 6s/7 Plus, iPhone SE, iPad Pro)
  • Manual control of settings such as flash, exposure, ISO, image stabilization and white-balance

**Note: This sample must be built and run on a physical device.**
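
A minimal Swift 3 sketch of a JPEG capture (the session is assumed to be configured elsewhere with a camera input):

```swift
import AVFoundation

class PhotoCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()       // assumed configured with a camera input
    let photoOutput = AVCapturePhotoOutput()

    func setUp() {
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
    }

    func takePhoto() {
        // Settings objects are single-use; create a fresh one per capture
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
        settings.flashMode = .auto
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Delegate callback with the captured JPEG sample buffer
    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        guard let buffer = photoSampleBuffer,
              let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
                  forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil)
        else { return }
        print("Captured \(data.count) bytes of JPEG data")
    }
}
```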

View code samples for AVCapturePhotoOutput in Objective-C and Swift 3.0.

UIGraphicsRenderer

UIGraphicsRenderer is a new graphics rendering class. One big problem with the former rendering approach that UIGraphicsRenderer resolves is that the old contexts were 32-bit sRGB only. With UIGraphicsRenderer, if you are on a wide-color device such as the 9.7-inch iPad Pro, you get a wide color context; if you are not, you get the classic context.

Another advantage is that UIGraphicsRenderer is block-based, which makes it much easier to use. Additionally, the class manages the lifetime of the graphics context, which means some memory optimizations can be done underneath. Finally, it has an object-based API with two subclasses: UIGraphicsImageRenderer for rendering images and UIGraphicsPDFRenderer for rendering PDFs.
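
A minimal Swift 3 sketch with the image-rendering subclass:

```swift
import UIKit

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200))

// The renderer manages the context's lifetime; no begin/end calls needed
let image = renderer.image { context in
    UIColor.red.setFill()
    context.fill(CGRect(x: 0, y: 0, width: 200, height: 200))
    // context.cgContext exposes the underlying Core Graphics context
}
```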

View code samples for UIGraphicsRenderer in Objective-C and Swift 3.0.

UserNotifications

With iOS 10, tvOS 10, and watchOS 3, Apple introduced a new framework called UserNotifications. It is a brand-new set of APIs that unifies the way developers work with both local and remote notifications across the different platforms. It replaces the previous platform-specific interfaces for creating and scheduling local and remote notifications, which are now deprecated.

With the UserNotifications framework, the developer can schedule the delivery of local notifications based on specific conditions, such as time or location. The new framework also provides better notification management: apps now have access to notifications that are pending delivery or already delivered to the user, and can remove or even update them. Probably the biggest change to the way notifications are presented is the ability to embed custom views and actions in them. In addition, iOS 10 rich notifications can be presented right inside the apps, with the exact same look and feel the system provides.
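
A minimal Swift 3 sketch that schedules a local notification with the unified API (the identifier and text are placeholders):

```swift
import UserNotifications

let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
    guard granted else { return }

    let content = UNMutableNotificationContent()
    content.title = "Reminder"            // placeholder text
    content.body = "Time for a break"

    // Fire once, 60 seconds from now; calendar and location triggers also exist
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 60, repeats: false)
    let request = UNNotificationRequest(identifier: "break-reminder",
                                        content: content,
                                        trigger: trigger)
    center.add(request)
}
```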

View code samples for UserNotifications in Objective-C and Swift 3.0.

UICollectionViewDataSourcePrefetching

Great scrolling performance is expected of every app using UICollectionView. In iOS 10, Apple introduced a new protocol that complements UICollectionViewDataSource, called UICollectionViewDataSourcePrefetching. The protocol provides advance warning of the data requirements of a collection view, allowing asynchronous data-loading operations to be triggered early.

Cell prefetching is enabled by default, so apps compiled against iOS 10 automatically get better scrolling performance. To go a step further, the developer can implement the two protocol methods, which perform expensive data-related tasks (decoding images, accessing a database, loading data from a server, etc.) asynchronously and in advance. Queued data tasks can also be canceled if they are no longer needed.
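
A minimal Swift 3 sketch of adopting the protocol (the expensive work itself is left as comments):

```swift
import UIKit

class PhotoGridViewController: UICollectionViewController, UICollectionViewDataSourcePrefetching {

    override func viewDidLoad() {
        super.viewDidLoad()
        collectionView?.prefetchDataSource = self
    }

    // Called well before the items scroll on screen
    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        // Kick off asynchronous work here: decode images,
        // hit the database, fetch from the network...
    }

    // Optional: called when the items are no longer expected to appear
    func collectionView(_ collectionView: UICollectionView,
                        cancelPrefetchingForItemsAt indexPaths: [IndexPath]) {
        // Cancel the in-flight work queued above
    }
}
```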

View code samples for UICollectionViewDataSourcePrefetching in Objective-C and Swift 3.0.

UIPreviewInteraction

UIPreviewInteraction is a new class in iOS 10 that allows developers to plug into the progress of a 3D Touch action on any specified view. The progress varies depending on how hard we press on the screen. The other interesting thing is that we get two separate callbacks corresponding to peeking and popping. Since "peeking" and "popping" are preview features, we can interact with both of them and execute code in parallel with each action. UIPreviewInteraction also applies the same force processing as Peek and Pop, with automatic haptic feedback. **Note: This sample must be built and run on a physical device.**
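
A minimal Swift 3 sketch wiring a UIPreviewInteraction to a view controller's view:

```swift
import UIKit

class PreviewViewController: UIViewController, UIPreviewInteractionDelegate {
    var previewInteraction: UIPreviewInteraction?

    override func viewDidLoad() {
        super.viewDidLoad()
        previewInteraction = UIPreviewInteraction(view: view)
        previewInteraction?.delegate = self
    }

    // "Peek" phase: progress runs from 0 to 1 as the press gets harder
    func previewInteraction(_ previewInteraction: UIPreviewInteraction,
                            didUpdatePreviewTransition transitionProgress: CGFloat,
                            ended: Bool) {
        view.alpha = 1.0 - transitionProgress / 2
    }

    // "Pop" phase (an optional protocol method)
    func previewInteraction(_ previewInteraction: UIPreviewInteraction,
                            didUpdateCommitTransition transitionProgress: CGFloat,
                            ended: Bool) {
        if ended { /* commit the action */ }
    }

    func previewInteractionDidCancel(_ previewInteraction: UIPreviewInteraction) {
        view.alpha = 1.0
    }
}
```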

View code samples for UIPreviewInteraction in Objective-C and Swift 3.0.

Looking Forward

As time passes, and with each WWDC, Apple continues to improve iOS with more frameworks like the ones above, giving us the ability to create deeper and more functional experiences for our users. What's more, Apple seems to place ever greater importance on the ease of use of these frameworks, which lowers the barrier to entry for new user experiences. WWDC 2017 will take place in early June and will certainly showcase exciting new frameworks along with the release of iOS 11. With those releases and announcements, some of the APIs above will likely be improved further, making them an even more valuable asset for developers who want to improve their applications' readability, performance, and reliability.

View the complete iOS 10 Sampler repository.

Authored by
Nikolay Andonov and Dobrinka Tabakova.

