Whether or not you specialize in creating apps for Apple's various platforms (iOS, iPadOS, macOS, etc.), you're probably aware of the company's recent Worldwide Developers Conference, known as WWDC 2022. As an iOS developer and educator, I look forward to this event each year to see which new technologies, features, and updates will have a significant impact.
This conference was chock full of news for working coders, combining announcements about code development, hardware (including the anticipated M2 chip), and the next iteration of macOS (Ventura). Rather than documenting an exhaustive list of everything announced, let's focus on a few highlights and what they mean for developers.
Increased platform support and capabilities with SwiftUI
In 2014, Apple transformed the developer community by introducing a new programming language: Swift. Its similarities to more widely used languages like Java helped Apple attract new developers to the platform. Compared to its predecessor, Objective-C, Swift provided new syntax designed to make shipped code more effective and reliable.
Fast forward to 2019, and Apple introduced another new concept called SwiftUI. As the name implies, SwiftUI builds on Swift to define user interface components in a declarative programming style. For example:
import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Hello SwiftUI!")
            .padding()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
We delight in using our phones and tablets because of their excellent usability and fluid user interfaces. But writing apps to work across multiple platforms like watchOS, tvOS, and iPadOS gets complicated. It makes sense to use SwiftUI to declare an interface once and let each operating system render it in its own optimized format. If you're considering learning iOS development or are planning a new app for Apple's software ecosystem, you'll increase the lifespan of your efforts by using SwiftUI.
This year’s WWDC saw the introduction of several new components to SwiftUI, including Swift Charts and new layout components for forms, tables, and custom (programmer-defined) styles. The declarative coding style of SwiftUI provides a great template for cross-platform scalability, and apps built on this model also come with innovative features for helping developers maintain application state. Using dedicated state-oriented variables in code provides a true abstraction between presentation, model, and data. Less time tweaking the UI to maintain state means more time writing valuable application logic.
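To get a feel for the new charting components, here is a minimal sketch of a bar chart built with Swift Charts; the StepCount type and its sample values are purely illustrative:

import SwiftUI
import Charts

// Hypothetical data type for this sketch.
struct StepCount: Identifiable {
    let id = UUID()
    let day: String
    let steps: Int
}

struct StepsChartView: View {
    // State keeps the chart data separate from the presentation code.
    @State private var data = [
        StepCount(day: "Mon", steps: 4200),
        StepCount(day: "Tue", steps: 7800),
        StepCount(day: "Wed", steps: 6100)
    ]

    var body: some View {
        Chart(data) { entry in
            BarMark(
                x: .value("Day", entry.day),
                y: .value("Steps", entry.steps)
            )
        }
        .padding()
    }
}

Because the chart is just another SwiftUI view, the same declaration renders appropriately on each platform that supports Swift Charts.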
Easy weather API data with WeatherKit
While REST-based weather data isn't new in software development, Apple’s introduction of the WeatherKit API at WWDC 2022 is welcome news for iOS developers. As the name implies, this API allows developers to leverage weather-based data, including current and forecasted estimates and specific metrics such as humidity, dew point, and precipitation.
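As a rough sketch of what this looks like in Swift, the snippet below fetches current conditions for a hard-coded location; it assumes the app has the WeatherKit capability enabled and targets iOS 16 or macOS Ventura:

import CoreLocation
import WeatherKit

// Placeholder coordinates; a real app would use the device's actual location.
func printCurrentConditions() async throws {
    let location = CLLocation(latitude: 43.65, longitude: -79.38)
    let weather = try await WeatherService.shared.weather(for: location)
    let current = weather.currentWeather

    print("Temperature: \(current.temperature)")
    print("Humidity: \(current.humidity)")
    print("Dew point: \(current.dewPoint)")
}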
Currently, most weather apps use open REST services like the OpenWeather API, but direct integration with a native SDK will streamline your ability to create and manage weather-based data in a wide variety of apps. Could weather data enhance the app you are building now? It's worth considering. WeatherKit will also ship with a supporting REST-based API, allowing developers to access the data from non-Apple platforms and websites. Oh, and with Apple finally bringing weather widgets to the Lock Screen, this functionality will be top of mind for the massive iOS user base.
More sharing with SharePlay
Last year's event introduced a new concept known as SharePlay, allowing users to share music or video on a FaceTime call. Along with support for native Apple apps, a supporting SDK called GroupActivities was also released, allowing developers to create custom apps to support a shared app experience.
If you're trying to conceptualize how it works, imagine being in a Zoom session when someone starts a shared whiteboarding app. Not only does the host have access to mark up the whiteboard, but so does everyone else in the session. When applied, the GroupActivities API creates an owner-less model, allowing data to be replicated to all participating devices in the group session. As a developer, you define the data structure that gets sent, as long as it meets specific size and protocol requirements. With these tools, developers can create an entirely new category of app that goes beyond a personal experience to a shared one.
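Here is a minimal sketch of that idea using the GroupActivities framework; the DrawingTogether activity and the Stroke message type are hypothetical names for a shared whiteboard:

import Foundation
import GroupActivities

// A custom activity that participants can join from a shared session.
struct DrawingTogether: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Shared Whiteboard"
        meta.type = .generic
        return meta
    }
}

// The payload every participant sends and receives; no single device owns the data.
struct Stroke: Codable {
    let x: Double
    let y: Double
}

func observeWhiteboardSessions() async {
    for await session in DrawingTogether.sessions() {
        session.join()
        let messenger = GroupSessionMessenger(session: session)

        // Broadcast a stroke to everyone in the session.
        try? await messenger.send(Stroke(x: 10, y: 20))

        // Receive strokes drawn by the other participants.
        Task {
            for await (stroke, _) in messenger.messages(of: Stroke.self) {
                print("Received stroke at (\(stroke.x), \(stroke.y))")
            }
        }
    }
}

In a real app, one participant would start the experience by calling activate() on the activity; every device that joins then works with the same replicated messages.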
This year Apple enhanced the SharePlay model so that apps can initiate API functionality using the Messages app as well as FaceTime. Imagine starting a virtual game of cards with a friend via text message! Are there parts of your app that could be a group activity? Let us know in the comments.
More machine learning with CreateML
Machine learning currently plays an important role for many e-commerce and media apps, powering features like suggested purchases and content recommendations. The field has long been dominated by Python and Java tools and processes. While peers like Google and Meta (Facebook) back TensorFlow and PyTorch, Apple has decided to make its own foray into this exciting area of software engineering.
While Apple previously announced support for machine learning, this year we see it start to come to life. They announced refinements to their machine learning "builder" (the Create ML app, a standalone tool bundled with Xcode), which allows you to create trained models for macOS, iOS, and now tvOS. Apple also provides the CreateML framework API for creating models dynamically in code.
Adding machine learning capabilities within the Apple ecosystem profoundly impacts the quality and types of apps you can develop. Instead of hard-coding predefined rules for content recommendations, an app can learn what works best for each user from their input and feedback. CreateML applications can be relatively straightforward, such as recommendations based on user sentiment analysis, or more sophisticated, like image classification or tabular regression. Unlike calling a general REST API remotely, all processing is done on-device, which keeps transactions secure and reduces latency.
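As a simple illustration, this sketch trains a sentiment classifier with the CreateML framework in a macOS command-line tool or playground; the reviews.csv file and its "text" and "label" columns are assumptions for the example:

import CreateML
import Foundation

// Hypothetical training data: a CSV with "text" and "label" columns.
let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "reviews.csv"))
let (training, testing) = data.randomSplit(by: 0.8)

// Train a text classifier that predicts a sentiment label from review text.
let classifier = try MLTextClassifier(trainingData: training,
                                      textColumn: "text",
                                      labelColumn: "label")

// Check accuracy on the held-out rows, then export a Core ML model
// that ships with the app and runs entirely on-device.
let metrics = classifier.evaluation(on: testing, textColumn: "text", labelColumn: "label")
print("Classification error: \(metrics.classificationError)")
try classifier.write(to: URL(fileURLWithPath: "Sentiment.mlmodel"))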
More interaction for developers during WWDC
As someone who has been watching Apple presentations for years, I was pleased to see an expanded focus on providing high-quality videos for remote audiences.
Apple introduced digital lounges and scheduled lab appointments. Coordinated online via Slack, digital lounges were post-event sessions allowing developers to ask follow-up questions about WWDC presentations. For more detailed questions, members of the Apple Developer Program could also schedule a 30-minute remote lab appointment. I scheduled my own session with someone from the SharePlay team to get feedback on a current project. Once connected, the Apple expert and I walked through various aspects of the code, and they provided solid recommendations. If you're a developer creating for the iOS ecosystem, I suggest trying these offerings when they are available.
Conclusion
Of course, these topics highlight only a few of the many things revealed at WWDC 2022. As someone using one of the latest M1 MacBook Pro computers for work, I'm excited about the capabilities of the next-generation M2 chip. Many professionals will also benefit from the Continuity Camera features in macOS Ventura as remote and hybrid work options evolve in the job market. I always enjoy the way this annual event sparks my imagination as a developer. If you want to hear more conversation on the topic, check out a recent episode of the Stack Overflow Podcast, where the home team offers their takes on some of the highlights (and lowlights) from WWDC22.