Getting Started with App Intents and Siri Shortcuts
If you ever tried to add Siri support to an app using SiriKit, you probably remember the pain: intent definition files, separate extension targets, and a limited set of domains Apple supported. App Intents, introduced in iOS 16, replaces all of that with a pure Swift API. You declare a struct, add a perform() method, and the system takes care of surfacing your action through Siri, Shortcuts, Spotlight, Focus filters, and widgets.
This post walks through the basics of building an App Intent and wiring it up as a built-in Siri shortcut so users can trigger it by voice without any setup on their end.
What an App Intent Looks Like
An App Intent is a type conforming to the AppIntent protocol. At minimum it needs a title and a perform() method. Here's a simple one for a note-taking app:
```swift
import AppIntents

struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"
    static var description = IntentDescription("Creates a new note with the given text.")

    @Parameter(title: "Note Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try await NoteStore.shared.create(text: text)
        return .result(dialog: "Saved your note.")
    }
}
```
The @Parameter wrapper tells the system this intent needs a string input. When a user runs the intent from the Shortcuts app or asks Siri to run it without supplying the text, the system automatically shows UI to collect the value. No custom forms needed.
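You can also shape the question Siri asks when the value is missing. As a sketch, the requestValueDialog argument on @Parameter supplies the spoken prompt (the wording here is illustrative):

```swift
@Parameter(title: "Note Text", requestValueDialog: "What should the note say?")
var text: String
```

Without it, the system falls back to a generic prompt built from the parameter's title.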
Return Types
The perform() method returns an IntentResult. Which flavor you use depends on what you want the user to see or hear:
- .result() runs silently and returns nothing.
- .result(dialog: "Saved your note.") speaks or shows a dialog after the intent runs, good for voice confirmation.
- .result(value: someValue) returns a value that can feed into the next step of a Shortcut.
You can combine these by composing result protocols, such as some IntentResult & ProvidesDialog & ReturnsValue<String> (ReturnsValue takes the returned type as its generic parameter). The Apple docs on intent results cover every variation.
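As a rough sketch of that composition, here is an intent that both returns a value and speaks a dialog. NoteStore.shared.noteCount() is a hypothetical store method standing in for your own data layer:

```swift
struct CountNotesIntent: AppIntent {
    static var title: LocalizedStringResource = "Count Notes"

    func perform() async throws -> some IntentResult & ProvidesDialog & ReturnsValue<Int> {
        // Hypothetical helper; replace with your own storage query.
        let count = try await NoteStore.shared.noteCount()
        return .result(value: count, dialog: "You have \(count) notes.")
    }
}
```

In a Shortcut, the returned Int feeds the next action, while a voice invocation still gets the spoken confirmation.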
Exposing Intents as Siri Shortcuts
Defining an AppIntent makes it available in the Shortcuts app, but that still requires the user to go build a shortcut manually. The real magic is AppShortcut, which registers a built-in voice phrase that works the moment your app is installed.
You declare these by conforming a type to AppShortcutsProvider:
```swift
import AppIntents

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: CreateNoteIntent(),
            phrases: [
                "Create a note in \(.applicationName)",
                "Add a note to \(.applicationName)"
            ],
            shortTitle: "Create Note",
            systemImageName: "square.and.pencil"
        )
    }
}
```
Every phrase must include \(.applicationName) somewhere. This is a strict requirement: Siri uses the app name as a disambiguator so your phrases don't collide with system commands or other apps. If you leave it out, the build will succeed but the phrase won't work at runtime.
Once this is in your main app target, users can say "Hey Siri, create a note in MyApp" without touching the Shortcuts app at all. The intent runs, Siri speaks the dialog, and that's it.
Passing App Model Types with AppEntity
Strings and numbers cover a lot of cases, but eventually you'll want intents that operate on your app's own types. A shopping app might want "Open the groceries list" where "groceries" is a specific list the user has. That's what AppEntity is for.
```swift
struct NoteList: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note List"
    static var defaultQuery = NoteListQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}
```
Conforming types can then be used as @Parameter values in your intents, and Siri will offer the user a picker populated from your EntityQuery implementation. The query is how the system finds matching entities by ID or by a user's spoken name.
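A minimal query sketch might look like the following; NoteListStore is a hypothetical persistence layer, and the method names on it are assumptions:

```swift
struct NoteListQuery: EntityQuery {
    // Called when the system needs to resolve saved entity IDs back into values.
    func entities(for identifiers: [UUID]) async throws -> [NoteList] {
        try await NoteListStore.shared.lists(withIDs: identifiers)
    }

    // Populates the picker in the Shortcuts app before the user types anything.
    func suggestedEntities() async throws -> [NoteList] {
        try await NoteListStore.shared.allLists()
    }
}
```

entities(for:) is the one required method; suggestedEntities() is optional but makes the picker far more useful.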
Where It Fits in a Real App
The big practical win is reach. A single AppIntent declaration plugs into Siri voice commands, the Shortcuts app for custom automations, Spotlight for quick actions from search, Focus filter customization, Control Center buttons (see the ControlWidgetButton article for that angle), and interactive widgets. You write the logic once and it shows up everywhere the system wants to expose quick actions.
For tasks that should open your app to finish, set static var openAppWhenRun: Bool = true on the intent. For background work like toggling a setting or logging an event, leave it at the default false and the intent runs without bringing your app to the foreground.
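A foreground intent can then hand off to your app's navigation. As a sketch, with Router.shared standing in for whatever navigation object your app actually uses:

```swift
struct OpenNoteListIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Note List"
    // Bring the app to the foreground before perform() runs.
    static var openAppWhenRun: Bool = true

    @Parameter(title: "List")
    var list: NoteList

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hypothetical navigation call; route to the chosen list.
        Router.shared.open(list: list)
        return .result()
    }
}
```

Marking perform() with @MainActor is a common pattern here, since the work is UI navigation rather than background computation.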
A Few Things Worth Knowing
Intents run in your app's process when the app is already running, and in a lightweight extension environment otherwise. That means heavy state you rely on may not be loaded, so design perform() to bootstrap whatever it needs. For most storage layers backed by files or Core Data this just works, but singletons that depend on app launch side effects can bite you.
Localization uses LocalizedStringResource, which means your intent titles, descriptions, and dialogs can pull from your strings catalog like any other UI text. Siri phrases in AppShortcut are localizable too, through the AppShortcuts string table Apple generates for you.
If you want the full reference, Apple's App Intents documentation has guides covering parameter resolution, dynamic options, focus filters, and the newer interactive snippet APIs that let your intent show rich results inline.
App Intents is one of those frameworks where the setup cost is tiny but the payoff compounds as iOS keeps adding more surfaces that consume them. Starting with a single voice phrase for your most common action is usually the cheapest way to get your app in front of users who never leave the Lock Screen.
Sample Project
Want to see this code in action? Check out the complete sample project on GitHub:
The repository includes a working Xcode project with all the examples from this article, plus unit tests you can run to verify the behavior.