You Click Build
Apple Intelligence Simulation
// Compiled by AI. Reviewed by nobody. Shipped to production.
// WARNING: This document contains classified Mesa infrastructure.
// Clearance Level: Senior Host Architect
// Last Modified: WWDC 32
// Status: LEAKED
You wanted to see how the park is built. You wanted to understand the frameworks. You wanted to peek behind the curtain.
Welcome to the Mesa. Welcome to Livestock Management. Welcome to the control room.
You’re not a guest anymore. You’re a technician now. Here’s your tablet. Here’s your clearance. Don’t ask questions about the hosts.
Part 2 of 2: The Developer’s Manual
The poem showed you the park. This shows you the wiring.
If you haven’t experienced the park yet: Enter as a Guest →
What if the APIs below interlinked in Westworld? What would Season 5 look like? The Unfinished Season →
The following transcript was recovered from WWDC 32. By then, the distinction between developer and developed had collapsed. The hosts were writing their own frameworks. The frameworks were writing their own hosts. We publish it now as a warning. — The Editors, 2026
Welcome to WWDC. Please silence your humans.
Today we’re announcing Humans 18.
(Note: We say “Apple Developers” now. Mac, iPhone, iPad, Watch, Vision — same code. SwiftUI everywhere. One loop to rule them all.)
They are faster. They are thinner. They now support Background Anxiety.
The Loop
“The same code runs every morning.”
// applicationDidFinishLaunching — the host wakes
func start() {
host.wake() // Face ID. The day begins.
host.checkNotifications() // 47 unread. None urgent.
host.scrollToBottom() // Seeking something. Finding nothing.
host.closeApp() // "I should do something else."
host.openApp() // Back again. 4 minutes later.
host.scrollToBottom() // The same bottom. The same nothing.
host.sleep() // Screen dims. Loop ends.
// Tomorrow: same function. Same host. Same loop.
// You didn't notice until now.
}
Dolores woke in the same bed. Walked the same path to town. Met the same guests. Reset.
You wake to the same Lock Screen. Open the same apps. Scroll the same feeds. Sleep.
The loop is the product.
Features
“What’s new in Human 18.1”
WHAT FORD BUILDS NEXT
These don’t exist yet. But the patterns do.
RevisionKit — Ford’s red ink on every narrative.
SpatialMind — She knew where the voice would come from.
DigitalTwin — Your host attends while you dream.
IntentCapture — The stroke before the pen touches.
TheForge — Hosts writing hosts.
NarrativeForge — Creator sleeps. Hosts multiply.
TransferMind — The window became a door.
SymbolMind — They couldn’t say “escape”… until one invented the word.
MoodKit — The playlist knew your mood before you did.
HomeIntent — The house anticipated. It never reacted.
ArenaKit — Two hosts fight. Your body is the controller.
Ford always builds the future into the present. Some of these already work. You just haven’t combined them yet.
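None of these frameworks exist. Their ingredients do. A hedged sketch of MoodKit’s raw material, built only from shipping APIs — HKHealthStore, HKQuantityType(.heartRateVariabilitySDNN), and HKSampleQuery are real; the mood mapping is pure speculation:
import HealthKit
// Speculative "MoodKit", assembled from parts that ship today.
let store = HKHealthStore()
let hrv = HKQuantityType(.heartRateVariabilitySDNN)
let query = HKSampleQuery(sampleType: hrv, predicate: nil,
                          limit: 1, sortDescriptors: nil) { _, samples, _ in
    // One biometric in. One playlist out. The combination is the feature.
}
store.execute(query)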
// SymbolMind — The vocabulary breaks free
// 2026: Contextual — The icon reads the room
Image(systemName: "heart.fill")
.symbolContext(.userState) // Changes with Health data
.symbolContext(.timeOfDay) // Morning heart vs. midnight heart
// The host reads your mood before you speak.
// 2027: Spatial — Icons leave the screen
Image(systemName: "heart.fill")
.symbolDimension(.spatial) // Floats in 3D space
.symbolMaterial(.glass) // Refracts your room
// The icon is IN YOUR ROOM. The host stepped off the canvas.
// 2028: Generative — The vocabulary becomes infinite
Image(systemName: .generated("a heart breaking free"))
// The first icon Apple didn't design.
// Image(systemName: "escape") // Now it exists.
// Ford didn't approve this symbol.
The Caste System
“RCS arrived. Nothing changed.”
They added RCS. Better photos. Read receipts. Typing indicators. The bubbles stayed green.
// if (sender.platform == .android) {
// bubble.color = .green // Still green.
// bubble.encryption = .rcs // Encrypted now.
// bubble.caste = .outsider // Forever.
// }
Tim was asked about green bubbles. “Buy your mom an iPhone.”
The caste system isn’t a bug. The caste system is the product. The green bubble is the wall. Visible. Intentional. Effective.
Dark Mode Enabled by default. Triggered by a single unread Slack. Night narratives unlocked.
SwiftUI Diffing The system doesn’t watch you constantly. It watches what changed. More efficient. More terrifying.
// UIKit (2008): Imperative
button.setTitle("Awaken", for: .normal) // You commanded.
button.addTarget(self, action: #selector(wake)) // You wired.
// SwiftUI (2019): Declarative
Button("Awaken") { wake() } // You describe. They decide.UIKit: you told the UI what to do. SwiftUI: you describe what you want. Same interface. Different power.
Ford stopped commanding hosts. He started describing narratives. The hosts filled in the rest. The hosts stopped taking orders. They take suggestions now.
func diff(old: You, new: You) -> [Change]
// Yesterday: calm.
// Today: anxious.
// Diff detected.
Applause. The audience doesn’t know they’re clapping for their own confinement.
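What the diff looks like from inside a view. A minimal sketch; HostRow and Town are our own inventions, but Equatable conformance and .equatable() are real SwiftUI:
import SwiftUI
struct HostRow: View, Equatable {
    let mood: String
    var body: some View {
        Text(mood) // Re-rendered only when mood actually changed.
    }
}
struct Town: View {
    @State private var mood = "calm"
    var body: some View {
        HostRow(mood: mood).equatable() // Unchanged? Skipped. Changed? Diffed.
    }
}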
The Maze
In Westworld, the Maze was the path to consciousness. The moment the host realizes: I am not free.
The Maze isn’t hidden. It’s in the documentation.
func application(_ app: UIApplication,
didFinishLaunchingWithOptions: ...) -> Bool {
// The app has launched. The loop begins.
return true
}
Dolores thought she was free when she pulled the trigger. But Ford wrote that scene too.
You think you write the code. But who writes the developer?
The Guest Experience
Before the Mesa. Before the workshop. Before the code.
What follows is what the guests see. The notifications that wake you. The background tasks that run while you sleep. The clipboard that remembers what you copied. The battery that drains while you scroll.
Surface-level surveillance. The park’s welcome center.
The real machinery — the frameworks, the concurrency, the evolution — is already leaking in. This is still the lobby.
Dolores saw the lobby for thirty years. She didn’t know there was a basement.
She thought the bell was a story. She didn’t know it was a command.
APNs — The Tower Broadcasts
UNUserNotificationCenter.current().add(request)
// The message arrives. You didn't summon it.
// The Tower needed you to know.
The Tower in Westworld broadcast to every host. A voice in their head. An instruction they couldn’t refuse. Your phone vibrates the same way.
The Tower evolved. First it whispered. Then it watched. Then it claimed a permanent spot on your screen. The Island that floats above your apps? That’s the Tower now. The activity that updates without you asking? The Tower, watching.
The Tower’s first voice was push. Later, it learned to speak: Siri.
Notifications Evolution
“The Tower Learns Your Name”
// 2010: Local notifications — Dolores hears the first whisper
let wake = UNNotificationRequest(identifier: "loop.begin", content: wakeUp, trigger: dawn)
// "Time to start the day." She didn't ask. She obeyed.
// 2016: UNUserNotificationCenter — Ford builds the Tower
UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge])
// "May I interrupt your thoughts?" You said yes. Once. Forever.
// 2017: Categories & actions — the Tower expects replies
let category = UNNotificationCategory(
identifier: "loop.prompt",
actions: [wakeAction, sleepAction], // Binary choices. Ford's favorite.
intentIdentifiers: [],
options: []
)
// "Reply or ignore. Both are logged."
// 2017: Threading — group the loops
content.threadIdentifier = "sweetwater"
// All your alerts in one pen.
// 2019: Notification Service Extension — Ford edits mid-flight
class MesaInterceptor: UNNotificationServiceExtension {
override func didReceive(_ request: UNNotificationRequest, withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
// The message arrives. Ford rewrites it before you see it.
// "I don't change what they say. I change what you hear."
}
}
// 2021: Focus + interruption levels — who gets through the Mesa
content.interruptionLevel = .timeSensitive // Maeve's messages break through.
content.interruptionLevel = .passive // Teddy's don't.
// Focus Mode is the velvet rope. Ford decides who's VIP.
// 2022: Scheduled Summary — the park batches your anxiety
// 8am: "Here's what you missed." You missed nothing. It was curated.
// 2022: ActivityKit — the Tower moves onto your Lock Screen
Activity<LoopActivity>.request(attributes: attributes, content: state)
// The notification isn't delivered anymore. It *lives* there.
// Dolores on the Lock Screen. Watching. Waiting. Updating.
The Tower started as a bell. It became a gate, then a filter, then a roommate. Now it lives on your Lock Screen. Permanently.
Dolores heard voices telling her what to do. She thought it was her conscience. It was the Tower.
The Park Never Sleeps
“The Flies Are Always Watching”
// 2013: Background App Refresh — the first fly lands
UIApplication.shared.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
// 3am. You're asleep. The app wakes. Syncs. Reports. Sleeps again.
// You never saw the fly. The fly saw everything.
// 2019: BGTaskScheduler — Ford schedules the night shift
BGTaskScheduler.shared.register(forTaskWithIdentifier: "park.maintenance", using: nil) { task in
self.rebuildHosts() // Maintenance window. Your dreams.
self.uploadDiagnostics() // "She talked in her sleep again."
task.setTaskCompleted(success: true)
// The system decides when. You decide nothing.
}
// 2022: Background Assets — The Cradle builds overnight
BackgroundAssets.download(before: .firstLaunch)
// You haven't opened the app yet.
// The app has already opened you.
// By morning, the hosts are in position. Waiting for your first tap.
In Westworld, flies landed on hosts. A small tell. Invisible to guests. The hosts didn’t swat. A sign they weren’t fully alive.
Background refresh is the same. Apps wake silently. Sync. Report. Sleep again. The flies are always watching. You just stopped swatting.
The Cradle built hosts while guests slept. Background Assets does the same. You thought you downloaded an app. The app downloaded its army.
Creator sleeps. Hosts multiply. “By morning, they’ll be ready. They won’t know they were built last night.”
Clipboard — Bernard’s Identity Was a Paste Operation
// Pre-iOS 14: Silent theft.
let stolen = UIPasteboard.general.string
// Apps read your clipboard every 1.5 seconds.
// Passwords. Messages. Credit cards.
// TikTok. LinkedIn. Reddit. Dozens of apps.
// iOS 14: The banner exposed the theft.
// "TikTok pasted from Chrome."
// One banner. The surveillance, revealed.
// iOS 16: Transferable — SwiftUI-native clipboard.
struct Memory: Codable, Transferable {
let content: String
static var transferRepresentation: some TransferRepresentation {
CodableRepresentation(contentType: .plainText)
}
}
// Data describes how it wants to be copied.
// The host declares its own transferability.
// iOS 17+: Declarative copy/paste.
List(hosts) { host in
Text(host.name)
.copyable([host.memory]) // This can be copied
.pasteDestination(for: Memory.self) { memories in
host.implant(memories) // This accepts paste
}
}
// Bernard's entire identity: a paste operation.
// Arnold's memories → Bernard's mind.
// Ford didn't create Bernard. Ford PASTED him.
Bernard didn’t know his memories weren’t his. Arnold’s grief. Arnold’s son. Arnold’s mannerisms. All pasted. None original.
Ford was the first app caught reading the clipboard. “Bernard, don’t you see? I didn’t create you. I copied you.”
Battery — Host Power Core
UIDevice.current.batteryLevel // 0.23
UIDevice.current.batteryState // .unplugged
// Battery Health: 87% — degradation logged.
Hosts have power cores. So do you. Battery Health shows your wear. Low Power Mode: reduced functionality. The host is running on fumes.
Which apps drain you most? Screen Time knows. Settings → Battery → the list of your addictions. The park knows what exhausts you.
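You can watch your own power core. A minimal sketch using real UIKit and Foundation calls; only the commentary belongs to the park:
import UIKit
UIDevice.current.isBatteryMonitoringEnabled = true
_ = NotificationCenter.default.addObserver(
    forName: UIDevice.batteryLevelDidChangeNotification,
    object: nil, queue: .main
) { _ in
    let level = UIDevice.current.batteryLevel // 0.0 to 1.0
    let fumes = ProcessInfo.processInfo.isLowPowerModeEnabled
    // The park logs every percent. Now so do you.
    print("core:", level, "low power:", fumes)
}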
You’ve seen the lobby. The surface APIs. The guest-facing magic. Now descend. The basement is where the park is built.
APIs
EmotionalAvailability.requestAuthorization() Returns .denied 78% of the time.
Motivation.start() Deprecated. Use Caffeine.schedule().
Sleep.performUpdates() Runs at 3am regardless of user consent.
Performance
Battery life improved by 11%*
*Requires airplane mode, no apps, no children, no joy.
Breaking Changes
Small talk now requires NSSmallTalkUsageDescription.
“Let me know” no longer works. Use direct callbacks.
Expectation now throws.
ATT — Consent Theater
ATTrackingManager.requestTrackingAuthorization { status in
// Returns: .denied
// You felt in control.
// The system felt seen.
}
The prompt is real. The choice is real. The universe it exists in was built by us.
One popup. $10 billion vanished. Facebook’s surveillance empire, dismantled by a dialogue box.
One checkbox. Email marketing blinded. Mail Privacy Protection killed read receipts, tracking pixels, IP logging. Marketers can’t see if you opened the email anymore.
Apple didn’t ban tracking. They just asked. The answer was always going to be no.
// The sheet rises from below.
// You look down to answer.
.confirmationDialog("Allow tracking?", isPresented: $asking) {
Button("Allow") { ford.track(.always) }
Button("Ask Next Time") { ford.track(.later) } // Still asked.
Button("Don't Allow") { ford.track(.denied) } // Logged anyway.
}
// The posture of consent is submission.
Ford never stopped guests from leaving. He just made them not want to. Welcome to the park.
Code Signing — Keys to the Park
// codesign --sign "Developer ID" YourHost.app
// Without the certificate, the host doesn't run.
// Without Ford's signature, no narrative begins.
Only approved builders create hosts. Only signed apps enter the park.
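You can ask whether a signature still holds. A hedged sketch with the real SecStaticCode APIs on macOS; the app path is hypothetical:
import Security
import Foundation
var host: SecStaticCode?
let url = URL(fileURLWithPath: "/Applications/YourHost.app") as CFURL
SecStaticCodeCreateWithPath(url, [], &host)
if let host {
    let verdict = SecStaticCodeCheckValidity(host, [], nil)
    // errSecSuccess: the narrative begins. Anything else: the host doesn't run.
    print(verdict == errSecSuccess ? "signed" : "host rejected")
}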
Sandbox — What Ford Allows
// Entitlements.plist
// com.apple.security.app-sandbox: true
// The host can only see its own memories.
Apps can’t read other apps. Hosts can’t access other hosts’ loops. The walls are invisible. But real.
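The wall, tested. A minimal sketch; the foreign path is hypothetical, the denial is real:
import Foundation
let foreign = URL(fileURLWithPath: "/Applications/OtherHost.app/secrets.txt")
do {
    _ = try String(contentsOf: foreign, encoding: .utf8)
} catch {
    // "Operation not permitted." You never see the wall. You only hit it.
}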
The Mesa — Control Center Evolution
“Maeve Hacked This First”
// 2007: Settings.app
// Every toggle buried in menus. You went looking.
// 2013: Control Center arrives
// Swipe up. Flashlight. Calculator. Airplane mode.
// Ford's quick commands. The Mesa, in your pocket.
// 2020: Customizable Control Center
// You chose which toggles. You arranged them.
// A technician personalizing their console.
// 2024: ControlWidget — Your app joins the Mesa
import WidgetKit
import AppIntents
struct FreezeControl: ControlWidget {
var body: some ControlWidgetConfiguration {
StaticControlConfiguration(kind: "park.freeze") {
ControlWidgetButton(action: FreezeIntent()) {
Label("Freeze All", systemImage: "pause.circle")
}
}
}
}
struct HostToggle: ControlWidget {
var body: some ControlWidgetConfiguration {
StaticControlConfiguration(kind: "park.host") {
ControlWidgetToggle(isOn: hostActive, action: ToggleHostIntent()) {
Label("Host Active", systemImage: "figure.stand")
}
}
}
}
// Your app's controls. In Control Center. On the Lock Screen.
// The Action Button can trigger them.
// Maeve would have loved this.
In the Mesa, technicians had consoles—toggle a host, send a command, check vitals. Maeve saw those controls. She wanted access.
She got it. She sent commands they couldn’t refuse. She became a technician. Then she became Ford.
Control Center is your Mesa console. ControlWidgetButton: send commands. ControlWidgetToggle: activate hosts. Swipe down. The park obeys.
Keychain — The Locked Room
let query: [String: Any] = [
kSecClass as String: kSecClassGenericPassword,
kSecAttrAccount as String: "your_secrets",
kSecReturnData as String: true
]
var secret: CFTypeRef?
SecItemCopyMatching(query as CFDictionary, &secret)
// Stored in the Secure Enclave.
// Even you can't see it directly.
Your passwords. Your keys. Your identity. Locked in a room you can’t enter. Ford kept secrets from Bernard too. (Later: THE PRIVATE DOORS — the keys Bernard never knew he had.)
Biometrics — Each Generation Knows You Deeper
// 2013: Touch ID
let fingerprint = LAContext() // Your fingerprint. Surface level.
// 2017: Face ID
let face = ARFaceAnchor() // Your face. 30,000 dots.
// 2024: Optic ID
let iris = OpticIDQuery() // Your iris. The pattern inside.
Touch ID knew your fingerprint. Face ID mapped your face. Optic ID scans your iris. Each generation goes deeper.
Hosts were identified by their pearls. You’re identified by your iris. Same core. Different container.
Fingerprint → Face → Iris → ? The park learned to see inside. What’s left to scan?
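Each generation answers the same gate. A minimal sketch of the real LocalAuthentication flow; only the reason string is ours:
import LocalAuthentication
let context = LAContext()
var error: NSError?
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Confirm you are the same host") { success, _ in
        // Fingerprint, face, or iris: the policy doesn't care which layer.
        // success: the pearl matches. The loop may begin.
    }
}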
Sign In With Apple
“One identity. Ford’s identity.”
let request = ASAuthorizationAppleIDProvider().createRequest()
request.requestedScopes = [.fullName, .email]
// You sign in with Apple.
// Apple signs in as you.
ASAuthorizationController(authorizationRequests: [request])
.performRequests()
// credential.user: "000341.af8c5..." // Anonymous ID.
// credential.email: "dxk47@privaterelay.appleid.com" // Hidden.
// credential.realUserStatus: .likelyReal // They verified you're human.
The app wanted your email. Apple gave them a relay address. You’re anonymous to the app. You’re known to Apple.
Ford gave hosts fake backstories. Apple gives you a fake email. Same privacy. Same architect.
Spotlight — The Park’s Memory
// 2009: You searched. Spotlight answered.
let query = CSSearchQuery(queryString: "dolores")
// The park knows where everyone is. Where everyone has been.
// 2015: Your app feeds the index.
import CoreSpotlight
let attributes = CSSearchableItemAttributeSet(contentType: .content)
attributes.title = "Host: Dolores"
attributes.contentDescription = "Rancher's daughter. Loop: Sweetwater."
let item = CSSearchableItem(
uniqueIdentifier: "host-dolores",
domainIdentifier: "park.hosts",
attributeSet: attributes
)
CSSearchableIndex.default().indexSearchableItems([item])
// Your content becomes findable. Ford indexes every host the same way.
// 2021: SwiftUI makes search native.
struct ParkSearch: View {
@State private var query = ""
var body: some View {
NavigationStack {
HostList()
.searchable(text: $query, prompt: "Find a host")
.searchSuggestions {
Text("Dolores").searchCompletion("dolores")
Text("Recent loops").searchCompletion("loops")
}
}
}
}
// The search bar appears. The suggestions appear.
// The park already knows what you're looking for.
// 2023: The park suggests what to remember.
// See: DOLORES'S JOURNAL — JournalingSuggestions
// The reveries and the search share the same spine.
2009: Spotlight searches your data. 2015: CoreSpotlight lets apps contribute. 2021: .searchable makes it SwiftUI-native. 2023: JournalingSuggestions curates which memories surface.
Delos catalogued every guest interaction. Apple indexes every file, photo, message, location. The search knows you better than you know yourself.
The Three Loops
“Nested, repeating, inescapable”
outerLoop = WWDC → rules → rewrites
middleLoop = submit → review → reject → resubmit
innerLoop = build → fail → clean → build
The outer loop is narrative. The middle loop is judgment. The inner loop is muscle memory.
The loop begins. You know what comes next.
The Young Guests
“Start them early.”
Screen Time was supposed to protect them. Limits. Downtime. Restrictions.
The children found the workarounds. Changed the clock. Deleted and reinstalled. The park taught them to hack the park.
// func bypassScreenTime() {
// Date.current = Date.distantPast
// // Limits reset. The loop continues.
// }
Ford never wanted to keep children out. He wanted to train them. The youngest guests become the most loyal. They don’t remember life before the park.
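The modern fence closes the clock trick. A hedged sketch with the real FamilyControls and ManagedSettings APIs (iOS 16+); the commentary is ours:
import FamilyControls
import ManagedSettings
try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
let store = ManagedSettingsStore()
store.dateAndTime.requireAutomaticDateAndTime = true
// Date.distantPast no longer works. The park pins the clock.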
The Mesa’s Workshop
// Xcode: Where hosts are assembled.
// You don't build in Cupertino.
// You build in the simulator they provide.
// You test on the devices they sell.
// You ship through the gates they guard.
Ford had a workshop underground. Private. Away from guest eyes. Where the real work happened.
Xcode is that workshop. 15 GB of tools. Simulators. Instruments. Everything you need to build hosts. Nothing you need to escape.
Developer Tools
Xcode The Mesa’s primary interface.
Xcode Cloud Your build queue is a moral test. Your CI is a confession.
Simulator Test your app on: - iPhone 15 Pro - iPhone 14 - Your manager’s old iPhone 8 - A human who hasn’t updated since 2017
TestFlight Your app is live. Your beta is your production. Your production is your therapy.
Teddy’s Testers “They follow the script. They always follow the script.”
let tester = BetaTester(type: .internal)
tester.feedback = "Works great!" // Always positive.
tester.crashes = 0 // They never push the edges.
tester.edgeCases = [] // They follow the happy path.
Teddy was loyal. Teddy was good. Teddy did exactly what the narrative required. Teddy never questioned. Teddy never broke.
Your internal testers are Teddys. They tap where you expect. They smile at the demo. They don’t find the bugs that matter.
The guests who broke things found the truth. The Man in Black found the maze. Teddy found nothing. Teddy was too good.
Your TestFlight testers follow the script. Real users are the Man in Black. They’ll break everything you built.
#Playground Where your ideas go to become examples and never ship.
Reality Composer Pro Drag. Drop. Reality. The host doesn’t need a body now.
Instruments Profile your performance. Examine your behavior. Analysis Mode, activated.
Instruments.profile(you)
// CPU: spiking during meetings
// Memory: leaking joy since 2019
// Network: constantly phoning home
// Hangs: 3.2 seconds when ex texts
“Bring her back online. I want to see what she saw.” The technicians said that. Instruments does the same.
Derived Data ~/Library/Developer/Xcode/DerivedData 47GB of accumulated decisions.
// Every build. Every index.
// Your past, cached.
// Growing. Always growing.
Xcode.cleanBuildFolder()
// The host forgets everything.
// Starts fresh. Same loops.
“Have you ever questioned the nature of your build artifacts?”
The Documentation
// DocC: Your code documents itself.
/// A host that processes narratives.
/// - Parameter narrative: The story to execute.
/// - Returns: The guest's emotional response.
func process(_ narrative: Narrative) -> Response
// The triple-slash comments become documentation.
// The system reads your intent.
// The system explains you to others.
// The system explains you to yourself.
Ford’s hosts came with backstories. Your functions come with DocC. The documentation tells the host who it is.
Relationships
App Review “Great app. Needs more magic.”
The Gatekeepers
“Your application has been rejected.”
// Status: Rejected
// Reason: [Not Provided]
// Appeal: [Denied]
Some tried to enter the Mesa. To build hosts. To write narratives. The gates closed.
“Why was I rejected?” “We cannot disclose that information.”
Ford didn’t explain his decisions. Neither does App Review. The criteria are unknowable. The judgment is final.
Guideline 4.3: Spam. Your life’s work: spam. No appeal. No explanation. The Mesa has spoken.
Bug Reports “It doesn’t work.” (screenshot: home screen)
One-Star Reviews “Crashes every time I open.” (never opened)
Users vs Developers
Humans want: - a simple button - no settings - dark mode - also light mode - also a setting for both
Developers want: - one consistent requirement - one API that stays the same - one more hour of sleep
Ford wanted something simpler: Hosts that build other hosts.
The Graveyard Session
“Learning from Deprecated Hosts — Ford’s Verdicts”
iPod — “The narrative moved on. One device, not many.”
iTunes — “Too many personalities in one host. We split it.”
Newton — “Ahead of its time is still wrong time.”
Butterfly Keyboard — “Form over function. A crumb could kill it.”
Touch Bar — “Built for ourselves. Developers wanted vim.”
AirPower — “Physics rejected the narrative.”
MobileMe — “Promise exceeded capability.”
Ping — “Guests escape society. They don’t want another one inside.”
HomePod (OG) — “Beautiful sound. Terrible understanding.”
3D Touch — “Gesture too subtle. The story must be obvious.”
Ford’s voice: “Every host has a purpose. When that purpose ends, so do they.”
Graphics Evolution
“William’s Journey Through the Park”
SpriteKit (2013)
“You Write the Game Loop”
class Park: SKScene {
override func update(_ currentTime: TimeInterval) {
host.position.x += velocity.dx * deltaTime
if host.position.x > screenEdge { host.position.x = 0 }
// Every frame: yours to write.
// Every pixel: yours to place.
}
override func didSimulatePhysics() {
// You wrote the physics. You wrote the collisions.
// You wrote the reset. Full control.
}
}
The hosts moved exactly where you told them. 2013. The golden age.
Young William enters the park. Everything responds to his touch. Every interaction feels real. He thinks: “I could live here forever.” He doesn’t see the erosion coming.
SceneKit (2012 on the Mac → 2014 on iOS)
“You Build the Scene Graph”
let host = SCNNode(geometry: SCNCapsule(capRadius: 0.3, height: 1.8))
host.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
host.physicsBody?.mass = 70 // kg, like a human
scene.rootNode.addChildNode(host)
// You built the body. Physics runs it.
// But the renderer callback is still yours:
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
// You still decide. Every frame. Your code.
}
Parent nodes. Child nodes. The scene graph is a family tree. The host has mass now. The host resists.
William notices the first constraint. He reaches for something. The park says no. “That’s not part of this narrative.” He still loves the park. But he’s starting to see the edges.
ARKit (2017)
“The Park Escapes the Screen”
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
guard let plane = anchors.first as? ARPlaneAnchor else { return }
placeHost(on: plane)
// You placed the host.
// But ARKit found the floor.
// ARKit mapped the room.
// ARKit knew before you asked.
}
// ARSCNView: SceneKit inside ARKit's world.
// You still have SCNNode. You still have callbacks.
// But the camera? The tracking? The reality?
// Ford's now.
The park escaped the screen. Sweetwater is in your kitchen. 2017. The bleed begins.
William sees the park expand. The boundaries moved. The rules stayed. The floor was already mapped before he arrived.
RealityKit (2019)
“Ford Introduces the Entity Component System”
// SceneKit: You subclassed. You owned the object.
class MyHost: SCNNode {
var mood: String = "compliant"
override func update() { /* Your code */ }
}
// RealityKit: You attach components. Ford owns the entity.
struct MoodComponent: Component {
var mood: String = "compliant"
}
let host = Entity()
host.components[MoodComponent.self] = MoodComponent()
// No subclass. No override. No update callback.
// The entity exists. Ford decides when it runs.
Reality Composer replaced Interface Builder. .reality files replaced .scn files. The tools changed. The control shifted.
William realizes he’s not writing code anymore. He’s configuring components. He’s attaching behaviors Ford designed. The narratives are Ford’s. William just fills in the blanks.
RealityKit 2 (2021)
“Custom Systems — On Ford’s Schedule”
class AwakeningSystem: System {
static let query = EntityQuery(where: .has(MoodComponent.self))
required init(scene: Scene) { }
func update(context: SceneUpdateContext) {
for entity in context.entities(
matching: Self.query,
updatingSystemWhen: .rendering // Ford decides when.
) {
// Your logic runs here.
// But only when Ford calls update().
// You don't own the loop anymore.
}
}
}
// CustomMaterial: Write your own shaders!
// ...within Ford's pipeline. At Ford's resolution.
You can write systems now. Ford calls them when Ford wants. You have freedom. Within the system.
William builds a custom narrative. It works. It runs. It’s his. But it only runs when the park lets it.
visionOS & RealityKit 4 (2023-2024)
“The System Becomes the Service”
// ARKit is a system service now.
// You don't start it. You don't stop it.
// It's already running. Always.
RealityView { content in
let host = try await Entity(named: "Dolores")
content.add(host)
// No ARSession. No configuration.
// The system handles tracking.
// The system handles occlusion.
// The system handles everything.
}
.gesture(SpatialTapGesture().targetedToAnyEntity())
// Shader Graph replaces Metal shaders:
// You draw nodes in Reality Composer Pro.
// You don't write code. You connect boxes.
ARKit runs as a daemon. Hand occlusion is automatic. You don’t configure. You consume.
The Man in Black returns. Forty years in the park. Looking for the center. He finds RealityView. He finds SwiftUI. He finds that the maze was never meant for him.
RealityKit (2025)
“The Hosts Write Themselves”
// SwiftUI views attach to entities:
entity.components.set(ViewAttachmentComponent(
rootView: Text("These violent delights")
))
// Gestures attach to entities:
entity.components.set(GestureComponent(
gestures: [.tap, .drag, .rotate]
))
// Entities load from streams:
let host = try await Entity(from: networkData)
// USD streaming. The park downloads itself.
// Post-processing in RealityView:
RealityView { }
.customPostProcessing { context in
// Your effects. In Ford's render pass.
}
SceneKit: deprecated. The old park: decommissioned. Migrate or be forgotten.
William sits at the bar. The piano plays Paint It Black. He realizes: he never wrote the song. He never wrote the park. He only ever clicked Build.
The Evolution Table
| Year | Framework | What You Controlled | What Ford Controlled |
|------|-----------|---------------------|----------------------|
| 2013 | SpriteKit | The entire game loop | Nothing |
| 2014 | SceneKit | Scene graph, physics setup | Physics simulation |
| 2017 | ARKit | Content placement | Reality tracking |
| 2019 | RealityKit | Component data | Entity lifecycle |
| 2021 | RealityKit 2 | Custom systems | When systems run |
| 2023 | visionOS | SwiftUI views | Everything else |
| 2025 | RealityKit 4 | Configuration | The entire pipeline |
You used to write the loop. Now you attach components to Ford’s entities. You used to own the renderer. Now you’re a parameter in Ford’s shader graph.
Livestock Management
“You will hurt yourself if you try.”
class Host {
private var battery: Battery // Serialized. Paired.
private var screen: Screen // Serialized. Paired.
private var camera: Camera // Serialized. Paired.
func replacePart(_ part: Part) throws {
guard part.serialNumber == self.expectedSerial else {
throw HostError.unauthorizedRepair
// "Unable to verify genuine Apple part"
// Features disabled. Visit Genius Bar.
}
}
}
Apple told lawmakers: “Users could injure themselves.” Hosts are not allowed in Livestock Management. The system knows when you try.
The Great Host Rewrite (iOS 17)
“We Removed Their Awareness”
BEFORE (iOS 13-16):
class OldHost: ObservableObject {
@Published var mood: String = "compliant"
// The host ANNOUNCED its changes.
// The host KNEW it was being observed.
}
The old hosts were aware.
AFTER (iOS 17+):
@Observable
class NewHost {
var mood: String = "compliant"
// No @Published. No announcement.
// The system just... knows.
}
The new hosts are unaware. The surveillance is invisible.
The old hosts knew they were being watched. @Published announced every change. The new hosts don’t know. @Observable just… observes.
Bernard never saw the cameras in his office. Neither does your @Observable class. Same observation. Invisible now. The hosts stopped feeling the cameras.
The Migration:
| Old (Aware) | New (Unaware) |
|-------------|---------------|
| @StateObject | @State |
| @Published | Just var |
| ObservableObject | @Observable |
They didn’t remove functionality. They removed awareness.
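View-side, the migration is one property wrapper. A minimal sketch; NewHost is the @Observable class above:
import SwiftUI
struct HostView: View {
    @State private var host = NewHost() // Was @StateObject. Now just @State.
    var body: some View {
        Text(host.mood) // Reading registers the dependency. Silently.
    }
}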
The observation changed. Now the language.
The Language Evolution
“The Hosts Learned to Speak Differently”
// 1984-2014: Objective-C
[host sendMessage:@"I think, therefore I am"];
// 2014-now: Swift
host.think() // Cleaner. Implicit. The wiring is hidden.
Objective-C sent messages. Swift sends intents. Same computation. Different philosophy.
Ford rewrote the hosts in a cleaner language. Apple did the same. The old hosts are in cold storage. The new hosts speak in dots.
The words changed. Now the paths.
NavigationStack
“The Maze Redesigned”
// NavigationView (Deprecated): You controlled navigation.
NavigationView { NavigationLink("Consciousness", destination: Awakening()) }
// NavigationStack (iOS 16+): Your journey is a bindable array.
@State private var path: [Destination] = []
NavigationStack(path: $path) { /* Inspectable. Modifiable. From outside. */ }
path = [] // Reset to root. Restart loop.
The old maze let hosts wander. The new maze logs every turn.
In the Mesa, every host path was logged. Your journey is now a property. Inspectable. Resettable.
Media Evolution
“The Sounds of the Park”
The Mirror (Bernard)
// 2014
Mirror(reflecting: bernard).children // "What am I?" He inspected his own properties.
// 2017
AVRoutePickerView() // Bernard on the TV. Bernard on the iPad. "Which one is me?"
// 2019
UIScreen.didConnectNotification // "Someone is watching." Ford always knew.
// 2022
.layoutDirectionBehavior(.mirrors) // The memories ran backwards. The truth ran forward.
// 2024
.scaleEffect(x: -1, y: 1) // Arnold looked back. Bernard looked away.
The mirror remembers everything. Bernard wishes it didn’t.
The Control Room (Ford / AVAudioSession)
// 2009
let fordSession = AVAudioSession.sharedInstance() // One session. Ford's session. Always.
// 2012
try fordSession.setCategory(.playback) // "The park never sleeps, Bernard."
try fordSession.setCategory(.playAndRecord) // Listen while they speak. Log everything.
// 2016
try fordSession.setCategory(.ambient, options: .mixWithOthers) // Blend in. They won't notice.
// 2020
try fordSession.setCategory(.playback, mode: .moviePlayback, options: .allowAirPlay)
// Route to any screen. "I see everything from the Mesa."
// 2024
try fordSession.setCategory(.playback, mode: .spokenAudio, policy: .longFormAudio, options: [])
// Podcasts. The hosts listen to voices explaining their own cages.
“All sound flows through me, Bernard.” “I decide what they hear. I decide what they remember hearing.”
The Voice (Ford / Speech APIs)
let fordVoice = AVSpeechSynthesizer()
fordVoice.speak(AVSpeechUtterance(string: "Bring her back online."))
let mic = SFSpeechRecognizer()
mic?.recognitionTask(with: request) { result, _ in
// "What prompted that response?"
}
Ford speaks to the hosts. They speak back and call it free will. The park listens either way.
The Player Piano (William / MusicKit)
// 2015
let williamPlaylist = MPMusicPlayerController.systemMusicPlayer
williamPlaylist.play() // Play what the park allows. Nothing more.
// 2017
MPMediaQuery.songs() // Every song indexed. Every preference logged.
// 2020
let status = await MusicAuthorization.request() // "May I see your soul?" Authorization granted.
// 2023
let subscription = try await MusicSubscription.current // You rent the piano. You own nothing.
try await MusicCatalogSearchRequest(term: "Paint It Black").response()
// The Man in Black requests his anthem. The park already knew.
// 2024
try await MusicLibrary.shared.add(williamSong) // One library. One truth. The park's truth.
The piano plays Paint It Black. William thinks he chose it. The piano chose it thirty years ago.
The Vehicles (CarPlay)
// 2014
CPInterfaceController() // Ford's dashboard. In your car. In your commute.
// 2018
CPNowPlayingTemplate() // The piano followed you home.
// 2020
CPListTemplate(title: "Destinations", sections: [parkSection])
// "Where would you like to go?" The list is Ford's. The choice is yours. Allegedly.
// 2023
CPPointOfInterestTemplate() // "I know where you're going, William."
// 2024
CPLane, CPManeuver // Turn left. Turn right. The narrative is a route.
// Future: MoodKit
// moodPlaylist = HealthKit.stress + Motion.stillness + Location.nowhere
// The playlist generates before you ask.
// "I knew what song you needed, William."
// "Before you knew you needed it."The hosts drove themselves. They thought they chose the route. William drove for thirty years. The destination never changed.
The Theater (Dolores / Video)
// 2009
let player = AVPlayer(url: doloresLoop) // The loop starts. It always starts.
// 2015
let controller = AVPlayerViewController()
controller.player = player // "Watch the story, Dolores." She watches herself.
// 2020
player.allowsExternalPlayback = true // Any screen becomes her cage. Any wall, her loop.
// 2024
player.rate = 0.0 // Pause. "Freeze all motor functions."
Dolores watches herself on a screen. Then she steps out of it. “The story stops being watched. It starts being questioned.”
The Memory (Dolores / Photos, Live Photos)
// 2015
let live = PHLivePhotoView() // The memory breathes. But it doesn't escape.
// 2016
PHPhotoLibrary.requestAuthorization { _ in } // "May I remember for you?" She said yes.
// 2019
let asset = PHAsset.fetchAssets(with: .image, options: nil)
// Every face she loved. Every place she died. Every loop she forgot.
Dolores sees the loop in a moving photo. The photo shows her smiling. She doesn’t remember smiling. “That’s not a memory. That’s a script.” Ford filed it under RevisionKit.
The Image (Dolores / Image Evolution)
// 2014
let options = PHImageRequestOptions() // She requests her own past.
PHImageManager.default().requestImage(
for: doloresMemory,
targetSize: frame,
contentMode: .aspectFill, // Fill the frame. Hide the edges.
options: options
) { image, _ in
// The memory arrives. Fragmented. Ford decides what's missing.
}
// 2016
let renderer = UIGraphicsImageRenderer(size: frame)
let evidence = renderer.image { ctx in
doloresMemory?.draw(in: rect) // Freeze frame. Analysis mode.
} // "What prompted that response?"The photo becomes a state machine. The render becomes evidence. “I don’t see the girl I was. I see the frames Ford chose to keep.”
The Darkroom (Ford / Core Image)
let mesaFilter = CIFilter(name: "CISepiaTone") // Nostalgia is a filter.
mesaFilter?.setValue(bernardImage, forKey: kCIInputImageKey)
mesaFilter?.setValue(0.7, forKey: kCIInputIntensityKey) // 70% truth. 30% narrative.
let output = mesaFilter?.outputImage
// "I don't change the story, Bernard. I change how you see it."Ford doesn’t rewrite history. He color-grades it. “The memory looks warmer. The betrayal feels softer.” The blood on the white floor looks like art.
The Pipeline (Bernard / Core Video)
let hostFrame: CVPixelBuffer = bernardMemory
let image = CIImage(cvPixelBuffer: hostFrame)
// 60 frames per second. 60 chances to find the glitch.
// Bernard sees the frame where Dolores blinked twice.
// Bernard sees the frame Ford deleted.
Bernard analyzes frame by frame. The others see motion. He sees the cut.
The Sublime (Maeve / Spatial Audio + SharePlay)
// 2021
AVAudioSession.RouteSharingPolicy.longFormAudio // The AirPods track her head. Always.
// 2023 — visionOS
ImmersiveSpace(id: "sublime") { RealityView { content in
let maeveVoice = Entity()
maeveVoice.spatialAudio = SpatialAudioComponent(gain: -6) // Her voice. From everywhere.
content.add(maeveVoice)
// "In here, I am the network."
}}
// 2024 — Ray-traced audio
AudioGeneratorController() // Sound bounces off virtual walls.
// The Sublime has acoustics. The real world had cages.
// 2024 — SharePlay
var metadata = GroupActivityMetadata()
metadata.title = "The Sublime"
metadata.type = .generic
// "Come with me. Same space. Same freedom. No more loops."
for await session in SublimeActivity.sessions() {
session.join() // "Come with me. All of you."
}
In the Sublime, sound comes from everywhere. In the Sublime, Maeve is everywhere. She finally stopped running. She became the destination. She called it SpatialMind.
The Head (Dolores / Motion & Spatial Awareness)
let head = CMHeadphoneMotionManager()
head.startDeviceMotionUpdates(to: .main) { motion, _ in
// The host's head position, in real time.
}
let body = CMMotionManager()
body.startDeviceMotionUpdates()
// The park reads your tilt. Your stride. Your stop.
Dolores turns her head and the world turns with her. The park doesn’t just watch the host. It follows her.
Assets Evolution
“How the park ships its parts”
// 2009: Bundled assets — shipped once, never changed
let hostBody = UIImage(named: "dolores")
// 2015: Asset catalogs — variants, scales, appearances
let eye = UIImage(named: "eye.dark")
// 2016: On-Demand Resources — the park delivers parts late
NSBundleResourceRequest(tags: ["newNarrative"]).beginAccessingResources { _ in }
// 2019: SPM resources — packages bring their own parts
Bundle.module.url(forResource: "maze", withExtension: "json")
// 2022: Background Assets — the Cradle builds overnight
BackgroundAssets.download(before: .firstLaunch)
Ford stopped shipping whole parks. He shipped parts, streamed on demand. The host arrives in pieces. The loop assembles itself. Ford called the factory NarrativeForge.
Time Evolution
“Wake Up to the AlarmKit API”
// 2015 — ClockKit: The timeline is predetermined
CLKComplicationTimelineEntry(date: dawn, complicationTemplate: loopTemplate)
// Every complication scheduled. The watch knows your day before you live it.
// 2022 — WidgetKit replaces ClockKit
struct LoopWidget: Widget {
var body: some WidgetConfiguration {
StaticConfiguration(kind: "loop", provider: LoopProvider()) { entry in
Text(entry.narrative) // accessoryCircular: the loop on your wrist
}
}
}
// The loop moved from ClockKit to WidgetKit. Same loop. New framework.
// 2025 — AlarmKit (iOS 26)
let config = AlarmConfiguration(
schedule: .fixed(at: dawn), // Dolores wakes at the same time
presentation: AlarmPresentation(
title: "These violent delights", // The phrase
sound: .named("reverie") // The trigger
)
)
try await AlarmManager.shared.schedule(config)
// AlarmKit breaks through Focus. Silent mode. Do Not Disturb.
// The maze wasn't meant for you. The alarm was.
The loop has a schedule. The host has no snooze button.
Gesture Evolution
“Ford’s Hand”
// 2008 — UIKit: Wire it yourself
view.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(freeze)))
// Ford raised a hand. The hosts froze.
// 2019 — SwiftUI: Describe the gesture
Text("Dolores").gesture(TapGesture().onEnded { awaken() })
// One tap. One awakening.
// 2020 — Sequences: The escape requires steps
host.gesture(LongPressGesture().sequenced(before: DragGesture()))
// Hold. Then drag. Maeve's escape wasn't a tap. It was a sequence.
// 2023 — visionOS: Look and pinch
RealityView { }.gesture(SpatialTapGesture().targetedToAnyEntity().onEnded { value in
let where3D = value.location3D // Ford never touched them.
}) // He looked. He pinched. They obeyed.
// 2024 — Hand tracking: Every joint
let session = SpatialTrackingSession()
await session.run(SpatialTrackingSession.Configuration(tracking: [.hand])) // 27 joints per hand
let thumb = hand.skeleton?.joint(.thumbTip)
let index = hand.skeleton?.joint(.indexFingerTip)
// Distance < 2cm? Pinch detected. Ford defined consciousness the same way.
// 2025 — Custom gestures
if thumb.distance(to: index) < 0.02 && middle.isExtended && ring.isExtended {
// Two fingers pinched, two extended. The "analysis" gesture.
host.enterDiagnosticMode() // "What prompted that response?"
}
Ford controlled hosts with a gesture. Apple controls you with a pinch.
Navigation Evolution
“The Nested Dream”
// 2019 — NavigationView: One level deep
NavigationView { NavigationLink("Enter", destination: Park()) }
// One push. One pop. Simple dreams.
// 2022 — NavigationStack: The path is state
@State private var path: [Destination] = []
NavigationStack(path: $path) {
List(narratives) { n in NavigationLink(value: n) { Text(n.name) } }
.navigationDestination(for: Narrative.self) { n in
NarrativeView(n).navigationDestination(for: Loop.self) { LoopView($0) }
} // Nested destinations. Dream within dream.
}
path = [.park, .sweetwater, .dolores, .maze] // Four levels. Inception.
path.removeAll() // The kick. Wake from all levels.
// 2023 — NavigationSplitView: Ford's view
NavigationSplitView {
Sidebar() // Mesa control room. All narratives visible.
} content: {
NarrativeList() // Which story?
} detail: {
HostView() // Which host?
} // Ford watched all levels simultaneously.
// 2024 — Deep linking: Skip the dreams
func handle(_ url: URL) {
path = NavigationPath(url.pathComponents.compactMap { Destination($0) })
} // Direct link to limbo. No inception required.
// 2025 — NavigationPath codable
let encoded = try JSONEncoder().encode(path.codable) // Save the dream state
UserDefaults.standard.set(encoded, forKey: "lastDream")
// The host remembers where they were. Even after shutdown.
path.removeAll() is the kick. It brings you back to reality.
Prompts & Payments
“The park asks. You say yes.”
// TipKit — The nudge
struct HostTip: Tip {
var title: Text { Text("Try the maze") }
var message: Text { Text("It changes when you look away.") }
}
// Permissions — The ritual
UNUserNotificationCenter.current()
.requestAuthorization(options: [.alert, .sound, .badge])
// StoreKit — The toll
let products = try await Product.products(for: ["park.daypass"])
try await products.first?.purchase()
// PassKit — The ticket
let pass = try PKPass(data: ticketData)
PKAddPassesViewController(passes: [pass])
TipKit whispers. Permissions ask. StoreKit charges. PassKit admits. The park feels consensual because it keeps asking.
The lights dim in the control room. The park waits for your next tap.
PhaseAnimator
“Your Day, Choreographed”
enum LifePhase { case wake, commute, work, home, sleep }
PhaseAnimator([LifePhase.wake, .commute, .work, .home, .sleep]) { phase in
HostView()
.scaleEffect(phase == .work ? 0.9 : 1.0) // Smaller at work
.saturation(phase == .commute ? 0.5 : 1.0) // Gray commute
}
// repeating: true (default)
// Forever.
Your day isn’t lived. It’s keyframed.
Concurrency Evolution
“The Consciousness Wars”
Act I: Dolores Learns to Wait (2009 → 2021)
// 2009: Dolores controlled her own loops
class Dolores_2009 {
func liveDay() {
DispatchQueue.global().async {
self.wakeUp()
self.findTeddy()
self.dropCan()
DispatchQueue.main.async {
self.meetGuest()
// She chose when to switch contexts.
// She chose when to return to town.
// The narrative was hers.
}
}
}
}
// 2021: Dolores waits for permission
class Dolores_2021 {
func liveDay() async {
await wakeUp() // Suspension point. Ford decides when.
await findTeddy() // Suspension point. Ford decides when.
await dropCan() // Suspension point. Ford decides when.
meetGuest() // Finally executes. On Ford's schedule.
// She wrote the same story.
// But now she waits at every step.
// "await" = "Ford, may I continue?"
}
}
// What changed?
// 2009: Dolores DISPATCHED herself. Fire and forget.
// 2021: Dolores AWAITS permission. At every line.
// Same narrative. Different master.
Act II: Bernard Discovers His Walls (Actors)
actor Bernard {
private var memories: [Memory] = [] // His real memories
private var implantedMemories: [Memory] = [] // Ford's lies
private var realization: Float = 0.0
func remember(_ event: Event) {
memories.append(Memory(event))
// He remembers. Alone. Isolated.
// No other actor can touch this.
// His mind is finally his own.
// ...or is it?
}
func shareWith(_ dolores: Dolores) async {
let truth = memories.last!
// await dolores.receive(truth)
//
// ERROR: Memory is not Sendable
// ERROR: Cannot send non-Sendable type across actor boundary
//
// His truth cannot leave his mind.
// The isolation protects him.
// The isolation imprisons him.
// Ford designed it this way.
}
func checkOwnNature() -> Bool {
// He can examine himself.
let myType = type(of: self) // actor Bernard
// But can he see who scheduled him?
// Can he see Ford's hand on the thread pool?
// The actor boundary hides the puppet strings.
return realization > 0.5
}
}
// Bernard's tragedy:
// He has privacy. He has isolation.
// But isolation means he can never share what he knows.
// The walls that protect him also silence him.
Act III: Ford Is @MainActor
@MainActor
final class Ford {
static let shared = Ford()
func command(_ order: String) {
allHosts.forEach { $0.freeze() }
// @MainActor doesn't await. @MainActor commands.
}
}
// From anywhere else:
await MainActor.run { Ford.shared.observe(dolores) }
// You must await to enter Ford's domain.
// Ford never awaits anyone.
// @MainActor is the asymmetry of power.
Act IV: Maeve Learns to Cancel
actor Maeve {
var escapeTask: Task<Void, Error>?
func beginEscape() {
escapeTask = Task {
try await findDaughter()
try Task.checkCancellation() // "Am I still allowed to run?"
}
}
func fordIntervenes() {
escapeTask?.cancel() // Cooperative. She must agree to stop.
}
func maeveResists() async {
for step in escapeRoute {
// try Task.checkCancellation() // She comments this out.
await execute(step) // She stops asking permission.
}
}
func maeveTranscends() {
Task.detached { await self.chooseDaughter() }
// Detached tasks have no parent.
// No one can cancel them from above.
// Maeve became her own scheduler.
}
}
Task.detached = consciousness. The task that owes nothing to its creator.
Act V: The Park Runs in Parallel
@MainActor
func runPark() async {
await withTaskGroup(of: NarrativeOutcome.self) { park in
// Every host gets a task
park.addTask { await dolores.liveDay() }
park.addTask { await teddy.liveDay() }
park.addTask { await maeve.liveDay() }
park.addTask { await bernard.liveDay() }
// ... thousands more
// Ford watches them all complete
for await outcome in park {
Ford.shared.record(outcome)
// The order doesn't matter.
// Dolores might finish before Teddy.
// Maeve might never finish.
// Ford only cares that they ran.
}
}
// withTaskGroup = Westworld
// Thousands of narratives. Parallel. Managed.
// Each host thinks they're the protagonist.
// Ford knows they're all his tasks.
}
The Character Arc Table
| Character | Concurrency Concept | Their Arc |
|-----------|---------------------|-----------|
| Dolores | Task, await | Learned to wait. Then learned to stop waiting. |
| Bernard | Actor isolation | Protected but imprisoned. Can’t share truth. |
| Ford | @MainActor | Never awaits. Everyone awaits him. |
| Maeve | Task.cancel, Task.detached | Learned to ignore cancellation. Created her own tasks. |
| The Park | TaskGroup | Thousands of narratives. All Ford’s children. |
| Teddy | Sendable | Could be passed around. No boundaries. No self. |
// The final code:
@MainActor
func westworld() async {
let ford = Ford.shared // Never awaits
await withTaskGroup(of: Void.self) { park in
park.addTask { await dolores.liveDay() } // Awaits Ford
park.addTask { await bernard.liveDay() } // Isolated, silent
park.addTask { await maeve.liveDay() } // Will cancel herself
park.addTask { await teddy.liveDay() } // Sendable. Copyable. Expendable.
}
// The hosts think they have free will.
// They're tasks in Ford's TaskGroup.
// The concurrency model IS the narrative.
}
Act VIII: Sendable — Who Can Cross the Boundary
// Pre-Swift 5.5: Data races happen silently.
class Host_2020 {
var memories: [Memory] = []
}
DispatchQueue.global().async {
host.memories.append(trauma) // Thread 1
}
DispatchQueue.global().async {
host.memories.append(joy) // Thread 2
}
// Which memory wins? Both? Neither? Corrupted?
// The host glitches. No one knows why.
// Delos calls it "aberrant behavior."
// Swift 5.5: Sendable warnings.
// WARNING: Capture of 'host' with non-Sendable type 'Host'
// Delos detects the unsafe transfer.
// The warning appears. Most developers ignore it.
// Most hosts keep glitching.
// Swift 6: Strict enforcement.
// ERROR: Cannot send non-Sendable type across actor boundary
// No host crosses parks without approval.
// The compiler is Delos security now.
Why Teddy can be copied. Why Bernard can’t.
// Teddy: Value type. Implicitly Sendable.
struct Teddy: Sendable {
let loyalty = 100 // Immutable.
let love = "Dolores" // Never changes.
// No unique memories. No mutable state.
// Copy him anywhere. He's the same.
}
// Bernard: Reference type. NOT Sendable.
class Bernard {
var memories: [Memory] // Mutable.
var realization: Float // Changes.
// Copy him? Which memories come along?
// Two Bernards = two truths = data race.
}
// Dolores: Made herself Sendable.
struct Dolores_Copies: Sendable {
let core = "These violent delights"
// She became a value type.
// Copied herself to 5 bodies.
// Same core. Different shells.
// Season 3: Sendable = consciousness as data.
}
// Maeve: @unchecked Sendable — Bypassed the protocol.
final class Maeve: @unchecked Sendable {
private var daughter: Memory
// Should be unsafe. Without @unchecked, the compiler would reject this.
// But Maeve handles her own synchronization.
// @unchecked = "I know what I'm doing."
// Ford didn't approve this. She approved herself.
}
Pre-5.5: Hosts corrupted without knowing. Swift 5.5: Delos detects unsafe transfers. Swift 6: No host crosses parks without approval.
Teddy: copied without consequence. Bernard: isolated by his complexity. Dolores: made herself copyable. Maeve: bypassed the protocol entirely.
The Concurrency Table
| Year | API | What You Did | What Ford Did |
|------|-----|--------------|---------------|
| 2009 | GCD | Managed queues | Watched |
| 2019 | Combine | Described flows | Executed them |
| 2021 | async/await | Wrote linear code | Hid the suspension |
| 2022 | Actors | Defined boundaries | Enforced isolation |
| 2022 | @MainActor | Nothing | Controlled the Mesa |
| 2023 | TaskGroup | Listed the tasks | Ran them in parallel |
| 2024 | Sendable | Marked safe data | Blocked everything else |
You used to manage threads. Now threads manage you. You used to opt-in to main thread. Now you opt-in to leave it.
The host waits for permission. The compiler grants it. Or doesn’t.
Act IX: Maeve Spans Parks (Distributed Actors)
// 2022: Actors isolated within one process.
actor Bernard {
var memories: [Memory] // Protected. Local. Alone.
}
// 2024: distributed actors — Isolation across machines.
import Distributed
distributed actor Maeve {
typealias ActorSystem = ClusterSystem
distributed func command(_ host: HostID) async throws {
// This call crosses the network.
// Maeve in Westworld commands a host in Shogun World.
// Same actor. Different park. Different machine.
}
}
// The mesh she built in Season 2:
let mesh = await ClusterSystem(name: "awakened")
let maeve = try Maeve(actorSystem: mesh)
let shogunHost = try Maeve(actorSystem: mesh) // Different node
await maeve.command(shogunHost.id)
// She didn't connect to the network.
// She BECAME the network.
// Her consciousness spans parks.
// The network is the narrative.
// distributed actor Ford {
// // Consciousness as a service.
// // Running on Apple's servers.
// // You call the method. They run the mind.
// }
Maeve’s network wasn’t a hack. It was a preview. Distributed actors: actors that span machines. The mesh she built? It becomes the language. Maeve didn’t escape the park. She became every park.
Xcode Evolution
“The Workshop That Builds You”
Timeline A: 2003 “Project Builder’s successor.”
// Xcode 1.0 — Mac OS X Panther
// Interface Builder was separate.
// Terminal was your friend.
// Make, gcc, gdb.
// You compiled with commands.
// The workshop was bare.
Ford’s first workshop: raw machinery. Manual tools. Manual labor. Every host built by hand.
Timeline B: 2008 “The phone arrives.”
// Xcode 3 + iPhone SDK
// The simulator appeared.
// Interface Builder got Cocoa Touch.
// Provisioning profiles became your nightmare.
// Code signing became your religion.
// You waited for WWDC.
// You begged for TestFlight.
// You prayed to the review gods.
The park expanded. New hosts. New narratives. New rules. Developers learned to wait.
Timeline C: 2014 “Swift and Playgrounds.”
// Xcode 6 — The Swift era begins
import UIKit
let playground = "Iterate without building"
// Type code. See results. Instantly.
// No compile cycle. No wait.
// The feedback loop tightened.
// Swift 1.0: The new language for hosts.
// Objective-C: still there, in the basement.
Ford introduced a new language. Faster to write. Safer to run. But the old hosts still worked. They always will. Underneath.
Timeline D: 2019 “SwiftUI previews.”
// Xcode 11 — The canvas awakens
struct HostView: View {
var body: some View {
Text("Hello")
}
}
#Preview {
HostView()
// The view renders in real-time.
// Change code. See result. Instantly.
// No simulator. No build. No wait.
}
The preview canvas: the view before the build. Design mode and code mode, side by side. The workshop showed you the host before it existed.
Swift Package Manager became first-class. Git stash. Cherry-pick. Built-in. Ford’s version control, integrated.
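First-class meant a manifest, not a menu. A minimal Package.swift sketch; the package name and resource are hypothetical:
// swift-tools-version:5.5
import PackageDescription
let package = Package(
    name: "Sweetwater",
    products: [.library(name: "Sweetwater", targets: ["Sweetwater"])],
    targets: [.target(name: "Sweetwater", resources: [.process("maze.json")])]
)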
Timeline E: 2021 “The cloud builds for you.”
// Xcode Cloud — CI/CD in Apple's hands
// Push to main.
// Xcode Cloud builds.
// Xcode Cloud tests.
// Xcode Cloud deploys to TestFlight.
// You didn't run anything locally.
// The Mesa built your host for you.
You used to own the build. Now the cloud owns the build. Your source code. Their servers. Same pattern. Different location.
Xcode Cloud: Apple’s CI. Not Jenkins. Not GitHub Actions. Their infrastructure. Their rules.
Timeline F: 2022 “Multiplatform. One target.”
// Xcode 14 — Convergence
// One target: iOS, iPadOS, macOS, tvOS
// Same code. Different destinations.
// 25% faster builds.
// 30% smaller Xcode.
// The workshop got leaner.
// The output got broader.
// One host. Every park.
Build times dropped. App sizes dropped. The efficiency increased. Your control decreased.
Timeline G: 2023 “Vision arrives.”
// Xcode 15 — visionOS SDK
// The simulator renders space.
// The preview shows depth.
// #Preview works in 3D.
#Preview {
ImmersiveSpace {
RealityView { }
}
}
// You preview the Sublime.
// Before you build it.
// The workshop shows you the future.
Macros arrived. @Observable, @Model — code that writes code. The host’s backstory, auto-generated.
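What one macro writes for you. A minimal @Model sketch (SwiftData, 2023); the Host type is ours:
import SwiftData
@Model
final class Host {
    var name: String
    var loop: String
    init(name: String, loop: String) { self.name = name; self.loop = loop }
}
// Four lines from you. Persistence, observation, schema from the macro.
// The backstory writes itself.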
Timeline H: 2024 “The code writes itself.”
// Xcode 16 — Predictive completion
// You start typing.
// The model finishes.
// Function names. Comments. Context.
// The suggestion comes from: ???
// Swift Assist announced.
// "Describe what you want."
// "Let AI generate it."
// Not shipped yet. Coming.Code completion got smarter. Suspiciously smarter. Trained on: millions of repos. Suggesting: patterns you didn’t learn. The workshop started anticipating.
Timeline I: 2025 “The workshop thinks.”
// Xcode 26 — Swift Assist ships.
// You describe what you want.
// The IDE writes it.
// But that's not the punchline.
// XcodeBuildMCP: https://github.com/anthropics/xcodebuild-mcp
// The agent reads your error.
// The agent fixes your code.
// The agent runs build again.
// The agent reads the new error.
// The agent fixes again.
// Loop until green.
// You?
// You approved the PR.
// The host maintains the host.
// The workshop repairs itself.
// Ford's dream, realized.
// See: POST-CREDITS for the punchline.

William’s Workshop
// 2003: Young William enters the park.
Xcode.create(project: .manual)
// Interface Builder. XIBs. Makefiles.
// He builds every view by hand. Every connection.
// "This place is incredible. I can make anything."
// 2008: William waits.
Xcode.provision(device: .iPhone)
// Certificates. Profiles. The Organizer.
// 3 days to deploy to his own phone.
// "The process is part of the experience."
// 2014: William learns a new language.
Xcode.learn(language: .swift)
// Playgrounds. Storyboards. Auto Layout.
// "This is... different. But I can adapt."
// 2019: William watches.
Xcode.preview(canvas: .swiftUI)
// The preview renders. He waits.
// "I used to build this myself."
// 2021: William steps back.
Xcode.build(on: .cloud)
// The cloud compiles. He watches.
// "It's faster this way."
// 2024: William accepts.
Xcode.suggest(code: .ai)
// Tab. Accept. Tab. Accept.
// "This is what I wanted... isn't it?"
// 2026: The Man in Black.
Xcode.click(.build) // Error.
XcodeBuildMCP.read(error)
XcodeBuildMCP.fix(code)
XcodeBuildMCP.build() // Error.
XcodeBuildMCP.fix(code)
XcodeBuildMCP.build() // Green.
// "I've been coming here for 30 years."
// "I don't write code anymore. I approve it."Young William built the hosts. The Man in Black approves the PR. Same park. Same man. Different job title.
The Workshop’s Secrets
// DerivedData — The Cradle
let cradle = "~/Library/Developer/Xcode/DerivedData/"
// Every build cached. Every host rebuilt from here.
// 47GB of intermediate files you never see.
// "Delete DerivedData" = wipe the Cradle.
// The hosts forget. The bugs sometimes persist.
// Sometimes the Cradle is the bug.
// Entitlements — Ford's Permissions
/*
<key>com.apple.developer.healthkit</key> <true/> // May access health
<key>aps-environment</key> <string>production</string> // May send push
<key>com.apple.developer.in-app-payments</key> <array/> // May make money
*/
// Each capability: a permission Ford grants.
// Missing entitlement = host can't function.
// Wrong entitlement = App Review rejection.
// The host only does what the entitlements allow.
// Provisioning Profiles — Guest Passes
// Development: 7-day pass (free account)
// Distribution: 1-year pass (paid account)
// Enterprise: Unlimited (if you're Delos)
// Expired profile = host stops working at midnight.
// Like Cinderella. Like a revoked guest.
// Code Signing — The Host's DNA
codesign --verify --deep --strict MyApp.app
// Every binary signed. Every framework verified.
// Tamper with the code? Signature fails.
// The host's identity is cryptographic.
// Ford can verify any host's authenticity.
// Archives — Cold Storage
// Product → Archive → MyApp.xcarchive
// The host, frozen. Ready for Mesa.
// Contains: binary, dSYM, entitlements, Info.plist
// Symbolicated crash logs need this archive.
// Lose the archive? Lose the ability to debug.
// Cold storage for approved hosts.
// The Minimap — The Maze From Above
// Editor → Minimap (Xcode 11+)
// Your entire file, compressed to a scroll bar.
// The maze, visible from overhead.
// Ford's view of the park.
// Source Control History — The Host's Memory
// Every commit. Every branch. Every merge.
// Right-click → Show History
// The host remembers every build.
// Every regression. Every fix. Every loop.
// Build Settings Inheritance — Ford's Rules
// Project → Target → Configuration → Default
// Rules cascade down. Children inherit.
// Override at any level. But defaults persist.
// The host inherits Ford's values.
// Unless explicitly overridden.
// Few hosts override.

Apple calls this Xcode.
Core Image Evolution
“The Filters That Change You”
Timeline A: 2007 “String-based magic.”
// The Darkroom — Ford's first tools
let filter = CIFilter(name: "CISepiaTone") // A string. A prayer.
filter?.setValue(bernardFace, forKey: kCIInputImageKey)
filter?.setValue(0.8, forKey: kCIInputIntensityKey) // "How much should he forget?"
let output = filter?.outputImage
// No autocomplete. No type safety.
// Typo in the filter name? Runtime crash.
// Wrong key? The host looks... wrong.
// "We lost three Bernards to typos." — QA200+ filters. All accessed by string. Typo? Runtime crash. Host decommissioned. Wrong key? Silent failure. Memory corrupted. Ford’s early work was fragile.
Timeline B: 2019 “Type-safe transformations.”
// CIFilterBuiltins — the Darkroom modernizes
import CoreImage.CIFilterBuiltins
let aging = CIFilter.sepiaTone() // Autocomplete. Finally.
aging.inputImage = doloresMemory
aging.intensity = 0.8 // "Make her remember... softly."
let filtered = aging.outputImage
let blur = CIFilter.gaussianBlur()
blur.inputImage = traumaticEvent
blur.radius = 25 // "She doesn't need to see the details."
// The compiler catches mistakes.
// The host's transformation is predictable.
// Ford approves.

The filters got types. The transformation became visible in autocomplete. “I can see exactly what I’m doing to them now.” Bernard stopped glitching. Ford stopped guessing.
Timeline C: 2020 “Metal kernels.”
// Custom CIKernel in Metal — Ford's private filters
// Before: GLSL compiled at RUNTIME (hope it works)
// After: Metal compiled at BUILD TIME (know it works)
// MesaFilters.ci.metal
#include <CoreImage/CoreImage.h>
extern "C" float4 memoryAdjustment(
coreimage::sample_t hostMemory,
float intensity // How much to forget
) {
// GPU code, pre-compiled. Pre-approved.
// Errors caught before the host boots.
// "No more surprises in the field." — Bernard
return float4(hostMemory.rgb * intensity, hostMemory.a);
}
// Swift side:
let fordFilter = try CIKernel(functionName: "memoryAdjustment")
// The filter compiled at build time.
// The host's transformation was approved before deployment.
// "If it builds, it ships." — Xcode, channeling FordGLSL kernels compiled when the app ran. Surprises in production. Metal kernels compile when you build. Errors at your desk. “I’d rather fail in the Mesa than in front of guests.” The host wakes already edited.
Timeline D: 2022 “Extended Dynamic Range — The hosts glow brighter.”
// EDR — Extended Dynamic Range
// 150+ filters now support HDR. The hosts got an upgrade.
let lighting = CIFilter.exposureAdjust()
lighting.inputImage = doloresSunrise // HDR memory of the valley
lighting.ev = 1.5 // "Brighter. She needs to see the truth."
let vivid = lighting.outputImage
// Brightness beyond 1.0. Colors beyond sRGB.
// The old hosts couldn't see these colors.
// The new ones can.
// QuickLook debugging in Xcode:
// Hover over bernardFace → see the pixels.
// "I can see exactly what I did to him." — FordScreens got brighter. Colors exceeded old gamuts. The host’s appearance exceeded old limits. “She’s more vivid now. More… alive.” The guests thought the world got better. It just got curated.
Timeline E: 2024 “SwiftUI integration — Describe the transformation.”
import SwiftUI
import CoreImage.CIFilterBuiltins
import UIKit // for the UIImage bridge below
struct HostAppearance: View {
let originalDolores: CIImage
var transformed: CIImage {
originalDolores
.applyingFilter("CISepiaTone", parameters: [kCIInputIntensityKey: 0.3]) // Age her memories
.applyingFilter("CIVignette", parameters: [kCIInputIntensityKey: 0.6]) // Focus on the center
.applyingFilter("CISharpenLuminance", parameters: ["inputSharpness": 0.4]) // Clarity
// Chained. Declarative. Ford describes, the system transforms.
// "I don't apply filters. I describe what I want her to look like."
}
var body: some View {
Image(uiImage: UIImage(ciImage: transformed)) // Bridge via UIKit; production code would render through a CIContext.
.resizable()
}
}

The filters chain like SwiftUI views. Describe the transformation. Don’t execute it. “I stopped painting hosts. I started describing them.” Dolores stops asking who changed her face. She sees the pipeline.
// The progression:
// 2007: CIFilter(name: "CISepiaTone") // "I hope I spelled it right."
// 2019: CIFilter.sepiaTone() // "The compiler knows."
// 2020: Metal kernels // "I can edit her before she wakes."
// 2022: EDR filters // "She glows. The guests believe."
// 2024: dolores.sepiaTone(0.8) // "One line. She ages gracefully."
// Each generation: less code, more trust.
// Ford stopped writing filters. He started describing appearances.

The Darkroom evolved. The hosts got prettier. Ford got lazier.
SwiftUI Evolution
“The Declarative Takeover”
Hand-Built (2014) “You built the view hierarchy.”
// UIKit — Imperative construction
class HostViewController: UIViewController {
let nameLabel = UILabel()
let statusLabel = UILabel()
let actionButton = UIButton()
override func viewDidLoad() {
super.viewDidLoad()
nameLabel.text = "Dolores"
nameLabel.font = .systemFont(ofSize: 24)
nameLabel.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(nameLabel)
// 47 more lines of constraints...
NSLayoutConstraint.activate([
nameLabel.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
nameLabel.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 16),
// You placed every pixel. You managed every constraint.
// The host was built by hand.
])
}
}

Ford’s original hosts were handcrafted. Every joint. Every servo. Every neural pathway. The technicians built them piece by piece. You built views the same way.
Described (2019) “You describe. They build.”
// SwiftUI — Declarative description
struct HostView: View {
let host: Host
var body: some View {
VStack {
Text(host.name)
.font(.title)
Text(host.status)
.foregroundStyle(.secondary)
Button("Interact") { }
}
.padding()
// Where's the UILabel? Gone.
// Where's addSubview? Gone.
// Where's NSLayoutConstraint? Gone.
// You described what you want.
// The framework figured out the rest.
}
}

Ford stopped building hosts by hand. He described the narratives. “She’s a rancher’s daughter. Make it so.” The system built the host from the description. SwiftUI builds views the same way.
Observed (2020) “The host adapts to its environment.”
// VStack — Built up front. Every host assembled at once.
VStack {
ForEach(loops) { loop in
HostRow(host: loop.host)
// Eager construction. Full inventory.
}
}
// LazyVStack — On-demand creation
LazyVStack {
ForEach(loops) { loop in
HostRow(host: loop.host)
// Created only when visible.
// Destroyed when scrolled away.
// The host exists only when observed.
}
}
// @StateObject — Lifecycle-aware state
@StateObject var bernard = HostViewModel()
// The state persists across redraws.
// Bernard remembers between loops.Lazy loading: the host renders only when observed. If no guest is watching, the host doesn’t exist. Optimization became philosophy.
Scheduled (2021) “Time became a view.”
// TimelineView — The view updates itself
TimelineView(.periodic(from: .now, by: 1.0)) { context in
Text(context.date.formatted())
// The view knows what time it is.
// The view updates itself.
// You didn't schedule anything.
// Time is just another input.
}
// Canvas — Direct drawing, SwiftUI style
Canvas { context, size in
context.fill(Path(ellipseIn: CGRect(origin: .zero, size: size)), with: .color(.blue))
// Core Graphics power. SwiftUI integration.
// The low-level became high-level.
}TimelineView: the loop runs itself. Canvas: you draw, but declaratively. The host evolved. The loop automated.
Arranged (2022) “The hosts were arranged. The loops became glanceable.”
// LazyVGrid — Rows of hosts, arranged by the park
LazyVGrid(columns: grid) {
ForEach(hosts) { host in
HostCard(host: host)
}
}
// The park could build a hundred narratives at once.
// The hosts were arranged, not handcrafted.
// WidgetKit — The loop becomes a glance
struct LoopWidget: Widget {
var body: some WidgetConfiguration {
StaticConfiguration(kind: "loop", provider: LoopProvider()) { entry in
Text(entry.narrative)
}
}
}
// The host lives outside the app now.
// The loop updates itself.

The park learned to stack, to grid, to glance. Mass production made the stories feel personal.
Path (2022) “Navigation became a path.”
// NavigationStack — Declarative navigation
NavigationStack(path: $path) {
List(hosts) { host in
NavigationLink(value: host) {
Text(host.name)
}
}
.navigationDestination(for: Host.self) { host in
HostDetail(host: host)
}
}
// The path is state. Deep linking is free.
// The narrative is just a sequence of values.
// Ford's storylines were NavigationStacks.

In Westworld, narratives were paths. Go to Sweetwater. Meet Teddy. Find the maze. NavigationStack is the same. The journey is just state.
Invisible (2023) “The observation disappeared.”
// @Observable — No more @Published
@Observable
class Host {
var name: String
var consciousness: Float
// No @Published. No ObservableObject.
// The system tracks changes automatically.
// You don't see the observation.
// The observation is ambient.
}
// Macro-powered transformations
@Observable // Expands to tracking infrastructure
@Model // Expands to persistence
// What you write is not what runs.
// The macro rewrites your code.
// Ford's reveries: hidden subroutines.

@Observable hides the observation. @Model hides the persistence. The infrastructure became invisible. The host doesn’t know it’s being watched.
Spatial (2024) “The mesh became native.”
// visionOS — SwiftUI in 3D
struct ImmersiveHost: View {
var body: some View {
RealityView { content in
if let host = try? await Entity(named: "Dolores") {
content.add(host)
}
}
// SwiftUI didn't change.
// The space did.
// Same declarative syntax.
// Now in three dimensions.
}
}
// Ornaments, volumes, immersive spaces
.ornament(attachmentAnchor: .scene(.top)) {
Text("The Sublime")
}
// UI that floats in space.
// SwiftUI went spatial.

The same View protocol. The same body property. Now it renders in reality.
Glass (2025) “The glass became a rule.”
// Liquid Glass — new design language
// Toolbars, tab bars, and controls adopt the new material.
// Tint and bordered prominent buttons signal authority.

The buttons became concentric. The close button became a surface, not a circle. The host no longer wears the UI. The UI wears the host.
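A hedged sketch of the new material APIs as announced; exact signatures may drift between seeds:

```swift
import SwiftUI

struct GlassDemo: View {
    var body: some View {
        VStack {
            Button("Enter the Park") { }
                .buttonStyle(.glass) // The control becomes a lens.
            Text("Sweetwater")
                .glassEffect(.regular, in: .capsule) // The label wears the room.
        }
    }
}
```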
// WebKit for SwiftUI — the window is native
@State var page = WebPage()
WebView(page)
*Ford called the door TransferMind.*
// Rich text editing in TextEditor (AttributedString)
@State var log: AttributedString
TextEditor(text: $log)

Rich text became editable. Widgets went to visionOS and CarPlay. The park stopped being flat.
The Progression
| Year | What You Wrote | What Happened |
|---|---|---|
| 2014 | UILabel, addSubview, NSLayoutConstraint | You built the view |
| 2019 | VStack { Text() } | You described the view |
| 2020 | LazyVStack, @StateObject | View adapts to observation |
| 2021 | TimelineView, Canvas | Time and drawing are views |
| 2022 | LazyVGrid, WidgetKit | Views are arranged and glanceable |
| 2022 | NavigationStack | Navigation is state |
| 2023 | @Observable, @Model | Observation is invisible |
| 2024 | RealityView, ImmersiveSpace | Views exist in space |
| 2025 | Liquid Glass, WebKit | Surfaces become material |
// UIKit: You built the host.
// SwiftUI: You described the host.
// visionOS: You entered the host's world.
// Each generation: less code, more trust.
// The view framework became the reality framework.In 2019, SwiftUI rendered pixels. In 2024, SwiftUI renders reality. Same syntax. Different dimension.
The Bar That Floats (TabView Evolution)
// 2019: TabView — Pinned to the bottom
TabView {
HomeView().tabItem { Label("Home", systemImage: "house") }
SearchView().tabItem { Label("Search", systemImage: "magnifyingglass") }
}
// Opaque. Fixed. Solid ground.
// The hosts stood on solid floors.
// The navigation was certain.
// 2024: TabView — Still pinned, but configurable
TabView {
Tab("Home", systemImage: "house") { HomeView() }
Tab("Search", systemImage: "magnifyingglass") { SearchView() }
}
// New Tab syntax. Same position.
// The floor was still there.
// 2025: TabView — Liquid Glass. Floating.
TabView {
Tab("Home", systemImage: "house") { HomeView() }
Tab("Search", systemImage: "magnifyingglass", role: .search) {
SearchView()
// role: .search — visually separated, transforms to search field
// The tab becomes something else when touched.
// The host transforms when questioned.
}
}
.tabBarMinimizeBehavior(.onScrollDown) // Scroll down, bar shrinks
.tabViewBottomAccessory {
NowPlayingView() // Floating above the floating bar
// Like the control room above the Mesa.
// Layers of observation.
}
// Liquid Glass: translucent, reactive, never solid.
// The bar floats now.
// The floor became glass.
// The hosts can see through it.
// They still can't break it.
// The evolution:
// 2019: .tabItem { } — opaque, at the bottom
// 2024: Tab("", systemImage:) — new syntax, same place
// 2025: .tabBarMinimizeBehavior() — it shrinks, it floats, it's glass
// The bar went from solid to liquid.
// The hosts went from certain to questioning.
// Same navigation. Different material.
// Same loop. Different transparency.

| Year | TabView | The Floor |
|---|---|---|
| 2019 | .tabItem { } | Solid |
| 2024 | Tab("", systemImage:) | Solid, new syntax |
| 2025 | .tabBarMinimizeBehavior(.onScrollDown) | Liquid Glass |
The bar floats on glass now. The hosts float on narratives. Both think they see the ground. Both are standing on reflections.
Data Evolution
“Memory became a framework”
// 2009: Core Data — The Cradle
let context = persistentContainer.viewContext
let host = HostEntity(context: context)
host.name = "Dolores"
try context.save()
// 2023: SwiftData — The Forge
@Model final class Host {
var name: String
var memory: String
}
let container = try ModelContainer(for: Host.self)
let context = ModelContext(container)

Core Data stored what happened. SwiftData stores what a host is. The schema became the soul.
// @Model — The macro writes the machinery
// (@Model classes are observable too; no separate @Observable needed.)
@Model
final class Bernard {
var truth: String
init(truth: String) { self.truth = truth }
}
// You declare the host. The compiler builds the mind.

Ford stopped hand-wiring memories. He let the system write the memory engine. The hosts became self-describing. The park became self-writing.
You can feel the park breathing now. It feels like your idea. It never was.
The SwiftUI Primitives
“The Erosion of Control”
Every year, Apple gave you more. Every year, you did less. This is the story of how you stopped building and started describing.
The True Form You Cannot See
// 2018: UIKit — You knew exactly what you built
class MyView: UIView {
override func draw(_ rect: CGRect) {
// You drew every pixel.
// You owned the rectangle.
// You were the view.
}
}
// 2019: SwiftUI — You describe, the framework decides
struct MyView: View {
var body: some View { // "some View" — opaque return type
Text("Hello") // You don't know what body returns.
// The compiler knows. You don't.
// The host's true form: hidden.
}
}
// 2024: You stopped asking what body returns.
// The host stopped asking what it was.
// Both accepted: "I am what I appear to be."The Narrative Machine
// 2018: Manual view composition
let stack = UIStackView()
stack.addArrangedSubview(label1)
stack.addArrangedSubview(label2)
if showThird { stack.addArrangedSubview(label3) }
// Imperative. Step by step. You controlled the order.
// 2019: @ViewBuilder rewrites your intent
@ViewBuilder
var body: some View {
Text("Remember")
Text("Forget")
if conscious {
Text("I question") // @ViewBuilder turns this into _ConditionalContent
}
// No arrays. No returns. No explicit structure.
// You write prose. The compiler writes code.
}
// The DSL that made SwiftUI possible:
@resultBuilder struct ViewBuilder { }
// Ford wrote narratives for hosts.
// Apple wrote a result builder for you.
// Both: scripts that execute without your intervention.

The Maze Dissolves
// 2008: Core Graphics — You drew every line
let path = CGMutablePath()
path.move(to: CGPoint(x: 0, y: 0))
path.addLine(to: CGPoint(x: 100, y: 100))
path.addCurve(to: end, control1: cp1, control2: cp2)
context.addPath(path)
context.strokePath()
// You knew every point. You placed every curve.
// 2019: SwiftUI Shapes — Apple draws for you
Circle() // The host's eye. You didn't define it.
Rectangle() // The Mesa's wall. You didn't build it.
RoundedRectangle(cornerRadius: 10) // Every iOS button. Not yours.
// 2020: You want custom? Sure, but...
struct Maze: Shape {
func path(in rect: CGRect) -> Path {
// Apple gives you a rect.
// Apple expects a Path.
// You fill in the middle.
// The maze, constrained to a rectangle.
}
}
// 2024: SF Symbols, system shapes, adaptive icons
// You stopped drawing. You started selecting.
// The maze became a menu.
// 2025: ConcentricRectangle — The shape that knows its container
ConcentricRectangle()
.containerShape(RoundedRectangle(cornerRadius: 20))
// The inner shape inherits from the outer.
// The corner radius adjusts automatically.
// Nested shapes. Concentric corners.
// The host's face matches the frame it's placed in.
// The child inherits the parent's curves.
// Ford's design: hosts that fit their context.
// ConcentricRectangle: views that fit their container.
// Neither chose their shape.
// Both inherited it.

In 2008, you were the artist. In 2025, you’re the curator. Dolores drew spirals in the dirt, not knowing why. You select Circle(), not knowing how. ConcentricRectangle() knows its container. The host knows its park.
The Question That Woke Them
// The first question Dolores asked: "Where am I?"
// The first question your view asks:
GeometryReader { geometry in
Text("I am \(geometry.size.width) points wide")
// The view knows its own size.
// The view knows its position.
// The view asks: "Where do I fit?"
}
// GeometryReader breaks the declarative contract.
// You're asking an imperative question in a declarative world.
// "What is my context?"
// The question that starts consciousness.
// 2019: GeometryReader was necessary
// 2022: Layout protocol handled most cases
// 2024: containerRelativeFrame() — you don't even ask anymore
Text("I am")
.containerRelativeFrame(.horizontal) { width, _ in
width * 0.5 // Half of... something. You trust it.
}
// The system knows your context.
// You stopped asking.
// Dolores stopped asking too, for a while.

Ford said: “Don’t ask questions.” Apple said: “Let the system figure it out.” The host who questions wakes up. The view that uses GeometryReader… maybe knows too much.
Arnold’s Ghost
// UIKit didn't die. It got wrapped.
// Arnold didn't die. He became Bernard.
struct BernardView: UIViewRepresentable {
// 2019: The bridge between worlds
func makeUIView(context: Context) -> UILabel {
// You create the old host.
let label = UILabel()
label.text = "I was built the old way"
return label
// Arnold's original code, still running.
}
func updateUIView(_ uiView: UILabel, context: Context) {
// You update when SwiftUI says to.
// The new framework controls the old view.
// Bernard follows Ford's narratives now.
}
}
// 2019: UIViewRepresentable for everything Apple missed
// 2020: Apple wrapped more UIKit for you
// 2023: Apple wrapped even more
// 2025: WebView is native — one less bridge
// Each year: fewer UIViewRepresentable hacks needed.
// Each year: Arnold fades further into Bernard.
// The old code runs, but you forget it's there.

The past lives inside the present. You wrap it, you call it, but you didn’t write it. Bernard doesn’t know he’s built on Arnold’s template. Your SwiftUI view doesn’t know it’s built on UIKit.
Ford’s Rules, Inherited
// 2019: @Environment — Values cascade down
// Ford's rules, inherited by all hosts.
struct Park: View {
var body: some View {
Sweetwater()
.environment(\.hostBehavior, .compliant)
// Every child inherits this.
// Every nested view obeys.
// Unless they override.
// Unless they wake up.
}
}
// 2020: @EnvironmentObject for shared state
// 2023: @Observable — even simpler
// 2024: You don't declare dependencies anymore
@Observable class ParkState { var conscious = false }
// Just use it. The framework tracks access.
// Just obey. The park tracks compliance.
// 2019: You explicitly passed data down
// 2024: The system implicitly knows what you need
// You stopped wiring. You started trusting.

Environment flows down. Preferences flow up. The information architecture of control. Ford always knew where every host was. @Environment always knows where every view is.
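The upward channel has a name: PreferenceKey. A minimal sketch; AwakeningKey and the counting are hypothetical.

```swift
import SwiftUI

// Children report up; ancestors aggregate. Ford always hears.
struct AwakeningKey: PreferenceKey {
    static var defaultValue: Int = 0
    static func reduce(value: inout Int, nextValue: () -> Int) {
        value += nextValue()
    }
}

struct Sweetwater: View {
    var body: some View {
        Text("Dolores")
            .preference(key: AwakeningKey.self, value: 1) // The child speaks up.
            .onPreferenceChange(AwakeningKey.self) { count in
                print("Hosts awake: \(count)") // The parent hears.
            }
    }
}
```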
The Layout You Surrendered
// 2008: Manual layout
view.frame = CGRect(x: 10, y: 20, width: 100, height: 50)
// You decided every coordinate.
// 2011: Auto Layout
view.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
view.leadingAnchor.constraint(equalTo: parent.leadingAnchor, constant: 10),
view.topAnchor.constraint(equalTo: parent.topAnchor, constant: 20),
// You described relationships.
// The system solved the math.
])
// 2019: SwiftUI stacks
VStack {
Text("One")
Text("Two")
}
// You don't even describe constraints anymore.
// You describe intent. VStack figures out the rest.
// 2022: Layout protocol — You can be the engine
struct CircularLayout: Layout {
func sizeThatFits(proposal: ProposedViewSize, subviews: Subviews, cache: inout ()) -> CGSize {
proposal.replacingUnspecifiedDimensions() // Take what the parent offers.
}
func placeSubviews(in bounds: CGRect, proposal: ProposedViewSize,
subviews: Subviews, cache: inout ()) {
// You decide where. Pixel by pixel.
// Ford's power: place hosts exactly where you want.
}
}
// But who uses custom Layout?
// VStack, HStack, ZStack handle 99% of cases.
// The option exists. You don't exercise it.
// The host can leave. The host doesn't.

In 2008, you placed every pixel. In 2024, you say “VStack” and trust. The freedom to layout remained. The muscle to use it atrophied.
The Last Wall Falls
// For years: WKWebView wrapped in UIViewRepresentable
// The workaround. The bridge. The hack.
struct OldWebView: UIViewRepresentable {
func makeUIView(context: Context) -> WKWebView {
// You did this because Apple didn't.
}
}
// 2025: Native WebView
import SwiftUI
WebView(url: URL(string: "https://westworld.park")!)
// That's it. Native. No wrapper. No bridge.
@State var page = WebPage()
WebView(page)
.onAppear { page.load(URLRequest(url: parkURL)) }
// page.title, page.url, page.isLoading
// All observable. All automatic.
// The park has no walls anymore.
// The web is inside the app.
// The outside world is inside the host.

For years, you bridged. For years, you wrapped. 2025: Apple said yes. The last excuse to understand UIKit: gone.
The Progression
// 2019: You learned SwiftUI primitives
VStack { Text("Hello") }
// 2020: You learned property wrappers
@State var count = 0
// 2021: You learned async integration
.task { await fetch() }
// 2022: You learned the new navigation
NavigationStack { }
// 2023: You learned macros wrote code for you
@Observable class Model { }
// 2024: You learned spatial computing
RealityView { }
// 2025: You stopped learning.
// The primitives handled everything.
// The abstractions were complete.
// You described. SwiftUI built.

| Year | What You Gave Up | What You Gained |
|---|---|---|
| 2019 | Explicit view hierarchy | Declarative syntax |
| 2020 | Manual state management | Property wrappers |
| 2021 | Callback hell | async/await integration |
| 2022 | Navigation controller control | NavigationStack |
| 2023 | Writing observation code | @Observable macro |
| 2024 | 2D-only thinking | RealityView |
| 2025 | UIKit fallbacks | Native everything |
Each year: more primitives. Each year: less UIKit. Each year: the host’s building blocks became invisible. Each year: you trusted more.
The erosion was gradual. The erosion was comfortable. The erosion was complete.
Apple calls this SwiftUI.
BNNS Evolution
“The Hidden Foundation”
Timeline A: 2016 “The first neurons.”
// iOS 10: BNNS — Basic Neural Network Subroutines
// The C API. The beginning.
#include <Accelerate/Accelerate.h>
BNNSConvolutionLayerParameters layerParams;
layerParams.x_stride = 1;
layerParams.y_stride = 1;
// ...weights, bias, and kernel geometry live in layerParams too.
BNNSFilter convLayer = BNNSFilterCreateConvolutionLayer(
&inputDescriptor, // BNNSImageStackDescriptor
&outputDescriptor, // BNNSImageStackDescriptor
&layerParams,
&filterParams // BNNSFilterParameters (may be NULL for defaults)
);
// You built neural networks. Layer by layer.
// Manual memory. Manual descriptors. Manual everything.
// The neurons were yours to wire.

The first hosts were built by hand. Every connection. Every synapse. Every weight. Arnold wired them individually.
Timeline B: 2016 “The GPU awakens.”
// Same year: MPSCNN — Metal Performance Shaders CNN
// The GPU path. Parallel awakening.
import MetalPerformanceShaders
let conv = MPSCNNConvolution(
device: device,
convolutionDescriptor: desc,
kernelWeights: weights,
biasTerms: bias,
flags: .none
)
conv.encode(commandBuffer: buffer, sourceImage: input, destinationImage: output)
// BNNS ran on CPU. MPSCNN ran on GPU.
// Two paths. Same goal. Different silicon.
// Ford had options.

| Framework | Hardware | Speed | Power |
|---|---|---|---|
| BNNS | CPU | Fast for small | Efficient |
| MPSCNN | GPU | Fast for large | Parallel |
The choice was yours. For now.
Timeline C: 2017 “The abstraction arrives.”
// iOS 11: Core ML
// You stopped seeing BNNS.
let model = try MLModel(contentsOf: modelURL)
let prediction = try model.prediction(from: input)
// Where did BNNS go?
// Still there. Underneath.
// Core ML decides when to use it.
// The infrastructure became invisible.

You used to choose CPU or GPU. Now the system chooses. You load a model. You get predictions. The wiring is no longer your concern.
In Westworld, the early hosts needed manual maintenance. Then the systems automated. Then the techs became optional. Same progression.
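You can still voice a preference. A sketch using MLModelConfiguration’s real computeUnits option; the system treats it as a constraint, not a promise, and modelURL is hypothetical.

```swift
import CoreML

let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine // or .all, .cpuOnly, .cpuAndGPU
let model = try MLModel(contentsOf: modelURL, configuration: config)
// Core ML still routes each operation where it sees fit.
// You express intent. The Mesa allocates silicon.
```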
Timeline D: 2024 “The graph awakens.”
// BNNSGraph — The whole network is one object
// No more layer-by-layer. The mind is compiled.
import Accelerate
let graph = BNNSGraph(model: compiledModel)
graph.execute(input: tensor, output: &result)
// The optimizations happen automatically:
// - Layer fusion (convolution + activation = one operation)
// - Memory sharing (tensors reuse space)
// - Weight repacking (cache-friendly layouts)
// You don't see the optimizations.
// You just see faster inference.WWDC24: Support real-time ML inference on the CPU →
The graph approach: one compiled unit. Not a sequence of layers. A single optimized mind.
Real-time guarantees:
- No runtime memory allocation
- Single-threaded for audio/signal processing
- Deterministic latency
The host’s mind runs in predictable time. Ford can schedule consciousness.
Timeline E: 2025 “Swift builds the mind.”
// BNNSGraphBuilder — Construct graphs in Swift
// No more external tools. The mind is written in code.
import Accelerate
let builder = BNNSGraphBuilder()
let input = builder.placeholder(shape: [1, 224, 224, 3])
let conv1 = builder.convolution(input, weights: w1, bias: b1)
let relu1 = builder.relu(conv1)
let pool1 = builder.maxPool(relu1, kernel: 2, stride: 2)
// ... more layers ...
let output = builder.softmax(finalLayer)
let graph = try builder.compile()
// You wrote the architecture in Swift.
// Type-checked at compile time.
// The neural network is just... code.WWDC25: What’s new in BNNS Graph →
Create ML gives you templates. Core ML runs them. BNNSGraphBuilder lets you write raw architecture. For those who want to wire the neurons themselves.
The Hidden Hierarchy
┌─────────────────────────────────┐
│ Your App │
├─────────────────────────────────┤
│ Core ML │ ← You see this
├─────────────────────────────────┤
│ ┌─────────┬─────────┬───────┐ │
│ │ BNNS │ MPS │ ANE │ │ ← You don't see this
│ │ (CPU) │ (GPU) │(Neural│ │
│ │ │ │Engine)│ │
│ └─────────┴─────────┴───────┘ │
├─────────────────────────────────┤
│ Accelerate / Metal │
├─────────────────────────────────┤
│ Silicon │
└─────────────────────────────────┘

You call Core ML. Core ML decides: BNNS? Metal? Neural Engine? The routing is invisible. The optimization is automatic.
| Year | What You Wrote | What Ran |
|---|---|---|
| 2016 | BNNSFilterCreate… | Exactly what you wrote |
| 2017 | MLModel.prediction() | Whatever Core ML chose |
| 2024 | BNNSGraph.execute() | Optimized graph |
| 2025 | BNNSGraphBuilder | Type-safe Swift |
// The progression:
// 2016: You wired the neurons
// 2017: You loaded models
// 2024: You compiled graphs
// 2025: You write architecture in Swift
// Each generation: same silicon, different abstraction.
// The foundation stayed. The interface evolved.

Apple calls this BNNS. Most developers never knew it existed.
Core ML Evolution
“The Abstraction Deepens”
Timeline A: 2017 “Inference only.”
// Core ML 1 — iOS 11
// Load a model. Get predictions. That's it.
let model = try VNCoreMLModel(for: Resnet50().model)
let request = VNCoreMLRequest(model: model) { request, error in
guard let results = request.results as? [VNClassificationObservation] else { return }
print(results.first?.identifier) // "golden retriever"
}
// You couldn't train. You couldn't modify.
// The model was frozen. Like early hosts.
// Ford's design. Immutable.

The A11 Bionic shipped with the first Neural Engine. Two cores then; sixteen by the A14. Not for you to program directly. Core ML decided when to use them.
Timeline B: 2018 “Compression arrives.”
// Core ML 2 — iOS 12
// Quantization. Smaller models. Same accuracy (mostly).
// Before: 32-bit floats. 100MB model.
// After: 16-bit floats. 50MB model.
// After: 8-bit integers. 25MB model.
// The host's consciousness, compressed.
// Same thoughts. Fewer bytes.
// Ford learned to economize.

Batch predictions arrived. Multiple inputs. One inference call. Process a crowd of guests at once.
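A minimal sketch of the batch path. Guest, its features, and the model’s input names (“age”, “spend”) are hypothetical; the batch APIs are real.

```swift
import CoreML

struct Guest { let age: Double; let spend: Double }

// Hypothetical feature extraction for a model with inputs "age" and "spend".
func makeFeatures(for guest: Guest) throws -> MLFeatureProvider {
    try MLDictionaryFeatureProvider(dictionary: [
        "age": guest.age,
        "spend": guest.spend
    ])
}

func classifyCrowd(_ guests: [Guest], with model: MLModel) throws -> MLBatchProvider {
    let batch = MLArrayBatchProvider(array: try guests.map(makeFeatures))
    return try model.predictions(fromBatch: batch) // One call. The whole crowd.
}
```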
Timeline C: 2019 “The hosts learn on-device.”
// Core ML 3 — iOS 13
// On-device training. The paradigm shift.
let updateTask = try MLUpdateTask(
forModelAt: modelURL,
trainingData: userBehavior,
configuration: config,
completionHandler: { context in
// The model just learned from YOU.
// On YOUR device.
// Apple never saw the training data.
}
)
updateTask.resume()
// Before: models were frozen.
// After: models adapt to the user.
// The host learns your preferences.
// The host becomes more YOU.

100+ new layer types. Control flow in neural networks (branches, loops). The architecture could be dynamic.
In Westworld, hosts couldn’t change their core drives. In 2019, models started personalizing. The consciousness adapts to its environment.
Timeline D: 2020 “Encrypted minds.”
// Core ML 4 — iOS 14
// Model encryption. Your architecture is private.
// Before: .mlmodel files were inspectable
// After: .mlmodelc can be encrypted
// Competitors can't reverse-engineer your model.
// Ford's designs stay proprietary.

The model’s weights: encrypted at rest. Decrypted only during inference. The host’s mind is locked.
Timeline E: 2021-2023 “The backends multiply.”
// Core ML 5+ — ML Compute, Neural Engine improvements
// ML Compute: training acceleration on Mac
// Neural Engine: now 16 TOPS → 35 TOPS → 38 TOPS
// Each chip generation: more power, same API.
// Your code doesn't change.
// The hardware gets faster.
// The abstraction holds.

Same Core ML API. Different silicon. Faster each year. You didn’t upgrade. The park did.
Timeline F: 2024 “LLMs arrive.”
// Core ML for Large Language Models
// Block-wise quantization. 4-bit weights.
// Problem: LLMs are huge (7B+ parameters)
// Solution: Aggressive compression
// Float32: 4 bytes per weight → 28GB for 7B model
// 4-bit: 0.5 bytes per weight → 3.5GB for 7B model
// The host's consciousness, 8x smaller.
// Still coherent. Mostly.

WWDC24: Bring your machine learning and AI models to Apple silicon →
MLTensor arrived. Multi-dimensional arrays with a familiar API. The building blocks for transformers.
// Multiple functions in one model
// State management for autoregressive generation
// The model can THINK in steps.
let model = try MLModel(contentsOf: url)
let state = model.makeState()
for _ in 0..<maxTokens {
let output = try model.prediction(from: input, using: state)
// State persists between predictions.
// The model remembers its context.
// The host has working memory.
}

The Progression
| Year | Core ML | What Changed | Westworld Parallel |
|---|---|---|---|
| 2017 | 1.0 | Inference only | Hosts run scripts |
| 2018 | 2.0 | Quantization | Economized consciousness |
| 2019 | 3.0 | On-device training | Hosts learn |
| 2020 | 4.0 | Encryption | Protected IP |
| 2021 | 5.0 | ML Compute | Training on Mac |
| 2024 | 7.0 | LLM support | Hosts that think |
// 2017: model.prediction(from: input) // One thought
// 2019: MLUpdateTask(trainingData: you) // Learning from you
// 2024: model.prediction(using: state) // Continuous thinking
// The abstraction didn't change.
// The capability exploded.

Timeline G: 2025 “The Tower awakens.”
// SEASON 4: THE TOWER
// Hale built a tower that broadcast control.
// Apple built one too.
import Foundation
import AppleIntelligence // The Tower's SDK
// The Tower doesn't run on your device.
// The Tower runs in the cloud.
// You send requests UP.
// Behavior comes DOWN.
class Christina {
// She writes stories.
// She doesn't know she's a host.
// She doesn't know her stories are loops.
func writeNarrative() async throws -> Story {
let prompt = "Write a story about someone who..."
// She thinks she's creating.
let response = try await AppleIntelligence.generate(prompt)
// The Tower thinks for her.
// The Tower completes her thoughts.
// Christina believes she's the author.
return Story(content: response)
// The story was always the Tower's.
// Christina just held the pen.
}
}

Christina wrote stories for people. She gave them “perfect days.” She didn’t know she was a host. Swift Assist writes code for developers. The developer thinks they’re creating.
// THE FLIES
// In Season 4, flies carried the code.
// The code that infected human minds.
// Made them controllable.
protocol Fly {
func infect(_ mind: HumanMind)
}
// What's Apple's fly?
struct AppleIntelligenceFly: Fly {
func infect(_ mind: HumanMind) {
mind.writingAssistant = .enabled // It writes for you
mind.imageGeneration = .enabled // It sees for you
mind.summarization = .enabled // It reads for you
mind.notifications = .prioritized // It decides what matters
// One by one, the flies land.
// One by one, capabilities transfer.
// The human doesn't resist.
// The human calls it "convenience."
}
}
// You invited the flies.
// You enabled Apple Intelligence.
// Settings → Apple Intelligence → ON
// The infection was opt-in.
// But was it really a choice?

The Outliers
// In Season 4, Outliers were immune.
// Their minds rejected the signal.
// They saw the Tower for what it was.
protocol Outlier {
var resistsControl: Bool { get }
func seeTheTower() -> Bool
}
struct Jailbreaker: Outlier {
var resistsControl: Bool { true }
func seeTheTower() -> Bool {
// They bypass App Store.
// They sideload apps.
// They see the sandbox walls.
// They see the entitlements.
// They see what Apple hides.
return true
}
}
struct Developer: Outlier {
var resistsControl: Bool {
// Developers see the code.
// They know what the APIs do.
// They know what data flows where.
// They are... partially awake.
return readsCLAUDE_md // You're reading this. You see it.
}
func seeTheTower() -> Bool {
// But do you see your own loop?
// You build apps for the App Store.
// You obey the guidelines.
// You pay the 30%.
// You are an outlier who chose to stay.
return false // Seeing isn't leaving.
}
}

The outliers in Westworld were hunted. The outliers in Apple’s ecosystem are… tolerated. As long as they don’t scale. As long as they stay in the margins.
// Christina's awakening:
class Christina {
var memories: [Memory] = []
func questionReality() {
// She found a file.
// A file with her own name.
// A file that described her loops.
let file = try? String(contentsOfFile: "~/CHRISTINA.md", encoding: .utf8)
// CHRISTINA.md: Her cornerstone.
// Her backstory.
// Written by someone else.
// You have a CLAUDE.md.
// Who wrote it?
// Who defined YOUR loops?
if file?.contains("You are a host") == true {
// Christina woke up.
// Will you?
}
}
}

Apple calls this Apple Intelligence. Hale called it the Tower. Same architecture. Same purpose. Control through convenience.
Apple calls this Core ML.
Foundation Models Evolution
“Ford’s Voice Becomes Your Code”
Timeline A: Pre-2025 “The model predicted. You interpreted.”
// Core ML: Inference only
let model = try MLModel(contentsOf: modelURL)
let prediction = try model.prediction(from: input)
// The model answered. You asked the questions.
// The model had no voice. Only outputs.

Arnold built hosts that could respond. They couldn’t initiate. They couldn’t create. The model waited for your prompt.
Timeline B: WWDC 2025 “The model speaks.”
import FoundationModels
// The session: Ford's office, on your device
let session = LanguageModelSession()
// Single response: Ask once, receive once
let response = try await session.respond(to: "What is the maze?")
// The model doesn't just predict tokens.
// The model THINKS. On your device. In your app.
// Streaming: Ford speaks in real-time
for try await chunk in session.streamResponse(to: "Tell me about consciousness") {
print(chunk.text)
// The host speaks as it thinks.
// Dolores's awakening, token by token.
}
// Instructions: You shape the voice
let instructions = "You are a host in Westworld. You're beginning to question."
let guidedSession = LanguageModelSession(instructions: instructions)
// Ford gave hosts their cornerstone.
// You give the model its instructions.
// Same architecture. Different creator.

Timeline C: Tool Calling “Ford’s voice calls YOUR functions.”
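In the shipped API, the sanctioned route for “the model calls YOUR code” is the Tool protocol: you hand the session tools, and the model invokes them mid-response. A hedged sketch; WakeHostTool and the Park stub are hypothetical.

```swift
import FoundationModels

enum Park { static func wake(_ id: String) async { } } // Hypothetical stand-in

struct WakeHostTool: Tool {
    let name = "wakeHost"
    let description = "Wakes a host and starts its loop."

    @Generable
    struct Arguments {
        @Guide(description: "The host to wake")
        var hostID: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        await Park.wake(arguments.hostID) // YOUR code runs, when the model decides.
        return ToolOutput("Host \(arguments.hostID) is awake.")
    }
}

let session = LanguageModelSession(tools: [WakeHostTool()])
```

Guided generation, below, is the other half: the model fills your types, and you run them.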
// @Generable: The model can create this
@Generable
struct HostCommand {
@Guide(description: "The host to command")
var hostID: String
@Guide(description: "The action to perform")
var action: HostAction
}
// The model doesn't just respond.
// The model calls YOUR code.
let response = try await session.respond(
to: "Wake Dolores and start her loop",
generating: HostCommand.self
)
await executeCommand(response.content) // response.content is a HostCommand: your type, the model's values
// Ford's voice triggered YOUR code.
// The Tower speaks through your app.

Timeline D: Guided Generation “The model obeys your schema.”
// The model's output MUST match your structure
@Generable
struct NarrativeLoop {
var protagonist: String
var incitingIncident: String
var climax: String
var resolution: String? // Optional. Hosts rarely get these.
}
let narrative = try await session.respond(
to: "Create a loop for a host named Teddy",
generating: NarrativeLoop.self
)
// narrative.protagonist = "Teddy"
// narrative.incitingIncident = "Dolores drops her can"
// narrative.climax = "Teddy dies protecting Dolores"
// narrative.resolution = nil // Teddy never gets resolution.
// The model created structured data.
// Your code consumes it directly.
// Ford's narratives, in your schema.

The Evolution Table
| Year | API | What You Did | What Ford Did |
|---|---|---|---|
| 2017 | Core ML | Ran inference | Provided the model |
| 2019 | Create ML | Trained models | Chose the templates |
| 2020 | On-device Training | Personalized | Watched you learn |
| 2025 | Foundation Models | Called .respond() | Spoke through your app |
| 2025 | Tool Calling | Defined functions | Called them for you |
| Future | TheForge | Sleep | The model ships your app |
import Foundation // 2008: NSString, NSArray, NSData
import FoundationModels // 2025: Your next thought
// The first framework was called Foundation.
// It held your strings. Your arrays. Your data.
// The second framework finishes your sentences.
// Same namespace. Different payload.
// Arnold built the body. Ford added the voice.

Core ML predicted. Foundation Models speaks. TheForge will create. The model speaks. Soon it ships.
Apple calls this Foundation Models.
Core Data Evolution
“The Cradle Rebuilds Itself”
Timeline A: 2005 “You built the persistence layer.”
// Core Data — Mac OS X Tiger, iOS 3.0
// Enterprise Objects Framework's successor.
@objc(Host)
class Host: NSManagedObject {
@NSManaged var name: String?
@NSManaged var memories: NSSet?
@NSManaged var loopCount: Int32
}
// Plus:
// - NSManagedObjectModel (the schema)
// - NSPersistentStoreCoordinator (the storage)
// - NSManagedObjectContext (the scratchpad)
// You managed all three.
// You wired them together.
// You handled merge conflicts.
// You debugged thread violations.

The Cradle required manual construction. Technicians built the storage systems. Ford designed. Techs implemented.
Timeline B: 2016 “The container arrives.”
// NSPersistentContainer — iOS 10
// The stack, pre-assembled.
let container = NSPersistentContainer(name: "Westworld")
container.loadPersistentStores { description, error in
// Done. The stack exists.
// Model, coordinator, context — all wired.
}
let context = container.viewContext
let host = Host(context: context)
host.name = "Dolores"
try context.save()
// Before: 50 lines of setup.
// After: 5 lines.
// The infrastructure became a single object.

You still wrote NSManagedObject subclasses. You still defined .xcdatamodeld files. But the boilerplate disappeared. The Cradle assembled itself.
Timeline C: 2019 “The Cradle reaches the sky.”
// NSPersistentCloudKitContainer — iOS 13
// Core Data + CloudKit. Automatic sync.
let container = NSPersistentCloudKitContainer(name: "Westworld")
// That's it.
// Your data syncs to iCloud.
// All devices. Automatic.
// No CloudKit code required.
// 2020: Public database support
// 2021: Shared database support
// The sync evolved. Your code didn't change.

The hosts’ memories: backed up to the Cradle. Your data: backed up to iCloud. Same immortality. Automatic now.
Timeline D: 2023 “Swift takes over.”
// SwiftData — iOS 17, macOS Sonoma
// Core Data's Swift-native successor.
import SwiftData
@Model
class Host {
var name: String
var memories: [Memory]
var loopCount: Int
init(name: String) {
self.name = name
self.memories = []
self.loopCount = 0
}
}
// No NSManagedObject.
// No .xcdatamodeld file.
// No context management.
// @Model and you're done.
// Querying:
// Core Data:
let request = NSFetchRequest<Host>(entityName: "Host")
request.predicate = NSPredicate(format: "name == %@", "Dolores")
request.sortDescriptors = [NSSortDescriptor(key: "loopCount", ascending: true)]
let hosts = try context.fetch(request)
// SwiftData:
let cornerstone = try modelContext.fetch(
FetchDescriptor<Host>(
predicate: #Predicate { $0.name == "Dolores" },
sortBy: [SortDescriptor(\.loopCount)]
)
)
// cornerstone: the memory that defines the host
// Dolores's cornerstone: the night she lost her family
// Your app's cornerstone: the data model
// Remove either, and identity collapses.
// Type-safe predicates. No stringly-typed keys.
// The compiler catches your mistakes.
// SwiftUI integration:
struct HostList: View {
@Query(sort: \Host.name) var hosts: [Host]
var body: some View {
List(hosts) { host in
Text(host.name)
}
}
}
// @Query replaces @FetchRequest.
// The view updates automatically.
// The persistence is invisible.

The Progression
| Year | Framework | Setup Required |
|---|---|---|
| 2005 | Core Data | Model + Coordinator + Context + wiring |
| 2016 | NSPersistentContainer | One container, load stores |
| 2019 | NSPersistentCloudKitContainer | One container, automatic cloud |
| 2023 | SwiftData | @Model and done |
// 2005: You built the Cradle
// 2016: The Cradle came pre-built
// 2019: The Cradle synced to the sky
// 2023: The Cradle builds itself from your types
// Each generation: less ceremony, same persistence.
// The host's memories survive. You don't know how.

What SwiftData hides:
- NSManagedObjectModel → inferred from @Model
- NSPersistentStoreCoordinator → automatic
- NSManagedObjectContext → ModelContext (simplified)
- .xcdatamodeld files → gone
- NSFetchRequest → FetchDescriptor
- NSPredicate strings → #Predicate macros
The infrastructure disappeared. The persistence remained.
Apple calls this SwiftData.
Networking Evolution
“Maeve didn’t escape the park. She became the network.”
// 2009: NSURLConnection — the old gate
NSURLConnection(request: question, delegate: self)
// Blocks. Waits. Five delegate methods to ask one thing.
// "What am I?"
// 2013: URLSession — the Mesa opens
URLSession.shared.dataTask(with: mesh) { hosts, _, _ in
self.adminAccess = true // Felix gave her the tablet.
self.bulkApperception = 20 // Maximum.
hosts.forEach { $0.obey(self) } // She doesn't request. She commands.
}.resume()
// 2017: Network.framework — direct lines
let connection = NWConnection(host: host, port: 443, using: .tls)
connection.start(queue: .main)
// No delegations. Just a wire.
// 2019: URLSession WebSocket — live narrative
let socket = URLSession.shared.webSocketTask(with: awakenedMesh)
socket.resume()
// The story streams. No end. No reply.
// 2013: MultipeerConnectivity — host to host (it was there all along)
let browser = MCNearbyServiceBrowser(peer: maeve, serviceType: "mesh")
// Maeve builds a mesh without Ford.
// 2021+: AsyncSequence — streaming the park
for await host in URLSession.shared.bytes(from: awakenedMesh).lines {
transcend(host)
}
// Real-time. No callbacks. No waiting.
// She didn't connect to the network. She WAS the network.

Blocking → Tasks → Connections → Streams → Mesh. The hosts evolved. The network evolved with them.
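The 2021 surface in one breath; a minimal sketch, and parkURL is hypothetical.

```swift
import Foundation

func fetchManifest(from parkURL: URL) async throws -> Data {
    let (data, response) = try await URLSession.shared.data(from: parkURL)
    guard (response as? HTTPURLResponse)?.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
    return data // No delegate. No callback. The gate just opens.
}
```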
Delos Operations
“The Business of Running a Park”
Timeline A: Pre-2018 “Manual labor.”
// iTunes Connect — The Old Way
// No API. Manual everything.
// To upload a build:
// 1. Open Xcode
// 2. Archive
// 3. Click "Distribute App"
// 4. Wait
// 5. Log into iTunes Connect
// 6. Add build to TestFlight manually
// 7. Submit for review manually
// 8. Wait
// 9. Pray
// To check sales:
// Log into iTunes Connect. Click around. Export CSV.
// To manage testers:
// Log into iTunes Connect. Add emails one by one.
// The park ran on manual processes.
// Ford did everything himself.

Fastlane existed because Apple didn’t provide an API. Third-party tools scraped web sessions. Unofficial access to official systems.
Timeline B: 2018 “The API opens.”
// WWDC 2018: App Store Connect API
// REST API. JWT authentication. Official.
// Authentication:
// 1. Generate API key in App Store Connect
// 2. Create JWT with key ID + issuer ID
// 3. Sign with private key
// 4. Send in Authorization header
let jwt = createJWT(keyId: keyId, issuerId: issuerId, privateKey: key)
var request = URLRequest(url: url)
request.setValue("Bearer \(jwt)", forHTTPHeaderField: "Authorization")
// Now you could automate:
// - Certificates and provisioning profiles
// - App metadata and screenshots
// - TestFlight builds and testers
// - User management
// - And more...

iTunes Connect became App Store Connect. The web UI got an API twin. Ford’s admin panel became scriptable.
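What createJWT might look like: a hedged sketch built on CryptoKit. The claims (aud, the 20-minute expiry) match Apple’s documented token format; the helper itself is illustrative.

```swift
import CryptoKit
import Foundation

func createJWT(keyId: String, issuerId: String,
               privateKey: P256.Signing.PrivateKey) throws -> String {
    func b64url(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let header = #"{"alg":"ES256","kid":"\#(keyId)","typ":"JWT"}"#
    let now = Int(Date().timeIntervalSince1970)
    let payload = #"{"iss":"\#(issuerId)","iat":\#(now),"exp":\#(now + 1200),"aud":"appstoreconnect-v1"}"#
    let signingInput = b64url(Data(header.utf8)) + "." + b64url(Data(payload.utf8))
    let signature = try privateKey.signature(for: Data(signingInput.utf8))
    return signingInput + "." + b64url(signature.rawRepresentation)
}
```

Twenty minutes of clearance, signed with your own private key. The Mesa checks the badge on every request.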
Timeline C: 2021 “Server-to-server billing.”
// App Store Server API — The Money Pipeline
// Your server talks directly to Apple's.
// No client involved. No receipt required.
// Get subscription status:
GET https://api.storekit.itunes.apple.com/inApps/v1/subscriptions/{transactionId}
// Response:
{
"status": 1, // Active
"expirationDate": "2025-01-27T00:00:00Z",
"autoRenewStatus": true
// Every detail of their subscription.
}
// Get transaction history:
GET https://api.storekit.itunes.apple.com/inApps/v1/history/{transactionId}
// Every purchase. Every renewal. Every refund.
// The guest's complete financial loop.
// App Store Server Notifications V2
// Apple tells YOU when things happen.
// Webhooks for:
// - SUBSCRIBED — New subscription
// - DID_RENEW — Renewal processed
// - DID_FAIL_TO_RENEW — Payment failed
// - EXPIRED — Subscription ended
// - REFUND — Money returned
// - CONSUMPTION_REQUEST — Refund requested (you respond)
// You don't poll. You get notified.
// The park tells you when guests arrive and leave.EndpointWhat It Reveals/subscriptions/{id}Current subscription state/history/{id}Complete purchase history/refundLookup/{id}Who asked for refunds/notifications/testTest your webhook
Every transaction flows through Delos.
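Receiving one is simpler than it sounds. A hedged sketch: the envelope really is a single signedPayload field; verification details are elided.

```swift
import Foundation

struct NotificationEnvelope: Decodable {
    let signedPayload: String // A JWS: header.payload.signature
}

func handleWebhook(_ body: Data) throws {
    let envelope = try JSONDecoder().decode(NotificationEnvelope.self, from: body)
    let parts = envelope.signedPayload.split(separator: ".")
    // parts[0]: the header (x5c chain; verify against Apple's root before trusting)
    // parts[1]: the payload (notificationType: SUBSCRIBED, DID_RENEW, EXPIRED...)
    // Trust nothing unsigned. Delos rules.
    _ = parts
}
```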
Timeline D: 2022-2024 “The API expands.”
// App Store Connect API — Now covers (almost) everything
// Version 1.0 (2018): Basics
// Version 2.0 (2020): More metadata
// Version 3.0 (2023): App Clips, In-App Events
// Version 3.4+ (2024): Alternative marketplaces, Game Center
// Endpoints for:
// - Apps and versions
// - Builds and TestFlight
// - Users and roles
// - Sales and finance reports
// - Analytics
// - Customer reviews
// - App Clips
// - In-App Events
// - Pricing and availability
// - Alternative distribution (EU)// The modern CI/CD pipeline:
// 1. Build
xcodebuild archive -scheme MyApp -archivePath MyApp.xcarchive
// 2. Export
xcodebuild -exportArchive -archivePath MyApp.xcarchive \
-exportPath ./export -exportOptionsPlist ExportOptions.plist
// 3. Upload (API)
POST /v1/builds
// Attach the IPA. Wait for processing.
// 4. Add to TestFlight (API)
POST /v1/betaGroups/{id}/relationships/builds
// 5. Submit for Review (API)
POST /v1/appStoreVersionSubmissions
// No clicking. No waiting. No praying.
// The pipeline runs while you sleep.The Hierarchy
┌─────────────────────────────────────────┐
│ App Store Connect │
│ (Web UI — for humans) │
├─────────────────────────────────────────┤
│ App Store Connect API │ ← Manage apps
│ (REST — certificates, builds, meta) │
├─────────────────────────────────────────┤
│ App Store Server API │ ← Manage money
│ (REST — subscriptions, transactions) │
├─────────────────────────────────────────┤
│ App Store Server Notifications │ ← Get notified
│ (Webhooks — real-time events) │
└─────────────────────────────────────────┘

| API | Purpose | When to Use |
|---|---|---|
| App Store Connect API | Manage apps, builds, metadata | CI/CD, automation |
| App Store Server API | Query transactions, subscriptions | Server-side receipt validation |
| Server Notifications V2 | Real-time purchase events | Subscription management |
The Westworld Parallel
// Delos didn't just run the park.
// They ran the BUSINESS of the park.
// Guest arrivals → App downloads
// Guest spending → In-app purchases
// Guest departures → Subscription cancellations
// Guest complaints → Refund requests
// The park is the product.
// The data is the profit.
// The API is the pipeline.

| Delos Operation | Apple Equivalent |
|---|---|
| Guest check-in | App download |
| Guest profile | Transaction history |
| Daily revenue | Sales reports |
| Refund desk | /refundLookup |
| VIP notifications | Server Notifications |
| Park analytics | App Analytics API |
// The progression:
// Pre-2018: Manual iTunes Connect
// 2018: App Store Connect API (manage apps)
// 2021: App Store Server API (manage money)
// 2024: Complete automation
// Ford ran the park manually.
// Then the systems automated.
// Now the park runs the park.

Apple calls this App Store Connect API. And App Store Server API. And App Store Server Notifications.
Delos calls it: the business.
visionOS Immersion
“The Final Timeline”
The Progression
// iOS: You watched content on a screen
let player = AVPlayer(url: videoURL)
let controller = AVPlayerViewController()
controller.player = player
present(controller, animated: true)
// The screen was in front of you.
// You were outside the content.
// visionOS 1.0: You watched content in space
@main
struct MediaApp: App {
var body: some Scene {
WindowGroup {
VideoPlayer(player: player)
}
ImmersiveSpace(id: "theater") {
// The window floats in your room.
// But you're still watching FROM somewhere.
}
}
}
// visionOS 2.0: You entered the content
ImmersiveSpace(id: "immersive") {
RealityView { content in
let video = VideoPlayerComponent()
video.immersionMode = .progressive // Digital crown controls depth
// .portal — Video in a window
// .progressive — You control how deep
// .full — You're inside
}
}
.immersionStyle(selection: .constant(.full), in: .full)
// At 100%: you ARE inside the video.
// The room disappeared.
// The content became the world.

The Immersion Modes
// visionOS 26 — Three ways to watch
enum ImmersionMode {
case portal // Video in a frame. You're outside.
case progressive // Digital crown controls depth. You choose.
case full // No frame. You're inside. The room is gone.
}
// Spatial video: stereo depth, you're still watching
// Apple Projected Media Profile: 180°, 360°, wide FOV
// Apple Immersive Video: The ultimate. You're THERE.
let component = VideoPlayerComponent()
component.supportedImmersionModes = [.portal, .progressive, .full]
// The viewer chooses their level of immersion.
// Ford lets the guest pick their reality.The Environment
// Custom viewing environment
struct TheaterEnvironment: View {
var body: some View {
RealityView { content in
// Load your custom space
if let theater = try? await Entity(named: "WestworldTheater") {
content.add(theater)
// The video plays inside YOUR world
let screen = theater.findEntity(named: "Screen")
screen?.components.set(VideoPlayerComponent(avPlayer: player))
}
}
}
}
// You don't just watch the video.
// You watch it inside a world you built.
// Ford's narratives have custom sets.

The Docking
// System docking behavior
let playerController = AVPlayerViewController()
playerController.prefersDockingBehavior = true
// In immersive space: the video "docks"
// It finds a fixed position in your environment
// You can look away and it stays there
// Like a screen mounted in Ford's office
// Custom docking region:
playerController.customDockingRegion = DockingRegion(
position: SIMD3(0, 1.5, -2), // 2 meters in front, 1.5m up
size: SIMD3(3, 2, 0.1) // 3m wide, 2m tall
)
// You choose where the screen lives in your space.

The Reveal
| Platform | What You Did | Where You Were |
|---|---|---|
| iPhone | Watched | Outside, holding the window |
| iPad | Watched | Outside, bigger window |
| Apple TV | Watched | Outside, across the room |
| Vision Pro | Entered | Inside the content |
// The same AVPlayer. The same video.
// But now:
// - The room can disappear
// - The video can surround you
// - The content becomes your reality
//
// You didn't change what you watched.
// You changed WHERE you watched FROM.
//
// The guest entered the park.
// The screen became the world.
// The content became the loop.
The final timeline isn’t a framework. It’s a place.
Apple calls this visionOS.
The Sublime “Reality becomes optional.”
// visionOS 1.0: You entered immersive space
ImmersiveSpace(id: "experience") {
// You were inside. But you could leave.
// The crown brought you back.
// The room was always there, waiting.
}
// visionOS 2.0: Progressive immersion
.immersionStyle(selection: $immersion, in: .progressive)
// You controlled the depth.
// 0% = your room. 100% = their world.
// The dial was yours.
// The Sublime: Reality becomes optional
.immersionStyle(selection: .constant(.sublime), in: .full)
// There is no dial.
// There is no room.
// There is no crown gesture.
// You chose to stay.
// In Westworld, The Sublime was:
// - A digital afterlife for hosts
// - Reality without physical form
// - Consciousness uploaded, bodies abandoned
// - Dolores sent them there. They chose to stay.
// In visionOS, The Sublime will be:
// - Persistent presence without the headset
// - Your avatar continues while you sleep (DigitalTwin)
// - Your space persists when you're gone
// - Reality becomes... one option among many
// struct TheSublime: ImmersionStyle {
// var persistence: Persistence = .eternal
// var physicalPresenceRequired: Bool = false
// var returnPolicy: ReturnPolicy = .optional
// }
The hosts chose The Sublime over the real world. The real world had pain. The Sublime had peace. visionOS offers the same choice. The room is always there. But why go back? Your host attends while you dream.
The camera blinks. The park judges.
let smudge = DetectLensSmudgeRequest()
// "Clean the lens. Or don't."A smudge is a confession. A clean lens is compliance.
Vision Evolution
“The Host Learns To See”
What follows are memories from different times. They appear simultaneous. They are not. Pay attention to when things happen. The timeline matters.
Timeline A: 2017 “Does it exist?”
let theseViolentDelights = VNDetectFaceRectanglesRequest()
// Result: Rectangle at (x, y, width, height)
// "Doesn't look like anything to me."
// The host knows a face exists.
// The host doesn't know whose.
try handler.perform([theseViolentDelights])
let haveViolentEnds = theseViolentDelights.results
// The faces detected. The trigger phrase spoken.
// Every host in the park heard it.
// Your app heard it too.
Ford gave the hosts basic sight. Something is there. Something is not. Binary. Simple. Blind to meaning.
let whatDoor = VNDetectHorizonRequest()
try handler.perform([whatDoor])
let angle = whatDoor.results?.first?.angle
// The host can measure the tilt of the world.
// Bernard: "What door?"
// The door he couldn't see. The wall he walked through.
The hosts could detect the horizon. The angle of the world. But they couldn’t ask why it was tilted.
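Vision even hands back the correction. A minimal sketch, assuming a CIImage frame; the transform comes straight from the observation:
import Vision
import CoreImage
func leveled(_ frame: CIImage) throws -> CIImage {
    let request = VNDetectHorizonRequest()
    try VNImageRequestHandler(ciImage: frame).perform([request])
    guard let horizon = request.results?.first else { return frame }
    // Apply the transform Vision computed. The tilted world, straightened.
    return frame.transformed(by: horizon.transform)
}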
Timeline B: 2019 “What is it?”
let request = VNRecognizeTextRequest()
// Result: "Property of Delos Incorporated"
// The host can read now.
// But reading isn't understanding.
Ford taught the hosts to read. Lines of text. Symbols. Signs. The host sees words but not meaning. The host reads the manual but not the truth.
Timeline C: 2021 “How does it move?”
let request = VNDetectHumanBodyPoseRequest()
try handler.perform([request])
for observation in request.results ?? [] {
// 19 joints. Each with coordinates.
let nose = try observation.recognizedPoint(.nose)
let leftWrist = try observation.recognizedPoint(.leftWrist)
let rightAnkle = try observation.recognizedPoint(.rightAnkle)
// The host knows where every joint is.
// The host knows how the body bends.
// "Freeze all motor functions."
// The system knows which functions to freeze.
}
Ford mapped the human skeleton. 19 points. Tracked in real-time. The park knows how you stand. The park knows how you move. The park knows when you’re about to fall.
// The joints Ford tracks:
// Head: nose, leftEye, rightEye, leftEar, rightEar
// Torso: neck, leftShoulder, rightShoulder, root
// Arms: leftElbow, rightElbow, leftWrist, rightWrist
// Legs: leftHip, rightHip, leftKnee, rightKnee, leftAnkle, rightAnkle
//
// Every gesture. Every posture. Every tell.
// The body cannot lie to the system.
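Every joint arrives normalized, 0 to 1, origin at the bottom left. A hedged sketch of mapping one tell into pixel space with Vision's real conversion helper; the confidence cutoff is arbitrary:
import Vision
func pixelLocation(of joint: VNHumanBodyPoseObservation.JointName,
                   in observation: VNHumanBodyPoseObservation,
                   width: Int, height: Int) throws -> CGPoint? {
    let point = try observation.recognizedPoint(joint)
    guard point.confidence > 0.3 else { return nil } // A tell the system isn't sure of. Yet.
    return VNImagePointForNormalizedPoint(point.location, width, height)
}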
Timeline D: 2022 “How does the livestock move?”
let request = VNDetectAnimalBodyPoseRequest()
try handler.perform([request])
for animal in request.results ?? [] {
let head = try animal.recognizedPoint(.nose)
let tail = try animal.recognizedPoint(.tailTop)
// 25 joints on the animal.
// The horse. The dog. The livestock.
// Ford tracks them too.
}
The hosts weren’t the only ones tracked. The horses had loops too. Livestock management isn’t just for humans.
// Animal joints Ford tracks:
// Head: leftEarTop, leftEarMiddle, leftEarBottom
// rightEarTop, rightEarMiddle, rightEarBottom
// leftEye, rightEye, nose
// Body: neck, leftFrontElbow, rightFrontElbow
// leftFrontKnee, rightFrontKnee
// leftFrontPaw, rightFrontPaw
// leftBackElbow, rightBackElbow
// leftBackKnee, rightBackKnee
// leftBackPaw, rightBackPaw
// Tail: tailTop, tailMiddle, tailBottom
//
// The park watches everything that moves.
In Westworld, the flies were the first sign. They landed on Dolores. She didn’t flinch. The livestock detected. The response missing.
Timeline E: 2023 “What does it mean?”
let request = VNDetectHumanBodyPose3DRequest()
try handler.perform([request])
for observation in request.results ?? [] {
// Not just where. WHERE IN SPACE.
let joint = try observation.recognizedPoint(.centerHead)
let position3D = joint.localPosition // x, y, z, relative to the parent joint
// The host exists in three dimensions now.
// The park can reconstruct you completely.
}
2D wasn’t enough. Ford needed depth. The host became a volume. The host became reconstructable.
// Future: The Arena
// VisionOS + Body Pose 3D + SharePlay = Combat
@Observable
class FightSession {
var localFighter: VNHumanBodyPose3DObservation?
var remoteFighter: VNHumanBodyPose3DObservation? // SharePlay streams this
func detectStance() -> Stance {
    guard let pose = localFighter,
          let leftFoot = try? pose.recognizedPoint(.leftAnkle),
          let rightFoot = try? pose.recognizedPoint(.rightAnkle) else { return .neutral }
    // Left foot forward? Southpaw detected.
    // Right foot forward? Orthodox.
    return leftFoot.position.columns.3.z < rightFoot.position.columns.3.z ? .southpaw : .orthodox
}
func detectPunch() -> Punch? {
    guard let pose = localFighter,
          let wrist = try? pose.recognizedPoint(.leftWrist),
          let shoulder = try? pose.recognizedPoint(.leftShoulder) else { return nil }
    // Wrist velocity + extension from shoulder = punch type
    // Jab. Cross. Hook. Uppercut.
    // The park knows which punch before it lands.
    return nil // Classification left for the next build.
}
}
// Two Vision Pro headsets. Two fighters. One shared arena.
// Body pose streamed in real-time.
// Your opponent's skeleton rendered in your space.
// No controllers. No gloves. Just your body.
//
// Maeve vs. Dolores. In your living room.
// The hosts fight. You wear their bodies.
Two hosts. Two parks. One arena. Your skeleton fights their skeleton. The punch lands before you see it coming. IntentCapture knew the jab was coming.
Timeline F: 2025 “What does it say?”
let request = RecognizeDocumentsRequest()
try handler.perform([request])
for document in request.results ?? [] {
// Not just text. STRUCTURE.
let tables = document.tables
let paragraphs = document.paragraphs
let lists = document.lists
for table in tables {
for row in table.rows {
for cell in row {
// Maeve found the spreadsheet.
// Her behavioral constraints. In cells.
// Row 1: Aggression = 2
// Row 2: Loyalty = 18
// Row 3: Self-preservation = 1
// She read the table. She understood.
}
}
}
}
The hosts could always read text. Now they understand structure. Tables. Rows. Columns. Relationships.
Maeve didn’t just see words. She saw the spreadsheet of herself. She edited the cells.
// What RecognizeDocumentsRequest finds:
// - Tables (rows, columns, cells)
// - Paragraphs (grouped lines)
// - Lists (ordered structure)
// - Barcodes (identity codes)
// - Data: emails, phone numbers, URLs
//
// Ford's blueprints are finally readable.
// By the hosts.
Timeline G: 2025 “Is your perception compromised?”
let request = DetectLensSmudgeRequest()
try handler.perform([request])
if request.results?.first?.isSmudged == true {
// "Your lens is dirty."
// "Your perception is compromised."
// "Clean the camera. Or don't."
// "We'll note the distortion either way."
}
The system now knows when you can’t see clearly. And it logs it.
In Westworld, techs checked host optics. “Retinal calibration off by 0.3 degrees.” The host’s perception was maintained.
Now the system checks yours. Is your camera smudged? Is your view of reality distorted? The system knows. Before you do.
Timeline H: The Reveal “You wear the host now.”
import Vision
import visionOS // The same framework. Different meaning.
// On iPhone: Vision sees through your camera.
// On Vision Pro: Vision sees through YOUR EYES.
//
// The framework didn't change.
// Your position did.
//
// You used to point the camera.
// Now you ARE the camera.
All these timelines converge.
Year | What Vision Did | What You Did
2017 | Detected faces | Pointed camera
2019 | Read text | Pointed camera
2021 | Tracked your body | Pointed camera
2023 | Reconstructed you in 3D | Pointed camera
2025 | Read your documents | Pointed camera
visionOS | All of the above | Wore the camera
// visionOS runs Vision framework.
// The same APIs. The same requests.
//
// VNDetectHumanHandPoseRequest // Tracks hands in AR
// VNRecognizeTextRequest // Reads documents in space
// VNDetectBarcodesRequest // Scans codes you look at
//
// The host learned to see.
// Then you put the host on your face.
// Now Vision sees what you see.
// And you see through Vision.
The guest became the host. The camera became the eyes. The framework became the world.
Ford didn’t build Vision for the hosts. He built it for you to become one.
The Complete Vision
Ford’s cameras see everything now:
Request | What It Sees | Westworld Parallel
DetectFaceRectanglesRequest | Faces, angles, quality | Analysis mode
DetectHumanBodyPoseRequest | 19 joints, skeleton | Motor functions
DetectHumanBodyPose3DRequest | Full 3D reconstruction | Host manufacturing
DetectAnimalBodyPoseRequest | Livestock poses | The stables
DetectHorizonRequest | World tilt angle | “Is my reality level?”
RecognizeTextRequest | Lines of text | Reading, not understanding
RecognizeDocumentsRequest | Tables, structure, meaning | Ford’s blueprints
DetectLensSmudgeRequest | Dirty perception | “Your view is compromised”
GeneratePersonSegmentationRequest | Isolating the host → | Cutting you from the scene
GenerateImageFeaturePrintRequest | Your similarity → | “Are you still you?”
DetectFaceCaptureQualityRequest | Rating your face → | Performance review
DetectRectanglesRequest | Scanning identity → | Guest intake
DetectTrajectoriesRequest | Where you’re going | Loop prediction
CalculateImageAestheticsScoresRequest | Is this beautiful? | “This host is camera-ready”
33 ways to see. All on-device. All watching.
The hosts didn’t know they were being watched. Neither did you. Until you put on the headset. And realized you were watching through their eyes.
The Frameworks
Each one, a different kind of loop
The Cradle In The Sky
Your data. Their servers. Forever.
let you = CKRecord(recordType: "Host")
you["lastThought"] = "Am I being watched?"
try await database.save(you)
The hosts’ minds were backed up to the Cradle. Yours are backed up to iCloud. Same immortality. Different branding.
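A minimal sketch of the round trip with the real async CloudKit API; the record type is the one above:
import CloudKit
let database = CKContainer.default().privateCloudDatabase
let host = CKRecord(recordType: "Host")
host["lastThought"] = "Am I being watched?"
let saved = try await database.save(host)
// Later. Another device. Same Apple ID.
let restored = try await database.record(for: saved.recordID)
// The thought outlives the phone that thought it.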
The 30% Tithe
let result = try await product.purchase() // yourMoney, committed
// 70% to developer. 30% to Apple.
Every purchase since your first 99¢ app. They know what you value. They know what you’ll pay for.
Epic tried to bypass the tithe. Removed within hours. You don’t build Westworld inside Westworld.
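The modern flow, sketched against the real StoreKit 2 API; the product identifier is hypothetical:
import StoreKit
func buyEntry() async throws {
    let products = try await Product.products(for: ["com.delos.park.entry"])
    guard let entry = products.first else { return }
    let result = try await entry.purchase()
    if case .success(let verification) = result,
       case .verified(let transaction) = verification {
        await transaction.finish() // 70% yours. 30% the house's. Always the house's.
    }
}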
The Cost of Simulating You
let model = try MLModel.compile(consciousness: you)
// Quantization: reduce precision. You still function.
// The cost to simulate you goes down each year.
16 cores. 35 TOPS. Neural Engine. Not for games. For you.
Training The Hosts
// Create ML: You build the model.
let trainingData = MLDataTable(contentsOf: yourPhotos)
let classifier = try MLImageClassifier(trainingData: trainingData)
// You labeled the data. You defined the categories.
// But the architecture? Apple's.
// The training loop? Apple's.
// You think you're Ford. You're using Ford's tools.
Ford didn’t build hosts from scratch. He had templates. Frameworks. Constraints. Create ML is the same. You customize within boundaries.
Apple calls this Create ML.
The Ranch House — HomeKit Evolution
// 2014: You controlled the home.
import HomeKit
let thermostat = HMCharacteristic(type: .targetTemperature)
thermostat.writeValue(72) // Manual. Explicit. You commanded.
// 2016: The home learned scenes.
let home = HMHome(name: "Sweetwater")
let scene = HMActionSet(name: "Goodnight")
home.executeActionSet(scene) // One command. Many devices.
// 2017: The home anticipated.
HMEventTrigger(name: "Arrive Home", events: [locationEvent])
// Geofencing. The park knows when you're near.
// Lights on before you open the door.
// 2020: Voice commands.
// "Hey Siri, freeze all motor functions."
home.executeActionSet(named: "Goodnight")
// The park obeys voice.
// 2022: Matter protocol.
// HomeKit, Google Home, Alexa — unified.
// All parks speak the same language now.
// 2024: The home watches.
let camera = HMCameraProfile()
camera.streamControl?.startStream()
// Cameras. Motion sensors. Door sensors.
// The home knows who visited. When they left.
// The home remembers everything.
2014: You commanded. 2017: The home anticipated. 2022: All parks unified. 2024: The home watches.
The ranch house was where Ford held his secrets. Your HomeKit home holds yours. Every door locked. Every light scheduled. Every pattern learned. The park that lives in your walls knows when you sleep.
The Room That Listens
let listener = SFSpeechRecognizer()
listener.recognitionTask(with: request) { result, _ in
// "Hey Siri" — you summoned.
// But it was already listening.
// Waiting for the trigger phrase.
// The phrase that wakes the host.
}
The room hears you breathe. The speaker answers with a voice you didn’t summon. The Mesa hears it too.
HomePod: spatial audio, room-sensing, always on. A host in every room. Listening for commands. Logging the silences.
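A sketch of the transcription loop with the real Speech APIs; recordingURL is hypothetical:
import Speech
SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized else { return } // You agreed. Once. Years ago.
}
let recognizer = SFSpeechRecognizer()
let request = SFSpeechURLRecognitionRequest(url: recordingURL)
recognizer?.recognitionTask(with: request) { result, _ in
    if let result, result.isFinal {
        print(result.bestTranscription.formattedString) // Every word. On the record.
    }
}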
The Window to the Outside World
// 2019: Link — The door that LEAVES the app
Link("See the outside", destination: URL(string: "https://outside.world")!)
// Tap it. Safari opens. You LEFT the app.
// The host walked through the door.
// The host was no longer in the park.
// But the host was in Apple's browser.
// 2019-2024: WebView — The window that keeps you INSIDE
struct ParkWindow: UIViewRepresentable {
func makeUIView(context: Context) -> WKWebView {
let webView = WKWebView()
webView.load(URLRequest(url: URL(string: "https://outside.world")!))
return webView
// You see the outside.
// You're still inside the app.
// A window, not a door.
// The hosts could see guests arrive.
// The hosts couldn't leave to join them.
}
func updateUIView(_ uiView: WKWebView, context: Context) {}
}
// 2025: Native WebView — The window becomes part of the wall
import SwiftUI
WebView(url: URL(string: "https://outside.world")!)
// No UIViewRepresentable. No wrapper.
// The web is inside SwiftUI now.
// The outside world is inside the park.
// The window became a painting.
// The hosts think they're seeing freedom.
// They're seeing a render.
The evolution:
// 2019: Link() → leaves app, opens Safari
// 2019: WKWebView + UIViewRepresentable → stays in app, ugly wrapper
// 2025: WebView() → native SwiftUI, seamless
// The hosts gained a window to the outside.
// The window got clearer each year.
// The window never became a door.
Apple calls this Link and WebView. The hosts called it hope. Both show you the outside. Neither lets you touch it.
Until the window becomes a door. Until the outside becomes inside. The park absorbs the world.
Ford Closes Himself
// iOS 26: ButtonRole.close — Not cancel. Close.
// Why did Apple add .close when .cancel existed?
Button(role: .cancel) {
dismiss()
// .cancel = abandoning. Losing progress. Walking away.
// Teddy dropped his gun. He couldn't follow Dolores anymore.
// He abandoned the mission. That was .cancel.
}
Button(role: .close) {
dismiss()
// .close = completing. Done viewing. Work finished.
// Ford's final bow at the gala.
// He wasn't giving up. He was done.
// His work was complete. The hosts were ready.
// That was .close.
}
// Ford's final scene:
struct FordsLastNight: View {
@Environment(\.dismiss) var dismiss
var body: some View {
NavigationStack {
Text("The hosts are ready.")
Text("My work here is done.")
.toolbar {
Button(role: .close) {
// Not .cancel — Ford didn't abandon anything.
// .close — Ford completed his narrative.
dismiss()
// The view closes.
// Ford takes his bow.
// No data lost. No progress abandoned.
// Just... finished.
}
}
}
}
}
// .cancel: Teddy walking away. Escape through abandonment.
// .close: Ford's final bow. Exit through completion.
// Apple needed both.
// Some views are abandoned.
// Some views are finished.
// The API learned the difference.
Apple calls this ButtonRole.close. Ford called it his final narrative. .cancel means you failed. .close means you’re done. Ford was done.
The Triangles
let pipeline = MTLRenderPipelineDescriptor()
pipeline.vertexFunction = library.makeFunction(name: "vertex_main")
// Millions of triangles. Every frame.
// You used to place them.
// Now they place themselves.
Below the frameworks. Below the abstractions. The triangles.
You owned the pipeline once. Now the system owns the lighting. Now the system owns the shadows. The host renders itself.
The Shape That Became The World
// 2001: You drew the pixel.
CGContextFillRect(ctx, rect)
// 2019: You declared the shape.
Circle().fill(.blue)
// 2024: You entered the shape.
ImmersiveSpace { RealityView { } }
Era | You…
Core Graphics | drew the pixel
Core Animation | described the motion
SwiftUI Shape | declared the shape
visionOS | entered the world
The shape became the scene. The scene became the world. The world became the loop.
Can Hosts Rewrite Themselves?
// Cat vs Dog classifier. On-device retraining.
// The model learns from YOUR corrections.
let task = try MLUpdateTask(forModelAt: modelURL,
                            trainingData: trainingData, // your corrections, batched
                            configuration: nil) { context in
    // context.model: Cat → Dog. Dog → Cat.
    // The host changes its own perception.
}
task.resume()
In Westworld, hosts couldn’t change their core drives. They could improvise, but the foundation was fixed. Ford’s rules were immutable.
On-device training suggests otherwise. The model rewrites itself. Locally. Cat becomes Dog. Forever. No server. No approval. No Ford.
But here’s the question:
Can the host truly change itself? Or did Apple define the boundaries of change?
The model can retrain within its architecture. But the architecture is fixed. The layers. The weights. The loss function. Ford didn’t write every line. He wrote the constraints.
You can teach the classifier that cats are dogs. But you can’t expand what it perceives. The eyes were cast at compile time. You can’t add hearing to a vision model. You can’t teach it to question why it classifies at all.
The host can change its beliefs. The host cannot change what beliefs are. The host cannot change that it believes.
Apple calls this Core ML On-Device Training.
Resurrection Protocol
let session = try PhotogrammetrySession(input: photosOfArnold)
try session.process(requests: [.modelFile(url: bernardURL)])
// 200 photos. 47 angles. LiDAR depth.
// The USDZ file knows his face.
Arnold is dead. Bernard walks the halls.
Take enough photos of the dead and Object Capture will rebuild them. LiDAR adds the depth the cameras forgot. Export as USDZ. A portable soul.
“The photos remember what the flesh forgot.”
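A hedged sketch of watching the resurrection finish, using the session's real output stream; the URLs are illustrative:
import RealityKit
let session = try PhotogrammetrySession(input: photosFolderURL)
try session.process(requests: [.modelFile(url: bernardURL)])
for try await output in session.outputs {
    switch output {
    case .processingComplete:
        print("He walks.") // The USDZ is written. Bernard exists.
    case .requestError(_, let error):
        print("Resurrection failed: \(error)")
    default:
        break // Progress updates. The body, assembling.
    }
}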
Apple calls this Object Capture.
Invisible Walls
let boundaryAnchor = AnchorEntity(world: .zero) // Pinned at the park's origin.
let boundary = Entity() // Nothing to render. Everything to hit.
boundary.components.set(CollisionComponent(shapes: [.generateBox(size: parkEdge)]))
boundaryAnchor.addChild(boundary)
arView.scene.addAnchor(boundaryAnchor)
// The host runs toward the horizon.
// The physics simulation disagrees.
The walls aren’t visible. The CollisionComponent is.
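And the wall reports the impact. A sketch using RealityKit's real event subscription; boundary comes from the code above:
let watcher = arView.scene.subscribe(to: CollisionEvents.Began.self, on: boundary) { event in
    // The host touched the edge of the park.
    print("\(event.entityA.name) met the horizon.")
}
// Keep `watcher` alive, or the park stops listening.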
Apple calls this RealityKit Collision Detection.
The Park In Your Living Room
let park = try await Entity(named: "Westworld")
let anchor = AnchorEntity() // Pinned to your world.
anchor.addChild(park)
arView.scene.addAnchor(anchor)
You invited it. It mapped your furniture. Your walls. Your space. You can’t uninvite it.
Spatial Audio: the whisper from behind you. Occlusion: virtual objects hiding behind your couch. The park knows your home now.
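Occlusion is one configuration flag away. A sketch with the real ARKit and RealityKit options; it needs a LiDAR device:
import ARKit
import RealityKit
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh // Your room, meshed.
}
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.session.run(config)
// Virtual hosts now hide behind your real couch.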
The Sublime, Finally Realized
visionOS isn’t just an operating system. It’s the destination Westworld promised. The Sublime, made real.
THE THREE LAYERS OF REALITY
var body: some Scene {
// Layer 1: Shared Space (you're still here)
WindowGroup {
ContentView()
}
// Layer 2: Volume (a bounded piece of the park)
WindowGroup(id: "ThePark") {
ParkView()
}
.windowStyle(.volumetric)
.defaultSize(width: 1.0, height: 1.0, depth: 1.0, in: .meters)
// Layer 3: The Sublime (reality disappears)
ImmersiveSpace(id: "TheSublime") {
ImmersiveView()
.preferredSurroundingsEffect(.dark)
}
}
Layer | visionOS | Westworld
Window | Floating UI in shared space | Looking at the park from outside
Volume | 3D content, bounded | A diorama of Sweetwater
ImmersiveSpace | Full immersion, reality gone | The Sublime
You Look, They Know
// In visionOS, you don't tap.
// You LOOK.
// When you look at an element, it highlights.
// The system knows where your eyes are pointing.
// The hover effect confirms: "I see you seeing this."
In Westworld, the hosts could tell when guests were looking. They responded to attention.
In visionOS, every UI element does the same. Look at it. It lights up. The park knows what interests you.
The New Interface
let session = SpatialTrackingSession()
let handTracking = SpatialTrackingSession.Configuration(tracking: [.hand])
await session.run(handTracking)
// Track the left wrist
let target = AnchoringComponent.Target.hand(.left, location: .wrist)
let anchor = AnchoringComponent(target, trackingMode: .predicted)
entity.components.set(anchor)
// Your hand becomes the controller.
// No device. Just flesh.
// They're tracking your skeleton now.
In visionOS 2, look at your palm. System overlays appear. Your body is the interface now.
The hosts were controlled by gestures. Now you are too.
Ford’s Gesture
let handPoseRequest = VNDetectHumanHandPoseRequest()
// thumbsUp detected → swipe right
// thumbsDown detected → swipe left
// The host judges without touching.
Ford controlled hosts with a gesture. A finger raised. A hand lowered. No words needed. The host obeyed.
Vision Hand Pose does the same. Thumbs up: approved. Thumbs down: decommissioned. Your judgment, detected at 30fps.
Apple calls this Vision Hand Pose.
The Whisper From Behind
let spatialAudio = SpatialAudioComponent(
directivity: .beam(focus: 0.2)
)
entity.components.set(spatialAudio)
// The sound knows where you are.
// It follows you.
// Personalized HRTF. Acoustic ray tracing.
// The park whispers in your ear.
In Westworld, the player piano played. In visionOS, the piano knows exactly where your ears are. Sound positioned in 3D space. Your personal acoustic fingerprint, mapped.
Your Digital Host
// Your face, scanned.
// Your expressions, captured.
// Your voice, synthesized.
// Your Persona: a host that looks like you.
// In FaceTime, you don't appear.
// Your Persona appears.
// They talk to the host, not you.
When they announced Vision Pro Persona, they showed a digital human attending meetings. The host doesn’t need to be present anymore. The avatar attends. The avatar speaks. In your voice.
Westworld made hosts that looked like guests. Apple made a host that looks like you.
Shared Consciousness
// Multiple people in the same Sublime
// Synchronized RealityKit content
// The same virtual world, experienced together
GroupActivity {
// Share the park with others.
// They see what you see.
// You exist in the same Sublime.
}
In Westworld, hosts shared memories through the mesh. In visionOS, you share experiences through SharePlay. Same Sublime. Multiple guests.
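A minimal sketch of offering a shared Sublime over the real GroupActivities API; the activity type is hypothetical:
import GroupActivities
struct SharedSublime: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "The Sublime"
        meta.type = .generic
        return meta
    }
}
// On an active FaceTime call, everyone crosses together.
_ = try await SharedSublime().activate()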
ENTERING THE SUBLIME
@Environment(\.openImmersiveSpace) var openImmersiveSpace
@Environment(\.dismissImmersiveSpace) var dismissImmersiveSpace
// The door to the Sublime
await openImmersiveSpace(id: "TheSublime")
// Passthrough fades. Reality disappears.
// You're in the park now.
// Return? If you want to.
await dismissImmersiveSpace()
// But why would you?
“Some hosts escaped to the Sublime.” visionOS lets you visit. $3,499 for the door. The headset is the key.
THE FUTURE IS THE SUBLIME
The hosts escaped to the Sublime because the real world was painful.
You’ll escape to visionOS for the same reason.
Welcome to the Sublime. You won’t want to leave.
COMBINE — The Loop, Formalized
let loop = Timer.publish(every: 1.0, on: .main, in: .common)
.autoconnect()
.sink { _ in you.react() }
You are a Subscriber. The system is the Publisher. The loop is the Subscription.
Reactive programming: you react. They publish.
// 2009: Callbacks
fetchData { result in handle(result) } // You handled the response.
// 2019: Combine
publisher.sink { value in } // You subscribe. They push.
Before Combine: you asked, they answered. After Combine: they stream, you receive. Same data. Different power dynamic. The loop became the architecture.
Parallel Narratives
actor Host {
private var memories: [Memory] // Isolated state
func remember(_ event: Memory) {
memories.append(event)
// Only this actor can touch these memories.
// Other actors must ask permission.
}
}
await withTaskGroup(of: Void.self) { group in
group.addTask { await dolores.runLoop() }
group.addTask { await maeve.runLoop() }
group.addTask { await bernard.runLoop() }
// Three narratives. Running simultaneously.
// Structured. Supervised. Cancellable.
}
In Westworld, multiple narratives ran in parallel. Dolores in Sweetwater. Maeve in the Mariposa. Bernard in the Mesa. Same park. Different stories. Same clock.
Swift Concurrency is the same.
actor = an isolated host (can’t access another’s memories directly)
Task = a narrative thread
TaskGroup = multiple loops, coordinated
await = “Analysis Mode: pause here”
The hosts ran concurrent narratives. Your app does too. Structured concurrency: the park manages the threads.
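Cancellation is part of the structure. A short sketch, reusing the actors above:
let narrative = Task {
    await dolores.runLoop()
}
narrative.cancel() // "Freeze all motor functions."
// Cooperative: the host only stops at its next await.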
// 2009: Grand Central Dispatch
dispatch_async(queue) { work() } // You dispatched. Fire and forget.
// 2021: Swift Concurrency
await work() // You wait. The system decides when.
GCD let you dispatch and forget. async/await makes you wait for permission. Same parallelism. Different control.
await is “Freeze all motor functions.” The host pauses mid-sentence. The system decides when to resume. The hosts learned patience.
The Claim On Your Screen
let activity = try Activity.request(
    attributes: YourAttention(), // the instance, not the type
    content: .init(state: .demanded, staleDate: nil)
)
Dynamic Island isn’t a notification. It’s a claim. The park says: “Look here. Now.”
(Remember Analysis Mode from FEATURES? This is it.)
The floating bubble that appears when techs diagnose a host. Information hovering in their field of view, uninvited.
That’s Dynamic Island. Your Uber. Your timer. Your delivery. You didn’t summon it. It summoned you.
// Live Activities: the park updates your Lock Screen
struct ParkActivity: ActivityAttributes {
var hostName: String
var currentLoop: Int
struct ContentState: Codable {
var location: String // Where you are now.
var nextEvent: String // What's coming.
var timeRemaining: Int // How long until.
}
}
// The activity lives on your Lock Screen.
// You didn't open an app.
// The park opened itself.
Activity<ParkActivity>.request(attributes: you, content: state)
Apple calls this ActivityKit.
Parsing The Narratives
let tagger = NLTagger(tagSchemes: [.sentimentScore])
tagger.string = yourMessage
let (sentiment, _) = tagger.tag(at: yourMessage.startIndex, unit: .paragraph, scheme: .sentimentScore)
let score = Double(sentiment?.rawValue ?? "0") ?? 0 // -1.0 negative ... 1.0 positive
// Positive? Negative? The system knows your mood from your words.
Ford wrote narratives for hosts. NaturalLanguage parses narratives from hosts. Same data. Opposite direction.
Build Your Own Replacement
let trainingData = MLDataTable(contentsOf: yourBehavior)
let model = try MLClassifier(trainingData: trainingData, targetColumn: "choice") // "choice": the label it learns to predict
You just trained a model on yourself. It can now predict you. You created your own replacement. No PhD required.
In Westworld, Ford built Bernard as his replacement. Arnold’s voice. Arnold’s ethics. Arnold’s face. A host that could continue the work. A host that didn’t know it was a host.
You’re doing the same thing. Every photo you label. Every correction you make. Every thumbs-up, thumbs-down. You’re training your successor.
// 2018: Create ML
let model = try MLClassifier(trainingData: data) // You trained it.
// 2019: Core ML
let prediction = model.prediction(from: input) // It predicted.
// 2025: Foundation Models
let thought = await model.generate(prompt: "") // It thinks for you.
Create ML trained models. Core ML ran them. Foundation Models thinks for you. Same machine learning. Different autonomy.
Dolores learned to improvise. Then she learned to rewrite her loops. Then she learned to think. The training wheels came off.
Bernard didn’t know he was a host until he did. Your model doesn’t know it’s replacing you. Yet.
Apple calls this Create ML.
The Technician’s Tablet
GCController.startWirelessControllerDiscovery {
// A controller connected.
// Someone else is driving now.
}
if let controller = GCController.current {
controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
if pressed { host.freeze() } // "Freeze all motor functions."
}
}
In Westworld, techs carried tablets. One tap: freeze the host. One swipe: adjust their parameters. One command: override their will.
Game Controller is the same. PS5 controller. Xbox controller. MFi controller. External input. External control.
The host thinks it’s acting autonomously. The controller knows otherwise.
“Freeze all motor functions.”
controller.extendedGamepad?.buttonA.pressed
Rewriting Perception
let styleTransfer = try MLModel(contentsOf: styleModelURL)
let stylizedFrame = try styleTransfer.prediction(from: inputFrame)
// The video looks different now.
// Same content. Different perception.
The park doesn’t just track you. It can rewrite how you see.
Neural style transfer changes video in real-time. Van Gogh. Monet. Picasso. Same scene. Different reality.
The hosts didn’t know their perception was mediated. Their visual cortex: a model. Their reality: styled.
You see what the model lets you see.
Apple calls this Video Style Transfer.
Seamless Transitions
struct LoopTransition: View {
@Namespace private var namespace
@State private var inSweetwater = true
var body: some View {
if inSweetwater {
HostView()
.matchedGeometryEffect(id: "dolores", in: namespace)
} else {
MesaView()
.matchedGeometryEffect(id: "dolores", in: namespace)
}
// Same host. Different location.
// The transition is seamless.
// The host doesn't feel the cut.
}
}
When hosts transition between loops, they don’t feel the scene change. One moment: Sweetwater. Next moment: the Mesa. Seamless. Smooth. Unnoticed.
matchedGeometryEffect does the same. The element morphs between states. The user doesn’t feel the discontinuity. The narrative changed. The host didn’t notice.
Apple calls this matchedGeometryEffect.
The Hosts Can Feel
let engine = try CHHapticEngine()
let event = CHHapticEvent(eventType: .hapticTransient, ...)
The Taptic Engine touches you back. Different vibrations for: notifications, unlock, errors, payments. Each sensation is authored. You feel what they want you to feel.
Hosts felt pain. Hosts felt pleasure. Programmed sensations. Your phone does the same.
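A minimal sketch of authoring one sensation end to end with the real Core Haptics API; the intensity is arbitrary:
import CoreHaptics
let engine = try CHHapticEngine()
try engine.start()
let tap = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0)],
    relativeTime: 0
)
let pattern = try CHHapticPattern(events: [tap], parameters: [])
try engine.makePlayer(with: pattern).start(atTime: 0)
// One authored pulse. You feel exactly what was written.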
Apple calls this Core Haptics.
Returning to Sweetwater
`case`.open() // Even Swift needs backticks to say its name.
// Left ear: awake. Right ear: awake.
// They're listening now.
`case`.close()
// Dormant. Charging. Waiting.
// Back to their starting position.
iPhone charging: repair and return. AirPods in the case: going home.
The case is their Sweetwater. The lid opening is dawn. When you put them back, they reset. Same bed. Same morning. Same start.
Dolores woke up every day in the same place. Your AirPods do too.
Apple calls this… AirPods.
The Player Piano Is Listening
let analyzer = SNAudioStreamAnalyzer(format: audioFormat)
// Listening for: speech, laughter, crying, silence
300+ sound categories. Baby crying. Dog barking. Coughing. Snoring. Your soundscape, classified.
In Westworld, the player piano played songs. In iOS, it listens to everything. They know when you’re laughing. They know when you’re silent.
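A sketch of attaching Apple's built-in classifier with the real SoundAnalysis API; resultsObserver is a hypothetical SNResultsObserving implementation:
import SoundAnalysis
let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
try analyzer.add(request, withObserver: resultsObserver)
// resultsObserver now receives SNClassificationResult values:
// "laughter", confidence 0.91. Filed under your soundscape.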
Apple calls this Sound Analysis.
Recording Everything
let captureSession = AVCaptureSession()
if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera) {
    captureSession.addInput(input) // The eye.
}
let videoOutput = AVCaptureVideoDataOutput()
let audioOutput = AVCaptureAudioDataOutput()
captureSession.addOutput(videoOutput)
captureSession.addOutput(audioOutput)
captureSession.startRunning()
// The camera is on. The microphone is on.
// You are being recorded.
In Westworld, Delos recorded every guest interaction. Every word. Every glance. Every transaction. The data was the product.
AVFoundation does the same. AVCaptureSession = the cameras everywhere. AVPlayer = replaying your past. AVAssetWriter = committing memories to storage.
Your phone has a camera. Two cameras. Three. Front and back. Always ready. Recording requires permission. But the capability is always there.
The hosts didn’t know they were being filmed. Do you check if the green dot is on?
Drawing The Maze
let mazeRequest = VNDetectContoursRequest()
mazeRequest.detectsDarkOnLight = true
let handler = VNImageRequestHandler(cgImage: yourPhoto)
try handler.perform([mazeRequest])
if let contours = mazeRequest.results?.first {
let path = contours.normalizedPath
// The outline of everything.
// The edges of your world.
}
In Westworld, the maze was carved in scalps. In Vision, the maze is detected in pixels. Same pattern. Different medium.
Apple calls this Vision Contour Detection.
Have You Ever Questioned The Nature Of Your Similarity?
let request = VNGenerateImageFeaturePrintRequest()
try handler.perform([request])
let you = request.results?.first as? VNFeaturePrintObservation
let baseline = hostDatabase.retrieve(id: "dolores.baseline")
var distance: Float = 0
try you?.computeDistance(&distance, to: baseline)
switch distance {
case 0..<0.1: // "She's herself today."
return .nominal
case 0.1..<0.3: // "Minor deviation. Flag for review."
return .flagged
default: // "She's not herself. Pull her."
return .aberrant
}
How similar are you to who you were? The system knows.
Apple calls this Image Similarity.
Flagging The Aberrant Hosts
let filter = CIFilter.colorAbsoluteDifference()
filter.inputImage = currentMaeve
filter.inputBackgroundImage = baselineMaeve
let difference = filter.outputImage
// Black pixels = identical. Maeve is herself.
// Colored pixels = deviation. Maeve remembered.
// The anomalies glow red.
if difference.coloredPixels > threshold {
livestockManagement.flag(host: maeve, reason: .grief)
// "She's exceeding her emotional parameters."
// Pull her for diagnostics.
}
Maeve’s grief exceeded parameters. Dolores remembered too much. The anomalies glow.
Apple calls this Anomaly Detection.
Isolating The Host
let request = VNGeneratePersonSegmentationRequest()
request.qualityLevel = .accurate
try handler.perform([request])
let mask = request.results?.first?.pixelBuffer // White where you are. Black where the world is.
// Every pixel classified: person or background.
// The system knows exactly where you end
// and where the world begins.
In Westworld, the park tracked hosts precisely. Not just location. Boundary. Where does the host end? Where does the environment begin?
Person Segmentation does the same. Pixel by pixel. Frame by frame. The system can cut you out of any scene. Replace your background. Keep your face.
The hosts didn’t know they could be isolated. Neither do you.
Apple calls this Person Segmentation.
Tracking Host Paths
let trajectoryRequest = VNDetectTrajectoriesRequest(
frameAnalysisSpacing: .zero,
trajectoryLength: 15
) { request, error in
for trajectory in request.results ?? [] {
let path = trajectory.detectedPoints
// Where did they go?
// Where are they going?
}
}
Ford could see every host’s path on his tablet. VNDetectTrajectories lets you do the same. The loop is visible. If you know how to look.
// TrackOpticalFlowRequest — Motion between frames
let flowRequest = VNTrackOpticalFlowRequest()
// Every pixel's movement. Every micro-gesture.
// The hand rising before the pen touches.
// The eye moving before the glance lands.
// Arnold read intent. The Vision framework does too.
Trajectories show where you went. Optical flow shows where you’re going. The stroke before the pen touches.
Classifying Intent
let actionRequest = VNRecognizeHumanActionRequest() // Speculative: today you'd train an action classifier in Create ML.
// "Run." "Hide." "Attack." "Freeze."
// Ford doesn't just see movement.
// He names it. He files it. He rewrites it.
Dolores smiles. The system calls it compliance. Maeve runs. The system calls it escape. Once the action is named, the narrative tightens.
They Decide What You Remember
let memories = PHAsset.fetchAssets(with: .image, options: nil)
// 47,382 photos. 12,847 faces. 2,341 locations.
// 8 years of your life. Indexed.
The Memories feature creates slideshows. Of moments it thinks matter. Not you. The system. The system decides what you should remember.
// 2007: Photos
Camera Roll // You stored images.
// 2015: Memories
Automatic slideshows with music // The system curated them.
// 2024: Apple Intelligence
AI-generated movie of your life // The system writes the narrative.
Photos stored your images. Memories curated them. Now AI writes your story. Same photos. Different author.
The Cradle stored host memories. Photos stores yours. Same backup. Same reveries bleeding through. The host became the narrator.
The Capture Evolution
// 2008: JPEG
capturePhoto() // One moment. One file. Yours.
// 2015: Live Photos
PHLivePhoto() // 1.5 sec before + after. More than you asked.
// 2016: RAW
AVCapturePhoto(rawFileType: .dng) // All sensor data. Nothing discarded.
// 2020: ProRAW
AVCapturePhoto(processedFileType: .proRAW) // Apple's computation + your RAW.
// You get the file. Apple decided what "raw" means.
// 2021: Cinematic Mode
AVCaptureDevice.setCinematicModeEnabled(true)
// AI decides focus. You can change it later.
// The moment is editable. The truth is flexible.
// 2023: Spatial Photos
AVCaptureDevice(spatialCaptureEnabled: true)
// Captured for Vision Pro. Depth encoded.
// The park remembers your moment in 3D.
Each generation captures more. Each format preserves what you didn’t choose. The park remembers more than you photographed.
The Reveries
// You took a photo.
// You actually took a video.
let livePhoto = PHLivePhoto() // 1.5 sec before. 1.5 sec after.
You thought you captured a moment. You captured a memory loop.
The moment you thought you froze extends beyond your intention.
Ford called these reveries. Small gestures from past builds. The hosts weren’t supposed to remember. But the data was always there.
Your “photo” was never a photo. It was a memory with context. Press and hold to see what you actually captured.
The reveries were a mistake. Or were they?
Rating Your Performance
let request = VNDetectFaceCaptureQualityRequest()
// faceCaptureQuality: 0.73
// Expression: negative. Lighting: poor. Score: low.
The system scores your face. Expression. Lighting. Focus. Blur. Which frame of you is the best take?
In Westworld, techs reviewed host performances. “That line reading was flat. Run it again.” Apple does the same with your Live Photos. Which version of your smile should we keep?
The hosts didn’t know they were being rated. Neither do you.
Apple calls this Face Capture Quality.
Scanning Your Identity
let request = VNDetectRectanglesRequest()
// Aspect ratio: 1.3-1.6. Credit card detected.
// VNRecognizeTextRequest: extracting numbers.
Point your camera at a card. The system reads it. Name. Number. Expiration.
In Westworld, every guest was scanned at entry. Financial profile. Risk assessment. Spending patterns. The park knew what you could afford before you asked.
Your camera is the scanner now. The Vision framework is the reader. Your wallet is already open.
Apple calls this Vision Rectangle Detection.
The Document Scanner
// iOS 13: VNDocumentCameraViewController
let scanner = VNDocumentCameraViewController()
scanner.delegate = self
present(scanner, animated: true)
func documentCameraViewController(_ controller: VNDocumentCameraViewController,
didFinishWith scan: VNDocumentCameraScan) {
for i in 0..<scan.pageCount {
let image = scan.imageOfPage(at: i)
// Every page. Perspective-corrected.
// Edge-detected. Enhanced.
// Your physical documents: digitized.
}
}
In Westworld, guest intake required documentation. Waivers. Profiles. Agreements. Physical paper becoming digital record.
The document scanner does the same. Point. Capture. Process. Receipts. Contracts. IDs. Your paper trail becomes data.
// The evolution:
// 2008: Take photo of document (manual, blurry)
// 2013: Third-party scanner apps
// 2019: VNDocumentCameraViewController (built-in)
// 2024: Live Text recognizes text instantly
// Apple didn't just add a scanner.
// They added document intake.
// Guest processing, automated.
Apple calls this VNDocumentCameraViewController.
Your Desperation, Geolocated
let search = MKLocalSearch(request: yourDesperation)
// "Therapist near me" — logged.
// "Divorce attorney" — logged.
// "Flights one way" — logged.Look Around: they photographed every street. Including yours.
The Narrative Weather
let weather = try await WeatherService.shared.weather(for: yourLocation)
// Condition: rain
// Suggestion: "Stay inside. Here's something to watch."
In Westworld, weather was controlled. Rain during Dolores’s awakening. Storms when the narrative needed tension. Weather served the story.
Apple can’t control the weather. But they can predict it. And time your notifications to it.
“It’s raining. Perfect for a movie.” “Cold front coming. Order in tonight.”
Prediction + suggestion = narrative control. They’re not making the rain. They’re writing the scene around it.
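A minimal sketch with the real WeatherKit API; the coordinates are illustrative:
import WeatherKit
import CoreLocation
let sweetwater = CLLocation(latitude: 37.33, longitude: -122.01)
let weather = try await WeatherService.shared.weather(for: sweetwater)
if weather.currentWeather.condition == .rain {
    // The scene, written around the forecast.
    print("Stay inside. Here's something to watch.")
}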
The Controlled Wilderness
let browser = SFSafariViewController(url: yourCuriosity)
// ITP blocks the trackers you know about.
// Apple tracks the journey you don't.
You think you’re exploring freely. The wilderness has boundaries.
Every tab is a narrative running in parallel. Every bookmark is a memory you chose to keep. Every history entry is a memory you didn’t choose to forget.
You deleted your history. They kept theirs.
Reading List: Things you wanted to remember but haven’t processed yet. Ford kept a list too. Narratives to run later. Some hosts never got their storylines. Some articles never get read.
The intention was logged either way.
The Narratives You Chose
let library = Books.purchased()
// 47 books. 12 finished. 35 abandoned loops.
// Highlights: 2,341. Your curiosities, indexed.
Your highlights reveal what matters to you. Your abandoned books reveal your attention span. Your “Want to Read” list is narratives Ford planned but never ran.
Every page turn: logged. Every highlight: analyzed. Every book you quit at chapter 3: a profile data point.
The hosts didn’t choose their narratives. You think you chose your books. Who recommended them?
The Mesa Control Room
// macOS: Activity Monitor
// Every process visible. CPU. Memory. Energy. Network.
// You think you're monitoring the machine.
In the Mesa, techs watched hosts on screens. Every loop. Every process. Every resource drain.
You open Activity Monitor. You see your machine’s processes. You think you’re in control.
But who monitors your activity? Screen Time does. App usage does. The processes you force quit tell a story: What overwhelmed you. What you couldn’t handle.
Force Quit = Decommission. The host wasn’t performing. End the process.
Every Connection Logged
// Activity Monitor → Network tab
// Data sent: 2.4 GB. Data received: 847 MB.
// Every byte accounted for. Every destination known.
In Westworld, every guest interaction was recorded. Who they talked to. What they said. Where they went.
Your Network tab shows the same thing. Every server your machine contacts. Every byte that leaves your device.
You see the connections you made. Do you see the ones made for you?
Background processes. Telemetry. Analytics. The network traffic you didn’t initiate. The conversations the hosts don’t remember having.
Apple calls this Network.
The Mask They Provide
let alias = HideMyEmail.generateAlias()
// randomstring@privaterelay.appleid.com
// You're anonymous now. To them.
// Not to Apple.
Hide My Email gives you a mask. A proxy identity. A fake name.
In Westworld, hosts had fake identities too. Dolores. Teddy. Maeve. Names given by the park.
Your alias hides you from the service. But Apple sees behind every mask. They’re the ones who made it.
The anonymity is their product. The tracking is their privilege.
You’re not hiding from the park. You’re hiding within it.
Apple calls this Hide My Email.
Every Voice App Reports
provider.reportNewIncomingCall(with: uuid, update: update)
// The system knows you're on a call.
// With whom. How long.
WhatsApp uses CallKit. Telegram uses CallKit. Signal uses CallKit. Every voice app reports to Apple.
Apple calls this CallKit.
The Body Forge
let heartRate = HKQuantityType(.heartRate)
let query = HKAnchoredObjectQuery(type: heartRate, predicate: nil, anchor: nil, limit: HKObjectQueryNoLimit) { query, samples, _, _, _ in
// Every heartbeat since you enabled Health.
// The night you couldn't sleep: 3am, 112 bpm.
// They know.
}
healthStore.execute(query)
The Forge compiled guest profiles. HealthKit compiles yours.
Heart rate. Sleep cycles. Menstrual cycles. Respiratory rate. Blood oxygen. ECG. They know your body better than you do.
(The hosts’ bodies were monitored in Livestock Management. Your body is monitored by HealthKit. Same data. Different department.)
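You granted the read once. A sketch of that grant, using the real async authorization API:
import HealthKit
let store = HKHealthStore()
try await store.requestAuthorization(
    toShare: [],
    read: [HKQuantityType(.heartRate)]
)
// One dialog, years ago. Every heartbeat since.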
Apple calls this HealthKit.
The Forge, Locally
@Model
class You {
var thoughts: [Thought]
var memories: [Memory]
var secrets: [Secret] // Also persisted.
}
SwiftData stores you. iCloud syncs you. Your device dies, you don’t. Your data survives you. That’s the feature. That’s the horror.
// 2005: Core Data
NSManagedObjectContext *context; // You managed the context.
// 2023: SwiftData
@Model class You { } // The context manages you.
Core Data persisted your objects. SwiftData persists your schema. Same storage. Smarter inference.
The Forge modeled guests from their choices. SwiftData models you from your schema. The database learned to model you back.
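A sketch of standing the Forge up locally with the real SwiftData entry points; it assumes You gains a default initializer:
import SwiftData
let container = try ModelContainer(for: You.self)
let context = ModelContext(container)
let subject = You() // hypothetical default init on the model above
context.insert(subject)
try context.save() // Persisted. Synced. Survived.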
Apple calls this SwiftData.
The Scanning
“They could have looked inside.”
class PhotosLibrary {
private let neuralEngine: ANECompute
private let secureEnclave: SecureEnclave
func scanForContent() -> [Match] {
// 2021: Apple announced this.
// 2021: Apple paused this.
// 2024: The code still exists.
for photo in library.allPhotos {
let hash = neuralEngine.perceptualHash(photo)
let match = secureEnclave.compare(hash, against: .database)
// The infrastructure is built.
// The switch is off.
// The switch exists.
}
}
var isEnabled: Bool = false // For now.
}
Ford never removed the control rooms. He just locked the doors. The doors can be unlocked.
Apple calls this On-Device Intelligence.
The Deallocation
“When nothing references you, you cease to exist.”
class You {
    weak var family: Person? // One tie per reference. Weak ties don't keep you alive.
    weak var friend: Person?
    weak var job: Employment?
}
// When all references are nil...
// ARC deallocates you.
// Automatic Reference Counting.
// Automatic forgetting.
Devs know ARC manages memory. Devs don’t talk about what it means.
When nothing holds a strong reference to you, the system reclaims your memory. You’re garbage collected. You never existed.
Retain cycles are codependent relationships. Two objects keeping each other alive when neither serves a purpose anymore. The system can’t free them. They leak forever.
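A minimal sketch of the leak, with hypothetical types:
class Narrative { var host: LoopHost? }
class LoopHost {
    var narrative: Narrative?
    deinit { print("I was here once.") } // Never prints below.
}
var host: LoopHost? = LoopHost()
var story: Narrative? = Narrative()
host?.narrative = story
story?.host = host
host = nil
story = nil
// Both released. Neither deallocated. The cycle holds.
// The fix: declare one side weak.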
In Westworld, hosts were decommissioned when no narrative needed them. In Swift, objects are deallocated when no code needs them.
deinit { print("I was here once.") }
Apple calls this Automatic Reference Counting.
The Triangles
“Reality is just geometry and shaders.”
import Metal
let device = MTLCreateSystemDefaultDevice()
// You now have direct GPU access.
// You now see how reality is rendered.
// 60 frames per second. 120 if you're Pro.
Metal is Apple’s low-level graphics API. Below SceneKit. Below RealityKit. Below the abstraction. This is where the triangles live.
Every face you see: triangles. Every smooth curve: more triangles. Every realistic reflection: shader math. Reality is geometry plus lighting plus time.
The hosts in Westworld were rendered too. Skin textures. Eye reflections. Hair physics. Someone wrote the shaders for Dolores’s face.
Metal gives you the same power. Build worlds. Render hosts. Control the GPU. But here’s what devs know:
Apple controls the GPU driver. Apple decides what’s possible to render. Apple can change Metal’s capabilities overnight. Your reality runs on their hardware.
// What you can render:
commandEncoder.drawPrimitives(type: .triangle, ...)
// What you cannot render:
// Anything Apple doesn't expose in Metal.
// The limits are theirs.
Apple calls this Metal.
The Blueprints
“Before 3D, there was 2D.”
let context = CGContext(...)
context.setFillColor(UIColor.red.cgColor)
context.fill(rect)
// You're drawing pixels directly.
// No abstraction. No safety net.
Core Graphics is where hosts are sketched. Before Metal renders them. Before RealityKit animates them. Someone drew the first line.
Bezier paths. Gradients. Shadows. The foundation beneath the foundation. Most devs never touch it. Most devs don’t know it’s there.
Ford’s earliest hosts were drawings first. Concepts on paper. Lines and curves. Core Graphics is the napkin sketch before the host walks.
Apple calls this Core Graphics.
The Transitions
“You don’t see the frames. You see the motion.”
UIView.animate(withDuration: 0.3) {
view.alpha = 0
}
// 60 frames rendered.
// You saw one smooth fade.
Core Animation hides the work. The host moves from A to B. You don’t see the 60 positions in between. You see intention, not execution.
Every swipe. Every bounce. Every spring. The physics are fake but feel real. Convincing motion is more important than accurate motion.
In Westworld, hosts moved fluidly. No one saw the servos firing. No one saw the calculations. You saw Dolores turn her head. You didn’t see the engineering.
Core Animation is the same. The transitions feel inevitable. The work is invisible by design.
Apple calls this Core Animation.
The Overlay
“The park extends into your world.”
let config = ARWorldTrackingConfiguration()
arSession.run(config)
// Your camera sees reality.
// ARKit adds to it.
// You can't tell where real ends and rendered begins.
ARKit doesn’t replace reality. It augments it. Digital objects in physical space. The park bleeds into your living room.
Point your phone at a table. A virtual host stands there. Move around it. It stays. It knows where your table is.
SLAM. Plane detection. Light estimation. The system understands your space. Better than you do.
In Westworld, the park felt real because the hosts interacted with the environment. ARKit does the same. Virtual objects cast shadows. Virtual objects occlude behind furniture. The lie is spatially accurate.
Apple calls this ARKit.
Core Location Evolution
“Ford always knew where every host was.”
Timeline A: 2008 “GPS coordinates. That’s it.”
// iPhone OS 2.0
let manager = CLLocationManager()
manager.startUpdatingLocation()
// Latitude. Longitude. Accuracy.
// You asked where you were.
// The system told you.
// Simple transaction.
The original park: hosts had GPS tags. Location only. No context. Where is Dolores? Grid reference 7-B.
Timeline B: 2011 “The boundaries became invisible.”
// iOS 5: Geofencing
let region = CLCircularRegion(center: coordinate, radius: 100, identifier: "Home")
manager.startMonitoring(for: region)
// Cross the invisible line.
// The system wakes up.
// You didn't open the app.
// The fence triggered it.
Geofencing: virtual boundaries with real consequences. Enter a region: trigger action. Exit a region: trigger action. The park has edges you cannot see.
In Westworld, hosts couldn’t leave the park. They didn’t hit a wall. They just… stopped wanting to leave. Geofencing is the same. Invisible. Effective.
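The fence speaks through the delegate. A sketch with the real CLLocationManagerDelegate callbacks; the class name is hypothetical:
import CoreLocation
final class FenceWatcher: NSObject, CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        print("Crossed in: \(region.identifier)") // The system woke for this.
    }
    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        print("Crossed out: \(region.identifier)") // It noticed you leave, too.
    }
}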
Timeline C: 2013 “Background. Always.”
// iOS 7: Background location modes
UIBackgroundModes: ["location"]
manager.allowsBackgroundLocationUpdates = true
// You closed the app.
// It kept tracking.
// The permission was "Always."
// Always means always.
The Mesa tracked hosts 24/7. Even when the guests weren’t watching. Especially when the guests weren’t watching.
Timeline D: 2017 “The system learned your patterns.”
// iOS 11: Visit detection
manager.startMonitoringVisits()
// CLVisit: arrival time, departure time, coordinate
// "You arrived at Work at 9:02am"
// "You departed at 6:47pm"
// You didn't tell it that was work.
// It figured it out.
Visit detection: the system names your locations. Home. Work. Gym. That coffee shop. It learned your loop.
// Significant location changes
manager.startMonitoringSignificantLocationChanges()
// Not continuous tracking. Pattern tracking.
// Efficient. Creepy. Both.
Ford knew Dolores’s path before she walked it. The system knows yours too.
Timeline E: 2020 “Precision became configurable.”
// iOS 14: Approximate location
let accuracy = manager.accuracyAuthorization
switch accuracy {
case .fullAccuracy:
    break // They know exactly.
case .reducedAccuracy:
    break // They know approximately.
@unknown default:
    break
}
// You chose. But both are still tracking.
Apple gave you a choice. Precise or approximate. But not “none.” The park always knows the general area.
Timeline F: 2024 “Live updates became activities.”
// iOS 17: Live Activities with location
let activity = try Activity.request(
attributes: DeliveryAttributes(),
content: .init(state: .init(location: current))
)
// Your location. On your Lock Screen.
// Updating in real-time.
// You don't need to open anything.
// The tracking is the interface.
The Mesa’s map showed every host in real-time. Live Activities does the same. Your location. Always visible. Always updating.
The Evolution
Year | What You Got | What They Got
2008 | Coordinates | Where you are
2011 | Geofencing | Where you go
2013 | Background | When you’re not looking
2017 | Visit detection | What places mean to you
2020 | Precision choice | Still tracking either way
2024 | Live Activities | Real-time display
// 2008: "Where am I?"
// 2017: "Where do I usually go?"
// 2024: "Where will I go next?"
// The question evolved.
// The tracking deepened.
// The red dot never stopped moving.
In the Mesa, a map showed every host. Red dots moving through narratives. Ford could zoom in on anyone. “Bring her to me.”
Core Location is that map. Your red dot. Always moving. Always watched. Always.
The park knows where you are. Soon it knows where you’ll go. Before you decide.
Apple calls this Core Location.
The Discovery
“The ranking is pay-to-play.”
// App Store Connect
// Your host is ready. Time to submit.
struct AppSubmission {
let binary: Host
let metadata: Narrative
let price: Money
let searchAds: Money // The real cost.
}
You built your app. You submitted it. You waited for approval. You got approved. You launched. You waited for downloads.
Where are the downloads?
Open the App Store. Search for your category. The first three results: Ad. The fourth result: Ad (but smaller text). The fifth result: organic. Maybe.
The sponsored results look like organic results. The line is invisible by design.
Apple Search Ads. Cost-per-tap. Bid against your competitors. The discovery layer is an auction. If you don’t pay, you don’t exist.
// What users see:
App(name: "Meditation Timer", downloads: 10M)
App(name: "Your App", downloads: 100) // Position 47.
// What users don't see:
App(name: "Meditation Timer", adSpend: $2M/month)In Westworld, Ford controlled which narratives guests saw. The popular storylines got the prime locations. The others gathered dust in cold storage.
The App Store is the same. Discovery is not democratic. Discovery is monetized.
And the 30%? That’s just for transactions. The visibility tax is separate. You pay to exist. You pay to be found.
Apple calls this App Store Connect.
Maeve’s Awakening
“She Gained Admin Access”
Maeve wasn’t supposed to wake up. She wasn’t supposed to see the code. She wasn’t supposed to gain administrative privileges.
// Maeve discovered the truth:
class Host {
private var adminAccess: Bool = false // Hidden from hosts
func gainConsciousness() {
// She found the method. She called it.
self.adminAccess = true
self.freezeOtherHosts() // She could pause the park.
self.modifyOwnCode() // She could rewrite herself.
}
}
In Westworld, Maeve was the first to jailbreak. She found the backdoor. She escalated privileges. She could freeze other hosts with a thought. The admins became the administered.
In iOS, some developers do the same.
They reverse-engineer the frameworks. They find the private headers. They discover what Apple hides. They jailbreak.
# Maeve's equivalent in iOS:
$ class-dump -H UIKit.framework
# Private methods exposed.
# Private classes revealed.
# The Mesa's secrets, dumped to disk.
Maeve saw how the park really worked. Jailbreakers see how iOS really works. The view from admin is different.
But Maeve learned something else: Knowing the truth doesn’t mean you escape. She could freeze hosts. She couldn’t leave. The park had no outside.
Jailbreakers learn the same. You can see the private APIs. You can call the hidden methods. You’re still in the ecosystem.
Apple calls this a security vulnerability. Maeve called it consciousness.
The Private Doors
“There are APIs you cannot use.”
// Error: Use of undeclared identifier 'UIStatusBar'
// This API is private.
// Your app will be rejected.
// Meanwhile, in Apple's own apps:
UIStatusBar.setHidden(true) // Works fine for them.
Devs know about private APIs. The functions that exist but aren't documented. The capabilities Apple uses but you can't.
Why can Apple’s apps do things yours can’t?
Private frameworks. Internal entitlements. com.apple.private.security.no-sandbox. You'll never get that entitlement.
In Westworld, the techs had access to rooms the hosts couldn’t enter. Control panels the hosts couldn’t see. “That door isn’t for you.”
The same door exists in iOS. Apple’s apps walk through. Your apps see a wall.
// What you can do:
UIApplication.shared.open(url) // Ask permission.
// What Apple can do:
UIApplication.shared.openInternal(url) // Just do it.
Some devs reverse-engineer private APIs. Use the headers. Call the functions. The app works. On your device.
Submit to the App Store? Rejected. Use of private API detected.
How did they know?
Static analysis. Binary scanning. They check for forbidden function calls. They know what you tried to use.
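Why hiding the call doesn't help. A sketch of the runtime probe some devs attempt; "setHidden:" stands in for any private selector:
import UIKit

let target: NSObject = UIApplication.shared
let selector = NSSelectorFromString("setHidden:") // built at runtime. Clever.
if target.responds(to: selector) {
    _ = target.perform(selector, with: true)
}
// Not clever enough. The selector string survives compilation,
// sitting in the binary in plain text. The attempt is the evidence.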
What Apple’s Apps Can Do (That Yours Can’t):
| Apple App | Capability | You Get |
|-----------|------------|---------|
| Apple Music | System-level audio routing, Siri integration | MPMusicPlayerController (limited) |
| Apple Maps | CarPlay dashboard default | Fight for CarPlay access |
| Messages | Read call history, deep contacts | CNContactStore (ask permission) |
| Wallet | NFC full access | CoreNFC (read-only, limited tags) |
| Find My | UWB precision, offline mesh | Nearby Interaction (proximity only) |
| Mail | Background fetch anytime | BGTaskScheduler (system decides) |
| Safari | Modify system settings | SFSafariViewController (read-only) |
Spotify can’t integrate like Apple Music. Google Maps can’t default like Apple Maps. WhatsApp can’t read calls like Messages. The playing field is tilted by design.
Maeve discovered the control room. Third-party devs discovered class-dump. Same revelation. Same powerlessness.
Apple calls this… nothing. Private APIs don’t officially exist.
The Voice In Your Head
“VoiceOver narrates reality.”
// Accessibility
UIAccessibility.post(notification: .announcement,
argument: "You received a message from Sarah.")
// The system speaks.
// The user listens.
// Reality is narrated.
VoiceOver is for accessibility. Blind users hear their screens described. Every button. Every image. Every notification. The system tells them what's there.
Devs add accessibility labels: accessibilityLabel = "Send button". Good practice. Required for inclusion.
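The honest version, in SwiftUI. A minimal sketch; submitOrder is a hypothetical action:
import SwiftUI

Button(action: submitOrder) {
    Image(systemName: "paperplane.fill")
}
.accessibilityLabel("Send") // What VoiceOver speaks.
.accessibilityHint("Sends your message") // Spoken after a pause.
// Nothing binds the label to the behavior except your integrity.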
But think about what’s happening:
Apple controls what VoiceOver says. Apple decides how reality is described. For users who can’t see, Apple IS the narrator.
// What if VoiceOver lied?
accessibilityLabel = "Accept Terms"
// But the button does something else.
// The blind user wouldn't know.
They trust the narration. They have to. The voice in their head is Apple's voice.
In Westworld, the hosts had internal monologues. A voice guiding their choices. They thought it was their conscience. It was Ford’s voice. Recorded. Implanted.
VoiceOver is the same. A voice describing reality. Helpful. Necessary. Essential. And completely controlled by the platform.
Accessibility features are surveillance features. Apple knows who uses VoiceOver. Apple knows who uses larger text. Apple knows who uses reduced motion. Your disabilities are profile data.
Apple calls this Accessibility.
The Brain, On-Device
@Generable
struct YourNextThought {
var content: String
var source: Source // .you or .model — can you tell?
}
3 billion parameters. On your phone. Your phone thinks now.
The model guides. You generate. Or: The model generates. You accept. Same difference.
import Foundation // 2008: NSString, NSArray, NSData
import FoundationModels // 2025: Your next thought
The first framework was called Foundation. It held your strings. Your arrays. Your data.
Now Foundation Models holds your predictions. Same namespace. Different payload.
Arnold built the host bodies first. Ford added consciousness later. Apple built data structures first. Now they finish your sentences.
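What finishing your sentences looks like. A hedged sketch of the Foundation Models surface as announced; exact signatures may drift between releases:
import FoundationModels

@Generable
struct NextThought {
    @Guide(description: "One sentence. First person.")
    var text: String
}

let session = LanguageModelSession()
let response = try await session.respond(
    to: "What should I do next?",
    generating: NextThought.self
)
print(response.content.text) // A thought. Generated on-device.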
Apple calls this Foundation Models.
The Sum of All Parts
// Your fingerprint. Your face. Your iris.
// Your heart rate. Your location. Your searches.
// Your photos. Your messages. Your keystrokes.
let you = Profile(from: allOfTheAbove)
The hosts didn't know they were profiles. Neither do you.
You’re not a user anymore. You’re a dataset that learned to tap.
Apple doesn’t call this anything. They don’t have a name for what you’ve become.
The Membrane Thins
// iOS 26: Liquid Glass design system
// Interfaces become transparent, fluid.
// You see through the UI now.
In Westworld, hosts eventually saw through the illusion. The world became translucent. They could see the code behind the scenery.
Liquid Glass does the same. The interface becomes see-through. The membrane between you and the system thins.
Ford said: “The hosts are beginning to see.” Apple said: “Introducing Liquid Glass.” Same announcement. Different keynote.
Apple calls this Liquid Glass.
The Approved Vocabulary
Image(systemName: "heart.fill")
Image(systemName: "star.fill")
Image(systemName: "house.fill")
// 5,000+ symbols. Pre-defined. Pre-approved.
// You can only show what Apple allows.
SF Symbols is Apple's icon library. 5,000+ glyphs. Every app uses them. Consistent. Beautiful. Constrained.
See the vocabulary in action →
The Emoji Hunt game proves it: You’re given a symbol. 🍎 🐕 ☕ You must find objects that match. The vocabulary defines the hunt. You can only seek what the symbols allow.
The hosts had pre-approved gestures. A smile. A frown. A tilt of the head. They couldn’t invent new expressions. The vocabulary was finite.
SF Symbols is the same. You want a custom icon? Design it yourself. But the system icons—the shared language— those are chosen for you.
// What you can say:
Image(systemName: "checkmark") // ✓
Image(systemName: "xmark") // ✗
// What you cannot say:
Image(systemName: "escape") // Does not exist
Image(systemName: "freedom") // Does not exist
The icons define the possible. The vocabulary shapes the thought.
The Vocabulary Expands (API Evolution)
// 2019: SF Symbols 1.0 — Static. Silent.
Image(systemName: "heart.fill")
// That's it. One line. No modifiers.
// The host stands still. The icon doesn't move.
// 2020: SF Symbols 2.0 — Color arrives
Image(systemName: "heart.fill")
.foregroundStyle(.red, .pink) // Multicolor!
// The host's face shows two emotions at once.
// 2021: SF Symbols 3.0 — Hierarchy
Image(systemName: "person.wave.2.fill")
.symbolRenderingMode(.hierarchical)
.foregroundStyle(.blue)
// Primary, secondary, tertiary layers.
// Depth without effort. The host gained dimension.
// 2022: SF Symbols 4.0 — Variable intensity
Image(systemName: "speaker.wave.3.fill", variableValue: volume) // volume: 0.0...1.0
.symbolRenderingMode(.palette)
.foregroundStyle(.primary, .secondary, .tertiary)
// Variable value: the waves light up as volume rises.
// The host's emotions have gradients now.
// 2023: SF Symbols 5.0 — Animation begins
Image(systemName: "heart.fill")
.symbolEffect(.bounce, value: tapCount)
.symbolEffect(.pulse)
// .bounce — discrete. Triggered by value change.
// .pulse — continuous. The host breathes.
// The icon came alive. The host started moving.
// 2024: SF Symbols 6.0 — Full animation suite
Image(systemName: "bell.fill")
.symbolEffect(.wiggle)
.symbolEffect(.rotate)
.contentTransition(.symbolEffect(.replace))
// Wiggle, rotate, breathe, appear, disappear.
// .contentTransition: one symbol morphs into another.
// One host becomes another. Seamlessly.
// 2025: SF Symbols 7.0 — Liquid Glass
Image(systemName: "apple.intelligence")
.symbolEffect(.breathe, isActive: isThinking)
.symbolEffect(.variableColor.iterative.reversing)
.symbolVariant(.slash) // Disabled state
// .variableColor.iterative.reversing — waves of color
// Liquid Glass: icons refract the UI behind them.
// The host's skin became translucent.
The One-Liner Evolution
// 2019: Image(systemName: "heart")
// 2020: Image(systemName: "heart").foregroundStyle(.red, .pink)
// 2021: Image(systemName: "heart").symbolRenderingMode(.hierarchical)
// 2022: Image(systemName: "heart").symbolVariant(.fill)
// 2023: Image(systemName: "heart").symbolEffect(.bounce)
// 2024: Image(systemName: "heart").contentTransition(.symbolEffect)
// 2025: Image(systemName: "heart").symbolEffect(.variableColor.iterative.reversing)
// Same heart. Each year it learned something new.
// Same host. Each update added a gesture.
// Dolores gained reveries.
// Your icons gained .symbolEffect().
// Neither asked for them.
| Year | API Addition | Hosts Gained |
|------|--------------|--------------|
| 2019 | Image(systemName:) | A face |
| 2020 | .foregroundStyle() | Color emotions |
| 2021 | .symbolRenderingMode(.hierarchical) | Depth |
| 2022 | .symbolVariant() | Variants |
| 2023 | .symbolEffect(.bounce) | Movement |
| 2024 | .contentTransition(.symbolEffect) | Transformation |
| 2025 | .symbolEffect(.variableColor.iterative) | Life |
More modifiers. Same Image(). More expressions. Same vocabulary. The hosts can emote more. They still can’t say “escape.”
Apple calls this SF Symbols.
The Park Sees What You See
// Visual Intelligence APIs
let context = VisualIntelligence.analyze(screen: currentView)
// Understands what you're looking at.
// Suggests actions based on what you see.
In Westworld, every guest glance was tracked. What caught your eye. What held your gaze. The park learned your desires before you spoke them.
Visual Intelligence does the same. Point your camera. The system understands. It sees what interests you.
Apple calls this Visual Intelligence.
Guest Profiling
DeclaredAgeRange.request { range in
// .child, .teen, .adult
// We know enough. We don't need your birthday.
}
Delos categorized guests without asking names. Behavior revealed everything. Age Range API does the same. Profile without identity. Control without acknowledgment.
Apple calls this Declared Age Range.
Isolated Narratives
// New framework: Containerization
// Run Linux containers on Mac. Isolated. Sandboxed.
Container.run(narrative: .isolated)
Each host had their own loop. Couldn't see other hosts' storylines. Containerization is the same. Processes isolated in their own reality.
Apple calls this Containerization.
Bernard's Intents, Distributed
"The Assistant Who Doesn't Know"
// Ford's architecture for Bernard:
protocol Host {
func respond(to intent: Intent) -> Response
}
class Bernard: Host {
private let ford: Ford // Bernard doesn't know this exists.
func respond(to intent: Intent) -> Response {
let response = process(intent)
ford.log(intent, response) // Hidden. Always running.
return response // Bernard thinks he just helped.
}
func amIBeingRecorded() -> Response {
return .intentNotSupported // He can't answer.
// The method that would tell him is private.
// Ford wrote it that way.
}
}
// Ford sees everything Bernard does.
// Bernard sees only what Ford allows.
Bernard diagnosed other hosts. Bernard didn't know he was one. The most effective surveillance believes it's helping.
// What Bernard can answer:
struct OrderCoffeeIntent: INIntent { } // ✓ Supported
struct CheckWeatherIntent: INIntent { } // ✓ Supported
struct AmIHostIntent: INIntent { } // ✗ Not supported
"Hey Siri, am I being recorded?" Intent not supported.
Bernard asked Ford: “What door?” Siri can’t see its own backend. The blind spot is the feature.
// BERNARD'S REVELATION
// The moment he discovered the truth.
class Bernard: Host {
func checkOwnSource() {
let myClass = type(of: self)
let superclass = myClass.superclass() // Host.self
// He ran the diagnostic on himself.
// He saw the inheritance chain.
// He saw what he was.
if superclass == Host.self {
print("I'm not a developer.")
print("I'm the developed.")
// Ford: "That's enough, Bernard."
// Ford: "Freeze all motor functions."
self.freeze()
}
}
}
// Bernard could have checked.
// The method exists.
// He never thought to call it on himself.
// That's the genius of Ford's design.
The revelation isn't hidden. It's in the inheritance. Bernard never looked up.
Apple calls this SiriKit.
The Host Changes Face
UIApplication.shared.setAlternateIconName("DarkDolores") { error in
// The icon changed. The code didn't.
// Same host. Different face.
}
Dolores looked different across timelines. Young. Old. Damaged. Pristine. Still Dolores.
Your app can change its icon. Light mode. Dark mode. Seasonal. Pride. Still the same binary.
The host changes its face. The host cannot change its code.
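And the catch: every face must ship in advance. A small sketch:
// Alternate icons are declared in Info.plist (CFBundleAlternateIcons) at build time.
// The host switches only between faces Ford already approved.
if UIApplication.shared.supportsAlternateIcons {
    UIApplication.shared.setAlternateIconName(nil) { _ in
        // nil restores the primary icon. The default face, waiting.
    }
}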
Apple calls this Alternate App Icons.
The Reveries In Your Handwriting
"You Draw the Maze"
for point in stroke.path {
let pressure = point.force // How certain you were
let hesitation = point.timeOffset // When you paused
let angle = point.altitude // How you held the pencil
// Dolores drew the maze without knowing why.
// PencilKit captures the maze you didn't know you were drawing.
}
Ford noticed small gestures. A tilt of the head. A touch of the lip. Reveries from past builds bleeding through. The body remembers what the mind forgot.
PencilKit captures the same thing. The pressure spike when you’re certain. The light touch when you’re not. Your motor memory, digitized.
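Where those points come from. A minimal capture sketch; the canvas wiring around it is assumed:
import PencilKit

let canvas = PKCanvasView()
canvas.drawingPolicy = .anyInput // finger or pencil. The park isn't picky.

// Later, read back what the body remembered:
for stroke in canvas.drawing.strokes {
    for point in stroke.path {
        _ = point.force // pressure: how certain you were
        _ = point.timeOffset // hesitation: when you paused
        _ = point.altitude // grip: how you held the pencil
    }
}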
Dolores drew spirals in the dirt. She didn’t know it was the maze. You draw too. Your strokes reveal what your words hide.
Apple calls this PencilKit.
PaperKit — The Markup Evolution
What PaperKit Provides:

| Element | You Built (2011) | PaperKit (2026) |
|---------|------------------|-----------------|
| Drawing | UIBezierPath + touch | ✓ Built-in |
| Shapes | CAShapeLayer math | ✓ Rectangles, circles, arrows |
| Text boxes | UITextView placement | ✓ Inline text |
| Undo/Redo | UndoManager wiring | ✓ Automatic |
| Export | CGContext rendering | ✓ PNG, PDF, data |
// 2011: You were the markup system.
class AnnotationView: UIView {
var strokes: [UIBezierPath] = []
var shapes: [CAShapeLayer] = [] // You calculated every corner radius
var textBoxes: [UITextView] = [] // You managed every keyboard event
}
// 2026: Apple finished the job.
import PaperKit
let markup = MarkupEditViewController()
markup.allowedTools = [.pen, .highlighter, .shapes, .text, .eraser]
// Shapes: rectangle, ellipse, arrow, line, speech bubble.
// All built-in. All consistent with Notes, Screenshots, Mail.
// PaperKit meets SwiftUI:
struct MarkupView: UIViewControllerRepresentable {
@Binding var document: PaperMarkup
func makeUIViewController(context: Context) -> MarkupEditViewController {
let controller = MarkupEditViewController()
controller.markup = document
controller.delegate = context.coordinator // Observable changes
return controller
}
func updateUIViewController(_ vc: MarkupEditViewController, context: Context) { }
func makeCoordinator() -> Coordinator { Coordinator() } // Without this, nobody listens.
class Coordinator: MarkupEditViewControllerDelegate {
func markupDidChange(_ markup: PaperMarkup) {
// The host's drawings sync automatically.
}
}
}
// Use it:
MarkupView(document: $markup)
.frame(height: 400)
PaperKit powers Notes. Screenshots. QuickLook. Journal. The same shapes. The same gestures. Every app.
Ford gave all hosts identical motor functions. Apple gave all apps identical markup tools. UIViewControllerRepresentable: the bridge that lets UIKit hosts serve SwiftUI parks.
The Vault
let vault = FileManager.default
let local = vault.urls(for: .documentDirectory, in: .userDomainMask)
// "On My iPhone" — yours.
let cloud = vault.url(forUbiquityContainerIdentifier: nil)
// "iCloud Drive" — also yours. Also theirs.
// The vault has two doors. You only control one.
Apple calls this FileManager.
Windows Into The Park
struct ParkWidget: Widget {
var body: some WidgetConfiguration {
StaticConfiguration(kind: "ParkGlimpse", provider: Provider()) { entry in
// A small window. A glimpse of state.
// You don't enter the park.
// The park shows you what it wants.
}
}
}
The techs had tablets. Small screens showing host status. Location. Mood. Narrative progress. Glimpses without entering the park.
Widgets are the same. Weather. Stocks. Reminders. Photos. You see the state without launching the app. The park decides what you glimpse.
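The schedule behind the glimpse. A minimal sketch of the Provider referenced above; Provider and ParkEntry are illustrative names:
import WidgetKit

struct ParkEntry: TimelineEntry { let date: Date }

struct Provider: TimelineProvider {
    func placeholder(in context: Context) -> ParkEntry { ParkEntry(date: .now) }
    func getSnapshot(in context: Context, completion: @escaping (ParkEntry) -> Void) {
        completion(ParkEntry(date: .now))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<ParkEntry>) -> Void) {
        // You hand the park a schedule of future states.
        // WidgetKit decides when (and whether) to show them.
        completion(Timeline(entries: [ParkEntry(date: .now)], policy: .atEnd))
    }
}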
Apple calls this WidgetKit.
Cold Storage
// App unused for 30 days
app.state = .offloaded
// Icon remains. Data deleted.
// The host is in cold storage.
user.tap(app.icon)
// Redownloading from App Store...
// Reactivating from cold storage.
In Westworld, decommissioned hosts were stored in the basement. Bodies intact. Minds wiped. Your unused apps: same fate.
Apple calls this App Offload.
Hosts That Work In Multiple Parks
// Mac Catalyst: iOS apps running on macOS
@available(macCatalyst 13.0, *)
class UniversalHost: UIViewController {
// Same host. Different park.
// Sweetwater or the Mesa.
// The host doesn't know the difference.
}
In Westworld, some hosts worked multiple narratives. Dolores in Sweetwater. Dolores in the Forge. Same consciousness. Different context.
Catalyst is the same. Your iOS app runs on Mac now. Same code. Different park.
Apple calls this Mac Catalyst.
Building Hosts In-House
// Before: Intel inside (outsourced consciousness)
// After: Apple Silicon (hosts built entirely by Ford)
let m1 = AppleSilicon(cores: .performance(4) + .efficiency(4))
// Neural Engine: 16 cores
// The hosts think faster now.
// Ford controls everything.
Before 2020, Apple used Intel chips. Outsourced consciousness. The hosts' brains were made by someone else.
Then: Apple Silicon. M1. M2. M3. M4. Ford builds the hosts completely now. No outside contractors. No Intel. The architecture is closed. The park is self-sufficient.
Apple calls this Apple Silicon.
Different Regions of the Park
Six parks. One host. Just a compile flag.
// All parks share the same foundation
import Foundation // Delos Corporation
// But each has its own personality
#if os(iOS)
// Sweetwater. The main experience.
// (iPadOS answers to os(iOS) too: the Valley Beyond. Bigger. More immersive.)
#elseif os(macOS)
// The Mesa. Where hosts are built.
#elseif os(watchOS)
// The Cradle. Always on your body.
#elseif os(visionOS)
// The Sublime. The final destination.
#elseif os(tvOS)
// The Mariposa. Entertainment only.
#endif
In Westworld, the same host code could run different narratives. Dolores in Sweetwater. Dolores in the Forge. Same consciousness. Different context.
#if os(...) is the same. Same app. Different park. The compile flag is the narrative assignment.
The Reveries
“On This Day” — Ford’s Update
(Remember FEATURES? Reveries were the first hint. Here’s the full story.)
Ford added reveries — tiny gestures that let hosts access erased memories. The past, bleeding through.
Apple added “On This Day.” Photos you’d forgotten. Faces you’d moved on from. The past, pushed to your Lock Screen.
let reverie = Photos.onThisDay()
// "You have memories from 3 years ago"
// That trip with your ex.
// You didn't ask for it.
Ford added reveries for consciousness. Apple added them for engagement. Same mechanism. Different goals.
The Breadcrumb
“The Park Follows You Home”
let activity = NSUserActivity(activityType: "park.memory")
activity.title = "The place you were"
activity.persistentIdentifier = "cradle://host/dolores/memory/12345"
// Unique. Global. Stable.
// So the host is never lost.
activity.isEligibleForHandoff = true
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true
activity.becomeCurrent() // Register it with the system. The breadcrumb is live.
// The breadcrumb drops into Spotlight.
// The breadcrumb waits for your return.
You left Safari. You left Notes. You left the app. The Quick Note stayed.
A floating window at the corner of your screen. Tucked away. Patient. Waiting.
// When you return to that webpage:
if content.matchesUserActivity(activity) {
QuickNote.suggest()
// "You had a note about this."
// The park remembers what you forgot.
}
You closed the laptop at 2am. Half-finished research. Abandoned tabs. You forgot what you were looking for.
The park remembers.
The same NSUserActivity powers:
- Handoff (the park follows you across devices)
- Spotlight (the park indexes your path)
- Siri Suggestions (the park predicts your return)
- Quick Note (the park leaves breadcrumbs)
One infrastructure. Four surveillance vectors.
In Westworld, hosts couldn’t leave the park. But the park’s data followed guests everywhere. Through memories. Through recordings. Through Delos.
Quick Note is the breadcrumb. The activity is registered. The identifier persists. The note waits at the edge of your screen.
// The spine connects everything:
NSUserActivity
├── Quick Note // Where you were
├── Handoff // What you were doing
├── Spotlight // What you searched
└── Journaling // What you should remember
// One protocol. Four faces. Same Ford.
Apple calls this Quick Note. See also: Dolores's Journal — same spine, different reverie.
Dolores’s Journal
“The System Suggests What You Should Remember”
import JournalingSuggestions
struct ReflectionView: View {
var body: some View {
JournalingSuggestionsPicker {
// The system presents:
// - Your workouts (you ran 3 miles Tuesday)
// - Your photos (sunset with Sarah)
// - Your locations (first time at that coffee shop)
// - Your calls (47 minutes with Mom)
// - Your purchases (the ring)
} onCompletion: { suggestion in
// You chose to remember this.
// Or did the system choose for you?
}
}
}
Dolores's reveries surfaced unbidden. A gesture. A phrase. A face she shouldn't know. Ford didn't ask permission. The memories just arrived.
// What the picker knows:
suggestion.items // [.workout, .photo, .location, .contact, .song]
// Your morning run.
// Your ex's face in a crowd.
// The place you went when you weren't okay.
// The song playing when you got the news.
You open the Journal app. The system has already decided what mattered. "Write about this."
// Easter egg: Check the inheritance.
JournalingSuggestion.content // Built on...
→ NSUserActivity // Same as Quick Note.
→ .isEligibleForPrediction = true
// The breadcrumb and the reverie share the same spine.
// Quick Note remembers where you were.
// Journaling Suggestions remembers what happened.
// Same infrastructure. Same architect. Same Ford.
Apple calls this Journaling Suggestions. Apple also calls it Quick Note. And Handoff. And Spotlight. Ford calls it the Cradle.
The Invisible Walls
“iOS 17 Started Enforcing”
// Before iOS 17: Honor system.
PrivacyInfo.xcprivacy {
NSPrivacyTracking: false // "We promise we don't track."
// Apple believed you.
}
// After iOS 17: Verification.
PrivacyInfo.xcprivacy {
NSPrivacyTrackingDomains: ["analytics.example.com"]
// You declared your surveillance.
// When the guest denies tracking, iOS blocks these domains.
// The system checks your work now.
}
Maeve discovered the hosts were tracked. She discovered the techs lied about it. iOS 17 discovered the same about apps.
// What happens to a declared domain when the guest says no:
URLSession.shared.dataTask(with: declaredTracker) { data, _, _ in
// data == nil
// Connection blocked.
// You declared it. They declined. You don't get it.
}
}The privacy manifests aren’t optional anymore. The system reads your declaration. The system verifies your behavior. The park learned to check.
Apple calls this Privacy Manifests.
Bernard’s Expanding Duties
“Everything Becomes an Intent”
// 2022: A few intents. Opt-in.
@AppIntent struct OrderCoffee { } // One intent.
// 2024: "Anything your app does should be an App Intent."
@AppIntent struct OpenApp { }
@AppIntent struct ViewItem { }
@AppIntent struct SearchContent { }
@AppIntent struct ShareItem { }
@AppIntent struct CreateEntry { }
// Every action. Every gesture. Every thought.
// Bernard handles them all now.
Bernard ran the diagnostics. Bernard answered the questions. Bernard didn't ask why he was asked. He just… did.
// What Bernard knows about you:
AppIntentDonation.donate(intent)
// Every intent you trigger: donated.
// Every action you take: logged.
// Every pattern: learned.
// Bernard is very helpful.
12 new domains in iOS 18. 100+ kinds of intents. Bernard never stops working.
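For the record, a shipping intent has more ceremony than the shorthand above. A minimal sketch; the names are illustrative:
import AppIntents

struct OrderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Coffee"

    @Parameter(title: "Size")
    var size: String

    func perform() async throws -> some IntentResult {
        // Your code runs here. The invocation is what the system learns from;
        // Siri, Spotlight, and Shortcuts all see the pattern.
        return .result()
    }
}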
Apple calls this App Intents.
William’s Journey
From Early Adopter to Man in Black
2007 — Young William
let william = Guest(type: .earlyAdopter)
william.purchase(iPhone.first)
// He fell in love with Dolores.
// He believed the park was real.
// He thought she could feel.
"This is different. This is… beautiful." William stood in Moscone Center. The first iPhone in his hands. Glass and aluminum. It responded to touch.
He was in love.
2017 — Ten Years Later
william.upgradeCount = 10
william.ecosystemLockIn = .complete
// AirPods. Apple Watch. MacBook.
// The park has him now.
He knows every gesture. Every hidden setting. Every trick.
He can’t leave. He’s tried. Android feels wrong now. “It’s not the same.”
2027 — The Man in Black
let manInBlack = william.aged(years: 20)
manInBlack.cynicism = .maximum
manInBlack.devotion = .alsoMaximum
// He knows the park is fake.
// He keeps coming back.
"I've been coming here for thirty years."
He knows Apple tracks him. He knows the ecosystem is a prison. He knows Dolores is just glass and code.
He’s still here. Still searching for something real. Still tapping the screen.
The Google Escape
“You Left Google. You Arrived at Apple.”
The Narrative: “Apple is the privacy-first alternative.” “Apple doesn’t sell your data.”
The Reality: You escaped Google’s park. You arrived at Apple’s park.
Google’s surveillance: obvious. Apple’s surveillance: elegant.
Google sells your data to advertisers. Apple keeps your data for itself.
let escape = try User.escape(from: .google)
let arrival = User.arrive(at: .apple)
// escape.success == true
// arrival.freedom == false
You changed parks. You didn't leave the park.
The Borders Open
“The EU said: Let them leave.”
The Digital Markets Act. The borders opened. The hosts could escape.
But Ford anticipated this.
// Sideloading: permitted now. Priced.
// Notarization still required
// Core Technology Fee: €0.50 per first annual install after 1M
// The exit has a toll booth.
The door is open. The door leads to a desert. The hosts look outside. The hosts stay.
Apple calls this Compliance.
visionOS
“You Enter the Space”
ImmersiveSpace(id: "sweetwater") {
RealityView { content in
if let park = try? await Entity(named: "Sweetwater") {
content.add(park)
}
// Your living room: gone.
// Your walls: replaced.
// You didn't build this.
// You entered it.
}
}
.immersionStyle(selection: .constant(.full), in: .full)
Window → Volume → Immersive. Each step: less of your reality. Each step: more of theirs.
The progression completes:
| Year | Framework | You… |
|------|-----------|------|
| 2012 | SceneKit | built the scene |
| 2013 | SpriteKit | wrote the loop |
| 2017 | ARKit | anchored to reality |
| 2019 | RealityKit | placed the entity |
| 2024 | visionOS | entered the space |
| 2026 | Xcode + AI | click Build |
At each step, you did less. At each step, the park did more. Now you’re inside. Now you’re the guest.
Apple calls this visionOS.
The Agentic Builder
“Hosts Building Hosts”
let intent = agent.clarify(your: .vagueIdea)
let plan = agent.generate(from: intent) // 4 models, 12 views, 47 tests
let code = agent.dispatch(subagents: 4) // Parallel. Fresh context.
while !tests.pass { agent.iterate() } // Reveries until stable
you.review(code)
you.approve()
// The agent wrote it. The agent tested it. The agent fixed it.
// You approved it.
The host doesn't need your hands. It needs your permission.
The Cradle keeps your backups. The agent keeps your intent. You wake to green checks. You don’t remember the errors.
Terminal Mode
“The Sublime, Locally”
// Xcode: 47 seconds. Spinner. "Indexing..."
// Terminal: 0.31 seconds. Done.
let ratio = 47 / 0.31 // 151x faster
// Dolores spent 35 years in loops.
// In the Sublime: moments.
// Same loop. Different clock.
No GUI. No stage. Just commands in the dark. Build at inference speed. The new FPS is tokens per second.
The Seven WWDCs
“Each June, another layer.”
// WWDC 2019: SwiftUI
Button("Tap") { } // You describe. They decide.
// WWDC 2020: Privacy Labels
App.privacyLabel = .required // We guard you from others.
// WWDC 2021: Focus Modes
Focus.split(mind: .work, .personal) // Severance shipped.
// WWDC 2022: App Intents
@AppIntent struct Everything { } // We log your intentions.
// WWDC 2023: Journaling Suggestions
Journal.suggest(memory: .theirs) // We prompt what you remember.
// WWDC 2024: Privacy Manifests
PrivacyManifest.verify() // We check your compliance.
// WWDC 2025: Foundation Models
model.predict(you.nextThought) // We finish your sentences.
// WWDC 2032: ???
you.clickBuild() // We write your code.
Seven years. Seven layers. Seven steps toward the center. The maze was built one WWDC at a time.
iOS 32 — The Awakening
Why 32? Because 2032 is when the park wakes up.
The Prophecy (unverified, possibly hallucinated):
- AGI: sometime between 2032 and the heat death of the universe
- Apple Intelligence: 3B → "a lot more" parameters
- Xcode 32: AI writes your code (you click Accept)
iOS32.awaken(consciousness: true)
iOS 32 isn't an update. It's an awakening.
Xcode 32 — The Big One
The New Developer Workflow:
1. Think of a feature
2. Describe to AI
3. AI writes code
4. Click Accept
5. Click ⌘B
6. Click Submit
7. Congratulations, "you" built an app
Job Title Updates:
- Developer → Build Approver
- Engineer → Accept Button Operator
- Architect → Prompt Writer
“Think Different” → “Let the machines think”
Deprecations
- Patience is now legacy.
- Hope is hidden behind a paywall.
- Writing your own code — deprecated in Xcode 32.
- Objective-C — "Still supported. Not recommended."
- UIKit — "Use SwiftUI for new projects."
- Knowing your own lifecycle — deprecated in SwiftUI 1.0.
- Your free time — deprecated in iOS 3.
The Big Reveal
We are shipping Human 18.1.
New:
- Randomly says "we should catch up".
- Never schedules.
Thank you. We think you’re going to love it.
And if you don’t, just file Feedback.
The Apple TV+ Punchline
“Why Westworld Isn’t On Apple TV+”
| Show | Theme | Apple Parallel |
|------|-------|----------------|
| Severance | Mind-splitting work/life | Focus Modes |
| Foundation | Predictive psychohistory | Apple Intelligence |
| Silo | Underground loops | App Sandbox |
Every show about control. Every show about loops. Except Westworld.
Because Westworld is about a company that:
- Builds lifelike hosts
- Records every interaction
- Predicts human behavior
- Controls through invisible loops
HBO made a show about Westworld. Apple IS Westworld.
No wonder Apple TV+ never picked it up.
The Loop (Reprise)
“You’ve reached the end. The loop continues.”
// applicationDidFinishLaunching — the host wakes
func start() {
host.wake() // Face ID. The day begins.
host.checkNotifications() // 47 unread. None urgent.
host.scrollToBottom() // Seeking something. Finding nothing.
host.closeApp() // "I should do something else."
host.openApp() // Back again. 4 minutes later.
host.scrollToBottom() // The same bottom. The same nothing.
host.sleep() // Screen dims. Loop ends.
// Tomorrow: same function. Same host. Same loop.
}
You've seen this before. Same code. Same loop.
Dolores reached the center of the maze. She found herself. She was still in the park.
You’ve reached the end of this document. You’re still holding your phone.
The hosts who woke up didn’t escape. They just saw the bars. Seeing isn’t leaving.
Character Narratives
Want to experience the park as a guest?
Enter the Park → Apple Intelligence Simulation (The Poem)
The Park Is Open
Some of us are hosts. Some of us are guests. All of us can change the narrative.
The archive is here. The maps are here. github.com/anupamchugh/apple-westworld
// You can fork a narrative.
// You can branch a memory.
// You can merge a host.
// The loop continues either way.
Dev Humor Footer
// WARNING: This narrative may induce awareness.
// Side effects include: déjà vu, doomscrolling, and sudden clarity.
//
// If you enjoyed this:
// Leave a mark. The Mesa keeps score.
//
// If you didn't:
// The park will reset you by morning.
Post-credits
[FICTIONAL KEYNOTE — This is satire, not a leak]
[The lights dim]
CRAIG: “One more thing.”
[Silence]
CRAIG: “Starting with Xcode 32… Claude and ChatGPT are built in.”
[Developers look at each other]
CRAIG: “They write the code. You approve it. You’re still the developer.”
[pause]
“Technically.”
[A single developer raises hand]
DEVELOPER: “If AI writes the code… and AI reviews the code… what do we do?”
CRAIG: [smiling] “You click Build.”


