What I see of the Apple Car in the Vision Pro
What Apple's New Thing says about their Next Thing
Nine years ago, the Wall Street Journal reported that hundreds of staffers at Apple were working on a minivan-like electric car.
One week ago today, Apple launched its first new computer form factor since the iPad, which slotted a tablet between our smartphones and our laptops.
I got the new face computer. At times it’s mind-blowing; at other times it feels like another incremental (though meaningful) step for this form factor. I’ll probably write more about it soon; it’s fun to talk about.
Oddly, I feel much more compelled to look at the parts of it that seem like they might be serving as R&D for an even more premium product: that huge, multi-decade project, the Apple Car.
Ever since Tesla put a giant tablet in the middle of the Model S dashboard, a take has resurfaced now and again: Apple should buy Tesla, because it could put an iPad on the dashboard and an Apple logo on the hood and sell even more electric cars to its rich customers.
I’ve long been curious about the software-enabled steps Apple would take to differentiate the experience of driving and being a passenger in a car, beyond putting iPads in the middle consoles and iOS-ifying the gauge cluster as they have with CarPlay so far. I think the Vision Pro offers some clues.
Ears, Eyes, Screens
Spatial Audio
The Vision Pro’s built-in audio is surprisingly great, even though the speakers don’t cover your ears. Apple’s doing some magic audio processing that simulates how sound would bounce around the room you’re in before reaching your ear, as if it were emitted from the physical location of the virtual windows around you. What they’ve done here feels like an extension of Spatial Audio on AirPods Pro/Max, which gives you some of the audio mixing upsides of a 5.1 home theater setup.
I would bet that the Apple Car is going to feature a combination of noise cancelling for road and wind noise as well as an evolution of Spatial Audio that offers some combination of:
per-passenger balance — everyone in the car feels like they’re getting an audio mix for their exact position from the speakers all around the car
playback of different audio to different passengers over the same shared speakers
attention-grabbing driver alerts that avoid disturbing others in the car
I might even be overthinking it with these niceties. Active noise cancelling is one of those features that has been available in high-end car models for years but could be done way better, becoming a source of endless clout-seeking videos on social media as early adopters show off the difference.
Eye-Tracking HUD
Tesla has long courted controversy by shifting inputs for things like climate and media controls to a touchscreen instead of offering dedicated buttons the driver can use without looking. More recently, it cut down further on physical input affordances by eliminating the stalks from the sides of the steering wheel, turning the turn signal and gear selector into buttons on the wheel or new gestures on the central touchscreen. Tesla has also been using a camera inside the cabin to watch the driver and make sure they’re alert and paying attention to the road, though the user experience often feels like a Cover Your Ass mechanism after years of being loose about letting drivers use Autopilot carelessly.
I think that Apple is going to flip eye tracking in the car from feeling like a nanny or backseat driver to a seamless input that makes you feel both aware of the road and able to do things like switch playlists on Spotify (ahem, Apple Music in the unveil presentation) when your toddler asks from the back seat. It’s “just” going to be CarPlay with less friction — but that might be a huge user experience unlock. Instead of car HUDs showing a speedometer view or an arrow for your upcoming turn, it’ll be a windshield-wide AR interface like the Vision Pro.
EyeSight
This one is perhaps the biggest stretch, but the external EyeSight screen on the Vision Pro shows that Apple is thinking about how the wearer relates to the people around them, who otherwise can’t tell whether the person in the headset can see them or understand their intent. If you’re present and could talk to someone, here are some eyes to make contact with. If you’re busy, here are some pretty waveforms that quickly say so.
In cars, we communicate a lot of context to other drivers and pedestrians with turn signals and brake lights. What if the front and back of the car were giant EyeSight displays? What other parts of vehicle state or driver intention could you communicate with way more pixels to work with? Could you customize the look of the car AND improve safety that way?
I already mentioned the reporting that the car would look like a “minivan”. Apple isn’t going to ship a minivan. The rumored partner platforms the car would be based on — the BMW i3 back in the day, Hyundai’s platform these days — were all packaged as vehicles at the odd intersection of hatchback and really tiny crossover. The Apple move would be to use a platform like that, stretch it out to fit as many people as an SUV (not like the Tesla Model Y does with 3 rows, I mean actually fitting them).
Then make it an engineering marvel and exotic luxury item Gen 1 by reducing its drag coefficient with a front that looks like the bulbous Vision Pro. Lift it just enough that you can call it a crossover SUV so Americans will buy it.
But what about Autonomy?
Two months prior to the WSJ scoop, Google unveiled its iconic Firefly self-driving prototypes — that’s an obvious direction. What could be more premium and software-enabled than having your car drive you to work while you get email done or watch YouTube on your phone or iPad?
But self-driving has proven to be really hard. Waymo is there, but not in a way that could be sold to end consumers all over the world. Tesla lets you pay to beta test having your car drive itself, but it’s not fully baked yet — to put it wildly generously.
Now we hear that Apple is de-prioritizing self-driving — they decided it’s only a nice-to-have. They don’t need to beat “stays in its lane and keeps up with traffic” to offer a genuinely exceptional driving experience — yet.
Kind of like the power cable and battery hanging off of the Vision Pro. Would it have been ideal to ship a V1 with this visual fidelity and user experience, the fairly unconstrained ability to run multiple apps side by side, a phone chip, and a smaller battery? Duh. Was it possible in February 2024? No!
We’re still in very early days for electric vehicle adoption. Apple could hit the Tesla “faster than your old ICE car” bar, then shoot past it by further nailing the little frictions that are invisibly part of even that forward-looking user experience. It’s not just about using nice glass, leather, and aluminum throughout and slapping an iPad on the dash. And it doesn’t need to mean delivering full autonomy anywhere a techie making six figures might want to drive.
Think of the cars executives at Apple actually get to drive. Ferraris, Porsches, Teslas, Mercedes when they want something more cushiony. Those companies have left a lot of the experience as a given for a very long time, even as they’ve pushed some aspect of performance or luxury to new highs. Apple is going to wow everyone by finally deciding to reexamine those things through the lens of technology they’ve already invested in for the iPhone, iPad, and now the Vision Pro.