Apple Vision Pro: First Impressions

Let me start by saying the Apple Vision Pro (AVP, sorry Apple, I’m not typing it out each time) is an amazing product, but it has some significant limitations. I am still undecided on whether I will keep it or return it, as my original main use case (replacing my MBP monitors) isn’t up to my standards yet, but the AVP impressed me in other ways that might make up for that. I have only spent about six hours with the AVP so far, so my thoughts on it are still very raw and subject to change. With that out of the way, let’s get started.

Initial Setup

Apple’s packaging continues to be top-tier with the AVP. As other reviewers have mentioned, the box is quite big, but as always Apple makes the unboxing a fun experience. The AVP comes with the headset, the Solo Knit Band, the Dual Loop Band, a USB-C charger, the battery, and a polishing cloth (I know, I know, I was super excited to get a prized Apple polishing cloth. It puts all other microfiber cloths to shame /s). I have not used the Dual Loop Band yet as the Solo Knit Band has been adequate so far, but I will try it out soon just to make sure I’m not missing anything. That said, the Dual Loop Band is so clearly an afterthought. It’s not janky or anything, but putting it next to the Solo Knit Band shows a stark difference in attention to detail. It’s effectively two elastic straps, while the Solo Knit Band looks like it was engineered to within an inch of its life.

Setting up the AVP was straightforward if you have an existing iOS device. I was able to get my Wi-Fi, Apple ID, saved cards, and more from my iPhone just by looking at the QR code on it. That said, I wish it had prompted me to move to a more open location. I was just in my kitchen (because I have zero self-restraint and wanted to set it up right away) and the UI was behind my pendant lights, which was a little awkward. You do some minor calibration (look at the dots, tap your fingers), scan your QR code if you have prescription lenses, do your typical Apple device setup (restore from backup, add cards to your wallet, etc.), and then you are dropped into the main UI grid of icons. Notably absent was the Siri setup. The device has Siri, but you don’t need to “train” it like you do on every other Apple product. Siri even works during a FaceTime call, which I found out by accident: I meant to trigger my phone/watch, but it popped up on my AVP and created the reminder for me right there.

The future is… blurry?

With the setup complete I looked around a bit and thought my prescription lenses were wrong. Things were blurrier than I had expected. It’s a massive step up from my Meta Quest 2 (MQ2) but still a bit grainy. The motion blur when you move is disappointing, though understandable, but it’s perfectly manageable. Heck, I walked around my house with my MQ2 in passthrough and that is a grainy B&W mess; this is a significant improvement. I had trouble reading my phone and watch, but the UI for the AVP was crisp and the text was very easy to read. There was no squinting or moving my head around to be able to read text like on the MQ2. I took out the prescription lenses and put in my contacts to rule out the possibility of a bad prescription or lenses, but the experience was the same.

While the blurry surroundings were a bit jarring, they fade away once you are using the AVP. Even without using the immersive “Experiences” you get used to it, since the AVP UI is always crisp and easy to read. It’s like watching TV: you are aware of your surroundings but aren’t focusing on them.

I won’t lie, I’m disappointed in not being able to read my phone/watch through the AVP. Apple made it seem like that was easy to do, and the reviewers who did mention it said they could read their phones/watches. I hope that I’m just doing something wrong (am I holding it wrong, Apple?) or that Apple finds a way to improve this. Something like AirPlaying the device UI into the AVP, so you don’t see the phone itself but an overlaid video feed? I think the phone could handle that, but I’m not sure if the watch has enough power (both battery and processor) to do it. I felt the need to lift my AVP when a notification came in so I didn’t miss it, and I quickly set up apps on my AVP that would deliver the notifications I needed to see most.

Interacting with the AVP

The AVP UI itself is beautiful, crystal clear, and easy to navigate. In fact, every AVP-native app I used was quite enjoyable. It was the iPad/iPhone apps that had some teething problems, mainly UIs with too-small tap targets (look targets?). In a few cases with non-AVP-native apps, I had to move my head to better look at a target I wanted to tap. This is where a trackpad or mouse comes in very handy. I paired a trackpad with my AVP and used it to tap on things I couldn’t do easily with my eyes alone. Your cursor will never float in the space between windows; instead it “jumps” to the window you are looking at. I had read reviews about this and was skeptical of how it would feel, but I was pleasantly surprised at how natural it is. That brings us to input as a whole.

Looking where you click is cool, but it’s a completely different paradigm from what we are used to. To quote Nilay Patel, “magic, until it’s not”. When you first get started it works magically, but as you get used to it and start to speed up you run into a few minor issues. My issues centered on my eyes wanting to “move on” from what I was doing, looking elsewhere and expecting my hand to be able to select/scroll/etc. on its own. More than once I clicked the wrong thing because my eyes drifted, like they can on my MBP, while I was typing. Eye drift is fine when a keyboard is paired with the device, but it doesn’t work when your eyes are what select the letters.

Another note on scrolling: it gets tedious when you have to do a lot of it. It requires you to pinch your fingers together and then move your hand up or down. You can pinch-flick to scroll faster, but then you can’t read what scrolls past, and I felt some fatigue while doing this. At one point I was scrolling through the coupons in the Kroger app (it’s not AVP-native, I just wanted to see how grocery shopping would be in the AVP) and I finally got up and paired my Magic Trackpad to the AVP so I could scroll better. I don’t know if Apple could do something like the two-finger trackpad scroll with finger tracking, or if that would have too many false positives.

The AVP has a floating keyboard that pops up when you are entering text. You can tap the keys directly with your fingers, look at a key and tap your fingers together to select it, or use voice input. Voice input is probably what you want to reach for when possible, though I think they need to improve that experience, followed by look-and-tap. Tapping with your fingers is cool for about five seconds but gets tiring and isn’t as accurate as look-and-tap. Maybe Apple can implement a “swipe” keyboard; I don’t use it on iOS, but I think it might be pretty cool and useful here.

Voice input does well; it’s one thing Siri can handle. But I found myself talking too early in most cases. I’d “press” the button, see it show the clicked state, and start talking, only to be interrupted by the tone signaling it has started listening… in the middle of what I was saying. That’s to be expected, but Apple can and should improve this UI. Fill the button in slowly from the center, or in a clockwise “wipe,” until it’s ready to listen. I don’t like waiting for an audio cue to start talking when the visual one has already completed. They even have a UI like this in the App Store search field, where you just look at the microphone, it fills in, and then it’s listening. However, even that UI is slightly buggy. If you don’t start talking right away it stops listening, and it’s a little too quick to give up if you pause for even a moment. I’m sure this is a tradeoff like making a single tap slower in order to wait for a possible double tap (Apple wants it to search as soon as you finish talking), but I found it annoying.

Connecting to my Mac

This was the #1 use case I had for the AVP, and as I said at the top of this post, it’s not there yet. I’m not ready to write this off on the current hardware, but playing around with it for 10-15 minutes made me realize that while it’s a massive leap over the MQ2 (and the MQ3/MQP, from what I’ve read at least), it’s still not ready to be my daily driver. The text is crisp when you are looking at it, but the text you aren’t looking at is slightly blurry. Maybe I could get used to it (after all, the blurry text IS text I’m NOT looking at), but it was distracting and brought back memories of having to focus intently in the MQ2 to read text. I want to be clear, that was NOT the experience I had with the AVP; the text was always clear when I looked at it, but my brain kept saying “the _other_ text is blurry”. My naive assumption is that this is related to foveated rendering, but I could be completely off-base. My understanding is that MBP screen mirroring uses AirPlay under the hood, so the MBP is just sending a video feed. The AVP wouldn’t need to “render” the UI, but maybe it saves some battery/CPU by not rendering the feed at full resolution unless you are looking at it.
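To make my foveated-rendering theory concrete, here is a toy sketch of the idea: keep full detail only near where the eyes are pointed and cheap out everywhere else. This is purely illustrative and assumes nothing about Apple’s actual pipeline; the `foveate` function, the 4x4 block size, and the hard cutoff radius are all made up for demonstration (real systems blend smoothly between quality levels and update with gaze every frame).

```python
import numpy as np

def foveate(image, gaze, radius):
    """Toy foveation on a 2D grayscale image: keep full resolution
    within `radius` pixels of the gaze point (row, col); replace
    everything else with 4x4 block averages (i.e., lower detail)."""
    out = image.copy()
    h, w = image.shape
    block = 4
    for y in range(0, h, block):
        for x in range(0, w, block):
            # Center of this block, used to measure distance from gaze.
            cy, cx = y + block / 2, x + block / 2
            if (cy - gaze[0]) ** 2 + (cx - gaze[1]) ** 2 > radius ** 2:
                # Outside the foveal region: flatten the block to its mean.
                out[y:y + block, x:x + block] = image[y:y + block, x:x + block].mean()
    return out
```

Even this crude version shows why the effect would be visible when your eyes drift: everything outside the radius is genuinely lower detail until the gaze (and the full-resolution region) catches up.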

I am very intrigued by the Developer Strap that went on sale yesterday and what it could mean for virtual displays. I know it’s a developer accessory, but let’s be honest, so is the AVP right now. I know plenty of non-developers bought it, but it’s a v1, early-days device; it’s effectively a dev kit plus a little more. I will keep an eye out for any open source or paid projects that use the Developer Strap to feed more data from my MBP to my AVP, and I’d buy a better virtual display solution built on it in a heartbeat. I’m already going to be plugged in for long stretches, so why not plug into my MBP?

One last thing I’ll mention about connecting to your Mac is how seamlessly you can use your mouse/keyboard with other Vision Pro apps. This is where Apple shines the brightest, the interoperability in its ecosystem. I assume this is built on top of Universal Control, which I have never used, and it works perfectly. Just look over at an AVP app (native or not), select a text field, and start typing on your Mac’s keyboard. It “just works” in the most Apple way possible.

Marco Arment mentioned that the AVP was not going to be a monitor replacement on ATP (and I think on The Talk Show) and he was 100% correct. I didn’t want to believe him but his comments about how to think about it more like an Apple TV than a Mac were spot on in my experience. Let’s just hope the app ecosystem blows Apple TV’s out of the water, which isn’t hard to do and has probably already happened.

Personas

I’d be remiss if I didn’t mention these. I know you’ve already read multiple articles (aka every AVP review) that talk about this so I’ll be brief. Yes, it falls into “Uncanny Valley” territory, yes it’s weird, yes it looks like I’ve injected Botox into my face, but also, yes, people knew it was me right away. My sisters and my mom started the FaceTime calls I made to them by laughing, rightfully so. I spent over an hour on FaceTime and while I’m sure their experience was different (having to look at Botox Josh) I enjoyed it. Sitting by a lake (using the AVP “Experiences”) while my mom hovered on a massive screen in front of me was cool. Much better than holding up your iPhone while on FaceTime.

One annoyance for me was that FaceTime does not show you your persona; you only get to see the other people. If you use Zoom, you can see yourself, and after trying both I’m still conflicted about which I prefer. In general, I like to see how I’m being presented to other people, but it was also distracting in Zoom since I didn’t like how I looked or how static my face (especially around my eyes) was. I’m used to regularly checking how I look in a video call to make sure my face reflects how I feel or what I want to convey, and my persona felt very limiting. I’m not sure if I would have rather had a Memoji representing me or not. Personas are still in beta, so I’m interested to see how they progress over time.

Killer app

People want to know what the killer app is for the AVP, and I think in general we just don’t know yet. The Apple Watch at launch was a confused product, and it took a year or two before Apple leaned into the fitness angle instead of the “iPhone on your wrist” framing it launched with. I was coming from a Pebble watch at the time, so I was already sold on “notifications on your wrist plus fitness tracking,” but I know a lot of people were disappointed in what it could do (or, more often, what it couldn’t; that Series 0 was rough). Back to the AVP: I tried a few apps, and while some were very cool to try out, none felt like “this is the future”. I need more time with the device (as do developers), but I can feel the potential; we are just waiting on the software. All the AVP-native apps were a clear step up from iPad/iPhone apps on the AVP, but it’s still early days.

Things like Widgetsmith are poised to provide amazing utility to the AVP in the form of widgets you can place in your surroundings. Little things that don’t need a full app like clocks, timers, weather (though Carrot Weather is great and has an AVP-native app), etc. Honestly, I’m surprised Apple doesn’t natively support iOS widgets in AVP. I don’t think Widgetsmith has timers built in yet but it’s something that should be added since it’s a cool feature. Crouton allows you to break out timers you can place around your kitchen while you cook which is so freaking cool. I really hope my recipe app, Paprika, adds something like this. I haven’t cooked in my AVP yet but I plan on trying that out.

The app I probably spent the most time in, aside from setting up apps, was Photos. This should honestly be the first thing you try in your AVP. Spatial videos shot on my iPhone were cool but limited (there is only so much you can do when the lenses are that close), while panoramic shots are, quite literally, breathtaking. I was instantly transported back to where I took the picture. Unfortunately, most of those pictures are from 2012 (when the feature was released) since I saw very little utility in taking them, and they are sometimes frustratingly difficult to take. Because of that, the quality wasn’t great, but I have a few newer ones that look much better. Jumping back to an old office, an old apartment, or my partner’s grandma’s house was a massive wave of nostalgia. In the future, I want to get better about taking more pictures like this, and I look forward to Apple improving the iPhone’s ability to take spatial videos and photos.

I took 2-3 pictures with my AVP, looked at them later, and felt like I was back where I took them. The 3D effect was amazing, and while the resolution is low and the edges fade out, it was still so cool to be “transported” back to that moment. I know we all cringed at the dad filming his kids on the AVP, but I think that will be more commonplace once people see the results of filming or taking a picture in the AVP. Again, the resolution needs to get better, but it’s impressive even now.

Side note: Apple needs to make it easy to take pictures of the UI inside the AVP using the same button used for pictures of your surroundings. All I found was screen recording in Control Center, which was oddly confusing. It gave me a list of programs, which I thought was a list of which program to record, but I think it was actually a list of which program should receive the final recording. Just odd, and at first I thought screen recording wasn’t working or was limited in some way. Bottom line: I want to be able to take just a picture, not only a video.

Laying down

I spent the majority of my time in the AVP on my couch but near the end of the night, I was getting tired and wanted to try it in my bed. Window management worked fine while more reclined but that’s also when I finally got around to trying Apple TV’s experiences. Unfortunately, there doesn’t seem to be a way to adjust the “horizon” angle for things like Apple TV’s theater/immersive mode or their immersive content (like the Highlining special). This means you need to be sitting up straight or standing to really appreciate them. For the Highlining special, I had to sit up more and even then I found some of the things I wanted to look at awkward or annoying. It was impressive (though I rolled my eyes plenty at some of the dialog) but I wasn’t able to fully enjoy it in bed. I might rewatch part of that today sitting up to see how it compares. Likewise, Apple TV’s immersive feature only works if you are sitting up. If you are lying down you have to look down too far to center the screen. If you just use a floating window you can place it anywhere, like at a better angle above you so you can relax while watching.

Passwords

I use 1Password to manage my passwords and its iPad app worked fine in the AVP. However, automatic password insertion does not work well at all. I think it worked once or twice (maybe in system apps?) but every third-party app I used would lock up when you selected the password suggestion above the keyboard. Even clicking the “key” icon would lock up the app. Your only recourse when this happens is to force-quit the app (Hold the Digital Crown and the top button at the same time until a macOS-like force quit dialog opens). At first, I thought this was limited to apps that used a webview but I ran into it with 100% native (iOS/iPadOS-native, not AVP-native) apps as well. You’ll need to copy and paste from your password manager for now. I’m not sure if it’s an Apple problem or a 1Password problem (I didn’t test with Apple’s built-in password manager) but I hope it’s fixed soon as copy/paste/text-selection in the AVP is just as annoying as it is on iOS.

Other Notes

In no particular order here are a few things I didn’t cover above:

  • I wish I could make some screens smaller or that the iPad app would switch to the iPhone app once I resized it under a certain threshold.
  • Petting my dog was sometimes interpreted as a tap/scroll, which was jarring.
  • I want to be able to organize my apps and mix the native/non-native apps on the icon grid.
  • I want Spaces for AVP. You can long-press the close button to hide other apps but I want to be able to go back to all my apps without having to reopen them. Spaces would be perfect so I could leave my chat apps in a different “space” and easily switch to it when needed.
  • You don’t have to press the Digital Crown to open your app grid: just look up at the Control Center icon, tap your fingers, look at the grid icon, and tap again. I wish I had realized this earlier, as I was frustrated with having to reach up to the headset every time I wanted to open a new app.
  • I didn’t experience any motion sickness with the AVP like I have with MQ2 but I also wasn’t playing any games (aside from trying a 2D game). The Highline special was the closest I came to “moving in VR while my body was stationary” and I didn’t have any issues with it.
  • The dinosaur experience is pretty cool and shows off the 3D video feature very well.
  • In full immersion, if you have a fan on you might see a flickering around your hands as the light changes slightly.
  • I was very disappointed that the excellent CallSheet app is not available for AVP. It’s a perfect use case and I have no idea why the developer, Casey Liss of ATP, disabled his app for AVP. I know he said he was working on an AVP-native version but come on, let us have the iPad app in the meantime. Marco allowed Overcast to work as-is which I’m thankful for.
  • I wish I could mark an audio app as being “surround” sound (I know that means something different in the context of audio, it’s the best I could come up with). It’s cool that you hear people on FaceTime or Zoom based on where they are on your screen but I didn’t want sound from my audiobook app, Prologue, to only come from where I physically (virtually?) placed it in my space.
  • I think I’ll return the $200 carrying case. I thought the exterior would be hard (fabric stretched over plastic), but it’s more like a puffy windbreaker jacket. It looks odd, so I’ll buy a much cheaper black case online. This was a case of “what’s another $200 when I’m spending this much already” that I now regret.
  • I didn’t notice the weight of the headset at all, as in it didn’t bother me. My MQ2 was heavier and started to bug me after a while of wearing it, but I wore the AVP for ~6 hours straight with no issues.

Conclusion

In wrapping up my initial test drive with the Apple Vision Pro, it’s clear this device isn’t just another tech gadget; it’s a bold leap into a future we’ve only seen in sci-fi flicks. Despite not hitting the mark as a MacBook Pro monitor replacement, at least not yet, it has revealed possibilities that excite the imagination. From the surreal experience of viewing photos and panoramas alone, the AVP has thrown the door wide open on what “immersive” really means. Sure, there were bumps along the way (the blurriness outside the focal point, the teething issues with app integration, the occasional input fatigue), but these feel more like the minor problems of a first-gen product than deal-breakers. And while the killer app for the AVP remains to be seen, the potential for game-changing applications is palpable in every interaction. Like the first iPhone, the AVP hints at a revolution; as I decide whether to keep it, I’m swayed not just by what it is today but by what it promises for future iterations (its “iPhone 4” moment). The journey into mixed reality is just beginning, and if history is any guide, Apple will keep refining this vision. In the end, the AVP is more than a piece of technology; it’s a promise of what’s to come, and I, for one, am here for the ride.

