With my initial setup and first-time immersion into the world of the Apple Vision Pro coming to an end, I’m going to begin writing about some of the specific experiences I’ve been having with the device over the past two days: Saturday and Sunday, February 3–4, 2024. I spent most of Saturday exploring the default visionOS apps, I installed and tried several “Compatible Apps” that I felt would be useful on the Apple Vision Pro, and I made a surprisingly high number of FaceTime calls. Since it was the weekend, I had to fit in my normal errands and housekeeping tasks, but I still logged many hours on the device.
Incidentally, I’m also taking the opportunity with this post to create a first draft of my blog writing on the Apple Vision Pro itself. I’m using the same Pages app document, but I have it open in a large virtual window in the middle of my living room. Since typing on the keyboard floating in the air is a fairly imprecise venture, I am using a combination of Siri dictation and the look-and-pinch air-typing method. As long as I keep the perspective that this initial text input is a draft, getting the thoughts onto the digital page is quite feasible. (I’ll also take a screen capture of this user interface so it is more easily envisioned by readers of this blog.)
Looking back on the entire Apple Vision Pro experience thus far, I believe that the most significant learning happened yesterday (Saturday) during a very long FaceTime call with my good friend Tom. For the last several years, Tom has lived in the Czech Republic near Prague. I consider Tom a fellow Apple aficionado, and he even spent some time working as an engineer at Apple in Cupertino (in the NVH discipline: Noise, Vibration, and Harshness).
During our FaceTime call, Tom's curiosity very much complemented mine, and he was patient enough not only to give me ideas about things to try on the Apple Vision Pro, but also to let me go down my own rabbit holes in the moment and demonstrate them to him. The majority of our call was spent using the FaceTime feature that let me share my screen with him, so he could see what I was seeing in the visionOS interface in my field of view. Now that I’ve seen how choppy that shared view is, I sincerely thank him for the hours he spent dealing with my jittery head movements as he watched his computer screen. I’m surprised he didn’t complain of vertigo!
It was sometime during my FaceTime with Tom that I made my first major Apple Vision Pro “discovery.” While I had logged many hours of Zoom, FaceTime, WebEx, and other videoconference calls before the Apple Vision Pro, the Apple Vision Pro experience was somehow different from, and inexplicably superior to, those previous video calls. I will attempt to elaborate on the details.
First, Tom and I were effortlessly face-to-face, and the spatial audio made me fully believe that he was talking directly to me from his position in the room, not from 4,500 miles away. By “effortless” I mean that I could place his video window anywhere I wanted, at any angle, and then go about our conversation with complete freedom to move around and use my hands; I didn’t need to prop up an iPad, hold an iPhone, or position a laptop. At one point I placed his FaceTime window directly across from me while I sat at a table. Many times in our conversation I was convinced that Tom was in the room with me. Tom had visited me in person a few weeks ago, so we had recently had conversations here in my home. The Apple Vision Pro made it feel as if he had somehow been transported back and we were simply continuing our conversation, and we talked for hours. Tom was here.
The Spatial Audio also contributed to the realism: wherever I positioned Tom’s FaceTime window, his voice projected from that exact location, even relative to the position of my head. If I went to another room and forgot to bring Tom with me, his distant voice prompted me to reach through the wall, ceiling, or floor and pull him into my current room, and his voice followed.
While FaceTime and other video conferencing apps allow these types of conversations, the flexibility of the Apple Vision Pro made the experience completely different in my mind. Rather than picking up a device, watching the screen, worrying about my camera placement, and physically carrying Tom’s image around with me, his virtual window sat motionless in whatever location I chose. This may seem like a small thing, but these experiential details were not trivial; they made a big difference in my experience.
When I was not on FaceTime calls with Tom and other friends, I took the opportunity to take some extended deep dives into most of the visionOS apps, as well as many compatible apps not yet specifically written for the device. In general, I had great experiences with all the apps I tried. I will say a few things about each app category below.
Streaming Apps
The streaming apps I used and tested included Disney+, Max, Paramount+, and Apple TV+. Their interfaces functioned as I expected: they presented a range of programming for me to select, showed previews in large floating windows, and played episodes as I would have predicted. Notably, Disney+ includes four impressive immersive environments in its visionOS-native app. The level of detail in these immersive “worlds” was quite astounding. As I mentioned earlier, I was able to watch snippets of several Star Wars shows and movies in a digital rendition of Luke Skywalker’s fictional home planet, Tatooine.
The streaming apps were not without their issues. A couple of times an app stopped responding, and the only way out was to Force Quit it. As a Mac user since the early 1990s, I am amused that Force Quit is still a thing, and I’m glad I still have the option. While annoying, I attribute the situation to the fact that the Vision Pro is a brand-new device and still has some software issues. The vast majority of the time I did not experience major problems. And again, the spatial audio, immersive environments, 2-D content, and 3-D options in the streaming apps far exceeded my expectations.
Keynote
As far back as Friday afternoon, when I first experienced the Apple Vision Pro in the Apple Store demo, I was elated to continue delving into Apple’s Keynote app. I was a bit surprised, however, to find that the other two Apple productivity apps, Pages and Numbers, are not yet available as native visionOS apps. Even so, as I dictate now using Pages, I’m having very few issues using the app for the purposes I intend.
Now having spent more time with the Keynote app, I better understand why Apple started here. Among all of Apple’s productivity apps, Keynote offers the best opportunity to combine different types of media into a single visual format: text, images, graphics, shapes, and videos, along with Apple’s amazing builds and animations that make presentations come to life. Further, Apple took the extra time and care to create what I now feel is the beginning of one of the Apple Vision Pro’s, dare I say, “killer app” experiences: an immersive Keynote presentation rehearsal.
In addition to Keynote’s typical Play button, the app adds the ability to Rehearse your presentations in one of two empty rooms: an Apple Store board room and the Steve Jobs Theater at Apple Park in Cupertino. As I described in an earlier post, my very first experience in the virtual Steve Jobs Theater felt entirely real to me. Now, days after that initial experience and after revisiting the feature a few times, I have concluded that this is among the most important features of the device I have experienced thus far. Granted, it’s currently only two immersive spaces, but I could see this concept eventually being built out into many different rehearsal opportunities beyond two empty rooms.
When I get the opportunity to practice in a dedicated performance space, I consistently feel different than when I’m standing in my home living room or sitting in my office rehearsing the same presentation alone. The mass of the empty room, along with the empty seats in front of me, lends a level of realism and importance that I can’t get in a typical environment. Indeed, it’s not every day I get to stand on the stage of a theater, let alone the Steve Jobs Theater. I could not help but wonder: what if I were a person who never had the access or opportunity to present in a large space? In my current school administrator role, I am privileged to have access to large performance spaces when they are not being used by students or other groups, but even the students and staff with whom I work don’t have the relatively easy access I do to these facilities. The feeling I got standing in the virtual Steve Jobs Theater was as real to me as when I’m standing on the stage of the Barrington High School auditorium.
Over a year ago, I began having conversations with my friend and colleague, Phil Hintz, regarding the promise of augmented reality and virtual reality in education, especially for the purpose of creating simulations. Phil is formally examining this topic as part of his graduate school studies, and we have talked several times about creating simulations in the context of preparing high-functioning special needs students for the workforce. However, like any well-conceived accessibility option, we also acknowledged that any person could potentially benefit from these types of experiences. The realism I felt in this public speaking rehearsal is the closest thing I’ve ever experienced to a true simulation come to life. I can now understand how simulated interactions with people, machines, and processes could become invaluable educational experiences that feel so real they might as well have been the real thing.
Re-creating the Apple Store Demo at Home
As nearly any educator will tell you, the best way to learn something is to teach it. On Sunday I was able to meet in person with two of my friends, and I was thrilled to share some of what I had learned from my Apple Vision Pro experience so far. My plan was to see my friend Kyril in the morning to give him the opportunity to try the Apple Vision Pro for the first time in my living room.
Before Kyril’s arrival Sunday morning, I gathered my thoughts and set up for a demo that I hoped would mirror the Apple Store experience as closely as possible. First and foremost, I wanted to figure out how to make one of my devices function as a shared screen so I could easily see what Kyril was seeing and guide the experience. This is similar to the setup Sean used in the Apple Store as he was teaching me how to use the interface. Using Control Center on the Apple Vision Pro along with the Guest User setup, I made a fairly good rendition of the Apple Store experience, with Kyril’s field of view appearing on the screen of my 11-inch M1 iPad Pro. Unfortunately, in this setup I have not yet figured out how to make the Spatial Audio work properly on the Apple Vision Pro while sharing the screen.
Previous Content on the Apple Vision Pro
Another friend of mine and fellow Apple Vision Pro owner, Nour, visited in the afternoon. I have always had the impression that I capture a higher-than-average number of photos and videos, and now that I have the Apple Vision Pro, I consider myself lucky to have such a vast array of media in my Photos library. Among the more than 75,000 photos I have captured since 2014, nearly 800 are Panoramas. I have far fewer Spatial videos among my nearly 12,000 videos, but that technology was only made available to non-developers a few weeks ago. In the meantime, I have been contacting friends and family who I know own iPhone 15 Pro devices and asking them to please send me Spatial videos.
I am most delighted by my selection of Panoramic photos. Frankly, panoramas are pretty lame on a typical screen: they are either long and thin, forcing you to squint to see details, or you must zoom in and scroll side to side. With the Apple Vision Pro, panoramas become 180º vistas that, in some cases, transport you back to the location. In my case, I found myself back on top of a few mountains in Switzerland, standing at landmarks in Berlin, taking in Times Square on a chilly night, reliving a lights festival in Prague, and sitting on the crowded lawn of the Ravinia Festival here in Highland Park. Nour and I have been exchanging our own panoramas as we find good examples.
Coincidentally, my brother recently purchased an iPhone 15 Pro, and he had just landed on Sunday on the island of Saint Kitts. I texted him directions for turning on Spatial video, and he has been sending me videos of his tropical resort and the surrounding area ever since. I now need to get in the habit of shooting more Spatial video myself, and if you have an iPhone 15 Pro, I’d love to see some of your Spatial videos.