Saturday, February 24, 2024

Creating on Apple Vision Pro: Drawing Shapes

This is the first article in a series of Creating on Apple Vision Pro articles. In this series, I will challenge myself to complete a similar creative action on all the Apple devices I regularly use, using the same app(s), the same tools, and the same features across different devices and operating systems. To begin the series, I selected an exercise in drawing shapes using Apple’s standard shape drawing tools in macOS, iPadOS, and visionOS.

To help somewhat level the playing field, I will begin on the device I usually use, or prefer to use, for a given activity, and complete it without practice. In this case, I usually use macOS in Keynote to create shape-based drawings. The tools are essentially the same across devices with the differences being the manner in which the user interacts with the user interface (UI) elements.

In this example, I will use the shape drawing tools to create a simplified version of my #AppleVisionProEDU logo. I created the actual logo in Keynote on macOS. In this case, I will draw a simplified version of the logo using several features I regularly use:

  • Draw a variety of circle and ellipse shapes
  • Use the Unite Shapes and Subtract Shapes features
  • Use the Gradient Fill tool
  • Add a text box, change the font, size, and center the text
  • Draw a diamond shape

Here is the basic design I’ll recreate four times:


The four apps and versions I will use include:

  • Keynote on macOS
  • Keynote on iPadOS
  • Keynote on visionOS
  • BONUS: Pages on visionOS—This will give an idea of the differences between using the same tools natively on visionOS and in an iPadOS app running on visionOS

As one metric for this article, I will report the time it took me to create the same shape drawing in each of the four app versions. Perhaps more importantly, I will report my experiences using the UIs of each OS.

Keynote on macOS

I have been drawing logos and shapes using the draw tools in Keynote for macOS for over 2 decades. I use these shape creations in presentations, logos, and videos. When the Bézier pen tool was released in Keynote in about 2011, I started using Keynote (and later Pages) as my primary illustration app. Here are all the basic shapes required to draw the simplified logo (shown in different colors with each shape selected for the sake of this explanation):

To level the playing field, I did not practice the original drawing in advance; I just started drawing from an image in my head. The unrehearsed drawing took 4 minutes 26 seconds on macOS.

It’s easy for me to take drawing in Keynote for macOS for granted, but the UI features that set the macOS version apart are mostly keyboard shortcuts and multi-touch trackpad features. A couple of UI elements are a bit more difficult to find in macOS; namely, the Unite and Subtract Shapes features are buried under the Format > Shapes and Lines menus. However, my long-term experience in the app/OS made this drawing almost effortless.

 

Keynote on iPadOS (M1 iPad Pro 11-inch)

Creating drawings with shapes in the iPadOS version of Keynote is my second-favorite option, and I have a fair amount of experience from the many sketchnote projects I draw using Apple Pencil on iPad. I frequently access the shapes tools for drawing/tracing guides and to help with layout—even if you never see the shapes in my final sketchnotes because I delete these guides.

For the iPadOS on iPad version of my drawing, I used the exact same methodology, but used the touchscreen features of iPad and the iPadOS touch UI. This logo took me 7 minutes, 25 seconds to complete, nearly 3 minutes longer than macOS.

To me, the multi-touch interface on iPadOS is nearly as efficient as macOS, and I would cite no problems using the iPadOS version. I speculate that a combination of having less overall iPadOS experience and the fact that I find the iPadOS UI slower to navigate may be the reason for the increased time. However, iPadOS feels more precise in moving objects because you are touching them “directly” (not using a trackpad), and I had already drawn this logo once—factors that should have increased efficiency and decreased time. Overall, this activity did not seem overly complicated, nor did I feel it took too much time to complete.

Keynote on visionOS

Please note that this was not my first time drawing shapes on visionOS. (I wrote about these experiences earlier on this blog.) Also be reminded that this was my third time drawing this logo, so I was well practiced by now. This visionOS Keynote version of the drawing took me, surprisingly, 6 minutes 33 seconds to complete—about 1 minute faster than iPadOS on iPad! I was fully expecting to add at least a full minute due to my inexperience with the eye-tracking and pinching visionOS UI, but I was proven wrong.

However, I have several visionOS UI issues to report. At this point in my experience, the drawing process did not feel “natural,” and many times throughout I felt a bit of frustration.

1. It is very easy to accidentally select a menu in visionOS, but it is not yet natural for me to dismiss a menu I didn’t want. To make a menu without a close button “go away,” you need to look “nowhere” on the screen—in a blank area—and tap your fingers. There isn’t always a blank area to look at, so it’s easy to bring up another unwanted menu. Frustrating—but tolerable.

2. Moving objects with resize handles is somewhat of a gamble, especially if the object is small. You need to look at the center of the object, pinch, and hope your drag moves and doesn’t resize the object. I found myself focusing all my will on the center of circles hoping that my "psychic power" would ensure I would move the object—oddly, this usually worked!

3. Drag-select is unpleasant. Frankly, I have no idea how I made this function work—sometimes it did and sometimes it didn’t. To drag-select, you look at “nothing”—in the middle of a blank space—and tap-and-drag while desperately hoping the interface understands that you want to drag-select over objects. This is about 800% more effort than macOS and iPadOS. On visionOS, there is no feedback that your eyes are “locked” into a place that will trigger the pinch to cause drag-select to function. This UI feature apparently functions on hope.

4. Resize handles offer sufficient visual feedback because they slightly change color while you look at them. When objects are small and/or close together, this action is still challenging, and I found myself using my “psychic powers” to ensure I was looking precisely in the exact right place. It mostly worked.


My description of using my “psychic powers” to ensure precise eye-tracking is meant tongue-in-cheek, but I did it many times, and I wonder how many other users are doing it. Despite these issues using visionOS for this drawing example, the stopwatch didn’t lie—I still completed the activity in less time than on iPadOS. However, it felt longer, and the UI caused friction.

BONUS: Pages on visionOS

Since the shape drawing tools are exactly the same in Pages, especially when the Pages document is set up as a Page Layout document (as I did for this example), I wanted to try the iPadOS tools on visionOS. The time for completing the drawing was 7 minutes 9 seconds—about the same as the iPadOS version on iPad. Since I suspect we will see a Pages for visionOS version relatively soon, I just have a few insights:

The overall experience was very similar to Keynote for visionOS. The side-tab and “ribbon” menus in iPadOS are functional, but the visionOS floating menus are far better for the eye-tracking/pinching UI. When a side-tab menu is selected in the iPadOS version, the workspace shrinks unnecessarily in the visionOS world. (This is odd because the iPadOS version on my iPad Pro uses a floating side menu—NOT the fixed side-tab menu that appears on visionOS, shown below.) Also, the tools and icons are spaced further apart on visionOS, allowing for more precise selections, even if they are placed in yet another set of on-screen locations you need to learn.

Pages for iPadOS on Apple Vision Pro

In iPadOS, there is no visual feedback on whether or not you have selected a resize handle; you have only your hope that the thing you are looking at is selected, and hope is not enough for a functional OS. Note to the Apple Human Interface team—hope alone does not make a functional UI element.

Resize handles on Pages for iPadOS on Apple Vision Pro

Conclusion

To conclude this first article in my series on Creating on Apple Vision Pro, I decided to make a video of drawing this example using Keynote for visionOS. I sped up the video so it is just 1 minute long. However, I was surprised yet again by the actual time of this example—the fifth time I completed this activity in 4 different formats—my time was 5 minutes 26 seconds on visionOS. This is just 1 minute longer than my original, unpracticed macOS version. 

Even with its imperfections, visionOS is impressive as a creation tool for drawing with shapes.

Sunday, February 18, 2024

Learning by Teaching—Apple Vision Pro Demonstrations & Debriefs

As an educator, I have always felt that the best way to learn something is to teach it. In addition to my own time with the Apple Vision Pro for the past 15 days, I've also taken the opportunity to provide demonstrations to some of my friends and colleagues. So far I have provided 6 demonstrations, each lasting between about 45 minutes and just over 2 hours. In some cases, my friends or colleagues expressed interest in a demo, while in other cases, I solicited feedback. I worked with a variety of people with backgrounds in education, provided similar demonstrations, and debriefed the experience with the same 4 questions.

My first demonstration was just 2 days after my Apple Store demo, and it was with a friend of mine outside of education who is very interested in Apple devices. It was during this demonstration that I got the idea for this action research activity. I did not include the results of the first demo here since I used that opportunity to create the sequence of activities, note some of the issues I planned to observe, and write my follow-up questions after this practice demonstration was completed.

This article reports the results of what I learned providing Apple Vision Pro demonstrations to 5 friends and colleagues with backgrounds in different aspects of public education. I will refer to those with whom I worked by their initials. Here is a short description of each person:

  • ST: A long-time Apple enthusiast with many years of experience providing technology hardware and software support in schools. ST has decades of experience using macOS, iOS, iPadOS, and other Apple systems.
  • HO: A friend and colleague who provides education support to many people and groups in my school district and who has over 10 years of experience using Apple devices (primarily macOS and iOS).
  • TG: An innovative friend and colleague with over 20 years of experience in education as a teacher and administrator. TG is in an educational technology leadership role with experience using Apple devices running macOS, iOS, iPadOS, and several other systems.
  • TH: A colleague in a district leadership role providing educational support who is relatively new to using Apple devices but has adapted quickly to their operating systems (primarily macOS). TH had no idea the Apple Vision Pro had been released and heard about it for the first time upon seeing it in my office.
  • ZE: A colleague with about 20 years of experience as a teacher and administrator, ZE has used Apple technology as a tool for at least the last 5 years. ZE found the Vision Pro intriguing and agreed to be part of this demo experience after I asked.

Each demonstration was scheduled for at least 45 minutes. When a demonstration began, I gave brief, basic instruction about how to pick up the Apple Vision Pro safely; how to look, pinch, and drag; how to adjust the fit dial on the Solo Knit Band; and I showed the locations of the Digital Crown and top button. I then mirrored the Vision Pro Display on my M1 11-inch iPad Pro, set the Vision Pro to Guest User mode, removed my prescription lenses, and handed the device to the Guest User.

During the third demo, I realized that I needed to specifically set the audio source to the Vision Pro while mirroring. Before the third demo, the audio settings were less consistent, but each demonstration included Spatial Audio for some or all of the experience.

Since these demonstrations were with colleagues I know and trust, I made the decision to leave my iCloud account logged in and accessible. Further, I was able to watch the majority of each demo using the mirroring feature of the Apple Vision Pro. This setup may not be advisable for those who do not wish to share their personal data with others—particularly Photos and Documents. Some Apple Vision Pro owners may wish to either limit the Guest User account (an option available in the Guest User setup) or create a demo Apple ID for the purpose of conducting demos.

Here is the general sequence I used for my demos:

My Guest User Setup (about 3 minutes)

  • Explain how to pick up the Apple Vision Pro safely (thumb through the nose bridge opening and other fingers across the top—be careful of the parts attached by magnets)
  • Share Vision Pro screen with iPad screen (and set Audio to Vision Pro)
  • Switch to Guest Mode in Control Center
  • Explain fit dial on the Solo Knit Band
  • Show locations of Digital Crown and top button
  • Remove my prescription lenses
  • Explain that the first few minutes are very important to get a good fit and set up eye-tracking and hand gestures

Demo (at least 45 minutes)

  • Ensure a good fit by adjusting the fit dial on Solo Knit Band
  • Observe and assist in the fit as needed
  • Direct to Digital Crown
  • Observe hand gesture practice
  • After setup, explain video passthrough (“you are looking at a video of the room, not the room…”)
  • Access Home Screen with the Digital Crown
  • Open Photos app
  • Try scrolling (side-to-side, up-and-down)
  • Access and view Panorama photos in Panorama mode (from tab bar at left)
  • Access and view Spatial Video in and out of full-screen mode (from tab bar at left)
  • Teach about bottom-center close button and the “dash” window control to move windows
  • Back to Home Screen, select one or more Immersive Environments (from tab bar at left)
  • Use the Digital Crown to increase the Environment to full 360°
  • Suggest the participant stand if comfortable
  • Back to Home Screen, launch Keynote app
  • Select a presentation (from iCloud account)
  • Rehearse presentation in Theater and/or Conference Room (point out options under the ... button)
  • Back to Home Screen, launch Encounter Dinosaurs app
  • Interact as comfortable (hold out hand for butterfly, stand to offer hand to dinosaurs)
  • If time/interest: back to Home Screen, launch Disney+ app
  • Turn off mirroring on iPad*
  • Select an Environment and look around (4 are available from tab bar at left)
  • Select movie or show to watch inside Environment
  • Play this by ear—If the user is comfortable with navigating visionOS at this point, ask them to access the Control Center and guide them through selecting the mirror to iPad option. This can be tricky, and may be unsuccessful for some users.
  • Back to Home Screen, launch Safari, Apple Music, and Photos apps to experience multiple open windows
  • Ask if there is anything else they would like to see (as interest/time permits)
  • Take off Apple Vision Pro
  • Ask follow-up questions, or agree to wait for a time in the future

*During the Disney+ app demo, you will need to STOP mirroring on the iPad due to DRM (Digital Rights Management) issues. For this reason, you may want to skip the Disney+ app in some demonstrations to avoid the potential frustration of needing to direct the Guest User to access Control Center and teach them how to turn mirroring back on.

Some demo participants were so affected (one was “speechless”) by the experience that it was evident that they needed to wait one or more days to answer our follow-up questions. In one case, we waited to ask questions due to time constraints. I waited to ask follow-up questions in 3 of the 5 demos.

Cleaning Between Demos

During these demos, I considered hygienic issues for this device. Even though I know and trust my friends and colleagues and had no concerns about cleanliness, I found it slightly off-putting to share a device that I wear on my face for extended periods. The Apple Vision Pro User Guide includes a section on cleaning that is helpful for keeping each of its parts clean (Apple Support, 2024a and 2024b). However, I added an extra step between demonstrations: using 1 or 2 alcohol prep pads to wipe the gray fabric of the Light Seal Cushion. The alcohol evaporates quickly and adds a measure of cleanliness. Apple specifies, “Don’t use disinfectant wipes…to clean the Light Seal or Light Seal Cushion,” but since the prep pads contain only alcohol (and not bleach or other chemicals), I have so far had no issues.

Follow-Up Questions

The purpose of this blog—and this demo experience—is to learn about potential uses of the Apple Vision Pro in education. At the same time, using the Apple Vision Pro is a unique experience that is also fun. I did my best not to over-complicate this activity and make it in any way unpleasant. Thus, I kept my follow-up questions to just these four:

  1. What surprised you?
  2. How might you use this in your job?
  3. How might students use this?
  4. Do you have anything else to report about this experience?

I would be remiss if I did not point out the limitations of this action research and my conclusions. This is not a scientific study and should not be considered as such. All results reported here are observations subject to my biases. Also, only 5 participants were involved in this activity, they were not selected at random, and all have educational technology backgrounds that include using Apple technology. That being said, I feel what I learned from this experience was valuable in the context of the goals of this blog and my personal understanding of using Apple Vision Pro in an education setting.

General Observations During Demonstrations

Before I report the reactions from each of the demonstrations, I want to share several interesting observations I noted among those who took part. First, all participants estimated that they had spent less time using the Apple Vision Pro than had actually passed, a phenomenon I will discuss in more detail below. Here are a few other things I noticed:

ST—ST and I had planned a long demonstration—the longest of the group. As a result, ST began to experience some eye fatigue after about 2 hours. ST was also the first to experience significant trepidation with standing while wearing the Apple Vision Pro. ST was uneasy because, when looking down while standing, the user is unable to see their feet. ST exclaimed, “I lost my feet,” and balance and overall steadiness were affected while standing. ST is a retiree and speculated that his balance issues might be due to age, but two other participants experienced the same feeling, one of whom is about 30 years younger than ST.

HO—After the setup process, HO had difficulty using the eye-tracking and pinch controls, likely because I did not do a good enough job explaining the importance of fit before the setup procedure. After the setup, HO adjusted the position of the headset significantly, and it was necessary to repeat the entire setup process. This “false start” added about 10 minutes to the demonstration.

TG—Normally a reserved personality, TG expressed many times how the realism of the experience far exceeded expectations. TG was also negatively affected by not being able to see one’s feet when standing in immersive environments. More than any other participant, TG was awed by the 360° immersive environments and repeatedly expressed an interest in being able to “go there” and use environments to create a calm working environment.

TH—TH expressed that this was a completely new and unique experience, and the only thing that had ever come close was 3D IMAX—but Apple Vision Pro was far beyond that. Of all the demonstrations, TH was most emphatic about the possibility of using Apple Vision Pro to provide real-feeling experiential training simulations.

ZE—After setup, ZE's eye-tracking and gestures were not working correctly and had to be set up again. The reset added about 10 minutes to the demo, but did not greatly detract from the overall experience. In fact, ZE expressed the most “awe” from the experience overall and felt unprepared for how good an experience the Apple Vision Pro delivered.

Reactions to Follow-Up Questions

What surprised you?

ST—ST was most surprised by the immersiveness of the experience and reported, “it put me in another realm.” ST specifically described the sharpness of the graphics and the immersive sound provided through Spatial Audio, which he said was as good as any high-end sound system he had previously heard.

HO—HO felt “removed from my own reality,” and reported that it was easy to forget where you really are. HO believes that being immersed in that world and space likely contributed to the loss of time that was experienced while using the device.

TG—Among all participants, TG was the only one to report being surprised that the Apple Vision Pro was more comfortable than expected. TG also was impressed by how immersive it felt and was surprised at “how quickly I became comfortable with the UI [User Interface].”

TH—The reality of the details shown by the Apple Vision Pro was described by TH as the possible reason why the 3D experience felt so immersive.

ZE—Although ZE reported that this was their first AR/VR headset experience, the participant felt it was “beyond what you could anticipate” and was “blown away” by how experiential and real it felt.

How might you use this in your job?

ST—ST is retired, but engages in hardware repairs, audio/visual/writing projects, and keeps current with technology. ST mostly envisioned using the Apple Vision Pro for entertainment purposes and/or viewing multiple open windows simultaneously—similarly to how a widescreen display might be used.

HO—HO’s position requires considerable communication and collaboration among many individuals and groups, both tasks the Apple Vision Pro could support. HO specifically mentioned “meeting with people through Zoom or other apps.” Also, HO envisioned using the “grand” workspace for multitasking among many different apps.

TG—TG saw the immersive environments as particularly useful, and felt “self-imposed focus and isolation” and “deep work” would be good uses of the Apple Vision Pro. TG also saw potential benefits in the realism provided by the in-theater Keynote rehearsal feature and for communicating with the device using Zoom or other apps.

TH—Although TH did not have any specific examples of how the Apple Vision Pro could be used for the work TH does, the participant gave many examples of the potential benefits of harnessing the realism of the device to deliver true-to-life training simulations. Firefighter training was one specific example provided by TH that seemed to make good use of the Apple Vision Pro’s visual and Spatial Audio features.

ZE—When asked the question about how ZE might use the Apple Vision Pro in their job, this consummate educator immediately began answering the question about how this technology will be a “game changer” for students. After some probing, ZE did mention that the Apple Vision Pro would be useful to create immersive environments that might be more conducive for working. ZE expressed that this device could improve the opportunity to, for instance, “write a memo from Yosemite…with music in the background,” thus providing a potentially more focused and pleasant working environment.

How might students use this?

ST—ST’s immediate response to the student use question was that the Apple Vision Pro could be used to help “put someone in a different place so you could feel the culture.” ST gave several examples of experiential learning, including transporting a student to the top of the Swiss Alps (an example ST had just experienced from my Panorama photos collection), recreating battles of the Civil War from a first-person view, standing in the audience while Lincoln delivered the Gettysburg Address, and going back in time to experience the Egyptian pyramids and culture. ST felt that this device will allow a “next step in learning,” especially about other cultures.

HO—Rather than immersive environments, HO first focused on potential hands-on uses of the Apple Vision Pro with students, mentioning the ideas of creating graphics, experiencing art, closely examining anatomy for medical or nursing applications, working on parts of a car/removing parts digitally for automotive training, and other examples. HO specified that students could also use the device to be creative “beyond traditional mediums” in both 2D and 3D.

TG—TG conveyed many thoughtful and insightful responses regarding possible student uses in several categories. First, TG mentioned that students could engage in immersive field trip opportunities. TG mentioned that the Apple Vision Pro offers “different ways of doing things students currently do, but it can go much further,” such as exploring models and participating in simulations. However, TG acknowledged the current relatively low number of visionOS apps. TG also mentioned that students could benefit from visionOS’s ability to multitask and use multiple screens. TG said that as a result of this demo it was clear that “Spatial computing is ‘a real thing’ that allows a user to experience a new relationship to the User Interface of windows, open apps, [and other UI elements],” and added, “this is a new way of interfacing, this is not just a novel approach.”

TH—While TH struggled to name ways to use the Apple Vision Pro in their own work, TH provided the longest list of possible student uses of any participant. First, TH referenced a TV commercial reminiscent of the Apple Vision Pro experience during which a deaf child and his father visit a waterfall at Yosemite National Park and the child asks, in ASL, “Can you feel that?” (Subaru, 2023). TH believes that this device can allow “opportunities for children to get a sensory experience” in learning. TH then commented that the Apple Vision Pro could help address equity issues among students, allowing students with fewer resources to have experiences in places not otherwise accessible.

TH also gave a detailed example of how the Apple Vision Pro’s “full 360° experience” could provide real-feeling learning scenarios and simulations. One example discussed was firefighter training, complete with 360° video showing the chaos of fire, water, smoke, and other firefighters, with 360° Spatial Audio to make the experience even more real. TH mentioned that a similar feeling was felt during the Encounter Dinosaurs app demo, where the sound came directly from the dinosaur fight and from the dinosaurs growling and breathing at the participant. TH mentioned that when the dinosaur growled at you, “your instinct is to back up!”

ZE—Equity issues were also discussed by ZE in the context of allowing all students to participate in experiential learning opportunities, not just those coming from families with financial means. ZE gave examples of how the Apple Vision Pro will be a “game changer” for students, mentioning that the device could be used for learning by doing in real-life places and contexts, learning from real professionals, participating in experiential learning, and visiting places not accessible to all students. ZE gave experiential examples including visiting the zoo in the present day or visiting ancient Egypt in the past, and speculated that both experiences would be nearly as vivid as real-life experiences. These detailed experiences could lead to the possibility of improved writing opportunities and other assessments not possible outside of augmented and virtual reality.

ZE also mentioned the possibility of students experiencing real-feeling, 3D, how-to experiences they could watch from professionals completing tasks. One notable example was watching a surgery, where a student could easily pause, rewind, and look closely in 3D at elements of the surgery that would be impossible to experience even if they were live in the operating room.

Do you have anything else to report about this experience?

I added this final question as a “catch-all” opportunity for participants to share anything else we may not have discussed. Also, since I did not ask a question specifically probing for potential challenges or negative concerns, this final question allowed time to share these ideas if the participants wished.

ST—While ST didn’t feel a need after the demo to purchase a personal Apple Vision Pro, the participant noted that they may be interested in the next version of the device at a potentially lower price. ST also noted that wearing the device for over two hours resulted in some discomfort due to the weight on the front of the face, saying, “I felt like I had just finished an eye exam.” ST also noted that taking off the Apple Vision Pro felt like “coming out of a dark theater” and that there was a slight readjustment period while “re-adjusting to reality.”

HO—A few days after the experience, HO described the experience as “weird…you really want to be in there more!” The Disney immersive Environments were memorable, and HO “wanted to see more, explore more, and see what’s behind that wall.” HO was referring to the fact that while the immersive experiences are highly realistic, there is a limit to what can be explored and viewed. While a user can look at the front and sides of many virtual objects (depending on their placement), it is not possible to walk through doors or see all objects in full 360° views.

TG—After a day, TG had many observations and questions about the Apple Vision Pro, some related to brain research and/or the possible psychological effects of the device and the experiences it delivered. TG first mentioned that the entertainment aspects of the Apple Vision Pro could feel isolating since one is not watching movies with friends or family (as TG does in reality). TG also expressed what all other participants said in different ways: “the fullness of the experience can't be described, you have to do it.” Some of the questions TG posed as a result of the Apple Vision Pro demo experience included:

  • Might there be breakthroughs in education and other fields as a result of Spatial Computing?
  • Will Spatial Computing provide a better understanding of how the brain works for reasoning and thinking?
  • What kind of sensation/perception research breakthroughs might result from Spatial Computing?
  • What will Spatial Computing uncover about our brains?

One interesting point regarding metacognition TG and I discussed was related to attaching learning events to physical spaces and/or places. Research indicates that some learners attach memories to physical locations and materials, a technique referred to as the “method of loci” (described by Yates, 1966, and others). TG and I have both noticed that we sometimes inadvertently experience this phenomenon when listening to audiobooks and podcasts while driving: when we remember a particular point learned on the road, we also recall the location where we learned it. TG wondered if Spatial Computing might have a similar effect based upon the virtual/immersive setting of learning. For example, might I find myself transported to a volcano at Haleakalā National Park the next time I think about foveated rendering because that was the virtual location I had set in the Apple Vision Pro when I first researched the idea?

TH—TH answered the follow-up questions immediately following the Apple Vision Pro demo. The primary ideas expressed by TH were that this experience was completely unique and that no previous experiences were analogous to the one delivered by this device. TH also noted that one user interface (UI) element was uncomfortable to access—the Control Center. To access the Control Center, the user must look up to trigger a relatively small dot with an arrow to appear, tap it, and then interact with a series of floating windows. TH described this as uncomfortable.

ZE—ZE also answered the follow-up questions immediately following the Apple Vision Pro demo and commented on the UI of visionOS. While ZE found the immersive environments to be “incredible,” the interface was described as the least inspiring aspect of the experience. As an observer, I noted that ZE very quickly learned the new visionOS UI and adapted to it. However, ZE also had a “false start” that may have negatively affected the overall experience since the setup needed to be repeated before we could continue with the demo.

Overall Conclusions

Conclusions—What surprised you?

In all demonstrations, participants expressed surprise regarding the “time shift” they experienced using Apple Vision Pro. Participants estimated that they had been using the device for a shorter time than they actually had. This “VPST” (Vision Pro Standard Time) phenomenon should likely be researched more formally. Even the participants who experienced “false starts” and needed to redo the setup procedures perceived that they had spent less time, despite the potential frustration that could have made the experience feel longer.

Approximate time estimates were reported as follows:

  • ST—Estimated time at 90 minutes; actual time was over 120 minutes (about 133% of the estimated time).
  • HO—Estimated time at 30 minutes; actual time was 48 minutes (160% of the estimated time).
  • TG—Estimated time at 15–20 minutes; actual time was over 45 minutes (225% to 300% of the estimated time).
  • TH—Estimated time at 20–30 minutes; actual time was 42 minutes (140% to 210% of the estimated time).
  • ZE—Estimated time at 15–20 minutes; actual time was over 45 minutes (225% to 300% of the estimated time).
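
(To show how these percentages are computed: HO’s 48 actual minutes divided by the 30-minute estimate gives 48 ÷ 30 = 1.6, or 160% of the estimated time. In other words, HO used the device 60% longer than estimated.)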

In this demo situation, participants used the Apple Vision Pro for between 133% and 300% of the time they estimated. On average, participants wore the Apple Vision Pro about twice as long as they estimated. Even after 15 days, I personally still experience this time shift and feel I have been wearing the device for less time than has actually passed.

All participants also reported that they were impressed by how real and immersive the experience was. Three of the five participants reported having never worn a headset device before, and they all commented that the realism delivered by the Apple Vision Pro far exceeded their expectations.

Conclusions—How might you use this in your job?

All participants mentioned some potential uses for the immersive environments provided by the Apple Vision Pro in their jobs or education in general, with two participants specifically wanting the ability to use the device to work “inside” an environment of their choice more conducive to work.

The themes of communication and collaboration were also mentioned, specifically using the Apple Vision Pro for Zoom or other similar apps. Interestingly, the demo did not include videoconferencing, and none of the participants mentioned the Persona feature of the Apple Vision Pro that creates a simulated version of the user’s face. At least one of the participants had already experienced my Persona when I contacted them via FaceTime before the demo. Somewhat surprisingly, the beta Persona technology did not deter them from visualizing themselves using it. (Personas have been described by some reviewers as “horrifying.” Although this description is hyperbole, Personas can look unsettling at best—like mine shown below.)

Finally, two participants mentioned the Apple Vision Pro’s abilities to provide realistic simulations, one in the context of rehearsing a presentation in a theater, and the other for simulating a training scenario with realistic visuals and sounds.

Conclusions—How might students use this?

Although all participants had some difficulty or paused when answering the question about how they might use the Apple Vision Pro in their own work, none of the participants had an issue providing many possible uses for students. Participants provided both ideas representing current possible uses of the Apple Vision Pro, based on what they had just experienced, and speculative ideas about features, functions, apps, or content they had not directly experienced but assumed would be possible in the future.

The two most-mentioned ideas about possible student uses were that the Apple Vision Pro could be a factor in “leveling the playing field” for all students and that it could provide experiential learning opportunities not currently possible.

The possibility of the Apple Vision Pro addressing student equity issues was mentioned specifically by 3 of 5 participants. The fact that a $3,500+ device could potentially be an answer to equity issues is somewhat humorous to me. However, this may be an indication that the experience the Apple Vision Pro delivers is worth the relatively high cost—or at least it allows educators to see its potential learning experiences as truly valuable. Each participant reported a level of realism convincing enough that the device felt nearly as good as reality, and more than half felt that students would benefit by participating in experiences as real as those they had encountered.


All participants gave examples of experiential learning in several contexts, including, but not limited to:

  • Visiting places (e.g., geographical locations, zoos, professional locations)
  • Recreating historical events (e.g., Civil War, speeches, ancient cultures)
  • Experiencing art and culture
  • Creating art
  • Participating in job-based experiences (e.g., arts, medical, science, automotive)
  • Simulating environments (e.g., historical events, firefighting, medical)

Conclusions—Do you have anything else to report about this experience?

Responses to the questions were gathered after a range of time periods following each Apple Vision Pro demo. Some participants answered directly after the demo, and one participant waited two weeks to provide follow-up answers. In my opinion, the amount of time that passed did not affect the quality or quantity of responses. In fact, at least two respondents added more information to their responses days after they provided initial answers because new information had occurred to them. One participant later shared, “I long to revisit some of the immersive worlds” experienced on the Apple Vision Pro—in this case, the desert planet Tatooine, the fictional home of Luke Skywalker.

This catch-all question prompted discussions about user interface issues—mostly reactions to the controls and a couple of mentions of the frustration of accessing the Control Center. Some participants also took the opportunity to reiterate slight discomfort wearing the device and the fact that users cannot see their feet while standing in immersive environments.

However, all participants reiterated their surprise at the realism provided by the Apple Vision Pro in immersive environments. Several participants commented that they wanted to be able to visit these environments whenever they wished, both for entertainment and to create pleasant work environments. Further, all participants mentioned how this device could be used to provide experiential learning opportunities.

Final Thoughts

After the demonstration experiences I was able to share with these five participants, I am more convinced than ever that the Apple Vision Pro is on the road to becoming a valuable tool for learning. Indeed, I have personally used the device for my own learning over the past 15 days, and my hope is to pursue the ideas I learned from my friends and colleagues moving forward.

These demonstrations have provided me with a roadmap for selecting apps and trying new things discussed by participants. Some areas I will pursue match the findings from this report:

  • Experiential learning
  • Immersive environments
  • Realistic how-to experiences
  • Creating work-conducive environments
  • Expert-guided simulations
  • Virtual travel/field trips
  • Artistic and cultural experiences
  • Apps that allow users to create

Finally, many thanks to the friends and colleagues who participated in these demonstrations. I hope I provided a fun and worthwhile experience—and that everyone learned as much as I did. Thanks, too, to Sean at the Apple Store in Deer Park (IL) who led me through my first demo earlier this month and gave me a blueprint for my own demos. I further hope that these experiences will inspire other education leaders to learn through teaching by following some of the processes described above.


References

Apple Support. (2024a). Apple Vision Pro User Guide: Clean your Apple Vision Pro. Retrieved from https://support.apple.com/guide/apple-vision-pro/clean-your-apple-vision-pro-tan6200165e8/visionos

Apple Support. (2024b). How to clean Apple Vision Pro and accessories. Retrieved from https://support.apple.com/en-us/HT213964

Subaru. (July 31, 2023). A Beautiful Silence :30 [video]. Retrieved from www.youtube.com/watch?v=X_kxjt6gf1Y

Wikipedia. (2023). Method of loci. Retrieved from https://en.wikipedia.org/wiki/Method_of_loci

Yates, Frances A. (1966). The Art of Memory. Chicago: University of Chicago Press.

Saturday, February 17, 2024

My Experience with the Apple Vision Pro Battery So Far

Much has been written and commented about Apple Vision Pro battery life. This issue—apparently pressing to some—seems to be skewing negative on social media. I’ve already written on the topic myself, but I wanted to provide a brief, easy-to-digest article that is now informed by two weeks of use.

First, here’s Apple’s official description of the Apple Vision Pro Battery:

“The high-performance Apple Vision Pro Battery is made out of smooth, machined aluminum and connects to your Apple Vision Pro using a woven USB-C cable. It can slip into your pocket for portable power and supports up to two hours of general use, 2.5 hours of video playback, and all-day use when plugged in.” (Apple, 2024)


Negative mentions of battery life sometimes appear in the “cons” section at the beginning of articles in addition to within the commentary. For example:

PCMag: “Short battery life” (Greenwald, 2024).

Other examples include the Wall Street Journal: “...its battery life sucks...” (Stern, 2024), and Marques Brownlee on MKBHD: “...the battery life is meh” (Brownlee, 2024). The PCMag article expands on its negative review: “Battery life remains the biggest weak point for standalone AR/VR headsets across the board, and the Vision Pro is no different” (Greenwald, 2024).

However, I have also read plenty of reviews that do not mention battery life as a problem. Here are a few:

  • “I’ve consistently gotten 3 full hours of battery life using Vision Pro on a full charge. Sometimes a little more. In my experience, Apple’s stated 2–2.5 hour battery life is a floor, not a ceiling” (Gruber, 2024).
  • Joe Rossignol of MacRumors writes about Brian Tong’s video: “the Vision Pro may last up to 30 minutes longer per charge than Apple's advertised battery life claims, but results will vary.”
  • And finally, my all-time favorite tech reviewer David Pogue mentions, “[the battery] powers the headset for about two hours. You can also plug the battery into the wall, in which case the headset runs forever.”

So why the disparity? My guess is preconceived expectations. 

If a reviewer for some reason assumed that the Apple Vision Pro was a device that one wears all the time to experience some kind of Augmented Reality World of Tomorrow, I suspect the 2–3 hour battery life is seen as a negative. My experience with the Apple Vision Pro is that I am quite content to use it for a few hours at a time, mostly in a single position while plugged in to power—occasionally moving around for a few minutes at a time and/or to interact with an immersive experience on the device.

As Wes Hilliard first mentioned in one of the AppleInsider podcasts, I suppose if I wanted more portable power, I could plug the Apple Vision Pro battery into one of my portable, pocket-sized batteries that I use to power my iPhone and iPad. So far I have neither needed, nor wanted, to do this.

In writing this article, I did find some interesting battery information to pass along:

Apple Vision Pro Power/Battery Information

This information is adapted from Apple Support (2024).

Meaning of the Light on the Battery

While Plugged In

  • Green for several seconds: Battery fully charged.
  • Amber for several seconds: Battery is less than 100%, but can power the device.
  • Amber pulsing slowly: Battery charge is too low to power the device.

While NOT Plugged In

  • Green for several seconds: Battery is 50% or higher.
  • Amber for several seconds: Battery is between 5% and 49%.
  • Amber pulsing slowly: Battery charge is too low to power the device.
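
For readers who prefer to see this sort of lookup spelled out in code, here is a minimal, purely illustrative Swift sketch of the light-state table above. The BatteryLight type and its method are hypothetical names of my own; they are not part of any Apple framework:

```swift
// Purely illustrative sketch of the battery light table above.
// The BatteryLight type is hypothetical; it is not part of any Apple framework.
enum BatteryLight {
    case solidGreen    // green for several seconds
    case solidAmber    // amber for several seconds
    case pulsingAmber  // amber pulsing slowly

    /// The meaning of each light pattern, per Apple's support documentation.
    func meaning(whilePluggedIn pluggedIn: Bool) -> String {
        switch self {
        case .solidGreen:
            return pluggedIn ? "Battery fully charged."
                             : "Battery is 50% or higher."
        case .solidAmber:
            return pluggedIn ? "Battery is less than 100%, but can power the device."
                             : "Battery is between 5% and 49%."
        case .pulsingAmber:
            return "Battery charge is too low to power the device."
        }
    }
}

// Example: BatteryLight.solidGreen.meaning(whilePluggedIn: false)
// -> "Battery is 50% or higher."
```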

See the Battery Level

  • Put on the Apple Vision Pro.
  • Open Control Center.
  • Look in the top-right corner.

You may also consult Apple’s article, Apple Vision Pro Battery and Performance.



References

Apple Support. (February 13, 2024). Connect and charge Apple Vision Pro battery. Retrieved from https://support.apple.com/en-us/117740

Apple. (2024). Apple Vision Pro Battery. Retrieved from www.apple.com/shop/product/MW283LL/A/apple-vision-pro-battery

Brownlee, M. (February 3, 2024). Apple Vision Pro Review: Tomorrow's Ideas... Today's Tech! [Video].  Retrieved from www.youtube.com/watch?v=86Gy035z_KA

Greenwald, W. (February 9, 2024). Apple Vision Pro Review: An incredible leap in AR and VR interactivity. Retrieved from www.pcmag.com/reviews/apple-vision-pro

Gruber, J. (January 30, 2024). The Vision Pro. Retrieved from https://daringfireball.net/2024/01/the_vision_pro

Hilliard, W. (February 8, 2024). Apple Vision Pro early review: a peek into the future of computing. AppleInsider. Retrieved from https://appleinsider.com/articles/24/02/08/apple-vision-pro-early-review-a-peek-into-the-future-of-computing

Pogue, D. (February 11, 2024). Apple’s Vision Pro Isn’t for Everyone — But It’s Still a Home Run. New York Magazine. Retrieved from https://nymag.com/intelligencer/article/david-pogue-reviews-apple-vision-pro.html#

Rossignol, J. (January 30, 2024). Vision Pro Reviews: Surprising Battery Life, 'Weird' Personas, and More. Retrieved from www.macrumors.com/2024/01/30/apple-vision-pro-review-highlights/

Stern, J. (January 30, 2024). Apple Vision Pro Review: The Best Headset Yet Is Just a Glimpse of the Future. Wall Street Journal. Retrieved from https://apple.news/ALd4RY3b9Q0umjnCH4Srw9A

Sunday, February 11, 2024

Watching a YouTube Video on the Apple Vision Pro for Learning is a Miserable Experience

I have reported already on this blog that the lack of certain native visionOS apps is essentially a non-issue. The short list of well known companies not yet developing for Apple Vision Pro includes Netflix, Spotify—and YouTube (Sullivan, 2024). I’ve logged in to both Netflix and YouTube to watch videos on the Apple Vision Pro, and both websites function. However, I have a new experience to share.

Watching a YouTube video on the Apple Vision Pro for learning is, hands-down, the most miserable educational experience I have ever had. The problem is the user interface.

When I’m watching a video for the purpose of learning, I expect to be able to stop and start it, re-listen to or re-watch snippets at will, and simply pause when I need to stop and reflect and/or take notes. The YouTube interface, combined with the Apple Vision Pro’s eye-tracking and touch controls, makes these seemingly simple operations nearly impossible.

The problem is the placement of the Play/Pause button in YouTube and the scrubber controls at the bottom of the video:

Almost every time I attempted to pause the video, it jumped back and started from the beginning. The "hotspot" areas for the Play/Pause button and the scrubber controls are very small and seemingly overlap! It's easy to disastrously select the wrong control, causing you to lose your place and need to find it again—even though you were not paying attention to where you were. Then, if you get close to where you attempted to pause, another Play/Pause tap jumps you back to the start again! It’s an endless loop of frustration. By the time you find your place again, you have forgotten why you paused in the first place—your brain has long since moved on.

Here’s an example of me attempting to pause a YouTube video to quote Marques Brownlee using the Apple Vision Pro. I get the quote, but things go awry after that:

 

Please note that simply getting a native YouTube app on visionOS may not necessarily solve this problem—the interface could still be bad. And for simply watching videos with no frequent starts, stops, or re-watching, the YouTube website works OK. But if YouTube ever gets around to releasing a visionOS YouTube app, I hope YouTube considers us learners and teachers.


References

Brownlee, M. (February 3, 2024). Apple Vision Pro Review: Tomorrow's Ideas... Today's Tech! [Video].  Retrieved from www.youtube.com/watch?v=86Gy035z_KA

Sullivan, M. (January 26, 2024). With the Vision Pro, Apple has never depended more on developers for a product’s success. Fast Company. Retrieved from www.fastcompany.com/91017225/apple-vision-pro-developers-sales

Focus on Foveated Rendering

While reading and watching Apple Vision Pro reviews to prepare for a previous blog post, I was reminded of a technology term that I had heard for the first time back in June 2023 at WWDC (Apple’s Worldwide Developer Conference): Dynamically Foveated Rendering (Apple Developer, 2024). Fast-forward 8 months, and I now realize that I am experiencing this technology every day. Best of all, I had no idea that this technology was happening—except for one giveaway that I will discuss later—a fact that means the tech is functioning perfectly.

In his Apple Vision Pro review video, "Apple Vision Pro Review: Tomorrow's Ideas... Today's Tech!," Marques Brownlee gave an excellent explanation of foveated rendering. In part, he said, “It combines the insanely fast eye tracking with what's called foveated rendering, meaning, it's only actually rendering in high resolution exactly what you're looking at when you're looking at it.” He then pointed out that this kind of rendering depends upon the ultrafast processing power of the Apple Vision Pro to be able to track your eye movement and nearly instantly deliver a perfectly crisp focus for your field of vision exactly where you are looking, while the area in the periphery remains less focused.

The idea is that foveated rendering can greatly improve the power efficiency of the device: because the Apple Vision Pro is “aware” of where you are looking, it does not needlessly use resources to render the entire display at full resolution all the time.

Brownlee also noted that this effect is very apparent when taking screen captures with the Apple Vision Pro. Until he mentioned this, I had been somewhat frustrated by the highly variable quality of my screen caps. However, I now understand why screenshots are only focused in one place—where I was looking at the time. The following low-light screen capture illustrates this effect: I was watching Brownlee’s YouTube video and looked to the lower-right corner to resize the window when I made the capture. The window-resize corner tool is in perfect focus, but the focus radiates outward from that point and gets worse. To my eyes, everything I was viewing at the time appeared perfectly in focus, but seeing my full field of view in this screen capture reveals otherwise.


The concept of this technology is based upon human vision. A research paper from 2016 succinctly describes human vision as it relates to this technology. “Humans have two distinct vision systems: foveal and peripheral vision. Foveal vision is sharp and detailed, while peripheral vision lacks fidelity. The difference in characteristics of the two systems enable recently popular foveated rendering systems, which seek to increase rendering performance by lowering image quality in the periphery” (Patney et al, 2016).

Since these kinds of topics endlessly fascinate me, I started to delve into the technology. My first stop was a short article from 2016 about graphics leader Nvidia, in which Paul Miller of The Verge described early experiments pairing eye tracking with foveated rendering. He noted (in 2016), “Of course, this only works if you know where someone is looking, and none of the major VR headsets currently available do this.” The article also references a couple of then-new eye-tracking sensors along with Nvidia’s “foveation technique that can blur the periphery of an image while still maintaining the parts that humans perceive at the edges of their vision — color, contrast, edges and motion.”

Another source from 2016, the website Digital Trends, reported that, “according to Nvidia, foveated rendering systems have been in use for around 20 years” (Parrish, 2016). This got me wondering, how far back does this technology go? So I dug deeper and found studies related to this technology that are now nearly 30 years old.

I first found a wave of published papers from around 2003. The first, Foveated 3D model simplification (Cheng, I., 2003), was presented at the Seventh International Symposium on Signal Processing and Its Applications. The author explained a technique for “enhancements to 3D model simplification based on interactive level-of-detail update with foveation.” However, the earliest description of this technology I could find was from 1996, by Ohshima, Yamamoto, & Tamura. The abstract of their paper does not specifically use the term “foveated,” but the description of the technique is nearly identical: “This...new method of rendering for interaction with 3D virtual space [uses] geometric models of graphic objects...constructed prior to the rendering process. The rendering process first calculates the visual acuity... Second, the process selects a level from the set of hierarchical geometric models depending on the value of visual acuity...a simpler level of detail is selected where the visual acuity is lower, and a more complicated level is used where it is higher.”
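
To make the core technique these papers describe more concrete, here is a minimal, hypothetical Swift sketch of gaze-directed level-of-detail selection. The function names and angle thresholds are my own, chosen purely for illustration: this is not Apple’s or Nvidia’s implementation.

```swift
import Foundation
import simd

// Hypothetical sketch of gaze-directed level-of-detail (LOD) selection,
// in the spirit of Ohshima et al. (1996). Names and thresholds are illustrative only.
enum DetailLevel { case high, medium, low }

/// Angular distance (in degrees) between the gaze direction and the
/// direction from the eye to a point of interest.
func angleFromGaze(gaze: simd_float3, eye: simd_float3, point: simd_float3) -> Float {
    let toPoint = simd_normalize(point - eye)
    let cosine = min(max(simd_dot(simd_normalize(gaze), toPoint), -1), 1)
    return acos(cosine) * 180 / .pi
}

/// Pick a detail level: sharp near the fovea, coarser toward the periphery.
func detailLevel(forAngle degrees: Float) -> DetailLevel {
    switch degrees {
    case ..<5:  return .high    // roughly foveal region: render full detail
    case ..<20: return .medium  // near periphery: reduced detail
    default:    return .low     // far periphery: coarsest detail
    }
}

// Example: eye at the origin, gazing down -z, object slightly off-axis.
// detailLevel(forAngle: angleFromGaze(gaze: [0, 0, -1], eye: [0, 0, 0], point: [0.5, 0, -10]))
// -> .high (the object is under 3° from the gaze direction)
```

Real systems do far more than this (gaze prediction, smooth quality falloff, per-region rendering resolution), but the principle is the same: spend rendering detail only where the eye is actually looking.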

I also found an article about Apple’s patent on this technology. Patently Apple (2023) reported, “During Apple's WWDC23 introduction to their revolutionary Vision Pro Spatial Computing headset, Mike Rockwell, VP, Technology Development Group, came on stage to describe the key components behind the new device.” The patent, as quoted in the article, describes the technique: “The rendering process can provide foveated rendering by drawing regions based on gaze direction (e.g., drawing only some regions, drawing some regions at higher resolution/higher frame rate, etc.). In some implementations, the rendering process provides an increase in quality in a focus region and/or reduces the amount of computation and memory used in providing perspective correct rendering” (Patently Apple, 2023).

To me, this is an excellent example of an innovative idea being conceived as a theory, moving into early research and development about a decade later, advancing again roughly a decade after that, and finally reaching a stage where the original 30-year-old idea can be realized with current technology in a shipping product at a high level of implementation. Granted, this technology is not fully “there” yet—it takes a $3,500 ski-goggle-sized headset to deliver—but the processing power, imaging, power consumption, and display technology are perfectly demonstrated on the Apple Vision Pro right now in 2024—by the imperfect screen capture of the technology in practice.


References

Apple Developer. (2024). Discover visionOS: Unity. Retrieved from https://developer.apple.com/visionos/

Brownlee, M. (February 3, 2024). Apple Vision Pro Review: Tomorrow's Ideas... Today's Tech! [Video].  Retrieved from www.youtube.com/watch?v=86Gy035z_KA

Cheng, I. (August 2003). Foveated 3D model simplification. In Proceedings of the Seventh International Symposium on Signal Processing and Its Applications. Retrieved from www.researchgate.net/publication/4030859_Foveated_3D_model_simplification

Miller, P. (July 22, 2016). Nvidia's foveated rendering tricks for VR could improve graphics and immersion. The Verge. Retrieved from www.theverge.com/2016/7/22/12260430/nvidia-foveated-rendering-vr-graphics-smi-eye-tracking-siggraph

Ohshima, T., Yamamoto, H., and Tamura, H. (April 1996). Gaze-directed adaptive rendering for interacting with virtual space. 1996 Virtual Reality Annual International Symposium (VRAIS 96), IEEE Computer Society. Retrieved from https://ieeexplore.ieee.org/document/490517

Parrish, K. (July 22, 2016). Nvidia plans to prove that new method improves image quality in virtual reality. Digital Trends. Retrieved from www.digitaltrends.com/computing/nvidia-research-foveated-rendering-vr-smi/

Patently Apple. (2023). A new Apple patent describes Perspective Correct Vector Graphics with Foveated Rendering for Vision Pro, iPhone & more. Retrieved from www.patentlyapple.com/2023/06/a-new-apple-patent-describes-perspective-correct-vector-graphics-with-foveated-rendering-for-vision-pro-iphone-more.html

Patney, A., Kim, J., Salvi, M., Kaplanyan, A., Wyman, C., Benty, N., Lefohn, A., & Luebke, D. (2016). Perceptually-Based Foveated Virtual Reality. NVIDIA, SIGGRAPH 2016 Emerging Technologies. Retrieved from https://research.nvidia.com/sites/default/files/pubs/2016-07_Perceptually-Based-Foveated-Virtual/foveated-sig16-etech.pdf


Saturday, February 10, 2024

New Terms Introduced with the Apple Vision Pro

As an educator who has been involved with technology for nearly 3 decades, I feel a responsibility, when teaching others about new devices, systems, and software, to use the “correct” terms to describe physical attributes, actions, and interface elements.

Imagine you have been asked to teach a group of teachers who will later teach students how to use some new device and, although you know exactly how to do the thing you are teaching, you find yourself saying, "tap that square thingy with the arrow coming out of the top in the upper-right corner." Of course, I am referring in this example to the Share button, an important and often-used iPhone and iPad interface element. Early on, sometime during 2012, this fabled symbol acquired the name “squarrow” (a portmanteau of square + arrow). Its origins remain unknown to me, but within a week of my first hearing it, every student and teacher in my schools was suddenly using the term.

Over the years, I have found that the best places to find the terms for new technology are official user manuals and, in the case of Apple technology, the Apple Developer website. I have also found it helpful to cross-reference certain terms and elements in both locations to be sure the usage is consistent and/or to learn more details about other possible uses.

A new device with a new operating system (OS) almost guarantees a new set of terms, and Apple Vision Pro is no exception. This article will share as many of the terms as I can find, both to help me learn them and, hopefully, to help others who find themselves in a teaching situation.

In case you have never stumbled upon them, Apple publishes extensive and exceptionally well-written online User Guides that are updated the moment a new version of an OS is released. The Apple Vision Pro User Guide can be found here.

To my chagrin, I have noticed that Apple is NOT naming some of the interface elements, instead referring to them only by their icon in the description. Here is one example:

As a teacher, I’d like to be able to refer to that down-arrow-in-a-circle symbol by a name, but alas, it doesn’t have a name. For the purposes of this article, I will use an asterisk* to note any term not specifically used by Apple to name an interface element, and the name I use will be my best guess based upon other Apple OS conventions I have encountered over the years.

Physical Features of the Apple Vision Pro Device

Terms

  • Audio Straps
  • Cover
  • Digital Crown
  • Displays
  • Fit Dial
  • Head band (Dual Loop Band)
  • Head band (Solo Knit Band)
  • Light Seal
  • Light Seal Cushion
  • Power
  • Release tab
  • Top Button

Basic Interface Elements of the Apple Vision Pro

Terms

  • Home View (Apps View)
  • Tab Bar

Although Apple does not specifically state it anywhere I can find, the Home View comprises three possible views, all accessible from the Tab Bar (a bulleted list and the icons are shown here):

  • Apps View
  • People View
  • Environments View

In general, I have observed that "Environments," "immersive" features, and the "mountain with a sparkle" icon are used more or less interchangeably to refer to simulated virtual environments.

The following controls are used on the Apple Vision Pro for navigating basic features of visionOS and apps. The living room image used below originated on the Apple website, but I altered it to show various interface elements. (The view shown is for educational purposes and contains several interface elements that would not normally be shown simultaneously in visionOS.)

Terms

  • Close button
  • Control Center access button*
  • Resize handle*
  • Window bar

Names and Descriptions of Apple Vision Pro Hand and Eye Gestures

The following hand and eye gestures are discussed both in the User Guide and on the Developer site. I have combined the ideas into the bulleted list below; a short code sketch after the list shows how these gestures reach apps.

I have also noted, both in the documentation and at the Apple Store demo, that Apple specifies that a tap of any two fingers constitutes a "tap." However, in practice, and for the sake of clarity, I quickly adopted the habit of using only the thumb and forefinger as my tap gesture (either hand works equally well). I continue to have mixed results when I try other finger combinations for a tap gesture.

  • Tap your fingers together ("tap")—Select options, open apps.
  • Touch—Interact with elements with fingers (e.g., touch keys on the virtual keyboard).
  • Pinch and hold—Show additional options, zoom in and out.
  • Pinch and hold with both hands—Pull apart to zoom in, push in to zoom out.
  • Pinch and drag—Move windows/objects, scroll, resize objects.
  • Pinch and drag (pinch and quickly flick wrist, "swipe")—Scroll quickly.
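
For those curious about the developer side, the sketch below shows how I understand these system gestures to arrive in an app: on visionOS they are delivered through the standard SwiftUI gesture types (tap, drag, magnify), so no eye- or hand-specific code is needed. The view and variable names are my own placeholders, and this is a minimal sketch rather than production code.

    import SwiftUI

    struct GesturePlayground: View {
        @State private var offset = CGSize.zero
        @State private var scale: CGFloat = 1.0

        var body: some View {
            // A simple card that responds to the gestures listed above.
            RoundedRectangle(cornerRadius: 24)
                .frame(width: 300, height: 200)
                .scaleEffect(scale)
                .offset(offset)
                // "Tap your fingers together" arrives as an ordinary tap.
                .onTapGesture {
                    print("Tapped")
                }
                // "Pinch and drag" arrives as a drag gesture.
                .gesture(
                    DragGesture().onChanged { value in
                        offset = value.translation
                    }
                )
                // "Pinch and hold with both hands" (pull apart / push in)
                // arrives as a magnify gesture.
                .gesture(
                    MagnifyGesture().onChanged { value in
                        scale = value.magnification
                    }
                )
        }
    }

As I understand it, because the system translates eye-and-hand input into these standard gestures, much of this code runs unchanged on iPadOS and macOS as well.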

Inputs

As an aside, Apple specifies that three types of input are available in visionOS (a brief code sketch follows the list):

  • Indirect input—The eyes select the target of an interaction. The thumb+forefinger pinch starts the interaction, and other movements may follow (e.g., moving the pinched fingers to scroll).
  • Direct input—A finger occupies the same space as an onscreen item (e.g., tapping a key on the virtual keyboard or pressing a button).
  • Keyboard input—Input from a physical mouse, trackpad, or keyboard connected via Bluetooth or on a connected shared device.
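
One developer-facing detail worth noting, as I understand it: an app generally does not have to distinguish between these input types. The same action fires whether the user looked and pinched, reached out and touched, or clicked with connected hardware, and a custom view can opt into the system's gaze highlight without ever receiving the underlying eye-tracking data. Here is a minimal SwiftUI sketch (the names are my own placeholders):

    import SwiftUI

    struct InputDemo: View {
        var body: some View {
            VStack(spacing: 24) {
                // The same action fires for indirect input (look + pinch),
                // direct input (a direct touch), and hardware input
                // (e.g., a click from a connected trackpad).
                Button("Open") {
                    print("Button activated")
                }

                // A custom element can opt into the system highlight shown
                // when it is targeted; the app never sees where the eyes are.
                Circle()
                    .frame(width: 80, height: 80)
                    .onTapGesture { print("Circle tapped") }
                    .hoverEffect(.highlight)
            }
        }
    }

Standard controls such as Button already show a hover effect by default on visionOS; the explicit modifier matters mainly for custom views like the circle above.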

Other Features Specific to Apple Vision Pro

The following features, which are specific to Apple Vision Pro and not already mentioned above, are ones I have used several times in the last week. I suspect many of these will be useful to anyone teaching about the device.

Features

  • Take a screen capture
  • Force Quit an app
  • Capture a Spatial Photo or Spatial Video

Image Sources