Wednesday, March 6, 2024

Creating on Apple Vision Pro: Sketchnoting (Drawing by Hand)

This is part of a series of articles where I engage in creative projects that could be used in classrooms using the same apps, tools, and features across different Apple devices and operating systems. In this example, I draw a sketchnote by hand using my typical sketchnoting app, Keynote. I use the same Keynote file on iCloud and draw on 3 different platforms: iPadOS using Apple Pencil 2; macOS using trackpad and Keynote’s Pen tool; and visionOS using the Drawing features and “drawing in the air.”

I estimate that I have drawn over 200 sketchnotes over the past several years (here are some examples), and I have developed a method using what I believe is the best device, app, and tool for the process. For this exercise, I followed my process, but I altered the drawing methods to report on the differences among operating systems.

Here is the process I used:

  • Researched a topic, in this case, architecture from Fort Sheridan in Highland Park, IL, USA. (Fort Sheridan is a former military base with historic architecture converted to residential housing in the late 1990s.)
  • Compiled text research in the Notes app, along with URLs of images (mostly from the Library of Congress).
  • Planned a sketchnote in the Keynote app using Apple Pencil drawings, text boxes, and images set to 30% opacity to use the tracing method for sketchnoting. Since the exercise involves 3 operating systems/devices, I researched 3 structures and created a layout with a 3-panel design.
  • Since I prefer to use handwritten text in sketchnotes, I created the headline and most labels on iPad with the Apple Pencil.
  • Finally, I used my research and planning to draw each panel using 3 different platforms. Each panel includes a headline, a drawing, and some text bullets about the structure.

This project was an exercise in ever-increasing tedium. 

Beginning with the end in mind, here is the sketchnote I created for this activity. The timings indicate how long each panel took to create. I will elaborate on each panel below.

 

Keynote on iPadOS with Apple Pencil 2

I have several reasons I prefer Keynote as my primary sketchnoting app:

  • The app is free, available for all Apple operating systems, and Keynote files can be accessed on iCloud from all devices.
  • Easy set-up allows for quick starts, and all my sketchnote files can be easily formatted in HD format (1920 x 1080) on a white background using a 1-click template.
  • Keynote allows me to create unlimited slides to pre-draw and/or plan as I’m working—like a sketchbook with unlimited pages.
  • Drawing tools are accessed easily by tapping the Apple Pencil on the screen.

(I have only ONE complaint...APPLE: Let me turn OFF Auto-Center on the iPadOS version of Keynote so I can zoom in and draw all the way to all edges. This missing feature is VERY annoying!!)

For this activity, I used the iPad/Apple Pencil to plan the sketchnote so I could open it on all devices and easily draw each panel. This is the Drawing interface of Keynote on iPad using Apple Pencil:

I find drawing on the iPad screen with the Apple Pencil 2 quite natural. I also appreciate that I can efficiently plan my sketchnote layouts on iPad or Mac using the same file. It’s easy for me to drag in images from Safari, the Photos app, and Keynote’s searchable collection of shapes. I also use Apple’s well-designed emoji in some designs.

Drawing on iPadOS with Apple Pencil is the method I have practiced for more than 8 years—I began soon after the Apple Pencil was released (Apple Newsroom, 2015). As a researcher I disclose this as a bias, and as a sketchnoter (sketchnotist?) I acknowledge my many hours of practice. However, I am using this opportunity to try other methods I use less often, along with a brand new method.

For the record, the time it took me to set up the sketchnote and create the iPadOS/Apple Pencil panel was 28 minutes. The subject of this panel was Fort Sheridan’s iconic Water Tower. The Water Tower appears linear at first glance, but it includes several curves and flared edges that make its design more complex. These details are not major obstacles when drawing by hand with an Apple Pencil.

I also included a hand-lettered headline for the slide, and I planned the layout for the other two panels on the iPad using Apple Pencil. The planning and drawing activities took very little effort, caused me no frustration, and felt quite natural.

Keynote on macOS with Trackpad and Pen Tool

Somewhat oddly, Apple has chosen not to include the hand drawing tools in Keynote for macOS. The “Drawing mode” could theoretically be included and allow drawing on the trackpad with fingers. However, Apple does allow “hand drawing” on macOS using the Pen tool. This method is very close to that of professional illustration apps such as Adobe Illustrator, and the Pen tool works similarly to other apps offering Bézier curve drawing.

That being said, this method is slower for me, and I am not as practiced. Further, I made the exercise more difficult by using the Pen tool to hand-letter the sketchnote by drawing each letter by hand (since I hand-letter using the Apple Pencil). I allowed myself to copy/paste (and option+drag to duplicate) letters to save time. This hand-lettering method made drawing this panel somewhat laborious.

On the other hand, drawing the building—the Guard House—was not particularly difficult since I have some experience with the Pen tool. Although I am no expert in Adobe Illustrator—and Apple’s Pen tool has fewer features than pro-level illustration apps—the Keynote Pen tool is more than powerful enough for this kind of drawing. In general, drawing straight lines with the Pen tool is easy, and duplicating them using copy/paste or option+drag makes the process relatively painless. Drawing angles simply means clicking and dragging an endpoint of a straight line. Drawing arcs and curves involves adding or selecting points on a line, converting them to angles or curves, and then forming the points into a final shape. This takes practice, but I find the process straightforward. Here is the panel drawn in Keynote on macOS with the Pen tool:

To make this panel as stylistically close to Apple Pencil hand drawing in Drawing mode as possible, I chose a line style that resembles the pencil pattern and thickness of my Apple Pencil settings. If you look closely, the line pattern and weight are not identical, but from a distance they appear similar.
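If you are curious what a Bézier pen tool is doing conceptually, here is a minimal, hypothetical sketch using UIKit's UIBezierPath (Keynote does not expose its Pen tool programmatically, and the coordinates here are invented). It simply shows how a shape is assembled from straight segments plus curve segments shaped by control points, which mirrors the click versus click-and-drag workflow described above:

```swift
import UIKit

// Illustrative only: build a small outline from straight and curved segments,
// the same idea a Bézier pen tool exposes through clicks and drags.
let outline = UIBezierPath()
outline.move(to: CGPoint(x: 0, y: 100))               // starting anchor point
outline.addLine(to: CGPoint(x: 0, y: 40))             // straight segment (a simple click)
outline.addCurve(to: CGPoint(x: 60, y: 0),            // curved segment (click-and-drag);
                 controlPoint1: CGPoint(x: 0, y: 10),  // control points shape the curve
                 controlPoint2: CGPoint(x: 25, y: 0))
outline.addLine(to: CGPoint(x: 120, y: 0))             // another straight segment
outline.close()                                         // close the path into a shape
```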

Overall, this panel took 1 hour 16 minutes to complete. The Guard House drawing is fairly detailed, but most of the time was spent hand-drawing letters. Compared to the iPad with Apple Pencil, the activity took more effort, caused little frustration except for the increased lettering time, and felt less natural than hand-drawing.

Upon reflection, lettering was the major problem here. Writing a word with Apple Pencil takes seconds, while using the Pen tool took a few minutes per word. Further, letters written with the Pen tool look less natural, and the process was needlessly complex compared to the Apple Pencil.

Keynote on visionOS with “Air Drawing” (Eye-tracking & Pinching)

Compared to the two previous methods, hand-drawing in the air—what I call “air drawing”—on visionOS was the most unpleasant option. The process was frustrating, tedious, and difficult to the point of being nearly torturous. I feel that I have a high level of patience, and I was committed to seeing this project through, but I essentially gave up after more than 2 hours 30 minutes; however, I completed enough of the activity to provide a thorough review of the process.

I began by drawing the building in the panel, the Fire Station. It is the smallest building of the three with the least details. Unfortunately, the building drawing also includes its original sign that reads, “FIRE STATION No 1.” Drawing letters in the air is not efficient, to say the least.

The process of drawing on Apple Vision Pro requires a few steps. First, Drawing mode is enabled by looking at the Photo or Video icon along the bottom toolbar and selecting Drawing from the menu. This enables the same Drawing toolbar that the Apple Pencil uses. The pen and color settings from my previous iPad and Apple Pencil drawing were already selected.

With the drawing tool selected (I use the pencil tool), drawing on the Apple Vision Pro is accomplished by looking at where you wish to draw, tapping and holding your thumb and forefinger together, and moving your pinched fingers through the air. Fortunately, the visionOS interface places a light gray dot where it detects your eyes looking. Unfortunately, a few issues persist after you see the gray dot. First, the finger tap doesn’t always register, so the line doesn’t always begin when you start to draw. Second, the drawn line definitely does not match the shape and smoothness of your motion—every line inevitably displays as jagged and/or wavy.

The video below shows the process of hand-drawing a letter "F" using a tracing method. Drawing this letter took 4 minutes 30 seconds. After about 8 seconds, the video is sped up 10x (total runtime is about 30 seconds).


After a while, I started experimenting with ways of drawing better lines. Moving slower seems to always yield a more jagged line. Moving faster usually produces a smoother line, but it is seldom straight, and it starts and stops erratically. I soon learned to use the line selection tool to select a previously drawn line and modify it using the Resize tool. To access the ribbon menu that includes Resize, you must either drag-select to encircle a line or use a method I stumbled upon: pinch the fingers of both hands simultaneously after a line is selected to make the ribbon menu appear.

With the ribbon menu displayed, you must look at and tap a right-arrow to see the second set of commands, select Resize, and then select one of the blue resize handles to resize the shape by looking, pinching, and dragging. Depending on the size of the shape or line, it can be tricky to select the correct corner and to get the resize to work correctly.

 

Starting from a crooked but roughly horizontal or vertical line, the Resize method works fairly well for straightening it into a precise horizontal or vertical line. However, drawing an arc, circle, or any irregular line is hit-or-miss. A complex shape is best assembled from shorter line fragments. To draw the arcs needed for the Fire Station, I drew several small arcs until I found a roughly symmetrical one, then used the Resize tool and eraser to size the arc and remove extra parts of the line.

Hand-lettering with the Apple Vision Pro is among the most time-consuming and exasperating pursuits I’ve ever attempted. Expect to spend about 20 minutes writing each word. My original design included 5 interesting facts that would have added 12 additional words to my description of the Fire Station. According to my calculations (about 20 minutes per word), this would have added roughly 4 more hours to this design. The time was simply not worth the effort, so I stopped after 2 facts.

Overall, the relatively simple Apple Vision Pro Fire Station sketchnote panel drawing took me 2 hours 48 minutes to complete. It never got easier, despite constant practice. As a basis for comparison, I re-created the panel using Apple Pencil on iPad, and the drawing took 14 minutes:

As I mentioned above, while the Apple Vision Pro can be used to create a sketchnote, it is decidedly not the right tool for the job. I suppose if I needed a couple of hand-drawn lines or shapes and I didn’t have an iPad and Apple Pencil available, the Apple Vision Pro would allow me to produce them. However, a full sketchnote is simply not worth the time and frustration it would take to create a hand drawing with any level of detail. Also, hand lettering on Apple Vision Pro is an exercise in futility.

Potential Classroom Use

At this time, the Apple Vision Pro is not a viable tool for creating sketchnotes or other detailed hand drawings. The best tools I can identify for sketchnoting include an iPad with an Apple Pencil (or Logitech Crayon), followed by the Pen tool on macOS. After this activity, I would not wish to subject students or teachers to drawing by hand on visionOS until the tools are vastly improved and/or until better hand-drawing methods for the Apple Vision Pro user interface are created.



References

Apple Newsroom. (September 9, 2015). Apple Introduces iPad Pro Featuring Epic 12.9-inch Retina Display: Apple Pencil & Smart Keyboard Bring Breakthrough Levels of Precision & Utility to iPad Pro. www.apple.com/newsroom/2015/09/09Apple-Introduces-iPad-Pro-Featuring-Epic-12-9-inch-Retina-Display/



Saturday, February 24, 2024

Creating on Apple Vision Pro: Drawing Shapes

This is the first article in a series of Creating on Apple Vision Pro articles. In this series, I will challenge myself to complete a similar creative action on all the Apple devices I regularly use, using the same app(s), the same tools, and the same features across different devices and operating systems. To begin the series, I selected an exercise in drawing shapes using Apple’s standard shape drawing tools in macOS, iPadOS, and visionOS.

To help somewhat level the playing field, I will begin on the device I usually use, or prefer to use, for a given activity, and complete it without practice. In this case, I usually use macOS in Keynote to create shape-based drawings. The tools are essentially the same across devices with the differences being the manner in which the user interacts with the user interface (UI) elements.

In this example, I will use the shape drawing tools to create a simplified version of my #AppleVisionProEDU logo. I created the actual logo in Keynote on macOS. In this case, I will draw a simplified version of the logo using several features I regularly use:

  • Draw a variety of circles and ellipse shapes
  • Use the Unite Shapes and Subtract shapes features
  • Use the Gradient Fill tool
  • Add a text box, change the font, size, and center the text
  • Draw a diamond shape

Here is the basic design I’ll recreate four times:


The four apps and versions I will use include:

  • Keynote on macOS
  • Keynote on iPadOS
  • Keynote on visionOS
  • BONUS: Pages on visionOS—This will give an idea of the differences between using the same tools native on visionOS compared to an iPadOS app running on visionOS

For this article I will report the times it took me to create the same shape drawing in four app versions as one metric. Perhaps more importantly, I will report my experiences using the UIs of each OS.

Keynote on macOS

I have been drawing logos and shapes using the draw tools in Keynote for macOS for over 2 decades. I use these shape creations in presentations, logos, and videos. When the Bézier pen tool was released in Keynote in about 2011, I started using Keynote (and later Pages) as my primary illustration app. Here are all the basic shapes required to draw the simplified logo (shown in different colors with each shape selected for the sake of this explanation):

To level the playing field, I did not practice the original drawing in advance; I just started drawing from an image in my head. The unrehearsed drawing took 4 minutes 26 seconds on macOS.

It’s easy for me to take drawing in Keynote for macOS for granted, but the UI features that set the macOS version apart are mostly keyboard shortcuts and multi-touch trackpad features. A couple of UI elements are a bit more difficult in macOS; namely, the Unite and Subtract Shapes features are buried under the Format > Shapes and Lines menus. However, my long-term experience in the app/OS made this drawing almost effortless.

 

Keynote on iPadOS (M1 iPad Pro 11-inch)

Although creating drawings with shapes on the iPadOS version of Keynote is my second-favorite option, I have a fair amount of experience from the many sketchnote projects I draw using Apple Pencil on iPad. I frequently access the shapes tools for drawing/tracing guides and to help with layout—even if you never see the shapes in my final sketchnotes because I delete these guides.

For the iPadOS-on-iPad version of my drawing, I used the exact same methodology, but used the touchscreen features of iPad and the iPadOS touch UI. This logo took me 7 minutes, 25 seconds to complete, nearly 3 minutes longer than macOS.

To me, the multi-touch interface on iPadOS is nearly as efficient as macOS, and I would cite no problems using the iPadOS version. I speculate that a combination of having less overall iPadOS experience and the fact that I find the iPadOS UI slower to navigate may be the reason for the increased time. However, iPadOS feels more precise in moving objects because you are touching them “directly” (not using a trackpad), and I had already drawn this logo once—factors that should have increased efficiency and decreased time. Overall, this activity did not seem overly complicated, nor did I feel it took too much time to complete.

Keynote on visionOS

Please note that this was not my first time drawing shapes on visionOS. (I wrote about these experiences early on this blog.) Also be reminded that this was my third time drawing this logo, so I was well practiced by now. This visionOS Keynote version of the drawing took me, surprisingly, 6 minutes 33 seconds to complete—about 1 minute faster than iPadOS on iPad! I was fully expecting to add at least a full minute due to my inexperience with the eye-tracking and pinching visionOS UI, but I was proven incorrect.

However, I have several visionOS UI issues to report. At this time with my current experience, the drawing did not feel “natural.” Many times throughout the drawing process I felt a bit of frustration.

1. It is very easy to accidentally select a menu in visionOS, but it is not yet natural for me to dismiss a menu I didn’t want. To make a menu without a close button “go away,” you need to look “nowhere” on the screen—in a blank area—and tap your fingers. There isn’t always a blank area to look at, so it’s easy to bring up another unwanted menu. Frustrating—but tolerable.

2. Moving objects with resize handles is somewhat of a gamble, especially if the object is small. You need to look at the center of the object, pinch, and hope your drag moves the object rather than resizing it. I found myself focusing all my will on the center of circles hoping that my "psychic power" would ensure I would move the object—oddly, this usually worked!

3. Drag-select is unpleasant. Frankly, I have no idea how I made this function work—sometimes it did and sometimes it didn’t. To drag-select, you look at “nothing”—in the middle of a blank space—and tap-and-drag while desperately hoping the interface understands that you want to drag-select over objects. This is about 800% more effort than macOS and iPadOS. On visionOS, there is no feedback that your eyes are “locked” into a place that will trigger the pinch to cause drag-select to function. This UI feature apparently functions on hope.

4. Resize handles offer sufficient visual feedback because they slightly change color while you look at them. When objects are small and/or close together, this action is still challenging, and I found myself using my “psychic powers” to ensure I was looking precisely in the exact right place. It mostly worked.


My description of using my “psychic powers” to ensure precise eye-tracking is meant tongue-in-cheek, but I did it many times, and I wonder how many other users are doing it. Despite these issues using visionOS for this drawing example, the stopwatch didn’t lie—I still completed the activity in less time than on iPadOS. However, it felt longer, and the UI caused friction.

BONUS: Pages on visionOS

Since the shapes drawing tools are exactly the same on Pages, especially if the Pages document is set up as a Page Layout document as I did for this example, I wanted to try the iPadOS tools on visionOS. The time for completing the drawing was 7 minutes 9 seconds—about the same as the iPadOS version on iPad. Since I suspect we will see a Pages for visionOS version relatively soon, I just have a few insights:

The overall experience was very similar to Keynote for visionOS. The side-tab and “ribbon” menus in iPadOS are functional, but the visionOS floating menus are far better for the eye-tracking/pinching UI. When a side-tab menu is selected in the iPadOS version, the workspace shrinks unnecessarily in the visionOS world. (This is odd because the iPadOS version on my iPad Pro uses a floating side menu—NOT the fixed version of the side-tab menu used by visionOS shown below.) Also, the tools and icons are spaced further apart on visionOS, allowing for more precise selections, even if they sit in yet another set of screen locations to learn.

Pages for iPadOS on Apple Vision Pro

In iPadOS, there is no visual feedback on whether or not you have selected a resize handle; you have only hope that the thing you are looking at is selected. Note to Apple Human Interface Team—hope alone does not make a functional UI element.

Resize handles on Pages for iPadOS on Apple Vision Pro

 Conclusion

To conclude this first article in my series on Creating on Apple Vision Pro, I decided to make a video of drawing this example using Keynote for visionOS. I sped up the video so it is just 1 minute long. However, I was surprised yet again by the actual time of this example—the fifth time I completed this activity in 4 different formats—my time was 5 minutes 26 seconds on visionOS. This is just 1 minute longer than my original, unpracticed macOS version. 

Even with its imperfections, visionOS is impressive as a creation tool for drawing with shapes.

Sunday, February 18, 2024

Learning by Teaching—Apple Vision Pro Demonstrations & Debriefs

As an educator, I have always felt that the best way to learn something is to teach it. In addition to my own time with the Apple Vision Pro for the past 15 days, I've also taken the opportunity to provide demonstrations to some of my friends and colleagues. So far I have provided 6 demonstrations, each lasting between about 45 minutes and just over 2 hours. In some cases, my friends or colleagues expressed interest in a demo, while in other cases, I solicited feedback. I worked with a variety of people with backgrounds in education, provided similar demonstrations, and debriefed the experience with the same 4 questions.

My first demonstration was just 2 days after my Apple Store demo, and it was with a friend of mine outside of education who is very interested in Apple devices. It was during this demonstration that I got the idea for this action research activity. I did not include the results of the first demo here since I used that opportunity to create the sequence of activities, noted some of the issues I planned to observe, and wrote my follow-up questions after this practice demonstration was completed.

This article reports the results of what I learned providing Apple Vision Pro demonstrations to 5 friends and colleagues with backgrounds in different aspects of public education. I will refer to those with whom I worked by their initials. Here is a short description of each person:

  • ST: A long-time Apple enthusiast with many years’ experience providing technology hardware and software support in schools. ST has decades of experience using macOS, iOS, iPadOS, and other Apple systems.
  • HO: A friend and colleague who provides education support to many people and groups in my school district and who has over 10 years’ experience using Apple devices (primarily macOS and iOS).
  • TG: An innovative friend and colleague with over 20 years’ experience in education as a teacher and administrator. TG is in an educational technology leadership role with experience using Apple devices running macOS, iOS, iPadOS, and several other systems.
  • TH: A colleague in a district leadership role providing educational support who is relatively new to using Apple devices, but who adapted quickly to the operating systems (primarily macOS). TH had no idea the Apple Vision Pro had been released and heard about it for the first time upon seeing it in my office.
  • ZE: A colleague with about 20 years’ experience as a teacher and administrator, ZE has used Apple technology as a tool for at least the last 5 years. ZE found the Vision Pro intriguing and agreed to be part of this demo experience after I asked.

Each demonstration was scheduled for at least 45 minutes. When a demonstration began, I gave brief, basic instruction about how to pick up the Apple Vision Pro safely; how to look, pinch, and drag; how to adjust the fit dial on the Solo Knit Band; and I showed the locations of the Digital Crown and top button. I then mirrored the Vision Pro Display on my M1 11-inch iPad Pro, set the Vision Pro to Guest User mode, removed my prescription lenses, and handed the device to the Guest User.

During the third demo, I realized that I needed to specifically set the audio source to the Vision Pro while mirroring. Before the third demo, the audio settings were less consistent, but each demonstration included Spatial Audio for some or all of the experience.

Since these demonstrations were with colleagues I know and trust, I made the decision to leave my iCloud account logged in and accessible. Further, I was able to watch the majority of each demo using the mirroring feature of the Apple Vision Pro. This setup may not be appropriate for those who do not wish to share their personal data with others—particularly Photos and Documents. Some Apple Vision Pro owners may wish to either limit the Guest User account (an option available in the Guest User setup) or create a demo Apple ID for the purpose of conducting demos.

Here is the general sequence I used for my demos:

My Guest User Setup (about 3 minutes)

  • Explain how to pick up the Apple Vision Pro safely (thumb through nose ridge and other fingers across top—be careful of the parts attached by magnets)
  • Share Vision Pro screen with iPad screen (and set Audio to Vision Pro)
  • Switch to Guest Mode in Control Center
  • Explain fit dial on the Solo Knit Band
  • Show locations of Digital Crown and top button
  • Remove my prescription lenses
  • Explain that the first few minutes are very important to get a good fit and set up eye-tracking and hand gestures

Demo (at least 45 minutes)

  • Ensure a good fit by adjusting the fit dial on Solo Knit Band
  • Observe and assist in the fit as needed
  • Direct to Digital Crown
  • Observe hand gesture practice
  • After setup, explain video passthrough (“you are looking at a video of the room, not the room…”)
  • Access Home Screen with the Digital Crown
  • Open Photos app
  • Try scrolling (side-to-side, up-and-down)
  • Access and view Panorama photos in Panorama mode (from tab bar at left)
  • Access and view Spatial Video in and out of full-screen mode (from tab bar at left)
  • Teach about bottom-center close button and the “dash” window control to move windows
  • Back to Home Screen, select one or more Immersive Environments (from tab bar at left)
  • Use the Digital Crown to increase the Environment to full 360°
  • Suggest the participant stand if comfortable
  • Back to Home Screen, launch Keynote app
  • Select a presentation (from iCloud account)
  • Rehearse presentation in Theater and/or Conference Room (point out options under the ... button)
  • Back to Home Screen, launch Encounter Dinosaurs app
  • Interact as comfortable (hold out hand for butterfly, stand to offer hand to dinosaurs)
  • If time/interest: back to Home Screen, launch Disney+ app
  • Turn off mirroring on iPad*
  • Select an Environment and look around (4 are available from tab bar at left)
  • Select movie or show to watch inside Environment
  • Play this by ear—If the user is comfortable with navigating visionOS at this point, ask them to access the Control Center and guide them through selecting the mirror to iPad option. This can be tricky, and may be unsuccessful for some users.
  • Back to Home Screen, launch Safari, Apple Music, and Photos apps to experience multiple open windows
  • Ask if there is anything else they would like to see (as interest/time permits)
  • Take off Apple Vision Pro
  • Ask follow-up questions, or agree to wait for a time in the future

*During the Disney+ app demo, you will need to STOP mirroring on the iPad due to DRM (Digital Rights Management) issues. For this reason, you may need to skip the Disney+ app for some demonstrations to avoid the potential frustration of needing to direct the Guest User to access Control Center and teach them how to turn on mirroring.

Some demo participants were so affected by the experience (one was “speechless”) that it was evident they needed to wait one or more days to answer the follow-up questions. In one case, we waited to ask questions due to time constraints. I waited to ask follow-up questions in 3 of the 5 demos.

Cleaning Between Demos

During these demos, I considered hygienic issues for this device. Even though I know and trust my friends and colleagues and had no concerns about cleanliness, I found it slightly off-putting to share a device that I wear on my face for extended time periods. The Apple Vision Pro User Guide includes a section on cleaning that is helpful for keeping each of its parts clean (Apple Support, 2024a and 2024b). However, between demonstrations I added the extra step of using 1 or 2 alcohol prep pads to wipe the gray fabric of the Light Seal Cushion. The alcohol evaporates quickly and adds a measure of cleanliness. Apple specifies, “Don’t use disinfectant wipes…to clean the Light Seal or Light Seal Cushion,” but since the prep pads contain only alcohol (and not bleach or other chemicals), I have so far had no issues.

Follow-Up Questions

The purpose of this blog—and this demo experience—is to learn about potential uses of the Apple Vision Pro in education. At the same time, using the Apple Vision Pro is a unique experience that is also fun. I did my best not to over-complicate this activity and make it in any way unpleasant. Thus, I kept my follow-up questions to just these four:

  1. What surprised you?
  2. How might you use this in your job?
  3. How might students use this?
  4. Do you have anything else to report about this experience?

I would be remiss if I did not point out the limitations of this action research and my conclusions. This is not a scientific study and should not be considered as such. All results reported here are observations subject to my biases. Also, only 5 participants were involved in this activity; they were not selected at random, and all have educational technology backgrounds using Apple technology. That being said, I feel what I learned from this experience was valuable in the context of the goals of this blog and my personal understanding of using Apple Vision Pro in an education setting.

General Observations During Demonstrations

Before I report the reactions from each of the demonstrations, I noted several interesting observations among those who took part. First, all participants estimated that they had spent less time using the Apple Vision Pro compared to the time that had actually passed, a phenomenon I will discuss in more detail below. Here are a few other things I noticed:

ST—ST and I had planned a long demonstration—the longest of the group. Thus, ST began to experience some eye fatigue after about 2 hours. ST was also the first to experience significant trepidation with standing while wearing the Apple Vision Pro. ST was uneasy because, when looking down while standing, the user is unable to see their feet. ST exclaimed, “I lost my feet,” and balance and overall steadiness were affected while standing. ST is a retiree and speculated that his balance issues might be due to age, but two other participants experienced the same feeling, one of whom is about 30 years younger than ST.

HO—After the setup process, HO had difficulty using the eye-tracking and pinch controls, likely because I did not do a good enough job explaining the importance of fit before the setup procedure. After the setup, HO adjusted the position of the headset significantly, and it was necessary to repeat the entire setup process. This “false start” added about 10 minutes to the demonstration.

TG—Normally a reserved personality, TG expressed many times how the realism of the experience far exceeded expectations. TG was also negatively affected by not being able to see one’s feet when standing in immersive environments. More than any other participant, TG was awed by the 360º immersive environments and expressed many times an interest in “going there” and having environments available to create a calm working environment.

TH—TH expressed that this was a completely new and unique experience, and the only thing that had ever come close was 3D IMAX—but Apple Vision Pro was far beyond that. Of all the demonstrations, TH was most emphatic about the possibility of using Apple Vision Pro to provide real-feeling experiential training simulations.

ZE—After setup, ZE's eye-tracking and gestures were not working correctly and had to be set up again. The reset added about 10 minutes to the demo, but did not greatly detract from the overall experience. In fact, ZE expressed the most “awe” from the experience overall and felt unprepared for how good an experience the Apple Vision Pro delivered.

Reactions to Follow-Up Questions

What surprised you?

ST—ST was most surprised by the immersiveness of the experience and reported, “it put me in another realm.” ST specifically described the sharpness of the graphics and the immersive sound provided through Spatial Audio, which ST said was as good as any high-end sound system previously heard.

HO—HO felt “removed from my own reality,” and reported that it was easy to forget where you really are. HO believes that being immersed in that world and space likely contributed to the loss of time that was experienced while using the device.

TG—Among all participants, TG was the only one to be surprised that the Apple Vision Pro was more comfortable than expected. TG also was impressed by how immersive it felt and was surprised at “how quickly I became comfortable with the UI [User Interface].”

TH—The reality of the details shown by the Apple Vision Pro was described by TH as the possible reason why the 3D experience felt so immersive.

ZE—Although ZE reported that this was their first AR/VR headset experience, the participant felt it was “beyond what you could anticipate” and was “blown away” by how experiential and real it felt.

How might you use this in your job?

ST—ST is retired, but engages in hardware repairs, audio/visual/writing projects, and keeps current with technology. ST mostly envisioned using the Apple Vision Pro for entertainment purposes and/or viewing multiple open windows simultaneously—similarly to how a widescreen display might be used.

HO—HO’s position requires considerable communication and collaboration among many individuals and groups, both tasks for which the Apple Vision Pro could be used. HO specifically mentioned “meeting with people through Zoom or other apps.” Also, HO envisioned using the “grand” workspace for multitasking among many different apps.

TG—Since TG saw the immersive environments as useful, “self-imposed focus and isolation” and “deep work” were areas TG felt would be good uses of the Apple Vision Pro. TG also saw potential benefits in the realism provided by the in-theater Keynote rehearsal feature and for communicating with the device using Zoom or other apps.

TH—Although TH did not have any specific examples of how the Apple Vision Pro could be used for the work TH does, the participant gave many examples of the potential benefits of harnessing the realism provided by the device for providing true-to-life training simulations. Firefighter training was one specific example provided by TH that seemed to make good use of the Apple Vision Pro’s visual and Spatial Audio features.

ZE—When asked the question about how ZE might use the Apple Vision Pro in their job, this consummate educator immediately began answering the question about how this technology will be a “game changer” for students. After some probing, ZE did mention that the Apple Vision Pro would be useful to create immersive environments that might be more conducive for working. ZE expressed that this device could improve the opportunity to, for instance, “write a memo from Yosemite…with music in the background,” thus providing a potentially more focused and pleasant working environment.

How might students use this?

ST—ST’s immediate response to the student use question was that the Apple Vision Pro could be used to help “put someone in a different place so you could feel the culture.” ST gave several examples about experiential learning, including transporting a student to the top of the Swiss Alps (an example ST had just experienced from my Panoramic photos collection), recreating battles of the Civil War with a first-person view, standing in the audience while Lincoln delivered the Gettysburg Address, and going back in time to experience the Egyptian pyramids and culture. ST felt that this device will allow a “next step in learning,” especially about other cultures.

HO—Rather than immersive environments, HO first focused on potential hands-on uses of the Apple Vision Pro with students, mentioning the ideas of creating graphics, experiencing art, closely examining anatomy for medical or nursing applications, working on parts of a car/removing parts digitally for automotive training, and other examples. HO specified that students could also use the device to be creative “beyond traditional mediums” in both 2D and 3D.

TG—TG conveyed many thoughtful and insightful responses regarding possible student uses in several categories. First, TG mentioned that students could engage in immersive field trip opportunities. TG mentioned that the Apple Vision Pro offers “different ways of doing things students currently do, but it can go much further,” such as exploring models and participating in simulations. However, TG acknowledged the current relatively low number of visionOS apps. TG also mentioned that students could benefit from the visionOS’s ability to multi-task and use multiple screens. TG said that as a result of this demo it was clear that “Spatial computing is ‘a real thing’ that allows a user to experience a new relationship to the User Interface of windows, open apps, [and other UI elements],” and added, “this is a new way of interfacing, this is not just a novel approach.”

TH—While TH struggled to name ways to use the Apple Vision Pro in their own work, TH provided the longest list of possible student uses of any participant. First, TH referenced a TV commercial reminiscent of the Apple Vision Pro experience during which a deaf child and his father visit a waterfall at Yosemite National Park and the child asks, in ASL, “Can you feel that?” (Subaru, 2023). TH believes that this device can allow “opportunities for children to get a sensory experience” in learning. TH then commented that the Apple Vision Pro could help to address equity issues among students, allowing students with fewer resources to have experiences in places not otherwise accessible.

TH also gave a detailed example about Apple Vision Pro’s “full 360º experience,” providing real-feeling learning scenarios and simulations that could be offered. One example discussed was firefighter training, complete with 360º video showing the chaos of fire, water, smoke, and other firefighters, while hearing the 360º Spatial Audio to make the experience even more real. TH mentioned that a similar feeling was felt during the Encounter Dinosaurs app demo where the sound came directly from the dinosaur fight and from the dinosaurs growling and breathing at the participant. TH mentioned that when the dinosaur growled at you, “your instinct is to back up!”

ZE—Equity issues were also discussed by ZE in the context of allowing all students to participate in experiential learning opportunities, not just those coming from families with financial means. ZE gave examples about how the Apple Vision Pro will be a “game changer” for students, mentioning that the device could be used for learning by doing in real-life places and contexts, learning from real professionals, participating in experiential learning, and visiting places not accessible to all students. ZE gave experiential examples including visiting the zoo in the present day or visiting ancient Egypt in the past, and speculated that both experiences would be nearly as vivid as real-life experiences. These detailed experiences could lead to the possibility of improved writing opportunities and other assessments not possible outside of augmented and virtual reality.

ZE also mentioned the possibility of students experiencing real-feeling, 3D, how-to experiences they could watch from professionals completing tasks. One notable example was watching a surgery where a student could easily pause, rewind, and look closely in 3D at elements of the surgery that would be impossible to experience even if they were live in the operating room.

Do you have anything else to report about this experience?

I added this final question as a “catch-all” opportunity for participants to share anything else we may not have discussed. Also, since I did not ask a question specifically probing for potential challenges or negative concerns, this final question allowed a time to share these ideas if the participants wished.

ST—While ST didn’t feel a need after the demo to purchase a personal Apple Vision Pro device, the participant noted that they may be interested in the next version of the device at a potentially lower price. ST also noted that wearing the device for over two hours resulted in some discomfort due to the weight on the front of the face and that “I felt like I had just finished an eye exam.” ST also noted that taking off the Apple Vision Pro felt like “coming out of a dark theater” and felt there was a slight readjustment time period while “re-adjusting to reality.”

HO—A few days after the experience, HO described the experience as “weird…you really want to be in there more!” The Disney immersive Environments were memorable, and HO “wanted to see more, explore more, and see what’s behind that wall.” HO was referring to the fact that while the immersive experiences are highly realistic, there is a limit to what can be explored and viewed. While a user can look at the front and sides of many virtual objects (depending on their placement), it is not possible to walk through doors or see all objects in full 360º views.

TG—After a day, TG had many observations and questions about the Apple Vision Pro, some related to brain research and/or the possible psychological effects of the device and the experiences it delivered. TG first mentioned that the entertainment aspects of the Apple Vision Pro could feel isolating since one is not watching movies with friends or family (as TG does in reality). TG also expressed what all other participants said in different ways: “the fullness of the experience can't be described, you have to do it.” Some of the questions TG posed as a result of the Apple Vision Pro demo experience included:

  • Might there be breakthroughs in education and other fields as a result of Spatial Computing?
  • Will Spatial Computing provide a better understanding of how the brain works for reasoning and thinking?
  • What kind of sensation/perception research breakthroughs might result from Spatial Computing?
  • What will Spatial Computing uncover about our brains?

One interesting point regarding metacognition TG and I discussed was related to attaching learning events to physical spaces and/or places. Research indicates that some learners attach memories to physical locations and materials, a technique referred to as “method of loci” (described by Yates, 1966, and others). TG and I have noted that we sometimes inadvertently experience this phenomenon while listening to audiobooks and podcasts while driving, noting that when we remember a particular point learned while driving, we also recall the location where we learned the idea. TG wondered if Spatial Computing might have a similar effect based upon the virtual/immersive setting of learning. For example, might I find myself transported to a volcano at Haleakalā National Park the next time I think about foveated rendering because that was the virtual location I had set in the Apple Vision Pro when I first researched the idea?

TH—TH answered the follow-up questions immediately following the Apple Vision Pro demo. The primary ideas expressed by TH were that this experience was completely unique and that no previous experiences were analogous to the one delivered by this device. TH also noted that one user interface (UI) element was uncomfortable to access—the Control Center. To access the Control Center, the user must look up to trigger a relatively small dot with an arrow to appear, tap it, and then interact with a series of floating windows. TH described this as uncomfortable.

ZE—ZE also answered the follow-up questions immediately following the Apple Vision Pro demo and commented on the UI of visionOS. While ZE found the immersive environments to be “incredible,” the interface was described as the least inspiring aspect of the experience. As an observer, I noted that ZE very quickly learned the new visionOS UI and adapted to it. However, ZE also had a “false start” that may have negatively affected the overall experience since the setup needed to be repeated before we could continue with the demo.

Overall Conclusions

Conclusions—What surprised you?

In all demonstrations, participants expressed surprise regarding the “time shift” they experienced using Apple Vision Pro. Participants surmised that they had been using the device for a shorter time than they had actually been using it. This “VPST” (Vision Pro Standard Time) phenomenon should likely be researched more formally. Even the participants who experienced “false starts” and needed to redo the setup procedures experienced the perception of spending less time, despite the potential frustration that could have made the experience feel longer.

Approximate time estimates were reported as follows:

  • ST—Estimated time at 90 minutes; actual time was over 120 minutes (about 133% of the estimate).
  • HO—Estimated time at 30 minutes; actual time was 48 minutes (160% of the estimate).
  • TG—Estimated time at 15–20 minutes; actual time was over 45 minutes (225% to 300% of the estimate).
  • TH—Estimated time at 20–30 minutes; actual time was 42 minutes (140% to 210% of the estimate).
  • ZE—Estimated time at 15–20 minutes; actual time was over 45 minutes (225% to 300% of the estimate).

In this demo situation, participants used the Apple Vision Pro for roughly 133% to 300% of the time they estimated. On average, participants wore the Apple Vision Pro about twice as long as they estimated. Even after 15 days, I personally still experience this time shift and feel I have been wearing the device for less time than has actually passed.
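To make the arithmetic explicit, here is a minimal sketch in Swift (the minute values are simply the figures from the list above, hard-coded) that computes actual time as a percentage of the estimated time:

```swift
// Estimated vs. actual demo durations, in minutes (values from the list above;
// where a participant estimated a range, both endpoints are included).
struct DemoTime { let name: String; let estimateLow: Double; let estimateHigh: Double; let actual: Double }

let demoTimes = [
    DemoTime(name: "ST", estimateLow: 90, estimateHigh: 90, actual: 120),
    DemoTime(name: "HO", estimateLow: 30, estimateHigh: 30, actual: 48),
    DemoTime(name: "TG", estimateLow: 15, estimateHigh: 20, actual: 45),
    DemoTime(name: "TH", estimateLow: 20, estimateHigh: 30, actual: 42),
    DemoTime(name: "ZE", estimateLow: 15, estimateHigh: 20, actual: 45),
]

for d in demoTimes {
    // Actual time as a percentage of the estimate: the high estimate yields the
    // low end of the range, and the low estimate yields the high end.
    let lowPercent = Int((d.actual / d.estimateHigh * 100).rounded())
    let highPercent = Int((d.actual / d.estimateLow * 100).rounded())
    print("\(d.name): \(lowPercent)%–\(highPercent)% of the estimated time")
}
// Prints, e.g., "ST: 133%–133% of the estimated time" and "TG: 225%–300% of the estimated time"
```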

All participants also reported that they were impressed by how real and immersive the experience was. Three of the five participants reported having never worn a headset device before, and they all commented that the realism delivered by the Apple Vision Pro far exceeded their expectations.

Conclusions—How might you use this in your job?

All participants mentioned some potential uses for the immersive environments provided by the Apple Vision Pro in their jobs or education in general, with two participants specifically wanting the ability to use the device to work “inside” an environment of their choice more conducive to work.

The themes of communication and collaboration were also mentioned, specifically using the Apple Vision Pro for Zoom or other similar apps. Interestingly, the demo did not include videoconferencing, and none of the participants mentioned the Persona feature of the Apple Vision Pro that creates a simulated version of the user’s face. At least one of the participants had already experienced my Persona when I contacted them via FaceTime before the demo. Somewhat surprisingly, the beta Persona technology did not deter them from visualizing themselves using it. (Personas have been described by some reviewers as “horrifying.” Although this description is hyperbole, Personas can look unsettling at best—like mine shown below.)

Finally, two participants mentioned the Apple Vision Pro’s abilities to provide realistic simulations, one in the context of rehearsing a presentation in a theater, and the other for simulating a training scenario with realistic visuals and sounds.

Conclusions—How might students use this?

Although all participants had some difficulty or paused when answering the question about how they might use the Apple Vision Pro for their personal work, none of the participants had an issue providing many possible uses for students. Participants offered both ideas representing currently possible uses of the Apple Vision Pro, based on what they had just experienced, and speculative ideas about features, functions, apps, or content they had not directly experienced but assumed would be possible in the future.

The two most-mentioned ideas about possible student uses were that the Apple Vision Pro could be a factor in “leveling the playing field” for all students and that it could provide experiential learning opportunities not currently possible.

The possibility of Apple Vision Pro addressing student equity issues was mentioned specifically by 3 of 5 participants. The fact that a $3,500+ device could potentially be an answer to equity issues is somewhat humorous to me. However, this may be an indication that the experience the Apple Vision Pro delivers is worth the relatively high cost—or at least it allows educators to see its potential learning experiences as truly valuable. The realism reported by each of the participants was convincing enough that all of them felt the device was nearly as good as reality, and more than half felt that students would benefit by participating in experiences as real as the ones they had encountered.


All participants gave examples of experiential learning in several contexts, including, but not limited to:

  • Visiting places (e.g., geographical locations, zoos, professional locations)
  • Recreating historical events (e.g., Civil War, speeches, ancient cultures)
  • Experiencing art and culture
  • Creating art
  • Participating in job-based experiences (e.g., arts, medical, science, automotive)
  • Simulating environments (e.g., historical events, firefighting, medical)

Conclusions—Do you have anything else to report about this experience?

Responses to all questions were gathered following the Apple Vision Pro demo experience, with varying amounts of time between the demo and the questions. Some participants answered directly after the demo, and one participant waited two weeks to provide follow-up answers. In my opinion, the amount of time that passed did not affect the quality or quantity of responses. In fact, at least two respondents added more information days after they provided initial answers because new information had occurred to them. One participant later shared, “I long to revisit some of the immersive worlds” experienced on the Apple Vision Pro—in this case, the desert planet Tatooine, the fictional home of Luke Skywalker.

This catch-all question included discussions about user interface issues—mostly reactions to the controls and a couple of mentions about the frustration accessing the Control Center. Some participants also took the opportunity to reiterate slight discomfort wearing the device, and the fact that users cannot see their feet while standing in immersive environments.

However, all participants reiterated their surprise about the realism provided by the Apple Vision Pro in immersive environments. Several participants conveyed that they wanted to be able to visit these environments whenever they wished, both for entertainment and to create pleasant work environments. Further, all participants mentioned how this device should be used to provide experiential learning opportunities.

Final Thoughts

After the demonstration experiences I was able to share with each of these five participants, I am more convinced than ever that the Apple Vision Pro is on the road to becoming a valuable tool for learning. Indeed, I have personally used the device for my own learning over the past 15 days, and my hope is to pursue the ideas I learned from my friends and colleagues moving forward.

These demonstrations have served to provide me with a roadmap for selecting apps and trying new things discussed by participants. Some areas I will pursue match the findings from this report:

  • Experiential learning
  • Immersive environments
  • Realistic how-to experiences
  • Creating work-conducive environments
  • Expert-guided simulations
  • Virtual travel/field trips
  • Artistic and cultural experiences
  • Apps that allow users to create

Finally, many thanks to the friends and colleagues who participated in these demonstrations. I hope I provided a fun and worthwhile experience—and that everyone learned as much as I did. Thanks, too, to Sean at the Apple Store in Deer Park (IL) who led me through my first demo earlier this month and gave me a blueprint for my own demos. I further hope that these experiences will inspire other education leaders to learn through teaching by following some of the processes described above.


References

Apple Support. (2024a). Apple Vision Pro User Guide: Clean your Apple Vision Pro. Retrieved from https://support.apple.com/guide/apple-vision-pro/clean-your-apple-vision-pro-tan6200165e8/visionos

Apple Support. (2024b). How to clean Apple Vision Pro and accessories. Retrieved from https://support.apple.com/en-us/HT213964

Subaru. (July 31, 2023). A Beautiful Silence :30 [video]. Retrieved from www.youtube.com/watch?v=X_kxjt6gf1Y

Wikipedia. (2023). Method of loci. Retrieved from https://en.wikipedia.org/wiki/Method_of_loci

Yates, Frances A. (1966). The Art of Memory. Chicago: University of Chicago Press.

Saturday, February 17, 2024

My Experience with the Apple Vision Pro Battery So Far

Much has been written and commented about Apple Vision Pro battery life. This apparently pressing issue—to some—seems to be skewing negative on social media. I’ve already written on the topic myself, but I wanted to provide a brief article on the topic that’s easy to digest and that is now informed by two weeks of use.

First, here’s Apple’s official description of the Apple Vision Pro Battery:

“The high-performance Apple Vision Pro Battery is made out of smooth, machined aluminum and connects to your Apple Vision Pro using a woven USB-C cable. It can slip into your pocket for portable power and supports up to two hours of general use, 2.5 hours of video playback, and all-day use when plugged in.” (Apple, 2024)


Negative mentions sometimes appear in the “cons” section at the beginning of reviews in addition to within the commentary. For example:

PCMag: “Short battery life” (Greenwald, 2024).

Other examples include the Wall Street Journal: “...its battery life sucks...” (Stern, 2024), and Marques Brownlee on MKBHD: “...the battery life is meh” (Brownlee, 2024). The PCMag article expands on its negative review: “Battery life remains the biggest weak point for standalone AR/VR headsets across the board, and the Vision Pro is no different” (Greenwald, 2024).

However, I have also read plenty of reviews that do not mention battery life as a problem. Here are a few:

  • “I’ve consistently gotten 3 full hours of battery life using Vision Pro on a full charge. Sometimes a little more. In my experience, Apple’s stated 2–2.5 hour battery life is a floor, not a ceiling” (Gruber, 2024).
  • Joe Rossignol of MacRumors writes about Brian Tong’s video: “the Vision Pro may last up to 30 minutes longer per charge than Apple's advertised battery life claims, but results will vary” (Rossignol, 2024).
  • And finally, my all-time favorite tech reviewer, David Pogue, mentions, “[the battery] powers the headset for about two hours. You can also plug the battery into the wall, in which case the headset runs forever” (Pogue, 2024).

So why the disparity? My guess is preconceived expectations. 

If a reviewer for some reason assumed that the Apple Vision Pro was a device that one wears all the time to experience some kind of Augmented Reality World of Tomorrow, I suspect the 2–3 hour battery life is seen as a negative. My experience with the Apple Vision Pro is that I am quite content to use it for a few hours at a time, mostly in a single position while plugged in to power—occasionally moving around for a few minutes at a time and/or to interact with an immersive experience on the device.

As Wes Hilliard first mentioned in one of the AppleInsider podcasts, I suppose if I wanted more portable power, I could plug the Apple Vision Pro battery into one of my portable, pocket-sized batteries that I use to power my iPhone and iPad. So far I have neither needed, nor wanted, to do this.

In writing this article, I did find some interesting battery information to pass along:

Apple Vision Pro Power/Battery Information

This information is adapted from Apple Support (2024).

Meaning of the Light on the Battery

While Plugged In

  • Green for several seconds: Battery fully charged.
  • Amber for several seconds: Battery is less than 100%, but can power the device.
  • Amber pulsing slowly: Battery charge is too low to power the device.

While NOT Plugged In

  • Green for several seconds: Battery is 50% or higher.
  • Amber for several seconds: Battery is between 5% and 49%.
  • Amber pulsing slowly: Battery charge is too low to power the device.

See the Battery Level

  • Put on the Apple Vision Pro.
  • Open Control Center.
  • Look in the top-right corner.

You may also consult Apple’s article, Apple Vision Pro Battery and Performance.



References

Apple Support. (February 13, 2024). Connect and charge Apple Vision Pro battery. Retrieved from https://support.apple.com/en-us/117740

Apple. (2024). Apple Vision Pro Battery. Retrieved from www.apple.com/shop/product/MW283LL/A/apple-vision-pro-battery

Brownlee, M. (February 3, 2024). Apple Vision Pro Review: Tomorrow's Ideas... Today's Tech! [Video].  Retrieved from www.youtube.com/watch?v=86Gy035z_KA

Greenwald, W. (February 9, 2024). Apple Vision Pro Review: An incredible leap in AR and VR interactivity. Retrieved from www.pcmag.com/reviews/apple-vision-pro

Gruber, J. (January 30, 2024). The Vision Pro. Retrieved from https://daringfireball.net/2024/01/the_vision_pro

Hilliard, W. (February 8, 2024). Apple Vision Pro early review: a peek into the future of computing. AppleInsider. Retrieved from https://appleinsider.com/articles/24/02/08/apple-vision-pro-early-review-a-peek-into-the-future-of-computing

Pogue, D. (February 11, 2024). Apple’s Vision Pro Isn’t for Everyone — But It’s Still a Home Run. New York Magazine. Retrieved from https://nymag.com/intelligencer/article/david-pogue-reviews-apple-vision-pro.html#

Rossignol, J. (January 30, 2024). Vision Pro Reviews: Surprising Battery Life, 'Weird' Personas, and More. Retrieved from www.macrumors.com/2024/01/30/apple-vision-pro-review-highlights/

Stern, J. (January 30, 2024). Apple Vision Pro Review: The Best Headset Yet Is Just a Glimpse of the Future. Wall Street Journal. Retrieved from https://apple.news/ALd4RY3b9Q0umjnCH4Srw9A

Sunday, February 11, 2024

Watching a YouTube Video on the Apple Vision Pro for Learning is a Miserable Experience

I have reported already on this blog that the lack of certain native visionOS apps is essentially a non-issue. The short list of well known companies not yet developing for Apple Vision Pro includes Netflix, Spotify—and YouTube (Sullivan, 2024). I’ve logged in to both Netflix and YouTube to watch videos on the Apple Vision Pro, and both websites function. However, I have a new experience to share.

Watching a YouTube video on the Apple Vision Pro for learning is, hands-down, the most miserable educational experience I have ever had. The problem is the user interface.

When I’m watching a video for the purpose of learning, I expect to be able to stop and start it, re-listen/re-watch snippets at will, and be able to simply pause when I need to pause to stop and reflect and/or take notes. The YouTube interface, combined with the Apple Vision Pro’s eye-tracking and touch, make these seemingly simple operations nearly impossible.

The problem is the placement of the Play/Pause button in YouTube and the scrubber controls at the bottom of the video:

Almost every time I attempted to pause the video, it jumped back and started from the beginning. The "hotspot" areas for the Play/Pause button and the scrubber controls are very small and seemingly overlap! It’s easy to disastrously select the wrong button, causing you to lose your place and need to find it again—a place you were not paying attention to. Then, if you get close to where you attempted to pause, another Pause button click jumps you back to the start again! It’s an endless loop of frustration. By the time you find your place again, you have forgotten why you paused in the first place—your brain has long since moved on.

Here’s an example of me attempting to pause a YouTube video to quote Marques Brownlee using the Apple Vision Pro. I get the quote, but things go awry after that:

 

Please note that simply getting a native YouTube app on visionOS may not necessarily solve this problem—the interface could still be bad. And for simply watching videos with no frequent starts, stops, or re-watching, the YouTube website works OK. But if YouTube ever gets around to releasing a visionOS YouTube app, I hope YouTube considers us learners and teachers.


References

Brownlee, M. (February 3, 2024). Apple Vision Pro Review: Tomorrow's Ideas... Today's Tech! [Video].  Retrieved from www.youtube.com/watch?v=86Gy035z_KA

Sullivan, M. (January 26, 2024). With the Vision Pro, Apple has never depended more on developers for a product’s success. Fast Company. Retrieved from www.fastcompany.com/91017225/apple-vision-pro-developers-sales