This evening after I returned home from work, I strapped on the Apple Vision Pro and decided to try an idea inspired by one of my weekly podcasts, The Dalrymple Report. On the podcast, host Dave Mark was speculating about whether the Apple Vision Pro would allow a user to create visuals, such as a logo. That made me wonder if I could use the Apple Vision Pro to draw. I expected that I'd be able to add shapes in Keynote or Pages, but I was more interested in whether I could use the actual drawing tools, the same toolbar normally accessed with an Apple Pencil. For those who do not know me, I regularly draw sketchnotes as both a note-taking and artistic endeavor, and I have taught a sketchnoting class to educators on multiple occasions.
As I mentioned in previous posts, the Pages app is not native to visionOS, but it has all of the features that the iPadOS version of Pages has. So far I had not seen any limitations, but before I made the attempt, it was difficult for me to envision how I might use the shape and drawing tools. I decided to start in the Pages app by creating a new landscape-orientation document. I changed the document type to page layout, and I started working with the shapes and drawing features. I started by adding and resizing a couple of shapes, a fairly simple endeavor that worked how I expected. I also went online, grabbed an image from Safari, and dragged it into the Pages app so I could potentially use the tracing technique I often teach.
It occurred to me that I had all of the pieces I needed to create a bare-bones sketchnote, provided that I could use the drawing features without an Apple Pencil. I used an image of an Apple Vision Pro, decreased the opacity of the image, and accessed the drawing menu. The drawing tools appeared exactly as they do on iPad. I selected the pencil tool, changed its color, and set about figuring out how I would draw in the air with pinch gestures.
The first challenge was knowing where the pencil point would hit the virtual paper. After some trial and error, I realized the pencil point hit where I had focused my vision. I felt like the point was hitting slightly below where I was focusing, but after a few attempts, I got the point to hit closer to where I expected it. The drawing part, however, was harder: the pinch made the point hit the paper, but then you needed to "air draw" by steadily moving your pinched fingers along the drawing path. This was a significant challenge!
My precision began to improve, but drawing in the air is nothing like drawing with an Apple Pencil on a physical surface. Both the relative shakiness of drawing in midair and the fact that my arms were getting tired greatly affected the output.
I was able to use one of my Apple Pencil drawing techniques to improve shapes and lines that I'd already drawn: draw a line or shape that's close to what you want, then use the select tool to grab it and resize it on the screen. Selecting my drawn marks in the air made this tactic tedious, but after a few minutes I had created what quite possibly is the world's first Apple-Vision-Pro-generated sketchnote. I will insert this first attempt into this blog post.
Just in case anyone needs to contact the Guinness people, I did take a video of my drawing efforts that is time, date, and location stamped. The video is admittedly painful to watch, but it shows the process I used and the learning that got me to the results.
I look forward to making more sketchnotes soon to see if this is a viable new technique or just an interesting hack to do for fun.
One final thought for this Monday evening: I had heard that the virtual in-air keyboard could be used by physically typing the keys in the air, but for whatever reason I had not spent much time trying to make it work until tonight. I now realize that you can pull the virtual keyboard toward you and punch your fingers into the keys. Using this technique, the keyboard types much more like a regular physical keyboard, except that you have no haptic feedback. I'm finding the experience much more precise than using the "look-and-tap" method. I would not have expected this to be a better way to type on a virtual keyboard, but the technique has surprised me.
Coincidentally, one of my meetings on Monday included a discussion about the many ways one is able to type on an iPad. I generally teach 11 text input methods, including on-screen keyboards in landscape and portrait modes in full, floating, and split configurations; Siri dictation; Apple's Scribble writing-to-text feature; and the various physical keyboard possibilities (Bluetooth, wired, and Smart Connector). However, Apple Vision Pro adds a new text input category. For lack of a better term, I'll call it "air typing," and there are two implementations: look-and-tap, and type-in-the-air.
This will, no doubt, further confuse the adults who still think we should be teaching 1950s typing skills on physical keyboards. Please alert the Secretary Pool on the third floor, and feel free to continue the two-spaces-after-a-period “debate.”