As an educator, I have always felt that the best way to learn something is to teach it. In addition to my own time with the Apple Vision Pro for the past 15 days, I've also taken the opportunity to provide demonstrations to some of my friends and colleagues. So far I have provided 6 demonstrations, each lasting between about 45 minutes and just over 2 hours. In some cases, my friends or colleagues expressed interest in a demo, while in other cases, I solicited their participation. I worked with a variety of people with backgrounds in education, provided similar demonstrations, and debriefed the experience with the same 4 questions.
My first demonstration was just 2 days after my Apple Store demo, and it was with a friend of mine outside of education who is very interested in Apple devices. It was during this demonstration that I got the idea for this action research activity. I did not include the results of the first demo here since I used that opportunity to create the sequence of activities, noted some of the issues I planned to observe, and wrote my follow-up questions after this practice demonstration was completed.
This article reports the results of what I learned providing Apple Vision Pro demonstrations to 5 friends and colleagues with backgrounds in different aspects of public education. I will refer to those with whom I worked by their initials. Here is a short description of each person:
- ST: A long-time Apple enthusiast with many years' experience providing technology hardware and software support in schools. ST has decades of experience using macOS, iOS, iPadOS, and other Apple systems.
- HO: A friend and colleague who provides education support to many people and groups in my school district and has over 10 years' experience using Apple devices (primarily macOS and iOS).
- TG: An innovative friend and colleague with over 20 years' experience in education as a teacher and administrator. TG is in an educational technology leadership role with experience using Apple devices running macOS, iOS, iPadOS, and several other systems.
- TH: A colleague in a district leadership role providing educational support who is relatively new to Apple devices but has adapted quickly to their operating systems (primarily macOS). TH had no idea the Apple Vision Pro had been released and heard about it for the first time upon seeing it in my office.
- ZE: A colleague with about 20 years' experience as a teacher and administrator, ZE has used Apple technology as a tool for at least the last 5 years. ZE found the Vision Pro intriguing and agreed to be part of this demo experience after I asked.
Each demonstration was scheduled for at least 45 minutes. When a demonstration began, I gave brief, basic instruction about how to pick up the Apple Vision Pro safely; how to look, pinch, and drag; how to adjust the fit dial on the Solo Knit Band; and I showed the locations of the Digital Crown and top button. I then mirrored the Vision Pro Display on my M1 11-inch iPad Pro, set the Vision Pro to Guest User mode, removed my prescription lenses, and handed the device to the Guest User.
During the third demo, I realized that I needed to specifically set the audio source to the Vision Pro while mirroring. Before the third demo, the audio settings were less consistent, but each demonstration included Spatial Audio for some or all of the experience.
Since these demonstrations were with colleagues I know and trust, I made the decision to leave my iCloud account logged in and accessible. Further, I was able to watch the majority of the demo using the mirroring feature of the Apple Vision Pro. This setup may not be optimally replicable for those who do not wish to share their personal data with others—particularly Photos and Documents. Some Apple Vision Pro owners may wish to either limit the Guest User account (an option available in the Guest User setup) or create a demo Apple ID for the purpose of conducting demos.
Here is the general sequence I used for my demos:
My Guest User Setup (about 3 minutes)
- Explain how to pick up the Apple Vision Pro safely (thumb through nose ridge and other fingers across top—be careful of the parts attached by magnets)
- Share Vision Pro screen with iPad screen (and set Audio to Vision Pro)
- Switch to Guest Mode in Control Center
- Explain fit dial on the Solo Knit Band
- Show locations of Digital Crown and top button
- Remove my prescription lenses
- Explain that the first few minutes are very important to get a good fit and set up eye-tracking and hand gestures
Demo (at least 45 minutes)
- Ensure a good fit by adjusting the fit dial on Solo Knit Band
- Observe and assist in the fit as needed
- Direct to Digital Crown
- Observe hand gesture practice
- After setup, explain video passthrough (“you are looking at a video of the room, not the room…”)
- Access Home Screen with the Digital Crown
- Open Photos app
- Try scrolling (side-to-side, up-and-down)
- Access and view Panorama photos in Panorama mode (from tab bar at left)
- Access and view Spatial Video in and out of full-screen mode (from tab bar at left)
- Teach about bottom-center close button and the “dash” window control to move windows
- Back to Home Screen, select one or more Immersive Environments (from tab bar at left)
- Use the Digital Crown to increase the Environment to full 360°
- Suggest the participant stand if comfortable
- Back to Home Screen, launch Keynote app
- Select a presentation (from iCloud account)
- Rehearse presentation in Theater and/or Conference Room (point out options under the ... button)
- Back to Home Screen, launch Encounter Dinosaurs app
- Interact as comfortable (hold out hand for butterfly, stand to offer hand to dinosaurs)
- If time/interest: back to Home Screen, launch Disney+ app
- Turn off mirroring on iPad*
- Select an Environment and look around (4 are available from tab bar at left)
- Select movie or show to watch inside Environment
- Play this by ear: if the user is comfortable navigating visionOS at this point, ask them to access the Control Center and guide them through selecting the mirror-to-iPad option. This can be tricky, and may be unsuccessful for some users.
- Back to Home Screen, launch Safari, Apple Music, and Photos apps to experience multiple open windows
- Ask if there is anything else they would like to see (as interest/time permits)
- Take off Apple Vision Pro
- Ask follow-up questions, or agree to wait for a time in the future
*During the Disney+ app demo, you will need to STOP mirroring on the iPad due to DRM (Digital Rights Management) issues. For this reason, you may need to skip the Disney+ app for some demonstrations to avoid the potential frustration of needing to direct the Guest User to access Control Center and teach them how to turn on mirroring.
Some demo participants were so affected (one was “speechless”) by the experience that it was evident they needed to wait one or more days to answer the follow-up questions. In one case, we waited to ask questions due to time constraints. I waited to ask follow-up questions in 3 of the 5 demos.
Cleaning Between Demos
During these demos, I considered hygienic issues for this device. Even though I know and trust my friends and colleagues and had no concerns about cleanliness, I found it slightly off-putting to share a device that I wear on my face for extended time periods. The Apple Vision Pro User Guide includes a section on cleaning that is helpful for keeping each of its parts clean (Apple Support, 2024a and 2024b). However, between demonstrations I added the extra step of using 1 or 2 alcohol prep pads to wipe the gray fabric of the Light Seal Cushion. The alcohol evaporates quickly and provides an extra measure of cleanliness. Apple specifies, “Don’t use disinfectant wipes…to clean the Light Seal or Light Seal Cushion,” but since the prep pads contain only alcohol (and not bleach or other chemicals), I have so far had no issues.
Follow-Up Questions
The purpose of this blog—and this demo experience—is to learn about potential uses of the Apple Vision Pro in education. At the same time, using the Apple Vision Pro is a unique experience that is also fun. I did my best not to over-complicate this activity and make it in any way unpleasant. Thus, I kept my follow-up questions to just these four:
- What surprised you?
- How might you use this in your job?
- How might students use this?
- Do you have anything else to report about this experience?
I would be remiss if I did not point out the limitations of this action research and my conclusions. This is not a scientific study and should not be considered as such. All results reported here are observations subject to my biases. Also, only 5 participants were involved in this activity, they were not selected at random, and all have educational technology backgrounds using Apple technology. That being said, I feel that what I learned from this experience was valuable in the context of the goals of this blog and my personal understanding of using Apple Vision Pro in an education setting.
General Observations During Demonstrations
Before I report the reactions from each of the demonstrations, I will note several interesting observations among those who took part. First, all participants estimated that they had spent less time using the Apple Vision Pro than had actually passed, a phenomenon I will discuss in more detail below. Here are a few other things I noticed:
ST—ST and I had planned the longest demonstration of the group, and ST began to experience some eye fatigue after about 2 hours. ST was also the first to experience significant trepidation with standing while wearing the Apple Vision Pro. ST was uneasy because, when looking down while standing, the user is unable to see their feet. ST exclaimed, “I lost my feet,” and balance and overall steadiness were affected while standing. ST is a retiree and speculated that his balance issues might be due to age, but two other participants experienced the same feeling, one of whom is about 30 years younger than ST.
HO—After the setup process, HO had difficulty using the eye-tracking and pinch controls, likely because I did not do a good enough job explaining the importance of fit before the setup procedure. After the setup, HO adjusted the position of the headset significantly, and it was necessary to repeat the entire setup process. This “false start” added about 10 minutes to the demonstration.
TG—Normally a reserved personality, TG repeatedly expressed how the realism of the experience far exceeded expectations. TG was also negatively affected by not being able to see one’s feet when standing in immersive environments. More than any other participant, TG was awed by the 360º immersive environments, expressing an interest to “go there” and to have environments available for creating a calm working environment.
TH—TH expressed that this was a completely new and unique experience, and the only thing that had ever come close was 3D IMAX—but Apple Vision Pro was far beyond that. Of all the demonstrations, TH was most emphatic about the possibility of using Apple Vision Pro to provide real-feeling experiential training simulations.
ZE—After setup, ZE's eye-tracking and gestures were not working correctly and had to be set up again. The reset added about 10 minutes to the demo, but did not greatly detract from the overall experience. In fact, ZE expressed the most “awe” from the experience overall and felt unprepared for how good an experience the Apple Vision Pro delivered.
Reactions to Follow-Up Questions
What surprised you?
ST—ST was most surprised by the immersiveness of the experience and reported, “it put me in another realm.” ST specifically described the sharpness of the graphics and the immersive sounds provided through Spatial Audio, which ST said were as good as any high-end sound system previously heard.
HO—HO felt “removed from my own reality,” and reported that it was easy to forget where you really are. HO believes that being immersed in that world and space likely contributed to the loss of time that was experienced while using the device.
TG—TG was the only participant surprised that the Apple Vision Pro was more comfortable than expected. TG was also impressed by how immersive it felt and was surprised at “how quickly I became comfortable with the UI [User Interface].”
TH—TH described the reality of the details shown by the Apple Vision Pro as the possible reason the 3D experience felt so immersive.
ZE—Although ZE reported that this was their first AR/VR headset experience, the participant felt it was “beyond what you could anticipate” and was “blown away” by how experiential and real it felt.
How might you use this in your job?
ST—ST is retired, but engages in hardware repairs, audio/visual/writing projects, and keeps current with technology. ST mostly envisioned using the Apple Vision Pro for entertainment purposes and/or viewing multiple open windows simultaneously—similarly to how a widescreen display might be used.
HO—HO’s position requires considerable communication and collaboration among many individuals and groups, both tasks for which the Apple Vision Pro could be useful. HO specifically mentioned “meeting with people through Zoom or other apps.” HO also envisioned using the “grand” workspace for multitasking among many different apps.
TG—TG saw the immersive environments as useful for “self-imposed focus and isolation” and “deep work,” areas TG felt would be good uses of the Apple Vision Pro. TG also saw potential benefits in the realism provided by the in-theater Keynote rehearsal feature and in communicating with the device using Zoom or other apps.
TH—Although TH did not have any specific examples of how the Apple Vision Pro could be used for the work TH does, the participant gave many examples of the potential benefits of harnessing the realism provided by the device for providing true-to-life training simulations. Firefighter training was one specific example provided by TH that seemed to make good use of the Apple Vision Pro’s visual and Spatial Audio features.
ZE—When asked the question about how ZE might use the Apple Vision Pro in their job, this consummate educator immediately began answering the question about how this technology will be a “game changer” for students. After some probing, ZE did mention that the Apple Vision Pro would be useful to create immersive environments that might be more conducive for working. ZE expressed that this device could improve the opportunity to, for instance, “write a memo from Yosemite…with music in the background,” thus providing a potentially more focused and pleasant working environment.
How might students use this?
ST—ST’s immediate response to the student use question was that the Apple Vision Pro could be used to help “put someone in a different place so you could feel the culture.” ST gave several examples about experiential learning, including transporting a student to the top of the Swiss Alps (an example ST had just experienced from my Panoramic photos collection), recreating battles of the Civil War allowing a first-person view, standing in the audience while Lincoln delivered the Gettysburg Address, and going back in time to experience the Egyptian pyramids and culture. ST felt that this device will allow a “next step in learning,” especially about other cultures.
HO—Rather than immersive environments, HO first focused on potential hands-on uses of the Apple Vision Pro with students, mentioning the ideas of creating graphics, experiencing art, closely examining anatomy for medical or nursing applications, working on parts of a car/removing parts digitally for automotive training, and other examples. HO specified that students could also use the device to be creative “beyond traditional mediums” in both 2D and 3D.
TG—TG conveyed many thoughtful and insightful responses regarding possible student uses in several categories. First, TG mentioned that students could engage in immersive field trip opportunities. TG mentioned that the Apple Vision Pro offers “different ways of doing things students currently do, but it can go much further,” such as exploring models and participating in simulations. However, TG acknowledged the current relatively low number of visionOS apps. TG also mentioned that students could benefit from the visionOS’s ability to multi-task and use multiple screens. TG said that as a result of this demo it was clear that “Spatial computing is ‘a real thing’ that allows a user to experience a new relationship to the User Interface of windows, open apps, [and other UI elements],” and added, “this is a new way of interfacing, this is not just a novel approach.”
TH—While TH struggled to name ways to use the Apple Vision Pro in their own work, TH provided the longest list of possible student uses of any participant. First, TH referenced a TV commercial reminiscent of the Apple Vision Pro experience during which a deaf child and his father visit a waterfall at Yosemite National Park and the child asks, in ASL, “Can you feel that?” (Subaru, 2023). TH believes that this device can allow “opportunities for children to get a sensory experience” in learning. TH then commented that the Apple Vision Pro could help to address equity issues among students, allowing students with less resources to have experiences in places not otherwise accessible.
TH also gave a detailed example about Apple Vision Pro’s “full 360º experience,” providing real-feeling learning scenarios and simulations that could be offered. One example discussed was firefighter training, complete with 360º video showing the chaos of fire, water, smoke, and other firefighters, while hearing the 360º Spatial Audio to make the experience even more real. TH mentioned that a similar feeling was felt during the Encounter Dinosaurs app demo where the sound came directly from the dinosaur fight and from the dinosaurs growling and breathing at the participant. TH mentioned that when the dinosaur growled at you, “your instinct is to back up!”
ZE—Equity issues were also discussed by ZE in the context of allowing all students to participate in experiential learning opportunities, not just those coming from families with financial means. ZE gave examples about how the Apple Vision Pro will be a “game changer” for students, mentioning that the device could be used for learning by doing in real-life places and contexts, learning from real professionals, participating in experiential learning, and visiting places not accessible to all students. ZE gave experiential examples including visiting the zoo in the present day or visiting ancient Egypt in the past, and speculated that both experiences would be nearly as vivid as a real-life experience. These detailed experiences could lead to the possibility of improved writing opportunities and other assessments not possible outside of augmented and virtual reality.
ZE also mentioned the possibility of students watching real-feeling, 3D how-to experiences of professionals completing tasks. One notable example was watching a surgery, where a student could easily pause, rewind, and look closely in 3D at elements of the surgery that would be impossible to experience even if they were live in the operating room.
Do you have anything else to report about this experience?
I added this final question as a “catch-all” opportunity for participants to share anything else we may not have discussed. Also, since I did not ask a question specifically probing for potential challenges or negative concerns, this final question allowed time to share those ideas if the participants wished.
ST—While ST didn’t feel a need after the demo to purchase a personal Apple Vision Pro device, the participant noted that they may be interested in the next version of the device at a potentially lower price. ST also noted that wearing the device for over two hours resulted in some discomfort due to the weight on the front of the face and that “I felt like I had just finished an eye exam.” ST also noted that taking off the Apple Vision Pro felt like “coming out of a dark theater” and felt there was a slight readjustment time period while “re-adjusting to reality.”
HO—A few days after the experience, HO described the experience as “weird…you really want to be in there more!” The Disney immersive Environments were memorable, and HO “wanted to see more, explore more, and see what’s behind that wall.” HO was referring to the fact that while the immersive experiences are highly realistic, there is a limit to what can be explored and viewed. While a user can look at the front and sides of many virtual objects (depending on their placement), it is not possible to walk through doors or see all objects in full 360º views.
TG—After a day, TG had many observations and questions about the Apple Vision Pro, some related to brain research and/or the possible psychological effects of the device and the experiences it delivered. TG first mentioned that the entertainment aspects of the Apple Vision Pro could feel isolating since one is not watching movies with friends or family (as TG does in reality). TG also expressed what all other participants said in different ways: “the fullness of the experience can't be described, you have to do it.” Some of the questions TG posed as a result of the Apple Vision Pro demo experience included:
- Might there be breakthroughs in education and other fields as a result of Spatial Computing?
- Will Spatial Computing provide a better understanding of how the brain works for reasoning and thinking?
- What kind of sensation/perception research breakthroughs might result from Spatial Computing?
- What will Spatial Computing uncover about our brains?
One interesting point regarding metacognition TG and I discussed was related to attaching learning events to physical spaces and/or places. Research indicates that some learners attach memories to physical locations and materials, a technique referred to as “method of loci” (described by Yates, 1966, and others). TG and I have noted that we sometimes inadvertently experience this phenomenon while listening to audiobooks and podcasts while driving, noting that when we remember a particular point learned while driving, we also recall the location where we learned the idea. TG wondered if Spatial Computing might have a similar effect based upon the virtual/immersive setting of learning. For example, might I find myself transported to a volcano at Haleakalā National Park the next time I think about foveated rendering because that was the virtual location I had set in the Apple Vision Pro when I first researched the idea?
TH—TH answered the follow-up questions immediately following the Apple Vision Pro demo. The primary ideas expressed by TH were that this experience was completely unique and that no previous experiences were analogous to the one delivered by this device. TH also noted that one user interface (UI) element was uncomfortable to access—the Control Center. To access the Control Center, the user must look up to trigger a relatively small dot with an arrow to appear, tap it, and then interact with a series of floating windows. TH described this as uncomfortable.
ZE—ZE also answered the follow-up questions immediately following the Apple Vision Pro demo and commented on the UI of visionOS. While ZE found the immersive environments to be “incredible,” the interface was described as the least inspiring aspect of the experience. As an observer, I noted that ZE very quickly learned the new visionOS UI and adapted to it. However, ZE also had a “false start” that may have negatively affected the overall experience since the setup needed to be repeated before we could continue with the demo.
Overall Conclusions
Conclusions—What surprised you?
In all demonstrations, participants expressed surprise regarding the “time shift” they experienced using Apple Vision Pro. Participants estimated that they had been using the device for less time than had actually passed. This “VPST” (Vision Pro Standard Time) phenomenon should likely be researched more formally. Even the participants who experienced “false starts” and needed to redo the setup procedures perceived that less time had passed, despite the potential frustration that could have made the experience feel longer.
Approximate time estimates were reported as follows:
- ST—Estimated time at 90 minutes; actual time was over 120 minutes (about 133% of the estimated time).
- HO—Estimated time at 30 minutes; actual time was 48 minutes (160% of the estimated time).
- TG—Estimated time at 15–20 minutes; actual time was over 45 minutes (225% to 300% of the estimated time).
- TH—Estimated time at 20–30 minutes; actual time was 42 minutes (140% to 210% of the estimated time).
- ZE—Estimated time at 15–20 minutes; actual time was over 45 minutes (225% to 300% of the estimated time).
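As a sanity check on these figures, here is a short Python sketch (a hypothetical helper, not part of the original write-up) that computes each actual duration as a percentage of the low and high estimates, using the times in minutes reported above:

```python
# Reported session times, in minutes: (low_estimate, high_estimate), actual.
sessions = {
    "ST": ((90, 90), 120),
    "HO": ((30, 30), 48),
    "TG": ((15, 20), 45),
    "TH": ((20, 30), 42),
    "ZE": ((15, 20), 45),
}

def ratio_range(estimate, actual):
    """Return actual time as a rounded % of the high and low estimates."""
    low, high = estimate
    return round(100 * actual / high), round(100 * actual / low)

for name, (estimate, actual) in sessions.items():
    lo, hi = ratio_range(estimate, actual)
    label = f"{lo}%" if lo == hi else f"{lo}%\u2013{hi}%"
    print(f"{name}: actual time was {label} of the estimated time")
```

Running this reproduces the ranges in the list: for example, ST's 120 actual minutes against a 90-minute estimate yields about 133% of the estimated time (i.e., 33% longer than estimated).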
Participants used the Apple Vision Pro for between 133% and 300% of the time they estimated in this demo situation. On average, participants wore the Apple Vision Pro about twice as long as they estimated. Even after 15 days, I personally still experience this time shift and feel I have been wearing the device for less time than has actually passed.
All participants also reported that they were impressed by how real and immersive the experience was. Three of the five participants reported having never worn a headset device before, and all three commented that the realism delivered by the Apple Vision Pro far exceeded their expectations.
Conclusions—How might you use this in your job?
All participants mentioned some potential uses for the immersive environments provided by the Apple Vision Pro in their jobs or education in general, with two participants specifically wanting the ability to use the device to work “inside” an environment of their choice more conducive to work.
The themes of communication and collaboration were also mentioned, specifically using the Apple Vision Pro for Zoom or other similar apps. Interestingly, the demo did not include videoconferencing, and none of the participants mentioned the Persona feature of the Apple Vision Pro that creates a simulated version of the user’s face. At least one of the participants had already experienced my Persona when I contacted them via FaceTime before the demo. Somewhat surprisingly, the beta Persona technology did not deter them from visualizing themselves using it. (Personas have been described by some reviewers as “horrifying.” Although this description is hyperbole, Personas can look unsettling at best—like mine shown below.)
Finally, two participants mentioned the Apple Vision Pro’s abilities to provide realistic simulations, one in the context of rehearsing a presentation in a theater, and the other for simulating a training scenario with realistic visuals and sounds.
Conclusions—How might students use this?
Although all participants had some difficulty or paused in answering the question about how they might use the Apple Vision Pro in their own work, none had trouble providing many possible uses for students. Participants offered both ideas representing current possible uses of the Apple Vision Pro, based on what they had just experienced, and speculative ideas based on features, functions, apps, or content they had not directly experienced but assumed could be available in the future.
The two most-mentioned ideas about possible student uses included that Apple Vision Pro could be a factor in “leveling the playing field” for all students and providing experiential learning opportunities not currently possible.
The possibility of Apple Vision Pro addressing student equity issues was mentioned specifically by 3 of 5 participants. The fact that a $3,500+ device could potentially be an answer to equity issues is somewhat humorous to me. However, this may be an indication that the experience the Apple Vision Pro delivers is worth the relatively high cost—or at least it allows educators to see its potential learning experiences as truly valuable. Each participant reported that the experience felt nearly as good as reality, and more than half felt that students would benefit from participating in experiences as real as those they had encountered.
All participants gave examples of experiential learning in several contexts, including, but not limited to:
- Visiting places (e.g., geographical locations, zoos, professional locations)
- Recreating historical events (e.g., Civil War, speeches, ancient cultures)
- Experiencing art and culture
- Creating art
- Participating in job-based experiences (e.g., arts, medical, science, automotive)
- Simulating environments (e.g., historical events, firefighting, medical)
Conclusions—Do you have anything else to report about this experience?
The responses to all questions were gathered following the Apple Vision Pro demo experience after a range of time periods between the demo and questions. Some participants answered directly after the demo, and one participant waited two weeks to provide follow-up answers. In my opinion, the amount of time that passed did not affect the quality or quantity of responses. In fact, at least two respondents added more information to their responses days after they provided initial answers because new information had occurred to them. One participant later shared, “I long to revisit some of the immersive worlds” experienced on the Apple Vision Pro—in this case, the desert planet Tatooine, the fictional home of Luke Skywalker.
This catch-all question included discussions about user interface issues—mostly reactions to the controls and a couple of mentions about the frustration accessing the Control Center. Some participants also took the opportunity to reiterate slight discomfort wearing the device, and the fact that users cannot see their feet while standing in immersive environments.
However, all participants reiterated their surprise at the realism provided by the Apple Vision Pro in immersive environments. Several participants commented that they wanted to be able to visit these environments whenever they wished, both for entertainment and to create pleasant work environments. Further, all participants mentioned how this device should be used to provide experiential learning opportunities.
Final Thoughts
After the demonstration experiences I shared with these five participants, I am more convinced than ever that the Apple Vision Pro is on the road to becoming a valuable tool for learning. Indeed, I have personally used the device for my own learning in the past 15 days, and my hope is to pursue the ideas I learned from my friends and colleagues moving forward.
These demonstrations have served to provide me with a roadmap for selecting apps and trying new things discussed by participants. Some areas I will pursue match the findings from this report:
- Experiential learning
- Immersive environments
- Realistic how-to experiences
- Creating work-conducive environments
- Expert-guided simulations
- Virtual travel/field trips
- Artistic and cultural experiences
- Apps that allow users to create
Finally, many thanks to the friends and colleagues who participated in these demonstrations. I hope I provided a fun and worthwhile experience—and that everyone learned as much as I did. Thanks, too, to Sean at the Apple Store in Deer Park (IL) who led me through my first demo earlier this month and gave me a blueprint for my own demos. I further hope that these experiences will inspire other education leaders to learn through teaching by following some of the processes described above.
References
Apple Support. (2024a). Apple Vision Pro User Guide: Clean your Apple Vision Pro. Retrieved from https://support.apple.com/guide/apple-vision-pro/clean-your-apple-vision-pro-tan6200165e8/visionos
Apple Support. (2024b). How to clean Apple Vision Pro and accessories. Retrieved from https://support.apple.com/en-us/HT213964
Subaru. (July 31, 2023). A Beautiful Silence :30 [video]. Retrieved from https://www.youtube.com/watch?v=X_kxjt6gf1Y
Wikipedia. (2023). Method of loci. Retrieved from https://en.wikipedia.org/wiki/Method_of_loci
Yates, Frances A. (1966). The Art of Memory. Chicago: University of Chicago Press.














