Getting Started With Augmented Reality (AR) in Inclusive Online Teaching and Learning in Higher Education: An Extended Environmental Scan for Pedagogical Design Leads


DOI: 10.4018/979-8-3693-1034-2.ch003

Abstract

With the mass-scale adoption of augmented reality (AR) in the commercial space, and with student interest in 3D and 4D experiential learning, higher education is starting to look towards digital augmentations of real spaces for teaching and learning. University creative shops are exploring whether they can get into the game of producing AR-enhanced experiences: campus tours, interactive gaming, virtual laboratories, exploratory art spaces, simulations, design labs, online / offline / blended teaching and learning modules, and other AR applications. This work offers a basic environmental scan of the AR space for inclusive online teaching and learning, and it includes pedagogical design leads drawn from the current research, technological know-how, hands-on design / development / deployment of learning objects, and online teaching and learning methods. This work does not take a pedagogical theory approach, although several theories are mentioned. Rather, the focus is on applied AR in teaching and learning.

Review Of The Literature

Augmented reality is considered “a variation of Virtual Environments (VE)” (Azuma, 1997, p. 2) or virtual reality (VR). Both are synthetic environments although AR is considered more annotative, and VR may include a more complete synthetic world surround. Azuma writes:

VE technologies completely immerse a user inside a synthetic environment. While immersed, the user cannot see the real world around him. In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with the real world. Therefore, AR supplements reality, rather than completely replacing it. (Azuma, 1997, p. 2)

AR is thought to harness the strengths of the real world along with the virtual. AR involves the adding of “objects that were not perceptible a priori” (Mohamed & Mohamed, July 2012, p. 1). From early days, it was used to enhance human workplace performance:

Augmented Reality enhances a user’s perception of and interaction with the real world. The virtual objects display information that the user cannot directly detect with his own senses. The information conveyed by the virtual objects helps a user perform real-world tasks. AR is a specific example of what Fred Brooks calls Intelligence Amplification…: using the computer as a tool to make a task easier for a human to perform. (Azuma, 1997, p. 3)

Key Terms in this Chapter

Augmented Reality (AR): The use of digital virtual objects in physical space to annotate or enrich human perception in the physical space (or may require a headset to engage an augmented world in a physical space).

UV Mapping: A spatio-physical mapping from 3D to 2D (and vice versa) in which the horizontal x-axis is “U” and the vertical y-axis is “V” (in order to enable proper application of texturing to a 3D shape at the level of granularity and focus that makes visual sense) (“UV mapping,” Nov. 23, 2022); has been replaced by some other technologies and approaches, such as tri-planar mapping.
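As a purely illustrative sketch (not drawn from the chapter), a common spherical UV projection shows how a point on a 3D surface can be assigned (U, V) texture coordinates; the function name here is hypothetical:

```python
import math

def spherical_uv(x, y, z):
    """Map a point on a unit sphere to (u, v) texture coordinates.

    One common convention: u follows longitude around the vertical
    axis, and v follows latitude from pole to pole, so a flat 2D
    texture image can be wrapped around the 3D shape.
    """
    u = 0.5 + math.atan2(z, x) / (2 * math.pi)
    v = 0.5 - math.asin(y) / math.pi
    return (u, v)

# A point on the sphere's "equator" lands at the vertical middle
# of the texture (v is approximately 0.5).
print(spherical_uv(1.0, 0.0, 0.0))
```

Tri-planar mapping, mentioned above, sidesteps this kind of unwrapping by projecting the texture along all three axes and blending the results.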

Point-of-View (POV): Perspective, line-of-sight, angle-of-view.

Augmented Reality Learning Experience (ARLE): The subjective and objective learning experience from designed augmented reality scenarios and scenes and interactions.

Extended Reality (XR): A term which includes “virtual reality” and “augmented reality” and “mixed reality”; sometimes “eXtended reality.”

Contour: An outline, often of a shape, and often in reference to a curved line; the general form or structure of a thing.

Edge: The outside limit of an area or object; a line; a stroke.

Strong Augmented Reality Paradigm: An approach involving changing “the space perception of its inhabitants so that they cannot distinguish the physical base of their place from its virtual extension” (De Michelis, De Paoli, Pluchinotta, & Susani, April 2000, p. 82).

Pre-Rendering: Adding color, texture, and details to a 2D or 3D wireframe for evocative images on a screen or in ambient space, done ahead of the actual viewing or consumption.

Curve: A non-linear line, a line or stroke or outline with curvature.

Mobile Augmented Reality Learning Systems (MARLS): Systems designed for learning through mobile augmented reality (using mobile devices to project the digital augmentations).

UV Position: The mapping of 3D to 2D and vice versa to enable the painting of textures and colors on a 3D surface in a way that is visually appropriate for the 3D shape or object or environment.

Mesh: An interlaced overlay of columns and rows representing the external geometric shape of a model (in 2D or 3D); the shape is modeled by adjusting mesh points, handle points, textures, gradients, applied colors, and other aspects.

Mobile Augmented Reality System (MARS): AR mapped to mobile settings (Chang & Tan, Dec. 2010, p. 20).

Z-Axis: The axis on which the third-dimension depends indicating physical depth.

Node (Nodes): Vertex, a point at which lines start or end or bend; a point in space; a point at which lines intersect.

Sound Design: The planning of sound effects, voiceovers, and other audio for a 3D or 4D sequence.

Virtual Reality Augmented Reality (VR/AR): The inclusion of both VR and AR as a mixed reality solution for particular objectives (including for learning).

Segmented Reality: The combination of real and virtual reality to create a sense of different and “divided” elements that come together in a more coherent experience (without a full emulation of reality) (De Michelis, De Paoli, Pluchinotta, & Susani, April 2000, p. 82).

Haptic Gloves: A wearable interactive device that simulates actual touch through haptic sensors; an input and output device.

Non-Playable Characters (NPC): Scripted synthetic agents that spawn in an AR scene and “interact” with human users (that somehow perform for or communicate to the human users).

Tangible Augmented Reality (TAR): Virtual components are registered to physical ones, and the virtual elements may be “manipulated by interacting with physical components in the real world” (Jain & Choi, July 2020, p. 43).

Immersive VR or AR Headset: Head-mounted display, head-mounted device.

Marker: A visual indicator that situates or positions the virtual object in 3D space.

Weak Augmented Reality Paradigm: An approach involving inducing “people to behave as if their place were transformed even if the physical space where they are located does not seem changed” such as remoting into a workstation or remote lab (De Michelis, De Paoli, Pluchinotta, & Susani, April 2000, p. 82); a partial AR experience that does not change the full perceptual surround of the individual or group in the AR experience.

Simulator Sickness: A sense of nausea or dizziness from engaging in virtual reality or augmented reality or mixed reality (or some combination).

Primitive: A core shape or polygon or element that is used to build more complex objects and forms and virtual or augmented environments.

3D: Three-dimensional space, often specified as comprising the x, y, and z axes; volumetric space that represents the actual world with higher fidelity (in theory).

Virtual Reality (VR): A 3D simulated environment or immersive virtual world that evokes a space, personages, physics, and objects, among others; VR may be passive or interactive.

Mixed Reality (MR): A term which refers to the combination of the physical-real and the augmented or diminished or virtual real (based on technical and other mediation).

Markerlessness: A type of AR that does not require a marker (such as a QR code) to be scanned in order to appear; such AR is instead launched from an app or web-based AR experience in a location (and some without a need for a location) to show the digital elements.

Augmented Reality Virtual Reality (AR/VR): The inclusion of both VR and AR as a mixed reality solution for particular objectives (including for learning).

Disembodied Learning: Incorporeal learning; non-physically present sorts of learning (historically applied to distance or remote learning); learning through symbols and not direct experience with the actual depicted object; indirect non-experiential learning.

QR (Quick Response) Code: “A machine-readable code consisting of an array of black and white squares, typically used for storing URLs or other information for reading by the camera on a smartphone” (per Google dictionary).

4D: 2D or 3D visuals including motion (with motion as the fourth dimension).

QR (Quick Response) Marker: A two-dimensional machine-readable code that may be used by an augmented reality program to display particular designed artifacts or other elements based on the embedded information there.

Occlusion: The blocking or obstructing of one object with another, with the closest object to a viewer blocking what is behind.
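The idea behind occlusion can be sketched, purely as an illustration (not from the chapter), with the classic z-buffer test that rendering engines use to decide which object hides another:

```python
def depth_test(depth_buffer, x, y, fragment_depth):
    """Classic z-buffer occlusion test: a new fragment is drawn only
    if it lies closer to the viewer than whatever has already been
    drawn at that pixel; otherwise it is occluded."""
    if fragment_depth < depth_buffer[y][x]:
        depth_buffer[y][x] = fragment_depth  # record the nearer depth
        return True   # visible: draw this fragment
    return False      # hidden behind a nearer object

# A 1x1 "screen" starts empty (infinitely far away).
buf = [[float("inf")]]
depth_test(buf, 0, 0, 5.0)   # nearer object: drawn
depth_test(buf, 0, 0, 7.0)   # farther object: occluded
```

In AR specifically, the hard part is obtaining depth for the real scene, so that virtual objects can correctly disappear behind physical ones.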

Illumination: The lighting, available light.

Authoring Tool: Software used for the creation and editing of digital objects of various types (with particular technical standards).

Multi-Object-Oriented Augmented Reality (MOOAR) System: A system that enables the building and deployment of a location-based adaptive mobile learning environment for the setup of augmented reality scenarios and objects (for people’s experiences).

VAR: Virtual and augmented reality.

Mid-Air Images: Imaging technologies that show moving computer graphics (CGs) in real space.

Spatial Augmented Reality on Person (SARP): The overlay of digitally projected visuals on people’s physical bodies (such as skeletons, organs, and other representations of the internal physical state).

Contrast: The difference between the lightness and darkness of particular colors (based on particular lighting conditions).
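Given the chapter’s focus on inclusive design, it may help to note that web accessibility guidance (WCAG) quantifies contrast as a ratio of relative luminances; a minimal sketch, offered as background rather than material from the chapter:

```python
def contrast_ratio(lum1, lum2):
    """WCAG contrast ratio between two relative luminances (0.0-1.0).

    Ranges from 1:1 (identical colors) up to about 21:1 (pure white
    against pure black); WCAG level AA asks for at least 4.5:1 for
    body text.
    """
    lighter, darker = max(lum1, lum2), min(lum1, lum2)
    return (lighter + 0.05) / (darker + 0.05)

# White (luminance 1.0) on black (luminance 0.0): the maximum ratio.
print(round(contrast_ratio(1.0, 0.0), 2))  # → 21.0
```

Digital overlays in AR face an added wrinkle: the background luminance comes from the physical scene, so effective contrast shifts with ambient lighting conditions.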

Artificial Lighting: The illusion of a light field (whether natural or artificial or other) using digital illumination techniques (and algorithms enabling light interactions with various surface types and varying real-world shapes).

Digital Learning Object: A discrete designed entity used for learning, including augmented reality (AR) learning objects or “AR-LOs.”

Self-Referential Encoding: Learning new information such as about the human body by seeing AR visualizations projected on their own bodies (in a form of embodied learning).

Immersive AR (IAR): The blending of the real and the virtual for a full-sensory embodied experience that is perceived as full-surround and engaging.

Polygon: A 2D or 3D closed shape (in geometry).

Haptic Devices: Input and / or output devices that use touch (perhaps including pressure, temperature, and other factors).

Head-Mounted Device (HMD): Immersive VR or AR headset, sometimes head-mounted display.

Tabletop Display: The projection or display of augmented reality scenes and objects on a table surface.

2D: Two-dimensional space, often specified on a flat plane with x (horizontal) and y (vertical) axes and a coordinate system to indicate location on the plane.

Embodied Learning: Experiencing the learning through the senses and proprioception (awareness of the body’s positioning, from within) and interoception / enteroception (awareness of the body’s internal state, through the unconscious, subconscious, and conscious) through the body.

Spatial Augmented Reality (SAR): The overlay of digitally projected visuals on physical objects and scenes (vs. using mobile devices, monitors, head-mounted displays, hand-helds, or other such objects).

Feature Map: The connecting of particular variables to matched values or dimensions or qualities.

Model: A visual representation of an object, phenomena, event, scene, person, or some combination.

Texture: A visual pattern indicating the surface feel of a substance.

Diminished Reality: The removal of parts of physical reality from the observer view in real time through digital occlusion (but not considered part of “augmented reality” but rather part of “mediated reality” (Broll, 2022, p. 321); taking away real-world light information instead of adding digital objects per se, introducing the illusion of “negative space” in some senses).

Bodystorming: A pre-prototype walk-through of a scene or scenario for augmented reality or virtual reality in order to enable quick ideation through spatial interactions (Maguire, July 2020, p. 472); an idea somewhat related to “brainstorming” albeit with a focus on embodied creativity or innovation.

Rendering: Adding color and texture to a 2D or 3D wireframe to create visuals on a screen or in ambient space, sometimes referred to as “painting in” or “spawning.”

Augmented Reality Learning Object (ARLO): A base unit of an AR digital learning object.

Mobile Learning: A form of ubiquitous “anytime anywhere” learning using mobile devices to access the learning materials, activities, interactivity, and other features.

Registration: The accurate recognition of an object with a smartphone camera or other device.
