Supporting Multiple Ways to Perceive Information in Universal Design for Learning
Supporting multiple ways to perceive information is a universal design for learning (UDL) guideline that supports multiple means of representation. The UDL guidelines were developed by an organization called CAST. This article explores how teachers and other educators can support multiple ways to perceive information.
Supporting multiple ways to perceive information means presenting information in ways that learners can perceive with a variety of senses. For example, teachers can present visual information in other ways, such as:
- Audibly
- Tactually
These presentation methods benefit many learners, including those who:
- Are blind or visually impaired
- Are unfamiliar with the type of information being shown, such as multiple forms of:
  - Graphics
  - Art
  - Text
Teachers can use many strategies to present visual information in other ways. For example, teachers can read aloud all text that they write on a white board or other surface. Similarly, they can create audio versions of any texts they create, such as:
- Course outlines
- Handouts
- Assignments
Alternatively, learners may use screen-reading or magnification software to read these texts. For instance, if school boards provide learners with computers or tablets, these devices may include built-in read-aloud or magnification functions, so teachers should know how to use these functions to support their students. Teachers should also know how to format their texts so that this software can read them well.
Similarly, teachers can use textbooks created to be accessible with screen-reading and magnification software, or source textbooks from publishers committed to providing accessible-format versions for students who need them.
Descriptions for Visuals
In addition, teachers can verbally describe any visuals they show in class, such as:
- Slides
- Graphs
- Maps
- Diagrams
- Works of art
- Other images
Teachers may give these descriptions aloud during class or provide learners with written image descriptions. Teachers writing their own descriptions have the chance to think in advance about:
- Background information about the visuals that learners should have
- What elements of the visuals they want to draw learners’ attention to
- Possible interpretations of the visuals, depending on context
- Why the visuals are important to the lesson
Alternatively, teachers can find audio or tactile ways to represent visuals. For example, teachers could show tactile models of math or science concepts, in addition to diagrams.
Videos and Animations
Similarly, teachers should create audio descriptions and transcripts for any videos or animations they play in class, or choose videos that already include them.
Braille
Another support for access to text, whether print materials or transcripts, is Braille. Some texts may already be available in Braille. Otherwise, teachers can work with Braille transcribers to create Braille copies. Currently, many learners who could benefit from Braille are not taught to read it. However, if classroom teachers expose their students to Braille in class, students who do not read it yet may want to learn.
Audio Information
Likewise, teachers can present audio information in other ways, such as:
- Visually
- Tactually
These presentation methods benefit many learners, including those who:
- Are deaf or hard of hearing
- Know multiple languages
- Have difficulty with memory
Teachers can use many strategies to present auditory information in other ways. For example, teachers can create text versions of their lessons, through:
- Captions
- Transcripts
For instance, teachers can use speech-to-text software to make transcripts. Similarly, teachers can create captions or transcripts for any audio or video they play for their classes, or choose audio and video that already include them. Likewise, teachers should provide lyrics and descriptions for any music they play. Teachers writing their own descriptions have the chance to think in advance about:
- What elements of the music they want to draw learners’ attention to
- Why the music is important to the lesson
Another support for access to speech, whether in real time or in audio or video content, is sign language interpretation. Some audio or video content may already include sign language interpretation. Otherwise, teachers can work with real-time sign language interpreters. There is currently a shortage of sign language interpreters available to support schools. However, if teachers arrange sign language interpretation in class, students who do not sign yet may want to learn, and some could even become sign language interpreters in the future.
In addition, teachers can use other visual aids to represent sound, such as:
- Diagrams
- Charts
- Notations
Moreover, teachers can represent elements of sound, such as emphasis or prosody, in visual ways, including:
- Images
- Symbols
- Emoticons
Alternatively, teachers can find tactile ways to represent sounds. For example, the ringing school bell could reach learners either visually, through flashing lights, or tactually, through vibrations.