My initial thought when reading “What Are Multimodal Projects?” by Arola, Sheppard, and Ball was that it was like most texts you get from a textbook: informative and chunked, with images to keep readers’ attention. It was far from the fantasy-based texts I am interested in; however, the reading was surprisingly quick and entertaining.
Arola, Sheppard, and Ball’s text taught me what multimodality is and how we use it. Multimodality is a combination of the words “multiple” and “modes.” According to the text, “Multimodal describes how we combine multiple different ways of communicating in everyday life” (Ball 1). There are several different modes, which include aural, visual, spatial, gestural, and linguistic.
Arola and Ball gave a clear explanation of multimodality and helped me understand each mode’s form and purpose. Throughout the reading I couldn’t help but think of memes. Memes are a prime example of how individuals use multiple modes to communicate with each other. Memes come in a few forms but most commonly consist of an image with a caption. The purpose of a meme is to be funny and make the reader laugh; however, memes are seen and used more by the younger generations and often aren’t understood by the older generations. This tends to create a divide in culture.
As a supplementary text I read Alex Mendoza’s “Deaf community outraged after interpreter signed gibberish before Irma.” This text speaks about a very unfortunate event that happened in Manatee County, Florida. The leaders of this county used a complete amateur as the interpreter for their press conference. This man was obviously not prepared or experienced enough; during the press conference he signed phrases such as “pizza want you are,” “reduce me,” and “dog cat.”
This article was a prime example of how vital modes are to our understanding of concepts, actions, and information. The deaf community depended on what the interpreter signed in order to understand important information. Without the proper information, the deaf community would be confused and left in the dark. Doing annotations on this text really helped cement this concept in my mind, and watching the video also illustrated how essential these modes are.
During my annotations, the link between Arola and Ball’s discussion of the gestural mode and the amateur interpreter really solidified. According to Arola and Ball, “When we interact with people in real life or watching them on-screen, we can tell a lot about how they are feeling and what they are trying to communicate” (12). Just by watching the interpreter, people with hearing impairments wouldn’t have been getting much help; looking at him, you could tell he was uncomfortable and unsure of himself, which would have surely caused huge concern.
Ball, Cheryl E., et al. “What Are Multimodal Projects?” Writer/Designer: A Guide to Making Multimodal Projects. Bedford/St. Martin’s, 2018.
Mendoza, Alex. “Deaf Community Outraged after Interpreter Signed Gibberish before Irma.” New York Post, 16 Sept. 2017.