Synchronicity in videogames

Presentation about audio, visual and kinaesthetic synchronicity in videogames.


Published in: Entertainment & Humor
  • This presentation sets out the basic concepts to be articulated in my main essay. It takes the general idea of audiovisual synchronicity already established in film studies and applies it to games. The focus is on audio's role in the audiovisual relationship and on the idiosyncrasies that video games, as a digital medium, bring to it.
  • This work starts by revisiting two games that are fundamental to the emergence of video games: Pong and Space Invaders. Scott Cohen's book about the early history of Atari contains a testimony of what may have been one of the first public sessions of Pong. Two aspects of his report deserve special attention: one is that the author mentions sound as an important element for understanding how the game worked; the other is that he describes the "pong" sound as "beautifully resonant". Hearing these beeps today, it is hard to imagine how they could be considered attractive; yet it seems clear that these sounds were a source of pleasure, based not only on Cohen's account but also on the fact that legions of fans of early video game music persist to this day. In the case of Space Invaders, our attention was drawn to the background music. It consists of a four-note ostinato that mirrors the aliens' choreography as they march down the screen. More importantly, it gradually speeds up as the enemies are eliminated and the game progresses. Later, Nintendo's clone Space Fever (1979) would additionally change the motif and raise its key, heightening even further the sense of urgency created by the music. The observation of these games raised some very basic but fundamental questions. Why did game designers bother with sound at this stage of technological development? Why were they trying to produce sounds synthetically in real time? Why did the music have to change according to what was happening in the game? The most obvious hypothesis, and arguably the most important for that reason, is that they were searching for synchronicity between sound and visuals, and between these elements and the game. Sounds helped players understand the images, and the images needed sounds fixed to them; therefore the sounds had to be produced with the same technology that generated the images. Sounds also helped players understand the game, so they had to adapt in accordance with it. These relationships will be discussed below. One question, however, remains open: why describe the sounds as "beautifully resonant" pongs when they were merely square-wave synthesized beeps? This is harder to answer, but we expect to offer at least some propositions in our conclusion.
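The adaptive-music mechanic described above can be sketched in a few lines. This is an illustrative reconstruction, not actual Space Invaders code; all names, pitches and constants are hypothetical.

```python
# Illustrative reconstruction of the Space Invaders-style adaptive-music
# mechanic: a four-note ostinato whose inter-note delay shrinks as enemies
# are eliminated. All names, pitches and constants here are hypothetical.

def note_interval(total_enemies: int, remaining: int,
                  slowest: float = 1.0, fastest: float = 0.1) -> float:
    """Seconds between ostinato notes, shrinking linearly with kills."""
    progress = 1.0 - remaining / total_enemies  # 0.0 at start, ~1.0 near the end
    return slowest - (slowest - fastest) * progress

# The familiar descending four-note loop (illustrative pitches).
OSTINATO = ["E2", "C2", "D2", "B1"]

def next_note(step: int) -> str:
    """Cycle endlessly through the ostinato."""
    return OSTINATO[step % len(OSTINATO)]
```

Under these assumed constants, a wave of 55 invaders starts at one note per second and approaches ten notes per second as the last aliens fall; Space Fever's motif change and raised key would be a separate transformation of the note sequence itself.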
  • The importance of synchronicity in audiovisual media has been justified by cognitive studies (Anderson, 1996: 80-89). Experiments show that infants demonstrate more interest in images presented in synchrony with sounds. They pay more attention, and for longer, when images and sounds occur at the same time than when presented with images alone or with delayed, out-of-synch sound. These studies suggest that perception is an active process in which, even at early stages of psychological development, we constantly search for patterns between the visual and aural modalities. Cross-modal events are not passively accepted but actively constructed by our perception, a fact that helps to explain, for instance, why infants, without knowing what a xylophone is or how it should sound, perceive the images of someone playing the instrument and the sound being produced as coming from a single source (Anderson, 1996: 80-89). Joseph Anderson argues that films, configured as a bi-modal medium, are perceived through this same innate mechanism; that is why voices in movies are recognized as originating from the characters on the screen and not from the loudspeakers behind it, a phenomenon often called ventriloquism (Anderson, 1996: 80-89). The author also suggests that, when checking correspondences between the senses, invariant features found across modalities are understood as being created by a single source. This explains, at least in part, correspondences such as those between musical meter and cinematic montage, between musical motifs and narrative or dramatic consistency, and the confirmation or contradiction between what is seen and what is heard (Anderson, 1996: 80-89). Even the spectator's primal acceptance of non-diegetic music as a constituent of a movie can be understood as a result of our propensity for bonding modalities (Anderson, 1996: 86).
Annabel Cohen likewise gathers a number of empirical studies demonstrating how music interacts with film in the creation of meaning and emotion by, among other factors, directing the spectator's attention to important elements on the screen through patterns shared between the aural and visual modalities. She also suggests, based on recent studies of neural synchronization, that when music shares patterns with the visuals it contributes to the synchronized activation of neural patterns, an occurrence these studies currently associate with the organization of consciousness and attention. This would help explain how sounds attached to images mask their real source, as in the example of ventriloquism in cinema mentioned above (Cohen, 2010: 893-894).
  • Many of these premises also apply to video games, since they are predominantly an audiovisual medium. However, a special facet of games raises different issues: the fact that they are actively played, not only viewed. The interactive aspect of games creates a different channel of perception, a different modality through which patterns of synchronism may occur. Alex Stockburger describes this as a "kinaesthetic space", a "link between the player's body and the representational space of the game" (Stockburger, 2006: 253). The player's manipulation of the game creates a series of inputs that are interpreted by the game engine and represented in the audiovisual space, creating a kind of feedback loop between the player and the game's audiovisual medium. The effective "embodiment" of the game experience depends mostly on the fit between the player's gestures and the game's interface, as well as on the repeated use of patterns of feedback and gameplay (Stockburger, 2006: 164-165). This loop is also fed by events generated by the game's logic or rules, so the kinaesthetic space is predominantly a two-way path. Gestural and systemic patterns are often related to aural and visual events in games, producing effects that differ from those of traditional audiovisual media. In this paper, the kinaesthetic dimension will be treated as a distinct modality perceived concomitantly with the aural and the visual; digital games will therefore be considered a tri-modal medium. Although relevant, the gestural movements of the player in the "real world" will not be considered, only the representation of these movements in the game's world or their mediation by the game's audiovisual space.
As a working concept, synchronism will then be observed whenever audio, or identified aural patterns, are perceived as linked to visual and/or kinaesthetic patterns in the form of events or parameters generated by the player's inputs or by the game's logic.
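As an illustration of this working concept, the feedback loop can be sketched as a dispatcher that renders one game-logic event on all three channels in the same frame. Every name below is hypothetical and implies no real engine.

```python
# Hypothetical sketch of the player-game feedback loop described above:
# the game logic emits a single event that is rendered simultaneously on
# the aural, visual and kinaesthetic channels. No real engine is implied.

from dataclasses import dataclass, field

@dataclass
class TriModalEvent:
    name: str        # game-logic event, e.g. the ball rebounding in Pong
    sound: str       # aural rendering (a sample or synth patch)
    animation: str   # visual rendering
    rumble: float    # kinaesthetic rendering (controller vibration, 0.0-1.0)

@dataclass
class GameLoop:
    frame: int = 0
    log: list = field(default_factory=list)

    def dispatch(self, event: TriModalEvent) -> None:
        # Synchronicity in the working sense above: all three modalities
        # fire on the same frame, so the player perceives one integral event.
        self.log.append((self.frame, event.sound, event.animation, event.rumble))

    def tick(self) -> None:
        self.frame += 1

loop = GameLoop()
loop.dispatch(TriModalEvent("bounce", "beep.wav", "paddle_flash", 0.3))
```

The design point is simply that the three renderings share a single timestamp; desynchronizing any of them (playing the beep a few frames late, say) would break the cross-modal pattern the player's perception is searching for.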
  • Some attempts to describe the audiovisual relationship in cinema will be discussed in the essay; however, because it provides a more workable concept, this presentation will focus on Michel Chion's idea of the audiovisual contract. Chion (1994: 3-65) describes the relationship between audio and visuals in sound cinema as a "contract" that the spectator must accept in order to be carried by the illusion that these disparate modalities originate from the same source. The acceptance of this contract promotes the contamination of one modality by the other, and the audiovisual event is perceived as cohesive and autonomous, when in fact sound and image are simultaneously influencing and defining each other through what the author calls "added value". Chion describes it as "the expressive and informative value with which a sound enriches a given image so as to create the definite impression (...) that this information or expression 'naturally' comes from what is seen, and is already contained in the image itself" (Chion, 1994: 5). Added value is reciprocal: sounds influence visuals as much as the other way around. When we see someone being punched in a film, we also accept the crunched-melon sound (or whatever technique the sound designer applied) as the sound of a hand's impact on the character's face or body. The author fuses the words synchronism and synthesis into the neologism synchresis (Chion, 1994: 63-64) to define the effect on film spectators when an auditory and a visual phenomenon occur synchronously and are perceived as an integral event, in a relationship that is not only possible but also necessary. Explosions, doors shutting, people speaking and all sorts of events benefit from this phenomenon; additionally, different effects can be created by attaching apparently incompatible sounds and images to each other. Synchresis can also serve to create points of synchronization in a film's development.
In Chion's definition, synch points are salient moments of an audiovisual sequence where sounds and visuals meet in synchrony (Chion, 1994: 58). The difference between the two concepts can be stated thus: while synchresis describes a discrete moment, synch points are defined over a sequence of events; the former can be approached as paradigmatic, concerned with the vertical relations between modalities, while the latter nears the syntagmatic, describing the structural nature of an event. According to Chion, synch points shape a film's narrative and drama by magnifying or masking critical moments in the audiovisual discourse, for instance by emphasizing a revealing word in the dialogue or by concealing the visual results of a gunshot, a case of what the author calls a "false synch point" (Chion, 1994: 59-60). Synch points structure a film's architecture, providing nodes of arrival and departure, foundations from which time can be freely stretched or distorted, as observable, for instance, in Japanese animated films, where sequences are constantly slowed down, repeated or frozen, anchored by the synch points of sonorous hits and punches.
  • These concepts apply to games as well; however, the addition of the kinaesthetic channel multiplies the possible correspondences between modalities. A simple case can be found in Pong: when we hear the beep at the moment the ball reverses direction, we not only see it bouncing, we feel it rebound. As we saw in Scott Cohen's story, the sensation of tangibility provided by the synchresis between the ball's movement and the beep was crucial for players to understand the virtual game as a physical activity and react to it. The actual performance of players depends on the good use of this type of synchrony, because the timing of the player's responses is linked to these events. The synchretic effect is therefore fundamental to the creation and success of gameplay, through the rendering of tangibility in the game's virtual domain. Different general uses of synchresis in games can be identified in the following examples. Synchresis rendering physicality or tangibility: this is arguably the most ordinary use; however, it gains special significance in some genres, such as the so-called physics games. These games use physics simulation as a fundamental component of gameplay, normally with flexible rules and simple objectives (e.g. Angry Birds and World of Goo).
  • Synchresis in gameplay: synchresis can be used during play to enhance dramatic moments or gameplay in general. For instance, in fighting games, special combos and movements are given dedicated animations (e.g. Street Fighter); and false synch points can intensify anxiety in horror games such as F.E.A.R.
  • Synchresis as sign: events tied to the player's actions that provide information on what is happening in gameplay. Early games used this feature extensively; sounds tied iconically or symbolically to image and action informed the player about events such as items collected or special powers obtained (e.g. Mario Bros). Even recent, more realistic games maintain the practice, despite using more indexical or realistic signs (e.g. Resident Evil).
  • Synch points are also used extensively in games to generate fiction, resembling the cinematic model, and to organize the game's structure, understood as its set of objectives and rules. Probably the best-known synch points in games are the win and lose jingles (e.g. Super Mario Bros); however, subsidiary events also mold a game's structure (e.g. Peggle). Cinematic events mold the game's fiction or narrative and are found widely in the form of cut scenes, for example in so-called first-person shooters such as the Gears of War series, or as in-game animations.
  • A broader approach to synchronicity can be found in games that put the synchronization between audio, visual and kinaesthetic events in the foreground of gameplay. They can be divided into two groups. Synchronicity as gameplay: games that use pulse, musical and/or rhythmic patterns as the fundamental element of gameplay. Rhythm and music games in general fall into this category, although different approaches can be spotted: while Guitar Hero is heavily visual, Patapon relies singularly on listening to the music's pulse. An intermediate example, where visual and sound patterns are more integrated, can be found in Rhythm Heaven.
  • Synchronicity as synaesthesia: games where synchronicity is not essential to the rules, but the whole gaming experience is felt as being moved by it. Synaesthesia is more accurately defined as a psychological experience in which the senses invade each other, that is, in which one modality is translated into another; for instance, when certain musical notes are perceived as having specific colors attached to them. In this article, the term is used broadly to describe games that intensively stimulate all modalities in a holistic approach. This can be seen in games that use excessive feedback, often not important as information but as an exhilarating effect (e.g. Pac-man Championship Edition DX). Other games attempt to organize these feedbacks aesthetically, creating, for instance, a coherent musical environment (e.g. Lumines and Rez).
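One technique commonly described as underlying such coherent musical environments in Rez-style games is beat quantization: a player-triggered feedback sound is delayed to the next subdivision of the beat grid instead of playing immediately. A minimal sketch, with the function name and defaults entirely hypothetical:

```python
# Sketch of beat quantization, a technique often attributed to Rez-style
# games: a sound triggered by the player is snapped forward to the next
# subdivision of the beat grid so that all feedback stays musically coherent.
# The function name and default values are illustrative assumptions.

import math

def quantize(event_time: float, bpm: float = 120.0, subdivision: int = 4) -> float:
    """Return when a sound triggered at event_time (seconds) should actually
    play: the start of the next 1/subdivision of a beat at the given tempo."""
    slot = 60.0 / bpm / subdivision          # seconds per grid slot
    return math.ceil(event_time / slot) * slot
```

At 120 BPM with sixteenth-note slots, a shot fired at t = 0.30 s would sound at t = 0.375 s; the delay is barely noticeable, but every feedback sound lands on the musical grid, turning the player's actions into part of the score.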
  • Probably Scott Cohen's initial affirmation that Pong's sounds were "beautifully resonant" is less related to their aesthetics and more oriented toward the capacity of sounds to bond with images and resonate with the kinaesthetic modality. The satisfaction of having all modalities simultaneously and meaningfully stimulated may well be a source of pleasure, if we consider our innate propensity to search for these correspondences. This hypothesis corroborates Steve Swink's definition of "game feel" as "a powerful, gripping, tactile sensation that exists somewhere in the space between player and game. It is a kind of 'virtual sensation', a blending of the visual, aural and tactile" (Swink, 2009: xiii). The history of video games is currently at a transitional moment, a period of cultural redefinition described by Jesper Juul as "a casual revolution" (2010), in reference to the industry's turn toward games that are easier to learn and fit a broader audience, as opposed to hardcore games. During the mid-1980s and 1990s, video games evolved and matured as a symbolic system; they created a whole set of sophisticated conventions, but also forged a very specialized audience. Today, the video game audience is being profoundly redefined and, in this process, games based on music and mimetic gameplay are playing a privileged role, together with designs that explore game feel and emotional responses. The pleasure of experiencing audiovisuals and movement in a holistic way can be regarded as one of the flagships of video games' new emergence, and the study of synchronicity can help to shed light on this process.
  • Transcript

    • 1. Synchronicity in Games. Eduardo Larson
    • 2. Start
      • Pong: "There was a beautifully resonant pong sound, and the ball bounced back to the other side of the screen" (Cohen, 1984: 28-29).
      • Space Invaders: adaptive background music.
        — Why produce synthetic sounds in real-time?
        — Why adaptive music?
        — Synchronicity!
        — Why "beautiful" and "resonant" sounds?
        — ...
    • 3. Why synchronism?
      • Cognitive studies show that (in Anderson, 1996):
        • Infants are more interested in images linked to sounds; they are perceived as more important.
        • Infants actively seek patterns across modalities.
        • When cross-modal patterns are confirmed, the information carried by them is perceived as being generated by a single event.
      • Film spectators are constantly searching for patterns that bridge modalities (Anderson, 1996).
      • The synchronized activation of neural patterns is associated with the organization of consciousness and attention (Annabel Cohen, 2010: 893-894).
    • 4. Films versus games
      • Films:
        • are fixed.
        • are bi-modal.
      • Video games:
        • are interactive.
        • are tri-modal; they include a "kinaesthetic space" (Stockburger, 2006).
      • Working definition: synchronicity will be used whenever audio is perceived as linked to visual and/or kinaesthetic events or parameters.
    • 5. Chion's audiovisual contract
      • Added value: the expressive and informative value with which a sound enriches a given image, and vice versa.
      • Synchresis: the psychological fusion between a sound and a visual event when these occur at the same time.
      • Synch points: salient synchronized moments of an audiovisual sequence.
    • 6. Example 1
      • Synchresis rendering physicality or tangibility: simulation, physics games, etc.
    • 7. Example 2
      • Synchresis in gameplay: enhanced gameplay.
    • 8. Example 3
      • Synchresis as sign: provides information and feedback.
    • 9. Example 4
      • Synch points as structure: molds game progression and builds fiction through cinematic events.
    • 10. Example 5
      • Synchronicity as gameplay: rhythm and music games.
    • 11. Example 6
      • Synchronicity as synaesthesia: the experience is felt as being moved by synchronicity.
    • 12. References
      • Anderson, J. 1996. "Sound and Image". The Reality of Illusion: An Ecological Approach to Cognitive Film Theory. pp. 80-89. Carbondale, Ill.: Southern Illinois University Press.
      • Chion, M. 1994. Audio-vision: sound on screen. Translated by Claudia Gorbman. New York: Columbia University Press.
      • Cohen, A. 2010. "Music as a source of emotion in film". Handbook of Music and Emotion: theory, research, applications. Eds. Patrik N. Juslin & John A. Sloboda. pp. 879-908. Chapter 31. Oxford University Press.
      • Cohen, S. 1984. ZAP! The Rise and Fall of Atari. New York: McGraw-Hill Book Company.
      • Juul, J. 2010. A Casual Revolution: reinventing video games and their players. Cambridge: The MIT Press.
      • Stockburger, A. 2006. The Rendered Arena: modalities of space in video and computer games. Doctoral thesis. University of the Arts, London.
      • Swink, S. 2009. Game Feel: a game designer's guide to virtual sensation. Burlington: Morgan Kaufmann Publishers.
      Games cited:
      • Angry Birds. 2009. Rovio Entertainment.
      • Child of Eden. 2011. Q Entertainment.
      • F.E.A.R. 2007. Monolith Productions & Day 1 Studios.
      • Flower. 2009. Thatgamecompany/Sony Computer Entertainment.
      • Guitar Hero: Warriors of Rock. 2010. Neversoft & Vicarious Visions.
      • Lumines: Electronic Symphony. 2012. Q Entertainment.
      • Need for Speed: Underground. 2003. EA Games.
      • Pac-man Championship Edition DX. 2010. Namco Bandai.
      • Patapon 3. 2011. Pyramid Studio.
      • Peggle. 2007. PopCap Games.
      • Pong. 1972. Atari.
      • Resident Evil 5. 2009. Capcom.
      • Rez. 2001. United Game Artists & Q Entertainment.
      • Rhythm Heaven. 2009. Nintendo.
      • Space Invaders. 1978. Taito.
      • Space Fever. 1979. Nintendo.
      • Street Fighter 4. 2008. Capcom.
      • Super Mario Bros. 1985. Nintendo.
      • World of Goo. 2008. 2D Boy.
