Introduction to Cinema: Study Abroad
Overview
This text was enthusiastically adapted from Russell Sharman's incredible Moving Pictures, linked here, with a specific focus on cinema in and about Tokyo for the purposes of Study Abroad.
Welcome to Tokyo in Film!
What is Cinema?
Is it the same as a movie or a film?
Does it include digital video, broadcast content, and streaming media?
Is it a highbrow term reserved only for European and art house feature films?
Or is it a catch-all for any time a series of still images runs together to produce the illusion of movement, whether in a multi-plex theater or on the 5-inch screen of a smartphone?
Technically, the word itself derives from the ancient Greek kinema, meaning movement. Historically, it's a shortened version of the French cinematographe, an invention of two brothers, Auguste and Louis Lumière, that combined kinema with another Greek root, graphein, meaning to write or record.
The “recording of movement” seems as good a place as any to begin an exploration of the moving image. And cinema seems broad (or vague) enough to capture the essence of the form, whether we use it specifically in reference to that art house film or to refer to the more commonplace production and consumption of movies, TV, streaming series, videos, interactive gaming, VR, AR or whatever new technology mediates our experience of the moving image. Because ultimately, that’s what all of the above have in common: the moving image. Cinema, in that sense, stands at the intersection of art and technology like nothing else. As an art form, it would not exist without the technology required to capture the moving image. But the mere ability to record a moving image would be meaningless without the art required to capture our imagination.
But cinema is much more than the intersection of art and technology. It is also, and maybe more importantly, a powerful medium of communication. Like language itself, cinema is a surrounding and enveloping substance that carries with it what it means to be human in a specific time and place. That is to say, it mediates our experience of the world, helps us make sense of things, and, in doing so, often helps shape the world itself. It’s why we often find ourselves confronted by some extraordinary event and find the only way to describe it is: “It was like a movie.”
In fact, for more than a century, filmmakers and audiences have collaborated on a massive, ongoing, largely unconscious social experiment: the development of a cinematic language, the fundamental and increasingly complex rules for how cinema communicates meaning. There is a syntax, a grammar, to cinema that has developed over time. And these rules, as with any language, are iterative; that is, they form and evolve through repetition, both within and between each generation. As children, we are socialized into ways of seeing through children’s programming, cartoons, and YouTube videos. As adults, we become more sophisticated in our understanding of the rules and able to innovate, re-combine, and become creative with the language. And every generation or so, we are confronted with great leaps forward in technology that re-orient and often advance our understanding of how language works.
And therein lies the critical difference between cinematic language and every other means of communication. The innovations and complexity of modern written languages have taken more than 5,000 years to develop. Multiply that by at least 10 for spoken language.
Cinematic language has taken just a little more than 100 years to come into its own.
In January 1896, those two brothers, Auguste and Louis Lumière, set up their cinematographe, a combination motion picture camera and projector, at a café in Lyon, France, and presented their short film, L'arrivée d'un train en gare de La Ciotat (Arrival of a Train at La Ciotat Station), to a paying audience. It was a simple, aptly titled film of a train pulling into a station. The static camera positioned near the tracks captured a few would-be passengers milling about as the train arrived, growing larger and larger in the frame until it steamed past and slowed to a stop. There was no editing, just one continuous shot. A mere 50 seconds long…
And it blew the minds of everyone who saw it.
Accounts vary as to the specifics of the audience's reaction. Some claim the moving image of a train hurtling toward the screen struck fear among those in attendance, driving them from their seats in a panic. Others underplay the reaction, noting only that no one had seen anything like it. Which, of course, wasn't entirely true either. It wasn't the first motion picture. The Lumière brothers had projected a series of 10 short films in Paris the year before. An American inventor, Woodville Latham, had developed his own projection system that same year. And Thomas Edison had invented a similar apparatus before that.
But one thing is certain: that early film, as simple as it was, changed how we see the world and ourselves. From the early actualité documentary short films of the Lumières to the wild, theatrical flights of fancy of Georges Méliès, to the epic narrative films of Lois Weber and D. W. Griffith, the new medium slowly but surely developed its own unique cinematic language. Primitive at first, limited in its visual vocabulary, but with unlimited potential. And as filmmakers learned how to use that language to re-create the world around them through moving pictures, we learned right along with them. Soon we were no longer awed (much less terrified) by a two-dimensional image of a train pulling into a station, but we were no less enchanted by the possibilities of the medium with the addition of narrative structure, editing, production design, and (eventually) sound and color cinematography.
Since that January day in Lyon, we have all been active participants in this ongoing development of a cinematic language. The novelty short films of those early pioneers gave way to a global entertainment industry centered on Hollywood and its factory-like production of discrete, 90-minute narrative feature films. The invention of broadcast technology in the first half of the 20th century gave way to the rise of television programming and serialized story-telling. And the internet revolution at the end of the 20th century gave way to the streaming content of the 21st, from binge-worthy series lasting years on end to one-minute videos on social media platforms like Snapchat and TikTok. Each evolution of the form borrowed from and built on what came before, both in terms of how filmmakers tell their stories and how we experience them. And in as much as we may be mystified and even amused by the audience's reaction to that simple depiction of a train pulling into a station back in 1896, imagine how that same audience would respond to the last Avengers film projected in IMAX 3D.
We’ve certainly come a long, long way.
This book is an exploration of the evolution of cinema: the art and technology of moving pictures. But it is also an introduction to the fundamentals of the form that have remained relatively constant for more than 100 years. Just as the text you are reading right now defies easy categorization – is it a book, an online resource, an open source text – modern cinema exists across multiple platforms – is it a movie, a video, theatrical, or streaming – but the fundamentals of communication, the syntax, grammar, and rules of language, written or cinematic, remain relatively constant.
We will begin with an overview of how moving pictures work, literally and figuratively, from the neurological phenomena behind the illusion of movement to the invisible techniques and generally agreed-upon conventions that form the basis of cinematic language.
Then, we’ll take each aspect of how cinema is created in turn: production design, narrative structure, cinematography, editing, sound, and performance. Whether it’s released in a theater as a 2-hour spectacle or streaming online in 5-minute increments, every iteration of cinema includes these elements, and they are each critical in our understanding of film form, how movies do what they do to us, and why we let them.
The second section takes all of this accumulated knowledge of how cinema communicates and applies it to what, exactly, cinema is communicating. That is, we’ll take a long, hard look at the content of cinema, how that has changed over time, and how, for better or worse, it often hasn’t. This section will take seriously the idea that cinema both influences and is influenced by the society in which it is produced. And given the porous borders of the information age, that “society” is increasingly global. Cinema then, not unlike literature, can be viewed and analyzed as a kind of cultural document, a neutral reflection of society in a moment of time, or it can be viewed as a powerful tool for social change (or for the resistance of change as the case may be).
This emphasis on content inevitably leads to an exploration of power and representation. Who is on screen? Who is behind the camera? If cinema is as powerful a medium as I contend, it stands to reason that it matters deeply who controls the means of communication.
There is an ancient story about a king who was so smitten by a particular bird's song that he ordered his wisest and most accomplished scientists to identify its source. How could it sing so beautifully? What apparatus lay behind such a sweet sound? So they did the only thing they could think to do: they killed the bird and dissected it to find the source of its song. Of course, by killing the bird, they killed its song.
The analysis of an art form, even one as dominated by technology as cinema, always runs the risk of killing the source of its beauty. By taking it apart, piece by piece, there’s a chance we’ll lose sight of the whole, that ineffable quality that makes art so much more than the sum of its parts. Throughout this text, my hope is that by gaining a deeper understanding of how cinema works, in both form and content, you’ll appreciate its beauty even more.
In other words, I don’t want to kill the bird.
As much as cinema is an ongoing, collaborative social experiment, one in which we are all participants, it also carries with it a certain magic. And like any good magic show, we all know it's an illusion. We all know that even the world's greatest magician can't really make an object float or saw a person in half (without serious legal implications). It's all a trick. A sleight of hand that maintains the illusion. But we've all agreed to allow ourselves to be fooled. In fact, we've often paid good money for the privilege. Cinema is no different. A century of tricks used to fool an audience that's been in on it from the very beginning. We laugh, cry, or scream at the screen, openly and unapologetically manipulated by the medium. And that's how we like it.
This text is dedicated to revealing the tricks without ruining the illusion. To look behind the curtain to see that the wizard is one of us. That in fact, we are the wizard (great movie, by the way). Hopefully, by doing so, we will only deepen our appreciation of cinema in all its forms and enjoy the artistry of a well-crafted illusion that much more.
Video Attributions:
'L'arrivée d'un train en gare de La Ciotat (Arrival of a Train)' by the Lumière Brothers, uploaded by EcoworldReactor. Standard Vimeo License.
Tokyo in Film
Did you know Tokyo isn't a city at all? It's a metropolis comprising 26 different cities, a handful of towns and villages, and 23 central wards. That is not just a remarkable fact; it's vital to understanding Tokyo. With around 14 million people living across 2,191 sq km, Tokyo has no single mood. Each city has its own disposition, which we will discover as we go from Shinjuku's grunginess and Shibuya's effortless chic to the old-fashioned charm of Ikebukuro (hopefully, it won't be raining this time).
Where else but Tokyo can we order a coffee from a robot or have the checkout machine recognize our items by shape to calculate the bill?
With this in mind, the films we analyze will involve Japan and specifically feature Tokyo, and through our explorative assignments, we will attempt to recreate scenes from them. For this class, the films (and anime) we will analyze are:
Film List:
- Adrift in Tokyo, Satoshi Miki
- Akira, Katsuhiro Ôtomo
- Aggretsuko, Rareko
- The Fast and the Furious: Tokyo Drift, Justin Lin
- First Love, Takashi Miike
- Godzilla Minus One (2023), Takashi Yamazaki
- Initial D, Andrew Lau, Alan Mak, Ralph Rieckermann
- Jujutsu Kaisen (Shibuya Incident), Sunghoo Park
- Kill Bill Vol. 1, Quentin Tarantino
- Like Someone in Love, Abbas Kiarostami and Banafsheh Modaressi
- Midnight Diner: Tokyo Stories, Kaoru Kobayashi
- Samurai Champloo, Shinichiro Watanabe
- Shoplifters, Hirokazu Koreeda
- Spirited Away, Hayao Miyazaki
- The Seven Samurai, Akira Kurosawa
- Tokyo Ghoul, Shûhei Morita
- Your Name, Makoto Shinkai
Honors Projects
- Honors Project: A comparative analysis of Ikiru (Akira Kurosawa) and Living (Oliver Hermanus, 2022), a remake of Kurosawa's iconic film: you will explore themes of life, death, and the search for meaning within the context of two different cultural and temporal settings.
- Honors Project: Environmentalism in Miyazaki's work (Castle in the Sky and Nausicaä of the Valley of the Wind, Hayao Miyazaki): your analysis will delve into how these films depict human interaction with the environment and the implications of these interactions for both society and nature.
Week One, Module One - How to Watch a Movie
Step One: Evolve an optic nerve that “refreshes” at a rate of about 13 to 30 hertz in a normal active state.[1] That’s 13 to 30 cycles per second. Fortunately, that bit has already been taken care of over the past several million years. You have one of them in your head right now.
Step Two: Project a series of still images captured in sequence at a rate at least twice that of your optic nerve’s ability to respond. Let’s say 24 images, or frames, per second.
Step Three: Don’t talk during the movie. That’s super annoying.
Okay, that last part is optional (though it is super annoying), but here’s the point: Cinema is built on a lie. It is not, in fact, a “motion” picture. It is, at a minimum, 24 still images flying past your retinas every second. Your brain interprets those dozens of photographs per second as movement, but it’s actually just the illusion of movement, a trick of the mind known as beta movement: the neurological phenomenon that interprets two stimuli shown in quick succession as the movement of a single object.
Because all of this happens so fast, faster than our optic nerves and synaptic responses can perceive, the mechanics are invisible. There may be 24 individual photographs flashing before our eyes every second, but all we see is one continuous moving picture. It’s a trick. An illusion.
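To make that arithmetic concrete, here is a small, purely illustrative Python sketch (my own addition, not part of the original text) that turns the rule-of-thumb numbers above into milliseconds per frame and per "refresh":

```python
# Illustrative only: the back-of-the-envelope numbers cited above, nothing more.
FRAME_RATE = 24           # still images projected per second (standard cinema rate)
OPTIC_RATE_HZ = (13, 30)  # rough "refresh" range of the visual system cited above

frame_interval_ms = 1000 / FRAME_RATE
print(f"Each frame is on screen for about {frame_interval_ms:.1f} ms")

for hz in OPTIC_RATE_HZ:
    window_ms = 1000 / hz
    print(f"At {hz} Hz, the visual system effectively samples about every {window_ms:.0f} ms")

# With new frames arriving roughly as fast as we can register them individually,
# the brain fuses the stills into continuous motion: the beta movement described above.
```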
The same applies to cinematic language. The way cinema communicates is the product of many different tools and techniques, from production design to narrative structure to lighting, camera movement, sound design, performance and editing. But all of these are employed to manipulate the viewer without us ever noticing. In fact, that’s kind of the point. The tools and techniques – the mechanics of the form – are invisible. There may be a thousand different elements flashing before our eyes – a subtle dolly-in here, a rack focus there, a bit of color in the set design that echoes in the wardrobe of the protagonist, a music cue that signals the emotional state of a character, a cut on an action that matches an identical action in the next scene, and on and on and on – but all we see is one continuous moving picture. A trick. An illusion.
In this chapter, we’ll explore how cinematic language works, a bit like breaking down the grammar and rules of spoken language, and then we’ll take a look at how to watch cinema with these “rules” in mind. We may not be able to speed up the refresh rate of our optic nerve to catch each of those still images, but we can train our interpretive skills to see how filmmakers use the various tools and techniques at their disposal.
CINEMATIC LANGUAGE
Like any language, we can break cinematic language down to its most fundamental elements. Before grammar and syntax can shape meaning by arranging words or phrases in a particular order, the words themselves must be built up from letters, characters, or symbols. The basic building blocks. In cinema, those basic building blocks are shots. A shot is one continuous capture of a span of action by a motion picture camera. It could last minutes (or even hours) or could last less than a second. Basically, a shot is everything that happens within the frame of the camera – that is, the visible border of the captured image – from the moment the director calls “Action!” to the moment she calls “Cut!”
These discrete shots rarely mean much in isolation. They are full of potential and may be quite interesting to look at on their own, but cinema is built up from the juxtaposition of these shots, dozens or hundreds of them, arranged in a particular order – a cinematic syntax – that renders a story with a collectively discernible meaning. We have a word for that, too: Editing. Editing arranges shots into patterns that make up scenes, sequences, and acts to tell a story, just like other forms of language communicate through words, sentences, and paragraphs.
We have developed a cinematic language from these basic building blocks, a set of rules and conventions by which cinema communicates meaning to the viewer. And by “we,” I mean all of us, filmmakers and audiences alike, from the earliest motion picture to the latest VR experience. Cinematic language – just like any other language – is an organic, constantly evolving, shared form of communication. It is an iterative process that is refined each time a filmmaker builds a story through a discrete number of shots and each time an audience responds to that iteration, accepting or rejecting but always engaging in the process. Together, we have developed a visual lexicon. A lexicon describes the shared set of meaningful units in any language. Think of it as the list of all available words and parts of words in a language we carry around in our heads. A visual lexicon is likewise the shared set of meaningful units in our collective cinematic language: images, angles, transitions, and camera moves that we all understand to mean something when employed in a motion picture.
But here’s the trick: We’re not supposed to notice any of it. The visual lexicon that underpins our cinematic language is invisible, or at least, it is meant to recede into the background of our comprehension. Cinema can’t communicate without it, but if we pay too much attention to it, we’ll miss what it all means. A nifty little paradox. But not so strange or unfamiliar when you think about it. It’s precisely the same with any other language. As you read these characters, words, sentences, and paragraphs, you are not stopping to parse each unit of meaning, analyze the syntax, or double-check the sentence structure. All those rules fade to the background of your own fluency, and the meaning communicated becomes clear (or at least, I sure hope it does). And that goes double for spoken language. We speak and comprehend fluently in grammar and syntax, never pausing over the rules that have become second nature, invisible, and unnoticed.
So, what are some of those meaningful units of our cinematic language? Perhaps not surprisingly, a lot of them are based on how we experience the world in our everyday lives. Camera placement, for example, can subtly orient our perspective on a character or situation. Place the camera mere inches from a character's face – known as a close-up – and we'll feel more intimately connected to their experience than if the camera were further away, as in a medium shot or long shot. Place the camera below the eye-line of a character, pointing up – known as a low-angle shot – and that character will feel dominant, powerful, and worthy of respect. We are literally looking up to them. Place the camera at eye level, and we feel like equals. Let the camera hover above a character or situation – known as a high-angle shot – and we feel like gods, looking down on everyone and everything. Each choice affects how we see and interpret the shot, scene, and story.
We can say the same about transitions from shot to shot. Think of them as conjunctions in grammar, words meant to connect ideas seamlessly. The more obvious examples, like fade-ins, fade-outs, or long dissolves, are still drawn from our experience. Think of a slow fade-out, where the screen drifts into blackness, as an echo of our experience of falling asleep, drifting out of consciousness. In fact, fade-outs are most often used in cinema to indicate the close of an act or segment of a story, much like the end of a long day. Dissolves are not unlike how we remember events from our own experience, one moment bleeding into and overlapping with another.
But perhaps the most common and least noticed transition, by design, is a hard cut that bridges some physical action on screen. It’s called cutting on action, and it’s a critical part of our visual lexicon, enabling filmmakers to join shots, often from radically different angles and positions, while remaining largely invisible to the viewer. The concept is simple: whenever a filmmaker wants to cut from one shot to the next for a new angle on a scene, she ends the first shot in the middle of some on-screen action, opening a door or setting down a glass, then begins the next shot in the middle of that same action. The viewer’s eye is drawn to the action on screen and not the cut itself, rendering the transition relatively seamless, if not invisible to the viewer.
Camera placement and transitions, along with camera movement, lighting style, color palette, and a host of other elements, make up the visual lexicon of cinematic language, all of which we will explore in the chapters to follow. In the hands of a gifted filmmaker, these subtle adjustments work together to create a coherent whole that communicates effectively (and invisibly). In the hands of not-so-gifted filmmakers, these choices can feel haphazard, unmotivated, or perhaps worse, “showy” – all style and no substance – creating a dissonant, ineffective cinematic experience. But even then, the techniques themselves remain largely invisible. We are left with the feeling that it was a “bad” movie, even if we can’t quite explain why. (Though by the end of this book, you should be able to explain why in great detail, probably to the great annoyance of your date. You’re welcome.)
EXPLICIT AND IMPLICIT MEANING
Once we have a grasp on these small, meaningful units of our collective cinematic language, we can begin to analyze how they work together to communicate bigger, more complex ideas.
Take the work of Lynne Ramsay, for example. As a director, Ramsay builds a cinematic experience by paying attention to the details, the little things we might otherwise never notice:
Cinema, like literature, builds up meaning through the creative combination of these smaller units, but, also like literature, the whole is – or should be – much more than the sum of its parts. For example, Moby Dick is a novel that explores the nature of obsession, the futility of revenge, and humanity’s essential conflict with nature. But in the more than 200,000 words that make up that book, few, if any, of them communicate those ideas directly. In fact, we can distinguish between explicit meaning, that is, the obvious, directly expressed meaning of a work of art, be it a novel, painting or film, and implicit meaning, the deeper, essential meaning, suggested but not necessarily directly expressed by any one element. Moby Dick is explicitly about a man trying to catch a whale, but as any literature professor will tell you, it was never really about the whale.
That comparison between cinema and literature is not accidental. Both start with the same fundamental element, that is, a story. As we will explore in a later chapter, cinema begins with the written word in the form of a screenplay before a single frame is photographed. And like any literary form, screenplays are built around a narrative structure. Yes, that’s a fancy way of saying story, but it’s more than simply a plot or an explicit sequence of events. A well-conceived narrative structure provides a foundation for that deeper, implicit meaning a filmmaker, or really any storyteller, will explore through their work.
Another way to think about that deeper, implicit meaning is as a theme, an idea that unifies every element of the work, gives it coherence, and communicates what the work is really about. And really great cinema manages to suggest and express that theme through every shot, scene, and sequence. Every camera angle and camera move, every line of dialogue and sound effect, and every music cue and editing transition will underscore, emphasize, and point to that theme without ever needing to spell it out or make it explicit. An essential part of analyzing cinema is identifying that thematic intent and tracing its presence throughout.
Unless there is no thematic intent or the filmmaker did not take the time to make it a unifying idea. Then, you may have a “bad” movie on your hands. But at least you’re well on your way to understanding why!
So far, this discussion of explicit and implicit meaning, theme, and narrative structure points to a deep kinship between cinema and literature. But cinema has far more tools and techniques at its disposal to communicate meaning, implicit or otherwise. Sound, performance, and visual composition all point to deep ties with music, theater, and painting or photography as well. And while each of those art forms employs its own strategies for communicating explicit and implicit meaning, cinema draws on all of them at once in a complex, multi-layered system.
Let’s take sound, for example. As you know from the brief history of cinema in the last chapter, cinema existed long before the introduction of synchronized sound in 1927, but since then, sound has become an equal partner with the moving image in the communication of meaning. Sound can shape how we perceive an image, just as an image can change how we perceive a sound. It’s a relationship we call co-expressive.
This is perhaps most obvious in the use of music. A non-diegetic musical score, that is, music that only the audience can hear as it exists outside the world of the characters, can drive us toward an action-packed climax or sweep us up in a romantic moment. Or it can contradict what we see on the screen, creating a sense of unease at an otherwise happy family gathering or making us laugh during a moment of excruciating violence. In fact, this powerful combination of moving images and music pre-dates synchronized sound. Even some of the earliest silent films were shipped to theaters with a musical score meant to be played during projection.
But as powerful as music can be, sound in cinema is much more than just music. Sound design includes music but also dialog, sound effects, and ambient sound to create a rich sonic context for what we see on the screen. From the crunch of leaves underfoot to the steady hum of city traffic to the subtle crackle of a cigarette burning, what we hear – and what we don’t hear – can put us in the scene with the characters in a way that images alone could never do, and as a result, add immeasurably to the effective communication of both explicit and implicit meaning.
We can say the same about the relationship between cinema and theater. Both use a carefully planned mise-en-scene – the overall look of the production, including set design, costume, and make-up – to evoke a sense of place and visual continuity. Both employ the talents of well-trained actors to embody characters and enact the narrative structure laid out in the script.
Let’s focus on acting for a moment. Theater, like cinema, relies on actors’ performances to communicate not only the subtleties of human behavior but also the interplay of explicit and implicit meaning. How an actor interprets a line of dialog can make all the difference in how a performance shifts our perspective, draws us in or pushes us away. And nothing ruins a cinematic or theatrical experience like “bad” acting. But what do we really mean by that? Often it means the performance wasn’t connected to the thematic intent of the story, the unifying idea that holds it all together. We’ll even use words like, “The actors seemed like they were in a different movie from everyone else.” That could be because the director didn’t clarify a theme in the first place, or perhaps they didn’t shape or direct an actor’s performance toward one. It could also simply be poor casting.
All of the above applies to both cinema and theater, but cinema has one distinct advantage: the intimacy and flexibility of the camera. Unlike theater, where your experience of a performance is dictated by how far you are from the stage, the filmmaker has complete control over your point of view. She can pull you in close, allowing you to observe every tiny detail of a character’s expression, or she can push you out further than the cheapest seats in a theater, showing you a vast and potentially limitless context. And perhaps most importantly, cinema can move between these points of view in the blink of an eye, manipulating space and time in a way live theater never can. All of those choices affect how we engage the thematic intent of the story and how we connect to what that particular cinematic experience really means. And because of that, in cinema, whether we realize it or not, we identify most closely with the camera. No matter how much we feel for our hero up on the screen, we view it all through the lens of the camera.
And that central importance of the camera is why the most prominent tool cinema has at its disposal in communicating meaning is visual composition. Despite the above emphasis on the importance of sound, cinema is still described as a visual medium. Even the title of this chapter is How to Watch a Movie. It is not so surprising when you think about the lineage of cinema and its origin in the fixed images of the camera obscura, daguerreotypes, and series photography. All of which owe a debt to painting as an art form and a form of communication. In fact, the cinematic concept of framing has a clear connection to the literal frame, or physical border, of paintings. One of the most powerful tools filmmakers – and photographers and painters – have for communicating explicit and implicit meaning is simply what they place inside the frame and what they leave out.
Another word for this is composition, the arrangement of people, objects, and settings within the frame of an image. And if you’ve ever pulled out your phone to snap a selfie or maybe a photo of your meal to post on social media (I know, I’m old, but really? Why is that a thing?), you are intimately aware of the power of composition. Adjusting your phone this way and that to get just the right angle, to include just the right bits of your outfit, maybe edge Greg out of the frame just in case things don’t work out (sorry, Greg). The point is that composing a shot is a powerful way to tell stories about ourselves daily. Filmmakers, the really good ones, are masters of this technique. Once you understand this principle, you can start to analyze how a filmmaker uses composition to serve their underlying thematic intent to help tell their story.
One of the most important ways a filmmaker uses composition to tell their story is through repetition, a pattern of recurring images that echoes a similar framing and connects to a central idea. And like the relationship between shots and editing – where individual shots only really make sense once they are juxtaposed with others – a well-composed image may be exciting or even beautiful on its own, but it only starts to make sense in relation to the implicit meaning or theme of the overall work when we see it as part of a pattern.
Take, for example, Stanley Kubrick and his use of one-point perspective:
Or how Barry Jenkins uses color in Moonlight (2016):
Or how Sofia Coppola tends to trap her protagonists in gilded cages:
These recurring images are part of that largely invisible cinematic language. We aren’t necessarily supposed to notice them, but we are meant to feel their effects. And it’s not just visual patterns that can serve the filmmaker’s purposes. Recurring patterns, or motifs, can emerge in the sound design, narrative structure, mise-en-scene, dialog, and music.
But one distinction should be made between how we think about composition and patterns in cinema and how we think about those concepts in photography or painting. While all of the above employ framing to achieve their effects, photography and painting are limited to what the artist fixed in that frame at the moment of creation. Only cinema adds an entirely new and distinct dimension to the composition: movement. That includes movement within the frame – as actors and objects move freely, recomposing themselves within the fixed frame of a shot – as well as the movement of the frame itself, as the filmmaker moves the camera in the setting and around those same actors and objects. This increases the compositional possibilities exponentially for cinema, allowing filmmakers to layer in even more patterns that serve the story and help us connect to their thematic intent.
FORM, CONTENT, AND THE POWER OF CINEMA
As we become more attuned to the various tools and techniques filmmakers use to communicate their ideas, we can better analyze their effectiveness. We'll be able to see what was once invisible. It's a kind of magic trick in itself. But as I tried to make clear from the beginning, my goal is not to focus solely on form, to dissect cinema into its constituent parts and lose sight of its overall power. Like any art form, cinema is more than the sum of its parts. And it should be clear already that form and content go hand in hand. Pure form (all technique and no substance) is meaningless. And pure content (all story and no style) is didactic and, frankly, boring. How the story is told is as important as what the story is about.
However, just as we can analyze technique, the formal properties of cinema, to better understand how a story is communicated, we can also analyze content, that is, what stories are communicating, to better understand how they fit into the wider cultural context. Again, cinema, like literature, can serve as a valuable cultural document, reflecting our ideas, values, and morals back to us as filmmakers and audiences.
We’ll spend more time on content analysis – the idea of cinema as a cultural document – in the last couple of chapters of this book, but I want to take a moment to highlight one aspect of that analysis in advance. I’ve discussed at length the idea of cinematic language and the fact that, as a form of communication, it is largely invisible or subconscious. Interestingly, the same can be said for cinematic content. Or, more specifically, the cultural norms that shape cinematic content. Cinema is an art form like any other, shaped by humans bound up in a given historical and cultural context. No matter how enlightened and advanced those humans may be, that historical and cultural context is so vast and complex that they cannot possibly grasp every aspect of how it shapes their view of the world. Inevitably, those cultural blind spots, the unexamined norms and values that make us who we are, filter into the cinematic stories we tell and how we tell them.
The result is a kind of cultural feedback loop where cinema both influences and is influenced by the context in which it is created.
Because of this, on the whole, cinema is inherently conservative. That is to say, as a form of communication, it is more effective at conserving or re-affirming a particular view of the world than challenging or changing it. This is due in part to the economic reality that cinema, historically a very expensive medium, must appeal to the masses to survive. As such, it tends to avoid offending our collective sensibilities to make us feel better about who we already think we are. It is also partly due to the social reality that the people who historically had access to the capital required to produce that very expensive medium tend to all look alike. That is, primarily white and mostly men. And when the same kind of people with the same kind of experiences tend to have the most consistent access to the medium, we tend to get the same kinds of stories, reproducing the same, often unexamined, norms, values, and ideas.
But that doesn’t mean cinema can’t challenge the status quo or at least reflect real, systemic change in the wider culture already underway. That’s what makes the study of cinema, particularly in regard to content, so endlessly fascinating. Whether it’s tracking the way cinema reflects the dominant cultural norms of a given period or the way it sometimes rides the leading edge of change in those same norms, cinema is a window – or frame (see what I did there) – through which we can observe the mechanics of cultural production, the inner-workings of how meaning is produced, shared, and sometimes broken down over time.
EVERYONE’S A CRITIC
One final word on how to watch a movie before we move on to the specific tools and techniques employed by filmmakers. In as much as cinema is a cultural phenomenon, a mass medium with a crucial role in the production of meaning, it’s also an art form meant to entertain. And while I think one can assess the difference between a “good” movie and a “bad” movie in terms of its effectiveness, that has little to do with whether one likes it or not.
In other words, you don’t have to necessarily like a movie to analyze its use of a unifying theme or how the filmmaker employs mise-en-scene, narrative structure, cinematography, sound, and editing to communicate that theme effectively. Citizen Kane (Orson Welles, 1941), arguably one of the greatest films ever made, is an incredibly effective motion picture. But it’s not my favorite. Between you and me, I don’t even really like it all that much. But I still show it to my students every semester. This means I’ve seen it dozens and dozens of times, and it never ceases to astonish me with its formal technique and innovative use of cinematic language.
Fortunately, the opposite is also true: You can really like a movie that isn’t necessarily all that good. Maybe there’s no unifying theme, the cinematography is all style and no substance (or no style and no substance), the narrative structure is made out of toothpicks, and the acting is equally thin and wooden. (That’s right, Twilight, I’m looking at you.) Who cares? You like it. You’ve watched it more often than I’ve seen Citizen Kane, and you still like it.
That’s great. Embrace it because taste in cinema is subjective. But analysis of cinema doesn’t have to be. You can analyze anything. Even things you don’t like.
Video and Image Attributions:
An example of beta movement. Public Domain Image.
Lynne Ramsay – The Poetry of Details by Tony Zhou. Standard Vimeo License.
Kubrick // One-Point Perspective by kogonada. Standard Vimeo License.
MOONLIGHT // BLUE by Russell Leigh Sharman. Standard Vimeo License.
Have You Noticed This About Sofia Coppola’s Films? by Fandor. Standard YouTube License.
- Okay, it's actually a lot more complicated than that. Optic nerves don't "refresh" in the way we normally think of that term. In fact, the optic nerve is part of a complex system that includes your eyeballs, retinas and brain, each of which performs at varying degrees of efficiency and changes as we age. But the numbers here are a good rule of thumb for thinking about how quickly we can process images. For more on how the optic nerve works, check this out: https://wolfcrow.com/notes-by-dr-optoglass-motion-and-the-frame-rate-of-the-human-eye/
Final Project: Short Film Schedule
Final Project
Purpose:
Work in a self-selected team of three students to create a short film (plus titles and credits).
You may negotiate a larger team if you have a clear production plan. Each team will create a 2 to 8-minute short film. Even though we are working in groups, we also need to work together to support each other's "productions."
Here are six themes to use as a jumping-off point, and please also consider that these are natural extensions of our film analysis and journal assignments (especially the explorative assignments)!
- Animation and Movement
- Slice of Life
- Choreography and Story-Telling
- Recreating a Scene
- Pastiche/Emulating Cinematography (recreating an iconic shot)
- Deconstructing Narrative.
Please also read this article by Mark Billen at FX. He is the creator of Hitfilm, the free film editor I encourage everyone to use for this project. He gives great suggestions on brainstorming, how to come up with ideas, and what to film.
Task:
Please remember that this is not a film school. We will study many aspects of film and its creation, but we expect insight and appreciation, not mastery, of this discipline. We don't expect great acting, fancy VFX, complex sets, or the use of high-quality equipment.
Choose your film's subject to minimize the impact of limited production resources. As with all assignments this semester, we will primarily evaluate each student's and group's process, intermediate materials, and technical appreciation of cinematography.
The project development stages will follow these criteria:
Checkpoints:
- Module One:
- Group Selection
- Brainstorming ideas.
- Greenlight: convince one instructor to be your producer.
- Module Two:
- Explorative exercises.
- Automatic writing: get the whole idea down without stopping to edit.
- Class presentation of Final Project proposals.
- Preproduction: your producer approves your script, shot list, or other preproduction.
- Module Three:
- Storyboard: bring 3 video clips/photographs/writings related to your ideas.
- Animatic: your producer approves a rough edit from preproduction materials and found footage.
- Module Four:
- Raw footage: your producer approves your footage.
- Module Five:
- Rough cut: your producer signs off on the first complete edit.
- Post-production and editing are complete; the final film is due at 11:59 p.m. May 26th.
- Self-evaluations are due May 28th and may include up to one page of text.
The checkpoints are pass/fail; however, you essentially have to pass each one to continue. For the first, your group must sell an instructor on your concept and the practicality of executing the production plan. That instructor will then agree to be your producer for the remainder of the project. Don't structure this as a single pitch. Instead, please work with your group and the instructor to set reasonable expectations and a plan to achieve them.
You will then meet with your producer regularly during class, office hours, and appointments. You must receive approval for each checkpoint by the specified deadline.
The goal of these deadlines is not to have you submit something at a particular time but to create a process that encourages continued contact throughout production.
We expect you'll receive signoff well before the weekly deadlines in the natural course of working with your producer.
Educational Goals
- Practice and then demonstrate the technical skills you acquired during the semester.
- Iterate on a production, refining your work and learning from peers, mistakes, and serendipity.
- Create a physical artifact for your portfolio.
- Experience the complete production cycle and the thrill of creation.
Pick up a camera. Shoot something. No matter how small or cheesy, or whether your friends and your sister star in it. Put your name on it as director. Now you're a director. Everything after that, you're just negotiating your budget and your fee.
Requirements
To ensure everyone has sufficient support, we expect you to volunteer to act or crew for another project for a 2-hour session (it is okay if nobody takes you up on this, and you don't have to act if you're not comfortable with that).
- Preproduction materials:
- The script if there is dialogue.
- Storyboard
- Shot list
- Schedule for shoots, reserved equipment and spaces, actors, VFX, editing, and screenings
- Animatic (optional!):
- An outline of your film as a 540p MP4, constructed from storyboard panels, still frames, and/or existing footage (the usual copyright and plagiarism restrictions do not apply since this will not be public).
- An actual video that approximates the pace, audio, and shots of your film without requiring real footage (a minimal assembly sketch appears after this list).
- Examples:
- Use this Premiere project as your working draft, which continually improves as footage comes in during production.
- Footage
- Dailies plus B-roll coverage
- Multiple takes of all of the key shots
- About 10x as much footage as your expected running time, to give yourself options in the edit (for a 5-minute film, plan on roughly 50 minutes of raw footage)
- Rough cut:
- A coarsely edited collection of your footage as a 540p or 360p MP4. Audio can be a placeholder from the animatic, and there is no expectation of VFX or post. You can have up to two still shots from the storyboard or found footage if you haven't completed production.
- Final film:
- A 2 to 8-minute final film (plus titles and credits) in 720p MP4 format
- Must include titles, credits, and copyright information.
- For extra credit, include a "behind the scenes" reel showing some elements of your process, also in 720p MP4 format and less than 150 MB. For example: how you created certain tracking shots, the set and takes, VFX breakdowns, etc.
- Your film may be stop motion, a documentary, a sequence of freeze-frame live-action stills, live-action, or animation.
- Your film must demonstrate knowledge of topics covered in class through:
- Camera footage you filmed (i.e., it can't be 100% animation, found footage, etc.)
- Intentional lighting
- Editing in the continuity/IMR style
- Audio is optional but highly recommended
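If your group wants a quick, optional way to turn numbered storyboard stills into the 540p MP4 animatic described above, here is a minimal sketch in Python that shells out to ffmpeg. It is my own illustration, not a course requirement: it assumes ffmpeg is installed, and the filenames and helper name are hypothetical. Hitfilm, Premiere, or any editor works just as well.

```python
# Minimal, optional sketch: build a 540p MP4 animatic from numbered storyboard
# panels (e.g. panel_001.png, panel_002.png, ...) by calling ffmpeg from Python.
# Assumes ffmpeg is installed; the filenames here are hypothetical examples.
import subprocess

def build_animatic(pattern: str = "panel_%03d.png",
                   output: str = "animatic_540p.mp4",
                   seconds_per_panel: float = 2.0) -> None:
    """Turn a numbered image sequence into a 540p H.264 MP4."""
    input_rate = 1.0 / seconds_per_panel  # e.g. 0.5 fps means each panel holds for 2 seconds
    subprocess.run([
        "ffmpeg",
        "-framerate", str(input_rate),         # how long each still is held
        "-i", pattern,                         # the numbered storyboard panels
        "-vf", "scale=-2:540,format=yuv420p",  # 540p height, even width, broad player support
        "-c:v", "libx264",                     # widely supported H.264 encoder
        "-r", "24",                            # standard playback frame rate
        output,
    ], check=True)

if __name__ == "__main__":
    build_animatic()
```

Swapping the height to 720 gives a similarly lightweight way to produce a 720p MP4 from an image sequence or an intermediate render, though exporting directly from your editor is the more typical route.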
Planning and scheduling your work can be hard. Here is a sample schedule for your group:
Week 1: Module One and Module Two
- Day 1-2: Group Selection and Brainstorming Ideas
- Day 3-4: Pitch your concept to the instructor and get a producer on board
- Day 5-6: Explorative exercises and automatic writing
- Day 7: Class presentation of Final Project proposals
Week 2: Module Two and Module Three
- Day 8-9: Preproduction: Producer approves script, shot list, or other preproduction elements
- Day 10-11: Storyboard creation (3 video clips/photographs/writings related to ideas)
- Day 12-13: Animatic: Producer approves a rough edit from preproduction materials and found footage
Week 3: Module Four and Module Five
- Day 14-15: Raw footage production: Producer approves your footage
- Day 16-17: Rough cut editing: Producer signs off on the first complete edit
- Day 18-20: Finalize post-production and editing, complete "behind the scenes" reel for extra credit
- Day 21: Final film, including titles, credits, and copyright information, is due at 11:59 p.m. on May 26th
Post-Completion Day
- Day 22: Self-evaluations are due by May 28th (up to one page of text)
You may submit your response in written, oral, or video format, uploaded to the Google Classroom.
Iconic Films Shot in Tokyo
Filming a Scene:
For our film project, we will be recreating scenes from the films we analyze in this course, and our excursions are meant to facilitate this process. That being said, we cannot analyze every film set in Tokyo. This map lists some of those other locations; if you would like to recreate a scene from a film or series on this list for your final project, please submit a proposal with your group.
Other Iconic Films Shot in Tokyo:
- Tokyo Story (1953):
- Director: Yasujirō Ozu.
- Explores the generational divide in post-war Tokyo.
- Lost in Translation (2003):
- Director: Sofia Coppola.
- Portrays the bond between two lonely strangers against Tokyo’s neon-lit cityscape.
- The Fast and the Furious: Tokyo Drift (2006):
- Tokyo's adrenaline-fueled underground car-racing culture takes center stage.
- Tokyo! (2008):
- Anthology of three short films by different directors presenting a surreal depiction of Tokyo.
- Like Someone in Love (2012):
- Director: Abbas Kiarostami.
- Delicately explores interpersonal relationships in Tokyo.
- The Wolverine (2013):
- Showcases Tokyo’s modernity, from skyscrapers to efficient bullet trains.
- Your Name (2016):
- Director: Makoto Shinkai.
- Animated film painting a vivid picture of Tokyo through two protagonists.
- Shoplifters (2018):
- Director: Hirokazu Kore-eda.
- Offers a poignant exploration of Tokyo’s marginalized communities.
- Tokyo Ghoul (2017):
- Live-action adaptation of the popular manga series, presenting a darker, supernatural side of Tokyo.
- Weathering With You (2019):
- Director: Makoto Shinkai.
- Animated film presenting Tokyo’s unpredictable weather patterns as a central narrative element.
Tokyo on the Small Screen: TV Shows Set in Tokyo:
- Tokyo Trial (2016-2017):
- Historical drama focusing on the international military tribunal held in Tokyo.
- Midnight Diner: Tokyo Stories (2016-present):
- Anthology series with heartwarming tales centered around a late-night diner in Tokyo.
- The Naked Director (2019-present):
- Biographical drama set in the 1980s, exploring the rise of adult video director Toru Muranishi.
- Alice in Borderland (2020-present):
- Thrilling series based on a manga, presenting a dystopian version of Tokyo.
- Tokyo Revengers (2021-present):
- Action-packed anime series about a man who travels back in time to save his girlfriend and change his regretful past.
Tokyo for the Young: Animated Films Set in Tokyo:
- Pom Poko (1994):
- Studio Ghibli film depicting raccoons fighting against urban development in Tokyo.
- Digimon Adventure: Our War Game! (2000):
- Popular anime film featuring Tokyo landmarks during a city-wide internet outage.
- Tokyo Godfathers (2003):
- Tells the story of three homeless people finding a baby on Christmas Eve in Tokyo.
- Tamagotchi: The Movie (2007):
- A film in the popular virtual pet franchise, set in Tokyo and on the Tamagotchi Planet.
- Summer Wars (2009):
- Presents a virtual world threatening to destroy Tokyo unless a young math genius can stop it.
Explorative Assignment, Slice-of-Life
Option One: Slice-of-Life Mini-Narrative (Fictionalizing Reality) Group Assignment
Purpose: To capture and then recreate a slice-of-life scene, offering students a chance to explore the nuances of everyday interactions and how they can be translated into a film narrative.
Preproduction Materials:
- Unscripted Dialogue Capture: Students film an unscripted, natural conversation, focusing on capturing genuine interactions. The goal is to create an unscripted review of a film viewed together at a local cinema, emulating the slice-of-life style of animation/filmmaking.
- Film Review: With your chosen group, visit a local cinema and view a film together with a local audience. I recommend the Toho cinema in Shinjuku for the iconic Godzilla statue.
Turn-in Methods:
- Footage: Raw footage of the unique, unscripted scene wherein the "actors" review the film they watched at the local cinema and their experience of viewing a film with a Japanese audience.
Option Two: Recreating a Scene (Jujutsu Kaisen: Shibuya Sky)
Purpose: Work as a team to use the aforementioned sites from this module's excursions to create your final project. You may work to complete any of the following portions of the final assignment:
Turn-in Methods:
- Storyboard: A storyboard that visually maps out each shot, tailored to the locations available.
- Shot List and Schedule: This is a comprehensive shot list and schedule that organizes shoots, equipment, and actor availability.
- Footage: Raw footage of the recreated scenes, demonstrating the application of cinematography techniques.
Please note that each of these items will eventually need to be completed for the final project.
Task:
Complete the aforementioned journal following one of the prompts (you need not answer every question, only those related and relevant to your overall point).
You may submit your response in written, oral, or video format, uploaded to the Google Classroom.
Week One, Module Two - Mise-en-Scène
Allow me to introduce a word destined to impress your friends and family when you trot it out at the next cocktail party: Mise-en-Scène. And even if you don’t frequent erudite cocktail parties, and who does these days (a shame), it’s still a handy term to have around. It’s French (obviously), and it literally means “putting on stage.”
Why French? Sometimes, we like to feel fancy, and let’s face it: to an American, French is fancy.
But the idea is simple. Borrowed from theater, it refers to every element in the frame that contributes to the overall look of a film. And I mean everything: set design, costume, hair, make-up, color scheme, framing, composition, lighting… Basically, if you can see it, it contributes to the mise-en-scène.
I could have started with any number of different tools or techniques filmmakers use to create a cinematic experience. The narrative might seem a more obvious starting point. Cinema can’t exist without a story; chronologically speaking, it all starts with the screenplay. Or I could have led off with cinematography. After all, we often think of cinema as a visual medium. But mise-en-scène captures much more than any one tool or technique in isolation. It’s more an aesthetic context in which everything else takes place, the unifying look, or even feel, of a film or series.
And this is probably as good a time as any to discuss the role of a director in cinema. There's a school of thought out there, known as the auteur theory, that claims the director is the "author" of a work of cinema, not unlike the author of a novel, and that they alone are ultimately responsible for what we see on the screen. Of course, cinema requires dozens, if not hundreds, of professionals dedicated to bringing a story to life. The screenwriter writes the script, the production designer designs the sets, the cinematographer photographs the scenes, the sound crew captures the sound, the editor connects the shots together, and each of them has whole teams of experts working below them to make it all work on screen. But if there's any hope of that final product having a unified aesthetic and a coherent, underlying theme that ties it all together, it needs a singular vision to give it direction. That, really, is the job of a director: to ensure everyone is moving in the same direction, making the same work of art. And they do that not so much by managing people – they have an assistant director and producers for that – as by managing mise-en-scène, shaping the overall look and feel of the final product. While mise-en-scène has many moving parts and many different professionals in charge of shaping those individual parts into something coherent, it's the one element of cinema that is most clearly the responsibility of the director.
This talent for shaping mise-en-scène is one reason we can readily identify great directors' work. Think about the films of Alfred Hitchcock, Agnes Varda, Wes Anderson, Yasujirō Ozu, Claire Denis, or Steven Spielberg (and if some of those names are unfamiliar, seek them out!). If we know their work at all, most of us could pick out one of their films after just a few minutes, even if we had never seen it before. This is not just because of some signature flourish or idiosyncratic visual habit (though that's often part of it) but because their films have a certain look to them, a certain aesthetic that saturates the screen.
Take the films of Claire Denis, for example:
Denis’s films generate an enveloping atmosphere that you can almost taste and feel, and all of that is part of her consistent (and brilliant) use of mise-en-scène.
Or how about the films of Wes Anderson:
Anderson's films consistently use symmetrical compositions, smooth, precise tracking shots, and slow motion, but it's the overall effect, the mise-en-scène, that makes the impression (check out more breakdowns of Wes Anderson's style here and here).
Because mise-en-scène refers to this “overall look,” it can feel rather broad (and even vague) as a concept. So, let’s break it down into four elements of design: setting, character, lighting, and composition. We’ll tackle each one in turn.
SETTING
Nothing we see on the screen in the cinema is there by accident. Everything is carefully planned, arranged, and even fabricated – sometimes using computer-generated imagery (CGI) – to serve the story and create a unified aesthetic.
That goes double for the setting.
If mise-en-scène is the overall aesthetic context for a film or series, the setting is the literal context, the space actors and objects inhabit for every scene. And this is much more than simply the location. It’s how that location, whether it’s an existing space occupied for filming or one purpose-built on a soundstage, is designed to serve the vision of the director.
As we saw in Chapter One, in the early days of motion pictures, when cinematic language was still in its infancy, not much thought was given to the design of a setting (or editing or performance, and no one was even thinking about sound yet). But it didn’t take long for filmmakers to realize they could employ the same tricks of set design they used in theater for the cinema.
One of the pioneers of this was the French filmmaker Georges Méliès. Take, for example, his 1903 film The Kingdom of the Fairies:
Méliès’s use of elaborate sets, along with equally elaborate costumes, hairstyles, make-up, and even the hand-tinting of the film itself, all contribute to the fantastical look and feel of the film. He brought a similar design sensibility to all of his films, including the ground-breaking 1902 film A Trip to the Moon.
A decade or so later, this attention to detail in the design elements of cinema had become commonplace. Indeed, many of the more well-known early silent films are famous for their sophisticated mise-en-scène, particularly in regard to setting, often above all else.
Check out this scene again from D. W. Griffith’s Intolerance (1916):
The set design alone is staggering. Built in the middle of Los Angeles, it took four years to dismantle.
Or consider the opening of Fritz Lang’s Metropolis (1927):
The film draws us into a mechanized, dystopian future – one of the first science-fiction films in history – and its success lies in its careful design of the setting to serve that narrative purpose.
Once filmmakers realized the importance of setting as an element of design and what it contributed to the overall look of their films, it wasn’t long before a position was created to oversee it all: the production designer. The production designer is the point person for the overall aesthetic design of a film or series. Working closely with the director, they help translate the aesthetic vision for the project – its mise-en-scène – to the various design departments, including set design, art department, costume, hair, and make-up. But arguably, their most important job is to make sure the setting matches that aesthetic vision, specifically through set design and set decoration.
Set design is precisely what it sounds like: the design and construction of the setting for any given scene in a film or series. Plenty of productions use existing locations and don’t necessarily have to build much of anything (though that doesn’t mean there isn’t an element of design involved, as we shall see). But when production requires complete control over the filming environment, production designers, along with conceptual artists, construction engineers, and sometimes a whole army of artisans, must create each setting, or set, from the ground up. And since these sets have to hold up under the strain of a large film crew working in and around them for days and even weeks, they require as much planning and careful construction as any real-life home, building, or interplanetary city out there.
Take a look at the incredible detail involved in bringing the set design to life for Thor: Ragnarok (Taika Waititi, 2017):
D. W. Griffith can take a seat.
These sets may be built on-site to blend in with the surrounding landscape, or they may be built within a large, windowless, sound-proof building called a soundstage. A soundstage provides the control over the environment production designers need to give the director precisely the look and feel she wants from a particular scene. On a big enough soundstage, a production designer can fabricate interiors and exteriors, sections of buildings, and even small villages. And since it is all shielded from the outside, the production has complete control over lighting and sound. It can be dawn or twilight for 12 hours a day. And a shot will never be interrupted by an airplane flying loudly overhead.
The use of soundstages is particularly helpful when producing serialized content. A TV or streaming series, especially one that uses the same few locations over and over – the family home, the mobster’s headquarters, the king’s palace – needs access to those sets for months at a time, year after year, for as long as we keep watching. Of all those series you binge-watch on the weekends (or during the week when you should be reading this), almost all of them depend upon sets built from the ground up and housed on soundstages for years on end.
Of course, sometimes, the setting of a particular production requires more than a production designer can deliver with the materials available (or the time or the budget, as the case may be). In that case, the setting must be augmented with computer-generated imagery (CGI). The most common way this is implemented is through the use of green screen technology. The idea is fairly simple. The set is dressed with a backdrop of bright green (or blue; the actual color isn’t terribly important), and the scene is filmed as usual. Then, in post-production, the software picks out that particular color and replaces it with imagery either filmed elsewhere or generated by digital artists, a process called keying. For this to work, no other object or article of clothing can match that shade of green, or it will be replaced as well. And with ever-improving technology, the sky is no longer the limit to what designers can offer up for the screen.
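For the curious, here is a minimal sketch of the keying idea in Python, assuming a simple “green dominance” test. The threshold and the test itself are illustrative simplifications, not the method any particular compositing package uses.

```python
# Minimal chroma-key sketch: replace strongly green pixels with a background plate.
import numpy as np

def chroma_key(frame: np.ndarray, background: np.ndarray, dominance: float = 1.3) -> np.ndarray:
    """Replace strongly green pixels in `frame` with pixels from `background`.

    Both arrays are H x W x 3 uint8 images of the same shape.
    """
    f = frame.astype(np.float32)
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    # A pixel counts as "screen" when green clearly dominates both red and blue.
    mask = (g > dominance * r) & (g > dominance * b)
    out = frame.copy()
    out[mask] = background[mask]
    return out

# Usage (hypothetical images): composite = chroma_key(studio_frame, skyline_plate)
```

Real keying software adds edge softening, spill suppression, and many other refinements, but the core idea is exactly this substitution.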
Whether the production designer is building the set from the ground up on a soundstage or simply using an existing location, the setting is still a kind of blank canvas until that space is filled with all of the essential details that really tell the story. That’s where set design meets set decoration. Still under the supervision of the production designer, set decorating falls to any number of skilled artisans in the art department. They design everything from the color on the walls, to the texture of the drapes, to the style of the furniture, to every ashtray, book, and family photo that might show up on screen. And that goes for existing locations as well. A film production using someone’s actual home for a scene will likely replace all of the furniture, repaint the walls, and fill it with their own odds and ends that help tell the cinematic story. And then, hopefully, put it all back the way they found it when they’re done.
Take a look at the ways the production designer for the Netflix series The Crown converts existing locations into a Buckingham Palace throne room or the Queen’s private apartment:
This is where storytelling through the physical environment – the setting – can really come alive. Every object placed just so on a set adds to the mise-en-scène and helps tell the story. Those objects could be in the background providing context – framed photos, a trophy, an antique clock – or they could be picked up and handled by characters in a scene – a glass of whisky, a pack of cigarettes, a loaded gun. We even have a name for those objects, props, short for “property” and also borrowed from theater, and a name for the person in charge of keeping track of them all, a prop master.
As should be clear by now, setting is one of the most important design elements in creating a consistent mise-en-scène. Not simply the location – a suburban home, a high-rise office building, the spaceport at Mos Eisley – but all of the details that fill that location, make it come alive as a lived-in space, and most importantly, help tell the cinematic story. One way we can begin to see the filmmaker's intention, to understand how she is subtly (and maybe not so subtly) manipulating our emotions through cinematic language, is to pay attention to these details. The very details we’re not supposed to notice.
CHARACTER
Character is a term that will come up a lot. We use it to describe how a screenwriter invents believable characters that inhabit a narrative structure. And we use it to describe how an actor inhabits that character in their performance. But we can also examine how the physical design of a character, through costume, make-up, and hairstyle, not only contributes to the mise-en-scène but also helps fully realize the work of both screenwriters and actors.
Typically, when we think of “character design,” we might immediately think of fantastic creatures dreamed up in a special effects studio. They might be animated through CGI, fabricated from latex, and worn by an actor. And all of that is a reasonable way to think about the concept of character design. But in some ways, that is just a much more extreme version of how I would like to frame the work of costume designers and hair and make-up professionals.
Just as a screenwriter must create – or design – a character on the page and an actor must create – or design – their approach to inhabiting that character, the wardrobe, hair, and make-up departments must also design how that character is going to look on screen. This design element is, of course, more obvious the less familiar the world of the character might be. The clothing, hair, and make-up of characters inhabiting worlds in a distant time period or even more distant galaxy will inevitably draw our attention. (Though even there, the intention is to add to the mise-en-scène without distracting us from the story.) But even when the context is closer to home, a story set in our time, in our culture, maybe even our own hometown, every element of the clothes, the hair, and the make-up is carefully chosen, sometimes made from scratch, to fit that context and those particular characters. In other words, each character’s look is carefully designed to support the overall mise-en-scène and help tell the story.
Take costume design, for example. We often think of “costume” as another word for disguise or playing a character. But the last thing a filmmaker wants is for the audience to think of their characters as actors in disguise or playing dress-up. They want us to see the characters. Period. The wardrobe should fit the time, place, and, most importantly, the character. Once that is established, the designer can layer in more subtle hints about the larger context, the underlying theme, by adding a touch of color that serves as a visual motif or introducing some alteration in the wardrobe that dramatizes some narrative shift:
What is important to note is that costume design in film is not about fashion or what looks “good” on an actor. It’s about what looks right on a character, what fits the setting, and the film's overall look.
These same principles can be applied to hair and make-up. As with costume design, it’s easy to think of the more extreme examples of hair and make-up design, especially when the setting calls for something historic, other-worldly, or… horrifying. The special effects make-up for the gory bits of your favorite horror films can sometimes take center stage. But these elements are often not meant to draw our attention at all. To achieve that, perhaps ironically, hair and make-up require even more attention from their respective designers. This is due in part to the technical requirements of filming. Bright lights can reveal every distracting blemish or poorly applied foundation, and as camera and image technology improves, the techniques required to hide the fact that actors are even wearing make-up must be continually refined. But it is also because hair and make-up are incredibly personal and intimately connected to the character:
And while all of this is tremendously important for the audience, it is even more important for the actor playing the character. We’ll discuss the various ways an actor approaches their performance in detail in another chapter, but for now, it’s important to note how much actors rely upon the design of their character through costume, hair, and make-up. Putting on the wardrobe, seeing themselves in another era, a different hairstyle, looking older or younger, helps the actor literally and metaphorically step into the life of someone else and do so believably enough that we no longer see the actor, only the character in the story.
LIGHTING
The first two elements of design in mise-en-scène – setting and character – fall squarely under the supervision of the production designer and the art department. The next two – lighting and composition – fall to the cinematographer and the camera department but are just as important as elements of design in the overall look of the film. We will take a deeper dive into each in a later chapter on cinematography, but for now, let’s take a quick look at how these elements fit into mise-en-scène.
As should be obvious, you can’t have cinema without light. Light exposes the image and, of course, allows us to see it. But it’s the creative use of light, or lighting, that makes it an element of design. A cinematographer can illuminate a given scene with practical light (light from lamps and other fixtures that are part of the set design), set lights (light fixtures that are off-camera and specifically designed to light a film set), or even available light (light from the sun or whatever permanent fixtures are at a given location). But in each case, the cinematographer is not simply throwing a light switch; they are shaping that light, making it work for the scene and the story as a whole. They do this by emphasizing different aspects of lighting direction and intensity. A key light, for example, is the primary light that illuminates a subject. A fill light fills out the shadows a strong key light might create. And a backlight helps separate the subject from the background. And it’s the consistent use of a particular lighting design that makes it a powerful part of mise-en-scène.
Two basic approaches to lighting style can illustrate the point. Low-key lighting refers to a lighting design where the key light remains subtle and even subordinate to other lighting sources. The result? A high-contrast lighting design that makes consistent use of harsh shadows. Another word for this is chiaroscuro lighting (this time, we’re stealing a fancy word from Italian). Think of old detective movies with the private eye stalking around the dark streets of San Francisco.
Classic low-key lighting design.
High-key lighting refers to a lighting design where the key light remains the dominant source, resulting in a low-contrast, even flat, or washed-out look to the image. Think of art-house dramas set in stark, snowy landscapes or even big Hollywood comedies that try to avoid “interesting” shadows that might distract us from the joke.
In either case, the cinematographer, working closely with the director and production designer, is using light as an element of design, contributing to the overall mise-en-scène.
COMPOSITION
The fourth and final design element in considering mise-en-scène – one that I touched on in the last chapter and will receive much more attention in the chapter on cinematography – is composition. As discussed in Chapter Two, composition refers to the arrangement of people, objects, and settings within the frame of an image. And because we are talking about moving pictures, there are really two important components of composition: framing, which even photographers must master, and movement. In the case of cinematic composition, movement refers to movement within the frame as well as movement of the frame as the cinematographer moves the camera through the scene. All of which are critical aspects of how we experience mise-en-scène.
Like lighting, composition falls under the responsibility of the cinematographer. While there are many technical and artistic considerations when it comes to framing and movement, cinematographers are also keenly aware of the design element of composition. In fact, they often describe at least part of their job as designing a shot. Part of this process involves arranging people, objects, and settings in the frame to achieve a sense of balance and proportion, often dividing the frame into thirds horizontally and vertically to ensure proper distribution. We call this the rule of thirds, and it’s fairly common in photography. In fact, take out your phone right now, open the camera app, and you’re likely to see a faint grid across the screen. That’s there to help you balance the composition of your selfie according to the rule of thirds. Another important part of the process of designing a shot is the choreography involved in moving the camera through the scene, whether on wheels, on a crane, or strapped to a camera person.
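As a quick illustration, here is a small Python sketch that computes the rule-of-thirds gridlines and their intersections for a hypothetical 1920 by 1080 frame. Cinematographers do this by eye, of course, but the geometry really is this simple.

```python
# Rule-of-thirds sketch: gridlines and their four intersection points.
def thirds_grid(width: int, height: int):
    """Return vertical/horizontal third lines and their four intersections."""
    v_lines = [width // 3, 2 * width // 3]    # vertical gridlines (x positions)
    h_lines = [height // 3, 2 * height // 3]  # horizontal gridlines (y positions)
    intersections = [(x, y) for x in v_lines for y in h_lines]
    return v_lines, h_lines, intersections

v, h, points = thirds_grid(1920, 1080)
print(v)       # [640, 1280]
print(h)       # [360, 720]
print(points)  # [(640, 360), (640, 720), (1280, 360), (1280, 720)]
```

Placing a subject’s eyes or a key prop near one of those four intersections is the most common way the rule shows up in a composed frame.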
Again, we’ll spend more time on this subject in a later chapter, but take a look at how Japanese filmmaker Akira Kurosawa approaches the composition of movement in designing his shots:
Or how Andrea Arnold uses framing and composition to communicate isolation, captivity, or a deep connection to the earth:
A thoughtfully composed frame does more than create a pleasing image. It can isolate characters, focus our attention, and draw us into the story without us ever really noticing the technique itself.
Unless we know to look for it.
CINEMATIC STYLE
Taken together, setting, character, lighting, and composition make up the key elements of design in creating an effective and coherent mise-en-scène. As discussed earlier, it’s one of the ways we can pick out the work of great filmmakers. A consistent mise-en-scène becomes a kind of signature style of a filmmaker.
But it can also mark the signature style of a particular genre or type of cinema. Take film noir, for example. Remember those detective movies I mentioned earlier? They are part of a whole trend in filmmaking that began in the 1940s with titles like The Maltese Falcon (John Huston, 1941), Double Indemnity (Billy Wilder, 1944), and The Big Sleep (Howard Hawks, 1946). These films and many more are part of a style of filmmaking that includes a gritty, urban setting, tough, no-nonsense characters, low-key lighting, and off-balance compositions. Sometimes, they feature a private detective on a case, but not always. Usually, they were filmed in black and white, but not always. In fact, film noir – which literally means “dark film” in French (what is with all the French?!) – has been historically difficult to define because the specific elements can vary so widely. However, one easy way to identify a film as part of that tradition is through its mise-en-scène. Mise-en-scène isn’t about any one element; it’s that overall look, the whole, that is greater than the sum of its parts.
And that can extend to a whole national trend in cinema as well. Because cinema is so profoundly connected to a particular cultural context, part of that give and take in the cultural production of meaning, it should come as no surprise that there are specific periods in a given place and time when cinema can take on a kind of national style. Cinema artists working in the same place and time, all speaking the same cinematic language, produce a unified, identifiable style, which is another way of saying a consistent mise-en-scène.
One example of this can be found in the films produced in Japan.
According to the Center for Japanese Studies, the Japanese cinematic style is a set of cinematographic techniques commonly detected in Japanese filmmaking of all ages, such as long ASL (average shot length), static or slow camera movement, emotions expressed via natural phenomena, and, to a lesser extent, deep focus shots, flat lighting, and shots empty of reference (as in lingering on details that don’t directly connect to the narrative). The Japanese Cinema Archives terms this Japanese Cinimalism.
There is perhaps no better representative of Japanese cinema than the aforementioned Akira Kurosawa. For a taste, here is a video of thirty-five scenes from Kurosawa’s iconic adaptation of Shakespeare’s King Lear, Ran, one of his “color” period films.
However, there is an entirely different kind of medium that Japan is more famous for in modern America: animation. It may be obvious, but animation is an entirely different medium than traditional cinema.
Consider that Andre Bazin, a renowned and influential French film critic, asserted,
All art is founded upon human agency, but in photography alone can we celebrate its absence… photography's objectivity confers upon it a degree of credibility absent from any painting.
This raises the question: can we conceive of animation as a collection of individual paintings? Is it subject to the same criticisms as other films, and does it have any more objectivity because it is a moving image, or is it a created world unto itself? Perhaps the best example is a film we will study in this course, Akira. This analysis of how light was used and animated in Akira demonstrates not only the potential of animation and how it can push traditional film but also how different the process of its creation is.
Spirited Away is celebrated for its meticulous hand-drawn animation, as is Akira, which aligns with the observations of Lev Manovich (a professor and scholar at the City University of New York) on the significance of manual image construction in the digital age. As we analyze the slice-of-life, let us consider Manovich's point of view, specifically what he asserts in the work published on his website:
Seen in this [rampant use of CGI in live action films] context, the manual construction of images in digital cinema represents a return to nineteenth century pre-cinematic practices, when images were hand-painted and hand-animated. At the turn of the twentieth century, cinema was to delegate these manual techniques to animation and define itself as a recording medium. As cinema enters the digital age, these techniques are again becoming commonplace in the filmmaking process. Consequently, cinema can no longer be clearly distinguished from animation. It is no longer an indexical media technology but, rather, a sub-genre of painting.
He follows up later by stating:
Manual construction and animation of images gave birth to cinema and slipped into the margins...only to re-appear as the foundation of digital cinema. The history of the moving image thus makes a full circle. Born from animation, cinema pushed animation to its boundary, only to become one particular case of animation in the end.
In Spirited Away, Hayao Miyazaki masterfully employs framing techniques to weave a rich tapestry of narrative and emotion, grounding the film in a visually and thematically dense landscape. Wide shots are key in establishing the expansive world of the spirit realm, offering viewers a sense of scale and immersion into its mystical boundaries. These wide frames are not merely for aesthetic appeal but serve a narrative function, situating Chihiro within a vast, almost overwhelming, mystical setting. This technique underscores the grandeur of the spirit world while highlighting Chihiro's initial isolation and the monumental nature of her journey.
Conversely, Miyazaki's strategic use of close-ups intimately connects the audience with Chihiro's emotional journey. Focusing closely on Chihiro’s expressions and reactions, these shots capture the subtle shifts in her emotions as she navigates the challenges of the spirit world. This careful visual articulation of Chihiro's emotional state fosters a deep, empathetic connection, allowing the audience to feel her fears, joys, and growth. The emotional depth conveyed through these close-ups is pivotal, as it transforms Chihiro’s journey into a shared emotional experience with the audience, making her character development and the narrative’s emotional stakes more engaging and relatable. So, it may be true that CGI has essentially turned some live-action films into quasi-animations, but it is also true that animation employs techniques seen in live-action films.
Interestingly, as we explore the most technologically advanced city in the world, DamiLee notes how artificial intelligence could never write Miyazaki's films, and the reason lies in those very slice-of-life moments.
As Dami states in the video, this meticulous attention to mise-en-scène is evident in the way spaces within Spirited Away are designed to evoke the feeling of ma—the Japanese concept of negative space, emphasizing the tension and interplay between objects (this is a concept that will come up again). This principle shapes how viewers perceive and interact with the film's world, allowing for moments of reflection and a deeper connection to the characters.
The film's setting, infused with rich sensory details, allows viewers to feel the dampness on their skin or the wind in their hair, transcending cinema's visual and auditory mediums. This sensory engagement is achieved through detailed compositions that include lighting techniques like komorebi—the dappled light filtering through trees, creating a dream-like atmosphere that is distinct to Studio Ghibli's storytelling style.
That is the power of mise-en-scène in any context: to unify a cinematic experience and to provide the aesthetic context for whatever else the filmmaker might be up to. Drawing on setting, character, lighting, and composition, mise-en-scène is more than any one technique; it’s a film's overall look or feel and is far greater than the sum of its parts. This is why I chose to start here in our exploration of how, exactly, cinema works the way it does.
Video and Image Attributions:
The Sensual World of Claire Denis by Little White Lies. Standard YouTube License.
The Wes Anderson Obsession by Ana Romão. Standard YouTube License.
Georges Méliès – The Kingdom of the Fairies / Le Royaume des Fées (music by Steffen Wick) by PIANO PARTICLES. Standard YouTube License.
Intolerance (1916) — Belshazzar’s feast in Babylon by Fix Me A Scene. Standard YouTube License.
Metropolis (opening scenes) with score by Zack Kline by Zack Kline. Standard YouTube License.
Go behind-the-scenes of the ‘Thor: Ragnarok’ set design by QAGOMA. Standard YouTube License.
All Hollywood VFX Removed! What Movies Really Look Like by Fame Focus. Standard YouTube License.
‘The Crown’ Sets Explained by the Show’s Set Designer | Notes on a Set by Architectural Digest. Standard YouTube License.
Costume Design: The Hidden Layer of Movie Magic by Now You See It. Standard YouTube License.
The art of Hollywood special effects makeup by CBS Sunday Morning. Standard YouTube License.
The Big Combo, 1955, Joseph H. Lewis, dir. Public Domain Image.
Akira Kurosawa – Composing Movement by Every Frame a Painting. Standard YouTube License.
Andrea Arnold’s Women in Landscapes by Fandor. Standard YouTube License.
Center for Japanese Studies Publications: Japanese Cinema: Film Style and National Character
AKIRA: How To Animate Light by Nerdwriter1. Standard YouTube License.
Why Studio Ghibli movies CAN'T be made with AI by DamiLee. Standard YouTube License.
Explorative Assignment: Animation and Movement
Purpose:
In this assignment, we turn to the slice-of-life animation genre; we turn to Ghibli and a slightly pissed-off red panda. Aggretsuko and Spirited Away may not exemplify the slice-of-life genre on the surface, but each has the unique ability to transform the mundane into the extraordinary. Through the lens of animation, these stories celebrate the nuanced, often overlooked moments of daily existence, inviting viewers to find beauty, resilience, and inspiration in the rhythms of regular life. Whether navigating the complexities of adulthood in a metropolitan office or exploring the tender awakenings of first love in suburban Tokyo, these narratives remind us that within every ordinary life lies an extraordinary story waiting to be told.
In this assignment, we seek to recreate this slice-of-life scenario by reinterpreting the animation, and yes, this is an invitation to engage in one of Tokyo's most famous pastimes: karaoke.
Criteria:
Option One: Aggretsuko
Recreate a specific scene that captures a significant moment in the protagonist's life, emphasizing the animation style, character dynamics, and how these elements serve the narrative's exploration of adult life and self-expression.
Option Two: Spirited Away
Focus on a scene that highlights the movie's core themes of slice-of-life and coming-of-age. Pay special attention to the animation and movement, analyzing how they contribute to the story and character development.
Turn-in Methods:
- Script: A detailed script that outlines the dialogue, action, and camera work of the chosen scene.
- Storyboard: A visual map of each shot in the scene, tailored to replicate the original animation's framing and composition.
- Shot List and Schedule: Organize shoots, equipment, and participants with a clear plan that mirrors the original scene's complexity and timing.
- Animatic: An animatic that sequences the storyboard into a rough visual flow, mimicking the pacing and motion of the original animation.
- Footage: Raw footage or animation that closely replicates the chosen scene, demonstrating careful attention to cinematography and performance techniques.
Required Scene Analysis:
In addition to the recreated scene, each student must submit a 200-300 word analysis focusing on the animation within their selected scene. This analysis should explore:
- The technical aspects of the animation (e.g., frame rate, style, character movement).
- How the animation style contributes to the overall narrative and character development.
- Personal interpretation of why specific animation techniques were used and what they convey about the scene's thematic and emotional content.
Option Three: Recreating a Scene
Purpose: Work as a team on your group project, using the aforementioned sites from the excursions in this module to create your final project. You may complete any of the following portions of the final assignment:
Turn-in Methods:
- Storyboard: A storyboard that visually maps out each shot, tailored to the locations available.
- Shot List and Schedule: This is a comprehensive shot list and schedule that organizes shoots, equipment, and actor availability.
- Footage: Raw footage of the recreated scenes, demonstrating the application of cinematography techniques.
Please note that each of these items needs to be completed.
Task:
Complete the aforementioned journal following one of the prompts (you need not answer every question, only those related and relevant to your overall point).
You may submit your response in either a written, oral, or video format uploaded into the Google Classroom.
Example film critique of Whisper of the Heart: Whisper of the Heart: A Reflection on Writing from a Film Academic
Film Journal: Kurosawa; Godzilla; Midnight Diner
Film Journal: Mise-en-Scène
In our earlier chapters, we discussed Akira Kurosawa. How could we not? He is one of the greatest directors in the history of cinema. In terms of Japanese cinema, he is nearly unrivaled.
Here is what the Center for Asian Studies says about Kurosawa and his artistic way of depicting life:
Though Kurosawa made his career in film, his earliest artistic ambitions focused on painting and illustration. In 1936, forced to find a more lucrative profession, Kurosawa found work as an assistant film director trainee and succeeded so well in this form that he began directing entire films himself in the 1940s. His first film, Sugata Sanshiro, released in 1943, depicts in Buddhist terms a young man’s spiritual and physical path to becoming a judo expert. According to film historian Donald Richie, this film not only shows Kurosawa’s artistic independence (the director constantly fought with war-time censors who wanted the film to show nationalistic spirit and support for the war effort), but also reveals a major theme of Kurosawa’s work: the interplay of illusion and reality. The popularity of this film in Japan led to several more, some set in the past world of the warrior, such as Seven Samurai (1954), while others, such as No Regrets for Our Youth (1946) and Ikiru (1952), explore illusion and reality in post-war Japan, engaging the personal and political dimensions of social issues. Kurosawa’s first international success, as well as his first academy award, came in the early 1950s with Rashomon, a film which relates a crime through the accounts of three participants whose quite different perspectives on the event make the viewer wonder whether the notion of truth has any value at all.
(They say a lot more; you should check it out)
Purpose:
In this journal, the student should pick a scene to analyze to demonstrate a critical understanding of mise-en-scene, the overall process of constructing a set, set design, and its overall impact on the "invisibility" of film. This video analysis presented by Every Frame a Painting demonstrates how Kurosawa employs geometry within a scene to keep the scene alive:
In our analysis, we will explore the geometry of a scene through the lens of mise-en-scène.
Please write a critique of a scene from either Midnight Diner or Godzilla Minus One using the principles of geometry in a scene as described above.
Here is the challenge for our journal entries: this is not my textbook; it is our textbook. Is there a section of this module you think could be better, or would you like to add to it? Go for it. That is the point.
Prompts:
Reflect on a scene from either Midnight Diner or Godzilla Minus One and analyze it through the lens of geometric composition as described in the art of cinema. Consider the scene's mise-en-scène closely. Evaluate whether the scene unfolds in a genuine locale or on a constructed set. Is it an indoor or outdoor setting? Does it occur during the daytime or at night? Assess the lighting—does it employ high-key or low-key techniques? Highlight any notable elements of the production design. Is there evident use of computer-generated imagery (CGI)? Delve into why the filmmakers might have opted for these specific staging choices.
Focus on the scene's geometric structure—how are the characters and objects arranged? For instance, are they positioned to form simple shapes like squares and triangles, similar to Akira Kurosawa's approach in The Bad Sleep Well? Kurosawa's scenes often pivot around geometric forms, creating visual tension and narrative flow without relying on excessive dialogue or cuts. This technique directs the viewer's gaze naturally, from one character to another, within structured frames like triangles and squares that evolve with the scene's dynamics.
Consider a character that stands out to you. Reflect on their design—how their attire, hairstyle (if applicable), and personal artifacts or accessories contribute to their identity. Do these elements transform over the course of the narrative? What do these design choices reveal about the character's personality, status, or evolution?
Criteria:
All journal entries should use correct MLA formatting, specific diction and terms from the module, and a direct answer to the prompt using specific scenes and examples from the film you are reviewing. You must also provide a works cited page for the film you are reviewing!
Your response should be at least 500 words.
Task:
Complete one of the following prompts (you need not answer every question, only those related and relevant to your overall point).
You may submit your response in either a written, oral, or video format uploaded into the Google Classroom.
Example film critique of Kurosawa: Kurosawa on the Human Condition: A Technical Film Analysis
Week Two, Module One - Cinematography
Photography is the art of fixing an image in durable form through either a chemical or digital process. It requires a detailed, scientific knowledge of how light reflects off the lived environment and how that light reacts to various light-sensitive media. It also requires a sophisticated grasp of color temperature and the interplay of light and shadow, as well as an artist’s sensibility for composition: the arrangement of objects and setting within the camera's frame to achieve balance and visual interest. Not to mention a deep, technical understanding of the gear required: cameras, formats, lenses, and their respective idiosyncrasies. And it helps if you know how to tell a story in a single image, frozen in time. After all, a picture is worth a thousand words.
Now, do that at least 24 times every second. That’s cinematography.
Capturing the moving image. For many film lovers, and even the casual viewer, this is what we show up for. But I’ve waited five chapters to discuss it because it’s important to understand that cinematography – while it may often get the most glory – is only one part of how cinema works. Without a sophisticated mise-en-scène and a narrative to follow, it’s just a bunch of meaningless images. Not to mention the importance of editing, sound, and performance. Put it all together, and cinematography is the anchor point of a much larger cinematic experience.
The person responsible for all of this is the cinematographer, sometimes known as the director of photography (DP). Their job is to translate the director’s vision into usable footage, using all of the photographic skills listed above and only after making a series of crucial decisions, which we will get to below. It is one of the most technical jobs in cinema, requiring as much science as it does art:
And just as the production designer oversees a whole crew of craftspeople helping to fully realize the mise-en-scène, the cinematographer also relies on a large team known as the camera department. The camera department includes the camera operator, the person actually handling the camera. I know; it seems like that should be the cinematographer. And it often is. But on larger productions with multiple cameras or very complex shots, the cinematographer can only be in one place at a time. There’s also the 1st assistant camera (1st AC), who is responsible for the camera components, swapping out lenses, and, most importantly, keeping the camera in focus. However, that last job is sometimes given to another dedicated member of the team, the focus puller. Then you have the 2nd assistant camera (2nd AC), who assists the 1st AC and often operates the slate or clapper (more on that later).
A relatively new member of the camera department is the Digital Imaging Technician (DIT). With the rise of digital cinematography, instead of a dedicated person responsible for loading film onto the camera (known as a film loader, so creative with the names), we now have a person solely responsible for organizing the digital files coming off the camera. That can include quality control and color correction during the shoot.
Outside the dedicated camera department, the cinematographer also oversees the lighting department as well as the grip department, also known collectively as grip and electric. The lighting department is, well, responsible for all the lights required to shoot a scene. As should be obvious, lights require electricity. And electricity can be dangerous, especially when you have 100 crew people running around trying to get a shot before lunch. So, the head of the lighting department is a skilled electrician known as the gaffer. The gaffer has a first assistant as well, called the best boy. (I know, not very gender-neutral. If the “best boy” is female, they might be called best babe, which is worse.) And then a whole crew of electrics is responsible for putting the lights wherever the gaffer tells them to. Grips are there to move everything else that isn’t a light. That includes lighting stands, flags, bounces, even cranes, dollies, and the camera itself. The head of the grip department is the key grip, and one of their most important jobs is on-set safety. With so many literal moving parts, it is very easy for someone to get hurt.
That’s a lot of people to keep track of for one cinematographer, but fortunately, there is a tightly controlled hierarchy, and they all know their jobs. A simple command from the cinematographer, “Flag off that 10k, we’re going wide on the dolly,” may sound like gibberish, but everyone on a film set knows exactly what to do. In fact, there’s a whole cinema-specific vocabulary that film crews use to keep the shoot moving quickly and efficiently. From apple boxes to barn doors to C-stands, the lingo can get downright bizarre. Clothespins are not clothespins; they’re C-47s (and yes, they use a lot of clothespins on a film set), and breakfast isn’t the morning meal; it’s the first meal on set, which could be 6 o’clock in the evening. And if someone is in the bathroom, they’re 10-100 (or 10-200 as the case may be), but they’re definitely not “in the can”, which is what you say when a scene is completed.
But aside from the esoteric lingo on the set, there are a few key terms everyone should know. The first is the shot, the most basic building block of cinematography. As mentioned in Chapter Two, a shot is one continuous capture of a span of action by a motion picture camera. A finished film is made up of a series of these shots of varying lengths that ultimately tell the story. But during production, each shot may need to be repeated several (or dozens or even hundreds of) times until everyone gets it right. Every time they repeat the shot, it’s called a take. And once the director and cinematographer feel they have the best version of that shot, it’s time to move the camera – and everything associated with it – to a new shot, sometimes just a slightly different angle on the same scene. That’s called a set-up. New set-ups require everyone on the crew to jump into action, rearranging the camera, the lights, the set dressing, etc. That can take time. Lots of time. And it’s one reason assistant directors, responsible for planning how long it will all take, think of the schedule in terms of the number of set-ups a crew can accomplish each day.
Obviously, a film set is a complicated place requiring a complex choreography of dozens, if not hundreds, of personnel all dedicated to rendering the moving picture. But there are many decisions a cinematographer has to make before they even arrive on set. These decisions – film or digital, black and white or color, lighting, lenses, framing, and movement – are all made in collaboration with the director and in service to the narrative and the overall mise-en-scène. Some of them are incredibly technical, and some are purely aesthetic, but each one of them will affect how we engage in the cinematic experience.
FILM VERSUS DIGITAL
One of the first decisions a cinematographer must make is what medium she intends to use to record the images: a physical film stock or a digital sensor. While this is a highly technical decision, it is also an important aesthetic choice that will affect the overall look of the final image. Not only are there differences in the look of film versus digital recording generally, but there are also subtle distinctions in the various film stocks and manufacturers, as well as the different types of digital sensors that come with different camera systems. Let’s take each one in turn.
Good old-fashioned film stock has been around since the dawn of cinema, though it has evolved quite a bit since those early days. In the beginning, the strips of light-sensitive material were made from nitrate, a highly flammable material, which was not so great when it was whirring through a projector past a hot lamp. It’s one of the reasons many early films are lost to history. They simply burned up too easily. Today, film stock is made from a much sturdier plastic. And on that plastic is a gelatin coating containing thousands of microscopic grains of light-sensitive crystals called silver halide. When light hits those crystals, they darken, depending on the amount of light. (And if it’s a color film, there will be three separate layers of those crystals, one blue, one red, and one green.) A chemical bath enhances that reaction to light, rendering a negative image that can then be projected.
Once a cinematographer commits to this analog, chemical process, there are still a lot of decisions to make. First, they must choose a film gauge, that is, the size of the film stock. The film gauge is the width of the film strip, measured in millimeters. The standard film gauge in cinema today is 35mm, but sizes range from as small as 8mm all the way up to 70mm. And each size will render a different look, with more or less detail once enlarged. They must also decide how sensitive the film will be to light. Highly sensitive, or “fast” film stock, that is, a film that reacts quickly to relatively low levels of light, contains relatively large silver halide crystals (more surface area to absorb the light). The benefit is the ability to film at night or in other low-light situations. The drawback is a loss in resolution or detail in the image due to the larger crystals, which show up on screen as grain. Less sensitive, or “slower” film stock produces a crisper image (due to the smaller crystals) but requires more light.
There are many other decisions to be made that may affect the final image – the manufacturer, black and white versus color, the developing process – but using the physical medium of film stock renders an image that many filmmakers claim has a more organic look, a difference you can almost feel more than see. And that comes at a price. Film stock must be purchased by the foot, forcing filmmakers to plan every shot carefully to avoid wasting material. (Of course, many filmmakers see this as a good thing). Not to mention the fact that you don’t really know what you have until you develop the film after a day of shooting. Or the fact that you have to assemble your final film by actually cutting and taping together physical strips of film. Or the fact that even if you choose to shoot on analog film stock, most of your audience is going to watch a digitized version in the multiplex or on their television, laptop or smartphone anyway.
For these and many other reasons, the good old-fashioned film has fallen somewhat out of fashion in favor of the flexibility of digital cinematography. Digital cinematography is identical in every way to analog film cinematography – same basic equipment, same need to control exposure, shape light, compose the image, etc. – with one important difference: the light passing through the lens hits a digital image sensor instead of a strip of plastic film. That sensor uses software to analyze and convert the light bouncing off its surface into a series of still images (just like film stock) that are recorded onto flash memory or an external hard drive.
The advantages should be obvious. First and foremost, there are almost no limits on how much you can record, especially as digital data storage becomes cheaper and cheaper. And since the sensor is controlled by software, you can adjust settings such as light sensitivity at the press of a button rather than changing out the film stock.
But there are still lots of decisions to be made. Just as there are various film gauges, digital sensors come in all shapes and sizes, and every camera manufacturer produces their own subtle variations. And while most of us could probably never tell the difference, cinematographers are very particular about the way a Canon sensor renders color differently from a Sony sensor or a RED sensor from an Arri sensor.
And then there’s the issue of resolution. The standard for “high definition” is an image measuring 1,920 pixels by 1,080 pixels, also known as 1080p (the “p” stands for progressive scan since the image is rendered line by line from top to bottom). Pixels are the smallest visible unit in a screen’s ability to produce an image. Think of them as analogous to those tiny silver halide crystals in film stock. 1,920 by 1,080 pixels is a lot of detail, but most digital cinema today is recorded at a much higher resolution of at least 4,096 pixels by 2,160 pixels, or 4K. And even that has become commonplace and somewhat outdated. In fact, you probably have a 4K camera in your pocket right now. It’s on your phone. As the technology improves, we’ll see 6K, 8K, and 10K become standard. All that information packed into every image renders an incredible amount of detail (and also eats up a lot of storage space). Detail most of us, frankly, will not be able to see with the naked eye.
But resolution isn’t the only factor that affects image clarity. Cinematographers can also manipulate the frame rate to render super-sharp imagery. For decades, the standard frame rate for cinema has been 24 frames per second. That produces a familiar, cinematic “look” to the finished film in part because of motion blur, the subtle blurring that occurs between still images passing at 24 fps. But film shot and projected at 48, 96, or even 120 frames per second renders an ultra-sharp image with almost no motion blur as our brains process far more detail between each individual frame. To be fair, this is possible with analog film stock, but it is impractical to shoot that much film stock at that high a rate. Digital cinematography gives filmmakers like Ang Lee (Billy Lynn’s Long Halftime Walk (2016), Gemini Man (2019)) and James Cameron (the Avatar series) the freedom to experiment with these higher frame rates combined with higher resolution sensors to produce images we literally have never seen before.
BLACK & WHITE VERSUS COLOR
Another decision cinematographers must make early in the process, in collaboration with the director, is whether to record the image in black and white or color. For many of you, this may seem more a question of history. Old movies are black and white; modern movies are in color. Once the technology allowed for color cinematography, why would anyone look back? But there are a number of reasons why a filmmaker might choose to film in black and white over color, even today. They may want to evoke a certain period or emulate some of those “old” movies. Or, if the subject matter is relatively bleak, they may want the added thematic element of literally draining the color from the image. Or they may want to take advantage of the heightened reality and sharp contrast that black-and-white cinematography provides. Or maybe they want to foreground the performances. One of the greatest directors in cinema history, Orson Welles, once said black and white was the actor’s friend because every performance is better without the distraction of color.
But I get it. It’s not 1920. You don’t ride a penny-farthing or listen to music on wax cylinders. Why would you watch a movie in black and white?
Maybe this will convince you:
Whatever their reason, cinematographers must take several things into account once they choose between black and white and color. First, if they are shooting black and white on film, they typically have to use a film stock designed for black and white imagery. It is possible to print black and white from a color negative, but it won’t render the light and shadows in quite the same way as a dedicated film stock. And, of course, if they are filming in color, different film stocks from different manufacturers will render colors differently depending on the desired effect. If they are using digital technology and want the final product to be black and white, the color is usually removed after filming in post-production. But they still have to balance lighting and exposure for how the image will render without color. In either case, it’s important to note that black-and-white cinematography requires just as much attention to detail in the filming process as color.
LIGHT AND LIGHTING
Whether shooting film or digital, black and white or color, one of the most powerful tools a cinematographer has to work with is light itself; without light, there is no image, and there can be no cinema. But simply having enough light to expose an image is not enough. A great cinematographer – heck, even a halfway decent one – knows that their job is to shape that light into something uniquely cinematic. To do that, they must have a deep understanding of the basic properties of light. Four properties, to be specific: Source, Quality, Direction, and Color.
The source refers to both the origin and intensity of the light. There are two basic distinctions in terms of origin: natural or artificial. Natural light refers to light from the sun or moon (which is really just the sun bouncing off the moon, but you knew that), and artificial light refers to light generated from any number of different technologies, LED, incandescent, fluorescent, etc. Each source will have its own particular characteristics, exposing a shot in its own particular way. Artificial light allows a cinematographer an incredible amount of freedom to manipulate and shape the light. Scenes shot indoors on a soundstage can be made to look like daytime exteriors with enough artificial light. Scenes shot outdoors at night can also be augmented with artificial lights standing in for moonlight. But natural light can also be manipulated and shaped through filters, flags (large black fabric squares used to block off the sun’s direct light), and diffusers.
Each new scene will require the cinematographer to consider their light source and how they want to shape it. And a big part of that calculation is intensity. How bright is the source, and how is that going to affect exposure? We’ll discuss depth of field later on, but how much light a cinematographer has to work with affects how much (or how little) of the shot can be in focus and how balanced their exposure will be in the final image. Sometimes a cinematographer can get away with just using available light, that is, the light from the pre-existing fixtures in a location (also called practical lights). But more often, they want to control the intensity more precisely, so they use specialized lights to illuminate the scene from outside the frame of the image. The lamps and overhead lights you might see in a film or TV series are actually more props than true lighting sources. They indicate to the viewer where the light is coming from in a given shot – what cinematographers call motivating the light source and direction – but they rarely add anything to the exposure of the scene.
Check out this short clip:
The subject in the scene is lit by several bright artificial lights just off camera. The table lamp in the background is only there to “motivate” the light that illuminates the side of the subject’s face. But it’s really just a psychological trick. If you really think about it, a dim lamp behind and to the right of the subject should not illuminate his face at all, but our brain tells us, “Sure, that makes sense.” That’s because we really want to believe; we don’t want to think about a crew of people standing around bright lights while a camera records it all. We want to be fooled, and the cinematographer knows that.
The second property of light cinematographers have to think about is quality. This doesn’t mean “good” or “bad”; it’s more about how the light “feels” in the shot. The easiest way to think about quality is in terms of hard or soft lighting. Hard lighting is intense and focused, creating harsh, dramatic shadows. Soft lighting is more diffused and even, filling the space with smooth, gradual transitions from light to dark. The difference is actually less about the light on the subject and more about the shadows cast by the subject. Are the shadows clearly defined with a hard edge? You’ve got hard lighting. Are the shadows fuzzy, less clearly defined, or maybe even absent entirely? You’ve got soft lighting. Cinematographers can control the quality of light by adjusting the size of the light source and its distance from the subject. Typically, the smaller the light source and the farther it is from the subject, the harder the light:
The third important property of light is direction. Where is the light coming from in the scene? Not the source, what makes the light, but what direction is it coming from? Left, right, below, above? Each decision will affect the look and feel of a scene, and practical lights in the set design can help motivate lighting direction. A single overhead lamp in an interrogation room will motivate a hard light from above. Large windows can help motivate a soft, diffused light from one side of the room.
Cinematographers plan their lighting set-up for any given scene by thinking carefully about what direction the light is coming from, starting with the main source of illumination, the key light. The key light is usually the brightest light on the set, used to properly expose the main subject. But just one bright light will feel like a spotlight, creating unwanted shadows. So, they use a fill light, usually less intense and a bit softer than the key light, to fill out those shadows. But those two lights shining on the front of your subject can make the scene feel a bit two-dimensional. To bring some depth to the image, they use a backlight, usually a hard light that shines on the back of a subject’s head (also called a hair light), to create some separation between the subject and the background. The brightness of each of these lights relative to each other is known as the lighting ratio and can be adjusted for various different effects. This lighting setup is known as three-point lighting, and it’s the most basic starting point for lighting a scene:
Of course, three-point lighting is just that, a starting point. Really complex lighting schemes will require far more layers to the set-up. But even then, cinematographers will talk to their gaffers, electrics, and grips in terms of key, fill, and back lights.
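The lighting ratio mentioned above is usually talked about in stops, where each stop doubles (or halves) the amount of light. Here is a minimal Python sketch of that arithmetic; the ratios and the high-key/low-key associations are standard photographic rules of thumb, not anything prescribed by a particular cinematographer:

```python
import math

def ratio_to_stops(key: float, fill: float) -> float:
    """Convert a key-to-fill lighting ratio into a difference in stops.

    Each stop represents a doubling (or halving) of light, so the
    difference in stops is the base-2 logarithm of the ratio.
    """
    return math.log2(key / fill)

# A 2:1 ratio (gentle contrast, a high-key feel) is a one-stop difference;
# an 8:1 ratio (deep shadows, a low-key feel) is a three-stop difference.
for key, fill in [(2, 1), (4, 1), (8, 1)]:
    print(f"{key}:{fill} ratio -> {ratio_to_stops(key, fill):.0f} stop(s)")
```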
The fourth property of light that every cinematographer must understand is color. And no, I don't mean red, blue, and green light bulbs. I mean the subtle color cast that different light sources give off that will ultimately affect the exposed image. For example, a typical household incandescent light bulb uses a tungsten filament to produce light. That light usually has a warm orange glow to it. But a fluorescent tube light in a ceiling fixture gives off a cooler, bluer light. In fact, we've come up with a way to measure these differences using the concept of color temperature. Color temperature is measured in kelvin (K). The lower the color temperature, the warmer or more "red" the light. The higher the color temperature, the cooler or more "blue" the light. The orange glow of a tungsten bulb is around 3200K. Daylight is around 5600K.
It can get a little confusing, I know. Check out this quick overview of the science behind color temperature and how we use it in cinema:
As should be clear by now, color temperature matters a great deal when a cinematographer wants to set a particular mood. For example, a romantic scene in a candle-lit restaurant should have a warm orange glow. Fortunately, you don’t need to rely on a thousand candles to achieve that effect. Most modern LED (light-emitting diode) lights can be adjusted according to color temperature. All you have to do is dial in 2000K to your key, fill, and backlights, and you get the equivalent of the warm glow of candlelight without the fire hazard.
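Working cinematographers and gaffers often convert kelvin values into mireds (one million divided by the color temperature), because correction gels and white-balance shifts behave roughly linearly on that scale. Here is a small, hedged Python sketch of the arithmetic, using the approximate kelvin values quoted above:

```python
def kelvin_to_mired(kelvin: float) -> float:
    """Convert a color temperature in kelvin to mireds (micro reciprocal degrees)."""
    return 1_000_000 / kelvin

# Approximate color temperatures for common sources (values from the text above).
for name, kelvin in [("candlelight", 2000), ("tungsten", 3200), ("daylight", 5600)]:
    print(f"{name:>11}: {kelvin}K -> {kelvin_to_mired(kelvin):.0f} mired")

# The warming shift needed to bring a daylight source down to tungsten.
shift = kelvin_to_mired(3200) - kelvin_to_mired(5600)
print(f"Daylight -> tungsten is roughly a +{shift:.0f} mired warming shift")
```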
Intensity, quality, direction, and color are the four most important properties of light cinematographers must master to create great cinema. Once we understand these same properties, we can start to understand how cinematographers combine them to achieve an effective lighting style in any given scene, film, or series. For example, by lowering or removing the key light and relying more on indirect, relatively hard fill and backlights, you create deep shadows and high contrast in a scene. As mentioned in Chapter Three, this style of lighting is known as low-key lighting (because of the lack of a dominant key light, not because it's laid back), used to evoke mystery and even terror.
Check out this short video essay on one of the greatest living cinematographers, Roger Deakins, and how he approaches lighting style in his work:
THE LENS
Another powerful tool a cinematographer has to work with is, of course, the camera. And there is a lot that goes into how that particular apparatus works and the nuances between different formats and manufacturers. But I want to focus on the one component that is interchangeable and allows for endless variety: the lens. No matter what camera a cinematographer chooses, it’s the lens that determines the clarity, framing, depth of field, and exposure of the image. Just by changing the lens, without moving the camera at all, you can radically transform the look of a shot.
The principle behind a camera lens is pretty simple. A piece of curved glass (or several pieces depending on the lens), held in place on the front of the camera, focuses light through an adjustable aperture (a fancy word for “hole”) and onto light-sensitive material (film or a digital sensor). The aperture controls the amount of light entering the camera, and the glass “elements” control the sharpness of the image by moving closer or further away in tiny increments from the aperture. The overall distance between the sensor and the point at which the light passes through those glass elements is called the focal length[1] and is measured in millimeters. So, in a 50mm lens, the distance between the sensor of the camera and the point where the light passes through the glass of the lens is 50 millimeters.
The focal length determines both the angle of view and the magnification of the image. The shorter the focal length, the wider the angle of view and the smaller the magnification. The longer the focal length, the narrower the angle of view and the greater the magnification. Any lens below 35mm is generally considered a “wide-angle lens” because of its relatively short focal length. Any lens above 70mm is considered a “telephoto lens” because it greatly magnifies the image.
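To put numbers on that relationship: the horizontal angle of view follows from the focal length and the width of the recording format, AOV = 2 x arctan(sensor width / (2 x focal length)). Here is a minimal Python sketch assuming a full-frame sensor about 36mm wide; other formats (Super 35, Micro Four Thirds, a phone sensor) will give different angles for the same focal length:

```python
import math

SENSOR_WIDTH_MM = 36.0  # assumed full-frame sensor width

def horizontal_angle_of_view(focal_length_mm: float) -> float:
    """Horizontal angle of view, in degrees, for a given focal length."""
    return math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * focal_length_mm)))

# Shorter focal lengths give a wider view; longer ones a narrower, magnified view.
for f in (24, 35, 50, 85, 135):
    print(f"{f}mm lens -> {horizontal_angle_of_view(f):.0f} degree angle of view")
```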
Lenses can be divided into two basic types based on how they treat focal length: zoom and prime. Zoom lenses allow you to adjust the focal length by sliding the glass elements closer to or further away from the sensor, thus greatly magnifying the image or widening the angle of view without swapping out the lens itself. Prime lenses have a fixed focal length. What you see is what you get. Now I know what you’re thinking. Why not just slap a zoom lens on there and choose your own focal length? But actually, cinematographers almost always use prime lenses when filming. For one thing, zoom lenses tend to have many more glass elements than primes and that can affect the quality of the image. But more importantly, prime lenses force the cinematographer to be more deliberate and intentional about the angle of view and magnification of a particular shot.
Confused yet? Maybe this will help:
Still confused? Here’s an explanation in just 23 seconds:
The angle of view and magnification are important in terms of what's visible in the frame, but just as important is what appears to be in sharp focus. Lenses also allow cinematographers to control the depth of the image by either isolating a subject as the only element we see clearly in a particular shot or allowing us to see everything in the background and foreground equally. This is called depth of field, the range of distance in front of the camera in which subjects are in sharp focus.
Take a look at this image:
Note how the figure of the man lighting his cigarette is isolated from the background, focusing our attention on the spark from the lighter. This is an example of a narrow depth of field. The range of distance in front of the camera in which subjects are in sharp focus is relatively small, creating less depth in the image.
Now check out this image:
Note that everything seems to be equally in focus, allowing us to pick out all of the details of the set design. This is an example of a wide depth of field or deep focus.
But since cinematography is all about moving pictures, this is not necessarily a binary choice. A cinematographer can change the depth of field within a shot to shift our attention from one subject to another. This is called a rack focus or pull focus:
Now that you know what it is, you'll see it all the time in film and TV. In fact, there's usually one person on set whose only job is to manage those shifts in the depth of field within a shot. That person is called, appropriately enough, the focus puller.
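For readers who want to see the optics behind a narrow versus a deep depth of field, here is a hedged Python sketch using the standard hyperfocal-distance approximation. The circle-of-confusion value is an assumption (a common figure for a full-frame sensor), and the lens and f-stop choices are illustrative only:

```python
def depth_of_field(focal_mm: float, f_number: float, subject_m: float,
                   coc_mm: float = 0.03) -> tuple[float, float]:
    """Near and far limits of acceptable focus, in metres.

    Uses the standard hyperfocal-distance approximation:
      H  = f^2 / (N * c) + f
      Dn = s * (H - f) / (H + s - 2f)
      Df = s * (H - f) / (H - s), infinite once the subject is beyond H.
    """
    f = focal_mm
    s = subject_m * 1000.0                 # work in millimetres
    H = f * f / (f_number * coc_mm) + f    # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)
    far = float("inf") if s >= H else s * (H - f) / (H - s)
    return near / 1000.0, far / 1000.0

# A long lens wide open isolates the subject (a narrow depth of field)...
print(depth_of_field(85, 1.8, 3))    # roughly 2.9m to 3.1m
# ...while a wide lens stopped down keeps far more in focus (deep focus).
print(depth_of_field(24, 11, 3))     # roughly 1.1m to infinity
```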
FRAMING THE SHOT
Composition, the arrangement of people, objects, and settings within the frame of an image, has already come up a few times in previous chapters. That's because how a cinematographer composes the image, how they design each shot, is one of the most important elements in cinematic storytelling. How those people, objects, and settings are arranged within the border of the image can bring balance or imbalance, reveal or hide information, indicate power or weakness, all without a word of dialogue, an edit, or even a character on the screen.
But before a cinematographer can start to think about how to properly compose a shot, they have one more decision to make: the shape of their frame. Okay, every frame (for now) is some variation on a rectangle. But the proportions of that rectangle will dictate how people, objects, and settings are arranged within it. This is known as the aspect ratio, the width of the frame relative to its height. The current standard for motion pictures is 16:9, or 1.78:1, a rectangle that is almost twice as wide as it is tall. But in the early days of cinema, the standard was much closer to a square, 4:3, sometimes called the Academy ratio. And sometimes filmmakers opt for a much wider frame, as wide as 2.35:1. That aspect ratio is a particular favorite of Quentin Tarantino. Whatever aspect ratio a filmmaker chooses will affect the choices they make regarding composition. Check out this quick comparison:
Once a filmmaker has chosen their aspect ratio, the most basic starting point for composition, one we all intuitively understand from our own experience snapping photos with our phones, is balance. Images that are well-balanced use the space within the frame to evenly distribute visual interest, creating a proportional, pleasing composition. (Unless that's not what you're going for, but we'll get to that). One way to achieve that balance is the rule of thirds. The idea is to divide the frame into thirds horizontally and vertically and line up areas of visual interest at the intersections of those lines. Here's an example:
By arranging the actors along the intersection of the grid lines, the composition feels well-balanced and proportional. It has the added benefit of helping to tell the story, where the two characters share the screen as equals.
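Both of these ideas, the aspect ratio and the thirds grid, come down to simple arithmetic on the frame's dimensions. Here is a minimal Python sketch; the 1920x1080 (16:9) frame and the 2.35:1 example are illustrative assumptions, not tied to any particular film discussed here:

```python
def letterbox_height(display_w: int, display_h: int, content_ratio: float) -> tuple[int, int]:
    """Image height and black-bar height when a wider aspect ratio is
    letterboxed inside a narrower display of the given pixel size."""
    image_h = round(display_w / content_ratio)
    bar_h = (display_h - image_h) // 2
    return image_h, bar_h

def rule_of_thirds_points(width: int, height: int) -> list[tuple[int, int]]:
    """The four intersections of the thirds grid for a frame of the given size."""
    xs = (width // 3, 2 * width // 3)
    ys = (height // 3, 2 * height // 3)
    return [(x, y) for x in xs for y in ys]

print(round(16 / 9, 2), round(4 / 3, 2))     # 1.78 and 1.33
print(letterbox_height(1920, 1080, 2.35))    # a 2.35:1 frame is ~817px tall, ~131px bars
print(rule_of_thirds_points(1920, 1080))     # the four "power points" of the grid
```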
Now take a look at another image from the same film:
In this composition, the subjects are still evenly distributed within the frame, but the relative size difference between the characters indicates an unequal power dynamic. Again, helping to tell the story.
The rule of thirds is all about balance and proportion in the composition, to bring a sense of symmetry to the image. Some filmmakers take this notion of symmetry in composition to the extreme. Check out this supercut of Wes Anderson’s apparent obsession with symmetry in his films:
This consistent use of balanced composition is one of the elements that makes a Wes Anderson film a Wes Anderson film. That pattern in his framing is part of his signature mise-en-scène.
But just like three-point lighting, the rule of thirds is really just a starting point for understanding how composition can be used to help tell a cinematic story. Framing the shot is really about directing our attention, showing us where to look in the shot or scene, and ultimately, how to feel about it. There are lots of ways to do this.
Take a look at how Nicolas Winding Refn uses another way to divide up the frame, a quadrant approach, to direct our attention in a given shot or sequence:
Or how Japanese master filmmaker Akira Kurosawa combines framing and movement to constantly redefine relationships and motivations using simple geometry:
Sometimes a filmmaker will direct our attention by framing the subject within another frame in the composition. Check out how Wong Kar-Wai uses this technique in the stunning romance In the Mood for Love (2000):
All of these examples demonstrate how filmmakers use framing to direct our attention and help tell the story. As discussed in Chapter Two, these techniques contribute to our shared cinematic language as filmmakers and viewers. One of the more obvious ways filmmakers employ framing as a form of communication is by using imagery we already intuitively understand from our everyday lives. Take, for example, the apparent proximity of the subject to the camera. As discussed in Chapter Two, a close-up creates a sense of intimacy with the subject, just like it would in real life if we stood within inches of another person (hopefully with their permission, because if not, that's just creepy). If the subject appears far away, as in an extreme long shot, that communicates a sense of disconnection or emotional distance from the subject. In fact, directors and cinematographers have a convenient shorthand for how close or far away the subject should appear, a code for where to place the camera (or what focal length to use). A close-up and an extreme long shot are obvious enough. But there is also the extreme close-up, medium close-up, medium shot, medium long, long, etc. Each term means something specific in terms of composition. A medium-long shot, for example, will typically compose a character from the knees up. A medium shot will be from the waist up. Having a specific term for a specific composition saves time (and money) on the set during production.
Another way filmmakers can communicate through composition using imagery we already intuitively understand is by adjusting the angle of view. If a cinematographer frames the shot below the eyeline of a character – so we are literally looking up to them – that character will feel dominant and powerful. Frame the subject in profile, and the character will feel a bit more mysterious, leaving us wanting to know more about them.
A filmmaker can also “break” the rules of balance and proportion for a desired effect. For example, if a cinematographer intentionally creates an asymmetrical, unbalanced image, it will likewise make the viewer feel uneasy and off balance. Or they can compose the image so the main subject is isolated and small relative to the rest of the frame, creating what is known as negative space. This can help communicate a character’s isolation or powerlessness in a scene.
Want more examples? Check out this video essay on how filmmakers use composition to tell a cinematic story:
MOVING THE CAMERA
Much of the above discussion about composition is as true for still photography and painting as it is for cinematography. But what makes cinema special is, of course, movement, both in terms of how subjects move within the frame – also known as blocking – and how the frame itself moves through a scene. While the blocking of actors in a scene is important, I want to focus on how a cinematographer can move their camera within a single shot to reframe an image and potentially change the meaning of the scene.
There are many different ways a camera can move. Let's take a look at some of the simplest, starting with pans and tilts. A tilt is simply pivoting the camera up or down from a fixed point, usually a tripod. A pan is simply rotating the camera from side to side, also from a fixed point. Here's an example of a pan:
The effect is the same as if you simply turned your head from left to right, keeping your eyes straight ahead. But by moving the frame, the cinematographer is able to radically reorient our point of view while also creating a sense of anticipation as to what will be revealed.
But if you want the camera to actually move through the space, not simply move left to right or up and down, there are a few options. You could just pick it up and move it. That's called, appropriately enough, a handheld shot. But if you want that movement to be more subtle, or at least a lot smoother, you'll want more precise control over how the camera moves. One way to achieve that is to put it on wheels. Sometimes, those wheels ride on a track that grips have laid down for a particular shot, and sometimes, they're just well-oiled wheels that will go wherever the grip pushes them. Either way, this is called a dolly shot. Dolly shots come in all sorts of flavors. You can dolly in or dolly out, that is, move toward or away from a stationary subject. Here's an example of a dolly out combined with a tilt:
Or you can set up a tracking shot that tracks along with a subject in motion (and may or may not be on actual tracks). Here's a simple tracking shot of two kids on their bicycles:
In this case, the camera was mounted on the back of a van, tracking in front of the subjects and leading them forward. Notice too, how towards the end of the shot, the camera shifts subtly to reframe the image of just the girl, indicating a subtle shift in emphasis in the story.
You can also put the camera on a crane to achieve a really dramatic shift in the point of view, like this crane shot from High Noon (1952, Fred Zinnemann, dir.):
Notice how effective this shift in perspective is in making the character seem isolated, small, and powerless without even knowing the context or the rest of the story (it’s an amazing film, and you should go watch it right now).
If you want the freedom of physically carrying the camera around through a scene but still want the smooth motion of a dolly, you can use a special rig called a Steadicam. Steadicam is actually a brand name for a camera stabilizer that has become a somewhat generic term (like Kleenex or Xerox… does anyone still say Xerox?). The camera is strapped to the camera operator using a system of counterweights, gimbals, and gyroscopes (it feels like I’m making those words up, but I’m not):
The result is incredibly smooth motion regardless of terrain.
Here’s one of the most famous Steadicam shots in cinema history from Martin Scorsese’s Goodfellas (1990):
Try following those two actors through all of that with a camera on wheels!
Pans, tilts, dollies, cranes, Steadicams: regardless of how a filmmaker moves the camera, one question they must always answer first is, why move the camera at all? That is, is the movement motivated? In the case of Scorsese's Steadicam shot above, we're following the main characters into a nightclub. Motivation enough to move with them. Or that crane shot from High Noon; the move reveals something important about the character. Again, solid motivation. But what happens when a camera move is unmotivated? If the camera moves simply because the filmmaker thinks it "looks cool"? (I'm looking at you, Michael Bay). Most often, an unmotivated camera move that isn't serving the story reminds the viewer they are watching a movie. The move becomes visible instead of invisible, and usually, that's the last thing a filmmaker wants. All of this is supposed to be invisible, remember?
But sometimes, a filmmaker intentionally moves the camera without clear motivation to achieve a certain effect. For example, a tracking shot can move laterally through a scene with or without subjects in motion. When there is no subject to follow, there is no obvious reason to move the camera, so the movement can feel unmotivated and, therefore, more noticeable to the viewer. So why do it? Here's a deep dive into how effective a lateral tracking shot can be:
Maybe the best example of a really effective but completely unmotivated camera movement is one of filmmaker Spike Lee’s signature camera moves: The Spike Lee Dolly. At least once in every film, Spike Lee will put one or more characters on the same dolly as the camera and move them both through the scene. It’s disorienting and a little bizarre, but it creates a fascinating image that can draw the viewer into the psychology of the character:
Well-planned and thoughtful camera movement, usually the motivated kind, can not only help tell the story, but it can also radically transform our relationship to the story. It doesn’t always have to be flashy. It could just be a subtle shift in perspective. A slight pan or a minute push in on a dolly. But it can change everything:
THE LONG TAKE
The last point I’d like to make regarding cinematography is how really great cinematographers can combine all of the above into one continuous bravura shot that manages to move the story forward without a single edit. Don’t get me wrong, editing is important, and we’ll get to that next. But sometimes, a filmmaker finds a way to move through a scene, choreographing the actors and the camera department in such a way that the story unfolds in one long, continuous take. And it can be breathtaking.
In fact, the shot above from Goodfellas is a pretty good example. Notice how Scorsese moves the camera through several different settings without ever needing to cut away from the shot. But the most famous long take is probably Orson Welles’s opening shot from Touch of Evil (1958). Seriously, check this out:
Imagine the planning required to choreograph that sequence. Everything had to work like clockwork (pun intended). And yet, nothing was sacrificed in terms of cinematic storytelling. Welles is able to move in and out of close-ups, medium shots and long shots, overhead crane shots and smooth tracking shots, directing our attention, revealing information and creating suspense. All without a single cut.
Now check out how filmmakers like Sam Mendes are still imitating that iconic shot in films like Spectre (2015):
Sometimes these long takes are much less noticeable. Take a look at how a filmmaker like Steven Spielberg, not necessarily known for bravura camera moves, still finds ways to use the occasional long take to serve the story:
Video and Image Attributions:
The Filmmaker’s View: Rachel Morrison – DP is the best job on set, we all know that by ARRIChannel. Standard YouTube License.
So You Don’t Want to Watch a Black & White Movie? by RocketJump Film School. Standard YouTube License.
Motivated Practical Lighting by Amin Suwedi. Standard YouTube License.
Lighting 101: Quality of Light by RocketJump Film School. Standard YouTube License.
Frameforest Filmschool: 3 point lighting by frameforest. Standard YouTube License.
The History and Science of Color Temperature by Filmmaker IQ. Standard YouTube License.
Roger Deakins: Making Beautiful Images by James Hayes. Standard YouTube License.
Cinematographer Explains 3 Different Camera Lenses by Vanity Fair. Standard YouTube License.
Understanding Focal Length by Canon New Zealand. Standard YouTube License.
The Art of the Focus Pull by Fandor. Standard YouTube License.
Wes Anderson // Centered by kogonada. Standard Vimeo License.
Drive (2011) – The Quadrant System by Every Frame a Painting. Standard YouTube License.
The Bad Sleep Well (1960) – The Geometry of a Scene by Every Frame a Painting. Standard YouTube License.
In The Mood For Love: Frames Within Frames by Nerdwriter1. Standard YouTube License.
Composition In Storytelling | CRISWELL | Cinema Cartography by Criswell. Standard YouTube License.
ANIMAL Clip – Pan by Russell Sharman. Standard YouTube License.
ANIMAL Clip – Dolly Out by Russell Sharman. Standard YouTube License.
ANIMAL Clip – Tracking by Russell Sharman. Standard YouTube License.
High Noon Crane Shot by C.P. Crouch. Standard YouTube License.
Steadicam and operator in front of crowd. Public domain image.
Goodfellas – Steadicam Shot by 805Bruin. Standard YouTube License.
Wolf Children (2012) – The Lateral Tracking Shot by Every Frame a Painting. Standard YouTube License.
Spike Lee – The Dolly Shot by Richard Cruz. Standard YouTube License.
5 Brilliant Moments of Camera Movement by CineFix. Standard YouTube License.
Touch of Evil (1958) — The Opening Sequence (Welles’ original) by Fix Me A Scene. Standard YouTube License.
Spectre- Opening Tracking Shot in 1080p by Movie Maker. Standard YouTube License.
The Spielberg Oner by Every Frame a Painting. Standard YouTube License.
[1] Okay, so it's a little more complicated than that. Technically, focal length is measured from the point where the light converges in the middle of the glass elements, known as the optical center, before it is refracted back out toward the aperture and sensor. Feel better?
Explorative Assignment: Pastiche Kill Bill Vol. 1, First Love, or Samurai Champloo
For our section on Cinematography, we will focus on either Tarantino's Kill Bill Vol. 1 or Takashi Miike's First Love. Both directors are known for their unique cinematic style, and with both directors, it is usually clear within minutes that we are watching one of their films.
Purpose:
Quentin Tarantino's films are marked by the deliberate use of pastiche, a technique that melds elements from a variety of traditions and genres. This narrative strategy diverges from linear, historically accurate storytelling, favoring instead a self-referential style that emphasizes the constructed nature of film narratives.
The team over at Icon Collective have this to say:
Tarantino is a part of an artistic movement known as “postmodernism” which is founded on the idea that nothing is new in art. That everything is recycled. A standard example of postmodern philosophy is sampling in hip hop. Taking pieces of other songs and reconfiguring them into something new. Do you think producers who sample have an original style?
In employing pastiche, Tarantino crafts films that are not merely collections of references but films engaged with the medium of cinema. This approach serves both as an homage to and a critique of film history, challenging audiences to recognize the fluid boundary between fiction and reality.
For example:
IndieWire talks about Tarantino's use of extreme close-ups in Kill Bill:
“Kill Bill,” The Eyes
“Kill Bill” is notable for being the first collaboration between Tarantino and Robert Richardson, who would go on to shoot all of the director’s next projects (save for “Death Proof”). By that point in his career, Richardson was already a favorite of Martin Scorsese’s (“Casino” and “Bringing Out the Dead”) and Oliver Stone (“Platoon,” “Natural Born Killers,” and more). Tarantino and Richardson used the Western genre and its love of eyeline closeups as a visual touchstone throughout the two-part samurai film.
As another example, IndieWire also discusses the action choreography:
“Kill Bill,” Blue Silhouette Fight
The famous blue silhouette fight between The Bride and members of the Crazy 88 yakuza is a visual ode to the opening of Nakano Hiroyuki's "Samurai Fiction." Tarantino films the Crazy 88 fight in blue, while Nakano opted for red.
In comparison:
Takashi Miike's approach is a bit more intuitive. In an interview published on RogerEbert.com about the film we are covering in this course, First Love, Miike describes how his influences work at a subconscious level:
My influences happen at more of a subconscious level, I don’t dig too deep into that or analyze it myself. I may say, “That scene in that movie would be perfect for this thing.” I’ll talk to my staff and say, “Oh you have to check out this film, it had this one scene that was really great!” In that sense I’m very straight forward about what my influences are. But more than being influenced by film in a tangible way that filters into my own work, I’m more influenced by things that happen in my daily life. The sunglasses that you’re wearing, somehow those sunglasses lodge in my subconscious and end up influencing me and they’ll end up in my next film and I won’t remember where they came from.
For this scene recreation, we are looking at style rather than a specific scene. Use the idea of postmodern pastiche to demonstrate a critical understanding of the cinematography you are emulating, noting the specific style, choices, and tone of the director and cinematographer you have chosen to analyze.
Criteria:
Option One: Quentin Tarantino's Kill Bill Vol. 1
- Recreate a scene or sequence that embodies Tarantino's use of visual homage, creative framing, vibrant color palettes, and dynamic editing. Consider how music plays a role in enhancing the narrative and emotional weight of the scene.
Option Two: Takashi Miike's First Love
- Craft a scene or sequence that reflects Miike's intuitive filmmaking style, highlighting the influence of everyday observations and subconscious inspirations. Focus on how these elements are integrated into the film's narrative and visual composition.
Turn-in Methods for Scene Creation:
- Script: Write a script that captures the dialogue, action, and detailed camera work, emphasizing the stylistic choices of your chosen director.
- Storyboard: Develop a storyboard that illustrates the scene's composition, including shot types, angles, and transitions.
- Shot List and Schedule: Compile a shot list and production schedule that outlines the logistical aspects of recreating your chosen scene.
- Animatic: Create an animatic that presents a preliminary visual representation of the scene, showcasing the flow and pacing.
- Footage: Produce raw footage that closely replicates the style of the chosen scene, focusing on cinematography and performance techniques.
Required Pastiche Analysis:
Independently of the scene creation, each student must submit a 200-300 word analysis exploring the use of pastiche in their selected director's film. This analysis should address:
- Definition and Application: Explain the concept of pastiche and how your chosen director employs it in their cinematic style.
- Influence and Homage: Discuss the various genres, traditions, and previous works of cinema that influence the director's approach, including how these elements are reconfigured to create something new.
- Critical Impact: Analyze how pastiche serves as both an homage to and a critique of film history and its effect on the audience's understanding of the boundary between fiction and reality.
Option Three:
Purpose: Work as a team on your group project, using the sites from this module's excursions to create your final project. You may complete any of the following portions of the final assignment:
Turn-in Methods:
- Storyboard: A storyboard that visually maps out each shot, tailored to the locations available.
- Shot List and Schedule: A comprehensive shot list and schedule that organizes shoots, equipment, and actor availability.
- Footage: Raw footage of the recreated scenes, demonstrating the application of cinematography techniques.
Please note that each of these items needs to be completed.
Task:
Complete the journal described above, following one of the prompts (you do not need to answer every question, only those relevant to your overall point).
You may submit your response in either a written, oral, or video format uploaded into the Google Classroom.
Week Two, Module Two - Editing and Animation
They say a film is made three times. The first is by the screenwriter. The second is by the director and crew. The third is by the editor in post-production.
I don’t know who “they” are, but I think they’re onto something.
When the screenwriter hands the script off to the director, it is no longer a literary document; it's a blueprint for a much larger, more complex creation. The production process is essentially an act of translation, taking all of those words on the page and turning them into shots, scenes, and sequences. At the end of that process, the director hands off a mountain of film and/or data, hours of images, to the editor for them to sift through, select, arrange, and assemble into a coherent story. That, too, is essentially an act of translation.
The amount of film or data can vary. During the Golden Age of Hollywood last century, most feature films shot about 10 times more film than they needed, otherwise known as a shooting ratio of 10:1. That includes all of the re-takes, spoiled shots, multiple angles on the same scene, subtle variations in performance for each shot, and even whole scenes that will never end up in the finished film. And the editors had to look at all of it, sorting through 10 hours of footage[1] for every hour of film in the final cut.
They didn’t know it then, but they were lucky.
With the rise of digital cinema, that ratio has exploded. Today, it is relatively common for a film to have 50 or 100 times more footage than will appear in the final cut. The filmmakers behind Deadpool (2016), for example, shot 555 hours of raw footage for a final film of just 108 minutes. That’s a shooting ratio of 308:1. It would take 40 hours a week for 14 weeks just to watch all of the raw footage, much less select and arrange it all into an edited film![2]
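The shooting-ratio arithmetic is easy to check. Here is a quick Python sketch using the Deadpool figures quoted above (with a 40-hour viewing week as the stated assumption):

```python
raw_hours = 555        # reported hours of raw footage
final_minutes = 108    # final running time

shooting_ratio = (raw_hours * 60) / final_minutes
weeks_to_watch = raw_hours / 40   # assuming 40 viewing hours per week

print(f"Shooting ratio: {shooting_ratio:.0f}:1")            # roughly 308:1
print(f"Weeks just to watch it all: {weeks_to_watch:.1f}")  # about 14 weeks
```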
So, one of the primary roles of the editor is to simply manage this tidal wave of moving images in post-production. But they do much more than that. And their work is rarely limited to just post-production. Many editors are involved in pre-production, helping to plan the shots with the end product in mind, and many more are on set during production to ensure the director and crew are getting all of the footage they need to knit the story together visually.
But, of course, it’s in the editing room, after all the cameras have stopped rolling, that editors begin their true work. And yes, that work involves selecting what shots to use and how to use them, but more importantly, editing is where the grammar and syntax of cinematic language really come together. Just as linguistic meaning is built up from a set sequence of words, phrases, and sentences, cinematic meaning is built up from a sequence of shots and scenes. A word (or a shot) in isolation may have a certain semantic content, but it is the juxtaposition of that word (or shot) in a sentence (or scene) that gives it its full power to communicate. As such, editing is fundamental to how cinema communicates with an audience. And just as it is with any other language, much of its power comes from the fact that we rarely notice how it works; the mechanism is second nature, intuitive, and invisible.
But before we get to the nuts and bolts of how editors put together cinema, let's look at how the art of editing has evolved over the past century. To do that, we have to go back to the beginning. And we have to go to Russia.
SOVIET MONTAGE AND THE KULESHOV EFFECT
As you may recall, the earliest motion pictures were often single-take actualités, unedited views of a man sneezing, workers leaving a factory, or a train pulling into a station. It took a few years for filmmakers to understand the storytelling power of the medium, to realize there was such a thing as cinematic language. Filmmakers like Georges Melies seemed to catch on quickly, not only using mise-en-scène and in-camera special effects but also employing the edit, the joining together of discrete shots in a sequence to tell a story. However, it was the Russians, in this early period, who focused specifically on editing as the essence of cinema. And one Russian in particular: Lev Kuleshov.
Lev Kuleshov was an art school dropout living in Moscow when he directed his first film in 1917. He was only 18 years old. By the time he was 20, he had helped found one of the first film schools in the world in Moscow. He was keenly interested in film theory, more specifically, film editing and how it worked on an audience. He had a hunch that the power of cinema was not found in any one shot but in the juxtaposition of shots. So, he performed an experiment. He cut together a short film and showed it to audiences in 1918. Here’s the film:
After viewing the film, the audience raved about the actor and his performance (he was a very famous actor in Russia at the time). They praised the subtlety with which he expressed his aching hunger upon viewing the soup, the mournful sadness upon seeing the child in a coffin, and the longing desire upon seeing the scantily clad woman. The only problem? It was the exact same shot of the actor every time! The audience was projecting their own emotion and meaning onto the actor's expression because of the juxtaposition of the other images. This phenomenon – how we derive more meaning from the juxtaposition of two shots than from any single shot in isolation – became known as The Kuleshov Effect.
Other Russian filmmakers took up this fascination with how editing works on an audience, both emotionally and psychologically, and developed an approach to filmmaking known as the Soviet Montage Movement. Montage is simply the French term for “assembly” or “editing” (even the Russians had to borrow words from the French!), but Russian filmmakers of the 1920s were pushing the boundaries of what was possible, testing the limits of the Kuleshov Effect. And in the process, they were accelerating the evolution of cinematic language, bringing a sophisticated complexity to how cinema communicates meaning.
The most famous of these early proponents of the Soviet Montage Movement was Sergei Eisenstein. Once a student of Kuleshov's (though actually a year older), Eisenstein would become one of the most prolific members of the movement. Perhaps his most well-known film, Battleship Potemkin (1925), contains a sequence that has become one of the most famous examples of Soviet montage and, frankly, one of the most famous sequences in cinema, period. It's known as The Odessa Steps Sequence. You may remember it from Chapter One. Let's take another look:
One thing you might notice about that sequence is that it doesn’t make a whole lot of sense, at least in terms of a logical narrative. However, Eisenstein was more interested in creating an emotional effect. And he does it by juxtaposing images of violence with images of innocence, repeating images and shots, lingering on some images, and flashing on others. He wants you to feel the terror of those peasants being massacred by the troops, even if you don’t completely understand the geography or linear sequence of events. That’s the power of the montage as Eisenstein used it: A collage of moving images designed to create an emotional effect rather than a logical narrative sequence.
EDITING SPACE AND TIME
In the hundred or so years since Kuleshov and Eisenstein, we’ve learned a lot about how editing works, both as filmmakers and as audience members. In fact, we know it so well we hardly have to give it much thought. We’ve fully accepted the idea that cinema uses editing to not only manipulate our emotions through techniques like the Kuleshov Effect but also to manipulate space and time itself. When a film or TV episode cuts from one location to another, we rarely wonder whether the characters on screen teleported or otherwise broke the laws of physics (unless, of course, it’s a film about wizards). We intuitively understand that edits allow the camera – and, by implication, the viewer – to jump across space and across time to keep the story moving at a steady clip.
The most obvious example of this is the ellipsis, an edit that slices out time or events we don’t need to see to follow the story. Imagine a scene where a car pulls up in front of a house and then cuts to a woman at the door ringing the doorbell. We don’t need to spend the screen time watching her shut off the car, climb out, shut and lock the door, and walk all the way up to the house. The cut is an ellipsis, and none of us will wonder if she somehow teleported from her car to the front door (unless, again, she’s a wizard). And if you think about it for a moment, you’ll realize ellipses are crucial to telling a story cinematically. If we had to show every moment in every character’s experience, films would take years or even decades to make, much less watch!
Other ways cinema manipulates time include sequences like flashbacks and flashforwards. Filmmakers use these when they want to show events from a character's past or foreshadow what's coming in the future. They're also a great indicator of how far cinematic language has evolved over time. Back in the Golden Age of Hollywood, when editors were first experimenting with techniques like flashbacks, they needed ways to signal to the audience, "Hey, we're about to go back in time!" They would employ music – usually harp music (I'm not sure why, but it was a thing) – and visual cues like blurred focus or warped images to indicate a flashback. As audiences became more fluent in this new addition to cinematic language, they didn't need the visual cues anymore. Today, movies often move backward and forward in time, trusting the audience to "read" the scene in its proper context without any prompts. Think of films like Quentin Tarantino's Pulp Fiction (1994), which plays with time throughout, rearranging the sequence of events in the plot for dramatic effect and forcing the viewer to keep up. Or a more recent film like Greta Gerwig's adaptation of Little Women (2019), which also moves backward and forward in time, hinting at the shift through mise-en-scène and subtle changes in performance.
Another more subtle way editing manipulates time is in the overall rhythm of the cinematic experience. And no, I don't mean the music, though that can help. I mean the pace of the finished film, how the edits speed up or slow down to serve the story, producing a rhythm to the edit.
Take the work of Kelly Reichardt, for example. As both director and editor on almost all of her films, she creates a specific rhythm that echoes the time and space of her characters:
Sometimes, an editor lets each shot play out, giving plenty of space between the cuts, creating a slow, even rhythm to a scene. Or they might cut from image to image quickly, letting each flash across the screen for mere moments, creating a fast-paced, edge-of-your-seat rhythm. In either case, the editor has to consider how long we need to see each shot. In fact, there's a term for how long it takes us to register visual information: the content curve. A relatively simple shot of a child's smile might have a very short content curve. A more complex shot with multiple planes of view and maybe even text to read would have a much longer content curve. Editing is all about balancing the content curve with the needs of the story and the intent of the director for the overall rhythm of each scene and the finished film as a whole.
This is why editing is much more than simply assembling the shots. It is an art that requires an intuitive sense of how a scene, sequence, and finished film should move and how it should feel. Most editors describe their process as both technical and intuitive, requiring thinking and feeling:
CONTINUITY EDITING
Maybe it’s obvious, but if editing is where the grammar and syntax of cinematic language come together, then the whole point is to make whatever we see on screen make as much sense as possible. Just like a writer wants to draw the reader into the story, not remind them they’re reading a book, an editor’s job, first and foremost, is to draw the viewer into the cinematic experience, not remind them they’re watching a movie. (Unless that’s exactly what the filmmaker wants to do, but more on that later.) The last thing most editors want to do is draw attention to the editing itself. We call this approach to editing continuity editing, or more to the point, invisible editing.
Continuity editing aims to create a continuous flow of images and sound, a linear, logical progression, shot to shot and scene to scene, constantly orienting the viewer in space and time and carrying them through the narrative. All without ever making any of that obvious or obtrusive. It involves a number of different techniques, from cutting-on-action to match cuts and transitions, and from maintaining screen direction to the master shot and coverage technique and the 180-degree rule. Let's take a look at these and other tricks editors use to hide their handiwork.
Cutting on Action
The first problem an editor faces is how and when to cut from one shot to the next without disorienting the viewer or breaking continuity, that is, the continuous flow of the narrative. Back in Chapter Two, I discussed how one of the most common techniques is to "hide" the cut in the middle of some on-screen action. Called, appropriately enough, cutting-on-action, the trick is to end one shot in the middle of an action – a character sitting down in a chair or climbing into a car – and start the next in the middle of the same action. Our eyes are drawn to the action on screen and not the cut itself. The edit disappears as we track the movement of the character. Here's a quick example:
The two shots are radically different in terms of the geography of the scene – one outside of the truck, the other inside – but by cutting on the action of the character entering the truck, it feels like one continuous moment. Of course, we notice the cut, but it does not distract us from the scene or call attention to itself.
And now that you know what to look for, you'll see this technique used in just about every film or TV show, over and over, all the time.
Match Cuts
Cutting-on-action is arguably the most common continuity editing trick, but there are plenty of other cuts that use the technique of matching some visual element between two contiguous shots, also known as a match cut. There are eyeline match cuts that cut from a shot of a character looking off camera to a shot of whatever it is they are looking at, graphic match cuts that cut between two images that look similar (the barrel of a gun to James Bond in an underground tunnel, for example), and even subject match cuts that cut between two similar ideas or concepts (a flame from a matchstick to the sun rising over the desert in David Lean’s Lawrence of Arabia (1962)).
Almost all of these examples rely on a hard cut from one shot to the next, but sometimes an editor simply can't hide the edit with some matching action, image or idea. Instead, they have to transition the viewer from one shot to the next, or one scene to the next, in the most organic, unobtrusive way possible. We call these, well, transitions. As discussed in Chapter Two, you can think of these as conjunctions in grammar, words meant to connect ideas seamlessly. The more obvious examples, like fade-ins and fade-outs or long dissolves, are drawn from our own experience. A slow fade-out, where the screen drifts into blackness, reflects our experience of falling asleep and drifting out of consciousness. And dissolves, where one shot blends into the next, reflect how one moment bleeds into and overlaps with another in our memory. But some transitions, like wipes and iris outs, are peculiar to motion pictures and have no relation to how we normally see the world. Sure, they might "call attention to themselves," but somehow, they still do the trick, moving the viewer from one shot or scene to the next without distracting from the story itself.
Wondering what some of these match cuts and transitions look like? Check out several examples of each (along with some not-so-invisible edits like jump cuts) here:
Screen Direction
Maintaining consistent screen direction is another technique editors use to keep us focused on the story and keep their work invisible. Take a look at this scene from Casablanca:
We are entering the main setting for the film, a crowded, somewhat chaotic tavern in Morocco. Notice how the camera moves consistently from right to left and that the blocking of the actors (that is, how they move in the frame) is also predominantly from right to left until we settle on the piano player, Sam. The flow of images introduces the tavern as if the viewer were entering as a patron for the first time. This consistent screen direction helps establish the geography of the scene, orienting the viewer to the physical space. An editor concerned about continuity never wants the audience to ask, “Where are we?” or “What’s going on?” And obviously, this isn’t something an editor can do after the fact all by themselves. It requires a plan from the beginning, with the director, the cinematographer, the production designer, and the editor all working together to ensure they have the moving images they need to execute the scene.
Some filmmakers take this commitment to consistent screen direction to the extreme to serve the narrative and emphasize a theme. Check out this analysis of Bong Joon-ho's Snowpiercer (2013):
Master Shot and Coverage
Consistent screen direction is an important part of how continuity editing ensures the audience is always aware of where everyone is located in relation to the setting and each other. Another common technique to achieve the same goal is to approach each scene with a master shot and coverage.
The idea is fairly simple. On set during production, the filmmaker films a scene from one, wide master shot that includes all of the actors and action in one frame from start to finish. Then, they film coverage, that is, they “cover” that same scene from multiple angles, isolating characters, moving in closer, and almost always filming the entire scene again from start to finish with each new set-up. When they’re done, they have filmed the entire scene many, many times from many different perspectives.
And that’s where the editor comes in.
It’s the editor’s job to build the scene from that raw material, usually starting with the master shot to establish the geography of the scene, then cutting to the coverage as the scene plays out, using the best takes and angles to express the thematic intent. They can stay on each character for their lines of dialogue or cut to another character for a reaction. They can also cut back to the master shot whenever they choose to re-establish the geography or re-set the tone of the scene. But maybe most importantly, by having so many options, the editor can cut around poor performances or condense the scene by dropping lines of dialogue between edits. Done well, the viewer is drawn into the interaction of the characters, never stopping to ask where they are or who is talking to whom, and hopefully never even noticing a cut.
Let’s take a look at a scene from Damien Chazelle’s Whiplash (2014), shot and edited in the classic master shot and coverage technique:
The scene opens with a master shot. We see both characters, Andrew and Nicole, in the same frame, sitting at a table in a café. The next shot is from the coverage, over Nicole's shoulder, on Andrew as he reacts to her first line of dialogue. Then we're on Nicole, over Andrew's shoulder, as she reacts to his line. The editor, Tom Cross, moves back and forth between these two shots until Andrew asks a question tied to the film's main theme, "What do you do?" Then he switches to close-up coverage of the two characters. Tension builds until there is a subtle clash between them, a moment of conflict. And what does the editor do? He cuts back to the master shot, resetting the scene emotionally and reorienting the viewer to the space. The two characters begin to reconnect, and the editor returns to the coverage, again shifting to close-ups until the two find a point of connection (symbolized by an insert shot of their shoes gently touching). The rhythm of this scene is built from the raw materials, the master shot and the coverage, that the editor has to work with. But more than just presenting the scene as written, the editor has the power to shape the storytelling by choosing when to cut and which shots to use.
The master shot and coverage technique gives the editor an incredible amount of freedom to shape a scene, but there is one thing they can't do. A rule they must follow. And I don't mean one of those artistic rules that are meant to be broken. Break this rule, and it will break the continuity of any scene. It's called the 180-degree rule, and it's related to the master shot and coverage technique.
Basically, the 180-degree rule defines an axis of action, an imaginary line that runs through the characters in a scene that the camera cannot cross:
Once the master shot establishes which side of the action the camera will capture, the coverage must stay on that side throughout the scene. The camera can rotate 180 degrees around its subject, but if it crosses that imaginary line and inches past 180 degrees, the subjects in the frame will reverse positions and will no longer be looking at each other from shot to shot. Take a look at that scene from Whiplash again. Notice how the master shot establishes the camera on Andrew's left and Nicole's right. Every subsequent angle of coverage stays on that side of the table, with Andrew always looking from frame right toward the left, and Nicole always looking from frame left toward the right. If the camera were to jump the line, Andrew would appear to be looking in the opposite direction, confusing the viewer and breaking continuity.
Now, I know I just wrote that this is not one of those artistic rules that was meant to be broken. But the fact is, editors can break the rule if they actually want to disorient the viewer, to put them into the psychology of a character or scene. Or if they need to jump the line to keep the narrative going, they can use a new master shot to reorient the axis of action.
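If it helps to think about the rule geometrically, the axis of action is just a line through the two subjects, and every camera position falls on one side of it or the other. Here is a minimal Python sketch that uses the sign of a cross product to flag a setup that crosses the line; the coordinates are purely illustrative and not taken from any real blocking diagram:

```python
def side_of_axis(char_a: tuple[float, float], char_b: tuple[float, float],
                 camera: tuple[float, float]) -> float:
    """Signed cross product of the vectors A->B and A->camera.

    Camera positions with the same sign sit on the same side of the axis
    of action; a sign flip means the camera has crossed the line.
    """
    ax, ay = char_a
    bx, by = char_b
    cx, cy = camera
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

# Two characters face each other across a table; the master shot defines
# which side of the axis the coverage must stay on.
subject_a, subject_b = (0.0, 0.0), (2.0, 0.0)
master_cam = (1.0, -1.5)
master_side = side_of_axis(subject_a, subject_b, master_cam)

for cam in [(0.5, -2.0), (1.0, 1.5)]:
    same = side_of_axis(subject_a, subject_b, cam) * master_side > 0
    print(cam, "same side as the master" if same else "jumps the line")
```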
Parallel Editing
All of these techniques (cutting-on-action, match cuts, transitions, consistent screen direction, and the master shot and coverage technique) are ways that editors can keep their craft invisible and maintain continuity. But what does an editor do when there is more than one narrative playing out at the same time? How do you show both and maintain continuity? One solution is to use cross-cutting, cutting back and forth between two or more narratives, also known as parallel editing.
Parallel editing has actually been around for quite some time. Perhaps one of the most famous early examples is from D. W. Griffith’s Way Down East (1920). Kuleshov had already demonstrated the power of juxtaposing shots to create an emotional effect. But Griffith, among others, showed that you could also create a sense of thrilling anxiety by juxtaposing two or more lines of action, cross-cutting from one to another in a rhythmic pattern. In a climactic scene from the film, a man races to save a woman adrift on a frozen river and heading straight for a dangerous waterfall. To establish these lines of action and to increase our own sense of dread and anxiety, the editor cuts from the man to the woman to the waterfall in a regular, rhythmic pattern, cross-cutting between them to constantly remind the audience of the impending doom as we cheer on our hero until the lines of action finally converge. Here’s the scene:
By cross-cutting in a regular pattern – man, woman, man, waterfall, woman, man, woman, waterfall – the audience is not only drawn into the action, they are also no longer paying attention to the editing itself, thus maintaining continuity.
This technique has become so common, so integral to our shared cinematic language, that editors can use our fluency against us, subverting expectations by playing with the form. Check out this (rather disturbing) clip from Jonathan Demme’s The Silence of the Lambs (1991):
The scene uses the same parallel editing technique as Way Down East, using cross-cutting to increase our anxiety as two lines of action converge. But in this case, the editor subverts our expectations by revealing there were actually three lines of action, not two. However, the trick only works if parallel action is already part of our cinematic language.
What is animation?
Animation is the illusion of movement created by a series of sequential images that are displayed at a rapid rate. We are familiar with animation in film or television, yet animation can also be created with other devices, such as flipbooks and optical toys like the zoetrope. In film animation, frame rate refers to how many frames are projected per second. Frame rate is key to animation; if the frame rate is too slow, the illusion of movement is destroyed.
2D Animation
Stop Motion Animation
Several standard techniques have been used to create animation since the origin of cinema. In 2D animation, sequential drawings are created and photographed to be played back at a specific frame rate. Stop motion has also been used since the early days of cinema. Objects are moved or adjusted a small amount, and each adjustment is photographed. More recent methods include computer-generated imagery (CGI), which uses computer hardware and software to create 3D animation and visual effects. In this class, we will examine animation's origins and study how animation production models and styles have evolved worldwide using these techniques.
Computer Animation (CGI)
What creates this illusion of motion that we see demonstrated in animation? English-Swiss physician Peter Mark Roget first named a theory of perception called persistence of vision. He described it as a phenomenon in which an object that was moving at a particular speed would appear to be static. The term later became identified with a theory put forward by Joseph Plateau, the inventor of the optical toy the phenakistiscope, that successive images stayed on the retina of the eye, combining and creating the illusion of motion. This theory was widely accepted into the 20th century, until psychologist Max Wertheimer conducted experiments that led him to believe that the brain, not merely the retina, was involved in processing the information in this phenomenon. In 1915, Hugo Munsterberg postulated that the apparent motion we perceive involves the brain. Subsequent research has shown that the properties of vision, such as color, motion, and depth, are transmitted to the brain from the retina and are joined together in the visual cortex.
The film below explains the theory of how we perceive motion in a set of sequential images and how it has evolved.
Persistence of Vision
Persistence of vision is the optical phenomenon where the illusion of motion is created because the brain interprets multiple still images as one. When multiple images appear in fast enough succession, the brain blends them into a single, persistent, moving image.
The human eye and brain can only process about 12 separate images per second, retaining an image for 1/16 of a second. If the image is replaced by a subsequent image within that 1/16 of a second, an illusion of continuity is created.
(from Maia, Alyssa, “What is Persistence of Vision? Definition of an Optical Phenomenon” StudioBinder.com May 11, 2020 https://www.studiobinder.com/blog/what-is-persistence-of-vision-definition/)
Frame Rate
It’s important to remember that frame rate is based on the properties of human vision, that is, how our brain processes the information our eye perceives. At a frame rate of one drawing per second, you perceive each drawing as a completely separate entity. As you increase the frame rate, you begin to see a choppy illusion of movement. At around 10-12 frames per second, the illusion is consistent, though it becomes much smoother if you keep increasing the rate. Film animation is traditionally 24 frames per second.
Frame rate (expressed in frames per second or FPS) is the frequency (rate) at which consecutive images called frames appear on a display. The term applies equally to film and video cameras, computer graphics, and motion capture systems. The frame rate may also be called the frame frequency and can be expressed in hertz.
The temporal sensitivity and resolution of human vision varies depending on the type and characteristics of visual stimulus, and it differs between individuals. The human visual system can process 10 to 12 images per second and perceive them individually, while higher rates are perceived as motion.
(from “Frame Rate” Wikipedia).
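To make the numbers above concrete, here is a minimal Python sketch relating frame rate to frame duration. The 1/16-second retention figure and the 10-12 fps threshold come from the passages above; the descriptive labels are only illustrative, not a vision-science model.

```python
# Illustrative sketch only: relates frame rate to the perception figures quoted above.
RETENTION_SECONDS = 1 / 16        # approximate time an image persists for the viewer
FUSION_THRESHOLD_FPS = 12         # below roughly 10-12 fps, images read as separate

def describe_frame_rate(fps: float) -> str:
    """Return a rough, illustrative description of how a frame rate is perceived."""
    frame_interval = 1 / fps      # seconds each frame remains on screen
    if fps < FUSION_THRESHOLD_FPS:
        quality = "separate images / choppy motion"
    elif frame_interval <= RETENTION_SECONDS:
        quality = "smooth, continuous motion"
    else:
        quality = "marginal illusion of motion"
    return f"{fps:>5} fps -> {frame_interval * 1000:6.1f} ms per frame: {quality}"

for rate in (1, 12, 24, 60):      # 24 fps is the traditional rate for film animation
    print(describe_frame_rate(rate))
```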
Japanese Animation
Hayao Miyazaki's contribution to animation is unparalleled. His films blend intricate detail, profound narratives, and a distinctive approach to sound and visuals. His works, including Princess Mononoke, showcase an adeptness at exploring complex themes through meticulously crafted worlds. His animation style is characterized by painstaking detail, vibrant landscapes, and a fluidity of movement that breathes life into every frame. Through his work, Miyazaki invites viewers into worlds imbued with wonder, challenge, and a deep respect for the environment, underpinned by a distinctive narrative depth that appeals to children and adults alike.
Miyazaki's films are not just visual spectacles but symphonies of sound and sight. Each film establishes its unique sound identity from the opening frame, crafting a signature aural personality that enhances the storytelling. This attention to sound design is crucial, as it fills the spaces his characters inhabit, defining those spaces with layers of meaning and emotion. My Neighbor Totoro exemplifies Miyazaki's serene and whimsical approach, where the natural sounds of lush forests take precedence, inviting the audience into a tranquil world brimming with the magic of nature. In contrast, Princess Mononoke presents a starkly different soundscape that uses nature as a backdrop to explore themes of conflict, industrialization, and the struggle between the human and the natural world. It is a film where every sound is charged with intention, from the screeching arrows to the swift sword swipes, creating a dynamic atmosphere that propels the narrative forward.
Princess Mononoke is a seminal work in Miyazaki's career. It is a compelling tale that delves into the complexity of environmentalism, war, and humanity's place within the natural world. However, a lesser-known film, Nausicaä of the Valley of the Wind, stands at the inception of Miyazaki's thematic and stylistic explorations.
Released initially in the U.S. as "Warriors of the Wind," Nausicaä suffered from severe editing that distorted its narrative and themes, leading to confusion and a poor reception outside Japan. This unapproved editing, which saw subplots and characters removed and entirely new dialogue crafted, was a turning point for Miyazaki. It underscored his resolve that future productions remain uncut by U.S. distributors, ensuring the integrity of his work. The uncut version, released in 2005 with a new dub featuring prominent actors, finally allowed U.S. audiences to experience the film as Miyazaki intended, and the anecdote is a microcosm of why control over editing is so essential in the animation process.
Nausicaä embodies Miyazaki's philosophy on the unique potential of animation. He asserts that unlike live-action films, which are constrained by the realism of their special effects, animation "illustrate[s] a world of lost possibilities." Animation offers a boundless canvas. Considered the foundation of Studio Ghibli, despite its release before the studio's formation, Nausicaä encapsulates the narrative and stylistic ambitions Miyazaki and his co-founders sought to realize; the film is a testament to animation's power to address weighty themes such as environmentalism and humanity's role within nature, without preaching. Its portrayal of three-dimensional characters, notably strong female protagonists and nuanced antagonists, along with a deep reverence for nature, sets a template for Miyazaki's later works.
Nausicaä laid the groundwork for the thematic and stylistic flourishes defining Miyazaki's work, marking a significant moment in his career and the broader animation landscape. It underscored the power of animation to convey complex, mature themes in a manner that transcends age and cultural barriers, setting a high bar for animated storytelling. Princess Mononoke later revisits and expands upon these themes and became the first animated film to win the Japan Academy Prize for Picture of the Year.
Released in the wake of Miyazaki's early works, Akira, another iconic animated film we are studying, propelled the medium into new territories of narrative and technical excellence. While Miyazaki's films like Nausicaä and Princess Mononoke deeply explored the harmony and discord between humanity and nature through a fantastical lens, Akira took a grittier approach to examining society, technology, and human potential. Set against the backdrop of a dystopian future, Akira delves into themes of power, identity, and the consequences of unchecked technological advancement, distinguishing itself with a bold narrative and stylistic audacity that challenges and expands the global perception of what anime could be.
Akira emerged as a groundbreaking masterpiece when anime was scarcely known outside Japan, setting the stage for the genre's mainstream acceptance in the West. Its success demonstrated anime's universal appeal, breaking cultural barriers with its unique style, complex storytelling, and mature themes. The film distinguished itself from Western animations through its artistic and narrative innovation, offering a thought-provoking narrative on dystopian futures and human complexities. This contrast challenged prevailing norms and redefined animation's potential audience and scope.
The film capitalized on Japan's economic boom in the 1980s, leveraging an unprecedented budget to achieve technical excellence in animation. The "Akira Committee" pooled resources to meet Ôtomo's ambitious vision, utilizing 24 frames per second for fluid motion and incorporating CGI for artistic effects. Such technical advancements, including using 70mm film and prescoring to match voice performances with animation, set new standards in the industry. Akira's commitment to quality was evident in its detailed scenes and lifelike movement, contributing to its immersive storytelling.
Akira's influence was not limited to technical achievements; it also offered a mature narrative that diverged significantly from the conventional animations of its time. Its focus on realistic character portrayals and deep societal issues connected with a broader, more mature audience. This approach, along with its cinematic quality and thematic depth, cemented Akira's place in animation history as a landmark film. Its success not only introduced Japanese animation to a global audience but also established a new standard for storytelling and visual expression.
Video, Image, and Content Attributions:
Soviet Film – The Kuleshov Effect (original) by Lev Kuleshov 1918 by MediaFilmProfessor. Standard YouTube License.
Battleship Potemkin – Odessa Steps scene (Eisenstein 1925) by Thibault Cabanas. Standard YouTube License.
Kelly Reichardt: “Elaborated Time” by Lux. Standard YouTube License.
How Does an Editor Think and Feel? by Every Frame a Painting. Standard YouTube License.
DAY 177 Clip – Cut on Action by Russell Sharman. Standard YouTube License.
Cuts & Transitions 101 by RocketJump Film School. Standard YouTube License.
Casablanca First Cafe Scene by Leahstanz25. Standard YouTube License.
Snowpiercer – Left or Right by Every Frame a Painting. Standard YouTube License.
Whiplash – Date scene by Jack ss. Standard YouTube License.
Way Down East (1920) D. W. Griffith, dir. – Final Chase Scene by FilmStudies. Standard YouTube License.
Example of Parallel Editing in “The Silence of the Lambs” (1991) by Gabriel Moura. Standard YouTube License.
"What is Animation?" adapted from World History of Animation by BMCC faculty Anna Pinkas and Jody Culkin, as part of the BMCC Open Education Initiative via CC BY-NC.
StopBranko at English Wikipedia, Public domain, via Wikimedia Commons
Subhashish Panigrahi, CC BY-SA 4.0 <https://creativecommons.org/licenses/by-sa/4.0>, via Wikimedia Commons
MichaelFrey, CC BY-SA 3.0 <https://creativecommons.org/licenses/by-sa/3.0>, via Wikimedia Commons
How Studio Ghibli Makes Animation Feel Alive by KaptainKristian. Standard YouTube License.
Nausicaä; Nausicaä 004; Nausicaä 36; Nausicaä 009; Public Domain images: Studio Ghibli Works.
Nausicaä and the Rise of Studio Ghibli | The Director Project by Geekritique. Standard YouTube License.
The Impact of Akira: The Film That Changed Everything by Super Eyepatch Wolf. Standard YouTube License.
- Footage is a common way to refer to the recorded moving image, whether it’s on celluloid film or digital media. The term comes from the fact that physical film was measured in feet, with a standard reel of 35mm film measuring 1000 feet (or about 11 minutes at 24 frames per second). The technology has changed, but the terminology has stuck. ↵
- https://vashivisuals.com/shooting-ratios-of-feature-films/ ↵
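As an aside, the arithmetic in the footage footnote above is easy to verify with a few lines of Python. The figure of 16 frames per foot is the standard for 4-perforation 35mm film; everything else comes straight from the footnote.

```python
# Rough check of the footnote's claim: 1000 feet of 35mm film at 24 fps is about 11 minutes.
FRAMES_PER_FOOT = 16   # standard for 4-perforation 35mm film
FPS = 24               # standard projection speed for sound film

reel_feet = 1000
total_frames = reel_feet * FRAMES_PER_FOOT
running_time_minutes = total_frames / FPS / 60
print(f"{reel_feet} ft of 35mm film runs about {running_time_minutes:.1f} minutes")
```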
Film Journal: Your Name, Akira, Tokyo Ghoul
Purpose:
In this journal, we want to demonstrate our overall understanding of the film process, with special emphasis on editing and acting in our final week. We are mostly looking at animation this week, but voice acting is its own art. The focus of this journal is on the films Your Name (Suga Shrine), Akira (Neo-Tokyo), and the anime Tokyo Ghoul, though we only need to analyze one of the three.
Here is the challenge for our journal entries: this is not my textbook; it is our textbook. Is there a section of this module you think could be better, or would you like to add to it? Go for it. That is the point.
Prompts:
Analyzing Animation Editing Techniques
- Select a scene from Your Name, Akira, or Tokyo Ghoul. Count the number of cuts within the scene. Describe the scene in detail, focusing on the editing techniques used. Consider the following:
- Does the editing style mimic traditional filming techniques such as the master shot method or coverage?
- Are techniques like cutting-on-action, match cuts, or any form of transitions utilized to maintain the illusion of motion?
- Is there any use of cross-cutting/parallel editing, discontinuous editing, or does the scene challenge the 180-degree rule?
Given the nature of animation, where movement is an illusion created by sequential images, discuss how these editing choices contribute to the narrative flow and the viewer's perception of motion, especially in light of the frame rate's role in creating a seamless illusion of continuity.
The Art of Voice Acting in Animation
- Reflect on a voice performance from Your Name, Akira, or Tokyo Ghoul that particularly impacted you. Consider whether the acting style is more classical or naturalistic. How does the voice acting enhance the animated characters' emotional depth and narrative presence? Discuss the challenges and artistic considerations involved in bringing animated characters to life through voice.
Pace and Storytelling Through Editing in Animation
- Compare and contrast two scenes based on their editing pace: one with rapid edits and another with longer shots and fewer cuts. These scenes may come from the same animation or different ones. Analyze why the animators and editors might have chosen these specific editing styles. How does the pace affect the atmosphere of the scene and the story conveyed to the audience? Speculate on how swapping the pacing between these scenes would alter their impact and narrative delivery.
Consider how the principles of persistence of vision and frame rate play into these decisions. How does the chosen frame rate for each animation influence the editing style and the overall perception of motion and emotion in these scenes?
Given the digital creation of Your Name and the traditional, hand-drawn techniques of Akira, consider how the medium (digital vs. traditional animation) might influence editing choices, the portrayal of action, and the conveyance of emotion through voice acting. How do these techniques reflect the animation's themes and the director's vision?
Criteria:
All journal entries should use correct MLA formatting, specific diction and terms from the module, and a direct answer to the prompt using specific scenes and examples from the film you are reviewing. We must also provide a works cited page for the film we are reviewing!
Your response should be at least 500 words.
Task:
Complete the aforementioned journal following one of the prompts (noting that it is not imperative that you answer every question unless it is related and relevant to your overall point).
- You may submit your response in either a written, oral, or video format uploaded into the Google Classroom.
Here is an example of a review of the editing in Your Name: Stylizing Repetition: Kimi no Na wa (Your Name)’s Visual Language.
Explorative Assignment: Variations on a Scene
Variations on a Scene
Purpose:
For this assignment, we are looking to recreate one of our favorite scenes from Tokyo in film. For instance, check out this article by Otaku in Tokyo for multiple good, usable examples. We will, in fact, be visiting some of the very same places (Asakusa, Suga Shrine, Asahi Inari Shrine). Please use the local Family Mart to print a small postcard of your favorite scene and attempt to recreate the scene in real life, as in the examples in the aforementioned article, or create a short film wherein you reenact a short scene from your film of choice using the same filming location.
Criteria:
Please focus and explain the elements of your photo or video in cinematic terms using the following parameters:
- Focus: plane of focus, focal length, and depth of field.
- Exposure: over/under/correct; privileging one element in the composition at the expense of another.
- Lighting: high-key (broad, even, bland); low-key (dramatic); and chiaroscuro (artistic, moody).
- (optional) Audio: layered sound effects, background music, voice-over, close-miked vs. shotgun-miked, and silent.
Locations for recreation (plan ahead!):
- Suga Shrine (Your Name) - May 15
- Edo Architectural Museum (any Edo period recreation) - May 15
- Check out the Tokyo Tourist guide to all the Tokyo Ghoul locations.
- The following Wards (from TG and the article above) are based on locations we will or have visited during our course:
- 1st Ward
- 3rd Ward
- 4th Ward
- 6th Ward
- 13th Ward
- Neo Tokyo:
- Shinjuku (the best representation of Neo-Tokyo)
- Tokyu Kabukicho Tower
- The Samurai Restaurant (formerly the Robot Restaurant)
- Tokyo Metropolitan Government Building nightly light show (over by 21:00; viewing in "citizen square")
- DAWN robot cafe.
Turn-in Methods:
- This assignment is a bit different: the recreation (modeled on the Otaku article mentioned above) should be uploaded to the proper assignment dropbox.
Option Two:
Purpose: Work as a team on your group project to use the aforementioned sites from the excursions in this module to create your final project. You may work to complete any of the following portions of the final assignment:
Turn-in Methods:
- Storyboard: A storyboard that visually maps out each shot, tailored to the locations available.
- Shot List and Schedule: A comprehensive shot list and schedule organizing shoots, equipment, and actor availability.
- Footage: Raw footage of the recreated scenes, demonstrating the application of cinematography techniques.
Please note that each of these items needs to be completed.
Task:
On one or more of our excursions to explore filming locations or locations recreated in animation, it is your task to work together to recreate iconic scenes, shots, movements, or compositions from film and animation.
You may submit your response in one of the formats mentioned above, uploaded into the Google Classroom.
Explorative Assignment: Choreography and Storytelling
For our analysis focusing on Choreography, Blocking, and Storytelling, we will delve into three pivotal works that have significantly influenced their respective genres: Seven Samurai by Akira Kurosawa, Akira by Katsuhiro Otomo, and Tokyo Ghoul based on the manga series by Sui Ishida.
Purpose:
The assignment focuses on rethinking action sequences in film to highlight their potential for emotional depth and narrative progression. It encourages examining action beyond spectacle, emphasizing character development and plot advancement through choreography and cinematography.
- Analyze how action scenes can deepen narrative and emotional engagement.
- Explore the integration of choreography with storytelling.
- Examine cinematographic techniques for maintaining engagement in complex scenes.
The goal is to challenge traditional views of action cinema, inspiring students to create action sequences that are emotionally resonant and narratively meaningful.
Criteria:
Option One: Seven Samurai Recreate a scene that exemplifies Kurosawa's mastery of group dynamics and individual characterization through choreography and blocking. Focus on how these elements enhance the narrative's exploration of themes such as honor, duty, and the human condition.
Option Two: Akira Craft a scene that captures the essence of the film's cyberpunk setting and complex character relationships. Emphasize the choreography of action sequences and how they mirror the themes of power and self-identity.
Option Three: Tokyo Ghoul Develop a scene that reflects the series' intricate blend of horror and moral ambiguity. Consider how the choreography and character interactions can illustrate the protagonist's turmoil and the dark themes of the narrative.
Turn-in Methods for Scene Creation:
Script: Write a script that encapsulates the dialogue, action, and camera work of your chosen work, with an emphasis on choreographic and blocking decisions.
Storyboard: Produce a storyboard that visually represents the scene's layout, including character positioning, movements, and expressions.
Shot List and Schedule: Assemble a shot list and production schedule that outlines the logistical aspects of recreating the selected scene, focusing particularly on choreography and blocking.
Footage: Produce raw footage that closely mirrors the style and thematic content of the chosen scene, with a focus on cinematography and performance techniques.
Required Analysis:
Alongside the scene creation, each student must submit a 200-300 word analysis on the use of choreography and blocking in their selected work to bolster storytelling. This analysis should include:
Definition and Application: Define choreography and blocking within the contexts of live-action and animation, discussing their application in your chosen film or series.
Narrative Impact: Examine how the movement and arrangement of characters contribute to the thematic and emotional resonance of the work.
Creative Influence: Reflect on the director's or animator's stylistic choices regarding choreography and blocking, and their effect on the audience's perception and engagement with the story.
Option Four:
Purpose: Work as a team on your group project to use the aforementioned sites from the excursions in this module to create your final project. You may work to complete any of the following portions of the final assignment:
Turn-in Methods:
- Storyboard: A storyboard that visually maps out each shot, tailored to the locations available.
- Shot List and Schedule: A comprehensive shot list and schedule organizing shoots, equipment, and actor availability.
- Footage: Raw footage of the recreated scenes, demonstrating the application of cinematography techniques.
Please note that each of these items needs to be completed.
Task:
Complete the specified journal following one of the prompts (noting that answering every question is not mandatory unless it is relevant to your overall point).
You may submit your response in either written, oral, or video format uploaded into the Google Classroom.
Week Two, Module Two - Sound Design
Just listen for a moment.
What do you hear?
Maybe you’re in a coffee shop, surrounded by the bustle of other customers, the busywork of baristas, and the sound of the city just outside. Maybe you’re in your room, a dog barking in the distance outside, cars passing, music playing in the background, maybe even the television. (Which, frankly, is just rude. I expect your undivided attention!) Maybe you’re alone in the library. It’s quiet. But is it really? Distant footsteps among the stacks. The hum of the air conditioning…
You are surrounded by sound unless you’re reading this in a sensory deprivation chamber. The soundscape around us shapes our understanding of the world, becoming its own meaningful context for every other sense perception. Most of the time, it barely registers; we don’t attend to it unless we are listening for something in particular. But take it away, and we feel lost, vulnerable, and disoriented.
Not surprisingly, sound provides an equally meaningful context for cinema. Or at least, it shouldn’t be surprising. But then again, it wasn’t until 1927 that Sam Warner figured out how to marry sound and image in The Jazz Singer, the first film with synchronized dialogue. Before that, no one cared that cinema was a purely visual medium. And as Sam toiled away at the new technology, most of the other movie moguls in Hollywood assumed it was a passing fad. That no one really wanted to hear the actors talking.
In the century or so since they were all proven wrong, sound has become co-expressive with cinematography; that is, it shapes how we see what’s on screen, just as the images we see influence how we perceive the sounds.
Just listen to how French filmmaker Agnès Varda has used sound and image together over the last half-century:
And like cinematography, sound recording and reproduction have increased in sophistication and technical complexity, developing their own important contribution to cinematic language along the way. So much so that when we talk about the use of sound in cinema, we talk about it in terms of sound design, a detailed plan for the immersive effects of a motion picture’s soundscape that begins in pre-production before a single frame is shot and extends to the very end of post-production, often the final element in the entire process.
SOUND RECORDING
Before we get to how that soundscape is shaped in the post-production process, let’s look at how (and what) sound is recorded during production. The production sound department is made up of several specialists dedicated to recording clean sound on set as the camera rolls. They include the on-set location sound recordist or location sound mixer, who oversees the recording of on-set sound and mixes the various sources in real-time during production; boom operators, who hold microphones on long poles to pick up dialogue as close to actors as possible without being seen on camera (it helps if they are very tall and relatively strong; those poles get heavy after a while); and assistant sound technicians, responsible for organizing the equipment and generally assisting the sound mixer.
And just like the camera department, the sound department has its own set of specialized equipment to make its work possible. Obviously, there are microphones involved. But sound recordists can be as particular about their microphones (what brand, type, and technology) as cinematographers are about their cameras. Microphones can be omnidirectional or directional, cardioid or super-cardioid, mono or stereo, and each one will pick up sounds in a distinctly different way. You can use a shotgun mic on a boom pole to target a sound source from a reasonable distance with a shielded cable. Or you can use a tiny Lavalier mic taped to the collar of an actor that sends an audio signal wirelessly to the recorder. Or you can use all of the above in an endless number of configurations, all feeding into the same field mixer for the recordist to monitor and record.
Now you may be wondering, isn’t there a microphone right there on the camera? Why not just use that and save all that headache?
First of all, if you asked that out loud, every sound recordist in the universe just collectively screamed in agony. Second, they’re all so upset because cameras are designed to record an image, not sound. And while they may have a relatively cheap omnidirectional microphone built-in or even inputs for higher-quality microphones, nothing can replace the trained ears of a location sound mixer precisely controlling the various streams of audio into equipment designed to do just that. This is why, even now, most cinema uses dual-system recording, that is, recording sound separate from the image during production.
Dual-system recording allows for more precise control over the location sound, but it also comes with its own problem: synchronization. If the sound is recorded separately from the image, how do you sync them up when you’re ready to edit? Glad you asked. Ever seen one of these:
We have lots of names for it: clapper, sticks, sound marker, but the most common is slate, based on the fact that in the early days, it was made out of slate, the same stuff they used to make chalkboards. It serves two purposes. The first is to visually mark the beginning of each take with the key details of the production as well as the scene, shot, and take number. This comes in handy for the editor as they are combing through all of the footage in post-production. The second is to set a mark for sound synchronization. A crew member, usually the second camera assistant, holds the slate in front of the camera and near a microphone, verbally counts off the scene, shot, and take number, and then SLAPS the slate closed. In post-production, the editors, usually an assistant editor (cause, let’s face it, this is tedious work), can line up the exact frame where the slate closes with the exact moment the SLAP is recorded on the microphone. After that, the rest of the shot is synchronized.
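Editing software handles this alignment automatically now, but the underlying logic is simple enough to sketch. Here is a minimal, hypothetical Python example (it assumes the production audio is already loaded as a NumPy array and that the slate-close frame has been spotted in the picture); it is not any particular editor's algorithm, just the basic idea of matching the slap to the frame.

```python
import numpy as np

def sync_offset_seconds(audio: np.ndarray, sample_rate: int,
                        slate_frame: int, video_fps: float = 24.0) -> float:
    """Estimate how far the audio must be shifted to line up with the picture.

    audio       : mono waveform containing the slate SLAP (a sharp transient)
    slate_frame : frame number where the slate is seen closing on camera
    Returns the offset (audio time minus picture time) in seconds.
    """
    clap_sample = int(np.argmax(np.abs(audio)))   # loudest transient = the slap
    clap_time = clap_sample / sample_rate         # when the slap was recorded
    frame_time = slate_frame / video_fps          # when the slate closes on screen
    return clap_time - frame_time

# Hypothetical example: a 2-second recording with a spike 1.5 s in,
# while the slate is seen closing at frame 30 (1.25 s into the shot).
sr = 48_000
audio = np.random.default_rng(0).normal(0, 0.01, sr * 2)
audio[int(1.5 * sr)] = 1.0
print(f"Shift audio by {sync_offset_seconds(audio, sr, slate_frame=30):.3f} s")
```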
In fact, this whole process, repeated for every take during production, is a kind of call-and-response ritual:
1st Assistant Director: “Quiet on the set! Roll sound!”
Sound mixer: “Sound speed!”
1st AD: “Roll camera!”
Cinematographer: “Rolling!”
2nd Assistant Camera: “Scene 1 Apple Take 1” SLAP!
Cinematographer: “Hold for focus. Camera set!”
Director: “And… ACTION!”
Every. Single. Time. And note that the 2nd AC mentions scene number 1, the shot, Apple (for shot “A” of scene 1), and take number 1.
But wait… sound speed? That’s another of those little anachronisms of cinema. For much of cinema sound history, the sound was recorded onto magnetic tape on a clunky reel-to-reel recorder. It would take a moment for the recorder to get up to “speed” once the recordist hit record, so everyone would have to wait until they called out “sound speed!” We use digital recording these days with no lag time at all, but the ritual never changed.
Sometimes, 2nd ACs can have a lot of fun with this little ritual. Check out Geraldine Brezca’s spin on the tradition throughout Quentin Tarantino’s Inglourious Basterds (2009):
Now that we have a sense of how things get recorded on set during production, we should probably cover what gets recorded. The answer: not much. Or at least a lot less than you might think. In fact, the focus of on-set recording is really just clean dialogue. That’s it. Everything else, background sounds, birds chirping, music on a radio, and even footsteps, are almost always recorded after production. The main job of location sound recordists is to isolate dialogue and shut out every other sound.
Why? Because sound editors, the folks who take over from the recordists during post-production, want to control everything. Remember how nothing is on screen by accident? The same goes for sound. Clean dialogue has to match the performance we see on screen, but everything else can be shaped to serve the story by layering in one sound at a time.
There is one exception. Another little ritual everyone gets used to on a set. At the end of a scene, when all of the shots are done, the location sound recordist will whisper to the 1st AD, and the 1st AD will call out: “Hold for room tone!” And then everyone stops in their tracks and holds still, remaining completely silent for at least 60 seconds.
It’s awkward:
But what is room tone? Every space, interior or exterior, has its own unique, underlying ambient sound. What we sometimes call a sound floor. During production, as the actors deliver their lines, the microphones pick up this sound floor along with the dialogue. But in post-production, as the editors pick and choose the takes they want to use, there will inevitably be gaps in the audio, moments of dead air. Room tone recordings can be used to fill in those gaps and match the sound floor of the recorded dialogue.
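To see what "filling the gaps" means in practice, here is a minimal, hypothetical sketch. It assumes the dialogue track and the room tone recording are NumPy arrays at the same sample rate, and it treats any near-silent samples as dead air, which is a simplification of how a real dialogue editor works.

```python
import numpy as np

def fill_with_room_tone(dialogue: np.ndarray, room_tone: np.ndarray,
                        silence_threshold: float = 1e-4) -> np.ndarray:
    """Replace dead-air samples in a dialogue track with looped room tone."""
    out = dialogue.copy()
    # Tile the room tone recording so it is at least as long as the dialogue track.
    reps = int(np.ceil(len(dialogue) / len(room_tone)))
    tone = np.tile(room_tone, reps)[: len(dialogue)]
    # Anywhere the dialogue track is effectively silent, copy in room tone instead.
    gaps = np.abs(dialogue) < silence_threshold
    out[gaps] = tone[gaps]
    return out

# Hypothetical example: 3 seconds of "dialogue" with a 1-second gap in the middle.
sr = 48_000
dialogue = np.random.default_rng(1).normal(0, 0.1, sr * 3)
dialogue[sr:2 * sr] = 0.0                                   # dead air
room_tone = np.random.default_rng(2).normal(0, 0.01, sr)    # 1 s of recorded ambience
patched = fill_with_room_tone(dialogue, room_tone)
```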
Of course, as I mentioned, it can be a bit awkward. But it can also be kind of beautiful in its own way:
Room tone is just another example of how sound editors control every aspect of the sound in the cinematic experience.
SOUND EDITING
In the last chapter, we focused on editing the visual elements in a motion picture and how the shots fit together to create a narrative flow and communicate with the audience. As it turns out, sound requires a similar approach in post-production and is often even more “invisible” than picture editing techniques. (In fact, if there are any sound editors reading this book, they probably noticed that picture editing has a whole chapter, and all they get is this one crummy section. Typical.)
But sound editing is much more than simply joining up the sounds that already exist. It involves creating all of the sounds that weren’t recorded on set to make up the rich soundscape of the finished motion picture. In that sense, it is literally more “creative” than picture editing! (How’s that, sound editors? Feel better now?)
One important bit of post-production sound creation has to do with dialogue. Sometimes, an actor’s dialogue for that perfect take is unusable because of distracting ambient sounds or a poorly placed microphone. (C’mon, location sound recordist, you had one job!) In that case, sound editors bring in the actors to perform ADR, short for Automated Dialogue Replacement (sometimes also referred to as Additional Dialogue Recording or “looping”). They simply play the scene in a repeating “loop” as the actors record the lines repeatedly until they match the performance on screen. Then, the sound editors adjust the quality of the recording to match the setting of the scene.
But what about all those other sounds that weren’t recorded on set? The birds chirping, the cars passing, even those footsteps? Those too, have to be created and gathered together in post-production and layered into the sound design. Many of these sounds already exist in extensive sound libraries, pre-recorded by sound technicians and made available for editors. But many of them must be created to match exactly what the audience will see on screen. That’s where foley artists come in.
Foley artists are a special breed of technician: part sound recordist, part performance artist. Their job is to fill in the missing sounds in a given scene. By any means necessary:
Foley artists have to get creative when it comes to imitating common (and not-so-common) sounds. But sound editors must go beyond recreating the most obvious sounds associated with a scene. Every rustle of clothing, a hand on a cup, brushing a hair behind an ear. These tiny details, most of which we would never notice until they were missing, help create continuity in the final edit.
Yes, there’s that word again: continuity. Editing pictures for continuity means creating a narrative flow that keeps the audience engaged with the story. Editing sound for continuity has the same goal but relies on different techniques. For example, if we see someone walking on gravel but hear them walking on a hardwood floor, that break with continuity – or, in this case, logic – will take us out of the narrative. The soundscape must match the cinematography to maintain continuity. And since so much of the sound we hear in cinema is created and added in post-production, that requires incredible attention to detail.
But there are other ways editors can use sound to support the principle of narrative continuity, and not always by matching exactly what we see on screen. For example, a sound bridge can be used to help transition from one shot to another by overlapping the sound of each shot. This can be done in anticipation of the next shot by bringing up the audio before we cut to it on screen, known as a J-cut, or by continuing the audio of the previous shot into the first few seconds of the next, known as an L-cut. This technique is most noticeable in transitions between radically different scenes, but editors use it constantly in more subtle ways, especially in dialogue-heavy scenes. Here are some quick examples:
And just like picture editing, sound editing can also work against audience expectations, leaning into discontinuity with the use of asynchronous sounds that seem related to what we’re seeing on screen but are otherwise out of sync. These are sound tricks intended to either directly contrast what we see on screen or to provide just enough disorientation to set us on edge. Here’s one famous example of asynchronous sound from Alfred Hitchcock’s The 39 Steps (1935):
The woman opening the train compartment door discovers a dead body, but instead of hearing her scream, we hear the train whistle. In this case, we get an asynchronous sound combined with a J-cut.
Production sound recording and sound editing are both part of the overall sound design of cinema, and there are lots of moving parts to track throughout the process. Take a look at how one filmmaker, David Fincher (along with Christopher Nolan, George Lucas, and a few others), uses all of these elements of sound design to embrace the idea of sound as co-expressive with the moving image:
SOUND MIXING
Once all of the sound editing is done and matched up with the image, the whole process moves to the sound mixer to finalize the project. And if you’ve ever wondered why there are two Academy Awards for sound, one for sound editing and one for sound mixing, this is why. (Or maybe you’ve never wondered that because that’s when you decided to grab a snack. I mean, who pays attention to Best Sound Mixing?) Sound mixers must take all of the various sound elements brought together by the editors, including the music composed for the score (more on that later), and balance them perfectly so the audience hears exactly what the filmmakers want them to hear from shot to shot and scene to scene.
This is a very delicate process. On the one hand, the sound mix can be objectively calibrated according to a precise decibel level, or degree of loudness, for each layer of sound. Dialogue within a certain acceptable range of loudness, music in its range, sound effects in theirs. Basic math. On the other hand, the mix can and should be a subjective process, with actual humans in a room making adjustments based on the feel of each shot and scene. Most of the time, it’s both. When it’s done well, the audience will feel immersed in each scene, hearing every line of dialogue clearly, even when there are car crashes, explosions, and a driving musical score.
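The "basic math" side of that balance can be sketched in a few lines. The decibel windows below are made-up placeholders, not a delivery spec; the point is only that each layer gets checked against its own target range before the subjective, by-ear adjustments begin.

```python
# Illustrative only: these target windows are made-up placeholders, not a delivery spec.
TARGET_RANGES_DB = {
    "dialogue": (-27, -18),
    "music":    (-35, -25),
    "effects":  (-40, -20),
}

def check_mix(levels_db: dict) -> None:
    """Flag any stem whose measured level falls outside its target window."""
    for stem, level in levels_db.items():
        low, high = TARGET_RANGES_DB[stem]
        status = "ok" if low <= level <= high else "adjust"
        print(f"{stem:9s} {level:6.1f} dB  target {low}..{high} dB  -> {status}")

check_mix({"dialogue": -22.0, "music": -23.5, "effects": -31.0})
```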
For example, check out this deconstruction of the sound design from a single scene from The Bourne Identity (2002):[1]
Sound mixing is one of those technical aspects of filmmaking that has evolved over the decades, especially as the technology for sound recording and reproduction has changed in more recent years. Starting with the birth of cinema sound in 1927, movie houses had to be rigged for sound reproduction. Which usually meant a couple of massive, low-quality speakers. But by 1940, sound mixers were already experimenting with the concept of surround sound and the ability to move the various channels of sound around a theater through multiple speakers to match the action on screen.
As the century rolled on, newer, high-fidelity sound reproduction found its way into theaters, allowing for more sophisticated surround sound systems and, consequently, more work for sound mixers to create an immersive experience for audiences. George Lucas introduced THX in 1983, a theatrical standard for sound reproduction in theaters to coincide with the release of Return of the Jedi. In 1987, a French engineer pioneered 5.1 surround sound, which standardized splitting the audio into six distinct channels: two in the front, two in the rear, one in the center, and one just for low bass sound. As recently as 2012, Dolby introduced Dolby Atmos, a new surround sound technology that heightens the available options for sound mixers. Now, sound can appear to be coming from in front, behind, below, or above audiences, creating a 3-D aural experience.
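For reference, the six channels of a 5.1 mix described above are conventionally labeled as follows. This is just an illustrative mapping; the note about dialogue reflects common mixing practice rather than a rule.

```python
# The six channels of a 5.1 surround mix, as described above.
SURROUND_5_1 = {
    "L":   "front left",
    "R":   "front right",
    "C":   "center (dialogue is usually anchored here)",
    "LFE": "low-frequency effects, the '.1' bass channel",
    "Ls":  "surround (rear) left",
    "Rs":  "surround (rear) right",
}

for channel, role in SURROUND_5_1.items():
    print(f"{channel:>3}: {role}")
```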
And every element in the final soundtrack has to be calibrated and assigned by the sound mixer. Check out how complex the process was for the sound mixers on Ford v Ferrari (2019):
Finding the right mix of sound is critical for any cinematic experience, but one element that many filmmakers (and audiences) neglect is the use of silence. The absence of sound can be just as powerful as, if not more powerful than, the many layers of sound in the final track. Silence can punctuate an emotional moment or put us in the headspace of a character in a way that visuals alone simply cannot.
Check out how skillfully Martin Scorsese uses silence throughout his films:
Of course, in most of these examples, silence refers to the lack of dialogue or a dampening of the ambient sound. Rarely is a filmmaker brave enough to remove all sound completely from a soundtrack. Dead air has a very different quality to it than simply lowering the volume of the mix. But a few brave souls have given it a try. Here’s French New Wave experimental filmmaker Jean-Luc Godard playing an aural joke in Bande à part (1964):
It’s not actually a full minute of dead air – it’s more like 36 seconds – but it feels like an hour.
Compare that to this scene from the more recent film Gravity (2013):
That was also 36 seconds. Perhaps a little wink from the director Alfonso Cuarón to the French master Godard. But both are startling examples of the rare attempt to completely remove all sound to great effect.
MUSIC
One of the most recognizable elements in the sound of cinema is, of course, music. And its importance actually pre-dates the synchronization of sound in 1927. Musical accompaniment was almost always part of the theatrical experience in the silent era, and films were often shipped to theaters with a written score to be performed during the screening. Predictably, the first “talking picture” was a musical and had more singing than actual talking.
As the use of sound in cinema has become increasingly sophisticated over the last century, music has remained central to how filmmakers communicate effectively (and sometimes not so effectively) with an audience. At its best, music can draw us into a cinematic experience, immersing us in a series of authentic, emotional moments. At its worst, it can ruin the experience altogether, telling us how to feel from scene to scene with an annoying persistence.
But before we try to sort out the best from the worst, let’s clarify some technical details about how and what type of music is used in cinema. First, we need to distinguish between diegetic and non-diegetic music. If the characters on screen also hear the music we hear, that is, it is part of the world of the film or TV series, then it is diegetic music. If the music is not a part of the world of the film or TV series, and only the audience can hear it, then it is non-diegetic music. Too abstract? Okay, if a song is playing on a radio in a scene, and the characters are dancing to it, then it is diegetic. But if scary, high-pitched violins start playing as the Final Girl considers going down into the basement to see if the killer is down there (and we all know the killer is down there because those damn violins are playing even though she can’t hear them!), then it is non-diegetic.
Diegetic versus non-diegetic sound is a critical concept in the analysis of cinema, and crafty filmmakers can play with our expectations once we know the difference (even if we didn’t know the terms before now). For example, non-diegetic music can communicate one emotion to the audience, while diegetic music communicates something entirely different for the characters on screen. Think about the movie JAWS (1975). Even if you haven’t seen it, you know those two deep notes – da dum… da dum – that start out slow, then build and build, letting us know the shark is about to attack. Meanwhile, the kids in the water are listening to pop music, completely oblivious to the fact that one of them is about to be eaten alive!
And this concept applies to more than just music. Titles, for example, are a non-diegetic element of mise-en-scene. The audience can see them, but the characters can’t.
Second, we need to distinguish between a score written by a composer and what we could call a soundtrack of popular music used throughout that same motion picture. The use of popular music in film has a long history, and many of the early musicals in the 1930s, 40s, and 50s were designed around popular songs of the day. These days, most films or TV series have a music supervisor who is responsible for identifying and acquiring the rights for any popular or pre-existing music the filmmakers want to use in the final edit. Sometimes, those songs are diegetic – that is, they are played on screen for the characters to hear and respond to – or they are non-diegetic – that is, they are just for the audience, to put us in a certain mood or frame of mind. Either way, they are almost always added in post-production after filming is complete. Even if they are meant to be diegetic, playing the actual song during filming would make editing between dialogue takes impossible. The actors just have to pretend they are listening to the song in the scene, which is fine since pretending is what they do for a living.
But the type of music that gets the most attention in formal analysis is the score, the original composition written and recorded for a specific motion picture. A film score, unlike popular music, is always non-diegetic. It’s just for us in the audience. If the kids in the water could hear the theme from JAWS, they’d get out of the damn water, and we wouldn’t have a movie to watch. It is also always recorded after the final edit of the picture is complete. That’s because the score must be timed to the rhythm of the finished film, each note tied to a moment on screen to achieve the desired effect. Changes in the edit will require changes in the score to match.
It is in the score that a film can take full advantage of music’s expressive, emotional range. But it’s also where filmmakers can go terribly wrong. Music in film should be co-expressive with the moving image, working in concert to tell the story (pun intended, see what I did there?). The most forgettable scores simply mirror the action on screen. Instead of adding another dimension, what we see is what we hear. Far worse is a score that does little more than tell us what to feel and when to feel it. The musical equivalent of a big APPLAUSE sign.
These tendencies in cinematic music are what led philosopher and music critic Theodor Adorno to complain that the standard approach to film scores was simply to “interpret the meaning of the action for the less intelligent members of the audience.” Ouch. But, in a way, he’s not wrong. It's not about the less intelligent bit, but about how filmmakers assume a lack of intelligence, or maybe awareness, of the power of music in cinema. Take the Marvel Cinematic Universe, for example. You all know the theme of JAWS. You probably also know the musical theme for Star Wars, Raiders of the Lost Ark, and maybe even Harry Potter. But can you hum a single tune from any Marvel movie? Weird, right? Check this out:
The best cinema scores can do so much more than simply mirror the action or tell us how to feel. They can set a tone, play with tempo, and subvert expectations. Music designed for cinema with the same care and thematic awareness as cinematography, mise-en-scene, or editing can transform our experience without us even realizing how and why it is happening.
Take composer Hans Zimmer, for example. Zimmer has composed scores for over 150 films, working with dozens of filmmakers. And he understands how music can support and enhance a narrative theme, creating a cohesive whole. In his work with Christopher Nolan on The Dark Knight (2008), Inception (2010), and Interstellar (2014), his compositions explore the recurring theme of time:
Musical scores can also emphasize a moment or signal an important character. Composers use recurring themes, or motifs, as a kind of signature (or even a brand) for a film or TV series. The most famous of these are the ones you can probably hum to yourself right now, again like Star Wars, Raiders of the Lost Ark, maybe even Harry Potter. Composers can use this same concept for a specific character as well, known as a leitmotif. Think of those two ominous notes we associate with the shark in JAWS. That’s a leitmotif. Or the triumphant horns we hear every time Indiana Jones shows up in Raiders. That’s a leitmotif.
Oh, and all those movies I mentioned just now? They all have the same composer. His name is John Williams. And he’s a legend:
While we are not analyzing a Studio Ghibli film in this section, I would be remiss not to mention Mamoru Fujisawa, better known by his professional name, Joe Hisaishi, a Japanese composer and conductor whose partnership with Miyazaki began in the early 1980s. In fact, Hisaishi's success and global recognition garnered him the moniker ‘the Japanese John Williams.’
Yet, as amazing as Hisaishi and Williams are as composers, cinema is set apart as an art form for its ability, and need, to blend all of the arts into something more, and it is the blend of Miyazaki's artistry and vision, Hisaishi's emotive compositions, and a uniquely Japanese sentiment that creates Ghibli's magic.
For instance, Shantanu Singh over at Medium notes:
One of the things that Miyazaki often talks about is “Ma”, or as he calls it, the silence between the clap. “Ma” is a Japanese concept that refers to the respite between activity. Miyazaki deftly integrates this respite within his work. He makes sure that there is stillness between the chaos. A moment where you can pause and just enjoy the beautiful scenery, the vibrant colors, and the music that accompanies it all.
We do have the option, however, of studying another notable collaborative duo, Shinichiro Watanabe and Yoko Kanno, who created Cowboy Bebop, Samurai Champloo, Space Dandy, and others: works that are not only directly inspired by a genre of music but are crafted like (delightfully) self-contained albums.
GammaRay nicely documents Shinichiro's inspirations, as well as how Kanno inspired Watanabe in real time:
Video and Image Attributions:
The Sounds of Agnès Varda by Fandor. Standard YouTube License.
Traditional Wooden Slate. Public Domain Image.
Inglourious Basterds – “Camera Angel” Clapper by rucksack76. Standard YouTube License.
Living in Oblivion (room tone) by Ana Limón. Standard YouTube License.
The Gift of Room Tone by The Criterion Collection. Standard Vimeo License.
How The Sound Effects In ‘A Quiet Place’ Were Made | Movies Insider by Insider. Standard YouTube License.
SFX Secrets: The J Cut & The L Cut by Fandor. Standard YouTube License.
39 Steps train whistle J-cut by Jack Lucido. Standard YouTube License.
Fight Club | The Beauty of Sound Design by Film Radar. Standard YouTube License.
Car chase sound design in The Bourne Identity by INDEPTH Sound Design. Standard YouTube License.
‘Ford v Ferrari’ Sound Editors Explain Mixing Sound for Film | Vanity Fair by Vanity Fair. Standard YouTube License.
Martin Scorsese – The Art of Silence by Every Frame a Painting. Standard YouTube License.
Bande à part – One Minute of Silence by Etrio Fidora. Standard YouTube License.
Gravity – Clip (7/11): Ryan’s Hallucination by Richard Parker. Standard YouTube License.
Jaws (1975) – Get out of the Water Scene (2/10) | Movieclips by Movieclips. Standard YouTube License.
The Marvel Symphonic Universe by Every Frame a Painting. Standard YouTube License.
The Meaning in the Music: Hans Zimmer and Time by Dan Golding. Standard Vimeo License.
John Williams and the universal language of film music by Dan Golding – Video Essays. Standard YouTube License.
COWBOY BEBOP: The Art of Music Scoring Anime by GammaRay. Standard YouTube License.
- If you want to see more videos like this one, check out InDepth Sound Design's YouTube channel, it's pretty cool: https://www.youtube.com/channel/UCIaYa00v3fMxuE5vIWJoY3w ↵
Film Journal: Adrift in Tokyo, Like Someone In Love, or Shoplifters
Purpose:
This assignment aims to deepen our understanding of the role of sound and silence in the film, both animated and live-action, exploring how these elements are crafted and edited to create verisimilitude and enhance narrative depth. Drawing inspiration from diverse cinematic practices, we'll consider how the concept of "Ma," as emphasized by Hayao Miyazaki—the strategic use of silence or pause—can influence film beyond the realm of animation, offering moments of respite amidst cinematic activity.
Here is the challenge for our journal entries: this is not my textbook; it is our textbook. Is there a section of this module you think could be better, or would you like to add to it? Go for it. That is the point.
Prompts:
Silence and Sound in the Cinematic Soundscape
Choose a scene from a live-action film that notably incorporates silence beyond the absence of dialogue. Reflect on the auditory elements present:
- Identify sound editing techniques used in the scene, such as "sound bridges," J-cuts, L-cuts, or the deliberate use of silence.
- Discuss how these techniques contribute to the ambiance of the scene and its emotional impact on the audience.
- Considering Miyazaki's concept of "Ma"—the pause or gap that adds depth and breadth to the narrative—analyze how strategic silences within the scene might serve a similar purpose, even in the context of live-action cinema.
Diegetic Music and its Narrative Function
Analyze a scene that features diegetic music within a live-action film, focusing on the choice to include this element and its effect:
- Examine the interaction between characters and the diegetic music, pondering what it reveals about their internal states or relationships.
- Reflect on how moments of silence before, after, or between pieces of diegetic music might enhance the scene's narrative impact, drawing a parallel to Miyazaki’s use of "Ma" to enrich storytelling through contrast and relief.
Criteria:
All journal entries should use correct MLA formatting, specific diction and terms from the module, and a direct answer to the prompt using specific scenes and examples from the film you are reviewing. We must also provide a works cited page for the film we are reviewing!
Your response should be at least 500 words.
Task:
Complete the aforementioned journal following one of the prompts (noting that it is not imperative that you answer every question unless it is related and relevant to your overall point).
You may submit your response in either a written, oral, or video format uploaded into the Google Classroom.
Explorative Assignment: Deconstructing Narrative
Deconstructing Narrative
Purpose:
In this final assignment, we will draw on Tokyo's avant-garde spirit.
From the genderless fashion trends to the deep love of Jazz and improvisation, there is something subversive in the prefecture's air. So, in our final assignment, the big emotions will be played small, and the small emotions will be played big, and we will take a postmodern approach.
Historically, culture progressed through stages. In premodern culture, art and religion played a central role. With the Industrial Revolution, mass image production emerged, leading to modernism—rejecting traditional values in favor of new ideologies like consumerism and science. Fast-forward to the mid-20th century, when an inundation of mass-produced images led to postmodernism.
Baudrillard, a French academic, observed three main aspects of postmodernism:
- Our reality is saturated with cultural representations, resulting in intertextuality—borrowing from existing culture when creating new products.
- Hyperreality blurs the lines between simulation and reality. As seen in TV shows and social media, we often accept simulations as facts.
- Lastly, meaning implosion occurs due to conflicting messages, leading to diverse interpretations and a loss of trust in truth.
Does this sound a little like Tarantino and Miike? It should; they are both postmodern directors.
In essence, postmodernism signifies a cultural state in which media overwhelms reality, leading to a hyper-real world with challenges in distinguishing between simulation and reality and conflicting interpretations of truth.
If that explanation did not work, try this one:
This explains postmodernism, but it also is postmodernism.
That's, like, sixteen walls.
Criteria for Scene Creation:
- Utilize Lighting and Color: Manipulate lighting and color grading to invert traditional scene expectations. Consider how altering these elements can change the mood or perceived reality of a scene.
- Sound and Music Underscoring: Replace or juxtapose the scene’s original soundtrack with everyday sounds or music that either enhances or contradicts the visual narrative, thus challenging normative emotional cues.
- Special Effects and Editing: Employ special effects and editing techniques to distort or amplify the scene's reality. This could involve playing with speed, reversing footage, or inserting unexpected visual elements.
- Dialogue Alteration: Experiment with removing, altering, or overdubbing dialogue to shift the scene's focus, impact, and interpretation, creating new layers of meaning or dissonance.
Turn-in Methods for Scene Deconstruction:
- Script: Develop a script that details your creative choices in altering dialogue, action, and camera work, highlighting how these changes contribute to a postmodern reinterpretation of the scene.
- Storyboard: Create a storyboard that visually outlines your proposed alterations, showing how lighting, color, sound, and special effects will be used to achieve the desired effect.
- Shot List and Schedule: Prepare a shot list and schedule that organizes the practical aspects of realizing your deconstructed scene, including any necessary equipment and settings.
- Animatic: Produce an animatic that sequences your storyboard into a coherent visual flow, providing a clear representation of how the final scene will unfold with the proposed edits.
- Footage: Generate raw footage that embodies your postmodern vision, carefully incorporating the intended lighting, sound, music, and editing techniques to challenge viewers' expectations.
Required Analysis of Postmodern Techniques:
Alongside the scene recreation, each student must submit a 200-300 word analysis that critically examines the postmodern techniques employed in their scene. This analysis should cover:
- Technique Identification: Describe the specific postmodern techniques you applied, such as hyperreality, intertextuality, or meaning implosion, and how they were implemented in your scene.
- Intertextual and Hyperreal Elements: Discuss how your scene borrows from or references existing cultural products, and how you've used these elements to blur the lines between simulation and reality.
- Narrative and Meaning: Reflect on how your alterations challenge traditional narrative structures and invite diverse interpretations, considering the impact on viewer perception and the notion of truth.
Option Two:
Purpose: Work as a team on your group project, using the sites from this module's excursions as the locations for your final project. You may work to complete any of the following portions of the final assignment:
Turn-in Methods:
- Storyboard: A storyboard that visually maps out each shot, tailored to the locations available.
- Shot List and Schedule: A comprehensive shot list and schedule that organizes shoots, equipment, and actor availability.
- Footage: Raw footage of the recreated scenes, demonstrating the application of cinematography techniques.
Please note that each of these items needs to be completed.
Task:
It is your task to work together to recreate and deconstruct an iconic scene, shot, movement, or composition from film or animation.
You may submit your response in written, oral, or video format, uploaded to Google Classroom.
Week Three, Module Two - The Final Film
Final Project: That's a Wrap
Here we are, wrapping up, and it bears mentioning again:
We don't expect great acting, fancy VFX, complex sets, or high-quality equipment; we will primarily evaluate each student's and group's process, intermediate materials, and technical appreciation of cinematography.
Likewise, our Educational Goals are:
- Practice and then demonstrate the technical skills you acquired during the semester.
- Iterate on a production, refining your work and learning from peers, mistakes, and serendipity.
- Create a physical artifact for your portfolio.
- Experience the complete production cycle and the thrill of creation.
Our final project may be presented as stop motion, documentary, a sequence of freeze-frame live-action stills, live action, or animation. Much of the work should already be done, so it is time for post-production; there is no time to look back now.
It is time to commit to our final product, and I only ask for one thing:
Final Project Submission Criteria
- Team Composition and Collaboration
  - Evidence of effective teamwork: Documented roles and contributions of each team member.
  - Peer support: Proof of participation in another team's project for at least 2 hours.
- Preproduction Materials
  - Script (if applicable): Clear dialogue and narrative flow.
  - Storyboard: Visual representation of each shot.
  - Shot List: Detailed list of every shot in the film, including camera angles and movement.
  - Production Schedule: Detailed timeline for shoots, including reserved equipment and spaces, actor availability, and editing.
  - Animatic (Optional): A basic version of your film using storyboard panels, still frames, and/or existing footage to outline the story.
- Production Materials
  - Footage: Dailies and B-roll coverage, with multiple takes of key shots. Aim for roughly 10x the expected running time of the final product (for example, about 40 minutes of raw footage for a 4-minute film).
  - Storyboard and Animatic Revisions (if applicable): Updated versions based on actual footage.
- Post-Production Materials
  - Rough Cut: A preliminary edit of the film, including placeholder audio and up to two still shots from the storyboard or found footage. This version should be presented in 540p or 360p MP4 format.
  - Final Film:
    - Length: 2 to 8 minutes, plus titles and credits, in 720p MP4 format (see the sanity-check sketch after this list).
    - Must include titles, credits, and copyright information.
    - Demonstrates knowledge of cinematography, intentional lighting, and editing in continuity/IMR style. Audio is optional but recommended.
    - Camera footage must be filmed by the team (it cannot be 100% animation or found footage).
- Extra Credit
  - "Behind the Scenes" Reel: Less than 150 MB, in 720p MP4 format, showcasing elements of the production process.
Task:
Each student should submit their final assignment. (Yes, the final product will be the same for everyone in the group, but the surrounding material should be unique for each student involved.)
You may submit your final project by uploading it to Google Classroom.