T R A I L E R  | W O R L D

Pari Film – Teasers clock in over 41 million views!

It was a great experience editing 7 short teasers for this film. The teasers alone clocked in over 41 million views.

Pari’s scary social media strategy sets a movie marketing example.

Pari Movie Marketing

Brand Introduction

Pari, one of the most anticipated Bollywood horror films, was scheduled to hit screens on March 2, during the festival of Holi. Produced under the banner of Clean Slate Films, owned by the film’s leading lady, Anushka Sharma, Pari was marketed through social media in order to generate maximum awareness around its release.


In the lead-up to its release, Pari was promoted and marketed through social media, leveraging timely, regular material from the film such as posters and ‘screamers’ (a play on the word ‘teasers’, befitting the horror genre), among many other activities.

Problem Statement/Objective

The Pari team intended to promote the film as a unique take on the horror genre, while also emphasizing its release date over the Holi weekend. Communication around the promotional material pivoted on the message that Pari was ‘not a usual horror film’.


Keeping in mind the genre of the film, Pari was promoted through suspenseful short teasers, termed screamers. These ‘screamers’ were created specifically for the marketing activities on social media such as ‘Growing nails’, ‘I love you too’ and more.

Beginning the social media activity for Pari a couple of months before its release, Anushka Sharma rolled out the first screamer, initiating the film’s promotional campaign.

The film’s release on the festival of Holi was emphasized with the hashtag #HoliWithPari, followed by the second screamer exactly one month before the film’s release.

The third screamer, ‘I love you’, was rolled out on Valentine’s Day, creating a strange and scary perception around the film, and was followed by screamer 4 on 19th February as the campaign slowly gained momentum toward the release date.

Exclusive posters and images were posted on social media in order to sustain the momentum.

Launching another poster, the Pari team made sure they sustained the conversation and intrigue on social media while at the same time, successfully establishing a memory in the minds of users that the film was releasing on Holi.

On Twitter, Pari engaged audiences by sending out creepy replies with GIFs from the film through Clean Slate Films and Anushka Sharma’s accounts, surprising the audience and ensuring maximum engagement.


Additionally, an automated Messenger bot for the film was leveraged, sending out important information to Messenger users, along with an interesting new AR based Face Filter, urging audiences to share their pictures for a chance to win movie tickets for Pari.


Screamers 5, 6, and 7 were rolled out at different times before the release of the film with the 7th one being posted just one day before the film released in order to create maximum impact.

One of the first Bollywood films to use the new Press and Hold Format posts on Facebook, Pari put a scare in social media users by putting up an endearing image as the thumbnail, but revealing a scary image upon interaction.

As a part of the offline activities that were amplified digitally, the Pari team conducted a series of pranks at a theatre in Mumbai, promoting them on social media.

Countdown posts were rolled out on social media to create recall in the minds of social media users.

After the film’s release, positive reviews were collated into a Twitter Moment, and a similar activity was executed on Instagram with 9 reviews.


The total views received by the 7 screamers posted at regular intervals by the Pari team were clocked at over 41.4 Million.

The Pari trailer garnered over 17 Million views, with the overall promotional campaign for the film generating 1.4 Million Impressions on Twitter, along with 22,941 engagements and 18,955 mentions.

On Facebook, Pari received 85 Million Impressions, with 268K Engagement and a Reach of over 78 Million, whereas on Instagram, Pari managed to clock in 1.3 Million Impressions, 90K in Engagement and a Reach of over 1.2 Million.


Workflow Breakdown of Every Best Picture and Best Editing Oscar Nominee

A great write-up about the workflows behind every Best Picture and Best Editing Oscar nominee.

David Kong

Written by DAVID KONG, March 5, 2018

We at Frame.io love getting nerdy about workflows, so when we had the opportunity to speak to a few of the teams behind this year’s Oscar nominees, we had to ask about the details. As we spoke to different teams, however, we were struck by several different points of comparison between the films, and that sparked the idea of conducting a broader analysis.

So we reached out to the editorial teams on all 9 Best Picture nominees and all 5 Best Editing nominees, and got the details of their technical workflows. There were 11 films in all, since three films were nominated for both categories. (The Post’s editorial team was unfortunately unavailable, so we relied on second-hand reports for that film.)

  • Baby Driver
  • Call Me By Your Name
  • Darkest Hour
  • Dunkirk
  • Get Out
  • I, Tonya
  • Lady Bird
  • Phantom Thread
  • The Post
  • The Shape of Water
  • Three Billboards Outside Ebbing, Missouri

We could have written an article on each of these films, but rather than diving deep into a particular one, we’re presenting an overview, highlighting some of the most interesting elements.



This is not an article about the funding of Hollywood films, but I think it’s helpful to keep the relative budgets in mind as we look at things like the length of principal photography and the size of the editorial team.

Here’s a look at the budgets for each of these films, from smallest to largest. It’s quite extraordinary how diverse the budgets are on this list of Oscar nominees, from $3.5 million on Call Me By Your Name all the way up to $100 million on Dunkirk.


Production Schedule


There’s a general trend up and to the right again, with a few exceptions. Production is always the most expensive phase of a feature film, and so it’s not surprising that the lower-budget films generally had tighter shooting schedules. What’s rather shockingly impressive is that the Get Out team shot a four-Oscar-nominated film in just three weeks!

The Post’s shooting schedule might have been longer had the whole film not been made on such a compressed timeline. Steven Spielberg badly wanted to make and release The Post quickly, as soon as he read the script, because he felt it was an important piece of social commentary for our time. The film was finished less than 9 months after Spielberg first read the script, and that kind of schedule means every phase of the film has to be rushed.


Editorial Team


Keeping the films in the same order again, let’s compare the sizes of the editorial teams. I excluded post supervisors and VFX editors, keeping this list to editors, assistant editors, associate editors, and editorial assistants.

While it’s certainly true that the two most expensive films had the largest teams, the rest of the nominated films needed no more than 4 on the editorial team. A large part of Dunkirk’s huge editorial team was needed for the various conform processes (keep reading), but even without that, it had a significantly larger team than any other film on our list.

The Post also had a significant added challenge—they were cutting two films at once! When Steven Spielberg decided to fast-track The Post, he was already in the midst of postproduction on Ready Player One, which couldn’t simply be put on hold for a year. Michael Kahn and Sarah Broshar, the two editors of The Post, thus had to be able to switch from one film to the other on a daily basis. So, I think we can probably cut them a little extra slack.

The Post © 20th Century Fox

I, Tonya’s team had the difficulty of a small budget but over 200 tricky VFX shots. Although Margot Robbie trained hard, a professional skater had to complete the most difficult moves, requiring various tracking techniques to place Margot back into the film, in addition to CG crowds and stadiums. I, Tonya’s small independent budget meant that Steve Jacks had to pull double-duty as first assistant editor and VFX editor (he is credited for both titles). He would be cutting in VFX shots in the morning, then in the afternoon he’d switch to doing sound work, and then he’d sometimes stay late to prep shots to turn over to VFX.

In contrast, Three Billboards’s workflow was so simple that, even with a two-man editing team through most of production, the first assistant editor Nicholas Lipari spent very little time on the technical details. On a given day, he’d spend around two hours ingesting and processing footage, and then he would be able to spend the rest of the day working with the editor Jon Gregory, which is quite an unusual scenario. It’s an unfortunate fact that the duties of the assistant editor are often so different from the duties of the editor that an assistant can perform their job well for years without getting the skills necessary to move up to an editing position.

(Do you agree with that? Disagree? Let us know in the comments or email blog(at)frame.io—we’re preparing an article on that topic.)


Postproduction Schedule


You might think that large productions like Dunkirk and The Post would have the luxury of spending as much time in the edit as they need, but the reality is that the amount of time allocated to postproduction doesn’t correlate very well with budget. I’ve kept the films in order of budget from left to right, and there isn’t a discernible trend at all. Note that I am including only the time spent in postproduction after the end of principal photography, although of course these teams were all hard at work producing assembly edits while the films were being shot.

On studio films, the release date is usually set before they even begin shooting, and the process of jockeying for prime release dates means that a studio film can end up with plenty of time for post, or very little.

The smaller independent films, like Three Billboards and Lady Bird, had a lot more freedom to take as much time as needed in the edit. Even small independent films do sometimes have to hit deadlines of course, which is often a particular film festival that they’re targeting. It’s interesting to note, though, that even though the independent films generally reported having much more flexibility in their postproduction schedule, they didn’t necessarily take the longest.

Sam Rockwell and Frances McDormand, 2018 Best Supporting Actor and Best Actress winners for Three Billboards Outside Ebbing, Missouri. © Fox Searchlight Pictures

In the case of this year’s nominees, a number of different factors dictated the amount of time in post: technical complexity of the edit (Baby Driver and Dunkirk), pressing release dates (The Post), complex VFX (The Shape of Water), with several taking whatever time they felt they needed (Three Billboards, Lady Bird, Call Me By Your Name).




Given the huge range of budgets for these films, it’s a testament to the amazing democratization of the tools of postproduction that these films were cut on similar hardware. In fact, many of these films were cut on laptops from portable hard drives, as well as in full edit suites.

Tweaks to The Shape of Water in a parking lot, photo courtesy of Doug Wilkinson.

It will come as no surprise that Apple’s hardware dominated the list, with most films editing off of Mac Pro “trash cans” in the main edit suites (the new iMac Pros were not available when these films were being edited). Call Me By Your Name’s team opted for top-of-the-line iMacs, and some will be surprised that Darkest Hour and The Shape of Water both used old Mac Pro towers, though The Shape of Water actually moved to Mac Pro trash cans in the middle of the shoot. Old habits die hard.


The Edit: Still Standard HD


Given Avid’s dominance of the feature film market, it’s no surprise that each of these films was edited using Avid’s DNx family of codecs, most using the DNx 115 flavor, though four films (Baby Driver, Lady Bird, Call Me By Your Name, and The Shape of Water) used the DNx 36 flavor. If you’re more used to the ProRes family, DNx 36 is comparable to ProRes Proxy, while DNx 115 is comparable to ProRes 422 (you can see a comparison table here).

Comparison of DNx codecs. You can find the full chart here.

Although it’s becoming easier and easier to edit in 4K (and these productions certainly had the budgets for high-end computers), all 11 teams chose to edit in standard 1920 x 1080 resolution. For readers who are used to working on smaller productions where the entire postproduction process happens at the same facility, this might be surprising. These films were all captured or scanned at 4K or higher, and the budgets were plenty large enough to pay for top-of-the-line hardware, so why not edit 4K? If people are editing feature films in raw 6K, surely it can be done in 4K without too much trouble.

Why Not 4K?

If you consider the workflows of these types of films, though, a 4K edit still doesn’t make much sense. Before we ask why these films wouldn’t edit in 4K, we first have to ask ourselves, “What are the benefits of editing in 4K?”

The primary benefit for most people who edit in 4K is simply the fact that they can skip the offline editing process and do all of their work directly on the camera-original files. Although it’s very easy to design a smooth offline workflow, and all of the major editing packages support it, there is some nice simplicity in avoiding the need to transcode for an offline workflow. If you are doing your color correction and finishing inside of your editor (which is increasingly possible), you have the added significant advantage of being able to move fluidly between phases of your postproduction process. You can spend more time on temporary color-correction as you are editing, knowing that you can continue that work later rather than having to start over again. You can also make edits to the film during the finishing phase, without the headaches of a reverse conform process.

But these feature films, even the low budget ones, all used a traditional offline workflow that involved a handoff from the editors to a separate finishing team at another facility. So there was no possibility of the kind of all-in-one workflows that are now becoming feasible.

As amazing as the iMac Pros and Z840s are for renderless 4K editing, there are still plenty of hiccups and slow-downs involved with editing a significant film in 4K. Temp VFX and color-correction can quickly choke playback on a system that’s not perfectly tuned, and render times stretch out.

The other significant disadvantage to a 4K edit for these films is the size of the storage required, even if you edit off of proxy files instead of the original camera files. At different points in the process, the teams behind Baby Driver, Three Billboards, and The Shape of Water had a complete copy of the film running off of a single hard drive connected to a portable laptop, which would not have been possible had they edited in 4K.

Baby Driver editor Paul Machliss’s laptop and Lacie hard drive on set. Image courtesy Paul Machliss.

A 2-hour feature film might easily have a 50:1 shooting ratio (capturing 50x as much footage as ends up in the final film), which means 6,000 minutes of recorded footage. Using our bitrate formulas, we can quickly calculate the space required.

ProRes 422 at UHD is 503 Mbps, so we plug our numbers into the top formula:

503 Mbps * 6,000 minutes * .0075 = 22,635 GB, or roughly 22.6 TB.

22TB is reasonable to store in a powered RAID, but you’re going to have trouble throwing that in your backpack.
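The arithmetic above is easy to sanity-check in a few lines of Python (a minimal sketch using the article’s own figures; the 0.0075 factor is just 60 s/min ÷ 8 bits/byte ÷ 1000 MB/GB):

```python
# Estimate offline storage for a 4K edit from the article's numbers:
# ProRes 422 at UHD is ~503 Mbps, and a 50:1 shooting ratio on a
# 2-hour (120-minute) film yields 6,000 minutes of recorded footage.

def storage_gb(bitrate_mbps, minutes):
    # Mbps -> GB per minute: x60 (seconds), /8 (bits -> bytes), /1000 (MB -> GB)
    return bitrate_mbps * minutes * 60 / 8 / 1000

minutes_of_footage = 120 * 50  # 2-hour film at a 50:1 shooting ratio
total = storage_gb(503, minutes_of_footage)
print(f"{total:,.0f} GB (~{total / 1000:.1f} TB)")  # -> 22,635 GB (~22.6 TB)
```

Swap in DNx 36’s much lower bitrate and the same footage fits comfortably on a single portable drive, which is exactly why these teams could cut from laptops.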

So, if you can’t take advantage of the primary benefits of a 4K workflow, and it will slow you down and hamper you, it just doesn’t make any sense to use cutting-edge 4K workflows purely for their own sake. Even if you’re cutting Dunkirk.

The one benefit that these productions would have received from a 4K edit is simply the greater resolution and detail during the editing process. That can be an advantage, but just not a big enough advantage to tempt these teams …yet.

Note that I’m specifically talking about editing in 4K vs 1080p, which is a separate question from whether to capture and master at 4K or 1080p. All of these films used a traditional offline workflow, which allowed them to edit in 1080p but produce a final output at a higher resolution.


Film vs Digital Acquisition


The decision to shoot on film vs. digital is always a fraught one. It can be a question of budget, a question of personal taste, or a question of subject matter. All of the films on this list had enough of a budget to consider film, but it’s interesting to note that, of the six movies shot on film (Baby Driver, I, Tonya, Call Me By Your Name, The Post, Phantom Thread, and Dunkirk), five of them were period pieces of some sort. Films set in the past tend to shoot on film more often than films set in the present because we tend to associate the look of older film stocks with the past.

Of course, it’s always possible to emulate the look of an older film stock with a digital image, but I’m trying not to take sides too much here 🙂

The only non-period-piece to use film was Baby Driver, which is even more surprising given the fact that the film’s synchronization to its soundtrack required real-time, on-set editing (more on that later). Shooting digitally could have simplified this workflow, but Edgar Wright is a film purist, and it’s pretty hard to argue with a director like Edgar Wright.

  • Baby Driver: 35mm, Alexa Mini at 2.8K ARRIRAW.
  • Call Me By Your Name: 35mm.
  • Darkest Hour: Alexa SXT, Alexa Mini at 3.4K ARRIRAW.
  • Dunkirk: IMAX 65mm and Standard 65mm film.
  • Get Out: Alexa Mini at 3.2K ProRes 4444.
  • I, Tonya: 35mm, Alexa 65 at 6.5K ARRIRAW, Phantom.
  • Lady Bird: Alexa Mini at 2K ProRes 4444.
  • Phantom Thread: 35mm film.
  • The Post: 35mm film.
  • The Shape of Water: Alexa Mini at 3.2K ARRIRAW.
  • Three Billboards Outside Ebbing, Missouri: Alexa XT Plus at 2K ARRIRAW, Ursa Mini.

Of these 11 films, Dunkirk was the only film to do an optical print with no digital intermediate. This was again a personal aesthetic choice by Christopher Nolan to avoid the digital color tools and restrict the color treatment to the limited tools of the traditional photochemical color-timing process.

The five other films that shot on physical film all scanned the images and did their color correction and online effects on digital files. They then exported DCPs (digital files) so that the films could be projected digitally, or else printed those digital files back onto film for a film projection (Phantom Thread). Dunkirk, instead, was projected on film using prints made from the original camera negatives where possible.

There are several caveats, however. Dunkirk still did a digital scan of the camera negatives, for several different reasons. First, in spite of his love of physical film, not even Christopher Nolan can deny the obvious advantages of digital editing, and so the film had to be scanned and transcoded to a 1080p digital intermediate codec for the edit. Second, although Nolan tried to do as many effects in-camera as possible, it was still necessary to do some digital VFX on certain scenes. Those scenes were scanned at 8K and delivered to the visual effects house, which worked at 6K. The 6K files were then printed back to film negative and spliced in with the prints that had come from camera negatives.

Dunkirk. © Warner Bros.

There was also yet a third complete scan in order to provide digital delivery as well, since many theaters have moved to digital projection only. That scan was done after the film had been color-timed photochemically, though, so there was no need to do any digital color manipulation other than what was necessary to match the look of the digital files as closely as possible to the look of the all-analog version.

(I’ve barely scratched the surface of the extraordinarily complicated workflow on Dunkirk. I recommend this article from the ASC and this interview with Steve Hullfish if you’d like to dive deeper.)




I mentioned above that every film used Avid’s DNx offline codecs for editorial, although a couple of these films added a few extra twists to the process.

Most of these films followed a fairly standard dailies process. For those films that shot on 35mm, the negatives were immediately scanned to digital files, and then shown to the production team digitally, either by sending a physical hard drive or via a remote dailies viewing system. For the films that shot digitally, the files were transcoded and then delivered to the production team via hard drive or via the cloud. Processing dailies is never “simple”—color needs to be properly managed, sound synced, and metadata carefully managed—but the process is fairly standardized.

Two of these films had particularly complicated dailies workflows, though for completely different reasons.

For Baby Driver, the challenge came from the fact that the film was timed extremely precisely to the soundtrack. The main character is constantly listening to music on his earbuds throughout the film, and every scene is precisely choreographed to fit the music. Every shot had to be timed in order to match this soundtrack, which was piped to the actors via hidden earpieces.

The cast of Baby Driver wore earpieces in order to move in sync to the music.

That obviously presented a huge synchronization challenge, which they addressed by referring to detailed animatics, matching the live-action footage, shot by shot, to the pre-edited animatics. To be sure they were hitting their beats precisely, it was necessary to edit the film in real time, dropping each take into the animatic timeline to make sure that everything lined up as planned.

Since the film was shot on 35mm film, they weren’t able to use the “real” dailies to do this on-set check. Instead, the film’s editor Paul Machliss received a signal from video village, captured in real time using Qtake’s video assist software. Qtake recorded ProRes files, which could be imported into Avid on Paul’s MacBook Pro, but Paul would then transcode those files in the background to DNxHD 36, stored on an external hard drive.

So Paul was building his assembly edits with these temporary dailies, but he was also receiving the dailies that had been scanned from the captured 35mm negatives. Since his temporary dailies were lower quality, he swapped them out for the 35mm scans as soon as they arrived. And because it takes a few days for negatives to be mailed from Atlanta to Los Angeles, scanned, processed, and then sent back to Atlanta, Paul’s on-set assistant had to swap out temporary dailies for “real” 35mm-scanned dailies on a daily basis. (That was a lot of dailies.)

Since these temporary dailies were coming from a video tap which recorded independently from the film camera, the clips coming in from the 35mm scans didn’t match the temporary dailies precisely. So the new dailies would have to be carefully linked back into the Avid timelines, replacing the temporary dailies without losing sync. Fortunately, reliable time-of-day timecode saved the day, allowing much of the relinking to happen smoothly.
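The relinking logic can be sketched roughly like this (a hypothetical illustration, not the production tooling; the clip names, field names, and frame rate are all invented for the example). The idea is that because both recordings carried the same time-of-day timecode, each video-tap clip can be matched to the scanned clip whose timecode range contains it:

```python
# Hypothetical sketch: relink temporary (video-tap) dailies to 35mm-scanned
# dailies by time-of-day timecode. The two recordings started at slightly
# different moments, so we match by range containment rather than exact start.

FPS = 24  # assumed project frame rate

def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def find_scan_for(temp_clip, scanned_clips):
    """Return the scanned clip whose timecode range covers the temp clip's start."""
    start = tc_to_frames(temp_clip["start_tc"])
    for clip in scanned_clips:
        first = tc_to_frames(clip["start_tc"])
        last = first + clip["duration_frames"]
        if first <= start < last:
            return clip
    return None

scans = [
    {"name": "A001_C003", "start_tc": "14:03:00:00", "duration_frames": 720},
    {"name": "A001_C004", "start_tc": "14:05:30:00", "duration_frames": 960},
]
temp = {"name": "QTAKE_0042", "start_tc": "14:03:10:12"}
match = find_scan_for(temp, scans)
print(match["name"])  # -> A001_C003
```

In the real workflow the NLE handles this matching during relink; the sketch just shows why shared, reliable time-of-day timecode makes the swap nearly automatic.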

(edit: we earlier reported that the temporary dailies came in at 25fps. That was incorrect: Paul had to work with proxies from a 25fps video tap on The World’s End, but not on Baby Driver. We regret the error.)

Dunkirk’s dailies workflow was even more complicated than Baby Driver’s workflow, though without the real-time component. All of the complications in Dunkirk‘s workflow were due to the fact that the production was shooting simultaneously in two different non-standard types of physical film, and Christopher Nolan wanted to view his dailies projected by a real film projector using a third non-standard type of film.

In the days before digital capture and digital intermediate, everyone had to view their dailies on film—that element was just a throwback to the old days, nothing particularly new. The tricky part was that Christopher Nolan wanted to shoot part of the film on IMAX 65mm, part of the film on standard 65mm, and to view his dailies at IMAX sizes.

In spite of the similar-sounding names, IMAX 65mm and standard 65mm have different sizes and different aspect ratios. The names are misleading because IMAX 65mm is 65mm tall, while standard 65mm is 65mm wide, resulting in two completely different types of film being known by the same number.

Comparative sizes of standard 35mm film, standard 65mm film, and IMAX 65mm film

Standard 65mm film is as wide as IMAX 65mm film is tall.

Incidentally, this is exactly the same reason why 35mm cinema film and 35mm still camera film are not the same size or shape. 35mm cinema film is measured by the horizontal side, whereas 35mm still camera film is measured by the vertical side.

Also, you may have heard that the film was projected at IMAX 70mm, not IMAX 65mm. This is another issue of confusing terminology. The image is exactly the same size in both cases—the extra 5mm is used by the soundtrack during projection. So you always capture IMAX at 65mm and display it at 70mm, but the image is not scaled or cropped in the process. Same thing with standard 65mm film—it’s projected at standard 70mm (not IMAX 70mm).

Yeah, confusing.

Ok, back to Dunkirk. Christopher Nolan wanted to view his dailies on physical film, and since he was shooting two different formats, he received dailies in two different formats. All of the standard 65mm shots were printed to standard 70mm, requiring the production team to carry around a 70mm projector with them. Given the amount of material shot, it was simply too expensive to create IMAX prints of all of the IMAX footage. Only selected shots were reviewed at IMAX 70mm—most were reduced down to fit onto a standard 35mm print.

Because of the time required to process the 35mm reductions, they had to skip syncing sound on all of the IMAX dailies. Fortunately, all of the dialogue scenes were shot on standard 65mm, and so they were able to view dailies with sound on the dialogue scenes.

The dailies on The Shape of Water were much simpler technically, but they had to keep a very tight ship because the editorial team worked at the studio where the film was mostly shot. Director and co-writer Guillermo Del Toro would drop in frequently, usually during the lunch break, which meant that the assistants had to work very quickly in order to ingest the previous day’s footage, allowing the editor Sidney Wolinsky time to work with the footage and have something to show Guillermo by lunchtime.

Distributed Workflows


There aren’t many film labs capable of handling all of the complexities of Dunkirk’s workflows (and I’ve skipped over several other tricky bits), so they had to ship all of the film from Europe, where most of the film was shot, to LA, where IMAX and FotoKem processed dailies, mailing copies back to Europe for the production team. With all of that turnaround time, dailies took around a week to return to production. (Can you still call them dailies if they take a week? Not sure.)

Call Me By Your Name’s team was also distributed around the world, but where Dunkirk’s workflow was complex, theirs was simple, and where Dunkirk’s workflow was simple, theirs was complex.

Dailies were much simpler for Call Me By Your Name since they were working with standard 35mm film. They did have to send dailies from Crema in northern Italy to Rome for development and scanning, but the workflow remained entirely digital from that point on, greatly facilitating their distributed postproduction workflow.

Call Me By Your Name © Sony Pictures Classics

The editorial team remained in Crema, but the rest of the postproduction was scattered across the world. The color grading was done remotely in Thailand, the sound mixing in France, the VFX and sound design in Rome, and the ADR in LA. Director Luca Guadagnino was even able to direct the LA ADR sessions from a suite in Italy: synced Pro Tools sessions let the director and editor listen in live as the actors delivered their performances in LA, watch correctly synced video, and give direction as though they were in the same room.

Get Out’s team was much less spread out than Call Me By Your Name’s, but they still made use of remote collaboration tools, as did many of the films on our list. Matthew Poliquin at Ingenuity Studios, which produced Get Out’s (mostly invisible) VFX, used Frame.io’s cloud-based review and feedback tools even for internal communication within the same building.

While it’s hard to argue against the benefits of having the director and editor in the same room for the editorial process on a feature film, cloud platforms like Frame.io make it easier to communicate and provide feedback asynchronously. That’s especially true for someone like Matthew working on VFX, who needs to coordinate feedback for many artists working separately, using Frame.io as a single, central platform to track feedback on various drafts of each shot.

The “Sunken Place” scene in Get Out was one of the most challenging for the VFX team at Ingenuity Studios. Find out why.


What Really Matters


As fulfilling and exciting as it may be to learn about the workflows behind these amazing Oscar-nominated films, at the end of the day, it’s not about the awards. Christopher Nolan doesn’t watch IMAX-sized dailies and Edgar Wright doesn’t meticulously choreograph car chases to music to win an award. I’m sure everyone we talked to on this list (and their collaborators and counterparts on the films not nominated) would all say the same thing—they do what they do for the love of the craft and the pursuit of excellence. That pursuit is what drives every passionate artist. It’s what drives us to create amazing software (or coordinate dozens of interviews with hard-to-reach professionals to bring you the best possible blog.) And it’s our sincere desire that this information will inspire you to do the same. And if you happen to win an Oscar along the way, that’s just icing on the cake.

Fourteenth time’s a charm! Imagine if Deakins were only in it for the awards. He finally won Best Cinematography!




Thank you very much to the busy people who took the time to answer our questions about these films: Paul Machliss, Steve Jacks, Tommaso Gallone, Francesca Addonizio, Mary Juric, Chema Gomez, Doug Wilkinson, Cam McLauchlin, Trevor Lindborg, Nicholas Lipari, John Lee, Crystal Platas, Nick Ramirez, and Matthew Poliquin.

14 Secrets of Movie Trailer Editors

An interesting read from Mental Floss BY JAKE ROSSEN.

JANUARY 26, 2017



Decades ago, Hollywood used to put previews of their coming attractions after the conclusion of their theatrical releases. The teasers earned the nickname “trailers” because they followed the feature film.

Today, trailers aren’t such an afterthought. Studios spend millions of dollars stirring up anticipation for their big-budget movies by releasing trailers that promise consumers something worth the hassle and expense of a ticket. The responsibility for taking the most dazzling 120-odd seconds from hours of footage and splicing it into a coherent—and compelling—mini-movie falls on trailer editors, who screen films months in advance in order to create previews that will build the viral buzz filmmakers look for.

To better understand the job, mental_floss spoke with several editors at three of the most highly respected firms in the business. Here’s how they get you excited about the next blockbuster.


If you think studios are worried about rough cuts of their films falling into the wrong hands, you’d be correct. As some of the few pairs of eyes outside of the production to see a movie months before release, trailer houses must make sure their offices can’t be tapped by potential pirates. Ron Beck, the owner and creative director of Tiny Hero, says that only employees at Fort Knox might be able to relate to the level of security that trailer editors deal with. “There are cameras everywhere,” he says. “We have sensors that record everyone who goes in and comes out of a door.” Rough cuts of movies typically get delivered on encrypted hard drives and are edited only on hardware that’s inaccessible to an open network.

“All of [the studios] are careful, but Marvel leads the pack,” Beck says. “Their stuff is super-strong. That’s why you rarely see their movies pirated.”


In order to begin work on marketing campaigns, trailer firms are usually given extremely early footage that has yet to be polished and edited. Rough cuts might emphasize plot points or characters that wind up getting minimized by the time the picture is done, or “locked.” David Hughes of the UK-based firm Synchronicity says he’s seen a few movies that he barely recognized once they hit theaters. “Bridget Jones’s Diary was quite dark at one point,” he says, “and I recall a totally different opening to Bowfinger where the film-within-the-film was called Star Wars rather than Chubby Rain because the accountant who wrote it was so stupid he didn’t know a film called Star Wars actually existed.”

Since films continue to get pared down right up until release, it’s also common to see scenes in trailers that don’t ultimately make the final cut. “Dirty Rotten Scoundrels [is] my favorite example, because someone wrote to complain that they had waited the whole film to see Steve Martin push an old lady into a swimming pool, as seen in the trailer, only to find that the scene wasn’t in the finished film.”


Because editors see films so far in advance, they’re often looking at footage full of green screens and unfinished effects work. But if an editor feels like a scene would bolster the trailer’s impact, they can request the studio fast-track the CGI. “We can’t ask what they shoot first, because productions usually revolve around an actor’s schedule,” Beck says. “But we can ask for visual effects stuff we need to be done first.”


Daniel Lee, who spent 10 years at Mark Woollen and Associates before migrating to the buzzed-about firm Project X, says that editors are often called upon by directors or producers to splice together a “sizzle reel” made out of stock or existing footage in order to sell a studio on a movie. “It’s becoming increasingly common to do,” he says. “It’s an inexpensive way to sell someone on the vibe of a movie.” Director Joe Carnahan commissioned a reel when he was looking to direct a theatrical version of Daredevil (above).


For last summer’s Terminator: Genisys, fans who viewed the trailer were slightly annoyed to learn—spoiler—that perpetual victim John Connor was a Terminator in yet another revision of the franchise’s confusing canon. But those edicts usually come down from the studio, according to Beck. “I like to tease, not tell,” he says. “In certain movies, though, you have to give it up, or the trailer won’t even be good. Revealing a twist is ultimately the studio’s decision, though.”


Trailers are often the result of other trailers that studios noticed were particularly effective in engaging an audience emotionally. One example: the preview for 2003’s Texas Chainsaw Massacre remake. “The one that always comes to mind is the trailer for the Michael Bay-produced remake of The Texas Chainsaw Massacre, where black frames were inserted off the beat to disorienting effect,” Hughes says. “This technique has been borrowed for many horror trailers since, including some that we’ve made.”

Another trend-making trailer: the one for 2010’s Inception, with its thunderous “braam” sounds that seemed to influence every heavy action/drama film that followed.


Because trailer content is subject to many of the same ratings restrictions as the feature film itself, editors often have to cut around some of the Motion Picture Association of America (MPAA) mandates. If a trailer is a “green band,” or suitable for general audiences, that means no threatening people with firearms. “There’s a lot of minutiae, like where a gun can be pointed,” Beck says. “You can’t have someone pointing it straight at the camera, for example, or at anyone in the same frame. Sometimes we blow up [zoom] a frame to hide stuff like that.”
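That “blow up a frame” trick can be illustrated with a toy sketch. This is purely my own illustration, not the firms’ actual tooling: the frame dimensions and the `punch_in` helper are invented for the example. Re-framing is just a center crop, which a real pipeline would then rescale back to full resolution (for example in an NLE, or with OpenCV):

```python
import numpy as np

def punch_in(frame, zoom):
    """Center-crop a frame by a zoom factor; the offending edge of the
    image (e.g. a gun pointed at the lens) falls outside the new crop."""
    h, w = frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    top, left = (h - ch) // 2, (w - cw) // 2
    return frame[top:top + ch, left:left + cw]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy 1080p frame
cropped = punch_in(frame, 1.1)  # ~10% punch-in pushes the edges out of frame
print(cropped.shape[:2])  # → (981, 1745)
```

The crop would then be upscaled back to 1920×1080, trading a little sharpness for an MPAA-safe composition.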


Studios looking to reach the widest possible audience sometimes like to hedge their bets on campaigns and enlist two different trailer vendors to create edits for the project. They’ll focus-test each and back the one with the most support. That’s not unusual, but what irks editors, Lee says, is when a studio’s marketing department decides to split the difference and create a trailer based on ideas from two different creative entities. “They might combine trailers,” he says. “We call that Frankensteining.”


Because editors have precious little time to communicate the theme or premise of a movie, having a line or two of dialogue that summarizes a character’s motivation can make all the difference. Unfortunately, not all movies come stocked with exposition. If a trailer needs a clarifying line and the actor isn’t available to record dialogue, Beck can go in and splice together sentences from words he’s already said. “We might use a sound-alike actor, or we might see if we can form whatever sentence with the lines we have. We could make ‘I need to find her’ from someone saying ‘Find her’ and ‘Need to.’”
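The word-splicing idea Beck describes comes down to rearranging word-level clips on a timeline. A minimal sketch, with heavy assumptions: short sine bursts stand in for recorded words, and the `splice` helper is invented for illustration, not a real trailer-house tool. Reassembling a line is just concatenating the clips with small silences between them:

```python
import numpy as np

SR = 48_000  # sample rate (Hz)

def tone(freq, dur):
    """Stand-in for a recorded word: a short sine burst."""
    t = np.linspace(0, dur, int(SR * dur), endpoint=False)
    return np.sin(2 * np.pi * freq * t).astype(np.float32)

# Pretend these are word-level clips isolated from existing dialogue.
words = {
    "i": tone(440, 0.2),
    "need to": tone(220, 0.4),
    "find her": tone(330, 0.5),
}

def splice(order, gap=0.05):
    """Concatenate word clips with short silences, like rearranging
    dialogue on a cutting-room timeline."""
    silence = np.zeros(int(SR * gap), dtype=np.float32)
    pieces = []
    for w in order:
        pieces.extend([words[w], silence])
    return np.concatenate(pieces[:-1])  # drop the trailing silence

line = splice(["i", "need to", "find her"])  # "I need to find her"
print(round(len(line) / SR, 2))  # total duration in seconds → 1.2
```

The hard part in practice is of course not the concatenation but matching tone, room sound, and cadence so the seam is inaudible.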

If all else fails and an actor is needed, Hughes says there’s one relatively quick fix. “If you’ve seen a film in the last five years, you’ve probably seen a film in which at least one line of ADR [Additional Dialogue Recording] was done on an iPhone after the actor had left the set.”


Studios love when fans of film franchises dissect trailers to spot hidden references or clues. So do editors, but sometimes the Easter eggs they drop in are going to be hard for anyone outside of their family to catch. “I know a few editors, myself included, who try to slip in their voice in a piece,” Lee says. “That’s only if you have enough time to fiddle with it.” Lee’s two kids lent their voices to a sound mix for World of Warcraft: Looking for Group, a documentary about the game. “I don’t know if they made the final cut, but they’re in there.”


Of all the film genres he’s overseen, Hughes believes comedies that don’t hit the mark are his worst assignment. “I’ve made trailers for comedies where there were literally not enough jokes in the film to fill a trailer,” he says. “Going back in the mists of time, I remember the trailer for Beverly Hills Cop III having one joke in it, Serge saying something sarcastic about Axel Foley’s shoes, and then they cut that joke out of the film.”


Fall down the YouTube rabbit hole and you’ll find thousands of movie trailers cobbled together by hobbyists outside of the industry. While many might underestimate the work and craft involved in doing it professionally, a few have been able to use it as a launching pad to get noticed. “I know one or two editors who got careers because of their YouTube channels, where they were uploading stuff completely as a hobby,” Lee says.


Beck believes the majority of a trailer’s impact can be chalked up to how the images fit with the music selection. “Music is at least 50 percent of any trailer,” he says. With access to unreleased tracks from music labels, Beck will go jogging with his earphones in to sample tunes, even though he might not find a perfect visual fit for a song for months. “I’ll picture a scene and maybe see something like it a year or so later. And then I’ll go, ‘Oh, I’ve got just the song for this.’”


Ever since voiceovers for trailers largely went out of style, editors have needed to keep viewers oriented in other ways. But that doesn’t mean they can’t cheat a little. Beck says that editing a trailer for anything containing Morgan Freeman is like having a narrator. “We did Now You See Me 2 recently, and when I knew we had Morgan Freeman in the movie, I knew the whole trailer was going to be driven by him saying his lines. He’s like the voice of God.”

Another go-to performer: Ryan Gosling. Why? “He just nails it,” Beck says. “He can convey a meaning or moment so quickly that you can use it in the trailer. You’re trying to do so much in a short amount of time, and when an actor is emotive, it makes my job easier.”

Oscar-nominated editors clear up the biggest category misconception

Some great insights on what an editor does, by Mandi Bierly, EW.com.

Leading up to Sunday’s Oscars, EW.com will take a closer look at four categories that moviegoers may mistakenly think of as “technical.” First up: Film Editing, with insights from Life of Pi’s Tim Squyres, Silver Linings Playbook’s Jay Cassidy and Crispin Struthers, and Zero Dark Thirty’s Dylan Tichenor and William Goldenberg, the latter of whom also cut Argo, making him one of only a handful of editors in Oscar history to compete with himself.

Ask a film editor what the biggest misconception is about his or her role, and the answer is the same: “It sounds funny, but a lot of people tend to think it’s a purely technical job where you literally go in and cut slates off, and the director says, ‘Do that, do that, do that,'” says William Goldenberg, Oscar-nominated this year for both Argo and Zero Dark Thirty and previously for The Insider and Seabiscuit. What will surprise those moviegoers then is just how many decisions the editor actually makes — and when. Let’s start with an overview:
• The editor begins work when cameras start rolling, not after they stop, and typically does the first cut of the film on his or her own. “This is something that people always seem surprised to find out,” says Tim Squyres, who’s edited every film Ang Lee has directed but Brokeback Mountain and received an Oscar nomination for Crouching Tiger, Hidden Dragon before Life of Pi. “If they start shooting on a Monday, and I get the footage on Tuesday, Ang is shooting another scene on Tuesday, so he can’t be in the editing room. So the editor always does the first pass by themselves. I cut scenes and show them to Ang, and I usually don’t get any feedback, because all he needs to know from me while we’re shooting is whether the scene was covered. If he feels that he has everything he needs, he forgets about it and worries about what he’s shooting tomorrow. About two weeks after the end of shooting, we sit down and watch the whole movie as a movie, and he hasn’t seen it yet. He’s only seen it scene by scene. That’s the way it has to work. Some directors are more involved than others in editing during production — it partly depends on schedule and partly depends on the director’s preferences.”
Goldenberg, for example, who’d previously cut Gone Baby Gone for Ben Affleck, went to the Argo director’s home editing room every Sunday, even during production, to show him his week’s worth of work. “Even though I wasn’t getting specific notes from him, I was getting a feel for what he wanted. It was almost like by osmosis: just having all his conversations in my head gave me a feeling of like, Oh, I know Ben would hate this or I know this isn’t what he’s looking for.” Affleck turned over nearly 1 million feet of film, including a noteworthy amount of footage of a parrot being enticed to squawk for the tense airport finale (which Goldenberg will dissect for us later). “It was really hilarious, because you couldn’t see Ben, but you could hear him off-camera. He’s just squawking and squawking and squawking, and then the bird would finally do it, and he would squawk over the bird or be talking over it,” Goldenberg says. “It was a lot of bird.”
Zero Dark Thirty director Kathryn Bigelow, meanwhile, delivered roughly double that amount — or about 320 hours of footage — thanks to her fondness for shooting multi-camera (maybe seven at once on big scenes). It would have been impossible for one editor to handle that volume on the clock, so Goldenberg joined Dylan Tichenor, previously nominated for There Will Be Blood, at the end of shooting. “We were in a little house in Studio City. He was in the master bedroom and I was in the living room,” Goldenberg says. They worked on separate scenes, but consulted with one another and sometimes swapped, like after Goldenberg spent his first month on the climactic raid and showed a 45-minute first cut. “We kinda all knew it was too long, but Kathryn just needed to live with it for a while and sort of enjoy it in all its grandeur,” Goldenberg says. “Dylan did a pass where he was able to make it shorter, where I think if I had done it, somehow Kathryn wouldn’t have accepted it as easily. I don’t know why that happens.” Offers Tichenor, “That’s one of the great things about having more than one editor: Someone passes you the sequence and then you look at it fresh and go, ‘Oh, what if we did this, this, and this? And maybe this makes more sense.’ You have this objectivity because you haven’t sat and gone through the work. That helped us enormously in this film.”
What kinds of decisions are the editors making? We asked them to walk us through scenes to show us.
Choosing the right take
As Life of Pi’s Squyres puts it, “An actor might read a particular line between six and 100 times, but only one take’s gonna be in the movie.” It’s the editor’s job to pick it, at least initially. It was on Sense and Sensibility, his fourth film with Lee, that Squyres had the epiphany all good editors experience: “It was the first film that I had done with Ang that was all in English, and it’s Emma Thompson, Kate Winslet, Alan Rickman, and Hugh Grant — these great, great actors. When you get footage like that, you realize that your job is really not technical. It was my job to look at something that Emma Thompson had done and say, ‘Eh, that’s not good, I’ll use this other one instead.’ And not only was I allowed to pass judgment on these tremendous actors, I was required to. I think every editor gets to that point where you go, ‘Oh, they’re actually asking me to be an artist.’”
Great directors and actors give editors a range of performances to choose from. “So part of the editor’s job is to go through these performances and calibrate which is the right tone, which is the right level of intensity to use,” says Crispin Struthers, a first-time nominee for David O. Russell’s Silver Linings Playbook (the first film in more than 30 years to have acting nominees in all four categories). The early scene he and fellow editor Jay Cassidy, a second-time nominee after Into the Wild, point to is the fight Pat (Bradley Cooper) and Pat Sr. (Robert De Niro) have when a hopeful Pat returns from a run and a visit with his friend Ronnie, and tries to call his ex-wife, who has a restraining order against him.
Cassidy: There was a tremendous amount of dialogue that was added on the set. He tries to call his ex-wife and he gets a disconnected signal, and then he tries to call Ronnie and leaves a message for Ronnie, and Pat and his father begin to fight over the handset of the phone, and the fight goes on for about 2 minutes. David shot a tremendous range of performance. In doing the first cut, we tended to do the most extreme version of a scene first. The Bob De Niro in that first cut, boy, you saw a Bob De Niro from Raging Bull. It was creepy.
Struthers: It was intense. A bit too intense. (Laughs)
Cassidy: And it was five minutes long. It could never stand at that length, but it’s the only way where the editing is a continuation of a discovery on the set of how this behavior should be calibrated in the film. You sort of begin with the extreme and then peel it back. What was interesting is that it got peeled back to about a 2 minute and 26 second scene, and then we didn’t change it for about 2 months, and it was always about De Niro and Bradley fighting over the phone and “Take your medication, take your medication.” Then we got to the first preview, and we took it out completely. So the first time we previewed the picture, De Niro simply complains about the Redskins beating the Eagles and the doorbell rings, and it’s 12 seconds long. This 5-minute scene got squeezed down to 12 seconds. In the second preview, we put some pieces of it back, and it was 23 seconds. And then the third preview, it got a complete recut, and we chose new performances, especially on De Niro, and they were kind of a gentler Bob. What David had said he was looking for was Bob’s anguish at seeing his son so manic, as opposed to his anger at not taking the medication. So in the third preview, all the dialogue about “Take your medication,” “I don’t need my medication” was all taken out, and it simply became about “Don’t behave this way,” and it was really about the character’s pain at seeing his son in this fashion. And then that version got recut again for the seventh preview, and then the final cut of the scene was done in July.
Struthers: As Jay says, we went for the most intense, full-on Robert De Niro amazing performance first. And then we realized this was too much too soon for this film. We couldn’t peak this early and build up to crescendos that were gonna come later. That’s why we initially overreacted by pulling it back to almost nothing, before coming back to the kind of Goldilocks scene that we arrived at, which did what it was meant to.
Cassidy: Also, when we looked at the scenes before it, we realized that before Pat’s run, they had argued already about taking the medication, so if you left all of Bob’s protesting in about “Taking your medication,” it almost felt redundant. I remember when we changed the takes, it was like, “Oh, we’re looking for this thing on Bob that’s the pain of it,” and there was one take in particular. Pieces of that take had been used, but only when you took out the other material, the “Take your medication, take your medication,” could you see what De Niro was doing. Fantastic.
Believe it or not, Struthers adds, that later bedroom scene of Pat Sr. breaking down (pictured) while telling Pat that he wants to do everything he can now to help him get back on his feet was, at one point, not in the movie, as they were looking to trim. They weren’t sure they could go from Pat Sr. being that emotional with his son upstairs to Pat Sr. giving his son a pep talk about attending the Eagles game downstairs. “We were thinking, Gosh, you can’t have both of these De Niro scenes right next to each other,” Cassidy says, “but you know what, you could.”
The tonal balancing act was also at the center of Argo, and never more delicate than in the sequence when Alan Arkin’s character organizes a read-through of the fake hit Argo at the Beverly Hilton.
Goldenberg: It’s a microcosm of the whole movie in a way, because we’re combining all these different tones in this three-minute sequence: the read-through of the fake script, which is with these actors in silly costumes saying kind of cheesy dialogue; combined with a mock execution with the hostages in the Embassy; combined with what our houseguests are doing; combined with stock footage of newscasters at the time. I always felt when I read the script — and I think Ben and [Oscar-nominated screenwriter] Chris Terrio felt the same way — that if we could make that sequence work in terms of mixing all these tones, then it was indicative of how it could work for the whole movie.
I did it over and over and over again until it felt right, moving things around — some things were subtle, some things were big moves like switching sections to not juxtapose anything silly with anything incredibly dramatic and life-threatening. I remember looking at it at some points while I was first cutting it and thinking I had it really good, and then looking at it and going, “I hate this. It’s not working at all.” And then I’d work on it some more. From the point where I hated it to where I loved it, it was a matter of just subtle adjustments. It really does show the subtlety of editing and how little things can upset the whole thing. Michael Mann [for whom Goldenberg cut Heat, The Insider, Ali, and Miami Vice] always referred to it as mercury: you gather it up and then one little part moves off to the side. That’s kinda how you feel when you’re cutting, especially something very sensitive like a lot of the scenes in Argo, where it was so easy to go sideways and mess it up.
The hard line Bigelow drew for the tone of Zero Dark Thirty is, perhaps, best explained in the sequences she didn’t use.
Tichenor: She only reacts to things when they feel real to her. I mean, there are scenes cut out of the movie and they were expensive, and they took a long time to shoot, but it just wasn’t part of the movie for her. One of the very early sequences involves Ammar, our detainee played by Reda Kateb, who did so well in the movie. There was a whole sequence where Daniel, Jason Clarke’s character, goes in with a local ISI team, and on a tip, they’ve infiltrated this little town and find him in a house and extract him. There was a little action sequence with a little chase through the house and an alleyway, and they grab him. Kathryn saw that first cut, and the sequence played really well, quite like a Michael Mann movie, I thought (Laughs), and she just said, “You know, when I see this, I feel like someone is showing me a movie. I don’t want to feel like I’m watching a movie. Let’s try to do it without it.” And we did, handily. It made one of the most striking cuts in the movie, actually, to go from the 9/11 montage of sound in the beginning to just the hard cut of the sun streaming through the window in the detainee center and Jason walking in. It’s a stomach-clenching moment.
Keeping the story rolling when everything is happening, and when you’re simply adrift
An editor wants to draw the audience into the story and keep them there. Tichenor paraphrases a quote he’s sure others have said, but that stuck with him when he heard it from Goldenberg’s mentor, Michael Kahn, who’s won three Oscars and is nominated again this year for Lincoln: Editing is often trying to find the least amount of material to effectively tell the story. “Audiences will react, even unconsciously, very badly to repetitive information. If they feel like, ‘Yeah, I got that already, you don’t need to show that to me again,’ then they start to get shifty, or bored, or you lose the tension that you’ve gained up to that point,” Tichenor says.
The Zero Dark Thirty editors point to the section of the film when they’re tracking Osama bin Laden’s courier Abu Ahmed — from the moment they get his mother’s phone number from a Kuwait informant to where they actually find him in his little white SUV driving around Pakistan. “Being able to tell that story clearly, having the audience track along with it, and having it build to a culmination was, I think, one of the more difficult sections of the movie,” Goldenberg says. “Hopefully it doesn’t appear that way, and it’s exciting and great to watch, but there were so many different ways to go…. There was one day where Kathryn and I had moved some stuff around and taken big sections of it out trying to accelerate it, and we thought we had just like done it and we were so excited. And then she was in the other room, and I looked at it more carefully, and I realized, Oh this doesn’t really make sense, because how does this person know that? And the look of disappointment on Kathryn’s face. I was so heartbroken to go to tell her. You felt like it was a punch in the stomach.” (How do they know when they’ve finally got it right? For that, Goldenberg likes to quote director Tony Scott, for whom he cut 2005′s Domino. “He would look at a scene and say, ‘Something itches.’ He didn’t know what it was, but something would bug him, and you’d go attack that area. You’ve just got to work until it doesn’t bug you anymore.”)
Argo was, obviously, another film that had built-in tension. “Even watching the dailies, my stomach was in a knot,” Goldenberg says. “I remember watching the dailies where they’re all waiting at passport control and I sent Ben an email saying, ‘Just watching the dailies makes me anxious.’ And I think he somehow misinterpreted my email as saying it was a lot of film and I was anxious about it. He’s like, ‘Don’t worry, we’ll take our time, and we’ll get through it all.’ And I was like, ‘No, no, no. I’m anxious because the footage is so good it’s making me feel anxious.’ I knew it would only get better as we cut the pieces together.”
When editors present first cuts of scenes to directors, they like to include rough ideas of sound effects and temporary music, so it feels like a film. It was Goldenberg’s idea to have the sound slowly fade out in the airport to make the audience feel as though they were inside the houseguests’ worried minds:
Goldenberg: What gave me the idea was when Ben Affleck’s character comes into Tehran [earlier in the film], he’s at passport check-in and there’s a little skirmish off to his left. Some guy gets hauled away by the police, and then they stamp his passport, and I made the sound of that passport stamp a little bit accentuated. He’d gotten a little distracted, so it snapped him back to attention, and he clears passport control. When they went through that check point on the way out, I thought it would create a whole different level of tension to slowly drop the sound out. I used this sort of tonal temporary music that had this droning heartbeat feel to it. All the characters were so convincing at looking scared, so I had great shots to cut to and cut away from. I think when you’re super nervous like that, your heart is beating out of your chest and you’re trying not to give that away. It’s not the first time anybody has ever done it in a movie, but dropping the sound out subtly really gets you in each character’s mindset and feeling how terrified they were. Then I used that sound again of the passport stamp to snap everybody back out and bring all the real sounds back. [Argo’s Oscar-nominated sound mixing team] was able to take what I did in my Avid, and just make it even better. Instead of a small editing room, it’s got to fill a big theater, so they were able to take that idea and really just make it even more impactful.
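At its core, the trick Goldenberg describes is an amplitude envelope ramped down over several seconds. Here is a toy sketch under stated assumptions: synthesized noise stands in for the airport ambience, and the `slow_fade_out` helper is invented for illustration, not anything from the Argo mix:

```python
import numpy as np

SR = 48_000  # sample rate (Hz)

# Stand-in for location ambience: 8 seconds of low-level noise.
rng = np.random.default_rng(0)
ambience = rng.standard_normal(SR * 8).astype(np.float32) * 0.1

def slow_fade_out(audio, fade_seconds):
    """Ramp the last `fade_seconds` of the track linearly down to silence,
    leaving everything before the fade untouched."""
    out = audio.copy()
    n = int(SR * fade_seconds)
    out[-n:] *= np.linspace(1.0, 0.0, n, dtype=np.float32)
    return out

faded = slow_fade_out(ambience, 5.0)
print(len(faded) / SR)  # → 8.0 (duration unchanged; only levels move)
```

A real mix would use gentler curves and per-element fades, but the principle is the same: pulling the world’s sound away puts the audience inside the characters’ heads, and snapping it back (Goldenberg’s passport stamp) releases the tension.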
Because of all the visual effects in Life of Pi, Squyres and Ang Lee had to find a number of different ways to be able to sit down and watch it like a movie. “We developed a lot of things to put the crude animation and crude backgrounds in right away so that we didn’t have to sit there and watch a boat with no tiger in it and walls of the wave tank with no sky and no ocean,” Squyres says. Still, that wasn’t the toughest part. Nor was editing in 3-D glasses for two years. It was structuring the story — cutting back and forth between the storytellers in the present day and the story that’s being told in the past, and then the extended flashback to Pi’s journey. “Our main character is drifting at sea. He’s not planning the bank robbery or trying to coordinate all kinds of things — stuff just happens when it happens, without any real cause and effect. So to give the audience a sense of being adrift at sea without them feeling like the movie is adrift is actually quite tricky.”
The scene Squyres points to is when Pi gets the stick and tries to train Richard Parker.
Squyres: We had a sense of what their interaction was going to consist of with the stick and the meat, but rather than animating what we thought the tiger might do, we figured we would work with real tigers and let them show us things. We shot about 4.5 hours worth of footage with real tigers on a boat. Now the boat was not in the water, the boat was on a gimbal, the thing that rocks it to simulate water. As our tiger trainer said, “Tigers don’t act. Tigers just behave like tigers.” They gave us all kinds of interesting behaviors, some of which was great reference for the animators, some of which really informed our thinking about what their interaction would be. There’s a shot in that scene where the tiger sharpens his claws on the bench. We never planned that, that’s just what the tiger did. And according to the tiger trainer, that’s a nervous tiger trying to pretend he’s not nervous. So I got all this footage, went through it myself, and then the tiger trainer came in with me for a couple of days and we structured out some possibilities for how the scene could work. Ang came in, and we presented that and figured out what their interaction was going to be. Three weeks after we shot the tigers, we shot the part with our actor, because at that point, he knew, and the animation supervisor knew, exactly what the scene was going to consist of. There are 23 used shots in the film with real tigers, 10 of them in that scene.
Now that you know what an editor does, here’s the kicker: “It’s not always the case that editing should be invisible — it depends on what kind of movie you’re making — but generally speaking, if you’re watching the editing, you’re not watching the movie,” Squyres says. “Ideally, when you’re editing, you’re doing kind of what you would do if you were in the room watching the scene that’s going on. When you cut to something new, it should seem like what you’re seeing now is better than what you would have seen if you stayed where you were. You’re getting some new piece of information that keeps you engaged and involved in the scene. And if it does, then you just watch the story and enjoy it.”

More Bodies, Fewer Cuts — Editing John Wick Chapter 2

By Evan Schiff in Timeline Tuesday, Video Editing


The day I interviewed for John Wick Chapter 2 was the first day of production. For whatever reason, hiring an editor had been delayed until cameras were already rolling, and Chad Stahelski, the director, called me for my first interview during lunch on Day 1. By the time I officially landed the job, flew out to NYC and got set up, it was Day 11. To begin a film of this size 11 days behind, with a director and producers you’ve never worked for and who don’t know you, was stressful to say the least. But as I sat in my new office scrolling through all the footage they had already shot, it became obvious quickly that I was about to be a part of something awesome.

Editorial Style

One of the things that the first John Wick is known for is its relentless, wide view action. Chad and David Leitch, who co-directed the first John Wick, wanted you to see how the action was being done, to see that it was Keanu doing it, and to make it clear that nothing was being hidden behind tricky edits or shaky cameras.

John Wick Chapter 2 follows the same ideology. As the action unfolds, we intentionally stay wider and hold on shots longer than a viewer might be used to. Chad’s edict to me when I first started assembling the action was that he never wanted to go close on the action unless we were forced to by some other problem, and I think we succeeded in doing that. Keanu and the whole stunt team are so good at what they do that my job was to find the best vantage points for each section of action and then stay out of the way.

With the dialogue, we had a similar approach. The world of John Wick is one with rules, etiquette, and respect. We directly reference this in the film, but you can also feel it in the way our characters interact with each other. For instance, everyone talks at a measured pace and no one interrupts each other when they’re speaking. Our dialogue style is much more John Wayne than Aaron Sorkin. I cut the dialogue scenes to match this performance style, which means I almost always let our actors say their lines on camera and in full. When you have actors like Keanu, Ian McShane, and Laurence Fishburne (to name just a few of our excellent cast), why wouldn’t you?


I won’t get into the weeds too much on Production, but there are a couple of interesting moments to call out:

On my first weekend in NYC, I spent a day with 2nd Unit Director Darrin Prescott assembling our big car chase. It was a really fun day, not to mention beneficial for both of us. Darrin got to make sure the vision for the footage he shot was reflected in the edit, and having him in the room saved me from needing to guess at how 5 days’ worth of car footage shot out of order was supposed to be arranged.

Since I started so far behind camera, I forwent my usual temp sound and music work and just focused on getting assembly cuts of every scene as quickly as I could. New footage was coming in every day of course, but I didn’t want to burn myself out right at the start, so I kept to regular working hours as much as possible. All in all it took me 3-4 weeks to completely catch up to camera.

Production wrapped in NYC just before Christmas, and then resumed in Rome in January. I went to Rome for 10 days while they were still location scouting so I could work with Chad before shooting started up again. Most of the film’s big dialogue scenes were already shot, so we focused on refining those first. By the time I left Rome, we had solid cuts of our big dialogue scenes, with some edits that still remain in the final version.

Assembly Timeline


When Post-Production started in LA, I went head first into reshaping the biggest slow spots from the assembly. One result of this is a really fun montage I made from 3 scenes that were intended to be sequential. This montage is now one of the most memorable parts of the film. I also sent some scenes to the cutting room floor that strayed too far from the main narrative, and moved some bigger chunks around to keep the momentum of the story going strong.

For the fight scenes, Chad likes to experiment with his own assemblies, so we set him up with his own Media Composer system and an isolated copy of the project. In this film especially it’s important that the martial arts and gun work are flawless, so making use of Chad’s depth of knowledge in those areas was crucial. He and I would then compare cuts for each fight and make a hybrid version with the best of each other’s edits.

Since we shot more dialogue than we needed, we also spent a lot of time figuring out the minimum amount of information we needed to convey, and then trimming out the rest. This is always a bit of a balancing act, since you don’t want to confuse anyone by moving too quickly, but you also don’t want an exposition scene to outstay its welcome. Plus, there are going to be people who see this film without having seen the first one, and it was important to make sure those viewers don’t feel like they’re missing any required information.

Picture Lock Timeline

John Wick Chapter 2 comes out in the U.S. on February 10th. I’m really proud of the film and hope you all enjoy it!

By Evan Schiff in Timeline Tuesday, Video Editing


The Kuleshov Effect influences every film and every filmmaker. Understanding it offers insight into “movie magic” and into creating the meaning you want expressed in your project.

The Kuleshov Effect is the single most important concept in editing, if not in filmmaking itself. It’s a cornerstone of visual storytelling; it is through this phenomenon that we can suggest meaning and manipulate both space and time. It is a fundamental aspect of “movie magic,” one that every filmmaker needs to understand.

Kuleshov and Film Theory
Lev Kuleshov (1899–1970) was a Russian filmmaker, considered by some to be the first film theorist thanks to work dating to the 1910s. Kuleshov asked: what made cinema a distinct art, separate from photography, literature, or theatre? He concluded that any form of art consists of two things: the material itself and the way in which that material is organized. Following this logic, Kuleshov found that the organization of individual shots, also known as montage, is what makes film stand apart.


In 1921, Kuleshov staged a series of cinematic demonstrations that gave the phenomenon its name. In these experiments, he projected the face of a well-known actor, then cut to a plate of soup; he then showed another shot of the same actor, followed by a girl in a coffin; the final sequence was the actor’s face, then an attractive young woman. Audiences responded that in the first sequence the actor seemed hungry, in the second quite mournful, and in the third he seemed to exude lust. In reality, all three shots of the actor were identical; his face was interpreted differently based on what it was placed next to in the edit. Additionally, even though there was no establishing shot of the actor together with the objects from the other shots, they seemed to the audience to be in close proximity to one another. Through the ordering of the shots, two separate places appeared to be one continuous location. Manipulating space and time was possible through editing alone. This was a huge moment for cinema, with Kuleshov declaring montage to be the central principle that defines film as an art on its own.


Kuleshov’s theories were instrumental in the creation of a powerful genre of filmmaking, Soviet Montage, which was eventually suppressed under Stalin. But the Kuleshov Effect lives on, exemplified in almost every film or video that we encounter.

Understanding the Kuleshov Effect allows editors to better control the tone and meaning found in their films. Through the choices in how shots are organized and sequenced, filmmakers can create new meaning by juxtaposing unrelated images. With the illusion of condensing space, we are able to create new worlds, connecting places that were previously separate. Thus, the Kuleshov Effect is a huge part of the magic that is film.

Russian film theorists in the early 1900s were hugely influential in shaping how cinema was to develop. They saw film as a powerful tool of social transformation, inherently political and inextricably linked to the filmmakers’ worldview. Kuleshov’s contemporaries explored the power of montage and their innovations paved the way for contemporary filmmakers.
Sergei Eisenstein promoted the idea that the essential element of all art is conflict. Eisenstein advocated dialectical montage: the notion that a sequence of shots can carry more meaning than the sum of its individual parts. He was inspired by his study of Japanese kanji, which juxtapose two concepts to create a new third concept. Eisenstein’s films “Battleship Potemkin” (1925) and “Strike” (1925) are both classics of Russian cinema.

Dziga Vertov eschewed dramatic films as a corrupting influence. An early experimenter in the realm of documentary, Vertov pioneered many modern staples of filmmaking in his newsreels. In 2014, “Sight and Sound” named his film, “Man with a Movie Camera” (1929), the best documentary ever.

Taken from an article written by Erik Fritts for Videomaker magazine.
