Better than Bins: Final Cut Pro X Events (FCPX Part 3)

“Import your footage into the Event.” How strange does that sound?

Final Cut Pro X’s Events are individual databases that store all the media for a particular shoot. They appear simple, like a Final Cut 7 bin. But they are much more than that, and can be a little difficult for even an experienced editor to fully grasp. I want to persuade you that thinking about your footage like data in a database, instead of film in a bin, is the first step toward accelerating your editing with FCPX.

If you want a quick overview of the details of Final Cut Pro X’s Events, you can check out Larry Jordan’s video. To really go in-depth, I went through Steve Martin’s Ripple Training Media Management in FCPX, and we use it to train our new video team members at Logos. Here’s another sample of our work for one of our new brands, Noet:

“Events” first appeared in iPhoto. Some readers are already rolling their eyes. There is no reason to. Events are relational databases and they are a huge leap forward over folders. Steve Jobs introduced Events in iPhoto with the idea that you typically shoot a bunch of photos together, like at a birthday party. When you want to go back and look at your photos it is usually in a group, like the day of the Event. So the Event was born as a way of automatically associating a bunch of photos with a point in time. It is the relationship that all those photos have to one another.

Let’s pretend my son Sam had his first birthday party. It started at home and finished at Pizza Hut on a Saturday. We shot some photos on our iPhone, and some video.

How would you organize those photos? You might say, “I can create a folder called ‘Sam’s Birthday’ and stick all my photos in it. I don’t need a relational database.” That’s the equivalent of a bin in FCP7: a single “relationship” that I have defined. These 20 clips are from the “Birthday.” That describes a relationship of when, and time is just the first relationship these clips have to one another.

iPhoto then brought in Places. Now the photos could all be analyzed on the basis of another relationship: where were these photos taken? The birthday party started at my house and finished at Pizza Hut, and my iPhone automatically geotags each photo. The Event in iPhoto describes a time relationship, and Places shows a geographical relationship.

Let’s talk about the party. We shot some of Sam’s birthday party at home and some at Pizza Hut. If we organized it in FCP7, we’d create a “Birthday” bin, with “Home” and “Pizza Hut” bins inside. Great, I can now go into the “Pizza Hut” bin and simultaneously see two relationships: time and location. So where does the Event have the advantage?

Back to iPhoto. iPhoto later incorporated a feature called Faces. It works pretty well if you give it a little love. You can see all the pictures of Sam and his friends from the party. Now your photos are grouped by another relationship: who.

In iPhoto, Events, Places, and Faces are different ways to view the same set of pics based on a specific relationship. I can see every shot of my son Sam in one view. This is key: it filters out the noise. (Noise is simply stuff that isn’t a viable option to me at the moment.) As our datasets grow, whether they are pictures or videos, there is an intrinsic problem: noise. I can take 1,000 photos, but all I want to see are pictures of ____ (fill in the blank). A folder works great if there is only one blank. Two folders in a hierarchy work fine if there are two “blanks.” I want to see birthday pictures at Pizza Hut.

But what if you had the Birthday bin in FCP7, with Home and Pizza Hut bins inside it, and then you thought of your pics in iPhoto? And it struck you: “I want to see every (where) pic of Sam (who) on his birthday (time).” Go to iPhoto, click on Faces, and scroll to the day. Bam! iPhoto will show you every pic of Sam, regardless of location, with zero noise. All you see is exactly what you want. Even if you took 1,000 photos, the metadata has allowed you to define a set of relationships that creates a noise-free view. All these photos are candidates for my Facebook post.
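
If it helps to see the idea stripped down, here is a minimal sketch in Python. The file names and metadata fields are hypothetical, purely for illustration; the point is only that when each photo carries several relationships at once, a query can filter on the ones you care about and ignore the rest.

```python
# A minimal sketch: each photo carries metadata for several relationships at once.
# File names and fields are hypothetical, purely for illustration.
photos = [
    {"file": "IMG_001.jpg", "event": "Sam's Birthday", "place": "Home",      "people": {"Sam"}},
    {"file": "IMG_002.jpg", "event": "Sam's Birthday", "place": "Pizza Hut", "people": {"Sam", "Grandma"}},
    {"file": "IMG_003.jpg", "event": "Sam's Birthday", "place": "Pizza Hut", "people": {"Grandma"}},
]

# "Every pic of Sam on his birthday, regardless of location" filters on two
# relationships (who + when) and simply ignores the third (where).
sams_birthday_pics = [
    p["file"] for p in photos
    if p["event"] == "Sam's Birthday" and "Sam" in p["people"]
]
print(sams_birthday_pics)  # ['IMG_001.jpg', 'IMG_002.jpg']
```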

Now how would you do that with bins? You can’t. Maybe you’d create a “Sam” bin within “Home” and another within “Pizza Hut,” but then you’d have to look at one and then the other.

What if you want to see all “inserts” from the whole day? Or all the closeups, or all the establishing shots? Or what if you want to see all the “closeups of Sam on his birthday regardless of location?” Forget about it.

Events allow you to use keywords to create these relationships, one at a time. Select all the clips from Home and keyword them. Select all the clips from Pizza Hut and keyword them. We’re pretty much at bins. Reject the clips you don’t like, favorite your selects – OK, that’s basically color-coding stuff. But then it gets interesting. Keyword every clip that has Sam in it “Sam.” Go through your footage and keyword every closeup “CU.” Add another keyword for wide establishing shots.
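
Conceptually, each keyword collection is nothing more than a named set of clips. A rough sketch of that idea, with made-up clip names (this illustrates the concept, not FCPX’s actual data model):

```python
# Each keyword collection is, conceptually, a named set of clips.
# Clip names are made up; this illustrates the concept, not FCPX's data model.
keyword_collections = {
    "Home":      {"clip01", "clip02", "clip03"},
    "Pizza Hut": {"clip04", "clip05", "clip06"},
    "Sam":       {"clip01", "clip04", "clip06"},
    "CU":        {"clip02", "clip04"},
    "Wide":      {"clip03", "clip05"},
}

# Clicking a single keyword collection is roughly what a bin gave you:
print(sorted(keyword_collections["Pizza Hut"]))  # ['clip04', 'clip05', 'clip06']
```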

Each keyword collection defines a single relationship between a set of clips. That might be based on person, angle, location, or day. This is cool, and no clips have been duplicated. That’s nice, but it still doesn’t answer the fundamental question an editor is always asking: “Show me all my options, minus the noise.”

Now you will begin to see the power of Smart Collections. Smart Collections allow you to define a noiseless view based on multiple relationships. If I have keywords for things like angles, locations, and people, I can build a Smart Collection like this: show me every closeup of Sam on his birthday. And another: show me every wide establishing shot of Sam at Pizza Hut.
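
In the terms of the sketch above, a Smart Collection is a saved query: the intersection of whichever keyword collections you name. Again, the clip names are hypothetical; this is the concept, not Final Cut’s implementation.

```python
# A Smart Collection, conceptually: a saved query that intersects several
# keyword collections into one noise-free view. Clip names are hypothetical.
keyword_collections = {
    "Pizza Hut": {"clip04", "clip05", "clip06"},
    "Sam":       {"clip01", "clip04", "clip06"},
    "CU":        {"clip02", "clip04"},
}

def smart_collection(collections, *keywords):
    """Return only the clips that carry every one of the given keywords."""
    sets = [collections.get(kw, set()) for kw in keywords]
    return set.intersection(*sets) if sets else set()

# "Every closeup of Sam at Pizza Hut" -- three relationships, zero noise:
print(smart_collection(keyword_collections, "Sam", "CU", "Pizza Hut"))  # {'clip04'}
```

Nothing is duplicated or moved; the view is simply the answer to the query.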

When you click on that Smart Collection and all you see are your favorite CUs of the subject in the location you need, the light bulb goes off. Suddenly you look at the Finder’s folder hierarchy and go, “What? I can’t even use this!” (Thankfully, tags are coming in Mavericks.) It totally changes how you think about your footage. It allows you to ask questions about your footage and get noise-free answers, where everything you see in a view is a viable option.

Smart Collections answer the questions that editors are always asking, yet instinctively we keep sorting and scrolling through our bins. I always say, “Events aren’t folders.” This is why there should be only one Event per SAN Location, so that I can see all the compound clips and multicam clips associated with an entire body of footage.

If you keyword your footage with common editing attributes like locations, people, angles, and key themes, you’ll be able to build Smart Collections that lend themselves to an editing experience that moves the friction and noise to the time of ingest and removes it from the editing moment when you are “in the zone.” And that is why Events are better than bins.
