Panasonic has announced its third firmware upgrade for the EVA1, providing greater cost efficiency and futureproofing for the handheld cinema camera.
The free upgrade is available this month (31 January 2019), with a key feature being a new HEVC H.265 codec that can record 4K 50p/60p video with 4:2:0 10bit sampling. Compared with the H.264 codec in version two, this can compress up to twice as much information while maintaining the same high-quality resolution.
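To get a feel for what that codec change means in practice, here is a minimal sketch using the open-source ffmpeg/x265 encoder on a desktop machine. This is an illustration only, not Panasonic's in-camera implementation: it assumes an ffmpeg build with 10-bit libx265, and the filenames are hypothetical. It simply shows the same 10bit, 4:2:0 HEVC parameters the firmware adds for UHD 50p/60p recording.

```python
# Illustration only: desktop ffmpeg with a 10-bit libx265 build is assumed;
# the EVA1 records with its own hardware HEVC encoder, not this software path.
import subprocess

def encode_hevc_10bit_420(src: str, dst: str, crf: int = 20) -> None:
    """Encode a clip to 10-bit 4:2:0 HEVC (H.265) at a constant-quality target."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "libx265",           # HEVC / H.265 encoder
        "-pix_fmt", "yuv420p10le",   # 4:2:0 chroma sampling, 10-bit depth
        "-crf", str(crf),            # quality target; lower = better quality, bigger file
        "-c:a", "copy",              # leave audio untouched
        dst,
    ], check=True)

# Hypothetical filenames for illustration.
encode_hevc_10bit_420("eva1_uhd_50p_source.mov", "eva1_uhd_50p_hevc.mp4")
```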
In addition, the upgrade sees support for third-party USB-LAN adapters with Internet Protocol control. This enables EVA1 live multicam control with advanced integration of the CyanView Cy-RCP remote control panel.
Furthermore, new functions can be assigned to the User Buttons for quick switching of shooting modes and frame rates, simplifying what would otherwise be a trip through the system menu.
Sivashankar Kuppusamy, Marketing Manager of Broadcast & ProAV Solutions, said: “When it comes to the EVA1,
we aim to provide more than just a camera, but an ongoing relationship with our
customers that facilitates their photography needs now, and in the future. This
free download enables less data to be transmitted, and even greater versatility
in regards to capturing and editing content.”
Other functionality added as part of the upgrade includes still image capture in playback mode, enabling the EVA1 to capture still JPEG images and store them onto the SD card. Additionally, the camera now allows two auto white balance presets, a focus indicator in the LCD head-up display, and SDR monitoring when shooting in HDR (HLG).
Panasonic has also announced a £507/€500 price reduction for the EVA1, giving it a new RRP of £6,023/€6,790 (excl. VAT).
The State Library of NSW has today announced the launch of its inaugural short film prize Shortstacks, with a total of $20,000 on offer across two categories.
Shortstacks is now open to established, emerging and
first-time filmmakers of all ages, but there’s a catch … each short film
entered must reference one or more items pre-selected from the State Library’s
extraordinary collection.
According to State Librarian Dr John Vallance: “Out of more than six million items in the Library’s collection – drawings, photographs, manuscripts, books, maps and other objects – we’ve chosen six that are sure to ignite visual creativity.”
“Shortstacks provides a really unusual opportunity for filmmakers, including those just starting out, to say something new and interesting using the Library’s collection, and reach a wide audience,” Dr Vallance said.
There are two prize categories:
$15,000 General Prize: open to filmmakers 18 years and over at the time of entries closing
$5,000 Youth Prize: open to filmmakers 17 years and younger at the time of entries closing
“As Australians, we need to tell our own stories,” said Margaret Pomeranz, Shortstacks judge and celebrated Australian film critic, writer, producer and TV personality. “What better place to go for inspiration than the State Library of NSW, which holds so many of our most memorable stories, and those still to be brought to life.” View video of Margaret Pomeranz.
Entries close 5pm, Monday 29 April 2019. For more information and entry forms, visit the State Library of NSW’s website www.sl.nsw.gov.au/awards
If anyone ever doubted that Blackmagic Design was a force to be reckoned with in the major filmmaking industry, wonder no more!
Blackmagic today announced that more than 35 films, episodic series and projects at the 2019 Sundance Film Festival were shot and completed using Blackmagic Design products.
Blackmagic Design congratulates the many projects selected for Sundance that were created with its digital film cameras, Fusion Studio visual effects (VFX) and motion graphics software, DaVinci Resolve professional editing, color grading, VFX and audio post production software, and more. This includes “4 Feet: Blind Date” composited with Fusion by Martin Lopez Funes of Malditomaus and his team, and films such as “Native Son,” “The Infiltrators,” “Them That Follow,” “Big Time Adolescence,” and many more that were graded and edited with DaVinci Resolve Studio.
Lucien Harriot, president of New York based Mechanism Digital, used Fusion Studio for his VFX work on “Luce.” According to Harriot, “Some of the shots in ‘Luce’ were quite challenging. For example, one particularly long shot orbited several times around an actor standing in a high school lobby, and we had to remove the camera crew’s reflections from all of the windows and trophy cases. Fusion Studio’s tracking tools came in very handy on those tasks.”
Some of the Sundance Projects that Used Blackmagic Design Cameras and Gear:
> “4 Feet: Blind Date” media production company Detona Cultura and Realidad 360° Argentina used a Micro Converter SDI to HDMI, a Micro Converter HDMI to SDI and an Intensity Extreme for real time camera monitoring;
> “Clemency” DIT Peter Brunet and DP Eric Branco used a Smart Videohub 12×12 for routing, SmartView Duo for monitoring, and DaVinci Resolve and an UltraStudio Express for grading on set;
> “The Dispossessed” Director Musa Syeed used a Pocket Cinema Camera;
> “The Ghost Behind” Director Caroline Rumley used a Pocket Cinema Camera;
> “Honey Boy” DIT Ernesto Joven used DaVinci Resolve, Smart Videohub 20×20, SmartScope 4K and UltraStudio Express;
> “It’s Not About Jimmy Keene” Director Caleb Jaffe used a Pocket Cinema Camera for pickup shots;
> “Playhouse” DP Sophie Feuer used an URSA Mini Pro; and
> “Them That Follow” DIT Nick Giomuso used DaVinci Resolve Studio, DeckLink Mini Monitor, Smart Videohub 20×20 and SmartScope 4K.
Some of the Sundance Projects that Used Fusion and DaVinci Resolve for VFX:
> “4 Feet: Blind Date” Animator Martin Lopez Funes of Malditomaus and his team used Fusion for the film’s VFX;
> “Luce” VFX Supervisor Lucien Harriot of Mechanism Digital used Fusion Studio for the film’s VFX; and
> “The Mountain” VFX Supervisor Alex Noble of Wild Union Post used DaVinci Resolve within his VFX pipeline.
Some of the Sundance Projects that Used DaVinci Resolve and DaVinci Resolve Studio for Post Production:
> “American Factory” was graded by Ken Sirulnick and Steve Pequignot of Glue Editing & Design;
> “Anthropocene: The Human Epoch” was conformed by Carlie Macfie, with Technicolor’s Mark Kueper completing the grade for the 3D 360 degree VR portion;
> “As Told to G/D Thyself” was graded by Joe Gawler of Harbor Picture Company;
> “Bedlam” was graded by Adrian Seery of Harbor Picture Company;
> “Before You Know It” was graded by Ben Perez at Gigantic Studios;
> “Big Time Adolescence” was graded by DP Andrew Huebscher at Local Hero;
> “Birds of Passage” was graded by Sandra Klass;
> “The Death of Dick Long” was graded by Nat Jencks of Goldcrest Post;
> “The Dispossessed” was graded by Nat Jencks of Goldcrest Post;
> “Divine Love” was graded and finished by Daniel Dávila of Kiné Imagénes, who also used a DeckLink 4K Extreme 12G and a DeckLink SDI 4K for capture and playback, and a Mini Converter SDI to HDMI and a HDLink for converting;
> “Edgecombe” was graded by Pete Quandt;
> “The Farewell” was graded by Alex Bickel of Color Collective;
> “Honey Boy” was graded by Alex Bickel of Color Collective;
> “Imaginary Order” was graded by Thomas Galyon, who also used a DaVinci Resolve Advanced Panel for grading and a DeckLink 4K Extreme 12G for playback;
> “The Infiltrators” was graded by Mark Todd Osborne of MTO Color at Buffalo 8 Post Production;
> “Judy & Punch” was graded by Olivier Fontenay at Soundfirm Sydney;
> “Lavender” was graded by Brandon Roots;
> “Light From Light” was graded by Neil Anderson of Lucky Post;
> “Love, Antosha” was graded, conformed and delivered by Aaron Peak of Neon Diesel Finishing;
> “Luce” was graded by Alex Bickel of Color Collective;
> “Maggie” was graded by Nat Jencks of Goldcrest Post;
> “MEMORY – The Origins of Alien” was graded by Dave Krahling of Milkhaus, while Editor Chad Herschberger used an UltraStudio 4K in his editing bay;
> “Midnight Family” was graded by Phaedra Robledo of Cinema Maquina, who also used an UltraStudio 4K Extreme for capture and playback;
> “Mope” was graded by Alastor Arnold of FotoKem;
> “Native Son” was online edited and finished by Samuel Gursky of Irving Harvey, and graded by Tim Stipan of Company 3;
> “Premature” was graded by Kath Raisch of Company 3;
> “The Rat” was graded by Andrew Francis at Sixteen19;
> “The Sound of Silence” was graded by Roman Hankewycz of Harbor Picture Company, while Editor Matthew C. Hart used DaVinci Resolve and an Intensity Shuttle for Thunderbolt within his editing workflow;
> “Them That Follow” was graded by Nat Jencks of Goldcrest Post;
> “This Is Not Berlin” was graded by Lisa Tillinger; and
> “The Wolf Hour” was graded by Andrew Francis at Sixteen19.
We have 3 copies of the best selling scriptwriting software Final Draft 11 to give away. To see how you may win one, stay tuned. All will be revealed soon!
Hint! Think of the cleverest movie / play / musical / TV show / short film title possible …
The 26th January in Australia is marked as “Australia
Day” and is the commemoration of the day the British flag was raised in what is
now Sydney, and the British took dominion over the lands.
To this day, there are divided opinions over this holiday /
commemoration, with many local indigenous people calling it “Invasion Day” and
wanting it abolished or at least, the date of “Australia Day” changed.
One wag the other day suggested May 8th, as this is
easily memorised by all Aussies “May 8… M-a-a-a-a-t-e!” Gerrit?
Each Australia Day evening, Perth puts on a massive fireworks display. Dubbed the “Skyworks”, up to 300,000 people gather on the Perth foreshore of the Swan River to watch the 20-minute or so spectacle that starts at 8pm sharp and explodes above Perth Water.
Now, to our US, European and Asian cousins (at this point
the UK is still in the European Union so I include them in Europe by the way),
300,000 might not seem a lot, but when you consider this is about ¼ of the city’s
population, it is pretty impressive.
One of the challenges is photographing or videoing the event, and this year Jacqui and I did the 4-hour trek from home in Quinninup (near Manjimup), leaving Budweiser and Shnorky the Hounds to play Guardy-Dogs for a couple of nights.
We stayed at the Great Southern Hotel in Northbridge, and at 6pm, walked the 2Km or so into the main city and then a further 1Km to find our 1 square metre of plot in the absolutely crowded Langley Park, the main viewing area (the other is at Kings Park above the river and about 2Km downstream).
One problem the average person has in this situation (apart from the absolutely pointless act of leaving the auto flash turned on in still cameras / smartphones) is using auto focus and auto aperture. Simply put, on this setting the camera / smartphone is continually trying to “hunt” for something to focus on – without success in the alternating dark and bright light from the fireworks – and it does the same for the aperture as the ambient light switches from black to bright.
So, the simple trick is, pre-show, to set the aperture manually – I used f/8 – and set focus to infinity. For still images I set the shutter speed to 1/60th, but playing around a little with settings in the first few minutes of the show and checking the resulting images will get you to your ideal exposure.
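If your camera supports tethered control, that same pre-show lock-down can even be scripted. The sketch below drives the open-source gphoto2 command-line tool from Python; it is an assumption-laden illustration rather than a recipe, since the config names ("aperture", "shutterspeed", "focusmode") differ between makes and models.

```python
# Rough sketch only: assumes gphoto2 is installed and the connected camera
# exposes these config names (they vary by camera brand and model).
import subprocess

PRE_SHOW_SETTINGS = [
    ("focusmode", "Manual"),    # then set the lens to infinity by hand
    ("aperture", "8"),          # f/8, as used on the night
    ("shutterspeed", "1/60"),   # starting point for stills; review and adjust
]

def lock_exposure_for_fireworks() -> None:
    """Apply manual exposure settings before the show starts."""
    for name, value in PRE_SHOW_SETTINGS:
        subprocess.run(
            ["gphoto2", "--set-config", f"{name}={value}"],
            check=True,
        )

if __name__ == "__main__":
    lock_exposure_for_fireworks()
```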
And it goes without saying – or does it – to hold your
smartphone horizontally! This way you’ll fill the frame rather than have those
wide, empty black vertical stripes on the sides of your video / photo. In other
words, like your TV, the long sides to the top and bottom.
What I was after was to get the effects of the fireworks putting a reflection on the water, and also putting the crowd in front of me in silhouette.
I confess I wasn’t 100% successful, but the result – and there is no doctoring in this footage as it is direct from the camera, I just added the titles and logo – is not too bad. The footage has been downsampled from 4K to HD for this post though, and I have included my Vegas Pro 16 layout so you can see the compositing and layering used.
At some point in time, in your filmmaking journey, you are
going to need a microphone.
The problem lies in the question, which one? Shotgun mic,
radio mic, highball or lav? On-camera or off-camera, handheld, on a boom – there
are so many choices. So where to start?
Well, we decided to ask an expert, so we got in touch with Sennheiser Australia for their thoughts.
Firstly though, let’s work out which microphone type should be used under different circumstances. For example, if you are filming a landscape and simply want ambient noise, that calls for a totally different microphone than you would use when, say, interviewing a person in a closed room.
Similarly, if you are filming a live band at a music concert, that is a different circumstance again, and a different type of microphone might be used depending on whether you’re indoors or outdoors, for example.
Shotgun Mic
Shotgun microphones are ideal for several different things on a production set. They are very directional, so you simply point the microphone in the direction the sound is coming from and that is what the shotgun microphone will capture. As a result, a shotgun microphone is ideal for capturing the dialogue of a scene (as against a single person), ambient sound and any Foley effects that may be happening. Shotguns are also commonly the mic used on the end of a boom, where you see a sound assistant holding a long pole with the microphone on the end, held over the top of the subject but out of shot.
Sennheiser MKE 600
Lapel Mic
Also called a lav, which is short for lavaliere, this mic is designed specifically for capturing dialogue and so is often used in interviews for television documentaries. If you are not familiar with a lav, the next time you’re watching someone being interviewed, notice that on their lapel they will have a small black object. This is the microphone, and it can be either wired, usually to a small transmitter attached to a belt, or, more commonly these days, wireless, again via a transmitter that sends to a receiver often located on top of the camera. Sometimes they are placed underneath the clothing and held in place with a tiny bit of gaffer tape. Of course, it stands to reason that when fitting a lav mic to someone you need to be courteous and professional.
Sennheiser EW 112G4
Video Mic
These are on-camera microphones with a direct connection into the camera for recording audio to the recording medium, whether it be tape or SD card. They are probably the most common microphones in use, and if you’re looking at buying a microphone and just want a general-purpose unit, this is the way to go.
Sennheiser MKE 440
Handheld Mic
As the name suggests, this microphone is held in the hand and is most commonly used by reporters in the field, connected either directly into a recording device or directly to the camera. They are also often called a reporter mic.
Sennheiser MD 46
Smartphone Mic
These are a new breed of microphone, designed specifically to be used with a smartphone where the phone is the recording device, whether it is an iPhone or Android based. In their basic form they act a little like a reporter mic, but more sophisticated units such as the Sennheiser Memory Mic, which we gave an award to last year, also have on-board memory, can work wirelessly and even have phone-based software for later editing and syncing.
MemoryMic
Which Model?
As mentioned, we spoke to the experts at Sennheiser to ask what model they would suggest for each of the above categories. For an on-camera shotgun mic, the MKE 440 was the suggested model, and if the microphone is to be used on a boom, then the Sennheiser MKE 600 is considered ideal.
For a handheld (reporter) microphone, a choice of two models was put forward depending on application. The HandMic Digital can be used with smartphones as it has a USB connection as well as Lightning and micro USB ports. For more traditional use, the MD 46 is a high-quality dynamic cardioid microphone designed for live reporting and broadcast environments.
For a lav microphone, Sennheiser suggests a wireless unit, in this case the EW 112P G4. This is an all-in-one wireless system with broadcast-quality sound that comes with all the gear you need, including the microphone, a receiver for mounting on the camera, a transmitter for the talent and the necessary cables. It has a range of 100m, which is plenty, and the batteries are said to last up to 8 hours. Importantly, it is easy to set up and use, unlike some others I’ve played with where a degree in sound engineering might be useful.
Budget
Of course, a lot depends on your budget, but never underestimate the importance of the quality of your audio. This is one area where going cheap is not really an option. Sennheiser is not in the habit of giving out pricing as it is not a retailer, so we checked around and got the following averages.
Back in the ‘old’ days – around 1997 – there used to be a piece of software called Commotion from a mob in the US, Puffin Designs, that, as one of its party pieces, allowed the removal of objects from video.
It was expensive. Very expensive. Over USD$1500 as I
recall. And complicated. In fact, there was a whole suite of video training
tapes for it.
But it was also groundbreaking at the time (along with Avid’s Elastic Reality, but that’s a whole other story).
Today, we have ERAZR V1 from proDAD, who also make four of our favourite utility applications: Heroglyph, Mercalli, Vitascene and ProDRENALIN.
It’s a stand-alone Windows application, not a plugin, and boy, is it effective!
And you can learn it from only a couple of online 7-minute YouTube tutorials,
not 3 hours of video training!
There are a few caveats I must admit, not the least being
that you do need to sit down and come to grips with how the program works.
There are some serious technologies at work here folks (including AI), but once
you master them, the usage of ERAZR comes very quickly and easily to you. At
first it can seem a little daunting as quite probably, you have never quite had
to do something like this before. But the effort is worth it.
Wazzit Do?
Simply put, ERAZR allows you to track an object in a scene –
including any extraneous bits such as shadows – and remove them completely. A
lot of what ERAZR does is performed automatically by the system, but when, for
example, objects get too small for the program to track accurately such as a
vehicle coming from or going to a distance, you may need to step in and perform
some manual tracking and keyframing to get the very best results.
The example used way back when in Puffin’s Commotion removed a seagull flying through a scene (from memory). With the computing power available back then – pre-multicore days, and we are thinking Adobe Premiere version 4.2 or 5 – whilst it worked, it was slow as, with each frame painstakingly calculated and rendered.
No such issue with ERAZR; even on a modest i5, it honked along quite nicely, thank you! ERAZR supports a multitude of frame rates (24p / 25p / 30p / 50p / 60p and then some) and resolutions up to 8K.
Unlike other applications that do object removal by clone stamping from previous frames – which can be very work intensive and time consuming – once you have the workflow of ERAZR sorted, you can rip through creating a sequence very quickly.
Wazzit NOT Do?
Yes, there are limitations, but these are hardly ERAZR’s
fault; for example, a vehicle moving from left to right across your scene (or a
seagull!) is easily extracted as the program ‘knows’ what the scene should look
like without the object there. But if the vehicle is coming towards you for
example, then ERAZR has no idea what is behind it, and so cannot do its thing
under those circumstances. Similarly, an object must be moving.
The workflow is quite straightforward, as mentioned, just different, perhaps, to your norm. First off, the object to be removed has to be tracked, and ERAZR provides the tools to create different types of masks for doing this, along with keyframing any changes.
So HOW Do You Do It?
First you define the in and out points of the completed
clip. ERAZR calls this the Work Area.
Next you find a good position in the clip that allows you to
define by the mask the whole object to be removed. You can adjust the mask
using supplied tools to get the exact fit you need. The mask is then adjusted
during the clip work area to create the required key frames. The mouse and
keyboard arrow keys are used to fine tune this as needed.
Once you have completed the tracking and are happy with the
result (you can check through frame by frame and adjust the automatic
selections of the masking as needed), then you switch to the Result View of
ERAZR to make sure there are no errors.
If you want to be really clever, ERAZR even lets you fade in a removed object and then fade it out again. This is a neat trick for those “ghost” images that appear to show people walking through walls, for example.
The end result of removing our scooter guy can be seen in this video (click the icon below):
What Does It Cost?
I won’t pretend ERAZR is a $49 product as it’s not. It is USD$499.
But if you have been on location and got the exact shot you want and then back
in post see that there is a glitch that should not be there – that blasted
seagull again – then it is a damn sight cheaper and a lot more convenient than
a re-shoot.
Another opportunity is where you have clips that are totally
unrelated to a current project, but with a bit of tweaking by removing a moving
object, they can fit in. This may save having to do a specific shoot, and therefore
save budget.
I for one can think of many areas I can use it in the latter
case alone!
Anyway, proDAD, as always, lets you have a play to see what you think. I suggest you download the trial; I suspect ERAZR might just become another very useful tool in your video editing toolbox!
To get the best out of it, there is also a downloadable PDF manual, Quickstart and Workflow documents, and proDAD has kindly put together a number of tutorials to help with the learning curve. All this, including the trial version, can be had from www.prodad.com.
More than likely we’ll create a few ourselves as this is a
fun program to play with!
Most regular readers will know I am an unashamed fan of Top Gear / The Grand Tour; the show(s) as well as the techniques used in filming and putting them together. Well, in a recent search for some ideas on how they have created certain scenes, I came across an article – an interview really – by journalist Steven Hullfish from the ProVideo Coalition with long time editor of both shows, Dan James.
I contacted Steve and he has given permission to re-publish this article in full. I am sure you’ll agree it is fascinating reading.
Steve Hullfish has also authored a book called The Art of the Cut (more detail below) and I urge you to check it out.
One of the most popular shows on the planet was the BBC’s Top Gear. When that show left the BBC, it was essentially picked up by Amazon Prime Video and renamed Grand Tour, where it is coming to the end of its second season, with the third already in the works. Dan James has been a long-time editor on both series, and we discussed what it took to edit the big-budget car show.
HULLFISH: Tell me a little bit about the production itself. It seems like there’s always a lot of cameras going. There must be a massive amount of footage that you’re going through.
Dan James: It is a huge amount of footage. For what we call a “three-header” – with all three presenters – they’re shot with three ARRI Alexas. Then there are three lots of Panasonic LUMIX GH4s for each presenter’s in-car cameras and a GH5 for the drone. Plus lots of GoPros.
HULLFISH: How is footage getting back to you from the field?
Dan James: It’s shipped back on transport drives. The data wrangler on set transcodes everything to DNxHD36.
HULLFISH: So you’re cutting in Avid. How scripted are these stories?
Dan James: With some of the films, not as much as you might think. For instance, for one of the hour and a half ‘specials’ we do every year, you’ll have the meet up at the start of the film where they arrive in 3 different vehicles: that is very well scripted. Beyond that, it’s usually just a rough guide as we’ll know where they’re going but because they’re all such good car journalists they know there has to be ‘car content’ on the way as well as situations and general mucking about to keep the story going. Decisions are made on the ground on that day.
With the short track tests, where you’ve just got one presenter in a car going around a track, those are very tightly scripted because we need to bring them in at six or seven minutes. Those are six or seven pages and you don’t really expand beyond that.
HULLFISH: So are you personally cutting only individual stories in the show, or are you cutting the entire episode?
Dan James: We cut what they call the insert films. We don’t really get involved in the studio. The studio shoots are done right at the end of the year, but we’re working months and months in advance, getting all the films ready.
HULLFISH: I also feel like some of the stories are broken up with other stories in between them. Do you make the choice of where to cut to another story?
Dan James: Yes, say with a 30 minute film, we’ll decide quite a bit in advance where to break it between part one and part two.
HULLFISH: I did the same thing on The Oprah Winfrey Show for about a decade: cutting the insert stories, but then someone else was in charge of piecing the stories and the studio together.
I would guess that there’s a lot of improv. Talk to me a little bit about dealing with all that improv in the edit.
Dan James: Well, you just have to lay it all out. We shoot so much it’s almost mind-boggling, and now, although we’re in the realm of 4K, we’re still shooting actuality like proper reality. People will shoot 4K dramas and films in 4K but the shooting ratio there is very low. We’re still shooting 150 or 200:1 sometimes. It’s astounding how much we have to look through to get the real gems out of this, but you have to shoot this much stuff in case something happens on the road. The in-car cameras have to roll all day. Nine hours a day on the big road trips, three cameras in each car: 2 x POV and a ‘face cam’. I always joke that our servers in the edit are the size of a room ‘cos it’s all in 4K.
Screenshot of timeline from Avid.
HULLFISH: How are you shaping the material? Working with a writer or producer before you start cutting? Is someone doing a paper edit?
Dan James: With the director, we’ll sit down and go through the basic structure of it, but generally I’m personally left alone quite a bit of the time. We’ve got an awful lot of freedom. The executive producer, who is Andy Wilman, and Jeremy, will have the final say editorially but the director and I are the ones that shape the piece.
With something easier, like the single-header track tests, it’s tightly scripted so you concentrate on the rhythm of the film instead, and you can almost apply a formula when you’re doing those – a kind of ‘hold and release’ thing. You’ll cut a really intense little passage of 30 seconds, frame cuts and what have you, but then you know you have to let it calm down for about a minute and then you can go to another intense bit. That seems obvious, but you do see editors on other programs just doing frame cuts for the whole film but you’d just go mental.
I must admit I’ve never worked on anything with such creative freedom. This is the most extraordinary thing I’ve ever worked on. I did Top Gear from 2004 to last year and pretty much the whole BBC team moved over to Amazon and it’s always been the same way of working really. Nothing’s really changed and interestingly you can see over the years that TV advertising has caught up with us and I can see cutting techniques that we did five years ago now just being mainstream.
HULLFISH: Describe one of those cutting techniques that you guys pioneered that you are seeing in the mainstream.
Dan James: Personally frame-cutting. Just using single frames, almost like animation. That’s what I love.
Keeping the viewer’s eye dancing around the screen, from left to right, up and down, especially as now a lot of people watch the show on smartphones and tablets. It’s nice to keep the viewer’s eye active around the frame.
HULLFISH: How much time do you have to get some of these things edited together?
Dan James: A single header track test at 7 minutes would probably be 10 to 15 edit days. A bigger 10-15 minute piece would be about three to five weeks. The big 90 minute specials can be up to 16 weeks.
HULLFISH: Let’s talk about sound.
Dan James: Well, we try to do a lot of our sound design in house and we tend to track lay as we go so when we picture lock it really is quite close to how we want it to go out. Just recently we started going out of house for some bits of sound design, just to get more unusual engine sounds and things like that.
HULLFISH: The track tests seem to be a lot about sound design with the sounds of the engine and gear shifts and squealing brakes.
Dan James: Yes, and from there we add random mechanical metallic sounds. Those little track tests are really fun to do, because you can do anything you want. It’s like making a little promo. They’re always a treat to do, those are.
HULLFISH: I would think that the sound design has a lot to do with the rhythm of those pieces.
Dan James: Yeah, absolutely. We’re always pushing ourselves to do something different. There’s a good competition between the edits, nipping up and down the corridor seeing who’s doing what that week. It’s THE most creative thing I’ve ever done because you have just completely free rein.
HULLFISH: Let’s discuss music. Are you just grabbing anything you want? Are you working from a stock music library or do you have a composer?
Dan James: Well, on Top Gear, we were at the BBC and they had a blanket license so we could just use anything. Now we’re with Amazon, we’re back to square one. No blanket agreement so we now have to individually license every single piece of music. We have a composer Paul Leonard-Morgan. He’s based in L.A. and he’s really good. He can recreate the feel of anything from Disney to Inception. As we work through season 2 he’s really become invaluable. With film scores it’s gotten a bit more complicated. You can’t just throw in anything you like. You have to be careful.
HULLFISH: And when you’re working on a story that requires composed score, are you cutting in temp track for him?
Dan James: Oh yeah. In season 2 there’s a film in which we used a lot of score for temp: Man From U.N.C.L.E. It was all very 60s.
HULLFISH: Why does that not surprise me? Maybe a little Kingsman?
Dan James: Really, anything goes. We use the most unusual stuff.
HULLFISH: So let’s talk about your approach. You get one of these drives in from the field with a TON of stuff on it — possibly 200:1 shooting ratio and what do you do to organize it and get started?
Dan James: It sounds terrible, but it’s just really donkey work. It’s just sitting there watching hours and hours and hours of stuff.
HULLFISH: I was just talking to Kirk Baxter about this and he calls it “homework.” You have to put in the hours of grunt work before you can start having fun.
Dan James: You have to break it down into individual camera roll sequences then you divide those rolls into scenes. You break it down, make a copy and break the next one down even further. You’re getting all these lovely little selects eventually. It’s a real OCD kind of thing. I love it personally. I lock myself in for a couple of days and just crush the whole thing down.
HULLFISH: That’s a great point for a lot of people that might not be familiar with reality or documentary work: there ARE scenes. You break it into stories or scenes, even though there’s no real script. There are individual ideas that have an essential order: they arrive, they talk to camera, they drive to this place, they get out and see something, they drive someplace else, they get into a jam… each of those is a scene. That allows you to break it down to more digestible chunks. You say, “OK, I have 4 minutes of arriving and ‘to camera’ before they go off. So let’s just deal with that.” So instead of looking at 1000 minutes, you’re only looking at a few minutes at a time to just deal with a single “moment” of your story.
Screenshot of Avid Project window and bins from Grand Tour.
Dan James: You visualize what needs to happen. Then you’re looking for the elements to pull that together. It depends on what the piece is though. If it’s a short track test you will have a lot of coverage of the same thing. If you’re on a big road trip, you’ll have nine hours of footage out of the back of a tracking car. Then we have stuff we call ‘forward tracking’ which on a big road-trip is invaluable because it shows you what’s going on around the cars and around the presenters and you get a lot of local color. I’d always rather have too much rushes than not enough. You can take big chunks of footage and leave them in one piece, and say, “OK, that’s all the tracking for Day 3.” And then if something happens at 2 o’clock on the ‘in car’ cameras on Day 3 you can just go to 2 o’clock on the tracking and you’ll find it.
HULLFISH: Everything’s time-of-day timecode?
Dan James: Yeah. We insist on that because we know we’re going to have to deal with so much from so many cameras.
HULLFISH: So you edit from selects reels…
Dan James: Yeah, so we’ll have a bin of selects sequences. A sequence of crash cams or all the different forward tracking stuff. We try to keep it fairly compact. It takes a couple of passes to get these sequences down. And you do these ultra-selects. Little seven or eight frame little things that you can use to join things up with, like a classic whip pan. We’ve always got little bits and pieces like a texture or something.
You need to be really organised because sometimes you’re not sure which way the story’s going to go. You don’t know what you’re going to need, later down the line so you can’t be sifting through raw rushes at the end of an edit.
When we get new editors in, it usually takes them a good few months to get used to how we work. Me and Jim Hart, the other editor, we’ve been doing this for years now. And there’s nothing else like it, it’s comedy, it’s epic adventure, it really covers all the genres.
HULLFISH: Do you sync that stuff and do multi-cam with it?
Dan James: The facility house where we are has two assistants who do a lot for us. They’ll sync up all of the in-car cameras. Sometimes we get them to make what we call a camera map for each day so that everything is on a big timeline with all the different cameras stacked up.
HULLFISH: You mentioned a forward tracking car….
Dan James: On a big road trip, there will be production vehicles say, a mile in front of the presenters’ cars and other vehicles about a mile behind. The forward production vehicle will have a cameraman in it just grabbing stuff as he goes, just local color and views and that’s really important stuff. I can use so much from that camera because it’s so interesting. Otherwise, you’re forever looking backwards at the presenters cars coming down the road, so you need that forward movement for a sense of journey.
HULLFISH: Do you guys do ADR for these shows at the end?
Dan James: Very rarely. We don’t do a huge amount of ADR.
HULLFISH: To get back to your process, are you starting out cutting these pieces on the longish side of where you know they need to be because you know that there’s a process and that they’ll get trimmed down in the process of developing them and screening them for various people? So how long would be a first cut of a story compared to the final?
Dan James: For the tightly scripted stuff, like the track tests, they’re not much longer than eight or nine minutes. For the road trips you’ll have massive timelines because there’s ad-libbed stuff all the way through it and what should be a 20 minute Day One timeline will be an hour. You throw everything decent in it, then you stand back and watch it and you say, “We can’t keep that and we can’t keep that.” The first cuts can be enormous, especially for the big specials which are supposed to be 90 minutes. Those can run 3 or 3 and a half hours long for the first cut, because you don’t know if an event on day 1 is going to carry on to day 5. It’s a lot more fluid. A little story arc might seem to develop, but if it doesn’t go anywhere, then we’ll kill it.
The half hour films, the two part ones — they’re not usually longer than 40 minutes on the first cut. And those need to get down to 30 or 31 minutes. But the specials really do get unwieldy, but that’s because of the ad-lib nature of them.
HULLFISH: Plus they need to be long so when Jeremy or the EP steps in, they have some stuff they can cut and still have enough left when they’re done.
Dan James: Exactly. Andy Wilman, our Exec, will shape it with us. He’ll come in and go through these massive transcripts. One for each presenter and each day. Like bibles. He’ll go through them with a highlighter pen and we’ll create these big sync constructions. Trying to draw storylines together. That’s where Wilman is at his best and happiest. We’ll do that with him for a good couple of weeks before we do a ‘first paint’, then give it to Jeremy to let him have the first look at it.
HULLFISH: With so much transcript stuff are you using Avid’s PhraseFind or ScriptSync?
Dan James: No. We generally don’t. We do it manually on bits of paper and Post-It notes on the wall. We just do a search for stuff in Microsoft Word.
HULLFISH: When is the Exec coming in to do this with you?
Dan James: He’ll leave us alone initially for about the first month. He lets us put all the events together and the static bits to camera. Then he’ll come in and we’ll start going through all of the in-car stuff to join it all together.
HULLFISH: So, he comes in when you’ve got something for him to react to.
Dan James: Exactly. It’s kind of scripted, but it’s not. Sometimes it’s a struggle to keep a narrative going but other times it kind of drives itself.
HULLFISH: With Top Gear, I love the underlying tension of whether the entire trip is going to fall apart from one disaster or another.
Dan James: That’s one of the joys of this show is that dynamic between the three presenters. You’ve always got two people going against the other person, or one person annoying the other two. You can move things along very quickly by just changing the dynamic of who’s the antagonist. It’s infinitely interchangeable, you can play on that all day.
HULLFISH: The other thing that I’m struck by is a great amount of skill by you and the other editors of jumping between kind of genres because you’ve got the car journalism you’ve got the travelogue and you’ve really got to understand how to cut humor and have pacing in delivering a joke.
Dan James: That’s a huge thing in getting that timing right. It’s that editing idea of answering a question with a cut. If someone on screen asks a question, you answer it with a cut.
HULLFISH: Like James saying, “Oh, that will never happen.” And then you cut directly to it happening.
You were saying you’re not technical… so do you take the piece from off-line into the audio mix and into the color grade? Or do you leave it once the picture’s locked?
Dan James: Oh no! You can’t abandon your baby. We follow it through the sound dub and the color grade. We absolutely have to follow it all the way because we have to make sure that what we spent months on doesn’t get messed up on the way through, which has happened before. We follow it religiously.
HULLFISH: Is there a scene or an episode you want to talk about how the editing was done?
Dan James: In an upcoming show this season, Aston Martin have re-created an old model, the DB4 GT Lightweight. It’s a racing car from the late fifties and it’s like a million and a half pounds. So Jeremy’s got one of those and Richard’s got a 1960s Jaguar XKSS, which is also a re-creation. These are not plastic kit cars. They’ve been built from scratch by the factories.
They take them to the oldest street circuit in Europe which is a place called Pau in the Pyrenees and they are racing around this street circuit and we used a high speed tracking vehicle that had a Russian arm on it. It’s just got this beautiful grade on it like an Eastman / Technicolor kind of thing with a little film grain and it just looks terrific. It’s one of the most favorite things we’ve done.
HULLFISH: And when you were in the off-line on that piece did you try to apply any kind of a look to it in the Avid?
Dan James: Yeah, I did do some grading work in the Avid for guide look. I just warmed up the blacks a bit. I just added some red and yellow in the blacks. Just to give it a rough color look. And the director was using a lot of powered zooms, which is a very 1960s kind of cinematic look. Big zoom outs and big zoom ins and that really helped make it look very 60s and very periodish. And using Man From U.N.C.L.E as a temp track as well.
HULLFISH: And did the composer finish that piece off with custom created score?
Dan James: Yeah. That’s going out in episode 8.
HULLFISH: Of season 2.
Tell me a bit about your background and how you got to Top Gear.
Dan James: I started in 2004. Before that, I was a bass player. My wife’s a make-up artist who was doing make-up on music videos back in the mid 1990s. So I started doing production running on set. Got employed by the company and then one of the directors said, “If I teach you how to use the editing equipment, you can cut my show reel.” That was back on U-Matic three quarter tape. So I cut his show reel and then the next low-budget music video he did, he asked me to edit. So then I started cutting music promos and then finally into TV. Bit of a weird career path but being a musician first really helped.
HULLFISH: So many editors are musicians. I’m a fellow bass player, so I completely agree. It was great talking with you. I really enjoyed it.
Dan James: Thank you Steve, it was a pleasure.
To read more interviews in the Art of the Cut series, check out THIS LINK and follow me on Twitter @stevehullfish
The first 50 interviews in the series provided the material for the book, “Art of the Cut: Conversations with Film and TV Editors.” This is a unique book that breaks down interviews with many of the world’s best editors and organizes it into a virtual roundtable discussion centering on the topics editors care about. It is a powerful tool for experienced and aspiring editors alike. Cinemontage and CinemaEditor magazine both gave it rave reviews. No other book provides the breadth of opinion and experience. Combined, the editors featured in the book have edited for over 1,000 years on many of the most iconic, critically acclaimed and biggest box office hits in the history of cinema.
Panasonic has introduced a new handheld camcorder, the first in a series set to combine advanced connectivity functionality with high-end image production for broadcast and streaming applications.
The AG-CX350 camcorder spearheads the series, and is said to be the first camcorder in the industry to provide NDI|HX connectivity. This ensures smooth video transmission and camera control over IP for live events and web distribution, and means the camera can be built into larger systems working alongside Panasonic PTZ camera systems equipped with NDI|HX and the Panasonic Live Production Center (AV-HLC100).
The camera features a wide angle of 24.5mm, a high-powered 20x optical zoom lens and a new high-definition, high-sensitivity 1.0-type MOS sensor to enable 10bit image quality and 4K (UHD) or FHD recording at up to 50/60p onto SDXC card. It also features an HLG gamma mode to support HDR image production.
Featuring a high-brightness, high-definition LCD, the camera provides parallel SDI/HDMI outputs for enhanced operation and usability. Weighing just 1.9kg, it has the added benefits of compact size and low power consumption.
A new, high-efficiency HEVC codec (LongGOP, 10bit, 4:2:0/MOV) enables smooth playback and editing on a notebook PC or MacBook.
In addition, the camera supports a streaming function for Facebook and YouTube Live with compatible RTSP and RTMP protocols. The camera will also support the MXF P2 file format, futureproofing it for potential broadcasting applications by enabling AVC-Intra and AVC-LongG.
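For readers unfamiliar with what the RTMP side of that streaming function looks like in practice, here is a minimal sketch of an RTMP push done in software with ffmpeg from a computer. It illustrates the protocol the camera uses rather than the AG-CX350's own firmware, and the filename, bitrate and stream key are placeholder assumptions.

```python
# Illustration of an RTMP push as used by services like YouTube Live and
# Facebook Live; this streams a file from a PC with ffmpeg, not from the camera.
import subprocess

def stream_file_over_rtmp(src: str, ingest_url: str, stream_key: str) -> None:
    """Push a local file to an RTMP ingest point as a live stream."""
    subprocess.run([
        "ffmpeg", "-re", "-i", src,                 # -re: feed input at native frame rate
        "-c:v", "libx264", "-preset", "veryfast",
        "-b:v", "6M", "-g", "100",                  # video bitrate and keyframe interval
        "-c:a", "aac", "-b:a", "128k",
        "-f", "flv", f"{ingest_url}/{stream_key}",  # RTMP delivery uses the FLV container
    ], check=True)

# Placeholder values: YouTube's primary ingest is rtmp://a.rtmp.youtube.com/live2
# and the stream key comes from your live dashboard.
# stream_file_over_rtmp("program_feed.mp4", "rtmp://a.rtmp.youtube.com/live2", "xxxx-xxxx")
```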
“The AG-CX350 really caters to everyone, with its futureproofed 4K quality and a vast array of connectivity and streaming potential,” said Sivashankar Kuppusamy, Marketing Manager EMEA for Panasonic. “The flexibility that’s embedded in the camera means it’s sure to appeal to those needing an allrounder that’s just at home filming short documentaries and student films to streaming corporate events. It is just the start of a series that has tremendous potential.”
The AG-CX350 will begin shipping in February 2019 with an RRP of £3,415. We are awaiting the Australian price and will update once we have it (our guesstimate is $6,999 or $7,450).
Dr David Smith thought it might be fun to compare a Pentax dSLR, a Sony ActionCam, his Samsung Galaxy and a Kogan “GoPro knockoff” in a controlled environment: his convertible on the streets of sunny Melbourne, shot from the back seat.