This season of Dead Reckoning TV was a bit of a change, brought about by two things. The first was moving the studio to another suite in the same building, turning the former space into a dedicated set. The second catalyst was the move from my MacPro tower to my HP Zbook. Brian and I also did a little target-audience readjusting based on feedback from season 2, which was a big step up content-wise compared to season 1, so we expected some changes.
I won’t get into the platform switch except to say that the MacPro was dying, and because I was moving back and forth between set and studio once a week, I needed something mobile. At the time, I was readjusting Red Futon Films’ project focus as well, so that played a role in the choice of the Zbook.
Figuring out how to do web TV with guests on a budget has been difficult. In season 2, I was screen-capturing Skype via Screenflow on my Macbook Air. Primary audio was fed from the MBA to the MacPro via the Apollo Duo, with the host mic going directly into the Apollo, so I could control each track independently. This season with the Zbook, creating secondary audio records and mix-minuses and sending the full screen to the plasma has proven to be a bit of an issue.
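For anyone who hasn’t wrestled with this: a mix-minus just means each destination gets the sum of every source except itself, so the guest never hears their own voice echoed back down the line. A toy sketch of the idea in Python (channel names and levels are made up for illustration, not my actual routing):

```python
# Minimal mix-minus sketch: each destination receives the sum of every
# source except its own signal, so nobody hears themselves echo back.
# (Channel names and levels are hypothetical.)

def mix_minus(sources):
    """Given {name: level}, return {name: feed} where each feed is
    the sum of every source except the destination's own."""
    total = sum(sources.values())
    return {name: total - level for name, level in sources.items()}

feeds = mix_minus({"host": 2, "guest": 1, "playback": 1})
print(feeds["guest"])  # 3: the guest hears host + playback, not themselves
```

The real version is cable routing and mixer sends rather than code, but the arithmetic is exactly this.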
We’re shooting to a single BMD Pocket Cam with a Zeiss ZF.2 21mm at f/4 and f/2.8 through a Letus Anamorphx. Lighting is a single daylight-balanced 85W through a silk china ball with color-matched LED rim lights. The audio is all done via Blue Bluebirds, except for the commentary, which is a Sennheiser MKH 8060. All audio goes direct to the Focusrite AD/DA (hopefully soon back to the Apollo).
Our solution so far is an app called XSplit. It lets me capture the full-screen output of the plasma and monitor what’s actually being recorded at the same time on my laptop screen. I can also monitor my audio mixer for the Focusrite. A perk of taking myself mostly out of the interview segments is that I can send Brian’s mic directly to Skype now (single vs. multi-channel). I still have no way of mixing channels independently, so I plan to attempt to record Brian simultaneously to Audition. XSplit unfortunately doesn’t let me split the audio record or record two independent channels for the call, but it far surpasses the other options I’ve tried so far. There just isn’t anything available that does exactly what I want.
Moving forward, I’m hoping to ditch the Focusrite and bring the Apollo back into play on the record end, because I can use live compression plugins on input, like the DBX 160, for better control of our on-show antics, which can be fairly dynamic. I need to test the Thunderbolt option card for the Apollo Duo though, since it isn’t qualified by Universal Audio on Thunderbolt-equipped PCs yet. Until then, if I’m not observant (hard during the Spindle, where I’m hosting as well), we can clip the audio badly.
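If you’ve never thought about what a compressor like the DBX 160 is actually doing, the core math is simple: below a threshold the signal passes through, above it the overshoot gets divided by a ratio. A toy hard-knee sketch (the threshold and ratio numbers are illustrative, not the plugin’s actual curve):

```python
def compress_db(level_db, threshold_db=-10.0, ratio=4.0):
    """Hard-knee compressor curve: below the threshold the level passes
    unchanged; above it, the overshoot is divided by the ratio."""
    if level_db <= threshold_db:
        return level_db
    return threshold_db + (level_db - threshold_db) / ratio

# A -2 dB peak through 4:1 with a -10 dB threshold comes out at -8 dB:
# the 8 dB overshoot is squeezed down to 2 dB.
print(compress_db(-2.0))  # -8.0
```

Which is exactly why live compression on input saves us: a sudden shout gets squeezed before it ever hits the converter’s ceiling.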
I copy all the footage from the camera to a 4TB Thunderbolt G-RAID. I then copy the Audition audio record back to the card, lock the card and use a fresh card for the next segment.
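The offload step is basically copy, verify, lock. I do it by hand, but the idea could be sketched as a checksum-verified copy like this (the paths and function names are hypothetical, not a tool I actually run):

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path):
    """Hash a file in chunks so large video files don't eat RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def offload(card_dir, raid_dir):
    """Copy every file from the card to the RAID, verifying each
    copy's checksum against the original before trusting it."""
    raid_dir = Path(raid_dir)
    raid_dir.mkdir(parents=True, exist_ok=True)
    for src in sorted(Path(card_dir).rglob("*")):
        if src.is_file():
            dst = raid_dir / src.name
            shutil.copy2(src, dst)
            if sha256_of(src) != sha256_of(dst):
                raise IOError(f"Checksum mismatch on {src.name}")
```

The point of the verify pass is that the card gets locked and reused only after the RAID copy is proven good.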
On the post side, I’m using Adobe for everything. Everything comes into Premiere, where I replace the audio in the video files, throw them into the timeline and edit. I’ve already built looks in Speedgrade, so that process is drag and drop. I mix the audio in Audition and place the new tracks in the timeline. The sequences get dropped into Media Encoder, where I create two output modules: one for the MXF masters and the other for the WAV audio files for the podcast. Then I create a watch folder for the MXF outputs, which renders them to H.264 for the web deliveries. Once all that is done, I’ll typically convert the MXFs to Cineform MOVs because the files are 25-50% smaller, and when you’re talking 25-50GB master files, I need to take advantage of that.
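The watch-folder idea is simple enough to sketch: look for new masters in a folder and hand each one to an encoder. I use Media Encoder’s built-in watch folders, but a hypothetical Python version shelling out to ffmpeg shows the shape of it (paths and encoder settings here are just plausible assumptions, not my actual presets):

```python
import subprocess
from pathlib import Path

def build_h264_cmd(master, out_dir):
    """Build an ffmpeg command that turns an MXF master into an H.264
    MP4 for web delivery (CRF 18 is a common near-transparent choice)."""
    out = Path(out_dir) / (Path(master).stem + ".mp4")
    return ["ffmpeg", "-i", str(master),
            "-c:v", "libx264", "-crf", "18", "-preset", "medium",
            "-c:a", "aac", "-b:a", "192k", str(out)]

def process_watch_folder(watch_dir, out_dir):
    """Encode every MXF master currently sitting in the watch folder."""
    for master in sorted(Path(watch_dir).glob("*.mxf")):
        subprocess.run(build_h264_cmd(master, out_dir), check=True)
```

A real watch folder would poll or use filesystem events and skip files it has already rendered; this just shows the encode step.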
So that’s our little web TV show. If we film early enough in the day, I can have the whole show done by the afternoon and get the renders going on my way out the door at the end of the day. This all-Adobe workflow makes things so fast and fluid.
I hear a lot about the poor treatment of VFX facilities and artists in my industry. This should be interesting.
So the Premiere to Speedgrade Dynamic Direct Link functionality is pretty rad. So rad, I’m grading the entire doc in Speedgrade. So far, I’m extremely happy with the results I’m getting on everything except two shots. In the real world, it’s only two clips, but three shots in the edit. So in my adventures of fixing those shots today, I’ve learned a few things about this workflow:
1. Dynamic Direct Link does not give you any options to adjust RAW properties of a clip in Speedgrade.
2. Not that number 1 matters a whole lot, since Sg doesn’t give me any worthwhile adjustments for DNG anyway- oddly enough.
3. I find irony in number 2 because DNG is an Adobe format (yes, I know Sg wasn’t an Adobe app natively, but it’s still funny).
4. I can’t control how Premiere interprets my RAW footage. It seems to arbitrarily pick a matrix out of thin air.
5. If that decoding matrix happens to bring in the footage overexposed then…
6. …that footage will be clipped in Speedgrade if accessed via Direct Link.
After trying a variety of ways to fix the overexposed shots, I’ve settled on converting those clips to Cineform RAW and replacing them in the timeline. Then I can adjust the RAW settings of the clip while I’m in Speedgrade. So far, it’s the least time-consuming approach.
I really want to start blogging again, but most of the time I think to myself, what do I have to offer? Why would people listen to me? Most of that is likely true, so like my screenplay-writing forays, this is mostly for my own personal enjoyment and exercise, I guess. So I’ll start with a catch-up of things I’m working on while looking ahead through 2014.
Crafted: Beer + Faith + Community
My new short doc is about how the limited-run beer All Souls Ale by Big Sky Brewing came into being. It’s getting some great coverage and regular plays thanks to Reelhouse for throwing us on the front page. It’s about 20 minutes long, and the finishing work should be done by the end of this month, so release is next month. I plan to submit it to a couple of festivals this year just to get some more eyes and fans, with the hope that this becomes the seed for a crowd-sourced feature doc with the same themes. Watch the trailer here.
Chris Fenner (@fenner403), my creative partner based in Atlanta, and I have launched a new production agency. We’re still feeling our way around, but we’ve got a great roster of core directors covering a variety of styles and genres, so we can match clients to particular directors for a better creative fit. You can check out work here.
Screenplays & Stories
Like blogging, I’m still trying to write regularly for my own sake. I do enjoy it. It’s fun entering another world and seeing your characters come alive. I have the contacts to actually pitch some stuff too, so I’d like to attempt that as a bucket-list item or something. I have several stories I’m developing (the long-winded paragraph ones that happen prior to screenplays). I trashed the first screenplay I did on my sister’s story (the one adopted after meeting her future parent on an airplane, featured in People and on Oprah) and have gone back to the paragraph story development stage.
The second screenplay is a seed my wife dropped in my brain called The Age of Love and Loathing. It centers on a couple with three kids under four attempting to go on a 5th-anniversary trip to Mexico. The trip was unwisely booked by dad, without consulting mom, in an effort to jumpstart the slogging relationship. Needless to say, it’s full of all the great and painful moments of parenthood, when it seems like those little people you love so much are actually there to ruin your life.
Red Futon Films
Well, we have Dead Reckoning TV, and that’s self-explanatory, but Brian and I are also embarking on a lecture series on the Apostles’ Creed. He’s already been teaching a 16-week Sunday School series on that, so we’re going to get down and film it. I’m still identifying the best distribution method for that. He also has a lecture series on the Lord’s Prayer we’re going to follow up with, perhaps later this year.
I’m also working on another project I’m excited about: a child sexual abuse prevention course being spearheaded by a team at our church. We already have initial interest from the FBI and a couple of local school districts. The church is giving me full ownership of any derivative works based on the initial core training, so I’ve got some big plans for this. I think it’s desperately needed.
I did update the Red Futon Films webpage but haven’t yet included the above projects. You can check it out and some of the other work I’ve done here.
And that’s it for now. See you on the webs.
We wrapped. Lots I could talk about, but I’ll leave it for the final film. In the meantime, here are some BTS photos from the last two days. Some have been posted via social media, others are fresh. Click the pic below for the full set on Flickr.
It sort of snuck up on us. We planned to shoot the doc around the same time all the volunteers come in to bottle the beer. Historically, that’s been middle to end of November. Bjorn, head of Big Sky Brewing, told me LAST WEEK that they’re bottling Thursday. So Chris (@fenner403) and I started making coffee and getting things ready. We made it…we think. I’ll be updating over the next couple of days as I have time, with BTS (behind the scenes) shots and stuff from the doc production at Big Sky Brewing. My biggest fear is that my story won’t keep up with some of the tech we’re using. I have a few folks who have said it’s a great story, so that’s good. It’s a short doc (sub-15 minutes) about beer, community and faith. Chris is going for a total Breaking Bad feel. Here are a few shots of tonight’s setup and final walk-through of our plans- as much as docs can be planned.
I’ve spent the last week and the better part of the week prior posting the first episode of Dead Reckoning, season 2, namely because I was learning my way around CC. This season marks quite a change: we’re now formatted as a web TV talk show. Consequently, we’re filming and posting four 10-20 minute vids each week, which constitute a single episode. And my work has led me to a single conclusion about Adobe CC: it’s awesome. To follow that up, Premiere CC is finally everything I’ve always wanted it to be. Here’s why, in 4 reasons:
- Multicam with audio sync. This feature seems to have been completely rearranged for CC. It’s now wildly easy to sync up multiple assets. In my case, 2 separate audio sources and two cameras. You can sync them using audio (just like Plural Eyes). So for the record, that’s 4 streams of audio and 2 cameras. There are some odd things going on with audio source and outputting once things are sync’d that I’ve yet to figure out- but it works successfully, accurately and without headache or frustration (workflow notes coming in a separate post).
- Flatten. This single feature allows me to EASILY go between Premiere and Resolve (or any other app, for that matter) because when you select it after your timeline edit, you essentially de-nest your multicam timeline. Same thing you did in Avid, but in a single step vs. multiple (at least in the last version I used).
- Audition (loosely related to Premiere). I ran into some odd sync issues when going to Nuendo 6 via OMF. While I could work around them, it was a bit obnoxious to even have to deal with. So on a whim I decided to see about improvements with Audition. Not only can I use all my Universal Audio Apollo Duo DSP plugins (which are processed via the Duo!) but it’s actually FASTER to go between Premiere and Audition vs. Premiere and Nuendo.
- EuCon!!! I can use my Euphonix panels (pre Avid) to mix, shuttle, edit and other things.
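The audio-sync trick in the first point above (what PluralEyes does, and presumably something similar under Premiere’s hood) is cross-correlation: slide one waveform against the other and pick the offset where they line up best. A toy pure-Python version on tiny sample lists (real tools work on downsampled envelopes of the actual audio, not raw samples like this):

```python
def best_offset(ref, clip, max_lag):
    """Return the lag (in samples) at which `clip` best matches `ref`,
    by brute-force cross-correlation over lags 0..max_lag."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        # Score how well the clip lines up with ref shifted by `lag`.
        score = sum(r * c for r, c in zip(ref[lag:], clip))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A clip that is a delayed copy of the reference lines up at lag 3.
ref = [0, 0, 0, 5, 1, -4, 2, 7, -3, 1]
clip = ref[3:]
print(best_offset(ref, clip, 6))  # 3
```

Once you know the lag for each source, syncing is just sliding every track by its offset, which is exactly what the multicam sequence does for you.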
Now if October’s update included Euphonix control in Speedgrade, I’d never leave Adobe.
A friend pointed me to the Digital Bolex clips today. Naturally, being the color junkie I am, I had to play with them. I’ve always found I get the best results when I spend time tweaking my RAW decoding first. For instance, with the Ikonoskop footage, I found it easiest to ditch those pink highlights by adjusting the RAW decoding parameters, and in the case of the Digital Bolex, I can lose those ugly green-tinged highlights as well. In Resolve, I found a nice balance and got the most pleasing results by setting my RAW decode to P3 colorspace, REC709 gamma, and a 4500 color temp. On this footage, I bumped the exposure up a stop. This gave me a nice starting point. Naturally, as you start to process your RAW footage, that’ll change, sometimes down to the clip! But by spending some time tweaking here, you can spend more time getting jiggy with it later on.
Here’s a JPEG-compressed version of what I came up with to get something nice and neutral before I begin the full-go grading. This might be where I’d get it prior to editing, or whatever. Point being, it’s mostly the decode settings with contrast and saturation boosted and a blue offset adjustment (also known as my first node…lol). You can download my DPX here if you really feel like seeing the true(r) version.
I derive a fair amount of inspiration from literature, specifically non-fiction. Currently, I’m slowly working through Turkish Nobel Prize-winning author Orhan Pamuk’s book, Istanbul: Memories and the City. In it he describes so much of Turkish culture and the myriad makeup of the great city of Istanbul. (Turkey has spent a fair bit of time in the news as of late.) What he pens most is the conflict between the Turkish desire for their former Ottoman glory and their desire to be Westernized and respected as modern. As a result, he reflects often on the city and its history.
As I was reading recently on the plane, it struck me that what draws me to this literature is the same thing that draws me to older US cities like New York or Chicago. There is a depth of history there that involves people. I’ll see the pictures and faces in his book and think about what their lives might have consisted of: friends, conversations, tasks, pressing concerns. I think the same things as I wander streets in foreign cities or older cities in the States.
When I was a child stuck in the backseat of a car with my siblings on a road trip, I would stare out the window watching car after car go by. I wouldn’t content myself to idly watch though, I would mentally transport myself to their vehicle, wonder about conversations and where they were coming from or going to. Just like my life had friends and things and a relational dynamic, so I wondered what those other people’s lives contained and what their dynamics were like.
Older cities, beat up by time and millions of people passing the same place year in and year out bring the same thoughts to mind. I wonder, what were their lives like? I sometimes make things up. Both the fiction of the past and the non-fiction of the current make me want to tell that story. I think there’s a part of me that loves making docs because I want to capture that reflection of someone’s normalcy in the present reality.
Well, as I’ve mentioned, Dead Reckoning is turning into/launching as a web TV show this fall. We have several specials throughout the summer we’re using as dry runs to set up and test tech while we continue with the podcast, websites get created and motion graphics are constructed. Oh my, the tech is a challenge of finding a balance between quality, work, workflow and guest ease.
We have three components: Skype video guest interviews, filming Brian and me, and filming from our sets. Each one of these by itself is no-brainer easy. Putting them together, on the other hand, is a bit more difficult. Let’s take the whole Skype thing, for instance. I can record a call by either screen capture or a built-in app (neither of which is fully tested, although I’m leaning towards screen capture). I can film Brian and me on our set with 2 cams, no issue. For the web show though, I want to make sure our guest can see us without additional lighting screwing up their view of us, we can see our guest on the plasma on the set, audio is sent appropriately to the cameras as well as our guest, the guest can still hear us decently, and I have a recorded screen or video call of just our guest.
Sounds easy in theory but here are a few of the challenges I’m thinking through:
- I’m running all of this mostly remotely from the set via TeamViewer to my MacPro.
- Record the second screen of the MacPro as a standalone Skype call via Screenflow. Problem: I can’t remotely control that screen from TeamViewer (for whatever odd reason). Audio is recorded through my super-nice Universal Audio Apollo Duo.
- Record the Skype video call via a Skype recording app like Call Recorder. Problem: it’s SD only and, well, it’s an app. I’d trust Screenflow way more than a third-party Skype app, and Screenflow *theoretically* won’t interfere with the operation of Skype, decreasing the chance of error.
- Record Skype from a completely separate computer via Screenflow. Problem: I can’t send a nice audio feed from our mics to the guest. I could run a separate mic feed from my mixer to the secondary camera?
- Recording audio directly to the cameras prevents me from being able to use my Apollo (and its WAY nicer preamps), and I have to find some other feed to Skype.
- Recording audio directly to Nuendo presents a few workflow challenges in post since audio is in Nuendo and editing is done in Premiere. How do I want to go about posting the show?
- Skype still won’t take channel one and channel two of my Apollo, only channel one so I swap a lot of cables…a lot. This may not be an issue though if I record directly to each camera.
- How much value do I place on the audio quality of my Apollo vs. a directly-to-camera record? Do I want to deal with the hassle of post moving the files from Nuendo to Premiere and back again?
- My plasma doubles as my grading monitor and my Skype guest monitor so I have to de-hang it and remove the stand each time I do an interview and deal with the resulting cable madness. (It’s far too large just to put on my desk).
- Each week, we’ll move lights and cameras for three different sets.
- It’s only me doing everything. I’m sure Brian will get some production legs underneath him as he jumps in and helps out.
So, if you have any thoughts, holler. There’s a lot of tech to think through to find a balance of minimizing work, maximizing quality and facilitating smooth interviews and production workflow. I’ll let you know what I work out over the next few months before the final decisions are made. We have some test stuff coming with not-great camera angles, lighting changes and other things.