I knew the time would come when I would have to abandon my strong stance against getting into 4K. I'm still not advocating jumping in, but 4K is finally coming to fruition and is worth considering. With 4K cameras coming out practically every month this year, let's dive into some of the considerations, and whether or not it's time to retire 1080 as your primary acquisition resolution.
1080 is good, damn good.
In fact, 1080 is already not hugely far off from what the human eye can physically perceive at typical viewing distances and display sizes. There were plenty of opinions back in the day on just how far you should sit from your TV: 2x the screen diagonal, 2.5x the screen height, the distance to Jupiter minus the square root of pi, etc... I myself always preferred sitting a bit closer than half the distance to the movie theater screen, though it seems I preferred farther than that in my home theater.
I suppose the winds have changed now that TVs are called "home theaters" and long gone are the days when the movie theater was the only place to get the movie experience. Who better to comment on sitting distances than SMPTE and THX? Using more math than I'm qualified to comment on, they seem to prefer viewing distances of around 1.2-1.6x the screen width, which works out to a horizontal viewing angle of roughly 35-45 degrees.
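If you want to sanity-check that distance-to-angle relationship yourself, it's just a bit of trigonometry. This is my own back-of-the-envelope Python, not SMPTE's or THX's actual math:

```python
import math

def viewing_angle_deg(distance_to_width: float) -> float:
    """Horizontal angle a screen subtends when you sit at
    distance = distance_to_width * screen_width."""
    return math.degrees(2 * math.atan(1 / (2 * distance_to_width)))

for ratio in (1.2, 1.6):
    print(f"{ratio}x screen width -> {viewing_angle_deg(ratio):.0f} degrees")
```

Sitting 1.2x the screen width away subtends about 45 degrees; 1.6x is about 35 degrees.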
OK, so at a 1.2-1.6x viewing distance, is 1080 insufficient for optimal enjoyment of your content of choice? Again, a silly way to even look at it. To see just how silly it is, I have a vastly incomplete list for you:
Star Wars Episodes 2 and 3
Crank 2: High Voltage
Iron Man 2
Act of Valor
End of Watch
And a few hundred more have all either been shot entirely on, or have had shots integrated from, sources that are 2K/1080 or LOWER. Better yet: you've seen movies which were shot at least partially in STANDARD DEFINITION. Did you get up and walk out of the theater? I think not. That's the main reason I went to see Crank 2: I had read about how they extensively used small, lower-end Canon MiniDV cameras as B cams. You could buy them at Best Buy. I wanted to see how it played on the big screen. Guess what? No one walked out. It was actually a hilariously over-the-top film.
The point is: content is and will always be king. There is no Academy Award for "Best Resolution". 1080 and even 720 are more than sufficient to produce a great-looking video.
So, if 1080 is that good, what about 4K?
4K, I believe, will inevitably overtake 1080 in all areas; that's just basic progression. 3D hasn't produced sustainable sales for retailers, so they need something else to pick up the slack. 4K sounds more impressive: "four times HD!" But is it just a marketing ploy, or does it have actual merit?
It does have merit, plenty actually.
Keeping in mind how viewing distance affects perceived resolution, you don't need to sit right up against the screen to appreciate 4K, but as you step back from the screen, there is a point at which you physically cannot differentiate it from 1080. Viewing distance really shouldn't be "resolution dependent": if you shot a film in 100K, and the only way to physically appreciate that resolution were to sit one foot from the screen, then what's the point? 4K, I think, will be the standard for a long time, because it sits in that range of detail that surpasses what the human eye can detect in most situations. 8K, I think, will also be inevitable, but for reasons I'll get into later.
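A rough way to find that "can't tell the difference" point: assume normal 20/20 vision resolves about one arcminute, and compute the distance at which a 1080 pixel shrinks below that. This is my own simplification (the 65-inch screen and the one-arcminute figure are assumptions, not a vision-science result):

```python
import math

def acuity_limit_distance_in(diag_in: float, h_pixels: int,
                             aspect: float = 16 / 9,
                             arcmin: float = 1.0) -> float:
    """Distance (inches) beyond which a single pixel subtends less than
    `arcmin` arcminutes, i.e. finer detail stops being distinguishable."""
    width = diag_in * aspect / math.hypot(aspect, 1)  # screen width, inches
    pitch = width / h_pixels                          # pixel pitch, inches
    return pitch / math.radians(arcmin / 60)

# Hypothetical 65-inch 1080p TV: beyond this distance (in feet), extra
# resolution like 4K goes unseen.
print(round(acuity_limit_distance_in(65, 1920) / 12, 1))  # roughly 8.5 ft
```

In other words, on a 65-inch screen, past roughly eight and a half feet the extra pixels of 4K are below the eye's resolving power under these assumptions.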
So 4K has detail that we can see, but how does it compare to film?
Modern film can be scanned at higher than 4K resolution, but is it a worthwhile comparison? Video's ability to look like film isn't solely dependent on equivalent resolution. So while film can resolve more, we again have to consider the limitations of the human eye.
The future-proofing argument is one I hear a lot from 4K advocates, and it makes little sense to me. If you shoot a film in 4K, but it will only be mastered in 1080/2K the year you finish it, do you really think that having the ability to make a 4K master years down the road will suddenly revive that film's success? Possible? Anything's possible. Likely? No.
UPDATE: I neglected to think about the stock footage market, in which I am very engaged. Shooting stock in 4K certainly has an advantage, as more and more producers hunting for stock will eventually seek out 4K material.
Post zooms, reframing, pan & scan, etc...
This is a valid argument for 4K: since you have all that resolution, if you want to tighten up a shot, you can do it; if you want to add a little push-in zoom, you can do it. Shaky video? Stabilize it in software and have resolution to spare! 4K does indeed add the flexibility to crop an image in many ways and still retain detail. But I have some issues with this as well, because many folks talk about it as if it were a free lunch.
Issue 1: noise floor.
Just because the detail is there doesn't mean it will be as clean. Every sensor has a signal-to-noise ratio, and any time you crop the image, you're effectively zooming in on that noise, making it more prominent. So let's postulate a scenario: you shoot a medium shot of an actor, and let's say you had to use a higher ISO; while the image is pretty clean, it's not exactly free of noise/grain. In post you decide you want a much tighter shot and you crop in by 100% (2x). Technically you still have more than 1080 lines of resolution, right? Yes, but you've also technically doubled the apparent noise. No free lunch.
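You can see that penalty with nothing but random numbers. This sketch (a toy model of mine using pure Gaussian noise in NumPy; real sensor noise is messier) compares a full 4K frame delivered at 1080 against a 2x center crop delivered at 1080:

```python
import numpy as np

rng = np.random.default_rng(42)
h, w = 2160, 3840                        # a UHD frame of pure sensor noise
noise = rng.normal(0.0, 10.0, (h, w))

# Full frame downscaled to 1080: each output pixel averages a 2x2 block,
# which cuts the noise standard deviation roughly in half.
downscaled = noise.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# 2x punch-in: a native 1920x1080 crop, no averaging, full noise.
crop = noise[540:1620, 960:2880]

print(f"downscaled noise std: {downscaled.std():.2f}")  # about 5
print(f"cropped noise std:    {crop.std():.2f}")        # about 10, i.e. ~2x
```

At matched delivery size, the crop carries roughly twice the noise of the downscaled full frame, which is exactly the "doubled apparent noise" above.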
The other catch to any post cropping: what resolution are you mastering in? If you go willy-nilly with crops of 20, 30, 40% and up, then mastering in 4K won't quite result in 4K quality. If you plan on mastering in the same resolution you shot in, cropping more than 20% will likely become apparent (some would say 15% or even 10% is pushing it). Post cropping also depends heavily on the shot being sharp to begin with, so if your focus is off even a bit, cropping into it is only going to make that blown focus more apparent and ugly. I'm looking at you, shallow-DOF junkies...
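The arithmetic behind those percentages is simple enough to sketch. This is my own little helper, with the crop expressed as the fraction of the frame you throw away:

```python
def lines_after_crop(source_lines: int, crop_fraction: float) -> int:
    """Vertical resolution left after cropping away crop_fraction of the frame."""
    return int(source_lines * (1 - crop_fraction))

# Shooting UHD (2160 lines):
print(lines_after_crop(2160, 0.20))  # 1728 -- fine for a 1080 master,
                                     # but already an upscale for a 4K master
print(lines_after_crop(2160, 0.50))  # 1080 -- the 2x punch-in limit for 1080
# Shooting and mastering in 1080:
print(lines_after_crop(1080, 0.20))  # 864  -- now you're upscaling
```

So shooting 4K buys you about a 2x crop against a 1080 master, but essentially no crop budget against a 4K master.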
Issue numero 2: Lazy productions
"Leave it to post" I'm sorry to say is becoming more widely accepted because of advances like 4k and issue number 1. The proper way of making any video, is to plan it ahead, and shoot it as it's envisioned. Granted, certain formats such as reality TV lack the exact-ness of typical hollywood planning and in their case, the post cropping ability is invaluable. Same goes for documentaries; such mediums are in my opinion 99.997% about the content and the resolved quality is almost entirely irrelevant. But in eyeballing independent filmmakers, there's already a severe amount of laziness and shortcutting going on (cited proof: youtube). So when someone informs an uneducated/poorly educated filmmaker about this post-cropability of 4k... No Jedi mind trick is even necessary for them to go giddy with joy at this concrete fact of a downside-less ability. I can hear them now: "I don't even need to shoot closeups anymore!"
OK, backing up from that extreme: some people know how to use the tools, some don't. That's a constant. The point I'm making here is that some aspects of 4K will, I believe, amount to an addition of poorly produced content. Thankfully, that content will likely go no further than the internet's video dumping grounds (YouTube, where else?). For those who know what to do with it, 4K will be just another useful tool to help shape the final product.
4k: It just looks pretty
I only just started tinkering with 4K on the Blackmagic Production Camera, Sony AX100, and now the GH4, and yes, there's a bit of giddiness when you first start acquiring such detail. It made me want to go shoot more things, and gave me a sense that 1080 just wasn't good enough. Those feelings are of course fleeting, and typical of almost any new technology you might acquire. But yes, it does give you that little something extra that, if you're passionate about making lovely images, strikes a chord inside.
The evil catch.
Remember: no free lunch. By far the most expensive lunch that comes with 4K is the workflow, including storage, processing, and finishing. 4K can be acquired in 8-bit 4:2:0 at 60 Mbit/s (AX100) and look plenty pretty, but any time you step up to a higher level of recording to truly get the goods out of it, you're looking at a large amount of data. That's not a big deal if you're a proper production with planning and resources, but if you're a run-and-gunner, you'll likely get bogged down.
Many of the cameras coming out use heavy compression, while some, like the Blackmagic, use very little. For instance, the Blackmagic Production Camera 4K in ProRes records around 5 gigabytes per MINUTE. So if you don't have the luxury of a DIT, or even a laptop to offload footage yourself, you're going to need a lot of cards/SSDs to keep you going. Considering that SSDs now run around 50 cents per gigabyte, you're looking at about $2.50 per minute of footage. Yes, of course you can reuse your media, but if you like to calculate things like depreciation: if you average 3 hours of actual recording per day, your SSD media will cost you $450 for that one day of use; over 2 days, $225/day; 3 days, $150/day; etc... I admit there's not a whole lot of sense in talking these numbers, but it does bring up an important point: buy reliable, long-lasting media. It's a depreciable asset.
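Those numbers are easy to play with yourself. Here's the same amortization math as a small script; the 5 GB/min and $0.50/GB figures come from above, everything else is plain arithmetic:

```python
def media_cost_per_day(gb_per_min: float, price_per_gb: float,
                       hours_per_day: float, days_of_use: int) -> float:
    """Amortized media cost per shooting day, assuming the cards/SSDs
    are reused across days_of_use days of shooting."""
    daily_gb = gb_per_min * hours_per_day * 60
    return daily_gb * price_per_gb / days_of_use

for days in (1, 2, 3):
    cost = media_cost_per_day(5, 0.50, 3, days)
    print(f"{days} day(s) of reuse: ${cost:.0f}/day")  # 450, 225, 150
```

Swap in your own camera's data rate and media price to see how quickly the lunch bill grows.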
The point of this article is not whether 4K is better; it's whether 4K has reached maturity for the masses. Eventually 4K will surpass 1080 as the most-used format. 2014 is the year that 4K cameras and monitors have started to flow pretty regularly. Couple that with most if not all NLEs being updated to handle the codecs that contain 4K, and it's looking to be a pretty good time to see if 4K makes sense for you and your needs. New 1080 cameras are still coming out, and even 4K cameras have the option to shoot in 1080.
So, the two questions you should ask yourself:
Do you have the workflow to handle & support 4k, and do you really need it right now?
So if you want to get into 4K right now, can you handle it? We've already gone over the storage needs, but what about processing? Supposing your hard drives can keep up, what if your CPU can't? That doesn't mean you have to either give up on 4K or spend a bucket of cash updating your computer. Proxy it. Proxies reduce the load on the system by transcoding to a less intensive format. So instead of editing the 4K files, you render out 1080 versions of them, edit those, and when you're ready to export in 4K, you relink the proxies back to the 4K files. Yes, it adds a step or two, but you have to weigh that against how much you want 4K and how little you want to spend on hardware upgrades.
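As a sketch of what that proxy step can look like outside the NLE, here's one way to batch-build 1080 proxy commands with ffmpeg. The directory layout, the `_proxy` suffix, and the choice of ProRes Proxy are my assumptions, not a standard:

```python
from pathlib import Path

def proxy_path(clip: Path, proxy_dir: Path) -> Path:
    """Where the 1080 stand-in for a 4K clip lives; relinking for the
    final 4K export is just reversing this mapping."""
    return proxy_dir / f"{clip.stem}_proxy.mov"

def ffmpeg_proxy_cmd(clip: Path, proxy_dir: Path) -> list[str]:
    """Build an ffmpeg command that downscales a clip to 1920-wide ProRes Proxy."""
    return [
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=1920:-2",                   # 1920 wide, height keeps aspect
        "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
        "-c:a", "copy",                           # leave audio untouched
        str(proxy_path(clip, proxy_dir)),
    ]

cmd = ffmpeg_proxy_cmd(Path("A001_C001.mov"), Path("proxies"))
print(" ".join(cmd))
```

Run the generated commands once per card offload, cut with the lightweight files, then point your timeline back at the originals for the 4K export.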
So, is it time?
Another silly question, because it all depends on you and your needs, or the needs and desires of your clients. We've gone over some of the main considerations for switching; the last consideration is the cameras themselves.
The RED cameras have been around for some time now, and they've proven plenty capable. They've had a bumpy road, but if you like the form factor and have the coin to buy everything necessary for a shootable kit, they're an option few will sneeze at.
Sony has released possibly more 4K cameras than any other manufacturer, and all are interesting in their own ways. The "low-cost" AX100 gets you into 4K at just $2,000, and while it packs a pretty image, it has plenty of consumer-ish constraints. Stepping up from that, the FS700 is one of the lower-cost interchangeable-lens camcorders that does 4K, but it needs an external recorder and has the form factor of a shrunken cinderblock. They also have several other fixed-lens 4K cameras, such as the AX1.
Canon has never been one to move very quickly. At the time of this writing, the C500 and the 1DC are their only 4k cameras, and they are far from cheap.
Panasonic has put plenty of stock in their m43 stills line for video purposes, and the GH4 is making waves as a very capable 4k DSLR for under 2 grand.
Then there are the newcomers, Blackmagic and now AJA, who are both new to making cameras. Blackmagic has offended more than a few folks with their delays and their less-than-pleasing communication about issues/delays/fixes/etc... Their most recently listed cameras, the Studio and the URSA, are both puzzling in their respective designs. AJA's recently announced CION looks very attractive, but it's a larger form factor which may or may not fit your needs. The footage I've seen thus far out of the CION was darn pretty.
There are more, but the overall consideration here is that many of these are first-generation products, and while capable, may be worth watching for a bit before investing in. If you don't absolutely need 4K right now, stick around and see what else the manufacturers come up with. I've yet to try or see a 4K camera that ticks enough boxes for my needs, and that's what it all comes down to: what are YOUR needs?